WorldWideScience

Sample records for model validation streamflow

  1. Validity of Differently Bias-Corrected Regional Climate Model Simulations for Streamflow Simulations under Changing Climate Conditions

    Science.gov (United States)

    Teutschbein, C.; Seibert, J.

    2012-04-01

    The direct application of Regional Climate Model (RCM) simulations in hydrological climate-change impact studies can be questionable due to the potential for considerable biases. Several bias correction approaches - ranging from simple scaling to rather sophisticated methods - have been developed to help impact modelers cope with the various problems linked to biased RCM output. The main disadvantage of any of these correction procedures is their underlying assumption of stationarity: the correction algorithm and its parameterization for the current climate are expected to also be valid for future climate conditions. Whether or not this presupposition is actually fulfilled for future conditions cannot be evaluated directly - given our lack of time machines. Nevertheless, systematic testing of how well bias correction procedures perform for conditions different from those used for calibration can be done by applying a differential split-sample test as proposed by Klemeš ["Operational testing of hydrological simulation models", Hydrological Sciences Journal 31, no. 1 (1986): 13-24]. This contribution briefly summarizes available bias correction methods and demonstrates their application using the example of an ensemble of 11 different RCM-simulated temperature and precipitation series. We then applied a differential split-sample test, which enabled us to evaluate the performance of different bias correction procedures under changing climate conditions with only a limited amount of data (30-year records). Furthermore, we evaluated the different correction methods based on their combined influence on hydrological simulations of monthly mean streamflow as well as spring and autumn flood peaks for five meso-scale catchments in Sweden under current (1961-1990) and future (2021-2050) climate conditions. This differential split-sample test resulted in a large spread and a clear bias for some of the correction methods during validation based on an independent data set.
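
    As an illustration of the correction and testing ideas in the record above, the sketch below applies a simple monthly linear scaling (one of the simpler approaches mentioned) and evaluates it in a differential split-sample fashion by calibrating on the drier years and validating on the wetter ones. All data, the 30-year record length, and the scaling choice are synthetic placeholders, not the study's ensemble or methods.

```python
import numpy as np

def linear_scaling_factors(obs, rcm, months):
    """Monthly multiplicative correction factors (precipitation-style scaling)."""
    return {m: obs[months == m].mean() / rcm[months == m].mean()
            for m in np.unique(months)}

def apply_scaling(rcm, months, factors):
    return np.array([p * factors[m] for p, m in zip(rcm, months)])

# Differential split-sample test (Klemes, 1986): calibrate the correction on the
# drier half of the record and validate on the wetter half, mimicking a transfer
# to changed climate conditions.
rng = np.random.default_rng(0)
months = np.tile(np.arange(1, 13), 30)                # 30-year monthly record
obs = rng.gamma(2.0, 40.0, months.size)               # "observed" precipitation (mm)
rcm = np.clip(0.8 * obs + rng.normal(0, 10, months.size), 0, None)  # biased RCM output

annual = obs.reshape(30, 12).sum(axis=1)
dry_years = np.argsort(annual)[:15]                   # calibration subset
wet_years = np.argsort(annual)[15:]                   # validation subset
idx = np.arange(months.size).reshape(30, 12)
cal, val = idx[dry_years].ravel(), idx[wet_years].ravel()

factors = linear_scaling_factors(obs[cal], rcm[cal], months[cal])
corrected_val = apply_scaling(rcm[val], months[val], factors)
print(f"raw bias (validation): {rcm[val].mean() - obs[val].mean():.1f} mm")
print(f"corrected bias (validation): {corrected_val.mean() - obs[val].mean():.1f} mm")
```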

  2. Climate Change Impacts to Hydro Power Reservoir Systems in British Columbia, Canada: Modelling, Validation and Projection of Historic and Future Streamflow and Snowpack

    Science.gov (United States)

    Bennett, K. E.; Schnorbus, M.; Werner, A. T.; Berland, A. J.

    2010-12-01

    The British Columbia Hydro Electric Corporation (BC Hydro) has a mandate to provide clean, renewable and reliable sources of hydro-electric power into the future; hence, managing those resources in the context of climate change will be an important component of reservoir operational planning in British Columbia. The Pacific Climate Impacts Consortium (www.PacificClimate.org) has implemented the Variable Infiltration Capacity hydrologic model parameterized at 1/16th degree (~32 km2) to provide BC Hydro with future projections of changes to streamflow and snowpack to the 2050s. The headwaters of the Peace, Columbia, and Campbell River basins were selected for study; the Upper Peace River basin (101,000 km2) is a snowmelt-dominated watershed, and the Upper Columbia River Basin (104,000 km2) has a mixed snowmelt-glacier melt runoff regime, with glacier runoff contributing up to 15 to 20% of late summer discharge. The Upper Campbell River watershed (1,200 km2) has a mixed rainfall and snowmelt (hybrid) hydrologic regime. The model has been calibrated using historical streamflow observations and validated against these observations, as well as automated snow pillow measurements. Future streamflow changes are estimated based on eight Global Climate Models (GCMs) from the CMIP3 suite, downscaled using the Bias Correction Spatial Downscaling (BCSD) technique, run under three emissions scenarios (A2, A1B and B1; A1B is specifically reported on herein). Climate impacts by the 2050s in the three watersheds illustrate an increase in annual average temperature and precipitation ranging from +2.2°C to +2.8°C and from +2% to +10%, depending on the basin, and an annual change in streamflow of -1% to +12% for the three watersheds. Changes are more profound on the seasonal time-scale and differ across basins. Summer streamflow in the Upper Campbell River watershed is projected to decline by -60%, whereas the Upper Peace and Columbia systems are projected to decline by -25% and -22

  3. Calibration of Deterministic Streamflow Models in Ungaged Basins Using Statistically-Derived At-Site Streamflow Simulations

    Science.gov (United States)

    Farmer, William; Hay, Lauren; Kiang, Julie

    2017-04-01

    The continental-scale streamflow network provides vital information for quantifying water resources in the U.S.; however, large portions of the U.S. remain ungaged and require streamflow to be estimated. Validation of streamflow models in ungaged basins is difficult due to the lack of observed daily streamflow data for model calibration. This difficulty often leads researchers to rely on regional statistical models (regionalizations), which are limited to the time series of the streamflow variable of interest and offer few additional hydrologic outputs, in contrast to deterministic streamflow models, which are capable of modeling multiple elements of the water budget simultaneously. This work uses statistically derived streamflows at ungaged locations to calibrate a popular hydrologic model, PRMS (the Precipitation-Runoff Modeling System). Combining this statistically derived, at-site information with observations of other hydrologic variables, including evapotranspiration and snow-covered area, produced simulations as reasonable as alternative approaches to ungaged calibration. Improvements from calibrating to statistical time series are limited by the maximum performance of an idealized at-site calibration, i.e., the performance that would be achieved if observed streamflows were available for calibration. The degree of performance when calibrated with observed data is correlated with the degree of performance when calibrated with statistically derived alternatives. Regionally varying hydrologic processes limit the applicability of deterministic models, statistical models and the combination thereof; coupled calibration may improve identification of these underlying processes. The advantages and disadvantages of statistically informed calibration of deterministic models are discussed with emphasis on continental application across the conterminous United States.

  4. Modeling multisite streamflow dependence with maximum entropy copula

    Science.gov (United States)

    Hao, Z.; Singh, V. P.

    2013-10-01

    Synthetic streamflows at different sites in a river basin are needed for planning, operation, and management of water resources projects. Modeling the temporal and spatial dependence structure of monthly streamflow at different sites is generally required. In this study, the maximum entropy copula method is proposed for multisite monthly streamflow simulation, in which the temporal and spatial dependence structure is imposed as constraints to derive the maximum entropy copula. The monthly streamflows at different sites are then generated by sampling from the conditional distribution. A case study for the generation of monthly streamflow at three sites in the Colorado River basin illustrates the application of the proposed method. Simulated streamflow from the maximum entropy copula is in satisfactory agreement with observed streamflow.
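
    The paper derives a maximum entropy copula; the sketch below uses a Gaussian copula as a simpler, openly substituted stand-in to show the general workflow of multisite monthly streamflow generation (transform to uniform margins, model the dependence, sample, and map back). The three-site data are synthetic, not the Colorado River records.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Synthetic "historical" monthly flows at three sites (lognormal, correlated).
corr_true = np.array([[1.0, 0.8, 0.6],
                      [0.8, 1.0, 0.7],
                      [0.6, 0.7, 1.0]])
hist = np.exp(rng.multivariate_normal(np.zeros(3), corr_true, size=600))

# 1. Transform historical flows to uniform margins via their empirical CDFs.
u = (stats.rankdata(hist, axis=0) - 0.5) / hist.shape[0]
# 2. Fit the Gaussian copula correlation in normal-score space.
z = stats.norm.ppf(u)
rho = np.corrcoef(z, rowvar=False)
# 3. Sample from the copula and map back through the empirical quantiles.
z_new = rng.multivariate_normal(np.zeros(3), rho, size=1200)
u_new = stats.norm.cdf(z_new)
sim = np.column_stack([np.quantile(hist[:, j], u_new[:, j]) for j in range(3)])

print("historical Spearman corr:\n", np.round(stats.spearmanr(hist).correlation, 2))
print("simulated Spearman corr:\n", np.round(stats.spearmanr(sim).correlation, 2))
```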

  5. Free internet datasets for streamflow modelling using SWAT in the Johor river basin, Malaysia

    International Nuclear Information System (INIS)

    Tan, M L

    2014-01-01

    Streamflow modelling is a mathematical computational approach that represents the terrestrial hydrological cycle digitally and is used for water resources assessment. However, such modelling endeavours require a large amount of data. Generally, governmental departments produce and maintain these datasets, which can make them difficult to obtain due to bureaucratic constraints. In some countries, the availability and quality of geospatial and climate datasets remain a critical issue due to many factors, such as a lack of ground stations, expertise, technology or financial support, or wartime conditions. To overcome this problem, this research used public domain datasets from the Internet as ''input'' to a streamflow model. The intention is to simulate daily and monthly streamflow of the Johor River Basin in Malaysia. The model used is the Soil and Water Assessment Tool (SWAT). Freely available data, including a digital elevation model (DEM), land use information, and soil and climate data, were used as input. The model was validated against in-situ streamflow information obtained from the Rantau Panjang station for the year 2006. The coefficient of determination and Nash-Sutcliffe efficiency were 0.35/0.02 for daily simulated streamflow and 0.92/0.21 for monthly simulated streamflow, respectively. The results show that free data can provide a better simulation at a monthly scale than on a daily basis in a tropical region. A sensitivity analysis and calibration procedure should be conducted in order to maximize the ''goodness-of-fit'' between simulated and observed streamflow. The application of Internet datasets promises an acceptable performance of streamflow modelling. This research demonstrates that public domain data are suitable for streamflow modelling in a tropical river basin within acceptable accuracy

  6. Free internet datasets for streamflow modelling using SWAT in the Johor river basin, Malaysia

    Science.gov (United States)

    Tan, M. L.

    2014-02-01

    Streamflow modelling is a mathematical computational approach that represents the terrestrial hydrological cycle digitally and is used for water resources assessment. However, such modelling endeavours require a large amount of data. Generally, governmental departments produce and maintain these datasets, which can make them difficult to obtain due to bureaucratic constraints. In some countries, the availability and quality of geospatial and climate datasets remain a critical issue due to many factors, such as a lack of ground stations, expertise, technology or financial support, or wartime conditions. To overcome this problem, this research used public domain datasets from the Internet as "input" to a streamflow model. The intention is to simulate daily and monthly streamflow of the Johor River Basin in Malaysia. The model used is the Soil and Water Assessment Tool (SWAT). Freely available data, including a digital elevation model (DEM), land use information, and soil and climate data, were used as input. The model was validated against in-situ streamflow information obtained from the Rantau Panjang station for the year 2006. The coefficient of determination and Nash-Sutcliffe efficiency were 0.35/0.02 for daily simulated streamflow and 0.92/0.21 for monthly simulated streamflow, respectively. The results show that free data can provide a better simulation at a monthly scale than on a daily basis in a tropical region. A sensitivity analysis and calibration procedure should be conducted in order to maximize the "goodness-of-fit" between simulated and observed streamflow. The application of Internet datasets promises an acceptable performance of streamflow modelling. This research demonstrates that public domain data are suitable for streamflow modelling in a tropical river basin within acceptable accuracy.
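
    For reference, the two goodness-of-fit statistics quoted above (coefficient of determination and Nash-Sutcliffe efficiency) can be computed as follows; the observed and simulated arrays are placeholders.

```python
import numpy as np

def nash_sutcliffe(obs, sim):
    """NSE = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2)."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def coefficient_of_determination(obs, sim):
    """R^2: squared Pearson correlation between observed and simulated flows."""
    return np.corrcoef(obs, sim)[0, 1] ** 2

# Placeholder example with arbitrary flows (m^3/s):
obs = np.array([12.0, 30.0, 55.0, 20.0, 15.0, 60.0])
sim = np.array([10.0, 25.0, 70.0, 18.0, 22.0, 48.0])
print(round(coefficient_of_determination(obs, sim), 2),
      round(nash_sutcliffe(obs, sim), 2))
```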

  7. An Hourly Streamflow Forecasting Model Coupled with an Enforced Learning Strategy

    Directory of Open Access Journals (Sweden)

    Ming-Chang Wu

    2015-10-01

    Floods, one of the most significant natural hazards, often result in loss of life and property. Accurate hourly streamflow forecasting is always a key issue in hydrology for flood hazard mitigation. To improve the performance of hourly streamflow forecasting, a methodology for developing neural network (NN) based models with an enforced learning strategy is proposed in this paper. Firstly, four different NNs, namely the back propagation network (BPN), radial basis function network (RBFN), self-organizing map (SOM), and support vector machine (SVM), are used to construct streamflow forecasting models. Through cross-validation tests, the NN-based models with superior performance in streamflow forecasting are identified. Then, an enforced learning strategy is developed to further improve the performance of the superior NN-based models, i.e., SOM and SVM in this study. Finally, the proposed flow forecasting model is obtained. Applications to actual events are conducted to demonstrate the potential of the proposed model. Moreover, a comparison between the NN-based models with and without the enforced learning strategy is performed to evaluate the effect of the enforced learning strategy on model performance. The results indicate that the NN-based models with the enforced learning strategy indeed improve the accuracy of hourly streamflow forecasting. Hence, the presented methodology is expected to be helpful for developing improved NN-based streamflow forecasting models.
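
    The enforced learning strategy itself is not described in enough detail here to reproduce, so the sketch below shows only the baseline step: a feed-forward network (back-propagation type) trained on lagged hourly flows and checked on a held-out split. The data are synthetic and scikit-learn's MLPRegressor stands in for the models used in the paper.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(2)
# Synthetic hourly streamflow: smooth cycle plus noise.
flow = 50 + 30 * np.sin(np.arange(2000) / 24.0) + rng.normal(0, 3, 2000)

lags, lead = 6, 1                                   # predict 1 h ahead from 6 lagged values
X = np.column_stack([flow[i:len(flow) - lags - lead + 1 + i] for i in range(lags)])
y = flow[lags + lead - 1:]

split = int(0.8 * len(y))                            # simple train/validation split
model = MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
model.fit(X[:split], y[:split])
pred = model.predict(X[split:])
rmse = mean_squared_error(y[split:], pred) ** 0.5
print(f"validation RMSE: {rmse:.2f}")
```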

  8. Testing and modelling autoregressive conditional heteroskedasticity of streamflow processes

    Directory of Open Access Journals (Sweden)

    W. Wang

    2005-01-01

    Conventional streamflow models operate under the assumption of constant variance or season-dependent variances (e.g. ARMA (AutoRegressive Moving Average) models for deseasonalized streamflow series and PARMA (Periodic AutoRegressive Moving Average) models for seasonal streamflow series). However, with the McLeod-Li test and Engle's Lagrange Multiplier test, clear evidence is found for the existence of autoregressive conditional heteroskedasticity (i.e. the ARCH (AutoRegressive Conditional Heteroskedasticity) effect), a nonlinear phenomenon of the variance behaviour, in the residual series from linear models fitted to daily and monthly streamflow processes of the upper Yellow River, China. It is shown that the major cause of the ARCH effect is the seasonal variation in variance of the residual series. However, while the seasonal variation in variance can fully explain the ARCH effect for monthly streamflow, it is only a partial explanation for daily flow. It is also shown that while the periodic autoregressive moving average model is adequate for modelling monthly flows, no model is adequate for modelling daily streamflow processes, because none of the conventional time series models takes the seasonal variation in variance, as well as the ARCH effect in the residuals, into account. Therefore, an ARMA-GARCH (Generalized AutoRegressive Conditional Heteroskedasticity) error model is proposed to capture the ARCH effect present in daily streamflow series, as well as to preserve the seasonal variation in variance in the residuals. The ARMA-GARCH error model combines an ARMA model for modelling the mean behaviour and a GARCH model for modelling the variance behaviour of the residuals from the ARMA model. Since the GARCH model is not widely used in statistical hydrology, this work can be a useful addition in terms of statistical modelling of daily streamflow processes for the hydrological community.
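
    A minimal sketch of the ARMA-GARCH error idea: fit an ARMA model for the mean behaviour, then fit a GARCH(1,1) to its residuals. It assumes the statsmodels and arch packages and uses a synthetic autocorrelated series in place of the deseasonalized Yellow River flows, so it illustrates the workflow rather than the paper's results.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from arch import arch_model

rng = np.random.default_rng(3)
# Synthetic deseasonalized daily series with mild autocorrelation.
n, phi = 3000, 0.6
eps = rng.normal(0, 1, n)
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + eps[t]

# 1. ARMA(1,1) for the mean behaviour.
arma = ARIMA(x, order=(1, 0, 1)).fit()
resid = arma.resid

# 2. GARCH(1,1) for the conditional variance of the ARMA residuals.
garch = arch_model(resid, mean="Zero", vol="GARCH", p=1, q=1).fit(disp="off")
print(garch.params)   # omega, alpha[1], beta[1] of the fitted variance model
```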

  9. Simulation of daily streamflow for nine river basins in eastern Iowa using the Precipitation-Runoff Modeling System

    Science.gov (United States)

    Haj, Adel E.; Christiansen, Daniel E.; Hutchinson, Kasey J.

    2015-10-14

    The U.S. Geological Survey, in cooperation with the Iowa Department of Natural Resources, constructed Precipitation-Runoff Modeling System models to estimate daily streamflow for nine river basins in eastern Iowa that drain into the Mississippi River. The models are part of a suite of methods for estimating daily streamflow at ungaged sites. The Precipitation-Runoff Modeling System is a deterministic, distributed-parameter, physical-process-based modeling system developed to evaluate the response of streamflow and general drainage basin hydrology to various combinations of climate and land use. Calibration and validation periods used in each basin mostly were October 1, 2002, through September 30, 2012, but differed depending on the period of record available for daily mean streamflow measurements at U.S. Geological Survey streamflow-gaging stations.

  10. Generalized martingale model of the uncertainty evolution of streamflow forecasts

    Science.gov (United States)

    Zhao, Tongtiegang; Zhao, Jianshi; Yang, Dawen; Wang, Hao

    2013-07-01

    Streamflow forecasts are dynamically updated in real time, thus facilitating a process of forecast uncertainty evolution. Forecast uncertainty generally decreases over time and as more hydrologic information becomes available. The process of forecasting and uncertainty updating can be described by the martingale model of forecast evolution (MMFE), which formulates the total forecast uncertainty of a streamflow in one future period as the sum of forecast improvements in the intermediate periods. This study tests the assumptions, i.e., unbiasedness, Gaussianity, temporal independence, and stationarity, of MMFE using real-world streamflow forecast data. The results show that (1) real-world forecasts can be biased and tend to underestimate the actual streamflow, and (2) real-world forecast uncertainty is non-Gaussian and heavy-tailed. Based on these statistical tests, this study proposes a generalized martingale model (GMMFE) for the simulation of biased and non-Gaussian forecast uncertainties. The new model combines the normal quantile transform (NQT) with MMFE to formulate the uncertainty evolution of real-world streamflow forecasts. A reservoir operation example based on synthetic forecasts from GMMFE illustrates that applications of streamflow forecasting facilitate utility improvements and that special attention should be paid to the statistical distribution of forecast uncertainty.
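
    The normal quantile transform (NQT) component of GMMFE can be sketched as below: a skewed sample of forecast errors is mapped to standard-normal scores through its empirical CDF, and back again through the empirical quantiles. The error sample is synthetic and the implementation is only one plausible reading of the transform.

```python
import numpy as np
from scipy import stats

def nqt(x):
    """Normal quantile transform: map a sample to standard-normal scores."""
    x = np.asarray(x, float)
    p = (stats.rankdata(x) - 0.5) / x.size        # empirical non-exceedance probabilities
    return stats.norm.ppf(p)

def inverse_nqt(z, reference):
    """Map normal scores back through the empirical quantiles of `reference`."""
    p = stats.norm.cdf(z)
    return np.quantile(np.asarray(reference, float), p)

rng = np.random.default_rng(4)
errors = rng.gamma(2.0, 5.0, 1000) - 10.0          # skewed, heavy-tailed "forecast errors"
z = nqt(errors)                                    # approximately N(0, 1)
print(round(stats.skew(errors), 2), round(stats.skew(z), 2))
```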

  11. Streamflow forecast in the Alto do Rio Doce watershed in Brazil, using hydrological and atmospheric model

    Science.gov (United States)

    Silva, J. M.; Saad, S. I.; Palma, G.; Rocha, H.; Palmeira, R. M.; Silva, B. L.; Pessoa, A. A.; Ramos, C. G.; Cecchini, M. A.

    2013-05-01

    Electrical energy in Brazil depends essentially on streamflow, as hydropower accounts for up to 79% of the total installed electrical generation capacity. Therefore, streamflow forecasts are very important tools to assist in the planning and operation of Brazilian hydroelectric reservoirs. This study evaluated the performance of a distributed hydrological model, the Soil and Water Assessment Tool (SWAT), for daily streamflow forecasts at four reservoirs in the Alto do Rio Doce watershed in southeastern Brazil. The SWAT model was driven with precipitation forecasts from the regional meteorological model MM5. The calibration and validation of SWAT were accomplished using data from four monitoring stations. The model was run for the 2010-2012 period; the April 2010-September 2011 period was used for manual calibration, and the remainder of the period was used for validation. The manual calibration was conducted by means of sensitivity tests of the parameters that control surface runoff and groundwater flow, especially surlag and alpha_bf, respectively the surface runoff lag coefficient and the baseflow recession constant. The daily and monthly Nash-Sutcliffe efficiency, R2 and mean relative error performance indicators were used to assess the relative performance of the model. Results showed that the streamflow forecasts were very similar to observations, except in reservoirs with smaller drainage areas, where the model did not simulate the beginning of the flood season (Dec-Feb). The streamflow forecasts were strongly dependent on the quality of the precipitation forecasts used. Given that no correction of the rainfall simulated by the MM5 model in the Alto do Rio Doce watershed was conducted and no automated calibration method was applied to the parameters of the hydrologic model, we can conclude that the application of the SWAT hydrologic model employing the output data from the MM5 atmospheric model for streamflow forecast was shown to be a tool of great

  12. Partitioning uncertainty in streamflow projections under nonstationary model conditions

    Science.gov (United States)

    Chawla, Ila; Mujumdar, P. P.

    2018-02-01

    Assessing the impacts of Land Use (LU) and climate change on future streamflow projections is necessary for efficient management of water resources. However, model projections are burdened with significant uncertainty arising from various sources. Most previous studies have considered climate models and scenarios as the major sources of uncertainty, but uncertainties introduced by land use change and hydrologic model assumptions are rarely investigated. In this paper an attempt is made to segregate the contributions from (i) general circulation models (GCMs), (ii) emission scenarios, (iii) land use scenarios, (iv) the stationarity assumption of the hydrologic model, and (v) internal variability of the processes, to the overall uncertainty in streamflow projections using an analysis of variance (ANOVA) approach. Generally, impact assessment studies are carried out with hydrologic model parameters held unchanged in the future. It is, however, necessary to address the nonstationarity in model parameters with changing land use and climate. In this paper, a regression-based methodology is presented to obtain the hydrologic model parameters under changing land use and climate scenarios in the future. The Upper Ganga Basin (UGB) in India is used as a case study to demonstrate the methodology. The semi-distributed Variable Infiltration Capacity (VIC) model is set up over the basin under nonstationary conditions. Results indicate that model parameters vary with time, thereby invalidating the often-used assumption of model stationarity. The streamflow in the UGB under the nonstationary model condition is found to reduce in the future. The flows are also found to be sensitive to changes in land use. Segregation results suggest that the model stationarity assumption and GCMs, along with their interactions with emission scenarios, act as dominant sources of uncertainty. This paper provides a generalized framework for hydrologists to examine the stationarity assumption of models before considering them
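
    A toy version of the ANOVA-style segregation: given streamflow projections indexed by GCM and emission scenario (land use and the other sources omitted for brevity), the main-effect variances are compared with the total variance. The projection numbers are invented for illustration.

```python
import numpy as np

# Rows: 3 GCMs, columns: 2 emission scenarios; entries are projected % change in streamflow.
proj = np.array([[-5.0, -9.0],
                 [ 3.0, -1.0],
                 [10.0,  2.0]])

grand = proj.mean()
var_total = proj.var()
var_gcm = np.var(proj.mean(axis=1) - grand)           # main effect of GCM
var_scen = np.var(proj.mean(axis=0) - grand)           # main effect of scenario
var_inter = var_total - var_gcm - var_scen             # interaction + residual

for name, v in [("GCM", var_gcm), ("scenario", var_scen), ("interaction", var_inter)]:
    print(f"{name:12s} {100 * v / var_total:5.1f}% of total variance")
```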

  13. Validation of streamflow measurements made with M9 and RiverRay acoustic Doppler current profilers

    Science.gov (United States)

    Boldt, Justin A.; Oberg, Kevin A.

    2015-01-01

    The U.S. Geological Survey (USGS) Office of Surface Water (OSW) previously validated the use of Teledyne RD Instruments (TRDI) Rio Grande (in 2007), StreamPro (in 2006), and Broadband (in 1996) acoustic Doppler current profilers (ADCPs) for streamflow (discharge) measurements made by the USGS. Two new ADCPs, the SonTek M9 and the TRDI RiverRay, were first used in the USGS Water Mission Area programs in 2009. Since 2009, the OSW and USGS Water Science Centers (WSCs) have been conducting field measurements as part of their stream-gaging program using these ADCPs. The purpose of this paper is to document the results of USGS OSW analyses for validation of M9 and RiverRay ADCP streamflow measurements. The OSW required each participating WSC to make comparison measurements over the range of operating conditions in which the instruments were used until sufficient measurements were available. The performance of these ADCPs was evaluated for validation and to identify any present and potential problems. Statistical analyses of streamflow measurements indicate that measurements made with the SonTek M9 ADCP using firmware 2.00–3.00 or the TRDI RiverRay ADCP using firmware 44.12–44.15 are unbiased, and therefore, can continue to be used to make streamflow measurements in the USGS stream-gaging program. However, for the M9 ADCP, there are some important issues to be considered in making future measurements. Possible future work may include additional validation of streamflow measurements made with these instruments from other locations in the United States and measurement validation using updated firmware and software.

  14. Inferring Soil Moisture Memory from Streamflow Observations Using a Simple Water Balance Model

    Science.gov (United States)

    Orth, Rene; Koster, Randal Dean; Seneviratne, Sonia I.

    2013-01-01

    Soil moisture is known for its integrative behavior and resulting memory characteristics. Soil moisture anomalies can persist for weeks or even months into the future, making initial soil moisture a potentially important contributor to skill in weather forecasting. A major difficulty when investigating soil moisture and its memory using observations is the sparse availability of long-term measurements and their limited spatial representativeness. In contrast, there is an abundance of long-term streamflow measurements for catchments of various sizes across the world. We investigate in this study whether such streamflow measurements can be used to infer and characterize soil moisture memory in respective catchments. Our approach uses a simple water balance model in which evapotranspiration and runoff ratios are expressed as simple functions of soil moisture; optimized functions for the model are determined using streamflow observations, and the optimized model in turn provides information on soil moisture memory on the catchment scale. The validity of the approach is demonstrated with data from three heavily monitored catchments. The approach is then applied to streamflow data in several small catchments across Switzerland to obtain a spatially distributed description of soil moisture memory and to show how memory varies, for example, with altitude and topography.
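
    A minimal sketch of the kind of bucket model described, assuming linear dependence of the evapotranspiration and runoff ratios on soil wetness (the paper optimizes these functions against streamflow observations; here the parameter values and forcing are simply assumed). Memory is then characterized by a lag autocorrelation of the simulated soil moisture.

```python
import numpy as np

def water_balance(precip, cwc=300.0, et_max=3.0, runoff_coef=0.4, w0=0.5):
    """Daily bucket model: dS = P - ET(S) - Q(S); returns soil wetness and runoff."""
    w, wetness, runoff = w0, [], []
    for p in precip:
        et = et_max * w                    # ET increases linearly with wetness
        q = runoff_coef * w * p            # runoff ratio increases linearly with wetness
        w = np.clip(w + (p - et - q) / cwc, 0.0, 1.0)
        wetness.append(w)
        runoff.append(q)
    return np.array(wetness), np.array(runoff)

rng = np.random.default_rng(5)
precip = rng.gamma(0.4, 8.0, 3 * 365)                # synthetic daily precipitation (mm)
wetness, runoff = water_balance(precip)

lag = 30                                              # 30-day-lag autocorrelation as a memory measure
rho = np.corrcoef(wetness[:-lag], wetness[lag:])[0, 1]
print(f"30-day soil moisture autocorrelation: {rho:.2f}")
```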

  15. Assimilating uncertain, dynamic and intermittent streamflow observations in hydrological models

    Science.gov (United States)

    Mazzoleni, Maurizio; Alfonso, Leonardo; Chacon-Hurtado, Juan; Solomatine, Dimitri

    2015-09-01

    Catastrophic floods cause significant socio-economic losses. Non-structural measures, such as real-time flood forecasting, can potentially reduce flood risk. To this end, data assimilation methods have been used to improve flood forecasts by integrating static ground observations, and in some cases also remote sensing observations, within water models. Current hydrologic and hydraulic research typically considers the assimilation of observations coming from traditional, static sensors. At the same time, low-cost mobile sensors and mobile communication devices are also becoming increasingly available. The main goal and innovation of this study is to demonstrate the usefulness of assimilating uncertain streamflow observations that are dynamic in space and intermittent in time in the context of two different semi-distributed hydrological model structures. The developed method is applied to the Brue basin, where the dynamic observations are imitated by synthetic observations of discharge. The results of this study show how model structures and sensor locations affect in different ways the assimilation of streamflow observations. In addition, it is shown that assimilation of such uncertain observations from dynamic sensors can provide model improvements similar to those obtained from streamflow observations coming from a non-optimal network of static physical sensors. This can be a potential application of recent efforts to build citizen observatories of water, which can make citizens an active part of information capturing, evaluation and communication, simultaneously helping to improve model-based flood forecasting.

  16. A multivariate conditional model for streamflow prediction and spatial precipitation refinement

    Science.gov (United States)

    Liu, Zhiyong; Zhou, Ping; Chen, Xiuzhi; Guan, Yinghui

    2015-10-01

    The effective prediction and estimation of hydrometeorological variables are important for water resources planning and management. In this study, we propose a multivariate conditional model for streamflow prediction and the refinement of spatial precipitation estimates. This model consists of high-dimensional vine copulas, conditional bivariate copula simulations, and a quantile-copula function. The vine copula is employed because of its flexibility in modeling the high-dimensional joint distribution of multivariate data by building a hierarchy of conditional bivariate copulas. We investigate two cases to evaluate the performance and applicability of the proposed approach. In the first case, we generate one-month-ahead streamflow forecasts that incorporate multiple predictors, including antecedent precipitation and streamflow records, in a basin located in South China. The prediction accuracy of the vine-based model is compared with that of traditional data-driven models such as support vector regression (SVR) and the adaptive neuro-fuzzy inference system (ANFIS). The results indicate that the proposed model produces more skillful forecasts than SVR and ANFIS. Moreover, this probabilistic model yields additional information concerning the predictive uncertainty. The second case involves refining spatial precipitation estimates derived from the Tropical Rainfall Measuring Mission (TRMM) precipitation product for the Yangtze River basin by incorporating remotely sensed soil moisture data and the observed precipitation from meteorological gauges over the basin. The validation results indicate that the proposed model successfully refines the spatial precipitation estimates. Although this model is tested for specific cases, it can be extended to other hydrometeorological variables for prediction and spatial estimation.

  17. STREAMFLOW AND WATER QUALITY REGRESSION MODELING ...

    African Journals Online (AJOL)

    Journal of Modeling, Design and Management of Engineering Systems ... Consistency tests, trend analyses and mathematical modeling of water quality constituents and riverflow characteristics at upstream Nekede station and downstream Obigbo station show: consistent time-trends in degree of contamination; linear and ...

  18. Numerical model for learning concepts of streamflow simulation

    Science.gov (United States)

    DeLong, L.L.; ,

    1993-01-01

    Numerical models are useful for demonstrating principles of open-channel flow. Such models can allow experimentation with cause-and-effect relations, testing concepts of physics and numerical techniques. Four PT is a numerical model written primarily as a teaching supplement for a course in one-dimensional stream-flow modeling. Four PT options particularly useful in training include selection of governing equations, boundary-value perturbation, and user-programmable constraint equations. The model can simulate non-trivial concepts such as flow in complex interconnected channel networks, meandering channels with variable effective flow lengths, hydraulic structures defined by unique three-parameter relations, and density-driven flow. The model is coded in FORTRAN 77, and data encapsulation is used extensively to simplify maintenance and modification and to enhance the use of Four PT modules by other programs and programmers.

  19. Determining the importance of model calibration for forecasting absolute/relative changes in streamflow from LULC and climate changes

    Science.gov (United States)

    Niraula, Rewati; Meixner, Thomas; Norman, Laura M.

    2015-01-01

    Land use/land cover (LULC) and climate changes are important drivers of change in streamflow. Assessing the impact of LULC and climate changes on streamflow is typically done with a calibrated and validated watershed model. However, there is a debate on the degree of calibration required. The objective of this study was to quantify the variation in estimated relative and absolute changes in streamflow associated with LULC and climate changes under different calibration approaches. The Soil and Water Assessment Tool (SWAT) was applied in an uncalibrated (UC), single outlet calibrated (OC), and spatially-calibrated (SC) mode to compare the relative and absolute changes in streamflow at 14 gaging stations within the Santa Cruz River Watershed in southern Arizona, USA. For this purpose, the effects of 3 LULC, 3 precipitation (P), and 3 temperature (T) scenarios were tested individually. For the validation period, Percent Bias (PBIAS) values were >100% with the UC model for all gages; the values were between 0% and 100% with the OC model and within 20% with the SC model. Changes in streamflow predicted with the UC and OC models were compared with those of the SC model. This approach implicitly assumes that the SC model is “ideal”. Results indicated that the magnitudes of both absolute and relative changes in streamflow due to LULC change predicted with the UC and OC models differed from those of the SC model. The magnitudes of absolute changes due to climate change (both P and T) predicted with the UC and SC models were also significantly different, but those predicted with the OC and SC models were not. Results clearly indicated that relative changes due to climate change predicted with the UC and OC models were not significantly different from those predicted with the SC model. This result suggests that it is important to calibrate the model spatially to analyze the effect of LULC change, but that spatial calibration is less important for analyzing the relative change in streamflow due to climate change. This
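
    For reference, the Percent Bias (PBIAS) statistic used to compare the UC, OC, and SC models is the total simulation error expressed as a percentage of total observed flow; the arrays below are placeholders.

```python
import numpy as np

def pbias(obs, sim):
    """Percent bias: with this sign convention, positive values indicate underestimation."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 100.0 * np.sum(obs - sim) / np.sum(obs)

obs = np.array([4.0, 9.0, 25.0, 12.0, 6.0])   # observed flows (m^3/s), placeholder
sim = np.array([2.0, 5.0, 15.0, 8.0, 4.0])    # simulated flows, placeholder
print(f"PBIAS = {pbias(obs, sim):.1f}%")       # large magnitude -> strong bias
```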

  20. Artificial Neural Network Models for Long Lead Streamflow Forecasts using Climate Information

    Science.gov (United States)

    Kumar, J.; Devineni, N.

    2007-12-01

    developed between the identified predictors and the predictand. The predictors used are the scores from a Principal Components Analysis (PCA). The models were tested and validated. Feed-forward multi-layer perceptron (MLP) neural networks trained using back-propagation algorithms are employed in the current study. The performance of the ANN-based forecasts is evaluated using measures such as the correlation coefficient and the root mean square error (RMSE). The preliminary results show that ANNs can efficiently forecast long-lead-time streamflows using climatic predictors.

  1. Multivariate synthetic streamflow generation using a hybrid model based on artificial neural networks

    Directory of Open Access Journals (Sweden)

    J. C. Ochoa-Rivera

    2002-01-01

    A model for multivariate streamflow generation is presented, based on a multilayer feedforward neural network. The structure of the model results from two components, the neural network (NN) deterministic component and a random component which is assumed to be normally distributed. It is from this second component that the model achieves the ability to incorporate effectively the uncertainty associated with hydrological processes, making it valuable as a practical tool for synthetic generation of streamflow series. The NN topology and the corresponding analytical explicit formulation of the model are described in detail. The model is calibrated with a series of monthly inflows to two reservoir sites located in the Tagus River basin (Spain), while validation is performed through estimation of a set of statistics that is relevant for water resources systems planning and management. Among others, drought and storage statistics are computed and compared for both the synthetic and historical series. The performance of the NN-based model was compared to that of a standard autoregressive AR(2) model. Results show that the NN represents a promising modelling alternative for simulation purposes, with interesting potential in the context of water resources systems management and optimisation. Keywords: neural networks, multilayer perceptron, error backpropagation, hydrological scenario generation, multivariate time series.

  2. From spatially variable streamflow to distributed hydrological models: Analysis of key modeling decisions

    Science.gov (United States)

    Fenicia, Fabrizio; Kavetski, Dmitri; Savenije, Hubert H. G.; Pfister, Laurent

    2016-02-01

    This paper explores the development and application of distributed hydrological models, focusing on the key decisions of how to discretize the landscape, which model structures to use in each landscape element, and how to link model parameters across multiple landscape elements. The case study considers the Attert catchment in Luxembourg—a 300 km2 mesoscale catchment with 10 nested subcatchments that exhibit clearly different streamflow dynamics. The research questions are investigated using conceptual models applied at hydrologic response unit (HRU) scales (1-4 HRUs) at 6-hourly time steps. Multiple model structures are hypothesized and implemented using the SUPERFLEX framework. Following calibration, space/time model transferability is tested using a split-sample approach, with evaluation criteria including streamflow prediction error metrics and hydrological signatures. Our results suggest that: (1) models using geology-based HRUs are more robust and capture the spatial variability of streamflow time series and signatures better than models using topography-based HRUs; this finding supports the hypothesis that, in the Attert, geology exerts a stronger control than topography on streamflow generation, (2) streamflow dynamics of different HRUs can be represented using distinct and remarkably simple model structures, which can be interpreted in terms of the perceived dominant hydrologic processes in each geology type, and (3) the same maximum root zone storage can be used across the three dominant geological units with no loss in model transferability; this finding suggests that the partitioning of water between streamflow and evaporation in the study area is largely independent of geology and can be used to improve model parsimony. The modeling methodology introduced in this study is general and can be used to advance our broader understanding and prediction of hydrological behavior, including the landscape characteristics that control hydrologic response, the

  3. A metric for attributing variability in modelled streamflows

    Science.gov (United States)

    Shoaib, Syed Abu; Marshall, Lucy; Sharma, Ashish

    2016-10-01

    Significant gaps in our present understanding of hydrological systems lead to enhanced uncertainty in key modelling decisions. This study proposes a method, namely the "Quantile Flow Deviation" (QFD), for the attribution of forecast variability to different sources across different streamflow regimes. By using a quantile-based metric, we can assess the change in uncertainty across individual percentiles, thereby allowing uncertainty to be expressed as a function of magnitude and time. As a result, one can address selective sources of uncertainty depending on whether low or high flows (say) are of interest. By way of a case study, we demonstrate the usefulness of the approach for estimating the relative importance of model parameter identification, objective functions and model structures as sources of streamflow forecast uncertainty. We use FUSE (Framework for Understanding Structural Errors) to implement our methods, allowing selection of multiple different model structures. A cross-catchment comparison is done for two different catchments: the Leaf River in Mississippi, USA and the Bass River in Victoria, Australia. Two different approaches to parameter estimation are presented to demonstrate the statistic: one based on GLUE, the other based on optimization. The results presented in this study suggest that the determination of the model structure for the design catchment should be given priority, but that objective function selection together with parameter identifiability can lead to significant variability in results. By examining the QFD across multiple flow quantiles, the ability of certain models and optimization routines to constrain variability for different flow conditions is demonstrated.

  4. Comparing large-scale hydrological model predictions with observed streamflow in the Pacific Northwest: effects of climate and groundwater

    Science.gov (United States)

    Mohammad Safeeq; Guillaume S. Mauger; Gordon E. Grant; Ivan Arismendi; Alan F. Hamlet; Se-Yeun Lee

    2014-01-01

    Assessing uncertainties in hydrologic models can improve accuracy in predicting future streamflow. Here, simulated streamflows using the Variable Infiltration Capacity (VIC) model at coarse (1/16°) and fine (1/120°) spatial resolutions were evaluated against observed streamflows from 217 watersheds. In...

  5. Climate model assessment of changes in winter-spring streamflow timing over North America

    Science.gov (United States)

    Kam, Jonghun; Knutson, Thomas R.; Milly, Paul C. D.

    2018-01-01

    Over regions where snow-melt runoff substantially contributes to winter-spring streamflows, warming can accelerate snow melt and reduce dry-season streamflows. However, conclusive detection of changes and attribution to anthropogenic forcing is hindered by the brevity of observational records, model uncertainty, and uncertainty concerning internal variability. In this study, detection and attribution of changes in mid-latitude North American winter-spring streamflow timing are examined using nine global climate models under multiple forcing scenarios. Robustness across models, start/end dates for trends, and assumptions about internal variability is evaluated. Marginal evidence for an emerging detectable anthropogenic influence (according to four or five of nine models) is found in the north-central U.S., where winter-spring streamflows have been coming earlier. Weaker indications of detectable anthropogenic influence (three of nine models) are found in the mountainous western U.S./southwestern Canada and in the extreme northeastern U.S./Canadian Maritimes. In the former region, a recent shift toward later streamflows has rendered the full-record trend toward earlier streamflows only marginally significant, with possible implications for previously published climate change detection findings for streamflow timing in this region. In the latter region, no forced model shows as large a shift toward earlier streamflow timing as the detectable observed shift. In other (including warm, snow-free) regions, observed trends are typically not detectable, although in the U.S. central plains we find detectable delays in streamflow, which are inconsistent with forced model experiments.

  6. Simulation-optimization framework for multi-site multi-season hybrid stochastic streamflow modeling

    Science.gov (United States)

    Srivastav, Roshan; Srinivasan, K.; Sudheer, K. P.

    2016-11-01

    A simulation-optimization (S-O) framework is developed for the hybrid stochastic modeling of multi-site multi-season streamflows. The multi-objective optimization model formulated is the driver, and the multi-site, multi-season hybrid matched block bootstrap model (MHMABB) is the simulation engine within this framework. The multi-site multi-season simulation model is an extension of the existing single-site multi-season simulation model. A robust and efficient evolutionary search based technique, namely the non-dominated sorting genetic algorithm (NSGA-II), is employed as the solution technique for the multi-objective optimization within the S-O framework. The objective functions employed are related to the preservation of the multi-site critical deficit run sum, and the constraints introduced are concerned with the hybrid model parameter space and the preservation of certain statistics (such as inter-annual dependence and/or skewness of aggregated annual flows). The efficacy of the proposed S-O framework is brought out through a case example from the Colorado River basin. The proposed multi-site multi-season model AMHMABB (whose parameters are obtained from the proposed S-O framework) preserves the temporal as well as the spatial statistics of the historical flows. The other multi-site deficit run characteristics, namely the number of runs, the maximum run length, the mean run sum and the mean run length, are also well preserved by the AMHMABB model. Overall, the proposed AMHMABB model shows better streamflow modeling performance than the simulation-based SMHMABB model, plausibly due to the significant role played by: (i) the objective functions related to the preservation of the multi-site critical deficit run sum; (ii) the huge hybrid model parameter space available for the evolutionary search and (iii) the constraint on the preservation of the inter-annual dependence. Split-sample validation results indicate that the AMHMABB model is

  7. A hybrid model to assess the impact of climate variability on streamflow for an ungauged mountainous basin

    Science.gov (United States)

    Wang, Chong; Xu, Jianhua; Chen, Yaning; Bai, Ling; Chen, Zhongsheng

    2018-04-01

    Quantitatively assessing the impact of climate variability on streamflow in an ungauged mountainous basin is a difficult and challenging task. In this study, a hybrid model combining a downscaling method based on earth data products, back propagation artificial neural networks (BPANN) and the weights connection method was developed to explore an approach for solving this problem. To validate the applicability of the hybrid model, the Kumarik River and Toshkan River, two headwaters of the Aksu River, were employed to assess the impact of climate variability on streamflow using this hybrid model. The hybrid model presented a good performance, and the quantitative assessment results for the two headwaters are: (1) precipitation increased by 48.5 and 41.0 mm per decade in the Kumarik and Toshkan catchments, respectively, and the average annual temperature increased by 0.1 °C per decade in both catchments from 1980 to 2012; (2) with the warming and wetting climate, streamflow increased by 1.5 × 10⁸ and 3.3 × 10⁸ m³ per decade in the Kumarik River and the Toshkan River, respectively; and (3) the contributions of temperature and precipitation to streamflow were 64.01 ± 7.34% and 35.99 ± 7.34% in the Kumarik catchment, and 47.72 ± 8.10% and 52.26 ± 8.10% in the Toshkan catchment. Our study introduces a feasible hybrid model for assessing the impact of climate variability on streamflow, which can be used in ungauged mountainous basins of Northwest China.

  8. Precipitation-runoff and streamflow-routing models for the Willamette River basin, Oregon

    Science.gov (United States)

    Laenen, Antonius; Risley, John C.

    1997-01-01

    Precipitation-runoff and streamflow-routing models were constructed and assessed as part of a water-quality study of the Willamette River Basin. The study was a cooperative effort between the U.S. Geological Survey (USGS) and the Oregon Department of Environmental Quality (ODEQ) and was coordinated with the USGS National Water-Quality Assessment (NAWQA) study of the Willamette River. Routing models are needed to estimate streamflow so that water-quality constituent loads can be calculated from measured concentrations and so that sources, sinks, and downstream changes in those loads can be identified. Runoff models are needed to estimate ungaged-tributary inflows for routing models and to identify flow contributions from different parts of the basin. The runoff and routing models can be run either separately or together to simulate streamflow at various locations and to examine streamflow contributions from overland flow, shallow-subsurface flow, and ground-water flow.

  9. Wavelet-linear genetic programming: A new approach for modeling monthly streamflow

    Science.gov (United States)

    Ravansalar, Masoud; Rajaee, Taher; Kisi, Ozgur

    2017-06-01

    Streamflow is an important and effective factor in stream ecosystems, and its accurate prediction is an essential issue in water resources and environmental engineering systems. A hybrid wavelet-linear genetic programming (WLGP) model, which combines a discrete wavelet transform (DWT) and linear genetic programming (LGP), was used in this study to predict the monthly streamflow (Q) at two gauging stations, Pataveh and Shahmokhtar, on the Beshar River at Yasuj, Iran. In the proposed WLGP model, the wavelet analysis was linked to the LGP model: the original streamflow time series were decomposed into sub-time series comprising wavelet coefficients. The results were compared with single LGP, artificial neural network (ANN), hybrid wavelet-ANN (WANN) and multiple linear regression (MLR) models. The comparisons were made using several commonly utilized statistics. The Nash-Sutcliffe coefficients (E) were 0.877 and 0.817 for the WLGP model at the Pataveh and Shahmokhtar stations, respectively. The comparison of the results showed that the WLGP model could significantly increase the streamflow prediction accuracy at both stations. Since the results demonstrate a closer approximation of the peak streamflow values by the WLGP model, this model could be utilized for the prediction of monthly streamflow one month ahead.
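
    As an illustration of the wavelet pre-processing step, the sketch below decomposes a synthetic monthly flow series with an undecimated discrete wavelet transform (PyWavelets) and regresses next-month flow on the sub-series coefficients. A plain linear regression is an openly substituted stand-in for the linear genetic programming component.

```python
import numpy as np
import pywt
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(6)
n = 512                                              # length must be divisible by 2**level for swt
t = np.arange(n)
flow = 40 + 25 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 5, n)   # synthetic monthly flow

# Undecimated (stationary) wavelet transform: every sub-series keeps the original length,
# which makes it easy to use the coefficients as lagged predictors.
coeffs = pywt.swt(flow, "db4", level=2)              # [(cA2, cD2), (cA1, cD1)]
subseries = np.column_stack([c for pair in coeffs for c in pair])

X, y = subseries[:-1], flow[1:]                      # predict next month's flow
split = int(0.8 * n)
model = LinearRegression().fit(X[:split], y[:split])
pred = model.predict(X[split:])
nse = 1 - np.sum((y[split:] - pred) ** 2) / np.sum((y[split:] - y[split:].mean()) ** 2)
print(f"validation NSE: {nse:.2f}")
```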

  10. A Pareto-optimal moving average multigene genetic programming model for daily streamflow prediction

    Science.gov (United States)

    Danandeh Mehr, Ali; Kahya, Ercan

    2017-06-01

    Genetic programming (GP) is able to systematically explore alternative model structures of different accuracy and complexity from observed input and output data. The effectiveness of GP in hydrological system identification has been recognized in recent studies. However, selecting a parsimonious (accurate and simple) model from such alternatives still remains a question. This paper proposes a Pareto-optimal moving average multigene genetic programming (MA-MGGP) approach to develop a parsimonious model for single-station streamflow prediction. The three main components of the approach, which take us from observed data to a validated model, are: (1) data pre-processing, (2) system identification and (3) system simplification. The data pre-processing ingredient uses a simple moving average filter to diminish the lagged prediction effect of stand-alone data-driven models. The multigene ingredient of the model tends to identify the underlying nonlinear system with expressions simpler than classical monolithic GP, and eventually the simplification component exploits a Pareto front plot to select a parsimonious model through an interactive complexity-efficiency trade-off. The approach was tested using the daily streamflow records from a station on Senoz Stream, Turkey. Compared to the efficiency results of stand-alone GP, MGGP, and conventional multiple linear regression prediction models used as benchmarks, the proposed Pareto-optimal MA-MGGP model put forward a parsimonious solution of noteworthy practical value. In addition, the approach allows the user to bring human insight into the problem, examine the evolved models and pick the best-performing programs for further analysis.

  11. Historical Streamflow Series Analysis Applied to Furnas HPP Reservoir Watershed Using the SWAT Model

    Directory of Open Access Journals (Sweden)

    Viviane de Souza Dias

    2018-04-01

    Over the last few years, the operation of the Furnas Hydropower Plant (HPP) reservoir, located in the Grande River Basin, has been threatened by a significant reduction in inflow. In the region, hydrological modelling tools are being used and tested to support decision making and water sustainability. In this study, streamflow was modelled in the area of direct influence of the Furnas HPP reservoir, and the performance of the Soil and Water Assessment Tool (SWAT) model was verified for studies in the region. Sensitivity and uncertainty analyses were undertaken using the Sequential Uncertainty Fitting algorithm (SUFI-2) with the SWAT Calibration and Uncertainty Program (SWAT-CUP). The hydrological modelling, at a monthly scale, presented good results in the calibration (NS 0.86), with a slight reduction of the coefficient in the validation period (NS 0.64). The results suggest that this tool could be applied in future hydrological studies in the region, with the caveat that special attention should be given to the historical series used in the calibration and validation of the models. It is important to note that this region has high demands for water resources, primarily for agricultural use, and water demands must also be taken into account in future hydrological simulations. The validation of this methodology led to important contributions to the management of water resources in regions with tropical climates whose climatological and geological reality resembles the one studied here.

  12. Reconstructed streamflow in the eastern United States: validity, drivers, and challenges

    Science.gov (United States)

    Maxwell, S.; Harley, G. L.; Maxwell, J. T.; Rayback, S. A.; Pederson, N.; Cook, E. R.; Barclay, D. J.; Li, W.; Rayburn, J. A.

    2015-12-01

    Tree-ring reconstructions of streamflow are uncommon in the eastern US compared to the western US. While the eastern US does not experience severe drought on the scale of the west, multi-year droughts have stressed water management systems throughout the east. Here, we reconstruct streamflow for three rivers serving population centers in the northeast (Beaver Kill River serving New York City, NY), mid-Atlantic (Potomac River serving Washington, D.C.), and southeast (Flint River serving Atlanta, GA) to demonstrate the ability to reconstruct streamflow in the eastern US. We then conducted an interbasin comparison to identify periods of common variability and examined synoptic-scale drivers of drought and pluvial events. Finally, we discuss the utility of multi-species reconstructions in the moist, biodiverse eastern US. Our calibration models explained 66-68% of the variance in the instrumental record and passed verification tests in all basins back to 1675 CE. Drought and pluvial events showed some synchrony across all basins, but the mid-Atlantic acted as a hinge, sometimes behaving more like the northeast and other times like the southeast. Weak correlations with oceanic-atmospheric oscillations made identification of synoptic-scale drivers difficult. However, there appears to be a relationship between the position of the western ridge of the North Atlantic Subtropical High and streamflow across the basins of the east. Given the many factors influencing tree growth in closed-canopy systems, we have shown that careful standardization of individual tree-ring series, nested regression models, and the use of multiple species can produce robust proxies in the east.

  13. Modeled future peak streamflows in four coastal Maine rivers

    Science.gov (United States)

    Hodgkins, Glenn A.; Dudley, Robert W.

    2013-01-01

    To safely and economically design bridges and culverts, it is necessary to compute the magnitude of peak streamflows that have specified annual exceedance probabilities (AEPs). Annual precipitation and air temperature in the northeastern United States are, in general, projected to increase during the 21st century. It is therefore important for engineers and resource managers to understand how peak flows may change in the future. This report, prepared in cooperation with the Maine Department of Transportation (MaineDOT), presents modeled changes in peak flows at four basins in coastal Maine on the basis of projected changes in air temperature and precipitation. To estimate future peak streamflows at the four basins in this study, historical values for climate (temperature and precipitation) in the basins were adjusted by different amounts and input to a hydrologic model of each study basin. To encompass the projected changes in climate in coastal Maine by the end of the 21st century, air temperatures were adjusted by four different amounts, from -3.6 degrees Fahrenheit (ºF) (-2 degrees Celsius (ºC)) to +10.8 ºF (+6 ºC) of observed temperatures. Precipitation was adjusted by three different percentage values from -15 percent to +30 percent of observed precipitation. The resulting 20 combinations of temperature and precipitation changes (includes the no-change scenarios) were input to Precipitation-Runoff Modeling System (PRMS) watershed models, and annual daily maximum peak flows were calculated for each combination. Modeled peak flows from the adjusted changes in temperature and precipitation were compared to unadjusted (historical) modeled peak flows. Annual daily maximum peak flows increase or decrease, depending on whether temperature or precipitation is adjusted; increases in air temperature (with no change in precipitation) lead to decreases in peak flows, whereas increases in precipitation (with no change in temperature) lead to increases in peak flows. As

  14. Simulation of streamflow in the Pleasant, Narraguagus, Sheepscot, and Royal Rivers, Maine, using watershed models

    Science.gov (United States)

    Dudley, Robert W.; Nielsen, Martha G.

    2011-01-01

    The U.S. Geological Survey (USGS) began a study in 2008 to investigate anticipated changes in summer streamflows and stream temperatures in four coastal Maine river basins and the potential effects of those changes on populations of endangered Atlantic salmon. To achieve this purpose, it was necessary to characterize the quantity and timing of streamflow in these rivers by developing and evaluating a distributed-parameter watershed model for part of each river basin using the USGS Precipitation-Runoff Modeling System (PRMS). The GIS (geographic information system) Weasel, a USGS software application, was used to delineate the four study basins and their many subbasins, and to derive parameters for their geographic features. The models were calibrated using a four-step optimization procedure in which model output was evaluated against four datasets for calibrating solar radiation, potential evapotranspiration, annual and seasonal water balances, and daily streamflows. The calibration procedure involved thousands of model runs that used the USGS software application Luca (Let us calibrate). Luca uses the Shuffled Complex Evolution (SCE) global search algorithm to calibrate the model parameters. The calibrated watershed models performed satisfactorily, in that Nash-Sutcliffe efficiency (NSE) statistic values for the calibration periods ranged from 0.59 to 0.75 (on a scale of negative infinity to 1) and NSE statistic values for the evaluation periods ranged from 0.55 to 0.73. The calibrated watershed models simulate daily streamflow at many locations in each study basin. These models enable natural resources managers to characterize the timing and amount of streamflow in order to support a variety of water-resources efforts, including water-quality calculations, assessments of water use, modeling of population dynamics and migration of Atlantic salmon, modeling and assessment of habitat, and simulation of anticipated changes to streamflow and water temperature

  15. Improving characterization of streamflow by conceptual modeling of rating curve uncertainty

    Science.gov (United States)

    Weijs, Steven; Galindo, Luis

    2017-04-01

    Streamflow time series are an important source of information for hydrological predictions, both through direct use in extreme value analysis and through streamflow records used in the calibration of hydrological models. In this research we look at ways to best represent uncertainties in the rating curve and to constrain them using additional information beyond the stage-discharge (Q, h) pairs used traditionally. One possible avenue for using such information is a more physically based representation of rating curves and explicit accounting for the dynamic nature of the stage-discharge relation. We present these representations and the reduction in uncertainty that can be achieved by introducing various pieces of external information. The influence of variable streamflow uncertainty on hydrological model calibration will also be explored.

  16. Improving the Distributed Hydrological Model Performance in Upper Huai River Basin: Using Streamflow Observations to Update the Basin States via the Ensemble Kalman Filter

    Directory of Open Access Journals (Sweden)

    Yongwei Liu

    2016-01-01

    This study investigates the capability of improving distributed hydrological model performance by assimilating streamflow observations. Incorrectly estimated model states lead to discrepancies between observed and estimated streamflow. Consequently, streamflow observations can be used to update the model states, and the improved model states will eventually benefit the streamflow predictions. This study tests this concept in the upper Huai River basin. We assimilate the streamflow observations sequentially into the Soil and Water Assessment Tool (SWAT) using the ensemble Kalman filter (EnKF) to update the model states. Both synthetic experiments and a real-data application are used to demonstrate the benefit of this data assimilation scheme. The experiments show that assimilating the streamflow observations at interior sites significantly improves the streamflow predictions for the whole basin, whereas assimilating the catchment outlet streamflow improves the streamflow predictions near the catchment outlet. In the real-data case, the estimated streamflow at the catchment outlet is significantly improved by assimilating the in situ streamflow measurements at interior gauges. Assimilating the in situ catchment outlet streamflow also improves the streamflow prediction at one interior location on the main reach. This suggests that updating model states using streamflow observations can constrain the flux estimates in distributed hydrological modeling.
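
The sequential state updating described in this record follows the standard EnKF analysis step. A minimal sketch is given below, assuming a generic state vector (for example, soil storages) and a single streamflow observation; it is not the authors' SWAT implementation, and the function and variable names are illustrative.

```python
# Generic perturbed-observation EnKF analysis step: a streamflow observation
# updates an ensemble of model states via the ensemble cross-covariance.
import numpy as np

def enkf_update(states, sim_q, obs_q, obs_err_std, rng=None):
    """
    states      : (n_ens, n_state) ensemble of model states (e.g. soil storages)
    sim_q       : (n_ens,) simulated streamflow for each ensemble member
    obs_q       : scalar observed streamflow
    obs_err_std : observation error standard deviation
    """
    if rng is None:
        rng = np.random.default_rng(0)
    n_ens = states.shape[0]
    state_anom = states - states.mean(axis=0)      # state anomalies
    q_anom = sim_q - sim_q.mean()                  # predicted-observation anomalies

    cov_xq = state_anom.T @ q_anom / (n_ens - 1)   # cross-covariance state vs. streamflow
    var_q = q_anom @ q_anom / (n_ens - 1)          # streamflow ensemble variance
    gain = cov_xq / (var_q + obs_err_std**2)       # Kalman gain (one entry per state)

    # Each member assimilates a noisy copy of the observation
    obs_perturbed = obs_q + rng.normal(0.0, obs_err_std, n_ens)
    innovations = obs_perturbed - sim_q
    return states + np.outer(innovations, gain)    # updated ensemble states
```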

  17. Coupled daily streamflow and water temperature modelling in large river basins

    Directory of Open Access Journals (Sweden)

    M. T. H. van Vliet

    2012-11-01

    Realistic estimates of daily streamflow and water temperature are required for effective management of water resources (e.g., for electricity and drinking water production) and freshwater ecosystems. Although hydrological and process-based water temperature modelling approaches have been successfully applied to small catchments and short time periods, much less work has been done at large spatial and temporal scales. We present a physically based modelling framework for daily river discharge and water temperature simulations applicable to large river systems on a global scale. Model performance was tested globally at 1/2° × 1/2° spatial resolution and a daily time step for the period 1971–2000. We made specific evaluations for large river basins situated in different hydro-climatic zones and characterized by different anthropogenic impacts. Effects of anthropogenic heat discharges on simulated water temperatures were incorporated by using global gridded thermoelectric water use datasets and representing thermal discharges as point sources in the heat advection equation. This resulted in a significant increase in the quality of the water temperature simulations for thermally polluted basins (Rhine, Meuse, Danube and Mississippi). Because large reservoirs in the Columbia affect streamflow and thermal regimes, a reservoir routing model was used, which resulted in a significant improvement in the performance of the river discharge and water temperature modelling. Overall, realistic estimates were obtained at a daily time step for both river discharge (median normalized BIAS = 0.3; normalized RMSE = 1.2; r = 0.76) and water temperature (median BIAS = −0.3 °C; RMSE = 2.8 °C; r = 0.91) for the entire validation period, with similar performance during warm, dry periods. Simulated water temperatures are sensitive to headwater temperature, depending on resolution and flow velocity. A high sensitivity of water temperature to river

  18. Post-processing of multi-hydrologic model simulations for improved streamflow projections

    Science.gov (United States)

    khajehei, sepideh; Ahmadalipour, Ali; Moradkhani, Hamid

    2016-04-01

    Hydrologic model outputs are prone to bias and uncertainty due to knowledge deficiencies in models and data. Uncertainty in hydroclimatic projections arises from uncertainty in the hydrologic model as well as from epistemic or aleatory uncertainties in GCM parameterization and development. This study is conducted to: 1) evaluate a recently developed multivariate post-processing method for historical simulations and 2) assess the effect of post-processing on the uncertainty and reliability of future streamflow projections in both high-flow and low-flow conditions. The first objective is addressed for the historical period 1970-1999. Future streamflow projections are generated for 10 statistically downscaled GCMs from two widely used downscaling methods, Bias Corrected Statistically Downscaled (BCSD) and Multivariate Adaptive Constructed Analogs (MACA), over the period 2010-2099 for two representative concentration pathways, RCP4.5 and RCP8.5. Three semi-distributed hydrologic models were employed and calibrated at 1/16 degree latitude-longitude resolution for over 100 points across the Columbia River Basin (CRB) in the Pacific Northwest, USA. Streamflow outputs are post-processed through a Bayesian framework based on copula functions. The post-processing approach relies on a transfer function developed from the bivariate joint distribution between observations and simulations in the historical period. Results show that application of the post-processing technique leads to considerably higher accuracy in historical simulations and also reduces model uncertainty in future streamflow projections.

  19. A comparative analysis of 9 multi-model averaging approaches in hydrological continuous streamflow simulation

    Science.gov (United States)

    Arsenault, Richard; Gatien, Philippe; Renaud, Benoit; Brissette, François; Martel, Jean-Luc

    2015-10-01

    This study aims to test whether a weighted combination of several hydrological models can simulate flows more accurately than the models taken individually. In addition, the project attempts to identify the most efficient model averaging method and the optimal number of models to include in the weighting scheme. To address the first objective, streamflow was simulated using four lumped hydrological models (HSAMI, HMETS, MOHYSE and GR4J-6), each of which was calibrated with three different objective functions on 429 watersheds. The resulting 12 hydrographs (4 models × 3 metrics) were weighted and combined using 9 averaging methods: the simple arithmetic mean (SAM), Akaike information criterion averaging (AICA), Bates-Granger averaging (BGA), Bayes information criterion averaging (BICA), Bayesian model averaging (BMA), Granger-Ramanathan averaging variants A, B and C (GRA, GRB and GRC), and averaging by SCE-UA optimization (SCA). The same weights were then applied to the hydrographs in validation mode, and the Nash-Sutcliffe Efficiency metric was computed between the averaged and observed hydrographs. Statistical analyses were performed to compare the accuracy of the weighted methods to that of the individual models. A Kruskal-Wallis test and a multi-objective optimization algorithm were then used to identify the most efficient weighting method and the optimal number of models to combine. Results suggest that the GRA, GRB, GRC and SCA weighting methods perform better than the individual members. Model averages from these four methods were superior to the best of the individual members in 76% of the cases. Optimal combinations on all watersheds included at least one of each of the four hydrological models. None of the optimal combinations included all members of the ensemble of 12 hydrographs. The Granger-Ramanathan average variant C (GRC) is recommended as the best compromise between accuracy, speed of execution, and simplicity.
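
The Granger-Ramanathan family of methods referenced above derives member weights by least squares against observed flows. The sketch below shows unconstrained least-squares weighting with no intercept together with the Nash-Sutcliffe efficiency used for the comparisons; the exact constraints distinguishing variants A, B and C are not reproduced, and the array names are illustrative.

```python
# Least-squares multi-model averaging in the spirit of the Granger-Ramanathan
# methods, plus the Nash-Sutcliffe efficiency used to compare hydrographs.
import numpy as np

def ls_weights(member_sims, obs):
    """
    member_sims : (n_time, n_members) calibration-period hydrographs
    obs         : (n_time,) observed streamflow
    Returns weights minimizing ||member_sims @ w - obs||^2 (no intercept).
    """
    w, *_ = np.linalg.lstsq(member_sims, obs, rcond=None)
    return w

def nse(sim, obs):
    """Nash-Sutcliffe efficiency between a simulated and an observed hydrograph."""
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

# In validation, the calibration-period weights are reapplied:
#   q_avg = member_sims_valid @ w
# and nse(q_avg, obs_valid) is compared against the NSE of each individual member.
```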

  20. Coupled daily streamflow and water temperature modelling in large river basins

    NARCIS (Netherlands)

    Vliet, van M.T.H.; Yearsley, J.R.; Franssen, W.H.P.; Ludwig, F.; Haddeland, I.; Kabat, P.

    2012-01-01

    Realistic estimates of daily streamflow and water temperature are required for effective management of water resources (e.g. for electricity and drinking water production) and freshwater ecosystems. Although hydrological and process-based water temperature modelling approaches have been successfully

  1. Improving streamflow simulations and forecasting performance of SWAT model by assimilating remotely sensed soil moisture observations

    Science.gov (United States)

    Patil, Amol; Ramsankaran, RAAJ

    2017-12-01

    This article presents a study carried out using EnKF-based assimilation of coarser-scale SMOS soil moisture retrievals to improve the streamflow simulation and forecasting performance of the SWAT model in a large catchment. The study was carried out in the Munneru river catchment, India, which covers about 10,156 km2. A new EnKF-based approach is proposed for improving the inherent vertical coupling of soil layers in the SWAT hydrological model during soil moisture data assimilation. Evaluation of the vertical error correlation obtained between surface and subsurface layers indicates that the vertical coupling can be improved significantly using an ensemble of soil storages compared with the traditional EnKF approach based on static soil storages. However, the improvements in the simulated streamflow are moderate, owing to limitations of the SWAT model in reflecting profile soil moisture updates in surface runoff computations. Further, it is observed that the durability of streamflow improvements is longer when the assimilation system effectively updates the subsurface flow component. Overall, the results of the present study indicate that passive microwave-based coarser-scale soil moisture products such as SMOS hold significant potential to improve streamflow estimates when assimilated into large-scale distributed hydrological models operating at a daily time step.

  2. SWAT-based streamflow and embayment modeling of Karst-affected Chapel branch watershed, South Carolina

    Science.gov (United States)

    Devendra Amatya; M. Jha; A.E. Edwards; T.M. Williams; D.R. Hitchcock

    2011-01-01

    SWAT is a GIS-based basin-scale model widely used for the characterization of hydrology and water quality of large, complex watersheds; however, SWAT has not been fully tested in watersheds with karst geomorphology and downstream reservoir-like embayment. In this study, SWAT was applied to test its ability to predict monthly streamflow dynamics for a 1,555 ha karst...

  3. Stochastic model for simulating Souris River Basin precipitation, evapotranspiration, and natural streamflow

    Science.gov (United States)

    Kolars, Kelsey A.; Vecchia, Aldo V.; Ryberg, Karen R.

    2016-02-24

    The Souris River Basin is a 61,000-square-kilometer basin in the Provinces of Saskatchewan and Manitoba and the State of North Dakota. In May and June of 2011, record-setting rains fell in the headwater areas of the basin. Emergency spillways of major reservoirs were discharging at or near full capacity, and extensive flooding occurred in numerous downstream communities. To determine the probability of future extreme floods and droughts, the U.S. Geological Survey, in cooperation with the North Dakota State Water Commission, developed a stochastic model for simulating Souris River Basin precipitation, evapotranspiration, and natural (unregulated) streamflow. Simulations from the model can be used in future studies to simulate regulated streamflow, to design levees and other structures, and to complete economic cost/benefit analyses. Long-term climatic variability was analyzed using tree-ring chronologies to hindcast precipitation to the early 1700s and compare recent wet and dry conditions to earlier extreme conditions. The extended precipitation record was consistent with findings from the Devils Lake and Red River of the North Basins (southeast of the Souris River Basin), supporting the idea that regional climatic patterns for many centuries have consisted of alternating wet and dry climate states. A stochastic climate simulation model for precipitation, temperature, and potential evapotranspiration for the Souris River Basin was developed using recorded meteorological data and extended precipitation records provided through tree-ring analysis. A significant climate transition was seen around 1970, with 1912–69 representing a dry climate state and 1970–2011 representing a wet climate state. Although there were some distinct subpatterns within the basin, the predominant differences between the two states were higher spring through early fall precipitation and higher spring potential evapotranspiration for the wet compared to the dry state. A water

  4. Streamflow forecasting using the modular modeling system and an object-user interface

    Science.gov (United States)

    Jeton, A.E.

    2001-01-01

    The U.S. Geological Survey (USGS), in cooperation with the Bureau of Reclamation (BOR), developed a computer program to provide a general framework needed to couple disparate environmental resource models and to manage the necessary data. The Object-User Interface (OUI) is a map-based interface for models and modeling data. It provides a common interface to run hydrologic models and acquire, browse, organize, and select spatial and temporal data. One application is to assist river managers in utilizing streamflow forecasts generated with the Precipitation-Runoff Modeling System running in the Modular Modeling System (MMS), a distributed-parameter watershed model, and the National Weather Service Extended Streamflow Prediction (ESP) methodology.

  5. Novel approach for streamflow forecasting using a hybrid ANFIS-FFA model

    Science.gov (United States)

    Yaseen, Zaher Mundher; Ebtehaj, Isa; Bonakdari, Hossein; Deo, Ravinesh C.; Danandeh Mehr, Ali; Mohtar, Wan Hanna Melini Wan; Diop, Lamine; El-shafie, Ahmed; Singh, Vijay P.

    2017-11-01

    The present study proposes a new hybrid evolutionary Adaptive Neuro-Fuzzy Inference System (ANFIS) approach for monthly streamflow forecasting. The proposed method is a novel combination of the ANFIS model with the firefly algorithm (FFA) as an optimizer tool to construct a hybrid ANFIS-FFA model. The results of the ANFIS-FFA model are compared with those of the classical ANFIS model, which utilizes the fuzzy c-means (FCM) clustering method in the Fuzzy Inference System (FIS) generation. The historical monthly streamflow data for the Pahang River, a major river system in Malaysia characterized by highly stochastic hydrological patterns, are used in the study. Sixteen different input combinations with one to five time-lagged input variables are incorporated into the ANFIS-FFA and ANFIS models to account for the antecedent seasonal variations in historical streamflow data. The mean absolute error (MAE), root mean square error (RMSE) and correlation coefficient (r) are used to evaluate the forecasting performance of the ANFIS-FFA model. In conjunction with these metrics, the refined Willmott's Index (Drefined), Nash-Sutcliffe coefficient (ENS) and Legates-McCabe Index (ELM) are also utilized as normalized goodness-of-fit metrics. Comparison of the results reveals that the FFA is able to improve the forecasting accuracy of the hybrid ANFIS-FFA model (r = 1; RMSE = 0.984; MAE = 0.364; ENS = 1; ELM = 0.988; Drefined = 0.994) for monthly streamflow forecasting relative to the traditional ANFIS model (r = 0.998; RMSE = 3.276; MAE = 1.553; ENS = 0.995; ELM = 0.950; Drefined = 0.975). The results also show that the ANFIS-FFA is not only superior to the ANFIS model but also provides a parsimonious modelling framework for streamflow forecasting by requiring a smaller number of input variables to yield comparatively better performance. It is construed that the FFA optimizer can thus surpass the accuracy of the traditional ANFIS model in general
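
For readers wanting to reproduce the goodness-of-fit numbers quoted above, a minimal sketch of the core metrics (RMSE, MAE, r and the Nash-Sutcliffe coefficient ENS) is given below; the refined Willmott and Legates-McCabe indices are omitted, and the function names are illustrative.

```python
# Minimal implementations of the main goodness-of-fit metrics quoted above.
import numpy as np

def rmse(sim, obs):
    return float(np.sqrt(np.mean((sim - obs) ** 2)))

def mae(sim, obs):
    return float(np.mean(np.abs(sim - obs)))

def pearson_r(sim, obs):
    return float(np.corrcoef(sim, obs)[0, 1])

def nash_sutcliffe(sim, obs):
    return float(1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2))
```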

  6. Evaluation of model-based seasonal streamflow and water allocation forecasts for the Elqui Valley, Chile

    Science.gov (United States)

    Delorit, Justin; Cristian Gonzalez Ortuya, Edmundo; Block, Paul

    2017-09-01

    In many semi-arid regions, multisectoral demands often stress available water supplies. Such is the case in the Elqui River valley of northern Chile, which draws on a limited-capacity reservoir to allocate 25 000 water rights. Delayed infrastructure investment forces water managers to address demand-based allocation strategies, particularly in dry years, which are realized through reductions in the volume associated with each water right. Skillful season-ahead streamflow forecasts have the potential to inform managers with an indication of future conditions to guide reservoir allocations. This work evaluates season-ahead statistical prediction models of October-January (growing season) streamflow at multiple lead times associated with manager and user decision points, and links predictions with a reservoir allocation tool. Skillful results (streamflow forecasts outperform climatology) are produced for short lead times (1 September: ranked probability skill score (RPSS) of 0.31, categorical hit skill score of 61 %). At longer lead times, climatological skill exceeds forecast skill due to fewer observations of precipitation. However, coupling the 1 September statistical forecast model with a sea surface temperature phase and strength statistical model allows for equally skillful categorical streamflow forecasts to be produced for a 1 May lead, triggered for 60 % of years (1950-2015), suggesting forecasts need not be strictly deterministic to be useful for water rights holders. An early (1 May) categorical indication of expected conditions is reinforced with a deterministic forecast (1 September) as more observations of local variables become available. The reservoir allocation model is skillful at the 1 September lead (categorical hit skill score of 53 %); skill improves to 79 % when categorical allocation prediction certainty exceeds 80 %. This result implies that allocation efficiency may improve when forecasts are integrated into reservoir decision frameworks. The

  7. Watershed-scale modeling of streamflow change in incised montane meadows

    Science.gov (United States)

    Essaid, Hedeff I.; Hill, Barry R.

    2014-01-01

    Land use practices have caused stream channel incision and water table decline in many montane meadows of the Western United States. Incision changes the magnitude and timing of streamflow in water supply source watersheds, a concern to resource managers and downstream water users. The hydrology of montane meadows under natural and incised conditions was investigated using watershed simulation for a range of hydrologic conditions. The results illustrate the interdependence between: watershed and meadow hydrology; bedrock and meadow aquifers; and surface and groundwater flow through the meadow for the modeled scenarios. During the wet season, stream incision resulted in less overland flow and interflow and more meadow recharge causing a net decrease in streamflow and increase in groundwater storage relative to natural meadow conditions. During the dry season, incision resulted in less meadow evapotranspiration and more groundwater discharge to the stream causing a net increase in streamflow and a decrease in groundwater storage relative to natural meadow conditions. In general, for a given meadow setting, the magnitude of change in summer streamflow and long-term change in watershed groundwater storage due to incision will depend on the combined effect of: reduced evapotranspiration in the eroded meadow; induced groundwater recharge; replenishment of dry season groundwater storage depletion in meadow and bedrock aquifers by precipitation during wet years; and groundwater storage depletion that is not replenished by precipitation during wet years.

  8. Statistical-dynamical long-range seasonal forecasting of streamflow with the North-American Multi Model Ensemble (NMME)

    Science.gov (United States)

    Slater, Louise; Villarini, Gabriele

    2017-04-01

    There are two main approaches to long-range (monthly to seasonal) streamflow forecasting: statistical approaches that typically relate climate precursors directly to streamflow, and dynamical physically-based approaches in which spatially distributed models are forced with downscaled meteorological forecasts. While the former approach is potentially limited by a lack of physical causality, the latter tends to be complex and time-consuming to implement. In contrast, hybrid statistical-dynamical techniques that use global climate model (GCM) ensemble forecasts as inputs to statistical models are both physically-based and rapid to run, but are a relatively new field of research. Here, we conduct the first systematic multimodel statistical-dynamical forecasting of streamflow using NMME climate forecasts from eight GCMs (CCSM3, CCSM4, CanCM3, CanCM4, GFDL2.1, FLORb01, GEOS5, and CFSv2) across a broad region. At several hundred U.S. Midwest stream gauges with long (50+ continuous years) streamflow records, we fit probabilistic statistical models for seasonal streamflow percentiles ranging from minimum to maximum flows. As predictors, we use basin-averaged values of precipitation, antecedent wetness, temperature, agricultural row crop acreage, and population density. Using the observed data, we select the best-fitting probabilistic model for every site, season, and streamflow percentile (ranging from low to high flows). The best-fitting models are then used to obtain streamflow predictions by incorporating the NMME climate forecasts and the extrapolated agricultural and population time series as predictors. The forecasting skill of our models is assessed using both deterministic and probabilistic verification measures. The influence of the different predictors is evaluated for all streamflow percentiles and across the full range of lead times. Our findings reveal that statistical-dynamical streamflow forecasting produces promising results, which may enable water managers

  9. Assessment of land-use change on streamflow using GIS, remote sensing and a physically-based model, SWAT

    Directory of Open Access Journals (Sweden)

    J. Y. G. Dos Santos

    2014-09-01

    This study aims to assess the impact of land-use changes between the periods 1967-1974 and 1997-2008 on the streamflow of the Tapacurá catchment (northeastern Brazil) using the Soil and Water Assessment Tool (SWAT) model. The results show that the most sensitive parameters were the baseflow, Manning factor, time of concentration and soil evaporation compensation factor, which affect the catchment hydrology. The model calibration and validation were performed on a monthly basis, and the streamflow simulation showed a good level of accuracy for both periods. The obtained R2 and Nash-Sutcliffe Efficiency values were 0.82 and 0.81, respectively, for 1967-1974, and 0.93 and 0.92 for 1997-2008. The evaluation of the SWAT model response to the land cover shows that the mean monthly flow during the rainy seasons of 1967-1974 decreased when compared to 1997-2008.

  10. Artificial intelligence based models for stream-flow forecasting: 2000-2015

    Science.gov (United States)

    Yaseen, Zaher Mundher; El-shafie, Ahmed; Jaafar, Othman; Afan, Haitham Abdulmohsin; Sayl, Khamis Naba

    2015-11-01

    The use of Artificial Intelligence (AI) has increased since the middle of the 20th century, as seen in its application to a wide range of engineering and science problems. The last two decades, for example, have seen a dramatic increase in the development and application of various types of AI approaches for stream-flow forecasting. Generally speaking, AI has exhibited significant progress in forecasting and modeling non-linear hydrological applications and in capturing the noise complexity in the dataset. This paper explores the state-of-the-art application of AI in stream-flow forecasting, focusing on defining the data-driven nature of AI, the advantages of complementary models, the relevant literature, and possible future applications in modeling and forecasting stream-flow. The review also identifies major challenges and opportunities for prospective research, including a new scheme for modeling inflow, a novel method for preprocessing time series frequency based on Fast Orthogonal Search (FOS) techniques, and Swarm Intelligence (SI) as an optimization approach.

  11. Validated Metrics of Quick Flow Improve Assessments of Streamflow Generation Processes at the Long-Term Sleepers River Research Site

    Science.gov (United States)

    Sebestyen, S. D.; Shanley, J. B.

    2015-12-01

    There are multiple approaches to quantifying the quick flow components of streamflow. Physical hydrograph separations of quick flow using recession analysis (RA) are objective, reproducible, and easily calculated for long-duration streamflow records (years to decades). However, this approach has rarely been validated to have a physical basis for interpretation. In contrast, isotopic hydrograph separation (IHS) and end-member mixing analysis using multiple solutes (EMMA) have been used to identify flow components and flowpath routing through catchment soils. Nonetheless, these two approaches are limited to data from isolated periods (hours to weeks) during which more-intensive grab samples were analyzed. These limitations often make IHS and EMMA difficult to generalize beyond brief windows of time. At the Sleepers River Research Watershed (SRRW) in northern Vermont, USA, we have data from multiple snowmelt events over a two-decade period and from multiple nested catchments to assess relationships among RA, IHS, and EMMA. Quick flow separations were highly correlated among the three techniques, which shows links among metrics of quick flow, water sources, and flowpath routing in a small (41 ha), forested catchment (W-9). The similarity in responses validates a physical interpretation for a particular RA approach (the Eckhardt recursive RA filter). This validation provides a new tool to estimate new water inputs and flowpath routing for more and longer periods when chemical or isotopic tracers may not have been measured. At three other SRRW catchments, we found similarly strong correlations among the three techniques. Consistent responses across four catchments provide evidence to support other research at the SRRW showing that runoff generation mechanisms are similar despite differences in catchment sizes and land covers.
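
The recursive RA filter referred to above is commonly implemented as the Eckhardt (2005) two-parameter digital filter, with quick flow taken as streamflow minus baseflow. A minimal sketch follows; the parameter values shown are typical literature defaults, not those calibrated at Sleepers River.

```python
# Eckhardt (2005) recursive digital filter for baseflow separation;
# quick flow is then streamflow minus the filtered baseflow.
import numpy as np

def eckhardt_baseflow(q, a=0.98, bfi_max=0.80):
    """q : array of daily streamflow; a : recession constant; bfi_max : max baseflow index."""
    q = np.asarray(q, dtype=float)
    b = np.zeros_like(q)
    b[0] = q[0] * bfi_max                      # simple initialization assumption
    for k in range(1, len(q)):
        b[k] = ((1 - bfi_max) * a * b[k - 1] + (1 - a) * bfi_max * q[k]) / (1 - a * bfi_max)
        b[k] = min(b[k], q[k])                 # baseflow cannot exceed streamflow
    return b

def quick_flow(q, **kwargs):
    return np.asarray(q, dtype=float) - eckhardt_baseflow(q, **kwargs)
```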

  12. Toward Improved Calibration of Hydrologic Models: Multi-objective Analysis of Streamflow and SWE Modeling Errors in Mountainous Regions

    Science.gov (United States)

    Barth, C.; Boyle, D. P.; Bastidas, L. A.; Schumer, R.

    2008-12-01

    In many of the mountainous regions of the western United States, much of the streamflow runoff at the mountain front originates as melt water from snow. As a result, many hydrologic models applied in these regions have components that represent the snow water equivalent (SWE) throughout the accumulation and depletion processes of the snow pack. The limited number of point observations of SWE in these regions, however, generally precludes an accurate estimate of the spatial and temporal distribution of SWE in most model applications. As a result, hydrologic model calibration and evaluation is generally focused on the fitting of simulated streamflow to observed streamflow data. In this study, we examine the utility of SWE estimates obtained from the Snow Data Assimilation System (SNODAS) product as a surrogate for SWE observations in the calibration and evaluation of the Precipitation-Runoff Modeling System (PRMS). Specifically, we employ a multi-objective analysis of several streamflow behaviors (e.g., rising limb, falling limb, and baseflow) and snow pack behaviors (e.g., accumulation, depletion, and no snow) aimed at better understanding the sensitivities of the different behaviors to changes in values of specific PRMS model parameters. The multi-objective approach is carried out with the Multi-Objective Generalized Sensitivity Analysis (MOGSA) algorithm and the Multi-Objective Complex Evolution (MOCOM).

  13. An educational model for ensemble streamflow simulation and uncertainty analysis

    Directory of Open Access Journals (Sweden)

    A. AghaKouchak

    2013-02-01

    This paper presents the hands-on modeling toolbox HBV-Ensemble, designed as a complement to theoretical hydrology lectures, to teach hydrological processes and their uncertainties. HBV-Ensemble can be used for in-class lab practices and homework assignments, and for assessment of students' understanding of hydrological processes. Using this modeling toolbox, students can gain more insight into how hydrological processes (e.g., precipitation, snowmelt and snow accumulation, soil moisture, evapotranspiration and runoff generation) are interconnected. The educational toolbox includes a MATLAB Graphical User Interface (GUI) and an ensemble simulation scheme that can be used for teaching uncertainty analysis, parameter estimation, ensemble simulation and model sensitivity. HBV-Ensemble was administered in a class for both in-class instruction and a final project, and students submitted feedback about the toolbox. The results indicate that this educational software had a positive impact on students' understanding and knowledge of uncertainty in hydrological modeling.

  14. A simple model for assessing utilisable streamflow allocations in the ...

    African Journals Online (AJOL)

    -sized water resource systems (without major storage) and displays the results as flow duration curves so that they can be compared with the standard information available for the Ecological Reserve requirements. The model is designed to ...

  15. A Generalized Martingale Model of Streamflow Forecast Uncertainty Evolution and its Application in the Three Gorge Reservoir Operation

    Science.gov (United States)

    Zhao, J.; Zhao, T.

    2012-12-01

    Streamflow forecasts are dynamically updated in real time, which leads to a process of forecast uncertainty evolution. Generally, forecast uncertainty decreases as time progresses and more hydrologic information becomes available. This process of forecasting and uncertainty updating can be described by the martingale model of forecast evolution (MMFE), which formulates the total forecast uncertainty of streamflow in one future period as the sum of forecast improvements in the intermediate periods. This study tests the basic assumptions of MMFE with streamflow forecast data from the Three Gorge Reservoir and shows that 1) real-world forecasts can be biased and tend to underestimate the actual streamflow, and 2) real-world forecast uncertainty can be non-Gaussian and heavy-tailed. Based on these statistical tests, this study incorporates the normal quantile transform (NQT) and formulates a generalized NQT-MMFE model to simulate biased and non-Gaussian forecast uncertainties. The simulated streamflow forecast is similar to the real-world forecast in terms of NSE, MAE, and RMSE, which illustrates the effectiveness of the NQT-MMFE model. The simulated forecasts are further applied in a Monte-Carlo experiment of Three Gorge Reservoir re-operation. The results illustrate that the NQT-MMFE model within a rolling-horizon decision-making framework can efficiently exploit forecast information and make more robust decisions. The real-time streamflow forecast of TGR in 2008
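
The normal quantile transform used to generalize the MMFE maps forecast values to standard-normal space and back. A minimal sketch is shown below, assuming Weibull plotting positions for the empirical quantiles; it is an illustration rather than the authors' implementation, and the function names are placeholders.

```python
# Minimal normal quantile transform (NQT): map a sample to standard-normal
# space via empirical plotting positions, and map back by interpolating the
# empirical quantile function of a reference sample.
import numpy as np
from scipy.stats import norm, rankdata

def nqt_forward(x):
    """Transform a sample to standard-normal space (Weibull plotting positions)."""
    n = len(x)
    p = rankdata(x) / (n + 1.0)          # non-exceedance probabilities in (0, 1)
    return norm.ppf(p)

def nqt_inverse(z, x_ref):
    """Back-transform standard-normal values using the empirical quantiles of x_ref."""
    n = len(x_ref)
    p_ref = np.arange(1, n + 1) / (n + 1.0)
    return np.interp(norm.cdf(z), p_ref, np.sort(x_ref))
```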

  16. Skills of General Circulation and Earth System Models in reproducing streamflow to the ocean: the case of Congo river

    Science.gov (United States)

    Santini, M.; Caporaso, L.

    2017-12-01

    Despite the importance of water resources in the context of climate change, it is still difficult to correctly simulate the freshwater cycle over land with General Circulation and Earth System Models (GCMs and ESMs). Existing efforts from the Coupled Model Intercomparison Project phase 5 (CMIP5) were mainly devoted to the validation of atmospheric variables like temperature and precipitation, with little attention to discharge. Here we investigate the present-day performance of GCMs and ESMs participating in CMIP5 in simulating the discharge of the Congo River to the sea, thanks to: i) the long-term availability of discharge data for the Kinshasa hydrological station, representative of more than 95% of the water flowing through the whole catchment; and ii) the river's still-limited alteration by human intervention, which enables comparison with the (mostly) natural streamflow simulated within CMIP5. Our findings suggest that most models overestimate the streamflow in terms of the seasonal cycle, especially in late winter and spring, while overestimation and variability across models are lower in late summer. Weighted ensemble means are also calculated, based on simulation performance as measured by several metrics, showing some improvement of the results. Although simulated inter-monthly and inter-annual percent anomalies do not appear significantly different from those in observed data, when translated into well-consolidated indicators of drought attributes (frequency, magnitude, timing, duration), usually adopted for more immediate communication to stakeholders and decision makers, such anomalies can be misleading. These inconsistencies produce incorrect assessments for water management planning and infrastructure (e.g. dams or irrigated areas), especially if models are used instead of measurements, as in ungauged basins or basins with insufficient data, as well as when relying on models for future estimates without a preliminary quantification of model biases.

  17. A simple model for assessing utilisable streamflow allocations in the ...

    African Journals Online (AJOL)

    2006-07-03

    Jul 3, 2006 ... Ecological Reserve) in South Africa recognises the natural variability of flow regimes and that some of ... using a system yield model in which the Ecological Reserve is treated as a high-priority user and ... metadata within memo-type attributes that can be used to store comments about data sources and ...

  18. Being an honest broker of hydrology: Uncovering, communicating and addressing model error in a climate change streamflow dataset

    Science.gov (United States)

    Chegwidden, O.; Nijssen, B.; Pytlak, E.

    2017-12-01

    Any model simulation has errors, including errors in meteorological data, process understanding, model structure, and model parameters. These errors may express themselves as bias, timing lags, and differences in sensitivity between the model and the physical world. The evaluation and handling of these errors can greatly affect the legitimacy, validity and usefulness of the resulting scientific product. In this presentation we will discuss a case study of handling and communicating model errors during the development of a hydrologic climate change dataset for the Pacific Northwestern United States. The dataset was the result of a four-year collaboration between the University of Washington, Oregon State University, the Bonneville Power Administration, the United States Army Corps of Engineers and the Bureau of Reclamation. Along the way, the partnership facilitated the discovery of multiple systematic errors in the streamflow dataset. Through an iterative review process, some of those errors could be resolved. For the errors that remained, honest communication of the shortcomings promoted the dataset's legitimacy. Thoroughly explaining errors also improved ways in which the dataset would be used in follow-on impact studies. Finally, we will discuss the development of the "streamflow bias-correction" step often applied to climate change datasets that will be used in impact modeling contexts. We will describe the development of a series of bias-correction techniques through close collaboration among universities and stakeholders. Through that process, both universities and stakeholders learned about the others' expectations and workflows. This mutual learning process allowed for the development of methods that accommodated the stakeholders' specific engineering requirements. The iterative revision process also produced a functional and actionable dataset while preserving its scientific merit. We will describe how encountering earlier techniques' pitfalls allowed us

  19. Future streamflow droughts in glacierized catchments: the impact of dynamic glacier modelling and changing thresholds

    Science.gov (United States)

    Van Tiel, Marit; Van Loon, Anne; Wanders, Niko; Vis, Marc; Teuling, Ryan; Stahl, Kerstin

    2017-04-01

    In glacierized catchments, snowpack and glaciers function as important stores of water, and hydrographs of highly glacierized catchments at mid and high latitudes thus show a clear seasonality with low flows in winter and high flows in summer. Due to ongoing climate change, we expect this storage capacity to decrease, with consequences for the discharge regime. In this study we focus on streamflow droughts, here defined as below-average water availability specifically in the high-flow season, and on which methods are most suitable to characterize future streamflow droughts as regimes change. Two glacierized catchments, Nigardsbreen (Norway) and Wolverine (Alaska), are used as case studies, and streamflow droughts are compared between two periods, 1975-2004 and 2071-2100. Streamflow is simulated with the HBV light model, calibrated on observed discharge and seasonal glacier mass balances, for two climate change scenarios (RCP 4.5 and RCP 8.5). Studies of future streamflow drought have often applied the same variable threshold of the past to the future, but in regions where a regime shift is expected this method yields severe "droughts" in the historic high-flow period. We applied the alternative transient variable threshold, a threshold that adapts to the changing hydrological regime and is thus better able to cope with this issue, but which has never been thoroughly tested in glacierized catchments. As the glacier area representation in the hydrological modelling can also influence the modelled discharge and the derived streamflow droughts, we evaluated both the difference between the historical variable threshold (HVT) and the transient variable threshold (TVT) and two different glacier area conceptualisations (constant area (C) and dynamic area (D)), resulting in four scenarios: HVT-C, HVT-D, TVT-C and TVT-D. Results show a drastic decrease in the number of droughts in the HVT-C scenario due to increased glacier melt. The deficit

  20. A modified simple dynamic model: Derived from the information embedded in observed streamflows

    Science.gov (United States)

    Li, Wei; Nieber, John L.

    2017-09-01

    A zero-dimensional hydrological model has been developed to simulate the discharge (Q) from watershed groundwater storage (S). The model is a modified version of the original model developed by Kirchner in 2009, which uses a unique sensitivity function, g(Q), to represent the relation between the rate of flow recession and the instantaneous flow rate. The modified dynamic model instead uses a normalized sensitivity function, g(Qnorm), which gives the model the flexibility to encompass the hysteretic effect of initial water storage on flow during recession periods. The sensitivity function is normalized based on a correlation function, F(Q), which implicitly quantifies the influence of initial storage conditions on recession flow dynamics. For periods of either positive or negative net recharge to groundwater, the model applies a term similar in form to an analytical solution of the linearized Boussinesq equation. The combination of these two streamflow components, the recession component and the net recharge response, gives the model the flexibility to realistically mimic the hysteresis in the Q vs. S relation for a watershed. The model is applied to the Sagehen Creek watershed, a hilly watershed located in the Sierra Nevada of California. The results show that the modified model has improved performance in simulating discharge dynamics across a wide range of water storage (degree of wetness), representing an almost ten-fold variation in annual streamflow.
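
The sensitivity-function idea underlying both the original and modified models can be illustrated by estimating g(Q) = -(dQ/dt)/Q from recession-period streamflow, following the general approach of Kirchner (2009). The sketch below omits the normalization by F(Q) introduced in the modified model; the binning choices and function name are assumptions for illustration.

```python
# Estimate a Kirchner-type sensitivity function g(Q) from recession periods:
# during recessions (no precipitation or ET), dQ/dt = -g(Q) * Q, so
# g(Q) ~ -(dQ/dt) / Q, averaged in log-spaced discharge bins.
import numpy as np

def sensitivity_function(q, dt=1.0, n_bins=20):
    """q : daily discharge series; returns bin-mean Q and bin-mean g(Q) over recessions."""
    q = np.asarray(q, dtype=float)
    dq_dt = np.diff(q) / dt
    q_mid = 0.5 * (q[1:] + q[:-1])
    rec = (dq_dt < 0) & (q_mid > 0)              # keep recession time steps only
    g = -dq_dt[rec] / q_mid[rec]
    q_rec = q_mid[rec]

    edges = np.logspace(np.log10(q_rec.min()), np.log10(q_rec.max()), n_bins + 1)
    idx = np.digitize(q_rec, edges)
    q_bin, g_bin = [], []
    for i in range(1, n_bins + 1):
        m = idx == i
        if m.any():
            q_bin.append(q_rec[m].mean())
            g_bin.append(g[m].mean())
    return np.array(q_bin), np.array(g_bin)
```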

  1. A hybrid conceptual-fuzzy inference streamflow modelling for the Letaba River system in South Africa

    Science.gov (United States)

    Katambara, Zacharia; Ndiritu, John G.

    There have been considerable water resources developments in South Africa and other regions of the world in order to meet ever-increasing water demands. These developments have not been matched by a similar development of hydrological monitoring systems, and hence there are inadequate data for managing the developed water resources systems. The Letaba River system (Fig. 1) is a typical case of such a system in South Africa. The available water in this river is over-allocated, and reliable daily streamflow modelling of the Letaba River that adequately incorporates the main components and processes would be an invaluable aid to optimal operation of the system. This study describes the development of a calibrated hybrid conceptual-fuzzy-logic model and explores its capability in reproducing the natural processes and human effects on daily streamflow in the Letaba River. The model performance is considered satisfactory in view of the complexity of the system and the inadequacy of relevant data. Performance in modelling streamflow improves towards the downstream end and matches that of a stand-alone fuzzy-logic model. The hybrid model obtains realistic estimates of the major system components and processes, including the capacities of the farm dams and storage weirs and their trajectories. This suggests that for complex, data-scarce river systems, hybrid conceptual-fuzzy-logic modelling may be used for more detailed and dependable operational and planning analysis than stand-alone fuzzy modelling. Further work will include developing and testing other hybrid model configurations.

  2. Model Validation Status Review

    International Nuclear Information System (INIS)

    E.L. Hardin

    2001-01-01

    The primary objective for the Model Validation Status Review was to perform a one-time evaluation of model validation associated with the analysis/model reports (AMRs) containing model input to total-system performance assessment (TSPA) for the Yucca Mountain site recommendation (SR). This review was performed in response to Corrective Action Request BSC-01-C-01 (Clark 2001, Krisha 2001) pursuant to Quality Assurance review findings of an adverse trend in model validation deficiency. The review findings in this report provide the following information which defines the extent of model validation deficiency and the corrective action needed: (1) AMRs that contain or support models are identified, and conversely, for each model the supporting documentation is identified. (2) The use for each model is determined based on whether the output is used directly for TSPA-SR, or for screening (exclusion) of features, events, and processes (FEPs), and the nature of the model output. (3) Two approaches are used to evaluate the extent to which the validation for each model is compliant with AP-3.10Q (Analyses and Models). The approaches differ in regard to whether model validation is achieved within individual AMRs as originally intended, or whether model validation could be readily achieved by incorporating information from other sources. (4) Recommendations are presented for changes to the AMRs, and additional model development activities or data collection, that will remedy model validation review findings, in support of licensing activities. The Model Validation Status Review emphasized those AMRs that support TSPA-SR (CRWMS M and O 2000bl and 2000bm). A series of workshops and teleconferences was held to discuss and integrate the review findings. The review encompassed 125 AMRs (Table 1) plus certain other supporting documents and data needed to assess model validity. The AMRs were grouped in 21 model areas representing the modeling of processes affecting the natural and

  3. Model Validation Status Review

    Energy Technology Data Exchange (ETDEWEB)

    E.L. Hardin

    2001-11-28

    The primary objective for the Model Validation Status Review was to perform a one-time evaluation of model validation associated with the analysis/model reports (AMRs) containing model input to total-system performance assessment (TSPA) for the Yucca Mountain site recommendation (SR). This review was performed in response to Corrective Action Request BSC-01-C-01 (Clark 2001, Krisha 2001) pursuant to Quality Assurance review findings of an adverse trend in model validation deficiency. The review findings in this report provide the following information which defines the extent of model validation deficiency and the corrective action needed: (1) AMRs that contain or support models are identified, and conversely, for each model the supporting documentation is identified. (2) The use for each model is determined based on whether the output is used directly for TSPA-SR, or for screening (exclusion) of features, events, and processes (FEPs), and the nature of the model output. (3) Two approaches are used to evaluate the extent to which the validation for each model is compliant with AP-3.10Q (Analyses and Models). The approaches differ in regard to whether model validation is achieved within individual AMRs as originally intended, or whether model validation could be readily achieved by incorporating information from other sources. (4) Recommendations are presented for changes to the AMRs, and additional model development activities or data collection, that will remedy model validation review findings, in support of licensing activities. The Model Validation Status Review emphasized those AMRs that support TSPA-SR (CRWMS M&O 2000bl and 2000bm). A series of workshops and teleconferences was held to discuss and integrate the review findings. The review encompassed 125 AMRs (Table 1) plus certain other supporting documents and data needed to assess model validity. The AMRs were grouped in 21 model areas representing the modeling of processes affecting the natural and

  4. Streamflow data

    Science.gov (United States)

    Holmes, Robert R.; Singh, Vijay P.

    2016-01-01

    The importance of streamflow data to the world’s economy, environmental health, and public safety continues to grow as the population increases. The collection of streamflow data is often an involved and complicated process. The quality of streamflow data hinges on such things as site selection, instrumentation selection, streamgage maintenance and quality assurance, proper discharge measurement techniques, and the development and continued verification of the streamflow rating. This chapter serves only as an overview of the streamflow data collection process as proper treatment of considerations, techniques, and quality assurance cannot be addressed adequately in the space limitations of this chapter. Readers with the need for the detailed information on the streamflow data collection process are referred to the many references noted in this chapter. 

  5. Validation of simulation models

    DEFF Research Database (Denmark)

    Rehman, Muniza; Pedersen, Stig Andur

    2012-01-01

    In philosophy of science, the interest for computational models and simulations has increased heavily during the past decades. Different positions regarding the validity of models have emerged but the views have not succeeded in capturing the diversity of validation methods. The wide variety...

  6. The impact of black carbon deposition on snowpack and streamflow in the Wasatch mountains in Utah: A study using MODIS albedo data, statistical modeling and machine learning

    Science.gov (United States)

    Panthail, Jai Kanth

    Salt Lake City, located at the base of the Wasatch mountain range in Utah, receives a majority of its potable water from a system of mountain creeks. Snowmelt runoff from mountain watersheds provides the city with a clean and relatively inexpensive water supply, and has been a key driver in the city's growth and prosperity. There has been keen interest recently in the possible impact of the deposition of darkening matter, such as dust and black carbon (BC), on the snow, which can decrease its 'albedo', or reflective capacity. Such a decrease is expected to result in faster melting of the snow, shifting springtime streamflows to winter. This study aimed to develop a modeling framework to estimate the impact on snowmelt-driven runoff of various BC deposition scenarios. An albedo simulation model, the Snow, Ice, and Aerosol Radiation (SNICAR) model, was used to understand the evolution of albedo under different BC loadings. An Albedo-Snow Water Equivalent (A-SWE) model was developed using a machine learning technique, Random Forests, to quantify the effect on the state of the snowpack under various albedo-change scenarios. An Albedo-Snow Water Equivalent-Streamflow (A-SWE-S) model was designed using an advanced statistical modeling technique, Generalized Additive Models (GAMs), to extend the analysis to streamflow variations. All models were tested and validated using robust k-fold cross-validation. Albedo data were obtained from NASA's MODIS satellite platform. The key results found the snowpack to be depleted 2-3 weeks later with an albedo increase of 5-10% above current conditions, and 1-2 weeks earlier with an albedo decrease of 5-10% below current conditions. Future work will involve improving the A-SWE-S model by better accounting for lagged effects, and using results from both models in a city-wide systems model to understand water supply reliability under combined deposition and climate change scenarios.
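
The A-SWE step described above pairs albedo-related predictors with snow water equivalent through a Random Forest evaluated by k-fold cross-validation. The sketch below illustrates that workflow with scikit-learn on synthetic placeholder data; the predictor names, hyperparameters and data are assumptions, not the author's configuration.

```python
# Random forest regression with k-fold cross-validation, illustrating the
# A-SWE idea: map albedo (and other illustrative predictors) to SWE.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import KFold, cross_val_score

# X: predictors such as MODIS albedo, day of year, accumulated degree-days (assumed)
# y: observed snow water equivalent (placeholder synthetic values)
rng = np.random.default_rng(42)
X = rng.random((500, 3))
y = rng.random(500)

model = RandomForestRegressor(n_estimators=300, random_state=0)
cv = KFold(n_splits=5, shuffle=True, random_state=0)
scores = cross_val_score(model, X, y, cv=cv, scoring="r2")
print("k-fold R2:", scores.mean())
```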

  7. Regression models of ecological streamflow characteristics in the Cumberland and Tennessee River Valleys

    Science.gov (United States)

    Knight, Rodney R.; Gain, W. Scott; Wolfe, William J.

    2011-01-01

    Predictive equations were developed using stepbackward regression for 19 ecologically relevant streamflow characteristics grouped in five major classes (magnitude, ratio, frequency, variability, and date) for use in the Tennessee and Cumberland River watersheds. Basin characteristics explain 50 percent or more of the variation for 10 of the 19 equations. Independent variables identified through stepbackward regression were statistically significant in 81 of 304 coefficients tested across the 19 models. Results indicate distinct hydrologic behavior for Blue Ridge streams, similar hydrologic behavior for watersheds with widely varying degrees of forest cover, and distinct hydrologic profiles for streams in different geographic regions.

  8. Improving the performance of streamflow forecasting model using data-preprocessing technique in Dungun River Basin

    Science.gov (United States)

    Khai Tiu, Ervin Shan; Huang, Yuk Feng; Ling, Lloyd

    2018-03-01

    An accurate streamflow forecasting model is important for the development of a flood mitigation plan to ensure sustainable development of a river basin. This study adopted the Variational Mode Decomposition (VMD) data-preprocessing technique to process and denoise the rainfall data before passing them to the Support Vector Machine (SVM) streamflow forecasting model, in order to improve the performance of the selected model. Rainfall data and river water level data for the period 1996-2016 were used for this purpose. Homogeneity tests (Standard Normal Homogeneity Test, Buishand Range Test, Pettitt Test and Von Neumann Ratio Test) and normality tests (Shapiro-Wilk Test, Anderson-Darling Test, Lilliefors Test and Jarque-Bera Test) were carried out on the rainfall series. Homogeneous and non-normally distributed data were found at all stations. From the recorded rainfall data, it was observed that the Dungun River Basin receives higher monthly rainfall from November to February, during the Northeast Monsoon. Thus, the monthly and seasonal rainfall series of this monsoon are the main focus of this research, as floods usually happen during the Northeast Monsoon period. The predicted water levels from the SVM model were assessed against the observed water levels using non-parametric statistical tests (Biased Method, Kendall's Tau B Test and Spearman's Rho Test).
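
The SVM forecasting step can be sketched as a support vector regression on lagged rainfall predictors. In the illustration below the VMD-denoised rainfall series is replaced by synthetic placeholder data, and the lag count, hyperparameters and variable names are assumptions rather than the study's settings.

```python
# Support vector regression forecast of water level from lagged rainfall.
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

def lagged_matrix(series, n_lags=3):
    """Build predictors from the current and n_lags previous values of a series."""
    return np.column_stack(
        [series[n_lags - k: len(series) - k] for k in range(n_lags + 1)]
    )

# 'rain' would normally be the VMD-denoised monthly rainfall series (synthetic here)
rng = np.random.default_rng(1)
rain = rng.gamma(2.0, 50.0, 240)                   # synthetic monthly rainfall
level = 0.01 * rain + rng.normal(0, 0.2, 240)      # synthetic river water level

n_lags = 3
X = lagged_matrix(rain, n_lags)
y = level[n_lags:]

model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.01))
model.fit(X[:200], y[:200])                        # calibration period
pred = model.predict(X[200:])                      # evaluation period
```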

  9. Precipitation-Runoff Modeling System (PRMS) and Streamflow Response to Spatially Distributed Precipitation in Two Large Watersheds in Northern California

    Science.gov (United States)

    Dhakal, A. S.; Adera, S.; Niswonger, R. G.; Gardner, M.

    2016-12-01

    The ability of the Precipitation-Runoff Modeling System (PRMS) to predict peak intensity, peak timing, base flow, and volume of streamflow was examined in Arroyo Hondo (180 km2) and Upper Alameda Creek (85 km2), two sub-watersheds of the Alameda Creek watershed in Northern California. Rainfall-runoff volume ratios vary widely and can exceed 0.85 during flashy mid-winter rainstorm events. Due to dry antecedent soil moisture conditions, the first storms of the hydrologic year often produce smaller rainfall-runoff volume ratios. Runoff response in this watershed is highly hysteretic; large precipitation events are required to generate runoff following a 4-week period without precipitation. After about 150 mm of cumulative rainfall, streamflow responds quickly to subsequent storms, with variations depending on rainstorm intensity. Inputs to PRMS included precipitation, temperature, topography, vegetation, soils, and land cover data. The data were prepared for input into PRMS using a suite of data-processing Python scripts written by the Desert Research Institute and the U.S. Geological Survey. PRMS was calibrated by comparing simulated streamflow to measured streamflow at a daily time step during the period 1995-2014. The PRMS model is being used to better understand the different patterns of streamflow observed in the Alameda Creek watershed. Although Arroyo Hondo receives more rainfall than Upper Alameda Creek, it is not clear whether the differences in streamflow patterns are a result of differences in rainfall or of other variables, such as geology, slope and aspect. We investigate the ability of PRMS to simulate daily streamflow in the two sub-watersheds for a variety of antecedent soil moisture conditions and rainfall intensities. After successful simulation of watershed runoff processes, the model will be expanded using GSFLOW to simulate integrated surface water and groundwater to support water resources planning and management in the Alameda Creek watershed.

  10. Streamflow changes in the Sierra Nevada, California, simulated using a statistically downscaled general circulation model scenario of climate change

    Science.gov (United States)

    Wilby, Robert L.; Dettinger, Michael D.

    2000-01-01

    Simulations of future climate using general circulation models (GCMs) suggest that rising concentrations of greenhouse gases may have significant consequences for the global climate. Of less certainty is the extent to which regional-scale (i.e., sub-GCM grid) environmental processes will be affected. In this chapter, a range of downscaling techniques is critiqued. Then a relatively simple (yet robust) statistical downscaling technique and its use in the modelling of future runoff scenarios for three river basins in the Sierra Nevada, California, is described. This region was selected because GCM experiments driven by combined greenhouse-gas and sulphate-aerosol forcings consistently show major changes in the hydro-climate of the southwest United States by the end of the 21st century. The regression-based downscaling method was used to simulate daily rainfall and temperature series for streamflow modelling in three Californian river basins under current- and future-climate conditions. The downscaling involved just three predictor variables (specific humidity, zonal velocity component of airflow, and 500 hPa geopotential heights) supplied by the U.K. Meteorological Office coupled ocean-atmosphere model (HadCM2) for the grid point nearest the target basins. When evaluated using independent data, the model showed reasonable skill at reproducing observed area-average precipitation, temperature, and concomitant streamflow variations. Overall, the downscaled data resulted in slight underestimates of mean annual streamflow due to underestimates of precipitation in spring and positive temperature biases in winter. Differences in the skill of simulated streamflows amongst the three basins were attributed to the smoothing effects of snowpack on streamflow responses to climate forcing. The Merced and American River basins drain the western, windward slope of the Sierra Nevada and are snowmelt dominated, whereas the Carson River drains the eastern, leeward slope and is a mix of
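
    A minimal sketch of a regression-based downscaling step of the kind described above, assuming the three predictors are available at the nearest grid point; the data below are synthetic and the transfer function is an ordinary least-squares fit, not the authors' exact formulation.

```python
# Sketch of regression-based statistical downscaling (illustrative only):
# local daily precipitation is regressed on three large-scale predictors
# at the nearest GCM grid point. All data below are synthetic.
import numpy as np

rng = np.random.default_rng(1)
n_days = 3650
q_spec = rng.normal(8.0, 2.0, n_days)      # specific humidity (g/kg)
u_wind = rng.normal(3.0, 4.0, n_days)      # zonal wind component (m/s)
z500 = rng.normal(5700.0, 60.0, n_days)    # 500 hPa geopotential height (m)

# Synthetic "observed" precipitation linked to the predictors.
precip_obs = np.maximum(
    0.0, 0.6 * q_spec + 0.3 * u_wind - 0.02 * (z500 - 5700) + rng.normal(0, 2, n_days)
)

# Fit the transfer function on a calibration period, apply to a validation period.
split = n_days // 2
A = np.column_stack([q_spec, u_wind, z500 - 5700, np.ones(n_days)])
coef, *_ = np.linalg.lstsq(A[:split], precip_obs[:split], rcond=None)

precip_hat = np.maximum(0.0, A[split:] @ coef)   # downscaled series, no negatives
bias = precip_hat.mean() - precip_obs[split:].mean()
print(f"regression coefficients: {np.round(coef, 3)}")
print(f"validation-period bias: {bias:+.2f} mm/day")
```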

  11. HEDR model validation plan

    International Nuclear Information System (INIS)

    Napier, B.A.; Gilbert, R.O.; Simpson, J.C.; Ramsdell, J.V. Jr.; Thiede, M.E.; Walters, W.H.

    1993-06-01

    The Hanford Environmental Dose Reconstruction (HEDR) Project has developed a set of computational "tools" for estimating the possible radiation dose that individuals may have received from past Hanford Site operations. This document describes the planned activities to "validate" these tools. In the sense of the HEDR Project, "validation" is a process carried out by comparing computational model predictions with field observations and experimental measurements that are independent of those used to develop the model.

  12. Validated dynamic flow model

    DEFF Research Database (Denmark)

    Knudsen, Torben

    2011-01-01

    …model structure suggested by the University of Lund, the WP4 leader. This particular model structure has the advantage that it fits better into the control design framework used by WP3-4 compared to the model structures previously developed in WP2. The different model structures are first summarised. Then issues dealing with optimal experimental design are considered. Finally, the parameters are estimated in the chosen static and dynamic models and a validation is performed. Two of the static models, one of them the additive model, explain the data well. In the case of dynamic models, the suggested additive…

  13. Application of artificial neural network, fuzzy logic and decision tree algorithms for modelling of streamflow at Kasol in India.

    Science.gov (United States)

    Senthil Kumar, A R; Goyal, Manish Kumar; Ojha, C S P; Singh, R D; Swamee, P K

    2013-01-01

    The prediction of streamflow is required in many activities associated with the planning and operation of the components of a water resources system. Soft computing techniques have proven to be an efficient alternative to traditional methods for modelling qualitative and quantitative water resource variables such as streamflow. The focus of this paper is to present the development of models using multiple linear regression (MLR), artificial neural networks (ANN), fuzzy logic and decision tree algorithms such as M5 and REPTree for predicting the streamflow at Kasol, located upstream of the Bhakra reservoir in the Sutlej basin in northern India. The input vector to the various models was derived considering statistical properties of the time series, such as the auto-correlation function, the partial auto-correlation function and the cross-correlation function. It was found that the REPTree model performed well compared to the other soft computing techniques investigated in this study (MLR, ANN, fuzzy logic and M5P), and the results of the REPTree model indicate that the entire range of streamflow values was simulated fairly well. The performance of the naïve persistence model was compared with that of the other models, and the need for developing the naïve persistence model was also analysed using the persistence index.
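
    For illustration only, the sketch below selects streamflow lags from the autocorrelation function and fits a regression tree; scikit-learn's DecisionTreeRegressor is used as a stand-in for the REPTree algorithm named above, and the autocorrelation threshold and data are hypothetical.

```python
# Illustrative sketch (not the paper's code): choose streamflow lags from the
# autocorrelation function and fit a regression tree as a stand-in for REPTree.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(2)
n = 2000
q = np.zeros(n)
for t in range(1, n):                      # synthetic AR(1)-like daily flow
    q[t] = 0.85 * q[t - 1] + rng.normal(0, 1)
q = np.exp(0.3 * q + 2.0)                  # make it positive and skewed

def acf(x, lag):
    x = x - x.mean()
    return np.dot(x[:-lag], x[lag:]) / np.dot(x, x)

# Keep lags whose autocorrelation exceeds a (hypothetical) 0.2 threshold.
candidate_lags = [k for k in range(1, 8) if acf(q, k) > 0.2]
print("selected lags:", candidate_lags)

max_lag = max(candidate_lags)
X = np.column_stack([q[max_lag - k:-k] for k in candidate_lags])
y = q[max_lag:]

split = int(0.8 * len(y))
tree = DecisionTreeRegressor(max_depth=6, min_samples_leaf=20, random_state=0)
tree.fit(X[:split], y[:split])
pred = tree.predict(X[split:])
nse = 1 - np.sum((y[split:] - pred) ** 2) / np.sum((y[split:] - y[split:].mean()) ** 2)
print(f"validation NSE = {nse:.2f}")
```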

  14. Phosphorus Export Model Development in a Terminal Lake Basin using Concentration-Streamflow Relationship

    Science.gov (United States)

    Jeannotte, T.; Mahmood, T. H.; Matheney, R.; Hou, X.

    2017-12-01

    Nutrient export to streams and lakes by anthropogenic activities can lead to eutrophication and degradation of surface water quality. In Devils Lake, ND, the only terminal lake in the Northern Great Plains, algal blooms are of great concern due to the recent increase in streamflow and the consequent rise in phosphorus (P) export from prairie agricultural fields. However, to date, very few studies have explored the concentration (c)-streamflow (q) relationship in the headwater catchments of the Devils Lake basin. A robust watershed-scale quantitative framework would aid understanding of the c-q relationship and simulating P concentration and load. In this study, we utilize c-q relationships to develop a simple model to estimate phosphorus concentration and export from two headwater catchments of different size (Mauvais Coulee: 1032 km2 and Trib 3: 160 km2) draining to Devils Lake. Our goal is to link the phosphorus export model with a physically based hydrologic model to identify major drivers of phosphorus export. The USGS provided the streamflow measurements, and we collected water samples (filtered and unfiltered) three times daily during the spring snowmelt season (March 31-April 12, 2017) at the outlets of both headwater catchments. Our results indicate that most P is dissolved and very little is particulate, suggesting little export of fine-grained sediment from agricultural fields. Our preliminary analyses in the Mauvais Coulee catchment show a chemostatic c-q relationship in the rising limb of the hydrograph, while the recession limb shows a linear and positive c-q relationship. The poor correlation in the rising limb of the hydrograph suggests intense flushing of P by spring snowmelt runoff. Flushing then continues in the recession limb of the hydrograph, but at a more constant rate. The estimated total P load for the Mauvais Coulee basin is 193 kg/km2, consistent with other catchments of similar size across the Red River of the North basin to the east. We expect
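
    A hedged sketch of the kind of c-q analysis described above: fit the power law C = aQ^b in log space separately for the rising and recession limbs of a synthetic snowmelt hydrograph (b near zero indicates chemostatic behaviour). The data and fitting choices are illustrative, not those of the study.

```python
# Sketch of a concentration-discharge (c-q) analysis (illustrative only):
# fit log10(C) = log10(a) + b*log10(Q) separately for the rising and
# recession limbs of a snowmelt hydrograph.
import numpy as np

rng = np.random.default_rng(3)
q = np.concatenate([np.linspace(1, 20, 5),                 # rising limb (m3/s)
                    20 * np.exp(-0.3 * np.arange(9))])     # recession limb
rising = np.arange(len(q)) < 5

# Synthetic total-P concentrations: near-chemostatic on the rise,
# positively related to flow on the recession.
c = np.where(rising, 0.30, 0.05 * q ** 0.8) * np.exp(rng.normal(0, 0.05, len(q)))

def fit_cq(qs, cs):
    """Return (b, log10_a) of the power law C = a * Q**b."""
    b, log_a = np.polyfit(np.log10(qs), np.log10(cs), 1)
    return b, log_a

for name, mask in [("rising", rising), ("recession", ~rising)]:
    b, log_a = fit_cq(q[mask], c[mask])
    print(f"{name:>9} limb: b = {b:+.2f}  (b ~ 0 indicates chemostatic behaviour)")
```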

  15. Spatial Streamflow Forecasting in a Large River Basin in Northwestern Mexico using a Fully-distributed Hydrologic Model

    Science.gov (United States)

    Robles-Morua, A.; Vivoni, E. R.; Mayer, A. S.

    2010-12-01

    Spatial forecasting of streamflows in large watersheds of northwest Mexico is a challenge due to limited stream gauge stations and the high spatiotemporal variability of precipitation and landscape characteristics. To adequately manage water resources in this region, it is important to understand the spatiotemporal variability of streamflows. In this study, a distributed hydrologic model, the TIN-Based Real-Time Basin Simulator (tRIBS), was parameterized to generate estimates of streamflow as they relate to rainfall variability and landscape characteristics in the Rio Sonora (~9,500 km2) in northwest Mexico. Our distributed approach divided the watershed into 291 ungauged sub-basins (92% of the total basin area). For each sub-basin, tRIBS was forced using sparse ground observations from June 1, 2007 to May 31, 2008. To improve the model forcing, we explored the use of the North American Land Data Assimilation System (NLDAS) as an alternative method. Our simulations included spatiotemporal forcing from: (1) a sparse network of ground-based stations (hourly resolution), (2) raw model products (12 km pixel, hourly resolution) from NLDAS, and (3) the NLDAS product adjusted using available ground data. Simulations for the ungauged sub-basins were coupled to a Muskingum-Cunge model that routed the resulting streamflows to the watershed outlet for comparison with the only available stream gauge. Our continuous simulations provide spatially distributed estimates of streamflow, which allowed us to distinguish regions with different contributions to the main stem of the river. Comparisons between the simulations illustrate the impact of different rainfall forcings on the overall magnitude of streamflow estimates. Ground-based forcings typically overestimate streamflow predictions in the northern regions of the basin relative to the adjusted NLDAS dataset. Furthermore, we explore the relationship between the spatiotemporal variability of runoff generation mechanisms and landscape

  16. A general model for the influence of daily rainfall intensity and timing on streamflow variability and flood risk

    Science.gov (United States)

    Deal, E.; Dralle, D.; Braun, J.; Botter, G.

    2017-12-01

    The response of river basins to rainfall changes significantly from one basin to another and one storm to the next. This is because many time-varying processes influence how rainfall makes its way into streams, such as soil moisture dynamics, antecedent conditions as well as subsurface flow and storage. Using an established, parsimonious stochastic model of catchment hydrology simplifies the problem enough to handle this complexity and search for a general response of river basins to predicted changes in rainfall resulting from climate change. The model is applicable in basins without significant snowfall. Included are simplified representations of rainfall, soil moisture dynamics and evapotranspiration. Additionally, deep water storage and subsurface hydrology are parameterized with a simple model for streamflow recessions. The model suggests a general relationship between rainfall timing and intensity and the variability of streamflow. Specifically, we predict that basins with intense rainfall or rainfall that is correlated in time (rainfall over several consecutive days) will have streamflow with a higher coefficient of variation than basins with less intense or less correlated rainfall, all else equal. We test this prediction using a database of USGS gauged rivers minimally impacted by human activity. In basins without significant snowfall, the observed relationship between rainfall and streamflow variability matches the theory well. Further, the manner in which this relationship is modulated by streamflow recession characteristics agrees with our theory. Most importantly, we find the effect of rainfall intensity is sensitive to the nonlinearity of streamflow recessions. We use our new understanding of the effect of rainfall intensity and timing on streamflow variability and the controls on this relationship to quantify the sensitivity of streamflow variability to changes in rainfall. It has been predicted that rainfall intensity will increase in many places in
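
    The sketch below is a Monte Carlo stand-in for the analytical stochastic model described above, under simple assumptions (Poisson storm arrivals, exponential depths, a single nonlinear storage-discharge reservoir); it only illustrates the qualitative claim that rarer, more intense rainfall raises the coefficient of variation of streamflow.

```python
# Illustrative Monte Carlo sketch (not the analytical model of the abstract):
# simulate daily rainfall as a marked Poisson process feeding a nonlinear
# storage-discharge reservoir, and compare streamflow variability for two
# rainfall-intensity settings with the same long-term rainfall total.
import numpy as np

def simulate_cv(storm_freq, mean_depth, n_days=20000, k=0.05, b=1.5, seed=4):
    """Coefficient of variation of outflow from dS/dt = P - k*S**b."""
    rng = np.random.default_rng(seed)
    wet = rng.random(n_days) < storm_freq
    rain = np.where(wet, rng.exponential(mean_depth, n_days), 0.0)
    storage, flow = 10.0, np.empty(n_days)
    for t in range(n_days):
        q = k * storage ** b                 # nonlinear recession (mm/day)
        storage = max(storage + rain[t] - q, 0.0)
        flow[t] = q
    return flow.std() / flow.mean()

# Same mean rainfall (2 mm/day), delivered either as frequent light storms
# or as rare intense storms.
cv_light = simulate_cv(storm_freq=0.5, mean_depth=4.0)
cv_intense = simulate_cv(storm_freq=0.1, mean_depth=20.0)
print(f"CV of streamflow, frequent light rain : {cv_light:.2f}")
print(f"CV of streamflow, rare intense rain   : {cv_intense:.2f}")
```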

  17. A past discharge assimilation system for ensemble streamflow forecasts over France – Part 2: Impact on the ensemble streamflow forecasts

    Directory of Open Access Journals (Sweden)

    G. Thirel

    2010-08-01

    The use of ensemble streamflow forecasts is developing in international flood forecasting services. Ensemble streamflow forecast systems can provide more accurate forecasts and useful information about the uncertainty of the forecasts, thus improving the assessment of risks. Nevertheless, these systems, like all hydrological forecasts, suffer from errors in initialization or in meteorological data, which lead to hydrological prediction errors. This article, the second part of a two-part article, concerns the impacts of initial states, improved by a streamflow assimilation system, on an ensemble streamflow prediction system over France. An assimilation system was implemented to improve the streamflow analysis of the SAFRAN-ISBA-MODCOU (SIM) hydro-meteorological suite, which initializes the ensemble streamflow forecasts at Météo-France. This assimilation system, which uses the Best Linear Unbiased Estimator (BLUE) and modifies the initial soil moisture states, showed an improvement of the streamflow analysis with low soil moisture increments. The final states of this suite were used to initialize the ensemble streamflow forecasts of Météo-France, which are based on the SIM model and use the European Centre for Medium-range Weather Forecasts (ECMWF) 10-day Ensemble Prediction System (EPS). Two different configurations of the assimilation system were used in this study: the first with the classical SIM model and the second using improved soil physics in ISBA. The effects of the assimilation system on the ensemble streamflow forecasts were assessed for these two configurations, and a comparison was made with the original (i.e. without data assimilation and without the improved physics) ensemble streamflow forecasts. It is shown that the assimilation system improved most of the statistical scores usually computed for the validation of ensemble predictions (RMSE, Brier Skill Score and its decomposition, Ranked Probability Skill Score, False Alarm
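
    The generic analysis step behind such an assimilation scheme is the BLUE (equivalently, Kalman) update; the toy sketch below shows that update for a two-state soil-moisture vector and a single streamflow observation. All values and the linearized observation operator are illustrative placeholders, not SIM/ISBA quantities.

```python
# Toy sketch of a Best Linear Unbiased Estimator (BLUE) analysis step, the
# generic form of the update used in streamflow assimilation systems such as
# the one described above. All numbers below are illustrative placeholders.
import numpy as np

x_b = np.array([0.25, 0.40])            # background soil-moisture states (-)
B = np.diag([0.02**2, 0.03**2])         # background error covariance
y = np.array([150.0])                   # observed streamflow (m3/s)
R = np.array([[10.0**2]])               # observation error covariance

# Linearized observation operator: sensitivity of streamflow to each state
# (would come from the hydrological model in practice).
H = np.array([[300.0, 150.0]])

innovation = y - H @ x_b                              # y - H(x_b)
K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)          # BLUE / Kalman gain
x_a = x_b + K @ innovation                            # analysis state

print("soil-moisture increment:", np.round(x_a - x_b, 4))
```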

  18. Simulation of daily streamflow for 12 river basins in western Iowa using the Precipitation-Runoff Modeling System

    Science.gov (United States)

    Christiansen, Daniel E.; Haj, Adel E.; Risley, John C.

    2017-10-24

    The U.S. Geological Survey, in cooperation with the Iowa Department of Natural Resources, constructed Precipitation-Runoff Modeling System models to estimate daily streamflow for 12 river basins in western Iowa that drain into the Missouri River. The Precipitation-Runoff Modeling System is a deterministic, distributed-parameter, physical-process-based modeling system developed to evaluate the response of streamflow and general drainage basin hydrology to various combinations of climate and land use. Calibration periods for each basin varied depending on the period of record available for daily mean streamflow measurements at U.S. Geological Survey streamflow-gaging stations. A geographic information system tool was used to delineate each basin and estimate initial values for model parameters based on basin physical and geographical features. A U.S. Geological Survey automatic calibration tool that uses a shuffled complex evolution algorithm was used for initial calibration, and then manual modifications were made to parameter values to complete the calibration of each basin model. The main objective of the calibration was to match daily discharge values of simulated streamflow to measured daily discharge values. The Precipitation-Runoff Modeling System model was calibrated at 42 sites located in the 12 river basins in western Iowa. The accuracy of the simulated daily streamflow values at the 42 calibration sites varied by river and by site. The models were satisfactory at 36 of the sites based on statistical results. Unsatisfactory performance at the six other sites can be attributed to several factors: (1) low flow, no flow, and flashy flow conditions in headwater subbasins having a small drainage area; (2) poor representation of the groundwater and storage components of flow within a basin; (3) lack of accounting for basin withdrawals and water use; and (4) limited availability and accuracy of meteorological input data. The Precipitation-Runoff Modeling System
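
    As a hedged illustration of automatic calibration against daily streamflow, the sketch below calibrates a two-parameter toy linear-reservoir model by maximizing the Nash-Sutcliffe efficiency, using SciPy's differential_evolution as a stand-in for the shuffled-complex-evolution tool mentioned above; the model and data are synthetic.

```python
# Sketch of automatic streamflow calibration (illustrative): a two-parameter
# linear-reservoir toy model is calibrated by maximizing the Nash-Sutcliffe
# efficiency; differential_evolution stands in for shuffled complex evolution.
import numpy as np
from scipy.optimize import differential_evolution

rng = np.random.default_rng(5)
n = 1000
rain = rng.gamma(0.5, 6.0, n) * (rng.random(n) < 0.3)   # synthetic daily rain (mm)

def bucket(params, rain):
    """Runoff coefficient c and storage constant k of a linear reservoir."""
    c, k = params
    s, q = 0.0, np.empty(len(rain))
    for t, p in enumerate(rain):
        s += c * p
        q[t] = s / k
        s -= q[t]
    return q

q_obs = bucket((0.4, 15.0), rain) + rng.normal(0, 0.05, n)   # "observed" flow

def neg_nse(params):
    q_sim = bucket(params, rain)
    return -(1 - np.sum((q_obs - q_sim) ** 2) /
                 np.sum((q_obs - q_obs.mean()) ** 2))

result = differential_evolution(neg_nse, bounds=[(0.05, 0.95), (2.0, 60.0)],
                                seed=0, maxiter=50)
print("calibrated (c, k):", np.round(result.x, 2), " NSE:", round(-result.fun, 3))
```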

  19. Assessing the Use of Remote Sensing and a Crop Growth Model to Improve Modeled Streamflow in Central Asia

    Science.gov (United States)

    Richey, A. S.; Richey, J. E.; Tan, A.; Liu, M.; Adam, J. C.; Sokolov, V.

    2015-12-01

    Central Asia presents a perfect case study to understand the dynamic, and often conflicting, linkages between food, energy, and water in natural systems. The destruction of the Aral Sea is a well-known environmental disaster, largely driven by increased irrigation demand on the rivers that feed the endorheic sea. Continued reliance on these rivers, the Amu Darya and Syr Darya, often places available water resources at odds between hydropower demands upstream and irrigation requirements downstream. A combination of tools is required to understand these linkages and how they may change in the future as a function of climate change and population growth. In addition, the region is geopolitically complex as the former Soviet basin states develop management strategies to sustainably manage shared resources. This complexity increases the importance of relying upon publicly available information sources and tools. Preliminary work has shown potential for the Variable Infiltration Capacity (VIC) model to recreate the natural water balance in the Amu Darya and Syr Darya basins by comparing results to total terrestrial water storage changes observed from NASA's Gravity Recovery and Climate Experiment (GRACE) satellite mission. Modeled streamflow is well correlated to observed streamflow at upstream gauges prior to the large-scale expansion of irrigation and hydropower. However, current modeled results are unable to capture the human influence of water use on downstream flow. This study examines the utility of a crop simulation model, CropSyst, to represent irrigation demand and GRACE to improve modeled streamflow estimates in the Amu Darya and Syr Darya basins. Specifically, we determine crop water demand with CropSyst utilizing available data on irrigation schemes and cropping patterns. We determine how this demand can be met either by surface water, modeled by VIC with a reservoir operation scheme, and/or by groundwater derived from GRACE. Finally, we assess how the

  20. Ecohydrologic Response of a Wetland Indicator Species to Climate Change and Streamflow Regulation: A Conceptual Model

    Science.gov (United States)

    Ward, E. M.; Gorelick, S.

    2015-12-01

    The Peace-Athabasca Delta ("Delta") in northeastern Alberta, Canada, is a UNESCO World Heritage Site and a Ramsar Wetland of International Importance. Delta ecohydrology is expected to respond rapidly to upstream water demand and climate change, with earlier spring meltwater, decreased springtime peak flow, and a decline in springtime ice-jam flooding. We focus on changes in the population and distribution of muskrat (Ondatra zibethicus), an ecohydrologic indicator species. We present a conceptual model linking hydrology and muskrat ecology. Our conceptual model links seven modules representing (1) upstream water demand, (2) streamflow and snowmelt, (3) floods, (4) the water balance of floodplain lakes, (5) muskrat habitat suitability, (6) wetland vegetation, and (7) muskrat population dynamics predicted using an agent-based model. Our goal is to evaluate the effects of different climate change and upstream water demand scenarios on the abundance and distribution of Delta muskrat, from the present to 2100. Moving from the current conceptual model to a predictive quantitative model, we will rely on abundant existing data and Traditional Ecological Knowledge of muskrat and hydrology in the Delta.

  1. Neural networks modelling of streamflow, phosphorus, and suspended solids: application to the Canadian Boreal forest.

    Science.gov (United States)

    Nour, M H; Smith, D W; Gamal El-Din, M; Prepas, E E

    2006-01-01

    Sediment has long been identified as an important vector for the transport of nutrients and contaminants such as heavy metals and microorganisms. The associated nutrient loading to water bodies can potentially lead to dissolved oxygen depletion, cyanobacteria toxin production and ultimately eutrophication. This study proposed an artificial neural network (ANN) modelling algorithm that relies on low-cost, readily available meteorological data for simulating streamflow (Q), total suspended solids (TSS) concentration, and total phosphorus (TP) concentration. The models were applied to a 130-km2 watershed in the Canadian Boreal Plain. Our results demonstrated that through careful manipulation of time series analysis and rigorous optimization of ANN configuration, it is possible to simulate Q, TSS, and TP reasonably well. R2 values exceeding 0.89 were obtained for all modelled data cases. The proposed models can provide real-time predictions of the modelled parameters, can answer questions related to the impact of climate change scenarios on water quantity and quality, and can be implemented in water resources management through Monte Carlo simulations.
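
    A minimal sketch, assuming scikit-learn is available, of an ANN streamflow model driven by lagged meteorological inputs of the kind described above; the network size, lags and synthetic data are illustrative and not the authors' configuration.

```python
# Minimal sketch of an ANN streamflow model driven by readily available
# meteorological inputs (illustrative; not the authors' network). Synthetic
# daily rainfall and temperature are lagged and fed to a small multilayer
# perceptron from scikit-learn.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(6)
n = 3000
rain = rng.gamma(0.6, 5.0, n) * (rng.random(n) < 0.35)
temp = 10 + 12 * np.sin(2 * np.pi * np.arange(n) / 365) + rng.normal(0, 2, n)

# Synthetic streamflow: delayed rainfall response damped by evaporation.
q = np.convolve(rain, np.exp(-np.arange(20) / 5), mode="full")[:n]
q = np.maximum(q - 0.05 * np.maximum(temp, 0), 0.01)

lags = 5
X = np.column_stack(
    [rain[lags - k:n - k] for k in range(1, lags + 1)] +
    [temp[lags - 1:n - 1]]
)
y = q[lags:]

split = int(0.8 * len(y))
ann = make_pipeline(StandardScaler(),
                    MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000,
                                 random_state=0))
ann.fit(X[:split], y[:split])
r2 = ann.score(X[split:], y[split:])
print(f"validation R^2 = {r2:.2f}")
```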

  2. Streamflow characteristics from modelled runoff time series: Importance of calibration criteria selection

    Science.gov (United States)

    Poole, Sandra; Vis, Marc; Knight, Rodney; Seibert, Jan

    2017-01-01

    Ecologically relevant streamflow characteristics (SFCs) of ungauged catchments are often estimated from simulated runoff of hydrologic models that were originally calibrated on gauged catchments. However, SFC estimates of the gauged donor catchments and subsequently the ungauged catchments can be substantially uncertain when models are calibrated using traditional approaches based on optimization of statistical performance metrics (e.g., Nash–Sutcliffe model efficiency). An improved calibration strategy for gauged catchments is therefore crucial to help reduce the uncertainties of estimated SFCs for ungauged catchments. The aim of this study was to improve SFC estimates from modeled runoff time series in gauged catchments by explicitly including one or several SFCs in the calibration process. Different types of objective functions were defined consisting of the Nash–Sutcliffe model efficiency, single SFCs, or combinations thereof. We calibrated a bucket-type runoff model (HBV – Hydrologiska Byråns Vattenavdelning – model) for 25 catchments in the Tennessee River basin and evaluated the proposed calibration approach on 13 ecologically relevant SFCs representing major flow regime components and different flow conditions. While the model generally tended to underestimate the tested SFCs related to mean and high-flow conditions, SFCs related to low flow were generally overestimated. The highest estimation accuracies were achieved by a SFC-specific model calibration. Estimates of SFCs not included in the calibration process were of similar quality when comparing a multi-SFC calibration approach to a traditional model efficiency calibration. For practical applications, this implies that SFCs should preferably be estimated from targeted runoff model calibration, and modeled estimates need to be carefully interpreted.
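
    The sketch below illustrates the kind of composite objective function described above, combining the Nash-Sutcliffe efficiency with relative errors of two example streamflow characteristics (mean flow and the Q90 low flow); the SFC choice and weighting are hypothetical, not those of the study.

```python
# Illustrative composite calibration objective: combine the Nash-Sutcliffe
# efficiency with relative errors in selected streamflow characteristics
# (here mean flow and the Q90 low flow). Weights and SFCs are hypothetical.
import numpy as np

def nse(obs, sim):
    return 1 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def sfc_errors(obs, sim):
    """Relative errors of two example streamflow characteristics."""
    mean_err = abs(sim.mean() - obs.mean()) / obs.mean()
    q90_obs, q90_sim = np.percentile(obs, 10), np.percentile(sim, 10)  # low flow
    q90_err = abs(q90_sim - q90_obs) / q90_obs
    return mean_err, q90_err

def composite_objective(obs, sim, w_nse=0.5):
    """Higher is better; SFC errors are converted to [0, 1] scores."""
    mean_err, q90_err = sfc_errors(obs, sim)
    sfc_score = 1 - 0.5 * (min(mean_err, 1.0) + min(q90_err, 1.0))
    return w_nse * nse(obs, sim) + (1 - w_nse) * sfc_score

rng = np.random.default_rng(7)
obs = np.maximum(rng.gamma(2.0, 5.0, 3650), 0.1)
sim = np.maximum(obs * 0.9 + rng.normal(0, 2, obs.size), 0.05)  # a biased model
print(f"NSE only         : {nse(obs, sim):.2f}")
print(f"NSE + SFC scores : {composite_objective(obs, sim):.2f}")
```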

  3. Streamflow characteristics from modeled runoff time series - importance of calibration criteria selection

    Science.gov (United States)

    Pool, Sandra; Vis, Marc J. P.; Knight, Rodney R.; Seibert, Jan

    2017-11-01

    Ecologically relevant streamflow characteristics (SFCs) of ungauged catchments are often estimated from simulated runoff of hydrologic models that were originally calibrated on gauged catchments. However, SFC estimates of the gauged donor catchments and subsequently the ungauged catchments can be substantially uncertain when models are calibrated using traditional approaches based on optimization of statistical performance metrics (e.g., Nash-Sutcliffe model efficiency). An improved calibration strategy for gauged catchments is therefore crucial to help reduce the uncertainties of estimated SFCs for ungauged catchments. The aim of this study was to improve SFC estimates from modeled runoff time series in gauged catchments by explicitly including one or several SFCs in the calibration process. Different types of objective functions were defined consisting of the Nash-Sutcliffe model efficiency, single SFCs, or combinations thereof. We calibrated a bucket-type runoff model (HBV - Hydrologiska Byråns Vattenavdelning - model) for 25 catchments in the Tennessee River basin and evaluated the proposed calibration approach on 13 ecologically relevant SFCs representing major flow regime components and different flow conditions. While the model generally tended to underestimate the tested SFCs related to mean and high-flow conditions, SFCs related to low flow were generally overestimated. The highest estimation accuracies were achieved by a SFC-specific model calibration. Estimates of SFCs not included in the calibration process were of similar quality when comparing a multi-SFC calibration approach to a traditional model efficiency calibration. For practical applications, this implies that SFCs should preferably be estimated from targeted runoff model calibration, and modeled estimates need to be carefully interpreted.

  4. Regionalization of subsurface stormflow parameters of hydrologic models: Derivation from regional analysis of streamflow recession curves

    Energy Technology Data Exchange (ETDEWEB)

    Ye, Sheng; Li, Hongyi; Huang, Maoyi; Ali, Melkamu; Leng, Guoyong; Leung, Lai-Yung R.; Wang, Shaowen; Sivapalan, Murugesu

    2014-07-21

    Subsurface stormflow is an important component of the rainfall–runoff response, especially in steep terrain. Its contribution to total runoff is, however, poorly represented in the current generation of land surface models. The lack of physical basis of these common parameterizations precludes a priori estimation of the stormflow (i.e. without calibration), which is a major drawback for prediction in ungauged basins, or for use in global land surface models. This paper is aimed at deriving regionalized parameterizations of the storage–discharge relationship relating to subsurface stormflow from a top–down empirical data analysis of streamflow recession curves extracted from 50 eastern United States catchments. Detailed regression analyses were performed between parameters of the empirical storage–discharge relationships and the controlling climate, soil and topographic characteristics. The regression analyses performed on empirical recession curves at catchment scale indicated that the coefficient of the power-law form storage–discharge relationship is closely related to the catchment hydrologic characteristics, which is consistent with the hydraulic theory derived mainly at the hillslope scale. As for the exponent, besides the role of field scale soil hydraulic properties as suggested by hydraulic theory, it is found to be more strongly affected by climate (aridity) at the catchment scale. At a fundamental level these results point to the need for more detailed exploration of the co-dependence of soil, vegetation and topography with climate.
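
    A hedged sketch of a recession-curve analysis of the kind used above: extract recession days from a synthetic hydrograph and fit log(-dQ/dt) against log(Q), from which power-law storage-discharge parameters can be inferred; the synthetic model and selection rules are illustrative only.

```python
# Sketch of a recession-curve analysis (illustrative): extract days of
# declining flow with no rain and fit log(-dQ/dt) = log(k) + a*log(Q).
# For Q = c*S**b, theory gives -dQ/dt proportional to Q**(2 - 1/b);
# with b = 1.6 below, the expected exponent is about 1.4.
import numpy as np

rng = np.random.default_rng(8)
n = 2000
rain = rng.gamma(0.5, 8.0, n) * (rng.random(n) < 0.2)
q = np.empty(n)
s = 50.0
for t in range(n):
    out = 0.01 * s ** 1.6                 # "true" nonlinear storage-discharge
    s = max(s + rain[t] - out, 1e-3)
    q[t] = out

dq = np.diff(q)
q_mid = 0.5 * (q[1:] + q[:-1])
# Recession points: flow declining and no rain on either day.
mask = (dq < 0) & (rain[1:] == 0) & (rain[:-1] == 0)

slope, intercept = np.polyfit(np.log(q_mid[mask]), np.log(-dq[mask]), 1)
print(f"fitted recession exponent a = {slope:.2f}")
print(f"fitted recession coefficient k = {np.exp(intercept):.4f}")
```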

  5. Modeling streamflow from coupled airborne laser scanning and acoustic Doppler current profiler data

    Science.gov (United States)

    Norris, Lam; Kean, Jason W.; Lyon, Steve

    2016-01-01

    The rating curve enables the translation of water depth into stream discharge through a reference cross-section. This study investigates coupling national scale airborne laser scanning (ALS) and acoustic Doppler current profiler (ADCP) bathymetric survey data for generating stream rating curves. A digital terrain model was defined from these data and applied in a physically based 1-D hydraulic model to generate rating curves for a regularly monitored location in northern Sweden. Analysis of the ALS data showed that overestimation of the streambank elevation could be adjusted with a root mean square error (RMSE) block adjustment using a higher accuracy manual topographic survey. The results of our study demonstrate that the rating curve generated from the vertically corrected ALS data combined with ADCP data had lower errors (RMSE = 0.79 m3/s) than the empirical rating curve (RMSE = 1.13 m3/s) when compared to streamflow measurements. We consider these findings encouraging as hydrometric agencies can potentially leverage national-scale ALS and ADCP instrumentation to reduce the cost and effort required for maintaining and establishing rating curves at gauging station sites similar to the Röån River.
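
    For comparison with the hydraulic-model approach above, the sketch below fits the conventional empirical power-law rating Q = a(h - h0)^b to synthetic gaugings with SciPy; the parameter values and data are illustrative.

```python
# Sketch of fitting an empirical power-law rating curve Q = a*(h - h0)**b to
# stage-discharge gaugings (illustrative; the study above instead builds the
# curve from a 1-D hydraulic model on ALS/ADCP geometry). Data are synthetic.
import numpy as np
from scipy.optimize import curve_fit

def rating(h, a, h0, b):
    return a * np.clip(h - h0, 0.0, None) ** b

rng = np.random.default_rng(9)
stage = np.linspace(0.4, 2.5, 25)                          # gauged stages (m)
q_true = rating(stage, 6.0, 0.25, 1.7)
q_meas = q_true * np.exp(rng.normal(0, 0.05, stage.size))  # noisy gaugings

params, _ = curve_fit(rating, stage, q_meas, p0=(5.0, 0.1, 1.5))
a, h0, b = params
resid = rating(stage, *params) - q_meas
print(f"a = {a:.2f}, h0 = {h0:.2f} m, b = {b:.2f}")
print(f"RMSE = {np.sqrt(np.mean(resid ** 2)):.2f} m3/s")
```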

  6. An initial abstraction and constant loss model, and methods for estimating unit hydrographs, peak streamflows, and flood volumes for urban basins in Missouri

    Science.gov (United States)

    Huizinga, Richard J.

    2014-01-01

    Streamflow data, basin characteristics, and rainfall data from 39 streamflow-gaging stations for urban areas in and adjacent to Missouri were used by the U.S. Geological Survey in cooperation with the Metropolitan Sewer District of St. Louis to develop an initial abstraction and constant loss model (a time-distributed basin-loss model) and a gamma unit hydrograph (GUH) for urban areas in Missouri. Study-specific methods to determine peak streamflow and flood volume for a given rainfall event also were developed.
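
    The sketch below strings together the two components named in this record, an initial-abstraction-and-constant-loss computation of excess rainfall and convolution with a gamma unit hydrograph, using arbitrary parameter values rather than those of the USGS study.

```python
# Illustrative sketch: excess rainfall from an initial abstraction and a
# constant loss, convolved with a gamma unit hydrograph. Parameter values
# are arbitrary, not those of the study above.
import numpy as np
from scipy.stats import gamma as gamma_dist

def excess_rainfall(rain, ia=10.0, loss_rate=2.0):
    """Initial abstraction ia (mm) is filled first, then a constant loss (mm/h)."""
    excess, abstracted = np.zeros_like(rain), 0.0
    for t, p in enumerate(rain):
        take = min(p, ia - abstracted)          # fill the initial abstraction
        abstracted += take
        excess[t] = max(p - take - loss_rate, 0.0)
    return excess

def gamma_unit_hydrograph(n_ord=48, shape=3.0, scale=2.0):
    """Ordinates of a unit hydrograph from a gamma distribution (sums to 1)."""
    t = np.arange(n_ord)
    u = gamma_dist.pdf(t, a=shape, scale=scale)
    return u / u.sum()

rain = np.array([0, 5, 18, 25, 12, 4, 0, 0, 0, 0], dtype=float)  # mm per hour
excess = excess_rainfall(rain)
uh = gamma_unit_hydrograph()
runoff = np.convolve(excess, uh)                                  # mm per hour

print("excess rainfall (mm):", np.round(excess, 1))
print(f"peak runoff ordinate: {runoff.max():.2f} mm/h at hour {runoff.argmax()}")
```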

  7. A precipitation-runoff model for simulating natural streamflow conditions in the Smith River watershed, Montana, water years 1996-2008

    Science.gov (United States)

    Chase, Katherine J.; Caldwell, Rodney R.; Stanley, Andrea K.

    2014-01-01

    This report documents the construction of a precipitation-runoff model for simulating natural streamflow in the Smith River watershed, Montana. This Precipitation-Runoff Modeling System model, constructed in cooperation with the Meagher County Conservation District, can be used to examine the general hydrologic framework of the Smith River watershed, including quantification of precipitation, evapotranspiration, and streamflow; partitioning of streamflow between surface runoff and subsurface flow; and quantifying contributions to streamflow from several parts of the watershed. The model was constructed by using spatial datasets describing watershed topography, the streams, and the hydrologic characteristics of the basin soils and vegetation. Time-series data (daily total precipitation, and daily minimum and maximum temperature) were input to the model to simulate daily streamflow. The model was calibrated for water years 2002–2007 and evaluated for water years 1996–2001. Though water year 2008 was included in the study period to evaluate water-budget components, calibration and evaluation data were unavailable for that year. During the calibration and evaluation periods, simulated-natural flow values were compared to reconstructed-natural streamflow data. These reconstructed-natural streamflow data were calculated by adding the Bureau of Reclamation's depletions data to the observed streamflows. Reconstructed-natural streamflows represent estimates of streamflows for water years 1996–2007 assuming there was no agricultural water-resources development in the watershed. Additional calibration targets were basin mean monthly solar radiation and potential evapotranspiration. The model simulated the hydrologic processes in the Smith River watershed during the calibration and evaluation periods. Simulated-natural mean annual and mean monthly flows generally were the same or higher than the reconstructed-natural streamflow values during the calibration period, whereas

  8. Daily Streamflow Predictions in an Ungauged Watershed in Northern California Using the Precipitation-Runoff Modeling System (PRMS): Calibration Challenges when nearby Gauged Watersheds are Hydrologically Dissimilar

    Science.gov (United States)

    Dhakal, A. S.; Adera, S.

    2017-12-01

    Accurate daily streamflow prediction in ungauged watersheds with sparse information is challenging. The ability of a hydrologic model calibrated using nearby gauged watersheds to predict streamflow accurately depends on hydrologic similarities between the gauged and ungauged watersheds. This study examines daily streamflow predictions using the Precipitation-Runoff Modeling System (PRMS) for the largely ungauged San Antonio Creek watershed, a 96 km2 sub-watershed of the Alameda Creek watershed in Northern California. The process-based PRMS model is being used to improve the accuracy of recent San Antonio Creek streamflow predictions generated by two empirical methods. Although the San Antonio Creek watershed is largely ungauged, daily streamflow data exist for hydrologic years (HY) 1913 - 1930. PRMS was calibrated for HY 1913 - 1930 using streamflow data, modern-day land use and PRISM precipitation distribution, and gauged precipitation and temperature data from a nearby watershed. The PRMS model was then used to generate daily streamflows for HY 1996-2013, during which the watershed was ungauged, and hydrologic responses were compared to two nearby gauged sub-watersheds of Alameda Creek. Finally, the PRMS-predicted daily flows for HY 1996-2013 were compared to the two empirically predicted streamflow time series: (1) the reservoir mass balance method and (2) correlation of historical streamflows from 80 - 100 years ago between San Antonio Creek and a nearby sub-watershed located in Alameda Creek. While the mass balance approach using reservoir storage and transfers is helpful for estimating inflows to the reservoir, large discrepancies in daily streamflow estimation can arise. Similarly, correlation-based daily flow predictions that rely on a relationship derived from flows collected 80-100 years ago may not represent current watershed hydrologic conditions. This study aims to develop a method of streamflow prediction in the San Antonio Creek watershed by examining PRMS
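
    A minimal sketch of the reservoir mass-balance idea mentioned above: daily inflow is back-calculated as the storage change plus all outflows; the variable names, units and evaporation term are illustrative assumptions.

```python
# Minimal sketch of the reservoir mass-balance approach: daily inflow is
# back-calculated as the storage change plus all outflows. Variable names,
# units and the evaporation term are illustrative assumptions.
import numpy as np

def mass_balance_inflow(storage, releases, transfers_out, evaporation):
    """Inflow[t] = (S[t+1] - S[t]) + releases[t] + transfers_out[t] + evap[t]."""
    d_storage = np.diff(storage)
    return d_storage + releases + transfers_out + evaporation

storage = np.array([1200.0, 1215.0, 1204.0, 1190.0, 1198.0])  # acre-ft, 5 days
releases = np.array([8.0, 9.0, 10.0, 9.0])                    # acre-ft/day
transfers_out = np.array([2.0, 2.0, 2.0, 2.0])
evaporation = np.array([0.5, 0.6, 0.6, 0.5])

inflow = mass_balance_inflow(storage, releases, transfers_out, evaporation)
print("estimated daily inflow (acre-ft):", np.round(inflow, 1))
```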

  9. An Integrated Modeling System for Estimating Glacier and Snow Melt Driven Streamflow from Remote Sensing and Earth System Data Products in the Himalayas

    Science.gov (United States)

    Brown, M. E.; Racoviteanu, A. E.; Tarboton, D. G.; Sen Gupta, A.; Nigro, J.; Policelli, F.; Habib, S.; Tokay, M.; Shrestha, M. S.; Bajracharya, S.

    2014-01-01

    Quantification of the contribution of the hydrologic components (snow, ice and rain) to river discharge in the Hindu Kush Himalayan (HKH) region is important for decision-making in water-sensitive sectors, and for water resources management and flood risk reduction. In this area, monitoring of the glaciers and their melt outflow is challenging due to difficult access; thus, modeling based on remote sensing offers the potential for providing information to improve water resources management and decision making. This paper describes an integrated modeling system developed using downscaled NASA satellite-based and earth system data products coupled with in-situ hydrologic data to assess the contribution of snow and glaciers to the flows of the rivers in the HKH region. Snow and glacier melt was estimated using the Utah Energy Balance (UEB) model, further enhanced to accommodate glacier ice melt over clean and debris-covered tongues; the meltwater was then input into the USGS Geospatial Stream Flow Model (GeoSFM). The two model components were integrated into the Better Assessment Science Integrating point and Nonpoint Sources (BASINS) modeling framework as a user-friendly open-source system and made available to countries in High Asia. Here we present a case study from the Langtang Khola watershed in the monsoon-influenced Nepal Himalaya, used to validate our energy balance approach and to test the applicability of our modeling system. The snow and glacier melt model predicts that for the eight years used for model evaluation (October 2003-September 2010), the total surface water input over the basin was 9.43 m, originating as 62% from glacier melt, 30% from snowmelt and 8% from rainfall. Measured streamflow for those years was 5.02 m, reflecting a runoff coefficient of 0.53. GeoSFM-simulated streamflow was 5.31 m, indicating reasonable correspondence between measured and modeled streamflow and confirming the capability of the integrated system to provide a quantification

  10. Efficient multi-scenario Model Predictive Control for water resources management with ensemble streamflow forecasts

    Science.gov (United States)

    Tian, Xin; Negenborn, Rudy R.; van Overloop, Peter-Jules; María Maestre, José; Sadowska, Anna; van de Giesen, Nick

    2017-11-01

    Model Predictive Control (MPC) is one of the most advanced real-time control techniques and has been widely applied to Water Resources Management (WRM). MPC can manage the water system in a holistic manner and has a flexible structure to incorporate specific elements, such as setpoints and constraints. Therefore, MPC has shown its versatile performance in many branches of WRM. Nonetheless, with the in-depth understanding of stochastic hydrology in recent studies, MPC also faces the challenge of how to cope with hydrological uncertainty in its decision-making process. A possible way to embed the uncertainty is to generate an Ensemble Forecast (EF) of hydrological variables, rather than a deterministic one. The combination of MPC and EF results in a more comprehensive approach: Multi-scenario MPC (MS-MPC). In this study, we first assess the model performance of MS-MPC, considering an ensemble streamflow forecast. Notably, computational inefficiency may be a critical obstacle that hinders the applicability of MS-MPC. In fact, with more scenarios taken into account, the computational burden of solving an optimization problem in MS-MPC accordingly increases. To deal with this challenge, we propose the Adaptive Control Resolution (ACR) approach as a computationally efficient scheme to practically reduce the number of control variables in MS-MPC. In brief, the ACR approach uses a mixed-resolution control time step from the near future to the distant future. The ACR-MPC approach is tested on a real-world case study: an integrated flood control and navigation problem in the North Sea Canal of the Netherlands. Such an approach reduces the computation time by 18% or more in our case study. At the same time, the model performance of ACR-MPC remains close to that of conventional MPC.

  11. Uncertainties in Forecasting Streamflow using Entropy Theory

    Science.gov (United States)

    Cui, H.; Singh, V. P.

    2017-12-01

    Streamflow forecasting is essential in river restoration, reservoir operation, power generation, irrigation, navigation, and water management. However, uncertainty always accompanies a forecast, which may affect the forecasting results and lead to large variations. Therefore, uncertainties must be considered and assessed properly when forecasting streamflow for water management. The aim of our work is to quantify the uncertainties involved in forecasting streamflow and to provide reliable streamflow forecasts. Although streamflow time series are stochastic, they exhibit seasonal and periodic patterns. Therefore, streamflow forecasting entails modeling seasonality, periodicity and the correlation structure of the series, and assessing the associated uncertainties. This study applies entropy theory to forecast streamflow and measure uncertainties during the forecasting process. To apply entropy theory for streamflow forecasting, spectral analysis is combined with time series analysis, as spectral analysis can be employed to characterize patterns of streamflow variation and identify the periodicity of streamflow. That is, it permits the extraction of significant information for understanding the streamflow process and its prediction. Application of entropy theory for streamflow forecasting involves determination of the spectral density, determination of parameters, and extension of the autocorrelation function. The uncertainties introduced by the precipitation input, the forecasting model and the forecasted results are measured separately using entropy. Using information theory, we describe how these uncertainties are transported and aggregated through these processes.

  12. Accounting for seasonal isotopic patterns of forest canopy intercepted precipitation in streamflow modeling

    Science.gov (United States)

    Stockinger, Michael P.; Lücke, Andreas; Vereecken, Harry; Bogena, Heye R.

    2017-12-01

    Forest canopy interception alters the isotopic tracer signal of precipitation, leading to significant isotopic differences between open precipitation (δOP) and throughfall (δTF). This has important consequences for the tracer-based modeling of streamwater transit times. Some studies have suggested using a simple static correction to δOP by uniformly increasing it, because δTF is rarely available for hydrological modeling. Here, we used data from a 38.5 ha spruce-forested headwater catchment where three years of δOP and δTF were available to develop a data-driven method that accounts for canopy effects on δOP. Changes in isotopic composition, defined as the difference δTF-δOP, varied seasonally with higher values during winter and lower values during summer. We used this pattern to derive a corrected δOP time series and analyzed the impact of using (1) δOP, (2) reference throughfall data (δTFref) and (3) the corrected δOP time series (δOPSine) in estimating the fraction of young water (Fyw), i.e., the percentage of streamflow younger than two to three months. We found that Fyw derived from δOPSine came closer to δTFref in comparison to δOP. Thus, a seasonally varying correction for δOP can be successfully used to infer δTF where it is not available and is superior to the method of using a fixed correction factor. Seasonal isotopic enrichment patterns should be accounted for when estimating Fyw and, more generally, in catchment hydrology studies using other tracer methods to reduce uncertainty.
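
    A hedged sketch of such a seasonally varying correction: fit a sinusoid to the difference δTF - δOP by linear least squares and add it to δOP; the synthetic data and the annual-sinusoid form are assumptions for illustration.

```python
# Sketch of a seasonally varying throughfall correction (illustrative):
# fit offset(t) = c0 + c1*sin(wt) + c2*cos(wt) to the observed difference
# delta_TF - delta_OP and add it to delta_OP where throughfall is missing.
import numpy as np

rng = np.random.default_rng(10)
doy = np.arange(1, 3 * 365, 7)                       # weekly sampling, three years
d_op = -12 + 4 * np.sin(2 * np.pi * doy / 365.25) + rng.normal(0, 1.0, doy.size)
true_offset = 0.8 + 0.6 * np.sin(2 * np.pi * doy / 365.25 + 2.5)   # winter-high
d_tf = d_op + true_offset + rng.normal(0, 0.2, doy.size)

# Linear least-squares fit of the seasonal offset (equivalent to a sine with
# amplitude and phase).
w = 2 * np.pi * doy / 365.25
A = np.column_stack([np.ones_like(w), np.sin(w), np.cos(w)])
coef, *_ = np.linalg.lstsq(A, d_tf - d_op, rcond=None)

d_op_corrected = d_op + A @ coef                     # "corrected delta_OP" series
rmse = np.sqrt(np.mean((d_op_corrected - d_tf) ** 2))
print("fitted mean offset and sine/cosine terms:", np.round(coef, 2))
print(f"RMSE of corrected series vs throughfall: {rmse:.2f} per mil")
```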

  13. Remote Sensing-based Methodologies for Snow Model Adjustments in Operational Streamflow Prediction

    Science.gov (United States)

    Bender, S.; Miller, W. P.; Bernard, B.; Stokes, M.; Oaida, C. M.; Painter, T. H.

    2015-12-01

    Water management agencies rely on hydrologic forecasts issued by operational agencies such as NOAA's Colorado Basin River Forecast Center (CBRFC). The CBRFC has partnered with the Jet Propulsion Laboratory (JPL) under funding from NASA to incorporate research-oriented, remotely-sensed snow data into CBRFC operations and to improve the accuracy of CBRFC forecasts. The partnership has yielded valuable analysis of snow surface albedo as represented in JPL's MODIS Dust Radiative Forcing in Snow (MODDRFS) data, across the CBRFC's area of responsibility. When dust layers within a snowpack emerge, reducing the snow surface albedo, the snowmelt rate may accelerate. The CBRFC operational snow model (SNOW17) is a temperature-index model that lacks explicit representation of snowpack surface albedo. CBRFC forecasters monitor MODDRFS data for emerging dust layers and may manually adjust SNOW17 melt rates. A technique was needed for efficient and objective incorporation of the MODDRFS data into SNOW17. Initial development focused in Colorado, where dust-on-snow events frequently occur. CBRFC forecasters used retrospective JPL-CBRFC analysis and developed a quantitative relationship between MODDRFS data and mean areal temperature (MAT) data. The relationship was used to generate adjusted, MODDRFS-informed input for SNOW17. Impacts of the MODDRFS-SNOW17 MAT adjustment method on snowmelt-driven streamflow prediction varied spatially and with characteristics of the dust deposition events. The largest improvements occurred in southwestern Colorado, in years with intense dust deposition events. Application of the method in other regions of Colorado and in "low dust" years resulted in minimal impact. The MODDRFS-SNOW17 MAT technique will be implemented in CBRFC operations in late 2015, prior to spring 2016 runoff. Collaborative investigation of remote sensing-based adjustment methods for the CBRFC operational hydrologic forecasting environment will continue over the next several years.

  14. Modeling Potential Impacts of Climate Change on Streamflow Using Projections of the 5th Assessment Report for the Bernam River Basin, Malaysia

    Directory of Open Access Journals (Sweden)

    Nkululeko Simeon Dlamini

    2017-03-01

    Potential impacts of climate change on the streamflow of the Bernam River Basin in Malaysia are assessed using ten Global Climate Models (GCMs) under three Representative Concentration Pathways (RCP4.5, RCP6.0 and RCP8.5). A graphical user interface was developed that integrates all of the common procedures for assessing climate change impacts, to generate high-resolution climate variables (e.g., rainfall, temperature) at the local scale from large-scale climate models. These are linked in one executable module to generate future climate sequences that can be used as inputs to various models, including hydrological and crop models. The generated outputs were used as inputs to the SWAT hydrological model to simulate the hydrological processes. The evaluation results indicated that the model performed well for the watershed, with monthly R2, Nash-Sutcliffe Efficiency (NSE) and Percent Bias (PBIAS) values of 0.67, 0.62 and −9.4 for the calibration period and 0.62, 0.61 and −4.2 for the validation period, respectively. The multi-model projections show an increase in future temperature (tmax and tmin) in all scenarios, up to an average of 2.5 °C under the worst-case scenario (RCP8.5). Rainfall is also predicted to change, with clear variations between the dry and wet seasons. Streamflow projections also followed the rainfall pattern to a great extent, with a distinct change between the dry and wet seasons, possibly due to the increase in evapotranspiration in the watershed. In principle, the interface can be customized for application to other watersheds by incorporating GCMs' baseline data and their corresponding future data for the stations in the new watershed. Methodological limitations of the study are also discussed.
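
    For reference, the three goodness-of-fit measures quoted above can be written as small functions (standard definitions, with the PBIAS sign convention common in SWAT studies, where negative values indicate overestimation); the toy series below only demonstrates the calls.

```python
# The goodness-of-fit measures quoted above, written out as small functions.
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe efficiency (1 is perfect, < 0 is worse than the mean)."""
    return 1 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def r_squared(obs, sim):
    """Coefficient of determination of the obs-sim linear correlation."""
    return np.corrcoef(obs, sim)[0, 1] ** 2

def pbias(obs, sim):
    """Percent bias: 100 * sum(obs - sim) / sum(obs); negative = overestimation."""
    return 100.0 * np.sum(obs - sim) / np.sum(obs)

# Toy monthly series to show the calls (values are arbitrary).
obs = np.array([12.0, 30.0, 55.0, 80.0, 60.0, 35.0, 20.0, 15.0, 18.0, 25.0, 40.0, 22.0])
sim = obs * 1.05 + np.array([1, -2, 3, -4, 2, -1, 1, 0, -1, 2, -3, 1.0])

print(f"R2 = {r_squared(obs, sim):.2f}, NSE = {nse(obs, sim):.2f}, "
      f"PBIAS = {pbias(obs, sim):+.1f}%")
```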

  15. 78 FR 13874 - Watershed Modeling To Assess the Sensitivity of Streamflow, Nutrient, and Sediment Loads to...

    Science.gov (United States)

    2013-03-01

    …characterize the sensitivity of streamflow, nutrient (nitrogen and phosphorus) loading, and sediment loading to a range of plausible mid-21st century climate… The Docket is located in the EPA Headquarters Docket Center, EPA West Building, Room 3334, 1301 Constitution…

  16. Modeling Change in Watershed Streamflow, Groundwater Recharge and Surface Water - Groundwater Interactions Due to Irrigation and Associated Diversions and Pumping

    Science.gov (United States)

    Essaid, H.; Caldwell, R. R.

    2015-12-01

    The impacts of irrigation and associated surface water (SW) diversions and groundwater (GW) pumping on instream flows, groundwater recharge and SW-GW interactions are being examined using a watershed-scale coupled SW-GW flow model. The U.S. Geological Survey (USGS) model GSFLOW (Markstrom et al., 2008), an integration of the USGS Precipitation-Runoff Modeling System (PRMS) and the Modular Ground-Water Flow Model (MODFLOW), is being utilized for this effort. Processes represented in this model include daily rain, snowfall, snowmelt, streamflow, surface runoff, interflow, infiltration, soil-zone evapotranspiration, and subsurface unsaturated and groundwater flow and evapotranspiration. The Upper Smith River watershed, an important agricultural and recreational area in west-central Montana, is being used as the basis for watershed climate, topography, hydrography, vegetation, soil properties as well as scenarios of irrigation and associated practices. The 640 square kilometer watershed area has been discretized into coincident 200 m by 200 m hydrologic response units (for climate and soil zone flow processes) and grid blocks (for unsaturated zone and GW flow processes). The subsurface GW system is discretized into 6 layers representing Quaternary alluvium, Tertiary sediments and bedrock. The model is being used to recreate natural, pre-development streamflows and GW conditions in the watershed. The results of this simulation are then compared to a simulation with flood and sprinkler irrigation supplied by SW diversion and GW pumping to examine the magnitude and timing of changes in streamflow, groundwater recharge and SW-GW interactions. Model results reproduce observed hydrologic responses to both natural climate variability and irrigation practices. Periodic irrigation creates increased evapotranspiration and GW recharge in cultivated areas of the watershed as well as SW-GW interactions that are more dynamic than under natural conditions.

  17. Comparative analysis of various real-time data assimilation approaches for assimilating streamflow into a hydrologic routing model

    Science.gov (United States)

    Noh, Seong Jin; Mazzoleni, Maurizio; Lee, Haksu; Liu, Yuqiong; Seo, Dong Jun; Solomatine, Dimitri

    2016-04-01

    Reliable water depth estimation is an extremely important issue in operational early flood warning systems. Different water system models have been implemented in recent decades and, in parallel, data assimilation approaches have been introduced in order to reduce the uncertainty of such models. The goal of this study is to compare the performance of a distributed hydrologic routing model with streamflow assimilation using six different data assimilation methods: direct insertion, nudging, the Kalman filter, the Ensemble Kalman filter, the Asynchronous Ensemble Kalman filter and a variational method. The model used in this study is a 3-parameter Muskingum model (O'Donnell, 1985), implemented for the Trinity River within the Dallas-Fort Worth Metroplex area in Texas, USA. The first methodological step was to discretize the river reach into multiple 1-km sub-reaches in order to estimate water depth in a distributed fashion. Then, the different data assimilation approaches were implemented using the state-space formulation of the Muskingum model proposed by Georgakakos (1990). Finally, streamflow observations were assimilated at two points where flow sensors are located. The results of this work point out that assimilation of streamflow observations can noticeably improve the hydrologic routing model prediction and that ensemble definition is particularly important for both the Ensemble Kalman filter and the Asynchronous Ensemble Kalman filter. This study is part of the FP7 European Project WeSenseIt Citizen Water Observatory (www.http://wesenseit.eu/) and the NSF Project Integrated Sensing and Prediction of urban Water for Sustainable Cities (http://ispuw.uta.edu/nsf)

  18. Validating Animal Models

    Directory of Open Access Journals (Sweden)

    Nina Atanasova

    2015-06-01

    In this paper, I respond to the challenge raised against contemporary experimental neurobiology according to which the field is in a state of crisis because of the multiplicity of experimental protocols employed in different laboratories and the strengthening of their reliability, which presumably preclude the validity of neurobiological knowledge. I provide an alternative account of experimentation in neurobiology which makes sense of its experimental practices. I argue that maintaining a multiplicity of experimental protocols and strengthening their reliability are well justified and that they foster rather than preclude the validity of neurobiological knowledge. Thus, their presence indicates the thriving rather than the crisis of experimental neurobiology.

  19. SWAT application in intensive irrigation systems: Model modification, calibration and validation

    Science.gov (United States)

    Dechmi, Farida; Burguete, Javier; Skhiri, Ahmed

    2012-11-01

    The Soil and Water Assessment Tool (SWAT) is a well-established, distributed, eco-hydrologic model. However, using the case study of an intensively irrigated agricultural watershed, it was shown that none of the model versions is able to appropriately reproduce the total streamflow in such a system when the irrigation source is outside the watershed. The objective of this study was to modify the SWAT2005 version to correctly simulate the main hydrological processes. Crop yield, total streamflow, total suspended sediment (TSS) losses and phosphorus load calibration and validation were performed using field survey information and water quantity and quality data recorded during the years 2008 and 2009 in the Del Reguero irrigated watershed in Spain. The goodness of the calibration and validation results was assessed using five statistical measures, including the Nash-Sutcliffe efficiency (NSE). Results indicated that the average annual crop yield and actual evapotranspiration estimates were quite satisfactory. On a monthly basis, the values of NSE were 0.90 (calibration) and 0.80 (validation), indicating that the modified model could reproduce the observed streamflow accurately. The TSS losses were also satisfactorily estimated (NSE = 0.72 and 0.52 for the calibration and validation steps). The monthly temporal patterns and all the statistical parameters indicated that the modified SWAT-IRRIG model adequately predicted the total phosphorus (TP) loading. Therefore, the model could be used to assess the impacts of different best management practices on nonpoint phosphorus losses in irrigated systems.

  20. Monthly hydrometeorological ensemble prediction of streamflow droughts and corresponding drought indices

    Science.gov (United States)

    Fundel, F.; Jörg-Hess, S.; Zappa, M.

    2013-01-01

    Streamflow droughts, characterized by low runoff as a consequence of a drought event, affect numerous aspects of life. Economic sectors that are impacted by low streamflow include power production, agriculture, tourism, water quality management and shipping. Those sectors could potentially benefit from forecasts of streamflow drought events, even of short events on monthly time scales or below. Numerical hydrometeorological models have increasingly been used to forecast low streamflow and have become the focus of recent research. Here, we consider daily ensemble runoff forecasts for the river Thur, which has its source in the Swiss Alps. We focus on the evaluation of low streamflow and of derived indices such as duration, severity and magnitude, characterizing streamflow droughts up to a lead time of one month. The ECMWF VarEPS 5-member ensemble reforecast, which covers 18 yr, is used as forcing for the hydrological model PREVAH. A thorough verification reveals that, compared to probabilistic peak-flow forecasts, which show skill up to a lead time of two weeks, forecasts of streamflow droughts are skilful over the entire forecast range of one month. For forecasts at the lower end of the runoff regime, the quality of the initial state seems to be crucial to achieving good forecast quality in the longer range. It is shown that the states used in this study to initialize the forecasts satisfy this requirement. The produced forecasts of streamflow drought indices, derived from the ensemble forecasts, could beneficially be included in a decision-making process. This is valid for probabilistic forecasts of streamflow drought events falling below a daily varying threshold, based on a quantile derived from a runoff climatology. Although the forecasts have a tendency to overpredict streamflow droughts, it is shown that the relative economic value of the ensemble forecasts reaches up to 60%, provided a forecast user is able to take preventive action based on the forecast.
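
    As an illustration of how such indices can be derived, the sketch below identifies streamflow drought events below a daily varying threshold (a quantile of the runoff climatology) and computes duration, severity and magnitude under one common set of definitions; the climatology and thresholds are synthetic.

```python
# Illustrative streamflow drought indices below a daily varying threshold
# (a quantile of the runoff climatology). Definitions used here: duration
# (days below threshold), severity (cumulative deficit), magnitude (mean
# deficit); details differ between studies.
import numpy as np

rng = np.random.default_rng(11)
years, doy = 18, 365
clim = 5 + 3 * np.sin(2 * np.pi * np.arange(doy) / doy)        # seasonal cycle
q = np.maximum(np.tile(clim, years) + rng.normal(0, 1.5, years * doy), 0.1)

# Daily varying threshold: the 20th percentile of flow for each calendar day.
threshold = np.percentile(q.reshape(years, doy), 20, axis=0)
thr_full = np.tile(threshold, years)
below = q < thr_full

# Group consecutive below-threshold days into drought events.
events, start = [], None
for t, flag in enumerate(np.append(below, False)):
    if flag and start is None:
        start = t
    elif not flag and start is not None:
        deficit = thr_full[start:t] - q[start:t]
        events.append({"duration": t - start,
                       "severity": deficit.sum(),      # cumulative deficit
                       "magnitude": deficit.mean()})   # mean deficit
        start = None

longest = max(events, key=lambda e: e["duration"])
print(f"{len(events)} drought events; the longest lasted {longest['duration']} days "
      f"with severity {longest['severity']:.1f} and magnitude {longest['magnitude']:.2f}")
```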

  1. Monthly hydrometeorological ensemble prediction of streamflow droughts and corresponding drought indices

    Directory of Open Access Journals (Sweden)

    F. Fundel

    2013-01-01

    Streamflow droughts, characterized by low runoff as a consequence of a drought event, affect numerous aspects of life. Economic sectors that are impacted by low streamflow include power production, agriculture, tourism, water quality management and shipping. Those sectors could potentially benefit from forecasts of streamflow drought events, even of short events on monthly time scales or below. Numerical hydrometeorological models have increasingly been used to forecast low streamflow and have become the focus of recent research. Here, we consider daily ensemble runoff forecasts for the river Thur, which has its source in the Swiss Alps. We focus on the evaluation of low streamflow and of derived indices such as duration, severity and magnitude, characterizing streamflow droughts up to a lead time of one month.

    The ECMWF VarEPS 5-member ensemble reforecast, which covers 18 yr, is used as forcing for the hydrological model PREVAH. A thorough verification reveals that, compared to probabilistic peak-flow forecasts, which show skill up to a lead time of two weeks, forecasts of streamflow droughts are skilful over the entire forecast range of one month. For forecasts at the lower end of the runoff regime, the quality of the initial state seems to be crucial to achieving good forecast quality at longer lead times. It is shown that the states used in this study to initialize forecasts satisfy this requirement. The produced forecasts of streamflow drought indices, derived from the ensemble forecasts, could be beneficially included in a decision-making process. This is valid for probabilistic forecasts of streamflow drought events falling below a daily varying threshold, based on a quantile derived from a runoff climatology. Although the forecasts have a tendency to overpredict streamflow droughts, it is shown that the relative economic value of the ensemble forecasts reaches up to 60%, in case a forecast user is able to take preventive action based on the forecast.

  2. StreamFlow 1.0: an extension to the spatially distributed snow model Alpine3D for hydrological modelling and deterministic stream temperature prediction

    Science.gov (United States)

    Gallice, Aurélien; Bavay, Mathias; Brauchli, Tristan; Comola, Francesco; Lehning, Michael; Huwald, Hendrik

    2016-12-01

    Climate change is expected to strongly impact the hydrological and thermal regimes of Alpine rivers within the coming decades. In this context, the development of hydrological models accounting for the specific dynamics of Alpine catchments appears as one of the promising approaches to reduce our uncertainty about future mountain hydrology. This paper describes the improvements brought to StreamFlow, an existing model for hydrological and stream temperature prediction built as an external extension to the physically based snow model Alpine3D. StreamFlow's source code has been entirely written anew, taking advantage of object-oriented programming to significantly improve its structure and ease the implementation of future developments. The source code is now publicly available online, along with a complete documentation. A special emphasis has been put on modularity during the re-implementation of StreamFlow, so that many model aspects can be represented using different alternatives. For example, several options are now available to model the advection of water within the stream. This allows for an easy and fast comparison between different approaches and helps in defining more reliable uncertainty estimates of the model forecasts. In particular, a case study in a Swiss Alpine catchment reveals that the stream temperature predictions are particularly sensitive to the approach used to model the temperature of subsurface flow, a fact which has been poorly reported in the literature to date. Based on the case study, StreamFlow is shown to reproduce hourly mean discharge with a Nash-Sutcliffe efficiency (NSE) of 0.82 and hourly mean temperature with an NSE of 0.78.

  3. Base Flow Model Validation Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovation is the systematic "building-block" validation of CFD/turbulence models employing a GUI driven CFD code (RPFM) and existing as well as new data sets to...

  4. Validating Dart Model

    Directory of Open Access Journals (Sweden)

    Mazur Jolanta

    2014-12-01

    Full Text Available The primary objective of the study was to quantitatively test the DART model, which, despite being one of the most popular representations of the co-creation concept, had so far been studied almost solely with qualitative methods. To this end, the researchers developed a multiple measurement scale and employed it in interviewing managers. The statistical evidence for the adequacy of the model was obtained through CFA with AMOS software. The findings suggest that the DART model may not be an accurate representation of co-creation practices in companies. From the data analysis it was evident that the building blocks of DART had too much conceptual overlap to be an effective framework for quantitative analysis. It was also implied that the phenomenon of co-creation is so rich and multifaceted that it may be more adequately captured by a measurement model where co-creation is conceived as a third-level factor with two layers of intermediate latent variables.

  5. Quantifying Streamflow Variations in Ungauged Lake Basins by Integrating Remote Sensing and Water Balance Modelling: A Case Study of the Erdos Larus relictus National Nature Reserve, China

    Directory of Open Access Journals (Sweden)

    Kang Liang

    2017-06-01

    Full Text Available Hydrological predictions in ungauged lakes are one of the most important issues in hydrological sciences. The habitat of the Relict Gull (Larus relictus) in the Erdos Larus relictus National Nature Reserve (ELRNNR) has been seriously endangered by lake shrinkage, yet the hydrological processes in the catchment are poorly understood due to the lack of in-situ observations. Therefore, it is necessary to assess the variation in lake streamflow and its drivers. In this study, we employed the remote sensing technique and empirical equation to quantify the time series of lake water budgets, and integrated a water balance model and climate elasticity method to further examine ELRNNR basin streamflow variations from 1974 to 2013. The results show that lake variations went through three phases with significant differences: the rapidly expanding sub-period (1974–1979), the relatively stable sub-period (1980–1999), and the dramatically shrinking sub-period (2000–2013). Both climate variation (expressed by precipitation and evapotranspiration) and human activities were quantified as drivers of streamflow variation, and the driving forces in the three phases had different contributions. As human activities gradually intensified, the contribution of human disturbances to streamflow variation clearly increased, accounting for 22.3% during 1980–1999 and up to 59.2% during 2000–2013. Intensified human interferences and climate warming have jointly led to the lake shrinkage since 1999. This study provides a useful reference to quantify lake streamflow and its drivers in ungauged basins.
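
    The climate-elasticity attribution referred to above separates an observed change in streamflow into a climate-driven part and a residual attributed to human activities. A minimal sketch of that arithmetic is given below; the elasticities and the sub-period means are hypothetical, not values from the ELRNNR study.

```python
# Sketch of a climate-elasticity attribution of streamflow change. All numbers
# are hypothetical; in practice the elasticities would be estimated from the
# data or from a Budyko-type framework.
Q_pre, Q_post = 12.0, 8.5        # mean streamflow in the two sub-periods (10^6 m3/yr)
P_pre, P_post = 320.0, 300.0     # mean annual precipitation (mm)
E0_pre, E0_post = 900.0, 960.0   # mean annual potential evapotranspiration (mm)
eps_P, eps_E0 = 2.0, -1.0        # assumed precipitation and PET elasticities of streamflow

dQ_obs = Q_post - Q_pre
dQ_climate = (eps_P * (P_post - P_pre) / P_pre
              + eps_E0 * (E0_post - E0_pre) / E0_pre) * Q_pre
dQ_human = dQ_obs - dQ_climate   # residual attributed to human activities

print(f"climate contribution: {100 * dQ_climate / dQ_obs:.1f} %")
print(f"human contribution:   {100 * dQ_human / dQ_obs:.1f} %")
```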

  6. Global-scale high-resolution (~1 km) modelling of mean, maximum and minimum annual streamflow

    Science.gov (United States)

    Barbarossa, Valerio; Huijbregts, Mark; Hendriks, Jan; Beusen, Arthur; Clavreul, Julie; King, Henry; Schipper, Aafke

    2017-04-01

    Quantifying mean, maximum and minimum annual flow (AF) of rivers at ungauged sites is essential for a number of applications, including assessments of global water supply, ecosystem integrity and water footprints. AF metrics can be quantified with spatially explicit process-based models, which might be overly time-consuming and data-intensive for this purpose, or with empirical regression models that predict AF metrics based on climate and catchment characteristics. Yet, so far, regression models have mostly been developed at a regional scale and the extent to which they can be extrapolated to other regions is not known. We developed global-scale regression models that quantify mean, maximum and minimum AF as a function of catchment area and catchment-averaged slope, elevation, and mean, maximum and minimum annual precipitation and air temperature. We then used these models to obtain global 30 arc-seconds (˜ 1 km) maps of mean, maximum and minimum AF for each year from 1960 through 2015, based on a newly developed hydrologically conditioned digital elevation model. We calibrated our regression models based on observations of discharge and catchment characteristics from about 4,000 catchments worldwide, ranging from 10^0 to 10^6 km2 in size, and validated them against independent measurements as well as the output of a number of process-based global hydrological models (GHMs). The variance explained by our regression models ranged up to 90% and the performance of the models compared well with the performance of existing GHMs. Yet, our AF maps provide a level of spatial detail that cannot yet be achieved by current GHMs.
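
    Empirical regressions of this type are typically fitted as power laws, i.e., as ordinary least squares in log space. A sketch with synthetic data (not the ~4,000-catchment set used in the study) is shown below for a mean-AF model with two predictors.

```python
import numpy as np

# Log-log (power-law) regression of mean annual flow against catchment
# descriptors, fitted by ordinary least squares. Data are synthetic.
rng = np.random.default_rng(42)
n = 500
area = 10 ** rng.uniform(0, 6, n)            # catchment area (km2)
precip = rng.uniform(300, 2500, n)           # mean annual precipitation (mm/yr)
true_maf = 1e-4 * area ** 0.95 * precip ** 1.3
maf = true_maf * rng.lognormal(0, 0.3, n)    # multiplicative noise

# log(MAF) = b0 + b1*log(area) + b2*log(P)
X = np.column_stack([np.ones(n), np.log(area), np.log(precip)])
y = np.log(maf)
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
pred = X @ coef
r2 = 1 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2)
print("coefficients (intercept, area, precip):", np.round(coef, 3))
print(f"explained variance in log space: {r2:.2f}")
```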

  7. Seasonal forecasts of the SINTEX-F coupled model applied to maize yield and streamflow estimates over north-eastern South Africa

    CSIR Research Space (South Africa)

    Malherbe, J

    2014-07-01

    Full Text Available Forecasts of a Global Coupled Model for austral summer with a 1 month lead are downscaled to end-of-season maize yields and accumulated streamflow over the Limpopo Province and adjacent districts in northeastern South Africa through application...

  8. Improving daily streamflow forecasts in mountainous Upper Euphrates basin by multi-layer perceptron model with satellite snow products

    Science.gov (United States)

    Uysal, Gökçen; Şensoy, Aynur; Şorman, A. Arda

    2016-12-01

    This paper investigates the contribution of the Moderate Resolution Imaging Spectroradiometer (MODIS) satellite Snow Cover Area (SCA) product and in-situ snow depth measurements to Artificial Neural Network (ANN) based daily streamflow forecasting in a mountainous river basin. In order to represent the non-linear structure of the snowmelt process, a Multi-Layer Perceptron (MLP) Feed-Forward Backpropagation (FFBP) architecture is developed and applied in the Upper Euphrates River Basin (10,275 km2) of Turkey, where snowmelt constitutes approximately 2/3 of the total annual volume of runoff during spring and early summer months. The snowmelt season is evaluated between March and July; 7 years (2002-2008) of seasonal daily data are used during training while 3 years (2009-2011) of seasonal daily data are kept for forecasting. One of the fastest ANN training algorithms, the Levenberg-Marquardt, is used for optimization of the network weights and biases. The consistency of the network is checked with four performance criteria: coefficient of determination (R2), Nash-Sutcliffe model efficiency (ME), root mean square error (RMSE) and mean absolute error (MAE). According to the results, SCA observations provide useful information for developing a neural network model to predict snowmelt runoff, whereas snow depth data alone are not sufficient. The highest performance is obtained when total daily precipitation and average air temperature data are combined with satellite snow cover data. The data preprocessing technique of Discrete Wavelet Analysis (DWA) is coupled with MLP modeling to further improve the runoff peak estimates. As a result, the Nash-Sutcliffe model efficiency is increased from 0.52 to 0.81 for training and from 0.51 to 0.75 for forecasting. Moreover, the results are compared with those of a conceptual model, the Snowmelt Runoff Model (SRM), applied using SCA as an input. The importance and the main contribution of this study is the use of satellite snow products and data
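
    As a minimal sketch of such an MLP setup, the example below trains a small feed-forward network on synthetic precipitation, temperature and SCA inputs using scikit-learn. Note that scikit-learn does not implement the Levenberg-Marquardt algorithm used in the study, so L-BFGS is substituted here; all data and hyperparameters are illustrative assumptions.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.metrics import r2_score

# Synthetic daily predictors: precipitation (mm), air temperature (degC) and
# snow-cover area fraction (SCA); the target flow is a toy function of these
# inputs, standing in for observed runoff.
rng = np.random.default_rng(1)
n = 1500
precip = rng.gamma(2.0, 2.0, n)
temp = rng.normal(5, 8, n)
sca = np.clip(rng.normal(0.5, 0.3, n), 0, 1)
flow = 20 + 3 * precip + 2 * np.maximum(temp, 0) * sca + rng.normal(0, 3, n)

X, y = np.column_stack([precip, temp, sca]), flow
split = int(0.7 * n)                 # first 70% for training, remainder for forecasting
mlp = make_pipeline(StandardScaler(),
                    MLPRegressor(hidden_layer_sizes=(10,), solver="lbfgs",
                                 max_iter=5000, random_state=0))
mlp.fit(X[:split], y[:split])
pred = mlp.predict(X[split:])
rmse = np.sqrt(np.mean((y[split:] - pred) ** 2))
print(f"R2 = {r2_score(y[split:], pred):.2f}   RMSE = {rmse:.2f}")
```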

  9. Streamflow Forecasting using Satellite Products: A Benchmark Approach. Can We Reduce Uncertainty by using Multiple Products and Multiple Models?

    Science.gov (United States)

    Roy, T.; Serrat-Capdevila, A.; Gupta, H.; Valdes, J. B.

    2015-12-01

    Real-time satellite precipitation products can be used to drive hydrologic forecasts in downstream areas of poorly gauged basins. We present an improved approach to hydrologic modeling using satellite precipitation estimates to reduce uncertainty, consisting of: (1) bias-correction of satellite products, (2) re-calibration of hydrologic models using bias-corrected estimates, (3) bias-correction of streamflow outputs, and (4) plotting of uncertainty intervals. In addition, we evaluate the benefits of multi-product and multi-model forecasts using four satellite precipitation products (CHIRPS, CMORPH, TMPA, and PERSIANN-CCS) to drive two hydrologic models (HYMOD and HBV-EDU), generating eight forecasts from different model-product combinations following the approach described above. These probabilistic forecasts are then merged in an attempt to produce an improved forecast with higher accuracy and smaller uncertainty. These methods are applied in the Mara Basin in Kenya, which faces serious water sustainability challenges, in an effort to support water management decisions balancing human and environmental needs, as part of the NASA SERVIR Applied Sciences Team.
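
    Bias correction of satellite precipitation, the first step listed above, is often done by quantile mapping against a gauge reference; the record does not specify the exact correction scheme used, so the sketch below shows a generic empirical quantile-mapping routine on synthetic data.

```python
import numpy as np

def quantile_map(satellite, gauge, to_correct, n_quantiles=100):
    """Empirical quantile mapping: map satellite values onto the gauge
    distribution using matched quantiles from a common calibration period."""
    q = np.linspace(0, 1, n_quantiles)
    sat_q = np.quantile(satellite, q)
    gauge_q = np.quantile(gauge, q)
    return np.interp(to_correct, sat_q, gauge_q)

# Hypothetical daily precipitation (mm): the "satellite" product is biased
# relative to the gauge reference.
rng = np.random.default_rng(3)
gauge = rng.gamma(0.6, 8.0, 3000)
satellite = gauge * 0.8 + rng.gamma(0.3, 2.0, 3000)
corrected = quantile_map(satellite, gauge, satellite)
print(f"mean bias before: {satellite.mean() - gauge.mean():+.2f} mm/day")
print(f"mean bias after:  {corrected.mean() - gauge.mean():+.2f} mm/day")
```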

  10. Validation process of simulation model

    International Nuclear Information System (INIS)

    San Isidro, M. J.

    1998-01-01

    A methodology for the empirical validation of any detailed simulation model is presented. This kind of validation is always related to an experimental case. Empirical validation has a residual sense, because the conclusions are based on comparisons between simulated outputs and experimental measurements. The methodology guides the detection of failures of the simulation model and can also be used as a guide in the design of subsequent experiments. Three steps can be clearly differentiated: sensitivity analysis, which can be made with a DSA (differential sensitivity analysis) and with an MCSA (Monte-Carlo sensitivity analysis); the search for the optimal domains of the input parameters, for which a procedure based on Monte-Carlo methods and cluster techniques has been developed; and residual analysis, carried out in the time domain and in the frequency domain using correlation analysis and spectral analysis. As an application of this methodology, the validation carried out on a thermal simulation model of buildings in Spain is presented, studying the behavior of building components in a Test Cell of LECE of CIEMAT. (Author) 17 refs

  11. Statistical validation of stochastic models

    Energy Technology Data Exchange (ETDEWEB)

    Hunter, N.F. [Los Alamos National Lab., NM (United States). Engineering Science and Analysis Div.; Barney, P.; Paez, T.L. [Sandia National Labs., Albuquerque, NM (United States). Experimental Structural Dynamics Dept.; Ferregut, C.; Perez, L. [Univ. of Texas, El Paso, TX (United States). Dept. of Civil Engineering

    1996-12-31

    It is common practice in structural dynamics to develop mathematical models for system behavior, and the authors are now capable of developing stochastic models, i.e., models whose parameters are random variables. Such models have random characteristics that are meant to simulate the randomness in characteristics of experimentally observed systems. This paper suggests a formal statistical procedure for the validation of mathematical models of stochastic systems when data taken during operation of the stochastic system are available. The statistical characteristics of the experimental system are obtained using the bootstrap, a technique for the statistical analysis of non-Gaussian data. The authors propose a procedure to determine whether or not a mathematical model is an acceptable model of a stochastic system with regard to user-specified measures of system behavior. A numerical example is presented to demonstrate the application of the technique.
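
    The bootstrap step described above can be illustrated with a few lines of NumPy: a statistic of interest is recomputed on resamples drawn with replacement to characterize its sampling distribution. The data and the chosen statistic below are synthetic placeholders, not the structural-dynamics measurements of the study.

```python
import numpy as np

# Minimal bootstrap sketch: estimate a confidence interval for a response
# statistic (here the 95th percentile of a non-Gaussian sample) by resampling
# with replacement.
rng = np.random.default_rng(7)
response = rng.lognormal(mean=1.0, sigma=0.5, size=200)

n_boot = 5000
stats = np.array([
    np.percentile(rng.choice(response, size=response.size, replace=True), 95)
    for _ in range(n_boot)
])
lo, hi = np.percentile(stats, [2.5, 97.5])
print(f"95th-percentile response: {np.percentile(response, 95):.2f} "
      f"(bootstrap 95% CI: {lo:.2f} - {hi:.2f})")
```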

  12. Verification and validation of models

    International Nuclear Information System (INIS)

    Herbert, A.W.; Hodgkinson, D.P.; Jackson, C.P.; Lever, D.A.; Robinson, P.C.

    1986-12-01

    The numerical accuracy of the computer models for groundwater flow and radionuclide transport that are to be used in repository safety assessment must be tested, and their ability to describe experimental data assessed: they must be verified and validated respectively. Also appropriate ways to use the codes in performance assessments, taking into account uncertainties in present data and future conditions, must be studied. These objectives are being met by participation in international exercises, by developing bench-mark problems, and by analysing experiments. In particular the project has funded participation in the HYDROCOIN project for groundwater flow models, the Natural Analogues Working Group, and the INTRAVAL project for geosphere models. (author)

  13. Skilful seasonal forecasts of streamflow over Europe?

    Science.gov (United States)

    Arnal, Louise; Cloke, Hannah L.; Stephens, Elisabeth; Wetterhall, Fredrik; Prudhomme, Christel; Neumann, Jessica; Krzeminski, Blazej; Pappenberger, Florian

    2018-04-01

    This paper considers whether there is any added value in using seasonal climate forecasts instead of historical meteorological observations for forecasting streamflow on seasonal timescales over Europe. A Europe-wide analysis of the skill of the newly operational EFAS (European Flood Awareness System) seasonal streamflow forecasts (produced by forcing the Lisflood model with the ECMWF System 4 seasonal climate forecasts), benchmarked against the ensemble streamflow prediction (ESP) forecasting approach (produced by forcing the Lisflood model with historical meteorological observations), is undertaken. The results suggest that, on average, the System 4 seasonal climate forecasts improve the streamflow predictability over historical meteorological observations for the first month of lead time only (in terms of hindcast accuracy, sharpness and overall performance). However, the predictability varies in space and time and is greater in winter and autumn. Parts of Europe additionally exhibit a longer predictability, up to 7 months of lead time, for certain months within a season. In terms of hindcast reliability, the EFAS seasonal streamflow hindcasts are on average less skilful than the ESP for all lead times. The results also highlight the potential usefulness of the EFAS seasonal streamflow forecasts for decision-making (measured in terms of the hindcast discrimination for the lower and upper terciles of the simulated streamflow). Although the ESP is the most potentially useful forecasting approach in Europe, the EFAS seasonal streamflow forecasts appear more potentially useful than the ESP in some regions and for certain seasons, especially in winter for almost 40 % of Europe. Patterns in the EFAS seasonal streamflow hindcast skill are however not mirrored in the System 4 seasonal climate hindcasts, hinting at the need for a better understanding of the link between hydrological and meteorological variables on seasonal timescales, with the aim of improving climate-model

  14. Long-range forecasting of intermittent streamflow

    OpenAIRE

    F. F. van Ogtrop; R. W. Vervoort; G. Z. Heller; D. M. Stasinopoulos; R. A. Rigby

    2011-01-01

    Long-range forecasting of intermittent streamflow in semi-arid Australia poses a number of major challenges. One of the challenges relates to modelling zero, skewed, non-stationary, and non-linear data. To address this, a statistical model to forecast streamflow up to 12 months ahead is applied to five semi-arid catchments in South Western Queensland. The model uses logistic regression through Generalised Additive Models for Location, Scale and Shape (GAMLSS) to determine th...

  15. Long-range forecasting of intermittent streamflow

    OpenAIRE

    F. F. van Ogtrop; R. W. Vervoort; G. Z. Heller; D. M. Stasinopoulos; R. A. Rigby

    2011-01-01

    Long-range forecasting of intermittent streamflow in semi-arid Australia poses a number of major challenges. One of the challenges relates to modelling zero, skewed, non-stationary, and non-linear data. To address this, a probabilistic statistical model to forecast streamflow 12 months ahead is applied to five semi-arid catchments in South Western Queensland. The model uses logistic regression through Generalised Additive Models for Location, Scale and Shape (GAMLSS) to determine the probabil...

  16. Use of a precipitation-runoff model for simulating effects of forest management on streamflow in 11 small drainage basins, Oregon Coast Range

    Science.gov (United States)

    Risley, J.C.

    1994-01-01

    The Precipitation-Runoff Modeling System (PRMS) model of the U.S. Geological Survey was used to simulate the hydrologic effects of timber management in 11 small, upland drainage basins of the Coast Range in Oregon. The coefficients of determination for observed and simulated daily flow during the calibration periods ranged from 0.92 for the Flynn Creek Basin to 0.68 for the Priorli Creek Basin; percent error ranged from -0.25 for the Deer Creek Basin to -4.49 for the Nestucca River Basin. The coefficients of determination during the validation periods ranged from 0.90 for the Flynn Creek Basin to 0.66 for the Wind River Basin; percent error during the validation periods ranged from -0.91 for the Flynn Creek Basin to 22.3 for the Priorli Creek Basin. In addition to daily simulations, 42 storms were selected from the time-series periods in which the 11 basins were studied and used in hourly storm-mode simulations. Sources of simulation error included the quality of the input data, deficiencies in the PRMS model algorithms, and the quality of parameter estimation. Time-series data from the Flynn Creek and Needle Branch Basins, collected during an earlier U.S. Geological Survey paired-watershed study, were used to evaluate the PRMS as a tool for predicting the hydrologic effects of timber-management practices. The Flynn Creek Basin remained forested and undisturbed during the data-collection period, while the Needle Branch Basin was 82 percent clearcut at a midpoint of the data-collection period. Using the PRMS, streamflow at the Needle Branch Basin was simulated during the postlogging period using prelogging parameter values. Comparison of postlogging observed streamflow with the simulated data showed an increase in annual discharge volume of approximately 8 percent and a small increase in peak flows of 1 to 2 percent. The simulated flows from the basins studied were generally insensitive to the number of hydrologic-response units used to replicate

  17. PEMFC modeling and experimental validation

    Energy Technology Data Exchange (ETDEWEB)

    Vargas, J.V.C. [Federal University of Parana (UFPR), Curitiba, PR (Brazil). Dept. of Mechanical Engineering], E-mail: jvargas@demec.ufpr.br; Ordonez, J.C.; Martins, L.S. [Florida State University, Tallahassee, FL (United States). Center for Advanced Power Systems], Emails: ordonez@caps.fsu.edu, martins@caps.fsu.edu

    2009-07-01

    In this paper, a simplified and comprehensive PEMFC mathematical model introduced in previous studies is experimentally validated. Numerical results are obtained for an existing set of commercial unit PEM fuel cells. The model accounts for pressure drops in the gas channels, and for temperature gradients with respect to space in the flow direction, that are investigated by direct infrared imaging, showing that even at low current operation such gradients are present in fuel cell operation, and therefore should be considered by a PEMFC model, since large coolant flow rates are limited due to induced high pressure drops in the cooling channels. The computed polarization and power curves are directly compared to the experimentally measured ones with good qualitative and quantitative agreement. The combination of accuracy and low computational time allow for the future utilization of the model as a reliable tool for PEMFC simulation, control, design and optimization purposes. (author)

  18. Measuring and modeling spatio-temporal patterns of groundwater storage dynamics to better understand nonlinear streamflow response

    Science.gov (United States)

    Rinderer, Michael; van Meerveld, Ilja; McGlynn, Brian

    2017-04-01

    Information about the spatial and temporal variability in catchment scale groundwater storage is needed to identify runoff source area dynamics and better understand variability in streamflow. However, information on groundwater levels is typically only available at a limited number of monitoring sites and interpolation or upscaling is necessary to obtain information on catchment scale groundwater dynamics. Here we used data from 51 spatially distributed groundwater monitoring sites in a Swiss pre-alpine catchment and time series clustering to define six groundwater response clusters. Each of the clusters was distinct in terms of the groundwater rise and recession but also had distinctly different topographic site characteristics, which allowed us to assign a groundwater response cluster to all non-monitored locations. Each of them was then assigned the mean groundwater response of the monitored cluster members. A site was considered active (i.e., enabling lateral subsurface flow) when the groundwater levels rose above the groundwater response threshold which was defined based on the depth of the more transmissive soil layers (typically between 10 cm and 30 cm below the soil surface). This allowed us to create maps of the active areas across the catchment at 15 min time intervals. The mean fraction of agreement between modeled groundwater activation (based on the mean cluster member time series) and measured groundwater activation (based on the measured groundwater level time series at a monitoring site) was 0.91 (25th percentile: 0.88, median: 0.92, 75th percentile: 0.95). The fraction of agreement dropped by 10 to 15 % at the beginning of events but was never lower than 0.4. Connectivity between all active areas and the stream network was determined using a graph theory approach. During rainfall events, the simulated active and connected area extended mainly laterally and longitudinally along the channel network, which is in agreement with the variable source
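
    A minimal sketch of the clustering step described above is given below: groundwater-level series are z-normalized per site and grouped with k-means. The series, feature choice and cluster count are assumptions for illustration; the study's actual procedure (including the topographic assignment of non-monitored sites) is richer than this.

```python
import numpy as np
from sklearn.cluster import KMeans

# Synthetic groundwater-level series for 51 sites, built from six underlying
# response shapes plus noise, standing in for the monitored records.
rng = np.random.default_rng(11)
n_sites, n_steps, n_shapes = 51, 400, 6
shapes = [np.cumsum(rng.normal(0, 1, n_steps)) for _ in range(n_shapes)]
series = np.array([shapes[rng.integers(n_shapes)] + rng.normal(0, 0.5, n_steps)
                   for _ in range(n_sites)])

# z-normalize each site's series so clustering reflects response shape, not magnitude.
X = (series - series.mean(axis=1, keepdims=True)) / series.std(axis=1, keepdims=True)
labels = KMeans(n_clusters=6, n_init=10, random_state=0).fit_predict(X)
print("sites per cluster:", np.bincount(labels))
```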

  19. Simulation of Streamflow in a Discontinuous Permafrost Environment Using a Modified First-order, Nonlinear Rainfall-runoff Model

    Science.gov (United States)

    Bolton, W. R.; Hinzman, L. D.

    2009-12-01

    The sub-arctic environment can be characterized by being located in the zone of discontinuous permafrost. Although the distribution of permafrost in this region is discontinuous, it dominates the response of many of the hydrologic processes including stream flow, soil moisture dynamics, and water storage processes. In areas underlain by permafrost, ice-rich conditions at the permafrost table inhibit surface water percolation to the deep subsurface soils, resulting in increased runoff generation during precipitation events, decreased baseflow between precipitation events, and relatively wetter soils compared to permafrost-free areas. Over the course of a summer season, the thawing of the active layer (the thin soil layer above the permafrost that seasonally freezes and thaws) increases the potential water holding capacity of the soil, resulting in a decreasing surface water contribution during precipitation events and a steadily increasing baseflow contribution between precipitation events. Simulation of stream flow in this region is challenging due to the rapidly changing thermal (permafrost versus non-permafrost, active layer development) and hydraulic (hydraulic conductivity and soil storage capacity) conditions in both time and space (x, y, and z-dimensions). Many of the factors that have a control on both permafrost distribution and the thawing/freezing of the active layer (such as soil material, soil moisture, and ice content) are not easily quantified at scales beyond the point measurement. In this study, these issues are addressed through streamflow analysis - the only hydrologic process that is easily measured at the basin scale. Following the general procedure outlined in Kirchner (2008), a simple rainfall-runoff model was applied to three small head-water basins of varying permafrost coverage. A simple, first-order, non-linear differential equation that describes the storage-discharge relationship was derived from three years of stream flow data
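
    The storage-discharge formulation described above can be sketched as a single nonlinear reservoir, dS/dt = P - ET - Q with Q = k S^b, stepped forward in time. The parameters and forcing below are hypothetical and are not the values inferred for the study basins.

```python
import numpy as np

def simulate(precip, et, k=0.05, b=1.5, s0=10.0, dt=1.0):
    """First-order nonlinear storage-discharge model: dS/dt = P - ET - Q, Q = k*S**b."""
    storage, q = s0, []
    for p, e in zip(precip, et):
        discharge = k * max(storage, 0.0) ** b
        storage += (p - e - discharge) * dt
        storage = max(storage, 0.0)      # no negative storage
        q.append(discharge)
    return np.array(q)

# Hypothetical summer-season forcing (mm/day).
rng = np.random.default_rng(5)
precip = rng.gamma(0.5, 4.0, 180)
et = np.full(180, 1.5)
q = simulate(precip, et)
print(f"mean simulated discharge: {q.mean():.2f} mm/day")
```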

  20. Streamflow alteration at selected sites in Kansas

    Science.gov (United States)

    Juracek, Kyle E.; Eng, Ken

    2017-06-26

    An understanding of streamflow alteration in response to various disturbances is necessary for the effective management of stream habitat for a variety of species in Kansas. Streamflow alteration can have negative ecological effects. Using a modeling approach, streamflow alteration was assessed for 129 selected U.S. Geological Survey streamgages in the State for which requisite streamflow and basin-characteristic information was available. The assessment involved a comparison of the observed condition from 1980 to 2015 with the predicted expected (least-disturbed) condition for 29 streamflow metrics. The metrics represent various characteristics of streamflow including average flow (annual, monthly) and low and high flow (frequency, duration, magnitude). Streamflow alteration in Kansas was indicated locally, regionally, and statewide. Given the absence of a pronounced trend in annual precipitation in Kansas, a precipitation-related explanation for streamflow alteration was not supported. Thus, the likely explanation for streamflow alteration was human activity. Locally, a flashier flow regime (typified by shorter lag times and more frequent and higher peak discharges) was indicated for three streamgages with urbanized basins that had higher percentages of impervious surfaces than other basins in the State. The combination of localized reservoir effects and regional groundwater pumping from the High Plains aquifer likely was responsible, in part, for diminished conditions indicated for multiple streamflow metrics in western and central Kansas. Statewide, the implementation of agricultural land-management practices to reduce runoff may have been responsible, in part, for a diminished duration and magnitude of high flows. In central and eastern Kansas, implemented agricultural land-management practices may have been partly responsible for an inflated magnitude of low flows at several sites.

  1. Disentangling the response of streamflow to forest management and climate

    Science.gov (United States)

    Dymond, S.; Miniat, C.; Bladon, K. D.; Keppeler, E.; Caldwell, P. V.

    2016-12-01

    Paired watershed studies have showcased the relationships between forests, management, and streamflow. However, classical analyses of paired-watershed studies have done little to disentangle the effects of management from overarching climatic signals, potentially masking the interaction between management and climate. Such approaches may confound our understanding of how forest management impacts streamflow. Here we use a 50-year record of streamflow and climate data from the Caspar Creek Experimental Watersheds (CCEW), California, USA to separate the effects of forest management and climate on streamflow. CCEW has two treatment watersheds that have been harvested in the past 50 years. We used a nonlinear mixed model to combine the pre-treatment relationship between streamflow and climate and the post-treatment relationship via an interaction between climate and management into one equation. Our results show that precipitation and potential evapotranspiration alone can account for >95% of the variability in pre-treatment streamflow. Including management scenarios into the model explained most of the variability in streamflow (R2 > 0.98). While forest harvesting altered streamflow in both of our modeled watersheds, removing 66% of the vegetation via selection logging using a tractor yarding system over the entire watershed had a more substantial impact on streamflow than clearcutting small portions of a watershed using cable-yarding. These results suggest that forest harvesting may result in differing impacts on streamflow and highlights the need to incorporate climate into streamflow analyses of paired-watershed studies.

  2. Streamflow loss quantification for groundwater flow modeling using a wading-rod-mounted acoustic Doppler current profiler in a headwater stream

    Science.gov (United States)

    Pflügl, Christian; Hoehn, Philipp; Hofmann, Thilo

    2017-04-01

    Irrespective of the availability of various field measurement and modeling approaches, the quantification of interactions between surface water and groundwater systems remains associated with high uncertainty. Such uncertainties in stream-aquifer interaction can lead to significant misinterpretation of the local water budget and water quality. Because stream discharge rates typically vary considerably over time, it is desirable to reduce the duration of streamflow measurements while also reducing their uncertainty. Streamflow measurements, according to the velocity-area method, have been performed along reaches of a losing-disconnected, subalpine headwater stream using a 2-dimensional, wading-rod-mounted acoustic Doppler current profiler (ADCP). The method was chosen, since the stream morphology does not allow for boat-mounted setups, to reduce uncertainty compared to conventional, single-point streamflow measurements of similar measurement duration. Reach-averaged stream loss rates were subsequently quantified between 12 cross sections. They enabled the delineation of strongly infiltrating stream reaches and their differentiation from insignificantly infiltrating reaches. Furthermore, a total of 10 near-stream observation wells were constructed and/or equipped with pressure and temperature loggers. The time series of near-stream groundwater temperature data were cross-correlated with stream temperature time series to yield supportive qualitative information on the delineation of infiltrating reaches. Subsequently, as a reference parameterization, the hydraulic conductivity and specific yield of a numerical, steady-state model of groundwater flow, in the unconfined glaciofluvial aquifer adjacent to the stream, were inversely determined incorporating the inferred stream loss rates. Applying synthetic sets of infiltration rates, resembling increasing levels of uncertainty associated with single-point streamflow measurements of comparable duration, the
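
    The velocity-area method mentioned above computes discharge as the sum of sub-section areas multiplied by their depth-averaged velocities. The mid-section variant is sketched below with a hypothetical cross section; an actual ADCP survey provides many more verticals and velocity bins.

```python
import numpy as np

# Hypothetical cross section: distance from the bank, depth and
# depth-averaged velocity at each measurement vertical.
station = np.array([0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0])        # m
depth = np.array([0.0, 0.25, 0.40, 0.55, 0.45, 0.30, 0.0])     # m
velocity = np.array([0.0, 0.20, 0.35, 0.45, 0.38, 0.22, 0.0])  # m/s

# Mid-section method: each vertical represents a strip reaching halfway
# to its neighbours; edge strips take half the adjacent spacing.
widths = np.empty_like(station)
widths[1:-1] = (station[2:] - station[:-2]) / 2.0
widths[0] = (station[1] - station[0]) / 2.0
widths[-1] = (station[-1] - station[-2]) / 2.0

discharge = np.sum(depth * velocity * widths)
print(f"discharge = {discharge:.3f} m3/s")
```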

  3. Can We Use Regression Modeling to Quantify Mean Annual Streamflow at a Global-Scale?

    Science.gov (United States)

    Barbarossa, V.; Huijbregts, M. A. J.; Hendriks, J. A.; Beusen, A.; Clavreul, J.; King, H.; Schipper, A.

    2016-12-01

    Quantifying mean annual flow of rivers (MAF) at ungauged sites is essential for a number of applications, including assessments of global water supply, ecosystem integrity and water footprints. MAF can be quantified with spatially explicit process-based models, which might be overly time-consuming and data-intensive for this purpose, or with empirical regression models that predict MAF based on climate and catchment characteristics. Yet, regression models have mostly been developed at a regional scale and the extent to which they can be extrapolated to other regions is not known. In this study, we developed a global-scale regression model for MAF using observations of discharge and catchment characteristics from 1,885 catchments worldwide, ranging from 2 to 10^6 km2 in size. In addition, we compared the performance of the regression model with the predictive ability of the spatially explicit global hydrological model PCR-GLOBWB [van Beek et al., 2011] by comparing results from both models to independent measurements. We obtained a regression model explaining 89% of the variance in MAF based on catchment area, mean annual precipitation and air temperature, average slope and elevation. The regression model performed better than PCR-GLOBWB for the prediction of MAF, as root-mean-square error values were lower (0.29 - 0.38 compared to 0.49 - 0.57) and the modified index of agreement was higher (0.80 - 0.83 compared to 0.72 - 0.75). Our regression model can be applied globally at any point of the river network, provided that the input parameters are within the range of values employed in the calibration of the model. The performance is reduced for water scarce regions and further research should focus on improving such an aspect for regression-based global hydrological models.

  4. Developing and testing a global-scale regression model to quantify mean annual streamflow

    Science.gov (United States)

    Barbarossa, Valerio; Huijbregts, Mark A. J.; Hendriks, A. Jan; Beusen, Arthur H. W.; Clavreul, Julie; King, Henry; Schipper, Aafke M.

    2017-01-01

    Quantifying mean annual flow of rivers (MAF) at ungauged sites is essential for assessments of global water supply, ecosystem integrity and water footprints. MAF can be quantified with spatially explicit process-based models, which might be overly time-consuming and data-intensive for this purpose, or with empirical regression models that predict MAF based on climate and catchment characteristics. Yet, regression models have mostly been developed at a regional scale and the extent to which they can be extrapolated to other regions is not known. In this study, we developed a global-scale regression model for MAF based on a dataset unprecedented in size, using observations of discharge and catchment characteristics from 1885 catchments worldwide, measuring between 2 and 10^6 km2. In addition, we compared the performance of the regression model with the predictive ability of the spatially explicit global hydrological model PCR-GLOBWB by comparing results from both models to independent measurements. We obtained a regression model explaining 89% of the variance in MAF based on catchment area and catchment averaged mean annual precipitation and air temperature, slope and elevation. The regression model performed better than PCR-GLOBWB for the prediction of MAF, as root-mean-square error (RMSE) values were lower (0.29-0.38 compared to 0.49-0.57) and the modified index of agreement (d) was higher (0.80-0.83 compared to 0.72-0.75). Our regression model can be applied globally to estimate MAF at any point of the river network, thus providing a feasible alternative to spatially explicit process-based global hydrological models.
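
    The two evaluation statistics quoted above can be computed as follows; the modified index of agreement is taken here in its common j = 1 form, which is assumed (not stated in the record) to match the study's usage, and the values are hypothetical.

```python
import numpy as np

def rmse(obs, sim):
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return np.sqrt(np.mean((sim - obs) ** 2))

def modified_index_of_agreement(obs, sim):
    """Willmott's modified index of agreement (j = 1):
    d1 = 1 - sum(|sim - obs|) / sum(|sim - mean(obs)| + |obs - mean(obs)|)."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    denom = np.sum(np.abs(sim - obs.mean()) + np.abs(obs - obs.mean()))
    return 1.0 - np.sum(np.abs(sim - obs)) / denom

# Hypothetical log10-transformed MAF values for a handful of catchments.
obs = np.array([1.2, 2.5, 0.8, 3.1, 1.9, 2.2])
sim = np.array([1.4, 2.3, 1.0, 2.8, 2.0, 2.5])
print(f"RMSE = {rmse(obs, sim):.2f}")
print(f"d1   = {modified_index_of_agreement(obs, sim):.2f}")
```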

  5. Toward Improving Streamflow Forecasts Using SNODAS Products

    Science.gov (United States)

    Barth, C.; Boyle, D. P.; Lamorey, G. W.; Bassett, S. D.

    2007-12-01

    As part of the Water 2025 initiative, researchers at the Desert Research Institute in collaboration with the U.S. Bureau of Reclamation are developing and improving water decision support system (DSS) tools to make seasonal streamflow forecasts for management and operations of water resources in the mountainous western United States. Streamflow forecasts in these areas may have errors that are directly related to uncertainties resulting from the lack of direct high resolution snow water equivalent (SWE) measurements. The purpose of this study is to investigate the possibility of improving the accuracy of streamflow forecasts through the use of Snow Data Assimilation System (SNODAS) products, which are high-resolution daily estimates of snow cover and associated hydrologic variables such as SWE and snowmelt runoff that are available for the coterminous United States. To evaluate the benefit of incorporating the SNODAS product into streamflow forecasts, a variety of Ensemble Streamflow Predictions (ESP) are generated using the Precipitation-Runoff Modeling System (PRMS). A series of manual and automatic calibrations of PRMS to different combinations of measured (streamflow) and estimated (SNODAS SWE) hydrologic variables is performed for several watersheds at various scales of spatial resolution. This study, which is embedded in the constant effort to improve streamflow forecasts and hence water operations DSS, shows the potential of using a product such as SNODAS SWE estimates to decrease parameter uncertainty related to snow variables and enhance forecast skills early in the forecast season.

  6. Parameter dimensionality reduction of a conceptual model for streamflow prediction in Canadian, snowmelt dominated ungauged basins

    Science.gov (United States)

    Arsenault, Richard; Poissant, Dominique; Brissette, François

    2015-11-01

    This paper evaluated the effects of parametric reduction of a hydrological model on five regionalization methods and 267 catchments in the province of Quebec, Canada. The Sobol' variance-based sensitivity analysis was used to rank the model parameters by their influence on the model results, and sequential parameter fixing was performed. The reduction in parameter correlations improved parameter identifiability; however, this improvement was found to be minimal and did not carry over to the regionalization mode. It was shown that 11 of the HSAMI model's 23 parameters could be fixed with little or no loss in regionalization skill. The main conclusions were that (1) the conceptual lumped models used in this study did not represent physical processes sufficiently well to warrant parameter reduction for physics-based regionalization methods for the Canadian basins examined and (2) catchment descriptors did not adequately represent the relevant hydrological processes, namely snow accumulation and melt.
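
    A variance-based Sobol' analysis of the kind described above can be run with the SALib package; the sketch below uses a toy response function in place of a hydrological model, and the parameter names, bounds and sample size are illustrative assumptions (the SALib sampling interface may also differ slightly between versions).

```python
import numpy as np
from SALib.sample import saltelli
from SALib.analyze import sobol

# Hypothetical parameter space standing in for a lumped hydrological model.
problem = {
    "num_vars": 4,
    "names": ["k_melt", "k_recession", "soil_max", "pet_factor"],
    "bounds": [[0.5, 5.0], [0.01, 0.2], [50.0, 400.0], [0.7, 1.3]],
}

def toy_model(x):
    # Toy scalar response in place of a model-evaluation metric such as NSE.
    k_melt, k_rec, soil_max, pet = x
    return k_melt ** 2 + 10.0 * k_rec + 0.01 * soil_max + 0.1 * pet

params = saltelli.sample(problem, 1024)        # N * (2D + 2) parameter sets
y = np.apply_along_axis(toy_model, 1, params)
si = sobol.analyze(problem, y)

# Parameters with low total-order indices are candidates for fixing.
for name, s1, st in zip(problem["names"], si["S1"], si["ST"]):
    print(f"{name:12s}  first-order = {s1:5.2f}   total-order = {st:5.2f}")
```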

  7. Verifying and Validating Simulation Models

    Energy Technology Data Exchange (ETDEWEB)

    Hemez, Francois M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-02-23

    This presentation is a high-level discussion of the Verification and Validation (V&V) of computational models. Definitions of V&V are given to emphasize that “validation” is never performed in a vacuum; it accounts, instead, for the current state-of-knowledge in the discipline considered. In particular, comparisons between physical measurements and numerical predictions should account for their respective sources of uncertainty. The differences between error (bias), aleatoric uncertainty (randomness) and epistemic uncertainty (ignorance, lack-of-knowledge) are briefly discussed. Four types of uncertainty in physics and engineering are discussed: 1) experimental variability, 2) variability and randomness, 3) numerical uncertainty and 4) model-form uncertainty. Statistical sampling methods are available to propagate, and analyze, variability and randomness. Numerical uncertainty originates from the truncation error introduced by the discretization of partial differential equations in time and space. Model-form uncertainty is introduced by assumptions often formulated to render a complex problem more tractable and amenable to modeling and simulation. The discussion concludes with high-level guidance to assess the “credibility” of numerical simulations, which stems from the level of rigor with which these various sources of uncertainty are assessed and quantified.

  8. Modifications to a rainfall-streamflow model to handle "non-stationarity"

    Directory of Open Access Journals (Sweden)

    B. F. W. Croke

    2015-06-01

    Full Text Available This paper explores the variation in hydrological response (often termed non-stationarity, though this is not necessarily the correct use in the statistical meaning of this term) through time for the Bani catchment in Africa (mostly located in Mali). The objective is to identify deficiencies in the ability of the model to capture the variation in the hydrologic response of the catchment, and to modify the model to capture this variation. Due to the large catchment area (approximately 103 000 km2), the unit hydrograph component of the model was modified to permit the model to be used at a daily timescale. Further, an additional driver (population growth) needed to be included in order to adequately capture the transition from a perennial to an ephemeral river.

  9. Multi-variable SWAT model calibration with remotely sensed evapotranspiration and observed flow

    OpenAIRE

    Franco, Ana Clara Lazzari; Bonumá, Nadia Bernardi

    2017-01-01

    ABSTRACT Although intrinsic, uncertainty for hydrological model estimation is not always reported. The aim of this study is to evaluate the use of satellite-based evapotranspiration on SWAT model calibration, regarding uncertainty and model performance in streamflow simulation. The SWAT model was calibrated in a monthly step and validated in monthly (streamflow and evapotranspiration) and daily steps (streamflow only). The validation and calibration period covers the years from 2006 to 2009 a...

  10. Application of the Water Erosion Prediction Project (WEPP) Model to simulate streamflow in a PNW forest watershed

    Science.gov (United States)

    A. Srivastava; M. Dobre; E. Bruner; W. J. Elliot; I. S. Miller; J. Q. Wu

    2011-01-01

    Assessment of water yields from watersheds into streams and rivers is critical to managing water supply and supporting aquatic life. Surface runoff typically contributes the most to the peak discharge of a hydrograph, while subsurface flow dominates the falling limb of the hydrograph and baseflow contributes to streamflow from shallow unconfined aquifers primarily during the...

  11. Analytic Hierarchy Process (AHP) in Ranking Non-Parametric Stochastic Rainfall and Streamflow Models

    Directory of Open Access Journals (Sweden)

    Masengo Ilunga

    2015-08-01

    Full Text Available Analytic Hierarchy Process (AHP) is used in the selection of categories of non-parametric stochastic models for hydrological data generation, and its formulation is based on pairwise comparisons of models. These models or techniques are obtained from a recent study initiated by the Water Research Commission of South Africa (WRC) and were compared predominantly based on their capability to extrapolate data beyond the range of historic hydrological data. The different categories of models involved in the selection process were: wavelet (A), reordering (B), K-nearest neighbor (C), kernel density (D) and bootstrap (E). In the AHP formulation, criteria for the selection of techniques are: "ability for data to preserve historic characteristics", "ability to generate new hydrological data", "scope of applicability", "presence of negative data generated" and "user friendliness". The pairwise comparisons performed through AHP showed that the overall order of selection (ranking) of models was D, C, A, B and C. The weights of these techniques were found to be 27.21%, 24.3%, 22.15%, 13.89% and 11.80%, respectively. Hence, the bootstrap category received the highest preference while nearest neighbor received the lowest preference when all selection criteria are taken into consideration.
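
    In AHP, the category weights are typically obtained as the normalized principal eigenvector of the reciprocal pairwise comparison matrix, with a consistency check on the judgements. The sketch below illustrates that computation for five alternatives; the comparison matrix is invented for illustration and does not reproduce the study's expert judgements.

```python
import numpy as np

# Illustrative reciprocal pairwise comparison matrix for five alternatives.
A = np.array([
    [1.0, 2.0, 3.0, 4.0, 1.0],
    [1/2, 1.0, 2.0, 3.0, 1/2],
    [1/3, 1/2, 1.0, 2.0, 1/3],
    [1/4, 1/3, 1/2, 1.0, 1/4],
    [1.0, 2.0, 3.0, 4.0, 1.0],
])

# Priorities are the normalized principal eigenvector.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()

# Consistency ratio (Saaty): CI = (lambda_max - n) / (n - 1); RI for n = 5 is ~1.12.
n = A.shape[0]
ci = (eigvals[k].real - n) / (n - 1)
cr = ci / 1.12
print("weights:", np.round(weights, 3), f" consistency ratio = {cr:.3f}")
```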

  12. Multivariate power-law models for streamflow prediction in the Mekong Basin

    Directory of Open Access Journals (Sweden)

    Guillaume Lacombe

    2014-11-01

    New hydrological insights for the region: A combination of 3–6 explanatory variables – chosen among annual rainfall, drainage area, perimeter, elevation, slope, drainage density and latitude – is sufficient to predict a range of flow metrics with a prediction R-squared ranging from 84 to 95%. The inclusion of forest or paddy percentage coverage as an additional explanatory variable led to slight improvements in the predictive power of some of the low-flow models (lowest prediction R-squared = 89%). A physical interpretation of the model structure was possible for most of the resulting relationships. Compared to regional regression models developed in other parts of the world, this new set of equations performs reasonably well.

  13. Geochemistry Model Validation Report: External Accumulation Model

    Energy Technology Data Exchange (ETDEWEB)

    K. Zarrabi

    2001-09-27

    The purpose of this Analysis and Modeling Report (AMR) is to validate the External Accumulation Model that predicts accumulation of fissile materials in fractures and lithophysae in the rock beneath a degrading waste package (WP) in the potential monitored geologic repository at Yucca Mountain. (Lithophysae are voids in the rock having concentric shells of finely crystalline alkali feldspar, quartz, and other materials that were formed due to entrapped gas that later escaped, DOE 1998, p. A-25.) The intended use of this model is to estimate the quantities of external accumulation of fissile material for use in external criticality risk assessments for different types of degrading WPs: U.S. Department of Energy (DOE) Spent Nuclear Fuel (SNF) codisposed with High Level Waste (HLW) glass, commercial SNF, and Immobilized Plutonium Ceramic (Pu-ceramic) codisposed with HLW glass. The scope of the model validation is to (1) describe the model and the parameters used to develop the model, (2) provide rationale for selection of the parameters by comparisons with measured values, and (3) demonstrate that the parameters chosen are the most conservative selection for external criticality risk calculations. To demonstrate the applicability of the model, a Pu-ceramic WP is used as an example. The model begins with a source term from separately documented EQ6 calculations, where the source term is defined as the composition versus time of the water flowing out of a breached waste package (WP). Next, PHREEQC is used to simulate the transport and interaction of the source term with the resident water and fractured tuff below the repository. In these simulations the primary mechanism for accumulation is mixing of the high pH, actinide-laden source term with resident water, thus lowering the pH values sufficiently for fissile minerals to become insoluble and precipitate. In the final section of the model, the outputs from PHREEQC are processed to produce mass of accumulation

  14. Geochemistry Model Validation Report: External Accumulation Model

    International Nuclear Information System (INIS)

    Zarrabi, K.

    2001-01-01

    The purpose of this Analysis and Modeling Report (AMR) is to validate the External Accumulation Model that predicts accumulation of fissile materials in fractures and lithophysae in the rock beneath a degrading waste package (WP) in the potential monitored geologic repository at Yucca Mountain. (Lithophysae are voids in the rock having concentric shells of finely crystalline alkali feldspar, quartz, and other materials that were formed due to entrapped gas that later escaped, DOE 1998, p. A-25.) The intended use of this model is to estimate the quantities of external accumulation of fissile material for use in external criticality risk assessments for different types of degrading WPs: U.S. Department of Energy (DOE) Spent Nuclear Fuel (SNF) codisposed with High Level Waste (HLW) glass, commercial SNF, and Immobilized Plutonium Ceramic (Pu-ceramic) codisposed with HLW glass. The scope of the model validation is to (1) describe the model and the parameters used to develop the model, (2) provide rationale for selection of the parameters by comparisons with measured values, and (3) demonstrate that the parameters chosen are the most conservative selection for external criticality risk calculations. To demonstrate the applicability of the model, a Pu-ceramic WP is used as an example. The model begins with a source term from separately documented EQ6 calculations, where the source term is defined as the composition versus time of the water flowing out of a breached waste package (WP). Next, PHREEQC is used to simulate the transport and interaction of the source term with the resident water and fractured tuff below the repository. In these simulations the primary mechanism for accumulation is mixing of the high pH, actinide-laden source term with resident water, thus lowering the pH values sufficiently for fissile minerals to become insoluble and precipitate. In the final section of the model, the outputs from PHREEQC are processed to produce mass of accumulation

  15. Streamflow Gaging Stations

    Data.gov (United States)

    Department of Homeland Security — This map layer shows selected streamflow gaging stations of the United States, Puerto Rico, and the U.S. Virgin Islands, in 2013. Gaging stations, or gages, measure...

  16. Unsteady streamflow simulation using a linear implicit finite-difference model

    Science.gov (United States)

    Land, Larry F.

    1978-01-01

    A computer program for simulating one-dimensional subcritical, gradually varied, unsteady flow in a stream has been developed and documented. Given upstream and downstream boundary conditions and channel geometry data, roughness coefficients, stage, and discharge can be calculated anywhere within the reach as a function of time. The program uses a linear implicit finite-difference technique that discretizes the partial differential equations. Then it arranges the coefficients of the continuity and momentum equations into a pentadiagonal matrix for solution. Because it is a reasonable compromise between computational accuracy, speed, and ease of use, the technique is one of the most commonly used. The upstream boundary condition is a depth hydrograph. However, options also allow the boundary condition to be discharge or water-surface elevation. The downstream boundary condition is a depth which may be constant, self-setting, or unsteady. The reach may be divided into uneven increments and the cross sections may be nonprismatic and may vary from one to the other. Tributary and lateral inflow may enter the reach. The digital model will simulate such common problems as (1) flood waves, (2) releases from dams, and (3) channels where storage is a consideration. It may also supply the needed flow information for mass-transport simulation. (Woodard-USGS)
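
    The numerical core of such an implicit scheme is the solution of a banded linear system at every time step. A minimal sketch using SciPy's banded solver is shown below; the pentadiagonal coefficients are placeholders rather than an actual Saint-Venant discretization.

```python
import numpy as np
from scipy.linalg import solve_banded

# Placeholder pentadiagonal system A x = b standing in for the linearized
# continuity and momentum equations at one time step.
n = 10
rng = np.random.default_rng(0)

# Band storage expected by solve_banded with (l, u) = (2, 2):
# row 0: 2nd superdiagonal, row 1: 1st superdiagonal, row 2: main diagonal,
# row 3: 1st subdiagonal, row 4: 2nd subdiagonal.
ab = np.zeros((5, n))
ab[2, :] = 4.0        # main diagonal (keeps the system diagonally dominant)
ab[1, 1:] = -1.0      # first superdiagonal
ab[3, :-1] = -1.0     # first subdiagonal
ab[0, 2:] = 0.5       # second superdiagonal
ab[4, :-2] = 0.5      # second subdiagonal
b = rng.normal(size=n)

x = solve_banded((2, 2), ab, b)
print(np.round(x, 3))
```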

  17. Analysing the Effects of Forest Cover and Irrigation Farm Dams on Streamflows of Water-Scarce Catchments in South Australia through the SWAT Model

    Directory of Open Access Journals (Sweden)

    Hong Hanh Nguyen

    2017-01-01

    Full Text Available To assist water resource managers with future land use planning efforts, the eco-hydrological model Soil and Water Assessment Tool (SWAT) was applied to three catchments in South Australia that experience extreme low flow conditions. Particular land uses and management issues of interest included forest covers, known to affect water yields, and farm dams, known to intercept and change the hydrological dynamics in a catchment. The study achieved a satisfactory daily calibration when irrigation farm dams were incorporated in the model. For the catchment dominated by extreme low flows, a better daily simulation across a range of qualitative and quantitative metrics was gained using the base-flow static threshold optimization technique. Scenario analysis on the effects of forest cover indicated an increase of surface flow and a reduction of base-flow when native eucalyptus lands were replaced by pastures and vice versa. A decreasing trend was observed for the overall water yield of catchments with more forest plantation due to the higher evapotranspiration (ET) rate and the decline in surface flow. With regard to the effects of irrigation farm dams, assessment on a daily time step suggested that a significant volume of water is stored in these systems, with the water loss rate highest in June and July. On an annual basis, the model indicated that approximately 13.1% to 22.0% of water has been captured by farm dams for irrigation. However, the scenario analysis revealed that the purposes of use of farm dams, rather than their volumetric capacities in the catchment, determined the magnitude of effects on streamflows. Water extracted from farm dams for irrigation of orchards and vineyards is more likely to diminish streamflows than other land uses. Outputs from this study suggest that the water use restrictions on farm dams during recent drought periods were an effective tool to minimize impacts on streamflows.

  18. Sensitivity of SWAT simulated streamflow to climatic changes within the Eastern Nile River basin

    Science.gov (United States)

    Mengistu, D. T.; Sorteberg, A.

    2012-02-01

    The hydrological model SWAT was run with daily station based precipitation and temperature data for the whole Eastern Nile basin including the three subbasins: the Abbay (Blue Nile), BaroAkobo and Tekeze. The daily and monthly streamflows were calibrated and validated at six outlets with station-based streamflow data in the three different subbasins. The model performed very well in simulating the monthly variability while the validation against daily data revealed a more diverse performance. The simulations indicated that around 60% of the average annual rainfalls of the subbasins were lost through evaporation while the estimated runoff coefficients were 0.24, 0.30 and 0.18 for Abbay, BaroAkobo and Tekeze subbasins, respectively. About half to two-thirds of the runoff could be attributed to surface runoff while the other contributions came from groundwater. Twenty hypothetical climate change scenarios (perturbed temperatures and precipitation) were conducted to test the sensitivity of SWAT simulated annual streamflow. The result revealed that the annual streamflow sensitivity to changes in precipitation and temperature differed among the basins and the dependence of the response on the strength of the changes was not linear. On average the annual streamflow responses to a change in precipitation with no temperature change were 19%, 17%, and 26% per 10% change in precipitation while the average annual streamflow responses to a change in temperature and no precipitation change were -4.4% K^-1, -6.4% K^-1, and -1.3% K^-1 for Abbay, BaroAkobo and Tekeze river basins, respectively. 47 temperature and precipitation scenarios from 19 AOGCMs participating in CMIP3 were used to estimate future changes in streamflow due to climate changes. The climate models disagreed on both the strength and the direction of future precipitation changes. Thus, no clear conclusions could be made about future changes in the Eastern Nile streamflow. However, such types of assessment are important

  19. Operational Streamflow Forecasts Development Using GCM Predicted Precipitation Fields

    Science.gov (United States)

    Arumugam, S.; Lall, U.

    2004-12-01

    Monthly updates of streamflow forecasts are required for deriving reservoir operation strategies as well as for quantifying surplus and shortfall for the allocated water contracts. In this study, operational streamflow forecasts are developed using Atmospheric General Circulation Model (AGCM) predicted precipitation for managing the Angat Reservoir System, Philippines. The methodology employs principal components regression (PCR) for downscaling the AGCM-predicted precipitation fields to monthly streamflow forecasts. The performance of this downscaling approach is analyzed with the AGCM forced using observed sea surface temperature (SST) conditions as well as under persisted SST conditions. The ability of the downscaled streamflow forecasts to explain intraseasonal variability is also explored. The conditional distribution of streamflows obtained from the PCR downscaling approach is also compared with that from a simple, semi-parametric resampling algorithm that obtains ensembles of streamflow forecasts by identifying similar conditions in the state space of that season's climatic predictors.
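
    A minimal sketch of the principal components regression (PCR) idea described above, assuming scikit-learn and synthetic data standing in for the AGCM precipitation fields and observed streamflow; the array names and number of retained components are illustrative, not taken from the study.

      # Illustrative PCR downscaling sketch (not the authors' code): reduce gridded
      # GCM precipitation fields to a few leading principal components, then regress
      # monthly streamflow on those components.
      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.linear_model import LinearRegression

      def fit_pcr(gcm_precip, flow, n_components=3):
          """Fit PCA on the precipitation fields and a linear map from modes to flow."""
          pca = PCA(n_components=n_components)
          scores = pca.fit_transform(gcm_precip)      # leading modes of the GCM fields
          reg = LinearRegression().fit(scores, flow)  # regression of flow on the modes
          return pca, reg

      def predict_pcr(pca, reg, gcm_precip_new):
          """Project new GCM fields onto the retained components and predict flow."""
          return reg.predict(pca.transform(gcm_precip_new))

      # Synthetic stand-ins for AGCM output (10 years x 200 grid cells) and observed flow
      rng = np.random.default_rng(0)
      gcm_precip = rng.gamma(2.0, 50.0, size=(120, 200))
      flow = 0.02 * gcm_precip[:, :50].sum(axis=1) + rng.normal(0, 10, 120)
      pca, reg = fit_pcr(gcm_precip[:96], flow[:96])
      forecast = predict_pcr(pca, reg, gcm_precip[96:])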

  20. Long Term Quantification of Climate and Land Cover Change Impacts on Streamflow in an Alpine River Catchment, Northwestern China

    Directory of Open Access Journals (Sweden)

    Zhenliang Yin

    2017-07-01

    Full Text Available Quantifying the long-term impacts of climate and land cover change on streamflow is of great importance for sustainable water resources management in inland river basins. The Soil and Water Assessment Tool (SWAT) model was employed to simulate streamflow in the upper reaches of the Heihe River Basin, northwestern China, over the last half century. The Sequential Uncertainty Fitting algorithm (SUFI-2) was selected to calibrate and validate the SWAT model. The results showed that both the Nash-Sutcliffe efficiency (NSE) and the determination coefficient (R2) were over 0.93 for the calibration and validation periods, and the percent bias (PBIAS) of the two periods was -3.47% and 1.81%, respectively. Precipitation and average, maximum, and minimum air temperature all showed increasing trends, of 14.87 mm/10 years, 0.30 °C/10 years, 0.27 °C/10 years, and 0.37 °C/10 years, respectively. The runoff coefficient increased from 0.36 (averaged over 1964 to 1988) to 0.39 (averaged over 1989 to 2013). Based on the SWAT simulation, we quantified the contributions of climate and land cover change to streamflow change: land cover change had a positive impact on river discharge, increasing streamflow by 7.12% during 1964 to 1988, while climate change contributed 14.08% to the streamflow increase over the last 50 years. The climate change impact intensified after the 2000s. The cold season (November to the following March) accounted for 64.1% of the total streamflow increase and the warm season (April to October) for 35.9%. The results provide some references for dealing with climate and land cover change in an inland river basin for water resource management and planning.
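
    The calibration statistics reported above (NSE, R2 and PBIAS) can be written in a few lines; the sketch below is a generic illustration, not the SUFI-2 implementation, and PBIAS sign conventions vary between tools.

      # Common goodness-of-fit metrics for paired observed and simulated streamflow.
      import numpy as np

      def nse(obs, sim):
          """Nash-Sutcliffe efficiency: 1 is perfect, 0 is no better than the observed mean."""
          obs, sim = np.asarray(obs, float), np.asarray(sim, float)
          return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

      def r_squared(obs, sim):
          """Coefficient of determination based on the linear correlation of obs and sim."""
          return np.corrcoef(obs, sim)[0, 1] ** 2

      def pbias(obs, sim):
          """Percent bias; with this convention, positive values indicate underestimation."""
          obs, sim = np.asarray(obs, float), np.asarray(sim, float)
          return 100.0 * np.sum(obs - sim) / np.sum(obs)

      obs = np.array([12.0, 30.5, 8.2, 45.1, 20.0])
      sim = np.array([10.8, 33.0, 9.0, 41.7, 22.5])
      print(nse(obs, sim), r_squared(obs, sim), pbias(obs, sim))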

  1. Characterizing the spatial correlation of streamflows

    Science.gov (United States)

    Betterle, Andrea; Schirmer, Mario; Botter, Gianluca

    2017-04-01

    The spatial variability of streamflow dynamics is the byproduct of complex interactions between heterogeneous climatic, morphological, geological and ecological conditions across the landscape. The spatial correlation of streamflows represents a synthetic statistical indicator of the similarity between flow dynamics at two arbitrary river sites. Streamflow correlation can therefore be used to track changes in flow dynamics along river networks, with implications for studies where spatial patterns of flow regimes are critical. In this work we develop an analytical model to quantify the seasonal linear correlation between daily streamflow time series at the outlets of two arbitrary unregulated catchments. The framework is based on a parsimonious and physically based stochastic description of the main geomorphoclimatic drivers of flow dynamics, ultimately leading to analytical expressions for the flow correlation. The streamflow correlation between two river sites is thus expressed as a function of the main physical drivers characterizing flow dynamics at the relevant sites, namely the frequency and intensity of runoff-generating rainfall and the catchment recession rates. The performance of the model is assessed on a set of catchments in the Eastern United States, with satisfactory results. Different parameter estimation techniques are also developed, including a method that enables the estimation of streamflow correlation in the absence of discharge data. The role played by the spatial heterogeneity of the hydrological processes considered in the model on the resulting streamflow correlation is analytically assessed and evaluated in the study sites. The analysis shows that the seasonal spatial correlation of flow dynamics is mainly controlled by the frequency and intensity of runoff-generating rainfall events, whereas heterogeneous recession rates have a limited influence in the study area. Additionally, the framework accounts for the topological arrangement of river networks
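
    As a point of comparison with the analytical model described above, the empirical quantity it targets, the seasonal correlation of daily flows at two sites, can be estimated directly from paired records. The sketch below uses pandas with synthetic series and an assumed June-August season; the column names and season choice are illustrative only.

      # Empirical seasonal correlation of daily streamflow at two river sites.
      import numpy as np
      import pandas as pd

      def seasonal_flow_correlation(q_a, q_b, months=(6, 7, 8)):
          """Pearson correlation of daily flows at two sites, restricted to a season."""
          df = pd.concat({"a": q_a, "b": q_b}, axis=1).dropna()
          seasonal = df[df.index.month.isin(months)]
          return seasonal["a"].corr(seasonal["b"])

      # Synthetic example: two correlated daily flow series over three years
      idx = pd.date_range("2000-01-01", "2002-12-31", freq="D")
      rng = np.random.default_rng(1)
      common = rng.gamma(2.0, 5.0, len(idx))
      q_a = pd.Series(common + rng.normal(0, 1, len(idx)), index=idx)
      q_b = pd.Series(0.7 * common + rng.normal(0, 1, len(idx)), index=idx)
      print(seasonal_flow_correlation(q_a, q_b))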

  2. Drivers influencing streamflow changes in the Upper Turia basin, Spain.

    Science.gov (United States)

    Salmoral, Gloria; Willaarts, Bárbara A; Troch, Peter A; Garrido, Alberto

    2015-01-15

    Many rivers across the world have experienced a significant streamflow reduction over the last decades. Drivers of the observed streamflow changes are multiple, including climate change (CC), land use and land cover changes (LULCC), water transfers and river impoundment. Many of these drivers interact simultaneously, making it difficult to discern the impact of each driver individually. In this study we isolate the effects of LULCC on the observed streamflow reduction in the Upper Turia basin (east Spain) during the period 1973-2008. Regression models of annual streamflow are fitted with climatic variables and additional time-variant drivers such as LULCC. The ecohydrological model SWAT is used to study the magnitude and sign of streamflow change when LULCC occurs. Our results show that LULCC does play a significant role in the water balance, but it is not the main driver underpinning the observed reduction in the Turia's streamflow. Increasing mean temperature is the main factor behind increasing evapotranspiration and streamflow reduction. In fact, LULCC and CC have had offsetting effects on streamflow generation during the study period. While streamflow has been negatively affected by increasing temperature, ongoing LULCC have compensated through reduced evapotranspiration rates, owing mainly to shrubland clearing and forest degradation processes. These findings are valuable for the management of the Turia river basin, as well as a useful approach for determining the weight of LULCC on the hydrological response in other regions. Copyright © 2014 Elsevier B.V. All rights reserved.

  3. Validation of systems biology models

    NARCIS (Netherlands)

    Hasdemir, D.

    2015-01-01

    The paradigm shift from qualitative to quantitative analysis of biological systems brought a substantial number of modeling approaches to the stage of molecular biology research. These include but certainly are not limited to nonlinear kinetic models, static network models and models obtained by the

  4. Feature Extraction for Structural Dynamics Model Validation

    Energy Technology Data Exchange (ETDEWEB)

    Farrar, Charles [Los Alamos National Laboratory; Nishio, Mayuko [Yokohama University; Hemez, Francois [Los Alamos National Laboratory; Stull, Chris [Los Alamos National Laboratory; Park, Gyuhae [Chonnam Univesity; Cornwell, Phil [Rose-Hulman Institute of Technology; Figueiredo, Eloi [Universidade Lusófona; Luscher, D. J. [Los Alamos National Laboratory; Worden, Keith [University of Sheffield

    2016-01-13

    As structural dynamics becomes increasingly non-modal, stochastic and nonlinear, finite element model-updating technology must adopt the broader notions of model validation and uncertainty quantification. For example, particular re-sampling procedures must be implemented to propagate uncertainty through a forward calculation, and non-modal features must be defined to analyze nonlinear data sets. The latter topic is the focus of this report, but first, some more general comments regarding the concept of model validation will be discussed.

  5. Model Validation in Ontology Based Transformations

    Directory of Open Access Journals (Sweden)

    Jesús M. Almendros-Jiménez

    2012-10-01

    Full Text Available Model Driven Engineering (MDE) is an emerging approach to software engineering. MDE emphasizes the construction of models from which the implementation is derived by applying model transformations. The Ontology Definition Meta-model (ODM) has been proposed as a profile for UML models of the Web Ontology Language (OWL). In this context, transformations of UML models can be mapped into ODM/OWL transformations. On the other hand, model validation is a crucial task in model transformation. Meta-modeling makes it possible to give a syntactic structure to source and target models. However, semantic requirements also have to be imposed on source and target models. A given transformation is sound when the source and target models fulfill the syntactic and semantic requirements. In this paper, we present an approach for model validation in ODM-based transformations. Adopting a logic-programming-based transformational approach, we show how it is possible to transform and validate models. Properties to be validated range from structural and semantic requirements of models (pre- and post-conditions) to properties of the transformation (invariants). The approach has been applied to a well-known example of model transformation: the Entity-Relationship (ER) to Relational Model (RM) transformation.

  6. Model validation: Correlation for updating

    Indian Academy of Sciences (India)

    Department of Mechanical Engineering, Imperial College of Science, ... If we are unable to obtain a satisfactory degree of correlation between the initial theoretical model and the test data, then it is extremely unlikely that any form of model updating (correcting the model to match the test data) will succeed. Thus, a successful ...

  7. Streamflow depletion by wells--Understanding and managing the effects of groundwater pumping on streamflow

    Science.gov (United States)

    Barlow, Paul M.; Leake, Stanley A.

    2012-11-02

    Groundwater is an important source of water for many human needs, including public supply, agriculture, and industry. With the development of any natural resource, however, adverse consequences may be associated with its use. One of the primary concerns related to the development of groundwater resources is the effect of groundwater pumping on streamflow. Groundwater and surface-water systems are connected, and groundwater discharge is often a substantial component of the total flow of a stream. Groundwater pumping reduces the amount of groundwater that flows to streams and, in some cases, can draw streamflow into the underlying groundwater system. Streamflow reductions (or depletions) caused by pumping have become an important water-resource management issue because of the negative impacts that reduced flows can have on aquatic ecosystems, the availability of surface water, and the quality and aesthetic value of streams and rivers. Scientific research over the past seven decades has made important contributions to the basic understanding of the processes and factors that affect streamflow depletion by wells. Moreover, advances in methods for simulating groundwater systems with computer models provide powerful tools for estimating the rates, locations, and timing of streamflow depletion in response to groundwater pumping and for evaluating alternative approaches for managing streamflow depletion. The primary objective of this report is to summarize these scientific insights and to describe the various field methods and modeling approaches that can be used to understand and manage streamflow depletion. A secondary objective is to highlight several misconceptions concerning streamflow depletion and to explain why these misconceptions are incorrect.

  8. Streamflow response to increasing precipitation extremes altered by forest management

    Science.gov (United States)

    Charlene N. Kelly; Kevin J. McGuire; Chelcy Ford Miniat; James M. Vose

    2016-01-01

    Increases in extreme precipitation events of floods and droughts are expected to occur worldwide. The increase in extreme events will result in changes in streamflow that are expected to affect water availability for human consumption and aquatic ecosystem function. We present an analysis that may greatly improve current streamflow models by quantifying the...

  9. Comparison of streamflow prediction skills from NOAH-MP/RAPID, VIC/RAPID and SWAT toward an ensemble flood forecasting framework over large scales

    Science.gov (United States)

    Rajib, M. A.; Tavakoly, A. A.; Du, L.; Merwade, V.; Lin, P.

    2015-12-01

    Considering the differences in how individual models represent physical processes for runoff generation and streamflow routing, use of ensemble output is desirable in an operational streamflow estimation and flood forecasting framework. To enable the use of ensemble streamflow, comparison of multiple hydrologic models at finer spatial resolution over a large domain is yet to be explored. The objective of this work is to compare streamflow prediction skills from three different land surface/hydrologic modeling frameworks: NOAH-MP/RAPID, VIC/RAPID and SWAT, over the Ohio River Basin with a drainage area of 491,000 km2. For a uniform comparison, all three modeling frameworks share the same setup, with common weather inputs, spatial resolution, and gauge stations employed in the calibration procedure. The runoff output from the NOAH-MP and VIC land surface models is routed through a vector-based river routing model named RAPID, which is set up on the high-resolution NHDPlus reaches and catchments. The SWAT model is used with its default tightly coupled surface-subsurface hydrology and channel routing components to obtain streamflow for each NHDPlus reach. Model simulations are performed in two modes: (i) a hindcasting/calibration mode in which the models are calibrated against USGS daily streamflow observations at multiple locations, and (ii) a validation mode in which the calibrated models are executed at a 3-hourly time interval for historical flood events. To assess the model-specific nature of biases during storm events as well as dry periods, time series of surface runoff and baseflow components at the specific USGS gauging locations are extracted from the corresponding observed/simulated streamflow data using a recursive digital filter. The multi-model comparison presented here provides insights toward future model improvements and also serves as the first step in implementing an operational ensemble flood forecasting framework
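
    The baseflow separation step mentioned above relies on a recursive digital filter; the abstract does not specify which one, so the sketch below uses the widely cited single-parameter Lyne-Hollick form with a typical filter parameter of 0.925 and a single forward pass, both of which are simplifying assumptions.

      # One common recursive digital filter for splitting streamflow into quickflow and
      # baseflow (single-parameter Lyne-Hollick form).
      import numpy as np

      def baseflow_separation(q, alpha=0.925):
          """Return (baseflow, quickflow) arrays from a streamflow series."""
          q = np.asarray(q, float)
          quick = np.zeros_like(q)
          for t in range(1, len(q)):
              quick[t] = alpha * quick[t - 1] + 0.5 * (1 + alpha) * (q[t] - q[t - 1])
              quick[t] = min(max(quick[t], 0.0), q[t])   # constrain to physical bounds
          base = q - quick
          return base, quick

      q = np.array([5.0, 5.2, 12.0, 30.0, 22.0, 14.0, 9.0, 7.0, 6.0, 5.5])
      base, quick = baseflow_separation(q)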

  10. Base Flow Model Validation, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovation is the systematic "building-block" validation of CFD/turbulence models employing a GUI driven CFD code (RPFM) and existing as well as new data sets to...

  11. Validating EHR clinical models using ontology patterns.

    Science.gov (United States)

    Martínez-Costa, Catalina; Schulz, Stefan

    2017-12-01

    Clinical models are artefacts that specify how information is structured in electronic health records (EHRs). However, the makeup of clinical models is not guided by any formal constraint beyond a semantically vague information model. We address this gap by advocating ontology design patterns as a mechanism that makes the semantics of clinical models explicit. This paper demonstrates how ontology design patterns can validate existing clinical models using SHACL. Based on the Clinical Information Modelling Initiative (CIMI), we show how ontology patterns detect both modeling and terminology binding errors in CIMI models. SHACL, a W3C constraint language for the validation of RDF graphs, builds on the concept of "Shape", a description of data in terms of expected cardinalities, datatypes and other restrictions. SHACL, as opposed to OWL, subscribes to the Closed World Assumption (CWA) and is therefore more suitable for the validation of clinical models. We have demonstrated the feasibility of the approach by manually describing the correspondences between six CIMI clinical models represented in RDF and two SHACL ontology design patterns. Using a Java-based SHACL implementation, we found at least eleven modeling and binding errors within these CIMI models. This demonstrates the usefulness of ontology design patterns not only as a modeling tool but also as a tool for validation. Copyright © 2017 Elsevier Inc. All rights reserved.
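
    The study validates CIMI models with a Java-based SHACL engine; the sketch below illustrates the same shape-based validation pattern in Python using the pyshacl and rdflib libraries, with a toy shape and data graph rather than actual CIMI content.

      # Minimal SHACL validation sketch: a shape requiring a terminology binding
      # (ex:hasCode) flags an observation that lacks one.
      from rdflib import Graph
      from pyshacl import validate

      data_ttl = """
      @prefix ex: <http://example.org/> .
      ex:obs1 a ex:Observation ; ex:hasValue "high" .
      """

      shapes_ttl = """
      @prefix sh: <http://www.w3.org/ns/shacl#> .
      @prefix ex: <http://example.org/> .
      ex:ObservationShape a sh:NodeShape ;
          sh:targetClass ex:Observation ;
          sh:property [ sh:path ex:hasCode ; sh:minCount 1 ] .
      """

      data_graph = Graph().parse(data=data_ttl, format="turtle")
      shapes_graph = Graph().parse(data=shapes_ttl, format="turtle")

      conforms, report_graph, report_text = validate(data_graph, shacl_graph=shapes_graph)
      print(conforms)        # False: the observation lacks the required binding
      print(report_text)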

  12. Verification and validation for waste disposal models

    International Nuclear Information System (INIS)

    1987-07-01

    A set of evaluation criteria has been developed to assess the suitability of current verification and validation techniques for waste disposal models. A survey of current practices and techniques was undertaken and evaluated against these criteria, and the items most relevant to waste disposal models were identified. Recommendations regarding the most suitable verification and validation practices for nuclear waste disposal modelling software have been made.

  13. Evidential Model Validation under Epistemic Uncertainty

    Directory of Open Access Journals (Sweden)

    Wei Deng

    2018-01-01

    Full Text Available This paper proposes evidence-theory-based methods to both quantify epistemic uncertainty and validate a computational model. Three types of epistemic uncertainty concerning input model data, namely sparse points, intervals, and probability distributions with uncertain parameters, are considered. Through the proposed methods, the given data are described as corresponding probability distributions for uncertainty propagation in the computational model and thus for model validation. The proposed evidential model validation method is inspired by the ideas of Bayesian hypothesis testing and the Bayes factor, which compare the model predictions with the observed experimental data so as to assess the predictive capability of the model and support the decision on model acceptance. Building on the idea of the Bayes factor, the frame of discernment of Dempster-Shafer evidence theory is constituted and the basic probability assignment (BPA) is determined. Because the proposed validation method is evidence based, the robustness of the result can be guaranteed, and the hypothesis best supported by the evidence in the model testing will be favored by the BPA. The validity of the proposed methods is illustrated through a numerical example.

  14. Long-range forecasting of intermittent streamflow

    Science.gov (United States)

    van Ogtrop, F. F.; Vervoort, R. W.; Heller, G. Z.; Stasinopoulos, D. M.; Rigby, R. A.

    2011-11-01

    Long-range forecasting of intermittent streamflow in semi-arid Australia poses a number of major challenges. One of the challenges relates to modelling zero, skewed, non-stationary, and non-linear data. To address this, a statistical model to forecast streamflow up to 12 months ahead is applied to five semi-arid catchments in South Western Queensland. The model uses logistic regression through Generalised Additive Models for Location, Scale and Shape (GAMLSS) to determine the probability of flow occurring in any of the systems. We then use the same regression framework in combination with a right-skewed distribution, the Box-Cox t distribution, to model the intensity (depth) of the non-zero streamflows. Time, seasonality and climate indices, describing the Pacific and Indian Ocean sea surface temperatures, are tested as covariates in the GAMLSS model to make probabilistic 6 and 12-month forecasts of the occurrence and intensity of streamflow. The output reveals that in the study region the occurrence and variability of flow is driven by sea surface temperatures and therefore forecasts can be made with some skill.

  15. Long-range forecasting of intermittent streamflow

    Directory of Open Access Journals (Sweden)

    F. F. van Ogtrop

    2011-11-01

    Full Text Available Long-range forecasting of intermittent streamflow in semi-arid Australia poses a number of major challenges. One of the challenges relates to modelling zero, skewed, non-stationary, and non-linear data. To address this, a statistical model to forecast streamflow up to 12 months ahead is applied to five semi-arid catchments in South Western Queensland. The model uses logistic regression through Generalised Additive Models for Location, Scale and Shape (GAMLSS) to determine the probability of flow occurring in any of the systems. We then use the same regression framework in combination with a right-skewed distribution, the Box-Cox t distribution, to model the intensity (depth) of the non-zero streamflows. Time, seasonality and climate indices, describing the Pacific and Indian Ocean sea surface temperatures, are tested as covariates in the GAMLSS model to make probabilistic 6 and 12-month forecasts of the occurrence and intensity of streamflow. The output reveals that in the study region the occurrence and variability of flow is driven by sea surface temperatures and therefore forecasts can be made with some skill.
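
    The two records above describe a two-stage model: occurrence of flow via logistic regression and intensity of non-zero flow via a skewed distribution. The sketch below is a simplified stand-in (statsmodels logistic regression plus a lognormal intensity model), since GAMLSS and the Box-Cox t distribution have no direct equivalent in common Python libraries; the SST index and coefficients are synthetic.

      # Two-stage occurrence/intensity sketch for intermittent streamflow forecasting.
      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(2)
      n = 240                                   # 20 years of monthly data
      sst_index = rng.normal(0, 1, n)           # synthetic climate index
      occur = rng.random(n) < 1 / (1 + np.exp(-(-0.5 + 1.2 * sst_index)))
      depth = np.where(occur, np.exp(1.0 + 0.8 * sst_index + rng.normal(0, 0.5, n)), 0.0)

      X = sm.add_constant(sst_index)

      # Stage 1: probability that any flow occurs
      occ_model = sm.Logit(occur.astype(float), X).fit(disp=0)

      # Stage 2: intensity of flow, conditional on occurrence (lognormal assumption)
      int_model = sm.OLS(np.log(depth[occur]), X[occur]).fit()

      # Forecast for a new value of the climate index
      x_new = sm.add_constant(np.array([1.5]), has_constant="add")
      p_flow = occ_model.predict(x_new)[0]
      expected_depth = np.exp(int_model.predict(x_new)[0])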

  16. Model validation: Correlation for updating

    Indian Academy of Sciences (India)

    of refining the theoretical model which will be used for the design optimisation process. There are many different names given to the tasks involved in this refinement. .... slightly from the ideal line but in a systematic rather than a random fashion as this situation suggests that there is a specific characteristic responsible for the ...

  17. Marginal economic value of streamflow: A case study for the Colorado River Basin

    Science.gov (United States)

    Thomas C. Brown; Benjamin L. Harding; Elizabeth A. Payton

    1990-01-01

    The marginal economic value of streamflow leaving forested areas in the Colorado River Basin was estimated by determining the impact on water use of a small change in streamflow and then applying economic value estimates to the water use changes. The effect on water use of a change in streamflow was estimated with a network flow model that simulated salinity levels and...

  18. Validating the passenger traffic model for Copenhagen

    DEFF Research Database (Denmark)

    Overgård, Christian Hansen; VUK, Goran

    2006-01-01

    The paper presents a comprehensive validation procedure for the passenger traffic model for Copenhagen based on external data from the Danish national travel survey and traffic counts. The model was validated for the years 2000 to 2004, with 2004 being of particular interest because the Copenhagen...... matched the observed traffic better than those of the transit assignment model. With respect to the metro forecasts, the model over-predicts metro passenger flows by 10% to 50%. The wide range of findings from the project resulted in two actions. First, a project was started in January 2005 to upgrade...

  19. Land cover and climate change effects on streamflow and sediment yield: a case study of Tapacurá River basin, Brazil

    Directory of Open Access Journals (Sweden)

    J. Y. G. Santos

    2015-06-01

    Full Text Available This study assesses the impact of land use and climate changes between 1967–2008 on the streamflow and sediment yield in the Tapacurá River basin (Brazil) using the Soil and Water Assessment Tool (SWAT) model. The model was calibrated and validated by comparing simulated mean monthly streamflow with observed long-term mean monthly streamflow. The obtained R2 and Nash–Sutcliffe efficiency values for the streamflow data were, respectively, 0.82 and 0.71 for 1967–1974, and 0.84 and 0.82 for 1995–2008. The results show that land cover and climate change affected the basin hydrology, decreasing the streamflow and sediment yield (227.39 mm and 18.21 t ha−1 yr−1 for 1967–1974 versus 182.86 mm and 7.67 t ha−1 yr−1 for 1995–2008). The process changes arise partly from land cover/use variability but mainly from the decrease in rainfall during 1995–2008 compared with the first period analysed, which in turn decreased streamflow and sediment yield during the wet seasons and reduced base flow during the dry seasons.

  20. Analysis of streamflow response to land use and land cover changes using satellite data and hydrological modelling: case study of Dinder and Rahad tributaries of the Blue Nile (Ethiopia-Sudan)

    Science.gov (United States)

    Hassaballah, Khalid; Mohamed, Yasir; Uhlenbrook, Stefan; Biro, Khalid

    2017-10-01

    Understanding the land use and land cover changes (LULCCs) and their implications for the surface hydrology of the Dinder and Rahad basins (D&R, approximately 77 504 km2) is vital for the management and utilization of water resources in the basins. Although there are many studies on LULCC in the Blue Nile Basin, specific studies on LULCC in the D&R are still missing. Hence, its impact on streamflow is unknown. The objective of this paper is to understand the LULCC in the Dinder and Rahad and its implications for streamflow response using satellite data and hydrological modelling. The hydrological model has been driven by different sets of land use and land cover maps from 1972, 1986, 1998 and 2011. Catchment topography, land cover and soil maps are derived from satellite images and serve to estimate model parameters. Results of LULCC detection between 1972 and 2011 indicate a significant decrease in woodland and an increase in cropland. Woodland decreased from 42 to 14 % and from 35 to 14 % for Dinder and Rahad, respectively. Cropland increased from 14 to 47 % and from 18 to 68 % in Dinder and Rahad, respectively. The model results indicate that streamflow is affected by LULCC in both the Dinder and the Rahad rivers. The effect of LULCC on streamflow is significant during 1986 and 2011. This could be attributed to the severe drought during the mid-1980s and the recent large expansion in cropland.

  1. BIOMOVS: an international model validation study

    International Nuclear Information System (INIS)

    Haegg, C.; Johansson, G.

    1988-01-01

    BIOMOVS (BIOspheric MOdel Validation Study) is an international study where models used for describing the distribution of radioactive and nonradioactive trace substances in terrestrial and aquatic environments are compared and tested. The main objectives of the study are to compare and test the accuracy of predictions between such models, explain differences in these predictions, recommend priorities for future research concerning the improvement of the accuracy of model predictions and act as a forum for the exchange of ideas, experience and information. (author)

  2. BIOMOVS: An international model validation study

    International Nuclear Information System (INIS)

    Haegg, C.; Johansson, G.

    1987-01-01

    BIOMOVS (BIOspheric MOdel Validation Study) is an international study where models used for describing the distribution of radioactive and nonradioactive trace substances in terrestrial and aquatic environments are compared and tested. The main objectives of the study are to compare and test the accuracy of predictions between such models, explain differences in these predictions, recommend priorities for future research concerning the improvement of the accuracy of model predictions and act as a forum for the exchange of ideas, experience and information. (orig.)

  3. On validation of multibody musculoskeletal models

    DEFF Research Database (Denmark)

    Lund, Morten Enemark; de Zee, Mark; Andersen, Michael Skipper

    2012-01-01

    This paper reviews the opportunities to validate multibody musculoskeletal models in view of the current transition of musculoskeletal modelling from a research topic to a practical simulation tool in product design, healthcare and other important applications. This transition creates a new need...

  4. Validation of models with proportional bias

    Directory of Open Access Journals (Sweden)

    Salvador Medina-Peralta

    2017-01-01

    Full Text Available Objective. This paper presents extensions to Freese's statistical method for model validation when proportional bias (PB) is present in the predictions. The method is illustrated with data from a model that simulates grassland growth. Materials and methods. The extensions to validate models with PB were: the maximum anticipated error for the original proposal; hypothesis testing and the maximum anticipated error for the alternative proposal; and the confidence interval for a quantile of the error distribution. Results. The tested model had PB; once it was removed, and with a confidence level of 95%, the magnitude of the error did not surpass 1225.564 kg ha-1. Therefore, the validated model can be used to predict grassland growth, although its structure would need adjustment to account for the presence of PB. Conclusions. The extensions presented to validate models with PB are applied without modification of the model structure. Once PB is corrected, the confidence interval for the quantile 1-α of the error distribution provides an upper bound for the magnitude of the prediction error, and it can be used to evaluate the evolution of the model for system prediction.
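
    A minimal sketch of the proportional-bias idea, not Freese's exact procedure: regress observed values on the model predictions and test whether the slope differs from 1; if it does, the predictions can be rescaled before computing error bounds. The grassland-growth numbers below are synthetic.

      # Detecting and correcting proportional bias via a slope test against 1.
      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(3)
      predicted = rng.uniform(1000, 6000, 40)                       # kg/ha, simulated
      observed = 1.15 * predicted + rng.normal(0, 300, 40)          # 15% proportional bias

      X = sm.add_constant(predicted)
      fit = sm.OLS(observed, X).fit()

      slope, slope_se = fit.params[1], fit.bse[1]
      t_stat = (slope - 1.0) / slope_se          # H0: slope = 1 (no proportional bias)
      print(fit.params, t_stat)

      # If PB is present, rescale predictions by the fitted slope before computing
      # the maximum anticipated error described in the abstract.
      corrected = slope * predicted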

  5. Model performance analysis and model validation in logistic regression

    Directory of Open Access Journals (Sweden)

    Rosa Arboretti Giancristofaro

    2007-10-01

    Full Text Available In this paper a new model validation procedure for a logistic regression model is presented. First, we give a brief review of different techniques of model validation. Next, we define a number of properties required for a model to be considered "good", along with a number of quantitative performance measures. Lastly, we describe a methodology for assessing the performance of a given model using an example taken from a management study.

  6. Validation of the IMS CORE Diabetes Model.

    Science.gov (United States)

    McEwan, Phil; Foos, Volker; Palmer, James L; Lamotte, Mark; Lloyd, Adam; Grant, David

    2014-09-01

    The IMS CORE Diabetes Model (CDM) is a widely published and validated simulation model applied in both type 1 diabetes mellitus (T1DM) and type 2 diabetes mellitus (T2DM) analyses. Validation to external studies is an important part of demonstrating model credibility. Because the CDM is widely used to estimate long-term clinical outcomes in diabetes patients, the objective of this analysis was to validate the CDM to contemporary outcomes studies, including those with long-term follow-up periods. A total of 112 validation simulations were performed, stratified by study follow-up duration. For long-term results (≥15-year follow-up), simulation cohorts representing baseline Diabetes Control and Complications Trial (DCCT) and United Kingdom Prospective Diabetes Study (UKPDS) cohorts were generated and intensive and conventional treatment arms were defined in the CDM. Predicted versus observed macrovascular and microvascular complications and all-cause mortality were assessed using the coefficient of determination (R2) goodness-of-fit measure. Across all validation studies, the CDM simulations produced an R2 statistic of 0.90. For validation studies with a follow-up duration of less than 15 years, R2 values of 0.90 and 0.88 were achieved for T1DM and T2DM respectively. In T1DM, validating against 30-year outcomes data (DCCT) resulted in an R2 of 0.72. In T2DM, validating against 20-year outcomes data (UKPDS) resulted in an R2 of 0.92. This analysis supports the CDM as a credible tool for predicting the absolute number of clinical events in DCCT- and UKPDS-like populations. With increasing incidence of diabetes worldwide, the CDM is particularly important for health care decision makers, for whom the robust evaluation of health care policies is essential. Copyright © 2014 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.

  7. A discussion on validation of hydrogeological models

    International Nuclear Information System (INIS)

    Carrera, J.; Mousavi, S.F.; Usunoff, E.J.; Sanchez-Vila, X.; Galarza, G.

    1993-01-01

    Groundwater flow and solute transport are often driven by heterogeneities that elude easy identification. It is also difficult to select and describe the physico-chemical processes controlling solute behaviour. As a result, definition of a conceptual model involves numerous assumptions both on the selection of processes and on the representation of their spatial variability. Validating a numerical model by comparing its predictions with actual measurements may not be sufficient for evaluating whether or not it provides a good representation of 'reality'. Predictions will be close to measurements, regardless of model validity, if these are taken from experiments that stress well-calibrated model modes. On the other hand, predictions will be far from measurements when model parameters are very uncertain, even if the model is indeed a very good representation of the real system. Hence, we contend that 'classical' validation of hydrogeological models is not possible. Rather, models should be viewed as theories about the real system. We propose to follow a rigorous modeling approach in which different sources of uncertainty are explicitly recognized. The application of one such approach is illustrated by modeling a laboratory uranium tracer test performed on fresh granite, which was used as Test Case 1b in INTRAVAL. (author)

  8. Structural system identification: Structural dynamics model validation

    Energy Technology Data Exchange (ETDEWEB)

    Red-Horse, J.R.

    1997-04-01

    Structural system identification is concerned with the development of systematic procedures and tools for developing predictive analytical models based on a physical structure's dynamic response characteristics. It is a multidisciplinary process that involves the ability (1) to define high fidelity physics-based analysis models, (2) to acquire accurate test-derived information for physical specimens using diagnostic experiments, (3) to validate the numerical simulation model by reconciling differences that inevitably exist between the analysis model and the experimental data, and (4) to quantify uncertainties in the final system models and subsequent numerical simulations. The goal of this project was to develop structural system identification techniques and software suitable for both research and production applications in code and model validation.

  9. Computer-program documentation of an interactive-accounting model to simulate streamflow, water quality, and water-supply operations in a river basin

    Science.gov (United States)

    Burns, A.W.

    1988-01-01

    This report describes an interactive-accounting model used to simulate streamflow, chemical-constituent concentrations and loads, and water-supply operations in a river basin. The model uses regression equations to compute flow from incremental (internode) drainage areas. Conservative chemical constituents (typically dissolved solids) also are computed from regression equations. Both flow and water quality loads are accumulated downstream. Optionally, the model simulates the water use and the simplified groundwater systems of a basin. Water users include agricultural, municipal, industrial, and in-stream users, and reservoir operators. Water users list their potential water sources, including direct diversions, groundwater pumpage, interbasin imports, or reservoir releases, in the order in which they will be used. Direct diversions conform to basinwide water law priorities. The model is interactive, and although the input data exist in files, the user can modify them interactively. A major feature of the model is its color-graphic-output options. This report includes a description of the model, organizational charts of subroutines, and examples of the graphics. Detailed format instructions for the input data, example files of input data, definitions of program variables, and listing of the FORTRAN source code are Attachments to the report. (USGS)

  10. Bayesian Regression and Neuro-Fuzzy Methods Reliability Assessment for Estimating Streamflow

    Directory of Open Access Journals (Sweden)

    Yaseen A. Hamaamin

    2016-07-01

    Full Text Available Accurate and efficient estimation of streamflow in a watershed's tributaries is a prerequisite for viable water resources management. This study couples process-driven and data-driven methods of streamflow forecasting as a more efficient and cost-effective approach to water resources planning and management. Two data-driven methods, Bayesian regression and the adaptive neuro-fuzzy inference system (ANFIS), were tested separately as a faster alternative to a calibrated and validated Soil and Water Assessment Tool (SWAT) model to predict streamflow in the Saginaw River Watershed of Michigan. For the data-driven modeling process, four structures were assumed and tested: general, temporal, spatial, and spatiotemporal. Results showed that both Bayesian regression and ANFIS can replicate global (watershed) and local (subbasin) results similar to a calibrated SWAT model. At the global level, Bayesian regression and ANFIS model performance were satisfactory based on Nash-Sutcliffe efficiencies of 0.99 and 0.97, respectively. At the subbasin level, the Bayesian regression and ANFIS models were satisfactory for 155 and 151 out of 155 subbasins, respectively. Overall, the most accurate method was a spatiotemporal Bayesian regression model that outperformed the other models at global and local scales. However, all ANFIS models also performed satisfactorily at both scales.
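
    A rough illustration of the data-driven surrogate idea, assuming scikit-learn's BayesianRidge as the Bayesian regression (the study's exact formulation may differ) and scoring with the Nash-Sutcliffe efficiency; the predictors and flows below are synthetic.

      # Bayesian linear regression as a fast surrogate for simulated streamflow.
      import numpy as np
      from sklearn.linear_model import BayesianRidge

      def nse(obs, sim):
          obs, sim = np.asarray(obs, float), np.asarray(sim, float)
          return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

      rng = np.random.default_rng(4)
      n = 365
      precip = rng.gamma(1.5, 4.0, n)
      temp = 10 + 12 * np.sin(2 * np.pi * np.arange(n) / 365) + rng.normal(0, 2, n)
      flow = 0.6 * precip + 0.3 * np.roll(precip, 1) - 0.1 * temp + rng.normal(0, 0.5, n)

      X = np.column_stack([precip, np.roll(precip, 1), temp])   # predictors with 1-day lag
      model = BayesianRidge().fit(X[:300], flow[:300])
      pred = model.predict(X[300:])
      print(nse(flow[300:], pred))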

  11. Global precipitation measurements for validating climate models

    Science.gov (United States)

    Tapiador, F. J.; Navarro, A.; Levizzani, V.; García-Ortega, E.; Huffman, G. J.; Kidd, C.; Kucera, P. A.; Kummerow, C. D.; Masunaga, H.; Petersen, W. A.; Roca, R.; Sánchez, J.-L.; Tao, W.-K.; Turk, F. J.

    2017-11-01

    The advent of global precipitation data sets with increasing temporal span has made it possible to use them for validating climate models. In order to fulfill the requirement of global coverage, existing products integrate satellite-derived retrievals from many sensors with direct ground observations (gauges, disdrometers, radars), which are used as reference for the satellites. While the resulting product can be deemed as the best-available source of quality validation data, awareness of the limitations of such data sets is important to avoid extracting wrong or unsubstantiated conclusions when assessing climate model abilities. This paper provides guidance on the use of precipitation data sets for climate research, including model validation and verification for improving physical parameterizations. The strengths and limitations of the data sets for climate modeling applications are presented, and a protocol for quality assurance of both observational databases and models is discussed. The paper helps elaborate on the recent IPCC AR5 acknowledgment of large observational uncertainties in precipitation observations for climate model validation.

  12. Moving Beyond Streamflow Observations: Lessons From A Multi-Objective Calibration Experiment in the Mississippi Basin

    Science.gov (United States)

    Koppa, A.; Gebremichael, M.; Yeh, W. W. G.

    2017-12-01

    Calibrating hydrologic models in large catchments using a sparse network of streamflow gauges adversely affects the spatial and temporal accuracy of other water balance components, which are important for climate-change, land-use and drought studies. This study combines remote sensing data and the concept of Pareto optimality to address the following questions: 1) What is the impact of streamflow (SF) calibration on the spatio-temporal accuracy of Evapotranspiration (ET), near-surface Soil Moisture (SM) and Total Water Storage (TWS)? 2) What is the best combination of fluxes that can be used to calibrate complex hydrological models such that both the accuracy of streamflow and the spatio-temporal accuracy of ET, SM and TWS are preserved? The study area is the Mississippi Basin in the United States (encompassing HUC-2 regions 5, 6, 7, 9, 10 and 11). 2003 and 2004, two climatologically average years, are chosen for calibration and validation of the Noah-MP hydrologic model. Remotely sensed ET data are sourced from GLEAM, SM from ESA-CCI and TWS from GRACE. Single-objective calibration is carried out using the DDS algorithm; for multi-objective calibration, PA-DDS is used. First, the Noah-MP model is calibrated using a single objective function (minimizing the mean square error) for the outflow from the 6 HUC-2 sub-basins for 2003. Spatial correlograms are used to compare the spatial structure of ET, SM and TWS between the model and the remote sensing data. Spatial maps of RMSE and mean error are used to quantify the impact of calibrating streamflow on the accuracy of the ET, SM and TWS estimates. Next, a multi-objective calibration experiment is set up to determine the Pareto-optimal parameter sets (Pareto front) for the following cases: 1) SF and ET, 2) SF and SM, 3) SF and TWS, 4) SF, ET and SM, 5) SF, ET and TWS, 6) SF, SM and TWS, 7) SF, ET, SM and TWS. The best combination of fluxes that provides the optimal trade-off between accurate streamflow and preserving the spatio
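
    The multi-objective experiment above rests on the notion of Pareto optimality; the sketch below shows a generic non-dominance filter over a table of objective values (lower is better), which is the core operation behind building a Pareto front, independent of the PA-DDS search itself. The error values are made up for illustration.

      # Keep only non-dominated parameter sets from a table of objective values.
      import numpy as np

      def pareto_front(objectives):
          """Boolean mask of non-dominated rows, assuming all columns are minimized."""
          obj = np.asarray(objectives, float)
          keep = np.ones(obj.shape[0], dtype=bool)
          for i in range(obj.shape[0]):
              # i is dominated if another row is <= in every objective and < in at least one
              dominated = np.all(obj <= obj[i], axis=1) & np.any(obj < obj[i], axis=1)
              if dominated.any():
                  keep[i] = False
          return keep

      # Toy trade-off between streamflow error and ET error for five parameter sets
      errors = np.array([[0.20, 0.50],
                         [0.25, 0.30],
                         [0.40, 0.25],
                         [0.45, 0.60],   # dominated by the first two rows
                         [0.22, 0.45]])
      print(pareto_front(errors))        # [ True  True  True False  True]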

  13. Simulation of streamflow and estimation of streamflow constituent loads in the San Antonio River watershed, Bexar County, Texas, 1997-2001

    Science.gov (United States)

    Ockerman, Darwin J.; McNamara, Kenna C.

    2003-01-01

    The U.S. Geological Survey developed watershed models (Hydrological Simulation Program—FORTRAN) to simulate streamflow and estimate streamflow constituent loads from five basins that compose the San Antonio River watershed in Bexar County, Texas. Rainfall and streamflow data collected during 1997–2001 were used to calibrate and test the model. The model was configured so that runoff from various land uses and discharges from other sources (such as wastewater recycling facilities) could be accounted for to indicate sources of streamflow. Simulated streamflow volumes were used with land-use-specific, water-quality data to compute streamflow loads of selected constituents from the various streamflow sources.Model simulations for 1997–2001 indicate that inflow from the upper Medina River (originating outside Bexar County) represents about 22 percent of total streamflow. Recycled wastewater discharges account for about 20 percent and base flow (ground-water inflow to streams) about 18 percent. Storm runoff from various land uses represents about 33 percent. Estimates of sources of streamflow constituent loads indicate recycled wastewater as the largest source of dissolved solids and nitrate plus nitrite nitrogen (about 38 and 66 percent, respectively, of the total loads) during 1997–2001. Stormwater runoff from urban land produced about 49 percent of the 1997–2001 total suspended solids load. Stormwater runoff from residential and commercial land (about 23 percent of the land area) produced about 70 percent of the total lead streamflow load during 1997–2001.
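
    The load estimates above combine simulated flow volumes with land-use-specific concentrations; the essential arithmetic is a unit-converted product of flow and concentration, as in the sketch below (the values are placeholders, not the study's data).

      # Back-of-the-envelope constituent load: load = flow x concentration x unit conversion.
      def daily_load_kg(flow_m3s, conc_mgL):
          """Daily load in kg from mean daily flow (m3/s) and concentration (mg/L)."""
          seconds_per_day = 86400
          litres_per_m3 = 1000
          mg_per_kg = 1e6
          return flow_m3s * seconds_per_day * litres_per_m3 * conc_mgL / mg_per_kg

      # e.g., urban storm runoff of 3.5 m3/s at 180 mg/L total suspended solids
      print(daily_load_kg(3.5, 180.0))   # about 54,432 kg of suspended solids that day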

  14. Validation of Hadronic Models in GEANT4

    Energy Technology Data Exchange (ETDEWEB)

    Koi, Tatsumi; Wright, Dennis H.; /SLAC; Folger, Gunter; Ivanchenko, Vladimir; Kossov, Mikhail; Starkov, Nikolai; /CERN; Heikkinen, Aatos; /Helsinki Inst. of Phys.; Truscott,; Lei, Fan; /QinetiQ; Wellisch, Hans-Peter

    2007-09-26

    Geant4 is a software toolkit for the simulation of the passage of particles through matter. It has abundant hadronic models from thermal neutron interactions to ultra relativistic hadrons. An overview of validations in Geant4 hadronic physics is presented based on thin target measurements. In most cases, good agreement is available between Monte Carlo prediction and experimental data; however, several problems have been detected which require some improvement in the models.

  15. Spatial patterns of March and September streamflow trends in Pacific Northwest Streams, 1958-2008

    Science.gov (United States)

    Chang, Heejun; Jung, Il-Won; Steele, Madeline; Gannett, Marshall

    2012-01-01

    Summer streamflow is a vital water resource for municipal and domestic water supplies, irrigation, salmonid habitat, recreation, and water-related ecosystem services in the Pacific Northwest (PNW) in the United States. This study detects significant negative trends in September absolute streamflow in a majority of 68 stream-gauging stations located on unregulated streams in the PNW from 1958 to 2008. The proportion of March streamflow to annual streamflow increases in most stations over 1,000 m elevation, with a baseflow index of less than 50, while absolute March streamflow does not increase in most stations. The declining trends of September absolute streamflow are strongly associated with seven-day low flow, January–March maximum temperature trends, and the size of the basin (19–7,260 km2), while the increasing trends of the fraction of March streamflow are associated with elevation, April 1 snow water equivalent, March precipitation, center timing of streamflow, and October–December minimum temperature trends. Compared with ordinary least squares (OLS) estimated regression models, spatial error regression and geographically weighted regression (GWR) models effectively remove spatial autocorrelation in residuals. The GWR model results show spatial gradients of local R2 values, with consistently higher local R2 values in the northern Cascades. This finding illustrates that different hydrologic landscape factors, such as geology and seasonal distribution of precipitation, also influence streamflow trends in the PNW. In addition, our spatial analysis model results show that considering various geographic factors helps clarify the dynamics of streamflow trends over a large geographical area, supporting a spatial analysis approach over aspatial OLS-estimated regression models for predicting streamflow trends. Results indicate that transitional rain–snow surface water-dominated basins are likely to have reduced summer streamflow under warming scenarios
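
    The abstract does not name the trend test used; a common nonparametric choice for streamflow records is a Theil-Sen slope with Kendall's tau for significance, sketched below on synthetic September flows as an illustration only.

      # Nonparametric trend detection on an annual September-flow series.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(5)
      years = np.arange(1958, 2009)
      september_flow = 50 - 0.3 * (years - 1958) + rng.normal(0, 5, len(years))  # declining

      slope, intercept, lo, hi = stats.theilslopes(september_flow, years)
      tau, p_value = stats.kendalltau(years, september_flow)

      print(f"Sen slope: {slope:.2f} per year (95% CI {lo:.2f} to {hi:.2f})")
      print(f"Kendall tau: {tau:.2f}, p = {p_value:.3f}")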

  16. The HEPEX Seasonal Streamflow Forecast Intercomparison Project

    Science.gov (United States)

    Wood, A. W.; Schepen, A.; Bennett, J.; Mendoza, P. A.; Ramos, M. H.; Wetterhall, F.; Pechlivanidis, I.

    2016-12-01

    The Hydrologic Ensemble Prediction Experiment (HEPEX; www.hepex.org) has launched an international seasonal streamflow forecasting intercomparison project (SSFIP) with the goal of broadening community knowledge about the strengths and weaknesses of various operational approaches being developed around the world. While some of these approaches have existed for decades (e.g. Ensemble Streamflow Prediction - ESP - in the United States and elsewhere), recent years have seen the proliferation of new operational and experimental streamflow forecasting approaches. These have largely been developed independently in each country, so it is difficult to assess whether the approaches employed in some centers offer more promise for development than others. This motivates us to establish a forecasting testbed to facilitate a diagnostic evaluation of a range of different streamflow forecasting approaches and their components over a common set of catchments, using a common set of validation methods. Rather than prescribing a set of scientific questions from the outset, we are letting the hindcast results and notable differences in methodologies on a watershed-specific basis motivate more targeted analyses and sub-experiments that may provide useful insights. The initial pilot of the testbed involved two approaches - CSIRO's Bayesian joint probability (BJP) and NCAR's sequential regression - for two catchments, each designated by one of the teams (the Murray River, Australia, and the Hungry Horse reservoir drainage area, USA). Additional catchments and approaches are in the process of being added to the testbed. To support this, CSIRO and NCAR have developed data and analysis tools, data standards and protocols to formalize the experiment. These include requirements for cross-validation, verification, reference climatologies, and common predictands. This presentation describes the SSFIP experiments, pilot basin results and scientific findings to date.

  17. Calibration and validation of rockfall models

    Science.gov (United States)

    Frattini, Paolo; Valagussa, Andrea; Zenoni, Stefania; Crosta, Giovanni B.

    2013-04-01

    Calibrating and validating landslide models is extremely difficult due to the particular characteristics of landslides: limited recurrence in time, relatively low frequency of the events, short durability of post-event traces, and poor availability of continuous monitoring data, especially for small landslides and rockfalls. For this reason, most of the rockfall models presented in the literature completely lack calibration and validation of the results. In this contribution, we explore different strategies for rockfall model calibration and validation starting from both an historical event and a full-scale field test. The event occurred in 2012 in Courmayeur (Western Alps, Italy) and caused serious damage to quarrying facilities. This event was studied soon after its occurrence through a field campaign aimed at mapping the blocks arrested along the slope, the shape and location of the detachment area, and the traces of scars associated with impacts of blocks on the slope. The full-scale field test was performed by Geovert Ltd in the Christchurch area (New Zealand) after the 2011 earthquake. During the test, a number of large blocks were mobilized from the upper part of the slope and filmed with high-velocity cameras from different viewpoints. The movies of each released block were analysed to identify the block shape, the propagation path, the location of impacts, the height of the trajectory and the velocity of the block along the path. Both calibration and validation of rockfall models should be based on the optimization of the agreement between the actual trajectories or locations of arrested blocks and the simulated ones. A measure that describes this agreement is therefore needed. For calibration purposes, this measure should be simple enough to allow trial-and-error repetitions of the model for parameter optimization. In this contribution we explore different calibration/validation measures: (1) the percentage of simulated blocks arresting within a buffer of the

  18. Global analysis of seasonal streamflow predictability using an ensemble prediction system and observations from 6192 small catchments worldwide

    NARCIS (Netherlands)

    van Dijk, A.I.J.M.; Peña-Arancibia, J.L.; Wood, E.F.; Sheffield, J.; Beck, H.E.

    2013-01-01

    Key Points: global bimonthly streamflow forecasts show potentially valuable skill; initial catchment conditions are responsible for most skill; skill can be estimated from model performance and theoretical skill. Ideally, a seasonal streamflow forecasting system would ingest skilful climate forecasts

  19. Future Climate Change Impacts on Streamflows of Two Main West Africa River Basins: Senegal and Gambia

    Directory of Open Access Journals (Sweden)

    Ansoumana Bodian

    2018-03-01

    Full Text Available This research investigated the effect of climate change on the two main river basins of Senegal in West Africa: the Senegal and Gambia River Basins. We used downscaled projected future rainfall and potential evapotranspiration based on projected temperature from six General Circulation Models (CanESM2, CNRM, CSIRO, HadGEM2-CC, HadGEM2-ES, and MIROC5) and two scenarios (RCP4.5 and RCP8.5) to force the GR4J model. The GR4J model was calibrated and validated using observed daily rainfall, potential evapotranspiration from observed daily temperature, and streamflow data. For the cross-validation, two periods for each river basin were considered: 1961–1982 and 1983–2004 for the Senegal River Basin at Bafing Makana, and 1969–1985 and 1986–2000 for the Gambia River Basin at Mako. Model efficiency is evaluated using a multi-criteria function (Fagg) which aggregates the Nash and Sutcliffe criterion, the cumulative volume error, and the mean volume error. Alternating periods of simulation for calibration and validation were used. This process allows us to choose the parameters that best reflect the rainfall-runoff relationship. Once the model was calibrated and validated, we simulated streamflow at the Bafing Makana and Mako stations in the near future at a daily scale. The characteristic flow rates were calculated to evaluate their possible evolution under the projected climate scenarios at the 2050 horizon. For the near future (2050 horizon), compared to the 1971–2000 reference period, results showed that for both river basins the multi-model ensemble predicted a decrease of annual streamflow from 8% (Senegal River Basin) to 22% (Gambia River Basin) under the RCP4.5 scenario. Under the RCP8.5 scenario, the decrease is more pronounced: 16% (Senegal River Basin) and 26% (Gambia River Basin). The Gambia River Basin will be more affected by climate change.
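
    A sketch of a multi-criteria efficiency in the spirit of the Fagg function described above; the exact aggregation and weighting used in the study are not given, so the equal-weight combination of NSE, cumulative volume error and mean annual volume error below is an assumption for illustration only.

      # Aggregated calibration objective combining efficiency and volume errors.
      import numpy as np

      def nse(obs, sim):
          return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - np.mean(obs)) ** 2)

      def cumulative_volume_error(obs, sim):
          """Relative error in the total simulated volume over the whole period."""
          return abs(np.sum(sim) - np.sum(obs)) / np.sum(obs)

      def mean_annual_volume_error(obs, sim, years):
          """Mean of the absolute relative volume errors computed year by year."""
          errs = [abs(sim[years == y].sum() - obs[years == y].sum()) / obs[years == y].sum()
                  for y in np.unique(years)]
          return float(np.mean(errs))

      def f_agg(obs, sim, years):
          """Higher is better: reward NSE, penalize the two volume errors equally."""
          return (nse(obs, sim)
                  - cumulative_volume_error(obs, sim)
                  - mean_annual_volume_error(obs, sim, years))

      # Synthetic daily flows over five years
      rng = np.random.default_rng(6)
      years = np.repeat(np.arange(1969, 1974), 365)
      obs = rng.gamma(2.0, 10.0, len(years))
      sim = 0.95 * obs + rng.normal(0, 2, len(years))
      print(f_agg(obs, sim, years))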

  20. A methodology for PSA model validation

    International Nuclear Information System (INIS)

    Unwin, S.D.

    1995-09-01

    This document reports Phase 2 of work undertaken by Science Applications International Corporation (SAIC) in support of the Atomic Energy Control Board's Probabilistic Safety Assessment (PSA) review. A methodology is presented for the systematic review and evaluation of a PSA model. These methods are intended to support consideration of the following question: To within the scope and depth of modeling resolution of a PSA study, is the resultant model a complete and accurate representation of the subject plant? This question was identified as a key PSA validation issue in SAIC's Phase 1 project. The validation methods are based on a model transformation process devised to enhance the transparency of the modeling assumptions. Through conversion to a 'success-oriented' framework, a closer correspondence to plant design and operational specifications is achieved. This can both enhance the scrutability of the model by plant personnel, and provide an alternative perspective on the model that may assist in the identification of deficiencies. The model transformation process is defined and applied to fault trees documented in the Darlington Probabilistic Safety Evaluation. A tentative real-time process is outlined for implementation and documentation of a PSA review based on the proposed methods. (author). 11 refs., 9 tabs., 30 refs

  1. Paleoclimate validation of a numerical climate model

    International Nuclear Information System (INIS)

    Schelling, F.J.; Church, H.W.; Zak, B.D.; Thompson, S.L.

    1994-01-01

    An analysis planned to validate regional climate model results for a past climate state at Yucca Mountain, Nevada, against paleoclimate evidence for the period is described. This analysis, which will use the GENESIS model of global climate nested with the RegCM2 regional climate model, is part of a larger study for DOE's Yucca Mountain Site Characterization Project that is evaluating the impacts of long term future climate change on performance of the potential high level nuclear waste repository at Yucca Mountain. The planned analysis and anticipated results are presented

  2. Validation of the STAFF-5 computer model

    International Nuclear Information System (INIS)

    Fletcher, J.F.; Fields, S.R.

    1981-04-01

    STAFF-5 is a dynamic heat-transfer-fluid-flow stress model designed for computerized prediction of the temperature-stress performance of spent LWR fuel assemblies under storage/disposal conditions. Validation of the temperature-calculating abilities of this model was performed by comparing temperature calculations under specified conditions to experimental data from the Engine Maintenance and Disassembly (EMAD) Fuel Temperature Test Facility and to calculations performed by Battelle Pacific Northwest Laboratory (PNL) using the HYDRA-1 model. The comparisons confirmed the ability of STAFF-5 to calculate representative fuel temperatures over a considerable range of conditions, as a first step in the evaluation and prediction of fuel temperature-stress performance

  3. SPR Hydrostatic Column Model Verification and Validation.

    Energy Technology Data Exchange (ETDEWEB)

    Bettin, Giorgia [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Lord, David [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Rudeen, David Keith [Gram, Inc. Albuquerque, NM (United States)

    2015-10-01

    A Hydrostatic Column Model (HCM) was developed to help differentiate between normal "tight" well behavior and small-leak behavior under nitrogen for testing the pressure integrity of crude oil storage wells at the U.S. Strategic Petroleum Reserve. This effort was motivated by steady, yet distinct, pressure behavior of a series of Big Hill caverns that have been placed under nitrogen for an extended period of time. This report describes the HCM, its functional requirements, the model structure, and the verification and validation process. Different modes of operation are also described, which illustrate how the software can be used to model extended nitrogen monitoring and Mechanical Integrity Tests by predicting wellhead pressures along with nitrogen interface movements. Model verification has shown that the program runs correctly and is implemented as intended. The cavern BH101 long-term nitrogen test was used to validate the model, which showed very good agreement with measured data. This supports the claim that the model is, in fact, capturing the relevant physical phenomena and can be used to make accurate predictions of both wellhead pressure and interface movements.

  4. Assessing the competing roles of model resolution and meteorological forcing fidelity in hyperresolution simulations of snowpack and streamflow in the southern Rocky Mountains

    Science.gov (United States)

    Gochis, D. J.; Dugger, A. L.; Karsten, L. R.; Barlage, M. J.; Sampson, K. M.; Yu, W.; Pan, L.; McCreight, J. L.; Howard, K.; Busto, J.; Deems, J. S.

    2017-12-01

    Hydrometeorological processes vary over comparatively short length scales in regions of complex terrain such as the southern Rocky Mountains. Temperature, precipitation, wind and solar radiation can vary significantly across elevation gradients, terrain landforms and land cover conditions throughout the region. Capturing such variability in hydrologic models can necessitate the use of so-called `hyper-resolution' spatial meshes with effective element spacings of less than 100 m. However, it is often difficult to obtain high-quality meteorological forcings in such regions at those resolutions, which can result in significant uncertainty in fundamental hydrologic model inputs. In this study we examine the comparative influences of meteorological forcing data fidelity and spatial resolution on seasonal simulations of snowpack evolution, runoff and streamflow in a set of high mountain watersheds in southern Colorado. We use the operational NOAA National Water Model configuration of the community WRF-Hydro system as a baseline and compare against it additional model scenarios with differing specifications of meteorological forcing data: with and without topographic downscaling adjustments applied, with and without experimental high-resolution radar-derived precipitation estimates, and with WRF-Hydro configurations of progressively finer spatial resolution. The results suggest a significant influence of meteorological downscaling techniques in controlling the spatial distributions of meltout and runoff timing. The use of radar-derived precipitation has a clear effect on hydrologic simulation skill compared with the use of coarser-resolution background precipitation analyses. Advantages and disadvantages of progressively higher-resolution model configurations, both in terms of computational requirements and model fidelity, are also discussed.

  5. Computing daily mean streamflow at ungaged locations in Iowa by using the Flow Anywhere and Flow Duration Curve Transfer statistical methods

    Science.gov (United States)

    Linhart, S. Mike; Nania, Jon F.; Sanders, Curtis L.; Archfield, Stacey A.

    2012-01-01

    linear regression method and the daily mean streamflow for the 15th day of every other month. The Flow Duration Curve Transfer method was used to estimate unregulated daily mean streamflow from the physical and climatic characteristics of gaged basins. For the Flow Duration Curve Transfer method, daily mean streamflow quantiles at the ungaged site were estimated with the parameter-based regression model, which results in a continuous daily flow-duration curve (the relation between exceedance probability and streamflow for each day of observed streamflow) at the ungaged site. By the use of a reference streamgage, the Flow Duration Curve Transfer is converted to a time series. Data used in the Flow Duration Curve Transfer method were retrieved for 113 continuous-record streamgages in Iowa and within a 50-mile buffer of Iowa. The final statewide regression equations for Iowa were computed by using a weighted-least-squares multiple linear regression method and were computed for the 0.01-, 0.05-, 0.10-, 0.15-, 0.20-, 0.30-, 0.40-, 0.50-, 0.60-, 0.70-, 0.80-, 0.85-, 0.90-, and 0.95-exceedance probability statistics determined from the daily mean streamflow with a reporting limit set at 0.1 ft3/s. The final statewide regression equation for Iowa computed by using left-censored regression techniques was computed for the 0.99-exceedance probability statistic determined from the daily mean streamflow with a low limit threshold and a reporting limit set at 0.1 ft3/s. For the Flow Anywhere method, results of the validation study conducted by using six streamgages show that differences between the root-mean-square error and the mean absolute error ranged from 1,016 to 138 ft3/s, with the larger value signifying a greater occurrence of outliers between observed and estimated streamflows. Root-mean-square-error values ranged from 1,690 to 237 ft3/s. Values of the percent root-mean-square error ranged from 115 percent to 26.2 percent. The logarithm (base 10) streamflow percent root
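
    The core of the Flow Duration Curve Transfer step described above can be sketched as follows: each day's flow at the reference streamgage is converted to an exceedance probability, and streamflow for that day at the ungaged site is read off the regression-estimated flow-duration curve at the same probability. The Python sketch below, with a hypothetical function name and a simple Weibull plotting position, is an illustration of that mapping rather than the report's exact procedure.

      import numpy as np

      def fdc_transfer(ref_daily_flow, ungaged_probs, ungaged_quantiles):
          """Transfer a daily flow series from a reference gage to an ungaged site.

          ref_daily_flow    : observed daily flows at the reference streamgage
          ungaged_probs     : exceedance probabilities of the regression quantiles (e.g. 0.01 ... 0.99)
          ungaged_quantiles : regression-estimated flows at the ungaged site for those probabilities
          """
          ref = np.asarray(ref_daily_flow, float)
          # Empirical exceedance probability of each day at the reference gage
          # (Weibull plotting position; rank 1 = largest flow).
          ranks = ref.argsort()[::-1].argsort() + 1
          exceed_p = ranks / (len(ref) + 1.0)
          # Interpolate the ungaged flow-duration curve at those probabilities.
          p = np.asarray(ungaged_probs, float)
          q = np.asarray(ungaged_quantiles, float)
          order = np.argsort(p)
          return np.interp(exceed_p, p[order], q[order])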

  6. The Global Streamflow Indices and Metadata Archive (GSIM) - Part 2: Quality control, time-series indices and homogeneity assessment

    Science.gov (United States)

    Gudmundsson, Lukas; Do, Hong Xuan; Leonard, Michael; Westra, Seth

    2018-04-01

    This is Part 2 of a two-paper series presenting the Global Streamflow Indices and Metadata Archive (GSIM), which is a collection of daily streamflow observations at more than 30 000 stations around the world. While Part 1 (Do et al., 2018a) describes the data collection process as well as the generation of auxiliary catchment data (e.g. catchment boundary, land cover, mean climate), Part 2 introduces a set of quality-controlled time-series indices representing (i) the water balance, (ii) the seasonal cycle, (iii) low flows and (iv) floods. To this end we first consider the quality of individual daily records using a combination of quality flags from data providers and automated screening methods. Subsequently, streamflow time-series indices are computed at yearly, seasonal and monthly resolution. The paper provides a generalized assessment of the homogeneity of all generated streamflow time-series indices, which can be used to select time series that are suitable for a specific task. The newly generated global set of streamflow time-series indices is made freely available with a digital object identifier at https://doi.pangaea.de/10.1594/PANGAEA.887470 and is expected to foster global freshwater research by acting as a ground truth for model validation or as a basis for assessing the role of human impacts on the terrestrial water cycle. It is hoped that a renewed interest in streamflow data at the global scale will foster efforts in the systematic assessment of data quality and provide momentum to overcome administrative barriers that lead to inconsistencies in global collections of relevant hydrological observations.
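
    As an illustration of the kind of time-series indices described above, the short pandas sketch below computes a few yearly indicators (mean flow, maximum flow, 7-day minimum, and a Q95 low-flow quantile) from a daily streamflow series. The index definitions here are generic examples and are not necessarily the exact definitions used in GSIM.

      import pandas as pd

      def annual_indices(daily_flow: pd.Series) -> pd.DataFrame:
          """Compute a few illustrative yearly streamflow indices from a daily series
          indexed by a DatetimeIndex (units follow the input, e.g. m3/s)."""
          grouped = daily_flow.groupby(daily_flow.index.year)
          min7 = daily_flow.rolling(7, min_periods=7).mean().groupby(daily_flow.index.year).min()
          return pd.DataFrame({
              "mean": grouped.mean(),          # water-balance indicator
              "max": grouped.max(),            # flood indicator
              "min7": min7,                    # 7-day low flow
              "q95": grouped.quantile(0.05),   # flow exceeded 95% of the time
          })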

  7. Natural analogues and radionuclide transport model validation

    International Nuclear Information System (INIS)

    Lever, D.A.

    1987-08-01

    In this paper, some possible roles for natural analogues are discussed from the point of view of those involved with the development of mathematical models for radionuclide transport and with the use of these models in repository safety assessments. The characteristic features of a safety assessment are outlined in order to address the questions of where natural analogues can be used to improve our understanding of the processes involved and where they can assist in validating the models that are used. Natural analogues have the potential to provide useful information about some critical processes, especially long-term chemical processes and migration rates. There is likely to be considerable uncertainty and ambiguity associated with the interpretation of natural analogues, and thus it is their general features which should be emphasized, and models with appropriate levels of sophistication should be used. Experience gained in modelling the Koongarra uranium deposit in northern Australia is drawn upon. (author)

  8. Impact of Rain Gauge Quality Control and Interpolation on Streamflow Simulation: An Application to the Warwick Catchment, Australia

    Directory of Open Access Journals (Sweden)

    Shulun Liu

    2018-01-01

    Rain gauges are widely used to obtain temporally continuous point rainfall records, which are then interpolated into spatially continuous data to force hydrological models. However, rainfall measurements and the interpolation procedure are subject to various uncertainties, which can be reduced by applying quality control and selecting appropriate spatial interpolation approaches. Consequently, the integrated impact of rainfall quality control and interpolation on streamflow simulation has attracted increased attention but has not been fully addressed. This study applies a quality control procedure to the hourly rainfall measurements obtained in the Warwick catchment in eastern Australia. The grid-based daily precipitation from the Australian Water Availability Project was used as a reference. The Pearson correlation coefficient between the daily accumulation of gauged rainfall and the reference data was used to eliminate gauges with significant quality issues. Unrealistic outliers were censored based on a comparison between gauged rainfall and the reference. Four interpolation methods, namely inverse distance weighting (IDW), nearest neighbors (NN), linear spline (LN), and ordinary kriging (OK), were implemented. The four methods were first assessed through a cross-validation using the quality-controlled rainfall data. The impacts of the quality control and interpolation on streamflow simulation were then evaluated through a semi-distributed hydrological model. The results showed that the Nash–Sutcliffe model efficiency coefficient (NSE) and bias of the streamflow simulations were significantly improved after quality control. In the cross-validation, the IDW and OK methods produced good rainfall interpolation, while NN led to the worst result. In terms of the impact on hydrological prediction, IDW led to the streamflow predictions most consistent with the observations, according to the validation at five streamflow-gauged locations
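
    The two steps named above (correlation-based gauge screening against a gridded reference, then inverse-distance-weighted interpolation of the surviving gauges) can be sketched in Python as follows. The correlation threshold, distance power and data structures (aligned, gap-free arrays keyed by gauge id) are illustrative assumptions, not the paper's settings.

      import numpy as np

      def screen_gauges(gauge_daily, reference_daily, min_corr=0.8):
          """Keep gauges whose daily totals correlate with the gridded reference (threshold is illustrative)."""
          keep = {}
          for gid, series in gauge_daily.items():
              r = np.corrcoef(series, reference_daily[gid])[0, 1]
              if r >= min_corr:
                  keep[gid] = series
          return keep

      def idw(xy_gauges, values, xy_target, power=2.0):
          """Inverse-distance-weighted estimate at one target location."""
          xy = np.asarray(xy_gauges, float)
          v = np.asarray(values, float)
          d = np.hypot(*(xy - np.asarray(xy_target, float)).T)
          if np.any(d == 0):                      # target coincides with a gauge
              return v[d == 0][0]
          w = 1.0 / d ** power
          return np.sum(w * v) / np.sum(w)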

  9. Predictive validation of an influenza spread model.

    Directory of Open Access Journals (Sweden)

    Ayaz Hyder

    BACKGROUND: Modeling plays a critical role in mitigating impacts of seasonal influenza epidemics. Complex simulation models are currently at the forefront of evaluating optimal mitigation strategies at multiple scales and levels of organization. Given their evaluative role, these models remain limited in their ability to predict and forecast future epidemics, leading some researchers and public-health practitioners to question their usefulness. The objective of this study is to evaluate the predictive ability of an existing complex simulation model of influenza spread. METHODS AND FINDINGS: We used extensive data on past epidemics to demonstrate the process of predictive validation. This involved generalizing an individual-based model for influenza spread and fitting it to laboratory-confirmed influenza infection data from a single observed epidemic (1998-1999). Next, we used the fitted model and modified two of its parameters based on data on real-world perturbations (vaccination coverage by age group and strain type). Simulating epidemics under these changes allowed us to estimate the deviation/error between the expected epidemic curve under perturbation and observed epidemics taking place from 1999 to 2006. Our model was able to forecast absolute intensity and epidemic peak week several weeks earlier with reasonable reliability, and this depended on the method of forecasting (static or dynamic). CONCLUSIONS: Good predictive ability of influenza epidemics is critical for implementing mitigation strategies in an effective and timely manner. Through the process of predictive validation applied to a current complex simulation model of influenza spread, we provided users of the model (e.g. public-health officials and policy-makers) with quantitative metrics and practical recommendations on mitigating impacts of seasonal influenza epidemics. This methodology may be applied to other models of communicable infectious diseases to test and potentially improve

  10. Predictive Validation of an Influenza Spread Model

    Science.gov (United States)

    Hyder, Ayaz; Buckeridge, David L.; Leung, Brian

    2013-01-01

    Background Modeling plays a critical role in mitigating impacts of seasonal influenza epidemics. Complex simulation models are currently at the forefront of evaluating optimal mitigation strategies at multiple scales and levels of organization. Given their evaluative role, these models remain limited in their ability to predict and forecast future epidemics leading some researchers and public-health practitioners to question their usefulness. The objective of this study is to evaluate the predictive ability of an existing complex simulation model of influenza spread. Methods and Findings We used extensive data on past epidemics to demonstrate the process of predictive validation. This involved generalizing an individual-based model for influenza spread and fitting it to laboratory-confirmed influenza infection data from a single observed epidemic (1998–1999). Next, we used the fitted model and modified two of its parameters based on data on real-world perturbations (vaccination coverage by age group and strain type). Simulating epidemics under these changes allowed us to estimate the deviation/error between the expected epidemic curve under perturbation and observed epidemics taking place from 1999 to 2006. Our model was able to forecast absolute intensity and epidemic peak week several weeks earlier with reasonable reliability and depended on the method of forecasting-static or dynamic. Conclusions Good predictive ability of influenza epidemics is critical for implementing mitigation strategies in an effective and timely manner. Through the process of predictive validation applied to a current complex simulation model of influenza spread, we provided users of the model (e.g. public-health officials and policy-makers) with quantitative metrics and practical recommendations on mitigating impacts of seasonal influenza epidemics. This methodology may be applied to other models of communicable infectious diseases to test and potentially improve their predictive

  11. Seclazone Reactor Modeling And Experimental Validation

    Energy Technology Data Exchange (ETDEWEB)

    Osinga, T. [ETH-Zuerich (Switzerland); Olalde, G. [CNRS Odeillo (France); Steinfeld, A. [PSI and ETHZ (Switzerland)

    2005-03-01

    A numerical model is formulated for the SOLZINC solar chemical reactor for the production of Zn by carbothermal reduction of ZnO. The model involves solving, by the finite-volume technique, a 1D unsteady state energy equation that couples heat transfer to the chemical kinetics for a shrinking packed bed exposed to thermal radiation. Validation is accomplished by comparison with experimentally measured temperature profiles and Zn production rates as a function of time, obtained for a 5-kW solar reactor tested at PSI's solar furnace. (author)

  12. Turbulence Modeling Validation, Testing, and Development

    Science.gov (United States)

    Bardina, J. E.; Huang, P. G.; Coakley, T. J.

    1997-01-01

    The primary objective of this work is to provide accurate numerical solutions for selected flow fields and to compare and evaluate the performance of selected turbulence models with experimental results. Four popular turbulence models have been tested and validated against experimental data of ten turbulent flows. The models are: (1) the two-equation k-omega model of Wilcox, (2) the two-equation k-epsilon model of Launder and Sharma, (3) the two-equation k-omega/k-epsilon SST model of Menter, and (4) the one-equation model of Spalart and Allmaras. The flows investigated are five free shear flows consisting of a mixing layer, a round jet, a plane jet, a plane wake, and a compressible mixing layer; and five boundary layer flows consisting of an incompressible flat plate, a Mach 5 adiabatic flat plate, a separated boundary layer, an axisymmetric shock-wave/boundary layer interaction, and an RAE 2822 transonic airfoil. The experimental data for these flows are well established and have been extensively used in model developments. The results are shown in the following four sections: Part A describes the equations of motion and boundary conditions; Part B describes the model equations, constants, parameters, boundary conditions, and numerical implementation; and Parts C and D describe the experimental data and the performance of the models in the free-shear flows and the boundary layer flows, respectively.

  13. Effect of initial conditions of a catchment on seasonal streamflow prediction using ensemble streamflow prediction (ESP) technique for the Rangitata and Waitaki River basins on the South Island of New Zealand

    Science.gov (United States)

    Singh, Shailesh Kumar; Zammit, Christian; Hreinsson, Einar; Woods, Ross; Clark, Martyn; Hamlet, Alan

    2013-04-01

    Increased access to water is a key pillar of the New Zealand government's plan for economic growth. Variable climatic conditions coupled with market drivers and increased demand on water resources result in critical decisions made by water managers based on climate and streamflow forecasts. Because many of these decisions have serious economic implications (e.g. for irrigated agriculture and electricity generation), accurate forecasts of climate and streamflow are of paramount importance. New Zealand currently does not have a centralized, comprehensive, and state-of-the-art system in place for providing operational seasonal to interannual streamflow forecasts to guide water resources management decisions. As a pilot effort, we implement and evaluate an experimental ensemble streamflow forecasting system for the Waitaki and Rangitata River basins on New Zealand's South Island using a hydrologic simulation model (TopNet) and the familiar ensemble streamflow prediction (ESP) paradigm for estimating forecast uncertainty. To provide a comprehensive database for evaluation of the forecasting system, a set of retrospective model states simulated by the hydrologic model on the first day of each month was first archived for 1972-2009. Then, using the hydrologic simulation model, each of these historical model states was paired with the retrospective temperature and precipitation time series from each historical water year to create a database of retrospective hindcasts. Using the resulting database, the relative importance of initial state variables (such as soil moisture and snowpack) as fundamental drivers of forecast uncertainty was evaluated for different seasons and lead times. The analysis indicates that the sensitivity of flow forecasts to initial condition uncertainty depends on the hydrological regime and the season of the forecast. However, initial conditions do not have a large impact on seasonal flow uncertainties for snow-dominated catchments. Further analysis indicates
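
    A minimal sketch of the ESP paradigm described above is given below: the model is started from one archived initial state and driven with each historical year's forcing to build an ensemble whose spread reflects climate uncertainty. The run_model callable is a hypothetical stand-in for a TopNet-like interface, not the actual TopNet API.

      import numpy as np

      def ensemble_streamflow_prediction(initial_state, historical_forcings, run_model):
          """Classic ESP: one ensemble member per historical forcing year.

          initial_state       : archived model state on the forecast start date (soil moisture, snowpack, ...)
          historical_forcings : mapping {year: precipitation/temperature series for the forecast window}
          run_model           : callable(state, forcing) -> simulated streamflow trace
                                (hypothetical stand-in for the hydrologic model interface)
          """
          return {year: run_model(initial_state, forcing)
                  for year, forcing in historical_forcings.items()}

      def seasonal_volume_quantiles(ensemble, quantiles=(0.1, 0.5, 0.9)):
          """Summarise forecast uncertainty as quantiles of seasonal flow volume across members."""
          volumes = [np.sum(trace) for trace in ensemble.values()]
          return np.quantile(volumes, quantiles)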

  14. Validation of a phytoremediation computer model

    International Nuclear Information System (INIS)

    Corapcioglu, M.Y.; Sung, K.; Rhykerd, R.L.; Munster, C.; Drew, M.

    1999-01-01

    The use of plants to stimulate remediation of contaminated soil is an effective, low-cost cleanup method which can be applied to many different sites. A phytoremediation computer model has been developed to simulate how recalcitrant hydrocarbons interact with plant roots in unsaturated soil. A study was conducted to provide data to validate and calibrate the model. During the study, lysimeters were constructed and filled with soil contaminated with 10 mg kg-1 TNT, PBB and chrysene. Vegetated and unvegetated treatments were conducted in triplicate to obtain data regarding contaminant concentrations in the soil, plant roots, root distribution, microbial activity, plant water use and soil moisture. When given the parameters of time and depth, the model successfully predicted contaminant concentrations under actual field conditions. Other model parameters are currently being evaluated. 15 refs., 2 figs

  15. Understanding uncertainties in future Colorado River streamflow

    Science.gov (United States)

    Julie A. Vano,; Bradley Udall,; Cayan, Daniel; Jonathan T Overpeck,; Brekke, Levi D.; Das, Tapash; Hartmann, Holly C.; Hidalgo, Hugo G.; Hoerling, Martin P; McCabe, Gregory J.; Morino, Kiyomi; Webb, Robert S.; Werner, Kevin; Lettenmaier, Dennis P.

    2014-01-01

    The Colorado River is the primary water source for more than 30 million people in the United States and Mexico. Recent studies that project streamflow changes in the Colorado River all project annual declines, but the magnitudes of the projected decreases range from less than 10% to 45% by the mid-twenty-first century. To understand these differences, we address the questions the management community has raised: Why is there such a wide range of projections of impacts of future climate change on Colorado River streamflow, and how should this uncertainty be interpreted? We identify four major sources of disparities among studies that arise from both methodological and model differences. In order of importance, these are differences in 1) the global climate models (GCMs) and emission scenarios used; 2) the ability of land surface and atmospheric models to simulate properly the high-elevation runoff source areas; 3) the sensitivities of land surface hydrology models to precipitation and temperature changes; and 4) the methods used to statistically downscale GCM scenarios. In accounting for these differences, there is substantial evidence across studies that future Colorado River streamflow will be reduced under the current trajectories of anthropogenic greenhouse gas emissions because of a combination of strong temperature-induced runoff curtailment and reduced annual precipitation. Reconstructions of preinstrumental streamflows provide additional insights; the greatest risk to Colorado River streamflows is a multidecadal drought, like that observed in paleoreconstructions, exacerbated by a steady reduction in flows due to climate change. This could result in decades of sustained streamflows much lower than have been observed in the ~100 years of instrumental record.

  16. Towards policy relevant environmental modeling: contextual validity and pragmatic models

    Science.gov (United States)

    Miles, Scott B.

    2000-01-01

    "What makes for a good model?" In various forms, this question is a question that, undoubtedly, many people, businesses, and institutions ponder with regards to their particular domain of modeling. One particular domain that is wrestling with this question is the multidisciplinary field of environmental modeling. Examples of environmental models range from models of contaminated ground water flow to the economic impact of natural disasters, such as earthquakes. One of the distinguishing claims of the field is the relevancy of environmental modeling to policy and environment-related decision-making in general. A pervasive view by both scientists and decision-makers is that a "good" model is one that is an accurate predictor. Thus, determining whether a model is "accurate" or "correct" is done by comparing model output to empirical observations. The expected outcome of this process, usually referred to as "validation" or "ground truthing," is a stamp on the model in question of "valid" or "not valid" that serves to indicate whether or not the model will be reliable before it is put into service in a decision-making context. In this paper, I begin by elaborating on the prevailing view of model validation and why this view must change. Drawing from concepts coming out of the studies of science and technology, I go on to propose a contextual view of validity that can overcome the problems associated with "ground truthing" models as an indicator of model goodness. The problem of how we talk about and determine model validity has much to do about how we perceive the utility of environmental models. In the remainder of the paper, I argue that we should adopt ideas of pragmatism in judging what makes for a good model and, in turn, developing good models. From such a perspective of model goodness, good environmental models should facilitate communication, convey—not bury or "eliminate"—uncertainties, and, thus, afford the active building of consensus decisions, instead

  17. Concepts of Model Verification and Validation

    International Nuclear Information System (INIS)

    Thacker, B.H.; Doebling, S.W.; Hemez, F.M.; Anderson, M.C.; Pepin, J.E.; Rodriguez, E.A.

    2004-01-01

    Model verification and validation (V&V) is an enabling methodology for the development of computational models that can be used to make engineering predictions with quantified confidence. Model V&V procedures are needed by government and industry to reduce the time, cost, and risk associated with full-scale testing of products, materials, and weapon systems. Quantifying the confidence and predictive accuracy of model calculations provides the decision-maker with the information necessary for making high-consequence decisions. Guidelines and procedures for conducting a model V&V program are currently being defined by a broad spectrum of researchers. This report reviews the concepts involved in such a program. Model V&V is a current topic of great interest to both government and industry. In response to a ban on the production of new strategic weapons and nuclear testing, the Department of Energy (DOE) initiated the Science-Based Stockpile Stewardship Program (SSP). An objective of the SSP is to maintain a high level of confidence in the safety, reliability, and performance of the existing nuclear weapons stockpile in the absence of nuclear testing. This objective has challenged the national laboratories to develop high-confidence tools and methods that can be used to provide credible models needed for stockpile certification via numerical simulation. There has been a significant increase in activity recently to define V&V methods and procedures. The U.S. Department of Defense (DoD) Modeling and Simulation Office (DMSO) is working to develop fundamental concepts and terminology for V&V applied to high-level systems such as ballistic missile defense and battle management simulations. The American Society of Mechanical Engineers (ASME) has recently formed a Standards Committee for the development of V&V procedures for computational solid mechanics models. The Defense Nuclear Facilities Safety Board (DNFSB) has been a proponent of model

  18. Reconstructing pre-instrumental streamflow in Eastern Australia using a water balance approach

    Science.gov (United States)

    Tozer, C. R.; Kiem, A. S.; Vance, T. R.; Roberts, J. L.; Curran, M. A. J.; Moy, A. D.

    2018-03-01

    Streamflow reconstructions based on paleoclimate proxies provide much longer records than the short instrumental period records on which water resource management plans are currently based. In Australia there is a lack of in-situ high resolution paleoclimate proxy records, but remote proxies with teleconnections to Australian climate have utility in producing streamflow reconstructions. Here we investigate, via a case study for a catchment in eastern Australia, the novel use of an Antarctic ice-core based rainfall reconstruction within a Budyko-framework to reconstruct ∼1000 years of annual streamflow. The resulting streamflow reconstruction captures interannual to decadal variability in the instrumental streamflow, validating both the use of the ice core rainfall proxy record and the Budyko-framework method. In the preinstrumental era the streamflow reconstruction shows longer wet and dry epochs and periods of streamflow variability that are higher than observed in the instrumental era. Importantly, for both the instrumental record and preinstrumental reconstructions, the wet (dry) epochs in the rainfall record are shorter (longer) in the streamflow record and this non-linearity must be considered when inferring hydroclimatic risk or historical water availability directly from rainfall proxy records alone. These insights provide a better understanding of present infrastructure vulnerability in the context of past climate variability for eastern Australia. The streamflow reconstruction presented here also provides a better understanding of the range of hydroclimatic variability possible, and therefore represents a more realistic baseline on which to quantify the potential impacts of anthropogenic climate change on water security.
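
    The abstract does not reproduce the specific Budyko-type relationship used, so the sketch below only illustrates the general idea with one common choice, Fu's equation, which converts annual rainfall and potential evapotranspiration into an estimate of actual evapotranspiration and hence annual streamflow. The omega parameter value is a placeholder that would normally be calibrated against instrumental streamflow.

      import numpy as np

      def fu_budyko_streamflow(p_annual, pet_annual, omega=2.6):
          """Estimate annual streamflow Q = P - AET with Fu's Budyko curve.

          AET/P = 1 + PET/P - (1 + (PET/P)**omega)**(1/omega)
          omega is a catchment parameter (2.6 is only an illustrative default).
          """
          p = np.asarray(p_annual, float)
          pet = np.asarray(pet_annual, float)
          phi = pet / p                                   # aridity index
          aet_over_p = 1.0 + phi - (1.0 + phi ** omega) ** (1.0 / omega)
          return p * (1.0 - aet_over_p)                   # annual runoff depth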

  19. Model-Based Method for Sensor Validation

    Science.gov (United States)

    Vatan, Farrokh

    2012-01-01

    Fault detection, diagnosis, and prognosis are essential tasks in the operation of autonomous spacecraft, instruments, and in situ platforms. One of NASA's key mission requirements is robust state estimation. Sensing, using a wide range of sensors and sensor fusion approaches, plays a central role in robust state estimation, and there is a need to diagnose sensor failure as well as component failure. Sensor validation can be considered to be part of the larger effort of improving reliability and safety. The standard methods for solving the sensor validation problem are based on probabilistic analysis of the system, of which the method based on Bayesian networks is the most popular. Therefore, these methods can only predict the most probable faulty sensors, which are subject to the initial probabilities defined for the failures. The method developed in this work is based on a model-based approach and provides the faulty sensors (if any), which can be logically inferred from the model of the system and the sensor readings (observations). The method is also more suitable for systems where it is hard, or even impossible, to find the probability functions of the system. The method starts with a new mathematical description of the problem and develops a very efficient and systematic algorithm for its solution. The method builds on the concepts of analytical redundant relations (ARRs).

  20. Mitigating the Impacts of Climate Nonstationarity on Seasonal Streamflow Predictability in the U.S. Southwest

    Science.gov (United States)

    Lehner, Flavio; Wood, Andrew W.; Llewellyn, Dagmar; Blatchford, Douglas B.; Goodbody, Angus G.; Pappenberger, Florian

    2017-12-01

    Seasonal streamflow predictions provide a critical management tool for water managers in the American Southwest. In recent decades, persistent prediction errors for spring and summer runoff volumes have been observed in a number of watersheds in the American Southwest. While mostly driven by decadal precipitation trends, these errors also relate to the influence of increasing temperature on streamflow in these basins. Here we show that incorporating seasonal temperature forecasts from operational global climate prediction models into streamflow forecasting models adds prediction skill for watersheds in the headwaters of the Colorado and Rio Grande River basins. Current dynamical seasonal temperature forecasts now show sufficient skill to reduce streamflow forecast errors in snowmelt-driven regions. Such predictions can increase the resilience of streamflow forecasting and water management systems in the face of continuing warming as well as decadal-scale temperature variability and thus help to mitigate the impacts of climate nonstationarity on streamflow predictability.
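
    As a minimal illustration of the idea described above (adding a seasonal temperature forecast as a predictor in a statistical water-supply forecast), the Python sketch below fits an ordinary least-squares regression of seasonal runoff volume on snowpack, accumulated precipitation and a temperature forecast. The predictor set and variable names are hypothetical and do not represent the operational forecast procedure.

      import numpy as np

      def fit_volume_forecast(swe, winter_precip, temp_forecast, observed_volume):
          """Least-squares fit of seasonal runoff volume on snowpack, precipitation and a
          seasonal temperature forecast (the predictor set is illustrative only)."""
          swe = np.asarray(swe, float)
          X = np.column_stack([np.ones_like(swe), swe,
                               np.asarray(winter_precip, float),
                               np.asarray(temp_forecast, float)])
          coef, *_ = np.linalg.lstsq(X, np.asarray(observed_volume, float), rcond=None)
          return coef

      def predict_volume(coef, swe, winter_precip, temp_forecast):
          """Apply the fitted coefficients to new predictor values."""
          return coef[0] + coef[1] * swe + coef[2] * winter_precip + coef[3] * temp_forecast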

  1. Assessment model validity document FARF31

    International Nuclear Information System (INIS)

    Elert, Mark; Gylling Bjoern; Lindgren, Maria

    2004-08-01

    The prime goal of model validation is to build confidence in the model concept and that the model is fit for its intended purpose. In other words: Does the model predict transport in fractured rock adequately to be used in repository performance assessments? Are the results reasonable for the type of modelling tasks the model is designed for? Commonly, in performance assessments a large number of realisations of flow and transport is made to cover the associated uncertainties. Thus, the flow and transport including radioactive chain decay are preferably calculated in the same model framework. A rather sophisticated concept is necessary to be able to model flow and radionuclide transport in the near field and far field of a deep repository, also including radioactive chain decay. In order to avoid excessively long computational times there is a need for well-based simplifications. For this reason, the far field code FARF31 is made relatively simple, and calculates transport by using averaged entities to represent the most important processes. FARF31 has been shown to be suitable for the performance assessments within the SKB studies, e.g. SR 97. Among the advantages are that it is a fast, simple and robust code, which enables handling of many realisations with wide spread in parameters in combination with chain decay of radionuclides. Being a component in the model chain PROPER, it is easy to assign statistical distributions to the input parameters. Due to the formulation of the advection-dispersion equation in FARF31 it is possible to perform the groundwater flow calculations separately. The basis for the modelling is a stream tube, i.e. a volume of rock including fractures with flowing water, with the walls of the imaginary stream tube defined by streamlines. The transport within the stream tube is described using a dual porosity continuum approach, where it is assumed that rock can be divided into two distinct domains with different types of porosity
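
    For reference, a generic one-dimensional dual-porosity transport formulation of the kind described above (advection and dispersion along a stream tube, linear decay, and diffusive exchange with an immobile matrix) can be written as follows; this is a textbook form for a single nuclide, not necessarily the exact FARF31 equations, and chain decay would add coupled ingrowth terms.

      R_f \frac{\partial c_f}{\partial t} + v \frac{\partial c_f}{\partial x}
        - D_L \frac{\partial^2 c_f}{\partial x^2} + \lambda R_f c_f
        = \frac{\theta_m D_e}{\delta} \left. \frac{\partial c_m}{\partial z} \right|_{z=0},
      \qquad
      R_m \frac{\partial c_m}{\partial t} + \lambda R_m c_m = D_e \frac{\partial^2 c_m}{\partial z^2},

    where c_f and c_m are the nuclide concentrations in the flowing fracture water and in the matrix pore water, v is the advection velocity, D_L the longitudinal dispersion coefficient, D_e the effective matrix diffusivity, R_f and R_m the retardation factors, theta_m the matrix porosity, lambda the decay constant, z the distance into the matrix from the fracture wall, and delta a geometric factor related to the flow-wetted surface; continuity requires c_m = c_f at z = 0.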

  2. Assessment model validity document FARF31

    Energy Technology Data Exchange (ETDEWEB)

    Elert, Mark; Gylling Bjoern; Lindgren, Maria [Kemakta Konsult AB, Stockholm (Sweden)

    2004-08-01

    The prime goal of model validation is to build confidence in the model concept and that the model is fit for its intended purpose. In other words: Does the model predict transport in fractured rock adequately to be used in repository performance assessments? Are the results reasonable for the type of modelling tasks the model is designed for? Commonly, in performance assessments a large number of realisations of flow and transport is made to cover the associated uncertainties. Thus, the flow and transport including radioactive chain decay are preferably calculated in the same model framework. A rather sophisticated concept is necessary to be able to model flow and radionuclide transport in the near field and far field of a deep repository, also including radioactive chain decay. In order to avoid excessively long computational times there is a need for well-based simplifications. For this reason, the far field code FARF31 is made relatively simple, and calculates transport by using averaged entities to represent the most important processes. FARF31 has been shown to be suitable for the performance assessments within the SKB studies, e.g. SR 97. Among the advantages are that it is a fast, simple and robust code, which enables handling of many realisations with wide spread in parameters in combination with chain decay of radionuclides. Being a component in the model chain PROPER, it is easy to assign statistical distributions to the input parameters. Due to the formulation of the advection-dispersion equation in FARF31 it is possible to perform the groundwater flow calculations separately. The basis for the modelling is a stream tube, i.e. a volume of rock including fractures with flowing water, with the walls of the imaginary stream tube defined by streamlines. The transport within the stream tube is described using a dual porosity continuum approach, where it is assumed that rock can be divided into two distinct domains with different types of porosity

  3. Validation of the filament winding process model

    Science.gov (United States)

    Calius, Emilo P.; Springer, George S.; Wilson, Brian A.; Hanson, R. Scott

    1987-01-01

    Tests were performed toward validating the WIND model developed previously for simulating the filament winding of composite cylinders. In these tests two 24 in. long, 8 in. diam and 0.285 in. thick cylinders, made of IM-6G fibers and HBRF-55 resin, were wound at + or - 45 deg angle on steel mandrels. The temperatures on the inner and outer surfaces and inside the composite cylinders were recorded during oven cure. The temperatures inside the cylinders were also calculated by the WIND model. The measured and calculated temperatures were then compared. In addition, the degree of cure and resin viscosity distributions inside the cylinders were calculated for the conditions which existed in the tests.

  4. Current and future carbon export by the Connecticut River: using streamflow data archives and rating curves to model annual and seasonal constituent loads under future discharge scenarios

    Science.gov (United States)

    Petsch, S.; Armfield, J. R.

    2013-12-01

    , DOC, POC and TSS export are heavily skewed towards spring discharge events, and the effective discharge for these constituents is most likely to occur in spring. A Monte Carlo simulation of annual and seasonal discharge was developed from a probability distribution parameterized by the mean and standard deviation of the log-transformed daily discharge archive. This model was used to generate estimates of annual and seasonal export of TSS, POC and DOC, C/N ratios and weight % OC. Future changes in discharge were simulated in the model to examine effects on constituent loads. For example, shifts to more winter and less spring discharge, mimicking reduced snowpack and spring freshet, result in substantially greater TSS and POC export. Increased summer and fall discharge, mimicking an increase in tropical storms, yields only modest increases in constituent export, due to the overall low discharge values during these seasons. These results highlight the importance of long-term streamflow and constituent datasets in determining current annual and seasonal carbon export from river systems, and for generating predictions of the changes in carbon export that result from future hydrological and climatic change.
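
    A minimal sketch of the approach described above, under assumed parameter values, is given below: daily discharge is sampled from a lognormal distribution fitted to the log-transformed discharge archive, and constituent loads are obtained from a power-law rating curve. The function name and the rating-curve coefficients are placeholders, not the study's fitted values.

      import numpy as np

      rng = np.random.default_rng(42)

      def simulate_annual_load(daily_q_archive, a, b, n_days=365, n_sims=1000):
          """Monte Carlo distribution of annual constituent load (e.g. DOC).

          daily_q_archive : historical daily discharge record used to fit the lognormal model
          a, b            : rating-curve coefficients in load = a * Q**b (placeholders)
          """
          logq = np.log(np.asarray(daily_q_archive, float))
          mu, sigma = logq.mean(), logq.std()
          q_sim = rng.lognormal(mu, sigma, size=(n_sims, n_days))   # synthetic daily discharge
          daily_load = a * q_sim ** b                               # rating-curve transform
          return daily_load.sum(axis=1)                             # one annual load per simulation

    Seasonal scenarios (e.g. shifting discharge from spring to winter) would then be represented by resampling with season-specific distribution parameters before applying the same rating curves.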

  5. Impact of LUCC on streamflow based on the SWAT model over the Wei River basin on the Loess Plateau in China

    Directory of Open Access Journals (Sweden)

    H. Wang

    2017-04-01

    impact on both soil flow and baseflow by compensating for reduced surface runoff, which leads to a slight increase in the streamflow in the Wei River with the mixed landscapes on the Loess Plateau that include earth–rock mountain area.

  6. Atmospheric corrosion: statistical validation of models

    International Nuclear Information System (INIS)

    Diaz, V.; Martinez-Luaces, V.; Guineo-Cobs, G.

    2003-01-01

    In this paper we discuss two different methods for the validation of regression models, applied to corrosion data. One of them is based on the correlation coefficient and the other is the statistical lack-of-fit test. Both methods are used here to analyse the fitting of the bilogarithmic model in order to predict corrosion of very low carbon steel substrates in rural and urban-industrial atmospheres in Uruguay. Results for parameters A and n of the bilogarithmic model are reported here. For this purpose, all repeated values were used instead of the usual average values. Modelling is carried out using experimental data corresponding to steel substrates under the same initial meteorological conditions (in fact, they are put in the rack at the same time). Results based on the correlation coefficient are compared with the lack-of-fit test at two different significance levels (α=0.01 and α=0.05). Unexpected differences between them are explained and, finally, it is possible to conclude, at least in the studied atmospheres, that the bilogarithmic model does not properly fit the experimental data. (Author) 18 refs.
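
    For reference, the bilogarithmic model referred to above and the standard lack-of-fit F statistic (applicable because replicate observations are available at each exposure time) take the following form; the notation here is generic rather than copied from the paper.

      C = A\,t^{\,n} \quad\Longleftrightarrow\quad \log C = \log A + n\,\log t,
      \qquad
      F = \frac{SS_{\mathrm{LOF}}/(c - p)}{SS_{\mathrm{PE}}/(N - c)},

    where C is the corrosion penetration after exposure time t, A and n are the fitted parameters, SS_LOF and SS_PE are the lack-of-fit and pure-error sums of squares, c is the number of distinct exposure times, p the number of regression parameters (here p = 2), and N the total number of observations; the fit is rejected when F exceeds the critical value of the F distribution with (c - p, N - c) degrees of freedom at the chosen significance level (α = 0.01 or 0.05 in the study).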

  7. Hydrologic functioning of the deep Critical Zone and contributions to streamflow in a high elevation catchment: testing of multiple conceptual models

    Science.gov (United States)

    Dwivedi, R.; Meixner, T.; McIntosh, J. C.; Ferre, T. P. A.; Eastoe, C. J.; Minor, R. L.; Barron-Gafford, G.; Chorover, J.

    2017-12-01

    The composition of natural mountainous waters maintains important control over the water quality available to downstream users. Furthermore, the geochemical constituents of stream water in the mountainous catchments represent the result of the spatial and temporal evolution of critical zone structure and processes. A key problem is that high elevation catchments involve rugged terrain and are subject to extreme climate and landscape gradients; therefore, high density or high spatial resolution hydro-geochemical observations are rare. Despite such difficulties, the Santa Catalina Mountains Critical Zone Observatory (SCM-CZO), Tucson, AZ, generates long-term hydrogeochemical data for understanding not only hydrological processes and their seasonal characters, but also the geochemical impacts of such processes on streamflow chemical composition. Using existing instrumentation and hydrogeochemical observations from the last 9+ years (2009 through 2016 and an initial part of 2017), we employed a multi-tracer approach along with principal component analysis to identify water sources and their seasonal character. We used our results to inform hydrological process understanding (flow paths, residence times, and water sources) for our study site. Our results indicate that soil water is the largest contributor to streamflow, which is ephemeral in nature. Although a 3-dimensional mixing space involving precipitation, soil water, interflow, and deep groundwater end-members could explain most of the streamflow chemistry, geochemical complexity was observed to grow with catchment storage. In terms of processes and their seasonal character, we found soil water and interflow were the primary end-member contributors to streamflow in all seasons. Deep groundwater only contributes to streamflow at high catchment storage conditions, but it provides major ions such as Na, Mg, and Ca that are lacking in other water types. In this way, our results indicate that any future efforts aimed

  8. SDG and qualitative trend based model multiple scale validation

    Science.gov (United States)

    Gao, Dong; Xu, Xin; Yin, Jianjin; Zhang, Hongyu; Zhang, Beike

    2017-09-01

    Verification, Validation and Accreditation (VV&A) is a key technology of simulation and modelling. Traditional model validation methods suffer from weak completeness, are carried out at a single scale, and depend on human experience. A multiple-scale validation method based on the SDG (Signed Directed Graph) and qualitative trends is therefore proposed. First, the SDG model is built and qualitative trends are added to it; complete testing scenarios are then produced by positive inference. The multiple-scale validation is carried out by comparing the testing scenarios with the outputs of the simulation model at different scales. Finally, the effectiveness of the approach is demonstrated by validating a reactor model.

  9. A model to forecast short-term snowmelt runoff using synoptic observations of streamflow, temperature, and precipitation

    Science.gov (United States)

    Tangborn, Wendell V.

    1980-01-01

    Snowmelt runoff is forecast with a statistical model that utilizes daily values of stream discharge, gaged precipitation, and maximum and minimum observations of air temperature. Synoptic observations of these variables are made at existing low- and medium-altitude weather stations, thus eliminating the difficulties and expense of new, high-altitude installations. Four model development steps are used to demonstrate the influence on prediction accuracy of basin storage, a preforecast test season, air temperature (to estimate ablation), and a prediction based on storage. Daily ablation is determined by a technique that employs both mean temperature and a radiative index. Radiation (both long- and short-wave components) is approximated by using the range in daily temperature, which is shown to be closely related to mean cloud cover. A technique based on the relationship between prediction error and prediction season weather utilizes short-term forecasts of precipitation and temperature to improve the final prediction. Verification of the model is accomplished by a split sampling technique for the 1960–1977 period. Short- term (5–15 days) predictions of runoff throughout the main snowmelt season are demonstrated for mountain drainages in western Washington, south-central Arizona, western Montana, and central California. The coefficient of prediction (Cp) based on actual, short-term predictions for 18 years is for Thunder Creek (Washington), 0.69; for South Fork Flathead River (Montana), 0.45; for the Black River (Arizona), 0.80; and for the Kings River (California), 0.80.
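
    The model's exact ablation formulation is not reproduced here; the Python sketch below is only an illustration of a temperature-index approach in which mean daily temperature drives melt and the daily temperature range serves as a crude radiation (cloud-cover) proxy, as the abstract describes. The coefficients are placeholders, not the paper's calibrated values.

      import numpy as np

      def daily_ablation(t_max, t_min, melt_factor=3.0, radiation_factor=0.5):
          """Illustrative daily snow ablation (mm/day) from max/min air temperature.

          melt_factor      : degree-day factor in mm per deg C per day (placeholder)
          radiation_factor : weight on the daily temperature range, used as a crude
                             proxy for clear-sky radiation (placeholder)
          """
          t_max = np.asarray(t_max, float)
          t_min = np.asarray(t_min, float)
          t_mean = 0.5 * (t_max + t_min)
          t_range = t_max - t_min
          melt = melt_factor * t_mean + radiation_factor * t_range
          return np.where(t_mean > 0.0, melt, 0.0)    # no ablation below freezing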

  10. Unit testing, model validation, and biological simulation.

    Science.gov (United States)

    Sarma, Gopal P; Jacobs, Travis W; Watts, Mark D; Ghayoomie, S Vahid; Larson, Stephen D; Gerkin, Richard C

    2016-01-01

    The growth of the software industry has gone hand in hand with the development of tools and cultural practices for ensuring the reliability of complex pieces of software. These tools and practices are now acknowledged to be essential to the management of modern software. As computational models and methods have become increasingly common in the biological sciences, it is important to examine how these practices can accelerate biological software development and improve research quality. In this article, we give a focused case study of our experience with the practices of unit testing and test-driven development in OpenWorm, an open-science project aimed at modeling Caenorhabditis elegans. We identify and discuss the challenges of incorporating test-driven development into a heterogeneous, data-driven project, as well as the role of model validation tests, a category of tests unique to software which expresses scientific models.
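
    To make the distinction between ordinary unit tests and model validation tests concrete, the pytest-style sketch below shows one of each; the simulated quantity, the expected value and the tolerance are purely hypothetical and are not taken from the OpenWorm project.

      import numpy as np
      import pytest

      def simulate_resting_potential():
          """Hypothetical stand-in for a model simulation; returns a resting potential in mV."""
          return -64.8

      def test_resting_potential_matches_experiment():
          """Model validation test: the simulated value must fall within an
          experimentally reported range (values here are illustrative)."""
          assert simulate_resting_potential() == pytest.approx(-65.0, abs=2.0)

      def test_output_is_finite():
          """Ordinary unit test: a basic sanity check independent of experimental data."""
          assert np.isfinite(simulate_resting_potential())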

  11. Sensitivity of streamflow to climate change in California

    Science.gov (United States)

    Grantham, T.; Carlisle, D.; Wolock, D.; McCabe, G. J.; Wieczorek, M.; Howard, J.

    2015-12-01

    Trends of decreasing snowpack and increasing risk of drought are looming challenges for California water resource management. Increasing vulnerability of the state's natural water supplies threatens California's social-economic vitality and the health of its freshwater ecosystems. Despite growing awareness of potential climate change impacts, robust management adaptation has been hindered by substantial uncertainty in future climate predictions for the region. Down-scaled global climate model (GCM) projections uniformly suggest future warming of the region, but projections are highly variable with respect to the direction and magnitude of change in regional precipitation. Here we examine the sensitivity of California surface water supplies to climate variation independently of GCMs. We use a statistical approach to construct predictive models of monthly streamflow based on historical climate and river basin features. We then propagate an ensemble of synthetic climate simulations through the models to assess potential streamflow responses to changes in temperature and precipitation in different months and regions of the state. We also consider the range of streamflow change predicted by bias-corrected downscaled GCMs. Our results indicate that the streamflow in the xeric and coastal mountain regions of California is more sensitive to changes in precipitation than temperature, whereas streamflow in the interior mountain region responds strongly to changes in both temperature and precipitation. Mean climate projections for 2025-2075 from GCM ensembles are highly variable, indicating streamflow changes of -50% to +150% relative to baseline (1980-2010) for most months and regions. By quantifying the sensitivity of streamflow to climate change, rather than attempting to predict future hydrologic conditions based on uncertain GCM projections, these results should be more informative to water managers seeking to assess, and potentially reduce, the vulnerability of surface

  12. A Self-Calibrating Runoff and Streamflow Remote Sensing Model for Ungauged Basins Using Open-Access Earth Observation Data

    Directory of Open Access Journals (Sweden)

    Ate Poortinga

    2017-01-01

    Due to increasing pressures on water resources, there is a need to monitor regional water resource availability in a spatially and temporally explicit manner. However, for many parts of the world, there are insufficient data to quantify streamflow or groundwater infiltration rates. We present the results of a pixel-based water balance formulation that partitions rainfall into evapotranspiration, surface water runoff and potential groundwater infiltration. The method leverages remote-sensing-derived estimates of precipitation, evapotranspiration, soil moisture and Leaf Area Index, and a single F coefficient to distinguish between runoff and storage changes. The study produced significant correlations between the remote sensing method and field-based measurements of river flow in two Vietnamese river basins. For the Ca basin, we found R2 values ranging from 0.88 to 0.97 and Nash–Sutcliffe efficiency (NSE) values varying between 0.44 and 0.88. The R2 for the Red River varied between 0.87 and 0.93, with NSE values between 0.61 and 0.79. Based on these findings, we conclude that the method allows for a fast and cost-effective way to map water resource availability in basins with no gauges or monitoring infrastructure, without the need for sophisticated hydrological models or resource-intensive data.
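
    The abstract does not give the exact partitioning scheme, so the sketch below is only a simple illustrative bucket balance per pixel: rainfall minus evapotranspiration updates a soil store, and any surplus above the store's capacity is split between surface runoff and potential infiltration by a coefficient f, loosely analogous to the single F coefficient mentioned above. All parameter values are placeholders.

      import numpy as np

      def pixel_water_balance(precip, et, f=0.6, s_max=150.0, s0=50.0):
          """Illustrative per-pixel water balance (all fluxes in mm per time step).

          f     : fraction of surplus routed to surface runoff (the remainder is treated
                  as potential groundwater infiltration); value is a placeholder
          s_max : soil storage capacity in mm (placeholder)
          s0    : initial soil storage in mm (placeholder)
          """
          storage, runoff, recharge = s0, [], []
          for p, e in zip(precip, et):
              storage = max(storage + p - e, 0.0)      # update the soil store
              surplus = max(storage - s_max, 0.0)      # water the store cannot hold
              storage -= surplus
              runoff.append(f * surplus)
              recharge.append((1.0 - f) * surplus)
          return np.array(runoff), np.array(recharge)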

  13. Validation of A Global Hydrological Model

    Science.gov (United States)

    Doell, P.; Lehner, B.; Kaspar, F.; Vassolo, S.

    due to the precipitation measurement errors. Even though the explicit modeling of wetlands and lakes leads to a much improved modeling of both the vertical water balance and the lateral transport of water, not enough information is included in WGHM to accurately capture the hydrology of these water bodies. Certainly, the reliability of model results is highest at the locations at which WGHM was calibrated. The validation indicates that reliability for cells inside calibrated basins is satisfactory if the basin is relatively homogeneous. Analyses of the few available stations outside of calibrated basins indicate a reasonably high model reliability, particularly in humid regions.

  14. Validation of HEDR models. Hanford Environmental Dose Reconstruction Project

    Energy Technology Data Exchange (ETDEWEB)

    Napier, B.A.; Simpson, J.C.; Eslinger, P.W.; Ramsdell, J.V. Jr.; Thiede, M.E.; Walters, W.H.

    1994-05-01

    The Hanford Environmental Dose Reconstruction (HEDR) Project has developed a set of computer models for estimating the possible radiation doses that individuals may have received from past Hanford Site operations. This document describes the validation of these models. In the HEDR Project, the model validation exercise consisted of comparing computational model estimates with limited historical field measurements and experimental measurements that are independent of those used to develop the models. The results of any one test do not mean that a model is valid. Rather, the collection of tests together provide a level of confidence that the HEDR models are valid.

  15. Geochemistry Model Validation Report: Material Degradation and Release Model

    Energy Technology Data Exchange (ETDEWEB)

    H. Stockman

    2001-09-28

    The purpose of this Analysis and Modeling Report (AMR) is to validate the Material Degradation and Release (MDR) model that predicts degradation and release of radionuclides from a degrading waste package (WP) in the potential monitored geologic repository at Yucca Mountain. This AMR is prepared according to "Technical Work Plan for: Waste Package Design Description for LA" (Ref. 17). The intended use of the MDR model is to estimate the long-term geochemical behavior of waste packages (WPs) containing U.S. Department of Energy (DOE) Spent Nuclear Fuel (SNF) codisposed with High Level Waste (HLW) glass, commercial SNF, and Immobilized Plutonium Ceramic (Pu-ceramic) codisposed with HLW glass. The model is intended to predict (1) the extent to which criticality control material, such as gadolinium (Gd), will remain in the WP after corrosion of the initial WP, (2) the extent to which fissile Pu and uranium (U) will be carried out of the degraded WP by infiltrating water, and (3) the chemical composition and amounts of minerals and other solids left in the WP. The results of the model are intended for use in criticality calculations. The scope of the model validation report is to (1) describe the MDR model, and (2) compare the modeling results with experimental studies. A test case based on a degrading Pu-ceramic WP is provided to help explain the model. This model does not directly feed the assessment of system performance. The output from this model is used by several other models, such as the configuration generator, criticality, and criticality consequence models, prior to the evaluation of system performance. This document has been prepared according to AP-3.10Q, "Analyses and Models" (Ref. 2), and prepared in accordance with the technical work plan (Ref. 17).

  16. Geochemistry Model Validation Report: Material Degradation and Release Model

    International Nuclear Information System (INIS)

    Stockman, H.

    2001-01-01

    The purpose of this Analysis and Modeling Report (AMR) is to validate the Material Degradation and Release (MDR) model that predicts degradation and release of radionuclides from a degrading waste package (WP) in the potential monitored geologic repository at Yucca Mountain. This AMR is prepared according to "Technical Work Plan for: Waste Package Design Description for LA" (Ref. 17). The intended use of the MDR model is to estimate the long-term geochemical behavior of waste packages (WPs) containing U.S. Department of Energy (DOE) Spent Nuclear Fuel (SNF) codisposed with High Level Waste (HLW) glass, commercial SNF, and Immobilized Plutonium Ceramic (Pu-ceramic) codisposed with HLW glass. The model is intended to predict (1) the extent to which criticality control material, such as gadolinium (Gd), will remain in the WP after corrosion of the initial WP, (2) the extent to which fissile Pu and uranium (U) will be carried out of the degraded WP by infiltrating water, and (3) the chemical composition and amounts of minerals and other solids left in the WP. The results of the model are intended for use in criticality calculations. The scope of the model validation report is to (1) describe the MDR model, and (2) compare the modeling results with experimental studies. A test case based on a degrading Pu-ceramic WP is provided to help explain the model. This model does not directly feed the assessment of system performance. The output from this model is used by several other models, such as the configuration generator, criticality, and criticality consequence models, prior to the evaluation of system performance. This document has been prepared according to AP-3.10Q, "Analyses and Models" (Ref. 2), and prepared in accordance with the technical work plan (Ref. 17)

  17. Streamflow impacts of biofuel policy-driven landscape change.

    Science.gov (United States)

    Khanal, Sami; Anex, Robert P; Anderson, Christopher J; Herzmann, Daryl E

    2014-01-01

    Likely changes in precipitation (P) and potential evapotranspiration (PET) resulting from policy-driven expansion of bioenergy crops in the United States are shown to create significant changes in streamflow volumes and increase water stress in the High Plains. Regional climate simulations for current and biofuel cropping system scenarios are evaluated using the same atmospheric forcing data over the period 1979-2004 using the Weather Research Forecast (WRF) model coupled to the NOAH land surface model. PET is projected to increase under the biofuel crop production scenario. The magnitude of the mean annual increase in PET is larger than the inter-annual variability of change in PET, indicating that PET increase is a forced response to the biofuel cropping system land use. Across the conterminous U.S., the change in mean streamflow volume under the biofuel scenario is estimated to range from negative 56% to positive 20% relative to a business-as-usual baseline scenario. In Kansas and Oklahoma, annual streamflow volume is reduced by an average of 20%, and this reduction in streamflow volume is due primarily to increased PET. Predicted increase in mean annual P under the biofuel crop production scenario is lower than its inter-annual variability, indicating that additional simulations would be necessary to determine conclusively whether predicted change in P is a response to biofuel crop production. Although estimated changes in streamflow volume include the influence of P change, sensitivity results show that PET change is the significantly dominant factor causing streamflow change. Higher PET and lower streamflow due to biofuel feedstock production are likely to increase water stress in the High Plains. When pursuing sustainable biofuels policy, decision-makers should consider the impacts of feedstock production on water scarcity.

  18. Streamflow impacts of biofuel policy-driven landscape change.

    Directory of Open Access Journals (Sweden)

    Sami Khanal

    Full Text Available Likely changes in precipitation (P) and potential evapotranspiration (PET) resulting from policy-driven expansion of bioenergy crops in the United States are shown to create significant changes in streamflow volumes and increase water stress in the High Plains. Regional climate simulations for current and biofuel cropping system scenarios are evaluated using the same atmospheric forcing data over the period 1979-2004 using the Weather Research Forecast (WRF) model coupled to the NOAH land surface model. PET is projected to increase under the biofuel crop production scenario. The magnitude of the mean annual increase in PET is larger than the inter-annual variability of change in PET, indicating that PET increase is a forced response to the biofuel cropping system land use. Across the conterminous U.S., the change in mean streamflow volume under the biofuel scenario is estimated to range from negative 56% to positive 20% relative to a business-as-usual baseline scenario. In Kansas and Oklahoma, annual streamflow volume is reduced by an average of 20%, and this reduction in streamflow volume is due primarily to increased PET. Predicted increase in mean annual P under the biofuel crop production scenario is lower than its inter-annual variability, indicating that additional simulations would be necessary to determine conclusively whether predicted change in P is a response to biofuel crop production. Although estimated changes in streamflow volume include the influence of P change, sensitivity results show that PET change is the significantly dominant factor causing streamflow change. Higher PET and lower streamflow due to biofuel feedstock production are likely to increase water stress in the High Plains. When pursuing sustainable biofuels policy, decision-makers should consider the impacts of feedstock production on water scarcity.

  19. Validating agent based models through virtual worlds.

    Energy Technology Data Exchange (ETDEWEB)

    Lakkaraju, Kiran; Whetzel, Jonathan H.; Lee, Jina; Bier, Asmeret Brooke; Cardona-Rivera, Rogelio E.; Bernstein, Jeremy Ray Rhythm

    2014-01-01

    As the US continues its vigilance against distributed, embedded threats, understanding the political and social structure of these groups becomes paramount for predicting and disrupting their attacks. Agent-based models (ABMs) serve as a powerful tool to study these groups. While the popularity of social network tools (e.g., Facebook, Twitter) has provided extensive communication data, there is a lack of fine-grained behavioral data with which to inform and validate existing ABMs. Virtual worlds, in particular massively multiplayer online games (MMOGs), where large numbers of people interact within a complex environment for long periods of time, provide an alternative source of data. These environments provide a rich social setting where players engage in a variety of activities observed between real-world groups: collaborating and/or competing with other groups, conducting battles for scarce resources, and trading in a market economy. Strategies employed by player groups surprisingly reflect those seen in present-day conflicts, where players use diplomacy or espionage as their means for accomplishing their goals. In this project, we propose to address the need for fine-grained behavioral data by acquiring and analyzing game data from a commercial MMOG, referred to within this report as Game X. The goals of this research were: (1) devising toolsets for analyzing virtual world data to better inform the rules that govern a social ABM and (2) exploring how virtual worlds could serve as a source of data to validate ABMs established for analogous real-world phenomena. During this research, we studied certain patterns of group behavior to complement social modeling efforts where a significant lack of detailed examples of observed phenomena exists. This report outlines our work examining group behaviors that underlie what we have termed the Expression-To-Action (E2A) problem: determining the changes in social contact that lead individuals/groups to engage in a particular behavior.

  20. Contribution of multiple climatic variables and human activities to streamflow changes across China

    Science.gov (United States)

    Liu, Jianyu; Zhang, Qiang; Singh, Vijay P.; Shi, Peijun

    2017-02-01

    Using monthly streamflow data from the 1960-2000 period and annual streamflow data from the 2001-2014 period, together with meteorological data for 1960-2014 from 815 meteorological stations across China, the Budyko-based hydrothermal balance model was used to quantitatively evaluate the fractional contributions of climate change and human activities to streamflow changes in ten river basins across China. Particular importance was attached to human activities, such as population density and Gross Domestic Product (GDP), and also to water reservoirs, in terms of their relationship with streamflow changes. Results indicated that: (1) streamflow changes of river basins in northern China were more sensitive to climate change than those of river basins in southern China. In order of decreasing sensitivity, the factors influencing streamflow changes ranked as: precipitation > human activities > relative humidity > solar radiation > maximum temperature > wind speed > minimum temperature. Hence, it can be argued that hydrological systems in northern China are more fragile and more sensitive to a changing environment than those in southern China, making water resources management in northern China more challenging; (2) during 1980-2000, climate change tended to increase streamflow across China and played a dominant role in streamflow variation, although it tended to decrease streamflow in river basins of northern China. Generally, human activities caused a decrease of streamflow across China; (3) in the more recent period of 2001-2014, human activities had increasing or enhancing impacts on streamflow changes, and the fractional contributions of climate change and human activities to streamflow changes were 53.5% and 46.5%, respectively. Increasing human-induced impacts on streamflow changes have the potential to add more uncertainty to the management of water resources at different spatial and temporal scales.
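
    As a rough illustration of the kind of Budyko-based attribution described above, the sketch below splits an observed change in mean annual runoff between two periods into a climate-driven part and a residual attributed to human activities. It assumes Fu's form of the Budyko curve with an illustrative parameter omega; the paper's exact hydrothermal balance formulation, data, and parameter values are not reproduced here, and all numbers in the example are hypothetical.

```python
import numpy as np

def fu_runoff(p, pet, omega=2.6):
    """Mean annual runoff implied by Fu's Budyko curve:
    ET/P = 1 + PET/P - (1 + (PET/P)**omega)**(1/omega)."""
    et = p * (1.0 + pet / p - (1.0 + (pet / p) ** omega) ** (1.0 / omega))
    return p - et

def attribute_change(p1, pet1, q1, p2, pet2, q2, omega=2.6):
    """Split the observed runoff change between two periods into a climate part
    (from the Budyko curve) and a residual human-activity part."""
    dq_total = q2 - q1
    dq_climate = fu_runoff(p2, pet2, omega) - fu_runoff(p1, pet1, omega)
    dq_human = dq_total - dq_climate
    share_climate = 100.0 * abs(dq_climate) / (abs(dq_climate) + abs(dq_human))
    return dq_climate, dq_human, share_climate

# Hypothetical basin: pre-change vs post-change period means (mm/yr)
print(attribute_change(p1=600, pet1=900, q1=120, p2=580, pet2=950, q2=90))
```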

  1. Validation of the community radiative transfer model

    International Nuclear Information System (INIS)

    Ding Shouguo; Yang Ping; Weng Fuzhong; Liu Quanhua; Han Yong; Delst, Paul van; Li Jun; Baum, Bryan

    2011-01-01

    To validate the Community Radiative Transfer Model (CRTM) developed by the U.S. Joint Center for Satellite Data Assimilation (JCSDA), the discrete ordinate radiative transfer (DISORT) model and the line-by-line radiative transfer model (LBLRTM) are combined in order to provide a reference benchmark. Compared with the benchmark, the CRTM appears quite accurate for both clear sky and ice cloud radiance simulations with RMS errors below 0.2 K, except for clouds with small ice particles. In a computer CPU run time comparison, the CRTM is faster than DISORT by approximately two orders of magnitude. Using the operational MODIS cloud products and the European Center for Medium-range Weather Forecasting (ECMWF) atmospheric profiles as input, the CRTM is employed to simulate the Atmospheric Infrared Sounder (AIRS) radiances. The CRTM simulations are shown to be in reasonably close agreement with the AIRS measurements (the discrepancies are within 2 K in terms of brightness temperature difference). Furthermore, the impact of uncertainties in the input cloud properties and atmospheric profiles on the CRTM simulations has been assessed. The CRTM-based brightness temperatures (BTs) at the top of the atmosphere (TOA), for both optically thin and optically thick clouds, are highly sensitive to uncertainties in atmospheric temperature and cloud top pressure. For an optically thick cloud, however, the CRTM-based BTs are not sensitive to the uncertainties of cloud optical thickness, effective particle size, and atmospheric humidity profiles. By contrast, the uncertainties of the CRTM-based TOA BTs resulting from effective particle size and optical thickness are not negligible for an optically thin cloud.

  2. Seasonal Prediction of Taiwan's Streamflow Using Teleconnection Patterns

    Science.gov (United States)

    Chen, Chia-Jeng; Lee, Tsung-Yu

    2017-04-01

    Seasonal streamflow as an integrated response to complex hydro-climatic processes can be subject to activity of prevailing weather systems potentially modulated by large-scale climate oscillations (e.g., El Niño-Southern Oscillation, ENSO). To develop a seamless seasonal forecasting system in Taiwan, this study assesses how significant Taiwan's precipitation and streamflow in different seasons correlate with selected teleconnection patterns. Long-term precipitation and streamflow data in three major precipitation seasons, namely the spring rains (February to April), Mei-Yu (May and June), and typhoon (July to September) seasons, are derived at 28 upstream and 13 downstream catchments in Taiwan. The three seasons depict a complete wet period of Taiwan as well as many regions bearing similar climatic conditions in East Asia. Lagged correlation analysis is then performed to investigate how the precipitation and streamflow data correlate with predominant teleconnection indices at varied lead times. Teleconnection indices are selected only if they show certain linkage with weather systems and activity in the three seasons based on previous literature. For instance, the ENSO and Quasi-Biennial Oscillation, proven to influence East Asian climate across seasons and summer typhoon activity, respectively, are included in the list of climate indices for correlation analysis. Significant correlations found between Taiwan's precipitation and streamflow and teleconnection indices are further examined by a climate regime shift (CRS) test to identify any abrupt changes in the correlations. The understanding of existing CRS is useful for informing the forecasting system of the changes in the predictor-predictand relationship. To evaluate prediction skill in the three seasons and skill differences between precipitation and streamflow, hindcasting experiments of precipitation and streamflow are conducted using stepwise linear regression models. Discussion and suggestions for coping

  3. Validation of the measure automobile emissions model : a statistical analysis

    Science.gov (United States)

    2000-09-01

    The Mobile Emissions Assessment System for Urban and Regional Evaluation (MEASURE) model provides an external validation capability for the hot stabilized option; the model is one of several new modal emissions models designed to predict hot stabilized e...

  4. Dental models made with an intraoral scanner: A validation study.

    NARCIS (Netherlands)

    Cuperus, A.M.; Harms, M.C.; Rangel, F.A.; Bronkhorst, E.M.; Schols, J.G.J.H.; Breuning, K.H.

    2012-01-01

    INTRODUCTION: Our objectives were to determine the validity and reproducibility of measurements on stereolithographic models and 3-dimensional digital dental models made with an intraoral scanner. METHODS: Ten dry human skulls were scanned; from the scans, stereolithographic models and digital

  5. The effects of changing land cover on streamflow simulation in Puerto Rico

    Science.gov (United States)

    A.E. Van Beusekom; L.E. Hay; R.J. Viger; W.A. Gould; J.A. Collazo; A. Henareh Khalyani

    2014-01-01

    This study quantitatively explores whether land cover changes have a substantive impact on simulated streamflow within the tropical island setting of Puerto Rico. The Precipitation Runoff Modeling System (PRMS) was used to compare streamflow simulations based on five static parameterizations of land cover with those based on dynamically varying parameters derived from...

  6. System Advisor Model: Flat Plate Photovoltaic Performance Modeling Validation Report

    Energy Technology Data Exchange (ETDEWEB)

    Freeman, Janine [National Renewable Energy Lab. (NREL), Golden, CO (United States); Whitmore, Jonathan [National Renewable Energy Lab. (NREL), Golden, CO (United States); Kaffine, Leah [National Renewable Energy Lab. (NREL), Golden, CO (United States); Blair, Nate [National Renewable Energy Lab. (NREL), Golden, CO (United States); Dobos, Aron P. [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2013-12-01

    The System Advisor Model (SAM) is a free software tool that performs detailed analysis of both system performance and system financing for a variety of renewable energy technologies. This report provides detailed validation of the SAM flat plate photovoltaic performance model by comparing SAM-modeled PV system generation data to actual measured production data for nine PV systems ranging from 75 kW to greater than 25 MW in size. The results show strong agreement between SAM predictions and field data, with annualized prediction error below 3% for all fixed tilt cases and below 8% for all one axis tracked cases. The analysis concludes that snow cover and system outages are the primary sources of disagreement, and other deviations resulting from seasonal biases in the irradiation models and one axis tracking issues are discussed in detail.

  7. Assessing the Impact of Climate Change on Extreme Streamflow and Reservoir Operation for Nuuanu Watershed, Oahu, Hawaii

    Science.gov (United States)

    Leta, O. T.; El-Kadi, A. I.; Dulaiova, H.

    2016-12-01

    Extreme events, such as flooding and drought, are expected to occur at increased frequencies worldwide due to climate change influencing the water cycle. This is particularly critical for tropical islands where the local freshwater resources are very sensitive to climate. This study examined the impact of climate change on extreme streamflow, reservoir water volume and outflow for the Nuuanu watershed, using the Soil and Water Assessment Tool (SWAT) model. Based on the sensitive parameters screened by the Latin Hypercube-One-factor-At-a-Time (LH-OAT) method, SWAT was calibrated and validated to daily streamflow using the SWAT Calibration and Uncertainty Program (SWAT-CUP) at three streamflow gauging stations. Results showed that SWAT adequately reproduced the observed daily streamflow hydrographs at all stations. This was verified with Nash-Sutcliffe Efficiency values in the acceptable range of 0.58 to 0.88, whereby more than 90% of observations were bracketed within the 95% model prediction uncertainty interval for both calibration and validation periods, signifying the potential applicability of SWAT for future prediction. The climate change impact on extreme flows, reservoir water volume and outflow was assessed under the Representative Concentration Pathway (RCP) 4.5 and 8.5 scenarios. We found wide changes in extreme peak and low flows ranging from -44% to 20% and -50% to -2%, respectively, compared to baseline. Consequently, the amount of water stored in Nuuanu reservoir will decrease by up to 27%, while the corresponding outflow rates are expected to decrease by up to 37% relative to the baseline. In addition, the stored water and extreme flows are highly sensitive to rainfall change when compared to temperature and solar radiation changes. It is concluded that the decrease in extreme low and peak flows can have serious consequences, such as flooding and drought, with detrimental effects on riparian ecological functioning. This study's results are expected to aid in
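
    The two performance measures quoted above, the Nash-Sutcliffe Efficiency and the fraction of observations bracketed by the 95% prediction uncertainty band (the "p-factor" in SWAT-CUP terminology), are straightforward to compute; a minimal sketch with synthetic arrays is given below. The variable names and numbers are illustrative and not taken from the study.

```python
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe Efficiency: 1 is perfect, 0 means no better than the observed mean."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def p_factor(obs, lower95, upper95):
    """Fraction of observations falling inside the 95% prediction uncertainty band."""
    obs = np.asarray(obs, float)
    return float(np.mean((obs >= np.asarray(lower95)) & (obs <= np.asarray(upper95))))

obs = np.array([3.2, 5.1, 9.8, 4.0, 2.7])   # observed daily flow (m3/s), made up
sim = np.array([3.0, 5.6, 9.1, 4.4, 2.9])   # simulated daily flow, made up
print(nse(obs, sim), p_factor(obs, sim - 1.0, sim + 1.0))
```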

  8. Data Set for Emperical Validation of Double Skin Facade Model

    DEFF Research Database (Denmark)

    Kalyanova, Olena; Jensen, Rasmus Lund; Heiselberg, Per

    2008-01-01

    the model is empirically validated and its limitations for DSF modeling are identified. Correspondingly, the existence and availability of experimental data are essential. Three sets of accurate empirical data for validation of DSF modeling with building simulation software were produced within...

  9. Spatiotemporal patterns of precipitation inferred from streamflow observations across the Sierra Nevada mountain range

    Science.gov (United States)

    Henn, Brian; Clark, Martyn P.; Kavetski, Dmitri; Newman, Andrew J.; Hughes, Mimi; McGurk, Bruce; Lundquist, Jessica D.

    2018-01-01

    Given uncertainty in precipitation gauge-based gridded datasets over complex terrain, we use multiple streamflow observations as an additional source of information about precipitation, in order to identify spatial and temporal differences between a gridded precipitation dataset and precipitation inferred from streamflow. We test whether gridded datasets capture across-crest and regional spatial patterns of variability, as well as year-to-year variability and trends in precipitation, in comparison to precipitation inferred from streamflow. We use a Bayesian model calibration routine with multiple lumped hydrologic model structures to infer the most likely basin-mean, water-year total precipitation for 56 basins with long-term (>30 year) streamflow records in the Sierra Nevada mountain range of California. We compare basin-mean precipitation derived from this approach with basin-mean precipitation from a precipitation gauge-based, 1/16° gridded dataset that has been used to simulate and evaluate trends in Western United States streamflow and snowpack over the 20th century. We find that the long-term average spatial patterns differ: in particular, there is less precipitation in the gridded dataset in higher-elevation basins whose aspect faces prevailing cool-season winds, as compared to precipitation inferred from streamflow. In a few years and basins, there is less gridded precipitation than there is observed streamflow. Lower-elevation, southern, and east-of-crest basins show better agreement between gridded and inferred precipitation. Implied actual evapotranspiration (calculated as precipitation minus streamflow) then also varies between the streamflow-based estimates and the gridded dataset. Absolute uncertainty in precipitation inferred from streamflow is substantial, but the signals of basin-to-basin and year-to-year differences are likely more robust. The findings suggest that considering streamflow when spatially distributing precipitation in complex terrain

  10. New validation metrics for models with multiple correlated responses

    International Nuclear Information System (INIS)

    Li, Wei; Chen, Wei; Jiang, Zhen; Lu, Zhenzhou; Liu, Yu

    2014-01-01

    Validating models with correlated multivariate outputs involves the comparison of multiple stochastic quantities. Considering both uncertainty and correlations among multiple responses from model and physical observations imposes challenges. Existing marginal comparison methods and the hypothesis testing-based methods either ignore correlations among responses or only reach Boolean conclusions (yes or no) without accounting for the amount of discrepancy between a model and the underlying reality. A new validation metric is needed to quantitatively characterize the overall agreement of multiple responses considering correlations among responses and uncertainty in both model predictions and physical observations. In this paper, by extending the concept of “area metric” and the “u-pooling method” developed for validating a single response, we propose new model validation metrics for validating correlated multiple responses using the multivariate probability integral transformation (PIT). One new metric is the PIT area metric for validating multi-responses at a single validation site. The other is the t-pooling metric that allows for pooling observations of multiple responses observed at multiple validation sites to assess the global predictive capability. The proposed metrics have many favorable properties that are well suited for validation assessment of models with correlated responses. The two metrics are examined and compared with the direct area metric and the marginal u-pooling method respectively through numerical case studies and an engineering example to illustrate their validity and potential benefits
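
    The sketch below illustrates the univariate building block behind these metrics: probability integral transform (PIT) values pooled across validation sites and compared, through an area metric, against the uniform distribution they would follow if the model's predictive distributions were perfect. The multivariate PIT and the t-pooling extension proposed in the paper are not reproduced here; this is only a simplified, single-response analogue with synthetic PIT values.

```python
import numpy as np

def u_pooling_area_metric(pit_values):
    """Area between the empirical CDF of pooled PIT values and the uniform CDF on [0, 1].
    A value near zero indicates close agreement between predictions and observations."""
    u = np.sort(np.clip(np.asarray(pit_values, float), 0.0, 1.0))
    n = len(u)
    grid = np.concatenate(([0.0], u, [1.0]))
    area = 0.0
    for i in range(n + 1):
        a, b, c = grid[i], grid[i + 1], i / n   # ECDF equals c on the interval (a, b)
        if c <= a:
            area += 0.5 * (b**2 - a**2) - c * (b - a)
        elif c >= b:
            area += c * (b - a) - 0.5 * (b**2 - a**2)
        else:
            area += 0.5 * ((c - a) ** 2 + (b - c) ** 2)
    return area

rng = np.random.default_rng(0)
print(u_pooling_area_metric(rng.uniform(size=200)))    # near-uniform PITs: small area
print(u_pooling_area_metric(rng.beta(5, 2, size=200))) # biased predictions: larger area
```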

  11. Some considerations for validation of repository performance assessment models

    International Nuclear Information System (INIS)

    Eisenberg, N.

    1991-01-01

    Validation is an important aspect of the regulatory uses of performance assessment. A substantial body of literature exists indicating the manner in which validation of models is usually pursued. Because performance models for a nuclear waste repository cannot be tested over the long time periods for which the model must make predictions, the usual avenue for model validation is precluded. Further impediments to model validation include a lack of fundamental scientific theory to describe important aspects of repository performance and an inability to easily deduce the complex, intricate structures characteristic of a natural system. A successful strategy for validation must attempt to resolve these difficulties in a direct fashion. Although some procedural aspects will be important, the main reliance of validation should be on scientific substance and logical rigor. The level of validation needed will be mandated, in part, by the uses to which these models are put, rather than by the ideal of validation of a scientific theory. Because of the importance of the validation of performance assessment models, the NRC staff has engaged in a program of research and international cooperation to seek progress in this important area. 2 figs., 16 refs

  12. Validation of 2D flood models with insurance claims

    Science.gov (United States)

    Zischg, Andreas Paul; Mosimann, Markus; Bernet, Daniel Benjamin; Röthlisberger, Veronika

    2018-02-01

    Flood impact modelling requires reliable models for the simulation of flood processes. In recent years, flood inundation models have been remarkably improved and widely used for flood hazard simulation, flood exposure and loss analyses. In this study, we validate a 2D inundation model for the purpose of flood exposure analysis at the river reach scale. We validate the BASEMENT simulation model with insurance claims using conventional validation metrics. The flood model is established on the basis of available topographic data in a high spatial resolution for four test cases. The validation metrics were calculated with two different datasets; a dataset of event documentations reporting flooded areas and a dataset of insurance claims. The model fit relating to insurance claims is in three out of four test cases slightly lower than the model fit computed on the basis of the observed inundation areas. This comparison between two independent validation data sets suggests that validation metrics using insurance claims can be compared to conventional validation data, such as the flooded area. However, a validation on the basis of insurance claims might be more conservative in cases where model errors are more pronounced in areas with a high density of values at risk.
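
    The "conventional validation metrics" referred to above are typically contingency-table scores computed cell by cell between a simulated inundation map and an observed (or claims-derived) one; a minimal sketch is given below. The abstract does not spell out which metrics the study used, so the hit rate and critical success index shown here are common stand-ins rather than the paper's exact choices.

```python
import numpy as np

def inundation_skill(modelled, observed):
    """Binary-map comparison scores often used for flood extent validation."""
    modelled, observed = np.asarray(modelled, bool), np.asarray(observed, bool)
    hits = np.sum(modelled & observed)          # wet in both maps
    misses = np.sum(~modelled & observed)       # observed wet, modelled dry
    false_alarms = np.sum(modelled & ~observed) # modelled wet, observed dry
    hit_rate = hits / (hits + misses)
    csi = hits / (hits + misses + false_alarms) # critical success index
    return hit_rate, csi

# Tiny illustrative grids (1 = flooded cell)
sim = np.array([[1, 1, 0], [1, 0, 0], [0, 0, 0]])
obs = np.array([[1, 1, 1], [1, 0, 0], [0, 0, 0]])
print(inundation_skill(sim, obs))
```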

  13. Statistical Validation of Normal Tissue Complication Probability Models

    Energy Technology Data Exchange (ETDEWEB)

    Xu Chengjian, E-mail: c.j.xu@umcg.nl [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Schaaf, Arjen van der; Veld, Aart A. van' t; Langendijk, Johannes A. [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Schilstra, Cornelis [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Radiotherapy Institute Friesland, Leeuwarden (Netherlands)

    2012-09-01

    Purpose: To investigate the applicability and value of double cross-validation and permutation tests as established statistical approaches in the validation of normal tissue complication probability (NTCP) models. Methods and Materials: A penalized regression method, LASSO (least absolute shrinkage and selection operator), was used to build NTCP models for xerostomia after radiation therapy treatment of head-and-neck cancer. Model assessment was based on the likelihood function and the area under the receiver operating characteristic curve. Results: Repeated double cross-validation showed the uncertainty and instability of the NTCP models and indicated that the statistical significance of model performance can be obtained by permutation testing. Conclusion: Repeated double cross-validation and permutation tests are recommended to validate NTCP models before clinical use.
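
    A minimal analogue of this workflow can be assembled with scikit-learn: an L1-penalized logistic regression stands in for the LASSO NTCP model, the area under the ROC curve is used for assessment, and a permutation test attaches a significance level to the cross-validated performance. The data here are synthetic, a single cross-validation loop replaces the repeated double cross-validation of the paper, and none of the settings correspond to the xerostomia study.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import StratifiedKFold, permutation_test_score

# Toy stand-ins for dose-volume predictors and a binary complication outcome
rng = np.random.default_rng(1)
X = rng.normal(size=(120, 10))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=1.0, size=120) > 0).astype(int)

lasso_like = LogisticRegression(penalty="l1", solver="liblinear", C=0.5)
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)

score, perm_scores, p_value = permutation_test_score(
    lasso_like, X, y, scoring="roc_auc", cv=cv, n_permutations=200, random_state=0
)
print(f"AUC = {score:.2f}, permutation p-value = {p_value:.3f}")
```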

  14. Model Validation and Verification of Data Mining from the ...

    African Journals Online (AJOL)

    In this paper, we seek to present a hybrid method for Model Validation and Verification of Data Mining from the Knowledge Workers Productivity Approach. It is hoped that this paper will help managers to implement different corresponding measures. A case study is presented where this model measure and validates at the ...

  15. Validation of mentorship model for newly qualified professional ...

    African Journals Online (AJOL)

    Newly qualified professional nurses (NQPNs) allocated to community health care services require the use of validated model to practice independently. Validation was done to adapt and assess if the model is understood and could be implemented by NQPNs and mentors employed in community health care services.

  16. Validation and Adaptation of Router and Switch Models

    NARCIS (Netherlands)

    Boltjes, B.; Fernandez Diaz, I.; Kock, B.A.; Langeveld, R.J.G.M.; Schoenmaker, G.

    2003-01-01

    This paper describes validating OPNET models of key devices for the next generation IP-based tactical network of the Royal Netherlands Army (RNLA). The task of TNO-FEL is to provide insight in scalability and performance of future deployed networks. Because validated models of key Cisco equipment

  17. Stream Heat Budget Modeling of Groundwater Inputs: Model Development and Validation

    Science.gov (United States)

    Glose, A.; Lautz, L. K.

    2012-12-01

    Models of physical processes in fluvial systems are useful for improving understanding of hydrologic systems and for predicting future conditions. Process-based models of fluid flow and heat transport in fluvial systems can be used to quantify unknown spatial and temporal patterns of hydrologic fluxes, such as groundwater discharge, and to predict system response to future change. In this study, a stream heat budget model was developed and calibrated to observed stream water temperature data for Meadowbrook Creek in Syracuse, NY. The one-dimensional (longitudinal), transient stream temperature model is programmed in Matlab and solves the equations for heat and fluid transport using a Crank-Nicholson finite difference scheme. The model considers four meteorologically driven heat fluxes: shortwave solar radiation, longwave radiation, latent heat flux, and sensible heat flux. Streambed conduction is also considered. Input data for the model were collected from June 13-18, 2012 over a 500 m reach of Meadowbrook Creek, a first order urban stream that drains a retention pond in the city of Syracuse, NY. Stream temperature data were recorded every 20 m longitudinally in the stream at 5-minute intervals using iButtons (model DS1922L, accuracy of ±0.5°C, resolution of 0.0625°C). Meteorological data, including air temperature, solar radiation, relative humidity, and wind speed, were recorded at 5-minute intervals using an on-site weather station. Groundwater temperature was measured in wells adjacent to the stream. Stream dimensions, bed temperatures, and type of bed sediments were also collected. A constant rate tracer injection of Rhodamine WT was used to quantify groundwater inputs every 10 m independently to validate model results. Stream temperatures fluctuated diurnally by ~3-5 °C during the observation period with temperatures peaking around 2 pm and cooling overnight, reaching a minimum between 6 and 7 am. Spatially, the stream shows a cooling trend along the
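
    As an illustration of the numerical core mentioned above, the sketch below advances a one-dimensional water temperature profile with a Crank-Nicolson step, treating the combined surface heat fluxes as a single source term. It is deliberately simplified: advection, streambed conduction, and groundwater inflow are omitted, boundary temperatures are held fixed, and all parameter values are invented for the example rather than taken from Meadowbrook Creek.

```python
import numpy as np

def crank_nicolson_step(T, D, dx, dt, source):
    """One Crank-Nicolson step for dT/dt = D d2T/dx2 + source, fixed-temperature ends."""
    n = len(T)
    r = D * dt / (2.0 * dx**2)
    A = np.eye(n) * (1 + 2 * r)   # implicit (new time level) operator
    B = np.eye(n) * (1 - 2 * r)   # explicit (old time level) operator
    for i in range(1, n - 1):
        A[i, i - 1] = A[i, i + 1] = -r
        B[i, i - 1] = B[i, i + 1] = r
    A[0, :], A[-1, :] = 0, 0      # Dirichlet boundaries: end temperatures unchanged
    A[0, 0] = A[-1, -1] = 1
    rhs = B @ T + dt * source
    rhs[0], rhs[-1] = T[0], T[-1]
    return np.linalg.solve(A, rhs)

x = np.linspace(0, 500, 26)                      # 500 m reach, 20 m node spacing
T = np.full_like(x, 18.0)                        # initial water temperature (deg C)
S = np.full_like(x, 300 / (1000 * 4182 * 0.3))   # 300 W/m2 net flux over 0.3 m depth -> K/s
for _ in range(720):                             # one hour at 5 s steps
    T = crank_nicolson_step(T, D=0.5, dx=20.0, dt=5.0, source=S)
print(T.round(2))
```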

  18. Global separation of plant transpiration from groundwater and streamflow

    Science.gov (United States)

    Jaivime Evaristo; Scott Jasechko; Jeffrey J. McDonnell

    2015-01-01

    Current land surface models assume that groundwater, streamflow and plant transpiration are all sourced and mediated by the same well mixed water reservoir—the soil. However, recent work in Oregon and Mexico has shown evidence of ecohydrological separation, whereby different subsurface compartmentalized pools of water supply either plant transpiration fluxes or the...

  19. Response of streamflow to projected climate change scenarios in an ...

    Indian Academy of Sciences (India)

    Snowmelt run-off model (SRM) based on degree-day approach has been employed to evaluate the change in snow-cover depletion and corresponding streamflow under different projected climatic scenarios for an eastern Himalayan catchment in India. Nuranang catchment located at Tawang district of Arunachal Pradesh ...

  20. Application of ANN and fuzzy logic algorithms for streamflow ...

    Indian Academy of Sciences (India)

    Department of Soil and Water Engineering, College of Technology and Engineering, Maharana Pratap ... Inputs used include evaporation, mean daily temperature and lagged streamflow. Table 2 of the source lists the input parameters and ANN structure of the different models for the Savitri Basin.

  1. Response of streamflow to projected climate change scenarios in an ...

    Indian Academy of Sciences (India)

    Snowmelt run-off model (SRM) based on degree-day approach has been employed to evaluate the change in snow-cover depletion and corresponding streamflow under different projected climatic scenarios for an eastern Himalayan catchment in India. Nuranang catchment located at Tawang district of Arunachal Pradesh ...

  2. Contribution of MODIS Derived Snow Cover Satellite Data into Artificial Neural Network for Streamflow Estimation

    Science.gov (United States)

    Uysal, Gokcen; Arda Sorman, Ali; Sensoy, Aynur

    2014-05-01

    The contribution of snowmelt, and correspondingly snow observations, is highly important in mountainous basins for modelers who deal with conceptual, physical or soft computing models for effective water resources management. Long-term archived continuous data are needed for appropriate training and testing of data-driven approaches like artificial neural networks (ANNs). Data are scarce at the upper elevations due to the difficulty of installing sufficient automated SNOTEL stations; thus many attempts in the literature focus on rainfall-dominated basins for streamflow estimation studies. On the other hand, optical satellites can easily detect snow because of its high reflectance. The MODIS (Moderate Resolution Imaging Spectroradiometer) sensor, carried on two platforms (Terra and Aqua), provides daily and 8-daily snow images since 2000; therefore snow cover area (SCA) data may be useful as an input layer for ANN applications. In this study, a multi-layer perceptron (MLP) model is trained and tested with precipitation, temperature, radiation, and previous-day discharges as well as MODIS daily SCA data. The weights and biases are optimized with the fast and robust Levenberg-Marquardt backpropagation algorithm. Cloud coverage is removed from the MODIS snow cover images using filtering techniques. The Upper Euphrates River Basin in the eastern part of Turkey (10 250 km2) is selected as the application area since snowmelt supplies approximately 2/3 of its total annual flow volume during spring and early summer. Several input combinations and ANN structures are investigated to see the effect of the contributions using 10 years of data (2001-2010) for training and validation. The accuracy of the streamflow estimations is checked with statistical criteria (coefficient of determination, Nash-Sutcliffe model efficiency, root mean square error, mean absolute error), and the results improve when SCA data are introduced. Furthermore, a forecast study is
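
    The flavour of such a data-driven setup is sketched below with scikit-learn. Note the substitutions: an MLPRegressor trained with L-BFGS stands in for the Levenberg-Marquardt backpropagation used in the study (scikit-learn does not provide Levenberg-Marquardt), and the inputs (precipitation, temperature, radiation, previous-day discharge, and a MODIS-like snow-covered area fraction) are purely synthetic.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.metrics import r2_score, mean_squared_error

rng = np.random.default_rng(7)
n = 1500
X = np.column_stack([
    rng.gamma(2.0, 3.0, n),      # precipitation (mm)
    rng.normal(5, 8, n),         # temperature (deg C)
    rng.uniform(50, 300, n),     # radiation (W/m2)
    rng.gamma(3.0, 10.0, n),     # previous-day discharge (m3/s)
    rng.uniform(0, 1, n),        # snow-covered area fraction
])
y = 0.6 * X[:, 3] + 2.0 * X[:, 0] + 15 * X[:, 4] + rng.normal(0, 3, n)  # synthetic target

split = int(0.7 * n)             # simple chronological split into training and validation
model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(10,), solver="lbfgs",
                                   max_iter=2000, random_state=0))
model.fit(X[:split], y[:split])
pred = model.predict(X[split:])
print("R2:", r2_score(y[split:], pred),
      "RMSE:", mean_squared_error(y[split:], pred) ** 0.5)
```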

  3. The concept of validation of numerical models for consequence analysis

    International Nuclear Information System (INIS)

    Borg, Audun; Paulsen Husted, Bjarne; Njå, Ove

    2014-01-01

    Numerical models such as computational fluid dynamics (CFD) models are increasingly used in life safety studies and other types of analyses to calculate the effects of fire and explosions. The validity of these models is usually established by benchmark testing. This is done to quantitatively measure the agreement between the predictions provided by the model and the real world represented by observations in experiments. This approach assumes that all variables in the real world relevant for the specific study are adequately measured in the experiments and in the predictions made by the model. In this paper the various definitions of validation for CFD models used for hazard prediction are investigated to assess their implication for consequence analysis in a design phase. In other words, how is uncertainty in the prediction of future events reflected in the validation process? The sources of uncertainty are viewed from the perspective of the safety engineer. An example of the use of a CFD model is included to illustrate the assumptions the analyst must make and how these affect the prediction made by the model. The assessments presented in this paper are based on a review of standards and best practice guides for CFD modeling and the documentation from two existing CFD programs. Our main thrust has been to assess how validation work is performed and communicated in practice. We conclude that the concept of validation adopted for numerical models is adequate in terms of model performance. However, it does not address the main sources of uncertainty from the perspective of the safety engineer. Uncertainty in the input quantities describing future events, which are determined by the model user, outweighs the inaccuracies in the model as reported in validation studies. - Highlights: • Examine the basic concept of validation applied to models for consequence analysis. • Review standards and guides for validation of numerical models. • Comparison of the validation

  4. SWAT application in intensive irrigation systems: Model modification, calibration and validation

    OpenAIRE

    Dechmi, Farida; Burguete, Javier; Skhiri, Ahmed

    2012-01-01

    The Soil and Water Assessment Tool (SWAT) is a well established, distributed, eco-hydrologic model. However, using the study case of an agricultural intensive irrigated watershed, it was shown that all the model versions are not able to appropriately reproduce the total streamflow in such system when the irrigation source is outside the watershed. The objective of this study was to modify the SWAT2005 version for correctly simulating the main hydrological processes. Crop yield, total streamfl...

  5. A Quantitative Comparison of Prediction Methods for Daily Streamflow Time Series at Ungaged Sites

    Science.gov (United States)

    Kiang, Julie; Farmer, William; Archfield, Stacey; Over, Thomas; Vogel, Richard

    2014-05-01

    The existence of reliable, continuous daily records of natural streamflow enhances our ability to manage our water resources. In many regions, due to a lack of adequate gaging resources, it is necessary to create representative records where none exist. Research on prediction in ungaged basins (PUB) has been very active over the past decade. We report the findings of an ongoing national study by the U.S. Geological Survey, which seeks to provide spatially and temporally continuous 30-year records of historical daily records of natural streamflow (1980-2010) at the watershed scale (HUC-12). Employing data from 182 nearly pristine basins in the Southeast United States, a three-fold validation procedure was used to simulate the ungaged case for each basin. Ungaged flows were estimated using transfer-based methods: standardizing by drainage area, mean flows, means and standard deviations, and using an interpolation of flow duration curves (QPPQ). The effect of index-gage selection was also considered: using the nearest-neighboring gage or the gage with the greatest correlation. These methods were compared with a daily version of the Analysis of Flows in Networks of Channels (AFINCH) model and the Precipitation-Runoff Modeling System (PRMS), a deterministic model. We developed a multi-objective, comparative assessment of PUB methods. The selection of an optimal PUB method is shown to depend on the intended application of the estimated flow record. We identify the PUB methods that perform best across the 32 goodness-of-fit metrics considered.
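
    Two of the transfer-based methods compared in the study admit very compact implementations, sketched below: the drainage-area-ratio method and a bare-bones QPPQ interpolation that maps index-gage flows to nonexceedance probabilities and back through the target site's flow duration curve. In practice the target FDC would come from regional regression or a similar procedure; here both FDCs are simply passed in as arrays, and the function names and numbers are illustrative only.

```python
import numpy as np

def drainage_area_ratio(q_index, area_index, area_target):
    """Scale the index gage's daily flows by the ratio of drainage areas."""
    return np.asarray(q_index, float) * (area_target / area_index)

def qppq(q_index, fdc_index, fdc_target):
    """QPPQ sketch: flow -> nonexceedance probability at the index gage (Q->P),
    then probability -> flow via the target site's flow duration curve (P->Q)."""
    q_index = np.asarray(q_index, float)
    p = np.interp(q_index, np.sort(fdc_index),
                  np.linspace(0.0, 1.0, len(fdc_index)))            # Q -> P
    return np.interp(p, np.linspace(0.0, 1.0, len(fdc_target)),
                     np.sort(fdc_target))                            # P -> Q

q_index = np.array([1.2, 3.4, 0.8, 10.5, 2.2])   # daily flows at the index gage (m3/s)
print(drainage_area_ratio(q_index, area_index=250.0, area_target=90.0))
print(qppq(q_index, fdc_index=q_index, fdc_target=q_index * 0.4))
```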

  6. Model Validation for Simulations of Vehicle Systems

    Science.gov (United States)

    2012-08-01

    Indexed fragments of this report cite resampling methods such as the jackknife, bootstrap, and cross-validation (B. Efron, Annals of Statistics, 7:1-26, 1979; B. Efron and G. Gong, "A leisurely look at the bootstrap, the jackknife, and cross-validation") and refer to validation work involving Sandia National Laboratories and a battery model developed in the Automotive Research Center, a US Army Center of Excellence for modeling and simulation of ground vehicle systems.

  7. Models for Validation of Prior Learning (VPL)

    DEFF Research Database (Denmark)

    Ehlers, Søren

    National policies for the education and training of adults are, in the 21st century, highly influenced by proposals formulated and promoted by the European Union (EU) as well as other transnational players, and this shift in policy making has consequences. One is that ideas which in the past ... would have been categorized as utopian can become realpolitik. Validation of Prior Learning (VPL) was in Europe mainly regarded as utopian, while universities in the United States of America (USA) were developing ways to grant credits to students who came with experiences from working life....

  8. Using virtual reality to validate system models

    Energy Technology Data Exchange (ETDEWEB)

    Winter, V.L.; Caudell, T.P.

    1999-12-09

    To date most validation techniques are highly biased towards calculations involving symbolic representations of problems. These calculations are either formal (in the case of consistency and completeness checks), or informal in the case of code inspections. The authors believe that an essential type of evidence of the correctness of the formalization process must be provided by (i.e., must originate from) human-based calculation. They further believe that human calculation can be significantly amplified by shifting from symbolic representations to graphical representations. This paper describes their preliminary efforts in realizing such a representational shift.

  9. Validating Computational Cognitive Process Models across Multiple Timescales

    Science.gov (United States)

    Myers, Christopher; Gluck, Kevin; Gunzelmann, Glenn; Krusmark, Michael

    2010-12-01

    Model comparison is vital to evaluating progress in the fields of artificial general intelligence (AGI) and cognitive architecture. As they mature, AGI and cognitive architectures will become increasingly capable of providing a single model that completes a multitude of tasks, some of which the model was not specifically engineered to perform. These models will be expected to operate for extended periods of time and serve functional roles in real-world contexts. Questions arise regarding how to evaluate such models appropriately, including issues pertaining to model comparison and validation. In this paper, we specifically address model validation across multiple levels of abstraction, using an existing computational process model of unmanned aerial vehicle basic maneuvering to illustrate the relationship between validity and timescales of analysis.

  10. Validation of elk resource selection models with spatially independent data

    Science.gov (United States)

    Priscilla K. Coe; Bruce K. Johnson; Michael J. Wisdom; John G. Cook; Marty Vavra; Ryan M. Nielson

    2011-01-01

    Knowledge of how landscape features affect wildlife resource use is essential for informed management. Resource selection functions often are used to make and validate predictions about landscape use; however, resource selection functions are rarely validated with data from landscapes independent of those from which the models were built. This problem has severely...

  11. A Practical Approach to Validating a PD Model

    NARCIS (Netherlands)

    Medema, L.; Koning, de R.; Lensink, B.W.

    2009-01-01

    The capital adequacy framework Basel II aims to promote the adoption of stronger risk management practices by the banking industry. The implementation makes validation of credit risk models more important. Lenders therefore need a validation methodology to convince their supervisors that their

  12. Amendment to Validated dynamic flow model

    DEFF Research Database (Denmark)

    Knudsen, Torben

    2011-01-01

    The purpose of WP2 is to establish flow models relating the wind speed at turbines in a farm. Until now, active control of power reference has not been included in these models as only data with standard operation has been available. In this report the first data series with power reference excit... turbine in undisturbed flow. For this data set both the multiplicative model and in particular the simple first order transfer function model can predict the down wind wind speed from upwind wind speed and loading.

  13. An Approach to Comprehensive and Sustainable Solar Wind Model Validation

    Science.gov (United States)

    Rastaetter, L.; MacNeice, P. J.; Mays, M. L.; Boblitt, J. M.; Wiegand, C.

    2017-12-01

    The number of models of the corona and inner heliosphere and of their updates and upgrades grows steadily, as does the number and character of the model inputs. Maintaining up to date validation of these models, in the face of this constant model evolution, is a necessary but very labor intensive activity. In the last year alone, both NASA's LWS program and the CCMC's ongoing support of model forecasting activities at NOAA SWPC have sought model validation reports on the quality of all aspects of the community's coronal and heliospheric models, including both ambient and CME related wind solutions at L1. In this presentation I will give a brief review of the community's previous model validation results of L1 wind representation. I will discuss the semi-automated web based system we are constructing at the CCMC to present comparative visualizations of all interesting aspects of the solutions from competing models. This system is designed to be easily queried to provide the essential comprehensive inputs to repeat and update previous validation studies and support extensions to them. I will illustrate this by demonstrating how the system is being used to support the CCMC/LWS Model Assessment Forum teams focused on the ambient and time dependent corona and solar wind, including CME arrival time and IMF Bz. I will also discuss plans to extend the system to include results from the Forum teams addressing SEP model validation.

  14. Ion channel model development and validation

    Science.gov (United States)

    Nelson, Peter Hugo

    2010-03-01

    The structure of the KcsA ion channel selectivity filter is used to develop three simple models of ion channel permeation. The quantitative predictions of the knock-on model are tested by comparison with experimental data from single-channel recordings of the KcsA channel. By comparison with experiment, students discover that the knock-on model can't explain saturation of ion channel current as the concentrations of the bathing solutions are increased. By inverting the energy diagram, students derive the association-dissociation model of ion channel permeation. This model predicts non-linear Michaelis-Menten saturating behavior that requires students to perform non-linear least-squares fits to the experimental data. This is done using Excel's solver feature. Students discover that this simple model does an excellent job of explaining the qualitative features of ion channel permeation but cannot account for changes in voltage sensitivity. The model is then extended to include an electrical dissociation distance. This rapid translocation model is then compared with experimental data from a wide variety of ion channels and students discover that this model also has its limitations. Support from NSF DUE 0836833 is gratefully acknowledged.
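
    The nonlinear least-squares step described above (done with Excel's solver in the activity) can equally be reproduced in a few lines with SciPy, fitting the saturating association-dissociation form I = I_max [S] / (K_m + [S]) to current-concentration data. The numbers below are made up for illustration and are not the KcsA recordings used in the course.

```python
import numpy as np
from scipy.optimize import curve_fit

def michaelis_menten(conc, i_max, k_m):
    """Saturating single-channel current: I = I_max * [S] / (K_m + [S])."""
    return i_max * conc / (k_m + conc)

# Hypothetical single-channel currents (pA) at several bathing concentrations (mM)
conc = np.array([5, 10, 20, 50, 100, 200, 400, 800], float)
current = np.array([1.1, 2.0, 3.4, 5.9, 7.6, 8.9, 9.6, 10.1])

popt, pcov = curve_fit(michaelis_menten, conc, current, p0=[10.0, 50.0])
perr = np.sqrt(np.diag(pcov))
print(f"I_max = {popt[0]:.2f} +/- {perr[0]:.2f} pA, K_m = {popt[1]:.1f} +/- {perr[1]:.1f} mM")
```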

  15. A validated physical model of greenhouse climate.

    NARCIS (Netherlands)

    Bot, G.P.A.

    1989-01-01

    In the greenhouse model the momentaneous environmental crop growth factors are calculated as output, together with the physical behaviour of the crop. The boundary conditions for this model are the outside weather conditions; other inputs are the physical characteristics of the crop, of the

  16. Extending Model Checking To Object Process Validation

    NARCIS (Netherlands)

    van Rein, H.

    2002-01-01

    Object-oriented techniques allow the gathering and modelling of system requirements in terms of an application area. The expression of data and process models at that level is a great asset in communication with non-technical people in that area, but it does not necessarily lead to consistent

  17. Validity of microgravity simulation models on earth

    DEFF Research Database (Denmark)

    Regnard, J; Heer, M; Drummer, C

    2001-01-01

    Many studies have used water immersion and head-down bed rest as experimental models to simulate responses to microgravity. However, some data collected during space missions are at variance or in contrast with observations collected from experimental models. These discrepancies could reflect inc...

  18. Statistical Validation of Engineering and Scientific Models: Background

    International Nuclear Information System (INIS)

    Hills, Richard G.; Trucano, Timothy G.

    1999-01-01

    A tutorial is presented discussing the basic issues associated with propagation of uncertainty analysis and statistical validation of engineering and scientific models. The propagation of uncertainty tutorial illustrates the use of the sensitivity method and the Monte Carlo method to evaluate the uncertainty in predictions for linear and nonlinear models. Four example applications are presented; a linear model, a model for the behavior of a damped spring-mass system, a transient thermal conduction model, and a nonlinear transient convective-diffusive model based on Burger's equation. Correlated and uncorrelated model input parameters are considered. The model validation tutorial builds on the material presented in the propagation of uncertainty tutorial and uses the damped spring-mass system as the example application. The validation tutorial illustrates several concepts associated with the application of statistical inference to test model predictions against experimental observations. Several validation methods are presented including error band based, multivariate, sum of squares of residuals, and optimization methods. After completion of the tutorial, a survey of statistical model validation literature is presented and recommendations for future work are made
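
    In the spirit of the damped spring-mass example used in the tutorial, the sketch below propagates uncertainty in mass, damping, and stiffness through a closed-form response quantity by simple Monte Carlo sampling. The response chosen here (the magnitude of the first rebound peak after release from rest) and all input distributions are illustrative assumptions, not the tutorial's actual quantities.

```python
import numpy as np

def peak_displacement(m, c, k, x0=0.1):
    """Underdamped free response released from rest at x0: magnitude of the first rebound peak."""
    wn = np.sqrt(k / m)                      # natural frequency
    zeta = c / (2.0 * np.sqrt(k * m))        # damping ratio (assumed < 1)
    wd = wn * np.sqrt(1.0 - zeta**2)         # damped frequency
    t_peak = np.pi / wd                      # time of the first opposite extreme
    return x0 * np.exp(-zeta * wn * t_peak)

rng = np.random.default_rng(42)
n = 20_000
m = rng.normal(1.0, 0.05, n)     # kg
c = rng.normal(0.4, 0.04, n)     # N s/m
k = rng.normal(100.0, 5.0, n)    # N/m
samples = peak_displacement(m, c, k)
print(f"mean = {samples.mean():.4f} m, 95% interval = "
      f"({np.percentile(samples, 2.5):.4f}, {np.percentile(samples, 97.5):.4f}) m")
```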

  19. Validity of microgravity simulation models on earth

    DEFF Research Database (Denmark)

    Regnard, J; Heer, M; Drummer, C

    2001-01-01

    incomplete knowledge of the characteristics inherent to each model. During water immersion, the hydrostatic pressure lowers the peripheral vascular capacity and causes increased thoracic blood volume and high vascular perfusion. In turn, these changes lead to high urinary flow, low vasomotor tone, and a high...... a negative pressure around the body. The differences in renal function between space and experimental models appear to be explained by the physical forces affecting tissues and hemodynamics as well as by the changes secondary to these forces. These differences may help in selecting experimental models...

  20. Validation of ecological state space models using the Laplace approximation

    DEFF Research Database (Denmark)

    Thygesen, Uffe Høgsbro; Albertsen, Christoffer Moesgaard; Berg, Casper Willestofte

    2017-01-01

    for estimation in general mixed effects models. Implementing one-step predictions in the R package Template Model Builder, we demonstrate that it is possible to perform model validation with little effort, even if the ecological model is multivariate and has non-linear dynamics, and regardless of whether observations are continuous or discrete. With both simulated data and a real data set related to geolocation of seals, we demonstrate both the potential and the limitations of the techniques. Our results fill a need for convenient methods for validating a state space model, or alternatively, rejecting it while indicating...

  1. Transfer Entropy as a Tool for Hydrodynamic Model Validation

    Directory of Open Access Journals (Sweden)

    Alicia Sendrowski

    2018-01-01

    Full Text Available The validation of numerical models is an important component of modeling to ensure reliability of model outputs under prescribed conditions. In river deltas, robust validation of models is paramount given that models are used to forecast land change and to track water, solid, and solute transport through the deltaic network. We propose using transfer entropy (TE to validate model results. TE quantifies the information transferred between variables in terms of strength, timescale, and direction. Using water level data collected in the distributary channels and inter-channel islands of Wax Lake Delta, Louisiana, USA, along with modeled water level data generated for the same locations using Delft3D, we assess how well couplings between external drivers (river discharge, tides, wind and modeled water levels reproduce the observed data couplings. We perform this operation through time using ten-day windows. Modeled and observed couplings compare well; their differences reflect the spatial parameterization of wind and roughness in the model, which prevents the model from capturing high frequency fluctuations of water level. The model captures couplings better in channels than on islands, suggesting that mechanisms of channel-island connectivity are not fully represented in the model. Overall, TE serves as an additional validation tool to quantify the couplings of the system of interest at multiple spatial and temporal scales.
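
    A crude, histogram-based estimator conveys the idea of the TE diagnostic used above: how much knowing a driver's past reduces uncertainty in the next value of a response beyond what the response's own past already explains. The binning, single lag, and synthetic driver/response series below are simplifications; the study's actual estimation (including its ten-day windowing and treatment of multiple drivers) is more involved.

```python
import numpy as np

def transfer_entropy(x, y, bins=8, lag=1):
    """Histogram estimate of TE_{X->Y} in bits at a given lag."""
    x = np.digitize(x, np.histogram_bin_edges(x, bins)[1:-1])
    y = np.digitize(y, np.histogram_bin_edges(y, bins)[1:-1])
    y_next, y_past, x_past = y[lag:], y[:-lag], x[:-lag]

    def entropy(*cols):
        _, counts = np.unique(np.column_stack(cols), axis=0, return_counts=True)
        p = counts / counts.sum()
        return -np.sum(p * np.log2(p))

    # TE = H(y_next, y_past) + H(y_past, x_past) - H(y_next, y_past, x_past) - H(y_past)
    return (entropy(y_next, y_past) + entropy(y_past, x_past)
            - entropy(y_next, y_past, x_past) - entropy(y_past))

rng = np.random.default_rng(3)
driver = rng.normal(size=5000)                                 # e.g., a forcing signal
response = np.roll(driver, 2) + 0.3 * rng.normal(size=5000)    # lagged, noisy water level
print(transfer_entropy(driver, response, lag=2))   # clearly positive
print(transfer_entropy(response, driver, lag=2))   # near zero
```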

  2. VERIFICATION AND VALIDATION OF THE SPARC MODEL

    Science.gov (United States)

    Mathematical models for predicting the transport and fate of pollutants in the environment require reactivity parameter values--that is, the physical and chemical constants that govern reactivity. Although empirical structure-activity relationships that allow estimation of some ...

  3. Base Flow Model Validation, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The program focuses on turbulence modeling enhancements for predicting high-speed rocket base flows. A key component of the effort is the collection of high-fidelity...

  4. Validating predictions from climate envelope models

    Science.gov (United States)

    Watling, J.; Bucklin, D.; Speroterra, C.; Brandt, L.; Cabal, C.; Romañach, Stephanie S.; Mazzotti, Frank J.

    2013-01-01

    Climate envelope models are a potentially important conservation tool, but their ability to accurately forecast species’ distributional shifts using independent survey data has not been fully evaluated. We created climate envelope models for 12 species of North American breeding birds previously shown to have experienced poleward range shifts. For each species, we evaluated three different approaches to climate envelope modeling that differed in the way they treated climate-induced range expansion and contraction, using random forests and maximum entropy modeling algorithms. All models were calibrated using occurrence data from 1967–1971 (t1) and evaluated using occurrence data from 1998–2002 (t2). Model sensitivity (the ability to correctly classify species presences) was greater using the maximum entropy algorithm than the random forest algorithm. Although sensitivity did not differ significantly among approaches, for many species, sensitivity was maximized using a hybrid approach that assumed range expansion, but not contraction, in t2. Species for which the hybrid approach resulted in the greatest improvement in sensitivity have been reported from more land cover types than species for which there was little difference in sensitivity between hybrid and dynamic approaches, suggesting that habitat generalists may be buffered somewhat against climate-induced range contractions. Specificity (the ability to correctly classify species absences) was maximized using the random forest algorithm and was lowest using the hybrid approach. Overall, our results suggest cautious optimism for the use of climate envelope models to forecast range shifts, but also underscore the importance of considering non-climate drivers of species range limits. The use of alternative climate envelope models that make different assumptions about range expansion and contraction is a new and potentially useful way to help inform our understanding of climate change effects on species.

  5. Validating predictions from climate envelope models.

    Directory of Open Access Journals (Sweden)

    James I Watling

    Full Text Available Climate envelope models are a potentially important conservation tool, but their ability to accurately forecast species' distributional shifts using independent survey data has not been fully evaluated. We created climate envelope models for 12 species of North American breeding birds previously shown to have experienced poleward range shifts. For each species, we evaluated three different approaches to climate envelope modeling that differed in the way they treated climate-induced range expansion and contraction, using random forests and maximum entropy modeling algorithms. All models were calibrated using occurrence data from 1967-1971 (t1) and evaluated using occurrence data from 1998-2002 (t2). Model sensitivity (the ability to correctly classify species presences) was greater using the maximum entropy algorithm than the random forest algorithm. Although sensitivity did not differ significantly among approaches, for many species, sensitivity was maximized using a hybrid approach that assumed range expansion, but not contraction, in t2. Species for which the hybrid approach resulted in the greatest improvement in sensitivity have been reported from more land cover types than species for which there was little difference in sensitivity between hybrid and dynamic approaches, suggesting that habitat generalists may be buffered somewhat against climate-induced range contractions. Specificity (the ability to correctly classify species absences) was maximized using the random forest algorithm and was lowest using the hybrid approach. Overall, our results suggest cautious optimism for the use of climate envelope models to forecast range shifts, but also underscore the importance of considering non-climate drivers of species range limits. The use of alternative climate envelope models that make different assumptions about range expansion and contraction is a new and potentially useful way to help inform our understanding of climate change effects on

  6. Validation of Modeling Flow Approaching Navigation Locks

    Science.gov (United States)

    2013-08-01

    USACE, Pittsburgh District (LRP) requested that the US Army Engineer Research and Development Center, Coastal and Hydraulics ... approaching the lock and dam. The second set of experiments considered a design, referred to as the Plan B lock approach, which contained the weir field in ... conditions and model parameters. A discharge of 1.35 cfs was set as the inflow boundary condition at the upstream end of the model. The outflow boundary was

  7. Development of a Conservative Model Validation Approach for Reliable Analysis

    Science.gov (United States)

    2015-01-01

    conservativeness level, the conservative probability of failure obtained from Section 4 must be maintained. The mathematical formulation of conservative model ... PDF and a probability of failure are selected from these predicted output PDFs at a user-specified conservativeness level for validation. For

  8. The effects of changing land cover on streamflow simulation in Puerto Rico

    Science.gov (United States)

    Van Beusekom, Ashley; Hay, Lauren E.; Viger, Roland; Gould, William A.; Collazo, Jaime; Henareh Khalyani, Azad

    2014-01-01

    This study quantitatively explores whether land cover changes have a substantive impact on simulated streamflow within the tropical island setting of Puerto Rico. The Precipitation Runoff Modeling System (PRMS) was used to compare streamflow simulations based on five static parameterizations of land cover with those based on dynamically varying parameters derived from four land cover scenes for the period 1953-2012. The PRMS simulations based on static land cover illustrated consistent differences in simulated streamflow across the island. It was determined that the scale of the analysis makes a difference: large regions with localized areas that have undergone dramatic land cover change may show negligible difference in total streamflow, but streamflow simulations using dynamic land cover parameters for a highly altered subwatershed clearly demonstrate the effects of changing land cover on simulated streamflow. Incorporating dynamic parameterization in these highly altered watersheds can reduce the predictive uncertainty in simulations of streamflow using PRMS. Hydrologic models that do not consider the projected changes in land cover may be inadequate for water resource management planning for future conditions.

  9. Analytical flow duration curves for summer streamflow in Switzerland

    Science.gov (United States)

    Santos, Ana Clara; Portela, Maria Manuela; Rinaldo, Andrea; Schaefli, Bettina

    2018-04-01

    This paper proposes a systematic assessment of the performance of an analytical modeling framework for streamflow probability distributions for a set of 25 Swiss catchments. These catchments cover a wide range of hydroclimatic regimes, notably including snow-influenced streamflows. The model parameters are calculated from a spatially averaged gridded daily precipitation data set and from observed daily discharge time series, both in a forward estimation mode (direct parameter calculation from observed data) and in an inverse estimation mode (maximum likelihood estimation). The performance of the linear and the nonlinear model versions is assessed in terms of reproducing observed flow duration curves and their natural variability. Overall, the nonlinear model version outperforms the linear model for all regimes, but the linear model shows a notable performance increase with catchment elevation. More importantly, the obtained results demonstrate that the analytical model performs well for summer discharge for all analyzed streamflow regimes, ranging from rainfall-driven regimes with summer low flow to snow and glacier regimes with summer high flow. These results suggest that the model's encoding of discharge-generating events based on stochastic soil moisture dynamics is more flexible than previously thought. As shown in this paper, the presence of snowmelt or ice melt is accommodated by a relative increase in the discharge-generating frequency, a key parameter of the model. Explicit quantification of this frequency increase as a function of mean catchment meteorological conditions is left for future research.

  10. Traffic modelling validation of advanced driver assistance systems

    NARCIS (Netherlands)

    Tongeren, R. van; Gietelink, O.J.; Schutter, B. de; Verhaegen, M.

    2007-01-01

    This paper presents a microscopic traffic model for the validation of advanced driver assistance systems. This model describes single-lane traffic and is calibrated with data from a field operational test. To illustrate the use of the model, a Monte Carlo simulation of single-lane traffic scenarios

  11. Context discovery using attenuated Bloom codes: model description and validation

    NARCIS (Netherlands)

    Liu, F.; Heijenk, Geert

    A novel approach to performing context discovery in ad-hoc networks based on the use of attenuated Bloom filters is proposed in this report. In order to investigate the performance of this approach, a model has been developed. This document describes the model and its validation. The model has been

  12. Validation of Model Forecasts of the Ambient Solar Wind

    Science.gov (United States)

    Macneice, P. J.; Hesse, M.; Kuznetsova, M. M.; Rastaetter, L.; Taktakishvili, A.

    2009-01-01

    Independent and automated validation is a vital step in the progression of models from the research community into operational forecasting use. In this paper we describe a program in development at the CCMC to provide just such a comprehensive validation for models of the ambient solar wind in the inner heliosphere. We have built upon previous efforts published in the community, sharpened their definitions, and completed a baseline study. We also provide first results from this program of the comparative performance of the MHD models available at the CCMC against that of the Wang-Sheeley-Arge (WSA) model. An important goal of this effort is to provide a consistent validation to all available models. Clearly exposing the relative strengths and weaknesses of the different models will enable forecasters to craft more reliable ensemble forecasting strategies. Models of the ambient solar wind are developing rapidly as a result of improvements in data supply, numerical techniques, and computing resources. It is anticipated that in the next five to ten years, the MHD based models will supplant semi-empirical potential based models such as the WSA model, as the best available forecast models. We anticipate that this validation effort will track this evolution and so assist policy makers in gauging the value of past and future investment in modeling support.

  13. Multi-site Stochastic Simulation of Daily Streamflow with Markov Chain and KNN Algorithm

    Science.gov (United States)

    Mathai, J.; Mujumdar, P.

    2017-12-01

    A key focus of this study is to develop a method which is physically consistent with the hydrologic processes and can capture short-term characteristics of the daily hydrograph as well as the correlation of streamflow in temporal and spatial domains. In complex water resource systems, flow fluctuations at small time intervals require that discretisation be done at small time scales such as daily scales. Also, simultaneous generation of synthetic flows at different sites in the same basin is required. We propose a method to equip water managers with a streamflow generator within a stochastic streamflow simulation framework. The motivation for the proposed method is to generate sequences that extend beyond the variability represented in the historical record of streamflow time series. The method has two steps: in step 1, daily flow is generated independently at each station by a two-state Markov chain, with rising limb increments randomly sampled from a Gamma distribution and the falling limb modelled as an exponential recession; in step 2, the streamflow generated in step 1 is input to a nonparametric K-nearest neighbor (KNN) time series bootstrap resampler. The KNN model, being data driven, does not require assumptions on the dependence structure of the time series. A major limitation of KNN based streamflow generators is that they do not produce new values, but merely reshuffle the historical data to generate realistic streamflow sequences. However, daily flow generated using the Markov chain approach is capable of generating a rich variety of streamflow sequences. Furthermore, the rising and falling limbs of the daily hydrograph represent different physical processes, and hence they need to be modelled individually. Thus, our method combines the strengths of the two approaches. We show the utility of the method and improvement over the traditional KNN by simulating daily streamflow sequences at 7 locations in the Godavari River basin in India.
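
    A minimal Python sketch of step 1 of such a generator is given below, assuming a two-state (rise/recession) Markov chain with Gamma-distributed rising-limb increments and an exponential recession. The transition probabilities and distribution parameters are illustrative placeholders, not values calibrated for the Godavari basin, and the KNN resampling of step 2 is omitted.

      # Sketch of a two-state Markov-chain daily flow generator:
      # state 1 = rising limb (Gamma-distributed increment), state 0 = exponential recession.
      import numpy as np

      rng = np.random.default_rng(42)

      p_rise_given_rise = 0.4                # illustrative transition probabilities
      p_rise_given_recession = 0.2
      gamma_shape, gamma_scale = 2.0, 15.0   # rising-limb increment (m3/s), illustrative
      recession_k = 0.85                     # Q_t = k * Q_{t-1} on the falling limb

      n_days, q = 365, [50.0]
      state = 0
      for _ in range(n_days - 1):
          p_rise = p_rise_given_rise if state == 1 else p_rise_given_recession
          state = int(rng.random() < p_rise)
          if state == 1:
              q.append(q[-1] + rng.gamma(gamma_shape, gamma_scale))
          else:
              q.append(recession_k * q[-1])

      print("mean simulated flow:", np.mean(q))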

  14. Using Ensemble Streamflows for Power Marketing at Bonneville Power Administration

    Science.gov (United States)

    Barton, S. B.; Koski, P.

    2014-12-01

    Bonneville Power Administration (BPA) is a federal non-profit agency within the Pacific Northwest responsible for marketing the power generated from 31 federal hydro projects throughout the Columbia River Basin. The basin encompasses parts of five states and portions of British Columbia, Canada. BPA works with provincial entities, federal and state agencies, and tribal members to manage the water resources for a variety of purposes including flood risk management, power generation, fisheries, irrigation, recreation, and navigation. This basin is subject to significant hydrologic variability in terms of seasonal volume and runoff shape from year to year which presents new water management challenges each year. The power generation planning group at BPA includes a team of meteorologists and hydrologists responsible for preparing both short-term (up to three weeks) and mid-term (up to 18 months) weather and streamflow forecasts including ensemble streamflow data. Analysts within the mid-term planning group are responsible for running several different hydrologic models used for planning studies. These models rely on these streamflow ensembles as a primary input. The planning studies are run bi-weekly to help determine the amount of energy available, or energy inventory, for forward marketing (selling or purchasing energy up to a year in advance). These studies are run with the objective of meeting the numerous multi-purpose objectives of the basin under the various streamflow conditions within the ensemble set. In addition to ensemble streamflows, an ensemble of seasonal volume forecasts is also provided for the various water conditions in order to set numerous constraints on the system. After meeting all the various requirements of the system, a probabilistic energy inventory is calculated and used for marketing purposes.

  15. Variance analysis of forecasted streamflow maxima in a wet temperate climate

    Science.gov (United States)

    Al Aamery, Nabil; Fox, James F.; Snyder, Mark; Chandramouli, Chandra V.

    2018-05-01

    Coupling global climate models, hydrologic models and extreme value analysis provides a method to forecast streamflow maxima; however, the elusive variance structure of the results hinders confidence in application. Directly correcting the bias of forecasts using the relative change between forecast and control simulations has been shown to marginalize hydrologic uncertainty, reduce model bias, and remove systematic variance when predicting mean monthly and mean annual streamflow, prompting our investigation for maxima streamflow. We assess the variance structure of streamflow maxima using realizations of emission scenario, global climate model type and project phase, downscaling methods, bias correction, extreme value methods, and hydrologic model inputs and parameterization. Results show that the relative change of streamflow maxima was not dependent on systematic variance from the annual maxima versus peak-over-threshold method applied, although we stress that researchers should strictly adhere to the rules of extreme value theory when applying the peak-over-threshold method. Regardless of which method is applied, extreme value model fitting does add variance to the projection, and the variance is an increasing function of the return period. Unlike the relative change of mean streamflow, results show that the variance of the maxima's relative change was dependent on all climate model factors tested as well as hydrologic model inputs and calibration. Ensemble projections forecast an increase of streamflow maxima for 2050 with pronounced forecast standard error, including an increase of +30(±21), +38(±34) and +51(±85)% for 2, 20 and 100 year streamflow events for the wet temperate region studied. The variance of maxima projections was dominated by climate model factors and extreme value analyses.
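
    As a hedged illustration of the extreme value step discussed above, the Python snippet below fits a GEV distribution to a synthetic series of annual streamflow maxima and evaluates 2-, 20- and 100-year return levels. It is not the authors' workflow, and the distribution parameters and sample size are arbitrary.

      # Fit a GEV distribution to annual streamflow maxima and estimate return levels.
      import numpy as np
      from scipy.stats import genextreme

      # Synthetic annual maxima (m3/s); in practice these come from observed or simulated series.
      annual_maxima = genextreme.rvs(c=-0.1, loc=300.0, scale=80.0, size=60, random_state=1)

      shape, loc, scale = genextreme.fit(annual_maxima)
      for T in (2, 20, 100):
          # Return level = quantile with non-exceedance probability 1 - 1/T
          level = genextreme.ppf(1 - 1 / T, shape, loc=loc, scale=scale)
          print(f"{T:>3}-yr event: {level:.1f} m3/s")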

  16. Quantitative system validation in model driven design

    DEFF Research Database (Denmark)

    Hermanns, Hilger; Larsen, Kim Guldstrand; Raskin, Jean-Francois

    2010-01-01

    The European STREP project Quasimodo1 develops theory, techniques and tool components for handling quantitative constraints in model-driven development of real-time embedded systems, covering in particular real-time, hybrid and stochastic aspects. This tutorial highlights the advances made...

  17. Dental models made with an intraoral scanner: a validation study.

    Science.gov (United States)

    Cuperus, Anne Margreet R; Harms, Marit C; Rangel, Frits A; Bronkhorst, Ewald M; Schols, Jan G J H; Breuning, K Hero

    2012-09-01

    Our objectives were to determine the validity and reproducibility of measurements on stereolithographic models and 3-dimensional digital dental models made with an intraoral scanner. Ten dry human skulls were scanned; from the scans, stereolithographic models and digital models were made. Two observers measured transversal distances, mesiodistal tooth widths, and arch segments on the skulls and the stereolithographic and digital models. All measurements were repeated 4 times. Arch length discrepancy and tooth size discrepancy were calculated. Statistical analysis was performed by using paired t tests. For the measurements on the stereolithographic and digital models, statistically significant differences were found. However, these differences were considered to be clinically insignificant. Digital models had fewer statistically significant differences and generally the smallest duplicate measurement errors compared with the stereolithographic models. Stereolithographic and digital models made with an intraoral scanner are a valid and reproducible method for measuring distances in a dentition. Copyright © 2012 American Association of Orthodontists. Published by Mosby, Inc. All rights reserved.

  18. Bivariate Drought Analysis Using Streamflow Reconstruction with Tree Ring Indices in the Sacramento Basin, California, USA

    Directory of Open Access Journals (Sweden)

    Jaewon Kwak

    2016-03-01

    Full Text Available Long-term streamflow data are vital for analysis of hydrological droughts. Using an artificial neural network (ANN) model and nine tree-ring indices, this study reconstructed the annual streamflow of the Sacramento River for the period from 1560 to 1871. Using the reconstructed streamflow data, the copula method was used for bivariate drought analysis, deriving a hydrological drought return period plot for the Sacramento River basin. Results showed strong correlation among drought characteristics, and the drought with a 20-year return period (17.2 million acre-feet (MAF) per year) in the Sacramento River basin could be considered a critical level of drought for water shortages.

  19. Ensuring the Validity of the Micro Foundation in DSGE Models

    DEFF Research Database (Denmark)

    Andreasen, Martin Møller

    The presence of i) stochastic trends, ii) deterministic trends, and/or iii) stochastic volatility in DSGE models may imply that the agents' objective functions attain infinite values. We say that such models do not have a valid micro foundation. The paper derives sufficient conditions which ensure that the objective functions of the households and the firms are finite even when various trends and stochastic volatility are included in a standard DSGE model. Based on these conditions we test the validity of the micro foundation in six DSGE models from the literature. The models of Justiniano & Primiceri (American Economic Review, forthcoming) and Fernández-Villaverde & Rubio-Ramírez (Review of Economic Studies, 2007) do not satisfy these sufficient conditions, or any other known set of conditions ensuring finite values for the objective functions. Thus, the validity of the micro foundation...

  20. Measuring real-time streamflow using emerging technologies: Radar, hydroacoustics, and the probability concept

    Science.gov (United States)

    Fulton, J.; Ostrowski, J.

    2008-01-01

    Forecasting streamflow during extreme hydrologic events such as floods can be problematic. This is particularly true when flow is unsteady, and river forecasts rely on models that require uniform-flow rating curves to route water from one forecast point to another. As a result, alternative methods for measuring streamflow are needed to properly route flood waves and account for inertial and pressure forces in natural channels dominated by nonuniform-flow conditions such as mild water surface slopes, backwater, tributary inflows, and reservoir operations. The objective of the demonstration was to use emerging technologies to measure instantaneous streamflow in open channels at two existing US Geological Survey streamflow-gaging stations in Pennsylvania. Surface-water and instream-point velocities were measured using hand-held radar and hydroacoustics. Streamflow was computed using the probability concept, which requires velocity data from a single vertical containing the maximum instream velocity. The percent difference in streamflow at the Susquehanna River at Bloomsburg, PA ranged from 0% to 8% with an average difference of 4% and standard deviation of 8.81 m3/s. The percent difference in streamflow at Chartiers Creek at Carnegie, PA ranged from 0% to 11% with an average difference of 5% and standard deviation of 0.28 m3/s. New generation equipment is being tested and developed to advance the use of radar-derived surface-water velocity and instantaneous streamflow to facilitate the collection and transmission of real-time streamflow that can be used to parameterize hydraulic routing models.
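
    A minimal Python sketch of the probability-concept calculation referenced above: the channel mean velocity is taken as a fraction phi(M) of the measured maximum instream velocity (Chiu's entropy formulation) and multiplied by the cross-sectional area. The values of M, u_max and area below are purely illustrative, not measurements from the two gaging stations.

      # Probability-concept discharge estimate: Q = phi(M) * u_max * A,
      # with phi(M) = exp(M)/(exp(M) - 1) - 1/M (Chiu's entropy parameter).
      import math

      def phi(M: float) -> float:
          return math.exp(M) / (math.exp(M) - 1.0) - 1.0 / M

      u_max = 2.4    # maximum instream velocity from radar/hydroacoustics (m/s), illustrative
      area = 310.0   # channel cross-sectional area (m2), illustrative
      M = 2.1        # entropy parameter calibrated for the cross section, illustrative

      discharge = phi(M) * u_max * area
      print(f"estimated streamflow: {discharge:.1f} m3/s")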

  1. Propagation of soil moisture memory to streamflow and evapotranspiration in Europe

    Directory of Open Access Journals (Sweden)

    R. Orth

    2013-10-01

    Full Text Available As a key variable of the land-climate system soil moisture is a main driver of streamflow and evapotranspiration under certain conditions. Soil moisture furthermore exhibits outstanding memory (persistence) characteristics. Many studies also report distinct low frequency variations for streamflow, which are likely related to soil moisture memory. Using data from over 100 near-natural catchments located across Europe, we investigate in this study the connection between soil moisture memory and the respective memory of streamflow and evapotranspiration on different time scales. For this purpose we use a simple water balance model in which dependencies of runoff (normalised by precipitation) and evapotranspiration (normalised by radiation) on soil moisture are fitted using streamflow observations. The model therefore allows us to compute the memory characteristics of soil moisture, streamflow and evapotranspiration on the catchment scale. We find considerable memory in soil moisture and streamflow in many parts of the continent, and evapotranspiration also displays some memory at monthly time scale in some catchments. We show that the memory of streamflow and evapotranspiration jointly depend on soil moisture memory and on the strength of the coupling of streamflow and evapotranspiration to soil moisture. Furthermore, we find that the coupling strengths of streamflow and evapotranspiration to soil moisture depend on the shape of the fitted dependencies and on the variance of the meteorological forcing. To better interpret the magnitude of the respective memories across Europe, we finally provide a new perspective on hydrological memory by relating it to the mean duration required to recover from anomalies exceeding a certain threshold.

  2. Propagation of soil moisture memory to streamflow and evapotranspiration in Europe

    Science.gov (United States)

    Orth, R.; Seneviratne, S. I.

    2013-10-01

    As a key variable of the land-climate system soil moisture is a main driver of streamflow and evapotranspiration under certain conditions. Soil moisture furthermore exhibits outstanding memory (persistence) characteristics. Many studies also report distinct low frequency variations for streamflow, which are likely related to soil moisture memory. Using data from over 100 near-natural catchments located across Europe, we investigate in this study the connection between soil moisture memory and the respective memory of streamflow and evapotranspiration on different time scales. For this purpose we use a simple water balance model in which dependencies of runoff (normalised by precipitation) and evapotranspiration (normalised by radiation) on soil moisture are fitted using streamflow observations. The model therefore allows us to compute the memory characteristics of soil moisture, streamflow and evapotranspiration on the catchment scale. We find considerable memory in soil moisture and streamflow in many parts of the continent, and evapotranspiration also displays some memory at monthly time scale in some catchments. We show that the memory of streamflow and evapotranspiration jointly depend on soil moisture memory and on the strength of the coupling of streamflow and evapotranspiration to soil moisture. Furthermore, we find that the coupling strengths of streamflow and evapotranspiration to soil moisture depend on the shape of the fitted dependencies and on the variance of the meteorological forcing. To better interpret the magnitude of the respective memories across Europe, we finally provide a new perspective on hydrological memory by relating it to the mean duration required to recover from anomalies exceeding a certain threshold.

  3. Estimating current and future streamflow characteristics at ungaged sites, central and eastern Montana, with application to evaluating effects of climate change on fish populations

    Science.gov (United States)

    Sando, Roy; Chase, Katherine J.

    2017-03-23

    A common statistical procedure for estimating streamflow statistics at ungaged locations is to develop a relational model between streamflow and drainage basin characteristics at gaged locations using least squares regression analysis; however, least squares regression methods are parametric and make constraining assumptions about the data distribution. The random forest regression method provides an alternative nonparametric method for estimating streamflow characteristics at ungaged sites and requires that the data meet fewer statistical conditions than least squares regression methods. Random forest regression analysis was used to develop predictive models for 89 streamflow characteristics using Precipitation-Runoff Modeling System simulated streamflow data and drainage basin characteristics at 179 sites in central and eastern Montana. The predictive models were developed from streamflow data simulated for current (baseline, water years 1982–99) conditions and three future periods (water years 2021–38, 2046–63, and 2071–88) under three different climate-change scenarios. These predictive models were then used to predict streamflow characteristics for baseline conditions and three future periods at 1,707 fish sampling sites in central and eastern Montana. The average root mean square error for all predictive models was about 50 percent. When streamflow predictions at 23 fish sampling sites were compared to nearby locations with simulated data, the mean relative percent difference was about 43 percent. When predictions were compared to streamflow data recorded at 21 U.S. Geological Survey streamflow-gaging stations outside of the calibration basins, the average mean absolute percent error was about 73 percent.
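
    The Python sketch below illustrates the general random forest regression idea described above — relating a streamflow characteristic to drainage-basin characteristics and then predicting it at unsampled sites — using synthetic data and hypothetical predictor names; it is not the USGS model and uses none of its inputs.

      # Random forest regression of a streamflow statistic on basin characteristics,
      # then prediction at an "ungaged" site (all data synthetic and illustrative).
      import numpy as np
      from sklearn.ensemble import RandomForestRegressor
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(7)
      n_basins = 179
      drainage_area = rng.uniform(10, 5000, n_basins)    # km2
      mean_precip = rng.uniform(250, 900, n_basins)      # mm/yr
      mean_elev = rng.uniform(500, 2500, n_basins)       # m
      X = np.column_stack([drainage_area, mean_precip, mean_elev])
      mean_annual_flow = 1e-4 * drainage_area * mean_precip * rng.lognormal(0, 0.3, n_basins)

      rf = RandomForestRegressor(n_estimators=500, random_state=0)
      scores = cross_val_score(rf, X, mean_annual_flow, cv=5,
                               scoring="neg_root_mean_squared_error")
      print("cross-validated RMSE:", -scores.mean())

      rf.fit(X, mean_annual_flow)
      ungaged = np.array([[850.0, 400.0, 1200.0]])       # hypothetical ungaged basin
      print("predicted mean annual flow:", rf.predict(ungaged)[0])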

  4. Experiments for foam model development and validation.

    Energy Technology Data Exchange (ETDEWEB)

    Bourdon, Christopher Jay; Cote, Raymond O.; Moffat, Harry K.; Grillet, Anne Mary; Mahoney, James F. (Honeywell Federal Manufacturing and Technologies, Kansas City Plant, Kansas City, MO); Russick, Edward Mark; Adolf, Douglas Brian; Rao, Rekha Ranjana; Thompson, Kyle Richard; Kraynik, Andrew Michael; Castaneda, Jaime N.; Brotherton, Christopher M.; Mondy, Lisa Ann; Gorby, Allen D.

    2008-09-01

    A series of experiments has been performed to allow observation of the foaming process and the collection of temperature, rise rate, and microstructural data. Microfocus video is used in conjunction with particle image velocimetry (PIV) to elucidate the boundary condition at the wall. Rheology, reaction kinetics and density measurements complement the flow visualization. X-ray computed tomography (CT) is used to examine the cured foams to determine density gradients. These data provide input to a continuum level finite element model of the blowing process.

  5. Finite element model validation of bridge based on structural health monitoring—Part II: Uncertainty propagation and model validation

    Directory of Open Access Journals (Sweden)

    Xiaosong Lin

    2015-08-01

    Full Text Available Because of uncertainties involved in modeling, construction, and measurement systems, the assessment of FE model validity must be conducted based on stochastic measurements to provide designers with confidence for further applications. In this study, based on the updated model using response surface methodology, a practical model validation methodology via uncertainty propagation is presented. Several criteria of testing/analysis correlation are introduced, and the sources of model and testing uncertainties are also discussed. After that, the Monte Carlo stochastic finite element (FE) method is employed to perform the uncertainty quantification and propagation. The proposed methodology is illustrated with the examination of the validity of a large-span prestressed concrete continuous rigid frame bridge monitored under operational conditions. It can be concluded that the calculated frequencies and vibration modes of the updated FE model of Xiabaishi Bridge are consistent with the measured ones. The relative errors of each frequency are all less than 3.7%, the overlap ratio indexes of each frequency are all more than 75%, and the MAC values of each calculated vibration mode are all more than 90%. The model of Xiabaishi Bridge is valid in the whole operation space, including the experimental design space, and its confidence level is higher than 95%. The validated FE model of Xiabaishi Bridge can reflect the current condition of the bridge, and can also be used as a basis for bridge health monitoring, damage identification and safety assessment.

  6. [Validation of abdominal wound dehiscence's risk model].

    Science.gov (United States)

    Gómez Díaz, Carlos Javier; Rebasa Cladera, Pere; Navarro Soto, Salvador; Hidalgo Rosas, José Manuel; Luna Aufroy, Alexis; Montmany Vioque, Sandra; Corredera Cantarín, Constanza

    2014-02-01

    The aim of this study is to determine the usefulness of the risk model developed by van Ramshorst et al., and a modification of the same, to predict the risk of abdominal wound dehiscence in patients who underwent midline laparotomy incisions. Observational longitudinal retrospective study. Patients who underwent midline laparotomy incisions in the General and Digestive Surgery Department of Sabadell Hospital (Parc Taulí Health and University Corporation, Barcelona) between January 1, 2010 and June 30, 2010. Dependent variable: abdominal wound dehiscence. Global risk score, preoperative risk score (postoperative variables were excluded), and global and preoperative probabilities of developing abdominal wound dehiscence. 176 patients. Patients with abdominal wound dehiscence: 15 (8.5%). The global risk score of the abdominal wound dehiscence group (mean: 4.97; 95% CI: 4.15-5.79) was higher than the global risk score of the group without abdominal wound dehiscence (mean: 3.41; 95% CI: 3.20-3.62). This difference is statistically significant (P<.001). The preoperative risk score of the abdominal wound dehiscence group (mean: 3.27; 95% CI: 2.69-3.84) was also higher than the preoperative risk score of the group without abdominal wound dehiscence (mean: 2.77; 95% CI: 2.64-2.89), again a statistically significant difference (P<.05). The global risk score (area under the ROC curve: 0.79) has better accuracy than the preoperative risk score (area under the ROC curve: 0.64). The risk model developed by van Ramshorst et al. to predict the risk of abdominal wound dehiscence in the preoperative phase has limited usefulness. Additional refinements in the preoperative risk score are needed to improve its accuracy. Copyright © 2012 AEC. Published by Elsevier Espana. All rights reserved.

  7. Monthly streamflow forecasting with auto-regressive integrated moving average

    Science.gov (United States)

    Nasir, Najah; Samsudin, Ruhaidah; Shabri, Ani

    2017-09-01

    Forecasting of streamflow is one of the many ways that can contribute to better decision making for water resource management. The auto-regressive integrated moving average (ARIMA) model was selected in this research for monthly streamflow forecasting with enhancement made by pre-processing the data using singular spectrum analysis (SSA). This study also proposed an extension of the SSA technique to include a step where clustering was performed on the eigenvector pairs before reconstruction of the time series. The monthly streamflow data of Sungai Muda at Jeniang, Sungai Muda at Jambatan Syed Omar and Sungai Ketil at Kuala Pegang was gathered from the Department of Irrigation and Drainage Malaysia. A ratio of 9:1 was used to divide the data into training and testing sets. The ARIMA, SSA-ARIMA and Clustered SSA-ARIMA models were all developed in R software. Results from the proposed model are then compared to a conventional auto-regressive integrated moving average model using the root-mean-square error and mean absolute error values. It was found that the proposed model can outperform the conventional model.
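
    For orientation only, the Python sketch below fits the conventional ARIMA benchmark to a synthetic monthly streamflow series with the same 9:1 train/test split and reports the root-mean-square error; the SSA pre-processing and clustering steps of the proposed model are not reproduced, and the model order is an arbitrary choice.

      # Conventional ARIMA benchmark for monthly streamflow forecasting (synthetic data).
      import numpy as np
      from statsmodels.tsa.arima.model import ARIMA

      rng = np.random.default_rng(3)
      months = 240
      seasonal = 80 + 40 * np.sin(2 * np.pi * np.arange(months) / 12)
      flow = seasonal + rng.normal(scale=10, size=months)      # synthetic monthly flow (m3/s)

      split = int(0.9 * months)                                # 9:1 train/test split
      train, test = flow[:split], flow[split:]

      model = ARIMA(train, order=(2, 0, 1)).fit()              # illustrative (p, d, q) order
      forecast = model.forecast(steps=len(test))
      rmse = np.sqrt(np.mean((forecast - test) ** 2))
      print("RMSE on the test period:", rmse)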

  8. Downscaling of GCM forecasts to streamflow over Scandinavia

    DEFF Research Database (Denmark)

    Nilsson, P.; Uvo, C.B.; Landman, W.A.

    2008-01-01

    A seasonal forecasting technique to produce probabilistic and deterministic streamflow forecasts for 23 basins in Norway and northern Sweden is developed in this work. Large scale circulation and moisture fields, forecasted by the ECHAM4.5 model 4 months in advance, are used to forecast spring flows. The technique includes model output statistics (MOS) based on a non-linear Neural Network (NN) approach. Results show that streamflow forecasts from Global Circulation Model (GCM) predictions for the Scandinavia region are viable, and highest skill values were found for basins located in south-western Norway. The physical interpretation of the forecasting skill is that stations close to the Norwegian coast are directly exposed to prevailing winds from the Atlantic ocean, which constitute the principal source of predictive information from the atmosphere on the seasonal timescale.

  9. Resampling procedures to validate dendro-auxometric regression models

    Directory of Open Access Journals (Sweden)

    2009-03-01

    Full Text Available Regression analysis is widely used in several sectors of forest research. The validation of a dendro-auxometric model is a basic step in the building of the model itself. The more a model withstands attempts to demonstrate its groundlessness, the more its reliability increases. In recent decades many new theories that exploit the computational speed of modern computers have been formulated. Here we show the results obtained by applying a bootstrap resampling procedure as a validation tool.

  10. Predicting the ungauged basin: Model validation and realism assessment

    Directory of Open Access Journals (Sweden)

    Tim van Emmerik

    2015-10-01

    Full Text Available The hydrological decade on Predictions in Ungauged Basins (PUB) led to many new insights in model development, calibration strategies, data acquisition and uncertainty analysis. Due to the limited number of published studies on genuinely ungauged basins, model validation and realism assessment of model outcomes have not been discussed to a great extent. With this paper we aim to contribute to the discussion on how one can determine the value and validity of a hydrological model developed for an ungauged basin. As in many cases no local, or even regional, data are available, alternative methods should be applied. Using a PUB case study in a genuinely ungauged basin in southern Cambodia, we give several examples of how one can use different types of soft data to improve model design, calibrate and validate the model, and assess the realism of the model output. A rainfall-runoff model was coupled to an irrigation reservoir, allowing the use of additional and unconventional data. The model was mainly forced with remote sensing data, and local knowledge was used to constrain the parameters. Model realism assessment was done using data from surveys. This resulted in a successful reconstruction of the reservoir dynamics, and revealed the different hydrological characteristics of the two topographical classes. This paper does not present a generic approach that can be transferred to other ungauged catchments, but it aims to show how clever model design and alternative data acquisition can result in a valuable hydrological model for an ungauged catchment.

  11. Predicting the ungauged basin: model validation and realism assessment

    NARCIS (Netherlands)

    van Emmerik, Tim; Mulder, Gert; Eilander, Dirk; Piet, Marijn; Savenije, Hubert

    2015-01-01

    The hydrological decade on Predictions in Ungauged Basins (PUB) led to many new insights in model development, calibration strategies, data acquisition and uncertainty analysis. Due to a limited amount of published studies on genuinely ungauged basins, model validation and realism assessment of

  12. Predicting the ungauged basin : Model validation and realism assessment

    NARCIS (Netherlands)

    van Emmerik, Tim; Mulder, Gert; Eilander, Dirk; Piet, Marijn; Savenije, Hubert

    2015-01-01

    The hydrological decade on Predictions in Ungauged Basins (PUB) led to many new insights in model development, calibration strategies, data acquisition and uncertainty analysis. Due to a limited amount of published studies on genuinely ungauged basins, model validation and realism assessment of

  13. Validation of a multi-objective, predictive urban traffic model

    NARCIS (Netherlands)

    Wilmink, I.R.; Haak, P. van den; Woldeab, Z.; Vreeswijk, J.

    2013-01-01

    This paper describes the results of the verification and validation of the ecoStrategic Model, which was developed, implemented and tested in the eCoMove project. The model uses real-time and historical traffic information to determine the current, predicted and desired state of traffic in a

  14. Landslide Tsunami Generation Models: Validation and Case Studies

    Science.gov (United States)

    Watts, P.; Grilli, S. T.; Kirby, J. T.; Fryer, G. J.; Tappin, D. R.

    2002-12-01

    There has been a proliferation of landslide tsunami generation and propagation models in recent years, spurred largely by the 1998 Papua New Guinea event. However, few of these models or techniques have been carefully validated. Moreover, few of these models have proven capable of integrating the best available geological data and interpretations into convincing case studies. The Tsunami Open and Progressive Initial Conditions System (TOPICS) rapidly provides approximate landslide tsunami sources for tsunami propagation models. We present 3D laboratory experiments and 3D Boundary Element Method simulations that validate the tsunami sources given by TOPICS. Geowave is a combination of TOPICS with the fully nonlinear and dispersive Boussinesq model FUNWAVE, which has been the subject of extensive testing and validation over the course of the last decade. Geowave is currently a tsunami community model made available to all tsunami researchers on the web site www.tsunamicommunity.org. We validate Geowave with case studies of the 1946 Unimak, Alaska, the 1994 Skagway, Alaska, and the 1998 Papua New Guinea events. The benefits of Boussinesq wave propagation over traditional shallow water wave models are very apparent for these relatively steep and nonlinear waves. For the first time, a tsunami community model appears sufficiently powerful to reproduce all observations and records with the first numerical simulation. This can only be accomplished by first assembling geological data and interpretations into a reasonable tsunami source.

  15. Child human model development: a hybrid validation approach

    NARCIS (Netherlands)

    Forbes, P.A.; Rooij, L. van; Rodarius, C.; Crandall, J.

    2008-01-01

    The current study presents a development and validation approach of a child human body model that will help understand child impact injuries and improve the biofidelity of child anthropometric test devices. Due to the lack of fundamental child biomechanical data needed to fully develop such models a

  16. Validation & verification of a Bayesian network model for aircraft vulnerability

    CSIR Research Space (South Africa)

    Schietekat, Sunelle

    2016-09-01

    Full Text Available This paper provides a methodology for Validation and Verification (V&V) of a Bayesian Network (BN) model for aircraft vulnerability against Infrared (IR) missile threats. The model considers that the aircraft vulnerability depends both on a missile...

  17. On the development and validation of QSAR models.

    Science.gov (United States)

    Gramatica, Paola

    2013-01-01

    The fundamental and more critical steps that are necessary for the development and validation of QSAR models are presented in this chapter as best practices in the field. These procedures are discussed in the context of predictive QSAR modelling that is focused on achieving models of the highest statistical quality and with external predictive power. The most important and most used statistical parameters needed to verify the real performances of QSAR models (of both linear regression and classification) are presented. Special emphasis is placed on the validation of models, both internally and externally, as well as on the need to define model applicability domains, which should be done when models are employed for the prediction of new external compounds.

  18. Improving flood prediction by assimilation of the distributed streamflow observations with variable uncertainty and intermittent behavior

    Science.gov (United States)

    Mazzoleni, Maurizio; Alfonso, Leonardo; Solomatine, Dimitri

    2015-04-01

    Data assimilation techniques have been used in recent decades to integrate water measurements from physical sensors into mathematical models in order to improve flood prediction. In parallel, continued technological improvement has stimulated the spread of low-cost sensors that infer hydrological variables in a more distributed, but less accurate, way. The main goal of this study is to demonstrate how assimilation of streamflow observations having variable uncertainty and intermittent characteristics can improve flood prediction using a hydrological model. The methodology is applied in the Brue catchment, South West England. The catchment is divided into small sub-basins, at about 2 km2 resolution, in order to represent the spatial variability of the streamflow observations by means of a semi-distributed Kalinin-Milyukov-Nash cascade model. The measured precipitation values are used as perfect forecast input to the hydrological model. An Ensemble Kalman filter is then implemented and adapted to account for streamflow observations that have random uncertainty and arrive at irregular time steps. Because distributed observations are not available within the Brue basin, synthetic streamflow values are generated. The results show how streamflow observations with variable uncertainty can improve flood prediction, depending on the location from which these observations come. Overall, streamflow observations coming from low-cost sensors can be integrated with physical sensor observations to improve flood prediction. This study is part of the FP7 European Project WeSenseIt Citizen Water Observatory (www.http://wesenseit.eu/).
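
    A generic perturbed-observation ensemble Kalman filter update for a single uncertain streamflow observation is sketched below in Python. The state is simply the simulated discharge itself and all numbers are illustrative, so this is the textbook scalar update step rather than the study's semi-distributed implementation.

      # Generic EnKF analysis step for one streamflow observation with known error variance.
      import numpy as np

      rng = np.random.default_rng(5)

      ensemble = rng.normal(loc=120.0, scale=15.0, size=50)   # forecast discharge ensemble (m3/s)
      obs = 135.0                                             # observed discharge (m3/s)
      obs_var = 9.0                                           # observation error variance; varies with sensor quality

      # Kalman gain from ensemble spread versus observation uncertainty
      forecast_var = np.var(ensemble, ddof=1)
      gain = forecast_var / (forecast_var + obs_var)

      # Perturbed-observation update of each ensemble member
      perturbed_obs = obs + rng.normal(scale=np.sqrt(obs_var), size=ensemble.size)
      analysis = ensemble + gain * (perturbed_obs - ensemble)

      print("forecast mean:", ensemble.mean(), "analysis mean:", analysis.mean())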

  19. Global change in streamflow extremes under climate change over the 21st century

    Science.gov (United States)

    Asadieh, Behzad; Krakauer, Nir Y.

    2017-11-01

    Global warming is expected to intensify the Earth's hydrological cycle and increase flood and drought risks. Changes over the 21st century under two warming scenarios in different percentiles of the probability distribution of streamflow, and particularly of high and low streamflow extremes (95th and 5th percentiles), are analyzed using an ensemble of bias-corrected global climate model (GCM) fields fed into different global hydrological models (GHMs) provided by the Inter-Sectoral Impact Model Intercomparison Project (ISI-MIP) to understand the changes in streamflow distribution and simultaneous vulnerability to different types of hydrological risk in different regions. In the multi-model mean under the Representative Concentration Pathway 8.5 (RCP8.5) scenario, 37 % of global land areas experience an increase in magnitude of extremely high streamflow (with an average increase of 24.5 %), potentially increasing the chance of flooding in those regions. On the other hand, 43 % of global land areas show a decrease in the magnitude of extremely low streamflow (average decrease of 51.5 %), potentially increasing the chance of drought in those regions. About 10 % of the global land area is projected to face simultaneously increasing high extreme streamflow and decreasing low extreme streamflow, reflecting the potentially worsening hazard of both flood and drought; further, these regions tend to be highly populated parts of the globe, currently holding around 30 % of the world's population (over 2.1 billion people). In a world more than 4° warmer by the end of the 21st century compared to the pre-industrial era (RCP8.5 scenario), changes in magnitude of streamflow extremes are projected to be about twice as large as in a 2° warmer world (RCP2.6 scenario). Results also show that inter-GHM uncertainty in streamflow changes, due to representation of terrestrial hydrology, is greater than the inter-GCM uncertainty due to simulation of climate change. Under both forcing
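
    To make the percentile-based definitions concrete, the short Python snippet below computes the 5th and 95th percentiles of two synthetic daily-streamflow samples and their relative change, mirroring in a purely illustrative way the "extremely low" and "extremely high" streamflow metrics used above; it does not reproduce the ISI-MIP multi-model analysis.

      # Relative change in the 5th and 95th streamflow percentiles between two periods.
      import numpy as np

      rng = np.random.default_rng(11)
      flow_hist = rng.lognormal(mean=3.0, sigma=0.6, size=365 * 30)   # synthetic historical daily flow
      flow_futr = rng.lognormal(mean=3.1, sigma=0.7, size=365 * 30)   # synthetic end-of-century daily flow

      for p, label in ((5, "low-flow extreme"), (95, "high-flow extreme")):
          q_hist = np.percentile(flow_hist, p)
          q_futr = np.percentile(flow_futr, p)
          print(f"{label}: {100 * (q_futr - q_hist) / q_hist:+.1f} % change")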

  20. Validation of heat transfer models for gap cooling

    International Nuclear Information System (INIS)

    Okano, Yukimitsu; Nagae, Takashi; Murase, Michio

    2004-01-01

    For severe accident assessment of a light water reactor, models of heat transfer in a narrow annular gap between overheated core debris and a reactor pressure vessel are important for evaluating vessel integrity and accident management. The authors developed and improved the models of heat transfer. However, validation was not sufficient for applicability of the gap heat flux correlation to the debris cooling in the vessel lower head and applicability of the local boiling heat flux correlations to the high-pressure conditions. Therefore, in this paper, we evaluated the validity of the heat transfer models and correlations by analyses for ALPHA and LAVA experiments where molten aluminum oxide (Al2O3) at about 2700 K was poured into the high pressure water pool in a small-scale simulated vessel lower head. In the heating process of the vessel wall, the calculated heating rate and peak temperature agreed well with the measured values, and the validity of the heat transfer models and gap heat flux correlation was confirmed. In the cooling process of the vessel wall, the calculated cooling rate was compared with the measured value, and the validity of the nucleate boiling heat flux correlation was confirmed. The peak temperatures of the vessel wall in ALPHA and LAVA experiments were lower than the temperature at the minimum heat flux point between film boiling and transition boiling, so the minimum heat flux correlation could not be validated. (author)

  1. What Do They Have in Common? Drivers of Streamflow Spatial Correlation and Prediction of Flow Regimes in Ungauged Locations

    Science.gov (United States)

    Betterle, A.; Radny, D.; Schirmer, M.; Botter, G.

    2017-12-01

    The spatial correlation of daily streamflows represents a statistical index encapsulating the similarity between hydrographs at two arbitrary catchment outlets. In this work, a process-based analytical framework is utilized to investigate the hydrological drivers of streamflow spatial correlation through an extensive application to 78 pairs of stream gauges belonging to 13 unregulated catchments in the eastern United States. The analysis provides insight into how the observed heterogeneity of the physical processes that control flow dynamics ultimately affects streamflow correlation and spatial patterns of flow regimes. Despite the variability of recession properties across the study catchments, the impact of heterogeneous drainage rates on the streamflow spatial correlation is overwhelmed by the spatial variability of frequency and intensity of effective rainfall events. Overall, model performances are satisfactory, with root mean square errors between modeled and observed streamflow spatial correlation below 10% in most cases. We also propose a method for estimating streamflow correlation in the absence of discharge data, which proves useful to predict streamflow regimes in ungauged areas. The method consists of setting a minimum threshold on the modeled flow correlation to identify hydrologically similar sites. Catchment outlets that are most correlated (ρ>0.9) are found to be characterized by analogous streamflow distributions across a broad range of flow regimes.
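
    A minimal Python illustration of the statistic analysed above: the correlation of daily streamflow between two gauges, with a simple threshold used to flag "hydrologically similar" pairs. The 0.9 threshold follows the abstract; the two synthetic series and their shared forcing are otherwise arbitrary.

      # Spatial correlation of daily streamflow between two gauges, with a similarity threshold.
      import numpy as np

      rng = np.random.default_rng(21)
      common_signal = rng.gamma(2.0, 20.0, size=365)                   # shared effective-rainfall driver
      q_a = common_signal + rng.normal(scale=5.0, size=365)            # gauge A daily flow (synthetic)
      q_b = 0.8 * common_signal + rng.normal(scale=5.0, size=365)      # gauge B daily flow (synthetic)

      rho = np.corrcoef(q_a, q_b)[0, 1]
      print("streamflow spatial correlation:", round(rho, 3))
      print("hydrologically similar (rho > 0.9):", rho > 0.9)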

  2. The turbulent viscosity models and their experimental validation; Les modeles de viscosite turbulente et leur validation experimentale

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-12-31

    This workshop on turbulent viscosity models and their experimental validation was organized by the 'convection' section of the French Society of Thermal Engineers. Of the 9 papers presented during the workshop, 8 deal with the modeling of turbulent flows inside combustion chambers, turbomachinery or other energy-related applications, and have been selected for ETDE. (J.S.)

  3. Validity of empirical models of exposure in asphalt paving

    Science.gov (United States)

    Burstyn, I; Boffetta, P; Burr, G; Cenni, A; Knecht, U; Sciarra, G; Kromhout, H

    2002-01-01

    Aims: To investigate the validity of empirical models of exposure to bitumen fume and benzo(a)pyrene, developed for a historical cohort study of asphalt paving in Western Europe. Methods: Validity was evaluated using data from the USA, Italy, and Germany not used to develop the original models. Correlation between observed and predicted exposures was examined. Bias and precision were estimated. Results: Models were imprecise. Furthermore, predicted bitumen fume exposures tended to be lower (-70%) than concentrations found during paving in the USA. This apparent bias might be attributed to differences between Western European and USA paving practices. Evaluation of the validity of the benzo(a)pyrene exposure model revealed an effect of re-paving similar to that expected and a larger than expected effect of tar use. Overall, benzo(a)pyrene models underestimated exposures by 51%. Conclusions: Possible bias as a result of underestimation of the impact of coal tar on benzo(a)pyrene exposure levels must be explored in sensitivity analysis of the exposure–response relation. Validation of the models, albeit limited, increased our confidence in their applicability to exposure assessment in the historical cohort study of cancer risk among asphalt workers. PMID:12205236

  4. Analytical models approximating individual processes: a validation method.

    Science.gov (United States)

    Favier, C; Degallier, N; Menkès, C E

    2010-12-01

    Upscaling population models from fine to coarse resolutions, in space, time and/or level of description, allows the derivation of fast and tractable models based on a thorough knowledge of individual processes. The validity of such approximations is generally tested only on a limited range of parameter sets. A more general validation test, over a range of parameters, is proposed; this would estimate the error induced by the approximation, using the original model's stochastic variability as a reference. This method is illustrated by three examples taken from the field of epidemics transmitted by vectors that bite in a temporally cyclical pattern, which show how the method can be used: to estimate whether an approximation over- or under-fits the original model; to invalidate an approximation; and to rank possible approximations by their quality. As a result, the application of the validation method to this field emphasizes the need to account for the vectors' biology in epidemic prediction models and to validate these against finer scale models. Copyright © 2010 Elsevier Inc. All rights reserved.

  5. Multiscale temporal variability and regional patterns in 555 years of conterminous U.S. streamflow

    Science.gov (United States)

    Ho, Michelle; Lall, Upmanu; Sun, Xun; Cook, Edward R.

    2017-04-01

    The development of paleoclimate streamflow reconstructions in the conterminous United States (CONUS) has provided water resource managers with improved insights into multidecadal and centennial scale variability that cannot be reliably detected using shorter instrumental records. Paleoclimate streamflow reconstructions have largely focused on individual catchments, limiting the ability to quantify variability across the CONUS. The Living Blended Drought Atlas (LBDA), a spatially and temporally complete 555-year-long paleoclimate record of summer drought across the CONUS, provides an opportunity to reconstruct and characterize streamflow variability at a continental scale. We explore the validity of the first paleoreconstructions of streamflow that span the CONUS informed by the LBDA targeting a set of U.S. Geological Survey streamflow sites. The reconstructions are skillful under cross validation across most of the country, but the variance explained is generally low. Spatial and temporal structures of streamflow variability are analyzed using hierarchical clustering, principal component analysis, and wavelet analyses. Nine spatially coherent clusters are identified. The reconstructions show signals of contemporary droughts such as the Dust Bowl (1930s) and 1950s droughts. Decadal-scale variability was detected in the late 1900s in the western U.S.; however, similar modes of temporal variability were rarely present prior to the 1950s. The twentieth century featured longer wet spells and shorter dry spells compared with the preceding 450 years. Streamflows in the Pacific Northwest and Northeast are negatively correlated with the central U.S., suggesting the potential to mitigate some drought impacts by balancing economic activities and insurance pools across these regions during major droughts.

  6. Empirical validation data sets for double skin facade models

    DEFF Research Database (Denmark)

    Kalyanova, Olena; Jensen, Rasmus Lund; Heiselberg, Per

    2008-01-01

    During recent years application of double skin facades (DSF) has greatly increased. However, successful application depends heavily on reliable and validated models for simulation of the DSF performance and this in turn requires access to high quality experimental data. Three sets of accurate empirical data for validation of DSF modeling with building simulation software were produced within the International Energy Agency (IEA) SHC Task 34 / ECBCS Annex 43. This paper describes the full-scale outdoor experimental test facility, the experimental set-up and the measurement procedure...

  7. Validation of spectral gas radiation models under oxyfuel conditions

    Energy Technology Data Exchange (ETDEWEB)

    Becher, Johann Valentin

    2013-05-15

    Combustion of hydrocarbon fuels with pure oxygen results in a different flue gas composition than combustion with air. Standard computational-fluid-dynamics (CFD) spectral gas radiation models for air combustion are therefore out of their validity range in oxyfuel combustion. This thesis provides a common spectral basis for the validation of new spectral models. A literature review about fundamental gas radiation theory, spectral modeling and experimental methods provides the reader with a basic understanding of the topic. In the first results section, this thesis validates detailed spectral models with high resolution spectral measurements in a gas cell with the aim of recommending one model as the best benchmark model. In the second results section, spectral measurements from a turbulent natural gas flame - as an example for a technical combustion process - are compared to simulated spectra based on measured gas atmospheres. The third results section compares simplified spectral models to the benchmark model recommended in the first results section and gives a ranking of the proposed models based on their accuracy. A concluding section gives recommendations for the selection and further development of simplified spectral radiation models. Gas cell transmissivity spectra in the spectral range of 2.4-5.4 µm of water vapor and carbon dioxide in the temperature range from 727 °C to 1500 °C and at different concentrations were compared in the first results section at a nominal resolution of 32 cm-1 to line-by-line models from different databases, two statistical-narrow-band models and the exponential-wide-band model. The two statistical-narrow-band models EM2C and RADCAL showed good agreement with a maximal band transmissivity deviation of 3 %. The exponential-wide-band model showed a deviation of 6 %. The new line-by-line database HITEMP2010 had the lowest band transmissivity deviation of 2.2% and was therefore recommended as a reference model for the

  8. MODEL-BASED VALIDATION AND VERIFICATION OF ANOMALIES IN LEGISLATION

    Directory of Open Access Journals (Sweden)

    Vjeran Strahonja

    2006-12-01

    Full Text Available An anomaly in legislation is the absence of completeness, consistency and other desirable properties, caused by different semantic, syntactic or pragmatic reasons. In general, the detection of anomalies in legislation comprises validation and verification. The basic idea of the research presented in this paper is to model legislation by capturing the domain knowledge of legislation and specifying it in a generic way, using the commonly agreed and understandable modelling concepts of the Unified Modelling Language (UML). Models of legislation make it possible to understand the system better, support the detection of anomalies and help to improve the quality of legislation by validation and verification. By implementing a model-based approach, the object of validation and verification moves from the legislation to its model. The business domain of legislation has two distinct aspects: a structural or static aspect (functionality, business data, etc.) and a behavioural or dynamic aspect (states, transitions, activities, sequences, etc.). Because anomalies can occur on two different levels, on the level of a model or on the level of the legislation itself, a framework for validation and verification of legal regulation and its model is discussed. The presented framework includes some significant types of semantic and syntactic anomalies. Some ideas for the assessment of pragmatic anomalies of models were found in the field of software quality metrics. Thus pragmatic features and attributes can be determined that could be relevant for the evaluation of models. Based on analogous standards for the evaluation of software, a qualitative and quantitative scale can be applied to determine the value of some feature for a specific model.

  9. Validation techniques of agent based modelling for geospatial simulations

    Science.gov (United States)

    Darvishi, M.; Ahmadi, G.

    2014-10-01

    One of the most interesting aspects of modelling and simulation studies is describing real-world phenomena that have specific properties, especially those that occur at large scales and have dynamic and complex behaviours. Studying these phenomena in the laboratory is costly and in most cases impossible. Therefore, miniaturizing world phenomena within the framework of a model in order to simulate the real phenomena is a reasonable and scientific approach to understanding the world. Agent-based modelling and simulation (ABMS) is a new modelling method comprising multiple interacting agents. It has been used in different areas, for instance geographic information systems (GIS), biology, economics, social science and computer science. The emergence of ABM toolkits in GIS software libraries (e.g. ESRI's ArcGIS, OpenMap, GeoTools, etc.) for geospatial modelling is an indication of the growing interest of users in the special capabilities of ABMS. Since ABMS is inherently similar to human cognition, it can be built more easily and applied to a wider range of applications than a traditional simulation. A key challenge with ABMS, however, is the difficulty of validation and verification. Because of frequently emerging patterns, strong dynamics in the system and the complex nature of ABMS, it is hard to validate and verify ABMS with conventional validation methods. Therefore, an attempt to find appropriate validation techniques for ABM seems necessary. In this paper, after reviewing the principles and concepts of ABM and its applications, the validation techniques and challenges of ABM validation are discussed.

  10. Validation of the newborn larynx modeling with aerodynamical experimental data.

    Science.gov (United States)

    Nicollas, R; Giordano, J; Garrel, R; Medale, M; Caminat, P; Giovanni, A; Ouaknine, M; Triglia, J M

    2009-06-01

    Many authors have studied modelling of the adult larynx, but the mechanisms of newborn voice production have very rarely been investigated. After validating a numerical model with acoustic data, studies were performed on larynges of human fetuses in order to validate this model with aerodynamical experiments. Anatomical measurements were performed and a simplified numerical model was built using Fluent® with the vocal folds in phonatory position. The results obtained are in good agreement with those obtained by laser Doppler velocimetry (LDV) and high-frame-rate particle image velocimetry (HFR-PIV) on an experimental bench with excised human fetus larynges. It appears that computing with first-cry physiological parameters leads to a model which is close to those obtained in experiments with real organs.

  11. Thermal hydraulic model validation for HOR mixed core fuel management

    International Nuclear Information System (INIS)

    Gibcus, H.P.M.; Vries, J.W. de; Leege, P.F.A. de

    1997-01-01

    A thermal-hydraulic core management model has been developed for the Hoger Onderwijsreactor (HOR), a 2 MW pool-type university research reactor. The model was adopted for safety analysis purposes in the framework of HEU/LEU core conversion studies. It is applied in the thermal-hydraulic computer code SHORT (Steady-state HOR Thermal-hydraulics), which is presently in use for designing core configurations and for in-core fuel management. An elaborate measurement program was performed to establish the core hydraulic characteristics for a variety of conditions. The hydraulic data were obtained with a dummy fuel element with special equipment allowing, among other things, direct measurement of the true core flow rate. Using these data the thermal-hydraulic model was validated experimentally. The model, experimental tests, and model validation are discussed. (author)

  12. Validation of Computer Models for Homeland Security Purposes

    International Nuclear Information System (INIS)

    Schweppe, John E.; Ely, James; Kouzes, Richard T.; McConn, Ronald J.; Pagh, Richard T.; Robinson, Sean M.; Siciliano, Edward R.; Borgardt, James D.; Bender, Sarah E.; Earnhart, Alison H.

    2005-01-01

    At Pacific Northwest National Laboratory, we are developing computer models of radiation portal monitors for screening vehicles and cargo. Detailed models of the radiation detection equipment, vehicles, cargo containers, cargos, and radioactive sources have been created. These are used to determine the optimal configuration of detectors and the best alarm algorithms for the detection of items of interest while minimizing nuisance alarms due to the presence of legitimate radioactive material in the commerce stream. Most of the modeling is done with the Monte Carlo code MCNP to describe the transport of gammas and neutrons from extended sources through large, irregularly shaped absorbers to large detectors. A fundamental prerequisite is the validation of the computational models against field measurements. We describe the first step of this validation process, the comparison of the models to measurements with bare static sources.

  13. Verification and Validation of FAARR Model and Data Envelopment Analysis Models for United States Army Recruiting

    National Research Council Canada - National Science Library

    Piskator, Gene

    1998-01-01

    ...) model and to develop a Data Envelopment Analysis (DEA) modeling strategy. First, the FAARR model was verified using a simulation of a known production function and validated using sensitivity analysis and ex-post forecasts...

  14. Experimental Validation of Flow Force Models for Fast Switching Valves

    DEFF Research Database (Denmark)

    Bender, Niels Christian; Pedersen, Henrik Clemmensen; Nørgård, Christian

    2017-01-01

    This paper comprises a detailed study of the forces acting on a Fast Switching Valve (FSV) plunger. The objective is to investigate to what extent different models are valid for design purposes. These models depend on the geometry of the moving plunger and the properties of the surrounding fluid ... velocity is non-zero. This is the case in FSVs, where it results in an additional damping effect, which is of relevance when analyzing contact-impact. Experimental data from different test cases of an FSV have been gathered, with the plunger moving through a medium of either oil or air. These data are used to compare and validate different models, where an effort is directed towards capturing the fluid squeeze effect just before material-on-material contact. The test data are compared with simulation data relying solely on analytic formulations. The general dynamics of the plunger is validated ...

  15. Modeling and validation of microwave ablations with internal vaporization.

    Science.gov (United States)

    Chiang, Jason; Birla, Sohan; Bedoya, Mariajose; Jones, David; Subbiah, Jeyam; Brace, Christopher L

    2015-02-01

    Numerical simulation is increasingly being utilized for computer-aided design of treatment devices, analysis of ablation growth, and clinical treatment planning. Simulation models to date have incorporated electromagnetic wave propagation and heat conduction, but not other relevant physics such as water vaporization and mass transfer. Such physical changes are particularly noteworthy during the intense heat generation associated with microwave heating. In this paper, a numerical model was created that integrates microwave heating with water vapor generation and transport by using porous media assumptions in the tissue domain. The heating physics of the water vapor model was validated through temperature measurements taken at locations 5, 10, and 20 mm away from the heating zone of the microwave antenna in homogenized ex vivo bovine liver setup. Cross-sectional area of water vapor transport was validated through intraprocedural computed tomography (CT) during microwave ablations in homogenized ex vivo bovine liver. Iso-density contours from CT images were compared to vapor concentration contours from the numerical model at intermittent time points using the Jaccard index. In general, there was an improving correlation in ablation size dimensions as the ablation procedure proceeded, with a Jaccard index of 0.27, 0.49, 0.61, 0.67, and 0.69 at 1, 2, 3, 4, and 5 min, respectively. This study demonstrates the feasibility and validity of incorporating water vapor concentration into thermal ablation simulations and validating such models experimentally.

  16. Estimation of average annual streamflows and power potentials for Alaska and Hawaii

    Energy Technology Data Exchange (ETDEWEB)

    Verdin, Kristine L. [Idaho National Lab. (INL), Idaho Falls, ID (United States). Idaho National Engineering and Environmental Lab. (INEEL)

    2004-05-01

    This paper describes the work done to develop average annual streamflow estimates and power potential for the states of Alaska and Hawaii. The Elevation Derivatives for National Applications (EDNA) database was used, along with climatic datasets, to develop flow and power estimates for every stream reach in the EDNA database. Estimates of average annual streamflows were derived using state-specific regression equations, which were functions of average annual precipitation, precipitation intensity, drainage area, and other elevation-derived parameters. Power potential was calculated through the use of the average annual streamflow and the hydraulic head of each reach, which is calculated from the EDNA digital elevation model. In all, estimates of streamflow and power potential were calculated for over 170,000 stream segments in the Alaskan and Hawaiian datasets.
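
    For readers unfamiliar with the power calculation described above, the sketch below shows the underlying relation P = ρ·g·Q·H for a single reach; the numbers and function names are illustrative and are not taken from the EDNA workflow.

```python
# Minimal sketch of a reach-level hydropower potential estimate of the kind
# described above: P = rho * g * Q * H. Names and numbers are illustrative,
# not taken from the EDNA database.

RHO_WATER = 1000.0   # density of water, kg/m^3
G = 9.81             # gravitational acceleration, m/s^2

def power_potential_kw(mean_annual_flow_m3s: float, hydraulic_head_m: float) -> float:
    """Theoretical (unadjusted) hydraulic power of a stream reach, in kilowatts."""
    return RHO_WATER * G * mean_annual_flow_m3s * hydraulic_head_m / 1000.0

# Example: a reach with 12 m3/s average annual flow and 8 m of head
print(round(power_potential_kw(12.0, 8.0)))  # roughly 942 kW of theoretical potential
```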

  17. Predicting third molar surgery operative time: a validated model.

    Science.gov (United States)

    Susarla, Srinivas M; Dodson, Thomas B

    2013-01-01

    The purpose of the present study was to develop and validate a statistical model to predict third molar (M3) operative time. This was a prospective cohort study consisting of a sample of subjects presenting for M3 removal. The demographic, anatomic, and operative variables were recorded for each subject. Using an index sample of randomly selected subjects, a multiple linear regression model was generated to predict the operating time. A nonoverlapping group of randomly selected subjects (validation sample) was used to assess model accuracy. P≤.05 was considered significant. The sample was composed of 150 subjects (n) who had 450 (k) M3s removed. The index sample (n=100 subjects, k=313 M3s extracted) had a mean age of 25.4±10.0 years. The mean extraction time was 6.4±7.0 minutes. The multiple linear regression model included M3 location, Winter's classification, tooth morphology, number of teeth extracted, procedure type, and surgical experience (R2=0.58). No statistically significant differences were seen between the index sample and the validation sample (n=50, k=137) for any of the study variables. Compared with the index model, the β-coefficients of the validation model were similar in direction and magnitude for most variables. Compared with the observed extraction time for all teeth in the sample, the predicted extraction time was not significantly different (P=.16). Fair agreement was seen between the β-coefficients for our multiple models in the index and validation populations, with no significant difference in the predicted and observed operating times. Copyright © 2013 American Association of Oral and Maxillofacial Surgeons. Published by Elsevier Inc. All rights reserved.
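
    The index/validation-sample approach described above can be illustrated with a small, hedged sketch: fit a multiple linear regression on a randomly selected index subset and score it on the held-out validation subset. The data below are synthetic placeholders, not the study's surgical records.

```python
# Hedged sketch of an index/validation-sample regression workflow:
# fit on one random subset, check predictions on a held-out subset.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(450, 6))   # six synthetic predictors (stand-ins for location, morphology, ...)
y = 6.4 + X @ np.array([1.5, 0.8, -0.6, 1.1, 0.4, 2.0]) + rng.normal(scale=3.0, size=450)

X_index, X_valid, y_index, y_valid = train_test_split(X, y, test_size=1/3, random_state=1)

model = LinearRegression().fit(X_index, y_index)      # fit on the index sample only
print("R^2 (index):", round(model.score(X_index, y_index), 2))
print("R^2 (validation):", round(model.score(X_valid, y_valid), 2))
```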

  18. Validation of statistical models for creep rupture by parametric analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bolton, J., E-mail: john.bolton@uwclub.net [65, Fisher Ave., Rugby, Warks CV22 5HW (United Kingdom)

    2012-01-15

    Statistical analysis is an efficient method for the optimisation of any candidate mathematical model of creep rupture data, and for the comparative ranking of competing models. However, when a series of candidate models has been examined and the best of the series has been identified, there is no statistical criterion to determine whether a yet more accurate model might be devised. Hence there remains some uncertainty that the best of any series examined is sufficiently accurate to be considered reliable as a basis for extrapolation. This paper proposes that models should be validated primarily by parametric graphical comparison to rupture data and rupture gradient data. It proposes that no mathematical model should be considered reliable for extrapolation unless the visible divergence between model and data is so small as to leave no apparent scope for further reduction. This study is based on the data for a 12% Cr alloy steel used in BS PD6605:1998 to exemplify its recommended statistical analysis procedure. The models considered in this paper include a) a relatively simple model, b) the PD6605 recommended model and c) a more accurate model of somewhat greater complexity. - Highlights: • The paper discusses the validation of creep rupture models derived from statistical analysis. • It demonstrates that models can be satisfactorily validated by a visual-graphic comparison of models to data. • The method proposed utilises test data both as conventional rupture stress and as rupture stress gradient. • The approach is shown to be more reliable than a well-established and widely used method (BS PD6605).

  19. Progress in Geant4 Electromagnetic Physics Modelling and Validation

    CERN Document Server

    Apostolakis, J; Bagulya, A; Brown, J M C; Burkhardt, H; Chikuma, N; Cortes-Giraldo, M A; Elles, S; Grichine, V; Guatelli, S; Incerti, S; Ivanchenko, V N; Jacquemier, J; Kadri, O; Maire, M; Pandola, L; Sawkey, D; Toshito, T; Urban, L; Yamashita, T

    2015-01-01

    In this work we report on recent improvements in the electromagnetic (EM) physics models of Geant4 and new validations of EM physics. Improvements have been made in models of the photoelectric effect, Compton scattering, gamma conversion to electron and muon pairs, fluctuations of energy loss, multiple scattering, synchrotron radiation, and high energy positron annihilation. The results of these developments are included in the new Geant4 version 10.1 and in patches to previous versions 9.6 and 10.0 that are planned to be used for production for run-2 at LHC. The Geant4 validation suite for EM physics has been extended and new validation results are shown in this work. In particular, the effect of gamma-nuclear interactions on EM shower shape at LHC energies is discussed.

  20. A practical guide for operational validation of discrete simulation models

    Directory of Open Access Journals (Sweden)

    Fabiano Leal

    2011-04-01

    Full Text Available As the number of simulation experiments increases, the necessity for validation and verification of these models demands special attention on the part of simulation practitioners. The current scientific literature shows that descriptions of operational validation presented in many papers do not agree on the importance assigned to this process or on the techniques applied, whether subjective or objective. With the aim of orienting professionals, researchers and students in simulation, this article elaborates a practical guide through the compilation of statistical techniques for the operational validation of discrete simulation models. Finally, the guide's applicability was evaluated using two study objects representing two manufacturing cells, one from the automobile industry and the other from a Brazilian tech company. For each application, the guide identified distinct steps, owing to the different aspects that characterize the analyzed distributions.
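
    As an example of one objective technique that such a guide typically compiles, the hedged sketch below compares real-system output with simulation output using Welch's two-sample t-test; the cycle-time data are synthetic, not from the two manufacturing cells studied.

```python
# Minimal example of an objective operational-validation check: a two-sample
# t-test comparing real-system output with model output (synthetic cycle times).
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
real_cycle_times = rng.normal(loc=55.0, scale=4.0, size=30)        # observed cycle times, minutes
simulated_cycle_times = rng.normal(loc=56.0, scale=4.5, size=200)  # model replications, minutes

t_stat, p_value = stats.ttest_ind(real_cycle_times, simulated_cycle_times, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
# A large p-value gives no evidence of a difference in means, which supports
# (but does not by itself prove) operational validity of the model.
```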

  1. Streamflow conditions along Soldier Creek, Northeast Kansas

    Science.gov (United States)

    Juracek, Kyle E.

    2017-11-14

    The availability of adequate water to meet the present (2017) and future needs of humans, fish, and wildlife is a fundamental issue for the Prairie Band Potawatomi Nation in northeast Kansas. Because Soldier Creek flows through the Prairie Band Potawatomi Nation Reservation, it is an important tribal resource. An understanding of historical Soldier Creek streamflow conditions is required for the effective management of tribal water resources, including drought contingency planning. Historical data for six selected U.S. Geological Survey (USGS) streamgages along Soldier Creek were used in an assessment of streamflow characteristics and trends by Juracek (2017). Streamflow data for the period of record at each streamgage were used to compute annual mean streamflow, annual mean base flow, mean monthly flow, annual peak flow, and annual minimum flow. Results of the assessment are summarized in this fact sheet.
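
    The streamflow characteristics listed above (annual mean, mean monthly, annual peak, and annual minimum flow) can be computed from a daily discharge record as in the hedged sketch below; the series is synthetic rather than USGS streamgage data, and base-flow separation, which requires a dedicated filter, is omitted.

```python
# Sketch of computing basic streamflow characteristics from a daily record.
import numpy as np
import pandas as pd

# Synthetic daily discharge (cubic feet per second); real use would load the
# USGS streamgage series instead.
days = pd.date_range("1990-01-01", "2016-12-31", freq="D")
rng = np.random.default_rng(7)
q = pd.Series(np.exp(rng.normal(3.0, 0.8, len(days))), index=days, name="Q_cfs")

annual_mean = q.resample("YS").mean()            # annual mean streamflow
annual_peak = q.resample("YS").max()             # annual peak flow
annual_min = q.resample("YS").min()              # annual minimum flow
mean_monthly = q.groupby(q.index.month).mean()   # mean monthly flow (Jan..Dec)

print(annual_mean.head())
print(mean_monthly.round(1))
```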

  2. Multiphysics modelling and experimental validation of high concentration photovoltaic modules

    International Nuclear Information System (INIS)

    Theristis, Marios; Fernández, Eduardo F.; Sumner, Mike; O'Donovan, Tadhg S.

    2017-01-01

    Highlights: • A multiphysics modelling approach for concentrating photovoltaics was developed. • An experimental campaign was conducted to validate the models. • The experimental results were in good agreement with the models. • The multiphysics modelling allows the concentrator’s optimisation. - Abstract: High concentration photovoltaics, equipped with high efficiency multijunction solar cells, have great potential in achieving cost-effective and clean electricity generation at utility scale. Such systems are more complex compared to conventional photovoltaics because of the multiphysics effect that is present. Modelling the power output of such systems is therefore crucial for their further market penetration. Following this line, a multiphysics modelling procedure for high concentration photovoltaics is presented in this work. It combines an open source spectral model, a single diode electrical model and a three-dimensional finite element thermal model. In order to validate the models and the multiphysics modelling procedure against actual data, an outdoor experimental campaign was conducted in Albuquerque, New Mexico using a high concentration photovoltaic monomodule that is thoroughly described in terms of its geometry and materials. The experimental results were in good agreement (within 2.7%) with the predicted maximum power point. This multiphysics approach is relatively more complex when compared to empirical models, but besides the overall performance prediction it can also provide better understanding of the physics involved in the conversion of solar irradiance into electricity. It can therefore be used for the design and optimisation of high concentration photovoltaic modules.

  3. Streamflow predictions under climate scenarios in the Boulder Creek Watershed at Orodell

    Science.gov (United States)

    Zhang, Q.; Williams, M. W.; Livneh, B.

    2016-12-01

    Mountainous areas have complex geological features and climatic variability, which limit our ability to simulate and predict hydrologic processes, especially in the face of a changing climate. Hydrologic models can improve our understanding of land surface water and energy budgets in these regions. In this study, a distributed physically-based hydrologic model is applied to the Boulder Creek Watershed, USA to study streamflow conditions under future climatic scenarios. Model parameters were adjusted using observed streamflow data at 1/16th degree resolution, with an NSE value of 0.69. The results from CMIP5 models can give a general range of streamflow conditions under different climatic scenarios. Two scenarios are applied: RCP 4.5 and RCP 8.5. RCP 8.5 has higher emission concentrations than RCP 4.5, although the difference is not very significant over the period of study. Using paired t-tests and Mann-Whitney tests at specific grid cells to compare modeled and observed climate data, four CMIP5 models were chosen to predict streamflow from 2010 to 2025. Of the four models, two predicted increased precipitation, while the other two predicted decreased precipitation, and all four predicted increased minimum and maximum temperature in RCP 4.5. Average streamflow decreases by 2% to 14%, while maximum SWE varies from -7% to +210% from 2010 to 2025, relative to 2006 to 2010. In RCP 8.5, three models predicted increased precipitation, while the other one predicted decreased precipitation, and the four models predicted increased maximum and minimum temperature. Apart from one model, the other three models predicted average streamflow increases of 3.5% to 32%, which result from the higher increasing magnitude in precipitation. Maximum SWE varies from 6% to 55% higher than that from 2006 to 2010. This study shows that average daily maximum and minimum temperature will increase toward 2025 across the different climate models, while average streamflow will decrease in RCP 4.5
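
    As a side note on the calibration metric mentioned above, the Nash-Sutcliffe efficiency (NSE) reported for the Boulder Creek model can be computed from paired observed and simulated flows as in this minimal sketch (flow values below are made up).

```python
# Minimal Nash-Sutcliffe efficiency: NSE = 1 - SSE / variance-about-the-mean.
import numpy as np

def nash_sutcliffe(obs, sim) -> float:
    """NSE = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2); 1 is a perfect fit."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

# Example with made-up daily flows (m3/s):
obs = np.array([5.0, 7.2, 12.5, 30.1, 18.4, 9.9])
sim = np.array([4.6, 8.0, 11.0, 27.5, 20.2, 10.5])
print(round(nash_sutcliffe(obs, sim), 2))
```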

  4. Validation of a tuber blight (Phytophthora infestans) prediction model

    Science.gov (United States)

    Potato tuber blight caused by Phytophthora infestans accounts for significant losses in storage. There is limited published quantitative data on predicting tuber blight. We validated a tuber blight prediction model developed in New York with cultivars Allegany, NY 101, and Katahdin using independent...

  5. Model validation studies of solar systems, Phase III. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Lantz, L.J.; Winn, C.B.

    1978-12-01

    Results obtained from a validation study of the TRNSYS, SIMSHAC, and SOLCOST solar system simulation and design are presented. Also included are comparisons between the FCHART and SOLCOST solar system design programs and some changes that were made to the SOLCOST program. Finally, results obtained from the analysis of several solar radiation models are presented. Separate abstracts were prepared for ten papers.

  6. A Comparison and Validation of Two Surface Ship Readiness Models

    Science.gov (United States)

    1994-09-01

    they cannot be considered validated. Any application of these programs without additional verification is at the risk of the user. ... contains the SAS code that was used to perform the full model run for the SIM.

  7. Validating Work Discrimination and Coping Strategy Models for Sexual Minorities

    Science.gov (United States)

    Chung, Y. Barry; Williams, Wendi; Dispenza, Franco

    2009-01-01

    The purpose of this study was to validate and expand on Y. B. Chung's (2001) models of work discrimination and coping strategies among lesbian, gay, and bisexual persons. In semistructured individual interviews, 17 lesbians and gay men reported 35 discrimination incidents and their related coping strategies. Responses were coded based on Chung's…

  8. Validity of the Bersohn–Zewail model beyond justification

    DEFF Research Database (Denmark)

    Petersen, Jakob; Henriksen, Niels Engholm; Møller, Klaus Braagaard

    2012-01-01

    excellent agreement between the classical trajectory and the average position of the excited state wave packet. By investigating the approximations connecting the nuclear dynamics described by quantum mechanics and the BZ model, we conclude that this agreement goes far beyond the validity of the individual...

  9. Improving Perovskite Solar Cells: Insights From a Validated Device Model

    NARCIS (Netherlands)

    Sherkar, Tejas S.; Momblona, Cristina; Gil-Escrig, Lidon; Bolink, Henk J.; Koster, L. Jan Anton

    2017-01-01

    To improve the efficiency of existing perovskite solar cells (PSCs), a detailed understanding of the underlying device physics during their operation is essential. Here, a device model has been developed and validated that describes the operation of PSCs and quantitatively explains the role of

  10. Validating soil phosphorus routines in the SWAT model

    Science.gov (United States)

    Phosphorus transfer from agricultural soils to surface waters is an important environmental issue. Commonly used models like SWAT have not always been updated to reflect improved understanding of soil P transformations and transfer to runoff. Our objective was to validate the ability of the P routin...

  11. Dust-on-snow and the Timing of Peak Streamflow in the Upper Rio Grande

    Science.gov (United States)

    Steele, C. M.; Elias, E.; Moffitt, A.; Beltran, I.; Rango, A.

    2015-12-01

    Dust radiative forcing on high elevation snowpack is well-documented in the southern Rockies. Various field studies show that dust deposits decrease snow albedo and increase absorption of solar radiation, leading to earlier snowmelt and peak stream flows. These findings have implications for the use of temperature-index snow runoff models (such as the Snowmelt Runoff Model [SRM]) for predicting streamflow. In previous work, we have used SRM to simulate historical streamflow from 26 Upper Rio Grande sub-basins. Because dust radiative forcing can alter the relation between temperature and snowmelt, we wanted to find out if there is evidence of dust radiative forcing and earlier snowmelt in our study basins, particularly for those years where SRM was less successful in simulating streamflow. To accomplish this we have used openly-available data such as EPA air quality station measurements of particulate matter up to 10 micrometers (PM10); streamflow data from the USGS National Water Information System and Colorado Division of Water Resources; temperature, precipitation and snow water equivalent (SWE) from NRCS SNOTEL stations and remotely sensed data products from the MODIS sensor. Initial analyses indicate that a connection between seasonal dust concentration and streamflow timing (date of onset of warm-season snowmelt, date of streamflow center-of-volume) can be detected. This is further supported by time series analysis of MODIS-derived estimates of snow albedo and dust radiative-forcing in alpine and open subalpine snow fields.
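
    For context on why dust matters to a temperature-index model such as SRM, the hedged sketch below shows the degree-day melt relation M = a·(T − T_crit); dust-darkened snow behaves roughly like a larger degree-day factor a. All values are illustrative.

```python
# Hedged sketch of a temperature-index (degree-day) melt relation of the kind
# used by SRM-type models; dust lowers albedo, which acts like a larger
# degree-day factor. Values are illustrative only.
def degree_day_melt(mean_daily_temp_c: float,
                    degree_day_factor_cm_per_c: float = 0.45,
                    t_critical_c: float = 0.0) -> float:
    """Daily snowmelt depth (cm of water equivalent)."""
    return max(0.0, degree_day_factor_cm_per_c * (mean_daily_temp_c - t_critical_c))

temps = [2.0, 4.5, 6.0, 3.5]                                     # mean daily temperatures, deg C
clean_snow = sum(degree_day_melt(t, 0.45) for t in temps)        # cleaner, brighter snow
dusty_snow = sum(degree_day_melt(t, 0.60) for t in temps)        # dust-darkened snow
print(clean_snow, dusty_snow)  # the dusty snowpack melts faster for the same temperatures
```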

  12. An integrated uncertainty and ensemble-based data assimilation approach for improved operational streamflow predictions

    Directory of Open Access Journals (Sweden)

    M. He

    2012-03-01

    Full Text Available The current study proposes an integrated uncertainty and ensemble-based data assimilation framework (ICEA) and evaluates its viability in providing operational streamflow predictions via assimilating snow water equivalent (SWE) data. This step-wise framework applies a parameter uncertainty analysis algorithm (ISURF) to identify the uncertainty structure of sensitive model parameters, which is subsequently formulated into an Ensemble Kalman Filter (EnKF) to generate updated snow states for streamflow prediction. The framework is coupled to the US National Weather Service (NWS) snow and rainfall-runoff models. Its applicability is demonstrated for an operational basin of a western River Forecast Center (RFC) of the NWS. Performance of the framework is evaluated against existing operational baseline (RFC) predictions, the stand-alone ISURF and the stand-alone EnKF. Results indicate that the ensemble-mean prediction of ICEA considerably outperforms predictions from the other three scenarios investigated, particularly in the context of predicting high flows (top 5th percentile). The ICEA streamflow ensemble predictions capture the variability of the observed streamflow well, however the ensemble is not wide enough to consistently contain the range of streamflow observations in the study basin. Our findings indicate that the ICEA has the potential to supplement the current operational (deterministic) forecasting method in terms of providing improved single-valued (e.g., ensemble mean) streamflow predictions as well as meaningful ensemble predictions.
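
    The snow-state updating step at the core of such a framework can be illustrated with a minimal scalar Ensemble Kalman Filter update for SWE; this is a generic EnKF sketch under simple assumptions, not the NWS operational code.

```python
# Minimal perturbed-observation EnKF update for a single snow state (SWE).
import numpy as np

rng = np.random.default_rng(3)
ensemble_swe = rng.normal(120.0, 15.0, size=50)   # prior ensemble (mm), from model spread
obs_swe, obs_err_std = 100.0, 8.0                 # observation and its error std dev (mm)

prior_var = np.var(ensemble_swe, ddof=1)
kalman_gain = prior_var / (prior_var + obs_err_std ** 2)

# Each member is nudged toward its own noisy copy of the observation.
perturbed_obs = obs_swe + rng.normal(0.0, obs_err_std, size=ensemble_swe.size)
updated_swe = ensemble_swe + kalman_gain * (perturbed_obs - ensemble_swe)

print(ensemble_swe.mean(), updated_swe.mean())    # posterior mean moves toward the observation
```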

  13. Soil Moisture Initialization Error and Subgrid Variability of Precipitation in Seasonal Streamflow Forecasting

    Science.gov (United States)

    Koster, Randal D.; Walker, Gregory K.; Mahanama, Sarith P.; Reichle, Rolf H.

    2013-01-01

    Offline simulations over the conterminous United States (CONUS) with a land surface model are used to address two issues relevant to the forecasting of large-scale seasonal streamflow: (i) the extent to which errors in soil moisture initialization degrade streamflow forecasts, and (ii) the extent to which a realistic increase in the spatial resolution of forecasted precipitation would improve streamflow forecasts. The addition of error to a soil moisture initialization field is found to lead to a nearly proportional reduction in streamflow forecast skill. The linearity of the response allows the determination of a lower bound for the increase in streamflow forecast skill achievable through improved soil moisture estimation, e.g., through satellite-based soil moisture measurements. An increase in the resolution of precipitation is found to have an impact on large-scale streamflow forecasts only when evaporation variance is significant relative to the precipitation variance. This condition is met only in the western half of the CONUS domain. Taken together, the two studies demonstrate the utility of a continental-scale land surface modeling system as a tool for addressing the science of hydrological prediction.

  14. Ensemble streamflow predictions: from climate scenarios to probabilistic weather predictions

    Science.gov (United States)

    Fortin, V.; Evora, N.; Perreault, L.; Trinh, N.; Favre, A.; Benoit, H.

    2004-05-01

    Ensemble streamflow predictions (ESP) are obtained by processing an ensemble of meteorological scenarios through a rainfall-runoff hydrological model to obtain hydrological scenarios. Until recently, these scenarios were typically taken from the climatology. Now that more accurate medium- and long-term numerical weather predictions (NWP) are available, it is tempting to replace climatology with numerical weather forecasts. At least two approaches are possible to take into account the uncertainty of the meteorological forecast: (1) let a meteorologist propose a subjective probabilistic forecast based on one or more deterministic NWPs, or (2) take advantage of ensemble meteorological forecasts, which are built precisely to assess the level of uncertainty on the deterministic forecast. Practical solutions to problems encountered with both types of meteorological forecasts are discussed, and the methodology used by Hydro-Québec to score the resulting streamflow forecasts is presented.
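
    The classical climatology-based ESP procedure mentioned above can be summarized in a short, hedged sketch; `run_hydrologic_model` is a hypothetical stand-in for whatever rainfall-runoff model is used.

```python
# Schematic ESP: start the hydrological model from today's states and force it
# with each historical meteorological year to build an ensemble of flow traces.
def ensemble_streamflow_prediction(initial_states, historical_forcings, run_hydrologic_model):
    """Return one simulated streamflow trace per historical forcing year."""
    return {year: run_hydrologic_model(initial_states, forcing)
            for year, forcing in historical_forcings.items()}

# Toy usage with a stand-in model that just adds total precipitation to the initial storage:
toy_forcings = {1981: [5, 0, 12], 1982: [0, 3, 1], 1983: [20, 8, 0]}
toy_model = lambda state, forcing: state + sum(forcing)
print(ensemble_streamflow_prediction(100.0, toy_forcings, toy_model))

# Replacing `historical_forcings` with an ensemble of NWP-based forcings turns this
# climatology-driven ESP into the forecast-driven variant discussed above.
```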

  15. Data Set for Empirical Validation of Double Skin Facade Model

    DEFF Research Database (Denmark)

    Kalyanova, Olena; Jensen, Rasmus Lund; Heiselberg, Per

    2008-01-01

    During recent years the attention to the double skin facade (DSF) concept has greatly increased. Nevertheless, the application of the concept depends on whether a reliable model for simulation of DSF performance will be developed or pointed out. This is, however, not possible to do until the model is empirically validated and its limitations for DSF modelling are identified. Correspondingly, the existence and availability of experimental data is very essential. Three sets of accurate empirical data for validation of DSF modelling with building simulation software were produced within ... of a double skin facade: 1. External air curtain mode, i.e. the naturally ventilated DSF cavity with the top and bottom openings open to the outdoor; 2. Thermal insulation mode, when all of the DSF openings are closed; 3. Preheating mode, with the bottom DSF openings open to the outdoor and top openings open ...

  16. Validation of Fatigue Modeling Predictions in Aviation Operations

    Science.gov (United States)

    Gregory, Kevin; Martinez, Siera; Flynn-Evans, Erin

    2017-01-01

    Bio-mathematical fatigue models that predict levels of alertness and performance are one potential tool for use within integrated fatigue risk management approaches. A number of models have been developed that provide predictions based on acute and chronic sleep loss, circadian desynchronization, and sleep inertia. Some are publicly available and gaining traction in settings such as commercial aviation as a means of evaluating flight crew schedules for potential fatigue-related risks. Yet, most models have not been rigorously evaluated and independently validated for the operations to which they are being applied and many users are not fully aware of the limitations in which model results should be interpreted and applied.

  17. HELOKA-HP thermal-hydraulic model validation and calibration

    Energy Technology Data Exchange (ETDEWEB)

    Jin, Xue Zhou; Ghidersa, Bradut-Eugen; Badea, Aurelian Florin

    2016-11-01

    Highlights: • The electrical heater in HELOKA-HP has been modeled with RELAP5-3D using experimental data as input. • The model has been validated using novel techniques for assimilating experimental data and the representative model parameters with BEST-EST. • The methodology is successfully used for reducing the model uncertainties and provides a quantitative measure of the consistency between the experimental data and the model. - Abstract: The Helium Loop Karlsruhe High Pressure (HELOKA-HP) is an experimental facility for the testing of various helium-cooled components at high temperature (500 °C) and high pressure (8 MPa) for nuclear fusion applications. For modeling the loop thermal dynamics, a thermal-hydraulic model has been created using the system code RELAP5-3D. Recently, new experimental data covering the behavior of the loop components under relevant operational conditions have been made available giving the possibility of validating and calibrating the existing models in order to reduce the uncertainties of the simulated responses. This paper presents an example where such process has been applied for the HELOKA electrical heater model. Using novel techniques for assimilating experimental data, implemented in the computational module BEST-EST, the representative parameters of the model have been calibrated.

  18. Verification and Validation of Heat Transfer Model of AGREE Code

    Energy Technology Data Exchange (ETDEWEB)

    Tak, N. I. [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Seker, V.; Drzewiecki, T. J.; Downar, T. J. [Department of Nuclear Engineering and Radiological Sciences, Univ. of Michigan, Michigan (United States); Kelly, J. M. [US Nuclear Regulatory Commission, Washington (United States)

    2013-05-15

    The AGREE code was originally developed as a multi physics simulation code to perform design and safety analysis of Pebble Bed Reactors (PBR). Currently, additional capability for the analysis of Prismatic Modular Reactor (PMR) core is in progress. Newly implemented fluid model for a PMR core is based on a subchannel approach which has been widely used in the analyses of light water reactor (LWR) cores. A hexagonal fuel (or graphite block) is discretized into triangular prism nodes having effective conductivities. Then, a meso-scale heat transfer model is applied to the unit cell geometry of a prismatic fuel block. Both unit cell geometries of multi-hole and pin-in-hole types of prismatic fuel blocks are considered in AGREE. The main objective of this work is to verify and validate the heat transfer model newly implemented for a PMR core in the AGREE code. The measured data in the HENDEL experiment were used for the validation of the heat transfer model for a pin-in-hole fuel block. However, the HENDEL tests were limited to only steady-state conditions of pin-in-hole fuel blocks. There exist no available experimental data regarding a heat transfer in multi-hole fuel blocks. Therefore, numerical benchmarks using conceptual problems are considered to verify the heat transfer model of AGREE for multi-hole fuel blocks as well as transient conditions. The CORONA and GAMMA+ codes were used to compare the numerical results. In this work, the verification and validation study were performed for the heat transfer model of the AGREE code using the HENDEL experiment and the numerical benchmarks of selected conceptual problems. The results of the present work show that the heat transfer model of AGREE is accurate and reliable for prismatic fuel blocks. Further validation of AGREE is in progress for a whole reactor problem using the HTTR safety test data such as control rod withdrawal tests and loss-of-forced convection tests.

  19. FDA 2011 process validation guidance: lifecycle compliance model.

    Science.gov (United States)

    Campbell, Cliff

    2014-01-01

    This article has been written as a contribution to the industry's efforts in migrating from a document-driven to a data-driven compliance mindset. A combination of target product profile, control engineering, and general sum principle techniques is presented as the basis of a simple but scalable lifecycle compliance model in support of modernized process validation. Unit operations and significant variables occupy pole position within the model, documentation requirements being treated as a derivative or consequence of the modeling process. The quality system is repositioned as a subordinate of system quality, this being defined as the integral of related "system qualities". The article represents a structured interpretation of the U.S. Food and Drug Administration's 2011 Guidance for Industry on Process Validation and is based on the author's educational background and his manufacturing/consulting experience in the validation field. The U.S. Food and Drug Administration's Guidance for Industry on Process Validation (2011) provides a wide-ranging and rigorous outline of compliant drug manufacturing requirements relative to its 20th century predecessor (1987). Its declared focus is patient safety, and it identifies three inter-related (and obvious) stages of the compliance lifecycle. Firstly, processes must be designed, both from a technical and quality perspective. Secondly, processes must be qualified, providing evidence that the manufacturing facility is fully "roadworthy" and fit for its intended purpose. Thirdly, processes must be verified, meaning that commercial batches must be monitored to ensure that processes remain in a state of control throughout their lifetime.

  20. Seasonal Streamflow Forecasts for African Basins

    Science.gov (United States)

    Serrat-Capdevila, A.; Valdes, J. B.; Wi, S.; Roy, T.; Roberts, J. B.; Robertson, F. R.; Demaria, E. M.

    2015-12-01

    Using high resolution downscaled seasonal meteorological forecasts we present the development and evaluation of seasonal hydrologic forecasts with Stakeholder Agencies for selected African basins. The meteorological forecasts are produced using the Bias Correction and Spatial Disaggregation (BCSD) methodology applied to NMME hindcasts (North American Multi-Model Ensemble prediction system) to generate a bootstrap resampling of plausible weather forecasts from historical observational data. This set of downscaled forecasts is then used to drive hydrologic models to produce a range of forecasts with uncertainty estimates suitable for water resources planning in African pilot basins (i.e. Upper Zambezi, Mara Basin). In an effort to characterize the utility of these forecasts, we will present an evaluation of these forecast ensembles over the pilot basins, and discuss insights as to their operational applicability by regional actors. Further, these forecasts will be contrasted with those from a standard Ensemble Streamflow Prediction (ESP) approach to seasonal forecasting. The case studies presented here have been developed in the setting of the NASA SERVIR Applied Sciences Team and within the broader context of operational seasonal forecasting in Africa. These efforts are part of a dialogue with relevant planning and management agencies and institutions in Africa, which are in turn exploring how to best use uncertain forecasts for decision making.
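
    The bias-correction half of the BCSD methodology referenced above is essentially quantile mapping; the hedged sketch below shows that step in isolation with placeholder climatologies (an operational implementation also handles spatial disaggregation and works month by month).

```python
# Hedged sketch of quantile mapping: map each forecast value to the observed
# climatology through matching empirical quantiles. Arrays are placeholders.
import numpy as np

def quantile_map(forecast, model_climatology, observed_climatology):
    """Replace each forecast value by the observed value at the same quantile."""
    forecast = np.atleast_1d(np.asarray(forecast, float))
    quantiles = [np.mean(model_climatology <= f) for f in forecast]         # empirical CDF of the model
    return np.quantile(observed_climatology, np.clip(quantiles, 0.0, 1.0))  # inverse CDF of observations

rng = np.random.default_rng(11)
model_clim = rng.gamma(2.0, 30.0, 1000)      # model monthly precipitation climatology (mm)
obs_clim = rng.gamma(2.0, 40.0, 1000)        # observed climatology (mm)
print(quantile_map([45.0, 120.0], model_clim, obs_clim))
```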

  1. The relationship between groundwater ages, streamflow ages, and storage selection functions under stationary conditions

    Science.gov (United States)

    Berghuijs, W.; Kirchner, J. W.

    2017-12-01

    Waters in aquifers are often much older than the streamwaters that drain them. Simple physically based reasoning suggests that these age contrasts should be expected wherever catchments are heterogeneous. However, a general quantitative catchment-scale explanation of these age contrasts remains elusive. We show that under stationary conditions conservation of mass and age dictate that the age distribution of water stored in a catchment can be directly estimated from the age distribution of its outflows, and vice versa. This in turn implies that the catchment's preference for the release or retention of waters of different ages can be estimated directly from the age distribution of outflow under stationary conditions. Using simple models of transit times, we show that the mean age of stored water can range from half as old as the mean age of streamflow (for plug flow conditions) to almost infinitely older (for strongly preferential flow). Streamflow age distributions reported in the literature often have long upper tails, consistent with preferential flow and implying that storage ages are substantially older than streamflow ages. Mean streamflow ages reported in the literature imply that most streamflow originates from a thin veneer of total groundwater storage. This preferential release of young streamflow implies that most groundwater is exchanged only slowly with the surface, and consequently must be very old. Where information is available for both storage ages and streamflow ages, our analysis establishes consistency relationships through which each could be used to better constrain the other. By quantifying the relationship between groundwater and streamflow ages, our analysis provides tools to jointly assess both of these important catchment properties.
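
    The plug-flow and preferential-flow limits quoted above follow from a simple stationary age balance; the sketch below states it in notation chosen here (Q for outflow rate, S for storage, P_Q for the CDF of the streamflow transit-time distribution), which may differ from the paper's own.

```latex
% Hedged sketch of the stationary age balance. With steady outflow Q, storage
% S = Q <tau_Q>, and streamflow transit-time CDF P_Q(tau):
\[
  p_S(\tau) = \frac{Q}{S}\,\bigl(1 - P_Q(\tau)\bigr)
            = \frac{1 - P_Q(\tau)}{\langle \tau_Q \rangle},
  \qquad
  \langle \tau_S \rangle = \int_0^{\infty} \tau\, p_S(\tau)\, d\tau
                         = \frac{\langle \tau_Q^{2} \rangle}{2\,\langle \tau_Q \rangle}.
\]
% For plug flow (all streamflow exactly of age tau_0), <tau_S> = tau_0 / 2, i.e.
% storage is half as old as streamflow; heavy-tailed transit-time distributions
% make <tau_S> arbitrarily larger than <tau_Q>.
```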

  2. A process improvement model for software verification and validation

    Science.gov (United States)

    Callahan, John; Sabolish, George

    1994-01-01

    We describe ongoing work at the NASA Independent Verification and Validation (IV&V) Facility to establish a process improvement model for software verification and validation (V&V) organizations. This model, similar to those used by some software development organizations, uses measurement-based techniques to identify problem areas and introduce incremental improvements. We seek to replicate this model for organizations involved in V&V on large-scale software development projects such as EOS and space station. At the IV&V Facility, a university research group and V&V contractors are working together to collect metrics across projects in order to determine the effectiveness of V&V and improve its application. Since V&V processes are intimately tied to development processes, this paper also examines the repercussions for development organizations in large-scale efforts.

  3. Relative contributions of transient and steady state infiltration during ephemeral streamflow

    Science.gov (United States)

    Blasch, Kyle W.; Ferré, Ty P.A.; Hoffmann, John P.; Fleming, John B.

    2006-01-01

    Simulations of infiltration during three ephemeral streamflow events in a coarse‐grained alluvial channel overlying a less permeable basin‐fill layer were conducted to determine the relative contribution of transient infiltration at the onset of streamflow to cumulative infiltration for the event. Water content, temperature, and piezometric measurements from 2.5‐m vertical profiles within the alluvial sediments were used to constrain a variably saturated water flow and heat transport model. Simulated and measured transient infiltration rates at the onset of streamflow were about two to three orders of magnitude greater than steady state infiltration rates. The duration of simulated transient infiltration ranged from 1.8 to 20 hours, compared with steady state flow periods of 231 to 307 hours. Cumulative infiltration during the transient period represented 10 to 26% of the total cumulative infiltration, with an average contribution of approximately 18%. Cumulative infiltration error for the simulated streamflow events ranged from 9 to 25%. Cumulative infiltration error for typical streamflow events of about 8 hours in duration is about 90%. This analysis indicates that when estimating total cumulative infiltration in coarse‐grained ephemeral stream channels, consideration of the transient infiltration at the onset of streamflow will improve predictions of the total volume of infiltration that may become groundwater recharge.

  4. Temporal Variability and Trends of Rainfall and Streamflow in Tana River Basin, Kenya

    Directory of Open Access Journals (Sweden)

    Philip Kibet Langat

    2017-10-01

    Full Text Available This study investigated temporal variabilities and trends of rainfall and discharges in the Tana River Basin in Kenya using the Mann–Kendall non-parametric test. Monthly rainfall data from ten stations spanning 1967 to 2016 and daily streamflow time series of observations from 1941 to 2016 (75 years) were analyzed with the aim of capturing and detecting multiannual and seasonal variabilities and monotonic trends. The results for the datasets suggested that the streamflow is largely dependent on increasing rainfall in the highlands. The rainfall trends seemed to have been influenced by altitudinal factors. The coefficient of variation of the ten rainfall stations ranged from 12% to 17%, but 70% of rainfall stations showed negative monotonic trends and 30% showed significant trends. The streamflow showed a statistically significant upward monotonic trend and seasonal variability, indicating a substantial change in the streamflow regime. Although the increasing trend of the streamflow during this period may not pose future risks and vulnerability of energy and irrigated agricultural production systems across the basin, the variability observed indicates the need for enhanced alternative water management strategies during the low flow seasons. The trends and time series data indicate potential evidence of climate and land use change and their impacts on the availability of water and the sustainability of ecology, energy and agricultural production systems across the basin. Variability and trends of rainfall and streamflow are useful for planning studies, hydrological modeling and climate change impacts assessment within the Tana River Basin.
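
    The Mann-Kendall test applied above can be written compactly; the hedged sketch below computes the S statistic, its variance (ignoring ties), and a two-sided p-value for an illustrative annual series.

```python
# Minimal Mann-Kendall trend test (no tie correction; illustrative data).
import numpy as np
from scipy.stats import norm

def mann_kendall(x):
    x = np.asarray(x, float)
    n = len(x)
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0           # variance of S, ignoring ties
    z = 0.0 if s == 0 else (s - np.sign(s)) / np.sqrt(var_s)
    p = 2.0 * norm.sf(abs(z))                           # two-sided p-value
    return s, z, p

annual_flow = [112, 98, 120, 135, 128, 150, 141, 160, 155, 170]  # made-up annual series
print(mann_kendall(annual_flow))  # a positive S with a small p indicates an upward trend
```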

  5. Deducing Climatic Elasticity to Assess Projected Climate Change Impacts on Streamflow Change across China

    Science.gov (United States)

    Liu, Jianyu; Zhang, Qiang; Zhang, Yongqiang; Chen, Xi; Li, Jianfeng; Aryal, Santosh K.

    2017-10-01

    Climatic elasticity has been widely applied to assess streamflow responses to climate change. To fully assess the impacts of climate change under global warming on streamflow and reduce the error and uncertainty from various control variables, we develop a four-parameter (precipitation, catchment characteristics n, and maximum and minimum temperatures) climatic elasticity method named PnT, based on the widely used Budyko framework and the simplified Makkink equation. We use this method to carry out the first comprehensive evaluation of the streamflow response to potential climate change for 372 widely spread catchments in China. The PnT climatic elasticity was first evaluated for the period 1980-2000, and then used to evaluate the streamflow change response to climate change based on 12 global climate models under the Representative Concentration Pathway 2.6 (RCP2.6) and RCP8.5 emission scenarios. The results show that (1) the PnT climatic elasticity method is reliable; (2) streamflow is projected to increase in more than 60% of the selected catchments, with mean increments of 9% and 15.4% under RCP2.6 and RCP8.5 respectively; and (3) uncertainties in the projected streamflow are considerable in several regions, such as the Pearl River and Yellow River, with more than 40% of the selected catchments showing inconsistent change directions. Our results can help Chinese policy makers manage and plan water resources more effectively, and the PnT climatic elasticity should be applied to other parts of the world.
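
    A climatic elasticity of the Budyko type can be illustrated numerically; the hedged sketch below computes a precipitation elasticity of streamflow from the Choudhury-Yang form of the Budyko curve, which is one common choice and not necessarily the paper's exact PnT formulation. Parameter values are illustrative.

```python
# Hedged Budyko-type precipitation elasticity of streamflow, using the
# Choudhury-Yang form E = P*E0 / (P**n + E0**n)**(1/n) and a numerical derivative.
def streamflow(p_mm, e0_mm, n):
    evap = p_mm * e0_mm / (p_mm ** n + e0_mm ** n) ** (1.0 / n)   # actual evapotranspiration
    return p_mm - evap                                            # long-term water balance: Q = P - E

def precip_elasticity(p_mm, e0_mm, n, dp=1e-3):
    q = streamflow(p_mm, e0_mm, n)
    dq_dp = (streamflow(p_mm + dp, e0_mm, n) - streamflow(p_mm - dp, e0_mm, n)) / (2 * dp)
    return dq_dp * p_mm / q    # percent change in Q per percent change in P

# Example: P = 650 mm/yr, potential evapotranspiration E0 = 1000 mm/yr, n = 2
print(round(precip_elasticity(p_mm=650.0, e0_mm=1000.0, n=2.0), 2))
```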

  6. Predictability of soil moisture and streamflow on subseasonal timescales: A case study

    Science.gov (United States)

    Orth, Rene; Seneviratne, Sonia I.

    2013-10-01

    Hydrological forecasts constitute an important tool in water resource management, especially in the case of impending extreme events. This study investigates the potential predictability of soil moisture and streamflow in Switzerland using a conceptual model including a simple water balance representation and a snow module. Our results show that simulated soil moisture and streamflow are more predictable (as indicated by significantly improved performance compared to climatology) until lead times of approximately 1 week and 2-3 days, respectively, when using initial soil moisture information and climatological atmospheric forcing. Using also initial snow information and seasonal weather forecasts as forcing, the predictable lead time doubles in case of soil moisture and triples for streamflow. The skill contributions of the additional information vary with altitude; at low altitudes the precipitation forecast is most important, whereas in mountainous areas the temperature forecast and the initial snow information are the most valuable contributors. We find furthermore that the soil moisture and streamflow forecast skills increase with increasing initial soil moisture anomalies. Comparing the respective value of realistic initial conditions and state-of-the-art forcing forecasts, we show that the former are generally more important for soil moisture forecasts, whereas the latter are more valuable for streamflow forecasts. To relate the derived predictabilities to respective soil moisture and streamflow memories investigated in other publications, we additionally illustrate the similarity between the concepts of memory and predictability as measures of persistence in the last part of this study.

  7. Validation of a parametric finite element human femur model.

    Science.gov (United States)

    Klein, Katelyn F; Hu, Jingwen; Reed, Matthew P; Schneider, Lawrence W; Rupp, Jonathan D

    2017-05-19

    Finite element (FE) models with geometry and material properties that are parametric with subject descriptors, such as age and body shape/size, are being developed to incorporate population variability into crash simulations. However, the validation methods currently being used with these parametric models do not assess whether model predictions are reasonable in the space over which the model is intended to be used. This study presents a parametric model of the femur and applies a unique validation paradigm to this parametric femur model that characterizes whether model predictions reproduce experimentally observed trends. FE models of male and female femurs with geometries that are parametric with age, femur length, and body mass index (BMI) were developed based on existing statistical models that predict femur geometry. These parametric FE femur models were validated by comparing responses from combined loading tests of femoral shafts to simulation results from FE models of the corresponding femoral shafts whose geometry was predicted using the associated age, femur length, and BMI. The effects of subject variables on model responses were also compared with trends in the experimental data set by fitting similarly parameterized statistical models to both the results of the experimental data and the corresponding FE model results and then comparing fitted model coefficients for the experimental and predicted data sets. The average error in impact force at experimental failure for the parametric models was 5%. The coefficients of a statistical model fit to simulation data were within one standard error of the coefficients of a similarly parameterized model of the experimental data except for the age parameter, likely because material properties used in simulations were not varied with specimen age. In simulations to explore the effects of femur length, BMI, and age on impact response, only BMI significantly affected response for both men and women, with increasing

  8. Predicting the ungauged basin: model validation and realism assessment

    Science.gov (United States)

    van Emmerik, Tim; Mulder, Gert; Eilander, Dirk; Piet, Marijn; Savenije, Hubert

    2016-04-01

    The hydrological decade on Predictions in Ungauged Basins (PUB) [1] led to many new insights in model development, calibration strategies, data acquisition and uncertainty analysis. Due to a limited amount of published studies on genuinely ungauged basins, model validation and realism assessment of model outcome has not been discussed to a great extent. With this study [2] we aim to contribute to the discussion on how one can determine the value and validity of a hydrological model developed for an ungauged basin. As in many cases no local, or even regional, data are available, alternative methods should be applied. Using a PUB case study in a genuinely ungauged basin in southern Cambodia, we give several examples of how one can use different types of soft data to improve model design, calibrate and validate the model, and assess the realism of the model output. A rainfall-runoff model was coupled to an irrigation reservoir, allowing the use of additional and unconventional data. The model was mainly forced with remote sensing data, and local knowledge was used to constrain the parameters. Model realism assessment was done using data from surveys. This resulted in a successful reconstruction of the reservoir dynamics, and revealed the different hydrological characteristics of the two topographical classes. We do not present a generic approach that can be transferred to other ungauged catchments, but we aim to show how clever model design and alternative data acquisition can result in a valuable hydrological model for ungauged catchments. [1] Sivapalan, M., Takeuchi, K., Franks, S., Gupta, V., Karambiri, H., Lakshmi, V., et al. (2003). IAHS decade on predictions in ungauged basins (PUB), 2003-2012: shaping an exciting future for the hydrological sciences. Hydrol. Sci. J. 48, 857-880. doi: 10.1623/hysj.48.6.857.51421 [2] van Emmerik, T., Mulder, G., Eilander, D., Piet, M. and Savenije, H. (2015). Predicting the ungauged basin: model validation and realism assessment

  9. Escherichia coli bacteria density in relation to turbidity, streamflow characteristics, and season in the Chattahoochee River near Atlanta, Georgia, October 2000 through September 2008—Description, statistical analysis, and predictive modeling

    Science.gov (United States)

    Lawrence, Stephen J.

    2012-01-01

    Water-based recreation—such as rafting, canoeing, and fishing—is popular among visitors to the Chattahoochee River National Recreation Area (CRNRA) in north Georgia. The CRNRA is a 48-mile reach of the Chattahoochee River upstream from Atlanta, Georgia, managed by the National Park Service (NPS). Historically, high densities of fecal-indicator bacteria have been documented in the Chattahoochee River and its tributaries at levels that commonly exceeded Georgia water-quality standards. In October 2000, the NPS partnered with the U.S. Geological Survey (USGS), State and local agencies, and non-governmental organizations to monitor Escherichia coli bacteria (E. coli) density and develop a system to alert river users when E. coli densities exceeded the U.S. Environmental Protection Agency (USEPA) single-sample beach criterion of 235 colonies (most probable number) per 100 milliliters (MPN/100 mL) of water. This program, called BacteriALERT, monitors E. coli density, turbidity, and water temperature at two sites on the Chattahoochee River upstream from Atlanta, Georgia. This report summarizes E. coli bacteria density and turbidity values in water samples collected between 2000 and 2008 as part of the BacteriALERT program; describes the relations between E. coli density and turbidity, streamflow characteristics, and season; and describes the regression analyses used to develop predictive models that estimate E. coli density in real time at both sampling sites.
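
    The real-time estimation idea described above can be illustrated with a hedged, single-predictor sketch that regresses log E. coli density on log turbidity; the published BacteriALERT models also use streamflow and seasonal terms, and the data below are synthetic.

```python
# Hedged sketch of a turbidity-based regression for real-time E. coli estimates.
import numpy as np

rng = np.random.default_rng(5)
turbidity_ntu = rng.uniform(2, 400, 120)                                   # synthetic turbidity values
log_ecoli = 1.2 + 0.85 * np.log10(turbidity_ntu) + rng.normal(0, 0.3, 120) # synthetic log10 densities

slope, intercept = np.polyfit(np.log10(turbidity_ntu), log_ecoli, 1)       # simple log-log fit

def predict_ecoli_mpn(turbidity: float) -> float:
    """Estimated E. coli density (MPN/100 mL) from turbidity alone."""
    return 10 ** (intercept + slope * np.log10(turbidity))

print(round(predict_ecoli_mpn(50.0)))  # compare against the 235 MPN/100 mL criterion
```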

  10. Decomposition of Sources of Errors in Seasonal Streamflow Forecasts in a Rainfall-Runoff Dominated Basin

    Science.gov (United States)

    Sinha, T.; Arumugam, S.

    2012-12-01

    Seasonal streamflow forecasts contingent on climate forecasts can be effectively utilized in updating water management plans and in optimizing the generation of hydroelectric power. Streamflow in rainfall-runoff dominated basins depends critically on forecasted precipitation, in contrast to snow-dominated basins, where initial hydrological conditions (IHCs) are more important. Since precipitation forecasts from Atmosphere-Ocean General Circulation Models are available at coarse scale (~2.8° by 2.8°), spatial and temporal downscaling of such forecasts is required to implement land surface models, which typically run on finer spatial and temporal scales. Consequently, errors from multiple sources are introduced at various stages in predicting seasonal streamflow. Therefore, in this study, we address the following science questions: 1) How do we attribute the errors in monthly streamflow forecasts to various sources - (i) model errors, (ii) spatio-temporal downscaling, (iii) imprecise initial conditions, (iv) no forecasts, and (v) imprecise forecasts? and 2) How do monthly streamflow forecast errors propagate with different lead times over various seasons? In this study, the Variable Infiltration Capacity (VIC) model is calibrated over the Apalachicola River at Chattahoochee, FL in the southeastern US and implemented with observed 1/8° daily forcings to estimate reference streamflow during 1981 to 2010. The VIC model is then forced with different schemes under updated IHCs prior to the forecasting period to estimate relative mean square errors due to: a) temporal disaggregation, b) spatial downscaling, c) Reverse Ensemble Streamflow Prediction (imprecise IHCs), d) ESP (no forecasts), and e) ECHAM4.5 precipitation forecasts. Finally, error propagation under the different schemes is analyzed at different lead times over different seasons.
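
    The error attribution described here amounts to comparing each forcing scheme against a reference simulation. The following is a schematic sketch of that bookkeeping with synthetic monthly series and hypothetical scheme names; it is not the VIC/ECHAM4.5 experiment itself.

```python
# Schematic error attribution: compare streamflow from each forcing scheme
# against a reference simulation and report MSE relative to the reference
# variance. Series are synthetic; the study uses VIC simulations of the
# Apalachicola River with the schemes listed in the abstract.
import numpy as np

rng = np.random.default_rng(1)
months = 12 * 30
reference = 100 + 20 * np.sin(np.arange(months) * 2 * np.pi / 12) \
            + rng.normal(0, 5, months)

def scheme_run(error_scale):
    """Stand-in for a forecast scheme: reference plus scheme-specific error."""
    return reference + rng.normal(0, error_scale, months)

schemes = {
    "temporal_disaggregation": scheme_run(3),
    "spatial_downscaling": scheme_run(4),
    "reverse_ESP_imprecise_IHCs": scheme_run(6),
    "ESP_no_forecast": scheme_run(8),
    "ECHAM4.5_precipitation": scheme_run(7),
}

for name, simulated in schemes.items():
    rel_mse = np.mean((simulated - reference) ** 2) / reference.var()
    print(f"{name:28s} relative MSE = {rel_mse:.3f}")
```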

  11. On the sensitivity of annual streamflow to air temperature

    Science.gov (United States)

    Milly, Paul C.D.; Kam, Jonghun; Dunne, Krista A.

    2018-01-01

    Although interannual streamflow variability is primarily a result of precipitation variability, temperature also plays a role. The relative weakness of the temperature effect at the annual time scale hinders understanding, but may belie substantial importance on climatic time scales. Here we develop and evaluate a simple theory relating variations of streamflow and evapotranspiration (E) to those of precipitation (P) and temperature. The theory is based on extensions of the Budyko water-balance hypothesis, the Priestley-Taylor theory for potential evapotranspiration (Ep), and a linear model of interannual basin storage. The theory implies that temperature affects streamflow by modifying evapotranspiration through a Clausius-Clapeyron-like relation and through the sensitivity of net radiation to temperature. We apply and test (1) a previously introduced “strong” extension of the Budyko hypothesis, which requires that the function linking temporal variations of the evapotranspiration ratio (E/P) and the index of dryness (Ep/P) at an annual time scale is identical to that linking interbasin variations of the corresponding long-term means, and (2) a “weak” extension, which requires only that the annual evapotranspiration ratio depends uniquely on the annual index of dryness, and that the form of that dependence need not be known a priori nor be identical across basins. In application of the weak extension, the readily observed sensitivity of streamflow to precipitation contains crucial information about the sensitivity to potential evapotranspiration and, thence, to temperature. Implementation of the strong extension is problematic, whereas the weak extension appears to capture essential controls of the temperature effect efficiently.
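
    The structure of the weak extension and the resulting temperature sensitivity can be written compactly as below. This is a sketch of the general form implied by the abstract, using F for the basin-specific Budyko-type function and phi for the index of dryness; it is not the authors' exact derivation.

```latex
% Weak Budyko extension: the annual evapotranspiration ratio depends only on
% the annual index of dryness, E/P = F(E_p/P), with F basin-specific.
% Streamflow then follows from the water balance, and temperature enters only
% through E_p (Priestley-Taylor, via a Clausius-Clapeyron-like term plus the
% sensitivity of net radiation to temperature).
\[
  Q = P - E = P\,\bigl[1 - F(\phi)\bigr], \qquad \phi = \frac{E_p}{P},
\]
\[
  \frac{\partial Q}{\partial P} = 1 - F(\phi) + \phi\,F'(\phi), \qquad
  \frac{\partial Q}{\partial E_p} = -F'(\phi), \qquad
  \frac{\partial Q}{\partial T} = -F'(\phi)\,\frac{\mathrm{d}E_p}{\mathrm{d}T}.
\]
% Both sensitivities involve F'(\phi), which is why the observed sensitivity
% of streamflow to precipitation carries information about the sensitivity
% to potential evapotranspiration and, thence, to temperature.
```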

  12. Using SST, PDO and SOI for Streamflow Reconstruction

    Science.gov (United States)

    Bukhary, S. S.; Kalra, A.; Ahmad, S.

    2015-12-01

    Recurring droughts in the southwestern U.S., particularly California, have strained the existing water reserves of the region. The frequency, severity and duration of these recurring drought events may not be captured by the available instrumental records. Thus, streamflow reconstruction becomes imperative for identifying the historic hydroclimatic extremes of a region and assists in developing better water management strategies, which are vital for the sustainability of water reserves. Tree ring chronologies (TRC) are conventionally used to reconstruct streamflows, since tree rings are representative of climatic information. Studies have shown that sea surface temperature (SST) and the climate indices of the Southern Oscillation Index (SOI) and Pacific Decadal Oscillation (PDO) influence U.S. streamflow volumes. The purpose of this study was to improve the traditional reconstruction methodology by incorporating the oceanic-atmospheric variables of PDO, SOI, and Pacific Ocean SST, along with TRC, as predictors in a step-wise linear regression model. The methodology of singular value decomposition was used to identify teleconnected regions of streamflow and SST. The approach was tested on eleven gage stations in the Sacramento River Basin (SRB) and San Joaquin River Basin (JRB). The reconstructions were successfully generated from 1800 to 1980, with an overlap period of 1932-1980. Improved results were exhibited when using the predictor variable of SST along with TRC (calibration r2=0.6-0.91) compared to when using TRC in combination with SOI and PDO (calibration r2=0.51-0.78) or when using TRC by itself (calibration r2=0.51-0.86). For future work, this approach can be replicated for other watersheds by using the oceanic-atmospheric climate variables influencing that region.
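
    A minimal sketch of forward step-wise predictor selection of the kind described, assuming synthetic predictor series (TRC, PDO, SOI and one SST-derived mode) and an arbitrary entry threshold; the actual study uses observed chronologies, SVD-identified SST regions and its own selection criteria.

```python
# Sketch of forward stepwise linear regression for streamflow reconstruction,
# with tree-ring chronologies (TRC), PDO, SOI and an SST-derived series as
# candidate predictors. Data here are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(2)
n = 80  # overlap years between predictors and observed flow
candidates = {
    "TRC": rng.normal(size=n),
    "PDO": rng.normal(size=n),
    "SOI": rng.normal(size=n),
    "SST_SVD_mode1": rng.normal(size=n),
}
flow = 0.8 * candidates["TRC"] + 0.4 * candidates["SST_SVD_mode1"] \
       + rng.normal(0, 0.5, n)

def r2(X, y):
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ coef
    return 1 - resid.var() / y.var()

selected, remaining = [], dict(candidates)
current_r2 = 0.0
while remaining:
    # Try adding each remaining predictor; keep the best if it improves r2
    trials = {}
    for name, series in remaining.items():
        X = np.column_stack([np.ones(n)] +
                            [candidates[s] for s in selected] + [series])
        trials[name] = r2(X, flow)
    best = max(trials, key=trials.get)
    if trials[best] - current_r2 < 0.01:   # simple entry threshold
        break
    selected.append(best)
    current_r2 = trials[best]
    del remaining[best]

print("selected predictors:", selected, "calibration r2 =", round(current_r2, 2))
```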

  13. Improvement and Validation of Weld Residual Stress Modelling Procedure

    International Nuclear Information System (INIS)

    Zang, Weilin; Gunnars, Jens; Dong, Pingsha; Hong, Jeong K.

    2009-06-01

    The objective of this work is to identify and evaluate improvements for the residual stress modelling procedure currently used in Sweden. There is a growing demand to eliminate any unnecessary conservatism involved in residual stress assumptions. The study was focused on the development and validation of an improved weld residual stress modelling procedure, by taking advantage of the recent advances in residual stress modelling and stress measurement techniques. The major changes applied in the new weld residual stress modelling procedure are: - Improved procedure for heat source calibration based on use of analytical solutions. - Use of an isotropic hardening model where mixed hardening data is not available. - Use of an annealing model for improved simulation of strain relaxation in re-heated material. The new modelling procedure is demonstrated to capture the main characteristics of the through-thickness stress distributions by validation against experimental measurements. Three austenitic stainless steel butt-weld cases are analysed, covering a large range of pipe geometries. From the cases it is evident that there can be large differences between the residual stresses predicted using the new procedure, and the earlier procedure or handbook recommendations. Previously recommended profiles could give misleading fracture assessment results. The stress profiles according to the new procedure agree well with the measured data. If data is available, then a mixed hardening model should be used.

  14. Validated TRNSYS Model for Solar Assisted Space Heating System

    International Nuclear Information System (INIS)

    Abdalla, Nedal

    2014-01-01

    The present study involves a validated TRNSYS model for a solar assisted space heating system as applied to a residential building in Jordan, using the new detailed radiation models of TRNSYS 17.1 and the geometric building model Trnsys3d for the Google SketchUp 3D drawing program. The annual heating load for a building (Solar House) which is located at the Royal Scientific Society (RSS) in Jordan is estimated under the climatological conditions of Amman. The aim of this paper is to compare the measured thermal performance of the Solar House with that modeled using TRNSYS. The results showed that the annual measured space heating load for the building was 6,188 kWh while the heating load for the modeled building was 6,391 kWh. Moreover, the measured solar fraction for the solar system was 50% while the modeled solar fraction was 55%. A comparison of modeled and measured data resulted in percentage mean absolute errors for solar energy for space heating, auxiliary heating and solar fraction of 13%, 7% and 10%, respectively. The validated model will be useful for long-term performance simulation under different weather and operating conditions. (author)

  15. Systematic Change in Global Patterns of Streamflow Following Volcanic Eruptions

    Science.gov (United States)

    Iles, C. E.; Hegerl, G. C.

    2015-12-01

    Precipitation decreases over much of the globe following large explosive volcanic eruptions, particularly in climatologically wet regions. Stratospheric volcanic aerosols reflect sunlight, reducing evaporation, whilst surface cooling stabilises the atmosphere and reduces its water-holding capacity. Circulation changes modulate this global precipitation reduction on regional scales. Despite the importance of rivers to people, it has until now been unclear whether volcanism causes detectable changes in streamflow given large natural variability. Here we analyse the response of 50 major world rivers using observational records, averaging across multiple eruptions to reduce noise. We find statistically significant reductions in flow for the Amazon, Congo, Nile, Orange, Ob, Yenisey and Kolyma amongst others. Results are clearer when neighbouring rivers are combined into regions based on the areas where climate models simulate either an increase or a decrease in precipitation following eruptions. We detect a significant streamflow decrease for these combined rivers and, on average, across wet tropical and subtropical regions. We also detect a significant increase in southern South American and SW North American rivers. This significant change in global scale streamflow following volcanic eruptions has implications for predicting and mitigating the effects of future eruptions.
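
    Averaging records across multiple eruptions to reduce noise is a superposed-epoch composite. A minimal sketch with a synthetic annual series and hypothetical eruption years follows; the study itself uses observed records for 50 rivers.

```python
# Minimal superposed-epoch composite: average streamflow anomalies across
# several eruption years to reduce noise. Years and series are synthetic.
import numpy as np

rng = np.random.default_rng(3)
years = np.arange(1900, 2001)
flow = rng.normal(0.0, 1.0, years.size)          # standardised anomalies
eruption_years = [1912, 1963, 1982, 1991]        # hypothetical key years

window = np.arange(-3, 6)                        # years relative to eruption
composite = np.zeros(window.size)
for y in eruption_years:
    idx = np.searchsorted(years, y) + window
    composite += flow[idx]
composite /= len(eruption_years)

for lag, value in zip(window, composite):
    print(f"year {lag:+d} relative to eruption: mean anomaly {value:.2f}")
```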

  16. AEROTAXI ground static test and finite element model validation

    Directory of Open Access Journals (Sweden)

    Radu BISCA

    2011-06-01

    Full Text Available In this presentation, we will concentrate on typical Ground Static Test (GST) and Finite Element (FE) software comparisons. It is necessary to note that standard GSTs are obligatory for any new aircraft configuration. We can mention here the investigations of the AeroTAXI™, a small aircraft configuration, using PRODERA® equipment. A Finite Element Model (FEM) of the AeroTAXI™ has been developed in PATRAN/NASTRAN®, partly from a previous ANSYS® model. FEM can be used to investigate potential structural modifications or changes with realistic component corrections. Model validation should be part of every modern engineering analysis and quality assurance procedure.

  17. Numerical Validation of Chemical Compositional Model for Wettability Alteration Processes

    Science.gov (United States)

    Bekbauov, Bakhbergen; Berdyshev, Abdumauvlen; Baishemirov, Zharasbek; Bau, Domenico

    2017-12-01

    Chemical compositional simulation of enhanced oil recovery and surfactant enhanced aquifer remediation processes is a complex task that involves solving dozens of equations for all grid blocks representing a reservoir. In the present work, we perform a numerical validation of the newly developed mathematical formulation which satisfies the conservation laws of mass and energy and allows applying a sequential solution approach to solve the governing equations separately and implicitly. Through its application to the numerical experiment using a wettability alteration model and comparisons with existing chemical compositional model's numerical results, the new model has proven to be practical, reliable and stable.

  18. Circumplex Model VII: validation studies and FACES III.

    Science.gov (United States)

    Olson, D H

    1986-09-01

    This paper reviews some of the recent empirical studies validating the Circumplex Model and describes the newly developed self-report measure, FACES III. Studies testing hypotheses derived from the Circumplex Model regarding the three dimensions of cohesion, change, and communication are reviewed. Case illustrations using FACES III and the Clinical Rating Scale are presented. These two assessment tools can be used for making a diagnosis of family functioning and for assessing changes over the course of treatment. This paper reflects the continuing attempt to develop further the Circumplex Model and to bridge more adequately research, theory, and practice.

  19. Development and validation of a building design waste reduction model.

    Science.gov (United States)

    Llatas, C; Osmani, M

    2016-10-01

    Reduction in construction waste is a pressing need in many countries. The design of building elements is considered a pivotal process to achieve waste reduction at source, which enables an informed prediction of their wastage reduction levels. However the lack of quantitative methods linking design strategies to waste reduction hinders designing out waste practice in building projects. Therefore, this paper addresses this knowledge gap through the design and validation of a Building Design Waste Reduction Strategies (Waste ReSt) model that aims to investigate the relationships between design variables and their impact on onsite waste reduction. The Waste ReSt model was validated in a real-world case study involving 20 residential buildings in Spain. The validation process comprises three stages. Firstly, design waste causes were analyzed. Secondly, design strategies were applied leading to several alternative low waste building elements. Finally, their potential source reduction levels were quantified and discussed within the context of the literature. The Waste ReSt model could serve as an instrumental tool to simulate designing out strategies in building projects. The knowledge provided by the model could help project stakeholders to better understand the correlation between the design process and waste sources and subsequently implement design practices for low-waste buildings. Copyright © 2016 Elsevier Ltd. All rights reserved.

  20. Pharmacokinetic modeling of gentamicin in treatment of infective endocarditis: Model development and validation of existing models.

    Directory of Open Access Journals (Sweden)

    Anna Gomes

    Full Text Available Gentamicin shows large variations in half-life and volume of distribution (Vd) within and between individuals. Thus, monitoring and accurately predicting serum levels are required to optimize effectiveness and minimize toxicity. Currently, two population pharmacokinetic models are applied for predicting gentamicin doses in adults. For endocarditis patients the optimal model is unknown. We aimed at: 1) creating an optimal model for endocarditis patients; and 2) assessing whether the endocarditis and existing models can accurately predict serum levels. We performed a retrospective observational two-cohort study: one cohort to parameterize the endocarditis model by iterative two-stage Bayesian analysis, and a second cohort to validate and compare all three models. The Akaike Information Criterion and the weighted sum of squares of the residuals divided by the degrees of freedom were used to select the endocarditis model. Median Prediction Error (MDPE) and Median Absolute Prediction Error (MDAPE) were used to test all models with the validation dataset. We built the endocarditis model based on data from the modeling cohort (65 patients) with a fixed 0.277 L/h/70kg metabolic clearance, 0.698 (±0.358) renal clearance as fraction of creatinine clearance, and Vd 0.312 (±0.076) L/kg corrected lean body mass. External validation with data from 14 validation cohort patients showed a similar predictive power of the endocarditis model (MDPE -1.77%, MDAPE 4.68%) as compared to the intensive-care (MDPE -1.33%, MDAPE 4.37%) and standard (MDPE -0.90%, MDAPE 4.82%) models. All models acceptably predicted pharmacokinetic parameters for gentamicin in endocarditis patients. However, these patients appear to have an increased Vd, similar to intensive care patients. Vd mainly determines the height of peak serum levels, which in turn correlate with bactericidal activity. In order to maintain simplicity, we advise to use the existing intensive-care model in clinical practice to
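
    The MDPE and MDAPE statistics used for external validation are straightforward to compute. A sketch under one common convention (errors expressed as a percentage of the observed concentration), with illustrative numbers rather than patient data:

```python
# Median (absolute) prediction error as used for external validation of
# population pharmacokinetic models. Observations and predictions are
# illustrative, not patient data.
import numpy as np

observed = np.array([8.1, 6.4, 9.7, 5.2, 7.8, 10.3])     # mg/L, hypothetical
predicted = np.array([7.9, 6.8, 9.1, 5.5, 7.4, 10.9])

pe = 100.0 * (predicted - observed) / observed            # prediction error, %
mdpe = np.median(pe)           # bias
mdape = np.median(np.abs(pe))  # imprecision
print(f"MDPE = {mdpe:.2f}%, MDAPE = {mdape:.2f}%")
```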

  1. Long-term variation analysis of a tropical river's annual streamflow regime over a 50-year period

    Science.gov (United States)

    Seyam, Mohammed; Othman, Faridah

    2015-07-01

    Studying the long-term changes of streamflow is an important tool for enhancing water resource and river system planning, design, and management. The aim of this work is to identify the long-term variations in the annual streamflow regime over a 50-year period from 1961 to 2010 in the Selangor River, which is one of the main tropical rivers in Malaysia. Initially, the data underwent preliminary independence, normality, and homogeneity testing using the Pearson correlation coefficient, the Shapiro-Wilk test, and Pettitt's test, respectively. The work includes a study and analysis of the changes through nine variables describing the annual streamflow and variations in the yearly duration of high and low streamflows. The analyses were conducted on two time scales: yearly and sub-periodic. The sub-periods were obtained by segmenting the 50 years into seven sub-periods using two techniques, namely the change-point test and the direct method. Even though the analysis revealed nearly negligible changes in mean annual flow over the study period, the maximum annual flow generally increased while the minimum annual flow significantly decreased with respect to time. It was also observed that the variables describing the dispersion in streamflow continually increased with respect to time. An obvious increase was detected in the yearly duration of the danger level of streamflow, a slight increase was noted in the yearly duration of the warning and alert levels, and a slight decrease in the yearly duration of low streamflow was found. The observed changes confirm the existence of long-term changes in the annual streamflow regime, which increase the probability of floods and droughts occurring in the future. In light of the results, attention should be drawn to developing water resource management and flood protection plans in order to avert the harmful effects potentially resulting from the expected changes in the annual streamflow regime.
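
    A sketch of the preliminary screening step on a synthetic annual series: Shapiro-Wilk normality, lag-1 correlation as an independence check, and Pettitt's change-point test implemented directly (with its usual approximate p-value), since it is not available in scipy. Thresholds and data are illustrative only.

```python
# Preliminary screening of an annual streamflow series: Shapiro-Wilk normality,
# lag-1 Pearson autocorrelation as an independence check, and Pettitt's
# change-point test (implemented directly; approximate p-value formula).
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
x = np.concatenate([rng.normal(100, 15, 25), rng.normal(85, 15, 25)])  # synthetic

# Normality and lag-1 serial correlation
w_stat, w_p = stats.shapiro(x)
r1, r1_p = stats.pearsonr(x[:-1], x[1:])

# Pettitt's test: U_t = sum_{i<=t} sum_{j>t} sign(x_j - x_i)
n = x.size
u = np.array([np.sign(x[t + 1:, None] - x[: t + 1]).sum() for t in range(n - 1)])
k = np.abs(u).max()
change_point = np.abs(u).argmax() + 1
p_pettitt = 2.0 * np.exp(-6.0 * k**2 / (n**3 + n**2))

print(f"Shapiro-Wilk p = {w_p:.3f}, lag-1 r = {r1:.2f} (p = {r1_p:.3f})")
print(f"Pettitt change point after year index {change_point}, p ~ {p_pettitt:.3f}")
```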

  2. Cross validation for the classical model of structured expert judgment

    International Nuclear Information System (INIS)

    Colson, Abigail R.; Cooke, Roger M.

    2017-01-01

    We update the 2008 TU Delft structured expert judgment database with data from 33 professionally contracted Classical Model studies conducted between 2006 and March 2015 to evaluate its performance relative to other expert aggregation models. We briefly review alternative mathematical aggregation schemes, including harmonic weighting, before focusing on linear pooling of expert judgments with equal weights and performance-based weights. Performance weighting outperforms equal weighting in all but 1 of the 33 studies in-sample. True out-of-sample validation is rarely possible for Classical Model studies, and cross validation techniques that split calibration questions into a training and test set are used instead. Performance weighting incurs an “out-of-sample penalty” and its statistical accuracy out-of-sample is lower than that of equal weighting. However, as a function of training set size, the statistical accuracy of performance-based combinations reaches 75% of the equal weight value when the training set includes 80% of calibration variables. At this point the training set is sufficiently powerful to resolve differences in individual expert performance. The information of performance-based combinations is double that of equal weighting when the training set is at least 50% of the set of calibration variables. Previous out-of-sample validation work used a Total Out-of-Sample Validity Index based on all splits of the calibration questions into training and test subsets, which is expensive to compute and includes small training sets of dubious value. As an alternative, we propose an Out-of-Sample Validity Index based on averaging the product of statistical accuracy and information over all training sets sized at 80% of the calibration set. Performance weighting outperforms equal weighting on this Out-of-Sample Validity Index in 26 of the 33 post-2006 studies; the probability of 26 or more successes on 33 trials if there were no difference between performance

  3. Seine estuary modelling and AirSWOT measurements validation

    Science.gov (United States)

    Chevalier, Laetitia; Lyard, Florent; Laignel, Benoit

    2013-04-01

    In the context of global climate change, knowing water fluxes and storage, from the global scale to the local scale, is a crucial issue. The future satellite SWOT (Surface Water and Ocean Topography) mission, dedicated to surface water observation, is proposed to meet this challenge. SWOT's main payload will be a Ka-band Radar Interferometer (KaRIn). To validate this new kind of measurement, preparatory airborne campaigns (called AirSWOT) are currently being designed. AirSWOT will carry an interferometer similar to KaRIn: KaSPAR, the Ka-band SWOT Phenomenology Airborne Radar. Some campaigns are planned in France in 2014. During these campaigns, the plane will fly over the Seine River basin, especially to observe its estuary, the upstream river main channel (to quantify river-aquifer exchange) and some wetlands. The objective of the present work is to validate the ability of AirSWOT and SWOT using Seine estuary hydrodynamic modelling. In this context, field measurements will be collected by different teams such as GIP (Public Interest Group) Seine Aval, the GPMR (Rouen Seaport), SHOM (Hydrographic and Oceanographic Service of the Navy), IFREMER (French Research Institute for Sea Exploitation), Mercator-Ocean, LEGOS (Laboratory of Space Study in Geophysics and Oceanography), ADES (Data Access Groundwater), etc. These datasets will be used first to validate AirSWOT measurements locally, and then to improve the hydrodynamic simulations (using tidal boundary conditions, river and groundwater inflows, etc.) for 2D validation of the AirSWOT data. This modelling will also be used to estimate the benefit of the future SWOT mission for mid-latitude river hydrology. To do this modelling, the TUGOm barotropic model (Toulouse Unstructured Grid Ocean model 2D) is used. Preliminary simulations have been performed by first modelling and then combining two different regions: the Seine River and its estuarine area, and the English Channel. These two simulations are currently being

  4. Isotopes as validation tools for global climate models

    International Nuclear Information System (INIS)

    Henderson-Sellers, A.

    2001-01-01

    Global Climate Models (GCMs) are the predominant tool with which we predict the future climate. In order that people can have confidence in such predictions, GCMs require validation. As almost every available item of meteorological data has been exploited in the construction and tuning of GCMs to date, independent validation is very difficult. This paper explores the use of isotopes as a novel and fully independent means of evaluating GCMs. The focus is the Amazon Basin which has a long history of isotope collection and analysis and also of climate modelling: both having been reported for over thirty years. Careful consideration of the results of GCM simulations of Amazonian deforestation and climate change suggests that the recent stable isotope record is more consistent with the predicted effects of greenhouse warming, possibly combined with forest removal, than with GCM predictions of the effects of deforestation alone

  5. In-Drift Microbial Communities Model Validation Calculations

    Energy Technology Data Exchange (ETDEWEB)

    D. M. Jolley

    2001-09-24

    The objective and scope of this calculation is to create the appropriate parameter input for MING 1.0 (CSCI 30018 V1.0, CRWMS M&O 1998b) that will allow the testing of the results from the MING software code with both scientific measurements of microbial populations at the site and laboratory and with natural analogs to the site. This set of calculations provides results that will be used in model validation for the ''In-Drift Microbial Communities'' model (CRWMS M&O 2000) which is part of the Engineered Barrier System Department (EBS) process modeling effort that eventually will feed future Total System Performance Assessment (TSPA) models. This calculation is being produced to replace MING model validation output that is effected by the supersession of DTN MO9909SPAMING1.003 using its replacement DTN MO0106SPAIDM01.034 so that the calculations currently found in the ''In-Drift Microbial Communities'' AMR (CRWMS M&O 2000) will be brought up to date. This set of calculations replaces the calculations contained in sections 6.7.2, 6.7.3 and Attachment I of CRWMS M&O (2000) As all of these calculations are created explicitly for model validation, the data qualification status of all inputs can be considered corroborative in accordance with AP-3.15Q. This work activity has been evaluated in accordance with the AP-2.21 procedure, ''Quality Determinations and Planning for Scientific, Engineering, and Regulatory Compliance Activities'', and is subject to QA controls (BSC 2001). The calculation is developed in accordance with the AP-3.12 procedure, Calculations, and prepared in accordance with the ''Technical Work Plan For EBS Department Modeling FY 01 Work Activities'' (BSC 2001) which includes controls for the management of electronic data.

  6. In-Drift Microbial Communities Model Validation Calculation

    Energy Technology Data Exchange (ETDEWEB)

    D. M. Jolley

    2001-10-31

    The objective and scope of this calculation is to create the appropriate parameter input for MING 1.0 (CSCI 30018 V1.0, CRWMS M&O 1998b) that will allow the testing of the results from the MING software code with both scientific measurements of microbial populations at the site and laboratory and with natural analogs to the site. This set of calculations provides results that will be used in model validation for the ''In-Drift Microbial Communities'' model (CRWMS M&O 2000) which is part of the Engineered Barrier System Department (EBS) process modeling effort that eventually will feed future Total System Performance Assessment (TSPA) models. This calculation is being produced to replace MING model validation output that is effected by the supersession of DTN MO9909SPAMING1.003 using its replacement DTN MO0106SPAIDM01.034 so that the calculations currently found in the ''In-Drift Microbial Communities'' AMR (CRWMS M&O 2000) will be brought up to date. This set of calculations replaces the calculations contained in sections 6.7.2, 6.7.3 and Attachment I of CRWMS M&O (2000) As all of these calculations are created explicitly for model validation, the data qualification status of all inputs can be considered corroborative in accordance with AP-3.15Q. This work activity has been evaluated in accordance with the AP-2.21 procedure, ''Quality Determinations and Planning for Scientific, Engineering, and Regulatory Compliance Activities'', and is subject to QA controls (BSC 2001). The calculation is developed in accordance with the AP-3.12 procedure, Calculations, and prepared in accordance with the ''Technical Work Plan For EBS Department Modeling FY 01 Work Activities'' (BSC 2001) which includes controls for the management of electronic data.

  7. In-Drift Microbial Communities Model Validation Calculations

    International Nuclear Information System (INIS)

    Jolley, D.M.

    2001-01-01

    The objective and scope of this calculation is to create the appropriate parameter input for MING 1.0 (CSCI 30018 V1.0, CRWMS M&O 1998b) that will allow the testing of the results from the MING software code with both scientific measurements of microbial populations at the site and laboratory and with natural analogs to the site. This set of calculations provides results that will be used in model validation for the ''In-Drift Microbial Communities'' model (CRWMS M&O 2000) which is part of the Engineered Barrier System Department (EBS) process modeling effort that eventually will feed future Total System Performance Assessment (TSPA) models. This calculation is being produced to replace MING model validation output that is affected by the supersession of DTN MO9909SPAMING1.003 using its replacement DTN MO0106SPAIDM01.034 so that the calculations currently found in the ''In-Drift Microbial Communities'' AMR (CRWMS M&O 2000) will be brought up to date. This set of calculations replaces the calculations contained in sections 6.7.2, 6.7.3 and Attachment I of CRWMS M&O (2000). As all of these calculations are created explicitly for model validation, the data qualification status of all inputs can be considered corroborative in accordance with AP-3.15Q. This work activity has been evaluated in accordance with the AP-2.21 procedure, ''Quality Determinations and Planning for Scientific, Engineering, and Regulatory Compliance Activities'', and is subject to QA controls (BSC 2001). The calculation is developed in accordance with the AP-3.12 procedure, Calculations, and prepared in accordance with the ''Technical Work Plan For EBS Department Modeling FY 01 Work Activities'' (BSC 2001) which includes controls for the management of electronic data.

  8. IN-DRIFT MICROBIAL COMMUNITIES MODEL VALIDATION CALCULATIONS

    Energy Technology Data Exchange (ETDEWEB)

    D.M. Jolley

    2001-12-18

    The objective and scope of this calculation is to create the appropriate parameter input for MING 1.0 (CSCI 30018 V1.0, CRWMS M&O 1998b) that will allow the testing of the results from the MING software code with both scientific measurements of microbial populations at the site and laboratory and with natural analogs to the site. This set of calculations provides results that will be used in model validation for the ''In-Drift Microbial Communities'' model (CRWMS M&O 2000) which is part of the Engineered Barrier System Department (EBS) process modeling effort that eventually will feed future Total System Performance Assessment (TSPA) models. This calculation is being produced to replace MING model validation output that is affected by the supersession of DTN MO9909SPAMING1.003 using its replacement DTN MO0106SPAIDM01.034 so that the calculations currently found in the ''In-Drift Microbial Communities'' AMR (CRWMS M&O 2000) will be brought up to date. This set of calculations replaces the calculations contained in sections 6.7.2, 6.7.3 and Attachment I of CRWMS M&O (2000). As all of these calculations are created explicitly for model validation, the data qualification status of all inputs can be considered corroborative in accordance with AP-3.15Q. This work activity has been evaluated in accordance with the AP-2.21 procedure, ''Quality Determinations and Planning for Scientific, Engineering, and Regulatory Compliance Activities'', and is subject to QA controls (BSC 2001). The calculation is developed in accordance with the AP-3.12 procedure, Calculations, and prepared in accordance with the ''Technical Work Plan For EBS Department Modeling FY 01 Work Activities'' (BSC 2001) which includes controls for the management of electronic data.

  9. IN-DRIFT MICROBIAL COMMUNITIES MODEL VALIDATION CALCULATIONS

    International Nuclear Information System (INIS)

    D.M. Jolley

    2001-01-01

    The objective and scope of this calculation is to create the appropriate parameter input for MING 1.0 (CSCI 30018 V1.0, CRWMS M&O 1998b) that will allow the testing of the results from the MING software code with both scientific measurements of microbial populations at the site and laboratory and with natural analogs to the site. This set of calculations provides results that will be used in model validation for the ''In-Drift Microbial Communities'' model (CRWMS M&O 2000) which is part of the Engineered Barrier System Department (EBS) process modeling effort that eventually will feed future Total System Performance Assessment (TSPA) models. This calculation is being produced to replace MING model validation output that is affected by the supersession of DTN MO9909SPAMING1.003 using its replacement DTN MO0106SPAIDM01.034 so that the calculations currently found in the ''In-Drift Microbial Communities'' AMR (CRWMS M&O 2000) will be brought up to date. This set of calculations replaces the calculations contained in sections 6.7.2, 6.7.3 and Attachment I of CRWMS M&O (2000). As all of these calculations are created explicitly for model validation, the data qualification status of all inputs can be considered corroborative in accordance with AP-3.15Q. This work activity has been evaluated in accordance with the AP-2.21 procedure, ''Quality Determinations and Planning for Scientific, Engineering, and Regulatory Compliance Activities'', and is subject to QA controls (BSC 2001). The calculation is developed in accordance with the AP-3.12 procedure, Calculations, and prepared in accordance with the ''Technical Work Plan For EBS Department Modeling FY 01 Work Activities'' (BSC 2001) which includes controls for the management of electronic data.

  10. Validating unit commitment models: A case for benchmark test systems

    OpenAIRE

    Melhorn, Alexander C.; Li, Mingsong; Carroll, Paula; Flynn, Damian

    2016-01-01

    Due to the increasing penetration of non-traditional power system resources (e.g. renewable generation, electric vehicles, demand response, etc.) and increasing computational power, there has been an increased interest in research on unit commitment. It therefore may be important to take another look at how unit commitment models and algorithms are validated, especially as improvements in solutions and algorithmic performance are desired to combat the added complexity of additional constraints. This paper expl...

  11. Requirements Validation: Execution of UML Models with CPN Tools

    DEFF Research Database (Denmark)

    Machado, Ricardo J.; Lassen, Kristian Bisgaard; Oliveira, Sérgio

    2007-01-01

    Requirements validation is a critical task in any engineering project. The confrontation of stakeholders with static requirements models is not enough, since stakeholders with non-computer science education are not able to discover all the inter-dependencies between the elicited requirements. Even...... requirements, where the system to be built must explicitly support the interaction between people within a pervasive cooperative workflow execution. A case study from a real project is used to illustrate the proposed approach....

  12. Experimental Validation of a Permeability Model for Enrichment Membranes

    International Nuclear Information System (INIS)

    Orellano, Pablo; Brasnarof, Daniel; Florido Pablo

    2003-01-01

    An experimental loop with a real-scale diffuser, in a single enrichment-stage configuration, was operated with air at different process conditions in order to characterize the membrane permeability. Using these experimental data, an analytical geometric-and-morphologic-based model was validated. It is concluded that a new set of independent measurements, i.e. enrichment, is necessary in order to fully characterize diffusers, because their internal parameters are not uniquely determined by permeability data alone.

  13. Monte Carlo Modelling of Mammograms : Development and Validation

    International Nuclear Information System (INIS)

    Spyrou, G.; Panayiotakis, G.; Bakas, A.; Tzanakos, G.

    1998-01-01

    A software package using Monte Carlo methods has been developed for the simulation of x-ray mammography. A simplified geometry of the mammographic apparatus has been considered along with the software phantom of compressed breast. This phantom may contain inhomogeneities of various compositions and sizes at any point. Using this model one can produce simulated mammograms. Results that demonstrate the validity of this simulation are presented. (authors)

  14. Advances in the spatially distributed ages-w model: parallel computation, java connection framework (JCF) integration, and streamflow/nitrogen dynamics assessment

    Science.gov (United States)

    AgroEcoSystem-Watershed (AgES-W) is a modular, Java-based spatially distributed model which implements hydrologic and water quality (H/WQ) simulation components under the Java Connection Framework (JCF) and the Object Modeling System (OMS) environmental modeling framework. AgES-W is implicitly scala...

  15. Parameterization and validation of an ungulate-pasture model.

    Science.gov (United States)

    Pekkarinen, Antti-Juhani; Kumpula, Jouko; Tahvonen, Olli

    2017-10-01

    Ungulate grazing and trampling strongly affect pastures and ecosystems throughout the world. Ecological population models are used for studying these systems and determining the guidelines for sustainable and economically viable management. However, the effect of trampling and other resource wastage is either not taken into account or quantified with data in earlier models. Also, the ability of models to describe the herbivore impact on pastures is usually not validated. We used a detailed model and data to study the level of winter- and summertime lichen wastage by reindeer and the effects of wastage on population sizes and management. We also validated the model with respect to its ability of predicting changes in lichen biomass and compared the actual management in herding districts with model results. The modeling efficiency value (0.75) and visual comparison between the model predictions and data showed that the model was able to describe the changes in lichen pastures caused by reindeer grazing and trampling. At the current lichen biomass levels in the northernmost Finland, the lichen wastage varied from 0 to 1 times the lichen intake during winter and from 6 to 10 times the intake during summer. With a higher value for wastage, reindeer numbers and net revenues were lower in the economically optimal solutions. Higher wastage also favored the use of supplementary feeding in the optimal steady state. Actual reindeer numbers in the districts were higher than in the optimal steady-state solutions for the model in 18 herding districts out of 20. Synthesis and applications . We show that a complex model can be used for analyzing ungulate-pasture dynamics and sustainable management if the model is parameterized and validated for the system. Wastage levels caused by trampling and other causes should be quantified with data as they strongly affect the results and management recommendations. Summertime lichen wastage caused by reindeer is higher than expected, which

  16. In ecoregions across western USA streamflow increases during post-wildfire recovery

    Science.gov (United States)

    Wine, Michael L.; Cadol, Daniel; Makhnin, Oleg

    2018-01-01

    Continued growth of the human population on Earth will increase pressure on already stressed terrestrial water resources required for drinking water, agriculture, and industry. This stress demands improved understanding of critical controls on water resource availability, particularly in water-limited regions. Mechanistic predictions of future water resource availability are needed because non-stationary conditions exist in the form of changing climatic conditions, land management paradigms, and ecological disturbance regimes. While historically ecological disturbances have been small and could be neglected relative to climatic effects, evidence is accumulating that ecological disturbances, particularly wildfire, can increase regional water availability. However, wildfire hydrologic impacts are typically estimated locally and at small spatial scales, via disparate measurement methods and analysis techniques, and outside the context of climate change projections. Consequently, the relative importance of climate change driven versus wildfire driven impacts on streamflow remains unknown across the western USA. Here we show that considering wildfire in modeling streamflow significantly improves model predictions. Mixed effects modeling attributed 2%-14% of long-term annual streamflow to wildfire effects. The importance of this wildfire-linked streamflow relative to predicted climate change-induced streamflow reductions ranged from 20%-370% of the streamflow decrease predicted to occur by 2050. The rate of post-wildfire vegetation recovery and the proportion of watershed area burned controlled the wildfire effect. Our results demonstrate that in large areas of the western USA affected by wildfire, regional predictions of future water availability are subject to greater structural uncertainty than previously thought. These results suggest that future streamflows may be underestimated in areas affected by increased prevalence of hydrologically relevant ecological

  17. Filament winding cylinders. II - Validation of the process model

    Science.gov (United States)

    Calius, Emilio P.; Lee, Soo-Yong; Springer, George S.

    1990-01-01

    Analytical and experimental studies were performed to validate the model developed by Lee and Springer for simulating the manufacturing process of filament wound composite cylinders. First, results calculated by the Lee-Springer model were compared to results of the Calius-Springer thin cylinder model. Second, temperatures and strains calculated by the Lee-Springer model were compared to data. The data used in these comparisons were generated during the course of this investigation with cylinders made of Hercules IM-6G/HBRF-55 and Fiberite T-300/976 graphite-epoxy tows. Good agreement was found between the calculated and measured stresses and strains, indicating that the model is a useful representation of the winding and curing processes.

  18. Evaluation model and experimental validation of tritium in agricultural plant

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Hee Suk; Keum, Dong Kwon; Lee, Han Soo; Jun, In; Choi, Yong Ho; Lee, Chang Woo [KAERI, Daejon (Korea, Republic of)

    2005-12-15

    This paper describes a dynamic compartment model for evaluating the contamination level of tritium in agricultural plants exposed to accidentally released tritium. The present model uses a time-dependent plant growth equation so that it can predict the effect of the plant's growth stage during the exposure time. The model, which includes atmosphere, soil and plant compartments, is described by a set of nonlinear ordinary differential equations and is able to predict time-dependent concentrations of tritium in the compartments. To validate the model, a series of exposure experiments of HTO vapor on Chinese cabbage and radish was carried out at different growth stages of each plant. At the end of exposure, the tissue free water tritium (TFWT) and the organically bound tritium (OBT) were measured. The measured concentrations agreed well with the model predictions.
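
    A compartment model of this kind reduces to coupled ordinary differential equations for tritium in air, soil and plant water, with a growth term that dilutes the plant concentration. The sketch below is heavily simplified and uses made-up rate constants, not the parameterization of the study.

```python
# Heavily simplified three-compartment tritium model (air HTO, soil water,
# plant tissue water) with a logistic plant-growth term that dilutes the
# plant concentration. Rate constants are illustrative, not the study values.
import numpy as np
from scipy.integrate import solve_ivp

k_dep, k_evap = 0.5, 0.2      # air <-> soil exchange (1/h)
k_up, k_loss = 0.3, 0.1       # soil -> plant uptake, plant -> air loss (1/h)
r_growth, m_max = 0.05, 1.0   # logistic growth rate (1/h) and max biomass (kg)

def rhs(t, y):
    c_air, c_soil, c_plant, m = y            # concentrations (Bq/L) and biomass
    dm = r_growth * m * (1 - m / m_max)
    dc_air = -k_dep * c_air + k_loss * c_plant
    dc_soil = k_dep * c_air - k_evap * c_soil - k_up * c_soil
    dc_plant = k_up * c_soil - k_loss * c_plant - (dm / m) * c_plant  # growth dilution
    return [dc_air, dc_soil, dc_plant, dm]

y0 = [1000.0, 0.0, 0.0, 0.1]                 # exposure starts with HTO in air
sol = solve_ivp(rhs, (0.0, 72.0), y0, dense_output=True)
for ti, (ca, cs, cp, m) in zip(np.linspace(0, 72, 7),
                               sol.sol(np.linspace(0, 72, 7)).T):
    print(f"t={ti:5.1f} h  air={ca:8.1f}  soil={cs:8.1f}  plant={cp:8.1f}  biomass={m:.2f}")
```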

  19. Methods for Geometric Data Validation of 3d City Models

    Science.gov (United States)

    Wagner, D.; Alam, N.; Wewetzer, M.; Pries, M.; Coors, V.

    2015-12-01

    Geometric quality of 3D city models is crucial for data analysis and simulation tasks, which are part of modern applications of the data (e.g. potential heating energy consumption of city quarters, solar potential, etc.). Geometric quality in these contexts is, however, a different concept than it is for 2D maps. In the latter case, aspects such as positional or temporal accuracy and correctness represent typical quality metrics of the data. They are defined in ISO 19157 and should be mentioned as part of the metadata. 3D data have a far wider range of aspects which influence their quality, and the idea of quality itself is application dependent. Thus, concepts for the definition of quality are needed, including methods to validate these definitions. Quality in this sense means internal validation and detection of inconsistent or wrong geometry according to a predefined set of rules. A useful starting point would be to have correct geometry in accordance with ISO 19107. A valid solid should consist of planar faces which touch their neighbours exclusively in defined corner points and edges. No gaps between them are allowed, and the whole feature must be 2-manifold. In this paper, we present methods to validate common geometric requirements for building geometry. Different checks based on several algorithms have been implemented to validate a set of rules derived from the solid definition mentioned above (e.g. water tightness of the solid or planarity of its polygons), as they were developed for the software tool CityDoctor. The method of each check is specified, with a special focus on the discussion of tolerance values where they are necessary. The checks include polygon-level checks to validate the correctness of each polygon, i.e. closure of the bounding linear ring and planarity. On the solid level, which is only validated if the polygons have passed validation, correct polygon orientation is checked, after self-intersections outside of defined corner points and edges
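
    One of the polygon-level checks mentioned above, planarity within a tolerance, can be sketched as follows. The plane-fitting approach, tolerance value and test polygons are illustrative assumptions, not the CityDoctor implementation.

```python
# Sketch of a polygon planarity check: fit a plane to the vertices by SVD and
# flag the polygon if any vertex deviates more than a tolerance. Tolerance and
# test data are illustrative, not the CityDoctor rules.
import numpy as np

def is_planar(vertices, tol=0.01):
    """vertices: (n, 3) array of polygon corner points, n >= 3."""
    pts = np.asarray(vertices, dtype=float)
    centred = pts - pts.mean(axis=0)
    # Normal of the best-fit plane = right singular vector with smallest singular value
    _, _, vt = np.linalg.svd(centred)
    normal = vt[-1]
    distances = np.abs(centred @ normal)
    return bool(distances.max() <= tol), distances.max()

roof = [(0, 0, 10), (5, 0, 10.004), (5, 4, 10.002), (0, 4, 10)]   # nearly planar
warped = [(0, 0, 10), (5, 0, 10.5), (5, 4, 10), (0, 4, 10.5)]     # folded quad

for name, poly in [("roof", roof), ("warped", warped)]:
    ok, dev = is_planar(poly)
    print(f"{name}: planar within tolerance = {ok}, max deviation = {dev:.3f} m")
```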

  20. Validation of the WATEQ4 geochemical model for uranium

    International Nuclear Information System (INIS)

    Krupka, K.M.; Jenne, E.A.; Deutsch, W.J.

    1983-09-01

    As part of the Geochemical Modeling and Nuclide/Rock/Groundwater Interactions Studies Program, a study was conducted to partially validate the WATEQ4 aqueous speciation-solubility geochemical model for uranium. The solubility controls determined with the WATEQ4 geochemical model were in excellent agreement with those laboratory studies in which the solids schoepite [UO2(OH)2·H2O], UO2(OH)2, and rutherfordine (UO2CO3) were identified as actual solubility controls for uranium. The results of modeling solution analyses from laboratory studies of uranyl phosphate solids, however, identified possible errors in the characterization of solids in the original solubility experiments. As part of this study, significant deficiencies in the WATEQ4 thermodynamic data base for uranium solutes and solids were corrected. Revisions included recalculation of selected uranium reactions. Additionally, thermodynamic data for the hydroxyl complexes of U(VI), including anionic U(VI) species, were evaluated (to the extent permitted by the available data). Vanadium reactions were also added to the thermodynamic data base because uranium-vanadium solids can exist in natural ground-water systems. This study is only a partial validation of the WATEQ4 geochemical model because the available laboratory solubility studies do not cover the range of solid phases, alkaline pH values, and concentrations of inorganic complexing ligands needed to evaluate the potential solubility of uranium in ground waters associated with various proposed nuclear waste repositories. Further validation of this or other geochemical models for uranium will require careful determinations of uraninite solubility over the pH range of 7 to 10 under highly reducing conditions and of uranyl hydroxide and phosphate solubilities over the pH range of 7 to 10 under oxygenated conditions

  1. Validation of a Business Model for Cultural Heritage Institutions

    Directory of Open Access Journals (Sweden)

    Cristian CIUREA

    2015-01-01

    Full Text Available The paper proposes a business model for the efficiency optimization of the interaction between all actors involved in the cultural heritage sector, such as galleries, libraries, archives and museums (GLAM). The validation of the business model is the subject of analyses and implementations in a real environment made by different cultural institutions. The implementation of virtual exhibitions on mobile devices is described and analyzed as a key factor for increasing the visibility of cultural heritage. New perspectives on the development of virtual exhibitions for mobile devices are considered. A study on the number of visitors of cultural institutions is carried out and ways to increase the number of visitors are described.

  2. On the contribution of groundwater storage to interannual streamflow anomalies in the Colorado River basin

    Directory of Open Access Journals (Sweden)

    E. A. Rosenberg

    2013-04-01

    Full Text Available We assess the significance of groundwater storage for seasonal streamflow forecasts by evaluating its contribution to interannual streamflow anomalies in the 29 tributary sub-basins of the Colorado River. Monthly and annual changes in total basin storage are simulated by two implementations of the Variable Infiltration Capacity (VIC) macroscale hydrology model – the standard release of the model, and an alternate version that has been modified to include the SIMple Groundwater Model (SIMGM), which represents an unconfined aquifer underlying the soil column. These estimates are compared to those resulting from basin-scale water balances derived exclusively from observational data and changes in terrestrial water storage from the Gravity Recovery and Climate Experiment (GRACE) satellites. Changes in simulated groundwater storage are then compared to those derived via baseflow recession analysis for 72 reference-quality watersheds. Finally, estimates are statistically analyzed for relationships to interannual streamflow anomalies, and predictive capacities are compared across storage terms. We find that both model simulations result in similar estimates of total basin storage change, that these estimates compare favorably with those obtained from basin-scale water balances and GRACE data, and that baseflow recession analyses are consistent with simulated changes in groundwater storage. Statistical analyses reveal essentially no relationship between groundwater storage and interannual streamflow anomalies, suggesting that operational seasonal streamflow forecasts, which do not account for groundwater conditions implicitly or explicitly, are likely not detrimentally affected by this omission in the Colorado River basin.
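
    The baseflow recession analysis used for comparison can be sketched as estimating a linear-reservoir recession constant from the falling limb of a hydrograph. The selection rule, data and storage relation below are illustrative assumptions only.

```python
# Sketch of baseflow recession analysis: assume a linear reservoir,
# Q(t) = Q0 * exp(-t / k), and estimate the recession constant k from the
# slope of log(Q) during falling limbs. Data and selection rule are synthetic.
import numpy as np

rng = np.random.default_rng(8)
true_k = 45.0                                      # days
t = np.arange(120)
q = 50.0 * np.exp(-t / true_k) * np.exp(rng.normal(0, 0.02, t.size))

falling = np.diff(q) < 0                           # crude falling-limb selection
tt, qq = t[1:][falling], q[1:][falling]
slope, intercept = np.polyfit(tt, np.log(qq), 1)
k_est = -1.0 / slope

# For a linear reservoir, storage is proportional to discharge: S = k * Q
storage_change = k_est * (qq[-1] - qq[0])
print(f"estimated recession constant k = {k_est:.1f} days")
print(f"implied storage change over the recession = {storage_change:.1f} (flow units x days)")
```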

  3. Regime-shifting streamflow processes: Implications for water supply reservoir operations

    Science.gov (United States)

    Turner, S. W. D.; Galelli, S.

    2016-05-01

    This paper examines the extent to which regime-like behavior in streamflow time series impacts reservoir operating policy performance. We begin by incorporating a regime state variable into a well-established stochastic dynamic programming model. We then simulate and compare optimized release policies—with and without the regime state variable—to understand how regime shifts affect operating performance in terms of meeting water delivery targets. Our optimization approach uses a Hidden Markov Model to partition the streamflow time series into a small number of separate regime states. The streamflow persistence structures associated with each state define separate month-to-month streamflow transition probability matrices for computing penalty cost expectations within the optimization procedure. The algorithm generates a four-dimensional array of release decisions conditioned on the within-year time period, reservoir storage state, inflow class, and underlying regime state. Our computational experiment is executed on 99 distinct, hypothetical water supply reservoirs fashioned from the Australian Bureau of Meteorology's Hydrologic Reference Stations. Results show that regime-like behavior is a major cause of suboptimal operations in water supply reservoirs; conventional techniques for optimal policy design may misguide the operator, particularly in regions susceptible to multiyear drought. Stationary streamflow models that allow for regime-like behavior can be incorporated into traditional stochastic optimization models to enhance the flexibility of operations.
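
    A minimal sketch of the regime-identification step, fitting a two-state Hidden Markov Model to (log) monthly streamflow and recovering the most likely state sequence. The use of the hmmlearn package and the synthetic series are assumptions; the study couples the identified states with stochastic dynamic programming, which is not shown here.

```python
# Minimal regime identification step: fit a two-state Hidden Markov Model to
# log monthly streamflow and recover the most likely regime sequence.
# Uses hmmlearn (an assumption about tooling); data are synthetic.
import numpy as np
from hmmlearn.hmm import GaussianHMM

rng = np.random.default_rng(5)
# Synthetic series: 15 "wet" years followed by 15 "dry" years, monthly values
wet = rng.lognormal(mean=4.0, sigma=0.4, size=15 * 12)
dry = rng.lognormal(mean=3.3, sigma=0.4, size=15 * 12)
flow = np.concatenate([wet, dry])

X = np.log(flow).reshape(-1, 1)
model = GaussianHMM(n_components=2, covariance_type="full",
                    n_iter=200, random_state=0)
model.fit(X)
states = model.predict(X)

print("estimated state means (log flow):", model.means_.ravel().round(2))
print("estimated transition matrix:\n", model.transmat_.round(3))
print("fraction of months in state 0:", (states == 0).mean().round(2))
```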

  4. Validation Analysis of the Shoal Groundwater Flow and Transport Model

    Energy Technology Data Exchange (ETDEWEB)

    A. Hassan; J. Chapman

    2008-11-01

    Environmental restoration at the Shoal underground nuclear test is following a process prescribed by a Federal Facility Agreement and Consent Order (FFACO) between the U.S. Department of Energy, the U.S. Department of Defense, and the State of Nevada. Characterization of the site included two stages of well drilling and testing in 1996 and 1999, and development and revision of numerical models of groundwater flow and radionuclide transport. Agreement on a contaminant boundary for the site and a corrective action plan was reached in 2006. Later that same year, three wells were installed for the purposes of model validation and site monitoring. The FFACO prescribes a five-year proof-of-concept period for demonstrating that the site groundwater model is capable of producing meaningful results with an acceptable level of uncertainty. The corrective action plan specifies a rigorous seven step validation process. The accepted groundwater model is evaluated using that process in light of the newly acquired data. The conceptual model of ground water flow for the Project Shoal Area considers groundwater flow through the fractured granite aquifer comprising the Sand Springs Range. Water enters the system by the infiltration of precipitation directly on the surface of the mountain range. Groundwater leaves the granite aquifer by flowing into alluvial deposits in the adjacent basins of Fourmile Flat and Fairview Valley. A groundwater divide is interpreted as coinciding with the western portion of the Sand Springs Range, west of the underground nuclear test, preventing flow from the test into Fourmile Flat. A very low conductivity shear zone east of the nuclear test roughly parallels the divide. The presence of these lateral boundaries, coupled with a regional discharge area to the northeast, is interpreted in the model as causing groundwater from the site to flow in a northeastward direction into Fairview Valley. Steady-state flow conditions are assumed given the absence of

  5. Validating and Verifying Biomathematical Models of Human Fatigue

    Science.gov (United States)

    Martinez, Siera Brooke; Quintero, Luis Ortiz; Flynn-Evans, Erin

    2015-01-01

    Airline pilots experience acute and chronic sleep deprivation, sleep inertia, and circadian desynchrony due to the need to schedule flight operations around the clock. This sleep loss and circadian desynchrony gives rise to cognitive impairments, reduced vigilance and inconsistent performance. Several biomathematical models, based principally on patterns observed in circadian rhythms and homeostatic drive, have been developed to predict a pilot's level of fatigue or alertness. These models allow the Federal Aviation Administration (FAA) and commercial airlines to make decisions about pilot capabilities and flight schedules. Although these models have been validated in a laboratory setting, they have not been thoroughly tested in operational environments where uncontrolled factors, such as environmental sleep disrupters, caffeine use and napping, may impact actual pilot alertness and performance. We will compare the predictions of three prominent biomathematical fatigue models (the McCauley Model, the Harvard Model, and the privately-sold SAFTE-FAST Model) to actual measures of alertness and performance. We collected sleep logs, movement and light recordings, psychomotor vigilance task (PVT) data, and urinary melatonin (a marker of circadian phase) from 44 pilots in a short-haul commercial airline over one month. We will statistically compare the model predictions to lapses on the PVT and to circadian phase. We will calculate the sensitivity and specificity of each model's predictions under different scheduling conditions. Our findings will aid operational decision-makers in determining the reliability of each model under real-world scheduling situations.
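
    Comparing model alertness predictions against observed PVT lapses reduces, at its simplest, to a binary classification check. The sketch below computes sensitivity and specificity with made-up thresholds and synthetic sessions; the score scale, lapse criterion and data are illustrative assumptions.

```python
# Sketch of sensitivity/specificity for a fatigue model: does a predicted
# "high fatigue" flag coincide with a lapse-heavy PVT session?
# Thresholds and data are illustrative only.
import numpy as np

rng = np.random.default_rng(6)
n_sessions = 120
predicted_alertness = rng.uniform(0, 100, n_sessions)       # model score
observed_lapses = np.where(predicted_alertness < 70,
                           rng.poisson(6, n_sessions),      # impaired sessions
                           rng.poisson(2, n_sessions))      # alert sessions

pred_impaired = predicted_alertness < 70    # hypothetical model threshold
obs_impaired = observed_lapses >= 5         # hypothetical PVT lapse criterion

tp = np.sum(pred_impaired & obs_impaired)
tn = np.sum(~pred_impaired & ~obs_impaired)
fp = np.sum(pred_impaired & ~obs_impaired)
fn = np.sum(~pred_impaired & obs_impaired)

print(f"sensitivity = {tp / (tp + fn):.2f}, specificity = {tn / (tn + fp):.2f}")
```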

  6. Image decomposition as a tool for validating stress analysis models

    Directory of Open Access Journals (Sweden)

    Mottershead J.

    2010-06-01

    It is good practice to validate analytical and numerical models used in stress analysis for engineering design by comparison with measurements obtained from real components either in-service or in the laboratory. In reality, this critical step is often neglected or reduced to placing a single strain gage at the predicted hot-spot of stress. Modern techniques of optical analysis allow full-field maps of displacement, strain and/or stress to be obtained from real components with relative ease and at modest cost. However, validations continue to be performed only at predicted and/or observed hot-spots, and most of the wealth of data is ignored. It is proposed that image decomposition methods, commonly employed in techniques such as fingerprinting and iris recognition, can be employed to validate stress analysis models by comparing all of the key features in the data from the experiment and the model. Image decomposition techniques such as Zernike moments and Fourier transforms have been used to decompose full-field distributions for strain generated from optical techniques such as digital image correlation and thermoelastic stress analysis as well as from analytical and numerical models by treating the strain distributions as images. The result of the decomposition is 10^1 to 10^2 image descriptors instead of the 10^5 or 10^6 pixels in the original data. As a consequence, it is relatively easy to make a statistical comparison of the image descriptors from the experiment and from the analytical/numerical model and to provide a quantitative assessment of the stress analysis.
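
    As a rough illustration of reducing a full-field strain map to a handful of descriptors and comparing experiment with model in that reduced space, the sketch below uses a 2-D discrete cosine transform rather than the Zernike moments or Fourier transforms named in the record; the synthetic Gaussian fields merely stand in for DIC data and a finite-element prediction.

        import numpy as np
        from scipy.fft import dctn

        def field_descriptors(field, n_keep=8):
            """Keep only the low-order 2-D DCT coefficients of a full-field map."""
            coeffs = dctn(field, norm="ortho")
            return coeffs[:n_keep, :n_keep].ravel()

        def compare_fields(measured, simulated, n_keep=8):
            """Quantitative comparison of experiment and model in descriptor space."""
            d_meas = field_descriptors(measured, n_keep)
            d_sim = field_descriptors(simulated, n_keep)
            rms_diff = np.sqrt(np.mean((d_meas - d_sim) ** 2))
            corr = np.corrcoef(d_meas, d_sim)[0, 1]
            return rms_diff, corr

        # synthetic 256x256 strain fields standing in for measured and modelled data
        x, y = np.meshgrid(np.linspace(0, 1, 256), np.linspace(0, 1, 256))
        measured = np.exp(-((x - 0.5) ** 2 + (y - 0.5) ** 2) / 0.02)
        simulated = np.exp(-((x - 0.52) ** 2 + (y - 0.5) ** 2) / 0.021)
        print(compare_fields(measured, simulated))

    With eight coefficients per axis this keeps 64 descriptors per field, roughly the 10^1 to 10^2 descriptors mentioned above, instead of the 65,536 pixels of the original maps.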

  7. A global evaluation of streamflow drought characteristics

    Directory of Open Access Journals (Sweden)

    A. K. Fleig

    2006-01-01

    How drought is characterised depends on the purpose and region of the study and the available data. In the case of regional applications or global comparisons, a standardisation of the methodology to characterise drought is preferable. In this study the threshold level method in combination with three common pooling procedures is applied to daily streamflow series from a wide range of hydrological regimes. Drought deficit characteristics, such as drought duration and deficit volume, are derived, and the methods are evaluated for their applicability to regional studies. Three different pooling procedures are evaluated: the moving-average procedure (MA-procedure), the inter-event time method (IT-method), and the sequent peak algorithm (SPA). The MA-procedure proved to be a flexible approach for the different series, and its parameter, the averaging interval, can easily be optimised for each stream. However, it modifies the discharge series and might introduce dependency between drought events. For the IT-method it is more difficult to find an optimal value for its parameter, the length of the excess period, in particular for flashy streams. The SPA can only be recommended as a pooling procedure for the selection of annual maximum series of deficit characteristics and for very low threshold levels, to ensure that events occurring shortly after major events are recognized. Furthermore, a frequency analysis of deficit volume and duration is conducted based on partial duration series of drought events. According to extreme value theory, excesses over a certain limit are Generalized Pareto (GP) distributed. It was found that this model indeed performed better than, or as well as, other distribution models. In general, the GP-model could be used for streams of all regime types. However, for intermittent streams, zero-flow periods should be treated as censored data. For catchments with frost during the winter season, summer and winter droughts have to be analysed
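
    The threshold-level method with MA-pooling and the GP fit to deficit volumes can be sketched as follows; the synthetic log-normal "streamflow", the Q90 threshold and the 10-day averaging window are illustrative choices, not the settings of the study.

        import numpy as np
        from scipy import stats

        def drought_events(q, threshold, ma_window=10):
            """Threshold level method with moving-average pooling (MA-procedure).
            q: daily streamflow array; threshold: e.g. the Q90 flow."""
            q_smooth = np.convolve(q, np.ones(ma_window) / ma_window, mode="same")
            deficit = np.maximum(threshold - q_smooth, 0.0)
            below = deficit > 0
            events, i, n = [], 0, len(q)
            while i < n:
                if below[i]:
                    j = i
                    while j < n and below[j]:
                        j += 1
                    events.append({"duration": j - i, "volume": deficit[i:j].sum()})
                    i = j
                else:
                    i += 1
            return events

        # synthetic daily flow series for illustration
        rng = np.random.default_rng(0)
        q = np.exp(rng.normal(1.0, 0.8, 3650))      # log-normal "streamflow"
        q90 = np.percentile(q, 10)                   # flow exceeded 90% of the time
        events = drought_events(q, q90)
        volumes = np.array([e["volume"] for e in events])

        # fit a Generalized Pareto distribution to the deficit volumes
        shape, loc, scale = stats.genpareto.fit(volumes, floc=0.0)
        print(len(events), shape, scale)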

  8. Condensation of steam in horizontal pipes: model development and validation

    International Nuclear Information System (INIS)

    Szijarto, R.

    2015-01-01

    This thesis submitted to the Swiss Federal Institute of Technology ETH in Zurich presents the development and validation of a model for the condensation of steam in horizontal pipes. Condensation models were introduced and developed particularly for application in the emergency cooling system of a Gen-III+ boiling water reactor. Such an emergency cooling system consists of slightly inclined horizontal pipes, which are immersed in a cold water tank. The pipes are connected to the reactor pressure vessel. They are responsible for a fast depressurization of the reactor core in the case of an accident. Condensation in horizontal pipes was investigated with both one-dimensional system codes (RELAP5) and three-dimensional computational fluid dynamics software (ANSYS FLUENT). The performance of the RELAP5 code was not sufficient for transient condensation processes. Therefore, a mechanistic model was developed and implemented. Four models were tested on the LAOKOON facility, which analysed direct contact condensation in a horizontal duct

  9. Validating firn compaction model with remote sensing data

    DEFF Research Database (Denmark)

    Simonsen, S. B.; Stenseng, Lars; Sørensen, Louise Sandberg

    A comprehensive understanding of firn processes is of utmost importance, when estimating present and future changes of the Greenland Ice Sheet. Especially, when remote sensing altimetry is used to assess the state of ice sheets and their contribution to global sea level rise, firn compaction...... of firn compaction to correct ICESat measurements and assessing the present mass loss of the Greenland ice sheet. Validation of the model against the radar data gives good results and confidence in using the model to answer important questions. Questions such as: how large is the firn compaction...... correction relative to the changes in the elevation of the surface observed with remote sensing altimetry? What model time resolution is necessary to resolve the observed layering? What model refinements are necessary to give better estimates of the surface mass balance of the Greenland ice sheet from...

  10. Spatiotemporal impacts of LULC changes on hydrology from the perspective of runoff generation mechanism using SWAT model with evolving parameters

    Science.gov (United States)

    Li, Y.; Chang, J.; Luo, L.

    2017-12-01

    It is of great importance for water resources management to model the true hydrological processes under a changing environment, especially under significant changes of the underlying surfaces, as in the Wei River Basin (WRB) where the subsurface hydrology is highly influenced by human activities, and to systematically investigate the interactions among LULC change, streamflow variation and changes in the runoff generation process. Therefore, we proposed the idea of evolving parameters in a hydrological model (SWAT) to reflect the changes in the physical environment under different LULC conditions. Then, with these evolving parameters, the spatiotemporal impacts of LULC changes on streamflow were quantified, and qualitative analysis was conducted to further explore how LULC changes affect streamflow from the perspective of the runoff generation mechanism. Results indicate the following: 1) evolving-parameter calibration is not only effective but necessary to ensure the validity of the model when dealing with significant changes in underlying surfaces due to human activities; 2) compared to the baseline period, streamflow in wet seasons increased in the 1990s but decreased in the 2000s, while at yearly and dry-season scales streamflow decreased in both decades; 3) the expansion of cropland is the major contributor to the reduction of the surface-water component, causing the decline in streamflow at yearly and dry-season scales, while, compared to the 1990s, the expansion of woodland in the midstream area and of grassland downstream are the main stressors that increased the soil-water component, leading to a further decline of streamflow in the 2000s.
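
    The evolving-parameter idea amounts to calibrating a separate parameter set for each LULC period rather than one set for the whole record. The sketch below illustrates this with a toy linear-reservoir model instead of SWAT; the forcing data, the two periods and the single parameter k are all hypothetical.

        import numpy as np
        from scipy.optimize import minimize_scalar

        def linear_reservoir(precip, k):
            """Toy runoff model: storage drained at fraction k per day (not SWAT)."""
            storage, q = 0.0, np.zeros_like(precip)
            for t, p in enumerate(precip):
                storage += p
                q[t] = k * storage
                storage -= q[t]
            return q

        def calibrate_period(precip, q_obs):
            """Find k minimising the sum of squared errors for one LULC period."""
            sse = lambda k: np.sum((linear_reservoir(precip, k) - q_obs) ** 2)
            return minimize_scalar(sse, bounds=(0.01, 0.99), method="bounded").x

        # hypothetical forcing/observations split into two LULC periods
        rng = np.random.default_rng(1)
        precip = rng.gamma(0.6, 5.0, 7300)
        q_obs = linear_reservoir(precip, 0.3) + rng.normal(0, 0.2, 7300)
        periods = {"1990s": slice(0, 3650), "2000s": slice(3650, 7300)}

        evolving_k = {name: calibrate_period(precip[s], q_obs[s]) for name, s in periods.items()}
        print(evolving_k)   # one parameter set per LULC period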

  11. Interaction between stream temperature, streamflow, and groundwater exchanges in alpine streams

    Science.gov (United States)

    Constantz, James E.

    1998-01-01

    Four alpine streams were monitored to continuously collect stream temperature and streamflow for periods ranging from a week to a year. In a small stream in the Colorado Rockies, diurnal variations in both stream temperature and streamflow were significantly greater in losing reaches than in gaining reaches, with minimum streamflow losses occurring early in the day and maximum losses occurring early in the evening. Using measured stream temperature changes, diurnal streambed infiltration rates were predicted to increase as much as 35% during the day (based on a heat and water transport groundwater model), while the measured increase in streamflow loss was 40%. For two large streams in the Sierra Nevada Mountains, annual stream temperature variations ranged from 0° to 25°C. In summer months, diurnal stream temperature variations were 30–40% of annual stream temperature variations, owing to reduced streamflows and increased atmospheric heating. Previous reports document that one Sierra stream site generally gains groundwater during low flows, while the second Sierra stream site may lose water during low flows. For August the diurnal streamflow variation was 11% at the gaining stream site and 30% at the losing stream site. On the basis of measured diurnal stream temperature variations, streambed infiltration rates were predicted to vary diurnally as much as 20% at the losing stream site. Analysis of results suggests that evapotranspiration losses determined diurnal streamflow variations in the gaining reaches, while in the losing reaches, evapotranspiration losses were compounded by diurnal variations in streambed infiltration. Diurnal variations in stream temperature were reduced in the gaining reaches as a result of discharging groundwater of relatively constant temperature. For the Sierra sites, comparison of results with those from a small tributary demonstrated that stream temperature patterns were useful in delineating discharges of bank storage following

  12. Multi-site calibration, validation, and sensitivity analysis of the MIKE SHE Model for a large watershed in northern China

    Directory of Open Access Journals (Sweden)

    S. Wang

    2012-12-01

    Model calibration is essential for hydrologic modeling of large watersheds in a heterogeneous mountain environment. Little guidance is available on model calibration protocols for distributed models that aim at capturing the spatial variability of hydrologic processes. This study used the physically based distributed hydrologic model MIKE SHE to contrast a lumped calibration protocol, which used streamflow measured at one single watershed outlet, with a multi-site calibration method, which employed streamflow measurements at three stations within the large Chaohe River basin in northern China. Simulation results showed that the single-site calibrated model was able to sufficiently simulate the hydrographs for two of the three stations (Nash-Sutcliffe coefficient of 0.65–0.75 and correlation coefficient of 0.81–0.87 during the testing period), but the model performed poorly for the third station (Nash-Sutcliffe coefficient of only 0.44). Sensitivity analysis suggested that streamflow in the upstream area of the watershed was dominated by slow groundwater, whilst streamflow in the middle- and downstream areas was dominated by relatively quick interflow. Therefore, a multi-site calibration protocol was deemed necessary. Due to the potential errors and uncertainties with respect to the representation of spatial variability, performance measures from the multi-site calibration protocol slightly decreased for two of the three stations, whereas they improved greatly for the third station. We concluded that the multi-site calibration protocol reached a compromise in terms of model performance for the three stations, reasonably representing the hydrographs of all three stations with Nash-Sutcliffe coefficients ranging from 0.59 to 0.72. The multi-site calibration protocol applied in the analysis generally has advantages over the single-site calibration protocol.
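
    A multi-site protocol ultimately needs a way to score the model against all gauges at once, for example a (weighted) average of per-station Nash-Sutcliffe efficiencies. The sketch below illustrates that aggregation; the station names and the synthetic flows are hypothetical and not the Chaohe data or the MIKE SHE objective function.

        import numpy as np

        def nse(sim, obs):
            """Nash-Sutcliffe efficiency for one station."""
            sim, obs = np.asarray(sim, float), np.asarray(obs, float)
            return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

        def multi_site_objective(sim_by_station, obs_by_station, weights=None):
            """Aggregate NSE over several gauging stations (equal weights by default)."""
            stations = list(obs_by_station)
            w = weights or {s: 1.0 / len(stations) for s in stations}
            return sum(w[s] * nse(sim_by_station[s], obs_by_station[s]) for s in stations)

        # hypothetical simulated/observed monthly flows at three stations
        rng = np.random.default_rng(2)
        obs = {s: rng.gamma(2.0, 30.0, 120) for s in ("upstream", "midstream", "outlet")}
        sim = {s: q + rng.normal(0, 10, 120) for s, q in obs.items()}
        print({s: round(nse(sim[s], obs[s]), 2) for s in obs},
              round(multi_site_objective(sim, obs), 2))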

  13. Development and Validation of a 3-Dimensional CFB Furnace Model

    Science.gov (United States)

    Vepsäläinen, Ari; Myöhänen, Kari; Hyppänen, Timo; Leino, Timo; Tourunen, Antti

    At Foster Wheeler, a three-dimensional CFB furnace model is an essential part of knowledge development for the CFB furnace process regarding solid mixing, combustion, emission formation and heat transfer. Results of laboratory- and pilot-scale phenomenon research are utilized in the development of sub-models. Analyses of field-test results in industrial-scale CFB boilers, including furnace profile measurements, are carried out simultaneously with the development of the three-dimensional process model, which provides a chain of knowledge that is utilized as feedback for phenomenon research. Knowledge gathered from model validation studies and up-to-date parameter databases is utilized in performance prediction and design development of CFB boiler furnaces. This paper reports recent development steps related to the modeling of combustion and of the formation of char and volatiles for various fuel types in CFB conditions. A new model for predicting the formation of nitrogen oxides is also presented. Validation of mixing and combustion parameters for solids and gases is based on test balances at several large-scale CFB boilers combusting coal, peat and biofuels. Field tests, including lateral and vertical furnace profile measurements and characterization of solid materials, provide a window into fuel-specific mixing and combustion behavior in the CFB furnace at different loads and operating conditions. Measured horizontal gas profiles are a projection of the balance between fuel mixing and reactions in the lower part of the furnace and are used, together with lateral temperature profiles at the bed and in the upper parts of the furnace, to determine solid mixing and combustion model parameters. Modeling of char- and volatile-based NO formation is followed by analysis of the oxidizing and reducing regions formed by the lower-furnace design and the mixing characteristics of fuel and combustion air, which affect the NO furnace profile through reduction and volatile-nitrogen reactions. This paper presents

  14. Validation of thermal models for a prototypical MEMS thermal actuator.

    Energy Technology Data Exchange (ETDEWEB)

    Gallis, Michail A.; Torczynski, John Robert; Piekos, Edward Stanley; Serrano, Justin Raymond; Gorby, Allen D.; Phinney, Leslie Mary

    2008-09-01

    This report documents technical work performed to complete the ASC Level 2 Milestone 2841: validation of thermal models for a prototypical MEMS thermal actuator. This effort requires completion of the following task: the comparison between calculated and measured temperature profiles of a heated stationary microbeam in air. Such heated microbeams are prototypical structures in virtually all electrically driven microscale thermal actuators. This task is divided into four major subtasks. (1) Perform validation experiments on prototypical heated stationary microbeams in which material properties such as thermal conductivity and electrical resistivity are measured if not known and temperature profiles along the beams are measured as a function of electrical power and gas pressure. (2) Develop a noncontinuum gas-phase heat-transfer model for typical MEMS situations including effects such as temperature discontinuities at gas-solid interfaces across which heat is flowing, and incorporate this model into the ASC FEM heat-conduction code Calore to enable it to simulate these effects with good accuracy. (3) Develop a noncontinuum solid-phase heat transfer model for typical MEMS situations including an effective thermal conductivity that depends on device geometry and grain size, and incorporate this model into the FEM heat-conduction code Calore to enable it to simulate these effects with good accuracy. (4) Perform combined gas-solid heat-transfer simulations using Calore with these models for the experimentally investigated devices, and compare simulation and experimental temperature profiles to assess model accuracy. These subtasks have been completed successfully, thereby completing the milestone task. Model and experimental temperature profiles are found to be in reasonable agreement for all cases examined. Modest systematic differences appear to be related to uncertainties in the geometric dimensions of the test structures and in the thermal conductivity of the

  15. Proceedings of the first SRL model validation workshop

    International Nuclear Information System (INIS)

    Buckner, M.R.

    1981-10-01

    The Clean Air Act and its amendments have added importance to knowing the accuracy of mathematical models used to assess transport and diffusion of environmental pollutants. These models are the link between air quality standards and emissions. To test the accuracy of a number of these models, a Model Validation Workshop was held. The meteorological, source-term, and Kr-85 concentration databases for emissions from the separations areas of the Savannah River Plant during 1975 through 1977 were used to compare calculations from various atmospheric dispersion models. The results of statistical evaluation of the models show a degradation in the ability to predict pollutant concentrations as the time span over which the calculations are made is reduced. Forecasts for annual time periods were reasonably accurate. Weighted-average squared correlation coefficients (R^2) were 0.74 for annual, 0.28 for monthly, 0.21 for weekly, and 0.18 for twice-daily predictions. Model performance varied within each of these four categories; however, the results indicate that the more complex, three-dimensional models provide only marginal increases in accuracy. The increased costs of running these codes are not warranted for long-term releases or for conditions of relatively simple terrain and meteorology. The overriding factor in the calculational accuracy is the accurate description of the wind field. Further improvements of the numerical accuracy of the complex models are not nearly as important as accurate calculations of the meteorological transport conditions

  16. Towards reliable seasonal ensemble streamflow forecasts for ephemeral rivers

    Science.gov (United States)

    Bennett, James; Wang, Qj; Li, Ming; Robertson, David

    2016-04-01

    Despite their inherently variable nature, ephemeral rivers are an important water resource in many dry regions. Water managers are likely to benefit considerably from even mildly skilful ensemble forecasts of streamflow in ephemeral rivers. As with any ensemble forecast, forecast uncertainty - i.e., the spread of the ensemble - must be reliably quantified to allow users of the forecasts to make well-founded decisions. Correctly quantifying uncertainty in ephemeral rivers is particularly challenging because of the high incidence of zero flows, which are difficult to handle with conventional statistical techniques. Here we apply a seasonal streamflow forecasting system, the model for generating Forecast Guided Stochastic Scenarios (FoGSS), to 26 Australian ephemeral rivers. FoGSS uses post-processed ensemble rainfall forecasts from a coupled ocean-atmosphere prediction system to force an initialised monthly rainfall-runoff model, and then applies a staged hydrological error model to describe and propagate hydrological uncertainty in the forecast. FoGSS produces 12-month streamflow forecasts; as forecast skill declines with lead time, the forecasts are designed to transition seamlessly to stochastic scenarios. The ensemble rainfall forecasts used in FoGSS are known to be unbiased and reliable, and we concentrate here on the hydrological error model. The FoGSS error model has several features that make it well suited to forecasting ephemeral rivers. First, FoGSS models the error after the data are transformed with a log-sinh transformation. The log-sinh transformation is able to normalise even highly skewed data and homogenise its variance, allowing us to assume that errors are Gaussian. Second, FoGSS handles zero values using data censoring. Data censoring allows streamflow in ephemeral rivers to be treated as a continuous variable, rather than having to model the occurrence of non-zero values and the distribution of non-zero values separately. This greatly simplifies parameter
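
    A minimal sketch of the two ingredients named above follows: one common form of the log-sinh transform and a Gaussian log-likelihood with left-censoring at the transformed zero-flow value. The parameters a and b and the toy flow values are illustrative; in FoGSS they would be estimated jointly with the error model.

        import numpy as np
        from scipy import stats

        def log_sinh(q, a, b):
            """Log-sinh transform (one common form): z = log(sinh(a + b*q)) / b."""
            return np.log(np.sinh(a + b * np.asarray(q, float))) / b

        def inv_log_sinh(z, a, b):
            """Back-transform: q = (arcsinh(exp(b*z)) - a) / b."""
            return (np.arcsinh(np.exp(b * np.asarray(z, float))) - a) / b

        def censored_loglik(z, mu, sigma, z_censor):
            """Gaussian log-likelihood with left-censoring at z_censor (zero flows)."""
            z = np.asarray(z, float)
            censored = z <= z_censor
            ll = stats.norm.logpdf(z[~censored], mu, sigma).sum()
            ll += stats.norm.logcdf(z_censor, mu, sigma) * censored.sum()
            return ll

        # illustrative monthly flows, including zeros treated as censored values
        q = np.array([0.0, 0.0, 1.2, 8.5, 30.0])
        a, b = 0.01, 0.1                      # hypothetical transform parameters
        z = log_sinh(q, a, b)
        print(censored_loglik(z, mu=z.mean(), sigma=z.std(ddof=1),
                              z_censor=log_sinh(0.0, a, b)))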

  17. Validation, Optimization and Simulation of a Solar Thermoelectric Generator Model

    Science.gov (United States)

    Madkhali, Hadi Ali; Hamil, Ali; Lee, HoSung

    2017-12-01

    This study explores thermoelectrics as a viable option for small-scale solar thermal applications. Thermoelectric technology is based on the Seebeck effect, which states that a voltage is induced when a temperature gradient is applied to the junctions of two differing materials. This research proposes to analyze, validate, simulate, and optimize a prototype solar thermoelectric generator (STEG) model in order to increase efficiency. The intent is to further develop STEGs as a viable and productive energy source that limits pollution and reduces the cost of energy production. An empirical study (Kraemer et al. in Nat Mater 10:532, 2011) on the solar thermoelectric generator reported a high efficiency of 4.6%. The system had a vacuum glass enclosure, a flat panel (absorber), a thermoelectric generator and water circulation for the cold side. The theoretical and numerical approach of the current study reproduced the experimental results from Kraemer's study to a high degree. The numerical simulation process utilizes a two-stage approach in ANSYS software for Fluent and Thermal-Electric Systems. The solar load model technique uses solar radiation under AM 1.5G conditions in Fluent. This analytical model applies Dr. Ho Sung Lee's theory of optimal design to improve the performance of the STEG system by using dimensionless parameters. Applying this theory, using two cover glasses and radiation shields, the STEG model can achieve a maximum efficiency of 7%.
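
    For orientation, the textbook expression for the maximum conversion efficiency of a thermoelectric generator (Carnot factor times a figure-of-merit term) can be evaluated as below. The operating temperatures and ZT value are illustrative only and are not the conditions or materials of the study above.

        import numpy as np

        def teg_max_efficiency(t_hot, t_cold, zt):
            """Maximum thermoelectric generator efficiency for an average figure of merit ZT."""
            carnot = 1.0 - t_cold / t_hot
            m = np.sqrt(1.0 + zt)
            return carnot * (m - 1.0) / (m + t_cold / t_hot)

        # illustrative operating point (temperatures in kelvin)
        print(teg_max_efficiency(t_hot=473.15, t_cold=293.15, zt=1.0))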

  18. Simulation of groundwater conditions and streamflow depletion to evaluate water availability in a Freeport, Maine, watershed

    Science.gov (United States)

    Nielsen, Martha G.; Locke, Daniel B.

    2012-01-01

    In order to evaluate water availability in the State of Maine, the U.S. Geological Survey (USGS) and the Maine Geological Survey began a cooperative investigation to provide the first rigorous evaluation of watersheds deemed "at risk" because of the combination of instream flow requirements and proportionally large water withdrawals. The study area for this investigation includes the Harvey and Merrill Brook watersheds and the Freeport aquifer in the towns of Freeport, Pownal, and Yarmouth, Maine. A numerical groundwater-flow model was used to evaluate groundwater withdrawals, groundwater-surface-water interactions, and the effect of water-management practices on streamflow. The water budget illustrates the effect that groundwater withdrawals have on streamflow and the movement of water within the system. Streamflow measurements were made following standard USGS techniques, from May through September 2009, at one site in the Merrill Brook watershed and four sites in the Harvey Brook watershed. A record-extension technique was applied to estimate long-term monthly streamflows at each of the five sites. The conceptual model of the groundwater system consists of a deep, confined aquifer (the Freeport aquifer) in a buried valley that trends through the middle of the study area, covered by a discontinuous confining unit, and topped by a thin upper saturated zone that is a mixture of sandy units, till, and weathered clay. Harvey and Merrill Brooks flow southward through the study area, and receive groundwater discharge from the upper saturated zone and from the deep aquifer through previously unknown discontinuities in the confining unit. The Freeport aquifer gets most of its recharge from local seepage around the edges of the confining unit; the remainder is received as inflow from the north within the buried valley. Groundwater withdrawals from the Freeport aquifer in the study area were obtained from the local water utility and estimated for other categories. Overall

  19. On the Value of Climate Elasticity Indices to Assess the Impact of Climate Change on Streamflow Projection using an ensemble of bias corrected CMIP5 dataset

    Science.gov (United States)

    Demirel, Mehmet; Moradkhani, Hamid

    2015-04-01

    Changes in two climate elasticity indices, i.e. the temperature and precipitation elasticity of streamflow, were investigated using an ensemble of bias-corrected CMIP5 data as forcing for two hydrologic models. The Variable Infiltration Capacity (VIC) and the Sacramento Soil Moisture Accounting (SAC-SMA) hydrologic models were calibrated at 1/16 degree resolution and the simulated streamflow was routed to the basin outlet of interest. We estimated the precipitation and temperature elasticity of streamflow from: (1) observed streamflow; (2) streamflow simulated by the VIC and SAC-SMA models using observed climate for the current climate (1963-2003); (3) streamflow simulated using climate from 10 GCMs of the CMIP5 dataset for the future climate (2010-2099), including two concentration pathways (RCP4.5 and RCP8.5) and two downscaled climate products (BCSD and MACA). The streamflow sensitivity to long-term (e.g., 30-year) average annual changes in temperature and precipitation is estimated for three periods, i.e. 2010-40, 2040-70 and 2070-99. We compared the results of the three cases to reflect on the value of precipitation and temperature indices for assessing the climate change impacts on Columbia River streamflow. Moreover, these three cases for the two models are used to assess the effects of different uncertainty sources (model forcing, model structure and different pathways) on the two climate elasticity indices.
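
    One common nonparametric way to estimate the precipitation elasticity of streamflow from paired annual series (not necessarily the estimator used in this study) is the median of the year-by-year ratio of relative anomalies, sketched below with synthetic data.

        import numpy as np

        def precipitation_elasticity(q, p):
            """Nonparametric precipitation elasticity of streamflow:
            median of (dQ/Q) / (dP/P) relative to the long-term means."""
            q, p = np.asarray(q, float), np.asarray(p, float)
            return np.median((q - q.mean()) / (p - p.mean()) * p.mean() / q.mean())

        # hypothetical annual series (mm/yr); an elasticity of ~2 means a 1%
        # precipitation change translates into roughly a 2% streamflow change
        rng = np.random.default_rng(3)
        p = rng.normal(800, 120, 40)
        q = 0.002 * p ** 2 + rng.normal(0, 30, 40)   # synthetic nonlinear response
        print(round(precipitation_elasticity(q, p), 2))

    The same construction with temperature anomalies in the denominator gives a temperature elasticity (or sensitivity) index.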

  20. CrowdWater - Can people observe what models need?

    Science.gov (United States)

    van Meerveld, I. H. J.; Seibert, J.; Vis, M.; Etter, S.; Strobl, B.

    2017-12-01

    CrowdWater (www.crowdwater.ch) is a citizen science project that explores the usefulness of crowd-sourced data for hydrological model calibration and prediction. Hydrological models are usually calibrated based on observed streamflow data but it is likely easier for people to estimate relative stream water levels, such as the water level above or below a rock, than streamflow. Relative stream water levels may, therefore, be a more suitable variable for citizen science projects than streamflow. In order to test this assumption, we held surveys near seven different sized rivers in Switzerland and asked more than 450 volunteers to estimate the water level class based on a picture with a virtual staff gauge. The results show that people can generally estimate the relative water level well, although there were also a few outliers. We also asked the volunteers to estimate streamflow based on the stick method. The median estimated streamflow was close to the observed streamflow but the spread in the streamflow estimates was large and there were very large outliers, suggesting that crowd-based streamflow data is highly uncertain. In order to determine the potential value of water level class data for model calibration, we converted streamflow time series for 100 catchments in the US to stream level class time series and used these to calibrate the HBV model. The model was then validated using the streamflow data. The results of this modeling exercise show that stream level class data are useful for constraining a simple runoff model. Time series of only two stream level classes, e.g. above or below a rock in the stream, were already informative, especially when the class boundary was chosen towards the highest stream levels. There was hardly any improvement in model performance when more than five water level classes were used. This suggests that if crowd-sourced stream level observations are available for otherwise ungauged catchments, these data can be used to constrain
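
    The conversion of a streamflow series into ordinal water-level classes can be done with a small set of class boundaries, for example placed preferentially towards high flows as the record suggests. The sketch below is one way to do this; the quantile-based boundaries and the synthetic series are illustrative, not the scheme used in the study.

        import numpy as np

        def to_level_classes(q, n_classes=5, upper_quantile=0.9):
            """Map a streamflow series onto ordinal water-level classes using
            quantile-based class boundaries (boundaries here are illustrative)."""
            q = np.asarray(q, float)
            # put the boundaries toward higher flows, as suggested in the study
            probs = np.linspace(0.5, upper_quantile, n_classes - 1)
            bounds = np.quantile(q, probs)
            return np.digitize(q, bounds)   # classes 0 .. n_classes-1

        rng = np.random.default_rng(4)
        q = np.exp(rng.normal(0.0, 1.0, 365))   # synthetic daily streamflow
        classes = to_level_classes(q)
        print(np.bincount(classes))

    A limits-of-acceptability or class-agreement score between simulated and "observed" classes can then replace a conventional streamflow objective function during calibration.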

  1. Contaminant transport model validation: The Oak Ridge Reservation

    International Nuclear Information System (INIS)

    Lee, R.R.; Ketelle, R.H.

    1988-09-01

    In the complex geologic setting on the Oak Ridge Reservation, hydraulic conductivity is anisotropic and flow is strongly influenced by an extensive and largely discontinuous fracture network. Difficulties in describing and modeling the aquifer system prompted a study to obtain aquifer property data to be used in a groundwater flow model validation experiment. Characterization studies included the performance of an extensive suite of aquifer tests within a 600-square-meter area to obtain aquifer property values to describe the flow field in detail. Following aquifer testing, a groundwater tracer test was performed under ambient conditions to verify the aquifer analysis. Tracer migration data in the near-field were used in model calibration to predict tracer arrival time and concentration in the far-field. Despite the extensive aquifer testing, initial modeling inaccurately predicted tracer migration direction. Initial tracer migration rates were consistent with those predicted by the model; however, changing environmental conditions resulted in an unanticipated decay in tracer movement. Evaluation of the predictive accuracy of groundwater flow and contaminant transport models on the Oak Ridge Reservation depends on defining the resolution required, followed by field testing and model grid definition at compatible scales. The use of tracer tests, both as a characterization method and to verify model results, provides the highest level of resolution of groundwater flow characteristics. 3 refs., 4 figs

  2. Assimilating in situ and radar altimetry data into a large-scale hydrologic-hydrodynamic model for streamflow forecast in the Amazon

    Directory of Open Access Journals (Sweden)

    R. C. D. Paiva

    2013-07-01

    In this work, we introduce and evaluate a data assimilation framework for gauged and radar altimetry-based discharge and water levels applied to a large-scale hydrologic-hydrodynamic model for streamflow forecasts over the Amazon River basin. We used the process-based hydrological model MGB-IPH coupled with a river hydrodynamic module using a storage model for floodplains. The Ensemble Kalman Filter (EnKF) technique was used to assimilate information from hundreds of gauging and altimetry stations based on ENVISAT satellite data. Model state-variable errors were generated by corrupting the precipitation forcing, considering log-normally distributed, temporally and spatially correlated errors. The EnKF performed well when assimilating in situ discharge, improving model estimates at the assimilation sites (change in root-mean-squared error Δrms = −49%) and also transferring information to ungauged river reaches (Δrms = −16%). Altimetry data assimilation improves results, in terms of water levels (Δrms = −44%) and discharges (Δrms = −15%), to a minor degree, mostly close to altimetry sites and on a daily basis, even though radar altimetry data have a low temporal resolution. Sensitivity tests highlighted the importance of the magnitude of the precipitation errors and that of their spatial correlation, while temporal correlation proved to be dispensable. The deterioration of model performance at some unmonitored reaches indicates the need for proper characterisation of model errors and spatial localisation techniques for hydrological applications. Finally, we evaluated streamflow forecasts for the Amazon basin based on initial conditions produced by the data assimilation scheme and using the ensemble streamflow prediction approach, where the model is forced by past meteorological forcings. The resulting forecasts agreed well with the observations and maintained meaningful skill at large rivers even for long lead times, e.g. >90 days at the Solim
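
    The core of the scheme above is the EnKF analysis step, which nudges every ensemble member towards the gauged or altimetry observations through the ensemble covariances. A minimal stochastic (perturbed-observation) EnKF update is sketched below; the state dimension, observation operator and numbers are hypothetical and there is no localisation, which the record notes would be needed in practice.

        import numpy as np

        def enkf_update(ensemble, obs, obs_operator, obs_error_std, rng):
            """Stochastic (perturbed-observation) EnKF analysis step.
            ensemble: (n_state, n_members) matrix of model states."""
            n_state, n_members = ensemble.shape
            hx = obs_operator @ ensemble                        # (n_obs, n_members)
            x_anom = ensemble - ensemble.mean(axis=1, keepdims=True)
            hx_anom = hx - hx.mean(axis=1, keepdims=True)
            p_xy = x_anom @ hx_anom.T / (n_members - 1)         # cross covariance
            p_yy = (hx_anom @ hx_anom.T / (n_members - 1)
                    + np.diag(np.full(len(obs), obs_error_std ** 2)))
            gain = p_xy @ np.linalg.inv(p_yy)
            perturbed = obs[:, None] + rng.normal(0, obs_error_std, (len(obs), n_members))
            return ensemble + gain @ (perturbed - hx)

        # hypothetical example: 10 river reaches, 2 gauged, 50 ensemble members
        rng = np.random.default_rng(5)
        ens = rng.normal(100, 20, (10, 50))
        H = np.zeros((2, 10)); H[0, 2] = 1.0; H[1, 7] = 1.0     # gauges on reaches 2 and 7
        analysis = enkf_update(ens, obs=np.array([110.0, 95.0]), obs_operator=H,
                               obs_error_std=5.0, rng=rng)
        print(analysis.shape)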

  3. Multi-Site Validation of the SWAT Model on the Bani Catchment: Model Performance and Predictive Uncertainty

    Directory of Open Access Journals (Sweden)

    Jamilatou Chaibou Begou

    2016-04-01

    The objective of this study was to assess the performance and predictive uncertainty of the Soil and Water Assessment Tool (SWAT) model on the Bani River Basin, at catchment and subcatchment levels. The SWAT model was calibrated using the Generalized Likelihood Uncertainty Estimation (GLUE) approach. Potential evapotranspiration (PET) and biomass were considered in the verification of the accuracy of model outputs. Global Sensitivity Analysis (GSA) was used to identify important model parameters. Results indicated a good performance of the global model at daily as well as monthly time steps, with adequate predictive uncertainty. PET was found to be overestimated, but biomass was better predicted in agricultural land and forest. Surface runoff represents the dominant process in streamflow generation in that region. Individual calibration at the subcatchment scale yielded better performance than when the global parameter sets were applied. These results are very useful and provide support for further studies on regionalization to make predictions in ungauged basins.
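
    In outline, GLUE samples parameter sets at random, scores each simulation with an informal likelihood (often NSE), keeps the "behavioral" sets above a threshold, and reports prediction bounds from the retained simulations. The sketch below follows that recipe with a toy simulator; the threshold, parameter ranges and the unweighted quantile bounds (GLUE proper weights the bounds by likelihood) are simplifying assumptions, not the study's setup.

        import numpy as np

        def glue(simulator, obs, param_ranges, n_samples=2000, nse_threshold=0.5, rng=None):
            """Minimal GLUE: Monte Carlo sampling, NSE as informal likelihood,
            behavioral sets above a threshold, 5-95% prediction bounds."""
            rng = rng or np.random.default_rng()
            sims, likes = [], []
            for _ in range(n_samples):
                params = {k: rng.uniform(*param_ranges[k]) for k in param_ranges}
                sim = simulator(params)
                nse = 1 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)
                if nse > nse_threshold:          # behavioral parameter set
                    sims.append(sim)
                    likes.append(nse)
            sims = np.array(sims)
            weights = np.array(likes) / np.sum(likes)   # likelihood weights
            lower = np.quantile(sims, 0.05, axis=0)     # unweighted bounds for brevity
            upper = np.quantile(sims, 0.95, axis=0)
            return lower, upper, weights

        # toy "model" and observations for illustration only
        obs = np.sin(np.linspace(0, 6, 100)) + 1.5
        toy = lambda p: p["a"] * np.sin(np.linspace(0, 6, 100)) + p["b"]
        lo, hi, w = glue(toy, obs, {"a": (0.5, 1.5), "b": (1.0, 2.0)},
                         rng=np.random.default_rng(6))
        print(lo[:3], hi[:3])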

  4. Recent tree die-off has little effect on streamflow in contrast to expected increases from historical studies

    Science.gov (United States)

    Biederman, Joel A.; Somor, Andrew J.; Harpold, Adrian A.; Gutmann, Ethan D.; Breshears, David D.; Troch, Peter A.; Gochis, David J.; Scott, Russell L.; Meddens, Arjan J. H.; Brooks, Paul D.

    2015-12-01

    Recent bark beetle epidemics have caused regional-scale tree mortality in many snowmelt-dominated headwater catchments of western North America. Initial expectations of increased streamflow have not been supported by observations, and the basin-scale response of annual streamflow is largely unknown. Here we quantified annual streamflow responses during the decade following tree die-off in eight infested catchments in the Colorado River headwaters and one nearby control catchment. We employed three alternative empirical methods: (i) double-mass comparison between impacted and control catchments, (ii) runoff ratio comparison before and after die-off, and (iii) time-trend analysis using climate-driven linear models. In contrast to streamflow increases predicted by historical paired catchment studies and recent modeling, we did not detect streamflow changes in most basins following die-off, while one basin consistently showed decreased streamflow. The three analysis methods produced generally consistent results, with time-trend analysis showing precipitation was the strongest predictor of streamflow variability (R^2 = 74-96%). Time-trend analysis revealed post-die-off streamflow decreased in three catchments by 11-29%, with no change in the other five catchments. Although counter to initial expectations, these results are consistent with increased transpiration by surviving vegetation and the growing body of literature documenting increased snow sublimation and evaporation from the subcanopy following die-off in water-limited, snow-dominated forests. The observations presented here challenge the widespread expectation that streamflow will increase following beetle-induced forest die-off and highlight the need to better understand the processes driving hydrologic response to forest disturbance.
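
    The time-trend approach amounts to regressing annual streamflow on climate (here just precipitation) plus a time term, and interpreting the residual trend as the disturbance signal. The sketch below is one plausible, simplified specification with synthetic data; it is not the exact model form of the study.

        import numpy as np

        def time_trend_model(q_annual, p_annual, years):
            """Climate-driven linear model with a trend term: Q = b0 + b1*P + b2*t,
            fitted by ordinary least squares."""
            X = np.column_stack([np.ones_like(p_annual), p_annual, years - years[0]])
            coef, *_ = np.linalg.lstsq(X, q_annual, rcond=None)
            residuals = q_annual - X @ coef
            r2 = 1 - residuals.var() / q_annual.var()
            return coef, r2

        rng = np.random.default_rng(7)
        years = np.arange(1995, 2015).astype(float)
        p = rng.normal(700, 100, years.size)                 # annual precipitation (mm)
        q = 0.4 * p - 20 + rng.normal(0, 15, years.size)     # synthetic annual streamflow
        coef, r2 = time_trend_model(q, p, years)
        print(np.round(coef, 2), round(r2, 2))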

  5. Climate and streamflow characteristics for selected streamgages in eastern South Dakota, water years 1945–2013

    Science.gov (United States)

    Hoogestraat, Galen K.; Stamm, John F.

    2015-11-02

    Upward trends in precipitation and streamflow have been observed in the northeastern Missouri River Basin during the past century, including the area of eastern South Dakota. Some of the identified upward trends were anomalously large relative to surrounding parts of the northern Great Plains. Forcing factors for streamflow trends in eastern South Dakota are not well understood, and it is not known whether streamflow trends are driven primarily by climatic changes or various land-use changes. Understanding the effects that climate (specifically precipitation and temperature) has on streamflow characteristics within a region will help to better understand additional factors such as land-use alterations that may affect the hydrology of the region. To aid in this understanding, a study was completed by the U.S. Geological Survey, in cooperation with the East Dakota Water Development District and James River Water Development District, to assess trends in climate and streamflow characteristics at 10 selected streamgages in eastern South Dakota for water years (WYs) 1945–2013 (69 years) and WYs 1980–2013 (34 years). A WY is the 12-month period, October 1 through September 30, and is designated by the calendar year in which it ends. One streamgage is on the Whetstone River, a tributary to the Minnesota River, and the other streamgages are in the James, Big Sioux, and Vermillion River Basins. The watersheds for two of the James River streamgages extend into North Dakota, and parts of the watersheds for two of the Big Sioux River streamgages extend into Minnesota and Iowa. The objectives of this study were to document trends in streamflow and precipitation in these watersheds, and characterize the residual streamflow variability that might be attributed to factors other than precipitation. Residuals were computed as the departure from a locally-weighted scatterplot smoothing (LOWESS) model. Significance of trends was based on the Mann-Kendall nonparametric test at a 0
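
    The Mann-Kendall test used for trend significance in this record is straightforward to compute from the rank-based S statistic and its normal approximation. The sketch below implements the basic version (no correction for ties or serial correlation, which the study may apply) on a synthetic upward-trending series.

        import numpy as np
        from scipy import stats

        def mann_kendall(x):
            """Mann-Kendall trend test (basic form, no tie or autocorrelation correction)."""
            x = np.asarray(x, float)
            n = len(x)
            s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
            var_s = n * (n - 1) * (2 * n + 5) / 18.0
            if s > 0:
                z = (s - 1) / np.sqrt(var_s)
            elif s < 0:
                z = (s + 1) / np.sqrt(var_s)
            else:
                z = 0.0
            p_value = 2 * (1 - stats.norm.cdf(abs(z)))
            return s, z, p_value

        rng = np.random.default_rng(8)
        flow = np.linspace(50, 80, 69) + rng.normal(0, 10, 69)   # 69 "water years"
        print(mann_kendall(flow))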

  6. Experimental Validation of a Dynamic Model for Lightweight Robots

    Directory of Open Access Journals (Sweden)

    Alessandro Gasparetto

    2013-03-01

    Nowadays, one of the main topics in robotics research is dynamic performance improvement by means of lightening the overall system structure. The effective motion and control of these lightweight robotic systems is achieved through suitable motion planning and control processes. In order to do so, model-based approaches can be adopted by exploiting accurate dynamic models that take into account the inertial and elastic terms that are usually neglected in a heavy rigid-link configuration. In this paper, an effective method for modelling spatial lightweight industrial robots based on an Equivalent Rigid Link System approach is considered from an experimental validation perspective. A dynamic simulator implementing the formulation is used and an experimental test bench is set up. Experimental tests are carried out with a benchmark L-shape mechanism.

  7. Experimental validation of the multiphase extended Leblond's model

    Science.gov (United States)

    Weisz-Patrault, Daniel

    2017-10-01

    Transformation-induced plasticity is a crucial contribution to the simulation of several forming processes involving phase transitions under mechanical loads, resulting in large irreversible strain even though the applied stress is below the yield stress. One of the most elegant and widely used models is based on analytic homogenization procedures and has been proposed by Leblond et al. [1-4]. Very recently, a simple extension of Leblond's model has been developed by Weisz-Patrault [8]. Several product phases are taken into account and several assumptions are relaxed in order to extend the applicability of the model. The present contribution compares experimental tests with numerical computations in order to discuss the validity of the developed theory. Thus, experimental results extracted from the existing literature are analyzed. Results show a good agreement between measurements and theoretical computations.

  8. Decreased Streamflow in the Yellow River Basin,  China: Climate Change or Human‐Induced?

    Directory of Open Access Journals (Sweden)

    Bin Li

    2017-02-01

    Decreased streamflow of the Yellow River basin has become the subject of considerable concern in recent years due to the critical importance of the water resources of the Yellow River basin for northern China. This study investigates the changing properties and underlying causes of the decreased streamflow by applying streamflow data for the period 1960 to 2014 to both the Budyko framework and hydrological modelling techniques. The results indicate that (1) streamflow decreased 21% during the period 1980–2000, and decreased 19% during the period 2000–2014, when compared to 1960–1979; (2) higher precipitation and relative humidity boost streamflow, while maximum/minimum air temperature, solar radiation, wind speed, and the underlying parameter, n, all have the potential to adversely affect it; (3) decreased streamflow is also linked to increased cropland, grass, reservoir, urban land, and water areas and other human activities associated with GDP and population; (4) human activity is the main reason for the decrease of streamflow in the Yellow River basin, with mean fractional contributions of 73.4% during 1980–2000 and 82.5% during 2001–2014. It is clear that the continuing growth of human-induced impacts on streamflow is likely to add considerable uncertainty to the management of increasingly scarce water resources. Overall, these results provide strong evidence to suggest that human activity is the key factor behind the decreased streamflow in the Yellow River basin.
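
    A Budyko-type attribution of this kind can be sketched by holding the catchment parameter n at its baseline value, predicting the climate-only streamflow change from the change in precipitation and potential evapotranspiration, and assigning the remainder of the observed change to human activity. The Choudhury-Yang form of the curve is used below; the mean annual numbers are purely illustrative, not the Yellow River values, and in practice n would first be calibrated to reproduce the baseline runoff.

        import numpy as np

        def budyko_runoff(p, pet, n):
            """Mean annual runoff from the Choudhury-Yang Budyko curve:
            E = P*PET / (P**n + PET**n)**(1/n),  Q = P - E."""
            e = p * pet / (p ** n + pet ** n) ** (1.0 / n)
            return p - e

        def attribute_change(p1, pet1, p2, pet2, q1_obs, q2_obs, n):
            """Split the observed streamflow change into climate- and human-induced parts."""
            dq_obs = q2_obs - q1_obs
            dq_climate = budyko_runoff(p2, pet2, n) - budyko_runoff(p1, pet1, n)
            dq_human = dq_obs - dq_climate
            return dq_climate, dq_human, dq_human / dq_obs

        # illustrative mean annual values (mm) for two periods
        dq_c, dq_h, frac_h = attribute_change(p1=470, pet1=1050, p2=450, pet2=1100,
                                              q1_obs=75, q2_obs=45, n=2.0)
        print(round(dq_c, 1), round(dq_h, 1), f"human share ~ {frac_h:.0%}")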

  9. Streamflow Simulations and Percolation Estimates Using the Soil and Water Assessment Tool for Selected Basins in North-Central Nebraska, 1940-2005

    Science.gov (United States)

    Strauch, Kellan R.; Linard, Joshua I.

    2009-01-01

    The U.S. Geological Survey, in cooperation with the Upper Elkhorn, Lower Elkhorn, Upper Loup, Lower Loup, Middle Niobrara, Lower Niobrara, Lewis and Clark, and Lower Platte North Natural Resources Districts, used the Soil and Water Assessment Tool to simulate streamflow and estimate percolation in north-central Nebraska to aid development of long-term strategies for management of hydrologically connected ground and surface water. Although groundwater models adequately simulate subsurface hydrologic processes, they often are not designed to simulate the hydrologically complex processes occurring at or near the land surface. The use of watershed models such as the Soil and Water Assessment Tool, which are designed specifically to simulate surface and near-subsurface processes, can provide helpful insight into the effects of surface-water hydrology on the groundwater system. The Soil and Water Assessment Tool was calibrated for five stream basins in the Elkhorn-Loup Groundwater Model study area in north-central Nebraska to obtain spatially variable estimates of percolation. Six watershed models were calibrated to recorded streamflow in each subbasin by modifying the adjustment parameters. The calibrated parameter sets were then used to simulate a validation period; the validation period was half of the total streamflow period of record with a minimum requirement of 10 years. If the statistical and water-balance results for the validation period were similar to those for the calibration period, a model was considered satisfactory. Statistical measures of each watershed model's performance were variable. These objective measures included the Nash-Sutcliffe measure of efficiency, the ratio of the root-mean-square error to the standard deviation of the measured data, and an estimate of bias. The model met performance criteria for the bias statistic, but failed to meet statistical adequacy criteria for the other two performance measures when evaluated at a monthly time
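
    The three objective measures named above (Nash-Sutcliffe efficiency, the ratio of root-mean-square error to the standard deviation of the observations, and bias) can be computed as follows; the synthetic monthly series are for illustration only, and the sign convention chosen for percent bias is noted in the comments.

        import numpy as np

        def performance_metrics(sim, obs):
            """Nash-Sutcliffe efficiency (NSE), RMSE-to-observation-std ratio (RSR),
            and percent bias (PBIAS) for a streamflow simulation."""
            sim, obs = np.asarray(sim, float), np.asarray(obs, float)
            err = sim - obs
            nse = 1 - np.sum(err ** 2) / np.sum((obs - obs.mean()) ** 2)
            rsr = np.sqrt(np.mean(err ** 2)) / obs.std(ddof=1)
            pbias = 100.0 * (obs - sim).sum() / obs.sum()   # positive = underestimation
            return {"NSE": nse, "RSR": rsr, "PBIAS": pbias}

        rng = np.random.default_rng(9)
        obs = rng.gamma(2.0, 15.0, 240)                  # 20 years of monthly flows
        sim = obs * 0.95 + rng.normal(0, 5, 240)
        print({k: round(v, 2) for k, v in performance_metrics(sim, obs).items()})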

  10. A validation study of a stochastic model of human interaction

    Science.gov (United States)

    Burchfield, Mitchel Talmadge

    The purpose of this dissertation is to validate a stochastic model of human interactions which is part of a developmentalism paradigm. Incorporating elements of ancient and contemporary philosophy and science, developmentalism defines human development as a progression of increasing competence and utilizes compatible theories of developmental psychology, cognitive psychology, educational psychology, social psychology, curriculum development, neurology, psychophysics, and physics. To validate a stochastic model of human interactions, the study addressed four research questions: (a) Does attitude vary over time? (b) What are the distributional assumptions underlying attitudes? (c) Does the stochastic model, N∫_{−∞}^{∞} φ(χ,τ)Ψ(τ) dτ, have utility for the study of attitudinal distributions and dynamics? (d) Are the Maxwell-Boltzmann, Fermi-Dirac, and Bose-Einstein theories applicable to human groups? Approximately 25,000 attitude observations were made using the Semantic Differential Scale. Positions of individuals varied over time and the logistic model predicted observed distributions with correlations between 0.98 and 1.0, with estimated standard errors significantly less than the magnitudes of the parameters. The results bring into question the applicability of Fisherian research designs (Fisher, 1922, 1928, 1938) for behavioral research, based on the apparent failure of two fundamental assumptions: the noninteractive nature of the objects being studied and the normal distribution of attributes. The findings indicate that individual belief structures are representable in terms of a psychological space which has the same or similar properties as physical space. The psychological space not only has dimension, but individuals interact by force equations similar to those described in theoretical physics models. Nonlinear regression techniques were used to estimate Fermi-Dirac parameters from the data. The model explained a high degree

  11. Simulation Methods and Validation Criteria for Modeling Cardiac Ventricular Electrophysiology.

    Science.gov (United States)

    Krishnamoorthi, Shankarjee; Perotti, Luigi E; Borgstrom, Nils P; Ajijola, Olujimi A; Frid, Anna; Ponnaluri, Aditya V; Weiss, James N; Qu, Zhilin; Klug, William S; Ennis, Daniel B; Garfinkel, Alan

    2014-01-01

    We describe a sequence of methods to produce a partial differential equation model of the electrical activation of the ventricles. In our framework, we incorporate the anatomy and cardiac microstructure obtained from magnetic resonance imaging and diffusion tensor imaging of a New Zealand White rabbit, the Purkinje structure and the Purkinje-muscle junctions, and an electrophysiologically accurate model of the ventricular myocytes and tissue, which includes transmural and apex-to-base gradients of action potential characteristics. We solve the electrophysiology governing equations using the finite element method and compute both a 6-lead precordial electrocardiogram (ECG) and the activation wavefronts over time. We are particularly concerned with the validation of the various methods used in our model and, in this regard, propose a series of validation criteria that we consider essential. These include producing a physiologically accurate ECG, a correct ventricular activation sequence, and the inducibility of ventricular fibrillation. Among other components, we conclude that a Purkinje geometry with a high density of Purkinje muscle junctions covering the right and left ventricular endocardial surfaces as well as transmural and apex-to-base gradients in action potential characteristics are necessary to produce ECGs and time activation plots that agree with physiological observations.

  12. Simulation Methods and Validation Criteria for Modeling Cardiac Ventricular Electrophysiology.

    Directory of Open Access Journals (Sweden)

    Shankarjee Krishnamoorthi

    We describe a sequence of methods to produce a partial differential equation model of the electrical activation of the ventricles. In our framework, we incorporate the anatomy and cardiac microstructure obtained from magnetic resonance imaging and diffusion tensor imaging of a New Zealand White rabbit, the Purkinje structure and the Purkinje-muscle junctions, and an electrophysiologically accurate model of the ventricular myocytes and tissue, which includes transmural and apex-to-base gradients of action potential characteristics. We solve the electrophysiology governing equations using the finite element method and compute both a 6-lead precordial electrocardiogram (ECG) and the activation wavefronts over time. We are particularly concerned with the validation of the various methods used in our model and, in this regard, propose a series of validation criteria that we consider essential. These include producing a physiologically accurate ECG, a correct ventricular activation sequence, and the inducibility of ventricular fibrillation. Among other components, we conclude that a Purkinje geometry with a high density of Purkinje muscle junctions covering the right and left ventricular endocardial surfaces as well as transmural and apex-to-base gradients in action potential characteristics are necessary to produce ECGs and time activation plots that agree with physiological observations.

  13. Lessons learned from recent geomagnetic disturbance model validation activities

    Science.gov (United States)

    Pulkkinen, A. A.; Welling, D. T.

    2017-12-01

    Due to concerns pertaining to geomagnetically induced current impact on ground-based infrastructure, there has been significantly elevated interest in applying models for local geomagnetic disturbance or "delta-B" predictions. Correspondingly, there has been an elevated need for testing the quality of the delta-B predictions generated by modern empirical and physics-based models. To address this need, community-wide activities were launched under the GEM Challenge framework, and one culmination of the activities was the validation and selection of models that were transitioned into operations at NOAA SWPC. The community-wide delta-B action is continued under the CCMC-facilitated International Forum for Space Weather Capabilities Assessment and its "Ground Magnetic Perturbations: dBdt, delta-B, GICs, FACs" working group. The new delta-B working group builds on the past experiences and expands the collaborations to cover the entire international space weather community. In this paper, we discuss the key lessons learned from the past delta-B validation exercises and lay out the path forward for building on that experience under the new delta-B working group.

  14. High Turbidity Solis Clear Sky Model: Development and Validation

    Directory of Open Access Journals (Sweden)

    Pierre Ineichen

    2018-03-01

    The Solis clear sky model is a spectral scheme based on radiative transfer calculations and the Lambert–Beer relation. Its broadband version is a simplified, fast analytical version; it is limited to broadband aerosol optical depths lower than 0.45, which is a weakness when applied in countries with very high turbidity such as China or India. In order to extend the use of the original simplified version of the model to high turbidity values, we developed a new version of the broadband Solis model based on radiative transfer calculations, valid for turbidity values up to 7, for the three components (global, beam, and diffuse) and for the four aerosol types defined by Shettle and Fenn. A validation against low-turbidity data acquired in Geneva shows slightly better results than the previous version. On data acquired at sites with higher turbidity, the bias stays within ±4% for the beam and global irradiances, and the standard deviation is around 5% for clean and stable conditions and around 12% for questionable data and variable sky conditions.

  15. Trailing Edge Noise Model Validation and Application to Airfoil Optimization

    DEFF Research Database (Denmark)

    Bertagnolio, Franck; Aagaard Madsen, Helge; Bak, Christian

    2010-01-01

    The aim of this article is twofold. First, an existing trailing edge noise model is validated by comparing with airfoil surface pressure fluctuations and far field sound pressure levels measured in three different experiments. The agreement is satisfactory in one case but poor in two other cases...... noise emission, trying at the same time to preserve some of its aerodynamic and geometric characteristics. The new designs are characterized by less cambered airfoils and flatter suction sides. The resulting noise reductions seem to be mainly achieved by a reduction in the turbulent kinetic energy...

  16. Validating modeled turbulent heat fluxes across large freshwater surfaces

    Science.gov (United States)

    Lofgren, B. M.; Fujisaki-Manome, A.; Gronewold, A.; Anderson, E. J.; Fitzpatrick, L.; Blanken, P.; Spence, C.; Lenters, J. D.; Xiao, C.; Charusambot, U.

    2017-12-01

    Turbulent fluxes of latent and sensible heat are important physical processes that influence the energy and water budgets of the Great Lakes. Validation and improvement of bulk flux algorithms to simulate these turbulent heat fluxes are critical for accurate prediction of hydrodynamics, water levels, weather, and climate over the region. Here we consider five heat flux algorithms from several model systems: the Finite-Volume Community Ocean Model, the Weather Research and Forecasting model, and the Large Lake Thermodynamics Model, which are used in research and operational environments, concentrate on different aspects of the Great Lakes' physical system, but interface at the lake surface. The heat flux algorithms were isolated from each model and driven by meteorological data from over-lake stations in the Great Lakes Evaporation Network. The simulation results were compared with eddy covariance flux measurements at the same stations. All models show the capacity to reproduce the seasonal cycle of the turbulent heat fluxes. Overall, the Coupled Ocean Atmosphere Response Experiment algorithm in FVCOM has the best agreement with eddy covariance measurements. Simulations with the other four algorithms are overall improved by updating the parameterization of the roughness length scales of temperature and humidity. Agreement between modelled and observed fluxes varied notably with the geographical locations of the stations. For example, at the Long Point station in Lake Erie, observed fluxes are likely influenced by the upwind land surface, while the simulations do not take the land surface influence into account, and therefore the agreement there is generally worse.
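
    At their core, all of these bulk algorithms evaluate the same aerodynamic formulas for sensible and latent heat flux; they differ mainly in how the transfer coefficients depend on stability and roughness. The sketch below uses fixed, illustrative transfer coefficients and conditions, so it is only a caricature of schemes such as COARE, not a reproduction of any of the five algorithms.

        import numpy as np

        RHO_AIR = 1.2        # kg m-3, near-surface air density (illustrative)
        CP_AIR = 1005.0      # J kg-1 K-1, specific heat of air
        LV = 2.5e6           # J kg-1, latent heat of vaporization

        def bulk_fluxes(wind, t_sfc, t_air, q_sfc, q_air, ch=1.3e-3, ce=1.5e-3):
            """Bulk aerodynamic sensible (H) and latent (LE) heat fluxes in W m-2.
            ch/ce are illustrative constants; real algorithms make them functions
            of atmospheric stability and roughness length scales."""
            h = RHO_AIR * CP_AIR * ch * wind * (t_sfc - t_air)
            le = RHO_AIR * LV * ce * wind * (q_sfc - q_air)
            return h, le

        # hypothetical over-lake conditions: 8 m/s wind, lake 4 K warmer than air
        print(bulk_fluxes(wind=8.0, t_sfc=288.0, t_air=284.0, q_sfc=0.010, q_air=0.007))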

  17. Discrete fracture modelling for the Stripa tracer validation experiment predictions

    International Nuclear Information System (INIS)

    Dershowitz, W.; Wallmann, P.

    1992-02-01

    Groundwater flow and transport through three-dimensional networks of discrete fractures were modeled to predict the recovery of tracer from tracer injection experiments conducted during phase 3 of the Stripa site characterization and validation project. Predictions were made on the basis of an updated version of the site-scale discrete fracture conceptual model used for flow predictions and preliminary transport modelling. In this model, individual fractures were treated as stochastic features described by probability distributions of geometric and hydrologic properties. Fractures were divided into three populations: fractures in fracture zones near the drift, non-fracture-zone fractures within 31 m of the drift, and fractures in fracture zones over 31 m from the drift axis. Fractures outside fracture zones are not modelled beyond 31 m from the drift axis. Transport predictions were produced using the FracMan discrete fracture modelling package for each of five tracer experiments. Output was produced in the seven formats specified by the Stripa task force on fracture flow modelling. (au)

  18. A Report on the Validation of Beryllium Strength Models

    Energy Technology Data Exchange (ETDEWEB)

    Armstrong, Derek Elswick [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-02-05

    This report discusses work on validating beryllium strength models with flyer plate and Taylor rod experimental data. Strength models are calibrated with Hopkinson bar and quasi-static data. The Hopkinson bar data for beryllium provide strain rates up to about 4000 per second. A limitation of the Hopkinson bar data for beryllium is that they only provide information on strain up to about 0.15. The lack of high-strain data at high strain rates makes it difficult to distinguish between various strength model settings. The PTW model has been calibrated many different times over the last 12 years. The lack of high-strain data at high strain rates has resulted in these calibrated PTW models for beryllium exhibiting significantly different behavior when extrapolated to high strain. For beryllium, the α parameter of PTW has recently been calibrated to high-precision shear modulus data. In the past the α value for beryllium was set based on expert judgment. The new α value for beryllium was used in a calibration of the beryllium PTW model by Sky Sjue. The calibration by Sjue used EOS table information to model the temperature dependence of the heat capacity. Also, the calibration by Sjue used EOS table information to model the density changes of the beryllium sample during the Hopkinson bar and quasi-static experiments. In this paper, the calibrated PTW model by Sjue is compared against experimental data and other strength models. The other strength models being considered are a PTW model calibrated by Shuh-Rong Chen and a Steinberg-Guinan type model by John Pedicini. The three strength models are used in a comparison against flyer plate and Taylor rod data. The results show that the Chen PTW model provides better agreement with these data. The Chen PTW model settings have been previously adjusted to provide a better fit to flyer plate data, whereas the Sjue PTW model has not been changed based on flyer plate data. However, the Sjue model provides a reasonable fit to

  19. Criteria of validity for animal models of psychiatric disorders: focus on anxiety disorders and depression

    Science.gov (United States)

    2011-01-01

    Animal models of psychiatric disorders are usually discussed with regard to three criteria first elaborated by Willner: face, predictive and construct validity. Here, we trace the history of these concepts and then try to redraw and refine these criteria, using the framework of the diathesis model of depression that has been proposed by several authors. We thus propose a set of five major criteria (with sub-categories for some of them): homological validity (including species validity and strain validity), pathogenic validity (including ontopathogenic validity and triggering validity), mechanistic validity, face validity (including ethological and biomarker validity) and predictive validity (including induction and remission validity). Homological validity requires that an adequate species and strain be chosen: considering species validity, primates will be considered to have a higher score than Drosophila, and considering strains, a high stress reactivity in a strain scores higher than a low stress reactivity in another strain. Pathogenic validity corresponds to the fact that, in order to shape pathological characteristics, the organism has been manipulated both during the developmental period (for example, maternal separation: ontopathogenic validity) and during adulthood (for example, stress: triggering validity). Mechanistic validity corresponds to the fact that the cognitive (for example, cognitive bias) or biological mechanisms (such as dysfunction of the hormonal stress axis regulation) underlying the disorder are identical in both humans and animals. Face validity corresponds to the observable behavioral (ethological validity) or biological (biomarker validity) outcomes: for example anhedonic behavior (ethological validity) or elevated corticosterone (biomarker validity). Finally, predictive validity corresponds to the identity of the relationship between the triggering factor and the outcome (induction validity) and between the effects of the treatments

  1. Validation of Symptom Validity Tests Using a "Child-model" of Adult Cognitive Impairments

    NARCIS (Netherlands)

    Rienstra, A.; Spaan, P. E. J.; Schmand, B.

    2010-01-01

    Validation studies of symptom validity tests (SVTs) in children are uncommon. However, since children's cognitive abilities are not yet fully developed, their performance may provide additional support for the validity of these measures in adult populations. Four SVTs, the Test of Memory Malingering

  2. One-day-ahead streamflow forecasting via super-ensembles of several neural network architectures based on the Multi-Level Diversity Model

    Science.gov (United States)

    Brochero, Darwin; Hajji, Islem; Pina, Jasson; Plana, Queralt; Sylvain, Jean-Daniel; Vergeynst, Jenna; Anctil, Francois

    2015-04-01

    Theories about generalization error with ensembles are mainly based on the diversity concept, which promotes combining many members with different properties to support mutually agreeable decisions. Kuncheva (2004) proposed the Multi Level Diversity Model (MLDM) to promote diversity in model ensembles, combining different data subsets, input subsets, models, and parameters, and including a combiner level in order to optimize the final ensemble. This work tests the hypothesis that ensembles of Neural Network (NN) structures minimise the generalization error. We used the MLDM to evaluate two different scenarios: (i) ensembles built from a single NN architecture, and (ii) a super-ensemble built by combining sub-ensembles of many NN architectures. The time series used correspond to the 12 basins of the MOdel Parameter Estimation eXperiment (MOPEX) project that were used by Duan et al. (2006) and Vos (2013) as benchmarks. Six architectures are evaluated: FeedForward NN (FFNN) trained with the Levenberg-Marquardt algorithm (Hagan et al., 1996), FFNN trained with SCE (Duan et al., 1993), Recurrent NN trained with a complex method (Weins et al., 2008), Dynamic NARX NN (Leontaritis and Billings, 1985), Echo State Network (ESN), and ESN with leaky-integrator neurons (L-ESN) (Lukosevicius and Jaeger, 2009). Each architecture separately performs an Input Variable Selection (IVS) according to a forward stepwise selection (Anctil et al., 2009) using mean square error as the objective function. Post-processing by Predictor Stepwise Selection (PSS) of the super-ensemble has been done following the method proposed by Brochero et al. (2011). IVS results showed that lagged streamflow, lagged precipitation, and the Standardized Precipitation Index (SPI) (McKee et al., 1993) were the most relevant variables: they were selected among the first three variables in 66, 45, and 28 of the 72 scenarios, respectively. A relationship between aridity index (Arora, 2002) and NN
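
    A minimal sketch of the forward stepwise input variable selection step described above (with mean square error as the objective) is given below; the least-squares surrogate model and the synthetic data are stand-ins for the NN architectures and MOPEX series actually used in the study.

```python
import numpy as np

def forward_stepwise_ivs(X, y, max_vars=5):
    """Greedy forward selection of input variables by mean square error.

    A least-squares linear model is used as a cheap surrogate for the neural
    networks of the study; X holds candidate predictors (columns), y the
    target streamflow.
    """
    selected, remaining = [], list(range(X.shape[1]))
    while remaining and len(selected) < max_vars:
        best_j, best_mse = None, np.inf
        for j in remaining:
            cols = selected + [j]
            A = np.column_stack([X[:, cols], np.ones(len(y))])
            coef, *_ = np.linalg.lstsq(A, y, rcond=None)
            mse = np.mean((A @ coef - y) ** 2)
            if mse < best_mse:
                best_j, best_mse = j, mse
        selected.append(best_j)
        remaining.remove(best_j)
    return selected

# Synthetic example: lagged flow (col 0) and lagged precipitation (col 1)
# actually drive the target; columns 2-4 are pure noise.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 5))
y = 0.8 * X[:, 0] + 0.4 * X[:, 1] + 0.1 * rng.normal(size=500)
print(forward_stepwise_ivs(X, y, max_vars=3))
```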

  3. Ovarian volume throughout life: a validated normative model.

    Science.gov (United States)

    Kelsey, Thomas W; Dodwell, Sarah K; Wilkinson, A Graham; Greve, Tine; Andersen, Claus Y; Anderson, Richard A; Wallace, W Hamish B

    2013-01-01

    The measurement of ovarian volume has been shown to be a useful indirect indicator of the ovarian reserve in women of reproductive age, in the diagnosis and management of a number of disorders of puberty and adult reproductive function, and is under investigation as a screening tool for ovarian cancer. To date there is no normative model of ovarian volume throughout life. By searching the published literature for ovarian volume in healthy females, and using our own data from multiple sources (combined n=59,994) we have generated and robustly validated the first model of ovarian volume from conception to 82 years of age. This model shows that 69% of the variation in ovarian volume is due to age alone. We have shown that in the average case ovarian volume rises from 0.7 mL (95% CI 0.4-1.1 mL) at 2 years of age to a peak of 7.7 mL (95% CI 6.5-9.2 mL) at 20 years of age with a subsequent decline to about 2.8 mL (95% CI 2.7-2.9 mL) at the menopause and smaller volumes thereafter. Our model allows us to generate normal values and ranges for ovarian volume throughout life. This is the first validated normative model of ovarian volume from conception to old age; it will be of use in the diagnosis and management of a number of diverse gynaecological and reproductive conditions in females from birth to menopause and beyond.

  4. Challenges in validating model results for first year ice

    Science.gov (United States)

    Melsom, Arne; Eastwood, Steinar; Xie, Jiping; Aaboe, Signe; Bertino, Laurent

    2017-04-01

    In order to assess the quality of model results for the distribution of first-year ice, a comparison with a product based on observations from satellite-borne instruments has been performed. Such a comparison is not straightforward due to the contrasting algorithms used in the model product and the remote sensing product. The implementation of the validation is discussed in light of the differences between these products, and validation results are presented. The model product is the daily updated 10-day forecast from the Arctic Monitoring and Forecasting Centre in CMEMS. The forecasts are produced with the assimilative ocean prediction system TOPAZ. Presently, observations of sea ice concentration and sea ice drift are introduced in the assimilation step, but data for sea ice thickness and ice age (or roughness) are not included. The model computes the age of the ice by recording and updating the time elapsed since ice formation as the sea ice grows, deteriorates, and is advected inside the model domain. Ice that is younger than 365 days is classified as first-year ice, and the fraction of first-year ice is recorded as a tracer in each grid cell. The Ocean and Sea Ice Thematic Assembly Centre in CMEMS redistributes a daily product from the EUMETSAT OSI SAF of gridded sea ice conditions which includes "ice type", a representation of the separation between regions dominated by first-year ice and those dominated by multi-year ice. The ice type is parameterized based on data for the gradient ratio GR(19,37) from SSMIS observations and on the ASCAT backscatter parameter. This product also includes information on ambiguity in the processing of the remote sensing data and on the product's confidence level, both of which have a strong seasonal dependency.

  5. Modeling Clinically Validated Physical Activity Assessments Using Commodity Hardware.

    Science.gov (United States)

    Winfree, Kyle N; Dominick, Gregory

    2018-03-01

    Consumer-grade wearable activity devices such as Fitbits are increasingly being used in research settings to promote physical activity (PA) due to their low-cost and widespread popularity. However, Fitbit-derived measures of activity intensity are consistently reported to be less accurate than intensity estimates obtained from research-grade accelerometers (i.e., ActiGraph). As such, the potential for using a Fitbit to measure PA intensity within research contexts remains limited. This study aims to model ActiGraph-based intensity estimates from the validated Freedson vector magnitude (VM3) algorithm using measures of steps, metabolic equivalents, and intensity levels obtained from Fitbit. Minute-level data collected from 19 subjects, who concurrently wore the ActiGraph GT3X and Fitbit Flex devices for an average of 1.8 weeks, were used to generate the model. After testing several modeling methods, a naïve Bayes classifier was chosen based on the lowest achieved error rate. Overall, the model reduced Fitbit to ActiGraph errors from 19.97% to 16.32%. Moreover, the model reduced misclassification of Fitbit-based estimates of moderate-to-vigorous physical activity (MVPA) by 40%, eliminating a statistically significant difference between MVPA estimates derived from ActiGraph and Fitbit. Study findings support the general utility of the model for measuring MVPA with the Fitbit Flex in place of the more costly ActiGraph GT3X accelerometer for young healthy adults.
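
    The classification step can be illustrated with a Gaussian naïve Bayes model in scikit-learn; the minute-level features and labels below are synthetic stand-ins for the Fitbit and ActiGraph data of the study, not its actual dataset.

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import train_test_split

# Synthetic minute-level records: [Fitbit steps, Fitbit METs, Fitbit intensity level]
rng = np.random.default_rng(42)
n = 2000
steps = rng.integers(0, 130, n)
mets = 1.0 + steps / 60.0 + rng.normal(0, 0.3, n)
fitbit_level = np.digitize(mets, [1.5, 3.0, 6.0])   # 0=sedentary, 1=light, 2=moderate, 3=vigorous
X = np.column_stack([steps, mets, fitbit_level])

# Stand-in "ActiGraph" intensity labels (in the study these come from the
# Freedson VM3 cut points); here simulated as a noisy function of the same signal.
actigraph_level = np.digitize(mets + rng.normal(0, 0.4, n), [1.5, 3.0, 6.0])

X_tr, X_te, y_tr, y_te = train_test_split(X, actigraph_level, test_size=0.3,
                                          random_state=0)
clf = GaussianNB().fit(X_tr, y_tr)
print("error rate: %.2f%%" % (100 * (clf.predict(X_te) != y_te).mean()))
```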

  6. Validation of a non-linear model of health.

    Science.gov (United States)

    Topolski, Stefan; Sturmberg, Joachim

    2014-12-01

    The purpose of this study was to evaluate the veracity of a theoretically derived model of health that describes a non-linear trajectory of health from birth to death with available population data sets. The distribution of mortality by age is directly related to health at that age, thus health approximates 1/mortality. The inverse of available all-cause mortality data from various time periods and populations was used as proxy data to compare with the theoretically derived non-linear health model predictions, using both qualitative approaches and quantitative one-sample Kolmogorov-Smirnov analysis with Monte Carlo simulation. The mortality data's inverse resembles a log-normal distribution as predicted by the proposed health model. The curves have identical slopes from birth and follow a logarithmic decline from peak health in young adulthood. A majority of the sampled populations had a good to excellent quantitative fit to a log-normal distribution, supporting the underlying model assumptions. Post hoc manipulation showed the model predictions to be stable. This is a first theory of health to be validated by proxy data, namely the inverse of all-cause mortality. This non-linear model, derived from the notion of the interaction of physical, environmental, mental, emotional, social and sense-making domains of health, gives physicians a more rigorous basis to direct health care services and resources away from disease-focused elder care towards broad-based biopsychosocial interventions earlier in life. © 2014 John Wiley & Sons, Ltd.
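
    The quantitative check can be sketched as follows: invert an illustrative mortality schedule to obtain the health proxy and test the implied age distribution against a fitted log-normal with a one-sample Kolmogorov-Smirnov test. The mortality rates and fitting details are assumptions for illustration only, not the study's data or exact procedure (which adds a Monte Carlo correction for the fitted parameters).

```python
import numpy as np
from scipy import stats

# Illustrative all-cause mortality rates by age (deaths per person-year);
# placeholder values, not the study's population data.
ages = np.arange(1, 91)
mortality = 1e-4 * np.exp(0.085 * ages) + 2e-4

health = 1.0 / mortality              # health proxy = inverse mortality
weights = health / health.sum()       # treat the health curve as a density over age

# Draw samples from that age "distribution", fit a log-normal, and apply a
# one-sample Kolmogorov-Smirnov test of the fit.
rng = np.random.default_rng(1)
samples = rng.choice(ages, size=5000, p=weights)
shape, loc, scale = stats.lognorm.fit(samples, floc=0)
ks_stat, p_value = stats.kstest(samples, "lognorm", args=(shape, loc, scale))
print(f"KS statistic = {ks_stat:.3f}, p = {p_value:.3f}")
```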

  7. Nonlinear ultrasound modelling and validation of fatigue damage

    Science.gov (United States)

    Fierro, G. P. Malfense; Ciampa, F.; Ginzburg, D.; Onder, E.; Meo, M.

    2015-05-01

    Nonlinear ultrasound techniques have shown greater sensitivity to microcracks and can be used to detect structural damage at an early stage. However, there is still a lack of numerical models available in commercial finite element analysis (FEA) tools that are able to simulate the interaction of elastic waves with the material's nonlinear behaviour. In this study, a nonlinear constitutive material model was developed to predict the structural response under continuous harmonic excitation of a fatigued isotropic sample that showed anharmonic effects. In particular, by means of Landau's theory and Kelvin tensorial representation, this model provided an understanding of elastic nonlinear phenomena such as second harmonic generation in three-dimensional solid media. The numerical scheme was implemented and evaluated using the commercially available FEA software LS-DYNA, and it showed a good numerical characterisation of the second harmonic amplitude generated by the damaged region, known as the nonlinear response area (NRA). Since this process requires only the experimental second-order nonlinear parameter and a rough estimate of the damage size as inputs, it does not need any baseline testing with the undamaged structure or any dynamic modelling of the fatigue crack growth. To validate this numerical model, the second-order nonlinear parameter was experimentally evaluated at various points over the fatigue life of an aluminium (AA6082-T6) coupon and the crack propagation was measured using an optical microscope. A good correlation was achieved between the experimental set-up and the nonlinear constitutive model.
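
    As background to the measured quantity, a relative second-order nonlinear parameter is commonly estimated from the fundamental and second-harmonic spectral amplitudes as beta ~ A2/A1^2; the FFT-based sketch below uses a synthetic distorted tone, not the study's experimental signals.

```python
import numpy as np

def relative_beta(signal, fs, f0):
    """Estimate a relative second-order nonlinear parameter ~ A2 / A1**2
    from the FFT amplitudes at the fundamental f0 and second harmonic 2*f0."""
    spec = np.abs(np.fft.rfft(signal * np.hanning(len(signal))))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    a1 = spec[np.argmin(np.abs(freqs - f0))]
    a2 = spec[np.argmin(np.abs(freqs - 2 * f0))]
    return a2 / a1**2

# Synthetic example: a 100 kHz tone with a small quadratic distortion
# mimicking second-harmonic generation by a damaged region.
fs, f0 = 10e6, 100e3
t = np.arange(0, 2e-3, 1.0 / fs)
clean = np.sin(2 * np.pi * f0 * t)
distorted = clean + 0.02 * clean**2
print(relative_beta(distorted, fs, f0))
```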

  8. MT3DMS: Model use, calibration, and validation

    Science.gov (United States)

    Zheng, C.; Hill, Mary C.; Cao, G.; Ma, R.

    2012-01-01

    MT3DMS is a three-dimensional multi-species solute transport model for solving advection, dispersion, and chemical reactions of contaminants in saturated groundwater flow systems. MT3DMS interfaces directly with the U.S. Geological Survey finite-difference groundwater flow model MODFLOW for the flow solution and supports the hydrologic and discretization features of MODFLOW. MT3DMS contains multiple transport solution techniques in one code, which can often be important, including in model calibration. Since its first release in 1990 as MT3D for single-species mass transport modeling, MT3DMS has been widely used in research projects and practical field applications. This article provides a brief introduction to MT3DMS and presents recommendations about calibration and validation procedures for field applications of MT3DMS. The examples presented suggest the need to consider alternative processes as models are calibrated and suggest opportunities and difficulties associated with using groundwater age in transport model calibration.

  9. Validation of kinetic modeling of progesterone release from polymeric membranes

    Directory of Open Access Journals (Sweden)

    Analia Irma Romero

    2018-01-01

    Mathematical modeling of drug release systems is fundamental in the development and optimization of these systems, since it allows prediction of drug release rates and elucidation of the physical transport mechanisms involved. In this paper we validate a novel mathematical model that describes progesterone (Prg) controlled release from poly-3-hydroxybutyric acid (PHB) membranes. A statistical analysis was conducted to compare the fitting of our model with six different models, and the Akaike information criterion (AIC) was used to find the equation with the best fit. A simple relation between mass and drug release rate was found, which allows predicting the effect of Prg loads on the release behavior. Our proposed model had the minimum AIC value, and therefore it was the one that statistically best fitted the experimental data obtained for all the Prg loads tested. Furthermore, the initial release rate was calculated, the interface mass transfer coefficient was thereby estimated, and the equilibrium distribution constant of Prg between the PHB and the release medium was also determined. The results lead us to conclude that our proposed model best fits the experimental data and can be successfully used to describe Prg drug release from PHB membranes.
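
    The AIC-based model comparison can be sketched as follows; the candidate release laws (zero-order, first-order, Higuchi, Korsmeyer-Peppas) and the data points are illustrative assumptions, not necessarily the six models or the measurements used in the paper.

```python
import numpy as np
from scipy.optimize import curve_fit

# Illustrative cumulative-release data (fraction released vs. time in hours);
# placeholders, not the experimental Prg/PHB data.
t = np.array([1, 2, 4, 8, 12, 24, 36, 48], dtype=float)
q = np.array([0.08, 0.12, 0.18, 0.27, 0.33, 0.47, 0.57, 0.64])

models = {  # common empirical release laws, used here only as examples
    "zero-order":       (lambda t, k: k * t,              1),
    "first-order":      (lambda t, k: 1 - np.exp(-k * t), 1),
    "Higuchi":          (lambda t, k: k * np.sqrt(t),     1),
    "Korsmeyer-Peppas": (lambda t, k, n: k * t**n,        2),
}

def aic(rss, n_obs, n_par):
    """Akaike information criterion for a least-squares fit."""
    return n_obs * np.log(rss / n_obs) + 2 * n_par

for name, (f, n_par) in models.items():
    popt, _ = curve_fit(f, t, q, p0=[0.1] * n_par, maxfev=10000)
    rss = np.sum((q - f(t, *popt)) ** 2)
    print(f"{name:18s} AIC = {aic(rss, len(t), n_par):7.2f}")
```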

  10. Non-Linear Slosh Damping Model Development and Validation

    Science.gov (United States)

    Yang, H. Q.; West, Jeff

    2015-01-01

    Propellant tank slosh dynamics are typically represented by a mechanical spring-mass-damper model. This mechanical model is then included in the equation of motion of the entire vehicle for Guidance, Navigation and Control (GN&C) analysis. For a partially-filled smooth-wall propellant tank, the critical damping based on classical empirical correlation is as low as 0.05%. Due to this low value of damping, propellant slosh is a potential source of disturbance critical to the stability of launch and space vehicles. It is postulated that the commonly quoted slosh damping is valid only in the linear regime where the slosh amplitude is small. With increasing slosh amplitude, the critical damping value should also increase. If this nonlinearity can be verified and validated, the slosh stability margin can be significantly improved, and the level of conservatism maintained in the GN&C analysis can be lessened. The purpose of this study is to explore and quantify the dependence of slosh damping on slosh amplitude. Accurately predicting the extremely low damping value of a smooth-wall tank is very challenging for any Computational Fluid Dynamics (CFD) tool: one must resolve thin boundary layers near the wall and limit numerical damping to a minimum. This computational study demonstrates that, with proper grid resolution, CFD can indeed accurately predict the low damping physics of smooth walls in the linear regime. Comparisons of extracted damping values with experimental data for different tank sizes show very good agreement. Numerical simulations confirm that slosh damping is indeed a function of slosh amplitude. When the slosh amplitude is low, the damping ratio is essentially constant, which is consistent with the empirical correlation. Once the amplitude reaches a critical value, the damping ratio becomes a linearly increasing function of the slosh amplitude. A follow-on experiment validated the developed nonlinear damping relationship. This discovery can

  11. Model Validation of Radiocaesium Transfer from Soil to Leafy Vegetables

    Directory of Open Access Journals (Sweden)

    P. Sukmabuana

    2012-04-01

    The accumulation of radionuclides in plant tissues can be estimated using a mathematical model; however, the applicability of the model to field experiments still needs to be evaluated. A model validation was conducted for radiocaesium transfer from soil to two leafy vegetables commonly consumed in Indonesia, spinach and morning glory, in order to validate the transfer model against field experimental data. The vegetable plants were grown for about 70 days on soil contaminated with 19 MBq of 134CsNO3. As controls, plants were also grown on soil without 134CsNO3 contamination. Every 5 days, three samples of both the contaminated and uncontaminated plants were taken, and the soil medium was also sampled. The samples were dried under an infrared lamp and their radioactivity was counted using a gamma spectrometer. The 134Cs activity data for soil and plants were substituted into the mathematical equation to obtain the transfer rate coefficient (k12). The values of k12 were then used to calculate the 134Cs activity in the vegetable plants. The 134Cs activity in plants obtained from the mathematical model analysis was compared with the activity data obtained from the experiment. The correlation between the experimental and modelled 134Cs activities, expressed as a correlation coefficient, was 0.90 for spinach and 0.71 for morning glory. The 134Cs values obtained from the model analysis can be corrected using standard deviation values, namely 48.65 and 20 for spinach. Based on the agreement between the model analysis and the experimental data, the model of 134Cs transfer from soil to plant can be used for analysing 134Cs radioactivity
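
    A minimal two-compartment sketch of this kind of transfer model (soil to plant with a single transfer rate coefficient k12, plus radioactive decay of 134Cs) is shown below; the value of k12 and the treatment of losses are illustrative assumptions, not the fitted coefficients of the study.

```python
import numpy as np
from scipy.integrate import solve_ivp

LAMBDA_134CS = np.log(2) / (2.06 * 365)   # physical decay constant of 134Cs [1/day]

def soil_plant_transfer(t, y, k12):
    """Two-compartment model: y = [soil activity, plant activity] in Bq.
    Soil loses activity to the plant at rate k12; both compartments decay."""
    soil, plant = y
    d_soil = -(k12 + LAMBDA_134CS) * soil
    d_plant = k12 * soil - LAMBDA_134CS * plant
    return [d_soil, d_plant]

k12 = 0.01                     # illustrative transfer rate coefficient [1/day]
t_eval = np.arange(0, 71, 5)   # sampling every 5 days over ~70 days
sol = solve_ivp(soil_plant_transfer, (0, 70), [19e6, 0.0],  # 19 MBq in soil, none in plant
                args=(k12,), t_eval=t_eval)
for day, plant_bq in zip(sol.t, sol.y[1]):
    print(f"day {day:4.0f}: plant activity ~ {plant_bq:,.0f} Bq")
```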

  12. Validating agent oriented methodology (AOM) for netlogo modelling and simulation

    Science.gov (United States)

    WaiShiang, Cheah; Nissom, Shane; YeeWai, Sim; Sharbini, Hamizan

    2017-10-01

    AOM (Agent Oriented Modeling) is a comprehensive and unified methodology for agent-oriented software development. The AOM methodology was proposed to aid developers by introducing techniques, terminology, notation and guidelines during agent system development. Although the AOM methodology is claimed to be capable of developing complex real-world systems, its potential is yet to be realized and recognized by the mainstream software community, and the adoption of AOM is still in its infancy. Among the reasons is that there are not many case studies or success stories for AOM. This paper presents two case studies on the adoption of AOM for individual-based modelling and simulation. They demonstrate how AOM is useful for epidemiological and ecological studies and hence further validate AOM in a qualitative manner.

  13. Experimental validation of models for Plasma Focus devices

    International Nuclear Information System (INIS)

    Rodriguez Palomino, Luis; Gonzalez, Jose; Clausse, Alejandro

    2003-01-01

    Plasma Focus (PF) devices are thermonuclear pulsators that produce short pulses of radiation (X-rays, charged particles and neutrons). Since the work of Filippov and Mather, these devices have been used to study plasma properties. Nowadays, interest in PF devices is focused on technological applications related to their use as pulsed neutron sources. On the numerical side, the inter-institutional PLADEMA (PLAsmas DEnsos MAgnetizados) network is developing three models, each useful at a different engineering stage of Plasma Focus design. One of the main objectives of this work is a comparative study of the influence of the different parameters involved in each model. To validate these results, several experimental measurements were performed under different geometries and initial conditions. (author)

  14. Development and validation of a liquid composite molding model

    Science.gov (United States)

    Bayldon, John Michael

    2007-12-01

    In composite manufacturing, Vacuum Assisted Resin Transfer Molding (VARTM) is becoming increasingly important as a cost-effective manufacturing method for structural composites. In this process the dry preform (reinforcement) is placed on a rigid tool and covered by a flexible film to form an airtight vacuum bag. Liquid resin is drawn under vacuum through the preform inside the vacuum bag. Modeling of this process relies on a good understanding of closely coupled phenomena. The resin flow depends on the preform permeability, which in turn depends on the local fluid pressure and the preform compaction behavior. VARTM models for predicting the flow rate in this process do exist; however, they are not able to properly predict the flow for all classes of reinforcement material. In this thesis, the continuity equation used in VARTM models is reexamined and a modified form proposed. In addition, the compaction behavior of the preform in both saturated and dry states is studied in detail and new models are proposed for the compaction behavior. To assess the validity of the proposed models, the shadow moire method was adapted and used to perform full-field measurement of the preform thickness during infusion, in addition to the usual measurements of flow front position. A new method was developed and evaluated for the analysis of the moire data related to the VARTM process; however, the method has wider applicability to other full-field thickness measurements. The use of this measurement method demonstrated that although the new compaction models work well in the characterization tests, they do not properly describe all the preform features required for modeling the process. In particular, the effect of varying saturation on the preform's behavior requires additional study. The flow models developed did, however, improve the prediction of the flow rate for the more compliant preform material tested, and the experimental techniques have shown where additional test methods

  15. Systematic approach to verification and validation: High explosive burn models

    Energy Technology Data Exchange (ETDEWEB)

    Menikoff, Ralph [Los Alamos National Laboratory; Scovel, Christina A. [Los Alamos National Laboratory

    2012-04-16

    Most material models used in numerical simulations are based on heuristics and empirically calibrated to experimental data. For a specific model, key questions are determining its domain of applicability and assessing its relative merits compared to other models. Answering these questions should be a part of model verification and validation (V and V). Here, we focus on V and V of high explosive models. Typically, model developers implemented their model in their own hydro code and use different sets of experiments to calibrate model parameters. Rarely can one find in the literature simulation results for different models of the same experiment. Consequently, it is difficult to assess objectively the relative merits of different models. This situation results in part from the fact that experimental data is scattered through the literature (articles in journals and conference proceedings) and that the printed literature does not allow the reader to obtain data from a figure in electronic form needed to make detailed comparisons among experiments and simulations. In addition, it is very time consuming to set up and run simulations to compare different models over sufficiently many experiments to cover the range of phenomena of interest. The first difficulty could be overcome if the research community were to support an online web based database. The second difficulty can be greatly reduced by automating procedures to set up and run simulations of similar types of experiments. Moreover, automated testing would be greatly facilitated if the data files obtained from a database were in a standard format that contained key experimental parameters as meta-data in a header to the data file. To illustrate our approach to V and V, we have developed a high explosive database (HED) at LANL. It now contains a large number of shock initiation experiments. Utilizing the header information in a data file from HED, we have written scripts to generate an input file for a hydro code

  16. Effects of climate change on streamflow extremes and implications for reservoir inflow in the United States

    Science.gov (United States)

    Naz, Bibi S.; Kao, Shih-Chieh; Ashfaq, Moetasim; Gao, Huilin; Rastogi, Deeksha; Gangrade, Sudershan

    2018-01-01

    The magnitude and frequency of hydrometeorological extremes are expected to increase in the conterminous United States (CONUS) over the rest of this century, and their increase will significantly impact water resource management. In this study, we evaluated the large-scale climate change effects on extreme hydrological events and their implications for reservoir inflows in 138 headwater subbasins located upstream of reservoirs across CONUS using the Variable Infiltration Capacity (VIC) hydrologic model. The VIC model was forced with a 10-member ensemble of global circulation models under the Representative Concentration Pathway 8.5 that were dynamically downscaled using a regional climate model (RegCM4) and bias-corrected to 1/24° grid cell resolution. Four commonly used indices, including mean annual flow, annual center timing, 100-year daily high streamflow, and 10-year 7-day average low streamflow were used for evaluation. The results projected an increase in the high streamflow by 44% for a majority of subbasins upstream of flood control reservoirs in the central United States (US) and a decrease in the low streamflow by 11% for subbasins upstream of hydropower reservoirs across the western US. In the eastern US, frequencies of both high and low streamflow were projected to increase in the majority of subbasins upstream of both hydropower and flood control reservoirs. Increased frequencies of both high and low streamflow events can potentially make reservoirs across CONUS more vulnerable to future climate conditions. This study estimates reservoir inflow changes over the next several decades, which can be used to optimize water supply management downstream.
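
    Two of the indices used here, annual center timing and the 7-day average low flow, can be computed from a daily streamflow series as sketched below on synthetic data; the 100-year and 10-year return levels additionally require an extreme-value frequency analysis that is not shown.

```python
import numpy as np
import pandas as pd

# Synthetic daily streamflow series [m^3/s] with a spring snowmelt peak.
days = pd.date_range("1981-01-01", "2010-12-31", freq="D")
doy = days.dayofyear.to_numpy()
rng = np.random.default_rng(7)
flow = 20 + 80 * np.exp(-0.5 * ((doy - 130) / 25.0) ** 2) + rng.gamma(2.0, 3.0, len(days))
q = pd.Series(flow, index=days)

def center_timing(year_flow):
    """Day of year by which half of the annual flow volume has passed."""
    cum = year_flow.cumsum()
    return int(year_flow.index.dayofyear[(cum >= 0.5 * cum.iloc[-1]).argmax()])

annual = q.groupby(q.index.year)
ct = annual.apply(center_timing)                        # annual center timing [day of year]
low7 = q.rolling(7).mean().groupby(q.index.year).min()  # annual 7-day average low flow
print(ct.head(), low7.head(), sep="\n")
```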

  17. Assessing the capability of TRMM 3B42 V7 to simulate streamflow ...

    Indian Academy of Sciences (India)

    Brijesh Kumar

    2018-03-06

    The paper examines the quality of the Tropical Rainfall Measuring Mission (TRMM) 3B42 V7 precipitation product for simulating streamflow with the Soil and Water Assessment Tool (SWAT) model for various rainfall intensities over the Himalayan region. The SWAT model has been set up for the Gandak River Basin.

  18. Experimental validation of solid rocket motor damping models

    Science.gov (United States)

    Riso, Cristina; Fransen, Sebastiaan; Mastroddi, Franco; Coppotelli, Giuliano; Trequattrini, Francesco; De Vivo, Alessio

    2017-12-01

    In the design and certification of spacecraft, payload/launcher coupled load analyses are performed to simulate the satellite dynamic environment. To obtain accurate predictions, the system damping properties must be properly taken into account in the finite element model used for coupled load analysis. This is typically done using a structural damping characterization in the frequency domain, which is not applicable in the time domain. Therefore, the structural damping matrix of the system must be converted into an equivalent viscous damping matrix when a transient coupled load analysis is performed. This paper focuses on the validation of equivalent viscous damping methods for dynamically condensed finite element models via correlation with experimental data for a realistic structure representative of a slender launch vehicle with solid rocket motors. A second aim of the paper is to investigate how to conveniently choose a single combination of Young's modulus and structural damping coefficient (a complex Young's modulus) to approximate the viscoelastic behavior of a solid propellant material in the frequency band of interest for coupled load analysis. A scaled-down test article inspired by the Z9-ignition Vega launcher configuration is designed, manufactured, and experimentally tested to obtain data for validation of the equivalent viscous damping methods. The Z9-like component of the test article is filled with a viscoelastic material representative of the Z9 solid propellant, which is also preliminarily tested to investigate the dependency of the complex Young's modulus on the excitation frequency and to provide data for the test article finite element model. Experimental results from seismic and shock tests performed on the test configuration are correlated with numerical results from frequency and time domain analyses carried out on its dynamically condensed finite element model to assess the applicability of different equivalent viscous damping methods to describe
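
    One common equivalence of this kind matches the energy dissipated per cycle, giving a viscous damping coefficient c_eq = eta*k/omega at a chosen matching frequency; the single-mode sketch below uses placeholder values and is a generic textbook relation rather than the specific methods validated in the paper.

```python
import numpy as np

def equivalent_viscous_damping(eta, k, omega_match):
    """Viscous damping coefficient equivalent to structural (hysteretic)
    damping eta for stiffness k, matched at angular frequency omega_match.

    Equating the energy dissipated per cycle gives c_eq = eta * k / omega.
    """
    return eta * k / omega_match

# Placeholder single-mode values (not the Vega/Z9 test article properties).
k = 2.0e7            # modal stiffness [N/m]
m = 500.0            # modal mass [kg]
eta = 0.04           # structural damping coefficient
omega_n = np.sqrt(k / m)

c_eq = equivalent_viscous_damping(eta, k, omega_n)
zeta = c_eq / (2.0 * np.sqrt(k * m))   # equivalent viscous damping ratio
print(f"c_eq = {c_eq:.1f} N s/m, zeta = {zeta:.3f} (= eta/2 when matched at resonance)")
```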

  19. Design-validation of a hand exoskeleton using musculoskeletal modeling.

    Science.gov (United States)

    Hansen, Clint; Gosselin, Florian; Ben Mansour, Khalil; Devos, Pierre; Marin, Frederic

    2018-04-01

    Exoskeletons are progressively reaching homes and workplaces, allowing interaction with virtual environments, remote control of robots, or assistance to human operators carrying heavy loads. Their design is however still a challenge as these robots, being mechanically linked to the operators who wear them, have to meet ergonomic constraints besides usual robotic requirements in terms of workspace, speed, or efforts. They have in particular to fit the anthropometry and mobility of their users. This traditionally results in numerous prototypes which are progressively fitted to each individual person. In this paper, we propose instead to validate the design of a hand exoskeleton in a fully digital environment, without the need for a physical prototype. The purpose of this study is thus to examine whether finger kinematics are altered when using a given hand exoskeleton. Therefore, user-specific musculoskeletal models were created and driven by a motion capture system to evaluate the fingers' joint kinematics when performing two industry-related tasks. The kinematic chain of the exoskeleton was added to the musculoskeletal models and its compliance with the hand movements was evaluated. Our results show that the proposed exoskeleton design does not influence the fingers' joint angles, the coefficient of determination between the models with and without exoskeleton being consistently high (mean R² = 0.93) and the nRMSE consistently low (mean nRMSE = 5.42°). These results are promising, and this approach combining musculoskeletal and robotic modeling driven by motion capture data could be a key factor in the ergonomic validation of the design of orthotic devices and exoskeletons prior to manufacturing. Copyright © 2017 Elsevier Ltd. All rights reserved.
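
    The two agreement metrics reported, the coefficient of determination and a normalized RMSE between joint-angle trajectories with and without the exoskeleton, can be computed as in this sketch; the angle curves are synthetic and the normalization by signal range is one common convention, not necessarily the one used in the study.

```python
import numpy as np

def r_squared(y_ref, y_test):
    """Coefficient of determination of y_test against y_ref."""
    ss_res = np.sum((y_ref - y_test) ** 2)
    ss_tot = np.sum((y_ref - np.mean(y_ref)) ** 2)
    return 1.0 - ss_res / ss_tot

def nrmse_percent(y_ref, y_test):
    """RMSE normalized by the range of the reference signal, in percent."""
    rmse = np.sqrt(np.mean((y_ref - y_test) ** 2))
    return 100.0 * rmse / (np.max(y_ref) - np.min(y_ref))

# Synthetic finger joint-angle curves [deg] for one grasping cycle,
# with and without the (simulated) exoskeleton.
t = np.linspace(0, 1, 200)
angle_free = 60 * np.sin(np.pi * t) ** 2
angle_exo = angle_free + np.random.default_rng(3).normal(0, 1.5, t.size)

print(f"R^2 = {r_squared(angle_free, angle_exo):.3f}, "
      f"nRMSE = {nrmse_percent(angle_free, angle_exo):.2f}% of range")
```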

  20. Hydraulic Hybrid Excavator—Mathematical Model Validation and Energy Analysis

    Directory of Open Access Journals (Sweden)

    Paolo Casoli

    2016-11-01

    Recent demands to reduce pollutant emissions and improve energy efficiency have driven the implementation of hybrid solutions in mobile machinery. This paper presents the results of a numerical and experimental analysis conducted on a hydraulic hybrid excavator (HHE). The machine under study is a mid-size excavator whose standard version was modified with the introduction of an energy recovery system (ERS). The proposed ERS layout was designed to recover the potential energy of the boom, using a hydraulic accumulator as a storage device. The recovered energy is utilized through the pilot pump of the machine, which operates as a motor, thus reducing the torque required from the internal combustion engine (ICE). The analysis reported in this paper validates the HHE model by comparing numerical and experimental data in terms of hydraulic and mechanical variables and fuel consumption. The mathematical model shows its capability to reproduce the realistic operating conditions of the realized prototype, tested in the field. A detailed energy analysis comparing the standard and the hybrid excavator models was carried out to evaluate the energy flows through the system, showing advantages, weaknesses and possibilities to further improve the machine's efficiency. Finally, the fuel consumption estimated by the model and that measured during the experiments are presented to highlight the fuel saving percentages. The HHE model is an important starting point for the development of other energy saving solutions.

  1. Development and validation of a habitat suitability model for ...

    Science.gov (United States)

    We developed a spatially-explicit, flexible 3-parameter habitat suitability model that can be used to identify and predict areas at higher risk for non-native dwarf eelgrass (Zostera japonica) invasion. The model uses simple environmental parameters (depth, nearshore slope, and salinity) to quantitatively describe habitat suitable for Z. japonica invasion based on ecology and physiology from the primary literature. Habitat suitability is defined with values ranging from zero to one, where one denotes areas most conducive to Z. japonica and zero denotes areas not likely to support Z. japonica growth. The model was applied to Yaquina Bay, Oregon, USA, an area that has well documented Z. japonica expansion over the last two decades. The highest suitability values for Z. japonica occurred in the mid to upper portions of the intertidal zone, with larger expanses occurring in the lower estuary. While the upper estuary did contain suitable habitat, most areas were not as large as in the lower estuary, due to inappropriate depth, a steeply sloping intertidal zone, and lower salinity. The lowest suitability values occurred below the lower intertidal zone, within the Yaquina River channel. The model was validated by comparison to a multi-year time series of Z. japonica maps, revealing a strong predictive capacity. Sensitivity analysis performed to evaluate the contribution of each parameter to the model prediction revealed that depth was the most important factor. Sh
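
    A skeletal version of such a three-parameter index combines per-parameter suitability curves into a single 0-1 score, for example by taking their product or minimum; the breakpoints below are hypothetical placeholders, not the Z. japonica values taken from the literature.

```python
import numpy as np

def trapezoid(x, lo_zero, lo_one, hi_one, hi_zero):
    """Trapezoidal suitability curve: 0 outside [lo_zero, hi_zero], 1 on [lo_one, hi_one]."""
    x = np.asarray(x, dtype=float)
    rising = np.clip((x - lo_zero) / (lo_one - lo_zero), 0.0, 1.0)
    falling = np.clip((hi_zero - x) / (hi_zero - hi_one), 0.0, 1.0)
    return np.minimum(rising, falling)

def habitat_suitability(depth_m, slope_deg, salinity_psu):
    """Combine depth, nearshore slope, and salinity suitabilities into a 0-1 index.
    All breakpoints are hypothetical placeholders for illustration only."""
    s_depth = trapezoid(depth_m, -0.5, 0.0, 1.5, 2.5)        # mid-to-upper intertidal band
    s_slope = np.clip((5.0 - np.asarray(slope_deg, dtype=float)) / 3.0, 0.0, 1.0)  # gentle slopes
    s_sal = trapezoid(salinity_psu, 10.0, 20.0, 34.0, 36.0)  # brackish-to-marine salinities
    return s_depth * s_slope * s_sal                         # product rule (alternative: minimum)

# Two example grid cells: a gently sloping mid-intertidal cell and a steeper, fresher one.
print(habitat_suitability(depth_m=[0.5, 2.0], slope_deg=[1.0, 4.0],
                          salinity_psu=[30.0, 12.0]))
```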

  2. Validation of Storm Water Management Model Storm Control Measures Modules

    Science.gov (United States)

    Simon, M. A.; Platz, M. C.

    2017-12-01

    EPA's Storm Water Management Model (SWMM) is a computational code heavily relied upon by industry for the simulation of wastewater and stormwater infrastructure performance. Many municipalities are relying on SWMM results to design multi-billion-dollar, multi-decade infrastructure upgrades. Since the 1970's, EPA and others have developed five major releases, the most recent ones containing storm control measures modules for green infrastructure. The main objective of this study was to quantify the accuracy with which SWMM v5.1.10 simulates the hydrologic activity of previously monitored low impact developments. Model performance was evaluated with a mathematical comparison of outflow hydrographs and total outflow volumes, using empirical data and a multi-event, multi-objective calibration method. The calibration methodology utilized PEST++ Version 3, a parameter estimation tool, which aided in the selection of unmeasured hydrologic parameters. From the validation study and sensitivity analysis, several model improvements were identified to advance SWMM LID Module performance for permeable pavements, infiltration units and green roofs, and these were performed and reported herein. Overall, it was determined that SWMM can successfully simulate low impact development controls given accurate model confirmation, parameter measurement, and model calibration.

  3. Developing and investigating validity of a knowledge management game simulation model

    NARCIS (Netherlands)

    Tsjernikova, Irina

    2009-01-01

    The goals of this research project were to develop a game simulation model which supports learning knowledge management in a game environment and to investigate the validity of that model. The validity of the model is approached from two perspectives: educational validity and representational

  4. Validating neural-network refinements of nuclear mass models

    Science.gov (United States)

    Utama, R.; Piekarewicz, J.

    2018-01-01

    Background: Nuclear astrophysics centers on the role of nuclear physics in the cosmos. In particular, nuclear masses at the limits of stability are critical in the development of stellar structure and the origin of the elements. Purpose: We aim to test and validate the predictions of recently refined nuclear mass models against the newly published AME2016 compilation. Methods: The basic paradigm underlining the recently refined nuclear mass models is based on existing state-of-the-art models that are subsequently refined through the training of an artificial neural network. Bayesian inference is used to determine the parameters of the neural network so that statistical uncertainties are provided for all model predictions. Results: We observe a significant improvement in the Bayesian neural network (BNN) predictions relative to the corresponding "bare" models when compared to the nearly 50 new masses reported in the AME2016 compilation. Further, AME2016 estimates for the handful of impactful isotopes in the determination of r -process abundances are found to be in fairly good agreement with our theoretical predictions. Indeed, the BNN-improved Duflo-Zuker model predicts a root-mean-square deviation relative to experiment of σrms≃400 keV. Conclusions: Given the excellent performance of the BNN refinement in confronting the recently published AME2016 compilation, we are confident of its critical role in our quest for mass models of the highest quality. Moreover, as uncertainty quantification is at the core of the BNN approach, the improved mass models are in a unique position to identify those nuclei that will have the strongest impact in resolving some of the outstanding questions in nuclear astrophysics.

  5. First approximations in avalanche model validations using seismic information

    Science.gov (United States)

    Roig Lafon, Pere; Suriñach, Emma; Bartelt, Perry; Pérez-Guillén, Cristina; Tapia, Mar; Sovilla, Betty

    2017-04-01

    Avalanche dynamics modelling is an essential tool for snow hazard management. Scenario-based numerical modelling provides quantitative arguments for decision-making. The software tool RAMMS (WSL Institute for Snow and Avalanche Research SLF) is one such tool, often used by government authorities and geotechnical offices. As avalanche models improve, the quality of the numerical results will depend increasingly on user experience in the specification of inputs (e.g. release and entrainment volumes, secondary releases, snow temperature and quality). New model developments must continue to be validated using data from real events, to improve performance and reliability. The avalanche group of the University of Barcelona (RISKNAT-UB) has studied the seismic signals generated by avalanches since 1994. Presently, the group manages the seismic installation at SLF's Vallée de la Sionne experimental site (VDLS). At VDLS the recorded seismic signals can be correlated with other avalanche measurement techniques, including both advanced remote sensing methods (radars, videogrammetry) and obstacle-based sensors (pressure, capacitance, optical sender-reflector barriers). This comparison between different measurement techniques allows the group to address the question of whether seismic analysis can be used alone, on additional avalanche tracks, to gain insight into and validate numerical avalanche dynamics models under different terrain conditions. In this study, we aim to use the seismic data as an external record of the phenomena that is able to validate RAMMS models. The seismic sensors are considerably easier and cheaper to install than other physical measuring tools, and are able to record data in all atmospheric conditions (e.g. bad weather, low light and freezing, which make photography and other kinds of sensors unusable). With seismic signals, we record the temporal evolution of the inner and denser parts of the avalanche. We are able to recognize the approximate position

  6. Technical note: Combining quantile forecasts and predictive distributions of streamflows

    Science.gov (United States)

    Bogner, Konrad; Liechti, Katharina; Zappa, Massimiliano

    2017-11-01

    The enhanced availability of many different hydro-meteorological modelling and forecasting systems raises the issue of how to optimally combine this wealth of information. In particular, the use of deterministic and probabilistic forecasts with sometimes widely divergent predicted future streamflow values makes it even more complicated for decision makers to sift out the relevant information. In this study, multiple sources of streamflow forecast information are aggregated based on several different predictive distributions and quantile forecasts. For this combination, the Bayesian model averaging (BMA) approach, non-homogeneous Gaussian regression (NGR), also known as the ensemble model output statistics (EMOS) technique, and a novel method called Beta-transformed linear pooling (BLP) are applied. With the help of the quantile score (QS) and the continuous ranked probability score (CRPS), the combination results for the Sihl River in Switzerland, with about 5 years of forecast data, are compared, and the differences between the raw and optimally combined forecasts are highlighted. The results demonstrate the importance of applying proper forecast combination methods for decision makers in the field of flood and water resource management.
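
    For context on the scoring, the CRPS of an ensemble (or of a set of quantiles treated as equally likely members) can be estimated with the standard sample formula, and the quantile score is the pinball loss; the toy forecasts below are illustrative, not data from the Sihl River study.

```python
import numpy as np

def crps_ensemble(members, obs):
    """Sample-based CRPS for one forecast: members is an array of ensemble
    values (or quantiles treated as equally likely), obs the verifying value."""
    members = np.asarray(members, dtype=float)
    term1 = np.mean(np.abs(members - obs))
    term2 = 0.5 * np.mean(np.abs(members[:, None] - members[None, :]))
    return term1 - term2

def quantile_score(q_pred, obs, tau):
    """Pinball (quantile) loss for a predicted quantile q_pred at level tau."""
    diff = obs - q_pred
    return np.maximum(tau * diff, (tau - 1) * diff)

# Toy streamflow forecasts [m^3/s]: two competing ensembles and one observation.
obs = 42.0
sharp_biased = np.array([50, 51, 52, 53, 54], dtype=float)
wide_centered = np.array([30, 38, 43, 48, 56], dtype=float)
print("CRPS sharp/biased :", crps_ensemble(sharp_biased, obs))
print("CRPS wide/centered:", crps_ensemble(wide_centered, obs))
print("QS (tau=0.9, pred=55):", quantile_score(55.0, obs, 0.9))
```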

  7. A physical framework for evaluating net effects of wet meadow restoration on late summer streamflow

    Science.gov (United States)

    Grant, G.; Nash, C.; Selker, J. S.; Lewis, S.; Noël, P.

    2017-12-01

    Restoration of degraded wet meadows that develop on upland valley floors is intended to achieve a range of ecological benefits. A widely cited benefit is the potential for meadow restoration to augment late-season streamflow; however, there has been little field data demonstrating increased summer flows following restoration. Instead, the hydrologic consequences of restoration have typically been explored using coupled groundwater and surface water flow models at instrumented sites. The expected magnitude and direction of change provided by models has, however, been inconclusive. Here, we assess the streamflow benefit that can be obtained by wet meadow restoration using a parsimonious, physically-based approach. We use a one-dimensional linearized Boussinesq equation with a superimposed solution for changes in storage due to groundwater upwelling, and explicitly calculate evapotranspiration using the White method. The model accurately predicts water table elevations from field data in the Middle Fork John Day watershed in Oregon, USA. The full solution shows that while raising channel beds can increase total water storage via increases in water table elevation in upland valley bottoms, the contributions of both lateral and longitudinal drainage from restored floodplains to late summer streamflow are undetectably small, while losses in streamflow due to greater transpiration, lower hydraulic gradients, and less drainable pore volume are substantial. Although late-summer streamflow increases should not be expected as a direct result of wet meadow restoration, these approaches offer benefits for improving the quality and health of riparian and meadow vegetation that would warrant considering such measures, even at the cost of increased water demand and reduced streamflow.
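
    The White method referred to above estimates daily evapotranspiration from diurnal water-table fluctuations as ET = Sy(24r + delta_s), with Sy the specific yield, r the nighttime recovery rate, and delta_s the net daily fall in water-table elevation; the hourly record and specific yield below are placeholders, not data from the Middle Fork John Day site.

```python
import numpy as np

def white_method_et(wt_levels, specific_yield):
    """Daily evapotranspiration from hourly water-table elevations via the
    White (1932) relation ET = Sy * (24 * r + delta_s).

    r       : nighttime recovery rate (here taken from the 00:00-04:00 rise)
    delta_s : net fall of the water table over the 24 h day (positive if falling)
    """
    wt = np.asarray(wt_levels, dtype=float)
    r = (wt[4] - wt[0]) / 4.0
    delta_s = wt[0] - wt[23]
    return specific_yield * (24.0 * r + delta_s)

# Placeholder hourly water-table record [m]: nighttime recovery followed by
# daytime drawdown driven by meadow evapotranspiration.
hours = np.arange(24)
wt = 1.00 + 0.0015 * np.minimum(hours, 6) - 0.003 * np.clip(hours - 8, 0, 8)
print(f"ET ~ {white_method_et(wt, specific_yield=0.12) * 1000:.1f} mm/day")
```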

  8. Streamflow trends in Europe: evidence from a dataset of near-natural catchments

    Science.gov (United States)

    Stahl, K.; Hisdal, H.; Hannaford, J.; Tallaksen, L. M.; van Lanen, H. A. J.; Sauquet, E.; Demuth, S.; Fendekova, M.; Jódar, J.

    2010-12-01

    Streamflow observations from near-natural catchments are of paramount importance for detection and attribution studies, evaluation of large-scale model simulations, and assessment of water management, adaptation and policy options. This study investigates streamflow trends in a newly-assembled, consolidated dataset of near-natural streamflow records from 441 small catchments in 15 countries across Europe. The period 1962-2004 provided the best spatial coverage, but analyses were also carried out for longer time periods (with fewer stations), starting in 1932, 1942 and 1952. Trends were calculated by the slopes of the Kendall-Theil robust line for standardized annual and monthly streamflow, as well as for summer low flow magnitude and timing. A regionally coherent picture of annual streamflow trends emerged, with negative trends in southern and eastern regions, and generally positive trends elsewhere. Trends in monthly streamflow for 1962-2004 elucidated potential causes for these changes, as well as for changes in hydrological regimes across Europe. Positive trends were found in the winter months in most catchments. A marked shift towards negative trends was observed in April, gradually spreading across Europe to reach a maximum extent in August. Low flows have decreased in most regions where the lowest mean monthly flow occurs in summer, but vary for catchments which have flow minima in winter and secondary low flows in summer. The study largely confirms findings from national and regional scale trend analyses, but clearly adds to these by confirming that these tendencies are part of coherent patterns of change, which cover a much larger region. The broad, continental-scale patterns of change are mostly congruent with the hydrological responses expected from future climatic changes, as projected by climate models. The patterns observed could hence provide a valuable benchmark for a number of different studies and model simulations.
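
    The trend magnitude used here, the slope of the Kendall-Theil robust line, is the Theil-Sen estimator available in SciPy, and its significance is commonly assessed with Kendall's tau; the annual streamflow series below is synthetic.

```python
import numpy as np
from scipy import stats

# Synthetic standardized annual streamflow for 1962-2004 with a weak negative trend.
years = np.arange(1962, 2005)
rng = np.random.default_rng(11)
flow = 1.0 - 0.004 * (years - 1962) + rng.normal(0, 0.15, years.size)

slope, intercept, lo, hi = stats.theilslopes(flow, years, 0.90)
tau, p_value = stats.kendalltau(years, flow)
print(f"Kendall-Theil slope = {slope:.4f} per year (90% CI {lo:.4f} to {hi:.4f})")
print(f"Kendall's tau = {tau:.2f}, p = {p_value:.3f}")
```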

  9. ExEP yield modeling tool and validation test results

    Science.gov (United States)

    Morgan, Rhonda; Turmon, Michael; Delacroix, Christian; Savransky, Dmitry; Garrett, Daniel; Lowrance, Patrick; Liu, Xiang Cate; Nunez, Paul

    2017-09-01

    EXOSIMS is an open-source simulation tool for parametric modeling of the detection yield and characterization of exoplanets. EXOSIMS has been adopted by the Exoplanet Exploration Program's Standards Definition and Evaluation Team (ExSDET) as a common mechanism for comparison of exoplanet mission concept studies. To ensure trustworthiness of the tool, we developed a validation test plan that leverages the Python-language unit-test framework, utilizes integration tests for selected module interactions, and performs end-to-end cross-validation with other yield tools. This paper presents the test methods and results, with the physics-based tests such as photometry and integration time calculation treated in detail and the functional tests treated summarily. The test case utilized a 4 m unobscured telescope with an idealized coronagraph and an exoplanet population from the IPAC radial velocity (RV) exoplanet catalog. The known RV planets were set at quadrature to allow deterministic validation of the calculation of physical parameters, such as working angle, photon counts and integration time. The observing keepout region was tested by generating plots and movies of the targets and the keepout zone over a year. Although the keepout integration test required interpretation by a user, the test revealed problems in the L2 halo orbit and in the parameterization of keepout applied to some solar system bodies, which the development team was able to address. The validation testing of EXOSIMS was performed iteratively with the developers of EXOSIMS and resulted in a more robust, stable, and trustworthy tool that the exoplanet community can use to simulate exoplanet direct-detection missions from probe class, to WFIRST, up to large mission concepts such as HabEx and LUVOIR.

  10. System-Level Validation High-Level Modeling and Directed Test Generation Techniques

    CERN Document Server

    Chen, Mingsong; Koo, Heon-Mo; Mishra, Prabhat

    2013-01-01

    This book covers state-of-the-art techniques for high-level modeling and validation of complex hardware/software systems, including those with multicore architectures. Readers will learn to avoid time-consuming and error-prone validation through the comprehensive coverage of system-level validation, including high-level modeling of designs and faults, automated generation of directed tests, and efficient validation methodology using directed tests and assertions. The methodologies described in this book will help designers to improve the quality of their validation, performing as much validation as possible in the early stages of the design, while reducing the overall validation effort and cost.

  11. Validity of the Neuromuscular Recovery Scale: a measurement model approach.

    Science.gov (United States)

    Velozo, Craig; Moorhouse, Michael; Ardolino, Elizabeth; Lorenz, Doug; Suter, Sarah; Basso, D Michele; Behrman, Andrea L

    2015-08-01

    Objective: To determine how well the Neuromuscular Recovery Scale (NRS) items fit the Rasch, 1-parameter, partial-credit measurement model. Design: Confirmatory factor analysis (CFA) and principal components analysis (PCA) of residuals were used to determine dimensionality. The Rasch, 1-parameter, partial-credit rating scale model was used to determine rating scale structure, person/item fit, point-measure item correlations, item discrimination, and measurement precision. Setting: Seven NeuroRecovery Network clinical sites. Participants: Outpatients (N=188) with spinal cord injury. Interventions: Not applicable. Main Outcome Measure: NRS. Results: While the NRS met 1 of 3 CFA criteria, the PCA revealed that the Rasch measurement dimension explained 76.9% of the variance. Ten of 11 items and 91% of the patients fit the Rasch model, with 9 of 11 items showing high discrimination. Sixty-nine percent of the ratings met criteria. The items showed a logical item-difficulty order, with Stand retraining as the easiest item and Walking as the most challenging item. The NRS showed no ceiling or floor effects and separated the sample into almost 5 statistically distinct strata; individuals with an American Spinal Injury Association Impairment Scale (AIS) D classification showed the most ability, and those with an AIS A classification showed the least ability. Items not meeting the rating scale criteria appear to be related to the low frequency counts. Conclusions: The NRS met many of the Rasch model criteria for construct validity. Copyright © 2015 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.

  12. Validation of a Simplified Model to Generate Multispectral Synthetic Images

    Directory of Open Access Journals (Sweden)

    Ion Sola

    2015-03-01

    A new procedure to assess the quality of topographic correction (TOC) algorithms applied to remote sensing imagery was previously proposed by the authors. This procedure was based on a model that simulated synthetic scenes, representing the radiance an optical sensor would receive from an area under some specific conditions. TOC algorithms were then applied to the synthetic scenes and the resulting corrected scenes were compared with a horizontal synthetic scene free of topographic effects. This comparison enabled an objective and quantitative eva