Climate model uncertainty vs. conceptual geological uncertainty in hydrological modeling
Directory of Open Access Journals (Sweden)
T. O. Sonnenborg
2015-04-01
Projections of climate change impact are associated with a cascade of uncertainties, including the CO2 emission scenario, climate model, downscaling and impact model. The relative importance of the individual uncertainty sources is expected to depend on several factors, including the quantity that is projected. In the present study, the impacts of climate model uncertainty and geological model uncertainty on hydraulic head, stream flow, travel time and capture zones are evaluated. Six versions of a physically based and distributed hydrological model, each containing a unique interpretation of the geological structure of the model area, are forced by 11 climate model projections. Each projection of future climate is the result of a GCM-RCM model combination (from the ENSEMBLES project) forced by the same CO2 scenario (A1B). The changes from the reference period (1991–2010) to the future period (2081–2100) in the projected hydrological variables are evaluated, and the effects of geological model and climate model uncertainties are quantified. The results show that uncertainty propagation is context dependent: while the geological conceptualization is the dominant uncertainty source for projections of travel time and capture zones, climate model uncertainty is more important for groundwater hydraulic heads and stream flow.
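The abstract does not specify how the two uncertainty sources were separated; a minimal sketch, assuming a simple two-way variance decomposition over a (geology × climate model) matrix of projected changes, could look like this. The matrix values in the usage example are hypothetical.

```python
from statistics import mean, pvariance

def uncertainty_shares(changes):
    """Split the ensemble variance of projected changes into a geological
    part, a climate part, and a residual interaction term.

    changes[g][c] is the projected change for geological model g under
    climate model c (any rectangular 2-D list of numbers works)."""
    n_g, n_c = len(changes), len(changes[0])
    geo_means = [mean(row) for row in changes]                         # average over climate models
    clim_means = [mean(changes[g][c] for g in range(n_g)) for c in range(n_c)]
    total = pvariance([v for row in changes for v in row])
    var_geo = pvariance(geo_means)      # spread caused by geological interpretation
    var_clim = pvariance(clim_means)    # spread caused by climate projection
    var_int = max(total - var_geo - var_clim, 0.0)   # whatever neither main effect explains
    return {"geology": var_geo / total, "climate": var_clim / total,
            "interaction": var_int / total}
```

For a purely additive response (change = geology effect + climate effect), the two main-effect variances sum exactly to the total, so the interaction share is zero.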
Uncertainty and the Conceptual Site Model
Price, V.; Nicholson, T. J.
2007-12-01
Our focus is on uncertainties in the underlying conceptual framework upon which all subsequent steps in numerical and/or analytical modeling efforts depend. Experienced environmental modelers recognize the value of selecting an optimal conceptual model from several competing site models, but usually do not formally explore possible alternative models, in part due to incomplete or missing site data, as well as relevant regional data for establishing boundary conditions. The value of, and an approach for, developing alternative conceptual site models (CSMs) is demonstrated by analysis of case histories. These studies are based on reported flow or transport modeling in which alternative site models are formulated using data that were not available to, or not used by, the original modelers. An important concept inherent to model abstraction of these alternative conceptual models is that it is "Far better an approximate answer to the right question, which is often vague, than the exact answer to the wrong question, which can always be made precise." (Tukey, 1962) The case histories discussed here illustrate the value of formulating alternative models and evaluating them using site-specific data: (1) Charleston Naval Site, where seismic characterization data allowed significant revision of the CSM and subsequent contaminant transport modeling; (2) Hanford 300-Area, where surface- and ground-water interactions affecting the unsaturated zone suggested an alternative component to the site model; (3) Savannah River C-Area, where a characterization report for a waste site within the modeled area was not available to the modelers, but provided significant new information requiring changes to the underlying geologic and hydrogeologic CSMs used; (4) Amargosa Desert Research Site (ADRS), where re-interpretation of resistivity sounding data and water-level data suggested an alternative geologic model. Simple 2-D spreadsheet modeling of the ADRS with the revised CSM provided an improved …
Possibilistic uncertainty analysis of a conceptual model of snowmelt runoff
Directory of Open Access Journals (Sweden)
A. P. Jacquin
2010-03-01
This study presents the analysis of predictive uncertainty of a conceptual snowmelt runoff model. The method applied uses possibilistic rather than probabilistic calculus for the evaluation of predictive uncertainty. Possibility theory is an information theory meant to model uncertainties caused by imprecise or incomplete knowledge about a real system rather than by randomness. A snow-dominated catchment in the Chilean Andes is used as a case study. Predictive uncertainty arising from parameter uncertainties of the watershed model is assessed. Model performance is evaluated according to several criteria, in order to define the possibility distribution of the model representations. The likelihoods of the simulated glacier mass balance and snow cover are used for further assessing model credibility. Possibility distributions of the discharge estimates and prediction uncertainty bounds are subsequently derived. The results of the study indicate that the use of additional information allows a reduction of predictive uncertainty. In particular, the assessment of the simulated glacier mass balance and snow cover helps to reduce the width of the uncertainty bounds without a significant increment in the number of unbounded observations.
Possibilistic uncertainty analysis of a conceptual model of snowmelt runoff
Directory of Open Access Journals (Sweden)
A. P. Jacquin
2010-08-01
This study presents the analysis of predictive uncertainty of a conceptual snowmelt runoff model. The method applied uses possibilistic rather than probabilistic calculus for the evaluation of predictive uncertainty. Possibility theory is an information theory meant to model uncertainties caused by imprecise or incomplete knowledge about a real system rather than by randomness. A snow-dominated catchment in the Chilean Andes is used as a case study. Predictive uncertainty arising from parameter uncertainties of the watershed model is assessed. Model performance is evaluated according to several criteria, in order to define the possibility distribution of the parameter vector. The plausibility of the simulated glacier mass balance and snow cover is used to further constrain the model representations. Possibility distributions of the discharge estimates and prediction uncertainty bounds are subsequently derived. The results of the study indicate that the use of additional information allows a reduction of predictive uncertainty. In particular, the assessment of the simulated glacier mass balance and snow cover helps to reduce the width of the uncertainty bounds without a significant increment in the number of unbounded observations.
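The possibilistic machinery above can be sketched in a few lines. The min-conjunction of performance criteria and the α-cut prediction bounds are standard possibility-theory devices; the function names and this particular aggregation are illustrative assumptions, not the paper's exact formulation.

```python
def possibility(scores):
    """Possibility value of one parameter set: its least satisfied criterion,
    the conservative conjunction used in possibility theory. Scores are
    assumed rescaled to [0, 1]."""
    return min(scores)

def alpha_cut_bounds(simulations, possibilities, alpha):
    """Prediction bounds at level alpha: the min/max simulated value over
    all parameter sets whose possibility is at least alpha."""
    kept = [q for q, p in zip(simulations, possibilities) if p >= alpha]
    return (min(kept), max(kept))
```

Raising α keeps only the most plausible parameter sets, so the bounds can only narrow; extra information (e.g. a glacier mass-balance check) enters by lowering the possibility of parameter sets that fail it.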
Combined Estimation of Hydrogeologic Conceptual Model and Parameter Uncertainty
Energy Technology Data Exchange (ETDEWEB)
Meyer, Philip D.; Ye, Ming; Neuman, Shlomo P.; Cantrell, Kirk J.
2004-03-01
The objective of the research described in this report is the development and application of a methodology for comprehensively assessing the hydrogeologic uncertainties involved in dose assessment, including uncertainties associated with conceptual models, parameters, and scenarios. This report describes and applies a statistical method to quantitatively estimate the combined uncertainty in model predictions arising from conceptual model and parameter uncertainties. The method relies on model averaging to combine the predictions of a set of alternative models. Implementation is driven by the available data. When site-specific data are minimal, the method can be carried out with prior parameter estimates based on generic data and subjective prior model probabilities. For sites with observations of system behavior (and optionally data characterizing model parameters), the method uses model calibration to update the prior parameter estimates and model probabilities based on the correspondence between model predictions and site observations. The set of model alternatives can contain both simplified and complex models, with the requirement that all models be based on the same set of data. The method was applied to the geostatistical modeling of air permeability at a fractured rock site. Seven alternative variogram models of log air permeability were considered to represent data from single-hole pneumatic injection tests in six boreholes at the site. Unbiased maximum likelihood estimates of variogram and drift parameters were obtained for each model. Standard information criteria provided an ambiguous ranking of the models, which would not justify selecting one of them and discarding all others as is commonly done in practice. Instead, some of the models were eliminated based on their negligibly small updated probabilities and the rest were used to project the measured log permeabilities by kriging onto a rock volume containing the six boreholes. These four …
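The model-averaging step can be sketched as follows. Posterior model probabilities are commonly derived from information-criterion values (e.g. KIC or BIC) via w_k ∝ p_k · exp(−ΔIC_k / 2); this sketch assumes that standard form, and the criterion values in the usage example are hypothetical.

```python
import math

def model_weights(ic, priors=None):
    """Posterior model probabilities from information-criterion values,
    w_k proportional to prior_k * exp(-(IC_k - IC_min) / 2).
    Subtracting the best IC first avoids floating-point underflow."""
    if priors is None:
        priors = [1.0 / len(ic)] * len(ic)   # uniform subjective priors
    best = min(ic)
    raw = [p * math.exp(-(v - best) / 2.0) for v, p in zip(ic, priors)]
    s = sum(raw)
    return [r / s for r in raw]

def model_average(weights, predictions):
    """Model-averaged prediction: probability-weighted sum over models."""
    return sum(w * x for w, x in zip(weights, predictions))
```

A model 20 IC units behind the best gets weight ∝ e⁻¹⁰ ≈ 5e-5, i.e. a negligibly small updated probability, which is the elimination criterion the report describes.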
Addressing Conceptual Model Uncertainty in the Evaluation of Model Prediction Errors
Carrera, J.; Pool, M.
2014-12-01
Model predictions are uncertain because of errors in model parameters, future forcing terms, and model concepts. The latter remain the largest and most difficult-to-assess source of uncertainty in long-term model predictions. We first review existing methods to evaluate conceptual model uncertainty. We argue that they are highly sensitive to the ingenuity of the modeler, in the sense that they rely on the modeler's ability to propose alternative model concepts. Worse, we find that the standard practice of stochastic methods leads to poor, potentially biased and often too optimistic, estimation of actual model errors. This is bad news because stochastic methods are purported to properly represent uncertainty. We contend that the problem does not lie in the stochastic approach itself, but in the way it is applied. Specifically, stochastic inversion methodologies, which demand quantitative information, tend to ignore geological understanding, which is conceptually rich. We illustrate some of these problems with an application to the Mar del Plata aquifer, where extensive data are available for nearly a century. Geologically based models, where spatial variability is handled through zonation, yield calibration fits similar to geostatistically based models, but much better predictions. In fact, the appearance of the stochastic T fields is similar to the geologically based models only in areas with a high density of data. We take this finding to illustrate the ability of stochastic models to accommodate many data, but also, ironically, their inability to address conceptual model uncertainty. In fact, stochastic model realizations tend to be too close to the "most likely" one (i.e., they do not really realize the full conceptual uncertainty). The second part of the presentation is devoted to arguing that acknowledging model uncertainty may lead to qualitatively different decisions than just working with "most likely" model predictions. Therefore, efforts should concentrate on …
Directory of Open Access Journals (Sweden)
R. Rojas
2009-09-01
In this work we assess the uncertainty in modelling the groundwater flow for the Pampa del Tamarugal Aquifer (PTA), North Chile, using a novel and fully integrated multi-model approach aimed at explicitly accounting for uncertainties arising from the definition of alternative conceptual models. The approach integrates the Generalized Likelihood Uncertainty Estimation (GLUE) and Bayesian Model Averaging (BMA) methods. For each member of an ensemble M of potential conceptualizations, model weights used in BMA for multi-model aggregation are obtained from GLUE-based likelihood values. These model weights are based on model performance, thus reflecting how well a conceptualization reproduces an observed dataset D. GLUE-based cumulative predictive distributions for each member of M are then aggregated, obtaining predictive distributions that account for conceptual model uncertainties. For the PTA we propose an ensemble of eight alternative conceptualizations covering all major features of groundwater flow models independently developed in past studies, including two recharge mechanisms which have been a source of debate for several years. Results showed that accounting for heterogeneities in the hydraulic conductivity field (a) reduced the uncertainty in the estimations of parameters and state variables, and (b) increased the corresponding model weights used for multi-model aggregation. This was more noticeable when the hydraulic conductivity field was conditioned on available hydraulic conductivity measurements. The contribution of conceptual model uncertainty to the predictive uncertainty varied between 6% and 64% for groundwater head estimations and between 16% and 79% for groundwater flow estimations. These results clearly illustrate the relevance of conceptual model uncertainty.
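A condensed sketch of the GLUE-BMA coupling described above. It assumes the Nash-Sutcliffe efficiency as the informal GLUE likelihood measure (a common choice; the paper's measure may differ): each conceptual model gets a BMA weight proportional to the summed above-threshold likelihoods of its behavioural simulations.

```python
def nse(obs, sim):
    """Nash-Sutcliffe efficiency, used here as an informal likelihood."""
    mean_obs = sum(obs) / len(obs)
    num = sum((o - s) ** 2 for o, s in zip(obs, sim))
    den = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - num / den

def glue_bma_weights(obs, ensembles, threshold=0.0):
    """One weight per conceptual model: the sum of above-threshold GLUE
    likelihoods over its Monte Carlo simulations, normalised over models.
    ensembles[k] is the list of simulated series for conceptualization k."""
    sums = []
    for sims in ensembles:
        likes = [nse(obs, s) for s in sims]
        sums.append(sum(l for l in likes if l > threshold))  # behavioural only
    total = sum(sums)
    if total == 0.0:                     # no behavioural simulation anywhere
        return [0.0] * len(sums)
    return [s / total for s in sums]
```

Conceptualizations whose simulations track the observed dataset D accumulate more behavioural likelihood and hence larger weights, exactly the performance-based weighting the abstract describes.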
Where does the uncertainty come from? Attributing Uncertainty in Conceptual Hydrologic Modelling
Abu Shoaib, S.; Marshall, L. A.; Sharma, A.
2015-12-01
Defining an appropriate forecasting model is a key phase in water resources planning and design, and quantification of uncertainty is an important step in the development and application of hydrologic models. In this study, we examine the dependency of hydrologic model uncertainty on the observed model inputs, the defined model structure, parameter identifiability under optimization, and the specified likelihood. We present a new uncertainty metric, the Quantile Flow Deviation (QFD), to evaluate the relative uncertainty due to each of these sources under a range of catchment conditions. Through the metric, we may identify the potential spectrum of uncertainty and variability in model simulations. The QFD assesses uncertainty by estimating the deviation in flows at a given quantile across a range of scenarios. By using a quantile-based metric, the change in uncertainty across individual percentiles can be assessed, thereby allowing uncertainty to be expressed as a function of time. The QFD method can be disaggregated to examine any part of the modelling process, including the selection of certain model subroutines or forcing data. Case study results (including catchments in Australia and the USA) suggest that model structure selection is vital irrespective of the flow percentile of interest or the catchment being studied. Examining the QFD across various quantiles additionally demonstrates that lower-yielding catchments may have greater variation due to selected model structures. By incorporating multiple model structures, it is possible to assess (i) the relative importance of various sources of uncertainty, (ii) how these vary with the change in catchment location or hydrologic regime; and (iii) the impact of the length of available observations in uncertainty quantification.
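The abstract does not give the QFD formula; the sketch below assumes one plausible reading, the spread of a given flow quantile across modelling scenarios, so treat the definition as illustrative rather than the paper's exact metric.

```python
def quantile(values, q):
    """Linear-interpolation quantile for 0 <= q <= 1."""
    xs = sorted(values)
    pos = q * (len(xs) - 1)
    lo = int(pos)
    hi = min(lo + 1, len(xs) - 1)
    return xs[lo] + (pos - lo) * (xs[hi] - xs[lo])

def qfd(scenario_flows, q):
    """Illustrative Quantile Flow Deviation: the spread between the largest
    and smallest flow quantile q across modelling scenarios.

    scenario_flows is a list of flow series, one per scenario (e.g. per
    model structure, forcing realisation, or calibrated parameter set)."""
    qs = [quantile(flows, q) for flows in scenario_flows]
    return max(qs) - min(qs)
```

Sweeping q from low-flow to flood percentiles then shows how the disagreement between scenarios, and hence the attributed uncertainty, varies across the flow regime.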
Silva, F. E. O. E.; Naghettini, M. D. C.; Fernandes, W.
2014-12-01
This paper evaluated the uncertainties associated with the estimation of the parameters of a conceptual rainfall-runoff model, through the use of Bayesian inference techniques by Monte Carlo simulation. The Pará River sub-basin, located in the upper São Francisco river basin in southeastern Brazil, was selected for developing the studies. In this paper, we used the Rio Grande conceptual hydrologic model (EHR/UFMG, 2001) and the Markov Chain Monte Carlo simulation method named DREAM (VRUGT, 2008a). Two probabilistic models for the residuals were analyzed: (i) the classic [Normal likelihood, r ~ N(0, σ²)]; and (ii) a generalized likelihood (SCHOUPS & VRUGT, 2010), in which it is assumed that the differences between observed and simulated flows are correlated, non-stationary, and distributed as a Skew Exponential Power density. The assumptions made for both models were checked to ensure that the estimation of uncertainties in the parameters was not biased. The results showed that the Bayesian approach was adequate for the proposed objectives, and reinforced the importance of assessing the uncertainties associated with hydrological modeling.
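DREAM itself runs multiple interacting chains with adaptive differential-evolution proposals; as a hedged stand-in, the random-walk Metropolis sampler below shows the core Bayesian machinery under the classic Normal-residual likelihood r ~ N(0, σ²). The toy model, step size, and data in the usage example are illustrative.

```python
import math
import random

def log_likelihood(params, forcing, obs, model, sigma):
    """Gaussian log-likelihood under the classic assumption r ~ N(0, sigma^2)."""
    sim = model(params, forcing)
    const = math.log(sigma * math.sqrt(2.0 * math.pi))
    return sum(-0.5 * ((o - s) / sigma) ** 2 - const for o, s in zip(obs, sim))

def metropolis(model, forcing, obs, start, step, n_iter, sigma=1.0, seed=1):
    """Random-walk Metropolis sampler (a minimal stand-in for DREAM).
    Returns the chain of parameter vectors."""
    rng = random.Random(seed)
    theta = list(start)
    ll = log_likelihood(theta, forcing, obs, model, sigma)
    chain = []
    for _ in range(n_iter):
        prop = [t + rng.gauss(0.0, step) for t in theta]       # symmetric proposal
        ll_prop = log_likelihood(prop, forcing, obs, model, sigma)
        if math.log(rng.random()) < ll_prop - ll:              # accept/reject
            theta, ll = prop, ll_prop
        chain.append(list(theta))
    return chain
```

The spread of the post-burn-in chain is the parameter uncertainty the paper quantifies; swapping the Gaussian log-likelihood for a Skew Exponential Power density with autocorrelated residuals would give the generalized variant (ii).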
DEFF Research Database (Denmark)
Thomsen, Nanna Isbak; Troldborg, Mads; McKnight, Ursula S.
2012-01-01
Mass discharge estimates are increasingly being used in the management of contaminated sites. Such estimates have proven useful for supporting decisions related to the prioritization of contaminated sites in a groundwater catchment. Potential management options can be categorised as follows: (1) leave as is, (2) clean up, or (3) further investigation needed. However, mass discharge estimates are often very uncertain, which may hamper the management decisions. If option 1 is incorrectly chosen, soil and water quality will decrease, threatening or destroying drinking water resources. The risk … the appropriate management option. The uncertainty of mass discharge estimates depends greatly on the extent of the site characterization. A good approach for uncertainty estimation will be flexible with respect to the investigation level, and account for both parameter and conceptual model uncertainty. We …
Wei Wu; James Clark; James Vose
2010-01-01
Hierarchical Bayesian (HB) modeling allows for multiple sources of uncertainty by factoring complex relationships into conditional distributions that can be used to draw inference and make predictions. We applied an HB model to estimate the parameters and state variables of a parsimonious hydrological model – GR4J – by coherently assimilating the uncertainties from the …
Mockler, E. M.; Chun, K. P.; Sapriza-Azuri, G.; Bruen, M.; Wheater, H. S.
2016-11-01
Predictions of river flow dynamics provide vital information for many aspects of water management including water resource planning, climate adaptation, and flood and drought assessments. Many of the subjective choices that modellers make including model and criteria selection can have a significant impact on the magnitude and distribution of the output uncertainty. Hydrological modellers are tasked with understanding and minimising the uncertainty surrounding streamflow predictions before communicating the overall uncertainty to decision makers. Parameter uncertainty in conceptual rainfall-runoff models has been widely investigated, and model structural uncertainty and forcing data have been receiving increasing attention. This study aimed to assess uncertainties in streamflow predictions due to forcing data and the identification of behavioural parameter sets in 31 Irish catchments. By combining stochastic rainfall ensembles and multiple parameter sets for three conceptual rainfall-runoff models, an analysis of variance model was used to decompose the total uncertainty in streamflow simulations into contributions from (i) forcing data, (ii) identification of model parameters and (iii) interactions between the two. The analysis illustrates that, for our subjective choices, hydrological model selection had a greater contribution to overall uncertainty, while performance criteria selection influenced the relative intra-annual uncertainties in streamflow predictions. Uncertainties in streamflow predictions due to the method of determining parameters were relatively lower for wetter catchments, and more evenly distributed throughout the year when the Nash-Sutcliffe Efficiency of logarithmic values of flow (lnNSE) was the evaluation criterion.
Thomsen, Nanna I.; Binning, Philip J.; McKnight, Ursula S.; Tuxen, Nina; Bjerg, Poul L.; Troldborg, Mads
2016-05-01
A key component in risk assessment of contaminated sites is the formulation of a conceptual site model (CSM). A CSM is a simplified representation of reality and forms the basis for the mathematical modeling of contaminant fate and transport at the site. The CSM should therefore identify the most important site-specific features and processes that may affect the contaminant transport behavior at the site. However, the development of a CSM will always be associated with uncertainties due to limited data and lack of understanding of the site conditions. CSM uncertainty is often found to be a major source of model error and it should therefore be accounted for when evaluating uncertainties in risk assessments. We present a Bayesian belief network (BBN) approach for constructing CSMs and assessing their uncertainty at contaminated sites. BBNs are graphical probabilistic models that are effective for integrating quantitative and qualitative information, and thus can strengthen decisions when empirical data are lacking. The proposed BBN approach facilitates a systematic construction of multiple CSMs, and then determines the belief in each CSM using a variety of data types and/or expert opinion at different knowledge levels. The developed BBNs combine data from desktop studies and initial site investigations with expert opinion to assess which of the CSMs are more likely to reflect the actual site conditions. The method is demonstrated on a Danish field site, contaminated with chlorinated ethenes. Four different CSMs are developed by combining two contaminant source zone interpretations (presence or absence of a separate phase contamination) and two geological interpretations (fractured or unfractured clay till). The beliefs in each of the CSMs are assessed sequentially based on data from three investigation stages (a screening investigation, a more detailed investigation, and an expert consultation) to demonstrate that the belief can be updated as more information becomes available.
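At its core, the sequential belief assessment over competing CSMs is repeated application of Bayes' rule. The likelihood values for the three investigation stages below are hypothetical placeholders, not the study's elicited numbers.

```python
def update_beliefs(prior, likelihoods):
    """One round of Bayes' rule over competing conceptual site models:
    posterior_k proportional to prior_k * P(evidence | CSM_k)."""
    post = [p * l for p, l in zip(prior, likelihoods)]
    s = sum(post)
    return [x / s for x in post]

# Four CSMs (2 source-zone x 2 geology interpretations), initially equal belief.
beliefs = [0.25, 0.25, 0.25, 0.25]

# Hypothetical P(evidence | CSM_k) for each investigation stage.
stages = [
    [0.6, 0.4, 0.3, 0.2],   # screening investigation
    [0.7, 0.3, 0.2, 0.1],   # more detailed investigation
    [0.8, 0.4, 0.1, 0.1],   # expert consultation
]
for evidence in stages:
    beliefs = update_beliefs(beliefs, evidence)
```

After the three stages the belief concentrates on the CSM whose predictions best match the accumulated evidence, mirroring the sequential updating demonstrated at the Danish field site.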
DEFF Research Database (Denmark)
Troldborg, Mads; Thomsen, Nanna Isbak; McKnight, Ursula S.
Bayesian belief networks (BBNs) are graphical probabilistic models that are effective for integrating quantitative and qualitative information, and thus can strengthen decisions when empirical data are lacking. The developed BBN combines data from desk studies and initial site investigations with expert opinion to assess which of the conceptual models are more likely to reflect the actual site conditions, and can thereby help inform future investigations at a contaminated site.
Sikorska, A. E.; Scheidegger, A.; Banasik, K.; Rieckermann, J.
2012-04-01
Urbanization and the resulting land-use change strongly affect the water cycle and runoff processes in watersheds. Unfortunately, small urban watersheds, which are most affected by urban sprawl, are mostly ungauged. This makes it intrinsically difficult to assess the consequences of urbanization. Most of all, it is unclear how to reliably assess the predictive uncertainty given the structural deficits of the applied models. In this study, we therefore investigate the uncertainty of flood predictions in ungauged urban basins from structurally uncertain rainfall-runoff models. To this end, we suggest a procedure to explicitly account for input uncertainty and model structure deficits using Bayesian statistics with a continuous-time autoregressive error model. In addition, we propose a concise procedure to derive prior parameter distributions from base data and successfully apply the methodology to an urban catchment in Warsaw, Poland. Based on our results, we are able to demonstrate that the autoregressive error model greatly helps to meet the statistical assumptions and to compute reliable prediction intervals. In our study, we found that predicted peak flows were up to seven times higher than observations. This was reduced to five times with Bayesian updating, using only a few discharge measurements. In addition, our analysis suggests that imprecise rainfall information and model structure deficits contribute mostly to the total prediction uncertainty. In the future, flood predictions in ungauged basins will become more important due to ongoing urbanization as well as anthropogenic and climatic changes. Thus, providing reliable measures of uncertainty is crucial to support decision making.
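The paper uses a continuous-time autoregressive error model; a discrete-time AR(1) sketch conveys the same idea of propagating correlated residuals into prediction intervals. The parameter values in the usage example are illustrative.

```python
import math

def ar1_prediction(sim, residual0, phi, sigma_eps, z=1.96):
    """Propagate an AR(1) error model e_t = phi * e_(t-1) + eps_t,
    eps_t ~ N(0, sigma_eps^2), along a simulated flow series.

    residual0 is the last observed model error (e.g. from a few discharge
    measurements). Returns (mean, lower, upper) per time step."""
    out = []
    e_mean, e_var = residual0, 0.0
    for s in sim:
        e_mean = phi * e_mean                         # decaying memory of the last error
        e_var = phi ** 2 * e_var + sigma_eps ** 2     # forecast variance grows, then saturates
        sd = math.sqrt(e_var)
        out.append((s + e_mean, s + e_mean - z * sd, s + e_mean + z * sd))
    return out
```

This shows both effects reported in the abstract: the last measured residual shifts (corrects) the near-term prediction, while the interval width reflects how quickly the error memory fades.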
DEFF Research Database (Denmark)
Gondwe, Bibi Ruth Neuman; Merediz-Alonso, Gonzalo; Bauer-Gottwein, Peter
2011-01-01
… to preserve water resources and maintain ecosystem services. Multiple Model Simulation highlights the impact of model structure uncertainty on management decisions using several plausible conceptual models. Multiple Model Simulation was used for this purpose on the Yucatan Peninsula, which is one of the world's largest karstic aquifers. The aquifer is the only available fresh water source for human users and ecosystems on the Peninsula. One of Mexico's largest protected areas, the groundwater-dependent Sian Ka'an Biosphere Reserve (5280 km2), is fed by the aquifer's thin freshwater lens. Increasing groundwater abstractions and pollution threaten the fresh water resource, and consequently the ecosystem integrity of both Sian Ka'an and the adjacent coastal environment. Seven different catchment-scale conceptual models were implemented in a distributed hydrological modelling approach. Equivalent porous medium …
Mockler, Eva M.; O'Loughlin, Fiachra E.; Bruen, Michael
2016-05-01
Increasing pressures on water quality due to intensification of agriculture have raised demands for environmental modeling to accurately simulate the movement of diffuse (nonpoint) nutrients in catchments. As hydrological flows drive the movement and attenuation of nutrients, individual hydrological processes in models should be adequately represented for water quality simulations to be meaningful. In particular, the relative contribution of groundwater and surface runoff to rivers is of interest, as increasing nitrate concentrations are linked to higher groundwater discharges. These requirements for hydrological modeling of groundwater contribution to rivers initiated this assessment of internal flow path partitioning in conceptual hydrological models. In this study, a variance-based sensitivity analysis method was used to investigate parameter sensitivities and flow partitioning of three conceptual hydrological models simulating 31 Irish catchments. We compared two established conceptual hydrological models (NAM and SMARG) and a new model (SMART), produced especially for water quality modeling. In addition to the criteria that assess streamflow simulations, a ratio of average groundwater contribution to total streamflow was calculated for all simulations over the 16-year study period. As observed time series of groundwater contributions to streamflow are not available at catchment scale, the groundwater ratios were evaluated against average annual indices of base flow and deep groundwater flow for each catchment. The exploration of sensitivities of internal flow path partitioning was a specific focus to assist in evaluating model performances. Results highlight that model structure has a strong impact on simulated groundwater flow paths. Sensitivity to the internal pathways in the models is not reflected in the performance criteria results. This demonstrates that simulated groundwater contribution should be constrained by independent data to ensure results
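The variance-based sensitivity analysis mentioned above rests on the first-order index Var(E[Y|X]) / Var(Y). A crude binning estimator on a hypothetical two-parameter flow-partitioning toy (the parameters k and c and the response formula are invented for illustration, not taken from NAM, SMARG, or SMART) sketches the idea:

```python
import numpy as np

def first_order_index(x, y, bins=20):
    """Crude estimate of the first-order sensitivity index
    Var(E[Y|X]) / Var(Y), using binned conditional means."""
    edges = np.quantile(x, np.linspace(0.0, 1.0, bins + 1))
    idx = np.clip(np.searchsorted(edges, x, side="right") - 1, 0, bins - 1)
    cond_means = np.array([y[idx == b].mean() for b in range(bins)])
    counts = np.array([(idx == b).sum() for b in range(bins)])
    var_cond = np.average((cond_means - y.mean()) ** 2, weights=counts)
    return var_cond / y.var()

rng = np.random.default_rng(1)
n = 20_000
# Hypothetical partitioning model: the simulated groundwater ratio depends
# strongly on a recession parameter k and only weakly on a parameter c
k = rng.uniform(0.1, 0.9, n)
c = rng.uniform(0.0, 1.0, n)
gw_ratio = k ** 2 + 0.05 * c

s_k = first_order_index(k, gw_ratio)   # dominant parameter
s_c = first_order_index(c, gw_ratio)   # near-negligible parameter
```

The indices recover the designed asymmetry: almost all of the output variance traces back to k, which is the kind of ranking the study uses to compare internal flow paths across model structures.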
Directory of Open Access Journals (Sweden)
Marc A. Nelitz
2015-12-01
Full Text Available Complexity and uncertainty are inherent in social-ecological systems. Although they can create challenges for scientists and decision makers, they cannot be a reason for delaying decision making. Two strategies have matured in recent decades to address these challenges. Systems thinking, as embodied by conceptual modeling, is a holistic approach in which a system can be better understood by examining it as a whole. Expert elicitation represents a second strategy that enables a greater diversity of inputs to understand complex systems. We explored the use of conceptual models and expert judgments to inform expansion of monitoring around oil sands development in northern Alberta, Canada, particularly related to migratory forest birds. This study area is a complex social-ecological system for which there is an abundance of specific information, but a relatively weak understanding of system behavior. Multiple conceptual models were developed to represent complexity and provide a fuller view of influences across the landscape. A hierarchical approach proved useful, and a mechanistic structure of the models clarified the cumulative and interactive nature of factors within and outside the study area. To address gaps in understanding, expert judgments were integrated using a series of structured exercises to derive "weightings" of importance of different components in the conceptual models, specifically pairwise comparisons, Likert scaling, and a maximum difference conjoint approach. These exercises were helpful for discriminating the importance of different influences and illuminating the competing beliefs of experts. Various supporting tools helped us engage a group of experts from across North America; these included a virtual meeting, online polling, desktop sharing, a web survey, and a financial incentive. This combination of techniques was innovative and proved useful for addressing complexity and uncertainty in a specific natural resource
DEFF Research Database (Denmark)
Thomsen, Nanna Isbak; Binning, Philip John; McKnight, Ursula S.;
2016-01-01
to be a major source of model error and it should therefore be accounted for when evaluating uncertainties in risk assessments. We present a Bayesian belief network (BBN) approach for constructing CSMs and assessing their uncertainty at contaminated sites. BBNs are graphical probabilistic models...... that are effective for integrating quantitative and qualitative information, and thus can strengthen decisions when empirical data are lacking. The proposed BBN approach facilitates a systematic construction of multiple CSMs, and then determines the belief in each CSM using a variety of data types and/or expert...... with chlorinated ethenes. Four different CSMs are developed by combining two contaminant source zone interpretations (presence or absence of a separate phase contamination) and two geological interpretations (fractured or unfractured clay till). The beliefs in each of the CSMs are assessed sequentially based...
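The sequential belief assessment over four alternative CSMs can be sketched with plain Bayes' rule over a discrete hypothesis set. The prior and likelihood numbers below are hypothetical stand-ins, not values from the study's Bayesian belief network:

```python
import numpy as np

# Illustrative sketch: sequential belief updating over four alternative
# conceptual site models (CSMs). All likelihood values are hypothetical.
priors = np.full(4, 0.25)           # equal prior belief in each CSM
evidence = np.array([
    [0.7, 0.4, 0.2, 0.1],           # P(data | CSM), e.g. a geophysical survey
    [0.6, 0.5, 0.3, 0.1],           # P(data | CSM), e.g. a contaminant profile
])

beliefs = priors.copy()
for like in evidence:
    beliefs = beliefs * like        # Bayes' rule, unnormalized
    beliefs = beliefs / beliefs.sum()
```

Each line of evidence redistributes belief toward the CSMs most consistent with the data, mirroring the sequential assessment described in the abstract.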
Energy Technology Data Exchange (ETDEWEB)
Meyer, Philip D.; Gee, Glendon W.; Nicholson, Thomas J.
2000-02-28
This report addresses issues related to the analysis of uncertainty in dose assessments conducted as part of decommissioning analyses. The analysis is limited to the hydrologic aspects of the exposure pathway involving infiltration of water at the ground surface, leaching of contaminants, and transport of contaminants through the groundwater to a point of exposure. The basic conceptual models and mathematical implementations of three dose assessment codes are outlined along with the site-specific conditions under which the codes may provide inaccurate, potentially nonconservative results. In addition, the hydrologic parameters of the codes are identified and compared. A methodology for parameter uncertainty assessment is outlined that considers the potential data limitations and modeling needs of decommissioning analyses. This methodology uses generic parameter distributions based on national or regional databases, sensitivity analysis, probabilistic modeling, and Bayesian updating to incorporate site-specific information. Data sources for best-estimate parameter values and parameter uncertainty information are also reviewed. A follow-on report will illustrate the uncertainty assessment methodology using decommissioning test cases.
Gondwe, Bibi R. N.; Merediz-Alonso, Gonzalo; Bauer-Gottwein, Peter
2011-03-01
Groundwater management in karst is often based on limited hydrologic understanding of the aquifer. The geologic heterogeneities controlling the water flow are often insufficiently mapped. As karst aquifers are very vulnerable to pollution, groundwater protection and land use management are crucial to preserve water resources and maintain ecosystem services. Multiple Model Simulation highlights the impact of model structure uncertainty on management decisions using several plausible conceptual models. Multiple Model Simulation was used for this purpose on the Yucatan Peninsula, which is one of the world's largest karstic aquifers. The aquifer is the only available fresh water source for human users and ecosystems on the Peninsula. One of Mexico's largest protected areas, the groundwater-dependent Sian Ka'an Biosphere Reserve (5280 km²) is fed by the aquifer's thin freshwater lens. Increasing groundwater abstractions and pollution threatens the fresh water resource, and consequently the ecosystem integrity of both Sian Ka'an and the adjacent coastal environment. Seven different catchment-scale conceptual models were implemented in a distributed hydrological modelling approach. Equivalent porous medium conceptualizations with uniform and heterogeneous distributions of hydraulic conductivities were used. The models demonstrated that Sian Ka'an's wetlands are indeed groundwater-fed. The water quantities in the wetlands and the flooding dynamics are determined by the larger groundwater catchment. The overall water balance for the model domain showed that recharge constitutes 4400 ± 700 million m³/year. Of this, 4-12% exits as overland flow, and 88-96% exits as groundwater flow. Net groundwater outflow from the model domain to the north via the Holbox fracture zone appears as an important cross-basin transfer between regions of the Peninsula. Probability maps of Sian Ka'an's catchment were obtained through automatic calibration and stochastic modelling
He, Yujie; Yang, Jinyan; Zhuang, Qianlai; McGuire, Anthony; Zhu, Qing; Liu, Yaling; Teskey, Robert O.
2014-01-01
Conventional Q10 soil organic matter decomposition models and more complex microbial models are available for making projections of future soil carbon dynamics. However, it is unclear (1) how well the conceptually different approaches can simulate observed decomposition and (2) to what extent the trajectories of long-term simulations differ when using the different approaches. In this study, we compared three structurally different soil carbon (C) decomposition models (one Q10 and two microbial models of different complexity), each with a one- and two-horizon version. The models were calibrated and validated using 4 years of measurements of heterotrophic soil CO2 efflux from trenched plots in a Dahurian larch (Larix gmelinii Rupr.) plantation. All models reproduced the observed heterotrophic component of soil CO2 efflux, but the trajectories of soil carbon dynamics differed substantially in 100 year simulations with and without warming and increased litterfall input, with the microbial models producing better agreement with observed changes in soil organic C in long-term warming experiments. Our results also suggest that both constant and varying carbon use efficiency are plausible when modeling future decomposition dynamics and that the use of a short-term (e.g., a few years) period of measurement is insufficient to adequately constrain model parameters that represent long-term responses of microbial thermal adaption. These results highlight the need to reframe the representation of decomposition models and to constrain parameters with long-term observations and multiple data streams. We urge caution in interpreting future soil carbon responses derived from existing decomposition models because both conceptual and parameter uncertainties are substantial.
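For reference, the conventional Q10 formulation the study compares against is a one-line temperature response: the decomposition rate scales by a factor Q10 for every 10 °C of warming. The parameter values here are illustrative, not the study's calibrated ones:

```python
def q10_rate(base_rate, q10, temp, ref_temp=10.0):
    """Decomposition rate at temperature `temp` under the conventional
    Q10 response; `base_rate` applies at `ref_temp` (values illustrative)."""
    return base_rate * q10 ** ((temp - ref_temp) / 10.0)

r10 = q10_rate(1.0, 2.0, 10.0)   # at the reference temperature
r20 = q10_rate(1.0, 2.0, 20.0)   # 10 degrees C warmer: rate doubles for Q10 = 2
```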
Energy Technology Data Exchange (ETDEWEB)
Meyer, Philip D.; Ye, Ming; Rockhold, Mark L.; Neuman, Shlomo P.; Cantrell, Kirk J.
2007-07-30
This report to the Nuclear Regulatory Commission (NRC) describes the development and application of a methodology to systematically and quantitatively assess predictive uncertainty in groundwater flow and transport modeling that considers the combined impact of hydrogeologic uncertainties associated with the conceptual-mathematical basis of a model, model parameters, and the scenario to which the model is applied. The methodology is based on an extension of a Maximum Likelihood implementation of Bayesian Model Averaging. Model uncertainty is represented by postulating a discrete set of alternative conceptual models for a site with associated prior model probabilities that reflect a belief about the relative plausibility of each model based on its apparent consistency with available knowledge and data. Posterior model probabilities are computed and parameter uncertainty is estimated by calibrating each model to observed system behavior; prior parameter estimates are optionally included. Scenario uncertainty is represented as a discrete set of alternative future conditions affecting boundary conditions, source/sink terms, or other aspects of the models, with associated prior scenario probabilities. A joint assessment of uncertainty results from combining model predictions computed under each scenario using as weights the posterior model and prior scenario probabilities. The uncertainty methodology was applied to modeling of groundwater flow and uranium transport at the Hanford Site 300 Area. Eight alternative models representing uncertainty in the hydrogeologic and geochemical properties as well as the temporal variability were considered. Two scenarios representing alternative future behavior of the Columbia River adjacent to the site were considered. The scenario alternatives were implemented in the models through the boundary conditions. Results demonstrate the feasibility of applying a comprehensive uncertainty assessment to large-scale, detailed groundwater flow
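The core of the methodology, posterior model probabilities combined with prior scenario probabilities, can be sketched as follows. MLBMA typically weights calibrated models through an information criterion (the Kashyap criterion, KIC); the simplified weighting and all numbers below are illustrative assumptions, not the report's values:

```python
import numpy as np

def mlbma_weights(ic, priors):
    """Posterior model probabilities from information-criterion values,
    in the spirit of Maximum Likelihood Bayesian Model Averaging
    (a simplified, illustrative weighting)."""
    d = ic - ic.min()
    w = priors * np.exp(-0.5 * d)
    return w / w.sum()

ic = np.array([12.0, 14.5, 13.2, 18.0])    # hypothetical criterion per model
priors = np.full(4, 0.25)                  # equal prior model probabilities
post = mlbma_weights(ic, priors)

# Combine predictions across models and two alternative scenarios
pred = np.array([[3.1, 3.4, 2.9, 3.8],     # scenario A predictions (hypothetical)
                 [4.0, 4.3, 3.7, 4.9]])    # scenario B predictions (hypothetical)
p_scen = np.array([0.6, 0.4])              # prior scenario probabilities
mean_pred = p_scen @ (pred @ post)         # joint model-and-scenario average
```

The joint average is a convex combination of all model-scenario predictions, so it always lies within the span of the individual predictions.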
Transport Behavior in Fractured Rock under Conceptual and Parametric Uncertainty
Pham, H. V.; Parashar, R.; Sund, N. L.; Pohlmann, K.
2016-12-01
Lack of hydrogeological data and knowledge leads to uncertainty in numerical modeling, and many conceptualizations are often proposed to represent uncertain model components derived from the same data. This study investigates how conceptual and parametric uncertainty influence transport behavior in three-dimensional discrete fracture networks (DFN). dfnWorks, a parallelized computational suite developed at the Los Alamos National Laboratory, is used to simulate flow and transport in simple 3D percolating DFNs. Model averaging techniques in a Monte-Carlo framework are adopted to effectively predict contaminant plumes and to quantify prediction uncertainty arising from conceptual and parametric uncertainties. The method is applied to stochastic fracture networks with orthogonal sets of background fractures and domain-spanning faults. The sources of uncertainty are the boundary conditions and the fault characteristics. Spatial and temporal analyses of the contaminant plumes are conducted to compute the influence of the uncertainty sources on the transport behavior. The flow and transport characteristics of 3D stochastic DFNs under uncertainty help in laying the groundwork for model development and analysis of field scale fractured rock systems.
Hernández-López, Mario R.; Romero-Cuéllar, Jonathan; Camilo Múnera-Estrada, Juan; Coccia, Gabriele; Francés, Félix
2017-04-01
It is important to emphasize the role of uncertainty, particularly when model forecasts are used to support decision-making and water management. This research compares two approaches for evaluating predictive uncertainty in hydrological modeling. The first approach is the Bayesian Joint Inference of hydrological and error models. The second approach is carried out through the Model Conditional Processor using the Truncated Normal Distribution in the transformed space. The comparison focuses on the reliability of the predictive distribution. The case study is applied to two basins included in the Model Parameter Estimation Experiment (MOPEX). These two basins, which have different hydrological complexity, are the French Broad River (North Carolina) and the Guadalupe River (Texas). The results indicate that, generally, both approaches are able to provide similar predictive performances. However, differences between them can arise in basins with complex hydrology (e.g. ephemeral basins), because the results obtained with Bayesian Joint Inference depend strongly on the suitability of the hypothesized error model. Similarly, the results of the Model Conditional Processor are mainly influenced by the selected model of tails, or even by the selected full probability distribution model of the data in the real space, and by the definition of the Truncated Normal Distribution in the transformed space. In summary, the different hypotheses that the modeler chooses in each of the two approaches are the main cause of the different results. This research also explores a proper combination of both methodologies which could be useful to achieve less biased hydrological parameter estimation. In this approach, the predictive distribution is first obtained through the Model Conditional Processor, and is then used to derive the corresponding additive error model, which is employed for the hydrological parameter
Institute of Scientific and Technical Information of China (English)
Anonymous
2002-01-01
In order to set up a conceptual data model that reflects the real world as accurately as possible, this paper first reviews and analyzes the disadvantages of the conceptual data models used by traditional GIS to simulate geographic space, gives a new explanation of geographic space, and analyzes its essential characteristics. Finally, the paper proposes several detailed key points for designing a new type of GIS data model and presents a simple holistic GIS data model.
Uncertainty in biodiversity science, policy and management: a conceptual overview
Directory of Open Access Journals (Sweden)
Yrjö Haila
2014-10-01
Full Text Available The protection of biodiversity is a complex societal, political and ultimately practical imperative of current global society. The imperative builds upon scientific knowledge on human dependence on the life-support systems of the Earth. This paper aims at introducing the main types of uncertainty inherent in biodiversity science, policy and management, as an introduction to a companion paper summarizing practical experiences of scientists and scholars (Haila et al. 2014). Uncertainty is a cluster concept: the actual nature of uncertainty is inherently context-bound. We use semantic space as a conceptual device to identify key dimensions of uncertainty in the context of biodiversity protection; these relate to [i] data; [ii] proxies; [iii] concepts; [iv] policy and management; and [v] normative goals. Semantic space offers an analytic perspective for drawing critical distinctions between types of uncertainty, identifying fruitful resonances that help to cope with the uncertainties, and building up collaboration between different specialists to support mutual social learning.
Multidisciplinary aircraft conceptual design optimization considering fidelity uncertainties
Neufeld, Daniel
Aircraft conceptual design traditionally utilizes simplified analysis methods and empirical equations to establish the basic layout of new aircraft. Applying optimization methods to aircraft conceptual design may yield solutions that are found to violate constraints when more sophisticated analysis methods are introduced. The designer's confidence that proposed conceptual designs will meet their performance targets is limited when conventional optimization approaches are utilized. Therefore, there is a need for an optimization approach that takes into account the uncertainties that arise when traditional analysis methods are used in aircraft conceptual design optimization. This research introduces a new aircraft conceptual design optimization approach that utilizes the concept of Reliability Based Design Optimization (RBDO). RyeMDO, a framework for multi-objective, multidisciplinary RBDO was developed for this purpose. The performance and effectiveness of the RBDO-MDO approaches implemented in RyeMDO were evaluated to identify the most promising approaches for aircraft conceptual design optimization. Additionally, an approach for quantifying the errors introduced by approximate analysis methods was developed. The approach leverages available historical data to quantify the uncertainties introduced by approximate analysis methods in two engineering case studies: the conceptual design optimization of an aircraft wing box structure and the conceptual design optimization of a commercial aircraft. The case studies were solved with several of the most promising RBDO-MDO integrated approaches. The proposed approach yields more conservative solutions and estimates the risk associated with each solution, enabling designers to reduce the likelihood that conceptual aircraft designs will fail to meet objectives later in the design process.
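The RBDO idea described above, accepting a design only if the probability of constraint violation under quantified analysis-method error stays below a target, can be illustrated with a toy Monte Carlo check. The margin function, the error distribution, and the mass numbers are all hypothetical; this is not RyeMDO's formulation:

```python
import numpy as np

rng = np.random.default_rng(2)

def constraint_margin(design_mass, analysis_error):
    """Margin > 0 means the (hypothetical) structural constraint holds;
    analysis_error models the quantified approximate-method error."""
    return design_mass * (1.0 + analysis_error) - 950.0

def failure_probability(design_mass, n=100_000):
    """Monte Carlo estimate of P(constraint violated) for a design."""
    err = rng.normal(0.0, 0.05, n)   # assumed analysis-error distribution
    return np.mean(constraint_margin(design_mass, err) <= 0)

p_light = failure_probability(960.0)    # aggressive (light) design
p_heavy = failure_probability(1100.0)   # conservative design
```

The conservative design buys a much smaller violation probability, which is exactly the trade-off the abstract says RBDO makes explicit: more conservative solutions, with the risk of each solution estimated.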
Arnaoudova, Kristina; Stanchev, Peter
2015-11-01
Business processes are a key asset of every organization, and the design of business process models is a foremost concern among an organization's functions. Business processes and their proper management depend heavily on the performance of software applications and technology solutions. This paper attempts to define a new conceptual model of an IT service provider; it can be regarded as an IT-focused enterprise model within the Enterprise Architecture (EA) school.
Urban drainage models - making uncertainty analysis simple
DEFF Research Database (Denmark)
Vezzaro, Luca; Mikkelsen, Peter Steen; Deletic, Ana;
2012-01-01
There is increasing awareness about uncertainties in modelling of urban drainage systems and, as such, many new methods for uncertainty analyses have been developed. Despite this, all available methods have limitations which restrict their widespread application among practitioners. Here, a modif...... probability distributions (often used for sensitivity analyses) and prediction intervals. To demonstrate the new method, it is applied to a conceptual rainfall-runoff model using a dataset collected from Melbourne, Australia....
Conceptual modeling and the Lexicon
Hoppenbrouwers, J.J.A.C.
1997-01-01
'Conceptual Modeling and the Lexicon' investigates the linguistic aspects of conceptual modeling, concentrating on the terminology part. The author combines theoretical ideas and empirical facts from various scientific fields, such as cognitive psychology, computer science, lexicography, and psycholinguistics.
Uncertainties in repository modeling
Energy Technology Data Exchange (ETDEWEB)
Wilson, J.R.
1996-12-31
The distant future is very difficult to predict. Unfortunately, our regulators are being encouraged to extend the regulatory period from the standard 10,000 years to 1 million years. Such overconfidence is not justified due to uncertainties in dating, calibration, and modeling.
Gwo, Jin-Ping; Jardine, Philip M.; Sanford, William E.
2005-03-01
Multiple factors may affect the scale-up of laboratory multi-tracer injection into structured porous media to the field. Under transient flow conditions and with multiscale heterogeneities in the field, previous attempts to scale up laboratory experiments have not definitively answered questions about the governing mechanisms and the spatial extent of the influence of small-scale mass transfer processes such as matrix diffusion. The objective of this research is to investigate the effects of multiscale heterogeneity, mechanistic and site model conceptualization, and source term density on elucidating and interpreting tracer movement in the field. Tracer release and monitoring information previously obtained in a field campaign of multiple, conservative tracer injection under natural hydraulic gradients at a low-level waste disposal site in eastern Tennessee, United States, is used for the research. A suite of two-pore-domain, or fracture-matrix, groundwater flow and transport models is calibrated and used to conduct model parameter and prediction uncertainty analyses. These efforts are facilitated by a novel nested Latin-hypercube sampling technique. Our results verify, at field scale, a multiple-pore-domain, multiscale mechanistic conceptual model that was previously used to interpret only laboratory observations. The results also suggest that, integrated over the entire field site, mass flux rates attributable to small-scale mass transfer are comparable to those of field-scale solute transport. The uncertainty analyses show that fracture spacing is the most important model parameter and that model prediction uncertainty is relatively higher at the interface between the preferred flow path and its parent bedrock. The comparisons of site conceptual models indicate that the effect of matrix diffusion may be confined to the immediate neighborhood of the preferential flow path. Finally, because of the relatively large amount of tracer needed for field studies, it is
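The nested Latin-hypercube sampling used for the uncertainty analyses builds on the basic LHS construction: stratify each parameter range into n equal-probability slices and take exactly one sample per slice in every dimension. A plain (non-nested) sketch:

```python
import numpy as np

def latin_hypercube(n, d, rng):
    """Plain Latin hypercube sample on the unit hypercube: each of the n
    equal-probability strata of every dimension holds exactly one point.
    (The paper uses a nested variant; this basic version is illustrative.)"""
    # Row i starts in stratum [i/n, (i+1)/n) for every dimension...
    u = (rng.random((n, d)) + np.arange(n)[:, None]) / n
    for j in range(d):
        u[:, j] = rng.permutation(u[:, j])  # ...then decouple the dimensions
    return u

rng = np.random.default_rng(3)
sample = latin_hypercube(10, 2, rng)
```

Compared with plain Monte Carlo, the stratification guarantees coverage of each marginal distribution even with very few model runs, which is why LHS variants are popular for expensive flow-and-transport models.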
Pragmatic aspects of uncertainty propagation: A conceptual review
Thacker, W. Carlisle; Iskandarani, Mohamed; Gonçalves, Rafael C.; Srinivasan, Ashwanth; Knio, Omar M.
2015-11-01
When quantifying the uncertainty of the response of a computationally costly oceanographic or meteorological model stemming from the uncertainty of its inputs, practicality demands getting the most information using the fewest simulations. It is widely recognized that, by interpolating the results of a small number of simulations, results of additional simulations can be inexpensively approximated to provide a useful estimate of the variability of the response. Even so, as computing the simulations to be interpolated remains the biggest expense, the choice of these simulations deserves attention. When making this choice, two requirements should be considered: (i) the nature of the interpolation and (ii) the available information about input uncertainty. Examples comparing polynomial interpolation and Gaussian process interpolation are presented for three different views of input uncertainty.
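The two interpolants compared in the examples can be contrasted on a toy "costly simulation". The response function, the kernel length scale, and the number of runs below are assumptions for this sketch; the point is that with six affordable runs either interpolant reconstructs the response cheaply at many test points:

```python
import numpy as np

def expensive_model(x):
    return np.sin(3 * x)                      # stand-in for a costly simulation

x_train = np.linspace(0.0, 2.0, 6)            # six affordable simulations
y_train = expensive_model(x_train)
x_test = np.linspace(0.0, 2.0, 101)           # cheap surrogate evaluations

# Polynomial interpolation: degree 5 through six points
poly = np.polynomial.Polynomial.fit(x_train, y_train, 5)
y_poly = poly(x_test)

# Gaussian process interpolation with a squared-exponential kernel
def rbf(a, b, ell=0.5):
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell ** 2)

K = rbf(x_train, x_train) + 1e-10 * np.eye(len(x_train))  # jitter for stability
alpha = np.linalg.solve(K, y_train)
y_gp = rbf(x_test, x_train) @ alpha           # GP posterior mean (noise-free)

err_poly = np.max(np.abs(y_poly - expensive_model(x_test)))
err_gp = np.max(np.abs(y_gp - expensive_model(x_test)))
```

Both surrogates honor the six training runs exactly (to numerical precision) and differ only between them, which is where the choice of interpolant and of simulation locations matters.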
Evaluating uncertainty in simulation models
Energy Technology Data Exchange (ETDEWEB)
McKay, M.D.; Beckman, R.J.; Morrison, J.D.; Upton, S.C.
1998-12-01
The authors discussed some directions for research and development of methods for assessing simulation variability, input uncertainty, and structural model uncertainty. Variance-based measures of importance for input and simulation variables arise naturally when using the quadratic loss function of the difference between the full model prediction y and the restricted prediction ỹ. They concluded that generic methods for assessing structural model uncertainty do not yet exist. However, methods to analyze structural uncertainty for particular classes of models, such as discrete event simulation models, may be attainable.
Chemical model reduction under uncertainty
Najm, Habib
2016-01-05
We outline a strategy for chemical kinetic model reduction under uncertainty. We present highlights of our existing deterministic model reduction strategy, and describe the extension of the formulation to include parametric uncertainty in the detailed mechanism. We discuss the utility of this construction, as applied to hydrocarbon fuel-air kinetics, and the associated use of uncertainty-aware measures of error between predictions from detailed and simplified models.
Conceptual Modeling for Discrete-Event Simulation
Robinson, Stewart
2010-01-01
What is a conceptual model? How is conceptual modeling performed in general and in specific modeling domains? What is the role of established approaches in conceptual modeling? This book addresses these issues
Representing uncertainty on model analysis plots
Smith, Trevor I.
2016-12-01
Model analysis provides a mechanism for representing student learning as measured by standard multiple-choice surveys. The model plot contains information regarding both how likely students in a particular class are to choose the correct answer and how likely they are to choose an answer consistent with a well-documented conceptual model. Unfortunately, Bao's original presentation of the model plot did not include a way to represent uncertainty in these measurements. I present details of a method to add error bars to model plots by expanding the work of Sommer and Lindell. I also provide a template for generating model plots with error bars.
Ecosystem conceptual model- Mercury
Alpers, Charles N.; Eagles-Smith, Collin A.; Foe, Chris; Klasing, Susan; Marvin-DiPasquale, Mark C.; Slotton, Darell G.; Windham-Myers, Lisamarie
2008-01-01
The mercury conceptual model and its four submodels (1. Methylation, 2. Bioaccumulation, 3. Human Health Effects, and 4. Wildlife Health Effects) can be used to understand the general relationships among drivers and outcomes associated with mercury cycling in the Delta. Several linkages between important drivers and outcomes have been identified as important but highly uncertain (i.e., poorly understood). For example, there may be significant wildlife health effects of mercury on mammals and reptiles in the Delta, but there is currently very little or no information about them. The characteristics of such linkages are important when prioritizing and funding restoration projects and associated monitoring in the Delta and its tributaries.
Uncertainty in Air Quality Modeling.
Fox, Douglas G.
1984-01-01
Under the direction of the AMS Steering Committee for the EPA Cooperative Agreement on Air Quality Modeling, a small group of scientists convened to consider the question of uncertainty in air quality modeling. Because the group was particularly concerned with the regulatory use of models, its discussion focused on modeling tall-stack, point source emissions. The group agreed that air quality model results should be viewed as containing both reducible error and inherent uncertainty. Reducible error results from improper or inadequate meteorological and air quality data inputs, and from inadequacies in the models. Inherent uncertainty results from the basic stochastic nature of the turbulent atmospheric motions that are responsible for transport and diffusion of released materials. Modelers should acknowledge that all their predictions to date contain some associated uncertainty and should also strive to quantify that uncertainty. How can the uncertainty be quantified? There was no consensus from the group as to precisely how uncertainty should be calculated. One subgroup, which addressed statistical procedures, suggested that uncertainty information could be obtained from comparisons of observations and predictions. Following recommendations from a previous AMS workshop on performance evaluation (Fox, 1981), the subgroup suggested construction of probability distribution functions from the differences between observations and predictions. Further, they recommended that relatively new computer-intensive statistical procedures be considered to improve the quality of uncertainty estimates for the extreme value statistics of interest in regulatory applications. A second subgroup, which addressed the basic nature of uncertainty in a stochastic system, also recommended that uncertainty be quantified by consideration of the differences between observations and predictions. They suggested that the average of the difference squared was appropriate to isolate the inherent uncertainty that
Event-Based Conceptual Modeling
DEFF Research Database (Denmark)
Bækgaard, Lars
2009-01-01
The purpose of the paper is to obtain insight into and provide practical advice for event-based conceptual modeling. We analyze a set of event concepts and use the results to formulate a conceptual event model that is used to identify guidelines for creation of dynamic process models and static...... information models. We characterize events as short-duration processes that have participants, consequences, and properties, and that may be modeled in terms of information structures. The conceptual event model is used to characterize a variety of event concepts and it is used to illustrate how events can...... be used to integrate dynamic modeling of processes and static modeling of information structures. The results are unique in the sense that no other general event concept has been used to unify a similar broad variety of seemingly incompatible event concepts. The general event concept can be used...
Uncertainties in Nuclear Proliferation Modeling
Energy Technology Data Exchange (ETDEWEB)
Kim, Chul Min; Yim, Man-Sung; Park, Hyeon Seok [Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of)
2015-05-15
There have been various efforts in the research community to understand the determinants of nuclear proliferation and to develop quantitative tools to predict nuclear proliferation events. Such systematic approaches have shown the potential to provide warning to the international community and to prevent nuclear proliferation activities. However, there is still considerable debate about the robustness of the estimated effects of the determinants and of the projection results. Some studies have shown that several factors can cause uncertainties in previous quantitative nuclear proliferation modeling works. This paper analyzes the uncertainties in past approaches and suggests future work with respect to proliferation history, analysis methods, and variable selection. The research community still lacks knowledge of the sources of uncertainty in current models. Fundamental problems in modeling will remain even if more advanced modeling methods are developed. Before developing elaborate models based on hypotheses about time-dependent proliferation determinants, graph theory, and the like, it is important to analyze the uncertainty of current models in order to solve the fundamental problems of nuclear proliferation modeling. The uncertainty arising from different codings of proliferation history is small. The serious problems stem from limited analysis methods and correlation among the variables. Problems in regression analysis and survival analysis cause large uncertainties even when the same dataset is used, which decreases the robustness of the results. Inaccurate variables for nuclear proliferation also increase the uncertainty. To overcome these problems, further quantitative research should focus on analyzing the knowledge suggested by qualitative nuclear proliferation studies.
Uncertainty in hydrological change modelling
DEFF Research Database (Denmark)
Seaby, Lauren Paige
Hydrological change modelling methodologies generally use climate model outputs to force hydrological simulations under changed conditions. There are nested sources of uncertainty throughout this methodology, including the choice of climate model and subsequent bias correction methods. This Ph.D. study evaluates the uncertainty of the impact of climate change in hydrological simulations given multiple climate models and bias correction methods of varying complexity. Three distribution-based scaling (DBS) methods were developed and benchmarked against a more simplistic and commonly used delta change (DC) approach. These climate model projections were then used to force hydrological simulations under climate change for the island of Sjælland in Denmark to analyse the contribution of different climate models and bias correction methods to overall uncertainty in the hydrological change modelling...
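The two bias-handling families compared in the thesis can be sketched minimally: a mean-based delta change (DC) perturbation of observations versus a (heavily simplified) distribution-based scaling (DBS) correction of the climate-model series. The precipitation series are synthetic and the scaling is reduced to a mean ratio purely for illustration; real DBS corrects the full distribution:

```python
import random
import statistics

random.seed(1)

# Hypothetical daily precipitation (mm) for observations and for a climate
# model's control and future runs; real inputs would come from gauges and an RCM.
obs     = [max(0.0, random.gauss(3.0, 4.0)) for _ in range(3650)]
control = [max(0.0, random.gauss(2.2, 3.0)) for _ in range(3650)]
future  = [max(0.0, random.gauss(2.6, 3.5)) for _ in range(3650)]

# Delta change (DC): perturb the observations by the model's mean change signal.
delta = statistics.mean(future) / statistics.mean(control)
dc_series = [x * delta for x in obs]

# Simplified distribution-based scaling (DBS): rescale model output so the
# control run reproduces the observed mean, then apply the same factor to
# the future run.
bias_mean = statistics.mean(obs) / statistics.mean(control)
dbs_series = [x * bias_mean for x in future]

print(statistics.mean(dc_series), statistics.mean(dbs_series))
```

DC keeps the observed variability and only shifts the mean, while DBS (even in this toy form) lets the model's own future variability pass through after correction; that difference is one source of the methodological uncertainty the study quantifies.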
Model Breaking Points Conceptualized
Vig, Rozy; Murray, Eileen; Star, Jon R.
2014-01-01
Current curriculum initiatives (e.g., National Governors Association Center for Best Practices and Council of Chief State School Officers 2010) advocate that models be used in the mathematics classroom. However, despite their apparent promise, there comes a point when models break, a point in the mathematical problem space where the model cannot,…
Numerical modeling of economic uncertainty
DEFF Research Database (Denmark)
Schjær-Jacobsen, Hans
2007-01-01
Representation and modeling of economic uncertainty is addressed by different modeling methods, namely stochastic variables and probabilities, interval analysis, and fuzzy numbers, in particular triple estimates. Focusing on discounted cash flow analysis numerical results are presented, comparisons...... are made between alternative modeling methods, and characteristics of the methods are discussed....
Use of models in conceptual design
Bonnema, G. Maarten; Houten, van Fred J.A.M.
2006-01-01
This article investigates the use of product models by conceptual designers. After a short introduction, abstraction applied in conceptual design is described. A model that places conceptual design in a three-dimensional space is used. Applications of conceptual design from the literature are used t
Are models, uncertainty, and dispute resolution compatible?
Anderson, J. D.; Wilson, J. L.
2013-12-01
Models and their uncertainty often move from an objective use in planning and decision making into the regulatory environment, and then sometimes on to dispute resolution through litigation or other legal forums. Through this last transition, whatever objectivity the models and uncertainty assessment may once have possessed becomes biased (or more biased) as each party chooses to exaggerate either the goodness of a model or its worthlessness, depending on which view is in its best interest. If worthlessness is desired, then what was uncertain becomes unknown, or even unknowable. If goodness is desired, then precision and accuracy are often exaggerated, and uncertainty, if it is explicitly recognized, encompasses only some parameters or conceptual issues, ignores others, and may minimize the uncertainty that it accounts for. In dispute resolution, how well is the adversarial process able to deal with these biases? The challenge is that they are often cloaked in computer graphics and animations that appear to lend realism to what could be mostly fancy, or even a manufactured outcome. While junk science can be challenged through appropriate motions in federal court, and in most state courts, it is not unusual for biased or even incorrect modeling results, or conclusions based on incorrect results, to be permitted to be presented at trial. Courts allow opinions that are based on a "reasonable degree of scientific certainty," but when that certainty is grossly exaggerated by an expert, one way or the other, how well do the courts determine that someone has stepped over the line? Trials are based on the adversary system of justice, so opposing and often irreconcilable views are commonly allowed, leaving it to the judge or jury to sort out the truth. Can advances in scientific theory and engineering practice, related to both modeling and uncertainty, help address this situation and better ensure that juries and judges see more objective modeling results, or at least see
Model Uncertainty for Bilinear Hysteretic Systems
DEFF Research Database (Denmark)
Sørensen, John Dalsgaard; Thoft-Christensen, Palle
In structural reliability analysis at least three types of uncertainty must be considered, namely physical uncertainty, statistical uncertainty, and model uncertainty (see e.g. Thoft-Christensen & Baker [1]). The physical uncertainty is usually modelled by a number of basic variables by predictive...... density functions, Veneziano [2]. In general, model uncertainty is the uncertainty connected with mathematical modelling of the physical reality. When structural reliability analysis is related to the concept of a failure surface (or limit state surface) in the n-dimension basic variable space then model...... uncertainty is at least due to the neglected variables, the modelling of the failure surface and the computational technique used....
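The effect of adding a model-uncertainty variable to a limit-state formulation can be illustrated with a toy limit state g = Xm·R − S, where Xm is a mean-one model-uncertainty factor multiplying the resistance. The distributions and numbers below are invented for illustration and are not from the paper:

```python
import random

random.seed(4)

N = 100_000

def failure_probability(model_cov):
    """Crude Monte Carlo estimate of P(g < 0) for the limit state g = Xm*R - S."""
    failures = 0
    for _ in range(N):
        r = random.gauss(10.0, 1.0)        # resistance (basic variable)
        s = random.gauss(6.0, 1.5)         # load effect (basic variable)
        xm = random.gauss(1.0, model_cov)  # model-uncertainty factor, mean 1
        if xm * r - s < 0.0:
            failures += 1
    return failures / N

pf_without = failure_probability(0.0)   # physical uncertainty only
pf_with = failure_probability(0.15)     # model uncertainty added to the failure surface
print(pf_without, pf_with)
```

Even a modest coefficient of variation on Xm visibly inflates the estimated failure probability, which is why neglecting model uncertainty makes reliability estimates optimistic.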
Addressing Replication and Model Uncertainty
DEFF Research Database (Denmark)
Ebersberger, Bernd; Galia, Fabrice; Laursen, Keld
Many fields of strategic management are subject to an important degree of model uncertainty. This is because the true model, and therefore the selection of appropriate explanatory variables, is essentially unknown. Drawing on the literature on the determinants of innovation, and by analyzing inno...
Model uncertainty in growth empirics
Prüfer, P.
2008-01-01
This thesis applies so-called Bayesian model averaging (BMA) to three different economic questions substantially exposed to model uncertainty. Chapter 2 addresses a major issue of modern development economics: the analysis of the determinants of pro-poor growth (PPG), which seeks to combine high gro
The conceptualization model problem—surprise
Bredehoeft, John
2005-03-01
The foundation of model analysis is the conceptual model. Surprise is defined as new data that renders the prevailing conceptual model invalid; as defined here it represents a paradigm shift. Limited empirical data indicate that surprises occur in 20-30% of model analyses. These data suggest that groundwater analysts have difficulty selecting the appropriate conceptual model. There is no ready remedy to the conceptual model problem other than (1) to collect as much data as is feasible, using all applicable methods—a complementary data collection methodology can lead to new information that changes the prevailing conceptual model, and (2) for the analyst to remain open to the fact that the conceptual model can change dramatically as more information is collected. In the final analysis, the hydrogeologist makes a subjective decision on the appropriate conceptual model. The conceptualization problem does not render models unusable. The problem introduces an uncertainty that often is not widely recognized. Conceptual model uncertainty is exacerbated in making long-term predictions of system performance.
Conceptual Models for Search Engines
Hendry, D. G.; Efthimiadis, E. N.
Search engines have entered popular culture. They touch people in diverse private and public settings and thus heighten the importance of such social matters as information privacy and control, censorship, and equitable access. To fully benefit from search engines and to participate in debate about their merits, people necessarily appeal to their understandings of how they function. In this chapter we examine the conceptual understandings that people have of search engines by performing a content analysis on the sketches that 200 undergraduate and graduate students drew when asked to sketch how a search engine works. Analysis of the sketches reveals a diverse range of conceptual approaches, metaphors, representations, and misconceptions. On the whole, the conceptual models articulated by these students are simplistic. However, students with higher levels of academic achievement sketched more complete models. This research calls attention to the importance of improving students' technical knowledge of how search engines work so that they can be better equipped to develop and advocate policies for how search engines should be embedded in, and restricted from, various private and public information settings.
Uncertainty quantification for environmental models
Hill, Mary C.; Lu, Dan; Kavetski, Dmitri; Clark, Martyn P.; Ye, Ming
2012-01-01
Environmental models are used to evaluate the fate of fertilizers in agricultural settings (including soil denitrification), the degradation of hydrocarbons at spill sites, and water supply for people and ecosystems in small to large basins and cities—to mention but a few applications of these models. They also play a role in understanding and diagnosing potential environmental impacts of global climate change. The models are typically mildly to extremely nonlinear. The persistent demand for enhanced dynamics and resolution to improve model realism [17] means that lengthy individual model execution times will remain common, notwithstanding continued enhancements in computer power. In addition, high-dimensional parameter spaces are often defined, which increases the number of model runs required to quantify uncertainty [2]. Some environmental modeling projects have access to extensive funding and computational resources; many do not. The many recent studies of uncertainty quantification in environmental model predictions have focused on uncertainties related to data error and sparsity of data, expert judgment expressed mathematically through prior information, poorly known parameter values, and model structure (see, for example, [1,7,9,10,13,18]). Approaches for quantifying uncertainty include frequentist (potentially with prior information [7,9]), Bayesian [13,18,19], and likelihood-based. A few of the numerous methods, including some sensitivity and inverse methods with consequences for understanding and quantifying uncertainty, are as follows: Bayesian hierarchical modeling and Bayesian model averaging; single-objective optimization with error-based weighting [7] and multi-objective optimization [3]; methods based on local derivatives [2,7,10]; screening methods like OAT (one at a time) and the method of Morris [14]; FAST (Fourier amplitude sensitivity testing) [14]; the Sobol' method [14]; randomized maximum likelihood [10]; Markov chain Monte Carlo (MCMC) [10
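Of the screening methods the abstract lists, one-at-a-time (OAT) perturbation is the simplest to sketch. The "model" below is a cheap stand-in function with made-up parameter names and values; a real study would wrap an actual simulator call:

```python
# Stand-in for an expensive environmental model; a real application would
# invoke the simulator here. Parameter names and values are invented.
def model(k_decay, porosity, recharge):
    return recharge * (1.0 - porosity) / (1.0 + 10.0 * k_decay)

base = {"k_decay": 0.1, "porosity": 0.3, "recharge": 200.0}
spans = {"k_decay": 0.05, "porosity": 0.1, "recharge": 50.0}

# One-at-a-time (OAT) screening: perturb each parameter individually and
# record the normalized change in the model output.
y0 = model(**base)
effects = {}
for name in base:
    perturbed = dict(base, **{name: base[name] + spans[name]})
    effects[name] = abs(model(**perturbed) - y0) / abs(y0)

# Rank parameters by influence to decide where uncertainty matters most.
ranked = sorted(effects, key=effects.get, reverse=True)
print(ranked, effects)
```

OAT needs only one run per parameter, which is why it is attractive when a single model execution is lengthy, but unlike the Morris or Sobol' methods it cannot detect parameter interactions.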
Chemical model reduction under uncertainty
Malpica Galassi, Riccardo
2017-03-06
A general strategy for analysis and reduction of uncertain chemical kinetic models is presented, and its utility is illustrated in the context of ignition of hydrocarbon fuel–air mixtures. The strategy is based on a deterministic analysis and reduction method which employs computational singular perturbation analysis to generate simplified kinetic mechanisms, starting from a detailed reference mechanism. We model uncertain quantities in the reference mechanism, namely the Arrhenius rate parameters, as random variables with prescribed uncertainty factors. We propagate this uncertainty to obtain the probability of inclusion of each reaction in the simplified mechanism. We propose probabilistic error measures to compare predictions from the uncertain reference and simplified models, based on the comparison of the uncertain dynamics of the state variables, where the mixture entropy is chosen as progress variable. We employ the construction for the simplification of an uncertain mechanism in an n-butane–air mixture homogeneous ignition case, where a 176-species, 1111-reactions detailed kinetic model for the oxidation of n-butane is used with uncertainty factors assigned to each Arrhenius rate pre-exponential coefficient. This illustration is employed to highlight the utility of the construction, and the performance of a family of simplified models produced depending on chosen thresholds on importance and marginal probabilities of the reactions.
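The idea of propagating rate-parameter uncertainty into a probability of inclusion for each reaction can be caricatured with a toy mechanism. The reaction count, nominal rates, uncertainty factors, and the 1% importance threshold are all invented; the paper's actual inclusion criterion comes from computational singular perturbation analysis, not a simple rate fraction:

```python
import math
import random

random.seed(5)

# Toy 5-reaction mechanism: nominal pre-exponential factors and an uncertainty
# factor UF per reaction (rate multiplier drawn log-uniformly in [1/UF, UF]).
nominal = [1e3, 5e2, 1e2, 10.0, 1.0]
uf = [2.0, 2.0, 5.0, 5.0, 10.0]

def sample_rates():
    """One realization of the uncertain mechanism's reaction rates."""
    return [a * math.exp(random.uniform(-math.log(f), math.log(f)))
            for a, f in zip(nominal, uf)]

# Probability of inclusion: how often each reaction carries more than 1% of
# the total rate across realizations of the uncertain mechanism.
samples = 5000
counts = [0] * len(nominal)
for _ in range(samples):
    rates = sample_rates()
    total = sum(rates)
    for i, r in enumerate(rates):
        if r / total > 0.01:
            counts[i] += 1

p_include = [c / samples for c in counts]
print(p_include)
```

Reactions whose inclusion probability is near one belong in every simplified mechanism; reactions near zero can be dropped; intermediate values are exactly where the uncertain and deterministic reductions disagree.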
Critical conceptualism in environmental modeling and prediction.
Christakos, G
2003-10-15
Many important problems in environmental science and engineering are of a conceptual nature. Research and development, however, often becomes so preoccupied with technical issues, which are themselves fascinating, that it neglects essential methodological elements of conceptual reasoning and theoretical inquiry. This work suggests that valuable insight into environmental modeling can be gained by means of critical conceptualism which focuses on the software of human reason and, in practical terms, leads to a powerful methodological framework of space-time modeling and prediction. A knowledge synthesis system develops the rational means for the epistemic integration of various physical knowledge bases relevant to the natural system of interest in order to obtain a realistic representation of the system, provide a rigorous assessment of the uncertainty sources, generate meaningful predictions of environmental processes in space-time, and produce science-based decisions. No restriction is imposed on the shape of the distribution model or the form of the predictor (non-Gaussian distributions, multiple-point statistics, and nonlinear models are automatically incorporated). The scientific reasoning structure underlying knowledge synthesis involves teleologic criteria and stochastic logic principles which have important advantages over the reasoning method of conventional space-time techniques. Insight is gained in terms of real world applications, including the following: the study of global ozone patterns in the atmosphere using data sets generated by instruments on board the Nimbus 7 satellite and secondary information in terms of total ozone-tropopause pressure models; the mapping of arsenic concentrations in the Bangladesh drinking water by assimilating hard and soft data from an extensive network of monitoring wells; and the dynamic imaging of probability distributions of pollutants across the Kalamazoo river.
Uncertainty Quantification in Climate Modeling
Sargsyan, K.; Safta, C.; Berry, R.; Debusschere, B.; Najm, H.
2011-12-01
We address challenges that sensitivity analysis and uncertainty quantification methods face when dealing with complex computational models. In particular, climate models are computationally expensive and typically depend on a large number of input parameters. We consider the Community Land Model (CLM), which consists of a nested computational grid hierarchy designed to represent the spatial heterogeneity of the land surface. Each computational cell can be composed of multiple land types, and each land type can incorporate one or more sub-models describing the spatial and depth variability. Even for simulations at a regional scale, the computational cost of a single run is quite high and the number of parameters that control the model behavior is very large. Therefore, the parameter sensitivity analysis and uncertainty propagation face significant difficulties for climate models. This work employs several algorithmic avenues to address some of the challenges encountered by classical uncertainty quantification methodologies when dealing with expensive computational models, specifically focusing on the CLM as a primary application. First of all, since the available climate model predictions are extremely sparse due to the high computational cost of model runs, we adopt a Bayesian framework that effectively incorporates this lack-of-knowledge as a source of uncertainty, and produces robust predictions with quantified uncertainty even if the model runs are extremely sparse. In particular, we infer Polynomial Chaos spectral expansions that effectively encode the uncertain input-output relationship and allow efficient propagation of all sources of input uncertainties to outputs of interest. Secondly, the predictability analysis of climate models strongly suffers from the curse of dimensionality, i.e. the large number of input parameters. While single-parameter perturbation studies can be efficiently performed in a parallel fashion, the multivariate uncertainty analysis
Sierra Toolkit computational mesh conceptual model.
Energy Technology Data Exchange (ETDEWEB)
Baur, David G.; Edwards, Harold Carter; Cochran, William K.; Williams, Alan B.; Sjaardema, Gregory D.
2010-03-01
The Sierra Toolkit computational mesh is a software library intended to support massively parallel multi-physics computations on dynamically changing unstructured meshes. This domain of intended use is inherently complex due to distributed memory parallelism, parallel scalability, heterogeneity of physics, heterogeneous discretization of an unstructured mesh, and runtime adaptation of the mesh. Management of this inherent complexity begins with a conceptual analysis and modeling of this domain of intended use; i.e., development of a domain model. The Sierra Toolkit computational mesh software library is designed and implemented based upon this domain model. Software developers using, maintaining, or extending the Sierra Toolkit computational mesh library must be familiar with the concepts/domain model presented in this report.
Applied research in uncertainty modeling and analysis
Ayyub, Bilal
2005-01-01
Uncertainty has been a concern to engineers, managers, and scientists for many years. For a long time uncertainty has been considered synonymous with random, stochastic, statistic, or probabilistic. Since the early sixties views on uncertainty have become more heterogeneous. In the past forty years numerous tools that model uncertainty, above and beyond statistics, have been proposed by several engineers and scientists. The tool/method to model uncertainty in a specific context should really be chosen by considering the features of the phenomenon under consideration, not independent of what is known about the system and what causes uncertainty. In this fascinating overview of the field, the authors provide broad coverage of uncertainty analysis/modeling and its application. Applied Research in Uncertainty Modeling and Analysis presents the perspectives of various researchers and practitioners on uncertainty analysis and modeling outside their own fields and domain expertise. Rather than focusing explicitly on...
Modelling of Transport Projects Uncertainties
DEFF Research Database (Denmark)
Salling, Kim Bang; Leleur, Steen
2009-01-01
This paper proposes a new way of handling the uncertainties present in transport decision making based on infrastructure appraisals. The paper suggests to combine the principle of Optimism Bias, which depicts the historical tendency of overestimating transport related benefits and underestimating...... investment costs, with a quantitative risk analysis based on Monte Carlo simulation and to make use of a set of exploratory scenarios. The analysis is carried out by using the CBA-DK model representing the Danish standard approach to socio-economic cost-benefit analysis. Specifically, the paper proposes......-based graphs which function as risk-related decision support for the appraised transport infrastructure project....
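The combination of an Optimism Bias uplift with Monte Carlo simulation can be sketched as below. The appraisal figures, the uplift of 1.45, and the distribution choices are illustrative assumptions, not values from the CBA-DK model:

```python
import random
import statistics

random.seed(2)

# Hypothetical appraisal figures (million DKK); the 1.45 uplift mimics an
# Optimism Bias correction applied to the cost estimate, purely for illustration.
base_cost, base_benefit, uplift = 1000.0, 1400.0, 1.45

def one_run():
    """One Monte Carlo draw of the benefit-cost ratio."""
    cost = base_cost * uplift * random.lognormvariate(0.0, 0.15)  # costs skew high
    benefit = base_benefit * random.gauss(1.0, 0.20)              # benefits scatter
    return benefit / cost

ratios = sorted(one_run() for _ in range(10_000))
p_viable = sum(r > 1.0 for r in ratios) / len(ratios)
print(statistics.median(ratios), p_viable)
```

Instead of a single point estimate, the appraisal then reports a distribution of benefit-cost ratios and the probability that the project clears the viability threshold, which is the kind of risk-related decision support the paper argues for.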
Committee of machine learning predictors of hydrological models uncertainty
Kayastha, Nagendra; Solomatine, Dimitri
2014-05-01
In prediction of uncertainty based on machine learning methods, the results of various sampling schemes, namely Monte Carlo sampling (MCS), generalized likelihood uncertainty estimation (GLUE), Markov chain Monte Carlo (MCMC), the shuffled complex evolution Metropolis algorithm (SCEMUA), differential evolution adaptive Metropolis (DREAM), particle swarm optimization (PSO) and adaptive cluster covering (ACCO) [1], are used to build predictive models. These models predict the uncertainty (quantiles of the pdf) of a deterministic output from a hydrological model [2]. Inputs to these models are specially identified representative variables (past precipitation events and flows). The trained machine learning models are then employed to predict the model output uncertainty that is specific to the new input data. For each sampling scheme, three machine learning methods, namely artificial neural networks, model trees and locally weighted regression, are applied to predict output uncertainties. The problem here is that different sampling algorithms result in different data sets used to train different machine learning models, which leads to several models (21 predictive uncertainty models). There is no clear evidence which model is the best, since there is no basis for comparison. A solution could be to form a committee of all the models and to use a dynamic averaging scheme to generate the final output [3]. This approach is applied to estimate the uncertainty of streamflow simulations from a conceptual hydrological model (HBV) in the Nzoia catchment in Kenya. [1] N. Kayastha, D. L. Shrestha and D. P. Solomatine. Experiments with several methods of parameter uncertainty estimation in hydrological modeling. Proc. 9th Intern. Conf. on Hydroinformatics, Tianjin, China, September 2010. [2] D. L. Shrestha, N. Kayastha, D. P. Solomatine and R. Price. Encapsulation of parametric uncertainty statistics by various predictive machine learning models: MLUE method, Journal of Hydroinformatics, in press
Real-world semantics of conceptual models
Wieringa, Roel
2011-01-01
Conceptual modelling is the addition of more real-world semantics to the computations performed by a computer. It is argued that in a proper engineering approach to computing, three kinds of conceptual modelling need to be distinguished, (1) modelling a software solution, (2) modelling the domain in
Dealing with uncertainties in fusion power plant conceptual development
Kemp, R.; Lux, H.; Kovari, M.; Morris, J.; Wenninger, R.; Zohm, H.; Biel, W.; Federici, G.
2017-04-01
Although the ultimate goal of most current fusion research is to build an economically attractive power plant, the present status of physics and technology does not provide the performance necessary to achieve this goal. Therefore, in order to model how such plants may operate and what their output might be, extrapolations must be made from existing experimental data and technology. However, the expected performance of a plant built to the operating point specifications can only ever be a ‘best guess’. Extrapolations far beyond the current operating regimes are necessarily uncertain, and some important interactions, for example the coupling of conducted power from the scrape-off layer to the divertor surface, lack reliable predictive models. This means both that the demands on plant systems at the target operating point can vary significantly from the nominal value, and that the overall plant performance may potentially fall short of design targets. In this contribution we discuss tools and techniques that have been developed to assess the robustness of the operating points for the EU-DEMO tokamak-based demonstration power plant, and the consequences for its design. The aim is to make explicit the design choices and areas where improved modelling and DEMO-relevant experiments will have the greatest impact on confidence in a successful DEMO design.
Modelling of Transport Projects Uncertainties
DEFF Research Database (Denmark)
Salling, Kim Bang; Leleur, Steen
2009-01-01
This paper proposes a new way of handling the uncertainties present in transport decision making based on infrastructure appraisals. The paper suggests to combine the principle of Optimism Bias, which depicts the historical tendency of overestimating transport related benefits and underestimating...... investment costs, with a quantitative risk analysis based on Monte Carlo simulation and to make use of a set of exploratory scenarios. The analysis is carried out by using the CBA-DK model representing the Danish standard approach to socio-economic cost-benefit analysis. Specifically, the paper proposes...... to supplement Optimism Bias and the associated Reference Class Forecasting (RCF) technique with a new technique that makes use of a scenario-grid. We tentatively introduce and refer to this as Reference Scenario Forecasting (RSF). The final RSF output from the CBA-DK model consists of a set of scenario...
A Conceptual Data Model of Datum Systems
McCaleb, Michael R.
1999-01-01
A new conceptual data model that addresses the geometric dimensioning and tolerancing concepts of datum systems, datums, datum features, datum targets, and the relationships among these concepts, is presented. Additionally, a portion of a related data model, Part 47 of STEP (ISO 10303-47), is reviewed and a comparison is made between it and the new conceptual data model.
Conceptual and logical level of database modeling
Hunka, Frantisek; Matula, Jiri
2016-06-01
Conceptual and logical levels form the topmost levels of database modeling. Usually, ORM (Object Role Modeling) and ER diagrams are utilized to capture the corresponding schema. The final aim of business process modeling is to store its results in the form of a database solution. For this reason, value-oriented business process modeling, which utilizes ER diagrams to express the modeled entities and the relationships between them, is used. However, ER diagrams form the logical level of the database schema. To extend the possibilities of different business process modeling methodologies, the conceptual level of database modeling is needed. The paper deals with the REA value modeling approach to business process modeling using ER diagrams, and derives a conceptual model utilizing the ORM modeling approach. The conceptual model extends the possibilities of value modeling to other business modeling approaches.
Indian Academy of Sciences (India)
Diego Rivera; Yessica Rivas; Alex Godoy
2015-02-01
Hydrological models are simplified representations of natural processes and are subject to errors. Uncertainty bounds are a commonly used way to assess the impact of input or model architecture uncertainty on model outputs. Different sets of parameters can have equally robust goodness-of-fit indicators, which is known as equifinality. We assessed the outputs of a lumped conceptual hydrological model applied to an agricultural watershed in central Chile under strong interannual variability (coefficient of variability of 25%) by using the equifinality concept and uncertainty bounds. The simulation period ran from January 1999 to December 2006. Equifinality and uncertainty bounds from the GLUE methodology (Generalized Likelihood Uncertainty Estimation) were used to identify parameter sets as potential representations of the system. The aim of this paper is to exploit the use of uncertainty bounds to differentiate behavioural parameter sets in a simple hydrological model. We then analyze the presence of equifinality in order to improve the identification of relevant hydrological processes. The water balance model for the Chillán River exhibits equifinality at a first stage. However, it was possible to narrow the range of the parameters and eventually identify a set of parameters representing the behaviour of the watershed (a behavioural model) in agreement with observational and soft data (calculation of areal precipitation over the watershed using an isohyetal map). The mean width of the uncertainty bound around the predicted runoff for the simulation period decreased from 50 to 20 m³ s⁻¹ after fixing the parameter controlling the areal precipitation over the watershed. This decrement is equivalent to decreasing the ratio between simulated and observed discharge from 5.2 to 2.5. Despite the criticisms against the GLUE methodology, such as its lack of statistical formality, it is identified as a useful tool for assisting the modeller with the identification of critical parameters.
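A stripped-down GLUE sketch with a one-parameter toy runoff model makes the behavioural-set idea concrete. The real study used a full conceptual model; here the synthetic data, the NSE threshold of 0.7, and the parameter range are assumptions chosen purely for illustration:

```python
import random

random.seed(3)

# Toy one-parameter runoff model Q = k * P; the "observed" flows are generated
# with k = 0.4 plus noise, standing in for real gauge data.
precip = [max(0.0, random.gauss(5.0, 3.0)) for _ in range(200)]
q_obs = [0.4 * p + random.gauss(0.0, 0.3) for p in precip]

def nse(sim, obs):
    """Nash-Sutcliffe efficiency, used here as the GLUE likelihood measure."""
    mean_obs = sum(obs) / len(obs)
    sse = sum((s - o) ** 2 for s, o in zip(sim, obs))
    var = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - sse / var

# Monte Carlo sampling of the parameter; keep "behavioural" sets (NSE > 0.7).
behavioural = []
for _ in range(2000):
    k = random.uniform(0.0, 1.0)
    sim = [k * p for p in precip]
    if nse(sim, q_obs) > 0.7:
        behavioural.append((k, sim))

# 5-95% uncertainty bounds at each time step from the behavioural ensemble.
bounds = []
for t in range(len(precip)):
    vals = sorted(sim[t] for _, sim in behavioural)
    bounds.append((vals[int(0.05 * len(vals))], vals[int(0.95 * len(vals))]))

coverage = sum(b0 <= o <= b1 for o, (b0, b1) in zip(q_obs, bounds)) / len(q_obs)
print(len(behavioural), coverage)
```

Narrowing a dominant parameter (as the paper does with the areal-precipitation factor) shrinks the behavioural ensemble and hence the width of these bounds, which is exactly the reduction in uncertainty the authors report.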
Ozkal, Kudret; Tekkaya, Ceren; Cakiroglu, Jale; Sungur, Semra
2009-01-01
This study proposed a conceptual model of relationships among constructivist learning environment perception variables (Personal Relevance, Uncertainty, Critical Voice, Shared Control, and Student Negotiation), scientific epistemological belief variables (fixed and tentative), and learning approach. It was proposed that learning environment…
Uncertainty analysis of fluvial outcrop data for stochastic reservoir modelling
Energy Technology Data Exchange (ETDEWEB)
Martinius, A.W. [Statoil Research Centre, Trondheim (Norway); Naess, A. [Statoil Exploration and Production, Stjoerdal (Norway)
2005-07-01
Uncertainty analysis and reduction is a crucial part of stochastic reservoir modelling and fluid flow simulation studies. Outcrop analogue studies are often employed to define reservoir model parameters, but the analysis of uncertainties associated with sedimentological information is often neglected. In order to define the uncertainty inherent in outcrop data more accurately, this paper presents geometrical and dimensional data on individual point bars and braid bars from part of the low net:gross outcropping Tortola fluvial system (Spain), which has been subjected to quantitative and qualitative assessment. Four types of primary outcrop uncertainties are discussed: (1) the definition of the conceptual depositional model; (2) the number of observations of sandstone body dimensions; (3) the accuracy and representativeness of observed three-dimensional (3D) sandstone body size data; and (4) sandstone body orientation. Uncertainties related to the depositional model are the most difficult to quantify but can be appreciated qualitatively if processes of deposition related to scales of time and the general lack of information are considered. Application of the N
Modelling of Transport Projects Uncertainties
DEFF Research Database (Denmark)
Salling, Kim Bang; Leleur, Steen
2012-01-01
This paper proposes a new way of handling the uncertainties present in transport decision making based on infrastructure appraisals. The paper suggests combining the principle of Optimism Bias, which depicts the historical tendency of overestimating transport-related benefits and underestimating...... investment costs, with a quantitative risk analysis based on Monte Carlo simulation, and making use of a set of exploratory scenarios. The analysis is carried out using the CBA-DK model representing the Danish standard approach to socio-economic cost-benefit analysis. Specifically, the paper proposes......-based graphs which function as risk-related decision support for the appraised transport infrastructure project. The presentation of RSF is demonstrated by using an appraisal case concerning a new airfield in the capital of Greenland, Nuuk.
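The combination of Optimism Bias adjustment with Monte Carlo risk analysis can be sketched as follows. All figures are hypothetical and illustrative; they are not taken from the CBA-DK model or the Nuuk appraisal case:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Hypothetical appraisal figures (illustrative only, not CBA-DK or Nuuk data):
cost_est, benefit_est = 800.0, 1000.0    # discounted cost and benefit, MDKK

# Optimism Bias: costs are historically underestimated, so apply a
# right-skewed reference-class uplift to the cost estimate ...
cost_uplift = rng.lognormal(mean=np.log(1.2), sigma=0.25, size=n)
# ... while benefits tend to be overestimated: a multiplier centred below 1.
benefit_factor = rng.normal(loc=0.9, scale=0.15, size=n)

bcr = (benefit_est * benefit_factor) / (cost_est * cost_uplift)

# Risk-related decision support: probability the appraisal stays viable.
p_viable = np.mean(bcr > 1.0)
print(f"mean BCR {bcr.mean():.2f}, P(BCR > 1) = {p_viable:.2f}")
```

The resulting distribution of benefit-cost ratios, rather than a single deterministic value, is what supports risk-aware decision making.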
Towards Ontological Foundations for UML Conceptual Models
Guizzardi, Giancarlo; Herre, Heinrich; Wagner, Gerd; Meerman, Robert; Tari, Zahir
2002-01-01
UML class diagrams can be used as a language for expressing a conceptual model of a domain. We use the General Ontological Language (GOL) and its underlying upper level ontology, proposed in [1], to evaluate the ontological correctness of a conceptual UML class model and to develop guidelines for
A Multivariate Model of Conceptual Change
Taasoobshirazi, Gita; Heddy, Benjamin; Bailey, MarLynn; Farley, John
2016-01-01
The present study used the Cognitive Reconstruction of Knowledge Model (CRKM) model of conceptual change as a framework for developing and testing how key cognitive, motivational, and emotional variables are linked to conceptual change in physics. This study extends an earlier study developed by Taasoobshirazi and Sinatra ("J Res Sci…
Uncertainty in tsunami sediment transport modeling
Jaffe, Bruce E.; Goto, Kazuhisa; Sugawara, Daisuke; Gelfenbaum, Guy R.; La Selle, SeanPaul M.
2016-01-01
Erosion and deposition from tsunamis record information about tsunami hydrodynamics and size that can be interpreted to improve tsunami hazard assessment. We explore sources and methods for quantifying uncertainty in tsunami sediment transport modeling. Uncertainty varies with tsunami, study site, available input data, sediment grain size, and model. Although uncertainty has the potential to be large, published case studies indicate that both forward and inverse tsunami sediment transport models perform well enough to be useful for deciphering tsunami characteristics, including size, from deposits. New techniques for quantifying uncertainty, such as Ensemble Kalman Filtering inversion, and more rigorous reporting of uncertainties will advance the science of tsunami sediment transport modeling. Uncertainty may be decreased with additional laboratory studies that increase our understanding of the semi-empirical parameters and physics of tsunami sediment transport, standardized benchmark tests to assess model performance, and development of hybrid modeling approaches to exploit the strengths of forward and inverse models.
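The Ensemble Kalman Filtering inversion mentioned above can be illustrated with a minimal scalar example: inferring a flow speed from one observed deposit thickness. The quadratic forward model and all numbers are illustrative assumptions, not an actual tsunami sediment transport model:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy inverse problem: infer a tsunami flow speed u from one observed
# deposit thickness via an ensemble Kalman update. Illustrative only.
def forward(u):
    return 0.05 * u ** 2          # deposit thickness as a function of speed

u_true = 6.0
obs_sd = 0.05
obs = forward(u_true) + rng.normal(0.0, obs_sd)      # one noisy observation

ens = rng.normal(5.0, 2.0, size=500)                 # prior ensemble of u
pred = forward(ens) + rng.normal(0.0, obs_sd, 500)   # perturbed predictions
gain = np.cov(ens, pred)[0, 1] / pred.var(ddof=1)    # Kalman gain (scalar)
ens_post = ens + gain * (obs - pred)                 # updated ensemble

print(f"prior mean {ens.mean():.2f} -> posterior mean {ens_post.mean():.2f}")
```

The posterior ensemble both shifts toward the value implied by the observation and tightens, which is how the update quantifies the reduction in uncertainty.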
Bayesian Uncertainty Analyses Via Deterministic Model
Krzysztofowicz, R.
2001-05-01
Rational decision-making requires that the total uncertainty about a variate of interest (a predictand) be quantified in terms of a probability distribution, conditional on all available information and knowledge. Suppose the state-of-knowledge is embodied in a deterministic model, which is imperfect and outputs only an estimate of the predictand. Fundamentals are presented of three Bayesian approaches to producing a probability distribution of the predictand via any deterministic model. The Bayesian Processor of Output (BPO) quantifies the total uncertainty in terms of a posterior distribution, conditional on model output. The Bayesian Processor of Ensemble (BPE) quantifies the total uncertainty in terms of a posterior distribution, conditional on an ensemble of model output. The Bayesian Forecasting System (BFS) decomposes the total uncertainty into input uncertainty and model uncertainty, which are characterized independently and then integrated into a predictive distribution.
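A minimal normal-linear sketch of the BPO idea: the prior on the predictand is combined with a historical calibration of model output against the predictand to yield a posterior distribution conditional on today's model output. All numbers are illustrative assumptions, not from the paper:

```python
import numpy as np

# Normal-linear sketch of the Bayesian Processor of Output (BPO):
# prior W ~ N(m, s2); the deterministic model's output is calibrated
# historically as X | W=w ~ N(a + b*w, v2). Illustrative numbers only.
m, s2 = 10.0, 4.0          # prior mean and variance of the predictand
a, b, v2 = 0.5, 1.0, 1.0   # regression of model output on the predictand

x = 12.3                   # today's model output

# Conjugate normal update: posterior of W given X=x is again normal.
post_var = 1.0 / (1.0 / s2 + b ** 2 / v2)
post_mean = post_var * (m / s2 + b * (x - a) / v2)
print(f"posterior: N({post_mean:.2f}, {post_var:.2f})")
```

The posterior variance is smaller than the prior variance, quantifying how much total uncertainty the (imperfect) model output removes.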
Wastewater treatment modelling: dealing with uncertainties
DEFF Research Database (Denmark)
Belia, E.; Amerlinck, Y.; Benedetti, L.;
2009-01-01
This paper serves as a problem statement of the issues surrounding uncertainty in wastewater treatment modelling. The paper proposes a structure for identifying the sources of uncertainty introduced during each step of an engineering project concerned with model-based design or optimisation...... of a wastewater treatment system. It briefly references the methods currently used to evaluate prediction accuracy and uncertainty and discusses the relevance of uncertainty evaluations in model applications. The paper aims to raise awareness and initiate a comprehensive discussion among professionals on model...
Uncertainty propagation within the UNEDF models
Haverinen, T
2016-01-01
The parameters of nuclear energy density functionals have to be adjusted to experimental data. As a result they carry a certain uncertainty, which then propagates to calculated values of observables. In the present work we quantify the statistical uncertainties on binding energies for three UNEDF Skyrme energy density functionals by taking advantage of the knowledge of the model parameter uncertainties. We find that the uncertainty of the UNEDF models increases rapidly when going towards proton- or neutron-rich nuclei. We also investigate the impact of each model parameter on the total error budget.
Uncertainty propagation within the UNEDF models
Haverinen, T.; Kortelainen, M.
2017-04-01
The parameters of nuclear energy density functionals have to be adjusted to experimental data. As a result they carry a certain uncertainty, which then propagates to calculated values of observables. In the present work we quantify the statistical uncertainties of binding energies, proton quadrupole moments and proton matter radii for three UNEDF Skyrme energy density functionals by taking advantage of the knowledge of the model parameter uncertainties. We find that the uncertainty of the UNEDF models increases rapidly when going towards proton- or neutron-rich nuclei. We also investigate the impact of each model parameter on the total error budget.
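The standard way to propagate a fitted parameter covariance to an observable, as in these UNEDF studies, is the linear "sandwich" formula var(O) = G Σ Gᵀ, with G the gradient of the observable with respect to the parameters at the optimum. The toy observable and covariance matrix below are illustrative, not actual UNEDF data:

```python
import numpy as np

# Linear ("sandwich") propagation of a fitted parameter covariance Sigma
# to an observable O: var(O) = G @ Sigma @ G. Illustrative numbers only.
Sigma = np.array([[0.04, 0.01, 0.00],
                  [0.01, 0.09, 0.02],
                  [0.00, 0.02, 0.25]])   # parameter covariance matrix

def observable(p):
    return p[0] * 100.0 + p[1] * p[2]    # toy binding-energy-like model

p0 = np.array([1.0, 2.0, 3.0])           # fitted parameter values

# Finite-difference gradient dO/dp at the optimum.
eps = 1e-6
G = np.array([(observable(p0 + eps * np.eye(3)[i]) - observable(p0)) / eps
              for i in range(3)])

var_O = G @ Sigma @ G
print(f"propagated standard deviation of the observable: {np.sqrt(var_O):.2f}")
```

Inspecting the individual terms G[i] * (Sigma @ G)[i] gives the per-parameter contributions to the total error budget mentioned in the abstract.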
Conceptual Models Core to Good Design
Johnson, Jeff
2011-01-01
People make use of software applications in their activities, applying them as tools in carrying out tasks. That this use should be good for people--easy, effective, efficient, and enjoyable--is a principal goal of design. In this book, we present the notion of Conceptual Models, and argue that Conceptual Models are core to achieving good design. From years of helping companies create software applications, we have come to believe that building applications without Conceptual Models is just asking for designs that will be confusing and difficult to learn, remember, and use. We show how Concept
The role of observational uncertainties in testing model hypotheses
Westerberg, I. K.; Birkel, C.
2012-12-01
Knowledge about hydrological processes and the spatial and temporal distribution of water resources is needed as a basis for managing water for hydropower, agriculture and flood-protection. Conceptual hydrological models may be used to infer knowledge on catchment functioning but are affected by uncertainties in the model representation of reality as well as in the observational data used to drive the model and to evaluate model performance. Therefore, meaningful hypothesis testing of the hydrological functioning of a catchment requires such uncertainties to be carefully estimated and accounted for in model calibration and evaluation. The aim of this study was to investigate the role of observational uncertainties in hypothesis testing, in particular whether it was possible to detect model-structural representations that were wrong in an important way given the uncertainties in the observational data. We studied the relatively data-scarce tropical Sarapiqui catchment in Costa Rica, Central America, where water resources play a vital role in hydropower production and livelihoods. We tested several model structures of varying complexity as hypotheses about catchment functioning, but also hypotheses about the nature of the modelling errors. The tests were made within a learning framework for uncertainty estimation which enabled insights into data uncertainties, suitable model-structural representations and appropriate likelihoods. The observational uncertainty in discharge data was estimated from a rating-curve analysis and precipitation measurement errors through scenarios relating the error to, for example, canopy interception, wind-driven rain and the elevation gradient. The hypotheses were evaluated in a posterior analysis of the simulations where the performance of each simulation was analysed relative to the observational uncertainties for the entire hydrograph as well as for different aspects of the hydrograph (e.g. peak flows, recession periods, and base flow
Parameter and Uncertainty Estimation in Groundwater Modelling
DEFF Research Database (Denmark)
Jensen, Jacob Birk
The data basis on which groundwater models are constructed is in general very incomplete, and this leads to uncertainty in model outcome. Groundwater models form the basis for many, often costly decisions and if these are to be made on solid grounds, the uncertainty attached to model results must...... be quantified. This study was motivated by the need to estimate the uncertainty involved in groundwater models.Chapter 2 presents an integrated surface/subsurface unstructured finite difference model that was developed and applied to a synthetic case study.The following two chapters concern calibration...... and uncertainty estimation. Essential issues relating to calibration are discussed. The classical regression methods are described; however, the main focus is on the Generalized Likelihood Uncertainty Estimation (GLUE) methodology. The next two chapters describe case studies in which the GLUE methodology...
Return Predictability, Model Uncertainty, and Robust Investment
DEFF Research Database (Denmark)
Lukas, Manuel
Stock return predictability is subject to great uncertainty. In this paper we use the model confidence set approach to quantify uncertainty about expected utility from investment, accounting for potential return predictability. For monthly US data and six representative return prediction models, we...
Uncertainty in spatially explicit animal dispersal models
Mooij, W.M.; DeAngelis, D.L.
2003-01-01
Uncertainty in estimates of survival of dispersing animals is a vexing difficulty in conservation biology. The current notion is that this uncertainty decreases the usefulness of spatially explicit population models in particular. We examined this problem by comparing dispersal models of three level
Modeling uncertainty in geographic information and analysis
Institute of Scientific and Technical Information of China (English)
2008-01-01
Uncertainty modeling and data quality for spatial data and spatial analyses are important topics in geographic information science, together with space and time in geography and spatial analysis. In the past two decades, much effort has been devoted to research on uncertainty modeling for spatial data and analyses. This paper presents our work in this research area. In particular, four advances are presented: (a) from determinedness-based to uncertainty-based representation of geographic objects in GIS; (b) from uncertainty modeling for static data to dynamic spatial analyses; (c) from modeling uncertainty for spatial data to models; and (d) from error descriptions to quality control for spatial data.
Front-end conceptual platform modeling
DEFF Research Database (Denmark)
Guðlaugsson, Tómas Vignir; Ravn, Poul Martin; Mortensen, Niels Henrik
2014-01-01
Platform thinking has been the subject of investigation and deployment in many projects in both academia and industry. Most contributions involve the restructuring of product programs, and only a few support front-end development of a new platform in parallel with technology development....... This contribution deals with the development of product platforms in front-end projects and introduces a modeling tool: the Conceptual Product Platform model. State of the art within platform modeling forms the base of a modeling formalism for a Conceptual Product Platform model. The modeling formalism is explored...... through an example and applied in a case in which the Conceptual Product Platform model has supported the front-end development of a platform for an electro-active polymer technology. The case describes the contents of the model and how its application supported the development work in the project...
Urban drainage models - making uncertainty analysis simple
DEFF Research Database (Denmark)
Vezzaro, Luca; Mikkelsen, Peter Steen; Deletic, Ana
2012-01-01
There is increasing awareness about uncertainties in modelling of urban drainage systems and, as such, many new methods for uncertainty analyses have been developed. Despite this, all available methods have limitations which restrict their widespread application among practitioners. Here, a modif...
Nuclear data requirements for the ADS conceptual design EFIT: Uncertainty and sensitivity study
Energy Technology Data Exchange (ETDEWEB)
Garcia-Herranz, N., E-mail: nuria@din.upm.e [Departamento de Ingenieria Nuclear, Universidad Politecnica de Madrid (Spain); Instituto de Fusion Nuclear, Universidad Politecnica de Madrid (Spain); Cabellos, O. [Departamento de Ingenieria Nuclear, Universidad Politecnica de Madrid (Spain); Instituto de Fusion Nuclear, Universidad Politecnica de Madrid (Spain); Alvarez-Velarde, F. [CIEMAT (Spain); Sanz, J. [Instituto de Fusion Nuclear, Universidad Politecnica de Madrid (Spain); Departamento de Ingenieria Energetica, UNED (Spain); Gonzalez-Romero, E.M. [CIEMAT (Spain); Juan, J. [Laboratorio de Estadistica, Universidad Politecnica de Madrid (Spain)
2010-11-15
In this paper, we assess the impact of activation cross-section uncertainties on relevant fuel cycle parameters for a conceptual design of a modular European Facility for Industrial Transmutation (EFIT) with a 'double strata' fuel cycle. Next, the nuclear data requirements are evaluated so that the parameters can meet the assigned design target accuracies. Different discharge burn-up levels are considered: a low burn-up, corresponding to the equilibrium cycle, and a high burn-up level, simulating the effects on the fuel of the multi-recycling scenario. In order to perform this study, we propose a methodology in two steps. Firstly, we compute the uncertainties on the system parameters by using a Monte Carlo simulation, as it is considered the most reliable approach to address this problem. Secondly, the analysis of the results is performed by a sensitivity technique, in order to identify the relevant reaction channels and prioritize the data improvement needs. Cross-section uncertainties are taken from the EAF-2007/UN library since it includes data for all the actinides potentially present in the irradiated fuel. Relevant uncertainties in some of the fuel cycle parameters have been obtained, and we conclude with recommendations for future nuclear data measurement programs, beyond the specific results obtained with the present nuclear data files and the limited available covariance information. A comparison with the uncertainty and accuracy analysis recently published by the WPEC-Subgroup26 of the OECD using BOLNA covariance matrices is performed. Despite the differences in the transmuter reactor used for the analysis, some conclusions obtained by Subgroup26 are qualitatively corroborated, and improvements for additional cross sections are suggested.
Incorporating Fuzzy Systems Modeling and Possibility Theory in Hydrogeological Uncertainty Analysis
Faybishenko, B.
2008-12-01
Hydrogeological predictions are subject to numerous uncertainties, including the development of conceptual, mathematical, and numerical models, as well as determination of their parameters. Stochastic simulations of hydrogeological systems and the associated uncertainty analysis are usually based on the assumption that the data characterizing spatial and temporal variations of hydrogeological processes are random, and the output uncertainty is quantified using a probability distribution. However, hydrogeological systems are often characterized by imprecise, vague, inconsistent, incomplete or subjective information. One of the modern approaches to modeling and uncertainty quantification of such systems is based on using a combination of statistical and fuzzy-logic uncertainty analyses. The aims of this presentation are to: (1) present evidence of fuzziness in developing conceptual hydrogeological models, and (2) give examples of the integration of the statistical and fuzzy-logic analyses in modeling and assessing both aleatoric uncertainties (e.g., caused by vagueness in assessing the subsurface system heterogeneities of fractured-porous media) and epistemic uncertainties (e.g., caused by the selection of different simulation models) involved in hydrogeological modeling. The author will discuss several case studies illustrating the application of fuzzy modeling for assessing the water balance and water travel time in unsaturated-saturated media. These examples will include the evaluation of associated uncertainties using the main concepts of possibility theory, a comparison between the uncertainty evaluation using probabilistic and possibility theories, and a transformation of the probabilities into possibilities distributions (and vice versa) for modeling hydrogeological processes.
Analysis of Subjective Conceptualizations Towards Collective Conceptual Modelling
DEFF Research Database (Denmark)
Kano Glückstad, Fumiko; Herlau, Tue; Schmidt, Mikkel N.
2013-01-01
This work is conducted as a preliminary study for a project where individuals' conceptualizations of domain knowledge will thoroughly be analyzed across 150 subjects from 6 countries. The project aims at investigating how humans' conceptualizations differ according to different types of mother langua...
Analysis of Subjective Conceptualizations Towards Collective Conceptual Modelling
DEFF Research Database (Denmark)
Glückstad, Fumiko Kano; Herlau, Tue; Schmidt, Mikkel Nørgaard
2013-01-01
This work is conducted as a preliminary study for a project where individuals' conceptualizations of domain knowledge will thoroughly be analyzed across 150 subjects from 6 countries. The project aims at investigating how humans' conceptualizations differ according to different types of mother la...
Photovoltaic System Modeling. Uncertainty and Sensitivity Analyses
Energy Technology Data Exchange (ETDEWEB)
Hansen, Clifford W. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Martin, Curtis E. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)
2015-08-01
We report an uncertainty and sensitivity analysis for modeling AC energy from photovoltaic systems. Output from a PV system is predicted by a sequence of models. We quantify uncertainty in the output of each model using empirical distributions of each model's residuals. We propagate uncertainty through the sequence of models by sampling these distributions to obtain an empirical distribution of a PV system's output. We consider models that: (1) translate measured global horizontal, direct and global diffuse irradiance to plane-of-array irradiance; (2) estimate effective irradiance; (3) predict cell temperature; (4) estimate DC voltage, current and power; (5) reduce DC power for losses due to inefficient maximum power point tracking or mismatch among modules; and (6) convert DC to AC power. Our analysis considers a notional PV system comprising an array of First Solar FS-387 modules and a 250 kW AC inverter; we use measured irradiance and weather at Albuquerque, NM. We found the uncertainty in PV system output to be relatively small, on the order of 1% for daily energy. We found that uncertainty in the models for POA irradiance and effective irradiance are the dominant contributors to uncertainty in predicted daily energy. Our analysis indicates that efforts to reduce the uncertainty in PV system output predictions may yield the greatest improvements by focusing on the POA and effective irradiance models.
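The residual-resampling propagation described above can be sketched for a shortened two-step chain. The coefficients, residual samples, and the chain itself are illustrative stand-ins, not the six-model sequence or data from the report:

```python
import numpy as np

rng = np.random.default_rng(7)

# Propagate uncertainty through a chain of models by resampling each
# model's empirical residuals. Two-step toy chain, illustrative numbers.
poa_residuals = rng.normal(0.0, 20.0, 1000)   # W/m2, from a "calibration" set
power_residuals = rng.normal(0.0, 1.5, 1000)  # kW

ghi = 800.0          # one measured global horizontal irradiance value, W/m2
n = 20_000           # Monte Carlo samples

# Step 1: transposition model GHI -> plane-of-array irradiance + residual.
poa = 1.1 * ghi + rng.choice(poa_residuals, n)
# Step 2: irradiance -> AC power (toy linear system model) + residual.
power = 0.0625 * poa + rng.choice(power_residuals, n)

rel_unc = power.std() / power.mean()
print(f"AC power {power.mean():.1f} kW, relative uncertainty {rel_unc:.1%}")
```

Because residuals are drawn from each model's own empirical distribution, no parametric error assumption is needed, and the contribution of each stage can be isolated by zeroing out the other stages' residuals.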
Event-Based Conceptual Modeling
DEFF Research Database (Denmark)
Bækgaard, Lars
The paper demonstrates that a wide variety of event-based modeling approaches are based on special cases of the same general event concept, and that the general event concept can be used to unify the otherwise unrelated fields of information modeling and process modeling. A set of event......-based modeling approaches are analyzed and the results are used to formulate a general event concept that can be used for unifying the seemingly unrelated event concepts. Events are characterized as short-duration processes that have participants, consequences, and properties, and that may be modeled in terms...... of information structures. The general event concept can be used to guide systems analysis and design and to improve modeling approaches....
Selective Maintenance Model Considering Time Uncertainty
Le Chen; Zhengping Shu; Yuan Li; Xuezhi Lv
2012-01-01
This study proposes a selective maintenance model for a weapon system during mission intervals. First, it gives relevant definitions and the operational process of the material support system. Then, it reviews current research on selective maintenance modeling. Finally, it establishes a numerical model for selecting corrective and preventive maintenance tasks, considering the time uncertainty caused by the unpredictability of maintenance procedures, indeterminate downtime for spares and difference of skil...
Uncertainty calculation in transport models and forecasts
DEFF Research Database (Denmark)
Manzo, Stefano; Prato, Carlo Giacomo
in a four-stage transport model related to different variable distributions (to be used in a Monte Carlo simulation procedure), assignment procedures and levels of congestion, at both the link and the network level. The analysis used as case study the Næstved model, referring to the Danish town of Næstved2...... the uncertainty propagation pattern over time specific for key model outputs becomes strategically important. 1 Manzo, S., Nielsen, O. A. & Prato, C. G. (2014). The Effects of uncertainty in speed-flow curve parameters on a large-scale model. Transportation Research Record, 1, 30-37. 2 Manzo, S., Nielsen, O. A...
FRSAD conceptual modeling of aboutness
Zeng, Marcia; Žumer, Maja
2012-01-01
The first comprehensive exploration of the development and use of the International Federation of Library Associations and Institutions' (IFLA) newly released model for subject authority data, covering everything from the rationale for creating the model to practical steps for implementing it.
Uncertainty modeling process for semantic technology
Directory of Open Access Journals (Sweden)
Rommel N. Carvalho
2016-08-01
Full Text Available The ubiquity of uncertainty across application domains generates a need for principled support for uncertainty management in semantically aware systems. A probabilistic ontology provides constructs for representing uncertainty in domain ontologies. While the literature has been growing on formalisms for representing uncertainty in ontologies, there remains little guidance in the knowledge engineering literature for how to design probabilistic ontologies. To address the gap, this paper presents the Uncertainty Modeling Process for Semantic Technology (UMP-ST, a new methodology for modeling probabilistic ontologies. To explain how the methodology works and to verify that it can be applied to different scenarios, this paper describes step-by-step the construction of a proof-of-concept probabilistic ontology. The resulting domain model can be used to support identification of fraud in public procurements in Brazil. While the case study illustrates the development of a probabilistic ontology in the PR-OWL probabilistic ontology language, the methodology is applicable to any ontology formalism that properly integrates uncertainty with domain semantics.
Parametric uncertainty modeling for robust control
DEFF Research Database (Denmark)
Rasmussen, K.H.; Jørgensen, Sten Bay
1999-01-01
The dynamic behaviour of a non-linear process can often be approximated with a time-varying linear model. In the presented methodology the dynamics are modeled non-conservatively as parametric uncertainty in linear time-invariant models. The obtained uncertainty description makes it possible...... method can be utilized in identification of a nominal model with uncertainty description. The method is demonstrated on a binary distillation column operating in the LV configuration. The dynamics of the column are approximated by a second-order linear model, wherein the parameters vary as the operating...... to perform robustness analysis on a control system using the structured singular value. The idea behind the proposed method is to fit a rational function to the parameter variation. The parameter variation can then be expressed as a linear fractional transformation (LFT). It is discussed how the proposed...
Statistical assessment of predictive modeling uncertainty
Barzaghi, Riccardo; Marotta, Anna Maria
2017-04-01
When the results of geophysical models are compared with data, the uncertainties of the model are typically disregarded. We propose a method for defining the uncertainty of a geophysical model based on a numerical procedure that estimates the empirical auto and cross-covariances of model-estimated quantities. These empirical values are then fitted by proper covariance functions and used to compute the covariance matrix associated with the model predictions. The method is tested using a geophysical finite element model in the Mediterranean region. Using a novel χ2 analysis in which both data and model uncertainties are taken into account, the model's estimated tectonic strain pattern due to the Africa-Eurasia convergence in the area that extends from the Calabrian Arc to the Alpine domain is compared with that estimated from GPS velocities while taking into account the model uncertainty through its covariance structure and the covariance of the GPS estimates. The results indicate that including the estimated model covariance in the testing procedure leads to lower observed χ2 values that have better statistical significance and might help a sharper identification of the best-fitting geophysical models.
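The χ² comparison with both covariances can be written compactly: the residual between model and data is weighted by the inverse of the summed covariance matrices. The vectors and matrices below are illustrative, not the Mediterranean model or GPS data:

```python
import numpy as np

# Chi-squared test of model vs. data when BOTH carry covariance:
# chi2 = r @ (C_model + C_data)^-1 @ r, with r the residual vector.
# All numbers are illustrative.
model = np.array([1.0, 2.0, 3.0])     # model-predicted quantities
data = np.array([1.2, 1.9, 3.3])      # e.g. GPS-derived estimates

C_data = np.diag([0.04, 0.04, 0.09])  # observation covariance
C_model = np.array([[0.02, 0.01, 0.00],   # from empirical covariances
                    [0.01, 0.03, 0.01],   # fitted with covariance functions
                    [0.00, 0.01, 0.05]])  # (method described in the paper)

r = data - model
chi2_combined = r @ np.linalg.solve(C_model + C_data, r)
chi2_data_only = r @ np.linalg.solve(C_data, r)
print(f"chi2 with model covariance {chi2_combined:.2f}, without {chi2_data_only:.2f}")
```

Since adding a positive semi-definite model covariance can only inflate the total covariance, the combined χ² is never larger than the data-only value, consistent with the lower χ² values reported in the abstract.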
Uncertainty Quantification in Climate Modeling and Projection
Energy Technology Data Exchange (ETDEWEB)
Qian, Yun; Jackson, Charles; Giorgi, Filippo; Booth, Ben; Duan, Qingyun; Forest, Chris; Higdon, Dave; Hou, Z. Jason; Huerta, Gabriel
2016-05-01
The projection of future climate is one of the most complex problems undertaken by the scientific community. Although scientists have been striving to better understand the physical basis of the climate system and to improve climate models, the overall uncertainty in projections of future climate has not been significantly reduced (e.g., from the IPCC AR4 to AR5). With the rapid increase of complexity in Earth system models, reducing uncertainties in climate projections becomes extremely challenging. Since uncertainties always exist in climate models, interpreting the strengths and limitations of future climate projections is key to evaluating risks, and climate change information for use in Vulnerability, Impact, and Adaptation (VIA) studies should be provided with both well-characterized and well-quantified uncertainty. The workshop aimed at providing participants, many of them from developing countries, information on strategies to quantify the uncertainty in climate model projections and assess the reliability of climate change information for decision-making. The program included a mixture of lectures on fundamental concepts in Bayesian inference and sampling, applications, and hands-on computer laboratory exercises employing software packages for Bayesian inference, Markov Chain Monte Carlo methods, and global sensitivity analyses. The lectures covered a range of scientific issues underlying the evaluation of uncertainties in climate projections, such as the effects of uncertain initial and boundary conditions, uncertain physics, and limitations of observational records. Progress in quantitatively estimating uncertainties in hydrologic, land surface, and atmospheric models at both regional and global scales was also reviewed. The application of Uncertainty Quantification (UQ) concepts to coupled climate system models is still in its infancy. The Coupled Model Intercomparison Project (CMIP) multi-model ensemble currently represents the primary data for
Handling Unquantifiable Uncertainties in Landslide Modelling
Almeida, S.; Holcombe, E.; Pianosi, F.; Wagener, T.
2015-12-01
Landslides have many negative economic and societal impacts, including the potential for significant loss of life and damage to infrastructure. Slope stability assessment can be used to guide decisions about the management of landslide risk, but its usefulness can be challenged by high levels of uncertainty in predicting landslide occurrence. Prediction uncertainty may be associated with the choice of model that is used to assess slope stability, the quality of the available input data, or a lack of knowledge of how future climatic and socio-economic changes may affect future landslide risk. While some of these uncertainties can be characterised by relatively well-defined probability distributions, for other uncertainties, such as those linked to climate change, there is no agreement on what probability distribution should be used to characterise them. This latter type of uncertainty, often referred to as deep uncertainty, means that robust policies need to be developed that are expected to perform adequately under a wide range of future conditions. In our study the impact of deep uncertainty on slope stability predictions is assessed in a quantitative and structured manner using Global Sensitivity Analysis (GSA) and the Combined Hydrology and Stability Model (CHASM). In particular, we use and combine several GSA methods including the Method of Morris, Regional Sensitivity Analysis and CART, as well as advanced visualization tools. Our example application is a slope in the Caribbean, an area that is naturally susceptible to landslides due to a combination of high rainfall rates, steep slopes, and highly weathered residual soils. Rapid unplanned urbanisation and changing climate may further exacerbate landslide risk in the future. Our example shows how we can gain useful information in the presence of deep uncertainty by combining physically based models with GSA in a scenario discovery framework.
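The screening step described in the abstract can be sketched compactly. The code below is a minimal, self-contained implementation of the Method of Morris (elementary effects) applied to a purely hypothetical factor-of-safety function; the function, parameter names, and ranges are illustrative stand-ins, not CHASM or data from the study.

```python
import numpy as np

def scale(u, bounds):
    """Map unit-cube coordinates onto the physical parameter ranges."""
    lo = np.array([b[0] for b in bounds])
    hi = np.array([b[1] for b in bounds])
    return lo + u * (hi - lo)

def factor_of_safety(x):
    # Hypothetical slope-stability proxy (NOT the CHASM model):
    # x = [cohesion (kPa), friction angle (deg), pore-pressure ratio]
    c, phi, ru = x
    return (c / 20.0 + np.tan(np.radians(phi)) * (1.0 - ru)) / 0.9

def morris_screening(model, bounds, r=50, delta=0.5, seed=0):
    """Method of Morris: r randomized one-at-a-time trajectories,
    returning the mu* (mean absolute effect) and sigma statistics."""
    rng = np.random.default_rng(seed)
    k = len(bounds)
    effects = np.empty((r, k))
    for t in range(r):
        x = rng.uniform(0.0, 1.0 - delta, size=k)   # base point in the unit cube
        y0 = model(scale(x, bounds))
        for i in rng.permutation(k):                # perturb factors in random order
            x[i] += delta
            y1 = model(scale(x, bounds))
            effects[t, i] = (y1 - y0) / delta       # elementary effect of factor i
            y0 = y1
    return np.abs(effects).mean(axis=0), effects.std(axis=0)

bounds = [(5.0, 25.0), (20.0, 40.0), (0.0, 0.6)]    # assumed parameter ranges
mu_star, sigma = morris_screening(factor_of_safety, bounds)
```

Large mu* flags an influential factor; large sigma flags interaction or nonlinearity, which is what makes Morris useful as a cheap first screen before more expensive variance-based GSA.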
Logistics and Transport - a conceptual model
DEFF Research Database (Denmark)
Jespersen, Per Homann; Drewes, Lise
2004-01-01
This paper describes how the freight transport sector is influenced by logistical principles of production and distribution. It introduces new ways of understanding freight transport as an integrated part of the changing trends of mobility. By introducing a conceptual model for understanding the interaction between logistics and transport, it points at ways to overcome inherent methodological difficulties when studying this relation...
Conceptual Models of Frontal Cyclones.
Eagleman, Joe R.
1981-01-01
This discussion of weather models uses maps to illustrate the differences among three types of frontal cyclones (long wave, short wave, and troughs). Awareness of these cyclones can provide clues to atmospheric conditions which can lead toward accurate weather forecasting. (AM)
Model Uncertainty for Bilinear Hysteretic Systems
DEFF Research Database (Denmark)
1984-01-01
is related to the concept of a failure surface (or limit state surface) in the n-dimensional basic variable space then model uncertainty is at least due to the neglected variables, the modelling of the failure surface and the computational technique used. A more precise definition is given in section 2...
Coping with Uncertainty Modeling and Policy Issues
Marti, Kurt; Makowski, Marek
2006-01-01
Ongoing global changes bring fundamentally new scientific problems requiring new concepts and tools. The complexity of these new problems does not allow enough certainty to be achieved simply by increasing the resolution of models or by adding more links. This book presents new tools for the modeling and management of uncertainty.
Uncertainty in spatially explicit animal dispersal models
Mooij, Wolf M.; DeAngelis, Donald L.
2003-01-01
Uncertainty in estimates of survival of dispersing animals is a vexing difficulty in conservation biology. The current notion is that this uncertainty decreases the usefulness of spatially explicit population models in particular. We examined this problem by comparing dispersal models of three levels of complexity: (1) an event-based binomial model that considers only the occurrence of mortality or arrival, (2) a temporally explicit exponential model that employs mortality and arrival rates, and (3) a spatially explicit grid-walk model that simulates the movement of animals through an artificial landscape. Each model was fitted to the same set of field data. A first objective of the paper is to illustrate how the maximum-likelihood method can be used in all three cases to estimate the means and confidence limits for the relevant model parameters, given a particular set of data on dispersal survival. Using this framework we show that the structure of the uncertainty for all three models is strikingly similar. In fact, the results of our unified approach imply that spatially explicit dispersal models, which take advantage of information on landscape details, suffer less from uncertainty than do simpler models. Moreover, we show that the proposed strategy of model development safeguards one from error propagation in these more complex models. Finally, our approach shows that all models related to animal dispersal, ranging from simple to complex, can be related in a hierarchical fashion, so that the various approaches to modeling such dispersal can be viewed from a unified perspective.
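The maximum-likelihood framework for the simplest of the three models, the event-based binomial one, can be sketched as follows. The counts are invented for illustration (not the paper's field data), and the confidence limits use the standard likelihood-ratio (profile) construction with a chi-square cutoff.

```python
import numpy as np
from scipy.stats import binom, chi2

def binom_loglik(p, k, n):
    """Log-likelihood of survival probability p given k arrivals out of n dispersers."""
    return binom.logpmf(k, n, p)

def mle_with_ci(k, n, alpha=0.05):
    """MLE of dispersal survival with likelihood-ratio confidence limits:
    keep all p whose log-likelihood is within chi2(1)/2 of the maximum."""
    p_hat = k / n
    cutoff = binom_loglik(p_hat, k, n) - chi2.ppf(1 - alpha, df=1) / 2
    grid = np.linspace(1e-6, 1 - 1e-6, 100_000)
    inside = grid[binom_loglik(grid, k, n) >= cutoff]
    return p_hat, inside.min(), inside.max()

# Illustrative data: 34 of 60 tracked dispersers arrived
p_hat, lo, hi = mle_with_ci(k=34, n=60)
```

The exponential and grid-walk models fit into the same scheme: only the likelihood function changes, which is exactly why the paper can compare uncertainty structure across the three model complexities.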
Uncertainty in biology a computational modeling approach
Gomez-Cabrero, David
2016-01-01
Computational modeling of biomedical processes is gaining more and more weight in current research into the etiology of biomedical problems and potential treatment strategies. Computational modeling makes it possible to reduce, refine and replace animal experimentation as well as to translate findings obtained in these experiments to the human background. However, these biomedical problems are inherently complex, with a myriad of influencing factors, which strongly complicates the model building and validation process. This book addresses four main issues related to the building and validation of computational models of biomedical processes: model establishment under uncertainty; model selection and parameter fitting; sensitivity analysis and model adaptation; and model predictions under uncertainty. In each of these areas, the book discusses a number of key techniques by means of a general theoretical description followed by one or more practical examples. This book is intended for graduate stude...
Uncertainty Analysis Framework - Hanford Site-Wide Groundwater Flow and Transport Model
Energy Technology Data Exchange (ETDEWEB)
Cole, Charles R.; Bergeron, Marcel P.; Murray, Christopher J.; Thorne, Paul D.; Wurstner, Signe K.; Rogers, Phillip M.
2001-11-09
Pacific Northwest National Laboratory (PNNL) embarked on a new initiative to strengthen the technical defensibility of the predictions being made with a site-wide groundwater flow and transport model at the U.S. Department of Energy Hanford Site in southeastern Washington State. In FY 2000, the focus of the initiative was on the characterization of major uncertainties in the current conceptual model that would affect model predictions. The long-term goals of the initiative are the development and implementation of an uncertainty estimation methodology in future assessments and analyses using the site-wide model. This report focuses on the development and implementation of an uncertainty analysis framework.
Calibration of Conceptual Rainfall-Runoff Models Using Global Optimization
Directory of Open Access Journals (Sweden)
Chao Zhang
2015-01-01
Full Text Available Parameter optimization for conceptual rainfall-runoff (CRR) models has always been a difficult problem in hydrology, since watershed hydrological models are high-dimensional and nonlinear, with multimodal and nonconvex response surfaces, and their parameters are strongly interrelated and complementary. In the research presented here, the shuffled complex evolution (SCE-UA) global optimization method was used to calibrate the Xinanjiang (XAJ) model. We defined ideal data and applied the method to observed data. Our results show that, in the case of ideal data, the data length did not affect the parameter optimization for the hydrological model. If the objective function was selected appropriately, the proposed method found the true parameter values. In the case of observed data, we applied the technique to different lengths of data (1, 2, and 3 years) and compared the results with the ideal data. We found that errors in the data and model structure lead to significant uncertainties in the parameter optimization.
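The calibration loop described above, a global optimizer minimizing an error objective over bounded model parameters, can be sketched in a few lines. SCE-UA itself is not in SciPy's standard toolbox, so the sketch uses differential evolution as an analogous population-based global optimizer, and a deliberately tiny two-parameter linear-reservoir model stands in for Xinanjiang; the "observed" discharge is synthetic.

```python
import numpy as np
from scipy.optimize import differential_evolution

def toy_runoff_model(params, rain):
    """Minimal 2-parameter linear-reservoir model (a stand-in, not Xinanjiang):
    frac of rainfall enters a store that drains at rate store/k per step."""
    k, frac = params
    store, q = 0.0, []
    for r in rain:
        store += frac * r
        out = store / k
        store -= out
        q.append(out)
    return np.array(q)

# Synthetic "observed" discharge from known parameters plus noise
rng = np.random.default_rng(1)
rain = rng.gamma(2.0, 5.0, size=365)
true_q = toy_runoff_model([4.0, 0.6], rain)
obs = true_q + rng.normal(0.0, 0.05, size=true_q.size)

def sse(params):
    """Objective function: sum of squared errors against observations."""
    return np.sum((toy_runoff_model(params, rain) - obs) ** 2)

result = differential_evolution(sse, bounds=[(1.0, 20.0), (0.1, 1.0)], seed=2)
k_est, frac_est = result.x
```

With ideal (noise-free) data this recovers the true parameters exactly, mirroring the paper's first experiment; adding noise or structural error, as in the second experiment, spreads the recovered values away from truth.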
HESS Opinions "Topography driven conceptual modelling (FLEX-Topo"
Directory of Open Access Journals (Sweden)
H. H. G. Savenije
2010-07-01
Full Text Available Heterogeneity and complexity of hydrological processes offer substantial challenges to the hydrological modeller. Some hydrologists try to tackle this problem by introducing more and more detail in their models, or by setting-up more and more complicated models starting from basic principles at the smallest possible level. As we know, this reductionist approach leads to ever higher levels of equifinality and predictive uncertainty. On the other hand, simple, lumped and parsimonious models may be too simple to be realistic or representative of the dominant hydrological processes. In this commentary, a new model approach is proposed that tries to find the middle way between complex distributed and simple lumped modelling approaches. Here we try to find the right level of simplification while avoiding over-simplification. Paraphrasing Einstein, the maxim is: make a model as simple as possible, but not simpler than that. The approach presented is process based, but not physically based in the traditional sense. Instead, it is based on a conceptual representation of the dominant physical processes in certain key elements of the landscape. The essence of the approach is that the model structure is made dependent on a limited number of landscape classes in which the topography is the main driver, but which can include geological, geomorphological or land-use classification. These classes are then represented by lumped conceptual models that act in parallel. The advantage of this approach over a fully distributed conceptualisation is that it retains maximum simplicity while taking into account observable landscape characteristics.
Directory of Open Access Journals (Sweden)
Herwig Reiter
2010-01-01
Full Text Available The article proposes a general, empirically grounded model for analyzing biographical uncertainty. The model is based on findings from a qualitative-explorative study of transforming meanings of unemployment among young people in post-Soviet Lithuania. In a first step, the particular features of the uncertainty puzzle in post-communist youth transitions are briefly discussed. A historical event like the collapse of state socialism in Europe, similar to the recent financial and economic crisis, is a generator of uncertainty par excellence: it undermines the foundations of societies and the taken-for-grantedness of related expectations. Against this background, the case of a young woman and how she responds to the novel threat of unemployment in the transition to the world of work is introduced. Her uncertainty management in the specific time perspective of certainty production is then conceptually rephrased by distinguishing three types or levels of biographical uncertainty: knowledge, outcome, and recognition uncertainty. Biographical uncertainty, it is argued, is empirically observable through the analysis of acting and projecting at the biographical level. The final part synthesizes the empirical findings and the conceptual discussion into a stratification model of biographical uncertainty as a general tool for the biographical analysis of uncertainty phenomena. URN: urn:nbn:de:0114-fqs100120
Uncertainty in hydrological change modelling
DEFF Research Database (Denmark)
Seaby, Lauren Paige
methodology for basin discharge and groundwater heads. The ensemble of 11 climate models varied in strength, significance, and sometimes in direction of the climate change signal. The more complex daily DBS correction methods were more accurate at transferring precipitation changes in mean as well...... as the variance, and improving the characterisation of day to day variation as well as heavy events. However, the most highly parameterised of the DBS methods were less robust under climate change conditions. The spatial characteristics of groundwater head and stream discharge were best represented by DBS methods...... applied at the grid scale. Flux and state hydrological outputs which integrate responses over time and space showed more sensitivity to precipitation mean spatial biases and less so on extremes. In the investigated catchments, the projected change of groundwater levels and basin discharge between current...
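A minimal sketch of the distribution-based scaling (DBS) idea referred to above: wet-day precipitation from a climate model is mapped through gamma distributions fitted to model and observed reference data. The data are synthetic, and this single-gamma, single-season setup is far simpler than the family of DBS methods the thesis compares.

```python
import numpy as np
from scipy import stats

def gamma_quantile_map(model_p, obs_p, future_p, wet_threshold=0.1):
    """DBS sketch: fit gamma distributions to wet days of model and observed
    precipitation over a reference period, then map future model values
    through cdf(model-fit) -> ppf(obs-fit). Dry days are set to zero."""
    mod_wet = model_p[model_p > wet_threshold]
    obs_wet = obs_p[obs_p > wet_threshold]
    a_m, _, s_m = stats.gamma.fit(mod_wet, floc=0)
    a_o, _, s_o = stats.gamma.fit(obs_wet, floc=0)
    return np.where(
        future_p > wet_threshold,
        stats.gamma.ppf(stats.gamma.cdf(future_p, a_m, scale=s_m), a_o, scale=s_o),
        0.0)

rng = np.random.default_rng(3)
obs = rng.gamma(0.8, 9.0, 5000)   # synthetic "observed" wet-day precipitation
mod = rng.gamma(0.8, 6.0, 5000)   # biased model: too little rain per wet day
corrected = gamma_quantile_map(mod, obs, mod)
```

The robustness issue raised in the abstract shows up here directly: the mapping is only as trustworthy as the assumption that the fitted reference-period distributions still describe the future climate.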
Chou, Shuo-Ju
2011-12-01
-makers with the ability to assess or measure the robustness of program requirements against such uncertainties. A literature review of techniques for forecasting technology performance and development uncertainties and subsequent impacts on capability, budget, and schedule requirements resulted in the conclusion that an analysis process that coupled a probabilistic analysis technique such as Monte Carlo Simulations with quantitative and parametric models of technology performance impact and technology development time and cost requirements would allow the probabilities of meeting specific constraints of these requirements to be established. These probabilities of requirements success metrics can then be used as a quantitative and probabilistic measure of program requirements robustness against technology uncertainties. Combined with a Multi-Objective Genetic Algorithm optimization process and computer-based Decision Support System, critical information regarding requirements robustness against technology uncertainties can be captured and quantified for acquisition decision-makers. This results in a more informed and justifiable selection of program technologies during initial program definition as well as formulation of program development and risk management strategies. To meet the stated research objective, the ENhanced TEchnology Robustness Prediction and RISk Evaluation (ENTERPRISE) methodology was formulated to provide a structured and transparent process for integrating these enabling techniques to provide a probabilistic and quantitative assessment of acquisition program requirements robustness against technology performance and development uncertainties. 
In order to demonstrate the capabilities of the ENTERPRISE method and test the research hypotheses, a demonstration application of this method was performed on a notional program for acquiring the Carrier-based Suppression of Enemy Air Defenses (SEAD) capability using Unmanned Combat Aircraft Systems (UCAS) and their enabling
Uncertainties in Surface Layer Modeling
Pendergrass, W.
2015-12-01
A central problem for micrometeorologists has been relating air-surface exchange rates of momentum and heat to quantities that can be predicted with confidence. The flux-gradient profile developed through Monin-Obukhov Similarity Theory (MOST) provides an integration of the dimensionless wind shear expression ϕM(z/L), where ϕM is an empirically derived expression for stable and unstable atmospheric conditions. Empirically derived expressions are far from universally accepted (Garratt, 1992, Table A5). Regardless of what form of these relationships might be used, their significance over any short period of time is questionable, since all of these relationships between fluxes and gradients apply to averages that might rarely occur. It is well accepted that the assumptions of stationarity and homogeneity do not reflect the true chaotic nature of the processes that control the variables considered in these relationships, with the net consequence that the levels of predictability theoretically attainable might never be realized in practice. This matter is of direct relevance to modern prognostic models, which construct forecasts by assuming the universal applicability of relationships among averages for the lower atmosphere, which rarely maintains an average state. Under a Cooperative Research and Development Agreement between NOAA and Duke Energy Generation, NOAA/ATDD conducted atmospheric boundary layer (ABL) research using Duke renewable energy sites as research testbeds. One aspect of this research has been the evaluation of legacy flux-gradient formulations (the ϕ functions; see Monin and Obukhov, 1954) for the exchange of heat and momentum. At the Duke Energy Ocotillo site, NOAA/ATDD installed sonic anemometers reporting wind and temperature fluctuations at 10 Hz at eight elevations. From these observations, ϕM and ϕH were derived from a two-year database of mean and turbulent wind and temperature observations. From this extensive measurement database, using a
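For reference, one widely used set of empirical ϕ forms (the Businger-Dyer relations, one of several variants catalogued by Garratt) can be written down directly. The coefficients 16 and 5 below are common textbook choices, not the values derived from the Ocotillo measurements.

```python
import numpy as np

def phi_m(zeta):
    """Businger-Dyer dimensionless wind shear phi_M(z/L).
    One commonly used empirical form; coefficients vary between studies."""
    zeta = np.asarray(zeta, dtype=float)
    # Clip each branch so the unused branch never takes a fractional
    # power of a negative number (which would produce NaN warnings).
    unstable = (1.0 - 16.0 * np.minimum(zeta, 0.0)) ** (-0.25)
    stable = 1.0 + 5.0 * np.maximum(zeta, 0.0)
    return np.where(zeta < 0.0, unstable, stable)

def phi_h(zeta):
    """Businger-Dyer dimensionless temperature gradient phi_H(z/L)."""
    zeta = np.asarray(zeta, dtype=float)
    unstable = (1.0 - 16.0 * np.minimum(zeta, 0.0)) ** (-0.5)
    stable = 1.0 + 5.0 * np.maximum(zeta, 0.0)
    return np.where(zeta < 0.0, unstable, stable)

# Neutral limit: both functions approach 1 as z/L -> 0
print(phi_m(0.0), phi_h(0.0))   # 1.0 1.0
```

Comparing site-derived ϕM and ϕH against curves like these is exactly the kind of legacy-formulation evaluation the abstract describes.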
Jacquin, A. P.
2012-04-01
This study analyses the effect of precipitation spatial distribution uncertainty on the uncertainty bounds of a snowmelt runoff model's discharge estimates. Prediction uncertainty bounds are derived using the Generalized Likelihood Uncertainty Estimation (GLUE) methodology. The model analysed is a conceptual watershed model operating at a monthly time step. The model divides the catchment into five elevation zones, where the fifth zone corresponds to the catchment glaciers. Precipitation amounts at each elevation zone i are estimated as the product between observed precipitation (at a single station within the catchment) and a precipitation factor FPi. Thus, these factors provide a simplified representation of the spatial variation of precipitation, specifically the shape of the functional relationship between precipitation and height. In the absence of information about appropriate values of the precipitation factors FPi, these are estimated through standard calibration procedures. The catchment case study is Aconcagua River at Chacabuquito, located in the Andean region of Central Chile. Monte Carlo samples of the model output are obtained by randomly varying the model parameters within their feasible ranges. In the first experiment, the precipitation factors FPi are considered unknown and thus included in the sampling process. The total number of unknown parameters in this case is 16. In the second experiment, precipitation factors FPi are estimated a priori, by means of a long term water balance between observed discharge at the catchment outlet, evapotranspiration estimates and observed precipitation. In this case, the number of unknown parameters reduces to 11. The feasible ranges assigned to the precipitation factors in the first experiment are slightly wider than the range of fixed precipitation factors used in the second experiment. The mean squared error of the Box-Cox transformed discharge during the calibration period is used for the evaluation of the
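The GLUE procedure used above, Monte Carlo sampling, an informal likelihood measure, a behavioural threshold, and likelihood-weighted prediction quantiles, can be outlined as follows. The recession-curve model, likelihood measure, and threshold are illustrative choices, not those of the Aconcagua study.

```python
import numpy as np

def glue_bounds(simulations, likelihoods, threshold, quantiles=(0.05, 0.95)):
    """GLUE sketch: keep 'behavioural' simulations whose likelihood measure
    exceeds a threshold, then form likelihood-weighted prediction bounds."""
    keep = likelihoods > threshold
    sims, w = simulations[keep], likelihoods[keep]
    w = w / w.sum()                                   # normalise weights
    lower, upper = [], []
    for t in range(sims.shape[1]):                    # each time step separately
        order = np.argsort(sims[:, t])
        cdf = np.cumsum(w[order])
        lower.append(sims[order, t][np.searchsorted(cdf, quantiles[0])])
        upper.append(sims[order, t][np.searchsorted(cdf, quantiles[1])])
    return np.array(lower), np.array(upper)

# Toy Monte Carlo ensemble: exponential recession curves with a sampled rate
rng = np.random.default_rng(4)
params = rng.uniform(0.5, 2.0, size=(500, 1))
tgrid = np.linspace(0.0, 5.0, 50)
sims = np.exp(-params * tgrid)
obs = np.exp(-1.2 * tgrid)
like = 1.0 / (np.mean((sims - obs) ** 2, axis=1) + 1e-12)  # informal likelihood
lo, hi = glue_bounds(sims, like, threshold=np.quantile(like, 0.7))
```

The study's two experiments map onto this sketch as different sampling spaces: sampling the precipitation factors FPi alongside the other parameters (16 dimensions) versus fixing them a priori (11 dimensions).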
Uncertainty quantification and stochastic modeling with Matlab
Souza de Cursi, Eduardo
2015-01-01
Uncertainty Quantification (UQ) is a relatively new research area which describes the methods and approaches used to supply quantitative descriptions of the effects of uncertainty, variability and errors in simulation problems and models. It is rapidly becoming a field of increasing importance, with many real-world applications within statistics, mathematics, probability and engineering, but also within the natural sciences. Literature on the topic has up until now been largely based on polynomial chaos, which raises difficulties when considering different types of approximation and does no
Optical Model and Cross Section Uncertainties
Energy Technology Data Exchange (ETDEWEB)
Herman,M.W.; Pigni, M.T.; Dietrich, F.S.; Oblozinsky, P.
2009-10-05
Distinct minima and maxima in the neutron total cross section uncertainties were observed in model calculations using a spherical optical potential. We found this oscillating structure to be a general feature of quantum mechanical wave scattering. Specifically, we analyzed neutron interaction with 56Fe from 1 keV up to 65 MeV and investigated the physical origin of the minima. We discuss their potential importance for practical applications as well as the implications for the uncertainties in total and absorption cross sections.
Uncertainty quantification for Markov chain models.
Meidani, Hadi; Ghanem, Roger
2012-12-01
Transition probabilities serve to parameterize Markov chains and control their evolution and associated decisions and controls. Uncertainties in these parameters can be associated with inherent fluctuations in the medium through which a chain evolves, or with insufficient data such that the inferential value of the chain is jeopardized. The behavior of Markov chains associated with such uncertainties is described using a probabilistic model for the transition matrices. The principle of maximum entropy is used to characterize the probability measure of the transition rates. The formalism is demonstrated on a Markov chain describing the spread of disease, and a number of quantities of interest, pertaining to different aspects of decision-making, are investigated.
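One simple way to realise a probabilistic model over transition matrices is to give each row a Dirichlet distribution centred on a nominal matrix and push samples through the chain. This is a common convenience construction standing in for the paper's maximum-entropy measure, and the three-state "disease" chain below is invented for illustration.

```python
import numpy as np

def sample_chains(mean_P, concentration, n_samples, rng):
    """Sample random Markov transition matrices whose rows are Dirichlet
    distributed around a nominal matrix; larger concentration means the
    samples cluster more tightly around mean_P."""
    draws = np.empty((n_samples,) + mean_P.shape)
    for s in range(n_samples):
        for i, row in enumerate(mean_P):
            draws[s, i] = rng.dirichlet(concentration * row)
    return draws

# Nominal chain for a toy disease model: Susceptible, Infected, Recovered
# (strictly positive entries so every Dirichlet parameter is valid)
P = np.array([[0.90, 0.08, 0.02],
              [0.02, 0.68, 0.30],
              [0.01, 0.01, 0.98]])
rng = np.random.default_rng(5)
samples = sample_chains(P, concentration=200.0, n_samples=1000, rng=rng)

# Quantity of interest: probability of being Recovered after 10 steps,
# starting from Susceptible, under each sampled transition matrix
x0 = np.array([1.0, 0.0, 0.0])
qoi = np.array([(x0 @ np.linalg.matrix_power(M, 10))[2] for M in samples])
```

The spread of `qoi` across samples is the decision-relevant output: it shows how uncertainty in the transition rates propagates into the epidemic quantity of interest.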
Internal Branding Implementation: Developing a Conceptual Model
Katja Terglav; Robert Kase; Maja Konecnik Ruzzier
2012-01-01
Internal branding is the process that enables a balanced view of the brand at all company levels. Its significance lies in aligning the values and behaviors of employees with brand values and brand promises. In the article, we focus mainly on its implementation, which requires coordination of different functions in the company, for instance internal marketing and human resource management. Based on the findings of qualitative research, we present a conceptual model of internal branding implementation. Re...
Uncertainty Quantification for Optical Model Parameters
Lovell, A E; Sarich, J; Wild, S M
2016-01-01
Although uncertainty quantification has been making its way into nuclear theory, these methods have yet to be explored in the context of reaction theory. For example, it is well known that different parameterizations of the optical potential can result in different cross sections, but these differences have not been systematically studied and quantified. The purpose of this work is to investigate the uncertainties in nuclear reactions that result from fitting a given model to elastic-scattering data, as well as to study how these uncertainties propagate to the inelastic and transfer channels. We use statistical methods to determine a best fit and create corresponding 95% confidence bands. A simple model of the process is fit to elastic-scattering data and used to predict either inelastic or transfer cross sections. In this initial work, we assume that our model is correct, and the only uncertainties come from the variation of the fit parameters. We study a number of reactions involving neutron and deuteron p...
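The fit-then-band workflow described above can be sketched with a generic weighted least-squares fit. The two-parameter "cross-section" shape and the synthetic data are placeholders (not an optical-model calculation), and the band uses the usual delta-method propagation of the fitted parameter covariance.

```python
import numpy as np
from scipy.optimize import curve_fit

def model(E, a, b):
    """Hypothetical two-parameter cross-section shape (illustrative only)."""
    return a / (E + b)

# Synthetic "data" with 5% relative errors
rng = np.random.default_rng(6)
E = np.linspace(1.0, 20.0, 40)
true = model(E, 50.0, 3.0)
sigma_y = 0.05 * true
y = true + rng.normal(0.0, sigma_y)

popt, pcov = curve_fit(model, E, y, p0=[40.0, 1.0],
                       sigma=sigma_y, absolute_sigma=True)

# Propagate the parameter covariance to a ~95% pointwise band (delta method)
J = np.column_stack([1.0 / (E + popt[1]),             # d model / d a
                     -popt[0] / (E + popt[1]) ** 2])  # d model / d b
band = 1.96 * np.sqrt(np.einsum('ij,jk,ik->i', J, pcov, J))
fit = model(E, *popt)
```

Propagating the same parameter covariance through a second observable (here it would be an inelastic or transfer prediction) is what turns fit uncertainty into channel-to-channel uncertainty, the central exercise of the paper.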
Realising the Uncertainty Enabled Model Web
Cornford, D.; Bastin, L.; Pebesma, E. J.; Williams, M.; Stasch, C.; Jones, R.; Gerharz, L.
2012-12-01
The FP7 funded UncertWeb project aims to create the "uncertainty enabled model web". The central concept here is that geospatial models and data resources are exposed via standard web service interfaces, such as the Open Geospatial Consortium (OGC) suite of encodings and interface standards, allowing the creation of complex workflows combining both data and models. The focus of UncertWeb is on the issue of managing uncertainty in such workflows, and providing the standards, architecture, tools and software support necessary to realise the "uncertainty enabled model web". In this paper we summarise the developments in the first two years of UncertWeb, illustrating several key points with examples taken from the use case requirements that motivate the project. Firstly we address the issue of encoding specifications. We explain the usage of UncertML 2.0, a flexible encoding for representing uncertainty based on a probabilistic approach. This is designed to be used within existing standards such as Observations and Measurements (O&M) and data quality elements of ISO19115 / 19139 (geographic information metadata and encoding specifications) as well as more broadly outside the OGC domain. We show profiles of O&M that have been developed within UncertWeb and how UncertML 2.0 is used within these. We also show encodings based on NetCDF and discuss possible future directions for encodings in JSON. We then discuss the issues of workflow construction, considering discovery of resources (both data and models). We discuss why a brokering approach to service composition is necessary in a world where the web service interfaces remain relatively heterogeneous, including many non-OGC approaches, in particular the more mainstream SOAP and WSDL approaches. We discuss the trade-offs between delegating uncertainty management functions to the service interfaces themselves and integrating the functions in the workflow management system. We describe two utility services to address
Uncertainty in Regional Air Quality Modeling
Digar, Antara
Effective pollution mitigation is the key to successful air quality management. Although states invest millions of dollars to predict future air quality, the regulatory modeling and analysis process to inform pollution control strategy remains uncertain. Traditionally deterministic ‘bright-line’ tests are applied to evaluate the sufficiency of a control strategy to attain an air quality standard. A critical part of regulatory attainment demonstration is the prediction of future pollutant levels using photochemical air quality models. However, because models are uncertain, they yield a false sense of precision that pollutant response to emission controls is perfectly known and may eventually mislead the selection of control policies. These uncertainties in turn affect the health impact assessment of air pollution control strategies. This thesis explores beyond the conventional practice of deterministic attainment demonstration and presents novel approaches to yield probabilistic representations of pollutant response to emission controls by accounting for uncertainties in regional air quality planning. Computationally-efficient methods are developed and validated to characterize uncertainty in the prediction of secondary pollutant (ozone and particulate matter) sensitivities to precursor emissions in the presence of uncertainties in model assumptions and input parameters. We also introduce impact factors that enable identification of model inputs and scenarios that strongly influence pollutant concentrations and sensitivity to precursor emissions. We demonstrate how these probabilistic approaches could be applied to determine the likelihood that any control measure will yield regulatory attainment, or could be extended to evaluate probabilistic health benefits of emission controls, considering uncertainties in both air quality models and epidemiological concentration-response relationships. Finally, ground-level observations for pollutant (ozone) and precursor
Uncertainty and Sensitivity in Surface Dynamics Modeling
Kettner, Albert J.; Syvitski, James P. M.
2016-05-01
Papers for this special issue on 'Uncertainty and Sensitivity in Surface Dynamics Modeling' stem from papers submitted after the 2014 annual meeting of the Community Surface Dynamics Modeling System (CSDMS). CSDMS facilitates a diverse community of experts (now in 68 countries) that collectively investigates the Earth's surface, the dynamic interface between lithosphere, hydrosphere, cryosphere, and atmosphere, by promoting, developing, supporting and disseminating integrated open source software modules. By organizing more than 1500 researchers, CSDMS has the privilege of identifying community strengths and weaknesses in the practice of software development. We recognize, for example, that progress has been slow on identifying and quantifying uncertainty and sensitivity in numerical modeling of the Earth's surface dynamics. This special issue is meant to raise awareness of these important subjects and highlight state-of-the-art progress.
A conceptual model of political market orientation
DEFF Research Database (Denmark)
Ormrod, Robert P.
2005-01-01
This article proposes eight constructs of a conceptual model of political market orientation, taking inspiration from the business and political marketing literature. Four of the constructs are 'behavioural' in that they aim to describe the process of how information flows through the organisation. The remaining four constructs are attitudinal, designed to capture the awareness of members to the activities and importance of stakeholder groups in society, both internal and external to the organisation. The model not only allows the level of a party's political market orientation to be assessed, but also...
Systemic change increases model projection uncertainty
Verstegen, Judith; Karssenberg, Derek; van der Hilst, Floor; Faaij, André
2014-05-01
Most spatio-temporal models are based on the assumption that the relationship between system state change and its explanatory processes is stationary. This means that model structure and parameterization are usually kept constant over time, ignoring potential systemic changes in this relationship resulting from e.g., climatic or societal changes, thereby overlooking a source of uncertainty. We define systemic change as a change in the system indicated by a system state change that cannot be simulated using a constant model structure. We have developed a method to detect systemic change, using a Bayesian data assimilation technique, the particle filter. The particle filter was used to update the prior knowledge about the model structure. In contrast to the traditional particle filter approach (e.g., Verstegen et al., 2014), we apply the filter separately for each point in time for which observations are available, obtaining the optimal model structure for each of the time periods in between. This allows us to create a time series of the evolution of the model structure. The Runs test (Wald and Wolfowitz, 1940), a stationarity test, is used to check whether variation in this time series can be attributed to randomness or not. If not, this indicates systemic change. The uncertainty that the systemic change adds to the existing model projection uncertainty can be determined by comparing model outcomes of a model with a stationary model structure and a model with a model structure changing according to the variation found in the time series. To test the systemic change detection methodology, we apply it to a land use change cellular automaton (CA) (Verstegen et al., 2012) and use observations of real land use from all years from 2004 to 2012 and associated uncertainty as observational data in the particle filter. A systemic change was detected for the period 2006 to 2008. In this period the influence on the location of sugar cane expansion of the driver sugar cane in
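The stationarity check named above, the Wald-Wolfowitz Runs test, is compact enough to sketch directly. The series below are synthetic: random noise for the stationary case, and a drifting series standing in for a systemic change in the model-structure time series.

```python
import numpy as np
from scipy.stats import norm

def runs_test(series):
    """Wald-Wolfowitz runs test for randomness: binarise the series at its
    median and compare the observed number of runs with its expectation
    under randomness (normal approximation)."""
    x = np.asarray(series, dtype=float)
    signs = x > np.median(x)
    n1, n2 = signs.sum(), (~signs).sum()
    runs = 1 + np.count_nonzero(signs[1:] != signs[:-1])
    mu = 2.0 * n1 * n2 / (n1 + n2) + 1.0
    var = (2.0 * n1 * n2 * (2.0 * n1 * n2 - n1 - n2) /
           ((n1 + n2) ** 2 * (n1 + n2 - 1.0)))
    z = (runs - mu) / np.sqrt(var)
    return z, 2.0 * norm.sf(abs(z))       # z statistic, two-sided p-value

rng = np.random.default_rng(7)
z_rand, p_rand = runs_test(rng.normal(size=200))       # stationary: large p
z_trend, p_trend = runs_test(np.linspace(0.0, 1.0, 200))  # drift: tiny p
```

Rejecting randomness in the time series of filtered model structures is exactly the signal the authors read as systemic change, as in their detected 2006-2008 episode.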
Physical and Model Uncertainty for Fatigue Design of Composite Material
DEFF Research Database (Denmark)
Toft, Henrik Stensgaard; Sørensen, John Dalsgaard
The main aim of the present report is to establish stochastic models for the uncertainties related to fatigue design of composite materials. The uncertainties considered are the physical uncertainty related to the static and fatigue strength and the model uncertainty related to Miner's rule...
Conceptual Model of Dynamic Geographic Environment
Directory of Open Access Journals (Sweden)
Martínez-Rosales Miguel Alejandro
2014-04-01
Full Text Available In geographic environments there are many different types of geographic entities, such as automobiles, trees, persons, buildings, storms, and hurricanes. These entities can be classified into two groups: geographic objects and geographic phenomena. By its nature, a geographic environment is dynamic, so static modeling of it is not sufficient. To account for these dynamics, a new type of geographic entity called an event is introduced. The primary target is the modeling of a geographic environment as an event sequence, because in this case the semantic relations are much richer than in static modeling. In this work, the conceptualization of this model is proposed. It is based on the idea of processing each entity separately instead of processing the environment as a whole. The so-called history of each entity and its spatial relations to other entities are then defined to describe the whole environment. The main goal is to model, at a conceptual level, systems that make use of spatial and temporal information, so that the model can later serve as the semantic engine for such systems.
Conceptual Frameworks in the Doctoral Research Process: A Pedagogical Model
Berman, Jeanette; Smyth, Robyn
2015-01-01
This paper contributes to consideration of the role of conceptual frameworks in the doctoral research process. Through reflection on the two authors' own conceptual frameworks for their doctoral studies, a pedagogical model has been developed. The model posits the development of a conceptual framework as a core element of the doctoral…
A Structural Equation Model of Conceptual Change in Physics
Taasoobshirazi, Gita; Sinatra, Gale M.
2011-01-01
A model of conceptual change in physics was tested on introductory-level, college physics students. Structural equation modeling was used to test hypothesized relationships among variables linked to conceptual change in physics including an approach goal orientation, need for cognition, motivation, and course grade. Conceptual change in physics…
Quantifying uncertainty in stable isotope mixing models
Davis, Paul; Syme, James; Heikoop, Jeffrey; Fessenden-Rahn, Julianna; Perkins, George; Newman, Brent; Chrystal, Abbey E.; Hagerty, Shannon B.
2015-05-01
Mixing models are powerful tools for identifying biogeochemical sources and determining mixing fractions in a sample. However, identification of actual source contributors is often not simple, and source compositions typically vary or even overlap, significantly increasing model uncertainty in calculated mixing fractions. This study compares three probabilistic methods: Stable Isotope Analysis in R (SIAR), a pure Monte Carlo technique (PMC), and the Stable Isotope Reference Source (SIRS) mixing model, a new technique that estimates mixing in systems with more than three sources and/or uncertain source compositions. In this paper, we use nitrate stable isotope examples (δ15N and δ18O), but all methods tested are applicable to other tracers. In Phase I of a three-phase blind test, we compared methods for a set of six-source nitrate problems. PMC was unable to find solutions for two of the target water samples. The Bayesian method, SIAR, experienced anchoring problems, and SIRS calculated mixing fractions that most closely approximated the known mixing fractions. For that reason, SIRS was the only approach used in the next phase of testing. In Phase II, the problem was broadened so that any subset of the six sources could be a possible solution to the mixing problem. Results showed a high rate of Type I errors, where solutions included sources that were not contributing to the sample. In Phase III, eliminating some sources based on assumed site knowledge and assumed nitrate concentrations substantially reduced mixing-fraction uncertainties and lowered the Type I error rate. These results demonstrate that valuable insights into stable isotope mixing problems result from probabilistic mixing model approaches like SIRS. The results also emphasize the importance of identifying a minimal set of potential sources and quantifying uncertainties in source isotopic composition as well as demonstrating the value of additional information in reducing the uncertainty in calculated
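The pure Monte Carlo (PMC) idea mentioned above can be illustrated with a minimal sketch: draw random fraction vectors on the simplex, perturb the source end-members to represent source uncertainty, and keep the draws whose predicted mixture matches the sample. The source compositions, uncertainties, and tolerance below are invented for illustration and are not the study's nitrate data:

```python
import random

random.seed(1)  # reproducible sketch

def pmc_mixing(sample, sources, sd, tol, n_draws=100_000):
    """Pure Monte Carlo mixing for two tracers (e.g. d15N, d18O).

    sample  : (tracer1, tracer2) composition of the mixture
    sources : mean (tracer1, tracer2) composition of each source
    sd      : 1-sigma uncertainty applied to every source end-member
    Returns the mean accepted mixing fraction per source, or None."""
    kept = []
    for _ in range(n_draws):
        # uniform random fractions on the simplex
        cuts = sorted(random.random() for _ in range(len(sources) - 1))
        f = [b - a for a, b in zip([0.0] + cuts, cuts + [1.0])]
        # perturb source compositions to represent source uncertainty
        srcs = [(random.gauss(t1, sd), random.gauss(t2, sd)) for t1, t2 in sources]
        mix = (sum(fi * s[0] for fi, s in zip(f, srcs)),
               sum(fi * s[1] for fi, s in zip(f, srcs)))
        if abs(mix[0] - sample[0]) < tol and abs(mix[1] - sample[1]) < tol:
            kept.append(f)
    return [sum(fs) / len(kept) for fs in zip(*kept)] if kept else None

fractions = pmc_mixing(sample=(8.0, 2.0),
                       sources=[(0.0, -5.0), (10.0, 0.0), (15.0, 15.0)],
                       sd=0.5, tol=0.5)
```

As the abstract notes, this brute-force approach can fail to find solutions when the acceptance region is small, which motivates the Bayesian and SIRS alternatives.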
Conceptual model of communications in public health
Directory of Open Access Journals (Sweden)
Марія Андріївна Знаменська
2015-07-01
Full Text Available Relevance. The role of communications in the effective reform of public health has recently attracted attention in the scientific literature, but no existing work fully describes the system of communications in public health; this gap motivated the present research. Methods. The work uses structural and logical analysis and conceptual modeling; a systematic approach forms the basis of the research. Results. A conceptual model of the system of communications in public health was elaborated. Its central idea is the consistent, priority-driven provision of comprehensive, objective information about the public health system to the population of the country as a whole and to separate target groups of communicative impact. In constructing the model, the following groups of problems were distinguished: the structural organization of the communication system; the resourcing of the system; the methods and means of communication; and the monitoring and assessment of communication effectiveness. Conclusions. Using this model makes it possible, at optimal cost, to eliminate organizational and administrative defects and to raise public awareness of the organization of public health and of the maintenance and improvement of personal health.
Representing Turbulence Model Uncertainty with Stochastic PDEs
Oliver, Todd; Moser, Robert
2012-11-01
Validation of and uncertainty quantification for extrapolative predictions of RANS turbulence models are necessary to ensure that the models are not used outside of their domain of applicability and to properly inform decisions based on such predictions. In previous work, we have developed and calibrated statistical models for these purposes, but it has been found that incorporating all the knowledge of a domain expert--e.g., realizability, spatial smoothness, and known scalings--in such models is difficult. Here, we explore the use of stochastic PDEs for this purpose. The goal of this formulation is to pose the uncertainty model in a setting where it is easier for physical modelers to express what is known. To explore the approach, multiple stochastic models describing the error in the Reynolds stress are coupled with multiple deterministic turbulence models to make uncertain predictions of channel flow. These predictions are compared with DNS data to assess their credibility. This work is supported by the Department of Energy [National Nuclear Security Administration] under Award Number [DE-FC52-08NA28615].
A conceptual model of referee efficacy.
Guillén, Félix; Feltz, Deborah L
2011-01-01
This paper presents a conceptual model of referee efficacy, defines the concept, proposes sources of referee specific efficacy information, and suggests consequences of having high or low referee efficacy. Referee efficacy is defined as the extent to which referees believe they have the capacity to perform successfully in their job. Referee efficacy beliefs are hypothesized to be influenced by mastery experiences, referee knowledge/education, support from significant others, physical/mental preparedness, environmental comfort, and perceived anxiety. In turn, referee efficacy beliefs are hypothesized to influence referee performance, referee stress, athlete rule violations, athlete satisfaction, and co-referee satisfaction.
A conceptual model of referee efficacy
Directory of Open Access Journals (Sweden)
Félix eGuillén
2011-02-01
Full Text Available This paper presents a conceptual model of referee efficacy, defines the concept, proposes sources of referee specific efficacy information, and suggests consequences of having high or low referee efficacy. Referee efficacy is defined as the extent to which referees believe they have the capacity to perform successfully in their job. Referee efficacy beliefs are hypothesized to be influenced by mastery experiences, referee knowledge/education, support from significant others, physical/mental preparedness, environmental comfort, and perceived anxiety. In turn, referee efficacy beliefs are hypothesized to influence referee performance, referee stress, athlete rule violations, athlete satisfaction, and co-referee satisfaction.
Values and uncertainties in the predictions of global climate models.
Winsberg, Eric
2012-06-01
Over the last several years, there has been an explosion of interest and attention devoted to the problem of Uncertainty Quantification (UQ) in climate science, that is, to giving quantitative estimates of the degree of uncertainty associated with the predictions of global and regional climate models. The technical challenges associated with this project are formidable, and so the statistical community has understandably devoted itself primarily to overcoming them. But even as these technical challenges are being met, a number of persistent conceptual difficulties remain. So why is UQ so important in climate science? UQ, I would like to argue, is first and foremost a tool for communicating knowledge from experts to policy makers in a way that is meant to be free from the influence of social and ethical values. But the standard ways of using probabilities to separate ethical and social values from scientific practice cannot be applied in a great deal of climate modeling, because the roles of values in creating the models cannot be discerned after the fact: the models are too complex and the result of too much distributed epistemic labor. I argue, therefore, that typical approaches for handling ethical/social values in science do not work well here.
Propulsion System Models for Rotorcraft Conceptual Design
Johnson, Wayne
2014-01-01
The conceptual design code NDARC (NASA Design and Analysis of Rotorcraft) was initially implemented to model conventional rotorcraft propulsion systems, consisting of turboshaft engines burning jet fuel, connected to one or more rotors through a mechanical transmission. The NDARC propulsion system representation has been extended to cover additional propulsion concepts, including electric motors and generators, rotor reaction drive, turbojet and turbofan engines, fuel cells and solar cells, batteries, and fuel (energy) used without weight change. The paper describes these propulsion system components, the architecture of their implementation in NDARC, and the form of the models for performance and weight. Requirements are defined for improved performance and weight models of the new propulsion system components. With these new propulsion models, NDARC can be used to develop environmentally-friendly rotorcraft designs.
Modeling and inverse problems in the presence of uncertainty
Banks, H T; Thompson, W Clayton
2014-01-01
Modeling and Inverse Problems in the Presence of Uncertainty collects recent research-including the authors' own substantial projects-on uncertainty propagation and quantification. It covers two sources of uncertainty: where uncertainty is present primarily due to measurement errors and where uncertainty is present due to the modeling formulation itself. After a useful review of relevant probability and statistical concepts, the book summarizes mathematical and statistical aspects of inverse problem methodology, including ordinary, weighted, and generalized least-squares formulations. It then
Fault Detection under Fuzzy Model Uncertainty
Institute of Scientific and Technical Information of China (English)
Marek Kowal; Józef Korbicz
2007-01-01
The paper tackles the problem of robust fault detection using Takagi-Sugeno fuzzy models. A model-based strategy is employed to generate residuals in order to make a decision about the state of the process. Unfortunately, such a method is corrupted by model uncertainty, since in real applications there exists a model-reality mismatch. To ensure reliable fault detection, the adaptive threshold technique is used to deal with this problem. The paper also focuses on the fuzzy model design procedure: the bounded-error approach is applied to generate the rules for the model from available measurements. The proposed approach is applied to fault detection in a DC laboratory engine.
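The adaptive-threshold idea, a residual test whose threshold widens with the local model uncertainty so that model-reality mismatch alone does not raise alarms, can be sketched in a few lines. All numbers here are hypothetical, not taken from the DC-engine application:

```python
def detect_faults(measured, predicted, uncertainty, base=0.1, k=1.0):
    """Adaptive-threshold residual evaluation (sketch).

    A fault is flagged only when the residual exceeds a threshold
    that grows with the local model uncertainty bound."""
    flags = []
    for y, yhat, u in zip(measured, predicted, uncertainty):
        residual = abs(y - yhat)
        threshold = base + k * u   # threshold adapts to model uncertainty
        flags.append(residual > threshold)
    return flags

# A large residual explained by large model uncertainty is not a fault;
# the same residual with a tight uncertainty bound is.
flags = detect_faults(measured=[1.0, 1.8, 3.0],
                      predicted=[1.0, 1.0, 1.0],
                      uncertainty=[0.05, 1.0, 0.2])
# flags == [False, False, True]
```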
Facets of Uncertainty in Digital Elevation and Slope Modeling
Institute of Scientific and Technical Information of China (English)
ZHANG Jingxiong; LI Deren
2005-01-01
This paper investigates the differences that result from applying different approaches to uncertainty modeling, and reports an experiment examining error estimation and propagation in elevation and slope, with the latter derived from the former. It is confirmed that significant differences exist between uncertainty descriptors, and that propagation of uncertainty to end products is strongly affected by the specification of source uncertainty.
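Propagation of elevation uncertainty into a derived slope product can be illustrated with a small Monte Carlo sketch: perturb the elevations with the assumed source error and observe the spread in the finite-difference slope. The cell size, elevations, and Gaussian error model below are assumptions for illustration, not the paper's experimental setup:

```python
import math
import random

random.seed(0)  # reproducible sketch

def slope_deg(z_left, z_right, z_down, z_up, cell=30.0):
    """Slope (degrees) from central differences on a DEM neighbourhood."""
    dzdx = (z_right - z_left) / (2 * cell)
    dzdy = (z_up - z_down) / (2 * cell)
    return math.degrees(math.atan(math.hypot(dzdx, dzdy)))

def propagate(z, sigma, n=5000):
    """Monte Carlo propagation of elevation error (std `sigma`, metres)
    into the derived slope; returns (mean slope, slope std)."""
    samples = [slope_deg(*(v + random.gauss(0, sigma) for v in z))
               for _ in range(n)]
    mean = sum(samples) / n
    var = sum((s - mean) ** 2 for s in samples) / (n - 1)
    return mean, math.sqrt(var)

# A 5 m elevation std on a gentle slope produces substantial slope uncertainty:
mean_slope, slope_sd = propagate(z=(100.0, 103.0, 100.0, 102.0), sigma=5.0)
```

The sketch also shows the bias the paper's theme implies: noise inflates the apparent slope on gentle terrain, so the propagated mean exceeds the noise-free value.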
Product Modelling and Functional Reasoning in Conceptual Design
Institute of Scientific and Technical Information of China (English)
林志航; 宋慧军; 陈康宁
2004-01-01
In this paper, a product model in conceptual design, the Domain Structure Template, is proposed, which combines the functional domain and the physical domain with the behaviour domain. Seven types of primary mapping units connecting functions, behaviours and carriers during the conceptual design process are identified according to the characteristics of the conceptual design of mechanical products. Based on these seven primary mappings, a hierarchical functional reasoning framework characterizing the process of conceptual design is presented. A case study of the conceptual design of industrial chain-stitch sewing machines is described to demonstrate the product modelling and the scheme generation based on the presented model.
Representation of the Conceptual Change Model in Science Teacher Education.
Thorley, N. Richard; Stofflett, Rene T.
1996-01-01
Analyzes key concepts of the conceptual change model: intelligibility, plausibility, and fruitfulness, together with conceptions of learning as conceptual change and the nature of conceptual change teaching. Organizes representations of these around a framework developed for representing scientific conceptions in terms of verbal and symbolic…
A Bayesian Chance-Constrained Method for Hydraulic Barrier Design Under Model Structure Uncertainty
Chitsazan, N.; Pham, H. V.; Tsai, F. T. C.
2014-12-01
The groundwater community has widely recognized model structure uncertainty as the major source of model uncertainty in groundwater modeling. Previous studies in aquifer remediation design, however, rarely discuss the impact of model structure uncertainty. This study combines chance-constrained (CC) programming with Bayesian model averaging (BMA) in a BMA-CC framework to assess the effect of model structure uncertainty on remediation design. To investigate this impact, we compare the BMA-CC method with traditional CC programming, which considers only model parameter uncertainty. The BMA-CC method is employed to design a hydraulic barrier to protect public supply wells of the Government St. pump station from saltwater intrusion in the "1,500-foot" sand and the "1,700-foot" sand of the Baton Rouge area, southeastern Louisiana. To address model structure uncertainty, we develop three conceptual groundwater models based on three different hydrostratigraphic structures. The results show that using traditional CC programming overestimates design reliability. The results also show that at least five additional connector wells are needed to achieve a design reliability above 90%. The total amount of injected water from connector wells is higher than the total pumpage of the protected public supply wells. While the injection rate can be reduced by lowering the reliability level, the study finds that the hydraulic barrier design to protect the Government St. pump station is not economically attractive.
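The contrast the abstract draws, that ignoring structure uncertainty overstates reliability, can be sketched by comparing a single-model reliability with a BMA-weighted one. The model weights, performance margins, and per-well effect below are invented for illustration and are not the Baton Rouge values:

```python
import random

random.seed(0)  # reproducible sketch

def reliability(models, n_inject, n=20_000):
    """BMA reliability sketch: each conceptual model carries a BMA weight
    and a hypothetical Gaussian distribution of barrier performance margin
    given `n_inject` connector wells. Overall reliability is the
    weight-averaged probability that the design constraint (margin > 0)
    is met, estimated by Monte Carlo."""
    total = 0.0
    for w, mean_margin, sd in models:
        ok = sum(random.gauss(mean_margin + 0.4 * n_inject, sd) > 0
                 for _ in range(n))
        total += w * ok / n
    return total

# Three conceptual models (weights sum to 1); margins are illustrative.
models = [(0.5, -1.0, 1.0), (0.3, -2.0, 1.5), (0.2, 0.5, 0.8)]
single_best = reliability([(1.0, 0.5, 0.8)], n_inject=0)  # one model only
bma = reliability(models, n_inject=0)                     # all models
bma5 = reliability(models, n_inject=5)                    # add 5 wells
# Relying on the single most optimistic model overstates reliability
# (single_best > bma); adding connector wells raises it (bma5 > bma).
```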
A conceptual model for manufacturing performance improvement
Directory of Open Access Journals (Sweden)
M.A. Karim
2009-07-01
Full Text Available Purpose: The important performance objectives manufacturers seek can be achieved through adopting the appropriate manufacturing practices. This paper presents a conceptual model proposing relationships between advanced quality practices, perceived manufacturing difficulties and manufacturing performance. Design/methodology/approach: A survey-based approach was adopted to test the hypotheses proposed in this study. The selection of research instruments for inclusion in this survey was based on a literature review, pilot case studies and the relevant industrial experience of the author. A sample of 1000 manufacturers across Australia was randomly selected. Quality managers were requested to complete the questionnaire, as dealing with quality and reliability issues is a quality manager's major responsibility. Findings: Evidence indicates that product quality and reliability is the main competitive factor for manufacturers. Design and manufacturing capability and on-time delivery came second. Price is considered the least important factor for Australian manufacturers. Results show that, collectively, the advanced quality practices proposed in this study neutralize the difficulties manufacturers face and contribute to most of the manufacturers' performance objectives. Companies that put more emphasis on the advanced quality practices have fewer problems in manufacturing and better performance on most manufacturing performance indices. The results validate the proposed conceptual model and lend credence to the hypothesized relationships between quality practices, manufacturing difficulties and manufacturing performance. Practical implications: The model shown in this paper provides a simple yet highly effective approach to achieving significant improvements in product quality and manufacturing performance. This study introduces a relationship-based 'proactive' quality management approach and provides great
Intrinsic Uncertainties in Modeling Complex Systems.
Energy Technology Data Exchange (ETDEWEB)
Cooper, Curtis S; Bramson, Aaron L.; Ames, Arlo L.
2014-09-01
Models are built to understand and predict the behaviors of both natural and artificial systems. Because it is always necessary to abstract away aspects of any non-trivial system being modeled, we know models can potentially leave out important, even critical elements. This reality of the modeling enterprise forces us to consider the prospective impacts of those effects completely left out of a model, either intentionally or unconsidered. Insensitivity to new structure is an indication of diminishing returns. In this work, we represent a hypothetical unknown effect on a validated model as a finite perturbation whose amplitude is constrained within a control region. We find robustly that without further constraints, no meaningful bounds can be placed on the amplitude of a perturbation outside of the control region. Thus, forecasting into unsampled regions is a very risky proposition. We also present inherent difficulties with proper time discretization of models and with representing inherently discrete quantities. We point out potentially worrisome uncertainties, arising from mathematical formulation alone, which modelers can inadvertently introduce into models of complex systems. Acknowledgements: This work has been funded under early-career LDRD project #170979, entitled "Quantifying Confidence in Complex Systems Models Having Structural Uncertainties", which ran from 04/2013 to 09/2014. We wish to express our gratitude to the many researchers at Sandia who contributed ideas to this work, as well as feedback on the manuscript. In particular, we would like to mention George Barr, Alexander Outkin, Walt Beyeler, Eric Vugrin, and Laura Swiler for providing invaluable advice and guidance through the course of the project. We would also like to thank Steven Kleban, Amanda Gonzales, Trevor Manzanares, and Sarah Burwell for their assistance in managing project tasks and resources.
DEFF Research Database (Denmark)
Odgaard, Peter Fogh; Stoustrup, Jakob; Mataji, B.
2007-01-01
Predicting the performance of large-scale plants can be difficult due to model uncertainties, meaning that one can be almost certain that the prediction will diverge from the plant performance over time. In this paper, output multiplicative uncertainty models are used as dynamical models of the prediction error. These proposed dynamical uncertainty models result in an upper and a lower bound on the predicted performance of the plant, and are used to estimate the uncertainty of the predicted performance of a coal-fired power plant. The proposed scheme, which uses dynamical models, is applied to two different sets of measured plant data. The computed uncertainty bounds cover the measured plant output, while the nominal prediction falls outside these uncertainty bounds for some samples in these examples.
Model uncertainty and Bayesian model averaging in vector autoregressive processes
R.W. Strachan (Rodney); H.K. van Dijk (Herman)
2006-01-01
Economic forecasts and policy decisions are often informed by empirical analysis based on econometric models. However, inference based upon a single model, when several viable models exist, limits its usefulness. Taking account of model uncertainty, a Bayesian model averaging procedure i
Lark, Adam
Scientific Community Laboratories, developed by the University of Maryland, have shown initial promise as laboratories meant to emulate the practice of doing physics. These laboratories have been re-created by incorporating their design elements into the University of Toledo course structure and resources, and have been titled the Scientific Learning Community (SLC) Laboratories. A comparative study between these SLC laboratories and the University of Toledo physics department's traditional laboratories was carried out during the fall 2012 semester on first-semester calculus-based physics students. Three instruments were administered as pre-tests and post-tests to capture the change in students' concept knowledge, attitudes, and understanding of uncertainty. The Force Concept Inventory (FCI) was used to evaluate students' conceptual changes through the semester, and average normalized gains were compared between the traditional and SLC laboratories. The Colorado Learning Attitudes about Science Survey for Experimental Physics (E-CLASS) was conducted to elucidate students' change in attitudes over the course of each laboratory. Finally, interviews regarding data analysis and uncertainty were transcribed and coded to track changes in the way students understand uncertainty and data analysis in experimental physics after their participation in each laboratory type. Students in the SLC laboratories showed a notable increase in conceptual knowledge and attitudes compared with those in traditional laboratories. SLC students' understanding of uncertainty showed the most improvement, diverging completely from that of students in the traditional laboratories, which declined throughout the semester.
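The average normalized gain used to compare FCI results is Hake's <g>: the gain a class achieves as a fraction of the gain available to it. A minimal sketch with invented class scores (the FCI has 30 items):

```python
def normalized_gain(pre, post, max_score=30):
    """Hake's average normalized gain <g> on a concept inventory:
    (post average - pre average) / (max score - pre average)."""
    pre_avg = sum(pre) / len(pre)
    post_avg = sum(post) / len(post)
    return (post_avg - pre_avg) / (max_score - pre_avg)

# Hypothetical class: pre-test average 12/30, post-test average 18/30.
g = normalized_gain(pre=[10, 12, 14], post=[16, 18, 20])
# g = (18 - 12) / (30 - 12) = 6/18 ≈ 0.33
```

Because <g> normalizes by room-to-grow, it lets sections with different pre-test averages be compared on one scale.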
A conceptual model of an Arctic sea
St-Laurent, P.; Straneo, F.; Barber, D. G.
2012-06-01
We propose a conceptual model for an Arctic sea that is driven by river runoff, atmospheric fluxes, sea ice melt/growth, and winds. The model domain is divided into two areas, the interior and boundary regions, that are coupled through Ekman and eddy fluxes of buoyancy. The model is applied to Hudson and James Bays (HJB, a large inland basin in northeastern Canada) for the period 1979-2007. Several yearlong records from instruments moored within HJB show that the model results are consistent with the real system. The model notably reproduces the seasonal migration of the halocline, the baroclinic boundary current, spatial variability of freshwater content, and the fall maximum in freshwater export. The simulations clarify the important differences in the freshwater balance of the western and eastern sides of HJB. The significant role played by the boundary current in the freshwater budget of the system, and its sensitivity to the wind-forcing, are also highlighted by the simulations and new data analyses. We conclude that the model proposed is useful for the interpretation of observed data from Arctic seas and model outputs from more complex coupled/climate models.
Conceptual Data Modelling of Modern Human Migration
Directory of Open Access Journals (Sweden)
Kosta Sotiroski
2012-12-01
Full Text Available The processes of human migration have been present for ages, since the very beginnings of human history on the planet Earth. Nowadays they are amplified to a large scale by modern means of communication, transportation, information and knowledge exchange, as well as by the complex processes of globalization. Knowing the social, demographic, ethnic and educational structure of migrants, as well as the geographical trajectory and temporal dynamics of their movement across territories, countries and continents, is of crucial importance for both national governments and international policies. There is a pronounced need to identify, acquire, organize, store, retrieve and analyze data related to human migration processes. Relational databases provide a natural solution, and the E-R diagram is a common graphical tool for conceptual data modelling and relational database design. In this paper we develop and propose a logical data model of modern human migration.
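The step from an E-R conceptual model to a relational schema can be sketched with SQLite from Python: entities become tables, and the migration relationship carries the temporal attributes. All entity, attribute, and table names here are illustrative, not taken from the paper:

```python
import sqlite3

# Translate a migration E-R diagram into DDL: MIGRANT and LOCATION
# entities, and a MIGRATION relationship with temporal attributes.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE location (
    location_id INTEGER PRIMARY KEY,
    country     TEXT NOT NULL,
    region      TEXT
);
CREATE TABLE migrant (
    migrant_id  INTEGER PRIMARY KEY,
    birth_year  INTEGER,
    education   TEXT,              -- educational structure
    ethnicity   TEXT               -- ethnic structure
);
CREATE TABLE migration (
    migrant_id  INTEGER REFERENCES migrant,
    origin_id   INTEGER REFERENCES location,
    dest_id     INTEGER REFERENCES location,
    departed    TEXT,              -- temporal dynamics of the move
    arrived     TEXT,
    PRIMARY KEY (migrant_id, departed)
);
""")
conn.execute("INSERT INTO location VALUES (1, 'Macedonia', 'Pelagonia')")
conn.execute("INSERT INTO location VALUES (2, 'Germany', 'Bavaria')")
conn.execute("INSERT INTO migrant VALUES (1, 1990, 'university', NULL)")
conn.execute("INSERT INTO migration VALUES (1, 1, 2, '2012-03', '2012-04')")
n = conn.execute("SELECT COUNT(*) FROM migration").fetchone()[0]
```

The composite primary key on (migrant_id, departed) lets one migrant record several moves over time, which is the temporal dimension the paper emphasizes.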
Quantum Gravity models - brief conceptual summary
Lukierski, Jerzy
2014-01-01
After a short historical overview, we describe the difficulties with applying standard QFT methods in quantum gravity (QG). The incompatibility of QG with the use of classical continuous space-time requires a conceptually new approach. We briefly present three proposals: loop quantum gravity (LQG), the field-theoretic framework on noncommutative space-time, and QG models formulated on discretized (triangulated) space-time. We evaluate these models as realizing expected important properties of QG: background independence, consistent quantum diffeomorphisms, noncommutative or discrete structure of space-time at very short distances, and finite/renormalizable QG corrections. We only briefly outline the important issue of embedding QG into larger geometric and dynamical frameworks (e.g. supergravity, (super)strings, p-branes, M-theory), with the aim of achieving full unification of all fundamental interactions.
A conceptual model of political market orientation
DEFF Research Database (Denmark)
Ormrod, Robert P.
2005-01-01
This article proposes eight constructs of a conceptual model of political market orientation, taking inspiration from the business and political marketing literature. Four of the constructs are 'behavioural' in that they aim to describe the process of how information flows through the organisation. The remaining four constructs are attitudinal, designed to capture the awareness of members to the activities and importance of stakeholder groups in society, both internal and external to the organisation. The model not only allows the level of a party's political market orientation to be assessed, but also aids the party in making a context-specific decision with regard to the reallocation - or not - of party resources in order to attain the party's long-term objectives.
Conceptual models used in clinical practice.
Wardle, M G; Mandle, C L
1989-02-01
Nurses' difficulties in articulation of conceptual models may be due to several factors--not the least of which are the existence of discrete theories for each area of nursing specialization, dissociation in curricula of theory from practice, a holistic conceptual framework that may be inadequately defined at the process level, and an impulse toward idealism on the part of the nurses themselves. These observations challenge both the theorists and the practitioners of modern nursing to describe more clearly the definition of quality for the science and art of nursing. Nurses are beginning to grasp the idea of holism. It is not the summation of parts to make a whole. Holism is the identification of life patterns, which are reflective of the whole. Nurses in practice and research are starting to create methods of inquiry that portray the wholeness of the autonomous person in continual, dynamic change and exchange with a changing universe. These initial explorations are leading to the evolution of the concepts of person, environment, and health into a distinctive theoretical base for nursing practice. In practice, research, and education, nurses must be committed to excellent, current descriptions of these human life patterns.
The Conceptual Integration Modeling Framework: Abstracting from the Multidimensional Model
Rizzolo, Flavio; Pottinger, Rachel; Wong, Kwok
2010-01-01
Data warehouses are overwhelmingly built through a bottom-up process, which starts with the identification of sources, continues with the extraction and transformation of data from these sources, and then loads the data into a set of data marts according to desired multidimensional relational schemas. End user business intelligence tools are added on top of the materialized multidimensional schemas to drive decision making in an organization. Unfortunately, this bottom-up approach is costly both in terms of the skilled users needed and the sheer size of the warehouses. This paper proposes a top-down framework in which data warehousing is driven by a conceptual model. The framework offers both design time and run time environments. At design time, a business user first uses the conceptual modeling language as a multidimensional object model to specify what business information is needed; then she maps the conceptual model to a pre-existing logical multidimensional representation. At run time, a system will tra...
Uncertainty Assessment in Urban Storm Water Drainage Modelling
DEFF Research Database (Denmark)
Thorndahl, Søren
The object of this paper is to make an overall description of the author's PhD study, concerning uncertainties in numerical urban storm water drainage models. Initially an uncertainty localization and assessment of model inputs and parameters as well as uncertainties caused by different model...
Quantifying Groundwater Recharge Uncertainty: A Multiple-Model Framework and Case Study
Kikuchi, C.; Ferré, T. P. A.
2014-12-01
In practice, it is difficult to estimate groundwater recharge accurately. Despite this challenge, most recharge investigations produce a single, best estimate of recharge. However, there is growing recognition that quantification of natural recharge uncertainty is critical for groundwater management. We present a multiple-model framework for estimating recharge uncertainty. In addition, we show how direct water flux measurements can be used to reduce the uncertainty of estimates of total basin recharge for an arid, closed hydrologic basin in the Atacama Desert, Chile. We first formulated multiple hydrogeologic conceptual models of the basin based on existing data, and implemented each conceptual model for the purpose of conducting numerical simulations. For each conceptual model, groundwater recharge was inversely estimated; then, Null-Space Monte Carlo techniques were used to quantify the uncertainty on the initial estimate of total basin recharge. Second, natural recharge components - including both deep percolation and streambed infiltration - were estimated from field data. Specifically, vertical temperature profiles were measured in monitoring wells and streambeds, and water fluxes were estimated from thermograph analysis. Third, calculated water fluxes were incorporated as prior information to the model calibration and Null-Space Monte Carlo procedures, yielding revised estimates of both total basin recharge and associated uncertainty. The fourth and final component of this study uses value of information analyses to identify potentially informative locations for additional water flux measurements. The uncertainty quantification framework presented here is broadly transferable; furthermore, this research provides an applied example of the extent to which water flux measurements may serve to reduce groundwater recharge uncertainty at the basin scale.
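The multiple-model framework's central effect, flux measurements narrowing each conceptual model's recharge estimate and hence the basin total, can be sketched with a small Monte Carlo mixture that combines structural (between-model) and parameter (within-model) uncertainty. All distributions below are invented, not the Atacama values:

```python
import random

random.seed(42)  # reproducible sketch

def basin_recharge_uncertainty(models, n=10_000):
    """Multiple-model recharge sketch: each conceptual model contributes
    a recharge distribution (mean, std in mm/yr). Sampling a model at
    random per draw mixes structural uncertainty (which model) with
    parameter uncertainty (spread within a model)."""
    draws = []
    for _ in range(n):
        mean, sd = random.choice(models)       # structural uncertainty
        draws.append(random.gauss(mean, sd))   # parameter uncertainty
    m = sum(draws) / n
    s = (sum((d - m) ** 2 for d in draws) / (n - 1)) ** 0.5
    return m, s

# Hypothetical recharge estimates from three conceptual models (mm/yr):
models_prior = [(5.0, 3.0), (12.0, 4.0), (8.0, 3.5)]
# Flux measurements (e.g. streambed temperature profiles) narrow each model:
models_post = [(6.0, 1.0), (9.0, 1.5), (8.0, 1.2)]
m0, s0 = basin_recharge_uncertainty(models_prior)
m1, s1 = basin_recharge_uncertainty(models_post)
# Conditioning on flux data reduces the spread of total-recharge estimates
# (s1 < s0), both by tightening each model and by pulling the models together.
```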
Methodology for characterizing modeling and discretization uncertainties in computational simulation
Energy Technology Data Exchange (ETDEWEB)
ALVIN,KENNETH F.; OBERKAMPF,WILLIAM L.; RUTHERFORD,BRIAN M.; DIEGERT,KATHLEEN V.
2000-03-01
This research effort focuses on methodology for quantifying the effects of model uncertainty and discretization error on computational modeling and simulation. The work is directed towards developing methodologies which treat model form assumptions within an overall framework for uncertainty quantification, for the purpose of developing estimates of total prediction uncertainty. The present effort consists of work in three areas: framework development for sources of uncertainty and error in the modeling and simulation process which impact model structure; model uncertainty assessment and propagation through Bayesian inference methods; and discretization error estimation within the context of non-deterministic analysis.
DEFICIENT INFORMATION MODELING OF MECHANICAL PRODUCTS FOR CONCEPTUAL SHAPE DESIGN
Institute of Scientific and Technical Information of China (English)
[No author listed]
2002-01-01
Addressing the incompleteness of product information in conceptual design, a framework of deficient-information modeling for conceptual shape design is put forward, which includes qualitative shape modeling (a qualitative solid model), uncertain shape modeling (an uncertain relation model) and imprecise shape modeling (an imprecise region model). In the framework, the qualitative solid model is the core: it represents the conceptual shapes of mechanical products qualitatively, using symbols. The uncertain relation model, which takes domain relations as objects, and the imprecise region model, which takes domains as objects, are used to deal with the uncertain and imprecise issues respectively, which arise from qualitative shape modeling or are present in the product information itself.
A unifying conceptual model of entrepreneurial management
DEFF Research Database (Denmark)
Senderovitz, Martin
This article offers a systematic analysis and synthesis of the area of entrepreneurial management. Through a presentation of two main perspectives on entrepreneurial management and a newly developed unifying conceptual entrepreneurial management model, the paper discusses a number of theoretical … disagreements, managerial dilemmas and paradoxes. On the basis of the findings and conclusions of the study, the article contributes an overview of the entrepreneurial management field, and offers an answer to the overall research question: What constitutes the most essential areas and challenges … of entrepreneurial management? The paper builds on the seminal work by Stevenson (1983, 1990) and proposes a discussion and elaboration of the understanding and definition of entrepreneurial management in terms of the relationship between entrepreneurial opportunities and firm resources…
Energy Technology Data Exchange (ETDEWEB)
Dai, Heng [Pacific Northwest National Laboratory, Richland Washington USA; Ye, Ming [Department of Scientific Computing, Florida State University, Tallahassee Florida USA; Walker, Anthony P. [Environmental Sciences Division and Climate Change Science Institute, Oak Ridge National Laboratory, Oak Ridge Tennessee USA; Chen, Xingyuan [Pacific Northwest National Laboratory, Richland Washington USA
2017-04-01
Hydrological models are always composed of multiple components that represent processes key to intended model applications. When a process can be simulated by multiple conceptual-mathematical models (process models), model uncertainty in representing the process arises. While global sensitivity analysis methods have been widely used for identifying important processes in hydrologic modeling, the existing methods consider only parametric uncertainty and ignore the model uncertainty in process representation. To address this problem, this study develops a new method to probe multimodel process sensitivity by integrating model averaging methods into the framework of variance-based global sensitivity analysis, given that model averaging methods quantify both parametric and model uncertainty. A new process sensitivity index is derived as a metric of relative process importance; the index includes the variance in model outputs caused by uncertainty in both process models and model parameters. For demonstration, the new index is used to evaluate the processes of recharge and geology in a synthetic study of groundwater reactive transport modeling. The recharge process is simulated by two models that convert precipitation to recharge, and the geology process is simulated by two models with different parameterizations of hydraulic conductivity; each process model has its own random parameters. The new process sensitivity index is mathematically general, and can be applied to a wide range of problems in hydrology and beyond.
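The core of such an index can be sketched with a toy Monte Carlo computation. The two "recharge" and two "geology" process models below, and the output function combining them, are purely illustrative stand-ins (not the paper's models); the index is the variance of the conditional mean output, where the conditioning freezes both the process-model choice and its random parameter, divided by the total output variance:

```python
import random
import statistics

rng = random.Random(7)

# Two alternative process models each for recharge and geology (illustrative).
recharge_models = [lambda p: 0.30 * p, lambda p: 0.15 * p + 20.0]
geology_models = [lambda k: k, lambda k: 0.5 * k + 5.0]

def head(r, g):
    """Toy head-like model output combining the two process outputs."""
    return r / (g + 1.0)

def draw_recharge():
    # Model uncertainty (choice) plus parametric uncertainty (precipitation).
    return rng.choice(recharge_models)(rng.uniform(500.0, 1500.0))

def draw_geology():
    # Model uncertainty plus parametric uncertainty (conductivity proxy).
    return rng.choice(geology_models)(rng.uniform(1.0, 10.0))

# Process sensitivity index for recharge: Var(E[Y | recharge]) / Var(Y),
# where "recharge" varies over both model choice and model parameters.
N, M = 300, 200
cond_means = []
for _ in range(N):
    r = draw_recharge()  # freeze the recharge process for the inner loop
    cond_means.append(statistics.fmean(head(r, draw_geology()) for _ in range(M)))
total = [head(draw_recharge(), draw_geology()) for _ in range(3000)]
ps_recharge = statistics.pvariance(cond_means) / statistics.pvariance(total)
```

By construction the index falls between 0 and 1, and computing it for each process ranks their relative importance under both sources of uncertainty.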
Stormwater infiltration trenches: a conceptual modelling approach.
Freni, Gabriele; Mannina, Giorgio; Viviani, Gaspare
2009-01-01
In recent years, limitations of traditional urban drainage schemes have been pointed out, and new approaches are developing that introduce more natural methods for retaining and/or disposing of stormwater. These mitigation measures are generally called Best Management Practices or Sustainable Urban Drainage Systems, and they include practices such as infiltration and storage tanks intended to reduce the peak flow and retain part of the polluting components. The introduction of such practices in urban drainage systems entails an upgrade of existing modelling frameworks in order to evaluate their efficiency in mitigating the impact of urban drainage systems on receiving water bodies. While storage tank modelling approaches are quite well documented in the literature, gaps remain for infiltration facilities, mainly owing to the complexity of the physical processes involved. In this study, a simplified conceptual modelling approach for the simulation of infiltration trenches is presented. The model enables assessment of the performance of infiltration trenches. The main goal is to develop a model that can be employed to assess the mitigation efficiency of infiltration trenches in an integrated urban drainage context. Particular care was given to the simulation of infiltration structures, considering the performance reduction due to clogging phenomena. The proposed model has been compared with other simplified modelling approaches and with a physically based model adopted as a benchmark. The model performed better than the other approaches for both unclogged facilities and the effect of clogging. On the basis of a long-term simulation with six years of rain data, the performance and effectiveness of an infiltration trench measure are assessed. The study confirmed the important role played by the clogging phenomenon in such infiltration structures.
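A conceptual trench model of this kind can be sketched as a single storage drained by an infiltration capacity that decays as clogging progresses. Everything here is an illustrative assumption (the clogging law, the parameter values, the units), not the calibrated model from the paper:

```python
def simulate_trench(inflow, area=20.0, f0=1e-4, clog_rate=1e-6, dt=60.0):
    """Toy conceptual infiltration trench (illustrative parameters only).

    A single storage is fed by runoff inflow (m3/s) and drained by
    infiltration over `area` (m2) at capacity f (m/s), which decays with
    the cumulative infiltrated volume as a simple stand-in for clogging.
    """
    storage, cum, out = 0.0, 0.0, []
    for q_in in inflow:
        f = f0 / (1.0 + clog_rate * cum)      # clogging reduces capacity
        q_inf = min(storage / dt, f * area)   # cannot drain more than stored
        storage += (q_in - q_inf) * dt
        cum += q_inf * dt
        out.append(q_inf)
    return out
```

Because the outflow is capped by both storage and capacity, the scheme conserves mass and reproduces the qualitative efficiency loss over a long simulation as `cum` grows.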
Management of California Oak Woodlands: Uncertainties and Modeling
Jay E. Noel; Richard P. Thompson
1995-01-01
A mathematical policy model of oak woodlands is presented. The model illustrates the policy uncertainties that exist in the management of oak woodlands. These uncertainties include: (1) selection of a policy criterion function, (2) woodland dynamics, (3) initial and final state of the woodland stock. The paper provides a review of each of the uncertainty issues. The...
Semantic techniques for enabling knowledge reuse in conceptual modelling
Gracia, J.; Liem, J.; Lozano, E.; Corcho, O.; Trna, M.; Gómez-Pérez, A.; Bredeweg, B.
2010-01-01
Conceptual modelling tools allow users to construct formal representations of their conceptualisations. These models are typically developed in isolation, unrelated to other user models, thus losing the opportunity of incorporating knowledge from other existing models or ontologies that might enrich
Reservoir Stochastic Modeling Constrained by Quantitative Geological Conceptual Patterns
Institute of Scientific and Technical Information of China (English)
[No author listed]
2006-01-01
This paper discusses the principles of geologic constraints on reservoir stochastic modeling. By using the system science theory, two kinds of uncertainties, including random uncertainty and fuzzy uncertainty, are recognized. In order to improve the precision of stochastic modeling and reduce the uncertainty in realization, the fuzzy uncertainty should be stressed, and the "geological genesis-controlled modeling" is conducted under the guidance of a quantitative geological pattern. An example of the Pingqiao horizontal-well division of the Ansai Oilfield in the Ordos Basin is taken to expound the method of stochastic modeling.
Representing and managing uncertainty in qualitative ecological models
Nuttle, T.; Bredeweg, B.; Salles, P.; Neumann, M.
2009-01-01
Ecologists and decision makers need ways to understand systems, test ideas, and make predictions and explanations about systems. However, uncertainty about causes and effects of processes and parameter values is pervasive in models of ecological systems. Uncertainty associated with incomplete
Creating a Universe, a Conceptual Model
Directory of Open Access Journals (Sweden)
James R. Johnson
2016-10-01
Full Text Available Space is something. Space inherently contains laws of nature: universal rules (mathematics, space dimensions, types of forces, types of fields, and particle species), laws (relativity, quantum mechanics, thermodynamics, and electromagnetism) and symmetries (Lorentz, gauge, and symmetry breaking). We have significant knowledge about these laws of nature because all our scientific theories assume their presence. Their existence is critical for developing either a unique theory of our universe or more speculative multiverse theories. Scientists generally ignore the laws of nature because they “are what they are” and because visualizing different laws of nature challenges the imagination. This article defines a conceptual model separating space (laws of nature) from the universe’s energy source (initial conditions) and expansion (big bang). By considering the ramifications of changing the laws of nature, initial condition parameters, and two variables in the big bang theory, the model demonstrates that traditional fine tuning is not the whole story when creating a universe. Supporting the model, space and “nothing” are related to the laws of nature, mathematics and multiverse possibilities. Speculation on the beginning of time completes the model.
Gaze categorization under uncertainty: psychophysics and modeling.
Mareschal, Isabelle; Calder, Andrew J; Dadds, Mark R; Clifford, Colin W G
2013-04-22
The accurate perception of another person's gaze direction underlies most social interactions and provides important information about his or her future intentions. As a first step to measuring gaze perception, most experiments determine the range of gaze directions that observers judge as being direct: the cone of direct gaze. This measurement has revealed the flexibility of observers' perception of gaze and provides a useful benchmark against which to test clinical populations with abnormal gaze behavior. Here, we manipulated effective signal strength by adding noise to the eyes of synthetic face stimuli or removing face information. We sought to move beyond a descriptive account of gaze categorization by fitting a model to the data that relies on changing the uncertainty associated with an estimate of gaze direction as a function of the signal strength. This model accounts for all the data and provides useful insight into the visual processes underlying normal gaze perception.
National Identity: Conceptual models, discourses and political change
DEFF Research Database (Denmark)
Harder, Peter
2014-01-01
Cognitive Linguistics has demonstrated the applicability of a conceptual approach to the understanding of political issues, cf. Lakoff (2008) and many others. From a different perspective, critical discourse analysis has approached political concepts with a focus on issues involving potentially … divisive features such as race, class, gender and ethnic identity. Although discourses are not identical to conceptual models, conceptual models are typically manifested in discourse, and discourses are typically reflections of conceptualizations, a theme explored e.g. in Hart and Lukes (2007). As argued … on the interplay between conceptual, geopolitical and social factors in shaping the ongoing change in the role and nature of ‘Britishness’. A key question for this article is: What are the relations between conceptual models and macro-social, causal factors in shaping the intersubjective status of Britishness?
Uncertainty of GIA models across the Greenland
Ruggieri, Gabriella
2013-04-01
In recent years various remote sensing techniques have been employed to estimate the current mass balance of the Greenland ice sheet (GIS). In this regard, GRACE and laser and radar altimetry observations, employed to constrain the mass balance, treat glacial isostatic adjustment (GIA) as a source of noise. Several GIA models have been elaborated for Greenland, but they differ from each other in mantle viscosity profile and in the time history of ice melting. In this work we use the well-known ICE-5G (VM2) ice model by Peltier (2004) and two other alternative scenarios of ice melting, ANU05 by Lambeck et al. (1998) and the new regional ice model HUY2 by Simpson et al. (2009), in order to assess the amplitude of the uncertainty related to the GIA predictions. In particular we focus on rates of vertical displacement, sea surface variations and sea-level change at regional scale. The GIA predictions are estimated using an improved version of the SELEN code that solves the sea-level equation for a spherical, self-gravitating, incompressible and viscoelastic Earth structure. GIA uncertainty shows a highly variable geographic distribution across Greenland. Considering the spatial pattern of the GIA predictions related to the three ice models, the western sector of the Greenland Ice Sheet (GrIS) between Thule and Upernavik, and the area around Paamiut, show good agreement, while the northeast portion of Greenland is characterized by a large discrepancy among the GIA predictions inferred from the ice models tested in this work. These differences are ultimately the consequence of the different sets of global relative sea-level data and modern geodetic observations used by the authors to constrain the model parameters. Finally, the GPS network GNET, recently installed around the periphery of the GrIS, is used as a tool to discuss the discrepancies among the GIA models. Comparing the geodetic analyses recently available, it appears that among the GPS sites the
A conceptual, distributed snow redistribution model
Frey, S.; Holzmann, H.
2015-11-01
When conceptual hydrological models using a temperature-index approach for snowmelt are applied to high alpine areas, accumulation of snow over several years can often be observed. Among the reasons why these "snow towers" do not exist in nature are vertical and lateral transport processes. While snow transport models have been developed using grid cell sizes of tens to hundreds of square metres and have been applied in several catchments, no model exists for coarser cell sizes of 1 km2, which is a common resolution for meso- and large-scale hydrological modelling (hundreds to thousands of square kilometres). In this paper we present an approach that uses only gravity, snow density as a proxy for the age of the snow cover, and land-use information to redistribute snow in alpine basins. The results are based on hydrological modelling of the Austrian Inn basin in Tyrol, Austria, more specifically the Ötztaler Ache catchment, but the findings hold for other tributaries of the river Inn. This transport model is implemented in the distributed rainfall-runoff model COSERO (Continuous Semi-distributed Runoff). The results of both model concepts, with and without consideration of lateral snow redistribution, are compared against observed discharge and snow-covered areas derived from MODIS (Moderate Resolution Imaging Spectroradiometer) satellite images. By means of the snow redistribution concept, snow accumulation over several years can be prevented, and the snow depletion curve compared with MODIS data could be improved too. Over a 7-year period the standard model would lead to snow accumulation of approximately 2900 mm SWE (snow water equivalent) in highly elevated regions, whereas the updated version of the model does not show accumulation and also predicts discharge more accurately, leading to a Kling-Gupta efficiency of 0.93 instead of 0.9. A further improvement can be shown in the comparison of MODIS snow cover data and the calculated depletion curve, where
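The redistribution idea (gravity plus a density proxy for snow age) can be sketched on a 1-D chain of cells. The density threshold, the moved fraction, and the 1-D neighbourhood are illustrative assumptions for the sketch, not the COSERO implementation:

```python
def redistribute_snow(swe, elevation, density, max_density=400.0, frac=0.1):
    """Gravity-driven lateral snow redistribution on a 1-D chain of cells.

    Where the snowpack density proxy (kg/m3, standing in for snow age)
    exceeds a threshold, a fraction of that cell's SWE (mm) slides to the
    lower of its two neighbours. Threshold and fraction are illustrative.
    """
    swe = list(swe)
    for i, rho in enumerate(density):
        if rho <= max_density or swe[i] == 0.0:
            continue
        nbrs = [j for j in (i - 1, i + 1) if 0 <= j < len(swe)]
        target = min(nbrs, key=lambda j: elevation[j])
        if elevation[target] < elevation[i]:
            moved = frac * swe[i]
            swe[i] -= moved
            swe[target] += moved
    return swe
```

Applied once per model time step, the rule drains "snow towers" from high cells with old, dense snow while conserving total SWE over the chain.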
CONCEPTUAL MODELING BASED ON LOGICAL EXPRESSION AND EVOLVEMENT
Institute of Scientific and Technical Information of China (English)
YI Guodong; ZHANG Shuyou; TAN Jianrong; JI Yangjian
2007-01-01
Aiming at the problem of modeling abstract and multi-type information in product conceptual design, a method of conceptual modeling based on logical expression and evolvement is presented. Based on logical expressions of the product conceptual design information, a function/logic/structure mapping model is set up. First, the function semantics is transformed into logical expressions through function/logic mapping. Second, methods of logical evolvement are used to describe the function analysis, function/structure mapping and structure combination. Finally, the logical structure scheme is transformed into a geometrical sketch through logic/structure mapping. The conceptual design information and the modeling process are described uniformly with logical methods in the model, and an effective method for computer-aided conceptual design based on the model is implemented.
Modelling of data uncertainties on hybrid computers
Energy Technology Data Exchange (ETDEWEB)
Schneider, Anke (ed.)
2016-06-15
The codes d³f and r³t are well established for modelling density-driven flow and nuclide transport in the far field of repositories for hazardous material in deep geological formations. They are applicable in porous media as well as in fractured rock or mudstone, and can model salt and heat transport as well as a free groundwater surface. Development of the basic framework of d³f and r³t began more than 20 years ago. Since that time, significant advancements have taken place in the requirements for safety assessment as well as in computer hardware. The period of safety assessment for a repository of high-level radioactive waste was extended to 1 million years, and the complexity of the models is steadily growing. Concurrently, the demands on accuracy increase. Additionally, model and parameter uncertainties become more and more important for an increased understanding of prediction reliability. All this leads to a growing demand for computational power that requires considerable software speed-up. An effective way to achieve this is the use of modern, hybrid computer architectures, which basically requires the set-up of new data structures and a corresponding code revision, but offers a potential speed-up of several orders of magnitude. The original codes d³f and r³t were applications of the software platform UG /BAS 94/, whose development began in the early nineteen-nineties. However, UG has recently been advanced to the C++ based, substantially revised version UG4 /VOG 13/. To benefit also in the future from state-of-the-art numerical algorithms and to use hybrid computer architectures, the codes d³f and r³t were transferred to this new code platform. Making use of the fact that coupling between different sets of equations is natively supported in UG4, d³f and r³t were combined into one conjoint code d³f++. A direct estimation of uncertainties for complex groundwater flow models with the
A market model: uncertainty and reachable sets
Directory of Open Access Journals (Sweden)
Raczynski Stanislaw
2015-01-01
Full Text Available Uncertain parameters are always present in models that include the human factor. In marketing, uncertain consumer behavior makes it difficult to predict future events and elaborate good marketing strategies. Sometimes uncertainty is modeled using stochastic variables; our approach is quite different. The dynamic market with uncertain parameters is treated using differential inclusions, which permits determination of the corresponding reachable sets. This is not a statistical analysis; we are looking for solutions to the differential inclusions. The purpose of the research is to find a way to obtain and visualise the reachable sets, in order to know the limits for the important marketing variables. The modeling method consists in defining the differential inclusion and finding its solution, using the differential inclusion solver developed by the author. As a result we obtain images of the reachable sets where the main control parameter is the share of investment, being a part of the revenue. As an additional result we can also define the optimal investment strategy. The conclusion is that the differential inclusion solver can be a useful tool in market model analysis.
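The reachable-set idea can be sketched for a one-dimensional toy market dynamic. The dynamic dx/dt = u(t)·x − 0.2·x, the bounds on the investment share u, and all coefficients are illustrative assumptions, not the author's model or solver:

```python
import random

def reachable_endpoints(x0=1.0, t_end=1.0, dt=0.01, u_bounds=(0.1, 0.4),
                        n_traj=500, seed=3):
    """Approximate the reachable interval at t_end for the toy dynamic
    dx/dt = u(t)*x - 0.2*x, where the investment share u(t) may take any
    value in `u_bounds` (a differential inclusion, not a fixed control).
    """
    rng = random.Random(seed)
    steps = int(round(t_end / dt))

    def integrate(u_of_step):
        x = x0
        for _ in range(steps):
            x += (u_of_step() * x - 0.2 * x) * dt  # explicit Euler step
        return x

    # Sample admissible controls that vary freely within the bounds...
    ends = [integrate(lambda: rng.uniform(*u_bounds)) for _ in range(n_traj)]
    # ...and, since this linear dynamic is monotone in u, the constant
    # extreme controls attain the boundary of the reachable set.
    ends.append(integrate(lambda: u_bounds[0]))
    ends.append(integrate(lambda: u_bounds[1]))
    return min(ends), max(ends)
```

The returned interval brackets every admissible trajectory endpoint, which is exactly the "limits for the important marketing variables" the abstract refers to.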
Showing Automatically Generated Students' Conceptual Models to Students and Teachers
Perez-Marin, Diana; Pascual-Nieto, Ismael
2010-01-01
A student conceptual model can be defined as a set of interconnected concepts associated with an estimation value that indicates how well these concepts are used by the students. It can model just one student or a group of students, and can be represented as a concept map, conceptual diagram or one of several other knowledge representation…
Conceptual models of the wind-driven and thermohaline circulation
Drijfhout, S.S.; Marshall, D.P.; Dijkstra, H.A.
2013-01-01
Conceptual models are a vital tool for understanding the processes that maintain the global ocean circulation, both in nature and in complex numerical ocean models. In this chapter we provide a broad overview of our conceptual understanding of the wind-driven circulation and the thermohaline circulation
A framework for modeling uncertainty in regional climate change
In this study, we present a new modeling framework and a large ensemble of climate projections to investigate the uncertainty in regional climate change over the United States associated with four dimensions of uncertainty. The sources of uncertainty considered in this framework ...
Uncertainty propagation in urban hydrology water quality modelling
Torres Matallana, Arturo; Leopold, U.; Heuvelink, G.B.M.
2016-01-01
Uncertainty is often ignored in urban hydrology modelling: engineering practice typically neglects both uncertainties and uncertainty propagation. This can have large impacts, such as the wrong dimensioning of urban drainage systems and the inaccurate estimation of pollution in the environment caused by c
Solomatine, Dimitri
2016-04-01
When speaking about model uncertainty, many authors implicitly assume data uncertainty (mainly in parameters or inputs), which is probabilistically described by distributions. Often, however, it is useful to look into the residual uncertainty as well. It is hence reasonable to classify the main approaches to uncertainty analysis with respect to the two main types of model uncertainty that can be distinguished: A. The residual uncertainty of models. In this case the model parameters and/or model inputs are considered to be fixed (deterministic), i.e. the model is considered to be optimal (calibrated) and deterministic. Model error is considered as the manifestation of uncertainty. If there is enough past data about the model errors (i.e. its uncertainty), it is possible to build a statistical or machine learning model of uncertainty trained on this data. The following methods can be mentioned: (a) the quantile regression (QR) method by Koenker and Bassett, in which linear regression is used to build predictive models for distribution quantiles [1]; (b) a more recent approach that takes into account the input variables influencing such uncertainty and uses more advanced (non-linear) machine learning methods (neural networks, model trees, etc.), the UNEEC method [2,3,7]; (c) the even more recent DUBRAUE method (Dynamic Uncertainty Model By Regression on Absolute Error), an autoregressive model of model residuals (it corrects the model residual first and then carries out the uncertainty prediction by an autoregressive statistical model) [5]. B. The data uncertainty (parametric and/or input): in this case we study the propagation of uncertainty (represented typically probabilistically) from parameters or inputs to the model outputs. In the case of simple functions representing models, analytical approaches can be used, or approximation methods (e.g., the first-order second-moment method). However, for real complex non-linear models implemented in software there is no other choice except using
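The quantile regression idea mentioned under (a) can be sketched as minimising the pinball (check) loss. Koenker and Bassett's original formulation solves a linear program; the batch subgradient descent below is a simplified stand-in, and the synthetic data are hypothetical:

```python
import random

def fit_quantile(xs, ys, tau, lr=0.05, epochs=3000):
    """Linear quantile regression in the spirit of Koenker & Bassett:
    minimise the pinball loss sum_i rho_tau(y_i - a - b*x_i) by batch
    subgradient descent (a sketch; the original uses linear programming).
    """
    a = b = 0.0
    n = len(xs)
    for _ in range(epochs):
        ga = gb = 0.0
        for x, y in zip(xs, ys):
            r = y - (a + b * x)
            g = -tau if r > 0 else (1.0 - tau)  # subgradient wrt prediction
            ga += g
            gb += g * x
        a -= lr * ga / n
        b -= lr * gb / n
    return a, b

# Synthetic "model residual" data: y = 2 + 3x plus uniform noise.
rng = random.Random(1)
xs = [i / 200.0 for i in range(200)]
ys = [2.0 + 3.0 * x + rng.uniform(-1.0, 1.0) for x in xs]
a90, b90 = fit_quantile(xs, ys, 0.9)
a10, b10 = fit_quantile(xs, ys, 0.1)
```

The fitted 0.1 and 0.9 lines bound an approximate 80% prediction interval, which is how QR turns past model errors into an uncertainty estimate.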
The views of scientific experts on how the public conceptualize uncertainty
Frewer, L.J.; Hunt, S.; Brennan, M.; Kuznesof, S.; Ness, M.; Ritson, C.
2003-01-01
Scientific experts (drawn from scientific institutions, universities, industry, and government) were interviewed about how they thought the general public might handle information about uncertainty associated with risk analysis. It was found that, for many people within the scientific community, the
A History of U.S. Military Conceptual Solutions to the Uncertainty of Expeditions
2016-06-10
This tailorability reflected a unique demand for the Armored Force given the uncertainty inherent in dealing with new technology on the battlefield... uncertainty inherent in predicting the future. The process the Marine Corps employed and the risks they assumed in making these sweeping changes offer... United States Marine Corps. Princeton, NJ: D. Van Nostrand Company, 1966. Donovan Jr, James A. The United States Marine Corps. New York, NY
Institute of Scientific and Technical Information of China (English)
LI Dian-qing; ZHANG Sheng-kun
2004-01-01
The classical probability theory cannot effectively quantify the parameter uncertainty in probability of detection. Furthermore, the conventional data analytic method and expert judgment method fail to handle the problem of model uncertainty updating with the information from nondestructive inspection. To overcome these disadvantages, a Bayesian approach was proposed to quantify the parameter uncertainty in probability of detection. Furthermore, the formulae of the multiplication factors to measure the statistical uncertainties in the probability of detection following the Weibull distribution were derived. A Bayesian updating method was applied to compute the posterior probabilities of model weights and the posterior probability density functions of distribution parameters of probability of detection. A total probability model method was proposed to analyze the problem of multi-layered model uncertainty updating. This method was then applied to the problem of multi-layered corrosion model uncertainty updating for ship structures. The results indicate that the proposed method is very effective in analyzing the problem of multi-layered model uncertainty updating.
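The single-layer core of such Bayesian model-weight updating can be sketched as follows. The two Weibull probability-of-detection (POD) models, their parameters, and the single inspection datum are hypothetical illustrations, not the ship-structure models from the paper:

```python
import math

def weibull_pod(a, scale, shape):
    """Probability of detecting a defect of size a under a Weibull POD
    model, POD(a) = 1 - exp(-(a/scale)^shape). Parameters illustrative."""
    return 1.0 - math.exp(-((a / scale) ** shape))

def update_weights(priors, likelihoods):
    """Posterior model weights via Bayes' rule / total probability:
    P(M_k | data) is proportional to P(data | M_k) * P(M_k)."""
    unnorm = [p * l for p, l in zip(priors, likelihoods)]
    z = sum(unnorm)
    return [u / z for u in unnorm]

# Two candidate POD models with equal prior weight (hypothetical params).
models = [(2.0, 1.5), (4.0, 1.5)]
priors = [0.5, 0.5]
# Inspection datum: a defect of size 3.0 was detected, so each model's
# likelihood is its POD evaluated at that size.
likelihoods = [weibull_pod(3.0, s, k) for s, k in models]
posteriors = update_weights(priors, likelihoods)
```

Repeating the update layer by layer, with each layer's posterior feeding the next as a prior, is the essence of the total-probability treatment of multi-layered model uncertainty.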
Uncertainty in a spatial evacuation model
Mohd Ibrahim, Azhar; Venkat, Ibrahim; Wilde, Philippe De
2017-08-01
Pedestrian movements in crowd motion can be perceived in terms of agents who exhibit basically patient or impatient behavior. We model crowd motion subject to exit congestion under uncertainty conditions in a continuous space, and compare the proposed model via simulations with the classical social force model. During a typical emergency evacuation scenario, agents might not be able to perceive with certainty the strategies of opponents (other agents) owing to the dynamic changes entailed by the neighborhood of opponents. In such uncertain scenarios, agents will try to update their strategy based on their own rules or their intrinsic behavior. We study risk-seeking, risk-averse and risk-neutral behaviors of such agents via certain game theory notions. We found that risk-averse agents tend to achieve faster evacuation times whenever the time delay in conflicts is longer. The results of our simulations also comply with previous work and conform to the fact that the evacuation time of agents becomes shorter once mutual cooperation among agents is achieved. Although the impatient strategy appears to be the rational strategy that might lead to faster evacuation times, our study scientifically shows that the more impatient the agents are, the slower the egress time.
Identification and communication of uncertainties of phenomenological models in PSA
Energy Technology Data Exchange (ETDEWEB)
Pulkkinen, U.; Simola, K. [VTT Automation (Finland)
2001-11-01
This report aims at presenting a view on uncertainty analysis of phenomenological models, with an emphasis on the identification and documentation of the various types of uncertainties and assumptions in the modelling of the phenomena. In an uncertainty analysis, it is essential to include and document all unclear issues, in order to obtain maximal coverage of unresolved issues. This holds independently of the nature or type of the issues. The classification of uncertainties is needed in the decomposition of the problem, and it helps in the identification of means for uncertainty reduction. Further, enhanced documentation serves to evaluate the applicability of the results to various risk-informed applications. (au)
Uncertainty in surface water flood risk modelling
Butler, J. B.; Martin, D. N.; Roberts, E.; Domuah, R.
2009-04-01
uniform flow formulae (Manning's Equation) to direct flow over the model domain, sourcing water from the channel or sea so as to provide a detailed representation of river and coastal flood risk. The initial development step was to include spatially-distributed rainfall as a new source term within the model domain. This required optimisation to improve computational efficiency, given the ubiquity of 'wet' cells early on in the simulation. Collaboration with UK water companies has provided detailed drainage information, and from this a simplified representation of the drainage system has been included in the model via the inclusion of sinks and sources of water from the drainage network. This approach has clear advantages relative to a fully coupled method both in terms of reduced input data requirements and computational overhead. Further, given the difficulties associated with obtaining drainage information over large areas, tests were conducted to evaluate uncertainties associated with excluding drainage information and the impact that this has upon flood model predictions. This information can be used, for example, to inform insurance underwriting strategies and loss estimation as well as for emergency response and planning purposes. The Flowroute surface-water flood risk platform enables efficient mapping of areas sensitive to flooding from high-intensity rainfall events due to topography and drainage infrastructure. As such, the technology has widespread potential for use as a risk mapping tool by the UK Environment Agency, European Member States, water authorities, local governments and the insurance industry. Keywords: Surface water flooding, Model Uncertainty, Insurance Underwriting, Flood inundation modelling, Risk mapping.
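The uniform-flow formula the abstract refers to is standard and easy to state; the channel geometry, roughness and slope in the example below are illustrative values, not inputs from the Flowroute platform:

```python
def manning_discharge(n, area, wetted_perimeter, slope):
    """Discharge from Manning's equation in SI form:
    Q = (1/n) * A * R^(2/3) * sqrt(S), with hydraulic radius R = A / P,
    where n is the roughness coefficient, A the flow area (m2),
    P the wetted perimeter (m) and S the bed slope (dimensionless).
    """
    r = area / wetted_perimeter
    return (1.0 / n) * area * r ** (2.0 / 3.0) * slope ** 0.5

# Illustrative rectangular channel: 10 m wide, 2 m flow depth,
# n = 0.03 (natural channel), bed slope 0.001.
q = manning_discharge(0.03, area=10 * 2, wetted_perimeter=10 + 2 * 2, slope=0.001)
```

Evaluated per cell, a relation of this form is what lets a 2-D flood model route water across the domain without solving the full shallow-water equations.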
DEFF Research Database (Denmark)
Vezzaro, Luca; Mikkelsen, Peter Steen
2012-01-01
The need for estimating micropollutant fluxes in stormwater systems increases the role of stormwater quality models as support for urban water managers, although the application of such models is affected by high uncertainty. This study presents a procedure for identifying the major sources of uncertainty in a conceptual lumped dynamic stormwater runoff quality model that is used in a study catchment to estimate (i) copper loads, (ii) compliance with dissolved Cu concentration limits on stormwater discharge and (iii) the fraction of Cu loads potentially intercepted by a planned treatment facility. The analysis is based on the combination of variance-decomposition Global Sensitivity Analysis (GSA) with the Generalized Likelihood Uncertainty Estimation (GLUE) technique. The GSA-GLUE approach highlights the correlation between the model factors defining the mass of pollutant in the system.
A novel approach to parameter uncertainty analysis of hydrological models using neural networks
Directory of Open Access Journals (Sweden)
D. P. Solomatine
2009-07-01
Full Text Available In this study, a methodology has been developed to emulate a time-consuming Monte Carlo (MC) simulation by using an Artificial Neural Network (ANN) for the assessment of model parametric uncertainty. First, an MC simulation of a given process model is run. Then an ANN is trained to approximate the functional relationships between the input variables of the process model and the synthetic uncertainty descriptors estimated from the MC realizations. The trained ANN model encapsulates the underlying characteristics of the parameter uncertainty and can be used to predict uncertainty descriptors for new data vectors. This approach was validated by comparing the uncertainty descriptors in the verification data set with those obtained by the MC simulation. The method is applied to estimate the parameter uncertainty of a lumped conceptual hydrological model, HBV, for the Brue catchment in the United Kingdom. The results are quite promising, as the prediction intervals estimated by the ANN are reasonably accurate. The proposed techniques could be useful in real-time applications when it is not practicable to run a large number of simulations for complex hydrological models and when the forecast lead time is very short.
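The emulation workflow described above can be sketched in a few lines. The toy one-parameter model, all numbers, and the nearest-neighbour regressor (standing in for the paper's ANN) are illustrative assumptions, not the HBV setup of the study:

```python
import random

random.seed(1)

def toy_model(x, k):
    # toy response with one uncertain parameter k (stand-in for HBV)
    return x * k

# Step 1: Monte Carlo simulation over the uncertain parameter
inputs = [i / 10 for i in range(1, 51)]            # e.g. rainfall depths
k_samples = [random.gauss(0.8, 0.1) for _ in range(2000)]

def mc_descriptor(x):
    """Uncertainty descriptor: half-width of the 90% prediction interval."""
    outs = sorted(toy_model(x, k) for k in k_samples)
    lo, hi = outs[int(0.05 * len(outs))], outs[int(0.95 * len(outs))]
    return (hi - lo) / 2

train = [(x, mc_descriptor(x)) for x in inputs]

# Step 2: emulate the MC descriptor with a cheap regressor (here 1-nearest
# neighbour, standing in for the trained ANN)
def emulator(x_new):
    return min(train, key=lambda pair: abs(pair[0] - x_new))[1]

# Step 3: predict uncertainty for a new input without re-running MC
print(emulator(2.33))
```

Once trained, the emulator answers uncertainty queries at negligible cost, which is the point of the approach for short-lead-time forecasting.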
Quantifying uncertainty in LCA-modelling of waste management systems.
Clavreul, Julie; Guyonnet, Dominique; Christensen, Thomas H
2012-12-01
Uncertainty analysis in LCA studies has seen major progress in recent years. In the context of waste management, various methods have been implemented, but a systematic method for uncertainty analysis of waste-LCA studies is lacking. The objective of this paper is (1) to present the sources of uncertainty specifically inherent to waste-LCA studies, (2) to select and apply several methods for uncertainty analysis and (3) to develop a general framework for quantitative uncertainty assessment of LCA of waste management systems. The suggested method is a sequence of four steps combining the selected methods: (Step 1) a sensitivity analysis evaluating the sensitivities of the results with respect to the input uncertainties, (Step 2) an uncertainty propagation providing appropriate tools for representing uncertainties and calculating the overall uncertainty of the model results, (Step 3) an uncertainty contribution analysis quantifying the contribution of each parameter uncertainty to the final uncertainty and (Step 4) as a new approach, a combined sensitivity analysis providing a visualisation of the shift in the ranking of different options due to variations of selected key parameters. This tiered approach optimises the resources available to LCA practitioners by only propagating the most influential uncertainties.
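Steps 2 and 3 of the framework (propagation and contribution analysis) can be illustrated with a deliberately simple stand-in for a waste-LCA model; the impact function and parameter distributions below are invented for illustration only:

```python
import random, statistics

random.seed(0)

# Toy waste-LCA impact model (illustrative): net impact from an emission
# factor, a treated mass and an energy-recovery credit
def impact(ef, mass, rec):
    return ef * mass - 0.5 * rec

params = {"ef": (2.0, 0.3), "mass": (100.0, 5.0), "rec": (40.0, 10.0)}

def variance(only=None):
    """Output variance when all parameters vary (only=None) or just one."""
    outs = []
    for _ in range(5000):
        v = {k: random.gauss(m, s) if only in (None, k) else m
             for k, (m, s) in params.items()}
        outs.append(impact(v["ef"], v["mass"], v["rec"]))
    return statistics.pvariance(outs)

# Step 2: overall uncertainty; Step 3: per-parameter contribution shares
total = variance()
contrib = {k: variance(only=k) / total for k in params}
print({k: round(c, 2) for k, c in contrib.items()})
```

For a near-additive model the contributions sum to roughly one, and the ranking tells the practitioner which input uncertainties are worth propagating further.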
Conceptual model for heart failure disease management.
Andrikopoulou, Efstathia; Abbate, Kariann; Whellan, David J
2014-03-01
The objective of this review is to propose a conceptual model for heart failure (HF) disease management (HFDM) and to define the components of an efficient HFDM plan in reference to this model. Articles that evaluated 1 or more of the following aspects of HFDM were reviewed: (1) outpatient clinic follow-up; (2) self-care interventions to enhance patient skills; and (3) remote evaluation of worsening HF either using structured telephone support (STS) or by monitoring device data (telemonitoring). The success of programs in reducing readmissions and mortality was mixed. Outpatient follow-up programs generally resulted in improved outcomes, including decreased readmissions. Based on 1 meta-analysis, specialty clinics improved outcomes and nonspecialty clinics did not. Results from self-care programs were inconsistent and might have been affected by patient cognitive status and educational level, and intervention intensity. Telemonitoring, despite initially promising meta-analyses demonstrating a decrease in the number and duration of HF-related readmissions and all-cause mortality rates at follow-up, has not been shown in randomized trials to consistently reduce readmissions or mortality. However, evidence from device monitoring trials in particular might have been influenced by technology and design issues that might be rectified in future trials. Results from the literature suggest that the ideal HFDM plan would include outpatient follow-up at an HF specialty clinic and continuous education to improve patient self-care. The end result of this plan would lead to better understanding on the part of the patient and improved patient ability to recognize and respond to signs of decompensation. Copyright © 2014 Canadian Cardiovascular Society. Published by Elsevier Inc. All rights reserved.
Aerosol model selection and uncertainty modelling by adaptive MCMC technique
Directory of Open Access Journals (Sweden)
M. Laine
2008-12-01
Full Text Available We present a new technique for the model selection problem in atmospheric remote sensing. The technique is based on Monte Carlo sampling and it allows model selection, calculation of model posterior probabilities and model averaging in a Bayesian way.
The algorithm developed here is called the Adaptive Automatic Reversible Jump Markov chain Monte Carlo method (AARJ). It uses the Markov chain Monte Carlo (MCMC) technique and its extension called Reversible Jump MCMC. Both of these techniques have been used extensively in statistical parameter estimation problems in a wide range of applications since the late 1990s. The novel feature of our algorithm is that it is fully automatic and easy to use.
We show how the AARJ algorithm can be implemented and used for model selection and averaging, and to directly incorporate the model uncertainty. We demonstrate the technique by applying it to the statistical inversion problem of gas profile retrieval for the GOMOS instrument on board the ENVISAT satellite. Four simple models are used simultaneously to describe the dependence of the aerosol cross-sections on wavelength. During the AARJ estimation all the models are used, and we obtain a probability distribution characterizing how probable each model is. By using model averaging, the uncertainty related to selecting the aerosol model can be taken into account in assessing the uncertainty of the estimates.
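The model-averaging idea can be illustrated far more simply than with reversible-jump MCMC, for instance with BIC-based posterior model weights over two hypothetical candidate models for y(x); the data and both models below are invented, not the GOMOS retrieval:

```python
import math

# Two candidate models for y(x): constant vs. linear; BIC-based posterior
# weights stand in here for the full reversible-jump machinery
xs = [0.1 * i for i in range(20)]
ys = [1.0 + 2.0 * x + 0.05 * math.sin(7 * x) for x in xs]
n = len(xs)

def fit_const():
    m = sum(ys) / n
    return sum((y - m) ** 2 for y in ys), (lambda x, m=m: m), 1

def fit_linear():
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    rss = sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))
    return rss, (lambda x, a=a, b=b: a + b * x), 2

fits = [fit_const(), fit_linear()]
bics = [n * math.log(rss / n) + k * math.log(n) for rss, _, k in fits]
raw = [math.exp(-(b - min(bics)) / 2) for b in bics]
w = [r / sum(raw) for r in raw]          # approximate model probabilities

def bma_predict(x):
    # model-averaged prediction: uncertainty about WHICH model is right
    # is folded into the estimate
    return sum(wi * p(x) for wi, (_, p, _) in zip(w, fits))

print([round(wi, 3) for wi in w])
```

With clearly trended data the linear model receives essentially all the posterior weight, and the averaged prediction collapses to the linear fit; with ambiguous data the weights would split and widen the predictive uncertainty.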
Utilizing Uncertainty Multidisciplinary Design Optimization for Conceptual Design of Space Systems
Yao, W.; Guo, J.; Chen, X.; Van Tooren, M.
2010-01-01
With progress of space technology and increase of space mission demand, requirements for robustness and reliability of space systems are ever-increasing. For the whole space mission life cycle, the most important decisions are made in the conceptual design phase, so it is very crucial to take uncert
A Conceptual Model for Water Sensitive City in Surabaya
Pamungkas, A.; Tucunan, K. P.; Navastara, A.; Idajati, H.; Pratomoatmojo, N. A.
2017-08-01
Frequent inundation, low quality of water supply, and high dependence on external water sources are some of the key problems in Surabaya's water balance. Many aspects of urban development have contributed to these problems. To uncover the complexity of the water balance in Surabaya, a conceptual model for a water sensitive city is constructed to find the optimum solution. System dynamics modelling is utilized to assist and enrich the conceptual model. A secondary analysis of a wide range of data guides the process of building the conceptual model. Focus group discussions (FGDs) involving experts from multiple disciplines were also used to finalize the conceptual model. Based on these methods, the model has four main sub-models: flooding, land use change, water demand and water supply. The model consists of 35 key variables illustrating the challenges of Surabaya's urban water.
Design and development of a conceptual blade model with a combined geometric surface
Juraev, T. H.; Bukhara Technological Institute of Engineering, Bukhara
2013-01-01
A conceptual model of the blade was developed using geometric modeling methods, industrial design principles and CAD technologies. Variants of the functional use of the proposed model for various working bodies are presented.
Michalik, Thomas; Multsch, Sebastian; Frede, Hans-Georg; Breuer, Lutz
2016-04-01
~100 mm for SWAP and indicate a greater impact of conceptual than parameter uncertainty and demonstrate the need for further research concerning water balance modeling for irrigation management.
Aerodynamic Modeling with Heterogeneous Data Assimilation and Uncertainty Quantification Project
National Aeronautics and Space Administration — Clear Science Corp. proposes to develop an aerodynamic modeling tool that assimilates data from different sources and facilitates uncertainty quantification. The...
Energy Technology Data Exchange (ETDEWEB)
Follin, Sven (SF GeoLogic AB, Taeby (Sweden)); Hartley, Lee; Jackson, Peter; Roberts, David (Serco TAP (United Kingdom)); Marsic, Niko (Kemakta Konsult AB, Stockholm (Sweden))
2008-05-15
Three versions of a site descriptive model (SDM) have been completed for the Forsmark area. Version 0 established the state of knowledge prior to the start of the site investigation programme. Version 1.1 was essentially a training exercise and was completed during 2004. Version 1.2 was a preliminary site description and concluded the initial site investigation work (ISI) in June 2005. Three modelling stages are planned for the complete site investigation work (CSI). These are labelled stage 2.1, 2.2 and 2.3, respectively. An important component of each of these stages is to address and continuously try to resolve discipline-specific uncertainties of importance for repository engineering and safety assessment. Stage 2.1 included an updated geological model for Forsmark and aimed to provide a feedback from the modelling working group to the site investigation team to enable completion of the site investigation work. Stage 2.2 described the conceptual understanding and the numerical modelling of the bedrock hydrogeology in the Forsmark area based on data freeze 2.2. The present report describes the modelling based on data freeze 2.3, which is the final data freeze in Forsmark. In comparison, data freeze 2.3 is considerably smaller than data freeze 2.2. Therefore, stage 2.3 deals primarily with model confirmation and uncertainty analysis, e.g. verification of important hypotheses made in stage 2.2 and the role of parameter uncertainty in the numerical modelling. On the whole, the work reported here constitutes an addendum to the work reported in stage 2.2. Two changes were made to the CONNECTFLOW code in stage 2.3. These serve to: 1) improve the representation of the hydraulic properties of the regolith, and 2) improve the conditioning of transmissivity of the deformation zones against single-hole hydraulic tests. The changes to the modelling of the regolith were made to improve the consistency with models made with the MIKE SHE code, which involved the introduction
Toolkit for Conceptual Modeling (TCM): User's Guide and Reference
Dehne, F.; Wieringa, R.J.
1997-01-01
The Toolkit for Conceptual Modeling (TCM) is a suite of graphical editors for a number of graphical notation systems that are used in software specification methods. The notations can be used to represent the conceptual structure of the software - hence the name of the suite. This manual describes
An ontologically well-founded profile for UML conceptual models
Guizzardi, Giancarlo; Wagner, Gerd; Guarino, Nicola; Sinderen, van Marten; Persson, Anne; Stirna, Janis
2004-01-01
UML class diagrams can be used as a language for expressing a conceptual model of a domain. In a series of papers [1,2,3] we have been using the General Ontological Language (GOL) and its underlying upper level ontology, proposed in [4,5], to evaluate the ontological correctness of a conceptual UML
Model of Conceptual Change for INQPRO: A Bayesian Network Approach
Ting, Choo-Yee; Sam, Yok-Cheng; Wong, Chee-Onn
2013-01-01
Constructing a computational model of conceptual change for a computer-based scientific inquiry learning environment is difficult due to two challenges: (i) externalizing the variables of conceptual change and its related variables is difficult; and (ii) defining the causal dependencies among the variables is not trivial. Such difficulty…
Using Conceptual Change Theories to Model Position Concepts in Astronomy
Yang, Chih-Chiang; Hung, Jeng-Fung
2012-01-01
The roles of conceptual change and model building in science education are very important and have a profound and wide effect on teaching science. This study examines the change in children's position concepts after instruction, based on different conceptual change theories. Three classes were chosen and divided into three groups, including a…
Kayastha, Nagendra; Solomatine, Dimitri; Lal Shrestha, Durga; van Griensven, Ann
2013-04-01
In recent years, much attention in the hydrologic literature has been given to model parameter uncertainty analysis. The robustness of uncertainty estimation depends on the efficiency of the sampling method used to generate the best-fit responses (outputs) and on its ease of use. This paper aims to investigate: (1) how sampling strategies affect the uncertainty estimations of hydrological models, and (2) how to use this information in machine learning predictors of model uncertainty. Sampling of parameters may employ various algorithms. We compared seven different algorithms, namely Monte Carlo (MC) simulation, generalized likelihood uncertainty estimation (GLUE), Markov chain Monte Carlo (MCMC), the shuffled complex evolution metropolis algorithm (SCEMUA), differential evolution adaptive metropolis (DREAM), particle swarm optimization (PSO) and adaptive cluster covering (ACCO) [1]. These methods were applied to estimate the uncertainty of streamflow simulation using the conceptual model HBV and the semi-distributed hydrological model SWAT. The Nzoia catchment in western Kenya is considered as the case study. The results are compared and analysed based on the shape of the posterior distribution of the parameters and the uncertainty results on model outputs. The MLUE method [2] uses the results of Monte Carlo sampling (or any other sampling scheme) to build a machine learning (regression) model U able to predict the uncertainty (quantiles of the pdf) of the outputs of a hydrological model H. Inputs to these models are specially identified representative variables (past events' precipitation and flows). The trained machine learning models are then employed to predict the model output uncertainty specific to the new input data. The problem here is that different sampling algorithms result in different data sets used to train such a model U, which leads to several models (and there is no clear evidence which model is the best, since there is no basis for comparison). A solution could be to form a committee of all models U and
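Among the sampling strategies listed above, GLUE is the simplest to sketch. Below is a minimal GLUE example with a toy linear-reservoir model in place of HBV/SWAT and an informal Gaussian-type likelihood; the rainfall series, the hidden "true" parameter and the behavioral threshold are all assumptions:

```python
import math, random

random.seed(42)

# Toy linear reservoir; the "observations" come from a hidden k = 0.3
rain = [5.0, 0.0, 10.0, 3.0, 0.0, 8.0, 2.0, 0.0, 0.0, 4.0]

def simulate(k):
    s, flows = 0.0, []
    for p in rain:
        s += p
        q = k * s          # outflow proportional to storage
        s -= q
        flows.append(q)
    return flows

obs = simulate(0.3)

# GLUE: sample the parameter, score with an informal likelihood, keep the
# best 10% of samples as "behavioral"
runs = []
for _ in range(3000):
    k = random.uniform(0.05, 0.95)
    sim = simulate(k)
    sse = sum((a - b) ** 2 for a, b in zip(sim, obs))
    runs.append((math.exp(-0.5 * sse), sim[-1]))
runs.sort(reverse=True)
behavioral = runs[: len(runs) // 10]

# Likelihood-weighted 5-95% uncertainty bounds on the final-step flow
total = sum(l for l, _ in behavioral)
acc, lo, hi = 0.0, None, None
for l, q in sorted(behavioral, key=lambda t: t[1]):
    acc += l / total
    if lo is None and acc >= 0.05:
        lo = q
    if hi is None and acc >= 0.95:
        hi = q
print(round(lo, 3), round(hi, 3))
```

The resulting bounds bracket the observed flow; swapping the sampler (MCMC, DREAM, etc.) changes how efficiently the behavioral region is explored, which is exactly the comparison the paper makes.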
Imprecision and Uncertainty in the UFO Database Model.
Van Gyseghem, Nancy; De Caluwe, Rita
1998-01-01
Discusses how imprecision and uncertainty are dealt with in the UFO (Uncertainty and Fuzziness in an Object-oriented) database model. Such information is expressed by means of possibility distributions, and modeled by means of the proposed concept of "role objects." The role objects model uncertain, tentative information about objects, and thus…
Estimating the magnitude of prediction uncertainties for the APLE model
Models are often used to predict phosphorus (P) loss from agricultural fields. While it is commonly recognized that model predictions are inherently uncertain, few studies have addressed prediction uncertainties using P loss models. In this study, we conduct an uncertainty analysis for the Annual P ...
Marzocchi, Warner; Jordan, Thomas
2014-05-01
Probabilistic assessment has become a widely accepted procedure for quantitatively estimating natural hazards. In essence, probabilities are meant to quantify the ubiquitous and deep uncertainties that characterize the evolution of natural systems. However, notwithstanding the very wide use of the terms 'uncertainty' and 'probability' in natural hazards, the way in which they are linked, how they are estimated and their scientific meaning are far from clear, as evidenced by the last Intergovernmental Panel on Climate Change (IPCC) report and by its subsequent review. The lack of a formal framework to interpret uncertainty and probability coherently has paved the way for some of the strongest criticisms of hazard analysis; in fact, it has been argued that most natural hazard analyses are intrinsically 'unscientific'. For example, among the concerns is the use of expert opinion to characterize the so-called epistemic uncertainties; many have argued that such personal degrees of belief cannot be measured and, by implication, cannot be tested. The purpose of this talk is to confront and clarify the conceptual issues associated with the role of uncertainty and probability in natural hazard analysis and the conditions that make a hazard model testable and hence 'scientific'. Specifically, we show that testability of hazard models requires a suitable taxonomy of uncertainty embedded in a proper logical framework. This taxonomy of uncertainty is composed of aleatory variability, epistemic uncertainty, and ontological error. We discuss their differences, their link with probability, and their estimation using data, models, and subjective expert opinion. We show that these different uncertainties, and the testability of hazard models, can be unequivocally defined only for a well-defined experimental concept, that is, a concept external to the model under test. All these points are illustrated through simple examples related to probabilistic seismic hazard analysis.
Indian Academy of Sciences (India)
Rituparna Chutia; Supahi Mahanta; D Datta
2014-04-01
The parameters associated with an environmental dispersion model may include different kinds of variability, imprecision and uncertainty. Often, available information is interpreted in a probabilistic sense, and probability theory is a well-established theory for measuring such variability. However, not all available information, data or model parameters affected by variability, imprecision and uncertainty can be handled by traditional probability theory. Uncertainty or imprecision may occur due to incomplete information or data, measurement error, or data obtained from expert judgement or subjective interpretation of available data or information. Thus model parameters may be affected by subjective uncertainty, which traditional probability theory is inappropriate to represent. Possibility theory is used as a tool to describe parameters with insufficient knowledge. Based on the polynomial chaos expansion, the stochastic response surface method is utilized in this article for the uncertainty propagation of an atmospheric dispersion model under both probabilistic and possibilistic information. The proposed method is demonstrated through a hypothetical case study of atmospheric dispersion.
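The possibilistic side of such an analysis can be illustrated with alpha-cut interval propagation of a triangular fuzzy parameter through a monotone toy model (this sketches possibility propagation only, not the stochastic response surface method itself; the model and all numbers are invented):

```python
# Alpha-cut propagation of a triangular fuzzy parameter through a model
def model(u, q=1.0, x=100.0):
    # toy ground-level concentration, inversely proportional to wind speed
    return q / (u * x)

low, mode, high = 1.0, 3.0, 6.0   # triangular fuzzy wind speed (assumed)

cuts = {}
for alpha in (0.0, 0.5, 1.0):
    a = low + alpha * (mode - low)      # lower end of the alpha-cut
    b = high - alpha * (high - mode)    # upper end of the alpha-cut
    # model is monotone decreasing in u, so the interval image is just
    # the images of the endpoints, swapped
    cuts[alpha] = (model(b), model(a))
print(cuts)
```

The nested output intervals form the possibility distribution of the concentration: the alpha = 0 cut is the widest (full support), and the alpha = 1 cut collapses to the value at the modal wind speed.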
Stolarski, R. S.; Butler, D. M.; Rundel, R. D.
1977-01-01
A concise stratospheric model was used in a Monte Carlo analysis of the propagation of reaction rate uncertainties through the calculation of an ozone perturbation due to the addition of chlorine. Two thousand Monte Carlo cases were run with 55 reaction rates being varied. Excellent convergence was obtained in the output distributions because the model is sensitive to the uncertainties in only about 10 reactions. For a 1 ppbv chlorine perturbation added to a 1.5 ppbv chlorine background, the resultant 1-sigma uncertainty on the ozone perturbation is a factor of 1.69 on the high side and 1.80 on the low side. The corresponding 2-sigma factors are 2.86 and 3.23. Results are also given for the uncertainties, due to reaction rates, in the ambient concentrations of stratospheric species.
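The Monte Carlo propagation of lognormal rate uncertainties into a multiplicative output factor can be sketched as follows; the sensitivity exponents and uncertainty factors below are invented placeholders for the dominant reactions, not the paper's 55 rates:

```python
import math, random, statistics

random.seed(7)

# Illustrative log-sensitivities of the output to a few dominant rates,
# and the 1-sigma uncertainty FACTOR of each rate (all assumed values)
sens = [0.9, -0.6, 0.4, -0.3, 0.2]
unc = [1.3, 1.3, 1.5, 1.4, 1.2]

def run_case():
    dln = 0.0
    for a, f in zip(sens, unc):
        # each rate sampled as a lognormal multiplier with factor-f spread
        dln += a * random.gauss(0.0, math.log(f))
    return math.exp(dln)               # multiplicative change in the output

cases = [run_case() for _ in range(2000)]
sigma = statistics.stdev(math.log(c) for c in cases)
factor = math.exp(sigma)               # 1-sigma multiplicative factor
print(round(factor, 2))
```

Because the uncertainties combine in log space, the result is naturally reported as a multiplicative "factor of" uncertainty, exactly the form (1.69 high / 1.80 low) quoted in the abstract.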
Multi-model ensemble hydrologic prediction and uncertainties analysis
Directory of Open Access Journals (Sweden)
S. Jiang
2014-09-01
Full Text Available Modelling uncertainties (i.e. input errors, parameter uncertainties and model structural errors) inevitably exist in hydrological prediction. A lot of recent attention has focused on these, of which input error modelling, parameter optimization and multi-model ensemble strategies are the three most popular methods to demonstrate the impacts of modelling uncertainties. In this paper the Xinanjiang model, the Hybrid rainfall–runoff model and the HYMOD model were applied to the Mishui Basin, south China, for daily streamflow ensemble simulation and uncertainty analysis. The three models were first calibrated by two parameter optimization algorithms, namely the Shuffled Complex Evolution method (SCE-UA) and the Shuffled Complex Evolution Metropolis method (SCEM-UA); next, the input uncertainty was accounted for by introducing a normally distributed error multiplier; then, the simulation sets calculated from the three models were combined by Bayesian model averaging (BMA). The results show that both parameter optimization algorithms generate good streamflow simulations; specifically, the SCEM-UA can imply parameter uncertainty and give the posterior distribution of the parameters. Considering the precipitation input uncertainty, the streamflow simulation precision does not improve very much. The BMA combination, however, not only improves the streamflow prediction precision but also gives quantitative uncertainty bounds for the simulation sets. The SCEM-UA calculated prediction interval is better than the SCE-UA calculated one. These results suggest that considering the model parameters' uncertainties and performing multi-model ensemble simulations are very practical for streamflow prediction and flood forecasting, from which more precise predictions and more reliable uncertainty bounds can be generated.
Uncertainty models applied to the substation planning
Energy Technology Data Exchange (ETDEWEB)
Fontoura Filho, Roberto N. [ELETROBRAS, Rio de Janeiro, RJ (Brazil); Aires, Joao Carlos O.; Tortelly, Debora L.S. [Light Servicos de Eletricidade S.A., Rio de Janeiro, RJ (Brazil)
1994-12-31
The selection of reinforcements for a power system expansion becomes a difficult task in an environment of uncertainties. These uncertainties can be classified according to their sources as endogenous and exogenous. The endogenous uncertainty is associated with the elements of the generation, transmission and distribution systems. The exogenous uncertainty is associated with external aspects, such as the financial resources, the time needed to build the installations, the equipment prices and the load level. The load uncertainty is extremely sensitive to the behaviour of economic conditions. Although uncertainty cannot be removed completely, the endogenous component can be conveniently treated and the exogenous component can be compensated for. This paper describes an uncertainty treatment methodology and a practical application to a group of substations belonging to the LIGHT company, the Rio de Janeiro electric utility. The equipment performance uncertainty is treated by adopting a probabilistic approach. The uncertainty associated with load growth is considered by using technical analysis of scenarios and choice criteria based on decision theory. In this paper, the Savage method and the fuzzy set method were used in order to select the best medium-term reinforcement plan. (author) 7 refs., 4 figs., 6 tabs.
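The Savage (minimax-regret) selection step mentioned above can be sketched with a small decision table; the plans, load scenarios and costs are hypothetical:

```python
# Savage minimax-regret choice among reinforcement plans under load
# scenarios; costs[plan][scenario] are illustrative numbers only
costs = {
    "plan_A": {"low": 10, "med": 14, "high": 25},
    "plan_B": {"low": 12, "med": 13, "high": 18},
    "plan_C": {"low": 16, "med": 16, "high": 17},
}
scenarios = ["low", "med", "high"]

# Regret = cost of a plan minus the best achievable cost in that scenario
best = {s: min(costs[p][s] for p in costs) for s in scenarios}
regret = {p: max(costs[p][s] - best[s] for s in scenarios) for p in costs}

# Savage criterion: pick the plan whose worst-case regret is smallest
choice = min(regret, key=regret.get)
print(choice, regret)
```

Here the cheapest plan under the low-load scenario is not chosen, because it would be regretted heavily if the high-load scenario materialized; the criterion deliberately hedges against the exogenous load uncertainty.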
Estimated Frequency Domain Model Uncertainties used in Robust Controller Design
DEFF Research Database (Denmark)
Tøffner-Clausen, S.; Andersen, Palle; Stoustrup, Jakob;
1994-01-01
This paper deals with the combination of system identification and robust controller design. Recent results on estimation of frequency domain model uncertainty are...
Kavetski, D.; Clark, M. P.; Fenicia, F.
2011-12-01
Hydrologists often face sources of uncertainty that dwarf those normally encountered in many engineering and scientific disciplines. Especially when representing large scale integrated systems, internal heterogeneities such as stream networks, preferential flowpaths, vegetation, etc, are necessarily represented with a considerable degree of lumping. The inputs to these models are themselves often the products of sparse observational networks. Given the simplifications inherent in environmental models, especially lumped conceptual models, does it really matter how they are implemented? At the same time, given the complexities usually found in the response surfaces of hydrological models, increasingly sophisticated analysis methodologies are being proposed for sensitivity analysis, parameter calibration and uncertainty assessment. Quite remarkably, rather than being caused by the model structure/equations themselves, in many cases model analysis complexities are consequences of seemingly trivial aspects of the model implementation - often, literally, whether the start-of-step or end-of-step fluxes are used! The extent of problems can be staggering, including (i) degraded performance of parameter optimization and uncertainty analysis algorithms, (ii) erroneous and/or misleading conclusions of sensitivity analysis, parameter inference and model interpretations and, finally, (iii) poor reliability of a calibrated model in predictive applications. While the often nontrivial behavior of numerical approximations has long been recognized in applied mathematics and in physically-oriented fields of environmental sciences, it remains a problematic issue in many environmental modeling applications. Perhaps detailed attention to numerics is only warranted for complicated engineering models? Would not numerical errors be an insignificant component of total uncertainty when typical data and model approximations are present? Is this really a serious issue beyond some rare isolated
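The start-of-step versus end-of-step issue is easy to demonstrate on a linear reservoir dS/dt = -kS: with a daily step and a fast rate constant, the two discretizations of the same model equation diverge wildly from each other and from the exact solution (the numbers are illustrative):

```python
import math

# Linear reservoir dS/dt = -k*S over ten daily steps
k, S0, dt, n = 0.9, 100.0, 1.0, 10

exact = S0 * math.exp(-k * dt * n)     # analytical solution

s_exp = s_imp = S0
for _ in range(n):
    s_exp = s_exp - k * s_exp * dt     # start-of-step (explicit) flux
    s_imp = s_imp / (1 + k * dt)       # end-of-step (implicit) flux

print(s_exp, s_imp, exact)
```

Both schemes are "the same model" on paper, yet the explicit version drains the store orders of magnitude too fast while the implicit version drains it too slowly; a parameter calibrated against one scheme compensates for its numerical error and is not transferable, which is precisely the pathology the abstract describes.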
Model development and data uncertainty integration
Energy Technology Data Exchange (ETDEWEB)
Swinhoe, Martyn Thomas [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2015-12-02
The effect of data uncertainties is discussed, with the epithermal neutron multiplicity counter as an illustrative example. Simulation using MCNP6, cross section perturbations and correlations are addressed, along with the effect of the 240Pu spontaneous fission neutron spectrum, the effect of P(ν) for 240Pu spontaneous fission, and the effect of spontaneous fission and (α,n) intensity. The effect of nuclear data is the product of the initial uncertainty and the sensitivity -- both need to be estimated. In conclusion, a multi-parameter variation method has been demonstrated; the most significant parameters are the basic emission rates of the spontaneous fission and (α,n) processes, and the uncertainties and important data depend on the analysis technique chosen.
A conceptual framework for a mentoring model for nurse educators ...
African Journals Online (AJOL)
A conceptual framework for a mentoring model for nurse educators. ... recruiting and retaining nurse educators to meet the demands of teaching and learning ... approaches focusing on reasoning strategies, literature control and empirical data ...
Spatial uncertainty model for visual features using a Kinect™ sensor.
Park, Jae-Han; Shin, Yong-Deuk; Bae, Ji-Hun; Baeg, Moon-Hong
2012-01-01
This study proposes a mathematical uncertainty model for the spatial measurement of visual features using Kinect™ sensors. This model can provide qualitative and quantitative analysis for the utilization of Kinect™ sensors as 3D perception sensors. In order to achieve this objective, we derived the propagation relationship of the uncertainties between the disparity image space and the real Cartesian space with the mapping function between the two spaces. Using this propagation relationship, we obtained the mathematical model for the covariance matrix of the measurement error, which represents the uncertainty for spatial position of visual features from Kinect™ sensors. In order to derive the quantitative model of spatial uncertainty for visual features, we estimated the covariance matrix in the disparity image space using collected visual feature data. Further, we computed the spatial uncertainty information by applying the covariance matrix in the disparity image space and the calibrated sensor parameters to the proposed mathematical model. This spatial uncertainty model was verified by comparing the uncertainty ellipsoids for spatial covariance matrices and the distribution of scattered matching visual features. We expect that this spatial uncertainty model and its analyses will be useful in various Kinect™ sensor applications.
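The covariance propagation described above can be sketched as a first-order (Jacobian) mapping from disparity space (u, v, d) to Cartesian space. The pinhole-with-baseline relations and all numbers below are assumptions for illustration, not calibrated Kinect parameters:

```python
# First-order propagation of disparity-space covariance to Cartesian space
# using z = f*b/d, x = u*z/f, y = v*z/f (assumed stereo/disparity model)
f, b = 580.0, 0.075        # focal length [px], baseline [m]; illustrative
u, v, d = 40.0, -25.0, 30.0

z = f * b / d

# Jacobian of (x, y, z) with respect to (u, v, d)
J = [[z / f, 0.0, -u * z / (f * d)],
     [0.0, z / f, -v * z / (f * d)],
     [0.0, 0.0, -z / d]]

S_uvd = [[0.7, 0.0, 0.0],   # assumed covariance in disparity image space
         [0.0, 0.7, 0.0],
         [0.0, 0.0, 0.4]]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

JT = [[J[j][i] for j in range(3)] for i in range(3)]
S_xyz = matmul(matmul(J, S_uvd), JT)    # Sigma_xyz = J Sigma_uvd J^T
print([round(S_xyz[i][i], 8) for i in range(3)])
```

Because the depth term z = f*b/d is nonlinear in d, the propagated uncertainty grows rapidly with range, which is why the uncertainty ellipsoids of such sensors elongate along the viewing direction.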
Spatial Uncertainty Model for Visual Features Using a Kinect™ Sensor
Directory of Open Access Journals (Sweden)
Jae-Han Park
2012-06-01
Full Text Available This study proposes a mathematical uncertainty model for the spatial measurement of visual features using Kinect™ sensors. This model can provide qualitative and quantitative analysis for the utilization of Kinect™ sensors as 3D perception sensors. In order to achieve this objective, we derived the propagation relationship of the uncertainties between the disparity image space and the real Cartesian space with the mapping function between the two spaces. Using this propagation relationship, we obtained the mathematical model for the covariance matrix of the measurement error, which represents the uncertainty for spatial position of visual features from Kinect™ sensors. In order to derive the quantitative model of spatial uncertainty for visual features, we estimated the covariance matrix in the disparity image space using collected visual feature data. Further, we computed the spatial uncertainty information by applying the covariance matrix in the disparity image space and the calibrated sensor parameters to the proposed mathematical model. This spatial uncertainty model was verified by comparing the uncertainty ellipsoids for spatial covariance matrices and the distribution of scattered matching visual features. We expect that this spatial uncertainty model and its analyses will be useful in various Kinect™ sensor applications.
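The propagation relationship described above is a first-order (Jacobian) covariance transform between disparity-image space and Cartesian space. A minimal sketch under an idealized pinhole-plus-disparity model; the calibration values f, b, cx, cy and the noise levels are illustrative placeholders, not actual Kinect constants:

```python
import numpy as np

def disparity_to_cartesian(u, v, d, f=580.0, b=0.075, cx=320.0, cy=240.0):
    """Map a disparity-image measurement (u, v, d) to Cartesian (x, y, z).

    f (focal length, px), b (baseline, m) and cx/cy (principal point)
    are assumed calibration values for illustration only.
    """
    z = f * b / d
    x = (u - cx) * z / f
    y = (v - cy) * z / f
    return np.array([x, y, z])

def cartesian_covariance(u, v, d, cov_uvd, f=580.0, b=0.075, cx=320.0, cy=240.0):
    """First-order propagation Sigma_xyz = J * Sigma_uvd * J^T, where J is
    the Jacobian of the (u, v, d) -> (x, y, z) mapping at the measurement."""
    z = f * b / d
    # Partial derivatives of (x, y, z) with respect to (u, v, d)
    J = np.array([
        [z / f, 0.0,   -(u - cx) * z / (f * d)],
        [0.0,   z / f, -(v - cy) * z / (f * d)],
        [0.0,   0.0,   -z / d],
    ])
    return J @ cov_uvd @ J.T

cov_uvd = np.diag([0.5**2, 0.5**2, 0.3**2])  # assumed pixel/disparity noise
p_near = cartesian_covariance(400, 300, 60.0, cov_uvd)  # large disparity
p_far = cartesian_covariance(400, 300, 15.0, cov_uvd)   # small disparity
```

Because z is proportional to 1/d, the depth variance grows rapidly as disparity shrinks, which is why the uncertainty ellipsoids elongate along the viewing axis at range.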
Jacquin, A. P.
2012-04-01
This study is intended to quantify the impact of uncertainty about precipitation spatial distribution on predictive uncertainty of a snowmelt runoff model. This problem is especially relevant in mountain catchments with a sparse precipitation observation network and relatively short precipitation records. The model analysed is a conceptual watershed model operating at a monthly time step. The model divides the catchment into five elevation zones, where the fifth zone corresponds to the catchment's glaciers. Precipitation amounts at each elevation zone i are estimated as the product of observed precipitation at a station and a precipitation factor FPi. If other precipitation data are not available, these precipitation factors must be adjusted during the calibration process and are thus seen as parameters of the model. In the case of the fifth zone, glaciers are seen as an inexhaustible source of water that melts when the snow cover is depleted. The catchment case study is Aconcagua River at Chacabuquito, located in the Andean region of Central Chile. The model's predictive uncertainty is measured in terms of the output variance of the mean squared error of the Box-Cox transformed discharge, the relative volumetric error, and the weighted average of snow water equivalent in the elevation zones at the end of the simulation period. Sobol's variance decomposition (SVD) method is used for assessing the impact of precipitation spatial distribution, represented by the precipitation factors FPi, on the model's predictive uncertainty. In the SVD method, the first-order effect of a parameter (or group of parameters) indicates the fraction of predictive uncertainty that could be reduced if the true value of this parameter (or group) was known. Similarly, the total effect of a parameter (or group) measures the fraction of predictive uncertainty that would remain if the true value of this parameter (or group) was unknown, but all the remaining model parameters could be fixed.
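The first-order and total effects defined above can be estimated with a standard Monte Carlo (Saltelli-type) scheme. The sketch below substitutes a simple additive test function for the watershed model; the function, uniform input distributions, and sample size are illustrative assumptions:

```python
import numpy as np

def sobol_indices(model, sampler, n=50000, k=3, seed=0):
    """Monte Carlo (Saltelli-type) estimates of first-order (S_i) and
    total (S_Ti) Sobol indices for a model with k independent inputs."""
    rng = np.random.default_rng(seed)
    A = sampler(rng, n, k)
    B = sampler(rng, n, k)
    fA, fB = model(A), model(B)
    total_var = np.var(np.concatenate([fA, fB]))
    S1, ST = np.empty(k), np.empty(k)
    for i in range(k):
        ABi = A.copy()
        ABi[:, i] = B[:, i]              # column i taken from B
        fABi = model(ABi)
        S1[i] = np.mean(fB * (fABi - fA)) / total_var        # first-order
        ST[i] = 0.5 * np.mean((fA - fABi) ** 2) / total_var  # total effect
    return S1, ST

# Illustrative additive stand-in for the watershed model:
# Y = X0 + 2*X1 + 3*X2 with Xi ~ U(0, 1), so S_i is proportional to 1, 4, 9.
model = lambda X: X[:, 0] + 2 * X[:, 1] + 3 * X[:, 2]
sampler = lambda rng, n, k: rng.uniform(size=(n, k))
S1, ST = sobol_indices(model, sampler)
```

For an additive model the first-order and total effects coincide; a gap between them signals interactions among the precipitation factors.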
Conceptual Model of Multidimensional Marketing Information System
Kriksciuniene, Dalia; Urbanskiene, Ruta
This article aims to analyse why information systems at the enterprise do not always satisfy the expectations of marketing management specialists. Computerized systems serve with growing success the information needs of those areas of enterprise management where they can create an information equivalent of real management processes. Yet their inability to fulfil marketing needs effectively indicates gaps not only in the ability to structure marketing processes, but also in the conceptual development of marketing information systems (MkIS).
Uncertainty Categorization, Modeling, and Management for Regional Water Supply Planning
Fletcher, S.; Strzepek, K. M.; AlSaati, A.; Alhassan, A.
2016-12-01
Many water planners face increased pressure on water supply systems from growing demands, variability in supply and a changing climate. Short-term variation in water availability and demand; long-term uncertainty in climate, groundwater storage, and sectoral competition for water; and varying stakeholder perspectives on the impacts of water shortages make it difficult to assess the necessity of expensive infrastructure investments. We categorize these uncertainties on two dimensions: whether they are the result of stochastic variation or epistemic uncertainty, and whether the uncertainties can be described probabilistically or are deep uncertainties whose likelihood is unknown. We develop a decision framework that combines simulation for probabilistic uncertainty, sensitivity analysis for deep uncertainty and Bayesian decision analysis for uncertainties that are reduced over time with additional information. We apply this framework to two contrasting case studies - drought preparedness in Melbourne, Australia and fossil groundwater depletion in Riyadh, Saudi Arabia - to assess the impacts of different types of uncertainty on infrastructure decisions. Melbourne's water supply system relies on surface water, which is impacted by natural variation in rainfall, and a market-based system for managing water rights. Our results show that small, flexible investment increases can mitigate shortage risk considerably at reduced cost. Riyadh, by contrast, relies primarily on desalination for municipal use and fossil groundwater for agriculture, and a centralized planner makes allocation decisions. Poor regional groundwater measurement makes it difficult to know when groundwater pumping will become uneconomical, resulting in epistemic uncertainty. However, collecting more data can reduce the uncertainty, suggesting the need for different uncertainty modeling and management strategies in Riyadh than in Melbourne. We will categorize the two systems and propose appropriate
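The Bayesian decision-analysis component of such a framework turns on whether collecting more data is worth its cost. A minimal sketch of the underlying idea using expected value of perfect information (EVPI); the two actions, states, probability, and costs are entirely hypothetical:

```python
# Hypothetical two-action supply problem: invest in extra capacity now
# (fixed cost regardless of outcome) or defer and risk shortage penalties.
p_dry = 0.3               # assumed probability of the adverse state
cost = {
    ("invest", "dry"): 100, ("invest", "wet"): 100,   # capacity cost
    ("defer",  "dry"): 250, ("defer",  "wet"): 0,     # shortage penalty
}

def expected_cost(action, p):
    return p * cost[(action, "dry")] + (1 - p) * cost[(action, "wet")]

# Decision under prior uncertainty: pick the cheaper action in expectation
prior_best = min(("invest", "defer"), key=lambda a: expected_cost(a, p_dry))

# Expected value of perfect information: knowing the state before acting.
# If EVPI exceeds the cost of monitoring, data collection is justified.
ev_perfect = (p_dry * min(cost[("invest", "dry")], cost[("defer", "dry")])
              + (1 - p_dry) * min(cost[("invest", "wet")], cost[("defer", "wet")]))
evpi = expected_cost(prior_best, p_dry) - ev_perfect
```

In the Riyadh-style case, where epistemic uncertainty about groundwater is reducible by measurement, a positive EVPI motivates monitoring investment; in the Melbourne-style case of irreducible stochastic rainfall variation, it does not.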
Conceptual Models and Guidelines for Clinical Assessment of Financial Capacity.
Marson, Daniel
2016-09-01
The ability to manage financial affairs is a life skill of critical importance, and neuropsychologists are increasingly asked to assess financial capacity across a variety of settings. Sound clinical assessment of financial capacity requires knowledge and appreciation of applicable clinical conceptual models and principles. However, the literature has presented relatively little conceptual guidance for clinicians concerning financial capacity and its assessment. This article seeks to address this gap. The article presents six clinical models of financial capacity: (1) the early gerontological IADL model of Lawton, (2) the clinical skills model and (3) related cognitive psychological model developed by Marson and colleagues, (4) a financial decision-making model adapting earlier decisional capacity work of Appelbaum and Grisso, (5) a person-centered model of financial decision-making developed by Lichtenberg and colleagues, and (6) a recent model of financial capacity in the real world developed through the Institute of Medicine. Accompanying presentation of the models is discussion of conceptual and practical perspectives they represent for clinician assessment. Based on the models, the article concludes by presenting a series of conceptually oriented guidelines for clinical assessment of financial capacity. In summary, sound assessment of financial capacity requires knowledge and appreciation of clinical conceptual models and principles. Awareness of such models, principles and guidelines will strengthen and advance clinical assessment of financial capacity.
Investigating the Propagation of Meteorological Model Uncertainty for Tracer Modeling
Lopez-Coto, I.; Ghosh, S.; Karion, A.; Martin, C.; Mueller, K. L.; Prasad, K.; Whetstone, J. R.
2016-12-01
The North-East Corridor project aims to use a top-down inversion method to quantify sources of greenhouse gas (GHG) emissions in the urban areas of Washington, DC and Baltimore at approximately 1 km² resolution, helping to establish reliable measurement methods for quantifying and validating GHG emissions independently of the inventory methods typically used to guide mitigation efforts. Since inversion methods depend strongly on atmospheric transport modeling, analyzing the uncertainties in the meteorological fields and their propagation through the sensitivities of observations to surface fluxes (footprints) is a fundamental step. To this end, six configurations of the Weather Research and Forecasting model (WRF-ARW) version 3.8 were used to generate an ensemble of meteorological simulations. Specifically, we used four planetary boundary layer parameterizations (YSU, MYNN2, BouLac, QNSE), two sources of initial and boundary conditions (NARR and HRRR), and one configuration including the building energy parameterization (BEP) urban canopy model. The simulations were compared with more than 150 meteorological surface stations, a wind profiler, and radiosondes for one month (February 2016) to account for the uncertainties and the ensemble spread for wind speed, wind direction, and mixing height. In addition, we used the Stochastic Time-Inverted Lagrangian Transport model (STILT) to derive the sensitivity of 12 hypothetical observations to surface emissions (footprints) with each WRF configuration. The footprints and integrated sensitivities were compared and the resulting uncertainties estimated.
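Quantifying ensemble spread for wind direction requires circular statistics, since directions wrap at 360°. A small sketch computing the circular standard deviation from the mean resultant length; the six member values stand in for hypothetical WRF configurations:

```python
import numpy as np

def circular_spread_deg(dir_deg):
    """Circular standard deviation of wind directions (degrees), so that
    355 deg and 5 deg count as 10 deg apart, not 350 deg."""
    rad = np.deg2rad(np.asarray(dir_deg))
    C, S = np.mean(np.cos(rad)), np.mean(np.sin(rad))
    R = np.hypot(C, S)                      # mean resultant length in [0, 1]
    return np.rad2deg(np.sqrt(-2.0 * np.log(np.clip(R, 1e-12, 1.0))))

# Hypothetical wind directions from six ensemble members at one station
dirs = [355.0, 5.0, 350.0, 10.0, 0.0, 358.0]
naive = np.std(dirs, ddof=1)      # inflated by the 0/360 wrap-around
circ = circular_spread_deg(dirs)  # a few degrees, as the data suggest
```

The naive standard deviation treats the wrap as real disagreement; the circular estimator correctly reports a tight cluster of northerly directions.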
Towards a Model of Technology Adoption: A Conceptual Model Proposition
Costello, Pat; Moreton, Rob
A conceptual model for Information Communication Technology (ICT) adoption by Small Medium Enterprises (SMEs) is proposed. The research uses several ICT adoption models as its basis with theoretical underpinning provided by the Diffusion of Innovation theory and the Technology Acceptance Model (TAM). Taking an exploratory research approach the model was investigated amongst 200 SMEs whose core business is ICT. Evidence from this study demonstrates that these SMEs face the same issues as all other industry sectors. This work points out weaknesses in SMEs environments regarding ICT adoption and suggests what they may need to do to increase the success rate of any proposed adoption. The methodology for development of the framework is described and recommendations made for improved Government-led ICT adoption initiatives. Application of the general methodology has resulted in new opportunities to embed the ethos and culture surrounding the issues into the framework of new projects developed as a result of Government intervention. A conceptual model is proposed that may lead to a deeper understanding of the issues under consideration.
Southern marl prairies conceptual ecological model
Davis, S.M.; Loftus, W.F.; Gaiser, E.E.; Huffman, A.E.
2005-01-01
About 190,000 ha of higher-elevation marl prairies flank either side of Shark River Slough in the southern Everglades. Water levels typically drop below the ground surface each year in this landscape. Consequently, peat soil accretion is inhibited, and substrates consist either of calcitic marl produced by algal periphyton mats or exposed limestone bedrock. The southern marl prairies support complex mosaics of wet prairie, sawgrass (Cladium jamaicense), tree islands, and tropical hammock communities and a high diversity of plant species. However, relatively short hydroperiods and annual dry downs provide stressful conditions for aquatic fauna, affecting survival in the dry season when surface water is absent. Here, we present a conceptual ecological model developed for this landscape through scientific consensus, use of empirical data, and modeling. The two major societal drivers affecting the southern marl prairies are water management practices and agricultural and urban development. These drivers lead to five groups of ecosystem stressors: loss of spatial extent and connectivity, shortened hydroperiod and increased drought severity, extended hydroperiod and drying pattern reversals, introduction and spread of non-native trees, and introduction and spread of non-native fishes. Major ecological attributes include periphyton mats, plant species diversity and community mosaic, Cape Sable seaside sparrow (Ammodramus maritimus mirabilis), marsh fishes and associated aquatic fauna prey base, American alligator (Alligator mississippiensis), and wading bird early dry season foraging. Water management and development are hypothesized to have a negative effect on the ecological attributes of the southern marl prairies in the following ways. Periphyton mats have decreased in cover in areas where hydroperiod has been significantly reduced and changed in community composition due to inverse responses to increased nutrient availability. Plant species diversity and
Modeling uncertainty in requirements engineering decision support
Feather, Martin S.; Maynard-Zhang, Pedrito; Kiper, James D.
2005-01-01
One inherent characteristic of requirements engineering is a lack of certainty during this early phase of a project. Nevertheless, decisions about requirements must be made in spite of this uncertainty. Here we describe the context in which we are exploring this, and some initial work to support elicitation of uncertain requirements, and to deal with the combination of such information from multiple stakeholders.
[Application of an uncertainty model for fibromyalgia].
Triviño Martínez, Ángeles; Solano Ruiz, M Carmen; Siles González, José
2016-04-01
To explore the experiences of women diagnosed with fibromyalgia, applying the Theory of Uncertainty proposed by M. Mishel. A qualitative study was conducted using a phenomenological approach, at a patients' association in the province of Alicante between June 2012 and November 2013. A total of 14 women diagnosed with fibromyalgia, aged between 45 and 65 years, participated in the study as volunteers. Information was generated through structured interviews that were recorded and transcribed, after a confidentiality pledge and informed consent. Content analysis was performed by extracting categories according to the proposed theory. The patients studied perceive a high level of uncertainty related to the difficulty of dealing with symptoms, uncertainty about diagnosis, and treatment complexity. Moreover, their ability to cope with the disease is influenced by social support, relationships with health professionals, and the help and information provided by patient associations. Health professionals must provide fibromyalgia sufferers with clear information on the pathology: the greater the patients' knowledge of their disease and the better the quality of the information provided, the less anxiety and uncertainty is reported in the experience of the disease. Likewise, patient associations should have health professionals available in order to avoid bias in the information and to offer advice backed by scientific evidence. Copyright © 2015 Elsevier España, S.L.U. All rights reserved.
Assessing Uncertainty of Interspecies Correlation Estimation Models for Aromatic Compounds
We developed Interspecies Correlation Estimation (ICE) models for aromatic compounds containing 1 to 4 benzene rings to assess uncertainty in toxicity extrapolation in two data compilation approaches. ICE models are mathematical relationships between surrogate and predicted test ...
Reservoir management under geological uncertainty using fast model update
Hanea, R.; Evensen, G.; Hustoft, L.; Ek, T.; Chitu, A.; Wilschut, F.
2015-01-01
Statoil is implementing "Fast Model Update (FMU)," an integrated and automated workflow for reservoir modeling and characterization. FMU connects all steps and disciplines from seismic depth conversion to prediction and reservoir management taking into account relevant reservoir uncertainty. FMU del
Development of a Prototype Model-Form Uncertainty Knowledge Base
Green, Lawrence L.
2016-01-01
Uncertainties are generally classified as either aleatory or epistemic. Aleatory uncertainties are those attributed to random variation, either naturally or through manufacturing processes. Epistemic uncertainties are generally attributed to a lack of knowledge. One type of epistemic uncertainty is called model-form uncertainty. The term model-form means that among the choices to be made during a design process within an analysis, there are different forms of the analysis process, which each give different results for the same configuration at the same flight conditions. Examples of model-form uncertainties include the grid density, grid type, and solver type used within a computational fluid dynamics code, or the choice of the number and type of model elements within a structures analysis. The objectives of this work are to identify and quantify a representative set of model-form uncertainties and to make this information available to designers through an interactive knowledge base (KB). The KB can then be used during probabilistic design sessions, so as to enable the possible reduction of uncertainties in the design process through resource investment. An extensive literature search has been conducted to identify and quantify typical model-form uncertainties present within aerospace design. An initial attempt has been made to assemble the results of this literature search into a searchable KB, usable in real time during probabilistic design sessions. A concept of operations and the basic structure of a model-form uncertainty KB are described. Key operations within the KB are illustrated. Current limitations in the KB, and possible workarounds are explained.
A Stochastic Nonlinear Water Wave Model for Efficient Uncertainty Quantification
Bigoni, Daniele; Eskilsson, Claes
2014-01-01
A major challenge in next-generation industrial applications is to improve numerical analysis by quantifying uncertainties in predictions. In this work we present a stochastic formulation of a fully nonlinear and dispersive potential flow water wave model for the probabilistic description of the evolution of waves. This model is discretized using the Stochastic Collocation Method (SCM), which provides an approximate surrogate of the model. This can be used to accurately and efficiently estimate the probability distribution of the unknown time-dependent stochastic solution after the forward propagation of uncertainties. We revisit experimental benchmarks often used for validation of deterministic water wave models. We do this using a fully nonlinear and dispersive model and show how uncertainty in the model input can influence the model output. Based on numerical experiments and assumed uncertainties in boundary data, our analysis reveals that some of the known discrepancies from deterministic simulation in compa...
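The Stochastic Collocation Method evaluates the deterministic solver at quadrature nodes of the input distribution and reconstructs output statistics from weighted node values. A one-dimensional sketch using probabilists' Gauss-Hermite nodes for a Gaussian input; the `solver` function is an assumed nonlinear stand-in, not an actual water wave model:

```python
import numpy as np

def solver(amplitude):
    """Hypothetical nonlinear response standing in for the wave model."""
    return np.tanh(amplitude) ** 2

mu, sigma = 1.0, 0.2                 # uncertain input: a ~ N(mu, sigma^2)

# Collocation: 7 Gauss-Hermite nodes (probabilists' weight exp(-x^2/2))
nodes, weights = np.polynomial.hermite_e.hermegauss(7)
w = weights / np.sqrt(2 * np.pi)     # normalize so the weights sum to 1
vals = solver(mu + sigma * nodes)    # 7 deterministic solver runs
mean = np.sum(w * vals)
var = np.sum(w * (vals - mean) ** 2)

# Brute-force Monte Carlo cross-check (200,000 solver evaluations)
rng = np.random.default_rng(0)
mc = solver(mu + sigma * rng.standard_normal(200000))
```

Seven solver runs reproduce the Monte Carlo moments here, which is the efficiency argument for SCM when each deterministic wave simulation is expensive.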
Development of Property Models with Uncertainty Estimate for Process Design under Uncertainty
DEFF Research Database (Denmark)
Hukkerikar, Amol; Sarup, Bent; Abildskov, Jens
the results of uncertainty analysis to predict the uncertainties in process design. For parameter estimation, large data-sets of experimentally measured property values for a wide range of pure compounds are taken from the CAPEC database. Classical frequentist approach i.e., least square method is adopted...... parameter, octanol/water partition coefficient, aqueous solubility, acentric factor, and liquid molar volume at 298 K. The performance of property models for these properties with the revised set of model parameters is highlighted through a set of compounds not considered in the regression step...... sensitive properties for each unit operation are also identified. This analysis can be used to reduce the uncertainties in property estimates for the properties of critical importance (by performing additional experiments to get better experimental data and better model parameter values). Thus...
Factoring uncertainty into restoration modeling of in-situ leach uranium mines
Johnson, Raymond H.; Friedel, Michael J.
2009-01-01
Postmining restoration is one of the greatest concerns for uranium in-situ leach (ISL) mining operations. The ISL-affected aquifer needs to be returned to conditions specified in the mining permit (either premining or other specified conditions). When uranium ISL operations are completed, postmining restoration is usually achieved by injecting reducing agents into the mined zone. The objective of this process is to restore the aquifer to premining conditions by reducing the solubility of uranium and other metals in the ground water. Reactive transport modeling is a potentially useful method for simulating the effectiveness of proposed restoration techniques. While reactive transport models can be useful, they are a simplification of reality that introduces uncertainty through the model conceptualization, parameterization, and calibration processes. For this reason, quantifying the uncertainty in simulated temporal and spatial hydrogeochemistry is important for postremedial risk evaluation of metal concentrations and mobility. Quantifying the range of uncertainty in key predictions (such as uranium concentrations at a specific location) can be achieved using forward Monte Carlo or other inverse modeling techniques (trial-and-error parameter sensitivity, calibration constrained Monte Carlo). These techniques provide simulated values of metal concentrations at specified locations that can be presented as nonlinear uncertainty limits or probability density functions. Decision makers can use these results to better evaluate environmental risk as future metal concentrations with a limited range of possibilities, based on a scientific evaluation of uncertainty.
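Forward Monte Carlo of the kind described reduces to three steps: sample uncertain parameters, push them through the model, and report percentile bounds. The exponential decay relation and parameter distributions below are hypothetical stand-ins for a full reactive transport model:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 50000

# Simplified (assumed) transport relation: steady-state concentration at
# distance L downgradient of the mined zone, C = C0 * exp(-lambda * L / v),
# with uncertain first-order removal rate and groundwater velocity.
C0 = 1.0                                                    # source (mg/L)
L = 100.0                                                   # distance (m)
lam = rng.lognormal(mean=np.log(0.01), sigma=0.5, size=n)   # removal (1/d)
v = rng.lognormal(mean=np.log(0.5), sigma=0.3, size=n)      # velocity (m/d)
C = C0 * np.exp(-lam * L / v)

# Nonlinear uncertainty limits for postremedial risk evaluation
p05, p50, p95 = np.percentile(C, [5, 50, 95])
```

The 5th-95th percentile band is the "limited range of possibilities" the abstract refers to; a regulator can compare the upper bound directly against a permit limit.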
Conceptual model for assessment of inhalation exposure to manufactured nanoparticles
Schneider, T.; Brouwer, D.H.; Koponen, I.K.; Jensen, K.A.; Fransman, W.; Duuren-Stuurman, B. van; Tongeren, M. van; Tielemans, E.
2011-01-01
As workplace air measurements of manufactured nanoparticles are relatively expensive to conduct, models can be helpful for a first tier assessment of exposure. A conceptual model was developed to give a framework for such models. The basis for the model is an analysis of the fate and underlying
Urban drainage models simplifying uncertainty analysis for practitioners
DEFF Research Database (Denmark)
Vezzaro, Luca; Mikkelsen, Peter Steen; Deletic, Ana
2013-01-01
There is increasing awareness about uncertainties in the modelling of urban drainage systems and, as such, many new methods for uncertainty analyses have been developed. Despite this, all available methods have limitations which restrict their widespread application among practitioners. Here, a m...
Uncertainty and error in complex plasma chemistry models
Turner, Miles M.
2015-06-01
Chemistry models that include dozens of species and hundreds to thousands of reactions are common in low-temperature plasma physics. The rate constants used in such models are uncertain, because they are obtained from some combination of experiments and approximate theories. Since the predictions of these models are a function of the rate constants, these predictions must also be uncertain. However, systematic investigations of the influence of uncertain rate constants on model predictions are rare to non-existent. In this work we examine a particular chemistry model, for helium-oxygen plasmas. This chemistry is of topical interest because of its relevance to biomedical applications of atmospheric pressure plasmas. We trace the primary sources for every rate constant in the model, and hence associate an error bar (or equivalently, an uncertainty) with each. We then use a Monte Carlo procedure to quantify the uncertainty in predicted plasma species densities caused by the uncertainty in the rate constants. Under the conditions investigated, the range of uncertainty in most species densities is a factor of two to five. However, the uncertainty can vary strongly for different species, over time, and with other plasma conditions. There are extreme (pathological) cases where the uncertainty is more than a factor of ten. One should therefore be cautious in drawing any conclusion from plasma chemistry modelling, without first ensuring that the conclusion in question survives an examination of the related uncertainty.
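The Monte Carlo procedure over uncertain rate constants can be illustrated with a toy production-loss balance for a single species; the rate constants, densities, and uncertainty factors below are assumptions for illustration, not data for any actual helium-oxygen reaction:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 20000

# Toy balance: production by electron impact (k1) and loss by quenching
# (k2) give the steady-state density n_ss = k1 * ne * ng / (k2 * nq).
k1_nom, k1_factor = 1e-16, 2.0    # assumed uncertain within a factor of 2
k2_nom, k2_factor = 1e-17, 3.0    # assumed uncertain within a factor of 3
ne, ng, nq = 1e17, 1e24, 1e22     # fixed densities (m^-3), illustrative

# Lognormal sampling: ln k ~ N(ln k_nom, (ln factor / 2)^2), treating the
# stated factor as roughly a 2-sigma bound.
k1 = k1_nom * np.exp(rng.normal(0.0, np.log(k1_factor) / 2, n))
k2 = k2_nom * np.exp(rng.normal(0.0, np.log(k2_factor) / 2, n))
n_ss = k1 * ne * ng / (k2 * nq)

# Report the spread as the ratio of the 97.5th to the 2.5th percentile
lo, hi = np.percentile(n_ss, [2.5, 97.5])
spread = hi / lo
```

Even this two-reaction toy yields an order-of-magnitude spread in the predicted density, which is consistent with the paper's finding that factor-of-two-to-five (and occasionally worse) output uncertainty is the norm.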
Energy Technology Data Exchange (ETDEWEB)
Rouxelin, Pascal Nicolas [Idaho National Lab. (INL), Idaho Falls, ID (United States); Strydom, Gerhard [Idaho National Lab. (INL), Idaho Falls, ID (United States)
2016-09-01
Best-estimate plus uncertainty analysis of reactors is replacing the traditional conservative (stacked uncertainty) method for safety and licensing analysis. To facilitate uncertainty analysis applications, a comprehensive approach and methodology must be developed and applied. High temperature gas cooled reactors (HTGRs) have several features that require techniques not used in light-water reactor analysis (e.g., coated-particle design and large graphite quantities at high temperatures). The International Atomic Energy Agency has therefore launched the Coordinated Research Project on HTGR Uncertainty Analysis in Modeling to study uncertainty propagation in the HTGR analysis chain. The benchmark problem defined for the prismatic design is represented by the General Atomics Modular HTGR 350. The main focus of this report is the compilation and discussion of the results obtained for various permutations of Exercise I-2c and the use of the cross section data in Exercise II-1a of the prismatic benchmark, which are defined as the last and first steps of the lattice and core simulation phases, respectively. The report summarizes the Idaho National Laboratory (INL) best estimate results obtained for Exercise I-2a (fresh single-fuel block), Exercise I-2b (depleted single-fuel block), and Exercise I-2c (super cell), in addition to the first results of an investigation into the cross section generation effects for the super-cell problem. The two-dimensional deterministic code known as the New ESC-based Weighting Transport (NEWT) included in the Standardized Computer Analyses for Licensing Evaluation (SCALE) 6.1.2 package was used for the cross section evaluation, and the results obtained were compared to the three-dimensional stochastic SCALE module KENO VI. The NEWT cross section libraries were generated for several permutations of the current benchmark super-cell geometry and were then provided as input to the Phase II core calculation of the stand-alone neutronics Exercise
Conceptual Frameworks and Research Models on Resilience in Leadership
Directory of Open Access Journals (Sweden)
Janet Ledesma
2014-08-01
Full Text Available The purpose of this article was to discuss conceptual frameworks and research models on resilience theory. The constructs of resilience, the history of resilience theory, models of resilience, variables of resilience, career resilience, and organizational resilience will be examined and discussed as they relate to leadership development. The literature demonstrates that there is a direct relationship between the stress of the leader’s job and his or her ability to maintain resilience in the face of prolonged contact with adversity. This article discusses resilience theory as it relates to leadership development. The concepts associated with resilience, which include thriving and hardiness, are explored with the belief that resilient leaders are invaluable to the sustainability of an organization. In addition, the constructs of resilience and the history of resilience studies in the fields of psychiatry, developmental psychopathology, human development, medicine, epidemiology, and the social sciences are examined. Survival, recovery, and thriving are concepts associated with resilience and describe the stage at which a person may be during or after facing adversity. The concept of “thriving” refers to a person’s ability to go beyond his or her original level of functioning and to grow and function despite repeated exposure to stressful experiences. The literature suggests a number of variables that characterize resilience and thriving. These variables include positive self-esteem, hardiness, strong coping skills, a sense of coherence, self-efficacy, optimism, strong social resources, adaptability, risk-taking, low fear of failure, determination, perseverance, and a high tolerance of uncertainty. These are reviewed in this article. The findings in this article suggest that those who develop leaders need to create safe environments to help emerging and existing leaders thrive as individuals and as organizational leaders in the area of resilience
Uncertainties in stellar evolution models: convective overshoot
Bressan, Alessandro; Marigo, Paola; Rosenfield, Philip; Tang, Jing
2014-01-01
In spite of the great effort made in the last decades to improve our understanding of stellar evolution, significant uncertainties remain due to our poor knowledge of some complex physical processes that require an empirical calibration, such as the efficiency of the interior mixing related to convective overshoot. Here we review the impact of convective overshoot on the evolution of stars during the main Hydrogen and Helium burning phases.
Uncertainties in Stellar Evolution Models: Convective Overshoot
Bressan, Alessandro; Girardi, Léo; Marigo, Paola; Rosenfield, Philip; Tang, Jing
In spite of the great effort made in the last decades to improve our understanding of stellar evolution, significant uncertainties remain due to our poor knowledge of some complex physical processes that require an empirical calibration, such as the efficiency of the interior mixing related to convective overshoot. Here we review the impact of convective overshoot on the evolution of stars during the main Hydrogen and Helium burning phases.
Modeling Uncertainty when Estimating IT Projects Costs
Winter, Michel; Mirbel, Isabelle; Crescenzo, Pierre
2014-01-01
In the current economic context, optimizing projects' cost is an obligation for a company to remain competitive in its market. Introducing statistical uncertainty in cost estimation is a good way to tackle the risk of going too far while minimizing the project budget: it allows the company to determine the best possible trade-off between estimated cost and acceptable risk. In this paper, we present new statistical estimators derived from the way IT companies estimate the projects' costs. In t...
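One common way to introduce statistical uncertainty into project cost estimation is to sample three-point task estimates and examine the distribution of the total, which makes the cost-versus-risk trade-off explicit. The work breakdown below is hypothetical:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100000

# Hypothetical work breakdown: (optimistic, most likely, pessimistic)
# estimates in person-days, as practitioners often provide them.
tasks = [(5, 8, 15), (10, 14, 30), (3, 4, 8), (20, 25, 45)]

# Sample each task from a triangular distribution and sum per scenario
total = sum(rng.triangular(low, mode, high, size=n)
            for low, mode, high in tasks)

naive = sum(mode for _, mode, _ in tasks)   # sum of most-likely values
p50, p80 = np.percentile(total, [50, 80])
risk_at_naive = np.mean(total > naive)      # chance of exceeding it
```

Because each task distribution is right-skewed, the sum of most-likely values is almost certain to be exceeded; quoting the 80th percentile instead lets the company choose its acceptable overrun risk explicitly.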
Uncertainties in environmental radiological assessment models and their implications
Energy Technology Data Exchange (ETDEWEB)
Hoffman, F.O.; Miller, C.W.
1983-01-01
Environmental radiological assessments rely heavily on the use of mathematical models. The predictions of these models are inherently uncertain because these models are inexact representations of real systems. The major sources of this uncertainty are related to biases in model formulation and parameter estimation. The best approach for estimating the actual extent of over- or underprediction is model validation, a procedure that requires testing over the range of the intended realm of model application. Other approaches discussed are the use of screening procedures, sensitivity and stochastic analyses, and model comparison. The magnitude of uncertainty in model predictions is a function of the questions asked of the model and the specific radionuclides and exposure pathways of dominant importance. Estimates are made of the relative magnitude of uncertainty for situations requiring predictions of individual and collective risks for both chronic and acute releases of radionuclides. It is concluded that models developed as research tools should be distinguished from models developed for assessment applications. Furthermore, increased model complexity does not necessarily guarantee increased accuracy. To improve the realism of assessment modeling, stochastic procedures are recommended that translate uncertain parameter estimates into a distribution of predicted values. These procedures also permit the importance of model parameters to be ranked according to their relative contribution to the overall predicted uncertainty. Although confidence in model predictions can be improved through site-specific parameter estimation and increased model validation, risk factors and internal dosimetry models will probably remain important contributors to the amount of uncertainty that is irreducible.
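The recommended stochastic procedure, translating uncertain parameter estimates into a distribution of predicted values and then ranking parameters by their contribution to the overall uncertainty, can be sketched as follows; the multiplicative dose relation and the lognormal spreads are illustrative assumptions, not actual assessment-model values:

```python
import numpy as np

rng = np.random.default_rng(11)
n = 50000

# Assumed multiplicative relation: dose = intake * transfer * risk_factor,
# each parameter lognormal with a different geometric spread.
intake = rng.lognormal(np.log(1.0), 0.2, n)
transfer = rng.lognormal(np.log(0.05), 0.8, n)   # most uncertain parameter
risk_factor = rng.lognormal(np.log(0.01), 0.4, n)
dose = intake * transfer * risk_factor

# Rank parameters by the squared correlation of log-parameter with
# log-output; for a product model this recovers each variance share.
params = {"intake": intake, "transfer": transfer, "risk_factor": risk_factor}
shares = {name: np.corrcoef(np.log(p), np.log(dose))[0, 1] ** 2
          for name, p in params.items()}
ranking = sorted(shares, key=shares.get, reverse=True)
```

The ranking identifies where additional site-specific data would shrink the predicted distribution most, which is exactly the prioritization argument the abstract makes.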
A Multidisciplinary Design Optimization Model for AUV Synthetic Conceptual Design
Institute of Scientific and Technical Information of China (English)
BU Guang-zhi; ZHANG Yu-wen
2006-01-01
An autonomous undersea vehicle (AUV) is a typical complex engineering system. This paper studies the disciplines and coupled variables in AUV design with multidisciplinary design optimization (MDO) methods. The framework of AUV synthetic conceptual design is described first, and then a model with collaborative optimization is studied. Finally, an example is given to verify the validity and efficiency of MDO in AUV synthetic conceptual design.
Munoz-Carpena, R.; Muller, S. J.; Chu, M.; Kiker, G. A.; Perz, S. G.
2014-12-01
Model complexity resulting from the need to integrate environmental system components cannot be overstated. In particular, additional emphasis is urgently needed on rational approaches to guide decision making through uncertainties surrounding the integrated system across decision-relevant scales. However, in spite of the difficulties that the consideration of modeling uncertainty represents for the decision process, it should not be avoided, or the value and science behind the models will be undermined. These two issues, i.e., the need for coupled models that can answer the pertinent questions and the need for models that do so with sufficient certainty, are the key indicators of a model's relevance. Model relevance is inextricably linked with model complexity. Although model complexity has advanced greatly in recent years, there has been little work to rigorously characterize the threshold of relevance in integrated and complex models. Formally assessing the relevance of a model in the face of increasing complexity would be valuable because there is growing unease among developers and users of complex models about the cumulative effects of various sources of uncertainty on model outputs. In particular, this issue has prompted doubt over whether the considerable effort going into further elaborating complex models will in fact yield the expected payback. New approaches have been proposed recently to evaluate the uncertainty-complexity-relevance modeling trilemma (Muller, Muñoz-Carpena and Kiker, 2011) by incorporating state-of-the-art global sensitivity and uncertainty analysis (GSA/UA) in every step of the model development, so as to quantify not only the uncertainty introduced by the addition of new environmental components, but also the effect that these new components have on existing components (interactions, non-linear responses). Outputs from the analysis can also be used to quantify system resilience (stability, alternative states, thresholds or tipping
Sikorska, A. E.; Scheidegger, A.; Banasik, K.; Rieckermann, J.
2012-04-01
Keywords: uncertainty assessment, rating curve uncertainties, Bayesian inference, rainfall-runoff models, small urban basins. In hydrological flood forecasting, the problem of quantitative assessment of predictive uncertainties has been widely recognized. Despite several important findings in recent years, which helped to distinguish the uncertainty contributions from input uncertainty (e.g., due to poor rainfall data), model structure deficits, parameter uncertainties and measurement errors, uncertainty analysis still remains a challenging task. This is especially true for small urbanized basins, where monitoring data are often poor. Among other things, measurement errors have generally been assumed to be significantly smaller than the other sources of uncertainty. It has also been shown that input error and model structure deficits contribute more to the predictive uncertainties than uncertainties regarding the model parameters (Sikorska et al., 2011). These assumptions, however, are only correct when the modeled output is directly measurable in the system. Unfortunately, river discharge usually cannot be directly measured but is converted from the measured water stage with a rating curve method. The uncertainty introduced by the rating curve was shown in recent studies (Di Baldassarre et al., 2011) to be potentially significant in flood forecasting. This is especially true when extrapolating a rating curve above the measured level, which is often the case in (urban) flooding. In this work, we therefore investigated how flood predictions for small urban basins are affected by the uncertainties associated with the rating curve. To this aim, we augmented the model structure of a conceptual rainfall-runoff model to include the applied rating curve. This enabled us not only to directly model measurable water levels instead of discharges, but also to propagate the uncertainty of the rating curve through the model. To compare the importance of the rating curve to the
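Augmenting a rainfall-runoff model with the rating curve, so that stage rather than discharge is the modeled observable, lets rating-curve parameter uncertainty be propagated directly. A minimal sketch of the propagation step, using a hypothetical power-law rating curve with invented parameter uncertainties (not the study's values):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5_000

# Hypothetical power-law rating curve Q = a * (h - h0)^b with uncertain
# parameters; all numbers are illustrative placeholders.
a = rng.normal(12.0, 1.5, n)     # scale coefficient
b = rng.normal(1.8, 0.1, n)      # exponent
h0 = rng.normal(0.15, 0.02, n)   # cease-to-flow stage [m]

def discharge(h):
    """Monte Carlo ensemble of discharges for a given stage h [m]."""
    return a * np.clip(h - h0, 0, None) ** b

# Uncertainty grows when extrapolating above the gauged range
for h in (1.0, 3.0):
    q = discharge(h)
    print(f"h = {h} m: relative spread = {q.std() / q.mean():.2f}")
```

The growing relative spread at high stages mirrors the point in the abstract: extrapolating the rating curve above the measured level, common in urban flooding, inflates predictive uncertainty.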
Meteorological Uncertainty of atmospheric Dispersion model results (MUD)
DEFF Research Database (Denmark)
Havskov Sørensen, Jens; Amstrup, Bjarne; Feddersen, Henrik
The MUD project addresses assessment of uncertainties of atmospheric dispersion model predictions, as well as optimum presentation to decision makers. Previously, it has not been possible to estimate such uncertainties quantitatively, but merely to calculate the 'most likely' dispersion scenario. However, recent developments in numerical weather prediction (NWP) include probabilistic forecasting techniques, which can be utilised also for atmospheric dispersion models. The ensemble statistical methods developed and applied to NWP models aim at describing the inherent uncertainties of the meteorological model results. These uncertainties stem from e.g. limits in meteorological observations used to initialise meteorological forecast series. By perturbing the initial state of an NWP model run in agreement with the available observational data, an ensemble of meteorological forecasts is produced.
Modeling Change Over Time: Conceptualization, Measurement, Analysis, and Interpretation
2009-11-12
Report for the period ending 29-11-2008. Resources referenced include the Multilevel Modeling Portal (www.ats.ucla.edu/stat/mlm/) and the Web site of the Center for Multilevel Modeling (http://multilevel.ioe.ac.uk/index.html
Conceptual model for assessment of inhalation exposure: Defining modifying factors
Tielemans, E.; Schneider, T.; Goede, H.; Tischer, M.; Warren, N.; Kromhout, H.; Tongeren, M. van; Hemmen, J. van; Cherrie, J.W.
2008-01-01
The present paper proposes a source-receptor model to schematically describe inhalation exposure to help understand the complex processes leading to inhalation of hazardous substances. The model considers a stepwise transfer of a contaminant from the source to the receptor. The conceptual model is c
CML: the commonKADS conceptual modelling language
Schreiber, G.; Wielinga, B.J.; Akkermans, J.M.; Velde, van de W.; Anjewierden, A.A.
1994-01-01
We present a structured language for the specification of knowledge models according to the CommonKADS methodology. This language is called CML (Conceptual Modelling Language) and provides both a structured textual notation and a diagrammatic notation for expertise models. The use of our CML is illu
Moore, Lisa Simmons
This qualitative program evaluation examines the career decision-making processes and career choices of nine, African American women who participated in the Cooperating Hampton Roads Organization for Minorities in Engineering (CHROME) and who graduated from urban, rural or suburban high schools in the year 2000. The CHROME program is a nonprofit, pre-college intervention program that encourages underrepresented minority and female students to enter science, technically related, engineering, and math (STEM) career fields. The study describes career choices and decisions made by each participant over a five-year period since high school graduation. Data was collected through an Annual Report, Post High School Questionnaires, Environmental Support Questionnaires, Career Choice Questionnaires, Senior Reports, and standardized open-ended interviews. Data was analyzed using a model based on Helen C. Farmer's Conceptual Models, John Ogbu's Caste Theory and Feminist Theory. The CHROME program, based on its stated goals and tenets, was also analyzed against study findings. Findings indicated that participants received very low levels of support from counselors and teachers to pursue STEM careers and high levels of support from parents and family, the CHROME program and financial backing. Findings of this study also indicated that the majority of CHROME alumna persisted in STEM careers. The most successful participants, in terms of undergraduate degree completion and occupational prestige, were the African American women who remained single, experienced no critical incidents, came from a middle class to upper middle class socioeconomic background, and did not have children.
M.D. de Pooter (Michiel); F. Ravazzolo (Francesco); D.J.C. van Dijk (Dick)
2007-01-01
textabstractWe forecast the term structure of U.S. Treasury zero-coupon bond yields by analyzing a range of models that have been used in the literature. We assess the relevance of parameter uncertainty by examining the added value of using Bayesian inference compared to frequentist estimation
Solar Neutrino Data, Solar Model Uncertainties and Neutrino Oscillations
Krauss, L M; White, M; Krauss, Lawrence M.; Gates, Evalyn; White, Martin
1993-01-01
We incorporate all existing solar neutrino flux measurements and take solar model flux uncertainties into account in deriving global fits to parameter space for the MSW and vacuum solutions of the solar neutrino problem.
Modelling theoretical uncertainties in phenomenological analyses for particle physics
Charles, Jérôme; Niess, Valentin; Silva, Luiz Vale
2016-01-01
The determination of the fundamental parameters of the Standard Model (and its extensions) is often limited by the presence of statistical and theoretical uncertainties. We present several models for the latter uncertainties (random, nuisance, external) in the frequentist framework, and we derive the corresponding $p$-values. In the case of the nuisance approach where theoretical uncertainties are modeled as biases, we highlight the important, but arbitrary, issue of the range of variation chosen for the bias parameters. We introduce the concept of adaptive $p$-value, which is obtained by adjusting the range of variation for the bias according to the significance considered, and which allows us to tackle metrology and exclusion tests with a single and well-defined unified tool, which exhibits interesting frequentist properties. We discuss how the determination of fundamental parameters is impacted by the model chosen for theoretical uncertainties, illustrating several issues with examples from quark flavour p...
Modeling theoretical uncertainties in phenomenological analyses for particle physics
Energy Technology Data Exchange (ETDEWEB)
Charles, Jerome [CNRS, Aix-Marseille Univ, Universite de Toulon, CPT UMR 7332, Marseille Cedex 9 (France); Descotes-Genon, Sebastien [CNRS, Univ. Paris-Sud, Universite Paris-Saclay, Laboratoire de Physique Theorique (UMR 8627), Orsay Cedex (France); Niess, Valentin [CNRS/IN2P3, UMR 6533, Laboratoire de Physique Corpusculaire, Aubiere Cedex (France); Silva, Luiz Vale [CNRS, Univ. Paris-Sud, Universite Paris-Saclay, Laboratoire de Physique Theorique (UMR 8627), Orsay Cedex (France); Univ. Paris-Sud, CNRS/IN2P3, Universite Paris-Saclay, Groupe de Physique Theorique, Institut de Physique Nucleaire, Orsay Cedex (France); J. Stefan Institute, Jamova 39, P. O. Box 3000, Ljubljana (Slovenia)
2017-04-15
The determination of the fundamental parameters of the Standard Model (and its extensions) is often limited by the presence of statistical and theoretical uncertainties. We present several models for the latter uncertainties (random, nuisance, external) in the frequentist framework, and we derive the corresponding p values. In the case of the nuisance approach where theoretical uncertainties are modeled as biases, we highlight the important, but arbitrary, issue of the range of variation chosen for the bias parameters. We introduce the concept of adaptive p value, which is obtained by adjusting the range of variation for the bias according to the significance considered, and which allows us to tackle metrology and exclusion tests with a single and well-defined unified tool, which exhibits interesting frequentist properties. We discuss how the determination of fundamental parameters is impacted by the model chosen for theoretical uncertainties, illustrating several issues with examples from quark flavor physics. (orig.)
An educational model for ensemble streamflow simulation and uncertainty analysis
National Research Council Canada - National Science Library
AghaKouchak, A; Nakhjiri, N; Habib, E
2013-01-01
...) are interconnected. The educational toolbox includes a MATLAB Graphical User Interface (GUI) and an ensemble simulation scheme that can be used for teaching uncertainty analysis, parameter estimation, ensemble simulation and model sensitivity...
Solar Neutrino Data, Solar Model Uncertainties and Neutrino Oscillations
1992-01-01
We incorporate all existing solar neutrino flux measurements and take solar model flux uncertainties into account in deriving global fits to parameter space for the MSW and vacuum solutions of the solar neutrino problem.
Directory of Open Access Journals (Sweden)
Zdeslav Hrepic
2010-09-01
Full Text Available We investigated introductory physics students’ mental models of sound propagation. We used a phenomenographic method to analyze the data in the study. In addition to the scientifically accepted Wave model, students used the “Entity” model to describe the propagation of sound. In this latter model sound is a self-standing entity, different from the medium through which it propagates. All other observed alternative models contain elements of both Entity and Wave models, but at the same time are distinct from each of the constituent models. We called these models “hybrid” or “blend” models. We discuss how students use these models in various contexts before and after instruction and how our findings contribute to the understanding of conceptual change. Implications of our findings for teaching are summarized.
Uncertainty and sensitivity analysis for photovoltaic system modeling.
Energy Technology Data Exchange (ETDEWEB)
Hansen, Clifford W.; Pohl, Andrew Phillip; Jordan, Dirk
2013-12-01
We report an uncertainty and sensitivity analysis for modeling DC energy from photovoltaic systems. We consider two systems, each comprising a single module using either crystalline silicon or CdTe cells, and located either at Albuquerque, NM, or Golden, CO. Output from a PV system is predicted by a sequence of models. Uncertainty in the output of each model is quantified by empirical distributions of each model's residuals. We sample these distributions to propagate uncertainty through the sequence of models to obtain an empirical distribution for each PV system's output. We considered models that: (1) translate measured global horizontal, direct and global diffuse irradiance to plane-of-array irradiance; (2) estimate effective irradiance from plane-of-array irradiance; (3) predict cell temperature; and (4) estimate DC voltage, current and power. We found the uncertainty in PV system output to be relatively small, on the order of 1% for daily energy. Four alternative models were considered for the POA irradiance modeling step; we did not find the choice among these models to be of great significance. However, we observed that the POA irradiance model introduced a bias of upwards of 5% of daily energy, which translates directly to a systematic difference in predicted energy. Sensitivity analyses relate uncertainty in the PV system output to uncertainty arising from each model. We found the residuals arising from the POA irradiance and the effective irradiance models to be the dominant contributors to residuals for daily energy, for either technology or location considered. This analysis indicates that efforts to reduce the uncertainty in PV system output should focus on improvements to the POA and effective irradiance models.
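The residual-sampling chain described above can be sketched as follows; the residual spreads and nominal energy are hypothetical placeholders, not the study's empirical distributions:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10_000

# Hypothetical fractional residuals for each model in the sequence; in the
# study these are empirical distributions from comparing models to data.
resid_poa = rng.normal(0.0, 0.03, n)    # plane-of-array irradiance model
resid_eff = rng.normal(0.0, 0.02, n)    # effective-irradiance model
resid_temp = rng.normal(0.0, 0.005, n)  # cell-temperature model
resid_dc = rng.normal(0.0, 0.01, n)     # DC power model

nominal_energy = 5.0  # kWh/day, illustrative
energy = nominal_energy * (1 + resid_poa) * (1 + resid_eff) \
                        * (1 + resid_temp) * (1 + resid_dc)

print(f"daily energy: {energy.mean():.2f} +/- {energy.std():.2f} kWh")
# with these placeholder spreads, the irradiance steps dominate the total
```

Sampling each model's residual distribution and multiplying through the chain is what yields an empirical distribution for total system output.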
2015-03-01
ERDC/EL TN-15-1, March 2015: Developing Conceptual Models for Assessing Climate Change Impacts to Contaminant Availability in Terrestrial Ecosystems. Approved for public release; distribution is unlimited. The note concerns climate change, contaminant availability, and TER-S conservation on installations.
Pandey, S.; Vesselinov, V. V.; O'Malley, D.; Karra, S.; Hansen, S. K.
2016-12-01
Models and data are used to characterize the extent of contamination and remediation, both of which are dependent upon the complex interplay of processes ranging from geochemical reactions, microbial metabolism, and pore-scale mixing to heterogeneous flow and external forcings. Characterization is wrought with important uncertainties related to the model itself (e.g. conceptualization, model implementation, parameter values) and the data used for model calibration (e.g. sparsity, measurement errors). This research consists of two primary components: (1) Developing numerical models that incorporate the complex hydrogeology and biogeochemistry that drive groundwater contamination and remediation; (2) Utilizing novel techniques for data/model-based analyses (such as parameter calibration and uncertainty quantification) to aid in decision support for optimal uncertainty reduction related to characterization and remediation of contaminated sites. The reactive transport models are developed using PFLOTRAN and are capable of simulating a wide range of biogeochemical and hydrologic conditions that affect the migration and remediation of groundwater contaminants under diverse field conditions. Data/model-based analyses are achieved using MADS, which utilizes Bayesian methods and Information Gap theory to address the data/model uncertainties discussed above. We also use these tools to evaluate different models, which vary in complexity, in order to weigh and rank models based on model accuracy (in representation of existing observations), model parsimony (everything else being equal, models with smaller number of model parameters are preferred), and model robustness (related to model predictions of unknown future states). These analyses are carried out on synthetic problems, but are directly related to real-world problems; for example, the modeled processes and data inputs are consistent with the conditions at the Los Alamos National Laboratory contamination sites (RDX and
Directory of Open Access Journals (Sweden)
H. Bormann
2005-01-01
Full Text Available Many model applications suffer from the fact that although it is well known that model application implies different sources of uncertainty there is no objective criterion to decide whether a model is suitable for a particular application or not. This paper introduces a comparative index between the uncertainty of a model and the change effects of scenario calculations which enables the modeller to objectively decide about suitability of a model to be applied in scenario analysis studies. The index is called "signal-to-noise-ratio", and it is applied for an exemplary scenario study which was performed within the GLOWA-IMPETUS project in Benin. The conceptual UHP model was applied on the upper Ouémé basin. Although model calibration and validation were successful, uncertainties on model parameters and input data could be identified. Applying the "signal-to-noise-ratio" on regional scale subcatchments of the upper Ouémé comparing water availability indicators for uncertainty studies and scenario analyses the UHP model turned out to be suitable to predict long-term water balances under the present poor data availability and changing environmental conditions in subhumid West Africa.
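A signal-to-noise ratio of this kind compares the magnitude of the projected scenario change (the signal) against the spread attributable to model uncertainty (the noise). A minimal sketch with invented indicator values, not data from the Ouémé basin:

```python
import numpy as np

# Illustrative water-availability indicator values (e.g. mm/yr): the
# reference runs span the model's parameter/input uncertainty, and the
# scenario runs give the projected change.
reference_runs = np.array([102.0, 98.0, 101.0, 99.5, 100.5])
scenario_runs = np.array([88.0, 84.0, 87.5, 85.0, 86.0])

signal = abs(scenario_runs.mean() - reference_runs.mean())  # change effect
noise = reference_runs.std(ddof=1)                          # model uncertainty

snr = signal / noise
print(f"signal-to-noise ratio: {snr:.1f}")
# snr well above 1: the projected change exceeds the model uncertainty,
# so the model is judged suitable for this scenario question
```

When the ratio drops toward or below 1, the scenario effect is indistinguishable from model noise and scenario conclusions should not be drawn.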
Directory of Open Access Journals (Sweden)
Francisco Moreno
2010-07-01
Full Text Available Today, thanks to global positioning system technologies and mobile devices equipped with tracking sensors, a large amount of data about moving objects can be collected, e.g., spatio-temporal data related to the movement followed by those objects. On the other hand, data warehouses, usually modeled using a multidimensional view of data, are specialized databases to support the decision-making process. Unfortunately, conventional data warehouses are mainly oriented to managing alphanumeric data. In this article, we incorporate temporal elements into a conceptual spatial multidimensional model, resulting in a spatio-temporal multidimensional model. We illustrate our proposal with a case study related to animal migration.
Model and parameter uncertainty in IDF relationships under climate change
Chandra, Rupa; Saha, Ujjwal; Mujumdar, P. P.
2015-05-01
Quantifying the distributional behavior of extreme events is crucial in hydrologic design. Intensity-Duration-Frequency (IDF) relationships are used extensively in engineering, especially in urban hydrology, to obtain the return level of an extreme rainfall event for a specified return period and duration. Major sources of uncertainty in IDF relationships are insufficient quantity and quality of data, leading to parameter uncertainty in the distribution fitted to the data, and uncertainty as a result of using multiple GCMs. It is important to study these uncertainties and propagate them to the future for accurate assessment of future return levels. The objective of this study is to quantify the uncertainties arising from the parameters of the distribution fitted to data and from the multiple GCMs using a Bayesian approach. The posterior distribution of parameters is obtained from Bayes' rule, and the parameters are transformed to obtain return levels for a specified return period. A Markov Chain Monte Carlo (MCMC) method using the Metropolis-Hastings algorithm is used to obtain the posterior distribution of parameters. Twenty-six CMIP5 GCMs along with four RCP scenarios are considered for studying the effects of climate change and to obtain projected IDF relationships for the case study of Bangalore city in India. GCM uncertainty due to the use of multiple GCMs is treated using the Reliability Ensemble Averaging (REA) technique along with the parameter uncertainty. Scale invariance theory is employed for obtaining short-duration return levels from daily data. It is observed that the uncertainty in short-duration rainfall return levels is high compared to longer durations. Further, it is observed that parameter uncertainty is large compared to the model uncertainty.
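The core machinery, a Metropolis-Hastings random walk over distribution parameters followed by transformation of each posterior draw into a return level, can be sketched as below. For simplicity the sketch uses a synthetic Gumbel sample and a flat prior; the study fits observed annual maxima and treats GCM uncertainty separately via REA:

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic annual-maximum rainfall (mm); a real study would use station data
true_loc, true_scale = 50.0, 12.0
data = true_loc - true_scale * np.log(-np.log(rng.uniform(size=60)))

def log_post(loc, scale):
    """Gumbel log-likelihood with a flat prior (illustrative choice)."""
    if scale <= 0:
        return -np.inf
    z = (data - loc) / scale
    return np.sum(-np.log(scale) - z - np.exp(-z))

# Metropolis-Hastings random walk over (loc, scale)
theta = np.array([40.0, 10.0])
lp = log_post(*theta)
samples = []
for _ in range(20_000):
    prop = theta + rng.normal(0, [1.0, 0.5])
    lp_prop = log_post(*prop)
    if np.log(rng.uniform()) < lp_prop - lp:   # accept/reject step
        theta, lp = prop, lp_prop
    samples.append(theta)
post = np.array(samples[5_000:])  # discard burn-in

# Transform each posterior draw into a 100-year return level
T = 100
rl = post[:, 0] - post[:, 1] * np.log(-np.log(1 - 1 / T))
print("100-yr return level (5/50/95%):", np.percentile(rl, [5, 50, 95]).round(1))
```

The percentile band on the return level is exactly the parameter-uncertainty band the abstract describes; it widens for short durations when scale invariance is used to disaggregate daily data.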
Uncertainty analysis for a field-scale P loss model
Models are often used to predict phosphorus (P) loss from agricultural fields. While it is commonly recognized that model predictions are inherently uncertain, few studies have addressed prediction uncertainties using P loss models. In this study we assessed the effect of model input error on predic...
A conceptual model for assessing the impact of electronic procurement
Boer, de Luitzen; Harink, Jeroen; Heijboer, Govert
2002-01-01
This paper aims to contribute to the development of a conceptual model for studying the direct and indirect impact of various forms of electronic procurement (EP) on a firm's integral purchasing (-related) costs. The model builds on existing classifications of purchasing-related costs and benefits a
A Conceptual Model for Episodes of Acute, Unscheduled Care.
Pines, Jesse M; Lotrecchiano, Gaetano R; Zocchi, Mark S; Lazar, Danielle; Leedekerken, Jacob B; Margolis, Gregg S; Carr, Brendan G
2016-10-01
We engaged in a 1-year process to develop a conceptual model representing an episode of acute, unscheduled care. Acute, unscheduled care includes acute illnesses (eg, nausea and vomiting), injuries, or exacerbations of chronic conditions (eg, worsening dyspnea in congestive heart failure) and is delivered in emergency departments, urgent care centers, and physicians' offices, as well as through telemedicine. We began with a literature search to define an acute episode of care and to identify existing conceptual models used in health care. In accordance with this information, we then drafted a preliminary conceptual model and collected stakeholder feedback, using online focus groups and concept mapping. Two technical expert panels reviewed the draft model, examined the stakeholder feedback, and discussed ways the model could be improved. After integrating the experts' comments, we solicited public comment on the model and made final revisions. The final conceptual model includes social and individual determinants of health that influence the incidence of acute illness and injury, factors that affect care-seeking decisions, specific delivery settings where acute care is provided, and outcomes and costs associated with the acute care system. We end with recommendations for how researchers, policymakers, payers, patients, and providers can use the model to identify and prioritize ways to improve acute care delivery.
A new conceptual model for aeolian transport rates on beaches
De Vries, S.; Stive, M.J.F.; Van Rijn, L.; Ranasinghe, R.
2012-01-01
In this paper a new conceptual model for aeolian sediment transport rates is presented. Traditional sediment transport formulations have known limitations when applied to coastal beach situations. A linear model for sediment transport rates with respect to wind speed is proposed and supported by
Conceptual Frameworks and Research Models on Resilience in Leadership
Janet Ledesma
2014-01-01
The purpose of this article was to discuss conceptual frameworks and research models on resilience theory. The constructs of resilience, the history of resilience theory, models of resilience, variables of resilience, career resilience, and organizational resilience will be examined and discussed as they relate to leadership development. The literature demonstrates that there is a direct relationship between the stress...
Based Instructional Model on Students' Conceptual Change and ...
African Journals Online (AJOL)
FIRST LADY
instructional model-Generative Learning Model (GLM) on students' conceptual change ... today's work force requires people who could think and have acquired the ... education is one of the most powerful instrument for enabling all members of ... psychology and education such as developmental psychology, cognitive and.
Conceptual Model of Artifacts for Design Science Research
DEFF Research Database (Denmark)
Bækgaard, Lars
2015-01-01
We present a conceptual model of design science research artifacts. The model views an artifact at three levels. At the artifact level a selected artifact is viewed as a combination of material and immaterial aspects and a set of representations hereof. At the design level the selected artifact...
Assessment of parametric uncertainty for groundwater reactive transport modeling,
Shi, Xiaoqing; Ye, Ming; Curtis, Gary P.; Miller, Geoffery L.; Meyer, Philip D.; Kohler, Matthias; Yabusaki, Steve; Wu, Jichun
2014-01-01
The validity of using Gaussian assumptions for model residuals in uncertainty quantification of a groundwater reactive transport model was evaluated in this study. Least squares regression methods explicitly assume Gaussian residuals, and the assumption leads to Gaussian likelihood functions, model parameters, and model predictions. While the Bayesian methods do not explicitly require the Gaussian assumption, Gaussian residuals are widely used. This paper shows that the residuals of the reactive transport model are non-Gaussian, heteroscedastic, and correlated in time; characterizing them requires using a generalized likelihood function such as the formal generalized likelihood function developed by Schoups and Vrugt (2010). For the surface complexation model considered in this study for simulating uranium reactive transport in groundwater, parametric uncertainty is quantified using the least squares regression methods and Bayesian methods with both Gaussian and formal generalized likelihood functions. While the least squares methods and Bayesian methods with Gaussian likelihood function produce similar Gaussian parameter distributions, the parameter distributions of Bayesian uncertainty quantification using the formal generalized likelihood function are non-Gaussian. In addition, predictive performance of the formal generalized likelihood function is superior to that of least squares regression and Bayesian methods with Gaussian likelihood function. The Bayesian uncertainty quantification is conducted using the differential evolution adaptive Metropolis (DREAM(ZS)) algorithm; as a Markov chain Monte Carlo (MCMC) method, it is a robust tool for quantifying uncertainty in groundwater reactive transport models. For the surface complexation model, the regression-based local sensitivity analysis and Morris- and DREAM(ZS)-based global sensitivity analysis yield almost identical ranking of parameter importance. The uncertainty analysis may help select appropriate likelihood
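The contrast between a Gaussian likelihood and a heavier-tailed alternative can be illustrated on synthetic residuals. This sketch uses a Student-t density as a stand-in for a generalized likelihood (the paper uses the formal generalized likelihood of Schoups and Vrugt, which additionally handles heteroscedasticity and autocorrelation); all values are illustrative:

```python
import math
import numpy as np

rng = np.random.default_rng(3)

# Heavy-tailed synthetic residuals, mimicking the non-Gaussian residuals
# reported for the reactive transport model
resid = rng.standard_t(df=3, size=500)

def gauss_loglik(r, sigma):
    """Log-likelihood under a zero-mean Gaussian error model."""
    return np.sum(-0.5 * np.log(2 * np.pi * sigma**2) - r**2 / (2 * sigma**2))

def student_t_loglik(r, nu, sigma):
    """Log-likelihood under a heavier-tailed Student-t error model."""
    c = (math.lgamma((nu + 1) / 2) - math.lgamma(nu / 2)
         - 0.5 * math.log(nu * math.pi) - math.log(sigma))
    return np.sum(c - (nu + 1) / 2 * np.log1p((r / sigma)**2 / nu))

sigma_hat = resid.std()
print("Gaussian  log-likelihood:", round(gauss_loglik(resid, sigma_hat), 1))
print("Student-t log-likelihood:", round(student_t_loglik(resid, 3, 1.0), 1))
# the heavier-tailed model fits these residuals better (higher log-likelihood)
```

The same comparison, embedded in an MCMC sampler, is what drives the non-Gaussian posterior distributions and superior predictive performance the abstract reports for the generalized likelihood.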
Uncertainty Quantification and Validation for RANS Turbulence Models
Oliver, Todd; Moser, Robert
2011-11-01
Uncertainty quantification and validation procedures for RANS turbulence models are developed and applied. The procedures used here rely on a Bayesian view of probability. In particular, the uncertainty quantification methodology requires stochastic model development, model calibration, and model comparison, all of which are pursued using tools from Bayesian statistics. Model validation is also pursued in a probabilistic framework. The ideas and processes are demonstrated on a channel flow example. Specifically, a set of RANS models--including Baldwin-Lomax, Spalart-Allmaras, k-ɛ, k-ω, and v2-f--and uncertainty representations are analyzed using DNS data for fully-developed channel flow. Predictions of various quantities of interest and the validity (or invalidity) of the various models for making those predictions will be examined. This work is supported by the Department of Energy [National Nuclear Security Administration] under Award Number [DE-FC52-08NA28615].
Structural uncertainty in watershed phosphorus modeling: Toward a stochastic framework
Chen, Lei; Gong, Yongwei; Shen, Zhenyao
2016-06-01
Structural uncertainty is an important source of model predictive errors, but few studies have examined how error propagates from model structure to nonpoint source (NPS) prediction. In this study, we focused on the structural uncertainty caused by the algorithms and equations that are used to describe the phosphorus (P) cycle at the watershed scale. The sensitivity of simulated P to each algorithm/equation was quantified using the Soil and Water Assessment Tool (SWAT) in the Three Gorges Reservoir Area, China. The results indicated that the C:N and P:N ratios of humic materials, as well as the algorithms for fertilization and P leaching, contributed the largest output uncertainties. In comparison, the initialization of inorganic P in the soil layer and the transformation algorithms between P pools are less influential for the NPS-P predictions. In addition, the coefficient of variation values were quantified as 0.028-0.086, indicating that the structure-induced uncertainty is minor compared to the NPS-P prediction uncertainty caused by the model input and parameters. Using the stochastic framework, the cumulative probability of simulated NPS-P data provided a trade-off between expenditure burden and desired risk. In this sense, this paper provides valuable information for the control of model structural uncertainty, and its approach can be extrapolated to other model-based studies.
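The structural-uncertainty metric quoted above, the coefficient of variation across alternative model structures, is simple to compute. The predictions below are invented for illustration only:

```python
# Coefficient of variation (stdev/mean) of predictions produced by
# alternative model structures. The NPS-P load values are notional,
# not from the SWAT study described above.
import statistics

predictions = [12.1, 12.8, 11.9, 12.5, 13.0]   # loads from 5 structural variants
cv = statistics.stdev(predictions) / statistics.mean(predictions)
print(f"CV = {cv:.3f}")
```

A CV of a few percent, as found in the study, indicates that the spread across structures is small relative to the mean prediction.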
Design of Conceptual Model in Digital Map Database
Institute of Scientific and Technical Information of China (English)
Anonymous
2002-01-01
The components of map information are analyzed theoretically in this paper; map information mainly includes spatial information, attributive information and temporal characteristics information. Then the digital map entity is defined according to the construction characteristics of the map information. Finally, on the basis of the analyses of the construction characteristics of the digital map entity and the present conceptual model of digital map databases, an abstracted conceptual model of the digital map database is presented, and the Normal Form theory of relational databases is discussed in particular.
Motivation to Improve Work through Learning: A Conceptual Model
Directory of Open Access Journals (Sweden)
Kueh Hua Ng
2014-12-01
Full Text Available This study aims to enhance our current understanding of the transfer of training by proposing a conceptual model that supports the mediating role of motivation to improve work through learning in the relationship between social support and the transfer of training. The examination of the motivation-to-improve-work-through-learning construct offers a holistic view of a learner's profile in a workplace setting, which emphasizes learning for the improvement of work performance. The proposed conceptual model is expected to benefit human resource development theory building, as well as field practitioners, by emphasizing the motivational aspects crucial for successful transfer of training.
Sensitivities and uncertainties of modeled ground temperatures in mountain environments
Directory of Open Access Journals (Sweden)
S. Gubler
2013-08-01
Full Text Available Model evaluation is often performed at few locations due to the lack of spatially distributed data. Since the quantification of model sensitivities and uncertainties can be performed independently of ground truth measurements, these analyses are suitable to test the influence of environmental variability on model evaluation. In this study, the sensitivities and uncertainties of a physically based mountain permafrost model are quantified within an artificial topography. The setting consists of different elevations and exposures combined with six ground types characterized by porosity and hydraulic properties. The analyses are performed for all combinations of these factors, which allows for quantification of the variability of model sensitivities and uncertainties within a whole modeling domain. We found that model sensitivities and uncertainties vary strongly depending on input factors such as topography or soil type. The analysis shows that model evaluation performed at single locations may not be representative for the whole modeling domain. For example, the sensitivity of modeled mean annual ground temperature to ground albedo ranges between 0.5 and 4 °C depending on elevation, aspect and ground type. South-exposed inclined locations are more sensitive to changes in ground albedo than north-exposed slopes since they receive more solar radiation. The sensitivity to ground albedo increases with decreasing elevation due to the shorter duration of the snow cover. The sensitivity to the hydraulic properties changes considerably for different ground types: rock or clay, for instance, are not sensitive to uncertainties in the hydraulic properties, while for gravel or peat, accurate estimates of the hydraulic properties significantly improve modeled ground temperatures. The discretization of ground, snow and time has an impact on modeled mean annual ground temperature (MAGT) that cannot be neglected (more than 1 °C for several
Three-dimensional conceptual model for service-oriented simulation
Institute of Scientific and Technical Information of China (English)
Wen-guang WANG; Wei-ping WANG; Justyna ZANDER; Yi-fan ZHU
2009-01-01
In this letter, we propose a novel three-dimensional conceptual model for an emerging service-oriented simulation paradigm. The model can be used as a guideline or an analytic means to find the potential and possible future directions of the current simulation frameworks. In particular, the model inspects the crossover between the disciplines of modeling and simulation, service-orientation, and software/systems engineering. Finally, two specific simulation frameworks are studied as examples.
A new conceptual model for aeolian transport rates on beaches
de Vries, S.; Stive, M.J.F.; van Rijn, L.; Ranasinghe, R.
2012-01-01
In this paper a new conceptual model for aeolian sediment transport rates is presented. Traditional sediment transport formulations have known limitations when applied to coastal beach situations. A linear model for sediment transport rates with respect to wind speed is proposed and supported by both data and numerical model simulations. The presented model does not solve complex wind fields and is therefore very easily applicable. Physical principles such as the presence of a threshold veloc...
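The linear-with-threshold idea described in this abstract can be sketched in a few lines. The rate constant and threshold velocity below are illustrative placeholders, not values from the paper:

```python
# Sketch of a linear aeolian transport model: flux grows linearly with wind
# speed above a threshold velocity and is zero below it. C and u_th are
# made-up illustrative values, not calibrated coefficients.
def aeolian_flux(u, u_th=5.0, C=0.02):
    """Notional sediment transport rate as a linear function of wind speed u (m/s)."""
    return C * (u - u_th) if u > u_th else 0.0

print(aeolian_flux(4.0))    # below threshold: no transport
print(aeolian_flux(10.0))   # above threshold: linear response
```

The appeal of such a formulation, as the abstract notes, is that it needs no resolved wind field, only a wind speed and two parameters.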
Comparison of different uncertainty techniques in urban stormwater quantity and quality modelling.
Dotto, Cintia B S; Mannina, Giorgio; Kleidorfer, Manfred; Vezzaro, Luca; Henrichs, Malte; McCarthy, David T; Freni, Gabriele; Rauch, Wolfgang; Deletic, Ana
2012-05-15
Urban drainage models are important tools used by both practitioners and scientists in the field of stormwater management. These models are often conceptual and usually require calibration using local datasets. The quantification of the uncertainty associated with the models is a must, although it is rarely practiced. The International Working Group on Data and Models, which works under the IWA/IAHR Joint Committee on Urban Drainage, has been working on the development of a framework for defining and assessing uncertainties in the field of urban drainage modelling. A part of that work is the assessment and comparison of different techniques generally used in the uncertainty assessment of the parameters of water models. This paper compares a number of these techniques: the Generalized Likelihood Uncertainty Estimation (GLUE), the Shuffled Complex Evolution Metropolis algorithm (SCEM-UA), an approach based on a multi-objective auto-calibration (a multialgorithm, genetically adaptive multi-objective method, AMALGAM) and a Bayesian approach based on a simplified Markov Chain Monte Carlo method (implemented in the software MICA). To allow a meaningful comparison among the different uncertainty techniques, common criteria have been set for the likelihood formulation, the number of simulations, and the measure of uncertainty bounds. Moreover, all the uncertainty techniques were implemented for the same case study, in which the same stormwater quantity and quality model was used alongside the same dataset. The comparison results for a well-posed rainfall/runoff model showed that the four methods provide similar probability distributions of model parameters and model prediction intervals. For the ill-posed water quality model, the differences between the results were much wider, and the paper discusses the specific advantages and disadvantages of each method. In relation to computational efficiency (i.e. number of iterations required to generate the probability
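Of the techniques compared above, GLUE is the simplest to illustrate. The sketch below applies it to a toy one-parameter runoff model; the rainfall and observation values, the Nash-Sutcliffe behavioral threshold of 0.5, and the model itself are all invented for illustration:

```python
# Minimal GLUE (Generalized Likelihood Uncertainty Estimation) sketch on a
# toy one-parameter runoff model q = a * rain. All numbers are illustrative.
import random

random.seed(0)
rain = [1.0, 2.0, 3.0, 4.0]
obs  = [0.5, 1.1, 1.4, 2.1]          # synthetic "observations"

def nse(sim, obs):
    """Nash-Sutcliffe efficiency, used here as the informal likelihood measure."""
    mean_obs = sum(obs) / len(obs)
    num = sum((s - o) ** 2 for s, o in zip(sim, obs))
    den = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - num / den

behavioral = []
for _ in range(2000):
    a = random.uniform(0.0, 1.0)     # sample from a uniform prior
    score = nse([a * r for r in rain], obs)
    if score > 0.5:                  # behavioral threshold
        behavioral.append(a)

params = sorted(behavioral)
lo, hi = params[int(0.05 * len(params))], params[int(0.95 * len(params))]
print(f"{len(behavioral)} behavioral samples, 90% band for a: [{lo:.2f}, {hi:.2f}]")
```

All parameter sets above the behavioral threshold are retained, and their spread defines the parameter uncertainty band; SCEM-UA, AMALGAM and MICA differ mainly in how they search the parameter space and weight the samples.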
UNCERTAINTY SUPPLY CHAIN MODEL AND TRANSPORT IN ITS DEPLOYMENTS
Directory of Open Access Journals (Sweden)
Fabiana Lucena Oliveira
2014-05-01
Full Text Available This article discusses the Uncertainty Supply Chain Model and proposes a matrix matching supply chains with the transportation modes best suited to them. From the detailed analysis of the uncertainty matrix, the transportation modes best suited to the management of these chains are suggested, so that transport optimizes the gains proposed by the original model, particularly when supply chains are distant from suppliers of raw materials and/or supplies. We analyze in detail Agile Supply Chains, which result from the Uncertainty Supply Chain Model, with special attention to the Manaus Industrial Center. This research was done at the Manaus Industrial Pole, a model of industrial agglomeration based in Manaus, State of Amazonas (Brazil), which encompasses different supply chains and strategies sharing the same infrastructure for transport, handling, storage and clearance, and uses inbound logistics for suppliers of raw material. The state of the art covers supply chain management, the Uncertainty Supply Chain Model, agile supply chains, the Manaus Industrial Center (MIC) and Brazilian legislation, as a business case, and presents the concepts and features of each. The main goal is to present and discuss how transport is able to support the Uncertainty Supply Chain Model, in order to complete the management model. The results obtained confirm the hypothesis that integrated logistics processes are able to guarantee attractiveness for industrial agglomerations, and open discussions for cases where the suppliers are far from the manufacturing center.
Uncertainty quantification of squeal instability via surrogate modelling
Nobari, Amir; Ouyang, Huajiang; Bannister, Paul
2015-08-01
One of the major issues that car manufacturers are facing is the noise and vibration of brake systems. Of the different sorts of noise and vibration that a brake system may generate, squeal, an irritating high-frequency noise, costs the manufacturers significantly. Despite considerable research that has been conducted on brake squeal, the root cause of squeal is still not fully understood. The most common assumption, however, is mode-coupling. Complex eigenvalue analysis is the most widely used approach to the analysis of brake squeal problems. One of the major drawbacks of this technique, nevertheless, is that the effects of variability and uncertainty are not included in the results. Uncertainty and variability are two inseparable parts of any brake system. Uncertainty is mainly caused by friction, contact, wear and thermal effects, while variability mostly stems from the manufacturing process, material properties and component geometries. Evaluating the effects of uncertainty and variability in the complex eigenvalue analysis improves the predictability of noise propensity and helps produce a more robust design. The biggest hurdle in the uncertainty analysis of brake systems is the computational cost and time. Most uncertainty analysis techniques rely on the results of many deterministic analyses. A full finite element model of a brake system typically consists of millions of degrees-of-freedom and many load cases. The running time of such models is so long that the automotive industry is reluctant to do many deterministic analyses. This paper, instead, proposes an efficient method of uncertainty propagation via surrogate modelling. A surrogate model of a brake system is constructed in order to reproduce the outputs of the large-scale finite element model and overcome the issue of computational workloads. The probability distribution of the real part of an unstable mode can then be obtained by using the surrogate model with a massive saving of
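The surrogate-modelling idea described above can be sketched on a toy problem. The quadratic "expensive model" below stands in for a large finite element analysis and has nothing to do with an actual brake model; the point is only the workflow of fitting a cheap approximation from a few runs and sampling it instead:

```python
# Surrogate-modelling sketch: fit a cheap quadratic surrogate to three runs
# of an "expensive" model, then propagate input uncertainty through the
# surrogate. The expensive model is a toy stand-in, not a brake FE model.
import random

def expensive_model(mu):
    # stand-in for a large finite element analysis of one input mu
    return 1.0 + 3.0 * mu + 2.0 * mu**2

# Three design points, equally spaced with step h = 0.5
pts = [0.0, 0.5, 1.0]
y = [expensive_model(p) for p in pts]

# Exact quadratic interpolation y = c0 + c1*mu + c2*mu^2
c0 = y[0]
c2 = (y[2] - 2 * y[1] + y[0]) / (2 * 0.5**2)
c1 = (y[2] - y[0]) / (pts[2] - pts[0]) - c2 * (pts[2] + pts[0])

def surrogate(mu):
    return c0 + c1 * mu + c2 * mu**2

# Monte Carlo over the cheap surrogate instead of the expensive model
random.seed(3)
samples = [surrogate(random.gauss(0.5, 0.1)) for _ in range(10000)]
print(f"surrogate mean response: {sum(samples) / len(samples):.3f}")
```

In practice the surrogate (polynomial chaos, kriging, etc.) is fitted over many inputs, but the saving is the same: thousands of surrogate evaluations replace thousands of full finite element runs.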
Impact of inherent meteorology uncertainty on air quality model predictions
It is well established that there are a number of different classifications and sources of uncertainties in environmental modeling systems. Air quality models rely on two key inputs, namely, meteorology and emissions. When using air quality models for decision making, it is impor...
Quantification of Modelling Uncertainties in Turbulent Flow Simulations
Edeling, W.N.
2015-01-01
The goal of this thesis is to make predictive simulations with Reynolds-Averaged Navier-Stokes (RANS) turbulence models, i.e. simulations with a systematic treatment of model and data uncertainties and their propagation through a computational model to produce predictions of quantities of interest
Uncertainty quantification in Rothermel's Model using an efficient sampling method
Edwin Jimenez; M. Yousuff Hussaini; Scott L. Goodrick
2007-01-01
The purpose of the present work is to quantify parametric uncertainty in Rothermel's wildland fire spread model (implemented in software such as BehavePlus3 and FARSITE), which is undoubtedly among the most widely used fire spread models in the United States. This model consists of a nonlinear system of equations that relates environmental variables (input parameter...
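Parametric uncertainty quantification of the kind described above amounts to sampling the uncertain inputs and propagating them through the model. The spread-rate function below is a toy stand-in, not Rothermel's actual equation set, and the input ranges are invented:

```python
# Illustrative Monte Carlo propagation of input uncertainty through a toy
# spread-rate function (NOT Rothermel's equations): rate grows with wind
# speed and drops with fuel moisture. Ranges and coefficients are made up.
import random
import statistics

random.seed(42)

def toy_spread_rate(wind, moisture):
    return 2.0 * (1.0 + 0.3 * wind) / (1.0 + 5.0 * moisture)

samples = [toy_spread_rate(random.uniform(2, 8),      # wind speed (m/s)
                           random.uniform(0.05, 0.25))  # fuel moisture fraction
           for _ in range(5000)]
print(f"mean = {statistics.mean(samples):.2f}, stdev = {statistics.stdev(samples):.2f}")
```

Efficient sampling methods such as the one referenced in the abstract aim to reach a stable output distribution with far fewer model evaluations than this brute-force loop.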
Energy Technology Data Exchange (ETDEWEB)
Ericsson, Lars O. (Lars O. Ericsson Consulting AB (Sweden)); Holmen, Johan (Golder Associates (Sweden))
2010-12-15
The primary aim of this report is: - To present a supplementary, in-depth evaluation of certain conceptual simplifications, descriptions and model uncertainties in conjunction with regional groundwater simulation, which in the first instance refer to model depth, topography, groundwater table level and boundary conditions. Implementation was based on geo-scientifically available data compilations from the Smaaland region, but different conceptual assumptions have been analysed.
Immersive Data Comprehension: Visualizing Uncertainty in Measurable Models
Directory of Open Access Journals (Sweden)
Pere eBrunet
2015-09-01
Full Text Available Recent advances in 3D scanning technologies have opened new possibilities in a broad range of applications including cultural heritage, medicine, civil engineering and urban planning. Virtual Reality systems can provide new tools to professionals that want to understand acquired 3D models. In this paper, we review the concept of data comprehension with an emphasis on visualization and inspection tools on immersive setups. We claim that in most application fields, data comprehension requires model measurements which in turn should be based on the explicit visualization of uncertainty. As 3D digital representations are not faithful, information on their fidelity at local level should be included in the model itself as uncertainty bounds. We propose the concept of Measurable 3D Models as digital models that explicitly encode local uncertainty bounds related to their quality. We claim that professionals and experts can strongly benefit from immersive interaction through new specific, fidelity-aware measurement tools which can facilitate 3D data comprehension. Since noise and processing errors are ubiquitous in acquired datasets, we discuss the estimation, representation and visualization of data uncertainty. We show that, based on typical user requirements in Cultural Heritage and other domains, application-oriented measuring tools in 3D models must consider uncertainty and local error bounds. We also discuss the requirements of immersive interaction tools for the comprehension of huge 3D and nD datasets acquired from real objects.
Multilevel Models: Conceptual Framework and Applicability
Directory of Open Access Journals (Sweden)
Roxana-Otilia-Sonia Hrițcu
2015-10-01
Full Text Available Individuals and the social or organizational groups they belong to can be viewed as a hierarchical system situated on different levels. Individuals are situated on the first level of the hierarchy and they are nested together on the higher levels. Individuals interact with the social groups they belong to and are influenced by these groups. Traditional methods that study the relationships between data, like simple regression, do not take into account the hierarchical structure of the data and the effects of a group membership and, hence, results may be invalidated. Unlike standard regression modelling, the multilevel approach takes into account the individuals as well as the groups to which they belong. To take advantage of the multilevel analysis it is important that we recognize the multilevel characteristics of the data. In this article we introduce the outlines of multilevel data and we describe the models that work with such data. We introduce the basic multilevel model, the two-level model: students can be nested into classes, individuals into countries and the general two-level model can be extended very easily to several levels. Multilevel analysis has begun to be extensively used in many research areas. We present the most frequent study areas where multilevel models are used, such as sociological studies, education, psychological research, health studies, demography, epidemiology, biology, environmental studies and entrepreneurship. We support the idea that since hierarchies exist everywhere, multilevel data should be recognized and analyzed properly by using multilevel modelling.
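The two-level random-intercept structure described above can be made concrete with a small simulation. Everything below (grand mean, variance components, group sizes) is an invented illustration, and the variance decomposition is a deliberately naive approximation rather than a proper mixed-model fit:

```python
# Sketch of a two-level random-intercept model, y_ij = g00 + u_j + e_ij:
# students (level 1) nested in classes (level 2). All values illustrative.
import random
import statistics

random.seed(1)
g00 = 50.0                         # grand mean
scores, groups = [], []
for j in range(30):                # 30 classes
    u_j = random.gauss(0, 4)       # class-level effect, sd = 4
    for i in range(25):            # 25 students per class
        scores.append(g00 + u_j + random.gauss(0, 8))   # student-level noise, sd = 8
        groups.append(j)

# Naive variance decomposition: between-class vs total
class_means = [statistics.mean(s for s, g in zip(scores, groups) if g == j)
               for j in range(30)]
between = statistics.variance(class_means)     # inflated slightly by sampling noise
total = statistics.variance(scores)
icc = between / total                          # rough intraclass correlation
print(f"approximate intraclass correlation: {icc:.2f}")
```

A non-trivial intraclass correlation like this is exactly the situation where simple regression misstates standard errors and a multilevel model is warranted.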
Reducing uncertainty in calibrating aquifer flow model with multiple scales of heterogeneity.
Zhang, Ye
2014-01-01
Modeling and calibration of natural aquifers with multiple scales of heterogeneity is a challenging task due to limited subsurface access. While computer modeling plays an essential role in aquifer studies, large uncertainty exists in developing a conceptual model of an aquifer and in calibrating the model for decision making. Due to uncertainties such as a lack of understanding of subsurface processes and a lack of techniques to parameterize the subsurface environment (including hydraulic conductivity, source/sink rate, and aquifer boundary conditions), existing aquifer models often suffer nonuniqueness in calibration, leading to poor predictive capability. A robust calibration methodology is needed that can address the simultaneous estimations of aquifer parameters, source/sink, and boundary conditions. In this paper, we propose a multistage and multiscale approach that addresses subsurface heterogeneity at multiple scales, while reducing uncertainty in estimating the model parameters and model boundary conditions. The key to this approach lies in the appropriate development, verification, and synthesis of existing and new techniques of static and dynamic data integration. In particular, based on a given set of observation data, new inversion techniques can be first used to estimate aquifer large-scale effective parameters and smoothed boundary conditions, based on which parameter and boundary condition estimation can be refined at increasing detail using standard or highly parameterized estimation techniques.
An independent verification and validation of the Future Theater Level Model conceptual model
Energy Technology Data Exchange (ETDEWEB)
Hartley, D.S. III; Kruse, K.L.; Martellaro, A.J.; Packard, S.L.; Thomas, B. Jr.; Turley, V.K.
1994-08-01
This report describes the methodology and results of independent verification and validation performed on a combat model in its design stage. The combat model is the Future Theater Level Model (FTLM), under development by The Joint Staff/J-8. J-8 has undertaken its development to provide an analysis tool that addresses the uncertainties of combat more directly than previous models and yields more rapid study results. The methodology adopted for this verification and validation consisted of document analyses. Included were detailed examination of the FTLM design documents (at all stages of development), the FTLM Mission Needs Statement, and selected documentation for other theater level combat models. These documents were compared to assess the FTLM as to its design stage, its purpose as an analytical combat model, and its capabilities as specified in the Mission Needs Statement. The conceptual design passed those tests. The recommendations included specific modifications as well as a recommendation for continued development. The methodology is significant because independent verification and validation have not been previously reported as being performed on a combat model in its design stage. The results are significant because The Joint Staff/J-8 will be using the recommendations from this study in determining whether to proceed with development of the model.
Virtual Business Collaboration Conceptual Knowledge Model (VBCKM
Directory of Open Access Journals (Sweden)
Morcous Massoud Yassa
2012-07-01
Full Text Available Within the context of virtual business collaboration modeling, many previous works have considered some essential virtual business collaborative models. A practical dynamic virtual organization may be a combination of those models and some other elemental features, with some modifications to meet the business opportunity requirements. Therefore, some guidelines and rules are needed to help in constructing a practical collaboration model. This work aims to determine the essential features that must be considered in order to automate the creation of dynamic virtual organizations. By integrating the “Select-and-Modify” approach with the “CommonKADS” methodology, this paper proposes a strategy-driven approach for virtual business collaboration model construction. In addition, some generic knowledge-based components have been designed to support this creation; the flexibility of the knowledge-based approach facilitates future integration. This paper is an integration and extension of the recent work “New Federated Collaborative Networked Organization Model (FCNOM”, which proposed an integrated framework that combines existing collaborative networked organization perspectives and proposes new ones.
Systemic change increases model projection uncertainty
Verstegen, Judith; Karssenberg, Derek; van der Hilst, Floortje; Faaij, André
2014-01-01
Most spatio-temporal models are based on the assumption that the relationship between system state change and its explanatory processes is stationary. This means that model structure and parameterization are usually kept constant over time, ignoring potential systemic changes in this relationship re
Uncertainty Consideration in Watershed Scale Models
Watershed scale hydrologic and water quality models have been used with increasing frequency to devise alternative pollution control strategies. With recent reenactment of the 1972 Clean Water Act’s TMDL (total maximum daily load) component, some of the watershed scale models are being recommended ...
Designing Public Library Websites for Teens: A Conceptual Model
Naughton, Robin Amanda
2012-01-01
The main goal of this research study was to develop a conceptual model for the design of public library websites for teens (TLWs) that would enable designers and librarians to create library websites that better suit teens' information needs and practices. It bridges a gap in the research literature between user interface design in human-computer…
Conceptualizations of Creativity: Comparing Theories and Models of Giftedness
Miller, Angie L.
2012-01-01
This article reviews seven different theories of giftedness that include creativity as a component, comparing and contrasting how each one conceptualizes creativity as a part of giftedness. The functions of creativity vary across the models, suggesting that while the field of gifted education often cites the importance of creativity, the…
A Conceptual Model of the World of Work.
VanRooy, William H.
The conceptual model described in this paper resulted from the need to organize a body of knowledge related to the world of work which would enable curriculum developers to prepare accurate, realistic instructional materials. The world of work is described by applying Malinowski's scientific study of the structural components of culture. It is…
A New Conceptual Model for Understanding International Students' College Needs
Alfattal, Eyad
2016-01-01
This study concerns the theory and practice of international marketing in higher education with the purpose of exploring a conceptual model for understanding international students' needs in the context of a four-year college in the United States. A transcendental phenomenological design was employed to investigate the essence of international…
A Conceptual Model for Effective Distance Learning in Higher Education
Farajollahi, Mehran; Zare, Hosein; Hormozi, Mahmood; Sarmadi, Mohammad Reza; Zarifsanaee, Nahid
2010-01-01
The present research aims at presenting a conceptual model for effective distance learning in higher education. Findings of this research shows that an understanding of the technological capabilities and learning theories especially constructive theory and independent learning theory and communicative and interaction theory in Distance learning is…
Conceptual model for reinforced grass on inner dike slopes
ComCoast
2005-01-01
A desk study has been carried out in order to develop a conceptual model for the erosion of inner dike slopes with reinforced grass cover. Based on the results the following can be concluded: The presence of a geosynthetic in a grass slope can be taken into account in the EPM method by increasing
Mapping the Territory: A Conceptual Model of Scholastic Journalism.
Arnold, Mary
1991-01-01
Describes scholastic journalism as the teaching of secondary school students to gather, process, and present information to an audience. Offers a model focusing upon scholastic journalism's conceptual areas of law and ethics, history and cultural diversity, technology and financial support, media and content, pedagogy, and working context as a…
Uncertainty of Modal Parameters Estimated by ARMA Models
DEFF Research Database (Denmark)
Jensen, Jakob Laigaard; Brincker, Rune; Rytter, Anders
In this paper the uncertainties of identified modal parameters such as eigenfrequencies and damping ratios are assessed. From the measured response of dynamically excited structures the modal parameters may be identified and provide important structural knowledge. However, the uncertainty of the parameters is assessed here by a simulation study of a lightly damped single degree of freedom system. Identification by ARMA models has been chosen as the system identification method. It is concluded that both the sampling interval and the number of sampled points may play a significant role with respect to the statistical errors. Furthermore...
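The ARMA-based identification idea can be sketched on a noise-free version of the abstract's setup: a lightly damped single degree of freedom response fitted with an AR(2) model, whose pole gives the eigenfrequency and damping ratio. The frequency, damping and sampling values are assumed for illustration, and without measurement noise the fit is exact, so this shows the mechanics rather than the statistical uncertainty studied in the paper:

```python
# Identify eigenfrequency and damping ratio of a noise-free damped SDOF
# response by least-squares fitting an AR(2) model and converting its pole
# to modal parameters. All numerical values are illustrative assumptions.
import cmath
import math

fn, zeta, dt = 2.0, 0.02, 0.01          # natural freq (Hz), damping ratio, time step (s)
wn = 2 * math.pi * fn
wd = wn * math.sqrt(1 - zeta**2)        # damped natural frequency
x = [math.exp(-zeta * wn * n * dt) * math.cos(wd * n * dt) for n in range(500)]

# Least-squares AR(2) fit: x[n] = a1*x[n-1] + a2*x[n-2] (normal equations)
s11 = sum(x[n-1] * x[n-1] for n in range(2, len(x)))
s22 = sum(x[n-2] * x[n-2] for n in range(2, len(x)))
s12 = sum(x[n-1] * x[n-2] for n in range(2, len(x)))
b1 = sum(x[n] * x[n-1] for n in range(2, len(x)))
b2 = sum(x[n] * x[n-2] for n in range(2, len(x)))
det = s11 * s22 - s12 * s12
a1 = (b1 * s22 - b2 * s12) / det
a2 = (b2 * s11 - b1 * s12) / det

# Complex pole of z^2 - a1*z - a2 = 0 -> modal frequency and damping
pole = (a1 + cmath.sqrt(a1 * a1 + 4 * a2)) / 2
sigma = -math.log(abs(pole)) / dt       # decay rate zeta*wn
wd_est = abs(cmath.phase(pole)) / dt
wn_est = math.sqrt(sigma**2 + wd_est**2)
print(f"f = {wn_est / (2 * math.pi):.3f} Hz, zeta = {sigma / wn_est:.4f}")
```

With measurement noise added, repeated identification runs would scatter around these values, and that scatter is the parameter uncertainty the paper quantifies as a function of sampling interval and record length.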
Multiphysics modeling and uncertainty quantification for an active composite reflector
Peterson, Lee D.; Bradford, S. C.; Schiermeier, John E.; Agnes, Gregory S.; Basinger, Scott A.
2013-09-01
A multiphysics, high resolution simulation of an actively controlled, composite reflector panel is developed to extrapolate from ground test results to flight performance. The subject test article has previously demonstrated sub-micron corrected shape in a controlled laboratory thermal load. This paper develops a model of the on-orbit performance of the panel under realistic thermal loads, with an active heater control system, and performs an uncertainty quantification of the predicted response. The primary contribution of this paper is the first reported application of the Sandia developed Sierra mechanics simulation tools to a spacecraft multiphysics simulation of a closed-loop system, including uncertainty quantification. The simulation was developed so as to have sufficient resolution to capture the residual panel shape error that remains after the thermal and mechanical control loops are closed. An uncertainty quantification analysis was performed to assess the predicted tolerance in the closed-loop wavefront error. Key tools used for the uncertainty quantification are also described.
An educational model for ensemble streamflow simulation and uncertainty analysis
Directory of Open Access Journals (Sweden)
A. AghaKouchak
2013-02-01
Full Text Available This paper presents the hands-on modeling toolbox, HBV-Ensemble, designed as a complement to theoretical hydrology lectures, to teach hydrological processes and their uncertainties. The HBV-Ensemble can be used for in-class lab practices and homework assignments, and assessment of students' understanding of hydrological processes. Using this modeling toolbox, students can gain more insights into how hydrological processes (e.g., precipitation, snowmelt and snow accumulation, soil moisture, evapotranspiration and runoff generation) are interconnected. The educational toolbox includes a MATLAB Graphical User Interface (GUI) and an ensemble simulation scheme that can be used for teaching uncertainty analysis, parameter estimation, ensemble simulation and model sensitivity. HBV-Ensemble was administered in a class for both in-class instruction and a final project, and students submitted their feedback about the toolbox. The results indicate that this educational software had a positive impact on students' understanding and knowledge of uncertainty in hydrological modeling.
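The ensemble-simulation idea the toolbox teaches can be illustrated with a toy conceptual model. The linear-reservoir "bucket" below is a drastic simplification, not HBV, and the precipitation series and parameter range are invented:

```python
# Illustrative ensemble simulation with a toy bucket model (NOT HBV): run a
# single-parameter conceptual runoff model many times with a perturbed
# recession parameter and inspect the spread of simulated discharge.
import random

random.seed(7)
precip = [5, 0, 10, 2, 0, 0, 8, 1, 0, 0]   # daily precipitation, notional units

def bucket_model(precip, k):
    """Linear reservoir: each day, a fraction k of storage becomes discharge."""
    storage, q = 0.0, []
    for p in precip:
        storage += p
        out = k * storage
        storage -= out
        q.append(out)
    return q

ensemble = [bucket_model(precip, random.uniform(0.2, 0.6)) for _ in range(100)]
last = sorted(run[-1] for run in ensemble)
print(f"day-10 discharge, ensemble 5-95%: {last[5]:.2f} to {last[95]:.2f}")
```

The spread between ensemble members is the parameter uncertainty band that students read off in such an exercise.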
Numerical Modelling of Structures with Uncertainties
Directory of Open Access Journals (Sweden)
Kahsin Maciej
2017-04-01
Full Text Available The nature of environmental interactions, as well as the large dimensions and complex structure of marine offshore objects, make designing, building and operating these objects a great challenge. This is the reason why a vast majority of investment cases of this type include structural analysis, performed using scaled laboratory models and complemented by extended computer simulations. The present paper focuses on FEM modelling of an offshore wind turbine supporting structure. The problem is studied using modal analysis, sensitivity analysis, and the design of experiment (DOE) and response surface model (RSM) methods. The results of modal-analysis-based simulations were used to assess the quality of the FEM model against the data measured during the experimental modal analysis of the scaled laboratory model for different support conditions. The sensitivity analysis, in turn, provided opportunities for assessing the effect of individual FEM model parameters on the dynamic response of the examined supporting structure. The DOE and RSM methods allowed determination of the effect of model parameter changes on the supporting structure response.
A new conceptual model of convection
Energy Technology Data Exchange (ETDEWEB)
Walcek, C. [State Univ. of New York, Albany, NY (United States)
1995-09-01
Classical cumulus parameterizations assume that cumulus clouds are entraining plumes of hot air rising through the atmosphere. However, ample evidence shows that clouds cannot be simulated using this approach. Dr. Walcek suggests that cumulus clouds can be reasonably simulated by assuming that buoyant plumes detrain mass as they rise through the atmosphere. Walcek successfully simulates measurements of tropical convection using this detraining model of cumulus convection. Comparisons with measurements suggest that buoyant plumes encounter resistance to upward movement as they pass through dry layers in the atmosphere. This probably results from turbulent mixing and evaporation of cloud water, which generates negatively buoyant mixtures which detrain from the upward moving plume. This mass flux model of detraining plumes is considerably simpler than existing mass flux models, yet reproduces many of the measured effects associated with convective activity. 1 fig.
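A minimal sketch of the detraining-plume idea: if the plume sheds mass at a prescribed detrainment rate, its mass flux decays with height as dM/dz = -d·M. The rate and levels below are illustrative assumptions, not Walcek's parameterization:

```python
import math

def plume_mass_flux(z_levels, detrain_rate, m0=1.0):
    """Detraining plume: mass flux decays with height, M(z) = M0 * exp(-d * z).
    z_levels in km, detrain_rate in 1/km; both are illustrative units."""
    return [m0 * math.exp(-detrain_rate * z) for z in z_levels]
```

A drier mid-level layer could be represented by locally increasing `detrain_rate`, mimicking the resistance to upward movement described above.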
A Conceptualized Investment Model of Crowdfunding
DEFF Research Database (Denmark)
Tomczak, A.; Brem, Alexander
2013-01-01
Crowdfunding is growing in popularity as a new form of both investment opportunity and source of venture capital. This article takes a view on whether crowdfunding is a replacement or an addition to traditional seed capital sources in the early stages of a new venture. With access to angel investment decreasing since the financial crisis of 2008, crowdfunding is of great importance to start-ups seeking starting capital. However, little effort has been made to define the investment model of crowdfunding with both crowdfunder and crowdfundee in mind. Drawing on an in-depth review of current literature on crowdfunding, this article creates an investment model of crowdfunding with various reward models available to investor and investee in mind. This article provides an extensive survey of the environment of crowdfunding based on current literature. It offers a jumping off point and a thorough...
Integration of inaccurate data into model building and uncertainty assessment
Energy Technology Data Exchange (ETDEWEB)
Coleou, Thierry
1998-12-31
Model building can be seen as integrating numerous measurements and mapping through data points considered as exact. As the exact data set is usually sparse, using additional non-exact data improves the modelling and reduces the uncertainties. Several examples of non-exact data are discussed, and a methodology to honor them, along with the exact data, in a single pass is presented. This automatic procedure is valid both for "base case" model building and for stochastic simulations for uncertainty analysis. 5 refs., 3 figs.
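One generic way to honor exact and non-exact data in a single pass is a weighted least-squares fit in which exact points receive large weights and non-exact points smaller ones; the scheme below is a stand-in for the paper's methodology, not a reproduction of it:

```python
def weighted_fit(x, y, weights):
    """Weighted least-squares line fit. Exact data get large weights,
    non-exact data smaller weights; both are honored in one pass.
    Returns (intercept, slope)."""
    sw = sum(weights)
    mx = sum(w * xi for w, xi in zip(weights, x)) / sw
    my = sum(w * yi for w, yi in zip(weights, y)) / sw
    num = sum(w * (xi - mx) * (yi - my) for w, xi, yi in zip(weights, x, y))
    den = sum(w * (xi - mx) ** 2 for w, xi in zip(weights, x))
    slope = num / den
    return my - slope * mx, slope
```

Downweighting a point rather than discarding it lets sparse exact data and abundant soft data constrain the same model.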
Enhancing uncertainty tolerance in modelling creep of ligaments.
Reda Taha, M M; Lucero, J
2006-09-01
The difficulty in performing biomechanical tests and the scarcity of biomechanical experimental databases necessitate extending the current knowledge base to allow efficient modelling using limited data sets. This study suggests a framework to reduce uncertainties in biomechanical systems using limited data sets. The study also shows how sparse data and epistemic input can be exploited using fuzzy logic to represent biomechanical relations. An example application to model collagen fibre recruitment in the medial collateral ligaments during time-dependent deformation under cyclic loading (creep) is presented. The study suggests a quality metric that can be employed to observe and enhance uncertainty tolerance in the modelling process.
Testing conceptual unsaturated zone flow models for Yucca Mountain
Energy Technology Data Exchange (ETDEWEB)
Brown, T.P.; Lehman, L.L. [L. Lehman & Associates, Inc., Burnsville, MN (United States); Nieber, J.L. [Univ. of Minnesota, St. Paul, MN (United States)
1994-12-31
An important component of site characterization and suitability assessment of the proposed nuclear waste repository at Yucca Mountain, Nevada is determination of the most appropriate conceptual model of the hydrologic mechanisms governing saturated and unsaturated flow at the site. As observers in the INTRAVAL Unsaturated Zone Working Group, L. Lehman & Associates conducted a modeling exercise which numerically examined alternative conceptual flow models. Information was provided to the Working Group by the U.S. Geological Survey. Additional published data were utilized to fill in data gaps and to provide additional confidence in the results. Data were modeled utilizing one- and two-dimensional matrix and fracture numerical models. Good agreement was obtained using a 2-dimensional dual porosity fracture flow model. Additional measures are needed to constrain the field conditions enough to validate conceptual models using numerical models. Geochemical data on tritium, chlorine-36, or carbon-14 concentrations, or temperature profiles, which can give estimates of time since recharge for water in the unsaturated zone, are needed to eliminate the non-uniqueness of various model solutions.
Spatial uncertainty assessment in modelling reference evapotranspiration at regional scale
Directory of Open Access Journals (Sweden)
G. Buttafuoco
2010-07-01
Full Text Available Evapotranspiration is one of the major components of the water balance and has been identified as a key factor in hydrological modelling. For this reason, several methods have been developed to calculate the reference evapotranspiration (ET_{0}). In modelling reference evapotranspiration it is inevitable that both the model and the data input will present some uncertainty. Whatever model is used, the errors in the input will propagate to the calculated ET_{0}. Neglecting information about estimation uncertainty, however, may lead to improper decision-making and water resources management. One geostatistical approach to spatial analysis is stochastic simulation, which draws alternative, equally probable realizations of a regionalized variable. Differences between the realizations provide a measure of spatial uncertainty and allow an error propagation analysis to be carried out. Among the evapotranspiration models, the Hargreaves-Samani model was used.
The aim of this paper was to assess the spatial uncertainty of a monthly reference evapotranspiration model resulting from the uncertainties in the input attributes (mainly temperature) at regional scale. A case study was presented for the Calabria region (southern Italy). Temperature data were jointly simulated by conditional turning bands simulation with elevation as external drift, and 500 realizations were generated.
The ET_{0} was then estimated for each of the 500 realizations of the input variables, and the ensemble of the model outputs was used to infer the reference evapotranspiration probability distribution function. This approach allowed delineation of the areas characterized by greater uncertainty, and improvement of supplementary sampling strategies and ET_{0} value predictions.
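The Hargreaves-Samani model and the realization-by-realization propagation can be sketched as follows. Simple Gaussian temperature perturbations stand in for the turning-bands realizations, and all parameter values are illustrative assumptions:

```python
import math
import random

def hargreaves_samani(tmean, tmax, tmin, ra):
    """Reference evapotranspiration [mm/day]. ra is extraterrestrial radiation
    expressed in equivalent evaporation [mm/day]."""
    return 0.0023 * ra * (tmean + 17.8) * math.sqrt(max(tmax - tmin, 0.0))

def et0_uncertainty(n=500, seed=7):
    """Propagate n temperature realizations through the model and return an
    empirical 95% uncertainty interval for ET0 at one location."""
    rng = random.Random(seed)
    et0 = []
    for _ in range(n):
        tmean = rng.gauss(18.0, 1.0)         # assumed mean temperature [C]
        trange = abs(rng.gauss(10.0, 1.5))   # assumed diurnal range [C]
        et0.append(hargreaves_samani(tmean, tmean + trange / 2,
                                     tmean - trange / 2, ra=12.0))
    et0.sort()
    return et0[int(0.025 * n)], et0[int(0.975 * n)]
```

Mapping the interval width across all grid locations is what delineates the areas of greater uncertainty.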
Learning strategies: a synthesis and conceptual model
Hattie, John A. C.; Donoghue, Gregory M.
2016-08-01
The purpose of this article is to explore a model of learning that proposes that various learning strategies are powerful at certain stages in the learning cycle. The model describes three inputs and outcomes (skill, will and thrill), success criteria, three phases of learning (surface, deep and transfer) and an acquiring and consolidation phase within each of the surface and deep phases. A synthesis of 228 meta-analyses led to the identification of the most effective strategies. The results indicate that there is a subset of strategies that are effective, but this effectiveness depends on the phase of the model in which they are implemented. Further, it is best not to run separate sessions on learning strategies but to embed the various strategies within the content of the subject, to be clearer about developing both surface and deep learning, and promoting their associated optimal strategies and to teach the skills of transfer of learning. The article concludes with a discussion of questions raised by the model that need further research.
Conceptualizing Evolving Models of Educational Development
Fraser, Kym; Gosling, David; Sorcinelli, Mary Deane
2010-01-01
Educational development, which the authors use to refer to the field of professional and strategic development associated with university and college learning and teaching, can be described in many ways by referring to its different aspects. In this article the authors endeavor to categorize many of the models that have been used to describe…
Conceptual Models as Tools for Communication Across Disciplines
Directory of Open Access Journals (Sweden)
Marieke Heemskerk
2003-12-01
Full Text Available To better understand and manage complex social-ecological systems, social scientists and ecologists must collaborate. However, issues related to language and research approaches can make it hard for researchers in different fields to work together. This paper suggests that researchers can improve interdisciplinary science through the use of conceptual models as a communication tool. The authors share lessons from a workshop in which interdisciplinary teams of young scientists developed conceptual models of social-ecological systems using data sets and metadata from Long-Term Ecological Research sites across the United States. Both the process of model building and the models that were created are discussed. The exercise revealed that the presence of social scientists in a group influenced the place and role of people in the models. This finding suggests that the participation of both ecologists and social scientists in the early stages of project development may produce better questions and more accurate models of interactions between humans and ecosystems. Although the participants agreed that a better understanding of human intentions and behavior would advance ecosystem science, they felt that interdisciplinary research might gain more by training strong disciplinarians than by merging ecology and social sciences into a new field. It is concluded that conceptual models can provide an inspiring point of departure and a guiding principle for interdisciplinary group discussions. Jointly developing a model not only helped the participants to formulate questions, clarify system boundaries, and identify gaps in existing data, but also revealed the thoughts and assumptions of fellow scientists. Although the use of conceptual models will not serve all purposes, the process of model building can help scientists, policy makers, and resource managers discuss applied problems and theory among themselves and with those in other areas.
Di Vittorio, Alan; Mao, Jiafu; Shi, Xiaoying
2017-04-01
Several climate adaptation and mitigation strategies incorporate Land Use and Land Cover Change (LULCC) to address global carbon balance and climate. However, LULCC is not consistent across the CMIP5 model simulations because only the land use input is harmonized. The associated LULCC uncertainty generates uncertainty in regional and global carbon and climate dynamics that obfuscates the evaluation of whether such strategies are effective in meeting their goals. For example, the integrated Earth System Model (iESM) overestimates 2004 atmospheric CO2 concentration by 14 ppmv, and we explore the contribution of historical LULCC uncertainty to this bias in relation to the effects of CO2 fertilization, climate change, and nitrogen deposition on terrestrial carbon. Using identical land use input, a chronologically referenced LULCC that accounts for pasture, as opposed to the default year-2000 referenced LULCC, increases this bias to 20 ppmv because more forest needs to be cleared for land use. Assuming maximum forest retention for all land conversion reduces the new bias to 19 ppmv, while minimum forest retention increases the new bias to 24 ppmv. There is a 33 Pg land carbon uncertainty range due to maximizing versus minimizing forest area, which is 80% of the estimated 41 PgC gain in land carbon due to CO2 fertilization combined with climate change from 1850-2004 and 150% of the estimated 22 PgC gain due to nitrogen deposition. These results demonstrate that LULCC accuracy and uncertainty are critical for estimating the carbon cycle, and also that LULCC may be an important lever for constraining global carbon estimates. Furthermore, different land conversion assumptions can generate local differences of over 1.0 °C between the two forest retention cases with less than 5% difference in tree cover within a grid cell. Whether these temperature differences are positive or negative depends more on region than on latitude. Sensible heat appears to be more sensitive than
Estimation and uncertainty of reversible Markov models
Trendelkamp-Schroer, Benjamin; Paul, Fabian; Noé, Frank
2015-01-01
Reversibility is a key concept in the theory of Markov models, simplified kinetic models for the conformation dynamics of molecules. The analysis and interpretation of the transition matrix encoding the kinetic properties of the model relies heavily on the reversibility property. The estimation of a reversible transition matrix from simulation data is therefore crucial to the successful application of the previously developed theory. In this work we discuss methods for the maximum likelihood estimation of transition matrices from finite simulation data and present a new algorithm for the estimation when reversibility with respect to a given stationary vector is desired. We also develop new methods for the Bayesian posterior inference of reversible transition matrices with and without a given stationary vector, taking into account the need for a suitable prior distribution preserving the metastable features of the observed process during posterior inference.
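A reversible maximum likelihood transition matrix (without a fixed stationary vector) can be obtained with the standard self-consistent fixed-point iteration on the symmetrized counts; this is a generic sketch of that well-known scheme, not the paper's new algorithm:

```python
def reversible_mle(counts, n_iter=1000):
    """Reversible MLE of a transition matrix from a count matrix.
    Iterates x_ij <- (c_ij + c_ji) / (c_i/x_i + c_j/x_j), where x_ij = pi_i T_ij,
    so the result satisfies detailed balance by construction."""
    n = len(counts)
    c_sym = [[counts[i][j] + counts[j][i] for j in range(n)] for i in range(n)]
    row = [sum(counts[i]) for i in range(n)]             # total counts per state
    x = [[c_sym[i][j] / 2.0 for j in range(n)] for i in range(n)]
    for _ in range(n_iter):
        xi = [sum(x[i]) for i in range(n)]
        x = [[c_sym[i][j] / (row[i] / xi[i] + row[j] / xi[j])
              if c_sym[i][j] else 0.0
              for j in range(n)] for i in range(n)]
    xi = [sum(x[i]) for i in range(n)]
    return [[x[i][j] / xi[i] for j in range(n)] for i in range(n)]
```

Because the iterate x stays symmetric, the returned matrix is automatically reversible with respect to its own stationary distribution.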
Uncertainty the soul of modeling, probability & statistics
Briggs, William
2016-01-01
This book presents a philosophical approach to probability and probabilistic thinking, considering the underpinnings of probabilistic reasoning and modeling, which effectively underlie everything in data science. The ultimate goal is to call into question many standard tenets and lay the philosophical and probabilistic groundwork and infrastructure for statistical modeling. It is the first book devoted to the philosophy of data aimed at working scientists and calls for a new consideration in the practice of probability and statistics to eliminate what has been referred to as the "Cult of Statistical Significance". The book explains the philosophy of these ideas and not the mathematics, though there are a handful of mathematical examples. The topics are logically laid out, starting with basic philosophy as related to probability, statistics, and science, and stepping through the key probabilistic ideas and concepts, and ending with statistical models. Its jargon-free approach asserts that standard methods, suc...
Model uncertainty and systematic risk in US banking
Baele, L.T.M.; De Bruyckere, Valerie; De Jonghe, O.G.; Vander Vennet, Rudi
2015-01-01
This paper uses Bayesian Model Averaging (BMA) to examine the driving factors of equity returns of US Bank Holding Companies. BMA has the advantage over OLS that it accounts for the considerable uncertainty about the correct set (model) of bank risk factors. We find that out of a broad set of 12 risk factors...
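BMA in miniature, assuming equal prior model probabilities and BIC-based approximate posterior weights (a common approximation; the paper's actual factor set and priors are not reproduced here):

```python
import math

def bma_weights(bics):
    """Approximate posterior model probabilities from BIC values,
    assuming equal prior probability for each candidate model."""
    best = min(bics)
    w = [math.exp(-0.5 * (b - best)) for b in bics]
    s = sum(w)
    return [v / s for v in w]

def bma_estimate(coefs, bics):
    """Model-averaged coefficient: posterior-weighted mean over models."""
    w = bma_weights(bics)
    return sum(wi * c for wi, c in zip(w, coefs))
```

The averaged coefficient is pulled toward the better-fitting model rather than committing to a single specification, which is precisely how model uncertainty is carried through to inference.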
River meander modeling and confronting uncertainty.
Energy Technology Data Exchange (ETDEWEB)
Posner, Ari J. (University of Arizona Tucson, AZ)
2011-05-01
This study examines the meandering phenomenon as it occurs in media throughout terrestrial, glacial, atmospheric, and aquatic environments. Analysis of the minimum energy principle, along with theories of Coriolis forces and random walks proposed to explain the meandering phenomenon, found that these theories apply at different temporal and spatial scales. Coriolis forces might induce topological changes resulting in meandering planforms. The minimum energy principle might explain how these forces combine to limit the sinuosity to depth and width ratios that are common throughout various media. The study then compares the first-order analytical solutions for the flow field by Ikeda et al. (1981) and Johannesson and Parker (1989b). The linear bank erosion model of Ikeda et al. was implemented to predict the rate of bank erosion, in which the bank erosion coefficient is treated as a stochastic variable that varies with physical properties of the bank (e.g., cohesiveness, stratigraphy, or vegetation density). The developed model was used to predict the evolution of meandering planforms. Then, the modeling results were analyzed and compared to the observed data. Since the migration of a meandering channel consists of downstream translation, lateral expansion, and downstream or upstream rotations, several measures are formulated in order to determine which of the resulting planforms is closest to the experimentally measured one. Results from the deterministic model highly depend on the calibrated erosion coefficient. Since field measurements are always limited, the stochastic model yielded more realistic predictions of meandering planform evolutions. Due to the random nature of the bank erosion coefficient, the meandering planform evolution is a stochastic process that can only be accurately predicted by a stochastic model.
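Treating the bank erosion coefficient as a stochastic variable leads naturally to Monte Carlo ensembles of migration predictions. The lognormal coefficient and curvature-proportional migration rate below are illustrative stand-ins for the full Ikeda et al. flow-field solution:

```python
import math
import random

def migrate(curvature, e0_mean, e0_cv, years, n_runs=200, seed=0):
    """Monte Carlo lateral migration distances along a centerline.
    The erosion coefficient E0 is lognormal with given mean and coefficient
    of variation; migration rate is proxied as E0 * local curvature."""
    rng = random.Random(seed)
    sigma = math.sqrt(math.log(1.0 + e0_cv ** 2))
    mu = math.log(e0_mean) - 0.5 * sigma ** 2
    ensemble = []
    for _ in range(n_runs):
        e0 = rng.lognormvariate(mu, sigma)    # stochastic erosion coefficient
        ensemble.append([e0 * c * years for c in curvature])
    return ensemble
```

The spread of planform outcomes across the ensemble is the uncertainty that a single calibrated deterministic run cannot express.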
Modeling in transport phenomena a conceptual approach
Tosun, Ismail
2007-01-01
Modeling in Transport Phenomena, Second Edition presents and clearly explains, with example problems, the basic concepts and their applications to fluid flow, heat transfer, mass transfer, chemical reaction engineering and thermodynamics. A balanced approach between analysis and synthesis is presented; students will understand how to use the solution in engineering analysis. Systematic derivations of the equations and the physical significance of each term are given in detail, so that students can easily understand and follow the material. There is a strong incentive in science and engineering to
Hassan, Rania A.
In the design of complex large-scale spacecraft systems that involve a large number of components and subsystems, many specialized state-of-the-art design tools are employed to optimize the performance of various subsystems. However, there is no structured system-level concept-architecting process. Currently, spacecraft design is heavily based on the heritage of the industry. Old spacecraft designs are modified to adapt to new mission requirements, and feasible solutions---rather than optimal ones---are often all that is achieved. During the conceptual phase of the design, the choices available to designers are predominantly discrete variables describing major subsystems' technology options and redundancy levels. The complexity of spacecraft configurations makes the number of the system design variables that need to be traded off in an optimization process prohibitive when manual techniques are used. Such a discrete problem is well suited for solution with a Genetic Algorithm, which is a global search technique that performs optimization-like tasks. This research presents a systems engineering framework that places design requirements at the core of the design activities and transforms the design paradigm for spacecraft systems to a top-down approach rather than the current bottom-up approach. To facilitate decision-making in the early phases of the design process, the population-based search nature of the Genetic Algorithm is exploited to provide computationally inexpensive---compared to the state-of-the-practice---tools for both multi-objective design optimization and design optimization under uncertainty. In terms of computational cost, those tools are nearly on the same order of magnitude as that of standard single-objective deterministic Genetic Algorithm. The use of a multi-objective design approach provides system designers with a clear tradeoff optimization surface that allows them to understand the effect of their decisions on all the design objectives
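The discrete, population-based search described above can be illustrated with a minimal genetic algorithm over integer design variables (each variable indexing a technology option or redundancy level). The operators and rates are generic textbook choices, not the dissertation's implementation:

```python
import random

def ga(fitness, options_per_var, pop=40, gens=60, seed=3):
    """Minimal GA over discrete design variables: tournament selection,
    one-point crossover, random-reset mutation, and two-member elitism.
    Maximizes `fitness`; requires at least two design variables."""
    rng = random.Random(seed)
    def rand_ind():
        return [rng.randrange(n) for n in options_per_var]
    P = [rand_ind() for _ in range(pop)]
    for _ in range(gens):
        scored = sorted(P, key=fitness, reverse=True)
        nxt = scored[:2]                              # elitism
        while len(nxt) < pop:
            a = max(rng.sample(P, 3), key=fitness)    # tournament of 3
            b = max(rng.sample(P, 3), key=fitness)
            cut = rng.randrange(1, len(options_per_var))
            child = a[:cut] + b[cut:]                 # one-point crossover
            if rng.random() < 0.1:                    # mutation
                i = rng.randrange(len(child))
                child[i] = rng.randrange(options_per_var[i])
            nxt.append(child)
        P = nxt
    return max(P, key=fitness)
```

Multi-objective or uncertainty-aware variants replace the scalar `fitness` with Pareto ranking or expectation over sampled scenarios, at roughly the same computational cost per generation.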
Conceptual design interpretations, mindset and models
Andreasen, Mogens Myrup; Cash, Philip
2015-01-01
Maximising reader insights into the theory, models, methods and fundamental reasoning of design, this book addresses design activities in industrial settings, as well as the actors involved. This approach offers readers a new understanding of design activities and related functions, properties and dispositions. Presenting a ‘design mindset’ that seeks to empower students, researchers, and practitioners alike, it features a strong focus on how designers create new concepts to be developed into products, and how they generate new business and satisfy human needs. Employing a multi-faceted perspective, the book supplies the reader with a comprehensive worldview of design in the form of a proposed model that will empower their activities as student, researcher or practitioner. We draw the reader into the core role of design conceptualisation for society, for the development of industry, for users and buyers of products, and for citizens in relation to public systems. The book also features original con...
Uncertainty Visualization in Forward and Inverse Cardiac Models.
Burton, Brett M; Erem, Burak; Potter, Kristin; Rosen, Paul; Johnson, Chris R; Brooks, Dana H; Macleod, Rob S
2013-01-01
Quantification and visualization of uncertainty in cardiac forward and inverse problems with complex geometries is subject to various challenges. Specific to visualization is the observation that occlusion and clutter obscure important regions of interest, making visual assessment difficult. In order to overcome these limitations in uncertainty visualization, we have developed and implemented a collection of novel approaches. To highlight the utility of these techniques, we evaluated the uncertainty associated with two examples of modeling myocardial activity. In one case we studied cardiac potentials during the repolarization phase as a function of variability in tissue conductivities of the ischemic heart (forward case). In a second case, we evaluated uncertainty in reconstructed activation times on the epicardium resulting from variation in the control parameter of Tikhonov regularization (inverse case). To overcome difficulties associated with uncertainty visualization, we applied linked-view windows and interactive animation to the two respective cases. Through dimensionality reduction and superimposed mean and standard deviation measures over time, we were able to display key features in large ensembles of data and highlight regions of interest where larger uncertainties exist.
Estimation of a multivariate mean under model selection uncertainty
Directory of Open Access Journals (Sweden)
Georges Nguefack-Tsague
2014-05-01
Full Text Available Model selection uncertainty would occur if we selected a model based on one data set and subsequently applied it for statistical inferences, because the "correct" model would not be selected with certainty. When the selection and inference are based on the same dataset, some additional problems arise due to the correlation of the two stages (selection and inference). In this paper model selection uncertainty is considered and model averaging is proposed. The proposal is related to the James and Stein theory of estimating more than three parameters from independent normal observations. We suggest that a model averaging scheme taking into account the selection procedure could be more appropriate than model selection alone. Some properties of this model averaging estimator are investigated; in particular, we show using Stein's results that it is a minimax estimator and can outperform Stein-type estimators.
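The James-Stein shrinkage estimator that motivates the model averaging scheme can be written in a few lines (positive-part variant, unit-variance normal observations assumed; this is the textbook estimator, not the paper's averaging scheme itself):

```python
def james_stein(y):
    """Positive-part James-Stein estimate of a p-dimensional mean (p >= 3),
    assuming y_i ~ N(mu_i, 1) independently; shrinks toward the origin."""
    p = len(y)
    s2 = sum(v * v for v in y)
    shrink = max(0.0, 1.0 - (p - 2) / s2)
    return [shrink * v for v in y]
```

Shrinking every component toward a common target is what lets the combined estimator dominate the raw observations in total squared error, the same logic the paper transfers to averaging over candidate models.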
Uncertainty Analysis in Population-Based Disease Microsimulation Models
Directory of Open Access Journals (Sweden)
Behnam Sharif
2012-01-01
Full Text Available Objective. Uncertainty analysis (UA) is an important part of simulation model validation. However, the literature is imprecise as to how UA should be performed in the context of population-based microsimulation (PMS) models. In this expository paper, we discuss a practical approach to UA for such models. Methods. By adapting common concepts from published UA guidelines, we developed a comprehensive, step-by-step approach to UA in PMS models, including sample size calculation to reduce the computational time. As an illustration, we performed UA for POHEM-OA, a microsimulation model of osteoarthritis (OA) in Canada. Results. The resulting sample size of the simulated population was 500,000 and the number of Monte Carlo (MC) runs was 785 for a 12-hour computational time. The estimated 95% uncertainty intervals for the prevalence of OA in Canada in 2021 were 0.09 to 0.18 for men and 0.15 to 0.23 for women. The uncertainty surrounding the sex-specific prevalence of OA increased over time. Conclusion. The proposed approach to UA considers the challenges specific to PMS models, such as selection of parameters and calculation of MC runs and population size to reduce the computational burden. Our example of UA shows that the proposed approach is feasible. Estimation of uncertainty intervals should become a standard practice in the reporting of results from PMS models.
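The core of the proposed UA, repeated Monte Carlo runs with sampled parameters and empirical percentile intervals, can be sketched with a toy prevalence microsimulation (all numbers are illustrative, not POHEM-OA values):

```python
import random

def prevalence_run(n_people, base_risk, rng):
    """One Monte Carlo run of a toy microsimulation: a risk parameter is
    sampled (parameter uncertainty), then applied to each simulated person."""
    risk = base_risk * rng.uniform(0.8, 1.2)
    return sum(rng.random() < risk for _ in range(n_people)) / n_people

def uncertainty_interval(n_runs=200, n_people=5000, base_risk=0.15, seed=11):
    """Empirical 95% uncertainty interval for prevalence across MC runs."""
    rng = random.Random(seed)
    outs = sorted(prevalence_run(n_people, base_risk, rng)
                  for _ in range(n_runs))
    return outs[int(0.025 * n_runs)], outs[int(0.975 * n_runs)]
```

Choosing `n_runs` and `n_people` large enough that the interval stabilizes, but small enough to fit the computational budget, is exactly the trade-off the paper's sample size calculation addresses.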
Hazard Response Modeling Uncertainty (A Quantitative Method)
1988-10-01
... C_p is the concentration predicted by some component or model. The variance of C_o/C_p is calculated and defined as var(Model I), where Model I could be ...
Uncertainty Models for Knowledge-Based Systems
1991-08-01
Conceptual model innovation management: market orientation
Directory of Open Access Journals (Sweden)
L.Ya. Maljuta
2015-06-01
Full Text Available The article highlights the issues that determine the beginning of the innovation process. It is determined that until recently, at all levels of innovation management in Ukraine (regional, sectoral, institutional), a product orientation of innovation dominated, focused on production innovation, and that the transition to a market economy, the restructuring of production and the growing complexity of social needs have strengthened the position of the consumer. It is shown that innovation itself is not the ultimate goal, but only a means of satisfying consumer needs. Changing production conditions, more complex social needs and the need to improve the competitiveness of innovations require finding new organizational forms of innovation. In this regard, it is proposed to distinguish the following basic schemes (models) of innovation: small businesses, individual entrepreneurs, venture capital firms, explerents, patients, violents and commutants, spin-off and spin-out companies, network (or shell) companies, and networks of small businesses.
Three Dimensional Vapor Intrusion Modeling: Model Validation and Uncertainty Analysis
Akbariyeh, S.; Patterson, B.; Rakoczy, A.; Li, Y.
2013-12-01
Volatile organic chemicals (VOCs), such as chlorinated solvents and petroleum hydrocarbons, are prevalent groundwater contaminants due to their improper disposal and accidental spillage. In addition to contaminating groundwater, VOCs may partition into the overlying vadose zone and enter buildings through gaps and cracks in foundation slabs or basement walls, a process termed vapor intrusion. Vapor intrusion of VOCs has been recognized as a significant pathway for human exposure to potentially carcinogenic or toxic compounds. The simulation of vapor intrusion from a subsurface source has been the focus of many studies to better understand the process and guide field investigation. While multiple analytical and numerical models have been developed to simulate the vapor intrusion process, detailed validation of these models against well-controlled experiments is still lacking, due to the complexity and uncertainties associated with site characterization and with measurement of soil gas flux and indoor air concentration. In this work, we present an effort to validate a three-dimensional vapor intrusion model against a well-controlled experimental quantification of the vapor intrusion pathways into a slab-on-ground building under varying environmental conditions. Finally, a probabilistic approach based on Monte Carlo simulations is implemented to determine the probability distribution of indoor air concentration based on the most uncertain input parameters.
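A minimal version of the Monte Carlo step, sampling an attenuation factor and propagating it to indoor air concentration, might look like this (the lognormal parameters are illustrative assumptions, not fitted site values):

```python
import math
import random

def indoor_air_mc(c_source, n=10000, seed=5):
    """Monte Carlo indoor-air concentration: the source-to-indoor attenuation
    factor alpha is sampled lognormally and multiplied by the source
    concentration. Returns (median, 95th percentile)."""
    rng = random.Random(seed)
    samples = []
    for _ in range(n):
        alpha = rng.lognormvariate(math.log(1e-4), 1.0)  # assumed distribution
        samples.append(alpha * c_source)
    samples.sort()
    return samples[int(0.5 * n)], samples[int(0.95 * n)]
```

In practice the sampled inputs would be the most uncertain soil and building parameters, with alpha computed by the 3-D model for each draw.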
Model for predicting mountain wave field uncertainties
Damiens, Florentin; Lott, François; Millet, Christophe; Plougonven, Riwal
2017-04-01
Studying the propagation of acoustic waves through the troposphere requires knowledge of wind speed and temperature gradients from the ground up to about 10-20 km. Typical planetary boundary layer flows are known to present vertical low-level shears that can interact with mountain waves, thereby triggering small-scale disturbances. Resolving these fluctuations for long-range propagation problems is, however, not feasible because of computer memory/time restrictions and thus, they need to be parameterized. When the disturbances are small enough, these fluctuations can be described by linear equations. Previous works by co-authors have shown that the critical layer dynamics that occur near the ground produce large horizontal flows and buoyancy disturbances that result in intense downslope winds and gravity wave breaking. While these phenomena manifest almost systematically for high Richardson numbers and when the boundary layer depth is relatively small compared to the mountain height, the process by which static stability affects downslope winds remains unclear. In the present work, new linear mountain gravity wave solutions are tested against numerical predictions obtained with the Weather Research and Forecasting (WRF) model. For Richardson numbers typically larger than unity, the mesoscale model is used to quantify the effect of neglected nonlinear terms on downslope winds and mountain wave patterns. At these regimes, the large downslope winds transport warm air, a so-called "Foehn" effect that can impact sound propagation properties. The sensitivity of small-scale disturbances to the Richardson number is quantified using two-dimensional spectral analysis. It is shown through a pilot study of subgrid-scale fluctuations of boundary layer flows over realistic mountains that the cross-spectrum of the mountain wave field is made up of the same components found in WRF simulations. The impact of each individual component on acoustic wave propagation is discussed in terms of
Uncertainty Quantification for Large-Scale Ice Sheet Modeling
Energy Technology Data Exchange (ETDEWEB)
Ghattas, Omar [Univ. of Texas, Austin, TX (United States)
2016-02-05
This report summarizes our work to develop advanced forward and inverse solvers and uncertainty quantification capabilities for a nonlinear 3D full Stokes continental-scale ice sheet flow model. The components include: (1) forward solver: a new state-of-the-art parallel adaptive scalable high-order-accurate mass-conservative Newton-based 3D nonlinear full Stokes ice sheet flow simulator; (2) inverse solver: a new adjoint-based inexact Newton method for solution of deterministic inverse problems governed by the above 3D nonlinear full Stokes ice flow model; and (3) uncertainty quantification: a novel Hessian-based Bayesian method for quantifying uncertainties in the inverse ice sheet flow solution and propagating them forward into predictions of quantities of interest such as ice mass flux to the ocean.
Flight Dynamics and Control of Elastic Hypersonic Vehicles Uncertainty Modeling
Chavez, Frank R.; Schmidt, David K.
1994-01-01
It has been shown previously that hypersonic air-breathing aircraft exhibit strong aeroelastic/aeropropulsive dynamic interactions. To investigate these, especially from the perspective of the vehicle dynamics and control, analytical expressions for key stability derivatives were derived, and an analysis of the dynamics was performed. In this paper, the important issue of model uncertainty, and the appropriate forms for representing this uncertainty, is addressed. It is shown that the methods suggested in the literature for analyzing the robustness of multivariable feedback systems, which as a prerequisite to their application assume particular forms of model uncertainty, can be difficult to apply on real atmospheric flight vehicles. Also, the extent to which available methods are conservative is demonstrated for this class of vehicle dynamics.
Improved Wave-vessel Transfer Functions by Uncertainty Modelling
DEFF Research Database (Denmark)
Nielsen, Ulrik Dam; Fønss Bach, Kasper; Iseki, Toshio
2016-01-01
This paper deals with uncertainty modelling of wave-vessel transfer functions used to calculate or predict wave-induced responses of a ship in a seaway. Although transfer functions, in theory, can be calculated to exactly reflect the behaviour of the ship when exposed to waves, uncertainty in input...... variables, notably speed, draft and relative wave heading, often compromises results. In this study, uncertainty modelling is applied to improve theoretically calculated transfer functions, so they better fit the corresponding experimental, full-scale ones. Based on a vast amount of full-scale measurement data......, it is shown that uncertainty modelling can be successfully used to improve the accuracy (and reliability) of theoretical transfer functions....
Uncertainty analysis in dissolved oxygen modeling in streams.
Hamed, Maged M; El-Beshry, Manar Z
2004-08-01
Uncertainty analysis in surface water quality modeling is an important issue. This paper presents a method based on the first-order reliability method (FORM) to assess the exceedance probability of a target dissolved oxygen concentration in a stream, using a Streeter-Phelps prototype model. Basic uncertainty in the input parameters is considered by representing them as random variables with prescribed probability distributions. Results obtained from FORM analysis compared well with those of the Monte Carlo simulation method. The analysis also presents the stochastic sensitivity of the probabilistic outcome in the form of uncertainty importance factors, and shows how they change with changing simulation time. Furthermore, a parametric sensitivity analysis was conducted to show the effect of selection of different probability distribution functions for the three most important parameters on the design point, exceedance probability, and importance factors.
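The Monte Carlo side of the FORM/Monte Carlo comparison described in this abstract can be illustrated with a short sketch of the Streeter-Phelps deficit equation. All parameter means and spreads below are illustrative placeholders, not values taken from the paper:

```python
import math
import random

def do_deficit(t, kd, ka, L0, D0):
    """Streeter-Phelps oxygen deficit (mg/L) at travel time t (days)."""
    return (kd * L0 / (ka - kd)) * (math.exp(-kd * t) - math.exp(-ka * t)) \
        + D0 * math.exp(-ka * t)

def exceedance_probability(n=5000, t=2.0, do_sat=9.0, do_target=5.0, seed=1):
    """Monte Carlo estimate of P(DO < target): sample the uncertain rates and
    initial BOD from lognormal distributions and count failures."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n):
        kd = rng.lognormvariate(math.log(0.35), 0.2)  # deoxygenation rate (1/d)
        ka = rng.lognormvariate(math.log(0.80), 0.2)  # reaeration rate (1/d)
        L0 = rng.lognormvariate(math.log(15.0), 0.3)  # initial BOD (mg/L)
        if do_sat - do_deficit(t, kd, ka, L0, D0=1.0) < do_target:
            failures += 1
    return failures / n
```

A FORM solution would instead linearize the limit-state function g = DO - target around the design point; agreement between the two estimates, as reported in the abstract, is the standard sanity check.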
TOWARD COLLECTIVE INTELLIGENCE OF ONLINE COMMUNITIES: A PRIMITIVE CONCEPTUAL MODEL
Institute of Scientific and Technical Information of China (English)
Shuangling LUO; Haoxiang XIA; Taketoshi YOSHIDA; Zhongtuo WANG
2009-01-01
Inspired by the ideas of Swarm Intelligence and the "global brain", a concept of "community intelligence" is suggested in the present paper, reflecting that some "intelligent" features may emerge in a Web-mediated online community from interactions and knowledge transmission between the community members. This possible research field of community intelligence is then examined against the backgrounds of "community" and "intelligence" research. Furthermore, a conceptual model of community intelligence is developed from two views. From the structural view, the community intelligent system is modeled as a knowledge supernetwork comprised of three interwoven networks: the media network, the human network, and the knowledge network. Based on a dyad of knowledge in the two forms of "knowing" and "knoware", the dynamic view describes the basic mechanisms of the formation and evolution of "community intelligence". A few relevant research issues are briefly discussed on the basis of the proposed conceptual model.
Conceptual Commitments of the LIDA Model of Cognition
Franklin, Stan; Strain, Steve; McCall, Ryan; Baars, Bernard
2013-06-01
Significant debate on fundamental issues remains in the subfields of cognitive science, including perception, memory, attention, action selection, learning, and others. Psychology, neuroscience, and artificial intelligence each contribute alternative and sometimes conflicting perspectives on the supervening problem of artificial general intelligence (AGI). Current efforts toward a broad-based, systems-level model of minds cannot await theoretical convergence in each of the relevant subfields. Such work therefore requires the formulation of tentative hypotheses, based on current knowledge, that serve to connect cognitive functions into a theoretical framework for the study of the mind. We term such hypotheses "conceptual commitments" and describe the hypotheses underlying one such model, the Learning Intelligent Distribution Agent (LIDA) Model. Our intention is to initiate a discussion among AGI researchers about which conceptual commitments are essential, or particularly useful, toward creating AGI agents.
Spectral optimization and uncertainty quantification in combustion modeling
Sheen, David Allan
Reliable simulations of reacting flow systems require a well-characterized, detailed chemical model as a foundation. Accuracy of such a model can be assured, in principle, by a multi-parameter optimization against a set of experimental data. However, the inherent uncertainties in the rate evaluations and experimental data leave a model still characterized by some finite kinetic rate parameter space. Without a careful analysis of how this uncertainty space propagates into the model's predictions, those predictions can at best be trusted only qualitatively. In this work, the Method of Uncertainty Minimization using Polynomial Chaos Expansions is proposed to quantify these uncertainties. In this method, the uncertainty in the rate parameters of the as-compiled model is quantified. Then, the model is subjected to a rigorous multi-parameter optimization, as well as a consistency-screening process. Lastly, the uncertainty of the optimized model is calculated using an inverse spectral optimization technique, and then propagated into a range of simulation conditions. An as-compiled, detailed H2/CO/C1-C4 kinetic model is combined with a set of ethylene combustion data to serve as an example. The idea that the hydrocarbon oxidation model should be understood and developed in a hierarchical fashion has been a major driving force in kinetics research for decades. How this hierarchical strategy works at a quantitative level, however, has never been addressed. In this work, we use ethylene and propane combustion as examples and explore the question of hierarchical model development quantitatively. The Method of Uncertainty Minimization using Polynomial Chaos Expansions is utilized to quantify the amount of information that a particular combustion experiment, and thereby each data set, contributes to the model. This knowledge is applied to explore the relationships among the combustion chemistry of hydrogen/carbon monoxide, ethylene, and larger alkanes. Frequently, new data will
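The polynomial chaos machinery invoked above can be reduced to a minimal non-intrusive sketch: project a model response onto Hermite polynomials of a standard normal parameter via Gauss-Hermite quadrature, then read the mean and variance off the coefficients. The exponential "model" below is a stand-in for illustration, not a kinetic mechanism:

```python
import math

# 5-point Gauss-Hermite rule (physicists' convention), rescaled to N(0, 1)
_GH = [(-2.020182870456086, 0.019953242059045913),
       (-0.9585724646138185, 0.3936193231522412),
       (0.0, 0.9453087204829419),
       (0.9585724646138185, 0.3936193231522412),
       (2.020182870456086, 0.019953242059045913)]
NODES = [(math.sqrt(2.0) * x, w / math.sqrt(math.pi)) for x, w in _GH]

def hermite(k, x):
    """Probabilists' Hermite polynomials He_0 .. He_3."""
    return [1.0, x, x * x - 1.0, x ** 3 - 3.0 * x][k]

def pce_coefficients(model, order=3):
    """Project model(xi), xi ~ N(0,1), onto He_k: c_k = E[model * He_k] / k!."""
    return [sum(w * model(x) * hermite(k, x) for x, w in NODES)
            / math.factorial(k) for k in range(order + 1)]

# Stand-in response: a lognormal function of one uncertain rate parameter
c = pce_coefficients(lambda xi: math.exp(0.3 * xi))
mean = c[0]                                                    # PCE mean
var = sum(c[k] ** 2 * math.factorial(k) for k in range(1, 4))  # PCE variance
```

For this stand-in the exact mean is exp(0.045) ≈ 1.0460, which the third-order expansion reproduces closely; the uncertainty-minimization method in the abstract works with the same kind of expansion in many dimensions.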
Assessing uncertainties in solute transport models: Upper Narew case study
Osuch, M.; Romanowicz, R.; Napiórkowski, J. J.
2009-04-01
This paper evaluates uncertainties in two solute transport models based on tracer experiment data from the Upper River Narew. Data Based Mechanistic and transient storage models were applied to Rhodamine WT tracer observations. We focus on the analysis of uncertainty and the sensitivity of model predictions to varying physical parameters, such as dispersion and channel geometry. An advection-dispersion model with dead zones (Transient Storage model) adequately describes the transport of pollutants in a single channel river with multiple storage. The applied transient storage model is deterministic; it assumes that observations are free of errors and the model structure perfectly describes the process of transport of conservative pollutants. In order to take into account the model and observation errors, an uncertainty analysis is required. In this study we used a combination of the Generalized Likelihood Uncertainty Estimation technique (GLUE) and the variance based Global Sensitivity Analysis (GSA). The combination is straightforward as the same samples (Sobol samples) were generated for GLUE analysis and for sensitivity assessment. Additionally, the results of the sensitivity analysis were used to specify the best parameter ranges and their prior distributions for the evaluation of predictive model uncertainty using the GLUE methodology. Apart from predictions of pollutant transport trajectories, two ecological indicators were also studied (time over the threshold concentration and maximum concentration). In particular, a sensitivity analysis of the length of "over the threshold" period shows an interesting multi-modal dependence on model parameters. This behavior is a result of the direct influence of parameters on different parts of the dynamic response of the system. As an alternative to the transient storage model, a Data Based Mechanistic approach was tested. Here, the model is identified and the parameters are estimated from available time series data using
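The GLUE step described above can be sketched in a few lines: sample parameters, score each simulation with an informal likelihood (here Nash-Sutcliffe efficiency), keep the "behavioural" sets above a threshold, and weight them by likelihood. The first-order decay "transport" model and all numbers are toy assumptions, not the Narew setup:

```python
import math
import random
import statistics

def nse(obs, sim):
    """Nash-Sutcliffe efficiency, used as the informal GLUE likelihood."""
    mean_obs = statistics.fmean(obs)
    num = sum((o - s) ** 2 for o, s in zip(obs, sim))
    den = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - num / den

def glue(model, obs, sampler, n=2000, threshold=0.9, seed=7):
    """Return behavioural parameter sets with normalized likelihood weights."""
    rng = random.Random(seed)
    kept = []
    for _ in range(n):
        theta = sampler(rng)
        score = nse(obs, model(theta))
        if score > threshold:
            kept.append((score, theta))
    total = sum(s for s, _ in kept)
    return [(s / total, th) for s, th in kept]

# Toy tracer model: first-order decay of concentration, c(t) = c0 * exp(-k t)
times = [0, 1, 2, 3, 4]
truth = [10.0 * math.exp(-0.4 * t) for t in times]
obs = [c * (1 + 0.05 * (-1) ** i) for i, c in enumerate(truth)]  # +/-5% noise
model = lambda k: [10.0 * math.exp(-k * t) for t in times]
posterior = glue(model, obs, lambda rng: rng.uniform(0.1, 1.0))
k_hat = sum(w * k for w, k in posterior)  # likelihood-weighted estimate of k
```

In the study the same Sobol samples feed both this GLUE weighting and the variance-based sensitivity indices, which is what makes the combination straightforward.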
A simple conceptual model of abrupt glacial climate events
Braun, H; Christl, M; Chialvo, D R
2008-01-01
Here we use a very simple conceptual model in an attempt to reduce essential parts of the complex nonlinearity of abrupt glacial climate changes (the so-called Dansgaard-Oeschger events) to a few simple principles, namely (i) a threshold process, (ii) an overshooting in the stability of the system and (iii) a millennial-scale relaxation. By comparison with a so-called Earth system model of intermediate complexity (CLIMBER-2), in which the events represent oscillations between two climate states corresponding to two fundamentally different modes of deep-water formation in the North Atlantic, we demonstrate that the conceptual model captures fundamental aspects of the nonlinearity of the events in that model. We use the conceptual model in order to reproduce and reanalyse nonlinear resonance mechanisms that were already suggested in order to explain the characteristic time scale of Dansgaard-Oeschger events. In doing so we identify a new form of stochastic resonance (i.e. an overshooting stochastic resonance) a...
Linear models in the mathematics of uncertainty
Mordeson, John N; Clark, Terry D; Pham, Alex; Redmond, Michael A
2013-01-01
The purpose of this book is to present new mathematical techniques for modeling global issues. These mathematical techniques are used to determine linear equations between a dependent variable and one or more independent variables in cases where standard techniques such as linear regression are not suitable. In this book, we examine cases where the number of data points is small (effects of nuclear warfare), where the experiment is not repeatable (the breakup of the former Soviet Union), and where the data is derived from expert opinion (how conservative is a political party). In all these cases the data is difficult to measure and an assumption of randomness and/or statistical validity is questionable. We apply our methods to real world issues in international relations such as nuclear deterrence, smart power, and cooperative threat reduction. We next apply our methods to issues in comparative politics such as successful democratization, quality of life, economic freedom, political stability, and fail...
A High Performance Bayesian Computing Framework for Spatiotemporal Uncertainty Modeling
Cao, G.
2015-12-01
All types of spatiotemporal measurements are subject to uncertainty. As spatiotemporal data become increasingly involved in scientific research and decision making, it is important to appropriately model the impact of uncertainty. Quantitatively modeling spatiotemporal uncertainty, however, is a challenging problem given the complex dependence structures and data heterogeneities. State-space models provide a unifying and intuitive framework for dynamic systems modeling. In this paper, we aim to extend conventional state-space models for uncertainty modeling in space-time contexts while accounting for spatiotemporal effects and data heterogeneities. Gaussian Markov Random Field (GMRF) models, also known as conditional autoregressive models, are arguably the most commonly used methods for modeling spatially dependent data. GMRF models basically assume that a geo-referenced variable primarily depends on its neighborhood (Markov property), with the spatial dependence structure described via a precision matrix. Recent studies have shown that GMRFs are efficient approximations to the commonly used Gaussian fields (e.g., kriging) and, compared with Gaussian fields, enjoy a series of appealing features, such as fast computation and easy accounting for heterogeneities in spatial data (e.g., point and areal). This paper represents each spatial dataset as a GMRF and integrates them into a state-space form to statistically model the temporal dynamics. Different types of spatial measurements (e.g., categorical, count or continuous) can be accounted for by appropriate link functions. A fast alternative to the MCMC framework, the so-called Integrated Nested Laplace Approximation (INLA), was adopted for model inference. Preliminary case studies will be conducted to showcase the advantages of the described framework. In the first case, we apply the proposed method to modeling the water table elevation of the Ogallala aquifer over the past decades. In the second case, we analyze the
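As a minimal illustration of the state-space idea (without the GMRF spatial layer or INLA), a scalar Kalman filter for a random-walk "local level" model shows how process and observation noise combine into a filtered state with quantified uncertainty. The input series is made up for illustration:

```python
def kalman_local_level(ys, q, r, m0=0.0, p0=1e6):
    """Kalman filter for x_t = x_{t-1} + w_t, w_t ~ N(0, q), observed as
    y_t = x_t + v_t, v_t ~ N(0, r). Returns filtered means and variances."""
    m, p = m0, p0
    means, variances = [], []
    for y in ys:
        p = p + q               # predict: variance grows by process noise
        k = p / (p + r)         # Kalman gain
        m = m + k * (y - m)     # update with the new observation
        p = (1.0 - k) * p
        means.append(m)
        variances.append(p)
    return means, variances

# Illustrative series, e.g. noisy water-table readings (made-up numbers)
means, variances = kalman_local_level([1.0, 1.2, 0.9, 1.1, 1.0], q=0.01, r=0.1)
```

In the paper's setting the scalar state is replaced by a GMRF over the spatial field, so the scalar variance above becomes a precision matrix carrying the Markov neighbourhood structure.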
DEFF Research Database (Denmark)
Troldborg, Mads; Nowak, W.; Binning, Philip John
2010-01-01
for quantifying the uncertainty in the mass discharge across a multilevel control plane. The method is based on geostatistical inverse modelling and accounts for i) conceptual model uncertainty through multiple conceptual models and Bayesian model averaging, ii) heterogeneity through Bayesian geostatistics...... with an uncertain geostatistical model and iii) measurement uncertainty. The method is tested on a TCE contaminated site for which four different conceptual models were set up. The mass discharge and the associated uncertainty are hereby determined. It is discussed which of the conceptual models is most likely...
Promoting Conceptual Coherence in Quantum Learning through Computational Models
Lee, Hee-Sun
2012-02-01
In order to explain phenomena at the quantum level, scientists use multiple representations in verbal, pictorial, mathematical, and computational forms. Conceptual coherence among these multiple representations is used as an analytical framework to describe student learning trajectories in quantum physics. A series of internet-based curriculum modules are designed to address topics in quantum mechanics, semiconductor physics, and nano-scale engineering applications. In these modules, students are engaged in inquiry-based activities situated in a highly interactive computational modeling environment. This study was conducted in an introductory level solid state physics course. Based on in-depth interviews with 13 students, methods for identifying conceptual coherence as a function of students' level of understanding are presented. Pre-post test comparisons of 20 students in the course indicate a statistically significant improvement in students' conceptual coherence of understanding quantum phenomena before and after the course, Effect Size = 1.29 SD. Additional analyses indicate that, after controlling for course grades, students who engaged more with the modules improved their conceptual coherence to a greater extent than those who engaged less.
A Generalized Statistical Uncertainty Model for Satellite Precipitation Products
Sarachi, S.
2013-12-01
A mixture model of the Generalized Normal Distribution and Gamma distribution (GND-G) is used to model the joint probability distribution of satellite-based and Stage IV radar rainfall under a given spatial and temporal resolution (e.g. 1°x1° and daily rainfall). The distribution parameters of GND-G are extended across various rainfall rates and spatial and temporal resolutions. In the study, GND-G is used to describe the uncertainty of the estimates from the Precipitation Estimation from Remotely Sensed Information using Artificial Neural Networks algorithm (PERSIANN). The Stage IV-based multi-sensor precipitation estimates (MPE) are used as reference measurements. The study area for constructing the uncertainty model covers a 15°×15° box of 0.25°×0.25° cells over the eastern United States for the summers of 2004 to 2009. Cells are aggregated in space and time to obtain data with different resolutions for the construction of the model's parameter space. Results show that, compared to other statistical uncertainty models such as the Gaussian and Gamma distributions, GND-G fits the reference precipitation data better. The impact of precipitation uncertainty on stream flow is further demonstrated by Monte Carlo simulation of precipitation forcing in the hydrologic model. The NWS DMIP2 Illinois River basin south of Siloam is selected for this case study, with data covering the period 2006 to 2008. The resulting uncertainty range of stream flow derived from the GND-G distributions is calculated and will be discussed.
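The mixture idea can be sketched with a simplified stand-in: here a plain normal replaces the generalized normal component, and all weights and parameters are invented for illustration, not fitted to PERSIANN/MPE data:

```python
import random

def sample_mixture(n, w=0.6, mu=0.0, sigma=1.0, shape=2.0, scale=0.5, seed=3):
    """Draw n values from a two-component normal/gamma mixture: with
    probability w sample N(mu, sigma), otherwise Gamma(shape, scale)."""
    rng = random.Random(seed)
    return [rng.gauss(mu, sigma) if rng.random() < w
            else rng.gammavariate(shape, scale) for _ in range(n)]

samples = sample_mixture(50000)
# Mixture mean = w*mu + (1-w)*shape*scale = 0.6*0 + 0.4*1.0 = 0.4
mixture_mean = sum(samples) / len(samples)
```

Sampled errors of this kind can then perturb a precipitation forcing inside a Monte Carlo loop, which is the mechanism the abstract uses to push precipitation uncertainty through to stream flow.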
A Simplified Model of Choice Behavior under Uncertainty
Lin, Ching-Hung; Lin, Yu-Kai; Song, Tzu-Jiun; Huang, Jong-Tsun; Chiu, Yao-Chu
2016-01-01
The Iowa Gambling Task (IGT) has been standardized as a clinical assessment tool (Bechara, 2007). Nonetheless, numerous research groups have attempted to modify IGT models to optimize parameters for predicting the choice behavior of normal controls and patients. A decade ago, most researchers considered the expected utility (EU) model (Busemeyer and Stout, 2002) to be the optimal model for predicting choice behavior under uncertainty. However, in recent years, studies have demonstrated that m...
A simplified model of choice behavior under uncertainty
Ching-Hung Lin; Yu-Kai Lin; Tzu-Jiun Song; Jong-Tsun Huang; Yao-Chu Chiu
2016-01-01
The Iowa Gambling Task (IGT) has been standardized as a clinical assessment tool (Bechara, 2007). Nonetheless, numerous research groups have attempted to modify IGT models to optimize parameters for predicting the choice behavior of normal controls and patients. A decade ago, most researchers considered the expected utility (EU) model (Busemeyer and Stout, 2002) to be the optimal model for predicting choice behavior under uncertainty. However, in recent years, studies have demonstrated the pr...
Uncertainty Quantification in Control Problems for Flocking Models
Directory of Open Access Journals (Sweden)
Giacomo Albi
2015-01-01
Full Text Available The optimal control of flocking models with random inputs is investigated from a numerical point of view. The effect of uncertainty in the interaction parameters is studied for a Cucker-Smale type model using a generalized polynomial chaos (gPC) approach. Numerical evidence of threshold effects in the alignment dynamics due to the random parameters is given. The use of a selective model predictive control permits steering of the system towards the desired state even in unstable regimes.
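The effect of an uncertain interaction strength can be seen even in a two-agent Cucker-Smale toy model: sample the strength K from a uniform range (plain Monte Carlo here, rather than the paper's gPC expansion; the range is invented) and observe how the final velocity gap varies:

```python
import random

def cucker_smale_two(v1, v2, K, steps=200, dt=0.05):
    """Two-agent Cucker-Smale alignment with a constant communication rate:
    each Euler step shrinks the velocity gap by the factor (1 - 2*K*dt)."""
    for _ in range(steps):
        dv = K * (v2 - v1)
        v1, v2 = v1 + dt * dv, v2 - dt * dv
    return v1, v2

def alignment_spread(n=1000, seed=5):
    """Propagate K ~ U(0.05, 0.5) and report min/max final velocity gaps."""
    rng = random.Random(seed)
    gaps = []
    for _ in range(n):
        v1, v2 = cucker_smale_two(0.0, 1.0, rng.uniform(0.05, 0.5))
        gaps.append(abs(v2 - v1))
    return min(gaps), max(gaps)

lo_gap, hi_gap = alignment_spread()
```

Weak coupling leaves a visible residual velocity gap while strong coupling aligns the agents almost exactly; this regime dependence is what a gPC surrogate resolves far more cheaply than brute-force sampling.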
Conceptual Model of Artifacts for Design Science Research
DEFF Research Database (Denmark)
Bækgaard, Lars
2015-01-01
We present a conceptual model of design science research artifacts. The model views an artifact at three levels. At the artifact level a selected artifact is viewed as a combination of material and immaterial aspects and a set of representations hereof. At the design level the selected artifact...... is viewed through its design in terms of descriptions, models, prototypes etc. At the knowledge level the selected artifact is viewed through ontologies, categories and various types of relevant knowledge. The model is based on description...
Westerberg, I.; Birkel, C.
2012-04-01
Knowledge about hydrological processes and the spatial and temporal distribution of water resources is the basis for water management purposes such as hydropower, agriculture and flood protection. Conceptual hydrological models may be used to infer knowledge on catchment functioning, but are affected by uncertainties in the model representation of reality as well as in the observational data used to drive the model and to evaluate model performance. Therefore, meaningful hypothesis testing of the hydrological functioning of a catchment requires such uncertainties to be carefully estimated and accounted for in model calibration and evaluation. We investigated the hydrological functioning of the relatively data-scarce tropical Sarapiqui catchment in Costa Rica, Central America, where water resources play a vital role in hydropower production and livelihoods. Hypotheses on catchment functioning, represented by different model structures, were tested within an uncertainty estimation framework specifically accounting for observational uncertainties. The uncertainty in discharge data was estimated from a rating-curve analysis, and precipitation measurement errors through scenarios relating the error to, for example, the elevation gradient. The suitability of the different model structures as hypotheses about the functioning of the catchment was evaluated in a posterior analysis of the simulations. The performance of each simulation relative to the observational uncertainties was analysed for the entire hydrograph as well as for different aspects of the hydrograph (e.g. peak flows, recession periods, and base flow). This analysis enabled the identification of periods of likely model-structural errors and periods of probable data errors. We conclude that accounting for observational uncertainties led to improved hypothesis testing, which resulted in less risk of rejecting an acceptable model structure because of uncertainties in the forcing and evaluation data.
Space Surveillance Network Scheduling Under Uncertainty: Models and Benefits
Valicka, C.; Garcia, D.; Staid, A.; Watson, J.; Rintoul, M.; Hackebeil, G.; Ntaimo, L.
2016-09-01
Advances in space technologies continue to reduce the cost of placing satellites in orbit. With more entities operating space vehicles, the number of orbiting vehicles and debris has reached unprecedented levels and continues to grow. Sensor operators responsible for maintaining the space catalog and providing space situational awareness face increasingly complex and demanding scheduling requirements. Despite these trends, a lack of advanced tools continues to prevent sensor planners and operators from fully utilizing space surveillance resources. One key challenge involves optimally selecting sensors from a network of varying capabilities for missions with differing requirements. Another open challenge, the primary focus of our work, is building robust schedules that effectively plan for uncertainties associated with weather, ad hoc collections, and other target uncertainties. Existing tools and techniques are not amenable to rigorous analysis of schedule optimality and do not adequately address these challenges. Building on prior research, we have developed stochastic mixed-integer linear optimization models to address uncertainty due to weather's effect on collection quality. By making use of the open source Pyomo optimization modeling software, we have posed and solved sensor network scheduling models addressing both forms of uncertainty. We present herein models that allow for concurrent scheduling of collections with the same sensor configuration and for proactively scheduling against uncertain ad hoc collections. The suitability of stochastic mixed-integer linear optimization for building sensor network schedules under different run-time constraints will be discussed.
Reducing uncertainty based on model fitness: Application to a ...
African Journals Online (AJOL)
2015-01-07
Jan 7, 2015 ... This general methodology is applied to a reservoir model of the Okavango ... Global sensitivity and uncertainty analysis (GSA/UA) ... and weighing risks between decisions (Saltelli et al., 2008) ...
Model parameter uncertainty analysis for an annual field-scale P loss model
Bolster, Carl H.; Vadas, Peter A.; Boykin, Debbie
2016-08-01
Phosphorus (P) fate and transport models are important tools for developing and evaluating conservation practices aimed at reducing P losses from agricultural fields. Because all models are simplifications of complex systems, there will exist an inherent amount of uncertainty associated with their predictions. It is therefore important that efforts be directed at identifying, quantifying, and communicating the different sources of model uncertainty. In this study, we conducted an uncertainty analysis with the Annual P Loss Estimator (APLE) model. Our analysis included calculating parameter uncertainties and confidence and prediction intervals for five internal regression equations in APLE. We also estimated uncertainties of the model input variables based on values reported in the literature. We then predicted P loss for a suite of fields under different management and climatic conditions while accounting for uncertainties in the model parameters and inputs, and compared the relative contributions of these two sources of uncertainty to the overall uncertainty associated with predictions of P loss. Both the overall magnitude of the prediction uncertainties and the relative contributions of the two sources of uncertainty varied depending on management practices and field characteristics. This was due to differences in the number of model input variables and the uncertainties in the regression equations associated with each P loss pathway. Inspection of the uncertainties in the five regression equations brought attention to a previously unrecognized limitation in the equation used to partition surface-applied fertilizer P between leaching and runoff losses. As a result, an alternate equation was identified that provided similar predictions with much less uncertainty. Our results demonstrate how a thorough uncertainty and model residual analysis can be used to identify limitations with a model. Such insight can then be used to guide future data collection and model
Uncertainty Modeling Based on Bayesian Network in Ontology Mapping
Institute of Scientific and Technical Information of China (English)
LI Yuhua; LIU Tao; SUN Xiaolin
2006-01-01
How to deal with uncertainty is crucial in exact concept mapping between ontologies. This paper presents a new framework for modeling uncertainty in ontologies based on Bayesian networks (BN). In our approach, the Ontology Web Language (OWL) is extended with probabilistic markups for attaching probability information; the source and target ontologies (expressed in the extended OWL) are translated into Bayesian networks, and the mapping between the two ontologies can be derived by constructing the conditional probability tables (CPTs) of the BN using an improved algorithm named I-IPFP, based on the iterative proportional fitting procedure (IPFP). The basic ideas of this framework and algorithm are validated by positive results from computer experiments.
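The IPFP building block referenced above is simple to sketch: alternately rescale the rows and columns of a joint table until its marginals match given targets. The paper's I-IPFP variant adds constraints for CPT construction; this is only plain IPFP on a 2x2 table with made-up marginals:

```python
def ipfp(table, row_targets, col_targets, iters=50):
    """Iterative proportional fitting: rescale rows then columns of a joint
    probability table until the marginals match the given targets."""
    for _ in range(iters):
        for i, t in enumerate(row_targets):              # fit row sums
            s = sum(table[i])
            table[i] = [x * t / s for x in table[i]]
        for j, t in enumerate(col_targets):              # fit column sums
            s = sum(row[j] for row in table)
            for row in table:
                row[j] *= t / s
    return table

# Start from a uniform joint table and impose marginals (0.3, 0.7) / (0.6, 0.4)
fitted = ipfp([[0.25, 0.25], [0.25, 0.25]], [0.3, 0.7], [0.6, 0.4])
```

IPFP preserves the interaction structure of the starting table while matching the target marginals, which is why it is a natural fit for filling CPTs from partial probability information.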
Uncertainty of Modal Parameters Estimated by ARMA Models
DEFF Research Database (Denmark)
Jensen, Jacob Laigaard; Brincker, Rune; Rytter, Anders
1990-01-01
In this paper the uncertainties of identified modal parameters such as eigenfrequencies and damping ratios are assessed. From the measured response of dynamically excited structures the modal parameters may be identified and provide important structural knowledge. However, the uncertainty of the parameters...... by a simulation study of a lightly damped single-degree-of-freedom system. Identification by ARMA models has been chosen as the system identification method. It is concluded that both the sampling interval and the number of sampled points may play a significant role with respect to the statistical errors. Furthermore...
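The link between ARMA coefficients and modal parameters can be made concrete for a single-degree-of-freedom system: an AR(2) pole z maps to a continuous-time pole λ = ln(z)/Δt, from which the eigenfrequency and damping ratio follow. The round-trip check below uses a hypothetical 2 Hz mode with 2% damping, not data from the paper:

```python
import cmath
import math

def modal_from_ar2(a1, a2, dt):
    """Eigenfrequency (Hz) and damping ratio from the AR(2) model
    x_t = a1*x_{t-1} + a2*x_{t-2} + e_t sampled at interval dt."""
    z = (a1 + cmath.sqrt(a1 * a1 + 4.0 * a2)) / 2.0  # discrete-time pole
    lam = cmath.log(z) / dt                           # continuous-time pole
    omega = abs(lam)                                  # natural freq. (rad/s)
    return omega / (2.0 * math.pi), -lam.real / omega

# Round trip: build AR(2) coefficients from a known mode, then recover it
omega = 2.0 * math.pi * 2.0                           # 2 Hz natural frequency
lam = complex(-0.02 * omega, omega * math.sqrt(1.0 - 0.02 ** 2))
z = cmath.exp(lam * 0.01)                             # dt = 0.01 s
f_est, zeta_est = modal_from_ar2(2.0 * z.real, -abs(z) ** 2, 0.01)
```

Sampling-interval and record-length effects enter through the statistical scatter of the estimated a1, a2, which this mapping then propagates into the modal parameters; that propagation is the paper's focus.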
Evaluating the uncertainty of input quantities in measurement models
Possolo, Antonio; Elster, Clemens
2014-06-01
The Guide to the Expression of Uncertainty in Measurement (GUM) gives guidance about how values and uncertainties should be assigned to the input quantities that appear in measurement models. This contribution offers a concrete proposal for how that guidance may be updated in light of the advances in the evaluation and expression of measurement uncertainty that were made in the course of the twenty years that have elapsed since the publication of the GUM, and also considering situations that the GUM does not yet contemplate. Our motivation is the ongoing conversation about a new edition of the GUM. While generally we favour a Bayesian approach to uncertainty evaluation, we also recognize the value that other approaches may bring to the problems considered here, and focus on methods for uncertainty evaluation and propagation that are widely applicable, including to cases that the GUM has not yet addressed. In addition to Bayesian methods, we discuss maximum-likelihood estimation, robust statistical methods, and measurement models where values of nominal properties play the same role that input quantities play in traditional models. We illustrate these general-purpose techniques in concrete examples, employing data sets that are realistic but that also are of conveniently small sizes. The supplementary material available online lists the R computer code that we have used to produce these examples (stacks.iop.org/Met/51/3/339/mmedia). Although we strive to stay close to clause 4 of the GUM, which addresses the evaluation of uncertainty for input quantities, we depart from it as we review the classes of measurement models that we believe are generally useful in contemporary measurement science. We also considerably expand and update the treatment that the GUM gives to Type B evaluations of uncertainty: reviewing the state-of-the-art, disciplined approach to the elicitation of expert knowledge, and its encapsulation in probability distributions that are usable in
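The propagation of input-quantity uncertainty through a measurement model can be sketched with the Monte Carlo method of GUM Supplement 1. The resistance example below is a textbook illustration with invented numbers, not one of the paper's examples:

```python
import random
import statistics

def propagate(model, draws, n=20000, seed=11):
    """Monte Carlo propagation: sample each input quantity from its assigned
    distribution, evaluate the measurement model, summarize the output."""
    rng = random.Random(seed)
    ys = [model(*[d(rng) for d in draws]) for _ in range(n)]
    return statistics.fmean(ys), statistics.stdev(ys)

# Measurement model R = V / I with V ~ N(10 V, 0.05 V) and I ~ N(2 A, 0.02 A)
mean_R, u_R = propagate(lambda v, i: v / i,
                        [lambda r: r.gauss(10.0, 0.05),
                         lambda r: r.gauss(2.0, 0.02)])
```

First-order GUM propagation gives u(R) ≈ 5·sqrt((0.05/10)² + (0.02/2)²) ≈ 0.056 Ω, which the Monte Carlo estimate reproduces; for strongly nonlinear models or non-normal inputs the two approaches diverge, which is one motivation for the updated guidance discussed in the abstract.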
Scientific and conceptual flaws of coercive treatment models in addiction.
Uusitalo, Susanne; van der Eijk, Yvette
2016-01-01
In conceptual debates on addiction, neurobiological research has been used to support the idea that addicted drug users lack control over their addiction-related actions. In some interpretations, this has led to coercive treatment models in which the purpose is to 'restore' control. However, neurobiological studies that go beyond what is typically presented in conceptual debates paint a different story. In particular, they indicate that although addiction has neurobiological manifestations that make the addictive behaviour difficult to control, it is possible for individuals to reverse these manifestations through their own efforts. Thus, addicted individuals should not be considered incapable of making choices voluntarily simply on the basis that addiction has neurobiological manifestations, and coercive treatment models of addiction should be reconsidered in this respect.
Creation of Sustainable Leadership Development: Conceptual Model Validation
Directory of Open Access Journals (Sweden)
Judita Peterlin
2013-01-01
Full Text Available This conceptual paper addresses the research question: How can leadership development be managed within organizations? Our proposed answer is presented in the form of a conceptual model of sustainable leadership development, based on the theory of multiple intelligences by Howard Gardner and applied to leadership through appreciative inquiry, meaning that leaders possess multiple intelligences, differ in their individual profiles, and are able to develop a wide span of intelligences during their life span. The main developmental and analytical method that enables sustainable leadership development through multiple intelligences is action learning, where, as expected results of the appreciative process, participants are creative seekers of positive learning opportunities in an active learning environment. The sustainable leadership development model proposes a new creative way of providing for the process and content of leadership development that has sustainability as its core component.
Accounting for Epistemic Uncertainty in PSHA: Logic Tree and Ensemble Model
Taroni, M.; Marzocchi, W.; Selva, J.
2014-12-01
The logic tree scheme is the probabilistic framework that has been widely used in recent decades to take epistemic uncertainties into account in probabilistic seismic hazard analysis (PSHA). Notwithstanding the vital importance for PSHA of incorporating epistemic uncertainties properly, we argue that the use of the logic tree in a PSHA context has conceptual and practical drawbacks. Although some of these drawbacks have been reported in the past, a careful evaluation of their impact on PSHA is still lacking. This is the goal of the present work. In brief, we show that i) PSHA practice does not meet the assumptions that stand behind the logic tree scheme; ii) the output of a logic tree is often misinterpreted and/or misleading, e.g., the use of percentiles (median included) in a logic tree scheme raises theoretical difficulties from a probabilistic point of view; iii) even in cases where the assumptions behind a logic tree are actually met, several problems arise in testing any PSHA model. We suggest a different strategy - based on ensemble modeling - to account for epistemic uncertainties in a more proper probabilistic framework. Finally, we show that in many practical PSHA applications, the logic tree is de facto loosely applied to build sound ensemble models.
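As a minimal sketch of the ensemble idea the abstract advocates, alternative hazard curves can be mixed with weights expressing degrees of belief, rather than read off as logic tree branch percentiles. All numbers below are illustrative assumptions, not values from the paper:

```python
import numpy as np

# Hypothetical exceedance probabilities (hazard curves) from three
# alternative hazard models, evaluated at the same PGA levels.
pga = np.array([0.1, 0.2, 0.4, 0.8])           # peak ground acceleration, g
curves = np.array([
    [0.30, 0.12, 0.04, 0.010],
    [0.25, 0.10, 0.03, 0.008],
    [0.35, 0.15, 0.05, 0.015],
])
weights = np.array([0.5, 0.3, 0.2])            # degrees of belief, sum to 1

# Ensemble model: the weighted mixture of the alternative curves,
# interpreted as the mean of the distribution of hazard estimates.
ensemble = weights @ curves
print(dict(zip(pga, np.round(ensemble, 4))))
```

The key difference from the logic tree reading is interpretive: each curve is treated as a sample from a distribution of models, and the weighted mixture summarizes that distribution, which also makes the ensemble testable against observed exceedance frequencies.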
Uncertainty modelling of critical column buckling for reinforced concrete buildings
Indian Academy of Sciences (India)
Kasim A Korkmaz; Fuat Demir; Hamide Tekeli
2011-04-01
Buckling is a critical issue for structural stability in structural design. In most buckling analyses, the applied loads and the structural and material properties are treated as certain. In reality, however, these parameters are uncertain. A prognostic solution is therefore necessary, and uncertainties have to be considered. Fuzzy logic algorithms can generate more dependable results in such cases. This study investigates material uncertainties in column design and proposes an uncertainty model for critical column buckling in reinforced concrete buildings. A fuzzy logic algorithm was employed. Lower and upper bounds of the elastic modulus, representing the material properties, were defined to take uncertainties into account. The results show that uncertainties play an important role in stability analyses and should be considered in design. The proposed approach is applicable to both future numerical and experimental research. The results also show that the calculated buckling loads stay within the lower and upper bounds, while different code formulas yield different load values for the same concrete strength.
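The interval (α-cut) bounds on the elastic modulus can be pushed through the Euler buckling formula P_cr = π²EI/(KL)² directly, because P_cr is monotone in E: the endpoints of the E interval map to the endpoints of the load interval. The column dimensions below are illustrative assumptions, not values from the study:

```python
import math

def euler_buckling_load(E, I, K, L):
    """Euler critical buckling load P_cr = pi^2 * E * I / (K * L)^2."""
    return math.pi ** 2 * E * I / (K * L) ** 2

# Illustrative pin-ended 3 m reinforced concrete column.
I = 6.75e-4          # m^4, second moment of area (assumed)
K, L = 1.0, 3.0      # effective-length factor, column length in m

# Interval (an alpha-cut of a fuzzy number) for the elastic modulus,
# representing material uncertainty around a nominal 30 GPa.
E_lower, E_upper = 25e9, 35e9    # Pa

P_lower = euler_buckling_load(E_lower, I, K, L)
P_upper = euler_buckling_load(E_upper, I, K, L)
print(f"P_cr in [{P_lower/1e6:.2f}, {P_upper/1e6:.2f}] MN")
```

Repeating this for each α-cut of the fuzzy elastic modulus yields the fuzzy membership of the buckling load, which is the sense in which calculated loads "stay within the lower and upper bounds" above.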
Influence of model reduction on uncertainty of flood inundation predictions
Romanowicz, R. J.; Kiczko, A.; Osuch, M.
2012-04-01
Derivation of flood risk maps requires an estimation of the maximum inundation extent for a flood with an assumed probability of exceedance, e.g. a 100- or 500-year flood. The results of numerical simulations of flood wave propagation are used to overcome the lack of relevant observations. In practice, deterministic 1-D models are used for flow routing, giving a simplified image of the flood wave propagation process. The solution of a 1-D model depends on the simplifications to the model structure, the initial and boundary conditions, and the estimates of model parameters, which are usually identified by solving the inverse problem based on the available noisy observations. Therefore, there is a large uncertainty involved in the derivation of flood risk maps. In this study we examine the influence of model structure simplifications on estimates of flood extent for an urban river reach. As the study area we chose the Warsaw reach of the River Vistula, where nine bridges and several dikes are located. The aim of the study is to examine the influence of water structures on the derived model roughness parameters, with all the bridges and dikes taken into account, with a reduced number, and without any water infrastructure. The results indicate that the roughness parameter values of a 1-D HEC-RAS model can be adjusted to compensate for the reduction in model structure; however, the price we pay is the model's robustness. Apart from the relatively simple question of reducing model structure, we also try to answer more fundamental questions regarding the relative importance of input, model structure simplification, parametric and rating curve uncertainty to the uncertainty of flood extent estimates. We apply pseudo-Bayesian methods of uncertainty estimation and Global Sensitivity Analysis as the main methodological tools. The results indicate that the uncertainties have a substantial influence on flood risk assessment. In the paper we present a simplified methodology allowing the influence of
Aircraft conceptual design modelling incorporating reliability and maintainability predictions
Vaziry-Zanjany , Mohammad Ali (F)
1996-01-01
A computer-assisted conceptual aircraft design program (CACAD) has been developed. It has an optimisation capability, with an extensive breakdown of maintenance costs. CACAD's aim is to optimise the size and configuration of turbofan-powered transport aircraft. A methodology was developed to enhance the reliability of current aircraft systems, and was applied to avionics systems. R&M models of thermal management were developed and linked with avionics failure rate and its ma...
Motivation to Improve Work through Learning: A Conceptual Model
Kueh Hua Ng; Rusli Ahmad
2014-01-01
This study aims to enhance our current understanding of the transfer of training by proposing a conceptual model that supports the mediating role of motivation to improve work through learning in the relationship between social support and the transfer of training. The examination of the motivation to improve work through learning construct offers a holistic view pertaining to a learner's profile in a workplace setting, which emphasizes learning for the imp...
Reliable Estimation of Prediction Uncertainty for Physicochemical Property Models.
Proppe, Jonny; Reiher, Markus
2017-07-11
One of the major challenges in computational science is to determine the uncertainty of a virtual measurement, that is, the prediction of an observable based on calculations. As highly accurate first-principles calculations are in general infeasible for most physical systems, one usually resorts to parametric property models of observables, which require calibration by incorporating reference data. The resulting predictions and their uncertainties are sensitive to systematic errors such as inconsistent reference data, parametric model assumptions, or inadequate computational methods. Here, we discuss the calibration of property models in the light of bootstrapping, a sampling method that can be employed for identifying systematic errors and for reliable estimation of the prediction uncertainty. We apply bootstrapping to assess a linear property model linking the (57)Fe Mössbauer isomer shift to the contact electron density at the iron nucleus for a diverse set of 44 molecular iron compounds. The contact electron density is calculated with 12 density functionals across Jacob's ladder (PWLDA, BP86, BLYP, PW91, PBE, M06-L, TPSS, B3LYP, B3PW91, PBE0, M06, TPSSh). We provide systematic-error diagnostics and reliable, locally resolved uncertainties for isomer-shift predictions. Pure and hybrid density functionals yield average prediction uncertainties of 0.06-0.08 mm s(-1) and 0.04-0.05 mm s(-1), respectively, the latter being close to the average experimental uncertainty of 0.02 mm s(-1). Furthermore, we show that both model parameters and prediction uncertainty depend significantly on the composition and number of reference data points. Accordingly, we suggest that rankings of density functionals based on performance measures (e.g., the squared coefficient of correlation, r(2), or the root-mean-square error, RMSE) should not be inferred from a single data set. This study presents the first statistically rigorous calibration analysis for theoretical M
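The nonparametric bootstrap described above can be sketched for a linear calibration model: resample the reference data with replacement, refit, and read the spread of predictions at a new point as the prediction uncertainty. The data below are synthetic stand-ins for the (contact density, isomer shift) pairs, not the paper's data set:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic reference data: a linear relation with Gaussian noise,
# 44 points mirroring the size of the compound set in the paper.
x = np.linspace(0.0, 1.0, 44)
y = 1.5 - 2.0 * x + rng.normal(0.0, 0.05, size=x.size)

def fit(xs, ys):
    """Ordinary least-squares (slope, intercept) via the normal equations."""
    A = np.vstack([xs, np.ones_like(xs)]).T
    coef, *_ = np.linalg.lstsq(A, ys, rcond=None)
    return coef

# Nonparametric bootstrap: refit on resampled reference data and collect
# predictions at a new calibration point x_new.
x_new, preds = 0.3, []
for _ in range(2000):
    idx = rng.integers(0, x.size, size=x.size)
    m, b = fit(x[idx], y[idx])
    preds.append(m * x_new + b)

preds = np.asarray(preds)
print(f"prediction {preds.mean():.3f} +/- {preds.std(ddof=1):.3f}")
```

A strongly data-set-dependent spread of the bootstrapped parameters is exactly the diagnostic the authors use to argue against ranking functionals from a single reference set.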
Conceptual Model of Climate Change Impacts at LANL
Energy Technology Data Exchange (ETDEWEB)
Dewart, Jean Marie [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2016-05-17
Goal 9 of the LANL FY15 Site Sustainability Plan (LANL 2014a) addresses Climate Change Adaptation. As part of Goal 9, the plan reviews many of the individual programs the Laboratory has initiated over the past 20 years to address climate change impacts to LANL (e.g. Wildland Fire Management Plan, Forest Management Plan, etc.). However, at that time, LANL did not yet have a comprehensive approach to climate change adaptation. To fill this gap, the FY15 Work Plan for the LANL Long Term Strategy for Environmental Stewardship and Sustainability (LANL 2015) included a goal of (1) establishing a comprehensive conceptual model of climate change impacts at LANL and (2) establishing specific climate change indices to measure climate change and impacts at Los Alamos. Establishing a conceptual model of climate change impacts will demonstrate that the Laboratory is addressing climate change impacts in a comprehensive manner. This paper fulfills the requirement of goal 1. The establishment of specific indices of climate change at Los Alamos (goal 2), will improve our ability to determine climate change vulnerabilities and assess risk. Future work will include prioritizing risks, evaluating options/technologies/costs, and where appropriate, taking actions. To develop a comprehensive conceptual model of climate change impacts, we selected the framework provided in the National Oceanic and Atmospheric Administration (NOAA) Climate Resilience Toolkit (http://toolkit.climate.gov/).
A simplified model of choice behavior under uncertainty
Directory of Open Access Journals (Sweden)
Ching-Hung Lin
2016-08-01
Full Text Available The Iowa Gambling Task (IGT) has been standardized as a clinical assessment tool (Bechara, 2007). Nonetheless, numerous research groups have attempted to modify IGT models to optimize parameters for predicting the choice behavior of normal controls and patients. A decade ago, most researchers considered the expected utility (EU) model (Busemeyer and Stout, 2002) to be the optimal model for predicting choice behavior under uncertainty. In recent years, however, studies have demonstrated the prospect utility (PU) models (Ahn et al., 2008) to be more effective than the EU models in the IGT. Nevertheless, after some preliminary tests, we propose that the PU model of Ahn et al. (2008) is not optimal, owing to some incompatible results between our behavioral and modeling data. This study modifies the PU model of Ahn et al. (2008) into a simplified model and collects 145 subjects' IGT performance as benchmark data for comparison. In our simplified PU model, the best goodness-of-fit was mostly found as α approached zero. More specifically, we retested the key parameters α, λ, and A in the PU model. Notably, the power of influence of the parameters α, λ, and A follows a hierarchical order in terms of manipulating the goodness-of-fit in the PU model. Additionally, we found that the parameters λ and A may be ineffective when the parameter α is close to zero. The present simplified model demonstrated that decision makers mostly adopted a gain-stay-loss-shift strategy rather than foreseeing the long-term outcome. However, other behavioral variables remain that are not well revealed under these dynamic uncertainty situations, so the optimal behavioral model may not yet have been found. In short, the best model for predicting choice behavior under dynamic uncertainty situations should be further evaluated.
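A minimal sketch of the prospect-utility valuation at the heart of these models (the standard prospect-theory form, not the authors' full learning model) helps show why λ becomes ineffective as α approaches zero: the utility of any gain collapses toward 1 and of any loss toward -λ, so only the sign of the outcome matters, consistent with a gain-stay-loss-shift strategy:

```python
def prospect_utility(x, alpha, lam):
    """Prospect-theory utility: u(x) = x^alpha for gains,
    -lam * (-x)^alpha for losses (0 < alpha <= 1, lam >= 1)."""
    if x >= 0:
        return x ** alpha
    return -lam * (-x) ** alpha

# With a moderate alpha, outcome magnitude matters strongly.
print(prospect_utility(100, 0.5, 2.25), prospect_utility(-100, 0.5, 2.25))

# With alpha near zero, magnitudes become nearly irrelevant: every gain
# is valued close to 1 and every loss close to -lam, so choices reduce
# to reacting to the sign of the last outcome (gain-stay-loss-shift).
print(prospect_utility(100, 0.01, 2.25), prospect_utility(1, 0.01, 2.25))
```

The parameter A in the full model scales the learning-rate side of the valuation; the sketch above covers only the utility function itself.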
What is recovery? A conceptual model and explication.
Jacobson, N; Greenley, D
2001-04-01
This paper describes a conceptual model of recovery from mental illness developed to aid the state of Wisconsin in moving toward its goal of developing a "recovery-oriented" mental health system. In the model, recovery refers to both internal conditions experienced by persons who describe themselves as being in recovery--hope, healing, empowerment, and connection--and external conditions that facilitate recovery--implementation of the principle of human rights, a positive culture of healing, and recovery-oriented services. The aim of the model is to link the abstract concepts that define recovery with specific strategies that systems, agencies, and individuals can use to facilitate it.
IMM: a multisystem memory model for conceptual learning.
Saleh, Mai Sabry
2014-01-01
Concepts of learning and memory are closely related, and the terms often describe the same processes. Conceptual learning is known to be the process of developing abstract rules or mental constructs based on sensory experience. The Integrated Model of Mind (IMM), introduced in the present work, is a theoretical multisystem memory model that describes how concepts are formed. The IMM in its design arranges memory systems after their function in an integrated and harmonized sequence. It provides answers to some limitations of Tulving's serial-parallel-independent (SPI) model and suggests a new assumption with respect to mental representation and image schema construction through the process of encoding.
Conceptual Model of IT Infrastructure Capability and Its Empirical Justification
Institute of Scientific and Technical Information of China (English)
QI Xianfeng; LAN Boxiong; GUO Zhenwei
2008-01-01
Increasing importance has been attached to the value of information technology (IT) infrastructure in today's organizations. The development of efficacious IT infrastructure capability enhances business performance and brings sustainable competitive advantage. This study analyzed IT infrastructure capability in a holistic way and then presented a conceptual model of IT capability. IT infrastructure capability was categorized into sharing capability, service capability, and flexibility. The study then empirically tested the model using survey data collected from 145 firms. Three factors emerged from the factor analysis: IT flexibility, IT service capability, and IT sharing capability, which agree with those in the conceptual model built in this study.
An educational model for ensemble streamflow simulation and uncertainty analysis
Directory of Open Access Journals (Sweden)
A. AghaKouchak
2012-06-01
Full Text Available This paper presents a hands-on modeling toolbox, HBV-Ensemble, designed as a complement to theoretical hydrology lectures, to teach hydrological processes and their uncertainties. HBV-Ensemble can be used for in-class lab practices and homework assignments, and for assessment of students' understanding of hydrological processes. Using this model, students can gain more insight into how hydrological processes (e.g., precipitation, snowmelt and snow accumulation, soil moisture, evapotranspiration and runoff generation) are interconnected. The model includes a MATLAB Graphical User Interface (GUI) and an ensemble simulation scheme that can be used not only for hydrological processes, but also for teaching uncertainty analysis, parameter estimation, ensemble simulation and model sensitivity.
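HBV-Ensemble itself is a MATLAB GUI; the sketch below only illustrates the underlying teaching idea, running a drastically simplified conceptual bucket model over an ensemble of sampled parameters and summarizing the spread of simulated runoff (all numbers are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(42)

precip = np.array([5.0, 0.0, 12.0, 3.0, 0.0, 8.0])   # mm/day, illustrative

def bucket_runoff(precip, k, capacity):
    """Minimal HBV-like soil bucket: rain fills the store up to a
    capacity, and a linear reservoir releases runoff each day."""
    storage, runoff = 0.0, []
    for p in precip:
        storage = min(storage + p, capacity)   # soil moisture accounting
        q = k * storage                        # linear outflow
        storage -= q
        runoff.append(q)
    return np.array(runoff)

# Ensemble: sample the recession coefficient and storage capacity from
# plausible ranges, run one simulation per member, summarize the band.
ensemble = np.array([
    bucket_runoff(precip, rng.uniform(0.1, 0.5), rng.uniform(20.0, 60.0))
    for _ in range(500)
])
lo, hi = np.percentile(ensemble, [5, 95], axis=0)
print("5-95% runoff band per day:", np.round(lo, 2), np.round(hi, 2))
```

Plotting the band against a single "observed" series is the classroom exercise: students see directly how parameter uncertainty translates into predictive uncertainty.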
Systematic Uncertainties in High-Energy Hadronic Interaction Models
Zha, M.; Knapp, J.; Ostapchenko, S.
2003-07-01
Hadronic interaction models for cosmic ray energies are uncertain since our knowledge of hadronic interactions is extrapolated from accelerator experiments at much lower energies. At present most high-energy models are based on the Gribov-Regge theory of multi-Pomeron exchange, which provides a theoretical framework to evaluate cross sections and particle production. While experimental data constrain some of the model parameters, others are not well determined and are therefore a source of systematic uncertainties. In this paper we evaluate the variation of results obtained with the QGSJET model when modifying parameters relating to three major sources of uncertainty: the form of the parton structure function, the role of diffractive interactions, and the string hadronisation. Results on inelastic cross sections, secondary particle production and air shower development are discussed.
Conceptual Model of Quantities, Units, Dimensions, and Values
Rouquette, Nicolas F.; DeKoenig, Hans-Peter; Burkhart, Roger; Espinoza, Huascar
2011-01-01
JPL collaborated with experts from industry and other organizations to develop a conceptual model of quantities, units, dimensions, and values based on the current work of the ISO 80000 committee revising the International System of Units & Quantities based on the International Vocabulary of Metrology (VIM). By providing support for ISO 80000 in SysML via the International Vocabulary of Metrology (VIM), this conceptual model provides, for the first time, a standards-based approach for addressing issues of unit coherence and dimensional analysis in the practice of systems engineering with SysML-based tools. This conceptual model provides support for two kinds of analyses specified in the International Vocabulary of Metrology (VIM): coherence of units as well as of systems of units, and dimension analysis of systems of quantities. To provide a solid and stable foundation, the model for defining quantities, units, dimensions, and values in SysML is explicitly based on the concepts defined in VIM. At the same time, the model library is designed in such a way that extensions to the ISQ (International System of Quantities) and SI Units (Système International d'Unités) can be represented, as well as any alternative systems of quantities and units. The model library can be used to support SysML user models in various ways. A simple approach is to define and document libraries of reusable systems of units and quantities for reuse across multiple projects, and to link units and quantity kinds from these libraries to Unit and QuantityKind stereotypes defined in SysML user models.
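The dimension analysis the model supports can be sketched by representing each quantity dimension as a vector of exponents over the ISQ base quantities; multiplying quantities adds exponents, and a coherence check is an equality test on the resulting vectors. This is a minimal illustration of the VIM-style analysis, not the SysML model library itself:

```python
# Dimensions as exponent maps over base quantities (L, M, T, ...).
def dim(**exps):
    """Build a dimension from its nonzero base-quantity exponents."""
    return {k: v for k, v in exps.items() if v != 0}

def dim_mul(a, b):
    """Dimension of a product: add exponents, drop zeros."""
    out = dict(a)
    for k, v in b.items():
        out[k] = out.get(k, 0) + v
    return {k: v for k, v in out.items() if v != 0}

def dim_pow(a, n):
    """Dimension of a power: scale every exponent by n."""
    return {k: v * n for k, v in a.items()}

LENGTH, MASS, TIME = dim(L=1), dim(M=1), dim(T=1)

# Coherence checks: velocity has dimension L T^-1, force M L T^-2.
velocity = dim_mul(LENGTH, dim_pow(TIME, -1))
force = dim_mul(MASS, dim_mul(LENGTH, dim_pow(TIME, -2)))
print(velocity, force)
```

Unit coherence works the same way one level up: a derived unit is coherent with a system of units when its exponent vector over the base units matches the dimension of the quantity it measures.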
Extended Range Hydrological Predictions: Uncertainty Associated with Model Parametrization
Joseph, J.; Ghosh, S.; Sahai, A. K.
2016-12-01
The better understanding of various atmospheric processes has led to improved predictions of meteorological conditions at various temporal scales, ranging from short term, covering a period of up to 2 days, to long term, covering a period of more than 10 days. Accurate prediction of hydrological variables can be made using these predicted meteorological conditions, which would be helpful in the proper management of water resources. Extended range hydrological simulation includes the prediction of hydrological variables for a period of more than 10 days. The main sources of uncertainty in hydrological predictions include the uncertainty in the initial conditions, the meteorological forcing and the model parametrization. In the present study, the Extended Range Prediction developed for the Indian monsoon by the Indian Institute of Tropical Meteorology (IITM), Pune is used as meteorological forcing for the Variable Infiltration Capacity (VIC) model. Sensitive hydrological parameters, as identified from the literature, along with a few vegetation parameters are assumed to be uncertain, and 1000 random values are generated within their prescribed ranges. Uncertainty bands are generated by performing Monte Carlo Simulations (MCS) for the generated parameter sets and the observed meteorological forcings. Basins with minimum human intervention within the Indian Peninsular region are identified, and validation of the results is carried out using observed gauge discharge. Further, uncertainty bands are generated for the extended range hydrological predictions by performing MCS for the same set of parameters and the extended range meteorological predictions. The results demonstrate the uncertainty associated with model parametrization for extended range hydrological simulations. Keywords: Extended Range Prediction, Variable Infiltration Capacity model, Monte Carlo Simulation.
Formal modeling of a system of chemical reactions under uncertainty.
Ghosh, Krishnendu; Schlipf, John
2014-10-01
We describe a novel formalism representing a system of chemical reactions, with imprecise reaction rates and chemical concentrations, and describe a model reduction method, pruning, based on the chemical properties. We present two algorithms, midpoint approximation and interval approximation, for constructing efficient model abstractions with uncertainty in the data. We evaluate computational feasibility by posing queries in computation tree logic (CTL) on a prototype of the extracellular-signal-regulated kinase (ERK) pathway.
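The two abstractions can be illustrated on the simplest uncertain kinetics, first-order decay dA/dt = -kA with an interval-valued rate k. Because the update is monotone, interval endpoints map to endpoints; the midpoint approximation instead collapses the rate interval to its centre. This is a generic sketch of the two ideas, not the authors' formalism:

```python
def interval_decay_step(conc, k_lo, k_hi, dt):
    """One explicit Euler step of dA/dt = -k*A with an interval rate.
    conc is an interval (lo, hi) of possible concentrations; decay is
    monotone, so the fastest rate acts on the lower bound and the
    slowest on the upper bound."""
    lo, hi = conc
    return (lo * (1.0 - k_hi * dt), hi * (1.0 - k_lo * dt))

def midpoint_decay_step(conc_mid, k_lo, k_hi, dt):
    """Midpoint approximation: replace the rate interval by its centre."""
    k_mid = 0.5 * (k_lo + k_hi)
    return conc_mid * (1.0 - k_mid * dt)

# Illustrative: concentration known to lie in [0.9, 1.1], rate in [0.8, 1.2].
interval, mid = (0.9, 1.1), 1.0
for _ in range(5):
    interval = interval_decay_step(interval, 0.8, 1.2, 0.05)
    mid = midpoint_decay_step(mid, 0.8, 1.2, 0.05)
print(interval, mid)
```

The trade-off the paper explores follows the same pattern: the interval abstraction is sound but widens over time, while the midpoint abstraction stays tight but discards the uncertainty.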
Misrepresentation and amendment of soil moisture in conceptual hydrological modelling
Zhuo, Lu; Han, Dawei
2016-04-01
Although many conceptual models are very effective in simulating river runoff, their soil moisture schemes are generally not realistic in comparison with reality (i.e., they get the right answers for the wrong reasons). This study reveals two significant misrepresentations in those models through a case study using the Xinanjiang model, which is representative of many well-known conceptual hydrological models. The first is the setting of the upper limit of soil moisture at the field capacity, due to the 'holding excess runoff' concept (i.e., runoff begins on repletion of the storage to field capacity). The second is the neglect of capillary rise in water movement. A new scheme is therefore proposed to overcome these two issues. The amended model is as effective as its original form in flow modelling, but represents more logically realistic soil water processes. The purpose of the study is to enable the hydrological model to get the right answers for the right reasons; the new model structure therefore has a better capability for assimilating soil moisture observations to enhance real-time flood forecasting accuracy. The new scheme is evaluated in the Pontiac catchment of the USA through a comparison with satellite-observed soil moisture. The correlation between the XAJ model and the observed soil moisture improves significantly, from 0.64 to 0.70. In addition, a new soil moisture term called SMDS (Soil Moisture Deficit to Saturation) is proposed to complement the conventional SMD (Soil Moisture Deficit).
Comparative Analysis of Uncertainties in Urban Surface Runoff Modelling
DEFF Research Database (Denmark)
Thorndahl, Søren; Schaarup-Jensen, Kjeld
2007-01-01
In the present paper a comparison between three different surface runoff models, in the numerical urban drainage tool MOUSE, is conducted. Analysing parameter uncertainty, it is shown that the models are very sensitive with regards to the choice of hydrological parameters when combined overflow volumes are compared - especially when the models are uncalibrated. The occurrences of flooding and surcharge are highly dependent on both hydrological and hydrodynamic parameters. Thus, the conclusion of the paper is that if the use of model simulations is to be a reliable tool for drainage system analysis, further research in improved parameter assessment for surface runoff models is needed.
Nonlinear structural finite element model updating and uncertainty quantification
Ebrahimian, Hamed; Astroza, Rodrigo; Conte, Joel P.
2015-04-01
This paper presents a framework for nonlinear finite element (FE) model updating, in which state-of-the-art nonlinear structural FE modeling and analysis techniques are combined with the maximum likelihood estimation method (MLE) to estimate time-invariant parameters governing the nonlinear hysteretic material constitutive models used in the FE model of the structure. The estimation uncertainties are evaluated based on the Cramer-Rao lower bound (CRLB) theorem. A proof-of-concept example, consisting of a cantilever steel column representing a bridge pier, is provided to verify the proposed nonlinear FE model updating framework.
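The MLE-plus-CRLB recipe in the abstract can be sketched on a toy observation model: for Gaussian noise, the maximum-likelihood estimate is the least-squares fit, and the Cramer-Rao lower bound on the estimator variance is the inverse Fisher information. The linear model below is an illustrative stand-in for the (far larger) hysteretic FE model:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy observation model y_i = theta * x_i + e_i, e_i ~ N(0, sigma^2).
theta_true, sigma = 2.0, 0.3
x = np.linspace(0.1, 1.0, 50)
y = theta_true * x + rng.normal(0.0, sigma, size=x.size)

# Maximum-likelihood estimate (= least squares under Gaussian noise).
theta_hat = (x @ y) / (x @ x)

# Cramer-Rao lower bound: inverse Fisher information. For this model
# the Fisher information is sum(x_i^2) / sigma^2.
fisher_info = (x @ x) / sigma ** 2
crlb_var = 1.0 / fisher_info
print(f"theta_hat = {theta_hat:.3f}, CRLB std = {np.sqrt(crlb_var):.3f}")
```

In the nonlinear FE setting the same bound is evaluated with model sensitivities (the Jacobian of the response with respect to the parameters) in place of x, which is what makes the CRLB a cheap by-product of the updating procedure.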
Verification and Uncertainty Reduction of Amchitka Underground Nuclear Testing Models
Energy Technology Data Exchange (ETDEWEB)
Ahmed Hassan; Jenny Chapman
2006-02-01
The modeling of the Amchitka underground nuclear tests conducted in 2002 is verified, and uncertainty in model input parameters, as well as in predictions, has been reduced using newly collected data obtained by the summer 2004 field expedition of CRESP. Newly collected data that pertain to the groundwater model include magnetotelluric (MT) surveys conducted on the island to determine the salinity and porosity structure of the subsurface, and bathymetric surveys to determine the bathymetric maps of the areas offshore from the Long Shot and Cannikin Sites. Analysis and interpretation of the MT data yielded information on the location of the transition zone, and porosity profiles showing porosity values decaying with depth. These new data sets are used to verify the original model in terms of model parameters, model structure, and model output. In addition, by using the new data along with the existing data (chemistry and head data), the uncertainty in model input and output is decreased by conditioning on all the available data. A Markov Chain Monte Carlo (MCMC) approach is adapted for developing new input parameter distributions conditioned on prior knowledge and new data. The MCMC approach is a form of Bayesian conditioning that is constructed in such a way that it produces samples of the model parameters that eventually converge to a stationary posterior distribution. The Bayesian MCMC approach enhances probabilistic assessment. Instead of simply propagating uncertainty forward from input parameters into model predictions (i.e., the traditional Monte Carlo approach), MCMC propagates uncertainty backward from data onto parameters, and then forward from parameters into predictions. Comparisons between the new data and the original model, and conditioning on all available data using the MCMC method, yield the following results and conclusions: (1) Model structure is verified at Long Shot and Cannikin where the high-resolution bathymetric data collected by CRESP
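The "backward" propagation step can be sketched with a random-walk Metropolis sampler, the simplest MCMC variant: propose a perturbed parameter, accept it with the usual posterior ratio, and read the stationary chain as the conditioned parameter distribution. The data and model below are synthetic illustrations, not the Amchitka head or chemistry data:

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic observations assumed ~ N(theta, 1), with a weak N(0, 10^2)
# prior on the parameter theta (stand-ins for head data and prior
# knowledge of a groundwater model parameter).
data = np.array([4.8, 5.1, 5.3, 4.9, 5.2])

def log_posterior(theta):
    log_prior = -0.5 * (theta / 10.0) ** 2
    log_lik = -0.5 * np.sum((data - theta) ** 2)
    return log_prior + log_lik

# Random-walk Metropolis: propose, accept with probability
# min(1, posterior ratio), otherwise stay put.
theta, chain = 0.0, []
for _ in range(20000):
    prop = theta + rng.normal(0.0, 0.5)
    if np.log(rng.uniform()) < log_posterior(prop) - log_posterior(theta):
        theta = prop
    chain.append(theta)

posterior = np.array(chain[5000:])   # discard burn-in
print(f"posterior mean {posterior.mean():.2f} +/- {posterior.std():.2f}")
```

Feeding these posterior samples back through the forward model gives the prediction uncertainty conditioned on all data, which is the backward-then-forward propagation the abstract contrasts with plain Monte Carlo.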
How much expert knowledge is it worth to put in conceptual hydrological models?
Antonetti, Manuel; Zappa, Massimiliano
2017-04-01
Both modellers and experimentalists agree on using expert knowledge to improve conceptual hydrological simulations of ungauged basins. However, they use expert knowledge differently, both for hydrologically mapping the landscape and for parameterising a given hydrological model. Modellers generally use very simplified (e.g. topography-based) mapping approaches and invest most of the knowledge in constraining the model by defining parameter and process relational rules. In contrast, experimentalists tend to invest all their detailed and qualitative knowledge about processes to obtain a spatial distribution of areas with different dominant runoff generation processes (DRPs) that is as realistic as possible, and to define plausible narrow value ranges for each model parameter. Since, most of the time, the modelling goal is exclusively to simulate runoff at a specific site, even strongly simplified hydrological classifications can lead to satisfying results, owing to the equifinality of hydrological models, overfitting problems and the numerous uncertainty sources affecting runoff simulations. Therefore, to test to what extent expert knowledge can improve simulation results under uncertainty, we applied a typical modellers' framework, relying on parameter and process constraints defined from expert knowledge, to several catchments on the Swiss Plateau. To map the spatial distribution of the DRPs, mapping approaches with increasing involvement of expert knowledge were used. The simulation results highlight the potential added value of using all the expert knowledge available on a catchment. Combinations of event types and landscapes where even a simplified mapping approach can lead to satisfying results were also identified. Finally, the uncertainty originating from the different mapping approaches was compared with that linked to the meteorological input data and the catchment initial conditions.
Conceptual model for the design of product systems
Directory of Open Access Journals (Sweden)
John J. Cardozo Vásquez
2016-08-01
Full Text Available Based on the concepts of personalization, differentiation and variability, product systems are characterized and the conditions for their design are explored. In the analysis of the literature, 17 variables related to the design of product systems were identified and used to develop a conceptual model. To validate this finding, a survey was administered to 57 experts in the field of design. By means of a factor analysis, the variables are reduced and the underlying conceptual relationships identified. As a result of the validation, three factors are identified: structure, coherence and order, which contain the variables that determine the attributes of product systems. These findings support the conclusion that the design process for these systems originates in the analysis of consumers and of the multiple uses and experiences obtained from these products. Study boundaries are established and future research possibilities are outlined.
The effect of uncertainty and systematic errors in hydrological modelling
Steinsland, I.; Engeland, K.; Johansen, S. S.; Øverleir-Petersen, A.; Kolberg, S. A.
2014-12-01
The aims of hydrological model identification and calibration are to find the best possible set of process parametrizations and parameter values that transform inputs (e.g. precipitation and temperature) into outputs (e.g. streamflow). These models enable us to make predictions of streamflow. Several sources of uncertainty have the potential to hamper robust model calibration and identification. In order to grasp the interaction between model parameters, inputs and streamflow, it is important to account for both systematic and random errors in the inputs (e.g. precipitation and temperature) and streamflows. By random errors we mean errors that are independent from time step to time step, whereas by systematic errors we mean errors that persist for a longer period. Both random and systematic errors are important in the observation and interpolation of precipitation and temperature inputs. Important random errors come from the measurements themselves and from the network of gauges. Important systematic errors originate from the under-catch of precipitation gauges and from unknown spatial trends that are approximated in the interpolation. For streamflow observations, the water level recordings may give random errors, whereas the rating curve contributes mainly a systematic error. In this study we want to answer the question: "What is the effect of random and systematic errors in inputs and observed streamflow on estimated model parameters and streamflow predictions?" To answer it, we systematically test the effect of including uncertainties in inputs and streamflow during model calibration and simulation in a distributed HBV model operating on daily time steps for the Osali catchment in Norway. The case study is based on observations whose uncertainty is carefully quantified, and increased uncertainties and systematic errors are introduced realistically, for example by removing a precipitation gauge from the network. We find that the systematic errors in
A Multi-Model Approach for Uncertainty Propagation and Model Calibration in CFD Applications
Wang, Jian-xun; Xiao, Heng
2015-01-01
Proper quantification and propagation of uncertainties in computational simulations are of critical importance. This issue is especially challenging for CFD applications. A particular obstacle to uncertainty quantification in CFD problems is the large model discrepancy associated with the CFD models used for uncertainty propagation. Neglecting or improperly representing the model discrepancies leads to inaccurate and distorted uncertainty distributions for the Quantities of Interest. High-fidelity models, being accurate yet expensive, can accommodate only a small ensemble of simulations and thus lead to large interpolation and/or sampling errors; low-fidelity models can propagate a large ensemble, but can introduce large modeling errors. In this work, we propose a multi-model strategy to account for the influence of model discrepancies in uncertainty propagation and to reduce their impact on the predictions. Specifically, we take advantage of CFD models of multiple fidelities to estimate the model ...
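One simple instance of a multi-fidelity strategy is a control-variate correction: a large, cheap low-fidelity ensemble supplies the bulk of the statistics, and a small shared high-fidelity sample corrects its bias. This is a generic sketch with stand-in model functions, not the authors' method:

```python
import numpy as np

rng = np.random.default_rng(0)

def high_fidelity(x):
    # Stand-in for an expensive, accurate model
    return np.sin(x) + 0.1 * x**2

def low_fidelity(x):
    # Stand-in for a cheap, biased approximation (truncated series)
    return x - x**3 / 6 + 0.1 * x**2

# Small high-fidelity ensemble, large low-fidelity ensemble
x_small = rng.normal(0.0, 1.0, 50)
x_large = rng.normal(0.0, 1.0, 50_000)

# Correct the cheap-model mean with the discrepancy observed
# on the small sample where both models were evaluated.
mean_lo = low_fidelity(x_large).mean()
discrepancy = (high_fidelity(x_small) - low_fidelity(x_small)).mean()
multi_fidelity_mean = mean_lo + discrepancy
```

The estimator inherits the low variance of the large ensemble while the discrepancy term removes most of the low-fidelity model error.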
Keeping it simple: a conceptual model of DOC dynamics in a subarctic alpine catchment
Lessels, J. S.; Tetzlaff, D.; Carey, S. K.; Soulsby, C.
2013-12-01
Understanding hydrological processes in subarctic alpine catchments characterised by discontinuous permafrost is important in order to understand carbon exports. Subarctic catchments have large storages of carbon in organic and permafrost soils. Active layer depth is one of the largest controlling factors of the release of dissolved organic carbon (DOC) due to its control on runoff pathways. Therefore, any change in this depth will affect the amount of DOC mobilised from these catchments. Simple, low-parameter conceptual models offer the ability to characterise hydrological processes and linked DOC dynamics without introducing many of the uncertainties linked to highly parameterised models. Lumped models can also be used to identify sources of DOC within catchments. Here, we investigate hydrological sources, flow pathways and consequently DOC dynamics in the Granger Basin, Canada, a subarctic alpine catchment, using data collected from 2001 to 2008. The catchment is distinguished by aspect-dependent discontinuous permafrost and seasonal frost, compounded further by differences in soil and vegetation types. Applying a simple, low-parameter conceptual model allowed identification of the dominant flow paths of the main hydrological response units. The results showed that it was necessary to include active layer dynamics combined with aspect to represent the hydrological and DOC dynamics. The model provides information on the effect of climatic conditions on DOC releases. By identifying the key flow paths and relating these to spring freshet DOC exports over multiple years it is possible to gain insight into how climatic changes might affect hydrological processes within subarctic catchments.
Kuczera, George; Kavetski, Dmitri; Franks, Stewart; Thyer, Mark
2006-11-01
Calibration and prediction in conceptual rainfall-runoff (CRR) modelling are affected by uncertainty in the observed forcing/response data and by structural error in the model. This study works towards the goal of developing a robust framework for dealing with these sources of error, and focuses on model error. The characterisation of model error in CRR modelling has been thwarted by the convenient but indefensible treatment of CRR models as deterministic descriptions of catchment dynamics. This paper argues that the fluxes in CRR models should be treated as stochastic quantities because their estimation involves spatial and temporal averaging. Acceptance that CRR models are intrinsically stochastic paves the way for a more rational characterisation of model error. The hypothesis advanced in this paper is that CRR model error can be characterised by storm-dependent random variation of one or more CRR model parameters. A simple sensitivity analysis is used to identify the parameters most likely to behave stochastically, with variation in these parameters yielding the largest changes in model predictions as measured by the Nash-Sutcliffe criterion. A Bayesian hierarchical model is then formulated to explicitly differentiate between forcing, response and model error. It provides a very general framework for calibration and prediction, as well as for testing hypotheses regarding model structure and data uncertainty. A case study calibrating a six-parameter CRR model to daily data from the Abercrombie catchment (Australia) demonstrates the considerable potential of this approach. Allowing storm-dependent variation in just two model parameters (with one of the parameters characterising model error and the other reflecting input uncertainty) yields a substantially improved model fit, raising the Nash-Sutcliffe statistic from 0.74 to 0.94. Of particular significance is the use of posterior diagnostics to test the key assumptions about the data and model errors.
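The Nash-Sutcliffe criterion used above to report the improvement from 0.74 to 0.94 is straightforward to compute; the observed and simulated series below are made-up illustrative values:

```python
import numpy as np

def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 minus the ratio of residual variance
    to the variance of the observations about their mean. 1 is a perfect
    fit; 0 means the model is no better than the observed mean."""
    observed, simulated = np.asarray(observed), np.asarray(simulated)
    return 1.0 - np.sum((observed - simulated) ** 2) / np.sum(
        (observed - observed.mean()) ** 2)

# Illustrative daily streamflow values (not from the case study)
obs = [1.0, 3.0, 2.5, 4.0, 3.5]
sim = [1.2, 2.7, 2.9, 3.6, 3.4]
nse = nash_sutcliffe(obs, sim)
```

A constant prediction equal to the observed mean scores exactly 0, which is why NSE values near 1 indicate genuine explanatory skill.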
Design Oriented Structural Modeling for Airplane Conceptual Design Optimization
Livne, Eli
1999-01-01
The main goal for research conducted with the support of this grant was to develop design-oriented structural optimization methods for the conceptual design of airplanes. Traditionally in conceptual design, airframe weight is estimated based on statistical equations developed over years of fitting airplane weight data in databases of similar existing airplanes. Utilization of such regression equations for the design of new airplanes can be justified only if the new airplanes use structural technology similar to the technology on the airplanes in those weight databases. If any new structural technology is to be pursued, or any new unconventional configurations designed, the statistical weight equations cannot be used. In such cases any structural weight estimation must be based on rigorous "physics based" structural analysis and optimization of the airframes under consideration. Work under this grant progressed to explore airframe design-oriented structural optimization techniques along two lines of research: methods based on "fast" design-oriented finite element technology, and methods based on equivalent plate / equivalent shell models of airframes, in which the vehicle is modelled as an assembly of plate and shell components, each simulating a lifting surface or nacelle/fuselage pieces. Since response to changes in geometry, as well as the capability to optimize the shape itself, is essential in conceptual design of airplanes, research supported by this grant sought to develop efficient techniques for parametrization of airplane shape and for sensitivity analysis with respect to shape design variables. Towards the end of the grant period a prototype automated structural analysis code designed to work with the NASA Aircraft Synthesis conceptual design code ACSYNT was delivered to NASA Ames.
Conceptual Modeling in the Time of the Revolution: Part II
Mylopoulos, John
Conceptual Modeling was a marginal research topic at the very fringes of Computer Science in the 60s and 70s, when the discipline was dominated by topics focusing on programs, systems and hardware architectures. Over the years, however, the field has moved to centre stage and has come to claim a central role both in Computer Science research and practice in diverse areas, such as Software Engineering, Databases, Information Systems, the Semantic Web, Business Process Management, Service-Oriented Computing, Multi-Agent Systems, Knowledge Management, and more. The transformation was greatly aided by the adoption of standards in modeling languages (e.g., UML), and model-based methodologies (e.g., Model-Driven Architectures) by the Object Management Group (OMG) and other standards organizations. We briefly review the history of the field over the past 40 years, focusing on the evolution of key ideas. We then note some open challenges and report on-going research, covering topics such as the representation of variability in conceptual models, capturing model intentions, and models of laws.
Assessment of errors and uncertainty patterns in GIA modeling
DEFF Research Database (Denmark)
Barletta, Valentina Roberta; Spada, G.
During the last decade many efforts have been devoted to the assessment of global sea level rise and to the determination of the mass balance of continental ice sheets. In this context, the important role of glacial-isostatic adjustment (GIA) has been clearly recognized. Yet, in many cases only one "preferred" GIA model has been used, without any consideration of the possible errors involved. Lacking a rigorous assessment of systematic errors in GIA modeling, the reliability of the results is uncertain. GIA sensitivity and uncertainties associated with the viscosity models have been explored, such as time-evolving shorelines and paleo-coastlines. In this study we quantify these uncertainties and their propagation in GIA response using a Monte Carlo approach to obtain spatio-temporal patterns of GIA errors. A direct application is the error estimates in ice mass balance in Antarctica and Greenland.
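The Monte Carlo approach described here, sampling uncertain model parameters and mapping the spread of the response across space, can be sketched generically. The response function below is a purely illustrative stand-in, not a GIA model; the viscosity distribution and decay scale are assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

def gia_response(viscosity, distance_km):
    """Stand-in for a GIA model: an uplift rate that decays with
    distance from the former ice load and scales inversely with
    mantle viscosity (purely illustrative)."""
    return 10.0 / viscosity * np.exp(-distance_km / 500.0)

distances = np.linspace(0, 2000, 5)          # km from the load centre
# Sample uncertain viscosity (log-normal spread is an assumption)
viscosities = rng.lognormal(mean=0.0, sigma=0.3, size=1000)

ensemble = np.array([gia_response(v, distances) for v in viscosities])
uplift_std = ensemble.std(axis=0)            # spatial pattern of GIA error
```

The standard deviation across the ensemble at each location is exactly the kind of spatio-temporal error pattern the abstract refers to.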
RANS turbulence model form uncertainty quantification for wind engineering flows
Gorle, Catherine; Zeoli, Stephanie; Bricteux, Laurent
2016-11-01
Reynolds-averaged Navier-Stokes simulations with linear eddy-viscosity turbulence models are commonly used for modeling wind engineering flows, but the use of the results for critical design decisions is hindered by the limited capability of the models to correctly predict bluff body flows. A turbulence model form uncertainty quantification (UQ) method to define confidence intervals for the results could remove this limitation, and promising results were obtained in a previous study of the flow in downtown Oklahoma City. The objective of the present study is to further investigate the validity of these results by considering the simplified test case of the flow around a wall-mounted cube. DNS data is used to determine: 1. whether the marker, which identifies regions that deviate from parallel shear flow, is a good indicator for the regions where the turbulence model fails, and 2. which Reynolds stress perturbations, in terms of the tensor magnitude and the eigenvalues and eigenvectors of the normalized anisotropy tensor, can capture the uncertainty in the flow field. A comparison of confidence intervals obtained with the UQ method and the DNS solution indicates that the uncertainty in the velocity field can be captured correctly in a large portion of the flow field.
Effects of input uncertainty on cross-scale crop modeling
Waha, Katharina; Huth, Neil; Carberry, Peter
2014-05-01
The quality of data on climate, soils and agricultural management in the tropics is in general low, or data is scarce, leading to uncertainty in process-based modeling of cropping systems. Process-based crop models are common tools for simulating crop yields and crop production in climate change impact studies, studies on mitigation and adaptation options, or food security studies. Crop modelers are concerned about input data accuracy as this, together with an adequate representation of plant physiology processes and the choice of model parameters, is a key factor for a reliable simulation. For example, assuming an error in measurements of air temperature, radiation and precipitation of ± 0.2°C, ± 2 % and ± 3 % respectively, Fodor & Kovacs (2005) estimate that this translates into an uncertainty of 5-7 % in yield and biomass simulations. In our study we seek to answer the following questions: (1) are there important uncertainties in the spatial variability of simulated crop yields on the grid-cell level displayed on maps, (2) are there important uncertainties in the temporal variability of simulated crop yields on the aggregated, national level displayed in time-series, and (3) how does the accuracy of different soil, climate and management information influence the simulated crop yields in two crop models designed for use at different spatial scales? The study will help to determine whether more detailed information improves the simulations and to advise model users on the uncertainty related to input data. We analyse the performance of the point-scale crop model APSIM (Keating et al., 2003) and the global-scale crop model LPJmL (Bondeau et al., 2007) with different climate information (monthly and daily) and soil conditions (global soil map and African soil map) under different agricultural management (uniform and variable sowing dates) for the low-input maize-growing areas in Burkina Faso/West Africa. We test the models' response to different levels of input
A Conceptual Model of eLearning Adoption
Directory of Open Access Journals (Sweden)
Muneer Abbad
2011-05-01
Full Text Available Internet-based learning systems are being used in many universities and firms, but their adoption requires a solid understanding of the user acceptance processes. The technology acceptance model (TAM) has been used to test the acceptance of various technologies and software within an e-learning context. This research aims to discuss the main factors of successful e-learning adoption by students. A conceptual research framework of e-learning adoption is proposed based on the TAM.
Nettasana, T.; Craig, J. R.; Tolson, B.; Sykes, J. F.
2009-05-01
Recently, there has been a good deal of research into the benefits of using multiple conceptual models in the simulation of data-poor aquifer systems. Here, multiple conceptual and numerical models have been developed to inform improved management decisions in the Chi River Basin, Northeast Thailand, where increasing groundwater withdrawals may result in water level decline and saline water upconing problems. Effective management policies, including water allocation strategies, are required to ensure sustainable groundwater usage in this area. Twelve alternative models have been identified from the combinations of three alternative models of hydrostratigraphy, two alternative models of recharge, and two alternative models of boundary conditions. It was found that the largest impact on the water budget is due to uncertainty in the hydrogeological model, whereas the uncertainty in boundary conditions has the smallest impact. To select the best among the alternative models, multiple model performance criteria have been defined and applied to evaluate the quality of individual models. It was found that even models chosen from this small set of alternatives perform differently with respect to different evaluation criteria, and that it is unlikely that a single comparison criterion will ever be sufficient for general use. Rather, it is suggested here that the performance or information criteria used for model selection and aggregation should be objective-specific. Once suitable performance metrics have been identified, the chosen alternative models may then be used both individually and collectively to assess the adverse impacts of future groundwater withdrawals and to formulate alternative water supply strategies.
A Conceptual Model for Engagement of the Online Learner
Directory of Open Access Journals (Sweden)
Lorraine M. Angelino
2009-01-01
Full Text Available Engagement of the online learner is one approach to reduce attrition rates. Attrition rates for classes taught through distance education are 10-20% higher than for classes taught in a face-to-face setting. This paper introduces a Model for Engagement and provides strategies to engage the online learner. The Model depicts various opportunities where student-instructor, student-student, student-content, and student-community engagement can occur. The Model is divided into four strategic areas: (a) recruitment, (b) coursework, (c) post-coursework, and (d) alumni. The theoretical framework for the model is Tinto's student integration model. The conceptual design of the model is based on engagement practices from an online Health Care Management (HCMT) certificate program at a university in South Carolina.
Multimorbidity: conceptual basis, epidemiological models and measurement challenges.
Fernández-Niño, Julián Alfredo; Bustos-Vázquez, Eduardo
2016-06-03
The growing number of patients with complex clinical profiles related to chronic diseases has contributed to the increasingly widespread use of the term 'multimorbidity'. A suitable measurement of this condition is essential to epidemiological studies, considering that it represents a challenge for the clinical management of patients as well as for health systems and epidemiological investigations. In this context, the present essay reviews the conceptual proposals behind the measurement of multimorbidity, including the epidemiological and methodological challenges it involves. We discuss classical definitions of comorbidity, how they differ from the concept of multimorbidity, and their roles in epidemiological studies. The various conceptual models that contribute to the operational definitions and strategies to measure this variable are also presented. The discussion enabled us to identify a significant gap between the modern conceptual development of multimorbidity and the operational definitions. This gap exists despite the theoretical developments that have taken the classical concept of comorbidity to the modern, multidimensional conception of multimorbidity. Measurement strategies, however, have not kept pace with this advance. Therefore, new methodological proposals need to be developed in order to obtain information regarding the actual impact on individuals' health and its implications for public health.
Assessing and propagating uncertainty in model inputs in corsim
Energy Technology Data Exchange (ETDEWEB)
Molina, G.; Bayarri, M. J.; Berger, J. O.
2001-07-01
CORSIM is a large simulator for vehicular traffic, and is being studied with respect to its ability to successfully model and predict behavior of traffic in a 36-block section of Chicago. Inputs to the simulator include information about street configuration, driver behavior, traffic light timing, turning probabilities at each corner and distributions of traffic ingress into the system. This work is described in more detail in the article Fast Simulators for Assessment and Propagation of Model Uncertainty, also in these proceedings. The focus of this conference poster is on the computational aspects of this problem. In particular, we address the description of the full conditional distributions needed for implementation of the MCMC algorithm and, specifically, how the constraints can be incorporated; details concerning the run time and convergence of the MCMC algorithm; and utilisation of the MCMC output for prediction and uncertainty analysis concerning the CORSIM computer model. As this last is the ultimate goal, it is worth emphasizing that the incorporation of all uncertainty concerning inputs can significantly affect the model predictions. (Author)
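Incorporating constraints into an MCMC sampler, as described above, is often done by rejecting infeasible proposals. A minimal Metropolis sketch for a single turning probability confined to (0, 1); the toy posterior, step size and sample counts are all illustrative assumptions, not the CORSIM setup:

```python
import numpy as np

rng = np.random.default_rng(7)

def log_posterior(p):
    """Toy log-posterior for a turning probability: Beta(3, 2)-shaped,
    proportional to p^2 * (1 - p)."""
    return 2.0 * np.log(p) + np.log(1.0 - p)

def metropolis(n_steps=5000, step=0.1):
    p, samples = 0.5, []
    for _ in range(n_steps):
        proposal = p + rng.normal(0.0, step)
        # Constraint handling: proposals outside (0, 1) are rejected
        # before the posterior is even evaluated.
        if 0.0 < proposal < 1.0 and np.log(rng.uniform()) < (
                log_posterior(proposal) - log_posterior(p)):
            p = proposal
        samples.append(p)
    return np.array(samples)

samples = metropolis()
posterior_mean = samples[1000:].mean()   # discard burn-in
```

The Beta(3, 2) target has mean 0.6, so the post-burn-in sample mean should land close to that value.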
Regional knowledge economy development indicative planning system conceptual model
Directory of Open Access Journals (Sweden)
Elena Davidovna Vaisman
2012-12-01
Full Text Available The subject of this research is the process of knowledge economy development in Russia, with progress at the regional level as its theme. This determined the purpose of the research: to develop a conceptual model of an indicative planning system for regional knowledge economy development. The methodological basis of the research is the knowledge economy concept and the theory of supply and demand, using the methods of comparative and system analysis and theoretical modeling; generalization, classification and regression models are also employed. As a result, we constructed a conceptual model of an indicative planning method for regional knowledge economy development, which includes the choice of the types of indicative plans and the justification of a complex of indicators according to the stated requirements for this complex. A model of the dependence of knowledge supply and demand on the cost of knowledge is developed, which allows an acceptable range for the indicators to be determined from the levels of demand and supply and their interrelation. The results may be used by regional government authorities when planning regional innovative development, and by consulting companies when drafting proposals for this development.
Conceptual Change Texts in Chemistry Teaching: A Study on the Particle Model of Matter
Beerenwinkel, Anne; Parchmann, Ilka; Grasel, Cornelia
2011-01-01
This study explores the effect of a conceptual change text on students' awareness of common misconceptions about the particle model of matter. The conceptual change text was designed based on principles of text comprehensibility, of conceptual change instruction, and of instructional approaches to introducing the particle model. It was evaluated in…
Propagating Uncertainties from Source Model Estimations to Coulomb Stress Changes
Baumann, C.; Jonsson, S.; Woessner, J.
2009-12-01
Multiple studies have shown that static stress changes due to permanent fault displacement trigger earthquakes on the causative and on nearby faults. Calculations of static stress changes in previous studies have been based on fault parameters without considering any source model uncertainties, or with crude assumptions about fault model errors based on the available source models. In this study, we investigate the influence of fault model parameter uncertainties on Coulomb failure stress change (ΔCFS) calculations by propagating the uncertainties from the fault estimation process to the Coulomb failure stress changes. We use 2500 sets of correlated model parameters determined for the June 2000 Mw = 5.8 Kleifarvatn earthquake, southwest Iceland, which were estimated by using a repeated optimization procedure and multiple data sets that had been modified by synthetic noise. The model parameters show that the event was predominantly a right-lateral strike-slip earthquake on a north-south striking fault. The variability of the sets of models represents the posterior probability density distribution for the Kleifarvatn source model. First, we investigate the influence of individual source model parameters on the ΔCFS calculations. We show through a correlation analysis that for this event, changes in dip, east location, strike, width and in part north location have a stronger impact on the Coulomb failure stress changes than changes in fault length, depth, dip-slip and strike-slip. Second, we find that the accuracy of Coulomb failure stress changes appears to increase with increasing distance from the fault. The absolute value of the standard deviation decays rapidly with distance, from about 3-3.5 MPa down to a few Pa within about 5-6 km around the fault, implying that the influence of parameter changes decreases with increasing distance. This is underlined by the coefficient of variation CV, defined as the ratio of the standard deviation of the Coulomb stress
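The coefficient of variation used here to summarise ensemble spread is simple to compute for an ensemble of stress fields; the tiny example ensemble below is illustrative, not data from the study:

```python
import numpy as np

def coefficient_of_variation(samples, axis=0):
    """CV = std / |mean|, a scale-free measure of ensemble spread,
    evaluated independently at each grid point."""
    samples = np.asarray(samples)
    return samples.std(axis=axis) / np.abs(samples.mean(axis=axis))

# Illustrative ensemble: 2 source-model realizations x 2 grid points (MPa)
stress_ensemble = np.array([[1.0, 2.0],
                            [3.0, 2.0]])
cv = coefficient_of_variation(stress_ensemble)
```

Points where all realizations agree get CV = 0; points where the ensemble scatters widely relative to the mean get large CV, which is why CV maps highlight where source-model uncertainty dominates the ΔCFS result.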
Hydrological model parameter dimensionality is a weak measure of prediction uncertainty
Directory of Open Access Journals (Sweden)
S. Pande
2015-04-01
Full Text Available This paper shows that the instability of a hydrological system representation in response to different pieces of information, and the associated prediction uncertainty, is a function of model complexity. After demonstrating the connection between unstable model representation and model complexity, complexity is analyzed in a step-by-step manner. This is done by measuring differences between simulations of a model under different realizations of input forcings. Algorithms are then suggested to estimate model complexity. Model complexities of two model structures, SAC-SMA (Sacramento Soil Moisture Accounting) and its simplified version SIXPAR (Six Parameter Model), are computed on resampled input data sets from basins that span the continental US. The model complexities for SIXPAR are estimated for various parameter ranges. It is shown that the complexity of SIXPAR increases with lower storage capacity and/or higher recession coefficients. Thus it is argued that a conceptually simple model structure, such as SIXPAR, can be more complex than an intuitively more complex model structure, such as SAC-SMA, for certain parameter ranges. We therefore contend that the magnitudes of feasible model parameters influence the complexity of the model selection problem just as parameter dimensionality (number of parameters) does, and that parameter dimensionality is an incomplete indicator of the stability of hydrological model selection and prediction problems.
Uncertainty Analysis of Integrated Navigation Model for Underwater Vehicle
Directory of Open Access Journals (Sweden)
Zhang Tao
2013-02-01
Full Text Available In this study, to reduce the information uncertainty of an integrated navigation model for underwater vehicles, we present a multi-sensor information fusion algorithm based on evidence theory. The algorithm reduces attributes via rough set theory in order to obtain a simplified ELMAN neural network and improve the basic probability assignment. It then uses improved D-S evidence theory to deal with inaccurate and fuzzy information and make the final decision. A simulation example shows the feasibility and effectiveness of the algorithm.
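Dempster's rule of combination, the core operation in D-S evidence fusion, merges two basic probability assignments and renormalises away their conflict. A minimal sketch; the focal elements and masses below are illustrative, not from the navigation model:

```python
def dempster_combine(m1, m2):
    """Dempster's rule of combination for mass functions whose focal
    elements are frozensets; mass assigned to conflicting (disjoint)
    pairs is discarded and the rest renormalised."""
    combined, conflict = {}, 0.0
    for a, w1 in m1.items():
        for b, w2 in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + w1 * w2
            else:
                conflict += w1 * w2
    norm = 1.0 - conflict
    return {k: v / norm for k, v in combined.items()}

# Two sensors reporting beliefs over hypotheses A and B
A, B = frozenset({"A"}), frozenset({"B"})
m1 = {A: 0.6, frozenset({"A", "B"}): 0.4}   # second mass is "uncommitted"
m2 = {A: 0.5, B: 0.5}
fused = dempster_combine(m1, m2)
```

Here the conflicting mass (sensor 1 committed to A while sensor 2 says B) is 0.3, so the surviving masses are rescaled by 1/0.7 before the final decision is taken on the largest fused mass.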
Energy and Uncertainty: Models and Algorithms for Complex Energy Systems
2014-01-01
The problem of controlling energy systems (generation, transmission, storage, investment) introduces a number of optimization problems which need to be solved in the presence of different types of uncertainty. We highlight several of these applications, using a simple energy storage problem as a case application. Using this setting, we describe a modeling framework based around five fundamental dimensions which is more natural than the standard canonical form widely used in the reinforcement ...
Hydrological model uncertainty due to spatial evapotranspiration estimation methods
Yu, Xuan; Lamačová, Anna; Duffy, Christopher; Krám, Pavel; Hruška, Jakub
2016-05-01
Evapotranspiration (ET) continues to be a difficult process to estimate in seasonal and long-term water balances in catchment models. Approaches to estimate ET typically use vegetation parameters (e.g., leaf area index [LAI], interception capacity) obtained from field observation, remote sensing data, national or global land cover products, and/or simulated by ecosystem models. In this study we attempt to quantify the uncertainty that spatial evapotranspiration estimation introduces into hydrological simulations when the age of the forest is not precisely known. The Penn State Integrated Hydrologic Model (PIHM) was implemented for the Lysina headwater catchment, located at 50°03′ N, 12°40′ E in the western part of the Czech Republic. The spatial forest patterns were digitized from forest age maps made available by the Czech Forest Administration. Two ET methods were implemented in the catchment model: the Biome-BGC forest growth sub-model (one-way coupled to PIHM) and a fixed seasonal LAI method. From these two approaches, simulation scenarios were developed by combining the estimated spatial forest age maps with the two ET estimation methods to drive PIHM. A set of spatial hydrologic regime and streamflow regime indices were calculated from the modeling results for each method. Intercomparison of the hydrological responses to the spatial vegetation patterns suggested considerable variation in soil moisture and recharge and a small uncertainty in the groundwater table elevation and streamflow. The hydrologic modeling with ET estimated by Biome-BGC generated less uncertainty due to the plant physiology-based method. The implication of this research is that the overall hydrologic variability induced by uncertain management practices was reduced by implementing vegetation models in the catchment models.
Regionalization parameters of conceptual rainfall-runoff model
Osuch, M.
2003-04-01
The main goal of this study was to develop techniques for the a priori estimation of hydrological model parameters. The conceptual hydrological model CLIRUN was applied to around 50 catchments in Poland, ranging in size from 1000 to 100 000 km². The model was calibrated for a number of gauged catchments with different catchment characteristics. The model parameters were then related to different climatic and physical catchment characteristics (topography, land use, vegetation and soil type). The relationships were tested by comparing observed and simulated runoff series from gauged catchments that were not used in the calibration. Model performance using regional parameters was promising for most of the calibration and validation catchments.
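Relating calibrated parameter values to catchment characteristics, as described above, amounts to a regionalisation regression. A minimal sketch; the descriptors, parameter values and regression form are hypothetical, not the CLIRUN results:

```python
import numpy as np

# Hypothetical calibrated recession coefficients and catchment descriptors
area_km2    = np.array([1200.0, 5400.0, 23000.0, 800.0, 61000.0])
forest_frac = np.array([0.3, 0.5, 0.2, 0.7, 0.4])
recession_k = np.array([0.05, 0.09, 0.04, 0.12, 0.07])   # from calibration

# Regionalisation regression: parameter ~ intercept + log(area) + forest
X = np.column_stack([np.ones_like(area_km2), np.log(area_km2), forest_frac])
coeffs, *_ = np.linalg.lstsq(X, recession_k, rcond=None)

def predict_recession(area, forest):
    """A priori parameter estimate for an ungauged catchment."""
    return float(coeffs @ np.array([1.0, np.log(area), forest]))
```

The fitted relationship can then be evaluated for ungauged catchments, e.g. `predict_recession(3000.0, 0.5)`, which is exactly how the validation against held-out catchments proceeds.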
Directory of Open Access Journals (Sweden)
Mehran FARAJOLLAHI
2010-07-01
Full Text Available The present research aims at presenting a conceptual model for effective distance learning in higher education. The findings of this research show that an understanding of technological capabilities and of learning theories, especially constructivist theory, independent learning theory, and communication and interaction theory, is an efficient factor in the planning of effective distance learning in higher education. Considering the theoretical foundations of the present research, in the effective distance learning model the learner is situated at the center of the learning environment. For this purpose, the learner needs to be ready for successful learning, and the teacher has to be ready to design the teaching-learning activities, when they initially enter the environment. In the present model, group and individual active teaching-learning approaches, timely feedback, use of IT and eight types of interaction have been designed with respect to theoretical foundations and current university missions. Among the issues emphasized in this model are initial, formative and summative evaluation. In an effective distance learning environment, evaluation should be part of the learning process and the feedback resulting from it should be used to improve learning. For validating the specified features, the opinions of distance learning experts at Payame Noor, Shiraz, Science and Technology and Amirkabir Universities were used, and a high percentage of the statistical sample verified the above-mentioned features.
Sensitivities and uncertainties of modeled ground temperatures in mountain environments
Directory of Open Access Journals (Sweden)
S. Gubler
2013-02-01
Full Text Available Before operational use or for decision making, models must be validated, and the degree of trust in model outputs should be quantified. Often, model validation is performed at single locations due to the lack of spatially distributed data. Since the analysis of parametric model uncertainties can be performed independently of observations, it is a suitable method to test the influence of environmental variability on model evaluation. In this study, the sensitivities and uncertainty of a physically based mountain permafrost model are quantified within an artificial topography consisting of different elevations and exposures combined with six ground types characterized by their hydraulic properties. The analyses performed for all combinations of topographic factors and ground types allowed us to quantify the variability of model sensitivity and uncertainty within mountain regions. We found that modeled snow duration considerably influences the mean annual ground temperature (MAGT). The melt-out day of snow (MD) is determined by the processes controlling snow accumulation and melting. Parameters such as the temperature and precipitation lapse rates and the snow correction factor therefore have a great impact on modeled MAGT. Ground albedo changes MAGT by 0.5 to 4°C, depending on elevation, aspect and ground type. South-exposed inclined locations are more sensitive to changes in ground albedo than north-exposed slopes since they receive more solar radiation. The sensitivity to ground albedo increases with decreasing elevation due to shorter snow cover. Snow albedo and other parameters determining the amount of reflected solar radiation are important, changing MAGT at different depths by more than 1°C. Parameters influencing the turbulent fluxes, such as the roughness length or the dew temperature, are more sensitive at low-elevation sites due to higher air temperatures and decreased solar radiation. Modeling the individual terms of the energy
Stochastic reduced order models for inverse problems under uncertainty.
Warner, James E; Aquino, Wilkins; Grigoriu, Mircea D
2015-03-01
This work presents a novel methodology for solving inverse problems under uncertainty using stochastic reduced order models (SROMs). Given statistical information about an observed state variable in a system, unknown parameters are estimated probabilistically through the solution of a model-constrained, stochastic optimization problem. The point of departure and crux of the proposed framework is the representation of a random quantity using an SROM - a low-dimensional, discrete approximation to a continuous random element that permits efficient and non-intrusive stochastic computations. Characterizing the uncertainties with SROMs transforms the stochastic optimization problem into a deterministic one. The non-intrusive nature of SROMs facilitates efficient gradient computations for random vector unknowns and relies entirely on calls to existing deterministic solvers. Furthermore, the method is naturally extended to handle multiple sources of uncertainty in cases where state variable data, system parameters, and boundary conditions are all considered random. The new and widely applicable SROM framework is formulated for a general stochastic optimization problem in terms of an abstract objective function and constraining model. For demonstration purposes, however, we study its performance in the specific case of inverse identification of random material parameters in elastodynamics. We demonstrate the ability to efficiently recover random shear moduli given material displacement statistics as input data. We also show that the approach remains effective for the case where the loading in the problem is random as well.
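The core SROM idea, approximating a continuous random quantity by a small set of samples with optimized probabilities, can be sketched as follows. This is a minimal illustration, not the authors' code; the lognormal target, the SROM size m and the moment-matching objective are all assumptions made here for demonstration.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
target = rng.lognormal(mean=0.0, sigma=0.5, size=20000)  # "continuous" reference variable
m = 5                                                    # SROM size (assumed)

t_moments = [np.mean(target**q) for q in (1, 2, 3)]      # target raw moments

def srom_error(theta):
    # first m entries are sample locations x_k, last m are (unnormalized) probabilities
    x = theta[:m]
    p = np.abs(theta[m:])
    p = p / p.sum()                                      # valid probability weights
    return sum((np.dot(p, x**q) - tm)**2 for q, tm in zip((1, 2, 3), t_moments))

theta0 = np.concatenate([np.quantile(target, np.linspace(0.1, 0.9, m)),
                         np.full(m, 1.0 / m)])
res = minimize(srom_error, theta0, method="Nelder-Mead",
               options={"maxiter": 20000, "xatol": 1e-10, "fatol": 1e-12})
x_opt = res.x[:m]
p_opt = np.abs(res.x[m:]) / np.abs(res.x[m:]).sum()

# The deterministic solver is then called only m times (once per sample x_k),
# and output statistics are weighted sums over p_k - the non-intrusive step.
print("SROM mean:", np.dot(p_opt, x_opt), "target mean:", t_moments[0])
```

In the paper's setting the x_k would parameterize shear moduli and the deterministic elastodynamics solver would be evaluated once per sample; here the moment-matching step alone is shown.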
Economic-mathematical methods and models under uncertainty
Aliyev, A G
2013-01-01
Contents: Brief Information on Finite-Dimensional Vector Space and its Application in Economics; Bases of Piecewise-Linear Economic-Mathematical Models with Regard to Influence of Unaccounted Factors in Finite-Dimensional Vector Space; Piecewise-Linear Economic-Mathematical Models with Regard to Unaccounted Factors Influence in Three-Dimensional Vector Space; Piecewise-Linear Economic-Mathematical Models with Regard to Unaccounted Factors Influence on a Plane; Bases of Software for Computer Simulation and Multivariant Prediction of Economic Even at Uncertainty Conditions on the Base of N-Comp
DEFF Research Database (Denmark)
Ambühl, Simon; Kofoed, Jens Peter; Sørensen, John Dalsgaard
2015-01-01
Wave models used for site assessments are subjected to model uncertainties, which need to be quantified when using wave model results for probabilistic reliability assessments. This paper focuses on determination of wave model uncertainties. Four different wave models are considered, and validation...... uncertainties can be implemented in probabilistic reliability assessments....
A conceptual model of psychological contracts in construction projects
Directory of Open Access Journals (Sweden)
Yongjian Ke
2016-09-01
Full Text Available The strategic importance of relationship-style contracting is recognised in the construction industry. Both public and private sector clients are stipulating more integrated and collaborative forms of procurement. Despite relational and integrated contractual arrangements having been available for some time, it is clear that construction firms have been slow to adopt them. Hence it is timely to examine how social exchanges, via unwritten agreements and behaviours, are being nurtured in construction projects. This paper adopted the concept of Psychological Contracts (PC) to describe unwritten agreements and behaviours. A conceptual model of the PC is developed and validated using the results from a questionnaire survey administered to construction professionals in Australia. The results uncovered the relationships that existed amongst relational conditions and relational benefits, the PC and the partners' satisfaction. The results show that all the hypotheses in the conceptual model of the PC are supported, suggesting the PC model is important and may have an effect on project performance and relationship quality among contracting parties. A validated model of the PC in construction was then developed based on the correlations among its components. The managerial implications are that past relationships and relationship characteristics should be taken into account in the selection of procurement partners, and that the promise of future resources, support and tangible relational outcomes is also vital. It is important for contracting parties to pay attention to unwritten agreements (the PC) and behaviours when managing construction projects.
Conceptual Model of Offshore Wind Environmental Risk Evaluation System
Energy Technology Data Exchange (ETDEWEB)
Anderson, Richard M.; Copping, Andrea E.; Van Cleve, Frances B.; Unwin, Stephen D.; Hamilton, Erin L.
2010-06-01
In this report we describe the development of the Environmental Risk Evaluation System (ERES), a risk-informed analytical process for estimating the environmental risks associated with the construction and operation of offshore wind energy generation projects. The development of ERES for offshore wind is closely allied to a concurrent process undertaken to examine environmental effects of marine and hydrokinetic (MHK) energy generation, although specific risk-relevant attributes will differ between the MHK and offshore wind domains. During FY10, a conceptual design of ERES for offshore wind will be developed. The offshore wind ERES mockup described in this report will provide a preview of the functionality of a fully developed risk evaluation system that will use risk assessment techniques to determine priority stressors on aquatic organisms and environments from specific technology aspects, identify key uncertainties underlying high-risk issues, compile a wide-range of data types in an innovative and flexible data organizing scheme, and inform planning and decision processes with a transparent and technically robust decision-support tool. A fully functional version of ERES for offshore wind will be developed in a subsequent phase of the project.
Conceptual geoinformation model of natural hazards risk assessment
Kulygin, Valerii
2016-04-01
Natural hazards are the major threat to safe interactions between nature and society. The assessment of natural hazard impacts and their consequences is important in spatial planning and resource management. Today there is a challenge to advance our understanding of how socio-economic and climate changes will affect the frequency and magnitude of hydro-meteorological hazards and associated risks. However, the impacts of different types of natural hazards on various marine and coastal economic activities are not of the same type. In this study, a conceptual geomodel of risk assessment is presented to highlight the differentiation by type of economic activity in extreme-event risk assessment. The marine and coastal ecosystems are considered as the objects of management, on the one hand, and as the place of origin of natural hazards, on the other hand. One of the key elements in describing such systems is the spatial characterization of their components. Assessment of ecosystem state is based on ecosystem indicators (indexes), which are used to identify changes over time. The scenario approach is utilized to account for the spatio-temporal dynamics and uncertainty factors. Two types of scenarios are considered: scenarios of the use of ecosystem services by economic activities, and scenarios of extreme events and related hazards. The reported study was funded by RFBR, according to the research project No. 16-35-60043 mol_a_dk.
Parameter and uncertainty estimation for mechanistic, spatially explicit epidemiological models
Finger, Flavio; Schaefli, Bettina; Bertuzzo, Enrico; Mari, Lorenzo; Rinaldo, Andrea
2014-05-01
Epidemiological models can be a crucially important tool for decision-making during disease outbreaks. The range of possible applications spans from real-time forecasting and allocation of health-care resources to testing alternative intervention mechanisms such as vaccines, antibiotics or the improvement of sanitary conditions. Our spatially explicit, mechanistic models for cholera epidemics have been successfully applied to several epidemics, including the one that struck Haiti in late 2010 and is still ongoing. Calibration and parameter estimation of such models represents a major challenge because of properties unusual in traditional geoscientific domains such as hydrology. Firstly, the epidemiological data available might be subject to high uncertainties due to error-prone diagnosis as well as manual (and possibly incomplete) data collection. Secondly, long-term time series of epidemiological data are often unavailable. Finally, the spatially explicit character of the models requires the comparison of several time series of model outputs with their real-world counterparts, which calls for an appropriate weighting scheme. It follows that the usual assumption of a homoscedastic Gaussian error distribution, used in combination with classical calibration techniques based on Markov chain Monte Carlo algorithms, is likely to be violated, whereas the construction of an appropriate formal likelihood function seems close to impossible. Alternative calibration methods, which allow for accurate estimation of total model uncertainty, particularly regarding the envisaged use of the models for decision-making, are thus needed. Here we present the most recent developments regarding methods for parameter and uncertainty estimation to be used with our mechanistic, spatially explicit models for cholera epidemics, based on informal measures of goodness of fit.
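An informal, variance-normalized goodness-of-fit measure of the kind alluded to above might look like the following sketch. The regions, data and weighting scheme are invented here for illustration and are not the authors' calibration method; the point is only that normalizing each regional series by its own variability keeps large epidemic foci from dominating the objective.

```python
import numpy as np

rng = np.random.default_rng(4)
# stand-in weekly case counts for two regions with very different incidence
obs = {"north": rng.poisson(50, 30).astype(float),
       "south": rng.poisson(5, 30).astype(float)}
# stand-in model output (a noisy copy, for demonstration only)
sim = {k: v * rng.normal(1.0, 0.1, v.size) for k, v in obs.items()}

def informal_fit(obs, sim):
    """Per-region RMSE normalized by the region's observed std, then averaged."""
    scores = []
    for k in obs:
        rmse = np.sqrt(np.mean((obs[k] - sim[k]) ** 2))
        scores.append(rmse / (np.std(obs[k]) + 1e-9))  # avoid division by zero
    return np.mean(scores)

print("informal objective:", round(informal_fit(obs, sim), 3))
```

A perfect model gives a score of exactly zero, and each region contributes on the same scale regardless of its absolute case load.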
Usage of ensemble geothermal models to consider geological uncertainties
Rühaak, Wolfram; Steiner, Sarah; Welsch, Bastian; Sass, Ingo
2015-04-01
The usage of geothermal energy, for instance by borehole heat exchangers (BHE), is a promising concept for a sustainable supply of heat for buildings. BHE are closed pipe systems in which a fluid is circulating. Heat from the surrounding rocks is transferred to the fluid purely by conduction. The fluid carries the heat to the surface, where it can be utilized. Larger arrays of BHE typically require prior numerical modeling. Motivations are the design of the system (number and depth of the required BHE) but also regulatory reasons; regulatory operating permissions in particular often demand maximally realistic models. Although such realistic models are possible in many cases with today's codes and computer resources, they are often expensive in terms of time and effort. A particular problem is the knowledge about the accuracy of the achieved results. An issue which is often neglected while dealing with highly complex models is the quantification of parameter uncertainties as a consequence of the natural heterogeneity of the geological subsurface. Experience has shown that these heterogeneities can lead to wrong forecasts. But variations in the technical realization, and especially in the operational parameters (which are mainly a consequence of the regional climate), can also lead to strong variations in the simulation results. Instead of one very detailed single forecast model, it should be considered to run numerous simpler models. By varying parameters, the presumed subsurface uncertainties, but also the uncertainties in the presumed operational parameters, can be reflected. Finally, not one single result should be reported, but instead the range of possible solutions and their respective probabilities. In meteorology such an approach is well known as ensemble modeling. The concept is demonstrated on a real-world data set and discussed.
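The ensemble idea can be sketched with a deliberately simple forward model. Here we use the classical infinite line-source solution for a single BHE and sample only the thermal conductivity; all parameter values are assumed for illustration and this is not the paper's model or data.

```python
import numpy as np
from scipy.special import exp1  # exponential integral E1

rng = np.random.default_rng(1)
n = 500                                 # ensemble size (assumed)
lam = rng.normal(2.3, 0.3, n)           # thermal conductivity W/(m K), assumed uncertainty
lam = np.clip(lam, 1.5, 3.5)
rho_c = 2.3e6                           # volumetric heat capacity J/(m3 K), assumed
q = -30.0                               # heat extraction rate W/m, assumed load
r, t = 0.06, 20 * 365 * 86400.0         # borehole-wall radius [m], 20 years [s]

# infinite line-source temperature change at the borehole wall, per member
alpha = lam / rho_c                     # thermal diffusivity [m2/s]
dT = q / (4 * np.pi * lam) * exp1(r**2 / (4 * alpha * t))

# report a range of outcomes and its spread, not one single forecast
lo, med, hi = np.percentile(dT, [5, 50, 95])
print(f"temperature change at borehole wall: {med:.2f} K (90% band {lo:.2f} .. {hi:.2f})")
```

Each ensemble member costs one cheap evaluation, so the spread of outcomes replaces the false precision of one expensive "maximally realistic" run.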
Infiltration under snow cover: Modeling approaches and predictive uncertainty
Meeks, Jessica; Moeck, Christian; Brunner, Philip; Hunkeler, Daniel
2017-03-01
Groundwater recharge from snowmelt represents a temporal redistribution of precipitation. This is extremely important because the rate and timing of snowpack drainage has substantial consequences for aquifer recharge patterns, which in turn affect groundwater availability throughout the rest of the year. The modeling methods developed to estimate drainage from a snowpack, which typically rely on temporally dense point measurements or temporally limited, spatially dispersed calibration data, range in complexity from the simple degree-day method to more complex and physically based energy balance approaches. While this gamut of snowmelt models is routinely used to aid in water resource management, a comparison of the models' predictive uncertainties had not previously been made. Therefore, we established a snowmelt model calibration dataset that is both temporally dense and represents the integrated snowmelt infiltration signal for the Vers Chez le Brandt research catchment, which functions as a rather unique natural lysimeter. We then evaluated the uncertainty associated with the degree-day, a modified degree-day and energy balance snowmelt model predictions using the null-space Monte Carlo approach. All three melt models underestimate total snowpack drainage, underestimate the rate of early and midwinter drainage and overestimate spring snowmelt rates. The actual rate of snowpack water loss is more constant over the course of the entire winter season than the snowmelt models would imply, indicating that mid-winter melt can contribute as significantly as springtime snowmelt to groundwater recharge in low alpine settings. Further, actual groundwater recharge could be between 2 and 31% greater than snowmelt models suggest, over the total winter season. This study shows that snowmelt model predictions can have considerable uncertainty, which may be reduced by the inclusion of more data that allows for the use of more complex approaches such as the energy balance
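The degree-day method named above as the simplest model class can be sketched in a few lines. The melt factor and threshold temperature below are assumed illustrative values, not calibrated parameters from the study.

```python
import numpy as np

def degree_day_melt(temp_c, swe0, ddf=3.0, t_melt=0.0):
    """Daily melt M = ddf * max(T - t_melt, 0), limited by the available snowpack.

    temp_c : daily mean air temperatures [deg C]
    swe0   : initial snow water equivalent [mm]
    ddf    : degree-day factor [mm / (deg C day)] -- assumed value
    t_melt : melt threshold temperature [deg C]   -- assumed value
    returns (melt per day [mm], remaining SWE [mm])
    """
    swe = swe0
    melt = np.zeros_like(temp_c, dtype=float)
    for i, t in enumerate(temp_c):
        potential = ddf * max(t - t_melt, 0.0)  # temperature-index melt
        melt[i] = min(potential, swe)           # cannot melt more than is there
        swe -= melt[i]
    return melt, swe

temps = np.array([-2.0, 1.0, 3.0, 5.0, 4.0])
melt, swe_left = degree_day_melt(temps, swe0=30.0)
print(melt, swe_left)  # no melt on sub-threshold days; snowpack is exhausted on day 5
```

Its single calibration parameter (the degree-day factor) is exactly why its predictive uncertainty differs so much from the data-hungry energy balance approaches compared in the study.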
Eye tracker uncertainty analysis and modelling in real time
Fornaser, A.; De Cecco, M.; Leuci, M.; Conci, N.; Daldoss, M.; Armanini, A.; Maule, L.; De Natale, F.; Da Lio, M.
2017-01-01
Eye-tracking techniques have been developed over several decades for applications ranging from military to education, entertainment and clinics. Existing systems generally fall into two categories: precise but intrusive, or comfortable but less accurate. The idea of this work is to calibrate an eye tracker of the second category. In particular, we have estimated the uncertainty both in nominal and in variable operating conditions. We took into consideration different influencing factors, such as head movement and rotation, the eyes detected, target position on the screen, illumination, and objects in front of the eyes. Results proved that the 2D uncertainty can be modelled as a circular confidence interval, since there are no stable principal directions in either the systematic or the repeatability effects. This confidence region was also modelled as a function of the current working conditions. In this way we can obtain a value of the uncertainty that is a function of the operating conditions estimated in real time, opening the field to new applications that reconfigure the human-machine interface as a function of the operating conditions. Examples range from reshaping option buttons, dynamically adjusting local zoom, and optimizing speed to regulate interface responsiveness, to taking into account the uncertainty associated with a particular interaction. Furthermore, in the analysis of visual scanning patterns, the resulting Point of Regard maps would be associated with proper confidence levels, thus allowing accurate conclusions to be drawn. We conducted an experimental campaign to estimate and validate the overall modelling procedure, obtaining valid results in 86% of the cases.
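A circular confidence region of the kind described can be estimated from 2D error samples as in this sketch. The simulated gaze offsets and the 95% coverage level are our assumptions for illustration, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(2)
# simulated gaze-position errors [pixels]: a systematic bias plus isotropic
# repeatability noise (assumed values, standing in for measured offsets)
errors = rng.normal(loc=(4.0, -2.0), scale=12.0, size=(1000, 2))

bias = errors.mean(axis=0)                       # systematic component
radial = np.linalg.norm(errors - bias, axis=1)   # repeatability component
r95 = np.quantile(radial, 0.95)                  # circular confidence radius

print(f"bias = {bias.round(1)} px, 95% radius = {r95:.1f} px")
```

With no stable principal direction, a single radius (re-estimated per operating condition) is enough to describe the region, which is what makes the real-time reconfiguration applications feasible.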
Penetration Testing Professional Ethics: a conceptual model and taxonomy
Directory of Open Access Journals (Sweden)
Justin Pierce
2006-05-01
Full Text Available In an environment where commercial software is continually patched to correct security flaws, penetration testing can provide organisations with a realistic assessment of their security posture. Penetration testing uses the same principles as criminal hackers to penetrate corporate networks and thereby verify the presence of software vulnerabilities. Network administrators can use the results of a penetration test to correct flaws and improve overall security. The use of hacking techniques, however, raises several ethical questions that centre on the integrity of the tester to maintain professional distance and uphold the profession. This paper discusses the ethics of penetration testing and presents our conceptual model and revised taxonomy.
A Framework for Conceptual Modeling of Geographic Data Quality
DEFF Research Database (Denmark)
Friis-Christensen, Anders; Christensen, J.V.; Jensen, Christian Søndergaard
2004-01-01
Sustained advances in wireless communications, geo-positioning, and consumer electronics pave the way to a kind of location-based service that relies on the tracking of the continuously changing positions of an entire population of service users. This type of service is characterized by large...... determined by how "good" the data is, as different applications of geographic data require different qualities of the data are met. Such qualities concern the object level as well as the attribute level of the data. This paper presents a systematic and integrated approach to the conceptual modeling...
Conceptual modelling approach of mechanical products based on functional surface
Institute of Scientific and Technical Information of China (English)
无
2007-01-01
A modelling framework based on functional surfaces is presented to support conceptual design of mechanical products. The framework organizes product information in an abstract and multilevel manner. It consists of two mapping processes: a function decomposition process and a form reconstitution process. The steady mapping relationship from function to form (function-functional surface-form) is realized by taking the functional surface as the middle layer. This greatly reduces the possibility of combinatorial explosion during function decomposition and form reconstitution. Finally, CAD tools are developed and an auto-bender machine is used to demonstrate the proposed approach.
Evaluation of Trapped Radiation Model Uncertainties for Spacecraft Design
Armstrong, T. W.; Colborn, B. L.
2000-01-01
The standard AP8 and AE8 models for predicting trapped proton and electron environments have been compared with several sets of flight data to evaluate model uncertainties. Model comparisons are made with flux, dose, and activation measurements made on various U.S. low-Earth orbit satellites (APEX, CRRES, DMSP, LDEF, NOAA) and Space Shuttle flights, on Russian satellites (Photon-8, Cosmos-1887, Cosmos-2044), and on the Russian Mir Space Station. This report gives a summary of the model-data comparisons; detailed results are given in a companion report. Results from the model comparisons with flight data show, for example, that the AP8 model underpredicts the trapped proton flux at low altitudes by a factor of about two (independent of proton energy and solar cycle conditions), and that the AE8 model overpredicts the flux in the outer electron belt by an order of magnitude or more.
Conceptual Development of a 3D Product Configuration Model
DEFF Research Database (Denmark)
Skauge, Jørn
2006-01-01
Paper. This project deals with 3D product configuration of a digital building element which has been developed as a prototype in cooperation between a product manufacturer and a research institution in Denmark. The project falls within the concept of product modelling which is more and more used...... in the development of IT-systems that support the procedures in companies and in the building industry. In other words, it is a knowledge-based system that helps companies in their daily work. The aim of the project has been to develop and examine conceptual ideas about 3D modelling configurator used in the company......’s production of steel fire sliding doors. The development of the 3D digital model is based on practical rather than theoretical research. The result of the research is a prototype digital 3D model to be presented live....
Institute of Scientific and Technical Information of China (English)
LIU Lei; WU Yu-feng; LI Xiao-jun
2012-01-01
How to measure the quality of conceptual models is an important issue in the IS field and related research. This paper conducts a review of research in measuring conceptual model quality and identifies the major theoretical and practical issues that need to be addressed in future studies. We review current classification frameworks for conceptual model quality and practice of measuring conceptual model quality. Based on the review, challenges for studies of measuring the quality of conceptual models are proposed and these challenges are also research points which should be strengthened in future studies.
A conceptual model for local content development in petroleum industry
Directory of Open Access Journals (Sweden)
Abolfazl Kazzazi
2012-10-01
Full Text Available A novel concept in the oil industry, local content, is gradually emerging. Local content should be defined in terms of value addition in the local country (by local staff, local materials, local services and facilities) rather than in terms of ownership of the company performing the value-added activities. Many oil-exporting countries have taken a positive approach toward local content development to maximize the benefits from oil and gas extraction. The purpose of this study is to develop a conceptual model for local content development in the petroleum industry. Local content can generally be defined in terms of the ownership and/or location of the enterprises involved in production and/or the value added in the production process. Local content promotion will have to vary significantly between countries, depending on the current status of their economic, political and social development. This model is useful for state governments to consider all aspects and factors affecting local content development generally. Local content development outcomes are economic growth, industrial growth and spillover effects. The paper begins by examining the factors that the literature believes influence local content promotion. Based on our review, the conceptual model derived includes key factors of local content that evaluate local content development, and examines interrelations between local policies, local infrastructure, local environment, and local capability.
A python framework for environmental model uncertainty analysis
White, Jeremy; Fienen, Michael; Doherty, John E.
2016-01-01
We have developed pyEMU, a python framework for Environmental Modeling Uncertainty analyses: an open-source tool that is non-intrusive, easy to use, computationally efficient, and scalable to highly parameterized inverse problems. The framework implements several types of linear (first-order, second-moment (FOSM)) and non-linear uncertainty analyses. The FOSM-based analyses can also be completed prior to parameter estimation to help inform important modeling decisions, such as parameterization and objective function formulation. Complete workflows for several types of FOSM-based and non-linear analyses are documented in example notebooks implemented using Jupyter that are available in the online pyEMU repository. Example workflows include basic parameter and forecast analyses, data worth analyses, and error-variance analyses, as well as usage of parameter ensemble generation and management capabilities. These workflows document the necessary steps and provide insights into the results, with the goal of educating users not only in how to apply pyEMU, but also in the underlying theory of applied uncertainty quantification.
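The FOSM analysis that such tools implement reduces, in the linear case, to a Schur-complement update of the prior parameter covariance, followed by projection onto a forecast. The sketch below uses small made-up matrices and plain numpy rather than the pyEMU API itself, to show the mechanics only.

```python
import numpy as np

J = np.array([[1.0, 0.5, 0.0],      # Jacobian: d(observation)/d(parameter), assumed
              [0.2, 1.0, 0.3]])
C_par = np.diag([1.0, 2.0, 0.5])    # prior parameter covariance (assumed)
C_obs = np.diag([0.1, 0.1])         # observation noise covariance (assumed)

# Schur complement: parameter uncertainty remaining after notionally using the data
G = J @ C_par @ J.T + C_obs
C_post = C_par - C_par @ J.T @ np.linalg.solve(G, J @ C_par)

# forecast variance before and after: the reduction is the "worth" of the data
y = np.array([0.0, 1.0, 1.0])       # forecast sensitivity to parameters (assumed)
var_prior = y @ C_par @ y
var_post = y @ C_post @ y
print(f"forecast sigma: prior {np.sqrt(var_prior):.3f} -> posterior {np.sqrt(var_post):.3f}")
```

Because no model runs beyond the Jacobian are needed, this is exactly the kind of analysis that can be done before parameter estimation to guide parameterization choices.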
Uncertainty quantification for quantum chemical models of complex reaction networks.
Proppe, Jonny; Husch, Tamara; Simm, Gregor N; Reiher, Markus
2016-12-22
For the quantitative understanding of complex chemical reaction mechanisms, it is, in general, necessary to accurately determine the corresponding free energy surface and to solve the resulting continuous-time reaction rate equations for a continuous state space. For a general (complex) reaction network, it is computationally hard to fulfill these two requirements. However, it is possible to approximately address these challenges in a physically consistent way. On the one hand, it may be sufficient to consider approximate free energies if a reliable uncertainty measure can be provided. On the other hand, a highly resolved time evolution may not be necessary to still determine quantitative fluxes in a reaction network if one is interested in specific time scales. In this paper, we present discrete-time kinetic simulations in discrete state space taking free energy uncertainties into account. The method builds upon thermo-chemical data obtained from electronic structure calculations in a condensed-phase model. Our kinetic approach supports the analysis of general reaction networks spanning multiple time scales, which is here demonstrated for the example of the formose reaction. An important application of our approach is the detection of regions in a reaction network which require further investigation, given the uncertainties introduced by both approximate electronic structure methods and kinetic models. Such cases can then be studied in greater detail with more sophisticated first-principles calculations and kinetic simulations.
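One ingredient of such uncertainty-aware kinetics, propagating a free-energy-barrier uncertainty through an Eyring rate constant for a unimolecular step, can be sketched as follows. The barrier mean and spread are assumed values chosen for illustration, not results from the paper.

```python
import numpy as np

kB_h = 2.084e10      # Boltzmann constant over Planck constant, 1/(s K)
R = 1.987e-3         # gas constant, kcal/(mol K)
T = 298.15           # temperature, K

rng = np.random.default_rng(3)
# activation free energy [kcal/mol]: assumed mean 20, assumed 1 kcal/mol uncertainty
dG = rng.normal(20.0, 1.0, 10000)

k = kB_h * T * np.exp(-dG / (R * T))   # Eyring rate constants [1/s]
half_life = np.log(2) / k              # first-order half-lives [s]

lo, med, hi = np.percentile(half_life, [5, 50, 95])
print(f"half-life: {med:.2e} s (90% band {lo:.2e} .. {hi:.2e} s)")
```

Even a 1 kcal/mol uncertainty spreads the half-life over orders of magnitude, which is why flagging network regions whose predictions are dominated by such uncertainties (for refinement with better electronic structure methods) is so valuable.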
Characterization and modeling of uncertainty intended for a secured MANET
Directory of Open Access Journals (Sweden)
Md. Amir Khusru Akhtar
2013-08-01
Full Text Available Mobile ad-hoc networks have been chaotic for decades due to their dynamic and heuristic base. They involve several forms of uncertainty, such as vagueness and imprecision. Vagueness can be taken in terms of linguistic assumptions, such as grading and classification for acceptance. Imprecision, on the other hand, can be associated with countable or non-countable assumptions, such as the weights of acceptance calculated by the members of the MANET. This paper presents a "Certainty Intended Model" (CIM) for a secured MANET, introducing one or more expert nodes together with various theories (such as monotone measures, belief, plausibility and evidence). These theories can be used for the characterization and modeling of various forms of uncertainty. Further, these characterizations help in quantifying the uncertainty spectrum because, as more information about the problem becomes available, we can transform from one theory to another. In this work we have shown how these theories and expert opinion help to identify the setbacks associated with the MANET in respect of trust management and, finally, enhance the security, reliability and performance of the MANET.
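The belief and plausibility measures mentioned above can be illustrated with a toy mass assignment over a two-element frame of discernment. The trust masses below are invented for illustration and are not the paper's values.

```python
# Frame of discernment for a node's trustworthiness
frame = frozenset({"trust", "distrust"})

# basic probability assignment: mass on "trust", on "distrust",
# and on the whole frame ("don't know") -- assumed toy values
m = {frozenset({"trust"}): 0.5,
     frozenset({"distrust"}): 0.2,
     frame: 0.3}

def belief(A):
    """Total mass committed to subsets of A (evidence directly supporting A)."""
    return sum(v for s, v in m.items() if s <= A)

def plausibility(A):
    """Total mass not contradicting A (evidence compatible with A)."""
    return sum(v for s, v in m.items() if s & A)

A = frozenset({"trust"})
print(f"Bel = {belief(A):.1f}, Pl = {plausibility(A):.1f}")
```

The interval [Bel, Pl] = [0.5, 0.8] quantifies the uncertainty spectrum: as expert nodes contribute more information, mass moves off the frame and the interval narrows toward a single probability.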
DEFF Research Database (Denmark)
Maiorano, Andrea; Martre, Pierre; Asseng, Senthold
2017-01-01
To improve climate change impact estimates and to quantify their uncertainty, multi-model ensembles (MMEs) have been suggested. Model improvements can improve the accuracy of simulations and reduce the uncertainty of climate change impact assessments. Furthermore, they can reduce the number of mo...
Model parameter uncertainty analysis for an annual field-scale phosphorus loss model
Phosphorous (P) loss models are important tools for developing and evaluating conservation practices aimed at reducing P losses from agricultural fields. All P loss models, however, have an inherent amount of uncertainty associated with them. In this study, we conducted an uncertainty analysis with ...
A sliding mode observer for hemodynamic characterization under modeling uncertainties
Zayane, Chadia
2014-06-01
This paper addresses the case of physiological state reconstruction in a small region of the brain under modeling uncertainties. The misunderstood coupling between the cerebral blood volume and the oxygen extraction fraction has led to a partial knowledge of the so-called balloon model describing the hemodynamic behavior of the brain. To overcome this difficulty, a High Order Sliding Mode observer is applied to the balloon system, where the unknown coupling is considered as an internal perturbation. The effectiveness of the proposed method is illustrated through a set of synthetic data that mimic fMRI experiments.
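The observer principle, a discontinuous sign-injection term whose gain dominates the unknown perturbation, can be shown on a toy first-order system rather than the balloon model itself. The system, gains and perturbation below are all assumptions made for this sketch.

```python
import numpy as np

dt, T = 1e-3, 5.0
t = np.arange(0.0, T, dt)
d = 0.5 * np.sin(2 * np.pi * t)   # unknown bounded perturbation, |d| <= 0.5
                                  # (standing in for the misunderstood coupling)

x = np.zeros_like(t)              # true state of dx/dt = -x + d(t)
xh = np.zeros_like(t)             # observer estimate
x[0], xh[0] = 1.0, 0.0
L = 1.0                           # injection gain, chosen larger than sup|d|

for i in range(len(t) - 1):
    x[i + 1] = x[i] + dt * (-x[i] + d[i])
    # observer measures only x and injects a discontinuous sign correction;
    # it never needs to know d(t), only a bound on it
    xh[i + 1] = xh[i] + dt * (-xh[i] + L * np.sign(x[i] - xh[i]))

err_end = abs(x[-1] - xh[-1])
print(f"final estimation error: {err_end:.4f}")
```

Despite the perturbation being completely unknown to the observer, the estimation error is driven into a small neighborhood of zero (the residual chattering scales with the time step), which is the property the paper exploits for the balloon system.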
A conceptual glacio-hydrological model for high mountainous catchments
Directory of Open Access Journals (Sweden)
B. Schaefli
2005-01-01
Full Text Available In high mountainous catchments, the spatial precipitation, and therefore the overall water balance, is generally difficult to estimate. The present paper describes the structure and calibration of a semi-lumped conceptual glacio-hydrological model for the joint simulation of daily discharge and annual glacier mass balance, which represents a better integrator of the water balance. The model has been developed for climate change impact studies and therefore has a parsimonious structure; it requires three input time series – precipitation, temperature and potential evapotranspiration – and has 7 parameters to calibrate. A multi-signal approach considering daily discharge and – if available – annual glacier mass balance has been developed for the calibration of these parameters. The model has been calibrated for three different catchments in the Swiss Alps having glaciation rates between 37% and 52%. It simulates well the observed daily discharge, the hydrological regime and some basic glaciological features, such as the annual mass balance.
A CONCEPTUAL FRAMEWORK FOR SUSTAINABLE POULTRY SUPPLY CHAIN MODEL
Directory of Open Access Journals (Sweden)
Mohammad SHAMSUDDOHA
2013-12-01
Full Text Available Nowadays, the sustainable supply chain is a crucial consideration for future-focused industries. Attention to supply chain management has increased markedly since the 1980s, when firms discovered the benefits of mutual relationships within and beyond their own organization. This is why researchers are trying hard to develop new theories and models that might help the corporate sector achieve sustainability in its supply chains. This reflection can be seen in the number of papers published, and in particular by journal, since 1980. The objectives of this paper are twofold. First, it offers a literature review on sustainable supply chain management covering papers published in the last three decades. Second, it offers a conceptual sustainable supply chain process model in light of triple bottom line theory. The model has been developed through an in-depth interview with an entrepreneur from a poultry case industry in Bangladesh.
Absorptive Capacity of Information Technology and Its Conceptual Model
Institute of Scientific and Technical Information of China (English)
BI Xinhua; YU Cuiling
2008-01-01
In order to examine the problem of how to improve the use of information technology (IT) in enterprises, this paper makes an exploration from the perspective of organizational absorptive capacity. We propose the concept of IT absorptive capacity at an organizational level. A dynamic process model is developed to further analyze IT absorption. The IT absorptive capacity of this process is embodied in six forms: identification, adoption, adaptation, acceptance, infusion, and knowledge management. By means of questionnaire surveys of 76 Chinese enterprises, the main factors that favor or hinder the capacity at each stage are discovered. Using the method of system dynamics, a conceptual model of IT absorptive capacity is developed to analyze the action mechanism of the factors in detail. The model indicates that the critical factors are embodied in the aspect of management. Furthermore, it demonstrates that IT absorption is a spiral process, during which IT absorptive capacity evolves dynamically and, consequently, promotes IT use.
Herrera, Christian; Custodio, Emilio
2008-11-01
Most human activities and hydrogeological information on small, young volcanic islands are concentrated near the coast. There are almost no hydrological data from inland areas, where permanent springs and/or boreholes may be rare or nonexistent. A major concern is the excessive salinity of near-the-coast wells. Obtaining a conceptual hydrogeological model is crucial for groundwater resources development and management. Surveys of water seepages and rain for chemical and environmental isotope contents may provide information on groundwater flow conditions across the whole island, in spite of remaining geological and hydrogeological uncertainties. New data from Easter Island (Isla de Pascua), in the Pacific Ocean, are considered. Whether Easter Island has a central low-permeability volcanic “core” sustaining an elevated water table remains unknown. Average recharge is estimated at 300-400 mm/year, with a low salinity of 15-50 mg/L Cl. An apron of highly permeable volcanics extends to the coast. The salinity of near-the-coast wells, >1,000 mg/L Cl, is marine in origin. It results from a thick mixing zone of island groundwater and encroached seawater, locally enhanced by upconings below pumping wells. This conceptual model explains what is observed, in the absence of inland boreholes and springs.
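The marine origin of the well salinity can be checked with a two-end-member chloride mixing calculation. The sketch below is illustrative only: the recharge chloride (30 mg/L, mid-range of the reported 15-50 mg/L) and a typical seawater chloride of about 19,000 mg/L are assumed values, not figures from the study.

```python
def seawater_fraction(cl_sample, cl_fresh=30.0, cl_sea=19000.0):
    """Two-end-member mixing with chloride as a conservative tracer (mg/L):
    the fraction of seawater needed to explain a sample's salinity."""
    return (cl_sample - cl_fresh) / (cl_sea - cl_fresh)

f = seawater_fraction(1000.0)  # a near-the-coast well with 1,000 mg/L Cl
```

A few percent of encroached seawater is enough to push a well above 1,000 mg/L Cl, consistent with the thick mixing zone invoked by the conceptual model.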
Updated Conceptual Model for the 300 Area Uranium Groundwater Plume
Energy Technology Data Exchange (ETDEWEB)
Zachara, John M.; Freshley, Mark D.; Last, George V.; Peterson, Robert E.; Bjornstad, Bruce N.
2012-11-01
The 300 Area uranium groundwater plume in the 300-FF-5 Operable Unit is residual from past discharge of nuclear fuel fabrication wastes to a number of liquid (and solid) disposal sites. The source zones in the disposal sites were remediated by excavation and backfilled to grade, but sorbed uranium remains in deeper, unexcavated vadose zone sediments. In spite of source term removal, the groundwater plume has shown remarkable persistence, with concentrations exceeding the drinking water standard over an area of approximately 1 km2. The plume resides within a coupled vadose zone, groundwater, river zone system of immense complexity and scale. Interactions between geologic structure, the hydrologic system driven by the Columbia River, groundwater-river exchange points, and the geochemistry of uranium contribute to persistence of the plume. The U.S. Department of Energy (DOE) recently completed a Remedial Investigation/Feasibility Study (RI/FS) to document characterization of the 300 Area uranium plume and plan for beginning to implement proposed remedial actions. As part of the RI/FS document, a conceptual model was developed that integrates knowledge of the hydrogeologic and geochemical properties of the 300 Area and controlling processes to yield an understanding of how the system behaves and the variables that control it. Recent results from the Hanford Integrated Field Research Challenge site and the Subsurface Biogeochemistry Scientific Focus Area Project funded by the DOE Office of Science were used to update the conceptual model and provide an assessment of key factors controlling plume persistence.
An analogue conceptual rainfall-runoff model for educational purposes
Herrnegger, Mathew; Riedl, Michael; Schulz, Karsten
2016-04-01
Conceptual rainfall-runoff models, in which runoff processes are modelled with a series of connected linear and non-linear reservoirs, remain widely applied tools in science and practice. Additionally, the concept is appreciated in teaching due to its relative simplicity in explaining and exploring hydrological processes of catchments. However, when a series of reservoirs is used, the model system becomes highly parametrized and complex, and the model results become difficult to trace for an audience not accustomed to numerical modelling. Since the simulations are normally performed by code that cannot be seen, the results are also not easily comprehensible. This contribution therefore presents a liquid analogue model, in which a conceptual rainfall-runoff model is reproduced by a physical model. It consists of different acrylic glass containers representing different storage components within a catchment, e.g. soil water or groundwater storage. The containers are equipped and connected with pipes, in which water movement represents different flow processes, e.g. surface runoff, percolation or base flow. Water from a storage container is pumped to the upper part of the model and represents effective rainfall input. The water then flows by gravity through the different pipes and storages. Valves are used for controlling the flows within the analogue model, comparable to the parameterization procedure in numerical models. Additionally, an inexpensive microcontroller-based board and sensors are used to measure storage water levels, with online visualization of the states as time series data, building a bridge between the analogue and digital world. The ability to physically witness the different flows and water levels in the storages makes the analogue model attractive to the audience. Hands-on experiments can be performed with students, in which different scenarios or catchment types can be simulated, not only with the analogue but
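The storage-and-valve idea behind the analogue model maps directly onto linear reservoirs. A minimal numerical counterpart, with invented recession constants k1 and k2 standing in for the valve settings, might look like:

```python
# Two linear reservoirs in series: an upper "soil" container drains into a
# lower "groundwater" container; the valve settings of the physical model
# correspond to the recession constants k1, k2 (values are illustrative).
def simulate(rain, k1=0.3, k2=0.05, dt=1.0):
    s1 = s2 = 0.0
    runoff = []
    for p in rain:          # p = effective rainfall per time step
        q1 = k1 * s1        # percolation out of the soil store
        q2 = k2 * s2        # base flow out of the groundwater store
        s1 += (p - q1) * dt
        s2 += (q1 - q2) * dt
        runoff.append(q2)
    return runoff

hydrograph = simulate([10.0] * 5 + [0.0] * 45)  # a 5-step rain pulse
```

Opening a valve (raising k2) shortens the recession, just as in the physical model.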
Zahmatkesh, Zahra; Karamouz, Mohammad; Nazif, Sara
2015-09-01
Simulation of the rainfall-runoff process in urban areas is of great importance considering the consequences and damages of extreme runoff events and floods. The first issue in flood hazard analysis is rainfall simulation. Large-scale climate signals have proved effective in rainfall simulation and prediction. In this study, an integrated scheme is developed for rainfall-runoff modeling considering different sources of uncertainty. This scheme includes three main steps: rainfall forecasting, rainfall-runoff simulation and future runoff prediction. In the first step, data-driven models are developed and used to forecast rainfall using large-scale climate signals as rainfall predictors. Because different sources of uncertainty strongly affect the output of hydrologic models, in the second step the uncertainty associated with input data, model parameters and model structure is incorporated in rainfall-runoff modeling and simulation. Three rainfall-runoff simulation models are developed for consideration of model conceptual (structural) uncertainty in real-time runoff forecasting. To analyze the uncertainty of the model structure, streamflows generated by alternative rainfall-runoff models are combined through a weighting method based on K-means clustering. Model parameter and input uncertainty are investigated using an adaptive Markov Chain Monte Carlo method. Finally, the calibrated rainfall-runoff models are driven by the forecasted rainfall to predict future runoff for the watershed. The proposed scheme is employed in the case study of the Bronx River watershed, New York City. Results of the uncertainty analysis reveal that simultaneous estimation of model parameters and input uncertainty significantly changes the probability distribution of the model parameters. It is also observed that by combining the outputs of the hydrological models using the proposed clustering scheme, the accuracy of runoff simulation in the
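The paper's K-means-based weighting of alternative model outputs is not detailed in the abstract; as a stand-in, the sketch below combines competing simulations with simple inverse-RMSE weights (all numbers invented) to illustrate the general idea of merging several model structures into one forecast.

```python
# Three hypothetical streamflow simulations from alternative model
# structures, merged into a single weighted forecast.
obs    = [5.0, 7.0, 9.0, 6.0, 4.0]
models = {"A": [4.5, 7.5, 8.0, 6.5, 4.5],
          "B": [6.0, 6.0, 10.5, 5.0, 3.0],
          "C": [5.2, 7.1, 9.3, 5.8, 4.1]}

def rmse(sim):
    return (sum((s - o) ** 2 for s, o in zip(sim, obs)) / len(obs)) ** 0.5

w = {m: 1.0 / rmse(sim) for m, sim in models.items()}   # better fit, more weight
total = sum(w.values())
w = {m: wi / total for m, wi in w.items()}              # normalize to sum 1
combined = [sum(w[m] * models[m][t] for m in models) for t in range(len(obs))]
```

Here the combined series fits the observations better than any single structure, which is the motivation for multi-model combination schemes of this kind.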
Assessment of Solution Uncertainties in Single-Column Modeling Frameworks.
Hack, James J.; Pedretti, John A.
2000-01-01
Single-column models (SCMs) have been extensively promoted in recent years as an effective means to develop and test physical parameterizations targeted for more complex three-dimensional climate models. Although there are some clear advantages associated with single-column modeling, there are also some significant disadvantages, including the absence of large-scale feedbacks. Basic limitations of an SCM framework can make it difficult to interpret solutions, and at times contribute to rather striking failures to identify even first-order sensitivities as they would be observed in a global climate simulation. This manuscript will focus on one of the basic experimental approaches currently exploited by the single-column modeling community, with an emphasis on establishing the inherent uncertainties in the numerical solutions. The analysis will employ the standard physics package from the NCAR CCM3 and will illustrate the nature of solution uncertainties that arise from nonlinearities in parameterized physics. The results of this study suggest the need to make use of an ensemble methodology when conducting single-column modeling investigations.
Quantifying uncertainty, variability and likelihood for ordinary differential equation models
LENUS (Irish Health Repository)
Weisse, Andrea Y
2010-10-28
Abstract Background In many applications, ordinary differential equation (ODE) models are subject to uncertainty or variability in initial conditions and parameters. Both uncertainty and variability can be quantified in terms of a probability density function on the state and parameter space. Results The partial differential equation that describes the evolution of this probability density function has a form that is particularly amenable to application of the well-known method of characteristics. The value of the density at some point in time is directly accessible by the solution of the original ODE extended by a single extra dimension (for the value of the density). This leads to simple methods for studying uncertainty, variability and likelihood, with significant advantages over more traditional Monte Carlo and related approaches, especially when studying regions with low probability. Conclusions While such approaches based on the method of characteristics are common practice in other disciplines, their advantages for the study of biological systems have so far remained unrecognized. Several examples illustrate the performance and accuracy of the approach and its limitations.
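The "single extra dimension" can be made concrete for a one-dimensional ODE: along a characteristic, the density obeys d(rho)/dt = -rho * div(f), so carrying log(rho) as an extra state suffices. A sketch for the linear ODE dx/dt = -0.5x, chosen because the exact answer is known:

```python
import math

def f(x):           # drift of the ODE dx/dt = f(x)
    return -0.5 * x

def divergence(x):  # df/dx (the trace of the Jacobian in 1-D)
    return -0.5

def propagate_density(x0, rho0, t_end, n=10000):
    """Integrate the ODE together with one extra state for the density.

    Along a characteristic, d(rho)/dt = -rho * div(f), so we carry
    log(rho) as the extra dimension (plain Euler steps for brevity)."""
    dt = t_end / n
    x, log_rho = x0, math.log(rho0)
    for _ in range(n):
        log_rho += -divergence(x) * dt
        x += f(x) * dt
    return x, math.exp(log_rho)

x_t, rho_t = propagate_density(x0=1.0, rho0=1.0, t_end=2.0)
# analytic solution: x(t) = x0 * exp(-0.5 t), rho(t) = rho0 * exp(0.5 t)
```

For this contracting drift, the density along each trajectory grows as exp(0.5 t), which the augmented integration reproduces.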
Modeling of uncertainties for wind turbine blade design
DEFF Research Database (Denmark)
Sørensen, John Dalsgaard; Toft, Henrik Stensgaard
2014-01-01
Wind turbine blades are designed by a combination of tests and numerical calculations using finite element models of the blade. The blades are typically composite structures with laminates of glass-fiber and/or carbon-fibers glued together by a matrix material. This paper presents a framework...... for stochastic modelling of the load bearing capacity of wind turbine blades incorporating physical, model, measurement and statistical uncertainties at the different scales and also discusses the possibility to define numerical tests that can be included in the statistical basis. The stochastic modelling takes...... basis in the JCSS framework for modelling material properties, Bayesian statistical methods allowing prior / expert knowledge to be accounted for and the Maximum Likelihood Method. The stochastic framework is illustrated using simulated tests which represent examples relevant for wind turbine blades....
Multiscale Modeling and Uncertainty Quantification for Nuclear Fuel Performance
Energy Technology Data Exchange (ETDEWEB)
Estep, Donald [Colorado State Univ., Fort Collins, CO (United States); El-Azab, Anter [Florida State Univ., Tallahassee, FL (United States); Pernice, Michael [Idaho National Lab. (INL), Idaho Falls, ID (United States); Peterson, John W. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Polyakov, Peter [Univ. of Wyoming, Laramie, WY (United States); Tavener, Simon [Colorado State Univ., Fort Collins, CO (United States); Xiu, Dongbin [Purdue Univ., West Lafayette, IN (United States); Univ. of Utah, Salt Lake City, UT (United States)
2017-03-23
In this project, we will address the challenges associated with constructing high fidelity multiscale models of nuclear fuel performance. We (*) propose a novel approach for coupling mesoscale and macroscale models, (*) devise efficient numerical methods for simulating the coupled system, and (*) devise and analyze effective numerical approaches for error and uncertainty quantification for the coupled multiscale system. As an integral part of the project, we will carry out analysis of the effects of upscaling and downscaling, investigate efficient methods for stochastic sensitivity analysis of the individual macroscale and mesoscale models, and carry out a posteriori error analysis for computed results. We will pursue development and implementation of solutions in software used at Idaho National Laboratories on models of interest to the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program.
Uncertainties in modelling the climate impact of irrigation
de Vrese, Philipp; Hagemann, Stefan
2017-04-01
Many issues related to the climate impact of irrigation are addressed in studies that apply a wide range of models. These involve uncertainties related to differences in the models' general structure and parametrizations on the one hand, and the need for simplifying assumptions with respect to the representation of irrigation on the other. To address these uncertainties, we used the Max Planck Institute for Meteorology's Earth System Model, into which a simple irrigation scheme was implemented. In several simulations, we varied certain irrigation characteristics to estimate the resulting variations in irrigation's climate impact and found a large sensitivity with respect to irrigation effectiveness. Here, the assumed effectiveness of the scheme is a combination of the target soil moisture and the degree to which water losses are accounted for. In general, the simulated impact of irrigation on the state of the land surface and the atmosphere is more than three times larger when assuming a low irrigation effectiveness compared to a high effectiveness. In an additional set of simulations, we varied certain aspects of the model's general structure, namely the land-surface-atmosphere coupling, to estimate the related uncertainties. Here we compared the impact of irrigation between simulations using parameter aggregation, a simple flux aggregation scheme, and a coupling scheme that also accounts for spatial heterogeneity within the lowest layers of the atmosphere. It was found that changes in the land-surface-atmosphere coupling not only affect the magnitude of the climate impacts but can even reverse their direction.
Selection of Representative Models for Decision Analysis Under Uncertainty
Meira, Luis A. A.; Coelho, Guilherme P.; Santos, Antonio Alberto S.; Schiozer, Denis J.
2016-03-01
The decision-making process in oil fields includes a step of risk analysis associated with the uncertainties present in the variables of the problem. Such uncertainties lead to hundreds, even thousands, of possible scenarios that must be analyzed so that an effective production strategy can be selected. Given this high number of scenarios, a technique to reduce this set to a smaller, feasible subset of representative scenarios is imperative. The selected scenarios must be representative of the original set and also free of optimistic and pessimistic bias. This paper proposes an assisted methodology to identify representative models in oil fields. To do so, a mathematical function was first developed to model the representativeness of a subset of models with respect to the full set that characterizes the problem. Then, an optimization tool was implemented to identify the representative models of any problem, considering not only the cross-plots of the main output variables, but also the risk curves and the probability distribution of the attribute-levels of the problem. The proposed technique was applied to two benchmark cases and the results, evaluated by experts in the field, indicate that the obtained solutions are richer than those identified by previously adopted manual approaches. The program bytecode is available upon request.
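The paper's representativeness function and optimization tool are not given in the abstract; a much simpler quantile-matching selection conveys the intent of picking a small, unbiased subset of scenarios (the NPV figures below are synthetic):

```python
import numpy as np

rng = np.random.default_rng(11)
npv = rng.normal(100.0, 30.0, size=1000)  # synthetic scenario outcomes (NPV)

def representative(values, n_rep=9):
    """Pick the scenarios nearest to equally spaced quantiles, so the small
    subset tracks the full risk curve without optimistic/pessimistic bias
    (a simple stand-in for the paper's optimization-based selection)."""
    probs = (np.arange(n_rep) + 0.5) / n_rep
    targets = np.quantile(values, probs)
    idx = [int(np.abs(values - t).argmin()) for t in targets]
    return np.array(sorted(set(idx)))

idx = representative(npv)
rep_mean, full_mean = npv[idx].mean(), npv.mean()
```

Because the subset spans the quantiles symmetrically, its mean stays close to that of the full scenario set, which is the bias-avoidance property the paper asks of representative models.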
Comparing Two Strategies to Model Uncertainties in Structural Dynamics
Directory of Open Access Journals (Sweden)
Rubens Sampaio
2010-01-01
Full Text Available In the modeling of dynamical systems, uncertainties are present and they must be taken into account to improve the prediction of the models. Some strategies have been used to model uncertainties, and the aim of this work is to discuss two of those strategies and to compare them. This will be done using the simplest model possible: a two-d.o.f. (degrees of freedom) dynamical system. A simple system is used because it is very helpful to assure a better understanding and, consequently, comparison of the strategies. The first strategy (called the parametric strategy) consists in taking each spring stiffness as uncertain and associating a random variable to each one of them. The second strategy (called the nonparametric strategy) is more general: it considers the whole stiffness matrix as uncertain, and associates a random matrix to it. In both cases, the probability density functions, either of the random parameters or of the random matrix, are deduced from the Maximum Entropy Principle using only the available information. With this example, some important results can be discussed, which cannot be assessed when complex structures are used, as has been done so far in the literature. One important element for the comparison of the two strategies is the analysis of the sample spaces and how to compare them.
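A Monte Carlo sketch of the two strategies for a 2-d.o.f. spring-mass chain. The Gamma prior on the stiffnesses and the normalized Wishart-type random matrix are standard Maximum-Entropy-style choices, but the specific distributions, dispersion values and masses here are illustrative assumptions, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(0)
k1_mean, k2_mean = 2.0, 1.0          # nominal spring stiffnesses, unit masses

def k_matrix(k1, k2):
    # stiffness of a 2-dof chain: ground - k1 - m1 - k2 - m2
    return np.array([[k1 + k2, -k2], [-k2, k2]])

K0 = k_matrix(k1_mean, k2_mean)

def first_freq(K):
    return np.sqrt(np.linalg.eigvalsh(K).min())   # rad/s, unit masses

# -- parametric strategy: each stiffness is a Gamma random variable
#    (a MaxEnt-style prior for a positive quantity with given mean and CV)
delta = 0.1                          # assumed coefficient of variation
shape = 1.0 / delta ** 2
f_param = [first_freq(k_matrix(rng.gamma(shape, k1_mean / shape),
                               rng.gamma(shape, k2_mean / shape)))
           for _ in range(2000)]

# -- nonparametric strategy: the whole matrix is random
#    (a normalized Wishart-type ensemble built from the Cholesky factor of K0)
L = np.linalg.cholesky(K0)
p = 50                               # controls the matrix dispersion
def random_K():
    G = rng.standard_normal((2, p)) / np.sqrt(p)
    return L @ (G @ G.T) @ L.T
f_nonparam = [first_freq(random_K()) for _ in range(2000)]
```

Comparing the histograms of `f_param` and `f_nonparam` shows the point of the paper: even when both ensembles are centred on the same nominal system, the two strategies induce different sample spaces and hence different frequency statistics.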
Sustainable infrastructure system modeling under uncertainties and dynamics
Huang, Yongxi
potential risks caused by feedstock seasonality and demand uncertainty. Facility spatiality, time variation of feedstock yields, and demand uncertainty are integrated into a two-stage stochastic programming (SP) framework. In the study of Transitional Energy System Modeling under Uncertainty, a multistage stochastic dynamic programming model is established to optimize the process of building and operating fuel production facilities during the transition. Dynamics due to evolving technologies and societal changes, and uncertainty due to demand fluctuations, are the major issues to be addressed.
A conceptual model for translating omic data into clinical action
Directory of Open Access Journals (Sweden)
Timothy M Herr
2015-01-01
Full Text Available Genomic, proteomic, epigenomic, and other "omic" data have the potential to enable precision medicine, also commonly referred to as personalized medicine. The volume and complexity of omic data are rapidly overwhelming human cognitive capacity, requiring innovative approaches to translate such data into patient care. Here, we outline a conceptual model for the application of omic data in the clinical context, called "the omic funnel." This model parallels the classic "Data, Information, Knowledge, Wisdom pyramid" and adds context for how to move between each successive layer. Its goal is to allow informaticians, researchers, and clinicians to approach the problem of translating omic data from bench to bedside, by using discrete steps with clearly defined needs. Such an approach can facilitate the development of modular and interoperable software that can bring precision medicine into widespread practice.
Model requirements for decision support under uncertainty in data scarce dynamic deltas
Haasnoot, Marjolijn; van Deursen, W.P.A.; Kwakkel, J. H.; Middelkoop, H.
2016-01-01
There is a long tradition of model-based decision support in water management. The consideration of deep uncertainty, however, changes the requirements imposed on models. In the face of deep uncertainty, models are used to explore many uncertainties and the decision space across multiple outcomes o
Workshop on Model Uncertainty and its Statistical Implications
1988-01-01
In this book problems related to the choice of models in such diverse fields as regression, covariance structure, time series analysis and multinomial experiments are discussed. The emphasis is on the statistical implications for model assessment when the assessment is done with the same data that generated the model. This is a problem of long standing, notorious for its difficulty. Some contributors discuss this problem in an illuminating way. Others, and this is a truly novel feature, investigate systematically whether sample re-use methods like the bootstrap can be used to assess the quality of estimators or predictors in a reliable way given the initial model uncertainty. The book should prove to be valuable for advanced practitioners and statistical methodologists alike.
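The book's question of whether sample re-use can assess estimator quality is easy to demonstrate in miniature: the bootstrap below estimates the standard error of a sample mean by resampling the very data that produced it (the data are synthetic).

```python
import random

random.seed(2)
data = [random.gauss(5.0, 1.0) for _ in range(100)]   # one observed sample

def mean(xs):
    return sum(xs) / len(xs)

# re-use the sample: draw 2000 resamples with replacement and look at the
# spread of the recomputed estimator
boot = [mean(random.choices(data, k=len(data))) for _ in range(2000)]
m = mean(boot)
se = (sum((b - m) ** 2 for b in boot) / (len(boot) - 1)) ** 0.5
# bootstrap standard error of the sample mean, near sigma/sqrt(n) = 0.1
```

Whether such re-use remains reliable when the same data also selected the model is exactly the subtler issue the contributors investigate.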
Modelling of physical properties - databases, uncertainties and predictive power
DEFF Research Database (Denmark)
Gani, Rafiqul
Physical and thermodynamic properties in the form of raw data or estimated values for pure compounds and mixtures are important pre-requisites for performing tasks such as process design, simulation and optimization; computer aided molecular/mixture (product) design; and product-process analysis....... While use of experimentally measured values of the needed properties is desirable in these tasks, the experimental data of the properties of interest may not be available or may not be measurable in many cases. Therefore, property models that are reliable, predictive and easy to use are necessary....... However, which models should be used to provide the reliable estimates of the required properties? And, how much measured data is necessary to regress the model parameters? How to ensure predictive capabilities in the developed models? Also, as it is necessary to know the associated uncertainties
System convergence in transport models: algorithms efficiency and output uncertainty
DEFF Research Database (Denmark)
Rich, Jeppe; Nielsen, Otto Anker
2015-01-01
much in the literature. The paper first investigates several variants of the Method of Successive Averages (MSA) by simulation experiments on a toy-network. It is found that the simulation experiments produce support for a weighted MSA approach. The weighted MSA approach is then analysed on large......-scale in the Danish National Transport Model (DNTM). It is revealed that system convergence requires that either demand or supply is without random noise but not both. In that case, if MSA is applied to the model output with random noise, it will converge effectively as the random effects are gradually dampened...... in the MSA process. In connection to DNTM it is shown that MSA works well when applied to travel-time averaging, whereas trip averaging is generally infected by random noise resulting from the assignment model. The latter implies that the minimum uncertainty in the final model output is dictated...
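The MSA variants under discussion reduce to one update rule. The sketch below shows plain MSA (weight 1/k) dampening the random noise of a toy supply model; the weighted variants in the paper replace `weight` with slower-decaying sequences (the model and weights here are illustrative, not those of the DNTM).

```python
import random

def msa(sample, n_iter=5000, weight=lambda k: 1.0 / k):
    """Averaging of a noisy model output: x_k <- x_{k-1} + w_k (y_k - x_{k-1}).

    weight(k) = 1/k gives plain MSA (the running sample mean); weights such
    as k ** -0.7 give weighted-MSA variants that react faster early on."""
    x = sample()
    for k in range(2, n_iter + 1):
        x += weight(k) * (sample() - x)
    return x

random.seed(1)
noisy_travel_time = lambda: 10.0 + random.gauss(0.0, 2.0)  # toy supply model
estimate = msa(noisy_travel_time)  # random effects are gradually dampened
```

This is the mechanism behind the paper's finding: applying MSA to the noisy output (here, travel time) converges toward the underlying mean as the random effects are averaged out.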
Franz, K.; Hogue, T.; Barco, J.
2007-12-01
Identification of appropriate parameter sets for simulation of streamflow in ungauged basins has become a significant challenge for both operational and research hydrologists. This is especially difficult in the case of conceptual models, when model parameters typically must be "calibrated" or adjusted to match streamflow conditions in specific systems (i.e. some of the parameters are not directly observable). This paper addresses the performance and uncertainty associated with transferring conceptual rainfall-runoff model parameters between basins within large-scale ecoregions. We use the National Weather Service's (NWS) operational hydrologic model, the SACramento Soil Moisture Accounting (SAC-SMA) model. A Multi-Step Automatic Calibration Scheme (MACS), using the Shuffled Complex Evolution (SCE) algorithm, is used to optimize SAC-SMA parameters for a group of watersheds with extensive hydrologic records from the Model Parameter Estimation Experiment (MOPEX) database. We then explore "hydroclimatic" relationships between basins to facilitate regionalization of parameters for an established ecoregion in the southeastern United States. The impact of regionalized parameters is evaluated via standard model performance statistics as well as through generation of hindcasts and probabilistic verification procedures to evaluate streamflow forecast skill. Preliminary results show climatology ("climate neighbor") to be a better indicator of transferability than physical similarities or proximity ("nearest neighbor"). The mean and median of all the parameters within the ecoregion are the poorest choice for the ungauged basin. The choice of regionalized parameter set affected the skill of the ensemble streamflow hindcasts; however, all parameter sets show little skill in forecasts after five weeks (i.e. climatology is as good an indicator of future streamflows). In addition, the optimum parameter set changed seasonally, with the "nearest neighbor" showing the highest skill in the
Improved methodology for developing cost uncertainty models for naval vessels
Brown, Cinda L.
2008-01-01
The purpose of this thesis is to analyze the probabilistic cost model currently in use by NAVSEA 05C to predict cost uncertainty in naval vessel construction and to develop a method that better predicts the ultimate cost risk. The data used to develop the improved approach is collected from analysis of the CG(X) class ship by NAVSEA 05C. The NAVSEA 05C cost risk factors are reviewed and analyzed to determine if different factors are better cost predictors. The impact of data elicitation, t...
Modelling with uncertainties: The role of the fission barrier
Directory of Open Access Journals (Sweden)
Lü Hongliang
2013-12-01
Full Text Available Fission is the dominant decay channel of super-heavy elements formed in heavy-ion collisions. The probability of synthesizing heavy or super-heavy nuclei in fusion-evaporation reactions is therefore very sensitive to the height of their fission barriers. This contribution will first address the influence of theoretical uncertainty on excitation functions. Our second aim is to investigate the inverse problem, i.e., what information about the fission barriers can be extracted from excitation functions? For this purpose, Bayesian methods have been used with a simplified toy model.
The uncertainty of modeled soil carbon stock change for Finland
Lehtonen, Aleksi; Heikkinen, Juha
2013-04-01
Countries should report soil carbon stock changes of forests under the Kyoto Protocol. Under the Protocol, one can omit reporting a carbon pool by verifying that the pool is not a source of carbon, which is especially tempting for the soil pool. However, verifying that the soils of a nation are not a source of carbon in a given year seems nearly impossible. The Yasso07 model was parametrized against various decomposition data using an MCMC method. Soil carbon change in Finland between 1972 and 2011 was simulated with the Yasso07 model using litter input data derived from the National Forest Inventory (NFI) and fellings time series. The uncertainties of biomass models, litter turnover rates, NFI sampling and the Yasso07 model were propagated with Monte Carlo simulations. Due to the biomass estimation methods, the uncertainties of the various litter input sources (e.g. living trees, natural mortality and fellings) correlate strongly with each other. We show how the original covariance matrices can be combined analytically, greatly reducing the number of simulated components. While doing the simulations we found that proper handling of correlations may be even more essential than accurate estimates of standard errors. As a preliminary result, we found that both Southern and Northern Finland were soil carbon sinks, with coefficients of variation (CV) varying between 10% and 25% when the model was driven with long-term constant weather data. When we applied annual weather data, soils were both sinks and sources of carbon and CVs varied from 10% to 90%. This implies that the success of soil carbon sink verification depends on the weather data applied with the models. Consequently, the IPCC should provide clear guidance on the weather data to be applied with soil carbon models and on soil carbon sink verification. In the UNFCCC reporting, carbon sinks of forest biomass have typically been averaged over five years; a similar period for soil model weather data would be logical.
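The point about handling correlations between litter-input errors, rather than treating them as independent, can be illustrated with a small Monte Carlo. All numbers below are invented; only the mechanism (sampling from the joint covariance and comparing against the diagonal-only case) reflects the abstract.

```python
import numpy as np

rng = np.random.default_rng(42)

# Invented litter-input terms (living trees, natural mortality, fellings),
# correlated because they share the same biomass models:
mean = np.array([3.0, 0.5, 1.0])                  # Tg C / year (illustrative)
cov = np.array([[0.09, 0.03, 0.02],
                [0.03, 0.04, 0.01],
                [0.02, 0.01, 0.05]])

def cv_of_stock_change(covariance, n=20000):
    """Monte Carlo propagation of litter-input uncertainty through a toy
    'soil model' that retains 40% of the summed litter input."""
    samples = rng.multivariate_normal(mean, covariance, size=n)
    change = samples.sum(axis=1) * 0.4
    return change.std() / change.mean()

cv = cv_of_stock_change(cov)                          # full covariance
cv_indep = cv_of_stock_change(np.diag(np.diag(cov)))  # correlations ignored
```

Because the off-diagonal terms are positive, ignoring them understates the spread of the simulated stock change, which is why the paper stresses correlation handling over precise standard errors.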
The Effects of Uncertainty in Speed-Flow Curve Parameters on a Large-Scale Model
DEFF Research Database (Denmark)
Manzo, Stefano; Nielsen, Otto Anker; Prato, Carlo Giacomo
2014-01-01
Uncertainty is inherent in transport models and prevents the use of a deterministic approach when traffic is modeled. Quantifying uncertainty thus becomes an indispensable step to produce a more informative and reliable output of transport models. In traffic assignment models, volume-delay functi...... uncertainty. This aspect is evident particularly for stretches of the network with a high number of competing routes. Model sensitivity was also tested for BPR parameter uncertainty combined with link capacity uncertainty. The resultant increase in model sensitivity demonstrates even further the importance...
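In traffic assignment, the BPR volume-delay function t = t0 * (1 + alpha * (v/c)^beta) is the usual carrier of the parameter uncertainty studied here. A sketch of propagating uncertainty in alpha and beta through a single congested link (all values illustrative):

```python
import random

def bpr_time(flow, capacity, t0, alpha=0.15, beta=4.0):
    """Standard BPR volume-delay function: link travel time vs. congestion."""
    return t0 * (1.0 + alpha * (flow / capacity) ** beta)

random.seed(7)
t0, capacity, flow = 10.0, 1800.0, 1600.0  # minutes, veh/h, veh/h (toy link)

# sample alpha and beta around their classic values to mimic parameter
# uncertainty, and look at the induced spread in travel time
times = [bpr_time(flow, capacity, t0,
                  alpha=random.gauss(0.15, 0.03),
                  beta=random.gauss(4.0, 0.5))
         for _ in range(5000)]
mean_t = sum(times) / len(times)
spread = max(times) - min(times)
```

On a near-capacity link the power-law form amplifies the beta uncertainty, which is the mechanism behind the heightened sensitivity on congested stretches with many competing routes.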
Mendes, B. S.; Draper, D.
2008-12-01
The issue of model uncertainty and model choice is central in any groundwater modeling effort [Neuman and Wierenga, 2003]; among the several approaches to the problem we favour using Bayesian statistics because it is a method that integrates in a natural way uncertainties (arising from any source) and experimental data. In this work, we experiment with several Bayesian approaches to model choice, focusing primarily on demonstrating the usefulness of the Reversible Jump Markov Chain Monte Carlo (RJMCMC) simulation method [Green, 1995], an extension of the now-common MCMC methods. Standard MCMC techniques approximate posterior distributions for quantities of interest, often by creating a random walk in parameter space; RJMCMC allows the random walk to take place between parameter spaces with different dimensionalities. This allows us to explore state spaces that are associated with different deterministic models for experimental data. Our work is exploratory in nature; we restrict our study to comparing two simple transport models applied to a data set gathered to estimate the breakthrough curve for a tracer compound in groundwater. One model has a mean surface based on a simple advection-dispersion differential equation; the second model's mean surface is also governed by a differential equation, but in two dimensions. We focus on artificial data sets (in which truth is known) to see if model identification is done correctly, but we also address the issues of over- and under-parameterization, and we compare RJMCMC's performance with other traditional methods for model selection and propagation of model uncertainty, including Bayesian model averaging, BIC and DIC. References: Neuman and Wierenga (2003). A Comprehensive Strategy of Hydrogeologic Modeling and Uncertainty Analysis for Nuclear Facilities and Sites. NUREG/CR-6805, Division of Systems Analysis and Regulatory Effectiveness, Office of Nuclear Regulatory Research, U.S. Nuclear Regulatory Commission.
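RJMCMC itself requires careful bookkeeping of the dimension-changing moves; as a compact stand-in for the simpler criteria the abstract compares against, the sketch below scores two competing mean-surface models with BIC on synthetic data where the truth is known (a quadratic). The models and data are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)
x = np.linspace(0.0, 1.0, 50)
y = 1.0 + 0.2 * x + 3.0 * x ** 2 + rng.normal(0.0, 0.05, x.size)  # truth: quadratic

def bic(degree):
    """BIC for a polynomial mean surface with Gaussian errors (MLE variance)."""
    coeffs = np.polyfit(x, y, degree)
    resid = y - np.polyval(coeffs, x)
    n, k = x.size, degree + 1
    sigma2 = (resid ** 2).mean()
    loglik = -0.5 * n * (np.log(2 * np.pi * sigma2) + 1.0)
    return -2.0 * loglik + k * np.log(n)   # lower is better

scores = {d: bic(d) for d in (1, 2)}       # linear vs. quadratic model
best = min(scores, key=scores.get)
```

BIC penalizes the extra parameter by log(n) per dimension, so on data with a genuinely quadratic truth the identification is unambiguous; RJMCMC reaches a similar verdict by visiting each model in proportion to its posterior probability.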
[Active ageing and success: A brief history of conceptual models].
Petretto, Donatella Rita; Pili, Roberto; Gaviano, Luca; Matos López, Cristina; Zuddas, Carlo
2016-01-01
The aim of this paper is to analyse and describe different conceptual models of successful, active and healthy ageing developed in Europe and in America in the 20th century, starting from Rowe and Kahn's original model (1987, 1997). A narrative review was conducted of the literature on successful ageing. Our review included definitions of successful ageing from European and American scholars. Models were found that aimed to describe indexes of active and healthy ageing, models devoted to describing the processes involved in successful ageing, and additional views that emphasise the subjective and objective perception of successful ageing. A description is also given of critiques of previous models and remedies according to Martin et al. (2014), and of strategies for successful ageing according to Jeste and Depp (2014). The need is discussed for the enhancement of Rowe and Kahn's model and other models with a more inclusive, universal description of ageing, incorporating scientific evidence regarding active ageing. Copyright © 2015 SEGG. Publicado por Elsevier España, S.L.U. All rights reserved.
Uncertainties in modeling hazardous gas releases for emergency response
Directory of Open Access Journals (Sweden)
Kathrin Baumann-Stanzer
2011-02-01
Full Text Available In case of an accidental release of toxic gases, emergency responders need fast information about the affected area and the maximum impact. Hazard distances calculated with the models MET, ALOHA, BREEZE, TRACE and SAMS for scenarios with chlorine, ammonia and butane releases are compared in this study. The variations in the model results are measures of the uncertainties in source estimation and dispersion calculation. Model runs for different wind speeds, atmospheric stabilities and roughness lengths indicate the model sensitivity to these input parameters. In-situ measurements at two urban near-traffic sites are compared to results of the Integrated Nowcasting through Comprehensive Analysis (INCA) in order to quantify uncertainties in the meteorological input. The hazard zone estimates from the models vary by up to a factor of 4 due to different input requirements as well as different internal model assumptions. None of the models is found to be 'more conservative' than the others in all scenarios. INCA wind speeds are correlated to in-situ observations at two urban sites in Vienna with a coefficient of 0.89. The standard deviations of the normal error distribution are 0.8 m s-1 in wind speed, on the order of 50 degrees in wind direction, up to 4°C in air temperature and up to 10 % in relative humidity. The observed air temperature and humidity are well reproduced by INCA, with correlation coefficients of 0.96 to 0.99. INCA is therefore found to give a good representation of the local meteorological conditions. Besides real-time data, the INCA short-range forecast for the following hours may support the action planning of the first responders.
A conceptual model to improve performance in virtual teams
Directory of Open Access Journals (Sweden)
Shopee Dube
2016-04-01
Full Text Available Background: The vast improvement in communication technologies and sophisticated project management tools, methods and techniques has allowed geographically and culturally diverse groups to operate and function in a virtual environment. To succeed in this virtual environment, where time and space are becoming increasingly irrelevant, organisations must define new ways of implementing initiatives. This virtual environment phenomenon has brought about the formation of virtual project teams that allow organisations to harness the skills and know-how of the best resources, irrespective of their location. Objectives: The aim of this article was to investigate performance criteria and develop a conceptual model which can be applied to enhance the success of virtual project teams. There are no clear guidelines on the performance criteria for managing virtual project teams. Method: A qualitative research methodology was used in this article. The purpose of the content analysis was to explore the literature to understand the concept of performance in virtual project teams and to summarise the findings of the literature reviewed. Results: The research identified a set of performance criteria for virtual project teams: leadership, trust, communication, team cooperation, reliability, motivation, comfort and social interaction. These were used to conceptualise the model. Conclusion: The conceptual model can be used in a holistic way to determine the overall performance of the virtual project team, but each factor can be analysed individually to determine its impact on the overall performance. The knowledge of performance criteria for virtual project teams could aid project managers in enhancing the success of these teams and in taking a different approach to better manage and coordinate them.
Modeling a Hybrid Microgrid Using Probabilistic Reconfiguration under System Uncertainties
Directory of Open Access Journals (Sweden)
Hadis Moradi
2017-09-01
Full Text Available A novel method for the day-ahead optimal operation of a hybrid microgrid system including fuel cells, photovoltaic arrays, a microturbine, and battery energy storage, in order to fulfill the required load demand, is presented in this paper. In the proposed system, the microgrid has access to the main utility grid in order to exchange power when required. Available municipal waste is utilized to produce the hydrogen required for running the fuel cells, and natural gas is used as the backup source. In the proposed method, an energy schedule is introduced to optimize the generating unit power outputs for the next day, as well as the power flow with the main grid, in order to minimize the operational costs and the produced greenhouse gas emissions. Renewable energy generation and electric power consumption are both intermittent and unpredictable, so the uncertainty related to the PV array power generation and the power consumption has been considered in the next-day energy scheduling. In order to model these uncertainties, scenarios are produced by Monte Carlo (MC) simulation, and the microgrid's optimal energy scheduling is analyzed under the generated scenarios. In addition, the various scenarios created by MC simulation are applied to solve the unit commitment (UC) problem. The microgrid's day-ahead operation and emission costs are taken as the objective functions, and the particle swarm optimization algorithm is employed to solve the optimization problem. Overall, the proposed model is capable of minimizing the system costs, as well as the unfavorable influence of uncertainties on the microgrid's profit, by generating different scenarios.
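A minimal sketch of the scenario-based idea: Monte Carlo scenarios perturb the PV and load forecasts, and candidate schedules are compared by expected cost over the scenarios. All numbers below (profiles, prices, noise levels, the flat microturbine setpoints) are assumed purely for illustration; the paper's actual model uses particle swarm optimization and a full unit-commitment formulation.

```python
import math, random

random.seed(42)
HOURS = 24

# Assumed day-ahead forecasts (kW): bell-shaped PV, evening-peaking load.
pv_forecast = [3.0 * math.sin(math.pi * (h - 6) / 12) if 6 <= h <= 18 else 0.0
               for h in range(HOURS)]
load_forecast = [2.0 + 1.5 * (12 <= h <= 20) for h in range(HOURS)]

def scenario():
    """One MC scenario: multiplicative Gaussian noise on PV and load."""
    pv = [max(0.0, p * random.gauss(1.0, 0.15)) for p in pv_forecast]
    ld = [max(0.0, l * random.gauss(1.0, 0.05)) for l in load_forecast]
    return pv, ld

def cost(schedule, pv, ld, fuel=0.12, price_buy=0.20, price_sell=0.08):
    """Operating cost of a microturbine schedule under one scenario."""
    c = 0.0
    for h in range(HOURS):
        net = ld[h] - pv[h] - schedule[h]     # > 0: buy from grid, < 0: sell
        c += fuel * schedule[h]
        c += price_buy * net if net > 0 else price_sell * net
    return c

scenarios = [scenario() for _ in range(200)]
# Compare flat microturbine setpoints by expected cost over all scenarios.
candidates = [[g] * HOURS for g in (0.0, 0.5, 1.0, 1.5, 2.0)]
best = min(candidates,
           key=lambda s: sum(cost(s, pv, ld) for pv, ld in scenarios) / len(scenarios))
print("best flat setpoint (kW):", best[0])
```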
Modeling the uncertainty of estimating forest carbon stocks in China
Directory of Open Access Journals (Sweden)
T. X. Yue
2015-12-01
Full Text Available Earth surface systems are controlled by a combination of global and local factors, and cannot be understood without accounting for both components; the system dynamics cannot be recovered from the global or local controls alone. Ground forest inventory can accurately estimate forest carbon stocks at sample plots, but these plots are too sparse to support the spatial simulation of carbon stocks with the required accuracy. Satellite observation is an important source of global information for the simulation of carbon stocks. Satellite remote sensing can supply spatially continuous information on forest carbon stocks, which is impossible from ground-based investigations, but with considerable uncertainty. In this paper, we validated the Lund-Potsdam-Jena dynamic global vegetation model (LPJ), the kriging method for spatial interpolation of ground sample plots, a satellite-observation-based approach, an approach for fusing the ground sample plots with satellite observations, and an assimilation method for incorporating the ground sample plots into LPJ. The validation results indicated that both the data fusion and data assimilation approaches reduced the uncertainty of the estimated carbon stocks. The data fusion approach, which uses an existing method for high-accuracy surface modeling to fuse the ground sample plots with the satellite observations (HASM-SOA), had the lowest uncertainty. The estimates produced with HASM-SOA were 26.1 and 28.4 % more accurate than those of the satellite-based approach and the spatial interpolation of the sample plots, respectively. Using the preferred HASM-SOA method, forest carbon stocks of 7.08 Pg were estimated for China during the period from 2004 to 2008, an increase of 2.24 Pg from 1984 to 2008.
Uncertainty Analysis of Multi-Model Flood Forecasts
Directory of Open Access Journals (Sweden)
Erich J. Plate
2015-12-01
Full Text Available This paper demonstrates, by means of a systematic uncertainty analysis, that the use of outputs from more than one model can significantly improve conditional forecasts of discharges or water stages, provided the models are structurally different. Discharge forecasts from two models and the actual forecast discharge are assumed to form a three-dimensional joint probability density distribution (jpdf), calibrated on long time series of data. The jpdf is decomposed into conditional probability density distributions (cpdfs) by means of Bayes' formula, as suggested and explored by Krzysztofowicz in a series of papers. In this paper his approach is simplified to optimize conditional forecasts for any pair of forecast models. Its application is demonstrated by means of models developed in a study of flood forecasting for the station Stung Treng on the middle reach of the Mekong River in South-East Asia. Four different forecast models were used and pairwise combined: no model, a persistence model, a regression model, and a rainfall-runoff model. Working with cpdfs requires determining the dependency among variables, for which linear regressions are required, as was done by Krzysztofowicz. His Bayesian approach, based on transforming the observed probability distributions of discharges and forecasts into normal distributions, is also explored. Results obtained with his method for normal prior and likelihood distributions are identical to results from direct multiple regressions. Furthermore, it is shown that in the present case forecast accuracy is only marginally improved if the Weibull-distributed basic data are converted into normally distributed variables.
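The equivalence noted above, that for jointly normal variables the conditional forecast coincides with a direct multiple regression on the two model forecasts, can be sketched as follows. The synthetic discharge record and the two forecast error structures are assumptions made for illustration only.

```python
import random

random.seed(1)

# Synthetic record: observed discharge q and two structurally different forecasts.
n = 500
q  = [random.gauss(100, 20) for _ in range(n)]
f1 = [qi + random.gauss(0, 8) for qi in q]             # e.g. rainfall-runoff model
f2 = [0.9 * qi + 12 + random.gauss(0, 10) for qi in q]  # e.g. regression model

def ols2(y, x1, x2):
    """Least-squares fit y ≈ b0 + b1*x1 + b2*x2 via the 3x3 normal equations."""
    X = [[1.0, a, b] for a, b in zip(x1, x2)]
    A = [[sum(r[i] * r[j] for r in X) for j in range(3)] for i in range(3)]
    v = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(3)]
    for i in range(3):                       # forward elimination
        for j in range(i + 1, 3):
            f = A[j][i] / A[i][i]
            A[j] = [aj - f * ai for aj, ai in zip(A[j], A[i])]
            v[j] -= f * v[i]
    b = [0.0, 0.0, 0.0]
    for i in (2, 1, 0):                      # back substitution
        b[i] = (v[i] - sum(A[i][j] * b[j] for j in range(i + 1, 3))) / A[i][i]
    return b

b0, b1, b2 = ols2(q, f1, f2)
combined = [b0 + b1 * a + b2 * c for a, c in zip(f1, f2)]

def rmse(y, p):
    return (sum((a - b)**2 for a, b in zip(y, p)) / len(y))**0.5

print(round(rmse(q, f1), 2), round(rmse(q, combined), 2))
```

Because the two error structures are independent, the combined conditional forecast has a lower RMSE than either model alone, which is the core claim of the multi-model analysis.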
Two conceptual models of displacement transfer and examples
Institute of Scientific and Technical Information of China (English)
GUAN; Shuwei; WANG; Xin; YANG; Shufeng; HE; Dengfa; ZHAO; W
2005-01-01
This paper presents two conceptual models of displacement transfer, reverse symmetry model and infinitely equal division model, based on the fault-bend folding theory. If the fault shape is held constant in the trend, then the distribution of slip magnitude, geometry of imbricate structures and its axial surface map all display reverse symmetry on the process of displacement transfer, as called reverse symmetry model in this paper. However, if the ramp height of thrust fault decreases gradually along its strike, the displacement is postulated to be equally and infinitely divided to every thrust that is formed subsequently, this kinematic process is described using infinitely equal division model. In both models, displacement transfer is characterized by the regular changes of imbricate thrusting in the trend. Geometric analysis indicates that the displacement transfer grads can be estimated using the tangent of deflective angle of hinterland structures. Displacement transfer is often responsible for the distortion and branching of the surface anticlines, especially in the region where the multi-level detachment structures is developed. We also present some examples from the frontal structures of the Southern Tianshan fold-and-thrust belt, Xinjiang, China. Displacement transfer between deep imbricate thrusts in the middle segment of Qiulitag anticline zone causes the Kuqatawu and Southern Qiulitag deep anticlines left-lateral echelon. The region, where these two deep anticlines overlap, is characterized by duplex structures, and extends about 18 km. The shallow anticline is migrated southward displaying obvious "S" form in this area.
Solvable Models on Noncommutative Spaces with Minimal Length Uncertainty Relations
Dey, Sanjib
2014-01-01
Our main focus is to explore different models in noncommutative spaces in higher dimensions. We provide a procedure to relate a three-dimensional q-deformed oscillator algebra to the corresponding algebra satisfied by canonical variables describing noncommutative spaces. The representations for the corresponding operators obey algebras whose uncertainty relations lead to minimal lengths, areas and volumes in phase space, which are in principle natural candidates in many different approaches to quantum gravity. We study some explicit models on these types of noncommutative spaces, first by utilising perturbation theory and later in an exact manner. In many cases the operators are not Hermitian; therefore we use PT-symmetry and the pseudo-Hermiticity property, wherever applicable, to make them self-consistent. Apart from building mathematical models, we focus on the physical implications of noncommutative theories too. We construct Klauder coherent states for the perturbative and nonperturbative noncommutative ha...
Incentive salience attribution under reward uncertainty: A Pavlovian model.
Anselme, Patrick
2015-02-01
There is a vast literature on the behavioural effects of partial reinforcement in Pavlovian conditioning. Compared with animals receiving continuous reinforcement, partially rewarded animals typically show (a) a slower development of the conditioned response (CR) early in training and (b) a higher asymptotic level of the CR later in training. This phenomenon is known as the partial reinforcement acquisition effect (PRAE). Learning models of Pavlovian conditioning fail to account for it. In accordance with the incentive salience hypothesis, it is here argued that incentive motivation (or 'wanting') plays a more direct role in controlling behaviour than does learning, and reward uncertainty is shown to have an excitatory effect on incentive motivation. The psychological origin of that effect is discussed and a computational model integrating this new interpretation is developed. Many features of CRs under partial reinforcement emerge from this model.
Climate stability and sensitivity in some simple conceptual models
Energy Technology Data Exchange (ETDEWEB)
Bates, J. Ray [University College Dublin, Meteorology and Climate Centre, School of Mathematical Sciences, Dublin (Ireland)
2012-02-15
A theoretical investigation of climate stability and sensitivity is carried out using three simple linearized models based on the top-of-the-atmosphere energy budget. The simplest is the zero-dimensional model (ZDM) commonly used as a conceptual basis for climate sensitivity and feedback studies. The others are two-zone models with tropics and extratropics of equal area; in the first of these (Model A), the dynamical heat transport (DHT) between the zones is implicit, while in the second (Model B) it is explicitly parameterized. It is found that the stability and sensitivity properties of the ZDM and Model A are very similar, both depending only on the global-mean radiative response coefficient and the global-mean forcing. The corresponding properties of Model B are more complex, depending asymmetrically on the separate tropical and extratropical values of these quantities, as well as on the DHT coefficient. Adopting Model B as a benchmark, conditions are found under which the validity of the ZDM and Model A as climate sensitivity models holds. It is shown that parameter ranges of physical interest exist for which such validity may not hold. The 2 × CO2 sensitivities of the simple models are studied and compared. Possible implications of the results for sensitivities derived from GCMs and palaeoclimate data are suggested. Sensitivities for more general scenarios that include negative forcing in the tropics (due to aerosols, inadvertent or geoengineered) are also studied. Some unexpected outcomes are found in this case. These include the possibility of a negative global-mean temperature response to a positive global-mean forcing, and vice versa.
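The "unexpected outcome" described above (a negative global-mean temperature response to a positive global-mean forcing) can be reproduced with a minimal two-zone energy-balance sketch. The parameter values here, a weak tropical response coefficient, a strong extratropical one, and an aerosol-like negative tropical forcing, are illustrative assumptions and not the values used in the paper.

```python
def two_zone_equilibrium(F1, F2, lam1, lam2, D):
    """Equilibrium of the linear two-zone budget:
       0 = F1 - lam1*T1 - D*(T1 - T2)   (tropics)
       0 = F2 - lam2*T2 + D*(T1 - T2)   (extratropics)"""
    det = (lam1 + D) * (lam2 + D) - D * D
    T1 = (F1 * (lam2 + D) + D * F2) / det
    T2 = (F2 * (lam1 + D) + D * F1) / det
    return T1, T2

# Assumed values: weak tropical response lam1, strong extratropical response
# lam2 (W m-2 K-1), DHT coupling D, negative tropical forcing (W m-2).
T1, T2 = two_zone_equilibrium(F1=-1.0, F2=1.2, lam1=0.3, lam2=2.0, D=0.5)
F_mean, T_mean = (-1.0 + 1.2) / 2, (T1 + T2) / 2
print(F_mean > 0, T_mean < 0)  # → True True
```

Because the two zones respond asymmetrically, the sign of the global-mean response is not tied to the sign of the global-mean forcing, unlike in the ZDM, where ΔT = F/λ always shares the sign of F.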
How well can we forecast future model error and uncertainty by mining past model performance data
Solomatine, Dimitri
2016-04-01
Consider a hydrological model Y(t) = M(X(t), P), where X = vector of inputs, P = vector of parameters, Y = model output (typically flow), and t = time. When there is enough past data on the performance of model M, it is possible to use these data to build a (data-driven) model EC of the error of M. Model EC can forecast the error E when a new input X is fed into model M; subtracting E from the model prediction Y then yields a better estimate of Y. Model EC is usually called an error corrector (in meteorology, a bias corrector). However, we may go further in characterizing model deficiencies: instead of using the error (a real value) we may consider a more sophisticated, probabilistic characterization. That is, instead of a model EC of the error of M, it is also possible to build a model U of the uncertainty of M; if uncertainty is described as the model error distribution D, this model will calculate its properties: mean, variance, other moments, and quantiles. The general form of this model could be D = U(RV), where RV = vector of relevant variables having influence on model uncertainty (to be identified, e.g., by mutual information analysis), and D = vector of variables characterizing the error distribution (typically, two or more quantiles). One aspect is not always explicitly mentioned in uncertainty analysis work. In our view it is important to distinguish the following main types of model uncertainty: 1. The residual uncertainty of models. In this case the model parameters and/or model inputs are considered to be fixed (deterministic), i.e. the model is considered to be optimal (calibrated) and deterministic. Model error is considered as the manifestation of uncertainty. If there is enough past data about the model errors (i.e. its uncertainty), it is possible to build a statistical or machine learning model of uncertainty trained on these data. Here the following methods can be mentioned: (a) quantile regression (QR
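A minimal sketch of such a "model U" of uncertainty: past model errors are conditioned on a relevant variable RV (here, the forecast magnitude, an assumed choice) and summarized by empirical 5 %/95 % error quantiles per bin. The heteroscedastic synthetic record is invented for illustration; it stands in for a real archive of past model performance.

```python
import random, statistics

random.seed(7)

# Past record of forecasts and observations with heteroscedastic model error:
# the error standard deviation grows with the forecasted flow.
yhat = [random.uniform(1, 100) for _ in range(2000)]
yobs = [f + random.gauss(0, 0.1 * f) for f in yhat]
pairs = [(f, o - f) for f, o in zip(yhat, yobs)]

def build_U(pairs, nbins=5, lo=1.0, hi=100.0):
    """Empirical 5%/95% error quantiles conditional on forecast magnitude."""
    width = (hi - lo) / nbins
    bins = [[] for _ in range(nbins)]
    for f, e in pairs:
        bins[min(nbins - 1, int((f - lo) // width))].append(e)
    out = []
    for b in bins:
        cuts = statistics.quantiles(b, n=20)  # 19 cut points: 5%, 10%, ..., 95%
        out.append((cuts[0], cuts[-1]))
    return out

U = build_U(pairs)
print([round(hi_ - lo_, 1) for lo_, hi_ in U])  # uncertainty band widens with flow
```

Quantile regression, mentioned at the end of the abstract, replaces this crude binning with a smooth regression of each quantile on RV.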
Gray, Kathleen; Sockolow, Paulina
2016-01-01
Background Contributing to health informatics research means using conceptual models that are integrative and explain the research in terms of the two broad domains of health science and information science. However, it can be hard for novice health informatics researchers to find exemplars and guidelines in working with integrative conceptual models. Objectives The aim of this paper is to support the use of integrative conceptual models in research on information and communication technologi...
Recruiting Transcultural Qualitative Research Participants: A Conceptual Model
Directory of Open Access Journals (Sweden)
Phyllis Eide
2005-06-01
Full Text Available Working with diverse populations poses many challenges to the qualitative researcher who is a member of the dominant culture. Traditional methods of recruitment and selection (such as flyers and advertisements are often unproductive, leading to missed contributions from potential participants who were not recruited and researcher frustration. In this article, the authors explore recruitment issues related to the concept of personal knowing based on experiences with Aboriginal Hawai'ian and Micronesian populations, wherein knowing and being known are crucial to successful recruitment of participants. They present a conceptual model that incorporates key concepts of knowing the other, cultural context, and trust to guide other qualitative transcultural researchers. They also describe challenges, implications, and concrete suggestions for recruitment of participants.
Ecological risk assessment conceptual model formulation for nonindigenous species.
Landis, Wayne G
2004-08-01
This article addresses the application of ecological risk assessment at the regional scale to the prediction of impacts due to invasive or nonindigenous species (NIS). The first section describes risk assessment and the decision-making process, and introduces regional risk assessment. A general conceptual model for the risk assessment of NIS is then presented, based upon the regional risk assessment approach. Two diverse examples of the application of this approach are presented. The first example is based upon the dynamics of plasmids introduced into bacteria populations. The second example is the application of the risk assessment approach to the invasion of a coastal marine site at Cherry Point, Washington, USA, by the European green crab. The lessons learned from the two examples demonstrate that assessment of the risks of invasion by NIS will have to incorporate not only the characteristics of the invasive species, but also the other stresses and impacts affecting the region of interest.
Conceptual model and map of financial exploitation of older adults.
Conrad, Kendon J; Iris, Madelyn; Ridings, John W; Fairman, Kimberly P; Rosen, Abby; Wilber, Kathleen H
2011-10-01
This article describes the processes and outcomes of three-dimensional concept mapping to conceptualize financial exploitation of older adults. Statements were generated from a literature review and by local and national panels consisting of 16 experts in the field of financial exploitation. These statements were sorted and rated using Concept Systems software, which grouped the statements into clusters and depicted them as a map. Statements were grouped into six clusters, and ranked by the experts as follows in descending severity: (a) theft and scams, (b) financial victimization, (c) financial entitlement, (d) coercion, (e) signs of possible financial exploitation, and (f) money management difficulties. The hierarchical model can be used to identify elder financial exploitation and differentiate it from related but distinct areas of victimization. The severity hierarchy may be used to develop measures that will enable more precise screening for triage of clients into appropriate interventions.
CONCEPTUAL MODEL OF CONSUMERS TRUST TO ONLINE SHOPS
Directory of Open Access Journals (Sweden)
T. Dubovyk
2014-06-01
Full Text Available In the article, a conceptual model is presented of the major factors that influence consumer trust in an online shop: the reliability of the online store, a reliable information system for making purchases online, factors of ethical interactivity (security, third-party certification), internet-marketing communications of the online shop, and other factors, which are divided between trade enterprises and consumers (demographic variables, psychological perception of internet-marketing communications, experience of purchasing goods on the Internet). The degree of an individual customer's trust propensity reflects personality traits, culture and previous experience. Signs of consumer trust are implemented through elements of the online shop's site: graphic design, structural design, content design, and design harmonised with the perception of the target audience.
DEFF Research Database (Denmark)
Zhang, Donghua; Madsen, Henrik; Ridler, Marc E.
2015-01-01
uncertainty. In most hydrological EnKF applications, an ad hoc model uncertainty is defined with the aim of avoiding a collapse of the filter. The present work provides a systematic assessment of model uncertainty in DA applications based on combinations of forcing, model parameters, and state uncertainties....... This is tested in a case where groundwater hydraulic heads are assimilated into a distributed and integrated catchment-scale model of the Karup catchment in Denmark. A series of synthetic data assimilation experiments are carried out to analyse the impact of different model uncertainty assumptions...
Yu, Xiaolin; Zhang, Shaoqing; Lin, Xiaopei; Li, Mingkui
2017-03-01
The uncertainties in the values of coupled model parameters are an important source of model bias that causes model climate drift. The values can be calibrated by a parameter estimation procedure that projects observational information onto model parameters. The signal-to-noise ratio of the error covariance between the model state and the parameter being estimated directly determines whether the parameter estimation succeeds or not. With a conceptual climate model that couples a stochastic atmosphere and a slowly varying ocean, this study examines the sensitivity of the state-parameter covariance to the accuracy of the estimated model states in the different components of the coupled system. Due to the interaction of multiple timescales, the fast-varying atmosphere, with its chaotic nature, is the major source of inaccuracy in the estimated state-parameter covariance. Thus, enhancing the estimation accuracy of atmospheric states is very important for the success of coupled model parameter estimation, especially for the parameters in air-sea interaction processes. The impact of the chaotic-to-periodic ratio in state variability on parameter estimation is also discussed. This simple model study provides a guideline for when real observations are used to optimize model parameters in a coupled general circulation model for improving climate analysis and predictions.
Etemadi, H.; Samadi, S.; Sharifikia, M.
2014-06-01
The regression-based statistical downscaling model (SDSM) is widely used to resolve the coarse spatial resolution of general circulation models (GCMs). Nevertheless, an assessment of the uncertainty propagation linked with climatic variables is essential to any climate change impact study. This study presents a procedure to characterize the uncertainty of two GCMs linked with the Long Ashton Research Station Weather Generator (LARS-WG) and SDSM in one of the most vulnerable international wetlands, namely "Shadegan", in an arid region of Southwest Iran. For daily temperature, uncertainty is estimated by comparing the monthly mean and variance of downscaled and observed daily data at a 95 % confidence level. Uncertainties were then evaluated by comparing monthly mean dry and wet spell lengths and their 95 % CI in daily precipitation downscaling, using the 1987-2005 interval. The results indicated that LARS-WG is the most proficient model at reproducing various statistical characteristics of the observed data within 95 % uncertainty bounds, while the SDSM model is the least capable in this respect. The uncertainty analysis at three different climate stations produced significantly different climate change responses at the 95 % CI. Finally, the range of plausible climate change projections suggested a need for decision makers to augment their long-term wetland management plans to reduce the wetland's vulnerability to climate change impacts.
Uncertainty Estimation in SiGe HBT Small-Signal Modeling
DEFF Research Database (Denmark)
Masood, Syed M.; Johansen, Tom Keinicke; Vidkjær, Jens;
2005-01-01
An uncertainty estimation and sensitivity analysis is performed on multi-step de-embedding for SiGe HBT small-signal modeling. The uncertainty estimation in combination with uncertainty model for deviation in measured S-parameters, quantifies the possible error value in de-embedded two...
Uncertainty analysis in WWTP model applications: a critical discussion using an example from design
DEFF Research Database (Denmark)
Sin, Gürkan; Gernaey, Krist; Neumann, Marc B.
2009-01-01
This study focuses on uncertainty analysis of WWTP models and analyzes the issue of framing and how it affects the interpretation of uncertainty analysis results. As a case study, the prediction of uncertainty involved in model-based design of a wastewater treatment plant is studied. The Monte Ca...
An Uncertainty Structure Matrix for Models and Simulations
Green, Lawrence L.; Blattnig, Steve R.; Hemsch, Michael J.; Luckring, James M.; Tripathi, Ram K.
2008-01-01
Software that is used for aerospace flight control and to display information to pilots and crew is expected to be correct and credible at all times. This type of software is typically developed under strict management processes, which are intended to reduce defects in the software product. However, modeling and simulation (M&S) software may exhibit varying degrees of correctness and credibility, depending on a large and complex set of factors. These factors include its intended use, the known physics and numerical approximations within the M&S, and the referent data set against which the M&S correctness is compared. The correctness and credibility of an M&S effort is closely correlated to the uncertainty management (UM) practices that are applied to the M&S effort. This paper describes an uncertainty structure matrix for M&S, which provides a set of objective descriptions for the possible states of UM practices within a given M&S effort. The columns in the uncertainty structure matrix contain UM elements or practices that are common across most M&S efforts, and the rows describe the potential levels of achievement in each of the elements. A practitioner can quickly look at the matrix to determine where an M&S effort falls based on a common set of UM practices that are described in absolute terms that can be applied to virtually any M&S effort. The matrix can also be used to plan those steps and resources that would be needed to improve the UM practices for a given M&S effort.
Assessment of private hospital portals: A conceptual model
Directory of Open Access Journals (Sweden)
Mehdi Alipour-Hafezi
2016-01-01
Full Text Available Introduction: Hospital portals, as the first virtual point of entry, play an important role in connecting people with a hospital and in presenting the hospital's virtual services. The main purpose of this article was to suggest a conceptual model to improve the portals of private hospitals in Tehran. The suggested model can be used by all health portals in similar circumstances and by health portals that are in progress. Method: This is a practical study using an evaluative survey method. The research population includes all private hospital portals in Tehran (34 portals) and ten top international hospital portals. The data-gathering tool was a researcher-made checklist including 14 criteria and 77 sub-criteria with their weighted scores; objective observation with this checklist was used to gather information. Descriptive statistics were used to analyze the data, and tables and graphs were used to present the organized data. Data were also analyzed using an independent t-test. A conceptual modeling technique was used to design the model, and a demonstration method was used to evaluate the proposed model. SPSS statistical software was used to perform the tests. Results: The comparative study between the two groups of portals, TPH and WTH, on the 14 main criteria yielded the following t-test values: contact information 0.862, portal page specification -1.378, page design -1.527, updating pages -0.322, general information and access roads -3.161, public services -7.302, patient services -4.154, patient data -8.703, research and education -9.155, public relationship -3.009, page technical specifications -4.726, telemedicine -7.488, pharmaceutical services -6.183, and financial services -2.782. Finally, the findings demonstrated that Tehran private hospital portals were favorable in the contact information criterion; page design criteria were relatively favorable; page technical
Modelling in Primary School: Constructing Conceptual Models and Making Sense of Fractions
Shahbari, Juhaina Awawdeh; Peled, Irit
2017-01-01
This article describes sixth-grade students' engagement in two model-eliciting activities offering students the opportunity to construct mathematical models. The findings show that students utilized their knowledge of fractions including conceptual and procedural knowledge in constructing mathematical models for the given situations. Some students…
Equilibrium Assignment Model with Uncertainties in Traffic Demands
Directory of Open Access Journals (Sweden)
Aiwu Kuang
2013-01-01
Full Text Available In this study, we present an equilibrium traffic assignment model considering uncertainties in traffic demands. The link and route travel time distributions are derived based on the assumption that OD traffic demand follows a log-normal distribution. We postulate that travelers can learn the variability of route travel times from past experience and factor such variability into their route choice considerations in the form of the mean route travel time. Furthermore, all travelers want to minimize their mean route travel times. We formulate the assignment problem as a variational inequality, which can be solved by a route-based heuristic solution algorithm. Numerical studies on a small test road network are carried out to validate the proposed model and algorithm, and some reasonable results are obtained.
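The role of demand uncertainty can be sketched by propagating a log-normal OD demand through a standard BPR link performance function: because the function is convex in flow, the mean travel time exceeds the travel time at the mean demand, which is why variability enters route choice at all. All parameter values below (free-flow time, capacity, demand mean and coefficient of variation) are illustrative assumptions, not values from the paper.

```python
import math, random, statistics

random.seed(3)

def bpr(flow, t0=10.0, cap=1000.0, alpha=0.15, beta=4):
    """Bureau of Public Roads link performance function."""
    return t0 * (1 + alpha * (flow / cap)**beta)

# OD demand modeled as log-normal with a given mean and coefficient of variation.
mean_d, cv = 900.0, 0.3
sigma2 = math.log(1 + cv**2)
mu = math.log(mean_d) - sigma2 / 2

# Mean route (here: single-link) travel time under demand uncertainty,
# estimated by Monte Carlo sampling of the log-normal demand.
samples = [bpr(random.lognormvariate(mu, math.sqrt(sigma2))) for _ in range(20000)]
mean_tt = statistics.fmean(samples)
det_tt = bpr(mean_d)  # travel time at the mean demand
print(mean_tt > det_tt)  # → True: variability raises the mean travel time
```

In the paper's model, travelers equilibrate on these mean route travel times rather than on the deterministic values.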
A CONCEPTUAL MODEL FOR IMPROVED PROJECT SELECTION AND PRIORITISATION
Directory of Open Access Journals (Sweden)
P. J. Viljoen
2012-01-01
Full Text Available
ENGLISH ABSTRACT: Project portfolio management processes are often designed and operated as a series of stages (or project phases) and gates. However, the flow of such a process is often slow, characterised by queues waiting for a gate decision and by rework from previous stages waiting for additional information or for re-processing. In this paper the authors propose a conceptual model that applies supply chain and constraint management principles to the project portfolio management process. An advantage of the proposed model is that it allows projects to be selected and prioritised without undue changes to project schedules. This should result in faster flow through the system.
AFRIKAANSE OPSOMMING (translated): Processes for managing portfolios of projects are normally designed and operated as a series of phases and gates. The flow through such a process is often slow and is characterised by queues waiting for decisions at the gates and by rework from previous phases waiting for further information or for re-processing. This article proposes a conceptual model based on the principles of supply chains as well as of constraint management, and offers the advantage that projects can be selected and prioritised without unnecessary changes to project schedules. This should lead to accelerated flow through the system.
BIM-enabled Conceptual Modelling and Representation of Building Circulation
Directory of Open Access Journals (Sweden)
Jin Kook Lee
2014-08-01
Full Text Available This paper describes how a building information modelling (BIM)-based approach to building circulation changes the process of building design in terms of its computational representation and processes, focusing on the conceptual modelling and representation of circulation within buildings. BIM is supported by several BIM authoring tools, in particular through the widely known, interoperable Industry Foundation Classes (IFC), which follow an object-oriented data modelling methodology. Advances in BIM authoring tools, using space objects and their relations as defined in the IFC schema, have made it possible to model, visualize and analyse circulation within buildings prior to their construction. Agent-based circulation has long been an interdisciplinary research topic across several areas, including design computing, computer science, architectural morphology, human behaviour and environmental psychology. Such conventional approaches to building circulation are centred on navigational knowledge about built environments, and represent specific circulation paths and regulations. This paper, however, places emphasis on the use of 'space objects' in BIM-enabled design processes rather than on circulation agents, which are not defined in the IFC schema. By introducing and reviewing associated research and projects, this paper also surveys how such a circulation representation can be applied to the analysis of building circulation-related rules.
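The idea of analysing circulation over space objects rather than agents can be sketched as a graph search over space adjacencies. The space names and adjacency structure below are hypothetical, standing in for relations one might extract from IFC space and space-boundary entities.

```python
from collections import deque

# Hypothetical space-object adjacency (door-linked spaces), standing in
# for relations extracted from an IFC model's IfcSpace entities.
spaces = {
    "Lobby": ["Corridor"],
    "Corridor": ["Lobby", "Office", "Stair"],
    "Office": ["Corridor"],
    "Stair": ["Corridor", "Exit"],
    "Exit": ["Stair"],
}

def circulation_path(graph, start, goal):
    """Shortest circulation route between two spaces (BFS over adjacency)."""
    queue, seen = deque([[start]]), {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in graph[path[-1]]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None  # no circulation route exists

print(circulation_path(spaces, "Office", "Exit"))
# -> ['Office', 'Corridor', 'Stair', 'Exit']
```

Circulation rules (e.g. a maximum egress path length) can then be checked by inspecting the returned path, without simulating any circulation agents.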
Vulnerability Assessment Models to Drought: Toward a Conceptual Framework
Directory of Open Access Journals (Sweden)
Kiumars Zarafshani
2016-06-01
Full Text Available Drought is regarded as a slow-onset natural disaster that causes inevitable damage to water resources and to farm life. Currently, crisis management is the basis of drought mitigation plans; however, studies thus far indicate that effective drought management strategies are based on risk management. As a primary tool for mitigating the impact of drought, vulnerability assessment can be used as a benchmark in drought mitigation plans and to enhance farmers' ability to cope with drought. Moreover, the drought literature has focused extensively on impacts, giving only limited attention to vulnerability assessment as a tool. Therefore, the main purpose of this paper is to develop a conceptual framework for designing a vulnerability model to assess farmers' level of vulnerability before, during and after the onset of drought. Use of the developed drought vulnerability model would aid disaster relief workers by enhancing the adaptive capacity of farmers facing the impacts of drought. The paper starts with the definition of vulnerability and outlines the different vulnerability frameworks developed thus far. It then identifies various approaches to vulnerability assessment and finally offers the most appropriate model. The paper concludes that the introduced model can guide drought mitigation programs in the countries most affected by drought.
Business models for material efficiency services. Conceptualization and application
Energy Technology Data Exchange (ETDEWEB)
Halme, Minna; Anttonen, Markku; Kuisma, Mika; Kontoniemi, Nea [Helsinki School of Economics, Department of Marketing and Management, P.O. Box 1210, 00101 Helsinki (Finland); Heino, Erja [University of Helsinki, Department of Biological and Environmental Sciences (Finland)
2007-06-15
Despite the abundant research on material flows and the growing recognition of the need to dematerialize the economy, business enterprises are still not making the best possible use of the many opportunities for material efficiency improvements. This article proposes one possible solution: material efficiency services provided by outside suppliers. It also introduces a conceptual framework for the analysis of different business models for eco-efficient services and applies the framework to material efficiency services. Four business models are outlined and their feasibility is studied from an empirical vantage point. In contrast to much of the previous research, special emphasis is laid on the financial aspects. It appears that the most promising business models are 'material efficiency as additional service' and 'material flow management service'. Depending on the business model, prominent material efficiency service providers range from large companies that offer multiple products and/or services to smaller, specialized providers. Potential clients (users) typically lack the resources (expertise, management's time or initial funds) to conduct material efficiency improvements themselves. Customers are more likely to use material efficiency services that relate to support materials or side-streams rather than those at the core of production. Potential client organizations with a strategy of outsourcing support activities, and with experience of outsourcing, are more inclined to use material efficiency services. (author)
An empirical conceptual gully evolution model for channelled sea cliffs
Leyland, Julian; Darby, Stephen E.
2008-12-01
Incised coastal channels are a specific form of incised channel that are found in locations where stream channels flowing to cliffed coasts have the excess energy required to cut down through the cliff to reach the outlet water body. The southern coast of the Isle of Wight, southern England, comprises soft cliffs that vary in height between 15 and 100 m and which are retreating at rates ≤ 1.5 m a⁻¹, due to a combination of wave erosion and landslides. In several locations, river channels have cut through the cliffs to create deeply (≤ 45 m) incised gullies, known locally as 'Chines'. The Chines are unusual in that their formation is associated with dynamic shoreline encroachment during a period of rising sea-level, whereas existing models of incised channel evolution emphasise the significance of base level lowering. This paper develops a conceptual model of Chine evolution by applying space-for-time substitution methods using empirical data gathered from Chine channel surveys and remotely sensed data. The model identifies a sequence of evolutionary stages, which are classified based on a suite of morphometric indices and associated processes. The extent to which individual Chines are in a state of growth or decay is estimated by determining the relative rates of shoreline retreat and knickpoint recession, the former via analysis of historical aerial images and the latter through the use of a stream power erosion model.
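The growth-or-decay criterion in the abstract compares two rates, which can be sketched with a generic stream-power law. The functional form E = K·Aᵐ·Sⁿ and every constant below are illustrative assumptions, not values from the study.

```python
# Hypothetical growth/decay check for a Chine: compare knickpoint
# recession (generic stream-power law, assumed form and constants)
# with shoreline retreat measured from historical aerial imagery.

def knickpoint_recession(K, area, slope, m=0.5, n=1.0):
    """Stream-power recession rate E = K * A**m * S**n (units assumed m/yr)."""
    return K * area ** m * slope ** n

shoreline_retreat = 1.2  # m/yr, hypothetical value from aerial-image analysis
recession = knickpoint_recession(K=2e-4, area=4.0e6, slope=0.05)

# The Chine grows only if the knickpoint retreats inland faster than
# the shoreline consumes the channel from the seaward end.
state = "growing" if recession > shoreline_retreat else "decaying"
print(round(recession, 3), state)
```

With these (assumed) numbers the shoreline outpaces the knickpoint, so the Chine would be classified as decaying; the study's actual classification also draws on morphometric indices.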
Advances in the study of uncertainty quantification of large-scale hydrological modeling system
Institute of Scientific and Technical Information of China (English)
SONG Xiaomeng; ZHAN Chesheng; KONG Fanzhe; XIA Jun
2011-01-01
The regional hydrological system is extremely complex because it is affected not only by physical factors but also by human dimensions, and hydrological models play a very important role in simulating this complex system. However, effective methods for analyzing model reliability and uncertainty have been lacking because of this complexity and difficulty. The uncertainties in hydrological modeling come from four important sources: uncertainties in input data and parameters, uncertainties in model structure, uncertainties in the analysis method, and uncertainties in the initial and boundary conditions. This paper systematically reviews recent advances in uncertainty analysis approaches for large-scale, complex hydrological models, organized by uncertainty source, and points out the shortcomings and insufficiencies of current uncertainty analysis for complex hydrological models. It then introduces a new uncertainty quantification platform, PSUADE, and its uncertainty quantification methods, which will be a powerful tool and platform for uncertainty analysis of large-scale complex hydrological models. Finally, some future perspectives on uncertainty quantification are put forward.
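The first uncertainty source named in the review, parameter uncertainty, can be illustrated by Monte Carlo sampling of a toy rainfall-runoff model. The linear-reservoir model, the rainfall series, and the parameter range are all hypothetical; PSUADE itself provides far richer sampling and analysis methods.

```python
import random
import statistics as st

random.seed(0)

# Toy linear-reservoir runoff model: outflow q = k * storage each step,
# with an uncertain recession coefficient k (all numbers hypothetical).
def simulate(rain, k, storage=0.0):
    flows = []
    for p in rain:
        storage += p          # add rainfall to storage
        q = k * storage       # release a fraction k as runoff
        storage -= q
        flows.append(q)
    return flows

rain = [5.0, 0.0, 10.0, 2.0, 0.0]

# Parameter uncertainty: sample k uniformly and examine the spread
# it induces in the simulated peak flow.
peaks = [max(simulate(rain, random.uniform(0.2, 0.6))) for _ in range(5000)]
print(round(st.mean(peaks), 2), round(st.stdev(peaks), 2))
```

The nonzero spread in peak flow comes entirely from the uncertain parameter k; structural, input, and boundary-condition uncertainties would each add further spread on top of this.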