Combined Estimation of Hydrogeologic Conceptual Model and Parameter Uncertainty
Energy Technology Data Exchange (ETDEWEB)
Meyer, Philip D.; Ye, Ming; Neuman, Shlomo P.; Cantrell, Kirk J.
2004-03-01
The objective of the research described in this report is the development and application of a methodology for comprehensively assessing the hydrogeologic uncertainties involved in dose assessment, including uncertainties associated with conceptual models, parameters, and scenarios. This report describes and applies a statistical method to quantitatively estimate the combined uncertainty in model predictions arising from conceptual model and parameter uncertainties. The method relies on model averaging to combine the predictions of a set of alternative models. Implementation is driven by the available data. When site-specific data are minimal, the method can be carried out with prior parameter estimates based on generic data and subjective prior model probabilities. For sites with observations of system behavior (and optionally data characterizing model parameters), the method uses model calibration to update the prior parameter estimates and model probabilities based on the correspondence between model predictions and site observations. The set of model alternatives can contain both simplified and complex models, with the requirement that all models be based on the same set of data. The method was applied to the geostatistical modeling of air permeability at a fractured rock site. Seven alternative variogram models of log air permeability were considered to represent data from single-hole pneumatic injection tests in six boreholes at the site. Unbiased maximum likelihood estimates of variogram and drift parameters were obtained for each model. Standard information criteria provided an ambiguous ranking of the models, which would not justify selecting one of them and discarding all others as is commonly done in practice. Instead, some of the models were eliminated based on their negligibly small updated probabilities and the rest were used to project the measured log permeabilities by kriging onto a rock volume containing the six boreholes. These four
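The model-averaging step described in this abstract can be sketched in a few lines. The information-criterion values, equal priors, and per-model predictions below are invented for illustration; the exp(-ΔIC/2) weighting is the standard approximation used in Maximum Likelihood Bayesian Model Averaging.

```python
import numpy as np

def posterior_model_probs(ic, prior):
    """Posterior model probabilities from information-criterion values
    (e.g., KIC or BIC) via the standard exp(-0.5 * delta_IC) weighting."""
    ic = np.asarray(ic, float)
    prior = np.asarray(prior, float)
    delta = ic - ic.min()                 # shift for numerical stability
    w = prior * np.exp(-0.5 * delta)
    return w / w.sum()

def model_average(preds, variances, probs):
    """Mean and total variance of the model-averaged prediction:
    total variance = within-model variance + between-model spread."""
    preds = np.asarray(preds, float)
    variances = np.asarray(variances, float)
    mean = np.sum(probs * preds)
    total_var = np.sum(probs * (variances + (preds - mean) ** 2))
    return mean, total_var

# Three hypothetical variogram models: IC values, equal priors, and each
# model's kriging prediction with its estimation variance.
probs = posterior_model_probs(ic=[210.0, 212.0, 230.0], prior=[1 / 3] * 3)
mean, var = model_average(preds=[1.0, 1.2, 2.0],
                          variances=[0.04, 0.05, 0.04], probs=probs)
```

Note how the third model, 20 IC units worse, receives a negligible weight and is effectively "eliminated", mirroring the screening step described above.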
DEFF Research Database (Denmark)
Troldborg, Mads; Thomsen, Nanna Isbak; McKnight, Ursula S.
A key component in risk assessment of contaminated sites is the formulation of a conceptual site model. The conceptual model is a simplified representation of reality and forms the basis for the mathematical modelling of contaminant fate and transport at the site. A conceptual model should...... therefore identify the most important site-specific features and processes that may affect the contaminant transport behaviour at the site. The development of a conceptual model will always be associated with uncertainties due to lack of data and understanding of the site conditions, and often many...... different conceptual models may describe the same contaminated site equally well. In many cases, conceptual model uncertainty has been shown to be one of the dominant sources for uncertainty and is therefore essential to account for when quantifying uncertainties in risk assessments. We present here...
Addressing Conceptual Model Uncertainty in the Evaluation of Model Prediction Errors
Carrera, J.; Pool, M.
2014-12-01
Model predictions are uncertain because of errors in model parameters, future forcing terms, and model concepts. The latter remain the largest and most difficult-to-assess source of uncertainty in long-term model predictions. We first review existing methods to evaluate conceptual model uncertainty. We argue that they are highly sensitive to the ingenuity of the modeler, in the sense that they rely on the modeler's ability to propose alternative model concepts. Worse, we find that the standard practice of stochastic methods leads to poor, potentially biased and often too optimistic, estimation of actual model errors. This is bad news because stochastic methods are purported to properly represent uncertainty. We contend that the problem does not lie in the stochastic approach itself, but in the way it is applied. Specifically, stochastic inversion methodologies, which demand quantitative information, tend to ignore geological understanding, which is conceptually rich. We illustrate some of these problems with an application to the Mar del Plata aquifer, where extensive data are available for nearly a century. Geologically based models, where spatial variability is handled through zonation, yield calibration fits similar to geostatistically based models, but much better predictions. In fact, the appearance of the stochastic T fields is similar to the geologically based models only in areas with a high density of data. We take this finding to illustrate the ability of stochastic models to accommodate many data, but also, ironically, their inability to address conceptual model uncertainty. In fact, stochastic model realizations tend to be too close to the "most likely" one (i.e., they do not really realize the full conceptual uncertainty). The second part of the presentation is devoted to arguing that acknowledging model uncertainty may lead to qualitatively different decisions than just working with "most likely" model predictions. Therefore, efforts should concentrate on
Thomsen, N. I.; Troldborg, M.; McKnight, U. S.; Binning, P. J.; Bjerg, P. L.
2012-04-01
Mass discharge estimates are increasingly being used in the management of contaminated sites. Such estimates have proven useful for supporting decisions related to the prioritization of contaminated sites in a groundwater catchment. Potential management options can be categorised as follows: (1) leave as is, (2) clean up, or (3) further investigation needed. However, mass discharge estimates are often very uncertain, which may hamper the management decisions. If option 1 is incorrectly chosen, soil and water quality will decrease, threatening or destroying drinking water resources. The risk of choosing option 2 is to spend money on remediating a site that does not pose a problem. Choosing option 3 will often be safest, but may not be the optimal economic solution. Quantification of the uncertainty in mass discharge estimates can therefore greatly improve the foundation for selecting the appropriate management option. The uncertainty of mass discharge estimates depends greatly on the extent of the site characterization. A good approach for uncertainty estimation will be flexible with respect to the investigation level, and account for both parameter and conceptual model uncertainty. We propose a method for quantifying the uncertainty of dynamic mass discharge estimates from contaminant point sources on the local scale. The method considers both parameter and conceptual uncertainty through a multi-model approach. The multi-model approach evaluates multiple conceptual models for the same site. The different conceptual models consider different source characterizations and hydrogeological descriptions. The idea is to include a set of essentially different conceptual models where each model is believed to be a realistic representation of the given site, based on the current level of information. Parameter uncertainty is quantified using Monte Carlo simulations. For each conceptual model we calculate a transient mass discharge estimate with uncertainty bounds resulting from
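A minimal sketch of the multi-model Monte Carlo idea: each hypothetical conceptual model gets its own parameter distributions, per-model mass-discharge samples are pooled in proportion to the model probabilities, and percentiles of the pool give uncertainty bounds. All parameter values and the simple J = C·q·A discharge expression are illustrative assumptions, not the authors' actual site models.

```python
import numpy as np

rng = np.random.default_rng(42)

def mass_discharge_samples(m, n):
    """Monte Carlo samples of mass discharge J = C * q * A for one
    hypothetical conceptual model: source concentration C (g/m^3),
    Darcy flux q (m/yr), and source area A (m^2) are illustrative."""
    C = rng.lognormal(m["logC_mu"], m["logC_sd"], n)
    q = rng.lognormal(m["logq_mu"], m["logq_sd"], n)
    return C * q * m["area"]              # g/yr

# Two alternative conceptual models with assumed model probabilities p.
models = [
    dict(logC_mu=0.0, logC_sd=0.3, logq_mu=-2.0, logq_sd=0.2, area=50.0, p=0.6),
    dict(logC_mu=0.2, logC_sd=0.6, logq_mu=-1.5, logq_sd=0.5, area=50.0, p=0.4),
]

# Pool per-model samples in proportion to the model probabilities.
pooled = np.concatenate([
    mass_discharge_samples(m, n=int(m["p"] * 10_000)) for m in models
])
lo, med, hi = np.percentile(pooled, [5, 50, 95])   # uncertainty bounds
```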
Where does the uncertainty come from? Attributing Uncertainty in Conceptual Hydrologic Modelling
Abu Shoaib, S.; Marshall, L. A.; Sharma, A.
2015-12-01
Defining an appropriate forecasting model is a key phase in water resources planning and design. Quantification of uncertainty is an important step in the development and application of hydrologic models. In this study, we examine the dependency of hydrologic model uncertainty on the observed model inputs, the defined model structure, parameter identifiability under optimization, and the adopted likelihood. We present a new uncertainty metric, the Quantile Flow Deviation (QFD), to evaluate the relative uncertainty due to each of these sources under a range of catchment conditions. Through the metric, we may identify the potential spectrum of uncertainty and variability in model simulations. The QFD assesses uncertainty by estimating the deviation in flows at a given quantile across a range of scenarios. By using a quantile-based metric, the change in uncertainty across individual percentiles can be assessed, thereby allowing uncertainty to be expressed as a function of time. The QFD method can be disaggregated to examine any part of the modelling process, including the selection of certain model subroutines or forcing data. Case study results (including catchments in Australia and the USA) suggest that model structure selection is vital irrespective of the flow percentile of interest or the catchment being studied. Examining the QFD across various quantiles additionally demonstrates that lower-yielding catchments may have greater variation due to the selected model structures. By incorporating multiple model structures, it is possible to assess (i) the relative importance of various sources of uncertainty, (ii) how these vary with a change in catchment location or hydrologic regime, and (iii) the impact of the length of available observations on uncertainty quantification.
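The QFD is described as the deviation in flows at a given quantile across a range of scenarios; one plausible reading of that definition can be sketched as follows. The synthetic ensemble and the max-minus-min spread measure are assumptions for illustration, not the authors' exact formulation.

```python
import numpy as np

def quantile_flow_deviation(flow_ensemble, quantiles=(0.1, 0.5, 0.9)):
    """For each quantile, the spread (max - min) of the flow at that
    quantile across an ensemble of simulations that differ in model
    structure, parameters, or forcing."""
    flow_ensemble = np.asarray(flow_ensemble, float)    # (n_members, n_times)
    qf = np.quantile(flow_ensemble, quantiles, axis=1)  # (n_quantiles, n_members)
    return {q: float(qf[i].max() - qf[i].min()) for i, q in enumerate(quantiles)}

t = np.arange(100)
base = 5.0 + 3.0 * np.sin(t / 10.0)                  # synthetic hydrograph
ensemble = np.stack([base, 1.2 * base, base + 0.5])  # three "model structures"
qfd = quantile_flow_deviation(ensemble)
```

Evaluating the dictionary across quantiles shows how the ensemble spread changes between low-flow and high-flow percentiles, which is the disaggregation the abstract describes.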
Reducing structural uncertainty in conceptual hydrological modeling in the semi-arid Andes
Hublart, P.; Ruelland, D.; Dezetter, A.; Jourde, H.
2014-10-01
The use of lumped, conceptual models in hydrological impact studies requires placing more emphasis on the uncertainty arising from deficiencies and/or ambiguities in the model structure. This study provides an opportunity to combine a multiple-hypothesis framework with a multi-criteria assessment scheme to reduce structural uncertainty in the conceptual modeling of a meso-scale Andean catchment (1515 km2) over a 30-year period (1982-2011). The modeling process was decomposed into six model-building decisions related to the following aspects of the system behavior: snow accumulation and melt, runoff generation, redistribution and delay of water fluxes, and natural storage effects. Each of these decisions was provided with a set of alternative modeling options, resulting in a total of 72 competing model structures. These structures were calibrated using the concept of Pareto optimality with three criteria pertaining to streamflow simulations and one to the seasonal dynamics of snow processes. The results were analyzed in the four-dimensional space of performance measures using a fuzzy c-means clustering technique and a differential split sample test, leading to the identification of 14 equally acceptable model hypotheses. A filtering approach was then applied to these best-performing structures in order to minimize the overall uncertainty envelope while maximizing the number of enclosed observations. This led to retaining eight model hypotheses as a representation of the minimum structural uncertainty that could be obtained with this modeling framework. Future work to better consider model predictive uncertainty should include a proper assessment of parameter equifinality and data errors, as well as the testing of new or refined hypotheses to allow for the use of additional auxiliary observations.
Reducing structural uncertainty in conceptual hydrological modelling in the semi-arid Andes
Hublart, P.; Ruelland, D.; Dezetter, A.; Jourde, H.
2015-05-01
The use of lumped, conceptual models in hydrological impact studies requires placing more emphasis on the uncertainty arising from deficiencies and/or ambiguities in the model structure. This study provides an opportunity to combine a multiple-hypothesis framework with a multi-criteria assessment scheme to reduce structural uncertainty in the conceptual modelling of a mesoscale Andean catchment (1515 km2) over a 30-year period (1982-2011). The modelling process was decomposed into six model-building decisions related to the following aspects of the system behaviour: snow accumulation and melt, runoff generation, redistribution and delay of water fluxes, and natural storage effects. Each of these decisions was provided with a set of alternative modelling options, resulting in a total of 72 competing model structures. These structures were calibrated using the concept of Pareto optimality with three criteria pertaining to streamflow simulations and one to the seasonal dynamics of snow processes. The results were analyzed in the four-dimensional (4-D) space of performance measures using a fuzzy c-means clustering technique and a differential split sample test, leading to the identification of 14 equally acceptable model hypotheses. A filtering approach was then applied to these best-performing structures in order to minimize the overall uncertainty envelope while maximizing the number of enclosed observations. This led to retaining eight model hypotheses as a representation of the minimum structural uncertainty that could be obtained with this modelling framework. Future work to better consider model predictive uncertainty should include a proper assessment of parameter equifinality and data errors, as well as the testing of new or refined hypotheses to allow for the use of additional auxiliary observations.
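The Pareto-optimality notion used to calibrate the competing structures can be illustrated with a small non-domination filter. The criterion values below are hypothetical, and all criteria are treated as costs to minimize (e.g., 1 - NSE on flows, a snow-cover error):

```python
import numpy as np

def pareto_front(scores):
    """Indices of non-dominated candidates when every criterion is a
    cost to minimize."""
    scores = np.asarray(scores, float)
    front = []
    for i, s in enumerate(scores):
        dominated = any(np.all(t <= s) and np.any(t < s)
                        for j, t in enumerate(scores) if j != i)
        if not dominated:
            front.append(i)
    return front

# Five hypothetical model structures scored on two criteria.
scores = [[0.10, 0.50],
          [0.20, 0.20],
          [0.30, 0.10],
          [0.40, 0.40],   # dominated by [0.20, 0.20]
          [0.15, 0.45]]
front = pareto_front(scores)
```

With four criteria, as in the study, the same filter runs in a 4-D performance space; the surviving indices are the "equally acceptable" hypotheses that a subsequent envelope-based filter would then thin further.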
Silva, F. E. O. E.; Naghettini, M. D. C.; Fernandes, W.
2014-12-01
This paper evaluated the uncertainties associated with the estimation of the parameters of a conceptual rainfall-runoff model through the use of Bayesian inference techniques by Monte Carlo simulation. The Pará River sub-basin, located in the upper São Francisco river basin in southeastern Brazil, was selected for the study. We used the Rio Grande conceptual hydrologic model (EHR/UFMG, 2001) and the Markov Chain Monte Carlo simulation method named DREAM (VRUGT, 2008a). Two probabilistic models for the residuals were analyzed: (i) the classic [Normal likelihood - r ~ N(0, σ²)]; and (ii) a generalized likelihood (SCHOUPS & VRUGT, 2010), in which it is assumed that the differences between observed and simulated flows are correlated, non-stationary, and distributed as a Skew Exponential Power density. The assumptions made for both models were checked to ensure that the estimation of uncertainties in the parameters was not biased. The results showed that the Bayesian approach was adequate for the proposed objectives, reinforcing the importance of assessing the uncertainties associated with hydrological modeling.
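A toy version of Bayesian parameter calibration with the classic Normal likelihood can be sketched with a random-walk Metropolis sampler standing in for DREAM. The one-parameter reservoir model, the synthetic data, and the known noise level are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(k, rain):
    """Toy one-parameter linear reservoir standing in for a full
    conceptual rainfall-runoff model."""
    q, s = [], 0.0
    for r in rain:
        s += r
        out = k * s          # outflow is a fraction k of storage
        s -= out
        q.append(out)
    return np.array(q)

rain = rng.exponential(2.0, 200)
q_obs = simulate(0.3, rain) + rng.normal(0.0, 0.05, 200)  # synthetic "observations"

def log_like(k, sigma=0.05):
    """Classic Normal likelihood, r ~ N(0, sigma^2), on the residuals."""
    if not 0.0 < k < 1.0:
        return -np.inf
    r = q_obs - simulate(k, rain)
    return -0.5 * np.sum((r / sigma) ** 2)

# Random-walk Metropolis: a minimal stand-in for the DREAM sampler.
k, ll, chain = 0.5, log_like(0.5), []
for _ in range(3000):
    k_new = k + rng.normal(0.0, 0.02)
    ll_new = log_like(k_new)
    if np.log(rng.uniform()) < ll_new - ll:
        k, ll = k_new, ll_new
    chain.append(k)
posterior = np.array(chain[1000:])       # discard burn-in
```

The posterior samples concentrate around the true k used to generate the data; checking the residual assumptions (independence, homoscedasticity), as the paper does, is what motivates the generalized likelihood alternative.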
Wei Wu; James Clark; James Vose
2010-01-01
Hierarchical Bayesian (HB) modeling allows for multiple sources of uncertainty by factoring complex relationships into conditional distributions that can be used to draw inference and make predictions. We applied an HB model to estimate the parameters and state variables of a parsimonious hydrological model, GR4J, by coherently assimilating the uncertainties from the...
Mockler, E. M.; Chun, K. P.; Sapriza-Azuri, G.; Bruen, M.; Wheater, H. S.
2016-11-01
Predictions of river flow dynamics provide vital information for many aspects of water management including water resource planning, climate adaptation, and flood and drought assessments. Many of the subjective choices that modellers make including model and criteria selection can have a significant impact on the magnitude and distribution of the output uncertainty. Hydrological modellers are tasked with understanding and minimising the uncertainty surrounding streamflow predictions before communicating the overall uncertainty to decision makers. Parameter uncertainty in conceptual rainfall-runoff models has been widely investigated, and model structural uncertainty and forcing data have been receiving increasing attention. This study aimed to assess uncertainties in streamflow predictions due to forcing data and the identification of behavioural parameter sets in 31 Irish catchments. By combining stochastic rainfall ensembles and multiple parameter sets for three conceptual rainfall-runoff models, an analysis of variance model was used to decompose the total uncertainty in streamflow simulations into contributions from (i) forcing data, (ii) identification of model parameters and (iii) interactions between the two. The analysis illustrates that, for our subjective choices, hydrological model selection had a greater contribution to overall uncertainty, while performance criteria selection influenced the relative intra-annual uncertainties in streamflow predictions. Uncertainties in streamflow predictions due to the method of determining parameters were relatively lower for wetter catchments, and more evenly distributed throughout the year when the Nash-Sutcliffe Efficiency of logarithmic values of flow (lnNSE) was the evaluation criterion.
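The analysis-of-variance decomposition into forcing, parameter, and interaction contributions can be sketched for a matrix of simulations indexed by forcing-ensemble member and parameter set. The synthetic matrix below is an assumption; the two-way ANOVA identity itself is standard:

```python
import numpy as np

def anova_decompose(sim):
    """Two-way ANOVA split of total variance for sim[i, j], the output
    for forcing-ensemble member i and parameter set j."""
    sim = np.asarray(sim, float)
    grand = sim.mean()
    a = sim.mean(axis=1) - grand                   # forcing main effects
    b = sim.mean(axis=0) - grand                   # parameter main effects
    inter = sim - grand - a[:, None] - b[None, :]  # interaction residual
    ss_tot = ((sim - grand) ** 2).sum()
    parts = {"forcing": (a ** 2).sum() * sim.shape[1],
             "parameters": (b ** 2).sum() * sim.shape[0],
             "interaction": (inter ** 2).sum()}
    return {k: v / ss_tot for k, v in parts.items()}

rng = np.random.default_rng(3)
sim = (rng.normal(0.0, 1.0, (20, 1))      # strong forcing effect
       + rng.normal(0.0, 0.5, (1, 15))    # weaker parameter effect
       + rng.normal(0.0, 0.2, (20, 15)))  # interaction "noise"
shares = anova_decompose(sim)
```

The three shares sum to one, so they can be read directly as fractional contributions to total streamflow uncertainty, the quantity the study reports per catchment and season.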
Tolerance of uncertainty: Conceptual analysis, integrative model, and implications for healthcare
Hillen, Marij A.; Gutheil, Caitlin M.; Strout, Tania D.; Smets, Ellen M. A.; Han, Paul K. J.
2017-01-01
Rationale: Uncertainty tolerance (UT) is an important, well-studied phenomenon in health care and many other important domains of life, yet its conceptualization and measurement by researchers in various disciplines have varied substantially and its essential nature remains unclear. Objective: The
International Nuclear Information System (INIS)
Gallegos, D.P.; Phol, P.I.; Updegraff, C.D.
1992-04-01
Performance assessment modeling for High Level Waste (HLW) disposal incorporates three different types of uncertainty. These include data and parameter uncertainty, modeling uncertainty (which includes conceptual, mathematical, and numerical uncertainty), and uncertainty associated with predicting the future state of the system. In this study, the potential impact of conceptual model uncertainty on the estimated performance of a hypothetical high-level radioactive waste disposal site in unsaturated, fractured tuff has been assessed for a given group of conceptual models. This was accomplished by taking a series of six one-dimensional conceptual models, which differed only by the fundamental assumptions used to develop them, and conducting ground-water flow and radionuclide transport simulations. Complementary cumulative distribution functions (CCDFs) representing integrated radionuclide release to the water table indicate that differences in the basic assumptions used to develop conceptual models can have a significant impact on the estimated performance of the site. Because each of the conceptual models employed the same mathematical and numerical models, contained the same data and parameter values and ranges, and did not consider the possible future states of the system, changes in the CCDF could be attributed primarily to differences in conceptual modeling assumptions. Studies such as this one could help prioritize site characterization activities by identifying critical and uncertain assumptions used in model development, thereby providing guidance as to where reduction of uncertainty is most important.
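The CCDF comparison across conceptual models can be sketched as follows; the lognormal release distributions standing in for two flow conceptualizations are purely illustrative:

```python
import numpy as np

def ccdf(samples):
    """Empirical complementary CDF: P(release > r) at each sorted value."""
    x = np.sort(np.asarray(samples, float))
    p = 1.0 - np.arange(1, x.size + 1) / x.size
    return x, p

rng = np.random.default_rng(1)
# Hypothetical integrated releases simulated under two conceptual models.
release_a = rng.lognormal(-2.0, 0.5, 500)   # e.g., matrix-dominated flow
release_b = rng.lognormal(-1.0, 0.8, 500)   # e.g., fracture-dominated flow
xa, pa = ccdf(release_a)
xb, pb = ccdf(release_b)
```

Plotting (xa, pa) against (xb, pb) makes the effect of the conceptual assumption visible as a horizontal shift and a change of slope in the exceedance curve, which is exactly how the study diagnoses the impact of differing model assumptions.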
Improving characterization of streamflow by conceptual modeling of rating curve uncertainty
Weijs, Steven; Galindo, Luis
2017-04-01
Streamflow time series are an important source of information for hydrological predictions, both through direct use in extreme value analysis and through streamflow records used in the calibration of hydrological models. In this research we look at ways to best represent uncertainties in the rating curve and ways to constrain them using additional information beyond the stage-discharge (Q, h) pairs traditionally used. One of the possible avenues to enable the use of such information is a more physically based representation of rating curves and explicit accounting for the dynamic nature of the stage-discharge relation. We present these representations and the reduction in uncertainty that can be achieved by the introduction of various pieces of external information. The influence of variable streamflow uncertainty on hydrological model calibration will also be explored.
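A common physically based representation of a rating curve is the power law Q = a(h - h0)^b for a single hydraulic control; propagating hypothetical posterior parameter samples through it yields a streamflow uncertainty band at a given stage. All parameter distributions below are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(7)

def rating_curve(h, a, h0, b):
    """Power-law rating curve Q = a * (h - h0)^b, a common physically
    motivated form for a single hydraulic control; zero below h0."""
    return a * np.clip(h - h0, 0.0, None) ** b

# Hypothetical posterior samples of the rating-curve parameters.
a_s = rng.normal(15.0, 1.5, 2000)
h0_s = rng.normal(0.20, 0.03, 2000)
b_s = rng.normal(1.7, 0.1, 2000)

h = 1.5                                  # stage (m)
q = rating_curve(h, a_s, h0_s, b_s)      # discharge samples (m^3/s)
q_lo, q_med, q_hi = np.percentile(q, [5, 50, 95])
```

External information of the kind the abstract mentions (e.g., hydraulic knowledge about the control) would enter as tighter priors on a, h0, or b, directly narrowing the resulting band.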
Directory of Open Access Journals (Sweden)
A. E. Sikorska
2012-04-01
Urbanization and the resulting land-use change strongly affect the water cycle and runoff processes in watersheds. Unfortunately, small urban watersheds, which are most affected by urban sprawl, are mostly ungauged. This makes it intrinsically difficult to assess the consequences of urbanization. Most of all, it is unclear how to reliably assess the predictive uncertainty given the structural deficits of the applied models. In this study, we therefore investigate the uncertainty of flood predictions in ungauged urban basins from structurally uncertain rainfall-runoff models. To this end, we suggest a procedure to explicitly account for input uncertainty and model structure deficits using Bayesian statistics with a continuous-time autoregressive error model. In addition, we propose a concise procedure to derive prior parameter distributions from base data and successfully apply the methodology to an urban catchment in Warsaw, Poland. Based on our results, we are able to demonstrate that the autoregressive error model greatly helps to meet the statistical assumptions and to compute reliable prediction intervals. In our study, we found that predicted peak flows were up to 7 times higher than observations. This was reduced to 5 times with Bayesian updating, using only a few discharge measurements. In addition, our analysis suggests that imprecise rainfall information and model structure deficits contribute most to the total prediction uncertainty. In the future, flood predictions in ungauged basins will become more important due to ongoing urbanization as well as anthropogenic and climatic changes. Thus, providing reliable measures of uncertainty is crucial to support decision making.
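The autoregressive error model can be illustrated in its discrete AR(1) form, where the predictive mean of the error decays and its variance grows toward the stationary value with lead time. The values of phi, sigma, and the last residual are illustrative assumptions:

```python
import numpy as np

def ar1_forecast(e_last, sigma, phi, horizon):
    """Predictive mean and standard deviation of an AR(1) error model
    e_t = phi * e_{t-1} + eta_t, eta ~ N(0, sigma^2), h steps ahead."""
    h = np.arange(1, horizon + 1)
    mean = e_last * phi ** h
    var = sigma ** 2 * (1.0 - phi ** (2 * h)) / (1.0 - phi ** 2)
    return mean, np.sqrt(var)

# Illustrative values: last residual 2.0 m^3/s, phi = 0.8, sigma = 0.5.
mean, sd = ar1_forecast(e_last=2.0, sigma=0.5, phi=0.8, horizon=10)
upper = mean + 1.96 * sd   # added to the deterministic simulation to
lower = mean - 1.96 * sd   # form an approximate 95% prediction interval
```

This is the mechanism by which a few discharge measurements (fixing a recent residual) tighten and re-center the prediction intervals, as reported in the abstract.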
Directory of Open Access Journals (Sweden)
Marc A. Nelitz
2015-12-01
Complexity and uncertainty are inherent in social-ecological systems. Although they can create challenges for scientists and decision makers, they cannot be a reason for delaying decision making. Two strategies have matured in recent decades to address these challenges. Systems thinking, as embodied by conceptual modeling, is a holistic approach in which a system can be better understood by examining it as a whole. Expert elicitation represents a second strategy that enables a greater diversity of inputs to understand complex systems. We explored the use of conceptual models and expert judgments to inform the expansion of monitoring around oil sands development in northern Alberta, Canada, particularly related to migratory forest birds. This study area is a complex social-ecological system for which there is an abundance of specific information, but a relatively weak understanding of system behavior. Multiple conceptual models were developed to represent complexity and provide a fuller view of influences across the landscape. A hierarchical approach proved useful, and a mechanistic structure of the models clarified the cumulative and interactive nature of factors within and outside the study area. To address gaps in understanding, expert judgments were integrated using a series of structured exercises to derive "weightings" of the importance of different components in the conceptual models, specifically pairwise comparisons, Likert scaling, and a maximum-difference conjoint approach. These exercises were helpful for discriminating the importance of different influences and illuminating the competing beliefs of experts. Various supporting tools helped us engage a group of experts from across North America, including a virtual meeting, online polling, desktop sharing, a web survey, and a financial incentive. This combination of techniques was innovative and proved useful for addressing complexity and uncertainty in a specific natural resource
DEFF Research Database (Denmark)
Thomsen, Nanna Isbak; Binning, Philip John; McKnight, Ursula S.
2016-01-01
to be a major source of model error and it should therefore be accounted for when evaluating uncertainties in risk assessments. We present a Bayesian belief network (BBN) approach for constructing CSMs and assessing their uncertainty at contaminated sites. BBNs are graphical probabilistic models...... that are effective for integrating quantitative and qualitative information, and thus can strengthen decisions when empirical data are lacking. The proposed BBN approach facilitates a systematic construction of multiple CSMs, and then determines the belief in each CSM using a variety of data types and/or expert...... on data from three investigation stages (a screening investigation, a more detailed investigation, and an expert consultation) to demonstrate that the belief can be updated as more information becomes available....
Valent, Peter; Szolgay, Ján; Riverso, Carlo
2012-12-01
Most of the studies that assess the performance of various calibration techniques have to deal with a certain amount of uncertainty in the calibration data. In this study we tested HBV model calibration procedures in hypothetically ideal conditions under the assumption of no errors in the measured data. This was achieved by creating an artificial time series of the flows created by the HBV model using the parameters obtained from calibrating the measured flows. The artificial flows were then used to replace the original flows in the calibration data, which was then used for testing how well calibration procedures can reproduce known model parameters. The results showed that in one hundred independent calibration runs of the HBV model, we did not manage to recover, without some degree of uncertainty, parameters identical to those used to create the artificial flow data. Although the calibration procedure of the model works properly from a practical point of view, this can be regarded as a demonstration of the equifinality principle, since several parameter sets were obtained which led to equally acceptable or behavioural representations of the observed flows. The study demonstrated how artificially generated data can be used to assess the uncertainty of hydrological predictions and to support the further development of a model or the choice of a calibration method.
Energy Technology Data Exchange (ETDEWEB)
Meyer, Philip D.; Ye, Ming; Rockhold, Mark L.; Neuman, Shlomo P.; Cantrell, Kirk J.
2007-07-30
This report to the Nuclear Regulatory Commission (NRC) describes the development and application of a methodology to systematically and quantitatively assess predictive uncertainty in groundwater flow and transport modeling that considers the combined impact of hydrogeologic uncertainties associated with the conceptual-mathematical basis of a model, model parameters, and the scenario to which the model is applied. The methodology is based on an extension of a Maximum Likelihood implementation of Bayesian Model Averaging. Model uncertainty is represented by postulating a discrete set of alternative conceptual models for a site with associated prior model probabilities that reflect a belief about the relative plausibility of each model based on its apparent consistency with available knowledge and data. Posterior model probabilities are computed and parameter uncertainty is estimated by calibrating each model to observed system behavior; prior parameter estimates are optionally included. Scenario uncertainty is represented as a discrete set of alternative future conditions affecting boundary conditions, source/sink terms, or other aspects of the models, with associated prior scenario probabilities. A joint assessment of uncertainty results from combining model predictions computed under each scenario using as weights the posterior model and prior scenario probabilities. The uncertainty methodology was applied to modeling of groundwater flow and uranium transport at the Hanford Site 300 Area. Eight alternative models representing uncertainty in the hydrogeologic and geochemical properties as well as the temporal variability were considered. Two scenarios representing alternative future behavior of the Columbia River adjacent to the site were considered. The scenario alternatives were implemented in the models through the boundary conditions. Results demonstrate the feasibility of applying a comprehensive uncertainty assessment to large-scale, detailed groundwater flow
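The joint weighting of posterior model probabilities and prior scenario probabilities described above reduces to a simple outer product; all probabilities and predictions below are invented for illustration:

```python
import numpy as np

# Posterior model probabilities (from calibration) and prior scenario
# probabilities; the numbers are illustrative only.
p_model = np.array([0.5, 0.3, 0.2])
p_scen = np.array([0.7, 0.3])

# pred[m, s]: prediction of model m under scenario s (illustrative).
pred = np.array([[1.0, 1.4],
                 [1.2, 1.6],
                 [0.9, 1.1]])

w = p_model[:, None] * p_scen[None, :]   # joint weights, sum to 1
mean = (w * pred).sum()                  # model- and scenario-averaged
var = (w * (pred - mean) ** 2).sum()     # between-model/scenario spread
```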
DEFF Research Database (Denmark)
Thomsen, Nanna Isbak; Troldborg, Mads; McKnight, Ursula S.
2012-01-01
) leave as is, (2) clean up, or (3) further investigation needed. However, mass discharge estimates are often very uncertain, which may hamper the management decisions. If option 1 is incorrectly chosen soil and water quality will decrease, threatening or destroying drinking water resources. The risk...... consisting of PCE (perchloroethylene) has contaminated a fractured clay till aquitard overlaying a limestone aquifer. The exact shape and nature of the source is unknown and so is the importance of transport in the fractures. The result of the multi-model approach is a visual representation...
Lin, Lin; Chiang, Hui-Hsun; Acquaye, Alvina A; Vera-Bolanos, Elizabeth; Gilbert, Mark R; Armstrong, Terri S
2013-08-01
Patients with primary brain tumors (PBTs) face uncertainty related to prognosis, symptoms, treatment response, and toxicity. The authors of this report examined the direct/indirect relations among patients' uncertainty, mood states, and symptoms. In total, 186 patients with PBTs were accrued at various points in the illness trajectory. Data-collection tools included an investigator-completed clinician checklist, a patient-completed demographic data sheet, the Mishel Uncertainty in Illness Scale-Brain Tumor Form (MUIS-BT), the MD Anderson Symptom Inventory-Brain Tumor Module (MDASI-BT), and the Profile of Mood States-Short Form (POMS-SF). Structural equation modeling was used to explore correlations among variables. Participants were primarily white (80%) men (53%) with a variety of brain tumors. They ranged in age from 19 to 80 years (mean ± standard deviation, 44.2 ± 12.6 years). Lower functional status and an earlier point in the illness trajectory were associated with greater uncertainty, and mood states measured by the POMS-SF were directly associated with the symptom severity perceived by patients. The results from the study clearly demonstrated distinct pathways for the relations between uncertainty, mood states, and symptom severity for patients with PBTs. Uncertainty in patients with PBTs is higher for those who have a poor performance status and directly impacts negative mood states, which mediate patient-perceived symptom severity. This conceptual model suggests that interventions designed to reduce uncertainty or that target mood states may help lessen patients' perception of symptom severity, which, in turn, may result in better treatment outcomes and quality of life. © 2013 American Cancer Society.
Communicating Uncertainty Information Across Conceptual Boundaries
2011-12-01
market for 'lemons': Quality uncertainty and the market mechanism." The Quarterly Journal of Economics 84(3): 488–500. Allen, M. 2010. "Embracing... www.bipm.org/utils/common/documents/jcgm/JCGM_100_2008_E.pdf. Bitz, C. M. 2008. "Some aspects of uncertainty in predicting sea ice thinning
Lin, Lin; Yeh, Chao-Hsing; Mishel, Merle H
2010-12-01
The prognoses of childhood cancers have improved over the last few decades. Nevertheless, parental uncertainty about the absolute cure and possible relapse pervades the entire illness trajectory. Although illness-related uncertainty is significantly related to psychological distress, continual uncertainty may serve as a catalyst for positive psychological change and personal growth in the context of surviving cancer. The purpose of this study was to examine a conceptual model that depicts coping and growth in Taiwanese parents living with the continual uncertainty about their child's cancer. The conceptual model was guided by Mishel's theories of Uncertainty in Illness. The impact of the child's health status, parents' education level and perceived social support on parental uncertainty was analyzed. The mediating effect of coping, as well as the influence of parental uncertainty and parents' perceived social support on growth through uncertainty, was incorporated in the model testing. This study involved a sample of 205 mothers and 96 fathers of 226 children enrolled in a longitudinal cancer study in Taiwan. Only the data collected at baseline were analyzed. A cross-sectional design was utilized to examine the relationships among the proposed variables. Parental uncertainty and growth through uncertainty were measured by the translated questionnaires originally developed by Mishel. Parents' perceived social support and coping were measured by culturally sensitive instruments developed in Taiwan. The full research model and its alternative models fit the data adequately in structural equation modeling tests. Parental uncertainty and parents' perceived social support were associated with growth through uncertainty, which was mediated by coping. The child's health status and parents' perceived social support significantly predicted parental uncertainty. This study suggests that parental uncertainty has a negative impact on coping strategies such as interacting with
Model uncertainty and probability
International Nuclear Information System (INIS)
Parry, G.W.
1994-01-01
This paper discusses the issue of model uncertainty. The use of probability as a measure of an analyst's uncertainty as well as a means of describing random processes has caused some confusion, even though the two uses represent different types of uncertainty with respect to modeling a system. The importance of maintaining the distinction between the two types is illustrated with a simple example.
Uncertainty in biodiversity science, policy and management: a conceptual overview
Directory of Open Access Journals (Sweden)
Yrjö Haila
2014-10-01
Full Text Available The protection of biodiversity is a complex societal, political and ultimately practical imperative of current global society. The imperative builds upon scientific knowledge of human dependence on the life-support systems of the Earth. This paper aims to introduce the main types of uncertainty inherent in biodiversity science, policy and management, as an introduction to a companion paper summarizing practical experiences of scientists and scholars (Haila et al. 2014). Uncertainty is a cluster concept: the actual nature of uncertainty is inherently context-bound. We use semantic space as a conceptual device to identify key dimensions of uncertainty in the context of biodiversity protection; these relate to (i) data; (ii) proxies; (iii) concepts; (iv) policy and management; and (v) normative goals. Semantic space offers an analytic perspective for drawing critical distinctions between types of uncertainty, identifying fruitful resonances that help to cope with the uncertainties, and building up collaboration between different specialists to support mutual social learning.
Application of uncertainty analysis in conceptual fusion reactor design
International Nuclear Information System (INIS)
Wu, T.; Maynard, C.W.
1979-01-01
The theories of sensitivity and uncertainty analysis are described and applied to a new conceptual tokamak fusion reactor design--NUWMAK. The responses investigated in this study include the tritium breeding ratio, first wall Ti dpa and gas productions, nuclear heating in the blanket, energy leakage to the magnet, and the dpa rate in the superconducting magnet aluminum stabilizer. The sensitivities and uncertainties of these responses are calculated. The cost/benefit feature of proposed integral measurements is also studied through the uncertainty reductions of these responses
Model Building for Conceptual Change
Jonassen, David; Strobel, Johannes; Gottdenker, Joshua
2005-01-01
Conceptual change is a popular, contemporary conception of meaningful learning. Conceptual change describes changes in conceptual frameworks (mental models or personal theories) that learners construct to comprehend phenomena. Different theories of conceptual change describe the reorganization of conceptual frameworks that results from different…
Arnaoudova, Kristina; Stanchev, Peter
2015-11-01
The business processes are the key asset of every organization. The design of business process models is a foremost concern and target among an organization's functions. Business processes and their proper management depend intensely on the performance of software applications and technology solutions. The paper attempts to define a new conceptual model of an IT service provider; it can be examined as an IT-focused enterprise model, part of the Enterprise Architecture (EA) school.
Model uncertainty: Probabilities for models?
International Nuclear Information System (INIS)
Winkler, R.L.
1994-01-01
Like any other type of uncertainty, model uncertainty should be treated in terms of probabilities. The question is how to do this. The most commonly used approach has a drawback related to the interpretation of the probabilities assigned to the models. If we step back and look at the big picture, asking what the appropriate focus of the model uncertainty question should be in the context of risk and decision analysis, we see that a different probabilistic approach makes more sense, although it raises some implementation questions. Current work that is underway to address these questions looks very promising.
Conceptual and computational basis for the quantification of margins and uncertainty
International Nuclear Information System (INIS)
Helton, Jon Craig
2009-01-01
In 2001, the National Nuclear Security Administration of the U.S. Department of Energy in conjunction with the national security laboratories (i.e., Los Alamos National Laboratory, Lawrence Livermore National Laboratory and Sandia National Laboratories) initiated development of a process designated Quantification of Margins and Uncertainty (QMU) for the use of risk assessment methodologies in the certification of the reliability and safety of the nation's nuclear weapons stockpile. This presentation discusses and illustrates the conceptual and computational basis of QMU in analyses that use computational models to predict the behavior of complex systems. Topics considered include (1) the role of aleatory and epistemic uncertainty in QMU, (2) the representation of uncertainty with probability, (3) the probabilistic representation of uncertainty in QMU analyses involving only epistemic uncertainty, (4) the probabilistic representation of uncertainty in QMU analyses involving aleatory and epistemic uncertainty, (5) procedures for sampling-based uncertainty and sensitivity analysis, (6) the representation of uncertainty with alternatives to probability such as interval analysis, possibility theory and evidence theory, (7) the representation of uncertainty with alternatives to probability in QMU analyses involving only epistemic uncertainty, and (8) the representation of uncertainty with alternatives to probability in QMU analyses involving aleatory and epistemic uncertainty. Concepts and computational procedures are illustrated with both notional examples and examples from reactor safety and radioactive waste disposal.
Uncertainties in repository modeling
Energy Technology Data Exchange (ETDEWEB)
Wilson, J.R.
1996-12-31
The distant future is very difficult to predict. Unfortunately, our regulators are being encouraged to extend the regulatory period from the standard 10,000 years to 1 million years. Such overconfidence is not justified due to uncertainties in dating, calibration, and modeling.
Uncertainties in repository modeling
International Nuclear Information System (INIS)
Wilson, J.R.
1996-01-01
The distant future is very difficult to predict. Unfortunately, our regulators are being encouraged to extend the regulatory period from the standard 10,000 years to 1 million years. Such overconfidence is not justified due to uncertainties in dating, calibration, and modeling.
Pragmatic aspects of uncertainty propagation: A conceptual review
Thacker, W.Carlisle
2015-09-11
When quantifying the uncertainty of the response of a computationally costly oceanographic or meteorological model stemming from the uncertainty of its inputs, practicality demands getting the most information using the fewest simulations. It is widely recognized that, by interpolating the results of a small number of simulations, results of additional simulations can be inexpensively approximated to provide a useful estimate of the variability of the response. Even so, as computing the simulations to be interpolated remains the biggest expense, the choice of these simulations deserves attention. When making this choice, two requirements should be considered: (i) the nature of the interpolation and (ii) the available information about input uncertainty. Examples comparing polynomial interpolation and Gaussian process interpolation are presented for three different views of input uncertainty.
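The emulation idea the abstract describes can be sketched in a few lines: run the costly model at a handful of design points, interpolate, and propagate input uncertainty through the cheap interpolant instead. This is a minimal sketch with a toy stand-in model (`sin(3x)`), a simple RBF kernel, and an illustrative length-scale; all names and values are assumptions, not the paper's.

```python
import numpy as np

def rbf_kernel(a, b, ell=0.5):
    """Squared-exponential kernel matrix between point sets a and b."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / ell) ** 2)

def costly_model(x):
    return np.sin(3.0 * x)  # placeholder for an expensive simulation

# A handful of "expensive" simulations at design points.
x_train = np.linspace(0.0, 2.0, 7)
y_train = costly_model(x_train)

# Gaussian-process-style (noise-free) interpolation weights.
K = rbf_kernel(x_train, x_train) + 1e-10 * np.eye(len(x_train))
weights = np.linalg.solve(K, y_train)

def emulator(x_query):
    return rbf_kernel(x_query, x_train) @ weights

# Propagate input uncertainty through the cheap emulator instead of the model.
rng = np.random.default_rng(0)
x_samples = rng.uniform(0.0, 2.0, 10_000)
response = emulator(x_samples)  # spread of response reflects input uncertainty
```

The choice of the seven design points, which the abstract stresses, directly controls how faithful `emulator` is to `costly_model` between them.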
A commentary on model uncertainty
International Nuclear Information System (INIS)
Apostolakis, G.
1994-01-01
A framework is proposed for the identification of model and parameter uncertainties in risk assessment models. Two cases are distinguished; in the first case, a set of mutually exclusive and exhaustive hypotheses (models) can be formulated, while, in the second, only one reference model is available. The relevance of this formulation to decision making and the communication of uncertainties is discussed
Chemical model reduction under uncertainty
Najm, Habib
2016-01-05
We outline a strategy for chemical kinetic model reduction under uncertainty. We present highlights of our existing deterministic model reduction strategy, and describe the extension of the formulation to include parametric uncertainty in the detailed mechanism. We discuss the utility of this construction, as applied to hydrocarbon fuel-air kinetics, and the associated use of uncertainty-aware measures of error between predictions from detailed and simplified models.
Representing uncertainty on model analysis plots
Directory of Open Access Journals (Sweden)
Trevor I. Smith
2016-09-01
Full Text Available Model analysis provides a mechanism for representing student learning as measured by standard multiple-choice surveys. The model plot contains information regarding both how likely students in a particular class are to choose the correct answer and how likely they are to choose an answer consistent with a well-documented conceptual model. Unfortunately, Bao’s original presentation of the model plot did not include a way to represent uncertainty in these measurements. I present details of a method to add error bars to model plots by expanding the work of Sommer and Lindell. I also provide a template for generating model plots with error bars.
Methodology for the treatment of model uncertainty
Droguett, Enrique Lopez
The development of a conceptual, unified framework and methodology for treating model and parameter uncertainties is the subject of this work. First, a discussion of the philosophical grounds of notions such as reality, modeling, models, and their relation is presented, and a characterization of the modeling process is given. The concept of uncertainty is then investigated, addressing controversial topics such as the types and sources of uncertainty; it is argued that uncertainty is fundamentally a characterization of lack of knowledge and that, as such, all uncertainties are of the same type. A discussion about the roles of model structure and model parameters follows, in which it is argued that the distinction between them is a matter of convenience and a function of the stage in the modeling process. From the foregoing discussion, a Bayesian framework for an integrated assessment of model and parameter uncertainties is developed. The methodology has as its central point the treatment of a model as a source of information regarding the unknown of interest. It allows for the assessment of the model characteristics affecting its performance, such as bias and precision. It also permits the assessment of possible dependencies among multiple models. Furthermore, the proposed framework makes possible the use of not only information from models (e.g., point estimates, qualitative assessments), but also evidence about the models themselves (performance data, confidence in the model, applicability of the model). The methodology is then applied in the context of fire risk models, where several examples with real data are studied. These examples demonstrate how the framework and specific techniques developed in this study can address cases involving multiple models, use of performance data to update the predictive capabilities of a model, and the case where a model is applied in a context other than the one for which it was designed.
Uncertainty analysis of environmental models
International Nuclear Information System (INIS)
Monte, L.
1990-01-01
In the present paper an evaluation of the output uncertainty of an environmental model for assessing the transfer of 137Cs and 131I in the human food chain is carried out on the basis of a statistical analysis of data reported in the literature. The uncertainty analysis offers the opportunity of obtaining some remarkable information about the uncertainty of models predicting the migration of non-radioactive substances in the environment, mainly in relation to dry and wet deposition.
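The general pattern of such an output-uncertainty analysis can be sketched by Monte Carlo propagation through a simple transfer model. Everything below is illustrative: the model form (deposition times transfer factor times intake), the distributions, and all numbers are assumptions for the sketch, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Fixed scenario value and uncertain transfer parameters (illustrative).
deposition = 1.0e3                                               # Bq/m^2
transfer = rng.lognormal(mean=np.log(1e-3), sigma=0.7, size=n)   # m^2/kg
intake = rng.normal(loc=0.3, scale=0.05, size=n)                 # kg/day

# Predicted ingested activity under parameter uncertainty.
activity = deposition * transfer * intake                        # Bq/day

lo, med, hi = np.percentile(activity, [2.5, 50.0, 97.5])
# With a lognormal transfer factor, the 95% band spans roughly an
# order of magnitude, dominating the output uncertainty.
```

The width of the `lo`-to-`hi` band is the kind of "remarkable information" a statistical analysis of literature data can supply about model predictions.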
Model uncertainty in safety assessment
International Nuclear Information System (INIS)
Pulkkinen, U.; Huovinen, T.
1996-01-01
The uncertainty analyses are an essential part of any risk assessment. Usually the uncertainties of reliability model parameter values are described by probability distributions and the uncertainty is propagated through the whole risk model. In addition to the parameter uncertainties, the assumptions behind the risk models may be based on insufficient experimental observations and the models themselves may not be exact descriptions of the phenomena under analysis. The description and quantification of this type of uncertainty, model uncertainty, is the topic of this report. The model uncertainty is characterized and some approaches to model and quantify it are discussed. The emphasis is on so-called mixture models, which have been applied in PSAs. Some of the possible disadvantages of the mixture model are addressed. In addition to quantitative analyses, qualitative analysis is also discussed briefly. To illustrate the models, two simple case studies on failure intensity and human error modeling are described. In both examples, the analysis is based on simple mixture models, which are observed to apply in PSA analyses. (orig.) (36 refs., 6 figs., 2 tabs.)
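A mixture model of the kind the report emphasizes can be sketched for a failure-intensity case: the predictive quantity is a probability-weighted mixture of candidate models, and the weights are updated from observed data. The two candidate rates, the prior weights, and the observation below are illustrative assumptions, not the report's case study values.

```python
import numpy as np

# Two candidate constant-failure-rate models (illustrative values, per hour).
rates = np.array([1e-4, 5e-4])
prior = np.array([0.5, 0.5])   # prior mixture weights

# Suppose 3 failures are observed in 10,000 hours; update the weights
# using (unnormalised) Poisson likelihoods for the failure count.
t, k = 10_000.0, 3
like = (rates * t) ** k * np.exp(-rates * t)
post = prior * like
post /= post.sum()

# Predictive (mixture) mean failure intensity.
mixture_mean = float(np.dot(post, rates))
```

The posterior weights shift toward the model whose rate better explains the count, and the mixture mean lies between the candidate rates rather than forcing a single-model choice.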
Event-Based Conceptual Modeling
DEFF Research Database (Denmark)
Bækgaard, Lars
2009-01-01
The purpose of the paper is to obtain insight into and provide practical advice for event-based conceptual modeling. We analyze a set of event concepts and use the results to formulate a conceptual event model that is used to identify guidelines for creation of dynamic process models and static information models. We characterize events as short-duration processes that have participants, consequences, and properties, and that may be modeled in terms of information structures. The conceptual event model is used to characterize a variety of event concepts and it is used to illustrate how events can be used to integrate dynamic modeling of processes and static modeling of information structures. The results are unique in the sense that no other general event concept has been used to unify a similar broad variety of seemingly incompatible event concepts. The general event concept can be used...
Numerical modeling of economic uncertainty
DEFF Research Database (Denmark)
Schjær-Jacobsen, Hans
2007-01-01
Representation and modeling of economic uncertainty is addressed by different modeling methods, namely stochastic variables and probabilities, interval analysis, and fuzzy numbers, in particular triple estimates. Focusing on discounted cash flow analysis, numerical results are presented, comparisons...
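The triple-estimate idea mentioned here can be sketched for a discounted cash flow: carry a (pessimistic, most likely, optimistic) triple through the NPV formula. The cash flows and discount rate are illustrative assumptions; for monotone cash flows the NPV of the triple behaves like an interval bound on the result.

```python
def npv(cash_flows, rate):
    """Net present value of a cash-flow series; year 0 is the investment."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

# Triple estimates of yearly cash flow (illustrative values).
pessimistic = [-100.0, 20.0, 20.0, 20.0, 20.0, 20.0]
most_likely = [-100.0, 30.0, 30.0, 30.0, 30.0, 30.0]
optimistic  = [-100.0, 40.0, 40.0, 40.0, 40.0, 40.0]

rate = 0.05
npv_triple = tuple(npv(cf, rate) for cf in (pessimistic, most_likely, optimistic))
# npv_triple is itself a (pessimistic, most likely, optimistic) estimate.
```

Unlike a stochastic treatment, this propagates bounds directly, at the price of saying nothing about probabilities within the interval.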
Model Breaking Points Conceptualized
Vig, Rozy; Murray, Eileen; Star, Jon R.
2014-01-01
Current curriculum initiatives (e.g., National Governors Association Center for Best Practices and Council of Chief State School Officers 2010) advocate that models be used in the mathematics classroom. However, despite their apparent promise, there comes a point when models break, a point in the mathematical problem space where the model cannot,…
Uncertainty in hydrological change modelling
DEFF Research Database (Denmark)
Seaby, Lauren Paige
Hydrological change modelling methodologies generally use climate model outputs to force hydrological simulations under changed conditions. There are nested sources of uncertainty throughout this methodology, including the choice of climate model and subsequent bias correction methods. This Ph.D. study evaluates the uncertainty of the impact of climate change in hydrological simulations given multiple climate models and bias correction methods of varying complexity. Three distribution based scaling methods (DBS) were developed and benchmarked against a more simplistic and commonly used delta change (DC) approach. These climate model projections were then used to force hydrological simulations under climate change for the island Sjælland in Denmark to analyse the contribution of different climate models and bias correction methods to overall uncertainty in the hydrological change modelling...
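The delta change (DC) baseline the study benchmarks against can be sketched simply: scale the observed series by the ratio of the climate model's future mean to its control-period mean. All series below are synthetic stand-ins; the gamma distributions and their parameters are illustrative assumptions, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic daily precipitation series (illustrative gamma distributions).
obs = rng.gamma(shape=2.0, scale=3.0, size=365)          # observations
gcm_control = rng.gamma(shape=2.0, scale=2.5, size=365)  # model, control period
gcm_future = rng.gamma(shape=2.0, scale=3.0, size=365)   # model, wetter future

# Multiplicative delta change factor from the climate model signal.
delta = gcm_future.mean() / gcm_control.mean()

# DC-projected series: observed day-to-day structure, shifted mean.
projected = obs * delta
```

Distribution-based scaling (DBS) methods go further by correcting the whole distribution rather than just the mean, which is why the study contrasts the two.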
Bayesian analysis for uncertainty estimation of a canopy transpiration model
Samanta, S.; Mackay, D. S.; Clayton, M. K.; Kruger, E. L.; Ewers, B. E.
2007-04-01
A Bayesian approach was used to fit a conceptual transpiration model to half-hourly transpiration rates for a sugar maple (Acer saccharum) stand collected over a 5-month period and probabilistically estimate its parameter and prediction uncertainties. The model used the Penman-Monteith equation with the Jarvis model for canopy conductance. This deterministic model was extended by adding a normally distributed error term. This extension enabled using Markov chain Monte Carlo simulations to sample the posterior parameter distributions. The residuals revealed approximate conformance to the assumption of normally distributed errors. However, minor systematic structures in the residuals at fine timescales suggested model changes that would potentially improve the modeling of transpiration. Results also indicated considerable uncertainties in the parameter and transpiration estimates. This simple methodology of uncertainty analysis would facilitate the deductive step during the development cycle of deterministic conceptual models by accounting for these uncertainties while drawing inferences from data.
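The general recipe described here (a deterministic model extended with a normally distributed error term, with parameters sampled by Markov chain Monte Carlo) can be sketched on a toy model. This is not the Penman-Monteith/Jarvis model of the study; the linear response, the noise level, and the sampler settings are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic data from a toy deterministic model y = a*x plus normal error.
x = np.linspace(0.0, 1.0, 50)
y = 2.0 * x + rng.normal(0.0, 0.5, size=x.size)   # true a = 2.0, sigma = 0.5

def log_post(a, sigma=0.5):
    """Log posterior of a under a flat prior and normal errors."""
    resid = y - a * x
    return -0.5 * np.sum((resid / sigma) ** 2)

# Random-walk Metropolis sampling of the posterior of a.
a_cur, lp_cur = 0.0, log_post(0.0)
samples = []
for _ in range(5000):
    a_prop = a_cur + rng.normal(0.0, 0.3)
    lp_prop = log_post(a_prop)
    if np.log(rng.uniform()) < lp_prop - lp_cur:   # accept/reject step
        a_cur, lp_cur = a_prop, lp_prop
    samples.append(a_cur)

posterior = np.array(samples[1000:])   # discard burn-in
```

The spread of `posterior` quantifies parameter uncertainty; pushing the samples back through the deterministic model would give the prediction uncertainty the abstract refers to.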
A Bayesian approach to model uncertainty
International Nuclear Information System (INIS)
Buslik, A.
1994-01-01
A Bayesian approach to model uncertainty is taken. For the case of a finite number of alternative models, the model uncertainty is equivalent to parameter uncertainty. A derivation based on Savage's partition problem is given
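The finite-model case described here, where model uncertainty reduces to uncertainty over a discrete parameter, can be sketched directly with Bayes' rule. The two candidate models, the prior, and the data below are illustrative assumptions.

```python
import numpy as np

# Two candidate models for a success probability (illustrative values).
p_models = np.array([0.1, 0.3])
prior = np.array([0.5, 0.5])

# Observe 4 successes in 20 Bernoulli trials; binomial-kernel likelihoods.
k, n = 4, 20
like = p_models ** k * (1 - p_models) ** (n - k)

# Posterior model probabilities: model uncertainty as parameter uncertainty.
post = prior * like
post /= post.sum()

# Model-averaged predictive probability of success on the next trial.
p_avg = float(np.dot(post, p_models))
```

The model index behaves exactly like a discrete parameter: the same update rule applies, and predictions average over the surviving model probabilities.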
Model uncertainty in growth empirics
Prüfer, P.
2008-01-01
This thesis applies so-called Bayesian model averaging (BMA) to three different economic questions substantially exposed to model uncertainty. Chapter 2 addresses a major issue of modern development economics: the analysis of the determinants of pro-poor growth (PPG), which seeks to combine high
The conceptualization model problem—surprise
Bredehoeft, John
2005-03-01
The foundation of model analysis is the conceptual model. Surprise is defined as new data that renders the prevailing conceptual model invalid; as defined here it represents a paradigm shift. Limited empirical data indicate that surprises occur in 20-30% of model analyses. These data suggest that groundwater analysts have difficulty selecting the appropriate conceptual model. There is no ready remedy to the conceptual model problem other than (1) to collect as much data as is feasible, using all applicable methods—a complementary data collection methodology can lead to new information that changes the prevailing conceptual model, and (2) for the analyst to remain open to the fact that the conceptual model can change dramatically as more information is collected. In the final analysis, the hydrogeologist makes a subjective decision on the appropriate conceptual model. The conceptualization problem does not render models unusable. The problem introduces an uncertainty that often is not widely recognized. Conceptual model uncertainty is exacerbated in making long-term predictions of system performance.
Uncertainty modeling and decision support
International Nuclear Information System (INIS)
Yager, Ronald R.
2004-01-01
We first formulate the problem of decision making under uncertainty. The importance of the representation of our knowledge about the uncertainty in formulating a decision process is pointed out. We begin with a brief discussion of the case of probabilistic uncertainty. Next, in considerable detail, we discuss the case of decision making under ignorance. For this case the fundamental role of the attitude of the decision maker is noted and its subjective nature is emphasized. Next the case in which a Dempster-Shafer belief structure is used to model our knowledge of the uncertainty is considered. Here we also emphasize the subjective choices the decision maker must make in formulating a decision function. The case in which the uncertainty is represented by a fuzzy measure (monotonic set function) is then investigated. We then return to the Dempster-Shafer belief structure and show its relationship to the fuzzy measure. This relationship allows us to get a deeper understanding of the formulation of the decision function used in the Dempster-Shafer framework. We discuss how this deeper understanding allows a decision analyst to better make the subjective choices needed in the formulation of the decision function.
Conceptual Models for Search Engines
Hendry, D. G.; Efthimiadis, E. N.
Search engines have entered popular culture. They touch people in diverse private and public settings and thus heighten the importance of social matters such as information privacy and control, censorship, and equitable access. To fully benefit from search engines and to participate in debate about their merits, people necessarily appeal to their understandings of how they function. In this chapter we examine the conceptual understandings that people have of search engines by performing a content analysis on the sketches that 200 undergraduate and graduate students drew when asked to draw a sketch of how a search engine works. Analysis of the sketches reveals a diverse range of conceptual approaches, metaphors, representations, and misconceptions. On the whole, the conceptual models articulated by these students are simplistic. However, students with higher levels of academic achievement sketched more complete models. This research calls attention to the importance of improving students' technical knowledge of how search engines work so they can be better equipped to develop and advocate policies for how search engines should be embedded in, and restricted from, various private and public information settings.
Uncertainty quantification for environmental models
Hill, Mary C.; Lu, Dan; Kavetski, Dmitri; Clark, Martyn P.; Ye, Ming
2012-01-01
Environmental models are used to evaluate the fate of fertilizers in agricultural settings (including soil denitrification), the degradation of hydrocarbons at spill sites, and water supply for people and ecosystems in small to large basins and cities—to mention but a few applications of these models. They also play a role in understanding and diagnosing potential environmental impacts of global climate change. The models are typically mildly to extremely nonlinear. The persistent demand for enhanced dynamics and resolution to improve model realism [17] means that lengthy individual model execution times will remain common, notwithstanding continued enhancements in computer power. In addition, high-dimensional parameter spaces are often defined, which increases the number of model runs required to quantify uncertainty [2]. Some environmental modeling projects have access to extensive funding and computational resources; many do not. The many recent studies of uncertainty quantification in environmental model predictions have focused on uncertainties related to data error and sparsity of data, expert judgment expressed mathematically through prior information, poorly known parameter values, and model structure (see, for example, [1,7,9,10,13,18]). Approaches for quantifying uncertainty include frequentist (potentially with prior information [7,9]), Bayesian [13,18,19], and likelihood-based. A few of the numerous methods, including some sensitivity and inverse methods with consequences for understanding and quantifying uncertainty, are as follows: Bayesian hierarchical modeling and Bayesian model averaging; single-objective optimization with error-based weighting [7] and multi-objective optimization [3]; methods based on local derivatives [2,7,10]; screening methods like OAT (one at a time) and the method of Morris [14]; FAST (Fourier amplitude sensitivity testing) [14]; the Sobol' method [14]; randomized maximum likelihood [10]; Markov chain Monte Carlo (MCMC) [10
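Among the sensitivity methods the abstract lists, the OAT (one-at-a-time) screening idea is the simplest to sketch: perturb each input in turn and rank inputs by the size of the response change. The toy model function, perturbation size, and base point below are illustrative assumptions.

```python
import numpy as np

def model(p):
    # Toy response: strongly sensitive to p[0], weakly to p[1], not to p[2].
    return 10.0 * p[0] + 0.5 * p[1] ** 2 + 0.0 * p[2]

base = np.array([1.0, 1.0, 1.0])
delta = 0.1

# One-at-a-time perturbations: change one input, hold the others fixed.
effects = []
for i in range(base.size):
    pert = base.copy()
    pert[i] += delta
    effects.append(abs(model(pert) - model(base)))

ranking = np.argsort(effects)[::-1]   # most influential input first
```

OAT is cheap (one extra run per input) but local; the Morris and Sobol' methods cited in the abstract generalize this to global, interaction-aware measures at higher cost.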
Model Uncertainty for Bilinear Hysteretic Systems
DEFF Research Database (Denmark)
Sørensen, John Dalsgaard; Thoft-Christensen, Palle
In structural reliability analysis at least three types of uncertainty must be considered, namely physical uncertainty, statistical uncertainty, and model uncertainty (see e.g. Thoft-Christensen & Baker [1]). The physical uncertainty is usually modelled by a number of basic variables by predictive density functions, Veneziano [2]. In general, model uncertainty is the uncertainty connected with mathematical modelling of the physical reality. When structural reliability analysis is related to the concept of a failure surface (or limit state surface) in the n-dimensional basic variable space then model...
Chemical model reduction under uncertainty
Malpica Galassi, Riccardo
2017-03-06
A general strategy for analysis and reduction of uncertain chemical kinetic models is presented, and its utility is illustrated in the context of ignition of hydrocarbon fuel–air mixtures. The strategy is based on a deterministic analysis and reduction method which employs computational singular perturbation analysis to generate simplified kinetic mechanisms, starting from a detailed reference mechanism. We model uncertain quantities in the reference mechanism, namely the Arrhenius rate parameters, as random variables with prescribed uncertainty factors. We propagate this uncertainty to obtain the probability of inclusion of each reaction in the simplified mechanism. We propose probabilistic error measures to compare predictions from the uncertain reference and simplified models, based on the comparison of the uncertain dynamics of the state variables, where the mixture entropy is chosen as progress variable. We employ the construction for the simplification of an uncertain mechanism in an n-butane–air mixture homogeneous ignition case, where a 176-species, 1111-reactions detailed kinetic model for the oxidation of n-butane is used with uncertainty factors assigned to each Arrhenius rate pre-exponential coefficient. This illustration is employed to highlight the utility of the construction, and the performance of a family of simplified models produced depending on chosen thresholds on importance and marginal probabilities of the reactions.
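One idea in this abstract, that with uncertain Arrhenius parameters each reaction enters the simplified mechanism only with some probability, can be sketched by Monte Carlo. The "importance" measure, the threshold, the uncertainty factor, and all nominal values below are illustrative stand-ins, not the paper's CSP-based construction.

```python
import numpy as np

rng = np.random.default_rng(3)
n_samples, n_reactions = 2000, 5

# Nominal importance of each reaction (illustrative), with a multiplicative
# uncertainty factor of 3 on the rate coefficient, taken here to scale the
# importance log-normally.
nominal = np.array([1.0, 0.5, 0.12, 0.08, 0.01])
uf = 3.0
sigma = np.log(uf) / 2.0   # ~95% of samples within [nominal/uf, nominal*uf]

samples = nominal * rng.lognormal(0.0, sigma, size=(n_samples, n_reactions))

# A reaction is kept in a given sample if its importance exceeds a threshold;
# averaging over samples gives its probability of inclusion.
threshold = 0.1
inclusion_prob = (samples > threshold).mean(axis=0)
# Reactions far above or below the threshold are almost always kept or
# dropped; those with nominal importance near it get intermediate probability.
```

Thresholding these marginal probabilities then yields a family of simplified mechanisms, which is the trade-off the abstract describes.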
Conceptual Modeling via Logic Programming
1990-01-01
CONCEPTUAL MODELING VIA LOGIC PROGRAMMING. Personal author(s): John Burge, Bill Noah, Las Smith. [The remainder of this record is interleaved two-column OCR residue from a report documentation page and an unrecoverable data table; legible fragments note that the interpreter was named Prolog, for "Programmation en logique", and describe a program entering an infinite loop.]
Applied research in uncertainty modeling and analysis
Ayyub, Bilal
2005-01-01
Uncertainty has been a concern to engineers, managers, and scientists for many years. For a long time uncertainty was considered synonymous with the random, the stochastic, the statistical, or the probabilistic. Since the early sixties views on uncertainty have become more heterogeneous. In the past forty years numerous tools that model uncertainty, above and beyond statistics, have been proposed by several engineers and scientists. The tool or method used to model uncertainty in a specific context should be chosen by considering the features of the phenomenon under consideration, not independently of what is known about the system and what causes uncertainty. In this fascinating overview of the field, the authors provide broad coverage of uncertainty analysis/modeling and its application. Applied Research in Uncertainty Modeling and Analysis presents the perspectives of various researchers and practitioners on uncertainty analysis and modeling outside their own fields and domain expertise. Rather than focusing explicitly on...
Some illustrative examples of model uncertainty
International Nuclear Information System (INIS)
Bier, V.M.
1994-01-01
In this paper, we first discuss the view of model uncertainty proposed by Apostolakis. We then present several illustrative examples related to model uncertainty, some of which are not well handled by this formalism. Thus, Apostolakis' approach seems to be well suited to describing some types of model uncertainty, but not all. Since a comprehensive approach for characterizing and quantifying model uncertainty is not yet available, it is hoped that the examples presented here will serve as a springboard for further discussion.
Addressing Replication and Model Uncertainty
DEFF Research Database (Denmark)
Ebersberger, Bernd; Galia, Fabrice; Laursen, Keld
Many fields of strategic management are subject to an important degree of model uncertainty. This is because the true model, and therefore the selection of appropriate explanatory variables, is essentially unknown. Drawing on the literature on the determinants of innovation, and by analyzing innovation survey data for France, Germany and the UK, we conduct a ‘large-scale’ replication using the Bayesian averaging approach of classical estimators. Our method tests a wide range of determinants of innovation suggested in the prior literature, and establishes a robust set of findings on the variables which shape the introduction of new to the firm and new to the world innovations. We provide some implications for innovation research, and explore the potential application of our approach to other domains of research in strategic management.
Uncertainty and its propagation in dynamics models
International Nuclear Information System (INIS)
Devooght, J.
1994-01-01
The purpose of this paper is to bring together some characteristics due to uncertainty when we deal with dynamic models, and therefore with the propagation of uncertainty. The respective roles of uncertainty and inaccuracy are examined. A mathematical formalism based on the Chapman-Kolmogorov equation allows one to define a 'subdynamics' where the evolution equation takes the uncertainty into account. The problem of choosing or combining models is examined through a loss function associated with a decision.
Dealing with uncertainties in fusion power plant conceptual development
Kemp, R.; Lux, H.; Kovari, M.; Morris, J.; Wenninger, R.; Zohm, H.; Biel, W.; Federici, G.
2017-04-01
Although the ultimate goal of most current fusion research is to build an economically attractive power plant, the present status of physics and technology does not provide the performance necessary to achieve this goal. Therefore, in order to model how such plants may operate and what their output might be, extrapolations must be made from existing experimental data and technology. However, the expected performance of a plant built to the operating point specifications can only ever be a ‘best guess’. Extrapolations far beyond the current operating regimes are necessarily uncertain, and some important interactions, for example the coupling of conducted power from the scrape-off layer to the divertor surface, lack reliable predictive models. This means both that the demands on plant systems at the target operating point can vary significantly from the nominal value, and that the overall plant performance may potentially fall short of design targets. In this contribution we discuss tools and techniques that have been developed to assess the robustness of the operating points for the EU-DEMO tokamak-based demonstration power plant, and the consequences for its design. The aim is to make explicit the design choices and areas where improved modelling and DEMO-relevant experiments will have the greatest impact on confidence in a successful DEMO design.
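The robustness assessment described above can be illustrated with a minimal Monte Carlo sketch: uncertain physics and engineering inputs are sampled, and the fraction of samples that still meet the design targets is reported. All parameter names, distributions, and thresholds below are hypothetical placeholders, not EU-DEMO values.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical uncertain inputs for a toy power-plant operating point:
# an energy-confinement multiplier H and a divertor peak-load factor f_div
H = rng.normal(1.0, 0.1, size=10_000)
f_div = rng.lognormal(mean=0.0, sigma=0.2, size=10_000)

# Toy feasibility criteria (illustrative scalings and limits only)
net_power_ok = 500.0 * H**2 >= 450.0   # fusion power scaling with confinement
divertor_ok = 10.0 * f_div <= 13.0     # peak heat-flux limit, MW/m^2

feasible = net_power_ok & divertor_ok
print(feasible.mean())                 # fraction of sampled plants meeting targets
```

A design is then judged robust if this feasible fraction stays high under the assumed input uncertainties; sensitivity of the fraction to each input identifies where better models or experiments would help most.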
Flood modelling : Parameterisation and inflow uncertainty
Mukolwe, M.M.; Di Baldassarre, G.; Werner, M.; Solomatine, D.P.
2014-01-01
This paper presents an analysis of uncertainty in hydraulic modelling of floods, focusing on the inaccuracy caused by inflow errors and parameter uncertainty. In particular, the study develops a method to propagate the uncertainty induced by, firstly, application of a stage–discharge rating curve
On the evolution of conceptual modeling
Kaschek, Roland H.
2008-01-01
Since the 1980s, the need to overcome idiosyncrasies of the modeling approaches used in the various sub-disciplines of computing has increased. The theoretical model of evolution is used in this paper to analyze how computing and conceptual modeling have changed. It is concluded that computing has changed into a social phenomenon with a technical core and that, therefore, relying on (formal) model semantics as the sole tool for the discussion of conceptual modeling is no longer adequate. A number of l...
Reusable launch vehicle model uncertainties impact analysis
Chen, Jiaye; Mu, Rongjun; Zhang, Xin; Deng, Yanpeng
2018-03-01
The reusable launch vehicle (RLV) has the typical characteristics of a complex aerodynamic shape coupled with the propulsion system, and its flight environment is highly complicated and intensely changeable. Its model therefore has large uncertainty, which makes the nominal system quite different from the real system. Studying the influence of the uncertainties on the stability of the control system is thus of great significance for controller design. In order to improve the performance of the RLV, this paper proposes an approach for analyzing the influence of the model uncertainties. For a typical RLV, the coupled dynamics and kinematics models are built. The different factors that introduce uncertainty during model building are then analyzed and summarized. The model uncertainties are expressed with an additive uncertainty model, choosing the maximum singular values of the uncertainty matrix as the boundary model and selecting the norm of the uncertainty matrix to show how much the uncertainty factors influence the stability of the control system. The simulation results illustrate that the inertial factors have the largest influence on the stability of the system, and that it is necessary and important to take the model uncertainties into consideration before designing the controller of this kind of aircraft (such as an RLV).
Wastewater treatment modelling: dealing with uncertainties
DEFF Research Database (Denmark)
Belia, E.; Amerlinck, Y.; Benedetti, L.
2009-01-01
This paper serves as a problem statement of the issues surrounding uncertainty in wastewater treatment modelling. The paper proposes a structure for identifying the sources of uncertainty introduced during each step of an engineering project concerned with model-based design or optimisation of a wastewater treatment system. It briefly references the methods currently used to evaluate prediction accuracy and uncertainty and discusses the relevance of uncertainty evaluations in model applications. The paper aims to raise awareness and initiate a comprehensive discussion among professionals on model...
Conceptual Models Core to Good Design
Johnson, Jeff
2011-01-01
People make use of software applications in their activities, applying them as tools in carrying out tasks. That this use should be good for people--easy, effective, efficient, and enjoyable--is a principal goal of design. In this book, we present the notion of Conceptual Models, and argue that Conceptual Models are core to achieving good design. From years of helping companies create software applications, we have come to believe that building applications without Conceptual Models is just asking for designs that will be confusing and difficult to learn, remember, and use. We show how Concept
Uncertainty propagation within the UNEDF models
Haverinen, T.; Kortelainen, M.
2017-04-01
The parameters of nuclear energy density functionals have to be adjusted to experimental data. As a result, they carry a certain uncertainty, which then propagates to calculated values of observables. In the present work we quantify the statistical uncertainties of binding energies, proton quadrupole moments and proton matter radii for three UNEDF Skyrme energy density functionals by taking advantage of the knowledge of the model parameter uncertainties. We find that the uncertainty of the UNEDF models increases rapidly when going towards proton- or neutron-rich nuclei. We also investigate the impact of each model parameter on the total error budget.
Bioprocess optimization under uncertainty using ensemble modeling
Liu, Yang; Gunawan, Rudiyanto
2017-01-01
The performance of model-based bioprocess optimizations depends on the accuracy of the mathematical model. However, models of bioprocesses often have large uncertainty due to the lack of model identifiability. In the presence of such uncertainty, process optimizations that rely on the predictions of a single “best fit” model, e.g. the model resulting from a maximum likelihood parameter estimation using the available process data, may perform poorly in real life. In this study, we employed ens...
On Evaluation of Recharge Model Uncertainty: a Priori and a Posteriori
International Nuclear Information System (INIS)
Ming Ye; Karl Pohlmann; Jenny Chapman; David Shafer
2006-01-01
Hydrologic environments are open and complex, rendering them prone to multiple interpretations and mathematical descriptions. Hydrologic analyses typically rely on a single conceptual-mathematical model, which ignores conceptual model uncertainty and may result in bias in predictions and under-estimation of predictive uncertainty. This study assesses the conceptual model uncertainty residing in five recharge models developed to date by different researchers, based on different theories, for Nevada and the Death Valley area, California. A recently developed statistical method, Maximum Likelihood Bayesian Model Averaging (MLBMA), is utilized for this analysis. In a Bayesian framework, the recharge model uncertainty is assessed, a priori, using expert judgments collected through an expert elicitation in the form of prior probabilities of the models. The uncertainty is then evaluated, a posteriori, by updating the prior probabilities to estimate posterior model probabilities. The updating is conducted through maximum likelihood inverse modeling by calibrating the Death Valley Regional Flow System (DVRFS) model corresponding to each recharge model against observations of head and flow. Calibration results of the DVRFS for the five recharge models are used to estimate three information criteria (AIC, BIC, and KIC) used to rank and discriminate these models. Posterior probabilities of the five recharge models, evaluated using KIC, are used as weights to average head predictions, which gives the posterior mean and variance. The posterior quantities incorporate both parametric and conceptual model uncertainties.
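As a rough sketch of the MLBMA weighting step, posterior model probabilities can be formed from information-criterion differences via the usual exp(-ΔKIC/2) rule and then used to average per-model predictions; the total variance combines within-model and between-model terms. The KIC values, priors, and head predictions below are hypothetical.

```python
import numpy as np

# Hypothetical KIC values for five recharge models (smaller = better support)
kic = np.array([105.2, 98.7, 101.3, 112.9, 99.5])

# Posterior model probabilities: p_k proportional to prior_k * exp(-dKIC_k / 2)
prior = np.full(len(kic), 1.0 / len(kic))     # equal prior model probabilities
delta = kic - kic.min()
weights = prior * np.exp(-delta / 2.0)
posterior = weights / weights.sum()

# Hypothetical head predictions (mean, variance) from each calibrated model
mean_k = np.array([12.1, 11.4, 11.9, 13.0, 11.6])
var_k = np.array([0.8, 0.5, 0.7, 1.1, 0.6])

# Model-averaged mean; total variance = within-model + between-model spread
ma_mean = np.sum(posterior * mean_k)
ma_var = np.sum(posterior * (var_k + (mean_k - ma_mean) ** 2))
print(posterior, ma_mean, ma_var)
```

Models with negligibly small posterior probability contribute essentially nothing to the average, which is why they can be eliminated, as described in the study, without changing the result.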
Modelling of Transport Projects Uncertainties
DEFF Research Database (Denmark)
Salling, Kim Bang; Leleur, Steen
2009-01-01
This paper proposes a new way of handling the uncertainties present in transport decision making based on infrastructure appraisals. The paper suggests combining the principle of Optimism Bias, which depicts the historical tendency of overestimating transport-related benefits and underestimating......-based graphs which function as risk-related decision support for the appraised transport infrastructure project....
Study on Uncertainty and Contextual Modelling
Czech Academy of Sciences Publication Activity Database
Klimešová, Dana; Ocelíková, E.
2007-01-01
Vol. 1, No. 1 (2007), pp. 12-15. ISSN 1998-0140. Institutional research plan: CEZ:AV0Z10750506. Keywords: knowledge; contextual modelling; temporal modelling; uncertainty; knowledge management. Subject RIV: BD - Theory of Information
Some concepts of model uncertainty for performance assessments of nuclear waste repositories
International Nuclear Information System (INIS)
Eisenberg, N.A.; Sagar, B.; Wittmeyer, G.W.
1994-01-01
Models of the performance of nuclear waste repositories will be central to making regulatory decisions regarding the safety of such facilities. The conceptual model of repository performance is represented by mathematical relationships, which are usually implemented as one or more computer codes. A geologic system may allow many conceptual models that are consistent with the observations. These conceptual models may or may not have the same mathematical representation. Experience in modeling the performance of a waste repository (which is, in part, a geologic system) shows that this non-uniqueness of conceptual models is a significant source of model uncertainty. At the same time, each conceptual model has its own set of parameters and, usually, it is not possible to completely separate model uncertainty from parameter uncertainty for the repository system. Issues related to the origin of model uncertainty, its relation to parameter uncertainty, and its incorporation in safety assessments are discussed from a broad regulatory perspective. An extended example in which these issues are explored numerically is also provided.
Indian Academy of Sciences (India)
The imperfect understanding of some of the processes and physics in the carbon cycle and chemistry models generate uncertainties in the conversion of emissions to concentration. To reflect this uncertainty in the climate scenarios, the use of AOGCMs that explicitly simulate the carbon cycle and chemistry of all the ...
Modelling of Transport Projects Uncertainties
DEFF Research Database (Denmark)
Salling, Kim Bang; Leleur, Steen
2012-01-01
This paper proposes a new way of handling the uncertainties present in transport decision making based on infrastructure appraisals. The paper suggests combining the principle of Optimism Bias, which depicts the historical tendency of overestimating transport-related benefits and underestimating......-based graphs which function as risk-related decision support for the appraised transport infrastructure project. The presentation of RSF is demonstrated by using an appraisal case concerning a new airfield in the capital of Greenland, Nuuk.
Uncertainty modelling of atmospheric dispersion by stochastic ...
Indian Academy of Sciences (India)
2016-08-26
Keywords: uncertainty; polynomial chaos expansion; fuzzy set theory; cumulative distribution function; uniform distribution; membership function. Abstract. The parameters associated with an environmental dispersion model may include different kinds of variability, imprecision and uncertainty. More often, it is seen that ...
Model uncertainties in top-quark physics
Seidel, Markus
2014-01-01
The ATLAS and CMS collaborations at the Large Hadron Collider (LHC) are studying the top quark in pp collisions at 7 and 8 TeV. Due to the large integrated luminosity, precision measurements of production cross-sections and properties are often limited by systematic uncertainties. An overview of the modeling uncertainties for simulated events is given in this report.
Urban drainage models - making uncertainty analysis simple
DEFF Research Database (Denmark)
Vezzaro, Luca; Mikkelsen, Peter Steen; Deletic, Ana
2012-01-01
There is increasing awareness about uncertainties in modelling of urban drainage systems and, as such, many new methods for uncertainty analyses have been developed. Despite this, all available methods have limitations which restrict their widespread application among practitioners. Here, a modif...
Declarative representation of uncertainty in mathematical models.
Miller, Andrew K; Britten, Randall D; Nielsen, Poul M F
2012-01-01
An important aspect of multi-scale modelling is the ability to represent mathematical models in forms that can be exchanged between modellers and tools. While the development of languages like CellML and SBML has provided standardised declarative exchange formats for mathematical models, independent of the algorithm to be applied to the model, to date these standards have not provided a clear mechanism for describing parameter uncertainty. Parameter uncertainty is an inherent feature of many real systems. This uncertainty can result from a number of situations, such as: when measurements include inherent error; when parameters have unknown values and so are replaced by a probability distribution by the modeller; when a model is of an individual from a population, and parameters have unknown values for the individual, but the distribution for the population is known. We present and demonstrate an approach by which uncertainty can be described declaratively in CellML models, by utilising the extension mechanisms provided in CellML. Parameter uncertainty can be described declaratively in terms of either a univariate continuous probability density function or multiple realisations of one variable or several (typically non-independent) variables. We additionally present an extension to SED-ML (the Simulation Experiment Description Markup Language) to describe sampling sensitivity analysis simulation experiments. We demonstrate the usability of the approach by encoding a sample model in the uncertainty markup language, and by developing a software implementation of the uncertainty specification (including the SED-ML extension for sampling sensitivity analyses) in an existing CellML software library, the CellML API implementation. We used the software implementation to run sampling sensitivity analyses over the model to demonstrate that it is possible to run useful simulations on models with uncertainty encoded in this form.
Representing Uncertainty on Model Analysis Plots
Smith, Trevor I.
2016-01-01
Model analysis provides a mechanism for representing student learning as measured by standard multiple-choice surveys. The model plot contains information regarding both how likely students in a particular class are to choose the correct answer and how likely they are to choose an answer consistent with a well-documented conceptual model.…
Formalizing Linguistic Conventions for Conceptual Models
Becker, Jörg; Delfmann, Patrick; Herwig, Sebastian; Lis, Łukasz; Stein, Armin
A precondition for the appropriate analysis of conceptual models is not only their syntactic correctness but also their semantic comparability. Assuring comparability is challenging especially when models are developed by different persons. Empirical studies show that such models can vary heavily, especially in model element naming, even if they express the same issue. In contrast to most ontology-driven approaches proposing the resolution of these differences ex-post, we introduce an approach that avoids naming differences in conceptual models already during modeling. Therefore we formalize naming conventions combining domain thesauri and phrase structures based on a linguistic grammar. This allows for guiding modelers automatically during the modeling process using standardized labels for model elements. Our approach is generic, making it applicable for any modeling language.
Event-Based Conceptual Modeling
DEFF Research Database (Denmark)
Bækgaard, Lars
The paper demonstrates that a wide variety of event-based modeling approaches are based on special cases of the same general event concept, and that the general event concept can be used to unify the otherwise unrelated fields of information modeling and process modeling. A set of event-based modeling approaches are analyzed and the results are used to formulate a general event concept that can be used for unifying the seemingly unrelated event concepts. Events are characterized as short-duration processes that have participants, consequences, and properties, and that may be modeled in terms...
Parameter and Uncertainty Estimation in Groundwater Modelling
DEFF Research Database (Denmark)
Jensen, Jacob Birk
The data basis on which groundwater models are constructed is in general very incomplete, and this leads to uncertainty in model outcome. Groundwater models form the basis for many, often costly, decisions and if these are to be made on solid grounds, the uncertainty attached to model results must be quantified. This study was motivated by the need to estimate the uncertainty involved in groundwater models. Chapter 2 presents an integrated surface/subsurface unstructured finite difference model that was developed and applied to a synthetic case study. The following two chapters concern calibration... was applied. Capture zone modelling was conducted on a synthetic stationary 3-dimensional flow problem involving river, surface and groundwater flow. Simulated capture zones were illustrated as likelihood maps and compared with deterministic capture zones derived from a reference model. The results showed...
Driver Performance Model: 1. Conceptual Framework
National Research Council Canada - National Science Library
Heimerl, Joseph
2001-01-01
...'. At the present time, no such comprehensive model exists. This report discusses a conceptual framework designed to encompass the relationships, conditions, and constraints related to direct, indirect, and remote modes of driving and thus provides a guide or 'road map' for the construction and creation of a comprehensive driver performance model.
Analysis of Subjective Conceptualizations Towards Collective Conceptual Modelling
DEFF Research Database (Denmark)
Glückstad, Fumiko Kano; Herlau, Tue; Schmidt, Mikkel Nørgaard
2013-01-01
This work is conducted as a preliminary study for a project where individuals' conceptualizations of domain knowledge will thoroughly be analyzed across 150 subjects from 6 countries. The project aims at investigating how humans' conceptualizations differ according to different types of mother...
Empirical Bayesian inference and model uncertainty
International Nuclear Information System (INIS)
Poern, K.
1994-01-01
This paper presents a hierarchical or multistage empirical Bayesian approach for the estimation of uncertainty concerning the intensity of a homogeneous Poisson process. A class of contaminated gamma distributions is considered to describe the uncertainty concerning the intensity. These distributions in turn are defined through a set of secondary parameters, the knowledge of which is also described and updated via Bayes' formula. This two-stage Bayesian approach is an example where the modeling uncertainty is treated in a comprehensive way. Each contaminated gamma distribution, represented by a point in the 3D space of secondary parameters, can be considered a specific model of the uncertainty about the Poisson intensity. Then, by the empirical Bayesian method, each individual model is assigned a posterior probability.
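The core of such a gamma-Poisson Bayesian update, before the contamination and second-stage machinery described in the paper, is the conjugate rule Gamma(α, β) → Gamma(α + n, β + T) after observing n events over exposure T. The prior values below are hypothetical.

```python
import math

# Gamma(alpha, beta) prior on the Poisson intensity lambda (hypothetical)
alpha, beta = 2.0, 4.0          # prior mean = alpha / beta = 0.5 events per unit time

# Observed data: n events over exposure time T
n, T = 3, 10.0

# Conjugate update: posterior on lambda is Gamma(alpha + n, beta + T)
alpha_post, beta_post = alpha + n, beta + T
post_mean = alpha_post / beta_post
post_sd = math.sqrt(alpha_post) / beta_post   # sd of Gamma(a, b) is sqrt(a)/b
print(post_mean, post_sd)
```

The posterior mean (here 5/14 ≈ 0.357) sits between the prior mean (0.5) and the raw rate estimate n/T (0.3), with the balance set by the relative weights β and T.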
Conceptual modelling of evapotranspiration for simulations of climate change effects
Energy Technology Data Exchange (ETDEWEB)
Lindstroem, G.; Gardelin, M.; Persson, Magnus
1994-09-01
The evapotranspiration routines in existing conceptual hydrological models have been identified as one of the weaknesses which appear when these models are used for the simulation of hydrological effects of a changing climate. The hydrological models in operational use today usually have a very superficial description of evapotranspiration. They have, nevertheless, been able to yield reasonable results. The objective of this paper is to analyse and suggest modifications of existing evapotranspiration routines in conceptual hydrological models to make them more appropriate for use in simulation of the effects of a changing climate on water resources. The following modifications of the evapotranspiration routine were formulated and tested in the HBV model: Temperature anomaly correction of evapotranspiration, potential evapotranspiration by a simplified Thornthwaite type formula, interception submodel, spatially distributed evapotranspiration routine and alternative formulations of lake evapotranspiration. Sensitivity analyses were thereafter made to illustrate the effects of uncertainty in the hydrological model structure versus those of the uncertainty in the climate change predictions. 34 refs, 15 figs, 6 tabs
Nuclear data requirements for the ADS conceptual design EFIT: Uncertainty and sensitivity study
Energy Technology Data Exchange (ETDEWEB)
Garcia-Herranz, N., E-mail: nuria@din.upm.e [Departamento de Ingenieria Nuclear, Universidad Politecnica de Madrid (Spain); Instituto de Fusion Nuclear, Universidad Politecnica de Madrid (Spain); Cabellos, O. [Departamento de Ingenieria Nuclear, Universidad Politecnica de Madrid (Spain); Instituto de Fusion Nuclear, Universidad Politecnica de Madrid (Spain); Alvarez-Velarde, F. [CIEMAT (Spain); Sanz, J. [Instituto de Fusion Nuclear, Universidad Politecnica de Madrid (Spain); Departamento de Ingenieria Energetica, UNED (Spain); Gonzalez-Romero, E.M. [CIEMAT (Spain); Juan, J. [Laboratorio de Estadistica, Universidad Politecnica de Madrid (Spain)
2010-11-15
In this paper, we assess the impact of activation cross-section uncertainties on relevant fuel cycle parameters for a conceptual design of a modular European Facility for Industrial Transmutation (EFIT) with a 'double strata' fuel cycle. Next, the nuclear data requirements are evaluated so that the parameters can meet the assigned design target accuracies. Different discharge burn-up levels are considered: a low burn-up, corresponding to the equilibrium cycle, and a high burn-up level, simulating the effects on the fuel of the multi-recycling scenario. In order to perform this study, we propose a methodology in two steps. Firstly, we compute the uncertainties on the system parameters by using a Monte Carlo simulation, as it is considered the most reliable approach to address this problem. Secondly, the analysis of the results is performed by a sensitivity technique, in order to identify the relevant reaction channels and prioritize the data improvement needs. Cross-section uncertainties are taken from the EAF-2007/UN library since it includes data for all the actinides potentially present in the irradiated fuel. Relevant uncertainties in some of the fuel cycle parameters have been obtained, and we conclude with recommendations for future nuclear data measurement programs, beyond the specific results obtained with the present nuclear data files and the limited available covariance information. A comparison with the uncertainty and accuracy analysis recently published by the WPEC-Subgroup26 of the OECD using BOLNA covariance matrices is performed. Despite the differences in the transmuter reactor used for the analysis, some conclusions obtained by Subgroup26 are qualitatively corroborated, and improvements for additional cross sections are suggested.
FRSAD conceptual modeling of aboutness
Zeng, Marcia; Žumer, Maja
2012-01-01
The first comprehensive exploration of the development and use of the International Federation of Library Associations and Institutions' (IFLA) newly released model for subject authority data, covering everything from the rationale for creating the model to practical steps for implementing it.
Photovoltaic System Modeling. Uncertainty and Sensitivity Analyses
Energy Technology Data Exchange (ETDEWEB)
Hansen, Clifford W. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Martin, Curtis E. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)
2015-08-01
We report an uncertainty and sensitivity analysis for modeling AC energy from photovoltaic systems. Output from a PV system is predicted by a sequence of models. We quantify uncertainty in the output of each model using empirical distributions of each model's residuals. We propagate uncertainty through the sequence of models by sampling these distributions to obtain an empirical distribution of a PV system's output. We consider models that: (1) translate measured global horizontal, direct and global diffuse irradiance to plane-of-array irradiance; (2) estimate effective irradiance; (3) predict cell temperature; (4) estimate DC voltage, current and power; (5) reduce DC power for losses due to inefficient maximum power point tracking or mismatch among modules; and (6) convert DC to AC power. Our analysis considers a notional PV system comprising an array of First Solar FS-387 modules and a 250 kW AC inverter; we use measured irradiance and weather at Albuquerque, NM. We found the uncertainty in PV system output to be relatively small, on the order of 1% for daily energy. We found the uncertainty in the models for POA irradiance and effective irradiance to be the dominant contributors to uncertainty in predicted daily energy. Our analysis indicates that efforts to reduce the uncertainty in PV system output predictions may yield the greatest improvements by focusing on the POA and effective irradiance models.
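The residual-sampling propagation scheme described above can be sketched as follows: each model in the chain is evaluated, then perturbed with a draw from its empirical residual distribution, producing an empirical distribution of system output. The toy models and residual distributions below are placeholders, not the actual Sandia model chain.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical empirical residuals (model minus measurement) for two chained
# models: plane-of-array (POA) irradiance and DC power
resid_poa = rng.normal(0.0, 20.0, size=500)   # W/m^2, additive error
resid_pdc = rng.normal(0.0, 0.01, size=500)   # relative error

def poa_model(ghi):
    return 1.1 * ghi                          # toy transposition model

def pdc_model(poa):
    return 0.2 * poa                          # toy DC-power model

# Propagate uncertainty: perturb each model's output with a resampled residual
ghi = 800.0                                   # measured GHI, W/m^2
samples = []
for _ in range(2000):
    poa = poa_model(ghi) + rng.choice(resid_poa)
    pdc = pdc_model(poa) * (1.0 + rng.choice(resid_pdc))
    samples.append(pdc)

samples = np.array(samples)
print(samples.mean(), samples.std())          # empirical distribution of output
```

Because each stage resamples its own residuals, the spread of `samples` reflects the combined uncertainty of the whole chain, and zeroing one stage's residuals shows that stage's contribution to the total.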
Model Uncertainty Quantification Methods In Data Assimilation
Pathiraja, S. D.; Marshall, L. A.; Sharma, A.; Moradkhani, H.
2017-12-01
Data Assimilation involves utilising observations to improve model predictions in a seamless and statistically optimal fashion. Its applications are wide-ranging; from improving weather forecasts to tracking targets such as in the Apollo 11 mission. The use of Data Assimilation methods in high dimensional complex geophysical systems is an active area of research, where there exist many opportunities to enhance existing methodologies. One of the central challenges is in model uncertainty quantification; the outcome of any Data Assimilation study is strongly dependent on the uncertainties assigned to both observations and models. I focus on developing improved model uncertainty quantification methods that are applicable to challenging real world scenarios. These include developing methods for cases where the system states are only partially observed, where there is little prior knowledge of the model errors, and where the model error statistics are likely to be highly non-Gaussian.
Assessing alternative conceptual models of fracture flow
International Nuclear Information System (INIS)
Ho, C.K.
1995-01-01
The numerical code TOUGH2 was used to assess alternative conceptual models of fracture flow. The models that were considered included the equivalent continuum model (ECM) and the dual permeability (DK) model. A one-dimensional, layered, unsaturated domain was studied with a saturated bottom boundary and a constant infiltration at the top boundary. Two different infiltration rates were used in the studies. In addition, the connection areas between the fracture and matrix elements in the dual permeability model were varied. Results showed that the two conceptual models of fracture flow produced different saturation and velocity profiles, even under steady-state conditions. The magnitudes of the discrepancies were sensitive to two parameters that affected the flux between the fractures and matrix in the dual permeability model: (1) the fracture-matrix connection areas and (2) the capillary pressure gradients between the fracture and matrix elements.
Uncertainty modeling process for semantic technology
Directory of Open Access Journals (Sweden)
Rommel N. Carvalho
2016-08-01
The ubiquity of uncertainty across application domains generates a need for principled support for uncertainty management in semantically aware systems. A probabilistic ontology provides constructs for representing uncertainty in domain ontologies. While the literature has been growing on formalisms for representing uncertainty in ontologies, there remains little guidance in the knowledge engineering literature for how to design probabilistic ontologies. To address the gap, this paper presents the Uncertainty Modeling Process for Semantic Technology (UMP-ST), a new methodology for modeling probabilistic ontologies. To explain how the methodology works and to verify that it can be applied to different scenarios, this paper describes step-by-step the construction of a proof-of-concept probabilistic ontology. The resulting domain model can be used to support identification of fraud in public procurements in Brazil. While the case study illustrates the development of a probabilistic ontology in the PR-OWL probabilistic ontology language, the methodology is applicable to any ontology formalism that properly integrates uncertainty with domain semantics.
Uncertainty Quantification in Climate Modeling and Projection
Energy Technology Data Exchange (ETDEWEB)
Qian, Yun; Jackson, Charles; Giorgi, Filippo; Booth, Ben; Duan, Qingyun; Forest, Chris; Higdon, Dave; Hou, Z. Jason; Huerta, Gabriel
2016-05-01
The projection of future climate is one of the most complex problems undertaken by the scientific community. Although scientists have been striving to better understand the physical basis of the climate system and to improve climate models, the overall uncertainty in projections of future climate has not been significantly reduced (e.g., from the IPCC AR4 to AR5). With the rapid increase of complexity in Earth system models, reducing uncertainties in climate projections becomes extremely challenging. Since uncertainties always exist in climate models, interpreting the strengths and limitations of future climate projections is key to evaluating risks, and climate change information for use in Vulnerability, Impact, and Adaptation (VIA) studies should be provided with both well-characterized and well-quantified uncertainty. The workshop aimed at providing participants, many of them from developing countries, information on strategies to quantify the uncertainty in climate model projections and assess the reliability of climate change information for decision-making. The program included a mixture of lectures on fundamental concepts in Bayesian inference and sampling, applications, and hands-on computer laboratory exercises employing software packages for Bayesian inference, Markov Chain Monte Carlo methods, and global sensitivity analyses. The lectures covered a range of scientific issues underlying the evaluation of uncertainties in climate projections, such as the effects of uncertain initial and boundary conditions, uncertain physics, and limitations of observational records. Progress in quantitatively estimating uncertainties in hydrologic, land surface, and atmospheric models at both regional and global scales was also reviewed. The application of Uncertainty Quantification (UQ) concepts to coupled climate system models is still in its infancy. The Coupled Model Intercomparison Project (CMIP) multi-model ensemble currently represents the primary data for
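The global sensitivity analyses covered by the workshop can be illustrated with a variance-based (Sobol-type) first-order index, estimated here by binning. The two-input response below is a toy stand-in with invented coefficients, not any actual climate model; the index Var(E[Y|X1])/Var(Y) measures how much of the output variance input X1 explains on its own.

```python
import numpy as np

rng = np.random.default_rng(4)
N = 100_000

# Hypothetical uncertain inputs (labels and spreads are illustrative only)
x1 = rng.normal(3.0, 0.7, N)   # "climate sensitivity"
x2 = rng.normal(1.0, 0.3, N)   # "forcing scale"
y = 0.8 * x1 * x2              # toy multiplicative response

# First-order Sobol index of x1: Var(E[Y|X1]) / Var(Y), with the
# conditional expectation approximated by quantile-binned means of y.
bins = np.quantile(x1, np.linspace(0, 1, 51))
idx = np.clip(np.digitize(x1, bins) - 1, 0, 49)
cond_means = np.array([y[idx == b].mean() for b in range(50)])
counts = np.array([(idx == b).sum() for b in range(50)])
s1 = np.average((cond_means - y.mean()) ** 2, weights=counts) / y.var()
```

For this toy response the analytic value is Var(X1)/Var(X1 X2) with the means folded in, about 0.36, so most of the variance here comes through the second input.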
Conceptual modeling in social and physical contexts
Wieringa, Roelf J.
The history of the computing sciences shows a shift in attention from the syntactic properties of computation to the semantics of computing in the real world. A large part of this shift has been brought about by the introduction of conceptual modeling languages. In this paper I review this history
Logistics and Transport - a conceptual model
DEFF Research Database (Denmark)
Jespersen, Per Homann; Drewes, Lise
2004-01-01
This paper describes how the freight transport sector is influenced by logistical principles of production and distribution. It introduces new ways of understanding freight transport as an integrated part of the changing trends of mobility. By introducing a conceptual model for understanding the interaction between logistics and transport, it points to ways to overcome inherent methodological difficulties when studying this relation.
Modeling Uncertainty in Climate Change: A Multi-Model Comparison
Energy Technology Data Exchange (ETDEWEB)
Gillingham, Kenneth; Nordhaus, William; Anthoff, David; Blanford, Geoffrey J.; Bosetti, Valentina; Christensen, Peter; McJeon, Haewon C.; Reilly, J. M.; Sztorc, Paul
2015-10-01
The economics of climate change involves a vast array of uncertainties, complicating both the analysis and development of climate policy. This study presents the results of the first comprehensive study of uncertainty in climate change using multiple integrated assessment models. The study looks at model and parametric uncertainties for population, total factor productivity, and climate sensitivity and estimates the pdfs of key output variables, including CO_{2} concentrations, temperature, damages, and the social cost of carbon (SCC). One key finding is that parametric uncertainty is more important than uncertainty in model structure. Our resulting pdfs also provide insight on tail events.
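The study's key finding — parametric uncertainty dominating model-structure uncertainty — can be read through the law of total variance: total output variance splits exactly into the variance of the per-model means (structural) plus the mean of the within-model variances (parametric). The sketch below uses invented numbers, not the study's estimates of the social cost of carbon.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical SCC draws: three models differ in their central estimate
# (structural uncertainty), each with a wide parametric spread ($/tCO2).
model_means = np.array([20.0, 25.0, 30.0])
draws = np.stack([rng.normal(m, 15.0, 10_000) for m in model_means])

within = draws.var(axis=1).mean()    # parametric component
between = draws.mean(axis=1).var()   # model-structure component
total = draws.ravel().var()          # equals within + between (equal group sizes)
```

With these numbers the within-model (parametric) component is roughly an order of magnitude larger than the between-model component, mirroring the paper's qualitative conclusion.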
International Nuclear Information System (INIS)
Ericsson, Lars O.; Holmen, Johan
2010-12-01
The primary aim of this report is to present a supplementary, in-depth evaluation of certain conceptual simplifications, descriptions, and model uncertainties in conjunction with regional groundwater simulation, referring in the first instance to model depth, topography, groundwater table level, and boundary conditions. Implementation was based on geoscientifically available data compilations from the Smaaland region, but different conceptual assumptions have been analysed.
International Nuclear Information System (INIS)
Sig Drellack, Lance Prothro
2007-01-01
The Underground Test Area (UGTA) Project of the U.S. Department of Energy, National Nuclear Security Administration Nevada Site Office is in the process of assessing and developing regulatory decision options based on modeling predictions of contaminant transport from underground testing of nuclear weapons at the Nevada Test Site (NTS). The UGTA Project is attempting to develop an effective modeling strategy that addresses and quantifies multiple components of uncertainty, including natural variability, parameter uncertainty, conceptual/model uncertainty, and decision uncertainty in translating model results into regulatory requirements. The modeling task presents multiple unique challenges to the hydrological sciences as a result of the complex fractured and faulted hydrostratigraphy, the distributed locations of sources, the suite of reactive and non-reactive radionuclides, and uncertainty in conceptual models. Characterization of the hydrogeologic system is difficult and expensive because of deep groundwater in the arid desert setting and the large spatial extent of the NTS. Therefore, conceptual model uncertainty is partially addressed through the development of multiple alternative conceptual models of the hydrostratigraphic framework and multiple alternative models of recharge and discharge. Uncertainty in boundary conditions is assessed through development of alternative groundwater fluxes through multiple simulations using the regional groundwater flow model. Calibration of alternative models to heads and measured or inferred fluxes has not proven to provide clear measures of model quality. Therefore, model screening by comparison to independently derived natural geochemical mixing targets through cluster analysis has also been invoked to evaluate differences between alternative conceptual models. With multiple alternative flow models advanced, the sensitivity of transport predictions to parameter uncertainty is assessed through Monte Carlo simulations. The
Spatial Uncertainty Analysis of Ecological Models
Energy Technology Data Exchange (ETDEWEB)
Jager, H.I.; Ashwood, T.L.; Jackson, B.L.; King, A.W.
2000-09-02
The authors evaluated the sensitivity of a habitat model and a source-sink population model to spatial uncertainty in landscapes with different statistical properties and for hypothetical species with different habitat requirements. Sequential indicator simulation generated alternative landscapes from a source map. Their results showed that spatial uncertainty was highest for landscapes in which suitable habitat was rare and spatially uncorrelated. Although they were able to exert some control over the degree of spatial uncertainty by varying the sampling density drawn from the source map, intrinsic spatial properties (i.e., average frequency and degree of spatial autocorrelation) played a dominant role in determining variation among realized maps. To evaluate the ecological significance of landscape variation, they compared the variation in predictions from a simple habitat model to variation among landscapes for three species types. Spatial uncertainty in predictions of the amount of source habitat depended on both the spatial life history characteristics of the species and the statistical attributes of the synthetic landscapes. Species differences were greatest when the landscape contained a high proportion of suitable habitat. The predicted amount of source habitat was greater for edge-dependent (interior) species in landscapes with spatially uncorrelated (correlated) suitable habitat. A source-sink model demonstrated that, although variation among landscapes resulted in relatively little variation in overall population growth rate, this spatial uncertainty was sufficient in some situations to produce qualitatively different predictions about population viability (i.e., population decline vs. increase).
An evaluation of uncertainties in radioecological models
International Nuclear Information System (INIS)
Hoffmann, F.O.; Little, C.A.; Miller, C.W.; Dunning, D.E. Jr.; Rupp, E.M.; Shor, R.W.; Schaeffer, D.L.; Baes, C.F. III
1978-01-01
The paper presents results of analyses for seven selected parameters commonly used in environmental radiological assessment models, assuming that the available data are representative of the true distribution of parameter values and that their respective distributions are lognormal. Estimates of the most probable, median, mean, and 99th percentile for each parameter are given and compared to U.S. NRC default values. The regulatory default values are generally greater than the median values for the selected parameters, but some are associated with percentiles significantly less than the 50th. The largest uncertainties appear to be associated with aquatic bioaccumulation factors for freshwater fish. Approximately one order of magnitude separates median values and values of the 99th percentile. The uncertainty is also estimated for the annual dose rate predicted by a multiplicative chain model for the transport of molecular iodine-131 via the air-pasture-cow-milk-child's thyroid pathway. The value for the 99th percentile is ten times larger than the median value of the predicted dose normalized for a given air concentration of 131I2. About 72% of the uncertainty in this model is contributed by the dose conversion factor and the milk transfer coefficient. Considering the difficulties in obtaining a reliable quantification of the true uncertainties in model predictions, methods for taking these uncertainties into account when determining compliance with regulatory statutes are discussed. (orig./HP)
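A multiplicative chain of independent lognormal factors is itself lognormal, with log-variances adding, so the 99th-percentile-to-median ratio can be propagated by Monte Carlo as sketched below. The medians and geometric standard deviations (GSDs) here are hypothetical placeholders, not the paper's fitted values.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100_000

# Lognormal factors of a multiplicative chain (air -> pasture -> cow ->
# milk -> child's thyroid). Medians normalized to 1; GSDs are invented.
def lognormal(median, gsd, size):
    return rng.lognormal(np.log(median), np.log(gsd), size)

interception  = lognormal(1.0, 1.5, N)  # deposition/interception factor
milk_transfer = lognormal(1.0, 1.7, N)  # milk transfer coefficient
dose_factor   = lognormal(1.0, 2.0, N)  # dose conversion factor

# Normalized dose rate: product of the chain factors
dose = interception * milk_transfer * dose_factor

ratio = np.percentile(dose, 99) / np.median(dose)
```

Analytically, the combined log-sigma is sqrt(ln(1.5)^2 + ln(1.7)^2 + ln(2.0)^2) ≈ 0.96, giving a 99th/median ratio of exp(2.326 × 0.96), close to the order-of-magnitude spread the paper reports.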
Estimating Coastal Digital Elevation Model (DEM) Uncertainty
Amante, C.; Mesick, S.
2017-12-01
Integrated bathymetric-topographic digital elevation models (DEMs) are representations of the Earth's solid surface and are fundamental to the modeling of coastal processes, including tsunami, storm surge, and sea-level rise inundation. Deviations in elevation values from the actual seabed or land surface constitute errors in DEMs, which originate from numerous sources, including: (i) the source elevation measurements (e.g., multibeam sonar, lidar), (ii) the interpolative gridding technique (e.g., spline, kriging) used to estimate elevations in areas unconstrained by source measurements, and (iii) the datum transformation used to convert bathymetric and topographic data to common vertical reference systems. The magnitude and spatial distribution of the errors from these sources are typically unknown, and the lack of knowledge regarding these errors represents the vertical uncertainty in the DEM. The National Oceanic and Atmospheric Administration (NOAA) National Centers for Environmental Information (NCEI) has developed DEMs for more than 200 coastal communities. This study presents a methodology developed at NOAA NCEI to derive accompanying uncertainty surfaces that estimate DEM errors at the individual cell-level. The development of high-resolution (1/9th arc-second), integrated bathymetric-topographic DEMs along the southwest coast of Florida serves as the case study for deriving uncertainty surfaces. The estimated uncertainty can then be propagated into the modeling of coastal processes that utilize DEMs. Incorporating the uncertainty produces more reliable modeling results, and in turn, better-informed coastal management decisions.
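One simple way to build a per-cell uncertainty surface from the three error sources named above is to combine them in quadrature, treating them as independent. This is an illustrative assumption, not necessarily how the NCEI methodology weights its sources; all values below are invented.

```python
import numpy as np

# Per-cell 1-sigma errors in meters (hypothetical 2x2 grid)
measurement_err = np.array([[0.10, 0.30],
                            [0.20, 0.50]])   # source sonar/lidar error
interp_err      = np.array([[0.00, 0.80],
                            [0.40, 0.00]])   # gridding error; 0 at data cells
datum_err       = 0.05                        # vertical datum transformation

# Root-sum-square combination assuming independent error sources
total = np.sqrt(measurement_err**2 + interp_err**2 + datum_err**2)
```

Cells far from source soundings inherit a large interpolation term, so the combined surface naturally flags where the DEM is least constrained.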
Conceptual geohydrological model of the separations area
International Nuclear Information System (INIS)
Root, R.W.; Marine, I.W.
1977-01-01
Subsurface drilling in and around the Separations Areas (F-Area and H-Area of the Savannah River Plant) is providing detailed information for a conceptual model of the geology and hydrology underlying these areas. This conceptual model will provide the framework needed for a mathematical model of groundwater movement beneath these areas. Existing information substantiates the presence of two areally extensive clay layers and several discontinuous clay and sandy-clay layers. These layers occur in and between beds of clayey and silty sand that make up most of the subsurface material. Within these sand beds are geologic units of differing hydraulic conductivity. For the present scale of the model, the subsurface information is considered adequate in H-Area, but additional drilling is planned in F-Area
Model Uncertainty for Bilinear Hysteretic Systems
DEFF Research Database (Denmark)
Sørensen, John Dalsgaard; Thoft-Christensen, Palle
1984-01-01
is related to the concept of a failure surface (or limit state surface) in the n-dimensional basic variable space then model uncertainty is at least due to the neglected variables, the modelling of the failure surface and the computational technique used. A more precise definition is given in section 2...
Uncertainty quantification in wind farm flow models
DEFF Research Database (Denmark)
Murcia Leon, Juan Pablo
uncertainties through a model chain are presented and applied to several wind energy related problems such as: annual energy production estimation, wind turbine power curve estimation, wake model calibration and validation, and estimation of lifetime equivalent fatigue loads on a wind turbine. Statistical...
User verification of the FRBR conceptual model
Pisanski, Jan; Žumer, Maja
2015-01-01
Purpose - The paper aims to build on a previous study of mental models of the bibliographic universe, which found that the Functional Requirements for Bibliographic Records (FRBR) conceptual model is intuitive. Design/methodology/approach - A total of 120 participants were presented with a list of bibliographic entities and six graphs each. They were asked to choose the graph they thought best represented the relationships between the entities described. Findings - The graph based on the FRBR ...
ORGANIZATIONAL LEARNING AND PERFORMANCE. A CONCEPTUAL MODEL
Alexandra Luciana GUȚĂ
2013-01-01
Through this paper, our main objective is to propose a conceptual model that links the notions of organizational learning (as a capability and as a process) and organizational performance. Our contribution consists of analyzing the literature on organizational learning and organizational performance and proposing an integrated model that comprises: organizational learning capability, the process of organizational learning, organizational performance, human capital (the value and uniqueness...
Uncertainty in biology a computational modeling approach
Gomez-Cabrero, David
2016-01-01
Computational modeling of biomedical processes is gaining more and more weight in current research into the etiology of biomedical problems and potential treatment strategies. Computational modeling allows researchers to reduce, refine and replace animal experimentation as well as to translate findings obtained in these experiments to the human background. However, these biomedical problems are inherently complex, with a myriad of influencing factors, which strongly complicates the model building and validation process. This book addresses four main issues related to the building and validation of computational models of biomedical processes: model establishment under uncertainty; model selection and parameter fitting; sensitivity analysis and model adaptation; and model predictions under uncertainty. In each of the abovementioned areas, the book discusses a number of key techniques by means of a general theoretical description followed by one or more practical examples. This book is intended for graduate stude...
Return Predictability, Model Uncertainty, and Robust Investment
DEFF Research Database (Denmark)
Lukas, Manuel
Stock return predictability is subject to great uncertainty. In this paper we use the model confidence set approach to quantify uncertainty about expected utility from investment, accounting for potential return predictability. For monthly US data and six representative return prediction models, we find that confidence sets are very wide, change significantly with the predictor variables, and frequently include expected utilities for which the investor prefers not to invest. The latter motivates a robust investment strategy maximizing the minimal element of the confidence set. The robust investor allocates a much lower share of wealth to stocks compared to a standard investor.
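The max-min logic of the robust strategy can be sketched with a mean-variance utility and a hypothetical confidence set of expected-return models. The numbers below (risk aversion, variance, model means) are invented for illustration; the point is that when the set contains a model with negative expected excess return, the max-min investor holds no stocks while a single-best-model investor takes a sizable position.

```python
import numpy as np

gamma = 4.0     # risk aversion (illustrative)
sigma2 = 0.04   # return variance (illustrative)
# Expected excess returns from the models in the confidence set (hypothetical)
mus = np.array([-0.01, 0.02, 0.05])

shares = np.linspace(0.0, 1.0, 101)
# Mean-variance expected utility for each (model, equity share) pair
U = mus[:, None] * shares - 0.5 * gamma * sigma2 * shares**2

robust_share = shares[np.argmax(U.min(axis=0))]      # max-min over the set
standard_share = shares[np.argmax(U[mus.argmax()])]  # best single model only
```

Here the pessimistic model drags the worst-case utility below zero for any positive share, so the robust allocation is zero, whereas the standard investor allocates mu/(gamma*sigma2) ≈ 0.31 of wealth to stocks.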
Uncertainty calculation in transport models and forecasts
DEFF Research Database (Denmark)
Manzo, Stefano; Prato, Carlo Giacomo
The last paper (forthcoming in the European Journal of Transport and Infrastructure Research, 15-3, 64-72) examined uncertainty in the spatial composition of residence and workplace locations in the Danish National Transport Model. Despite the evidence that spatial structure influences travel behaviour ... to increase the quality of the decision process and to develop robust or adaptive plans. In fact, project evaluation processes that do not take model uncertainty into account produce not fully informative and potentially misleading results, so increasing the risk inherent in the decision to be taken...
Uncertainty Analysis Framework - Hanford Site-Wide Groundwater Flow and Transport Model
Energy Technology Data Exchange (ETDEWEB)
Cole, Charles R.; Bergeron, Marcel P.; Murray, Christopher J.; Thorne, Paul D.; Wurstner, Signe K.; Rogers, Phillip M.
2001-11-09
Pacific Northwest National Laboratory (PNNL) embarked on a new initiative to strengthen the technical defensibility of the predictions being made with a site-wide groundwater flow and transport model at the U.S. Department of Energy Hanford Site in southeastern Washington State. In FY 2000, the focus of the initiative was on the characterization of major uncertainties in the current conceptual model that would affect model predictions. The long-term goals of the initiative are the development and implementation of an uncertainty estimation methodology in future assessments and analyses using the site-wide model. This report focuses on the development and implementation of an uncertainty analysis framework.
Quantifying Registration Uncertainty With Sparse Bayesian Modelling.
Le Folgoc, Loic; Delingette, Herve; Criminisi, Antonio; Ayache, Nicholas
2017-02-01
We investigate uncertainty quantification under a sparse Bayesian model of medical image registration. Bayesian modelling has proven powerful to automate the tuning of registration hyperparameters, such as the trade-off between the data and regularization functionals. Sparsity-inducing priors have recently been used to render the parametrization itself adaptive and data-driven. The sparse prior on transformation parameters effectively favors the use of coarse basis functions to capture the global trends in the visible motion while finer, highly localized bases are introduced only in the presence of coherent image information and motion. In earlier work, approximate inference under the sparse Bayesian model was tackled in an efficient Variational Bayes (VB) framework. In this paper we are interested in the theoretical and empirical quality of uncertainty estimates derived under this approximate scheme vs. under the exact model. We implement an (asymptotically) exact inference scheme based on reversible jump Markov Chain Monte Carlo (MCMC) sampling to characterize the posterior distribution of the transformation and compare the predictions of the VB and MCMC based methods. The true posterior distribution under the sparse Bayesian model is found to be meaningful: orders of magnitude for the estimated uncertainty are quantitatively reasonable, the uncertainty is higher in textureless regions and lower in the direction of strong intensity gradients.
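The MCMC-versus-approximation comparison at the heart of the paper can be illustrated on a one-dimensional toy posterior: a random-walk Metropolis sampler characterizes the exact spread, which a Gaussian (Laplace-style) approximation at the mode overstates when the tails are tightened by a quartic term. The log-density below is invented for illustration and has nothing to do with image registration itself.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy non-Gaussian log-posterior; the quartic term tightens the tails
def log_post(x):
    return -0.5 * x**2 - 0.1 * x**4

# Random-walk Metropolis sampler
x, lp, samples = 0.0, log_post(0.0), []
for _ in range(20_000):
    prop = x + rng.normal(0.0, 1.0)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:   # accept/reject step
        x, lp = prop, lp_prop
    samples.append(x)
samples = np.array(samples[2_000:])            # discard burn-in

mcmc_std = samples.std()
# Laplace approximation: second derivative at the mode (x=0) is -1,
# so the Gaussian approximation has standard deviation 1.
laplace_std = 1.0
```

The sampled standard deviation comes out below the Laplace value, the same kind of discrepancy between approximate and exact posterior spread that the paper quantifies with reversible jump MCMC.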
Bioprocess optimization under uncertainty using ensemble modeling.
Liu, Yang; Gunawan, Rudiyanto
2017-02-20
The performance of model-based bioprocess optimizations depends on the accuracy of the mathematical model. However, models of bioprocesses often have large uncertainty due to the lack of model identifiability. In the presence of such uncertainty, process optimizations that rely on the predictions of a single "best fit" model, e.g. the model resulting from a maximum likelihood parameter estimation using the available process data, may perform poorly in real life. In this study, we employed ensemble modeling to account for model uncertainty in bioprocess optimization. More specifically, we adopted a Bayesian approach to define the posterior distribution of the model parameters, based on which we generated an ensemble of model parameters using a uniformly distributed sampling of the parameter confidence region. The ensemble-based process optimization involved maximizing the lower confidence bound of the desired bioprocess objective (e.g. yield or product titer), using a mean-standard deviation utility function. We demonstrated the performance and robustness of the proposed strategy in an application to a monoclonal antibody batch production by mammalian hybridoma cell culture.
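The mean-standard deviation utility can be sketched on a toy process model: evaluate the objective over a parameter ensemble standing in for the posterior sample, then maximize the lower confidence bound mean − k·std over the control variable. The kinetics, parameter distribution, and constant k below are all invented for illustration, not the paper's hybridoma model.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy "bioprocess": titer as a function of feed rate u with an uncertain
# decay parameter theta; the ensemble stands in for a posterior sample.
theta = rng.normal(1.0, 0.2, size=200)[:, None]   # hypothetical posterior

u = np.linspace(0.1, 5.0, 200)[None, :]
pred = u * np.exp(-theta * u / 2.0)               # ensemble x control grid

k = 2.0                                            # weight on the std penalty
lcb = pred.mean(axis=0) - k * pred.std(axis=0)     # lower confidence bound

u_nominal = u[0, np.argmax(pred.mean(axis=0))]     # best-on-average feed rate
u_robust = u[0, np.argmax(lcb)]                    # mean - k*std optimum
```

Because prediction uncertainty grows with the feed rate in this toy model, the robust optimum sits at a lower feed rate than the best-on-average one, trading some expected titer for reliability.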
Indian Academy of Sciences (India)
To reflect this uncertainty in the climate scenarios, the use of AOGCMs that explicitly simulate the carbon cycle and the chemistry of all the relevant substances is needed. The Hadley Centre has developed a version of the climate model that allows the effect of climate change on the carbon cycle and its feedback into climate, to be ...
Evidential Model Validation under Epistemic Uncertainty
Directory of Open Access Journals (Sweden)
Wei Deng
2018-01-01
Full Text Available This paper proposes evidence-theory-based methods to both quantify epistemic uncertainty and validate computational models. Three types of epistemic uncertainty concerning input model data, that is, sparse points, intervals, and probability distributions with uncertain parameters, are considered. Through the proposed methods, the given data will be described as corresponding probability distributions for uncertainty propagation in the computational model, and thus for model validation. The proposed evidential model validation method is inspired by the idea of Bayesian hypothesis testing and the Bayes factor, which compares the model predictions with the observed experimental data so as to assess the predictive capability of the model and help the decision making of model acceptance. Building on the idea of the Bayes factor, the frame of discernment of Dempster-Shafer evidence theory is constituted and the basic probability assignment (BPA) is determined. Because the proposed validation method is evidence based, the robustness of the result can be guaranteed, and the most evidence-supported hypothesis about the model testing will be favored by the BPA. The validity of the proposed methods is illustrated through a numerical example.
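The Bayes-factor step can be sketched as follows: compute the likelihood of the observations under the model's predictive distribution and under a diffuse alternative, and map the resulting Bayes factor to masses over the frame {valid, invalid}. The observations, predictive spread, and the simple normalization used for the mapping are all illustrative assumptions, not the paper's assignment rule.

```python
import numpy as np

def normal_pdf(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

# Hypothetical: the model predicts y = 10 with predictive sigma 1;
# the "model invalid" alternative allows a diffuse range of outcomes.
obs = np.array([9.6, 10.3, 9.9])
lik_valid = np.prod(normal_pdf(obs, 10.0, 1.0))
lik_invalid = np.prod(normal_pdf(obs, 10.0, 5.0))   # diffuse alternative

bayes_factor = lik_valid / lik_invalid

# Map the Bayes factor to a basic probability assignment (BPA) over the
# frame {valid, invalid} -- a simple normalization, one possible choice.
m_valid = bayes_factor / (1.0 + bayes_factor)
m_invalid = 1.0 / (1.0 + bayes_factor)
```

Data tightly clustered around the prediction yield a Bayes factor well above one, so nearly all the mass lands on the "model valid" hypothesis.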
Uncertainty in reactive transport geochemical modelling
International Nuclear Information System (INIS)
Oedegaard-Jensen, A.; Ekberg, C.
2005-01-01
Full text of publication follows: Geochemical modelling is one way of predicting the transport of, e.g., radionuclides in a rock formation. In a rock formation there will be fractures in which water and dissolved species can be transported. The composition of the water and the rock can either increase or decrease the mobility of the transported entities. When doing simulations on the mobility or transport of different species, one has to know the exact water composition, the exact flow rates in the fracture and in the surrounding rock, the porosity, and which minerals the rock is composed of. The problem with simulations on rocks is that the rock itself is not uniform, i.e., larger fractures in some areas and smaller in others, which can give different water flows. The rock composition can differ between areas. In addition to this variance in the rock, there are also problems with measuring the physical parameters used in a simulation. All measurements will perturb the rock, and this perturbation will result in more or less correct values of the parameters of interest. The analytical methods used are also encumbered with uncertainties, which in this case are added to the uncertainty from the perturbation of the analysed parameters. When doing simulations, the effect of the uncertainties must be taken into account. As computers are getting faster and faster, the complexity of simulated systems increases, which also increases the uncertainty in the results from the simulations. In this paper we will show how the uncertainty in the different parameters will affect the solubility and mobility of different species. Small uncertainties in the input parameters can result in large uncertainties in the end. (authors)
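The claim that small input uncertainties grow large in the output can be sketched with a Monte Carlo propagation through a minimal solubility relation. The mineral stoichiometry, the solubility product, and its spread below are invented for illustration, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 50_000

# Solubility of a hypothetical 1:1 mineral, S = sqrt(Ksp), with an
# uncertain log10 Ksp (mean and spread are illustrative only).
log_ksp = rng.normal(-8.0, 0.5, N)       # +/- 0.5 log units of uncertainty
solubility = np.sqrt(10.0 ** log_ksp)    # mol/L

# Spread of the propagated solubility: 95% interval ratio
spread = np.percentile(solubility, 97.5) / np.percentile(solubility, 2.5)
```

Half a log-unit of uncertainty in the equilibrium constant already propagates to nearly an order of magnitude of spread in the predicted solubility, the kind of amplification the abstract warns about.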
The conceptual model of organization social responsibility
LUO, Lan; WEI, Jingfu
2014-01-01
With the development of research on CSR, people have noticed more and more deeply that corporations should take responsibility. Should other organizations besides corporations also take responsibilities beyond their field? This paper puts forward the concept of organization social responsibility on the basis of the concept of corporate social responsibility and other theories. Conceptual models are built based on this conception, introducing the OSR from three angles: the types of organi...
Directory of Open Access Journals (Sweden)
Herwig Reiter
2010-01-01
Full Text Available The article proposes a general, empirically grounded model for analyzing biographical uncertainty. The model is based on findings from a qualitative-explorative study of transforming meanings of unemployment among young people in post-Soviet Lithuania. In a first step, the particular features of the uncertainty puzzle in post-communist youth transitions are briefly discussed. A historical event like the collapse of state socialism in Europe, similar to the recent financial and economic crisis, is a generator of uncertainty par excellence: it undermines the foundations of societies and the taken-for-grantedness of related expectations. Against this background, the case of a young woman and how she responds to the novel threat of unemployment in the transition to the world of work is introduced. Her uncertainty management in the specific time perspective of certainty production is then conceptually rephrased by distinguishing three types or levels of biographical uncertainty: knowledge, outcome, and recognition uncertainty. Biographical uncertainty, it is argued, is empirically observable through the analysis of acting and projecting at the biographical level. The final part synthesizes the empirical findings and the conceptual discussion into a stratification model of biographical uncertainty as a general tool for the biographical analysis of uncertainty phenomena. URN: urn:nbn:de:0114-fqs100120
Accept & Reject Statement-Based Uncertainty Models
E. Quaeghebeur (Erik); G. de Cooman; F. Hermans (Felienne)
2015-01-01
textabstractWe develop a framework for modelling and reasoning with uncertainty based on accept and reject statements about gambles. It generalises the frameworks found in the literature based on statements of acceptability, desirability, or favourability and clarifies their relative position. Next
Parametric uncertainty in optical image modeling
Potzick, James; Marx, Egon; Davidson, Mark
2006-10-01
Optical photomask feature metrology and wafer exposure process simulation both rely on optical image modeling for accurate results. While it is fair to question the accuracies of the available models, model results also depend on several input parameters describing the object and imaging system. Errors in these parameter values can lead to significant errors in the modeled image. These parameters include wavelength, illumination and objective NA's, magnification, focus, etc. for the optical system, and topography, complex index of refraction n and k, etc. for the object. In this paper each input parameter is varied over a range about its nominal value and the corresponding images simulated. Second order parameter interactions are not explored. Using the scenario of the optical measurement of photomask features, these parametric sensitivities are quantified by calculating the apparent change of the measured linewidth for a small change in the relevant parameter. Then, using reasonable values for the estimated uncertainties of these parameters, the parametric linewidth uncertainties can be calculated and combined to give a lower limit to the linewidth measurement uncertainty for those parameter uncertainties.
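The one-parameter-at-a-time procedure described above — perturb each input about its nominal value, convert the apparent linewidth change into a per-parameter uncertainty, and combine — can be sketched as below. The linewidth function is a hypothetical linear-plus-quadratic stand-in for the full optical image simulation, and all sensitivities and uncertainties are invented.

```python
import numpy as np

# Hypothetical linewidth response (nm) to three of the input parameters;
# a stand-in for the optical image model, not the real simulator.
def linewidth(wavelength, na, focus):
    return 100.0 + 50.0 * (wavelength - 0.365) - 30.0 * (na - 0.9) + 8.0 * focus**2

nominal = dict(wavelength=0.365, na=0.9, focus=0.1)
uncert  = dict(wavelength=0.001, na=0.01, focus=0.05)  # illustrative 1-sigma

contributions = {}
for p in nominal:
    hi = dict(nominal); hi[p] += uncert[p]
    lo = dict(nominal); lo[p] -= uncert[p]
    # central-difference sensitivity times the parameter uncertainty
    contributions[p] = abs(linewidth(**hi) - linewidth(**lo)) / 2.0

# Root-sum-square combination, assuming independent parameter errors
combined = np.sqrt(sum(c**2 for c in contributions.values()))
```

Second-order interactions are ignored here, matching the paper's stated scope; the combined value is a lower limit to the measurement uncertainty in the same sense.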
Optical Model and Cross Section Uncertainties
Energy Technology Data Exchange (ETDEWEB)
Herman,M.W.; Pigni, M.T.; Dietrich, F.S.; Oblozinsky, P.
2009-10-05
Distinct minima and maxima in the neutron total cross section uncertainties were observed in model calculations using a spherical optical potential. We found this oscillating structure to be a general feature of quantum mechanical wave scattering. Specifically, we analyzed neutron interaction with 56Fe from 1 keV up to 65 MeV and investigated the physical origin of the minima. We discuss their potential importance for practical applications as well as the implications for the uncertainties in total and absorption cross sections.
Uncertainty quantification and stochastic modeling with Matlab
Souza de Cursi, Eduardo
2015-01-01
Uncertainty Quantification (UQ) is a relatively new research area which describes the methods and approaches used to supply quantitative descriptions of the effects of uncertainty, variability and errors in simulation problems and models. It is rapidly becoming a field of increasing importance, with many real-world applications within statistics, mathematics, probability and engineering, but also within the natural sciences. Literature on the topic has up until now been largely based on polynomial chaos, which raises difficulties when considering different types of approximation and does no
Uncertainties in storm surge and coastal inundation modeling
Dukhovskoy, D. S.; Morey, S. L.
2011-12-01
Storm surge modeling has developed noticeably over the past two decades, progressing from relatively simple two-dimensional models with simplified physics and coarse computational grids to complex three-dimensional modeling systems with wetting and drying capabilities and high-resolution grids. Although the physics of a storm surge is conceptually straightforward, it is still a challenge to provide an accurate numerical forecast of a storm surge. The presentation will focus on sources of uncertainty in storm surge modeling, based on several real-case simulations of storm surge in the Gulf of Mexico. An ongoing study on estimating the likelihood of coastal inundation along the U.S. Gulf coast will be presented.
Chou, Shuo-Ju
2011-12-01
-makers with the ability to assess or measure the robustness of program requirements against such uncertainties. A literature review of techniques for forecasting technology performance and development uncertainties and subsequent impacts on capability, budget, and schedule requirements resulted in the conclusion that an analysis process that coupled a probabilistic analysis technique such as Monte Carlo Simulations with quantitative and parametric models of technology performance impact and technology development time and cost requirements would allow the probabilities of meeting specific constraints of these requirements to be established. These probabilities of requirements success metrics can then be used as a quantitative and probabilistic measure of program requirements robustness against technology uncertainties. Combined with a Multi-Objective Genetic Algorithm optimization process and computer-based Decision Support System, critical information regarding requirements robustness against technology uncertainties can be captured and quantified for acquisition decision-makers. This results in a more informed and justifiable selection of program technologies during initial program definition as well as formulation of program development and risk management strategies. To meet the stated research objective, the ENhanced TEchnology Robustness Prediction and RISk Evaluation (ENTERPRISE) methodology was formulated to provide a structured and transparent process for integrating these enabling techniques to provide a probabilistic and quantitative assessment of acquisition program requirements robustness against technology performance and development uncertainties. 
In order to demonstrate the capabilities of the ENTERPRISE method and test the research hypotheses, a demonstration application of this method was performed on a notional program for acquiring the Carrier-based Suppression of Enemy Air Defenses (SEAD) using Unmanned Combat Aircraft Systems (UCAS) and their enabling
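The coupling of Monte Carlo simulation with parametric models of development time and cost that this abstract describes can be sketched as follows; the distributions, budget cap and schedule cap are illustrative assumptions, not values from the ENTERPRISE study:

```python
import random

def requirements_success_probability(n_trials=100_000, seed=1):
    """Estimate the probability that a candidate technology meets both
    cost and schedule constraints, by Monte Carlo sampling of uncertain
    development cost and time (all distributions are hypothetical)."""
    rng = random.Random(seed)
    budget_cap = 120.0    # $M, hypothetical program constraint
    schedule_cap = 48.0   # months, hypothetical program constraint
    hits = 0
    for _ in range(n_trials):
        cost = rng.lognormvariate(4.6, 0.25)   # uncertain development cost
        months = rng.triangular(30, 60, 42)    # uncertain development time
        if cost <= budget_cap and months <= schedule_cap:
            hits += 1
    return hits / n_trials

p = requirements_success_probability()
```

The returned fraction is one "probability of requirements success" metric: the share of sampled technology outcomes that satisfy both constraints simultaneously, which could then feed an optimization or decision-support step.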
Uncertainty Quantification in Geomagnetic Field Modeling
Chulliat, A.; Nair, M. C.; Alken, P.; Meyer, B.; Saltus, R.; Woods, A.
2017-12-01
Geomagnetic field models are mathematical descriptions of the various sources of the Earth's magnetic field, and are generally obtained by solving an inverse problem. They are widely used in research to separate and characterize field sources, but also in many practical applications such as aircraft and ship navigation, smartphone orientation, satellite attitude control, and directional drilling. In recent years, more sophisticated models have been developed, thanks to the continuous availability of high quality satellite data and to progress in modeling techniques. Uncertainty quantification has become an integral part of model development, both to assess the progress made and to address specific users' needs. Here we report on recent advances made by our group in quantifying the uncertainty of geomagnetic field models. We first focus on NOAA's World Magnetic Model (WMM) and the International Geomagnetic Reference Field (IGRF), two reference models of the main (core) magnetic field produced every five years. We describe the methods used in quantifying the model commission error as well as the omission error attributed to various un-modeled sources such as magnetized rocks in the crust and electric current systems in the atmosphere and near-Earth environment. A simple error model was derived from this analysis, to facilitate usage in practical applications. We next report on improvements brought by combining a main field model with a high-resolution crustal field model and a time-varying, real-time external field model, as in NOAA's High Definition Geomagnetic Model (HDGM). The obtained uncertainties are used by the directional drilling industry to mitigate health, safety and environment risks.
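A minimal sketch of such a simple error model, under the common assumption that commission and omission errors are independent and therefore combine in quadrature; the nanotesla values are hypothetical, not WMM or HDGM figures:

```python
import math

def total_field_error(commission_nT, crustal_nT, external_nT):
    """Combine the model commission error with omission errors from
    un-modeled crustal and external sources in quadrature
    (independence is assumed; inputs are one-sigma values in nT)."""
    return math.sqrt(commission_nT**2 + crustal_nT**2 + external_nT**2)

# Hypothetical one-sigma error budget at a single location:
err = total_field_error(commission_nT=5.0, crustal_nT=150.0, external_nT=20.0)
print(round(err, 1))  # → 151.4
```

In this toy budget the crustal omission term dominates, which is why combining a main field model with a high-resolution crustal model can markedly shrink the total uncertainty.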
A conceptual model of political market orientation
DEFF Research Database (Denmark)
Ormrod, Robert P.
2005-01-01
This article proposes eight constructs of a conceptual model of political market orientation, taking inspiration from the business and political marketing literature. Four of the constructs are 'behavioural' in that they aim to describe the process of how information flows through the organisation... The remaining four constructs are attitudinal, designed to capture the awareness of members to the activities and importance of stakeholder groups in society, both internal and external to the organisation. The model not only allows the level of a party's political market orientation to be assessed, but also......
Modalities for an Allegorical Conceptual Data Model
Directory of Open Access Journals (Sweden)
Bartosz Zieliński
2014-05-01
Full Text Available Allegories are enriched categories generalizing the category of sets and binary relations. In this paper, we extend a recently introduced conceptual data model based on allegories by adding support for modal operators and developing a modal interpretation of the model in any allegory satisfying certain additional (but natural) axioms. The possibility of using different allegories allows us to transparently use alternative logical frameworks, such as fuzzy relations. Mathematically, our work demonstrates how to enrich an abstract algebraic logic framework with modal operators and give it a many-worlds semantics. We also give some examples of applications of the modal extension.
Spatial variability and parametric uncertainty in performance assessment models
International Nuclear Information System (INIS)
Pensado, Osvaldo; Mancillas, James; Painter, Scott; Tomishima, Yasuo
2011-01-01
The problem of defining an appropriate treatment of distribution functions (which could represent spatial variability or parametric uncertainty) is examined based on a generic performance assessment model for a high-level waste repository. The generic model incorporated source term models available in GoldSim®, the TDRW code for contaminant transport in sparse fracture networks with a complex fracture-matrix interaction process, and a biosphere dose model known as BDOSE™. Using the GoldSim framework, several Monte Carlo sampling approaches and transport conceptualizations were evaluated to explore the effect of various treatments of spatial variability and parametric uncertainty on dose estimates. Results from a model employing a representative source and ensemble-averaged pathway properties were compared to results from a model allowing for stochastic variation of transport properties along streamline segments (i.e., explicit representation of spatial variability within a Monte Carlo realization). We concluded that the sampling approach and the definition of an ensemble representative do influence consequence estimates. In the examples analyzed in this paper, approaches considering limited variability of a transport resistance parameter along a streamline increased the frequency of fast pathways, resulting in relatively high dose estimates, while those allowing for broad variability along streamlines increased the frequency of 'bottlenecks', reducing dose estimates. On this basis, simplified approaches with limited consideration of variability may suffice for intended uses of the performance assessment model, such as evaluation of site safety. (author)
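The contrast between ensemble-averaged pathway properties and explicit spatial variability along streamline segments can be illustrated with a toy Monte Carlo sketch; the lognormal resistance distribution and segment count are assumptions for illustration, not values from the GoldSim model:

```python
import random
import statistics

def pathway_resistance(rng, n_segments=10, per_segment=True):
    """Total transport resistance along one streamline.
    per_segment=True samples a fresh resistance for every segment
    (explicit spatial variability within a realization); False applies
    one ensemble-sampled value to the whole pathway."""
    if per_segment:
        return sum(rng.lognormvariate(0.0, 1.0) for _ in range(n_segments))
    one = rng.lognormvariate(0.0, 1.0)
    return one * n_segments

rng = random.Random(42)
per_seg = [pathway_resistance(rng, per_segment=True) for _ in range(5000)]
ensemble = [pathway_resistance(rng, per_segment=False) for _ in range(5000)]
# Per-segment sampling averages out extremes along each pathway, so the
# distribution of total resistance is much narrower: fewer very fast
# pathways (high doses) and fewer extreme bottlenecks (low doses).
print(statistics.stdev(per_seg) < statistics.stdev(ensemble))
```

This mirrors the paper's observation that the sampling approach itself, not just the marginal distributions, shapes the consequence estimates.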
Summarization of clinical information: a conceptual model.
Feblowitz, Joshua C; Wright, Adam; Singh, Hardeep; Samal, Lipika; Sittig, Dean F
2011-08-01
To provide high-quality and safe care, clinicians must be able to optimally collect, distill, and interpret patient information. Despite advances in text summarization, only limited research exists on clinical summarization, the complex and heterogeneous process of gathering, organizing and presenting patient data in various forms. Our objective was to develop a conceptual model for describing and understanding clinical summarization in both computer-independent and computer-supported clinical tasks. Based on extensive literature review and clinical input, we developed a conceptual model of clinical summarization to lay the foundation for future research on clinician workflow and automated summarization using electronic health records (EHRs). Our model identifies five distinct stages of clinical summarization: (1) Aggregation, (2) Organization, (3) Reduction and/or Transformation, (4) Interpretation and (5) Synthesis (AORTIS). The AORTIS model describes the creation of complex, task-specific clinical summaries and provides a framework for clinical workflow analysis and directed research on test results review, clinical documentation and medical decision-making. We describe a hypothetical case study to illustrate the application of this model in the primary care setting. Both practicing physicians and clinical informaticians need a structured method of developing, studying and evaluating clinical summaries in support of a wide range of clinical tasks. Our proposed model of clinical summarization provides a potential pathway to advance knowledge in this area and highlights directions for further research. Copyright © 2011 Elsevier Inc. All rights reserved.
DEFF Research Database (Denmark)
Odgaard, Peter Fogh; Stoustrup, Jakob; Mataji, B.
2007-01-01
of the prediction error. These proposed dynamical uncertainty models result in an upper and lower bound on the predicted performance of the plant. The dynamical uncertainty models are used to estimate the uncertainty of the predicted performance of a coal-fired power plant. The proposed scheme, which uses dynamical...
Conceptual Model of Dynamic Geographic Environment
Directory of Open Access Journals (Sweden)
Martínez-Rosales Miguel Alejandro
2014-04-01
Full Text Available In geographic environments, there are many and different types of geographic entities such as automobiles, trees, persons, buildings, storms, hurricanes, etc. These entities can be classified into two groups: geographic objects and geographic phenomena. By its nature, a geographic environment is dynamic; thus, its static modeling is not sufficient. Considering the dynamics of the geographic environment, a new type of geographic entity called an event is introduced. The primary aim is to model the geographic environment as an event sequence, because in this case the semantic relations are much richer than in the case of static modeling. In this work, the conceptualization of this model is proposed. It is based on the idea of processing each entity separately rather than the environment as a whole. After that, the so-called history of each entity and its spatial relations to other entities are defined to describe the whole environment. The main goal is to model, at a conceptual level, systems that make use of spatial and temporal information, so that the model can later serve as the semantic engine for such systems.
Achievements and Problems of Conceptual Modelling
Thalheim, Bernhard
Database and information systems technology has substantially changed. Nowadays, content management systems, (information-intensive) web services, collaborating systems, internet databases, OLAP databases etc. have become buzzwords. At the same time, object-relational technology has gained the maturity for being widely applied. Conceptual modelling has not (yet) covered all these novel topics. For more than two decades it has concentrated on the specification of structures. Meanwhile, functionality, interactivity and distribution must be included in conceptual modelling of information systems. Also, some of the open problems that were already discussed in 1987 [15, 16] still remain open. At the same time, novel models such as object-relational models or XML-based models have been developed. They did not overcome all the problems but have been sharpening and extending the variety of open problems. The open problems presented are drawn from classical areas of database research, i.e., structuring and functionality. The entire area of distribution and interaction is currently an area of very intensive research.
Impact of model defect and experimental uncertainties on evaluated output
International Nuclear Information System (INIS)
Neudecker, D.; Capote, R.; Leeb, H.
2013-01-01
One of the current major problems in nuclear data evaluation is the unreasonably small evaluated uncertainties often obtained. These small uncertainties are partly attributed to missing correlations of experimental uncertainties, as well as to deficiencies of the model employed for the prior information. In this article, both uncertainty sources are included in an evaluation of 55Mn cross-sections for incident neutrons. Their impact on the evaluated output is studied using a prior obtained by the Full Bayesian Evaluation Technique and a prior obtained by the nuclear model program EMPIRE. It is shown analytically and by means of an evaluation that unreasonably small evaluated uncertainties can be obtained not only if correlated systematic uncertainties of the experiment are neglected, but also if prior uncertainties are smaller than or about the same magnitude as the experimental ones. Furthermore, it is shown that including model defect uncertainties in the evaluation of 55Mn leads to larger evaluated uncertainties for channels where the model is deficient. It is concluded that including correlated experimental uncertainties is as important as including model defect uncertainties if the model calculations deviate significantly from the measurements. -- Highlights: • We study possible causes of unreasonably small evaluated nuclear data uncertainties. • Two different formulations of model defect uncertainties are presented and compared. • Smaller prior than experimental uncertainties cause too-small evaluated ones. • Neglected correlations of experimental uncertainties cause too-small evaluated ones. • Including model defect uncertainties in the prior improves the evaluated output
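The mechanism can be seen in a one-dimensional Gaussian sketch: the evaluated (posterior) uncertainty is the harmonic combination of prior and experimental variances, so it is smaller than either input, and inflating the prior with a model-defect term enlarges it again. All numbers below are illustrative, not 55Mn values:

```python
def bayes_update(prior, sigma_prior, data, sigma_data):
    """One-dimensional Gaussian (generalized least squares) update.
    Returns the evaluated mean and its standard deviation; the
    evaluated variance is 1 / (1/sigma_prior^2 + 1/sigma_data^2)."""
    w = 1.0 / sigma_prior**2 + 1.0 / sigma_data**2
    mean = (prior / sigma_prior**2 + data / sigma_data**2) / w
    return mean, (1.0 / w) ** 0.5

# Experiment with 5% uncertainty; prior cross-section also with 5%:
m1, s1 = bayes_update(1.00, 0.05, 1.10, 0.05)
# Same experiment, but the prior now carries an extra model-defect
# term added in quadrature (an illustrative 10%):
defect_sigma = (0.05**2 + 0.10**2) ** 0.5
m2, s2 = bayes_update(1.00, defect_sigma, 1.10, 0.05)
print(s1 < 0.05)  # evaluated uncertainty smaller than both inputs
print(s2 > s1)    # model defect term enlarges the evaluated uncertainty
```

When the prior uncertainty is as small as or smaller than the experimental one, the evaluated uncertainty shrinks well below the experimental value, which is exactly the pathology the article diagnoses.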
Conceptual Modelling of Complex Production Systems
Directory of Open Access Journals (Sweden)
Nenad Perši
2008-12-01
Full Text Available The dynamics, structure and behaviour of complex systems call for a wide range of methods, algorithms and tools to reach a model capable of finding optimal performing parameters. In the modelling process, it is up to the analyst to select the appropriate combination of methods, algorithms and tools to express significant system performances. Such a methodology for designing complex systems should be based upon conceptual modelling to perform a sensitivity analysis of different system levels and views, allowing system representations for developing computer models. Complex systems, such as business systems with a continuous-discrete production process, require a well-organised supply chain that is highly reactive to changes in the production assortment. Aligning two different production components distinctive in their behaviour is especially delicate at the production parameters transition point. Such system performances require distinctive design methods that can follow the double nature of the production process behaviour in accordance with their entities' dynamics caused by assortment changes. Consequently, such systems need different conceptual presentations for their purpose to be realized from different views and aspects.
Conceptual Frameworks in the Doctoral Research Process: A Pedagogical Model
Berman, Jeanette; Smyth, Robyn
2015-01-01
This paper contributes to consideration of the role of conceptual frameworks in the doctoral research process. Through reflection on the two authors' own conceptual frameworks for their doctoral studies, a pedagogical model has been developed. The model posits the development of a conceptual framework as a core element of the doctoral…
A Structural Equation Model of Conceptual Change in Physics
Taasoobshirazi, Gita; Sinatra, Gale M.
2011-01-01
A model of conceptual change in physics was tested on introductory-level, college physics students. Structural equation modeling was used to test hypothesized relationships among variables linked to conceptual change in physics including an approach goal orientation, need for cognition, motivation, and course grade. Conceptual change in physics…
A conceptual model of referee efficacy
Directory of Open Access Journals (Sweden)
Félix eGuillén
2011-02-01
Full Text Available This paper presents a conceptual model of referee efficacy, defines the concept, proposes sources of referee specific efficacy information, and suggests consequences of having high or low referee efficacy. Referee efficacy is defined as the extent to which referees believe they have the capacity to perform successfully in their job. Referee efficacy beliefs are hypothesized to be influenced by mastery experiences, referee knowledge/education, support from significant others, physical/mental preparedness, environmental comfort, and perceived anxiety. In turn, referee efficacy beliefs are hypothesized to influence referee performance, referee stress, athlete rule violations, athlete satisfaction, and co-referee satisfaction.
Sierra toolkit computational mesh conceptual model
International Nuclear Information System (INIS)
Baur, David G.; Edwards, Harold Carter; Cochran, William K.; Williams, Alan B.; Sjaardema, Gregory D.
2010-01-01
The Sierra Toolkit computational mesh is a software library intended to support massively parallel multi-physics computations on dynamically changing unstructured meshes. This domain of intended use is inherently complex due to distributed memory parallelism, parallel scalability, heterogeneity of physics, heterogeneous discretization of an unstructured mesh, and runtime adaptation of the mesh. Management of this inherent complexity begins with a conceptual analysis and modeling of this domain of intended use; i.e., development of a domain model. The Sierra Toolkit computational mesh software library is designed and implemented based upon this domain model. Software developers using, maintaining, or extending the Sierra Toolkit computational mesh library must be familiar with the concepts/domain model presented in this report.
Probabilistic Radiological Performance Assessment Modeling and Uncertainty
Tauxe, J.
2004-12-01
A generic probabilistic radiological Performance Assessment (PA) model is presented. The model, built using the GoldSim systems simulation software platform, concerns contaminant transport and dose estimation in support of decision making with uncertainty. Both the U.S. Nuclear Regulatory Commission (NRC) and the U.S. Department of Energy (DOE) require assessments of potential future risk to human receptors from disposal of low-level waste (LLW). Commercially operated LLW disposal facilities are licensed by the NRC (or agreement states), and the DOE operates such facilities for disposal of DOE-generated LLW. The type of PA model presented is probabilistic in nature, and hence reflects the current state of knowledge about the site by using probability distributions to capture what is expected (central tendency or average) and the uncertainty (e.g., standard deviation) associated with input parameters, and propagating them through the model to arrive at output distributions that reflect expected performance and the overall uncertainty in the system. Estimates of contaminant release rates, concentrations in environmental media, and resulting doses to human receptors well into the future are made by running the model in Monte Carlo fashion, with each realization representing a possible combination of input parameter values. Statistical summaries of the results can be compared to regulatory performance objectives, and decision makers are better informed of the inherently uncertain aspects of the model which supports their decision-making. While this information may make some regulators uncomfortable, they must realize that uncertainties which were hidden in a deterministic analysis are revealed in a probabilistic analysis, and the chance of making a correct decision is now known rather than hoped for. The model includes many typical features and processes that would be part of a PA, but is entirely fictitious. This does not represent any particular site and is meant to be a generic example. A
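The Monte Carlo structure of such a PA model can be sketched as follows; the toy dose chain, distributions and the 0.25 mSv/yr objective are illustrative assumptions in the spirit of the report's fictitious generic site, not values from any real facility:

```python
import random
import statistics

def annual_dose(rng):
    """One Monte Carlo realization of a toy dose calculation: sampled
    release and dilution feed a fixed dose coefficient. All
    distributions and the coefficient are hypothetical."""
    release = rng.lognormvariate(0.0, 0.8)    # relative release rate
    dilution = rng.lognormvariate(4.0, 0.5)   # dimensionless dilution
    dose_coeff = 0.5                          # mSv per unit diluted release, assumed
    return release / dilution * dose_coeff

rng = random.Random(7)
doses = [annual_dose(rng) for _ in range(20_000)]
limit = 0.25  # mSv/yr, stand-in for a regulatory performance objective
p_exceed = sum(d > limit for d in doses) / len(doses)
# Statistical summaries of the output distribution for decision makers:
summary = (statistics.mean(doses), statistics.median(doses), p_exceed)
```

Each realization is one possible combination of input parameter values; the fraction of realizations above the limit makes the decision risk explicit instead of hiding it in a single deterministic number.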
Current status of uncertainty analysis methods for computer models
International Nuclear Information System (INIS)
Ishigami, Tsutomu
1989-11-01
This report surveys several existing uncertainty analysis methods for estimating the uncertainty in computer model output caused by input uncertainties, illustrating application examples of those methods to three computer models: MARCH/CORRAL II, TERFOC and SPARC. Merits and limitations of the methods are assessed in the application, and recommendations for selecting uncertainty analysis methods are provided. (author)
Intrinsic Uncertainties in Modeling Complex Systems.
Energy Technology Data Exchange (ETDEWEB)
Cooper, Curtis S; Bramson, Aaron L.; Ames, Arlo L.
2014-09-01
Models are built to understand and predict the behaviors of both natural and artificial systems. Because it is always necessary to abstract away aspects of any non-trivial system being modeled, we know models can potentially leave out important, even critical elements. This reality of the modeling enterprise forces us to consider the prospective impacts of those effects completely left out of a model, either intentionally or through oversight. Insensitivity to new structure is an indication of diminishing returns. In this work, we represent a hypothetical unknown effect on a validated model as a finite perturbation whose amplitude is constrained within a control region. We find robustly that without further constraints, no meaningful bounds can be placed on the amplitude of a perturbation outside of the control region. Thus, forecasting into unsampled regions is a very risky proposition. We also present inherent difficulties with proper time discretization of models and with representing inherently discrete quantities. We point out potentially worrisome uncertainties, arising from mathematical formulation alone, which modelers can inadvertently introduce into models of complex systems. Acknowledgements This work has been funded under early-career LDRD project #170979, entitled "Quantifying Confidence in Complex Systems Models Having Structural Uncertainties", which ran from 04/2013 to 09/2014. We wish to express our gratitude to the many researchers at Sandia who contributed ideas to this work, as well as feedback on the manuscript. In particular, we would like to mention George Barr, Alexander Outkin, Walt Beyeler, Eric Vugrin, and Laura Swiler for providing invaluable advice and guidance through the course of the project. We would also like to thank Steven Kleban, Amanda Gonzales, Trevor Manzanares, and Sarah Burwell for their assistance in managing project tasks and resources.
Uncertainty modelling of critical column buckling for reinforced ...
Indian Academy of Sciences (India)
gates the material uncertainties on column design and proposes an uncertainty model for critical ... ances the accuracy of the structural models by using experimental results and design codes. (Baalbaki et al ..... Elishakoff I 1999 Whys and hows in uncertainty modeling, probability, fuzziness and anti-optimization. New York: ...
Model uncertainty from a regulatory point of view
International Nuclear Information System (INIS)
Abramson, L.R.
1994-01-01
This paper discusses model uncertainty in the larger context of knowledge and random uncertainty. It explores some regulatory implications of model uncertainty and argues that, from a regulator's perspective, a conservative approach must be taken. As a consequence of this perspective, averaging over model results is ruled out
Uncertainty Assessment in Urban Storm Water Drainage Modelling
DEFF Research Database (Denmark)
Thorndahl, Søren
The object of this paper is to make an overall description of the author's PhD study, concerning uncertainties in numerical urban storm water drainage models. Initially an uncertainty localization and assessment of model inputs and parameters as well as uncertainties caused by different model...
DEFF Research Database (Denmark)
Lindblom, Erik Ulfson; Madsen, Henrik; Mikkelsen, Peter Steen
2007-01-01
of the measurements. In the second attempt the conceptual model is reformulated as a grey-box model followed by parameter estimation. Given data from an extensive measurement campaign, the two methods suggest that the output of the stormwater pollution model is associated with significant uncertainty....... With the proposed model and input data, the GLUE analysis shows that the total sampled copper mass can be predicted within a range of +/- 50% of the median value (385 g), whereas the grey-box analysis showed a prediction uncertainty of less than +/- 30%. Future work will clarify the pros and cons of the two methods...
Uncertainty associated with selected environmental transport models
International Nuclear Information System (INIS)
Little, C.A.; Miller, C.W.
1979-11-01
A description is given of the capabilities of several models to predict accurately either pollutant concentrations in environmental media or radiological dose to human organs. The models are discussed in three sections: aquatic or surface water transport models, atmospheric transport models, and terrestrial and aquatic food chain models. Using data published primarily by model users, model predictions are compared to observations. This procedure is infeasible for food chain models and, therefore, the uncertainty embodied in the models' input parameters, rather than the model output, is estimated. Aquatic transport models are divided into one-dimensional, longitudinal-vertical, and longitudinal-horizontal models. Several conclusions were made about the ability of the Gaussian plume atmospheric dispersion model to predict accurately downwind air concentrations from releases under several sets of conditions. It is concluded that no validation study has been conducted to test the predictions of either aquatic or terrestrial food chain models. Using the aquatic pathway from water to fish to an adult for 137Cs as an example, a 95% one-tailed confidence limit interval for the predicted exposure is calculated by examining the distributions of the input parameters. Such an interval is found to be 16 times the value of the median exposure. A similar one-tailed limit for the air-grass-cow-milk-thyroid pathway for 131I and infants was 5.6 times the median dose. Of the three model types discussed in this report, the aquatic transport models appear to do the best job of predicting observed concentrations. However, this conclusion is based on many fewer aquatic validation data than were available for atmospheric model validation
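The parameter-uncertainty propagation behind the food chain estimate can be sketched as a product of lognormally distributed inputs; the geometric standard deviations below are hypothetical, chosen only to show how a 95% one-tailed limit can exceed the median severalfold:

```python
import random

def exposure_samples(n=50_000, seed=3):
    """Toy water->fish->adult pathway: predicted exposure is the product
    of lognormally distributed input parameters (water concentration,
    bioaccumulation factor, intake rate). All spreads are illustrative."""
    rng = random.Random(seed)
    samples = []
    for _ in range(n):
        conc = rng.lognormvariate(0.0, 0.6)    # water concentration
        bcf = rng.lognormvariate(0.0, 0.9)     # bioaccumulation factor
        intake = rng.lognormvariate(0.0, 0.3)  # intake rate
        samples.append(conc * bcf * intake)
    return samples

x = sorted(exposure_samples())
median = x[len(x) // 2]
p95 = x[int(0.95 * len(x))]
ratio = p95 / median  # one-tailed 95% limit expressed in multiples of the median
```

Because the product of lognormals is again lognormal with variances adding, broad input distributions quickly drive this ratio well above 1, which is how intervals of 16x (or 5.6x) the median arise.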
[The concepts and conceptual models of psychopharmacology].
Gor'kov, V A
1996-01-01
The present approaches to the testing and use of drugs, psychotropic ones in particular, are characterized by inadequate efficiency: the ratio of agents successfully passing preclinical, clinical, and postclinical tests is 100:5:1, and in clinical psychopharmacotherapy the proportion of drug-resistant patients and the incidence of side effects are rather high. This may be explained by the groundless supposition that interspecies- and intraspecies-specific sensitivity to drugs is equal, a notion which contradicts the concept of the biochemical stability of species and the principle of molecular economy in species-specific ratios. To enhance the efficiency of tests and pharmacotherapy, it is suggested that the existing conceptual model "concentration-effect" be replaced by the extended, more scientifically substantiated model "sensitivity-concentration-effect". Increases in the efficiency of psychopharmacotherapy may also be achieved by individually predicting the most effective psychotropic agents using clinical, paraclinical, pharmacokinetic, and other predictors.
Front-end conceptual platform modeling
DEFF Research Database (Denmark)
Guðlaugsson, Tómas Vignir; Ravn, Poul Martin; Mortensen, Niels Henrik
2014-01-01
Platform thinking has been the subject of investigation and deployment in many projects in both academia and industry. Most contributions involve the restructuring of product programs, and only a few support front-end development of a new platform in parallel with technology development.... The conclusion is that the Conceptual Product Platform model supports stakeholders in achieving an overview of the development tasks and communicating these across multidisciplinary development teams, as well as making decisions on the contents of the platform and providing a link between technical solutions...
A unifying conceptual model of entrepreneurial management
DEFF Research Database (Denmark)
Senderovitz, Martin
This article offers a systematic analysis and synthesis of the area of entrepreneurial management. Through a presentation of two main perspectives on entrepreneurial management and a newly developed unifying conceptual entrepreneurial management model, the paper discusses a number of theoretical disagreements, managerial dilemmas and paradoxes. On the basis of the findings and conclusions of the study, the article contributes an overview of the entrepreneurial management field, and offers an answer to the overall research question: What constitutes the most essential areas and challenges of entrepreneurial management? The paper builds on the seminal work by Stevenson (1983, 1990) and proposes a discussion and elaboration of the understanding and definition of entrepreneurial management in terms of the relationship between entrepreneurial opportunities and firm resources.
Methodology for characterizing modeling and discretization uncertainties in computational simulation
Energy Technology Data Exchange (ETDEWEB)
ALVIN,KENNETH F.; OBERKAMPF,WILLIAM L.; RUTHERFORD,BRIAN M.; DIEGERT,KATHLEEN V.
2000-03-01
This research effort focuses on methodology for quantifying the effects of model uncertainty and discretization error on computational modeling and simulation. The work is directed towards developing methodologies which treat model form assumptions within an overall framework for uncertainty quantification, for the purpose of developing estimates of total prediction uncertainty. The present effort consists of work in three areas: framework development for sources of uncertainty and error in the modeling and simulation process which impact model structure; model uncertainty assessment and propagation through Bayesian inference methods; and discretization error estimation within the context of non-deterministic analysis.
Development of conceptual groundwater flow model for Pali Area ...
African Journals Online (AJOL)
The study also extensively uses GIS for preprocessing of hydrological, hydrogeological and geological data. In our view, the methodology presented here provides better tools for building a conceptual model for tackling groundwater modeling problems. Keywords: Groundwater flow model, conceptual model, groundwater ...
A conceptual holding model for veterinary applications
Directory of Open Access Journals (Sweden)
Nicola Ferrè
2014-05-01
Full Text Available Spatial references are required when geographical information systems (GIS are used for the collection, storage and management of data. In the veterinary domain, the spatial component of a holding (of animals is usually defined by coordinates, and no other relevant information needs to be interpreted or used for manipulation of the data in the GIS environment provided. Users trying to integrate or reuse spatial data organised in such a way, frequently face the problem of data incompatibility and inconsistency. The root of the problem lies in differences with respect to syntax as well as variations in the semantic, spatial and temporal representations of the geographic features. To overcome these problems and to facilitate the inter-operability of different GIS, spatial data must be defined according to a “schema” that includes the definition, acquisition, analysis, access, presentation and transfer of such data between different users and systems. We propose an application “schema” of holdings for GIS applications in the veterinary domain according to the European directive framework (directive 2007/2/EC - INSPIRE. The conceptual model put forward has been developed at two specific levels to produce the essential and the abstract model, respectively. The former establishes the conceptual linkage of the system design to the real world, while the latter describes how the system or software works. The result is an application “schema” that formalises and unifies the information-theoretic foundations of how to spatially represent a holding in order to ensure straightforward information-sharing within the veterinary community.
Implications of model uncertainty for the practice of risk assessment
International Nuclear Information System (INIS)
Laskey, K.B.
1994-01-01
A model is a representation of a system that can be used to answer questions about the system's behavior. The term model uncertainty refers to problems in which there is no generally agreed upon, validated model that can be used as a surrogate for the system itself. Model uncertainty affects both the methodology appropriate for building models and how models should be used. This paper discusses representations of model uncertainty, methodologies for exercising and interpreting models in the presence of model uncertainty, and the appropriate use of fallible models for policy making
Creating a Universe, a Conceptual Model
Directory of Open Access Journals (Sweden)
James R. Johnson
2016-10-01
Full Text Available Space is something. Space inherently contains laws of nature: universal rules (mathematics, space dimensions, types of forces, types of fields, and particle species), laws (relativity, quantum mechanics, thermodynamics, and electromagnetism) and symmetries (Lorentz, Gauge, and symmetry breaking). We have significant knowledge about these laws of nature because all our scientific theories assume their presence. Their existence is critical for developing either a unique theory of our universe or more speculative multiverse theories. Scientists generally ignore the laws of nature because they “are what they are” and because visualizing different laws of nature challenges the imagination. This article defines a conceptual model separating space (laws of nature) from the universe’s energy source (initial conditions) and expansion (big bang). By considering the ramifications of changing the laws of nature, initial condition parameters, and two variables in the big bang theory, the model demonstrates that traditional fine tuning is not the whole story when creating a universe. Supporting the model, space and “nothing” are related to the laws of nature, mathematics and multiverse possibilities. Speculation on the beginning of time completes the model.
A Conceptual Modeling Approach for OLAP Personalization
Garrigós, Irene; Pardillo, Jesús; Mazón, Jose-Norberto; Trujillo, Juan
Data warehouses rely on multidimensional models in order to provide decision makers with appropriate structures to intuitively analyze data with OLAP technologies. However, data warehouses may be potentially large, and multidimensional structures become increasingly complex to understand at a glance. Even if a departmental data warehouse (also known as a data mart) is used, these structures may still be too complex. As a consequence, acquiring the required information is more costly than expected, and decision makers using OLAP tools may become frustrated. In this context, current approaches for data warehouse design focus on deriving a unique OLAP schema for all analysts from their previously stated information requirements, which is not enough to lighten the complexity of the decision-making process. To overcome this drawback, we argue for personalizing multidimensional models for OLAP technologies according to the continuously changing user characteristics, context, requirements and behaviour. In this paper, we present a novel approach to personalizing OLAP systems at the conceptual level based on the underlying multidimensional model of the data warehouse, a user model and a set of personalization rules. The great advantage of our approach is that a personalized OLAP schema is provided for each decision maker, helping to better satisfy their specific analysis needs. Finally, we show the applicability of our approach through a sample scenario based on our CASE tool for data warehouse development.
Impact of geological model uncertainty on integrated catchment hydrological modeling
He, Xin; Jørgensen, Flemming; Refsgaard, Jens Christian
2014-05-01
Various types of uncertainty can influence hydrological model performance. Among them, uncertainty originating from the geological model may play an important role in process-based integrated hydrological modeling, if the model is used outside the calibration base. In the present study, we try to assess the hydrological model predictive uncertainty caused by uncertainty of the geology using an ensemble of geological models with equal plausibility. The study is carried out in the 101 km2 Norsminde catchment in western Denmark. The geostatistical software TProGS is used to generate 20 stochastic geological realizations for the west side of the study area. This is done while incorporating the borehole log data from 108 wells and high-resolution airborne transient electromagnetic (AEM) data for conditioning. As a result, 10 geological models are generated based solely on borehole data, and another 10 geological models are based on both borehole and AEM data. Distributed surface water - groundwater models are developed using the MIKE SHE code for each of the 20 geological models. The models are then calibrated using field data collected from stream discharge and groundwater head observations. The model simulation results are evaluated based on the same two types of field data. The results show that the differences between simulated discharge flows caused by using different geological models are relatively small. The model calibration is shown to be able to account for the systematic bias in different geological realizations, and hence yields different calibrated model parameters. This results in an increase in the variance between the hydrological realizations compared to the uncalibrated models, which use the same parameter values in all 20 models. Furthermore, borehole-based hydrological models in general show more variance between simulations than the AEM-based models; however, the combined total uncertainty, bias plus variance, is not necessarily higher.
DEFF Research Database (Denmark)
Vezzaro, Luca; Mikkelsen, Peter Steen
2012-01-01
The need for estimating micropollutants fluxes in stormwater systems increases the role of stormwater quality models as support for urban water managers, although the application of such models is affected by high uncertainty. This study presents a procedure for identifying the major sources...... of uncertainty in a conceptual lumped dynamic stormwater runoff quality model that is used in a study catchment to estimate (i) copper loads, (ii) compliance with dissolved Cu concentration limits on stormwater discharge and (iii) the fraction of Cu loads potentially intercepted by a planned treatment facility...
Climate change decision-making: Model & parameter uncertainties explored
Energy Technology Data Exchange (ETDEWEB)
Dowlatabadi, H.; Kandlikar, M.; Linville, C.
1995-12-31
A critical aspect of climate change decision-making is the uncertainty in current understanding of the socioeconomic, climatic and biogeochemical processes involved. Decision-making processes are much better informed if these uncertainties are characterized and their implications understood. Quantitative analysis of these uncertainties serves to inform decision makers about the likely outcome of policy initiatives and helps set priorities for research so that the outcome ambiguities faced by the decision-makers are reduced. A family of integrated assessment models of climate change has been developed at Carnegie Mellon. These models are distinguished from other integrated assessment efforts in that they were designed from the outset to characterize and propagate parameter, model, value, and decision-rule uncertainties. The most recent of these models is ICAM 2.1. This model includes representation of the processes of demographics, economic activity, emissions, atmospheric chemistry, climate and sea level change, impacts from these changes, and policies for emissions mitigation and adaptation to change. The model has over 800 objects, of which about one half are used to represent uncertainty. In this paper we show that, when considering parameter uncertainties, the climatic uncertainties contribute most, followed by uncertainties in damage calculations, economic uncertainties and direct aerosol forcing uncertainties. When considering model structure uncertainties, we find that the choice of policy is often dominated by the choice of model structure rather than by parameter uncertainties.
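The ranking of parameter uncertainties described in the abstract above can be illustrated with a toy Monte Carlo propagation. Everything below is invented for illustration: the damage function, the three input distributions, and the variable names bear no relation to ICAM 2.1; the "share of variance" is a crude fix-one-input-at-its-mean measure, not a full Sobol decomposition.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 20_000

# Hypothetical impact model: damage = climate_sensitivity * emissions - mitigation
def damage(sens, emis, mit):
    return sens * emis - mit

# Illustrative input distributions (not taken from any real assessment model)
params = {
    "sens": rng.normal(3.0, 1.0, N),
    "emis": rng.normal(10.0, 1.5, N),
    "mit": rng.normal(2.0, 0.5, N),
}
y = damage(**params)
total_var = y.var()

# Crude importance measure: variance reduction when one input is fixed at its mean
shares = {}
for name, values in params.items():
    fixed = dict(params, **{name: np.full(N, values.mean())})
    shares[name] = (total_var - damage(**fixed).var()) / total_var
print(shares)
```

With these made-up numbers the climate-sensitivity term dominates the output variance, mirroring the kind of ranking the abstract reports for parameter uncertainties.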
How to: understanding SWAT model uncertainty relative to measured results
Watershed models are being relied upon to contribute to most policy-making decisions of watershed management, and the demand for an accurate accounting of complete model uncertainty is rising. Generalized likelihood uncertainty estimation (GLUE) is a widely used method for quantifying uncertainty i...
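The GLUE method mentioned above can be sketched in a few lines: sample parameter sets, score each against observations with an informal likelihood, keep the "behavioral" sets above a threshold, and derive prediction bounds from that ensemble. The linear-reservoir model, the NSE threshold of 0.7, and all numbers below are invented for illustration, not drawn from SWAT.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical linear-reservoir runoff model; k is the only uncertain parameter
def simulate(k, rain):
    storage, flow = 0.0, []
    for r in rain:
        storage += r - k * storage
        flow.append(k * storage)
    return np.array(flow)

rain = rng.exponential(2.0, 60)
observed = simulate(0.3, rain) + rng.normal(0, 0.05, 60)   # synthetic observations

# GLUE: sample parameter sets, keep the "behavioral" ones above a likelihood threshold
k_samples = rng.uniform(0.05, 0.9, 2000)
nse = np.array([
    1 - np.sum((simulate(k, rain) - observed) ** 2)
    / np.sum((observed - observed.mean()) ** 2)
    for k in k_samples
])
behavioral = k_samples[nse > 0.7]

# Prediction bounds from the behavioral ensemble (uniform weights for simplicity)
sims = np.array([simulate(k, rain) for k in behavioral])
lower, upper = np.quantile(sims, [0.05, 0.95], axis=0)
print(f"{len(behavioral)} behavioral sets; mean band width {np.mean(upper - lower):.3f}")
```

A full GLUE application would weight the behavioral simulations by their likelihoods rather than equally; the uniform weighting here keeps the sketch short.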
Representing and managing uncertainty in qualitative ecological models
Nuttle, T.; Bredeweg, B.; Salles, P.; Neumann, M.
2009-01-01
Ecologists and decision makers need ways to understand systems, test ideas, and make predictions and explanations about systems. However, uncertainty about causes and effects of processes and parameter values is pervasive in models of ecological systems. Uncertainty associated with incomplete
Energy Technology Data Exchange (ETDEWEB)
Dai, Heng [Pacific Northwest National Laboratory, Richland Washington USA; Ye, Ming [Department of Scientific Computing, Florida State University, Tallahassee Florida USA; Walker, Anthony P. [Environmental Sciences Division and Climate Change Science Institute, Oak Ridge National Laboratory, Oak Ridge Tennessee USA; Chen, Xingyuan [Pacific Northwest National Laboratory, Richland Washington USA
2017-04-01
Hydrological models are always composed of multiple components that represent processes key to intended model applications. When a process can be simulated by multiple conceptual-mathematical models (process models), model uncertainty in representing the process arises. While global sensitivity analysis methods have been widely used for identifying important processes in hydrologic modeling, the existing methods consider only parametric uncertainty and ignore the model uncertainty in process representation. To address this problem, this study develops a new method to probe multimodel process sensitivity by integrating model averaging methods into the framework of variance-based global sensitivity analysis, given that the model averaging methods quantify both parametric and model uncertainty. A new process sensitivity index is derived as a metric of relative process importance; the index includes variance in model outputs caused by uncertainty in both process models and model parameters. For demonstration, the new index is used to evaluate the processes of recharge and geology in a synthetic study of groundwater reactive transport modeling. The recharge process is simulated by two models that convert precipitation to recharge, and the geology process is simulated by two models with different parameterizations of hydraulic conductivity; each process model has its own random parameters. The new process sensitivity index is mathematically general and can be applied to a wide range of problems in hydrology and beyond.
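A minimal numerical sketch of such a process sensitivity index follows. The two recharge models, the two conductivity parameterizations, the output quantity y = R/K, and all distributions are invented stand-ins for the study's setup, and the index computed here is a simplified first-order version that conditions only on the choice of process model.

```python
import numpy as np

rng = np.random.default_rng(2)
N = 100_000
precip = 800.0   # fixed forcing (mm/yr), purely illustrative

# Two hypothetical recharge process models, each with its own random parameters
rm = rng.integers(0, 2, N)                # recharge model indicator, equal priors
R = np.where(rm == 0,
             rng.uniform(0.10, 0.20, N) * precip,             # linear-fraction model
             np.maximum(precip - rng.normal(650, 30, N), 0))  # threshold-excess model

# Two hypothetical geology (conductivity) parameterizations
gm = rng.integers(0, 2, N)
K = np.where(gm == 0, rng.lognormal(0.0, 0.3, N), rng.lognormal(0.2, 0.6, N))

y = R / K    # toy model output, e.g. a head-like quantity

# Simplified first-order process sensitivity: variance of the conditional mean
# across the two alternative models of a process, as a share of total variance
def process_index(indicator):
    cond_means = [y[indicator == m].mean() for m in (0, 1)]
    return np.var(cond_means) / y.var()

print("recharge:", process_index(rm), "geology:", process_index(gm))
```

The full index in the abstract also folds parameter-induced variance into each process's contribution; that refinement is omitted here to keep the conditioning step visible.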
Guiding principles for conceptual model creation in manufacturing simulation
Van der Zee, Durk-Jouke; Van der Vorst, Jack G. A. J.; Henderson, S.G.; Biller, B.; Hsieh, M.-H.; Shortle, J.; Tew, J.D.; Barton, R.R.
2007-01-01
Conceptual models serve as abstractions of user's perceptions of a system. The choice and detailing of these abstractions are key to model use and understanding for analyst and project stakeholders. In this article we consider guidance for the analyst in his creative job of conceptual modeling. More
Physical and Model Uncertainty for Fatigue Design of Composite Material
DEFF Research Database (Denmark)
Toft, Henrik Stensgaard; Sørensen, John Dalsgaard
The main aim of the present report is to establish stochastic models for the uncertainties related to fatigue design of composite materials. The uncertainties considered are the physical uncertainty related to the static and fatigue strength and the model uncertainty related to Miner's rule...... for linear damage accumulation. The test data analyzed are taken from the Optimat database [1], which is publicly available. The composite material tested within the Optimat project is normally used for wind turbine blades....
Modelling of data uncertainties on hybrid computers
International Nuclear Information System (INIS)
Schneider, Anke
2016-06-01
The codes d³f and r³t are well established for modelling density-driven flow and nuclide transport in the far field of repositories for hazardous material in deep geological formations. They are applicable in porous media as well as in fractured rock or mudstone, for modelling salt and heat transport as well as a free groundwater surface. Development of the basic framework of d³f and r³t began more than 20 years ago. Since that time, significant advancements have taken place in the requirements for safety assessment as well as in computer hardware development. The period of safety assessment for a repository of high-level radioactive waste was extended to 1 million years, and the complexity of the models is steadily growing. Concurrently, the demands on accuracy increase. Additionally, model and parameter uncertainties become more and more important for an increased understanding of prediction reliability. All this leads to a growing demand for computational power that requires a considerable software speed-up. An effective way to achieve this is the use of modern, hybrid computer architectures, which basically requires the set-up of new data structures and a corresponding code revision but offers a potential speed-up by several orders of magnitude. The original codes d³f and r³t were applications of the software platform UG /BAS 94/, whose development began in the early nineteen-nineties. However, UG has recently been advanced to the C++ based, substantially revised version UG4 /VOG 13/. To benefit also in the future from state-of-the-art numerical algorithms and to use hybrid computer architectures, the codes d³f and r³t were transferred to this new code platform. Making use of the fact that coupling between different sets of equations is natively supported in UG4, d³f and r³t were combined into one conjoint code d³f++. A direct estimation of uncertainties for complex groundwater flow models with the help of Monte Carlo simulations will not be
Modelling of data uncertainties on hybrid computers
Energy Technology Data Exchange (ETDEWEB)
Schneider, Anke (ed.)
2016-06-15
The codes d³f and r³t are well established for modelling density-driven flow and nuclide transport in the far field of repositories for hazardous material in deep geological formations. They are applicable in porous media as well as in fractured rock or mudstone, for modelling salt and heat transport as well as a free groundwater surface. Development of the basic framework of d³f and r³t began more than 20 years ago. Since that time, significant advancements have taken place in the requirements for safety assessment as well as in computer hardware development. The period of safety assessment for a repository of high-level radioactive waste was extended to 1 million years, and the complexity of the models is steadily growing. Concurrently, the demands on accuracy increase. Additionally, model and parameter uncertainties become more and more important for an increased understanding of prediction reliability. All this leads to a growing demand for computational power that requires a considerable software speed-up. An effective way to achieve this is the use of modern, hybrid computer architectures, which basically requires the set-up of new data structures and a corresponding code revision but offers a potential speed-up by several orders of magnitude. The original codes d³f and r³t were applications of the software platform UG /BAS 94/, whose development began in the early nineteen-nineties. However, UG has recently been advanced to the C++ based, substantially revised version UG4 /VOG 13/. To benefit also in the future from state-of-the-art numerical algorithms and to use hybrid computer architectures, the codes d³f and r³t were transferred to this new code platform. Making use of the fact that coupling between different sets of equations is natively supported in UG4, d³f and r³t were combined into one conjoint code d³f++. A direct estimation of uncertainties for complex groundwater flow models with the
A market model: uncertainty and reachable sets
Directory of Open Access Journals (Sweden)
Raczynski Stanislaw
2015-01-01
Full Text Available Uncertain parameters are always present in models that include the human factor. In marketing, uncertain consumer behavior makes it difficult to predict future events and elaborate good marketing strategies. Sometimes uncertainty is modeled using stochastic variables. Our approach is quite different: the dynamic market with uncertain parameters is treated using differential inclusions, which permits determining the corresponding reachable sets. This is not a statistical analysis; we are looking for solutions to the differential inclusions. The purpose of the research is to find a way to obtain and visualise the reachable sets, in order to know the limits for the important marketing variables. The modeling method consists in defining the differential inclusion and finding its solution, using the differential inclusion solver developed by the author. As a result we obtain images of the reachable sets where the main control parameter is the share of investment, being a part of the revenue. As an additional result we can also define the optimal investment strategy. The conclusion is that the differential inclusion solver can be a useful tool in market model analysis.
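The reachable-set idea can be approximated numerically by sampling many admissible selections of the inclusion and integrating each one. The toy dynamic below (a logistic-type market state drained by an investment share u) and every number in it are invented for illustration; this brute-force sampling is a stand-in for the author's dedicated differential inclusion solver.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy market dynamic: dx/dt = a*x*(1 - x) - u, where the control u may take any
# value in [0, 0.3] at each instant (the differential inclusion).
def final_state(u_path, x0=0.2, a=1.5, dt=0.05):
    x = x0
    for u in u_path:
        x += dt * (a * x * (1.0 - x) - u)     # explicit Euler step
    return x

steps = 40
# Approximate the reachable set by sampling many piecewise-constant selections
finals = np.array([final_state(rng.uniform(0.0, 0.3, steps)) for _ in range(3000)])
print(f"reachable interval at t = {steps * 0.05}: [{finals.min():.2f}, {finals.max():.2f}]")
```

For a one-dimensional state the reachable set at a fixed time is an interval, so its extremes summarize the limits on the marketing variable; in higher dimensions one would plot the sampled endpoints instead.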
Solomatine, Dimitri
2016-04-01
When speaking about model uncertainty, many authors implicitly assume data uncertainty (mainly in parameters or inputs), which is described probabilistically by distributions. Often, however, it is worth looking into the residual uncertainty as well. It is hence reasonable to classify the main approaches to uncertainty analysis with respect to the two main types of model uncertainty that can be distinguished: A. The residual uncertainty of models. In this case the model parameters and/or model inputs are considered to be fixed (deterministic), i.e. the model is considered to be optimal (calibrated) and deterministic. Model error is considered as the manifestation of uncertainty. If there is enough past data about the model errors (i.e. its uncertainty), it is possible to build a statistical or machine learning model of uncertainty trained on this data. The following methods can be mentioned: (a) the quantile regression (QR) method of Koenker and Bassett, in which linear regression is used to build predictive models for distribution quantiles [1]; (b) a more recent approach that takes into account the input variables influencing such uncertainty and uses more advanced (non-linear) machine learning methods (neural networks, model trees, etc.), the UNEEC method [2,3,7]; (c) the still more recent DUBRAUE method (Dynamic Uncertainty Model By Regression on Absolute Error), an autoregressive model of model residuals (it corrects the model residual first and then carries out the uncertainty prediction with an autoregressive statistical model) [5]. B. The data uncertainty (parametric and/or input): in this case we study the propagation of uncertainty (typically represented probabilistically) from parameters or inputs to the model outputs. In the case of simple functions representing models, analytical approaches can be used, or approximation methods (e.g., the first-order second-moment method). However, for real complex non-linear models implemented in software there is no other choice except using
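The quantile-regression idea in approach (a) can be sketched with a subgradient fit of the pinball loss. The heteroscedastic residuals below are synthetic, and the plain linear-in-x quantile model is a minimal stand-in for the richer regressions used in QR/UNEEC-style methods.

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic model residuals whose spread grows with an input variable x
x = rng.uniform(0, 1, 500)
resid = rng.normal(0.0, 0.1 + 0.5 * x)

# Linear quantile regression: minimize the pinball loss by subgradient descent
def fit_quantile(x, y, tau, lr=0.05, epochs=3000):
    w = b = 0.0
    for _ in range(epochs):
        e = y - (w * x + b)
        g = np.where(e > 0, -tau, 1.0 - tau)  # subgradient of pinball loss wrt prediction
        w -= lr * np.mean(g * x)
        b -= lr * np.mean(g)
    return w, b

w95, b95 = fit_quantile(x, resid, 0.95)
w05, b05 = fit_quantile(x, resid, 0.05)
coverage = np.mean((resid >= w05 * x + b05) & (resid <= w95 * x + b95))
print(f"empirical coverage of the 90% band: {coverage:.2f}")
```

The two fitted quantile lines bound roughly 90% of the residuals, and the widening gap between them recovers the input-dependent uncertainty that a single symmetric error bar would miss.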
Conceptual models of the wind-driven and thermohaline circulation
Drijfhout, S.S.; Marshall, D.P.; Dijkstra, H.A.
2013-01-01
Conceptual models are a vital tool for understanding the processes that maintain the global ocean circulation, both in nature and in complex numerical ocean models. In this chapter we provide a broad overview of our conceptual understanding of the wind-driven circulation, the thermohaline
On the general ontological foundations of conceptual modeling
Guizzardi, G.; Herre, Heinrich; Wagner, Gerd; Spaccapietra, Stefano; March, Salvatore T.; Kambayashi, Yahiko
2002-01-01
As pointed out in the pioneering work of [WSW99,EW01], an upper-level ontology makes it possible to evaluate the ontological correctness of a conceptual model and to develop guidelines for how the constructs of a conceptual modeling language should be used. In this paper we adopt the General Ontological Language
Lumped conceptual hydrological model for Purna river basin, India
Indian Academy of Sciences (India)
Home; Journals; Sadhana; Volume 40; Issue 8 ... Conceptual hydrological NAM model; calibration; sensitivity analysis; validation; Tapi basin; Purna catchment. ... In present study, a lumped conceptual hydrological model, NAM (MIKE11), is calibrated while optimizing the runoff simulations on the basis of minimization of ...
Characterization uncertainty and its effects on models and performance
Energy Technology Data Exchange (ETDEWEB)
Rautman, C.A.; Treadway, A.H.
1991-01-01
Geostatistical simulation is being used to develop multiple geologic models of rock properties at the proposed Yucca Mountain repository site. Because each replicate model contains the same known information, and is thus essentially statistically indistinguishable from the others, the differences between models may be thought of as representing the uncertainty in the site description. The variability among performance measures, such as ground water travel time, calculated using these replicate models therefore quantifies the uncertainty in performance that arises from uncertainty in site characterization.
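The replicate-model idea can be demonstrated with a toy ensemble. A moving average of white noise stands in for a proper conditional geostatistical simulation, and the "travel time" is just a sum of resistances along a one-dimensional path; both are invented for illustration and unrelated to the Yucca Mountain models.

```python
import numpy as np

rng = np.random.default_rng(5)
n_replicates, n_cells = 25, 200

# Each replicate is statistically equivalent (same mean and spatial correlation)
# but differs in unconditioned detail.
def replicate_logK():
    white = rng.normal(0.0, 1.0, n_cells)
    return np.convolve(white, np.ones(10) / 10, mode="same")

# Toy performance measure: travel-time proxy = sum of resistances 1/K along a path
travel_times = np.array([np.sum(1.0 / np.exp(replicate_logK()))
                         for _ in range(n_replicates)])

# Spread across replicates quantifies the performance uncertainty that stems
# from uncertainty in the site description
print(f"mean {travel_times.mean():.1f}, coefficient of variation "
      f"{travel_times.std() / travel_times.mean():.3f}")
```

The coefficient of variation across replicates is the quantity of interest here: since every field honors the same statistics, that spread is attributable to characterization uncertainty alone.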
Uncertainty in a spatial evacuation model
Mohd Ibrahim, Azhar; Venkat, Ibrahim; Wilde, Philippe De
2017-08-01
Pedestrian movements in crowd motion can be perceived in terms of agents who basically exhibit patient or impatient behavior. We model crowd motion subject to exit congestion under uncertainty conditions in a continuous space and compare the proposed model via simulations with the classical social force model. During a typical emergency evacuation scenario, agents might not be able to perceive with certainty the strategies of opponents (other agents) owing to the dynamic changes entailed by the neighborhood of opponents. In such uncertain scenarios, agents will try to update their strategy based on their own rules or their intrinsic behavior. We study the risk-seeking, risk-averse and risk-neutral behaviors of such agents via certain game theory notions. We found that risk-averse agents tend to achieve faster evacuation times whenever the time delay in conflicts is longer. The results of our simulations are also consistent with previous work and confirm that the evacuation time of agents becomes shorter once mutual cooperation among agents is achieved. Although the impatient strategy appears to be the rational strategy that might lead to faster evacuation times, our study shows that the more impatient the agents are, the slower the egress time.
Conceptualizing Telehealth in Nursing Practice: Advancing a Conceptual Model to Fill a Virtual Gap.
Nagel, Daniel A; Penner, Jamie L
2016-03-01
Increasingly nurses use various telehealth technologies to deliver health care services; however, there has been a lag in research and generation of empirical knowledge to support nursing practice in this expanding field. One challenge to generating knowledge is a gap in development of a comprehensive conceptual model or theoretical framework to illustrate relationships of concepts and phenomena inherent to adoption of a broad range of telehealth technologies to holistic nursing practice. A review of the literature revealed eight published conceptual models, theoretical frameworks, or similar entities applicable to nursing practice. Many of these models focus exclusively on use of telephones and four were generated from qualitative studies, but none comprehensively reflect complexities of bridging nursing process and elements of nursing practice into use of telehealth. The purpose of this article is to present a review of existing conceptual models and frameworks, discuss predominant themes and features of these models, and present a comprehensive conceptual model for telehealth nursing practice synthesized from this literature for consideration and further development. This conceptual model illustrates characteristics of, and relationships between, dimensions of telehealth practice to guide research and knowledge development in provision of holistic person-centered care delivery to individuals by nurses through telehealth technologies. © The Author(s) 2015.
Identification and communication of uncertainties of phenomenological models in PSA
International Nuclear Information System (INIS)
Pulkkinen, U.; Simola, K.
2001-11-01
This report aims at presenting a view on the uncertainty analysis of phenomenological models, with an emphasis on the identification and documentation of the various types of uncertainties and assumptions in the modelling of the phenomena. In an uncertainty analysis, it is essential to include and document all unclear issues in order to obtain maximal coverage of unresolved issues. This holds independently of the nature or type of the issues. A classification of uncertainties is needed in the decomposition of the problem, and it helps in the identification of means for uncertainty reduction. Further, enhanced documentation serves to evaluate the applicability of the results to various risk-informed applications. (au)
Aerosol model selection and uncertainty modelling by adaptive MCMC technique
Directory of Open Access Journals (Sweden)
M. Laine
2008-12-01
Full Text Available We present a new technique for the model selection problem in atmospheric remote sensing. The technique is based on Monte Carlo sampling and allows model selection, calculation of model posterior probabilities and model averaging in a Bayesian way.
The algorithm developed here is called the Adaptive Automatic Reversible Jump Markov chain Monte Carlo method (AARJ). It uses the Markov chain Monte Carlo (MCMC) technique and its extension called Reversible Jump MCMC. Both of these techniques have been used extensively in statistical parameter estimation problems in a wide range of applications since the late 1990s. The novel feature of our algorithm is that it is fully automatic and easy to use.
We show how the AARJ algorithm can be implemented and used for model selection and averaging, and to directly incorporate the model uncertainty. We demonstrate the technique by applying it to the statistical inversion problem of gas profile retrieval for the GOMOS instrument on board the ENVISAT satellite. Four simple models are used simultaneously to describe the dependence of the aerosol cross-sections on wavelength. During the AARJ estimation all the models are used, and we obtain a probability distribution characterizing how probable each model is. By using model averaging, the uncertainty related to selecting the aerosol model can be taken into account in assessing the uncertainty of the estimates.
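The model-averaging step can be conveyed with a much simpler stand-in for the RJMCMC machinery: BIC weights as an approximation to posterior model probabilities under equal priors. The synthetic "cross-section vs wavelength" data and the polynomial candidate models below are invented and are not the GOMOS retrieval models.

```python
import numpy as np

rng = np.random.default_rng(6)

# Synthetic cross-section vs wavelength data (illustrative, not GOMOS retrievals)
wl = np.linspace(0.3, 1.0, 40)
y = 2.0 - 1.2 * wl + rng.normal(0.0, 0.05, 40)

# Candidate models: polynomials of increasing order in wavelength
def fit(order):
    X = np.vander(wl, order + 1)
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    pred = X @ coef
    n, k = len(y), order + 1
    bic = n * np.log(np.sum((y - pred) ** 2) / n) + k * np.log(n)
    return pred, bic

preds, bics = zip(*(fit(m) for m in range(4)))
bics = np.asarray(bics)

# BIC weights approximate posterior model probabilities under equal priors
w = np.exp(-0.5 * (bics - bics.min()))
w /= w.sum()
averaged = sum(wi * p for wi, p in zip(w, preds))   # model-averaged prediction
print("posterior model weights:", np.round(w, 3))
```

As in the abstract, the weighted average carries model-choice uncertainty into the final estimate; the RJMCMC approach of the paper additionally samples parameters and model indicators jointly rather than relying on the BIC approximation.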
Michalik, Thomas; Multsch, Sebastian; Frede, Hans-Georg; Breuer, Lutz
2016-04-01
~100 mm for SWAP and <10 mm for AquaCrop. Both findings show the strong variation among model structures and its effect on simulation performance. In general, the results of this study indicate a greater impact of conceptual than parameter uncertainty and demonstrate the need for further research concerning water balance modeling for irrigation management.
Energy Technology Data Exchange (ETDEWEB)
Follin, Sven (SF GeoLogic AB, Taeby (Sweden)); Hartley, Lee; Jackson, Peter; Roberts, David (Serco TAP (United Kingdom)); Marsic, Niko (Kemakta Konsult AB, Stockholm (Sweden))
2008-05-15
Three versions of a site descriptive model (SDM) have been completed for the Forsmark area. Version 0 established the state of knowledge prior to the start of the site investigation programme. Version 1.1 was essentially a training exercise and was completed during 2004. Version 1.2 was a preliminary site description and concluded the initial site investigation work (ISI) in June 2005. Three modelling stages are planned for the complete site investigation work (CSI). These are labelled stage 2.1, 2.2 and 2.3, respectively. An important component of each of these stages is to address and continuously try to resolve discipline-specific uncertainties of importance for repository engineering and safety assessment. Stage 2.1 included an updated geological model for Forsmark and aimed to provide a feedback from the modelling working group to the site investigation team to enable completion of the site investigation work. Stage 2.2 described the conceptual understanding and the numerical modelling of the bedrock hydrogeology in the Forsmark area based on data freeze 2.2. The present report describes the modelling based on data freeze 2.3, which is the final data freeze in Forsmark. In comparison, data freeze 2.3 is considerably smaller than data freeze 2.2. Therefore, stage 2.3 deals primarily with model confirmation and uncertainty analysis, e.g. verification of important hypotheses made in stage 2.2 and the role of parameter uncertainty in the numerical modelling. On the whole, the work reported here constitutes an addendum to the work reported in stage 2.2. Two changes were made to the CONNECTFLOW code in stage 2.3. These serve to: 1) improve the representation of the hydraulic properties of the regolith, and 2) improve the conditioning of transmissivity of the deformation zones against single-hole hydraulic tests. The changes to the modelling of the regolith were made to improve the consistency with models made with the MIKE SHE code, which involved the introduction
Conceptual and Numerical Models for UZ Flow and Transport
International Nuclear Information System (INIS)
Liu, H.
2000-01-01
The purpose of this Analysis/Model Report (AMR) is to document the conceptual and numerical models used for modeling of unsaturated zone (UZ) fluid (water and air) flow and solute transport processes. This is in accordance with "AMR Development Plan for U0030 Conceptual and Numerical Models for Unsaturated Zone (UZ) Flow and Transport Processes, Rev 00". The conceptual and numerical modeling approaches described in this AMR are used for models of UZ flow and transport in fractured, unsaturated rock under ambient and thermal conditions, which are documented in separate AMRs. This AMR supports the UZ Flow and Transport Process Model Report (PMR), the Near Field Environment PMR, and the following models: Calibrated Properties Model; UZ Flow Models and Submodels; Mountain-Scale Coupled Processes Model; Thermal-Hydrologic-Chemical (THC) Seepage Model; Drift Scale Test (DST) THC Model; Seepage Model for Performance Assessment (PA); and UZ Radionuclide Transport Models
A novel approach to parameter uncertainty analysis of hydrological models using neural networks
Directory of Open Access Journals (Sweden)
D. P. Solomatine
2009-07-01
Full Text Available In this study, a methodology has been developed to emulate a time-consuming Monte Carlo (MC) simulation by using an Artificial Neural Network (ANN) for the assessment of model parametric uncertainty. First, an MC simulation of a given process model is run. Then an ANN is trained to approximate the functional relationships between the input variables of the process model and the synthetic uncertainty descriptors estimated from the MC realizations. The trained ANN model encapsulates the underlying characteristics of the parameter uncertainty and can be used to predict uncertainty descriptors for new data vectors. This approach was validated by comparing the uncertainty descriptors in the verification data set with those obtained by the MC simulation. The method is applied to estimate the parameter uncertainty of a lumped conceptual hydrological model, HBV, for the Brue catchment in the United Kingdom. The results are quite promising, as the prediction intervals estimated by the ANN are reasonably accurate. The proposed techniques could be useful in real-time applications when it is not practicable to run a large number of simulations for complex hydrological models and when the forecast lead time is very short.
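The emulation idea can be sketched end-to-end in three steps: run the MC once, compute uncertainty descriptors, and fit a cheap surrogate to predict them for new inputs. For brevity a quadratic least-squares fit stands in for the ANN, and the power-law rainfall-runoff model with uncertain parameters a and b, along with all ranges, is invented (it is not HBV or the Brue catchment data).

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy process model: runoff = a * rain**b with uncertain parameters a and b
rain = rng.uniform(1.0, 10.0, 300)                  # historical input values
a = rng.uniform(0.3, 0.7, (2000, 1))                # 2000 MC parameter draws
b = rng.uniform(0.8, 1.2, (2000, 1))
mc_runs = a * rain[None, :] ** b                    # shape (2000, 300)

# Step 1: uncertainty descriptors from the MC runs (5% / 95% quantiles per input)
q05, q95 = np.quantile(mc_runs, [0.05, 0.95], axis=0)

# Step 2: train a surrogate mapping input -> descriptor (quadratic fit here,
# standing in for the trained ANN)
X = np.vander(rain, 3)
c05, *_ = np.linalg.lstsq(X, q05, rcond=None)
c95, *_ = np.linalg.lstsq(X, q95, rcond=None)

# Step 3: predict uncertainty bounds for new inputs without re-running the MC
new_rain = np.array([2.0, 5.0, 8.0])
Xn = np.vander(new_rain, 3)
lo, hi = Xn @ c05, Xn @ c95
print("emulated 90% bands:", list(zip(np.round(lo, 2), np.round(hi, 2))))
```

Once the surrogate is fitted, each new prediction of the uncertainty band costs a single matrix-vector product instead of thousands of model runs, which is the point of the emulation for short lead times.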
Scheerens, Jaap
2017-01-01
In the second chapter, a conceptual analysis of Opportunity to Learn (OTL) is given, also covering related terms such as instructional alignment and test preparation. The OTL issue is highlighted from three educational research traditions: educational effectiveness research, curriculum research and
Using Conceptual Change Theories to Model Position Concepts in Astronomy
Yang, Chih-Chiang; Hung, Jeng-Fung
2012-01-01
The roles of conceptual change and model building in science education are very important and have a profound and wide effect on teaching science. This study examines the change in children's position concepts after instruction, based on different conceptual change theories. Three classes were chosen and divided into three groups, including a…
An ontologically well-founded profile for UML conceptual models
Guizzardi, G.; Wagner, Gerd; van Sinderen, Marten J.; Guarino, Nicola; Persson, Anne; Stirna, Janis
2004-01-01
UML class diagrams can be used as a language for expressing a conceptual model of a domain. In a series of papers [1,2,3] we have been using the General Ontological Language (GOL) and its underlying upper level ontology, proposed in [4,5], to evaluate the ontological correctness of a conceptual UML
Toolkit for Conceptual Modeling (TCM): User's Guide and Reference
Dehne, F.; Wieringa, Roelf J.
1997-01-01
The Toolkit for Conceptual Modeling (TCM) is a suite of graphical editors for a number of graphical notation systems that are used in software specification methods. The notations can be used to represent the conceptual structure of the software - hence the name of the suite. This manual describes
Imprecision and Uncertainty in the UFO Database Model.
Van Gyseghem, Nancy; De Caluwe, Rita
1998-01-01
Discusses how imprecision and uncertainty are dealt with in the UFO (Uncertainty and Fuzziness in an Object-oriented) database model. Such information is expressed by means of possibility distributions, and modeled by means of the proposed concept of "role objects." The role objects model uncertain, tentative information about objects,…
Utilizing Uncertainty Multidisciplinary Design Optimization for Conceptual Design of Space Systems
Yao, W.; Guo, J.; Chen, X.; Van Tooren, M.
2010-01-01
With progress of space technology and increase of space mission demand, requirements for robustness and reliability of space systems are ever-increasing. For the whole space mission life cycle, the most important decisions are made in the conceptual design phase, so it is very crucial to take
Incorporating model uncertainty into optimal insurance contract design
Pflug, G.; Timonina-Farkas, A.; Hochrainer-Stigler, S.
2017-01-01
In stochastic optimization models, the optimal solution heavily depends on the selected probability model for the scenarios. However, the scenario models are typically chosen on the basis of statistical estimates and are therefore subject to model error. We demonstrate here how the model uncertainty can be incorporated into the decision making process. We use a nonparametric approach for quantifying the model uncertainty and a minimax setup to find model-robust solutions. The method is illust...
Multi-scenario modelling of uncertainty in stochastic chemical systems
International Nuclear Information System (INIS)
Evans, R. David; Ricardez-Sandoval, Luis A.
2014-01-01
Uncertainty analysis has not been well studied at the molecular scale, despite extensive knowledge of uncertainty in macroscale systems. The ability to predict the effect of uncertainty allows for robust control of small-scale systems such as nanoreactors, surface reactions, and gene toggle switches. However, it is difficult to model uncertainty in such chemical systems as they are stochastic in nature and require a large computational cost. To address this issue, a new model of uncertainty propagation in stochastic chemical systems, based on the Chemical Master Equation, is proposed in the present study. The uncertain solution is approximated by a composite state comprised of the averaged effect of samples from the uncertain parameter distributions. This model is then used to study the effect of uncertainty on an isomerization system and a two-gene regulation network called a repressilator. The results of this model show that uncertainty in stochastic systems is dependent on both the uncertain distribution and the system under investigation. -- Highlights: •A method to model uncertainty on stochastic systems was developed. •The method is based on the Chemical Master Equation. •Uncertainty in an isomerization reaction and a gene regulation network was modelled. •Effects were significant and dependent on the uncertain input and reaction system. •The model was computationally more efficient than Kinetic Monte Carlo
Analytic uncertainty and sensitivity analysis of models with input correlations
Zhu, Yueying; Wang, Qiuping A.; Li, Wei; Cai, Xu
2018-03-01
Probabilistic uncertainty analysis is a common means of evaluating mathematical models. In mathematical modeling, the uncertainty in input variables is specified through distribution laws. Its contribution to the uncertainty in the model response is usually analyzed by assuming that the input variables are independent of each other. However, correlated parameters often occur in practical applications. In the present paper, an analytic method is developed for the uncertainty and sensitivity analysis of models in the presence of input correlations. With the method, it is straightforward to identify the importance of the independence and correlations of input variables in determining the model response. This allows one to decide whether or not the input correlations should be considered in practice. Numerical examples demonstrate the effectiveness and validity of our analytic method in the analysis of general models. A practical application of the method to the uncertainty and sensitivity analysis of a deterministic HIV model is also presented.
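The role of the correlation term can be made explicit with a first-order (delta-method) propagation, Var(y) ≈ gᵀΣg, where g is the model gradient at the mean and Σ the input covariance. The product model y = a·b and all moments below are assumptions for illustration, not the paper's analytic method:

```python
# First-order uncertainty propagation with and without input correlation.
# Toy model (assumed for illustration): y = a * b
a_mean, b_mean = 2.0, 3.0
var_a, var_b, cov_ab = 0.1, 0.2, 0.05

# gradient of y = a*b evaluated at the mean: (dy/da, dy/db) = (b, a)
g = (b_mean, a_mean)

# independent-inputs approximation vs. the correlated analysis
var_indep = g[0] ** 2 * var_a + g[1] ** 2 * var_b
var_corr = var_indep + 2 * g[0] * g[1] * cov_ab

print(var_indep, var_corr)  # the correlation term shifts the answer
```

Comparing the two variances is exactly the kind of check that tells a modeller whether ignoring input correlations is defensible for a given application.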
Uncertainty models applied to the substation planning
Energy Technology Data Exchange (ETDEWEB)
Fontoura Filho, Roberto N. [ELETROBRAS, Rio de Janeiro, RJ (Brazil); Aires, Joao Carlos O.; Tortelly, Debora L.S. [Light Servicos de Eletricidade S.A., Rio de Janeiro, RJ (Brazil)
1994-12-31
The selection of reinforcements for a power system expansion becomes a difficult task in an environment of uncertainties. These uncertainties can be classified according to their sources as exogenous and endogenous. The endogenous uncertainty is associated with the elements of the generation, transmission and distribution systems. The exogenous uncertainty is associated with external aspects, such as financial resources, the time required to build the installations, equipment prices and the load level. The load uncertainty is extremely sensitive to the behaviour of economic conditions. Although uncertainty cannot be removed completely, the endogenous component can be treated conveniently and the exogenous component can be compensated. This paper describes an uncertainty treatment methodology and a practical application to a group of substations belonging to LIGHT, the Rio de Janeiro electric utility. The equipment performance uncertainty is treated by adopting a probabilistic approach. The uncertainty associated with load growth is considered by using technical analysis of scenarios and choice criteria based on Decision Theory. In this paper, the Savage Method and the Fuzzy Set Method were used in order to select the best medium-term reinforcement plan. (author) 7 refs., 4 figs., 6 tabs.
Aspects of uncertainty analysis in accident consequence modeling
International Nuclear Information System (INIS)
Travis, C.C.; Hoffman, F.O.
1981-01-01
Mathematical models are frequently used to determine probable dose to man from an accidental release of radionuclides by a nuclear facility. With increased emphasis on the accuracy of these models, the incorporation of uncertainty analysis has become one of the most crucial and sensitive components in evaluating the significance of model predictions. In the present paper, we address three aspects of uncertainty in models used to assess the radiological impact to humans: uncertainties resulting from the natural variability in human biological parameters; the propagation of parameter variability by mathematical models; and comparison of model predictions to observational data
Appropriate spatial scales to achieve model output uncertainty goals
Booij, Martijn J.; Melching, Charles S.; Chen, Xiaohong; Chen, Yongqin; Xia, Jun; Zhang, Hailun
2008-01-01
Appropriate spatial scales of hydrological variables were determined using an existing methodology based on a balance in uncertainties from model inputs and parameters extended with a criterion based on a maximum model output uncertainty. The original methodology uses different relationships between
Estimated Frequency Domain Model Uncertainties used in Robust Controller Design
DEFF Research Database (Denmark)
Tøffner-Clausen, S.; Andersen, Palle; Stoustrup, Jakob
1994-01-01
This paper deals with the combination of system identification and robust controller design. Recent results on estimation of frequency domain model uncertainty are…
Modeling of uncertainty in atmospheric transport system using hybrid method
International Nuclear Information System (INIS)
Pandey, M.; Ranade, Ashok; Brij Kumar; Datta, D.
2012-01-01
Atmospheric dispersion models are routinely used at nuclear and chemical plants to estimate exposure of members of the public and occupational workers due to release of hazardous contaminants into the atmosphere. Atmospheric dispersion is a stochastic phenomenon and, in general, the concentration of the contaminant estimated at a given time and at a predetermined location downwind of a source cannot be predicted precisely. Uncertainty in atmospheric dispersion model predictions is associated with: 'data' or 'parameter' uncertainty, resulting from errors in the data used to execute and evaluate the model, uncertainties in empirical model parameters, and initial and boundary conditions; 'model' or 'structural' uncertainty, arising from inaccurate treatment of dynamical and chemical processes, approximate numerical solutions, and internal model errors; and 'stochastic' uncertainty, which results from the turbulent nature of the atmosphere as well as from the unpredictability of human activities related to emissions. The possibility theory based on fuzzy measures has been proposed in recent years as an alternative approach to addressing the knowledge uncertainty of a model in situations where the available information is too vague to represent the parameters statistically. The paper presents a novel approach (called the Hybrid Method) to model knowledge uncertainty in a physical system by a combination of probabilistic and possibilistic representation of parametric uncertainties. As a case study, the proposed approach is applied to estimating the ground-level concentration of a hazardous contaminant in air due to atmospheric releases through the stack (chimney) of a nuclear plant. The application illustrates the potential of the proposed approach. (author)
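The hybrid idea — Monte Carlo sampling over probabilistic parameters combined with interval (alpha-cut) analysis over possibilistic ones — can be illustrated with a toy calculation. The concentration model C = Q/(u·s), the normal distribution for the release rate Q, and the triangular fuzzy dispersion factor s are all assumptions for illustration, not the paper's plant model:

```python
import random

random.seed(1)

def conc(Q, u, s):
    # toy ground-level concentration model (assumed): release rate / dilution
    return Q / (u * s)

u = 4.0                  # wind speed, treated as known
alphas = [0.0, 0.5, 1.0]
results = {}
for alpha in alphas:
    # alpha-cut of a triangular fuzzy dispersion factor: support [1, 3], core 2
    s_lo = 1.0 + alpha * 1.0
    s_hi = 3.0 - alpha * 1.0
    lows, highs = [], []
    for _ in range(500):
        # probabilistic layer: sample the release rate
        Q = random.gauss(100.0, 10.0)
        # possibilistic layer: interval bounds over the alpha-cut
        lows.append(conc(Q, u, s_hi))   # conc is decreasing in s -> min at s_hi
        highs.append(conc(Q, u, s_lo))  # max at s_lo
    results[alpha] = (sum(lows) / len(lows), sum(highs) / len(highs))

# at alpha = 1 the fuzzy interval collapses to the core value s = 2
print(results[1.0])
```

Each alpha level thus yields an interval-valued summary of the probabilistic output, which is the basic shape of a hybrid probabilistic-possibilistic result.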
The Model Vision Project: A Conceptual Framework for Service Delivery
Bourgeault, Stanley E.; And Others
1977-01-01
Described are the conceptualization, implementation, and results to date of the George Peabody College for Teachers Model Center for Severely Handicapped Multi-impaired Children with Visual Impairment as a Primary Handicapping Condition. (Author/IM)
Investigating the Propagation of Meteorological Model Uncertainty for Tracer Modeling
Lopez-Coto, I.; Ghosh, S.; Karion, A.; Martin, C.; Mueller, K. L.; Prasad, K.; Whetstone, J. R.
2016-12-01
The North-East Corridor project aims to use a top-down inversion method to quantify sources of Greenhouse Gas (GHG) emissions in the urban areas of Washington DC and Baltimore at approximately 1 km² resolution. The aim of this project is to help establish reliable measurement methods for quantifying and validating GHG emissions independently of the inventory methods typically used to guide mitigation efforts. Since inversion methods depend strongly on atmospheric transport modeling, analyzing the uncertainties in the meteorological fields and their propagation through the sensitivities of observations to surface fluxes (footprints) is a fundamental step. To this end, six configurations of the Weather Research and Forecasting Model (WRF-ARW) version 3.8 were used to generate an ensemble of meteorological simulations. Specifically, we used 4 planetary boundary layer parameterizations (YSU, MYNN2, BOULAC, QNSE), 2 sources of initial and boundary conditions (NARR and HRRR) and 1 configuration including the building energy parameterization (BEP) urban canopy model. The simulations were compared with more than 150 meteorological surface stations, a wind profiler and radiosondes for a month (February) in 2016 to account for the uncertainties and the ensemble spread for wind speed, direction and mixing height. In addition, we used the Stochastic Time-Inverted Lagrangian Transport model (STILT) to derive the sensitivity of 12 hypothetical observations to surface emissions (footprints) with each WRF configuration. The footprints and integrated sensitivities were compared and the resulting uncertainties estimated.
Spatial Uncertainty Model for Visual Features Using a Kinect™ Sensor
Directory of Open Access Journals (Sweden)
Jae-Han Park
2012-06-01
This study proposes a mathematical uncertainty model for the spatial measurement of visual features using Kinect™ sensors. This model can provide qualitative and quantitative analysis for the utilization of Kinect™ sensors as 3D perception sensors. In order to achieve this objective, we derived the propagation relationship of the uncertainties between the disparity image space and the real Cartesian space with the mapping function between the two spaces. Using this propagation relationship, we obtained the mathematical model for the covariance matrix of the measurement error, which represents the uncertainty for spatial position of visual features from Kinect™ sensors. In order to derive the quantitative model of spatial uncertainty for visual features, we estimated the covariance matrix in the disparity image space using collected visual feature data. Further, we computed the spatial uncertainty information by applying the covariance matrix in the disparity image space and the calibrated sensor parameters to the proposed mathematical model. This spatial uncertainty model was verified by comparing the uncertainty ellipsoids for spatial covariance matrices and the distribution of scattered matching visual features. We expect that this spatial uncertainty model and its analyses will be useful in various Kinect™ sensor applications.
Spatial uncertainty model for visual features using a Kinect™ sensor.
Park, Jae-Han; Shin, Yong-Deuk; Bae, Ji-Hun; Baeg, Moon-Hong
2012-01-01
This study proposes a mathematical uncertainty model for the spatial measurement of visual features using Kinect™ sensors. This model can provide qualitative and quantitative analysis for the utilization of Kinect™ sensors as 3D perception sensors. In order to achieve this objective, we derived the propagation relationship of the uncertainties between the disparity image space and the real Cartesian space with the mapping function between the two spaces. Using this propagation relationship, we obtained the mathematical model for the covariance matrix of the measurement error, which represents the uncertainty for spatial position of visual features from Kinect™ sensors. In order to derive the quantitative model of spatial uncertainty for visual features, we estimated the covariance matrix in the disparity image space using collected visual feature data. Further, we computed the spatial uncertainty information by applying the covariance matrix in the disparity image space and the calibrated sensor parameters to the proposed mathematical model. This spatial uncertainty model was verified by comparing the uncertainty ellipsoids for spatial covariance matrices and the distribution of scattered matching visual features. We expect that this spatial uncertainty model and its analyses will be useful in various Kinect™ sensor applications.
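The propagation relationship described above (covariance in disparity-image space mapped to Cartesian space through the Jacobian of the mapping function) can be sketched numerically. The pinhole/disparity model and all camera constants below are assumptions for illustration, not the paper's calibrated Kinect™ parameters:

```python
# First-order covariance propagation Sigma_xyz = J * Sigma_uvd * J^T from
# disparity-image coordinates (u, v, d) to Cartesian (x, y, z).
f, b = 580.0, 0.075    # focal length [px] and baseline [m] (assumed values)
cx, cy = 320.0, 240.0  # principal point (assumed)

def jacobian(u, v, d):
    # mapping (assumed pinhole/disparity model): z = f*b/d, x = (u-cx)*z/f,
    # y = (v-cy)*z/f; return the 3x3 Jacobian d(x,y,z)/d(u,v,d)
    z = f * b / d
    dz_dd = -f * b / d ** 2
    return [
        [z / f, 0.0, (u - cx) * dz_dd / f],
        [0.0, z / f, (v - cy) * dz_dd / f],
        [0.0, 0.0, dz_dd],
    ]

def propagate(J, S):
    # compute J S J^T for 3x3 matrices stored as nested lists
    JS = [[sum(J[i][k] * S[k][j] for k in range(3)) for j in range(3)] for i in range(3)]
    return [[sum(JS[i][k] * J[j][k] for k in range(3)) for j in range(3)] for i in range(3)]

S_uvd = [[0.25, 0, 0], [0, 0.25, 0], [0, 0, 0.5]]  # image-space variances (assumed)
for d in (30.0, 10.0):  # nearer vs farther feature (larger vs smaller disparity)
    C = propagate(jacobian(400.0, 260.0, d), S_uvd)
    print(d, C[2][2])  # depth variance grows rapidly as disparity shrinks
```

The resulting 3×3 covariance is what an uncertainty ellipsoid would be drawn from; the quadratic growth of depth variance with range is the characteristic behaviour of disparity sensors.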
Dimensionality reduction for uncertainty quantification of nuclear engineering models.
Energy Technology Data Exchange (ETDEWEB)
Roderick, O.; Wang, Z.; Anitescu, M. (Mathematics and Computer Science)
2011-01-01
The task of uncertainty quantification consists of relating the available information on uncertainties in the model setup to the resulting variation in the outputs of the model. Uncertainty quantification plays an important role in complex simulation models of nuclear engineering, where better understanding of uncertainty results in greater confidence in the model and in the improved safety and efficiency of engineering projects. In our previous work, we have shown that the effect of uncertainty can be approximated by polynomial regression with derivatives (PRD): a hybrid regression method that uses first-order derivatives of the model output as additional fitting conditions for a polynomial expansion. Numerical experiments have demonstrated the advantage of this approach over classical methods of uncertainty analysis: in precision, computational efficiency, or both. To obtain derivatives, we used automatic differentiation (AD) on the simulation code; hand-coded derivatives are acceptable for simpler models. We now present improvements on the method. We use a tuned version of the method of snapshots, a technique based on proper orthogonal decomposition (POD), to set up the reduced order representation of essential information on uncertainty in the model inputs. The automatically obtained sensitivity information is required to set up the method. Dimensionality reduction in combination with PRD allows analysis on a larger dimension of the uncertainty space (>100), at modest computational cost.
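The core idea of PRD — using derivatives of the model output as additional fitting conditions for a polynomial surrogate — can be shown in a minimal form. This is an interpolation special case assumed for illustration (one evaluation plus its derivative, plus one extra evaluation, pin down a quadratic), not the paper's regression or POD machinery:

```python
# Quadratic surrogate fitted from a model value, its derivative, and one
# additional value: derivative information replaces one model evaluation.
def model(x):
    # stand-in for an expensive simulation output (assumed quadratic)
    return 1.0 + 2.0 * x + 0.5 * x * x

def model_deriv(x):
    # derivative of the model, e.g. obtained via automatic differentiation
    return 2.0 + x

# fitting conditions: value and slope at x = 0, plus one value at x = 1
c0 = model(0.0)
c1 = model_deriv(0.0)
c2 = model(1.0) - c0 - c1
surrogate = lambda x: c0 + c1 * x + c2 * x * x

print(surrogate(0.7), model(0.7))  # surrogate matches the quadratic model
```

For higher-dimensional inputs the same trick pays off more strongly: each gradient evaluation supplies as many fitting conditions as there are inputs, which is why PRD combined with dimensionality reduction scales to large uncertainty spaces.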
Comparison of evidence theory and Bayesian theory for uncertainty modeling
International Nuclear Information System (INIS)
Soundappan, Prabhu; Nikolaidis, Efstratios; Haftka, Raphael T.; Grandhi, Ramana; Canfield, Robert
2004-01-01
This paper compares Evidence Theory (ET) and Bayesian Theory (BT) for uncertainty modeling and decision under uncertainty, when the evidence about uncertainty is imprecise. The basic concepts of ET and BT are introduced and the ways these theories model uncertainties, propagate them through systems and assess the safety of these systems are presented. ET and BT approaches are demonstrated and compared on challenge problems involving an algebraic function whose input variables are uncertain. The evidence about the input variables consists of intervals provided by experts. It is recommended that a decision-maker compute both the Bayesian probabilities of the outcomes of alternative actions and their plausibility and belief measures when evidence about uncertainty is imprecise, because this helps assess the importance of imprecision and the value of additional information. Finally, the paper presents and demonstrates a method for testing approaches for decision under uncertainty in terms of their effectiveness in making decisions
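The Evidence Theory side of such a comparison — belief and plausibility of an outcome computed from expert intervals pushed through an algebraic function — can be illustrated in a few lines. The response function g, the intervals, the equal masses, and the safety threshold are all assumptions for illustration, not the paper's challenge problems:

```python
# Dempster-Shafer belief/plausibility of a "safe" event from interval evidence.
intervals = [(1.0, 3.0), (2.0, 5.0), (4.0, 6.0)]  # expert intervals for input x
mass = 1.0 / len(intervals)                        # equal masses (assumed)

def g(x):
    # algebraic response function (assumed); monotone increasing for x > 0
    return x * x

threshold = 16.0  # event: the system is safe if g(x) <= threshold

def interval_image(lo, hi):
    # image of [lo, hi] under the monotone response g
    return g(lo), g(hi)

# belief: mass of intervals whose image lies entirely inside the event
bel = sum(mass for lo, hi in intervals if interval_image(lo, hi)[1] <= threshold)
# plausibility: mass of intervals whose image intersects the event
pl = sum(mass for lo, hi in intervals if interval_image(lo, hi)[0] <= threshold)
print(bel, pl)  # belief <= plausibility; the gap measures imprecision
```

A Bayesian analysis of the same evidence would instead place a distribution (e.g. uniform) over each interval and report a single probability, which necessarily falls between the belief and plausibility bounds.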
Uncertainty shocks in a model of effective demand
Bundick, Brent; Basu, Susanto
2014-01-01
Can increased uncertainty about the future cause a contraction in output and its components? An identified uncertainty shock in the data causes significant declines in output, consumption, investment, and hours worked. Standard general-equilibrium models with flexible prices cannot reproduce this comovement. However, uncertainty shocks can easily generate comovement with countercyclical markups through sticky prices. Monetary policy plays a key role in offsetting the negative impact of uncert...
Uncertainty modelling of atmospheric dispersion by stochastic ...
Indian Academy of Sciences (India)
… sensitivity and uncertainty of atmospheric dispersion using fuzzy set theory can be found in Chutia et al (2013). … uncertainties have been presented, which will facilitate the decision makers in the said field to take a decision on the quality of the air if … Annals of Fuzzy Mathematics and Informatics 5(1): 213–22. Chutia R, Mahanta S …
Modeling uncertainty in requirements engineering decision support
Feather, Martin S.; Maynard-Zhang, Pedrito; Kiper, James D.
2005-01-01
One inherent characteristic of requirements engineering is a lack of certainty during this early phase of a project. Nevertheless, decisions about requirements must be made in spite of this uncertainty. Here we describe the context in which we are exploring this, and some initial work to support elicitation of uncertain requirements and to deal with the combination of such information from multiple stakeholders.
[Application of an uncertainty model for fibromyalgia].
Triviño Martínez, Ángeles; Solano Ruiz, M Carmen; Siles González, José
2016-04-01
To explore the experiences of women diagnosed with fibromyalgia, applying the Theory of Uncertainty proposed by M. Mishel. A qualitative study was conducted using a phenomenological approach at a patients' association in the province of Alicante between June 2012 and November 2013. A total of 14 women diagnosed with fibromyalgia, aged between 45 and 65 years, participated in the study as volunteers. Information was generated through structured interviews that were recorded and transcribed, after a confidentiality pledge and informed consent. Content analysis was performed by extracting different categories according to the proposed theory. The study patients perceive a high level of uncertainty related to the difficulty of dealing with symptoms, uncertainty about diagnosis, and treatment complexity. Moreover, their ability to cope with the disease is influenced by social support, relationships with health professionals, and the help and information provided by patient associations. Health professionals must provide clear information on the pathology to fibromyalgia sufferers: the greater the patients' knowledge of their disease and the better the quality of the information provided, the less anxiety and uncertainty is reported in the experience of the disease. Likewise, patient associations should have health professionals available in order to avoid bias in the information and to provide advice based on scientific evidence. Copyright © 2015 Elsevier España, S.L.U. All rights reserved.
Uncertainty modelling of atmospheric dispersion by stochastic ...
Indian Academy of Sciences (India)
discharges and related regulated pollution criteria for the marine environment. An Integrated. Simulation-Assessment Approach (ISAA) (Yang et al 2010) is developed to systematically tackle multiple uncertainties associated with hydrocarbon contaminant transport in subsurface and assessment of carcinogenic health risk ...
Reservoir management under geological uncertainty using fast model update
Hanea, R.; Evensen, G.; Hustoft, L.; Ek, T.; Chitu, A.; Wilschut, F.
2015-01-01
Statoil is implementing "Fast Model Update (FMU)," an integrated and automated workflow for reservoir modeling and characterization. FMU connects all steps and disciplines from seismic depth conversion to prediction and reservoir management taking into account relevant reservoir uncertainty. FMU
Dynamic modeling of predictive uncertainty by regression on absolute errors
Pianosi, F.; Raso, L.
2012-01-01
Uncertainty of hydrological forecasts represents valuable information for water managers and hydrologists. This explains the popularity of probabilistic models, which provide the entire distribution of the hydrological forecast. Nevertheless, many existing hydrological models are deterministic and
Assessing Groundwater Model Uncertainty for the Central Nevada Test Area
International Nuclear Information System (INIS)
Pohll, Greg; Pohlmann, Karl; Hassan, Ahmed; Chapman, Jenny; Mihevc, Todd
2002-01-01
The purpose of this study is to quantify the flow and transport model uncertainty for the Central Nevada Test Area (CNTA). Six parameters were identified as uncertain, including the specified-head boundary conditions used in the flow model, the spatial distribution of the underlying welded tuff unit, effective porosity, sorption coefficients, matrix diffusion coefficient, and the geochemical release function which describes nuclear glass dissolution. The parameter uncertainty was described by assigning prior statistical distributions to each of these parameters. Standard Monte Carlo techniques were used to sample from the parameter distributions to determine the full prediction uncertainty. Additional analysis was performed to determine the most cost-beneficial characterization activities. The maximum radius of the tritium and strontium-90 contaminant boundary was used as the output metric for evaluation of prediction uncertainty. The results indicate that combining all of the uncertainty in the parameters listed above propagates to a prediction uncertainty in the maximum radius of the contaminant boundary of 234 to 308 m and 234 to 302 m for tritium and strontium-90, respectively. Although the uncertainty in the input parameters is large, the prediction uncertainty in the contaminant boundary is relatively small. This is primarily due to the small transport velocities, such that large changes in the uncertain input parameters cause only small changes in the contaminant boundary. This suggests that the model is suitable in terms of predictive capability for contaminant boundary delineation.
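The basic machinery described above — prior distributions on uncertain parameters, Monte Carlo sampling, and a prediction interval on the contaminant-boundary metric — can be sketched with a toy transport model. The radius formula r = v·t/R and both prior distributions below are assumptions for illustration, far simpler than the CNTA flow and transport model:

```python
import random

random.seed(2)

# Monte Carlo propagation of prior parameter uncertainty to a prediction
# metric: a toy contaminant-boundary radius r = v * t / R, with transport
# velocity v and retardation factor R sampled from assumed priors.
t = 1000.0  # elapsed time [yr]
radii = []
for _ in range(5000):
    v = random.lognormvariate(-1.0, 0.3)  # transport velocity [m/yr] (assumed prior)
    R = random.uniform(1.0, 3.0)          # sorption retardation factor (assumed prior)
    radii.append(v * t / R)

radii.sort()
lo = radii[int(0.025 * len(radii))]
hi = radii[int(0.975 * len(radii))]
print(lo, hi)  # 95% prediction interval on the boundary radius
```

Ranking which prior contributes most to the interval width (e.g. by resampling with one parameter fixed) is the kind of follow-up analysis used to prioritize cost-beneficial characterization activities.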
Development of Property Models with Uncertainty Estimate for Process Design under Uncertainty
DEFF Research Database (Denmark)
Hukkerikar, Amol; Sarup, Bent; Abildskov, Jens
that can also provide estimates of uncertainties in predictions of properties and their effects on process design becomes necessary. For instance, the accuracy of the design of a distillation column to achieve a given product purity depends on many pure compound properties, such as critical pressure … standard enthalpy of formation, standard enthalpy of fusion, standard enthalpy of vaporization at 298 K and at the normal boiling point, entropy of vaporization at the normal boiling point, surface tension at 298 K, viscosity at 300 K, flash point, auto-ignition temperature, Hansen solubility parameters, Hildebrand solubility … The comparison of model prediction uncertainties with the reported range of measurement uncertainties is presented for the properties with related data available. The application of the developed methodology to quantify the effect of these uncertainties on the design of different unit operations (distillation column …
Incorporating parametric uncertainty into population viability analysis models
McGowan, Conor P.; Runge, Michael C.; Larson, Michael A.
2011-01-01
Uncertainty in parameter estimates from sampling variation or expert judgment can introduce substantial uncertainty into ecological predictions based on those estimates. However, in standard population viability analyses, one of the most widely used tools for managing plant, fish and wildlife populations, parametric uncertainty is often ignored in or discarded from model projections. We present a method for explicitly incorporating this source of uncertainty into population models to fully account for risk in management and decision contexts. Our method involves a two-step simulation process where parametric uncertainty is incorporated into the replication loop of the model and temporal variance is incorporated into the loop for time steps in the model. Using the piping plover, a federally threatened shorebird in the USA and Canada, as an example, we compare abundance projections and extinction probabilities from simulations that exclude and include parametric uncertainty. Although final abundance was very low for all sets of simulations, estimated extinction risk was much greater for the simulation that incorporated parametric uncertainty in the replication loop. Decisions about species conservation (e.g., listing, delisting, and jeopardy) might differ greatly depending on the treatment of parametric uncertainty in population models.
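The two-step simulation structure described above maps directly onto two nested loops: parametric uncertainty is drawn once per replicate in the outer loop, while temporal (environmental) variance is drawn each year in the inner loop. The growth-rate model and all numbers below are hypothetical illustrations, not the piping plover parameters:

```python
import math
import random

random.seed(3)

def extinction_prob(parametric=True, reps=2000, years=50, n0=100):
    # two-loop population projection with a quasi-extinction threshold
    extinct = 0
    for _ in range(reps):
        # outer (replication) loop: draw the mean growth rate once per
        # replicate from its sampling distribution (parametric uncertainty)
        mu = random.gauss(0.0, 0.05) if parametric else 0.0
        n = n0
        for _ in range(years):
            # inner (time-step) loop: year-to-year environmental variation
            r = random.gauss(mu, 0.15)
            n *= math.exp(r)
            if n < 2:  # quasi-extinction threshold (assumed)
                extinct += 1
                break
    return extinct / reps

p_without = extinction_prob(parametric=False)
p_with = extinction_prob(parametric=True)
print(p_without, p_with)  # parametric uncertainty inflates the estimated risk
```

Because extinction risk is a tail probability, even modest parametric uncertainty in the mean growth rate can raise it substantially, which is the pattern the abstract reports for the plover simulations.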
Development of a Prototype Model-Form Uncertainty Knowledge Base
Green, Lawrence L.
2016-01-01
Uncertainties are generally classified as either aleatory or epistemic. Aleatory uncertainties are those attributed to random variation, either naturally or through manufacturing processes. Epistemic uncertainties are generally attributed to a lack of knowledge. One type of epistemic uncertainty is called model-form uncertainty. The term model-form means that among the choices to be made during a design process within an analysis, there are different forms of the analysis process, which each give different results for the same configuration at the same flight conditions. Examples of model-form uncertainties include the grid density, grid type, and solver type used within a computational fluid dynamics code, or the choice of the number and type of model elements within a structures analysis. The objectives of this work are to identify and quantify a representative set of model-form uncertainties and to make this information available to designers through an interactive knowledge base (KB). The KB can then be used during probabilistic design sessions, so as to enable the possible reduction of uncertainties in the design process through resource investment. An extensive literature search has been conducted to identify and quantify typical model-form uncertainties present within aerospace design. An initial attempt has been made to assemble the results of this literature search into a searchable KB, usable in real time during probabilistic design sessions. A concept of operations and the basic structure of a model-form uncertainty KB are described. Key operations within the KB are illustrated. Current limitations in the KB, and possible workarounds are explained.
Energy Technology Data Exchange (ETDEWEB)
Rouxelin, Pascal Nicolas [Idaho National Lab. (INL), Idaho Falls, ID (United States); Strydom, Gerhard [Idaho National Lab. (INL), Idaho Falls, ID (United States)
2016-09-01
Best-estimate plus uncertainty analysis of reactors is replacing the traditional conservative (stacked uncertainty) method for safety and licensing analysis. To facilitate uncertainty analysis applications, a comprehensive approach and methodology must be developed and applied. High temperature gas-cooled reactors (HTGRs) have several features that require techniques not used in light-water reactor analysis (e.g., coated-particle fuel design and large graphite quantities at high temperatures). The International Atomic Energy Agency has therefore launched the Coordinated Research Project on HTGR Uncertainty Analysis in Modeling to study uncertainty propagation in the HTGR analysis chain. The benchmark problem defined for the prismatic design is represented by the General Atomics Modular HTGR 350. The main focus of this report is the compilation and discussion of the results obtained for various permutations of Exercise I-2c and the use of the cross-section data in Exercise II-1a of the prismatic benchmark, which are defined as the last and first steps of the lattice and core simulation phases, respectively. The report summarizes the Idaho National Laboratory (INL) best-estimate results obtained for Exercise I-2a (fresh single-fuel block), Exercise I-2b (depleted single-fuel block), and Exercise I-2c (super cell), in addition to the first results of an investigation into cross-section generation effects for the super-cell problem. The two-dimensional deterministic code NEWT (New ESC-based Weighting Transport), included in the Standardized Computer Analyses for Licensing Evaluation (SCALE) 6.1.2 package, was used for the cross-section evaluation, and the results obtained were compared to the three-dimensional stochastic SCALE module KENO-VI. The NEWT cross-section libraries were generated for several permutations of the current benchmark super-cell geometry and were then provided as input to the Phase II core calculation of the stand-alone neutronics Exercise
Improved Wave-vessel Transfer Functions by Uncertainty Modelling
DEFF Research Database (Denmark)
Nielsen, Ulrik Dam; Fønss Bach, Kasper; Iseki, Toshio
2016-01-01
This paper deals with uncertainty modelling of wave-vessel transfer functions used to calculate or predict wave-induced responses of a ship in a seaway. Although transfer functions, in theory, can be calculated to exactly reflect the behaviour of the ship when exposed to waves, uncertainty in input...
Urban drainage models simplifying uncertainty analysis for practitioners
DEFF Research Database (Denmark)
Vezzaro, Luca; Mikkelsen, Peter Steen; Deletic, Ana
2013-01-01
There is increasing awareness about uncertainties in the modelling of urban drainage systems and, as such, many new methods for uncertainty analyses have been developed. Despite this, all available methods have limitations which restrict their widespread application among practitioners. Here, a m...
A Model-Free Definition of Increasing Uncertainty
Grant, S.; Quiggin, J.
2001-01-01
We present a definition of increasing uncertainty, independent of any notion of subjective probabilities, or of any particular model of preferences.Our notion of an elementary increase in the uncertainty of any act corresponds to the addition of an 'elementary bet' which increases consumption by a
Uncertainty modelling of critical column buckling for reinforced ...
Indian Academy of Sciences (India)
Buckling is a critical issue for structural stability in structural design. ... This study investigates the effect of material uncertainties on column design and proposes an uncertainty model for critical column buckling of reinforced concrete buildings. ... Civil Engineering Department, Suleyman Demirel University, Isparta 32260, Turkey ...
Uncertainty in a monthly water balance model using the generalized ...
Indian Academy of Sciences (India)
Uncertainty in a monthly water balance model using the generalized likelihood uncertainty estimation methodology. Diego Rivera, Yessica Rivas and Alex Godoy. Laboratory of Comparative Policy in Water Resources Management, University of Concepcion, CONICYT/FONDAP 15130015, Concepcion, Chile.
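The GLUE methodology named in this entry can be sketched in a few lines: sample parameter sets from prior ranges, score each against observations with an informal likelihood, retain only the "behavioural" sets above a threshold, and derive likelihood-weighted prediction bounds. The toy water-balance model, parameter ranges, and threshold below are illustrative assumptions, not the paper's actual model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy monthly water-balance model: runoff = a * max(P - b, 0)
# (a hypothetical stand-in for the paper's model; a and b are uncertain)
def model(P, a, b):
    return a * np.maximum(P - b, 0.0)

P = np.array([50.0, 120.0, 200.0, 80.0, 150.0, 30.0])     # monthly precipitation
obs = model(P, 0.6, 40.0) + rng.normal(0.0, 2.0, P.size)  # synthetic observations

# GLUE: sample parameter sets from uniform priors, score each with an
# informal likelihood, and keep the "behavioural" sets above a threshold.
a_s = rng.uniform(0.1, 1.0, 5000)
b_s = rng.uniform(0.0, 80.0, 5000)
sims = np.stack([model(P, a, b) for a, b in zip(a_s, b_s)])
sse = ((sims - obs) ** 2).sum(axis=1)
like = np.exp(-(sse - sse.min()) / 8.0)   # informal Gaussian-type likelihood
keep = like > 0.01                        # behavioural threshold (subjective)
w = like[keep] / like[keep].sum()         # likelihood weights

def wquantile(x, w, q):
    """Likelihood-weighted quantile of the behavioural simulations."""
    i = np.argsort(x)
    return np.interp(q, np.cumsum(w[i]) / w.sum(), x[i])

lo = wquantile(sims[keep][:, 2], w, 0.05)  # 5% bound, wettest month
hi = wquantile(sims[keep][:, 2], w, 0.95)  # 95% bound
```

The width of the 5-95% bounds depends on the behavioural threshold, which is a subjective choice and a well-known point of debate about GLUE.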
Uncertainty in Discount Models and Environmental Accounting
Directory of Open Access Journals (Sweden)
Donald Ludwig
2005-12-01
Cost-benefit analysis (CBA) is controversial for environmental issues, but is nevertheless employed by many governments and private organizations for making environmental decisions. Controversy centers on the practice of economic discounting in CBA for decisions that have substantial long-term consequences, as do most environmental decisions. Customarily, economic discounting has been calculated at a constant exponential rate, a practice that weights the present heavily in comparison with the future. Recent analyses of economic data show that the assumption of constant exponential discounting should be modified to take into account large uncertainties in long-term discount rates. A proper treatment of this uncertainty requires that we consider returns over a plausible range of assumptions about future discounting rates. When returns are averaged in this way, the schemes with the most severe discounting have a negligible effect on the average after a long period of time has elapsed. This re-examination of economic uncertainty provides support for policies that prevent or mitigate environmental damage. We examine these effects for three examples: a stylized renewable resource, management of a long-lived species (Atlantic Right Whales), and lake eutrophication.
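The averaging argument in this abstract can be illustrated numerically: when discount factors (not rates) are averaged over uncertain constant rates, the effective discount rate declines toward the lowest rate as the horizon grows, so severe-discounting schemes stop mattering at long horizons. The rates and weights below are hypothetical.

```python
import numpy as np

t = np.arange(0, 201)                       # years into the future
rates = np.array([0.01, 0.02, 0.04, 0.08])  # plausible constant rates (assumed)
weights = np.full(rates.size, 0.25)         # equal weight on each assumption

# Average the discount *factors* over the uncertain rates
factors = np.exp(-np.outer(rates, t))       # shape (n_rates, n_years)
avg = weights @ factors                     # certainty-equivalent discount factor

# Effective discount rate of the average: -d/dt log(avg).
# It starts near the weighted-mean rate and falls toward the lowest rate.
eff = -np.gradient(np.log(avg), t)
```

At year 0 the effective rate is close to the weighted mean (about 3.75% here); by year 200 it has fallen near the lowest rate, 1%.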
Uncertainties in environmental radiological assessment models and their implications
International Nuclear Information System (INIS)
Hoffman, F.O.; Miller, C.W.
1983-01-01
Environmental radiological assessments rely heavily on the use of mathematical models. The predictions of these models are inherently uncertain because these models are inexact representations of real systems. The major sources of this uncertainty are related to biases in model formulation and parameter estimation. The best approach for estimating the actual extent of over- or underprediction is model validation, a procedure that requires testing over the range of the intended realm of model application. Other approaches discussed are the use of screening procedures, sensitivity and stochastic analyses, and model comparison. The magnitude of uncertainty in model predictions is a function of the questions asked of the model and the specific radionuclides and exposure pathways of dominant importance. Estimates are made of the relative magnitude of uncertainty for situations requiring predictions of individual and collective risks for both chronic and acute releases of radionuclides. It is concluded that models developed as research tools should be distinguished from models developed for assessment applications. Furthermore, increased model complexity does not necessarily guarantee increased accuracy. To improve the realism of assessment modeling, stochastic procedures are recommended that translate uncertain parameter estimates into a distribution of predicted values. These procedures also permit the importance of model parameters to be ranked according to their relative contribution to the overall predicted uncertainty. Although confidence in model predictions can be improved through site-specific parameter estimation and increased model validation, risk factors and internal dosimetry models will probably remain important contributors to the amount of uncertainty that is irreducible
Uncertainty and error in complex plasma chemistry models
Turner, Miles M.
2015-06-01
Chemistry models that include dozens of species and hundreds to thousands of reactions are common in low-temperature plasma physics. The rate constants used in such models are uncertain, because they are obtained from some combination of experiments and approximate theories. Since the predictions of these models are a function of the rate constants, these predictions must also be uncertain. However, systematic investigations of the influence of uncertain rate constants on model predictions are rare to non-existent. In this work we examine a particular chemistry model, for helium-oxygen plasmas. This chemistry is of topical interest because of its relevance to biomedical applications of atmospheric pressure plasmas. We trace the primary sources for every rate constant in the model, and hence associate an error bar (or equivalently, an uncertainty) with each. We then use a Monte Carlo procedure to quantify the uncertainty in predicted plasma species densities caused by the uncertainty in the rate constants. Under the conditions investigated, the range of uncertainty in most species densities is a factor of two to five. However, the uncertainty can vary strongly for different species, over time, and with other plasma conditions. There are extreme (pathological) cases where the uncertainty is more than a factor of ten. One should therefore be cautious in drawing any conclusion from plasma chemistry modelling, without first ensuring that the conclusion in question survives an examination of the related uncertainty.
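A minimal sketch of the Monte Carlo procedure described here, using a toy one-species steady-state balance in place of a full helium-oxygen chemistry; the rate constants, electron density, and factor-two uncertainty are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 20000

# Toy steady-state balance for one species density n:
#   production k1 * n_e = loss k2 * n**2,  so  n = sqrt(k1 * n_e / k2)
# (a stand-in for a full He/O2 chemistry; all values are illustrative)
n_e = 1e16                   # electron density, m^-3 (held fixed here)
k1_0, k2_0 = 1e-15, 1e-13    # nominal rate constants

# Each rate constant lognormally uncertain; 2-sigma spans a factor of 2
sigma = np.log(2.0) / 2.0
k1 = k1_0 * rng.lognormal(0.0, sigma, N)
k2 = k2_0 * rng.lognormal(0.0, sigma, N)

n = np.sqrt(k1 * n_e / k2)   # Monte Carlo sample of the predicted density
uf = np.percentile(n, 97.5) / np.percentile(n, 2.5)   # spread of prediction
```

Even with only two rate constants, factor-two input uncertainties already produce a factor of two to three spread in the predicted density, consistent in spirit with the ranges the abstract reports for full chemistries.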
Bayesian models for comparative analysis integrating phylogenetic uncertainty
2012-01-01
Background: Uncertainty in comparative analyses can come from at least two sources: a) phylogenetic uncertainty in the tree topology or branch lengths, and b) uncertainty due to intraspecific variation in trait values, either due to measurement error or natural individual variation. Most phylogenetic comparative methods do not account for such uncertainties. Not accounting for these sources of uncertainty leads to false perceptions of precision (confidence intervals will be too narrow) and inflated significance in hypothesis testing (e.g. p-values will be too small). Although there is some application-specific software for fitting Bayesian models accounting for phylogenetic error, more general and flexible software is desirable. Methods: We developed models to directly incorporate phylogenetic uncertainty into a range of analyses that biologists commonly perform, using a Bayesian framework and Markov Chain Monte Carlo analyses. Results: We demonstrate applications in linear regression, quantification of phylogenetic signal, and measurement error models. Phylogenetic uncertainty was incorporated by applying a prior distribution for the phylogeny, where this distribution consisted of the posterior tree sets from Bayesian phylogenetic tree estimation programs. The models were analysed using simulated data sets, and applied to a real data set on plant traits, from rainforest plant species in Northern Australia. Analyses were performed using the free and open source software OpenBUGS and JAGS. Conclusions: Incorporating phylogenetic uncertainty through an empirical prior distribution of trees leads to more precise estimation of regression model parameters than using a single consensus tree and enables a more realistic estimation of confidence intervals. In addition, models incorporating measurement errors and/or individual variation, in one or both variables, are easily formulated in the Bayesian framework. We show that BUGS is a useful, flexible general purpose tool for
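The core idea, averaging an analysis over a distribution of phylogenies rather than conditioning on one consensus tree, can be sketched without BUGS. Below, lambda-style rescalings of a fixed covariance matrix stand in for a posterior tree set, and a GLS regression slope is pooled across them; all matrices and parameters are synthetic illustrations, not the paper's models.

```python
import numpy as np

rng = np.random.default_rng(8)
n = 20

# "Consensus" phylogenetic covariance: two clades of 10 closely related species
C0 = np.kron(np.eye(2), np.full((10, 10), 0.8)) + 0.2 * np.eye(n)

# Simulate a trait regression with phylogenetically correlated residuals
x = rng.normal(0.0, 1.0, n)
y = 2.0 * x + np.linalg.cholesky(C0) @ rng.normal(0.0, 0.5, n)

def gls_slope(C, x, y):
    """Generalized least-squares slope under residual covariance C."""
    Ci = np.linalg.inv(C)
    return (x @ Ci @ y) / (x @ Ci @ x)

# Phylogenetic uncertainty: pool the estimate over many plausible covariance
# matrices (lambda-style rescalings standing in for a posterior tree set)
lams = rng.uniform(0.6, 1.0, 200)
betas = np.array([gls_slope(l * C0 + (1.0 - l) * np.eye(n), x, y)
                  for l in lams])
beta_mean, beta_sd = betas.mean(), betas.std()
```

The spread of `betas` across sampled matrices is the extra uncertainty that a single-tree analysis would hide.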
Fuzzy techniques for subjective workload-score modeling under uncertainties.
Kumar, Mohit; Arndt, Dagmar; Kreuzfeld, Steffi; Thurow, Kerstin; Stoll, Norbert; Stoll, Regina
2008-12-01
This paper deals with the development of a computer model to estimate the subjective workload score of individuals by evaluating their heart-rate (HR) signals. The identification of a model to estimate the subjective workload score of individuals under different workload situations is too ambitious a task because different individuals (due to different body conditions, emotional states, age, gender, etc.) show different physiological responses (assessed by evaluating the HR signal) under different workload situations. This is equivalent to saying that the mathematical mappings between physiological parameters and the workload score are uncertain. Our approach to deal with the uncertainties in a workload-modeling problem consists of the following steps: 1) the uncertainties arising due to the individual variations in identifying a common model valid for all the individuals are filtered out using a fuzzy filter; 2) the uncertainties (provided by the fuzzy filter) are modeled stochastically using finite-mixture models, and this information is utilized for identifying the structure and initial parameters of a workload model; and 3) finally, the workload model parameters for an individual are identified in an online scenario using machine learning algorithms. The contribution of this paper is to propose, with a mathematical analysis, a fuzzy-based modeling technique that first filters out the uncertainties from the modeling problem, analyzes the uncertainties statistically using finite-mixture modeling, and, finally, utilizes the information about uncertainties for adapting the workload model to an individual's physiological conditions. The approach of this paper, demonstrated with the real-world medical data of 11 subjects, provides a fuzzy-based tool useful for modeling in the presence of uncertainties.
Factoring uncertainty into restoration modeling of in-situ leach uranium mines
Johnson, Raymond H.; Friedel, Michael J.
2009-01-01
Postmining restoration is one of the greatest concerns for uranium in-situ leach (ISL) mining operations. The ISL-affected aquifer needs to be returned to conditions specified in the mining permit (either premining or other specified conditions). When uranium ISL operations are completed, postmining restoration is usually achieved by injecting reducing agents into the mined zone. The objective of this process is to restore the aquifer to premining conditions by reducing the solubility of uranium and other metals in the ground water. Reactive transport modeling is a potentially useful method for simulating the effectiveness of proposed restoration techniques. While reactive transport models can be useful, they are a simplification of reality that introduces uncertainty through the model conceptualization, parameterization, and calibration processes. For this reason, quantifying the uncertainty in simulated temporal and spatial hydrogeochemistry is important for postremedial risk evaluation of metal concentrations and mobility. Quantifying the range of uncertainty in key predictions (such as uranium concentrations at a specific location) can be achieved using forward Monte Carlo or other inverse modeling techniques (trial-and-error parameter sensitivity, calibration-constrained Monte Carlo). These techniques provide simulated values of metal concentrations at specified locations that can be presented as nonlinear uncertainty limits or probability density functions. Decision makers can use these results to better evaluate environmental risk, viewing future metal concentrations as a limited range of possibilities based on a scientific evaluation of uncertainty.
Uncertainties in radioecological assessment models-Their nature and approaches to reduce them
International Nuclear Information System (INIS)
Kirchner, G.; Steiner, M.
2008-01-01
Radioecological assessment models are necessary tools for estimating the radiation exposure of humans and non-human biota. This paper focuses on factors affecting their predictive accuracy, discusses the origin and nature of the different contributions to uncertainty and variability and presents approaches to separate and quantify them. The key role of the conceptual model, notably in relation to its structure and complexity, as well as the influence of the number and type of input parameters, are highlighted. Guidelines are provided to improve the degree of reliability of radioecological models
Generalized martingale model of the uncertainty evolution of streamflow forecasts
Zhao, Tongtiegang; Zhao, Jianshi; Yang, Dawen; Wang, Hao
2013-07-01
Streamflow forecasts are dynamically updated in real-time, thus facilitating a process of forecast uncertainty evolution. Forecast uncertainty generally decreases over time and as more hydrologic information becomes available. The process of forecasting and uncertainty updating can be described by the martingale model of forecast evolution (MMFE), which formulates the total forecast uncertainty of a streamflow in one future period as the sum of forecast improvements in the intermediate periods. This study tests the assumptions, i.e., unbiasedness, Gaussianity, temporal independence, and stationarity, of MMFE using real-world streamflow forecast data. The results show that (1) real-world forecasts can be biased and tend to underestimate the actual streamflow, and (2) real-world forecast uncertainty is non-Gaussian and heavy-tailed. Based on these statistical tests, this study proposes a generalized martingale model GMMFE for the simulation of biased and non-Gaussian forecast uncertainties. The new model combines the normal quantile transform (NQT) with MMFE to formulate the uncertainty evolution of real-world streamflow forecasts. Reservoir operations based on a synthetic forecast by GMMFE illustrates that applications of streamflow forecasting facilitate utility improvements and that special attention should be focused on the statistical distribution of forecast uncertainty.
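A minimal simulation of the martingale model of forecast evolution (MMFE) under its textbook assumptions (unbiased, Gaussian, temporally independent improvements, exactly the assumptions the paper tests and finds violated in real data). The improvement standard deviations are assumed values for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)
T, N = 5, 10000    # forecast lead periods, Monte Carlo replicates

# MMFE: each period the forecast of flow in period T is revised by an
# independent zero-mean "improvement" (std devs below are assumed values)
sigma = np.array([0.5, 0.4, 0.3, 0.2, 0.1])
improvements = rng.normal(0.0, sigma, (N, T))

actual = 10.0   # realized streamflow in period T
# The forecast issued j periods into the process still misses the
# improvements that have not yet arrived:
remaining = np.cumsum(improvements[:, ::-1], axis=1)[:, ::-1]
forecasts = actual - remaining

# Total forecast uncertainty shrinks as hydrologic information accumulates
stds = forecasts.std(axis=0)   # decreasing: sqrt(0.55) down to 0.1
```

The GMMFE extension in the paper replaces the Gaussian improvements with a normal quantile transform of an empirical, possibly biased and heavy-tailed, distribution.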
Lumped conceptual hydrological model for Purna river basin, India
Indian Academy of Sciences (India)
Geographical information system (GIS)-based hydrological models are becoming increasingly useful in prediction of ... models are preferred over lumped conceptual models in prediction of runoff provided extensive data ... high values of CQOF may be ascribed to the presence of certain low-permeability soils like clay and bare ...
Conceptual model for assessment of inhalation exposure: Defining modifying factors
Tielemans, E.; Schneider, T.; Goede, H.; Tischer, M.; Warren, N.; Kromhout, H.; Tongeren, M. van; Hemmen, J. van; Cherrie, J.W.
2008-01-01
The present paper proposes a source-receptor model to schematically describe inhalation exposure to help understand the complex processes leading to inhalation of hazardous substances. The model considers a stepwise transfer of a contaminant from the source to the receptor. The conceptual model is
Meteorological Uncertainty of atmospheric Dispersion model results (MUD)
DEFF Research Database (Denmark)
Havskov Sørensen, Jens; Amstrup, Bjarne; Feddersen, Henrik
The MUD project addresses assessment of uncertainties of atmospheric dispersion model predictions, as well as optimum presentation to decision makers. Previously, it has not been possible to estimate such uncertainties quantitatively, but merely to calculate the 'most likely' dispersion scenario. However, recent developments in numerical weather prediction (NWP) include probabilistic forecasting techniques, which can be utilised also for atmospheric dispersion models. The ensemble statistical methods developed and applied to NWP models aim at describing the inherent uncertainties of the meteorological model results. These uncertainties stem from e.g. limits in meteorological observations used to initialise meteorological forecast series. By perturbing the initial state of an NWP model run in agreement with the available observational data, an ensemble of meteorological forecasts is produced.
Sensitivity and uncertainty analysis of the PATHWAY radionuclide transport model
International Nuclear Information System (INIS)
Otis, M.D.
1983-01-01
Procedures were developed for the uncertainty and sensitivity analysis of a dynamic model of radionuclide transport through human food chains. Uncertainty in model predictions was estimated by propagation of parameter uncertainties using a Monte Carlo simulation technique. Sensitivity of model predictions to individual parameters was investigated using the partial correlation coefficient of each parameter with model output. Random values produced for the uncertainty analysis were used in the correlation analysis for sensitivity. These procedures were applied to the PATHWAY model which predicts concentrations of radionuclides in foods grown in Nevada and Utah and exposed to fallout during the period of atmospheric nuclear weapons testing in Nevada. Concentrations and time-integrated concentrations of iodine-131, cesium-136, and cesium-137 in milk and other foods were investigated. 9 figs., 13 tabs
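The two procedures described, Monte Carlo propagation of parameter uncertainty and sensitivity ranking by partial correlation, can be sketched with a toy multiplicative transfer model standing in for PATHWAY; the lognormal spreads below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)
N = 5000

# Toy multiplicative food-chain transfer (deposition x intake x transfer),
# a hypothetical stand-in for PATHWAY; lognormal spreads are illustrative
X = rng.lognormal(mean=[0.0, 0.0, 0.0], sigma=[0.5, 0.3, 0.1], size=(N, 3))
y = X[:, 0] * X[:, 1] * X[:, 2]   # Monte Carlo propagation to the output

def partial_corr(X, y, i):
    """Correlation of X[:, i] with y after regressing out the other inputs."""
    A = np.column_stack([np.delete(X, i, axis=1), np.ones(len(y))])
    rx = X[:, i] - A @ np.linalg.lstsq(A, X[:, i], rcond=None)[0]
    ry = y - A @ np.linalg.lstsq(A, y, rcond=None)[0]
    return np.corrcoef(rx, ry)[0, 1]

# Sensitivity ranking: the widest input distribution dominates the output
pcc = [partial_corr(X, y, i) for i in range(3)]
```

As in the abstract, the same random sample serves both purposes: the spread of `y` quantifies prediction uncertainty, and the partial correlation coefficients rank the parameters.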
Conceptual adsorption models and open issues pertaining to performance assessment
International Nuclear Information System (INIS)
Serne, R.J.
1992-01-01
Recently several articles have been published that question the appropriateness of the distribution coefficient, Rd, concept to quantify radionuclide migration. This report addresses several distinct issues surrounding the modeling of nuclide retardation. The first section defines adsorption terminology and discusses various adsorption processes. The next section describes five commonly used adsorption conceptual models, specifically emphasizing which attributes affecting adsorption are explicitly accommodated in each model. I also review efforts to incorporate each adsorption model into performance assessment transport computer codes. The five adsorption conceptual models are (1) the constant Rd model, (2) the parametric Rd model, (3) isotherm adsorption models, (4) mass action adsorption models, and (5) surface-complexation with electrostatics models. The final section discusses the adequacy of the distribution ratio concept, the adequacy of transport calculations that rely on constant retardation factors and the status of incorporating sophisticated adsorption models into transport codes. 86 refs., 1 fig., 1 tab
Guide for developing conceptual models for ecological risk assessments
Energy Technology Data Exchange (ETDEWEB)
Suter, G.W., II
1996-05-01
Ecological conceptual models are the result of the problem formulation phase of an ecological risk assessment, which is an important component of the Remedial Investigation process. They present hypotheses of how the site contaminants might affect the site ecology. The contaminant sources, transport media, exposure routes, and endpoint receptors are presented in the form of a flow chart. This guide is for preparing the conceptual models; use of this guide will standardize the models so that they will be of high quality, useful to the assessment process, and sufficiently consistent so that connections between sources of exposure and receptors can be extended across operable units (OU). Generic conceptual models are presented for source, aquatic integrator, groundwater integrator, and terrestrial OUs.
Modeling theoretical uncertainties in phenomenological analyses for particle physics
Energy Technology Data Exchange (ETDEWEB)
Charles, Jerome [CNRS, Aix-Marseille Univ, Universite de Toulon, CPT UMR 7332, Marseille Cedex 9 (France); Descotes-Genon, Sebastien [CNRS, Univ. Paris-Sud, Universite Paris-Saclay, Laboratoire de Physique Theorique (UMR 8627), Orsay Cedex (France); Niess, Valentin [CNRS/IN2P3, UMR 6533, Laboratoire de Physique Corpusculaire, Aubiere Cedex (France); Silva, Luiz Vale [CNRS, Univ. Paris-Sud, Universite Paris-Saclay, Laboratoire de Physique Theorique (UMR 8627), Orsay Cedex (France); Univ. Paris-Sud, CNRS/IN2P3, Universite Paris-Saclay, Groupe de Physique Theorique, Institut de Physique Nucleaire, Orsay Cedex (France); J. Stefan Institute, Jamova 39, P. O. Box 3000, Ljubljana (Slovenia)
2017-04-15
The determination of the fundamental parameters of the Standard Model (and its extensions) is often limited by the presence of statistical and theoretical uncertainties. We present several models for the latter uncertainties (random, nuisance, external) in the frequentist framework, and we derive the corresponding p values. In the case of the nuisance approach where theoretical uncertainties are modeled as biases, we highlight the important, but arbitrary, issue of the range of variation chosen for the bias parameters. We introduce the concept of adaptive p value, which is obtained by adjusting the range of variation for the bias according to the significance considered, and which allows us to tackle metrology and exclusion tests with a single and well-defined unified tool, which exhibits interesting frequentist properties. We discuss how the determination of fundamental parameters is impacted by the model chosen for theoretical uncertainties, illustrating several issues with examples from quark flavor physics. (orig.)
Error and Uncertainty Analysis for Ecological Modeling and Simulation
National Research Council Canada - National Science Library
Gertner, George
1998-01-01
The main objectives of this project are a) to develop a general methodology for conducting sensitivity and uncertainty analysis and building error budgets in simulation modeling over space and time; and b...
Uncertainty and sensitivity analysis for photovoltaic system modeling.
Energy Technology Data Exchange (ETDEWEB)
Hansen, Clifford W. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Pohl, Andrew Phillip [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Jordan, Dirk [National Renewable Energy Lab. (NREL), Golden, CO (United States)
2013-12-01
We report an uncertainty and sensitivity analysis for modeling DC energy from photovoltaic systems. We consider two systems, each comprising a single module using either crystalline silicon or CdTe cells, and located either at Albuquerque, NM, or Golden, CO. Output from a PV system is predicted by a sequence of models. Uncertainty in the output of each model is quantified by empirical distributions of each model's residuals. We sample these distributions to propagate uncertainty through the sequence of models to obtain an empirical distribution for each PV system's output. We considered models that: (1) translate measured global horizontal, direct and global diffuse irradiance to plane-of-array irradiance; (2) estimate effective irradiance from plane-of-array irradiance; (3) predict cell temperature; and (4) estimate DC voltage, current and power. We found the uncertainty in PV system output to be relatively small, on the order of 1% for daily energy. Four alternative models were considered for the POA irradiance modeling step; we did not find the choice among these models to be of great significance. However, we observed that the POA irradiance model introduced a bias of upwards of 5% of daily energy, which translates directly to a systematic difference in predicted energy. Sensitivity analyses relate uncertainty in the PV system output to uncertainty arising from each model. We found the residuals arising from the POA irradiance and the effective irradiance models to be the dominant contributors to residuals for daily energy, for either technology or location considered. This analysis indicates that efforts to reduce the uncertainty in PV system output should focus on improvements to the POA and effective irradiance models.
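The residual-resampling scheme described here can be sketched with a hypothetical two-step model chain; the toy transposition and power models and the synthetic residual pools below are assumptions, not the report's models.

```python
import numpy as np

rng = np.random.default_rng(4)

# Empirical residual pools for two steps of the model chain (synthetic
# stand-ins for measured residuals of real POA and power models)
poa_residuals = rng.normal(0.0, 20.0, 500)   # W/m^2
pwr_residuals = rng.normal(0.0, 3.0, 500)    # W

def poa_model(ghi):      # toy transposition: GHI -> plane-of-array irradiance
    return 1.1 * ghi

def power_model(poa):    # toy linear DC power model
    return 0.15 * poa

N = 10000
ghi = 800.0              # measured global horizontal irradiance, W/m^2
poa = poa_model(ghi) + rng.choice(poa_residuals, N)    # resample step-1 error
dc = power_model(poa) + rng.choice(pwr_residuals, N)   # resample step-2 error

rel_unc = dc.std() / dc.mean()   # relative uncertainty of predicted DC power
```

Resampling empirical residuals rather than assuming a parametric error distribution is what lets the propagated uncertainty reflect each model's actual misfit.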
Conceptual Modeling of Time-Varying Information
DEFF Research Database (Denmark)
Gregersen, Heidi; Jensen, Christian Søndergaard
2004-01-01
A wide range of database applications manage information that varies over time. Many of the underlying database schemas of these were designed using the Entity-Relationship (ER) model. In the research community as well as in industry, it is common knowledge that the temporal aspects of the mini-world are important, but difficult to capture using the ER model. Several enhancements to the ER model have been proposed in an attempt to support the modeling of temporal aspects of information. Common to the existing temporally extended ER models, few or no specific requirements to the models were given...
Data-driven Modelling for decision making under uncertainty
Angria S, Layla; Dwi Sari, Yunita; Zarlis, Muhammad; Tulus
2018-01-01
Decision making under uncertainty has become a prominent topic in operations research. Many models have been presented, one of which is data-driven modelling (DDM). The purpose of this paper is to extract and recognize patterns in data, and to find the best model for decision-making problems under uncertainty by using a data-driven modelling approach with linear programming, linear and nonlinear differential equations, and a Bayesian approach. Model criteria are tested to determine the smallest error; the model with the smallest error is selected as the best.
A Conceptual Framework of Business Model Emerging Resilience
Goumagias, Nikolaos; Fernandes, Kiran; Cabras, Ignazio; Li, Feng; Shao, Jianhua; Devlin, Sam; Hodge, Victoria; Cowling, Peter; Kudenko, Daniel
2016-01-01
In this paper we introduce an environmentally driven conceptual framework of Business Model change. Business models acquired substantial momentum in academic literature during the past decade. Several studies focused on what exactly constitutes a Business Model (role model, recipe, architecture etc.) triggering a theoretical debate about the Business Model’s components and their corresponding dynamics and relationships. In this paper, we argue that for Business Models as cognitive structures,...
An Iterative Uncertainty Assessment Technique for Environmental Modeling
International Nuclear Information System (INIS)
Engel, David W.; Liebetrau, Albert M.; Jarman, Kenneth D.; Ferryman, Thomas A.; Scheibe, Timothy D.; Didier, Brett T.
2004-01-01
The reliability of and confidence in predictions from model simulations are crucial; these predictions can significantly affect risk assessment decisions. For example, the fate of contaminants at the U.S. Department of Energy's Hanford Site has critical impacts on long-term waste management strategies. In the uncertainty estimation efforts for the Hanford Site-Wide Groundwater Modeling program, computational issues severely constrain both the number of uncertain parameters that can be considered and the degree of realism that can be included in the models. Substantial improvements in the overall efficiency of uncertainty analysis are needed to fully explore and quantify significant sources of uncertainty. We have combined state-of-the-art statistical and mathematical techniques in a unique iterative, limited-sampling approach to efficiently quantify both local and global prediction uncertainties resulting from model input uncertainties. The approach is designed for application to widely diverse problems across multiple scientific domains. Results are presented for both an analytical model where the response surface is 'known' and a simplified contaminant fate transport and groundwater flow model. The results show that our iterative method for approximating a response surface (for subsequent calculation of uncertainty estimates) of specified precision requires less computing time than traditional approaches based upon noniterative sampling methods
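One way to realize an iterative, limited-sampling response-surface approach (an illustrative variant, not the authors' algorithm) is to refit a cheap surrogate and add sample points where two surrogates of different order disagree most, then draw uncertainty estimates from the surrogate instead of the expensive model.

```python
import numpy as np

def expensive_model(x):          # stand-in for a costly flow/transport run
    return np.sin(3.0 * x) + 0.5 * x ** 2

grid = np.linspace(-1.0, 1.0, 201)
x = np.linspace(-1.0, 1.0, 7)    # small initial design
y = expensive_model(x)

# Iteratively add the point where two surrogates of different order
# disagree most, then refit (a cheap proxy for surrogate error)
for _ in range(6):
    c3 = np.polyfit(x, y, 3)
    c5 = np.polyfit(x, y, 5)
    disagree = np.abs(np.polyval(c5, grid) - np.polyval(c3, grid))
    x_new = grid[np.argmax(disagree)]
    x = np.append(x, x_new)
    y = np.append(y, expensive_model(x_new))

c5 = np.polyfit(x, y, 5)         # final surrogate from 13 model runs
max_err = np.max(np.abs(np.polyval(c5, grid) - expensive_model(grid)))

# Uncertainty estimates now come from the cheap surrogate, not the model
rng = np.random.default_rng(6)
pred = np.polyval(c5, rng.normal(0.0, 0.3, 20000).clip(-1.0, 1.0))
```

The payoff is the same as in the abstract: only a handful of expensive runs are needed before the Monte Carlo sampling can be done entirely on the surrogate.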
Model-specification uncertainty in future forest pest outbreak.
Boulanger, Yan; Gray, David R; Cooke, Barry J; De Grandpré, Louis
2016-04-01
Climate change will modify forest pest outbreak characteristics, although there are disagreements regarding the specifics of these changes. A large part of this variability may be attributed to model specifications. As a case study, we developed a consensus model predicting spruce budworm (SBW, Choristoneura fumiferana [Clem.]) outbreak duration using two different predictor data sets and six different correlative methods. The model was used to project outbreak duration and the uncertainty associated with using different data sets and correlative methods (model-specification uncertainty) for 2011-2040, 2041-2070 and 2071-2100, according to three forcing scenarios (RCP 2.6, RCP 4.5 and RCP 8.5). The consensus model showed very high explanatory power and low bias. The model projected a more important northward shift and decrease in outbreak duration under the RCP 8.5 scenario. However, variation in single-model projections increases with time, making future projections highly uncertain. Notably, the magnitude of the shifts in northward expansion, overall outbreak duration and the patterns of outbreak duration at the southern edge were highly variable according to the predictor data set and correlative method used. We also demonstrated that variation in forcing scenarios contributed only slightly to the uncertainty of model projections compared with the two sources of model-specification uncertainty. Our approach helped to quantify model-specification uncertainty in future forest pest outbreak characteristics. It may contribute to sounder decision-making by acknowledging the limits of the projections and help to identify areas where model-specification uncertainty is high. As such, we further stress that this uncertainty should be strongly considered when making forest management plans, notably by adopting adaptive management strategies so as to reduce future risks.
Parameter uncertainty analysis of a biokinetic model of caesium
International Nuclear Information System (INIS)
Li, W.B.; Oeh, U.; Klein, W.; Blanchardon, E.; Puncher, M.; Leggett, R.W.; Breustedt, B.; Nosske, D.; Lopez, M.A.
2015-01-01
Parameter uncertainties for the biokinetic model of caesium (Cs) developed by Leggett et al. were inventoried and evaluated. Parameter uncertainty analysis was used to assess the uncertainties of model predictions under assumed model parameter uncertainties and distributions. Furthermore, the importance of individual model parameters was assessed by means of sensitivity analysis. The calculated uncertainties of model predictions were compared with human data of Cs measured in blood and in the whole body. It was found that propagating the derived uncertainties in model parameter values reproduced the range of bioassay data observed in human subjects at different times after intake. The maximum ranges, expressed as uncertainty factors (UFs) (defined as the square root of the ratio between the 97.5th and 2.5th percentiles) of blood clearance, whole-body retention and urinary excretion of Cs predicted at early times after intake were, respectively: 1.5, 1.0 and 2.5 on the first day; 1.8, 1.1 and 2.4 at Day 10; and 1.8, 2.0 and 1.8 at Day 100. For late times (1000 d) after intake, the UFs increased to 43, 24 and 31, respectively. The model parameters of the transfer rates between kidneys and blood and between muscle and blood, and the rate of transfer from kidneys to urinary bladder content, are the most influential for the blood clearance and whole-body retention of Cs. For the urinary excretion, the transfer rates from urinary bladder content to urine and from kidneys to urinary bladder content have the greatest impact. The implication and effect on the estimated equivalent and effective doses of the larger uncertainty of 43 in whole-body retention at later times (after Day 500) will be explored in subsequent work in the framework of EURADOS. (authors)
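The uncertainty factor defined in this abstract (the square root of the ratio of the 97.5th to the 2.5th percentile of a prediction) is easy to compute from Monte Carlo output. A minimal stdlib-only Python sketch; the lognormal spread standing in for propagated model predictions is a purely illustrative assumption:

```python
import math
import random
import statistics

random.seed(1)

# Hypothetical Monte Carlo predictions of, e.g., whole-body retention at one
# time point; the lognormal spread (sigma = 0.3) is an invented example value.
predictions = [math.exp(random.gauss(0.0, 0.3)) for _ in range(10_000)]

def uncertainty_factor(samples):
    """UF = sqrt(97.5th percentile / 2.5th percentile), as in the abstract."""
    # quantiles(..., n=40) returns the 2.5%, 5%, ..., 97.5% cut points.
    q = statistics.quantiles(samples, n=40)
    return math.sqrt(q[-1] / q[0])

uf = uncertainty_factor(predictions)  # about 1.8 for this spread
```

For a lognormal prediction the analytic value is exp(1.96·σ), so the UF grows rapidly with the log-scale spread, consistent with the large late-time UFs reported above.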
Educational game models: conceptualization and evaluation ...
African Journals Online (AJOL)
Relationships between educational theories, game design and game development are used to develop models for the creation of complex learning environments. The Game Object Model (GOM), which marries educational theory and game design, forms the basis for the development of the Persona Outlining Model (POM) ...
Menthor Editor: An Ontology-Driven Conceptual Modeling Platform
Moreira, João Luiz; Sales, Tiago Prince; Guerson, John; Braga, Bernardo F.B; Brasileiro, Freddy; Sobral, Vinicius
2016-01-01
The lack of well-founded constructs in ontology tools can lead to the construction of non-intended models. In this demonstration we present the Menthor Editor, an ontology-driven conceptual modelling platform which incorporates the theories of the Unified Foundational Ontology (UFO). We illustrate
Conceptual Model of Artifacts for Design Science Research
DEFF Research Database (Denmark)
Bækgaard, Lars
2015-01-01
We present a conceptual model of design science research artifacts. The model views an artifact at three levels. At the artifact level a selected artifact is viewed as a combination of material and immaterial aspects and a set of representations hereof. At the design level the selected artifact...
Metrics for evaluating performance and uncertainty of Bayesian network models
Bruce G. Marcot
2012-01-01
This paper presents a selected set of existing and new metrics for gauging Bayesian network model performance and uncertainty. Selected existing and new metrics are discussed for conducting model sensitivity analysis (variance reduction, entropy reduction, case file simulation); evaluating scenarios (influence analysis); depicting model complexity (numbers of model...
Assessment of parametric uncertainty for groundwater reactive transport modeling,
Shi, Xiaoqing; Ye, Ming; Curtis, Gary P.; Miller, Geoffery L.; Meyer, Philip D.; Kohler, Matthias; Yabusaki, Steve; Wu, Jichun
2014-01-01
The validity of using Gaussian assumptions for model residuals in uncertainty quantification of a groundwater reactive transport model was evaluated in this study. Least squares regression methods explicitly assume Gaussian residuals, and the assumption leads to Gaussian likelihood functions, model parameters, and model predictions. While Bayesian methods do not explicitly require the Gaussian assumption, Gaussian residuals are widely used. This paper shows that the residuals of the reactive transport model are non-Gaussian, heteroscedastic, and correlated in time; characterizing them requires a generalized likelihood function such as the formal generalized likelihood function developed by Schoups and Vrugt (2010). For the surface complexation model considered in this study for simulating uranium reactive transport in groundwater, parametric uncertainty is quantified using least squares regression methods and Bayesian methods with both Gaussian and formal generalized likelihood functions. While the least squares methods and Bayesian methods with the Gaussian likelihood function produce similar Gaussian parameter distributions, the parameter distributions obtained from Bayesian uncertainty quantification with the formal generalized likelihood function are non-Gaussian. In addition, the predictive performance of the formal generalized likelihood function is superior to that of least squares regression and of Bayesian methods with the Gaussian likelihood function. The Bayesian uncertainty quantification is conducted using the differential evolution adaptive metropolis (DREAM(ZS)) algorithm; as a Markov chain Monte Carlo (MCMC) method, it is a robust tool for quantifying uncertainty in groundwater reactive transport models. For the surface complexation model, the regression-based local sensitivity analysis and the Morris- and DREAM(ZS)-based global sensitivity analyses yield almost identical rankings of parameter importance. The uncertainty analysis may help select appropriate likelihood
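The Gaussian residual assumption questioned in this abstract can be made concrete with a small stdlib-only sketch: the iid Gaussian log-likelihood below is exactly what least squares regression implicitly maximizes, and it is the piece that the formal generalized likelihood of Schoups and Vrugt replaces with a heteroscedastic, autocorrelated error model. The residual values are invented for illustration:

```python
import math

# Invented residuals between simulated and observed concentrations, with a
# spread that grows over time (a heteroscedastic pattern like the one the
# study reports for reactive transport residuals).
residuals = [0.1, -0.3, 0.8, 1.5, -0.2, 2.1]

def gaussian_loglik(res, sigma):
    """Log-likelihood under iid Gaussian residuals with standard deviation
    sigma -- the assumption that least squares regression implicitly makes."""
    n = len(res)
    return (-0.5 * n * math.log(2.0 * math.pi * sigma ** 2)
            - sum(r * r for r in res) / (2.0 * sigma ** 2))

ll = gaussian_loglik(residuals, sigma=1.0)
```

When the true errors are heteroscedastic or correlated, a single-sigma Gaussian likelihood like this one mis-weights the data, which is why the generalized likelihood yields different (non-Gaussian) parameter distributions.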
Conceptual astronomy: A novel model for teaching postsecondary science courses
Zeilik, Michael; Schau, Candace; Mattern, Nancy; Hall, Shannon; Teague, Kathleen W.; Bisard, Walter
1997-10-01
An innovative, conceptually based instructional model for teaching large undergraduate astronomy courses was designed, implemented, and evaluated in the Fall 1995 semester. This model was based on cognitive and educational theories of knowledge and, we believe, is applicable to other large postsecondary science courses. Major components were: (a) identification of the basic important concepts and their interrelationships that are necessary for connected understanding of astronomy in novice students; (b) use of these concepts and their interrelationships throughout the design, implementation, and evaluation stages of the model; (c) identification of students' prior knowledge and misconceptions; and (d) implementation of varied instructional strategies targeted toward encouraging conceptual understanding in students (i.e., instructional concept maps, cooperative small group work, homework assignments stressing concept application, and a conceptually based student assessment system). Evaluation included the development and use of three measures of conceptual understanding and one of attitudes toward studying astronomy. Over the semester, students showed very large increases in their understanding as assessed by a conceptually based multiple-choice measure of misconceptions, a select-and-fill-in concept map measure, and a relatedness-ratings measure. Attitudes, which were slightly positive before the course, changed slightly in a less favorable direction.
Graphical models and their (un)certainties
Leisink, M.A.R.
2004-01-01
A graphical model is a powerful tool for dealing with complex probability models. Although in principle any set of probabilistic relationships can be modelled, the calculation of the actual numbers can be very hard. Every graphical model suffers from a phenomenon known as exponential scaling. To
Meteorological uncertainty of atmospheric dispersion model results (MUD)
International Nuclear Information System (INIS)
Havskov Soerensen, J.; Amstrup, B.; Feddersen, H.
2013-08-01
The MUD project addresses the assessment of uncertainties of atmospheric dispersion model predictions, as well as possibilities for their optimum presentation to decision makers. Previously, it has not been possible to estimate such uncertainties quantitatively, but merely to calculate the 'most likely' dispersion scenario. However, recent developments in numerical weather prediction (NWP) include probabilistic forecasting techniques, which can also be utilised for long-range atmospheric dispersion models. The ensemble statistical methods developed and applied to NWP models aim at describing the inherent uncertainties of the meteorological model results. These uncertainties stem from, e.g., limits in the meteorological observations used to initialise meteorological forecast series. By perturbing, e.g., the initial state of an NWP model run in agreement with the available observational data, an ensemble of meteorological forecasts is produced from which uncertainties in the various meteorological parameters are estimated, e.g. probabilities for rain. Corresponding ensembles of atmospheric dispersion can now be computed, from which uncertainties of predicted radionuclide concentration and deposition patterns can be derived. (Author)
Motivation to Improve Work through Learning: A Conceptual Model
Directory of Open Access Journals (Sweden)
Kueh Hua Ng
2014-12-01
This study aims to enhance our current understanding of the transfer of training by proposing a conceptual model in which motivation to improve work through learning mediates the relationship between social support and the transfer of training. Examining motivation to improve work through learning as a construct offers a holistic view of a learner's profile in a workplace setting, emphasizing learning for the improvement of work performance. The proposed conceptual model is expected to benefit human resource development theory building, as well as field practitioners, by emphasizing the motivational aspects crucial for successful transfer of training.
Partitioning uncertainty in streamflow projections under nonstationary model conditions
Chawla, Ila; Mujumdar, P. P.
2018-02-01
Assessing the impacts of Land Use (LU) and climate change on future streamflow projections is necessary for efficient management of water resources. However, model projections are burdened with significant uncertainty arising from various sources. Most previous studies have considered climate models and scenarios as the major sources of uncertainty, but uncertainties introduced by land use change and hydrologic model assumptions are rarely investigated. In this paper an attempt is made to segregate the contributions from (i) general circulation models (GCMs), (ii) emission scenarios, (iii) land use scenarios, (iv) the stationarity assumption of the hydrologic model, and (v) internal variability of the processes, to the overall uncertainty in streamflow projections using an analysis of variance (ANOVA) approach. Generally, most impact assessment studies are carried out with hydrologic model parameters held unchanged in the future. It is, however, necessary to address the nonstationarity in model parameters with changing land use and climate. In this paper, a regression-based methodology is presented to obtain the hydrologic model parameters under changing land use and climate scenarios in the future. The Upper Ganga Basin (UGB) in India is used as a case study to demonstrate the methodology. The semi-distributed Variable Infiltration Capacity (VIC) model is set up over the basin under nonstationary conditions. Results indicate that model parameters vary with time, thereby invalidating the often-used assumption of model stationarity. The streamflow in the UGB under the nonstationary model condition is found to decrease in the future. The flows are also found to be sensitive to changes in land use. Segregation results suggest that the model stationarity assumption and GCMs, along with their interactions with emission scenarios, act as the dominant sources of uncertainty. This paper provides a generalized framework for hydrologists to examine the stationarity assumption of models before considering them
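The ANOVA-style segregation of uncertainty sources can be illustrated with a toy two-factor example (GCM × emission scenario) in stdlib Python. The flow numbers and effect sizes are invented, the effects are purely additive (no interaction term), and a real study would include the remaining factors and their interactions:

```python
from itertools import product
from statistics import mean, pvariance

# Hypothetical streamflow projections (invented values) indexed by GCM and
# emission scenario; effects are additive here, so variance decomposes exactly.
gcm_effect = {"gcm_a": 0.0, "gcm_b": 8.0}
scen_effect = {"rcp4.5": 0.0, "rcp8.5": 3.0}
flows = {(g, s): 100.0 + dg + ds
         for (g, dg), (s, ds) in product(gcm_effect.items(), scen_effect.items())}

def main_effect_variance(factor_levels, index):
    """Variance of the factor-level means (the ANOVA main-effect term)."""
    return pvariance([mean(v for key, v in flows.items() if key[index] == lvl)
                      for lvl in factor_levels])

var_gcm = main_effect_variance(gcm_effect, 0)
var_scen = main_effect_variance(scen_effect, 1)
total = pvariance(list(flows.values()))
share_gcm = var_gcm / total  # fraction of projection variance due to GCM choice
```

In the additive case the main-effect variances sum to the total, and the per-factor shares are the segregated uncertainty contributions; with interactions, extra cross terms carry the remainder.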
Problem Solving, Modeling, and Local Conceptual Development.
Lesh, Richard; Harel, Guershon
2003-01-01
Describes similarities and differences between modeling cycles and stages of development. Includes examples of relevant constructs underlying children's developing ways of thinking about fractions, ratios, rates, proportions, or other mathematical ideas. Concludes that modeling cycles appear to be local or situated versions of the general stages…
Geologic Conceptual Model of Mosul Dam
2007-09-01
geomechanical characteristics or geotechnical properties. As an example, if the locations and depths of zones of high grout-take are known, these zones can...
Keywords: Dam safety; Foundation grout; Geologic model; GIS; Gypsum; Hydrogeology; Iraq geology; Karst; 3-D modeling
Uncertainty Assessment in Long Term Urban Drainage Modelling
DEFF Research Database (Denmark)
Thorndahl, Søren
on the rainfall inputs. In order to handle the uncertainties three different stochastic approaches are investigated applying a case catchment in the town Frejlev: (1) a reliability approach in which a parameterization of the rainfall input is conducted in order to generate synthetic rainfall events and find...... return periods, and even within the return periods specified in the design criteria. If urban drainage models are based on standard parameters and hence not calibrated, the uncertainties are even larger. The greatest uncertainties are shown to be the rainfall input and the assessment of the contributing...
The ACTIVE conceptual framework as a structural equation model.
Gross, Alden L; Payne, Brennan R; Casanova, Ramon; Davoudzadeh, Pega; Dzierzewski, Joseph M; Farias, Sarah; Giovannetti, Tania; Ip, Edward H; Marsiske, Michael; Rebok, George W; Schaie, K Warner; Thomas, Kelsey; Willis, Sherry; Jones, Richard N
2018-01-01
Background/Study Context: Conceptual frameworks are analytic models at a high level of abstraction. Their operationalization can inform randomized trial design and sample size considerations. The Advanced Cognitive Training for Independent and Vital Elderly (ACTIVE) conceptual framework was empirically tested using structural equation modeling (N=2,802). ACTIVE was guided by a conceptual framework for cognitive training in which proximal cognitive abilities (memory, inductive reasoning, speed of processing) mediate treatment-related improvement in primary outcomes (everyday problem-solving, difficulty with activities of daily living, everyday speed, driving difficulty), which in turn lead to improved secondary outcomes (health-related quality of life, health service utilization, mobility). Measurement models for each proximal, primary, and secondary outcome were developed and tested using baseline data. Each construct was then combined in one model to evaluate fit (RMSEA, CFI, normalized residuals of each indicator). To expand the conceptual model and potentially inform future trials, evidence of modification of structural model parameters was evaluated by age, years of education, sex, race, and self-rated health status. Preconceived measurement models for memory, reasoning, speed of processing, everyday problem-solving, instrumental activities of daily living (IADL) difficulty, everyday speed, driving difficulty, and health-related quality of life each fit well to the data (all RMSEA and CFI values indicated good fit). Fit of the full model was excellent (RMSEA = .038; CFI = .924). In contrast with previous findings from ACTIVE regarding who benefits from training, interaction testing revealed that associations between proximal abilities and primary outcomes are stronger on average for participants of nonwhite race, worse health, older age, and less education, supporting an expanded conceptual model. Findings suggest that the types of people who show intervention effects on cognitive performance potentially may be different from
Sensitivities and uncertainties of modeled ground temperatures in mountain environments
Directory of Open Access Journals (Sweden)
S. Gubler
2013-08-01
Model evaluation is often performed at few locations due to the lack of spatially distributed data. Since the quantification of model sensitivities and uncertainties can be performed independently from ground truth measurements, these analyses are suitable to test the influence of environmental variability on model evaluation. In this study, the sensitivities and uncertainties of a physically based mountain permafrost model are quantified within an artificial topography. The setting consists of different elevations and exposures combined with six ground types characterized by porosity and hydraulic properties. The analyses are performed for a combination of all factors, which allows for quantification of the variability of model sensitivities and uncertainties within a whole modeling domain. We found that model sensitivities and uncertainties vary strongly depending on different input factors such as topography or soil type. The analysis shows that model evaluation performed at single locations may not be representative for the whole modeling domain. For example, the sensitivity of modeled mean annual ground temperature to ground albedo ranges between 0.5 and 4 °C depending on elevation, aspect and ground type. South-exposed inclined locations are more sensitive to changes in ground albedo than north-exposed slopes since they receive more solar radiation. The sensitivity to ground albedo increases with decreasing elevation due to the shorter duration of the snow cover. The sensitivity to the hydraulic properties changes considerably for different ground types: rock or clay, for instance, are not sensitive to uncertainties in the hydraulic properties, while for gravel or peat, accurate estimates of the hydraulic properties significantly improve modeled ground temperatures. The discretization of ground, snow and time has an impact on modeled mean annual ground temperature (MAGT) that cannot be neglected (more than 1 °C for several
UNCERTAINTY SUPPLY CHAIN MODEL AND TRANSPORT IN ITS DEPLOYMENTS
Directory of Open Access Journals (Sweden)
Fabiana Lucena Oliveira
2014-05-01
This article discusses the Uncertainty Supply Chain Model and proposes a matrix matching each type of chain with the transportation modes best suited to it. From a detailed analysis of the uncertainty matrix, transportation modes are suggested that best support the management of these chains, so that transport optimizes the gains proposed by the original model, particularly when supply chains are distant from their suppliers of raw materials and/or supplies. Agile supply chains, one outcome of the Uncertainty Supply Chain Model, are analyzed in detail, with special attention to the Manaus Industrial Center. The research was conducted at the Manaus Industrial Pole, a model of industrial agglomeration based in Manaus, State of Amazonas (Brazil), which comprises different supply chains and strategies sharing the same infrastructure for transport, handling, storage and clearance, and which uses inbound logistics for suppliers of raw material. The state of the art covers supply chain management, the uncertainty supply chain model, agile supply chains, the Manaus Industrial Center (MIC) and Brazilian legislation as a business case, presenting the concepts and features of each. The main goal is to present and discuss how transport can support the Uncertainty Supply Chain Model in order to complete the management model. The results confirm the hypothesis that integrated logistics processes can guarantee the attractiveness of industrial agglomerations, and they open a discussion of logistics management when suppliers are far from the manufacturing center.
Bayesian tsunami fragility modeling considering input data uncertainty
De Risi, Raffaele; Goda, Katsu; Mori, Nobuhito; Yasuda, Tomohiro
2017-01-01
Empirical tsunami fragility curves are developed based on a Bayesian framework by accounting for uncertainty of input tsunami hazard data in a systematic and comprehensive manner. Three fragility modeling approaches, i.e. lognormal method, binomial logistic method, and multinomial logistic method, are considered, and are applied to extensive tsunami damage data for the 2011 Tohoku earthquake. A unique aspect of this study is that uncertainty of tsunami inundation data (i.e. input hazard data ...
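The lognormal method named in this abstract models the probability of exceeding a damage state as a lognormal CDF of the tsunami inundation depth. A minimal stdlib-only Python sketch; the parameter values are invented for illustration, and the paper's Bayesian fitting of these parameters under input-data uncertainty is not reproduced here:

```python
import math

def lognormal_fragility(h, median, beta):
    """P(damage state exceeded | inundation depth h), modelled as the
    lognormal CDF Phi((ln h - ln median) / beta)."""
    if h <= 0.0:
        return 0.0
    z = (math.log(h) - math.log(median)) / beta
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Illustrative (invented) parameters: median capacity 2 m, logarithmic std 0.6.
p_at_median = lognormal_fragility(2.0, median=2.0, beta=0.6)  # exactly 0.5
```

The median parameter fixes the depth at which exceedance probability is 0.5, and beta controls how sharply the curve rises; the Bayesian treatment places posterior distributions over both instead of point estimates.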
Uncertainty quantification of squeal instability via surrogate modelling
Nobari, Amir; Ouyang, Huajiang; Bannister, Paul
2015-08-01
One of the major issues that car manufacturers are facing is the noise and vibration of brake systems. Of the different sorts of noise and vibration a brake system may generate, squeal, an irritating high-frequency noise, is particularly costly to manufacturers. Despite considerable research on brake squeal, its root cause is still not fully understood. The most common assumption, however, is mode-coupling. Complex eigenvalue analysis is the most widely used approach to the analysis of brake squeal problems. One of the major drawbacks of this technique, nevertheless, is that the effects of variability and uncertainty are not included in the results. Uncertainty and variability are two inseparable aspects of any brake system. Uncertainty is mainly caused by friction, contact, wear and thermal effects, while variability mostly stems from the manufacturing process, material properties and component geometries. Evaluating the effects of uncertainty and variability in the complex eigenvalue analysis improves the predictability of noise propensity and helps produce a more robust design. The biggest hurdle in the uncertainty analysis of brake systems is the computational cost and time. Most uncertainty analysis techniques rely on the results of many deterministic analyses. A full finite element model of a brake system typically consists of millions of degrees of freedom and many load cases. The running time of such models is so long that the automotive industry is reluctant to do many deterministic analyses. This paper, instead, proposes an efficient method of uncertainty propagation via surrogate modelling. A surrogate model of a brake system is constructed in order to reproduce the outputs of the large-scale finite element model and overcome the issue of computational workloads. The probability distribution of the real part of an unstable mode can then be obtained by using the surrogate model with a massive saving of
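The surrogate idea can be illustrated with a toy stand-in for the finite element model: fit a cheap interpolant from a handful of deterministic runs, then do Monte Carlo on the interpolant instead of the expensive model. Everything below (the quadratic "model", the friction-coefficient distribution) is an assumption for illustration only, not the paper's surrogate:

```python
import random

random.seed(0)

def expensive_model(mu):
    """Stand-in for a full FE complex-eigenvalue run: real part of a candidate
    unstable mode versus friction coefficient mu (entirely invented)."""
    return 50.0 * (mu - 0.3) ** 2 + 2.0 * mu - 1.0

# Three 'deterministic analyses' at design points; the surrogate is the
# quadratic Lagrange interpolant through them.
design = [0.2, 0.3, 0.4]
pts = [(x, expensive_model(x)) for x in design]

def surrogate(x):
    total = 0.0
    for i, (xi, yi) in enumerate(pts):
        w = yi
        for j, (xj, _) in enumerate(pts):
            if i != j:
                w *= (x - xj) / (xi - xj)
        total += w
    return total

# Cheap Monte Carlo on the surrogate instead of thousands of FE solves:
# friction mu ~ Normal(0.3, 0.03) is an assumed input distribution.
samples = [surrogate(random.gauss(0.3, 0.03)) for _ in range(5000)]
p_unstable = sum(s > 0.0 for s in samples) / len(samples)
```

The design choice mirrors the abstract: the expensive solver is called only at the design points, and the instability probability (positive real part) comes from thousands of evaluations of the cheap surrogate.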
An independent verification and validation of the Future Theater Level Model conceptual model
Energy Technology Data Exchange (ETDEWEB)
Hartley, D.S. III; Kruse, K.L.; Martellaro, A.J.; Packard, S.L.; Thomas, B. Jr.; Turley, V.K.
1994-08-01
This report describes the methodology and results of independent verification and validation performed on a combat model in its design stage. The combat model is the Future Theater Level Model (FTLM), under development by The Joint Staff/J-8. J-8 has undertaken its development to provide an analysis tool that addresses the uncertainties of combat more directly than previous models and yields more rapid study results. The methodology adopted for this verification and validation consisted of document analyses. Included were detailed examination of the FTLM design documents (at all stages of development), the FTLM Mission Needs Statement, and selected documentation for other theater level combat models. These documents were compared to assess the FTLM as to its design stage, its purpose as an analytical combat model, and its capabilities as specified in the Mission Needs Statement. The conceptual design passed those tests. The recommendations included specific modifications as well as a recommendation for continued development. The methodology is significant because independent verification and validation have not been previously reported as being performed on a combat model in its design stage. The results are significant because The Joint Staff/J-8 will be using the recommendations from this study in determining whether to proceed with development of the model.
A nursing conceptual model for contamination.
Green, Pauline M; Polk, Laura V
2012-02-01
To propose a nursing model of contamination that describes the key features of contamination at a level of abstraction needed for clinical decision making. Relevant literature on contamination and biopreparedness, along with classic epidemiologic literature, was reviewed and analyzed. A model of contamination was created, along with a description of the benefits of its use in practice, education, and research. The nursing profession is called to respond to contamination incidents at the local, national, and global levels. Achieving optimum health outcomes while managing contamination incidents is enhanced by nurses' use of a model that incorporates six elements to identify and name instances of contamination and select suitable outcomes and interventions. © 2011, The Authors. International Journal of Nursing Knowledge © 2011, NANDA International.
Modelling students' knowledge organisation: Genealogical conceptual networks
Koponen, Ismo T.; Nousiainen, Maija
2018-04-01
Learning scientific knowledge is largely based on understanding what its key concepts are and how they are related. The relational structure of concepts also affects how concepts are introduced in teaching scientific knowledge. We model here how students organise their knowledge when they represent their understanding of how physics concepts are related. The model is based on the assumptions that students use simple basic linking-motifs when introducing new concepts and mostly relate them to concepts that were introduced a few steps earlier, i.e. following a genealogical ordering. The resulting genealogical networks have relatively high local clustering coefficients of nodes but otherwise resemble networks obtained with an identical degree distribution of nodes and random linking between them (i.e. the configuration model). However, a few key nodes with a special structural role emerge, and these nodes have higher than average communicability betweenness centralities. These features agree with the empirically found properties of students' concept networks.
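The growth rule described above, in which each new concept links mostly to concepts introduced a few steps earlier, can be sketched in a few lines of stdlib Python. The linking parameters (two links per concept, a four-step lookback, 60 concepts) are illustrative assumptions, not the authors' fitted model:

```python
import random

random.seed(2)

# Grow a 'genealogical' network: each new concept links to two concepts
# introduced at most four steps earlier.
adj = {0: {1}, 1: {0}}
for new in range(2, 60):
    adj[new] = set()
    recent = list(range(max(0, new - 4), new))
    for old in random.sample(recent, 2):
        adj[new].add(old)
        adj[old].add(new)

def local_clustering(n):
    """Fraction of this node's neighbour pairs that are themselves linked."""
    nbrs = list(adj[n])
    k = len(nbrs)
    if k < 2:
        return 0.0
    links = sum(1 for i in range(k) for j in range(i + 1, k)
                if nbrs[j] in adj[nbrs[i]])
    return 2.0 * links / (k * (k - 1))

mean_cc = sum(local_clustering(n) for n in adj) / len(adj)
```

Because new links concentrate on a short window of recent concepts, neighbours of a node tend to know each other, which is the mechanism behind the elevated local clustering the abstract reports relative to a configuration-model baseline.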
Evaluation of uncertainties in selected environmental dispersion models
International Nuclear Information System (INIS)
Little, C.A.; Miller, C.W.
1979-01-01
Compliance with standards of radiation dose to the general public has necessitated the use of dispersion models to predict radionuclide concentrations in the environment due to releases from nuclear facilities. Because these models are only approximations of reality and because of inherent variations in the input parameters used in these models, their predictions are subject to uncertainty. Quantification of this uncertainty is necessary to assess the adequacy of these models for use in determining compliance with protection standards. This paper characterizes the capabilities of several dispersion models to predict accurately pollutant concentrations in environmental media. Three types of models are discussed: aquatic or surface water transport models, atmospheric transport models, and terrestrial and aquatic food chain models. Using data published primarily by model users, model predictions are compared to observations
A Conceptual Model of Investor Behavior
M. Lovric (Milan); U. Kaymak (Uzay); J. Spronk (Jaap)
2008-01-01
Based on a survey of behavioral finance literature, this paper presents a descriptive model of individual investor behavior in which investment decisions are seen as an iterative process of interactions between the investor and the investment environment. This investment process is
Learning strategies: a synthesis and conceptual model
Hattie, John A. C.; Donoghue, Gregory M.
2016-08-01
The purpose of this article is to explore a model of learning that proposes that various learning strategies are powerful at certain stages in the learning cycle. The model describes three inputs and outcomes (skill, will and thrill), success criteria, three phases of learning (surface, deep and transfer) and an acquiring and consolidation phase within each of the surface and deep phases. A synthesis of 228 meta-analyses led to the identification of the most effective strategies. The results indicate that there is a subset of strategies that are effective, but this effectiveness depends on the phase of the model in which they are implemented. Further, it is best not to run separate sessions on learning strategies but to embed the various strategies within the content of the subject, to be clearer about developing both surface and deep learning, and promoting their associated optimal strategies and to teach the skills of transfer of learning. The article concludes with a discussion of questions raised by the model that need further research.
Alchemy and uncertainty: What good are models?
F.L. Bunnell
1989-01-01
Wildlife-habitat models are increasing in abundance, diversity, and use, but symptoms of failure are evident in their application, including misuse, disuse, failure to test, and litigation. Reasons for failure often relate to the different purposes managers and researchers have for using the models to predict and to aid understanding. This paper examines these two...
Uncertainty and Complexity in Mathematical Modeling
Cannon, Susan O.; Sanders, Mark
2017-01-01
Modeling is an effective tool to help students access mathematical concepts. Finding a math teacher who has not drawn a fraction bar or pie chart on the board would be difficult, as would finding students who have not been asked to draw models and represent numbers in different ways. In this article, the authors will discuss: (1) the properties of…
Model Uncertainty and Exchange Rate Forecasting
Kouwenberg, R.; Markiewicz, A.; Verhoeks, R.; Zwinkels, R.C.J.
2017-01-01
Exchange rate models with uncertain and incomplete information predict that investors focus on a small set of fundamentals that changes frequently over time. We design a model selection rule that captures the current set of fundamentals that best predicts the exchange rate. Out-of-sample tests show
Modeling transport phenomena and uncertainty quantification in solidification processes
Fezi, Kyle S.
Direct chill (DC) casting is the primary processing route for wrought aluminum alloys. This semicontinuous process consists of primary cooling as the metal is pulled through a water-cooled mold, followed by secondary cooling with a water jet spray and free-falling water. To gain insight into this complex solidification process, a fully transient model of DC casting was developed to predict the transport phenomena of aluminum alloys for various conditions. This model is capable of solving mixture mass, momentum, energy, and species conservation equations during multicomponent solidification. Various DC casting process parameters were examined for their effect on transport phenomena predictions in an alloy of commercial interest (aluminum alloy 7050). The practice of placing a wiper to divert cooling water from the ingot surface was studied, and the results showed that placement closer to the mold causes remelting at the surface and increases susceptibility to bleed outs. Numerical models of metal alloy solidification, like the one described above, are used to gain insight into physical phenomena that cannot be observed experimentally. However, uncertainty in model inputs causes uncertainty in results and in those insights. The effect of model assumptions and probable input variability on the level of uncertainty in model predictions has not yet been quantified in solidification modeling. As a step toward understanding the effect of uncertain inputs on solidification modeling, uncertainty quantification (UQ) and sensitivity analysis were first performed on a transient solidification model of a simple binary alloy (Al-4.5wt.%Cu) in a rectangular cavity with both columnar and equiaxed solid growth models. This analysis was followed by quantifying the uncertainty in predictions from the recently developed transient DC casting model. The PRISM Uncertainty Quantification (PUQ) framework quantified the uncertainty and sensitivity in macrosegregation, solidification
Designing Public Library Websites for Teens: A Conceptual Model
Naughton, Robin Amanda
2012-01-01
The main goal of this research study was to develop a conceptual model for the design of public library websites for teens (TLWs) that would enable designers and librarians to create library websites that better suit teens' information needs and practices. It bridges a gap in the research literature between user interface design in human-computer…
Exploring conceptual models for community engagement at higher ...
African Journals Online (AJOL)
A critical conceptual analysis of the South African Higher Education context reflects the lack of a structural and functional framework for the conceptualisation of community engagement (CE) in higher education. The purpose of this article is to explore a framework and model for the conceptualisation of CE for a better ...
LCM 3.0: A Language for describing Conceptual Models
Feenstra, Remco; Wieringa, Roelf J.
1993-01-01
The syntax of the conceptual model specification language LCM is defined. LCM uses equational logic to specify data types and order-sorted dynamic logic to specify objects with identity and mutable state. LCM specifies database transactions as finite sets of atomic object transitions.
A conceptual model specification language (CMSL Version 2)
Wieringa, Roelf J.
1992-01-01
Version 2 of a language (CMSL) to specify conceptual models is defined. CMSL consists of two parts, the value specification language VSL and the object specification language OSL. There is a formal semantics and an inference system for CMSL, but research on this still continues. A method for
A conceptual framework for a mentoring model for nurse educators ...
African Journals Online (AJOL)
Transformation in South Africa resulted in changes in the mandate of Higher Education Institutions (HEIs). Therefore, the need to design a mentoring model for recruiting and retaining nurse educators to meet the demands of teaching and learning became evident. The aim of the study was to develop a conceptual ...
A New Conceptual Model for Understanding International Students' College Needs
Alfattal, Eyad
2016-01-01
This study concerns the theory and practice of international marketing in higher education with the purpose of exploring a conceptual model for understanding international students' needs in the context of a four-year college in the United States. A transcendental phenomenological design was employed to investigate the essence of international…
Developing a Conceptual Model of STEAM Teaching Practices
Quigley, Cassie F.; Herro, Dani; Jamil, Faiza M.
2017-01-01
STEAM, where the "A" represents arts and humanities, is considered a transdisciplinary learning process that has the potential to increase diverse participation in science, technology, engineering, and math (STEM) fields. However, a well-defined conceptual model that clearly articulates essential components of the STEAM approach is…
Conceptualizations of Creativity: Comparing Theories and Models of Giftedness
Miller, Angie L.
2012-01-01
This article reviews seven different theories of giftedness that include creativity as a component, comparing and contrasting how each one conceptualizes creativity as a part of giftedness. The functions of creativity vary across the models, suggesting that while the field of gifted education often cites the importance of creativity, the…
Conceptual basis for developing of training models in complex ...
African Journals Online (AJOL)
This paper presents the conceptual basis for developing training models of an interactive assembling system for automatic building of application software systems, obtained during practical work on the "Design and architecture of software systems" and "Object-oriented analysis and design" courses. The system is intended for ...
A Conceptual Model of Investor Behavior
Lovric, M.; Kaymak, U.; Spronk, J.
2008-01-01
textabstractBased on a survey of behavioral finance literature, this paper presents a descriptive model of individual investor behavior in which investment decisions are seen as an iterative process of interactions between the investor and the investment environment. This investment process is influenced by a number of interdependent variables and driven by dual mental systems, the interplay of which contributes to boundedly rational behavior where investors use various heuristics and may exh...
"Wrong, but useful": negotiating uncertainty in infectious disease modelling.
Directory of Open Access Journals (Sweden)
Robert M Christley
Full Text Available For infectious disease dynamical models to inform policy for containment of infectious diseases the models must be able to predict; however, it is well recognised that such prediction will never be perfect. Nevertheless, the consensus is that although models are uncertain, some may yet inform effective action. This assumes that the quality of a model can be ascertained in order to evaluate sufficiently model uncertainties, and to decide whether or not, or in what ways or under what conditions, the model should be 'used'. We examined uncertainty in modelling, utilising a range of data: interviews with scientists, policy-makers and advisors, and analysis of policy documents, scientific publications and reports of major inquiries into key livestock epidemics. We show that the discourse of uncertainty in infectious disease models is multi-layered, flexible, contingent, embedded in context and plays a critical role in negotiating model credibility. We argue that usability and stability of a model is an outcome of the negotiation that occurs within the networks and discourses surrounding it. This negotiation employs a range of discursive devices that renders uncertainty in infectious disease modelling a plastic quality that is amenable to 'interpretive flexibility'. The utility of models in the face of uncertainty is a function of this flexibility, the negotiation this allows, and the contexts in which model outputs are framed and interpreted in the decision making process. We contend that rather than being based predominantly on beliefs about quality, the usefulness and authority of a model may at times be primarily based on its functional status within the broad social and political environment in which it acts.
An educational model for ensemble streamflow simulation and uncertainty analysis
Directory of Open Access Journals (Sweden)
A. AghaKouchak
2013-02-01
Full Text Available This paper presents the hands-on modeling toolbox, HBV-Ensemble, designed as a complement to theoretical hydrology lectures, to teach hydrological processes and their uncertainties. The HBV-Ensemble can be used for in-class lab practices and homework assignments, and assessment of students' understanding of hydrological processes. Using this modeling toolbox, students can gain more insights into how hydrological processes (e.g., precipitation, snowmelt and snow accumulation, soil moisture, evapotranspiration and runoff generation) are interconnected. The educational toolbox includes a MATLAB Graphical User Interface (GUI) and an ensemble simulation scheme that can be used for teaching uncertainty analysis, parameter estimation, ensemble simulation and model sensitivity. HBV-Ensemble was administered in a class for both in-class instruction and a final project, and students submitted their feedback about the toolbox. The results indicate that this educational software had a positive impact on students' understanding and knowledge of uncertainty in hydrological modeling.
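The ensemble idea behind HBV-Ensemble can be illustrated outside MATLAB. Below is a minimal Python sketch (not the HBV model itself) that runs a toy single-reservoir rainfall-runoff model over an ensemble of sampled recession coefficients and reports the resulting predictive spread; the forcing data and parameter range are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic daily precipitation forcing (mm/day).
precip = np.array([5.0, 0.0, 12.0, 3.0, 0.0, 0.0, 8.0, 1.0])

def bucket_model(precip, k, s0=10.0):
    """Single linear reservoir: runoff Q = k * S each day
    (a toy stand-in for the HBV conceptual model)."""
    s, q = s0, []
    for p in precip:
        s += p
        q.append(k * s)
        s -= q[-1]
    return np.array(q)

# Ensemble: sample the uncertain recession coefficient k.
ks = rng.uniform(0.2, 0.6, size=200)
ensemble = np.stack([bucket_model(precip, k) for k in ks])

# Predictive spread: 5th-95th percentile band for each day.
lo_q, hi_q = np.percentile(ensemble, [5, 95], axis=0)
```

The band (`lo_q`, `hi_q`) is the kind of ensemble uncertainty envelope students would inspect in the toolbox.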
Modeling in transport phenomena a conceptual approach
Tosun, Ismail
2007-01-01
Modeling in Transport Phenomena, Second Edition presents and clearly explains, with example problems, the basic concepts and their applications to fluid flow, heat transfer, mass transfer, chemical reaction engineering and thermodynamics. A balanced approach between analysis and synthesis is presented, so that students will understand how to use the solution in engineering analysis. Systematic derivations of the equations and the physical significance of each term are given in detail, so that students can easily understand and follow the material. There is a strong incentive in science and engineering to
Enhancing uncertainty tolerance in the modelling creep of ligaments
International Nuclear Information System (INIS)
Taha, M M Reda; Lucero, J
2006-01-01
The difficulty in performing biomechanical tests and the scarcity of biomechanical experimental databases necessitate extending the current knowledge base to allow efficient modelling using limited data sets. This study suggests a framework to reduce uncertainties in biomechanical systems using limited data sets. The study also shows how sparse data and epistemic input can be exploited using fuzzy logic to represent biomechanical relations. An example application to model collagen fibre recruitment in the medial collateral ligaments during time-dependent deformation under cyclic loading (creep) is presented. The study suggests a quality metric that can be employed to observe and enhance uncertainty tolerance in the modelling process
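As a hedged illustration of the fuzzy-logic idea described above (not the authors' actual membership functions or rules), the following Python sketch maps a normalized creep strain to a recruited-fibre fraction via triangular membership functions and singleton-output defuzzification; all numbers are hypothetical.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def recruited_fraction(strain):
    """Fuzzy estimate of recruited collagen-fibre fraction from normalized
    creep strain (illustrative rule base, not from the cited study)."""
    mu_low = tri(strain, -0.5, 0.0, 0.5)
    mu_mid = tri(strain, 0.0, 0.5, 1.0)
    mu_high = tri(strain, 0.5, 1.0, 1.5)
    # Singleton rule consequents: low -> 0.1, mid -> 0.5, high -> 0.9.
    mus = np.array([mu_low, mu_mid, mu_high])
    outs = np.array([0.1, 0.5, 0.9])
    return float(np.dot(mus, outs) / mus.sum())
```

Sparse data enters by choosing the membership breakpoints and consequents from the few experiments available plus expert (epistemic) input.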
Integration of inaccurate data into model building and uncertainty assessment
Energy Technology Data Exchange (ETDEWEB)
Coleou, Thierry
1998-12-31
Model building can be seen as integrating numerous measurements and mapping through data points considered as exact. As the exact data set is usually sparse, using additional non-exact data improves the modelling and reduces the uncertainties. Several examples of non-exact data are discussed and a methodology to honor them in a single pass, along with the exact data is presented. This automatic procedure is valid for both ``base case`` model building and stochastic simulations for uncertainty analysis. 5 refs., 3 figs.
Incorporating model parameter uncertainty into inverse treatment planning
International Nuclear Information System (INIS)
Lian Jun; Xing Lei
2004-01-01
Radiobiological treatment planning depends not only on the accuracy of the models describing the dose-response relation of different tumors and normal tissues but also on the accuracy of tissue-specific radiobiological parameters in these models. Whereas the general formalism remains the same, different sets of model parameters lead to different solutions and thus critically determine the final plan. Here we describe an inverse planning formalism with inclusion of model parameter uncertainties. This is made possible by using a statistical analysis-based framework developed by our group. In this formalism, the uncertainties of model parameters, such as the parameter a that describes the tissue-specific effect in the equivalent uniform dose (EUD) model, are expressed by probability density functions and are included in the dose optimization process. We found that the final solution strongly depends on the distribution functions of the model parameters. Considering that currently available models for computing biological effects of radiation are simplistic, and the clinical data used to derive the models are sparse and of questionable quality, the proposed technique provides us with an effective tool to minimize the effect caused by the uncertainties in a statistical sense. With the incorporation of the uncertainties, the technique has the potential to maximally utilize the available radiobiology knowledge for better IMRT treatment
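The role of the uncertain EUD parameter a can be sketched as follows. The generalized EUD formula, (mean(d_i^a))^(1/a), is standard (Niemierko's gEUD); the voxel doses and the assumed normal pdf for a below are purely illustrative stand-ins, not the paper's clinical data.

```python
import numpy as np

rng = np.random.default_rng(1)

def eud(dose, a):
    """Generalized equivalent uniform dose: (mean(d_i^a))^(1/a)."""
    return np.mean(dose ** a) ** (1.0 / a)

# Hypothetical voxel doses (Gy) for one structure.
dose = np.array([60.0, 62.0, 58.0, 65.0, 55.0])

# Treat the tissue parameter a as uncertain: sample an assumed pdf.
a_samples = rng.normal(loc=8.0, scale=2.0, size=5000)
a_samples = a_samples[a_samples > 1.0]  # keep a in a physical range

euds = np.array([eud(dose, a) for a in a_samples])
print(f"EUD mean = {euds.mean():.2f} Gy, std = {euds.std():.2f} Gy")
```

An optimizer that scores plans on the full `euds` distribution (rather than a single point estimate of a) is the statistical idea the abstract describes.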
Eigenspace perturbations for structural uncertainty estimation of turbulence closure models
Jofre, Lluis; Mishra, Aashwin; Iaccarino, Gianluca
2017-11-01
With the present state of computational resources, a purely numerical resolution of turbulent flows encountered in engineering applications is not viable. Consequently, investigations into turbulence rely on various degrees of modeling. Archetypal amongst these variable resolution approaches would be RANS models in two-equation closures, and subgrid-scale models in LES. However, owing to the simplifications introduced during model formulation, the fidelity of all such models is limited, and therefore the explicit quantification of the predictive uncertainty is essential. In such a scenario, the ideal uncertainty estimation procedure must be agnostic to modeling resolution, methodology, and the nature or level of the model filter. The procedure should be able to give reliable prediction intervals for different Quantities of Interest, over varied flows and flow conditions, and at diametric levels of modeling resolution. In this talk, we present and substantiate the Eigenspace perturbation framework as an uncertainty estimation paradigm that meets these criteria. Commencing from a broad overview, we outline the details of this framework at different modeling resolutions. Thence, using benchmark flows, along with engineering problems, the efficacy of this procedure is established. This research was partially supported by NNSA under the Predictive Science Academic Alliance Program (PSAAP) II, and by DARPA under the Enabling Quantification of Uncertainty in Physical Systems (EQUiPS) project (technical monitor: Dr Fariba Fahroo).
Conceptual design interpretations, mindset and models
Andreasen, Mogens Myrup; Cash, Philip
2015-01-01
Maximising reader insights into the theory, models, methods and fundamental reasoning of design, this book addresses design activities in industrial settings, as well as the actors involved. This approach offers readers a new understanding of design activities and related functions, properties and dispositions. Presenting a ‘design mindset’ that seeks to empower students, researchers, and practitioners alike, it features a strong focus on how designers create new concepts to be developed into products, and how they generate new business and satisfy human needs. Employing a multi-faceted perspective, the book supplies the reader with a comprehensive worldview of design in the form of a proposed model that will empower their activities as student, researcher or practitioner. We draw the reader into the core role of design conceptualisation for society, for the development of industry, for users and buyers of products, and for citizens in relation to public systems. The book also features original con...
Energy Technology Data Exchange (ETDEWEB)
Ericsson, Lars O. (Lars O. Ericsson Consulting AB (Sweden)); Holmen, Johan (Golder Associates (Sweden))
2010-12-15
The primary aim of this report is to present a supplementary, in-depth evaluation of certain conceptual simplifications, descriptions and model uncertainties in conjunction with regional groundwater simulation, which in the first instance refer to model depth, topography, groundwater table level and boundary conditions. Implementation was based on geo-scientifically available data compilations from the Smaaland region, but different conceptual assumptions have been analysed.
Conceptual Models as Tools for Communication Across Disciplines
Directory of Open Access Journals (Sweden)
Marieke Heemskerk
2003-12-01
Full Text Available To better understand and manage complex social-ecological systems, social scientists and ecologists must collaborate. However, issues related to language and research approaches can make it hard for researchers in different fields to work together. This paper suggests that researchers can improve interdisciplinary science through the use of conceptual models as a communication tool. The authors share lessons from a workshop in which interdisciplinary teams of young scientists developed conceptual models of social-ecological systems using data sets and metadata from Long-Term Ecological Research sites across the United States. Both the process of model building and the models that were created are discussed. The exercise revealed that the presence of social scientists in a group influenced the place and role of people in the models. This finding suggests that the participation of both ecologists and social scientists in the early stages of project development may produce better questions and more accurate models of interactions between humans and ecosystems. Although the participants agreed that a better understanding of human intentions and behavior would advance ecosystem science, they felt that interdisciplinary research might gain more by training strong disciplinarians than by merging ecology and social sciences into a new field. It is concluded that conceptual models can provide an inspiring point of departure and a guiding principle for interdisciplinary group discussions. Jointly developing a model not only helped the participants to formulate questions, clarify system boundaries, and identify gaps in existing data, but also revealed the thoughts and assumptions of fellow scientists. Although the use of conceptual models will not serve all purposes, the process of model building can help scientists, policy makers, and resource managers discuss applied problems and theory among themselves and with those in other areas.
Estimation and uncertainty of reversible Markov models.
Trendelkamp-Schroer, Benjamin; Wu, Hao; Paul, Fabian; Noé, Frank
2015-11-07
Reversibility is a key concept in Markov models and master-equation models of molecular kinetics. The analysis and interpretation of the transition matrix encoding the kinetic properties of the model rely heavily on the reversibility property. The estimation of a reversible transition matrix from simulation data is, therefore, crucial to the successful application of the previously developed theory. In this work, we discuss methods for the maximum likelihood estimation of transition matrices from finite simulation data and present a new algorithm for the estimation when reversibility with respect to a given stationary vector is desired. We also develop new methods for the Bayesian posterior inference of reversible transition matrices with and without given stationary vector, taking into account the need for a suitable prior distribution preserving the meta-stable features of the observed process during posterior inference. All algorithms here are implemented in the PyEMMA software--http://pyemma.org--as of version 2.0.
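A minimal sketch of the reversibility idea: the plain maximum likelihood estimate row-normalizes the transition counts, while a simple reversible estimate can be obtained by symmetrizing the counts first. (The paper's full reversible MLE solves a constrained optimization, as implemented in PyEMMA; the shortcut below only conveys the detailed-balance property.) The count matrix is synthetic.

```python
import numpy as np

# Synthetic transition counts C[i, j] from a discretized trajectory.
C = np.array([[50.0,  6.0,  1.0],
              [ 4.0, 80.0,  5.0],
              [ 2.0,  3.0, 30.0]])

# Plain (non-reversible) MLE: row-normalize the counts.
T_mle = C / C.sum(axis=1, keepdims=True)

# Simple reversible estimate: symmetrize the counts, then row-normalize.
Cs = C + C.T
T_rev = Cs / Cs.sum(axis=1, keepdims=True)

# Stationary distribution of the reversible estimate.
pi = Cs.sum(axis=1) / Cs.sum()

# Detailed balance: pi_i * T_ij == pi_j * T_ji.
db = pi[:, None] * T_rev
```

The symmetric `db` matrix is exactly the detailed-balance condition the abstract's estimators are built to enforce.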
Uncertainty the soul of modeling, probability & statistics
Briggs, William
2016-01-01
This book presents a philosophical approach to probability and probabilistic thinking, considering the underpinnings of probabilistic reasoning and modeling, which effectively underlie everything in data science. The ultimate goal is to call into question many standard tenets and lay the philosophical and probabilistic groundwork and infrastructure for statistical modeling. It is the first book devoted to the philosophy of data aimed at working scientists and calls for a new consideration in the practice of probability and statistics to eliminate what has been referred to as the "Cult of Statistical Significance". The book explains the philosophy of these ideas and not the mathematics, though there are a handful of mathematical examples. The topics are logically laid out, starting with basic philosophy as related to probability, statistics, and science, and stepping through the key probabilistic ideas and concepts, and ending with statistical models. Its jargon-free approach asserts that standard methods, suc...
River meander modeling and confronting uncertainty.
Energy Technology Data Exchange (ETDEWEB)
Posner, Ari J. (University of Arizona Tucson, AZ)
2011-05-01
This study examines the meandering phenomenon as it occurs in media throughout terrestrial, glacial, atmospheric, and aquatic environments. Analysis of the minimum energy principle, along with Coriolis force and random walk theories of the meandering phenomenon, found that these theories apply at different temporal and spatial scales. Coriolis forces might induce topological changes resulting in meandering planforms. The minimum energy principle might explain how these forces combine to limit the sinuosity to depth and width ratios that are common throughout various media. The study then compares the first order analytical solutions for flow field by Ikeda et al. (1981) and Johannesson and Parker (1989b). The linear bank erosion model of Ikeda et al. was implemented to predict the rate of bank erosion, in which the bank erosion coefficient is treated as a stochastic variable that varies with physical properties of the bank (e.g., cohesiveness, stratigraphy, or vegetation density). The developed model was used to predict the evolution of meandering planforms. Then, the modeling results were analyzed and compared to the observed data. Since the migration of a meandering channel consists of downstream translation, lateral expansion, and downstream or upstream rotations, several measures are formulated in order to determine which of the resulting planforms is closest to the experimentally measured one. Results from the deterministic model highly depend on the calibrated erosion coefficient. Since field measurements are always limited, the stochastic model yielded more realistic predictions of meandering planform evolutions. Due to the random nature of the bank erosion coefficient, the meandering planform evolution is a stochastic process that can only be accurately predicted by a stochastic model.
Examination of a conceptual model of child neglect.
Dubowitz, Howard; Newton, Rae R; Litrownik, Alan J; Lewis, Terri; Briggs, Ernestine C; Thompson, Richard; English, Diana; Lee, Li-Ching; Feerick, Margaret M
2005-05-01
This study attempted to provide empirical support for conceptual definitions of child neglect. We identified 12 types of needs, conceptualizing neglect as occurring when children's basic needs are not adequately met. We examined measures administered to 377 children and caregivers at ages 4 and 6 years participating in longitudinal studies on child maltreatment to identify potential indicators of these needs. Indicators were found for latent constructs, operationalizing three of the basic needs (emotional support and/or affection, protection from family conflict and/or violence, and from community violence). These latent constructs were used in a measurement model; this supported the conceptual definitions of neglect. A structural equation model then assessed whether the latent constructs were associated with child adjustment at age 8 years. Low level of perceived support from mother was associated with internalizing and externalizing behavior problems. Exposure to family conflict was also linked to these problems, and to social difficulties. Finally, children's sense of experiencing little early affection was associated with subsequent externalizing behavior and social problems. The approach of conceptualizing neglect in terms of unmet child needs, developing a measurement model to define latent neglect constructs, and relating these constructs to subsequent adjustment can build our understanding of neglect.
Conceptual model innovation management: market orientation
Directory of Open Access Journals (Sweden)
L.Ya. Maljuta
2015-06-01
Full Text Available The article highlights the issues that determine the beginning of the innovation process. It is noted that until recently, innovation management at all levels in Ukraine (regional, sectoral, institutional) was dominated by a product orientation focused on production innovation, and that the transition to a market economy, the restructuring of production and the growing complexity of social needs have strengthened the role of the consumer. It is argued that innovation itself is not the ultimate goal, but only a means of satisfying consumer needs, and that changing production conditions, more complex social needs and the need to improve the competitiveness of innovations require finding new forms of innovation. In this regard, it is proposed to distinguish basic schemes (models) of innovation for small businesses, individual entrepreneurs, venture capital firms, explerents, patients, violents and commutants, spin-off and spin-out companies, network (or shell) companies, and networks of small businesses.
Collaborative Rural Healthcare Network: A Conceptual Model
Directory of Open Access Journals (Sweden)
U. Raja
2011-07-01
Full Text Available Healthcare is a critical issue in rural communities throughout the world. Provision of timely and cost effective health care in these communities is a challenge since it is coupled with a lack of adequate infrastructure and manpower support. Twenty percent of the United States of America's population resides in rural communities, i.e., 59 million people; however, only nine percent of the nation's physicians practice in rural communities. Shortage of health care personnel and the lack of equipment and facilities often force rural residents to travel long distances to receive needed medical treatment. Researchers and practitioners are in search of solutions to address these unique challenges. In this research, we present a proposed collaborative model of a health information system for rural communities and the challenges and opportunities of this global issue.
A Conceptual Modeling for a GoldSim Program for Safety Assessment of an LILW Repository
International Nuclear Information System (INIS)
Lee, Youn Myoung; Hwang, Yong Soo; Kang, Chul Hyung; Lee, Sung Ho
2009-12-01
A modeling study and the development of a total system performance assessment (TSPA) program have been carried out by utilizing GoldSim under contract with KRMC. The program enables an assessment of safety and performance for a low- and intermediate-level radioactive waste disposal repository, with normal or abnormal nuclide release cases associated with the various FEPs involved in the performance of the proposed repository. The report presents a detailed conceptual modeling scheme for the GoldSim program modules, all of which are integrated into a TSPA program, as well as the input data set currently available. In-depth system models that are conceptually and rather practically described, and are ready for implementation in a GoldSim program, are introduced with plenty of illustrative conceptual models and sketches. The GoldSim program finally developed through this project is expected to be successfully applied to the post-closure safety assessment required by the regulatory body for both the LILW repository and the pyroprocessed-waste repository, with increased practicality and much reduced uncertainty
Uncertainty propagation through dynamic models of assemblies of mechanical structures
International Nuclear Information System (INIS)
Daouk, Sami
2016-01-01
When studying the behaviour of mechanical systems, mathematical models and structural parameters are usually considered deterministic. Return on experience shows however that these elements are uncertain in most cases, due to natural variability or lack of knowledge. Therefore, quantifying the quality and reliability of the numerical model of an industrial assembly remains a major question in low-frequency dynamics. The purpose of this thesis is to improve the vibratory design of bolted assemblies through setting up a dynamic connector model that takes account of different types and sources of uncertainty on stiffness parameters, in a way that is simple, efficient and exploitable in an industrial context. This work has been carried out in the framework of the SICODYN project, led by EDF R and D, that aims to characterise and quantify, numerically and experimentally, the uncertainties in the dynamic behaviour of bolted industrial assemblies. Comparative studies of several numerical methods of uncertainty propagation demonstrate the advantage of using the Lack-Of-Knowledge theory. An experimental characterisation of uncertainties in bolted structures is performed on a dynamic test rig and on an industrial assembly. The propagation of many small and large uncertainties through different dynamic models of mechanical assemblies leads to the assessment of the efficiency of the Lack-Of-Knowledge theory and its applicability in an industrial environment. (author)
Modeling multibody systems with uncertainties. Part II: Numerical applications
Energy Technology Data Exchange (ETDEWEB)
Sandu, Corina, E-mail: csandu@vt.edu; Sandu, Adrian; Ahmadian, Mehdi [Virginia Polytechnic Institute and State University, Mechanical Engineering Department (United States)
2006-04-15
This study applies generalized polynomial chaos theory to model complex nonlinear multibody dynamic systems operating in the presence of parametric and external uncertainty. Theoretical and computational aspects of this methodology are discussed in the companion paper 'Modeling Multibody Dynamic Systems With Uncertainties. Part I: Theoretical and Computational Aspects'. In this paper we illustrate the methodology on selected test cases. The combined effects of parametric and forcing uncertainties are studied for a quarter car model. The uncertainty distributions in the system response in both time and frequency domains are validated against Monte-Carlo simulations. Results indicate that polynomial chaos is more efficient than Monte Carlo and more accurate than statistical linearization. The results of the direct collocation approach are similar to the ones obtained with the Galerkin approach. A stochastic terrain model is constructed using a truncated Karhunen-Loeve expansion. The application of polynomial chaos to differential-algebraic systems is illustrated using the constrained pendulum problem. Limitations of the polynomial chaos approach are studied on two different test problems, one with multiple attractor points, and the second with a chaotic evolution and a nonlinear attractor set. The overall conclusion is that, despite its limitations, generalized polynomial chaos is a powerful approach for the simulation of multibody dynamic systems with uncertainties.
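The non-intrusive flavor of generalized polynomial chaos can be sketched for a scalar toy response (a static quarter-car spring deflection with uncertain stiffness, invented numbers, not the paper's model), validated against Monte Carlo as the abstract describes.

```python
import math
import numpy as np
from numpy.polynomial.hermite_e import hermegauss, hermeval

rng = np.random.default_rng(2)

# Toy response: deflection y = F / k with uncertain stiffness
# k = k0 * (1 + 0.1 * xi), xi ~ N(0, 1). (Illustrative numbers only.)
F, k0 = 1000.0, 20000.0
model = lambda xi: F / (k0 * (1.0 + 0.1 * xi))

# Non-intrusive gPC: project y onto probabilists' Hermite polynomials
# He_n via Gauss-Hermite quadrature; note E[He_n(xi)^2] = n!.
nodes, w = hermegauss(10)
w = w / w.sum()  # normalize quadrature weights to a probability measure
y_nodes = model(nodes)

order = 4
coeffs = [np.sum(w * y_nodes * hermeval(nodes, [0.0] * n + [1.0]))
          / math.factorial(n) for n in range(order + 1)]

mean_pc = coeffs[0]
var_pc = sum(c ** 2 * math.factorial(n)
             for n, c in enumerate(coeffs) if n > 0)

# Monte Carlo reference for validation.
y_mc = model(rng.standard_normal(200_000))
```

With only 10 model evaluations the gPC moments match a 200,000-sample Monte Carlo run, illustrating the efficiency claim in the abstract.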
Uncertainties in spatially aggregated predictions from a logistic regression model
Horssen, P.W. van; Pebesma, E.J.; Schot, P.P.
2002-01-01
This paper presents a method to assess the uncertainty of an ecological spatial prediction model which is based on logistic regression models, using data from the interpolation of explanatory predictor variables. The spatial predictions are presented as approximate 95% prediction intervals. The
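The delta-method construction of approximate 95% intervals for a logistic prediction model can be sketched as follows; the coefficient vector and covariance matrix below are hypothetical stand-ins, not values from the cited study.

```python
import numpy as np

# Hypothetical fitted logistic model: eta = b0 + b1 * predictor.
beta = np.array([1.2, -0.8])
V = np.array([[0.04, -0.01],   # assumed covariance of the estimates
              [-0.01, 0.02]])

def expit(z):
    return 1.0 / (1.0 + np.exp(-z))

def predict_with_interval(x_row):
    """Approximate 95% interval: normal interval on the linear predictor
    (delta method), then transformed through the logistic link."""
    eta = x_row @ beta
    se = np.sqrt(x_row @ V @ x_row)
    return expit(eta - 1.96 * se), expit(eta), expit(eta + 1.96 * se)

lo, p, hi = predict_with_interval(np.array([1.0, 0.5]))
```

Evaluating `predict_with_interval` on interpolated predictor values at every grid cell yields the spatial prediction-interval maps the abstract refers to.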
Can agent based models effectively reduce fisheries management implementation uncertainty?
Drexler, M.
2016-02-01
Uncertainty is an inherent feature of fisheries management. Implementation uncertainty remains a challenge to quantify often due to unintended responses of users to management interventions. This problem will continue to plague both single species and ecosystem based fisheries management advice unless the mechanisms driving these behaviors are properly understood. Equilibrium models, where each actor in the system is treated as uniform and predictable, are not well suited to forecast the unintended behaviors of individual fishers. Alternatively, agent based models (ABMs) can simulate the behaviors of each individual actor driven by differing incentives and constraints. This study evaluated the feasibility of using ABMs to capture macro scale behaviors of the US West Coast Groundfish fleet. Agent behavior was specified at the vessel level. Agents made daily fishing decisions using knowledge of their own cost structure, catch history, and the histories of catch and quota markets. By adding only a relatively small number of incentives, the model was able to reproduce highly realistic macro patterns of expected outcomes in response to management policies (catch restrictions, MPAs, ITQs) while preserving vessel heterogeneity. These simulations indicate that agent based modeling approaches hold much promise for simulating fisher behaviors and reducing implementation uncertainty. Additional processes affecting behavior, informed by surveys, are continually being added to the fisher behavior model. Further coupling of the fisher behavior model to a spatial ecosystem model will provide a fully integrated social, ecological, and economic model capable of performing management strategy evaluations to properly consider implementation uncertainty in fisheries management.
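The vessel-level decision logic described in this abstract can be caricatured in a few lines. The sketch below is a hypothetical toy, far simpler than the West Coast groundfish model: each vessel fishes only when its expected revenue exceeds its own trip cost, and a fleet-wide quota closes the fishery. All class names, incentives, and numbers are invented.

```python
import random

# Minimal agent-based sketch: heterogeneous vessels + one management rule.
class Vessel:
    def __init__(self, cost, skill):
        self.cost = cost        # per-trip cost, varies across agents
        self.skill = skill      # expected catch per trip
        self.landings = 0.0

def simulate(n_vessels=50, days=100, quota=2000.0, price=1.0, seed=1):
    rng = random.Random(seed)
    fleet = [Vessel(cost=rng.uniform(5, 15), skill=rng.uniform(8, 12))
             for _ in range(n_vessels)]
    total = 0.0
    for _ in range(days):
        if total >= quota:
            break               # management intervention: quota closure
        for v in fleet:
            if v.skill * price > v.cost:    # individual incentive to fish
                catch = max(rng.gauss(v.skill, 1.0), 0.0)
                v.landings += catch
                total += catch
    return fleet, total

fleet, total = simulate()
active = sum(1 for v in fleet if v.landings > 0)
print(f"total landings={total:.0f}, active vessels={active}/{len(fleet)}")
```

Even this crude version shows the macro pattern the paper exploits: only the subset of vessels whose incentives favor fishing participate, and the aggregate outcome responds to the policy lever (the quota) while vessel heterogeneity is preserved.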
Model uncertainty and systematic risk in US banking
Baele, L.T.M.; De Bruyckere, Valerie; De Jonghe, O.G.; Vander Vennet, Rudi
This paper uses Bayesian Model Averaging to examine the driving factors of equity returns of US Bank Holding Companies. An advantage of BMA over OLS is that it accounts for the considerable uncertainty about the correct set (model) of bank risk factors. We find that out of a broad set of 12 risk
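The model-averaging logic of this abstract can be illustrated with BIC-based posterior model weights over all subsets of candidate risk factors. This is a sketch on synthetic data with an approximate weighting scheme, not the paper's actual estimation; all variable names and numbers are invented.

```python
import itertools
import numpy as np

# Synthetic data: returns driven by factors 0 and 1 but not 2.
rng = np.random.default_rng(42)
n = 200
X = rng.normal(size=(n, 3))                 # three candidate risk factors
y = 1.5 * X[:, 0] - 0.7 * X[:, 1] + rng.normal(scale=0.5, size=n)

def fit_bic(cols):
    """OLS fit on a column subset; return (coefficients, BIC)."""
    A = np.column_stack([np.ones(n)] + [X[:, c] for c in cols])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ beta
    sigma2 = resid @ resid / n
    bic = n * np.log(sigma2) + A.shape[1] * np.log(n)
    return beta, bic

# Enumerate all 2^3 factor subsets (the 'model space').
models = [c for r in range(4) for c in itertools.combinations(range(3), r)]
bics = np.array([fit_bic(m)[1] for m in models])
# Posterior model probabilities ~ exp(-BIC/2) under equal prior weights.
w = np.exp(-(bics - bics.min()) / 2)
w /= w.sum()
for m, p in zip(models, w):
    print(f"factors {m}: P(model|data) ~= {p:.3f}")
```

Rather than committing to a single "best" regression, predictions or coefficient estimates would be averaged across models with these weights, which is exactly how model uncertainty enters the answer.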
Stothoff, Stuart A.
2013-06-01
The U.S. Nuclear Regulatory Commission actively investigated climate and infiltration at Yucca Mountain for two decades to (i) understand important controls and uncertainties influencing percolation through the unsaturated zone on multimillennial time scales and (ii) provide flux boundary conditions for up to 1 million years in performance assessment models of the proposed Yucca Mountain repository. This second part of a two-part series describes site-scale model results for present and potential future conditions and confirmatory analyses for present-day conditions. At both the grid-cell and site-average scale, the calculated uncertainty distribution for net infiltration is approximately lognormal, and the coefficient of variation decreases with increasing net infiltration. Smaller relative but larger absolute responses to climate change occur where net infiltration is large. Comparisons of distributed model estimates with temperature and geochemical observations from the unsaturated zone suggest that average estimates are generally consistent but exhibit significant variability. An observed seepage event in the South Ramp of the Exploratory Studies Facility, combined with related subsurface observations across the site, suggests that subsurface spreading from zones of high infiltration to zones of low infiltration may occur in stratabound fractures, laterally extensive discontinuities, or at transitions between welded and nonwelded tuff units. Two conceptual models for unsaturated-zone flow each explain the subsurface observations, collectively providing bounding estimates for net infiltration. Model-predicted uncertainty distribution for decadal-average site-scale net infiltration is generally consistent with estimated percolation fluxes using the bounding hypotheses, suggesting that the model-calculated uncertainty is reasonably consistent with the uncertainty in interpreting site observations.
Conceptual model of male military sexual trauma.
Elder, William B; Domino, Jessica L; Rentz, Timothy O; Mata-Galán, Emma L
2017-08-01
Male sexual trauma is understudied, leaving much to be known about the unique mental health needs of male survivors. This study examined veteran men's perceptions of the effects of military sexual trauma. Military sexual trauma was defined as physically forced, verbally coerced, or substance-incapacitated acts experienced during military service. Interviews were conducted with 21 male veterans who reported experiencing military sexual trauma. Data were drawn together using a grounded theory methodology. Three categories emerged from data analysis, including (a) types of military sexual trauma (being touched in a sexual way against their will [N = 18]; sexual remarks directed at them [N = 15]; being physically forced to have sex [N = 13]); (b) negative life effects (difficulty trusting others [N = 18]; fear of abandonment [N = 17]; substance use [N = 13]; fear of interpersonal violence [N = 12]; conduct and vocational problems [N = 11]; irritability/aggression [N = 8]; insecurity about sexual performance [N = 8]; difficulty managing anger [N = 8]); and (c) posttraumatic growth (N = 15). Results from this study suggest sexual trauma in the military context may affect systems of self-organization, specifically problems in affective, self-concept, and relational domains, similar to symptoms of those who have experienced prolonged traumatic stressors. This model can be used by clinicians to select treatments that specifically target these symptoms and promote posttraumatic growth. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
Identifying influences on model uncertainty: an application using a forest carbon budget model
James E. Smith; Linda S. Heath
2001-01-01
Uncertainty is an important consideration for both developers and users of environmental simulation models. Establishing quantitative estimates of uncertainty for deterministic models can be difficult when the underlying bases for such information are scarce. We demonstrate an application of probabilistic uncertainty analysis that provides for refinements in...
Model-based uncertainty in species range prediction
DEFF Research Database (Denmark)
Pearson, R. G.; Thuiller, Wilfried; Bastos Araujo, Miguel
2006-01-01
Aim Many attempts to predict the potential range of species rely on environmental niche (or 'bioclimate envelope') modelling, yet the effects of using different niche-based methodologies require further investigation. Here we investigate the impact that the choice of model can have on predictions, identify key reasons why model output may differ and discuss the implications that model uncertainty has for policy-guiding applications. Location The Western Cape of South Africa. Methods We applied nine of the most widely used modelling techniques to model potential distributions under current ... algorithm when extrapolating beyond the range of data used to build the model. The effects of these factors should be carefully considered when using this modelling approach to predict species ranges. Main conclusions We highlight an important source of uncertainty in assessments of the impacts of climate ...
Trade-offs underlying maternal breastfeeding decisions: A conceptual model
Tully, Kristin P.; Ball, Helen L.
2011-01-01
This paper presents a new conceptual model that generates predictions about breastfeeding decisions and identifies interactions that affect outcomes. We offer a contextual approach to infant feeding that models multi-directional influences by expanding on the evolutionary parent–offspring conflict and situation-specific breastfeeding theories. The main hypothesis generated from our framework suggests that simultaneously addressing breastfeeding costs and benefits, in relation to how they are ...
A Traceability-based Method to Support Conceptual Model Evolution
Ruiz Carmona, Luz Marcela
2014-01-01
Renewing software systems is one of the most cost-effective ways to protect software investment, which saves time, money and ensures uninter- rupted access to technical support and product upgrades. There are several mo- tivations to promote investment and scientific effort for specifying systems by means of conceptual models and supporting its evolution. As an example, the software engineering community is addressing solutions for supporting model traceability, continuous improvement of busi...
Impact of inherent meteorology uncertainty on air quality model predictions
Gilliam, Robert C.; Hogrefe, Christian; Godowitch, James M.; Napelenok, Sergey; Mathur, Rohit; Rao, S. Trivikrama
2015-12-01
It is well established that there are a number of different classifications and sources of uncertainties in environmental modeling systems. Air quality models rely on two key inputs, namely, meteorology and emissions. When using air quality models for decision making, it is important to understand how uncertainties in these inputs affect the simulated concentrations. Ensembles are one method to explore how uncertainty in meteorology affects air pollution concentrations. Most studies explore this uncertainty by running different meteorological models or the same model with different physics options and in some cases combinations of different meteorological and air quality models. While these have been shown to be useful techniques in some cases, we present a technique that leverages the initial condition perturbations of a weather forecast ensemble, namely, the Short-Range Ensemble Forecast system to drive the four-dimensional data assimilation in the Weather Research and Forecasting (WRF)-Community Multiscale Air Quality (CMAQ) model with a key focus being the response of ozone chemistry and transport. Results confirm that a sizable spread in WRF solutions, including common weather variables of temperature, wind, boundary layer depth, clouds, and radiation, can cause a relatively large range of ozone-mixing ratios. Pollutant transport can be altered by hundreds of kilometers over several days. Ozone-mixing ratios of the ensemble can vary as much as 10-20 ppb or 20-30% in areas that typically have higher pollution levels.
Model for predicting mountain wave field uncertainties
Damiens, Florentin; Lott, François; Millet, Christophe; Plougonven, Riwal
2017-04-01
Studying the propagation of acoustic waves through the troposphere requires knowledge of wind speed and temperature gradients from the ground up to about 10-20 km. Typical planetary boundary layer flows are known to present vertical low-level shears that can interact with mountain waves, thereby triggering small-scale disturbances. Resolving these fluctuations for long-range propagation problems is, however, not feasible because of computer memory/time restrictions and thus, they need to be parameterized. When the disturbances are small enough, these fluctuations can be described by linear equations. Previous works by co-authors have shown that the critical layer dynamics that occur near the ground produce large horizontal flows and buoyancy disturbances that result in intense downslope winds and gravity wave breaking. While these phenomena manifest almost systematically for high Richardson numbers and when the boundary layer depth is relatively small compared to the mountain height, the process by which static stability affects downslope winds remains unclear. In the present work, new linear mountain gravity wave solutions are tested against numerical predictions obtained with the Weather Research and Forecasting (WRF) model. For Richardson numbers typically larger than unity, the mesoscale model is used to quantify the effect of neglected nonlinear terms on downslope winds and mountain wave patterns. At these regimes, the large downslope winds transport warm air, a so-called "Foehn" effect that can impact sound propagation properties. The sensitivity of small-scale disturbances to the Richardson number is quantified using two-dimensional spectral analysis. It is shown through a pilot study of subgrid-scale fluctuations of boundary layer flows over realistic mountains that the cross-spectrum of the mountain wave field is made up of the same components found in WRF simulations. The impact of each individual component on acoustic wave propagation is discussed in terms of
Modeling flow in fractured medium. Uncertainty analysis with stochastic continuum approach
International Nuclear Information System (INIS)
Niemi, A.
1994-01-01
For modeling groundwater flow in formation-scale fractured media, no general method exists for scaling the highly heterogeneous hydraulic conductivity data to model parameters. The deterministic approach is limited in representing the heterogeneity of a medium and the application of fracture network models has both conceptual and practical limitations as far as site-scale studies are concerned. The study investigates the applicability of stochastic continuum modeling at the scale of data support. No scaling of the field data is involved, and the original variability is preserved throughout the modeling. Contributions of various aspects to the total uncertainty in the modeling prediction can also be determined with this approach. Data from five crystalline rock sites in Finland are analyzed. (107 refs., 63 figs., 7 tabs.)
Fenicia, F.; Kavetski, D.; Savenije, H.H.G.
2011-01-01
This paper introduces a flexible framework for conceptual hydrological modeling, with two related objectives: (1) generalize and systematize the currently fragmented field of conceptual models and (2) provide a robust platform for understanding and modeling hydrological systems. In contrast to
Conceptual Commitments of the LIDA Model of Cognition
Franklin, Stan; Strain, Steve; McCall, Ryan; Baars, Bernard
2013-06-01
Significant debate on fundamental issues remains in the subfields of cognitive science, including perception, memory, attention, action selection, learning, and others. Psychology, neuroscience, and artificial intelligence each contribute alternative and sometimes conflicting perspectives on the supervening problem of artificial general intelligence (AGI). Current efforts toward a broad-based, systems-level model of minds cannot await theoretical convergence in each of the relevant subfields. Such work therefore requires the formulation of tentative hypotheses, based on current knowledge, that serve to connect cognitive functions into a theoretical framework for the study of the mind. We term such hypotheses "conceptual commitments" and describe the hypotheses underlying one such model, the Learning Intelligent Distribution Agent (LIDA) Model. Our intention is to initiate a discussion among AGI researchers about which conceptual commitments are essential, or particularly useful, toward creating AGI agents.
Uncertainty Quantification for Large-Scale Ice Sheet Modeling
Energy Technology Data Exchange (ETDEWEB)
Ghattas, Omar [Univ. of Texas, Austin, TX (United States)
2016-02-05
This report summarizes our work to develop advanced forward and inverse solvers and uncertainty quantification capabilities for a nonlinear 3D full Stokes continental-scale ice sheet flow model. The components include: (1) forward solver: a new state-of-the-art parallel adaptive scalable high-order-accurate mass-conservative Newton-based 3D nonlinear full Stokes ice sheet flow simulator; (2) inverse solver: a new adjoint-based inexact Newton method for solution of deterministic inverse problems governed by the above 3D nonlinear full Stokes ice flow model; and (3) uncertainty quantification: a novel Hessian-based Bayesian method for quantifying uncertainties in the inverse ice sheet flow solution and propagating them forward into predictions of quantities of interest such as ice mass flux to the ocean.
Flight Dynamics and Control of Elastic Hypersonic Vehicles Uncertainty Modeling
Chavez, Frank R.; Schmidt, David K.
1994-01-01
It has been shown previously that hypersonic air-breathing aircraft exhibit strong aeroelastic/aeropropulsive dynamic interactions. To investigate these, especially from the perspective of the vehicle dynamics and control, analytical expressions for key stability derivatives were derived, and an analysis of the dynamics was performed. In this paper, the important issue of model uncertainty, and the appropriate forms for representing this uncertainty, is addressed. It is shown that the methods suggested in the literature for analyzing the robustness of multivariable feedback systems, which as a prerequisite to their application assume particular forms of model uncertainty, can be difficult to apply on real atmospheric flight vehicles. Also, the extent to which available methods are conservative is demonstrated for this class of vehicle dynamics.
Sensitivity of wildlife habitat models to uncertainties in GIS data
Stoms, David M.; Davis, Frank W.; Cogan, Christopher B.
1992-01-01
Decision makers need to know the reliability of output products from GIS analysis. For many GIS applications, it is not possible to compare these products to an independent measure of 'truth'. Sensitivity analysis offers an alternative means of estimating reliability. In this paper, we present a GIS-based statistical procedure for estimating the sensitivity of wildlife habitat models to uncertainties in input data and model assumptions. The approach is demonstrated in an analysis of habitat associations derived from a GIS database for the endangered California condor. Alternative data sets were generated to compare results over a reasonable range of assumptions about several sources of uncertainty. Sensitivity analysis indicated that condor habitat associations are relatively robust, and the results have increased our confidence in our initial findings. Uncertainties and methods described in the paper have general relevance for many GIS applications.
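The alternative-data-set idea in this abstract amounts to perturbing an input layer within its assumed error and recording how the model output moves. The sketch below is a hypothetical toy (a synthetic elevation grid and a made-up habitat rule), not the condor analysis itself.

```python
import numpy as np

# Toy sensitivity analysis: perturb an input DEM within an assumed error
# level and record how the modelled habitat area responds.
rng = np.random.default_rng(7)
elev = rng.uniform(0, 3000, size=(100, 100))        # synthetic DEM, metres

def habitat_area(dem):
    """Invented habitat rule: suitable between 500 m and 1500 m elevation."""
    return np.mean((dem > 500) & (dem < 1500))

base = habitat_area(elev)
areas = []
for _ in range(200):                                # 200 perturbed data sets
    noisy = elev + rng.normal(0, 50, size=elev.shape)   # assumed 50 m RMSE
    areas.append(habitat_area(noisy))
areas = np.array(areas)
print(f"baseline fraction={base:.3f}, "
      f"perturbed mean={areas.mean():.3f} +/- {areas.std():.3f}")
```

A small spread across the perturbed realizations relative to the baseline is what "relatively robust" means operationally in this kind of analysis.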
Linear models in the mathematics of uncertainty
Mordeson, John N; Clark, Terry D; Pham, Alex; Redmond, Michael A
2013-01-01
The purpose of this book is to present new mathematical techniques for modeling global issues. These mathematical techniques are used to determine linear equations between a dependent variable and one or more independent variables in cases where standard techniques such as linear regression are not suitable. In this book, we examine cases where the number of data points is small (effects of nuclear warfare), where the experiment is not repeatable (the breakup of the former Soviet Union), and where the data is derived from expert opinion (how conservative is a political party). In all these cases the data is difficult to measure and an assumption of randomness and/or statistical validity is questionable. We apply our methods to real world issues in international relations such as nuclear deterrence, smart power, and cooperative threat reduction. We next apply our methods to issues in comparative politics such as successful democratization, quality of life, economic freedom, political stability, and fail...
Solar model uncertainties, MSW analysis, and future solar neutrino experiments
International Nuclear Information System (INIS)
Hata, N.; Langacker, P.
1994-01-01
Various theoretical uncertainties in the standard solar model and in the Mikheyev-Smirnov-Wolfenstein (MSW) analysis are discussed. It is shown that two methods give consistent estimates of the solar neutrino flux uncertainties: (a) a simple parametrization of the uncertainties using the core temperature and the nuclear production cross sections; (b) the Monte Carlo method of Bahcall and Ulrich. In the MSW analysis, we emphasize proper treatments of correlations of theoretical uncertainties between flux components and between different detectors, the Earth effect, and multiple solutions in a combined χ² procedure. In particular the large-angle solution of the combined observation is allowed at 95% C.L. only when the theoretical uncertainties are included. If their correlations were ignored, the region would be overestimated. The MSW solutions for various standard and nonstandard solar models are also shown. The MSW predictions of the global solutions for the future solar neutrino experiments are given, emphasizing the measurement of the energy spectrum and the day-night effect in the Sudbury Neutrino Observatory and Super-Kamiokande to distinguish the two solutions.
Uncertainty assessment in building energy performance with a simplified model
Directory of Open Access Journals (Sweden)
Titikpina Fally
2015-01-01
To assess a building energy performance, the consumption being predicted or estimated during the design stage is compared to the measured consumption when the building is operational. When evaluating this performance, many buildings show significant differences between the calculated and measured consumption. In order to assess the performance accurately and ensure the thermal efficiency of the building, it is necessary to evaluate the uncertainties involved not only in measurement but also those induced by the propagation of the dynamic and the static input data in the model being used. The evaluation of measurement uncertainty is based on both the knowledge about the measurement process and the input quantities which influence the result of measurement. Measurement uncertainty can be evaluated within the framework of conventional statistics presented in the Guide to the Expression of Uncertainty in Measurement (GUM) as well as by Bayesian Statistical Theory (BST). Another choice is the use of numerical methods like Monte Carlo Simulation (MCS). In this paper, we propose to evaluate the uncertainty associated with the use of a simplified model for the estimation of the energy consumption of a given building. A detailed review and discussion of these three approaches (GUM, MCS and BST) is given. To this end, an office building has been monitored and multiple temperature sensors have been mounted on candidate locations to get required data. The monitored zone is composed of six offices and has an overall surface of 102 m².
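The GUM and Monte Carlo routes mentioned in this abstract can be contrasted on a deliberately simplified model. The heat-loss expression, values, and uncertainties below are invented for illustration; the comparison of the two propagation methods is the point.

```python
import numpy as np

# Simplified model Q = U * A * dT with independent uncertain inputs.
U, u_U = 0.8, 0.05      # W/(m2 K), value and standard uncertainty (assumed)
A, u_A = 102.0, 1.0     # m2
dT, u_dT = 15.0, 0.5    # K

Q = U * A * dT
# GUM: first-order combined standard uncertainty via partial derivatives.
u_Q_gum = np.sqrt((A * dT * u_U) ** 2
                  + (U * dT * u_A) ** 2
                  + (U * A * u_dT) ** 2)

# Monte Carlo (JCGM 101 style): propagate the input distributions directly.
rng = np.random.default_rng(3)
N = 100_000
Qs = (rng.normal(U, u_U, N) * rng.normal(A, u_A, N) * rng.normal(dT, u_dT, N))
print(f"Q = {Q:.1f} W,  u_GUM = {u_Q_gum:.1f} W,  u_MC = {Qs.std():.1f} W")
```

For a smooth model with small relative uncertainties the two methods agree closely; Monte Carlo becomes the safer choice when the model is strongly nonlinear or the output distribution is far from Gaussian.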
Simulation of shallow groundwater levels: Comparison of a data-driven and a conceptual model
Fahle, Marcus; Dietrich, Ottfried; Lischeid, Gunnar
2015-04-01
Despite an abundance of models aimed at simulating shallow groundwater levels, application of such models is often hampered by a lack of appropriate input data. Difficulties especially arise with regard to soil data, which are typically hard to obtain and prone to spatial variability, eventually leading to uncertainties in the model results. Modelling approaches relying entirely on easily measured quantities are therefore an alternative to encourage the applicability of models. We present and compare two models for calculating 1-day-ahead predictions of the groundwater level that are only based on measurements of potential evapotranspiration, precipitation and groundwater levels. The first model is a newly developed conceptual model that is parametrized using the White method (which estimates the actual evapotranspiration on the basis of diurnal groundwater fluctuations) and a rainfall-response ratio. Inverted versions of the two latter approaches are then used to calculate the predictions of the groundwater level. Furthermore, as a completely data-driven alternative, a simple feed-forward multilayer perceptron neural network was trained based on the same inputs and outputs. Data from four growing periods (April to October) from a study site situated in the Spreewald wetland in North-east Germany were taken to set up the models and compare their performance. In addition, response surfaces that relate model outputs to combinations of different input variables are used to reveal those aspects in which the two approaches coincide and those in which they differ. Finally, it will be evaluated whether the conceptual approach can be enhanced by extracting knowledge of the neural network. This is done by replacing in the conceptual model the default function that relates groundwater recharge and groundwater level, which is assumed to be linear, by the non-linear function extracted from the neural network.
Energy Technology Data Exchange (ETDEWEB)
Vermeul, Vincent R.; Cole, Charles R.; Bergeron, Marcel P.; Thorne, Paul D.; Wurstner, Signe K.
2001-08-29
The baseline three-dimensional transient inverse model for the estimation of site-wide scale flow parameters, including their uncertainties, using data on the transient behavior of the unconfined aquifer system over the entire historical period of Hanford operations, has been modified to account for the effects of basalt intercommunication between the Hanford unconfined aquifer and the underlying upper basalt confined aquifer. Both the baseline and alternative conceptual models (ACM-1) considered only the groundwater flow component and corresponding observational data in the 3-D transient inverse calibration efforts. Subsequent efforts will examine both groundwater flow and transport. Comparisons of goodness of fit measures and parameter estimation results for the ACM-1 transient inverse calibrated model with those from previous site-wide groundwater modeling efforts illustrate that the new 3-D transient inverse model approach will strengthen the technical defensibility of the final model(s) and provide the ability to incorporate uncertainty in predictions related to both conceptual model and parameter uncertainty. These results, however, indicate that additional improvements are required to the conceptual model framework. An investigation was initiated at the end of this basalt inverse modeling effort to determine whether facies-based zonation would improve specific yield parameter estimation results (ACM-2). A description of the justification and methodology to develop this zonation is discussed.
Uncertainties in model-based outcome predictions for treatment planning
International Nuclear Information System (INIS)
Deasy, Joseph O.; Chao, K.S. Clifford; Markman, Jerry
2001-01-01
Purpose: Model-based treatment-plan-specific outcome predictions (such as normal tissue complication probability [NTCP] or the relative reduction in salivary function) are typically presented without reference to underlying uncertainties. We provide a method to assess the reliability of treatment-plan-specific dose-volume outcome model predictions. Methods and Materials: A practical method is proposed for evaluating model prediction based on the original input data together with bootstrap-based estimates of parameter uncertainties. The general framework is applicable to continuous variable predictions (e.g., prediction of long-term salivary function) and dichotomous variable predictions (e.g., tumor control probability [TCP] or NTCP). Using bootstrap resampling, a histogram of the likelihood of alternative parameter values is generated. For a given patient and treatment plan we generate a histogram of alternative model results by computing the model predicted outcome for each parameter set in the bootstrap list. Residual uncertainty ('noise') is accounted for by adding a random component to the computed outcome values. The residual noise distribution is estimated from the original fit between model predictions and patient data. Results: The method is demonstrated using a continuous-endpoint model to predict long-term salivary function for head-and-neck cancer patients. Histograms represent the probabilities for the level of posttreatment salivary function based on the input clinical data, the salivary function model, and the three-dimensional dose distribution. For some patients there is significant uncertainty in the prediction of xerostomia, whereas for other patients the predictions are expected to be more reliable. In contrast, TCP and NTCP endpoints are dichotomous, and parameter uncertainties should be folded directly into the estimated probabilities, thereby improving the accuracy of the estimates. Using bootstrap parameter estimates, competing treatment
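The bootstrap recipe in this abstract (resample the fit data, refit, predict, add residual noise) is straightforward to sketch for a continuous endpoint. Everything below is synthetic and simplified: the linear dose-response form, the invented numbers, and the single-plan prediction stand in for the paper's salivary-function model.

```python
import numpy as np

# Synthetic 'patients': post-treatment relative salivary function vs. dose.
rng = np.random.default_rng(11)
n = 80
dose = rng.uniform(10, 60, n)                      # mean parotid dose, Gy
func = 1.0 - 0.012 * dose + rng.normal(0, 0.08, n)

coef = np.polyfit(dose, func, 1)                   # original model fit
resid_sd = np.std(func - np.polyval(coef, dose))   # residual 'noise' level

plan_dose = 35.0                                   # the plan being evaluated
preds = []
for _ in range(2000):                              # bootstrap resamples
    idx = rng.integers(0, n, n)
    c = np.polyfit(dose[idx], func[idx], 1)        # alternative parameter set
    # parameter uncertainty plus a residual random component, per the method
    preds.append(np.polyval(c, plan_dose) + rng.normal(0, resid_sd))
preds = np.array(preds)
lo, hi = np.percentile(preds, [2.5, 97.5])
print(f"predicted function at {plan_dose} Gy: mean={preds.mean():.2f}, "
      f"95% interval=({lo:.2f}, {hi:.2f})")
```

The histogram of `preds` is the plan-specific outcome distribution the abstract describes: a prediction with its reliability attached, rather than a bare point estimate.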
Promoting Conceptual Coherence in Quantum Learning through Computational Models
Lee, Hee-Sun
2012-02-01
In order to explain phenomena at the quantum level, scientists use multiple representations in verbal, pictorial, mathematical, and computational forms. Conceptual coherence among these multiple representations is used as an analytical framework to describe student learning trajectories in quantum physics. A series of internet-based curriculum modules are designed to address topics in quantum mechanics, semiconductor physics, and nano-scale engineering applications. In these modules, students are engaged in inquiry-based activities situated in a highly interactive computational modeling environment. This study was conducted in an introductory level solid state physics course. Based on in-depth interviews with 13 students, methods for identifying conceptual coherence as a function of students' level of understanding are presented. Pre-post test comparisons of 20 students in the course indicate a statistically significant improvement in students' conceptual coherence of understanding quantum phenomena before and after the course, Effect Size = 1.29 SD. Additional analyses indicate that students who responded to the modules more coherently improved their conceptual coherence to a greater extent than those who responded less coherently, after controlling for their course grades.
Uncertainty Quantification in Control Problems for Flocking Models
Directory of Open Access Journals (Sweden)
Giacomo Albi
2015-01-01
The optimal control of flocking models with random inputs is investigated from a numerical point of view. The effect of uncertainty in the interaction parameters is studied for a Cucker-Smale type model using a generalized polynomial chaos (gPC) approach. Numerical evidence of threshold effects in the alignment dynamic due to the random parameters is given. The use of a selective model predictive control permits steering of the system towards the desired state even in unstable regimes.
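The uncontrolled part of the setup in this abstract can be sketched directly: a Cucker-Smale alignment model whose interaction strength is a random parameter. The sketch below uses plain Monte Carlo over that parameter (the paper's gPC treatment and the control term are omitted); the kernel, parameter range, and all numbers are assumptions.

```python
import numpy as np

# 1-D Cucker-Smale model with uncertain interaction strength K.
def simulate(K, n=20, steps=400, dt=0.05, seed=0):
    rng = np.random.default_rng(seed)       # same initial state for every K
    x = rng.uniform(-1, 1, n)               # positions
    v = rng.uniform(-1, 1, n)               # velocities
    for _ in range(steps):
        dv = np.zeros(n)
        for i in range(n):
            w = 1.0 / (1.0 + (x - x[i]) ** 2)   # communication kernel
            dv[i] = K * np.mean(w * (v - v[i]))
        v += dt * dv
        x += dt * v
    return v

rng = np.random.default_rng(5)
spreads = []
for K in rng.uniform(0.5, 2.0, 50):         # random parameter K ~ U(0.5, 2)
    v = simulate(K)
    spreads.append(v.max() - v.min())       # residual misalignment at t = 20
spreads = np.array(spreads)
print(f"velocity spread after t=20: mean={spreads.mean():.3f}, "
      f"max={spreads.max():.3f}")
```

The distribution of final velocity spreads across sampled K values is the kind of parametric-uncertainty output that the gPC expansion in the paper computes far more cheaply than sampling.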
Where is positional uncertainty a problem for species distribution modelling?
Naimi, N.; Hamm, N.A.S.; Groen, T.A.; Skidmore, A.K.; Toxopeus, A.G.
2014-01-01
Species data held in museum and herbaria, survey data and opportunistically observed data are a substantial information resource. A key challenge in using these data is the uncertainty about where an observation is located. This is important when the data are used for species distribution modelling
Reducing uncertainty based on model fitness: Application to a ...
African Journals Online (AJOL)
A weakness of global sensitivity and uncertainty analysis methodologies is the often subjective definition of prior parameter probability distributions, especially ... The reservoir representing the central part of the wetland, where flood waters separate into several independent distributaries, is a keystone area within the model.
Uncertainty modelling of critical column buckling for reinforced ...
Indian Academy of Sciences (India)
ances the accuracy of the structural models by using experimental results and design codes. (Baalbaki et al 1991; ... in calculation of column buckling load as defined in the following section. 4. Fuzzy logic ... material uncertainty, using the value becomes a critical solution and is a more accurate and safe method compared ...
System convergence in transport models: algorithms efficiency and output uncertainty
DEFF Research Database (Denmark)
Rich, Jeppe; Nielsen, Otto Anker
2015-01-01
... of this paper is to analyse convergence performance for the external loop and to illustrate how an improper linkage between the converging parts can lead to substantial uncertainty in the final output. Although this loop is crucial for the performance of large-scale transport models it has not been analysed ... -scale in the Danish National Transport Model (DNTM). It is revealed that system convergence requires that either demand or supply is without random noise, but not both. In that case, if MSA is applied to the model output with random noise, it will converge effectively as the random effects are gradually dampened in the MSA process. In connection to DNTM it is shown that MSA works well when applied to travel-time averaging, whereas trip averaging is generally infected by random noise resulting from the assignment model. The latter implies that the minimum uncertainty in the final model output is dictated ...
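The method of successive averages (MSA) referred to in this abstract can be shown on a toy demand/supply fixed point. The functions and numbers below are invented; the sketch only illustrates the paper's point that the 1/k averaging dampens simulation noise in the external loop where naive iteration would not.

```python
import numpy as np

rng = np.random.default_rng(2)

def supply_time(flow, noisy):
    """Travel time increases with flow; optionally add simulation noise."""
    t = 10.0 + 0.01 * flow ** 2
    return t + rng.normal(0, 1.0) if noisy else t

def demand(time):
    """Demand decreases with travel time."""
    return max(0.0, 100.0 - 2.0 * time)

def solve(noisy, iters=500):
    avg = 50.0                               # initial flow guess
    for k in range(1, iters + 1):
        new_flow = demand(supply_time(avg, noisy))
        avg += (new_flow - avg) / k          # MSA step: average the outputs
        # (a naive loop would set avg = new_flow and keep all the noise)
    return avg

det = solve(False)
sto = solve(True)
print(f"deterministic fixed point ~ {det:.2f}")
print(f"MSA with noisy supply     ~ {sto:.2f}")
```

Here plain fixed-point iteration would actually diverge (the demand/supply map is locally expanding), yet MSA converges, and with noisy supply it still lands near the deterministic equilibrium because the random effects are progressively averaged out.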
Geostatistical modeling of groundwater properties and assessment of their uncertainties
International Nuclear Information System (INIS)
Honda, Makoto; Yamamoto, Shinya; Sakurai, Hideyuki; Suzuki, Makoto; Sanada, Hiroyuki; Matsui, Hiroya; Sugita, Yutaka
2010-01-01
The distribution of groundwater properties is important for understanding deep underground hydrogeological environments. This paper proposes a geostatistical system for modeling groundwater properties that correlate with ground resistivity data obtained from widespread and exhaustive surveys. That is, a methodology for the integration of resistivity data measured by various methods, and a methodology for modeling the groundwater properties using the integrated resistivity data, have been developed. The proposed system has been validated using data obtained in the Horonobe Underground Research Laboratory project. Additionally, quantification of the uncertainties in the estimated model was attempted by numerical simulations based on the data. As a result, the uncertainty of the proposed model was estimated to be lower than that of other traditional models. (author)
Conceptual Model of the Globalization for Domain-Specific Languages
Clark, Tony; Van Den Brand, Mark; Combemale, Benoit; Rumpe, Bernhard
2015-01-01
International audience; Domain-Specific Languages (DSLs) have received some prominence recently. Designing a DSL and all of its tools is still cumbersome and a great deal of work. The engineering of DSLs is still in its infancy; not even the terms have been coined and agreed on. In particular, globalization and all its consequences need to be precisely defined and discussed. This chapter provides a definition of the relevant terms and relates them so that a conceptual model emerges. The authors think that th...
Advanced practice nursing and conceptual models of nursing.
Fawcett, Jacqueline; Newman, Diana M L; McAllister, Margaret
2004-04-01
This column focuses on advanced practice nursing. A definition and central competency of advanced practice are given and four roles assumed by advanced practice nurses are identified. Questions related primarily to the advanced practice role of nurse practitioner are raised. Two nurse scholars who teach and practice discuss their experiences as advanced practice nurses, with an emphasis on the importance of using a conceptual model of nursing as a guide for their practice.
Fatigue in fibromyalgia: a conceptual model informed by patient interviews
DEFF Research Database (Denmark)
Humphrey, Louise; Arbuckle, Rob; Mease, Philip
2010-01-01
Fatigue is increasingly recognized as an important symptom in fibromyalgia (FM). How fatigue is experienced by individuals in the context of FM, however, is unknown. We conducted qualitative research in order to better understand aspects of fatigue that might be unique to FM, as well as the impact ... it has on patients' lives. The data obtained informed the development of a conceptual model of fatigue in FM ...
Identification of Chemistry Learning Problems Viewed From Conceptual Change Model
Redhana, I. W; Sudria, I. B. N; Hidayat, I; Merta, L. M
2017-01-01
This study aimed at describing and explaining chemistry learning problems viewed from the conceptual change model and students' misconceptions. The study was qualitative case-study research conducted in one class of SMAN 1 Singaraja. The subjects of the study were a chemistry teacher and students. Data were obtained through classroom observation, interviews, and conception tests. The chemistry learning problems were grouped based on the aspects of necessity, intelligibility, plausibility, and f...
Conceptual Model of Climate Change Impacts at LANL
Energy Technology Data Exchange (ETDEWEB)
Dewart, Jean Marie [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)]
2016-05-17
Goal 9 of the LANL FY15 Site Sustainability Plan (LANL 2014a) addresses Climate Change Adaptation. As part of Goal 9, the plan reviews many of the individual programs the Laboratory has initiated over the past 20 years to address climate change impacts to LANL (e.g. Wildland Fire Management Plan, Forest Management Plan, etc.). However, at that time, LANL did not yet have a comprehensive approach to climate change adaptation. To fill this gap, the FY15 Work Plan for the LANL Long Term Strategy for Environmental Stewardship and Sustainability (LANL 2015) included a goal of (1) establishing a comprehensive conceptual model of climate change impacts at LANL and (2) establishing specific climate change indices to measure climate change and impacts at Los Alamos. Establishing a conceptual model of climate change impacts will demonstrate that the Laboratory is addressing climate change impacts in a comprehensive manner. This paper fulfills the requirement of goal 1. The establishment of specific indices of climate change at Los Alamos (goal 2), will improve our ability to determine climate change vulnerabilities and assess risk. Future work will include prioritizing risks, evaluating options/technologies/costs, and where appropriate, taking actions. To develop a comprehensive conceptual model of climate change impacts, we selected the framework provided in the National Oceanic and Atmospheric Administration (NOAA) Climate Resilience Toolkit (http://toolkit.climate.gov/).
Incorporating agricultural land cover in conceptual rainfall runoff models
Euser, Tanja; Hrachowitz, Markus; Winsemius, Hessel; Savenije, Hubert
2015-04-01
Incorporating spatially variable information is a frequently discussed option for increasing the performance of (semi-)distributed conceptual rainfall-runoff models. One method of doing this is to use the spatially variable information to delineate Hydrological Response Units (HRUs) within a catchment. This study tests whether incorporating an additional agricultural HRU in a conceptual hydrological model can better reflect the spatial differences in runoff generation and therefore improve the simulation of the wetting phase in autumn. The study area is the meso-scale Ourthe catchment in Belgium. A previous study in this area showed that spatial patterns in runoff generation were already better represented by incorporating a wetland and a hillslope HRU, compared to a lumped model structure. The influences considered by including an agricultural HRU are increased drainage speed due to roads and plough pans, increased infiltration-excess overland flow (drainage pipes are only present to a limited extent), and variable vegetation patterns due to sowing and harvesting. In addition, vegetation is not modelled as a static resistance to evaporation; instead, the Jarvis stress functions, which are already widely used for modelling transpiration in land-surface models, are used to increase the realism of the modelled transpiration. The results show that an agricultural conceptualisation, in addition to the wetland and hillslope conceptualisations, leads to small improvements in the modelled discharge. However, the influence is larger on the representation of spatial patterns and on the modelled contributions of different HRUs to the total discharge.
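A Jarvis-type formulation multiplies a maximum stomatal conductance by independent stress functions for light, temperature, vapour pressure deficit, and soil moisture, each bounded in [0, 1]. A minimal sketch; the functional forms, thresholds, and parameter values are purely illustrative, not those of the cited study:

```python
def jarvis_conductance(g_max, par, temp, vpd, soil_moisture):
    """Jarvis-type stomatal conductance: a maximum conductance g_max scaled
    by independent multiplicative stress functions, each clipped to [0, 1].
    All functional forms below are illustrative placeholders."""
    f_light = par / (par + 100.0)                         # saturating light response
    f_temp = max(0.0, 1.0 - ((temp - 25.0) / 15.0) ** 2)  # optimum near 25 deg C
    f_vpd = max(0.0, 1.0 - 0.05 * vpd)                    # linear decline with VPD
    f_soil = min(1.0, max(0.0, (soil_moisture - 0.1) / 0.2))  # wilting pt -> field cap.
    return g_max * f_light * f_temp * f_vpd * f_soil
```

Because the stresses multiply, any single limiting factor (for example soil moisture below the wilting point) drives the conductance, and hence the modelled transpiration, to zero regardless of the other conditions.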
Robust nonlinear control of nuclear reactors under model uncertainty
International Nuclear Information System (INIS)
Park, Moon Ghu
1993-02-01
A nonlinear model-based control method is developed for the robust control of a nuclear reactor. The nonlinear plant model is used to design a unique control law covering a wide operating range. Robustness is a crucial factor for fully automatic control of reactor power due to time-varying, uncertain parameters and state estimation error, or unmodeled dynamics. A variable structure control (VSC) method is introduced which consists of an adaptive performance specification (fine control) after the tracking error reaches the narrow boundary layer by a time-optimal control (coarse control). Variable structure control is a powerful method for nonlinear system controller design with inherent robustness to parameter variations and external disturbances using known uncertainty bounds, and it requires very little computational effort. In spite of its desirable properties, conventional VSC presents several important drawbacks that limit its practical applicability. One of the most undesirable phenomena is chattering, which implies extremely high control activity and may excite high-frequency unmodeled dynamics. This problem is due to neglected actuator time-delay or sampling effects. The problem has been partially remedied by replacing the chattering control with a smooth control interpolation in a boundary layer neighboring a time-varying sliding surface. But for nuclear reactor systems, which have a very fast dynamic response, the sampling effect may destroy the narrow boundary layer when a large uncertainty bound is used. Due to the very short neutron lifetime, a large uncertainty bound leads to high gain in feedback control. To resolve this problem, a derivative feedback is introduced that gives excellent performance by reducing the uncertainty bound. The stability of the tracking error dynamics is guaranteed by the second method of Lyapunov, using two-level uncertainty bounds that are obtained from knowledge of the uncertainty bound and the estimated
Bayesian uncertainty analysis with applications to turbulence modeling
International Nuclear Information System (INIS)
Cheung, Sai Hung; Oliver, Todd A.; Prudencio, Ernesto E.; Prudhomme, Serge; Moser, Robert D.
2011-01-01
In this paper, we apply Bayesian uncertainty quantification techniques to the processes of calibrating complex mathematical models and predicting quantities of interest (QoI's) with such models. These techniques also enable the systematic comparison of competing model classes. The processes of calibration and comparison constitute the building blocks of a larger validation process, the goal of which is to accept or reject a given mathematical model for the prediction of a particular QoI for a particular scenario. In this work, we take the first step in this process by applying the methodology to the analysis of the Spalart-Allmaras turbulence model in the context of incompressible, boundary layer flows. Three competing model classes based on the Spalart-Allmaras model are formulated, calibrated against experimental data, and used to issue a prediction with quantified uncertainty. The model classes are compared in terms of their posterior probabilities and their prediction of QoI's. The model posterior probability represents the relative plausibility of a model class given the data. Thus, it incorporates the model's ability to fit experimental observations. Alternatively, comparing models using the predicted QoI connects the process to the needs of decision makers that use the results of the model. We show that by using both the model plausibility and predicted QoI, one has the opportunity to reject some model classes after calibration, before subjecting the remaining classes to additional validation challenges.
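The comparison of model classes by posterior probability can be illustrated with a toy one-parameter example. Everything below is invented for illustration (the data, the noise level, and the two uniform priors standing in for "model classes"), and the marginal likelihoods are computed by brute-force quadrature rather than the machinery used in the paper:

```python
import math

# Hypothetical data: noisy observations of a constant signal near 2.0.
data = [1.9, 2.1, 2.0, 1.8, 2.2]
sigma = 0.2  # assumed known observation noise

def log_likelihood(theta):
    return sum(-0.5 * ((y - theta) / sigma) ** 2
               - math.log(sigma * math.sqrt(2 * math.pi)) for y in data)

def log_evidence(prior_lo, prior_hi, n=2000):
    """Marginal likelihood under a uniform prior, by midpoint quadrature."""
    width = prior_hi - prior_lo
    step = width / n
    total = sum(math.exp(log_likelihood(prior_lo + (i + 0.5) * step)) / width
                for i in range(n))
    return math.log(total * step)

# Two competing "model classes": a tight prior near the data vs a vague one.
log_ev = {"tight": log_evidence(1.5, 2.5), "vague": log_evidence(-10.0, 10.0)}
norm = sum(math.exp(v) for v in log_ev.values())
posterior = {m: math.exp(v) / norm for m, v in log_ev.items()}  # equal model priors
# The evidence automatically penalizes the needlessly vague prior, so the
# tight model class receives most of the posterior model probability.
```

This is the same mechanism the abstract describes: the posterior model probability measures the relative plausibility of each class given the data, folding in both fit and parsimony.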
A simplified model of choice behavior under uncertainty
Directory of Open Access Journals (Sweden)
Ching-Hung Lin
2016-08-01
Full Text Available The Iowa Gambling Task (IGT) has been standardized as a clinical assessment tool (Bechara, 2007). Nonetheless, numerous research groups have attempted to modify IGT models to optimize parameters for predicting the choice behavior of normal controls and patients. A decade ago, most researchers considered the expected utility (EU) model (Busemeyer and Stout, 2002) to be the optimal model for predicting choice behavior under uncertainty. However, in recent years, studies have demonstrated the prospect utility (PU) models (Ahn et al., 2008) to be more effective than the EU models in the IGT. Nevertheless, after some preliminary tests, we propose that the Ahn et al. (2008) PU model is not optimal due to some incompatible results between our behavioral and modeling data. This study aims to modify the Ahn et al. (2008) PU model into a simplified model, and collected 145 subjects' IGT performance as benchmark data for comparison. In our simplified PU model, the best goodness-of-fit was found mostly as α approached zero. More specifically, we retested the key parameters α, λ, and A in the PU model. Notably, the power of influence of the parameters α, λ, and A has a hierarchical order in terms of manipulating the goodness-of-fit in the PU model. Additionally, we found that the parameters λ and A may be ineffective when the parameter α is close to zero in the PU model. The present simplified model demonstrated that decision makers mostly adopted a gain-stay/loss-shift strategy rather than foreseeing the long-term outcome. However, there are still other behavioral variables that are not well revealed under these dynamic uncertainty situations. Therefore, the optimal behavioral models may not yet have been found. In short, the best model for predicting choice behavior under dynamic-uncertainty situations remains to be evaluated.
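The prospect-utility value function at the centre of such models, and the collapse of outcome magnitudes as α approaches zero, can be sketched as follows. The parameter values are illustrative defaults, not those estimated in the study:

```python
def prospect_utility(x, alpha=0.5, lam=2.25):
    """Prospect-theory value function in the style used by PU models of the
    IGT: gains are raised to the power alpha; losses are additionally scaled
    by the loss-aversion coefficient lam."""
    if x >= 0:
        return x ** alpha
    return -lam * (abs(x) ** alpha)

# As alpha approaches zero, magnitudes stop mattering: every gain maps to
# roughly 1 and every loss to roughly -lam, so choice behavior reduces to
# a gain-stay/loss-shift rule rather than long-term value maximization.
small = prospect_utility(5, alpha=0.01)
large = prospect_utility(500, alpha=0.01)
```

With α near zero the utilities of a small and a hundred-fold larger gain become nearly indistinguishable, which is the mechanism behind the abstract's observation that λ and A lose their influence in that regime.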
Conceptual Model of Quantities, Units, Dimensions, and Values
Rouquette, Nicolas F.; DeKoenig, Hans-Peter; Burkhart, Roger; Espinoza, Huascar
2011-01-01
JPL collaborated with experts from industry and other organizations to develop a conceptual model of quantities, units, dimensions, and values based on the current work of the ISO 80000 committee revising the International System of Units & Quantities based on the International Vocabulary of Metrology (VIM). By providing support for ISO 80000 in SysML via the International Vocabulary of Metrology (VIM), this conceptual model provides, for the first time, a standard-based approach for addressing issues of unit coherence and dimensional analysis into the practice of systems engineering with SysML-based tools. This conceptual model provides support for two kinds of analyses specified in the International Vocabulary of Metrology (VIM): coherence of units as well as of systems of units, and dimension analysis of systems of quantities. To provide a solid and stable foundation, the model for defining quantities, units, dimensions, and values in SysML is explicitly based on the concepts defined in VIM. At the same time, the model library is designed in such a way that extensions to the ISQ (International System of Quantities) and SI Units (Système International d'Unités) can be represented, as well as any alternative systems of quantities and units. The model library can be used to support SysML user models in various ways. A simple approach is to define and document libraries of reusable systems of units and quantities for reuse across multiple projects, and to link units and quantity kinds from these libraries to Unit and QuantityKind stereotypes defined in SysML user models.
On conceptual differentiation and integration of strategy and business model
Directory of Open Access Journals (Sweden)
Ivan Stefanovic
2012-06-01
Full Text Available The objective of this paper is to develop the conceptual integration of strategy and business model. A theoretical method is used to achieve this objective. The theory building leads to the construction of a conceptual model of strategy and business model, and provides its underlying logic. The main finding is that strategy is a pattern within which a business model changes. Only one strategy may exist for a firm in a concrete time frame, while there may be countless business models in the same period. Therefore, strategy represents the sum of all business models and their changes within a specified period. Each business model matches a set of functional strategies and their interdependencies, constituting the strategic content at a particular moment; i.e., each business model is actually a cross-section of the business strategy, or of a set of functional strategies, at one concrete moment. This specific contribution can be understood only if one takes an appropriate viewpoint of the process of strategy formation, namely the reactive perspective.
Uncertainty modelling and code calibration for composite materials
DEFF Research Database (Denmark)
Toft, Henrik Stensgaard; Branner, Kim; Mishnaevsky, Leon, Jr
2013-01-01
Uncertainties related to the material properties of a composite material can be determined from the micro-, meso- or macro-scale. These three starting points for a stochastic modelling of the material properties are investigated. The uncertainties are divided into physical, model, statistical ... between risk of failure and cost of the structure. Considerations related to the calibration of partial safety factors for composite materials are described, including the probability of failure, the format of the partial safety factor method, and weight factors for different load cases. In a numerical example ..., it is demonstrated how probabilistic models for the material properties formulated on the micro-scale can be calibrated using tests on the meso- and macro-scales. The results are compared to probabilistic models estimated directly from tests on the macro-scale. In another example, partial safety factors for application ...
Tyler Jon Smith; Lucy Amanda. Marshall
2010-01-01
Model selection is an extremely important aspect of many hydrologic modeling studies because of the complexity, variability, and uncertainty that surrounds the current understanding of watershed-scale systems. However, development and implementation of a complete precipitation-runoff modeling framework, from model selection to calibration and uncertainty analysis, are...
Gray, Kathleen; Sockolow, Paulina
2016-02-24
Contributing to health informatics research means using conceptual models that are integrative and explain the research in terms of the two broad domains of health science and information science. However, it can be hard for novice health informatics researchers to find exemplars and guidelines in working with integrative conceptual models. The aim of this paper is to support the use of integrative conceptual models in research on information and communication technologies in the health sector, and to encourage discussion of these conceptual models in scholarly forums. A two-part method was used to summarize and structure ideas about how to work effectively with conceptual models in health informatics research that included (1) a selective review and summary of the literature of conceptual models; and (2) the construction of a step-by-step approach to developing a conceptual model. The seven-step methodology for developing conceptual models in health informatics research explained in this paper involves (1) acknowledging the limitations of health science and information science conceptual models; (2) giving a rationale for one's choice of integrative conceptual model; (3) explicating a conceptual model verbally and graphically; (4) seeking feedback about the conceptual model from stakeholders in both the health science and information science domains; (5) aligning a conceptual model with an appropriate research plan; (6) adapting a conceptual model in response to new knowledge over time; and (7) disseminating conceptual models in scholarly and scientific forums. Making explicit the conceptual model that underpins a health informatics research project can contribute to increasing the number of well-formed and strongly grounded health informatics research projects. This explication has distinct benefits for researchers in training, research teams, and researchers and practitioners in information, health, and other disciplines.
Approximating prediction uncertainty for random forest regression models
John W. Coulston; Christine E. Blinn; Valerie A. Thomas; Randolph H. Wynne
2016-01-01
Machine learning approaches such as random forest have increasingly been used for the spatial modeling and mapping of continuous variables. Random forest is a non-parametric ensemble approach, and unlike traditional regression approaches there is no direct quantification of prediction error. Understanding prediction uncertainty is important when using model-based continuous maps as...
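One common approximation treats the spread of the individual trees' predictions as a proxy for prediction uncertainty. A self-contained toy version of this idea, using bootstrap-aggregated regression stumps in place of full random forest trees; the data, seed, and ensemble size are all invented:

```python
import random
import statistics

def fit_stump(xs, ys):
    """Fit a depth-1 regression tree (stump): pick the threshold split on x
    that minimizes the within-leaf sum of squared errors."""
    best = None
    pairs = sorted(zip(xs, ys))
    for i in range(1, len(pairs)):
        t = (pairs[i - 1][0] + pairs[i][0]) / 2
        left = [y for x, y in pairs if x <= t]
        right = [y for x, y in pairs if x > t]
        if not left or not right:
            continue  # degenerate split (duplicate x values at the boundary)
        ml, mr = statistics.mean(left), statistics.mean(right)
        sse = sum((y - ml) ** 2 for x, y in pairs if x <= t) + \
              sum((y - mr) ** 2 for x, y in pairs if x > t)
        if best is None or sse < best[0]:
            best = (sse, t, ml, mr)
    _, t, ml, mr = best
    return lambda x: ml if x <= t else mr

def forest_predict(trees, x):
    """Ensemble mean plus an uncertainty proxy: per-tree prediction spread."""
    preds = [tree(x) for tree in trees]
    return statistics.mean(preds), statistics.stdev(preds)

random.seed(1)
xs = [i / 10 for i in range(100)]
ys = [x * 2 + random.gauss(0, 0.3) for x in xs]  # noisy linear toy data
trees = []
for _ in range(50):  # bootstrap-aggregated stumps (a toy "random forest")
    idx = [random.randrange(len(xs)) for _ in range(len(xs))]
    trees.append(fit_stump([xs[i] for i in idx], [ys[i] for i in idx]))
mean, spread = forest_predict(trees, 5.0)
```

The per-tree standard deviation is only one of several proposed proxies (quantile regression forests and infinitesimal jackknife estimators are alternatives), but it conveys the core point: the ensemble's internal disagreement carries uncertainty information that a single point prediction hides.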
Uncertainties in soil-plant interactions in advanced models for long-timescale dose assessment
Energy Technology Data Exchange (ETDEWEB)
Klos, R. [Aleksandria Sciences Ltd. (United Kingdom)]; Limer, L. [Limer Scientific Ltd. (United Kingdom)]; Perez-Sanchez, D. [Centro de Investigaciones Energeticas, Medioambientales y Tecnologicas - CIEMAT (Spain)]; Xu, S.; Andersson, P. [Swedish Radiation Safety Authority (Sweden)]
2014-07-01
Traditional models for long-timescale dose assessment are generally conceptually straightforward, featuring one, two or three spatial compartments in the soil column and employing data based on annually averaged parameters for climate characteristics. The soil-plant system is usually modelled using concentration ratios. The justification for this approach is that the timescales relevant to the geologic disposal of radioactive waste are so long that simple conceptual models are necessary to account for the inherent uncertainties over the timescale of the dose assessment. In the past few years, attention has been given to more detailed 'advanced' models for use in dose assessment that have a high degree of site-specific detail. These recognise more features, events and processes, since they have higher spatial and temporal resolution. This modelling approach has been developed to account for redox-sensitive radionuclides, variability of the water table position, and accumulation in non-agricultural ecosystems prior to conversion to an agricultural ecosystem. The models feature higher spatial and temporal resolution in the soil column (up to ten layers, with spatially varying k{sub d}s dependent on soil conditions) and monthly rather than annually averaged parameters. Soil-plant interaction is treated as a dynamic process, allowing for root uptake as a function of time and depth, according to the root profile. Uncertainty in dose assessment models associated with the treatment of prior accumulations in agricultural soils has demonstrated the importance of the model's representation of the soil-plant interaction. Treating root uptake as a dynamic process, as opposed to using a simple concentration ratio, implies a potentially important difference, despite the dynamic soil-plant transfer rate being based on established concentration ratio values. These discrepancies have also appeared in the results from the higher spatio-temporal resolution models. This paper
National Identity: Conceptual models, discourses and political change
DEFF Research Database (Denmark)
Harder, Peter
2014-01-01
Cognitive Linguistics has demonstrated the applicability of a conceptual approach to the understanding of political issues, cf. Lakoff (2008) and many others. From a different perspective, critical discourse analysis has approached political concepts with a focus on issues involving potentially ... in Harder (2010), however, both the analytic stance of critical discourse analysis (based on the hermeneutics of suspicion) and the cognitivist stance of Lakoff (2008) are too narrow: the understanding of political language requires a wider framework of social cognitive linguistics. Essential features ... of conceptual models or discourses. This is especially important in cases that involve conflictive political issues such as national and ethnic identity. The article reports on a historical project with a linguistic dimension in my department (PI Stuart Ward, cf. Ward 2004), which aims to throw light
Verification and Uncertainty Reduction of Amchitka Underground Nuclear Testing Models
Energy Technology Data Exchange (ETDEWEB)
Ahmed Hassan; Jenny Chapman
2006-02-01
The 2002 modeling of the Amchitka underground nuclear tests has been verified, and uncertainty in model input parameters, as well as in predictions, has been reduced using newly collected data obtained by the summer 2004 field expedition of CRESP. The newly collected data pertaining to the groundwater model include magnetotelluric (MT) surveys conducted on the island to determine the subsurface salinity and porosity structure, and bathymetric surveys producing bathymetric maps of the areas offshore from the Long Shot and Cannikin sites. Analysis and interpretation of the MT data yielded information on the location of the transition zone, and porosity profiles showing porosity values decaying with depth. These new data sets are used to verify the original model in terms of model parameters, model structure, and model output. In addition, by using the new data along with the existing data (chemistry and head data), the uncertainty in model input and output is decreased by conditioning on all the available data. A Markov Chain Monte Carlo (MCMC) approach is adopted for developing new input parameter distributions conditioned on prior knowledge and new data. The MCMC approach is a form of Bayesian conditioning that is constructed in such a way that it produces samples of the model parameters that eventually converge to a stationary posterior distribution. The Bayesian MCMC approach enhances probabilistic assessment. Instead of simply propagating uncertainty forward from input parameters into model predictions (i.e., the traditional Monte Carlo approach), MCMC propagates uncertainty backward from data onto parameters, and then forward from parameters into predictions. Comparisons between the new data and the original model, and conditioning on all available data using the MCMC method, yield the following results and conclusions: (1) Model structure is verified at Long Shot and Cannikin, where the high-resolution bathymetric data collected by CRESP
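The backward-then-forward propagation described here rests on posterior sampling. A minimal random-walk Metropolis sketch on a one-parameter toy problem; the data, prior, noise level, and tuning constants are invented and have no connection to the Amchitka model:

```python
import math
import random

def metropolis(log_post, x0, n, step, seed=0):
    """Random-walk Metropolis: produces a chain of samples whose empirical
    distribution converges to the posterior defined (up to a normalizing
    constant) by log_post."""
    random.seed(seed)
    x, lp = x0, log_post(x0)
    samples = []
    for _ in range(n):
        prop = x + random.gauss(0, step)      # symmetric proposal
        lp_prop = log_post(prop)
        if math.log(random.random()) < lp_prop - lp:  # accept/reject
            x, lp = prop, lp_prop
        samples.append(x)
    return samples

# Hypothetical setup: vague Gaussian prior N(0, 5^2) conditioned on four
# observations clustered near 3, with known observation noise 0.5.
data = [2.8, 3.1, 3.2, 2.9]
def log_post(theta):
    prior = -0.5 * (theta / 5.0) ** 2
    like = sum(-0.5 * ((y - theta) / 0.5) ** 2 for y in data)
    return prior + like

samples = metropolis(log_post, x0=0.0, n=20000, step=0.5)
post_mean = sum(samples[5000:]) / len(samples[5000:])  # discard burn-in
```

After conditioning, the posterior samples concentrate near the data mean rather than the prior mean, which is exactly the "backward" step: predictions drawn by running the model over these samples then carry the reduced, data-informed uncertainty forward.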
Conceptual Modeling in the Time of the Revolution: Part II
Mylopoulos, John
Conceptual Modeling was a marginal research topic at the very fringes of Computer Science in the 60s and 70s, when the discipline was dominated by topics focusing on programs, systems and hardware architectures. Over the years, however, the field has moved to centre stage and has come to claim a central role both in Computer Science research and practice in diverse areas, such as Software Engineering, Databases, Information Systems, the Semantic Web, Business Process Management, Service-Oriented Computing, Multi-Agent Systems, Knowledge Management, and more. The transformation was greatly aided by the adoption of standards in modeling languages (e.g., UML), and model-based methodologies (e.g., Model-Driven Architectures) by the Object Management Group (OMG) and other standards organizations. We briefly review the history of the field over the past 40 years, focusing on the evolution of key ideas. We then note some open challenges and report on-going research, covering topics such as the representation of variability in conceptual models, capturing model intentions, and models of laws.
Chu, W.; Gao, X.; Sorooshian, S.
2009-12-01
With the advancement of modern computer technology, many heuristic global optimization algorithms have been developed and applied to various fields of science and engineering in the last two decades. In surface hydrology, parameter optimization is a bridge connecting model simulation and real observation. Due to the lack of detailed physical understanding or descriptions of the hydrological process, most rainfall-runoff models are built with conceptual components. Therefore, the model parameters mostly include unknown correlations and uncertainties and have to be calibrated against observations to make the model function properly. As a successful attempt to automatically calibrate conceptual rainfall-runoff models, the shuffled complex evolution (SCE-UA) method was developed and has exhibited its efficacy and efficiency. However, our recent study reveals that the SCE-UA method overlooks a fundamental assumption of direct-search theory and hence loses its power when optimizing complex, high-dimensional problems. By integrating new heuristic search techniques and overcoming the above-mentioned shortcoming, a new method has been developed. This method is applied to calibrate the Sacramento Soil Moisture Accounting (SAC-SMA) model and to study the parameter uncertainties. Results show that the method outperforms SCE-UA in the following respects: 1) it retrieves better parameter values, which further reduce the model's root mean square error; 2) it is more robust; 3) the ensemble of optimized parameters obtained using this method better delineates the uncertainty of the model parameters, which is critical to understanding model behavior.
A Conceptual Model of eLearning Adoption
Directory of Open Access Journals (Sweden)
Muneer Abbad
2011-05-01
Full Text Available Internet-based learning systems are being used in many universities and firms, but their adoption requires a solid understanding of the user acceptance processes. The technology acceptance model (TAM) has been used to test the acceptance of various technologies and software within an e-learning context. This research aims to discuss the main factors of successful e-learning adoption by students. A conceptual research framework of e-learning adoption is proposed based on the TAM.
A Conceptual Model for Multidimensional Analysis of Documents
Ravat, Franck; Teste, Olivier; Tournier, Ronan; Zurfluh, Gilles
Data warehousing and OLAP are mainly used for the analysis of transactional data. Nowadays, with the evolution of the Internet and the development of semi-structured data exchange formats (such as XML), it is possible to consider entire fragments of data, such as documents, as analysis sources. As a consequence, an adapted multidimensional analysis framework needs to be provided. In this paper, we introduce an OLAP multidimensional conceptual model without facts. This model is based on the unique concept of dimensions and is adapted for multidimensional document analysis. We also provide a set of manipulation operations.
Approaches to handling uncertainty when setting environmental exposure standards
DEFF Research Database (Denmark)
Budtz-Jørgensen, Esben; Keiding, Niels; Grandjean, Philippe
2009-01-01
attempts for the first time to cover the full range of issues related to model uncertainties, from the subjectivity of setting up a conceptual model of a given system, all the way to communicating the nature of model uncertainties to non-scientists and accounting for model uncertainties in policy decisions...
A Conceptual Culture Model for Design Science Research
Directory of Open Access Journals (Sweden)
Thomas Richter
2016-03-01
The aim of design science research (DSR) in information systems is the user-centred creation of IT-artifacts with regard to specific social environments. For culture research in the field, which is necessary for a proper localization of IT-artifacts, models and research approaches from the social sciences are usually adopted. Descriptive, dimension-based culture models are most commonly applied for this purpose; they assume culture to be a national phenomenon and tend to reduce it to basic values. Such models are useful for investigations in behavioural culture research because that field aims to isolate, describe, and explain culture-specific attitudes and characteristics within a selected society. In contrast, given the necessity to deduce concrete decisions for artifact design, research results from DSR need to go beyond this aim. As a hypothesis, this contribution questions the general applicability of such generic culture dimension models for DSR and focuses on their theoretical foundation, which goes back to Hofstede’s conceptual Onion Model of Culture. The literature-based analysis applied here confirms the hypothesis. Consequently, an alternative conceptual culture model is introduced and discussed as a theoretical foundation for culture research in DSR.
Conceptual models of microseismicity induced by fluid injection
Baro Urbea, J.; Lord-May, C.; Eaton, D. W. S.; Davidsen, J.
2017-12-01
Variations in pore pressure due to fluid invasion are responsible for microseismic activity recorded in geothermal systems and during hydraulic fracturing operations. To capture this phenomenon at a conceptual level, invasion percolation models have been suggested to represent the flow network of fluids within a porous medium, and seismic activity is typically considered to be directly related to the expansion of the percolated area. Although such models reproduce scale-free frequency-magnitude distributions, the associated b-values of the Gutenberg-Richter relation do not align with observed data. Here, we propose an alternative conceptual invasion percolation model that decouples the fluid propagation from the microseismic events. Instead of being uniform, the pressure is modeled to decay with distance from the injection site. Wet fracture events are simulated with a stochastic spring-block model exhibiting stick-slip dynamics as a result of the variations in pore pressure. We show that the statistics of the stick-slip events are scale-free, but now the b-values depend on the level of heterogeneity in the local static friction coefficients. Thus, this model is able to reproduce the wide spectrum of b-values observed in field catalogs associated with fluid-induced microseismicity. Moreover, the spatial distribution of microseismic events is also consistent with observations.
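Since the Gutenberg-Richter b-value is the central statistic in this abstract, a minimal sketch of how it is estimated from an event catalog may be helpful. The synthetic magnitudes and Aki's maximum-likelihood estimator below are standard, but the parameter values are arbitrary:

```python
import numpy as np

# Aki's maximum-likelihood b-value estimator applied to synthetic magnitudes,
# the kind of catalog statistic such conceptual models are tuned to reproduce.
rng = np.random.default_rng(6)
b_true, m_c = 1.2, 0.0     # assumed b-value and completeness magnitude

# Under Gutenberg-Richter, magnitudes above m_c are exponentially
# distributed with rate b * ln(10).
mags = m_c + rng.exponential(1.0 / (b_true * np.log(10)), size=20000)

# Aki (1965): b = log10(e) / (mean magnitude - completeness magnitude)
b_est = np.log10(np.e) / (mags.mean() - m_c)
print(b_est)
```

A model catalog whose estimated b-value tracks the heterogeneity of the friction coefficients, as the authors report, would be checked against field catalogs in exactly this way.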
Effects of input uncertainty on cross-scale crop modeling
Waha, Katharina; Huth, Neil; Carberry, Peter
2014-05-01
The quality of data on climate, soils, and agricultural management in the tropics is in general low, or data are scarce, leading to uncertainty in process-based modeling of cropping systems. Process-based crop models are common tools for simulating crop yields and crop production in climate change impact studies, studies on mitigation and adaptation options, or food security studies. Crop modelers are concerned about input data accuracy as this, together with an adequate representation of plant physiology processes and the choice of model parameters, is a key factor for a reliable simulation. For example, assuming an error in measurements of air temperature, radiation and precipitation of ± 0.2°C, ± 2 % and ± 3 % respectively, Fodor & Kovacs (2005) estimate that this translates into an uncertainty of 5-7 % in yield and biomass simulations. In our study we seek to answer the following questions: (1) are there important uncertainties in the spatial variability of simulated crop yields on the grid-cell level displayed on maps, (2) are there important uncertainties in the temporal variability of simulated crop yields on the aggregated, national level displayed in time series, and (3) how does the accuracy of different soil, climate and management information influence the simulated crop yields in two crop models designed for use at different spatial scales? The study will help to determine whether more detailed information improves the simulations and to advise model users on the uncertainty related to input data. We analyse the performance of the point-scale crop model APSIM (Keating et al., 2003) and the global-scale crop model LPJmL (Bondeau et al., 2007) with different climate information (monthly and daily) and soil conditions (global soil map and African soil map) under different agricultural management (uniform and variable sowing dates) for the low-input maize-growing areas in Burkina Faso/West Africa. We test the models' response to different levels of input
A Conceptual Model for Engagement of the Online Learner
Directory of Open Access Journals (Sweden)
Lorraine M. Angelino
2009-01-01
Engagement of the online learner is one approach to reduce attrition rates. Attrition rates for classes taught through distance education are 10-20% higher than for classes taught in a face-to-face setting. This paper introduces a Model for Engagement and provides strategies to engage the online learner. The Model depicts various opportunities where student-instructor, student-student, student-content, and student-community engagement can occur. The Model is divided into four strategic areas: (a) recruitment, (b) coursework, (c) post coursework, and (d) alumni. The theoretical framework for the model is Tinto's student integration model. The conceptual design of the model is based on engagement practices from an online Health Care Management (HCMT) certificate program at a university in South Carolina.
Regional knowledge economy development indicative planning system conceptual model
Directory of Open Access Journals (Sweden)
Elena Davidovna Vaisman
2012-12-01
The subject of the research is the processes of knowledge economy development in Russia; its progress at the regional level is taken as the theme, which determined the purpose of the research: development of a conceptual model of an indicative planning method for regional knowledge economy development. The methodological base of the research is the knowledge economy concept and the theory of supply and demand, the methods of comparative and system analysis, and theoretical modeling; common generalization and classification methods and regression models are also used in the work. As a result, we constructed a conceptual model of the indicative planning method for regional knowledge economy development, which includes the choice of the types of indicative plans and the justification of a complex of indicators according to the stated requirements for this complex. A model of the dependence of knowledge supply and demand on the cost of knowledge, which allows the acceptable range for the indicators to be determined from the supply and demand levels and their interrelation, is developed. The obtained results may be used by regional government authorities when planning regional innovative development, and by consulting companies when making proposals for this development.
Hydrologic Scenario Uncertainty in a Comprehensive Assessment of Hydrogeologic Uncertainty
Nicholson, T. J.; Meyer, P. D.; Ye, M.; Neuman, S. P.
2005-12-01
A method to jointly assess hydrogeologic conceptual model and parameter uncertainties has recently been developed based on a Maximum Likelihood implementation of Bayesian Model Averaging (MLBMA). Evidence from groundwater model post-audits suggests that errors in the projected future hydrologic conditions of a site (hydrologic scenarios) are a significant source of model predictive errors. MLBMA can be extended to include hydrologic scenario uncertainty, along with conceptual model and parameter uncertainties, in a systematic and quantitative assessment of predictive uncertainty. Like conceptual model uncertainty, scenario uncertainty is represented by a discrete set of alternative scenarios. The effect of scenario uncertainty on model predictions is quantitatively assessed by conducting an MLBMA analysis under each scenario. We demonstrate that posterior model probability is a function of the scenario only through the possible dependence of prior model probabilities on the scenario. As a result, the model likelihoods (computed from calibration results) are not a function of the scenario and do not need to be recomputed under each scenario. MLBMA results for each scenario are weighted by the scenario probability and combined to render a joint assessment of scenario, conceptual model, and parameter uncertainty. Like model probability, scenario probability represents a subjective evaluation, in this case of the plausibility of the occurrence of the specific scenario. Because the scenarios describe future conditions, the scenario probabilities represent prior estimates and cannot be updated using the (past) system state data as is done to compute posterior model probabilities. Assessment of hydrologic scenario uncertainty is illustrated using a site-specific application considering future changes in land use, dam operations, and climate. Estimation of scenario probabilities and consideration of scenario characteristics (e.g., timing, magnitude) are discussed.
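The weighting scheme described above, MLBMA results per scenario combined by scenario probability, reduces to a moment calculation over a discrete model/scenario set. All probabilities and predictive moments below are hypothetical placeholders, not values from the study:

```python
import numpy as np

# Hypothetical setup: 2 scenarios x 3 conceptual models.
scenario_prob = np.array([0.7, 0.3])              # prior scenario probabilities
model_prob = np.array([[0.5, 0.3, 0.2],           # posterior model probs, scenario 1
                       [0.4, 0.4, 0.2]])          # posterior model probs, scenario 2
pred_mean = np.array([[10.0, 12.0,  9.0],
                      [14.0, 15.0, 13.0]])        # per-model predictive means
pred_var = np.array([[1.0, 2.0, 1.5],
                     [1.2, 2.5, 1.0]])            # within-model predictive variances

# Joint BMA mean: average over models within each scenario, then over scenarios.
mean_per_scenario = (model_prob * pred_mean).sum(axis=1)
joint_mean = (scenario_prob * mean_per_scenario).sum()

# Total variance via the second moment: expected within-model variance plus
# between-model and between-scenario spread.
second_moment = (scenario_prob[:, None] * model_prob *
                 (pred_var + pred_mean ** 2)).sum()
joint_var = second_moment - joint_mean ** 2
print(joint_mean, joint_var)
```

The between-scenario term dominates here, mirroring the point that scenario error can be a major contributor to predictive uncertainty.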
Model Uncertainties for Valencia RPA Effect for MINERvA
Energy Technology Data Exchange (ETDEWEB)
Gran, Richard [Univ. of Minnesota, Duluth, MN (United States)
2017-05-08
This technical note describes the application of the Valencia RPA multi-nucleon effect and its uncertainty to QE reactions from the GENIE neutrino event generator. The analysis of MINERvA neutrino data in the Rodrigues et al. PRL 116, 071802 (2016) paper makes clear the need for an RPA suppression, especially at very low momentum and energy transfer. That published analysis does not constrain the magnitude of the effect; it only tests models with and without the effect against the data. Other MINERvA analyses need an expression of the model uncertainty in the RPA effect. A well-described uncertainty can be used for systematics for unfolding, for model errors in the analysis of non-QE samples, and as input for fitting exercises for model testing or constraining backgrounds. This prescription takes uncertainties on the parameters in the Valencia RPA model and adds a (not-as-tight) constraint from muon capture data. For MINERvA we apply it as a 2D ($q_0$,$q_3$) weight to GENIE events, in lieu of generating a full beyond-Fermi-gas quasielastic event sample. Because it is a weight, it can be applied to the generated and fully Geant4-simulated events used in analysis without a special GENIE sample. For some limited uses, it could be cast as a 1D $Q^2$ weight without much trouble. This procedure is a suitable starting point for NOvA and DUNE, where the energy dependence is modest, but probably not adequate for T2K or MicroBooNE.
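Applying the effect as a 2D ($q_0$,$q_3$) weight to generated events amounts to a per-event table lookup. The weight table below is schematic (mild suppression at low momentum/energy transfer), not the actual Valencia RPA calculation:

```python
import numpy as np

# Illustrative 2D weight table on a (q0, q3) grid; a real analysis would fill
# this from the Valencia RPA calculation and its uncertainty band.
q0_edges = np.linspace(0.0, 1.0, 6)    # GeV, 5 bins
q3_edges = np.linspace(0.0, 1.2, 7)    # GeV, 6 bins
weights = np.ones((5, 6))
weights[0, :2] = 0.6                   # schematic suppression at low (q0, q3)

def rpa_weight(q0, q3):
    """Look up the per-event weight for generated (q0, q3) values."""
    i = np.clip(np.digitize(q0, q0_edges) - 1, 0, 4)
    j = np.clip(np.digitize(q3, q3_edges) - 1, 0, 5)
    return weights[i, j]

rng = np.random.default_rng(2)
events_q0 = rng.uniform(0.0, 1.0, 10000)   # stand-in generated events
events_q3 = rng.uniform(0.0, 1.2, 10000)
w = rpa_weight(events_q0, events_q3)
# Weighted histograms replace regenerating a dedicated GENIE sample.
print(w.mean())
```

Because the weight is a pure function of the generated kinematics, it can be applied after full detector simulation, which is the key practical advantage noted in the abstract.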
Sensitivity of modeled ozone concentrations to uncertainties in biogenic emissions
International Nuclear Information System (INIS)
Roselle, S.J.
1992-06-01
The study examines the sensitivity of regional ozone (O3) modeling to uncertainties in biogenic emissions estimates. The United States Environmental Protection Agency's (EPA) Regional Oxidant Model (ROM) was used to simulate the photochemistry of the northeastern United States for the period July 2-17, 1988. An operational model evaluation showed that ROM had a tendency to underpredict O3 when observed concentrations were above 70-80 ppb and to overpredict O3 when observed values were below this level. On average, the model underpredicted daily maximum O3 by 14 ppb. Spatial patterns of O3, however, were reproduced favorably by the model. Several simulations were performed to analyze the effects of uncertainties in biogenic emissions on predicted O3 and to study the effectiveness of two strategies of controlling anthropogenic emissions for reducing high O3 concentrations. Biogenic hydrocarbon emissions were adjusted by a factor of 3 to account for the existing range of uncertainty in these emissions. The impact of biogenic emission uncertainties on O3 predictions depended upon the availability of NOx. In some extremely NOx-limited areas, increasing the amount of biogenic emissions decreased O3 concentrations. Two control strategies were compared in the simulations: (1) reduced anthropogenic hydrocarbon emissions, and (2) reduced anthropogenic hydrocarbon and NOx emissions. The simulations showed that hydrocarbon emission controls were more beneficial to the New York City area, but that combined NOx and hydrocarbon controls were more beneficial to other areas of the Northeast. Hydrocarbon controls were more effective as biogenic hydrocarbon emissions were reduced, whereas combined NOx and hydrocarbon controls were more effective as biogenic hydrocarbon emissions were increased
Uncertainty propagation in a multiscale model of nanocrystalline plasticity
International Nuclear Information System (INIS)
Koslowski, M.; Strachan, Alejandro
2011-01-01
We characterize how uncertainties propagate across spatial and temporal scales in a physics-based model of nanocrystalline plasticity of fcc metals. Our model combines molecular dynamics (MD) simulations to characterize atomic-level processes that govern dislocation-based plastic deformation with a phase field approach to dislocation dynamics (PFDD) that describes how an ensemble of dislocations evolve and interact to determine the mechanical response of the material. We apply this approach to a nanocrystalline Ni specimen of interest in microelectromechanical systems (MEMS) switches. Our approach enables us to quantify how internal stresses that result from the fabrication process affect the properties of dislocations (using MD) and how these properties, in turn, affect the yield stress of the metallic membrane (using the PFDD model). Our predictions show that, for a nanocrystalline sample with small grain size (4 nm), a variation in residual stress of 20 MPa (typical in today's microfabrication techniques) would result in a variation in the critical resolved shear yield stress of approximately 15 MPa, a very small fraction of the nominal value of approximately 9 GPa. - Highlights: → Quantify how fabrication uncertainties affect yield stress in a microswitch component. → Propagate uncertainties in a multiscale model of single crystal plasticity. → Molecular dynamics quantifies how fabrication variations affect dislocations. → Dislocation dynamics relate variations in dislocation properties to yield stress.
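The headline number (a 20 MPa residual-stress variation mapping to roughly 15 MPa of yield-stress variation) is, to first order, a sensitivity calculation. The linear response function below is a made-up stand-in for the MD-informed PFDD chain, chosen only so the numbers match the abstract's scale:

```python
# First-order propagation of a fabrication-stress uncertainty to the yield
# stress, mirroring the chain MD -> dislocation properties -> yield stress.
# The sensitivity (0.75 MPa per MPa) is an invented linearization for
# illustration, not the paper's computed response.

def yield_stress(residual_stress_mpa):
    """Toy lower-scale-informed response: yield stress in MPa."""
    return 9000.0 + 0.75 * residual_stress_mpa

sigma_res = 20.0                                        # MPa, fabrication variability
h = 1.0
sens = (yield_stress(h) - yield_stress(-h)) / (2 * h)   # central finite difference
sigma_yield = abs(sens) * sigma_res                     # first-order propagation
print(sigma_yield)
```

For a nearly linear response like this one, the first-order estimate is essentially exact; the multiscale model is needed precisely to supply the sensitivity itself.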
Uncertainty Model for Total Solar Irradiance Estimation on Australian Rooftops
Al-Saadi, Hassan; Zivanovic, Rastko; Al-Sarawi, Said
2017-11-01
The installations of solar panels on Australian rooftops have been on the rise over the last few years, especially in urban areas. This motivates academic researchers, distribution network operators, and engineers to accurately address the level of uncertainty resulting from grid-connected solar panels. The main source of uncertainty is the intermittent nature of radiation; therefore, this paper presents a new model to estimate the total radiation incident on a tilted solar panel. The model is driven by the clearness index, which is characterized by a probability distribution, with special attention paid to Australian conditions through the use of a best-fit correlation for the diffuse fraction. The validity of the model is assessed with four goodness-of-fit techniques. In addition, the quasi Monte Carlo and sparse grid methods are used as sampling and uncertainty computation tools, respectively. High-resolution solar irradiance data for the city of Adelaide were used for this assessment, with the outcome indicating satisfactory agreement between the actual data variation and the model.
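The pipeline sketched in the abstract, a fitted clearness-index distribution sampled by quasi Monte Carlo and pushed through a diffuse-fraction correlation, might look as follows. The Beta parameters are invented, and the Erbs-type piecewise correlation is a common choice in the literature rather than necessarily the paper's fitted one:

```python
import numpy as np
from scipy.stats import qmc, beta

# Quasi Monte Carlo propagation of clearness-index uncertainty through a
# diffuse-fraction correlation. Beta(3, 2) is an illustrative fit, not the
# paper's distribution.
sampler = qmc.Sobol(d=1, scramble=True, seed=3)
u = sampler.random_base2(m=10).ravel()       # 1024 low-discrepancy points
kt = beta.ppf(u, a=3.0, b=2.0)               # clearness-index samples in (0, 1)

def diffuse_fraction(kt):
    """Erbs-style piecewise correlation (standard form, shown for illustration)."""
    return np.where(kt <= 0.22, 1.0 - 0.09 * kt,
           np.where(kt <= 0.80, 0.9511 - 0.1604 * kt + 4.388 * kt**2
                                 - 16.638 * kt**3 + 12.336 * kt**4,
                    0.165))

ghi = 1000.0 * kt                    # global horizontal irradiance, W/m^2 (toy scale)
dhi = diffuse_fraction(kt) * ghi     # diffuse component
print(ghi.mean(), dhi.mean())
```

A full model would then transpose the beam and diffuse components onto the tilted panel; the QMC sample makes the resulting uncertainty statistics converge faster than plain Monte Carlo.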
Our evolving conceptual model of the coastal eutrophication problem
Cloern, James E.
2001-01-01
A primary focus of coastal science during the past 3 decades has been the question: How does anthropogenic nutrient enrichment cause change in the structure or function of nearshore coastal ecosystems? This theme of environmental science is recent, so our conceptual model of the coastal eutrophication problem continues to change rapidly. In this review, I suggest that the early (Phase I) conceptual model was strongly influenced by limnologists, who began intense study of lake eutrophication by the 1960s. The Phase I model emphasized changing nutrient input as a signal, and responses to that signal as increased phytoplankton biomass and primary production, decomposition of phytoplankton-derived organic matter, and enhanced depletion of oxygen from bottom waters. Coastal research in recent decades has identified key differences in the responses of lakes and coastal-estuarine ecosystems to nutrient enrichment. The contemporary (Phase II) conceptual model reflects those differences and includes explicit recognition of (1) system-specific attributes that act as a filter to modulate the responses to enrichment (leading to large differences among estuarine-coastal systems in their sensitivity to nutrient enrichment); and (2) a complex suite of direct and indirect responses including linked changes in: water transparency, distribution of vascular plants and biomass of macroalgae, sediment biogeochemistry and nutrient cycling, nutrient ratios and their regulation of phytoplankton community composition, frequency of toxic/harmful algal blooms, habitat quality for metazoans, reproduction/growth/survival of pelagic and benthic invertebrates, and subtle changes such as shifts in the seasonality of ecosystem functions. Each aspect of the Phase II model is illustrated here with examples from coastal ecosystems around the world. In the last section of this review I present one vision of the next (Phase III) stage in the evolution of our conceptual model, organized around 5
Directory of Open Access Journals (Sweden)
Mehran FARAJOLLAHI
2010-07-01
The present research aims at presenting a conceptual model for effective distance learning in higher education. The findings of this research show that an understanding of technological capabilities and of learning theories, especially constructivist theory, independent learning theory, and communication and interaction theory, is an efficient factor in the planning of effective distance learning in higher education. Considering the theoretical foundations of the present research, in the effective distance learning model the learner is situated at the center of the learning environment. For this purpose, the learner needs to be ready for successful learning and the teacher has to be ready to design the teaching-learning activities when they initially enter the environment. In the present model, group and individual active teaching-learning approaches, timely feedback, the use of IT, and eight types of interaction have been designed with respect to the theoretical foundations and current university missions. Among the issues emphasized in this model are the initial, formative, and summative evaluations. In an effective distance learning environment, evaluation should be part of the learning process and the feedback resulting from it should be used to improve learning. For validating the specified features, the opinions of distance learning experts at Payame Noor, Shiraz, Science and Technology, and Amirkabir Universities were used, and these verified a high percentage of the statistical sample of the above-mentioned features.
Uncertainty and sensitivity analysis of environmental transport models
International Nuclear Information System (INIS)
Margulies, T.S.; Lancaster, L.E.
1985-01-01
An uncertainty and sensitivity analysis has been made of the CRAC-2 (Calculations of Reactor Accident Consequences) atmospheric transport and deposition models. Robustness and uncertainty aspects of air and ground deposited material and the relative contribution of input and model parameters were systematically studied. The underlying data structures were investigated using a multiway layout of factors over specified ranges generated via a Latin hypercube sampling scheme. The variables selected in our analysis include: weather bin, dry deposition velocity, rain washout coefficient/rain intensity, duration of release, heat content, sigma-z (vertical) plume dispersion parameter, sigma-y (crosswind) plume dispersion parameter, and mixing height. To determine the contributors to the output variability (versus distance from the site) step-wise regression analyses were performed on transformations of the spatial concentration patterns simulated. 27 references, 2 figures, 3 tables
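The Latin hypercube sampling and regression-based ranking of contributors described above can be illustrated on a toy response. The three inputs and the response function below are placeholders, not the CRAC-2 parameterization:

```python
import numpy as np
from scipy.stats import qmc

# Latin hypercube over three illustrative inputs of a toy ground-concentration
# response (invented functional form, not the actual CRAC-2 models).
sampler = qmc.LatinHypercube(d=3, seed=4)
x = qmc.scale(sampler.random(500),
              l_bounds=[0.001, 0.1, 200.0],     # dep. velocity, sigma_z, mixing height
              u_bounds=[0.010, 2.0, 2000.0])
v_dep, sigma_z, h_mix = x.T

conc = np.exp(-50.0 * v_dep) / (sigma_z * h_mix)   # toy response

# Standardized regression coefficients rank each input's contribution to
# output variability (a simple stand-in for the report's stepwise regression
# on transformed concentration patterns).
z = (x - x.mean(axis=0)) / x.std(axis=0)
y = (np.log(conc) - np.log(conc).mean()) / np.log(conc).std()
coef, *_ = np.linalg.lstsq(z, y, rcond=None)
print(dict(zip(["v_dep", "sigma_z", "h_mix"], coef.round(3))))
```

Regressing on the log of the response, as done here, is a common transformation when concentrations span orders of magnitude; the coefficients' magnitudes then rank the contributors to output variability.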
Review of strategies for handling geological uncertainty in groundwater flow and transport modeling
DEFF Research Database (Denmark)
Refsgaard, Jens Christian; Christensen, Steen; Sonnenborg, Torben O.
2012-01-01
be accounted for, but is often neglected, in assessments of prediction uncertainties. Strategies for assessing prediction uncertainty due to geologically related uncertainty may be divided into three main categories, accounting for uncertainty due to: (a) the geological structure; (b) effective model...... parameters; and (c) model parameters including local scale heterogeneity. The most common methodologies for uncertainty assessments within each of these categories, such as multiple modeling, Monte Carlo analysis, regression analysis and moment equation approach, are briefly described with emphasis...
Discriminative Random Field Models for Subsurface Contamination Uncertainty Quantification
Arshadi, M.; Abriola, L. M.; Miller, E. L.; De Paolis Kaluza, C.
2017-12-01
Application of flow and transport simulators for prediction of the release, entrapment, and persistence of dense non-aqueous phase liquids (DNAPLs) and associated contaminant plumes is a computationally intensive process that requires specification of a large number of material properties and hydrologic/chemical parameters. Given its computational burden, this direct simulation approach is particularly ill-suited for quantifying both the expected performance and uncertainty associated with candidate remediation strategies under real field conditions. Prediction uncertainties primarily arise from limited information about contaminant mass distributions, as well as the spatial distribution of subsurface hydrologic properties. Application of direct simulation to quantify uncertainty would, thus, typically require simulating multiphase flow and transport for a large number of permeability and release scenarios to collect statistics associated with remedial effectiveness, a computationally prohibitive process. The primary objective of this work is to develop and demonstrate a methodology that employs measured field data to produce equi-probable stochastic representations of a subsurface source zone that capture the spatial distribution and uncertainty associated with key features that control remediation performance (i.e., permeability and contamination mass). Here we employ probabilistic models known as discriminative random fields (DRFs) to synthesize stochastic realizations of initial mass distributions consistent with known, and typically limited, site characterization data. Using a limited number of full scale simulations as training data, a statistical model is developed for predicting the distribution of contaminant mass (e.g., DNAPL saturation and aqueous concentration) across a heterogeneous domain. Monte-Carlo sampling methods are then employed, in conjunction with the trained statistical model, to generate realizations conditioned on measured borehole data
Economic-mathematical methods and models under uncertainty
Aliyev, A G
2013-01-01
Brief Information on Finite-Dimensional Vector Space and its Application in Economics; Bases of Piecewise-Linear Economic-Mathematical Models with Regard to Influence of Unaccounted Factors in Finite-Dimensional Vector Space; Piecewise Linear Economic-Mathematical Models with Regard to Unaccounted Factors Influence in Three-Dimensional Vector Space; Piecewise-Linear Economic-Mathematical Models with Regard to Unaccounted Factors Influence on a Plane; Bases of Software for Computer Simulation and Multivariant Prediction of Economic Even at Uncertainty Conditions on the Base of N-Comp
Conceptual Model of Offshore Wind Environmental Risk Evaluation System
Energy Technology Data Exchange (ETDEWEB)
Anderson, Richard M.; Copping, Andrea E.; Van Cleve, Frances B.; Unwin, Stephen D.; Hamilton, Erin L.
2010-06-01
In this report we describe the development of the Environmental Risk Evaluation System (ERES), a risk-informed analytical process for estimating the environmental risks associated with the construction and operation of offshore wind energy generation projects. The development of ERES for offshore wind is closely allied to a concurrent process undertaken to examine environmental effects of marine and hydrokinetic (MHK) energy generation, although specific risk-relevant attributes will differ between the MHK and offshore wind domains. During FY10, a conceptual design of ERES for offshore wind will be developed. The offshore wind ERES mockup described in this report will provide a preview of the functionality of a fully developed risk evaluation system that will use risk assessment techniques to determine priority stressors on aquatic organisms and environments from specific technology aspects, identify key uncertainties underlying high-risk issues, compile a wide-range of data types in an innovative and flexible data organizing scheme, and inform planning and decision processes with a transparent and technically robust decision-support tool. A fully functional version of ERES for offshore wind will be developed in a subsequent phase of the project.
Stochastic reduced order models for inverse problems under uncertainty.
Warner, James E; Aquino, Wilkins; Grigoriu, Mircea D
2015-03-01
This work presents a novel methodology for solving inverse problems under uncertainty using stochastic reduced order models (SROMs). Given statistical information about an observed state variable in a system, unknown parameters are estimated probabilistically through the solution of a model-constrained, stochastic optimization problem. The point of departure and crux of the proposed framework is the representation of a random quantity using an SROM - a low dimensional, discrete approximation to a continuous random element that permits efficient and non-intrusive stochastic computations. Characterizing the uncertainties with SROMs transforms the stochastic optimization problem into a deterministic one. The non-intrusive nature of SROMs facilitates efficient gradient computations for random vector unknowns and relies entirely on calls to existing deterministic solvers. Furthermore, the method is naturally extended to handle multiple sources of uncertainty in cases where state variable data, system parameters, and boundary conditions are all considered random. The new and widely-applicable SROM framework is formulated for a general stochastic optimization problem in terms of an abstract objective function and constraining model. For demonstration purposes, however, we study its performance in the specific case of inverse identification of random material parameters in elastodynamics. We demonstrate the ability to efficiently recover random shear moduli given material displacement statistics as input data. We also show that the approach remains effective for the case where the loading in the problem is random as well.
Conceptual Change Texts in Chemistry Teaching: A Study on the Particle Model of Matter
Beerenwinkel, Anne; Parchmann, Ilka; Grasel, Cornelia
2011-01-01
This study explores the effect of a conceptual change text on students' awareness of common misconceptions about the particle model of matter. The conceptual change text was designed based on principles of text comprehensibility, of conceptual change instruction, and of instructional approaches to introducing the particle model. It was evaluated in…
Parameter and uncertainty estimation for mechanistic, spatially explicit epidemiological models
Finger, Flavio; Schaefli, Bettina; Bertuzzo, Enrico; Mari, Lorenzo; Rinaldo, Andrea
2014-05-01
Epidemiological models can be a crucially important tool for decision-making during disease outbreaks. The range of possible applications spans from real-time forecasting and allocation of health-care resources to testing alternative intervention mechanisms such as vaccines, antibiotics or the improvement of sanitary conditions. Our spatially explicit, mechanistic models for cholera epidemics have been successfully applied to several epidemics, including the one that struck Haiti in late 2010 and is still ongoing. Calibration and parameter estimation of such models represents a major challenge because of properties unusual in traditional geoscientific domains such as hydrology. Firstly, the epidemiological data available might be subject to high uncertainties due to error-prone diagnosis as well as manual (and possibly incomplete) data collection. Secondly, long-term time series of epidemiological data are often unavailable. Finally, the spatially explicit character of the models requires the comparison of several time series of model outputs with their real-world counterparts, which calls for an appropriate weighting scheme. It follows that the usual assumption of a homoscedastic Gaussian error distribution, used in combination with classical calibration techniques based on Markov chain Monte Carlo algorithms, is likely to be violated, whereas the construction of an appropriate formal likelihood function seems close to impossible. Alternative calibration methods, which allow for accurate estimation of total model uncertainty, particularly regarding the envisaged use of the models for decision-making, are thus needed. Here we present the most recent developments regarding methods for parameter and uncertainty estimation to be used with our mechanistic, spatially explicit models for cholera epidemics, based on informal measures of goodness of fit.
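The informal alternative to a formal likelihood that the authors motivate can be illustrated with a rejection-style calibration of a deliberately simple SIR model (nothing like the spatially explicit cholera model). The parameter values, noise level, and log-scale distance below are all illustrative choices:

```python
import numpy as np

# Rejection-based calibration with an informal goodness-of-fit measure,
# avoiding any formal likelihood assumption.
def sir(beta, gamma, days=60, n=1000.0, i0=5.0):
    """Discrete-time SIR; returns daily new cases."""
    s, i, cases = n - i0, i0, []
    for _ in range(days):
        new = beta * s * i / n
        s, i = s - new, i + new - gamma * i
        cases.append(new)
    return np.array(cases)

rng = np.random.default_rng(7)
# Synthetic noisy "case reports" from known parameters (beta=0.35, gamma=0.15).
observed = sir(0.35, 0.15) * rng.lognormal(0.0, 0.1, 60)

# Sample parameters from uniform priors; score each with an informal
# log-scale distance; keep the best-fitting fraction.
betas = rng.uniform(0.10, 0.60, 5000)
gammas = rng.uniform(0.05, 0.30, 5000)
dist = np.array([np.mean((np.log1p(sir(b, g)) - np.log1p(observed)) ** 2)
                 for b, g in zip(betas, gammas)])
keep = dist < np.quantile(dist, 0.01)            # accept roughly the best 1%
print(betas[keep].mean(), gammas[keep].mean())
```

The accepted parameter cloud plays the role of an approximate posterior, giving a usable estimate of total model uncertainty without specifying an error distribution.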
Penetration Testing Professional Ethics: a conceptual model and taxonomy
Directory of Open Access Journals (Sweden)
Justin Pierce
2006-05-01
In an environment where commercial software is continually patched to correct security flaws, penetration testing can provide organisations with a realistic assessment of their security posture. Penetration testing uses the same principles as criminal hackers to penetrate corporate networks and thereby verify the presence of software vulnerabilities. Network administrators can use the results of a penetration test to correct flaws and improve overall security. The use of hacking techniques, however, raises several ethical questions that centre on the integrity of the tester to maintain professional distance and uphold the profession. This paper discusses the ethics of penetration testing and presents our conceptual model and revised taxonomy.
Handling Uncertainty in Palaeo-Climate Models and Data
Voss, J.; Haywood, A. M.; Dolan, A. M.; Domingo, D.
2017-12-01
The study of palaeoclimates can provide data on the behaviour of the Earth system under boundary conditions different from the ones we observe in the present. One of the main challenges in this approach is that data on past climates come with large uncertainties, since quantities of interest cannot be observed directly but must be derived from proxies instead. We consider proxy-derived data from the Pliocene (around 3 million years ago; the last interval in Earth history when CO2 was at modern or near-future levels) and compare these data to the output of complex climate models. In order to perform a meaningful data-model comparison, uncertainties must be taken into account. In this context, we discuss two examples of complex data-model comparison problems. Both examples have in common that they involve fitting a statistical model to describe how the output of the climate simulations depends on various model parameters, including atmospheric CO2 concentration and orbital parameters (obliquity, eccentricity, and precession). This introduces additional uncertainties, but allows us to explore a much larger range of model parameters than would be feasible by relying on simulation runs alone. The first example shows how Gaussian process emulators can be used to perform data-model comparison when simulation runs differ only in the choice of orbital parameters, but temperature data are given in the (somewhat inconvenient) form of "warm peak averages". The second example shows how a simpler approach, based on linear regression, can be used to analyse a more complex problem where we use a larger and more varied ensemble of climate simulations with the aim of estimating Earth System Sensitivity.
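The core of a Gaussian process emulator is standard GP regression: train on a small ensemble of expensive runs, then predict (with uncertainty) anywhere in parameter space. The sketch below is numpy-only and uses an invented quadratic "simulator" in place of a climate model; the kernel hyperparameters and data are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical ensemble: scaled orbital parameters -> simulated warm-peak-average
# temperature. The quadratic "simulator" is an illustrative stand-in for a climate model.
X = rng.uniform(-1, 1, size=(40, 3))   # obliquity, eccentricity, precession (scaled)
y = 15 + 2 * X[:, 0] - X[:, 1] ** 2 + 0.5 * X[:, 2] + rng.normal(0, 0.05, 40)

def rbf(A, B, ell=0.7, var=4.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return var * np.exp(-0.5 * d2 / ell ** 2)

# Gaussian process emulator: condition on the training runs once...
K = rbf(X, X) + 1e-4 * np.eye(len(X))        # small nugget for numerical stability
L = np.linalg.cholesky(K)
alpha = np.linalg.solve(L.T, np.linalg.solve(L, y - y.mean()))

def emulate(Xs):
    """...then predict mean and standard deviation cheaply at new parameter settings."""
    Ks = rbf(Xs, X)
    mean = y.mean() + Ks @ alpha
    v = np.linalg.solve(L, Ks.T)
    var = np.diag(rbf(Xs, Xs)) - (v ** 2).sum(axis=0)
    return mean, np.sqrt(np.maximum(var, 0.0))

m, s = emulate(np.array([[0.2, -0.1, 0.4]]))
```

The predictive standard deviation `s` is what lets the data-model comparison account for emulator uncertainty on top of proxy uncertainty.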
A conceptual model for local content development in petroleum industry
Directory of Open Access Journals (Sweden)
Abolfazl Kazzazi
2012-10-01
Full Text Available A novel concept in the oil industry, local content, is gradually emerging. Local content should be defined in terms of value addition in the local country (by local staff, local materials, local services and facilities) rather than in terms of ownership of the company performing the value-added activities. Many oil-exporting countries have taken a positive approach toward local content development to maximize the benefits from oil and gas extraction. The purpose of this study is to develop a conceptual model for local content development in the petroleum industry. Local content can generally be defined in terms of the ownership and/or location of the enterprises involved in production and/or the value added in the production process. Local content promotion will have to vary significantly between countries, depending on the current status of their economic, political and social development. This model is useful for state governments to consider all aspects and factors affecting local content development generally. Local content development outcomes are economic growth, industrial growth and spillover effects. The paper begins by examining the factors found in the literature believed to influence local content promotion. Based on our review, the conceptual model derived includes key factors of local content that evaluate local content development, and it examines interrelations between local policies, local infrastructure, local environment, and local capability.
Uncertainties in modelling the climate impact of irrigation
de Vrese, Philipp; Hagemann, Stefan
2017-11-01
Irrigation-based agriculture constitutes an essential factor for food security as well as fresh water resources and has a distinct impact on regional and global climate. Many issues related to irrigation's climate impact are addressed in studies that apply a wide range of models. These involve substantial uncertainties related to differences in model structure and parametrizations on the one hand and to the simplifying assumptions needed to represent irrigation on the other. To address these uncertainties, we used the Max Planck Institute for Meteorology's Earth System Model, into which a simple irrigation scheme was implemented. In order to estimate possible uncertainties with regard to the model's more general structure, we compared the climate impact of irrigation between three simulations that use different schemes for the land-surface-atmosphere coupling. Here, it can be shown that the choice of coupling scheme affects not only the magnitude of possible impacts but even their direction. For example, when using a scheme that does not explicitly resolve spatial subgrid-scale heterogeneity at the surface, irrigation reduces the atmospheric water content, even in heavily irrigated regions. In contrast, in simulations that use a coupling scheme that resolves heterogeneity at the surface or even within the lowest layers of the atmosphere, irrigation increases the average atmospheric specific humidity. A second experiment targeted possible uncertainties related to the representation of irrigation characteristics. Here, in four simulations, the irrigation effectiveness (controlled by the target soil moisture and the non-vegetated fraction of the grid box that receives irrigation) and the timing of delivery were varied. The second experiment shows that uncertainties related to the modelled irrigation characteristics, especially the irrigation effectiveness, are also substantial. In general the impact of irrigation on the state of the land
Sensitivity and uncertainty analysis of a polyurethane foam decomposition model
Energy Technology Data Exchange (ETDEWEB)
HOBBS,MICHAEL L.; ROBINSON,DAVID G.
2000-03-14
Sensitivity/uncertainty analyses are not commonly performed on complex, finite-element engineering models because the analyses are time consuming, CPU intensive, nontrivial exercises that can lead to deceptive results. To illustrate these ideas, an analytical sensitivity/uncertainty analysis is used to determine the standard deviation and the primary factors affecting the burn velocity of polyurethane foam exposed to firelike radiative boundary conditions. The complex, finite element model has 25 input parameters that include chemistry, polymer structure, and thermophysical properties. The response variable was selected as the steady-state burn velocity calculated as the derivative of the burn front location versus time. The standard deviation of the burn velocity was determined by taking numerical derivatives of the response variable with respect to each of the 25 input parameters. Since the response variable is also a derivative, the standard deviation is essentially determined from a second derivative that is extremely sensitive to numerical noise. To minimize the numerical noise, 50-micron elements and approximately 1-msec time steps were required to obtain stable uncertainty results. The primary effect variable was shown to be the emissivity of the foam.
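The propagation step described above, numerical derivatives of the response with respect to each parameter combined into a standard deviation, can be sketched compactly. The response function, nominal values, and standard deviations below are illustrative assumptions standing in for the 25-parameter finite-element model.

```python
import numpy as np

def burn_velocity(p):
    """Toy stand-in for the finite-element response: steady-state burn velocity
    as a smooth function of parameters (emissivity is p[0]; the functional form
    and all values are illustrative assumptions)."""
    return 0.8 * p[0] ** 2 + 0.1 * p[1] + 0.05 * np.log(p[2])

p0 = np.array([0.9, 1.2, 2.0])       # nominal parameter values (assumed)
sigma = np.array([0.05, 0.1, 0.2])   # parameter standard deviations (assumed)

# Central finite differences for the sensitivities dv/dp_i ...
h = 1e-5 * np.maximum(np.abs(p0), 1.0)
grad = np.empty(3)
for i in range(3):
    dp = np.zeros(3)
    dp[i] = h[i]
    grad[i] = (burn_velocity(p0 + dp) - burn_velocity(p0 - dp)) / (2 * h[i])

# ... then first-order propagation for independent parameters:
# var(v) = sum_i (dv/dp_i)^2 * sigma_i^2.
contrib = grad ** 2 * sigma ** 2
std_v = np.sqrt(contrib.sum())
dominant = int(np.argmax(contrib))   # index of the primary effect variable
```

In the paper's setting the differencing is the hard part: because the response is itself a derivative of a noisy finite-element solution, the step sizes and mesh resolution must be chosen carefully to keep the numerical noise below the signal.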
Uncertainty quantification for quantum chemical models of complex reaction networks.
Proppe, Jonny; Husch, Tamara; Simm, Gregor N; Reiher, Markus
2016-12-22
For the quantitative understanding of complex chemical reaction mechanisms, it is, in general, necessary to accurately determine the corresponding free energy surface and to solve the resulting continuous-time reaction rate equations for a continuous state space. For a general (complex) reaction network, it is computationally hard to fulfill these two requirements. However, it is possible to approximately address these challenges in a physically consistent way. On the one hand, it may be sufficient to consider approximate free energies if a reliable uncertainty measure can be provided. On the other hand, a highly resolved time evolution may not be necessary to still determine quantitative fluxes in a reaction network if one is interested in specific time scales. In this paper, we present discrete-time kinetic simulations in discrete state space taking free energy uncertainties into account. The method builds upon thermo-chemical data obtained from electronic structure calculations in a condensed-phase model. Our kinetic approach supports the analysis of general reaction networks spanning multiple time scales, which is here demonstrated for the example of the formose reaction. An important application of our approach is the detection of regions in a reaction network which require further investigation, given the uncertainties introduced by both approximate electronic structure methods and kinetic models. Such cases can then be studied in greater detail with more sophisticated first-principles calculations and kinetic simulations.
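The idea of propagating free-energy uncertainty into kinetics can be illustrated on a toy network: sample activation barriers from their uncertainty distribution, convert each sample to rates, integrate the discrete-time kinetics, and inspect the spread of outcomes. The A→B→C network, barrier values, and uncertainties below are assumptions, not the formose-reaction data.

```python
import numpy as np

rng = np.random.default_rng(2)
RT = 2.48        # k_B * T in kJ/mol at ~298 K
PREF = 6.21e12   # Eyring prefactor k_B*T/h in 1/s at ~298 K

# Toy network A -> B -> C with uncertain activation free energies (kJ/mol);
# means and sigmas are illustrative stand-ins for electronic-structure results.
dG = np.array([70.0, 75.0])
dG_sigma = np.array([3.0, 3.0])

def eyring(barriers):
    return PREF * np.exp(-barriers / RT)

def integrate(k1, k2, t_end=2.0, n=4000):
    """Discrete-time (Euler) kinetics: dA/dt=-k1*A, dB/dt=k1*A-k2*B, dC/dt=k2*B."""
    dt = t_end / n
    A, B, C = 1.0, 0.0, 0.0
    for _ in range(n):
        dA = -k1 * A
        dB = k1 * A - k2 * B
        dC = k2 * B
        A, B, C = A + dt * dA, B + dt * dB, C + dt * dC
    return A, B, C

# Propagate the free-energy uncertainty: sample barriers, rerun the kinetics,
# and examine the spread of the final yield of C.
finals = np.array([integrate(*eyring(rng.normal(dG, dG_sigma))) for _ in range(50)])
c_yield = finals[:, 2]
```

A wide spread in `c_yield` is exactly the signal the authors use to flag network regions that deserve more accurate first-principles treatment.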
A python framework for environmental model uncertainty analysis
White, Jeremy; Fienen, Michael N.; Doherty, John E.
2016-01-01
We have developed pyEMU, a python framework for Environmental Modeling Uncertainty analyses, an open-source tool that is non-intrusive, easy to use, computationally efficient, and scalable to highly parameterized inverse problems. The framework implements several types of linear (first-order, second-moment (FOSM)) and nonlinear uncertainty analyses. The FOSM-based analyses can also be completed prior to parameter estimation to help inform important modeling decisions, such as parameterization and objective function formulation. Complete workflows for several types of FOSM-based and nonlinear analyses are documented in example notebooks implemented using Jupyter that are available in the online pyEMU repository. Example workflows include basic parameter and forecast analyses, data-worth analyses, and error-variance analyses, as well as usage of parameter ensemble generation and management capabilities. These workflows document the necessary steps and provide insight into the results, with the goal of educating users not only in how to apply pyEMU, but also in the underlying theory of applied uncertainty quantification.
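The FOSM mathematics that frameworks like pyEMU automate can be written in a few lines of numpy (this is a generic sketch of the underlying linear algebra, not the pyEMU API; the Jacobian, covariances, and forecast sensitivity vector are invented for illustration):

```python
import numpy as np

# FOSM: linearize the model, then the (Schur-complement) conditioned covariance is
#   post_C = C - C J^T (J C J^T + R)^-1 J C,
# and forecast variance is s^T post_C s for forecast sensitivity vector s.
J = np.array([[1.0, 0.4, 0.0],    # Jacobian: d(observations)/d(parameters) (assumed)
              [0.2, 1.0, 0.3]])
C = np.diag([1.0, 2.0, 4.0])      # prior parameter covariance (assumed)
R = 0.1 * np.eye(2)               # observation noise covariance (assumed)
s = np.array([0.5, 0.0, 1.0])     # forecast sensitivity to parameters (assumed)

prior_var = s @ C @ s
gain = C @ J.T @ np.linalg.inv(J @ C @ J.T + R)
post_C = C - gain @ J @ C
post_var = s @ post_C @ s

# "Data worth": the forecast-variance reduction attributable to the observations.
worth = prior_var - post_var
```

Because this is all linear algebra on a Jacobian, it can be evaluated before any parameter estimation, which is why FOSM is useful for deciding parameterization and objective-function design up front.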
Uncertainty and Preference Modelling for Multiple Criteria Vehicle Evaluation
Directory of Open Access Journals (Sweden)
Qiuping Yang
2010-12-01
Full Text Available A general framework for vehicle assessment is proposed based on both mass survey information and the evidential reasoning (ER) approach. Several methods for uncertainty and preference modelling are developed within the framework, including the measurement of uncertainty caused by missing information, the estimation of missing information in original surveys, the use of nonlinear functions for data mapping, and the use of nonlinear functions as utility functions to combine distributed assessments into a single index. The results of the investigation show that various measures can be used to represent the different preferences of decision makers towards the same feedback from respondents. Based on the ER approach, credible and informative analysis can be conducted through a complete understanding of the assessment problem in question and a full exploration of the available information.
Uncertainty Quantification for Combined Polynomial Chaos Kriging Surrogate Models
Weinmeister, Justin; Gao, Xinfeng; Krishna Prasad, Aditi; Roy, Sourajeet
2017-11-01
Surrogate modeling techniques are currently used to perform uncertainty quantification on computational fluid dynamics (CFD) models for their ability to identify the most impactful parameters on CFD simulations and help reduce computational cost in the engineering design process. The accuracy of these surrogate models depends on a number of factors, such as the training data created from the CFD simulations, the target functions, the surrogate model framework, and so on. Recently, we have combined polynomial chaos expansions (PCE) and Kriging to produce a more accurate surrogate model, polynomial chaos Kriging (PCK). In this talk, we analyze the error convergence rate for the Kriging, PCE, and PCK models on a convection-diffusion-reaction problem, and validate the statistical measures and performance of the PCK method for its application to practical CFD simulations.
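The PCE half of a PCK surrogate can be sketched with a least-squares fit in an orthogonal polynomial basis; PCK would additionally Krige the residuals of this trend. The 1D "CFD response" below is an invented stand-in, and the degree and sample sizes are arbitrary choices.

```python
import numpy as np
from math import factorial
from numpy.polynomial.hermite_e import hermevander

rng = np.random.default_rng(3)

def model(x):
    """Stand-in for an expensive CFD response with a standard-normal input
    (illustrative only; a real study would use simulation outputs)."""
    return np.exp(0.3 * x) + 0.1 * x ** 2

# Training data: sampled inputs and model evaluations.
x_train = rng.normal(size=200)
y_train = model(x_train)

# PCE: least-squares fit in the probabilists' Hermite basis He_0..He_5,
# which is orthogonal under the standard normal weight.
deg = 5
coef, *_ = np.linalg.lstsq(hermevander(x_train, deg), y_train, rcond=None)

# Output statistics come directly from the coefficients:
# mean = c_0, variance = sum_{k>=1} c_k^2 * k!  (since E[He_k^2] = k!).
pce_mean = coef[0]
pce_var = sum(coef[k] ** 2 * factorial(k) for k in range(1, deg + 1))

# Cross-check the mean against brute-force Monte Carlo on the true model.
mc_mean = model(rng.normal(size=100_000)).mean()
```

Getting the output mean and variance directly from the coefficients, with no extra sampling, is the property that makes PCE attractive as the trend component of PCK.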
Assessment of errors and uncertainty patterns in GIA modeling
DEFF Research Database (Denmark)
Barletta, Valentina Roberta; Spada, G.
2012-01-01
During the last decade many efforts have been devoted to the assessment of global sea level rise and to the determination of the mass balance of continental ice sheets. In this context, the important role of glacial-isostatic adjustment (GIA) has been clearly recognized. Yet, in many cases only one... "preferred" GIA model has been used, without any consideration of the possible errors involved. Lacking a rigorous assessment of systematic errors in GIA modeling, the reliability of the results is uncertain. GIA sensitivity and uncertainties associated with the viscosity models have been explored... in the literature. However, at least two major sources of errors remain. The first is associated with the ice models, spatial distribution of ice and history of melting (this is especially the case of Antarctica), the second with the numerical implementation of model features relevant to sea level modeling...
A CONCEPTUAL FRAMEWORK FOR SUSTAINABLE POULTRY SUPPLY CHAIN MODEL
Directory of Open Access Journals (Sweden)
Mohammad SHAMSUDDOHA
2013-12-01
Full Text Available Nowadays, a sustainable supply chain is a crucial consideration for future-focused industries. Attention to supply chain management has increased since the 1980s, when firms discovered the benefits of mutual relationships within and beyond their own organization. This is why researchers are trying hard to develop new theories and models which might help the corporate sector achieve sustainability in its supply chains. This reflection can be seen in the number of papers published, in particular in journals, since 1980. The objectives of this paper are twofold. First, it offers a literature review on sustainable supply chain management covering papers published in the last three decades. Second, it offers a conceptual sustainable supply chain process model in light of triple bottom line theory. The model has been developed through an in-depth interview with an entrepreneur from a poultry case industry in Bangladesh.
A sliding mode observer for hemodynamic characterization under modeling uncertainties
Zayane, Chadia
2014-06-01
This paper addresses the reconstruction of physiological states in a small region of the brain under modeling uncertainties. The misunderstood coupling between the cerebral blood volume and the oxygen extraction fraction has led to a partial knowledge of the so-called balloon model describing the hemodynamic behavior of the brain. To overcome this difficulty, a High Order Sliding Mode observer is applied to the balloon system, where the unknown coupling is considered as an internal perturbation. The effectiveness of the proposed method is illustrated through a set of synthetic data that mimic fMRI experiments.
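The observer idea, injecting the output error through a discontinuous (here smoothed) sign term so that an unknown internal perturbation is rejected, can be shown on a toy second-order system. This is a basic first-order sliding-mode observer with a boundary layer, not the paper's high-order observer or the balloon model; gains, the perturbation, and the system are all invented for illustration.

```python
import numpy as np

# Toy system: x1' = x2, x2' = u(t) + d(t), with only y = x1 measured and
# d(t) an unknown perturbation standing in for the unknown CBV/OEF coupling.
dt, T = 1e-3, 5.0
k1, k2, phi = 2.0, 5.0, 0.01   # observer gains and boundary-layer width (hand-tuned)

def sat(e):
    """Smoothed sign function (boundary layer) to limit chattering."""
    return np.clip(e / phi, -1.0, 1.0)

x = np.array([1.0, 0.0])       # true state (unknown to the observer)
xh = np.array([0.0, 0.0])      # observer state
u = lambda t: np.sin(t)        # known input

for i in range(int(T / dt)):
    t = i * dt
    d = 0.2 * np.cos(3 * t)    # unknown internal perturbation
    # true dynamics (Euler step)
    x = x + dt * np.array([x[1], u(t) + d])
    # observer: inject the output error through the sliding term
    e1 = x[0] - xh[0]
    xh = xh + dt * np.array([xh[1] + k1 * sat(e1), u(t) + k2 * sat(e1)])
```

Despite never seeing `x2` or `d`, the observer reconstructs the unmeasured velocity to within the residual induced by the perturbation.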
International Nuclear Information System (INIS)
Hofer, E.; Hoffman, F.O.
1987-02-01
The uncertainty analysis of model predictions has to discriminate between two fundamentally different types of uncertainty. The presence of stochastic variability (Type 1 uncertainty) necessitates the use of a probabilistic model instead of the much simpler deterministic one. Lack of knowledge (Type 2 uncertainty), however, applies to deterministic as well as to probabilistic model predictions and often dominates over uncertainties of Type 1. The term "probability" is interpreted differently in the probabilistic analysis of either type of uncertainty. After these distinctions have been explained, the discussion centers on the propagation of parameter uncertainties through the model, the derivation of quantitative uncertainty statements for model predictions, and the presentation and interpretation of the results of a Type 2 uncertainty analysis. Various alternative approaches are compared for a very simple deterministic model.
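The Type 1 / Type 2 separation is commonly operationalized with a two-loop (nested) Monte Carlo: the outer loop samples knowledge uncertainty, the inner loop samples stochastic variability. The toy dose model and all distributions below are illustrative assumptions, not from the report.

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy dose model: dose = intake * transfer_factor. Intake varies stochastically
# across individuals (Type 1); the transfer factor is a fixed but imperfectly
# known constant (Type 2). All distributions are assumed for illustration.
n_outer, n_inner = 500, 2000
p95 = np.empty(n_outer)
for j in range(n_outer):
    tf = rng.lognormal(mean=np.log(0.02), sigma=0.4)                      # Type 2 draw
    intake = rng.lognormal(mean=np.log(100.0), sigma=0.6, size=n_inner)   # Type 1 draws
    p95[j] = np.quantile(intake * tf, 0.95)   # population percentile, given tf

# A Type 2 uncertainty statement about a Type 1 population quantity:
# a 90% interval for the 95th-percentile dose.
lo, hi = np.quantile(p95, [0.05, 0.95])
```

Keeping the loops separate is what allows the two different readings of "probability" to coexist: the inner quantile is a frequency across individuals, the outer interval is a degree-of-belief statement about that frequency.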
Updated Conceptual Model for the 300 Area Uranium Groundwater Plume
Energy Technology Data Exchange (ETDEWEB)
Zachara, John M.; Freshley, Mark D.; Last, George V.; Peterson, Robert E.; Bjornstad, Bruce N.
2012-11-01
The 300 Area uranium groundwater plume in the 300-FF-5 Operable Unit is residual from past discharge of nuclear fuel fabrication wastes to a number of liquid (and solid) disposal sites. The source zones in the disposal sites were remediated by excavation and backfilled to grade, but sorbed uranium remains in deeper, unexcavated vadose zone sediments. In spite of source term removal, the groundwater plume has shown remarkable persistence, with concentrations exceeding the drinking water standard over an area of approximately 1 km2. The plume resides within a coupled vadose zone, groundwater, river zone system of immense complexity and scale. Interactions between geologic structure, the hydrologic system driven by the Columbia River, groundwater-river exchange points, and the geochemistry of uranium contribute to persistence of the plume. The U.S. Department of Energy (DOE) recently completed a Remedial Investigation/Feasibility Study (RI/FS) to document characterization of the 300 Area uranium plume and plan for beginning to implement proposed remedial actions. As part of the RI/FS document, a conceptual model was developed that integrates knowledge of the hydrogeologic and geochemical properties of the 300 Area and controlling processes to yield an understanding of how the system behaves and the variables that control it. Recent results from the Hanford Integrated Field Research Challenge site and the Subsurface Biogeochemistry Scientific Focus Area Project funded by the DOE Office of Science were used to update the conceptual model and provide an assessment of key factors controlling plume persistence.
A conceptual model of people's vulnerability to floods
Milanesi, Luca; Pilotti, Marco; Ranzi, Roberto
2015-01-01
Hydraulic risk maps provide the baseline for land use and emergency planning. Accordingly, they should convey clear information on the potential physical implications of the different hazards to the stakeholders. This paper presents a vulnerability criterion focused on human stability in a flow, specifically devised for rapidly evolving floods where life, rather than economic values, is primarily threatened. The human body is conceptualized as a set of cylinders and its stability against slipping and toppling is assessed by force and moment equilibrium. Moreover, a depth threshold is assumed to account for drowning. In order to widen its scope of application, the model takes into account the destabilizing effects of local slope (so far disregarded in the literature) and fluid density. The resulting vulnerability classification can be naturally subdivided into three levels (low, medium, and high), delimited by two stability curves for children and adults, respectively. In comparison with the most advanced conceptual approaches in the literature, the proposed model is weakly parameterized and the computed thresholds fit the available experimental data sets better. A code that implements the proposed algorithm is provided.
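The force and moment balance can be illustrated with a single-cylinder body (the paper uses a set of cylinders and adds slope and fluid-density effects). Every parameter value below, body dimensions, drag and friction coefficients, drowning depth, is an assumption for illustration, not the paper's calibrated value.

```python
import math

rho, g = 1000.0, 9.81                       # water density (kg/m^3), gravity (m/s^2)
mass, height, diameter = 75.0, 1.75, 0.25   # adult body as one cylinder (assumed)
mu, cd = 0.4, 1.1                           # friction and drag coefficients (assumed)

def stable(depth, velocity):
    """True if neither slipping, toppling, nor the drowning threshold is triggered."""
    if depth > 1.2:                         # assumed depth threshold for drowning
        return False
    h = min(depth, height)                  # wetted height
    drag = 0.5 * rho * cd * diameter * h * velocity ** 2
    buoyancy = rho * g * math.pi * (diameter / 2) ** 2 * h
    net_weight = mass * g - buoyancy
    if net_weight <= 0:                     # fully buoyant: no stability possible
        return False
    slipping = drag > mu * net_weight                       # force balance at the feet
    toppling = drag * h / 2 > net_weight * diameter / 2     # moments, downstream edge
    return not (slipping or toppling)
```

Sweeping `stable` over a depth-velocity grid reproduces the familiar shape of such criteria: a stability curve that falls off as either depth or velocity grows.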
A Bayesian Framework of Uncertainties Integration in 3D Geological Model
Liang, D.; Liu, X.
2017-12-01
3D geological models can describe complicated geological phenomena in an intuitive way, but their application may be limited by uncertain factors. Great progress has been made over the years, yet many studies decompose the uncertainties of a geological model and analyze them item by item from each source, ignoring the comprehensive impact of multi-source uncertainties. To evaluate this synthetical uncertainty, we choose probability distributions to quantify uncertainty and propose a Bayesian framework of uncertainty integration. With this framework, we integrated data errors, spatial randomness, and cognitive information into a posterior distribution to evaluate the synthetical uncertainty of a geological model. Uncertainties propagate and accumulate in the modeling process, and the gradual integration of multi-source uncertainty is a kind of simulation of that propagation. Bayesian inference accomplishes uncertainty updating in the modeling process. The maximum entropy principle is effective for estimating the prior probability distribution, ensuring that the prior satisfies the constraints supplied by the given information with minimum prejudice. In the end, we obtained a posterior distribution to evaluate the synthetical uncertainty of the geological model. This posterior distribution represents the synthetical impact of all the uncertain factors on the spatial structure of the geological model. The framework provides a solution for evaluating the synthetical impact of multi-source uncertainties on a geological model and an approach to studying the uncertainty propagation mechanism in geological modeling.
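The gradual integration of data sources into a posterior can be shown in miniature with a discrete hypothesis space: start from the maximum-entropy (uniform) prior and fold in each observation's likelihood in turn. The hypotheses and likelihood values below are invented for illustration; the paper works with full 3D structural models.

```python
import numpy as np

# Competing structural interpretations of one geological unit (illustrative).
hypotheses = ["fault present", "fold only", "flat layering"]

# Maximum-entropy prior under no extra constraints is uniform.
prior = np.full(3, 1 / 3)

# Likelihood of each new observation (e.g. a borehole pick) under each
# hypothesis; these values are assumed for the example.
likelihoods = np.array([
    [0.7, 0.2, 0.1],   # observation 1
    [0.6, 0.3, 0.1],   # observation 2
])

# Integrate each data source in turn: uncertainty updating by Bayes' rule.
posterior = prior.copy()
for like in likelihoods:
    posterior = posterior * like
    posterior /= posterior.sum()

# Synthetical (overall) uncertainty of the model, as posterior entropy.
entropy = -np.sum(posterior * np.log(posterior))
best = hypotheses[int(np.argmax(posterior))]
```

The drop of the entropy below its prior value ln 3 is a single number summarizing how much the combined data sources constrain the structure.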
Denys Yemshanov; Frank H Koch; Mark Ducey
2015-01-01
Uncertainty is inherent in model-based forecasts of ecological invasions. In this chapter, we explore how the perceptions of that uncertainty can be incorporated into the pest risk assessment process. Uncertainty changes a decision maker's perceptions of risk; therefore, the direct incorporation of uncertainty may provide a more appropriate depiction of risk. Our...
Application of Probability Methods to Assess Crash Modeling Uncertainty
Lyle, Karen H.; Stockwell, Alan E.; Hardy, Robin C.
2007-01-01
Full-scale aircraft crash simulations performed with nonlinear, transient dynamic, finite element codes can incorporate structural complexities such as geometrically accurate models, human occupant models, and advanced material models that include nonlinear stress-strain behavior and material failure. Validation of these crash simulations is difficult due to a lack of sufficient information to adequately determine the uncertainty in the experimental data and the appropriateness of modeling assumptions. This paper evaluates probabilistic approaches to quantify the effects of finite element modeling assumptions on the predicted responses. The vertical drop test of a Fokker F28 fuselage section will be the focus of this paper. The results of a probabilistic analysis using finite element simulations will be compared with experimental data.
Multiscale Modeling and Uncertainty Quantification for Nuclear Fuel Performance
Energy Technology Data Exchange (ETDEWEB)
Estep, Donald [Colorado State Univ., Fort Collins, CO (United States); El-Azab, Anter [Florida State Univ., Tallahassee, FL (United States); Pernice, Michael [Idaho National Lab. (INL), Idaho Falls, ID (United States); Peterson, John W. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Polyakov, Peter [Univ. of Wyoming, Laramie, WY (United States); Tavener, Simon [Colorado State Univ., Fort Collins, CO (United States); Xiu, Dongbin [Purdue Univ., West Lafayette, IN (United States); Univ. of Utah, Salt Lake City, UT (United States)
2017-03-23
In this project, we will address the challenges associated with constructing high fidelity multiscale models of nuclear fuel performance. We (1) propose a novel approach for coupling mesoscale and macroscale models, (2) devise efficient numerical methods for simulating the coupled system, and (3) devise and analyze effective numerical approaches for error and uncertainty quantification for the coupled multiscale system. As an integral part of the project, we will carry out analysis of the effects of upscaling and downscaling, investigate efficient methods for stochastic sensitivity analysis of the individual macroscale and mesoscale models, and carry out a posteriori error analysis for computed results. We will pursue development and implementation of solutions in software used at Idaho National Laboratory on models of interest to the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program.
Importance of incorporating agriculture in conceptual rainfall-runoff models
de Boer-Euser, Tanja; Hrachowitz, Markus; Winsemius, Hessel; Savenije, Hubert
2016-04-01
Incorporating spatially variable information is a frequently discussed option to increase the performance of (semi-)distributed conceptual rainfall-runoff models. One method to do this is to use the spatially variable information to delineate Hydrological Response Units (HRUs) within a catchment. In large parts of Europe the original forested land cover has been replaced by agricultural land cover. This change in land cover probably affects the dominant runoff processes in the area, for example by increasing the Hortonian overland flow component, especially on the flatter and higher elevated parts of the catchment. A change in runoff processes implies a change in HRUs as well. A previous version of our model distinguished wetlands (areas close to the stream) from the remainder of the catchment. However, this configuration was not able to reproduce all fast runoff processes, in both summer and winter. Therefore, this study tests whether the reproduction of fast runoff processes can be improved by incorporating an HRU which explicitly accounts for the effect of agriculture. A case study is carried out in the Ourthe catchment in Belgium. For this case study the relevance of different process conceptualisations is tested stepwise. Among the conceptualisations are Hortonian overland flow in summer and winter, reduced infiltration capacity due to a partly frozen soil, and the relative effect of rainfall and snowmelt in the case of this frozen soil. The results show that the named processes can make a large difference on an event basis, especially the Hortonian overland flow in summer and the combination of rainfall and snowmelt on (partly) frozen soil in winter. However, the differences diminish when a modelled period of several years is evaluated based on standard metrics like the Nash-Sutcliffe Efficiency. These results emphasise on the one hand the importance of incorporating the effects of agriculture in conceptual models and on the other hand the importance of more event
Małolepszy, Zbigniew; Szynkaruk, Ewa
2015-04-01
same degrees of generalization shall be applied to uncertainties. However, the approach to uncertainty assessment and quantification may vary depending on the scale of the model. In small-scale regional and sub-regional models deterministic modelling methods are used, while stochastic algorithms can be applied for uncertainty modelling at large-scale multi-prospect and field models. We believe that 3D multiscale modelling describing the geological architecture with quantified structural uncertainties, presented on standard-deviation maps and grids, will allow us to outline exploration opportunities as well as to refine existing and build new conceptual models. As the tectonic setting of the area is the subject of long-term dispute, a model depicting both structures and gaps in geological knowledge at different resolutions should allow us to confirm some of the concepts related to the geological history of the Lublin Basin and to reject or modify the others.
Quantifying uncertainty, variability and likelihood for ordinary differential equation models
LENUS (Irish Health Repository)
Weisse, Andrea Y
2010-10-28
Abstract. Background: In many applications, ordinary differential equation (ODE) models are subject to uncertainty or variability in initial conditions and parameters. Both uncertainty and variability can be quantified in terms of a probability density function on the state and parameter space. Results: The partial differential equation that describes the evolution of this probability density function has a form that is particularly amenable to application of the well-known method of characteristics. The value of the density at some point in time is directly accessible by the solution of the original ODE extended by a single extra dimension (for the value of the density). This leads to simple methods for studying uncertainty, variability and likelihood, with significant advantages over more traditional Monte Carlo and related approaches, especially when studying regions with low probability. Conclusions: While such approaches based on the method of characteristics are common practice in other disciplines, their advantages for the study of biological systems have so far remained unrecognized. Several examples illustrate the performance and accuracy of the approach and its limitations.
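The "single extra dimension for the density" construction is concrete: for x' = f(x), the density along a characteristic obeys rho' = -rho div f, so one extra state variable carries the density value. A minimal sketch with a linear ODE (where the result can be checked analytically; the example system is ours, not from the paper):

```python
import numpy as np

# Example: f(x) = -a*x, so div f = -a. The trajectory contracts toward 0 and the
# density at the moving point grows as rho0 * exp(a*t) (mass concentrates at 0).
a = 0.8

def rhs(z):
    x, rho = z
    return np.array([-a * x, a * rho])   # [f(x), -rho * div f]

def rk4(z, dt, n):
    """Classical 4th-order Runge-Kutta on the extended (state, density) system."""
    for _ in range(n):
        k1 = rhs(z)
        k2 = rhs(z + dt / 2 * k1)
        k3 = rhs(z + dt / 2 * k2)
        k4 = rhs(z + dt * k3)
        z = z + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
    return z

# Start at x0 = 1 with density value rho0 = 0.5 there; integrate to t = 1.
x_t, rho_t = rk4(np.array([1.0, 0.5]), 1e-3, 1000)
```

Unlike Monte Carlo, a single extended trajectory yields the exact density value at its endpoint, which is what makes low-probability regions accessible.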
Understanding and quantifying the uncertainty of model parameters and predictions has gained more interest in recent years with the increased use of computational models in chemical risk assessment. Fully characterizing the uncertainty in risk metrics derived from linked quantita...
Zahmatkesh, Zahra; Karamouz, Mohammad; Nazif, Sara
2015-09-01
Simulation of the rainfall-runoff process in urban areas is of great importance considering the consequences and damages of extreme runoff events and floods. The first issue in flood hazard analysis is rainfall simulation. Large-scale climate signals have proven effective in rainfall simulation and prediction. In this study, an integrated scheme is developed for rainfall-runoff modeling considering different sources of uncertainty. This scheme includes three main steps: rainfall forecasting, rainfall-runoff simulation, and future runoff prediction. In the first step, data-driven models are developed and used to forecast rainfall using large-scale climate signals as rainfall predictors. Due to the strong effect of different sources of uncertainty on the output of hydrologic models, in the second step uncertainty associated with input data, model parameters, and model structure is incorporated in rainfall-runoff modeling and simulation. Three rainfall-runoff simulation models are developed for consideration of model conceptual (structural) uncertainty in real-time runoff forecasting. To analyze the uncertainty of the model structure, streamflows generated by alternative rainfall-runoff models are combined through a weighting method based on K-means clustering. Model parameter and input uncertainty are investigated using an adaptive Markov chain Monte Carlo method. Finally, the calibrated rainfall-runoff models are driven using the forecasted rainfall to predict future runoff for the watershed. The proposed scheme is employed in the case study of the Bronx River watershed, New York City. Results of the uncertainty analysis of rainfall-runoff modeling reveal that simultaneous estimation of model parameters and input uncertainty significantly changes the probability distribution of the model parameters. It is also observed that by combining the outputs of the hydrological models using the proposed clustering scheme, the accuracy of runoff simulation in the
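One way a clustering-based combination of alternative model outputs can work is sketched below: cluster the ensemble of simulated hydrographs with a tiny k-means, weight each cluster by how well its centroid fits the observations, and blend the cluster means. This is a hedged illustration; the paper's exact weighting scheme is not reproduced, and all data here are synthetic.

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic observed hydrograph and 30 simulated hydrographs from hypothetical
# alternative model structures (all values are illustrative).
t = np.arange(50)
obs = 10 * np.exp(-((t - 20) / 8.0) ** 2) + 1.0
sims = np.array([
    (10 + rng.normal(0, 2)) * np.exp(-((t - 20 - rng.normal(0, 3)) / 8.0) ** 2) + 1.0
    for _ in range(30)
])

def kmeans(X, k=3, iters=50):
    """Tiny k-means on whole hydrographs (rows of X)."""
    centers = X[rng.choice(len(X), k, replace=False)].copy()
    labels = np.zeros(len(X), dtype=int)
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels, centers

# Weight clusters by inverse RMSE of their centroid against the observations,
# then combine the per-cluster mean hydrographs.
labels, centers = kmeans(sims)
present = list(np.unique(labels))
rmse = np.array([np.sqrt(((centers[j] - obs) ** 2).mean()) for j in present])
w = (1.0 / rmse) / (1.0 / rmse).sum()
combined = sum(wj * sims[labels == j].mean(axis=0) for wj, j in zip(w, present))
```

By convexity of the squared error, the combined hydrograph can never be worse than the worst ensemble member, and in practice it tracks the observations more closely than most individual models.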
A conceptual model for translating omic data into clinical action
Directory of Open Access Journals (Sweden)
Timothy M Herr
2015-01-01
Full Text Available Genomic, proteomic, epigenomic, and other "omic" data have the potential to enable precision medicine, also commonly referred to as personalized medicine. The volume and complexity of omic data are rapidly overwhelming human cognitive capacity, requiring innovative approaches to translate such data into patient care. Here, we outline a conceptual model for the application of omic data in the clinical context, called "the omic funnel." This model parallels the classic "Data, Information, Knowledge, Wisdom pyramid" and adds context for how to move between each successive layer. Its goal is to allow informaticians, researchers, and clinicians to approach the problem of translating omic data from bench to bedside, by using discrete steps with clearly defined needs. Such an approach can facilitate the development of modular and interoperable software that can bring precision medicine into widespread practice.
Selection of Representative Models for Decision Analysis Under Uncertainty
Meira, Luis A. A.; Coelho, Guilherme P.; Santos, Antonio Alberto S.; Schiozer, Denis J.
2016-03-01
The decision-making process in oil fields includes a step of risk analysis associated with the uncertainties present in the variables of the problem. Such uncertainties lead to hundreds, even thousands, of possible scenarios that must be analyzed so that an effective production strategy can be selected. Given this high number of scenarios, a technique to reduce this set to a smaller, feasible subset of representative scenarios is imperative. The selected scenarios must be representative of the original set and also free of optimistic and pessimistic bias. This paper proposes an assisted methodology to identify representative models in oil fields. To this end, a mathematical function was first developed to model the representativeness of a subset of models with respect to the full set that characterizes the problem. Then, an optimization tool was implemented to identify the representative models of any problem, considering not only the cross-plots of the main output variables, but also the risk curves and the probability distribution of the attribute levels of the problem. The proposed technique was applied to two benchmark cases and the results, evaluated by experts in the field, indicate that the obtained solutions are richer than those identified by previously adopted manual approaches. The program bytecode is available upon request.
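A minimal sketch of the representativeness idea, under stated assumptions: here the representativeness function is taken to be the maximum gap between the empirical risk curves (CDFs) of the subset and the full scenario set, and a greedy search stands in for the authors' optimization tool. Both choices, and all names, are illustrative.

```python
import numpy as np

def risk_curve_gap(full, subset, grid):
    """Maximum gap between the empirical risk curves of two scenario sets."""
    cdf = lambda x, g: (x[:, None] <= g[None, :]).mean(axis=0)
    return np.abs(cdf(full, grid) - cdf(subset, grid)).max()

def select_representative(values, n_rep):
    """Greedily pick n_rep scenarios whose risk curve best tracks the full set.

    values: 1-D array of a key output (e.g. NPV) over all scenarios.
    Returns the chosen scenario indices and the final curve gap.
    """
    grid = np.sort(values)
    chosen = []
    remaining = list(range(len(values)))
    for _ in range(n_rep):
        best, best_gap = None, np.inf
        for i in remaining:
            cand = values[chosen + [i]]
            gap = risk_curve_gap(values, cand, grid)
            if gap < best_gap:
                best, best_gap = i, gap
        chosen.append(best)
        remaining.remove(best)
    return chosen, best_gap
```

A real application would score several output variables and the attribute-level distributions jointly, as the abstract describes; this sketch covers a single risk curve only.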
International Nuclear Information System (INIS)
Hammond, Glenn E.; Cygan, Randall Timothy
2007-01-01
Within reactive geochemical transport, several conceptual models exist for simulating sorption processes in the subsurface. Historically, the K_D approach has been the method of choice due to its ease of implementation within a reactive transport model and straightforward comparison with experimental data. However, for modeling complex sorption phenomena (e.g. sorption of radionuclides onto mineral surfaces), this approach does not systematically account for variations in location, time, or chemical conditions, and more sophisticated methods such as a surface complexation model (SCM) must be utilized. It is critical to determine which conceptual model to use, that is, when this variation becomes important to regulatory decisions. The geochemical transport tool GEOQUIMICO has been developed to assist in this decision-making process. GEOQUIMICO provides a user-friendly framework for comparing the accuracy and performance of sorption conceptual models. The tool currently supports the K_D and SCM conceptual models. The code is written in the object-oriented Java programming language to facilitate model development and improve code portability. The basic theory underlying geochemical transport and the sorption conceptual models noted above is presented in this report. Explanations are provided of how these physicochemical processes are implemented in GEOQUIMICO, and a brief verification study comparing GEOQUIMICO results to data found in the literature is given.
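For the linear K_D approach, the standard retardation-factor relation R = 1 + (rho_b / theta) K_D illustrates why it is so easy to implement in a transport code (a minimal sketch; the numeric values in the comment are illustrative, not from the report):

```python
def retardation_factor(kd_ml_g, bulk_density_g_cm3, porosity):
    """R = 1 + (rho_b / theta) * K_D for linear equilibrium sorption.

    kd_ml_g: distribution coefficient K_D in mL/g (= cm^3/g)
    bulk_density_g_cm3: dry bulk density rho_b in g/cm^3
    porosity: water-filled porosity theta (dimensionless)
    """
    return 1.0 + bulk_density_g_cm3 * kd_ml_g / porosity

# A non-sorbing tracer (K_D = 0) moves with the water (R = 1);
# K_D = 1 mL/g with rho_b = 1.5 g/cm^3 and theta = 0.3 gives R = 6,
# i.e. the solute front moves six times slower than the water.
```

The contrast with an SCM is that here K_D is a single constant, whereas a surface complexation model recomputes the sorbed fraction from the local chemical conditions at every cell and time step.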
Sustainable infrastructure system modeling under uncertainties and dynamics
Huang, Yongxi
potential risks caused by feedstock seasonality and demand uncertainty. Facility spatiality, time variation of feedstock yields, and demand uncertainty are integrated into a two-stage stochastic programming (SP) framework. In the study of Transitional Energy System Modeling under Uncertainty, a multistage stochastic dynamic programming model is established to optimize the process of building and operating fuel production facilities during the transition. Dynamics due to evolving technologies and societal changes, and uncertainty due to demand fluctuations, are the major issues to be addressed.
A study on modeling customer preferences for conceptual design
International Nuclear Information System (INIS)
Han, Soon Young; Seo, Seok Hoon; Choi, Hae Jin
2015-01-01
In this paper, we propose a concept selection method to evaluate the future market performance of concept candidates and to choose the best concept among them. The main and interaction effects of product performance factors, economic factors, and time on market performance are modeled using a Bayesian framework-based artificial neural network (ANN). The Bayesian framework is employed to measure the potential risk of a wrong selection when using a trained ANN model. Based on the measured uncertainty bounds in the predicted future market performance, the most promising and robust concept may be selected. To validate our concept-selection method, we employed an automobile concept selection problem in the U.S. market. Seventeen concepts were assumed to compete in 2013, and the future market share with error bounds was predicted using the trained model based on sales data
A conceptual ENSO model under realistic noise forcing
Directory of Open Access Journals (Sweden)
J. Saynisch
2006-01-01
Full Text Available We investigated the influence of atmospheric noise on the generation of interannual El Niño variability. To this end, we perturbed a conceptual ENSO delay model with surrogate wind-stress data generated from tropical wind-speed measurements. The effect of the additional stochastic forcing was studied for various parameter sets, including periodic and chaotic regimes. The evaluation was based on a comparison of spectra and amplitude-period relations between model output and measured sea surface temperature data. The additional forcing turned out to increase the variability of the model output in general. The noise-free model was unable to reproduce the observed spectral bandwidth for any choice of parameters. In contrast, the stochastically forced model is capable of producing a realistic spectrum. The weakly nonlinear regimes of the model exhibit a proportional relation between amplitude and period, matching the relation derived from measurement data. The chaotic regime, however, shows an inversely proportional relation. A stability analysis of the different regimes revealed that the spectra of the weakly nonlinear regimes are robust against slight parameter changes representing disregarded physical mechanisms, whereas the chaotic regime exhibits a very unstable realistic spectrum. We conclude that the model including stochastic forcing in a parameter range of moderate nonlinearity best matches the real conditions. This suggests that atmospheric noise plays an important role in the coupled tropical Pacific ocean-atmosphere system.
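A conceptual ENSO delay model of the kind described above can be sketched with a Suarez-Schopf-type delayed oscillator plus additive noise. This is an illustrative stand-in: the paper's actual model equations, parameter values, and surrogate wind-stress forcing are not reproduced here.

```python
import numpy as np

def delayed_oscillator(alpha=0.75, delay=0.6, noise=0.0,
                       dt=0.01, n=20000, seed=0):
    """Euler-Maruyama integration of the delayed oscillator

        dT/dt = T - T^3 - alpha * T(t - delay) + noise,

    a standard conceptual ENSO delay equation (nondimensional units).
    The delayed term represents the returning oceanic wave signal.
    """
    rng = np.random.default_rng(seed)
    lag = int(delay / dt)
    T = np.zeros(n)
    T[:lag + 1] = 0.1                     # constant history before t = 0
    for i in range(lag, n - 1):
        drift = T[i] - T[i] ** 3 - alpha * T[i - lag]
        T[i + 1] = T[i] + dt * drift + noise * np.sqrt(dt) * rng.standard_normal()
    return T
```

With noise = 0 the trajectory settles into the deterministic attractor for the chosen parameters; a nonzero noise amplitude broadens the variability, which is the qualitative effect the abstract reports for the stochastically forced model.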
Özkaynak, Halûk; Frey, H. Christopher; Burke, Janet; Pinder, Robert W.
Quantitative assessment of human exposures and health effects due to air pollution involves detailed characterization of the impacts of air quality on exposure and dose. A key challenge is to integrate these three components on a consistent spatial and temporal basis, taking into account linkages and feedbacks. The current state of practice for such assessments is to exercise emission, meteorology, air quality, exposure, and dose models separately, and to link them together by using the output of one model as input to the subsequent downstream model. Quantification of variability and uncertainty has been an important topic in the exposure assessment community for a number of years. Variability refers to differences in the value of a quantity (e.g., exposure) over time, space, or among individuals. Uncertainty refers to lack of knowledge regarding the true value of a quantity. An emerging challenge is how to quantify variability and uncertainty in integrated assessments over the source-to-dose continuum by considering contributions from individual as well as linked components. For a case study of fine particulate matter (PM2.5) in North Carolina during July 2002, we characterize the variability and uncertainty associated with each of the linked concentration, exposure, and dose models, and use a conceptual framework to quantify and evaluate the implications of coupled model uncertainties. We find that the resulting overall uncertainties due to the combined effects of both variability and uncertainty are smaller (usually by a factor of 3-4) than the crudely multiplied model-specific overall uncertainty ratios. Future research will need to examine the impact of potential dependencies among the model components by conducting a truly coupled modeling analysis.
Workshop on Model Uncertainty and its Statistical Implications
1988-01-01
In this book problems related to the choice of models in such diverse fields as regression, covariance structure, time series analysis and multinomial experiments are discussed. The emphasis is on the statistical implications for model assessment when the assessment is done with the same data that generated the model. This is a problem of long standing, notorious for its difficulty. Some contributors discuss this problem in an illuminating way. Others, and this is a truly novel feature, investigate systematically whether sample re-use methods like the bootstrap can be used to assess the quality of estimators or predictors in a reliable way given the initial model uncertainty. The book should prove to be valuable for advanced practitioners and statistical methodologists alike.
The uncertainty of modeled soil carbon stock change for Finland
Lehtonen, Aleksi; Heikkinen, Juha
2013-04-01
Countries should report the soil carbon stock changes of forests under the Kyoto Protocol. Under the Protocol one can omit reporting of a carbon pool by verifying that the pool is not a source of carbon, which is especially tempting for the soil pool. However, verifying that the soils of a nation are not a source of carbon in a given year appears nearly impossible. The Yasso07 model was parametrized against various decomposition data using an MCMC method. Soil carbon changes in Finland between 1972 and 2011 were simulated with the Yasso07 model using litter input data derived from the National Forest Inventory (NFI) and fellings time series. The uncertainties of the biomass models, litter turnover rates, NFI sampling, and the Yasso07 model were propagated with Monte Carlo simulations. Because of the biomass estimation methods, the uncertainties of the various litter input sources (e.g. living trees, natural mortality, and fellings) are strongly correlated with each other. We show how the original covariance matrices can be combined analytically, greatly reducing the number of simulated components. In doing the simulations we found that proper handling of correlations may be even more essential than accurate estimates of standard errors. As a preliminary result, we found that both southern and northern Finland were soil carbon sinks, with coefficients of variation (CV) between 10% and 25% when the model was driven with long-term constant weather data. When we applied annual weather data, soils were both sinks and sources of carbon and CVs varied from 10% to 90%. This implies that the success of soil carbon sink verification depends on the weather data applied with the models. Consequently, the IPCC should provide clear guidance on the weather data to be applied with soil carbon models and on soil carbon sink verification. In UNFCCC reporting, carbon sinks of forest biomass have typically been averaged over five years; a similar period for soil model weather data would be logical.
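The point about correlated litter-input uncertainties can be illustrated with a small Monte Carlo propagation (the standard errors and correlations below are illustrative, not the study's values): the variance of a sum of correlated components follows ones' * Sigma * ones, and dropping the off-diagonal terms understates it.

```python
import numpy as np

# Covariance of three litter-input components (e.g. living trees, natural
# mortality, fellings); strong positive correlations arise because they
# share the same underlying biomass models. Numbers are illustrative.
sd = np.array([0.10, 0.05, 0.08])
corr = np.array([[1.0, 0.8, 0.8],
                 [0.8, 1.0, 0.8],
                 [0.8, 0.8, 1.0]])
cov = np.outer(sd, sd) * corr

# Analytic variance of the summed input: Var(1'x) = 1' Sigma 1
var_analytic = cov.sum()

# Monte Carlo propagation of the same covariance
rng = np.random.default_rng(0)
draws = rng.multivariate_normal(np.zeros(3), cov, size=200_000)
var_mc = draws.sum(axis=1).var()
```

Treating the components as independent would give only the diagonal sum, which is well below the correlated total; this is the sense in which "proper handling of correlations may be even more essential than accurate estimates of standard errors".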
Precipitation forecasts and their uncertainty as input into hydrological models
Directory of Open Access Journals (Sweden)
M. Kobold
2005-01-01
Full Text Available Torrential streams and fast runoff are characteristic of most Slovenian rivers, and extensive damage is caused almost every year by rainstorms affecting different regions of Slovenia. Rainfall-runoff models, which are tools for runoff calculation, can be used for flood forecasting. In Slovenia, the lag time between rainfall and runoff is only a few hours, and on-line data are used only for now-casting. Predicted precipitation is necessary for flood forecasting some days ahead. The ECMWF (European Centre for Medium-Range Weather Forecasts) model gives general forecasts several days ahead, while more detailed precipitation data from the ALADIN/SI model are available two days ahead. Combining the weather forecasts with information on catchment conditions and a hydrological forecasting model can give advance warning of potential flooding, notwithstanding a certain degree of uncertainty in using precipitation forecasts based on meteorological models. Analysis of the sensitivity of the hydrological model to rainfall error has shown that the deviation in runoff is much larger than the rainfall deviation. Therefore, verification of predicted precipitation for large precipitation events was performed for the ECMWF model. Measured precipitation data were interpolated on a regular grid and compared with the results from the ECMWF model. The deviation of predicted precipitation from interpolated measurements is shown with the model bias, resulting from the inability of the model to predict the precipitation correctly, and with a bias due to the horizontal resolution of the model and the natural variability of precipitation.
Effects of petrophysical uncertainty in Bayesian hydrogeophysical inversion and model selection
Brunetti, Carlotta; Linde, Niklas
2017-04-01
Hydrogeophysical studies rely on petrophysical relationships that link geophysical properties to hydrological properties and state variables of interest; these relationships are frequently assumed to be perfect (i.e., a one-to-one relation). Using first-arrival traveltime data from a synthetic crosshole ground-penetrating radar (GPR) experiment, we investigate the role of petrophysical uncertainty in porosity estimates from Markov chain Monte Carlo (MCMC) inversion and in Bayes factors (i.e., ratios of the evidences, or marginal likelihoods, of two competing models) used in Bayesian model selection. The petrophysical errors (PE) are conceptualized by a correlated zero-mean multi-Gaussian field with horizontal anisotropy, with a resulting correlation coefficient of 0.8 between porosity and radar wave speed. We consider four different cases: (1) no PE are present (i.e., they are not used to generate the synthetic data) and they are not inferred in the MCMC inversion; (2) the PE are inferred for, but they are not present in the data; (3) the PE are present in the data, but not inferred for; and (4) the PE are present in the data and inferred for. To obtain appropriate acceptance ratios (i.e., between 35% and 45%), it is necessary to infer the PE as model parameters with a proper proposal distribution (simple Monte Carlo sampling of the petrophysical errors within Metropolis leads to very small acceptance rates). Case 4 provides consistent porosity field estimates (no bias), and the correlation coefficient between the "true" and posterior mean porosity field decreases from 0.9 for case 1 to 0.75. For case 2, we find that the variance of the posterior mean porosity field is too low and the porosity range is underestimated (i.e., some of the variance is accounted for by the inferred petrophysical uncertainty). Correspondingly, the porosity range is too wide for case 3, as it is used to account for petrophysical errors in the data. When comparing three different conceptual
A conceptual model to improve performance in virtual teams
Directory of Open Access Journals (Sweden)
Shopee Dube
2016-09-01
Full Text Available Background: The vast improvement in communication technologies and sophisticated project management tools, methods and techniques has allowed geographically and culturally diverse groups to operate and function in a virtual environment. To succeed in this virtual environment, where time and space are becoming increasingly irrelevant, organisations must define new ways of implementing initiatives. This virtual-environment phenomenon has brought about the formation of virtual project teams that allow organisations to harness the skills and know-how of the best resources, irrespective of their location. Objectives: The aim of this article was to investigate performance criteria and develop a conceptual model which can be applied to enhance the success of virtual project teams. There are no clear guidelines on the performance criteria for managing virtual project teams. Method: A qualitative research methodology was used in this article. The purpose of the content analysis was to explore the literature to understand the concept of performance in virtual project teams and to summarise the findings of the literature reviewed. Results: The research identified a set of performance criteria for virtual project teams as follows: leadership, trust, communication, team cooperation, reliability, motivation, comfort and social interaction. These were used to conceptualise the model. Conclusion: The conceptual model can be used in a holistic way to determine the overall performance of the virtual project team, but each factor can be analysed individually to determine its impact on the overall performance. Knowledge of the performance criteria for virtual project teams could aid project managers in enhancing the success of these teams and in taking a different approach to better manage and coordinate them.
Formulations of transport in catchment-scale conceptual models
De Vos, Lotte; Hrachowitz, Markus
2017-04-01
Standard conceptual hydrological models can rarely accommodate stream tracer dynamics at the catchment scale. They rely on the generation of runoff through the propagation of a pressure wave and do not account for the actual advective movement of particles. In recent years, different model frameworks have been developed to address this shortcoming. The difference between the frameworks lies in whether they are based on mixing coefficients or on storage age selection functions. Both methods have shown their ability to capture the stream chemistry response. It is, however, not clear how these distinct approaches compare to each other and to reality. The objective of this research is to clarify this matter. To achieve this, the hydrological and stream water chemistry response for a set of contrasting research catchments is modelled using both the mixing coefficient and the storage age selection approach. The results are analysed using the concept of transit times, where information on the fluxes and states in all model components is used to generate distributions that describe the age structure of water. By comparing the distributions generated by both methods and by evaluating the overall model performances, more insight is gained into how mixing occurs at the catchment scale. This contributes to the understanding of the integrated system dynamics of catchments, which is relevant for the development of good water quality models that accurately describe the integrated response of a hydrological system.
The structure of conceptual models with application to the Aespoe HRL project
International Nuclear Information System (INIS)
Olsson, Olle; Baeckblom, G.; Wikberg, P.; Gustafson, G.; Stanfors, R.
1994-05-01
In performance assessment a sequence of models is used to describe the function of the geological barrier. This report proposes a general structure and terminology for the description of these models. A model description consists of the following components: a conceptual model, which defines the geometric framework in which the problem is solved, the dimensions of the modelled volume, descriptions of the processes included in the model, and the boundary conditions; data which are introduced into the conceptual model; and a mathematical or numerical tool used to produce output data. Contrary to common practice in geohydrologic modelling, it is proposed that the term conceptual model be restricted to defining in what way the model is constructed, and that this be separated from any specific application of the conceptual model. Hence, the conceptual model should not include any specific data. 5 refs, 2 figs, 4 tabs
Geophysical Conceptual Model for Benthic Flux and Submarine Groundwater Discharge
King, J. N.
2010-12-01
Numerous investigators characterize benthic flux and submarine groundwater discharge (SGD) using a geochemical conceptual model that relies on the estimation of tracer fluxes into and out of a control volume. (Benthic flux is the rate of flow across the bed of any water body, per unit area of bed. Benthic flux is a vector that includes both discharge and recharge components. SGD is a benthic water discharge flux to a marine water body.) For the geochemical approach, benthic discharge flux or SGD is estimated by summing the flux of tracer into or out of the control volume---a water body or portion of a water body---and deducing that tracer deficiency within the control volume must be explained by SGD. Typically, estimated or measured fluxes include advection and mixing in surface water, diffusion, evasion across the air-water interface, production, and decay. The geochemical model, however, does not account for fluxes that do not transport tracer. For example, investigators found equivalent radon activities in the bed medium and surface water in the upper 30 cm of sediment in the Indian River Lagoon, Florida, in June and July 2003. At this location, a surface-gravity wave with a five-centimeter amplitude and one-second period in 0.5 m of water forced a 12-cm-per-day SGD. The radon tracer technique may not characterize SGD forced by the one-second wave due to the time scale of the wave, the absence of a radon activity gradient between bed medium and surface water, and the way the wave affects the flow field within the porous medium. A new geophysical conceptual model for benthic flux is proposed. The model parses benthic flux into components driven by individual forcing mechanisms. The model recognizes that benthic flux components may interact in a constructive or destructive manner, such that benthic flux generated by multiple forcing mechanisms at the same location may not be equivalent to the linear sum of benthic flux generated by single forcing mechanisms. Restated: the whole may be different from the sum of the parts.
Climate stability and sensitivity in some simple conceptual models
Energy Technology Data Exchange (ETDEWEB)
Bates, J. Ray [University College Dublin, Meteorology and Climate Centre, School of Mathematical Sciences, Dublin (Ireland)
2012-02-15
A theoretical investigation of climate stability and sensitivity is carried out using three simple linearized models based on the top-of-the-atmosphere energy budget. The simplest is the zero-dimensional model (ZDM) commonly used as a conceptual basis for climate sensitivity and feedback studies. The others are two-zone models with tropics and extratropics of equal area; in the first of these (Model A), the dynamical heat transport (DHT) between the zones is implicit, in the second (Model B) it is explicitly parameterized. It is found that the stability and sensitivity properties of the ZDM and Model A are very similar, both depending only on the global-mean radiative response coefficient and the global-mean forcing. The corresponding properties of Model B are more complex, depending asymmetrically on the separate tropical and extratropical values of these quantities, as well as on the DHT coefficient. Adopting Model B as a benchmark, conditions are found under which the validity of the ZDM and Model A as climate sensitivity models holds. It is shown that parameter ranges of physical interest exist for which such validity may not hold. The 2 x CO{sub 2} sensitivities of the simple models are studied and compared. Possible implications of the results for sensitivities derived from GCMs and palaeoclimate data are suggested. Sensitivities for more general scenarios that include negative forcing in the tropics (due to aerosols, inadvertent or geoengineered) are also studied. Some unexpected outcomes are found in this case. These include the possibility of a negative global-mean temperature response to a positive global-mean forcing, and vice versa. (orig.)
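The zero-dimensional model (ZDM) discussed above reduces to C dT/dt = F - lambda*T for the global-mean temperature anomaly, so the equilibrium response is F/lambda and stability requires lambda > 0. A minimal numerical sketch (the heat capacity and response coefficient are illustrative values, not the paper's):

```python
def zdm_response(forcing, lam, heat_cap=8.0, dt=0.05, years=400):
    """Integrate the zero-dimensional energy balance model

        C dT/dt = F - lambda * T

    (anomaly form) to near-equilibrium with forward Euler.
    forcing in W m^-2, lam in W m^-2 K^-1, heat_cap in W yr m^-2 K^-1.
    """
    steps = int(years / dt)
    T = 0.0
    for _ in range(steps):
        T += dt * (forcing - lam * T) / heat_cap
    return T

# Equilibrium sensitivity is simply F / lambda; stability requires lambda > 0.
F_2xCO2 = 3.7   # W m^-2, canonical 2xCO2 radiative forcing
lam = 1.2       # W m^-2 K^-1, illustrative global-mean radiative response
sensitivity = F_2xCO2 / lam
```

The two-zone Model B generalizes this by carrying separate tropical and extratropical budgets coupled through the dynamical heat transport, which is why its stability and sensitivity depend asymmetrically on the zonal parameters.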
Uncertainties in modeling hazardous gas releases for emergency response
Directory of Open Access Journals (Sweden)
Kathrin Baumann-Stanzer
2011-02-01
Full Text Available In the case of an accidental release of toxic gases, emergency responders need fast information about the affected area and the maximum impact. Hazard distances calculated with the models MET, ALOHA, BREEZE, TRACE and SAMS for scenarios with chlorine, ammonia and butane releases are compared in this study. The variations of the model results are measures of the uncertainties in source estimation and dispersion calculation. Model runs for different wind speeds, atmospheric stability and roughness lengths indicate the model sensitivity to these input parameters. In-situ measurements at two urban near-traffic sites are compared to results of the Integrated Nowcasting through Comprehensive Analysis (INCA) in order to quantify uncertainties in the meteorological input. The hazard zone estimates from the models vary by up to a factor of 4 due to different input requirements as well as different internal model assumptions. None of the models is found to be 'more conservative' than the others in all scenarios. INCA wind speeds are correlated to in-situ observations at two urban sites in Vienna with a correlation coefficient of 0.89. The standard deviations of the normal error distribution are 0.8 m s-1 in wind speed, on the order of 50 degrees in wind direction, up to 4 °C in air temperature and up to 10 % in relative humidity. The observed air temperature and humidity are well reproduced by INCA, with correlation coefficients of 0.96 to 0.99. INCA is therefore found to give a good representation of the local meteorological conditions. Besides real-time data, the INCA short-range forecast for the following hours may support the action planning of the first responders.
Uncertainties in modeling hazardous gas releases for emergency response
Energy Technology Data Exchange (ETDEWEB)
Baumann-Stanzer, Kathrin; Stenzel, Sirma [Zentralanstalt fuer Meteorologie und Geodynamik, Vienna (Austria)
2011-02-15
In the case of an accidental release of toxic gases, emergency responders need fast information about the affected area and the maximum impact. Hazard distances calculated with the models MET, ALOHA, BREEZE, TRACE and SAMS for scenarios with chlorine, ammonia and butane releases are compared in this study. The variations of the model results are measures of the uncertainties in source estimation and dispersion calculation. Model runs for different wind speeds, atmospheric stability and roughness lengths indicate the model sensitivity to these input parameters. In-situ measurements at two urban near-traffic sites are compared to results of the Integrated Nowcasting through Comprehensive Analysis (INCA) in order to quantify uncertainties in the meteorological input. The hazard zone estimates from the models vary by up to a factor of 4 due to different input requirements as well as different internal model assumptions. None of the models is found to be 'more conservative' than the others in all scenarios. INCA wind speeds are correlated to in-situ observations at two urban sites in Vienna with a correlation coefficient of 0.89. The standard deviations of the normal error distribution are 0.8 ms{sup -1} in wind speed, on the order of 50 degrees in wind direction, up to 4 C in air temperature and up to 10 % in relative humidity. The observed air temperature and humidity are well reproduced by INCA, with correlation coefficients of 0.96 to 0.99. INCA is therefore found to give a good representation of the local meteorological conditions. Besides real-time data, the INCA short-range forecast for the following hours may support the action planning of the first responders. (orig.)
Uncertainty Analysis of Multi-Model Flood Forecasts
Directory of Open Access Journals (Sweden)
Erich J. Plate
2015-12-01
Full Text Available This paper demonstrates, by means of a systematic uncertainty analysis, that the use of outputs from more than one model can significantly improve conditional forecasts of discharges or water stages, provided the models are structurally different. Discharge forecasts from two models and the actual forecasted discharge are assumed to form a three-dimensional joint probability density distribution (jpdf), calibrated on long time series of data. The jpdf is decomposed into conditional probability density distributions (cpdf) by means of Bayes' formula, as suggested and explored by Krzysztofowicz in a series of papers. In this paper his approach is simplified to optimize conditional forecasts for any set of two forecast models. Its application is demonstrated by means of models developed in a study of flood forecasting for station Stung Treng on the middle reach of the Mekong River in South-East Asia. Four different forecast models were used and pairwise combined: a forecast with no model, with a persistence model, with a regression model, and with a rainfall-runoff model. Working with cpdfs requires determination of the dependency among variables, for which linear regressions are required, as was done by Krzysztofowicz. His Bayesian approach, based on transforming observed probability distributions of discharges and forecasts into normal distributions, is also explored. Results obtained with his method for normal prior and likelihood distributions are identical to results from direct multiple regressions. Furthermore, it is shown that in the present case forecast accuracy is only marginally improved if Weibull-distributed basic data are converted into normally distributed variables.
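The equivalence noted above between the Bayesian normal-prior approach and direct multiple regression suggests a minimal sketch of a two-model combiner: fit least-squares weights for the two forecasts on a calibration record, then apply them to new forecasts. This is an illustration of the regression route only; function names and the synthetic data are assumptions.

```python
import numpy as np

def fit_combiner(f1, f2, obs):
    """Least-squares weights for y ~ a0 + a1*f1 + a2*f2,
    where f1, f2 are two structurally different forecast series
    and obs is the observed discharge over the calibration period."""
    A = np.column_stack([np.ones_like(f1), f1, f2])
    coef, *_ = np.linalg.lstsq(A, obs, rcond=None)
    return coef

def combine(coef, f1, f2):
    """Apply the fitted weights to new forecasts from the two models."""
    return coef[0] + coef[1] * f1 + coef[2] * f2
```

When the two models carry partly independent information, the combined forecast has a smaller error than either model alone, which is the paper's central claim.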
Modeling a Hybrid Microgrid Using Probabilistic Reconfiguration under System Uncertainties
Directory of Open Access Journals (Sweden)
Hadis Moradi
2017-09-01
Full Text Available A novel method for the day-ahead optimal operation of a hybrid microgrid system including fuel cells, photovoltaic arrays, a microturbine, and battery energy storage, in order to fulfill the required load demand, is presented in this paper. In the proposed system, the microgrid has access to the main utility grid in order to exchange power when required. Available municipal waste is utilized to produce the hydrogen required for running the fuel cells, and natural gas is used as the backup source. In the proposed method, an energy scheduling is introduced to optimize the generating unit power outputs for the next day, as well as the power flow with the main grid, in order to minimize the operational costs and the greenhouse gas emissions produced. Renewable energy generation and electric power consumption are both intermittent and unpredictable, and the uncertainty related to PV array power generation and power consumption has been considered in the next-day energy scheduling. In order to model the uncertainties, scenarios are produced by Monte Carlo (MC) simulations, and the microgrid's optimal energy scheduling is analyzed under the generated scenarios. In addition, the various scenarios created by MC simulations are applied to solve the unit commitment (UC) problem. The microgrid's day-ahead operation and emission costs are considered as the objective functions, and the particle swarm optimization algorithm is employed to solve the optimization problem. Overall, the proposed model is capable of minimizing the system costs, as well as the unfavorable influence of uncertainties on the microgrid's profit, by generating different scenarios.
Numerical solution of dynamic equilibrium models under Poisson uncertainty
DEFF Research Database (Denmark)
Posch, Olaf; Trimborn, Timo
2013-01-01
We propose a simple and powerful numerical algorithm to compute the transition process in continuous-time dynamic equilibrium models with rare events. In this paper we transform the dynamic system of stochastic differential equations into a system of functional differential equations of the retarded type. We apply the Waveform Relaxation algorithm, i.e., we provide a guess of the policy function and solve the resulting system of (deterministic) ordinary differential equations by standard techniques. For parametric restrictions, analytical solutions to the stochastic growth model and a novel solution to Lucas' endogenous growth model under Poisson uncertainty are used to compute the exact numerical error. We show how (potential) catastrophic events such as rare natural disasters substantially affect the economic decisions of households.
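The Waveform Relaxation idea (freeze one trajectory as a guess, solve the other equation against it, and sweep until the coupled fixed point is reached) can be sketched on a toy linear system. This illustrates the algorithm only, not the paper's equilibrium model; the system and all names are assumptions.

```python
import numpy as np

def waveform_relaxation(T=5.0, dt=0.001, sweeps=20):
    """Gauss-Seidel waveform relaxation for the coupled system
        x' = -x + y,   y' = -y + 0.5*x,   x(0) = 1, y(0) = 0.
    Each sweep solves one equation with the other variable frozen
    at its most recent waveform (the 'guess')."""
    n = int(T / dt) + 1
    x = np.ones(n)        # initial guess: constant waveforms
    y = np.zeros(n)
    for _ in range(sweeps):
        for i in range(n - 1):                 # solve for x with y frozen
            x[i + 1] = x[i] + dt * (-x[i] + y[i])
        for i in range(n - 1):                 # solve for y with the new x
            y[i + 1] = y[i] + dt * (-y[i] + 0.5 * x[i])
    return x, y

def coupled_euler(T=5.0, dt=0.001):
    """Reference: integrate the fully coupled system directly."""
    n = int(T / dt) + 1
    x, y = np.ones(n), np.zeros(n)
    for i in range(n - 1):
        x[i + 1] = x[i] + dt * (-x[i] + y[i])
        y[i + 1] = y[i] + dt * (-y[i] + 0.5 * x[i])
    return x, y
```

On a finite time interval the sweeps converge superlinearly to the solution of the coupled discretization, which is what makes the method attractive for policy-function iteration in the models the abstract describes.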
Plasticity models of material variability based on uncertainty quantification techniques
Energy Technology Data Exchange (ETDEWEB)
Jones, Reese E. [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Rizzi, Francesco [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Boyce, Brad [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Templeton, Jeremy Alan [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Ostien, Jakob [Sandia National Lab. (SNL-CA), Livermore, CA (United States)
2017-11-01
The advent of fabrication techniques like additive manufacturing has focused attention on the considerable variability of material response due to defects and other micro-structural aspects. This variability motivates the development of an enhanced design methodology that incorporates inherent material variability to provide robust predictions of performance. In this work, we develop plasticity models capable of representing the distribution of mechanical responses observed in experiments using traditional plasticity models of the mean response and recently developed uncertainty quantification (UQ) techniques. Lastly, we demonstrate that the new method provides predictive realizations that are superior to more traditional ones, and how these UQ techniques can be used in model selection and assessing the quality of calibrated physical parameters.
How well can we forecast future model error and uncertainty by mining past model performance data
Solomatine, Dimitri
2016-04-01
Consider a hydrological model Y(t) = M(X(t), P), where X = vector of inputs; P = vector of parameters; Y = model output (typically flow); t = time. In cases where there is enough past data on model M's performance, it is possible to use this data to build a (data-driven) model EC of model M's error. This model EC will be able to forecast the error E when a new input X is fed into model M; then, by subtracting E from the model prediction Y, a better estimate of Y can be obtained. Model EC is usually called the error corrector (in meteorology, a bias corrector). However, we may go further in characterizing model deficiencies, and instead of using the error (a real value) we may consider a more sophisticated characterization, namely a probabilistic one. So instead of a model EC of model M's error, it is also possible to build a model U of model M's uncertainty; if uncertainty is described as the model error distribution D, this model will calculate its properties - mean, variance, other moments, and quantiles. The general form of this model could be: D = U(RV), where RV = vector of relevant variables having influence on model uncertainty (to be identified e.g. by mutual information analysis); D = vector of variables characterizing the error distribution (typically, two or more quantiles). There is one aspect which is not always explicitly mentioned in uncertainty analysis work. In our view it is important to distinguish the following main types of model uncertainty: 1. The residual uncertainty of models. In this case the model parameters and/or model inputs are considered to be fixed (deterministic), i.e. the model is considered to be optimal (calibrated) and deterministic. Model error is considered as the manifestation of uncertainty. If there is enough past data about the model errors (i.e. its uncertainty), it is possible to build a statistical or machine learning model of uncertainty trained on this data. Here the following methods can be mentioned: (a) quantile regression (QR
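A minimal sketch of the two constructions described, under the assumption that model M is a simple synthetic function: an error corrector EC (here just the mean past error, the simplest possible bias estimate) and an uncertainty model U reporting empirical quantiles of the error distribution D.

```python
# Toy error corrector (EC) and uncertainty model (U) trained on past
# performance data of a stand-in model M.  M, "truth", and all numbers are
# synthetic; a real EC/U would be a regression on relevant variables RV.
import random

random.seed(0)

def model_M(x):
    return 2.0 * x                                   # deterministic model

def truth(x):
    return 2.0 * x + 1.0 + random.gauss(0, 0.5)      # biased, noisy reality

# Past performance data: errors E = truth - M on historical inputs.
past_errors = [truth(x) - model_M(x) for x in range(200)]

def error_corrector():
    """EC: mean past error (a constant bias estimate)."""
    return sum(past_errors) / len(past_errors)

def uncertainty_quantiles(qs=(0.05, 0.5, 0.95)):
    """U: empirical quantiles of the error distribution D."""
    s = sorted(past_errors)
    return {q: s[int(q * (len(s) - 1))] for q in qs}

x_new = 300
corrected = model_M(x_new) + error_corrector()
band = uncertainty_quantiles()
print(f"corrected prediction: {corrected:.2f}")
print(f"90% error band: [{band[0.05]:.2f}, {band[0.95]:.2f}]")
```

Replacing the constant bias with a regression on RV (and the unconditional quantiles with quantile regression) gives the data-driven EC and U models the abstract describes.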
Recruiting Transcultural Qualitative Research Participants: A Conceptual Model
Directory of Open Access Journals (Sweden)
Phyllis Eide
2005-06-01
Full Text Available Working with diverse populations poses many challenges to the qualitative researcher who is a member of the dominant culture. Traditional methods of recruitment and selection (such as flyers and advertisements) are often unproductive, leading to missed contributions from potential participants who were not recruited and to researcher frustration. In this article, the authors explore recruitment issues related to the concept of personal knowing based on experiences with Aboriginal Hawai'ian and Micronesian populations, wherein knowing and being known are crucial to successful recruitment of participants. They present a conceptual model that incorporates key concepts of knowing the other, cultural context, and trust to guide other qualitative transcultural researchers. They also describe challenges, implications, and concrete suggestions for recruitment of participants.
CONCEPTUAL MODEL OF CONSUMERS TRUST TO ONLINE SHOPS
Directory of Open Access Journals (Sweden)
T. Dubovyk
2014-06-01
Full Text Available In the article, a conceptual model of the major factors that influence consumer trust in an online shop is presented: reliability of the online store, a reliable information system for making purchases online, factors of ethical interactiveness (security, third-party certification), internet-marketing communications of the online shop, and other factors, divided between trade enterprises and consumers (demographic variables, psychological perception of internet-marketing communications, and experience of purchasing goods on the Internet). The degree of an individual customer's trust propensity reflects personality traits, culture and previous experience. Signs of consumer trust are implemented through elements of the online shop's site: graphic design, structural design, content design, and design harmonized with the perception of the target audience.
Assessment of private hospital portals: A conceptual model
Directory of Open Access Journals (Sweden)
Mehdi Alipour-Hafezi
2016-01-01
Full Text Available Introduction: Hospital portals, as the first virtual entry point, play an important role in connecting people with a hospital and in presenting the hospital's virtual services. The main purpose of this article was to suggest a conceptual model to improve Tehran private hospital portals. The suggested model can be used by all health portals in similar circumstances and by health portals that are in progress. Method: This is a practical study using an evaluative survey research method. The research population includes all private hospital portals in Tehran (34 portals) and ten top international hospital portals. The data-gathering tool was a researcher-made checklist including 14 criteria and 77 sub-criteria with weighted scores. Objective observation with this checklist was used to gather information. Descriptive statistics were used to analyze the data, and tables and graphs were used to present the organized data. Data were also analyzed using an independent t-test. A conceptual modeling technique was used to design the model, and a demonstration method was used to evaluate the proposed model. SPSS statistical software was used to perform the tests. Results: The comparative study between the two groups of portals (TPH and WTH) on the 14 main criteria showed the following t-test values: contact information 0.862, portal page specification -1.378, page design -1.527, updating pages -0.322, general information and access roads -3.161, public services -7.302, patient services -4.154, patient data -8.703, research and education -9.155, public relations -3.009, page technical specifications -4.726, telemedicine -7.488, pharmaceutical services -6.183, and financial services -2.782. Finally, the findings demonstrated that Tehran private hospital portals were favorable on the contact information criterion; page design was relatively favorable; page technical
Modelling in Primary School: Constructing Conceptual Models and Making Sense of Fractions
Shahbari, Juhaina Awawdeh; Peled, Irit
2017-01-01
This article describes sixth-grade students' engagement in two model-eliciting activities offering students the opportunity to construct mathematical models. The findings show that students utilized their knowledge of fractions including conceptual and procedural knowledge in constructing mathematical models for the given situations. Some students…
Multi-Fidelity Uncertainty Propagation for Cardiovascular Modeling
Fleeter, Casey; Geraci, Gianluca; Schiavazzi, Daniele; Kahn, Andrew; Marsden, Alison
2017-11-01
Hemodynamic models are successfully employed in the diagnosis and treatment of cardiovascular disease with increasing frequency. However, their widespread adoption is hindered by our inability to account for uncertainty stemming from multiple sources, including boundary conditions, vessel material properties, and model geometry. In this study, we propose a stochastic framework which leverages three cardiovascular model fidelities: 3D, 1D and 0D models. 3D models are generated from patient-specific medical imaging (CT and MRI) of aortic and coronary anatomies using the SimVascular open-source platform, with fluid structure interaction simulations and Windkessel boundary conditions. 1D models consist of a simplified geometry automatically extracted from the 3D model, while 0D models are obtained from equivalent circuit representations of blood flow in deformable vessels. Multi-level and multi-fidelity estimators from Sandia's open-source DAKOTA toolkit are leveraged to reduce the variance in our estimated output quantities of interest while maintaining a reasonable computational cost. The performance of these estimators in terms of computational cost reductions is investigated for a variety of output quantities of interest, including global and local hemodynamic indicators. Sandia National Labs is a multimission laboratory managed and operated by NTESS, LLC, for the U.S. DOE under contract DE-NA0003525. Funding for this project provided by NIH-NIBIB R01 EB018302.
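A two-fidelity control-variate Monte Carlo estimator, the basic building block of the multi-level/multi-fidelity estimators mentioned, can be sketched with cheap analytic stand-ins for the high- and low-fidelity models; the model functions, sample sizes, and input distribution below are illustrative assumptions, not the cardiovascular models of the study.

```python
# Two-fidelity control-variate Monte Carlo: a few expensive high-fidelity
# evaluations plus many cheap correlated low-fidelity evaluations reduce the
# variance of the estimated mean quantity of interest.
import random
import statistics

random.seed(1)

def model_hi(z):          # stand-in for the expensive high-fidelity model
    return z ** 2 + 0.1 * z

def model_lo(z):          # cheap surrogate, strongly correlated with model_hi
    return z ** 2

N_hi, N_lo = 100, 10_000
inputs_hi = [random.gauss(0, 1) for _ in range(N_hi)]
inputs_lo = inputs_hi + [random.gauss(0, 1) for _ in range(N_lo - N_hi)]

y_hi = [model_hi(z) for z in inputs_hi]
y_lo_paired = [model_lo(z) for z in inputs_hi]
y_lo_all = [model_lo(z) for z in inputs_lo]

mean_hi = statistics.mean(y_hi)
mean_lo = statistics.mean(y_lo_paired)
# Optimal control-variate weight: cov(hi, lo) / var(lo).
cov = sum((a - mean_hi) * (b - mean_lo)
          for a, b in zip(y_hi, y_lo_paired)) / (N_hi - 1)
alpha = cov / statistics.variance(y_lo_paired)
# Corrected estimator of E[hi].
estimate = mean_hi + alpha * (statistics.mean(y_lo_all) - mean_lo)
print(f"multi-fidelity estimate of E[hi]: {estimate:.3f}")
```

Here the exact mean of the high-fidelity stand-in is 1.0; the correction term cancels most of the sampling error of the 100 expensive evaluations, which is the cost-reduction mechanism the abstract refers to.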
A model for optimization of process integration investments under uncertainty
International Nuclear Information System (INIS)
Svensson, Elin; Stroemberg, Ann-Brith; Patriksson, Michael
2011-01-01
The long-term economic outcome of energy-related industrial investment projects is difficult to evaluate because of uncertain energy market conditions. In this article, a general, multistage, stochastic programming model for the optimization of investments in process integration and industrial energy technologies is proposed. The problem is formulated as a mixed-binary linear programming model where uncertainties are modelled using a scenario-based approach. The objective is to maximize the expected net present value of the investments, which enable heat savings and decreased energy imports or increased energy exports at an industrial plant. The proposed modelling approach enables long-term planning of industrial, energy-related investments through the simultaneous optimization of immediate and later decisions. The stochastic programming approach is also suitable for modelling possibly complex process integration constraints. The general model formulation presented here is a suitable basis for more specialized case studies dealing with optimization of investments in energy efficiency. -- Highlights: → Stochastic programming approach to long-term planning of process integration investments. → Extensive mathematical model formulation. → Multi-stage investment decisions and scenario-based modelling of uncertain energy prices. → Results illustrate how investments made now affect later investment and operation opportunities. → Approach for evaluation of robustness with respect to variations in probability distribution.
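The scenario-based formulation can be illustrated in heavily simplified form: binary investment decisions chosen to maximize expected NPV over energy-price scenarios. The multistage structure of the paper's model is collapsed here to a single stage, the problem is solved by enumeration rather than a MIP solver, and all prices, probabilities, and costs are invented.

```python
# Toy scenario-based investment selection: maximize expected NPV of heat-
# saving investments over discrete energy-price scenarios.
from itertools import product

scenarios = [                 # (probability, energy price in EUR/MWh)
    (0.3, 20.0),
    (0.5, 40.0),
    (0.2, 80.0),
]
investments = [               # (capex in EUR, annual heat savings in MWh)
    (100_000.0, 1_000.0),
    (250_000.0, 2_000.0),
]
YEARS, RATE = 10, 0.07
annuity = sum(1 / (1 + RATE) ** t for t in range(1, YEARS + 1))

best_npv, best_plan = float("-inf"), None
for plan in product([0, 1], repeat=len(investments)):       # enumerate binaries
    capex = sum(b * c for b, (c, _) in zip(plan, investments))
    saving = sum(b * s for b, (_, s) in zip(plan, investments))
    # Probability-weighted value of the saved energy, discounted over YEARS.
    expected_revenue = sum(p * price * saving for p, price in scenarios) * annuity
    npv = expected_revenue - capex
    if npv > best_npv:
        best_npv, best_plan = npv, plan

print(f"optimal plan {best_plan}, expected NPV {best_npv:,.0f} EUR")
```

The real model additionally ties later (recourse) decisions to each scenario branch, which is what makes the multistage stochastic program informative about how decisions now affect opportunities later.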
An empirical conceptual gully evolution model for channelled sea cliffs
Leyland, Julian; Darby, Stephen E.
2008-12-01
Incised coastal channels are a specific form of incised channel that are found in locations where stream channels flowing to cliffed coasts have the excess energy required to cut down through the cliff to reach the outlet water body. The southern coast of the Isle of Wight, southern England, comprises soft cliffs that vary in height between 15 and 100 m and which are retreating at rates ≤ 1.5 m a⁻¹, due to a combination of wave erosion and landslides. In several locations, river channels have cut through the cliffs to create deeply (≤ 45 m) incised gullies, known locally as 'Chines'. The Chines are unusual in that their formation is associated with dynamic shoreline encroachment during a period of rising sea-level, whereas existing models of incised channel evolution emphasise the significance of base level lowering. This paper develops a conceptual model of Chine evolution by applying space for time substitution methods using empirical data gathered from Chine channel surveys and remotely sensed data. The model identifies a sequence of evolutionary stages, which are classified based on a suite of morphometric indices and associated processes. The extent to which individual Chines are in a state of growth or decay is estimated by determining the relative rates of shoreline retreat and knickpoint recession, the former via analysis of historical aerial images and the latter through the use of a stream power erosion model.
Vulnerability Assessment Models to Drought: Toward a Conceptual Framework
Directory of Open Access Journals (Sweden)
Kiumars Zarafshani
2016-06-01
Full Text Available Drought is regarded as a slow-onset natural disaster that causes inevitable damage to water resources and to farm life. Currently, crisis management is the basis of drought mitigation plans; however, studies thus far indicate that effective drought management strategies are based on risk management. As a primary tool in mitigating the impact of drought, vulnerability assessment can be used as a benchmark in drought mitigation plans and to enhance farmers' ability to cope with drought. Moreover, literature pertaining to drought has focused extensively on its impact, awarding only limited attention to vulnerability assessment as a tool. Therefore, the main purpose of this paper is to develop a conceptual framework for designing a vulnerability model in order to assess farmers' level of vulnerability before, during and after the onset of drought. Use of this drought vulnerability model would aid disaster relief workers by enhancing the adaptive capacity of farmers facing the impacts of drought. The paper starts with the definition of vulnerability and outlines the different vulnerability frameworks developed thus far. It then identifies various approaches to vulnerability assessment and finally offers the most appropriate model. The paper concludes that the introduced model can guide drought mitigation programs in the countries most impacted by drought.
A CONCEPTUAL MODEL FOR IMPROVED PROJECT SELECTION AND PRIORITISATION
Directory of Open Access Journals (Sweden)
P. J. Viljoen
2012-01-01
Full Text Available
ENGLISH ABSTRACT: Project portfolio management processes are often designed and operated as a series of stages (or project phases) and gates. However, the flow of such a process is often slow, characterised by queues waiting for a gate decision and by repeated work from previous stages waiting for additional information or for re-processing. In this paper the authors propose a conceptual model that applies supply chain and constraint management principles to the project portfolio management process. An advantage of the proposed model is that it provides the ability to select and prioritise projects without undue changes to project schedules. This should result in faster flow through the system.
AFRIKAANSE OPSOMMING: Prosesse om portefeuljes van projekte te bestuur word normaalweg ontwerp en bedryf as ’n reeks fases en hekke. Die vloei deur so ’n proses is dikwels stadig en word gekenmerk deur toue wat wag vir besluite by die hekke en ook deur herwerk van vorige fases wat wag vir verdere inligting of vir herprosessering. In hierdie artikel word ‘n konseptuele model voorgestel. Die model berus op die beginsels van voorsieningskettings sowel as van beperkingsbestuur, en bied die voordeel dat projekte geselekteer en geprioritiseer kan word sonder onnodige veranderinge aan projekskedules. Dit behoort te lei tot versnelde vloei deur die stelsel.
BIM-Enabled Conceptual Modelling and Representation of Building Circulation
Directory of Open Access Journals (Sweden)
Jin Kook Lee
2014-08-01
Full Text Available This paper describes how a building information modelling (BIM)-based approach for building circulation enables us to change the process of building design in terms of its computational representation and processes, focusing on the conceptual modelling and representation of circulation within buildings. BIM has been designed for use by several BIM authoring tools, in particular with the widely known interoperable industry foundation classes (IFCs), which follow an object-oriented data modelling methodology. Advances in BIM authoring tools, using space objects and their relations defined in an IFC's schema, have made it possible to model, visualize and analyse circulation within buildings prior to their construction. Agent-based circulation has long been an interdisciplinary topic of research across several areas, including design computing, computer science, architectural morphology, human behaviour and environmental psychology. Such conventional approaches to building circulation are centred on navigational knowledge about built environments, and represent specific circulation paths and regulations. This paper, however, places emphasis on the use of ‘space objects’ in BIM-enabled design processes rather than on circulation agents, the latter of which are not defined in the IFCs' schemas. By introducing and reviewing some associated research and projects, this paper also surveys how such a circulation representation is applicable to the analysis of building circulation-related rules.
High-Throughput Thermodynamic Modeling and Uncertainty Quantification for ICME
Otis, Richard A.; Liu, Zi-Kui
2017-05-01
One foundational component of the integrated computational materials engineering (ICME) and Materials Genome Initiative is the computational thermodynamics based on the calculation of phase diagrams (CALPHAD) method. The CALPHAD method pioneered by Kaufman has enabled the development of thermodynamic, atomic mobility, and molar volume databases of individual phases in the full space of temperature, composition, and sometimes pressure for technologically important multicomponent engineering materials, along with sophisticated computational tools for using the databases. In this article, our recent efforts will be presented in terms of developing new computational tools for high-throughput modeling and uncertainty quantification based on high-throughput, first-principles calculations and the CALPHAD method along with their potential propagations to downstream ICME modeling and simulations.
Uncertainty Estimation in SiGe HBT Small-Signal Modeling
DEFF Research Database (Denmark)
Masood, Syed M.; Johansen, Tom Keinicke; Vidkjær, Jens
2005-01-01
An uncertainty estimation and sensitivity analysis is performed on multi-step de-embedding for SiGe HBT small-signal modeling. The uncertainty estimation in combination with uncertainty model for deviation in measured S-parameters, quantifies the possible error value in de-embedded two-port param...
Innovative supply chain optimization models with multiple uncertainty factors
DEFF Research Database (Denmark)
Choi, Tsan Ming; Govindan, Kannan; Li, Xiang
2017-01-01
Uncertainty is an inherent factor that affects all dimensions of supply chain activities. In today’s business environment, initiatives to deal with one specific type of uncertainty might not be effective since other types of uncertainty factors and disruptions may be present. These factors relate...
Uncertainty analysis in WWTP model applications: a critical discussion using an example from design
DEFF Research Database (Denmark)
Sin, Gürkan; Gernaey, Krist; Neumann, Marc B.
2009-01-01
This study focuses on uncertainty analysis of WWTP models and analyzes the issue of framing and how it affects the interpretation of uncertainty analysis results. As a case study, the prediction of uncertainty involved in model-based design of a wastewater treatment plant is studied. The Monte...... of design performance criteria differs significantly. The implication for the practical applications of uncertainty analysis in the wastewater industry is profound: (i) as the uncertainty analysis results are specific to the framing used, the results must be interpreted within the context of that framing...... to stoichiometric, biokinetic and influent parameters; (2) uncertainty due to hydraulic behaviour of the plant and mass transfer parameters; (3) uncertainty due to the combination of (1) and (2). The results demonstrate that depending on the way the uncertainty analysis is framed, the estimated uncertainty...
TECHNICAL PRODUCT RISK ASSESSMENT: STANDARDS, INTEGRATION IN THE ERM MODEL AND UNCERTAINTY MODELING
Directory of Open Access Journals (Sweden)
Mirko Djapic
2016-03-01
Full Text Available Through introducing the New Approach to technical harmonization and standardization, the European Union has accomplished a breakthrough in the field of technical product safety and conformity assessment, integrating product safety requirements into the process of product development. This is achieved by quantifying risk levels with the aim of determining the scope of the required safety measures and systems. The theory of probability is used as a tool for modeling uncertainties in the assessment of that risk. In the last forty years, however, new mathematical theories have been developed that are better at modeling uncertainty when there is not enough data about uncertain events, which is usually the case in product development. Bayesian networks, based on modeling of subjective probability, and evidence networks, based on the Dempster-Shafer theory of belief functions, have proved to be excellent tools for modeling uncertainty when we do not have enough information about all aspects of the events.
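Dempster's rule of combination, the building block of the evidence networks mentioned above, can be sketched for two mass functions over a tiny frame of discernment; the frame and the mass assignments below are illustrative assumptions, not values from the paper.

```python
# Dempster's rule of combination for two belief-mass functions over the
# frame {safe, unsafe}.  Mass not assigned to a singleton (i.e. assigned to
# the whole frame) expresses ignorance, which probabilities cannot.
from itertools import product

FRAME = frozenset({"safe", "unsafe"})

m1 = {frozenset({"safe"}): 0.6, FRAME: 0.4}                       # source 1
m2 = {frozenset({"safe"}): 0.5, frozenset({"unsafe"}): 0.2,
      FRAME: 0.3}                                                  # source 2

def combine(m1, m2):
    """Dempster's rule: intersect focal sets, renormalize by 1 - conflict."""
    raw, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            raw[inter] = raw.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb
    return {s: w / (1.0 - conflict) for s, w in raw.items()}

m12 = combine(m1, m2)
for subset, mass in sorted(m12.items(), key=lambda kv: -kv[1]):
    print(sorted(subset), round(mass, 3))
```

The combined mass on {safe} exceeds either source's individual mass, while the residual mass on the full frame keeps track of the remaining ignorance.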
Impact of dose-distribution uncertainties on rectal ntcp modeling I: Uncertainty estimates
International Nuclear Information System (INIS)
Fenwick, John D.; Nahum, Alan E.
2001-01-01
A trial of nonescalated conformal versus conventional radiotherapy treatment of prostate cancer has been carried out at the Royal Marsden NHS Trust (RMH) and Institute of Cancer Research (ICR), demonstrating a significant reduction in the rate of rectal bleeding reported for patients treated using the conformal technique. The relationship between planned rectal dose-distributions and incidences of bleeding has been analyzed, showing that the rate of bleeding falls significantly as the extent of the rectal wall receiving a planned dose-level of more than 57 Gy is reduced. Dose-distributions delivered to the rectal wall over the course of radiotherapy treatment inevitably differ from planned distributions, due to sources of uncertainty such as patient setup error, rectal wall movement and variation in the absolute rectal wall surface area. In this paper estimates of the differences between planned and treated rectal dose-distribution parameters are obtained for the RMH/ICR nonescalated conformal technique, working from a distribution of setup errors observed during the RMH/ICR trial, movement data supplied by Lebesque and colleagues derived from repeat CT scans, and estimates of rectal circumference variations extracted from the literature. Setup errors and wall movement are found to cause only limited systematic differences between mean treated and planned rectal dose-distribution parameter values, but introduce considerable uncertainties into the treated values of some dose-distribution parameters: setup errors lead to 22% and 9% relative uncertainties in the highly dosed fraction of the rectal wall and the wall average dose, respectively, with wall movement leading to 21% and 9% relative uncertainties. Estimates obtained from the literature of the uncertainty in the absolute surface area of the distensible rectal wall are of the order of 13%-18%. In a subsequent paper the impact of these uncertainties on analyses of the relationship between incidences of bleeding
Quantile uncertainty and value-at-risk model risk.
Alexander, Carol; Sarabia, José María
2012-08-01
This article develops a methodology for quantifying model risk in quantile risk estimates. The application of quantile estimates to risk assessment has become common practice in many disciplines, including hydrology, climate change, statistical process control, insurance and actuarial science, and the uncertainty surrounding these estimates has long been recognized. Our work is particularly important in finance, where quantile estimates (called Value-at-Risk) have been the cornerstone of banking risk management since the mid 1980s. A recent amendment to the Basel II Accord recommends additional market risk capital to cover all sources of "model risk" in the estimation of these quantiles. We provide a novel and elegant framework whereby quantile estimates are adjusted for model risk, relative to a benchmark which represents the state of knowledge of the authority that is responsible for model risk. A simulation experiment in which the degree of model risk is controlled illustrates how to quantify Value-at-Risk model risk and compute the required regulatory capital add-on for banks. An empirical example based on real data shows how the methodology can be put into practice, using only two time series (daily Value-at-Risk and daily profit and loss) from a large bank. We conclude with a discussion of potential applications to nonfinancial risks. © 2012 Society for Risk Analysis.
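A hedged sketch of the core quantity involved: an empirical (benchmark) quantile of daily P&L compared against a parametric-model quantile, with the shortfall taken as a capital add-on. The simulated fat-tailed P&L series and the normality assumption for the bank's model are illustrative stand-ins, not the paper's methodology.

```python
# Quantile model risk in miniature: a normal-model 99% VaR versus an
# empirical benchmark quantile of the same (fat-tailed) P&L series.
import random
import statistics

random.seed(7)
# Simulated daily P&L: mixture of calm days and rarer stressed days.
pnl = [random.gauss(0, 1) if random.random() < 0.95 else random.gauss(0, 4)
       for _ in range(2000)]

alpha = 0.01
# "Bank's model": normality assumption, VaR = -(mu + z_alpha * sigma).
mu, sigma = statistics.mean(pnl), statistics.stdev(pnl)
z_99 = -2.326  # standard normal 1% quantile
var_model = -(mu + z_99 * sigma)

# Benchmark: empirical 1% quantile of the observed P&L.
var_bench = -sorted(pnl)[int(alpha * len(pnl))]

add_on = max(0.0, var_bench - var_model)
print(f"model VaR {var_model:.2f}, benchmark VaR {var_bench:.2f}, "
      f"add-on {add_on:.2f}")
```

Because the simulated P&L is fat-tailed, the normal model tends to understate the 99% loss quantile, and the difference against the benchmark plays the role of the regulatory capital add-on discussed in the abstract.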
Testing a Conceptual Change Model Framework for Visual Data
Finson, Kevin D.; Pedersen, Jon E.
2015-01-01
An emergent data analysis technique was employed to test the veracity of a conceptual framework constructed around visual data use and instruction in science classrooms. The framework incorporated all five key components Vosniadou (2007a, 2007b) described as existing in a learner's schema: framework theory, presuppositions, conceptual domains,…
Quantifying uncertainty in LCA-modelling of waste management systems
DEFF Research Database (Denmark)
Clavreul, Julie; Guyonnet, D.; Christensen, Thomas Højlund
2012-01-01
Uncertainty analysis in LCA studies has been subject to major progress over the last years. In the context of waste management, various methods have been implemented but a systematic method for uncertainty analysis of waste-LCA studies is lacking. The objective of this paper is (1) to present...... the sources of uncertainty specifically inherent to waste-LCA studies, (2) to select and apply several methods for uncertainty analysis and (3) to develop a general framework for quantitative uncertainty assessment of LCA of waste management systems. The suggested method is a sequence of four steps combining...
Addressing model uncertainty in dose-response: The case of chloroform
International Nuclear Information System (INIS)
Evans, J.S.
1994-01-01
This paper discusses the issues involved in addressing model uncertainty in the analysis of dose-response relationships. A method for addressing model uncertainty is described and applied to characterize the uncertainty in estimates of the carcinogenic potency of chloroform. The approach, which is rooted in Bayesian concepts of subjective probability, uses probability trees and formally-elicited expert judgments to address model uncertainty. It is argued that a similar approach could be used to improve the characterization of model uncertainty in the dose-response relationships for health effects from ionizing radiation
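The probability-tree mechanics can be sketched as a weighted combination of model-specific potency estimates; the branch labels, subjective probabilities, and potency values below are invented for illustration and are not the elicited values for chloroform.

```python
# Probability-tree combination of dose-response model alternatives: each
# branch carries a subjective probability and a model-specific potency
# estimate; the tree yields a discrete uncertainty distribution for potency.
tree = {
    # branch: (subjective probability, potency estimate in (mg/kg-day)^-1)
    "genotoxic / linear low-dose":         (0.25, 6.1e-3),
    "cytotoxic / threshold exceeded":      (0.15, 1.0e-3),
    "cytotoxic / threshold not exceeded":  (0.60, 0.0),
}

probs = [p for p, _ in tree.values()]
assert abs(sum(probs) - 1.0) < 1e-12, "branch probabilities must sum to 1"

# Moments and tail probabilities of the potency distribution follow directly.
expected_potency = sum(p * q for p, q in tree.values())
p_nonzero = sum(p for p, q in tree.values() if q > 0)
print(f"probability potency > 0: {p_nonzero:.2f}")
print(f"expected potency: {expected_potency:.2e} per mg/kg-day")
```

Reporting the full branch-by-branch distribution, rather than only the expectation, is what lets the approach display model uncertainty explicitly instead of hiding it inside a single potency number.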
Behar, Evelyn; DiMarco, Ilyse Dobrow; Hekler, Eric B; Mohlman, Jan; Staples, Alison M
2009-12-01
Theoretical conceptualizations of generalized anxiety disorder (GAD) continue to undergo scrutiny and refinement. The current paper critiques five contemporary models of GAD: the Avoidance Model of Worry and GAD [Borkovec, T. D. (1994). The nature, functions, and origins of worry. In: G. Davey & F. Tallis (Eds.), Worrying: perspectives on theory assessment and treatment (pp. 5-33). Sussex, England: Wiley & Sons; Borkovec, T. D., Alcaine, O. M., & Behar, E. (2004). Avoidance theory of worry and generalized anxiety disorder. In: R. Heimberg, C. Turk, & D. Mennin (Eds.), Generalized anxiety disorder: advances in research and practice (pp. 77-108). New York, NY, US: Guilford Press]; the Intolerance of Uncertainty Model [Dugas, M. J., Letarte, H., Rheaume, J., Freeston, M. H., & Ladouceur, R. (1995). Worry and problem solving: evidence of a specific relationship. Cognitive Therapy and Research, 19, 109-120; Freeston, M. H., Rheaume, J., Letarte, H., Dugas, M. J., & Ladouceur, R. (1994). Why do people worry? Personality and Individual Differences, 17, 791-802]; the Metacognitive Model [Wells, A. (1995). Meta-cognition and worry: a cognitive model of generalized anxiety disorder. Behavioural and Cognitive Psychotherapy, 23, 301-320]; the Emotion Dysregulation Model [Mennin, D. S., Heimberg, R. G., Turk, C. L., & Fresco, D. M. (2002). Applying an emotion regulation framework to integrative approaches to generalized anxiety disorder. Clinical Psychology: Science and Practice, 9, 85-90]; and the Acceptance-based Model of GAD [Roemer, L., & Orsillo, S. M. (2002). Expanding our conceptualization of and treatment for generalized anxiety disorder: integrating mindfulness/acceptance-based approaches with existing cognitive behavioral models. Clinical Psychology: Science and Practice, 9, 54-68]. Evidence in support of each model is critically reviewed, and each model's corresponding evidence-based therapeutic interventions are discussed. Generally speaking, the models share an
Model uncertainty in financial markets : Long run risk and parameter uncertainty
de Roode, F.A.
2014-01-01
Uncertainty surrounding key parameters of financial markets, such as the in- flation and equity risk premium, constitute a major risk for institutional investors with long investment horizons. Hedging the investors’ inflation exposure can be challenging due to the lack of domestic inflation-linked
Impact of AMS-02 Measurements on Reducing GCR Model Uncertainties
Slaba, T. C.; O'Neill, P. M.; Golge, S.; Norbury, J. W.
2015-01-01
For vehicle design, shield optimization, mission planning, and astronaut risk assessment, the exposure from galactic cosmic rays (GCR) poses a significant and complex problem both in low Earth orbit and in deep space. To address this problem, various computational tools have been developed to quantify the exposure and risk in a wide range of scenarios. Generally, the tool used to describe the ambient GCR environment provides the input into subsequent computational tools and is therefore a critical component of end-to-end procedures. Over the past few years, several researchers have independently and very carefully compared some of the widely used GCR models to more rigorously characterize model differences and quantify uncertainties. All of the GCR models studied rely heavily on calibrating to available near-Earth measurements of GCR particle energy spectra, typically over restricted energy regions and short time periods. In this work, we first review recent sensitivity studies quantifying the ions and energies in the ambient GCR environment of greatest importance to exposure quantities behind shielding. Currently available measurements used to calibrate and validate GCR models are also summarized within this context. It is shown that the AMS-02 measurements will fill a critically important gap in the measurement database. The emergence of AMS-02 measurements also provides a unique opportunity to validate existing models against measurements that were not used to calibrate free parameters in the empirical descriptions. Discussion is given regarding rigorous approaches to implement the independent validation efforts, followed by recalibration of empirical parameters.
Calibration under uncertainty for finite element models of masonry monuments
Energy Technology Data Exchange (ETDEWEB)
Atamturktur, Sezer; Hemez, Francois; Unal, Cetin
2010-02-01
Historical unreinforced masonry buildings often include features such as load-bearing unreinforced masonry vaults and their supporting framework of piers, fill, buttresses, and walls. The masonry vaults of such buildings are among the most vulnerable structural components and certainly among the most challenging to analyze. The versatility of finite element (FE) analyses in incorporating various constitutive laws, as well as practically all geometric configurations, has resulted in the widespread use of the FE method for the analysis of complex unreinforced masonry structures over the last three decades. However, an FE model is only as accurate as its input parameters, and there are two fundamental challenges in defining FE model input parameters: (1) material properties and (2) support conditions. The difficulties in defining these two aspects of the FE model arise from gaps in the common engineering understanding of masonry behavior. As a result, engineers are unable to define these FE model input parameters with certainty, and uncertainties are inevitably introduced into the FE model.
Conceptual Modeling Framework for E-Area PA HELP Infiltration Model Simulations
Energy Technology Data Exchange (ETDEWEB)
Dyer, J. A. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)]
2017-11-30
A conceptual modeling framework based on the proposed E-Area Low-Level Waste Facility (LLWF) closure cap design is presented for conducting Hydrologic Evaluation of Landfill Performance (HELP) model simulations of intact and subsided cap infiltration scenarios for the next E-Area Performance Assessment (PA).
Developing a conceptual model for selecting and evaluating online markets
Directory of Open Access Journals (Sweden)
Sadegh Feizollahi
2013-04-01
Full Text Available There is ample evidence emphasizing the benefits of using new information and communication technologies in international business, and many believe that e-commerce can help satisfy customers' explicit and implicit requirements. Internet shopping is a concept developed after the introduction of electronic commerce. Information technology (IT) and its applications, specifically the internet and e-mail, promoted the development of e-commerce in terms of advertising, motivation and information. With the development of new technologies, credit and financial exchange websites were constructed to facilitate e-commerce. The proposed study sent a total of 200 questionnaires to the target group (teachers, students, professionals and managers of commercial web sites) and collected 130 questionnaires for final evaluation. Cronbach's alpha was used to measure the reliability of the measurement instruments (questionnaires), and confirmatory factor analysis was employed to assure construct validity. In addition, path analysis was used to analyze the research questions and to determine market-selection models. In the present study, after examining different aspects of e-commerce, we provide a conceptual model for selecting and evaluating online marketing in Iran. These findings provide a consistent, targeted and holistic framework for the development of the Internet market in the country.
Hemispheric Asymmetry of Global Warming Explained by a Conceptual Model
Funke, C. S.; Alexeev, V. A.
2017-12-01
Polar amplification, the process of amplified warming at high latitudes, manifests itself differently in the Arctic and Antarctic. Not only is the temperature increase in the Arctic more pronounced than in the Antarctic, but the dramatic sea ice decline in the Arctic over the last few decades also contrasts sharply with the trendless to weakly positive trend of Antarctic sea ice over the same period. This asymmetric behavior is often partly attributed to differences in the configuration of continents in the Arctic and Antarctic: the Arctic Ocean is surrounded by land, while the Southern Ocean has a continent in the middle. A simple conceptual energy balance model of Budyko-Sellers type, accounting for differences between the Northern and Southern Hemispheres, is applied to study the mechanisms of climate sensitivity to a variety of forcings. Asymmetry in the major modes of variability is explained using an eigenmode analysis of the linearized model. Negative forcings over Antarctica, such as from ozone depletion, were found to have an amplified effect on Southern Hemisphere climate and may be an important cause of the muted warming and slightly positive Antarctic sea ice trend.
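The eigenmode analysis mentioned above can be illustrated with a minimal linearized two-box (hemispheric) energy balance model. The feedback and exchange coefficients below are illustrative assumptions, not values from the study:

```python
import numpy as np

# Linearized two-box energy balance model: dT/dt = A @ T + forcing,
# with T = [T_north, T_south]. b_n, b_s are local radiative feedbacks
# (W m^-2 K^-1) and k the inter-hemispheric heat exchange; the asymmetry
# (b_s > b_n) stands in for the north-south differences in the model.
b_n, b_s = 1.5, 2.1   # illustrative feedbacks (stronger damping in the south)
k = 0.5               # illustrative coupling between hemispheres
A = np.array([[-(b_n + k), k],
              [k, -(b_s + k)]])

eigvals, eigvecs = np.linalg.eig(A)
# Both eigenvalues are negative (the climate state is stable); the
# slowest-decaying eigenmode dominates the long-term response, and its
# eigenvector is asymmetric between the two hemispheres.
print(eigvals)
print(eigvecs)
```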
DEFF Research Database (Denmark)
He, Xiulan
Groundwater modeling plays an essential role in modern subsurface hydrology research. It’s generally recognized that simulations and predictions by groundwater models are associated with uncertainties that originate from various sources. The two major uncertainty sources are related to model...... parameters and model structures, which are the primary focuses of this PhD research. Parameter uncertainty was analyzed using an optimization tool (PEST: Parameter ESTimation) in combination with a random sampling method (LHS: Latin Hypercube Sampling). Model structure, namely geological architecture...
Status of standard model predictions and uncertainties for electroweak observables
International Nuclear Information System (INIS)
Kniehl, B.A.
1993-11-01
Recent progress in theoretical predictions of electroweak parameters beyond one loop in the standard model is reviewed. The topics include universal corrections of O(G_F^2 M_H^2 M_W^2), O(G_F^2 m_t^4), O(α_s G_F M_W^2), and those due to virtual t anti-t threshold effects, as well as specific corrections to Γ(Z → b anti-b) of O(G_F^2 m_t^4), O(α_s G_F m_t^2), and O(α_s^2 m_b^2/M_Z^2). An update of the hadronic contributions to Δα is presented. Theoretical uncertainties, other than those due to the lack of knowledge of M_H and m_t, are estimated. (orig.)
Hughes, J. D.; White, J.
2013-12-01
For many numerical hydrologic models it is a challenge to quantitatively demonstrate that complex models are preferable to simpler models. Typically, a decision is made at the beginning of a study to develop and calibrate a complex model. Because it can be time consuming to develop another numerical code with data processing and parameter estimation functionality, the value of selecting a complex model over simpler models is commonly inferred simply from its use of fewer simplifications of the governing equations. High-level programming languages like Python can greatly reduce the effort required to develop and calibrate simple models that can be used to quantitatively demonstrate the increased value of a complex model. We have developed and calibrated a spatially distributed surface-water/groundwater flow model for managed basins in southeast Florida, USA, to (1) evaluate the effect of municipal groundwater pumpage on surface-water/groundwater exchange, (2) investigate how the study area will respond to sea-level rise, and (3) explore combinations of these forcing functions. To demonstrate the increased value of this complex model, we developed a two-parameter conceptual-benchmark-discharge model for each basin in the study area. The conceptual-benchmark-discharge model includes seasonal scaling and lag parameters and is driven by basin rainfall. The conceptual-benchmark-discharge models were developed in the Python programming language and used weekly rainfall data. Calibration was implemented with the Broyden-Fletcher-Goldfarb-Shanno method available in the Scientific Python (SciPy) library. Normalized benchmark efficiencies calculated using output from the complex model and the corresponding conceptual-benchmark-discharge model indicate that the complex model has more explanatory power than the simple model driven only by rainfall.
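A sketch of a two-parameter benchmark model calibrated with SciPy's BFGS implementation, in the spirit of the abstract above. The functional form (sinusoidal seasonal scaling, fractional-week lag) and all data are invented for illustration and are not the study's actual model:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
weeks = np.arange(104, dtype=float)
rain = rng.gamma(2.0, 10.0, size=weeks.size)   # synthetic weekly rainfall

def benchmark_discharge(params, rain, weeks):
    """Two-parameter benchmark: seasonal scaling and a lag on rainfall."""
    scale, lag = params
    season = 1.0 + scale * np.sin(2.0 * np.pi * weeks / 52.0)
    lagged = np.interp(weeks - lag, weeks, rain)   # smooth fractional-week lag
    return season * lagged

# Synthetic "observed" discharge from known parameters plus noise
obs = benchmark_discharge([0.3, 2.0], rain, weeks) + rng.normal(0.0, 1.0, weeks.size)

def sse(params):
    return np.sum((benchmark_discharge(params, rain, weeks) - obs) ** 2)

# Quasi-Newton (BFGS) calibration, as named in the abstract
res = minimize(sse, x0=[0.1, 0.0], method="BFGS")
print(res.x)
```

The interpolated lag keeps the objective smooth in both parameters, which is what makes a gradient-based method like BFGS appropriate here.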
International Nuclear Information System (INIS)
Olsen, A.R.; Cunningham, M.E.
1980-01-01
With the increasing sophistication and use of computer codes in the nuclear industry, there is a growing awareness of the need to identify and quantify the uncertainties of these codes. In any effort to model physical mechanisms, the results obtained from the model are subject to some degree of uncertainty. This uncertainty has two primary sources. First, there is uncertainty in the model's representation of reality. Second, there is an uncertainty in the input data required by the model. If individual models are combined into a predictive sequence, the uncertainties from an individual model will propagate through the sequence and add to the uncertainty of results later obtained. Nuclear fuel rod stored-energy models, characterized as a combination of numerous submodels, exemplify models so affected. Each submodel depends on output from previous calculations and may involve iterative interdependent submodel calculations for the solution. The iterative nature of the model and the cost of running the model severely limit the uncertainty analysis procedures. An approach for uncertainty analysis under these conditions was designed for the particular case of stored-energy models. It is assumed that the complicated model is correct, that a simplified model based on physical considerations can be designed to approximate the complicated model, and that linear error propagation techniques can be used on the simplified model
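The linear error propagation on a simplified model, as described above, can be sketched as follows. The algebraic surrogate model and input covariances are hypothetical stand-ins, not an actual stored-energy submodel:

```python
import numpy as np

# First-order (linear) error propagation through a simplified model y = f(x).
def f(x):
    k, q, T = x            # hypothetical inputs: conductivity, heat rate, temperature
    return q * T / k       # simple algebraic surrogate for a complicated model

x0 = np.array([3.0, 2.0, 600.0])       # nominal input values
cov_x = np.diag([0.09, 0.04, 400.0])   # input variances (assumed independent)

# Numerical Jacobian of f at x0 (forward differences)
eps = 1e-6
J = np.array([(f(x0 + eps * np.eye(3)[i]) - f(x0)) / eps for i in range(3)])

# sigma_y^2 = J Sigma_x J^T for a scalar output
var_y = J @ cov_x @ J
print(f(x0), np.sqrt(var_y))
```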
Ronald E. McRoberts; Veronica C. Lessard
2001-01-01
Uncertainty in diameter growth predictions is attributed to three general sources: measurement error or sampling variability in predictor variables, parameter covariances, and residual or unexplained variation around model expectations. Using measurement error and sampling variability distributions obtained from the literature and Monte Carlo simulation methods, the...
Uncertainties in Atomic Data and Their Propagation Through Spectral Models. I.
Bautista, M. A.; Fivet, V.; Quinet, P.; Dunn, J.; Gull, T. R.; Kallman, T. R.; Mendoza, C.
2013-01-01
We present a method for computing uncertainties in spectral models, i.e., level populations, line emissivities, and emission line ratios, based upon the propagation of uncertainties originating from atomic data. We provide analytic expressions, in the form of linear sets of algebraic equations, for the coupled uncertainties among all levels. These equations can be solved efficiently for any set of physical conditions and uncertainties in the atomic data. We illustrate our method applied to spectral models of O III and Fe II and discuss the impact of the uncertainties on atomic systems under different physical conditions. As to intrinsic uncertainties in theoretical atomic data, we propose that these uncertainties can be estimated from the dispersion in the results from various independent calculations. This technique provides excellent results for the uncertainties in A-values of forbidden transitions in [Fe II]. Key words: atomic data - atomic processes - line: formation - methods: data analysis - molecular data - molecular processes - techniques: spectroscopic
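A toy version of the linear-equation propagation for a two-level system. The rate coefficients and perturbation form below are illustrative assumptions, not the O III or Fe II models of the paper:

```python
import numpy as np

# Toy two-level atom: collisional excitation C01 balanced by collisional
# de-excitation C10 and radiative decay A10 (all values illustrative).
A10 = 1e-2               # Einstein A-value (s^-1)
C01, C10 = 1e-4, 2e-5    # collision rates (s^-1)

# Statistical equilibrium: n0*C01 = n1*(A10 + C10), with n0 + n1 = 1
M = np.array([[C01, -(A10 + C10)],
              [1.0,  1.0        ]])
b = np.array([0.0, 1.0])
n = np.linalg.solve(M, b)

# First-order response of the populations to a perturbation dA in A10:
# M dn = -dM n, i.e. the linear algebraic form used to couple uncertainties.
sigma_A = 0.1 * A10                          # assumed 10% A-value uncertainty
dM = np.array([[0.0, -sigma_A], [0.0, 0.0]])
dn = np.linalg.solve(M, -dM @ n)
print(n, dn)   # level populations and their first-order uncertainties
```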
Three Roles of Conceptual Models in Information System Design and Use
Wieringa, Roelf J.; Falkenberg, Eckhard D.; Lindgreen, Paul
1989-01-01
This paper attempts to draw together results from information systems research, linguistic theory, and methodology in order to present a unified framework in which to understand conceptual models. Three different roles of conceptual models (CM's) in the design and use of information systems (IS's)
Non-monotonic reasoning in conceptual modeling and ontology design: A proposal
CSIR Research Space (South Africa)
Casini, G
2013-06-01
Full Text Available 2nd International Workshop on Ontologies and Conceptual Modeling (Onto.Com 2013), Valencia, Spain, 17-21 June 2013. Non-monotonic reasoning in conceptual modeling and ontology design: A proposal. Giovanni Casini and Alessandro Mosca.
Applying a Conceptual Model in Sport Sector Work- Integrated Learning Contexts
Agnew, Deborah; Pill, Shane; Orrell, Janice
2017-01-01
This paper applies a conceptual model for work-integrated learning (WIL) in a multidisciplinary sports degree program. Two examples of WIL in sport will be used to illustrate how the conceptual WIL model is being operationalized. The implications for practice are that curriculum design must recognize a highly flexible approach to the nature of…
Teacher Emotion Research: Introducing a Conceptual Model to Guide Future Research
Fried, Leanne; Mansfield, Caroline; Dobozy, Eva
2015-01-01
This article reports on the development of a conceptual model of teacher emotion through a review of teacher emotion research published between 2003 and 2013. By examining 82 publications regarding teacher emotion, the main aim of the review was to identify how teacher emotion was conceptualised in the literature and develop a conceptual model to…
Theory analysis of the Dental Hygiene Human Needs Conceptual Model.
MacDonald, L; Bowen, D M
2017-11-01
Theories provide a structural knowing about concept relationships, practice intricacies, and intuitions, and thus shape the distinct body of the profession. Capturing ways of knowing and being is essential to any profession's practice, education and research. This process defines the phenomenon of the profession - its existence or experience. Theory evaluation is a systematic criterion-based assessment of a specific theory. This study presents a theory analysis of the Dental Hygiene Human Needs Conceptual Model (DH HNCM). Using the Walker and Avant Theory Analysis, a seven-step process, the DH HNCM was analysed and evaluated for its meaningfulness and contribution to dental hygiene. The steps include the following: (i) investigate the origins; (ii) examine relationships of the theory's concepts; (iii) assess the logic of the theory's structure; (iv) consider the usefulness to practice; (v) judge the generalizability; (vi) evaluate the parsimony; and (vii) appraise the testability of the theory. Human needs theory in nursing and Maslow's Hierarchy of Need Theory prompted this theory's development. The DH HNCM depicts four concepts based on the paradigm concepts of the profession - client, health/oral health, environment and dental hygiene actions - and includes eleven validated human needs that evolved over time to eight. It is logical and parsimonious, allows scientific predictions and testing, and provides a unique lens for the dental hygiene practitioner. With this model, dental hygienists have entered practice knowing they enable clients to meet their human needs. For the DH HNCM, theory analysis affirmed that the model is reasonable and insightful and adds to the dental hygiene profession's epistemology and ontology. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Facets of private practice nursing: a conceptual model.
Wilson, Anne; Averis, Andrea
2002-04-01
This paper critically examines the literature relating to private practice nursing. Particular attention is given to the reasons nurses choose private practice and the major issues involved. A conceptual model has been developed based on this information. Nurses' roles are expanding into different work domains. Private practice nursing is one of the advanced practice options available. It also requires the nurse to develop business knowledge and skills. A literature search was conducted of the PubMed, CINAHL, Medline and InfoTrac databases using the terms 'private practice', 'nurse entrepreneur', 'nurses in business', 'nurse practitioners', 'self-employed nurse', 'advanced practice' and 'clinical nurse specialist'. Further relevant articles were identified from the reference lists of papers detected by this literature search. In addition, conference proceedings were examined for any other material on this topic. A thorough search of the existing literature revealed one unpublished theoretically based study which examined limited aspects of private practice nursing in Victoria. A reasonable number of articles and publications that provided anecdotal and personal accounts of being a nurse in business were identified. This review highlights the need for further theoretically based research in this area of nursing, so as to expand nursing knowledge. Suggestions are given for further research in this topical area. Existing research into private practice nursing is limited and not sufficient to inform changes to policy and nurse education. More research is needed.
Virtual Learning Environment for Entrepreneurship: A Conceptual Model
Directory of Open Access Journals (Sweden)
Douglas Sparkes
2016-06-01
Full Text Available The University of Waterloo has a history as an innovative and entrepreneurial university. With increasing demand for entrepreneurship education and venture development support, there has been growing interest in how to provide this support virtually. To address this need, an entrepreneurship platform is under development, consisting of four primary components: entrepreneurial team engagement, mentor engagement, provision of 'just-in-time' learning resources, and social network creation. Engagement and social network creation are built around a series of gamified events that provide structure and feedback for the participants, as well as focal points for mentoring and network development. The 'embedding' of these early-stage ventures into a supportive social network aligns with a belief that one does not simply launch new ventures, but rather launches networks. These event gates are supported by a system of 'just-in-time' learning modules that allow the participants to develop their own learning program and may be drawn upon as needed. In this paper we discuss the conceptual model as well as progress on the development of its key features. We also discuss some of the early results and lessons learned from integrating it into several initiatives underway in Canada and Kenya.
A data-driven approach for modeling post-fire debris-flow volumes and their uncertainty
Friedel, Michael J.
2011-01-01
This study demonstrates the novel application of genetic programming to evolve nonlinear post-fire debris-flow volume equations from variables associated with a data-driven conceptual model of the western United States. The search space is constrained using a multi-component objective function that simultaneously minimizes root-mean-squared and unit errors for the evolution of the fittest equations. An optimization technique is then used to estimate the limits of nonlinear prediction uncertainty associated with the debris-flow equations. In contrast to a published multiple linear regression three-variable equation, linking basin area with slopes greater than or equal to 30 percent, burn severity characterized as area burned moderate plus high, and total storm rainfall, the data-driven approach discovers many nonlinear and several dimensionally consistent equations that are unbiased and have less prediction uncertainty. Of the nonlinear equations, the best performance (lowest prediction uncertainty) is achieved when using three variables: average basin slope, total burned area, and total storm rainfall. Further reduction in uncertainty is possible for the nonlinear equations when dimensional consistency is not a priority and by subsequently applying a gradient solver to the fittest solutions. The data-driven modeling approach can be applied to nonlinear multivariate problems in all fields of study.
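The multi-component objective described above can be sketched as a fitness that sums prediction error and a dimensional-consistency penalty. The exponent-vector unit representation, weight, and data below are assumptions for illustration, not the study's implementation:

```python
import numpy as np

def rmse(pred, obs):
    """Root-mean-squared error component of the objective."""
    return np.sqrt(np.mean((pred - obs) ** 2))

def unit_error(candidate_units, target_units):
    """Dimensional inconsistency as the L1 distance between exponent
    vectors over base dimensions, e.g. [length, time]."""
    return np.sum(np.abs(np.asarray(candidate_units) - np.asarray(target_units)))

def fitness(pred, obs, candidate_units, target_units, w_unit=1.0):
    """Multi-component fitness: lower is fitter (weight w_unit assumed)."""
    return rmse(pred, obs) + w_unit * unit_error(candidate_units, target_units)

obs = np.array([3.1, 4.0, 5.2])     # toy observed debris-flow volumes
pred = np.array([3.0, 4.2, 5.0])    # toy candidate-equation predictions
# A candidate returning volume (L^3) against an L^3 target incurs no penalty
print(fitness(pred, obs, [3, 0], [3, 0]))
```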
Developing a conceptual model of possible benefits of condensed tannins for ruminant production.
Tedeschi, L O; Ramírez-Restrepo, C A; Muir, J P
2014-07-01
Enteric methane (CH4) emissions from ruminants have compelled a wide range of research initiatives to identify environmental abatement opportunities. However, although such mitigations can theoretically be attained with feed additives and feeding strategies, the limited empirical evidence on plant extracts used as feed additives does not support extensive or long-term reductions. Nevertheless, their strategic use (i.e. alone or combined in simultaneous or consecutive use) may provide not only acceptable CH4 abatement levels, but also relevant effects on animal physiology and productivity. Condensed tannins (CT) represent a range of polyphenolic compounds of flavan-3-ol units present in some forage species that can also be added to prepared diets. Methods to determine CT, or their conjugated metabolites, are not simple. Although there are limitations and uncertainties about the methods to be applied, CT are thought to reduce CH4 production (1) indirectly, by binding to the dietary fibre and/or reducing the rumen digestion and digestibility of the fibre, and (2) directly, by inhibiting the growth of rumen methanogens. On the basis of their role in livestock nutrition, CT influence the digestion of protein in the rumen because of their affinity for proteins (e.g. oxidative coupling and H bonding at neutral pH), which causes the CT-protein complex to be insoluble in the rumen and to dissociate in the abomasum at pH 2.5 to 3.0 for proteolysis and absorption in the small intestine. CT may also reduce gastro-intestinal parasite burdens and improve reproductive performance, foetal development, immune system response, hormone serum concentrations, wool production and lactation. The objectives of this paper are to discuss some of the beneficial and detrimental effects of CT on ruminant production systems and to develop a conceptual model to illustrate these metabolic relationships in terms of systemic physiology using earlier investigations with the CT-containing legume Lotus
Quantum-memory-assisted entropic uncertainty in spin models with Dzyaloshinskii-Moriya interaction
Huang, Zhiming
2018-02-01
In this article, we investigate the dynamics and correlations of quantum-memory-assisted entropic uncertainty, the tightness of the uncertainty, entanglement, quantum correlation and mixedness for various spin chain models with Dzyaloshinskii-Moriya (DM) interaction, including the XXZ model with DM interaction, the XY model with DM interaction and the Ising model with DM interaction. We find that the uncertainty grows to a stable value with growing temperature but reduces as the coupling coefficient, anisotropy parameter and DM values increase. It is found that the entropic uncertainty is closely correlated with the mixedness of the system. Increasing quantum correlation can result in a decrease in the uncertainty, and quantum correlation is more robust than entanglement, since entanglement undergoes sudden death and birth. The tightness of the uncertainty drops to zero, apart from slight volatility, as the various parameters increase. Furthermore, we propose an effective approach to steering the uncertainty by weak measurement reversal.
Assessing uncertainty in SRTM elevations for global flood modelling
Hawker, L. P.; Rougier, J.; Neal, J. C.; Bates, P. D.
2017-12-01
The SRTM DEM is widely used as the topography input to flood models in data-sparse locations. Understanding spatial error in the SRTM product is crucial in constraining uncertainty about elevations and assessing the impact of these upon flood prediction. Assessment of SRTM error was carried out by Rodriguez et al. (2006), but this did not explicitly quantify the spatial structure of vertical errors in the DEM, nor did it distinguish between errors over different types of landscape. As a result, there is a lack of information about the spatial structure of vertical SRTM errors in the landscape that matters most to flood models - the floodplain. Therefore, this study attempts this task by comparing SRTM, an error-corrected SRTM product (the MERIT DEM of Yamazaki et al., 2017) and near-truth LIDAR elevations for three deltaic floodplains (Mississippi, Po, Wax Lake) and a large lowland region (the Fens, UK). Using the error covariance function, calculated by comparing SRTM elevations to the near-truth LIDAR, perturbations of the 90 m SRTM DEM were generated, producing a catalogue of plausible DEMs. This allows modellers to simulate a suite of plausible DEMs at any aggregated block size above the native SRTM resolution. Finally, the generated DEMs were input into a model of the Mekong Delta, built using the LISFLOOD-FP hydrodynamic code, to assess how DEM error affects the hydrodynamics and inundation extent across the domain. The end product of this is an inundation map with the probability of each pixel being flooded based on the catalogue of DEMs. In a world of increasing computer power, but a lack of detailed datasets, this powerful approach can be used throughout natural hazard modelling to understand how errors in the SRTM DEM can impact the hazard assessment.
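One way to generate members of such a plausible-DEM catalogue is to sample a spatially correlated error field from a fitted covariance function. The exponential covariance model and its parameters below are placeholders, not the values fitted in the study:

```python
import numpy as np

rng = np.random.default_rng(1)

# Sample a correlated vertical-error field for a small DEM tile using an
# exponential covariance model (all parameter values are illustrative).
n = 20              # tile is n x n cells
cell = 90.0         # SRTM cell size (m)
sigma = 3.0         # assumed error standard deviation (m)
corr_len = 500.0    # assumed correlation length (m)

x, y = np.meshgrid(np.arange(n) * cell, np.arange(n) * cell)
pts = np.column_stack([x.ravel(), y.ravel()])
d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
cov = sigma**2 * np.exp(-d / corr_len)

# Cholesky factor turns iid standard normals into a correlated field
L = np.linalg.cholesky(cov + 1e-8 * np.eye(n * n))
error_field = (L @ rng.standard_normal(n * n)).reshape(n, n)

dem = np.zeros((n, n))              # stand-in for the SRTM tile elevations
perturbed_dem = dem + error_field   # one member of the plausible-DEM catalogue
```

Repeating the last two lines with fresh random draws yields the catalogue of equally plausible DEMs that a flood model can then be run over.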
Conceptual and numerical modeling approach of the Guarani Aquifer System
Directory of Open Access Journals (Sweden)
L. Rodríguez
2013-01-01
Full Text Available In large aquifers, relevant for their considerable size, regional groundwater modeling remains challenging given geologic complexity and data scarcity in space and time. Yet, it may be conjectured that regional scale groundwater flow models can help in understanding the flow system functioning and the relative magnitude of water budget components, which are important for aquifer management. The Guaraní Aquifer System is the largest transboundary aquifer in South America. It contains an enormous volume of water; however, it is not well known, and it is difficult to assess the impact of the exploitation currently used to supply over 25 million inhabitants. This is a sensitive issue because the aquifer is shared by four countries. Moreover, an integrated groundwater model, and therefore a global water balance, were not available. In this work, a transient regional scale model for the entire aquifer based upon five simplified, equally plausible conceptual models represented by different hydraulic conductivity parametrizations is used to analyze the flow system and water balance components. Combining an increasing number of hydraulic conductivity zones and an appropriate set of boundary conditions, the hypothesis of a continuous sedimentary unit yielded errors within the calibration target in a regional sense. The magnitude of the water budget terms was very similar for all parametrizations. Recharge and stream/aquifer fluxes were the dominant components, representing, on average, 84.2% of total inflows and 61.4% of total outflows, respectively. However, leakage was small compared to the stream discharges of the main rivers. For instance, the simulated average leakage for the Uruguay River was 8 m³ s⁻¹ while the observed absolute minimum discharge was 382 m³ s⁻¹. Streams located in heavily pumped regions switched from a gaining condition in early years to a losing condition over time. Water is discharged through
Conceptual and numerical modeling approach of the Guarani Aquifer System
Rodríguez, L.; Vives, L.; Gomez, A.
2013-01-01
In large aquifers, relevant for their considerable size, regional groundwater modeling remains challenging given geologic complexity and data scarcity in space and time. Yet, it may be conjectured that regional scale groundwater flow models can help in understanding the flow system functioning and the relative magnitude of water budget components, which are important for aquifer management. The Guaraní Aquifer System is the largest transboundary aquifer in South America. It contains an enormous volume of water; however, it is not well known, and it is difficult to assess the impact of the exploitation currently used to supply over 25 million inhabitants. This is a sensitive issue because the aquifer is shared by four countries. Moreover, an integrated groundwater model, and therefore a global water balance, were not available. In this work, a transient regional scale model for the entire aquifer based upon five simplified, equally plausible conceptual models represented by different hydraulic conductivity parametrizations is used to analyze the flow system and water balance components. Combining an increasing number of hydraulic conductivity zones and an appropriate set of boundary conditions, the hypothesis of a continuous sedimentary unit yielded errors within the calibration target in a regional sense. The magnitude of the water budget terms was very similar for all parametrizations. Recharge and stream/aquifer fluxes were the dominant components, representing, on average, 84.2% of total inflows and 61.4% of total outflows, respectively. However, leakage was small compared to the stream discharges of the main rivers. For instance, the simulated average leakage for the Uruguay River was 8 m³ s⁻¹ while the observed absolute minimum discharge was 382 m³ s⁻¹. Streams located in heavily pumped regions switched from a gaining condition in early years to a losing condition over time. Water is discharged through the aquifer boundaries, except at the eastern boundary. On average
Effect of Baseflow Separation on Uncertainty of Hydrological Modeling in the Xinanjiang Model
Directory of Open Access Journals (Sweden)
Kairong Lin
2014-01-01
Full Text Available Based on the idea that inputting more available useful information for evaluation yields less uncertainty, this study focuses on how much the uncertainty can be reduced by considering the baseflow estimation information obtained from the smoothed minima method (SMM). The Xinanjiang model and the generalized likelihood uncertainty estimation (GLUE) method with the shuffled complex evolution Metropolis (SCEM-UA) sampling algorithm were used for hydrological modeling and uncertainty analysis, respectively. The Jiangkou basin, located in the upper reaches of the Hanjiang River, was selected as the case study. It was found that the number and standard deviation of behavioral parameter sets both decreased when the threshold value for the baseflow efficiency index increased, and that high Nash-Sutcliffe efficiency coefficients correspond well with high baseflow efficiency coefficients. The results also showed that when a threshold for the baseflow efficiency index was taken into consideration, the uncertainty interval width decreased significantly while the containing ratio did not decrease much, and the simulated runoff from the behavioral parameter sets fit the observed runoff better. These results imply that using baseflow estimation information can reduce the uncertainty in hydrological modeling to some degree and yield more reasonable prediction bounds.
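A bare-bones GLUE sketch, assuming a toy linear rainfall-runoff surrogate in place of the Xinanjiang model and uniform Monte Carlo sampling rather than SCEM-UA; all data and thresholds are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(7)

def nse(sim, obs):
    """Nash-Sutcliffe efficiency, a common GLUE likelihood measure."""
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

# Toy rainfall-runoff surrogate: runoff = a * rain + b
rain = rng.gamma(2.0, 5.0, 200)
obs = 0.6 * rain + 2.0 + rng.normal(0.0, 1.0, 200)   # synthetic observations

# Monte Carlo sampling of the parameter space (a, b)
samples = rng.uniform([0.0, 0.0], [1.5, 5.0], size=(5000, 2))
sims = samples[:, 0:1] * rain + samples[:, 1:2]
scores = np.array([nse(s, obs) for s in sims])

# Behavioral sets are those exceeding the efficiency threshold; their
# simulations define the prediction bounds.
behavioral = samples[scores > 0.8]
bounds = np.percentile(sims[scores > 0.8], [5, 95], axis=0)  # 90% bounds
print(len(behavioral), bounds.shape)
```

Raising the threshold (or adding a second, baseflow-based threshold, as in the study) shrinks the behavioral set and narrows the prediction bounds.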
Energy Technology Data Exchange (ETDEWEB)
Langton, C.; Kosson, D.
2009-11-30
Cementitious barriers for nuclear applications are one of the primary controls for preventing or limiting radionuclide release into the environment. At the present time, performance and risk assessments do not fully incorporate the effectiveness of engineered barriers because the processes that influence performance are coupled and complicated. A better understanding of the behavior of cementitious barriers is necessary to evaluate and improve the design of materials and structures used for radioactive waste containment, life extension of current nuclear facilities, and design of future nuclear facilities, including those needed for nuclear fuel storage and processing, nuclear power production and waste management. The focus of the Cementitious Barriers Partnership (CBP) literature review is to document the current level of knowledge with respect to: (1) mechanisms and processes that directly influence the performance of cementitious materials; (2) methodologies for modeling the performance of these mechanisms and processes; and (3) approaches to addressing and quantifying uncertainties associated with performance predictions. This will serve as an important reference document for the professional community responsible for the design and performance assessment of cementitious materials in nuclear applications. This review also provides a multi-disciplinary foundation for identification, research, development and demonstration of improvements in conceptual understanding, measurements and performance modeling that would lead to significant reductions in the uncertainties and improved confidence in estimating the long-term performance of cementitious materials in nuclear applications. This report identifies (1) technology gaps that may be filled by the CBP project and (2) information and computational methods that are currently being applied in related fields but have not yet been incorporated into performance assessments of cementitious barriers. The various
The Effects of Uncertainty in Speed-Flow Curve Parameters on a Large-Scale Model
DEFF Research Database (Denmark)
Manzo, Stefano; Nielsen, Otto Anker; Prato, Carlo Giacomo
2014-01-01
Uncertainty is inherent in transport models and prevents the use of a deterministic approach when traffic is modeled. Quantifying uncertainty thus becomes an indispensable step to produce a more informative and reliable output of transport models. In traffic assignment models, volume-delay functi...
Modelling public risk evaluation of natural hazards: a conceptual approach
Plattner, Th.
2005-04-01
In recent years, the handling of natural hazards in Switzerland has shifted away from being hazard-oriented towards a risk-based approach. Decreasing societal acceptance of risk, accompanied by increasing marginal costs of protective measures and decreasing financial resources, causes an optimization problem. Therefore, the new focus lies on the mitigation of the hazard's risk in accordance with economic, ecological and social considerations. This modern procedure requires an approach in which not only the technological, engineering or scientific aspects of the definition of the hazard or the computation of the risk are considered, but also the public's concerns about the acceptance of these risks. These aspects of a modern risk approach enable a comprehensive assessment of the (risk) situation and, thus, sound risk management decisions. In Switzerland, however, the competent authorities suffer from a lack of decision criteria, as they do not know what risk level the public is willing to accept. Consequently, there is a need for the authorities to know what society thinks about risks. A formalized model that allows at least a crude simulation of the public risk evaluation could therefore be a useful tool to support effective and efficient risk mitigation measures. This paper presents a conceptual approach to such an evaluation model using perception-affecting factors (PAF), evaluation criteria (EC) and several factors without any immediate relation to the risk itself, but to the evaluating person. Finally, the decision about the acceptance Acc of a certain risk i is made by comparing the perceived risk R_i,perc with the acceptable risk R_i,acc.
Uncertainty modeling dedicated to professor Boris Kovalerchuk on his anniversary
2017-01-01
This book commemorates the 65th birthday of Dr. Boris Kovalerchuk, and reflects many of the research areas covered by his work. It focuses on data processing under uncertainty, especially fuzzy data processing, when uncertainty comes from the imprecision of expert opinions. The book includes 17 authoritative contributions by leading experts.
Uncertainty analysis of hydrological modeling in a tropical area using different algorithms
Rafiei Emam, Ammar; Kappas, Martin; Fassnacht, Steven; Linh, Nguyen Hoang Khanh
2018-01-01
Hydrological modeling outputs are subject to uncertainty resulting from different sources of error (e.g., errors in input data, model structure and model parameters), making the quantification of uncertainty in hydrological modeling imperative in order to improve the reliability of modeling results. Uncertainty analysis must also overcome difficulties in the calibration of hydrological models, which increase further in areas with data scarcity. The purpose of this study is to apply four uncertainty analysis algorithms to a semi-distributed hydrological model, quantifying different sources of uncertainty (especially parameter uncertainty) and evaluating their performance. In this study, the Soil and Water Assessment Tool (SWAT) eco-hydrological model was implemented for a watershed in central Vietnam. The sensitivity of the parameters was analyzed, and the model was calibrated. The uncertainty analysis was conducted with four algorithms: Generalized Likelihood Uncertainty Estimation (GLUE), Sequential Uncertainty Fitting (SUFI), the Parameter Solution method (ParaSol) and Particle Swarm Optimization (PSO). The performance of the algorithms was compared using the P-factor and R-factor, the coefficient of determination (R2), the Nash-Sutcliffe efficiency (NSE) and Percent Bias (PBIAS). The results showed the high performance of SUFI and PSO, with P-factor > 0.83, R-factor 0.91, NSE > 0.89 and PBIAS 0.18, supporting model use for policy or management decisions.
Parameter Uncertainty for Aircraft Aerodynamic Modeling using Recursive Least Squares
Grauer, Jared A.; Morelli, Eugene A.
2016-01-01
A real-time method was demonstrated for determining accurate uncertainty levels of stability and control derivatives estimated using recursive least squares and time-domain data. The method uses a recursive formulation of the residual autocorrelation to account for colored residuals, which are routinely encountered in aircraft parameter estimation and change the predicted uncertainties. Simulation data and flight test data for a subscale jet transport aircraft were used to demonstrate the approach. Results showed that the corrected uncertainties matched the observed scatter in the parameter estimates, and did so more accurately than conventional uncertainty estimates that assume white residuals. Only small differences were observed between batch estimates and recursive estimates at the end of the maneuver. It was also demonstrated that the autocorrelation could be reduced to a small number of lags to minimize computation and memory storage requirements without significantly degrading the accuracy of predicted uncertainty levels.
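The colored-residual correction described above can be sketched as follows. This is a minimal illustration under stated assumptions (simulated AR(1) noise, a simple linear regression, a Bartlett-style variance inflation over a truncated set of lags), not the aircraft models, flight data, or exact formulation used in the paper.

```python
import numpy as np

# Sketch: recursive least squares (RLS) plus a colored-residual
# correction to the parameter covariance. Data are simulated; the
# model, noise process, and lag count are illustrative assumptions.
rng = np.random.default_rng(0)
n, p = 500, 2
X = np.column_stack([np.ones(n), rng.normal(size=n)])  # regressors
theta_true = np.array([1.0, -0.5])
e = np.zeros(n)                       # AR(1) (colored) residuals
for t in range(1, n):
    e[t] = 0.8 * e[t - 1] + rng.normal(scale=0.1)
y = X @ theta_true + e

# Standard RLS recursion
P = np.eye(p) * 1e6                   # large initial covariance
theta = np.zeros(p)
for t in range(n):
    x = X[t]
    gain = P @ x / (1.0 + x @ P @ x)
    theta = theta + gain * (y[t] - x @ theta)
    P = P - np.outer(gain, x @ P)

# Conventional covariance assuming white residuals
r = y - X @ theta
s2 = r @ r / (n - p)
cov_white = s2 * P

# Correction: inflate by the residual autocorrelation over a small
# number of lags (1 + 2*sum(rho_k)), as the abstract suggests the
# autocorrelation can be truncated without much loss of accuracy.
lags = 20
rho = np.array([np.corrcoef(r[:-k], r[k:])[0, 1] for k in range(1, lags)])
inflation = 1.0 + 2.0 * rho.sum()
cov_colored = cov_white * inflation
```

With positively autocorrelated residuals the inflation factor exceeds one, so the corrected uncertainties are larger than the white-residual ones, matching the qualitative finding reported above.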
Slangen, A.H.L.; van Tulder, R.J.M.
2009-01-01
It is well accepted that multinational enterprises (MNEs) prefer equity joint ventures (JVs) over wholly owned subsidiaries (WOSs) in foreign countries where the formal and informal external environment is highly uncertain. Many entry mode studies have modeled the external uncertainty faced by MNEs
Modeling Uncertainty of Directed Movement via Markov Chains
Directory of Open Access Journals (Sweden)
YIN Zhangcai
2015-10-01
Full Text Available Probabilistic time geography (PTG) is suggested as an extension of (classical) time geography, in order to represent, by probability, the uncertainty of an agent being located at an accessible position. This may provide a quantitative basis for finding the most likely location of an agent. In recent years, PTG based on the normal distribution or the Brownian bridge has been proposed; however, its variance is either unrelated to the agent's speed or diverges as the speed increases, so these models struggle to balance applicability and stability. In this paper, a new method is proposed to model PTG based on Markov chains. First, a bidirectional conditional Markov chain is modeled, whose limit, when the moving speed is large enough, can be regarded as the Brownian bridge and thus has the characteristic of numerical stability. Then, the directed movement is mapped to Markov chains. The essential part is to build the step length, state space and transition matrix of the Markov chain according to the spatial and temporal position of the directed movement and the movement speed, so that the Markov chain depends on the movement speed. Finally, by continuously calculating the probability distribution of the directed movement at any time with the Markov chains, the probability of the agent being located at each accessible position can be obtained. Experimental results show that the variance based on Markov chains is not only related to speed but also tends toward stability as the agent's maximum speed increases.
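The basic mechanics of propagating an agent's position distribution through a Markov chain can be sketched as follows. The state space, step length, and transition probabilities here are illustrative assumptions for a one-directional movement, not the paper's bidirectional conditional construction.

```python
import numpy as np

# Sketch: directed movement as a Markov chain over discrete positions.
# At each time step the agent moves forward 1..v cells with equal
# probability (v stands in for the maximum speed); the distribution is
# propagated by repeated multiplication with the transition matrix.
n_states = 21                         # discrete positions 0..20
v = 2                                 # maximum step per time step
T = np.zeros((n_states, n_states))    # transition matrix
for i in range(n_states):
    for s in range(1, v + 1):
        j = min(i + s, n_states - 1)  # absorb at the last position
        T[i, j] += 1.0 / v

p = np.zeros(n_states)
p[0] = 1.0                            # agent starts at position 0
for _ in range(5):                    # five time steps
    p = p @ T
# p now gives the probability of finding the agent at each position
```

After five steps with v = 2, the distribution is supported on positions 5 through 10, and a larger v spreads the distribution further, illustrating how the chain's variance depends on the movement speed.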
Mesh refinement for uncertainty quantification through model reduction
International Nuclear Information System (INIS)
Li, Jing; Stinis, Panos
2015-01-01
We present a novel way of deciding when and where to refine a mesh in probability space in order to facilitate uncertainty quantification in the presence of discontinuities in random space. A discontinuity in random space makes the application of generalized polynomial chaos expansion techniques prohibitively expensive. The reason is that for discontinuous problems, the expansion converges very slowly. An alternative to using higher terms in the expansion is to divide the random space in smaller elements where a lower degree polynomial is adequate to describe the randomness. In general, the partition of the random space is a dynamic process since some areas of the random space, particularly around the discontinuity, need more refinement than others as time evolves. In the current work we propose a way to decide when and where to refine the random space mesh based on the use of a reduced model. The idea is that a good reduced model can monitor accurately, within a random space element, the cascade of activity to higher degree terms in the chaos expansion. In turn, this facilitates the efficient allocation of computational resources to the areas of random space where they are more needed. For the Kraichnan–Orszag system, the prototypical system to study discontinuities in random space, we present theoretical results which show why the proposed method is sound and numerical results which corroborate the theory.
The role of uncertainty in supply chains under dynamic modeling
Directory of Open Access Journals (Sweden)
M. Fera
2017-01-01
Full Text Available The uncertainty in the supply chains (SCs) of manufacturing and service firms is going to become, over the coming decades, more important for companies that are called to compete in a newly globalized economy. Risky situations for manufacturing are considered in identifying the optimal positioning of the order penetration point (OPP). This aims at defining the best level of information of the client's order going back through the several supply chain (SC) phases, i.e. engineering, procurement, production and distribution. This work aims at defining a system dynamics model to assess the competitiveness arising from the positioning of the order in different SC locations. A Taguchi analysis has been implemented to create a decision map for identifying possible strategic decisions under different scenarios and with alternatives for order location in the SC levels. Centralized and decentralized strategies for SC integration are discussed. In the model proposed, the location of the OPP is influenced by demand variation, production time, stock-outs and stock amount. The results of this research are as follows: (i) customer-oriented strategies are preferable under high volatility of demand; (ii) production-focused strategies are suggested when the probability of stock-outs is high; (iii) no specific location is preferable if a centralized control architecture is implemented; (iv) centralization requires cooperation among partners to achieve the SC optimum point; (v) the producer must not prefer the OPP location at the retailer level when the general strategy is focused on a decentralized approach.
Spatial uncertainty of a geoid undulation model in Guayaquil, Ecuador
Chicaiza, E. G.; Leiva, C. A.; Arranz, J. J.; Buenaño, X. E.
2017-06-01
Geostatistics is a discipline that deals with the statistical analysis of regionalized variables. In this case study, geostatistics is used to estimate geoid undulation in the rural area of Guayaquil, Ecuador. The geostatistical approach was chosen because it provides the estimation error of the prediction map. The open-source statistical software R, mainly the geoR, gstat and RGeostats libraries, was used. Exploratory data analysis (EDA) and trend and structural analyses were carried out. Automatic model fitting by iterative least squares and other fitting procedures were employed to fit the variogram. Finally, kriging using the Bouguer gravity anomaly as external drift and universal kriging were used to obtain a detailed map of geoid undulation. The estimation uncertainty fell in the interval [-0.5, +0.5] m for errors, with a maximum estimation standard deviation of 2 mm in relation to the interpolation method applied. The error distribution of the geoid undulation map obtained in this study provides a better result than the Earth gravitational models publicly available for the study area, according to a comparison with independent validation points. The main goal of this paper is to confirm the feasibility of combining geoid undulations from Global Navigation Satellite Systems, leveling field measurements and geostatistical techniques for use in high-accuracy engineering projects.
Evaluation of Spatial Uncertainties In Modeling of Cadastral Systems
Fathi, Morteza; Teymurian, Farideh
2013-04-01
Cadastre plays an essential role in sustainable development, especially in developing countries like Iran. A well-developed cadastre results in transparency of the real-estate tax system, transparency of estate data, a reduction of actions before the courts, and effective management of estates, natural resources and the environment. A multipurpose cadastre, through the gathering of other related data, has a vital role in civil, economic and social programs and projects. Iran has been implementing a cadastre for many years, but success in this program depends on correct geometric and descriptive data for estates. Since there are various sources of data with different accuracy and precision in Iran, difficulties and uncertainties exist in the modeling of the geometric part of the cadastre, such as inconsistency between the data in deeds and the cadastral map, which causes trouble in the execution of the cadastre and can result in the loss of national and natural resources and citizens' rights. At present there is no uniform and effective technical method for resolving such conflicts. This article describes various aspects of such conflicts in the geometric part of the cadastre and suggests a solution through modeling tools of GIS.
Conceptual model of water resources in the Kabul Basin, Afghanistan
Mack, Thomas J.; Akbari, M. Amin; Ashoor, M. Hanif; Chornack, Michael P.; Coplen, Tyler B.; Emerson, Douglas G.; Hubbard, Bernard E.; Litke, David W.; Michel, Robert L.; Plummer, Niel; Rezai, M. Taher; Senay, Gabriel B.; Verdin, James P.; Verstraeten, Ingrid M.
2010-01-01
The United States (U.S.) Geological Survey has been working with the Afghanistan Geological Survey and the Afghanistan Ministry of Energy and Water on water-resources investigations in the Kabul Basin under an agreement supported by the United States Agency for International Development. This collaborative investigation compiled, to the extent possible in a war-stricken country, a varied hydrogeologic data set and developed limited data-collection networks to assist with the management of water resources in the Kabul Basin. This report presents the results of a multidisciplinary water-resources assessment conducted between 2005 and 2007 to address questions of future water availability for a growing population and of the potential effects of climate change. Most hydrologic and climatic data-collection activities in Afghanistan were interrupted in the early 1980s as a consequence of war and civil strife and did not resume until 2003 or later. Because of the gap of more than 20 years in the record of hydrologic and climatic observations, this investigation has made considerable use of remotely sensed data and, where available, historical records to investigate the water resources of the Kabul Basin. Specifically, this investigation integrated recently acquired remotely sensed data and satellite imagery, including glacier and climatic data; recent climate-change analyses; recent geologic investigations; analysis of streamflow data; groundwater-level analysis; surface-water- and groundwater-quality data, including data on chemical and isotopic environmental tracers; and estimates of public-supply and agricultural water uses. The data and analyses were integrated by using a simplified groundwater-flow model to test the conceptual model of the hydrologic system and to assess current (2007) and future (2057) water availability. Recharge in the basin is spatially and temporally variable and generally occurs near streams and irrigated areas in the late winter and early
[Case management and complex chronic diseases: concepts, models, evidence and uncertainties].
Morales-Asencio, José Miguel
2014-01-01
Chronic diseases are the greatest challenge for health care, but the conventional health care models have failed noticeably. Nurses are one of the main providers of the services developed to tackle this challenge, with special emphasis on case management as one of the most common forms. However, one of the key problems is that case management is poorly conceptualized, which, together with the diversity of available experience, makes its development and comparative evaluation difficult. An in-depth review of the definition and concepts of case management is presented in this article, with a description of the models, ingredients and the effectiveness reported in various studies. The remaining uncertainties in case management, such as the heterogeneity of designs and target populations, the weak description of the components, and the scarce use of research models for complex interventions, are also discussed. Finally, some key factors for a successful implementation of case management are detailed, such as a clear definition of accountability and roles, the existence of support to guarantee the competence of case managers, the use of valid mechanisms for case finding, adjusted caseload, accessible and team-shared record systems, and the integration of health and social services. Copyright © 2013 Elsevier España, S.L. All rights reserved.
Model and parametric uncertainty in source-based kinematic models of earthquake ground motion
Hartzell, Stephen; Frankel, Arthur; Liu, Pengcheng; Zeng, Yuehua; Rahman, Shariftur
2011-01-01
Four independent ground-motion simulation codes are used to model the strong ground motion for three earthquakes: 1994 Mw 6.7 Northridge, 1989 Mw 6.9 Loma Prieta, and 1999 Mw 7.5 Izmit. These 12 sets of synthetics are used to make estimates of the variability in ground-motion predictions. In addition, ground-motion predictions over a grid of sites are used to estimate parametric uncertainty for changes in rupture velocity. We find that the combined model uncertainty and random variability of the simulations is in the same range as the variability of regional empirical ground-motion data sets. The majority of the standard deviations lie between 0.5 and 0.7 natural-log units for response spectra and 0.5 and 0.8 for Fourier spectra. The estimate of model epistemic uncertainty, based on the different model predictions, lies between 0.2 and 0.4, which is about one-half of the estimates for the standard deviation of the combined model uncertainty and random variability. Parametric uncertainty, based on variation of just the average rupture velocity, is shown to be consistent in amplitude with previous estimates, showing percentage changes in ground motion from 50% to 300% when rupture velocity changes from 2.5 to 2.9 km/s. In addition, there is some evidence that mean biases can be reduced by averaging ground-motion estimates from different methods.
Elshall, A. S.; Ye, M.; Niu, G. Y.; Barron-Gafford, G.
2014-12-01
A Bayesian framework is developed to quantify predictive uncertainty in environmental modeling caused by uncertainty in modeling scenarios, model structures, model parameters, and data. An example of using the framework to quantify model uncertainty is presented to simulate soil microbial respiration pulses in response to episodic rainfall pulses (the "Birch effect"). A total of five models are developed; they evolve from an existing four-carbon (C) pool model to models with additional C pools and recently developed models with explicit representations of soil moisture controls on C degradation and microbial uptake rates. Markov chain Monte Carlo (MCMC) methods with generalized likelihood function (not Gaussian) are used to estimate posterior parameter distributions of the models, and the posterior parameter samples are used to evaluate probabilities of the models. The models with explicit representations of soil moisture controls outperform the other models. The models with additional C pools for accumulation of degraded C in the dry zone of the soil pore space result in a higher probability of reproducing the observed Birch pulses. A cross-validation is conducted to explore predictive performance of model averaging and of individual models. The Bayesian framework is mathematically general and can be applied to a wide range of environmental problems.
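The model-probability update at the heart of such Bayesian model averaging can be sketched with placeholder numbers. The log marginal likelihoods and per-model predictions below are assumed values standing in for the MCMC-based quantities of the five soil-respiration models, not results from the paper.

```python
import numpy as np

# Sketch: posterior model probabilities and a model-averaged prediction
# for five competing models. p(M_k | D) is proportional to
# p(D | M_k) * p(M_k); the computation subtracts the maximum
# log-likelihood for numerical stability.
log_ml = np.array([-120.3, -118.7, -119.5, -125.0, -118.9])  # assumed log marginal likelihoods
prior = np.full(5, 0.2)                                      # equal prior model probabilities

w = log_ml - log_ml.max()
post = prior * np.exp(w)
post /= post.sum()            # posterior model probabilities, sum to 1

preds = np.array([1.1, 0.9, 1.0, 1.4, 0.95])  # assumed per-model predictions
avg = post @ preds            # model-averaged prediction
```

Models with clearly lower marginal likelihood (here the fourth) receive negligible weight, which mirrors how low-probability models can be screened out before averaging, as in the head section's variogram example.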
DEFF Research Database (Denmark)
Lindblom, Erik Ulfson; Ahlman, S.; Mikkelsen, Peter Steen
2011-01-01
A dynamic conceptual and lumped accumulation wash-off model (SEWSYS) is uncertainty-calibrated with Zn, Cu, Pb and Cd field data from an intensive, detailed monitoring campaign. We use the generalized likelihood uncertainty estimation (GLUE) technique in combination with the Metropolis algorithm, which allows identifying a range of behavioral model parameter sets. The small catchment size and nearness of the rain gauge justified excluding the hydrological model parameters from the uncertainty assessment. Uniform, closed prior distributions were heuristically specified for the dry and wet removal [...] 95% model prediction bounds. A positive correlation between the dry deposition and the dry (wind) removal rates was revealed, as well as a negative correlation between the wet removal (wash-off) rate and the ratio between the dry deposition and wind removal rates, which determines the maximum pool [...]
Abesser, Corinna; Hughes, Andrew; Boon, David
2017-04-01
Coastal dunes are delicate systems that are under threat from a variety of human and natural influences. Groundwater modelling can provide a better understanding of how these systems operate and can be a useful tool towards the effective management of a coastal dune system, e.g. through predicting impacts from climatic change, sea level rise and land use management. Because of their small size, typically 10-100 km2, models representing small dune aquifer systems are more sensitive to uncertainties in input data, model geometry and model parameterisation as well as to the availability of observational data. This study describes the development of a groundwater flow model for a small (8 km2) spit dune system, Braunton Burrows, on the southwest coast of England, UK. The system has been extensively studied and its hydrology is thought to be well understood. However, model development revealed a high degree of uncertainty relating to model structure (definition of model boundary conditions) and parameterisation (e.g., transmissivity distributions within the model domain). An iterative approach was employed, integrating (1) sensitivity analyses, (2) targeted field investigations and (3) Monte Carlo simulations within a cycle of repeated interrogation of the model outputs, observed data and conceptual understanding. Assessment of "soft information" and targeted field investigations were an important part of this iterative modelling process. For example, a passive seismic survey (TROMINO®) provided valuable new data for the characterisation of concealed bedrock topography and thickness of superficial deposits. The data confirmed a generally inclined underlying wave-cut rock shelf platform (as suggested by literature sources), revealed a buried valley, and led to a more detailed delineation of transmissivity zones within the model domain. Constructing models with increasingly complex spatial distributions of transmissivity resulted in considerable improvements in
Aerodynamic Modeling with Heterogeneous Data Assimilation and Uncertainty Quantification, Phase I
National Aeronautics and Space Administration — Clear Science Corp. proposes to develop an aerodynamic modeling tool that assimilates data from different sources and facilitates uncertainty quantification. The...
Li, Chang; Wang, Qing; Shi, Wenzhong; Zhao, Sisi
2018-05-01
The accuracy of earthwork calculations that compute terrain volume is critical to digital terrain analysis (DTA). The uncertainties in volume calculations (VCs) based on a DEM are primarily related to three factors: 1) model error (ME), which is caused by the algorithm adopted for the VC model; 2) discrete error (DE), which is usually caused by DEM resolution and terrain complexity; and 3) propagation error (PE), which is caused by errors in the variables. Based on these factors, the uncertainty modelling and analysis of VCs based on a regular grid DEM are investigated in this paper. In particular, a way to quantify the uncertainty of VCs through a confidence interval based on truncation error (TE) is proposed. In the experiments, the trapezoidal double rule (TDR) and Simpson's double rule (SDR) were used to calculate volume, where the TE is the major ME, and six simulated regular grid DEMs with different terrain complexity and resolution (i.e. DE) were generated from a Gauss synthetic surface to easily obtain the theoretical true value and eliminate the interference of data errors. For PE, Monte Carlo simulation techniques and spatial autocorrelation were used to represent DEM uncertainty. This study can enrich the theories of uncertainty modelling and analysis in geographic information science.
GCR Environmental Models III: GCR Model Validation and Propagated Uncertainties in Effective Dose
Slaba, Tony C.; Xu, Xiaojing; Blattnig, Steve R.; Norman, Ryan B.
2014-01-01
This is the last of three papers focused on quantifying the uncertainty associated with galactic cosmic rays (GCR) models used for space radiation shielding applications. In the first paper, it was found that GCR ions with Z>2 and boundary energy below 500 MeV/nucleon induce less than 5% of the total effective dose behind shielding. This is an important finding since GCR model development and validation have been heavily biased toward Advanced Composition Explorer/Cosmic Ray Isotope Spectrometer measurements below 500 MeV/nucleon. Weights were also developed that quantify the relative contribution of defined GCR energy and charge groups to effective dose behind shielding. In the second paper, it was shown that these weights could be used to efficiently propagate GCR model uncertainties into effective dose behind shielding. In this work, uncertainties are quantified for a few commonly used GCR models. A validation metric is developed that accounts for measurement uncertainty, and the metric is coupled to the fast uncertainty propagation method. For this work, the Badhwar-O'Neill (BON) 2010 and 2011 and the Matthia GCR models are compared to an extensive measurement database. It is shown that BON2011 systematically overestimates heavy ion fluxes in the range 0.5-4 GeV/nucleon. The BON2010 and BON2011 also show moderate and large errors in reproducing past solar activity near the 2000 solar maximum and 2010 solar minimum. It is found that all three models induce relative errors in effective dose in the interval [-20%, 20%] at a 68% confidence level. The BON2010 and Matthia models are found to have similar overall uncertainty estimates and are preferred for space radiation shielding applications.
Modeling Input Errors to Improve Uncertainty Estimates for Sediment Transport Model Predictions
Jung, J. Y.; Niemann, J. D.; Greimann, B. P.
2016-12-01
Bayesian methods using Markov chain Monte Carlo algorithms have recently been applied to sediment transport models to assess the uncertainty in the model predictions due to the parameter values. Unfortunately, the existing approaches can only attribute overall uncertainty to the parameters. This limitation is critical because no model can produce accurate forecasts if forced with inaccurate input data, even if the model is well founded in physical theory. In this research, an existing Bayesian method is modified to consider the potential errors in input data during the uncertainty evaluation process. The input error is modeled using Gaussian distributions, and the means and standard deviations are treated as uncertain parameters. The proposed approach is tested by coupling it to the Sedimentation and River Hydraulics - One Dimension (SRH-1D) model and simulating a 23-km reach of the Tachia River in Taiwan. The Wu equation in SRH-1D is used for computing the transport capacity for a bed material load of non-cohesive material. Three types of input data are considered uncertain: (1) the input flowrate at the upstream boundary, (2) the water surface elevation at the downstream boundary, and (3) the water surface elevation at a hydraulic structure in the middle of the reach. The benefits of modeling the input errors in the uncertainty analysis are evaluated by comparing the accuracy of the most likely forecast and the coverage of the observed data by the credible intervals to those of the existing method. The results indicate that the internal boundary condition has the largest uncertainty among those considered. Overall, the uncertainty estimates from the new method are notably different from those of the existing method for both the calibration and forecast periods.
Modelling sensitivity and uncertainty in a LCA model for waste management systems - EASETECH
DEFF Research Database (Denmark)
Damgaard, Anders; Clavreul, Julie; Baumeister, Hubert
2013-01-01
In the new model, EASETECH, developed for LCA modelling of waste management systems, a general approach for sensitivity and uncertainty assessment for waste management studies has been implemented. First, general contribution analysis is done through a regular interpretation of inventory and impact...
In this paper, the Genetic Algorithms (GA) and Bayesian model averaging (BMA) were combined to simultaneously conduct calibration and uncertainty analysis for the Soil and Water Assessment Tool (SWAT). In this hybrid method, several SWAT models with different structures are first selected; next GA i...
Fast uncertainty reduction strategies relying on Gaussian process models
International Nuclear Information System (INIS)
Chevalier, Clement
2013-01-01
This work deals with sequential and batch-sequential evaluation strategies of real-valued functions under a limited evaluation budget, using Gaussian process models. Optimal Stepwise Uncertainty Reduction (SUR) strategies are investigated for two different problems, motivated by real test cases in nuclear safety. First we consider the problem of identifying the excursion set above a given threshold T of a real-valued function f. Then we study the question of finding the set of 'safe controlled configurations', i.e. the set of controlled inputs where the function remains below T, whatever the value of some other non-controlled inputs. New SUR strategies are presented, together with efficient procedures and formulas to compute and use them in real-world applications. The use of fast formulas to quickly recalculate the posterior mean or covariance function of a Gaussian process (referred to as the 'kriging update formulas') does not only provide substantial computational savings. It is also one of the key tools to derive closed-form formulas enabling a practical use of computationally intensive sampling strategies. A contribution in batch-sequential optimization (with the multi-points Expected Improvement) is also presented. (author)
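The 'kriging update formulas' mentioned above can be illustrated with a toy Gaussian process: after conditioning on n points, the posterior at a prediction grid can be updated for one new observation without refactorizing the enlarged covariance matrix. The kernel, data, and jitter below are assumptions for the demo, not the SUR strategies themselves.

```python
import numpy as np

def k(a, b, ell=0.5):
    # squared-exponential kernel (an assumed choice for this sketch)
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell**2)

jitter = 1e-10
X = np.array([0.1, 0.4, 0.9])            # observed inputs (assumed)
y = np.sin(2 * np.pi * X)                # observed values, noise-free
Xs = np.linspace(0.0, 1.0, 50)           # prediction grid

K = k(X, X) + jitter * np.eye(len(X))
Ks = k(Xs, X)
alpha = np.linalg.solve(K, y)
mu = Ks @ alpha                                      # posterior (kriging) mean
cov = k(Xs, Xs) - Ks @ np.linalg.solve(K, Ks.T)      # posterior covariance

# Kriging update for one new observation (x_new, y_new), using only
# quantities already conditioned on the first n points.
x_new = np.array([0.6])
y_new = np.sin(2 * np.pi * 0.6)
kXn = k(X, x_new)                                    # prior cov, data vs new point
c_new = k(Xs, x_new) - Ks @ np.linalg.solve(K, kXn)  # posterior cov, grid vs new point
s_new = (k(x_new, x_new) - kXn.T @ np.linalg.solve(K, kXn))[0, 0] + jitter
mu_upd = mu + (c_new[:, 0] / s_new) * (y_new - kXn.T @ alpha)[0]
cov_upd = cov - np.outer(c_new[:, 0], c_new[:, 0]) / s_new
```

The updated mean and covariance agree with a full refit on the enlarged data set, which is why such rank-one updates make computationally intensive sequential sampling strategies practical.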
Greenhouse Gas Source Attribution: Measurements Modeling and Uncertainty Quantification
Energy Technology Data Exchange (ETDEWEB)
Liu, Zhen [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Safta, Cosmin [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Sargsyan, Khachik [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Najm, Habib N. [Sandia National Lab. (SNL-CA), Livermore, CA (United States); van Bloemen Waanders, Bart Gustaaf [Sandia National Lab. (SNL-CA), Livermore, CA (United States); LaFranchi, Brian W. [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Ivey, Mark D. [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Schrader, Paul E. [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Michelsen, Hope A. [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Bambha, Ray P. [Sandia National Lab. (SNL-CA), Livermore, CA (United States)
2014-09-01
In this project we have developed atmospheric measurement capabilities and a suite of atmospheric modeling and analysis tools well suited to verifying emissions of greenhouse gases (GHGs) on an urban-through-regional scale. We have for the first time applied the Community Multiscale Air Quality (CMAQ) model to simulate atmospheric CO_{2}. This will allow examination of the regional-scale transport and distribution of CO_{2}, along with air pollutants traditionally studied using CMAQ, at relatively high spatial and temporal resolution, with the goal of leveraging emissions verification efforts for both air quality and climate. We have developed a bias-enhanced Bayesian inference approach that can remedy the well-known problem of transport model errors in atmospheric CO_{2} inversions. We have tested the approach using data and model outputs from the TransCom3 global CO_{2} inversion comparison project. We have also performed two prototyping studies on inversion approaches in the generalized convection-diffusion context. One of these studies employed Polynomial Chaos Expansion to accelerate the evaluation of a regional transport model and enable efficient Markov Chain Monte Carlo sampling of the posterior for Bayesian inference. The other approach uses deterministic inversion of a convection-diffusion-reaction system in the presence of uncertainty. These approaches should, in principle, be applicable to realistic atmospheric problems with moderate adaptation. We outline a regional greenhouse gas source inference system that integrates (1) two approaches to atmospheric dispersion simulation and (2) a class of Bayesian inference and uncertainty quantification algorithms. We use two different and complementary approaches to simulate atmospheric dispersion. Specifically, we use an Eulerian chemical transport model, CMAQ, and a Lagrangian particle dispersion model, FLEXPART-WRF. These two models share the same WRF
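The surrogate idea behind the Polynomial Chaos acceleration can be sketched in a few lines: an expensive forward model of a standard-normal input is replaced by a low-degree expansion in probabilists' Hermite polynomials, fitted by regression, so that MCMC likelihood evaluations become cheap dot products. The forward model below is a hypothetical stand-in, not the regional transport code.

```python
import numpy as np
from numpy.polynomial.hermite_e import hermevander

# Hypothetical "expensive" forward model of a standard-normal input
def forward(theta):
    return np.exp(0.3 * theta) + 0.1 * theta**2

# Build a degree-4 PCE surrogate by least-squares regression on samples
rng = np.random.default_rng(1)
thetas = rng.standard_normal(200)
A = hermevander(thetas, 4)                    # probabilists' Hermite basis
coeffs, *_ = np.linalg.lstsq(A, forward(thetas), rcond=None)

def surrogate(theta):
    # Surrogate evaluation: one Vandermonde row times the coefficients
    return hermevander(np.atleast_1d(theta), 4) @ coeffs

err = np.max(np.abs(surrogate(thetas) - forward(thetas)))
print(err < 0.05)  # prints True: the surrogate tracks the model closely
```

Inside an MCMC loop, `surrogate` would replace `forward` in the likelihood, which is where the claimed acceleration comes from; the approach generalizes to multivariate inputs with tensorized bases.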
A Scoping Review: Conceptualizations and Pedagogical Models of Learning in Nursing Simulation
Poikela, Paula; Teräs, Marianne
2015-01-01
Simulations have been implemented globally in nursing education for years with diverse conceptual foundations. The aim of this scoping review is to examine the literature regarding the conceptualizations of learning and pedagogical models in nursing simulations. A scoping review of peer-reviewed articles published between 2000 and 2013 was…
A conceptual modeling framework for discrete event simulation using hierarchical control structures.
Furian, N; O'Sullivan, M; Walker, C; Vössner, S; Neubacher, D
2015-08-01
Conceptual Modeling (CM) is a fundamental step in a simulation project. Nevertheless, it is only recently that structured approaches towards the definition and formulation of conceptual models have gained importance in the Discrete Event Simulation (DES) community. As a consequence, frameworks and guidelines for applying CM to DES have emerged and discussion of CM for DES is increasing. However, both the organization of model components and the identification of behavior and system control from standard CM approaches have shortcomings that limit CM's applicability to DES. Therefore, we discuss the different aspects of previous CM frameworks and identify their limitations. Further, we present the Hierarchical Control Conceptual Modeling framework, which pays more attention to the identification of a model's system behavior, control policies and dispatching routines, and their structured representation within a conceptual model. The framework guides the user step-by-step through the modeling process and is illustrated by a worked example.
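The separation of control policies and dispatching routines from model components can be illustrated with a minimal single-server DES. This sketch is illustrative only and does not reproduce the cited framework's notation: the dispatching rule is passed in as a separate object, so swapping the control policy changes behavior without touching the event machinery.

```python
import heapq

# Dispatching routines, kept separate from the simulation mechanics
def fifo_policy(queue):
    return queue.pop(0)   # first-come-first-served

def lifo_policy(queue):
    return queue.pop()    # last-come-first-served

def simulate(arrivals, service_time, policy):
    # Single-server queue: event list holds (time, kind, job) tuples
    events = [(t, "arrive", j) for j, t in enumerate(arrivals)]
    heapq.heapify(events)
    queue, server_free, finish = [], True, {}
    while events:
        t, kind, j = heapq.heappop(events)
        if kind == "arrive":
            queue.append(j)
        else:                     # "depart"
            finish[j] = t
            server_free = True
        if server_free and queue:
            server_free = False   # the policy alone decides who is served next
            heapq.heappush(events, (t + service_time, "depart", policy(queue)))
    return finish

print(simulate([0, 1, 2], 2.0, fifo_policy))  # {0: 2.0, 1: 4.0, 2: 6.0}
print(simulate([0, 1, 2], 2.0, lifo_policy))  # {0: 2.0, 2: 4.0, 1: 6.0}
```

Making the control policy an explicit, named component of the conceptual model, rather than burying it in the event-handling code, is the kind of structured representation the framework argues for.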
Parameters-related uncertainty in modeling sugar cane yield with an agro-Land Surface Model
Valade, A.; Ciais, P.; Vuichard, N.; Viovy, N.; Ruget, F.; Gabrielle, B.
2012-12-01
Agro-Land Surface Models (agro-LSMs) have been developed by coupling specific crop models with large-scale generic vegetation models. They aim to account for the spatial distribution and variability of energy, water and carbon fluxes within the soil-vegetation-atmosphere continuum, with a particular emphasis on how crop phenology and agricultural management practices influence the turbulent fluxes exchanged with the atmosphere and the underlying water and carbon pools. Part of the uncertainty in these models is related to the many parameters included in the models' equations. In this study, we quantify the parameter-based uncertainty in the simulation of sugar cane biomass production with the agro-LSM ORCHIDEE-STICS, using a multi-regional approach with data from sites in Australia, La Réunion and Brazil. First, the main sources of uncertainty for the output variables NPP, GPP, and sensible heat flux (SH) are determined through a screening of the main parameters of the model on a multi-site basis, leading to the selection of a subset of the most sensitive parameters, which cause most of the uncertainty. In a second step, a sensitivity analysis is carried out on the parameters selected from the screening analysis at a regional scale, using a Monte Carlo sampling method combined with the calculation of Partial Rank Correlation Coefficients (PRCC). We first quantify the sensitivity of the output variables to individual input parameters on a regional scale for two regions of intensive sugar cane cultivation in Australia and Brazil, and then quantify the overall uncertainty in the simulation outputs propagated from the uncertainty in the input parameters. Seven parameters are identified by the screening procedure as driving most of the uncertainty in the agro-LSM ORCHIDEE-STICS model output at all sites. These parameters control photosynthesis (optimal temperature of photosynthesis, optimal carboxylation rate), radiation interception (extinction coefficient), root
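The PRCC step described above can be sketched generically: rank-transform the Monte Carlo inputs and output, regress out the other parameters from both sides, and correlate the residuals. The toy model below (one strong driver, one weak, one inert) is illustrative, not ORCHIDEE-STICS.

```python
import numpy as np

def ranks(a):
    # Rank-transform each column (0..n-1); ties ignored for simplicity
    return np.argsort(np.argsort(a, axis=0), axis=0).astype(float)

def prcc(X, y):
    # Partial rank correlation of each input column with the output y
    R, r = ranks(X), ranks(y[:, None]).ravel()
    out = []
    for j in range(X.shape[1]):
        others = np.column_stack([np.ones(len(r)), np.delete(R, j, axis=1)])
        # Residuals after removing the linear effect of the other inputs
        res_x = R[:, j] - others @ np.linalg.lstsq(others, R[:, j], rcond=None)[0]
        res_y = r - others @ np.linalg.lstsq(others, r, rcond=None)[0]
        out.append(np.corrcoef(res_x, res_y)[0, 1])
    return np.array(out)

rng = np.random.default_rng(2)
X = rng.uniform(size=(500, 3))                            # Monte Carlo sample
y = 5 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 0.1, 500)  # X[:, 2] is inert
s = prcc(X, y)
print(s[0] > s[1] > abs(s[2]))  # sensitivities recover the true ordering
```

In a screening workflow, parameters whose |PRCC| falls below a chosen cutoff would be fixed at nominal values before the full uncertainty propagation.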
Impact on Model Uncertainty of Diabatization in Distillation Columns
DEFF Research Database (Denmark)
Bisgaard, Thomas; Huusom, Jakob Kjøbsted; Abildskov, Jens
2014-01-01
This work provides uncertainty and sensitivity analysis of the design of conventional and heat-integrated distillation columns using Monte Carlo simulations. Selected uncertain parameters are relative volatility, heat of vaporization, the overall heat transfer coefficient, tray hold-up, and adiabat...
Statistical approach for uncertainty quantification of experimental modal model parameters
DEFF Research Database (Denmark)
Luczak, M.; Peeters, B.; Kahsin, M.
2014-01-01
estimates obtained from vibration experiments. Modal testing results are influenced by numerous factors that introduce uncertainty into the measurement results. Different experimental techniques applied to the same test item, or testing numerous nominally identical specimens, yield different test results...
Quantification of Uncertainties in Integrated Spacecraft System Models, Phase I
National Aeronautics and Space Administration — The proposed effort is to investigate a novel uncertainty quantification (UQ) approach based on non-intrusive polynomial chaos (NIPC) for computationally efficient...
An Efficient Deterministic Approach to Model-based Prediction Uncertainty
National Aeronautics and Space Administration — Prognostics deals with the prediction of the end of life (EOL) of a system. EOL is a random variable, due to the presence of process noise and uncertainty in the...
Quantification of Uncertainties in Integrated Spacecraft System Models, Phase II
National Aeronautics and Space Administration — The objective for the Phase II effort will be to develop a comprehensive, efficient, and flexible uncertainty quantification (UQ) framework implemented within a...