WorldWideScience

Sample records for include formal uncertainties

  1. Uncertainty and Reconfigurability in Hilbertian Formal Methods

    NARCIS (Netherlands)

    Bujorianu, M.C.; Bujorianu, L.M.

    Hilbertian Formal Methods is a recently introduced paradigm for embedded systems operating in harsh physical environments. This paradigm has so far been developed mainly for the deterministic case. However, it is very rare that a physical environment follows a deterministic rule precisely, and then it is more…

  2. Uncertainty Analysis of the Estimated Risk in Formal Safety Assessment

    Directory of Open Access Journals (Sweden)

    Molin Sun

    2018-01-01

    An uncertainty analysis is required to be carried out in formal safety assessment (FSA) by the International Maritime Organization. The purpose of this article is to introduce the uncertainty analysis technique into the FSA process. Based on the uncertainty identification of input parameters, probability and possibility distributions are used to model the aleatory and epistemic uncertainties, respectively. An approach which combines the Monte Carlo random sampling of probability distribution functions with the α-cuts for fuzzy calculus is proposed to propagate the uncertainties. One output of the FSA process is societal risk (SR), which can be evaluated in the two-dimensional frequency–fatality (FN) diagram. Thus, the confidence-level-based SR is presented to represent the uncertainty of SR in two dimensions. In addition, a method for time window selection is proposed to estimate the magnitude of uncertainties, which is an important aspect of modeling uncertainties. Finally, a case study is carried out on an FSA study on cruise ships. The results show that the uncertainty analysis of SR generates a two-dimensional area for a certain degree of confidence in the FN diagram rather than a single FN curve, which provides more information to authorities to produce effective risk control measures.
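
As a concrete illustration of the hybrid propagation scheme described in this abstract (Monte Carlo sampling for the aleatory input combined with α-cuts for the fuzzy, epistemic input), here is a minimal Python sketch. The risk function, distributions and triangular fuzzy number are illustrative assumptions, not taken from the article:

```python
import numpy as np

def propagate(f, sample_aleatory, epistemic_cut, n_mc=1000, alphas=np.linspace(0, 1, 11)):
    """Hybrid propagation: Monte Carlo over the aleatory input,
    interval (alpha-cut) analysis over the epistemic input.
    Returns lower/upper output bounds per alpha level and MC draw."""
    lows, highs = [], []
    for alpha in alphas:
        e_lo, e_hi = epistemic_cut(alpha)   # interval of the fuzzy variable at this alpha
        x = sample_aleatory(n_mc)           # random draws of the aleatory variable
        # f is assumed monotone in the epistemic argument, so the output
        # interval is attained at the interval endpoints.
        lows.append(f(x, e_lo))
        highs.append(f(x, e_hi))
    return alphas, np.array(lows), np.array(highs)

# Toy example: frequency (aleatory, lognormal) x consequence factor (epistemic, fuzzy)
rng = np.random.default_rng(1)
f = lambda freq, factor: freq * factor
sample_freq = lambda n: rng.lognormal(mean=-6.0, sigma=0.5, size=n)
# Triangular fuzzy number (8, 10, 13): the alpha-cut shrinks linearly toward the core.
cut = lambda a: (8 + 2 * a, 13 - 3 * a)

alphas, lo, hi = propagate(f, sample_freq, cut)
print("alpha=1 core: median risk in [%.2e, %.2e]" % (np.median(lo[-1]), np.median(hi[-1])))
```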

  3. Universal uncertainty principle in the measurement operator formalism

    International Nuclear Information System (INIS)

    Ozawa, Masanao

    2005-01-01

    Heisenberg's uncertainty principle has been understood to set a limitation on measurements; however, the long-standing mathematical formulation established by Heisenberg, Kennard, and Robertson does not allow such an interpretation. Recently, a new relation was found that gives a universally valid trade-off between noise and disturbance in general quantum measurements, and it has become clear that this new relation plays the role of a first principle from which various quantum limits on measurement and information processing can be derived in a unified treatment. This paper examines the above development on the noise-disturbance uncertainty principle in the model-independent approach based on the measurement operator formalism, which is widely accepted to describe a class of generalized measurements in the field of quantum information. We obtain explicit formulae for the noise and disturbance of measurements given by measurement operators, and show that projective measurements do not satisfy the Heisenberg-type noise-disturbance relation that is typical of the gamma-ray microscope thought experiment. We also show that the disturbance on a Pauli operator caused by a projective measurement of another Pauli operator constantly equals √2, and examine how this measurement violates the Heisenberg-type relation but satisfies the new noise-disturbance relation.
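
For readers unfamiliar with the Kennard–Robertson formulation mentioned above, the bound σ(A)σ(B) ≥ ½|⟨[A,B]⟩| is easy to check numerically for Pauli operators; the snippet below does so for random qubit states. It illustrates only the textbook relation, not the paper's noise-disturbance formalism:

```python
import numpy as np

# Pauli operators
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]])

def stddev(op, psi):
    mean = np.vdot(psi, op @ psi).real
    mean_sq = np.vdot(psi, op @ op @ psi).real
    return np.sqrt(mean_sq - mean**2)

rng = np.random.default_rng(0)
for _ in range(5):
    psi = rng.normal(size=2) + 1j * rng.normal(size=2)
    psi /= np.linalg.norm(psi)                 # random normalized qubit state
    lhs = stddev(X, psi) * stddev(Y, psi)
    comm = X @ Y - Y @ X                       # [X, Y] = 2iZ
    rhs = 0.5 * abs(np.vdot(psi, comm @ psi))  # Robertson lower bound
    assert lhs >= rhs - 1e-12
    print(f"sigma(X)sigma(Y) = {lhs:.4f} >= {rhs:.4f}")
```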

  4. [Contracts including performance and management of uncertainty].

    Science.gov (United States)

    Duru, G; Garassus, P; Auray, J-P

    2013-09-01

    For several decades in France, drug consumption has represented the largest share of ambulatory health care expenditure; French patients are indeed among the world's greatest consumers of pharmaceutical treatments. The regulatory authorities have therefore attempted, through successive strategies, to limit or even restrict market access for new drugs in the health care sector covered by public social insurance. Common objectives are to tie reimbursement to scientific studies and to fix the price of therapeutics at a level acceptable to both industry and government. A recent trend is to determine the drug price in a dual approach, as a component of a global and effective contract including performance and outcome. Not only is the first marketing authorization concerned: this concept also takes into account the eventual success of new products over a long-term survey. Signed for a fixed period as a reciprocal partnership between the regulatory authorities and the pharmaceutical industry, the contract integrates two dimensions of uncertainty. The first is the strategy of developing new treatments according to efficacy and an adapted price, and the second is linked to the result of diffusion and determines adapted rules if non-compliance with the previous commitments is registered. This paper discusses problems related to this new dimension of uncertainty introduced by conditional drug prices in market-access strategy, and the associated follow-up of new treatment diffusion fixed by an "outcome" contract between the French regulatory administration and the pharmaceutical industry in our recent economic context. Copyright © 2013 Elsevier Masson SAS. All rights reserved.

  5. Formal handling of the level 2 uncertainty sources and their combination with the level 1 PSA uncertainties

    International Nuclear Information System (INIS)

    Ahn, K.I.; Yang, Joon-Eon; Ha, Jaejoo

    2007-01-01

    As an essential part of the Level 2 PSA, a probabilistic treatment of complex phenomenological accident pathways inevitably leads to two sources of uncertainty: (a) an incomplete modeling of these accident pathways and their subsequent impacts on the Level 2 risk, and (b) an expert-to-expert variation in their probabilistic estimates. While the former type of uncertainty is epistemic in nature, in the sense that it concerns uncertainty addressed in deterministic events, the latter type is a random/aleatory uncertainty. The impacts of these sources of uncertainty on the Level 2 risk measures differ from each other, and can thus lead to different conclusions in the decision-making process. One important aspect of the foregoing distinction is that it plays an essential role in identifying which sources of uncertainty have the greater impact on the Level 2 risk and which of them should be focused on for a greater reduction of the overall Level 2 uncertainty. Another important aspect is its role in combining the Level 1 and Level 2 uncertainties. A primary objective of this paper is to explore the aforementioned sources of Level 2 uncertainty and provide corresponding approaches for handling them formally. An additional purpose is to provide an approach for consistently combining the aleatory uncertainties addressed in the Level 1 PSA with the Level 2 epistemic uncertainties, so that the Level 1 and 2 uncertainties are finally represented as an integrated measure. (authors)
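
One common way to realize such an "integrated measure" combining Level 1 aleatory and Level 2 epistemic uncertainties is a nested (two-loop) Monte Carlo. The sketch below assumes toy distributions and is a generic illustration, not the paper's approach:

```python
import numpy as np

# Two-loop (nested) Monte Carlo: the outer loop samples the epistemic state
# of knowledge (expert-to-expert variation), the inner loop samples the
# aleatory variability. Distributions are illustrative only.
rng = np.random.default_rng(9)
n_epi, n_ale = 200, 2000

risks = np.empty(n_epi)
for i in range(n_epi):
    p_phenom = rng.beta(2.0, 8.0)   # epistemic: phenomenological event probability
    # aleatory: initiating-event frequency (per year)
    freq = rng.lognormal(mean=np.log(1e-5), sigma=0.8, size=n_ale)
    risks[i] = np.mean(freq * p_phenom)   # mean release frequency for this state

print("epistemic 5th-95th percentile of risk: [%.2e, %.2e]"
      % tuple(np.percentile(risks, [5, 95])))
```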

  6. Formalized representation of isoseismal uncertainty for Italian earthquakes

    International Nuclear Information System (INIS)

    Kronrod, T.L.; Molchan, G.M.; Podgaetskaya, V.M.; Panza, G.F.

    2001-11-01

    A unique macroseismic data base, in the form of Intensity Data Point (IDP) maps, is available for some of the seismic events which have occurred in Italy since 461 B.C. The problem of the reconstruction of the isoseismal shape imposes rigorous requirements on the quality of intensity data. We present here a collection of 55 IDP maps in isolines which, to various degrees, meet our quality requirements for analyzing the shape problem. Our generalization of an IDP map in isolines is unconventional, and is based on the new approach developed by Molchan et al. (2001). The methodology, along with the smoothing method for IDP maps (Modified Polynomial Filtering, MPF), uses the Diffused Boundary (DB) method, which visualizes the uncertainty of isoseismal boundaries. We define a quantitative measure of the isoseismal uncertainty and classify the isoseismals for all selected events. The presented collection of isoseismals can be used for comparison between observed and theoretical isoseismals. This comparison involves simultaneous testing of crustal and source models. The collection gives a sufficiently accurate idea of the quality of isoseismals of Italian earthquakes; it can also be used for local spatial control of the intensity measurements. (author)

  7. Formalized representation of isoseismal uncertainty for Italian earthquakes

    International Nuclear Information System (INIS)

    Kronrod, T.L.; Molchan, G.M.; Podgaetskaya, V.M.; Panza, G.F.

    2000-11-01

    A unique macroseismic (MS) data base, in the form of Intensity Data Point (IDP) maps, is available for seismic events which have occurred in Italy since 461 B.C. The problem of the reconstruction of the isoseismal shape imposes serious requirements on the quality of MS data. Here we present a set of 70 IDP maps in isolines which, to various degrees, meet our quality requirements for analyzing the shape problem. Our generalization of an IDP map in isolines is unconventional, and is based on the new approach developed by Molchan et al. (2000). The methodology, along with the smoothing method for IDP maps (Modified Polynomial Filtering, MPF), uses the Diffused Boundary (DB) method, which visualizes the uncertainty of isoseismal boundaries. All 70 IDP maps treated by the MPF and DB methods are presented in Appendix B. We define a quantitative measure of isoseismal uncertainty and classify the isoseismals for all selected events. The processed IDP maps can serve not only as raw data for examining the shapes of isoseismals but also provide an independent assessment of the quality of MS data for Italian earthquakes. (author)

  8. A formal treatment of uncertainty sources in a level 2 PSA

    International Nuclear Information System (INIS)

    Ahn, Kwang Il; Yang, Joon Eon

    2003-01-01

    The methodological framework of the Level 2 PSA appears to be currently standardized in a formalized fashion, but there have been different opinions on the way the sources of uncertainty are characterized and treated. This is primarily because the Level 2 PSA deals with complex phenomenological processes that are deterministic in nature rather than random processes, and there are no probabilistic models characterizing them clearly. As a result, the probabilistic quantification of the Level 2 PSA is often subject to two sources of uncertainty: (a) incomplete modeling of accident pathways or different predictions for the behavior of phenomenological events, and (b) expert-to-expert variation in estimating the occurrence probability of phenomenological events. While a clear definition of the two sources of uncertainty involved in the Level 2 PSA makes it possible to treat each uncertainty in a consistent manner, careless application of these different sources of uncertainty may produce different conclusions in the decision-making process. The primary purpose of this paper is to characterize typical sources of uncertainty that are often addressed in the Level 2 PSA and their impacts on the Level 2 risk results. An additional purpose is to give a formal approach for combining the random uncertainties addressed in the Level 1 PSA with the subjectivistic uncertainties addressed in the Level 2 PSA.

  9. Cosmological constraints on the neutrino mass including systematic uncertainties

    Science.gov (United States)

    Couchot, F.; Henrot-Versillé, S.; Perdereau, O.; Plaszczynski, S.; Rouillé d'Orfeuil, B.; Spinelli, M.; Tristram, M.

    2017-10-01

    When combining cosmological and oscillation results to constrain the neutrino sector, the question of the propagation of systematic uncertainties is often raised. We address this issue in the context of the derivation of an upper bound on the sum of the neutrino masses (Σmν) with recent cosmological data. This work is performed within the ΛCDM model extended to Σmν, for which we advocate the use of three mass-degenerate neutrinos. We focus on the study of systematic uncertainties linked to the foregrounds modelling in cosmic microwave background (CMB) data analysis, and on the impact of the present knowledge of the reionisation optical depth. This is done through the use of different likelihoods built from Planck data. Limits on Σmν are derived with various combinations of data, including the latest baryon acoustic oscillations (BAO) and Type Ia supernovae (SNIa) results. We also discuss the impact of the preference of current CMB data for amplitudes of the gravitational lensing distortions higher than expected within the ΛCDM model, and add the Planck CMB lensing. We then derive a robust upper limit on Σmν. The impact on the other cosmological parameters is also reported, for different assumptions on the neutrino mass repartition, and for different high- and low-multipole CMB likelihoods.

  10. Formal Uncertainty and Dispersion of Single and Double Difference Models for GNSS-Based Attitude Determination.

    Science.gov (United States)

    Chen, Wen; Yu, Chao; Dong, Danan; Cai, Miaomiao; Zhou, Feng; Wang, Zhiren; Zhang, Lei; Zheng, Zhengqi

    2017-02-20

    With multi-antenna synchronized global navigation satellite system (GNSS) receivers, the single difference (SD) between two antennas is able to eliminate both satellite and receiver clock errors, so it becomes necessary to reconsider the equivalency problem between the SD and double difference (DD) models. In this paper, we quantitatively compared the formal uncertainties and dispersions between multiple SD models and the DD model, and also carried out static and kinematic short-baseline experiments. The theoretical and experimental results show that under a non-common clock scheme the SD and DD models are equivalent. Under a common clock scheme, if we estimate stochastic uncalibrated phase delay (UPD) parameters every epoch, this SD model is still equivalent to the DD model; but if we estimate only one UPD parameter for all epochs or take it as a known constant, the SD (here called SD2) and DD models are no longer equivalent. For the vertical component of baseline solutions, the formal uncertainties of the SD2 model are a factor of two smaller than those of the DD model, and the dispersions of the SD2 model are more than a factor of two smaller than those of the DD model. In addition, to obtain baseline solutions, the SD2 model requires a minimum of three satellites, while the DD model requires a minimum of four, which makes SD2 more advantageous for attitude determination in sheltered environments.
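
In this context, the "formal uncertainty" is the a priori standard deviation propagated from the covariance of the least-squares solution. A generic sketch (toy design matrix and weights, not the paper's SD/DD observation models):

```python
import numpy as np

def formal_uncertainty(A, P, sigma0=1.0):
    """Formal (a priori) standard deviations of weighted least-squares
    parameters: sqrt of the diagonal of sigma0^2 * (A^T P A)^(-1)."""
    Q = np.linalg.inv(A.T @ P @ A)   # cofactor matrix of the parameters
    return sigma0 * np.sqrt(np.diag(Q))

# Toy single-epoch example: 5 observations, 3 unknown baseline components.
rng = np.random.default_rng(2)
A = rng.normal(size=(5, 3))   # design matrix (e.g. unit line-of-sight vectors)
P = np.eye(5)                 # equal observation weights
print(formal_uncertainty(A, P, sigma0=0.003))   # illustrative 3 mm phase noise
```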

  11. 75 FR 16514 - Bayer Material Science, LLC, Formerly Known as Sheffield Plastics, Including On-Site Leased...

    Science.gov (United States)

    2010-04-01

    ... Employment and Training Administration Bayer Material Science, LLC, Formerly Known as Sheffield Plastics... Material Science, LLC, formerly known as Sheffield Plastics, including on-site leased workers from... that Bayer Material Science, LLC was formerly known as Sheffield Plastics. Some workers separated from...

  12. Improved best estimate plus uncertainty methodology, including advanced validation concepts, to license evolving nuclear reactors

    International Nuclear Information System (INIS)

    Unal, C.; Williams, B.; Hemez, F.; Atamturktur, S.H.; McClure, P.

    2011-01-01

    Research highlights: → The best estimate plus uncertainty (BEPU) methodology is one option in the licensing of nuclear reactors. → The challenges in extending the BEPU method to fuel qualification for an advanced reactor fuel are primarily driven by schedule, the need for data, and the sufficiency of the data. → In this paper we develop an extended BEPU methodology that can potentially be used to address these new challenges in the design and licensing of advanced nuclear reactors. → The main components of the proposed methodology are verification, validation, calibration, and uncertainty quantification. → The methodology includes a formalism to quantify an adequate level of validation (predictive maturity) with respect to existing data, so that required new testing can be minimized, saving cost by demonstrating that further testing will not enhance the quality of the predictive tools. - Abstract: Many evolving nuclear energy technologies use advanced predictive multiscale, multiphysics modeling and simulation (M&S) capabilities to reduce the cost and schedule of design and licensing. Historically, experiments served as the primary tool for the design and understanding of nuclear system behavior, while M&S played the subordinate role of supporting experiments. In the new era of multiscale, multiphysics computational-based technology development, this role has been reversed. Experiments will still be needed, but they will be performed at different scales to calibrate and validate the models, leading to predictive simulations for design and licensing. Minimizing the required number of validation experiments produces cost and time savings. The use of multiscale, multiphysics models introduces challenges in validating these predictive tools; traditional methodologies will have to be modified to address these challenges. This paper gives the basic aspects of a methodology that can potentially be used to address these new challenges in…
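
As a minimal illustration of the "calibration ... accomplished through data assimilation" component named in these highlights, here is a single-parameter Bayesian grid calibration against hypothetical measurements; the model, data, noise level and prior are all stand-ins, not from the paper:

```python
import numpy as np

# Infer one model parameter k from noisy observations, then report a
# best-estimate value with uncertainty. Everything below is hypothetical.
def model(k, t):
    return 1.0 - np.exp(-k * t)   # toy system response

t_obs = np.array([0.5, 1.0, 2.0, 4.0])
y_obs = np.array([0.38, 0.63, 0.85, 0.97])   # hypothetical measurements
sigma = 0.03                                  # assumed measurement noise

k_grid = np.linspace(0.1, 3.0, 500)
resid = y_obs - model(k_grid[:, None], t_obs)          # residuals for every k
loglik = -0.5 * np.sum((resid / sigma) ** 2, axis=1)   # Gaussian log-likelihood
post = np.exp(loglik - loglik.max())                   # flat prior on the grid
post /= np.trapz(post, k_grid)

k_mean = np.trapz(k_grid * post, k_grid)
k_std = np.sqrt(np.trapz((k_grid - k_mean) ** 2 * post, k_grid))
print(f"calibrated k = {k_mean:.3f} +/- {k_std:.3f}")
```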

  13. Improved best estimate plus uncertainty methodology including advanced validation concepts to license evolving nuclear reactors

    Energy Technology Data Exchange (ETDEWEB)

    Unal, Cetin [Los Alamos National Laboratory]; Williams, Brian [Los Alamos National Laboratory]; McClure, Patrick [Los Alamos National Laboratory]; Nelson, Ralph A [Idaho National Laboratory]

    2010-01-01

    Many evolving nuclear energy programs plan to use advanced predictive multi-scale multi-physics simulation and modeling capabilities to reduce cost and time from design through licensing. Historically, experiments were the primary tool for design and understanding of nuclear system behavior, while modeling and simulation played the subordinate role of supporting experiments. In the new era of multi-scale multi-physics computational based technology development, experiments will still be needed, but they will be performed at different scales to calibrate and validate models, leading to predictive simulations. The cost-saving goals of these programs will require us to minimize the required number of validation experiments. Utilization of more multi-scale multi-physics models introduces complexities in the validation of predictive tools, so traditional methodologies will have to be modified to address these arising issues. This paper lays out the basic aspects of a methodology that can potentially be used to address these new challenges in the design and licensing of evolving nuclear technology programs. The main components of the proposed methodology are verification, validation, calibration, and uncertainty quantification. An enhanced calibration concept is introduced and is accomplished through data assimilation. The goal is to enable best-estimate prediction of system behaviors in both normal and safety-related environments. Achieving this goal requires the additional steps of estimating the domain of validation and quantifying the uncertainties that allow for extension of results to areas of the validation domain that are not directly tested with experiments, which might include extension of the modeling and simulation (M&S) capabilities for application to full-scale systems. The new methodology suggests a formalism to quantify an adequate level of validation (predictive maturity) with respect to required selective data so that required testing can be minimized for cost…

  14. Improved best estimate plus uncertainty methodology including advanced validation concepts to license evolving nuclear reactors

    International Nuclear Information System (INIS)

    Unal, Cetin; Williams, Brian; McClure, Patrick; Nelson, Ralph A.

    2010-01-01

    Many evolving nuclear energy programs plan to use advanced predictive multi-scale multi-physics simulation and modeling capabilities to reduce cost and time from design through licensing. Historically, experiments were the primary tool for design and understanding of nuclear system behavior, while modeling and simulation played the subordinate role of supporting experiments. In the new era of multi-scale multi-physics computational based technology development, experiments will still be needed, but they will be performed at different scales to calibrate and validate models, leading to predictive simulations. The cost-saving goals of these programs will require us to minimize the required number of validation experiments. Utilization of more multi-scale multi-physics models introduces complexities in the validation of predictive tools, so traditional methodologies will have to be modified to address these arising issues. This paper lays out the basic aspects of a methodology that can potentially be used to address these new challenges in the design and licensing of evolving nuclear technology programs. The main components of the proposed methodology are verification, validation, calibration, and uncertainty quantification. An enhanced calibration concept is introduced and is accomplished through data assimilation. The goal is to enable best-estimate prediction of system behaviors in both normal and safety-related environments. Achieving this goal requires the additional steps of estimating the domain of validation and quantifying the uncertainties that allow for extension of results to areas of the validation domain that are not directly tested with experiments, which might include extension of the modeling and simulation (M&S) capabilities for application to full-scale systems. The new methodology suggests a formalism to quantify an adequate level of validation (predictive maturity) with respect to required selective data so that required testing can be minimized for…

  15. Uncertainty in Measurement: A Review of Monte Carlo Simulation Using Microsoft Excel for the Calculation of Uncertainties Through Functional Relationships, Including Uncertainties in Empirically Derived Constants

    Science.gov (United States)

    Farrance, Ian; Frenkel, Robert

    2014-01-01

    The Guide to the Expression of Uncertainty in Measurement (usually referred to as the GUM) provides the basic framework for evaluating uncertainty in measurement. The GUM, however, does not always provide clearly identifiable procedures suitable for medical laboratory applications, particularly when internal quality control (IQC) is used to derive most of the uncertainty estimates. The GUM modelling approach requires advanced mathematical skills for many of its procedures, but Monte Carlo simulation (MCS) can be used as an alternative for many medical laboratory applications. In particular, calculations for determining how uncertainties in the input quantities to a functional relationship propagate through to the output can be accomplished using a readily available spreadsheet such as Microsoft Excel. The MCS procedure uses algorithmically generated pseudo-random numbers which are then forced to follow a prescribed probability distribution. When IQC data provide the uncertainty estimates, the normal (Gaussian) distribution is generally considered appropriate, but MCS is by no means restricted to this particular case. With input variations simulated by random numbers, the functional relationship then provides the corresponding variations in the output in a manner which also provides its probability distribution. The MCS procedure thus provides output uncertainty estimates without the need for the differential equations associated with GUM modelling. The aim of this article is to demonstrate the ease with which Microsoft Excel (or a similar spreadsheet) can be used to provide an uncertainty estimate for measurands derived through a functional relationship. In addition, we also consider the relatively common situation where an empirically derived formula includes one or more ‘constants’, each of which has an empirically derived numerical value. Such empirically derived ‘constants’ must also have associated uncertainties which propagate through the functional relationship.
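
A minimal version of the spreadsheet procedure described here, written in Python for compactness: propagate an input quantity and an uncertain empirical "constant" through a functional relationship by Monte Carlo. All numerical values are illustrative:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Input quantity from IQC data: normal with toy mean 5.2 and SD 0.15
x = rng.normal(5.2, 0.15, n)
# Empirically derived "constant" with its own uncertainty
k = rng.normal(1.8, 0.05, n)

y = k * x**2   # the functional relationship, here y = k * x^2

# The sampled output carries its full probability distribution, so the
# standard uncertainty and a coverage interval come out directly.
print(f"y = {y.mean():.2f}, u(y) = {y.std(ddof=1):.2f}")
print("95% interval: [%.2f, %.2f]" % tuple(np.percentile(y, [2.5, 97.5])))
```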

  16. Uncertainty in measurement: a review of Monte Carlo simulation using Microsoft Excel for the calculation of uncertainties through functional relationships, including uncertainties in empirically derived constants.

    Science.gov (United States)

    Farrance, Ian; Frenkel, Robert

    2014-02-01

    The Guide to the Expression of Uncertainty in Measurement (usually referred to as the GUM) provides the basic framework for evaluating uncertainty in measurement. The GUM, however, does not always provide clearly identifiable procedures suitable for medical laboratory applications, particularly when internal quality control (IQC) is used to derive most of the uncertainty estimates. The GUM modelling approach requires advanced mathematical skills for many of its procedures, but Monte Carlo simulation (MCS) can be used as an alternative for many medical laboratory applications. In particular, calculations for determining how uncertainties in the input quantities to a functional relationship propagate through to the output can be accomplished using a readily available spreadsheet such as Microsoft Excel. The MCS procedure uses algorithmically generated pseudo-random numbers which are then forced to follow a prescribed probability distribution. When IQC data provide the uncertainty estimates, the normal (Gaussian) distribution is generally considered appropriate, but MCS is by no means restricted to this particular case. With input variations simulated by random numbers, the functional relationship then provides the corresponding variations in the output in a manner which also provides its probability distribution. The MCS procedure thus provides output uncertainty estimates without the need for the differential equations associated with GUM modelling. The aim of this article is to demonstrate the ease with which Microsoft Excel (or a similar spreadsheet) can be used to provide an uncertainty estimate for measurands derived through a functional relationship. In addition, we also consider the relatively common situation where an empirically derived formula includes one or more 'constants', each of which has an empirically derived numerical value. Such empirically derived 'constants' must also have associated uncertainties which propagate through the functional relationship.

  17. Uncertainties of the Yn Parameters of the Hage-Cifarelli Formalism

    Energy Technology Data Exchange (ETDEWEB)

    Smith-Nelson, Mark A. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Burr, Thomas Lee [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Hutchinson, Jesson D. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Cutler, Theresa Elizabeth [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-08-14

    One method for determining the physical parameters of a multiplying system is summarized by Cifarelli [1]. In this methodology the single, double and triple rates are determined from what is commonly referred to as Feynman histograms. This paper will examine two methods for estimating the uncertainty in the parameters used in inferring these rates. These methods will be compared with simulated data in order to determine which one best approximates the sample uncertainty.
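
As a generic illustration (not necessarily either of the two estimators compared in the paper), the sketch below computes the Feynman excess variance Y from simulated counts per gate and attaches a bootstrap uncertainty, one standard resampling option:

```python
import numpy as np

def feynman_y(counts):
    """Excess variance-to-mean ratio Y = var/mean - 1 of counts per gate."""
    return counts.var(ddof=1) / counts.mean() - 1.0

rng = np.random.default_rng(7)
# Simulated counts per gate: a multiplying system gives Y > 0, a pure
# Poisson source gives Y = 0. Negative binomial is only a stand-in here.
counts = rng.negative_binomial(n=20, p=0.5, size=5000)

y_hat = feynman_y(counts)
# Bootstrap estimate of the sampling uncertainty of Y
boot = np.array([feynman_y(rng.choice(counts, size=counts.size, replace=True))
                 for _ in range(2000)])
print(f"Y = {y_hat:.3f} +/- {boot.std(ddof=1):.3f}")
```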

  18. Extending the formal model of a spatial data infrastructure to include volunteered geographical information

    CSIR Research Space (South Africa)

    Cooper, Antony K

    2011-07-01

    …up-to-date VGI have led to the integration of VGI into some SDIs. Therefore it is necessary to rethink our formal model of an SDI to accommodate VGI. We started our rethinking process with the SDI stakeholders in an attempt to establish which changes…

  19. Statistical methodology for discrete fracture model - including fracture size, orientation uncertainty together with intensity uncertainty and variability

    International Nuclear Information System (INIS)

    Darcel, C.; Davy, P.; Le Goc, R.; Dreuzy, J.R. de; Bour, O.

    2009-11-01

    …on the other, addresses the issue of the nature of the transition. We develop a new 'mechanistic' model that could help in modeling why and where this transition can occur. The transition between both regimes would occur for a fracture length of 1-10 m, and even at a smaller scale for the few outcrops that follow the self-similar density model. A consequence for the disposal issue is that the model that is likely to apply in the 'blind' scale window between 10-100 m is the self-similar model as it is defined for large-scale lineaments. The self-similar model, as it is measured for some outcrops and most lineament maps, is definitely worth being investigated as a reference for scales above 1-10 m. In the rest of the report, we develop a methodology for incorporating uncertainty and variability into the DFN modeling. Fracturing properties arise from complex processes which produce an intrinsic variability; characterizing this variability, either as an admissible variation of model parameters or as the division of the site into subdomains with distinct DFN models, is a critical point of the modeling effort. Moreover, the DFN model encompasses a part of uncertainty, due to inherent data uncertainties and sampling limits. Both effects must be quantified and incorporated into the DFN site model definition process. In that context, all available borehole data, including recordings of fracture intercept positions, pole orientations and relative uncertainties, are used as the basis for the methodological development and further site model assessment. An elementary dataset contains a set of discrete fracture intercepts from which a parent orientation/density distribution can be computed. The elementary bricks of the site, from which these initial parent density distributions are computed, rely on the former Single Hole Interpretation division of the boreholes into sections whose local boundaries are expected to reflect, locally, the main characteristics of the geology and fracturing properties. From that…
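
As a small illustration of the self-similar (power-law) fracture-size model discussed above, the sketch below draws fracture lengths from a truncated power law by inverse-transform sampling and bootstraps a simple statistic; the exponent and bounds are illustrative assumptions, not site values:

```python
import numpy as np

# Truncated power-law length density n(l) ~ l^(-a) on [l_min, l_max],
# sampled by inverting the cumulative distribution.
def sample_lengths(n, a=2.7, l_min=1.0, l_max=100.0, rng=None):
    rng = rng or np.random.default_rng()
    u = rng.uniform(size=n)
    e = 1.0 - a
    return (l_min**e + u * (l_max**e - l_min**e)) ** (1.0 / e)

rng = np.random.default_rng(3)
lengths = sample_lengths(10_000, rng=rng)
# Bootstrap the uncertainty of the mean length: a simple stand-in for the
# intensity-uncertainty quantification discussed in the abstract.
boot = [rng.choice(lengths, lengths.size, replace=True).mean() for _ in range(1000)]
print(f"mean length = {lengths.mean():.2f} +/- {np.std(boot, ddof=1):.2f} m")
```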

  20. Statistical methodology for discrete fracture model - including fracture size, orientation uncertainty together with intensity uncertainty and variability

    Energy Technology Data Exchange (ETDEWEB)

    Darcel, C. (Itasca Consultants SAS (France)); Davy, P.; Le Goc, R.; Dreuzy, J.R. de; Bour, O. (Geosciences Rennes, UMR 6118 CNRS, Univ. de Rennes, Rennes (France))

    2009-11-15

    …the lineament scale (k_t = 2) on the other, addresses the issue of the nature of the transition. We develop a new 'mechanistic' model that could help in modeling why and where this transition can occur. The transition between both regimes would occur for a fracture length of 1-10 m, and even at a smaller scale for the few outcrops that follow the self-similar density model. A consequence for the disposal issue is that the model that is likely to apply in the 'blind' scale window between 10-100 m is the self-similar model as it is defined for large-scale lineaments. The self-similar model, as it is measured for some outcrops and most lineament maps, is definitely worth being investigated as a reference for scales above 1-10 m. In the rest of the report, we develop a methodology for incorporating uncertainty and variability into the DFN modeling. Fracturing properties arise from complex processes which produce an intrinsic variability; characterizing this variability, either as an admissible variation of model parameters or as the division of the site into subdomains with distinct DFN models, is a critical point of the modeling effort. Moreover, the DFN model encompasses a part of uncertainty, due to inherent data uncertainties and sampling limits. Both effects must be quantified and incorporated into the DFN site model definition process. In that context, all available borehole data, including recordings of fracture intercept positions, pole orientations and relative uncertainties, are used as the basis for the methodological development and further site model assessment. An elementary dataset contains a set of discrete fracture intercepts from which a parent orientation/density distribution can be computed. The elementary bricks of the site, from which these initial parent density distributions are computed, rely on the former Single Hole Interpretation division of the boreholes into sections whose local boundaries are expected to reflect, locally, the geology…

  1. Physics-based elastic image registration using splines and including landmark localization uncertainties.

    Science.gov (United States)

    Wörz, Stefan; Rohr, Karl

    2006-01-01

    We introduce an elastic registration approach which is based on a physical deformation model and uses Gaussian elastic body splines (GEBS). We formulate an extended energy functional related to the Navier equation under Gaussian forces which also includes landmark localization uncertainties. These uncertainties are characterized by weight matrices representing anisotropic errors. Since the approach is based on a physical deformation model, cross-effects in elastic deformations can be taken into account. Moreover, we have a free parameter to control the locality of the transformation for improved registration of local geometric image differences. We demonstrate the applicability of our scheme based on 3D CT images from the Truth Cube experiment, 2D MR images of the brain, as well as 2D gel electrophoresis images. It turns out that the new scheme achieves more accurate results compared to previous approaches.
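
The following sketch shows the core idea of weighting landmarks by anisotropic localization uncertainties, reduced to a linear affine fit in 2D for brevity; it is a simplification for illustration, not the authors' spline-based GEBS implementation:

```python
import numpy as np

# Weighted affine landmark registration: each landmark pair gets a weight
# matrix W_i (inverse localization covariance), so uncertain directions
# count less in the fit.
def register_affine(p, q, W):
    n = p.shape[0]
    X = np.zeros((2 * n, 6))
    for i, (px, py) in enumerate(p):
        X[2 * i]     = [px, py, 0, 0, 1, 0]   # x-row: a11, a12, t1
        X[2 * i + 1] = [0, 0, px, py, 0, 1]   # y-row: a21, a22, t2
    Wb = np.zeros((2 * n, 2 * n))
    for i, Wi in enumerate(W):
        Wb[2*i:2*i+2, 2*i:2*i+2] = Wi         # block-diagonal weights
    y = q.ravel()
    theta = np.linalg.solve(X.T @ Wb @ X, X.T @ Wb @ y)  # weighted normal equations
    return theta[:4].reshape(2, 2), theta[4:]            # matrix A, translation t

p = np.array([[0., 0.], [1., 0.], [0., 1.], [1., 1.]])
q = p @ np.array([[1.1, 0.1], [-0.1, 0.9]]).T + np.array([0.5, -0.2])
W = [np.diag([1.0, 0.25])] * 4   # x localized twice as precisely as y
print(register_affine(p, q, W))
```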

  2. Network optimization including gas lift and network parameters under subsurface uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Schulze-Riegert, R.; Baffoe, J.; Pajonk, O. [SPT Group GmbH, Hamburg (Germany); Badalov, H.; Huseynov, S. [Technische Univ. Clausthal, Clausthal-Zellerfeld (Germany). ITE; Trick, M. [SPT Group, Calgary, AB (Canada)

    2013-08-01

    Optimization of oil and gas field production systems poses a great challenge to field development due to complex and multiple interactions between various operational design parameters and subsurface uncertainties. Conventional analytical methods are capable of finding local optima based on single deterministic models. They are less applicable for efficiently generating alternative design scenarios in a multi-objective context. Practical implementations of robust optimization workflows integrate the evaluation of alternative design scenarios and multiple realizations of subsurface uncertainty descriptions. Production or economic performance indicators such as NPV (Net Present Value) are linked to a risk-weighted objective function definition to guide the optimization processes. This work focuses on an integrated workflow using a reservoir-network simulator coupled to an optimization framework. The work investigates the impact of design parameters while considering the physics of the reservoir, wells, and surface facilities. Subsurface uncertainties are described by well parameters such as inflow performance. Experimental design methods are used to investigate parameter sensitivities and interactions. Optimization methods are used to find optimal design parameter combinations which improve key performance indicators of the production network system. The proposed workflow is applied to a representative oil reservoir coupled to a network which is modelled by an integrated reservoir-network simulator. Gas lift is included as an explicit measure to improve production. An objective function is formulated for the net present value of the integrated system, including production revenue and facility costs. Facility and gas-lift design parameters are tuned to maximize NPV. Well inflow performance uncertainties are introduced, with an impact on gas-lift performance. Resulting variances in NPV are identified as a risk measure for the optimized system design…
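
A toy version of the risk-weighted objective described here: sample the uncertain inflow, evaluate NPV over candidate gas-lift rates, and maximize the mean minus a risk penalty. All curves, prices and parameters are hypothetical placeholders for the reservoir-network simulator:

```python
import numpy as np

rng = np.random.default_rng(11)
inflow = rng.lognormal(mean=0.0, sigma=0.2, size=2000)   # subsurface uncertainty

def npv(q_gl, inflow):
    production = inflow * 1000.0 * q_gl / (q_gl + 0.5)   # saturating lift response
    revenue = 60.0 * production                          # toy price per unit
    cost = 5000.0 * q_gl                                 # lift-gas and facility cost
    return revenue - cost

lam = 0.5   # risk-aversion weight in the risk-weighted objective
best = max(np.linspace(0.1, 5.0, 50),
           key=lambda q: npv(q, inflow).mean() - lam * npv(q, inflow).std())
print(f"risk-weighted optimal gas-lift rate: {best:.2f}")
```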

  3. Consistent retrieval of land surface radiation products from EO, including traceable uncertainty estimates

    Science.gov (United States)

    Kaminski, Thomas; Pinty, Bernard; Voßbeck, Michael; Lopatka, Maciej; Gobron, Nadine; Robustelli, Monica

    2017-05-01

    Earth observation (EO) land surface products have been demonstrated to provide a constraint on the terrestrial carbon cycle that is complementary to the record of atmospheric carbon dioxide. We present the Joint Research Centre Two-stream Inversion Package (JRC-TIP) for retrieval of variables characterising the state of the vegetation-soil system. The system provides a set of land surface variables that satisfy all requirements for assimilation into the land component of climate and numerical weather prediction models. Being based on a 1-D representation of the radiative transfer within the canopy-soil system, such as those used in the land surface components of advanced global models, the JRC-TIP products are not only physically consistent internally, but they also achieve a high degree of consistency with these global models. Furthermore, the products are provided with full uncertainty information. We describe how these uncertainties are derived in a fully traceable manner without any hidden assumptions from the input observations, which are typically broadband white sky albedo products. Our discussion of the product uncertainty ranges, including the uncertainty reduction, highlights the central role of the leaf area index, which describes the density of the canopy. We explain the generation of products aggregated to coarser spatial resolution than that of the native albedo input and describe various approaches to the validation of JRC-TIP products, including the comparison against in situ observations. We present a JRC-TIP processing system that satisfies all operational requirements and explain how it delivers stable climate data records. Since many aspects of JRC-TIP are generic, the package can serve as an example of a state-of-the-art system for retrieval of EO products, and this contribution can help the user to understand advantages and limitations of such products.

  4. Decision making under uncertainty: An investigation into the application of formal decision-making methods to safety issue decisions

    International Nuclear Information System (INIS)

    Bohn, M.P.

    1992-12-01

    As part of the NRC-sponsored program to study the implications of Generic Issue 57, "Effects of Fire Protection System Actuation on Safety-Related Equipment," a subtask was performed to evaluate the applicability of formal decision analysis methods to generic-issue cost/benefit-type decisions and to apply these methods to the GI-57 results. In this report, the numerical results obtained from the analysis of three plants (two PWRs and one BWR) as developed in the technical resolution program for GI-57 were studied. For each plant, these results included a calculation of the person-REM averted due to various accident scenarios and various proposed modifications to mitigate the accident scenarios identified. These results were recomputed to break out the benefit in terms of contributions due to random event scenarios, fire event scenarios, and seismic event scenarios. Furthermore, the benefits associated with risk (in terms of person-REM) averted from earthquakes at three different seismic ground motion levels were separately considered. Given these data, formal decision methodologies involving decision trees, value functions, and utility functions were applied. It is shown that the formal decision methodology can be applied at several different levels. Examples are given in which the decision between several retrofits is changed from that resulting from a simple cost/benefit-ratio criterion by virtue of the decision-maker's expressed (and assumed) preferences.
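
To see how expressed preferences can overturn a simple cost/benefit-ratio ranking, consider the following toy comparison of two hypothetical retrofits under a risk-averse (concave) utility; the numbers are invented purely for illustration:

```python
import numpy as np

retrofits = {
    # name: (cost in $M, person-rem averted per scenario, scenario probabilities)
    "A": (2.0, np.array([100.0, 100.0]), np.array([0.5, 0.5])),   # certain benefit
    "B": (2.0, np.array([0.0, 220.0]),  np.array([0.5, 0.5])),    # risky benefit
}

def cb_ratio(cost, benefit, prob):
    return (benefit * prob).sum() / cost            # expected benefit per dollar

def expected_utility(cost, benefit, prob, rho=100.0):
    u = 1.0 - np.exp(-benefit / rho)                # concave (risk-averse) utility
    return (u * prob).sum() - 0.01 * cost           # toy linear cost trade-off

for name, (c, b, p) in retrofits.items():
    print(name, f"ratio={cb_ratio(c, b, p):.1f}", f"EU={expected_utility(c, b, p):.3f}")
# The plain ratio ranks B above A (55 vs 50 person-rem/$M), while the
# risk-averse utility prefers the certain benefit of A.
```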

  5. Uncertainty-driven nuclear data evaluation including thermal (n,α) applied to 59Ni

    Science.gov (United States)

    Helgesson, P.; Sjöstrand, H.; Rochman, D.

    2017-11-01

    This paper presents a novel approach to the evaluation of nuclear data (ND), combining experimental data for thermal cross sections with resonance parameters and nuclear reaction modeling. The method involves sampling of various uncertain parameters, in particular uncertain components in experimental setups, and provides extensive covariance information, including consistent cross-channel correlations over the whole energy spectrum. The method is developed for, and applied to, 59Ni, but may be used as a whole, or in part, for other nuclides. 59Ni is particularly interesting since a substantial amount of 59Ni is produced in thermal nuclear reactors by neutron capture in 58Ni and since it has a non-threshold (n,α) cross section. Therefore, 59Ni gives a very important contribution to the helium production in stainless steel in a thermal reactor. However, current evaluated ND libraries contain old information for 59Ni, without any uncertainty information. The work includes a study of thermal cross section experiments and a novel combination of this experimental information, giving the full multivariate distribution of the thermal cross sections. In particular, the thermal (n,α) cross section is found to be 12.7 ± 0.7 b. This is consistent with, but yet different from, current established values. Further, the distribution of thermal cross sections is combined with reported resonance parameters, and with TENDL-2015 data, to provide full random ENDF files; all of this is done in a novel way, keeping uncertainties and correlations in mind. The random files are also condensed into one single ENDF file with covariance information, which is now part of a beta version of JEFF 3.3. Finally, the random ENDF files have been processed and used in an MCNP model to study the helium production in stainless steel. The increase in the (n,α) rate due to 59Ni compared to fresh stainless steel is found to be a factor of 5.2 at a certain time in the reactor vessel, with a relative…
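
The sketch below illustrates, with hypothetical numbers, one way to combine several thermal cross-section measurements that share a systematic normalization component by sampling, in the spirit of the multivariate combination described above (not the paper's actual procedure or data):

```python
import numpy as np

rng = np.random.default_rng(5)
values = np.array([12.2, 13.1, 12.8])   # hypothetical reported cross sections (barn)
stat = np.array([0.6, 0.9, 0.5])        # independent (statistical) components
sys_frac = 0.03                          # assumed shared 3% normalization uncertainty

n = 50_000
norm = rng.normal(1.0, sys_frac, size=n)[:, None]      # common to all experiments
draws = rng.normal(values, stat, size=(n, 3)) * norm   # correlated samples
w = 1.0 / stat**2
combined = (draws * w).sum(axis=1) / w.sum()           # inverse-variance weighting

print(f"combined: {combined.mean():.2f} +/- {combined.std(ddof=1):.2f} b")
```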

  6. Improving weather predictability by including land-surface model parameter uncertainty

    Science.gov (United States)

    Orth, Rene; Dutra, Emanuel; Pappenberger, Florian

    2016-04-01

    The land surface forms an important component of Earth system models and interacts nonlinearly with other parts such as the ocean and atmosphere. To capture the complex and heterogeneous hydrology of the land surface, land surface models include a large number of parameters impacting the coupling to other components of the Earth system model. Focusing on ECMWF's land-surface model HTESSEL, we present in this study a comprehensive parameter sensitivity evaluation using multiple observational datasets in Europe. We select six poorly constrained effective parameters (surface runoff effective depth, skin conductivity, minimum stomatal resistance, maximum interception, soil moisture stress function shape, total soil depth) and explore the sensitivity of model outputs such as soil moisture, evapotranspiration and runoff to these parameters, using uncoupled simulations and coupled seasonal forecasts. Additionally, we investigate the possibility of constructing ensembles from the multiple land surface parameters. In the uncoupled runs we find that minimum stomatal resistance and total soil depth have the most influence on model performance. Forecast skill scores are moreover sensitive to the same parameters as HTESSEL performance in the uncoupled analysis. We demonstrate the robustness of our findings by comparing multiple best-performing parameter sets and multiple randomly chosen parameter sets. We find better temperature and precipitation forecast skill with the best-performing parameter perturbations, demonstrating representativeness of model performance across uncoupled (and hence less computationally demanding) and coupled settings. Finally, we construct ensemble forecasts from ensemble members derived with different best-performing parameterizations of HTESSEL. This incorporation of parameter uncertainty in the ensemble generation yields an increase in forecast skill, even beyond the skill of the default system. Orth, R., E. Dutra, and F. Pappenberger, 2016: Improving weather predictability by…

  7. Uncertainties

    Indian Academy of Sciences (India)

    The imperfect understanding of some of the processes and physics in the carbon cycle and chemistry models generate uncertainties in the conversion of emissions to concentration. To reflect this uncertainty in the climate scenarios, the use of AOGCMs that explicitly simulate the carbon cycle and chemistry of all the ...

  8. PDF uncertainties in precision electroweak measurements, including the W mass, in ATLAS

    CERN Document Server

    Cooper-Sarkar, Amanda; The ATLAS collaboration

    2015-01-01

    Now that the Higgs mass is known, all the parameters of the SM are known, but with what accuracy? Precision EW measurements test the self-consistency of the SM and thus can give hints of BSM physics. Precision measurements of $\sin^2\theta_W$ and the W mass are limited by PDF uncertainties. This contribution discusses these uncertainties and what can be done to improve them.

  9. Dose computation in conformal radiation therapy including geometric uncertainties: Methods and clinical implications

    Science.gov (United States)

    Rosu, Mihaela

    The aim of any radiotherapy is to tailor the tumoricidal radiation dose to the target volume and to deliver as little radiation dose as possible to all other normal tissues. However, the motion and deformation induced in human tissue by ventilatory motion is a major issue, as standard practice usually uses only one computed tomography (CT) scan (and hence one instance of the patient's anatomy) for treatment planning. The intrafraction movement that occurs due to physiological processes over time scales shorter than the delivery of one treatment fraction leads to differences between the planned and delivered dose distributions. Because of the influence of these differences on tumors and normal tissues, tumor control probabilities and normal tissue complication probabilities are likely to be affected by organ motion. In this thesis we apply several methods to compute dose distributions that include the effects of treatment geometric uncertainties by using time-varying anatomical information as an alternative to the conventional Planning Target Volume (PTV) approach. The proposed methods depend on the model used to describe the patient's anatomy. The dose and fluence convolution approaches for rigid organ motion are discussed first, with application to liver tumors and the rigid component of lung tumor movements. For non-rigid behavior, a dose reconstruction method that allows the accumulation of dose to the deforming anatomy is introduced and applied to lung tumor treatments. Furthermore, we apply the cumulative dose approach to investigate how much information regarding the deforming patient anatomy is needed at the time of treatment planning for tumors located in the thorax. The results are evaluated from a clinical perspective. All dose calculations are performed using a Monte Carlo based algorithm to ensure more realistic and more accurate handling of tissue heterogeneities, of particular importance in lung cancer treatment planning.
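
The dose/fluence convolution approach mentioned for rigid motion can be illustrated in one dimension: the expected dose is the static dose profile convolved with the probability density of the displacement. The profile and motion parameters below are illustrative, not clinical values:

```python
import numpy as np

x = np.linspace(-50, 50, 1001)              # position (mm)
dose = np.where(np.abs(x) < 20, 1.0, 0.0)   # idealized flat target dose

sigma = 5.0                                  # SD of breathing displacement (mm)
kernel = np.exp(-0.5 * (x / sigma) ** 2)
kernel /= kernel.sum()                       # discrete motion PDF

# Expected (motion-blurred) dose: convolution of static dose with motion PDF
expected_dose = np.convolve(dose, kernel, mode="same")
edge = np.argmin(np.abs(x - 20))
print("dose at field edge drops from 1.00 to %.2f" % expected_dose[edge])
```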

  10. Uncertainties

    Indian Academy of Sciences (India)

    To reflect this uncertainty in the climate scenarios, the use of AOGCMs that explicitly simulate the carbon cycle and chemistry of all the substances are needed. The Hadley Centre has developed a version of the climate model that allows the effect of climate change on the carbon cycle and its feedback into climate, to be ...

  11. Use of health effect risk estimates and uncertainty in formal regulatory proceedings: a case study involving atmospheric particulates

    International Nuclear Information System (INIS)

    Habegger, L.J.; Oezkaynak, A.H.

    1984-01-01

    Coal combustion particulates are released to the atmosphere by power plants supplying electrical energy to the nuclear fuel cycle. This paper presents estimates of the public health risks associated with the release of these particulates at a rate corresponding to the annual nuclear fuel production requirements of a nuclear power plant. Utilization of these risk assessments as a new component in the formal evaluation of total risks from nuclear power plants is discussed. 23 references, 3 tables.

  12. BWR transient analysis using neutronic / thermal hydraulic coupled codes including uncertainty quantification

    International Nuclear Information System (INIS)

    Hartmann, C.; Sanchez, V.; Tietsch, W.; Stieglitz, R.

    2012-01-01

    KIT is involved in the development and qualification of best-estimate methodologies for BWR transient analysis in cooperation with industrial partners. The goal is to establish the most advanced thermal hydraulic system codes coupled with 3D reactor dynamics codes to be able to perform a more realistic evaluation of BWR behavior under accident conditions. For this purpose a computational chain based on the lattice code (SCALE6/GenPMAXS), the coupled neutronic/thermal hydraulic code (TRACE/PARCS), and a Monte Carlo based uncertainty and sensitivity (U&S) package (SUSA) has been established and applied to different kinds of transients of a boiling water reactor (BWR). This paper describes the multidimensional models of the plant elaborated for TRACE and PARCS to perform the investigations mentioned before. For the uncertainty quantification of the coupled code TRACE/PARCS, and specifically to take into account the influence of the kinetics parameters in such studies, the PARCS code has been extended to facilitate the change of model parameters in such a way that the SUSA package can be used in connection with TRACE/PARCS for the U&S studies. This approach is presented in detail. The results obtained for a rod drop transient with TRACE/PARCS using the SUSA methodology clearly showed the importance of some kinetic parameters for the transient progression, demonstrating that the coupling of best-estimate coupled codes with uncertainty and sensitivity tools is very promising and of great importance for the safety assessment of nuclear reactors. (authors)
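
Uncertainty analyses of this GRS/SUSA type commonly size the random sample with Wilks' tolerance-limit formula; the abstract does not spell this out, so the snippet below is background rather than the paper's stated method:

```python
import math

# Wilks' formula (first-order, one-sided): smallest number of code runs n
# such that the largest of n sampled outputs bounds the true beta-quantile
# with confidence gamma-requirement, i.e. 1 - gamma^n >= beta.
def wilks_n(gamma=0.95, beta=0.95):
    return math.ceil(math.log(1.0 - beta) / math.log(gamma))

print(wilks_n())  # 59, the classic 95%/95% one-sided first-order sample size
```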

  13. MODARIA WG5: Towards a practical guidance for including uncertainties in the results of dose assessment of routine releases

    Energy Technology Data Exchange (ETDEWEB)

    Mora, Juan C. [Centro de Investigaciones Energeticas, Medioambientales y Tecnologicas - CIEMAT (Spain); Telleria, Diego [International Atomic Energy Agency - IAEA (Austria); Al Neaimi, Ahmed [Emirates Nuclear Energy Corporation - ENEC (United Arab Emirates); Blixt Buhr, Anna Ma [Vattenfall AB (Sweden); Bonchuk, Iurii [Radiation Protection Institute - RPI (Ukraine); Chouhan, Sohan [Atomic Energy of Canada Limited - AECL (Canada); Chyly, Pavol [SE-VYZ (Slovakia); Curti, Adriana R. [Autoridad Regulatoria Nuclear - ARN (Argentina); Da Costa, Dejanira [Instituto de Radioprotecao e Dosimetria - IRD (Brazil); Duran, Juraj [VUJE Inc (Slovakia); Galeriu, Dan [Horia Hulubei National Institute of Physics and Nuclear Engineering - IFIN-HH (Romania); Haegg, Ann- Christin; Lager, Charlotte [Swedish Radiation Safety Authority - SSM (Sweden); Heling, Rudie [Nuclear Research and Consultancy Group - NRG (Netherlands); Ivanis, Goran; Shen, Jige [Ecometrix Incorporated (Canada); Iosjpe, Mikhail [Norwegian Radiation Protection Authority - NRPA (Norway); Krajewski, Pawel M. [Central Laboratory for Radiological Protection - CLOR (Poland); Marang, Laura; Vermorel, Fabien [Electricite de France - EdF (France); Mourlon, Christophe [Institut de Radioprotection et de Surete Nucleaire - IRSN (France); Perez, Fabricio F. [Belgian Nuclear Research Centre - SCK (Belgium); Woodruffe, Andrew [Federal Authority for Nuclear Regulation - FANR (United Arab Emirates); Zorko, Benjamin [Jozef Stefan Institute (Slovenia)

    2014-07-01

    The MODARIA (Modelling and Data for Radiological Impact Assessments) project was launched in 2012 with the aim of improving capabilities in radiation dose assessment through the acquisition of improved data for model testing, model testing and comparison, reaching consensus on modelling philosophies, approaches and parameter values, the development of improved methods, and the exchange of information. The project focuses on areas where uncertainties remain in the predictive capability of environmental models, with emphasis on reducing the associated uncertainties or developing new approaches to strengthen the evaluation of radiological impact. Within MODARIA, four main areas were defined, one of them devoted to uncertainty and variability. This area includes four working groups, with Working Group 5 dealing with the 'uncertainty and variability analysis for assessments of radiological impacts arising from routine discharges of radionuclides'. Whether doses are estimated by using measurement data, by applying models, or through a combination of measurements and calculations, the variability and uncertainty contribute to a distribution of possible values. The degree of variability and uncertainty is represented by the shape and extent of that distribution. The main objective of WG5 is to explore how to consider uncertainties and variabilities in the results of assessments of doses in planned exposure situations, for controlling the impact of routine releases from radioactive and nuclear installations to the environment. The final aim is to produce guidance for the calculation of uncertainties in these exposure situations and for the presentation of such results to the different stakeholders. To achieve that objective, the main tasks identified were: to find tools and methods for uncertainty and variability analysis applicable to dose assessments of routine radioactive discharges, and to define scenarios where information on uncertainty and variability of parameters is available…

  14. Comparative life cycle assessment of wastewater treatment in Denmark including sensitivity and uncertainty analysis

    DEFF Research Database (Denmark)

    Niero, Monia; Pizzol, Massimo; Gundorph Bruun, Henrik

    2014-01-01

    Wastewater treatment nowadays has multiple functions and produces both clean effluents and sludge, which is increasingly seen as a resource rather than a waste product. Technological as well as management choices influence the performance of wastewater treatment plants (WWTPs) on the multiple… …in a comparative LCA of four types of WWTPs, representative of mainstream treatment options in Denmark. The four plant types differ regarding size and treatment technology: aerobic versus anaerobic, chemical versus combined chemical and biological. Trade-offs in their environmental performance were identified… …considering system expansion to model the avoided impacts achievable in different end-of-life scenarios for sludge: combustion with energy production versus agricultural application. To account for the variability in quality of effluents and sludge, and to address the related uncertainties, Monte Carlo…

  15. Probabilistic assessment of fatigue life including statistical uncertainties in the S-N curve

    International Nuclear Information System (INIS)

    Sudret, B.; Hornet, P.; Stephan, J.-M.; Guede, Z.; Lemaire, M.

    2003-01-01

    A probabilistic framework is set up to assess the fatigue life of components of nuclear power plants. It is intended to incorporate all kinds of uncertainties, such as those appearing in the specimen fatigue life, design sub-factors, the mechanical model and the applied loading. This paper details the first step, which corresponds to the statistical treatment of the fatigue specimen test data. The specimen fatigue life at stress amplitude S is represented by a lognormal random variable whose mean and standard deviation depend on S. This characterization is then used to compute the random fatigue life of a component subjected to a single kind of cycle. Precisely, the mean and coefficient of variation of this quantity are studied, as well as the reliability associated with the (deterministic) design value. (author)
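
A minimal numerical sketch of the specimen model described above, assuming illustrative (not EDF's) parameter functions for the S-dependent lognormal fatigue life, and evaluating the reliability associated with a deterministic design value:

```python
import math

def mu_lnN(S):     # mean of ln(fatigue life), decreasing with stress (toy form)
    return 20.0 - 3.0 * math.log(S)

def sigma_lnN(S):  # scatter of ln(fatigue life), mildly stress-dependent (toy form)
    return 0.3 + 0.001 * S

def reliability(S, n_design):
    """P[N > n_design] for lognormal fatigue life N at stress amplitude S."""
    z = (math.log(n_design) - mu_lnN(S)) / sigma_lnN(S)
    return 0.5 * math.erfc(z / math.sqrt(2.0))

S = 200.0                                # stress amplitude (MPa)
n_design = math.exp(mu_lnN(S)) / 20.0    # design value with a sub-factor of 20 on life
print(f"reliability at design life: {reliability(S, n_design):.5f}")
```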

  16. State Token Petri Net modeling method for formal verification of computerized procedure including operator's interruptions of procedure execution flow

    International Nuclear Information System (INIS)

    Kim, Yun Goo; Seong, Poong Hyun

    2012-01-01

    The Computerized Procedure System (CPS) is one of the primary operating support systems in the digital Main Control Room. The CPS displays procedures on the computer screen in the form of a flow chart, and displays plant operating information along with the procedure instructions. It also supports operator decision making by providing a system decision. A procedure flow should be correct and reliable, as an error would lead to operator misjudgement and inadequate control. In this paper we present a modeling method for the CPS that enables formal verification based on Petri nets. The proposed State Token Petri Nets (STPN) also support modeling of a procedure flow that has various interruptions by the operator, according to the plant condition. STPN modeling is compared with Coloured Petri Nets when both are applied to an Emergency Operating Computerized Procedure. A program for converting a Computerized Procedure (CP) to an STPN has also been developed. The formal verification and validation methods for CPs with STPN increase the safety of a nuclear power plant and provide a means of digital quality assurance that is needed as the role and function of the CPS grow.
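    The State Token extensions are the paper's contribution, but the underlying place/transition mechanics, where a transition fires only when all of its input places hold tokens, can be sketched in a few lines. A toy Python executor (not the authors' converting program) modelling a procedure step with an operator interruption and resume:

```python
from collections import Counter

class PetriNet:
    def __init__(self, transitions, marking):
        # transitions: name -> (input places, output places)
        self.transitions = transitions
        self.marking = Counter(marking)

    def enabled(self):
        return [t for t, (ins, _) in self.transitions.items()
                if all(self.marking[p] > 0 for p in ins)]

    def fire(self, t):
        ins, outs = self.transitions[t]
        for p in ins:
            self.marking[p] -= 1      # consume input tokens
        for p in outs:
            self.marking[p] += 1      # produce output tokens

# two procedure steps with a possible operator interruption in between
net = PetriNet(
    {"step1":     (["start"], ["mid"]),
     "interrupt": (["mid"], ["suspended"]),
     "resume":    (["suspended"], ["mid"]),
     "step2":     (["mid"], ["done"])},
    {"start": 1},
)
net.fire("step1"); net.fire("interrupt"); net.fire("resume"); net.fire("step2")
print(dict(net.marking))   # all tokens consumed except 'done': 1
```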

  17. [Training of residents in obstetrics and gynecology: Assessment of an educational program including formal lectures and practical sessions using simulators].

    Science.gov (United States)

    Jordan, A; El Haloui, O; Breaud, J; Chevalier, D; Antomarchi, J; Bongain, A; Boucoiran, I; Delotte, J

    2015-01-01

    To evaluate an educational program for training residents in gynecology-obstetrics (GO), combining a theory session with a practical session on simulators, and to analyse their learning curve. Single-centre prospective study at a university hospital (CHU). Two-day sessions were held in April and July 2013. An evaluation on obstetric and gynecological surgery simulators was available to all residents. Theoretical knowledge of the principles of obstetrics was evaluated at the beginning of the session and again after formal lectures were given. At the end of the first session, a satisfaction questionnaire was distributed to all participants. Twenty residents agreed to participate in the training sessions. Evaluation of theoretical knowledge: at the end of the session, the residents showed a significant improvement in their knowledge test score (out of 20). Obstetric simulator: there was a statistically significant improvement in simulator assessment scores for vaginal delivery between the first and second sessions. Subjectively, a larger perceived improvement was reported after breech delivery simulation than after cephalic vaginal delivery simulation. However, residents' confidence level after breech delivery simulation had not improved by the end of the second session. Simulation in gynecological surgery: a trend towards improvement in peg-transfer completion time between the two sessions was noted. In virtual simulation, no statistically significant difference was shown, with no improvement in salpingectomy time. Subjectively, the residents felt an increase in the precision of their gestures. Satisfaction: all residents completed the whole program. They considered that continuing these simulator sessions was necessary and even mandatory. The approach chosen by this structured educational program allowed residents to progress, both objectively and subjectively. This type of simulation program for resident training would use this tool in assessing their skills and develop

  18. A program for confidence interval calculations for a Poisson process with background including systematic uncertainties: POLE 1.0

    Science.gov (United States)

    Conrad, Jan

    2004-04-01

    A Fortran 77 routine has been developed to calculate confidence intervals with and without systematic uncertainties, using a frequentist confidence interval construction with a Bayesian treatment of the systematic uncertainties. The routine can account for systematic uncertainties in the background prediction and in the signal/background efficiencies. The uncertainties may be separately parametrized by a Gaussian, log-normal or flat probability density function (PDF); since a Monte Carlo approach is chosen to perform the necessary integrals, generalization to other parametrizations is particularly simple. Full correlation between signal and background efficiency is optional. The ordering schemes currently supported for the frequentist construction are likelihood-ratio ordering (also known as Feldman-Cousins) and Neyman ordering. Optionally, both schemes can be used with conditioning, meaning the probability density function is conditioned on the fact that the actual outcome of the background process cannot have been larger than the number of observed events.
    Program summary
    Title of program: POLE version 1.0
    Catalogue identifier: ADTA
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADTA
    Program available from: CPC Program Library, Queen's University of Belfast, N. Ireland
    Licensing provisions: none
    Computer for which the program is designed: DELL PC, 1 GB, 2.0 GHz Pentium IV
    Operating system under which the program has been tested: RH Linux 7.2, kernel 2.4.7-10
    Programming language used: Fortran 77
    Memory required to execute with typical data: ~1.6 Mbytes
    No. of bytes in distributed program, including test data, etc.: 373745
    No. of lines in distributed program, including test data, etc.: 2700
    Distribution format: tar gzip file
    Keywords: confidence interval calculation, systematic uncertainties
    Nature of the physical problem: the problem is to calculate a frequentist confidence interval on the parameter of a Poisson process with known background in presence of
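    POLE itself is Fortran 77, but the construction it implements can be sketched compactly in Python. In the sketch below the grid sizes, the truncated-Gaussian background PDF and the Monte Carlo draw count are illustrative choices, not the program's defaults:

```python
import numpy as np
from scipy.stats import poisson

def pmf_marginal(n, s, b):
    """P(n | s) with the uncertain background integrated out over the
    draws in b (the Bayesian treatment of the systematic)."""
    return poisson.pmf(np.asarray(n)[None, :], s + b[:, None]).mean(axis=0)

def pole_like_interval(n_obs, b0, sigma_b, cl=0.90, s_max=15.0, n_max=60):
    """Likelihood-ratio (Feldman-Cousins) ordered interval for the signal mean."""
    rng = np.random.default_rng(42)
    b = np.clip(rng.normal(b0, sigma_b, 5000), 0.0, None)  # truncated Gaussian
    n = np.arange(n_max)
    # denominator of the ordering ratio: P(n | s_hat) with s_hat = max(n - b0, 0)
    p_best = np.array([pmf_marginal([k], max(k - b0, 0.0), b)[0] for k in n])
    accepted = []
    for s in np.linspace(0.0, s_max, 151):
        p = pmf_marginal(n, s, b)
        region, cum = set(), 0.0
        for k in np.argsort(p / p_best)[::-1]:   # likelihood-ratio ordering
            region.add(int(k))
            cum += p[k]
            if cum >= cl:
                break
        if n_obs in region:
            accepted.append(s)
    return min(accepted), max(accepted)

print(pole_like_interval(n_obs=4, b0=3.0, sigma_b=0.5))
```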

  19. Which Costs Matter? Costs Included in Economic Evaluation and their Impact on Decision Uncertainty for Stable Coronary Artery Disease.

    Science.gov (United States)

    Lomas, James; Asaria, Miqdad; Bojke, Laura; Gale, Chris P; Richardson, Gerry; Walker, Simon

    2018-02-14

    Variation exists in the resource categories included in economic evaluations, and National Institute for Health and Care Excellence (NICE) guidance suggests including only costs related to the index condition or intervention. However, there is a growing consensus that all healthcare costs should be included in economic evaluations for Health Technology Assessments (HTAs), particularly those incurred during extended years of life. We aimed to quantify the impact of a range of cost categories on the adoption decision for a hypothetical intervention, and on the uncertainty around that decision, for stable coronary artery disease (SCAD), based on a dataset comprising 94,966 patients. Three costing scenarios were considered: coronary heart disease (CHD) costs only, cardiovascular disease (CVD) costs, and all costs. The first two illustrate different interpretations of what might be regarded as related costs. Employing a 20-year time horizon, the highest mean expected incremental cost arose when all costs were included (£2468) and the lowest when only CVD costs were included (£2377). The probability of the treatment being cost effective, estimating health opportunity costs at a ratio of £30,000 per quality-adjusted life-year (QALY), differed between the CHD costs (70%), CVD costs (73%) and all costs (56%) scenarios. The results concern a hypothetical intervention and are illustrative only; as such, they cannot necessarily be generalised to all interventions and diseases. The cost categories included in an economic evaluation of SCAD affect estimates of both cost effectiveness and decision uncertainty. With an aging and co-morbid population, the inclusion of all healthcare costs may have important ramifications for the selection of healthcare provision on economic grounds.
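    The probabilities of cost effectiveness quoted above come from comparing incremental net monetary benefit against zero across simulated parameter draws. A hedged sketch; the distributions for incremental cost and QALYs are invented for illustration (only the £2468 mean and the £30,000 threshold are taken from the abstract):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 10_000                              # probabilistic-sensitivity draws
threshold = 30_000.0                    # GBP per QALY

# hypothetical posterior samples of incremental cost and effect
d_cost = rng.normal(2468.0, 900.0, n)   # all-costs scenario mean from the abstract
d_qaly = rng.normal(0.10, 0.04, n)

nmb = threshold * d_qaly - d_cost       # incremental net monetary benefit
print("P(cost effective at 30k/QALY):", (nmb > 0).mean())
```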

  20. Uncertainty and measurement

    International Nuclear Information System (INIS)

    Landsberg, P.T.

    1990-01-01

    This paper explores how the uncertainty relation of quantum mechanics can be considered to result from measurements. A distinction is drawn between the uncertainties obtained by scrutinising experiments and the standard-deviation type of uncertainty definition used in the quantum formalism. (UK)

  1. Masses of Formal Philosophy

    DEFF Research Database (Denmark)

    Masses of Formal Philosophy is an outgrowth of Formal Philosophy. That book gathered the responses of some of the most prominent formal philosophers to five relatively open and broad questions initiating a discussion of metaphilosophical themes and problems surrounding the use of formal methods...... in philosophy. Including contributions from a wide range of philosophers, Masses of Formal Philosophy contains important new responses to the original five questions....

  2. Challenges of Sustaining the International Space Station through 2020 and Beyond: Including Epistemic Uncertainty in Reassessing Confidence Targets

    Science.gov (United States)

    Anderson, Leif; Carter-Journet, Katrina; Box, Neil; DiFilippo, Denise; Harrington, Sean; Jackson, David; Lutomski, Michael

    2012-01-01

    This paper introduces an analytical approach, Probability and Confidence Trade-space (PACT), which can be used to assess uncertainty in the International Space Station (ISS) hardware sparing necessary to extend the life of the vehicle. There are several key areas under consideration in this research. We investigate what sparing confidence targets may be reasonable to ensure vehicle survivability and completion of science on the ISS. The results of the analysis will provide a methodological basis for reassessing vehicle subsystem confidence targets. An ongoing annual analysis currently compares the probability of existing spares exceeding the total expected unit demand of the Orbital Replacement Unit (ORU) in functional hierarchies approximating the vehicle subsystems. In cases where a functional hierarchy's availability does not meet subsystem confidence targets, the current sparing analysis further identifies which ORUs may require additional spares to extend the life of the ISS. The resulting probability is dependent upon hardware reliability estimates. However, the ISS hardware fleet carries considerable epistemic uncertainty (uncertainty in the knowledge of the true hardware failure rate), which does not currently factor into the annual sparing analysis. The existing confidence targets may therefore be conservative. This paper will also discuss how confidence targets may be relaxed based on the inclusion of epistemic uncertainty for each ORU. The paper will conclude with the strengths and limitations of implementing the analytical approach in sustaining the ISS through end of life, 2020 and beyond.
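    The core computation, the probability that existing spares cover the demand for an ORU when the true failure rate is only partially known, can be sketched as a gamma-mixed Poisson, with the gamma spread carrying the epistemic uncertainty. All numbers and distribution choices below are hypothetical, not ISS data:

```python
import numpy as np
from scipy.stats import poisson

def sparing_confidence(n_spares, rate_hat, rate_cv, hours,
                       ndraws=100_000, seed=1):
    """P(demand <= spares) for a single ORU when the failure rate itself is
    uncertain. The gamma shape/scale encodes a hypothetical epistemic
    coefficient of variation on the rate."""
    rng = np.random.default_rng(seed)
    shape = 1.0 / rate_cv**2
    scale = rate_hat / shape
    lam = rng.gamma(shape, scale, ndraws) * hours   # uncertain expected demand
    return poisson.cdf(n_spares, lam).mean()

# near point-estimate rate vs. the same rate with 50% epistemic uncertainty
print(sparing_confidence(3, rate_hat=1e-4, rate_cv=1e-6, hours=8760 * 8))
print(sparing_confidence(3, rate_hat=1e-4, rate_cv=0.5,  hours=8760 * 8))
```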

  3. Uncertainty in artificial intelligence

    CERN Document Server

    Shachter, RD; Henrion, M; Lemmer, JF

    1990-01-01

    This volume, like its predecessors, reflects the cutting edge of research on the automation of reasoning under uncertainty. A more pragmatic emphasis is evident, for although some papers address fundamental issues, the majority address practical issues. Topics include the relations between alternative formalisms (including possibilistic reasoning), Dempster-Shafer belief functions, non-monotonic reasoning, Bayesian and decision theoretic schemes, and new inference techniques for belief nets. New techniques are applied to important problems in medicine, vision, robotics, and natural language und

  4. A formal statistical approach to representing uncertainty in rainfall-runoff modelling with focus on residual analysis and probabilistic output evaluation - Distinguishing simulation and prediction

    DEFF Research Database (Denmark)

    Breinholt, Anders; Møller, Jan Kloppenborg; Madsen, Henrik

    2012-01-01

    ... evaluation of the modelled output, and we attach particular importance to inspecting the residuals of the model outputs and improving the model uncertainty description. We also introduce the probabilistic performance measures sharpness, reliability and interval skill score for model comparison and for checking the reliability of the confidence bounds. Using point rainfall and evaporation data as input and flow measurements from a sewer system for model conditioning, a state space model is formulated that accounts for three different flow contributions: wastewater from households, fast rainfall-runoff from paved areas, and slow rainfall-dependent infiltration-inflow from unknown sources. We consider two different approaches to evaluate the model output uncertainty: the output error method, which lumps all uncertainty into the observation noise term, and a method based on Stochastic Differential...
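    Two of the probabilistic measures named here have compact standard definitions: reliability as observed coverage of the bounds, sharpness as average interval width, and the interval (skill) score in the Gneiting-Raftery form. Whether the paper uses exactly these variants is an assumption; a sketch:

```python
import numpy as np

def interval_score(lower, upper, obs, alpha=0.10):
    """Gneiting-Raftery interval score for central (1-alpha) prediction
    intervals; smaller is better (width plus penalties for missed points)."""
    width = upper - lower
    penalty = (2.0 / alpha) * (np.maximum(lower - obs, 0.0)
                               + np.maximum(obs - upper, 0.0))
    return float(np.mean(width + penalty))

def reliability(lower, upper, obs):
    """Observed coverage of the bounds, to compare with the nominal level."""
    return float(np.mean((obs >= lower) & (obs <= upper)))

def sharpness(lower, upper):
    """Average interval width: narrower bounds are sharper."""
    return float(np.mean(upper - lower))

# toy check on synthetic 90% bounds around standard-normal observations
rng = np.random.default_rng(2)
obs = rng.normal(0.0, 1.0, 1000)
lo, hi = np.full(1000, -1.645), np.full(1000, 1.645)
print(interval_score(lo, hi, obs), reliability(lo, hi, obs), sharpness(lo, hi))
```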

  5. Should Student Evaluation of Teaching Play a Significant Role in the Formal Assessment of Dental Faculty? Two Viewpoints: Viewpoint 1: Formal Faculty Assessment Should Include Student Evaluation of Teaching and Viewpoint 2: Student Evaluation of Teaching Should Not Be Part of Formal Faculty Assessment.

    Science.gov (United States)

    Rowan, Susan; Newness, Elmer J; Tetradis, Sotirios; Prasad, Joanne L; Ko, Ching-Chang; Sanchez, Arlene

    2017-11-01

    Student evaluation of teaching (SET) is often used in the assessment of faculty members' job performance and promotion and tenure decisions, but debate over this use of student evaluations has centered on the validity, reliability, and application of the data in assessing teaching performance. Additionally, the fear of student criticism has the potential of influencing course content delivery and testing measures. This Point/Counterpoint article reviews the potential utility of and controversy surrounding the use of SETs in the formal assessment of dental school faculty. Viewpoint 1 supports the view that SETs are reliable and should be included in those formal assessments. Proponents of this opinion contend that SETs serve to measure a school's effectiveness in support of its core mission, are valid measures based on feedback from the recipients of educational delivery, and provide formative feedback to improve faculty accountability to the institution. Viewpoint 2 argues that SETs should not be used for promotion and tenure decisions, asserting that higher SET ratings do not correlate with improved student learning. The advocates of this viewpoint contend that faculty members may be influenced to focus on student satisfaction rather than pedagogy, resulting in grade inflation. They also argue that SETs are prone to gender and racial biases and that SET results are frequently misinterpreted by administrators. Low response rates and monotonic response patterns are other factors that compromise the reliability of SETs.

  6. Brine migration resulting from CO2 injection into saline aquifers – An approach to risk estimation including various levels of uncertainty

    DEFF Research Database (Denmark)

    Walter, Lena; Binning, Philip John; Oladyshkin, Sergey

    2012-01-01

    ... for large-scale 3D models including complex physics. Therefore, we apply a model reduction based on arbitrary polynomial chaos expansion combined with the probabilistic collocation method. It is shown that, depending on data availability, both types of uncertainty can be equally significant. The presented study ... provides estimates of the risk of brine discharge into freshwater aquifers due to CO2 injection into geological formations, and of the resultant salt concentrations in the overlying drinking water aquifers.
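    The model-reduction idea, replacing the expensive simulator with a polynomial chaos surrogate fitted at sampled inputs, can be illustrated in one dimension. A sketch using a least-squares fit of a probabilists' Hermite basis; the toy response function stands in for the 3D two-phase flow model, and this is not the paper's arbitrary-PC/collocation code:

```python
import numpy as np
from numpy.polynomial.hermite_e import hermevander

def pce_surrogate(model, degree=3, ndraws=2000, seed=0):
    """Non-intrusive PCE in one standard-normal input, fitted by least
    squares; a simple stand-in for collocation-based fitting."""
    rng = np.random.default_rng(seed)
    xi = rng.standard_normal(ndraws)
    y = model(xi)                        # the 'expensive' model runs
    psi = hermevander(xi, degree)        # probabilists' Hermite basis
    coef, *_ = np.linalg.lstsq(psi, y, rcond=None)
    return lambda x: hermevander(x, degree) @ coef

# toy response standing in for the full simulator
surrogate = pce_surrogate(lambda x: np.exp(0.3 * x) + 0.1 * x**2)
xi = np.random.default_rng(1).standard_normal(200_000)
y = surrogate(xi)                        # cheap Monte Carlo on the surrogate
print("mean:", y.mean(), " 99th percentile:", np.percentile(y, 99))
```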

  7. Mapping of Schistosomiasis and Soil-Transmitted Helminths in Namibia: The First Large-Scale Protocol to Formally Include Rapid Diagnostic Tests.

    Science.gov (United States)

    Sousa-Figueiredo, José Carlos; Stanton, Michelle C; Katokele, Stark; Arinaitwe, Moses; Adriko, Moses; Balfour, Lexi; Reiff, Mark; Lancaster, Warren; Noden, Bruce H; Bock, Ronnie; Stothard, J Russell

    2015-01-01

    Namibia is now ready to begin mass drug administration of praziquantel and albendazole against schistosomiasis and soil-transmitted helminths, respectively. Although historical data identify areas of transmission of these neglected tropical diseases (NTDs), there is a need to update epidemiological data. For this reason, Namibia adopted a new protocol for mapping of schistosomiasis and geohelminths, formally integrating rapid diagnostic tests (RDTs) for infections and morbidity. In this article, we explain the protocol in detail, and introduce the concept of 'mapping resolution', as well as present results and treatment recommendations for northern Namibia. This new protocol allowed a large sample to be surveyed (N = 17,896 children from 299 schools) at relatively low cost (7 USD per person mapped) and very quickly (28 working days). All children were analysed by RDTs, but only a sub-sample was also diagnosed by light microscopy. Overall prevalence of schistosomiasis in the surveyed areas was 9.0%, highly associated with poorer access to potable water (OR = 1.5, P<0.001) and defective (OR = 1.2, P<0.001) or absent sanitation infrastructure (OR = 2.0, P<0.001). Overall prevalence of geohelminths, more particularly hookworm infection, was 12.2%, highly associated with presence of faecal occult blood (OR = 1.9, P<0.001). Prevalence maps were produced and hot spots identified to better guide the national programme in drug administration, as well as targeted improvements in water, sanitation and hygiene. The RDTs employed (circulating cathodic antigen and microhaematuria for Schistosoma mansoni and S. haematobium, respectively) performed well, with sensitivities above 80% and specificities above 95%. This protocol is cost-effective and sensitive to budget limitations and the potential economic and logistical strains placed on the national Ministries of Health. Here we present a high resolution map of disease prevalence levels, and treatment regimens are recommended.

  8. Mapping of Schistosomiasis and Soil-Transmitted Helminths in Namibia: The First Large-Scale Protocol to Formally Include Rapid Diagnostic Tests.

    Directory of Open Access Journals (Sweden)

    José Carlos Sousa-Figueiredo

    Namibia is now ready to begin mass drug administration of praziquantel and albendazole against schistosomiasis and soil-transmitted helminths, respectively. Although historical data identify areas of transmission of these neglected tropical diseases (NTDs), there is a need to update epidemiological data. For this reason, Namibia adopted a new protocol for mapping of schistosomiasis and geohelminths, formally integrating rapid diagnostic tests (RDTs) for infections and morbidity. In this article, we explain the protocol in detail, and introduce the concept of 'mapping resolution', as well as present results and treatment recommendations for northern Namibia. This new protocol allowed a large sample to be surveyed (N = 17,896 children from 299 schools) at relatively low cost (7 USD per person mapped) and very quickly (28 working days). All children were analysed by RDTs, but only a sub-sample was also diagnosed by light microscopy. Overall prevalence of schistosomiasis in the surveyed areas was 9.0%, highly associated with poorer access to potable water (OR = 1.5, P<0.001) and defective (OR = 1.2, P<0.001) or absent sanitation infrastructure (OR = 2.0, P<0.001). Overall prevalence of geohelminths, more particularly hookworm infection, was 12.2%, highly associated with presence of faecal occult blood (OR = 1.9, P<0.001). Prevalence maps were produced and hot spots identified to better guide the national programme in drug administration, as well as targeted improvements in water, sanitation and hygiene. The RDTs employed (circulating cathodic antigen and microhaematuria for Schistosoma mansoni and S. haematobium, respectively) performed well, with sensitivities above 80% and specificities above 95%. This protocol is cost-effective and sensitive to budget limitations and the potential economic and logistical strains placed on the national Ministries of Health. Here we present a high resolution map of disease prevalence levels, and treatment regimens are

  9. Formalizing Probabilistic Safety Claims

    Science.gov (United States)

    Herencia-Zapana, Heber; Hagen, George E.; Narkawicz, Anthony J.

    2011-01-01

    A safety claim for a system is a statement that the system, which is subject to hazardous conditions, satisfies a given set of properties. Following work by John Rushby and Bev Littlewood, this paper presents a mathematical framework that can be used to state and formally prove probabilistic safety claims. It also enables hazardous conditions, their uncertainties, and their interactions to be integrated into the safety claim. This framework provides a formal description of the probabilistic composition of an arbitrary number of hazardous conditions and their effects on system behavior. An example is given of a probabilistic safety claim for a conflict detection algorithm for aircraft in a 2D airspace. The motivation for developing this mathematical framework is that it can be used in an automated theorem prover to formally verify safety claims.
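    As a deliberately simple stand-in for the paper's composition operators (the actual framework is developed for an automated theorem prover), the probability that a claim survives an arbitrary number of hazardous conditions can be computed under an independence assumption:

```python
from math import prod

def claim_probability(p_violation_per_hazard):
    """P(property holds under all hazards), assuming independent hazards;
    a toy simplification of the paper's probabilistic composition."""
    return prod(1.0 - p for p in p_violation_per_hazard)

print(claim_probability([1e-4, 5e-5, 2e-3]))   # three hypothetical hazards
```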

  10. Superfield formalism

    Indian Academy of Sciences (India)

    ... dimensional superfields, is a clear signature of the presence of the (anti-)BRST invariance in the original 4D theory. Keywords. Non-Abelian 1-form gauge theory; Dirac fields; (anti-)Becchi-Rouet-Stora-Tyutin invariance; superfield formalism; ...

  11. Evaluation of the Repeatability of the Delta Q Duct Leakage Testing Technique Including Investigation of Robust Analysis Techniques and Estimates of Weather Induced Uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Dickerhoff, Darryl; Walker, Iain

    2008-08-01

    ... found for the pressure station approach. Walker and Dickerhoff also included estimates of DeltaQ test repeatability based on the results of field tests where two houses were tested multiple times. The two houses were quite leaky (20-25 Air Changes per Hour at 50 Pa (0.2 in. water) (ACH50)) and were located in the San Francisco Bay area. One house was tested on a calm day and the other on a very windy day. Results were also presented for two additional houses, tested by other researchers in Minneapolis, MN and Madison, WI, that had very tight envelopes (1.8 and 2.5 ACH50). These tight houses had internal duct systems and were tested without operating the central blower, sometimes referred to as control tests. The standard deviations between the multiple tests for all four houses were found to be about 1% of the envelope air flow at 50 Pa (0.2 in. water) (Q50), which led to the suggestion of this as a rule of thumb for estimating DeltaQ uncertainty. Because DeltaQ is based on measuring envelope air flows, it makes sense for uncertainty to scale with envelope leakage. However, these tests were on a limited data set, and one of the objectives of the current study is to increase the number of tested houses. This study focuses on answering two questions: (1) What is the uncertainty associated with changes in weather (primarily wind) conditions during DeltaQ testing? (2) How can these uncertainties be reduced? The first question addresses issues of repeatability. To study this, five houses were tested as many times as possible over a day. Weather data were recorded on-site, including the local wind speed. The results from these five houses were combined with the two Bay Area homes from the previous studies. The variability of the tests (represented by the standard deviation) is the repeatability of the test method for that house under the prevailing weather conditions. Because the testing was performed over a day, a wide range of wind speeds was achieved following

  12. Superfield formalism

    Indian Academy of Sciences (India)

    ... framework of the usual superfield approach to BRST formalism [1–9]. This approach, however, has not been able ... within the framework of the superfield formulation. The central theme of a couple of very ... define [23a] the curvature tensor Fµν = ∂µAν − ∂νAµ + iAµ × Aν. Here B and B̄ are the auxiliary fields that satisfy the ...

  13. Beyond formalism

    Science.gov (United States)

    Denning, Peter J.

    1991-01-01

    The ongoing debate over the role of formalism and formal specifications in software features many speakers with diverse positions. Yet, in the end, they share the conviction that the requirements of a software system can be unambiguously specified, that acceptable software is a product demonstrably meeting the specifications, and that the design process can be carried out with little interaction between designers and users once the specification has been agreed to. This conviction is part of a larger paradigm prevalent in American management thinking, which holds that organizations are systems that can be precisely specified and optimized. This paradigm, which traces historically to the works of Frederick Taylor in the early 1900s, is no longer sufficient for organizations and software systems today. In the domain of software, a new paradigm, called user-centered design, overcomes the limitations of pure formalism. Pioneered in Scandinavia, user-centered design is spreading through Europe and is beginning to make its way into the U.S.

  14. Momentum conservation decides Heisenberg's interpretation of the uncertainty formulas

    International Nuclear Information System (INIS)

    Angelidis, T.D.

    1977-01-01

    In the light of Heisenberg's interpretation of the uncertainty formulas, the conditions necessary for the derivation of the quantitative statement or law of momentum conservation are considered. The result of such considerations is a contradiction between the formalism of quantum physics and the asserted consequences of Heisenberg's interpretation. This contradiction decides against Heisenberg's interpretation of the uncertainty formulas, provided one upholds that the formalism of quantum physics is both consistent and complete, at least insofar as the statement of momentum conservation can be proved within this formalism. A few comments are also included on Bohr's complementarity interpretation of the formalism of quantum physics. A suggestion, based on a statistical mode of empirical testing of the uncertainty formulas, does not give rise to any such contradiction

  15. Probabilistic Accident Consequence Uncertainty Analysis of the Late Health Effects Module in the COSYMA Package (invited paper)

    International Nuclear Information System (INIS)

    Goossens, L.H.J.; Wakeford, R.; Little, M.; Muirhead, C.; Hasemann, I.; Jones, J.A.

    2000-01-01

    The uncertainty analysis of the late health effects module of COSYMA is described, covering both the methods of obtaining distributions on the input parameters and the results of the analysis. The uncertainty distributions on the input parameter values were obtained using formal expert judgement elicitation techniques. The aim of the module analysis was to identify those parameters whose uncertainty makes major contributions to the overall uncertainty, and which should therefore be included in the final analysis of the whole COSYMA package. (author)

  16. Probabilistic Accident Consequence Uncertainty Analysis of the Atmospheric Dispersion and Deposition Module in the COSYMA Package (invited paper)

    International Nuclear Information System (INIS)

    Pasler-Sauer, J.; Jones, J.A.

    2000-01-01

    The uncertainty analysis of the atmospheric dispersion and deposition module of COSYMA is described, covering both the methods of obtaining distributions on the input parameters and the results of the analysis. The uncertainty distributions on the input parameter values were obtained using formal techniques of expert judgement elicitation. The aim of the module analysis was to identify those parameters whose uncertainty makes major contributions to the overall uncertainty, and which should therefore be included in the final analysis of the whole COSYMA system. (author)
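    Identifying the parameters whose uncertainty drives the overall spread is typically done by sampling the elicited distributions and ranking input-output rank correlations. A generic Python sketch; the parameter names, distributions and the stand-in response below are invented for illustration and are not COSYMA's:

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(5)
n = 5000

# hypothetical elicited marginals for three dispersion/deposition parameters
params = {
    "dry_dep_velocity": rng.lognormal(np.log(1e-3), 0.7, n),
    "sigma_z_factor":   rng.lognormal(np.log(1.0),  0.4, n),
    "washout_coeff":    rng.lognormal(np.log(1e-4), 1.0, n),
}
# stand-in module response; in practice this is the module itself
output = (params["dry_dep_velocity"] ** 0.8
          * params["sigma_z_factor"] ** -0.5
          * params["washout_coeff"] ** 0.3)

for name, x in params.items():
    rho, _ = spearmanr(x, output)
    print(f"{name:>16}: rank correlation with output = {rho:+.2f}")
```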

  17. Revisiting the formal foundation of Probabilistic Databases

    NARCIS (Netherlands)

    Wanders, B.; van Keulen, Maurice

    2015-01-01

    One of the core problems in soft computing is dealing with uncertainty in data. In this paper, we revisit the formal foundation of a class of probabilistic databases with the purpose to (1) obtain data model independence, (2) separate metadata on uncertainty and probabilities from the raw data, (3)

  18. Decision making under uncertainty

    International Nuclear Information System (INIS)

    Wu, J.S.; Apostolakis, G.E.; Okrent, D.

    1989-01-01

    The theory of evidence and the theory of possibility are considered by some analysts as potential models for uncertainty. This paper discusses two issues: how formal probability theory has been relaxed to develop these uncertainty models; and the degree to which these models can be applied to risk assessment. The scope of the second issue is limited to an investigation of their compatibility for combining various pieces of evidence, which is an important problem in PRA

  19. A geometrically exact Cosserat shell-model including size effects, avoiding degeneracy in the thin shell limit. Part I: Formal dimensional reduction for elastic plates and existence of minimizers for positive Cosserat couple modulus

    Science.gov (United States)

    Neff, P.

    2004-10-01

    This contribution is concerned with a consistent formal dimensional reduction of a previously introduced finite-strain three-dimensional Cosserat micropolar elasticity model to the two-dimensional situation of thin plates and shells. Contrary to the direct modelling of a shell as a Cosserat surface with additional directors, we obtain the shell model from the Cosserat bulk model, which already includes a triad of rigid directors. The reduction is achieved by assumed kinematics, quadratic through the thickness. The three-dimensional transverse boundary conditions can be evaluated analytically in terms of the assumed kinematics and determine exactly the two coefficients appearing in the chosen ansatz. Further simplifications with subsequent analytical integration through the thickness determine the reduced model in a variational setting. The resulting membrane energy turns out to be a quadratic, elliptic, first-order, non-degenerate energy, in contrast to classical approaches. The bending contribution is augmented by a curvature term representing an additional stiffness of the Cosserat model, and the corresponding system of balance equations remains of second order. The lateral boundary conditions for simple support are non-standard. The model includes size effects, transverse shear resistance, drilling degrees of freedom, and accounts implicitly for thickness extension and asymmetric shift of the midsurface. The formal thin-shell "membrane" limit without the classical h³-bending term is non-degenerate due to the additional Cosserat curvature stiffness and the control of drill rotations. In our formulation, the drill rotations are strictly related to the size effects of the bulk model and are not introduced artificially for numerical convenience. Upon linearization with zero Cosserat couple modulus μ_c = 0, we recover the well-known infinitesimal-displacement Reissner-Mindlin model without size effects and without drill rotations. It is shown that the dimensionally reduced

  20. Outline of Neutron Scattering Formalism

    OpenAIRE

    Berk, N. F.

    1993-01-01

    Neutron scattering formalism is briefly surveyed. Topics touched upon include coherent and incoherent scattering, bound and free cross-sections, the Van Hove formalism, magnetic scattering, elastic scattering, the static approximation, sum rules, small angle scattering, inelastic scattering, thermal diffuse scattering, quasielastic scattering, and neutron optics.

  1. Particle size distribution in soils and marine sediments by laser diffraction using Malvern Mastersizer 2000—method uncertainty including the effect of hydrogen peroxide pretreatment

    DEFF Research Database (Denmark)

    Callesen, Ingeborg; Keck, Hannes; Andersen, Thorbjørn Joest

    2018-01-01

    Purpose: Methods for particle size distribution (PSD) determination by laser diffraction are not standardized and differ between disciplines and sectors. The effect of H2O2 pretreatment before a sonication treatment in laser diffraction analysis of soils and marine sediments was examined on soils with less than 1% C and some marine sediments. Materials and methods: The method uncertainty for particle size analysis by the laser diffraction method, using or not using H2O2 pretreatment followed by 2 min of ultrasound and 1-mm sieving, was determined for two soil samples and two aquatic sediments by analyzing ten replicates on a Malvern M2000 instrument. The carbon content was in the normal range for upland soils (0.1–0.9% C), but one of the aquatic sediment samples had a high carbon content (16.3% C), for which the H2O2 pretreatment was not feasible. Results and discussion: The effect of H2O2...

  2. Extended uncertainty from first principles

    Energy Technology Data Exchange (ETDEWEB)

    Costa Filho, Raimundo N., E-mail: rai@fisica.ufc.br [Departamento de Física, Universidade Federal do Ceará, Caixa Postal 6030, Campus do Pici, 60455-760 Fortaleza, Ceará (Brazil); Braga, João P.M., E-mail: philipe@fisica.ufc.br [Instituto de Ciências Exatas e da Natureza-ICEN, Universidade da Integração Internacional da Lusofonia Afro-Brasileira-UNILAB, Campus dos Palmares, 62785-000 Acarape, Ceará (Brazil); Lira, Jorge H.S., E-mail: jorge.lira@mat.ufc.br [Departamento de Matemática, Universidade Federal do Ceará, Caixa Postal 6030, Campus do Pici, 60455-760 Fortaleza, Ceará (Brazil); Andrade, José S., E-mail: soares@fisica.ufc.br [Departamento de Física, Universidade Federal do Ceará, Caixa Postal 6030, Campus do Pici, 60455-760 Fortaleza, Ceará (Brazil)

    2016-04-10

    A translation operator acting in a space with a diagonal metric is introduced to describe the motion of a particle in a quantum system. We show that the momentum operator and, as a consequence, the uncertainty relation now depend on the metric. It is also shown that, for any metric expanded up to second order, this formalism naturally leads to an extended uncertainty principle (EUP) with a minimum momentum dispersion. The Ehrenfest theorem is modified to include an additional term related to a tidal force arriving from the space curvature introduced by the metric. For one-dimensional systems, we show how to map a harmonic potential to an effective potential in Euclidean space using different metrics.

  3. Extended uncertainty from first principles

    International Nuclear Information System (INIS)

    Costa Filho, Raimundo N.; Braga, João P.M.; Lira, Jorge H.S.; Andrade, José S.

    2016-01-01

    A translation operator acting in a space with a diagonal metric is introduced to describe the motion of a particle in a quantum system. We show that the momentum operator and, as a consequence, the uncertainty relation now depend on the metric. It is also shown that, for any metric expanded up to second order, this formalism naturally leads to an extended uncertainty principle (EUP) with a minimum momentum dispersion. The Ehrenfest theorem is modified to include an additional term related to a tidal force arriving from the space curvature introduced by the metric. For one-dimensional systems, we show how to map a harmonic potential to an effective potential in Euclidean space using different metrics.
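    For reference, extended uncertainty principles of this kind are conventionally written with a position-dependent correction term. The parametrization below is the standard one from the EUP literature and is an assumption here, since the abstract does not display the formula:

```latex
\Delta x \,\Delta p \;\ge\; \frac{\hbar}{2}\left[\,1 + \beta\,(\Delta x)^{2}\right]
\quad\Longrightarrow\quad
\Delta p \;\ge\; \frac{\hbar}{2}\left(\frac{1}{\Delta x} + \beta\,\Delta x\right)
\;\ge\; \hbar\sqrt{\beta} \;=\; \Delta p_{\min}
```

    The minimum momentum dispersion follows from minimizing the bound over Δx (attained at Δx = 1/√β); β > 0 encodes the second-order metric correction mentioned in the abstract.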

  4. Formal groups and applications

    CERN Document Server

    Hazewinkel, Michiel

    2012-01-01

    This book is a comprehensive treatment of the theory of formal groups and its numerous applications in several areas of mathematics. The seven chapters of the book present basics and main results of the theory, as well as very important applications in algebraic topology, number theory, and algebraic geometry. Each chapter ends with several pages of historical and bibliographic summary. One prerequisite for reading the book is an introductory graduate algebra course, including certain familiarity with category theory.

  5. Heuristics structure and pervade formal risk assessment.

    Science.gov (United States)

    MacGillivray, Brian H

    2014-04-01

    Lay perceptions of risk appear rooted more in heuristics than in reason. A major concern of the risk regulation literature is that such "error-strewn" perceptions may be replicated in policy, as governments respond to the (mis)fears of the citizenry. This has led many to advocate a relatively technocratic approach to regulating risk, characterized by high reliance on formal risk and cost-benefit analysis. However, through two studies of chemicals regulation, we show that the formal assessment of risk is pervaded by its own set of heuristics. These include rules to categorize potential threats, define what constitutes valid data, guide causal inference, and to select and apply formal models. Some of these heuristics lay claim to theoretical or empirical justifications, others are more back-of-the-envelope calculations, while still more purport not to reflect some truth but simply to constrain discretion or perform a desk-clearing function. These heuristics can be understood as a way of authenticating or formalizing risk assessment as a scientific practice, representing a series of rules for bounding problems, collecting data, and interpreting evidence (a methodology). Heuristics are indispensable elements of induction. And so they are not problematic per se, but they can become so when treated as laws rather than as contingent and provisional rules. Pitfalls include the potential for systematic error, masking uncertainties, strategic manipulation, and entrenchment. Our central claim is that by studying the rules of risk assessment qua rules, we develop a novel representation of the methods, conventions, and biases of the prior art. © 2013 Society for Risk Analysis.

  6. Readings in Formal Epistemology

    DEFF Research Database (Denmark)

    'Formal epistemology' is a term coined in the late 1990s for a new constellation of interests in philosophy, the roots of which are found in earlier works of epistemologists, philosophers of science, and logicians. It addresses a growing agenda of problems concerning knowledge, belief, certainty, rationality, deliberation, decision, strategy, action and agent interaction – and it does so using methods from logic, probability, computability, decision, and game theory. This volume presents 42 classic texts in formal epistemology, and strengthens the ties between research into this area of philosophy and its neighbouring intellectual disciplines. The editors provide introductions to five basic subsections: Bayesian Epistemology, Belief Change, Decision Theory, Interactive Epistemology and Logics of Knowledge and Belief. The volume also includes a thorough index and suggestions for further reading...

  7. Approach to uncertainty in risk analysis

    Energy Technology Data Exchange (ETDEWEB)

    Rish, W.R.

    1988-08-01

    In the Fall of 1985, EPA's Office of Radiation Programs (ORP) initiated a project to develop a formal approach to dealing with uncertainties encountered when estimating and evaluating risks to human health and the environment. Based on a literature review of modeling uncertainty, interviews with ORP technical and management staff, and input from experts on uncertainty analysis, a comprehensive approach was developed. This approach recognizes by design the constraints on budget, time, manpower, expertise, and availability of information often encountered in "real world" modeling. It is based on the observation that in practice risk modeling is usually done to support a decision process. As such, the approach focuses on how to frame a given risk modeling problem, how to use that framing to select an appropriate mixture of uncertainty analysis techniques, and how to integrate the techniques into an uncertainty assessment that effectively communicates important information and insight to decision-makers. The approach is presented in this report. Practical guidance on characterizing and analyzing uncertainties about model form and quantities, and on effectively communicating uncertainty analysis results, is included. Examples from actual applications are presented.

  8. Scalable Techniques for Formal Verification

    CERN Document Server

    Ray, Sandip

    2010-01-01

    This book presents state-of-the-art approaches to formal verification techniques to seamlessly integrate different formal verification methods within a single logical foundation. It should benefit researchers and practitioners looking to get a broad overview of the spectrum of formal verification techniques, as well as approaches to combining such techniques within a single framework. Coverage includes a range of case studies showing how such combination is fruitful in developing a scalable verification methodology for industrial designs. This book outlines both theoretical and practical issue

  9. Uncertainty governance: an integrated framework for managing and communicating uncertainties

    International Nuclear Information System (INIS)

    Umeki, H.; Naito, M.; Takase, H.

    2004-01-01

    Treatment of uncertainty, in other words reasoning with imperfect information, is widely recognised as being of great importance within performance assessment (PA) of geological disposal, mainly because of the time scale of interest and the spatial heterogeneity that the geological environment exhibits. A wide range of formal methods have been proposed for the optimal processing of incomplete information. Many of these methods rely on the use of numerical information, in particular the frequency-based concept of probability, to handle the imperfections. However, taking quantitative information as a base for models that solve the problem of handling imperfect information merely creates another problem, i.e., how to provide the quantitative information. In many situations this second problem proves more resistant to solution, and in recent years several authors have explored particularly ingenious ways of working in accordance with the rules of well-founded methods such as Bayesian probability theory, possibility theory, and the Dempster-Shafer theory of evidence. Those methods, while drawing inspiration from quantitative methods, do not require the kind of complete numerical information required by quantitative methods. Instead they provide information that, though less precise than that provided by quantitative techniques, is often, if not sufficient, the best that could be achieved. Rather than searching for the best method for handling all imperfect information, our strategy for uncertainty management, that is, recognition and evaluation of uncertainties associated with PA, followed by planning and implementation of measures to reduce them, is to use whichever method best fits the problem at hand. Such an eclectic position leads naturally to integration of the different formalisms. While uncertainty management based on the combination of semi-quantitative methods forms an important part of our framework for uncertainty governance, it only solves half of the problem

  10. Subjective judgment on measure of data uncertainty

    International Nuclear Information System (INIS)

    Pronyaev, V.G.; Bytchkova, A.V.

    2004-01-01

    Integral parameters are considered which can be derived from the covariance matrix of the uncertainties and can serve as a general measure of uncertainties in comparisons of different fits. Using realistic examples and simple data model fits with a variable number of parameters, the author was able to show that the sum of all elements of the covariance matrix is the best general measure for characterizing and comparing uncertainties obtained in different model and non-model fits. Discussions also included the problem of non-positive definiteness of the covariance matrix of the uncertainty of the cross sections obtained from the covariance matrix of the uncertainty of the parameters, in cases where the number of parameters is less than the number of cross section points. Because the numerical inaccuracy of the calculations is always many orders of magnitude larger than the machine-zero representation, it was concluded that the calculated eigenvalues of positive semi-definite matrices have no machine zeros. These covariance matrices can therefore be inverted when they are used in the error propagation equations, so the procedure for transforming semi-positive definite matrices into positive definite ones by introducing minimal changes into the matrix (changes equivalent to introducing additional non-informative parameters in the model) is generally not needed. Caution should be observed, however, because there can be cases where uncertainties are unphysical, e.g. integral parameters estimated with formally non-positive-definite covariance matrices
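    Both quantities discussed here, the sum of all covariance-matrix elements as a scalar uncertainty measure and the eigenvalue-clipping repair that the author argues is generally unnecessary, are easy to compute. A generic sketch with an invented indefinite example matrix:

```python
import numpy as np

def total_uncertainty_measure(cov):
    """Sum of all elements of the covariance matrix: the variance of the
    unweighted sum of the quantities, used as a scalar comparison measure."""
    return float(np.sum(cov))

def repair_to_psd(cov, floor=0.0):
    """Clip negative eigenvalues: the 'minimal change' repair noted above."""
    w, v = np.linalg.eigh(cov)
    return (v * np.clip(w, floor, None)) @ v.T

cov = np.array([[ 1.0, 0.9, -0.9],
                [ 0.9, 1.0,  0.9],
                [-0.9, 0.9,  1.0]])          # not positive semi-definite
print("min eigenvalue:", np.linalg.eigvalsh(cov).min())
fixed = repair_to_psd(cov)
print("after repair:", np.linalg.eigvalsh(fixed).min(),
      "measure:", total_uncertainty_measure(fixed))
```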

  11. O-Theory: a hybrid uncertainty theory

    Energy Technology Data Exchange (ETDEWEB)

    Oblow, E.M.

    1985-10-01

    A hybrid uncertainty theory is developed to bridge the gap between fuzzy set theory and Bayesian inference theory. Its basis is the Dempster-Shafer formalism (a probability-like, set-theoretic approach), which is extended and expanded upon so as to include a complete set of basic operations for manipulating uncertainties in approximate reasoning. The new theory, operator-belief theory (OT), retains the probabilistic flavor of Bayesian inference but includes the potential for defining a wider range of operators like those found in fuzzy set theory. The basic operations defined for OT in this paper include those for: dominance and order, union, intersection, complement and general mappings. A formal relationship between the membership function in fuzzy set theory and the upper probability function in the Dempster-Shafer formalism is also developed. Several sample problems in logical inference are worked out to illustrate the results derived from this new approach as well as to compare them with the other theories currently being used. A general method of extending the theory using the historical development of fuzzy set theory as an example is suggested.
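    O-Theory's own operators are defined in the report; the Dempster-Shafer combination rule that it extends can, however, be sketched directly for mass functions over a finite frame of discernment:

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule: combine two mass functions (dict frozenset -> mass),
    renormalizing away the mass assigned to conflicting (empty) intersections."""
    raw, conflict = {}, 0.0
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        c = a & b
        if c:
            raw[c] = raw.get(c, 0.0) + x * y
        else:
            conflict += x * y
    k = 1.0 - conflict                   # normalizer: total non-conflicting mass
    return {s: v / k for s, v in raw.items()}

A, B = frozenset("a"), frozenset("b")
theta = frozenset("ab")                  # the whole frame of discernment
m1 = {A: 0.6, theta: 0.4}                # evidence partly supporting 'a'
m2 = {B: 0.3, theta: 0.7}                # evidence partly supporting 'b'
print(dempster_combine(m1, m2))          # masses on 'a', 'b' and the frame
```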

  12. Archigregarines of the English Channel revisited: New molecular data on Selenidium species including early described and new species and the uncertainties of phylogenetic relationships.

    Directory of Open Access Journals (Sweden)

    Sonja Rueckert

    Full Text Available Gregarines represent an important transition step from free-living predatory (colpodellids s.l. and/or photosynthetic (Chromera and Vitrella apicomplexan lineages to the most important pathogens, obligate intracellular parasites of humans and domestic animals such as coccidians and haemosporidians (Plasmodium, Toxoplasma, Eimeria, Babesia, etc.. While dozens of genomes of other apicomplexan groups are available, gregarines are barely entering the molecular age. Among the gregarines, archigregarines possess a unique mixture of ancestral (myzocytosis and derived (lack of apicoplast, presence of subpellicular microtubules features.In this study we revisited five of the early-described species of the genus Selenidium including the type species Selenidium pendula, with special focus on surface ultrastructure and molecular data. We were also able to describe three new species within this genus. All species were characterized at morphological (light and scanning electron microscopy data and molecular (SSU rDNA sequence data levels. Gregarine specimens were isolated from polychaete hosts collected from the English Channel near the Station Biologique de Roscoff, France: Selenidium pendula from Scolelepis squamata, S. hollandei and S. sabellariae from Sabellaria alveolata, S. sabellae from Sabella pavonina, Selenidium fallax from Cirriformia tentaculata, S. spiralis sp. n. and S. antevariabilis sp. n. from Amphitritides gracilis, and S. opheliae sp. n. from Ophelia roscoffensis. Molecular phylogenetic analyses of these data showed archigregarines clustering into five separate clades and support previous doubts about their monophyly.Our phylogenies using the extended gregarine sampling show that the archigregarines are indeed not monophyletic with one strongly supported clade of Selenidium sequences around the type species S. pendula. We suggest the revision of the whole archigregarine taxonomy with only the species within this clade remaining in the genus

  13. Pragmatics for formal semantics

    DEFF Research Database (Denmark)

    Danvy, Olivier

    2011-01-01

    This tech talk describes how to write and how to inter-derive formal semantics for sequential programming languages. The progress reported here is (1) concrete guidelines to write each formal semantics to alleviate their proof obligations, and (2) simple calculational tools to obtain a formal sem...

  14. The Quantum Formalism and the GRW Formalism

    Science.gov (United States)

    Goldstein, Sheldon; Tumulka, Roderich; Zanghì, Nino

    2012-10-01

    The Ghirardi-Rimini-Weber (GRW) theory of spontaneous wave function collapse is known to provide a quantum theory without observers, in fact two different ones by using either the matter density ontology (GRWm) or the flash ontology (GRWf). Both theories are known to make predictions different from those of quantum mechanics, but the difference is so small that no decisive experiment can as yet be performed. While some testable deviations from quantum mechanics have long been known, we provide here something that has until now been missing: a formalism that succinctly summarizes the empirical predictions of GRWm and GRWf. We call it the GRW formalism. Its structure is similar to that of the quantum formalism but involves different operators. In other words, we establish the validity of a general algorithm for directly computing the testable predictions of GRWm and GRWf. We further show that some well-defined quantities cannot be measured in a GRWm or GRWf world.

  15. Photometric Uncertainties

    Science.gov (United States)

    Zou, Xiao-Duan; Li, Jian-Yang; Clark, Beth Ellen; Golish, Dathon

    2018-01-01

    The OSIRIS-REx spacecraft, launched in September 2016, will study the asteroid Bennu and return a sample from its surface to Earth in 2023. Bennu is a near-Earth carbonaceous asteroid which will provide insight into the formation and evolution of the solar system. OSIRIS-REx will first approach Bennu in August 2018 and will study the asteroid for approximately two years before sampling. OSIRIS-REx will develop its photometric model of Bennu (including Lommel-Seeliger, ROLO, McEwen, Minnaert and Akimov) with OCAM and OVIRS during the Detailed Survey mission phase. The model developed during this phase will be used to photometrically correct the OCAM and OVIRS data. Here we present the error analysis for the photometric corrections. Based on our test data sets, we find:
    1. The model uncertainties are only correct when calculated using the full covariance matrix, because the parameters are highly correlated.
    2. There is no evidence that any single parameter dominates in any of the models.
    3. Both the model error and the data error contribute comparably to the final correction error.
    4. Testing the uncertainty module on synthetic and real data sets shows that model performance depends on data coverage and data quality; these tests gave us a better understanding of how the different models behave in different cases.
    5. The L-S model is more reliable than the others, perhaps because the simulated data are based on the L-S model; the test on real data (SPDIF) also shows a slight advantage for L-S. ROLO is not reliable for calculating Bond albedo. The uncertainty of the McEwen model is large in most cases. Akimov performs unphysically on the SOPIE 1 data.
    6. L-S is the best default choice, a conclusion based mainly on our tests on the SOPIE data and IPDIF.

  16. Industrial use of formal methods formal verification

    CERN Document Server

    Boulanger, Jean-Louis

    2012-01-01

    At present, the literature offers students and researchers only very general books on formal techniques. The purpose of this book is to present, in a single volume, feedback from experience on the use of formal techniques (such as proof and model checking) on industrial examples from the transportation domain. The book is based on the experience of people who are directly involved in the realization and evaluation of safety-critical, software-based systems. The involvement of industrial practitioners makes it possible to raise the problems of confidentiality which could appear and so allow

  17. Regulating fisheries under uncertainty

    DEFF Research Database (Denmark)

    Hansen, Lars Gårn; Jensen, Frank

    2017-01-01

    Regulator uncertainty is decisive for whether price or quantity regulation maximizes welfare in fisheries. In this paper, we develop a model of fisheries regulation that includes ecological uncertainty and variable economic uncertainty as well as structural economic uncertainty. We aggregate the effects of these uncertainties into a single welfare measure for comparing tax and quota regulation. It is shown that quotas are always preferred to fees when structural economic uncertainty dominates. Since most regulators are subject to this kind of uncertainty, this result is a potentially important qualification of the pro-price regulation message dominating the fisheries economics literature. We also believe that the model of a fishery developed in this paper could be applied to the regulation of other renewable resources where regulators are subject to uncertainty either directly or with some...

  18. DS02 uncertainty analysis

    International Nuclear Information System (INIS)

    Kaul, Dean C.; Egbert, Stephen D.; Woolson, William A.

    2005-01-01

    In order to avoid the pitfalls that so discredited DS86 and its uncertainty estimates, and to provide DS02 uncertainties that are both defensible and credible, this report not only presents the ensemble uncertainties assembled from uncertainties in individual computational elements and radiation dose components but also describes how these relate to comparisons between observed and computed quantities at critical intervals in the computational process. These comparisons include those between observed and calculated radiation free-field components, where observations include thermal- and fast-neutron activation and gamma-ray thermoluminescence, which are relevant to the estimated systematic uncertainty for DS02. The comparisons also include those between calculated and observed survivor shielding, where the observations consist of biodosimetric measurements for individual survivors, which are relevant to the estimated random uncertainty for DS02. (J.P.N.)

  19. Uncertainty and its propagation in dynamics models

    International Nuclear Information System (INIS)

    Devooght, J.

    1994-01-01

    The purpose of this paper is to bring together some characteristics of uncertainty that arise when dealing with dynamic models, and therefore with the propagation of uncertainty. The respective roles of uncertainty and inaccuracy are examined. A mathematical formalism based on the Chapman-Kolmogorov equation allows one to define a 'subdynamics' in which the evolution equation takes the uncertainty into account. The problem of choosing or combining models is examined through a loss function associated with a decision

  20. Some illustrative examples of model uncertainty

    International Nuclear Information System (INIS)

    Bier, V.M.

    1994-01-01

    In this paper, we first discuss the view of model uncertainty proposed by Apostolakis. We then present several illustrative examples related to model uncertainty, some of which are not well handled by this formalism. Thus, Apostolakis' approach seems to be well suited to describing some types of model uncertainty, but not all. Since a comprehensive approach for characterizing and quantifying model uncertainty is not yet available, it is hoped that the examples presented here will serve as a springboard for further discussion

  1. Uncertainty vs. Information (Invited)

    Science.gov (United States)

    Nearing, Grey

    2017-04-01

    Information theory is the branch of logic that describes how rational epistemic states evolve in the presence of empirical data (Knuth, 2005), and any logic of science is incomplete without such a theory. Developing a formal philosophy of science that recognizes this fact results in essentially trivial solutions to several longstanding problems that are generally considered intractable, including:
    • Alleviating the need for any likelihood function or error model.
    • Derivation of purely logical falsification criteria for hypothesis testing.
    • Specification of a general quantitative method for process-level model diagnostics.
    More generally, I make the following arguments:
    1. Model evaluation should not proceed by quantifying and/or reducing error or uncertainty, and instead should be approached as a problem of ensuring that our models contain as much information as our experimental data. I propose that the latter is the only question a scientist actually has the ability to ask.
    2. Instead of building geophysical models as solutions to differential equations that represent conservation laws, we should build models as maximum entropy distributions constrained by conservation symmetries. This will allow us to derive predictive probabilities directly from first principles.
    Knuth, K. H. (2005) 'Lattice duality: The origin of probability and entropy', Neurocomputing, 67, pp. 245-274.

  2. Understanding uncertainty

    CERN Document Server

    Lindley, Dennis V

    2013-01-01

    Praise for the First Edition: "...a reference for everyone who is interested in knowing and handling uncertainty." - Journal of Applied Statistics. The critically acclaimed First Edition of Understanding Uncertainty provided a study of uncertainty addressed to scholars in all fields, showing that uncertainty could be measured by probability, and that probability obeyed three basic rules that enabled uncertainty to be handled sensibly in everyday life. These ideas were extended to embrace the scientific method and to show how decisions, containing an uncertain element, could be rationally made.

  3. Uncertainty analysis guide

    International Nuclear Information System (INIS)

    Andres, T.H.

    2002-05-01

    This guide applies to the estimation of uncertainty in quantities calculated by scientific analysis and design computer programs that fall within the scope of AECL's software quality assurance (SQA) manual. The guide weaves together rational approaches from the SQA manual and three other diverse sources: (a) the CSAU (Code Scaling, Applicability, and Uncertainty) evaluation methodology; (b) the ISO Guide for the Expression of Uncertainty in Measurement; and (c) the SVA (Systems Variability Analysis) method of risk analysis. This report describes the manner by which random and systematic uncertainties in calculated quantities can be estimated and expressed. Random uncertainty in model output can be attributed to uncertainties of inputs. The propagation of these uncertainties through a computer model can be represented in a variety of ways, including exact calculations, series approximations and Monte Carlo methods. Systematic uncertainties emerge from the development of the computer model itself, through simplifications and conservatisms, for example. These must be estimated and combined with random uncertainties to determine the combined uncertainty in a model output. This report also addresses the method by which uncertainties should be employed in code validation, in order to determine whether experiments and simulations agree, and whether or not a code satisfies the required tolerance for its application. (author)
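
    To make the Monte Carlo option concrete, the sketch below propagates an uncertain input through a placeholder model and reports the resulting random uncertainty; the model, distribution, and numbers are illustrative assumptions, not taken from the AECL guide.

      # Illustrative Monte Carlo propagation of input uncertainty.
      # The model and the input distribution are hypothetical placeholders.
      import numpy as np

      rng = np.random.default_rng(seed=1)

      def model(k, t=2.0):
          # Placeholder computer model: exponential decay evaluated at time t.
          return np.exp(-k * t)

      n = 100_000
      k_samples = rng.normal(loc=0.5, scale=0.05, size=n)  # uncertain input
      y = model(k_samples)                                 # propagated output

      print(f"mean = {y.mean():.4f}, std (random uncertainty) = {y.std(ddof=1):.4f}")
      print("95% interval:", np.percentile(y, [2.5, 97.5]).round(4))

    A systematic (model-form) component would then be estimated separately and combined with this random component, as the guide describes.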

  4. Integrating semi-formal and formal requirements

    NARCIS (Netherlands)

    Wieringa, Roelf J.; Olivé, Antoni; Dubois, Eric; Pastor, Joan Antoni; Huyts, Sander

    1997-01-01

    In this paper, we report on the integration of informal, semiformal and formal requirements specification techniques. We present a framework for requirements specification called TRADE, within which several well-known semiformal specification techniques are placed. TRADE is based on an analysis of

  5. Incorporating model parameter uncertainty into inverse treatment planning

    International Nuclear Information System (INIS)

    Lian Jun; Xing Lei

    2004-01-01

    Radiobiological treatment planning depends not only on the accuracy of the models describing the dose-response relation of different tumors and normal tissues but also on the accuracy of tissue-specific radiobiological parameters in these models. Whereas the general formalism remains the same, different sets of model parameters lead to different solutions and thus critically determine the final plan. Here we describe an inverse planning formalism with inclusion of model parameter uncertainties. This is made possible by using a statistical analysis-based framework developed by our group. In this formalism, the uncertainties of model parameters, such as the parameter a that describes the tissue-specific effect in the equivalent uniform dose (EUD) model, are expressed by probability density functions and are included in the dose optimization process. We found that the final solution strongly depends on the distribution functions of the model parameters. Considering that currently available models for computing the biological effects of radiation are simplistic, and that the clinical data used to derive the models are sparse and of questionable quality, the proposed technique provides an effective tool to minimize, in a statistical sense, the effect caused by these uncertainties. With the incorporation of the uncertainties, the technique has the potential to maximally utilize the available radiobiology knowledge for better IMRT treatment
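
    For orientation, the generalized EUD referred to above is commonly written as EUD = (sum_i v_i d_i^a)^(1/a), where the v_i are fractional volumes and the d_i doses. The sketch below propagates an assumed probability density for the parameter a through this formula; the dose bins and the normal density for a are hypothetical, not the distributions used by the authors.

      # Sketch: propagate uncertainty in the EUD parameter 'a' (assumed density).
      import numpy as np

      rng = np.random.default_rng(0)

      def eud(dose, vol, a):
          # Generalized equivalent uniform dose: (sum_i v_i * d_i**a)**(1/a).
          return np.sum(vol * dose**a) ** (1.0 / a)

      dose = np.array([60.0, 55.0, 40.0, 20.0])  # Gy, hypothetical dose bins
      vol = np.array([0.4, 0.3, 0.2, 0.1])       # fractional volumes, sum to 1

      a_samples = rng.normal(loc=-10.0, scale=2.0, size=5000)  # tumor-like a < 0
      euds = np.array([eud(dose, vol, a) for a in a_samples])
      print(f"EUD: mean = {euds.mean():.2f} Gy, std = {euds.std(ddof=1):.2f} Gy")

    The spread of the resulting EUD values is exactly the sensitivity to the parameter distribution that the abstract emphasizes.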

  6. MODIS land cover uncertainty in regional climate simulations

    Science.gov (United States)

    Li, Xue; Messina, Joseph P.; Moore, Nathan J.; Fan, Peilei; Shortridge, Ashton M.

    2017-12-01

    MODIS land cover datasets are used extensively across the climate modeling community, but inherent uncertainties and associated propagating impacts are rarely discussed. This paper modeled uncertainties embedded within the annual MODIS Land Cover Type (MCD12Q1) products and propagated these uncertainties through the Regional Atmospheric Modeling System (RAMS). First, land cover uncertainties were modeled using pixel-based trajectory analyses from a time series of MCD12Q1 for Urumqi, China. Second, alternative land cover maps were produced based on these categorical uncertainties and passed into RAMS. Finally, simulations from RAMS were analyzed temporally and spatially to reveal impacts. Our study found that MCD12Q1 struggles to discriminate between grasslands and croplands, or between grasslands and barren land, in this study area. Such categorical uncertainties have significant impacts on regional climate model outputs. All climate variables examined demonstrated impact across the various regions, with latent heat flux affected most, by a magnitude of 4.32 W/m² in the domain average. Impacted areas were spatially connected to locations of greater land cover uncertainty. Both the biophysical characteristics and the soil moisture settings associated with land cover types contribute to the variations among simulations. These results indicate that formal land cover uncertainty analysis should be included in MCD12Q1-fed climate modeling as a routine procedure.

  7. Fear of the Formal

    DEFF Research Database (Denmark)

    du Gay, Paul; Lopdrup-Hjorth, Thomas

    Over recent decades, institutions exhibiting high degrees of formality have come in for severe criticism. From the private to the public sector, and across a whole spectrum of actors spanning from practitioners to academics, formal organization is viewed with increasing doubt and skepticism. In a...

  8. Geometry and Formal Linguistics.

    Science.gov (United States)

    Huff, George A.

    This paper presents a method of encoding geometric line-drawings in a way which allows sets of such drawings to be interpreted as formal languages. A characterization of certain geometric predicates in terms of their properties as languages is obtained, and techniques usually associated with generative grammars and formal automata are then applied…

  9. Informal work and formal plans

    DEFF Research Database (Denmark)

    Dalsted, Rikke Juul; Hølge-Hazelton, Bibi; Kousgaard, Marius Brostrøm

    2012-01-01

    INTRODUCTION: Formal pathway models outline that patients should receive information in order to experience a coherent journey, but do not describe an active role for patients or their relatives. The aim of this paper is to articulate and discuss the active role of patients during their cancer trajectories. METHODS AND THEORY: An in-depth case study of patient trajectories at a Danish hospital and surrounding municipality using individual interviews with patients. Theory about trajectory and work by Strauss was included. RESULTS: Patients continuously took initiatives to organize their treatment.... The patients' requests were not sufficiently supported in the professional organisation of work or formal planning. Patients' insertion and use of information in their trajectories challenged professional views and working processes. And the design of the formal pathway models limits the patients' active...

  10. Necessity of Integral Formalism

    International Nuclear Information System (INIS)

    Tao Yong

    2011-01-01

    To describe physical reality, there are two ways of constructing the dynamical equation of a field: differential formalism and integral formalism. The importance of this fact was first emphasized by Yang in the case of gauge fields [Phys. Rev. Lett. 33 (1974) 445], where it gave rise to a deeper understanding of the Aharonov-Bohm phase and the magnetic monopole [Phys. Rev. D 12 (1975) 3845]. In this paper we point out that such a fact also holds for the general wave function of matter, where it may give rise to a deeper understanding of the Berry phase. Most importantly, we prove that, for the general wave function of matter, there is in the adiabatic limit an intrinsic difference between its integral formalism and its differential formalism. It is the neglect of this difference that leads to the inconsistency of the quantum adiabatic theorem pointed out by Marzlin and Sanders [Phys. Rev. Lett. 93 (2004) 160408]. It has been widely accepted that there is no physical difference between using a differential operator or an integral operator to construct the dynamical equation of a field. Nevertheless, our study shows that the Schrödinger differential equation (i.e., the differential formalism for the wave function) leads to a vanishing Berry phase, whereas the Schrödinger integral equation (i.e., the integral formalism for the wave function), in the adiabatic limit, satisfactorily yields the Berry phase. We therefore reach the conclusion that there are two ways of describing physical reality, differential formalism and integral formalism, but only the integral formalism provides a complete description. (general)
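
    For reference, the phase at issue is the standard adiabatic (Berry) phase, itself defined as a closed-loop integral over the parameter space, which is consistent with the paper's claim that an integral formalism is needed to capture it (standard textbook expression, not quoted from the paper):

      % Berry phase of eigenstate |n(R)> transported around a closed loop C
      \gamma_n \;=\; i \oint_{C} \big\langle n(\mathbf{R}) \,\big|\, \nabla_{\mathbf{R}}\, n(\mathbf{R}) \big\rangle \cdot d\mathbf{R}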

  11. Fear of the Formal

    DEFF Research Database (Denmark)

    du Gay, Paul; Lopdrup-Hjorth, Thomas

    2016-01-01

    Over recent decades, ‘formal’ organisations have come in for severe criticism. Not only is formal organisation represented as ill suited to the realities of the contemporary organisational world, but as a key source from which organisational dysfunctions themselves emerge. For that reason informal...... and spontaneous modes of organising have emerged, or better re-emerged, as preferable substitutes, because they, in contrast to the formal, allegedly allow for creativity, inventiveness, flexibility, speed, and freedom. Thus, the province of the formal is significantly devalued. In this paper, we explore what we...

  12. Treatment of uncertainties in the IPCC: a philosophical analysis

    Science.gov (United States)

    Jebeile, J.; Drouet, I.

    2014-12-01

    The IPCC produces scientific reports from findings on climate and climate change. Because the findings are uncertain in many respects, the production of reports requires aggregating assessments of uncertainties of different kinds. This difficult task is currently regulated by the Guidance note for lead authors of the IPCC fifth assessment report on consistent treatment of uncertainties. The note recommends that two metrics, confidence and likelihood, be used for communicating the degree of certainty in findings. Confidence is expressed qualitatively "based on the type, amount, quality, and consistency of evidence […] and the degree of agreement", while likelihood is expressed probabilistically "based on statistical analysis of observations or model results, or expert judgment". Therefore, depending on the evidence evaluated, authors can choose to present either an assigned level of confidence or a quantified measure of likelihood. But these two kinds of uncertainty assessment express distinct and potentially conflicting methodologies, so the question arises whether the treatment of uncertainties in the IPCC is rationally justified. In order to answer the question, it is worth comparing the IPCC procedures with the formal normative theories of epistemic rationality which have been developed by philosophers. These theories, which include contributions to the philosophy of probability and to Bayesian probabilistic confirmation theory, are relevant for our purpose because they are commonly used to assess the rationality of collective judgement formation based on uncertain knowledge. In this paper we make the comparison and pursue the following objectives: (i) we determine whether the IPCC confidence and likelihood can be compared with the notions of uncertainty targeted by or underlying the formal normative theories of epistemic rationality; (ii) we investigate whether the formal normative theories of epistemic rationality justify

  13. Formalizing Arrow's theorem

    Indian Academy of Sciences (India)

    Keywords. formalization of mathematics; Mizar; social choice theory; Arrow's theorem; Gibbard–Satterthwaite theorem; proof errors. ... Author Affiliations. Freek Wiedijk, Institute for Computing and Information Sciences, Radboud University Nijmegen, Toernooiveld 1, 6525 ED Nijmegen, The Netherlands ...

  14. Formalizing typical crosscutting concerns

    NARCIS (Netherlands)

    Marin, A.M.

    2006-01-01

    We present a consistent system for referring to crosscutting functionality, relating crosscutting concerns to specific implementation idioms, and formalizing their underlying relations through queries. The system is based on generic crosscutting concerns that we organize and describe in a catalog. We

  15. An uncertainty inventory demonstration - a primary step in uncertainty quantification

    Energy Technology Data Exchange (ETDEWEB)

    Langenbrunner, James R. [Los Alamos National Laboratory; Booker, Jane M [Los Alamos National Laboratory; Hemez, Francois M [Los Alamos National Laboratory; Salazar, Issac F [Los Alamos National Laboratory; Ross, Timothy J [UNM

    2009-01-01

    Tools, methods, and theories for assessing and quantifying uncertainties vary by application. Uncertainty quantification tasks have unique desiderata and circumstances. To realistically assess uncertainty requires the engineer/scientist to specify mathematical models, the physical phenomena of interest, and the theory or framework for assessments. For example, Probabilistic Risk Assessment (PRA) specifically identifies uncertainties using probability theory, and therefore PRAs lack formal procedures for quantifying uncertainties that are not probabilistic. The Phenomena Identification and Ranking Technique (PIRT) proceeds by ranking phenomena using scoring criteria that result in linguistic descriptors, such as importance ranked with words, 'High/Medium/Low.' The use of words allows PIRT to be flexible, but the analysis may then be difficult to combine with other uncertainty theories. We propose that a necessary step for the development of a procedure or protocol for uncertainty quantification (UQ) is the application of an Uncertainty Inventory. An Uncertainty Inventory should be considered and performed in the earliest stages of UQ.

  16. The simplest formal argument for fitness optimization

    Indian Academy of Sciences (India)

    2008-12-23

    The Formal Darwinism Project aims to provide a formal argument linking population genetics to fitness optimization, which of necessity includes defining fitness. This bridges the gulf between those biologists who assume that natural selection leads to something close to fitness optimization and those biologists who believe ...

  18. AgRISTARS: Foreign commodity production forecasting. Minutes of the annual formal project manager's review, including preliminary technical review reports of FY80 experiments. [wheat/barley and corn/soybean experiments

    Science.gov (United States)

    1980-01-01

    The U.S./Canada wheat/barley exploratory experiment is discussed with emphasis on labeling, machine processing using P1A, and the crop calendar. Classification and the simulated aggregation test used in the U.S. corn/soybean exploratory experiment are also considered. Topics covered regarding the foreign commodity production forecasting project include the acquisition, handling, and processing of both U.S. and foreign agricultural data, as well as meteorological data. The accuracy assessment methodology, multicrop sampling and aggregation technology development, frame development, the yield project interface, and classification for area estimation are also examined.

  19. Uncertainty theory

    CERN Document Server

    Liu, Baoding

    2015-01-01

    When no samples are available to estimate a probability distribution, we have to invite some domain experts to evaluate the belief degree that each event will happen. Perhaps some people think that the belief degree should be modeled by subjective probability or fuzzy set theory. However, it is usually inappropriate because both of them may lead to counterintuitive results in this case. In order to rationally deal with belief degrees, uncertainty theory was founded in 2007 and subsequently studied by many researchers. Nowadays, uncertainty theory has become a branch of axiomatic mathematics for modeling belief degrees. This is an introductory textbook on uncertainty theory, uncertain programming, uncertain statistics, uncertain risk analysis, uncertain reliability analysis, uncertain set, uncertain logic, uncertain inference, uncertain process, uncertain calculus, and uncertain differential equation. This textbook also shows applications of uncertainty theory to scheduling, logistics, networks, data mining, c...

  20. Formality in Brackets

    DEFF Research Database (Denmark)

    Garsten, Christina; Nyqvist, Anette

    ... stage and informal for others. Walking the talk, donning the appropriate attire, wearing the proper suit, may be part of what it takes to figure out the code of formal organizational settings – an entrance ticket to the backstage, as it were. Oftentimes, it involves a degree of mimicry, of ‘following suits’ (Nyqvist 2013), and of doing ‘ethnography by failure’ (Garsten 2013). In this paper, we explore the layers of informality and formality in our fieldwork experiences among financial investors and policy experts, and discuss how to ethnographically represent embodied fieldwork practices. How do we conceptualize and articulate the informal and the formal? How do we represent the multidimensional character of organizations while maintaining a degree of integrity of informants? And how do we decide on relevance as we transpose our fieldwork experiences into text? We suggest that ethnographic organization...

  1. Ontology or formal ontology

    Science.gov (United States)

    Žáček, Martin

    2017-07-01

    Ontology or formal ontology? Which term is correct? The aim of this article is to introduce the correct terms and explain their basis. An ontology describes a particular area of interest (a domain) in a formal way: it defines the classes of objects in that area and the relationships that may exist between them. The significance of ontology lies mainly in facilitating communication between people, improving the collaboration of software systems, and improving systems engineering. In all these areas, ontology offers the possibility of a unified view, maintaining consistency and unambiguity.

  2. Informal work and formal plans

    DEFF Research Database (Denmark)

    Dalsted, Rikke Juul; Hølge-Hazelton, Bibi; Kousgaard, Marius Brostrøm

    2012-01-01

    trajectories. METHODS AND THEORY: An in-depth case study of patient trajectories at a Danish hospital and surrounding municipality using individual interviews with patients. Theory about trajectory and work by Strauss was included. RESULTS: Patients continuously took initiatives to organize their treatment...... and care. They initiated processes in the trajectories, and acquired information, which they used to form their trajectories. Patients presented problems to the healthcare professionals in order to get proper help when needed. DISCUSSION: Work done by patients was invisible and not perceived as work....... The patients' requests were not sufficiently supported in the professional organisation of work or formal planning. Patients' insertion and use of information in their trajectories challenged professional views and working processes. And the design of the formal pathway models limits the patients' active...

  3. Criteria for logical formalization

    Czech Academy of Sciences Publication Activity Database

    Peregrin, Jaroslav; Svoboda, Vladimír

    2013-01-01

    Roč. 190, č. 14 (2013), s. 2897-2924 ISSN 0039-7857 R&D Projects: GA ČR(CZ) GAP401/10/1279 Institutional support: RVO:67985955 Keywords : logic * logical form * formalization * reflective equilibrium Subject RIV: AA - Philosophy ; Religion Impact factor: 0.637, year: 2013

  4. Formal Verification

    Indian Academy of Sciences (India)

    This is a short tutorial on formal methods, which are techniques for specifying and verifying complex software and hardware systems. A few examples of successful industrial use of these are also presented. Computers are ubiquitous these days and are used to control various safety-critical systems like aircraft and satellites.

  5. Formalization of Medical Guidelines

    Czech Academy of Sciences Publication Activity Database

    Peleška, Jan; Anger, Z.; Buchtela, David; Šebesta, K.; Tomečková, Marie; Veselý, Arnošt; Zvára, K.; Zvárová, Jana

    2005-01-01

    Roč. 1, - (2005), s. 133-141 ISSN 1801-5603 R&D Projects: GA AV ČR 1ET200300413 Institutional research plan: CEZ:AV0Z10300504 Keywords : GLIF model * formalization of guidelines * prevention of cardiovascular diseases Subject RIV: IN - Informatics, Computer Science

  6. The Benefits of Formalization

    DEFF Research Database (Denmark)

    Rand, John; Torm, Nina Elisabeth

    2012-01-01

    Based on unique panel data consisting of both formal and informal firms, this paper uses a matched double difference approach to examine the relationship between legal status and firm level outcomes in micro, small and medium manufacturing enterprises (SMEs) in Vietnam. Controlling for determining...

  7. Formalizing Arrow's theorem

    Indian Academy of Sciences (India)

    a formalization is done using a computer program called a proof checker or proof assistant. Such a proof assistant .... means 'Prototype Verification System', but the system is far more than a prototype. It actually is one of the ... related to the Lisp programming language (ACL2 means 'A Computational Logic for Applicative ...

  8. Formalizing physical security procedures

    NARCIS (Netherlands)

    Meadows, C.; Pavlovic, Dusko

    Although the problems of physical security emerged more than 10,000 years before the problems of computer security, no formal methods have been developed for them, and the solutions have been evolving slowly, mostly through social procedures. But as the traffic on physical and social networks is now

  9. Calibration uncertainty

    DEFF Research Database (Denmark)

    Heydorn, Kaj; Anglov, Thomas

    2002-01-01

    Methods recommended by the International Organization for Standardization (ISO) and Eurachem are not satisfactory for the correct estimation of calibration uncertainty. A novel approach is introduced and tested on actual calibration data for the determination of Pb by ICP-AES. The improved calibration...

  10. Formalized Informal Learning

    DEFF Research Database (Denmark)

    Levinsen, Karin Tweddell; Sørensen, Birgitte Holm

    2013-01-01

    Longitudinal research projects into social practices are both subject to and capture changes in society, meaning that research is conducted in a fluid context and that new research questions appear during the project’s life cycle. In the present study, emerging new performances and uses of ICT are examined, and the relation between network society competences, learners’ informal learning strategies and ICT in formalized school settings over time is studied. The authors find that aspects of ICT like multimodality, intuitive interaction design and instant feedback invite an informal bricoleur approach. When integrated into certain designs for teaching and learning, this allows for Formalized Informal Learning, and support is found for network society competences building.

  11. Towards Formal Implementation of PUS Standard

    Science.gov (United States)

    Ilić, D.

    2009-05-01

    As an effort to promote the reuse of on-board and ground systems, ESA developed a standard for packet telemetry and telecommand: PUS. It defines a set of standard service models with the corresponding structures of the associated telemetry and telecommand packets. Various missions can then choose to implement those standard PUS services that best conform to their specific requirements. In this paper we propose a formal development (based on the Event-B method) of reusable service patterns, which can be instantiated for concrete applications. Our formal models allow us to formally express and verify specific service properties, including the validation of various telecommand and telemetry packet structures.
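
    The packet structure validation mentioned above can be illustrated outside Event-B as well. The sketch below parses the six-byte CCSDS primary header that PUS telemetry and telecommand packets share, and checks two invariants; the field layout follows the public CCSDS Space Packet format, but the specific checks are illustrative assumptions, not the paper's Event-B model.

      # Illustrative parser for the 6-byte CCSDS primary header shared by
      # PUS packets. The validation rules are examples, not the Event-B model.
      import struct

      def parse_primary_header(raw: bytes) -> dict:
          if len(raw) < 6:
              raise ValueError("primary header is 6 bytes")
          word0, seq, length = struct.unpack(">HHH", raw[:6])
          header = {
              "version":      (word0 >> 13) & 0x7,
              "type":         (word0 >> 12) & 0x1,  # 0 = telemetry, 1 = telecommand
              "sec_hdr_flag": (word0 >> 11) & 0x1,
              "apid":         word0 & 0x7FF,
              "seq_flags":    (seq >> 14) & 0x3,
              "seq_count":    seq & 0x3FFF,
              "data_length":  length,               # data field length minus one
          }
          if header["version"] != 0:
              raise ValueError("unsupported packet version")
          if header["sec_hdr_flag"] != 1:
              raise ValueError("PUS packets carry a secondary header")
          return header

    In the paper's setting, properties of this kind would instead be expressed as invariants of an Event-B machine and discharged by proof rather than by runtime checks.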

  12. Evaluating prediction uncertainty

    International Nuclear Information System (INIS)

    McKay, M.D.

    1995-03-01

    The probability distribution of a model prediction is presented as a proper basis for evaluating the uncertainty in a model prediction that arises from uncertainty in input values. Determination of important model inputs and subsets of inputs is made through comparison of the prediction distribution with conditional prediction probability distributions. Replicated Latin hypercube sampling and variance ratios are used in estimation of the distributions and in construction of importance indicators. The assumption of a linear relation between model output and inputs is not necessary for the indicators to be effective. A sequential methodology which includes an independent validation step is applied in two analysis applications to select subsets of input variables which are the dominant causes of uncertainty in the model predictions. Comparison with results from methods which assume linearity shows how those methods may fail. Finally, suggestions for treating structural uncertainty for submodels are presented
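
    The variance-ratio indicator described in the abstract compares the variance of the conditional expectation of the output, given one input, with the total output variance. The sketch below estimates it by binning ordinary Monte Carlo samples; binning stands in for the report's replicated Latin hypercube construction purely for brevity, and the model is a toy.

      # Variance-ratio importance indicator: Var_x(E[y|x]) / Var(y).
      # Binned plain Monte Carlo replaces replicated Latin hypercube sampling.
      import numpy as np

      rng = np.random.default_rng(42)
      n, bins = 200_000, 50

      x1 = rng.uniform(size=n)
      x2 = rng.uniform(size=n)
      y = np.sin(2 * np.pi * x1) + 0.3 * x2  # toy model

      def variance_ratio(x, y, bins):
          # Equal-probability bins in x; ratios near 1 flag dominant inputs.
          edges = np.quantile(x, np.linspace(0.0, 1.0, bins + 1))
          idx = np.clip(np.searchsorted(edges, x, side="right") - 1, 0, bins - 1)
          cond_means = np.array([y[idx == b].mean() for b in range(bins)])
          return cond_means.var() / y.var()

      print("x1:", round(variance_ratio(x1, y, bins), 3))  # dominant input
      print("x2:", round(variance_ratio(x2, y, bins), 3))  # minor input

    As the abstract notes, no linearity assumption is involved: the indicator responds to the sinusoidal dependence on x1 that a correlation-based measure would miss.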

  13. Demand Uncertainty

    DEFF Research Database (Denmark)

    Nguyen, Daniel Xuyen

    This paper presents a model of trade that explains why firms wait to export and why many exporters fail. Firms face uncertain demands that are only realized after the firm enters the destination. The model retools the timing of uncertainty resolution found in productivity heterogeneity models. This retooling addresses several shortcomings. First, the imperfect correlation of demands reconciles the sales variation observed in and across destinations. Second, since demands for the firm's output are correlated across destinations, a firm can use previously realized demands to forecast unknown demands in untested destinations. The option to forecast demands causes firms to delay exporting in order to gather more information about foreign demand. Third, since uncertainty is resolved after entry, many firms enter a destination and then exit after learning that they cannot profit. This prediction reconciles...

  14. 49 CFR 1111.1 - Content of formal complaints; joinder.

    Science.gov (United States)

    2010-10-01

    § 1111.1 Content of formal complaints; joinder. (a) General. A formal complaint must contain the correct... language the facts upon which it is based. It should include specific reference to pertinent statutory...

  15. Formal System Verification for Trustworthy Embedded Systems

    Science.gov (United States)

    2011-04-19

    step is for the first time formal and machine-checked. Contemporary OS verification projects include Verisoft, Verisoft XT, and Verve. The Verisoft ... tens of thousands of lines of code. The Verve kernel [22] shows that type and memory safety properties can be established on the assembly level via type ... systems and therefore with much lower cost. Verve contains a formally verified runtime system, in particular a garbage collector that the type system

  16. Reliability analysis under epistemic uncertainty

    International Nuclear Information System (INIS)

    Nannapaneni, Saideep; Mahadevan, Sankaran

    2016-01-01

    This paper proposes a probabilistic framework to include both aleatory and epistemic uncertainty within model-based reliability estimation of engineering systems for individual limit states. Epistemic uncertainty is considered due to both data and model sources. Sparse point and/or interval data regarding the input random variables lead to uncertainty regarding their distribution types, distribution parameters, and correlations; this statistical uncertainty is included in the reliability analysis through a combination of likelihood-based representation, Bayesian hypothesis testing, and Bayesian model averaging techniques. Model errors, which include numerical solution errors and model form errors, are quantified through Gaussian process models and included in the reliability analysis. The probability integral transform is used to develop an auxiliary variable approach that facilitates a single-level representation of both aleatory and epistemic uncertainty. This strategy results in an efficient single-loop implementation of Monte Carlo simulation (MCS) and FORM/SORM techniques for reliability estimation under both aleatory and epistemic uncertainty. Two engineering examples are used to demonstrate the proposed methodology. Highlights:
    • Epistemic uncertainty due to data and model included in reliability analysis.
    • A novel FORM-based approach proposed to include aleatory and epistemic uncertainty.
    • A single-loop Monte Carlo approach proposed to include both types of uncertainties.
    • Two engineering examples used for illustration.
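
    The single-loop idea can be sketched in a few lines: each Monte Carlo sample first draws the uncertain distribution parameter (epistemic) and then the physical variable from it (aleatory), so both uncertainty types are propagated in one pass. The limit state g = R - S and all numbers below are assumed for illustration; the paper's Bayesian model-averaging and Gaussian-process components are omitted.

      # Single-loop Monte Carlo over aleatory + epistemic uncertainty.
      # Limit state g = R - S; failure when g < 0. Numbers are hypothetical.
      import numpy as np

      rng = np.random.default_rng(7)
      n = 1_000_000

      mu_R = rng.normal(50.0, 2.0, size=n)            # epistemic: uncertain mean of R
      R = rng.normal(mu_R, 5.0)                       # aleatory: capacity
      S = rng.lognormal(mean=3.4, sigma=0.2, size=n)  # aleatory: demand

      pf = np.mean(R - S < 0.0)
      print(f"P(failure) ~ {pf:.3e}")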

  17. Formal Methods in Industry

    Directory of Open Access Journals (Sweden)

    Alexei Serna A.

    2012-12-01

    The application of formal methods in industry has progressed extensively over the past decade, and the results are promising. But despite these achievements, which have been documented in numerous studies, skepticism about their usefulness and applicability is still very common. The goal of this paper is to show that their evolution over the past decade exceeds all previous processes and that they do an ever better job of satisfying industrial needs. This is achieved by describing some experiments and the results of various applications in industry, and by analyzing the needs of companies that the research community in this field must satisfy.

  18. Formalizing the concept of sound.

    Energy Technology Data Exchange (ETDEWEB)

    Kaper, H. G.; Tipei, S.

    1999-08-03

    The notion of formalized music implies that a musical composition can be described in mathematical terms. In this article we explore some formal aspects of music and propose a framework for an abstract approach.

  19. Formalization of Database Systems -- and a Formal Definition of {IMS}

    DEFF Research Database (Denmark)

    Bjørner, Dines; Løvengreen, Hans Henrik

    1982-01-01

    Drawing upon an analogy between Programming Language Systems and Database Systems, we outline the requirements that architectural specifications of database systems must fulfil, and argue that only formal, mathematical definitions may satisfy these. Then we illustrate some aspects and touch upon some uses of formal definitions of data models and database management systems. A formal model of IMS will carry this discussion. Finally, we survey some of the existing literature on formal definitions of database systems. The emphasis will be on constructive definitions in the denotational semantics style of the VDM: Vienna Development Method. The role of formal definitions in international standardisation efforts is briefly mentioned.

  1. Graphical And Textual Notations In Formal Specification

    Energy Technology Data Exchange (ETDEWEB)

    Bove, Rocco; Dipoppa, Giovanni; Groven, Arne-Kristian; Sivertsen, Terje

    1996-07-01

    The present report describes the current status of the co-operative project between ENEA and the OECD Halden Reactor Project on graphical and formal methods for software specification. The aim of this project is to contribute to a clarification of the relationship between graphical descriptions and formal specifications, and to provide guidelines for how they can be combined in order to utilize the strengths of each approach. The overall aim of such a combination is to improve the formal basis of graphical descriptions and make formal specifications more generally comprehensible. The research reported includes the application of the IPTES technology to the APRM case example, an approach to the translation of Petri nets into algebraic specification, and the specification of real-time distributed systems using time-extended LOTOS. (author)

  2. Formalisms for reuse and systems integration

    CERN Document Server

    Rubin, Stuart

    2015-01-01

    Reuse and integration are defined as synergistic concepts, where reuse addresses how to minimize redundancy in the creation of components, while integration focuses on component composition. Integration supports reuse and vice versa. These related concepts support the design of software and systems for maximizing performance while minimizing cost. Knowledge, like data, is subject to reuse, and each can be interpreted as the other. This means that inherent complexity, a measure of the potential utility of a system, is directly proportional to the extent to which it maximizes reuse and integration. Formal methods can provide an appropriate context for the rigorous handling of these synergistic concepts. Furthermore, formal languages allow for unambiguous model specification, and formal verification techniques provide support for ensuring the validity of reuse and integration mechanisms. This edited book includes 12 high-quality research papers written by experts in formal aspects of reuse and integration...

  3. Formal Methods in Knowledge Engineering

    NARCIS (Netherlands)

    Harmelen, van F.A.H.; Fensel, D.

    1995-01-01

    This paper presents a general discussion of the role of formal methods in Knowledge Engineering. We give an historical account of the development of the field of Knowledge Engineering towards the use of formal methods. Subsequently, we discuss the pro's and cons of formal methods. We do this by

  4. Uncertainty analysis

    International Nuclear Information System (INIS)

    Thomas, R.E.

    1982-03-01

    An evaluation is made of the suitability of analytical and statistical sampling methods for making uncertainty analyses. The adjoint method is found to be well-suited for obtaining sensitivity coefficients for computer programs involving large numbers of equations and input parameters. For this purpose the Latin Hypercube Sampling method is found to be inferior to conventional experimental designs. The Latin hypercube method can be used to estimate output probability density functions, but requires supplementary rank transformations followed by stepwise regression to obtain uncertainty information on individual input parameters. A simple Cork and Bottle problem is used to illustrate the efficiency of the adjoint method relative to certain statistical sampling methods. For linear models of the form Ax=b it is shown that a complete adjoint sensitivity analysis can be made without formulating and solving the adjoint problem. This can be done either by using a special type of statistical sampling or by reformulating the primal problem and using suitable linear programming software

  6. Formal Epistemology and New Paradigm Psychology of Reasoning

    NARCIS (Netherlands)

    Pfeifer, Niki; Douven, Igor

    This position paper advocates combining formal epistemology and the new paradigm psychology of reasoning in the studies of conditionals and reasoning with uncertainty. The new paradigm psychology of reasoning is characterized by the use of probability theory as a rationality framework instead of

  7. Uncertainty in an Interconnected Financial System, Contagion

    OpenAIRE

    Mei Li; Frank Milne; Junfeng Qiu

    2013-01-01

    This paper studies contagion and market freezes caused by uncertainty in financial network structures and provides theoretical guidance for central banks. We establish a formal model to demonstrate that, in a financial system where financial institutions are interconnected, a negative shock to an individual financial institution could spread to other institutions, causing market freezes because of creditors' uncertainty about the financial network structure. Central bank policies to alleviate...

  8. Advancing Uncertainty: Untangling and Discerning Related Concepts

    OpenAIRE

    Janice Penrod

    2002-01-01

    Methods of advancing concepts within the qualitative paradigm have been developed and articulated. In this section, I describe methodological perspectives of a project designed to advance the concept of uncertainty using multiple qualitative methods. Through a series of earlier studies, the concept of uncertainty arose repeatedly in varied contexts, working its way into prominence, and warranting further investigation. Processes of advanced concept analysis were used to initiate the formal in...

  9. IAEA CRP on HTGR Uncertainty Analysis: Benchmark Definition and Test Cases

    Energy Technology Data Exchange (ETDEWEB)

    Gerhard Strydom; Frederik Reitsma; Hans Gougar; Bismark Tyobeka; Kostadin Ivanov

    2012-11-01

    Uncertainty and sensitivity studies are essential elements of the reactor simulation code verification and validation process. Although several international uncertainty quantification activities have been launched in recent years in the LWR, BWR and VVER domains (e.g. the OECD/NEA BEMUSE program [1], from which the current OECD/NEA LWR Uncertainty Analysis in Modelling (UAM) benchmark [2] effort was derived), the systematic propagation of uncertainties in cross-section, manufacturing and model parameters for High Temperature Gas-cooled Reactor (HTGR) designs has not been attempted yet. This paper summarises the scope, objectives and exercise definitions of the IAEA Coordinated Research Project (CRP) on HTGR UAM [3]. Note that no results are included here, as the HTGR UAM benchmark was only launched formally in April 2012, and the specification is currently still under development.

  10. In/formalization

    Directory of Open Access Journals (Sweden)

    Alan Smart

    2017-12-01

    Addressing a variety of locations and subjects across several social contexts and countries, this forum intends to stimulate novel ways of conceptualizing the inevitable interpenetration and entanglement of formalization and informalization as two interlinked social processes. Rather than proposing a new coherent definition of “informality”, we propose to consider “in/formalization” as a space of practice and reflection which is crucial for engaging with contemporary economy, law and politics and their current local and global articulations and scenarios. The forum features contributions by Stamatis Amarianakis, Lenka Brunclíková, Dolores Koenig, B. Lynne Milgram, Sarah Muir, Antonio Maria Pusceddu, Alan Smart, Mechthild von Vacano, Filippo M. Zerilli & Julie Trappe.

  11. Formal aspects of resilience

    Directory of Open Access Journals (Sweden)

    Diana-Maria Drigă

    2015-12-01

    In recent years, the concept of resilience has been a leading concern in Romania, within the European Union, and worldwide. Specialists in economics, management, finance, law, political science, sociology, and psychology take a particular interest in this concept. Multidisciplinary research on resilience has over time produced multiple conceptualizations and theories, but without consensus among specialists as to its content, specificity, and scope. This paper aims to clarify the concept of resilience by exploring its evolution in ecological, social, and economic settings. At the same time, the paper presents aspects of feedback mechanisms and proposes a formalization of resilience using logic and mathematical analysis.

  12. Formalized informal learning

    DEFF Research Database (Denmark)

    Levinsen, Karin; Sørensen, Birgitte Holm

    2011-01-01

    This paper presents findings from a large-scale longitudinal, qualitative study - Project ICT and Learning (PIL) - that engaged the participation of eight primary schools in Denmark, and was conducted between 2006 and 2008. The research design was based on action research, involving teachers...... and other relevant stakeholders, as well as participant observations in the classroom documented by thick descriptions, formal and informal interviews and focus group interviews. The aim of the study was to explore and identify relations between designs for teaching and learning and the students' learning...... of school subjects within defined learning goals and curricula, along with various implementations of ICT in the pedagogical everyday practice (Levinsen & Sørensen 2008). However, another research strand - the topic of this paper - emerged during the project's life cycle as a consequence of ongoing changes...

  13. Development and application of an uncertainty methodology for the Sizewell B large LOCA safety case

    International Nuclear Information System (INIS)

    Lightfoot, P.; Trow, M.

    1994-01-01

    This paper presents an uncertainty methodology which has been successfully applied to the licensing of Sizewell B for large break LOCA. The emphasis of this approach has been on gaining a detailed understanding of the physical process and of the sensitivity to individual phenomena. The major contributors to uncertainty have been identified and have subsequently been included in a combined uncertainty analysis. The combined uncertainty analysis demonstrated that uncertainties did not combine in a highly non-linear manner; phenomena such as the random reflood effect and clad ballooning have been treated as bounding biases in the assessment of the overall bounding peak clad temperature. The plant initial and boundary conditions have been conservatively defined for the uncertainty analysis. A better-estimate calculation, which uses more realistic assumptions, shows a large benefit in the predicted peak clad temperature, thereby demonstrating the conservatism of the uncertainty analysis. The UK licensing regime is not prescriptive in terms of the approach to large LOCA analysis, and no attempt has been made to attach a formal probability or confidence level to the claim that the final bounding peak clad temperature is conservative. The Sizewell B uncertainty analysis was completed within the timescale and resource limitations. It has been shown to be practical in its application, and reductions in the required analysis scope have been identified for any future plants of similar design

  14. Formality of the Chinese collective leadership.

    Science.gov (United States)

    Li, Haiying; Graesser, Arthur C

    2016-09-01

    We investigated the linguistic patterns in the discourse of four generations of the collective leadership of the Communist Party of China (CPC) from 1921 to 2012. The texts of Mao Zedong, Deng Xiaoping, Jiang Zemin, and Hu Jintao were analyzed using computational linguistic techniques (a Chinese formality score) to explore the persuasive linguistic features of the leaders in the contexts of power phase, the nation's education level, power duration, and age. The study was guided by the elaboration likelihood model of persuasion, which includes a central route (represented by formal discourse) versus a peripheral route (represented by informal discourse) to persuasion. The results revealed that these leaders adopted the formal, central route more when they were in power than before they came into power. The nation's education level was a significant factor in the leaders' adoption of the persuasion strategy. The leaders' formality also decreased with increasing age and time in power. However, the predictability of these factors for formality differed subtly among the different types of leaders. These results enhance our understanding of the Chinese collective leadership and the role of formality in politically persuasive messages.
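
    A common operationalization of such a formality score is Heylighen and Dewaele's F-measure, computed from part-of-speech frequencies; whether the study used exactly this variant for Chinese is not stated in the abstract, so the sketch below is purely illustrative.

      # Heylighen & Dewaele F-score from part-of-speech percentages.
      # Treating this as the study's exact Chinese formality score is an assumption.
      def formality_score(pos_pct: dict) -> float:
          formal = ("noun", "adjective", "preposition", "article")
          deictic = ("pronoun", "verb", "adverb", "interjection")
          f = sum(pos_pct.get(p, 0.0) for p in formal)
          d = sum(pos_pct.get(p, 0.0) for p in deictic)
          return (f - d + 100.0) / 2.0

      # Hypothetical percentages for one speech excerpt:
      print(formality_score({"noun": 28, "adjective": 9, "preposition": 12,
                             "article": 3, "pronoun": 10, "verb": 20,
                             "adverb": 6, "interjection": 1}))  # -> 57.5

    Higher scores mark context-independent, formal discourse (the central route above); lower scores mark deictic, informal discourse.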

  15. Application of Formal Methods in Software Engineering

    Directory of Open Access Journals (Sweden)

    Adriana Morales

    2011-12-01

    The purpose of this research work is to examine: (1) why formal methods are necessary for software systems today; (2) high-integrity systems built through the C-by-C (Correctness-by-Construction) methodology; and (3) an affordable methodology for applying formal methods in software engineering. The research process included reviews of the literature on the Internet, in publications, and in presentations at events. Among the research results, it was found that: (1) the dependence of nations, companies, and people on software systems is increasing; (2) there is growing demand on software engineering to increase social trust in software systems; (3) methodologies exist, such as C-by-C, that can provide that level of trust; (4) formal methods constitute a principle of computer science that can be applied in software engineering to achieve reliable software development processes; (5) software users have the responsibility to demand reliable software products; and (6) software engineers have the responsibility to develop reliable software products. Furthermore, it is concluded that: (1) more research is needed to identify and analyze other methodologies and tools that provide processes for applying formal software engineering methods; (2) formal methods provide an unprecedented ability to increase trust in the correctness of software products; and (3) through the development of new methodologies and tools, costs are ceasing to be a disadvantage for the application of formal methods.

  16. Chemical model reduction under uncertainty

    KAUST Repository

    Najm, Habib

    2016-01-05

    We outline a strategy for chemical kinetic model reduction under uncertainty. We present highlights of our existing deterministic model reduction strategy, and describe the extension of the formulation to include parametric uncertainty in the detailed mechanism. We discuss the utility of this construction, as applied to hydrocarbon fuel-air kinetics, and the associated use of uncertainty-aware measures of error between predictions from detailed and simplified models.

  17. Sensitivity and uncertainty analysis

    CERN Document Server

    Cacuci, Dan G; Navon, Ionel Michael

    2005-01-01

    As computer-assisted modeling and analysis of physical processes have continued to grow and diversify, sensitivity and uncertainty analyses have become indispensable scientific tools. Sensitivity and Uncertainty Analysis. Volume I: Theory focused on the mathematical underpinnings of two important methods for such analyses: the Adjoint Sensitivity Analysis Procedure and the Global Adjoint Sensitivity Analysis Procedure. This volume concentrates on the practical aspects of performing these analyses for large-scale systems. The applications addressed include two-phase flow problems, a radiative c

  18. Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for internal dosimetry. Volume 2: Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M. [Delft Univ. of Technology (Netherlands); Harrison, J.D. [National Radiological Protection Board (United Kingdom); Harper, F.T. [Sandia National Labs., Albuquerque, NM (United States); Hora, S.C. [Univ. of Hawaii, Hilo, HI (United States)

    1998-04-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA internal dosimetry models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on internal dosimetry, (4) short biographies of the experts, and (5) the aggregated results of their responses.
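
    One standard way to pool elicited distributions of this kind is an equal-weight mixture of the experts' distributions. The sketch below fits a lognormal to each expert's (hypothetical) 5%/50%/95% quantiles and pools the samples; the study's actual elicitation variables and aggregation weights are not reproduced here.

      # Equal-weight pooling of expert uncertainty distributions (illustrative).
      import numpy as np

      expert_quantiles = {          # hypothetical elicited 5%/50%/95% values
          "expert_A": (0.2, 1.0, 4.0),
          "expert_B": (0.5, 2.0, 6.0),
          "expert_C": (0.1, 0.8, 3.0),
      }

      rng = np.random.default_rng(3)
      samples = []
      for q05, q50, q95 in expert_quantiles.values():
          # Approximate lognormal fit to the median and the 90% interval.
          mu = np.log(q50)
          sigma = (np.log(q95) - np.log(q05)) / (2 * 1.645)
          samples.append(rng.lognormal(mu, sigma, size=20_000))

      pooled = np.concatenate(samples)   # equal-weight mixture
      print("aggregated 5/50/95%:", np.percentile(pooled, [5, 50, 95]).round(2))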

  19. Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for deposited material and external doses. Volume 2: Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M. [Delft Univ. of Technology (Netherlands); Boardman, J. [AEA Technology (United Kingdom); Jones, J.A. [National Radiological Protection Board (United Kingdom); Harper, F.T.; Young, M.L. [Sandia National Labs., Albuquerque, NM (United States); Hora, S.C. [Univ. of Hawaii, Hilo, HI (United States)

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA deposited material and external dose models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on deposited material and external doses, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  20. Probabilistic accident consequence uncertainty analysis -- Early health effects uncertainty assessment. Volume 2: Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Haskin, F.E. [Univ. of New Mexico, Albuquerque, NM (United States); Harper, F.T. [Sandia National Labs., Albuquerque, NM (United States); Goossens, L.H.J.; Kraan, B.C.P. [Delft Univ. of Technology (Netherlands)

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA early health effects models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on early health effects, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  1. Spinor formalism and complex-vector formalism of general relativity

    International Nuclear Information System (INIS)

    Han-ying, G.; Yong-shi, W.; Gendao, L.

    1974-01-01

    In this paper, using E. Cartan's exterior calculus, we give the spinor form of the structure equations, which leads naturally to the Newman–Penrose equations. Furthermore, starting from the spinor spaces and the sl(2C) algebra, we construct the general complex-vector formalism of general relativity. We find that both the Cahen–Debever–Defrise complex-vector formalism and that of Brans are its special cases. Thus, the spinor formalism and the complex-vector formalism of general relativity are unified on the basis of the unimodular group SL(2C) and its Lie algebra

  2. An Integrated Approach for Characterization of Uncertainty in Complex Best Estimate Safety Assessment

    International Nuclear Information System (INIS)

    Pourgol-Mohamad, Mohammad; Modarres, Mohammad; Mosleh, Ali

    2013-01-01

    This paper discusses an approach called Integrated Methodology for Thermal-Hydraulics Uncertainty Analysis (IMTHUA) to characterize and integrate a wide range of uncertainties associated with the best estimate models and complex system codes used for nuclear power plant safety analyses. Examples of applications include complex thermal hydraulic and fire analysis codes. In identifying and assessing uncertainties, the proposed methodology treats the complex code as a 'white box', thus explicitly treating internal sub-model uncertainties in addition to the uncertainties related to the inputs to the code. The methodology accounts for uncertainties related to experimental data used to develop such sub-models, and efficiently propagates all uncertainties during best estimate calculations. Uncertainties are formally analyzed and probabilistically treated using a Bayesian inference framework. This comprehensive approach presents the results in a form usable in most other safety analyses, such as probabilistic safety assessment. The code output results are further updated through additional Bayesian inference using any available experimental data, for example from thermal hydraulic integral test facilities. The approach includes provisions to account for uncertainties associated with user-specified options, for example for choices among alternative sub-models or among several different correlations. Complex time-dependent best-estimate calculations are computationally intense, and the paper presents approaches to minimize computational intensity during the uncertainty propagation. Finally, the paper reports the effectiveness and practicality of the methodology through two applications: a complex thermal-hydraulics system code and a complex fire simulation code. In the case of multiple alternative models, several techniques, including dynamic model switching, user-controlled model selection, and model mixing, are discussed. (authors)
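
    The final Bayesian-updating step mentioned above, adjusting a code prediction with integral test data, can be illustrated with the simplest conjugate case. The normal-normal update and all numbers below are assumptions for illustration, not IMTHUA itself.

      # Conjugate normal-normal update of a code prediction (illustrative).
      prior_mean, prior_sd = 900.0, 40.0   # K: code output, propagated uncertainty
      obs, obs_sd = 870.0, 15.0            # K: integral-test measurement and error

      w = prior_sd**2 / (prior_sd**2 + obs_sd**2)
      post_mean = prior_mean + w * (obs - prior_mean)
      post_sd = (prior_sd**-2 + obs_sd**-2) ** -0.5

      print(f"posterior: {post_mean:.1f} K +/- {post_sd:.1f} K")

    The posterior pulls the prediction toward the measurement in proportion to their relative precisions, which is the behavior the methodology exploits when integral test data are available.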

  3. Network planning under uncertainties

    Science.gov (United States)

    Ho, Kwok Shing; Cheung, Kwok Wai

    2008-11-01

    One of the main focuses of network planning is the optimization of the network resources required to build a network under a certain traffic demand projection. Traditionally, the inputs to this type of network planning problem are treated as deterministic. In reality, varying traffic requirements and fluctuations in network resources can cause uncertainties in the decision models. Failure to include these uncertainties in the network design process can severely affect the feasibility and economics of the network. Therefore, it is essential to find a solution that is insensitive to the uncertain conditions during the network planning process. As early as the 1960s, a network planning problem with varying traffic requirements over time had been studied. Up to now, this kind of network planning problem is still being actively researched, especially for VPN network design. Another kind of network planning problem under uncertainties, studied actively in the past decade, addresses the fluctuations in network resources. One such hotly pursued research topic is survivable network planning. It considers the design of a network, under uncertainties brought by fluctuations in topology, to meet the requirement that the network remains intact up to a certain number of faults occurring anywhere in the network. Recently, the authors proposed a new planning methodology called Generalized Survivable Network that tackles the network design problem under both varying traffic requirements and fluctuations of topology. Although all the above network planning problems handle various kinds of uncertainties, it is hard to find a generic framework, under more general uncertainty conditions, that allows a more systematic way to solve the problems. With a unified framework, the seemingly diverse models and algorithms can be intimately related, and possibly more insights and improvements can be brought out for solving the problem. This motivates us to seek a

  4. Multifractal formalisms of human behavior.

    Science.gov (United States)

    Ihlen, Espen A F; Vereijken, Beatrix

    2013-08-01

    With the mounting realization that variability is an inevitable part of human behavior comes the need to integrate this phenomenon in concomitant models and theories of motor control. Among other things, this has resulted in a debate throughout the last decades about the origin of variability in behavior, the outcome of which has important implications for motor control theories. To date, a monofractal formalism of variability has been used as the basis for arguing for component- versus interaction-oriented theories of motor control. However, monofractal formalism alone cannot decide between the opposing sides of the debate. The present theoretical overview introduces multifractal formalisms as a necessary extension of the conventional monofractal formalism. In multifractal formalisms, the scale invariance of behavior is numerically defined as a spectrum of scaling exponents, rather than a single average exponent as in the monofractal formalism. Several methods to estimate the multifractal spectrum of scaling exponents - all within two multifractal formalisms called large deviation and Legendre formalism - are introduced and briefly discussed. Furthermore, the multifractal analyses within these two formalisms are applied to several performance tasks to illustrate how explanations of motor control vary with the methods used. The main section of the theoretical overview discusses the implications of multifractal extensions of the component- and interaction-oriented models for existing theories of motor control. Copyright © 2013 Elsevier B.V. All rights reserved.
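
    To make the contrast concrete, the sketch below estimates a spectrum of scaling exponents from qth-order structure functions; the moment orders, scales and toy series are our own choices, and the paper's large deviation and Legendre formalisms are more elaborate.

        # Illustrative sketch: a spectrum of scaling exponents zeta(q), versus the
        # single exponent of a monofractal analysis. Details are assumptions.
        import numpy as np

        def scaling_exponents(x, qs, scales):
            zetas = []
            for q in qs:
                s_q = [np.mean(np.abs(x[s:] - x[:-s]) ** q) for s in scales]
                slope, _ = np.polyfit(np.log(scales), np.log(s_q), 1)
                zetas.append(slope)
            return np.array(zetas)

        rng = np.random.default_rng(0)
        signal = np.cumsum(rng.standard_normal(4096))   # toy monofractal series
        qs = np.array([1, 2, 3, 4])
        zeta = scaling_exponents(signal, qs, scales=np.array([2, 4, 8, 16, 32, 64]))
        print(zeta / qs)   # roughly constant h(q) ~ 0.5 indicates monofractality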

  5. Simplified propagation of standard uncertainties

    International Nuclear Information System (INIS)

    Shull, A.H.

    1997-01-01

    An essential part of any measurement control program is adequate knowledge of the uncertainties of the measurement system standards. Only with an estimate of the standards' uncertainties can one determine if a standard is adequate for its intended use, or calculate the total uncertainty of the measurement process. Purchased standards usually have estimates of uncertainty on their certificates. However, when standards are prepared and characterized by a laboratory, variance propagation is required to estimate the uncertainty of the standard. Traditional variance propagation typically involves tedious use of partial derivatives, unfriendly software and the availability of statistical expertise. As a result, the uncertainty of prepared standards is often not determined, or is determined incorrectly. For situations meeting stated assumptions, easier shortcut methods of estimation are now available which eliminate the need for partial derivatives and require only a spreadsheet or calculator. The calculations are simplified by dividing the uncertainties into subgroups of absolute and relative uncertainties. These methods also incorporate the International Organization for Standardization (ISO) concepts for combining systematic and random uncertainties, as published in its Guide to the Expression of Uncertainty in Measurement. Details of the simplified methods and examples of their use are included in the paper.
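
    A worked sketch of the shortcut idea, under the stated assumptions of independent inputs: absolute standard uncertainties combine in quadrature for sums, relative ones for products. The standard, its masses and its purity below are hypothetical.

        # Hedged sketch: a standard prepared as value = (m1 + m2) * purity.
        # Absolute uncertainties add in quadrature for the sum; relative
        # uncertainties add in quadrature for the product. Numbers are invented.
        import math

        m1, u_m1 = 5.000, 0.002                 # g, absolute standard uncertainties
        m2, u_m2 = 4.950, 0.002                 # g
        purity, u_rel_purity = 0.999, 0.0005    # relative standard uncertainty

        total_mass = m1 + m2
        u_mass = math.hypot(u_m1, u_m2)                        # absolute subgroup (sum)
        u_rel = math.hypot(u_mass / total_mass, u_rel_purity)  # relative subgroup (product)

        value = total_mass * purity
        print(value, value * u_rel)    # value of the standard and its combined uncertainty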

  6. Leibniz' First Formalization of Syllogistics

    DEFF Research Database (Denmark)

    Robering, Klaus

    2014-01-01

    In his Dissertatio de Arte Combinatoria from 1666, Leibniz provides a formal presentation of syllogistics within the framework of his general-combinatoric program. He does not set up an axiomatic system for the derivation of all valid syllogistic modes; rather, he formulates a set of rules which select, from the combinations of letters, just those which belong to the useful, i.e., valid, modes. The set of codes of valid modes turns out to be a so-called "regular" language (in the sense of formal language theory). Leibniz' formalization of syllogistics in his Dissertatio thus contains an estimation of the computational complexity ...

  7. A Formalization of Linkage Analysis

    DEFF Research Database (Denmark)

    Ingolfsdottir, Anna; Christensen, A.I.; Hansen, Jens A.

    In this report a formalization of genetic linkage analysis is introduced. Linkage analysis is a computationally hard biomathematical method, which purpose is to locate genes on the human genome. It is rooted in the new area of bioinformatics and no formalization of the method has previously been ...

  8. Formal Methods: Practice and Experience

    DEFF Research Database (Denmark)

    Woodcock, Jim; Larsen, Peter Gorm; Bicarregui, Juan

    2009-01-01

    Formal methods use mathematical models for analysis and verification at any part of the program life-cycle. We describe the state of the art in the industrial use of formal methods, concentrating on their increasing use at the earlier stages of specification and design. We do this by reporting on...

  9. The Role of Formal Education in the Process of Enculturation; PERANAN PENDIDIKAN FORMAL DALAM PROSES PEMBUDAYAAN

    Directory of Open Access Journals (Sweden)

    Juanda Juanda

    2010-06-01

    Full Text Available This article deals with the role of formal education in the process of enculturation, covering six aspects: education and enculturation, educational policy in Indonesia, educational aspects as cultural phenomena, the functions of culture in education, the role of formal education in the process of enculturation, and the process of enculturation through formal education. The policy of equal rights to education is designed to give all people equal opportunity to obtain an education, without discrimination against any ethnic group. Education is an endeavour to develop the community's culture into a modern, developed and harmonious society based on shared cultural values. The enculturation process is an attempt to guide a person's attitudes and behaviour on the basis of knowledge and skill.

  10. Heisenberg, Matrix Mechanics, and the Uncertainty Principle 4-6 ...

    Indian Academy of Sciences (India)

    These investigations climaxed with the advent of quantum mechanics in the 1920s. Under the leadership of ... discouraged, not to say repelled, ... by the lack of "visualizability" in matrix mechanics. (Schrödinger's formalism deals with the ... because of its profound consequences. The Uncertainty Principle. The Uncertainty ...

  11. New procedure for departure formalities

    CERN Multimedia

    HR & GS Departments

    2011-01-01

    As part of the process of simplifying procedures and rationalising administrative processes, the HR and GS Departments have introduced new personalised departure formalities on EDH. These new formalities have applied to students leaving CERN since last year and from 17 October 2011 this procedure will be extended to the following categories of CERN personnel: Staff members, Fellows and Associates. It is planned to extend this electronic procedure to the users in due course. What purpose do departure formalities serve? The departure formalities are designed to ensure that members of the personnel contact all the relevant services in order to return any necessary items (equipment, cards, keys, dosimeter, electronic equipment, books, etc.) and are aware of all the benefits to which they are entitled on termination of their contract. The new departure formalities on EDH have the advantage of tailoring the list of services that each member of the personnel must visit to suit his individual contractual and p...

  12. Transition Path Time Distribution, Tunneling Times, Friction, and Uncertainty

    Science.gov (United States)

    Pollak, Eli

    2017-02-01

    A quantum mechanical transition path time probability distribution is formulated and its properties are studied using a parabolic barrier potential model. The average transit time is well defined and readily calculated. It is smaller than the analogous classical mechanical average transit time, vanishing at the crossover temperature. It provides a direct route for determining tunneling times. The average time may also be used to define a coarse-grained momentum of the system for the passage from one side of the barrier to the other. The product of the uncertainty in this coarse-grained momentum with the uncertainty in the location of the particle is shown, under certain conditions, to be smaller than the formal uncertainty limit of ℏ/2. The model is generalized to include friction in the form of a bilinear interaction with a harmonic bath. Using an Ohmic friction model, one finds that increasing the friction increases the transition time. Only moderate values of the reduced friction coefficient are needed for the quantum transition time and coarse-grained uncertainty to approach the classical limit, which is smaller than ℏ/2 when the friction is not too small. These results show how one obtains classical dynamics from a pure quantum system without invoking any further assumptions, approximations, or postulates.

  13. Uncertainty Analyses and Strategy

    International Nuclear Information System (INIS)

    Kevin Coppersmith

    2001-01-01

    The DOE identified a variety of uncertainties, arising from different sources, during its assessment of the performance of a potential geologic repository at the Yucca Mountain site. In general, the number and detail of process models developed for the Yucca Mountain site, and the complex coupling among those models, make the direct incorporation of all uncertainties difficult. The DOE has addressed these issues in a number of ways using an approach to uncertainties that is focused on producing a defensible evaluation of the performance of a potential repository. The treatment of uncertainties oriented toward defensible assessments has led to analyses and models with so-called "conservative" assumptions and parameter bounds, where conservative implies lower performance than might be demonstrated with a more realistic representation. The varying maturity of the analyses and models, and uneven level of data availability, result in total system level analyses with a mix of realistic and conservative estimates (for both probabilistic representations and single values). That is, some inputs have realistically represented uncertainties, and others are conservatively estimated or bounded. However, this approach is consistent with the "reasonable assurance" approach to compliance demonstration, which was called for in the U.S. Nuclear Regulatory Commission's (NRC) proposed 10 CFR Part 63 regulation (64 FR 8640 [DIRS 101680]). A risk analysis that includes conservatism in the inputs will result in conservative risk estimates. Therefore, the approach taken for the Total System Performance Assessment for the Site Recommendation (TSPA-SR) provides a reasonable representation of processes and conservatism for purposes of site recommendation. However, mixing unknown degrees of conservatism in models and parameter representations reduces the transparency of the analysis and makes the development of coherent and consistent probability statements about projected repository

  14. A formal framework for scenario development in support of environmental decision-making

    Science.gov (United States)

    Mahmoud, M.; Liu, Yajing; Hartmann, H.; Stewart, S.; Wagener, T.; Semmens, D.; Stewart, R.; Gupta, H.; Dominguez, D.; Dominguez, F.; Hulse, D.; Letcher, R.; Rashleigh, Brenda; Smith, C.; Street, R.; Ticehurst, J.; Twery, M.; van, Delden H.; Waldick, R.; White, D.; Winter, L.

    2009-01-01

    Scenarios are possible future states of the world that represent alternative plausible conditions under different assumptions. Often, scenarios are developed in a context relevant to stakeholders involved in their applications since the evaluation of scenario outcomes and implications can enhance decision-making activities. This paper reviews the state-of-the-art of scenario development and proposes a formal approach to scenario development in environmental decision-making. The discussion of current issues in scenario studies includes advantages and obstacles in utilizing a formal scenario development framework, and the different forms of uncertainty inherent in scenario development, as well as how they should be treated. An appendix for common scenario terminology has been attached for clarity. Major recommendations for future research in this area include proper consideration of uncertainty in scenario studies in particular in relation to stakeholder relevant information, construction of scenarios that are more diverse in nature, and sharing of information and resources among the scenario development research community. ?? 2008 Elsevier Ltd.

  15. Creative uncertainty

    Science.gov (United States)

    Victoria Marshall; Dil Hoda

    2009-01-01

    One of 18 articles inspired by the Meristem 2007 Forum, "Restorative Commons for Community Health." The articles include interviews, case studies, thought pieces, and interdisciplinary theoretical works that explore the relationship between human health and the urban...

  16. Logical strength of complexity theory and a formalization of the PCP theorem in bounded arithmetic

    OpenAIRE

    Pich, Ján

    2014-01-01

    We present several known formalizations of theorems from computational complexity in bounded arithmetic and formalize the PCP theorem in the theory PV1 (no formalization of this theorem was known). This includes a formalization of the existence and of some properties of the (n,d,λ)-graphs in PV1.

  17. Uncertainty and Decision Making

    Science.gov (United States)

    1979-09-01

    included as independent variables orderliness, the status of the source of information, the primacy versus recency of positive information items, and ... low uncertainty and high satisfaction. The primacy/recency and sequential/final variables produced no significant differences. In summary, we have ... to which the different independent variables (credibility, probability, and content) had an effect on the favorability judgments. The results were

  18. Formal verification - Robust and efficient code: Introduction to Formal Verification

    CERN Multimedia

    CERN. Geneva

    2016-01-01

    In general, FV means "proving that certain properties hold for a given system using formal mathematics". This definition can certainly feel daunting, however, as we will learn, we can reap benefits from the paradigm without digging too deep into ...

  19. Uncertainty modeling process for semantic technology

    Directory of Open Access Journals (Sweden)

    Rommel N. Carvalho

    2016-08-01

    Full Text Available The ubiquity of uncertainty across application domains generates a need for principled support for uncertainty management in semantically aware systems. A probabilistic ontology provides constructs for representing uncertainty in domain ontologies. While the literature has been growing on formalisms for representing uncertainty in ontologies, there remains little guidance in the knowledge engineering literature for how to design probabilistic ontologies. To address the gap, this paper presents the Uncertainty Modeling Process for Semantic Technology (UMP-ST), a new methodology for modeling probabilistic ontologies. To explain how the methodology works and to verify that it can be applied to different scenarios, this paper describes step-by-step the construction of a proof-of-concept probabilistic ontology. The resulting domain model can be used to support identification of fraud in public procurements in Brazil. While the case study illustrates the development of a probabilistic ontology in the PR-OWL probabilistic ontology language, the methodology is applicable to any ontology formalism that properly integrates uncertainty with domain semantics.

  20. Applications of the Decoherence Formalism

    Science.gov (United States)

    Brun, Todd Andrew

    In this work the decoherence formalism of quantum mechanics is explored and applied to a number of interesting problems in quantum physics. The boundary between quantum and classical physics is examined, and it is demonstrated that quantum histories corresponding to classical equations of motion become more probable for a broad class of models, including linear and nonlinear models of Brownian motion. The link between noise, dissipation, and decoherence is studied. This work is then applied to systems which classically exhibit dissipative chaotic dynamics. A theory is explicated for treating these systems, and the ideas are applied to a particular model of the forced, damped Duffing oscillator, which is chaotic for certain parameter values. Differences between classical and quantum chaos are examined, particularly differences arising in the structure of fractal strange attractors, as well as the conceptual difficulties in framing standard notions of chaos in a quantum system. A brief discussion of previous work on quantum chaos is included, and the differences between Hamiltonian and dissipative chaos are pointed out; a somewhat different interpretation of quantum chaos from the standard one is suggested. A class of histories for quantum systems, in phase space rather than configuration space, is studied. Different ways of representing projections in phase space are discussed, and expressions for the probability of phase space histories are derived; conditions for such histories to decohere are also estimated in the semiclassical limit.

  1. Introduction to formal languages

    CERN Document Server

    Révész, György E

    1991-01-01

    Covers all areas, including operations on languages, context-sensitive languages, automata, decidability, syntax analysis, derivation languages, and more. Numerous worked examples, problem exercises, and elegant mathematical proofs. 1983 edition.

  2. Leibniz' First Formalization of Syllogistics

    DEFF Research Database (Denmark)

    Robering, Klaus

    2014-01-01

    In his Dissertatio de Arte Combinatoria from 1666, Leibniz provides a formal presentation of syllogistics within the framework of his general-combinatoric program. He does not set up an axiomatic system for the derivation of all valid syllogistic modes; rather, he formulates a set of rules which select, from the combinations of letters, just those which belong to the useful, i.e., valid, modes. The set of codes of valid modes turns out to be a so-called "regular" language (in the sense of formal language theory). Leibniz' formalization of syllogistics in his Dissertatio thus contains an estimation of the computational complexity ...

  3. Lessons Learned - The Use of Formal Expert Elicitation in Probabilistic Seismic Hazard

    Energy Technology Data Exchange (ETDEWEB)

    K.J. Coppersmith; R.C. Perman; R.R. Youngs

    2006-05-10

    Probabilistic seismic hazard analyses provide the opportunity, indeed the requirement, to quantify the uncertainties in important inputs to the analysis. The locations of future earthquakes, their recurrence rates and maximum size, and the ground motions that will result at a site of interest are all quantities that require careful consideration because they are uncertain. The earliest PSHA models [Cornell, 1968] provided solely for the randomness or aleatory variability in these quantities. The most sophisticated seismic hazard models today, which include quantified uncertainties, are merely more realistic representations of this basic aleatory model. All attempts to quantify uncertainties require expert judgment. Further, all uncertainty models should endeavor to consider the range of views of the larger technical community at the time the hazard analysis is conducted. In some cases, especially for large projects under regulatory review, formal structured methods for eliciting expert judgments have been employed. Experience has shown that certain key elements are required for these assessments to be successful, including: (1) experts should be trained in probability theory, uncertainty quantification, and ways to avoid common cognitive biases; (2) comprehensive and user-friendly databases should be provided to the experts; (3) experts should be required to evaluate all potentially credible hypotheses; (4) workshops and other interactions among the experts and proponents of published viewpoints should be encouraged; (5) elicitations are best conducted in individual interview sessions; (6) feedback should be provided to the experts to give them insight into the significance of alternative assessments to the hazard results; and (7) complete documentation should include the technical basis for all assessments. Case histories are given from seismic hazard analyses in Europe, western North America, and the stable continental region of the United States.

  4. Hydrology, society, change and uncertainty

    Science.gov (United States)

    Koutsoyiannis, Demetris

    2014-05-01

    Heraclitus, who predicated that "panta rhei", also proclaimed that "time is a child playing, throwing dice". Indeed, change and uncertainty are tightly connected. The type of change that can be predicted with accuracy is usually trivial; likewise, decision making under certainty is mostly trivial. The current acceleration of change, due to unprecedented human achievements in technology, inevitably results in increased uncertainty. In turn, the increased uncertainty makes society apprehensive about the future, insecure, and credulous toward a growing future-telling industry. Several scientific disciplines, including hydrology, tend to become part of this industry. The social demand for certainties, no matter if these are delusional, is compounded by a misconception in the scientific community that confuses science with the elimination of uncertainty. However, recognizing that uncertainty is inevitable and tightly connected with change will help us appreciate the positive sides of both. Hence, uncertainty becomes an important object to study, understand and model. Decision making under uncertainty, developing adaptability and resilience for an uncertain future, and using technology and engineering means for planned change to control the environment are important and feasible tasks, all of which will benefit from advancements in the Hydrology of Uncertainty.

  5. Fourth NASA Langley Formal Methods Workshop

    Science.gov (United States)

    Holloway, C. Michael (Compiler); Hayhurst, Kelly J. (Compiler)

    1997-01-01

    This publication consists of papers presented at NASA Langley Research Center's fourth workshop on the application of formal methods to the design and verification of life-critical systems. Topics considered include: proving properties of accidents; modeling and validating SAFER in VDM-SL; requirements analysis of real-time control systems using PVS; a tabular language for system design; and automated deductive verification of parallel systems. Also included is a fundamental hardware design in PVS.

  6. Contribution to the systemic study of energetic systems including electrochemical devices: Bond Graph formalism applied to modelling fuel cells, lithium-ion batteries and sun-racer; Contribution a l'etude systemique de dispositifs energetiques a composants electrochimiques. Formalisme Bond Graph applique aux piles a combustible, accumulateurs lithium-ion, vehicule solaire

    Energy Technology Data Exchange (ETDEWEB)

    Saisset, R.

    2004-04-01

    This thesis is a contribution to the study of electric power conversion systems including electrochemical devices. A systemic approach takes advantage of the unified Bond Graph formalism in order to model every component as well as the whole system. A state of the art of electrochemical devices for decentralized electric energy generation and storage puts emphasis on common phenomena, with the aim of developing 'system-oriented' generic models. Solid Oxide and Proton Exchange Membrane Fuel Cells (SOFC, PEMFC), as well as lithium-ion batteries, have been modelled through close collaboration with electrochemistry specialists. These models involve an explicit representation, at a macroscopic level, of the conversion and irreversible phenomena linked to the chemical reaction, coupled across the hydraulic, chemical, thermodynamic, electric and thermal domains. These models are used to study the modularity of the components, particularly the electric and thermal imbalances in series and parallel fuel cell associations. The systemic approach is also applied to the study of architectures and energy management of electric power generating units involving a PEMFC and battery or supercapacitor storage. Different working conditions for the fuel cells are defined and studied, consisting of voltage, current or power being imposed by means of the storage and static converter environment. Parameter identification and working tests are performed on specially developed test benches so as to validate theoretical results. Finally, the method is applied to the study of a 'sun-racer', an original complex system with an embedded photovoltaic generator, electrochemical storage and a brushless wheel motor, wholly modelled in order to compare various energy management strategies onboard the solar vehicle 'Solelhada'. (author)

  7. El Salvador - Formal Technical Education

    Data.gov (United States)

    Millennium Challenge Corporation — With a budget of nearly $20 million, the Formal Technical Education Sub-Activity was designed to strengthen technical and vocational educational institutions in the...

  8. Concepts of formal concept analysis

    Science.gov (United States)

    Žáček, Martin; Homola, Dan; Miarka, Rostislav

    2017-07-01

    The aim of this article is to apply Formal Concept Analysis to the concept of the world. Formal concept analysis (FCA), as a methodology of data analysis, information management and knowledge representation, has the potential to be applied to a variety of linguistic problems. FCA is a mathematical theory of concepts and concept hierarchies that reflects an understanding of 'concept'. Formal concept analysis explicitly formalizes the extension and intension of a concept and their mutual relationships. A distinguishing feature of FCA is an inherent integration of three components of the conceptual processing of data and knowledge, namely the discovery of and reasoning with concepts in data, the discovery of and reasoning with dependencies in data, and the visualization of data, concepts, and dependencies with folding/unfolding capabilities.
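
    The core construction of FCA can be illustrated with a toy context (our example, not the article's): every formal concept is a pair (extent, intent) closed under the two derivation operators.

        # Hedged sketch: enumerate the formal concepts of a small object/attribute
        # context. The context itself is hypothetical.
        from itertools import chain, combinations

        context = {
            "sparrow": {"flies", "bird"},
            "penguin": {"bird"},
            "bat":     {"flies", "mammal"},
        }

        def intent(objs):    # attributes shared by all given objects
            sets = [context[o] for o in objs]
            return set.intersection(*sets) if sets else set(chain.from_iterable(context.values()))

        def extent(attrs):   # objects possessing all given attributes
            return {o for o, a in context.items() if attrs <= a}

        concepts = set()
        objects = list(context)
        for r in range(len(objects) + 1):
            for subset in combinations(objects, r):
                b = intent(set(subset))
                a = extent(b)              # closure: extent of the intent
                concepts.add((frozenset(a), frozenset(b)))
        for a, b in sorted(concepts, key=lambda c: len(c[0])):
            print(sorted(a), sorted(b))    # the elements of the concept lattice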

  9. Readings in Formal Epistemology : Sourcebook

    NARCIS (Netherlands)

    Arló-Costa, H.; Hendricks, V.F.; van Benthem, J.

    2016-01-01

    This volume presents 38 classic texts in formal epistemology, and strengthens the ties between research into this area of philosophy and its neighbouring intellectual disciplines. The editors provide introductions to five subsections: Bayesian Epistemology, Belief Change, Decision Theory,

  10. Risk, uncertainty and regulation.

    Science.gov (United States)

    Krebs, John R

    2011-12-13

    This paper reviews the relationship between scientific evidence, uncertainty, risk and regulation. Risk has many different meanings. Furthermore, if risk is defined as the likelihood of an event happening multiplied by its impact, subjective perceptions of risk often diverge from the objective assessment. Scientific evidence may be ambiguous. Scientific experts are called upon to assess risks, but there is often uncertainty in their assessment, or disagreement about the magnitude of the risk. The translation of risk assessments into policy is a political judgement that includes consideration of the acceptability of the risk and the costs and benefits of legislation to reduce the risk. These general points are illustrated with reference to three examples: regulation of risk from pesticides, control of bovine tuberculosis and pricing of alcohol as a means to discourage excessive drinking.

  11. Uncertainties about climate

    International Nuclear Information System (INIS)

    Laval, Katia; Laval, Guy

    2013-01-01

    Like meteorology, climatology is not an exact science: climate change forecasts necessarily include a share of uncertainty. It is precisely this uncertainty which is brandished and exploited by the opponents of the global warming theory to call into question the estimations of its future consequences. Is it legitimate to predict the future using past climate data (well documented up to 100,000 years BP) or the climates of other planets, taking into account the imprecision of the measurements and the intrinsic complexity of the Earth's machinery? How is it possible to model such a huge and interwoven system, for which any exact description has become impossible? Why do water and precipitation play such an important role in local and global forecasts, and how should they be treated? This book, written by two physicists, answers these delicate questions with simplicity, in order to give anyone the possibility to form his own opinion about global warming and the need to act rapidly.

  12. Equifinality of formal (DREAM) and informal (GLUE) Bayesian approaches in hydrologic modeling?

    Energy Technology Data Exchange (ETDEWEB)

    Vrugt, Jasper A [Los Alamos National Laboratory]; Robinson, Bruce A [Los Alamos National Laboratory]; Ter Braak, Cajo J F [NON LANL]; Gupta, Hoshin V [NON LANL]

    2008-01-01

    In recent years, a strong debate has emerged in the hydrologic literature regarding what constitutes an appropriate framework for uncertainty estimation. In particular, there is strong disagreement about whether an uncertainty framework should have its roots within a proper statistical (Bayesian) context, or whether such a framework should be based on a different philosophy and implement informal measures and weaker inference to summarize parameter and predictive distributions. In this paper, we compare a formal Bayesian approach using Markov Chain Monte Carlo (MCMC) with generalized likelihood uncertainty estimation (GLUE) for assessing uncertainty in conceptual watershed modeling. Our formal Bayesian approach is implemented using the recently developed differential evolution adaptive metropolis (DREAM) MCMC scheme with a likelihood function that explicitly considers model structural, input and parameter uncertainty. Our results demonstrate that DREAM and GLUE can generate very similar estimates of total streamflow uncertainty. This suggests that formal and informal Bayesian approaches have more common ground than the hydrologic literature and ongoing debate might suggest. The main advantage of formal approaches is, however, that they attempt to disentangle the effect of forcing, parameter and model structural error on total predictive uncertainty. This is key to improving hydrologic theory and to better understanding and predicting the flow of water through catchments.
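
    A toy illustration of the two philosophies, not the paper's DREAM implementation: a formal Gaussian likelihood sampled with a simple Metropolis chain, versus GLUE-style informal weighting of random draws, for a hypothetical one-parameter model y = a * x.

        # Hedged sketch comparing formal and informal uncertainty estimation on
        # synthetic data. Model, likelihoods and tuning constants are assumptions.
        import numpy as np

        rng = np.random.default_rng(1)
        x = np.linspace(1, 10, 20)
        y_obs = 2.0 * x + rng.normal(0, 1.0, x.size)    # synthetic observations

        def log_like(a, sigma=1.0):                     # formal Gaussian likelihood
            r = y_obs - a * x
            return -0.5 * np.sum((r / sigma) ** 2)

        # Formal route: Metropolis sampling of the posterior (flat prior on a).
        a, chain = 1.0, []
        for _ in range(5000):
            prop = a + rng.normal(0, 0.05)
            if np.log(rng.uniform()) < log_like(prop) - log_like(a):
                a = prop
            chain.append(a)

        # Informal route (GLUE): efficiency-based weights over random draws.
        draws = rng.uniform(0.5, 3.5, 5000)
        sse = np.array([np.sum((y_obs - d * x) ** 2) for d in draws])
        weights = np.maximum(1 - sse / np.sum((y_obs - y_obs.mean()) ** 2), 0)
        weights /= weights.sum()
        print(np.mean(chain[1000:]), np.sum(weights * draws))   # similar central estimates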

  13. Uncertainty and Decision Making: Examples of Some Possible New Frontiers

    Science.gov (United States)

    Silliman, S. E.; Rodak, C. M.; Bolster, D.; Saavedra, K.; Evans, W.

    2011-12-01

    The concept of decision making under uncertainty for groundwater systems represents an exciting area of research and application. In this presentation, three examples are briefly introduced which represent possible new applications of risk analysis and decision making under uncertainty. In the most classic of the three examples, a probabilistic strategy is considered within the context of management/assessment of proposed changes in land use in the vicinity of a public water-supply well. Focused on health risk related to contamination at the well, the analysis includes uncertainties in source location/strength, groundwater flow/transport, human exposure, and human health risk. The second example involves application of Probabilistic Risk Assessment (PRA) to the evaluation of development projects in rural regions of developing countries. PRA combined with Fault Tree Analysis provides a structure for analyzing the impact of data uncertainties on the estimation of health risk resulting from failure of multiple components of new water-resource systems. The third is an extension of the concept of "risk compensation" to the analysis of potential long-term risk associated with new water-resource projects. Of direct interest here is the appearance of new risks to the public, such as the introduction of new disease pathways or new sources of contamination of the source waters. As a result of limitations on the conceptual model and/or limitations on data, this type of risk is often difficult to identify and assess, and is therefore not commonly included in formal decision-making efforts; it may, however, seriously impact the long-term net benefit of a water-resource project. The goal of presenting these three examples is to illustrate the breadth of possible applications of uncertainty/risk analyses beyond the more classic applications to groundwater remediation and protection.
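
    A minimal sketch of the PRA/fault-tree calculation mentioned in the second example, with a hypothetical water-system fault tree and invented basic-event distributions standing in for data uncertainty.

        # Hedged sketch: top-event probability from AND/OR gates over independent
        # basic events, with Monte Carlo over uncertain event probabilities.
        import numpy as np

        rng = np.random.default_rng(2)
        n = 10000
        # Hypothetical basic-event probabilities with epistemic uncertainty:
        p_pump   = rng.beta(2, 50, n)      # pump failure
        p_source = rng.beta(1, 100, n)     # source contamination
        p_treat  = rng.beta(2, 30, n)      # treatment failure

        # Top event: pump failure OR (contamination AND treatment failure).
        p_top = p_pump + (1 - p_pump) * p_source * p_treat
        print(np.percentile(p_top, [5, 50, 95]))   # uncertainty in the risk estimate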

  14. General form of effective Lagrangian in path integral quantization formalism

    CERN Document Server

    Tu Tung Sheng; Yuan Tu Nan

    1980-01-01

    The general form of the effective Lagrangian in the path integral formalism is obtained. The effective Lagrangian includes the classical Lagrangian, the term proportional to δ(0) given by Lee and Yang (see Phys. Rev., vol. 128, p. 885, 1962) and an additional term which is presented explicitly in the series expansion. The formalism is explicitly covariant. (6 refs).

  15. Evaluation of nuclear data and their uncertainties

    International Nuclear Information System (INIS)

    Story, J.S.

    1984-01-01

    Some topics studied within the Winfrith Nuclear Data Group in recent years, and still of current importance, are briefly reviewed. Moderator cross-sections: criteria to be met for reactor applications are listed; thermal neutron scattering theory is summarized, with the approximations used to facilitate computation; neutron age data test stringently the accuracy of epithermal cross-sections; a modification of the CFS effective range treatment for S-wave scatter by H is presented, and new calculations with up-to-date slow neutron scattering data are advocated. Use of multilevel resonance formalisms: the top bound resonance should be included explicitly in calculations; additive statistical terms are given to allow for "distant" negative and positive resonances, in both MLBW and R-M formalisms; formulae are presented for estimating R-M level shifts for l>0 resonances. Resonance mean spacings: the Dyson-Mehta optimum estimator is utilised in a method which updates the staircase plot. Resonances of 56Fe have been resolved to approx. 800 keV, over which range the level density for given Jπ should increase 2-fold; this variation is allowed for in the mean spacing calculations. Fission-product decay power: the present status of integral data and summation calculations for 235U and 239Pu fissions is summarized, with a variety of intercomparisons including 239Pu/235U ratios. Data uncertainties are considered, but the sequence of data on Γγ for the 27.8 keV resonance of 56Fe provided a cautionary example. (author)

  16. Formal Education with LSST

    Science.gov (United States)

    Herrold, Ardis; Bauer, Amanda, Dr.; Peterson, J. Matt; Large Synoptic Survey Telescope Education and Public Outreach Team

    2018-01-01

    The Large Synoptic Survey Telescope will usher in a new age of astronomical data exploration for science educators and students. LSST data sets will be large, deep, and dynamic, and will establish a time-domain record that will extend over a decade. They will be used to provide engaging, relevant learning experiences. The EPO Team will develop online investigations using authentic LSST data that offer varying levels of challenge and depth by the start of telescope operations, slated to begin in 2022. The topics will cover common introductory astronomy concepts, and will align with the four science domains of LSST: The Milky Way, the changing sky (transients), solar system (moving) objects, and dark matter and dark energy. Online Jupyter notebooks will make LSST data easily available to access and analyze by students at the advanced middle school through college levels. Using online notebooks will circumvent common obstacles caused by firewalls, bandwidth issues, and the need to download software, as they will be accessible from any computer or tablet with internet access. Although the LSST EPO Jupyter notebooks are Python-based, a knowledge of programming will not be required to use them. Each topical investigation will include teacher and student versions of Jupyter notebooks, instructional videos, and access to a suite of support materials including a forum, and professional development training and tutorial videos. Jupyter notebooks will contain embedded widgets to process data, eliminating the need to use external spreadsheets and plotting software. Students will be able to analyze data by using some of the existing modules already developed for professional astronomers. This will shorten the time needed to conduct investigations and will shift the emphasis to understanding the underlying science themes, which is often lost with novice learners.

  17. A Comparitive Study of Subject Knowledge of B.Ed Graduates of Formal and Non-Formal Teacher Education Systems

    Science.gov (United States)

    Saif, Perveen; Reba, Amjad; ud Din, Jalal

    2017-01-01

    This study was designed to compare the subject knowledge of B.Ed graduates of formal and non-formal teacher education systems. The population of the study included all teachers from Girls High and Higher Secondary Schools, from both the private and public sectors, in the district of Peshawar. Out of the total population, twenty schools were randomly…

  18. Formalization of an environmental model using formal concept analysis - FCA

    Science.gov (United States)

    Bourdon-García, Rubén D.; Burgos-Salcedo, Javier D.

    2016-08-01

    Nowadays, there is a pressing need to generate novel strategies for the analysis of social-ecological systems in order to resolve global sustainability problems. The main purpose of this paper is to apply formal concept analysis to formalize the theory of Augusto Ángel Maya, who was without a doubt one of the most important environmental philosophers in South America; Ángel Maya proposed and established that Ecosystem-Culture relations, rather than Human-Nature ones, are determinant in our understanding and management of natural resources. On this basis, a concept lattice, formal concepts, subconcept-superconcept relations, partially ordered sets, the supremum and infimum of the lattice, and implications between attributes (the Duquenne-Guigues base) were determined for the ecosystem-culture relations.

  19. Hydrologic Scenario Uncertainty in a Comprehensive Assessment of Hydrogeologic Uncertainty

    Science.gov (United States)

    Nicholson, T. J.; Meyer, P. D.; Ye, M.; Neuman, S. P.

    2005-12-01

    A method to jointly assess hydrogeologic conceptual model and parameter uncertainties has recently been developed, based on a Maximum Likelihood implementation of Bayesian Model Averaging (MLBMA). Evidence from groundwater model post-audits suggests that errors in the projected future hydrologic conditions of a site (hydrologic scenarios) are a significant source of model predictive errors. MLBMA can be extended to include hydrologic scenario uncertainty, along with conceptual model and parameter uncertainties, in a systematic and quantitative assessment of predictive uncertainty. Like conceptual model uncertainty, scenario uncertainty is represented by a discrete set of alternative scenarios. The effect of scenario uncertainty on model predictions is quantitatively assessed by conducting an MLBMA analysis under each scenario. We demonstrate that posterior model probability is a function of the scenario only through the possible dependence of prior model probabilities on the scenario. As a result, the model likelihoods (computed from calibration results) are not a function of the scenario and do not need to be recomputed under each scenario. MLBMA results for each scenario are weighted by the scenario probability and combined to render a joint assessment of scenario, conceptual model, and parameter uncertainty. Like model probability, scenario probability represents a subjective evaluation, in this case of the plausibility of the occurrence of the specific scenario. Because the scenarios describe future conditions, the scenario probabilities represent prior estimates and cannot be updated using the (past) system state data that are used to compute posterior model probabilities. Assessment of hydrologic scenario uncertainty is illustrated using a site-specific application considering future changes in land use, dam operations, and climate. Estimation of scenario probabilities and consideration of scenario characteristics (e.g., timing, magnitude) are discussed.
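
    The combination rule can be sketched as follows (our notation and invented numbers): posterior model probabilities, which per the abstract need not be recomputed under each scenario when the prior model probabilities are scenario-independent, are weighted by the prior scenario probabilities.

        # Hedged sketch: prediction averaged over scenarios and conceptual models.
        # Probabilities and predictions are hypothetical.
        p_scenario = {"wet": 0.3, "dry": 0.7}     # prior scenario probabilities
        p_model = {"m1": 0.6, "m2": 0.4}          # posterior model probabilities
        prediction = {                            # hypothetical head predictions
            ("wet", "m1"): 12.0, ("wet", "m2"): 14.0,
            ("dry", "m1"):  8.0, ("dry", "m2"):  9.5,
        }

        mean = sum(p_scenario[s] * p_model[m] * prediction[(s, m)]
                   for s in p_scenario for m in p_model)
        print(mean)    # scenario- and model-averaged prediction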

  20. A FORMALISM FOR FUZZY BUSINESS RULES

    Directory of Open Access Journals (Sweden)

    Vasile Mazilescu

    2015-05-01

    Full Text Available The aim of this paper is to provide a formalism for fuzzy rule bases, included in our prototype system FUZZY_ENTERPRISE. This framework can be used in Distributed Knowledge Management Systems (DKMSs), real-time interdisciplinary decision-making systems that often require increasing technical support to reach high-quality decisions in a timely manner. The language of first-order predicates facilitates the formulation of complex knowledge in a rigorous way, imposing appropriate reasoning techniques.
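
    A hedged sketch of the kind of fuzzy rule such a formalism supports (a toy rule, not FUZZY_ENTERPRISE code): trapezoidal membership functions, with min acting as the AND connective.

        # Illustrative sketch: evaluating one fuzzy business rule. The rule and
        # all membership parameters are hypothetical.
        def trap(x, a, b, c, d):
            """Trapezoidal membership function rising on [a, b], falling on [c, d]."""
            if x <= a or x >= d:
                return 0.0
            if b <= x <= c:
                return 1.0
            return (x - a) / (b - a) if x < b else (d - x) / (d - c)

        # Rule: IF demand is high AND stock is low THEN reorder urgency is high.
        demand, stock = 65.0, 25.0
        mu_high_demand = trap(demand, 50, 70, 100, 120)
        mu_low_stock   = trap(stock, -1, 0, 20, 40)
        urgency = min(mu_high_demand, mu_low_stock)    # min as fuzzy AND
        print(urgency)    # degree to which the rule fires (0.75 here)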

  1. Formal connections in deformation quantization

    DEFF Research Database (Denmark)

    Masulli, Paolo

    product on a Poisson manifold that is in general non-commutative and corresponds to the composition of the quantized observables. While in general it is difficult to express a star product globally on a curved manifold in an explicit way, we consider a case where this is possible, namely that of a Kähler ... terms. This allows us to express the equations determining a trivialization of the formal connection completely in graph terms, and solving them amounts to finding a linear combination of graphs whose derivative is equal to a given expression. We shall also look at another approach to the problem ... that is more calculative. Moreover we use the graph formalism to give a set of recursive equations determining the formal connection for a given family of star products ...

  2. Logical formalization and the formalization of logic(s)

    Czech Academy of Sciences Publication Activity Database

    Peregrin, Jaroslav; Svoboda, Vladimír

    2016-01-01

    Roč. 59, č. 233 (2016), s. 55-80 ISSN 0024-5836 R&D Projects: GA ČR(CZ) GA13-21076S Institutional support: RVO:67985955 Keywords : logical formalization * logical analysis * reflective equilibrium Subject RIV: AA - Philosophy ; Religion

  3. Integrating Semi-formal and Formal Software Specification Techniques

    NARCIS (Netherlands)

    Wieringa, Roelf J.; Dubois, Eric

    1998-01-01

    In this paper, we report on the integration of informal, semiformal and formal system specification techniques. We present a framework for system specification called TRADE, within which several well-known semiformal specification techniques are placed. TRADE is based on an analysis of structured

  4. International conference on Facets of Uncertainties and Applications

    CERN Document Server

    Skowron, Andrzej; Maiti, Manoranjan; Kar, Samarjit

    2015-01-01

    Since the emergence of the formal concept of probability in the seventeenth century, uncertainty has been perceived solely in terms of probability theory. However, this apparently unique link between uncertainty and probability theory came under investigation a few decades ago. Uncertainties are nowadays accepted to be of various kinds. Uncertainty in general can refer to different senses: not certainly known, questionable, problematic, vague, not definite or determined, ambiguous, liable to change, not reliable. In Indian languages, particularly in Sanskrit-based languages, there are other, higher levels of uncertainty. It has been shown that several mathematical concepts, such as the theory of fuzzy sets, the theory of rough sets, evidence theory, possibility theory, the theory of complex systems and complex networks, the theory of fuzzy measures and uncertainty theory, can also successfully model uncertainty.

  5. Formal language constrained path problems

    Energy Technology Data Exchange (ETDEWEB)

    Barrett, C.; Jacob, R.; Marathe, M.

    1997-07-08

    In many path finding problems arising in practice, certain patterns of edge/vertex labels in the labeled graph being traversed are allowed/preferred, while others are disallowed. Motivated by such applications as intermodal transportation planning, the authors investigate the complexity of finding feasible paths in a labeled network, where the mode choice for each traveler is specified by a formal language. The main contributions of this paper include the following: (1) the authors show that the problem of finding a shortest path between a source and destination for a traveler whose mode choice is specified as a context free language is solvable efficiently in polynomial time, when the mode choice is specified as a regular language they provide algorithms with improved space and time bounds; (2) in contrast, they show that the problem of finding simple paths between a source and a given destination is NP-hard, even when restricted to very simple regular expressions and/or very simple graphs; (3) for the class of treewidth bounded graphs, they show that (i) the problem of finding a regular language constrained simple path between source and a destination is solvable in polynomial time and (ii) the extension to finding context free language constrained simple paths is NP-complete. Several extensions of these results are presented in the context of finding shortest paths with additional constraints. These results significantly extend the results in [MW95]. As a corollary of the results, they obtain a polynomial time algorithm for the BEST k-SIMILAR PATH problem studied in [SJB97]. The previous best algorithm was given by [SJB97] and takes exponential time in the worst case.
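
    The polynomial-time result (1) rests on searching the product of the labeled graph and an automaton for the language; the sketch below runs Dijkstra on such a product for a small invented network and a DFA for walk* bus walk* (at most one bus leg).

        # Hedged sketch: regular-language-constrained shortest path via the
        # graph-automaton product. Graph, labels and weights are hypothetical.
        import heapq

        edges = {                 # node -> [(successor, label, weight)]
            "s": [("a", "walk", 2), ("b", "bus", 1)],
            "a": [("t", "bus", 2)],
            "b": [("t", "walk", 5)],
            "t": [],
        }
        dfa = {(0, "walk"): 0, (0, "bus"): 1, (1, "walk"): 1}   # walk* bus walk*
        accepting = {1}

        def constrained_shortest_path(src, dst):
            dist = {(src, 0): 0}
            pq = [(0, src, 0)]
            while pq:
                d, v, q = heapq.heappop(pq)
                if v == dst and q in accepting:
                    return d
                if d > dist.get((v, q), float("inf")):
                    continue
                for u, label, w in edges[v]:
                    q2 = dfa.get((q, label))
                    if q2 is not None and d + w < dist.get((u, q2), float("inf")):
                        dist[(u, q2)] = d + w
                        heapq.heappush(pq, (d + w, u, q2))
            return None    # no path with an accepted label sequence

        print(constrained_shortest_path("s", "t"))   # 4, via s-a-t (walk then bus)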

  6. Formal Institutions and Subjective Wellbeing

    DEFF Research Database (Denmark)

    Bjørnskov, Christian; Dreher, Axel; Fischer, Justina A.V.

    2010-01-01

    A long tradition in economics explores the association between the quality of formal institutions and economic performance. The literature on the relationship between such institutions and happiness is, however, rather limited, and inconclusive. In this paper, we revisit the findings from recent...... cross-country studies on the institution-happiness association. Our findings suggest that their conclusions are qualitatively rather insensitive to the specific measure of 'happiness' used, while the associations between formal institutions and subjective well-being differ among poor and rich countries...

  7. Optimizing production under uncertainty

    DEFF Research Database (Denmark)

    Rasmussen, Svend

    This Working Paper derives criteria for optimal production under uncertainty based on the state-contingent approach (Chambers and Quiggin, 2000), and discusses potential problems involved in applying the state-contingent approach in a normative context. The analytical approach uses the concept of state-contingent production functions and a definition of inputs including both sort of input, activity and allocation technology. It also analyses production decisions where production is combined with trading in state-contingent claims such as insurance contracts. The final part discusses

  8. Optimization under Uncertainty

    KAUST Repository

    Lopez, Rafael H.

    2016-01-06

    The goal of this poster is to present the main approaches to the optimization of engineering systems in the presence of uncertainties. We begin by giving an insight into robust optimization. Next, we detail how to deal with probabilistic constraints in optimization, the so-called reliability-based design. Subsequently, we present the risk optimization approach, which includes the expected costs of failure in the objective function. After the basic description of each approach is given, the projects developed by CORE are presented. Finally, the main current topic of research of CORE is described.
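
    An illustrative sketch of the risk optimization approach (a toy problem, not a CORE project): choose a design variable minimizing initial cost plus the expected cost of failure, estimated by Monte Carlo; the cost model and load distribution are invented.

        # Hedged sketch: risk optimization with expected failure cost in the
        # objective. All numbers are hypothetical.
        import numpy as np

        rng = np.random.default_rng(3)
        loads = rng.lognormal(mean=0.0, sigma=0.3, size=20000)   # uncertain load
        COST_FAILURE = 100.0

        def total_cost(capacity):
            initial = 2.0 * capacity                 # hypothetical initial-cost model
            p_fail = np.mean(loads > capacity)       # Monte Carlo failure probability
            return initial + COST_FAILURE * p_fail   # expected total cost

        capacities = np.linspace(0.5, 4.0, 200)
        best = min(capacities, key=total_cost)
        print(best, total_cost(best))    # risk-optimal design and its expected cost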

  9. Sources of uncertainty in cancer survivorship.

    Science.gov (United States)

    Miller, Laura E

    2012-12-01

    Previous research has demonstrated the common experience of illness-related uncertainty; however, little research has explored the specific sources of uncertainty throughout cancer survivorship. The purpose of this study is to investigate the experience of uncertainty for cancer survivors and their partners. Thus, the following research question is posed: What are the sources of uncertainty in cancer survivorship for survivors and partners? One-on-one interviews were conducted with 35 cancer survivors and 25 partners. Constant comparative methodologies were used to analyze the data. Participants described medical, personal, and social sources of uncertainty that persisted throughout survivorship. Medical sources of uncertainty included questions about the cancer diagnosis, treatment and prognosis. Personal sources of uncertainty included ambiguous valued identities and career-related questions. Social sources of uncertainty included unclear communicative, relational and familial consequences of illness. Survivors and partners in this study experienced uncertainty that persisted long after the completion of cancer treatment. The participants also described sources of uncertainty unique to this illness context. These results have important implications for health care providers and intervention developers and imply that chronic uncertainty should be managed throughout survivorship. The sources of uncertainty described in the current study have important implications for cancer survivors' management of uncertainty. Cancer survivors and their family members must first know the common sources of uncertainty to adaptively adjust to an uncertain survivorship trajectory. The present investigation provides insight into the uncertainty experiences of cancer survivors and implies that continued care may improve well-being after the completion of cancer treatment.

  10. Climate Projections and Uncertainty Communication.

    Science.gov (United States)

    Joslyn, Susan L; LeClerc, Jared E

    2016-01-01

    Lingering skepticism about climate change might be due in part to the way climate projections are perceived by members of the public. Variability between scientists' estimates might give the impression that scientists disagree about the fact of climate change rather than about details concerning the extent or timing. Providing uncertainty estimates might clarify that the variability is due in part to quantifiable uncertainty inherent in the prediction process, thereby increasing people's trust in climate projections. This hypothesis was tested in two experiments. Results suggest that including uncertainty estimates along with climate projections leads to an increase in participants' trust in the information. Analyses explored the roles of time, place, demographic differences (e.g., age, gender, education level, political party affiliation), and initial belief in climate change. Implications are discussed in terms of the potential benefit of adding uncertainty estimates to public climate projections. Copyright © 2015 Cognitive Science Society, Inc.

  11. Rapid research and implementation priority setting for wound care uncertainties.

    Directory of Open Access Journals (Sweden)

    Trish A Gray

    Full Text Available People with complex wounds are more likely to be elderly, living with multimorbidity and wound-related symptoms. A variety of products are available for managing complex wounds and a range of healthcare professionals are involved in wound care, yet there is a lack of good evidence to guide practice and services. These factors create uncertainty for those who deliver and those who manage wound care. Formal priority setting for research and implementation topics is needed to more accurately target the gaps in treatment and services. We solicited practitioner and manager uncertainties in wound care and held a priority-setting workshop to facilitate a collaborative approach to prioritising wound-care-related uncertainties. We recruited healthcare professionals who regularly cared for patients with complex wounds, were wound care specialists or managed wound care services. Participants submitted up to five wound care uncertainties in consultation with their colleagues via an online survey, and attended a priority-setting workshop. Submitted uncertainties were collated, sorted and categorised according to professional group. On the day of the workshop, participants were divided into four groups depending on their profession. Uncertainties submitted by their professional group were viewed, discussed and amended prior to the first of three individual voting rounds. Participants cast up to ten votes for the uncertainties they judged as being high priority. Continuing in the professional groups, the top 10 uncertainties from each group were displayed, and the process was repeated. Groups were then brought together for a plenary session in which the final priorities were individually scored on a scale of 0-10 by participants. Priorities were ranked and results presented. Nominal group technique was used for generating the final uncertainties, voting and discussions. Thirty-three participants attended the workshop, comprising: 10 specialist nurses, 10 district

  12. Formal Analysis of Graphical Security Models

    DEFF Research Database (Denmark)

    Aslanyan, Zaruhi

    The increasing usage of computer-based systems in almost every aspect of our daily life makes the threat posed by potential attackers more and more dangerous, and a successful attack more and more rewarding. Moreover, the complexity of these systems is also increasing, with physical devices, software components and human actors interacting with each other to form so-called socio-technical systems. The importance of socio-technical systems to modern societies requires verifying their security properties formally, while their inherent complexity makes manual analyses impracticable. Graphical models for security offer an unrivalled opportunity to describe socio-technical systems, for they allow one to represent different aspects like human behaviour, computation and physical phenomena in an abstract yet uniform manner. Moreover, these models can be assigned a formal semantics, thereby allowing

  13. On the Coherence of Probabilistic Relational Formalisms

    Directory of Open Access Journals (Sweden)

    Glauber De Bona

    2018-03-01

    Full Text Available There are several formalisms that enhance Bayesian networks by including relations amongst individuals as modeling primitives. For instance, Probabilistic Relational Models (PRMs) use diagrams and relational databases to represent repetitive Bayesian networks, while Relational Bayesian Networks (RBNs) employ first-order probability formulas with the same purpose. We examine the coherence checking problem for those formalisms; that is, the problem of guaranteeing that any grounding of a well-formed set of sentences produces a valid Bayesian network. This is a novel version of de Finetti’s problem of coherence checking for probabilistic assessments. We show how to reduce the coherence checking problem in relational Bayesian networks to a validity problem in first-order logic augmented with a transitive closure operator and how to combine this logic-based approach with faster, but incomplete algorithms.

  14. $\delta N$ formalism from superpotential and holography

    CERN Document Server

    Garriga, Jaume; Vernizzi, Filippo

    2016-02-16

    We consider the superpotential formalism to describe the evolution of scalar fields during inflation, generalizing it to include the case with non-canonical kinetic terms. We provide a characterization of the attractor behaviour of the background evolution in terms of first and second slow-roll parameters (which need not be small). We find that the superpotential is useful in justifying the separate universe approximation from the gradient expansion, and also in computing the spectra of primordial perturbations around attractor solutions in the $\delta N$ formalism. As an application, we consider a class of models where the background trajectories for the inflaton fields are derived from a product separable superpotential. In the perspective of the holographic inflation scenario, such models are dual to a deformed CFT boundary theory, with $D$ mutually uncorrelated deformation operators. We compute the bulk power spectra of primordial adiabatic and entropy cosmological perturbations, and show that the results...
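
    For orientation, the $\delta N$ identification used here takes the following textbook form for light scalar fields with canonical kinetic terms (a sketch of the standard relation only; the record's contribution is its generalization to non-canonical kinetic terms):

```latex
% Curvature perturbation on uniform-density slices as the perturbed number
% of e-folds N from an initial flat slice (separate-universe picture):
\zeta(t,\mathbf{x}) \simeq \delta N
  = \sum_I N_{,I}\,\delta\phi^I_*
  + \frac{1}{2}\sum_{I,J} N_{,IJ}\,\delta\phi^I_*\,\delta\phi^J_* + \cdots,
\qquad N_{,I} \equiv \frac{\partial N}{\partial \phi^I}.

% Tree-level power spectrum for canonical, light fields at horizon exit:
\mathcal{P}_\zeta = \left(\frac{H_*}{2\pi}\right)^{2} \sum_I N_{,I}^{2}.
```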

  15. Charging transient in polyvinyl formal

    Indian Academy of Sciences (India)

    Unknown

    © Indian Academy of Sciences. Charging transient in polyvinyl formal. P K KHARE*, P L JAIN† and R K PANDEY‡. Department of Postgraduate Studies & Research in Physics & Electronics, Rani Durgavati University, Jabalpur 482 001, India. †Department of Physics, Government PG College, Damoh 470 ...

  16. Formal monkey linguistics : The debate

    NARCIS (Netherlands)

    Schlenker, Philippe; Chemla, Emmanuel; Schel, Anne M.; Fuller, James; Gautier, Jean Pierre; Kuhn, Jeremy; Veselinović, Dunja; Arnold, Kate; Cäsar, Cristiane; Keenan, Sumir; Lemasson, Alban; Ouattara, Karim; Ryder, Robin; Zuberbühler, Klaus

    2016-01-01

    We explain why general techniques from formal linguistics can and should be applied to the analysis of monkey communication - in the areas of syntax and especially semantics. An informed look at our recent proposals shows that such techniques needn't rely excessively on categories of human language:

  17. Rotor and wind turbine formalism

    DEFF Research Database (Denmark)

    Branlard, Emmanuel Simon Pierre

    2017-01-01

    The main conventions used in this book for the study of rotors are introduced in this chapter. The main assumptions and notations are provided. The formalism specific to wind turbines is presented. The forces, moments, velocities and dimensionless coefficients used in the study of rotors...

  18. Formalization in Component Based Development

    DEFF Research Database (Denmark)

    Holmegaard, Jens Peter; Knudsen, John; Makowski, Piotr

    2006-01-01

    may be specified in a formal language convenient for its purpose and, in principle, unrelated to languages for other aspects. Each aspect forms its own semantic domain, although a semantic domain may be parameterized by values derived from other aspects. The proposed conceptual framework is introduced...

  19. Formal tautologies in Czech language

    OpenAIRE

    Bílková, Jana

    2006-01-01

    The main purpose of this work was to describe and classify FT used in the contemporary Czech language and to demonstrate the formal and semantic variety and high functional potential of this specific class of sentences.

  20. Formal systems for persuasion dialogue

    NARCIS (Netherlands)

    Prakken, Henry

    This article reviews formal systems that regulate persuasion dialogues. In such dialogues two or more participants aim to resolve a difference of opinion, each trying to persuade the other participants to adopt their point of view. Systems for persuasion dialogue have found application in various

  1. Automatic Testing with Formal Methods

    NARCIS (Netherlands)

    Tretmans, G.J.; Belinfante, Axel

    1999-01-01

    The use of formal system specifications makes it possible to automate the derivation of test cases from specifications. This allows to automate the whole testing process, not only the test execution part of it. This paper presents the state of the art and future perspectives in testing based on

  2. UML-ising formal techniques

    DEFF Research Database (Denmark)

    Bjørner, Dines; George, Chris W.; Haxthausen, Anne Elisabeth

    2004-01-01

    these different UML views are unified, integrated, correlated or merely co-located is for others to dispute. We also seek to support multiple views, but are also in no doubt that there must be sound, well defined relations between such views. We thus report on ways and means of integrating formal techniques...

  3. Formalizing ICD coding rules using Formal Concept Analysis.

    Science.gov (United States)

    Jiang, Guoqian; Pathak, Jyotishman; Chute, Christopher G

    2009-06-01

    With the 11th revision of the International Classification of Disease (ICD) being officially launched by the World Health Organization (WHO), the significance of a formal representation for ICD coding rules has emerged as a pragmatic concern. To explore the role of Formal Concept Analysis (FCA) in examining ICD10 coding rules and to develop FCA-based auditing approaches for the formalization process. We propose a model for formalizing ICD coding rules underlying the ICD Index using FCA. The coding rules are generated from FCA models and represented in the Semantic Web Rule Language (SWRL). Two auditing approaches were developed focusing upon non-disjoint nodes and anonymous nodes manifest in the FCA model. The candidate domains (i.e. any three-character code with its sub-codes) of all 22 chapters of the ICD10 2006 version were analyzed using the two auditing approaches. Case studies and a preliminary evaluation were performed for validation. A total of 2044 formal contexts from the candidate domains of 22 ICD chapters were generated and audited. We identified 692 ICD codes having non-disjoint nodes in all chapters; chapters 19 and 21 contained the highest proportion of candidate domains with non-disjoint nodes (61.9% and 45.6%). We also identified 6996 anonymous nodes from 1382 candidate domains. Chapters 7, 11, 13, and 17 have the highest proportion of candidate domains having anonymous nodes (97.5%, 95.4%, 93.6% and 93.0%) while chapters 15 and 17 have the highest proportion of anonymous nodes among all chapters (45.5% and 44.0%). Case studies and a limited evaluation demonstrate that non-disjoint nodes and anonymous nodes arising from FCA are effective mechanisms for auditing ICD10. FCA-based models demonstrate a practical solution for formalizing ICD coding rules. FCA techniques could not only audit ICD domain knowledge completeness for a specific domain, but also provide a high level auditing profile for all ICD chapters.
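
    The core computation of FCA is compact enough to sketch. The snippet below enumerates all formal concepts of a toy object-attribute context by brute force; the ICD-like codes and attributes are invented for illustration and are not taken from the study's data:

```python
# Minimal sketch of Formal Concept Analysis (FCA): a formal concept is a
# pair (A, B) of an object set A and attribute set B with A' = B and
# B' = A, where ' is the derivation operator. Toy context only.
from itertools import combinations

objects = {"J45.0", "J45.1", "J45.9"}            # hypothetical ICD-like codes
attributes = {"asthma", "allergic", "unspecified"}
incidence = {                                     # which code has which property
    "J45.0": {"asthma", "allergic"},
    "J45.1": {"asthma"},
    "J45.9": {"asthma", "unspecified"},
}

def common_attributes(objs):
    """A': attributes shared by every object in objs."""
    return set.intersection(*(incidence[o] for o in objs)) if objs else set(attributes)

def common_objects(attrs):
    """B': objects possessing every attribute in attrs."""
    return {o for o in objects if attrs <= incidence[o]}

concepts = []
for r in range(len(objects) + 1):
    for objs in map(set, combinations(sorted(objects), r)):
        attrs = common_attributes(objs)
        if common_objects(attrs) == objs:         # extent is closed: a concept
            concepts.append((objs, attrs))

for extent, intent in concepts:
    print(sorted(extent), "<->", sorted(intent))
```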

  4. Uncovering the triple Pomeron vertex from Wilson line formalism

    International Nuclear Information System (INIS)

    Chirilli, G. A.; Szymanowski, L.; Wallon, S.

    2011-01-01

    We compute the triple Pomeron vertex from the Wilson line formalism, including both planar and nonplanar contributions, and get perfect agreement with the result obtained in the Extended Generalized Logarithmic Approximation based on Reggeon calculus.

  5. Combined Uncertainty and A-Posteriori Error Bound Estimates for General CFD Calculations: Theory and Software Implementation

    Science.gov (United States)

    Barth, Timothy J.

    2014-01-01

    This workshop presentation discusses the design and implementation of numerical methods for the quantification of statistical uncertainty, including a-posteriori error bounds, for output quantities computed using CFD methods. Hydrodynamic realizations often contain numerical error arising from finite-dimensional approximation (e.g. numerical methods using grids, basis functions, particles) and statistical uncertainty arising from incomplete information and/or statistical characterization of model parameters and random fields. The first task at hand is to derive formal error bounds for statistics given realizations containing finite-dimensional numerical error [1]. The error in computed output statistics contains contributions from both realization error and the error resulting from the calculation of statistics integrals using a numerical method. A second task is to devise computable a-posteriori error bounds by numerically approximating all terms arising in the error bound estimates. For the same reason that CFD calculations including error bounds but omitting uncertainty modeling are only of limited value, CFD calculations including uncertainty modeling but omitting error bounds are only of limited value. To gain maximum value from CFD calculations, a general software package for uncertainty quantification with quantified error bounds has been developed at NASA. The package provides implementations for a suite of numerical methods used in uncertainty quantification: dense tensorization basis methods [3] and a subscale recovery variant [1] for non-smooth data; sparse tensorization methods [2] utilizing node-nested hierarchies; and sampling methods [4] for high-dimensional random variable spaces.
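
    Schematically, the splitting of the error in a computed statistic described above can be written as a triangle-inequality bound (the notation is ours, for illustration):

```latex
% q: output functional, u: exact realization, u_h: discrete realization,
% E: exact statistics operator (e.g. expectation), E_N: its numerical
% approximation (quadrature or sampling). The total error splits into a
% realization-error term and a statistics-integration term:
\bigl| \mathbb{E}[q(u)] - \mathbb{E}_N[q(u_h)] \bigr|
  \;\le\;
  \underbrace{\bigl| \mathbb{E}[q(u)] - \mathbb{E}[q(u_h)] \bigr|}_{\text{numerical error in realizations}}
  \;+\;
  \underbrace{\bigl| \mathbb{E}[q(u_h)] - \mathbb{E}_N[q(u_h)] \bigr|}_{\text{error in computing the statistic}} .
```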

  6. Uncertainty in measurements by counting

    Science.gov (United States)

    Bich, Walter; Pennecchi, Francesca

    2012-02-01

    Counting is at the base of many high-level measurements, such as, for example, frequency measurements. In some instances the measurand itself is a number of events, such as spontaneous decays in activity measurements, or objects, such as colonies of bacteria in microbiology. Countings also play a fundamental role in everyday life. In any case, a counting is a measurement. A measurement result, according to its present definition, as given in the 'International Vocabulary of Metrology—Basic and general concepts and associated terms (VIM)', must include a specification concerning the estimated uncertainty. As concerns measurements by counting, this specification is not easy to encompass in the well-known framework of the 'Guide to the Expression of Uncertainty in Measurement', known as GUM, in which there is no guidance on the topic. Furthermore, the issue of uncertainty in countings has received little or no attention in the literature, so that it is commonly accepted that this category of measurements constitutes an exception in which the concept of uncertainty is not applicable, or, alternatively, that results of measurements by counting have essentially no uncertainty. In this paper we propose a general model for measurements by counting which allows an uncertainty evaluation compliant with the general framework of the GUM.
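
    A minimal sketch of the most common baseline model for counting, assuming independent events occurring at a constant rate (the paper proposes a more general model, so this covers only the simplest case):

```python
# Under a Poisson model for independent events at a constant rate, the
# observed count n estimates the mean, and since a Poisson variance
# equals its mean, the standard uncertainty of n is sqrt(n).
import math

def counting_uncertainty(n: int) -> tuple[float, float]:
    """Return (standard uncertainty, relative uncertainty) for a count n."""
    if n <= 0:
        raise ValueError("need at least one counted event")
    u = math.sqrt(n)          # Poisson: variance = mean, estimated by n
    return u, u / n

for n in (10, 100, 10_000):
    u, rel = counting_uncertainty(n)
    print(f"n={n:6d}  u={u:8.1f}  relative={rel:.3%}")
```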

  7. Uncertainties in land use data

    Directory of Open Access Journals (Sweden)

    G. Castilla

    2007-11-01

    Full Text Available This paper deals with the description and assessment of uncertainties in land use data derived from Remote Sensing observations, in the context of hydrological studies. Land use is a categorical regionalised variable reporting the main socio-economic role each location has, where the role is inferred from the pattern of occupation of land. The properties of this pattern that are relevant to hydrological processes have to be known with some accuracy in order to obtain reliable results; hence, uncertainty in land use data may lead to uncertainty in model predictions. There are two main uncertainties surrounding land use data, positional and categorical. The first one is briefly addressed and the second one is explored in more depth, including the factors that influence it. We (1) argue that the conventional method used to assess categorical uncertainty, the confusion matrix, is insufficient to propagate uncertainty through distributed hydrologic models; (2) report some alternative methods to tackle this and other insufficiencies; (3) stress the role of metadata as a more reliable means to assess the degree of distrust with which these data should be used; and (4) suggest some practical recommendations.
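
    For concreteness, the conventional confusion-matrix assessment that the authors argue is insufficient for propagating uncertainty looks like this; the class counts are invented for illustration:

```python
# Standard accuracy summaries of a land-use classification from a
# confusion matrix (rows = mapped class, columns = reference class).
# The numbers are illustrative only.
import numpy as np

cm = np.array([[50,  3,  2],
               [ 5, 40,  5],
               [ 2,  4, 39]], dtype=float)

n = cm.sum()
overall = np.trace(cm) / n                        # overall accuracy
users = np.diag(cm) / cm.sum(axis=1)              # user's accuracy per class
producers = np.diag(cm) / cm.sum(axis=0)          # producer's accuracy per class
p_e = (cm.sum(axis=1) * cm.sum(axis=0)).sum() / n**2
kappa = (overall - p_e) / (1 - p_e)               # Cohen's kappa

print(f"overall={overall:.3f}  kappa={kappa:.3f}")
print("user's    :", np.round(users, 3))
print("producer's:", np.round(producers, 3))
```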

  8. Propagation of dynamic measurement uncertainty

    Science.gov (United States)

    Hessling, J. P.

    2011-10-01

    The time-dependent measurement uncertainty has been evaluated in a number of recent publications, starting from a known uncertain dynamic model. This could be defined as the 'downward' propagation of uncertainty from the model to the targeted measurement. The propagation of uncertainty 'upward' from the calibration experiment to a dynamic model traditionally belongs to system identification. The use of different representations (time, frequency, etc) is ubiquitous in dynamic measurement analyses. An expression of uncertainty in dynamic measurements is formulated for the first time in this paper independent of representation, joining upward as well as downward propagation. For applications in metrology, the high quality of the characterization may be prohibitive for any reasonably large and robust model to pass the whiteness test. This test is therefore relaxed by not directly requiring small systematic model errors in comparison to the randomness of the characterization. Instead, the systematic error of the dynamic model is propagated to the uncertainty of the measurand, analogously but differently to how stochastic contributions are propagated. The pass criterion of the model is thereby transferred from the identification to acceptance of the total accumulated uncertainty of the measurand. This increases the relevance of the test of the model as it relates to its final use rather than the quality of the calibration. The propagation of uncertainty hence includes the propagation of systematic model errors. For illustration, the 'upward' propagation of uncertainty is applied to determine if an appliance box is damaged in an earthquake experiment. In this case, relaxation of the whiteness test was required to reach a conclusive result.

  9. Propagation of dynamic measurement uncertainty

    International Nuclear Information System (INIS)

    Hessling, J P

    2011-01-01

    The time-dependent measurement uncertainty has been evaluated in a number of recent publications, starting from a known uncertain dynamic model. This could be defined as the 'downward' propagation of uncertainty from the model to the targeted measurement. The propagation of uncertainty 'upward' from the calibration experiment to a dynamic model traditionally belongs to system identification. The use of different representations (time, frequency, etc) is ubiquitous in dynamic measurement analyses. An expression of uncertainty in dynamic measurements is formulated for the first time in this paper independent of representation, joining upward as well as downward propagation. For applications in metrology, the high quality of the characterization may be prohibitive for any reasonably large and robust model to pass the whiteness test. This test is therefore relaxed by not directly requiring small systematic model errors in comparison to the randomness of the characterization. Instead, the systematic error of the dynamic model is propagated to the uncertainty of the measurand, analogously but differently to how stochastic contributions are propagated. The pass criterion of the model is thereby transferred from the identification to acceptance of the total accumulated uncertainty of the measurand. This increases the relevance of the test of the model as it relates to its final use rather than the quality of the calibration. The propagation of uncertainty hence includes the propagation of systematic model errors. For illustration, the 'upward' propagation of uncertainty is applied to determine if an appliance box is damaged in an earthquake experiment. In this case, relaxation of the whiteness test was required to reach a conclusive result

  10. Robustness to strategic uncertainty

    NARCIS (Netherlands)

    Andersson, O.; Argenton, C.; Weibull, J.W.

    We introduce a criterion for robustness to strategic uncertainty in games with continuum strategy sets. We model a player's uncertainty about another player's strategy as an atomless probability distribution over that player's strategy set. We call a strategy profile robust to strategic uncertainty

  11. Fission Spectrum Related Uncertainties

    Energy Technology Data Exchange (ETDEWEB)

    G. Aliberti; I. Kodeli; G. Palmiotti; M. Salvatores

    2007-10-01

    The paper presents a preliminary uncertainty analysis related to potential uncertainties in the fission spectrum data. Consistent results are shown for a reference fast reactor design configuration and for experimental thermal configurations. However, the results obtained indicate the need for further analysis, in particular in terms of fission spectrum uncertainty data assessment.

  12. [How to write an article: formal aspects].

    Science.gov (United States)

    Corral de la Calle, M A; Encinas de la Iglesia, J

    2013-06-01

    Scientific research and the publication of the results of the studies go hand in hand. Exquisite research methods can only be adequately reflected in formal publication with the optimum structure. To ensure the success of this process, it is necessary to follow orderly steps, including selecting the journal in which to publish and following the instructions to authors strictly as well as the guidelines elaborated by diverse societies of editors and other institutions. It is also necessary to structure the contents of the article in a logical and attractive way and to use an accurate, clear, and concise style of language. Although not all the authors are directly involved in the actual writing, elaborating a scientific article is a collective undertaking that does not finish until the article is published. This article provides practical advice about formal and not-so-formal details to take into account when writing a scientific article as well as references that will help readers find more information in greater detail. Copyright © 2012 SERAM. Published by Elsevier Espana. All rights reserved.

  13. Addressing uncertainties in the ERICA Integrated Approach

    International Nuclear Information System (INIS)

    Oughton, D.H.; Agueero, A.; Avila, R.; Brown, J.E.; Copplestone, D.; Gilek, M.

    2008-01-01

    Like any complex environmental problem, ecological risk assessment of the impacts of ionising radiation is confounded by uncertainty. At all stages, from problem formulation through to risk characterisation, the assessment is dependent on models, scenarios, assumptions and extrapolations. These include technical uncertainties related to the data used, conceptual uncertainties associated with models and scenarios, as well as social uncertainties such as economic impacts, the interpretation of legislation, and the acceptability of the assessment results to stakeholders. The ERICA Integrated Approach has been developed to allow an assessment of the risks of ionising radiation, and includes a number of methods that are intended to make the uncertainties and assumptions inherent in the assessment more transparent to users and stakeholders. Throughout its development, ERICA has recommended that assessors deal openly with the deeper dimensions of uncertainty and acknowledge that uncertainty is intrinsic to complex systems. Since the tool is based on a tiered approach, the approaches to dealing with uncertainty vary between the tiers, ranging from a simple, but highly conservative screening to a full probabilistic risk assessment including sensitivity analysis. This paper gives an overview of types of uncertainty that are manifest in ecological risk assessment and the ERICA Integrated Approach to dealing with some of these uncertainties.

  14. Polynomials formalism of quantum numbers

    International Nuclear Information System (INIS)

    Kazakov, K.V.

    2005-01-01

    Theoretical aspects of the recently suggested perturbation formalism based on the method of quantum number polynomials are considered in the context of the general anharmonicity problem. Using a diatomic molecule by way of example, it is demonstrated how the theory can be extrapolated to the case of vibrational-rotational interactions. As a result, an exact expression for the first coefficient of the Herman-Wallis factor is derived. In addition, the basic notions of the formalism are phenomenologically generalized and expanded to the problem of spin interaction. The concept of magneto-optical anharmonicity is introduced. As a consequence, an exact analogy is drawn with the well-known electro-optical theory of molecules, and a nonlinear dependence of the magnetic dipole moment of the system on the spin and wave variables is established.

  15. Contextual approach to quantum formalism

    CERN Document Server

    Khrennikov, Andrei

    2009-01-01

    The aim of this book is to show that the probabilistic formalisms of classical statistical mechanics and quantum mechanics can be unified on the basis of a general contextual probabilistic model. By taking into account the dependence of (classical) probabilities on contexts (i.e. complexes of physical conditions), one can reproduce all distinct features of quantum probabilities such as the interference of probabilities and the violation of Bell’s inequality. Moreover, by starting with a formula for the interference of probabilities (which generalizes the well known classical formula of total probability), one can construct the representation of contextual probabilities by complex probability amplitudes or, in the abstract formalism, by normalized vectors of the complex Hilbert space or its hyperbolic generalization. Thus the Hilbert space representation of probabilities can be naturally derived from classical probabilistic assumptions. An important chapter of the book critically reviews known no-go theorems...
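
    The generalization of the formula of total probability mentioned above takes, for a dichotomous context variable, the following form (reproduced as a sketch; the hyperbolic case replaces the cosine by a hyperbolic cosine):

```latex
% Classical formula of total probability plus a context-dependent
% interference term; theta encodes the incompatibility of contexts.
P(b) = P(a_1)\,P(b \mid a_1) + P(a_2)\,P(b \mid a_2)
     + 2\cos\theta\,\sqrt{P(a_1)\,P(b \mid a_1)\,P(a_2)\,P(b \mid a_2)} .
% |cos(theta)| <= 1 is the trigonometric case, representable by complex
% Hilbert-space amplitudes; theta = pi/2 recovers the classical formula.
```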

  16. Musical Formalism and Political Performances

    Directory of Open Access Journals (Sweden)

    Jonathan A. Neufeld

    2009-01-01

    Full Text Available Musical formalism, which strictly limits the type of thing any description of the music can tell us, is ill-equipped to account for contemporary performance practice. If performative interpretations are in a position to tell us something about musical works—that is if performance is a kind of description, as Peter Kivy argues—then we have to loosen the restrictions on notions of musical relevance to make sense of performance. I argue that musical formalism, which strictly limits the type of thing any description of the music can tell us, is inconsistent with Kivy's quite compelling account of performance. This shows the difficulty that actual performances pose to overly rigid conceptions of music. Daniel Barenboim's unannounced performance of Wagner in Israel in 2001 shows that the problem of the boundaries of musical relevance is no mere philosophical puzzle. It is a pressing problem in the musical public sphere.

  17. Stroh formalism and Rayleigh waves

    CERN Document Server

    Tanuma, Kazumi

    2008-01-01

    Introduces a powerful and elegant mathematical method for the analysis of anisotropic elasticity equations. The reader can grasp the essentials as quickly as possible. Can be used as a textbook, which compactly presents the introduction and applications of the Stroh formalism. Appeals to people not only in mathematics but also in mechanics and engineering science. Prerequisites are only basic linear algebra, calculus and fundamentals of differential equations.

  18. Formal Women-only Networks

    DEFF Research Database (Denmark)

    Villesèche, Florence; Josserand, Emmanuel

    2017-01-01

    /organisations and the wider social group of women in business. Research limitations/implications: The authors focus on the distinction between external and internal formal women-only networks while also acknowledging the broader diversity that can characterise such networks. Their review provides the reader with an insight...... member level, the authors suggest that such networks can be of value for organisations and the wider social group of women in management and leadership positions....

  19. Review of the helicity formalism

    International Nuclear Information System (INIS)

    Barreiro, F.; Cerrada, M.; Fernandez, E.

    1972-01-01

    Our purpose in these notes has been to present a brief and general review of the helicity formalism. We begin by discussing Lorentz invariance, spin and helicity ideas, in section 1. In section 2 we deal with the construction of relativistic states and scattering amplitudes in the helicity basis and we study their transformation properties under discrete symmetries. Finally we present some more sophisticated topics, like kinematical singularities of helicity amplitudes, kinematical constraints and crossing relations, in sections 3, 4 and 5, respectively. (Author) 8 refs

  20. Variational formalism for spin particles

    International Nuclear Information System (INIS)

    Horvathy, P.

    1977-11-01

    The geometrical formulation of Hamilton's principle presented in a previous paper has been related to the usual one in terms of Lagrangian functions. The exact conditions for their equivalence are obtained and a method is given for the construction of a Lagrangian function. The formalism is extended to spin particles and a local Lagrangian is constructed in this case, too. However, this function cannot be extended to a global one. (D.P.)

  1. Ashtekar formalism with real variables

    International Nuclear Information System (INIS)

    Kalau, W.; Nationaal Inst. voor Kernfysica en Hoge-Energiefysica

    1990-12-01

    A new approach to canonical gravity is presented which is based on the Ashtekar formalism. But, in contrast to Ashtekar's variables, this formulation does not need complex quantities nor does it lead to second class constraints. This is achieved using SO(3,1) as a gauge group instead of complexified SO(3). Because of the larger group additional first class constraints are needed which turn out to be cubic and quartic in the momenta. (author). 13 refs

  2. Impact of model defect and experimental uncertainties on evaluated output

    International Nuclear Information System (INIS)

    Neudecker, D.; Capote, R.; Leeb, H.

    2013-01-01

    One of the current major problems in nuclear data evaluation is the unreasonably small evaluated uncertainties often obtained. These small uncertainties are partly attributed to missing correlations of experimental uncertainties as well as to deficiencies of the model employed for the prior information. In this article, both uncertainty sources are included in an evaluation of 55 Mn cross-sections for incident neutrons. Their impact on the evaluated output is studied using a prior obtained by the Full Bayesian Evaluation Technique and a prior obtained by the nuclear model program EMPIRE. It is shown analytically and by means of an evaluation that unreasonably small evaluated uncertainties can be obtained not only if correlated systematic uncertainties of the experiment are neglected but also if prior uncertainties are smaller or about the same magnitude as the experimental ones. Furthermore, it is shown that including model defect uncertainties in the evaluation of 55 Mn leads to larger evaluated uncertainties for channels where the model is deficient. It is concluded that including correlated experimental uncertainties is as important as including model defect uncertainties, if the model calculations deviate significantly from the measurements. -- Highlights: • We study possible causes of unreasonably small evaluated nuclear data uncertainties. • Two different formulations of model defect uncertainties are presented and compared. • Smaller prior than experimental uncertainties cause too small evaluated ones. • Neglected correlations of experimental uncertainties cause too small evaluated ones. • Including model defect uncertainties in the prior improves the evaluated output
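
    The mechanism described analytically in the abstract is already visible in the scalar Bayesian update, sketched here with simplified notation:

```latex
% Combining a prior with variance sigma_prior^2 and a measurement with
% variance sigma_exp^2 gives the evaluated variance
\frac{1}{\sigma_{\mathrm{eval}}^2}
  = \frac{1}{\sigma_{\mathrm{prior}}^2} + \frac{1}{\sigma_{\mathrm{exp}}^2}
\quad\Longrightarrow\quad
\sigma_{\mathrm{eval}}^2
  = \frac{\sigma_{\mathrm{prior}}^2\,\sigma_{\mathrm{exp}}^2}
         {\sigma_{\mathrm{prior}}^2 + \sigma_{\mathrm{exp}}^2} .
% If sigma_prior <= sigma_exp, then sigma_eval^2 <= sigma_exp^2 / 2, and
% every measurement treated (wrongly) as independent shrinks the result
% further, which is how overconfident priors and neglected experimental
% correlations produce unreasonably small evaluated uncertainties.
```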

  3. Uncertainty and Cognitive Control

    Directory of Open Access Journals (Sweden)

    Faisal eMushtaq

    2011-10-01

    Full Text Available A growing trend of neuroimaging, behavioural and computational research has investigated the topic of outcome uncertainty in decision-making. Although evidence to date indicates that humans are very effective in learning to adapt to uncertain situations, the nature of the specific cognitive processes involved in the adaptation to uncertainty is still a matter of debate. In this article, we review evidence suggesting that cognitive control processes are at the heart of uncertainty in decision-making contexts. Available evidence suggests that: (1) there is a strong conceptual overlap between the constructs of uncertainty and cognitive control; (2) there is a remarkable overlap between the neural networks associated with uncertainty and the brain networks subserving cognitive control; (3) the perception and estimation of uncertainty might play a key role in monitoring processes and the evaluation of the need for control; (4) potential interactions between uncertainty and cognitive control might play a significant role in several affective disorders.

  4. Improving Robustness of Network Fault Diagnosis to Uncertainty in Observations

    DEFF Research Database (Denmark)

    Grønbæk, Lars Jesper; Schwefel, Hans-Peter; Ceccarelli, Andrea

    2010-01-01

    Performing decentralized network fault diagnosis based on network traffic is challenging. Besides inherent stochastic behaviour of observations, measurements may be subject to errors degrading diagnosis timeliness and accuracy. In this paper we present a novel approach in which we aim to mitigate...... issues of measurement errors by quantifying uncertainty. The uncertainty information is applied in the diagnostic component to improve its robustness. Three diagnosis components have been proposed based on the Hidden Markov Model formalism: (H0) representing a classical approach, (H1) a static...... compensation of (H0) to uncertainties and (H2) dynamically adapting diagnosis to uncertainty information. From uncertainty injection scenarios of added measurement noise we demonstrate how using uncertainty information can provide a structured approach to improving diagnosis....

  5. Uncertainty Quantification in Numerical Aerodynamics

    KAUST Repository

    Litvinenko, Alexander

    2017-05-16

    We consider the uncertainty quantification problem in aerodynamic simulations. We identify input uncertainties, classify them, suggest an appropriate statistical model and, finally, estimate propagation of these uncertainties into the solution (pressure, velocity and density fields as well as the lift and drag coefficients). The deterministic problem under consideration is a compressible transonic Reynolds-averaged Navier-Stokes flow around an airfoil with random/uncertain data. Input uncertainties include: uncertain angle of attack, the Mach number, random perturbations in the airfoil geometry, mesh, shock location, turbulence model and parameters of this turbulence model. This problem requires efficient numerical/statistical methods since it is computationally expensive, especially for the uncertainties caused by random geometry variations which involve a large number of variables. In the numerical section we compare five methods (quasi-Monte Carlo quadrature; polynomial chaos with coefficients determined by sparse quadrature; and gradient-enhanced versions of Kriging, radial basis functions and point collocation polynomial chaos) in their efficiency in estimating statistics of aerodynamic performance upon random perturbation to the airfoil geometry [D. Liu et al. '17]. For modeling we used the TAU code, developed at DLR, Germany.
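
    As a stripped-down illustration of one of the five methods compared, the sketch below estimates a mean output by plain Monte Carlo and by quasi-Monte Carlo with a scrambled Sobol sequence; the lift surrogate and input ranges are invented stand-ins, not the TAU code or the study's data:

```python
# Quasi-Monte Carlo vs plain Monte Carlo for propagating input
# uncertainty through a toy surrogate for the lift coefficient.
import numpy as np
from scipy.stats import qmc

def lift_surrogate(alpha_deg, mach):
    """Stand-in for an expensive CFD evaluation (made up for illustration)."""
    return 0.11 * alpha_deg * (1.0 + 0.5 * mach**2)

n = 2**10
lo, hi = [1.0, 0.70], [3.0, 0.74]     # alpha ~ U(1,3) deg, Mach ~ U(0.70,0.74)

# Plain Monte Carlo estimate of the mean lift.
rng = np.random.default_rng(0)
u_mc = qmc.scale(rng.random((n, 2)), lo, hi)
mean_mc = lift_surrogate(u_mc[:, 0], u_mc[:, 1]).mean()

# Quasi-Monte Carlo with a scrambled Sobol sequence (same budget).
sobol = qmc.Sobol(d=2, scramble=True, seed=0)
u_qmc = qmc.scale(sobol.random_base2(m=10), lo, hi)
mean_qmc = lift_surrogate(u_qmc[:, 0], u_qmc[:, 1]).mean()

print(f"MC mean CL  : {mean_mc:.5f}")
print(f"QMC mean CL : {mean_qmc:.5f}")
```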

  6. A first formal link between the Price equation and an optimization program.

    Science.gov (United States)

    Grafen, Alan

    2002-07-07

    The Darwin unification project is pursued. A meta-model encompassing an important class of population genetic models is formed by adding an abstract model of the number of successful gametes to the Price equation under uncertainty. A class of optimization programs are defined to represent the "individual-as-maximizing-agent analogy" in a general way. It is then shown that for each population genetic model there is a corresponding optimization program with which formal links can be established. These links provide a secure logical foundation for the commonplace biological principle that natural selection leads organisms to act as if maximizing their "fitness", provides a definition of "fitness", and clarifies the limitations of that principle. The situations covered do not include frequency dependence or social behaviour, but the approach is capable of extension.
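
    For reference, the Price equation at the core of the meta-model reads, in its standard form (the record's version additionally takes expectations over states of nature, omitted here):

```latex
% z_i: trait value of individual i; w_i: its fitness (here, the number
% of successful gametes); bars denote population averages.
\Delta\bar{z}
  = \frac{\operatorname{Cov}(w_i, z_i)}{\bar{w}}
  + \frac{\operatorname{E}\!\left[w_i \,\Delta z_i\right]}{\bar{w}} .
% The covariance term captures selection; the expectation term captures
% transmission bias.
```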

  7. Formalized Search Strategies for Human Risk Contributions

    DEFF Research Database (Denmark)

    Rasmussen, Jens; Pedersen, O. M.

    and documentation of the PRA coverage, including the search strategies applied, is discussed and aids are proposed such as plant description in terms of a formal abstraction hierarchy and use of cause-consequence-charts for the documentation of not only the results of PRA but also of its coverage. Typical human...... risk contributions are described on the basis of general plant design features relevant for risk and accident analysis. With this background, search strategies for human risk contributions are treated: Under the designation "work analysis", procedures for the analysis of familiar, well trained, planned...

  8. Great Computational Intelligence in the Formal Sciences via Analogical Reasoning

    Science.gov (United States)

    2017-05-08

AFRL-AFOSR-VA-TR-2017-0099. Great Computational Intelligence in the Formal Sciences via Analogical Reasoning. Selmer Bringsjord, Rensselaer Polytechnic Institute. Grant number FA9550-12-1-0003; program element 61102F.

  9. Improved pion pion scattering amplitude from dispersion relation formalism

    International Nuclear Information System (INIS)

    Cavalcante, I.P.; Coutinho, Y.A.; Borges, J. Sa

    2005-01-01

    The pion-pion scattering amplitude is obtained from Chiral Perturbation Theory at one- and two-loop approximations. The dispersion relation formalism provides a more economical method, which was proved to reproduce the analytical structure of that amplitude at both approximation levels. This work extends the use of the formalism in order to compute further unitarity corrections to partial waves, including the D-wave amplitude. (author)

  10. Advancing Uncertainty: Untangling and Discerning Related Concepts

    Directory of Open Access Journals (Sweden)

    Janice Penrod

    2002-12-01

    Full Text Available Methods of advancing concepts within the qualitative paradigm have been developed and articulated. In this section, I describe methodological perspectives of a project designed to advance the concept of uncertainty using multiple qualitative methods. Through a series of earlier studies, the concept of uncertainty arose repeatedly in varied contexts, working its way into prominence, and warranting further investigation. Processes of advanced concept analysis were used to initiate the formal investigation into the meaning of the concept. Through concept analysis, the concept was deconstructed to identify conceptual components and gaps in understanding. Using this skeletal framework of the concept identified through concept analysis, subsequent studies were carried out to add ‘flesh’ to the concept. First, a concept refinement using the literature as data was completed. Findings revealed that the current state of the concept of uncertainty failed to incorporate what was known of the lived experience. Therefore, using interview techniques as the primary data source, a phenomenological study of uncertainty among caregivers was conducted. Incorporating the findings of the phenomenology, the skeletal framework of the concept was further fleshed out using techniques of concept correction to produce a more mature conceptualization of uncertainty. In this section, I describe the flow of this qualitative project investigating the concept of uncertainty, with special emphasis on a particular threat to validity (called conceptual tunnel vision) that was identified and addressed during the phases of concept correction. Though in this article I employ a study of uncertainty for illustration, limited substantive findings regarding uncertainty are presented to retain a clear focus on the methodological issues.

  11. Uncertainty quantification and error analysis

    Energy Technology Data Exchange (ETDEWEB)

    Higdon, Dave M [Los Alamos National Laboratory; Anderson, Mark C [Los Alamos National Laboratory; Habib, Salman [Los Alamos National Laboratory; Klein, Richard [Los Alamos National Laboratory; Berliner, Mark [OHIO STATE UNIV.; Covey, Curt [LLNL; Ghattas, Omar [UNIV OF TEXAS; Graziani, Carlo [UNIV OF CHICAGO; Seager, Mark [LLNL; Sefcik, Joseph [LLNL; Stark, Philip [UC/BERKELEY; Stewart, James [SNL

    2010-01-01

    UQ studies all sources of error and uncertainty, including: systematic and stochastic measurement error; ignorance; limitations of theoretical models; limitations of numerical representations of those models; limitations on the accuracy and reliability of computations, approximations, and algorithms; and human error. A more precise definition for UQ is suggested below.

  12. Uncertainty covariances in robotics applications

    International Nuclear Information System (INIS)

    Smith, D.L.

    1984-01-01

    The application of uncertainty covariance matrices in the analysis of robot trajectory errors is explored. First, relevant statistical concepts are reviewed briefly. Then, a simple, hypothetical robot model is considered to illustrate methods for error propagation and performance test data evaluation. The importance of including error correlations is emphasized

  13. Uncertainty and Sensitivity Analyses Plan

    International Nuclear Information System (INIS)

    Simpson, J.C.; Ramsdell, J.V. Jr.

    1993-04-01

    Hanford Environmental Dose Reconstruction (HEDR) Project staff are developing mathematical models to be used to estimate the radiation dose that individuals may have received as a result of emissions since 1944 from the US Department of Energy's (DOE) Hanford Site near Richland, Washington. An uncertainty and sensitivity analyses plan is essential to understand and interpret the predictions from these mathematical models. This is especially true in the case of the HEDR models where the values of many parameters are unknown. This plan gives a thorough documentation of the uncertainty and hierarchical sensitivity analysis methods recommended for use on all HEDR mathematical models. The documentation includes both technical definitions and examples. In addition, an extensive demonstration of the uncertainty and sensitivity analysis process is provided using actual results from the Hanford Environmental Dose Reconstruction Integrated Codes (HEDRIC). This demonstration shows how the approaches used in the recommended plan can be adapted for all dose predictions in the HEDR Project

  14. Uncertainty in hydrological change modelling

    DEFF Research Database (Denmark)

    Seaby, Lauren Paige

    Hydrological change modelling methodologies generally use climate model outputs to force hydrological simulations under changed conditions. There are nested sources of uncertainty throughout this methodology, including choice of climate model and subsequent bias correction methods. This Ph.D. study evaluates the uncertainty of the impact of climate change in hydrological simulations given multiple climate models and bias correction methods of varying complexity. Three distribution based scaling methods (DBS) were developed and benchmarked against a more simplistic and commonly used delta change (DC) approach. These climate model projections were then used to force hydrological simulations under climate change for the island Sjælland in Denmark to analyse the contribution of different climate models and bias correction methods to overall uncertainty in the hydrological change modelling...

  15. Uncertainty in hydrological change modelling

    DEFF Research Database (Denmark)

    Seaby, Lauren Paige

    Hydrological change modelling methodologies generally use climate model outputs to force hydrological simulations under changed conditions. There are nested sources of uncertainty throughout this methodology, including choice of climate model and subsequent bias correction methods. This Ph.D. study evaluates the uncertainty of the impact of climate change in hydrological simulations given multiple climate models and bias correction methods of varying complexity. Three distribution based scaling methods (DBS) were developed and benchmarked against a more simplistic and commonly used delta change (DC) approach. These climate model projections were then used to force hydrological simulations under climate change for the island Sjælland in Denmark to analyse the contribution of different climate models and bias correction methods to overall uncertainty in the hydrological change modelling...

  16. Verification of uncertainty budgets

    DEFF Research Database (Denmark)

    Heydorn, Kaj; Madsen, B.S.

    2005-01-01

    The quality of analytical results is expressed by their uncertainty, as it is estimated on the basis of an uncertainty budget; little effort is, however, often spent on ascertaining the quality of the uncertainty budget. The uncertainty budget is based on circumstantial or historical data...... observed and expected variability is tested by means of the T-test, which follows a chi-square distribution with a number of degrees of freedom determined by the number of replicates. Significant deviations between predicted and observed variability may be caused by a variety of effects, and examples...... will be presented; both underestimation and overestimation may occur, each leading to correcting the influence of uncertainty components according to their influence on the variability of experimental results. Some uncertainty components can be verified only with a very small number of degrees of freedom, because...
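
    A minimal sketch of the kind of verification described, assuming the T statistic is the sum of squared deviations from a reference value weighted by the variance predicted by the budget (the authors' statistic may differ in detail, and the numbers are invented):

```python
# Verifying an uncertainty budget: compare observed deviations of
# replicate results from a reference value with the variability the
# budget predicts, referring the statistic to a chi-square distribution.
import numpy as np
from scipy.stats import chi2

reference = 10.00            # assumed known reference value
u_budget = 0.05              # combined standard uncertainty from the budget
replicates = np.array([10.03, 9.98, 10.07, 9.95, 10.02])

t_stat = np.sum(((replicates - reference) / u_budget) ** 2)
dof = len(replicates)
p_value = chi2.sf(t_stat, dof)

print(f"T = {t_stat:.2f} on {dof} dof, p = {p_value:.3f}")
if p_value < 0.05:
    print("observed variability exceeds the budget: revisit its components")
else:
    print("budget is consistent with the observed variability")
```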

  17. The smooth entropy formalism for von Neumann algebras

    International Nuclear Information System (INIS)

    Berta, Mario; Furrer, Fabian; Scholz, Volkher B.

    2016-01-01

    We discuss information-theoretic concepts on infinite-dimensional quantum systems. In particular, we lift the smooth entropy formalism as introduced by Renner and collaborators for finite-dimensional systems to von Neumann algebras. For the smooth conditional min- and max-entropy, we recover similar characterizing properties and information-theoretic operational interpretations as in the finite-dimensional case. We generalize the entropic uncertainty relation with quantum side information of Tomamichel and Renner and discuss applications to quantum cryptography. In particular, we prove the possibility to perform privacy amplification and classical data compression with quantum side information modeled by a von Neumann algebra

  18. The smooth entropy formalism for von Neumann algebras

    Energy Technology Data Exchange (ETDEWEB)

    Berta, Mario, E-mail: berta@caltech.edu [Institute for Quantum Information and Matter, California Institute of Technology, Pasadena, California 91125 (United States); Furrer, Fabian, E-mail: furrer@eve.phys.s.u-tokyo.ac.jp [Department of Physics, Graduate School of Science, University of Tokyo, Tokyo, Japan and Institute for Theoretical Physics, Leibniz University Hanover, Hanover (Germany); Scholz, Volkher B., E-mail: scholz@phys.ethz.ch [Institute for Theoretical Physics, ETH Zurich, Zurich (Switzerland)

    2016-01-15

    We discuss information-theoretic concepts on infinite-dimensional quantum systems. In particular, we lift the smooth entropy formalism as introduced by Renner and collaborators for finite-dimensional systems to von Neumann algebras. For the smooth conditional min- and max-entropy, we recover similar characterizing properties and information-theoretic operational interpretations as in the finite-dimensional case. We generalize the entropic uncertainty relation with quantum side information of Tomamichel and Renner and discuss applications to quantum cryptography. In particular, we prove the possibility to perform privacy amplification and classical data compression with quantum side information modeled by a von Neumann algebra.
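
    The finite-dimensional prototype of the entropic uncertainty relation being generalized can be sketched as follows, for measurements in bases $\{|\phi_x\rangle\}$ and $\{|\psi_z\rangle\}$ on a system $A$ with side information $B$ and $C$:

```latex
% Smooth-entropy uncertainty relation with quantum side information
% (Tomamichel-Renner form, finite-dimensional case):
H_{\min}^{\varepsilon}(X \mid B) + H_{\max}^{\varepsilon}(Z \mid C)
  \;\ge\; \log_2 \frac{1}{c},
\qquad
c = \max_{x,z} \bigl|\langle \phi_x \mid \psi_z \rangle\bigr|^2 .
% c measures the overlap (incompatibility) of the two measurements; for
% mutually unbiased bases in dimension d, c = 1/d.
```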

  19. Measuring the effect of formalization

    International Nuclear Information System (INIS)

    Stoelen, K.; Mohn, P.

    1998-01-01

    We present an ongoing research activity concerned with measuring the effect of an increased level of formalization in software development. We summarize the experiences from a first experimental development. Based on these experiences, we discuss a number of technical issues; in particular, problems connected to metrics based on fault reports. First of all, what is a fault? Secondly, how should the fault counting be integrated in the development process? Thirdly, any reasonable definition of fault depends on a notion of satisfaction. Hence, we must address the question: What does it mean for a specification or an implementation to satisfy a requirement imposed by a more high-level specification? (author)

  20. The formal de Rham complex

    Science.gov (United States)

    Zharinov, V. V.

    2013-02-01

    We propose a formal construction generalizing the classic de Rham complex to a wide class of models in mathematical physics and analysis. The presentation is divided into a sequence of definitions and elementary, easily verified statements; proofs are therefore given only in the key case. Linear operations are everywhere performed over a fixed number field $\mathbb{F} = \mathbb{R}, \mathbb{C}$. All linear spaces, algebras, and modules, although not stipulated explicitly, are by definition or by construction endowed with natural locally convex topologies, and their morphisms are continuous.

  1. Development and preliminary validation of a scale to measure patient uncertainty: The "Uncertainty Scale".

    Science.gov (United States)

    LaNoue, Marianna D; Gerolamo, Angela M; Powell, Rhea; Nord, Garrison; Doty, Amanda Mb; Rising, Kristin L

    2018-01-01

    Research suggests that patient uncertainty related to experiencing symptoms may drive decisions to seek care. The only validated measure of patient uncertainty assesses uncertainty related to defined illness. In prior work, we engaged patients to describe uncertainty related to symptoms and used findings to develop the 'U-Scale' scale. In this work, we present results from preliminary scale reliability and validity testing. Psychometric testing demonstrated content validity, high internal consistency, and evidence for concurrent validity. Next steps include administration in diverse populations for continued refinement and validation, and exploration of the potential contribution of uncertainty to healthcare utilization.

  2. Model uncertainty and probability

    International Nuclear Information System (INIS)

    Parry, G.W.

    1994-01-01

    This paper discusses the issue of model uncertainty. The use of probability as a measure of an analyst's uncertainty as well as a means of describing random processes has caused some confusion, even though the two uses are representing different types of uncertainty with respect to modeling a system. The importance of maintaining the distinction between the two types is illustrated with a simple example

  3. Uncertainty in artificial intelligence

    CERN Document Server

    Kanal, LN

    1986-01-01

    How to deal with uncertainty is a subject of much controversy in Artificial Intelligence. This volume brings together a wide range of perspectives on uncertainty, many of the contributors being the principal proponents in the controversy.Some of the notable issues which emerge from these papers revolve around an interval-based calculus of uncertainty, the Dempster-Shafer Theory, and probability as the best numeric model for uncertainty. There remain strong dissenting opinions not only about probability but even about the utility of any numeric method in this context.
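
    One of the numeric calculi debated in the volume, Dempster-Shafer combination, is compact enough to sketch; the frame of discernment and mass assignments below are illustrative only:

```python
# Dempster's rule of combination for two mass functions over a small
# frame of discernment, with conflict renormalization.
from itertools import product

def dempster_combine(m1, m2):
    """Combine two mass functions given as {frozenset: mass} dicts."""
    combined, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb          # mass falling on the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict: sources cannot be combined")
    return {s: w / (1.0 - conflict) for s, w in combined.items()}

A, B = frozenset({"a"}), frozenset({"b"})
AB = A | B                               # the whole frame: ignorance
m1 = {A: 0.6, AB: 0.4}                   # source 1: some support for 'a'
m2 = {B: 0.3, AB: 0.7}                   # source 2: weak support for 'b'

for s, w in dempster_combine(m1, m2).items():
    print(set(s), round(w, 3))
```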

  4. Uncertainties in hydrogen combustion

    International Nuclear Information System (INIS)

    Stamps, D.W.; Wong, C.C.; Nelson, L.S.

    1988-01-01

    Three important areas of hydrogen combustion with uncertainties are identified: high-temperature combustion, flame acceleration and deflagration-to-detonation transition, and aerosol resuspension during hydrogen combustion. The uncertainties associated with high-temperature combustion may affect at least three different accident scenarios: the in-cavity oxidation of combustible gases produced by core-concrete interactions, the direct containment heating hydrogen problem, and the possibility of local detonations. How these uncertainties may affect the sequence of various accident scenarios is discussed and recommendations are made to reduce these uncertainties. 40 references

  5. A survey of formal languages for contracts

    DEFF Research Database (Denmark)

    Hvitved, Tom

    2010-01-01

    In this short paper we present the current status of formal languages and models for contracts. By a formal model is meant an unambiguous and rigorous representation of contracts, in order to enable their automatic validation, execution, and analysis — activities that are collectively referred...... to as contract lifecycle management (CLM). We present a set of formalism requirements, which represent features that any ideal contract model should support, based on which we present a comparative survey of existing contract formalisms....

  6. 19 CFR 4.9 - Formal entry.

    Science.gov (United States)

    2010-04-01

    19 CFR Customs Duties, Vessels in Foreign and Domestic Trades, Arrival and Entry of Vessels, § 4.9 Formal entry. (a) General. Section 4.3 provides which vessels are subject to formal entry and where and when entry must be made. The formal entry...

  7. Evacuation decision-making: process and uncertainty

    International Nuclear Information System (INIS)

    Mileti, D.; Sorensen, J.; Bogard, W.

    1985-09-01

    The purpose was to describe the processes of evacuation decision-making, identify and document uncertainties in that process and discuss implications for federal assumption of liability for precautionary evacuations at nuclear facilities under the Price-Anderson Act. Four major categories of uncertainty are identified concerning the interpretation of hazard, communication problems, perceived impacts of evacuation decisions and exogenous influences. Over 40 historical accounts are reviewed and cases of these uncertainties are documented. The major findings are that all levels of government, including federal agencies experience uncertainties in some evacuation situations. Second, private sector organizations are subject to uncertainties at a variety of decision points. Third, uncertainties documented in the historical record have provided the grounds for liability although few legal actions have ensued. Finally it is concluded that if liability for evacuations is assumed by the federal government, the concept of a ''precautionary'' evacuation is not useful in establishing criteria for that assumption. 55 refs., 1 fig., 4 tabs

  8. Report on the uncertainty methods study

    International Nuclear Information System (INIS)

    1998-06-01

    The Uncertainty Methods Study (UMS) Group, following a mandate from CSNI, has compared five methods for calculating the uncertainty in the predictions of advanced 'best estimate' thermal-hydraulic codes: the Pisa method (based on extrapolation from integral experiments) and four methods identifying and combining input uncertainties. Three of these, the GRS, IPSN and ENUSA methods, use subjective probability distributions, and one, the AEAT method, performs a bounding analysis. Each method has been used to calculate the uncertainty in specified parameters for the LSTF SB-CL-18 5% cold leg small break LOCA experiment in the ROSA-IV Large Scale Test Facility (LSTF). The uncertainty analysis was conducted essentially blind and the participants did not use experimental measurements from the test as input apart from initial and boundary conditions. Participants calculated uncertainty ranges for experimental parameters including pressurizer pressure, primary circuit inventory and clad temperature (at a specified position) as functions of time
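
    Input-uncertainty methods of the kind compared here commonly size their random sampling with Wilks' formula; a minimal sketch of the first-order, one-sided case (the individual UMS methods differ in order, sidedness and other details):

```python
# Wilks' formula: the smallest number of code runs n such that the
# largest of n random runs bounds the beta-quantile of an output with
# confidence gamma, i.e. 1 - beta**n >= gamma.
def wilks_n(beta: float = 0.95, gamma: float = 0.95) -> int:
    """First-order, one-sided tolerance-limit sample size."""
    n = 1
    while 1.0 - beta**n < gamma:
        n += 1
    return n

print(wilks_n())              # 59 runs for a 95%/95% one-sided limit
print(wilks_n(0.99, 0.95))    # tighter coverage needs many more runs
```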

  9. Understanding visualization: a formal approach using category theory and semiotics.

    Science.gov (United States)

    Vickers, Paul; Faith, Joe; Rossiter, Nick

    2013-06-01

    This paper combines the vocabulary of semiotics and category theory to provide a formal analysis of visualization. It shows how familiar processes of visualization fit the semiotic frameworks of both Saussure and Peirce, and extends these structures using the tools of category theory to provide a general framework for understanding visualization in practice, including: Relationships between systems, data collected from those systems, renderings of those data in the form of representations, the reading of those representations to create visualizations, and the use of those visualizations to create knowledge and understanding of the system under inspection. The resulting framework is validated by demonstrating how familiar information visualization concepts (such as literalness, sensitivity, redundancy, ambiguity, generalizability, and chart junk) arise naturally from it and can be defined formally and precisely. This paper generalizes previous work on the formal characterization of visualization by, inter alia, Ziemkiewicz and Kosara and allows us to formally distinguish properties of the visualization process that previous work does not.

  10. Assessment of parametric uncertainty for groundwater reactive transport modeling,

    Science.gov (United States)

    Shi, Xiaoqing; Ye, Ming; Curtis, Gary P.; Miller, Geoffery L.; Meyer, Philip D.; Kohler, Matthias; Yabusaki, Steve; Wu, Jichun

    2014-01-01

    The validity of using Gaussian assumptions for model residuals in uncertainty quantification of a groundwater reactive transport model was evaluated in this study. Least squares regression methods explicitly assume Gaussian residuals, and the assumption leads to Gaussian likelihood functions, model parameters, and model predictions. While the Bayesian methods do not explicitly require the Gaussian assumption, Gaussian residuals are widely used. This paper shows that the residuals of the reactive transport model are non-Gaussian, heteroscedastic, and correlated in time; characterizing them requires using a generalized likelihood function such as the formal generalized likelihood function developed by Schoups and Vrugt (2010). For the surface complexation model considered in this study for simulating uranium reactive transport in groundwater, parametric uncertainty is quantified using the least squares regression methods and Bayesian methods with both Gaussian and formal generalized likelihood functions. While the least squares methods and Bayesian methods with Gaussian likelihood function produce similar Gaussian parameter distributions, the parameter distributions of Bayesian uncertainty quantification using the formal generalized likelihood function are non-Gaussian. In addition, predictive performance of formal generalized likelihood function is superior to that of least squares regression and Bayesian methods with Gaussian likelihood function. The Bayesian uncertainty quantification is conducted using the differential evolution adaptive metropolis (DREAM(zs)) algorithm; as a Markov chain Monte Carlo (MCMC) method, it is a robust tool for quantifying uncertainty in groundwater reactive transport models. For the surface complexation model, the regression-based local sensitivity analysis and Morris- and DREAM(ZS)-based global sensitivity analysis yield almost identical ranking of parameter importance. The uncertainty analysis may help select appropriate likelihood
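
    The contrast at issue, an i.i.d. Gaussian log-likelihood versus one that accounts for correlated residuals, can be sketched as follows; this stand-in only removes AR(1) autocorrelation, whereas the formal generalized likelihood of Schoups and Vrugt (2010) also treats heteroscedasticity, skew and kurtosis:

```python
# A standard i.i.d. Gaussian log-likelihood versus a likelihood that
# first decorrelates the residuals with an AR(1) model.
import numpy as np

def loglik_gaussian_iid(residuals, sigma):
    n = residuals.size
    return (-0.5 * n * np.log(2 * np.pi * sigma**2)
            - 0.5 * np.sum(residuals**2) / sigma**2)

def loglik_ar1(residuals, sigma, phi):
    """Gaussian likelihood of the AR(1) innovations e_t - phi * e_{t-1}."""
    innov = residuals[1:] - phi * residuals[:-1]
    return loglik_gaussian_iid(innov, sigma)

rng = np.random.default_rng(1)
e = np.zeros(200)
for t in range(1, e.size):               # synthetic correlated model errors
    e[t] = 0.8 * e[t - 1] + rng.normal(scale=0.1)

print("iid Gaussian :", round(loglik_gaussian_iid(e, 0.1), 1))
print("AR(1) model  :", round(loglik_ar1(e, 0.1, 0.8), 1))
```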

  11. Formalizing the concept phase of product development

    NARCIS (Netherlands)

    Schuts, M.; Hooman, J.

    2015-01-01

    We discuss the use of formal techniques to improve the concept phase of product realisation. As an industrial application, a new concept of interventional X-ray systems has been formalized, using model checking techniques and the simulation of formal models.

  12. Formal Testing of Correspondence Carrying Software

    NARCIS (Netherlands)

    Bujorianu, M.C.; Bujorianu, L.M.; Maharaj, S.

    2008-01-01

    Nowadays formal software development is characterised by the use of a multitude of formal specification languages. Test case generation from formal specifications depends in general on a specific language and, moreover, there are competing methods for each language. There is a need for a generic approach to test case generation from formal specifications.

  13. Uncertainty modelling of atmospheric dispersion by stochastic ...

    Indian Academy of Sciences (India)

    2016-08-26

    Keywords: uncertainty; polynomial chaos expansion; fuzzy set theory; cumulative distribution function; uniform distribution; membership function. Abstract: The parameters associated with an environmental dispersion model may include different kinds of variability, imprecision and uncertainty. More often, it is seen that ...

  14. Uncertainty and growth of the firm

    NARCIS (Netherlands)

    Lensink, Robert; Steen, Paul van der; Sterken, Elmer

    2000-01-01

    Using data from a survey of 1097 Dutch firms we investigate the relation between growth of the firm and uncertainty. We focus on the impact of uncertainty on various types of investment, employment demand, and expected maturity of the firm. The special feature of the survey is that it includes data

  15. The spatial statistics formalism applied to mapping electromagnetic radiation in urban areas.

    Science.gov (United States)

    Paniagua, Jesus M; Rufo, Montaña; Jimenez, Antonio; Antolin, Alicia

    2013-01-01

    Determining the electromagnetic radiation levels in urban areas is a complicated task. Various approaches have been taken, including numerical simulations using different models of propagation, sampling campaigns to measure field values with which to validate theoretical models, and the formalism of spatial statistics. In the work presented here, this latter technique was used to construct maps of the electric field and its associated uncertainty from experimental data. For this purpose, a field meter and a broadband probe sensitive in the 100 kHz to 3 GHz frequency range were used to take 1,020 measurements around buildings and along the perimeter of the area. The distance between sampling points was 5 m. The results were stored in a geographic information system to facilitate data handling and analysis, in particular the application of the formalism of spatial statistics to the distribution of field levels over the study area. The spatial structure was analyzed using the variographic technique, with the field levels at non-sampled points being interpolated by kriging. The results indicated that, in the urban area analyzed in the present work, the linear density of sampling points could be reduced to a distance which coincides with the length of the blocks of buildings, without the statistical parameters varying significantly and with the field level maps being reproduced qualitatively and quantitatively.
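
    As an illustration of the interpolation step, a minimal ordinary-kriging sketch; the exponential variogram and its sill, range, and nugget values are assumed for the example, whereas in the study they would be fitted to the measured field levels.

    ```python
    import numpy as np

    # Minimal ordinary kriging with an assumed exponential variogram.
    def variogram(h, sill=1.0, rng_=50.0, nugget=0.1):
        return np.where(h == 0, 0.0, nugget + (sill - nugget) * (1 - np.exp(-h / rng_)))

    def ordinary_kriging(xy, z, xy0):
        n = len(z)
        d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=2)
        # Kriging system: variogram matrix bordered by the unbiasedness constraint.
        A = np.ones((n + 1, n + 1))
        A[:n, :n] = variogram(d)
        A[n, n] = 0.0
        b = np.ones(n + 1)
        b[:n] = variogram(np.linalg.norm(xy - xy0, axis=1))
        w = np.linalg.solve(A, b)
        estimate = w[:n] @ z
        variance = w @ b  # kriging variance = estimation uncertainty at xy0
        return estimate, variance

    xy = np.array([[0, 0], [5, 0], [0, 5], [5, 5]], float)  # sample points (m)
    z = np.array([0.8, 1.1, 0.9, 1.4])                      # field levels (V/m)
    est, var = ordinary_kriging(xy, z, np.array([2.5, 2.5]))
    print(f"kriged estimate {est:.2f} V/m, kriging variance {var:.3f}")
    ```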

  16. Cognitive reasoning a formal approach

    CERN Document Server

    Anshakov, Oleg M

    2010-01-01

    Dealing with uncertainty, moving from ignorance to knowledge, is the focus of cognitive processes. Understanding these processes and modelling, designing, and building artificial cognitive systems have long been challenging research problems. This book describes the theory and methodology of a new, scientifically well-founded general approach, and its realization in the form of intelligent systems applicable in disciplines ranging from social sciences, such as cognitive science and sociology, through natural sciences, such as life sciences and chemistry, to applied sciences such as medicine...

  17. PIV uncertainty propagation

    NARCIS (Netherlands)

    Sciacchitano, A.; Wieneke, Bernhard

    2016-01-01

    This paper discusses the propagation of the instantaneous uncertainty of PIV measurements to statistical and instantaneous quantities of interest derived from the velocity field. The expression of the uncertainty of vorticity, velocity divergence, mean value and Reynolds stresses is derived.
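
    A simplified sketch of the propagation idea for one derived quantity, the time-averaged velocity. These are generic error-propagation formulas under an assumption of uncorrelated samples, not the paper's exact derivations; the quadrature combination at the end is illustrative, since the paper's expressions account for the overlap between measurement noise and measured fluctuations.

    ```python
    import numpy as np

    # For N uncorrelated samples with per-sample uncertainties U_i, the
    # measurement contribution to the mean is sqrt(sum(U_i^2)) / N; random
    # flow fluctuations contribute std(u) / sqrt(N).
    rng = np.random.default_rng(3)
    u = 5.0 + rng.normal(0, 0.8, 2000)       # instantaneous velocity (m/s)
    U_inst = np.full(u.size, 0.05)           # per-sample uncertainty (m/s)

    U_mean_meas = np.sqrt(np.sum(U_inst**2)) / u.size   # measurement part
    U_mean_stat = u.std(ddof=1) / np.sqrt(u.size)       # statistical part
    print(f"U(mean) = {np.hypot(U_mean_meas, U_mean_stat):.4f} m/s")
    ```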

  18. Feedback versus uncertainty

    NARCIS (Netherlands)

    Van Nooyen, R.R.P.; Hrachowitz, M.; Kolechkina, A.G.

    2014-01-01

    Even without uncertainty about the model structure or parameters, the output of a hydrological model run still contains several sources of uncertainty. These are: measurement errors affecting the input, and the transition from continuous time and space to discrete time and space, which causes loss of information.

  19. Schrödinger's Uncertainty Principle?

    Indian Academy of Sciences (India)

    correlation between x and p. The virtue of Schrödinger's version (5) is that it accounts for this correlation. In special cases like the free particle and the harmonic oscillator, the 'Schrödinger uncertainty product' even remains constant with time, whereas Heisenberg's does not. The glory of giving the uncertainty principle to ...
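
    For reference, the Robertson-Schrödinger relation the snippet alludes to, which strengthens Heisenberg's inequality with the covariance term:

    ```latex
    \sigma_x^2\,\sigma_p^2 \;\ge\;
    \left(\tfrac{1}{2}\big\langle\{\hat x,\hat p\}\big\rangle
          -\langle\hat x\rangle\langle\hat p\rangle\right)^{2}
    +\left(\tfrac{1}{2i}\big\langle[\hat x,\hat p]\big\rangle\right)^{2}
    \;=\;
    \left(\tfrac{1}{2}\big\langle\{\hat x,\hat p\}\big\rangle
          -\langle\hat x\rangle\langle\hat p\rangle\right)^{2}
    +\frac{\hbar^{2}}{4}
    ```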

  20. Uncertainty and simulation

    International Nuclear Information System (INIS)

    Depres, B.; Dossantos-Uzarralde, P.

    2009-01-01

    More than 150 researchers and engineers from universities and the industrial world met to discuss the new methodologies developed for assessing uncertainty. About 20 papers were presented, and the main topics were: methods to study the propagation of uncertainties, sensitivity analysis, nuclear data covariances, and multi-parameter optimisation. This report gathers the contributions of CEA researchers and engineers

  1. Physical Uncertainty Bounds (PUB)

    Energy Technology Data Exchange (ETDEWEB)

    Vaughan, Diane Elizabeth [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Preston, Dean L. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-03-19

    This paper introduces and motivates the need for a new methodology for determining upper bounds on the uncertainties in simulations of engineered systems due to limited fidelity in the composite continuum-level physics models needed to simulate the systems. We show that traditional uncertainty quantification methods provide, at best, a lower bound on this uncertainty. We propose to obtain bounds on the simulation uncertainties by first determining bounds on the physical quantities or processes relevant to system performance. By bounding these physics processes, as opposed to carrying out statistical analyses of the parameter sets of specific physics models or simply switching out the available physics models, one can obtain upper bounds on the uncertainties in simulated quantities of interest.

  2. Measurement uncertainty and probability

    CERN Document Server

    Willink, Robin

    2013-01-01

    A measurement result is incomplete without a statement of its 'uncertainty' or 'margin of error'. But what does this statement actually tell us? By examining the practical meaning of probability, this book discusses what is meant by a '95 percent interval of measurement uncertainty', and how such an interval can be calculated. The book argues that the concept of an unknown 'target value' is essential if probability is to be used as a tool for evaluating measurement uncertainty. It uses statistical concepts, such as a conditional confidence interval, to present 'extended' classical methods for evaluating measurement uncertainty. The use of the Monte Carlo principle for the simulation of experiments is described. Useful for researchers and graduate students, the book also discusses other philosophies relating to the evaluation of measurement uncertainty. It employs clear notation and language to avoid the confusion that exists in this controversial field of science.
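
    A minimal sketch of the Monte Carlo simulation of experiments the book describes, in the spirit of GUM Supplement 1; the measurand (a resistance derived from voltage and current readings) and all values are hypothetical.

    ```python
    import numpy as np

    # Propagate input distributions through the model R = V / I and report
    # a 95 % coverage interval for the measurand.
    rng = np.random.default_rng(42)
    N = 200_000
    V = rng.normal(5.00, 0.02, N)    # voltage: mean 5.00 V, u = 0.02 V
    I = rng.normal(0.100, 0.001, N)  # current: mean 0.100 A, u = 0.001 A
    R = V / I                        # measurand: resistance in ohms

    lo, hi = np.percentile(R, [2.5, 97.5])
    print(f"R = {R.mean():.2f} ohm, u(R) = {R.std(ddof=1):.2f} ohm")
    print(f"95 % coverage interval: [{lo:.2f}, {hi:.2f}] ohm")
    ```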

  3. Formal modeling of virtual machines

    Science.gov (United States)

    Cremers, A. B.; Hibbard, T. N.

    1978-01-01

    Systematic software design can be based on the development of a 'hierarchy of virtual machines', each representing a 'level of abstraction' of the design process. The reported investigation presents the concept of 'data space' as a formal model for virtual machines. The presented model of a data space combines the notions of data type and mathematical machine to express the close interaction between data and control structures which takes place in a virtual machine. One of the main objectives of the investigation is to show that control-independent data type implementation is only of limited usefulness as an isolated tool of program development, and that the representation of data is generally dictated by the control context of a virtual machine. As a second objective, a better understanding is to be developed of virtual machine state structures than was heretofore provided by the view of the state space as a Cartesian product.

  4. Formal analysis of electromagnetic optics

    Science.gov (United States)

    Khan-Afshar, Sanaz; Hasan, Osman; Tahar, Sofiène

    2014-09-01

    Optical systems are increasingly being used in safety-critical applications. Due to the complexity and sensitivity of optical systems, their verification raises many challenges for engineers. Traditionally, the analysis of such systems has been carried out by paper-and-pencil based proofs and numerical computations. However, these techniques cannot provide accurate results due to the risk of human error and inherent approximations of numerical algorithms. In order to overcome these limitations, we propose to use theorem proving (i.e., a computer-based technique that allows to express mathematical expressions and reason about their correctness by taking into account all the details of mathematical reasoning) as a complementary approach to improve optical system analysis. This paper provides a higher-order logic (a language used to express mathematical theories) formalization of electromagnetic optics in the HOL Light theorem prover. In order to demonstrate the practical effectiveness of our approach, we present the analysis of resonant cavity enhanced photonic devices.

  5. A Formal Calculus for Categories

    DEFF Research Database (Denmark)

    Cáccamo, Mario José

    This dissertation studies the logic underlying category theory. In particular we present a formal calculus for reasoning about universal properties. The aim is to systematise judgements about functoriality and naturality central to categorical reasoning. The calculus is based on a language which extends the typed lambda calculus with new binders to represent universal constructions. The types of the language are interpreted as locally small categories and the expressions represent functors. The logic supports a syntactic treatment of universality and duality. Contravariance requires a definition of universality generous enough to deal with functors of mixed variance. Ends generalise limits to cover these kinds of functors and moreover provide the basis for a very convenient algebraic manipulation of expressions. The equational theory of the lambda calculus is extended with new rules for the definitions...

  6. Formal algorithmic elimination for PDEs

    CERN Document Server

    Robertz, Daniel

    2014-01-01

    Investigating the correspondence between systems of partial differential equations and their analytic solutions using a formal approach, this monograph presents algorithms to determine the set of analytic solutions of such a system and conversely to find differential equations whose set of solutions coincides with a given parametrized set of analytic functions. After giving a detailed introduction to Janet bases and Thomas decomposition, the problem of finding an implicit description of certain sets of analytic functions in terms of differential equations is addressed. Effective methods of varying generality are developed to solve the differential elimination problems that arise in this context. In particular, it is demonstrated how the symbolic solution of partial differential equations profits from the study of the implicitization problem. For instance, certain families of exact solutions of the Navier-Stokes equations can be computed.

  7. Formalization of treatment guidelines using Fuzzy Cognitive Maps and semantic web tools.

    Science.gov (United States)

    Papageorgiou, Elpiniki I; Roo, Jos De; Huszka, Csaba; Colaert, Dirk

    2012-02-01

    Therapy decision making and support in medicine deals with uncertainty and needs to take into account the patient's clinical parameters, the context of illness, and the medical knowledge of the physician and guidelines to recommend a treatment therapy. This research study is focused on the formalization of medical knowledge using a cognitive process, called Fuzzy Cognitive Maps (FCMs), and a semantic web approach. The FCM technique is capable of dealing with situations including uncertain descriptions using a procedure similar to human reasoning. Thus, it was selected for the modeling and knowledge integration of clinical practice guidelines. The semantic web tools were established to implement the FCM approach. The knowledge base was constructed from the clinical guidelines in the form of if-then fuzzy rules. These fuzzy rules were transferred to the FCM modeling technique and, through the semantic web tools, the whole formalization was accomplished. The problem of urinary tract infection (UTI) in the adult community was examined with the proposed approach. Forty-seven clinical concepts and eight therapy concepts were identified for the antibiotic treatment therapy problem of UTIs. A preliminary pilot-evaluation study with 55 patient cases showed interesting findings; 91% of the antibiotic treatments proposed by the implemented approach were in full agreement with the guidelines and physicians' opinions. The results have shown that the suggested approach formalizes medical knowledge efficiently and gives a front-end decision on antibiotic selection for cystitis. Concluding, modeling medical knowledge/therapeutic guidelines using cognitive methods and semantic web tools is both reliable and useful.
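
    A toy sketch of FCM inference as described, assuming a common sigmoid update rule with a self-memory term (conventions vary across the FCM literature); the four concepts and weights are invented for illustration and bear no relation to the paper's 47-concept UTI model.

    ```python
    import numpy as np

    # Concept activations are updated from the weighted influence of
    # connected concepts through a squashing function until they settle.
    def fcm_infer(A0, W, iters=50, tol=1e-5):
        A = np.asarray(A0, float)
        for _ in range(iters):
            A_new = 1.0 / (1.0 + np.exp(-(A @ W + A)))  # sigmoid of net input
            if np.max(np.abs(A_new - A)) < tol:          # converged
                return A_new
            A = A_new
        return A

    # 3 clinical concepts -> 1 therapy concept (toy weights in [-1, 1]).
    W = np.array([
        [0.0, 0.0, 0.0,  0.7],   # symptom severity promotes therapy
        [0.0, 0.0, 0.0,  0.5],   # positive culture promotes therapy
        [0.0, 0.0, 0.0, -0.6],   # allergy inhibits this antibiotic
        [0.0, 0.0, 0.0,  0.0],
    ])
    print(fcm_infer([0.8, 1.0, 0.0, 0.0], W))  # index 3 = therapy activation
    ```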

  8. Formalism for neutron cross section covariances in the resonance region using kernel approximation

    Energy Technology Data Exchange (ETDEWEB)

    Oblozinsky, P.; Cho,Y-S.; Matoon,C.M.; Mughabghab,S.F.

    2010-04-09

    We describe an analytical formalism for estimating neutron radiative capture and elastic scattering cross section covariances in the resolved resonance region. We use capture and scattering kernels as the starting point and show how to get average cross sections in broader energy bins, derive analytical expressions for cross section sensitivities, and deduce cross section covariances from the resonance parameter uncertainties in the recently published Atlas of Neutron Resonances. The formalism elucidates the role of resonance parameter correlations, which become important if several strong resonances are located in one energy group. The importance of potential scattering uncertainty, as well as the correlation between potential scattering and resonance scattering, is also examined. Practical application of the formalism is illustrated on ⁵⁵Mn(n,γ) and ⁵⁵Mn(n,el).
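
    In one common rendering of the kernel approximation (our notation, not necessarily the report's), the capture area of a resonance r and the resulting bin-averaged capture cross section are:

    ```latex
    A_{\gamma,r} \;=\; \int \sigma_\gamma(E)\,dE
    \;=\; \frac{2\pi^2}{k_r^{2}}\; g_r\,\frac{\Gamma_{n,r}\,\Gamma_{\gamma,r}}{\Gamma_r},
    \qquad
    \bar\sigma_{\gamma,g} \;\approx\; \frac{1}{\Delta E_g}\sum_{r\,\in\,g} A_{\gamma,r}
    ```

    where k_r is the neutron wave number at the resonance energy, g_r the spin statistical factor, and Γ_{n,r}, Γ_{γ,r}, Γ_r the neutron, radiative, and total widths; sensitivities to the resonance parameters then follow by differentiating A_{γ,r}.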

  9. Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for internal dosimetry. Volume 1: Main report

    Energy Technology Data Exchange (ETDEWEB)

    Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M. [Delft Univ. of Technology (Netherlands); Harrison, J.D. [National Radiological Protection Board (United Kingdom); Harper, F.T. [Sandia National Labs., Albuquerque, NM (United States); Hora, S.C. [Univ. of Hawaii, Hilo, HI (United States)

    1998-04-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA internal dosimetry models.

  10. Probabilistic accident consequence uncertainty analysis -- Early health effects uncertainty assessment. Volume 1: Main report

    Energy Technology Data Exchange (ETDEWEB)

    Haskin, F.E. [Univ. of New Mexico, Albuquerque, NM (United States); Harper, F.T. [Sandia National Labs., Albuquerque, NM (United States); Goossens, L.H.J.; Kraan, B.C.P. [Delft Univ. of Technology (Netherlands); Grupa, J.B. [Netherlands Energy Research Foundation (Netherlands)

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA early health effects models.

  11. Proceedings of the First NASA Formal Methods Symposium

    Science.gov (United States)

    Denney, Ewen (Editor); Giannakopoulou, Dimitra (Editor); Pasareanu, Corina S. (Editor)

    2009-01-01

    Topics covered include: Model Checking - My 27-Year Quest to Overcome the State Explosion Problem; Applying Formal Methods to NASA Projects: Transition from Research to Practice; TLA+: Whence, Wherefore, and Whither; Formal Methods Applications in Air Transportation; Theorem Proving in Intel Hardware Design; Building a Formal Model of a Human-Interactive System: Insights into the Integration of Formal Methods and Human Factors Engineering; Model Checking for Autonomic Systems Specified with ASSL; A Game-Theoretic Approach to Branching Time Abstract-Check-Refine Process; Software Model Checking Without Source Code; Generalized Abstract Symbolic Summaries; A Comparative Study of Randomized Constraint Solvers for Random-Symbolic Testing; Component-Oriented Behavior Extraction for Autonomic System Design; Automated Verification of Design Patterns with LePUS3; A Module Language for Typing by Contracts; From Goal-Oriented Requirements to Event-B Specifications; Introduction of Virtualization Technology to Multi-Process Model Checking; Comparing Techniques for Certified Static Analysis; Towards a Framework for Generating Tests to Satisfy Complex Code Coverage in Java Pathfinder; jFuzz: A Concolic Whitebox Fuzzer for Java; Machine-Checkable Timed CSP; Stochastic Formal Correctness of Numerical Algorithms; Deductive Verification of Cryptographic Software; Coloured Petri Net Refinement Specification and Correctness Proof with Coq; Modeling Guidelines for Code Generation in the Railway Signaling Context; Tactical Synthesis Of Efficient Global Search Algorithms; Towards Co-Engineering Communicating Autonomous Cyber-Physical Systems; and Formal Methods for Automated Diagnosis of Autosub 6000.

  12. Towards a different attitude to uncertainty

    Directory of Open Access Journals (Sweden)

    Guy Pe'er

    2014-10-01

    The ecological literature deals with uncertainty primarily from the perspective of how to reduce it to acceptable levels. However, the current rapid and ubiquitous environmental changes, as well as anticipated rates of change, pose novel conditions and complex dynamics due to which many sources of uncertainty are difficult or even impossible to reduce. These include both uncertainty in knowledge (epistemic uncertainty) and societal responses to it. Under these conditions, an increasing number of studies ask how one can deal with uncertainty as it is. Here, we explore how to adopt an overall alternative attitude to uncertainty, which accepts or even embraces it. First, we show that seeking to reduce uncertainty may be counterproductive under some circumstances. It may yield overconfidence, ignoring of early warning signs, policy and societal stagnation, or irresponsible behaviour if personal certainty is offered by externalization of environmental costs. We then demonstrate that uncertainty can have positive impacts by driving improvements in knowledge, promoting cautious action, contributing to keeping societies flexible and adaptable, enhancing awareness, support and involvement of the public in nature conservation, and enhancing cooperation and communication. We discuss the risks of employing a certainty paradigm on uncertain knowledge, the potential benefits of adopting an alternative attitude to uncertainty, and the need to implement such an attitude across scales – from adaptive management at the local scale to the evolving Intergovernmental Science-Policy Platform on Biodiversity and Ecosystem Services (IPBES) at the global level.

  13. Formal ontologies in biomedical knowledge representation.

    Science.gov (United States)

    Schulz, S; Jansen, L

    2013-01-01

    Medical decision support and other intelligent applications in the life sciences depend on increasing amounts of digital information. Knowledge bases as well as formal ontologies are being used to organize biomedical knowledge and data. However, these two kinds of artefacts are not always clearly distinguished. Whereas the popular RDF(S) standard provides an intuitive triple-based representation, it is semantically weak. Description logics based ontology languages like OWL-DL carry a clear-cut semantics, but they are computationally expensive, and they are often misinterpreted to encode all kinds of statements, including those which are not ontological. We distinguish four kinds of statements needed to comprehensively represent domain knowledge: universal statements, terminological statements, statements about particulars and contingent statements. We argue that the task of formal ontologies is solely to represent universal statements, while the non-ontological kinds of statements can nevertheless be connected with ontological representations. To illustrate these four types of representations, we use a running example from parasitology. We finally formulate recommendations for semantically adequate ontologies that can efficiently be used as a stable framework for more context-dependent biomedical knowledge representation and reasoning applications like clinical decision support systems.

  14. Formal specification level concepts, methods, and algorithms

    CERN Document Server

    Soeken, Mathias

    2015-01-01

    This book introduces a new level of abstraction that closes the gap between the textual specification of embedded systems and the executable model at the Electronic System Level (ESL). Readers will be enabled to operate at this new, Formal Specification Level (FSL), using models which not only allow significant verification tasks in this early stage of the design flow, but also can be extracted semi-automatically from the textual specification in an interactive manner.  The authors explain how to use these verification tasks to check conceptual properties, e.g. whether requirements are in conflict, as well as dynamic behavior, in terms of execution traces. • Serves as a single-source reference to a new level of abstraction for embedded systems, known as the Formal Specification Level (FSL); • Provides a variety of use cases which can be adapted to readers’ specific design flows; • Includes a comprehensive illustration of Natural Language Processing (NLP) techniques, along with examples of how to i...

  15. The simplest formal argument for fitness optimization.

    Science.gov (United States)

    Grafen, Alan

    2008-12-01

    The Formal Darwinism Project aims to provide a formal argument linking population genetics to fitness optimization, which of necessity includes defining fitness. This bridges the gulf between those biologists who assume that natural selection leads to something close to fitness optimization and those biologists who believe on theoretical grounds that there is no sense of fitness that can usefully be said to be optimized. The current paper's main objective is to provide a careful mathematical introduction to the project, and it also reflects on the project's scope and limitations. The central argument is the proof of close ties between the mathematics of motion, as embodied in the Price equation, and the mathematics of optimization, as represented by optimization programmes. To make these links, a general and abstract model linking genotype, phenotype and number of successful gametes is assumed. The project has begun with simple dynamic models and simple linking models, and its progress will involve more realistic versions of them. The versions given here are fully mathematically rigorous, but elementary enough to serve as an introduction.
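
    The "mathematics of motion" referred to is the Price equation; in its standard form, with w_i the fitness of individual i (e.g., its number of successful gametes), z_i its character value, and bars denoting population averages:

    ```latex
    \bar w\,\Delta\bar z \;=\; \operatorname{Cov}(w_i, z_i)
    \;+\; \operatorname{E}\!\big(w_i\,\Delta z_i\big)
    ```

    The covariance term captures change due to selection, which the project's linking results tie to the maximand of an optimization programme.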

  16. Analysis of Infiltration Uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    R. McCurley

    2003-10-27

    The primary objectives of this uncertainty analysis are: (1) to develop and justify a set of uncertain parameters along with associated distributions; and (2) to use the developed uncertain parameter distributions and the results from selected analog site calculations done in "Simulation of Net Infiltration for Modern and Potential Future Climates" (USGS 2001 [160355]) to obtain the net infiltration weighting factors for the glacial transition climate. These weighting factors are applied to unsaturated zone (UZ) flow fields in Total System Performance Assessment (TSPA), as outlined in the "Total System Performance Assessment-License Application Methods and Approach" (BSC 2002 [160146], Section 3.1), as a method for the treatment of uncertainty. This report is a scientific analysis because no new mathematical or physical models are developed herein; it is based on the use of the models developed in or for "Simulation of Net Infiltration for Modern and Potential Future Climates" (USGS 2001 [160355]). Any use of the term model refers to those developed in the infiltration numerical model report. TSPA License Application (LA) has included three distinct climate regimes in the comprehensive repository performance analysis for Yucca Mountain: present-day, monsoon, and glacial transition. Each climate regime was characterized using three infiltration-rate maps, including a lower- and upper-bound and a mean value (equal to the average of the two boundary values). For each of these maps, which were obtained based on analog site climate data, a spatially averaged value was also calculated by the USGS. For a more detailed discussion of these infiltration-rate maps, see "Simulation of Net Infiltration for Modern and Potential Future Climates" (USGS 2001 [160355]). For this Scientific Analysis Report, spatially averaged values were calculated for the lower-bound, mean, and upper-bound infiltration maps.

  17. Uncertainty in eddy covariance measurements and its application to physiological models

    Science.gov (United States)

    D.Y. Hollinger; A.D. Richardson; A.D. Richardson

    2005-01-01

    Flux data are noisy, and this uncertainty is largely due to random measurement error. Knowledge of uncertainty is essential for the statistical evaluation of modeled and measured fluxes, for comparison of parameters derived by fitting models to measured fluxes, and in formal data-assimilation efforts. We used the difference between simultaneous measurements from two...
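
    A sketch of the paired-measurement idea with synthetic data: if two towers observe the same flux with independent, identically distributed errors, the random error of a single measurement can be estimated from the standard deviation of the paired differences divided by √2.

    ```python
    import numpy as np

    # var(x1 - x2) = 2 * var(error) when errors are independent and
    # identically distributed, so sigma(error) = std(x1 - x2) / sqrt(2).
    rng = np.random.default_rng(7)
    true_flux = rng.uniform(-5, 15, 1000)        # umol m-2 s-1
    x1 = true_flux + rng.normal(0, 1.2, 1000)    # tower 1 measurements
    x2 = true_flux + rng.normal(0, 1.2, 1000)    # tower 2 measurements

    sigma = np.std(x1 - x2, ddof=1) / np.sqrt(2)
    print(f"estimated random measurement error: {sigma:.2f} umol m-2 s-1")
    ```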

  18. Uncertainty of dustfall monitoring results

    Directory of Open Access Journals (Sweden)

    Martin A. van Nierop

    2017-06-01

    Fugitive dust can cause a nuisance and pollute the ambient environment, particularly around human activities including construction and industrial sites and mining operations. As such, dustfall monitoring has occurred for many decades in South Africa; however, little has been published on the repeatability, uncertainty, accuracy and precision of dustfall monitoring. Repeatability assesses the consistency associated with the results of a particular measurement under the same conditions; the consistency of the laboratory is assessed to determine the uncertainty associated with dustfall monitoring conducted by the laboratory. The aim of this study was to improve the understanding of the uncertainty in dustfall monitoring, thereby improving the confidence in dustfall monitoring. Uncertainty of dustfall monitoring was assessed through a 12-month study of 12 sites located on the boundary of the study area. Each site contained a directional dustfall sampler, which was modified by removing the rotating lid, with four buckets (A, B, C and D) installed. Having four buckets on one stand allows each bucket to be exposed to the same conditions for the same period of time, and therefore equal amounts of dust should be deposited in these buckets. The difference in the weight (mg) of the dust recorded from each bucket at each respective site was determined using the American Society for Testing and Materials method D1739 (ASTM D1739). The variability of the dust across buckets provides the confidence level of dustfall monitoring when reporting to clients.

  19. Risk Management and Uncertainty in Infrastructure Projects

    DEFF Research Database (Denmark)

    Harty, Chris; Neerup Themsen, Tim; Tryggestad, Kjell

    2014-01-01

    The assumption that large complex projects should be managed in order to reduce uncertainty and increase predictability is not new. What is relatively new, however, is that uncertainty reduction can and should be obtained through formal risk management approaches. We question both assumptions by addressing a more fundamental question about the role of knowledge in current risk management practices. Inquiries into the predominant approaches to risk management in large infrastructure and construction projects reveal their assumptions about knowledge, and we discuss the ramifications these have for project and construction management. Our argument and claim is that predominant risk management approaches tend to reinforce conventional ideas of project control whilst undermining other notions of the value and relevance of built assets and the project management process. These approaches fail to consider...

  20. Addressing model uncertainty in dose-response: The case of chloroform

    International Nuclear Information System (INIS)

    Evans, J.S.

    1994-01-01

    This paper discusses the issues involved in addressing model uncertainty in the analysis of dose-response relationships. A method for addressing model uncertainty is described and applied to characterize the uncertainty in estimates of the carcinogenic potency of chloroform. The approach, which is rooted in Bayesian concepts of subjective probability, uses probability trees and formally-elicited expert judgments to address model uncertainty. It is argued that a similar approach could be used to improve the characterization of model uncertainty in the dose-response relationships for health effects from ionizing radiation.
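
    A toy illustration of the probability-tree mechanics, with all branch probabilities and potency values invented rather than those elicited for chloroform:

    ```python
    # Each branch carries an elicited probability that a model assumption
    # holds; each leaf carries the potency implied by that chain of
    # assumptions. The uncertainty-weighted potency is the expectation
    # over leaves.
    tree = [
        # (P(mechanism), P(dose-response model | mechanism), potency)
        (0.4, 0.7, 1.9e-2),   # genotoxic, linearized multistage
        (0.4, 0.3, 6.0e-3),   # genotoxic, biologically based model
        (0.6, 0.5, 1.0e-3),   # cytotoxic, threshold-like model
        (0.6, 0.5, 0.0),      # cytotoxic, no low-dose risk
    ]

    weights = [p1 * p2 for p1, p2, _ in tree]
    assert abs(sum(weights) - 1.0) < 1e-9     # leaves must exhaust the tree
    mean_potency = sum(w * q for w, (_, _, q) in zip(weights, tree))
    print(f"uncertainty-weighted potency: {mean_potency:.2e} (mg/kg-day)^-1")
    ```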

  1. Does formal research training lead to academic success in otolaryngology?

    Science.gov (United States)

    Bobian, Michael R; Shah, Noor; Svider, Peter F; Hong, Robert S; Shkoukani, Mahdi A; Folbe, Adam J; Eloy, Jean Anderson

    2017-01-01

    To evaluate whether formalized research training is associated with higher researcher productivity, academic rank, and acquisition of National Institutes of Health (NIH) grants within academic otolaryngology departments. Each of the 100 civilian otolaryngology programs' departmental websites was analyzed to obtain a comprehensive list of faculty members' credentials and characteristics, including academic rank, completion of a clinical fellowship, completion of a formal research fellowship, and attainment of a doctor of philosophy (PhD) degree. We also recorded measures of scholarly impact and successful acquisition of NIH funding. A total of 1,495 academic physicians were included in our study. Of these, 14.1% had formal research training. Bivariate associations showed that formal research training was associated with a greater h-index, increased probability of acquiring NIH funding, and higher academic rank. Using a linear regression model, we found that otolaryngologists possessing a PhD had an associated h-index 1.8 points higher, and those who completed a formal research fellowship an h-index 1.6 points higher. A PhD degree or completion of a research fellowship was not associated with a higher academic rank; however, a higher h-index and previous acquisition of an NIH grant were associated with a higher academic rank. The attainment of NIH funding was three times more likely for those with a formal research fellowship and 8.6 times more likely for otolaryngologists with a PhD degree. Formalized research training is associated with academic success in otolaryngology. Such dedicated research training accompanies greater scholarly impact, acquisition of NIH funding, and a higher academic rank. Laryngoscope, 127:E15-E21, 2017.

  2. Parameter and uncertainty estimation for mechanistic, spatially explicit epidemiological models

    Science.gov (United States)

    Finger, Flavio; Schaefli, Bettina; Bertuzzo, Enrico; Mari, Lorenzo; Rinaldo, Andrea

    2014-05-01

    Epidemiological models can be a crucially important tool for decision-making during disease outbreaks. The range of possible applications spans from real-time forecasting and allocation of health-care resources to testing alternative intervention mechanisms such as vaccines, antibiotics or the improvement of sanitary conditions. Our spatially explicit, mechanistic models for cholera epidemics have been successfully applied to several epidemics, including the one that struck Haiti in late 2010 and is still ongoing. Calibration and parameter estimation of such models represent a major challenge because of properties unusual in traditional geoscientific domains such as hydrology. Firstly, the epidemiological data available might be subject to high uncertainties due to error-prone diagnosis as well as manual (and possibly incomplete) data collection. Secondly, long-term time-series of epidemiological data are often unavailable. Finally, the spatially explicit character of the models requires the comparison of several time-series of model outputs with their real-world counterparts, which calls for an appropriate weighting scheme. It follows that the usual assumption of a homoscedastic Gaussian error distribution, used in combination with classical calibration techniques based on Markov chain Monte Carlo algorithms, is likely to be violated, whereas the construction of an appropriate formal likelihood function seems close to impossible. Alternative calibration methods, which allow for accurate estimation of total model uncertainty, particularly regarding the envisaged use of the models for decision-making, are thus needed. Here we present the most recent developments regarding methods for parameter and uncertainty estimation to be used with our mechanistic, spatially explicit models for cholera epidemics, based on informal measures of goodness of fit.

  3. Uncertainty in oil projects

    International Nuclear Information System (INIS)

    Limperopoulos, G.J.

    1995-01-01

    This report presents an oil project valuation under uncertainty by means of two well-known financial techniques: the Capital Asset Pricing Model (CAPM) and the Black-Scholes option pricing formula. CAPM gives a linear positive relationship between expected rate of return and risk, but does not take into consideration the aspect of flexibility, which is crucial for an irreversible investment such as an oil project. Introducing investment decision flexibility by using real options can increase the oil project value substantially. Some simple tests of the importance of stock market uncertainty for oil investments are performed. Uncertainty in stock returns is correlated with aggregate product market uncertainty according to Pindyck (1991). The results of the tests are not satisfactory due to the short data series, but introducing two other explanatory variables, the interest rate and gross domestic product, improves the situation. 36 refs., 18 figs., 6 tabs
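
    For concreteness, a small sketch of the Black-Scholes formula applied as a real-option valuation; the mapping of project quantities onto option inputs and all numbers are illustrative assumptions, not values from the report.

    ```python
    from math import log, sqrt, exp, erf

    def norm_cdf(x):
        """Standard normal CDF via the error function."""
        return 0.5 * (1.0 + erf(x / sqrt(2.0)))

    def black_scholes_call(S, K, T, r, sigma):
        """European call value; for a real option, S ~ present value of
        project cash flows, K ~ development cost, T ~ time to expiry of
        the development right, sigma ~ volatility of project value."""
        d1 = (log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
        d2 = d1 - sigma * sqrt(T)
        return S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)

    # Hypothetical oil project: option to develop within 3 years.
    print(f"option value: {black_scholes_call(100, 110, 3.0, 0.05, 0.35):.1f} M$")
    ```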

  4. Introduction to uncertainty quantification

    CERN Document Server

    Sullivan, T J

    2015-01-01

    Uncertainty quantification is a topic of increasing practical importance at the intersection of applied mathematics, statistics, computation, and numerous application areas in science and engineering. This text provides a framework in which the main objectives of the field of uncertainty quantification are defined, and an overview of the range of mathematical methods by which they can be achieved. Complete with exercises throughout, the book will equip readers with both theoretical understanding and practical experience of the key mathematical and algorithmic tools underlying the treatment of uncertainty in modern applied mathematics. Students and readers alike are encouraged to apply the mathematical methods discussed in this book to their own favourite problems to understand their strengths and weaknesses, also making the text suitable for self-study. This text is designed as an introduction to uncertainty quantification for senior undergraduate and graduate students with a mathematical or statistical back...

  5. On formally integrating science and policy: walking the walk

    Science.gov (United States)

    Nichols, James D.; Johnson, Fred A.; Williams, Byron K.; Boomer, G. Scott

    2015-01-01

    The contribution of science to the development and implementation of policy is typically neither direct nor transparent. In 1995, the U.S. Fish and Wildlife Service (FWS) made a decision that was unprecedented in natural resource management, turning to an unused and unproven decision process to carry out trust responsibilities mandated by an international treaty. The decision process was adopted for the establishment of annual sport hunting regulations for the most economically important duck population in North America, the 6 to 11 million mallards Anas platyrhynchos breeding in the mid-continent region of the north-central United States and central Canada. The key idea underlying the adopted decision process was to formally embed within it a scientific process designed to reduce uncertainty (learn) and thus make better decisions in the future. The scientific process entails the use of models to develop predictions of competing hypotheses about system response to the selected action at each decision point. These predictions not only are used to select the optimal management action, but also are compared with the subsequent estimates of system state variables, providing evidence for modifying degrees of confidence in, and hence the relative influence of, these models at the next decision point. Science and learning in one step are formally and directly incorporated into the next decision, contrasting with the usual ad hoc and indirect use of scientific results in policy development and decision-making. Application of this approach over the last 20 years has led to a substantial reduction in uncertainty, as well as to an increase in the transparency and defensibility of annual decisions and a decrease in the contentiousness of the decision process. As resource managers are faced with increased uncertainty associated with various components of global change, this approach provides a roadmap for the future scientific management of natural resources.

  6. Uncertainty calculations made easier

    International Nuclear Information System (INIS)

    Hogenbirk, A.

    1994-07-01

    The results are presented of a neutron cross section sensitivity/uncertainty analysis performed in a complicated 2D model of the NET shielding blanket design inside the ITER torus design, surrounded by the cryostat/biological shield as planned for ITER. The calculations were performed with a code system developed at ECN Petten, with which sensitivity/uncertainty calculations become relatively simple. In order to check the deterministic neutron transport calculations (performed with DORT), calculations were also performed with the Monte Carlo code MCNP. Care was taken to model the 2.0 cm wide gaps between two blanket segments, as the neutron flux behind the vacuum vessel is largely determined by neutrons streaming through these gaps. The resulting neutron flux spectra are in excellent agreement up to the end of the cryostat. It is noted that at this position the attenuation of the neutron flux is about 11 orders of magnitude. The uncertainty in the energy-integrated flux at the beginning of the vacuum vessel and at the beginning of the cryostat was determined in the calculations. The uncertainty appears to be strongly dependent on the exact geometry: if the gaps are filled with stainless steel, the neutron spectrum changes strongly, which results in an uncertainty of 70% in the energy-integrated flux at the beginning of the cryostat in the no-gap geometry, compared to an uncertainty of only 5% in the gap geometry. Therefore, it is essential to take into account the exact geometry in sensitivity/uncertainty calculations. Furthermore, this study shows that an improvement of the covariance data is urgently needed in order to obtain reliable estimates of the uncertainties in response parameters in neutron transport calculations. (orig./GL)

  7. Uncertainty: lotteries and risk

    OpenAIRE

    Ávalos, Eloy

    2011-01-01

    In this paper we develop the theory of uncertainty in a context where the risks assumed by the individual are measurable and manageable. We primarily use the definition of a lottery to formulate the axioms of the individual's preferences and their representation through a von Neumann-Morgenstern utility function. We study the expected utility theorem and its properties, the paradoxes of choice under uncertainty and, finally, measures of risk aversion with monetary lotteries.
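
    The two central objects the abstract names, in their usual forms: the von Neumann-Morgenstern representation of preferences over a lottery L with prizes x_i and probabilities p_i, and the Arrow-Pratt measure of absolute risk aversion commonly used with monetary lotteries:

    ```latex
    U(L) \;=\; \sum_i p_i\,u(x_i),
    \qquad
    A(x) \;=\; -\,\frac{u''(x)}{u'(x)}
    ```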

  8. Sources of Judgmental Uncertainty

    Science.gov (United States)

    1977-09-01

    To avoid primacy or recency effects, which were not part of this first study, the order of information items was varied for half of the subjects. In total, 72 subjects were randomly assigned to two conditions of control and exposed to three conditions of orderliness. Keywords: judgmental uncertainty; primacy/recency; environmental uncertainty.

  9. Formalized search strategies for human risk contributions

    International Nuclear Information System (INIS)

    Rasmussen, J.; Pedersen, O.M.

    1982-07-01

    For risk management, the results of a probabilistic risk analysis (PRA), as well as the underlying assumptions, can be used as references in closed-loop risk control, with analyses of operational experience serving as a means of feedback. In this context, the need for explicit definition and documentation of the PRA coverage, including the search strategies applied, is discussed, and aids are proposed, such as plant description in terms of a formal abstraction hierarchy and use of cause-consequence charts for documenting not only the results of a PRA but also its coverage. Typical human risk contributions are described on the basis of general plant design features relevant for risk and accident analysis. With this background, search strategies for human risk contributions are treated: under the designation "work analysis", procedures for the analysis of familiar, well-trained, planned tasks are proposed. Strategies for identifying human risk contributions outside this category are outlined. (author)

  10. Formal Model Engineering for Embedded Systems Using Real-Time Maude

    Directory of Open Access Journals (Sweden)

    Peter Csaba Ölveczky

    2011-06-01

    This paper motivates why Real-Time Maude should be well suited to provide a formal semantics and formal analysis capabilities to modeling languages for embedded systems. One can then use the code generation facilities of the tools for the modeling languages to automatically synthesize Real-Time Maude verification models from design models, enabling a formal model engineering process that combines the convenience of modeling using an informal but intuitive modeling language with formal verification. We give a brief overview of six fairly different modeling formalisms for which Real-Time Maude has provided the formal semantics and (possibly) formal analysis. These models include behavioral subsets of the avionics modeling standard AADL, Ptolemy II discrete-event models, two EMF-based timed model transformation systems, and a modeling language for handset software.

  11. Visualizing large-scale uncertainty in astrophysical data.

    Science.gov (United States)

    Li, Hongwei; Fu, Chi-Wing; Li, Yinggang; Hanson, Andrew

    2007-01-01

    Visualization of uncertainty or error in astrophysical data is seldom available in simulations of astronomical phenomena, and yet almost all rendered attributes possess some degree of uncertainty due to observational error. Uncertainties associated with spatial location typically vary significantly with scale and thus introduce further complexity in the interpretation of a given visualization. This paper introduces effective techniques for visualizing uncertainty in large-scale virtual astrophysical environments. Building upon our previous transparently scalable visualization architecture, we develop tools that enhance the perception and comprehension of uncertainty across wide scale ranges. Our methods include a unified color-coding scheme for representing log-scale distances and percentage errors, an ellipsoid model to represent positional uncertainty, an ellipsoid envelope model to expose trajectory uncertainty, and a magic-glass design supporting the selection of ranges of log-scale distance and uncertainty parameters, as well as an overview mode and a scalable WIM tool for exposing the magnitudes of spatial context and uncertainty.

  12. A Formal Description of Problem Frames

    OpenAIRE

    Souleymane KOUSSOUBE; Roger NOUSSI; Balira O. KONFE

    2014-01-01

    Michael Jackson defines a Problem Frame as a means to describe and classify software development problems. The initial description of Problem Frames is essentially graphical. A weakness of this proposal is the lack of a formal specification allowing efficient reasoning tools. This paper deals with the formal specification of Problem Frames using Description Logics. We first propose a formal terminology of Problem Frames leading to the specification of a Problem Frames TBOX and a specific problem's ABOX...

  13. Survey of Existing Tools for Formal Verification.

    Energy Technology Data Exchange (ETDEWEB)

    Punnoose, Ratish J.; Armstrong, Robert C.; Wong, Matthew H.; Jackson, Mayo

    2014-12-01

    Formal methods have come into wide use because of their effectiveness in verifying "safety and security" requirements of digital systems, a set of requirements for which testing is mostly ineffective. Formal methods are routinely used in the design and verification of high-consequence digital systems in industry. This report outlines our work in assessing the capabilities of commercial and open source formal tools and the ways in which they can be leveraged in digital design workflows.

  14. Formal Information Model for Representing Production Resources

    OpenAIRE

    Siltala, Niko; Järvenpää, Eeva; Lanz, Minna

    2017-01-01

    This paper introduces a concept and associated descriptions to formally describe physical production resources for modular and reconfigurable production systems. These descriptions are a source of formal information for (automatic) production system design and (re-)configuration. They can be further utilized during system deployment and execution. The proposed concept and the underlying formal resource description model is c...

  15. What Determines Firms’ Decisions to Formalize?

    OpenAIRE

    Neil McCulloch; Günther G. Schulze; Janina Voss

    2010-01-01

    In this paper we analyze the decision of small and micro firms to formalize, i.e. to obtain business and other licenses, in rural Indonesia. We use the Rural Investment Climate Survey (RICS), which covers non-farm rural enterprises, most of them microenterprises, and analyze the effect of formalization on tax payments, corruption, access to credit and revenue, taking into account the endogeneity of the formalization decision to such benefits and costs. We show, contrary to most of the liter...

  16. Relational uncertainty in service dyads

    DEFF Research Database (Denmark)

    Kreye, Melanie

    2017-01-01

    Purpose: Relational uncertainty determines how relationships develop because it enables the building of trust and commitment. However, relational uncertainty has not been explored in an inter-organisational setting. This paper investigates how organisations experience relational uncertainty in se...

  17. Experimental uncertainty estimation and statistics for data having interval uncertainty.

    Energy Technology Data Exchange (ETDEWEB)

    Kreinovich, Vladik (Applied Biomathematics, Setauket, New York); Oberkampf, William Louis (Applied Biomathematics, Setauket, New York); Ginzburg, Lev (Applied Biomathematics, Setauket, New York); Ferson, Scott (Applied Biomathematics, Setauket, New York); Hajagos, Janos (Applied Biomathematics, Setauket, New York)

    2007-05-01

    This report addresses the characterization of measurements that include epistemic uncertainties in the form of intervals. It reviews the application of basic descriptive statistics to data sets which contain intervals rather than exclusively point estimates. It describes algorithms to compute various means, the median and other percentiles, variance, interquartile range, moments, confidence limits, and other important statistics and summarizes the computability of these statistics as a function of sample size and characteristics of the intervals in the data (degree of overlap, size and regularity of widths, etc.). It also reviews the prospects for analyzing such data sets with the methods of inferential statistics such as outlier detection and regressions. The report explores the tradeoff between measurement precision and sample size in statistical results that are sensitive to both. It also argues that an approach based on interval statistics could be a reasonable alternative to current standard methods for evaluating, expressing and propagating measurement uncertainties.
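
    A minimal sketch of descriptive statistics on interval data: the interval-valued sample mean is straightforward, while variance bounds are harder (NP-hard in general). The brute-force endpoint enumeration below recovers the exact maximum, since the sample variance is convex in each coordinate, but only brackets the minimum from above, because the minimizer may lie strictly inside the intervals.

    ```python
    import itertools
    import statistics

    data = [(1.0, 1.4), (0.8, 1.1), (1.2, 1.9), (0.9, 1.3)]  # hypothetical

    # The mean of intervals [lo_i, hi_i] is itself an interval.
    mean_lo = statistics.mean(lo for lo, hi in data)
    mean_hi = statistics.mean(hi for lo, hi in data)
    print(f"interval mean: [{mean_lo:.3f}, {mean_hi:.3f}]")

    # Enumerate endpoint combinations (exponential in n -- only viable
    # for tiny samples like this one).
    variances = [statistics.variance(combo)
                 for combo in itertools.product(*data)]
    print(f"max variance: {max(variances):.4f} (exact)")
    print(f"min variance: {min(variances):.4f} (upper bound on the true min)")
    ```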

  18. A Mathematical Formalization Proposal for Business Growth

    Directory of Open Access Journals (Sweden)

    Gheorghe BAILESTEANU

    2013-01-01

    Economic sciences have known a spectacular evolution in the last century, beginning to use axiomatic methods and applying mathematical instruments as decision-making tools. The quest for formalization needs to be addressed from various angles: reducing entry and operating formal costs, increasing the incentives for firms to operate formally, reducing obstacles to their growth, and searching for inexpensive approaches through which to enforce compliance with government regulations. This paper proposes a formalized approach to business growth, based on mathematics and logic, taking into consideration the particularities of the economic sector.

  19. Uncertainty reasoning in expert systems

    Science.gov (United States)

    Kreinovich, Vladik

    1993-01-01

    Intelligent control is a very successful way to transform the expert's knowledge of the type 'if the velocity is big and the distance from the object is small, hit the brakes and decelerate as fast as possible' into an actual control. To apply this transformation, one must choose appropriate methods for reasoning with uncertainty, i.e., one must: (1) choose the representation for words like 'small', 'big'; (2) choose operations corresponding to 'and' and 'or'; (3) choose a method that transforms the resulting uncertain control recommendations into a precise control strategy. The wrong choice can drastically affect the quality of the resulting control, so the problem of choosing the right procedure is very important. From a mathematical viewpoint these choice problems correspond to non-linear optimization and are therefore extremely difficult. In this project, a new mathematical formalism (based on group theory) is developed that allows us to solve the problem of optimal choice and thus: (1) explain why the existing choices are really the best (in some situations); (2) explain a rather mysterious fact that fuzzy control (i.e., control based on the experts' knowledge) is often better than the control by these same experts; and (3) give choice recommendations for the cases when traditional choices do not work.
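
    A toy instance of the three choices listed, assuming common defaults: triangular membership functions for "small" and "big", minimum for "and", and centroid defuzzification; all shapes, limits, and the single rule are invented for illustration.

    ```python
    import numpy as np

    def tri(x, a, b, c):
        """Triangular membership function rising from a, peaking at b."""
        return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

    velocity, distance = 80.0, 12.0
    mu_big_v = tri(velocity, 40, 100, 160)     # "velocity is big"
    mu_small_d = tri(distance, 0, 0.001, 30)   # "distance is small"
    rule_strength = min(mu_big_v, mu_small_d)  # "and" chosen as minimum

    # Clip the "brake hard" output set by the rule strength, then take the
    # centroid of the clipped set as the crisp control command.
    u = np.linspace(0, 1, 101)                 # braking-level universe
    mu_brake_hard = np.minimum(tri(u, 0.5, 1.0, 1.5), rule_strength)
    control = float((mu_brake_hard * u).sum() / mu_brake_hard.sum())
    print(f"braking command: {control:.2f}")
    ```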

  20. Investigating the dosimetric impact of seed location uncertainties in Collaborative Ocular Melanoma Study-based eye plaques.

    Science.gov (United States)

    Johnson, Jedediah E; Deufel, Christopher L; Furutani, Keith M

    2016-01-01

    To quantify the dosimetric effects of random and systematic seed position uncertainties in Collaborative Ocular Melanoma Study-based eye plaques. An eye plaque dose calculation routine was created using the Task Group 43 formalism. A variety of clinical configurations were simulated, including two seed models (¹²⁵I and ¹⁰³Pd), three eye plaque sizes, and eight plaque/eye orientations. Dose was calculated at four ocular anatomic sites and three central-axis plaque depths. Random seed positional uncertainty was modeled by adding Gaussian random displacements, in one of three seed-motion degrees of freedom, to each seed's nominal coordinate. Distributions of dosimetric outcomes were obtained and fitted after 10⁶ randomizations. A similar analysis was performed for deterministic, systematic shifts of the plaque along the eye surface and radially from the globe center. Random seed placement uncertainties of 0.2-mm root mean square (RMS) amplitude produce dose changes that are typically small. Eye plaque dosimetry is most sensitive to seed movement toward the center of the eye. Dosimetric uncertainty also increases with increasing dose gradients, which are typically greatest near the inner sclera, with smaller plaques, and with lower-energy radionuclides (e.g., ¹⁰³Pd). Dosimetric uncertainties due to the random seed positional displacements anticipated in the clinic are expected to be <4% for each degree of freedom in most circumstances.
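
    A sketch of the randomization loop, with a bare inverse-square point-source kernel standing in for the full TG-43 dose formalism (which also includes radial dose functions, anisotropy, and geometry factors) and an invented seed layout; the paper perturbs one seed-motion degree of freedom at a time, whereas this toy applies isotropic jitter.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    seeds = np.array([[x, y, 0.0] for x in (-5, 0, 5) for y in (-5, 0, 5)])  # mm
    point = np.array([0.0, 0.0, 10.0])  # calculation point, e.g. tumor apex
    strength = 1.0                      # per-seed strength (arbitrary units)

    def dose(seed_xyz):
        r2 = np.sum((seed_xyz - point) ** 2, axis=1)  # squared distances
        return np.sum(strength / r2)                  # inverse-square sum

    # Jitter every seed with Gaussian displacements and record the dose
    # distribution at the point of interest.
    rms = 0.2  # mm, RMS displacement amplitude, as in the paper
    doses = np.array([
        dose(seeds + rng.normal(0.0, rms, seeds.shape))
        for _ in range(20_000)
    ])
    nominal = dose(seeds)
    print(f"dose change: {100 * (doses.mean() / nominal - 1):+.2f}% mean, "
          f"{100 * doses.std() / nominal:.2f}% sd")
    ```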

  1. Assessing spatial uncertainties of land allocation using a scenario approach and sensitivity analysis: A study for land use in Europe

    NARCIS (Netherlands)

    Verburg, P.H.; Tabeau, A.A.; Hatna, E.

    2013-01-01

    Land change model outcomes are vulnerable to multiple types of uncertainty, including uncertainty in input data, structural uncertainties in the model and uncertainties in model parameters. In coupled model systems the uncertainties propagate between the models. This paper assesses the uncertainty of land allocation.

  2. Visualizing Summary Statistics and Uncertainty

    KAUST Repository

    Potter, K.

    2010-08-12

    The graphical depiction of uncertainty information is emerging as a problem of great importance. Scientific data sets are not considered complete without indications of error, accuracy, or levels of confidence. The visual portrayal of this information is a challenging task. This work takes inspiration from graphical data analysis to create visual representations that show not only the data value, but also important characteristics of the data including uncertainty. The canonical box plot is reexamined and a new hybrid summary plot is presented that incorporates a collection of descriptive statistics to highlight salient features of the data. Additionally, we present an extension of the summary plot to two dimensional distributions. Finally, a use-case of these new plots is presented, demonstrating their ability to present high-level overviews as well as detailed insight into the salient features of the underlying data distribution. © 2010 The Eurographics Association and Blackwell Publishing Ltd.
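
    A sketch of the descriptive statistics such a hybrid summary plot might collect (quartiles, mean, standard deviation, skewness, and a bootstrap confidence interval for the mean); this particular set is a plausible assumption, not necessarily the one used in the paper:

    ```python
    import numpy as np

    def summary_stats(data, n_boot=2000, seed=0):
        rng = np.random.default_rng(seed)
        q1, med, q3 = np.percentile(data, [25, 50, 75])
        # Bootstrap resampling to express uncertainty of the mean.
        boot_means = [np.mean(rng.choice(data, size=len(data))) for _ in range(n_boot)]
        lo, hi = np.percentile(boot_means, [2.5, 97.5])
        sd = float(np.std(data, ddof=1))
        return {
            "quartiles": (float(q1), float(med), float(q3)),
            "mean": float(np.mean(data)),
            "std": sd,
            # Sample skewness: third standardized moment.
            "skew": float(np.mean(((data - np.mean(data)) / sd) ** 3)),
            "mean_ci95": (float(lo), float(hi)),
        }

    print(summary_stats(np.random.default_rng(42).normal(10.0, 2.0, size=500)))
    ```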

  3. Visualization of Uncertainty

    Science.gov (United States)

    Jones, P. W.; Strelitz, R. A.

    2012-12-01

    The output of a simulation is best comprehended through the agency and methods of visualization, but a vital component of good science is knowledge of uncertainty. While great strides have been made in the quantification of uncertainty, especially in simulation, there is still a notable gap: there is no widely accepted means of simultaneously viewing the data and the associated uncertainty in one pane. Visualization saturates the screen, using the full range of color, shadow, opacity and tricks of perspective to display even a single variable; there is no room left in the visualization expert's repertoire for uncertainty. We present a method of visualizing uncertainty, without sacrificing the clarity and power of the underlying visualization, that works as well in 3-D and time-varying visualizations as it does in 2-D. At its heart, it relies on a principal tenet of continuum mechanics, replacing the notion of value at a point with a more diffuse notion of density as a measure of content in a region. First, the uncertainties calculated or tabulated at each point are transformed into a piecewise continuous field of uncertainty density. We next compute a weighted Voronoi tessellation of a user-specified number N of convex polygonal/polyhedral cells such that each cell contains the same amount of uncertainty as defined by the density field. The problem thus devolves into a minimization. Computation of such a spatial decomposition is O(N*N), and it can be computed iteratively, making it possible to update easily, and faster, over time. The polygonal mesh does not interfere with the visualization of the data and can be easily toggled on or off. In this representation, a small cell implies a great concentration of uncertainty, and conversely. The content-weighted polygons are identical to the cartograms familiar to the information visualization community from the depiction of, for example, voting results per state. Furthermore, one can dispense with the mesh or edges entirely, to be replaced by symbols or glyphs.
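
    The equal-content principle has a simple one-dimensional analogue that can be sketched directly: cut the domain at quantiles of the cumulative uncertainty content, so that regions of concentrated uncertainty receive small cells. The density below is invented, and the full method's weighted Voronoi construction in 2-D/3-D is not reproduced here:

    ```python
    import numpy as np

    # Piecewise-continuous uncertainty density on [0, 1] (illustrative values).
    x = np.linspace(0.0, 1.0, 1001)
    rho = 0.2 + 2.0 * np.exp(-((x - 0.7) ** 2) / 0.005)  # hotspot near x = 0.7

    # Cumulative uncertainty content, normalized to 1.
    cdf = np.cumsum(rho)
    cdf /= cdf[-1]

    # N cells, each holding 1/N of the total uncertainty: cut at quantiles.
    N = 8
    edges = np.interp(np.linspace(0, 1, N + 1), cdf, x)
    print(np.round(edges, 3))
    # Small cell widths near the hotspot mark concentrated uncertainty.
    print(np.round(np.diff(edges), 3))
    ```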

  4. A Conceptual Formalization of Crosscutting in AOSD

    NARCIS (Netherlands)

    van den Berg, Klaas; Conejero, J.M.

    2005-01-01

    We propose a formalization of crosscutting based on a conceptual framework for AOSD. Crosscutting is clearly distinguished from the related concepts scattering and tangling. The definitions of these concepts are formalized and visualized with matrices and matrix operations. This allows more precise

  5. Formalizing Evaluation in Music Information Retrieval

    DEFF Research Database (Denmark)

    Sturm, Bob L.

    2013-01-01

    We develop a formalism to disambiguate the evaluation of music information retrieval systems. We define a ``system,'' what it means to ``analyze'' one, and make clear the aims, parts, design, execution, interpretation, and assumptions of its ``evaluation.'' We apply this formalism to discuss...

  6. Applicability of four parameter formalisms in interpreting ...

    Indian Academy of Sciences (India)

    ...terms of applicability of the three well-known four-parameter formalisms for the representation of the thermodynamic properties of binary systems. The study indicates that the derived values of the infinite-dilution parameters based on the formalisms compare favourably with the computed data available in the literature.

  7. Hamiltonian path integral formalism with higher derivatives

    Energy Technology Data Exchange (ETDEWEB)

    Barcelos-Neto, J.; Natividade, C.P. (Rio de Janeiro Univ. (Brazil). Inst. de Fisica)

    1991-07-01

    We study the Hamiltonian path integral formalism for systems containing higher derivatives. First we show the consistency of the formalism in applications involving only scalar fields. Later we use the Maxwell electromagnetic theory with a higher order regularization term to show that the Batalin-Fradkin-Vilkovisky (BFV) theory can also be consistently described. (orig.).

  8. DNA expressions - A formal notation for DNA

    NARCIS (Netherlands)

    Vliet, Rudy van

    2015-01-01

    We describe a formal notation for DNA molecules that may contain nicks and gaps. The resulting DNA expressions denote formal DNA molecules. Different DNA expressions may denote the same molecule. Such DNA expressions are called equivalent. We examine which DNA expressions are minimal, which

  9. Formal balancing of chemical reaction networks

    NARCIS (Netherlands)

    van der Schaft, Abraham; Rao, S.; Jayawardhana, B.

    2016-01-01

    In this paper we recall and extend the main results of Van der Schaft, Rao, Jayawardhana (2015) concerning the use of Kirchhoff’s Matrix Tree theorem in the explicit characterization of complex-balanced reaction networks and the notion of formal balancing. The notion of formal balancing corresponds

  10. Coefficient rings of formal group laws

    International Nuclear Information System (INIS)

    Buchstaber, V M; Ustinov, A V

    2015-01-01

    We describe the coefficient rings of universal formal group laws which arise in algebraic geometry, algebraic topology and their application to mathematical physics. We also describe the homomorphisms of these coefficient rings coming from reductions of one formal group law to another. The proofs are based on the number-theoretic properties of binomial coefficients. Bibliography: 37 titles

  11. Restorative Practices as Formal and Informal Education

    Science.gov (United States)

    Carter, Candice C.

    2013-01-01

    This article reviews restorative practices (RP) as education in formal and informal contexts of learning that are fertile sites for cultivating peace. Formal practices involve instruction about response to conflict, while informal learning occurs beyond academic lessons. The research incorporated content analysis and a critical examination of the…

  12. Formal Engineering Hybrid Systems: Semantic Underpinnings

    NARCIS (Netherlands)

    Bujorianu, M.C.; Bujorianu, L.M.

    2008-01-01

    In this work we investigate some issues in applying formal methods to hybrid system development and develop a categorical framework. We study the themes of stochastic reasoning, heterogeneous formal specification and retrenchment. Hybrid systems raise a rich palette of aspects that need to be

  13. Industrial Practice in Formal Methods : A Review

    DEFF Research Database (Denmark)

    Bicarregui, Juan C.; Fitzgerald, John; Larsen, Peter Gorm

    2009-01-01

    We examine the industrial application of formal methods using data gathered in a review of 62 projects taking place over the last 25 years. The review suggests that formal methods are being applied in a wide range of application domains, with increasingly strong tool support. Significant chal...

  14. Formal Analysis of a Fair Payment Protocol

    NARCIS (Netherlands)

    Cederquist, J.G.; Dashti, Muhammad Torabi; Dimitrakos, Theo; Martinelli, Fabio

    We formally specify a payment protocol described by Vogt et al. This protocol is intended for fair exchange of time-sensitive data. Here the mCRL language is used to formalize the protocol. Fair exchange properties are expressed in the regular alternation-free mu-calculus. These properties are then

  15. A Framework for Formal Modeling and Analysis of Organizations

    NARCIS (Netherlands)

    Jonker, C.M.; Sharpanskykh, O.; Treur, J.; P., Yolum

    2007-01-01

    A new, formal, role-based, framework for modeling and analyzing both real world and artificial organizations is introduced. It exploits static and dynamic properties of the organizational model and includes the (frequently ignored) environment. The transition is described from a generic framework of

  16. A STUDENT'S REFERENCE GRAMMAR OF MODERN FORMAL INDONESIAN.

    Science.gov (United States)

    MACDONALD, R. ROSS; SOENJONO, DARJOWIDJOJO

    THE INDONESIAN DESCRIBED IN THIS GRAMMAR IS THE FORMAL LANGUAGE USED IN PUBLISHED TEXTS RATHER THAN THE COLLOQUIAL LANGUAGE. ALL OF THE TEXTS USED WERE PUBLISHED BETWEEN 1945 AND 1966 AND THEY INCLUDE POLITICAL SPEECHES, LEGAL DOCUMENTS, AND TEXTBOOKS. SINCE THIS BOOK WAS DESIGNED PRIMARILY FOR GENERAL STUDENTS OF THE INDONESIAN LANGUAGE AND ONLY…

  17. Formalization of Bachmair and Ganzinger's Ordered Resolution Prover

    DEFF Research Database (Denmark)

    Schlichtkrull, Anders; Blanchette, Jasmin Christian; Traytel, Dmitriy

    2018-01-01

    This Isabelle/HOL formalization covers Sections 2 to 4 of Bachmair and Ganzinger’s “Resolution Theorem Proving” chapter in the Handbook of Automated Reasoning. This includes soundness and completeness of unordered and ordered variants of ground resolution with and without literal selection...

  18. Improving Project Management Using Formal Models and Architectures

    Science.gov (United States)

    Kahn, Theodore; Sturken, Ian

    2011-01-01

    This talk discusses the advantages formal modeling and architecture brings to project management. These emerging technologies have both great potential and challenges for improving information available for decision-making. The presentation covers standards, tools and cultural issues needing consideration, and includes lessons learned from projects the presenters have worked on.

  19. Non-Formal Educator Use of Evaluation Results

    Science.gov (United States)

    Baughman, Sarah; Boyd, Heather H.; Franz, Nancy K.

    2012-01-01

    Increasing demands for accountability in educational programming have resulted in increasing calls for program evaluation in educational organizations. Many organizations include conducting program evaluations as part of the job responsibilities of program staff. Cooperative Extension is a complex organization offering non-formal educational…

  20. Basic concepts of medical genetics, formal genetics, Part 1

    African Journals Online (AJOL)

    Mohammad Saad Zaghloul Salem

    2013-11-15

    The definition of formal genetics is still a matter of contention. However, it can be defined as a branch of basic genetics concerned with deducing and figuring out relevant genetic data from constructed figures that contain specific genetic information. These informative figures include, for instance, con...

  1. Formal and Informal Employment Growth in Manufacturing (India ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    This research project focuses on the informal and formal manufacturing labour forces of India and Bangladesh, which comprise most of the poor in South Asia. ... IDRC is investing in local solutions to address climate change-related challenges in India, including heat stress, water management, and climate-related migration.

  2. Sensitivity-Uncertainty Techniques for Nuclear Criticality Safety

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Forrest B. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Rising, Michael Evan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Alwin, Jennifer Louise [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-08-07

    The sensitivity and uncertainty analysis course will introduce students to keff sensitivity data, cross-section uncertainty data, how keff sensitivity data and keff uncertainty data are generated and how they can be used. Discussion will include how sensitivity/uncertainty data can be used to select applicable critical experiments, to quantify a defensible margin to cover validation gaps and weaknesses, and in development of upper subcritical limits.

  3. Formal modelling and analysis of socio-technical systems

    DEFF Research Database (Denmark)

    Probst, Christian W.; Kammüller, Florian; Hansen, Rene Rydhof

    2016-01-01

    Attacks on systems and organisations increasingly exploit human actors, for example through social engineering. This non-technical aspect of attacks complicates their formal treatment and automatic identification. Formalisation of human behaviour is difficult at best, and attacks on socio-technical systems are still mostly identified through brainstorming of experts. In this work we discuss several approaches to formalising socio-technical systems and their analysis. Starting from a flow logic-based analysis of the insider threat, we discuss how to include the socio aspects explicitly, and show a formalisation that proves properties of this formalisation. On the formal side, our work closes the gap between formal and informal approaches to socio-technical systems. On the informal side, we show how to steal a birthday cake from a bakery by social engineering.

  4. Formalization and Implementation of Algebraic Methods in Geometry

    Directory of Open Access Journals (Sweden)

    Filip Marić

    2012-02-01

    We describe our ongoing project of formalization of algebraic methods for geometry theorem proving (Wu's method and the Groebner bases method), their implementation and integration in educational tools. The project includes formal verification of the algebraic methods within the Isabelle/HOL proof assistant and the development of a new, open-source Java implementation of the algebraic methods. The project should fill in some gaps still existing in this area (e.g., the lack of formal links between algebraic methods and synthetic geometry, and the lack of self-contained implementations of algebraic methods suitable for integration with dynamic geometry tools) and should enable new applications of theorem proving in education.
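
    As an illustration of the Groebner bases method that the project verifies, a geometric statement can be proved by checking that its conclusion polynomial reduces to zero modulo a Groebner basis of the hypothesis ideal. A sketch using SymPy; the circumcenter example and coordinates are my own choice, not taken from the project:

    ```python
    from sympy import symbols, groebner, expand

    x, y, u1, u2, u3 = symbols('x y u1 u2 u3')

    # Triangle A=(0,0), B=(u1,0), C=(u2,u3); P=(x,y) a candidate circumcenter.
    # Hypotheses: |PA|^2 = |PB|^2 and |PA|^2 = |PC|^2.
    h1 = expand((x**2 + y**2) - ((x - u1)**2 + y**2))
    h2 = expand((x**2 + y**2) - ((x - u2)**2 + (y - u3)**2))
    # Conclusion: |PB|^2 = |PC|^2, i.e. P is also equidistant from B and C.
    concl = expand(((x - u1)**2 + y**2) - ((x - u2)**2 + (y - u3)**2))

    G = groebner([h1, h2], x, y, u1, u2, u3, order='lex')
    _, remainder = G.reduce(concl)
    print(remainder)  # 0: the conclusion lies in the hypothesis ideal
    ```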

  5. Public perception and communication of scientific uncertainty.

    Science.gov (United States)

    Broomell, Stephen B; Kane, Patrick Bodilly

    2017-02-01

    Understanding how the public perceives uncertainty in scientific research is fundamental for effective communication about research and its inevitable uncertainty. Previous work found that scientific evidence differentially influenced beliefs from individuals with different political ideologies. Evidence that threatens an individual's political ideology is perceived as more uncertain than nonthreatening evidence. The authors present 3 studies examining perceptions of scientific uncertainty more broadly by including sciences that are not politically polarizing. Study 1 develops scales measuring perceptions of scientific uncertainty. It finds (a) 3 perceptual dimensions of scientific uncertainty, with the primary dimension representing a perception of precision; (b) the precision dimension of uncertainty is strongly associated with the perceived value of a research field; and (c) differences in perceived uncertainty across political affiliations. Study 2 manipulated these dimensions, finding that Republicans were more sensitive than Democrats to descriptions of uncertainty associated with a research field (e.g., psychology). Study 3 found that these views of a research field did not extend to the evaluation of individual results produced by the field. Together, these studies show that perceptions of scientific uncertainty associated with entire research fields are valid predictors of abstract perceptions of scientific quality, benefit, and allocation of funding. Yet, they do not inform judgments about individual results. Therefore, polarization in the acceptance of specific results is not likely due to individual differences in perceived scientific uncertainty. Further, the direction of influence potentially could be reversed, such that perceived quality of scientific results could be used to influence perceptions about scientific research fields. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  6. Enhancing System Realisation in Formal Model Development

    DEFF Research Database (Denmark)

    Tran-Jørgensen, Peter Würtz Vinther

    Software for mission-critical systems is sometimes analysed using formal specification to increase the chances of the system behaving as intended. When sufficient insights into the system have been obtained from the formal analysis, the formal specification is realised in the form of a software implementation. One way to realise the system's software is by automatically generating it from the formal specification – a technique referred to as code generation. However, in general it is difficult to make guarantees about the correctness of the generated code – especially while requiring automation. This thesis develops code generation support in Overture – a formal methods tool that supports the Vienna Development Method. The development of the code generation infrastructure has involved the re-design of the software architecture of Overture. The new architecture brings forth the reuse and extensibility features of Overture to take into account the needs...

  7. Dealing with exploration uncertainties

    International Nuclear Information System (INIS)

    Capen, E.

    1992-01-01

    Exploration for oil and gas should fulfill the most adventurous in their quest for excitement and surprise. This paper tries to cover that tall order. The authors will touch on the magnitude of the uncertainty (which is far greater than in most other businesses), the effects of not knowing target sizes very well, how to build uncertainty into analyses naturally, how to tie reserves and chance estimates to economics, and how to look at the portfolio effect of an exploration program. With no apologies, the authors will be using a different language for some readers - the language of uncertainty, which means probability and statistics. These tools allow one to combine largely subjective exploration information with the more analytical data from the engineering and economic side

  8. Commonplaces and social uncertainty

    DEFF Research Database (Denmark)

    Lassen, Inger

    2008-01-01

    This article explores the concept of uncertainty in four focus group discussions about genetically modified food. In the discussions, members of the general public interact with food biotechnology scientists while negotiating their attitudes towards genetic engineering. Their discussions offer an example of risk discourse in which the use of commonplaces seems to be a central feature (Myers 2004: 81). My analyses support earlier findings that commonplaces serve important interactional purposes (Barton 1999) and that they are used for mitigating disagreement, for closing topics and for facilitating risk discourse (Myers 2005; 2007). In addition, however, I argue that commonplaces are used to mitigate feelings of insecurity caused by uncertainty and to negotiate new codes of moral conduct. Keywords: uncertainty, commonplaces, risk discourse, focus groups, appraisal

  9. Formal representation of complex SNOMED CT expressions

    Directory of Open Access Journals (Sweden)

    Markó Kornél

    2008-10-01

    Abstract. Background: Definitory expressions about clinical procedures, findings and diseases constitute a major benefit of a formally founded clinical reference terminology which is ontologically sound and suited for formal reasoning. SNOMED CT claims to support formal reasoning by description-logic based concept definitions. Methods: On the basis of formal ontology criteria we analyze complex SNOMED CT concepts, such as "Concussion of Brain with(out) Loss of Consciousness", using alternatively full first-order logic and the description logic ℰℒ. Results: Typical complex SNOMED CT concepts, whether including negations or not, can be expressed in full first-order logic. Negations cannot be properly expressed in the description logic ℰℒ underlying SNOMED CT. All concepts the meaning of which implies a temporal scope may be subject to diverging interpretations, which are often unclear in SNOMED CT as their contextual determinants are not made explicit. Conclusion: The description of complex medical occurrents is ambiguous, as the same situations can be described as (i) a complex occurrent C that has A and B as temporal parts, (ii) a simple occurrent A' defined as a kind of A followed by some B, or (iii) a simple occurrent B' defined as a kind of B preceded by some A. As negative statements in SNOMED CT cannot be exactly represented without...

  10. Uncertainty in artificial intelligence

    CERN Document Server

    Levitt, TS; Lemmer, JF; Shachter, RD

    1990-01-01

    Clearly illustrated in this volume is the current relationship between Uncertainty and AI. It has been said that research in AI revolves around five basic questions asked relative to some particular domain: What knowledge is required? How can this knowledge be acquired? How can it be represented in a system? How should this knowledge be manipulated in order to provide intelligent behavior? How can the behavior be explained? In this volume, all of these questions are addressed. From the perspective of the relationship of uncertainty to the basic questions of AI, the book divides naturally i...

  11. Risk, uncertainty in risk, and the EPA release limits for radioactive waste disposal

    International Nuclear Information System (INIS)

    Helton, J.C.

    1993-01-01

    A conceptual model for the organization and execution of a performance assessment of a radioactive waste disposal site, including uncertainty and sensitivity analysis, is described. This model is based on a formal definition of risk as a collection of ordered triples, where the first element in each triple is a set of similar occurrences, the second element is the probability or frequency of the first element, and the third element is a vector of consequences associated with the first element. This division of risk into its three constituent parts provides a useful model for the structure of a performance assessment for several reasons. First, it provides a clear distinction between the major parts of a performance assessment. Second, it provides a way to distinguish between different types of uncertainty, including completeness, aggregation, model selection, imprecisely known variables, and stochastic variation. Third, it leads naturally to the representation of stochastic variation with a complementary cumulative distribution function (CCDF) and the representation of state of knowledge uncertainty with a family or distribution of CCDFs. Fourth, it provides a context in which the U.S. Environmental Protection Agency limits for radioactive releases to the accessible environment can be represented and calculated. The preceding ideas are illustrated with results obtained in a preliminary performance assessment for the Waste Isolation Pilot Plant in southeastern New Mexico
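
    The triple-based definition maps directly onto the CCDF construction: the exceedance frequency at a consequence level is the sum of the frequencies of all scenario sets whose consequence exceeds that level. A minimal sketch with invented scenario data:

    ```python
    import numpy as np

    # Risk as ordered triples (scenario set, frequency per year, consequence).
    # Frequencies and consequence values below are invented for illustration.
    triples = [
        ("nominal operation",  9.0e-1, 0.001),
        ("minor breach",       5.0e-3, 0.1),
        ("borehole intrusion", 1.0e-4, 2.0),
        ("major disruption",   1.0e-6, 40.0),
    ]

    freqs = np.array([t[1] for t in triples])
    cons = np.array([t[2] for t in triples])

    def ccdf(level):
        """Frequency of exceeding a given consequence level (the CCDF)."""
        return freqs[cons > level].sum()

    for level in (0.0, 0.01, 1.0, 10.0):
        print(f"P(consequence > {level:g}) = {ccdf(level):.2e} per year")
    ```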

  12. Being Included and Excluded

    DEFF Research Database (Denmark)

    Korzenevica, Marina

    2016-01-01

    Following the civil war of 1996–2006, there was a dramatic increase in the labor mobility of young men and the inclusion of young women in formal education, which led to the transformation of the political landscape of rural Nepal. Mobility and schooling represent a level of prestige that rural p...

  13. Representation of analysis results involving aleatory and epistemic uncertainty.

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, Jay Dean (ProStat, Mesa, AZ); Helton, Jon Craig (Arizona State University, Tempe, AZ); Oberkampf, William Louis; Sallaberry, Cedric J.

    2008-08-01

    Procedures are described for the representation of results in analyses that involve both aleatory uncertainty and epistemic uncertainty, with aleatory uncertainty deriving from an inherent randomness in the behavior of the system under study and epistemic uncertainty deriving from a lack of knowledge about the appropriate values to use for quantities that are assumed to have fixed but poorly known values in the context of a specific study. Aleatory uncertainty is usually represented with probability and leads to cumulative distribution functions (CDFs) or complementary cumulative distribution functions (CCDFs) for analysis results of interest. Several mathematical structures are available for the representation of epistemic uncertainty, including interval analysis, possibility theory, evidence theory and probability theory. In the presence of epistemic uncertainty, there is not a single CDF or CCDF for a given analysis result. Rather, there is a family of CDFs and a corresponding family of CCDFs that derive from epistemic uncertainty and have an uncertainty structure that derives from the particular uncertainty structure (i.e., interval analysis, possibility theory, evidence theory, probability theory) used to represent epistemic uncertainty. Graphical formats for the representation of epistemic uncertainty in families of CDFs and CCDFs are investigated and presented for the indicated characterizations of epistemic uncertainty.
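
    The family-of-CCDFs structure falls out of a two-loop Monte Carlo sketch: an outer loop samples the epistemically uncertain, fixed-but-poorly-known quantities, and an inner loop over aleatory variation yields one CCDF per outer sample. The toy model and distributions below are assumptions for illustration, using the probability-theory characterization of epistemic uncertainty:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    levels = np.linspace(0.0, 10.0, 50)

    family = []
    for _ in range(20):                       # epistemic loop: poorly known rate k
        k = rng.uniform(0.5, 2.0)             # fixed but unknown -> one CCDF each
        outcomes = rng.exponential(k, 5000)   # aleatory loop: stochastic variation
        family.append([(outcomes > L).mean() for L in levels])

    family = np.array(family)                 # one CCDF per epistemic sample
    # Summaries over the family: pointwise 5th/50th/95th percentile curves.
    print(np.percentile(family, [5, 50, 95], axis=0)[:, :5])
    ```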

  14. Reducing uncertainty about objective functions in adaptive management

    Science.gov (United States)

    Williams, B.K.

    2012-01-01

    This paper extends the uncertainty framework of adaptive management to include uncertainty about the objectives to be used in guiding decisions. Adaptive decision making typically assumes explicit and agreed-upon objectives for management, but allows for uncertainty as to the structure of the decision process that generates change through time. Yet it is not unusual for there to be uncertainty (or disagreement) about objectives, with different stakeholders expressing different views not only about resource responses to management but also about the appropriate management objectives. In this paper I extend the treatment of uncertainty in adaptive management, and describe a stochastic structure for the joint occurrence of uncertainty about objectives as well as models, and show how adaptive decision making and the assessment of post-decision monitoring data can be used to reduce uncertainties of both kinds. Different degrees of association between model and objective uncertainty lead to different patterns of learning about objectives. © 2011.
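
    The model-weight part of this setup is a standard Bayesian update: post-decision monitoring data re-weight the candidate models, and the paper's extension carries a joint weight over model and objective combinations. A hypothetical sketch of the model-weight update alone:

    ```python
    import numpy as np

    # Two candidate models of resource response to a management action.
    # Each entry is the model's predicted mean change; values are invented.
    models = {"density_dependent": 3.0, "density_independent": 5.0}
    weights = {name: 0.5 for name in models}          # prior model probabilities
    sigma = 1.5                                       # observation error (assumed)

    def gaussian_like(obs, pred):
        return np.exp(-0.5 * ((obs - pred) / sigma) ** 2)

    observed_change = 3.4                             # post-decision monitoring datum
    posterior = {m: weights[m] * gaussian_like(observed_change, pred)
                 for m, pred in models.items()}
    total = sum(posterior.values())
    weights = {m: p / total for m, p in posterior.items()}
    print(weights)   # weight shifts toward the density-dependent model
    ```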

  15. Integrating risk analysis and multi-criteria decision support under uncertainty in electricity distribution system asset management

    International Nuclear Information System (INIS)

    Catrinu, M.D.; Nordgard, D.E.

    2011-01-01

    Asset managers in electricity distribution companies generally recognize the need for, and the challenge of, adding structure and a higher degree of formal analysis to increasingly complex asset management decisions. This implies improving present asset management practice by making the best use of available data and expert knowledge, by adopting new methods for risk analysis and decision support, and by finding better ways to document the decisions made. This paper discusses methods for integrating risk analysis and multi-criteria decision support under uncertainty in electricity distribution system asset management. The focus is on how to include the different company objectives and risk analyses in a structured decision framework when deciding how to handle the physical assets of the electricity distribution network. The paper presents an illustrative example of decision support for maintenance and reinvestment strategies based on expert knowledge, simplified risk analyses and multi-criteria decision analysis under uncertainty.

  16. Uncertainty of measurement: an immunology laboratory perspective.

    Science.gov (United States)

    Beck, Sarah C; Lock, Robert J

    2015-01-01

    'Measurement uncertainty of measured quantity values' (ISO 15189) requires that the laboratory shall determine the measurement uncertainty for procedures used to report measured quantity values on patients' samples. Where we have numeric data, measurement uncertainty can be expressed as the standard deviation or as the coefficient of variation. However, in immunology many of the assays are reported either as semi-quantitative (i.e. an antibody titre) or qualitative (positive or negative) results. In the latter context, measuring uncertainty is considerably more difficult. There are, however, strategies which can allow us to minimise uncertainty. A number of parameters can contribute to making measurements uncertain. These include bias, precision, standard uncertainty (expressed as standard deviation or coefficient of variation), sensitivity, specificity, repeatability, reproducibility and verification. Closely linked to these are traceability and standardisation. In this article we explore the challenges presented to immunology with regard to measurement uncertainty. Many of these challenges apply equally to other disciplines working with qualitative or semi-quantitative data.
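
    For the quantitative case, the standard uncertainty from replicates is simply their standard deviation, and the coefficient of variation expresses it relative to the mean; a small example with invented replicate values:

    ```python
    import statistics

    replicates = [12.1, 11.8, 12.4, 12.0, 11.9]  # repeated assay results (invented)
    mean = statistics.mean(replicates)
    sd = statistics.stdev(replicates)             # standard uncertainty
    cv = 100.0 * sd / mean                        # coefficient of variation, %
    print(f"mean={mean:.2f}  u={sd:.2f}  CV={cv:.1f}%")
    ```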

  17. Uncertainties in repository modeling

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, J.R.

    1996-12-31

    The distant future is very difficult to predict. Unfortunately, our regulators are being encouraged to extend their regulatory period from the standard 10,000 years to 1 million years. Such overconfidence is not justified due to uncertainties in dating, calibration, and modeling.

  18. Uncertainties in repository modeling

    International Nuclear Information System (INIS)

    Wilson, J.R.

    1996-01-01

    The distant future is very difficult to predict. Unfortunately, our regulators are being encouraged to extend their regulatory period from the standard 10,000 years to 1 million years. Such overconfidence is not justified due to uncertainties in dating, calibration, and modeling.

  19. Risk, Uncertainty, and Entrepreneurship

    DEFF Research Database (Denmark)

    Koudstaal, Martin; Sloof, Randolph; Van Praag, Mirjam

    2016-01-01

    Theory predicts that entrepreneurs have distinct attitudes toward risk and uncertainty, but empirical evidence is mixed. To better understand these mixed results, we perform a large “lab-in-the-field” experiment comparing entrepreneurs to managers (a suitable comparison group) and employees (n = ...

  20. Schrodinger's Uncertainty Principle?

    Indian Academy of Sciences (India)

    Schrödinger's Uncertainty Principle? - Lilies can be Painted. Rajaram Nityananda. General Article, Resonance – Journal of Science Education, Volume 4, Issue 2, February 1999, pp. 24-26.

  1. Risks, uncertainty, vagueness

    International Nuclear Information System (INIS)

    Haefele, W.; Renn, O.; Erdmann, G.

    1990-01-01

    The notion of 'risk' is discussed in its social and technological contexts, leading to an investigation of the terms factuality, hypotheticality, uncertainty, and vagueness, and to the problems of acceptance and acceptability, especially in the context of political decision finding. (DG)

  2. Uncertainty Quantification in Fatigue Crack Growth Prognosis

    Directory of Open Access Journals (Sweden)

    Shankar Sankararaman

    2011-01-01

    This paper presents a methodology to quantify the uncertainty in fatigue crack growth prognosis, applied to structures with complicated geometry and subjected to variable-amplitude multi-axial loading. Finite element analysis is used to address the complicated geometry and calculate the stress intensity factors. Multi-modal stress intensity factors due to multi-axial loading are combined to calculate an equivalent stress intensity factor using a characteristic plane approach. Crack growth under variable-amplitude loading is modeled using a modified Paris law that includes retardation effects. During cycle-by-cycle integration of the crack growth law, a Gaussian process surrogate model is used to replace the expensive finite element analysis. The effect of different types of uncertainty - physical variability, data uncertainty and modeling errors - on crack growth prediction is investigated. The various sources of uncertainty include, but are not limited to, variability in loading conditions, material parameters, experimental data, and model uncertainty. Three different types of modeling error - crack growth model error, discretization error and surrogate model error - are included in the analysis. The different types of uncertainty are incorporated into the crack growth prediction methodology to predict the probability distribution of crack size as a function of the number of load cycles. The proposed method is illustrated using an application problem: surface cracking in a cylindrical structure.
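
    The cycle-by-cycle integration at the core of such a prognosis can be sketched with the basic Paris law and Monte Carlo sampling of its parameters. The geometry factor, parameter distributions and loading below are simplified stand-ins for the paper's finite element and surrogate machinery:

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    def cycles_to_failure(a0, a_crit, dsigma, C, m, Y=1.12, block=100,
                          max_cycles=5_000_000):
        """Integrate da/dN = C*(dK)^m until the crack reaches a_crit.

        Integration is done in blocks of `block` cycles for speed; the paper
        integrates cycle by cycle with a surrogate replacing the FE analysis.
        """
        a, n = a0, 0
        while a < a_crit and n < max_cycles:
            dK = Y * dsigma * np.sqrt(np.pi * a)   # stress intensity range
            a += C * dK**m * block                 # crack extension per block, m
            n += block
        return n

    # Physical variability: sample uncertain Paris parameters (invented ranges).
    lives = [cycles_to_failure(a0=1e-3, a_crit=10e-3, dsigma=100.0,
                               C=10 ** rng.normal(-11.0, 0.15),
                               m=rng.normal(3.0, 0.05))
             for _ in range(200)]
    print(f"median life {np.median(lives):,.0f} cycles, "
          f"5th percentile {np.percentile(lives, 5):,.0f}")
    ```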

  3. Uncertainty quantification for environmental models

    Science.gov (United States)

    Hill, Mary C.; Lu, Dan; Kavetski, Dmitri; Clark, Martyn P.; Ye, Ming

    2012-01-01

    Environmental models are used to evaluate the fate of fertilizers in agricultural settings (including soil denitrification), the degradation of hydrocarbons at spill sites, and water supply for people and ecosystems in small to large basins and cities—to mention but a few applications of these models. They also play a role in understanding and diagnosing potential environmental impacts of global climate change. The models are typically mildly to extremely nonlinear. The persistent demand for enhanced dynamics and resolution to improve model realism [17] means that lengthy individual model execution times will remain common, notwithstanding continued enhancements in computer power. In addition, high-dimensional parameter spaces are often defined, which increases the number of model runs required to quantify uncertainty [2]. Some environmental modeling projects have access to extensive funding and computational resources; many do not. The many recent studies of uncertainty quantification in environmental model predictions have focused on uncertainties related to data error and sparsity of data, expert judgment expressed mathematically through prior information, poorly known parameter values, and model structure (see, for example, [1,7,9,10,13,18]). Approaches for quantifying uncertainty include frequentist (potentially with prior information [7,9]), Bayesian [13,18,19], and likelihood-based. A few of the numerous methods, including some sensitivity and inverse methods with consequences for understanding and quantifying uncertainty, are as follows: Bayesian hierarchical modeling and Bayesian model averaging; single-objective optimization with error-based weighting [7] and multi-objective optimization [3]; methods based on local derivatives [2,7,10]; screening methods like OAT (one at a time) and the method of Morris [14]; FAST (Fourier amplitude sensitivity testing) [14]; the Sobol' method [14]; randomized maximum likelihood [10]; Markov chain Monte Carlo (MCMC) [10
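
    Of the screening methods mentioned, one-at-a-time (OAT) perturbation is the simplest to sketch: vary each parameter individually around a base point and compare the induced change in the output. The toy model below stands in for an expensive environmental simulation:

    ```python
    import numpy as np

    def model(params):
        # Stand-in for an expensive environmental model run.
        k, theta, d = params
        return k * np.exp(-theta) + 0.1 * d**2

    base = np.array([1.0, 0.5, 2.0])
    names = ["k", "theta", "d"]
    y0 = model(base)

    # OAT screening: perturb each parameter by +10% of its base value.
    for i, name in enumerate(names):
        p = base.copy()
        p[i] *= 1.10
        print(f"{name:>5}: dy = {model(p) - y0:+.4f}")
    ```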

  4. Improvement of Statistical Decisions under Parametric Uncertainty

    Science.gov (United States)

    Nechval, Nicholas A.; Nechval, Konstantin N.; Purgailis, Maris; Berzins, Gundars; Rozevskis, Uldis

    2011-10-01

    A large number of problems in production planning and scheduling, location, transportation, finance, and engineering design require that decisions be made in the presence of uncertainty. Decision-making under uncertainty is a central problem in statistical inference, and has been formally studied in virtually all approaches to inference. The aim of the present paper is to show how the invariant embedding technique, the idea of which belongs to the authors, may be employed in the particular case of finding the improved statistical decisions under parametric uncertainty. This technique represents a simple and computationally attractive statistical method based on the constructive use of the invariance principle in mathematical statistics. Unlike the Bayesian approach, an invariant embedding technique is independent of the choice of priors. It allows one to eliminate unknown parameters from the problem and to find the best invariant decision rule, which has smaller risk than any of the well-known decision rules. To illustrate the proposed technique, application examples are given.

  5. Strategy under uncertainty.

    Science.gov (United States)

    Courtney, H; Kirkland, J; Viguerie, P

    1997-01-01

    At the heart of the traditional approach to strategy lies the assumption that by applying a set of powerful analytic tools, executives can predict the future of any business accurately enough to allow them to choose a clear strategic direction. But what happens when the environment is so uncertain that no amount of analysis will allow us to predict the future? What makes for a good strategy in highly uncertain business environments? The authors, consultants at McKinsey & Company, argue that uncertainty requires a new way of thinking about strategy. All too often, they say, executives take a binary view: either they underestimate uncertainty to come up with the forecasts required by their companies' planning or capital-budgeting processes, or they overestimate it, abandon all analysis, and go with their gut instinct. The authors outline a new approach that begins by making a crucial distinction among four discrete levels of uncertainty that any company might face. They then explain how a set of generic strategies--shaping the market, adapting to it, or reserving the right to play at a later time--can be used in each of the four levels. And they illustrate how these strategies can be implemented through a combination of three basic types of actions: big bets, options, and no-regrets moves. The framework can help managers determine which analytic tools can inform decision making under uncertainty--and which cannot. At a broader level, it offers executives a discipline for thinking rigorously and systematically about uncertainty and its implications for strategy.

  6. On Fitting a Formal Method into Practice

    DEFF Research Database (Denmark)

    Gmehlich, Rainer; Grau, Katrin; Hallerstede, Stefan

    2011-01-01

    ...of industrial use. We report on where Event-B and its tools have succeeded, and where they have not. We also report on advances that were inspired by the case study. Interestingly, the case study was not a pure formal methods problem. In addition to Event-B, it used Problem Frames for capturing requirements. The interaction between the two proved to be crucial for the success of the case study. The heart of the problem was tracing informal requirements from Problem Frames descriptions to formal Event-B models. To a large degree, this issue dictated the approach that had to be used for formal modelling. A dedicated...

  7. Expanding Uncertainty Principle to Certainty-Uncertainty Principles with Neutrosophy and Quad-stage Method

    Directory of Open Access Journals (Sweden)

    Fu Yuhua

    2015-03-01

    The most famous contribution of Heisenberg is the uncertainty principle, but the original uncertainty principle is improper. Considering all the possible situations (including the case that people can create laws) and applying Neutrosophy and the Quad-stage Method, this paper presents "certainty-uncertainty principles" in general form and in variable-dimension fractal form. According to the classification of Neutrosophy, the "certainty-uncertainty principles" can be divided into three principles holding under different conditions: the "certainty principle", namely that a particle's position and momentum can be known simultaneously; the "uncertainty principle", namely that a particle's position and momentum cannot be known simultaneously; and the neutral (fuzzy) "indeterminacy principle", namely that whether or not a particle's position and momentum can be known simultaneously is undetermined. The special cases of the "certainty-uncertainty principles" include the original uncertainty principle and the Ozawa inequality. In addition, in accordance with the original uncertainty principle, discussing a high-speed particle's speed and track with Newtonian mechanics is unreasonable; but according to the "certainty-uncertainty principles", Newtonian mechanics can be used to discuss the problem of gravitational deflection of a photon orbit around the Sun (it gives the same deflection angle as general relativity). Finally, because principles and laws in physics that disregard the principle (law) of conservation of energy may be invalid, the "certainty-uncertainty principles" should be restricted (or constrained) by the principle (law) of conservation of energy, so that they satisfy the principle (law) of conservation of energy.

  8. Uncertainties, confidence ellipsoids and security polytopes in LSA

    Science.gov (United States)

    Grabe, Michael

    1992-05-01

    For a given error model, the uncertainties of, and the couplings between, parameters estimated by a least-squares adjustment (LSA) are formalized. The error model is restricted to normally distributed random errors and to systematic errors that remain constant during measurement, but whose magnitudes and signs are unknown. An outline of the associated, new formalism for estimating measurement uncertainties is sketched as regards its function as a measure of the consistency between theory and experiment. The couplings due to random errors lead to ellipsoids stemming from singular linear mappings of Hotelling's ellipsoids. Those introduced by systematic errors create convex polytopes, so-called security polytopes, which are singular linear mappings of hyperblocks caused by a "worst-case treatment" of systematic errors.
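
    Under these assumptions, random errors propagate through the least-squares mapping via the usual covariance, while constant systematic errors bounded by +/-f propagate worst case through the absolute values of the same mapping, tracing out the security polytope. A sketch with an invented design matrix and bounds:

    ```python
    import numpy as np

    A = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])  # design matrix
    y = np.array([0.9, 2.1, 2.9, 4.2])                              # observations
    sigma = 0.1          # std. dev. of random errors (assumed)
    f = 0.05             # bound on the constant systematic error per observation

    B = np.linalg.inv(A.T @ A) @ A.T      # least-squares mapping, x_hat = B @ y
    x_hat = B @ y

    # Random part: parameter standard deviations from the covariance sigma^2 B B^T.
    u_random = sigma * np.sqrt(np.diag(B @ B.T))
    # Systematic part: worst case over all error vectors with |e_i| <= f,
    # i.e. propagate f through the absolute values of B (a vertex of the polytope).
    u_syst = np.abs(B) @ np.full(len(y), f)

    print("estimates:", x_hat)
    print("random u :", u_random)
    print("systematic bound:", u_syst)
    ```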

  9. Quantum Uncertainty and Fundamental Interactions

    Directory of Open Access Journals (Sweden)

    Tosto S.

    2013-04-01

    The paper proposes a simplified theoretical approach to infer some essential concepts on the fundamental interactions between charged particles and their relative strengths at comparable energies by exploiting quantum uncertainty alone. The worth of the present approach relies on the way of obtaining the results, rather than on the results themselves: concepts today acknowledged as fingerprints of the electroweak and strong interactions appear indeed rooted in the same theoretical frame, which also includes the basic principles of special and general relativity along with the gravity force.

  10. Policy Uncertainty and the US Ethanol Industry

    Directory of Open Access Journals (Sweden)

    Jason P. H. Jones

    2017-11-01

    The Renewable Fuel Standard (RFS2), as implemented, has introduced uncertainty for US ethanol producers and the supporting commodity market. First, the fixed mandate for what is mainly cornstarch-based ethanol has increased feedstock price volatility and exerts a general effect across the agricultural sector. Second, the large discrepancy between the original Energy Independence and Security Act (EISA) intentions and the actual RFS2 implementation for some fuel classes has increased the investment uncertainty facing investors in biofuel production, distribution, and consumption. Here we discuss and analyze the sources of uncertainty and evaluate the effect of potential RFS2 adjustments as they influence these uncertainties. This includes the use of a flexible, production-dependent mandate on cornstarch ethanol. We find that a flexible mandate on cornstarch ethanol, relaxed during drought, could significantly reduce commodity price spikes and alleviate the decline of livestock production in cases of feedstock production shortfalls, but it would increase the risk for ethanol investors.

  11. Radiometer Design Analysis Based Upon Measurement Uncertainty

    Science.gov (United States)

    Racette, Paul E.; Lang, Roger H.

    2004-01-01

    This paper introduces a method for predicting the performance of a radiometer design based on calculating the measurement uncertainty. The variety in radiometer designs and the demand for improved radiometric measurements justify the need for a more general and comprehensive method to assess system performance. Radiometric resolution, or sensitivity, is a figure of merit that has been commonly used to characterize the performance of a radiometer. However, when evaluating the performance of a calibration design for a radiometer, the use of radiometric resolution has limited application. These limitations are overcome by considering instead the measurement uncertainty. A method for calculating measurement uncertainty for a generic radiometer design, including its calibration algorithm, is presented. The result is a generalized technique by which system calibration architectures and design parameters can be studied to optimize instrument performance for given requirements and constraints. Example applications demonstrate the utility of using measurement uncertainty as a figure of merit.

  12. Measurement Errors and Uncertainties Theory and Practice

    CERN Document Server

    Rabinovich, Semyon G

    2006-01-01

    Measurement Errors and Uncertainties addresses the most important problems that physicists and engineers encounter when estimating errors and uncertainty. Building from the fundamentals of measurement theory, the author develops the theory of accuracy of measurements and offers a wealth of practical recommendations and examples of applications. This new edition covers a wide range of subjects, including: - Basic concepts of metrology - Measuring instruments characterization, standardization and calibration - Estimation of errors and uncertainty of single and multiple measurements - Modern probability-based methods of estimating measurement uncertainty. With this new edition, the author completes the development of the new theory of indirect measurements. This theory provides more accurate and efficient methods for processing indirect measurement data. It eliminates the need to calculate the correlation coefficient - a stumbling block in measurement data processing - and offers for the first time a way to obtain...

  13. Charging transient in polyvinyl formal

    Indian Academy of Sciences (India)

    Unknown

    ...tally influences all the transport phenomena and their effects at the electrodes. The different facts, including the weak polar structure of the polymer, the power-law dependence of current on field, the observed value of 'n' and the thermal activation of current over a certain temperature range, indicate that a space charge due to ...

  14. The Intersituational Generality of Formal Thought

    Science.gov (United States)

    Stone, Mary Ann; Ausubel, David P.

    1969-01-01

    Shows that, contrary to Piagetian Theory, formal thought in a variety of subject matters is not possible until sufficient requisite concrete background experience in each content area involved has been attained. (MH)

  15. El Salvador - Non-Formal Skills Development

    Data.gov (United States)

    Millennium Challenge Corporation — The Non-Formal Skills Development Sub-Activity had a budget of $5 million (USD) to provide short-term training to vulnerable populations in El Salvador's Northern...

  16. Transitions from Formal Education to the Workplace

    Science.gov (United States)

    Olson, Joann S.

    2014-01-01

    This chapter frames the transition to adulthood in the context of the moving from formal educational settings to the often less-structured learning that occurs in workplace settings. Although schooling may end, learning continues.

  17. Regla formal de justicia, valores y principios

    OpenAIRE

    López Ruiz, Francisco

    1995-01-01

    SUMMARY: 1. Rationality and the formal rule of justice. 2. The structural differences between norms and principles. 3. Classes of principles. 4. Functions of principles. 5. Principles and the material rationality of law.

  18. Formal specification of human-computer interfaces

    Science.gov (United States)

    Auernheimer, Brent

    1990-01-01

    A high-level formal specification of a human computer interface is described. Previous work is reviewed and the ASLAN specification language is described. Top-level specifications written in ASLAN for a library and a multiwindow interface are discussed.

  19. Toward a formal ontology for narrative

    Directory of Open Access Journals (Sweden)

    Ciotti, Fabio

    2016-03-01

    In this paper the rationale and the first draft of a formal ontology for modeling narrative texts are presented. Building on semiotic and structuralist narratology, and on the work carried out in the late 1980s by Giuseppe Gigliozzi in Italy, the focus of my research is on the concepts of character and of narrative world/space. This formal model is expressed in the OWL 2 ontology language. The main reason to adopt a formal modeling approach is that I consider the purely probabilistic-quantitative methods (now widespread in digital literary studies) inadequate. An ontology, on the one hand, provides a tool for the analysis of strictly literary texts. On the other hand (though beyond the scope of the present work), its formalization can also represent a significant contribution towards grounding the application of storytelling methods outside of scholarly contexts.

  20. Statistical Survey of Non-Formal Education

    Directory of Open Access Journals (Sweden)

    Ondřej Nývlt

    2012-12-01

    ...focused on a programme within a regular education system. Labour market flexibility and new requirements on employees create a new domain of education called non-formal education. Is there a reliable statistical source with a good methodological definition for the Czech Republic? The Labour Force Survey (LFS) has been the basic statistical source for time comparison of non-formal education for the last ten years. Furthermore, a special Adult Education Survey (AES) in 2011 focused on the individual components of non-formal education in a detailed way. In general, the goal of the EU is to use data from both internationally comparable surveys for analyses of particular fields of lifelong learning, in such a way that annual LFS data could be enlarged by detailed information from AES in five-year periods. This article describes the reliability of statistical data about non-formal education. This analysis is usually connected with sampling and non-sampling errors.

  1. Approaches to Formal Verification of Security Protocols

    OpenAIRE

    Lal, Suvansh; Jain, Mohit; Chaplot, Vikrant

    2011-01-01

    In recent times, many protocols have been proposed to provide security for various information and communication systems. Such protocols must be tested for their functional correctness before they are used in practice. Application of formal methods for verification of security protocols would enhance their reliability thereby, increasing the usability of systems that employ them. Thus, formal verification of security protocols has become a key issue in computer and communications security. In...

  2. Improved formalism for precision Higgs coupling fits

    International Nuclear Information System (INIS)

    Barklow, Tim; Peskin, Michael E.; Jung, Sunghoon; Tian, Junping

    2017-08-01

    Future e+e- colliders give the promise of model-independent determinations of the couplings of the Higgs boson. In this paper, we present an improved formalism for extracting Higgs boson couplings from e+e- data, based on the Effective Field Theory description of corrections to the Standard Model. We apply this formalism to give projections of Higgs coupling accuracies for stages of the International Linear Collider and for other proposed e+e- colliders.

  3. Towards a Formal Model of Context Awareness

    DEFF Research Database (Denmark)

    Kjærgaard, Mikkel Baun; Bunde-Pedersen, Jonathan

    2006-01-01

    There is a definite lack of formal support for modeling realistic context-awareness in pervasive computing applications. The CONAWA calculus presented in this paper provides mechanisms for modeling complex and interwoven sets of context-information by extending ambient calculus with new constructs and capabilities. The calculus is a step in the direction of making formal methods applicable in the area of pervasive computing.

  4. Towards a Formal Notion of Trust

    DEFF Research Database (Denmark)

    Nielsen, Mogens; Krukow, Karl Kristian

    2003-01-01

    Trust management systems have been proposed as an alternative to traditional security mechanisms in Global Computing. We present some challenges in establishing a formal foundation for the notion of trust, and some preliminary ideas towards a category of trust models.

  5. Improved formalism for precision Higgs coupling fits

    Science.gov (United States)

    Barklow, Tim; Fujii, Keisuke; Jung, Sunghoon; Karl, Robert; List, Jenny; Ogawa, Tomohisa; Peskin, Michael E.; Tian, Junping

    2018-03-01

    Future e+e- colliders give the promise of model-independent determinations of the couplings of the Higgs boson. In this paper, we present an improved formalism for extracting Higgs boson couplings from e+e- data, based on the effective field theory description of corrections to the Standard Model. We apply this formalism to give projections of Higgs coupling accuracies for stages of the International Linear Collider and for other proposed e+e- colliders.

  6. Formalization of Many-Valued Logics

    DEFF Research Database (Denmark)

    Villadsen, Jørgen; Schlichtkrull, Anders

    2017-01-01

    Partiality is a key challenge for computational approaches to artificial intelligence in general and natural language in particular. Various extensions of classical two-valued logic to many-valued logics have been investigated in order to meet this challenge. We use the proof assistant Isabelle to formalize the syntax and semantics of many-valued logics with determinate as well as indeterminate truth values. The formalization allows for a concise presentation and makes automated verification possible.

  7. Unifying Class-Based Representation Formalisms

    OpenAIRE

    Calvanese, D.; Lenzerini, M.; Nardi, D.

    2011-01-01

    The notion of class is ubiquitous in computer science and is central in many formalisms for the representation of structured knowledge used both in knowledge representation and in databases. In this paper we study the basic issues underlying such representation formalisms and single out both their common characteristics and their distinguishing features. Such investigation leads us to propose a unifying framework in which we are able to capture the fundamental aspects of several representatio...

  8. Probabilistic accident consequence uncertainty analysis -- Late health effects uncertain assessment. Volume 2: Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Little, M.P.; Muirhead, C.R. [National Radiological Protection Board (United Kingdom); Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M. [Delft Univ. of Technology (Netherlands); Harper, F.T. [Sandia National Labs., Albuquerque, NM (United States); Hora, S.C. [Univ. of Hawaii, Hilo, HI (United States)

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA late health effects models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the expert panel on late health effects, (4) short biographies of the experts, and (5) the aggregated results of their responses.
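
    As an aside on the aggregation step mentioned in this record: one common, simple way to combine elicited expert distributions is an equal-weight linear opinion pool. The Python sketch below is illustrative only; the quantile values are hypothetical, and the study's actual aggregation scheme is not specified in this record.

        import numpy as np

        def pool_cdf(expert_quantiles, grid):
            """Equal-weight linear opinion pool over piecewise-linear expert
            CDFs, each expert summarized by elicited 5th/50th/95th percentiles."""
            cdfs = [np.interp(grid, [q05, q50, q95], [0.05, 0.50, 0.95])
                    for q05, q50, q95 in expert_quantiles]
            return np.mean(cdfs, axis=0)

        experts = [(0.1, 1.0, 5.0), (0.3, 2.0, 8.0), (0.2, 1.5, 6.0)]  # hypothetical
        grid = np.linspace(0.0, 10.0, 101)
        pooled = pool_cdf(experts, grid)
        print(grid[np.searchsorted(pooled, 0.5)])  # approximate pooled median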

  9. The role of rating curve uncertainty in real-time flood forecasting

    Science.gov (United States)

    Ocio, David; Le Vine, Nataliya; Westerberg, Ida; Pappenberger, Florian; Buytaert, Wouter

    2017-05-01

    Data assimilation has been widely tested for flood forecasting, although its use in operational systems is mainly limited to a simple statistical error correction. This may be due to the complexity involved in making more advanced formal assumptions about the nature of the model and measurement errors. Recent advances in the definition of rating curve uncertainty allow estimating a flow measurement error that includes both aleatory and epistemic uncertainties more explicitly and rigorously than in the current practice. The aim of this study is to understand the effect such a more rigorous definition of the flow measurement error has on real-time data assimilation and forecasting. This study, therefore, develops a comprehensive probabilistic framework that considers the uncertainty in model forcing data, model structure, and flow observations. Three common data assimilation techniques are evaluated: (1) Autoregressive error correction, (2) Ensemble Kalman Filter, and (3) Regularized Particle Filter, and applied to two locations in the flood-prone Oria catchment in the Basque Country, northern Spain. The results show that, although there is a better match between the uncertain forecasted and uncertain true flows, there is low sensitivity in the threshold exceedances used to issue flood warnings. This suggests that a standard flow measurement error model, with a spread set to a fixed flow fraction, represents a reasonable trade-off between complexity and realism. Standard models are therefore recommended for operational flood forecasting for sites with well-defined stage-discharge curves that are based on a large range of flow observations.
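
    To make the recommended "standard flow measurement error model" concrete, the sketch below perturbs an observed discharge with a Gaussian error whose spread is a fixed fraction of the flow, as one would do when generating observation ensembles for an Ensemble Kalman Filter update. The fraction, flow value, and ensemble size are hypothetical.

        import numpy as np

        rng = np.random.default_rng(42)

        def perturb_observation(q_obs, frac=0.1, n_ens=50):
            """Standard flow measurement error model: a Gaussian error whose
            spread is a fixed fraction of the observed flow (heteroscedastic).
            Returns a perturbed observation ensemble for an EnKF update."""
            return q_obs + rng.normal(0.0, frac * q_obs, size=n_ens)

        obs_ensemble = perturb_observation(q_obs=120.0)   # m3/s, hypothetical
        print(obs_ensemble.mean(), obs_ensemble.std())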

  10. Declarative representation of uncertainty in mathematical models.

    Science.gov (United States)

    Miller, Andrew K; Britten, Randall D; Nielsen, Poul M F

    2012-01-01

    An important aspect of multi-scale modelling is the ability to represent mathematical models in forms that can be exchanged between modellers and tools. While the development of languages like CellML and SBML has provided standardised declarative exchange formats for mathematical models, independent of the algorithm to be applied to the model, to date these standards have not provided a clear mechanism for describing parameter uncertainty. Parameter uncertainty is an inherent feature of many real systems. This uncertainty can result from a number of situations, such as: when measurements include inherent error; when parameters have unknown values and so are replaced by a probability distribution by the modeller; when a model is of an individual from a population, and parameters have unknown values for the individual, but the distribution for the population is known. We present and demonstrate an approach by which uncertainty can be described declaratively in CellML models, by utilising the extension mechanisms provided in CellML. Parameter uncertainty can be described declaratively in terms of either a univariate continuous probability density function or multiple realisations of one variable or several (typically non-independent) variables. We additionally present an extension to SED-ML (the Simulation Experiment Description Markup Language) to describe sampling sensitivity analysis simulation experiments. We demonstrate the usability of the approach by encoding a sample model in the uncertainty markup language, and by developing a software implementation of the uncertainty specification (including the SED-ML extension for sampling sensitivity analyses) in an existing CellML software library, the CellML API implementation. We used the software implementation to run sampling sensitivity analyses over the model to demonstrate that it is possible to run useful simulations on models with uncertainty encoded in this form.
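
    As an illustration of parameter uncertainty expressed as a univariate probability density and consumed by a sampling sensitivity analysis, the sketch below uses a toy exponential-decay model as a stand-in for a CellML model evaluation; the distribution and its parameters are hypothetical.

        import numpy as np

        rng = np.random.default_rng(0)

        # Toy stand-in for a model evaluation: y(t) = exp(-k * t), with the
        # rate constant k declared uncertain (here log-normal).
        def model(k, t):
            return np.exp(-k * t)

        t = np.linspace(0.0, 10.0, 50)
        k_samples = rng.lognormal(mean=np.log(0.3), sigma=0.2, size=1000)
        runs = np.array([model(k, t) for k in k_samples])

        # Sampling sensitivity analysis output: pointwise median and 90% band.
        median = np.percentile(runs, 50, axis=0)
        lo, hi = np.percentile(runs, [5, 95], axis=0)
        print(median[-1], lo[-1], hi[-1])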

  11. Declarative representation of uncertainty in mathematical models.

    Directory of Open Access Journals (Sweden)

    Andrew K Miller

    Full Text Available An important aspect of multi-scale modelling is the ability to represent mathematical models in forms that can be exchanged between modellers and tools. While the development of languages like CellML and SBML has provided standardised declarative exchange formats for mathematical models, independent of the algorithm to be applied to the model, to date these standards have not provided a clear mechanism for describing parameter uncertainty. Parameter uncertainty is an inherent feature of many real systems. This uncertainty can result from a number of situations, such as: when measurements include inherent error; when parameters have unknown values and so are replaced by a probability distribution by the modeller; when a model is of an individual from a population, and parameters have unknown values for the individual, but the distribution for the population is known. We present and demonstrate an approach by which uncertainty can be described declaratively in CellML models, by utilising the extension mechanisms provided in CellML. Parameter uncertainty can be described declaratively in terms of either a univariate continuous probability density function or multiple realisations of one variable or several (typically non-independent) variables. We additionally present an extension to SED-ML (the Simulation Experiment Description Markup Language) to describe sampling sensitivity analysis simulation experiments. We demonstrate the usability of the approach by encoding a sample model in the uncertainty markup language, and by developing a software implementation of the uncertainty specification (including the SED-ML extension for sampling sensitivity analyses) in an existing CellML software library, the CellML API implementation. We used the software implementation to run sampling sensitivity analyses over the model to demonstrate that it is possible to run useful simulations on models with uncertainty encoded in this form.

  12. A Bayesian approach to simultaneously quantify assignments and linguistic uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Chavez, Gregory M [Los Alamos National Laboratory; Booker, Jane M [BOOKER SCIENTIFIC FREDERICKSBURG; Ross, Timothy J [UNM

    2010-10-07

    Subject matter expert assessments can include both assignment and linguistic uncertainty. This paper examines assessments containing linguistic uncertainty associated with a qualitative description of a specific state of interest and the assignment uncertainty associated with assigning a qualitative value to that state. A Bayesian approach is examined to simultaneously quantify both assignment and linguistic uncertainty in the posterior probability. The approach is applied to a simplified damage assessment model involving both assignment and linguistic uncertainty. The utility of the approach and the conditions under which the approach is feasible are examined and identified.
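
    The record does not give the paper's exact formulation, but one hedged way to combine the two uncertainty types is to weight a Bayesian update (assignment uncertainty) by fuzzy memberships of the verbal report (linguistic uncertainty), as sketched below with made-up numbers; the state names and values are hypothetical.

        import numpy as np

        states = ["light", "moderate", "severe"]
        prior = np.array([0.5, 0.3, 0.2])

        # Membership of the expert's verbal report in each state (linguistic).
        membership = np.array([0.2, 0.9, 0.4])

        # Likelihood of the physical evidence under each state (assignment).
        likelihood = np.array([0.1, 0.6, 0.3])

        # One plausible combination: membership-weighted Bayesian update.
        posterior = prior * likelihood * membership
        posterior /= posterior.sum()
        print(dict(zip(states, np.round(posterior, 3))))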

  13. Identification and communication of uncertainties of phenomenological models in PSA

    International Nuclear Information System (INIS)

    Pulkkinen, U.; Simola, K.

    2001-11-01

    This report presents a view on the uncertainty analysis of phenomenological models, with an emphasis on the identification and documentation of the various types of uncertainties and assumptions in the modelling of the phenomena. In an uncertainty analysis, it is essential to include and document all unclear issues in order to obtain maximal coverage of unresolved issues. This holds independently of the nature or type of the issues. The classification of uncertainties is needed in the decomposition of the problem, and it helps in the identification of means for uncertainty reduction. Further, enhanced documentation serves to evaluate the applicability of the results to various risk-informed applications. (au)

  14. Modeling Uncertainty in Climate Change: A Multi-Model Comparison

    Energy Technology Data Exchange (ETDEWEB)

    Gillingham, Kenneth; Nordhaus, William; Anthoff, David; Blanford, Geoffrey J.; Bosetti, Valentina; Christensen, Peter; McJeon, Haewon C.; Reilly, J. M.; Sztorc, Paul

    2015-10-01

    The economics of climate change involves a vast array of uncertainties, complicating both the analysis and development of climate policy. This study presents the results of the first comprehensive study of uncertainty in climate change using multiple integrated assessment models. The study looks at model and parametric uncertainties for population, total factor productivity, and climate sensitivity and estimates the pdfs of key output variables, including CO2 concentrations, temperature, damages, and the social cost of carbon (SCC). One key finding is that parametric uncertainty is more important than uncertainty in model structure. Our resulting pdfs also provide insight on tail events.

  15. Informal work and formal plans

    DEFF Research Database (Denmark)

    Dalsted, Rikke Juul; Hølge-Hazelton, Bibi; Kousgaard, Marius Brostrøm

    2012-01-01

    … trajectories. METHODS AND THEORY: An in-depth case study of patient trajectories at a Danish hospital and surrounding municipality, using individual interviews with patients. Theory about trajectory and work by Strauss was included. RESULTS: Patients continuously took initiatives to organize their treatment … participation. When looking at integrated care from the perspective of patients, the development of a more holistic and personalized approach is needed.

  16. Visualizing Java uncertainty

    Science.gov (United States)

    Knight, Claire; Munro, Malcolm

    2001-07-01

    Distributed component based systems seem to be the immediate future for software development. The use of such techniques, object oriented languages, and the combination with ever more powerful higher-level frameworks has led to the rapid creation and deployment of such systems to cater for the demand of internet and service driven business systems. This diversity of solution, through both the components utilised and the physical/virtual locations of those components, can provide powerful resolutions to the new demand. The problem lies in the comprehension and maintenance of such systems, because they then have inherent uncertainty. The components combined at any given time for a solution may differ, the messages generated, sent, and/or received may differ, and the physical/virtual locations cannot be guaranteed. Trying to account for this uncertainty and to build it into analysis and comprehension tools is important for both development and maintenance activities.

  17. How Uncertain is Uncertainty?

    Science.gov (United States)

    Vámos, Tibor

    The gist of the paper is the fundamentally uncertain nature of all kinds of uncertainties and, consequently, a critical epistemic review of historical and recent approaches, computational methods, and algorithms. The review follows the development of the notion from the beginnings of thinking, via the Aristotelian and Skeptic views, medieval nominalism and the influential pioneering metaphors of ancient India and Persia, to the birth of modern mathematical disciplinary reasoning. Discussing the models of uncertainty, e.g. the statistical and other physical and psychological backgrounds, we reach a pragmatic, model-related estimation perspective: a balanced application orientation for different problem areas. Data mining, game theories and recent advances in approximation algorithms are discussed in this spirit of modest reasoning.

  18. DOD ELAP Lab Uncertainties

    Science.gov (United States)

    2012-03-01

    [Slide-extraction residue; the recoverable points are:] Laboratories certify to quality standards such as ISO 9001 (QMS), ISO 14001 (EMS), and TS 16949 (US automotive). The DoD QSM 4.2 standard is based on ISO/IEC 17025:2005, and each carries its own uncertainty requirements. Uncertainty estimation follows the ISO GUM, as described in "Analytical Measurement Uncertainty Estimation", Defense Technical Information Center #ADA396946, William S. Ingersoll, 2001. Distribution statement: approved for public release; distribution unlimited.

  19. Climate change decision-making: Model & parameter uncertainties explored

    Energy Technology Data Exchange (ETDEWEB)

    Dowlatabadi, H.; Kandlikar, M.; Linville, C.

    1995-12-31

    A critical aspect of climate change decision-making is the uncertainty in current understanding of the socioeconomic, climatic and biogeochemical processes involved. Decision-making processes are much better informed if these uncertainties are characterized and their implications understood. Quantitative analysis of these uncertainties serves to inform decision makers about the likely outcome of policy initiatives, and helps set priorities for research so that the outcome ambiguities faced by the decision-makers are reduced. A family of integrated assessment models of climate change has been developed at Carnegie Mellon. These models are distinguished from other integrated assessment efforts in that they were designed from the outset to characterize and propagate parameter, model, value, and decision-rule uncertainties. The most recent of these models is ICAM 2.1. This model includes representation of the processes of demographics, economic activity, emissions, atmospheric chemistry, climate and sea level change, impacts from these changes, and policies for emissions mitigation and adaptation to change. The model has over 800 objects, of which about one half are used to represent uncertainty. In this paper we show that, when considering parameter uncertainties, the relative contribution of climatic uncertainties is most important, followed by uncertainties in damage calculations, economic uncertainties and direct aerosol forcing uncertainties. When considering model structure uncertainties, we find that the choice of policy is often dominated by the choice of model structure rather than by parameter uncertainties.

  20. Growth uncertainty and risksharing

    OpenAIRE

    Stefano Athanasoulis; Eric Van Wincoop

    1997-01-01

    How large are potential benefits from global risksharing? In order to answer this question we propose a new methodology that is closely connected with the empirical growth literature. We obtain estimates of residual risk (growth uncertainty) at various horizons from regressions of country-specific growth in deviation from world growth on a wide set of variables in the information set. Since this residual risk can be entirely hedged through risksharing, we use it to obtain a measure of the potential benefits from risksharing.

  1. Citizen Candidates Under Uncertainty

    OpenAIRE

    Eguia, Jon X.

    2005-01-01

    In this paper we make two contributions to the growing literature on "citizen-candidate" models of representative democracy. First, we add uncertainty about the total vote count. We show that in a society with a large electorate, where the outcome of the election is uncertain and where winning candidates receive a large reward from holding office, there will be a two-candidate equilibrium and no equilibria with a single candidate. Second, we introduce a new concept of equilibrium, which we term…

  2. 43 CFR 30.235 - What will the judge's decision in a formal probate proceeding contain?

    Science.gov (United States)

    2010-10-01

    Section 30.235 — What will the judge's decision in a formal probate proceeding contain? The judge must decide the … requirements of this section. (a) In all cases, the judge's decision must: (1) Include the name, birth date…

  3. The Law, Policy, and Politics of Formal Hypnosis in the Public Community College Classroom.

    Science.gov (United States)

    Sachs, Steven Mark

    Information from printed sources, legal documents, and interviews with community college administrators formed the basis of an investigation of the legal, policy, and political implications of the use of formal hypnosis as an instructional augmentation in the community college classroom. Study findings included the following: (1) no formal policy…

  4. Anaphora and Logical Form: On Formal Meaning Representations for Natural Language. Technical Report No. 36.

    Science.gov (United States)

    Nash-Webber, Bonnie; Reiter, Raymond

    This paper describes a computational approach to certain problems of anaphora in natural language and argues in favor of formal meaning representation languages (MRLs) for natural language. After presenting arguments in favor of formal meaning representation languages, appropriate MRLs are discussed. Minimal requirements include provisions for…

  5. Calibration Under Uncertainty.

    Energy Technology Data Exchange (ETDEWEB)

    Swiler, Laura Painton; Trucano, Timothy Guy

    2005-03-01

    This report is a white paper summarizing the literature and different approaches to the problem of calibrating computer model parameters in the face of model uncertainty. Model calibration is often formulated as finding the parameters that minimize the squared difference between the model-computed data (the predicted data) and the actual experimental data. This approach does not allow for explicit treatment of uncertainty or error in the model itself: the model is considered the "true" deterministic representation of reality. While this approach does have utility, it is far from an accurate mathematical treatment of the true model calibration problem in which both the computed data and experimental data have error bars. This year, we examined methods to perform calibration accounting for the error in both the computer model and the data, as well as improving our understanding of its meaning for model predictability. We call this approach Calibration under Uncertainty (CUU). This talk presents our current thinking on CUU. We outline some current approaches in the literature, and discuss the Bayesian approach to CUU in detail.
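
    A minimal sketch of the idea, assuming a Gaussian likelihood whose variance combines experimental error with a model-inadequacy term (toy data and a flat prior, not the report's formulation):

        import numpy as np

        x = np.linspace(0.0, 1.0, 10)
        y_obs = 2.0 * x + np.random.default_rng(1).normal(0.0, 0.1, x.size)

        sigma_data, sigma_model = 0.1, 0.05          # assumed error magnitudes
        sigma2 = sigma_data**2 + sigma_model**2      # combined variance

        theta_grid = np.linspace(0.0, 4.0, 401)      # candidate slope values
        log_post = np.array([-0.5 * np.sum((y_obs - th * x) ** 2) / sigma2
                             for th in theta_grid])  # flat prior
        post = np.exp(log_post - log_post.max())
        post /= post.sum()
        print(theta_grid[np.argmax(post)])           # posterior mode near 2.0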

  6. Participation under Uncertainty

    International Nuclear Information System (INIS)

    Boudourides, Moses A.

    2003-01-01

    This essay reviews a number of theoretical perspectives about uncertainty and participation in the present-day knowledge-based society. After discussing the on-going reconfigurations of science, technology and society, we examine how appropriate for policy studies are various theories of social complexity. Post-normal science is such an example of a complexity-motivated approach, which justifies civic participation as a policy response to an increasing uncertainty. But there are different categories and models of uncertainties implying a variety of configurations of policy processes. A particular role in all of them is played by expertise whose democratization is an often-claimed imperative nowadays. Moreover, we discuss how different participatory arrangements are shaped into instruments of policy-making and framing regulatory processes. As participation necessitates and triggers deliberation, we proceed to examine the role and the barriers of deliberativeness. Finally, we conclude by referring to some critical views about the ultimate assumptions of recent European policy frameworks and the conceptions of civic participation and politicization that they invoke

  7. Where does the uncertainty come from? Attributing Uncertainty in Conceptual Hydrologic Modelling

    Science.gov (United States)

    Abu Shoaib, S.; Marshall, L. A.; Sharma, A.

    2015-12-01

    Defining an appropriate forecasting model is a key phase in water resources planning and design. Quantification of uncertainty is an important step in the development and application of hydrologic models. In this study, we examine the dependency of hydrologic model uncertainty on the observed model inputs, defined model structure, parameter optimization identifiability and identified likelihood. We present here a new uncertainty metric, the Quantile Flow Deviation or QFD, to evaluate the relative uncertainty due to each of these sources under a range of catchment conditions. Through the metric, we may identify the potential spectrum of uncertainty and variability in model simulations. The QFD assesses uncertainty by estimating the deviation in flows at a given quantile across a range of scenarios. By using a quantile based metric, the change in uncertainty across individual percentiles can be assessed, thereby allowing uncertainty to be expressed as a function of time. The QFD method can be disaggregated to examine any part of the modelling process including the selection of certain model subroutines or forcing data. Case study results (including catchments in Australia and USA) suggest that model structure selection is vital irrespective of the flow percentile of interest or the catchment being studied. Examining the QFD across various quantiles additionally demonstrates that lower yielding catchments may have greater variation due to selected model structures. By incorporating multiple model structures, it is possible to assess (i) the relative importance of various sources of uncertainty, (ii) how these vary with the change in catchment location or hydrologic regime; and (iii) the impact of the length of available observations in uncertainty quantification.
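
    The record defines the QFD only informally; under one plausible reading (the spread of simulated flows at a given quantile across modelling scenarios), a sketch looks like this, with synthetic flows standing in for model output:

        import numpy as np

        def quantile_flow_deviation(flows_by_scenario, q):
            """One plausible reading of the QFD metric: at flow quantile q,
            measure the spread of simulated flows across modelling scenarios
            (structures, parameters, forcings).
            flows_by_scenario: array of shape (n_scenarios, n_timesteps)."""
            flow_q = np.percentile(flows_by_scenario, q, axis=1)  # per scenario
            return flow_q.max() - flow_q.min()                    # deviation at q

        rng = np.random.default_rng(7)
        sims = rng.lognormal(mean=2.0, sigma=0.4, size=(5, 365))  # hypothetical
        for q in (50, 90, 99):
            print(q, quantile_flow_deviation(sims, q))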

  8. Formal models, languages and applications

    CERN Document Server

    Rangarajan, K; Mukund, M

    2006-01-01

    A collection of articles by leading experts in theoretical computer science, this volume commemorates the 75th birthday of Professor Rani Siromoney, one of the pioneers in the field in India. The articles span the vast range of areas that Professor Siromoney has worked in or influenced, including grammar systems, picture languages and new models of computation.

  9. A Formal Approach to Domain-Oriented Software Design Environments

    Science.gov (United States)

    Lowry, Michael; Philpot, Andrew; Pressburger, Thomas; Underwood, Ian; Lum, Henry, Jr. (Technical Monitor)

    1994-01-01

    This paper describes a formal approach to domain-oriented software design environments, based on declarative domain theories, formal specifications, and deductive program synthesis. A declarative domain theory defines the semantics of a domain-oriented specification language and its relationship to implementation-level subroutines. Formal specification development and reuse is made accessible to end-users through an intuitive graphical interface that guides them in creating diagrams denoting formal specifications. The diagrams also serve to document the specifications. Deductive program synthesis ensures that end-user specifications are correctly implemented. AMPHION has been applied to the domain of solar system kinematics through the development of a declarative domain theory, which includes an axiomatization of JPL's SPICELIB subroutine library. Testing over six months with planetary scientists indicates that AMPHION's interactive specification acquisition paradigm enables users to develop, modify, and reuse specifications at least an order of magnitude more rapidly than manual program development. Furthermore, AMPHION synthesizes one to two page programs consisting of calls to SPICELIB subroutines from these specifications in just a few minutes. Test results obtained by metering AMPHION's deductive program synthesis component are examined. AMPHION has been installed at JPL and is currently undergoing further refinement in preparation for distribution to hundreds of SPICELIB users worldwide. Current work to support end-user customization of AMPHION's specification acquisition subsystem is briefly discussed, as well as future work to enable domain-expert creation of new AMPHION applications through development of suitable domain theories.

  10. Asymmetrical peer interaction and formal operational development: Dialogue dimensions analysis

    Directory of Open Access Journals (Sweden)

    Stepanović-Ilić Ivana

    2015-01-01

    Full Text Available The main goal of the study is to define dialogue dimensions in order to describe the interaction within peer dyads and potentially connect them with formal operations development in the less competent participants. Its significance lies in the rarity of investigations of this subject in the context of formal operations development and in its practical implications regarding peer involvement in the education process. The sample included 316 students aged 12 and 14. The research had an experimental design: pre-test, intervention and post-test. In the pre-test and the post-test phases students solved the formal operations test BLOT. According to the pre-test results, 47 dyads were formed in which less and more competent students jointly solved tasks from BLOT. Their dialogues were coded along 14 dimensions operationalized for this purpose. Correlations between the dialogue dimensions indicate clearly distinguished positive and negative interaction patterns. There are no connections between dialogue dimensions and the progress of less competent adolescents on BLOT in the entire sample, but several are found in the subsamples. Argument exchange seems to be the dialogue feature most encouraging of formal operations development, particularly in older students. This confirms relevant research data and the expectations about peers' constructive role in fostering cognitive development. [Project of the Ministry of Science of the Republic of Serbia, no. 179018: Identification, measurement and development of cognitive and emotional competences important for a society oriented towards European integration]

  11. Funnel plot control limits to identify poorly performing healthcare providers when there is uncertainty in the value of the benchmark.

    Science.gov (United States)

    Manktelow, Bradley N; Seaton, Sarah E; Evans, T Alun

    2016-12-01

    There is an increasing use of statistical methods, such as funnel plots, to identify poorly performing healthcare providers. Funnel plots comprise the construction of control limits around a benchmark, and providers with outcomes falling outside the limits are investigated as potential outliers. The benchmark is usually estimated from observed data, but uncertainty in this estimate is usually ignored when constructing control limits. In this paper, the use of funnel plots in the presence of uncertainty in the value of the benchmark is reviewed for outcomes from a Binomial distribution. Two methods to derive the control limits are shown: (i) prediction intervals; (ii) tolerance intervals. Tolerance intervals formally include the uncertainty in the value of the benchmark while prediction intervals do not. The probability properties of 95% control limits derived using each method were investigated through hypothesised scenarios. Neither prediction intervals nor tolerance intervals produce funnel plot control limits that satisfy the nominal probability characteristics when there is uncertainty in the value of the benchmark. This is not necessarily to say that funnel plots have no role to play in healthcare, but that without the development of intervals satisfying the nominal probability characteristics they must be interpreted with care. © The Author(s) 2014.
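
    For context, conventional funnel plot control limits are typically computed from a normal approximation to the Binomial with the benchmark treated as known, which is the very assumption the paper questions. A sketch with a hypothetical 8% benchmark:

        import numpy as np

        def funnel_limits(p0, n, z=1.96):
            """Conventional funnel plot control limits around a benchmark
            proportion p0 (normal approximation to the Binomial), treating p0
            as known. z=1.96 gives approximate 95% limits."""
            se = np.sqrt(p0 * (1.0 - p0) / n)
            return max(p0 - z * se, 0.0), min(p0 + z * se, 1.0)

        for n in (50, 200, 800, 3200):           # provider volumes, hypothetical
            lo, hi = funnel_limits(p0=0.08, n=n)
            print(n, round(lo, 4), round(hi, 4))

    A tolerance-interval variant would widen these limits to reflect the sampling uncertainty in the estimated benchmark itself.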

  12. Sources of uncertainty in flood inundation maps

    Science.gov (United States)

    Bales, J.D.; Wagner, C.R.

    2009-01-01

    Flood inundation maps typically have been used to depict inundated areas for floods having specific exceedance levels. The uncertainty associated with the inundation boundaries is seldom quantified, in part because not all of the sources of uncertainty are recognized and because data to quantify uncertainty seldom are available. Sources of uncertainty discussed in this paper include hydrologic data used for hydraulic model development and validation, topographic data, and the hydraulic model. The assumption of steady flow, which typically is made to produce inundation maps, has less of an effect on predicted inundation at lower flows than at higher flows, because more time typically is required to inundate areas at high flows than at low flows. Difficulties with establishing reasonable cross sections that do not intersect and that represent water-surface slopes in tributaries contribute additional uncertainties in the hydraulic modelling. As a result, uncertainty in the flood inundation polygons simulated with a one-dimensional model increases with distance from the main channel.

  13. About uncertainties in practical salinity calculations

    Directory of Open Access Journals (Sweden)

    M. Le Menn

    2011-10-01

    Full Text Available In the current state of the art, salinity is a quantity computed from conductivity ratio measurements, with temperature and pressure known at the time of the measurement, using the Practical Salinity Scale algorithm of 1978 (PSS-78). This calculation gives practical salinity values S. The uncertainty expected in PSS-78 values is ±0.002, but no details have ever been given on the method used to work out this uncertainty, or on the error sources to include in this calculation. Following a guide published by the Bureau International des Poids et Mesures (BIPM), and using two independent methods, this paper assesses the uncertainties of salinity values obtained from a laboratory salinometer and from Conductivity-Temperature-Depth (CTD) measurements after laboratory calibration of a conductivity cell. The results show that the part due to the fits of the PSS-78 relations is sometimes as significant as the instrument's. This is particularly the case with CTD measurements, where correlations between variables contribute mainly to decreasing the uncertainty of S, even when the expanded uncertainties of conductivity cell calibrations are for the most part in the order of 0.002 mS cm−1. The relations given here, obtained with the normalized GUM method, allow a real analysis of the uncertainty sources, and they can be used in a more general way, with instruments having different specifications.
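
    The GUM propagation referred to here is the sensitivity-based law of propagation of uncertainty, u²(S) = J C Jᵀ, where correlations enter through the covariance matrix C. The sketch below uses a toy stand-in for the PSS-78 function; the uncertainties and correlation are assumed values, not those of the paper.

        import numpy as np

        def S_fn(x):
            """Toy stand-in for PSS-78: maps conductivity ratio R,
            temperature t and pressure p to practical salinity."""
            R, t, p = x
            return 35.0 * R + 0.01 * (t - 15.0) + 1e-4 * p   # hypothetical

        x0 = np.array([1.0, 15.0, 0.0])                  # operating point
        u = np.array([5e-5, 2e-3, 1.0])                  # standard uncertainties
        corr = np.eye(3); corr[0, 1] = corr[1, 0] = -0.5 # assumed correlation
        C = np.outer(u, u) * corr                        # covariance matrix

        eps = 1e-6                                       # finite-difference step
        J = np.array([(S_fn(x0 + eps * np.eye(3)[i]) - S_fn(x0)) / eps
                      for i in range(3)])                # sensitivity vector
        print(np.sqrt(J @ C @ J))                        # u(S)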

  14. Collision entropy and optimal uncertainty

    OpenAIRE

    Bosyk, G. M.; Portesi, M.; Plastino, A.

    2011-01-01

    We propose an alternative measure of quantum uncertainty for pairs of arbitrary observables in the 2-dimensional case, in terms of collision entropies. We derive the optimal lower bound for this entropic uncertainty relation, which results in an analytic function of the overlap of the corresponding eigenbases. Besides, we obtain the minimum uncertainty states. We compare our relation with other formulations of the uncertainty principle.
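
    Without restating the paper's analytic bound, the relation can be explored numerically: scan pure qubit states and record the minimum of the summed collision entropies for sigma_z and sigma_x measurements. A small sketch:

        import numpy as np

        def H2(p):
            """Collision (Renyi-2) entropy in bits."""
            return -np.log2(np.sum(p ** 2))

        best = np.inf
        for a in np.linspace(0.0, np.pi / 2, 181):
            for b in np.linspace(0.0, 2.0 * np.pi, 181):
                psi = np.array([np.cos(a), np.sin(a) * np.exp(1j * b)])
                pz = np.abs(psi) ** 2                     # sigma_z outcomes
                plus = (psi[0] + psi[1]) / np.sqrt(2.0)   # sigma_x basis
                minus = (psi[0] - psi[1]) / np.sqrt(2.0)
                px = np.array([abs(plus) ** 2, abs(minus) ** 2])
                best = min(best, H2(pz) + H2(px))
        print(best)   # numerical minimum of H2(Z) + H2(X) over pure qubit states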

  15. Failure probability under parameter uncertainty.

    Science.gov (United States)

    Gerrard, R; Tsanakas, A

    2011-05-01

    In many problems of risk analysis, failure is equivalent to the event of a random risk factor exceeding a given threshold. Failure probabilities can be controlled if a decisionmaker is able to set the threshold at an appropriate level. This abstract situation applies, for example, to environmental risks with infrastructure controls; to supply chain risks with inventory controls; and to insurance solvency risks with capital controls. However, uncertainty around the distribution of the risk factor implies that parameter error will be present and the measures taken to control failure probabilities may not be effective. We show that parameter uncertainty increases the probability (understood as expected frequency) of failures. For a large class of loss distributions, arising from increasing transformations of location-scale families (including the log-normal, Weibull, and Pareto distributions), the article shows that failure probabilities can be exactly calculated, as they are independent of the true (but unknown) parameters. Hence it is possible to obtain an explicit measure of the effect of parameter uncertainty on failure probability. Failure probability can be controlled in two different ways: (1) by reducing the nominal required failure probability, depending on the size of the available data set, and (2) by modifying the distribution itself that is used to calculate the risk control. Approach (1) corresponds to a frequentist/regulatory view of probability, while approach (2) is consistent with a Bayesian/personalistic view. We furthermore show that the two approaches are consistent in achieving the required failure probability. Finally, we briefly discuss the effects of data pooling and its systemic risk implications. © 2010 Society for Risk Analysis.
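
    The location-scale invariance claimed in the abstract is easy to check by simulation: set the threshold from a fitted log-normal at a nominal 1% failure probability and measure the realized exceedance frequency. A sketch (the sample size and trial count are arbitrary):

        import numpy as np

        rng = np.random.default_rng(3)

        def realized_failure_prob(mu, sigma, n=30, trials=20000):
            """Set the threshold from n fitted log-losses at a nominal 1%
            failure probability, then measure how often a new loss exceeds it.
            The location-scale structure makes the result independent of the
            true (mu, sigma)."""
            z99 = 2.326   # approx. 99th percentile of the standard normal
            fails = 0
            for _ in range(trials):
                x = rng.normal(mu, sigma, size=n)           # log-losses
                threshold = x.mean() + z99 * x.std(ddof=1)
                fails += rng.normal(mu, sigma) > threshold
            return fails / trials

        print(realized_failure_prob(0.0, 1.0))   # above the 1% nominal
        print(realized_failure_prob(5.0, 2.0))   # same value up to MC noise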

  16. A review of uncertainty research in impact assessment

    International Nuclear Information System (INIS)

    Leung, Wanda; Noble, Bram; Gunn, Jill; Jaeger, Jochen A.G.

    2015-01-01

    This paper examines uncertainty research in Impact Assessment (IA) and the focus of attention of the IA scholarly literature. We do so by first exploring ‘outside’ the IA literature, identifying three main themes of uncertainty research, and then applying these themes to examine the focus of scholarly research on uncertainty ‘inside’ IA. Based on a search of the database Scopus, we identified 134 journal papers published between 1970 and 2013 that address uncertainty in IA, 75% of which were published since 2005. We found that 90% of IA research addressing uncertainty focused on uncertainty in the practice of IA, including uncertainty in impact predictions, models and managing environmental impacts. Notwithstanding early guidance on uncertainty treatment in IA from the 1980s, we found no common, underlying conceptual framework that was guiding research on uncertainty in IA practice. Considerably less attention, only 9% of papers, focused on uncertainty communication, disclosure and decision-making under uncertain conditions, the majority of which focused on the need to disclose uncertainties as opposed to providing guidance on how to do so and effectively use that information to inform decisions. Finally, research focused on theory building for explaining human behavior with respect to uncertainty avoidance constituted only 1% of the IA published literature. We suggest the need for further conceptual framework development for researchers focused on identifying and addressing uncertainty in IA practice; the need for guidance on how best to communicate uncertainties in practice, versus criticizing practitioners for not doing so; research that explores how best to interpret and use disclosures about uncertainty when making decisions about project approvals, and the implications of doing so; and academic theory building and exploring the utility of existing theories to better understand and explain uncertainty avoidance behavior in IA.

  17. A review of uncertainty research in impact assessment

    Energy Technology Data Exchange (ETDEWEB)

    Leung, Wanda, E-mail: wanda.leung@usask.ca [Department of Geography and Planning, University of Saskatchewan, 117 Science Place, Saskatoon, Saskatchewan S7N 5A5 (Canada); Noble, Bram, E-mail: b.noble@usask.ca [Department of Geography and Planning, School of Environment and Sustainability, University of Saskatchewan, 117 Science Place, Saskatoon, Saskatchewan S7N 5A5 (Canada); Gunn, Jill, E-mail: jill.gunn@usask.ca [Department of Geography and Planning, University of Saskatchewan, 117 Science Place, Saskatoon, Saskatchewan S7N 5A5 (Canada); Jaeger, Jochen A.G., E-mail: jochen.jaeger@concordia.ca [Department of Geography, Planning and Environment, Concordia University, 1455 de Maisonneuve W., Suite 1255, Montreal, Quebec H3G 1M8 (Canada); Loyola Sustainability Research Centre, Concordia University, 7141 Sherbrooke W., AD-502, Montreal, Quebec H4B 1R6 (Canada)

    2015-01-15

    This paper examines uncertainty research in Impact Assessment (IA) and the focus of attention of the IA scholarly literature. We do so by first exploring ‘outside’ the IA literature, identifying three main themes of uncertainty research, and then applying these themes to examine the focus of scholarly research on uncertainty ‘inside’ IA. Based on a search of the database Scopus, we identified 134 journal papers published between 1970 and 2013 that address uncertainty in IA, 75% of which were published since 2005. We found that 90% of IA research addressing uncertainty focused on uncertainty in the practice of IA, including uncertainty in impact predictions, models and managing environmental impacts. Notwithstanding early guidance on uncertainty treatment in IA from the 1980s, we found no common, underlying conceptual framework that was guiding research on uncertainty in IA practice. Considerably less attention, only 9% of papers, focused on uncertainty communication, disclosure and decision-making under uncertain conditions, the majority of which focused on the need to disclose uncertainties as opposed to providing guidance on how to do so and effectively use that information to inform decisions. Finally, research focused on theory building for explaining human behavior with respect to uncertainty avoidance constituted only 1% of the IA published literature. We suggest the need for further conceptual framework development for researchers focused on identifying and addressing uncertainty in IA practice; the need for guidance on how best to communicate uncertainties in practice, versus criticizing practitioners for not doing so; research that explores how best to interpret and use disclosures about uncertainty when making decisions about project approvals, and the implications of doing so; and academic theory building and exploring the utility of existing theories to better understand and explain uncertainty avoidance behavior in IA.

  18. Do Orthopaedic Surgeons Acknowledge Uncertainty?

    NARCIS (Netherlands)

    Teunis, Teun; Janssen, Stein; Guitton, Thierry G.; Ring, David; Parisien, Robert

    2016-01-01

    Much of the decision-making in orthopaedics rests on uncertain evidence. Uncertainty is therefore part of our normal daily practice, and yet physician uncertainty regarding treatment could diminish patients' health. It is not known if physician uncertainty is a function of the evidence alone or if…

  19. Uncertainty in Climate Change Research: An Integrated Approach

    Science.gov (United States)

    Mearns, L.

    2017-12-01

    Uncertainty has been a major theme in research regarding climate change from virtually the very beginning, and appropriately characterizing and quantifying uncertainty has been an important aspect of this work. Initially, uncertainties were explored regarding the climate system and how it would react to future forcing. A concomitant area of concern was the future emissions and concentrations of important forcing agents such as greenhouse gases and aerosols. But, of course, we know there are important uncertainties in all aspects of climate change research, not just the climate system and emissions. And as climate change research has become more important and of pragmatic concern as possible solutions to the climate change problem are addressed, exploring all the relevant uncertainties has become more relevant and urgent. More recently, over the past five years or so, uncertainties in impacts models, such as agricultural and hydrological models, have received much more attention, through programs such as AgMIP, and some research in this arena has indicated that the uncertainty in the impacts models can be as great as or greater than that in the climate system. Still, other areas of uncertainty remain underexplored and/or undervalued. This includes uncertainty in vulnerability and governance. Without more thoroughly exploring these last uncertainties, we will likely underestimate important uncertainties, particularly regarding how different systems can successfully adapt to climate change. In this talk I will discuss these different uncertainties and how to combine them to give a complete picture of the total uncertainty individual systems are facing. As part of this, I will discuss how the uncertainty can be successfully managed even if it is fairly large and deep. Part of my argument will be that large uncertainty is not the enemy; rather, false certainty is the true danger.

  20. A Unified Approach for Reporting ARM Measurement Uncertainties Technical Report

    Energy Technology Data Exchange (ETDEWEB)

    Campos, E [Argonne National Lab. (ANL), Argonne, IL (United States); Sisterson, Douglas [Argonne National Lab. (ANL), Argonne, IL (United States)

    2016-12-01

    The U.S. Department of Energy (DOE) Atmospheric Radiation Measurement (ARM) Climate Research Facility is observationally based, and quantifying the uncertainty of its measurements is critically important. With over 300 widely differing instruments providing over 2,500 datastreams, concise expression of measurement uncertainty is quite challenging. The ARM Facility currently provides data and supporting metadata (information about the data or data quality) to its users through a number of sources. Because the continued success of the ARM Facility depends on the known quality of its measurements, the Facility relies on instrument mentors and the ARM Data Quality Office (DQO) to ensure, assess, and report measurement quality. Therefore, an easily accessible, well-articulated estimate of ARM measurement uncertainty is needed. Note that some of the instrument observations require mathematical algorithms (retrievals) to convert a measured engineering variable into a useful geophysical measurement. While those types of retrieval measurements are identified, this study does not address particular methods for retrieval uncertainty. The ARM Facility also provides engineered data products, or value-added products (VAPs), based on multiple instrument measurements; this study does not include uncertainty estimates for those data products. We propose here that a total measurement uncertainty should be calculated as a function of the instrument uncertainty (calibration factors), the field uncertainty (environmental factors), and the retrieval uncertainty (algorithm factors). The study will not expand on methods for computing these uncertainties. Instead, it will focus on the practical identification, characterization, and inventory of the measurement uncertainties already available in the ARM community through the ARM instrument mentors and their ARM instrument handbooks. As a result, this study will address the first steps towards reporting ARM measurement uncertainty.
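
    One simple realization of the proposed decomposition (total uncertainty as a function of instrument, field, and retrieval factors) is combination in quadrature under an independence assumption; the component values below are hypothetical:

        import math

        def total_uncertainty(u_instrument, u_field, u_retrieval):
            """Combine calibration, environmental, and algorithm uncertainties
            in quadrature, assuming they are independent."""
            return math.sqrt(u_instrument**2 + u_field**2 + u_retrieval**2)

        # Hypothetical 1-sigma components for a single geophysical measurement:
        print(total_uncertainty(u_instrument=0.5, u_field=0.3, u_retrieval=0.8))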

  1. Fourier Series Formalization in ACL2(r)

    Directory of Open Access Journals (Sweden)

    Cuong K. Chau

    2015-09-01

    Full Text Available We formalize some basic properties of Fourier series in the logic of ACL2(r), which is a variant of ACL2 that supports reasoning about the real and complex numbers by way of non-standard analysis. More specifically, we extend a framework for formally evaluating definite integrals of real-valued, continuous functions using the Second Fundamental Theorem of Calculus. Our extended framework is also applied to functions containing free arguments. Using this framework, we are able to prove the orthogonality relationships between trigonometric functions, which are the essential properties in Fourier series analysis. The sum rule for definite integrals of indexed sums is also formalized by applying the extended framework along with the First Fundamental Theorem of Calculus and the sum rule for differentiation. The Fourier coefficient formulas of periodic functions are then formalized from the orthogonality relations and the sum rule for integration. Consequently, the uniqueness of Fourier sums is a straightforward corollary. We also present our formalization of the sum rule for definite integrals of infinite series in ACL2(r). Part of this task is to prove the Dini Uniform Convergence Theorem and the continuity of a limit function under certain conditions. A key technique in our proofs of these theorems is to apply the overspill principle from non-standard analysis.
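
    For reference, a standard statement of the orthogonality relations in question, for integers m, n >= 1 on [-pi, pi] (the record does not state the exact interval conventions used in the ACL2(r) development):

        \int_{-\pi}^{\pi} \sin(mx)\,\sin(nx)\,dx = \pi\,\delta_{mn}, \qquad
        \int_{-\pi}^{\pi} \cos(mx)\,\cos(nx)\,dx = \pi\,\delta_{mn}, \qquad
        \int_{-\pi}^{\pi} \sin(mx)\,\cos(nx)\,dx = 0,

    from which the Fourier coefficient formulas follow:

        a_n = \frac{1}{\pi}\int_{-\pi}^{\pi} f(x)\,\cos(nx)\,dx, \qquad
        b_n = \frac{1}{\pi}\int_{-\pi}^{\pi} f(x)\,\sin(nx)\,dx.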

  2. Construction of Formal Ontology on Computer

    Directory of Open Access Journals (Sweden)

    Yelda FIRAT

    2013-01-01

    Full Text Available From a general point of view, models of reality defined by means of a formal language are called formal ontologies. These general-purpose models can also be used in specialized fields of application. Because ontologies are by nature large-scale, they are prone to errors and omissions when generated by human labor, and an ontology formed in such a way is difficult to adapt to a specific application. In this study, an application is introduced that constructs a formal ontology on the computer, avoiding the problems likely to arise from human labor. The application targets the semantic analysis of Turkish as its special field of application, using a corpus-based annotation method within the framework of formal concept analysis. Besides such natural language processing applications, formal ontologies are used considerably in the field of education, as they present information in the form of clear and meaningful semantic structures; the most important reasons for this are the spread of the constructivist approach and the use of concept maps for educational purposes.

  3. Formal modeling of robot behavior with learning.

    Science.gov (United States)

    Kirwan, Ryan; Miller, Alice; Porr, Bernd; Di Prodi, P

    2013-11-01

    We present formal specification and verification of a robot moving in a complex network, using temporal sequence learning to avoid obstacles. Our aim is to demonstrate the benefit of using a formal approach to analyze such a system as a complementary approach to simulation. We first describe a classical closed-loop simulation of the system and compare this approach to one in which the system is analyzed using formal verification. We show that the formal verification has some advantages over classical simulation and finds deficiencies our classical simulation did not identify. Specifically we present a formal specification of the system, defined in the Promela modeling language and show how the associated model is verified using the Spin model checker. We then introduce an abstract model that is suitable for verifying the same properties for any environment with obstacles under a given set of assumptions. We outline how we can prove that our abstraction is sound: any property that holds for the abstracted model will hold in the original (unabstracted) model.

  4. Equational Programming (A Study in Executable Algebraic Formal Specification).

    Science.gov (United States)

    1986-03-05

    … universal algebra extend to OSA, including the existence of initial algebras, completeness of equational deduction, and Birkhoff variety and Malcev quasivariety theorems. [The remainder of the record is OCR residue from the report cover; the recoverable citation is: "Equational Programming (A Study in Executable Algebraic Formal Specification)", Final Report, SRI International, Menlo Park, CA, Computer Science Lab.]

  5. The Formal Approach to Computer Game Rule Development Automation

    OpenAIRE

    Elena, A.

    2009-01-01

    Computer game rules development is one of the weakly automated tasks in game development. This paper gives an overview of an ongoing research project which deals with the automation of rules development for turn-based strategy computer games. Rules are the basic elements of these games. This paper proposes a new approach to automation, including visual formal rules model creation, model verification and model-based code generation.

  6. Enhancing Formal Modelling Tool Support with Increased Automation

    DEFF Research Database (Denmark)

    Lausdahl, Kenneth

    Progress report for the qualification exam of PhD student Kenneth Lausdahl. Initial work on enhancing tool support for the formal method VDM and the concept of unifying an abstract syntax tree with the ability for isolated extensions is described. The tool support includes a connection to UML and a test automation principle based on traces written as a kind of regular expression.

  7. Structuring Formal Requirements Specifications for Reuse and Product Families

    Science.gov (United States)

    Heimdahl, Mats P. E.

    2001-01-01

    In this project we have investigated how formal specifications should be structured to allow for requirements reuse, product family engineering, and ease of requirements change. The contributions of this work include (1) a requirements specification methodology specifically targeted for critical avionics applications, (2) guidelines for how to structure state-based specifications to facilitate ease of change and reuse, and (3) examples from the avionics domain demonstrating the proposed approach.

  8. Analysis of Uncertainty in Dynamic Processes Development of Banks Functioning

    Directory of Open Access Journals (Sweden)

    Aleksei V. Korovyakovskii

    2013-01-01

    Full Text Available The paper offers an approach to estimating the measure of uncertainty in the dynamic processes of bank functioning, using statistical data on the indicators of different banking operations. To calculate the measure of uncertainty, the phase images of the relevant sets of statistical data are considered. It is shown that the form of the phase image of a studied set of statistical data can serve as a basis for estimating the measure of uncertainty in the dynamic processes of bank functioning. A set of analytical characteristics is offered to formalize the definition of the form of the phase image. It is shown that the offered characteristics capture the inequality of changes in the values of the studied sets of statistical data, which is one way in which uncertainty manifests itself in the development of dynamic processes. Invariant estimates of the measure of uncertainty, robust to significant changes in the absolute values of the same indicators across different banks, were obtained. Examples of calculating the measure of uncertainty in the dynamic processes of concrete banks are given.

  9. Principles of Uncertainty

    CERN Document Server

    Kadane, Joseph B

    2011-01-01

    An intuitive and mathematical introduction to subjective probability and Bayesian statistics. An accessible, comprehensive guide to the theory of Bayesian statistics, Principles of Uncertainty presents the subjective Bayesian approach, which has played a pivotal role in game theory, economics, and the recent boom in Markov Chain Monte Carlo methods. Both rigorous and friendly, the book contains: introductory chapters examining each new concept or assumption; just-in-time mathematics -- the presentation of ideas just before they are applied; summary and exercises at the end of each chapter; discussion…

  10. Application of perturbation theory methods to nuclear data uncertainty propagation using the collision probability method

    International Nuclear Information System (INIS)

    Sabouri, Pouya

    2013-01-01

    This thesis presents a comprehensive study of sensitivity/uncertainty analysis for reactor performance parameters (e.g. the k-effective) to the base nuclear data from which they are computed. The analysis starts at the fundamental step, the Evaluated Nuclear Data File and the uncertainties inherently associated with the data they contain, available in the form of variance/covariance matrices. We show that when a methodical and consistent computation of sensitivity is performed, conventional deterministic formalisms can be sufficient to propagate nuclear data uncertainties with the level of accuracy obtained by the most advanced tools, such as state-of-the-art Monte Carlo codes. By applying our developed methodology to three exercises proposed by the OECD (Uncertainty Analysis for Criticality Safety Assessment Benchmarks), we provide insights of the underlying physical phenomena associated with the used formalisms. (author)
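
    The conventional deterministic propagation referred to here is commonly written as the "sandwich rule", var(k_eff) = Sᵀ C S, with S the sensitivity vector of the response to the nuclear data and C their covariance matrix. A minimal numeric sketch with hypothetical values:

        import numpy as np

        S = np.array([0.9, -0.3, 0.15])          # relative sensitivities
        rel_u = np.array([0.02, 0.05, 0.10])     # relative standard deviations
        corr = np.array([[1.0,  0.2,  0.0],
                         [0.2,  1.0, -0.1],
                         [0.0, -0.1,  1.0]])
        C = np.outer(rel_u, rel_u) * corr        # relative covariance matrix

        var_k = S @ C @ S                        # sandwich rule
        print(np.sqrt(var_k))                    # relative uncertainty on k_eff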

  11. A formalized approach to making effective natural resource management decisions for Alaska National Parks

    Science.gov (United States)

    MacCluskie, Margaret C.; Romito, Angela; Peterson, James T.; Lawler, James P.

    2015-01-01

    A fundamental goal of the National Park Service (NPS) is the long-term protection and management of resources in the National Park System. Reaching this goal requires multiple approaches, including the conservation of essential habitats and the identification and elimination of potential threats to biota and habitats. To accomplish these goals, the NPS has implemented the Alaska Region Vital Signs Inventory and Monitoring (I&M) Program to monitor key biological, chemical, and physical components of ecosystems at more than 270 national parks. The Alaska Region has four networks—Arctic, Central, Southeast, and Southwest. By monitoring vital signs over large spatial and temporal scales, park managers are provided with information on the status and trajectory of park resources as well as a greater understanding and insight into the ecosystem dynamics. While detecting and quantifying change is important to conservation efforts, to be useful for formulating remedial actions, monitoring data must explicitly relate to management objectives and be collected in such a manner as to resolve key uncertainties about the dynamics of the system (Nichols and Williams 2006). Formal decision making frameworks (versus more traditional processes described below) allow for the explicit integration of monitoring data into decision making processes to improve the understanding of system dynamics, thereby improving future decisions (Williams 2011).

  12. First order formalism for quantum gravity

    International Nuclear Information System (INIS)

    Gleiser, M.; Holman, R.; Neto, N.P.

    1987-05-01

    We develop a first order formalism for the quantization of gravity. We take as canonical variables both the induced metric and the extrinsic curvature of the (d-1)-dimensional hypersurfaces obtained by the foliation of the d-dimensional spacetime. After solving the constraint algebra we use the Dirac formalism to quantize the theory and obtain a new representation for the Wheeler-DeWitt equation, defined in the functional space of the extrinsic curvature. We also show how to obtain several different representations of the Wheeler-DeWitt equation by considering actions differing by a total divergence. In particular, the intrinsic and extrinsic time approaches appear in a natural way, as do equivalent representations obtained by functional Fourier transforms of appropriate variables. We conclude with some remarks about the construction of the Hilbert space within the first order formalism. 10 refs

  13. Designing for Non-Formal Learning

    DEFF Research Database (Denmark)

    Petersson, Eva

    2008-01-01

    In this presentation, I will introduce my perspective on designing for non-formal learning by elaborating from a position at the juncture of social semiotics and Vygotskian-inspired socio-cultural theories. The embodied complex processes of sign transformation, by way of modes, media, play and engagement, will be discussed. To support my position, I … opportunities in the empowerment of children with impairment, especially in respect of their ‘non-formal learning' potentials. The questioning of how best to use the technology so as to optimise possibilities and overcome constraints in education, therapy and rehabilitation is an aspect of my talk. Additionally, I … semiotic point of view. This is due to the fact that design is a way to configure communicative resources and social interaction (Kress & van Leeuwen, 2001), which, from my position, supports designing for non-formal learning, which is at the core of my research.

  14. Formal analogies in physics teacher education

    DEFF Research Database (Denmark)

    Avelar Sotomaior Karam, Ricardo; Ricardo, Elio

    2012-01-01

    Reasoning by similarities, especially the ones associated with formal aspects, is one of the most valuable sources for the development of physical theories. The essential role of formal analogies in science can be highlighted by the fact that several equations for different physical situations have the exact same appearance. Coulomb's law's similarity with Newton's, Maxwell's application of fluid theory to electromagnetism and Hamilton's optical-mechanical analogy are some among many other examples. These cases illustrate the power of mathematics in providing unifying structures for physics. Despite the relevance of the subject, formal analogies are rarely systematically approached in physics education. In order to discuss this issue with pre-service physics teachers, we planned a lecture and designed a questionnaire with the goal of encouraging them to think about some “coincidences” in well known…

  15. Formal verification of industrial control systems

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    Verification of critical software is a high priority but a challenging task for industrial control systems. For many kinds of problems, testing is not an efficient method. Formal methods, such as model checking appears to be an appropriate complementary method. However, it is not common to use model checking in industry yet, as this method needs typically formal methods expertise and huge computing power. In the EN-ICE-PLC section, we are working on a [methodology][1] and a tool ([PLCverif][2]) to overcome these challenges and to integrate formal verification in the development process of our PLC-based control systems. [1]: http://cern.ch/project-plc-formalmethods [2]: http://cern.ch/plcverif

  16. Automated Formal Verification for PLC Control Systems

    CERN Multimedia

    Fernández Adiego, Borja

    2014-01-01

    Programmable Logic Controllers (PLCs) are devices widely used in industrial control systems. Ensuring that PLC software complies with its specification is a challenging task. Formal verification has become a recommended practice for ensuring the correctness of safety-critical software. However, these techniques are still not widely applied in industry due to the complexity of building formal models that represent the system and of formalizing requirement specifications. We propose a general methodology to perform automated model checking of complex properties expressed in temporal logics (e.g. CTL, LTL) on PLC programs. This methodology is based on an Intermediate Model (IM), meant to transform PLC programs written in any of the languages described in the IEC 61131-3 standard (ST, IL, etc.) into the modeling languages of different verification tools. This approach has been applied to CERN PLC programs, validating the methodology.

  17. User Interface Technology for Formal Specification Development

    Science.gov (United States)

    Lowry, Michael; Philpot, Andrew; Pressburger, Thomas; Underwood, Ian; Lum, Henry, Jr. (Technical Monitor)

    1994-01-01

    Formal specification development and modification are an essential component of the knowledge-based software life cycle. User interface technology is needed to empower end-users to create their own formal specifications. This paper describes the advanced user interface for AMPHION, a knowledge-based software engineering system that targets scientific subroutine libraries. AMPHION is a generic, domain-independent architecture that is specialized to an application domain through a declarative domain theory. Formal specification development and reuse is made accessible to end-users through an intuitive graphical interface that provides semantic guidance in creating diagrams denoting formal specifications in an application domain. The diagrams also serve to document the specifications. Automatic deductive program synthesis ensures that end-user specifications are correctly implemented. The tables that drive AMPHION's user interface are automatically compiled from a domain theory; portions of the interface can be customized by the end-user. The user interface facilitates formal specification development by hiding syntactic details, such as logical notation. It also turns some of the barriers for end-user specification development associated with strongly typed formal languages into active sources of guidance, without restricting advanced users. The interface is especially suited for specification modification. AMPHION has been applied to the domain of solar system kinematics through the development of a declarative domain theory. Testing over six months with planetary scientists indicates that AMPHION's interactive specification acquisition paradigm enables users to develop, modify, and reuse specifications at least an order of magnitude more rapidly than manual program development.

  18. Probabilistic Mass Growth Uncertainties

    Science.gov (United States)

    Plumer, Eric; Elliott, Darren

    2013-01-01

    Mass has been widely used as a variable input parameter for Cost Estimating Relationships (CERs) for space systems. As these space systems progress from early concept studies and drawing boards to the launch pad, their masses tend to grow substantially, adversely affecting a primary input to most modeling CERs. Modeling and predicting mass uncertainty, based on historical and analogous data, is therefore critical and is an integral part of modeling cost risk. This paper presents the results of an ongoing NASA effort to publish mass growth datasheets for adjusting single-point Technical Baseline Estimates (TBEs) of masses of space instruments as well as spacecraft, for both Earth-orbiting and deep-space missions at various stages of a project's lifecycle. The paper also discusses the long-term strategy of NASA Headquarters of publishing similar results, using a variety of cost-driving metrics, on an annual basis. The paper provides quantitative results showing decreasing mass growth uncertainties as mass estimate maturity increases. The analysis is based on historical data obtained from the NASA Cost Analysis Data Requirements (CADRe) database.

  19. Investment, regulation, and uncertainty

    Science.gov (United States)

    Smyth, Stuart J; McDonald, Jillian; Falck-Zepeda, Jose

    2014-01-01

    As with any technological innovation, time refines the technology, improving upon the original version of the innovative product. The initial GM crops had single traits for either herbicide tolerance or insect resistance. Current varieties have both of these traits stacked together, and in many cases other abiotic and biotic traits have also been stacked. This innovation requires investment. While this is relatively straightforward, certain conditions need to exist such that investments can be facilitated. The principal requirement for investment is that regulatory frameworks render consistent and timely decisions. If the certainty of regulatory outcomes weakens, the potential for changes in investment patterns increases. This article provides a summary background to the leading plant breeding technologies that are either currently being used to develop new crop varieties or are in the pipeline to be applied to plant breeding within the next few years. Challenges for existing regulatory systems are highlighted. Utilizing an option value approach from the investment literature, an assessment of uncertainty regarding the regulatory approval for these varying techniques is undertaken. This research highlights which technology development options have the greatest degree of uncertainty and hence which ones might be expected to see an investment decline. PMID:24499745

  20. Oil price uncertainty in Canada

    International Nuclear Information System (INIS)

    Elder, John; Serletis, Apostolos

    2009-01-01

    Bernanke [Bernanke, Ben S. Irreversibility, uncertainty, and cyclical investment. Quarterly Journal of Economics 98 (1983), 85-106.] shows how uncertainty about energy prices may induce optimizing firms to postpone investment decisions, thereby leading to a decline in aggregate output. Elder and Serletis [Elder, John and Serletis, Apostolos. Oil price uncertainty.] find empirical evidence that uncertainty about oil prices has tended to depress investment in the United States. In this paper we assess the robustness of these results by investigating the effects of oil price uncertainty in Canada. Our results are remarkably similar to existing results for the United States, providing additional evidence that uncertainty about oil prices may provide another explanation for why the sharp oil price declines of 1985 failed to produce rapid output growth. Impulse-response analysis suggests that uncertainty about oil prices may tend to reinforce the negative response of output to positive oil shocks. (author)

  1. A PCE-based multiscale framework for the characterization of uncertainties in complex systems

    Science.gov (United States)

    Mehrez, Loujaine; Fish, Jacob; Aitharaju, Venkat; Rodgers, Will R.; Ghanem, Roger

    2018-02-01

    This paper presents a framework for the modeling and analysis of material systems that exhibit uncertainties in their constituents at all scales. The framework integrates multiscale formalism with a polynomial chaos construction enabling an explicit representation of quantities of interest, at any scale, in terms of any form of underlying uncertain parameters, a key feature for modeling multiscale dependencies. It is demonstrated how the framework can successfully tackle settings where a hierarchy of scales must be explicitly modeled. The application of this framework is illustrated in the construction of stochastic models of mesoscale and macroscale properties of non-crimp fabric composites. Joint statistical properties of upscaled components of the composite, including properties of tow, laminae and laminate, are computed.
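
    As a reminder of the underlying construction (the generic polynomial chaos expansion, not the paper's specific multiscale coupling), a quantity of interest Q is represented in orthogonal polynomials of the uncertain parameters:

        Q(\boldsymbol{\xi}) \;\approx\; \sum_{k=0}^{P} q_k\, \Psi_k(\boldsymbol{\xi}),

    where the \Psi_k are orthogonal with respect to the probability measure of \boldsymbol{\xi}, so the coefficients carry the statistics directly (mean q_0, variance \sum_{k\ge 1} q_k^2 \langle \Psi_k^2 \rangle).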

  2. Educación no formal

    Science.gov (United States)

    Tignanelli, H.

    This communication reviews the main contributions made in the field of astronomy education at the primary, secondary, and tertiary levels, as a starting point for discussing the current place of astronomical content within the new curricula of the EGB (Educación General Básica) and Polimodal levels of the Educational Reform. In particular, the scope of formal and non-formal education, its importance for the training of teachers, and future perspectives are discussed.

  3. Towards formalization of inspection using petrinets

    International Nuclear Information System (INIS)

    Javed, M.; Naeem, M.; Bahadur, F.; Wahab, A.

    2014-01-01

    Achieving better quality software has always been a challenge for software developers. Inspection is one of the most efficient techniques for ensuring the quality of software during its development. To the best of our knowledge, current inspection techniques are not backed by any formal approach. In this paper, we propose an inspection technique that is not only backed by the formal mathematical semantics of Petri nets, but also supports inspecting concurrent processes. We use a case study of an agent-based distributed processing system to demonstrate the inspection of concurrent processes. (author)

  4. Formal Concept Analysis for Information Retrieval

    OpenAIRE

    Qadi, Abderrahim El; Aboutajedine, Driss; Ennouary, Yassine

    2010-01-01

    In this paper we describe a mechanism to improve Information Retrieval (IR) on the web. The method is based on Formal Concept Analysis (FCA), which makes semantic relations explicit during queries and allows reorganizing, in the shape of a lattice of concepts, the answers provided by a search engine. We propose for IR an incremental algorithm based on the Galois lattice. This algorithm allows a formal clustering of the data sources, and the results it returns are classified by ...

  5. The UML as a Formal Modeling Notation

    OpenAIRE

    Evans, Andy; France, Robert; Lano, Kevin; Rumpe, Bernhard

    2014-01-01

    The Unified Modeling Language (UML) is rapidly emerging as a de-facto standard for modelling OO systems. Given this role, it is imperative that the UML have a well-defined, fully explored semantics. Such semantics is required in order to ensure that UML concepts are precisely stated and defined. In this paper we motivate an approach to formalizing UML in which formal specification techniques are used to gain insight into the semantics of UML notations and diagrams, and describe a roadmap for ...

  6. Comparing formal verification approaches of interlocking systems

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Nguyen, Hoang Nga; Roggenbach, Markus

    2016-01-01

    The verification of railway interlocking systems is a challenging task, and therefore several research groups have suggested to improve this task by using formal methods, but they use different modelling and verification approaches. To advance this research, there is a need to compare these approaches. As a first step towards this, in this paper we suggest a way to compare different formal approaches for verifying designs of route-based interlocking systems, and we demonstrate it on modelling and verification approaches developed within the research groups at DTU/Bremen and at Surrey…

  7. Formal Institutions and Subjective Well-Being

    DEFF Research Database (Denmark)

    Bjørnskov, Christian; Dreher, Axel; Fischer, Justina

    A long tradition in economics explores the association between the quality of formal institutions and economic performance. The literature on the relationship between such institutions and happiness is, however, rather limited. In this paper, we revisit the findings from recent cross-country studies…

  8. Formalizing the Problem of Music Description

    DEFF Research Database (Denmark)

    Sturm, Bob L.; Bardeli, Rolf; Langlois, Thibault

    2015-01-01

    The lack of a formalism for “the problem of music description” results in, among other things: ambiguity in what problem a music description system must address, how it should be evaluated, what criteria define its success, and the paradox that a music description system can reproduce the “ground truth” of a music dataset without attending to the music it contains. To address these issues, we formalize the problem of music description such that all elements of an instance of it are made explicit. This can thus inform the building of a system, and how it should be evaluated in a meaningful way…

  9. Formal Modeling and Analysis of Timed Systems

    DEFF Research Database (Denmark)

    Larsen, Kim Guldstrand; Niebert, Peter

    This book constitutes the thoroughly refereed post-proceedings of the First International Workshop on Formal Modeling and Analysis of Timed Systems, FORMATS 2003, held in Marseille, France in September 2003. The 19 revised full papers presented together with an invited paper and the abstracts of two invited talks were carefully selected from 36 submissions during two rounds of reviewing and improvement. All current aspects of formal methods for modeling and analyzing timed systems are addressed; among the timed systems dealt with are timed automata, timed Petri nets, max-plus algebras, real-time systems, discrete time systems, timed languages, and real-time operating systems.

  10. A Formal Model For Declarative Workflows

    DEFF Research Database (Denmark)

    Mukkamala, Raghava Rao

    …the declarative nature of the projected graphs (which are also DCR graphs). We have also provided semantics for distributed executions based on synchronous communication among a network of projected graphs and proved that global and distributed executions are equivalent. Further, to support modeling of processes using DCR Graphs and to make the formal model available to a wider audience, we have developed prototype tools for specification and a workflow engine for the execution of DCR Graphs. We have also developed tools interfacing the SPIN model checker to formally verify safety and liveness properties on DCR Graphs.

  11. Augmenting Reality and Formality of Informal and Non-Formal Settings to Enhance Blended Learning

    Science.gov (United States)

    Pérez-Sanagustin, Mar; Hernández-Leo, Davinia; Santos, Patricia; Kloos, Carlos Delgado; Blat, Josep

    2014-01-01

    Visits to museums and city tours have been part of higher and secondary education curriculum activities for many years. However, these activities are typically considered "less formal" when compared to those carried out in the classroom, mainly because they take place in informal or non-formal settings. Augmented Reality (AR) technologies…

  12. The Dynamics of Formal Organization: Essays on bureaucracy and formal rules

    NARCIS (Netherlands)

    S.E. Osadchiy (Sergey)

    2011-01-01

    Theories of bureaucracy in organization studies constitute a perspective in which formal or written rules are seen as fundamental to the understanding of organization. It is argued, for example, that formal rules facilitate organizational decision-making, establish the basis for…

  13. Lending Policies of Informal, Formal, and Semi-formal Lenders: Evidence from Vietnam

    NARCIS (Netherlands)

    Lensink, B.W.; Pham, T.T.T.

    2007-01-01

    This paper compares lending policies of formal, informal and semiformal lenders with respect to household lending in Vietnam. The analysis suggests that the probability of using formal or semiformal credit increases if borrowers provide collateral, a guarantor and/or borrow for business-related…

  14. A violation of the uncertainty principle implies a violation of the second law of thermodynamics.

    Science.gov (United States)

    Hänggi, Esther; Wehner, Stephanie

    2013-01-01

    Uncertainty relations state that there exist certain incompatible measurements whose outcomes cannot be simultaneously predicted. While the exact incompatibility of quantum measurements dictated by such uncertainty relations can be inferred from the mathematical formalism of quantum theory, the question remains whether there is any more fundamental reason for the uncertainty relations to have this exact form. What, if any, would be the operational consequences if we were able to go beyond any of these uncertainty relations? Here we give a strong argument that justifies uncertainty relations in quantum theory by showing that violating them implies that it is also possible to violate the second law of thermodynamics. More precisely, we show that violating the uncertainty relations in quantum mechanics leads to a thermodynamic cycle with positive net work gain, which is very unlikely to exist in nature.
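
    For concreteness, one standard way of writing such a relation is the entropic form of Maassen and Uffink (quoted here as illustrative background; the paper itself works with fine-grained uncertainty relations):

        H(X) + H(Z) \;\ge\; \log_2 \frac{1}{c}, \qquad c = \max_{x,z} \left|\langle x | z \rangle\right|^2,

    where H denotes the Shannon entropy of the outcomes of measurements X and Z on the same state, and c is the maximal overlap of their eigenbases.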

  15. Estimating Coastal Digital Elevation Model (DEM) Uncertainty

    Science.gov (United States)

    Amante, C.; Mesick, S.

    2017-12-01

    Integrated bathymetric-topographic digital elevation models (DEMs) are representations of the Earth's solid surface and are fundamental to the modeling of coastal processes, including tsunami, storm surge, and sea-level rise inundation. Deviations in elevation values from the actual seabed or land surface constitute errors in DEMs, which originate from numerous sources, including: (i) the source elevation measurements (e.g., multibeam sonar, lidar), (ii) the interpolative gridding technique (e.g., spline, kriging) used to estimate elevations in areas unconstrained by source measurements, and (iii) the datum transformation used to convert bathymetric and topographic data to common vertical reference systems. The magnitude and spatial distribution of the errors from these sources are typically unknown, and the lack of knowledge regarding these errors represents the vertical uncertainty in the DEM. The National Oceanic and Atmospheric Administration (NOAA) National Centers for Environmental Information (NCEI) has developed DEMs for more than 200 coastal communities. This study presents a methodology developed at NOAA NCEI to derive accompanying uncertainty surfaces that estimate DEM errors at the individual cell-level. The development of high-resolution (1/9th arc-second), integrated bathymetric-topographic DEMs along the southwest coast of Florida serves as the case study for deriving uncertainty surfaces. The estimated uncertainty can then be propagated into the modeling of coastal processes that utilize DEMs. Incorporating the uncertainty produces more reliable modeling results, and in turn, better-informed coastal management decisions.
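
    The cell-level error budget described above suggests a simple aggregation when the contributing sources can be treated as independent. The sketch below is a hedged illustration of that idea only; the function name and the root-sum-of-squares assumption are ours, not NOAA NCEI's published algorithm:

        import numpy as np

        def dem_uncertainty(source_u, interp_u, datum_u):
            """Combine independent per-cell error sources into a total
            standard uncertainty surface (root sum of squares)."""
            return np.sqrt(source_u**2 + interp_u**2 + datum_u**2)

        # Example: 2x2 grid of per-cell standard uncertainties (metres)
        source_u = np.array([[0.10, 0.15], [0.30, 0.25]])  # sonar/lidar
        interp_u = np.array([[0.05, 0.40], [0.20, 0.10]])  # gridding
        datum_u  = np.full((2, 2), 0.07)                   # datum transform
        print(dem_uncertainty(source_u, interp_u, datum_u))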

  16. Conveying uncertainty in prognosis to patients with ESRD.

    Science.gov (United States)

    Parvez, Sanah; Abdel-Kader, Khaled; Song, Mi-Kyung; Unruh, Mark

    2015-01-01

    Prognosis is a component of medical practice imbued with uncertainty. In nephrology, where mortality rates of elderly patients on dialysis are comparable to those of cancer patients, the implications of prognosis are unavoidable. Yet while most patients with end-stage renal disease (ESRD) desire to hear their prognosis, many nephrologists balk at this prospect in part owing to the uncertainty inherent in prognostic estimates. In this review, the concept of 'uncertainty' in clinical practice is considered from physician and patient perspectives. From the training perspective, providers learn that uncertainty is inescapable in medicine and develop strategies to manage its presence, including the avoidance of communicating uncertainty to their patients. This presages infrequent discussions of prognosis, which in turn influence patient preferences for treatments that have little therapeutic benefits. A general approach to conveying prognostic uncertainty to ESRD patients includes confronting our own emotional reaction to uncertainty, learning how to effectively communicate uncertainty to our patients, and using an effective interdisciplinary team approach to demonstrate an ongoing commitment to our patients despite the presence of prognostic uncertainty. Uncertainty in prognosis is inevitable. Once providers learn to incorporate it into their discussions of prognosis and collaborate with their ESRD patients, such discussions can foster trust and reduce anxiety for both sides. © 2015 S. Karger AG, Basel.

  17. A Survey of Formal Methods for Intelligent Swarms

    Science.gov (United States)

    Truszkowski, Walt; Rash, James; Hinchey, Mike; Rouff, Christopher A.

    2004-01-01

    …cutting edge in system correctness, and requires higher levels of assurance than other (traditional) missions that use a single spacecraft or a small number of spacecraft that are deterministic in nature and have near-continuous communication access. One of the highest possible levels of assurance comes from the application of formal methods. Formal methods are mathematics-based tools and techniques for specifying and verifying (software and hardware) systems. They are particularly useful for specifying complex parallel systems, such as those exemplified by the ANTS mission, where the entire system is difficult for a single person to fully understand, a problem that is multiplied with multiple developers. Once written, a formal specification can be used to prove properties of a system (e.g., that the underlying system will go from one state to another, or never enter a specific state) and to check for particular types of errors (e.g., race or livelock conditions). A formal specification can also be used as input to a model checker for further validation. This report gives the results of a survey of formal methods techniques for the verification and validation of space missions that use swarm technology. Multiple formal methods were evaluated to determine their effectiveness in modeling and assuring the behavior of swarms of spacecraft, using the ANTS mission as an example system. This report is the first result of the project to determine formal approaches that are promising for formally specifying swarm-based systems. From this survey, the most promising approaches were selected and are discussed relative to their possible application to the ANTS mission. Future work will include the application of an integrated approach, based on the selected approaches identified in this report, to the formal specification of the ANTS mission.

  18. Estimating uncertainty of inference for validation

    Energy Technology Data Exchange (ETDEWEB)

    Booker, Jane M [Los Alamos National Laboratory; Langenbrunner, James R [Los Alamos National Laboratory; Hemez, Francois M [Los Alamos National Laboratory; Ross, Timothy J [UNM

    2010-09-30

    We present a validation process based upon the concept that validation is an inference-making activity. This has always been true, but the association has not been as important before as it is now. Previously, theory had been confirmed by more data, and predictions were possible based on data. The process today is to infer from theory to code and from code to prediction, making the role of prediction somewhat automatic, a machine function. Validation is defined as determining the degree to which a model and code is an accurate representation of experimental test data. Embedded in validation is the intention to use the computer code to predict. To predict is to accept the conclusion that an observable final state will manifest; therefore, prediction is an inference whose goodness relies on the validity of the code. Quantifying the uncertainty of a prediction amounts to quantifying the uncertainty of validation, and this involves the characterization of uncertainties inherent in theory/models/codes and the corresponding data. An introduction to inference making and its associated uncertainty is provided as a foundation for the validation problem. A mathematical construction for estimating the uncertainty in the validation inference is then presented, including a possibility distribution constructed to represent the inference uncertainty for validation under uncertainty. The estimation of inference uncertainty for validation is illustrated using data and calculations from Inertial Confinement Fusion (ICF). The ICF measurements of neutron yield and ion temperature were obtained for direct-drive inertial fusion capsules at the Omega laser facility. The glass capsules, containing the fusion gas, were systematically selected with the intent of establishing a reproducible baseline of high-yield 10^13–10^14 neutron output. The deuterium-tritium ratio in these experiments was varied to study its influence upon yield. This paper on validation inference is the…
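
    As a hedged illustration of the possibility-theoretic construction mentioned above (a generic triangular possibility distribution and its alpha-cuts, with invented parameters, not the specific distribution built in the paper):

        import numpy as np

        def triangular_possibility(x, low, mode, high):
            """Possibility (membership) of x under a triangular distribution."""
            x = np.asarray(x, dtype=float)
            left = np.clip((x - low) / (mode - low), 0.0, 1.0)
            right = np.clip((high - x) / (high - mode), 0.0, 1.0)
            return np.minimum(left, right)

        def alpha_cut(alpha, low, mode, high):
            """Interval of values whose possibility is at least alpha."""
            return (low + alpha * (mode - low), high - alpha * (high - mode))

        # Inference uncertainty for, e.g., a predicted yield (arbitrary units)
        print(triangular_possibility([0.9, 1.0, 1.2], 0.8, 1.0, 1.4))
        print(alpha_cut(0.5, 0.8, 1.0, 1.4))  # -> (0.9, 1.2)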

  19. Uncertainty in ocean mass trends from GRACE

    Science.gov (United States)

    Quinn, Katherine J.; Ponte, Rui M.

    2010-05-01

    Ocean mass, together with steric sea level, are the key components of total observed sea level change. Monthly observations from the Gravity Recovery and Climate Experiment (GRACE) can provide estimates of the ocean mass component of the sea level budget, but full use of the data requires a detailed understanding of its errors and biases. We have examined trends in ocean mass calculated from 6 yr of GRACE data and found differences of up to 1 mm yr⁻¹ between estimates derived from different GRACE processing centre solutions. In addition, variations in post-processing masking and filtering procedures required to convert the GRACE data into ocean mass lead to trend differences of up to 0.5 mm yr⁻¹. Necessary external model adjustments add to these uncertainties, with reported postglacial rebound corrections differing by as much as 1 mm yr⁻¹. Disagreement in the regional trends between the GRACE processing centres is most noticeable in areas south of Greenland, and in the southeast and northwest Pacific Ocean. Non-ocean signals, such as in the Indian Ocean due to the 2004 Sumatra–Andaman earthquake, and near Greenland and West Antarctica due to land signal leakage, can also corrupt the ocean trend estimates. Based on our analyses, formal errors may not capture the true uncertainty in either regional or global ocean mass trends derived from GRACE.
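
    To make the "formal error" caveat concrete, here is a minimal sketch (our own illustration, with made-up numbers) of fitting a linear trend to a monthly ocean-mass series and reporting its formal standard error, which ignores the correlated and systematic errors discussed above:

        import numpy as np

        rng = np.random.default_rng(0)
        t = np.arange(72) / 12.0                      # 6 years of monthly epochs
        series = 1.5 * t + rng.normal(0, 2, t.size)   # mm; true trend 1.5 mm/yr

        # Least-squares trend and its formal (covariance-based) standard error
        coeffs, cov = np.polyfit(t, series, 1, cov=True)
        trend, trend_se = coeffs[0], np.sqrt(cov[0, 0])
        print(f"trend = {trend:.2f} +/- {trend_se:.2f} mm/yr (formal error only)")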

  20. Earthquake Loss Estimation Uncertainties

    Science.gov (United States)

    Frolova, Nina; Bonnin, Jean; Larionov, Valery; Ugarov, Aleksander

    2013-04-01

    The paper addresses the reliability of loss assessment following strong earthquakes when worldwide systems are applied in emergency mode. Timely and correct action just after an event can yield significant benefits in saving lives; in this case, information about possible damage and the expected number of casualties is critical for decisions about search and rescue operations and offers of humanitarian assistance. Such rough information may be provided, first of all, by global systems operating in emergency mode. The experience of earthquake disasters in different earthquake-prone countries shows that the officials in charge of emergency response at national and international levels often lack prompt and reliable information on the scope of a disaster. Uncertainties in the parameters used in the estimation process are numerous and large: knowledge about the physical phenomena and uncertainties in the parameters used to describe them; the overall adequacy of modeling techniques to the actual physical phenomena; the actual distribution of the population at risk at the very time of the shaking (with respect to the immediate threat: buildings or the like); knowledge about the source of the shaking; and so on. One need not be a specialist to understand, for example, that the way a given building responds to a given shaking obeys mechanical laws that are poorly known (if not beyond the reach of engineers for a large portion of the building stock); while a carefully engineered modern building behaves approximately predictably, this is far from the case for older buildings, which make up the bulk of inhabited buildings. How the population inside the buildings at the time of shaking is affected by the physical damage caused to the buildings is, by far, not precisely known. The paper analyzes the influence of uncertainties in strong-event parameters determined by alert seismological surveys and of the simulation models used at all stages, from estimating shaking intensity…

  1. Formalizing Real-Time Embedded System into Promela

    Directory of Open Access Journals (Sweden)

    Sukvanich Punwess

    2015-01-01

    We propose an alternative formalization of real-time embedded systems as Promela models. The proposed formal model supports the essential features of real-time embedded systems, including resource-constraint handling, task prioritization, task synchronization, real-time preemption, and parallelism of resources via DMA. The model is also fully compatible with the partial-order reduction algorithm for model checking. In our approach, the timed automata of the real-time embedded system are transformed into Promela by replacing time ticking with a repeated cycle over the timed values, whose conditional guards enable synchronization among the whole system's operations. Our modeling approach satisfactorily verified a small real-time system with parameterized dependent tasks and different scheduling topologies.

  2. Formalization of the classification pattern: survey of classification modeling in information systems engineering.

    Science.gov (United States)

    Partridge, Chris; de Cesare, Sergio; Mitchell, Andrew; Odell, James

    2018-01-01

    Formalization is becoming more common in all stages of the development of information systems, as a better understanding of its benefits emerges. Classification systems are ubiquitous, nowhere more so than in domain modeling. The classification pattern that underlies these systems provides a good case study of the move toward formalization, in part because it illustrates some of the barriers to formalization, including the formal complexity of the pattern and the ontological issues surrounding the "one and the many." Powersets are a way of characterizing the (complex) formal structure of the classification pattern, and their formalization has been extensively studied in mathematics since Cantor's work in the late nineteenth century. One can use this formalization to develop a useful benchmark. There are various communities within information systems engineering (ISE) that are gradually working toward a formalization of the classification pattern. However, for most of these communities, this work is incomplete, in that they have not yet arrived at a solution with the expressiveness of the powerset benchmark. This contrasts with the early smooth adoption of powersets by other information systems communities to, for example, formalize relations. One way of understanding the varying rates of adoption is recognizing that the different communities have different historical baggage. Many conceptual modeling communities emerged from work done on database design, and this creates hurdles to the adoption of the high level of expressiveness of powersets. Another relevant factor is that these communities also often feel, particularly in the case of domain modeling, a responsibility to explain the semantics of whatever formal structures they adopt. This paper aims to make sense of the formalization of the classification pattern in ISE and surveys its history through the literature, starting from the relevant theoretical works of the mathematical literature and gradually shifting focus…
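
    As a minimal illustration of the powerset benchmark mentioned above (our own sketch, not the formalization surveyed in the paper): the powerset of a set of instances enumerates every candidate class over those instances, which is what makes it a natural expressiveness benchmark for classification patterns.

        from itertools import chain, combinations

        def powerset(iterable):
            """All subsets of `iterable`, i.e., every candidate class."""
            s = list(iterable)
            return chain.from_iterable(
                combinations(s, r) for r in range(len(s) + 1))

        instances = {"rose", "oak", "fern"}
        for subset in powerset(instances):
            print(set(subset) or "{}")  # each subset is a possible class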

  3. Heisenberg's principle of uncertainty and the uncertainty relations

    International Nuclear Information System (INIS)

    Redei, Miklos

    1987-01-01

    The usual verbal form of the Heisenberg uncertainty principle and the usual mathematical formulation (the so-called uncertainty theorem) are not equivalent. The meaning of the concept 'uncertainty' is not unambiguous, and different interpretations are used in the literature. Recently, renewed interest has arisen in reinterpreting and reformulating the precise meaning of Heisenberg's principle and in finding an adequate mathematical form. The suggested new theorems are surveyed and critically analyzed. (D.Gy.) 20 refs
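
    For reference, the "uncertainty theorem" alluded to above is usually stated in the Kennard–Robertson form (standard textbook notation, not the paper's):

        \sigma_x\,\sigma_p \;\ge\; \frac{\hbar}{2}, \qquad \sigma_A\,\sigma_B \;\ge\; \frac{1}{2}\,\bigl|\langle[\hat A,\hat B]\rangle\bigr|,

    where \sigma denotes the standard deviation of measurement outcomes in a given state. The verbal "measurement disturbance" reading of Heisenberg's principle is not captured by these inequalities, which is precisely the inequivalence at issue.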

  4. Towards a Formal Framework for Computational Trust

    DEFF Research Database (Denmark)

    Nielsen, Mogens; Krukow, Karl Kristian; Sassone, Vladimiro

    2006-01-01

    We define a mathematical measure for the quantitative comparison of probabilistic computational trust systems, and use it to compare a well-known class of algorithms based on the so-called beta model. The main novelty is that our approach is formal, rather than based on experimental simulation.

  5. Rhythmic Characteristics of Colloquial and Formal Tamil

    Science.gov (United States)

    Keane, Elinor

    2006-01-01

    Application of recently developed rhythmic measures to passages of read speech in colloquial and formal Tamil revealed some significant differences between the two varieties, which are in diglossic distribution. Both were also distinguished from a set of control data from British English speakers reading an equivalent passage. The findings have…

  6. Does Formal Environmental Knowledge Inform the Everyday ...

    African Journals Online (AJOL)

    This paper explores the link between formal environmental knowledge encapsulated in the University of Cambridge International Examination Curriculum and learners' ability to translate this knowledge into everyday practices in Lesotho. The paper reports on research undertaken in three secondary schools in Lesotho ...

  7. Formal Schema Theory and Teaching EFL Reading

    Science.gov (United States)

    Young, Barbara N; Man, Zhou

    2005-01-01

    Inquirers designed and conducted a study investigating whether or not results derived from previous research focusing on teaching and learning English as a native or foreign language would be replicated in a learning environment in which English is taught as a foreign language as in China. Because activation of formal schemata plays an important…

  8. Effective operator formalism for open quantum systems

    DEFF Research Database (Denmark)

    Reiter, Florentin; Sørensen, Anders Søndberg

    2012-01-01

    We present an effective operator formalism for open quantum systems. Employing perturbation theory and adiabatic elimination of excited states for a weakly driven system, we derive an effective master equation which reduces the evolution to the ground-state dynamics. The effective evolution...
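
    For background, the reduction has the following closed form in the published version of this formalism (quoted from memory, so treat it as a sketch): with ground-state Hamiltonian H_g, perturbative (de-)excitations V_±, decays L_k, and non-Hermitian excited-state Hamiltonian H_NH = H_e − (i/2) \sum_k L_k^\dagger L_k, the effective operators read

        H_{\mathrm{eff}} = -\tfrac{1}{2}\, V_- \bigl[ H_{\mathrm{NH}}^{-1} + (H_{\mathrm{NH}}^{-1})^{\dagger} \bigr] V_+ + H_g, \qquad L_{\mathrm{eff}}^{k} = L_k\, H_{\mathrm{NH}}^{-1}\, V_+ ,

    so that the master equation involves only ground-state operators.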

  9. The simplest formal argument for fitness optimization

    Indian Academy of Sciences (India)

    2008-12-23

    …changing gene frequencies, and the optimization programme represents design. The fundamental strategy of the Formal Darwinism Project is linking two… …tions research. Note that there is no sense of generations, or of genes. Indeed there is no population, and so for many reasons there is no sense of…

  10. Formally analysing the concepts of domestic violence

    NARCIS (Netherlands)

    Poelmans, J.; Elzinga, P.; Viaene, S.; Dedene, G.

    2011-01-01

    The types of police inquiries performed these days are incredibly diverse. Often data processing architectures are not suited to cope with this diversity, since most of the case data is still stored as unstructured text. In this paper, Formal Concept Analysis (FCA) is showcased for its exploratory…

  11. Formal verification of algorithms for critical systems

    Science.gov (United States)

    Rushby, John M.; Von Henke, Friedrich

    1993-01-01

    We describe our experience with formal, machine-checked verification of algorithms for critical applications, concentrating on a Byzantine fault-tolerant algorithm for synchronizing the clocks in the replicated computers of a digital flight control system. First, we explain the problems encountered in unsynchronized systems and the necessity, and criticality, of fault-tolerant synchronization. We give an overview of one such algorithm, and of the arguments for its correctness. Next, we describe a verification of the algorithm that we performed using our EHDM system for formal specification and verification. We indicate the errors we found in the published analysis of the algorithm, and other benefits that we derived from the verification. Based on our experience, we derive some key requirements for a formal specification and verification system adequate to the task of verifying algorithms of the type considered. Finally, we summarize our conclusions regarding the benefits of formal verification in this domain, and the capabilities required of verification systems in order to realize those benefits.

  12. A Formal Model for Trust Lifecycle Management

    DEFF Research Database (Denmark)

    Wagealla, Waleed; Carbone, Marco; English, Colin

    2003-01-01

    We propose a formal model of trust informed by the Global Computing scenario and focusing on the aspects of trust formation, evolution, and propagation. The model is based on a novel notion of trust structures which, building on concepts from trust management and domain theory, feature at the same…

  13. Incremental guideline formalization with tool support

    NARCIS (Netherlands)

    Serban, Radu; Puig-Centelles, Anna; ten Teije, Annette

    2006-01-01

    Guideline formalization is recognized as an important component in improving computerized guidelines, which in turn leads to better informedness, lower inter-practician variability and, ultimately, to higher quality healthcare. By means of a modeling exercise, we investigate the role of guideline…

  14. A New Formalism for Relational Algebra

    DEFF Research Database (Denmark)

    Schwartzbach, Michael Ignatieff; Larsen, Kim Skak; Schmidt, Erik Meineche

    1992-01-01

    We present a new formalism for relational algebra, the FC language, which is based on a novel factorization of relations. The acronym stands for factorize and combine. A pure version of this language is equivalent to relational algebra in the sense that semantics preserving translations exist...

  15. Moving interprofessional learning forward through formal assessment.

    Science.gov (United States)

    Stone, Judy

    2010-04-01

    There is increasing agreement that graduates who finish tertiary education with the full complement of skills and knowledge required for their designated profession are not 'work-ready' unless they also acquire interpersonal, collaborative practice and team-working capabilities. Health workers are unable to contribute to organisational culture in a positive way unless they too attain these capabilities. These capabilities have been shown to improve health care in terms of patient safety, worker satisfaction and health service efficiency. Given the importance of interprofessional learning (IPL) which seeks to address these capabilities, why is IPL not consistently embedded into the education of undergraduates, postgraduates and vocationally qualified personnel through formal assessment? This paper offers an argument for the formal assessment of IPL. It illustrates how the interests of the many stakeholders in IPL can benefit from, and contribute to, the integration of IPL into mainstream professional development and tertiary education. It offers practical examples of assessment in IPL which could drive learning and offer authentic, contextual teaching and learning experiences to undergraduates and health workers alike. Assessment drives learning and without formal assessment IPL will continue to be viewed as an optional topic of little relative importance for learners. In order to make the next step forward, IPL needs to be recognised and endorsed through formal assessment, both at the tertiary education level and within the workplace environment. This is supported by workforce initiatives and tertiary education policy which can be used to specify the capabilities or generic skills necessary for effective teamwork and collaborative practice.

  16. What makes industries believe in formal methods

    NARCIS (Netherlands)

    Vissers, C.A.; van Sinderen, Marten J.; Ferreira Pires, Luis; Pires, L.F.; Danthine, A.S.; Leduc, G.; Wolper, P.

    1993-01-01

    The introduction of formal methods in the design and development departments of an industrial company has far-reaching and long-lasting consequences. In fact, it changes the whole environment of methods, tools and skills that determine the design culture of that company. A decision to replace current…

  17. A Formal Model for Context-Awareness

    DEFF Research Database (Denmark)

    Kjærgaard, Mikkel Baun; Bunde-Pedersen, Jonathan

    There is a definite lack of formal support for modeling realistic context-awareness in pervasive computing applications. The Conawa calculus presented in this paper provides mechanisms for modeling complex and interwoven sets of context-information by extending ambient calculus with new constructs…

  18. Simulation and formal analysis of visual attention

    NARCIS (Netherlands)

    Bosse, T.; Maanen, P.P. van; Treur, J.

    2009-01-01

    In this paper a simulation model for visual attention is discussed and formally analysed. The model is part of the design of an agent-based system that supports a naval officer in the task of compiling a tactical picture of the situation in the field. A case study is described in which the model is…

  19. Maintaining formal models of living guidelines efficiently

    NARCIS (Netherlands)

    Seyfang, Andreas; Martínez-Salvador, Begoña; Serban, Radu; Wittenberg, Jolanda; Miksch, Silvia; Marcos, Mar; Ten Teije, Annette; Rosenbrand, Kitty C J G M

    2007-01-01

    Translating clinical guidelines into formal models is beneficial in many ways, but expensive. The progress in medical knowledge requires clinical guidelines to be updated at relatively short intervals, leading to the term living guideline. This causes potentially expensive, frequent updates of the…

  20. Formal Verification of Circuits and Systems

    Indian Academy of Sciences (India)

    Today formal verification is finding increasing acceptance in some areas, especially model abstraction and functional verification. Other major challenges, like timing verification, remain before this technology can be posed as a complete alternative to simulation. This special issue is devoted to presenting some of the…

  1. Protocol design and implementation using formal methods

    NARCIS (Netherlands)

    van Sinderen, Marten J.; Ferreira Pires, Luis; Pires, L.F.; Vissers, C.A.

    1992-01-01

    This paper reports on a number of formal methods that support correct protocol design and implementation. These methods are placed in the framework of a design methodology for distributed systems that was studied and developed within the ESPRIT II Lotosphere project (2304). The paper focuses on…

  2. A formal model for total quality management

    NARCIS (Netherlands)

    S.C. van der Made-Potuijt; H.B. Bertsch (Boudewijn); L.P.J. Groenewegen

    1996-01-01

    Total Quality Management (TQM) is a systematic approach to managing a company. TQM is systematic in the sense that it uses facts obtained through observation, analysis and measurable goals. There are theoretical descriptions of this management concept, but there is no formal model of it. A…

  3. Formal synthesis of naturally occurring norephedrine

    Indian Academy of Sciences (India)

    A concise and simple synthesis of 1-hydroxy-phenethylamine derivatives has been achieved following classical organic transformations using commercially available chiral pools. These derivatives were explored for the synthesis of naturally occurring bio-active small molecules. Formal synthesis of norephedrine, virolin ...

  4. A Formal Model of Identity Mixer

    DEFF Research Database (Denmark)

    Camenisch, Jan; Mödersheim, Sebastian Alexander; Sommer, Dieter

    2010-01-01

    Identity Mixer is an anonymous credential system developed at IBM that allows users, for instance, to prove that they are over 18 years old without revealing their name or birthdate. This privacy-friendly technology is realized using zero-knowledge proofs. We describe a formal model of Identity Mixer…

  5. Confusion about entrepreneurship? Formal versus informal small ...

    African Journals Online (AJOL)

    …that contributes to both business formation and the ultimate expansion or growth of the business. The entrepreneurial actions related to these business activities are analysed in this study. The differential application of these actions in the formal and informal business panels is of particular importance for this study. Although ...

  6. Applicability of four parameter formalisms in interpreting ...

    Indian Academy of Sciences (India)

    Four-parameter functions are generally considered adequate for representing the thermodynamic properties of strongly interacting binary systems. The present study involves a critical comparison, in terms of applicability, of three well-known four-parameter formalisms for the representation of the ...

  7. Educational attainment, formal employment and contraceptives ...

    African Journals Online (AJOL)

    Based on this, the study examines educational attainment, formal employment and contraceptive practices among working women at Lagos State University. A survey design was adopted for the study. Using stratified and simple random sampling techniques, quantitative data were gathered through the administration of ...

  8. 29 CFR 101.20 - Formal hearing.

    Science.gov (United States)

    2010-07-01

    29 CFR 101.20 – Formal hearing (2010-07-01 edition). Regulations Relating to Labor, National Labor Relations Board, Statements of Procedures, Representation Cases Under… …a copy of the petition, is served on the unions and employer filing or named in the petition and on…

  9. The factualization of uncertainty:

    DEFF Research Database (Denmark)

    Meyer, G.; Folker, A.P.; Jørgensen, R.B.

    2005-01-01

    …on risk assessment does nothing of the sort and is not likely to present an escape from the international deadlock on the use of genetic modification in agriculture and food production. The new legislation is likely to stimulate the kind of emotive reactions it was intended to prevent. In risk assessment exercises, scientific uncertainty is turned into risk, expressed in facts and figures. Paradoxically, this conveys an impression of certainty, while value-disagreement and conflicts of interest remain hidden below the surface of factuality. Public dialogue and negotiation along these lines are rendered… …would be to take care of itself – rethinking the role and the limitations of science in a social context, and thereby gaining the strength to fulfill this role and to enter into dialogue with the rest of society. Scientific communities appear to be obvious candidates for prompting reflection…

  10. Uncertainty as Certainty

    Science.gov (United States)

    Petzinger, Tom

    I am trying to make money in the biotech industry from complexity science. And I am doing it with inspiration that I picked up on the edge of Appalachia spending time with June Holley and ACEnet when I was a Wall Street Journal reporter. I took some of those ideas to Pittsburgh, in biotechnology, in a completely private setting with an economic development focus, but also with a mission to return profit to private capital. And we are doing that. I submit as a hypothesis, something we are figuring out in the post-industrial era, that business evolves. It is not the definition of business, but business critically involves the design of systems in which uncertainty is treated as a certainty. That is what I have seen and what I have tried to put into practice.

  11. Traceability and Measurement Uncertainty

    DEFF Research Database (Denmark)

    Tosello, Guido; De Chiffre, Leonardo

    2004-01-01

    This report is made as a part of the project 'Metro-E-Learn: European e-Learning in Manufacturing Metrology', an EU project under the program SOCRATES MINERVA (ODL and ICT in Education), Contract No: 101434-CP-1-2002-1-DE-MINERVA, coordinated by Friedrich-Alexander-University Erlangen-Nürnberg, Chair for Quality Management and Manufacturing-Oriented Metrology (Germany). The 'Metro-E-Learn' project proposes to develop and implement a coherent learning and competence chain that leads from introductory and foundation e-courses in initial manufacturing engineering studies towards higher… The present report represents section 2 – Traceability and Measurement Uncertainty – of the e-learning… 8. Machine tool testing; 9. The role of manufacturing metrology for QM; 10. Inspection planning; 11. Quality management of measurements incl. documentation; 12. Advanced manufacturing measurement technology.

  12. Formalism and the notion of truth

    Science.gov (United States)

    Spencer, Joseph M.

    The most widely acknowledged conceptions of truth take some kind of relation to be at truth's core. This dissertation attempts to establish that an adequate conception of this relation begins with an investigation of the entanglement of the formal and the material as set forth in the model theoretical development of set theoretical mathematics. Truth concerns first and most crucially a certain commerce across the border between the formal and the material, between the ideal and the real. The entanglement of the formal and the material must be thought in itself, apart from or prior to any assimilation into philosophical schemas committed to larger metaphysical claims. This is accomplished in model theory. The twentieth century witnessed two attempts at bringing model theoretical mathematics to bear on accounting philosophically for the concept of truth: that of Alfred Tarski, and that of Alain Badiou. In order to investigate the relevance of model theory to the task of working out a philosophical conception of truth, this dissertation investigates, through comparative work, these two thinkers. It is necessary to see where their projects converge in important ways, as well as where their projects diverge in equally important ways. What brings their work into close proximity is their shared conviction that truth must be thought in light of model theory. Nonetheless, the two do not agree about exactly how model theory sheds light on truth. Comparative study thus reveals both a shared site for thinking and a struggle over the significance of that site. Agreement between Tarski and Badiou concerns the excess of the purely formal over itself, marked by the generation of an undecidable statement within formal systems of a certain level of complexity. Both thinkers determine that this formal excess touches on the material, and both further determine that the consequent entanglement of the formal and the material provides the basic frame for any philosophical consideration

  13. Uncertainty information in climate data records from Earth observation

    Directory of Open Access Journals (Sweden)

    C. J. Merchant

    2017-07-01

    The question of how to derive and present uncertainty information in climate data records (CDRs) has received sustained attention within the European Space Agency Climate Change Initiative (CCI), a programme to generate CDRs addressing a range of essential climate variables (ECVs) from satellite data. Here, we review the nature, mathematics, practicalities, and communication of uncertainty information in CDRs from Earth observations. This review paper argues that CDRs derived from satellite-based Earth observation (EO) should include rigorous uncertainty information to support the application of the data in contexts such as policy, climate modelling, and numerical weather prediction reanalysis. Uncertainty, error, and quality are distinct concepts, and the case is made that CDR products should follow international metrological norms for presenting quantified uncertainty. As a baseline for good practice, total standard uncertainty should be quantified per datum in a CDR, meaning that uncertainty estimates should clearly discriminate more and less certain data. In this case, flags for data quality should not duplicate uncertainty information, but instead describe complementary information (such as the confidence in the uncertainty estimate provided, or indicators of conditions violating the retrieval assumptions). The paper discusses the many sources of error in CDRs, noting that different errors may be correlated across a wide range of timescales and space scales. Error effects that contribute negligibly to the total uncertainty in a single-satellite measurement can be the dominant sources of uncertainty in a CDR on the large space scales and long timescales that are highly relevant for some climate applications. For this reason, identifying and characterizing the relevant sources of uncertainty for CDRs is particularly challenging. The characterization of uncertainty caused by a given error effect involves assessing the magnitude of the effect, the…

  14. Uncertainty information in climate data records from Earth observation

    Science.gov (United States)

    Merchant, Christopher J.; Paul, Frank; Popp, Thomas; Ablain, Michael; Bontemps, Sophie; Defourny, Pierre; Hollmann, Rainer; Lavergne, Thomas; Laeng, Alexandra; de Leeuw, Gerrit; Mittaz, Jonathan; Poulsen, Caroline; Povey, Adam C.; Reuter, Max; Sathyendranath, Shubha; Sandven, Stein; Sofieva, Viktoria F.; Wagner, Wolfgang

    2017-07-01

    The question of how to derive and present uncertainty information in climate data records (CDRs) has received sustained attention within the European Space Agency Climate Change Initiative (CCI), a programme to generate CDRs addressing a range of essential climate variables (ECVs) from satellite data. Here, we review the nature, mathematics, practicalities, and communication of uncertainty information in CDRs from Earth observations. This review paper argues that CDRs derived from satellite-based Earth observation (EO) should include rigorous uncertainty information to support the application of the data in contexts such as policy, climate modelling, and numerical weather prediction reanalysis. Uncertainty, error, and quality are distinct concepts, and the case is made that CDR products should follow international metrological norms for presenting quantified uncertainty. As a baseline for good practice, total standard uncertainty should be quantified per datum in a CDR, meaning that uncertainty estimates should clearly discriminate more and less certain data. In this case, flags for data quality should not duplicate uncertainty information, but instead describe complementary information (such as the confidence in the uncertainty estimate provided or indicators of conditions violating the retrieval assumptions). The paper discusses the many sources of error in CDRs, noting that different errors may be correlated across a wide range of timescales and space scales. Error effects that contribute negligibly to the total uncertainty in a single-satellite measurement can be the dominant sources of uncertainty in a CDR on the large space scales and long timescales that are highly relevant for some climate applications. For this reason, identifying and characterizing the relevant sources of uncertainty for CDRs is particularly challenging. The characterization of uncertainty caused by a given error effect involves assessing the magnitude of the effect, the shape of the…
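
    To illustrate one practical consequence of the guidance above (our own hedged sketch, not CCI code): when averaging data into a large-scale CDR statistic, error components that are independent between data average down, while a common systematic component does not.

        import numpy as np

        def aggregated_uncertainty(u_indep, u_sys):
            """Standard uncertainty of the mean of N data, given per-datum
            independent uncertainties u_indep and a fully common systematic
            uncertainty u_sys (simple two-component error model)."""
            n = len(u_indep)
            random_part = np.sqrt(np.sum(np.asarray(u_indep) ** 2)) / n
            return np.hypot(random_part, u_sys)

        u_indep = [0.3] * 1000   # per-datum independent uncertainty (K)
        print(aggregated_uncertainty(u_indep, u_sys=0.1))
        # The systematic 0.1 K dominates once the random part averages down.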

  15. Concept similarity and related categories in information retrieval using formal concept analysis

    Science.gov (United States)

    Eklund, P.; Ducrou, J.; Dau, F.

    2012-11-01

    The application of formal concept analysis to the problem of information retrieval has been shown to be useful, but has lacked any real analysis of the idea of relevance ranking of search results. SearchSleuth is a program developed to experiment with the automated local analysis of Web search using formal concept analysis. SearchSleuth extends a standard search interface to include a conceptual neighbourhood centred on a formal concept derived from the initial query. This neighbourhood of the concept derived from the search terms is decorated with its upper and lower neighbours, representing more general and more specific concepts, respectively. SearchSleuth is in many ways an archetype of search engines based on formal concept analysis, with some novel features. In SearchSleuth, the notion of related categories - which are themselves formal concepts - is also introduced. This allows the retrieval focus to shift to a new formal concept called a sibling. This movement across the concept lattice needs to relate one formal concept to another in a principled way. This paper presents the issues concerning exploring, searching, and ordering the space of related categories. The focus is on understanding the use and meaning of proximity and semantic distance in the context of information retrieval using formal concept analysis.
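
    The lattice navigation described above is easy to prototype. The sketch below (a minimal illustration over an invented document/term context; it is not SearchSleuth's implementation) builds formal concepts by closure and finds the upper, i.e. more general, neighbours of a query concept.

```python
from itertools import combinations

# Tiny document/term context (invented for illustration).
context = {
    "doc1": {"lattice", "order"},
    "doc2": {"lattice", "search"},
    "doc3": {"search", "ranking"},
    "doc4": {"lattice", "search", "ranking"},
}
attributes = set().union(*context.values())

def extent(intent_set):   # objects possessing every attribute in intent_set
    return {o for o, attrs in context.items() if intent_set <= attrs}

def intent(extent_set):   # attributes shared by every object in extent_set
    return (set.intersection(*(context[o] for o in extent_set))
            if extent_set else set(attributes))

def close(att_set):       # concept intent generated by att_set
    return intent(extent(att_set))

# Enumerate all concept intents by closing every attribute subset
# (brute force, fine for toy contexts).
concepts = {frozenset(close(set(c)))
            for r in range(len(attributes) + 1)
            for c in combinations(sorted(attributes), r)}

def upper_neighbours(c):  # immediate more-general concepts (smaller intents)
    gens = [g for g in concepts if g < c]
    return [g for g in gens if not any(g < h < c for h in gens)]

query = frozenset(close({"search"}))
print("query intent:", set(query))
print("upper neighbours:", [set(g) for g in upper_neighbours(query)])
```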

  16. Between Development and Environment: Uncertainties of Agrofuels

    Science.gov (United States)

    Leon Sicard, Tomas Enrique

    2009-01-01

    This article examines the dominant agricultural model in Colombia of which the emergence of biofuels is an inevitable and major consequence. Some uncertainties and complexities of the introduction of biofuels and the use of genetically modified crops are analyzed, including a general reflection on the possibilities of producing biofuels on the…

  17. Return Predictability, Model Uncertainty, and Robust Investment

    DEFF Research Database (Denmark)

    Lukas, Manuel

    Stock return predictability is subject to great uncertainty. In this paper we use the model confidence set approach to quantify uncertainty about expected utility from investment, accounting for potential return predictability. For monthly US data and six representative return prediction models, we...... find that confidence sets are very wide, change significantly with the predictor variables, and frequently include expected utilities for which the investor prefers not to invest. The latter motivates a robust investment strategy maximizing the minimal element of the confidence set. The robust investor...... allocates a much lower share of wealth to stocks compared to a standard investor....
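
    The robust rule in the abstract has a simple maximin form. The sketch below (purely illustrative; the utilities and the surviving model set are invented, not the paper's estimates) picks the stock weight that maximises the minimal expected utility across the models retained in the confidence set.

```python
import numpy as np

# Candidate shares of wealth allocated to stocks.
weights = np.linspace(0.0, 1.0, 21)

# Hypothetical expected utilities: one row per prediction model remaining
# in the model confidence set, one column per candidate weight.
rng = np.random.default_rng(0)
optima = rng.uniform(0.0, 0.6, size=(6, 1))   # each model's preferred weight
utilities = -0.5 * (weights - optima) ** 2

worst_case = utilities.min(axis=0)            # minimal element of the set
w_robust = weights[worst_case.argmax()]       # maximin allocation
print("robust stock allocation:", w_robust)
```

    Because the worst case is taken over all surviving models, the maximin weight is pulled toward the most pessimistic model, which is consistent with the paper's finding that the robust investor allocates less to stocks.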

  18. Uncertainties in predicting solar panel power output

    Science.gov (United States)

    Anspaugh, B.

    1974-01-01

    The problem of calculating solar panel power output at launch and during a space mission is considered. The major sources of uncertainty and error in predicting the post launch electrical performance of the panel are considered. A general discussion of error analysis is given. Examples of uncertainty calculations are included. A general method of calculating the effect on the panel of various degrading environments is presented, with references supplied for specific methods. A technique for sizing a solar panel for a required mission power profile is developed.
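
    A common way to carry out such an error analysis is Monte Carlo propagation through the power model. The sketch below (model form and all numbers invented, not the report's method) propagates input uncertainties through a simple panel power expression with a temperature coefficient.

```python
import numpy as np

# P = eta * A * G * (1 + alpha * (T - 25)): a simple panel power model.
rng = np.random.default_rng(1)
n = 100_000
eta   = rng.normal(0.14, 0.005, n)     # conversion efficiency
A     = rng.normal(2.0, 0.01, n)       # panel area, m^2
G     = rng.normal(1353.0, 20.0, n)    # irradiance near 1 AU, W/m^2
T     = rng.normal(55.0, 3.0, n)       # cell temperature, deg C
alpha = rng.normal(-0.004, 0.0005, n)  # power temperature coefficient, 1/K

P = eta * A * G * (1 + alpha * (T - 25.0))
print(f"P = {P.mean():.0f} W +/- {P.std():.0f} W (1 sigma)")
```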

  19. Concepciones acerca de la maternidad en la educación formal y no formal

    Directory of Open Access Journals (Sweden)

    Alvarado Calderón, Kathia

    2005-06-01

    Full Text Available This article presents some results of research carried out at the Instituto de Investigación en Educación (INIE) under the title "Construcción del concepto de maternidad en la educación formal y no formal". It begins with a theoretical analysis of social conceptions of motherhood in Western societies. Using a qualitative research approach, information was gathered through thematic drawings, interviews, and a focus group. These techniques allowed us to approach the conceptions of motherhood held by participants from the different educational settings (formal and non-formal) with whom we worked. This is followed by a brief summary of the main findings. The article concludes with a proposal of future lines of work for the deconstruction of the motherhood concept in formal and non-formal education contexts.

  20. Formalization and Land Grabbing in Africa: Facilitation or Protection ...

    African Journals Online (AJOL)

    The first focuses on investment and land grabbing, and the second on the formalization of rural property rights. Less has been written on the impact of formalization on land grabbing and of land grabbing on formalization. Recently, formalization has been put forward to protect the rights of pastoralists and farmers from land ...

  1. 20 CFR 702.336 - Formal hearings; new issues.

    Science.gov (United States)

    2010-04-01

    20 CFR § 702.336 (Title 20, Employees' Benefits, revised as of April 1, 2010): Formal hearings; new issues. (a) If, during the course of the formal hearing, the evidence presented warrants consideration of an issue or issues not previously considered...

  2. Model Uncertainty Quantification Methods In Data Assimilation

    Science.gov (United States)

    Pathiraja, S. D.; Marshall, L. A.; Sharma, A.; Moradkhani, H.

    2017-12-01

    Data Assimilation involves utilising observations to improve model predictions in a seamless and statistically optimal fashion. Its applications are wide-ranging, from improving weather forecasts to tracking targets such as in the Apollo 11 mission. The use of Data Assimilation methods in high-dimensional complex geophysical systems is an active area of research, where there exist many opportunities to enhance existing methodologies. One of the central challenges is model uncertainty quantification: the outcome of any Data Assimilation study is strongly dependent on the uncertainties assigned to both observations and models. I focus on developing improved model uncertainty quantification methods that are applicable to challenging real-world scenarios. These include developing methods for cases where the system states are only partially observed, where there is little prior knowledge of the model errors, and where the model error statistics are likely to be highly non-Gaussian.
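
    One standard setting where these issues arise is the ensemble Kalman filter, where model uncertainty is often represented as additive noise during the forecast. The sketch below (toy dynamics, invented covariances, and partial observation of the state) shows one stochastic EnKF cycle under those assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
n_ens, n_state = 50, 3
H = np.array([[1.0, 0.0, 0.0]])               # observe first component only
Q = 0.05 * np.eye(n_state)                    # assumed model-error covariance
R = np.array([[0.1]])                         # observation-error covariance

def forecast(x):                              # toy dynamics plus model error
    return 0.95 * x + rng.multivariate_normal(np.zeros(n_state), Q)

ens = rng.normal(0.0, 1.0, (n_ens, n_state))  # initial ensemble
ens = np.array([forecast(x) for x in ens])

y = np.array([0.8])                           # incoming observation
X = ens - ens.mean(axis=0)                    # ensemble anomalies
P = X.T @ X / (n_ens - 1)                     # sample forecast covariance
K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)  # Kalman gain
perturbed = y + rng.multivariate_normal(np.zeros(1), R, n_ens)
ens = ens + (perturbed - ens @ H.T) @ K.T     # perturbed-observation update
print("analysis mean:", ens.mean(axis=0))
```

    Misjudging Q is exactly the model uncertainty quantification problem the abstract targets: too small and the filter ignores the observations, too large and it discards the model.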

  3. Uncertainties in the Norwegian greenhouse gas emission inventory

    Energy Technology Data Exchange (ETDEWEB)

    Flugsrud, Ketil; Hoem, Britta

    2011-11-15

    The national greenhouse gas (GHG) emission inventory is compiled from estimates based on emission factors and activity data and from direct measurements by plants. All these data and parameters contribute to the overall inventory uncertainty. The uncertainties and probability distributions of the inventory input parameters have been assessed based on available data and expert judgements. Finally, the level and trend uncertainties of the national GHG emission inventory have been estimated using Monte Carlo simulation. The methods used in the analysis correspond to an IPCC tier 2 method, as described in the IPCC Good Practice Guidance (IPCC 2000). Analyses have been made both excluding and including the sector LULUCF (land use, land-use change and forestry). The uncertainty analysis performed in 2011 is an update of the uncertainty analyses performed for the greenhouse gas inventory in 2006 and 2000. During the project we have been in contact with experts and have collected information about uncertainty from them. The main focus has been on the source categories where changes have occurred since the last uncertainty analysis was performed in 2006. This includes new methodology for several source categories (for example for solvents and road traffic) as well as revised uncertainty estimates. For the installations included in the emission trading system, new information from the annual ETS reports about uncertainty in activity data and CO2 emission factor (and N2O emission factor for nitric acid production) has been used. This has improved the quality of the uncertainty estimates for the energy and manufacturing sectors. The results show that the uncertainty level in the total calculated greenhouse gas emissions for 2009 is around 4 per cent. When including the LULUCF sector, the total uncertainty is around 17 per cent in 2009. The uncertainty estimate is lower now than previous analyses have shown. This is partly due to considerable work made to improve
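
    The tier 2 calculation itself is compact. The sketch below (source categories and distributions invented, not Norway's inventory data) sums sampled activity-times-emission-factor products and reads the level uncertainty off the 95 per cent interval.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 50_000
# (activity mean, activity sd, EF median, EF relative sd) - invented values
sources = {
    "road traffic": (1.0e6, 0.02e6, 2.5, 0.05),
    "solvents":     (2.0e5, 0.05e5, 1.1, 0.30),
    "nitric acid":  (5.0e4, 0.01e4, 6.0, 0.15),
}
total = np.zeros(n)
for a_mu, a_sd, ef_med, ef_rel in sources.values():
    activity = rng.normal(a_mu, a_sd, n)
    ef = rng.lognormal(np.log(ef_med), ef_rel, n)  # skewed EF uncertainty
    total += activity * ef

lo, hi = np.percentile(total, [2.5, 97.5])
print(f"level uncertainty: +/- {(hi - lo) / 2 / total.mean():.1%}")
```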

  4. Additivity of entropic uncertainty relations

    Directory of Open Access Journals (Sweden)

    René Schwonnek

    2018-03-01

    Full Text Available We consider the uncertainty between two pairs of local projective measurements performed on a multipartite system. We show that the optimal bound in any linear uncertainty relation, formulated in terms of the Shannon entropy, is additive. This directly implies, against naive intuition, that the minimal entropic uncertainty can always be realized by fully separable states. Hence, in contradiction to proposals by other authors, no entanglement witness can be constructed solely by comparing the attainable uncertainties of entangled and separable states. However, our result gives rise to a huge simplification for computing global uncertainty bounds as they now can be deduced from local ones. Furthermore, we provide the natural generalization of the Maassen and Uffink inequality for linear uncertainty relations with arbitrary positive coefficients.
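
    For reference, the Maassen and Uffink relation that the paper generalises (a standard result; the paper's version attaches arbitrary positive coefficients to the entropies) reads, for measurements in eigenbases {|a_i>} and {|b_j>}:

```latex
H(A) + H(B) \;\ge\; -2 \log c ,
\qquad
c \;=\; \max_{i,j} \bigl|\langle a_i \mid b_j \rangle\bigr| ,
```

    where H denotes the Shannon entropy of the outcome distribution and the logarithm is taken in the same base as the entropy.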

  5. Uncertainty Management and Sensitivity Analysis

    DEFF Research Database (Denmark)

    Georgiadis, Stylianos; Fantke, Peter

    2017-01-01

    Uncertainty is always present, and LCA is no exception. The presence of uncertainties of different types and from numerous sources in LCA results is a fact, but managing them makes it possible to quantify and improve the precision of a study and the robustness of its conclusions. LCA practice sometimes...... suffers from an imbalanced perception of uncertainties, justifying modelling choices and omissions. Identifying prevalent misconceptions around uncertainties in LCA is a central goal of this chapter, aiming to establish a positive approach focusing on the advantages of uncertainty management. The main...... objectives of this chapter are to learn how to deal with uncertainty in the context of LCA, how to quantify it, interpret and use it, and how to communicate it. The subject is approached more holistically than just focusing on relevant statistical methods or purely mathematical aspects. This chapter...

  6. Essential competencies analysis of a training model development for non-formal vocational teachers under the office of the non-formal and informal education in Thailand

    Directory of Open Access Journals (Sweden)

    Chayanopparat Piyanan

    2016-01-01

    Full Text Available Non-formal vocational education provides practical experience in a particular occupational field to non-formal, semi-skilled learners. Non-formal vocational teachers are the key persons delivering that occupational knowledge, and enhancing their essential competencies will improve teaching performance. The research question is: what are the essential competencies for non-formal vocational teachers? The research method was (1) to review the related literature, (2) to collect a needs assessment, and (3) to analyse the essential competencies for non-formal vocational teachers. The population includes non-formal vocational teachers at the executive level and non-formal vocational teachers. The results of the essential competencies analysis show that the essential competencies for non-formal vocational teachers consist of five capabilities: (1) adult learning design capability, (2) adult learning principle application capability, (3) ICT searching capability for teaching preparation, (4) instructional plan development capability, and (5) instructional media development capability.

  7. Impact of discharge data uncertainty on nutrient load uncertainty

    Science.gov (United States)

    Westerberg, Ida; Gustavsson, Hanna; Sonesten, Lars

    2016-04-01

    Uncertainty in the rating-curve model of the stage-discharge relationship leads to uncertainty in discharge time series. These uncertainties in turn affect many other analyses based on discharge data, such as nutrient load estimations. It is important to understand how large the impact of discharge data uncertainty is on such analyses, since they are often used as the basis for important environmental management decisions. In the Baltic Sea basin, nutrient load estimates from river mouths are a central information basis for managing and reducing eutrophication in the Baltic Sea. In this study we investigated rating curve uncertainty and its propagation to discharge data uncertainty and thereafter to uncertainty in the load of phosphorus and nitrogen for twelve Swedish river mouths. We estimated rating curve uncertainty using the Voting Point method, which accounts for random and epistemic errors in the stage-discharge relation and allows drawing multiple rating-curve realisations consistent with the total uncertainty. We sampled 40,000 rating curves, and for each sampled curve we calculated a discharge time series from 15-minute water level data for the period 2005-2014. Each discharge time series was then aggregated to daily scale and used to calculate the load of phosphorus and nitrogen from linearly interpolated monthly water samples, following the currently used methodology for load estimation. Finally the yearly load estimates were calculated and we thus obtained distributions with 40,000 load realisations per year - one for each rating curve. We analysed how the rating curve uncertainty propagated to the discharge time series at different temporal resolutions, and its impact on the yearly load estimates. Two shorter periods of daily water quality sampling around the spring flood peak allowed a comparison of load uncertainty magnitudes resulting from discharge data with those resulting from the monthly water quality sampling.
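
    The propagation chain in the study can be mimicked with a simplified sketch (all parameters invented, daily rather than monthly sampling, and ordinary parameter sampling in place of the Voting Point method): sample many plausible rating curves Q = a(h - h0)^b, convert stage to discharge with each, and accumulate the nutrient load.

```python
import numpy as np

rng = np.random.default_rng(4)
n_curves = 5_000
h = rng.uniform(0.5, 2.0, 365)                 # daily stage, m (invented)
conc = rng.lognormal(np.log(0.05), 0.3, 365)   # daily P concentration, mg/l

a  = rng.normal(10.0, 1.0, n_curves)           # rating-curve realisations
b  = rng.normal(1.8, 0.1, n_curves)
h0 = rng.normal(0.2, 0.05, n_curves)

loads = []
for ai, bi, h0i in zip(a, b, h0):
    q = ai * np.clip(h - h0i, 0.0, None) ** bi  # discharge, m^3/s
    loads.append(np.sum(q * conc * 86.4))       # g/m^3 * m^3/s * s/day -> kg
lo, hi = np.percentile(loads, [2.5, 97.5])
print(f"annual P load, 95% interval: {lo:.0f} - {hi:.0f} kg")
```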

  8. Formal Assurance Certifiable Tooling Strategy Final Report

    Science.gov (United States)

    Bush, Eric; Oglesby, David; Bhatt, Devesh; Murugesan, Anitha; Engstrom, Eric; Mueller, Joe; Pelican, Michael

    2017-01-01

    This is the Final Report of a research project to investigate issues and provide guidance for the qualification of formal methods tools under the DO-330 qualification process. It consisted of three major subtasks spread over two years: 1) an assessment of theoretical soundness issues that may affect qualification for three categories of formal methods tools, 2) a case study simulating the DO-330 qualification of two actual tool sets, and 3) an investigation of risk mitigation strategies that might be applied to chains of such formal methods tools in order to increase confidence in their certification of airborne software.

  9. Propagation of Neutron Cross Section, Fission Yield, and Decay Data Uncertainties in Depletion Calculations

    Science.gov (United States)

    Martinez, J. S.; Zwermann, W.; Gallner, L.; Puente-Espel, F.; Cabellos, O.; Velkov, K.; Hannstein, V.

    2014-04-01

    Propagation of nuclear data uncertainties in reactor calculations is of interest for design purposes and for the evaluation of data libraries. Previous versions of the GRS XSUSA library propagated only neutron cross-section uncertainties. We have extended XSUSA's uncertainty assessment capabilities by including propagation of fission yield and decay data uncertainties, due to their relevance in depletion simulations. We apply this extended methodology to the UAM6 PWR Pin-Cell Burnup Benchmark, which involves uncertainty propagation through burnup.
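
    The mechanism can be seen in miniature with a single fission product. The sketch below (nuclide data invented) samples a fission yield and a decay constant and pushes them through the production-decay balance N(t) = (Y F / lambda)(1 - exp(-lambda t_irr)) exp(-lambda t_cool).

```python
import numpy as np

rng = np.random.default_rng(5)
n = 100_000
F = 1.0e19                                  # fission rate, fissions/s (fixed)
Y = rng.normal(0.06, 0.06 * 0.02, n)        # fission yield, 2% relative unc.
lam = rng.normal(2.0e-6, 2.0e-6 * 0.05, n)  # decay constant 1/s, 5% rel. unc.
t_irr, t_cool = 3.0e7, 1.0e6                # irradiation / cooling times, s

# dN/dt = Y*F - lam*N during irradiation, pure decay afterwards.
N = Y * F / lam * (1.0 - np.exp(-lam * t_irr)) * np.exp(-lam * t_cool)
print(f"relative uncertainty on inventory: {N.std() / N.mean():.1%}")
```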

  10. Review of strategies for handling geological uncertainty in groundwater flow and transport modeling

    DEFF Research Database (Denmark)

    Refsgaard, Jens Christian; Christensen, Steen; Sonnenborg, Torben O.

    2012-01-01

    be accounted for, but is often neglected, in assessments of prediction uncertainties. Strategies for assessing prediction uncertainty due to geologically related uncertainty may be divided into three main categories, accounting for uncertainty due to: (a) the geological structure; (b) effective model...... parameters; and (c) model parameters including local scale heterogeneity. The most common methodologies for uncertainty assessments within each of these categories, such as multiple modeling, Monte Carlo analysis, regression analysis and moment equation approach, are briefly described with emphasis...

  11. Decommissioning funding: ethics, implementation, uncertainties

    International Nuclear Information System (INIS)

    2006-01-01

    This status report on Decommissioning Funding: Ethics, Implementation, Uncertainties also draws on the experience of the NEA Working Party on Decommissioning and Dismantling (WPDD). The report offers, in a concise form, an overview of relevant considerations on decommissioning funding mechanisms with regard to ethics, implementation and uncertainties. Underlying ethical principles found in international agreements are identified, and factors influencing the accumulation and management of funds for decommissioning nuclear facilities are discussed together with the main sources of uncertainties of funding systems. (authors)

  12. Uncertainty analysis of environmental models

    International Nuclear Information System (INIS)

    Monte, L.

    1990-01-01

    In the present paper an evaluation of the output uncertainty of an environmental model for assessing the transfer of 137Cs and 131I in the human food chain is carried out on the basis of a statistical analysis of data reported in the literature. The uncertainty analysis offers the opportunity to obtain some notable information about the uncertainty of models predicting the migration of non-radioactive substances in the environment, mainly in relation to dry and wet deposition.

  13. Money and Growth under Uncertainty.

    Science.gov (United States)

    Descriptors: (*Economics, Uncertainty), (*Money, Decision making), (*Behavior, Mathematical models), Production, Consumption, Equilibrium (physiology), Growth (physiology), Management engineering, Probability, Integral equations, Theses.

  14. An Institutional Framework for Heterogeneous Formal Development in UML

    OpenAIRE

    Knapp, Alexander; Mossakowski, Till; Roggenbach, Markus

    2014-01-01

    We present a framework for formal software development with UML. In contrast to previous approaches that equip UML with a formal semantics, we follow an institution-based heterogeneous approach. This can express suitable formal semantics of the different UML diagram types directly, without the need to map everything to one specific formalism (be it first-order logic or graph grammars). We show how different aspects of the formal development process can be coherently formalised, ranging fr...

  15. Uncertainty Quantification in Climate Modeling and Projection

    Energy Technology Data Exchange (ETDEWEB)

    Qian, Yun; Jackson, Charles; Giorgi, Filippo; Booth, Ben; Duan, Qingyun; Forest, Chris; Higdon, Dave; Hou, Z. Jason; Huerta, Gabriel

    2016-05-01

    The projection of future climate is one of the most complex problems undertaken by the scientific community. Although scientists have been striving to better understand the physical basis of the climate system and to improve climate models, the overall uncertainty in projections of future climate has not been significantly reduced (e.g., from the IPCC AR4 to AR5). With the rapid increase of complexity in Earth system models, reducing uncertainties in climate projections becomes extremely challenging. Since uncertainties always exist in climate models, interpreting the strengths and limitations of future climate projections is key to evaluating risks, and climate change information for use in Vulnerability, Impact, and Adaptation (VIA) studies should be provided with both well-characterized and well-quantified uncertainty. The workshop aimed at providing participants, many of them from developing countries, information on strategies to quantify the uncertainty in climate model projections and assess the reliability of climate change information for decision-making. The program included a mixture of lectures on fundamental concepts in Bayesian inference and sampling, applications, and hands-on computer laboratory exercises employing software packages for Bayesian inference, Markov Chain Monte Carlo methods, and global sensitivity analyses. The lectures covered a range of scientific issues underlying the evaluation of uncertainties in climate projections, such as the effects of uncertain initial and boundary conditions, uncertain physics, and limitations of observational records. Progress in quantitatively estimating uncertainties in hydrologic, land surface, and atmospheric models at both regional and global scales was also reviewed. The application of Uncertainty Quantification (UQ) concepts to coupled climate system models is still in its infancy. The Coupled Model Intercomparison Project (CMIP) multi-model ensemble currently represents the primary data for
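
    The flavour of the hands-on exercises can be conveyed by a minimal Metropolis sampler (a toy inference problem, not a climate model; all settings invented): estimate one uncertain parameter from noisy pseudo-observations.

```python
import numpy as np

rng = np.random.default_rng(6)
obs = 2.5 + rng.normal(0.0, 0.3, 20)      # pseudo-observations of truth 2.5

def log_post(theta):                      # flat prior, Gaussian likelihood
    return -0.5 * np.sum((obs - theta) ** 2) / 0.3**2

chain, theta = [], 0.0
for _ in range(20_000):
    prop = theta + rng.normal(0.0, 0.2)   # random-walk proposal
    if np.log(rng.uniform()) < log_post(prop) - log_post(theta):
        theta = prop                      # accept
    chain.append(theta)

post = np.array(chain[5_000:])            # discard burn-in
print(f"posterior: {post.mean():.2f} +/- {post.std():.2f}")
```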

  16. Experimental studies of uncertainties associated with chromatographic techniques.

    Science.gov (United States)

    Barwick, V J; Ellison, S L; Lucking, C L; Burn, M J

    2001-05-25

    The paper describes experiments for the evaluation of uncertainties associated with a number of chromatographic parameters. Studies of the analysis of vitamins by HPLC illustrate the estimation of the uncertainties associated with experimental "input" parameters such as the detector wavelength, column temperature and mobile phase flow-rate. Experimental design techniques, which allow the efficient study of a number of parameters simultaneously, are described. Multiple linear regression was used to fit response surfaces to the data, and the resulting equations were used in the estimation of the uncertainties. Three approaches to uncertainty calculation were compared: Kragten's spreadsheet, the symmetric spreadsheet and algebraic differentiation. In cases where non-linearity in the model was significant, agreement between the uncertainty estimates was poor, as the spreadsheet approaches do not include second-order uncertainty terms.
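
    The spreadsheet-versus-algebra comparison is easy to reproduce on a simple calibration model. The sketch below (function and numbers invented, not the paper's vitamin data) implements Kragten's one-at-a-time perturbation and exact first-order differentiation for c_sample = (A_sample / A_std) * c_std.

```python
import numpy as np

x = {"A_sample": 1250.0, "A_std": 1180.0, "c_std": 10.0}  # input values
u = {"A_sample": 15.0,   "A_std": 15.0,   "c_std": 0.05}  # std uncertainties

def f(A_sample, A_std, c_std):
    return A_sample / A_std * c_std

y = f(**x)

# Kragten: perturb one input at a time by its standard uncertainty.
u_kragten = np.sqrt(sum((f(**{**x, k: x[k] + u[k]}) - y) ** 2 for k in x))

# Algebraic: exact first-order partial derivatives.
dA_sam = x["c_std"] / x["A_std"]
dA_std = -x["A_sample"] * x["c_std"] / x["A_std"] ** 2
dc_std = x["A_sample"] / x["A_std"]
u_alg = np.sqrt((dA_sam * u["A_sample"]) ** 2 +
                (dA_std * u["A_std"]) ** 2 +
                (dc_std * u["c_std"]) ** 2)
print(f"y = {y:.3f}, u(Kragten) = {u_kragten:.4f}, u(algebraic) = {u_alg:.4f}")
```

    For this nearly linear model the two estimates agree closely; the paper's point is that they diverge once second-order terms become significant.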

  17. Durability reliability analysis for corroding concrete structures under uncertainty

    Science.gov (United States)

    Zhang, Hao

    2018-02-01

    This paper presents a durability reliability analysis of reinforced concrete structures subject to the action of marine chloride. The focus is to provide insight into the role of epistemic uncertainties on durability reliability. The corrosion model involves a number of variables whose probabilistic characteristics cannot be fully determined due to the limited availability of supporting data. All sources of uncertainty, both aleatory and epistemic, should be included in the reliability analysis. Two methods are available to formulate the epistemic uncertainty: the imprecise probability-based method and the purely probabilistic method in which the epistemic uncertainties are modeled as random variables. The paper illustrates how the epistemic uncertainties are modeled and propagated in the two methods, and shows how epistemic uncertainties govern the durability reliability.
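
    The "purely probabilistic" treatment can be sketched as a double-loop Monte Carlo (limit state and all numbers invented, not the paper's chloride model): the outer loop samples the poorly known epistemic parameter, the inner loop the aleatory variability, yielding a distribution of failure probabilities rather than a single value.

```python
import numpy as np

rng = np.random.default_rng(7)
cover, lifetime = 50.0, 50.0          # concrete cover (mm), service life (yr)

p_fail = []
for _ in range(200):                                     # epistemic loop
    mu_rate = max(rng.normal(0.8, 0.2), 0.05)            # uncertain mean rate, mm/yr
    rate = rng.lognormal(np.log(mu_rate), 0.25, 10_000)  # aleatory samples
    p_fail.append(np.mean(rate * lifetime > cover))      # front reaches the rebar

lo, hi = np.percentile(p_fail, [5, 95])
print(f"failure probability: median {np.median(p_fail):.3f}, "
      f"90% epistemic band {lo:.3f} - {hi:.3f}")
```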

  18. Informal Risk Perceptions and Formal Theory

    International Nuclear Information System (INIS)

    Cayford, Jerry

    2001-01-01

    Economists have argued persuasively that our goals are wider than just risk minimization, and that they include a prudent weighing of costs and benefits. This economic line of thought recognizes that our policy goals are complex. As we widen the range of goals we are willing to entertain, though, we need to check that the methods we customarily employ are appropriate for the tasks to which we customarily apply them. This paper examines some economic methods of risk assessment, in light of the question of what our policy goals are and should be. Once the question of goals is open, more complexities than just cost intrude: what the public wants and why begs to be addressed. This leads us to the controversial issue of public risk perceptions. We have now examined a number of procedures that experts use to make public policy decisions. Behind all these issues is always the question of social welfare: what actions can we take, what policies should we embrace, to make the world a better place? In many cases, the public and the experts disagree about what the right choice is. In the first section, we saw a possible defense of the experts based on democratic theory: the people's participation, and even their will, can be legitimately set aside in the pursuit of their true interests. If this defense is to work, a great deal of weight rests on the question of the people's interests and the competence and integrity of the experts' pursuit of it. But at the same time, social preferences are ill-defined, and so are not good candidates for rational actor theory. Both the prescriptive legitimacy claim and the very workings of formal theory we have seen to depend on informal, qualitative, political judgments. Unfortunately, we have also seen a steady pattern of expert reliance on technical procedures even when they were manifestly unsuited to the task. The experts seem so intent on excluding informal thought that they would prefer even a bad quantitative process to a qualitative

  20. Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment, main report

    Energy Technology Data Exchange (ETDEWEB)

    Harper, F.T.; Young, M.L.; Miller, L.A. [Sandia National Labs., Albuquerque, NM (United States); Hora, S.C. [Univ. of Hawaii, Hilo, HI (United States); Lui, C.H. [Nuclear Regulatory Commission, Washington, DC (United States); Goossens, L.H.J.; Cooke, R.M. [Delft Univ. of Technology (Netherlands); Paesler-Sauer, J. [Research Center, Karlsruhe (Germany); Helton, J.C. [and others

    1995-01-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The ultimate objective of the joint effort was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. Experts developed their distributions independently. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model. The resulting distributions closely replicated the aggregated elicited wet deposition distributions. To validate the distributions generated for the dispersion code input variables, samples were taken from the distributions and propagated through the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the first of a three-volume document describing the project.
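
    The validation step described above, sampling elicited distributions and pushing them through the Gaussian plume model, can be sketched as follows (all distributions invented for illustration, not the elicited ones; ground-level centreline concentration with full reflection).

```python
import numpy as np

rng = np.random.default_rng(8)
n = 20_000
Q, u_wind, H = 1.0, 5.0, 50.0     # source strength, wind speed, stack height
x = 1_000.0                       # downwind distance, m

# Dispersion widths sigma = a * x**b with sampled coefficients.
sig_y = rng.lognormal(np.log(0.08), 0.3, n) * x ** rng.normal(0.90, 0.02, n)
sig_z = rng.lognormal(np.log(0.06), 0.4, n) * x ** rng.normal(0.85, 0.02, n)

# Gaussian plume at y = 0, z = 0 with ground reflection.
C = Q / (2 * np.pi * u_wind * sig_y * sig_z) * 2.0 * np.exp(-H**2 / (2 * sig_z**2))
lo, med, hi = np.percentile(C, [5, 50, 95])
print(f"concentration: median {med:.2e}, 90% range {lo:.2e} - {hi:.2e}")
```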