WorldWideScience

Sample records for include formal uncertainties

  1. Formal Ontologies and Uncertainty. In Geographical Knowledge

    Directory of Open Access Journals (Sweden)

    Matteo Caglioni

    2014-05-01

    Formal ontologies have proved to be a very useful tool to manage interoperability among data, systems and knowledge. In this paper we show how formal ontologies can evolve from a crisp, deterministic framework (ontologies of hard knowledge) to new probabilistic, fuzzy or possibilistic frameworks (ontologies of soft knowledge). This can considerably enlarge the application potential of formal ontologies in geographic analysis and planning, where soft knowledge is intrinsically linked to the complexity of the phenomena under study. The paper briefly presents these new uncertainty-based formal ontologies. It then highlights how ontologies are formal tools to define both concepts and relations among concepts. An example from the domain of urban geography finally shows how the cause-to-effect relation between household preferences and urban sprawl can be encoded within a crisp, a probabilistic and a possibilistic ontology, respectively. The ontology formalism also determines the kind of reasoning that can be developed from the available knowledge. Uncertain ontologies can be seen as the preliminary phase of more complex uncertainty-based models. The advantage of moving to uncertainty-based models is evident: whether in the analysis of geographic space or in decision support for planning, reasoning on geographic space is almost always reasoning with uncertain knowledge of geographic phenomena.
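
    As a rough illustration of the contrast drawn above (our sketch, not the paper's ontology encoding; all names and numbers are invented), the same cause-to-effect rule can be annotated under the three formalisms:

        # Hypothetical sketch: one relation ("preference for detached housing
        # causes sprawl") annotated under three knowledge frameworks.
        crisp_rule = {"cause": "detached_housing_preference",
                      "effect": "urban_sprawl",
                      "holds": True}                # hard knowledge: true or false

        probabilistic_rule = {"cause": "detached_housing_preference",
                              "effect": "urban_sprawl",
                              "probability": 0.8}   # chance the effect follows

        possibilistic_rule = {"cause": "detached_housing_preference",
                              "effect": "urban_sprawl",
                              "possibility": 1.0,   # fully possible...
                              "necessity": 0.6}     # ...but only partly certain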

  2. Towards a formal taxonomy of hybrid uncertainty representations

    Energy Technology Data Exchange (ETDEWEB)

    Joslyn, C.; Rocha, L.

    1997-02-01

    Recent years have seen a proliferation of methods in addition to probability theory to represent information and uncertainty, including fuzzy sets and systems, fuzzy measures, rough sets, random sets, possibility distributions, imprecise probabilities, etc. We can identify these fields collectively as General Information Theory (GIT). The components of GIT represent information according to different axiomatic bases, and are thus capable of capturing different semantic aspects of uncertainty. Traditionally, these semantic criteria include such categories as fuzziness, vagueness, nonspecificity, conflict, and randomness. There is thus a pressing need for the GIT community to synthesize these methods, searching out larger formal frameworks within which to place the various components with respect to each other. Ideally, syntactic (mathematical) generalization can both aid and be aided by the semantic analysis available in terms of the conceptual categories outlined above. In this paper we present some preliminary ideas about how to formally relate various uncertainty representations together in a taxonomic lattice, capturing both syntactic and semantic generalization. Some partial and provisional results are shown. Assume a simple finite universe of discourse Ω = {a, b, c}. We want to describe a situation in which we ask a question of the sort "what is the value of a variable x which takes values in Ω?" When there is no uncertainty, we have a single alternative, say x = a. In logical terms, we would say that the proposition p: "the value of x is a" is TRUE. Our approach begins with two primitive concepts which can change our knowledge of x, each of which represents a different form of uncertainty: nonspecificity and fuzziness.
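
    A minimal numerical illustration of the two primitives (our example, using the standard Hartley measure for nonspecificity and one common index of fuzziness; these are not results from the paper):

        from math import log2

        omega = {"a", "b", "c"}

        # Nonspecificity: the answer is only known to lie in a subset A of omega.
        A = {"a", "b"}                     # "x is a or b" -- nonspecific
        hartley = log2(len(A))             # 1 bit of nonspecificity; 0 if |A| == 1

        # Fuzziness: each alternative holds only to a degree in [0, 1].
        membership = {"a": 1.0, "b": 0.4, "c": 0.0}
        # One common fuzziness index: total distance from a crisp (0/1) set.
        fuzziness = sum(min(m, 1 - m) for m in membership.values())

        print(hartley, fuzziness)          # 1.0 0.4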

  3. Uncertainty principle in loop quantum cosmology by Moyal formalism

    Science.gov (United States)

    Perlov, Leonid

    2018-03-01

    In this paper, we derive the uncertainty principle for the loop quantum cosmology homogeneous and isotropic Friedmann-Lemaître-Robertson-Walker model with the holonomy-flux algebra. The uncertainty principle relates the variable c, with the meaning of connection, and μ, having the meaning of the physical cell volume to the power 2/3, i.e., v^{2/3}, or a plaquette area. Since μ and c are not operators but rather random variables, the Robertson uncertainty-principle derivation that works for Hermitian operators cannot be used. Instead we use the Wigner-Moyal-Groenewold phase-space formalism. The Wigner-Moyal-Groenewold formalism was originally applied to the Heisenberg algebra of quantum mechanics. One can derive from it both the canonical and the path-integral quantum mechanics, as well as the uncertainty principle. In this paper, we apply it to the holonomy-flux algebra in the case of homogeneous and isotropic space. Another result is an expression for the Wigner function on the space of cylindrical wave functions defined on the Bohr compactification R_Bohr in the c variables rather than in the dual-space μ variables.
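
    For contrast, the standard Robertson bound that the abstract says cannot be applied here is, for Hermitian operators A and B,

        \sigma_A \, \sigma_B \ \ge \ \tfrac{1}{2} \bigl| \langle [A, B] \rangle \bigr| ,

    which presupposes that A and B act as operators on a Hilbert space; the abstract's point is that c and μ enter only as phase-space variables, so the Wigner-Moyal route is needed instead.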

  4. A formal treatment of uncertainty sources in a level 2 PSA

    International Nuclear Information System (INIS)

    Ahn, Kwang Il; Yang, Joon Eon

    2003-01-01

    The methodological framework of the level 2 PSA appears to be currently standardized in a formalized fashion, but there have been different opinions on the way the sources of uncertainty are characterized and treated. This is primarily because the level 2 PSA deals with complex phenomenological processes that are deterministic in nature rather than random processes, and there are no probabilistic models characterizing them clearly. As a result, the probabilistic quantification of the level 2 PSA is often subject to two sources of uncertainty: (a) incomplete modeling of accident pathways or different predictions for the behavior of phenomenological events and (b) expert-to-expert variation in estimating the occurrence probability of phenomenological events. While a clear definition of the two sources of uncertainty involved in the level 2 PSA makes it possible to treat uncertainty in a consistent manner, careless application of these different sources of uncertainty may produce different conclusions in the decision-making process. The primary purpose of this paper is to characterize typical sources of uncertainty that would often be addressed in the level 2 PSA and their impacts on the level 2 PSA risk results. An additional purpose of this paper is to give a formal approach on how to combine random uncertainties addressed in the level 1 PSA with subjectivistic uncertainties addressed in the level 2 PSA.

  5. A formal guidance for handling different uncertainty sources employed in the level 2 PSA

    International Nuclear Information System (INIS)

    Ahn, Kwang Il; Yang, Joon Eon; Ha, Jae Joo

    2004-01-01

    The methodological framework of the level 2 PSA appears to be currently standardized in a formalized fashion, but there have been different opinions on the way the sources of uncertainty are characterized and treated. This is primarily because the level 2 PSA deals with complex phenomenological processes that are deterministic in nature rather than random processes, and there are no probabilistic models characterizing them clearly. As a result, the probabilistic quantification of the level 2 PSA CET/APET is often subject to two sources of uncertainty: (a) incomplete modeling of accident pathways or different predictions for the behavior of phenomenological events and (b) expert-to-expert variation in estimating the occurrence probability of phenomenological events. While a clear definition of the two sources of uncertainty involved in the level 2 PSA makes it possible to treat uncertainty in a consistent manner, careless application of these different sources of uncertainty may produce different conclusions in the decision-making process. The primary purpose of this paper is to characterize typical sources of uncertainty that would often be addressed in the level 2 PSA and to provide formal guidance for quantifying their impacts on the level 2 PSA risk results. An additional purpose of this paper is to give a formal approach on how to combine random uncertainties addressed in the level 1 PSA with subjectivistic uncertainties addressed in the level 2 PSA.

  6. Formal modeling of a system of chemical reactions under uncertainty.

    Science.gov (United States)

    Ghosh, Krishnendu; Schlipf, John

    2014-10-01

    We describe a novel formalism representing a system of chemical reactions, with imprecise rates of reactions and concentrations of chemicals, and describe a model reduction method, pruning, based on the chemical properties. We present two algorithms, midpoint approximation and interval approximation, for construction of efficient model abstractions with uncertainty in data. We evaluate computational feasibility by posing queries in computation tree logic (CTL) on a prototype of extracellular-signal-regulated kinase (ERK) pathway.
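
    As a toy sketch of the two approximations named above, under our own reading (standard interval arithmetic, not the authors' algorithms; names and numbers invented):

        # Propagate an imprecise rate constant k and concentration x
        # through a mass-action term r = k * x.
        k = (0.8, 1.2)   # interval for the rate constant
        x = (2.0, 3.0)   # interval for the chemical concentration

        # Midpoint approximation: collapse each interval to a point first.
        mid = lambda iv: 0.5 * (iv[0] + iv[1])
        r_midpoint = mid(k) * mid(x)                 # 2.5

        # Interval approximation: keep bounds (both intervals nonnegative,
        # so the product's bounds come from the endpoint products).
        r_interval = (k[0] * x[0], k[1] * x[1])      # (1.6, 3.6)

        print(r_midpoint, r_interval)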

  7. Universal uncertainty principle in the measurement operator formalism

    International Nuclear Information System (INIS)

    Ozawa, Masanao

    2005-01-01

    Heisenberg's uncertainty principle has been understood to set a limitation on measurements; however, the long-standing mathematical formulation established by Heisenberg, Kennard, and Robertson does not allow such an interpretation. Recently, a new relation was found that gives a universally valid relation between noise and disturbance in general quantum measurements, and it has become clear that the new relation plays the role of a first principle from which to derive various quantum limits on measurement and information processing in a unified treatment. This paper examines the above development on the noise-disturbance uncertainty principle in the model-independent approach based on the measurement operator formalism, which is widely accepted to describe a class of generalized measurements in the field of quantum information. We obtain explicit formulae for the noise and disturbance of measurements given by measurement operators, and show that projective measurements do not satisfy the Heisenberg-type noise-disturbance relation that is typical in the gamma-ray-microscope thought experiments. We also show that the disturbance on a Pauli operator caused by a projective measurement of another Pauli operator constantly equals √2, and examine how this measurement violates the Heisenberg-type relation but satisfies the new noise-disturbance relation.
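
    The "new relation" referred to here is widely known as Ozawa's inequality: for the noise ε(A) of an A measurement and the disturbance η(B) it causes on B,

        \epsilon(A)\,\eta(B) + \epsilon(A)\,\sigma(B) + \sigma(A)\,\eta(B) \ \ge \ \tfrac{1}{2} \bigl| \langle [A, B] \rangle \bigr| ,

    whereas the Heisenberg-type relation keeps only the first product on the left; the projective-measurement examples in the abstract violate the latter while satisfying the former.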

  8. Uncertainty, joint uncertainty, and the quantum uncertainty principle

    International Nuclear Information System (INIS)

    Narasimhachar, Varun; Poostindouz, Alireza; Gour, Gilad

    2016-01-01

    Historically, the element of uncertainty in quantum mechanics has been expressed through mathematical identities called uncertainty relations, a great many of which continue to be discovered. These relations use diverse measures to quantify uncertainty (and joint uncertainty). In this paper we use operational information-theoretic principles to identify the common essence of all such measures, thereby defining measure-independent notions of uncertainty and joint uncertainty. We find that most existing entropic uncertainty relations use measures of joint uncertainty that admit only a small class of operational interpretations. Our notion relaxes this restriction, revealing previously unexplored joint uncertainty measures. To illustrate the utility of our formalism, we derive an uncertainty relation based on one such new measure. We also use our formalism to gain insight into the conditions under which measure-independent uncertainty relations can be found. (paper)
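
    A standard instance of the entropic relations discussed above is the Maassen-Uffink bound: for two observables with eigenbases {|a⟩} and {|b⟩},

        H(A) + H(B) \ \ge \ -\log_2 \max_{a,b} \bigl| \langle a | b \rangle \bigr|^2 ,

    where the sum of Shannon entropies on the left is one particular choice of joint-uncertainty measure of the kind whose operational content the paper analyzes.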

  9. Joint analysis of input and parametric uncertainties in watershed water quality modeling: A formal Bayesian approach

    Science.gov (United States)

    Han, Feng; Zheng, Yi

    2018-06-01

    Significant input uncertainty is a major source of error in watershed water quality (WWQ) modeling. It remains challenging to address the input uncertainty in a rigorous Bayesian framework. This study develops the Bayesian Analysis of Input and Parametric Uncertainties (BAIPU), an approach for the joint analysis of input and parametric uncertainties through a tight coupling of Markov Chain Monte Carlo (MCMC) analysis and Bayesian Model Averaging (BMA). The formal likelihood function for this approach is derived considering a lag-1 autocorrelated, heteroscedastic, and Skew Exponential Power (SEP) distributed error model. A series of numerical experiments were performed based on a synthetic nitrate pollution case and on a real study case in the Newport Bay Watershed, California. The Soil and Water Assessment Tool (SWAT) and Differential Evolution Adaptive Metropolis (DREAM(ZS)) were used as the representative WWQ model and MCMC algorithm, respectively. The major findings include the following: (1) the BAIPU can be implemented and used to appropriately identify the uncertain parameters and characterize the predictive uncertainty; (2) the compensation effect between the input and parametric uncertainties can seriously mislead modeling-based management decisions if the input uncertainty is not explicitly accounted for; (3) the BAIPU accounts for the interaction between the input and parametric uncertainties and therefore provides more accurate calibration and uncertainty results than a sequential analysis of the uncertainties; and (4) the BAIPU quantifies the credibility of different input assumptions on a statistical basis and can be implemented as an effective inverse modeling approach to the joint inference of parameters and inputs.
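
    A minimal sketch of a likelihood of the kind described above, simplified to Gaussian innovations in place of the paper's SEP distribution (the function name and the linear heteroscedastic scale model are our assumptions):

        import numpy as np

        def log_likelihood(obs, sim, phi, sigma0, sigma1):
            """Gaussian stand-in for a lag-1 autocorrelated, heteroscedastic
            error model: e_t = obs_t - sim_t, innovations a_t = e_t - phi*e_{t-1},
            with scale sigma_t = sigma0 + sigma1 * sim_t."""
            e = np.asarray(obs) - np.asarray(sim)          # raw residuals
            a = e[1:] - phi * e[:-1]                       # decorrelated innovations
            sigma = sigma0 + sigma1 * np.asarray(sim)[1:]  # heteroscedastic scale
            return -0.5 * np.sum(np.log(2 * np.pi * sigma**2) + (a / sigma) ** 2)

        # Evaluate one parameter set against synthetic data.
        rng = np.random.default_rng(0)
        sim = np.linspace(1.0, 5.0, 50)
        obs = sim + rng.normal(scale=0.1 + 0.05 * sim)
        print(log_likelihood(obs, sim, phi=0.3, sigma0=0.1, sigma1=0.05))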

  10. A formal statistical approach to representing uncertainty in rainfall-runoff modelling with focus on residual analysis and probabilistic output evaluation - Distinguishing simulation and prediction

    DEFF Research Database (Denmark)

    Breinholt, Anders; Møller, Jan Kloppenborg; Madsen, Henrik

    2012-01-01

    While there seems to be consensus that hydrological model outputs should be accompanied by an uncertainty estimate, the appropriate method for uncertainty estimation is not agreed upon, and a debate is ongoing between advocates of formal statistical methods, who consider errors as stochastic...... and GLUE advocates, who consider errors as epistemic, arguing that the basis of formal statistical approaches, which requires the residuals to be stationary and conform to a statistical distribution, is unrealistic. In this paper we take a formal frequentist approach to parameter estimation and uncertainty...... necessary but the statistical assumptions were nevertheless not 100% justified. The residual analysis showed that significant autocorrelation was present for all simulation models. We believe users of formal approaches to uncertainty evaluation within hydrology and within environmental modelling in general...

  11. Momentum conservation decides Heisenberg's interpretation of the uncertainty formulas

    International Nuclear Information System (INIS)

    Angelidis, T.D.

    1977-01-01

    In the light of Heisenberg's interpretation of the uncertainty formulas, the conditions necessary for the derivation of the quantitative statement or law of momentum conservation are considered. The result of these considerations is a contradiction between the formalism of quantum physics and the asserted consequences of Heisenberg's interpretation. This contradiction decides against Heisenberg's interpretation of the uncertainty formulas, upholding that the formalism of quantum physics is both consistent and complete, at least insofar as the statement of momentum conservation can be proved within this formalism. A few comments are also included on Bohr's complementarity interpretation of the formalism of quantum physics. A suggestion, based on a statistical mode of empirical testing of the uncertainty formulas, does not give rise to any such contradiction.

  12. Uncertainty and measurement

    International Nuclear Information System (INIS)

    Landsberg, P.T.

    1990-01-01

    This paper explores how the uncertainty relation of quantum mechanics can be considered to result from measurements. A distinction is drawn between the uncertainties obtained by scrutinising experiments and the standard-deviation type of uncertainty definition used in the quantum formalism. (UK)

  13. Conditional uncertainty principle

    Science.gov (United States)

    Gour, Gilad; Grudka, Andrzej; Horodecki, Michał; Kłobus, Waldemar; Łodyga, Justyna; Narasimhachar, Varun

    2018-04-01

    We develop a general operational framework that formalizes the concept of conditional uncertainty in a measure-independent fashion. Our formalism is built upon a mathematical relation which we call conditional majorization. We define conditional majorization and, for the case of classical memory, we provide its thorough characterization in terms of monotones, i.e., functions that preserve the partial order under conditional majorization. We demonstrate the application of this framework by deriving two types of memory-assisted uncertainty relations, (1) a monotone-based conditional uncertainty relation and (2) a universal measure-independent conditional uncertainty relation, both of which set a lower bound on the minimal uncertainty that Bob has about Alice's pair of incompatible measurements, conditioned on an arbitrary measurement that Bob makes on his own system. We next compare the obtained relations with their existing entropic counterparts and find that they are at least independent.
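
    For orientation, plain (unconditional) majorization, which the conditional relation above generalizes, is the partial order on probability vectors defined by

        p \prec q \ \iff \ \sum_{i=1}^{k} p_i^{\downarrow} \ \le \ \sum_{i=1}^{k} q_i^{\downarrow} \quad \text{for all } k ,

    where the down-arrow denotes sorting in decreasing order; a vector lower in this order is flatter, hence more uncertain, and monotones are real-valued functions that respect the order.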

  14. Improved best estimate plus uncertainty methodology, including advanced validation concepts, to license evolving nuclear reactors

    International Nuclear Information System (INIS)

    Unal, C.; Williams, B.; Hemez, F.; Atamturktur, S.H.; McClure, P.

    2011-01-01

    Research highlights: → The best estimate plus uncertainty (BEPU) methodology is one option in the licensing of nuclear reactors. → The challenges in extending the BEPU method to fuel qualification for an advanced reactor fuel are primarily driven by schedule, the need for data, and the sufficiency of the data. → In this paper we develop an extended BEPU methodology that can potentially be used to address these new challenges in the design and licensing of advanced nuclear reactors. → The main components of the proposed methodology are verification, validation, calibration, and uncertainty quantification. → The methodology includes a formalism to quantify an adequate level of validation (predictive maturity) with respect to existing data, so that required new testing can be minimized, saving cost by demonstrating that further testing will not enhance the quality of the predictive tools. - Abstract: Many evolving nuclear energy technologies use advanced predictive multiscale, multiphysics modeling and simulation (M&S) capabilities to reduce the cost and schedule of design and licensing. Historically, experiments served as the primary tool for the design and understanding of nuclear system behavior, while M&S played the subordinate role of supporting experiments. In the new era of multiscale, multiphysics computational-based technology development, this role has been reversed. Experiments will still be needed, but they will be performed at different scales to calibrate and validate the models leading to predictive simulations for design and licensing. Minimizing the required number of validation experiments produces cost and time savings. The use of multiscale, multiphysics models introduces challenges in validating these predictive tools; traditional methodologies will have to be modified to address these challenges. This paper gives the basic aspects of a methodology that can potentially be used to address these new challenges in

  15. Including uncertainty in hazard analysis through fuzzy measures

    International Nuclear Information System (INIS)

    Bott, T.F.; Eisenhawer, S.W.

    1997-12-01

    This paper presents a method for capturing the uncertainty expressed by a Hazard Analysis (HA) expert team when estimating the frequencies and consequences of accident sequences, and provides a sound mathematical framework for propagating this uncertainty to the risk estimates for these accident sequences. The uncertainty is readily expressed as distributions that can visually aid the analyst in determining the extent and source of risk uncertainty in HA accident sequences. The results can also be expressed as single statistics of the distribution, in a manner analogous to expressing a probabilistic distribution as a point-value statistic such as a mean or median. The study discussed here used data collected during the elicitation portion of an HA on a high-level waste transfer process to demonstrate the techniques for capturing uncertainty. These data came from observations of the uncertainty that HA team members expressed in assigning frequencies and consequences to accident sequences during an actual HA. This uncertainty was captured and manipulated using ideas from possibility theory. The result of this study is a practical method for displaying and assessing the uncertainty in the HA team estimates of the frequency and consequences for accident sequences. This uncertainty provides potentially valuable information about accident sequences that typically is lost in the HA process.
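
    A minimal sketch of the possibility-theoretic propagation idea (our illustration with invented numbers; triangular fuzzy numbers and alpha-cut interval arithmetic are standard tools, not necessarily the paper's exact procedure):

        def alpha_cut(tri, alpha):
            """Interval of a triangular fuzzy number (low, mode, high) at level alpha."""
            low, mode, high = tri
            return (low + alpha * (mode - low), high - alpha * (high - mode))

        # Hypothetical elicited values: accident frequency (per year) and consequence.
        freq = (1e-4, 5e-4, 2e-3)
        cons = (10.0, 50.0, 200.0)

        # Propagate risk = frequency * consequence on each alpha-cut
        # (endpoint products suffice since all quantities are positive).
        for alpha in (0.0, 0.5, 1.0):
            f_lo, f_hi = alpha_cut(freq, alpha)
            c_lo, c_hi = alpha_cut(cons, alpha)
            print(alpha, (f_lo * c_lo, f_hi * c_hi))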

  16. Incorporating model parameter uncertainty into inverse treatment planning

    International Nuclear Information System (INIS)

    Lian Jun; Xing Lei

    2004-01-01

    Radiobiological treatment planning depends not only on the accuracy of the models describing the dose-response relation of different tumors and normal tissues but also on the accuracy of the tissue-specific radiobiological parameters in these models. Whereas the general formalism remains the same, different sets of model parameters lead to different solutions and thus critically determine the final plan. Here we describe an inverse planning formalism with inclusion of model parameter uncertainties. This is made possible by using a statistical analysis-based framework developed by our group. In this formalism, the uncertainties of model parameters, such as the parameter a that describes the tissue-specific effect in the equivalent uniform dose (EUD) model, are expressed by a probability density function and are included in the dose optimization process. We found that the final solution strongly depends on the distribution functions of the model parameters. Considering that currently available models for computing biological effects of radiation are simplistic, and the clinical data used to derive the models are sparse and of questionable quality, the proposed technique provides us with an effective tool to minimize the effect caused by the uncertainties in a statistical sense. With the incorporation of the uncertainties, the technique has the potential to make maximal use of the available radiobiology knowledge for better IMRT treatment.
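
    The EUD model mentioned above has the standard form EUD = (sum_i v_i d_i^a)^(1/a). A rough sketch of folding uncertainty in the parameter a into the evaluation (the distribution and all numbers are invented for illustration):

        import numpy as np

        def eud(doses, volumes, a):
            """Equivalent uniform dose: (sum_i v_i * d_i**a) ** (1/a)."""
            return np.sum(volumes * doses**a) ** (1.0 / a)

        doses = np.array([60.0, 62.0, 58.0])    # Gy, per sub-volume
        volumes = np.array([0.5, 0.3, 0.2])     # fractional volumes, sum to 1

        # Sample a from an assumed probability density instead of fixing it.
        rng = np.random.default_rng(1)
        a_samples = rng.normal(loc=-10.0, scale=2.0, size=10_000)
        euds = np.array([eud(doses, volumes, a) for a in a_samples])
        print(euds.mean(), euds.std())          # EUD spread under parameter uncertainty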

  17. Uncertainty in artificial intelligence

    CERN Document Server

    Shachter, RD; Henrion, M; Lemmer, JF

    1990-01-01

    This volume, like its predecessors, reflects the cutting edge of research on the automation of reasoning under uncertainty. A more pragmatic emphasis is evident, for although some papers address fundamental issues, the majority address practical issues. Topics include the relations between alternative formalisms (including possibilistic reasoning), Dempster-Shafer belief functions, non-monotonic reasoning, Bayesian and decision theoretic schemes, and new inference techniques for belief nets. New techniques are applied to important problems in medicine, vision, robotics, and natural language und

  18. Revisiting the formal foundation of Probabilistic Databases

    NARCIS (Netherlands)

    Wanders, B.; van Keulen, Maurice

    2015-01-01

    One of the core problems in soft computing is dealing with uncertainty in data. In this paper, we revisit the formal foundation of a class of probabilistic databases with the purpose to (1) obtain data model independence, (2) separate metadata on uncertainty and probabilities from the raw data, (3)

  19. Uncertainty analysis of time-dependent nonlinear systems: theory and application to transient thermal hydraulics

    International Nuclear Information System (INIS)

    Barhen, J.; Bjerke, M.A.; Cacuci, D.G.; Mullins, C.B.; Wagschal, G.G.

    1982-01-01

    An advanced methodology for performing systematic uncertainty analysis of time-dependent nonlinear systems is presented. This methodology includes a capability for reducing uncertainties in system parameters and responses by using Bayesian inference techniques to consistently combine prior knowledge with additional experimental information. The determination of best estimates for the system parameters, for the responses, and for their respective covariances is treated as a time-dependent constrained minimization problem. Three alternative formalisms for solving this problem are developed. The two "off-line" formalisms, with and without "foresight" characteristics, require the generation of a complete sensitivity data base prior to performing the uncertainty analysis. The "online" formalism, in which uncertainty analysis is performed interactively with the system analysis code, is best suited for treatment of large-scale highly nonlinear time-dependent problems. This methodology is applied to the uncertainty analysis of a transient upflow of a high-pressure water heat transfer experiment. For comparison, an uncertainty analysis using sensitivities computed by standard response-surface techniques is also performed. The results of the analysis indicate the following. Major reduction of the discrepancies in the calculation/experiment ratios is achieved by using the new methodology. Incorporation of in-bundle measurements in the uncertainty analysis significantly reduces system uncertainties. Accuracy of sensitivities generated by response-surface techniques should be carefully assessed prior to using them as a basis for uncertainty analyses of transient reactor safety problems.

  20. Quantum time uncertainty in a gravity's rainbow formalism

    International Nuclear Information System (INIS)

    Galan, Pablo; Marugan, Guillermo A. Mena

    2004-01-01

    The existence of a minimum time uncertainty is usually argued to be a consequence of the combination of quantum mechanics and general relativity. Most of the studies that point to this result are nonetheless based on perturbative quantization approaches, in which the effect of matter on the geometry is regarded as a correction to a classical background. In this paper, we consider rainbow spacetimes constructed from doubly special relativity by using a modification of the proposals of Magueijo and Smolin. In these models, gravitational effects are incorporated (at least to a certain extent) in the definition of the energy-momentum of particles without adhering to a perturbative treatment of the backreaction. In this context, we derive and compare the expressions of the time uncertainty in quantizations that use as evolution parameter either the background or the rainbow time coordinates. These two possibilities can be regarded as corresponding to perturbative and nonperturbative quantization schemes, respectively. We show that, while a nonvanishing time uncertainty is generically unavoidable in a perturbative framework, an infinite time resolution can in fact be achieved in a nonperturbative quantization for the whole family of doubly special relativity theories with unbounded physical energy.

  1. Including model uncertainty in risk-informed decision making

    International Nuclear Information System (INIS)

    Reinert, Joshua M.; Apostolakis, George E.

    2006-01-01

    Model uncertainties can have a significant impact on decisions regarding licensing basis changes. We present a methodology to identify basic events in the risk assessment that have the potential to change the decision and are known to have significant model uncertainties. Because we work with basic event probabilities, this methodology is not appropriate for analyzing uncertainties that cause a structural change to the model, such as success criteria. We use the risk achievement worth (RAW) importance measure with respect to both the core damage frequency (CDF) and the change in core damage frequency (ΔCDF) to identify potentially important basic events. We cross-check these with generically important model uncertainties. Then, sensitivity analysis is performed on the basic event probabilities, which are used as a proxy for the model parameters, to determine how much error in these probabilities would need to be present in order to impact the decision. A previously submitted licensing basis change is used as a case study. Analysis using the SAPHIRE program identifies 20 basic events as important, four of which have model uncertainties that have been identified in the literature as generally important. The decision is fairly insensitive to uncertainties in these basic events. In three of these cases, one would need to show that model uncertainties would lead to basic event probabilities that would be between two and four orders of magnitude larger than modeled in the risk assessment before they would become important to the decision. More detailed analysis would be required to determine whether these higher probabilities are reasonable. Methods to perform this analysis from the literature are reviewed and an example is demonstrated using the case study.
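
    Risk achievement worth, used above for screening, is the ratio of the risk metric evaluated with the basic event set to certain to its base value; a minimal sketch with a toy risk model standing in for the PSA (all names and numbers invented):

        def raw(risk_model, probs, event):
            """Risk achievement worth: R(p_event = 1) / R(base case)."""
            base = risk_model(probs)
            bumped = dict(probs, **{event: 1.0})   # set the event to certain
            return risk_model(bumped) / base

        # Toy stand-in for a CDF model: an initiator and two redundant trains.
        cdf = lambda p: p["initiator"] * p["train_a"] * p["train_b"]
        probs = {"initiator": 1e-2, "train_a": 1e-3, "train_b": 1e-3}
        print(raw(cdf, probs, "train_a"))          # 1000.0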

  2. Uncertainty modeling process for semantic technology

    Directory of Open Access Journals (Sweden)

    Rommel N. Carvalho

    2016-08-01

    The ubiquity of uncertainty across application domains generates a need for principled support for uncertainty management in semantically aware systems. A probabilistic ontology provides constructs for representing uncertainty in domain ontologies. While the literature has been growing on formalisms for representing uncertainty in ontologies, there remains little guidance in the knowledge engineering literature for how to design probabilistic ontologies. To address the gap, this paper presents the Uncertainty Modeling Process for Semantic Technology (UMP-ST), a new methodology for modeling probabilistic ontologies. To explain how the methodology works and to verify that it can be applied to different scenarios, this paper describes step-by-step the construction of a proof-of-concept probabilistic ontology. The resulting domain model can be used to support identification of fraud in public procurements in Brazil. While the case study illustrates the development of a probabilistic ontology in the PR-OWL probabilistic ontology language, the methodology is applicable to any ontology formalism that properly integrates uncertainty with domain semantics.

  3. A formal framework for scenario development in support of environmental decision-making

    Science.gov (United States)

    Mahmoud, M.; Liu, Yajing; Hartmann, H.; Stewart, S.; Wagener, T.; Semmens, D.; Stewart, R.; Gupta, H.; Dominguez, D.; Dominguez, F.; Hulse, D.; Letcher, R.; Rashleigh, Brenda; Smith, C.; Street, R.; Ticehurst, J.; Twery, M.; van, Delden H.; Waldick, R.; White, D.; Winter, L.

    2009-01-01

    Scenarios are possible future states of the world that represent alternative plausible conditions under different assumptions. Often, scenarios are developed in a context relevant to stakeholders involved in their applications since the evaluation of scenario outcomes and implications can enhance decision-making activities. This paper reviews the state-of-the-art of scenario development and proposes a formal approach to scenario development in environmental decision-making. The discussion of current issues in scenario studies includes advantages and obstacles in utilizing a formal scenario development framework, and the different forms of uncertainty inherent in scenario development, as well as how they should be treated. An appendix for common scenario terminology has been attached for clarity. Major recommendations for future research in this area include proper consideration of uncertainty in scenario studies in particular in relation to stakeholder relevant information, construction of scenarios that are more diverse in nature, and sharing of information and resources among the scenario development research community. © 2008 Elsevier Ltd.

  4. S5-4: Formal Modeling of Affordance in Human-Included Systems

    Directory of Open Access Journals (Sweden)

    Namhun Kim

    2012-10-01

    Although it is necessary for humans to consider the modeling, analysis, and control of human-included systems, this has been considered a challenging problem because of the critical role of humans in complex systems and humans' capability of executing unanticipated actions, both beneficial and detrimental. Thus, to provide systematic approaches to modeling human actions as a part of system behaviors, a formal modeling framework for human-involved systems, in which humans play a controlling role based on their perceptual information, is presented. The theory of affordance provides definitions of human actions and their associated properties; Finite State Automata (FSA) based modeling is capable of mapping nondeterministic humans into computable components in the system representation. In this talk, we investigate the role of perception in human actions in the system operation and examine the representation of perceptual elements in affordance-based modeling formalism. The proposed framework is expected to capture the natural ways in which humans participate in the system as part of its operation. A human-machine cooperative manufacturing system control example and a human agent simulation example will be introduced for illustrative purposes at the end of the presentation.

  5. An uncertainty inventory demonstration - a primary step in uncertainty quantification

    Energy Technology Data Exchange (ETDEWEB)

    Langenbrunner, James R. [Los Alamos National Laboratory; Booker, Jane M [Los Alamos National Laboratory; Hemez, Francois M [Los Alamos National Laboratory; Salazar, Issac F [Los Alamos National Laboratory; Ross, Timothy J [UNM

    2009-01-01

    Tools, methods, and theories for assessing and quantifying uncertainties vary by application. Uncertainty quantification tasks have unique desiderata and circumstances. To realistically assess uncertainty requires the engineer/scientist to specify mathematical models, the physical phenomena of interest, and the theory or framework for assessments. For example, Probabilistic Risk Assessment (PRA) specifically identifies uncertainties using probability theory, and therefore PRAs lack formal procedures for quantifying uncertainties that are not probabilistic. The Phenomena Identification and Ranking Technique (PIRT) proceeds by ranking phenomena using scoring criteria that result in linguistic descriptors, such as importance ranked with the words 'High/Medium/Low.' The use of words allows PIRT to be flexible, but the analysis may then be difficult to combine with other uncertainty theories. We propose that a necessary step for the development of a procedure or protocol for uncertainty quantification (UQ) is the application of an Uncertainty Inventory. An Uncertainty Inventory should be considered and performed in the earliest stages of UQ.

  6. Uncertainty governance: an integrated framework for managing and communicating uncertainties

    International Nuclear Information System (INIS)

    Umeki, H.; Naito, M.; Takase, H.

    2004-01-01

    Treatment of uncertainty, or in other words reasoning with imperfect information, is widely recognised as being of great importance within performance assessment (PA) of geological disposal, mainly because of the time scale of interest and the spatial heterogeneity that the geological environment exhibits. A wide range of formal methods have been proposed for the optimal processing of incomplete information. Many of these methods rely on the use of numerical information, the frequency-based concept of probability in particular, to handle the imperfections. However, taking quantitative information as a base for models that solve the problem of handling imperfect information merely creates another problem, i.e., how to provide the quantitative information. In many situations this second problem proves more resistant to solution, and in recent years several authors have looked at particularly ingenious approaches in accordance with the rules of well-founded methods such as Bayesian probability theory, possibility theory, and the Dempster-Shafer theory of evidence. Those methods, while drawing inspiration from quantitative methods, do not require the kind of complete numerical information required by quantitative methods. Instead they provide information that, though less precise than that provided by quantitative techniques, is often, if not sufficient, the best that could be achieved. Rather than searching for the best method for handling all imperfect information, our strategy for uncertainty management, that is, recognition and evaluation of uncertainties associated with PA followed by planning and implementation of measures to reduce them, is to use whichever method best fits the problem at hand. Such an eclectic position leads naturally to integration of the different formalisms. While uncertainty management based on the combination of semi-quantitative methods forms an important part of our framework for uncertainty governance, it only solves half of the problem.

  7. Formalizing Probabilistic Safety Claims

    Science.gov (United States)

    Herencia-Zapana, Heber; Hagen, George E.; Narkawicz, Anthony J.

    2011-01-01

    A safety claim for a system is a statement that the system, which is subject to hazardous conditions, satisfies a given set of properties. Following work by John Rushby and Bev Littlewood, this paper presents a mathematical framework that can be used to state and formally prove probabilistic safety claims. It also enables hazardous conditions, their uncertainties, and their interactions to be integrated into the safety claim. This framework provides a formal description of the probabilistic composition of an arbitrary number of hazardous conditions and their effects on system behavior. An example is given of a probabilistic safety claim for a conflict detection algorithm for aircraft in a 2D airspace. The motivation for developing this mathematical framework is that it can be used in an automated theorem prover to formally verify safety claims.

  8. Treatment of uncertainties in the IPCC: a philosophical analysis

    Science.gov (United States)

    Jebeile, J.; Drouet, I.

    2014-12-01

    The IPCC produces scientific reports out of findings on climate and climate change. Because the findings are uncertain in many respects, the production of reports requires aggregating assessments of uncertainties of different kinds. This difficult task is currently regulated by the Guidance note for lead authors of the IPCC fifth assessment report on consistent treatment of uncertainties. The note recommends that two metrics, confidence and likelihood, be used for communicating the degree of certainty in findings. Confidence is expressed qualitatively "based on the type, amount, quality, and consistency of evidence […] and the degree of agreement", while likelihood is expressed probabilistically "based on statistical analysis of observations or model results, or expert judgment". Therefore, depending on the evidence evaluated, authors have the choice to present either an assigned level of confidence or a quantified measure of likelihood. But aggregating assessments of these two different kinds is problematic, since the two metrics express distinct and conflicting methodologies. So the question arises whether the treatment of uncertainties in the IPCC is rationally justified. In order to answer the question, it is worth comparing the IPCC procedures with the formal normative theories of epistemic rationality which have been developed by philosophers. These theories, which include contributions to the philosophy of probability and to Bayesian probabilistic confirmation theory, are relevant for our purpose because they are commonly used to assess the rationality of collective judgment formation based on uncertain knowledge. In this paper we make the comparison and pursue the following objectives: (i) we determine whether the IPCC confidence and likelihood can be compared with the notions of uncertainty targeted by or underlying the formal normative theories of epistemic rationality; (ii) we investigate whether the formal normative theories of epistemic rationality justify

  9. Uncertainty of rotating shadowband irradiometers and Si-pyranometers including the spectral irradiance error

    Science.gov (United States)

    Wilbert, Stefan; Kleindiek, Stefan; Nouri, Bijan; Geuder, Norbert; Habte, Aron; Schwandt, Marko; Vignola, Frank

    2016-05-01

    Concentrating solar power projects require accurate direct normal irradiance (DNI) data, including uncertainty specifications, for plant layout and cost calculations. Ground-measured data are necessary to obtain the required level of accuracy and are often obtained with Rotating Shadowband Irradiometers (RSI) that use photodiode pyranometers and correction functions to account for systematic effects. The uncertainty of Si-pyranometers has been investigated, but so far mostly empirical studies have been published, or decisive uncertainty influences had to be estimated from experience in analytical studies. One of the most crucial estimated influences is the spectral irradiance error, because Si-photodiode pyranometers only detect visible and near-infrared radiation and have a spectral response that varies strongly within this wavelength interval. Furthermore, analytic studies did not discuss the role of correction functions and the uncertainty introduced by imperfect shading. In order to further improve the bankability of RSI and Si-pyranometer data, a detailed uncertainty analysis following the Guide to the Expression of Uncertainty in Measurement (GUM) has been carried out. The study defines a method for the derivation of the spectral error and spectral uncertainties and presents quantitative values of the spectral and overall uncertainties. Data from the PSA station in southern Spain were selected for the analysis. Average standard uncertainties of 2% for global horizontal irradiance (GHI) and 2.9% for DNI (for GHI and DNI over 300 W/m²) were found for corrected 10 min data of the 2012 yearly dataset when separate GHI and DHI calibration constants were used. The uncertainty at 1 min resolution was also analyzed. The effect of correction functions is significant. The uncertainties found in this study are consistent with results of previous empirical studies.
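
    The GUM analysis referred to here combines the standard uncertainties u(x_i) of the influence quantities through the usual law of propagation of uncertainty: for a measurand y = f(x_1, ..., x_N) with uncorrelated inputs,

        u_c(y) \ = \ \sqrt{ \sum_{i=1}^{N} \left( \frac{\partial f}{\partial x_i} \right)^{2} u^{2}(x_i) } ,

    with covariance terms added when inputs are correlated; in such an analysis a spectral-error contribution enters as one more component u(x_i) in this sum.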

  10. Approach to uncertainty in risk analysis

    Energy Technology Data Exchange (ETDEWEB)

    Rish, W.R.

    1988-08-01

    In the Fall of 1985 EPA's Office of Radiation Programs (ORP) initiated a project to develop a formal approach to dealing with uncertainties encountered when estimating and evaluating risks to human health and the environment. Based on a literature review of modeling uncertainty, interviews with ORP technical and management staff, and input from experts on uncertainty analysis, a comprehensive approach was developed. This approach recognizes by design the constraints on budget, time, manpower, expertise, and availability of information often encountered in "real world" modeling. It is based on the observation that in practice risk modeling is usually done to support a decision process. As such, the approach focuses on how to frame a given risk modeling problem, how to use that framing to select an appropriate mixture of uncertainty analysis techniques, and how to integrate the techniques into an uncertainty assessment that effectively communicates important information and insight to decision-makers. The approach is presented in this report. Practical guidance on characterizing and analyzing uncertainties about model form and quantities and on effectively communicating uncertainty analysis results is included. Examples from actual applications are presented.

  11. Approach to uncertainty in risk analysis

    International Nuclear Information System (INIS)

    Rish, W.R.

    1988-08-01

    In the Fall of 1985 EPA's Office of Radiation Programs (ORP) initiated a project to develop a formal approach to dealing with uncertainties encountered when estimating and evaluating risks to human health and the environment. Based on a literature review of modeling uncertainty, interviews with ORP technical and management staff, and input from experts on uncertainty analysis, a comprehensive approach was developed. This approach recognizes by design the constraints on budget, time, manpower, expertise, and availability of information often encountered in "real world" modeling. It is based on the observation that in practice risk modeling is usually done to support a decision process. As such, the approach focuses on how to frame a given risk modeling problem, how to use that framing to select an appropriate mixture of uncertainty analysis techniques, and how to integrate the techniques into an uncertainty assessment that effectively communicates important information and insight to decision-makers. The approach is presented in this report. Practical guidance on characterizing and analyzing uncertainties about model form and quantities and on effectively communicating uncertainty analysis results is included. Examples from actual applications are presented.

  12. Network optimization including gas lift and network parameters under subsurface uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Schulze-Riegert, R.; Baffoe, J.; Pajonk, O. [SPT Group GmbH, Hamburg (Germany); Badalov, H.; Huseynov, S. [Technische Univ. Clausthal, Clausthal-Zellerfeld (Germany). ITE; Trick, M. [SPT Group, Calgary, AB (Canada)

    2013-08-01

    Optimization of oil and gas field production systems poses a great challenge to field development due to complex and multiple interactions between various operational design parameters and subsurface uncertainties. Conventional analytical methods are capable of finding local optima based on single deterministic models. They are less applicable for efficiently generating alternative design scenarios in a multi-objective context. Practical implementations of robust optimization workflows integrate the evaluation of alternative design scenarios and multiple realizations of subsurface uncertainty descriptions. Production or economic performance indicators such as NPV (Net Present Value) are linked to a risk-weighted objective function definition to guide the optimization processes. This work focuses on an integrated workflow using a reservoir-network simulator coupled to an optimization framework. The work investigates the impact of design parameters while considering the physics of the reservoir, wells, and surface facilities. Subsurface uncertainties are described by well parameters such as inflow performance. Experimental design methods are used to investigate parameter sensitivities and interactions. Optimization methods are used to find optimal design parameter combinations which improve key performance indicators of the production network system. The proposed workflow is applied to a representative oil reservoir coupled to a network which is modelled by an integrated reservoir-network simulator. Gas lift is included as an explicit measure to improve production. An objective function is formulated for the net present value of the integrated system, including production revenue and facility costs. Facility and gas-lift design parameters are tuned to maximize NPV. Well inflow performance uncertainties are introduced with an impact on gas-lift performance. Resulting variances on NPV are identified as a risk measure for the optimized system design.

  13. Decision-Making under Criteria Uncertainty

    Science.gov (United States)

    Kureychik, V. M.; Safronenkova, I. B.

    2018-05-01

    Uncertainty is an essential part of a decision-making procedure. The paper deals with the problem of decision-making under criteria uncertainty. In this context, decision-making under uncertainty and the types and conditions of uncertainty are examined. The decision-making problem under uncertainty is formalized. A modification of a mathematical decision support method under uncertainty via ontologies is proposed. A distinctive feature of the developed method is its use of ontologies as base elements. The goal of this work is the development of a decision-making method under criteria uncertainty that uses ontologies in the area of multilayer board design. The method is oriented toward improving the technical and economic characteristics of the examined domain.

  14. Uncertainty in Measurement: A Review of Monte Carlo Simulation Using Microsoft Excel for the Calculation of Uncertainties Through Functional Relationships, Including Uncertainties in Empirically Derived Constants

    Science.gov (United States)

    Farrance, Ian; Frenkel, Robert

    2014-01-01

    The Guide to the Expression of Uncertainty in Measurement (usually referred to as the GUM) provides the basic framework for evaluating uncertainty in measurement. The GUM however does not always provide clearly identifiable procedures suitable for medical laboratory applications, particularly when internal quality control (IQC) is used to derive most of the uncertainty estimates. The GUM modelling approach requires advanced mathematical skills for many of its procedures, but Monte Carlo simulation (MCS) can be used as an alternative for many medical laboratory applications. In particular, calculations for determining how uncertainties in the input quantities to a functional relationship propagate through to the output can be accomplished using a readily available spreadsheet such as Microsoft Excel. The MCS procedure uses algorithmically generated pseudo-random numbers which are then forced to follow a prescribed probability distribution. When IQC data provide the uncertainty estimates the normal (Gaussian) distribution is generally considered appropriate, but MCS is by no means restricted to this particular case. With input variations simulated by random numbers, the functional relationship then provides the corresponding variations in the output in a manner which also provides its probability distribution. The MCS procedure thus provides output uncertainty estimates without the need for the differential equations associated with GUM modelling. The aim of this article is to demonstrate the ease with which Microsoft Excel (or a similar spreadsheet) can be used to provide an uncertainty estimate for measurands derived through a functional relationship. In addition, we also consider the relatively common situation where an empirically derived formula includes one or more ‘constants’, each of which has an empirically derived numerical value. Such empirically derived ‘constants’ must also have associated uncertainties which propagate through the functional
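
    A minimal sketch of the MCS procedure described above, written in Python rather than Excel (the measurand, functional relationship, and numbers are invented for illustration):

        import numpy as np

        rng = np.random.default_rng(42)
        n = 100_000                             # number of Monte Carlo trials

        # Input quantities with IQC-style standard uncertainties (mean, u).
        analyte = rng.normal(5.00, 0.05, n)     # measured quantity
        k = rng.normal(1.23, 0.02, n)           # empirically derived 'constant',
                                                # carrying its own uncertainty

        # Functional relationship defining the measurand.
        result = k * analyte

        # The spread of the simulated outputs is the uncertainty estimate.
        print(result.mean())                        # ~6.15
        print(result.std(ddof=1))                   # combined standard uncertainty
        print(np.percentile(result, [2.5, 97.5]))   # 95% coverage interval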

  15. Uncertainty in measurement: a review of monte carlo simulation using microsoft excel for the calculation of uncertainties through functional relationships, including uncertainties in empirically derived constants.

    Science.gov (United States)

    Farrance, Ian; Frenkel, Robert

    2014-02-01

    The Guide to the Expression of Uncertainty in Measurement (usually referred to as the GUM) provides the basic framework for evaluating uncertainty in measurement. The GUM however does not always provide clearly identifiable procedures suitable for medical laboratory applications, particularly when internal quality control (IQC) is used to derive most of the uncertainty estimates. The GUM modelling approach requires advanced mathematical skills for many of its procedures, but Monte Carlo simulation (MCS) can be used as an alternative for many medical laboratory applications. In particular, calculations for determining how uncertainties in the input quantities to a functional relationship propagate through to the output can be accomplished using a readily available spreadsheet such as Microsoft Excel. The MCS procedure uses algorithmically generated pseudo-random numbers which are then forced to follow a prescribed probability distribution. When IQC data provide the uncertainty estimates the normal (Gaussian) distribution is generally considered appropriate, but MCS is by no means restricted to this particular case. With input variations simulated by random numbers, the functional relationship then provides the corresponding variations in the output in a manner which also provides its probability distribution. The MCS procedure thus provides output uncertainty estimates without the need for the differential equations associated with GUM modelling. The aim of this article is to demonstrate the ease with which Microsoft Excel (or a similar spreadsheet) can be used to provide an uncertainty estimate for measurands derived through a functional relationship. In addition, we also consider the relatively common situation where an empirically derived formula includes one or more 'constants', each of which has an empirically derived numerical value. Such empirically derived 'constants' must also have associated uncertainties which propagate through the functional relationship

  16. The uncertainty principle

    International Nuclear Information System (INIS)

    Martens, Hans.

    1991-01-01

    The subject of this thesis is the uncertainty principle (UP). The UP is one of the most characteristic points of difference between quantum and classical mechanics. The starting point of this thesis is the work of Niels Bohr, which is both discussed and analyzed. For the discussion of the different aspects of the UP, the formalism of Davies and Ludwig is used instead of the more commonly used formalism of von Neumann and Dirac. (author). 214 refs.; 23 figs.

  17. Uncertainties of the Yn Parameters of the Hage-Cifarelli Formalism

    Energy Technology Data Exchange (ETDEWEB)

    Smith-Nelson, Mark A. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Burr, Thomas Lee [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Hutchinson, Jesson D. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Cutler, Theresa Elizabeth [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-08-14

    One method for determining the physical parameters of a multiplying system is summarized by Cifarelli [1]. In this methodology the singles, doubles, and triples rates are determined from what are commonly referred to as Feynman histograms. This paper examines two methods for estimating the uncertainty in the parameters used in inferring these rates. The methods are compared on simulated data in order to determine which one best approximates the sample uncertainty.
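
    For background (this is the standard Feynman variance-to-mean statistic, not this paper's specific Y_n estimators): with counts n recorded in fixed time gates, the excess variance Y = Var(n)/mean(n) - 1 vanishes for a Poisson source and grows with correlated fission-chain counts.

        import numpy as np

        def feynman_y(gate_counts):
            """Feynman-Y excess variance: Var(n)/mean(n) - 1; zero for a
            Poisson (uncorrelated) source, positive for multiplying systems."""
            n = np.asarray(gate_counts, dtype=float)
            return n.var(ddof=1) / n.mean() - 1.0

        counts = np.random.default_rng(3).poisson(4.0, 10_000)
        print(feynman_y(counts))   # close to 0 for uncorrelated counts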

  18. Representing uncertainty in objective functions: extension to include the influence of serial correlation

    Science.gov (United States)

    Croke, B. F.

    2008-12-01

    The role of performance indicators is to give an accurate indication of the fit between a model and the system being modelled. As all measurements have an associated uncertainty (determining the significance that should be given to the measurement), performance indicators should take into account uncertainties in the observed quantities being modelled as well as in the model predictions (due to uncertainties in inputs, model parameters and model structure). In the presence of significant uncertainty in the observed and modelled output of a system, failure to adequately account for variations in the uncertainties means that the objective function only gives a measure of how well the model fits the observations, not how well the model fits the system being modelled. Since in most cases the interest lies in fitting the system response, it is vital that the objective function(s) be designed to account for these uncertainties. Most objective functions (e.g. those based on the sum of squared residuals) assume homoscedastic uncertainties. If the model contribution to the variations in residuals can be ignored, then transformations (e.g. Box-Cox) can be used to remove, or at least significantly reduce, heteroscedasticity. An alternative which is more generally applicable is to explicitly represent the uncertainties in the observed and modelled values in the objective function. Previous work on this topic addressed the modifications to standard objective functions (Nash-Sutcliffe efficiency, RMSE, chi-squared, coefficient of determination) using the optimal weighted averaging approach. This paper extends that previous work, addressing the issue of serial correlation. A form for an objective function that includes serial correlation is presented, and the impact on model fit discussed.
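
    One concrete form of the uncertainty-aware objective described above, before any serial-correlation term is added (our sketch of the weighting idea; names and numbers invented):

        import numpy as np

        def weighted_sse(obs, mod, var_obs, var_mod):
            """Sum of squared residuals, each weighted by the combined
            (observation + model) variance, so uncertain points count less."""
            obs, mod = np.asarray(obs), np.asarray(mod)
            w = 1.0 / (np.asarray(var_obs) + np.asarray(var_mod))
            return np.sum(w * (obs - mod) ** 2)

        obs = [1.0, 2.0, 4.0]
        mod = [1.1, 1.8, 3.0]
        # The poorly observed third point (large variance) barely affects the fit.
        print(weighted_sse(obs, mod, var_obs=[0.01, 0.04, 1.00],
                           var_mod=[0.01, 0.01, 0.25]))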

  19. Uncertainty enabled Sensor Observation Services

    Science.gov (United States)

    Cornford, Dan; Williams, Matthew; Bastin, Lucy

    2010-05-01

    Almost all observations of reality are contaminated with errors, which introduce uncertainties into the actual observation result. Such uncertainty is often held to be a data quality issue, and quantification of this uncertainty is essential for the principled exploitation of the observations. Many existing systems treat data quality in a relatively ad hoc manner; however, if the observation uncertainty is a reliable estimate of the error on the observation with respect to reality, then knowledge of this uncertainty enables optimal exploitation of the observations in further processes or decision making. We would argue that the most natural formalism for expressing uncertainty is Bayesian probability theory. In this work we show how the Open Geospatial Consortium Sensor Observation Service can be implemented to enable the support of explicit uncertainty about observations. We show how the UncertML candidate standard is used to provide a rich and flexible representation of uncertainty in this context. We illustrate this on a data set of user-contributed weather data, where the INTAMAP interpolation Web Processing Service is used to help estimate the uncertainty on the observations of unknown quality, using observations with known uncertainty properties. We then go on to discuss the implications of uncertainty for a range of existing Open Geospatial Consortium standards, including SWE Common and Observations and Measurements. We discuss the difficult decisions in the design of the UncertML schema and its relation to, and usage within, existing standards, and show various options. We conclude with some indications of the likely future directions for UncertML in the context of Open Geospatial Consortium services.

  20. Some illustrative examples of model uncertainty

    International Nuclear Information System (INIS)

    Bier, V.M.

    1994-01-01

    In this paper, we first discuss the view of model uncertainty proposed by Apostolakis. We then present several illustrative examples related to model uncertainty, some of which are not well handled by this formalism. Thus, Apostolakis' approach seems to be well suited to describing some types of model uncertainty, but not all. Since a comprehensive approach for characterizing and quantifying model uncertainty is not yet available, it is hoped that the examples presented here will serve as a springboard for further discussion.

  1. Probabilistic accident consequence uncertainty analysis -- Early health effects uncertainty assessment. Volume 2: Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Haskin, F.E. [Univ. of New Mexico, Albuquerque, NM (United States); Harper, F.T. [Sandia National Labs., Albuquerque, NM (United States); Goossens, L.H.J.; Kraan, B.C.P. [Delft Univ. of Technology (Netherlands)

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA early health effects models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on early health effects, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  2. Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for internal dosimetry. Volume 2: Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M. [Delft Univ. of Technology (Netherlands); Harrison, J.D. [National Radiological Protection Board (United Kingdom); Harper, F.T. [Sandia National Labs., Albuquerque, NM (United States); Hora, S.C. [Univ. of Hawaii, Hilo, HI (United States)

    1998-04-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA internal dosimetry models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on internal dosimetry, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  3. Heuristics structure and pervade formal risk assessment.

    Science.gov (United States)

    MacGillivray, Brian H

    2014-04-01

    Lay perceptions of risk appear rooted more in heuristics than in reason. A major concern of the risk regulation literature is that such "error-strewn" perceptions may be replicated in policy, as governments respond to the (mis)fears of the citizenry. This has led many to advocate a relatively technocratic approach to regulating risk, characterized by high reliance on formal risk and cost-benefit analysis. However, through two studies of chemicals regulation, we show that the formal assessment of risk is pervaded by its own set of heuristics. These include rules to categorize potential threats, define what constitutes valid data, guide causal inference, and select and apply formal models. Some of these heuristics lay claim to theoretical or empirical justifications, others are more back-of-the-envelope calculations, while still others purport not to reflect some truth but simply to constrain discretion or perform a desk-clearing function. These heuristics can be understood as a way of authenticating or formalizing risk assessment as a scientific practice, representing a series of rules for bounding problems, collecting data, and interpreting evidence (a methodology). Heuristics are indispensable elements of induction. And so they are not problematic per se, but they can become so when treated as laws rather than as contingent and provisional rules. Pitfalls include the potential for systematic error, masking uncertainties, strategic manipulation, and entrenchment. Our central claim is that by studying the rules of risk assessment qua rules, we develop a novel representation of the methods, conventions, and biases of the prior art. © 2013 Society for Risk Analysis.

  4. Uncertainty and the de Finetti tables

    OpenAIRE

    Baratgin, Jean; Over, David; Politzer, Guy

    2013-01-01

    The new paradigm in the psychology of reasoning adopts a Bayesian, or probabilistic, model for studying human reasoning. Contrary to the traditional binary approach based on truth-functional logic, with its binary values of truth and falsity, a third value that represents uncertainty can be introduced in the new paradigm. A variety of three-valued truth table systems are available in the formal literature, including one proposed by de Finetti. We examine the descripti...
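
    A minimal sketch of the three-valued de Finetti table for the conditional "if A then B": true when both hold, false when A holds and B fails, and a third "void" value, representing uncertainty, whenever the antecedent is false:

    ```python
    # Three-valued de Finetti table for the conditional "if A then B".
    from enum import Enum

    class V(Enum):
        TRUE = 1
        FALSE = 0
        VOID = None  # the third value, representing uncertainty

    def de_finetti_conditional(a: bool, b: bool) -> V:
        if not a:
            return V.VOID              # antecedent false: conditional is void
        return V.TRUE if b else V.FALSE

    for a in (True, False):
        for b in (True, False):
            print(a, b, de_finetti_conditional(a, b))
    ```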

  5. Improved best estimate plus uncertainty methodology including advanced validation concepts to license evolving nuclear reactors

    International Nuclear Information System (INIS)

    Unal, Cetin; Williams, Brian; McClure, Patrick; Nelson, Ralph A.

    2010-01-01

    Many evolving nuclear energy programs plan to use advanced predictive multi-scale, multi-physics simulation and modeling capabilities to reduce cost and time from design through licensing. Historically, experiments were the primary tool for design and understanding of nuclear system behavior, while modeling and simulation played the subordinate role of supporting experiments. In the new era of multi-scale, multi-physics computation-based technology development, experiments will still be needed, but they will be performed at different scales to calibrate and validate models leading to predictive simulations. Cost-saving goals of programs will require us to minimize the required number of validation experiments. Utilization of more multi-scale, multi-physics models introduces complexities in the validation of predictive tools. Traditional methodologies will have to be modified to address these arising issues. This paper lays out the basic aspects of a methodology that can potentially be used to address these new challenges in the design and licensing of evolving nuclear technology programs. The main components of the proposed methodology are verification, validation, calibration, and uncertainty quantification. An enhanced calibration concept is introduced and is accomplished through data assimilation. The goal is to enable best-estimate prediction of system behaviors in both normal and safety-related environments. Achieving this goal requires the additional steps of estimating the domain of validation and quantifying the uncertainties that allow for extension of results to areas of the validation domain that are not directly tested with experiments, which might include extension of the modeling and simulation (M and S) capabilities for application to full-scale systems. The new methodology suggests a formalism to quantify an adequate level of validation (predictive maturity) with respect to required selective data so that required testing can be minimized for cost.

  6. Improved best estimate plus uncertainty methodology including advanced validation concepts to license evolving nuclear reactors

    Energy Technology Data Exchange (ETDEWEB)

    Unal, Cetin [Los Alamos National Laboratory]; Williams, Brian [Los Alamos National Laboratory]; McClure, Patrick [Los Alamos National Laboratory]; Nelson, Ralph A. [Idaho National Laboratory]

    2010-01-01

    Many evolving nuclear energy programs plan to use advanced predictive multi-scale, multi-physics simulation and modeling capabilities to reduce cost and time from design through licensing. Historically, experiments were the primary tool for design and understanding of nuclear system behavior, while modeling and simulation played the subordinate role of supporting experiments. In the new era of multi-scale, multi-physics computation-based technology development, experiments will still be needed, but they will be performed at different scales to calibrate and validate models leading to predictive simulations. Cost-saving goals of programs will require us to minimize the required number of validation experiments. Utilization of more multi-scale, multi-physics models introduces complexities in the validation of predictive tools. Traditional methodologies will have to be modified to address these arising issues. This paper lays out the basic aspects of a methodology that can potentially be used to address these new challenges in the design and licensing of evolving nuclear technology programs. The main components of the proposed methodology are verification, validation, calibration, and uncertainty quantification. An enhanced calibration concept is introduced and is accomplished through data assimilation. The goal is to enable best-estimate prediction of system behaviors in both normal and safety-related environments. Achieving this goal requires the additional steps of estimating the domain of validation and quantifying the uncertainties that allow for extension of results to areas of the validation domain that are not directly tested with experiments, which might include extension of the modeling and simulation (M&S) capabilities for application to full-scale systems. The new methodology suggests a formalism to quantify an adequate level of validation (predictive maturity) with respect to required selective data so that required testing can be minimized for cost.

  7. Uncertainty and its propagation in dynamics models

    International Nuclear Information System (INIS)

    Devooght, J.

    1994-01-01

    The purpose of this paper is to bring together some characteristics of uncertainty in dynamic models, and therefore of the propagation of uncertainty. The respective roles of uncertainty and inaccuracy are examined. A mathematical formalism based on the Chapman-Kolmogorov equation allows one to define a "subdynamics" in which the evolution equation takes the uncertainty into account. The problem of choosing or combining models is examined through a loss function associated with a decision.
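
    A minimal sketch of the idea, under the assumption of a discrete-state model: the Chapman-Kolmogorov relation propagates a state distribution through a transition kernel, so an initially certain state acquires a spread that reflects the propagated uncertainty. The kernel values are hypothetical:

    ```python
    # Uncertainty propagation via the discrete Chapman-Kolmogorov step
    # p_{t+1}(j) = sum_i p_t(i) K(i, j), with a row-stochastic kernel K.
    import numpy as np

    K = np.array([[0.90, 0.08, 0.02],
                  [0.10, 0.80, 0.10],
                  [0.00, 0.15, 0.85]])

    p = np.array([1.0, 0.0, 0.0])   # initially certain state
    for t in range(10):
        p = p @ K                   # one Chapman-Kolmogorov step
    print(p)                        # the spread of p is the propagated uncertainty
    ```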

  8. Quantifying measurement uncertainty and spatial variability in the context of model evaluation

    Science.gov (United States)

    Choukulkar, A.; Brewer, A.; Pichugina, Y. L.; Bonin, T.; Banta, R. M.; Sandberg, S.; Weickmann, A. M.; Djalalova, I.; McCaffrey, K.; Bianco, L.; Wilczak, J. M.; Newman, J. F.; Draxl, C.; Lundquist, J. K.; Wharton, S.; Olson, J.; Kenyon, J.; Marquis, M.

    2017-12-01

    In an effort to improve wind forecasts for the wind energy sector, the Department of Energy and the NOAA funded the second Wind Forecast Improvement Project (WFIP2). As part of the WFIP2 field campaign, a large suite of in-situ and remote sensing instrumentation was deployed to the Columbia River Gorge in Oregon and Washington from October 2015 to March 2017. The array of instrumentation deployed included 915-MHz wind profiling radars, sodars, wind-profiling lidars, and scanning lidars. The role of these instruments was to provide wind measurements at high spatial and temporal resolution for model evaluation and improvement of model physics. To properly determine model errors, the uncertainties in instrument-model comparisons need to be quantified accurately. These uncertainties arise from several factors such as measurement uncertainty, spatial variability, and interpolation of model output to instrument locations, to name a few. In this presentation, we will introduce a formalism to quantify measurement uncertainty and spatial variability. The accuracy of this formalism will be tested using existing datasets such as the eXperimental Planetary boundary layer Instrumentation Assessment (XPIA) campaign. Finally, the uncertainties in wind measurement and the spatial variability estimates from the WFIP2 field campaign will be discussed to understand the challenges involved in model evaluation.
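
    A minimal sketch of one way such a comparison uncertainty could be budgeted (an illustrative assumption, not the presenters' formalism): the measurement uncertainty and the spatial variability of the field inside the model grid cell are combined in quadrature:

    ```python
    # Uncertainty budget for one instrument-model wind speed comparison:
    # total variance = measurement variance + spatial (representativeness) variance.
    import numpy as np

    def comparison_sigma(sigma_meas, samples_in_cell):
        """Total 1-sigma uncertainty for comparing a point measurement to a grid value."""
        sigma_spatial = np.std(samples_in_cell, ddof=1)   # spatial variability term
        return np.hypot(sigma_meas, sigma_spatial)        # root-sum-square

    winds = np.array([7.9, 8.4, 8.1, 7.6, 8.8])  # hypothetical lidar samples in one cell
    print(comparison_sigma(0.2, winds))          # 0.2 m/s assumed measurement uncertainty
    ```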

  9. Simulation codes and the impact of validation/uncertainty requirements

    International Nuclear Information System (INIS)

    Sills, H.E.

    1995-01-01

    Several of the OECD/CSNI members have adopted a proposed methodology for code validation and uncertainty assessment. Although the validation process adopted by members has a high degree of commonality, the uncertainty assessment processes selected are more variable, ranging from subjective to formal. This paper describes the validation and uncertainty assessment process, the sources of uncertainty, methods of reducing uncertainty, and methods of assessing uncertainty. Examples are presented from the Ontario Hydro application of the validation methodology and uncertainty assessment to the system thermal hydraulics discipline and the TUF (1) system thermal hydraulics code. (author)

  10. Pricing of medical devices under coverage uncertainty--a modelling approach.

    Science.gov (United States)

    Girling, Alan J; Lilford, Richard J; Young, Terry P

    2012-12-01

    Product vendors and manufacturers are increasingly aware that purchasers of health care will fund new clinical treatments only if they are perceived to deliver value-for-money. This influences companies' internal commercial decisions, including the price they set for their products. Other things being equal, there is a price threshold, which is the maximum price at which the device will be funded and which, if its value were known, would play a central role in price determination. This paper examines the problem of pricing a medical device from the vendor's point of view in the presence of uncertainty about what the price threshold will be. A formal solution is obtained by maximising the expected value of the net revenue function, assuming a Bayesian prior distribution for the price threshold. A least admissible price is identified. The model can also be used as a tool for analysing proposed pricing policies when no formal prior specification of uncertainty is available. Copyright © 2011 John Wiley & Sons, Ltd.
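
    A minimal sketch of the pricing logic described above, assuming a lognormal Bayesian prior for the unknown price threshold T: the device is funded only when the price p does not exceed T, so expected net revenue is proportional to (p - cost) * P(T >= p). The prior parameters and cost are hypothetical, not the paper's model:

    ```python
    # Expected-net-revenue maximisation under an uncertain price threshold.
    import numpy as np
    from scipy.stats import lognorm

    prior = lognorm(s=0.3, scale=1000.0)   # prior belief about the threshold T
    unit_cost = 400.0

    prices = np.linspace(400.0, 2000.0, 1601)
    expected = (prices - unit_cost) * prior.sf(prices)   # sf(p) = P(T >= p)
    print("price maximising expected net revenue ~", prices[np.argmax(expected)])
    ```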

  11. Formalism for neutron cross section covariances in the resonance region using kernel approximation

    Energy Technology Data Exchange (ETDEWEB)

    Oblozinsky, P.; Cho, Y-S.; Matoon, C.M.; Mughabghab, S.F.

    2010-04-09

    We describe an analytical formalism for estimating neutron radiative capture and elastic scattering cross section covariances in the resolved resonance region. We use capture and scattering kernels as the starting point and show how to get average cross sections in broader energy bins, derive analytical expressions for cross section sensitivities, and deduce cross section covariances from the resonance parameter uncertainties in the recently published Atlas of Neutron Resonances. The formalism elucidates the role of resonance parameter correlations, which become important if several strong resonances are located in one energy group. The importance of the potential scattering uncertainty, as well as the correlation between potential and resonance scattering, is also examined. Practical application of the formalism is illustrated on 55Mn(n,γ) and 55Mn(n,el).
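
    A generic minimal sketch of the sensitivity-based propagation such a formalism rests on (the toy model below stands in for the real kernel expressions): bin-averaged cross sections are functions of resonance parameters, so their covariance follows from the parameter covariance C via the sandwich rule cov = J C J^T:

    ```python
    # Sandwich-rule covariance propagation with a numerical Jacobian.
    import numpy as np

    def group_xs(params):
        """Toy bin-averaged cross sections as a function of resonance parameters."""
        a1, a2 = params
        return np.array([0.8 * a1, 0.5 * a1 + 1.2 * a2])

    def jacobian(f, x, eps=1e-6):
        """Forward-difference sensitivities df_i/dx_j."""
        f0 = f(x)
        J = np.empty((f0.size, x.size))
        for j in range(x.size):
            dx = np.zeros_like(x)
            dx[j] = eps
            J[:, j] = (f(x + dx) - f0) / eps
        return J

    params = np.array([2.0, 3.0])
    C = np.array([[0.04, 0.01],    # off-diagonal terms: correlated resonance
                  [0.01, 0.09]])   # parameters in one energy group, as noted above
    J = jacobian(group_xs, params)
    cov_xs = J @ C @ J.T
    print(np.sqrt(np.diag(cov_xs)))  # cross-section uncertainty per bin
    ```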

  12. Formal Epistemology and New Paradigm Psychology of Reasoning

    NARCIS (Netherlands)

    Pfeifer, Niki; Douven, Igor

    This position paper advocates combining formal epistemology and the new paradigm psychology of reasoning in studies of conditionals and reasoning with uncertainty. The new paradigm psychology of reasoning is characterized by the use of probability theory as a rationality framework instead of classical logic.

  13. Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for deposited material and external doses. Volume 2: Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M. [Delft Univ. of Technology (Netherlands); Boardman, J. [AEA Technology (United Kingdom); Jones, J.A. [National Radiological Protection Board (United Kingdom); Harper, F.T.; Young, M.L. [Sandia National Labs., Albuquerque, NM (United States); Hora, S.C. [Univ. of Hawaii, Hilo, HI (United States)

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA deposited material and external dose models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on deposited material and external doses, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  14. Propagation of cross section uncertainties in combined Monte Carlo neutronics and burnup calculations

    Energy Technology Data Exchange (ETDEWEB)

    Kuijper, J.C.; Oppe, J.; Klein Meulekamp, R.; Koning, H. [NRG - Fuels, Actinides and Isotopes group, Petten (Netherlands)

    2005-07-01

    Some years ago a methodology was developed at NRG for the calculation of 'density-to-density' and 'one-group cross section-to-density' sensitivity matrices and covariance matrices for final nuclide densities, for burnup schemes consisting of multiple sets of flux/spectrum and burnup calculations. The applicability of the methodology was then demonstrated by calculations of BR3 MOX pin irradiation experiments employing multi-group cross section uncertainty data from the EAF4 data library. A recent development is the extension of this methodology to enable its application in combination with the OCTOPUS-MCNP-FISPACT/ORIGEN Monte Carlo burnup scheme. This required some extensions to the sensitivity matrix calculation tool CASEMATE. The extended methodology was applied to the 'HTR Plutonium Cell Burnup Benchmark' to calculate the uncertainties (covariances) in the final densities, as far as these uncertainties are caused by uncertainties in cross sections. Up to 600 MWd/kg these uncertainties are larger than the differences between the code systems. However, it should be kept in mind that the calculated uncertainties are based on EAF4 uncertainty data. It is not exactly clear beforehand what a proper set of associated (MCNP) cross sections and covariances would yield in terms of final uncertainties in calculated densities. This will be investigated, by the same formalism, once these data become available. It should be noted that the studies performed up to the present date are mainly concerned with the influence of uncertainties in cross sections. The influence of uncertainties in the decay constants, although included in the formalism, is not considered further. Also the influence of other uncertainties (such as geometrical modelling approximations) has been left out of consideration for the time being. (authors)

  15. Propagation of cross section uncertainties in combined Monte Carlo neutronics and burnup calculations

    International Nuclear Information System (INIS)

    Kuijper, J.C.; Oppe, J.; Klein Meulekamp, R.; Koning, H.

    2005-01-01

    Some years ago a methodology was developed at NRG for the calculation of 'density-to-density' and 'one-group cross section-to-density' sensitivity matrices and covariance matrices for final nuclide densities, for burnup schemes consisting of multiple sets of flux/spectrum and burnup calculations. The applicability of the methodology was then demonstrated by calculations of BR3 MOX pin irradiation experiments employing multi-group cross section uncertainty data from the EAF4 data library. A recent development is the extension of this methodology to enable its application in combination with the OCTOPUS-MCNP-FISPACT/ORIGEN Monte Carlo burnup scheme. This required some extensions to the sensitivity matrix calculation tool CASEMATE. The extended methodology was applied to the 'HTR Plutonium Cell Burnup Benchmark' to calculate the uncertainties (covariances) in the final densities, as far as these uncertainties are caused by uncertainties in cross sections. Up to 600 MWd/kg these uncertainties are larger than the differences between the code systems. However, it should be kept in mind that the calculated uncertainties are based on EAF4 uncertainty data. It is not exactly clear beforehand what a proper set of associated (MCNP) cross sections and covariances would yield in terms of final uncertainties in calculated densities. This will be investigated, by the same formalism, once these data become available. It should be noted that the studies performed up to the present date are mainly concerned with the influence of uncertainties in cross sections. The influence of uncertainties in the decay constants, although included in the formalism, is not considered further. Also the influence of other uncertainties (such as geometrical modelling approximations) has been left out of consideration for the time being. (authors)

  16. Uncertainty and inference in the world of paleoecological data

    Science.gov (United States)

    McLachlan, J. S.; Dawson, A.; Dietze, M.; Finley, M.; Hooten, M.; Itter, M.; Jackson, S. T.; Marlon, J. R.; Raiho, A.; Tipton, J.; Williams, J.

    2017-12-01

    Proxy data in paleoecology and paleoclimatology share a common set of biases and uncertainties: spatiotemporal error associated with the taphonomic processes of deposition, preservation, and dating; calibration error between proxy data and the ecosystem states of interest; and error in the interpolation of calibrated estimates across space and time. Researchers often account for this daunting suite of challenges by applying qualitative expert judgment: inferring the past states of ecosystems and assessing the level of uncertainty in those states subjectively. The effectiveness of this approach can be seen in the extent to which future observations confirm previous assertions. Hierarchical Bayesian (HB) statistical approaches allow an alternative approach to accounting for multiple uncertainties in paleo data. HB estimates of ecosystem state formally account for each of the common uncertainties listed above. HB approaches can readily incorporate additional data, and data of different types, into estimates of ecosystem state. And HB estimates of ecosystem state, with associated uncertainty, can be used to constrain forecasts of ecosystem dynamics based on mechanistic ecosystem models using data assimilation. Decisions about how to structure an HB model are also subjective, which creates a parallel framework for deciding how to interpret data from the deep past. Our group, the Paleoecological Observatory Network (PalEON), has applied hierarchical Bayesian statistics to formally account for uncertainties in proxy-based estimates of past climate, fire, primary productivity, biomass, and vegetation composition. Our estimates often reveal new patterns of past ecosystem change, which is an unambiguously good thing, but we also often estimate a level of uncertainty that is uncomfortably high for many researchers. High levels of uncertainty are due to several features of the HB approach: spatiotemporal smoothing, the formal aggregation of multiple types of uncertainty, and a

  17. Associating uncertainty with datasets using Linked Data and allowing propagation via provenance chains

    Science.gov (United States)

    Car, Nicholas; Cox, Simon; Fitch, Peter

    2015-04-01

    (PROV-O Entity and Activity classes) have UncertML elements recorded. This methodology is intentionally flexible to allow uncertainty metadata in many forms, not limited to UncertML. While the more formal representation of uncertainty metadata is desirable (using UncertProv elements to implement the UncertML conceptual model), this will not always be possible, and any uncertainty data stored will be better than none. Since the UncertProv ontology contains a superset of UncertML elements to facilitate the representation of non-UncertML uncertainty data, it could easily be extended to include other formal uncertainty conceptual models, thus allowing non-UncertML propagation calculations.

  18. Masses of Formal Philosophy

    DEFF Research Database (Denmark)

    Masses of Formal Philosophy is an outgrowth of Formal Philosophy. That book gathered the responses of some of the most prominent formal philosophers to five relatively open and broad questions initiating a discussion of metaphilosophical themes and problems surrounding the use of formal methods in philosophy. Including contributions from a wide range of philosophers, Masses of Formal Philosophy contains important new responses to the original five questions.

  19. Improving weather predictability by including land-surface model parameter uncertainty

    Science.gov (United States)

    Orth, Rene; Dutra, Emanuel; Pappenberger, Florian

    2016-04-01

    The land surface forms an important component of Earth system models and interacts nonlinearly with other parts such as the ocean and atmosphere. To capture the complex and heterogeneous hydrology of the land surface, land surface models include a large number of parameters impacting the coupling to other components of the Earth system model. Focusing on ECMWF's land-surface model HTESSEL, we present in this study a comprehensive parameter sensitivity evaluation using multiple observational datasets in Europe. We select six poorly constrained effective parameters (surface runoff effective depth, skin conductivity, minimum stomatal resistance, maximum interception, soil moisture stress function shape, total soil depth) and explore the sensitivity of model outputs such as soil moisture, evapotranspiration and runoff to these parameters, using uncoupled simulations and coupled seasonal forecasts. Additionally, we investigate the possibility of constructing ensembles from perturbations of the land-surface parameters. In the uncoupled runs we find that minimum stomatal resistance and total soil depth have the most influence on model performance. Forecast skill scores are moreover sensitive to the same parameters as HTESSEL performance in the uncoupled analysis. We demonstrate the robustness of our findings by comparing multiple best-performing parameter sets and multiple randomly chosen parameter sets. We find better temperature and precipitation forecast skill with the best-performing parameter perturbations, demonstrating representativeness of model performance across uncoupled (and hence less computationally demanding) and coupled settings. Finally, we construct ensemble forecasts from ensemble members derived with different best-performing parameterizations of HTESSEL. This incorporation of parameter uncertainty in the ensemble generation yields an increase in forecast skill, even beyond the skill of the default system. Orth, R., E. Dutra, and F. Pappenberger, 2016: Improving weather predictability by including land-surface model parameter uncertainty.

  20. A first formal approach to animal spirits beyond uncertainty

    Directory of Open Access Journals (Sweden)

    Gerasimos T. Soldatos

    2015-12-01

    Standard Macroeconomics treats animal spirits as a source of uncertainty disturbing otherwise rational expectations. But Keynesian animal spirits ensue from suboptimal emotional responses to socioeconomic status change beyond matters of uncertainty. This paper identifies such spirits with the disturbance from the optimal decision-making implied by an emotional well-being utility function. The introduction of a policy-maker, holding its own view of private welfare in a society of emotional individuals, generates by itself, i.e. in the absence of animal spirits, uniform business fluctuations. This is the result of the income redistribution needed to reconcile the policy-maker's view of private welfare with the emotional individual's. Consequently, if animal-spirits-induced fluctuations are already present when a policy-maker is introduced into the economy, the aim of policy intervention should be the design of an income redistribution that would not aggravate the business cycle but would end up in only uniform cycles, with the aid perhaps of discretionary interest rate policy. Nevertheless, if animal spirits do not exist when the policy-maker enters the system, the income-redistribution-induced cycles may incite such spirits by themselves, in which case the cycles will not be of the uniform type. All comes down to "income and emotion", to an ageless and ecumenical fact of life, complicated purposefully or not by authority.

  1. Formal, Non-Formal and Informal Learning in the Sciences

    Science.gov (United States)

    Ainsworth, Heather L.; Eaton, Sarah Elaine

    2010-01-01

    This research report investigates the links between formal, non-formal and informal learning and the differences between them. In particular, the report aims to link these notions of learning to the field of sciences and engineering in Canada and the United States, including professional development of adults working in these fields. It offers…

  2. Advancing Uncertainty: Untangling and Discerning Related Concepts

    OpenAIRE

    Janice Penrod

    2002-01-01

    Methods of advancing concepts within the qualitative paradigm have been developed and articulated. In this section, I describe methodological perspectives of a project designed to advance the concept of uncertainty using multiple qualitative methods. Through a series of earlier studies, the concept of uncertainty arose repeatedly in varied contexts, working its way into prominence, and warranting further investigation. Processes of advanced concept analysis were used to initiate the formal in...

  3. Statistical methodology for discrete fracture model - including fracture size, orientation uncertainty together with intensity uncertainty and variability

    Energy Technology Data Exchange (ETDEWEB)

    Darcel, C. (Itasca Consultants SAS (France)); Davy, P.; Le Goc, R.; Dreuzy, J.R. de; Bour, O. (Geosciences Rennes, UMR 6118 CNRS, Univ. de Rennes, Rennes (France))

    2009-11-15

    the lineament scale (k_t = 2) on the other, addresses the issue of the nature of the transition. We develop a new 'mechanistic' model that could help in modeling why and where this transition can occur. The transition between both regimes would occur for a fracture length of 1-10 m and even at a smaller scale for the few outcrops that follow the self-similar density model. A consequence for the disposal issue is that the model that is likely to apply in the 'blind' scale window between 10-100 m is the self-similar model as it is defined for large-scale lineaments. The self-similar model, as it is measured for some outcrops and most lineament maps, is definitely worth being investigated as a reference for scales above 1-10 m. In the rest of the report, we develop a methodology for incorporating uncertainty and variability into the DFN modeling. Fracturing properties arise from complex processes which produce an intrinsic variability; characterizing this variability as an admissible variation of model parameters or as the division of the site into subdomains with distinct DFN models is a critical point of the modeling effort. Moreover, the DFN model encompasses a part of uncertainty, due to inherent data uncertainties and sampling limits. Both effects must be quantified and incorporated into the DFN site model definition process. In that context, all available borehole data including recording of fracture intercept positions, pole orientation and relative uncertainties are used as the basis for the methodological development and further site model assessment. An elementary dataset contains a set of discrete fracture intercepts from which a parent orientation/density distribution can be computed. The elementary bricks of the site, from which these initial parent density distributions are computed, rely on the former Single Hole Interpretation division of the boreholes into sections whose local boundaries are expected to reflect, locally, geology

  4. Addressing model uncertainty in dose-response: The case of chloroform

    International Nuclear Information System (INIS)

    Evans, J.S.

    1994-01-01

    This paper discusses the issues involved in addressing model uncertainty in the analysis of dose-response relationships. A method for addressing model uncertainty is described and applied to characterize the uncertainty in estimates of the carcinogenic potency of chloroform. The approach, which is rooted in Bayesian concepts of subjective probability, uses probability trees and formally elicited expert judgments to address model uncertainty. It is argued that a similar approach could be used to improve the characterization of model uncertainty in the dose-response relationships for health effects from ionizing radiation.
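
    A minimal sketch (not the paper's analysis) of the probability-tree idea: each branch carries a candidate dose-response model and a subjective probability, and the potency estimate is the resulting mixture distribution. The models, weights, and dose below are hypothetical:

    ```python
    # Model uncertainty via a probability tree: mix draws from alternative
    # dose-response models according to subjective branch probabilities.
    import numpy as np

    rng = np.random.default_rng(1)

    def linear_model(dose):          # branch 1: low-dose linear
        return rng.normal(2e-2, 5e-3) * dose

    def threshold_model(dose):       # branch 2: effect only above a threshold
        return rng.normal(4e-2, 1e-2) * max(dose - 0.5, 0.0)

    branches = [(0.7, linear_model), (0.3, threshold_model)]  # subjective weights

    dose = 1.0
    draws = np.array([model(dose)
                      for w, model in branches
                      for _ in range(int(w * 10000))])
    print(f"risk at dose 1.0: mean {draws.mean():.4f}, "
          f"95th percentile {np.percentile(draws, 95):.4f}")
    ```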

  5. International conference on Facets of Uncertainties and Applications

    CERN Document Server

    Skowron, Andrzej; Maiti, Manoranjan; Kar, Samarjit

    2015-01-01

    Since the emergence of the formal concept of probability theory in the seventeenth century, uncertainty has been perceived solely in terms of probability theory. However, this apparently unique link between uncertainty and probability theory came under investigation a few decades ago. Uncertainties are nowadays accepted to be of various kinds. Uncertainty in general could refer to different senses: not certainly known, questionable, problematic, vague, not definite or determined, ambiguous, liable to change, not reliable. In Indian languages, particularly in Sanskrit-based languages, there are other, higher levels of uncertainty. It has been shown that several mathematical concepts, such as the theory of fuzzy sets, the theory of rough sets, evidence theory, possibility theory, the theory of complex systems and complex networks, the theory of fuzzy measures, and uncertainty theory, can also successfully model uncertainty.

  6. Uncertainties, confidence ellipsoids and security polytopes in LSA

    Science.gov (United States)

    Grabe, Michael

    1992-05-01

    For a given error model, the uncertainties of, and the couplings between, parameters estimated by a least-squares adjustment (LSA) are formalized. The error model is restricted to normally distributed random errors and to systematic errors that remain constant during measurement but whose magnitudes and signs are unknown. An outline of the associated new formalism for estimating measurement uncertainties is sketched as regards its function as a measure of the consistency between theory and experiment. The couplings due to random errors lead to ellipsoids stemming from singular linear mappings of Hotelling's ellipsoids. Those introduced by systematic errors create convex polytopes, so-called security polytopes, which are singular linear mappings of hyperblocks caused by a “worst-case treatment” of systematic errors.
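
    A minimal sketch of the two propagations side by side for a linear least-squares fit x = B y with B = (A^T A)^{-1} A^T: random errors map the data covariance into a parameter covariance (the ellipsoid picture), while unknown-but-bounded systematic errors f with |f_j| <= fs_j map a hyperblock into worst-case half-widths sum_j |B_ij| fs_j (the polytope picture). The design matrix and error sizes are hypothetical:

    ```python
    import numpy as np

    A = np.vstack([np.ones(5), np.arange(5.0)]).T   # straight-line fit design matrix
    B = np.linalg.inv(A.T @ A) @ A.T                # least-squares mapping x = B y

    sigma = 0.1                                     # random error (1 sigma per point)
    cov_x = sigma**2 * B @ B.T                      # parameter covariance (ellipsoid)

    fs = np.full(5, 0.05)                           # bounds on constant systematic errors
    u_sys = np.abs(B) @ fs                          # worst-case polytope half-widths

    print("random:", np.sqrt(np.diag(cov_x)), "systematic:", u_sys)
    ```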

  7. Quantum Uncertainty and Decision-Making in Game Theory

    Science.gov (United States)

    Asano, M.; Ohya, M.; Tanaka, Y.; Khrennikov, A.; Basieva, I.

    2011-01-01

    Recently a few authors have pointed to the possibility of applying the mathematical formalism of quantum mechanics to cognitive psychology, in particular to games of the Prisoner's Dilemma (PD) type [6-18]. In this paper, we discuss the problem of rationality in game theory and point out that quantum uncertainty is similar to the uncertainty of knowledge that a player feels subjectively in his decision-making.

  8. Combining Formal, Non-Formal and Informal Learning for Workforce Skill Development

    Science.gov (United States)

    Misko, Josie

    2008-01-01

    This literature review, undertaken for Australian Industry Group, shows how multiple variations and combinations of formal, informal and non-formal learning, accompanied by various government incentives and organisational initiatives (including job redesign, cross-skilling, multi-skilling, diversified career pathways, action learning projects,…

  9. The effect of classroom instruction, attitudes towards science and motivation on students' views of uncertainty in science

    Science.gov (United States)

    Schroeder, Meadow

    This study examined developmental and gender differences in Grade 5 and 9 students' views of uncertainty in science, and the effect of classroom instruction on attitudes towards science and motivation. Study 1 examined views of uncertainty in science when students were taught science using constructivist pedagogy. A total of 33 Grade 5 (n = 17, 12 boys, 5 girls) and Grade 9 (n = 16, 8 boys, 8 girls) students were interviewed about the ideas they had about uncertainty in their own experiments (i.e., practical science) and in professional science activities (i.e., formal science). Analysis found an interaction between grade and gender in the number of categories of uncertainty identified for both practical and formal science. Additionally, in formal science, there was a developmental shift from dualism (i.e., science is a collection of basic facts that are the result of straightforward procedures) to multiplism (i.e., there is more than one answer or perspective on scientific knowledge) from Grade 5 to Grade 9. Finally, there was a positive correlation between understanding of uncertainty in practical and formal science. Study 2 compared the attitudes towards science and motivation of students in constructivist and traditional classrooms. Scores on these measures were also compared to students' views of uncertainty for constructivist-taught students. A total of 28 students in Grade 5 (n = 13, 11 boys, 2 girls) and Grade 9 (n = 15, 6 boys, 9 girls) from traditional science classrooms, together with the 33 constructivist students from Study 1, participated. Regardless of classroom instruction, fifth graders reported more positive attitudes towards science than ninth graders. Students from the constructivist classrooms reported more intrinsic motivation than students from the traditional classrooms. Constructivist students' views of uncertainty in formal and practical science did not correlate with their attitudes towards science and motivation.

  10. A Linguistic Truth-Valued Temporal Reasoning Formalism and Its Implementation

    Science.gov (United States)

    Lu, Zhirui; Liu, Jun; Augusto, Juan C.; Wang, Hui

    Temporality and uncertainty are important features of many real world systems. Solving problems in such systems requires the use of formal mechanisms such as logic systems, statistical methods or other reasoning and decision-making methods. In this paper, we propose a linguistic truth-valued temporal reasoning formalism to enable the management of both features concurrently, using a linguistic truth-valued logic and a temporal logic. We also provide a backward reasoning algorithm which allows the answering of user queries. A simple but realistic scenario in a smart home application is used to illustrate our work.

  11. The interplay between formal and informal contracting in integrated project delivery

    NARCIS (Netherlands)

    Bygballe, L.E.; Dewulf, Geert P.M.R.; Levitt, R.

    2015-01-01

    This research examines the interplay between formal and informal contracting in integrated project delivery (IPD). It investigates how the interplay enables parties in health-care construction projects to cope with uncertainty and complexity due to, among other things, changing demands. New delivery

  12. Cross section and method uncertainties: the application of sensitivity analysis to study their relationship in radiation transport benchmark problems

    International Nuclear Information System (INIS)

    Weisbin, C.R.; Oblow, E.M.; Ching, J.; White, J.E.; Wright, R.Q.; Drischler, J.

    1975-08-01

    Sensitivity analysis is applied to the study of an air transport benchmark calculation to quantify and distinguish between cross-section and method uncertainties. The boundary detector response was converged with respect to spatial and angular mesh size, P_l expansion of the scattering kernel, and the number and location of energy grid boundaries. The uncertainty in the detector response due to uncertainties in nuclear data is 17.0 percent (one standard deviation, not including uncertainties in energy and angular distribution), based upon the ENDF/B-IV "error files" including correlations in energy and reaction type. Differences of approximately 6 percent can be attributed exclusively to differences in processing multigroup transfer matrices. Formal documentation of the PUFF computer program for the generation of multigroup covariance matrices is presented. (47 figures, 14 tables) (U.S.)

  13. Modified Phenomena Identification and Ranking Table (PIRT) for Uncertainty Analysis

    International Nuclear Information System (INIS)

    Gol-Mohamad, Mohammad P.; Modarres, Mohammad; Mosleh, Ali

    2006-01-01

    This paper describes a methodology for characterizing important phenomena, which is also part of a broader research effort by the authors called 'Modified PIRT'. The methodology provides a robust process of phenomena identification and ranking for more precise quantification of uncertainty. It is a two-step identification and ranking process based on thermal-hydraulic (TH) importance as well as uncertainty importance. The Analytic Hierarchy Process (AHP) has been used as a formal approach for TH identification and ranking. A formal uncertainty importance technique is used to estimate the degree of credibility of the TH model(s) used to represent the important phenomena. This part uses subjective justification, evaluating available information and data from experiments and code predictions. The proposed methodology was demonstrated by developing a PIRT for a large-break loss-of-coolant accident (LBLOCA) for the LOFT integral facility with highest core power (test LB-1). (authors)
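
    A minimal sketch of the AHP step mentioned above: phenomena are compared pairwise on a 1-9 importance scale, and the ranking weights are the normalised principal eigenvector of the comparison matrix. The matrix entries below are hypothetical:

    ```python
    # AHP ranking weights from a pairwise comparison matrix, with the
    # standard consistency index as a sanity check.
    import numpy as np

    M = np.array([[1.0, 3.0, 5.0],      # pairwise importance of three phenomena
                  [1/3, 1.0, 2.0],
                  [1/5, 1/2, 1.0]])

    vals, vecs = np.linalg.eig(M)
    k = np.argmax(vals.real)                      # principal eigenvalue index
    w = np.abs(vecs[:, k].real)
    w /= w.sum()                                  # normalised ranking weights
    ci = (vals.real[k] - len(M)) / (len(M) - 1)   # consistency index
    print("weights:", w, "CI:", ci)
    ```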

  14. An Integrated Approach for Characterization of Uncertainty in Complex Best Estimate Safety Assessment

    International Nuclear Information System (INIS)

    Pourgol-Mohamad, Mohammad; Modarres, Mohammad; Mosleh, Ali

    2013-01-01

    This paper discusses an approach called the Integrated Methodology for Thermal-Hydraulics Uncertainty Analysis (IMTHUA) to characterize and integrate a wide range of uncertainties associated with the best-estimate models and complex system codes used for nuclear power plant safety analyses. Examples of applications include complex thermal-hydraulic and fire analysis codes. In identifying and assessing uncertainties, the proposed methodology treats the complex code as a 'white box', thus explicitly treating internal sub-model uncertainties in addition to the uncertainties related to the inputs to the code. The methodology accounts for uncertainties related to experimental data used to develop such sub-models, and efficiently propagates all uncertainties during best-estimate calculations. Uncertainties are formally analyzed and probabilistically treated using a Bayesian inference framework. This comprehensive approach presents the results in a form usable in most other safety analyses, such as probabilistic safety assessment. The code output results are further updated through additional Bayesian inference using any available experimental data, for example from thermal-hydraulic integral test facilities. The approach includes provisions to account for uncertainties associated with user-specified options, for example choices among alternative sub-models or among several different correlations. Complex time-dependent best-estimate calculations are computationally intense. The paper presents approaches to minimize computational intensity during the uncertainty propagation. Finally, the paper reports the effectiveness and practicality of the methodology with two applications: a complex thermal-hydraulics system code and a complex fire simulation code. In the case of multiple alternative models, several techniques, including dynamic model switching, user-controlled model selection, and model mixing, are discussed. (authors)

  15. 75 FR 16514 - Bayer Material Science, LLC, Formally Known as Sheffield Plastics, Including On-Site Leased...

    Science.gov (United States)

    2010-04-01

    ... DEPARTMENT OF LABOR Employment and Training Administration [TA-W-71,045] Bayer Material Science... January 8th, 2010, applicable to workers of Bayer Material Science, LLC, formally known as Sheffield... polycarbonate film products. Information shows that Bayer Material Science, LLC was formally known as Sheffield...

  16. Uncertainty in eddy covariance measurements and its application to physiological models

    Science.gov (United States)

    D.Y. Hollinger; A.D. Richardson

    2005-01-01

    Flux data are noisy, and this uncertainty is largely due to random measurement error. Knowledge of uncertainty is essential for the statistical evaluation of modeled and measured fluxes, for comparison of parameters derived by fitting models to measured fluxes and in formal data-assimilation efforts. We used the difference between simultaneous measurements from two...

  17. Statistical methodology for discrete fracture model - including fracture size, orientation uncertainty together with intensity uncertainty and variability

    International Nuclear Information System (INIS)

    Darcel, C.; Davy, P.; Le Goc, R.; Dreuzy, J.R. de; Bour, O.

    2009-11-01

    the other, addresses the issue of the nature of the transition. We develop a new 'mechanistic' model that could help in modeling why and where this transition can occur. The transition between both regimes would occur for a fracture length of 1-10 m and even at a smaller scale for the few outcrops that follow the self-similar density model. A consequence for the disposal issue is that the model that is likely to apply in the 'blind' scale window between 10-100 m is the self-similar model as it is defined for large-scale lineaments. The self-similar model, as it is measured for some outcrops and most lineament maps, is definitely worth being investigated as a reference for scales above 1-10 m. In the rest of the report, we develop a methodology for incorporating uncertainty and variability into the DFN modeling. Fracturing properties arise from complex processes which produce an intrinsic variability; characterizing this variability as an admissible variation of model parameters or as the division of the site into subdomains with distinct DFN models is a critical point of the modeling effort. Moreover, the DFN model encompasses a part of uncertainty, due to inherent data uncertainties and sampling limits. Both effects must be quantified and incorporated into the DFN site model definition process. In that context, all available borehole data including recording of fracture intercept positions, pole orientation and relative uncertainties are used as the basis for the methodological development and further site model assessment. An elementary dataset contains a set of discrete fracture intercepts from which a parent orientation/density distribution can be computed. The elementary bricks of the site, from which these initial parent density distributions are computed, rely on the former Single Hole Interpretation division of the boreholes into sections whose local boundaries are expected to reflect, locally, the main characteristics of the geology and fracturing properties. From that

  18. Lessons Learned - The Use of Formal Expert Elicitation in Probabilistic Seismic Hazard

    Energy Technology Data Exchange (ETDEWEB)

    K.J. Coppersmith; R.C. Perman; R.R. Youngs

    2006-05-10

    Probabilistic seismic hazard analyses provide the opportunity, indeed the requirement, to quantify the uncertainties in important inputs to the analysis. The locations of future earthquakes, their recurrence rates and maximum size, and the ground motions that will result at a site of interest are all quantities that require careful consideration because they are uncertain. The earliest PSHA models [Cornell, 1968] provided solely for the randomness or aleatory variability in these quantities. The most sophisticated seismic hazard models today, which include quantified uncertainties, are merely more realistic representations of this basic aleatory model. All attempts to quantify uncertainties require expert judgment. Further, all uncertainty models should endeavor to consider the range of views of the larger technical community at the time the hazard analysis is conducted. In some cases, especially for large projects under regulatory review, formal structured methods for eliciting expert judgments have been employed. Experience has shown that certain key elements are required for these assessments to be successful, including: (1) experts should be trained in probability theory, uncertainty quantification, and ways to avoid common cognitive biases; (2) comprehensive and user-friendly databases should be provided to the experts; (3) experts should be required to evaluate all potentially credible hypotheses; (4) workshops and other interactions among the experts and proponents of published viewpoints should be encouraged; (5) elicitations are best conducted in individual interview sessions; (6) feedback should be provided to the experts to give them insight into the significance of alternative assessments to the hazard results; and (7) complete documentation should include the technical basis for all assessments. Case histories are given from seismic hazard analyses in Europe, western North America, and the stable continental region of the United States.

  19. Uncertainty-driven nuclear data evaluation including thermal (n,α) applied to 59Ni

    Science.gov (United States)

    Helgesson, P.; Sjöstrand, H.; Rochman, D.

    2017-11-01

    This paper presents a novel approach to the evaluation of nuclear data (ND), combining experimental data for thermal cross sections with resonance parameters and nuclear reaction modeling. The method involves sampling of various uncertain parameters, in particular uncertain components in experimental setups, and provides extensive covariance information, including consistent cross-channel correlations over the whole energy spectrum. The method is developed for, and applied to, 59Ni, but may be used as a whole, or in part, for other nuclides. 59Ni is particularly interesting since a substantial amount of 59Ni is produced in thermal nuclear reactors by neutron capture in 58Ni and since it has a non-threshold (n,α) cross section. Therefore, 59Ni gives a very important contribution to the helium production in stainless steel in a thermal reactor. However, current evaluated ND libraries contain old information for 59Ni, without any uncertainty information. The work includes a study of thermal cross section experiments and a novel combination of this experimental information, giving the full multivariate distribution of the thermal cross sections. In particular, the thermal (n,α) cross section is found to be 12.7 ± 0.7 b. This is consistent with, yet different from, currently established values. Further, the distribution of thermal cross sections is combined with reported resonance parameters, and with TENDL-2015 data, to provide full random ENDF files; all of this is done in a novel way, keeping uncertainties and correlations in mind. The random files are also condensed into one single ENDF file with covariance information, which is now part of a beta version of JEFF 3.3. Finally, the random ENDF files have been processed and used in an MCNP model to study the helium production in stainless steel. The increase in the (n,α) rate due to 59Ni compared to fresh stainless steel is found to be a factor of 5.2 at a certain time in the reactor vessel, with a relative
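
    A minimal sketch of the sampling idea described above (with hypothetical numbers, not those of the 59Ni evaluation): an uncertain component shared by the experimental setups, here a single common normalisation, is sampled together with independent statistical errors, so the combined thermal cross-section estimate inherits a full distribution rather than a single value:

    ```python
    # Monte Carlo combination of thermal cross-section measurements with a
    # shared systematic (normalisation) component. All values are hypothetical.
    import numpy as np

    rng = np.random.default_rng(2)
    measured = np.array([12.2, 13.1, 12.9])   # reported thermal values, barns
    stat = np.array([0.8, 0.9, 1.1])          # independent statistical 1-sigma
    norm_unc = 0.03                           # 3% shared normalisation uncertainty

    samples = []
    for _ in range(20000):
        shared = rng.normal(1.0, norm_unc)    # one systematic draw for all experiments
        x = measured * shared + rng.normal(0.0, stat)
        samples.append(np.average(x, weights=1.0 / stat**2))
    samples = np.array(samples)
    print(f"{samples.mean():.1f} +/- {samples.std(ddof=1):.1f} b")
    ```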

  20. Developmental trauma disorder: pros and cons of including formal criteria in the psychiatric diagnostic systems

    Directory of Open Access Journals (Sweden)

    Schmid Marc

    2013-01-01

    Background: This article reviews the current debate on developmental trauma disorder (DTD) with respect to formalizing its diagnostic criteria. Victims of abuse, neglect, and maltreatment in childhood often develop a wide range of age-dependent psychopathologies with various mental comorbidities. The supporters of a formal DTD diagnosis argue that post-traumatic stress disorder (PTSD) does not cover all consequences of severe and complex traumatization in childhood. Discussion: Traumatized individuals are difficult to treat, but clinical experience has shown that they tend to benefit from specific trauma therapy. A main argument against inclusion of formal DTD criteria into existing diagnostic systems is that emphasis on the etiology of the disorder might force current diagnostic systems to deviate from their purely descriptive nature. Furthermore, comorbidities and biological aspects of the disorder may be underdiagnosed using the DTD criteria. Summary: Here, we discuss arguments for and against the proposal of DTD criteria and address implications and consequences for clinical practice.

  1. Formal System Verification - Extension 2

    Science.gov (United States)

    2012-08-08

    The vision of truly trustworthy systems has been to provide a formally verified microkernel basis. We have previously developed the seL4 microkernel, together with a formal proof (in the theorem prover Isabelle/HOL) of its functional correctness [6]. This means that all the behaviours of the seL4 C source code are included in the high-level, formal specification of the kernel. This work enabled us to provide further formal guarantees about seL4, in…

  2. Decision making under uncertainty: An investigation into the application of formal decision-making methods to safety issue decisions

    International Nuclear Information System (INIS)

    Bohn, M.P.

    1992-12-01

    As part of the NRC-sponsored program to study the implications of Generic Issue 57, "Effects of Fire Protection System Actuation on Safety-Related Equipment," a subtask was performed to evaluate the applicability of formal decision analysis methods to generic-issue cost/benefit-type decisions and to apply these methods to the GI-57 results. In this report, the numerical results obtained from the analysis of three plants (two PWRs and one BWR) as developed in the technical resolution program for GI-57 were studied. For each plant, these results included a calculation of the person-REM averted due to various accident scenarios and various proposed modifications to mitigate the accident scenarios identified. These results were recomputed to break out the benefit in terms of contributions due to random event scenarios, fire event scenarios, and seismic event scenarios. Furthermore, the benefits associated with risk (in terms of person-REM) averted from earthquakes at three different seismic ground motion levels were separately considered. Given these data, formal decision methodologies involving decision trees, value functions, and utility functions were applied to the basic data. It is shown that the formal decision methodology can be applied at several different levels. Examples are given in which the decision between several retrofits is changed from that resulting from a simple cost/benefit-ratio criterion by virtue of the decision-maker's expressed (and assumed) preferences.
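
    How a stated risk preference can flip a ranking produced by a simple benefit ratio is easy to show numerically. The sketch below uses invented costs and benefits and a generic exponential utility function; it is an illustration of the mechanism, not the report's actual analysis.

```python
import numpy as np

# Invented numbers: two retrofits with equal cost; benefits in averted person-REM.
# Retrofit A has a certain benefit; retrofit B's benefit depends on whether the
# seismic scenarios it mitigates actually dominate.
benefit_A, p_A = np.array([100.0]), np.array([1.0])
benefit_B, p_B = np.array([0.0, 250.0]), np.array([0.5, 0.5])

print("expected benefit: A =", benefit_A @ p_A, " B =", benefit_B @ p_B)  # ratio picks B

def utility(x, rho=50.0):
    """Risk-averse exponential utility; rho encodes the decision-maker's preference."""
    return 1.0 - np.exp(-x / rho)

eu_A, eu_B = utility(benefit_A) @ p_A, utility(benefit_B) @ p_B
print(f"expected utility: A = {eu_A:.3f}  B = {eu_B:.3f}")                # utility picks A
```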

  3. PDF uncertainties in precision electroweak measurements, including the W mass, in ATLAS

    CERN Document Server

    Cooper-Sarkar, Amanda; The ATLAS collaboration

    2015-01-01

    Now that the Higgs mass is known, all the parameters of the SM are known, but with what accuracy? Precision EW measurements test the self-consistency of the SM, and thus can give hints of BSM physics. Precision measurements of $\sin^2\theta_W$ and the W mass are limited by PDF uncertainties. This contribution discusses these uncertainties and what can be done to improve them.
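
    For orientation, the symmetric Hessian "master formula" commonly used to translate PDF eigenvector sets into an uncertainty on an observable is sketched below; the central value and eigenvector shifts are invented numbers, not ATLAS results.

```python
import numpy as np

def hessian_pdf_uncertainty(plus, minus):
    """Symmetric Hessian master formula: dX = 0.5 * sqrt(sum_i (X_i+ - X_i-)^2),
    where X_i+/- are the observable recomputed with the i-th eigenvector pair."""
    plus, minus = np.asarray(plus), np.asarray(minus)
    return 0.5 * np.sqrt(np.sum((plus - minus) ** 2))

x0 = 80.370                                               # invented central m_W in GeV
x_plus = x0 + np.array([0.004, -0.002, 0.006, 0.001])     # invented eigenvector shifts
x_minus = x0 - np.array([0.003, -0.001, 0.005, 0.002])
print(f"m_W = {x0:.3f} +/- {hessian_pdf_uncertainty(x_plus, x_minus):.3f} GeV (PDF)")
```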

  4. Statistically based uncertainty assessments in nuclear risk analysis

    International Nuclear Information System (INIS)

    Spencer, F.W.; Diegert, K.V.; Easterling, R.G.

    1987-01-01

    Over the last decade, the problems of estimation and uncertainty assessment in probabilistic risk assessments (PRAs) have been addressed in a variety of NRC and industry-sponsored projects. These problems have received attention because of a recognition that major uncertainties in risk estimation exist, which can be reduced by collecting more and better data and other information, and because of a recognition that better methods for assessing these uncertainties are needed. In particular, a clear understanding of the nature and magnitude of various sources of uncertainty is needed to facilitate decision-making on possible plant changes and research options. Recent PRAs have employed methods of probability propagation, sometimes involving the use of Bayes' Theorem, and intended to formalize the use of "engineering judgment" or "expert opinion." All sources, or feelings, of uncertainty are expressed probabilistically, so that uncertainty analysis becomes simply a matter of probability propagation. Alternatives to forcing a probabilistic framework at all stages of a PRA are a major concern in this paper, however.

  5. Modelling ecosystem service flows under uncertainty with stochastic SPAN

    Science.gov (United States)

    Johnson, Gary W.; Snapp, Robert R.; Villa, Ferdinando; Bagstad, Kenneth J.

    2012-01-01

    Ecosystem service models are increasingly in demand for decision making. However, the data required to run these models are often patchy, missing, outdated, or untrustworthy. Further, communication of data and model uncertainty to decision makers is often either absent or unintuitive. In this work, we introduce a systematic approach to addressing both the data gap and the difficulty in communicating uncertainty through a stochastic adaptation of the Service Path Attribution Networks (SPAN) framework. The SPAN formalism assesses ecosystem services through a set of up to 16 maps, which characterize the services in a study area in terms of flow pathways between ecosystems and human beneficiaries. Although the SPAN algorithms were originally defined deterministically, we present them here in a stochastic framework which combines probabilistic input data with a stochastic transport model in order to generate probabilistic spatial outputs. This enables a novel feature among ecosystem service models: the ability to spatially visualize uncertainty in the model results. The stochastic SPAN model can analyze areas where data limitations are prohibitive for deterministic models. Greater uncertainty in the model inputs (including missing data) should lead to greater uncertainty expressed in the model’s output distributions. By using Bayesian belief networks to fill data gaps and expert-provided trust assignments to augment untrustworthy or outdated information, we can account for uncertainty in input data, producing a model that is still able to run and provide information where strictly deterministic models could not. Taken together, these attributes enable more robust and intuitive modelling of ecosystem services under uncertainty.
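
    The per-pixel uncertainty maps that stochastic SPAN produces can be mimicked with a generic raster Monte Carlo; the source map, trust-based standard deviations, and the toy transport step below are invented stand-ins for the SPAN flow algorithms.

```python
import numpy as np

rng = np.random.default_rng(0)
ny, nx, n_draws = 20, 20, 500

# Invented probabilistic inputs: per-pixel mean and sd of a source map; cells flagged
# as less trustworthy (stand-ins for outdated data) get inflated uncertainty.
source_mu = rng.uniform(0.0, 1.0, (ny, nx))
less_trusted = rng.uniform(size=(ny, nx)) < 0.3
source_sd = np.where(less_trusted, 0.3, 0.1)

draws = rng.normal(source_mu, source_sd, (n_draws, ny, nx)).clip(0.0, None)
flow = 0.8 * draws                    # toy transport step: 80% of the source reaches users

flow_mean = flow.mean(axis=0)         # per-pixel expected service flow
flow_sd = flow.std(axis=0)            # per-pixel output uncertainty, mappable like any raster
print("mean delivered service:", round(float(flow_mean.mean()), 3))
print("max per-pixel output sd:", round(float(flow_sd.max()), 3))
```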

  6. Indigenous Knowledge and Education from the Quechua Community to School: Beyond the Formal/Non-Formal Dichotomy

    Science.gov (United States)

    Sumida Huaman, Elizabeth; Valdiviezo, Laura Alicia

    2014-01-01

    In this article, we propose to approach Indigenous education beyond the formal/non-formal dichotomy. We argue that there is a critical need to conscientiously include Indigenous knowledge in education processes from the school to the community; particularly, when formal systems exclude Indigenous cultures and languages. Based on ethnographic…

  7. Application of perturbation theory methods to nuclear data uncertainty propagation using the collision probability method

    International Nuclear Information System (INIS)

    Sabouri, Pouya

    2013-01-01

    This thesis presents a comprehensive study of sensitivity/uncertainty analysis for reactor performance parameters (e.g. the k-effective) with respect to the base nuclear data from which they are computed. The analysis starts at the fundamental step, the Evaluated Nuclear Data File, and the uncertainties inherently associated with the data it contains, available in the form of variance/covariance matrices. We show that when a methodical and consistent computation of sensitivity is performed, conventional deterministic formalisms can be sufficient to propagate nuclear data uncertainties with the level of accuracy obtained by the most advanced tools, such as state-of-the-art Monte Carlo codes. By applying our developed methodology to three exercises proposed by the OECD (Uncertainty Analysis for Criticality Safety Assessment Benchmarks), we provide insights into the underlying physical phenomena associated with the formalisms used. (author)
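
    Deterministic propagation of this kind rests on the first-order "sandwich rule", relating a sensitivity vector S and a covariance matrix C through var ≈ SᵀCS. A minimal numerical instance, with an invented sensitivity vector and covariance matrix:

```python
import numpy as np

# First-order "sandwich rule": var(k_eff) ~ S^T C S, with S the sensitivity vector
# (relative change of k per relative change of each cross section) and C the
# relative covariance matrix of the nuclear data (both invented for illustration).
S = np.array([0.30, -0.15, 0.05])               # sensitivities to 3 cross sections
C = np.array([[0.0004, 0.0001, 0.0    ],
              [0.0001, 0.0009, 0.0002 ],
              [0.0,    0.0002, 0.0016 ]])       # relative covariances

rel_var = S @ C @ S
print(f"relative uncertainty on k_eff: {np.sqrt(rel_var) * 100:.3f} %")
```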

  8. Uncertainty of fast biological radiation dose assessment for emergency response scenarios.

    Science.gov (United States)

    Ainsbury, Elizabeth A; Higueras, Manuel; Puig, Pedro; Einbeck, Jochen; Samaga, Daniel; Barquinero, Joan Francesc; Barrios, Lleonard; Brzozowska, Beata; Fattibene, Paola; Gregoire, Eric; Jaworska, Alicja; Lloyd, David; Oestreicher, Ursula; Romm, Horst; Rothkamm, Kai; Roy, Laurence; Sommer, Sylwester; Terzoudi, Georgia; Thierens, Hubert; Trompier, Francois; Vral, Anne; Woda, Clemens

    2017-01-01

    Reliable dose estimation is an important factor in appropriate dosimetric triage categorization of exposed individuals to support radiation emergency response. Following work done under the EU FP7 MULTIBIODOSE and RENEB projects, formal methods for defining uncertainties on biological dose estimates are compared using simulated and real data from recent exercises. The results demonstrate that a Bayesian method of uncertainty assessment is the most appropriate, even in the absence of detailed prior information. The relative accuracy and relevance of techniques for calculating uncertainty and combining assay results to produce single dose and uncertainty estimates are further discussed. Finally, it is demonstrated that whatever uncertainty estimation method is employed, ignoring the uncertainty on fast dose assessments can have an important impact on rapid biodosimetric categorization.
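
    A minimal sketch of a Bayesian dose estimate from a dicentric assay follows, with a quadratic dose-response calibration and Poisson-distributed counts; the calibration coefficients and scored counts are invented, and a real analysis would also propagate calibration-curve uncertainty.

```python
import numpy as np

# Dicentric yield model Y(D) = c + alpha*D + beta*D^2 (coefficients invented);
# observed counts are modelled as Poisson. Flat prior over a dose grid.
c, alpha, beta = 0.001, 0.03, 0.06        # dicentrics per cell, per Gy, per Gy^2
cells, dicentrics = 500, 45               # hypothetical scored sample

dose = np.linspace(0.0, 6.0, 601)
dx = dose[1] - dose[0]
lam = cells * (c + alpha * dose + beta * dose ** 2)
log_like = dicentrics * np.log(lam) - lam          # Poisson log-likelihood (unnormalized)
post = np.exp(log_like - log_like.max())
post /= post.sum() * dx                            # normalize the posterior density

mean = (dose * post).sum() * dx
cdf = np.cumsum(post) * dx
lo, hi = dose[np.searchsorted(cdf, 0.025)], dose[np.searchsorted(cdf, 0.975)]
print(f"dose = {mean:.2f} Gy, 95% credible interval ({lo:.2f}, {hi:.2f}) Gy")
```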

  9. Evaluation of uncertainty in the measurement of sense of natural language constructions

    Directory of Open Access Journals (Sweden)

    Bisikalo Oleg V.

    2017-01-01

    Full Text Available The task of evaluating uncertainty in the measurement of sense in natural language constructions (NLCs was researched through formalization of the notions of the language image, formalization of artificial cognitive systems (ACSs and the formalization of units of meaning. The method for measuring the sense of natural language constructions incorporated fuzzy relations of meaning, which ensures that information about the links between lemmas of the text is taken into account, permitting the evaluation of two types of measurement uncertainty of sense characteristics. Using developed applications programs, experiments were conducted to investigate the proposed method to tackle the identification of informative characteristics of text. The experiments resulted in dependencies of parameters being obtained in order to utilise the Pareto distribution law to define relations between lemmas, analysis of which permits the identification of exponents of an average number of connections of the language image as the most informative characteristics of text.

  10. The explicit treatment of model uncertainties in the presence of aleatory and epistemic parameter uncertainties in risk and reliability analysis

    International Nuclear Information System (INIS)

    Ahn, Kwang Il; Yang, Joon Eon

    2003-01-01

    In the risk and reliability analysis of complex technological systems, the primary concern of formal uncertainty analysis is to understand why uncertainties arise, and to evaluate how they impact the results of the analysis. In recent times, many of the uncertainty analyses have focused on parameters of the risk and reliability analysis models, whose values are uncertain in an aleatory or an epistemic way. As the field of parametric uncertainty analysis matures, however, more attention is being paid to the explicit treatment of uncertainties that are addressed in the predictive model itself as well as the accuracy of the predictive model. The essential steps for evaluating impacts of these model uncertainties in the presence of parameter uncertainties are to determine rigorously the various sources of uncertainty to be addressed in the underlying model itself and in turn in the model parameters, based on our state of knowledge and relevant evidence. Answering clearly the question of how to characterize and treat explicitly the foregoing different sources of uncertainty is particularly important for practical aspects such as risk and reliability optimization of systems as well as more transparent risk information and decision-making under various uncertainties. The main purpose of this paper is to provide practical guidance for quantitatively treating various model uncertainties that would often be encountered in the risk and reliability modeling process of complex technological systems.
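
    One common way to keep the two kinds of uncertainty separate in the computations is nested (double-loop) sampling: epistemic parameters in the outer loop, aleatory variability in the inner loop. The limit state and all distribution parameters below are invented.

```python
import numpy as np

rng = np.random.default_rng(1)
n_epistemic, n_aleatory = 200, 5_000

# Toy limit state g = capacity - load; failure when g < 0. The load is aleatory
# (lognormal); the capacity mean is epistemically uncertain (sparse data), so each
# outer draw yields one conditional failure probability. All numbers are invented.
p_fail = np.empty(n_epistemic)
for i in range(n_epistemic):
    capacity_mean = rng.normal(10.0, 0.5)                          # epistemic: outer loop
    load = rng.lognormal(mean=1.8, sigma=0.3, size=n_aleatory)     # aleatory: inner loop
    capacity = rng.normal(capacity_mean, 1.0, size=n_aleatory)
    p_fail[i] = np.mean(capacity - load < 0.0)

print(f"failure probability: median {np.median(p_fail):.4f}, 90% epistemic band "
      f"({np.quantile(p_fail, 0.05):.4f}, {np.quantile(p_fail, 0.95):.4f})")
```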

  11. Brine migration resulting from CO2 injection into saline aquifers – An approach to risk estimation including various levels of uncertainty

    DEFF Research Database (Denmark)

    Walter, Lena; Binning, Philip John; Oladyshkin, Sergey

    2012-01-01

    Comprehensive risk assessment is a major task for large-scale projects such as geological storage of CO2. Basic hazards are damage to the integrity of caprocks, leakage of CO2, or reduction of groundwater quality due to intrusion of fluids. This study focuses on salinization of freshwater aquifers resulting from displaced brine. Quantifying risk on the basis of numerical simulations requires consideration of different kinds of uncertainties, and this study considers both scenario uncertainty and statistical uncertainty. Addressing scenario uncertainty involves expert opinion on relevant geological… …for large-scale 3D models including complex physics. Therefore, we apply a model reduction based on arbitrary polynomial chaos expansion combined with the probabilistic collocation method. It is shown that, dependent on data availability, both types of uncertainty can be equally significant. The presented study…
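
    Polynomial chaos expansion with probabilistic collocation is, at heart, a polynomial surrogate fitted at a small number of model runs and then sampled cheaply. A generic least-squares variant of this idea is sketched below; the quadratic basis, toy model, and sample sizes are invented and do not reproduce the authors' arbitrary-PCE machinery.

```python
import numpy as np

rng = np.random.default_rng(2)

def expensive_model(x):
    """Stand-in for a costly 3D simulator (invented toy function)."""
    return np.exp(0.5 * x[:, 0]) + 0.3 * x[:, 1] ** 2

def basis(x):
    """Total-degree-2 polynomial basis in two variables."""
    return np.column_stack([np.ones(len(x)), x[:, 0], x[:, 1],
                            x[:, 0] ** 2, x[:, 0] * x[:, 1], x[:, 1] ** 2])

# Collocation step: evaluate the expensive model at a few points, fit the expansion.
x_train = rng.normal(size=(30, 2))
coef, *_ = np.linalg.lstsq(basis(x_train), expensive_model(x_train), rcond=None)

# Cheap Monte Carlo on the surrogate instead of the full model.
x_mc = rng.normal(size=(100_000, 2))
y_mc = basis(x_mc) @ coef
print(f"surrogate mean {y_mc.mean():.3f}, standard deviation {y_mc.std():.3f}")
```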

  12. Measurement uncertainties in science and technology

    CERN Document Server

    Grabe, Michael

    2014-01-01

    This book recasts the classical Gaussian error calculus from scratch, the inducement being the treatment of both random and unknown systematic errors. The idea of the book is to create a formalism fit to localize the true values of the physical quantities considered – true with respect to the set of predefined physical units. Remarkably enough, the prevailingly practiced forms of error calculus do not feature this property, which however proves, in every respect, to be physically indispensable. The amended formalism, termed Generalized Gaussian Error Calculus by the author, treats unknown systematic errors as biases and brings random errors to bear via enhanced confidence intervals as laid down by Student. The significantly extended second edition thoroughly restructures and systematizes the text as a whole and illustrates the formalism by numerous numerical examples. They demonstrate the basic principles of how to understand uncertainties in order to localize the true values of measured quantities - a perspective decisive in vi…

  13. MODARIA WG5: Towards a practical guidance for including uncertainties in the results of dose assessment of routine releases

    Energy Technology Data Exchange (ETDEWEB)

    Mora, Juan C. [Centro de Investigaciones Energeticas, Medioambientales y Tecnologicas - CIEMAT (Spain); Telleria, Diego [International Atomic Energy Agency - IAEA (Austria); Al Neaimi, Ahmed [Emirates Nuclear Energy Corporation - ENEC (United Arab Emirates); Blixt Buhr, Anna Ma [Vattenfall AB (Sweden); Bonchuk, Iurii [Radiation Protection Institute - RPI (Ukraine); Chouhan, Sohan [Atomic Energy of Canada Limited - AECL (Canada); Chyly, Pavol [SE-VYZ (Slovakia); Curti, Adriana R. [Autoridad Regulatoria Nuclear - ARN (Argentina); Da Costa, Dejanira [Instituto de Radioprotecao e Dosimetria - IRD (Brazil); Duran, Juraj [VUJE Inc (Slovakia); Galeriu, Dan [Horia Hulubei National Institute of Physics and Nuclear Engineering - IFIN-HH (Romania); Haegg, Ann- Christin; Lager, Charlotte [Swedish Radiation Safety Authority - SSM (Sweden); Heling, Rudie [Nuclear Research and Consultancy Group - NRG (Netherlands); Ivanis, Goran; Shen, Jige [Ecometrix Incorporated (Canada); Iosjpe, Mikhail [Norwegian Radiation Protection Authority - NRPA (Norway); Krajewski, Pawel M. [Central Laboratory for Radiological Protection - CLOR (Poland); Marang, Laura; Vermorel, Fabien [Electricite de France - EdF (France); Mourlon, Christophe [Institut de Radioprotection et de Surete Nucleaire - IRSN (France); Perez, Fabricio F. [Belgian Nuclear Research Centre - SCK (Belgium); Woodruffe, Andrew [Federal Authority for Nuclear Regulation - FANR (United Arab Emirates); Zorko, Benjamin [Jozef Stefan Institute (Slovenia)

    2014-07-01

    The MODARIA (Modelling and Data for Radiological Impact Assessments) project was launched in 2012 with the aim of improving capabilities in radiation dose assessment through the acquisition of improved data for model testing, model testing and comparison, consensus-building on modelling philosophies, approaches and parameter values, development of improved methods, and exchange of information. The project focuses on areas where uncertainties remain in the predictive capability of environmental models, with emphasis on reducing the associated uncertainties or developing new approaches to strengthen the evaluation of radiological impact. Within MODARIA, four main areas were defined, one of them devoted to Uncertainty and Variability. This area comprises four working groups, with Working Group 5 dealing with the 'uncertainty and variability analysis for assessments of radiological impacts arising from routine discharges of radionuclides'. Whether doses are estimated by using measurement data, by applying models, or through a combination of measurements and calculations, variability and uncertainty contribute to a distribution of possible values. The degree of variability and uncertainty is represented by the shape and extent of that distribution. The main objective of WG5 is to explore how to consider uncertainties and variabilities in the results of dose assessments in planned exposure situations, for controlling the impact of routine releases from radioactive and nuclear installations to the environment. The final aim is to produce guidance for the calculation of uncertainties in these exposure situations and for the presentation of such results to the different stakeholders. To achieve that objective, the main tasks identified were: to find tools and methods for uncertainty and variability analysis applicable to dose assessments of routine radioactive discharges, and to define scenarios where information on uncertainty and variability of parameters is available.

  14. Combined Uncertainty and A-Posteriori Error Bound Estimates for General CFD Calculations: Theory and Software Implementation

    Science.gov (United States)

    Barth, Timothy J.

    2014-01-01

    This workshop presentation discusses the design and implementation of numerical methods for the quantification of statistical uncertainty, including a-posteriori error bounds, for output quantities computed using CFD methods. Hydrodynamic realizations often contain numerical error arising from finite-dimensional approximation (e.g. numerical methods using grids, basis functions, particles) and statistical uncertainty arising from incomplete information and/or statistical characterization of model parameters and random fields. The first task at hand is to derive formal error bounds for statistics given realizations containing finite-dimensional numerical error [1]. The error in computed output statistics contains contributions from both realization error and the error resulting from the calculation of statistics integrals using a numerical method. A second task is to devise computable a-posteriori error bounds by numerically approximating all terms arising in the error bound estimates. For the same reason that CFD calculations including error bounds but omitting uncertainty modeling are only of limited value, CFD calculations including uncertainty modeling but omitting error bounds are only of limited value. To gain maximum value from CFD calculations, a general software package for uncertainty quantification with quantified error bounds has been developed at NASA. The package provides implementations for a suite of numerical methods used in uncertainty quantification: Dense tensorization basis methods [3] and a subscale recovery variant [1] for non-smooth data, Sparse tensorization methods[2] utilizing node-nested hierarchies, Sampling methods[4] for high-dimensional random variable spaces.
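
    The nonintrusive, "black box" character of the approach can be illustrated with plain sampling around a stand-in solver; the cfd_solver function, input distributions, and the assumed per-run numerical error bound below are all invented.

```python
import numpy as np

rng = np.random.default_rng(3)

def cfd_solver(mach, pressure_ratio):
    """Stand-in for one black-box CFD run returning an output quantity (invented)."""
    return 1.2 * mach + 0.8 * np.log(pressure_ratio) + rng.normal(0.0, 0.02)

n = 64
mach = rng.normal(2.5, 0.05, n)       # aleatoric inflow variability (assumed distribution)
pr = rng.uniform(8.0, 12.0, n)        # epistemic interval on a boundary condition
q = np.array([cfd_solver(m, p) for m, p in zip(mach, pr)])

mean, sem = q.mean(), q.std(ddof=1) / np.sqrt(n)      # statistical (sampling) error
disc_err = 0.05                                        # assumed per-run numerical error bound
print(f"output: {mean:.3f} +/- {1.96 * sem:.3f} (sampling) +/- {disc_err} (numerical)")
```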

  15. The role of sensitivity analysis in assessing uncertainty

    International Nuclear Information System (INIS)

    Crick, M.J.; Hill, M.D.

    1987-01-01

    Outside the specialist world of those carrying out performance assessments, considerable confusion has arisen about the meanings of sensitivity analysis and uncertainty analysis. In this paper we attempt to reduce this confusion. We then go on to review approaches to sensitivity analysis within the context of assessing uncertainty, and to outline the types of test available to identify sensitive parameters, together with their advantages and disadvantages. The views expressed in this paper are those of the authors; they have not been formally endorsed by the National Radiological Protection Board and should not be interpreted as Board advice.
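
    One simple screening test of the kind surveyed here is the rank correlation between sampled inputs and the output; the toy model and parameter ranges below are invented.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 2_000
x = rng.uniform(0.0, 1.0, (n, 3))     # three sampled uncertain parameters
y = 5.0 * x[:, 0] ** 2 + 0.5 * x[:, 1] + rng.normal(0.0, 0.1, n)   # toy model output

def ranks(a):
    """Rank-transform a 1-D array (ties are negligible for continuous samples)."""
    r = np.empty(len(a))
    r[np.argsort(a)] = np.arange(len(a))
    return r

for i in range(3):
    rho = np.corrcoef(ranks(x[:, i]), ranks(y))[0, 1]   # Spearman rank correlation
    print(f"parameter {i}: rank correlation = {rho:+.2f}")
```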

  16. Scalable Techniques for Formal Verification

    CERN Document Server

    Ray, Sandip

    2010-01-01

    This book presents state-of-the-art approaches to formal verification techniques to seamlessly integrate different formal verification methods within a single logical foundation. It should benefit researchers and practitioners looking to get a broad overview of the spectrum of formal verification techniques, as well as approaches to combining such techniques within a single framework. Coverage includes a range of case studies showing how such combination is fruitful in developing a scalable verification methodology for industrial designs. This book outlines both theoretical and practical issues…

  17. Thermal photon production in Au + Au collisions: Viscous corrections in two different hydrodynamic formalisms

    Energy Technology Data Exchange (ETDEWEB)

    Peralta-Ramos, J., E-mail: jperalta@ift.unesp.b [Instituto de Fisica Teorica, Universidade Estadual Paulista, Rua Doutor Bento Teobaldo Ferraz 271, Bloco II, 01140-070 Sao Paulo (Brazil); Nakwacki, M.S., E-mail: sole@iafe.uba.a [Instituto de Astronomia, Geofisica e Ciencias Atmosfericas, Universidade de Sao Paulo, Rua do Matao 1226, Cidade Universitaria, 05508-090 Sao Paulo (Brazil)

    2011-02-01

    We calculate the spectra of produced thermal photons in Au + Au collisions taking into account the nonequilibrium contribution to photon production due to finite shear viscosity. The evolution of the fireball is modeled by second-order as well as by divergence-type 2+1 dissipative hydrodynamics, both with an ideal equation of state and with one based on Lattice QCD that includes an analytical crossover. The spectrum calculated in the divergence-type theory is considerably enhanced with respect to the one calculated in the second-order theory, the difference being entirely due to differences in the viscous corrections to photon production. Our results show that the differences in hydrodynamic formalisms are an important source of uncertainty in the extraction of the value of η/s from measured photon spectra. The uncertainty in the value of η/s associated with different hydrodynamic models used to compute thermal photon spectra is larger than the one occurring in matching hadron elliptic flow to RHIC data.

  18. The uncertainties in estimating measurement uncertainties

    International Nuclear Information System (INIS)

    Clark, J.P.; Shull, A.H.

    1994-01-01

    All measurements include some error. Whether measurements are used for accountability, environmental programs or process support, they are of little value unless accompanied by an estimate of the measurement's uncertainty. This fact is often overlooked by the individuals who need measurements to make decisions. This paper will discuss the concepts of measurement, measurement errors (accuracy or bias and precision or random error), physical and error models, measurement control programs, examples of measurement uncertainty, and uncertainty as related to measurement quality. Measurements are comparisons of unknowns to knowns, estimates of some true value plus an uncertainty, and are no better than the standards to which they are compared. Direct comparisons of unknowns that match the composition of known standards will normally have small uncertainties. In the real world, measurements usually involve indirect comparisons of significantly different materials (e.g., measuring a physical property of a chemical element in a sample having a matrix that is significantly different from that of the calibration standards). Consequently, there are many sources of error involved in measurement processes that can affect the quality of a measurement and its associated uncertainty. How the uncertainty estimates are determined and what they mean is as important as the measurement itself. The process of calculating the uncertainty of a measurement itself has uncertainties that must be handled correctly. Examples of chemistry laboratory measurements will be reviewed in this report and recommendations made for improving measurement uncertainties.
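
    The point that uncertainty estimates are themselves uncertain can be quantified for the simplest case: the sample standard deviation from n normal observations has a relative standard error of roughly 1/sqrt(2(n-1)). A quick numerical check, with sample size and trial count chosen arbitrarily:

```python
import numpy as np

rng = np.random.default_rng(5)
n, trials = 10, 20_000

# Repeatedly estimate a standard deviation from n normal observations (true sd = 1)
# to show the spread of the uncertainty estimate itself.
s = rng.normal(0.0, 1.0, (trials, n)).std(axis=1, ddof=1)
print(f"mean of s: {s.mean():.3f}, sd of s: {s.std():.3f}")
print(f"theory (approx.): sd of s ~ 1/sqrt(2(n-1)) = {1.0 / np.sqrt(2 * (n - 1)):.3f}")
```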

  19. Incorporating uncertainty regarding applicability of evidence from meta-analyses into clinical decision making.

    Science.gov (United States)

    Kriston, Levente; Meister, Ramona

    2014-03-01

    Judging applicability (relevance) of meta-analytical findings to particular clinical decision-making situations remains challenging. We aimed to describe an evidence synthesis method that accounts for possible uncertainty regarding applicability of the evidence. We conceptualized uncertainty regarding applicability of the meta-analytical estimates to a decision-making situation as the result of uncertainty regarding applicability of the findings of the trials that were included in the meta-analysis. This trial-level applicability uncertainty can be directly assessed by the decision maker and allows for the definition of trial inclusion probabilities, which can be used to perform a probabilistic meta-analysis with unequal probability resampling of trials (adaptive meta-analysis). A case study with several fictitious decision-making scenarios was performed to demonstrate the method in practice. We present options to elicit trial inclusion probabilities and perform the calculations. The result of an adaptive meta-analysis is a frequency distribution of the estimated parameters from traditional meta-analysis that provides individually tailored information according to the specific needs and uncertainty of the decision maker. The proposed method offers a direct and formalized combination of research evidence with individual clinical expertise and may aid clinicians in specific decision-making situations. Copyright © 2014 Elsevier Inc. All rights reserved.
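
    The mechanics of unequal-probability resampling can be sketched in a few lines; the trial effects, standard errors, and inclusion probabilities below are invented, and fixed-effect inverse-variance pooling is used as one simple choice for illustration.

```python
import numpy as np

rng = np.random.default_rng(6)

# Hypothetical trials: effects, standard errors, and decision-maker supplied
# inclusion (applicability) probabilities; all values invented.
effect = np.array([0.30, 0.10, 0.55, 0.20])
se = np.array([0.10, 0.08, 0.20, 0.12])
p_incl = np.array([0.9, 0.5, 0.3, 1.0])

pooled = []
for _ in range(10_000):
    keep = rng.uniform(size=effect.size) < p_incl      # unequal-probability resample
    if not keep.any():
        continue
    w = 1.0 / se[keep] ** 2                            # inverse-variance weights
    pooled.append(np.sum(w * effect[keep]) / np.sum(w))

pooled = np.array(pooled)
print(f"pooled effect: median {np.median(pooled):.3f}, 95% interval "
      f"({np.quantile(pooled, 0.025):.3f}, {np.quantile(pooled, 0.975):.3f})")
```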

  20. Uncertainty in a monthly water balance model using the generalized likelihood uncertainty estimation methodology

    Science.gov (United States)

    Rivera, Diego; Rivas, Yessica; Godoy, Alex

    2015-02-01

    Hydrological models are simplified representations of natural processes and subject to errors. Uncertainty bounds are a commonly used way to assess the impact of input or model-architecture uncertainty on model outputs. Different sets of parameters could have equally robust goodness-of-fit indicators, which is known as equifinality. We assessed the outputs from a lumped conceptual hydrological model of an agricultural watershed in central Chile under strong interannual variability (coefficient of variability of 25%) by using the equifinality concept and uncertainty bounds. The simulation period ran from January 1999 to December 2006. Equifinality and uncertainty bounds from the GLUE methodology (Generalized Likelihood Uncertainty Estimation) were used to identify parameter sets as potential representations of the system. The aim of this paper is to exploit the use of uncertainty bounds to differentiate behavioural parameter sets in a simple hydrological model. We then analyze the presence of equifinality in order to improve the identification of relevant hydrological processes. The water balance model for the Chillan River exhibits, at a first stage, equifinality. However, it was possible to narrow the range of the parameters and eventually identify a set of parameters representing the behaviour of the watershed (a behavioural model) in agreement with observational and soft data (calculation of areal precipitation over the watershed using an isohyetal map). The mean width of the uncertainty bound around the predicted runoff for the simulation period decreased from 50 to 20 m³/s after fixing the parameter controlling the areal precipitation over the watershed. This decrement is equivalent to decreasing the ratio between simulated and observed discharge from 5.2 to 2.5. Despite the criticisms against the GLUE methodology, such as its lack of statistical formality, it is identified as a useful tool assisting the modeller with the identification of critical parameters.
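
    A minimal GLUE loop, with an invented linear toy "watershed" in place of the authors' monthly water balance model, shows the mechanics of behavioural selection and uncertainty bounds:

```python
import numpy as np

rng = np.random.default_rng(7)

# Invented toy "watershed": monthly runoff = a * rain - b, observed with noise.
rain = rng.uniform(20.0, 200.0, 96)                       # 8 years of monthly rain
q_obs = 0.6 * rain - 5.0 + rng.normal(0.0, 8.0, 96)

n = 20_000
a = rng.uniform(0.1, 1.0, n)                              # sampled parameter sets
b = rng.uniform(0.0, 20.0, n)
sim = a[:, None] * rain - b[:, None]
nse = 1.0 - ((sim - q_obs) ** 2).sum(axis=1) / ((q_obs - q_obs.mean()) ** 2).sum()

behavioural = nse > 0.7                                   # subjective GLUE threshold
lo = np.quantile(sim[behavioural], 0.05, axis=0)          # 5-95% uncertainty bounds
hi = np.quantile(sim[behavioural], 0.95, axis=0)
print("behavioural parameter sets:", int(behavioural.sum()))
print(f"mean 5-95% bound width: {np.mean(hi - lo):.1f} (runoff units)")
```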

  1. Uncertainty Quantification of CFD Data Generated for a Model Scramjet Isolator Flowfield

    Science.gov (United States)

    Baurle, R. A.; Axdahl, E. L.

    2017-01-01

    Computational fluid dynamics is now considered to be an indispensable tool for the design and development of scramjet engine components. Unfortunately, the quantification of uncertainties is rarely addressed with anything other than sensitivity studies, so the degree of confidence associated with the numerical results remains exclusively with the subject matter expert that generated them. This practice must be replaced with a formal uncertainty quantification process for computational fluid dynamics to play an expanded role in the system design, development, and flight certification process. Given the limitations of current hypersonic ground test facilities, this expanded role is believed to be a requirement by some in the hypersonics community if scramjet engines are to be given serious consideration as a viable propulsion system. The present effort describes a simple, relatively low cost, nonintrusive approach to uncertainty quantification that includes the basic ingredients required to handle both aleatoric (random) and epistemic (lack of knowledge) sources of uncertainty. The nonintrusive nature of the approach allows the computational fluid dynamicist to perform the uncertainty quantification with the flow solver treated as a "black box". Moreover, a large fraction of the process can be automated, allowing the uncertainty assessment to be readily adapted into the engineering design and development workflow. In the present work, the approach is applied to a model scramjet isolator problem where the desire is to validate turbulence closure models in the presence of uncertainty. In this context, the relevant uncertainty sources are determined and accounted for to allow the analyst to delineate turbulence model-form errors from other sources of uncertainty associated with the simulation of the facility flow.

  2. A computational formalization for partial evaluation

    DEFF Research Database (Denmark)

    Hatcliff, John; Danvy, Olivier

    1997-01-01

    We formalize a partial evaluator for Eugenio Moggi's computational metalanguage. This formalization gives an evaluation-order independent view of binding-time analysis and program specialization, including a proper treatment of call unfolding. It also enables us to express the essence of `control...

  3. Uncertainty analysis for secondary energy distributions

    International Nuclear Information System (INIS)

    Gerstl, S.A.W.

    1978-01-01

    In many transport calculations the integral design parameter of interest (response) is determined mainly by secondary particles such as gamma rays from (n,γ) reactions or secondary neutrons from inelastic scattering events or (n,2n) reactions. Standard sensitivity analysis usually allows one to calculate the sensitivities to the production cross sections of such secondaries, but an extended formalism is needed to also obtain the sensitivities to the energy distribution of the generated secondary particles. For a 30-group standard cross-section set, 84% of all non-zero table positions pertain to the description of secondary energy distributions (SED's) and only 16% to the actual reaction cross sections. Therefore, any sensitivity/uncertainty analysis which does not consider the effects of SED's is incomplete and neglects most of the input data. This paper describes the methods by which sensitivity profiles for SED's are obtained and used to estimate the uncertainty of an integral response due to uncertainties in these SED's. The detailed theory is documented elsewhere and implemented in the LASL sensitivity code SENSIT. SED sensitivity profiles have proven particularly valuable in cross-section uncertainty analyses for fusion reactors. Even when the production cross sections for secondary neutrons were assumed to be without error, the uncertainties in the energy distribution of these secondaries produced appreciable uncertainties in the calculated tritium breeding rate. However, complete error files for SED's are presently nonexistent. Therefore, methods will be described that allow rough error estimates, due to estimated SED uncertainties, based on integral SED sensitivities.

  4. Uncertainty and Decision Making: Examples of Some Possible New Frontiers

    Science.gov (United States)

    Silliman, S. E.; Rodak, C. M.; Bolster, D.; Saavedra, K.; Evans, W.

    2011-12-01

    The concept of decision making under uncertainty for groundwater systems represents an exciting area of research and application. In this presentation, three examples are briefly introduced which represent possible new applications of risk and decision making under uncertainty. In the most classic of the three examples, a probabilistic strategy is considered within the context of management/assessment of proposed changes in land use in the vicinity of a public water-supply well. Focused on health risk related to contamination at the well, the analysis includes uncertainties in source location/strength, groundwater flow/transport, human exposure, and human health risk. The second example involves application of Probabilistic Risk Assessment (PRA) to the evaluation of development projects in rural regions of developing countries. PRA combined with Fault Tree Analysis provides a structure for analysis of the impact of data uncertainties on the estimation of health risk resulting from failure of multiple components of new water-resource systems. The third is an extension of the concept of "risk compensation" to the analysis of potential long-term risk associated with new water-resource projects. Of direct interest here is the appearance of new risk to the public, such as the introduction of new disease pathways or new sources of contamination of the source waters. As a result of limitations on the conceptual model and/or limitations on data, this type of risk is often difficult to identify/assess, and is therefore not commonly included in formal decision-making efforts; it may, however, seriously impact the long-term net benefit of a water resource project. The goal of presenting these three examples is to illustrate the breadth of possible application of uncertainty/risk analyses beyond the more classic applications to groundwater remediation and protection.
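
    As a minimal illustration of the PRA-with-fault-tree idea in the second example, the sketch below propagates uncertain basic-event probabilities through a small invented fault tree for a hypothetical rural water system; the gate structure and beta-distribution parameters are made up.

```python
import numpy as np

rng = np.random.default_rng(8)
n = 100_000

# Invented fault tree for a hypothetical rural water system: top event =
# (pump fails OR pipe fails) AND storage empty. Basic-event probabilities are
# themselves uncertain (beta distributions with made-up parameters).
p_pump = rng.beta(2, 50, n)
p_pipe = rng.beta(1, 80, n)
p_tank = rng.beta(3, 20, n)

p_top = (p_pump + p_pipe - p_pump * p_pipe) * p_tank     # OR gate feeding an AND gate
print(f"top-event probability: mean {p_top.mean():.4f}, "
      f"95th percentile {np.quantile(p_top, 0.95):.4f}")
```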

  5. Reliability analysis under epistemic uncertainty

    International Nuclear Information System (INIS)

    Nannapaneni, Saideep; Mahadevan, Sankaran

    2016-01-01

    This paper proposes a probabilistic framework to include both aleatory and epistemic uncertainty within model-based reliability estimation of engineering systems for individual limit states. Epistemic uncertainty is considered due to both data and model sources. Sparse point and/or interval data regarding the input random variables leads to uncertainty regarding their distribution types, distribution parameters, and correlations; this statistical uncertainty is included in the reliability analysis through a combination of likelihood-based representation, Bayesian hypothesis testing, and Bayesian model averaging techniques. Model errors, which include numerical solution errors and model form errors, are quantified through Gaussian process models and included in the reliability analysis. The probability integral transform is used to develop an auxiliary variable approach that facilitates a single-level representation of both aleatory and epistemic uncertainty. This strategy results in an efficient single-loop implementation of Monte Carlo simulation (MCS) and FORM/SORM techniques for reliability estimation under both aleatory and epistemic uncertainty. Two engineering examples are used to demonstrate the proposed methodology. - Highlights: • Epistemic uncertainty due to data and model included in reliability analysis. • A novel FORM-based approach proposed to include aleatory and epistemic uncertainty. • A single-loop Monte Carlo approach proposed to include both types of uncertainties. • Two engineering examples used for illustration.
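
    The auxiliary-variable idea can be illustrated with a minimal single-loop sample: each draw carries both epistemic distribution parameters and a uniform auxiliary variable that is mapped to the aleatory quantity through the inverse CDF. The limit state and all distribution parameters below are invented; this sketches the general idea only, not the paper's FORM/SORM implementation.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(9)
n = 200_000

# Single-loop sampling: each draw carries epistemic distribution parameters AND an
# auxiliary uniform variable; the aleatory quantity is recovered via the inverse CDF.
mu = rng.normal(10.0, 0.5, n)                 # epistemic uncertainty in the mean (invented)
u = rng.uniform(size=n)                       # auxiliary variable (probability integral transform)
capacity = norm.ppf(u, loc=mu, scale=1.0)     # aleatory capacity, sampled in the same loop
load = rng.lognormal(1.8, 0.3, n)             # aleatory load (invented)

print(f"unconditional failure probability: {np.mean(capacity < load):.4f}")
```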

  6. On formally integrating science and policy: walking the walk

    Science.gov (United States)

    Nichols, James D.; Johnson, Fred A.; Williams, Byron K.; Boomer, G. Scott

    2015-01-01

    The contribution of science to the development and implementation of policy is typically neither direct nor transparent. In 1995, the U.S. Fish and Wildlife Service (FWS) made a decision that was unprecedented in natural resource management, turning to an unused and unproven decision process to carry out trust responsibilities mandated by an international treaty. The decision process was adopted for the establishment of annual sport hunting regulations for the most economically important duck population in North America, the 6 to 11 million mallards Anas platyrhynchos breeding in the mid-continent region of the north-central United States and central Canada. The key idea underlying the adopted decision process was to formally embed within it a scientific process designed to reduce uncertainty (learn) and thus make better decisions in the future. The scientific process entails the use of models to develop predictions from competing hypotheses about system response to the selected action at each decision point. These predictions are not only used to select the optimal management action, but are also compared with subsequent estimates of system state variables, providing evidence for modifying the degrees of confidence in, and hence the relative influence of, these models at the next decision point. Science and learning in one step are thus formally and directly incorporated into the next decision, contrasting with the usual ad hoc and indirect use of scientific results in policy development and decision-making. Application of this approach over the last 20 years has led to a substantial reduction in uncertainty, as well as to an increase in the transparency and defensibility of annual decisions and a decrease in the contentiousness of the decision process. As resource managers are faced with increased uncertainty associated with various components of global change, this approach provides a roadmap for the future scientific management of natural resources.
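
    The weight-updating step at the heart of this process is a direct application of Bayes' rule: comparing each model's prediction with the subsequent monitoring estimate shifts confidence toward the better-predicting model. The model names, predictions, and observation below are invented.

```python
import math

def normal_pdf(x, mu, sd):
    return math.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * math.sqrt(2.0 * math.pi))

# Two hypothetical population models predict next year's breeding population
# (millions of birds); weights are updated when the monitoring estimate arrives.
weights = {"additive_mortality": 0.5, "compensatory": 0.5}
predictions = {"additive_mortality": 7.2, "compensatory": 8.1}
observed, obs_sd = 7.9, 0.4                  # invented survey estimate and uncertainty

posterior = {m: w * normal_pdf(observed, predictions[m], obs_sd) for m, w in weights.items()}
total = sum(posterior.values())
weights = {m: v / total for m, v in posterior.items()}
print(weights)   # confidence shifts toward the better-predicting model
```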

  7. Improvement of Statistical Decisions under Parametric Uncertainty

    Science.gov (United States)

    Nechval, Nicholas A.; Nechval, Konstantin N.; Purgailis, Maris; Berzins, Gundars; Rozevskis, Uldis

    2011-10-01

    A large number of problems in production planning and scheduling, location, transportation, finance, and engineering design require that decisions be made in the presence of uncertainty. Decision-making under uncertainty is a central problem in statistical inference, and has been formally studied in virtually all approaches to inference. The aim of the present paper is to show how the invariant embedding technique, the idea of which belongs to the authors, may be employed in the particular case of finding the improved statistical decisions under parametric uncertainty. This technique represents a simple and computationally attractive statistical method based on the constructive use of the invariance principle in mathematical statistics. Unlike the Bayesian approach, an invariant embedding technique is independent of the choice of priors. It allows one to eliminate unknown parameters from the problem and to find the best invariant decision rule, which has smaller risk than any of the well-known decision rules. To illustrate the proposed technique, application examples are given.

  8. Probabilistic accident consequence uncertainty analysis -- Late health effects uncertainty assessment. Volume 2: Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Little, M.P.; Muirhead, C.R. [National Radiological Protection Board (United Kingdom); Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M. [Delft Univ. of Technology (Netherlands); Harper, F.T. [Sandia National Labs., Albuquerque, NM (United States); Hora, S.C. [Univ. of Hawaii, Hilo, HI (United States)

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA late health effects models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the expert panel on late health effects, (4) short biographies of the experts, and (5) the aggregated results of their responses.
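
    Elicited distributions from multiple experts are commonly combined by equal-weight pooling; the sketch below assumes each expert's assessment has been fitted to a lognormal (all parameters invented) and is a generic illustration, not the aggregation procedure of this study.

```python
import numpy as np

rng = np.random.default_rng(10)

# Each expert's uncertainty on a health-effect parameter is summarized here as a
# lognormal fitted to elicited quantiles (parameters invented); equal-weight pooling
# simply mixes samples drawn from all experts.
experts = [(np.log(0.05), 0.5), (np.log(0.08), 0.7), (np.log(0.03), 0.4)]
samples = np.concatenate([rng.lognormal(m, s, 50_000) for m, s in experts])

print("pooled 5/50/95 percentiles:", np.round(np.quantile(samples, [0.05, 0.5, 0.95]), 4))
```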

  9. Analysis of Uncertainty in Dynamic Processes Development of Banks Functioning

    Directory of Open Access Journals (Sweden)

    Aleksei V. Korovyakovskii

    2013-01-01

    The paper offers an approach to estimating the measure of uncertainty in the dynamic processes of bank functioning, using statistical data on indicators of different banking operations. To calculate the measure of uncertainty, the phase images of the relevant sets of statistical data are considered. It is shown that the form of the phase image of the studied data sets can serve as a basis for estimating the measure of uncertainty in the dynamic processes of bank functioning. A set of analytical characteristics is offered to formalize the definition of the form of the phase image of the studied data sets. It is shown that the offered analytical characteristics account for inequality of changes in the values of the studied data sets, which is one way in which uncertainty manifests itself in the development of dynamic processes. Invariant estimates of the measure of uncertainty in the dynamic processes of bank functioning, robust to significant changes in the absolute values of the same indicators across different banks, were obtained. Examples of calculating the measure of uncertainty in the dynamic processes of specific banks are given.

  10. DS02 uncertainty analysis

    International Nuclear Information System (INIS)

    Kaul, Dean C.; Egbert, Stephen D.; Woolson, William A.

    2005-01-01

    In order to avoid the pitfalls that so discredited DS86 and its uncertainty estimates, and to provide DS02 uncertainties that are both defensible and credible, this report not only presents the ensemble uncertainties assembled from uncertainties in individual computational elements and radiation dose components but also describes how these relate to comparisons between observed and computed quantities at critical intervals in the computational process. These comparisons include those between observed and calculated radiation free-field components, where observations include thermal- and fast-neutron activation and gamma-ray thermoluminescence, which are relevant to the estimated systematic uncertainty for DS02. The comparisons also include those between calculated and observed survivor shielding, where the observations consist of biodosimetric measurements for individual survivors, which are relevant to the estimated random uncertainty for DS02. (J.P.N.)

  11. Use of probability tables for propagating uncertainties in neutronics

    International Nuclear Information System (INIS)

    Coste-Delclaux, M.; Diop, C.M.; Lahaye, S.

    2017-01-01

    Highlights: • Moment-based probability table formalism is described. • Representation by probability tables of any uncertainty distribution is established. • Multiband equations for two kinds of uncertainty propagation problems are solved. • Numerical examples are provided and validated against Monte Carlo simulations. - Abstract: Probability tables are a generic tool that allows representing any random variable whose probability density function is known. In the field of nuclear reactor physics, this tool is currently used to represent the variation of cross-sections versus energy (neutron transport codes TRIPOLI4®, MCNP, APOLLO2, APOLLO3®, ECCO/ERANOS…). In the present article we show how we can propagate uncertainties, thanks to a probability table representation, through two simple physical problems: an eigenvalue problem (neutron multiplication factor) and a depletion problem.
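
    The basic mechanics of a probability table, representing a distribution by a small set of (probability, value) bands and propagating it through a response, can be sketched generically as follows; the lognormal input, band count, and toy response are invented and do not reproduce the moment-based formalism of the paper.

```python
import numpy as np

rng = np.random.default_rng(11)

# Build a probability table (equal-probability bands, band-mean values) for a
# lognormal "cross section" and propagate it through a nonlinear response.
samples = np.sort(rng.lognormal(0.0, 0.5, 1_000_000))
n_bands = 8
band_values = samples.reshape(n_bands, -1).mean(axis=1)   # representative value per band
band_probs = np.full(n_bands, 1.0 / n_bands)              # probability per band

def response(sigma):
    return 1.0 / (1.0 + sigma)                            # toy self-shielding-like response

table_mean = np.sum(band_probs * response(band_values))
mc_mean = response(samples).mean()
print(f"probability-table mean: {table_mean:.4f}, direct Monte Carlo mean: {mc_mean:.4f}")
```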

  12. Evaluation of nuclear data and their uncertainties

    International Nuclear Information System (INIS)

    Story, J.S.

    1984-01-01

    Some topics studied within the Winfrith Nuclear Data Group in recent years, and still of current importance, are briefly reviewed. Moderator cross-sections: criteria to be met for reactor applications are listed; thermal neutron scattering theory is summarized, with the approximations used to facilitate computation; neutron age data test stringently the accuracy of epithermal cross-sections; a modification of the CFS effective range treatment for S-wave scatter by H is presented, and new calculations with up-to-date slow neutron scattering data are advocated. Use of multilevel resonance formalisms: the top bound resonance should be included explicitly in calculations; additive statistical terms are given to allow for "distant" negative and positive resonances, in both MLBW and R-M formalisms; formulae are presented for estimating R-M level shifts for l > 0 resonances. Resonance mean spacings: the Dyson-Mehta optimum estimator is utilised in a method which updates the staircase plot. Resonances of 56Fe have been resolved to approx. 800 keV, over which range the level density for given Jπ should increase 2-fold; this variation is allowed for in the mean spacing calculations. Fission-product decay power: the present status of integral data and summation calculations for 235U and 239Pu fissions is summarized, with a variety of intercomparisons including 239Pu/235U ratios. Data uncertainties are considered, but the sequence of data on Γγ for the 27.8 keV resonance of 56Fe provided a cautionary example. (author)

  13. Theoretical formulation of finite-dimensional discrete phase spaces: I. Algebraic structures and uncertainty principles

    International Nuclear Information System (INIS)

    Marchiolli, M.A.; Ruzzi, M.

    2012-01-01

    We propose a self-consistent theoretical framework for a wide class of physical systems characterized by a finite space of states which allows us, within several mathematical virtues, to construct a discrete version of the Weyl–Wigner–Moyal (WWM) formalism for finite-dimensional discrete phase spaces with toroidal topology. As a first and important application from this ab initio approach, we initially investigate the Robertson–Schrödinger (RS) uncertainty principle related to the discrete coordinate and momentum operators, as well as its implications for physical systems with periodic boundary conditions. The second interesting application is associated with a particular uncertainty principle inherent to the unitary operators, which is based on the Wiener–Khinchin theorem for signal processing. Furthermore, we also establish a modified discrete version for the well-known Heisenberg–Kennard–Robertson (HKR) uncertainty principle, which exhibits additional terms (or corrections) that resemble the generalized uncertainty principle (GUP) into the context of quantum gravity. The results obtained from this new algebraic approach touch on some fundamental questions inherent to quantum mechanics and certainly represent an object of future investigations in physics. - Highlights: ► We construct a discrete version of the Weyl–Wigner–Moyal formalism. ► Coherent states for finite-dimensional discrete phase spaces are established. ► Discrete coordinate and momentum operators are properly defined. ► Uncertainty principles depend on the topology of finite physical systems. ► Corrections for the discrete Heisenberg uncertainty relation are also obtained.
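
    For reference, the continuous-variable form of the Robertson–Schrödinger relation that the discrete construction generalizes can be written as follows (standard textbook form, not the paper's discrete version):

```latex
% Robertson–Schrödinger uncertainty relation for observables A and B
% (standard continuous form; the paper constructs a discrete analogue):
\sigma_A^2 \,\sigma_B^2 \;\ge\;
\left( \tfrac{1}{2}\langle \{A,B\} \rangle - \langle A\rangle \langle B\rangle \right)^{2}
+ \left( \tfrac{1}{2i}\langle [A,B] \rangle \right)^{2}
```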

  14. The role of uncertainty analysis in dose reconstruction and risk assessment

    International Nuclear Information System (INIS)

    Hoffman, F.O.; Simon, S.L.; Thiessen. K.M.

    1996-01-01

    Dose reconstruction and risk assessment rely heavily on the use of mathematical models to extrapolate information beyond the realm of direct observation. Because models are merely approximations of real systems, their predictions are inherently uncertain. As a result, full disclosure of uncertainty in dose and risk estimates is essential to achieve scientific credibility and to build public trust. The need for formal analysis of uncertainty in model predictions was presented during the nineteenth annual meeting of the NCRP. At that time, quantitative uncertainty analysis was considered a relatively new and difficult subject practiced by only a few investigators. Today, uncertainty analysis has become synonymous with the assessment process itself. When an uncertainty analysis is used iteratively within the assessment process, it can guide experimental research to refine dose and risk estimates, deferring potentially high cost or high consequence decisions until uncertainty is either acceptable or irreducible. Uncertainty analysis is now mandated for all ongoing dose reconstruction projects within the United States, a fact that distinguishes dose reconstruction from other types of exposure and risk assessments. 64 refs., 6 figs., 1 tab

  15. Flood Hazard Mapping : Uncertainty and its Value in the Decision-making Process

    NARCIS (Netherlands)

    Mukolwe, M.M.

    2016-01-01

    Computers are increasingly used in the simulation of natural phenomena such as floods. However, these simulations are based on numerical approximations of equations formalizing our conceptual understanding of flood flows. Thus, model results are intrinsically subject to uncertainty and the use of

  17. The smooth entropy formalism for von Neumann algebras

    International Nuclear Information System (INIS)

    Berta, Mario; Furrer, Fabian; Scholz, Volkher B.

    2016-01-01

    We discuss information-theoretic concepts on infinite-dimensional quantum systems. In particular, we lift the smooth entropy formalism as introduced by Renner and collaborators for finite-dimensional systems to von Neumann algebras. For the smooth conditional min- and max-entropy, we recover similar characterizing properties and information-theoretic operational interpretations as in the finite-dimensional case. We generalize the entropic uncertainty relation with quantum side information of Tomamichel and Renner and discuss applications to quantum cryptography. In particular, we prove the possibility to perform privacy amplification and classical data compression with quantum side information modeled by a von Neumann algebra

  19. Uncertainty in hydrological signatures

    Science.gov (United States)

    McMillan, Hilary; Westerberg, Ida

    2015-04-01

    Information that summarises the hydrological behaviour or flow regime of a catchment is essential for comparing responses of different catchments to understand catchment organisation and similarity, and for many other modelling and water-management applications. Such information types derived as an index value from observed data are known as hydrological signatures, and can include descriptors of high flows (e.g. mean annual flood), low flows (e.g. mean annual low flow, recession shape), the flow variability, flow duration curve, and runoff ratio. Because the hydrological signatures are calculated from observed data such as rainfall and flow records, they are affected by uncertainty in those data. Subjective choices in the method used to calculate the signatures create a further source of uncertainty. Uncertainties in the signatures may affect our ability to compare different locations, to detect changes, or to compare future water resource management scenarios. The aim of this study was to contribute to the hydrological community's awareness and knowledge of data uncertainty in hydrological signatures, including typical sources, magnitude and methods for its assessment. We proposed a generally applicable method to calculate these uncertainties based on Monte Carlo sampling and demonstrated it for a variety of commonly used signatures. The study was made for two data rich catchments, the 50 km2 Mahurangi catchment in New Zealand and the 135 km2 Brue catchment in the UK. For rainfall data the uncertainty sources included point measurement uncertainty, the number of gauges used in calculation of the catchment spatial average, and uncertainties relating to lack of quality control. For flow data the uncertainty sources included uncertainties in stage/discharge measurement and in the approximation of the true stage-discharge relation by a rating curve. The resulting uncertainties were compared across the different signatures and catchments, to quantify uncertainty
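
    A minimal version of the proposed Monte Carlo approach, applied to rating-curve uncertainty and a simple mean-flow signature, might look as follows; the stage record, rating-curve form, and parameter uncertainties are all invented and far simpler than the study's treatment.

```python
import numpy as np

rng = np.random.default_rng(12)

stage = rng.uniform(0.2, 2.0, 365)       # one year of daily stage data (invented)
n = 10_000
# Rating curve Q = a * h^b with uncertain parameters (invented values).
a = rng.normal(5.0, 0.3, n)
b = rng.normal(1.6, 0.05, n)

q = a[:, None] * stage ** b[:, None]     # n flow-series realizations
signature = q.mean(axis=1)               # a simple signature: mean flow
print(f"mean-flow signature: median {np.median(signature):.2f}, 95% interval "
      f"({np.quantile(signature, 0.025):.2f}, {np.quantile(signature, 0.975):.2f})")
```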

  20. Formal matrices

    CERN Document Server

    Krylov, Piotr

    2017-01-01

    This monograph is a comprehensive account of formal matrices, examining homological properties of modules over formal matrix rings and summarising the interplay between Morita contexts and K theory. While various special types of formal matrix rings have been studied for a long time from several points of view and appear in various textbooks, for instance to examine equivalences of module categories and to illustrate rings with one-sided non-symmetric properties, this particular class of rings has, so far, not been treated systematically. Exploring formal matrix rings of order 2 and introducing the notion of the determinant of a formal matrix over a commutative ring, this monograph further covers the Grothendieck and Whitehead groups of rings. Graduate students and researchers interested in ring theory, module theory and operator algebras will find this book particularly valuable. Containing numerous examples, Formal Matrices is a largely self-contained and accessible introduction to the topic, assuming a sol...

  1. BWR transient analysis using neutronic / thermal hydraulic coupled codes including uncertainty quantification

    International Nuclear Information System (INIS)

    Hartmann, C.; Sanchez, V.; Tietsch, W.; Stieglitz, R.

    2012-01-01

    The KIT is involved in the development and qualification of best estimate methodologies for BWR transient analysis in cooperation with industrial partners. The goal is to establish the most advanced thermal hydraulic system codes coupled with 3D reactor dynamics codes to be able to perform a more realistic evaluation of the BWR behavior under accident conditions. For this purpose a computational chain based on the lattice code (SCALE6/GenPMAXS), the coupled neutronic/thermal hydraulic code (TRACE/PARCS) as well as a Monte Carlo based uncertainty and sensitivity package (SUSA) has been established and applied to different kinds of transients of a Boiling Water Reactor (BWR). This paper describes the multidimensional models of the plant elaborated for TRACE and PARCS to perform the investigations mentioned before. For the uncertainty quantification of the coupled code TRACE/PARCS, and specifically to take into account the influence of the kinetics parameters in such studies, the PARCS code has been extended to facilitate the change of model parameters in such a way that the SUSA package can be used in connection with TRACE/PARCS for the uncertainty and sensitivity studies. This approach is presented in detail. The results obtained for a rod drop transient with TRACE/PARCS using the SUSA methodology clearly showed the importance of some kinetic parameters for the transient progression, demonstrating that the coupling of best-estimate codes with uncertainty and sensitivity tools is very promising and of great importance for the safety assessment of nuclear reactors. (authors)
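
    The SUSA-style workflow pairs random sampling of uncertain inputs with rank-correlation sensitivity measures. A toy sketch with a stand-in surrogate model (parameter names, ranges, and the surrogate itself are illustrative; the study runs the actual TRACE/PARCS chain):

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(1)
n = 200  # sample size is illustrative; GRS-type studies size it via Wilks' formula

# Hypothetical uncertain kinetics/thermal parameters (ranges invented)
beta    = rng.uniform(0.0050, 0.0070, n)   # delayed neutron fraction
doppler = rng.uniform(-3.0, -1.5, n)       # Doppler coefficient, pcm/K
gap_h   = rng.uniform(3000, 6000, n)       # gap conductance, W/m2K

# Toy surrogate standing in for the coupled-code output (peak power, rod drop)
peak_power = 1.0 / beta * (1.0 + 0.05 * doppler) + 1e-5 * gap_h + rng.normal(0, 5, n)

# Rank sensitivity of the output to each sampled input
for name, x in [("beta", beta), ("doppler", doppler), ("gap_h", gap_h)]:
    rho, _ = spearmanr(x, peak_power)
    print(f"{name:8s} Spearman rho vs peak power: {rho:+.2f}")
```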

  2. Advancing Uncertainty: Untangling and Discerning Related Concepts

    Directory of Open Access Journals (Sweden)

    Janice Penrod

    2002-12-01

    Full Text Available Methods of advancing concepts within the qualitative paradigm have been developed and articulated. In this section, I describe methodological perspectives of a project designed to advance the concept of uncertainty using multiple qualitative methods. Through a series of earlier studies, the concept of uncertainty arose repeatedly in varied contexts, working its way into prominence, and warranting further investigation. Processes of advanced concept analysis were used to initiate the formal investigation into the meaning of the concept. Through concept analysis, the concept was deconstructed to identify conceptual components and gaps in understanding. Using this skeletal framework of the concept identified through concept analysis, subsequent studies were carried out to add ‘flesh’ to the concept. First, a concept refinement using the literature as data was completed. Findings revealed that the current state of the concept of uncertainty failed to incorporate what was known of the lived experience. Therefore, using interview techniques as the primary data source, a phenomenological study of uncertainty among caregivers was conducted. Incorporating the findings of the phenomenology, the skeletal framework of the concept was further fleshed out using techniques of concept correction to produce a more mature conceptualization of uncertainty. In this section, I describe the flow of this qualitative project investigating the concept of uncertainty, with special emphasis on a particular threat to validity (called conceptual tunnel vision that was identified and addressed during the phases of concept correction. Though in this article I employ a study of uncertainty for illustration, limited substantive findings regarding uncertainty are presented to retain a clear focus on the methodological issues.

  3. A first formal link between the price equation and an optimization program.

    Science.gov (United States)

    Grafen, Alan

    2002-07-07

    The Darwin unification project is pursued. A meta-model encompassing an important class of population genetic models is formed by adding an abstract model of the number of successful gametes to the Price equation under uncertainty. A class of optimization programs is defined to represent the "individual-as-maximizing-agent analogy" in a general way. It is then shown that for each population genetic model there is a corresponding optimization program with which formal links can be established. These links provide a secure logical foundation for the commonplace biological principle that natural selection leads organisms to act as if maximizing their "fitness", provide a definition of "fitness", and clarify the limitations of that principle. The situations covered do not include frequency dependence or social behaviour, but the approach is capable of extension.
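
    For orientation, the Price equation that the meta-model builds on reads, in its standard form (Grafen's version additionally takes expectations over states of nature to handle uncertainty):

    $$ \bar{w}\,\Delta\bar{z} \;=\; \operatorname{Cov}(w_i, z_i) \;+\; \operatorname{E}\!\left(w_i\,\Delta z_i\right), $$

    where $w_i$ is the fitness of individual $i$, $z_i$ its trait value, and $\bar{w}$, $\bar{z}$ the population means.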

  4. Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment, main report

    International Nuclear Information System (INIS)

    Harper, F.T.; Young, M.L.; Miller, L.A.; Hora, S.C.; Lui, C.H.; Goossens, L.H.J.; Cooke, R.M.; Paesler-Sauer, J.; Helton, J.C.

    1995-01-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The ultimate objective of the joint effort was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. Experts developed their distributions independently. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model. Resulting distributions closely replicated the aggregated elicited wet deposition distributions. To validate the distributions generated for the dispersion code input variables, samples were taken from the distributions and propagated through the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the first of a three-volume document describing the project
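
    A minimal sketch of the Gaussian plume model (GPM) named in the abstract, with sampled dispersion parameters standing in for the elicited distributions (all numbers are illustrative, not elicited values):

```python
import numpy as np

def gaussian_plume(Q, u, y, z, H, sigma_y, sigma_z):
    """Gaussian plume concentration (g/m3) with ground reflection.
    Q: release rate (g/s); u: wind speed (m/s); H: effective release height (m);
    sigma_y, sigma_z: dispersion parameters at the downwind distance of interest."""
    lateral = np.exp(-y**2 / (2 * sigma_y**2))
    vertical = (np.exp(-(z - H)**2 / (2 * sigma_z**2))
                + np.exp(-(z + H)**2 / (2 * sigma_z**2)))  # image-source term
    return Q / (2 * np.pi * u * sigma_y * sigma_z) * lateral * vertical

# Propagate sampled dispersion-parameter uncertainty through the model
rng = np.random.default_rng(0)
conc = [gaussian_plume(1.0, 5.0, 0.0, 0.0, 50.0,
                       sigma_y=80 * rng.lognormal(0, 0.3),
                       sigma_z=40 * rng.lognormal(0, 0.3))
        for _ in range(1000)]
print(f"median {np.median(conc):.2e} g/m3, 95th pct {np.percentile(conc, 95):.2e}")
```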

  5. Risk Management and Uncertainty in Infrastructure Projects

    DEFF Research Database (Denmark)

    Harty, Chris; Neerup Themsen, Tim; Tryggestad, Kjell

    2014-01-01

    The assumption that large complex projects should be managed in order to reduce uncertainty and increase predictability is not new. What is relatively new, however, is that uncertainty reduction can and should be obtained through formal risk management approaches. We question both assumptions...... by addressing a more fundamental question about the role of knowledge in current risk management practices. Inquiries into the predominant approaches to risk management in large infrastructure and construction projects reveal their assumptions about knowledge and we discuss the ramifications these have...... for project and construction management. Our argument and claim is that predominant risk management approaches tend to reinforce conventional ideas of project control whilst undermining other notions of value and relevance of built assets and project management process. These approaches fail to consider......

  6. Spinor formalism and complex-vector formalism of general relativity

    International Nuclear Information System (INIS)

    Han-ying, G.; Yong-shi, W.; Gendao, L.

    1974-01-01

    In this paper, using E. Cartan's exterior calculus, we give the spinor form of the structure equations, which leads naturally to the Newman--Penrose equations. Furthermore, starting from the spinor spaces and the sl(2C) algebra, we construct the general complex-vector formalism of general relativity. We find that both the Cahen--Debever--Defrise complex-vector formalism and that of Brans are its special cases. Thus, the spinor formalism and the complex-vector formalism of general relativity are unified on the basis of the unimodular group SL(2C) and its Lie algebra

  7. The simplest formal argument for fitness optimization

    Indian Academy of Sciences (India)

    The Formal Darwinism Project aims to provide a formal argument linking population genetics to fitness optimization, which of necessity includes defining fitness. This bridges the gulf between those biologists who assume that natural selection leads to something close to fitness optimization and those biologists who believe ...

  8. Information Synthesis in Uncertainty Studies: Application to the Analysis of the BEMUSE Results

    International Nuclear Information System (INIS)

    Baccou, J.; Chojnacki, E.; Destercke, S.

    2013-01-01

    Computer codes are used to demonstrate that nuclear power plants are designed to respond safely to numerous postulated accidents. The models in these computer codes are an approximation of the real physical behaviour occurring during an accident, and the data used to run the codes are also known with limited accuracy only. The code predictions are therefore not exact but uncertain. To deal with these uncertainties, 'best estimate' codes with 'best estimate' input data are used to obtain a best estimate calculation, and it is necessary to derive the uncertainty associated with their estimates. For this reason, regulatory authorities require technical safety organizations such as the French Institut de Radioprotection et de Surete Nucleaire (IRSN) to provide results that take all uncertainty sources into account, in order to assess whether safety quantities remain below critical values. Uncertainty analysis can be seen as a problem of information treatment, and a special effort has to be made on four methodological key issues. The first one is related to information modelling. In safety studies, one can distinguish two kinds of uncertainty. The first type, called aleatory uncertainty, is due to the natural variability of an observed phenomenon and cannot be reduced by the arrival of new information. The second type, called epistemic uncertainty, can arise from imprecision. Contrary to the previous one, this uncertainty can be reduced by increasing the state of knowledge. Performing relevant information modelling therefore requires a mathematical formalism flexible enough to faithfully treat both types of uncertainties. The second issue deals with information propagation through a computer code. It requires running the codes several times and is usually achieved by coupling to statistical software. The complexity of the propagation is strongly connected to the mathematical framework used for the information modelling. The more general the
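
    One common way to keep the two uncertainty types distinct during propagation is a nested (two-level) Monte Carlo loop. The toy sketch below is not the IRSN methodology itself, only an illustration of the separation, with an invented surrogate in place of a best-estimate code:

```python
import numpy as np

rng = np.random.default_rng(7)

def code_surrogate(k, forcing):
    """Toy stand-in for a best-estimate code: output given a model
    parameter k (epistemic) and a random forcing (aleatory)."""
    return k * forcing**2

n_epi, n_ale = 100, 500
threshold = 8.0  # hypothetical safety limit
p_exceed = np.empty(n_epi)

for i in range(n_epi):
    k = rng.uniform(0.8, 1.2)               # epistemic: imprecise model parameter
    forcing = rng.normal(2.0, 0.5, n_ale)   # aleatory: natural variability
    p_exceed[i] = np.mean(code_surrogate(k, forcing) > threshold)

# Epistemic uncertainty shows up as a spread of exceedance probabilities
print(f"P(exceed) ranges from {p_exceed.min():.3f} to {p_exceed.max():.3f}")
```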

  9. Automatic reconstruction of fault networks from seismicity catalogs including location uncertainty

    International Nuclear Information System (INIS)

    Wang, Y.

    2013-01-01

    Within the framework of plate tectonics, the deformation that arises from the relative movement of two plates occurs across discontinuities in the earth's crust, known as fault zones. Active fault zones are the causal locations of most earthquakes, which suddenly release tectonic stresses within a very short time. In return, fault zones slowly grow by accumulating slip due to such earthquakes by cumulated damage at their tips, and by branching or linking between pre-existing faults of various sizes. Over the last decades, a large amount of knowledge has been acquired concerning the overall phenomenology and mechanics of individual faults and earthquakes: A deep physical and mechanical understanding of the links and interactions between and among them is still missing, however. One of the main issues lies in our failure to always succeed in assigning an earthquake to its causative fault. Using approaches based in pattern-recognition theory, more insight into the relationship between earthquakes and fault structure can be gained by developing an automatic fault network reconstruction approach using high resolution earthquake data sets at largely different scales and by considering individual event uncertainties. This thesis introduces the Anisotropic Clustering of Location Uncertainty Distributions (ACLUD) method to reconstruct active fault networks on the basis of both earthquake locations and their estimated individual uncertainties. This method consists in fitting a given set of hypocenters with an increasing amount of finite planes until the residuals of the fit compare with location uncertainties. After a massive search through the large solution space of possible reconstructed fault networks, six different validation procedures are applied in order to select the corresponding best fault network. Two of the validation steps (cross-validation and Bayesian Information Criterion (BIC)) process the fit residuals, while the four others look for solutions that
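
    A drastically simplified sketch of the core fitting idea (cluster hypocenters, fit a plane per cluster, and grow the number of planes until residuals are commensurate with location uncertainty); the actual ACLUD method uses anisotropic uncertainty distributions and six validation procedures:

```python
import numpy as np
from scipy.cluster.vq import kmeans2

rng = np.random.default_rng(3)
loc_sigma = 0.5  # assumed isotropic location uncertainty, km (illustrative)

# Synthetic catalog: events on two fault planes plus location noise
def plane(n, origin, v1, v2):
    uv = rng.uniform(-5, 5, (n, 2))
    return origin + uv[:, :1] * v1 + uv[:, 1:] * v2

pts = np.vstack([
    plane(200, np.array([0., 0., 5.]), np.array([1., 0., 0.]), np.array([0., .2, 1.])),
    plane(200, np.array([3., 3., 8.]), np.array([0., 1., 0.]), np.array([1., 0., .5])),
]) + rng.normal(0, loc_sigma, (400, 3))

def plane_residuals(cluster):
    centered = cluster - cluster.mean(axis=0)
    normal = np.linalg.svd(centered)[2][-1]  # smallest-variance direction
    return np.abs(centered @ normal)

for k in range(1, 5):  # add planes until residuals match location uncertainty
    _, labels = kmeans2(pts, k, seed=11, minit='++')
    res = np.concatenate([plane_residuals(pts[labels == j])
                          for j in range(k) if np.sum(labels == j) >= 3])
    print(f"{k} plane(s): median |residual| = {np.median(res):.2f} km")
    if np.median(res) <= loc_sigma:
        break
```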

  10. Automatic reconstruction of fault networks from seismicity catalogs including location uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Y.

    2013-07-01

    Within the framework of plate tectonics, the deformation that arises from the relative movement of two plates occurs across discontinuities in the earth's crust, known as fault zones. Active fault zones are the causal locations of most earthquakes, which suddenly release tectonic stresses within a very short time. In return, fault zones slowly grow by accumulating slip due to such earthquakes by cumulated damage at their tips, and by branching or linking between pre-existing faults of various sizes. Over the last decades, a large amount of knowledge has been acquired concerning the overall phenomenology and mechanics of individual faults and earthquakes: A deep physical and mechanical understanding of the links and interactions between and among them is still missing, however. One of the main issues lies in our failure to always succeed in assigning an earthquake to its causative fault. Using approaches based in pattern-recognition theory, more insight into the relationship between earthquakes and fault structure can be gained by developing an automatic fault network reconstruction approach using high resolution earthquake data sets at largely different scales and by considering individual event uncertainties. This thesis introduces the Anisotropic Clustering of Location Uncertainty Distributions (ACLUD) method to reconstruct active fault networks on the basis of both earthquake locations and their estimated individual uncertainties. This method consists in fitting a given set of hypocenters with an increasing amount of finite planes until the residuals of the fit compare with location uncertainties. After a massive search through the large solution space of possible reconstructed fault networks, six different validation procedures are applied in order to select the corresponding best fault network. Two of the validation steps (cross-validation and Bayesian Information Criterion (BIC)) process the fit residuals, while the four others look for solutions that

  11. SBME : Exploring boundaries between formal, non-formal, and informal learning

    OpenAIRE

    Shahoumian, Armineh; Parchoma, Gale; Saunders, Murray; Hanson, Jacky; Dickinson, Mike; Pimblett, Mark

    2013-01-01

    In medical education learning extends beyond university settings into practice. Non-formal and informal learning support learners’ efforts to meet externally set and learner-identified objectives. In SBME research, boundaries between formal, non-formal, and informal learning have not been widely explored. Examining whether SBME fits within or challenges these categories can make a contribution. Formal learning is described in relation to educational settings, planning, assessment, and accreditation. In...

  12. Formality of the Chinese collective leadership.

    Science.gov (United States)

    Li, Haiying; Graesser, Arthur C

    2016-09-01

    We investigated the linguistic patterns in the discourse of four generations of the collective leadership of the Communist Party of China (CPC) from 1921 to 2012. The texts of Mao Zedong, Deng Xiaoping, Jiang Zemin, and Hu Jintao were analyzed using computational linguistic techniques (a Chinese formality score) to explore the persuasive linguistic features of the leaders in the contexts of power phase, the nation's education level, power duration, and age. The study was guided by the elaboration likelihood model of persuasion, which includes a central route (represented by formal discourse) versus a peripheral route (represented by informal discourse) to persuasion. The results revealed that these leaders adopted the formal, central route more when they were in power than before they came into power. The nation's education level was a significant factor in the leaders' adoption of the persuasion strategy. The leaders' formality also decreased with increasing age and time in power. However, the predictability of these factors for formality showed subtle differences among the different types of leaders. These results enhance our understanding of the Chinese collective leadership and the role of formality in politically persuasive messages.

  13. Formalization of treatment guidelines using Fuzzy Cognitive Maps and semantic web tools.

    Science.gov (United States)

    Papageorgiou, Elpiniki I; Roo, Jos De; Huszka, Csaba; Colaert, Dirk

    2012-02-01

    Therapy decision making and support in medicine deals with uncertainty and needs to take into account the patient's clinical parameters, the context of illness, and the medical knowledge of the physician and guidelines to recommend a treatment therapy. This research study is focused on the formalization of medical knowledge using a cognitive process, called Fuzzy Cognitive Maps (FCMs), and a semantic web approach. The FCM technique is capable of dealing with situations involving uncertain descriptions, using a procedure similar to human reasoning. Thus, it was selected for the modeling and knowledge integration of clinical practice guidelines. Semantic web tools were established to implement the FCM approach. The knowledge base was constructed from the clinical guidelines in the form of if-then fuzzy rules. These fuzzy rules were transferred to the FCM modeling technique and, through the semantic web tools, the whole formalization was accomplished. The problem of urinary tract infection (UTI) in the adult community was examined for the proposed approach. Forty-seven clinical concepts and eight therapy concepts were identified for the antibiotic treatment therapy problem of UTIs. A preliminary pilot-evaluation study with 55 patient cases showed interesting findings; 91% of the antibiotic treatments proposed by the implemented approach were in full agreement with the guidelines and physicians' opinions. The results have shown that the suggested approach formalizes medical knowledge efficiently and gives a front-end decision on antibiotic suggestions for cystitis. Concluding, modeling medical knowledge/therapeutic guidelines using cognitive methods and semantic web tools is both reliable and useful. Copyright © 2011 Elsevier Inc. All rights reserved.
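
    The FCM inference step itself is compact: concept activations are repeatedly pushed through the weighted causal links and a squashing function until the map stabilizes. A minimal sketch with invented concepts and weights (nothing here reflects the paper's 47-concept UTI map or real clinical guidance):

```python
import numpy as np

def sigmoid(x, lam=1.0):
    return 1.0 / (1.0 + np.exp(-lam * x))

# Hypothetical mini-FCM; names and weights are illustrative, not clinical advice
concepts = ["dysuria", "fever", "pregnancy", "give_drug_X"]
W = np.array([
    # rows: from-concept, columns: to-concept
    [0.0, 0.0, 0.0,  0.6],   # dysuria supports treatment
    [0.0, 0.0, 0.0,  0.4],   # fever supports treatment
    [0.0, 0.0, 0.0, -0.8],   # pregnancy argues against this drug (invented)
    [0.0, 0.0, 0.0,  0.0],
])

state = np.array([1.0, 0.0, 1.0, 0.0])  # presenting patient case
for _ in range(20):  # iterate A(t+1) = f(A(t) + A(t) @ W) until convergence
    new = sigmoid(state + state @ W)
    if np.allclose(new, state, atol=1e-4):
        break
    state = new
print(dict(zip(concepts, state.round(2))))
```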

  14. Formalization of Database Systems -- and a Formal Definition of {IMS}

    DEFF Research Database (Denmark)

    Bjørner, Dines; Løvengreen, Hans Henrik

    1982-01-01

    Drawing upon an analogy between Programming Language Systems and Database Systems we outline the requirements that architectural specifications of database systems must fulfill, and argue that only formal, mathematical definitions may satisfy these. Then we illustrate some aspects and touch upon...... some uses of formal definitions of data models and database management systems. A formal model of IMS will carry this discussion. Finally we survey some of the existing literature on formal definitions of database systems. The emphasis will be on constructive definitions in the denotational semantics...... style of the VDM: Vienna Development Method. The role of formal definitions in international standardization efforts is briefly mentioned....

  15. The uncertainty of reference standards--a guide to understanding factors impacting uncertainty, uncertainty calculations, and vendor certifications.

    Science.gov (United States)

    Gates, Kevin; Chang, Ning; Dilek, Isil; Jian, Huahua; Pogue, Sherri; Sreenivasan, Uma

    2009-10-01

    Certified solution standards are widely used in forensic toxicological, clinical/diagnostic, and environmental testing. Typically, these standards are purchased as ampouled solutions with a certified concentration. Vendors present concentration and uncertainty differently on their Certificates of Analysis. Understanding the factors that impact uncertainty and which factors have been considered in the vendor's assignment of uncertainty are critical to understanding the accuracy of the standard and the impact on testing results. Understanding these variables is also important for laboratories seeking to comply with ISO/IEC 17025 requirements and for those preparing reference solutions from neat materials at the bench. The impact of uncertainty associated with the neat material purity (including residual water, residual solvent, and inorganic content), mass measurement (weighing techniques), and solvent addition (solution density) on the overall uncertainty of the certified concentration is described along with uncertainty calculations.
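
    For a solution standard prepared as c = m·P/V, the GUM-style combination of the factors listed above (purity, mass, volume) reduces to adding relative variances. A sketch with purely illustrative numbers (real certificates report their own uncertainty budgets):

```python
import math

# Illustrative values only
m, u_m = 10.00e-3, 0.02e-3   # mass of neat material, g (weighing uncertainty)
P, u_P = 0.985, 0.004        # purity, mass fraction (water/solvent/inorganics)
V, u_V = 10.00e-3, 0.01e-3   # solution volume, L (density/temperature effects)

c = m * P / V                # certified concentration, g/L

# For a product/quotient, relative variances add
u_rel = math.sqrt((u_m / m)**2 + (u_P / P)**2 + (u_V / V)**2)
U = 2 * c * u_rel            # expanded uncertainty, coverage factor k = 2

print(f"c = {c:.4f} g/L, U(k=2) = {U:.4f} g/L ({100*u_rel:.2f}% relative)")
```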

  16. SELF-EFFICACY OF FORMALLY AND NON-FORMALLY TRAINED PUBLIC SECTOR TEACHERS

    Directory of Open Access Journals (Sweden)

    Muhammad Nadeem ANWAR

    2009-07-01

    Full Text Available The main objective of the study was to compare the formally and non-formally trained in-service public sector teachers' self-efficacy. Five hypotheses were developed describing no difference in the self-efficacy of formally and non-formally trained teachers to influence decision making, influence school resources, instructional self-efficacy, disciplinary self-efficacy and create positive school climate. The Teacher Efficacy Instrument (TSES) developed by Bandura (2001), consisting of thirty 9-point items, was used in the study. 342 formally trained and 255 non-formally trained respondents' questionnaires were received out of 1500 mailed. The analysis of data revealed that the formally trained public sector teachers are high in their self-efficacy on all the five categories: to influence decision making, to influence school resources, instructional self-efficacy, disciplinary self-efficacy and self-efficacy to create positive school climate.

  17. Towards Formal Implementation of PUS Standard

    Science.gov (United States)

    Ilić, D.

    2009-05-01

    As an effort to promote the reuse of on-board and ground systems ESA developed a standard for packet telemetry and telecommand - PUS. It defines a set of standard service models with the corresponding structures of the associated telemetry and telecommand packets. Various missions then can choose to implement those standard PUS services that best conform to their specific requirements. In this paper we propose a formal development (based on the Event-B method) of reusable service patterns, which can be instantiated for concrete application. Our formal models allow us to formally express and verify specific service properties including various telecommand and telemetry packet structure validation.
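
    The packet-structure validation mentioned above operates on headers such as the 6-byte CCSDS primary header that PUS telemetry and telecommand packets carry. A rough Python sketch of that kind of check (the paper's models are written in Event-B, and the byte values below are invented):

```python
import struct

def parse_ccsds_primary_header(raw: bytes):
    """Parse the 6-byte CCSDS space packet primary header used by PUS
    (version / type / secondary-header flag / APID, sequence flags / count,
    packet data length)."""
    if len(raw) < 6:
        raise ValueError("packet shorter than primary header")
    word0, word1, length = struct.unpack(">HHH", raw[:6])
    header = {
        "version":      (word0 >> 13) & 0x7,
        "type":         (word0 >> 12) & 0x1,   # 0 = TM, 1 = TC
        "sec_hdr_flag": (word0 >> 11) & 0x1,
        "apid":          word0 & 0x7FF,
        "seq_flags":    (word1 >> 14) & 0x3,
        "seq_count":     word1 & 0x3FFF,
        "data_length":   length,               # (length of data field) - 1
    }
    # Structure validation of the kind the formal service models make provable
    if header["version"] != 0:
        raise ValueError("unsupported packet version")
    if len(raw) != 6 + header["data_length"] + 1:
        raise ValueError("declared and actual packet lengths disagree")
    return header

# A TM packet with APID 0x42 and a 3-byte data field (illustrative bytes)
pkt = bytes([0x08, 0x42, 0xC0, 0x01, 0x00, 0x02, 0xAA, 0xBB, 0xCC])
print(parse_ccsds_primary_header(pkt))
```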

  18. A Comparative Study of Subject Knowledge of B.Ed Graduates of Formal and Non-Formal Teacher Education Systems

    Science.gov (United States)

    Saif, Perveen; Reba, Amjad; ud Din, Jalal

    2017-01-01

    This study was designed to compare the subject knowledge of B.Ed graduates of formal and non-formal teacher education systems. The population of the study included all teachers from Girls High and Higher Secondary Schools both from private and public sectors from the district of Peshawar. Out of the total population, twenty schools were randomly…

  19. Informal work and formal plans

    DEFF Research Database (Denmark)

    Dalsted, Rikke Juul; Hølge-Hazelton, Bibi; Kousgaard, Marius Brostrøm

    2012-01-01

    INTRODUCTION: Formal pathway models outline that patients should receive information in order to experience a coherent journey but do not describe an active role for patients or their relatives. The aim of this paper is to articulate and discuss the active role of patients during their cancer...... trajectories. METHODS AND THEORY: An in-depth case study of patient trajectories at a Danish hospital and surrounding municipality using individual interviews with patients. Theory about trajectory and work by Strauss was included. RESULTS: Patients continuously took initiatives to organize their treatment....... The patients' requests were not sufficiently supported in the professional organisation of work or formal planning. Patients' insertion and use of information in their trajectories challenged professional views and working processes. And the design of the formal pathway models limits the patients' active...

  20. Evaluating Sketchiness as a Visual Variable for the Depiction of Qualitative Uncertainty

    NARCIS (Netherlands)

    Boukhelifa, Nadia; Bezerianos, Anastasia; Isenberg, Tobias; Fekete, Jean-Daniel

    2012-01-01

    We report on results of a series of user studies on the perception of four visual variables that are commonly used in the literature to depict uncertainty. To the best of our knowledge, we provide the first formal evaluation of the use of these variables to facilitate an easier reading of

  1. Economic policy uncertainty index and economic activity: what causes what?

    Directory of Open Access Journals (Sweden)

    Ivana Lolić

    2017-01-01

    Full Text Available This paper is a follow-up on the Economic Policy Uncertainty (EPU) index, developed in 2011 by Baker, Bloom, and Davis. The principal idea of the EPU index is to quantify the level of uncertainty in an economic system, based on three separate pillars: news media, the number of federal tax code provisions expiring in the following years, and disagreement amongst professional forecasters on future tendencies of relevant macroeconomic variables. Although the original EPU index was designed and published for the US economy, it instantly caught the attention of numerous academics and was rapidly introduced in 15 countries worldwide. Extensive academic debate has been triggered on the importance of economic uncertainty relating to the intensity and persistence of the recent crisis. Despite the intensive (mostly politically-motivated) debate, formal scientific confirmation of causality running from the EPU index to economic activity has not followed. Moreover, the empirical literature has completely failed to conduct formal econometric testing of Granger causality between the two mentioned phenomena. This paper provides an estimation of the Toda-Yamamoto causality test between the EPU index and economic activity in the USA and several European countries. The results do not provide a general conclusion: causality seems to run in both directions only for the USA, while only in one direction for France and Germany. Having taken into account the Great Recession of 2008, the main result does not change, therefore casting doubt on the index methodology and overall media bias.
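
    The news pillar of the EPU index is built by scaling uncertainty-related article counts by each outlet's volume, standardizing, averaging across outlets, and normalizing. A sketch with synthetic counts (the published index uses curated newspaper data, not random numbers):

```python
import numpy as np

rng = np.random.default_rng(5)
n_months, n_papers = 120, 10

# Hypothetical monthly counts of uncertainty-related articles per newspaper
epu_counts   = rng.poisson(30, (n_months, n_papers))
total_counts = rng.poisson(2000, (n_months, n_papers))

# 1) scale by each paper's article volume, 2) standardize each paper's series
# to unit standard deviation, 3) average across papers, 4) normalize to mean 100
scaled = epu_counts / total_counts
standardized = scaled / scaled.std(axis=0)
index = standardized.mean(axis=1)
epu_index = 100 * index / index.mean()

print(epu_index[:6].round(1))
```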

  2. A computational formalization for partial evaluation

    DEFF Research Database (Denmark)

    Hatcliff, John; Danvy, Olivier

    1996-01-01

    We formalize a partial evaluator for Eugenio Moggi's computational metalanguage. This formalization gives an evaluation-order independent view of binding-time analysis and program specialization, including a proper treatment of call unfolding. It also enables us to express the essence of `control-based binding-time improvements' for let expressions. Specifically, we prove that the binding-time improvements given by `continuation-based specialization' can be expressed in the metalanguage via monadic laws.
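
    The monadic laws referred to are the usual three, written here in metalanguage let-notation (standard formulation, not a quotation from the paper), where $[e]$ is the monadic unit:

    $$\begin{aligned} \mathbf{let}\; x \Leftarrow [e] \;\mathbf{in}\; e' &= e'[e/x] && \text{(left unit)}\\ \mathbf{let}\; x \Leftarrow e \;\mathbf{in}\; [x] &= e && \text{(right unit)}\\ \mathbf{let}\; y \Leftarrow (\mathbf{let}\; x \Leftarrow e_1 \;\mathbf{in}\; e_2) \;\mathbf{in}\; e_3 &= \mathbf{let}\; x \Leftarrow e_1 \;\mathbf{in}\; (\mathbf{let}\; y \Leftarrow e_2 \;\mathbf{in}\; e_3) && \text{(associativity, } x \notin \mathrm{FV}(e_3)\text{)} \end{aligned}$$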

  3. Uncertainty of a hydrological climate change impact assessment - Is it really all about climate uncertainty?

    Science.gov (United States)

    Honti, Mark; Reichert, Peter; Scheidegger, Andreas; Stamm, Christian

    2013-04-01

    climate change impact assessment and estimated the relative importance of the uncertainty sources. The study was performed on 2 small catchments in the Swiss Plateau with a lumped conceptual rainfall runoff model. In the climatic part we applied the standard ensemble approach to quantify uncertainty but in hydrology we used formal Bayesian uncertainty assessment method with 2 different likelihood functions. One was a time-series error model that was able to deal with the complicated statistical properties of hydrological model residuals. The second was a likelihood function for the flow quantiles directly. Due to the better data coverage and smaller hydrological complexity in one of our test catchments we had better performance from the hydrological model and thus could observe that the relative importance of different uncertainty sources varied between sites, boundary conditions and flow indicators. The uncertainty of future climate was important, but not dominant. The deficiencies of the hydrological model were on the same scale, especially for the sites and flow components where model performance for the past observations was further from optimal (Nash-Sutcliffe index = 0.5 - 0.7). The overall uncertainty of predictions was well beyond the expected change signal even for the best performing site and flow indicator.

  4. Formal, non-formal and informal learning in music : vocal students as animateurs : a case study of non-formal learning

    NARCIS (Netherlands)

    Kors, Ninja; Mak, Peter

    2006-01-01

    The pilot project that will be described in this report was all about the animateur. What are his skills and attitudes? What are the pedagogical interventions that he uses in a workshop or an event? What are the main issues that arise when we try to include such a naturally non-formal and informal

  5. A formalization of computational trust

    NARCIS (Netherlands)

    Güven - Ozcelebi, C.; Holenderski, M.J.; Ozcelebi, T.; Lukkien, J.J.

    2018-01-01

    Computational trust aims to quantify trust and is studied by many disciplines including computer science, social sciences and business science. We propose a formal computational trust model, including its parameters and operations on these parameters, as well as a step by step guide to compute trust
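
    As one concrete instance of a computational trust model (a beta-reputation scheme in the style of Jøsang and Ismail, not necessarily the model formalized in this paper):

```python
from dataclasses import dataclass

@dataclass
class BetaTrust:
    """Beta-reputation style trust score: trust is the expected probability
    of a positive outcome under a Beta(alpha, beta) posterior."""
    alpha: float = 1.0  # prior pseudo-count of positive outcomes
    beta: float = 1.0   # prior pseudo-count of negative outcomes

    def update(self, positive: bool, weight: float = 1.0) -> None:
        if positive:
            self.alpha += weight
        else:
            self.beta += weight

    @property
    def trust(self) -> float:
        return self.alpha / (self.alpha + self.beta)

t = BetaTrust()
for outcome in [True, True, False, True]:
    t.update(outcome)
print(f"trust = {t.trust:.2f}")  # 4/6 with the uniform prior
```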

  6. Formalisms for reuse and systems integration

    CERN Document Server

    Rubin, Stuart

    2015-01-01

    Reuse and integration are defined as synergistic concepts, where reuse addresses how to minimize redundancy in the creation of components, while integration focuses on component composition. Integration supports reuse and vice versa. These related concepts support the design of software and systems for maximizing performance while minimizing cost. Knowledge, like data, is subject to reuse; and each can be interpreted as the other. This means that inherent complexity, a measure of the potential utility of a system, is directly proportional to the extent to which it maximizes reuse and integration. Formal methods can provide an appropriate context for the rigorous handling of these synergistic concepts. Furthermore, formal languages allow for unambiguous model specification, and formal verification techniques provide support for ensuring the validity of reuse and integration mechanisms.   This edited book includes 12 high quality research papers written by experts in formal aspects of reuse and integratio...

  7. Uncertainty analysis guide

    International Nuclear Information System (INIS)

    Andres, T.H.

    2002-05-01

    This guide applies to the estimation of uncertainty in quantities calculated by scientific, analysis and design computer programs that fall within the scope of AECL's software quality assurance (SQA) manual. The guide weaves together rational approaches from the SQA manual and three other diverse sources: (a) the CSAU (Code Scaling, Applicability, and Uncertainty) evaluation methodology; (b) the ISO Guide for the Expression of Uncertainty in Measurement; and (c) the SVA (Systems Variability Analysis) method of risk analysis. This report describes the manner by which random and systematic uncertainties in calculated quantities can be estimated and expressed. Random uncertainty in model output can be attributed to uncertainties of inputs. The propagation of these uncertainties through a computer model can be represented in a variety of ways, including exact calculations, series approximations and Monte Carlo methods. Systematic uncertainties emerge from the development of the computer model itself, through simplifications and conservatisms, for example. These must be estimated and combined with random uncertainties to determine the combined uncertainty in a model output. This report also addresses the method by which uncertainties should be employed in code validation, in order to determine whether experiments and simulations agree, and whether or not a code satisfies the required tolerance for its application. (author)
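
    The propagation options named above (exact calculations, series approximations, Monte Carlo) differ mainly in cost and generality. The sketch below contrasts a first-order series approximation with Monte Carlo on an invented toy model, not anything from the guide itself:

```python
import numpy as np

rng = np.random.default_rng(2)

def model(x1, x2):
    return x1**2 * np.exp(0.5 * x2)  # toy code output

mu, sigma = np.array([2.0, 1.0]), np.array([0.1, 0.2])

# Series (first-order Taylor) approximation: var ≈ sum (df/dxi)^2 * var(xi)
d1 = 2 * mu[0] * np.exp(0.5 * mu[1])        # df/dx1 at the mean
d2 = 0.5 * mu[0]**2 * np.exp(0.5 * mu[1])   # df/dx2 at the mean
u_series = np.sqrt((d1 * sigma[0])**2 + (d2 * sigma[1])**2)

# Monte Carlo propagation of the same input uncertainties
x1 = rng.normal(mu[0], sigma[0], 100_000)
x2 = rng.normal(mu[1], sigma[1], 100_000)
u_mc = model(x1, x2).std()

print(f"series: {u_series:.3f}   Monte Carlo: {u_mc:.3f}")
```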

  8. Uncertainty analysis guide

    Energy Technology Data Exchange (ETDEWEB)

    Andres, T.H.

    2002-05-01

    This guide applies to the estimation of uncertainty in quantities calculated by scientific, analysis and design computer programs that fall within the scope of AECL's software quality assurance (SQA) manual. The guide weaves together rational approaches from the SQA manual and three other diverse sources: (a) the CSAU (Code Scaling, Applicability, and Uncertainty) evaluation methodology; (b) the ISO Guide for the Expression of Uncertainty in Measurement; and (c) the SVA (Systems Variability Analysis) method of risk analysis. This report describes the manner by which random and systematic uncertainties in calculated quantities can be estimated and expressed. Random uncertainty in model output can be attributed to uncertainties of inputs. The propagation of these uncertainties through a computer model can be represented in a variety of ways, including exact calculations, series approximations and Monte Carlo methods. Systematic uncertainties emerge from the development of the computer model itself, through simplifications and conservatisms, for example. These must be estimated and combined with random uncertainties to determine the combined uncertainty in a model output. This report also addresses the method by which uncertainties should be employed in code validation, in order to determine whether experiments and simulations agree, and whether or not a code satisfies the required tolerance for its application. (author)

  9. Formalizing Informal Logic

    Directory of Open Access Journals (Sweden)

    Douglas Walton

    2015-12-01

    Full Text Available This paper presents a formalization of informal logic using the Carneades Argumentation System (CAS), a formal, computational model of argument that consists of a formal model of argument graphs and audiences. Conflicts between pro and con arguments are resolved using proof standards, such as preponderance of the evidence. CAS also formalizes argumentation schemes. Schemes can be used to check whether a given argument instantiates the types of argument deemed normatively appropriate for the type of dialogue.
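
    A toy reading of the proof-standard idea (argument names and weights invented; CAS itself evaluates structured argument graphs relative to audiences):

```python
# Under "preponderance of the evidence", a proposition is acceptable when its
# strongest applicable pro argument outweighs its strongest con argument.
pro = {"expert_opinion": 0.7, "witness": 0.5}
con = {"conflicting_report": 0.6}

def preponderance(pro_args: dict, con_args: dict) -> bool:
    best_pro = max(pro_args.values(), default=0.0)
    best_con = max(con_args.values(), default=0.0)
    return best_pro > best_con

print(preponderance(pro, con))  # True: 0.7 > 0.6
```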

  10. Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment, appendices A and B

    International Nuclear Information System (INIS)

    Harper, F.T.; Young, M.L.; Miller, L.A.; Hora, S.C.; Lui, C.H.; Goossens, L.H.J.; Cooke, R.M.; Paesler-Sauer, J.; Helton, J.C.

    1995-01-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The objective was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters; the experts developed their distributions independently. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model along with the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. Resulting distributions closely replicated the aggregated elicited wet deposition distributions. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the second of a three-volume document describing the project and contains two appendices describing the rationales for the dispersion and deposition data along with short biographies of the 16 experts who participated in the project

  11. Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment, appendices A and B

    Energy Technology Data Exchange (ETDEWEB)

    Harper, F.T.; Young, M.L.; Miller, L.A. [Sandia National Labs., Albuquerque, NM (United States); Hora, S.C. [Univ. of Hawaii, Hilo, HI (United States); Lui, C.H. [Nuclear Regulatory Commission, Washington, DC (United States); Goossens, L.H.J.; Cooke, R.M. [Delft Univ. of Technology (Netherlands); Paesler-Sauer, J. [Research Center, Karlsruhe (Germany); Helton, J.C. [and others

    1995-01-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The objective was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters; the experts developed their distributions independently. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model along with the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. Resulting distributions closely replicated the aggregated elicited wet deposition distributions. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the second of a three-volume document describing the project and contains two appendices describing the rationales for the dispersion and deposition data along with short biographies of the 16 experts who participated in the project.

  12. Recognizing and responding to uncertainty: a grounded theory of nurses' uncertainty.

    Science.gov (United States)

    Cranley, Lisa A; Doran, Diane M; Tourangeau, Ann E; Kushniruk, Andre; Nagle, Lynn

    2012-08-01

    There has been little research to date exploring nurses' uncertainty in their practice. Understanding nurses' uncertainty is important because it has potential implications for how care is delivered. The purpose of this study is to develop a substantive theory to explain how staff nurses experience and respond to uncertainty in their practice. Between 2006 and 2008, a grounded theory study was conducted that included in-depth semi-structured interviews. Fourteen staff nurses working in adult medical-surgical intensive care units at two teaching hospitals in Ontario, Canada, participated in the study. The theory recognizing and responding to uncertainty characterizes the processes through which nurses' uncertainty manifested and how it was managed. Recognizing uncertainty involved the processes of assessing, reflecting, questioning, and/or being unable to predict aspects of the patient situation. Nurses' responses to uncertainty highlighted the cognitive-affective strategies used to manage uncertainty. Study findings highlight the importance of acknowledging uncertainty and having collegial support to manage uncertainty. The theory adds to our understanding of the processes involved in recognizing uncertainty, the strategies and outcomes of managing uncertainty, and influencing factors. Tailored nursing education programs should be developed to assist nurses in developing skills in articulating and managing their uncertainty. Further research is needed to extend, test and refine the theory of recognizing and responding to uncertainty to develop strategies for managing uncertainty. This theory advances the nursing perspective of uncertainty in clinical practice. The theory is relevant to nurses who are faced with uncertainty and complex clinical decisions, to managers who support nurses in their clinical decision-making, and to researchers who investigate ways to improve decision-making and care delivery. ©2012 Sigma Theta Tau International.

  13. NASA/DOD Aerospace Knowledge Diffusion Research Project. Paper 36: Technical uncertainty as a correlate of information use by US industry-affiliated aerospace engineers and scientists

    Science.gov (United States)

    Pinelli, Thomas E.; Glassman, Nanci A.; Affelder, Linda O.; Hecht, Laura M.; Kennedy, John M.; Barclay, Rebecca O.

    1994-01-01

    This paper reports the results of an exploratory study that investigated the influence of technical uncertainty on the use of information and information sources by U.S. industry-affiliated aerospace engineers and scientists in completing or solving a project, task, or problem. Data were collected through a self-administered questionnaire. Survey participants were U.S. aerospace engineers and scientists whose names appeared on the Society of Automotive Engineers (SAE) mailing list. The results support the findings of previous research and the following study assumptions. Information and information-source use differ for projects, problems, and tasks with high and low technical uncertainty. As technical uncertainty increases, information-source use changes from internal to external and from informal to formal sources. As technical uncertainty increases, so too does the use of federally funded aerospace research and development (R&D). The use of formal information sources to learn about federally funded aerospace R&D differs for projects, problems, and tasks with high and low technical uncertainty.

  14. Essential competencies analysis of a training model development for non-formal vocational teachers under the office of the non-formal and informal education in Thailand

    Directory of Open Access Journals (Sweden)

    Chayanopparat Piyanan

    2016-01-01

    Full Text Available Non-formal vocational education provides practical experience in a particular occupational field to non-formal semi-skilled learners. Non-formal vocational teachers are the key persons to deliver particular occupational knowledge, and enhancing their essential competencies will improve teaching performance. The research question is what the essential competencies for non-formal vocational teachers are. The research method was (1) to review related literature, (2) to conduct a needs assessment, and (3) to analyse the essential competencies for non-formal vocational teachers. The population included non-formal vocational teachers at the executive level and non-formal vocational teachers. The results of the essential competencies analysis showed that the essential competencies for non-formal vocational teachers consist of five capabilities: (1) adult learning design capability, (2) adult learning principle application capability, (3) ICT searching capability for teaching preparation, (4) instructional plan development capability and (5) instructional media development capability.

  15. Traditional and formal education: Means of improving grasscutter ...

    African Journals Online (AJOL)

    The study concludes that both traditional and non-formal education are important for the development and efficiency of grasscutter farming in Ogun Waterside Local Government Area of Ogun State. The following are the recommendations of the study: revision of the curriculum of formal schools to include items that inculcate ...

  16. Development of Property Models with Uncertainty Estimate for Process Design under Uncertainty

    DEFF Research Database (Denmark)

    Hukkerikar, Amol; Sarup, Bent; Abildskov, Jens

    more reliable predictions with a new and improved set of model parameters for GC (group contribution) based and CI (atom connectivity index) based models and to quantify the uncertainties in the estimated property values from a process design point-of-view. This includes: (i) parameter estimation using....... The comparison of model prediction uncertainties with reported range of measurement uncertainties is presented for the properties with related available data. The application of the developed methodology to quantify the effect of these uncertainties on the design of different unit operations (distillation column......, the developed methodology can be used to quantify the sensitivity of process design to uncertainties in property estimates; obtain rationally the risk/safety factors in process design; and identify additional experimentation needs in order to reduce most critical uncertainties....

  17. Two-stage robust UC including a novel scenario-based uncertainty model for wind power applications

    International Nuclear Information System (INIS)

    Álvarez-Miranda, Eduardo; Campos-Valdés, Camilo; Rahmann, Claudia

    2015-01-01

    Highlights: • Methodological framework for obtaining Robust Unit Commitment (UC) policies. • Wind-power forecast using a revisited bootstrap predictive inference approach. • Novel scenario-based model for wind-power uncertainty. • Efficient modeling framework for obtaining nearly optimal UC policies in reasonable time. • Effective incorporation of wind-power uncertainty in the UC modeling. - Abstract: The complex processes involved in the determination of the availability of power from renewable energy sources, such as wind power, impose great challenges on the forecasting processes carried out by transmission system operators (TSOs). Nowadays, many of these TSOs use operation planning tools that take into account the uncertainty of wind power. However, most of these methods typically require strict assumptions about the probabilistic behavior of the forecast error, and usually ignore the dynamic nature of the forecasting process. In this paper a methodological framework to obtain Robust Unit Commitment (UC) policies is presented; the methodology considers a novel scenario-based uncertainty model for wind power applications. The proposed method is composed of three main phases. The first two phases generate a sound wind-power forecast using a bootstrap predictive inference approach. The third phase corresponds to modeling and solving a one-day-ahead Robust UC considering the output of the first phase. The performance of the proposed approach is evaluated using as case study a new wind farm to be incorporated into the Northern Interconnected System (NIS) of Chile. A projection of wind-based power installation, as well as different characteristics of the uncertain data, are considered in this study
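
    Generic two-stage robust UC formulations take the min-max-min form below (a standard template, not the paper's exact model), with commitment decisions x fixed before the wind realization and dispatch y as recourse:

    $$ \min_{x \in X} \; c^{\top} x \;+\; \max_{\xi \in \Xi} \; \min_{y \in Y(x,\xi)} \; q^{\top} y, $$

    where $\Xi$ is the (here scenario-based) wind-power uncertainty set.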

  18. Integrating risk analysis and multi-criteria decision support under uncertainty in electricity distribution system asset management

    International Nuclear Information System (INIS)

    Catrinu, M.D.; Nordgard, D.E.

    2011-01-01

    Asset managers in electricity distribution companies generally recognize the need and the challenge of adding structure and a higher degree of formal analysis to increasingly complex asset management decisions. This implies improving present asset management practice by making the best use of available data and expert knowledge, by adopting new methods for risk analysis and decision support, and by better documenting the decisions made. This paper discusses methods for integrating risk analysis and multi-criteria decision support under uncertainty in electricity distribution system asset management. The focus is on how to include the different company objectives and risk analyses in a structured decision framework when deciding how to handle the physical assets of the electricity distribution network. The paper presents an illustrative example of decision support for maintenance and reinvestment strategies, using expert knowledge, simplified risk analyses and multi-criteria decision analysis under uncertainty.

  19. Estimation of full moment tensors, including uncertainties, for earthquakes, volcanic events, and nuclear explosions

    Science.gov (United States)

    Alvizuri, Celso R.

    We present a catalog of full seismic moment tensors for 63 events from Uturuncu volcano in Bolivia. The events were recorded during 2011-2012 in the PLUTONS seismic array of 24 broadband stations. Most events had magnitudes between 0.5 and 2.0 and did not generate discernible surface waves; the largest event was Mw 2.8. For each event we computed the misfit between observed and synthetic waveforms, and we used first-motion polarity measurements to reduce the number of possible solutions. Each moment tensor solution was obtained using a grid search over the six-dimensional space of moment tensors. For each event we show the misfit function in eigenvalue space, represented by a lune. We identify three subsets of the catalog: (1) 6 isotropic events, (2) 5 tensional crack events, and (3) a swarm of 14 events southeast of the volcanic center that appear to be double couples. The occurrence of positively isotropic events is consistent with other published results from volcanic and geothermal regions. Several of these previous results, as well as our results, cannot be interpreted within the context of either an oblique opening crack or a crack-plus-double-couple model. Proper characterization of uncertainties for full moment tensors is critical for distinguishing among physical models of source processes. A seismic moment tensor is a 3x3 symmetric matrix that provides a compact representation of a seismic source. We develop an algorithm to estimate moment tensors and their uncertainties from observed seismic data. For a given event, the algorithm performs a grid search over the six-dimensional space of moment tensors by generating synthetic waveforms for each moment tensor and then evaluating a misfit function between the observed and synthetic waveforms. 'The' moment tensor M0 for the event is then the moment tensor with minimum misfit. To describe the uncertainty associated with M0, we first convert the misfit function to a probability function. The uncertainty, or
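
    A schematic of the misfit-to-probability conversion used in the uncertainty part of the algorithm (the scale parameter and misfit values below are invented; the real search evaluates waveform misfit over a six-dimensional moment-tensor grid):

```python
import numpy as np

rng = np.random.default_rng(4)

# Stand-in misfit values over a coarse grid of candidate moment tensors
n_grid = 5000
misfit = rng.gamma(2.0, 1.0, n_grid)

# Convert misfit to a probability, p(m) ∝ exp(-misfit/s), then normalize
s = 0.5  # assumed scale controlling how sharply misfit is penalized
p = np.exp(-misfit / s)
p /= p.sum()

best = np.argmin(misfit)          # 'the' moment tensor M0: minimum misfit
conf = np.sort(p)[::-1].cumsum()  # mass captured by the top-ranked solutions
n90 = int(np.searchsorted(conf, 0.9)) + 1
print(f"best index {best}; {n90} of {n_grid} grid points hold 90% of the mass")
```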

  20. Formality in Brackets

    DEFF Research Database (Denmark)

    Garsten, Christina; Nyqvist, Anette

    Ethnographic work in formal organizations involves learning to recognize the many layers of front stage and back stage of organized life, and to bracket formality. It means being alert to the fact that what is formal and front stage for some actors, and in some situations, may in fact be back...... stage and informal for others. Walking the talk, donning the appropriate attire, wearing the proper suit, may be part of what it takes to figure out the code of formal organizational settings – an entrance ticket to the backstage, as it were. Oftentimes, it involves a degree of mimicry, of ‘following suits’ (Nyqvist 2013), and of doing ‘ethnography by failure’ (Garsten 2013). In this paper, we explore the layers of informality and formality in our fieldwork experiences among financial investors and policy experts, and discuss how to ethnographically represent embodied fieldwork practices. How do we...

  1. Uncertainty vs. Information (Invited)

    Science.gov (United States)

    Nearing, Grey

    2017-04-01

    Information theory is the branch of logic that describes how rational epistemic states evolve in the presence of empirical data (Knuth, 2005), and any logic of science is incomplete without such a theory. Developing a formal philosophy of science that recognizes this fact results in essentially trivial solutions to several longstanding problems that are generally considered intractable, including: • Alleviating the need for any likelihood function or error model. • Derivation of purely logical falsification criteria for hypothesis testing. • Specification of a general quantitative method for process-level model diagnostics. More generally, I make the following arguments: 1. Model evaluation should not proceed by quantifying and/or reducing error or uncertainty, and instead should be approached as a problem of ensuring that our models contain as much information as our experimental data. I propose that the latter is the only question a scientist actually has the ability to ask. 2. Instead of building geophysical models as solutions to differential equations that represent conservation laws, we should build models as maximum entropy distributions constrained by conservation symmetries. This will allow us to derive predictive probabilities directly from first principles. Knuth, K. H. (2005) 'Lattice duality: The origin of probability and entropy', Neurocomputing, 67, pp. 245-274.
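
    The constrained maximum-entropy construction invoked in point 2 has the standard textbook form (not specific to this abstract):

    $$ \max_{p} \; -\!\int p(x)\ln p(x)\,dx \quad \text{s.t.} \quad \int p(x)\,dx = 1,\;\; \operatorname{E}_p[f_k(x)] = F_k \;\;\Longrightarrow\;\; p(x) \propto \exp\!\Big(-\sum_k \lambda_k f_k(x)\Big), $$

    with Lagrange multipliers $\lambda_k$ fixed by the conservation constraints.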

  2. Software Formal Inspections Guidebook

    Science.gov (United States)

    1993-01-01

    The Software Formal Inspections Guidebook is designed to support the inspection process of software developed by and for NASA. This document provides information on how to implement a recommended and proven method for conducting formal inspections of NASA software. This Guidebook is a companion document to NASA Standard 2202-93, Software Formal Inspections Standard, approved April 1993, which provides the rules, procedures, and specific requirements for conducting software formal inspections. Application of the Formal Inspections Standard is optional to NASA program or project management. In cases where program or project management decide to use the formal inspections method, this Guidebook provides additional information on how to establish and implement the process. The goal of the formal inspections process as documented in the above-mentioned Standard and this Guidebook is to provide a framework and model for an inspection process that will enable the detection and elimination of defects as early as possible in the software life cycle. An ancillary aspect of the formal inspection process incorporates the collection and analysis of inspection data to effect continual improvement in the inspection process and the quality of the software subjected to the process.

  3. Expanding Uncertainty Principle to Certainty-Uncertainty Principles with Neutrosophy and Quad-stage Method

    Directory of Open Access Journals (Sweden)

    Fu Yuhua

    2015-03-01

    The most famous contribution of Heisenberg is the uncertainty principle. But the original uncertainty principle is improper. Considering all the possible situations (including the case that people can create laws) and applying Neutrosophy and the Quad-stage Method, this paper presents "certainty-uncertainty principles" in a general form and a variable dimension fractal form. According to the classification of Neutrosophy, the "certainty-uncertainty principles" can be divided into three principles holding under different conditions: the "certainty principle", namely that a particle's position and momentum can be known simultaneously; the "uncertainty principle", namely that a particle's position and momentum cannot be known simultaneously; and the neutral (fuzzy) "indeterminacy principle", namely that whether or not a particle's position and momentum can be known simultaneously is undetermined. The special cases of the "certainty-uncertainty principles" include the original uncertainty principle and the Ozawa inequality. In addition, according to the original uncertainty principle, discussing a high-speed particle's speed and track with Newtonian mechanics is unreasonable; but according to the "certainty-uncertainty principles", Newtonian mechanics can be used to discuss the problem of the gravitational deflection of a photon orbit around the Sun (it gives the same deflection angle as general relativity). Finally, since in physics principles, laws and the like that disregard the principle (law) of conservation of energy may be invalid, the "certainty-uncertainty principles" should be restricted (or constrained) by the principle (law) of conservation of energy, and thus they can satisfy the principle (law) of conservation of energy.
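
    For orientation, here is a short numerical check of the standard Heisenberg bound for a Gaussian wavepacket (our illustration; it does not implement the paper's certainty-uncertainty principles). The grid sizes and the width sigma are assumptions, and units take ħ = 1, so the minimum product is σx·σp = 1/2.

```python
import numpy as np

hbar = 1.0
N, L = 4096, 200.0
x = np.linspace(-L / 2, L / 2, N, endpoint=False)
dx = x[1] - x[0]
sigma = 2.0
# normalized Gaussian wavepacket psi(x)
psi = (2 * np.pi * sigma**2) ** -0.25 * np.exp(-x**2 / (4 * sigma**2))

prob_x = np.abs(psi) ** 2 * dx
prob_x /= prob_x.sum()
sx = np.sqrt((prob_x * x**2).sum() - (prob_x * x).sum() ** 2)

# momentum-space distribution via FFT
phi = np.fft.fft(psi)
p = 2 * np.pi * hbar * np.fft.fftfreq(N, d=dx)
prob_p = np.abs(phi) ** 2
prob_p /= prob_p.sum()
sp = np.sqrt((prob_p * p**2).sum() - (prob_p * p).sum() ** 2)

print(sx * sp, "vs hbar/2 =", hbar / 2)   # ≈ 0.5 for a Gaussian
```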

  4. Barriers to formal emergency obstetric care services' utilization.

    Science.gov (United States)

    Essendi, Hildah; Mills, Samuel; Fotso, Jean-Christophe

    2011-06-01

    Access to appropriate health care including skilled birth attendance at delivery and timely referrals to emergency obstetric care services can greatly reduce maternal deaths and disabilities, yet women in sub-Saharan Africa continue to face limited access to skilled delivery services. This study relies on qualitative data collected from residents of two slums in Nairobi, Kenya in 2006 to investigate views surrounding barriers to the uptake of formal obstetric services. Data indicate that slum dwellers prefer formal to informal obstetric services. However, their efforts to utilize formal emergency obstetric care services are constrained by various factors including ineffective health decision making at the family level, inadequate transport facilities to formal care facilities and insecurity at night, high cost of health services, and inhospitable formal service providers and poorly equipped health facilities in the slums. As a result, a majority of slum dwellers opt for delivery services offered by traditional birth attendants (TBAs) who lack essential skills and equipment, thereby increasing the risk of death and disability. Based on these findings, we maintain that urban poor women face barriers to access of formal obstetric services at family, community, and health facility levels, and efforts to reduce maternal morbidity and mortality among the urban poor must tackle the barriers, which operate at these different levels to hinder women's access to formal obstetric care services. We recommend continuous community education on symptoms of complications related to pregnancy and timely referral. A focus on training of health personnel on "public relations" could also restore confidence in the health-care system with this populace. Further, we recommend improving the health facilities in the slums, improving the services provided by TBAs through capacity building as well as involving TBAs in referral processes to make access to services timely. Measures can also be

  5. A Peep into the Uncertainty-Complexity-Relevance Modeling Trilemma through Global Sensitivity and Uncertainty Analysis

    Science.gov (United States)

    Munoz-Carpena, R.; Muller, S. J.; Chu, M.; Kiker, G. A.; Perz, S. G.

    2014-12-01

    Model complexity resulting from the need to integrate environmental system components cannot be overstated. In particular, additional emphasis is urgently needed on rational approaches to guide decision making through uncertainties surrounding the integrated system across decision-relevant scales. However, in spite of the difficulties that the consideration of modeling uncertainty represents for the decision process, it should not be avoided or the value and science behind the models will be undermined. These two issues, i.e., the need for coupled models that can answer the pertinent questions and the need for models that do so with sufficient certainty, are the key indicators of a model's relevance. Model relevance is inextricably linked with model complexity. Although model complexity has advanced greatly in recent years, there has been little work to rigorously characterize the threshold of relevance in integrated and complex models. Formally assessing the relevance of the model in the face of increasing complexity would be valuable because there is growing unease among developers and users of complex models about the cumulative effects of various sources of uncertainty on model outputs. In particular, this issue has prompted doubt over whether the considerable effort going into further elaborating complex models will in fact yield the expected payback. New approaches have been proposed recently to evaluate the uncertainty-complexity-relevance modeling trilemma (Muller, Muñoz-Carpena and Kiker, 2011) by incorporating state-of-the-art global sensitivity and uncertainty analysis (GSA/UA) in every step of the model development so as to quantify not only the uncertainty introduced by the addition of new environmental components, but the effect that these new components have over existing components (interactions, non-linear responses). Outputs from the analysis can also be used to quantify system resilience (stability, alternative states, thresholds or tipping
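
    As a compact illustration of the variance-based global sensitivity estimates that GSA/UA frameworks build on (our sketch, not the cited Muller et al. methodology), the following Python snippet estimates first-order indices S_i ≈ Var(E[Y|X_i])/Var(Y) by binning a Monte Carlo sample, using the standard Ishigami benchmark function.

```python
import numpy as np

rng = np.random.default_rng(0)
n, nbins = 100_000, 50
X = rng.uniform(-np.pi, np.pi, size=(n, 3))
# Ishigami test function, a common GSA benchmark
Y = np.sin(X[:, 0]) + 7 * np.sin(X[:, 1])**2 + 0.1 * X[:, 2]**4 * np.sin(X[:, 0])

for i in range(3):
    # approximate E[Y | X_i] by conditional means over equal-count bins
    bins = np.quantile(X[:, i], np.linspace(0, 1, nbins + 1))
    idx = np.clip(np.digitize(X[:, i], bins) - 1, 0, nbins - 1)
    counts = np.bincount(idx, minlength=nbins)
    cond_means = np.bincount(idx, weights=Y, minlength=nbins) / counts
    S_i = np.var(cond_means[idx]) / np.var(Y)   # first-order index estimate
    print(f"S_{i+1} ≈ {S_i:.2f}")               # analytic: 0.31, 0.44, 0.00
```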

  6. Fear of the Formal

    DEFF Research Database (Denmark)

    du Gay, Paul; Lopdrup-Hjorth, Thomas

    Over recent decades, institutions exhibiting high degrees of formality have come in for severe criticism. From the private to the public sector, and across a whole spectrum of actors spanning from practitioners to academics, formal organization is viewed with increasing doubt and skepticism. In a "Schumpeterian world" (Teece et al., 1997: 509) of dynamic competition and incessant reform, formal organization appears as well suited to survival as a fish out of water. Indeed, formal organization, and its closely overlapping semantic twin bureaucracy, are not only represented as ill suited to the realities… is that formal organization is an obstacle to be overcome. For that very reason, critics, intellectuals and reformers alike have urged public and private organizations to break out of the stifling straitjacket of formality, to dispense with bureaucracy, and to tear down hierarchies. This could either be done…

  7. Uncertainty, Social Location and Influence in Decision Making: A Sociometric Analysis

    OpenAIRE

    Michael L. Tushman; Elaine Romanelli

    1983-01-01

    This research investigates the relative impacts of formal status and informal communication roles on influence in administrative and technical decision making. While external information enters the organization via boundary spanning individuals, the exercise of influence at lower levels of the organization is dependent on mediating critical organizational contingencies. As the locus of task uncertainty shifts, so too does the relative influence of boundary spanning individuals and internal st...

  8. Addressing uncertainties in the ERICA Integrated Approach

    International Nuclear Information System (INIS)

    Oughton, D.H.; Agueero, A.; Avila, R.; Brown, J.E.; Copplestone, D.; Gilek, M.

    2008-01-01

    Like any complex environmental problem, ecological risk assessment of the impacts of ionising radiation is confounded by uncertainty. At all stages, from problem formulation through to risk characterisation, the assessment is dependent on models, scenarios, assumptions and extrapolations. These include technical uncertainties related to the data used, conceptual uncertainties associated with models and scenarios, as well as social uncertainties such as economic impacts, the interpretation of legislation, and the acceptability of the assessment results to stakeholders. The ERICA Integrated Approach has been developed to allow an assessment of the risks of ionising radiation, and includes a number of methods that are intended to make the uncertainties and assumptions inherent in the assessment more transparent to users and stakeholders. Throughout its development, ERICA has recommended that assessors deal openly with the deeper dimensions of uncertainty and acknowledge that uncertainty is intrinsic to complex systems. Since the tool is based on a tiered approach, the approaches to dealing with uncertainty vary between the tiers, ranging from a simple but highly conservative screening to a full probabilistic risk assessment including sensitivity analysis. This paper gives an overview of the types of uncertainty that are manifest in ecological risk assessment and the ERICA Integrated Approach to dealing with some of these uncertainties.

  9. Necessity of Integral Formalism

    International Nuclear Information System (INIS)

    Tao Yong

    2011-01-01

    To describe physical reality, there are two ways of constructing the dynamical equation of a field: differential formalism and integral formalism. The importance of this fact was first emphasized by Yang in the case of the gauge field [Phys. Rev. Lett. 33 (1974) 445], where it gave rise to a deeper understanding of the Aharonov-Bohm phase and the magnetic monopole [Phys. Rev. D 12 (1975) 3845]. In this paper we point out that the same fact also holds for the general wave function of matter, where it may give rise to a deeper understanding of the Berry phase. Most importantly, we prove that, for the general wave function of matter, in the adiabatic limit, there is an intrinsic difference between its integral formalism and its differential formalism. It is the neglect of this difference that leads to the inconsistency of the quantum adiabatic theorem pointed out by Marzlin and Sanders [Phys. Rev. Lett. 93 (2004) 160408]. It has been widely accepted that there is no physical difference between using a differential operator or an integral operator to construct the dynamical equation of a field. Nevertheless, our study shows that the Schrödinger differential equation (i.e., the differential formalism for the wave function) leads to a vanishing Berry phase, whereas the Schrödinger integral equation (i.e., the integral formalism for the wave function), in the adiabatic limit, satisfactorily gives the Berry phase. Therefore, we reach a conclusion: there are two ways of describing physical reality, differential formalism and integral formalism; but the integral formalism is the unique way of giving a complete description.

  10. Rational consensus under uncertainty: Expert judgment in the EC-USNRC uncertainty study

    International Nuclear Information System (INIS)

    Cooke, R.; Kraan, B.; Goossens, L.

    1999-01-01

    Simply choosing a maximally feasible pool of experts and combining their views by some method of equal representation might achieve a form of political consensus among the experts involved, but will not achieve rational consensus. If expert viewpoints are related to the institutions at which the experts are employed, then the numerical representation of viewpoints in the pool may be, and/or may be perceived to be, influenced by the size of the interests funding the institutes. We collect a number of conclusions regarding the use of structured expert judgment. 1. Experts' subjective uncertainties may be used to advance rational consensus in the face of large uncertainties, in so far as the necessary conditions for rational consensus are satisfied. 2. Empirical control of experts' subjective uncertainties is possible. 3. Experts' performance as subjective probability assessors is not uniform; there are significant differences in performance. 4. Experts as a group may show poor performance. 5. A structured combination of expert judgment may show satisfactory performance, even though the experts individually perform poorly. 6. The performance-based combination generally outperforms the equal-weight combination. 7. The combination of experts' subjective probabilities, according to the schemes discussed here, generally has wider 90% central confidence intervals than the experts individually, particularly in the case of the equal-weight combination. We note that poor performance as a subjective probability assessor does not indicate a lack of substantive expert knowledge. Rather, it indicates unfamiliarity with quantifying subjective uncertainty in terms of subjective probability distributions. Experts were provided with training in subjective probability assessment, but of course their formal training does not (yet) prepare them for such tasks.
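
    The following toy sketch (ours, with assumed weights rather than calibration-derived ones) illustrates conclusion 7 above: combining two experts' distributions as a weighted mixture, the equal-weight combination produces a wider 90% central interval than a performance-based weighting that favours the better-calibrated expert.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000
expert_a = rng.normal(10.0, 1.0, n)   # narrower, better-calibrated expert
expert_b = rng.normal(14.0, 3.0, n)   # broader, poorly calibrated expert

def combined_quantiles(wa, wb):
    # sample from the mixture wa*A + wb*B and report 5/50/95 percentiles
    pick = rng.random(n) < wa / (wa + wb)
    sample = np.where(pick, expert_a, expert_b)
    return np.percentile(sample, [5, 50, 95])

print("equal weights      :", combined_quantiles(0.5, 0.5))
print("performance weights:", combined_quantiles(0.9, 0.1))
# the equal-weight mixture shows the wider 90% central interval
```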

  11. Pragmatics for formal semantics

    DEFF Research Database (Denmark)

    Danvy, Olivier

    2011-01-01

    This tech talk describes how to write and how to inter-derive formal semantics for sequential programming languages. The progress reported here is (1) concrete guidelines to write each formal semantics to alleviate their proof obligations, and (2) simple calculational tools to obtain a formal...

  12. Vehicle navigation in populated areas using predictive control with environmental uncertainty handling

    Directory of Open Access Journals (Sweden)

    Skrzypczyk Krzysztof

    2017-06-01

    This paper addresses the problem of navigating an autonomous vehicle using environmental dynamics prediction. The usefulness of the Game Against Nature formalism adapted to modelling environmental prediction uncertainty is discussed. The possibility of the control law synthesis on the basis of strategies against Nature is presented. The properties and effectiveness of the approach presented are verified by simulations carried out in MATLAB.

  13. Multi-attribute evaluation and choice of alternatives for surplus weapons-usable plutonium disposition at uncertainty

    International Nuclear Information System (INIS)

    Kosterev, V.V.; Bolyatko, V.V.; Khajretdinov, S.I.; Averkin, A.N.

    2014-01-01

    The problem of surplus weapons-usable plutonium disposition is formalized as a multi-attribute problem of choosing alternatives from a set of possible alternatives under fuzzy conditions. Evaluation and ordering of alternatives for the surplus weapons-usable plutonium disposition and a sensitivity analysis are carried out under uncertainty.

  14. Industrial use of formal methods formal verification

    CERN Document Server

    Boulanger, Jean-Louis

    2012-01-01

    At present the literature offers students and researchers only very general books on formal techniques. The purpose of this book is to present, in a single volume, a return of experience on the use of formal techniques (such as proof and model-checking) on industrial examples in the transportation domain. This book is based on the experience of people who are directly involved in the realization and evaluation of safety-critical, software-based systems. The involvement of the industrialists makes it possible to raise the problems of confidentiality which could appear and so allow…

  15. Formal Verification -26 ...

    Indian Academy of Sciences (India)

    …by testing of the components, and successful testing leads to the software being… Formal verification is based on formal methods, which are mathematically based… scenario under which a similar error could occur. There are various other…

  16. Formalization and Implementation of Algebraic Methods in Geometry

    Directory of Open Access Journals (Sweden)

    Filip Marić

    2012-02-01

    We describe our ongoing project on the formalization of algebraic methods for geometry theorem proving (Wu's method and the Groebner bases method), their implementation, and their integration in educational tools. The project includes formal verification of the algebraic methods within the Isabelle/HOL proof assistant and the development of a new, open-source Java implementation of the algebraic methods. The project should fill in some gaps still existing in this area (e.g., the lack of formal links between algebraic methods and synthetic geometry, and the lack of self-contained implementations of algebraic methods suitable for integration with dynamic geometry tools) and should enable new applications of theorem proving in education.
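
    To show the flavour of the Groebner bases method (a sketch using sympy, not the project's Java or Isabelle/HOL implementations), the snippet below reduces a geometric theorem to ideal membership: hypotheses and conclusion are translated to polynomials, and the conclusion holds if it reduces to zero modulo a Groebner basis of the hypothesis ideal. The triangle example is our own choice.

```python
from sympy import symbols, groebner, expand

# Statement: in a right triangle with A=(0,0), B=(b,0), C=(0,c),
# the midpoint M=(mx,my) of BC is equidistant from A and B.
b, c, mx, my = symbols('b c mx my')
hypotheses = [2*mx - b, 2*my - c]                              # M = (b/2, c/2)
conclusion = expand((mx**2 + my**2) - ((mx - b)**2 + my**2))   # |MA|^2 - |MB|^2

G = groebner(hypotheses, mx, my, order='lex')
_, remainder = G.reduce(conclusion)
print("theorem holds:", remainder == 0)   # remainder 0 => conclusion in ideal
```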

  17. Determination of prescription dose for Cs-131 permanent implants using the BED formalism including resensitization correction

    Energy Technology Data Exchange (ETDEWEB)

    Luo, Wei, E-mail: wei.luo@uky.edu; Molloy, Janelle; Aryal, Prakash; Feddock, Jonathan; Randall, Marcus [Department of Radiation Medicine, University of Kentucky, Lexington, Kentucky 40536 (United States)

    2014-02-15

    Purpose: The current widely used biological equivalent dose (BED) formalism for permanent implants is based on the linear-quadratic model that includes cell repair and repopulation but not resensitization (redistribution and reoxygenation). The authors propose a BED formalism that includes all four biological effects (4Rs), and they propose how it can be used to calculate appropriate prescription doses for permanent implants with Cs-131. Methods: A resensitization correction was added to the BED calculation for permanent implants to account for the 4Rs. Using the same BED, the prescription doses with Au-198, I-125, and Pd-103 were converted to the isoeffective Cs-131 prescription doses. The conversion factor F, the ratio of the Cs-131 dose to the equivalent dose with the other reference isotope (F_r: with resensitization, F_n: without resensitization), was thus derived and used for actual prescription. Different values of biological parameters such as α, β, and relative biological effectiveness for different types of tumors were used for the calculation. Results: Prescription doses with I-125, Pd-103, and Au-198 ranging from 10 to 160 Gy were converted into prescription doses with Cs-131. The difference in dose conversion factors with (F_r) and without (F_n) resensitization was significant but varied with different isotopes and different types of tumors. The conversion factors also varied with different doses. For I-125, the average values of F_r/F_n were 0.51/0.46 for fast growing tumors and 0.88/0.77 for slow growing tumors. For Pd-103, the average values of F_r/F_n were 1.25/1.15 for fast growing tumors and 1.28/1.22 for slow growing tumors. For Au-198, the average values of F_r/F_n were 1.08/1.25 for fast growing tumors and 1.00/1.06 for slow growing tumors. Using the biological parameters for the HeLa/C4-I cells, the averaged value of F_r was 1.07/1.11 (rounded to 1.1), and the averaged value of F

  18. Determination of prescription dose for Cs-131 permanent implants using the BED formalism including resensitization correction

    International Nuclear Information System (INIS)

    Luo, Wei; Molloy, Janelle; Aryal, Prakash; Feddock, Jonathan; Randall, Marcus

    2014-01-01

    Purpose: The current widely used biological equivalent dose (BED) formalism for permanent implants is based on the linear-quadratic model that includes cell repair and repopulation but not resensitization (redistribution and reoxygenation). The authors propose a BED formalism that includes all the four biological effects (4Rs), and the authors propose how it can be used to calculate appropriate prescription doses for permanent implants with Cs-131. Methods: A resensitization correction was added to the BED calculation for permanent implants to account for 4Rs. Using the same BED, the prescription doses with Au-198, I-125, and Pd-103 were converted to the isoeffective Cs-131 prescription doses. The conversion factor F, ratio of the Cs-131 dose to the equivalent dose with the other reference isotope (F_r: with resensitization, F_n: without resensitization), was thus derived and used for actual prescription. Different values of biological parameters such as α, β, and relative biological effectiveness for different types of tumors were used for the calculation. Results: Prescription doses with I-125, Pd-103, and Au-198 ranging from 10 to 160 Gy were converted into prescription doses with Cs-131. The difference in dose conversion factors with (F_r) and without (F_n) resensitization was significant but varied with different isotopes and different types of tumors. The conversion factors also varied with different doses. For I-125, the average values of F_r/F_n were 0.51/0.46, for fast growing tumors, and 0.88/0.77 for slow growing tumors. For Pd-103, the average values of F_r/F_n were 1.25/1.15 for fast growing tumors, and 1.28/1.22 for slow growing tumors. For Au-198, the average values of F_r/F_n were 1.08/1.25 for fast growing tumors, and 1.00/1.06 for slow growing tumors. Using the biological parameters for the HeLa/C4-I cells, the averaged value of F_r was 1.07/1.11 (rounded to 1.1), and the averaged value of F_n was 1.75/1.18. F_r of 1.1 has been applied to
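
    For orientation only, here is a sketch of the classical repair-only permanent-implant BED and the isoeffective dose conversion it implies. It deliberately omits the repopulation and resensitization corrections that are the subject of these papers, and every parameter value (half-lives, α/β = 10 Gy, a 1.5 h repair half-time, the 145 Gy I-125 prescription) is an assumption for illustration.

```python
import numpy as np
from scipy.optimize import brentq

ln2 = np.log(2.0)

def bed(D, t_half, alpha_beta, t_repair=1.5 / 24):
    # Dale-type permanent-implant BED with repair only (times in days):
    # BED = D * (1 + R0 / ((mu + lam) * (alpha/beta)))
    lam = ln2 / t_half        # source decay constant
    mu = ln2 / t_repair       # sublethal-damage repair constant
    R0 = lam * D              # initial dose rate for total dose D
    return D * (1 + R0 / ((mu + lam) * alpha_beta))

# Isoeffective Cs-131 dose matching an assumed 145 Gy I-125 prescription:
target = bed(145.0, t_half=59.4, alpha_beta=10.0)
D_cs = brentq(lambda D: bed(D, t_half=9.7, alpha_beta=10.0) - target, 1.0, 400.0)
print(f"Cs-131 dose ≈ {D_cs:.0f} Gy, conversion factor ≈ {D_cs / 145.0:.2f}")
```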

  19. Application of Formal Methods in Software Engineering

    Directory of Open Access Journals (Sweden)

    Adriana Morales

    2011-12-01

    The purpose of this research work is to examine: (1) why formal methods are necessary for software systems today, (2) high-integrity systems through the C-by-C (Correctness-by-Construction) methodology, and (3) an affordable methodology for applying formal methods in software engineering. The research process included reviews of the literature on the Internet and in publications and presentations at events. Among the research results it was found that: (1) there is an increasing dependence of nations, companies, and people on software systems, (2) there is a growing demand on software engineering to increase social trust in software systems, (3) methodologies exist, such as C-by-C, that can provide that level of trust, (4) formal methods constitute a principle of computer science that can be applied in software engineering to perform reliable processes in software development, (5) software users have the responsibility to demand reliable software products, and (6) software engineers have the responsibility to develop reliable software products. Furthermore, it is concluded that: (1) more research is needed to identify and analyze other methodologies and tools that provide processes for applying formal software engineering methods, (2) formal methods provide an unprecedented ability to increase trust in the correctness of software products, and (3) with the development of new methodologies and tools, costs are no longer a disadvantage for the application of formal methods.

  20. Should Student Evaluation of Teaching Play a Significant Role in the Formal Assessment of Dental Faculty? Two Viewpoints: Viewpoint 1: Formal Faculty Assessment Should Include Student Evaluation of Teaching and Viewpoint 2: Student Evaluation of Teaching Should Not Be Part of Formal Faculty Assessment.

    Science.gov (United States)

    Rowan, Susan; Newness, Elmer J; Tetradis, Sotirios; Prasad, Joanne L; Ko, Ching-Chang; Sanchez, Arlene

    2017-11-01

    Student evaluation of teaching (SET) is often used in the assessment of faculty members' job performance and promotion and tenure decisions, but debate over this use of student evaluations has centered on the validity, reliability, and application of the data in assessing teaching performance. Additionally, the fear of student criticism has the potential of influencing course content delivery and testing measures. This Point/Counterpoint article reviews the potential utility of and controversy surrounding the use of SETs in the formal assessment of dental school faculty. Viewpoint 1 supports the view that SETs are reliable and should be included in those formal assessments. Proponents of this opinion contend that SETs serve to measure a school's effectiveness in support of its core mission, are valid measures based on feedback from the recipients of educational delivery, and provide formative feedback to improve faculty accountability to the institution. Viewpoint 2 argues that SETs should not be used for promotion and tenure decisions, asserting that higher SET ratings do not correlate with improved student learning. The advocates of this viewpoint contend that faculty members may be influenced to focus on student satisfaction rather than pedagogy, resulting in grade inflation. They also argue that SETs are prone to gender and racial biases and that SET results are frequently misinterpreted by administrators. Low response rates and monotonic response patterns are other factors that compromise the reliability of SETs.

  1. New product development projects evaluation under time uncertainty

    Directory of Open Access Journals (Sweden)

    Thiago Augusto de Oliveira Silva

    2009-12-01

    The development time is one of the key factors that contribute to the success of new product development. In spite of that, the impact of time uncertainty on development has not been fully explored, as far as decision-support models for evaluating this kind of project are concerned. In this context, the objective of the present paper is to evaluate the development process of new technologies under time uncertainty. We introduce a model which captures this source of uncertainty and develop an algorithm to evaluate projects that combines Monte Carlo simulation and dynamic programming. The novelty in our approach is to thoroughly blend the stochastic time with a formal approach to the problem which preserves the Markov property. We base our model on the distinction between the decision epoch and the stochastic time. We discuss and illustrate the applicability of our model through an empirical example.
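
    As a much-simplified stand-in for the paper's Monte Carlo/dynamic-programming algorithm (our sketch; cash flows, the lognormal duration model, and the single review epoch are all assumptions), the snippet below values a project whose completion time is stochastic and which may be abandoned at a review epoch if still unfinished.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 100_000
T = rng.lognormal(mean=np.log(2.0), sigma=0.5, size=n)   # years to complete
r, payoff, run_cost = 0.10, 100.0, 20.0                  # rate, value, cost/yr

def project_value(review_at):
    # abandon at the review epoch if development is still unfinished
    finished = T <= review_at
    dur = np.where(finished, T, review_at)
    pv = np.where(finished, payoff * np.exp(-r * T), 0.0)
    cost = run_cost * (1 - np.exp(-r * dur)) / r         # discounted costs
    return (pv - cost).mean()

for epoch in [1.0, 2.0, 3.0, 5.0]:
    print(f"review at {epoch} yr: value = {project_value(epoch):6.2f}")
```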

  2. A Comparison of Participation Patterns in Selected Formal, Non-Formal, and Informal Online Learning Environments

    Science.gov (United States)

    Schwier, Richard A.; Seaton, J. X.

    2013-01-01

    Does learner participation vary depending on the learning context? Are there characteristic features of participation evident in formal, non-formal, and informal online learning environments? Six online learning environments were chosen as epitomes of formal, non-formal, and informal learning contexts and compared. Transcripts of online…

  3. Methodologies for uncertainty analysis in the level 2 PSA and their implementation procedures

    International Nuclear Information System (INIS)

    Ahn, Kwang Il; Yang, Joon Eun; Kim, Dong Ha

    2002-04-01

    The main purpose of this report is to present standardized methodologies for uncertainty analysis in the Level 2 Probabilistic Safety Assessment (PSA) and their implementation procedures, based on results obtained through a critical review of the existing methodologies for the analysis of uncertainties employed in the Level 2 PSA, especially the Accident Progression Event Tree (APET). Uncertainties employed in the Level 2 PSA are quantitative expressions of the overall knowledge of the analysts and experts participating in the probabilistic quantification of phenomenological accident progressions ranging from core melt to containment failure; their numerical values are directly related to the degree of confidence that the analyst has that a given phenomenological event or accident process will or will not occur, i.e., the analyst's subjective probabilities of occurrence. The results obtained from the Level 2 PSA uncertainty analysis become an essential contributor to the plant risk, in addition to the Level 1 PSA and Level 3 PSA uncertainties. The uncertainty analysis methodologies and their implementation procedures presented in this report were prepared based on the following criterion: 'the uncertainty quantification process must be logical, scrutable, complete, consistent and in an appropriate level of detail, as mandated by the Level 2 PSA objectives'. For the aforementioned purpose, this report deals mainly with (1) a summary of general and Level 2 PSA-specific uncertainty analysis methodologies, (2) the selection of phenomenological branch events for uncertainty analysis in the APET, a methodology for the quantification of APET uncertainty inputs, and its implementation procedure, (3) the statistical propagation of uncertainty inputs through the APET and its implementation procedure, and (4) a formal procedure for the quantification of APET uncertainties and source term categories (STCs) through the Level 2 PSA quantification codes.
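
    A toy version of the statistical propagation step (our illustration; the miniature tree and the beta distributions for the branch probabilities are invented) can be written in a few lines: sample each uncertain branch probability, push the samples through the tree, and summarize the resulting distribution.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 50_000
# analyst's subjective distributions for phenomenological branch points
p_arrest_fails = rng.beta(8, 2, n)    # in-vessel melt arrest fails
p_early_cf = rng.beta(2, 18, n)       # early containment failure
p_late_cf = rng.beta(5, 15, n)        # late failure, given no early failure

# probability of a containment-failure source term category per core melt
p_cf = p_arrest_fails * (p_early_cf + (1 - p_early_cf) * p_late_cf)
lo, hi = np.percentile(p_cf, [5, 95])
print(f"mean {p_cf.mean():.3f}, 5th {lo:.3f}, 95th {hi:.3f}")
```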

  4. Improving Learner Outcomes in Lifelong Education: Formal Pedagogies in Non-Formal Learning Contexts?

    Science.gov (United States)

    Zepke, Nick; Leach, Linda

    2006-01-01

    This article explores how far research findings about successful pedagogies in formal post-school education might be used in non-formal learning contexts--settings where learning may not lead to formal qualifications. It does this by examining a learner outcomes model adapted from a synthesis of research into retention. The article first…

  5. Chemical model reduction under uncertainty

    KAUST Repository

    Najm, Habib; Galassi, R. Malpica; Valorani, M.

    2016-01-01

    We outline a strategy for chemical kinetic model reduction under uncertainty. We present highlights of our existing deterministic model reduction strategy, and describe the extension of the formulation to include parametric uncertainty in the detailed mechanism. We discuss the utility of this construction, as applied to hydrocarbon fuel-air kinetics, and the associated use of uncertainty-aware measures of error between predictions from detailed and simplified models.

  6. Chemical model reduction under uncertainty

    KAUST Repository

    Najm, Habib

    2016-01-05

    We outline a strategy for chemical kinetic model reduction under uncertainty. We present highlights of our existing deterministic model reduction strategy, and describe the extension of the formulation to include parametric uncertainty in the detailed mechanism. We discuss the utility of this construction, as applied to hydrocarbon fuel-air kinetics, and the associated use of uncertainty-aware measures of error between predictions from detailed and simplified models.

  7. Evaluating prediction uncertainty

    International Nuclear Information System (INIS)

    McKay, M.D.

    1995-03-01

    The probability distribution of a model prediction is presented as a proper basis for evaluating the uncertainty in a model prediction that arises from uncertainty in input values. Determination of important model inputs and subsets of inputs is made through comparison of the prediction distribution with conditional prediction probability distributions. Replicated Latin hypercube sampling and variance ratios are used in estimation of the distributions and in construction of importance indicators. The assumption of a linear relation between model output and inputs is not necessary for the indicators to be effective. A sequential methodology which includes an independent validation step is applied in two analysis applications to select subsets of input variables which are the dominant causes of uncertainty in the model predictions. Comparison with results from methods which assume linearity shows how those methods may fail. Finally, suggestions for treating structural uncertainty for submodels are presented
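
    The following minimal construction (ours, not McKay's exact procedure) shows the two ingredients the abstract names: a Latin hypercube design, in which each input is stratified into equiprobable intervals that are randomly permuted, and a variance-ratio importance indicator Var(E[Y|X_i])/Var(Y) estimated from binned conditional means.

```python
import numpy as np

rng = np.random.default_rng(3)

def lhs(n, k):
    # one stratified, permuted column per input variable, values in (0, 1)
    cols = [(rng.permutation(n) + rng.random(n)) / n for _ in range(k)]
    return np.column_stack(cols)

X = lhs(1000, 2)
Y = X[:, 0] ** 2 + 0.1 * X[:, 1]   # toy model (assumed)

for i in range(2):
    idx = np.digitize(X[:, i], np.linspace(0, 1, 21)) - 1   # 20 bins
    counts = np.bincount(idx, minlength=20)
    m = np.bincount(idx, weights=Y, minlength=20) / counts
    ratio = np.var(m[idx]) / np.var(Y)   # variance-ratio importance
    print(f"input {i}: Var(E[Y|X_i])/Var(Y) ≈ {ratio:.2f}")
```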

  8. Improved pion pion scattering amplitude from dispersion relation formalism

    International Nuclear Information System (INIS)

    Cavalcante, I.P.; Coutinho, Y.A.; Borges, J. Sa

    2005-01-01

    The pion-pion scattering amplitude is obtained from Chiral Perturbation Theory at the one- and two-loop approximations. The dispersion relation formalism provides a more economical method, which was proved to reproduce the analytical structure of that amplitude at both approximation levels. This work extends the use of the formalism in order to compute further unitarity corrections to partial waves, including the D-wave amplitude. (author)

  9. Informal work and formal plans

    DEFF Research Database (Denmark)

    Dalsted, Rikke Juul; Hølge-Hazelton, Bibi; Kousgaard, Marius Brostrøm

    2012-01-01

    trajectories. METHODS AND THEORY: An in-depth case study of patient trajectories at a Danish hospital and surrounding municipality, using individual interviews with patients. Theory about trajectory and work by Strauss was included. RESULTS: Patients continuously took initiatives to organize their treatment and care. They initiated processes in the trajectories, and acquired information, which they used to form their trajectories. Patients presented problems to the healthcare professionals in order to get proper help when needed. DISCUSSION: Work done by patients was invisible and not perceived as work. The patients' requests were not sufficiently supported in the professional organisation of work or formal planning. Patients' insertion and use of information in their trajectories challenged professional views and working processes. And the design of the formal pathway models limits the patients' active…

  10. Proceedings of the First NASA Formal Methods Symposium

    Science.gov (United States)

    Denney, Ewen (Editor); Giannakopoulou, Dimitra (Editor); Pasareanu, Corina S. (Editor)

    2009-01-01

    Topics covered include: Model Checking - My 27-Year Quest to Overcome the State Explosion Problem; Applying Formal Methods to NASA Projects: Transition from Research to Practice; TLA+: Whence, Wherefore, and Whither; Formal Methods Applications in Air Transportation; Theorem Proving in Intel Hardware Design; Building a Formal Model of a Human-Interactive System: Insights into the Integration of Formal Methods and Human Factors Engineering; Model Checking for Autonomic Systems Specified with ASSL; A Game-Theoretic Approach to Branching Time Abstract-Check-Refine Process; Software Model Checking Without Source Code; Generalized Abstract Symbolic Summaries; A Comparative Study of Randomized Constraint Solvers for Random-Symbolic Testing; Component-Oriented Behavior Extraction for Autonomic System Design; Automated Verification of Design Patterns with LePUS3; A Module Language for Typing by Contracts; From Goal-Oriented Requirements to Event-B Specifications; Introduction of Virtualization Technology to Multi-Process Model Checking; Comparing Techniques for Certified Static Analysis; Towards a Framework for Generating Tests to Satisfy Complex Code Coverage in Java Pathfinder; jFuzz: A Concolic Whitebox Fuzzer for Java; Machine-Checkable Timed CSP; Stochastic Formal Correctness of Numerical Algorithms; Deductive Verification of Cryptographic Software; Coloured Petri Net Refinement Specification and Correctness Proof with Coq; Modeling Guidelines for Code Generation in the Railway Signaling Context; Tactical Synthesis Of Efficient Global Search Algorithms; Towards Co-Engineering Communicating Autonomous Cyber-Physical Systems; and Formal Methods for Automated Diagnosis of Autosub 6000.

  11. The fuzzy set theory application to the analysis of accident progression event trees with phenomenological uncertainty issues

    International Nuclear Information System (INIS)

    Chun, Moon-Hyun; Ahn, Kwang-Il

    1991-01-01

    Fuzzy set theory provides a formal framework for dealing with the imprecision and vagueness inherent in expert judgement, and therefore it can be used for more effective analysis of the accident progression of PRA, where expert opinion is a major means for quantifying some event probabilities and uncertainties. In this paper, an example application of fuzzy set theory is first made to a simple portion of a given accident progression event tree with typical qualitative fuzzy input data, and thereby computational algorithms suitable for the application of fuzzy set theory to accident progression event tree analysis are identified and illustrated with example applications. Then the procedure used in the simple example is extended to extremely complex accident progression event trees with a number of phenomenological uncertainty issues, i.e., a typical plant damage state 'SEC' of the Zion Nuclear Power Plant risk assessment. The results show that the fuzzy averages of the fuzzy outcomes are very close to the mean values obtained by current methods. The main purpose of this paper is to provide a formal procedure for the application of fuzzy set theory to accident progression event trees with imprecise and qualitative branch probabilities and/or with a number of phenomenological uncertainty issues. (author)
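
    A standard way to propagate such fuzzy branch probabilities is alpha-cut interval arithmetic; the sketch below (our illustration; the two triangular fuzzy numbers are assumed, not taken from the Zion study) bounds the probability of one event-tree path at several membership levels.

```python
import numpy as np

def tri_cut(a, m, b, alpha):
    # alpha-cut interval [left, right] of a triangular fuzzy number (a, m, b)
    return a + alpha * (m - a), b - alpha * (b - m)

for alpha in np.linspace(0.0, 1.0, 6):
    l1, h1 = tri_cut(0.10, 0.20, 0.40, alpha)   # fuzzy P(phenomenon A)
    l2, h2 = tri_cut(0.30, 0.50, 0.60, alpha)   # fuzzy P(B | A)
    # endpoint products bound the path probability, since multiplication
    # is monotone for non-negative probabilities
    print(f"alpha={alpha:.1f}: path probability in [{l1*l2:.3f}, {h1*h2:.3f}]")
```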

  12. Beyond formalism

    Science.gov (United States)

    Denning, Peter J.

    1991-01-01

    The ongoing debate over the role of formalism and formal specifications in software features many speakers with diverse positions. Yet, in the end, they share the conviction that the requirements of a software system can be unambiguously specified, that acceptable software is a product demonstrably meeting the specifications, and that the design process can be carried out with little interaction between designers and users once the specification has been agreed to. This conviction is part of a larger paradigm prevalent in American management thinking, which holds that organizations are systems that can be precisely specified and optimized. This paradigm, which traces historically to the works of Frederick Taylor in the early 1900s, is no longer sufficient for organizations and software systems today. In the domain of software, a new paradigm, called user-centered design, overcomes the limitations of pure formalism. Pioneered in Scandinavia, user-centered design is spreading through Europe and is beginning to make its way into the U.S.

  13. Uncertainties in the Norwegian greenhouse gas emission inventory

    Energy Technology Data Exchange (ETDEWEB)

    Flugsrud, Ketil; Hoem, Britta

    2011-11-15

    The national greenhouse gas (GHG) emission inventory is compiled from estimates based on emission factors and activity data and from direct measurements by plants. All these data and parameters contribute to the overall inventory uncertainty. The uncertainties and probability distributions of the inventory input parameters have been assessed based on available data and expert judgements. Finally, the level and trend uncertainties of the national GHG emission inventory have been estimated using Monte Carlo simulation. The methods used in the analysis correspond to an IPCC tier 2 method, as described in the IPCC Good Practice Guidance (IPCC 2000). Analyses have been made both excluding and including the sector LULUCF (land use, land-use change and forestry). The uncertainty analysis performed in 2011 is an update of the uncertainty analyses performed for the greenhouse gas inventory in 2006 and 2000. During the project we have been in contact with experts and have collected information about uncertainty from them. The main focus has been on the source categories where changes have occurred since the last uncertainty analysis was performed in 2006. This includes new methodology for several source categories (for example for solvents and road traffic) as well as revised uncertainty estimates. For the installations included in the emission trading system, new information from the annual ETS reports about uncertainty in activity data and the CO2 emission factor (and the N2O emission factor for nitric acid production) has been used. This has improved the quality of the uncertainty estimates for the energy and manufacturing sectors. The results show that the uncertainty level in the total calculated greenhouse gas emissions for 2009 is around 4 per cent. When including the LULUCF sector, the total uncertainty is around 17 per cent in 2009. The uncertainty estimate is lower now than previous analyses have shown. This is partly due to the considerable work made to improve
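
    An IPCC tier 2 level-uncertainty calculation of the kind described reduces to Monte Carlo sampling of each source's activity data and emission factor; the toy two-source sketch below is ours, with all figures and uncertainty ranges invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(11)
n = 100_000
ad1 = rng.normal(1000.0, 30.0, n)            # activity data, source 1 (3% sd)
ef1 = rng.lognormal(np.log(2.5), 0.10, n)    # emission factor, source 1
ad2 = rng.normal(400.0, 40.0, n)             # activity data, source 2 (10% sd)
ef2 = rng.lognormal(np.log(1.2), 0.30, n)    # emission factor, source 2

total = ad1 * ef1 + ad2 * ef2                # total emissions per trial
lo, hi = np.percentile(total, [2.5, 97.5])
print(f"level uncertainty ≈ ±{100 * (hi - lo) / (2 * total.mean()):.1f}%")
```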

  14. Integrated formal operations plan

    Energy Technology Data Exchange (ETDEWEB)

    Cort, G.; Dearholt, W.; Donahue, S.; Frank, J.; Perkins, B.; Tyler, R.; Wrye, J.

    1994-01-05

    The concept of formal operations (that is, a collection of business practices to assure effective, accountable operations) has vexed the Laboratory for many years. To date most attempts at developing such programs have been based upon rigid, compliance-based interpretations of a veritable mountain of Department of Energy (DOE) orders, directives, notices, and standards. These DOE dictates seldom take the broad view but focus on highly specialized programs isolated from the overall context of formal operations. The result is a confusing array of specific, and often contradictory, requirements that produce a patchwork of overlapping niche programs. This unnecessary duplication wastes precious resources, dramatically increases the complexity of our work processes, and communicates a sense of confusion to our customers and regulators. Coupled with the artificial divisions that have historically existed among the Laboratory's formal operations organizations (quality assurance, configuration management, records management, training, etc.), this approach has produced layers of increasingly vague and complex formal operations plans, each of which interprets its parent and adds additional requirements of its own. Organizational gridlock ensues whenever an activity attempts to implement these bureaucratic monstrosities. The purpose of the integrated formal operations plan presented here is to establish a set of requirements that must be met by an integrated formal operations program, to assign responsibilities for implementation and operation of the program, and to specify criteria against which the performance of the program will be measured. The accountable line manager specifies the items, processes, and information (the controlled elements) to which the specified formal operations program applies. The formal operations program is implemented using a graded approach based on the level of importance of the various controlled elements and the scope of the activities in which they are involved.

  15. Real-Time Flood Control by Tree-Based Model Predictive Control Including Forecast Uncertainty: A Case Study Reservoir in Turkey

    Directory of Open Access Journals (Sweden)

    Gökçen Uysal

    2018-03-01

    Optimal control of reservoirs is a challenging task due to conflicting objectives, complex system structure, and uncertainties in the system. Real-time control decisions suffer from streamflow forecast uncertainty. This study aims to use Probabilistic Streamflow Forecasts (PSFs) with lead times up to 48 h as input for the recurrent reservoir operation problem. A related technique for decision making is multi-stage stochastic optimization using scenario trees, referred to as Tree-based Model Predictive Control (TB-MPC). Deterministic Streamflow Forecasts (DSFs) are provided by applying random perturbations to perfect data. PSFs are synthetically generated from DSFs by a new approach which explicitly represents the dynamic evolution of uncertainty. We assessed different variables in the generation of stochasticity and compared the results using different scenarios. The developed real-time hourly flood control was applied to a test case which had limited reservoir storage and restricted downstream conditions. According to the results of hindcasting closed-loop experiments, TB-MPC outperforms its deterministic counterpart in terms of decreased downstream flood risk under different independent forecast scenarios. TB-MPC was also tested with different numbers of tree branches, forecast horizons, and inflow conditions. We conclude that using synthetic PSFs in TB-MPC can provide more robust solutions against forecast uncertainty through the resolution of uncertainty in the trees.
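
    The defining feature of tree-based schemes is that all scenarios share the first-stage decision while later stages may branch. The sketch below is a drastically simplified stand-in (ours, not the paper's TB-MPC formulation): a common first-stage release is chosen by grid search against an ensemble of inflow scenarios, with a greedy scenario-specific recourse rule afterwards; all numbers and the penalty function are invented.

```python
import numpy as np

rng = np.random.default_rng(5)
scen = rng.gamma(3.0, 40.0, size=(8, 6))   # 8 inflow scenarios, 6 steps
# storage and flows in consistent per-step volume units (illustrative)
S0, S_MAX, Q_SAFE, U_MAX = 600.0, 900.0, 150.0, 300.0

def expected_cost(u0):
    total = 0.0
    for inflow in scen:
        s = S0
        for t, q in enumerate(inflow):
            # shared first-stage release, then greedy recourse per scenario
            u = u0 if t == 0 else min(U_MAX, max(0.0, s + q - 0.8 * S_MAX))
            s = s + q - u
            # penalize overtopping storage and exceeding safe channel flow
            total += max(0.0, s - S_MAX)**2 + max(0.0, u - Q_SAFE)**2
    return total / len(scen)

grid = np.linspace(0.0, U_MAX, 61)
u_best = grid[np.argmin([expected_cost(u) for u in grid])]
print(f"first-stage release: {u_best:.0f}")
```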

  16. Funnel plot control limits to identify poorly performing healthcare providers when there is uncertainty in the value of the benchmark.

    Science.gov (United States)

    Manktelow, Bradley N; Seaton, Sarah E; Evans, T Alun

    2016-12-01

    There is an increasing use of statistical methods, such as funnel plots, to identify poorly performing healthcare providers. Funnel plots comprise the construction of control limits around a benchmark, and providers with outcomes falling outside the limits are investigated as potential outliers. The benchmark is usually estimated from observed data, but uncertainty in this estimate is usually ignored when constructing control limits. In this paper, the use of funnel plots in the presence of uncertainty in the value of the benchmark is reviewed for outcomes from a Binomial distribution. Two methods to derive the control limits are shown: (i) prediction intervals; (ii) tolerance intervals. Tolerance intervals formally include the uncertainty in the value of the benchmark, while prediction intervals do not. The probability properties of 95% control limits derived using each method were investigated through hypothesised scenarios. Neither prediction intervals nor tolerance intervals produce funnel plot control limits that satisfy the nominal probability characteristics when there is uncertainty in the value of the benchmark. This is not necessarily to say that funnel plots have no role to play in healthcare, but that without the development of intervals satisfying the nominal probability characteristics they must be interpreted with care. © The Author(s) 2014.
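
    For a binomial outcome, prediction-interval-style limits around a fixed benchmark proportion p0 can be computed directly from the binomial quantile function. The sketch below (our illustration; it treats p0 as known and exact, which is precisely the simplification the paper questions, and it omits the interpolation refinements used in practice) shows how the limits narrow as provider volume n grows.

```python
from scipy.stats import binom

p0 = 0.10   # assumed benchmark event proportion
for n in [50, 100, 200, 500, 1000]:
    lo = binom.ppf(0.025, n, p0) / n   # lower 95% control limit
    hi = binom.ppf(0.975, n, p0) / n   # upper 95% control limit
    print(f"n={n:4d}: limits = ({lo:.3f}, {hi:.3f})")
```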

  17. Rapid research and implementation priority setting for wound care uncertainties.

    Directory of Open Access Journals (Sweden)

    Trish A Gray

    People with complex wounds are more likely to be elderly, living with multimorbidity and wound-related symptoms. A variety of products are available for managing complex wounds and a range of healthcare professionals are involved in wound care, yet there is a lack of good evidence to guide practice and services. These factors create uncertainty for those who deliver and those who manage wound care. Formal priority setting for research and implementation topics is needed to more accurately target the gaps in treatment and services. We solicited practitioner and manager uncertainties in wound care and held a priority setting workshop to facilitate a collaborative approach to prioritising wound care-related uncertainties. We recruited healthcare professionals who regularly cared for patients with complex wounds, were wound care specialists or managed wound care services. Participants submitted up to five wound care uncertainties in consultation with their colleagues, via an online survey, and attended a priority setting workshop. Submitted uncertainties were collated, sorted and categorised according to professional group. On the day of the workshop, participants were divided into four groups depending on their profession. Uncertainties submitted by their professional group were viewed, discussed and amended, prior to the first of three individual voting rounds. Participants cast up to ten votes for the uncertainties they judged as being high priority. Continuing in the professional groups, the top 10 uncertainties from each group were displayed, and the process was repeated. Groups were then brought together for a plenary session in which the final priorities were individually scored on a scale of 0-10 by participants. Priorities were ranked and results presented. Nominal group technique was used for generating the final uncertainties, voting and discussions. Thirty-three participants attended the workshop comprising 10 specialist nurses, 10 district…

  18. Rapid research and implementation priority setting for wound care uncertainties

    Science.gov (United States)

    Dumville, Jo C.; Christie, Janice; Cullum, Nicky A.

    2017-01-01

    Introduction People with complex wounds are more likely to be elderly, living with multimorbidity and wound-related symptoms. A variety of products are available for managing complex wounds and a range of healthcare professionals are involved in wound care, yet there is a lack of good evidence to guide practice and services. These factors create uncertainty for those who deliver and those who manage wound care. Formal priority setting for research and implementation topics is needed to more accurately target the gaps in treatment and services. We solicited practitioner and manager uncertainties in wound care and held a priority setting workshop to facilitate a collaborative approach to prioritising wound care-related uncertainties. Methods We recruited healthcare professionals who regularly cared for patients with complex wounds, were wound care specialists or managed wound care services. Participants submitted up to five wound care uncertainties in consultation with their colleagues, via an online survey, and attended a priority setting workshop. Submitted uncertainties were collated, sorted and categorised according to professional group. On the day of the workshop, participants were divided into four groups depending on their profession. Uncertainties submitted by their professional group were viewed, discussed and amended, prior to the first of three individual voting rounds. Participants cast up to ten votes for the uncertainties they judged as being high priority. Continuing in the professional groups, the top 10 uncertainties from each group were displayed, and the process was repeated. Groups were then brought together for a plenary session in which the final priorities were individually scored on a scale of 0–10 by participants. Priorities were ranked and results presented. Nominal group technique was used for generating the final uncertainties, voting and discussions. Results Thirty-three participants attended the workshop comprising 10 specialist nurses…

  19. Treatment and reporting of uncertainties for environmental radiation measurements

    International Nuclear Information System (INIS)

    Colle, R.

    1980-01-01

    Recommendations for a practical and uniform method for treating and reporting uncertainties in environmental radiation measurements data are presented. The method requires that each reported measurement result include the value, a total propagated random uncertainty expressed as the standard deviation, and a combined overall uncertainty. The uncertainty assessment should be based on as nearly a complete assessment as possible and should include every conceivable or likely source of inaccuracy in the result. Guidelines are given for estimating random and systematic uncertainty components, and for propagating and combining them to form an overall uncertainty
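
    The recommended reporting format (value, propagated random uncertainty as one standard deviation, and a combined overall uncertainty) can be illustrated with a simple counting measurement; the sketch below is ours, with the counts and the 5% systematic bound assumed for illustration.

```python
import numpy as np

gross, bkg, t = 5200.0, 1800.0, 600.0       # gross counts, background counts, s
rate = (gross - bkg) / t                    # net count rate, s^-1
u_random = np.sqrt(gross + bkg) / t         # Poisson counting, 1 standard deviation
u_systematic = 0.05 * rate                  # e.g. calibration bound (assumed 5%)
u_overall = np.sqrt(u_random**2 + u_systematic**2)   # combined in quadrature
print(f"rate = {rate:.3f} ± {u_random:.3f} (random), ± {u_overall:.3f} (overall) s^-1")
```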

  20. NON-FORMAL EDUCATION WITHIN THE FUNCTION OF RESPONSIBLE PARENTING

    Directory of Open Access Journals (Sweden)

    Dragana Bogavac

    2017-06-01

    The aim of this survey was to discover to what degree parental non-formal education is present within the function of responsible parenting. The questionnaire research method was used in the survey. For the purpose of this research a questionnaire of 13 questions was constructed relating to the forms of non-formal education, and another questionnaire of 10 questions relating to the parents' expectations of non-formal education. The sample included 198 parents. Examination of the scores concerning the presence of certain forms of parental non-formal education realized in cooperation with the school leads to the conclusion that the parents possess a positive attitude towards non-formal education. The analysis showed that the parents' expectations were not on a satisfactory level. According to the results, the fathers displayed a greater interest towards non-formal education (7.72±1.35) than the mothers (6.93±1.85) (p<0.05). Unemployed parents had a greater score (7.85±1.30) than the employed parents (7.22±1.71) (p<0.05). A difference in the acceptance of non-formal education in accordance with the level of formal education was also noticeable (p<0.001). Respondents with a high school degree displayed the highest level of acceptance (7.97±0.78), while the lowest interest was seen in respondents with an associate degree (6.41±2.29). Univariate linear regression analysis showed that statistically important predictors were: gender (OR: -0.23 (-1.24 – -0.33), p<0.001), work status (OR: -0.14 (-1.24 – -0.01), p<0.05) and the level of formal education (OR: -0.33 (-0.81 – -0.34), p<0.001). The final results lead to the conclusion that parental non-formal education supports the concept of lifelong education.

  1. Some reflections on uncertainty analysis and management

    International Nuclear Information System (INIS)

    Aven, Terje

    2010-01-01

    A guide to quantitative uncertainty analysis and management in industry has recently been issued. The guide provides an overall framework for uncertainty modelling and characterisations, using probabilities but also other uncertainty representations (including the Dempster-Shafer theory). A number of practical applications showing how to use the framework are presented. The guide is considered as an important contribution to the field, but there is a potential for improvements. These relate mainly to the scientific basis and clarification of critical issues, for example, concerning the meaning of a probability and the concept of model uncertainty. A reformulation of the framework is suggested using probabilities as the only representation of uncertainty. Several simple examples are included to motivate and explain the basic ideas of the modified framework.

  2. Integrating semi-formal and formal requirements

    NARCIS (Netherlands)

    Wieringa, Roelf J.; Olivé, Antoni; Dubois, Eric; Pastor, Joan Antoni; Huyts, Sander

    1997-01-01

    In this paper, we report on the integration of informal, semiformal and formal requirements specification techniques. We present a framework for requirements specification called TRADE, within which several well-known semiformal specification techniques are placed. TRADE is based on an analysis of

  3. Pure spinor formalism as an N = 2 topological string

    International Nuclear Information System (INIS)

    Berkovits, Nathan

    2005-01-01

    Following suggestions of Nekrasov and Siegel, a non-minimal set of fields are added to the pure spinor formalism for the superstring. Twisted ĉ = 3, N = 2 generators are then constructed, where the pure spinor BRST operator is the fermionic spin-one generator, and the formalism is interpreted as a critical topological string. Three applications of this topological string theory include the super-Poincaré covariant computation of multiloop superstring amplitudes without picture-changing operators, the construction of a cubic open superstring field theory without contact-term problems, and a new four-dimensional version of the pure spinor formalism which computes F-terms in the spacetime action.

  4. Unexpected uncertainty, volatility and decision-making

    Directory of Open Access Journals (Sweden)

    Amy Rachel Bland

    2012-06-01

    The study of uncertainty in decision making is receiving greater attention in the fields of cognitive and computational neuroscience. Several lines of evidence are beginning to elucidate different variants of uncertainty. Particularly, risk, ambiguity and expected and unexpected forms of uncertainty are well articulated in the literature. In this article we review both empirical and theoretical evidence arguing for the potential distinction between three forms of uncertainty: expected uncertainty, unexpected uncertainty and volatility. Particular attention will be devoted to exploring the distinction between unexpected uncertainty and volatility, which has been less appreciated in the literature. This includes evidence from computational modelling, neuromodulation, neuroimaging and electrophysiological studies. We further address the possible differentiation of cognitive control mechanisms used to deal with these forms of uncertainty. Particularly we explore a role for conflict monitoring and the temporal integration of information into working memory. Finally, we explore whether the Dual Modes of Control theory provides a theoretical framework for understanding the distinction between unexpected uncertainty and volatility.

  5. Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment. Volume 3, Appendices C, D, E, F, and G

    Energy Technology Data Exchange (ETDEWEB)

    Harper, F.T.; Young, M.L.; Miller, L.A. [Sandia National Labs., Albuquerque, NM (United States)] and others

    1995-01-01

    MACCS and COSYMA, two new probabilistic accident consequence codes whose development was completed in 1990, estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The objective was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation, developed independently, was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model along with the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. The resulting distributions closely replicated the aggregated elicited wet deposition distributions. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the third of a three-volume document describing the project and contains descriptions of the probability assessment principles; the expert identification and selection process; the weighting methods used; the inverse modeling methods; case structures; and summaries of the consequence codes.
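
    As an illustration of the propagation step described in this record (sampling from elicited input distributions and pushing the samples through a deposition model), the following minimal Python sketch propagates an assumed lognormal washout coefficient through a toy exponential depletion model; the distribution, parameter values and model are illustrative assumptions, not the MACCS/COSYMA implementation.

        import numpy as np

        rng = np.random.default_rng(42)

        # Hypothetical elicited distribution for a wet-deposition washout
        # coefficient (1/s); the lognormal parameters are illustrative only.
        washout = rng.lognormal(mean=np.log(1e-4), sigma=0.8, size=10_000)

        # Toy depletion model: fraction of the plume deposited after rain
        # of duration t seconds.
        t = 1800.0
        deposited = 1.0 - np.exp(-washout * t)

        # Summarise the propagated output distribution.
        print(f"median = {np.median(deposited):.3f}")
        print(f"5th-95th percentile = {np.percentile(deposited, 5):.3f}"
              f" - {np.percentile(deposited, 95):.3f}")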

  6. Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment. Volume 3, Appendices C, D, E, F, and G

    International Nuclear Information System (INIS)

    Harper, F.T.; Young, M.L.; Miller, L.A.

    1995-01-01

    MACCS and COSYMA, two new probabilistic accident consequence codes whose development was completed in 1990, estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The objective was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation, developed independently, was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model along with the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. The resulting distributions closely replicated the aggregated elicited wet deposition distributions. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the third of a three-volume document describing the project and contains descriptions of the probability assessment principles; the expert identification and selection process; the weighting methods used; the inverse modeling methods; case structures; and summaries of the consequence codes.

  7. Psychologist in non-formal education

    OpenAIRE

    Pavićević Miljana S.

    2011-01-01

    Learning is not limited to school time. It starts at birth and continues throughout the entire life. Non-formal and informal education are equally important as formal education. Any kind of learning outside the traditional school can be called informal. However, it is not easy to define non-formal education, because it is described differently, for example as an education movement, process, or system… Projects and programs implemented under the name of non-formal education are of...

  8. Uncertainty Quantification in Numerical Aerodynamics

    KAUST Repository

    Litvinenko, Alexander

    2017-05-16

    We consider the uncertainty quantification problem in aerodynamic simulations. We identify input uncertainties, classify them, suggest an appropriate statistical model and, finally, estimate the propagation of these uncertainties into the solution (pressure, velocity and density fields as well as the lift and drag coefficients). The deterministic problem under consideration is a compressible transonic Reynolds-averaged Navier-Stokes flow around an airfoil with random/uncertain data. Input uncertainties include: uncertain angle of attack, the Mach number, random perturbations in the airfoil geometry, mesh, shock location, turbulence model and parameters of this turbulence model. This problem requires efficient numerical/statistical methods since it is computationally expensive, especially for the uncertainties caused by random geometry variations, which involve a large number of variables. In the numerical section we compare five methods, including quasi-Monte Carlo quadrature, polynomial chaos with coefficients determined by sparse quadrature, a gradient-enhanced version of Kriging, radial basis functions, and point-collocation polynomial chaos, in terms of their efficiency in estimating statistics of aerodynamic performance under random perturbations of the airfoil geometry [D. Liu et al. '17]. For modeling we used the TAU code, developed at DLR, Germany.
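
    Of the five methods compared, point-collocation polynomial chaos is the easiest to sketch compactly. The following is a minimal example, assuming a single standard-normal input (an angle-of-attack perturbation) and an invented quadratic "lift" response standing in for the TAU solver; for this toy polynomial response the PCE statistics match Monte Carlo exactly.

        import numpy as np
        from math import factorial
        from numpy.polynomial.hermite_e import hermevander

        rng = np.random.default_rng(0)

        # Stand-in for the flow solver: lift as a function of a
        # standard-normal angle-of-attack perturbation (assumption).
        def lift(xi):
            return 1.2 + 0.3 * xi - 0.05 * xi**2

        # Point collocation: least-squares fit of probabilists' Hermite
        # polynomials He_0..He_4 to solver samples at random points.
        xi_train = rng.standard_normal(200)
        V = hermevander(xi_train, 4)
        coeffs, *_ = np.linalg.lstsq(V, lift(xi_train), rcond=None)

        # He_k are orthogonal w.r.t. the standard normal with <He_k^2> = k!,
        # so the mean is c_0 and the variance is sum_{k>=1} c_k^2 * k!.
        mean_pce = coeffs[0]
        std_pce = np.sqrt(sum(c**2 * factorial(k)
                              for k, c in enumerate(coeffs) if k >= 1))

        # Plain Monte Carlo reference.
        xi_mc = rng.standard_normal(100_000)
        print(f"mean: PCE {mean_pce:.4f} vs MC {lift(xi_mc).mean():.4f}")
        print(f"std:  PCE {std_pce:.4f} vs MC {lift(xi_mc).std():.4f}")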

  9. Topical Roots of Formal Dialectic

    NARCIS (Netherlands)

    Krabbe, Erik C. W.

    Formal dialectic has its roots in ancient dialectic. We can trace this influence in Charles Hamblin's book on fallacies, in which he introduced his first formal dialectical systems. Earlier, Paul Lorenzen proposed systems of dialogical logic, which were in fact formal dialectical systems avant la

  10. Lending Policies of Informal, Formal, and Semi-formal Lenders: Evidence from Vietnam

    NARCIS (Netherlands)

    Lensink, B.W.; Pham, T.T.T.

    2007-01-01

    This paper compares lending policies of formal, informal and semiformal lenders with respect to household lending in Vietnam. The analysis suggests that the probability of using formal or semiformal credit increases if borrowers provide collateral, a guarantor and/or borrow for business-related

  11. Simplified propagation of standard uncertainties

    International Nuclear Information System (INIS)

    Shull, A.H.

    1997-01-01

    An essential part of any measurement control program is adequate knowledge of the uncertainties of the measurement system standards. Only with an estimate of the standards' uncertainties can one determine if a standard is adequate for its intended use, or calculate the total uncertainty of the measurement process. Purchased standards usually have estimates of uncertainty on their certificates. However, when standards are prepared and characterized by a laboratory, variance propagation is required to estimate the uncertainty of the standard. Traditional variance propagation typically involves tedious use of partial derivatives, unfriendly software and the availability of statistical expertise. As a result, the uncertainty of prepared standards is often not determined, or is determined incorrectly. For situations meeting stated assumptions, easier shortcut methods of estimation are now available which eliminate the need for partial derivatives and require only a spreadsheet or calculator. A system of simplifying the calculations by dividing them into subgroups of absolute and relative uncertainties is utilized. These methods also incorporate the International Organization for Standardization (ISO) concepts for combining systematic and random uncertainties as published in its Guide to the Expression of Uncertainty in Measurement. Details of the simplified methods and examples of their use are included in the paper.
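
    A minimal sketch of the shortcut the abstract describes: group the uncertainty components of a prepared standard into relative and absolute subgroups, combine each subgroup in quadrature, and then combine the two subgroup totals. The standard (analyte mass m = c·V) and all component values below are invented for illustration, and the components are assumed uncorrelated.

        from math import sqrt

        # Prepared standard: analyte mass m = c * V.
        c, V = 0.50, 100.0        # concentration (g/mL) and volume (mL)
        m = c * V

        # Relative components (dimensionless fractions) add in quadrature.
        rel = [0.002,             # certificate uncertainty of c
               0.001]             # volumetric flask calibration
        u_rel = sqrt(sum(r**2 for r in rel))

        # Absolute components (same unit as m) add in quadrature and are
        # then converted to a relative contribution.
        absolute = [0.02]         # balance repeatability, g
        u_abs_rel = sqrt(sum((a / m)**2 for a in absolute))

        # Combined standard uncertainty of the prepared standard.
        u_total = m * sqrt(u_rel**2 + u_abs_rel**2)
        print(f"m = {m:.2f} g, u = {u_total:.3f} g")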

  12. Hump-shape Uncertainty, Agency Costs and Aggregate Fluctuations

    OpenAIRE

    Lee, Gabriel; Kevin, Salyer; Strobel, Johannes

    2016-01-01

    Previously measured uncertainty shocks using U.S. data show a hump-shaped time path: uncertainty rises for two years before its decline. The current literature on the effects of uncertainty on macroeconomics, including housing, has not accounted for this observation. Consequently, the literature on uncertainty and macroeconomics is divided on the effects and the propagation mechanism of uncertainty on aggregate fluctuations. This paper shows that when uncertainty rises and falls over time, th...

  13. Concepciones acerca de la maternidad en la educación formal y no formal

    Directory of Open Access Journals (Sweden)

    Alvarado Calderón, Kathia

    2005-06-01

    This article presents some results of research carried out at the Instituto de Investigación en Educación (INIE) under the title "Construcción del concepto de maternidad en la educación formal y no formal". It begins with a theoretical analysis of social conceptions of motherhood in Western societies. Using a qualitative research approach, we drew on thematic drawing, interviews and a focus group as techniques for gathering information. In this way we could approach the conceptions of motherhood held by the participants from the formal and non-formal educational settings with whom we worked. A brief summary of the main findings follows. The article concludes with a proposal of future lines of work for the deconstruction of the motherhood concept in formal and informal education contexts.

  14. MODELS OF AIR TRAFFIC CONTROLLERS ERRORS PREVENTION IN TERMINAL CONTROL AREAS UNDER UNCERTAINTY CONDITIONS

    Directory of Open Access Journals (Sweden)

    Volodymyr Kharchenko

    2017-03-01

    Purpose: the aim of this study is to research applied models of air traffic controllers' error prevention in terminal control areas (TMA) under uncertainty conditions. In this work a theoretical framework describing safety events and errors of air traffic controllers connected with operations in the TMA is proposed. Methods: optimisation of the formal description of the terminal control area based on the Threat and Error Management model and the TMA network model of air traffic flows. Results: the human factors variables associated with safety events in the work of air traffic controllers under uncertainty conditions were obtained. Principles for applying the Threat and Error Management model to air traffic controller operations and the TMA network model of air traffic flows were proposed. Discussion: the information processing context for preventing air traffic controller errors is discussed, with examples of threats in the work of air traffic controllers that are relevant for TMA operations under uncertainty conditions.

  15. Formal modelling and analysis of socio-technical systems

    DEFF Research Database (Denmark)

    Probst, Christian W.; Kammüller, Florian; Hansen, Rene Rydhof

    2016-01-01

    Attacks on systems and organisations increasingly exploit human actors, for example through social engineering. This non-technical aspect of attacks complicates their formal treatment and automatic identification. Formalisation of human behaviour is difficult at best, and attacks on socio-technical systems are still mostly identified through brainstorming of experts. In this work we discuss several approaches to formalising socio-technical systems and their analysis. Starting from a flow logic-based analysis of the insider threat, we discuss how to include the socio aspects explicitly, and show a formalisation that proves properties of this formalisation. On the formal side, our work closes the gap between formal and informal approaches to socio-technical systems. On the informal side, we show how to steal a birthday cake from a bakery by social engineering.

  16. Sensitivity functions for uncertainty analysis: Sensitivity and uncertainty analysis of reactor performance parameters

    International Nuclear Information System (INIS)

    Greenspan, E.

    1982-01-01

    This chapter presents the mathematical basis for sensitivity functions, discusses their physical meaning and the information they contain, and clarifies a number of issues concerning their application, including the definition of group sensitivities, the selection of sensitivity functions to be included in the analysis, and limitations of sensitivity theory. Examines the theoretical foundation; criticality reset sensitivities; group sensitivities and uncertainties; selection of sensitivities included in the analysis; and other uses and limitations of sensitivity functions. Gives the theoretical formulation of sensitivity functions pertaining to "as-built" designs for performance parameters of the form of ratios of linear flux functionals (such as reaction-rate ratios), linear adjoint functionals, bilinear functions (such as reactivity worth ratios), and for reactor reactivity. Offers a consistent procedure for reducing energy-dependent or fine-group sensitivities and uncertainties to broad-group sensitivities and uncertainties. Provides illustrations of sensitivity functions as well as references to available compilations of such functions and of total sensitivities. Indicates limitations of sensitivity theory originating from the fact that this theory is based on a first-order perturbation theory.

  17. Does formal research training lead to academic success in otolaryngology?

    Science.gov (United States)

    Bobian, Michael R; Shah, Noor; Svider, Peter F; Hong, Robert S; Shkoukani, Mahdi A; Folbe, Adam J; Eloy, Jean Anderson

    2017-01-01

    To evaluate whether formalized research training is associated with higher research productivity, academic rank, and acquisition of National Institutes of Health (NIH) grants within academic otolaryngology departments. The departmental websites of the 100 civilian otolaryngology programs were analyzed to obtain a comprehensive list of faculty members' credentials and characteristics, including academic rank, completion of a clinical fellowship, completion of a formal research fellowship, and attainment of a doctor of philosophy (PhD) degree. We also recorded measures of scholarly impact and successful acquisition of NIH funding. A total of 1,495 academic physicians were included in our study. Of these, 14.1% had formal research training. Bivariate associations showed that formal research training was associated with a greater h-index, increased probability of acquiring NIH funding, and higher academic rank. Using a linear regression model, we found that otolaryngologists possessing a PhD had an associated h-index 1.8 points higher, and those who completed a formal research fellowship had an h-index 1.6 points higher. A PhD degree or completion of a research fellowship was not associated with a higher academic rank; however, a higher h-index and previous acquisition of an NIH grant were associated with a higher academic rank. The attainment of NIH funding was three times more likely for those with a formal research fellowship and 8.6 times more likely for otolaryngologists with a PhD degree. Formalized research training is associated with academic success in otolaryngology. Such dedicated research training accompanies greater scholarly impact, acquisition of NIH funding, and a higher academic rank. Laryngoscope, 127:E15-E21, 2017. © 2016 The American Laryngological, Rhinological and Otological Society, Inc.

  18. A High-Performance Embedded Hybrid Methodology for Uncertainty Quantification With Applications

    Energy Technology Data Exchange (ETDEWEB)

    Iaccarino, Gianluca

    2014-04-01

    Multiphysics processes modeled by a system of unsteady differential equations are naturally suited for partitioned (modular) solution strategies. We consider such a model where probabilistic uncertainties are present in each module of the system and represented as a set of random input parameters. A straightforward approach to quantifying uncertainties in the predicted solution would be to sample all the input parameters into a single set, and treat the full system as a black box. Although this method is easily parallelizable and requires minimal modifications to the deterministic solver, it is blind to the modular structure of the underlying multiphysical model. On the other hand, using spectral representations such as polynomial chaos expansions (PCE) can provide richer structural information regarding the dynamics of these uncertainties as they propagate from the inputs to the predicted output, but can be prohibitively expensive to implement in the high-dimensional global space of uncertain parameters. Therefore, we investigated hybrid methodologies wherein each module has the flexibility of using sampling or PCE-based methods for capturing local uncertainties while maintaining accuracy in the global uncertainty analysis. For the latter case, we use a conditional PCE model which mitigates the curse of dimension associated with intrusive Galerkin or semi-intrusive pseudospectral methods. After formalizing the theoretical framework, we demonstrate our proposed method using a numerical viscous flow simulation and benchmark the performance against a solely Monte Carlo method and a solely spectral method.

  19. Eliciting geologists' tacit model of the uncertainty of mapped geological boundaries

    Science.gov (United States)

    Lark, R. M.; Lawley, R. S.; Barron, A. J. M.; Aldiss, D. T.; Ambrose, K.; Cooper, A. H.; Lee, J. R.; Waters, C. N.

    2015-01-01

    It is generally accepted that geological linework, such as mapped boundaries, is uncertain for various reasons. It is difficult to quantify this uncertainty directly, because the investigation of error in a boundary at a single location may be costly and time consuming, and many such observations are needed to estimate an uncertainty model with confidence. However, it is also recognized across many disciplines that experts generally have a tacit model of the uncertainty of information that they produce (interpretations, diagnoses, etc.), and formal methods exist to extract this model in usable form by elicitation. In this paper we report a trial in which uncertainty models for mapped boundaries in six geological scenarios were elicited from a group of five experienced geologists. In five cases a consensus distribution was obtained, which reflected both the initial individually elicited distributions and a structured process of group discussion in which individuals revised their opinions. In a sixth case a consensus was not reached. This concerned a boundary between superficial deposits where the geometry of the contact is hard to visualize. The trial showed that the geologists' tacit model of uncertainty in mapped boundaries reflects factors in addition to the cartographic error usually treated by buffering linework or in written guidance on its application. It suggests that further application of elicitation, to scenarios at an appropriate level of generalization, could be useful to provide working error models for the application and interpretation of linework.

  20. Formalized Informal Learning

    DEFF Research Database (Denmark)

    Levinsen, Karin Tweddell; Sørensen, Birgitte Holm

    2013-01-01

    The relation between network society competences, learners' informal learning strategies and ICT in formalized school settings is examined over time. The authors find that aspects of ICT like multimodality, intuitive interaction design and instant feedback invite an informal bricoleur approach. When integrated into certain designs for teaching and learning, this allows for Formalized Informal Learning, and support is found for the building of network society competences.

  1. Uncertainties in risk assessment and decision making

    International Nuclear Information System (INIS)

    Starzec, Peter; Purucker, Tom; Stewart, Robert

    2008-02-01

    confidence interval under different assumptions regarding the data structure. The results stress the importance of invoking statistical methods and also illustrate how the choice of a wrong methodology may affect the quality of risk assessment and the foundations for decision making. The uncertainty in assessing the volume of contaminated soil was shown to depend only to a low extent on the interpolation technique used for the specific case study analyzed. It is, however, expected that the uncertainty may increase significantly if more restrictive risk criteria (a lower guideline value) are applied. Despite a possibly low uncertainty in assessing the contaminated soil volume, the uncertainty in its localization can be substantial. Based on the demo example presented, it emerges that the risk-based input for decisions on soil treatment may vary depending on what assumptions were adopted during the interpolation process. Uncertainty in an ecological exposure model with regard to the movement pattern of a receptor in relation to the spatial distribution of a contaminant has been demonstrated by studies on pronghorn (Antilocapra americana). The results from numerical simulations show that a lack of knowledge of the receptor's movement routes may introduce substantial uncertainty into the exposure assessment. The presented concept is mainly applicable to 'mobile' receptors over relatively large areas. A number of statistical definitions/methods/concepts are presented in the report, some of which are not elaborated in detail; readers are referred to the relevant literature. The main goal of the study has been to shed more light on aspects related to uncertainty in risk assessment and to demonstrate the potential consequences of a wrong approach, rather than to provide readers with formal guidelines and recommendations. However, the outcome of the study will hopefully contribute to further work on novel approaches towards more reliable risk assessments.

  2. An Intelligent Information Retrieval Approach Based on Two Degrees of Uncertainty Fuzzy Ontology

    OpenAIRE

    Maryam Hourali; Gholam Ali Montazer

    2011-01-01

    In spite of the voluminous studies in the field of intelligent retrieval systems, effective retrieval of information has remained an important unsolved problem. Implementations of different conceptual knowledge in the information retrieval process, such as ontology, have been considered as a solution to enhance the quality of results. Furthermore, the conceptual formalism supported by typical ontology may not be sufficient to represent uncertainty information due to the lack of clear-cut ...

  3. On a systematic perspective on risk for formal safety assessment (FSA)

    International Nuclear Information System (INIS)

    Montewka, Jakub; Goerlandt, Floris; Kujala, Pentti

    2014-01-01

    In the maritime domain, risk is evaluated within the framework of the Formal Safety Assessment (FSA), introduced by the International Maritime Organization in 2002. Although the FSA has become an internationally recognized and recommended method, the definition of risk adopted there seems too narrow to reflect the actual content of the FSA. Therefore this article discusses methodological requirements for a risk perspective that is appropriate for risk management in the maritime domain, with special attention to maritime transportation systems. The perspective proposed here considers risk as a set encompassing the following: a set of plausible scenarios leading to an accident, the likelihoods of unwanted events within the scenarios, the consequences of the events, and a description of uncertainty. All these elements are conditional upon the available knowledge (K) about the analyzed system and understanding (N) of the system behavior. Therefore, the quality of K and the level of N of a risk model should be reflected in the uncertainty description. For this purpose we introduce a qualitative scoring system, and we show its applicability on an exemplary risk model for a RoPax ship. - Highlights: • We present a risk perspective for the maritime domain. • A distinction between knowledge and understanding is made. • We describe risk as (Scenario, Consequences, Uncertainty/Knowledge, Understanding). • The perspective highlights the strengths and weaknesses of a given risk analysis.

  4. One-boson exchange model in the Tobocman-Chulick formalism

    International Nuclear Information System (INIS)

    Chulick, G.S.

    1988-01-01

    An alternative method to the standard techniques of field theory for the derivation of few-body dynamical equations is presented here. This new formalism gives rise to a set of coupled, three-dimensional, relativistic equations which represent one or more (coupled-channel) nuclear interactive processes. The particles represented by these equations are dressed and/or composite, with mass and vertex renormalization done in a simple, straightforward manner. The n-boson Tamm-Dancoff approximation is then used to restrict the number of coupled equations to be solved to a reasonable amount. In the one-boson Tamm-Dancoff approximation, the formalism gives rise to relativistic one-boson-exchange time-ordered perturbation theory: i.e., the basic Bonn potential. Moreover, the formalism gives the Bonn potential a firmer theoretical basis, with physical particles, and with mass and vertex renormalization systematically taken into account. The formalism was tested numerically at two levels. First, it was tested for the simple model of elastic scalar NN scattering via the exchange of a single scalar boson. The resultant phase shifts, when compared to those for the Bethe-Salpeter equation and several of its three-dimensional reductions for the same model, were found to be reasonable. Next, the formalism was tested for the same model expanded to include non-elastic NN scattering processes. Even though the resultant scattering cross-sections were not compatible with the empirical scattering cross-sections, it was possible to discern what must be included in the model to obtain qualitative agreement.

  5. The base of the iceberg: informal learning and its impact on formal and non-formal learning

    OpenAIRE

    Rogers, Alan

    2014-01-01

    The author looks at learning (formal, non-formal and informal) and examines the hidden world of informal (unconscious, unplanned) learning. He points out the importance of informal learning for creating tacit attitudes and values, knowledge and skills which influence (conscious, planned) learning - formal and non-formal. Moreover, he explores the implications of informal learning for educational planners and teachers in the context of lifelong learning. While mainly aimed at adult educators, ...

  6. Developing a Treatment Planning Software Based on TG-43U1 Formalism for Cs-137 LDR Brachytherapy.

    Science.gov (United States)

    Sina, Sedigheh; Faghihi, Reza; Soleimani Meigooni, Ali; Siavashpour, Zahra; Mosleh-Shirazi, Mohammad Amin

    2013-08-01

    The old Treatment Planning Systems (TPSs) used for intracavitary brachytherapy with the Cs-137 Selectron source utilize traditional dose calculation methods, considering each source as a point source. Using such methods introduces significant errors in dose estimation. Since 1995, TG-43 has been used as the main dose calculation formalism in TPSs. The purpose of this study is to design and establish treatment planning software for the Cs-137 Selectron brachytherapy source, based on the TG-43U1 formalism, applying the effects of the applicator and dummy spacers. The two software packages used for treatment planning of Cs-137 sources in Iran (STPS and PLATO) are based on old formalisms. The purpose of this work is to establish and develop a TPS for the Selectron source based on the TG-43 formalism. In this planning system, the dosimetry parameters of each pellet in different places inside the applicators were obtained with the MCNP4c code. The dose distribution around every combination of active and inactive pellets was then obtained by summing the doses. The accuracy of this algorithm was checked by comparing its results for special combinations of active and inactive pellets with MC simulations. Finally, the uncertainty of the old dose calculation formalism was investigated by comparing the results of the STPS and PLATO software with those obtained by the new algorithm. For a typical arrangement of 10 active pellets in the applicator, the percentage difference between doses obtained by the new algorithm at 1 cm distance from the tip of the applicator and those obtained by the old formalisms is about 30%, while the difference between the results of MCNP and the new algorithm is less than 5%. According to the results, the old dosimetry formalisms overestimate the dose, especially towards the applicator's tip, while the TG-43U1-based software performs the calculations more accurately.
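
    For reference, the TG-43U1 dose-rate equation that such software implements has the standard line-source form, with S_K the air-kerma strength, Λ the dose-rate constant, G_L the geometry function, g_L the radial dose function and F the 2D anisotropy function:

        \dot{D}(r,\theta) = S_K \, \Lambda \, \frac{G_L(r,\theta)}{G_L(r_0,\theta_0)} \, g_L(r) \, F(r,\theta), \qquad r_0 = 1\ \mathrm{cm}, \ \theta_0 = \pi/2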

  7. Photon scattering from a system of multilevel quantum emitters. I. Formalism

    Science.gov (United States)

    Das, Sumanta; Elfving, Vincent E.; Reiter, Florentin; Sørensen, Anders S.

    2018-04-01

    We introduce a formalism to solve the problem of photon scattering from a system of multilevel quantum emitters. Our approach provides a direct solution of the scattering dynamics. As such the formalism gives the scattered fields' amplitudes in the limit of a weak incident intensity. Our formalism is equipped to treat both multiemitter and multilevel emitter systems, and is applicable to a plethora of photon-scattering problems, including conditional state preparation by photodetection. In this paper, we develop the general formalism for an arbitrary geometry. In the following paper (part II) S. Das et al. [Phys. Rev. A 97, 043838 (2018), 10.1103/PhysRevA.97.043838], we reduce the general photon-scattering formalism to a form that is applicable to one-dimensional waveguides and show its applicability by considering explicit examples with various emitter configurations.

  8. Sensitivity-Uncertainty Techniques for Nuclear Criticality Safety

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Forrest B. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Rising, Michael Evan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Alwin, Jennifer Louise [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-08-07

    The sensitivity and uncertainty analysis course will introduce students to keff sensitivity data, cross-section uncertainty data, how keff sensitivity data and keff uncertainty data are generated and how they can be used. Discussion will include how sensitivity/uncertainty data can be used to select applicable critical experiments, to quantify a defensible margin to cover validation gaps and weaknesses, and in development of upper subcritical limits.
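
    The connection between keff sensitivities and cross-section covariances that such a course builds on is the first-order "sandwich rule". Writing S for the vector of relative sensitivity coefficients S_i = (∂k/k)/(∂σ_i/σ_i) and C for the relative covariance matrix of the cross sections, the relative keff variance is approximately

        \left( \frac{\Delta k}{k} \right)^{2} \simeq S^{\top} C \, S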

  9. SU-F-BRE-14: Uncertainty Analysis for Dose Measurements Using OSLD NanoDots

    Energy Technology Data Exchange (ETDEWEB)

    Kry, S; Alvarez, P; Stingo, F; Followill, D [UT MD Anderson Cancer Center, Houston, TX (United States)

    2014-06-15

    Purpose: Optically stimulated luminescent dosimeters (OSLDs) are increasingly popular dosimeters for research and clinical applications. They are also used by the Radiological Physics Center (RPC) for remote auditing of machine output. In this work we robustly calculated the reproducibility and uncertainty of the OSLD nanoDot. Methods: For the RPC dose calculation, raw readings are corrected for depletion, element sensitivity, fading, linearity, and energy. System calibration is determined for the experimental OSLDs irradiated at different institutions by using OSLDs irradiated by the RPC under reference conditions (i.e., standards): 1 Gy in a Cobalt beam. The intra-dot and inter-dot reproducibilities (coefficients of variation) were determined from the history of RPC readings of these standards. The standard deviation of the corrected OSLD signal was then calculated analytically using a recursive formalism that did not rely on the normality assumption of the underlying uncertainties, or on any type of mathematical approximation. This analytical uncertainty was compared to that empirically estimated from >45,000 RPC beam audits. Results: The intra-dot variability was found to be 0.59%, with only a small variation between readers. Inter-dot variability was found to be 0.85%. The uncertainty in each of the individual correction factors was empirically determined. When the raw counts from each OSLD were adjusted for the appropriate correction factors, the analytically determined coefficient of variation was 1.8% over the range of institutional irradiation conditions seen at the RPC. This is reasonably consistent with the empirical observations of the RPC, where the coefficient of variation of the measured beam outputs is 1.6% (photons) and 1.9% (electrons). Conclusion: OSLD nanoDots provide sufficiently good precision for a wide range of applications, including the RPC remote monitoring program for megavoltage beams. This work was supported by PHS grant CA10953 awarded by
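
    A minimal sketch of the analytic combination step for a multiplicative dose model D = M · k_1 · k_2 · ...: for independent multiplicative factors the relative variances add, so the coefficients of variation combine in quadrature. The intra-dot and inter-dot values below are the ones quoted in the abstract; the correction-factor CVs are invented placeholders.

        from math import sqrt

        cv = {"intra-dot reading":     0.0059,  # from the abstract
              "inter-dot sensitivity": 0.0085,  # from the abstract
              "fading correction":     0.006,   # placeholder
              "linearity correction":  0.008,   # placeholder
              "energy correction":     0.010}   # placeholder

        cv_total = sqrt(sum(v**2 for v in cv.values()))
        print(f"combined CV = {cv_total:.2%}")  # ~1.8% with these inputs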

  10. Non-Formal Educator Use of Evaluation Results

    Science.gov (United States)

    Baughman, Sarah; Boyd, Heather H.; Franz, Nancy K.

    2012-01-01

    Increasing demands for accountability in educational programming have resulted in increasing calls for program evaluation in educational organizations. Many organizations include conducting program evaluations as part of the job responsibilities of program staff. Cooperative Extension is a complex organization offering non-formal educational…

  11. Matching biomedical ontologies based on formal concept analysis.

    Science.gov (United States)

    Zhao, Mengyi; Zhang, Songmao; Li, Weizhuo; Chen, Guowei

    2018-03-19

    The goal of ontology matching is to identify correspondences between entities from different yet overlapping ontologies so as to facilitate semantic integration, reuse and interoperability. As a well-developed mathematical model for analyzing individuals and structuring concepts, Formal Concept Analysis (FCA) has been applied to ontology matching (OM) tasks since the beginning of OM research, whereas the ontological knowledge exploited in FCA-based methods is limited. This motivates the study in this paper, i.e., to empower FCA with as much ontological knowledge as possible for identifying mappings across ontologies. We propose a method based on Formal Concept Analysis to identify and validate mappings across ontologies, including one-to-one mappings, complex mappings and correspondences between object properties. Our method, called FCA-Map, incrementally generates a total of five types of formal contexts and extracts mappings from the lattices derived. First, the token-based formal context describes how class names, labels and synonyms share lexical tokens, leading to lexical mappings (anchors) across ontologies. Second, the relation-based formal context describes how classes are in taxonomic, partonomic and disjoint relationships with the anchors, leading to positive and negative structural evidence for validating the lexical matching. Third, the positive relation-based context can be used to discover structural mappings. Afterwards, the property-based formal context describes how object properties are used in axioms to connect anchor classes across ontologies, leading to property mappings. Last, the restriction-based formal context describes co-occurrence of classes across ontologies in anonymous ancestors of anchors, from which extended structural mappings and complex mappings can be identified. Evaluation on the Anatomy, the Large Biomedical Ontologies, and the Disease and Phenotype track of the 2016 Ontology Alignment Evaluation Initiative campaign
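
    A minimal sketch of the formal-context machinery underlying such a method, using a toy token-based context (objects are class labels from two ontologies, attributes are their lexical tokens). The derivation operators and the brute-force concept enumeration below are generic FCA, not the FCA-Map code, and the three class labels are invented.

        from itertools import chain, combinations

        # Toy token-based formal context.
        context = {
            "O1:HeartValve":   {"heart", "valve"},
            "O2:CardiacValve": {"cardiac", "valve"},
            "O1:MitralValve":  {"mitral", "valve"},
        }
        attributes = set().union(*context.values())

        def common_attrs(objs):   # derivation: objects -> shared attributes
            return set.intersection(*(context[o] for o in objs)) if objs else set(attributes)

        def common_objs(attrs):   # derivation: attributes -> objects having them all
            return {o for o, a in context.items() if attrs <= a}

        def powerset(s):
            return chain.from_iterable(combinations(s, r) for r in range(len(s) + 1))

        # A pair (extent, intent) is a formal concept when each derivation
        # maps one set onto the other; enumerate them by brute force.
        concepts = set()
        for objs in powerset(context):
            intent = frozenset(common_attrs(set(objs)))
            extent = frozenset(common_objs(intent))
            concepts.add((extent, intent))

        for extent, intent in sorted(concepts, key=lambda c: len(c[0])):
            print(sorted(extent), "<->", sorted(intent))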

  12. Joint measurements of spin, operational locality and uncertainty

    International Nuclear Information System (INIS)

    Andersson, E.; Barnett, S.M.; Aspect, A.

    2005-01-01

    Joint measurements of non-commuting observables are possible within quantum mechanics, if one accepts an increase in the variances of the jointly measured observables. In this contribution, we discuss joint measurements of spin 1/2 along any two directions. Starting from an operational locality principle, we show how to obtain the known bound on how sharp the joint measurement can be. Operational locality here means that no operation performed on a quantum system at one location can instantaneously affect a system at another location. The measurement bound is general and is here obtained without reference to any quantum measurement formalism. We find that the bound is formally identical to a Bell inequality of the CHSH type, and we also give a direct interpretation of the measurement bound in terms of an uncertainty relation. A simple way to realise the joint measurement for the case of photon polarization is presented. Further to their fundamental interest, quantum joint measurements of non-commuting observables can be related to state estimation. They are also of interest in quantum information, e.g. as strategies for eavesdropping in quantum cryptography. (author)
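
    The bound in question can be stated compactly. For unsharp spin-1/2 observables along unit vectors a and b with sharpnesses α and β, a joint measurement exists precisely when (in the form usually attributed to Busch; quoted here from memory, so treat the exact normalisation as an assumption)

        |\alpha \mathbf{a} + \beta \mathbf{b}| + |\alpha \mathbf{a} - \beta \mathbf{b}| \le 2

    which indeed has the formal structure of a CHSH-type inequality.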

  13. Uncertainty in Measurement: Procedures for Determining Uncertainty With Application to Clinical Laboratory Calculations.

    Science.gov (United States)

    Frenkel, Robert B; Farrance, Ian

    2018-01-01

    The "Guide to the Expression of Uncertainty in Measurement" (GUM) is the foundational document of metrology. Its recommendations apply to all areas of metrology including metrology associated with the biomedical sciences. When the output of a measurement process depends on the measurement of several inputs through a measurement equation or functional relationship, the propagation of uncertainties in the inputs to the uncertainty in the output demands a level of understanding of the differential calculus. This review is intended as an elementary guide to the differential calculus and its application to uncertainty in measurement. The review is in two parts. In Part I, Section 3, we consider the case of a single input and introduce the concepts of error and uncertainty. Next we discuss, in the following sections in Part I, such notions as derivatives and differentials, and the sensitivity of an output to errors in the input. The derivatives of functions are obtained using very elementary mathematics. The overall purpose of this review, here in Part I and subsequently in Part II, is to present the differential calculus for those in the medical sciences who wish to gain a quick but accurate understanding of the propagation of uncertainties. © 2018 Elsevier Inc. All rights reserved.

  14. Formalized Epistemology, Logic, and Grammar

    Science.gov (United States)

    Bitbol, Michel

    The task of a formal epistemology is defined. It appears that a formal epistemology must be a generalization of "logic" in the sense of Wittgenstein's Tractatus. The generalization is required because, whereas logic presupposes a strict relation between activity and language, this relation may be broken in some domains of experimental enquiry (e.g., in microscopic physics). However, a formal epistemology should also retain a major feature of Wittgenstein's "logic": It must not be a discourse about scientific knowledge, but rather a way of making manifest the structures usually implicit in knowledge-gaining activity. This strategy is applied to the formalism of quantum mechanics.

  15. Use of health effect risk estimates and uncertainty in formal regulatory proceedings: a case study involving atmospheric particulates

    International Nuclear Information System (INIS)

    Habegger, L.J.; Oezkaynak, A.H.

    1984-01-01

    Coal combustion particulates are released to the atmosphere by power plants supplying electrical energy to the nuclear fuel cycle. This paper presents estimates of the public health risks associated with the release of these particulates at a rate associated with the annual nuclear fuel production requirements for a nuclear power plant. Utilization of these risk assessments as a new component in the formal evaluation of total risks from nuclear power plants is discussed. 23 references, 3 tables

  16. Interacting hadron resonance gas model in the K -matrix formalism

    Science.gov (United States)

    Dash, Ashutosh; Samanta, Subhasis; Mohanty, Bedangadas

    2018-05-01

    An extension of the hadron resonance gas (HRG) model is constructed to include interactions using a relativistic virial expansion of the partition function. The noninteracting part of the expansion contains all the stable baryons and mesons, and the interacting part contains all the higher-mass resonances which decay into two stable hadrons. The virial coefficients are related to the phase shifts, which are calculated using the K-matrix formalism in the present work. We have calculated various thermodynamic quantities such as the pressure, energy density, and entropy density of the system. A comparison of thermodynamic quantities with the noninteracting HRG model, calculated using the same number of hadrons, shows that the results of the above formalism are larger. A good agreement between the equation of state calculated in the K-matrix formalism and lattice QCD simulations is observed. Specifically, the lattice QCD calculated interaction measure is well described in our formalism. We have also calculated second-order fluctuations and correlations of conserved charges in the K-matrix formalism. We observe a good agreement of second-order fluctuations and the baryon-strangeness correlation with lattice data below the crossover temperature.
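
    Schematically, the interacting part in such a virial (S-matrix) approach enters through the phase shifts via a Beth-Uhlenbeck-type effective density of states. For a single two-body channel with threshold mass M_th, one common form for the interacting pressure contribution is (written generically; the paper's exact notation and degeneracy factors may differ)

        P_{\mathrm{int}}(T) = \int_{M_{\mathrm{th}}}^{\infty} \frac{dM}{\pi} \sum_{l} (2l+1) \, \frac{d\delta_{l}(M)}{dM} \, P_{\mathrm{ideal}}(M, T)

    with the phase shifts δ_l extracted here from the K-matrix.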

  17. Probability of Loss of Assured Safety in Systems with Multiple Time-Dependent Failure Modes: Incorporation of Delayed Link Failure in the Presence of Aleatory Uncertainty.

    Energy Technology Data Exchange (ETDEWEB)

    Helton, Jon C. [Arizona State Univ., Tempe, AZ (United States); Brooks, Dusty Marie [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Sallaberry, Cedric Jean-Marie. [Engineering Mechanics Corp. of Columbus, OH (United States)

    2018-02-01

    Probability of loss of assured safety (PLOAS) is modeled for weak link (WL)/strong link (SL) systems in which one or more WLs or SLs could potentially degrade into a precursor condition to link failure that will be followed by an actual failure after some amount of elapsed time. The following topics are considered: (i) Definition of precursor occurrence time cumulative distribution functions (CDFs) for individual WLs and SLs, (ii) Formal representation of PLOAS with constant delay times, (iii) Approximation and illustration of PLOAS with constant delay times, (iv) Formal representation of PLOAS with aleatory uncertainty in delay times, (v) Approximation and illustration of PLOAS with aleatory uncertainty in delay times, (vi) Formal representation of PLOAS with delay times defined by functions of link properties at occurrence times for failure precursors, (vii) Approximation and illustration of PLOAS with delay times defined by functions of link properties at occurrence times for failure precursors, and (viii) Procedures for the verification of PLOAS calculations for the three indicated definitions of delayed link failure.
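
    A minimal Monte Carlo sketch of the delayed-failure construction: sample a precursor occurrence time for one WL and one SL, add an aleatory delay to obtain the actual failure times, and estimate PLOAS as the probability that the SL fails before the WL. The Weibull and lognormal choices and all parameters are invented for illustration; the report's formal representations are exact integrals, not this sampling approximation.

        import numpy as np

        rng = np.random.default_rng(7)
        n = 200_000

        # Precursor occurrence times (assumed Weibull, arbitrary time units).
        wl_precursor = 1.0 * rng.weibull(2.0, size=n)   # weak link
        sl_precursor = 1.2 * rng.weibull(2.0, size=n)   # strong link

        # Aleatory delay between precursor condition and actual failure.
        wl_delay = rng.lognormal(np.log(0.10), 0.5, size=n)
        sl_delay = rng.lognormal(np.log(0.10), 0.5, size=n)

        wl_fail = wl_precursor + wl_delay
        sl_fail = sl_precursor + sl_delay

        # PLOAS under the convention "the SL fails before the WL".
        print(f"PLOAS ~= {np.mean(sl_fail < wl_fail):.4f}")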

  18. The role of models in managing the uncertainty of software-intensive systems

    International Nuclear Information System (INIS)

    Littlewood, Bev; Neil, Martin; Ostrolenk, Gary

    1995-01-01

    It is increasingly argued that uncertainty is an inescapable feature of the design and operational behaviour of software-intensive systems. This paper elaborates the role of models in managing such uncertainty, in relation to evidence and claims for dependability. Personal and group models are considered with regard to abstraction, consensus and corroboration. The paper focuses on the predictive property of models, arguing for the need for empirical validation of their trustworthiness through experimentation and observation. The impact on trustworthiness of human fallibility, formality of expression and expressiveness is discussed. The paper identifies two criteria for deciding the degree of trust to be placed in a model, and hence also for choosing between models, namely accuracy and informativeness. Finally, analogy and reuse are proposed as the only means by which empirical evidence can be established for models in software engineering

  19. A systematic framework for effective uncertainty assessment of severe accident calculations; Hybrid qualitative and quantitative methodology

    International Nuclear Information System (INIS)

    Hoseyni, Seyed Mohsen; Pourgol-Mohammad, Mohammad; Tehranifard, Ali Abbaspour; Yousefpour, Faramarz

    2014-01-01

    This paper describes a systematic framework for characterizing important phenomena and quantifying the degree of contribution of each parameter to the output in severe accident uncertainty assessment. The proposed methodology comprises qualitative as well as quantitative phases. The qualitative part, the so-called Modified PIRT, a more robust PIRT process for more precise quantification of uncertainties, is a two-step process for identifying and ranking severe accident phenomena based on uncertainty importance. In this process, identified severe accident phenomena are ranked according to their effect on the figure of merit and their level of knowledge. The Analytic Hierarchy Process (AHP) serves here as a systematic approach for severe accident phenomena ranking (a numerical sketch of this step is given below). A formal uncertainty importance technique is used to estimate the degree of credibility of the severe accident model(s) used to represent the important phenomena; for this step the methodology relies on subjective justification based on evaluating available information, experimental data, and code predictions. The quantitative part utilizes uncertainty importance measures to quantify the effect of each input parameter on the output uncertainty. A response surface fitting approach is proposed for estimating the associated uncertainties at lower calculation cost. The quantitative results are used to plan the reduction of epistemic uncertainty in the output variable(s). The application of the proposed methodology is demonstrated for the ACRR MP-2 severe accident test facility. - Highlights: • A two-stage framework for severe accident uncertainty analysis is proposed. • Modified PIRT qualitatively identifies and ranks uncertainty sources more precisely. • An uncertainty importance measure quantitatively calculates the effect of each uncertainty source. • The methodology is applied successfully to the ACRR MP-2 severe accident test facility.
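
    A minimal sketch of the AHP ranking step referenced above: given a positive reciprocal pairwise-comparison matrix on Saaty's 1-9 scale, the priority weights are the normalised principal eigenvector, with a consistency check on the principal eigenvalue. The 3x3 matrix is an invented example, not the paper's elicitation.

        import numpy as np

        # Pairwise comparisons of three phenomena (illustrative values).
        A = np.array([[1.0, 3.0, 5.0],
                      [1/3, 1.0, 2.0],
                      [1/5, 1/2, 1.0]])

        eigvals, eigvecs = np.linalg.eig(A)
        k = np.argmax(eigvals.real)          # principal eigenvalue
        w = np.abs(eigvecs[:, k].real)
        w /= w.sum()                         # priority weights

        # Consistency ratio (random index RI = 0.58 for n = 3).
        n = A.shape[0]
        ci = (eigvals.real[k] - n) / (n - 1)
        print("weights =", np.round(w, 3), " CR =", round(ci / 0.58, 3))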

  20. Urban drainage models - making uncertainty analysis simple

    DEFF Research Database (Denmark)

    Vezzaro, Luca; Mikkelsen, Peter Steen; Deletic, Ana

    2012-01-01

    There is increasing awareness about uncertainties in the modelling of urban drainage systems and, as such, many new methods for uncertainty analyses have been developed. Despite this, all available methods have limitations which restrict their widespread application among practitioners. Here... ...in each measured/observed datapoint, an issue which is commonly overlooked in the uncertainty analysis of urban drainage models. This comparison allows the user to intuitively estimate the optimum number of simulations required to conduct uncertainty analyses. The output of the method includes parameter...

  1. New procedure for departure formalities

    CERN Multimedia

    HR & GS Departments

    2011-01-01

    As part of the process of simplifying procedures and rationalising administrative processes, the HR and GS Departments have introduced new personalised departure formalities on EDH. These new formalities have applied to students leaving CERN since last year and from 17 October 2011 this procedure will be extended to the following categories of CERN personnel: Staff members, Fellows and Associates. It is planned to extend this electronic procedure to the users in due course. What purpose do departure formalities serve? The departure formalities are designed to ensure that members of the personnel contact all the relevant services in order to return any necessary items (equipment, cards, keys, dosimeter, electronic equipment, books, etc.) and are aware of all the benefits to which they are entitled on termination of their contract. The new departure formalities on EDH have the advantage of tailoring the list of services that each member of the personnel must visit to suit his individual contractual and p...

  2. State Token Petri Net modeling method for formal verification of computerized procedure including operator's interruptions of procedure execution flow

    International Nuclear Information System (INIS)

    Kim, Yun Goo; Seong, Poong Hyun

    2012-01-01

    The Computerized Procedure System (CPS) is one of the primary operating support systems in the digital Main Control Room. The CPS displays the procedure on the computer screen in the form of a flow chart, and displays plant operating information along with the procedure instructions. It also supports operator decision making by providing a system decision. A procedure flow should be correct and reliable, as an error would lead to operator misjudgement and inadequate control. In this paper we present a modeling method for the CPS that enables formal verification based on Petri nets. The proposed State Token Petri Nets (STPN) also support modeling of a procedure flow that has various interruptions by the operator, according to the plant condition. STPN modeling is compared with Coloured Petri Nets when both are applied to an Emergency Operating Computerized Procedure. A program for converting a Computerized Procedure (CP) to an STPN has also been developed. Formal verification and validation of CPs with STPN increases the safety of a nuclear power plant and provides the digital quality assurance means that are needed as the role and function of the CPS increase.
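
    A minimal sketch of the token-game semantics such verification rests on: places hold tokens, transitions consume and produce them, and an operator "interrupt" transition competes with the normal flow. This is generic Petri-net machinery written for illustration, not the STPN definition from the paper.

        # Places hold token counts; transitions are (consume, produce) maps.
        marking = {"step1": 1, "step2": 0, "done": 0, "interrupted": 0}

        transitions = {
            "proceed":   ({"step1": 1}, {"step2": 1}),
            "complete":  ({"step2": 1}, {"done": 1}),
            "interrupt": ({"step1": 1}, {"interrupted": 1}),  # operator action
        }

        def enabled(name):
            pre, _ = transitions[name]
            return all(marking[p] >= k for p, k in pre.items())

        def fire(name):
            assert enabled(name), f"{name} is not enabled"
            pre, post = transitions[name]
            for p, k in pre.items():
                marking[p] -= k
            for p, k in post.items():
                marking[p] += k

        fire("proceed")
        fire("complete")
        print(marking)               # token has reached "done"
        print(enabled("interrupt"))  # False: flow already left step1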

  3. [How to write an article: formal aspects].

    Science.gov (United States)

    Corral de la Calle, M A; Encinas de la Iglesia, J

    2013-06-01

    Scientific research and the publication of the results of the studies go hand in hand. Exquisite research methods can only be adequately reflected in formal publication with the optimum structure. To ensure the success of this process, it is necessary to follow orderly steps, including selecting the journal in which to publish and following the instructions to authors strictly as well as the guidelines elaborated by diverse societies of editors and other institutions. It is also necessary to structure the contents of the article in a logical and attractive way and to use an accurate, clear, and concise style of language. Although not all the authors are directly involved in the actual writing, elaborating a scientific article is a collective undertaking that does not finish until the article is published. This article provides practical advice about formal and not-so-formal details to take into account when writing a scientific article as well as references that will help readers find more information in greater detail. Copyright © 2012 SERAM. Published by Elsevier España. All rights reserved.

  4. SPATIAL UNCERTAINTY IN LINE-SURFACE INTERSECTIONS WITH APPLICATIONS TO PHOTOGRAMMETRY

    Directory of Open Access Journals (Sweden)

    J. Marshall

    2012-07-01

    The fields of photogrammetry and computer vision routinely use line-surface intersections to determine the point where a line intersects a surface. The object coordinates of the intersection point can be found using standard geometric and numeric algorithms; however, expressing the spatial uncertainty at the intersection point may be challenging, especially when the surface morphology is complex. This paper describes an empirical method to characterize the unknown spatial uncertainty at the intersection point by propagating random errors in the stochastic model using repeated random sampling methods. These methods accommodate complex surface morphology and nonlinearities in the functional model; the penalty is that the resulting probability density function associated with the intersection point may be non-Gaussian in nature. A formal hypothesis test is presented to show that straightforward statistical inference tools are available whether the data is Gaussian or not. The hypothesis test determines whether the computed intersection point is consistent with an externally derived known truth point. A numerical example demonstrates the approach in a photogrammetric setting with a single frame image and a gridded terrain elevation model. The results show that uncertainties produced by the proposed empirical method are intuitive and can be assessed with conventional methods found in textbook hypothesis testing.
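
    A minimal sketch of the repeated-random-sampling idea for a single ray, with Gaussian perturbation of the ray direction standing in for image-measurement error and a flat plane standing in for the terrain model; both are assumptions chosen to keep the example short.

        import numpy as np

        rng = np.random.default_rng(1)

        # Nominal ray: origin o (camera) and unit direction d; surface z = 0.
        o = np.array([0.0, 0.0, 1000.0])
        d = np.array([0.01, 0.02, -1.0])
        d /= np.linalg.norm(d)

        def intersect_plane(o, d):
            t = -o[2] / d[2]          # solve o_z + t * d_z = 0
            return o + t * d

        # Propagate angular noise by resampling the direction.
        sigma = 1e-4                  # rad-scale perturbation, illustrative
        pts = []
        for _ in range(5000):
            dd = d + rng.normal(0.0, sigma, 3)
            dd /= np.linalg.norm(dd)
            pts.append(intersect_plane(o, dd))
        pts = np.array(pts)

        # Empirical ground-point uncertainty (may be non-Gaussian for
        # complex surfaces; here the plane keeps it nearly Gaussian).
        print("mean ground point:", pts.mean(axis=0)[:2])
        print("1-sigma footprint:", pts.std(axis=0)[:2])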

  5. Cultural distance, political risk, or governance quality? Towards a more accurate conceptualization and measurement of external uncertainty in foreign entry mode research

    NARCIS (Netherlands)

    Slangen, A.H.L.; van Tulder, R.J.M.

    2009-01-01

    It is well accepted that multinational enterprises (MNEs) prefer equity joint ventures (JVs) over wholly owned subsidiaries (WOSs) in foreign countries where the formal and informal external environment is highly uncertain. Many entry mode studies have modeled the external uncertainty faced by MNEs

  6. Concepts of formal concept analysis

    Science.gov (United States)

    Žáček, Martin; Homola, Dan; Miarka, Rostislav

    2017-07-01

    The aim of this article is to apply Formal Concept Analysis to the concept of the world. Formal concept analysis (FCA), as a methodology of data analysis, information management and knowledge representation, has the potential to be applied to a variety of linguistic problems. FCA is a mathematical theory of concepts and concept hierarchies that reflects an understanding of 'concept'. Formal concept analysis explicitly formalizes the extension and intension of a concept, and their mutual relationships. A distinguishing feature of FCA is an inherent integration of three components of the conceptual processing of data and knowledge, namely: discovery and reasoning with concepts in data, discovery and reasoning with dependencies in data, and visualization of data, concepts, and dependencies with folding/unfolding capabilities.

  7. Rehabilitating Ex-Offenders Through Non-Formal Education in Lesotho

    Directory of Open Access Journals (Sweden)

    Nomazulu Ngozwana

    2017-03-01

    Full Text Available This paper reports on the rehabilitation of ex-offenders through non-formal education. It examines how non-formal education has addressed the ex-offenders’ adaptive and transformative needs. Using an interpretive paradigm and qualitative approach, individual interviews were conducted with five ex-offenders who were chosen through purposive and snowball sampling. Qualitative data analysis was used to generate the themes from the data. The findings revealed that ex-offenders were taught basic literacy and life skills through non-formal education. Moreover, non-formal education facilitated the ex-offenders’ transformed attitudes, including recognizing their identity as a result of transformative non-formal education. Some ex-offenders in Lesotho demonstrated how by tailoring programs and utilizing their own personal knowledge, they were able to share skills in spite of the prison bureaucracy and have consequently established an organization that serves as a link between prison and society. However, there should be a holistic approach to learning, which can target the immediate application of skills once offenders are released from prison. Similarly, offenders need access to educational resources once they leave prison that can build on what they already know/have learned so that they can turn their lives around.

  8. Critical formalism or digital biomorphology. The contemporary architecture formal dilema

    Directory of Open Access Journals (Sweden)

    Beatriz Villanueva Cajide

    2018-05-01

    Full Text Available With the dawn of digital media, architecture’s formal possibilities reached a level unknown before. The Guggenheim Museum branch in Bilbao appeared in 1993 as the materialisation of the possibilities of digital tools in architectural design, starting the development of a digitally based architecture which has currently reached a level of exhaustion that is evident in the repetition of the biomorphic shapes that emerged from the digital determinism to which some contemporary architectural practices have converged. While the digitalisation of the architectural process is irreversible and desirable, it is necessary to rethink the terms of this collaboration beyond the possibilities of the digital tools themselves. This article proposes to analyse seven texts written at the very moment when digitalisation became a real possibility, between Gehry’s conception of the Guggenheim Museum in 1992 and the Congress on Morphogenesis held at the Architectural Association in 2004, in order to explore the possibility of reversing the process that has led to the formal exhaustion of digital architecture, starting from the acceptance of incorporating strategies coming from a contemporary critical formalism.

  9. Y-formalism and b ghost in the non-minimal pure spinor formalism of superstrings

    International Nuclear Information System (INIS)

    Oda, Ichiro; Tonin, Mario

    2007-01-01

    We present the Y-formalism for the non-minimal pure spinor quantization of superstrings. In the framework of this formalism we compute, at the quantum level, the explicit form of the compound operators involved in the construction of the b ghost, their normal-ordering contributions and the relevant relations among them. We use these results to construct the quantum-mechanical b ghost in the non-minimal pure spinor formalism. Moreover, we show that this non-minimal b ghost is cohomologically equivalent to the non-covariant b ghost.

  10. Formal and Informal Continuing Education Activities and Athletic Training Professional Practice

    Science.gov (United States)

    Armstrong, Kirk J.; Weidner, Thomas G.

    2010-01-01

    Abstract Context: Continuing education (CE) is intended to promote professional growth and, ultimately, to enhance professional practice. Objective: To determine certified athletic trainers' participation in formal (ie, approved for CE credit) and informal (ie, not approved for CE credit) CE activities and the perceived effect these activities have on professional practice with regard to improving knowledge, clinical skills and abilities, attitudes toward patient care, and patient care itself. Design: Cross-sectional study. Setting: Athletic training practice settings. Patients or Other Participants: Of a geographic, stratified random sample of 1000 athletic trainers, 427 (42.7%) completed the survey. Main Outcome Measure(s): The Survey of Formal and Informal Athletic Training Continuing Education Activities was developed and administered electronically. The survey consisted of demographic characteristics and Likert-scale items regarding CE participation and perceived effect of CE on professional practice. Internal consistency of survey items was determined using the Cronbach α (α  =  0.945). Descriptive statistics were computed for all items. An analysis of variance and dependent t tests were calculated to determine differences among respondents' demographic characteristics and their participation in, and perceived effect of, CE activities. The α level was set at .05. Results: Respondents completed more informal CE activities than formal CE activities. Participation in informal CE activities included reading athletic training journals (75.4%), whereas formal CE activities included attending a Board of Certification–approved workshop, seminar, or professional conference not conducted by the National Athletic Trainers' Association or affiliates or committees (75.6%). Informal CE activities were perceived to improve clinical skills or abilities and attitudes toward patient care. Formal CE activities were perceived to enhance knowledge. Conclusions: More
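
The internal-consistency figure quoted in the abstract (Cronbach α = 0.945) follows the standard formula alpha = k/(k-1) * (1 - sum of item variances / variance of the total score) over the k survey items. A minimal Python sketch with made-up Likert-scale responses (not the study's data):

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for an (n_respondents, k_items) score matrix."""
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)
    total_var = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Hypothetical Likert-scale responses (5 respondents, 4 items).
data = np.array([
    [4, 5, 4, 4],
    [3, 3, 2, 3],
    [5, 5, 5, 4],
    [2, 2, 3, 2],
    [4, 4, 4, 5],
])
print(f"alpha = {cronbach_alpha(data):.3f}")
```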

  11. Geometry and Formal Linguistics.

    Science.gov (United States)

    Huff, George A.

    This paper presents a method of encoding geometric line-drawings in a way which allows sets of such drawings to be interpreted as formal languages. A characterization of certain geometric predicates in terms of their properties as languages is obtained, and techniques usually associated with generative grammars and formal automata are then applied…

  12. Agent-based analysis of organizations : formalization and simulation

    NARCIS (Netherlands)

    Dignum, M.V.; Tick, C.

    2007-01-01

    Organizational effectiveness depends on many factors, including individual excellence, efficient structures, effective planning and capability to understand and match context requirements. We propose a way to model organizational performance based on a combination of formal models and

  13. Interactions between perceived uncertainty types in service dyads

    DEFF Research Database (Denmark)

    Kreye, Melanie

    2018-01-01

    to avoid business failure. A conceptual framework of four uncertainty types is investigated: environmental, technological, organisational, and relational uncertainty. We present insights from four empirical cases of service dyads collected via multiple sources of evidence, including 54 semi-structured interviews, observations, and secondary data. The cases show seven interaction paths with direct knock-on effects between two uncertainty types and indirect knock-on effects between three or four uncertainty types. The findings suggest a causal chain from environmental, technological, and organisational to relational uncertainty. This research contributes to the servitization literature by (i) confirming the existence of uncertainty types, (ii) providing an in-depth characterisation of technological uncertainty, and (iii) showing the interaction paths between four uncertainty types in the form of a causal chain.

  14. Sensitivity and uncertainty analysis

    CERN Document Server

    Cacuci, Dan G; Navon, Ionel Michael

    2005-01-01

    As computer-assisted modeling and analysis of physical processes have continued to grow and diversify, sensitivity and uncertainty analyses have become indispensable scientific tools. Sensitivity and Uncertainty Analysis. Volume I: Theory focused on the mathematical underpinnings of two important methods for such analyses: the Adjoint Sensitivity Analysis Procedure and the Global Adjoint Sensitivity Analysis Procedure. This volume concentrates on the practical aspects of performing these analyses for large-scale systems. The applications addressed include two-phase flow problems, a radiative c

  15. The Transition to Formal Thinking in Mathematics

    Science.gov (United States)

    Tall, David

    2008-01-01

    This paper focuses on the changes in thinking involved in the transition from school mathematics to formal proof in pure mathematics at university. School mathematics is seen as a combination of visual representations, including geometry and graphs, together with symbolic calculations and manipulations. Pure mathematics in university shifts…

  16. Formal Testing of Correspondence Carrying Software

    NARCIS (Netherlands)

    Bujorianu, M.C.; Bujorianu, L.M.; Maharaj, S.

    2008-01-01

    Nowadays, formal software development is characterised by the use of a multitude of formal specification languages. Test case generation from formal specifications depends in general on a specific language and, moreover, there are competing methods for each language. There is a need for a generic approach to

  17. Uncertainty Analysis of Seebeck Coefficient and Electrical Resistivity Characterization

    Science.gov (United States)

    Mackey, Jon; Sehirlioglu, Alp; Dynys, Fred

    2014-01-01

    In order to provide a complete description of a material's thermoelectric power factor, an uncertainty interval is required in addition to the measured nominal value. The uncertainty may contain sources of measurement error including systematic bias error and precision error of a statistical nature. The work focuses specifically on the popular ZEM-3 (Ulvac Technologies) measurement system, but the methods apply to any measurement system. The analysis accounts for sources of systematic error including sample preparation tolerance, measurement probe placement, thermocouple cold-finger effect, and measurement parameters, in addition to uncertainty of a statistical nature. Complete uncertainty analysis of a measurement system allows for more reliable comparison of measurement data between laboratories.
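
The combination of systematic (bias) and statistical (precision) contributions described here is conventionally done in quadrature to form a combined and then an expanded uncertainty. A minimal sketch with a purely hypothetical error budget, not the paper's actual ZEM-3 numbers:

```python
import math

# Hypothetical 1-sigma contributions for a Seebeck coefficient
# measurement, expressed in percent of the nominal value.
systematic = {
    "sample geometry tolerance": 1.5,
    "probe placement": 2.0,
    "thermocouple cold finger": 1.0,
}
statistical = 0.8  # precision error from repeated measurements

# Combine independent contributions in quadrature.
u_sys = math.sqrt(sum(v**2 for v in systematic.values()))
u_c = math.sqrt(u_sys**2 + statistical**2)   # combined standard uncertainty
U = 2.0 * u_c                                # expanded uncertainty, k = 2

print(f"combined standard uncertainty: {u_c:.2f} %")
print(f"expanded uncertainty (k=2):    {U:.2f} %")
```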

  18. Formal Women-only Networks

    DEFF Research Database (Denmark)

    Villesèche, Florence; Josserand, Emmanuel

    2017-01-01

    /organisations and the wider social group of women in business. Research limitations/implications: The authors focus on the distinction between external and internal formal women-only networks while also acknowledging the broader diversity that can characterise such networks. Their review provides the reader with an insight … member level, the authors suggest that such networks can be of value for organisations and the wider social group of women in management and leadership positions. … Purpose: The purpose of this paper is to review the emerging literature on formal women-only business networks and outline propositions to develop this under-theorised area of knowledge and stimulate future research. Design/methodology/approach: The authors review the existing literature on formal

  19. Evacuation decision-making: process and uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Mileti, D.; Sorensen, J.; Bogard, W.

    1985-09-01

    The purpose was to describe the processes of evacuation decision-making, identify and document uncertainties in that process and discuss implications for federal assumption of liability for precautionary evacuations at nuclear facilities under the Price-Anderson Act. Four major categories of uncertainty are identified concerning the interpretation of hazard, communication problems, perceived impacts of evacuation decisions and exogenous influences. Over 40 historical accounts are reviewed and cases of these uncertainties are documented. The major findings are that all levels of government, including federal agencies, experience uncertainties in some evacuation situations. Second, private sector organizations are subject to uncertainties at a variety of decision points. Third, uncertainties documented in the historical record have provided the grounds for liability, although few legal actions have ensued. Finally, it is concluded that if liability for evacuations is assumed by the federal government, the concept of a “precautionary” evacuation is not useful in establishing criteria for that assumption. 55 refs., 1 fig., 4 tabs.

  20. Report on the uncertainty methods study

    International Nuclear Information System (INIS)

    1998-06-01

    The Uncertainty Methods Study (UMS) Group, following a mandate from CSNI, has compared five methods for calculating the uncertainty in the predictions of advanced 'best estimate' thermal-hydraulic codes: the Pisa method (based on extrapolation from integral experiments) and four methods identifying and combining input uncertainties. Three of these, the GRS, IPSN and ENUSA methods, use subjective probability distributions, and one, the AEAT method, performs a bounding analysis. Each method has been used to calculate the uncertainty in specified parameters for the LSTF SB-CL-18 5% cold leg small break LOCA experiment in the ROSA-IV Large Scale Test Facility (LSTF). The uncertainty analysis was conducted essentially blind and the participants did not use experimental measurements from the test as input apart from initial and boundary conditions. Participants calculated uncertainty ranges for experimental parameters including pressurizer pressure, primary circuit inventory and clad temperature (at a specified position) as functions of time

  1. Evacuation decision-making: process and uncertainty

    International Nuclear Information System (INIS)

    Mileti, D.; Sorensen, J.; Bogard, W.

    1985-09-01

    The purpose was to describe the processes of evacuation decision-making, identify and document uncertainties in that process and discuss implications for federal assumption of liability for precautionary evacuations at nuclear facilities under the Price-Anderson Act. Four major categories of uncertainty are identified concerning the interpretation of hazard, communication problems, perceived impacts of evacuation decisions and exogenous influences. Over 40 historical accounts are reviewed and cases of these uncertainties are documented. The major findings are that all levels of government, including federal agencies, experience uncertainties in some evacuation situations. Second, private sector organizations are subject to uncertainties at a variety of decision points. Third, uncertainties documented in the historical record have provided the grounds for liability, although few legal actions have ensued. Finally, it is concluded that if liability for evacuations is assumed by the federal government, the concept of a “precautionary” evacuation is not useful in establishing criteria for that assumption. 55 refs., 1 fig., 4 tabs

  2. An Intelligent Information Retrieval Approach Based on Two Degrees of Uncertainty Fuzzy Ontology

    Directory of Open Access Journals (Sweden)

    Maryam Hourali

    2011-01-01

    Full Text Available In spite of the voluminous studies in the field of intelligent retrieval systems, effective retrieval of information has remained an important unsolved problem. Implementing conceptual knowledge such as ontologies in the information retrieval process has been considered a solution to enhance the quality of results. However, the conceptual formalism supported by a typical ontology may not be sufficient to represent uncertainty, due to the lack of clear-cut boundaries between the concepts of a domain. To tackle this type of problem, one possible solution is to insert fuzzy logic into the ontology construction process. In this article, a novel approach for fuzzy ontology generation with two degrees of uncertainty is proposed. By implementing linguistic variables, the uncertainty level in the domain's concepts (the Software Maintenance Engineering (SME) domain) has been modeled, and the ontology relations have consequently been modeled by fuzzy theory. These uncertain models were then combined into a new ontology with two degrees of uncertainty, in both concept expression and relation expression. The generated fuzzy ontology was implemented for the expansion of users' initial queries in the SME domain. Experimental results showed that the proposed model has better overall retrieval performance compared to keyword-based or crisp ontology-based retrieval systems.

  3. Concept similarity and related categories in information retrieval using formal concept analysis

    Science.gov (United States)

    Eklund, P.; Ducrou, J.; Dau, F.

    2012-11-01

    The application of formal concept analysis to the problem of information retrieval has been shown to be useful, but it has lacked any real analysis of the idea of relevance ranking of search results. SearchSleuth is a program developed to experiment with the automated local analysis of Web search using formal concept analysis. SearchSleuth extends a standard search interface to include a conceptual neighbourhood centred on a formal concept derived from the initial query. This neighbourhood of the concept derived from the search terms is decorated with its upper and lower neighbours, representing more general and more specific concepts, respectively. SearchSleuth is in many ways an archetype of search engines based on formal concept analysis, with some novel features. In SearchSleuth, the notion of related categories - which are themselves formal concepts - is also introduced. This allows the retrieval focus to shift to a new formal concept called a sibling. This movement across the concept lattice needs to relate one formal concept to another in a principled way. This paper presents the issues concerning exploring, searching, and ordering the space of related categories. The focus is on understanding the use and meaning of proximity and semantic distance in the context of information retrieval using formal concept analysis.
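
The conceptual neighbourhood SearchSleuth navigates is built from formal concepts, which can be computed from a binary object-attribute context with the standard derivation operators. A toy Python sketch; the context and document names are invented, not SearchSleuth's actual data:

```python
# Toy formal context: objects (search results) -> attributes (terms).
context = {
    "doc1": {"lattice", "order"},
    "doc2": {"lattice", "order", "galois"},
    "doc3": {"order"},
}
objects = set(context)
attributes = set().union(*context.values())

def intent(objs):
    """Attributes shared by all objects in objs (derivation A -> A')."""
    return set.intersection(*(context[o] for o in objs)) if objs else set(attributes)

def extent(attrs):
    """Objects possessing all attributes in attrs (derivation B -> B')."""
    return {o for o in objects if attrs <= context[o]}

# A formal concept is a pair (A, B) with intent(A) = B and extent(B) = A.
# Closing a query gives the concept SearchSleuth would centre its display on.
query = {"order"}
A = extent(query)
B = intent(A)
print("concept extent:", sorted(A))   # all documents matching the query
print("concept intent:", sorted(B))   # closure of the query terms
```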

  4. Formal analysis of design process dynamics

    NARCIS (Netherlands)

    Bosse, T.; Jonker, C.M.; Treur, J.

    2010-01-01

    This paper presents a formal analysis of design process dynamics. Such a formal analysis is a prerequisite to come to a formal theory of design and for the development of automated support for the dynamics of design processes. The analysis was geared toward the identification of dynamic design

  5. Formal Analysis of Design Process Dynamics

    NARCIS (Netherlands)

    Bosse, T.; Jonker, C.M.; Treur, J.

    2010-01-01

    This paper presents a formal analysis of design process dynamics. Such a formal analysis is a prerequisite to come to a formal theory of design and for the development of automated support for the dynamics of design processes. The analysis was geared toward the identification of dynamic design

  6. Understanding visualization: a formal approach using category theory and semiotics.

    Science.gov (United States)

    Vickers, Paul; Faith, Joe; Rossiter, Nick

    2013-06-01

    This paper combines the vocabulary of semiotics and category theory to provide a formal analysis of visualization. It shows how familiar processes of visualization fit the semiotic frameworks of both Saussure and Peirce, and extends these structures using the tools of category theory to provide a general framework for understanding visualization in practice, including: relationships between systems, data collected from those systems, renderings of those data in the form of representations, the reading of those representations to create visualizations, and the use of those visualizations to create knowledge and understanding of the system under inspection. The resulting framework is validated by demonstrating how familiar information visualization concepts (such as literalness, sensitivity, redundancy, ambiguity, generalizability, and chart junk) arise naturally from it and can be defined formally and precisely. This paper generalizes previous work on the formal characterization of visualization by, inter alia, Ziemkiewicz and Kosara and allows us to formally distinguish properties of the visualization process that previous work does not.

  7. Formalizing the concept of sound.

    Energy Technology Data Exchange (ETDEWEB)

    Kaper, H. G.; Tipei, S.

    1999-08-03

    The notion of formalized music implies that a musical composition can be described in mathematical terms. In this article we explore some formal aspects of music and propose a framework for an abstract approach.

  8. Formal Analysis of Domain Models

    National Research Council Canada - National Science Library

    Bharadwaj, Ramesh

    2002-01-01

    Recently, there has been a great deal of interest in the application of formal methods, in particular, precise formal notations and automatic analysis tools for the creation and analysis of requirements specifications (i.e...

  9. Notes on the implementation of the TG-43 formalism in high-rate brachytherapy; Notas sobre la implementación del formalismo TG-43 en braquiterapia de alta tasa

    Energy Technology Data Exchange (ETDEWEB)

    Sendon del Rio, J. R.; Gonzalez Ruiz, C.; Garcia Marcos, R.; Jimenez Rojas, R.; Lopez Bote, M. A.

    2011-07-01

    The TG-43 formalism is based on dosimetric parameters that depend on the specific source design, extracted from dose distributions calculated by Monte Carlo in water. It is relatively easy to implement, yet it carries a degree of uncertainty, making it necessary to verify the calculation algorithm in the planning system in order to assess its behavior.
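
For reference, the dose-rate equation of the TG-43 formalism whose implementation the note discusses is, in standard AAPM notation (line-source approximation):

```latex
\dot{D}(r,\theta) = S_K \,\Lambda\,
  \frac{G_L(r,\theta)}{G_L(r_0,\theta_0)}\, g_L(r)\, F(r,\theta)
```

where S_K is the air-kerma strength, Λ the dose-rate constant, G_L the geometry function, g_L the radial dose function, F the 2D anisotropy function, and (r_0, θ_0) = (1 cm, 90°) the reference point. Each source-specific factor carries its own uncertainty, which is why verification of the planning-system implementation is recommended.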

  10. Variational formalism for kinetic-MHD instabilities in tokamaks

    International Nuclear Information System (INIS)

    Edery, D.; Garbet, X.; Roubin, J.P.; Samain, A.

    1991-07-01

    A variational formalism that includes in a consistent way the tokamak plasma fluid response to an electromagnetic field as well as the particle-field resonant interaction effects is presented. The integrability of the unperturbed motion of the particles is used to establish a general functional similar to the classical Lagrangian for the electromagnetic field, which is extremum with respect to the field potentials. This functional is the sum of fluid terms closely related to the classical MHD energy and of resonant terms describing the kinetic effects. The formalism is used to study a critical issue in tokamak confinement, namely the sawteeth stabilization by energetic particles

  11. Uncertainty estimation of a complex water quality model: The influence of Box-Cox transformation on Bayesian approaches and comparison with a non-Bayesian method

    Science.gov (United States)

    Freni, Gabriele; Mannina, Giorgio

    In urban drainage modelling, uncertainty analysis is of undoubted necessity. However, uncertainty analysis in urban water-quality modelling is still in its infancy and only a few studies have been carried out. Therefore, several methodological aspects still need to be explored and clarified, especially regarding water-quality modelling. The use of the Bayesian approach for uncertainty analysis has been stimulated by its rigorous theoretical framework and by the possibility of evaluating the impact of new knowledge on the modelling predictions. Nevertheless, the Bayesian approach relies on some restrictive hypotheses that are not present in less formal methods like the Generalised Likelihood Uncertainty Estimation (GLUE). One crucial point in the application of the Bayesian method is the formulation of a likelihood function that is conditioned by the hypotheses made regarding the model residuals. Statistical transformations, such as the Box-Cox equation, are generally used to ensure the homoscedasticity of the residuals. However, this practice may affect the reliability of the analysis, leading to a wrong uncertainty estimation. The present paper aims to explore the influence of the Box-Cox equation on uncertainty estimation for environmental water-quality models. To this end, five cases were considered, one of which was the “real” residual distribution (i.e. drawn from available data). The analysis was applied to the Nocella experimental catchment (Italy), an agricultural and semi-urbanised basin where two sewer systems, two wastewater treatment plants and a river reach were monitored during both dry and wet weather periods. The results show that the uncertainty estimation is greatly affected by the residual transformation, and a wrong assumption may also affect the evaluation of model uncertainty. The use of less formal methods always provides an overestimation of modelling uncertainty with respect to the Bayesian method, but such an effect is reduced if a wrong assumption is made regarding the
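
The Box-Cox transformation at issue is z = (y^λ - 1)/λ for λ ≠ 0 and z = ln(y) for λ = 0, applied to observations and simulations before forming a Gaussian likelihood on the residuals. A minimal Python sketch, including the Jacobian term needed when comparing different λ values; the data below are synthetic, not the Nocella measurements:

```python
import numpy as np

def box_cox(y, lam):
    """Box-Cox transformation; requires y > 0."""
    return np.log(y) if lam == 0 else (y**lam - 1.0) / lam

def gaussian_loglik(obs, sim, lam, sigma):
    """Log-likelihood of transformed residuals, assuming they become
    i.i.d. Gaussian after the Box-Cox transformation (the hypothesis
    the paper shows can distort uncertainty estimates when wrong)."""
    res = box_cox(obs, lam) - box_cox(sim, lam)
    # Jacobian of the transformation of the observations.
    log_jac = (lam - 1.0) * np.sum(np.log(obs))
    n = len(obs)
    return (-0.5 * n * np.log(2 * np.pi * sigma**2)
            - 0.5 * np.sum(res**2) / sigma**2 + log_jac)

rng = np.random.default_rng(0)
obs = rng.lognormal(mean=1.0, sigma=0.4, size=50)    # synthetic water-quality data
sim = obs * rng.lognormal(mean=0.0, sigma=0.1, size=50)
for lam in (0.0, 0.25, 0.5, 1.0):
    print(f"lambda={lam:4.2f}  logL={gaussian_loglik(obs, sim, lam, 0.2):9.1f}")
```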

  12. Formal Methods for Life-Critical Software

    Science.gov (United States)

    Butler, Ricky W.; Johnson, Sally C.

    1993-01-01

    The use of computer software in life-critical applications, such as for civil air transports, demands the use of rigorous formal mathematical verification procedures. This paper demonstrates how to apply formal methods to the development and verification of software by leading the reader step-by-step through requirements analysis, design, implementation, and verification of an electronic phone book application. The current maturity and limitations of formal methods tools and techniques are then discussed, and a number of examples of the successful use of formal methods by industry are cited.

  13. A Survey of Formal Methods in Software Development

    DEFF Research Database (Denmark)

    Bjørner, Dines

    2012-01-01

    The use of formal methods and formal techniques in industry is steadily growing. In this survey we shall characterise what we mean by software development and by a formal method; briefly overview a history of formal specification languages - some of which are: VDM (Vienna Development Method, 1974) …; the need for multi-language formalisation (Petri Nets, MSC, StateChart, Temporal Logics); the sociology of university and industry acceptance of formal methods; the inevitability of the use of formal software development methods; while referring to seminal monographs and textbooks on formal methods.

  14. Contribution to uncertainties evaluation for fast reactors neutronic cross sections

    International Nuclear Information System (INIS)

    Privas, Edwin

    2015-01-01

    The thesis was motivated by the wish to improve knowledge of the uncertainties in nuclear data, for safety criteria. It addresses the cross sections required by core calculations for sodium fast reactors (SFRs) and new tools to evaluate them. The main objective of this work is to provide new tools to create coherent evaluated files with reliable and well-mastered uncertainties. To address these problems, several methods have been implemented within the CONRAD code, developed at CEA Cadarache. After a summary of all the elements required to understand the evaluation world, stochastic methods are presented for solving the Bayesian inference problem. They give the evaluator more information about the probability density and can also be used as validation tools. The algorithms have been tested successfully, despite long calculation times. Then, microscopic constraints have been implemented in CONRAD. These are defined as new information that should be taken into account during the evaluation process. An algorithm has been developed to solve, for example, continuity issues between two energy domains using the Lagrange multiplier formalism. Another method uses a marginalization procedure in order either to complete an existing evaluation with new covariances or to add systematic uncertainty to an experiment described by two theories. The algorithms are demonstrated on examples such as the 238U total cross section. The last parts focus on integral data feedback, using methods of integral data assimilation to reduce the uncertainties on cross sections. The work ends with uncertainty reduction on key nuclear reactions, such as the capture and fission cross sections of 238U and 239Pu, thanks to the PROFIL and PROFIL-2 experiments in Phenix and the Jezebel benchmark. (author) [fr
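
The integral data assimilation step mentioned at the end rests on a linear Bayesian (generalised least-squares) update of prior cross-section values with an integral measurement. A minimal sketch with hypothetical numbers, not the CONRAD implementation:

```python
import numpy as np

# Prior: two cross-section parameters with covariance (hypothetical).
x0 = np.array([1.00, 2.00])                  # prior central values
C0 = np.array([[0.04, 0.01],
               [0.01, 0.09]])                # prior covariance

# Integral experiment: y = S @ x observed with variance V (hypothetical).
S = np.array([[0.6, 0.4]])                   # sensitivity of the benchmark
y = np.array([1.45])
V = np.array([[0.01]])

# Generalised least-squares (linear Bayesian) update.
K = C0 @ S.T @ np.linalg.inv(S @ C0 @ S.T + V)   # gain matrix
x1 = x0 + K @ (y - S @ x0)
C1 = C0 - K @ S @ C0

print("posterior values:     ", x1)
print("uncertainty reduction:", np.sqrt(np.diag(C0)) - np.sqrt(np.diag(C1)))
```

The integral measurement constrains a linear combination of the parameters, so the posterior standard deviations shrink most along the direction the benchmark is sensitive to.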

  15. The Integration of Formal and Non-formal Education: The Dutch “brede school”

    Directory of Open Access Journals (Sweden)

    du Bois-Reymond, Manuela

    2009-12-01

    Full Text Available The Dutch “brede school” (BS) development originates in the 1990s and has spread unevenly since: quicker in the primary than in the secondary educational sector. In 2007, there were about 1000 primary and 350 secondary BS schools, and it is the intention of the government as well as of the individual municipalities to extend that number and make the BS the dominant school form of the near future. In the primary sector, a BS cooperates with crèche and preschool facilities, besides other possible neighborhood partners. The main targets are, first, to enhance educational opportunities, particularly for children with little (western) cultural capital, and secondly to increase women’s labor market participation by providing extra-familial care for babies and small children. All primary schools are now obliged to provide such care. In the secondary sector, a BS is less neighborhood-orientated than a primary BS because those schools are bigger and more often located in different buildings. As in the primary sector, there are broad and narrower BS, the first profile cooperating with many non-formal and other partners and facilities and the second with few. On the whole, there is a wide variety of BS schools, with different profiles and objectives, dependent on the needs and wishes of the initiators and the neighborhood. A BS is always the result of initiatives of the respective school and its partners: parents, other neighborhood associations, municipality etc. BS schools are not enforced by the government, although the general trend will be that existing school organizations transform into BS. The integration of formal and non-formal education and learning is more advanced in primary than in secondary schools. In secondary education, vocational as well as general, there is a clear dominance of formal education; the non-formal curriculum serves mainly two lines and objectives: first, provide attractive leisure activities and second provide compensatory

  16. Understanding uncertainty

    CERN Document Server

    Lindley, Dennis V

    2013-01-01

    Praise for the First Edition: “...a reference for everyone who is interested in knowing and handling uncertainty.” (Journal of Applied Statistics) The critically acclaimed First Edition of Understanding Uncertainty provided a study of uncertainty addressed to scholars in all fields, showing that uncertainty could be measured by probability, and that probability obeyed three basic rules that enabled uncertainty to be handled sensibly in everyday life. These ideas were extended to embrace the scientific method and to show how decisions, containing an uncertain element, could be rationally made.

  17. Sea-Level Trend Uncertainty With Pacific Climatic Variability and Temporally-Correlated Noise

    Science.gov (United States)

    Royston, Sam; Watson, Christopher S.; Legrésy, Benoît; King, Matt A.; Church, John A.; Bos, Machiel S.

    2018-03-01

    Recent studies have identified climatic drivers of the east-west see-saw of Pacific Ocean satellite altimetry era sea level trends and a number of sea-level trend and acceleration assessments attempt to account for this. We investigate the effect of Pacific climate variability, together with temporally-correlated noise, on linear trend error estimates and determine new time-of-emergence (ToE) estimates across the Indian and Pacific Oceans. Sea-level trend studies often advocate the use of auto-regressive (AR) noise models to adequately assess formal uncertainties, yet sea level often exhibits colored but non-AR(1) noise. Standard error estimates are over- or under-estimated by an AR(1) model for much of the Indo-Pacific sea level. Allowing for PDO and ENSO variability in the trend estimate only reduces standard errors across the tropics and we find noise characteristics are largely unaffected. Of importance for trend and acceleration detection studies, formal error estimates remain on average up to 1.6 times those from an AR(1) model for long-duration tide gauge data. There is an even chance that the observed trend from the satellite altimetry era exceeds the noise in patches of the tropical Pacific and Indian Oceans and the south-west and north-east Pacific gyres. By including climate indices in the trend analysis, the time it takes for the observed linear sea-level trend to emerge from the noise reduces by up to 2 decades.
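
The AR(1) correction that the record argues is often insufficient can be reproduced by fitting a trend with ordinary least squares, estimating the lag-1 autocorrelation of the residuals, and inflating the standard error accordingly. A minimal Python sketch with synthetic monthly data; all parameters are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic monthly "sea level" (mm): linear trend plus AR(1) noise.
n, phi, true_trend = 300, 0.6, 3.0 / 12.0      # 3 mm/yr expressed in mm/month
t = np.arange(n)
noise = np.zeros(n)
for i in range(1, n):
    noise[i] = phi * noise[i - 1] + rng.normal(0.0, 2.0)
y = true_trend * t + noise

# OLS trend and naive (white-noise) standard error.
A = np.vstack([t, np.ones(n)]).T
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
resid = y - A @ coef
s2 = resid @ resid / (n - 2)
se_white = np.sqrt(s2 / np.sum((t - t.mean())**2))

# AR(1) correction: inflate the standard error via the lag-1 autocorrelation.
phi_hat = np.corrcoef(resid[:-1], resid[1:])[0, 1]
inflation = np.sqrt((1 + phi_hat) / (1 - phi_hat))

print(f"trend: {coef[0] * 12:.2f} mm/yr")
print(f"SE white noise: {se_white * 12:.3f} mm/yr, "
      f"SE AR(1): {se_white * inflation * 12:.3f} mm/yr")
```

The record's point is that for much of the Indo-Pacific even this inflation misstates the error, by up to a factor of about 1.6 for long tide-gauge records, because the noise is coloured but not AR(1).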

  18. Supporting Qualified Database for Uncertainty Evaluation

    International Nuclear Information System (INIS)

    Petruzzi, A.; Fiori, F.; Kovtonyuk, A.; Lisovyy, O.; D'Auria, F.

    2013-01-01

    Uncertainty evaluation constitutes a key feature of the BEPU (Best Estimate Plus Uncertainty) process. The uncertainty can be the result of a Monte Carlo type analysis involving input uncertainty parameters or the outcome of a process involving the use of experimental data and connected code calculations. Those uncertainty methods are discussed in several papers and guidelines (IAEA-SRS-52, OECD/NEA BEMUSE reports). The present paper aims at discussing the role and the depth of the analysis required for merging from one side suitable experimental data and on the other side qualified code calculation results. This aspect is mostly connected with the second approach for uncertainty mentioned above, but it can also be used in the framework of the first approach. Namely, the paper discusses the features and structure of the database that includes the following kinds of documents: 1. The 'RDS-facility' (Reference Data Set for the selected facility): this includes the description of the facility, the geometrical characterization of any component of the facility, the instrumentations, the data acquisition system, the evaluation of pressure losses, the physical properties of the material and the characterization of pumps, valves and heat losses; 2. The 'RDS-test' (Reference Data Set for the selected test of the facility): this includes the description of the main phenomena investigated during the test, the configuration of the facility for the selected test (possible new evaluation of pressure and heat losses if needed) and the specific boundary and initial conditions; 3. The 'QR' (Qualification Report) of the code calculation results: this includes the description of the nodalization developed following a set of homogeneous techniques, the achievement of the steady state conditions and the qualitative and quantitative analysis of the transient with the characterization of the Relevant Thermal-Hydraulics Aspects (RTA); 4. The EH (Engineering Handbook) of the input nodalization

  19. Supporting qualified database for uncertainty evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Petruzzi, A.; Fiori, F.; Kovtonyuk, A.; D'Auria, F. [Nuclear Research Group of San Piero A Grado, Univ. of Pisa, Via Livornese 1291, 56122 Pisa (Italy)]

    2012-07-01

    Uncertainty evaluation constitutes a key feature of the BEPU (Best Estimate Plus Uncertainty) process. The uncertainty can be the result of a Monte Carlo type analysis involving input uncertainty parameters or the outcome of a process involving the use of experimental data and connected code calculations. Those uncertainty methods are discussed in several papers and guidelines (IAEA-SRS-52, OECD/NEA BEMUSE reports). The present paper aims at discussing the role and the depth of the analysis required for merging from one side suitable experimental data and on the other side qualified code calculation results. This aspect is mostly connected with the second approach for uncertainty mentioned above, but it can also be used in the framework of the first approach. Namely, the paper discusses the features and structure of the database that includes the following kinds of documents: 1. The 'RDS-facility' (Reference Data Set for the selected facility): this includes the description of the facility, the geometrical characterization of any component of the facility, the instrumentations, the data acquisition system, the evaluation of pressure losses, the physical properties of the material and the characterization of pumps, valves and heat losses; 2. The 'RDS-test' (Reference Data Set for the selected test of the facility): this includes the description of the main phenomena investigated during the test, the configuration of the facility for the selected test (possible new evaluation of pressure and heat losses if needed) and the specific boundary and initial conditions; 3. The 'QR' (Qualification Report) of the code calculation results: this includes the description of the nodalization developed following a set of homogeneous techniques, the achievement of the steady state conditions and the qualitative and quantitative analysis of the transient with the characterization of the Relevant Thermal-Hydraulics Aspects (RTA); 4. The EH (Engineering

  20. Propagation of dynamic measurement uncertainty

    International Nuclear Information System (INIS)

    Hessling, J P

    2011-01-01

    The time-dependent measurement uncertainty has been evaluated in a number of recent publications, starting from a known uncertain dynamic model. This could be defined as the 'downward' propagation of uncertainty from the model to the targeted measurement. The propagation of uncertainty 'upward' from the calibration experiment to a dynamic model traditionally belongs to system identification. The use of different representations (time, frequency, etc) is ubiquitous in dynamic measurement analyses. An expression of uncertainty in dynamic measurements is formulated for the first time in this paper independent of representation, joining upward as well as downward propagation. For applications in metrology, the high quality of the characterization may be prohibitive for any reasonably large and robust model to pass the whiteness test. This test is therefore relaxed by not directly requiring small systematic model errors in comparison to the randomness of the characterization. Instead, the systematic error of the dynamic model is propagated to the uncertainty of the measurand, analogously but differently to how stochastic contributions are propagated. The pass criterion of the model is thereby transferred from the identification to acceptance of the total accumulated uncertainty of the measurand. This increases the relevance of the test of the model as it relates to its final use rather than the quality of the calibration. The propagation of uncertainty hence includes the propagation of systematic model errors. For illustration, the 'upward' propagation of uncertainty is applied to determine if an appliance box is damaged in an earthquake experiment. In this case, relaxation of the whiteness test was required to reach a conclusive result

  1. Uncertainties and severe-accident management

    International Nuclear Information System (INIS)

    Kastenberg, W.E.

    1991-01-01

    Severe-accident management can be defined as the use of existing and/or alternative resources, systems, and actions to prevent or mitigate a core-melt accident. Together with risk management (e.g., changes in plant operation and/or addition of equipment) and emergency planning (off-site actions), accident management provides an extension of the defense-in-depth safety philosophy for severe accidents. A significant number of probabilistic safety assessments have been completed, which yield the principal plant vulnerabilities, and can be categorized as (a) dominant sequences with respect to core-melt frequency, (b) dominant sequences with respect to various risk measures, (c) dominant threats that challenge safety functions, and (d) dominant threats with respect to failure of safety systems. Severe-accident management strategies can be generically classified as (a) use of alternative resources, (b) use of alternative equipment, and (c) use of alternative actions. For each sequence/threat and each combination of strategy, there may be several options available to the operator. Each strategy/option involves phenomenological and operational considerations regarding uncertainty. These include (a) uncertainty in key phenomena, (b) uncertainty in operator behavior, (c) uncertainty in system availability and behavior, and (d) uncertainty in information availability (i.e., instrumentation). This paper focuses on phenomenological uncertainties associated with severe-accident management strategies.

  2. Climate Projections and Uncertainty Communication.

    Science.gov (United States)

    Joslyn, Susan L; LeClerc, Jared E

    2016-01-01

    Lingering skepticism about climate change might be due in part to the way climate projections are perceived by members of the public. Variability between scientists' estimates might give the impression that scientists disagree about the fact of climate change rather than about details concerning the extent or timing. Providing uncertainty estimates might clarify that the variability is due in part to quantifiable uncertainty inherent in the prediction process, thereby increasing people's trust in climate projections. This hypothesis was tested in two experiments. Results suggest that including uncertainty estimates along with climate projections leads to an increase in participants' trust in the information. Analyses explored the roles of time, place, demographic differences (e.g., age, gender, education level, political party affiliation), and initial belief in climate change. Implications are discussed in terms of the potential benefit of adding uncertainty estimates to public climate projections. Copyright © 2015 Cognitive Science Society, Inc.

  3. Nonextensive formalism and continuous Hamiltonian systems

    International Nuclear Information System (INIS)

    Boon, Jean Pierre; Lutsko, James F.

    2011-01-01

    A recurring question in nonequilibrium statistical mechanics is what deviation from standard statistical mechanics gives rise to non-Boltzmann behavior and to nonlinear response, which amounts to identifying the emergence of 'statistics from dynamics' in systems out of equilibrium. Among several possible analytical developments which have been proposed, the idea of nonextensive statistics introduced by Tsallis about 20 years ago was to develop a statistical mechanical theory for systems out of equilibrium where the Boltzmann distribution no longer holds, and to generalize the Boltzmann entropy by a more general function S_q while maintaining the formalism of thermodynamics. From a phenomenological viewpoint, nonextensive statistics appeared to be of interest because maximization of the generalized entropy S_q yields the q-exponential distribution, which has been successfully used to describe distributions observed in a large class of phenomena, in particular power law distributions for q>1. Here we re-examine the validity of the nonextensive formalism for continuous Hamiltonian systems. In particular we consider the q-ideal gas, a model system of quasi-particles where the effects of the interactions are included in the particle properties. On the basis of exact results for the q-ideal gas, we find that the theory is restricted to the range q<1, which raises the question of its formal validity range for continuous Hamiltonian systems.
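
For reference, the generalized entropy and the q-exponential distribution the abstract refers to are, in standard Tsallis notation:

```latex
S_q = k\,\frac{1 - \sum_i p_i^{\,q}}{q - 1},
\qquad
e_q(x) = \bigl[\,1 + (1-q)\,x\,\bigr]^{\frac{1}{1-q}}
```

with S_q recovering the Boltzmann-Gibbs entropy -k Σ_i p_i ln p_i and e_q(x) recovering exp(x) in the limit q → 1; for q > 1 the maximizing distribution has power-law tails.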

  4. Probabilistic assessment of fatigue life including statistical uncertainties in the S-N curve

    International Nuclear Information System (INIS)

    Sudret, B.; Hornet, P.; Stephan, J.-M.; Guede, Z.; Lemaire, M.

    2003-01-01

    A probabilistic framework is set up to assess the fatigue life of components of nuclear power plants. It intends to incorporate all kinds of uncertainties, such as those appearing in the specimen fatigue life, design sub-factor, mechanical model and applied loading. This paper details the first step, which corresponds to the statistical treatment of the fatigue specimen test data. The specimen fatigue life at stress amplitude S is represented by a lognormal random variable whose mean and standard deviation depend on S. This characterization is then used to compute the random fatigue life of a component submitted to a single kind of cycle. Precisely, the mean and coefficient of variation of this quantity are studied, as well as the reliability associated with the (deterministic) design value. (author)
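
The first step described here, a lognormal specimen life whose parameters depend on the stress amplitude S, can be sketched as follows. The Basquin-type mean curve and all coefficients below are illustrative placeholders, not the paper's fitted model:

```python
import numpy as np

rng = np.random.default_rng(7)

def fatigue_life(S, n_samples=100_000):
    """Sample lognormal specimen fatigue lives at stress amplitude S (MPa).

    The mean follows an illustrative Basquin-type relation
    log10(N) = a - b*log10(S); coefficients and scatter are hypothetical."""
    a, b = 12.0, 3.0
    mu_log10 = a - b * np.log10(S)
    sigma_log10 = 0.25                       # scatter of log10(N)
    return 10.0 ** rng.normal(mu_log10, sigma_log10, n_samples)

S = 200.0                                    # MPa, hypothetical amplitude
N = fatigue_life(S)
design_life = np.quantile(N, 0.05) / 20.0    # crude design sub-factor of 20

print(f"mean life: {N.mean():.3e} cycles, CoV: {N.std() / N.mean():.2f}")
print(f"reliability at the design life: {(N > design_life).mean():.4f}")
```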

  5. Communicating spatial uncertainty to non-experts using R

    Science.gov (United States)

    Luzzi, Damiano; Sawicka, Kasia; Heuvelink, Gerard; de Bruin, Sytze

    2016-04-01

    Effective visualisation methods are important for the efficient use of uncertainty information for various groups of users. Uncertainty propagation analysis is often used with spatial environmental models to quantify the uncertainty within the information. A challenge arises when trying to effectively communicate the uncertainty information to non-experts (not statisticians) in a wide range of cases. Due to the growing popularity and applicability of the open source programming language R, we undertook a project to develop an R package that facilitates uncertainty propagation analysis in spatial environmental modelling. The package implements Monte Carlo algorithms for uncertainty propagation, the output of which is represented by an ensemble of model outputs (i.e. a sample from a probability distribution). Numerous visualisation methods exist that aim to present such spatial uncertainty information statically, dynamically, and interactively. To provide the most universal visualisation tools for non-experts, we conducted a survey with a group of 20 university students and assessed the effectiveness of selected static and interactive methods for visualising uncertainty in spatial variables such as DEM and land cover. The static methods included adjacent maps and glyphs for continuous variables. Both allow for displaying maps with information about the ensemble mean, variance/standard deviation and prediction intervals. Adjacent maps were also used for categorical data, displaying maps of the most probable class, as well as its associated probability. The interactive methods included a graphical user interface, which in addition to displaying the previously mentioned variables also allowed for comparison of joint uncertainties at multiple locations. The survey indicated that users could understand the basics of the uncertainty information displayed in the static maps, with the interactive interface allowing for more in-depth information. Subsequently, the R

  6. Probabilistic and Fuzzy Arithmetic Approaches for the Treatment of Uncertainties in the Installation of Torpedo Piles

    Directory of Open Access Journals (Sweden)

    Denise Margareth Kazue Nishimura Kunitaki

    2008-01-01

    Full Text Available The “torpedo” pile is a foundation system that has recently been considered to anchor mooring lines and risers of floating production systems for offshore oil exploitation. The pile is installed in a free-fall operation from a vessel. However, the soil parameters involved in the penetration model of the torpedo pile contain uncertainties that can affect the precision of analysis methods to evaluate its final penetration depth. Therefore, this paper deals with methodologies for assessing the sensitivity of the response to the variation of the uncertain parameters and, mainly, with incorporating into the analysis method techniques for the formal treatment of the uncertainties. Probabilistic and “possibilistic” approaches are considered, involving, respectively, the Monte Carlo method (MC) and concepts of fuzzy arithmetic (FA). The results and performance of both approaches are compared, stressing the ability of the latter approach to efficiently deal with the uncertainties of the model, with outstanding computational efficiency, and, therefore, to comprise an effective design tool.
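
The two approaches can be contrasted on a toy, monotone penetration model (not the actual torpedo-pile model): Monte Carlo propagates a probability distribution by sampling, while fuzzy arithmetic propagates intervals level by level through alpha-cuts. A minimal Python sketch with hypothetical soil-strength values:

```python
import numpy as np

rng = np.random.default_rng(3)

def penetration_depth(su):
    """Toy model: final depth decreases with undrained shear strength su.
    Purely illustrative, not the torpedo-pile penetration model."""
    return 120.0 / np.sqrt(su)

# --- Probabilistic approach: Monte Carlo sampling ---
su_samples = rng.normal(25.0, 4.0, 100_000)          # kPa, hypothetical
depths = penetration_depth(su_samples)
print(f"MC: mean = {depths.mean():.2f} m, 95% interval = "
      f"({np.quantile(depths, 0.025):.2f}, {np.quantile(depths, 0.975):.2f}) m")

# --- Possibilistic approach: fuzzy arithmetic via alpha-cuts ---
# Triangular fuzzy number su = (17, 25, 33) kPa, hypothetical.
lo, mode, hi = 17.0, 25.0, 33.0
for alpha in (0.0, 0.5, 1.0):
    a = lo + alpha * (mode - lo)
    b = hi - alpha * (hi - mode)
    # The model is monotone decreasing in su, so the interval
    # endpoints map directly to the output interval endpoints.
    print(f"alpha = {alpha:.1f}: depth in "
          f"[{penetration_depth(b):.2f}, {penetration_depth(a):.2f}] m")
```

For non-monotone models each alpha-cut would require an interval optimization rather than an endpoint evaluation, which is where the computational trade-off between the two approaches appears.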

  7. Augmenting Reality and Formality of Informal and Non-Formal Settings to Enhance Blended Learning

    Science.gov (United States)

    Pérez-Sanagustin, Mar; Hernández-Leo, Davinia; Santos, Patricia; Kloos, Carlos Delgado; Blat, Josep

    2014-01-01

    Visits to museums and city tours have been part of higher and secondary education curriculum activities for many years. However these activities are typically considered "less formal" when compared to those carried out in the classroom, mainly because they take place in informal or non-formal settings. Augmented Reality (AR) technologies…

  8. Superfield formalism

    Indian Academy of Sciences (India)

    dimensional superfields, is a clear signature of the presence of the (anti-)BRST invariance in the original. 4D theory. Keywords. Non-Abelian 1-form gauge theory; Dirac fields; (anti-)Becchi–Roucet–Stora–. Tyutin invariance; superfield formalism; ...

  9. Itô-SDE MCMC method for Bayesian characterization of errors associated with data limitations in stochastic expansion methods for uncertainty quantification

    Science.gov (United States)

    Arnst, M.; Abello Álvarez, B.; Ponthot, J.-P.; Boman, R.

    2017-11-01

    This paper is concerned with the characterization and the propagation of errors associated with data limitations in polynomial-chaos-based stochastic methods for uncertainty quantification. Such an issue can arise in uncertainty quantification when only a limited amount of data is available. When the available information does not suffice to accurately determine the probability distributions that must be assigned to the uncertain variables, the Bayesian method for assigning these probability distributions becomes attractive because it allows the stochastic model to account explicitly for insufficiency of the available information. In previous work, such applications of the Bayesian method had already been implemented by using the Metropolis-Hastings and Gibbs Markov Chain Monte Carlo (MCMC) methods. In this paper, we present an alternative implementation, which uses an alternative MCMC method built around an Itô stochastic differential equation (SDE) that is ergodic for the Bayesian posterior. We draw together from the mathematics literature a number of formal properties of this Itô SDE that lend support to its use in the implementation of the Bayesian method, and we describe its discretization, including the choice of the free parameters, by using the implicit Euler method. We demonstrate the proposed methodology on a problem of uncertainty quantification in a complex nonlinear engineering application relevant to metal forming.
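
The canonical example of an Itô SDE that is ergodic for a Bayesian posterior pi is the overdamped Langevin equation dX = grad log pi(X) dt + sqrt(2) dW. The paper discretizes its SDE with the implicit Euler method; the sketch below uses the simpler explicit Euler-Maruyama scheme on a toy one-dimensional conjugate-Gaussian posterior, so the answer can be checked analytically:

```python
import numpy as np

rng = np.random.default_rng(11)

def grad_log_post(x):
    """Gradient of a toy log-posterior: N(0, 1) prior combined with a
    Gaussian likelihood of standard deviation 0.5 centred at 'data' 2.0."""
    return -x - (x - 2.0) / 0.5**2

# Euler-Maruyama discretization of dX = grad log pi(X) dt + sqrt(2) dW.
dt, n_steps, burn_in = 5e-3, 100_000, 10_000
x = 0.0
samples = np.empty(n_steps)
for i in range(n_steps):
    x += grad_log_post(x) * dt + np.sqrt(2.0 * dt) * rng.normal()
    samples[i] = x

post = samples[burn_in:]
print(f"posterior mean ~ {post.mean():.3f}, sd ~ {post.std():.3f}")
# Conjugate check: mean = (0*1 + 2*4)/(1 + 4) = 1.6, sd = 1/sqrt(5) ~ 0.447.
```

The explicit scheme carries an O(dt) discretization bias; the implicit Euler discretization used in the paper, and Metropolis-adjusted variants, are among the standard ways to control it.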

  10. Uncovering the triple Pomeron vertex from Wilson line formalism

    International Nuclear Information System (INIS)

    Chirilli, G. A.; Szymanowski, L.; Wallon, S.

    2011-01-01

    We compute the triple Pomeron vertex from the Wilson line formalism, including both planar and nonplanar contributions, and find perfect agreement with the result obtained in the Extended Generalized Logarithmic Approximation based on Reggeon calculus.

  11. A Survey of Formal Methods for Intelligent Swarms

    Science.gov (United States)

    Truszkowski, Walt; Rash, James; Hinchey, Mike; Rouff, Chrustopher A.

    2004-01-01

    cutting edge in system correctness, and requires higher levels of assurance than other (traditional) missions that use a single or small number of spacecraft that are deterministic in nature and have near continuous communication access. One of the highest possible levels of assurance comes from the application of formal methods. Formal methods are mathematics-based tools and techniques for specifying and verifying (software and hardware) systems. They are particularly useful for specifying complex parallel systems, such as exemplified by the ANTS mission, where the entire system is difficult for a single person to fully understand, a problem that is multiplied with multiple developers. Once written, a formal specification can be used to prove properties of a system (e.g., the underlying system will go from one state to another or not into a specific state) and check for particular types of errors (e.g., race or livelock conditions). A formal specification can also be used as input to a model checker for further validation. This report gives the results of a survey of formal methods techniques for verification and validation of space missions that use swarm technology. Multiple formal methods were evaluated to determine their effectiveness in modeling and assuring the behavior of swarms of spacecraft using the ANTS mission as an example system. This report is the first result of the project to determine formal approaches that are promising for formally specifying swarm-based systems. From this survey, the most promising approaches were selected and are discussed relative to their possible application to the ANTS mission. Future work will include the application of an integrated approach, based on the selected approaches identified in this report, to the formal specification of the ANTS mission.

  12. Towards a different attitude to uncertainty

    Directory of Open Access Journals (Sweden)

    Guy Pe'er

    2014-10-01

    Full Text Available The ecological literature deals with uncertainty primarily from the perspective of how to reduce it to acceptable levels. However, the current rapid and ubiquitous environmental changes, as well as anticipated rates of change, pose novel conditions and complex dynamics due to which many sources of uncertainty are difficult or even impossible to reduce. These include both uncertainty in knowledge (epistemic uncertainty) and societal responses to it. Under these conditions, an increasing number of studies ask how one can deal with uncertainty as it is. Here, we explore the question of how to adopt an overall alternative attitude to uncertainty, one which accepts or even embraces it. First, we show that seeking to reduce uncertainty may be counterproductive under some circumstances. It may yield overconfidence, ignoring early warning signs, policy and societal stagnation, or irresponsible behaviour if personal certainty is offered by externalization of environmental costs. We then demonstrate that uncertainty can have positive impacts by driving improvements in knowledge, promoting cautious action, contributing to keeping societies flexible and adaptable, enhancing awareness, support and involvement of the public in nature conservation, and enhancing cooperation and communication. We discuss the risks of employing a certainty paradigm on uncertain knowledge, the potential benefits of adopting an alternative attitude to uncertainty, and the need to implement such an attitude across scales – from adaptive management at the local scale, to the evolving Intergovernmental Science-Policy Platform on Biodiversity and Ecosystem Services (IPBES) at the global level.

  13. MOOC & B-Learning: Students' Barriers and Satisfaction in Formal and Non-Formal Learning Environments

    Science.gov (United States)

    Gutiérrez-Santiuste, Elba; Gámiz-Sánchez, Vanesa-M.; Gutiérrez-Pérez, Jose

    2015-01-01

    The study presents a comparative analysis of two virtual learning formats: one non-formal through a Massive Open Online Course (MOOC) and the other formal through b-learning. We compare the communication barriers and the satisfaction perceived by the students (N = 249) by developing a qualitative analysis using semi-structured questionnaires and…

  14. Formal training in forensic mental health: psychiatry and psychology.

    Science.gov (United States)

    Sadoff, Robert L; Dattilio, Frank M

    2012-01-01

    The field of forensic mental health has grown exponentially in the past decades to include forensic psychiatrists and psychologists serving as the primary experts to the court systems. However, many colleagues have chosen to pursue the avenue of serving as forensic experts without obtaining formal training and experience. This article discusses the importance of formal education, training and experience for psychiatrists and psychologists working in forensic settings and the ethical implications that befall those who fail to obtain such credentials. Specific aspects of training and supervised experience are discussed in detail. Copyright © 2012 Elsevier Ltd. All rights reserved.

  15. An Ontology for State Analysis: Formalizing the Mapping to SysML

    Science.gov (United States)

    Wagner, David A.; Bennett, Matthew B.; Karban, Robert; Rouquette, Nicolas; Jenkins, Steven; Ingham, Michel

    2012-01-01

    State Analysis is a methodology developed over the last decade for architecting, designing and documenting complex control systems. Although it was originally conceived for designing robotic spacecraft, recent applications include the design of control systems for large ground-based telescopes. The European Southern Observatory (ESO) began a project to design the European Extremely Large Telescope (E-ELT), which will require coordinated control of over a thousand articulated mirror segments. The designers are using State Analysis as a methodology and the Systems Modeling Language (SysML) as a modeling and documentation language in this task. To effectively apply the State Analysis methodology in this context it became necessary to provide ontological definitions of the concepts and relations in State Analysis, and to gain greater flexibility through a mapping of State Analysis into a practical extension of SysML. The ontology provides the formal basis for verifying compliance with State Analysis semantics, including architectural constraints. The SysML extension provides the practical basis for applying the State Analysis methodology with SysML tools. This paper will discuss the method used to develop these formalisms (the ontology), the formalisms themselves, the mapping to SysML, and the approach to using these formalisms to specify a control system and enforce architectural constraints in a SysML model.

  16. Schwinger–Keldysh canonical formalism for electronic Raman scattering

    Energy Technology Data Exchange (ETDEWEB)

    Su, Yuehua, E-mail: suyh@ytu.edu.cn

    2016-03-01

    Inelastic low-energy Raman and high-energy X-ray scattering have made great instrumental progress as probes of strong electronic correlations in matter. However, theoretical study of the relevant scattering spectra remains a challenge. In this paper, we present a Schwinger–Keldysh canonical perturbation formalism for electronic Raman scattering, in which the resonant, non-resonant and mixed responses are treated uniformly. We show how to use this formalism to evaluate the cross section of electronic Raman scattering off a one-band superconductor. All the two-photon scattering processes from electrons are included: the non-resonant charge-density response, the elastic Rayleigh scattering, the fluorescence, the intrinsic energy-shift Raman scattering and the mixed response. In the mean-field superconducting state, Cooper pairs contribute only to the non-resonant response. All the other responses are dominated by single-particle excitations and are strongly suppressed due to the opening of the superconducting gap. Our formalism for electronic Raman scattering can easily be extended to study high-energy resonant inelastic X-ray scattering.

  17. 40 years of formal methods

    DEFF Research Database (Denmark)

    Bjørner, Dines; Havelund, Klaus

    2014-01-01

    In this "40 years of formal methods" essay we shall first delineate, Sect. 1, what we mean by method, formal method, computer science, computing science, software engineering, and model-oriented and algebraic methods. Based on this, we shall characterize a spectrum from specification-oriented met...

  18. Still Elegantly Muddling Through? NICE and Uncertainty in Decision Making About the Rationing of Expensive Medicines in England.

    Science.gov (United States)

    Calnan, Michael; Hashem, Ferhana; Brown, Patrick

    2017-07-01

    This article examines the "technological appraisals" carried out by the National Institute for Health and Care Excellence as it regulates the provision of expensive new drugs within the English National Health Service on cost-effectiveness grounds. Ostensibly this is a highly rational process by which the regulatory mechanisms absorb uncertainty, but in practice, decision making remains highly complex and uncertain. This article draws on ethnographic data (interviews with a range of stakeholders and decision makers (n = 41), observations of public and closed appraisal meetings, and documentary analysis) regarding the decision-making processes involving three pharmaceutical products. The study explores the various ways in which different forms of uncertainty are perceived and tackled within these Single Technology Appraisals. Difficulties of dealing with the various levels of uncertainty were manifest and often rendered straightforward decision making problematic. Uncertainties associated with epistemology, procedures, interpersonal relations, and technicality were particularly evident. The need to exercise discretion within a more formal institutional framework shaped a pragmatic combining of strategies and tactics (explicit and informal, collective and individual) to navigate through the layers of complexity and uncertainty in making decisions.

  19. Urban drainage models simplifying uncertainty analysis for practitioners

    DEFF Research Database (Denmark)

    Vezzaro, Luca; Mikkelsen, Peter Steen; Deletic, Ana

    2013-01-01

    There is increasing awareness about uncertainties in the modelling of urban drainage systems and, as such, many new methods for uncertainty analyses have been developed. Despite this, all available methods have limitations which restrict their widespread application among practitioners. Here… in each measured/observed datapoint; an issue that is commonly overlooked in the uncertainty analysis of urban drainage models. This comparison allows the user to intuitively estimate the optimum number of simulations required to conduct uncertainty analyses. The output of the method includes parameter…

  20. A formal safety analysis for PLC software-based safety critical system using Z

    International Nuclear Information System (INIS)

    Koh, Jung Soo

    1997-02-01

    This paper describes a formal safety analysis technique, demonstrated by performing an empirical formal safety analysis on a case study of the beamline hutch door interlock system developed with PLC (Programmable Logic Controller) systems at the Pohang Accelerator Laboratory. In order to perform the formal safety analysis, we built Z formal specifications from the user requirements, written in ambiguous natural language, and from the target PLC ladder logic, respectively. We also studied an effective method to express the typical PLC timer component using specific Z notation supported by temporal history. We present a formal proof technique for specifying and verifying that hazardous states are not introduced into the ladder logic of the PLC-based safety-critical system. In addition, while analyzing the consistency and completeness of the translated Z specifications, we found some errors and mismatches between the user requirements and the final implemented PLC ladder logic. In the case of relatively small systems like the beamline hutch door interlock system, a formal safety analysis including explicit proof is highly recommended, so that the safety of a PLC-based critical system may be enhanced and guaranteed. It also helps developers comprehend user requirements expressed in ambiguous natural language
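
    A minimal sketch of the kind of hazard check the paper formalizes in Z, here reduced to brute-force enumeration over a hypothetical interlock rung; the logic below is invented for illustration and is not the actual Pohang ladder logic.

      from itertools import product

      # Hypothetical rung: the beam shutter may open only when the door is
      # closed AND the search-and-secure latch is set AND an operator requests it.
      def shutter_open(door_closed, latched, request):
          return request and door_closed and latched

      # Assumed safety requirement: "shutter open while the door is open"
      # must be unreachable for every input combination.
      hazards = [(d, l, r)
                 for d, l, r in product([False, True], repeat=3)
                 if shutter_open(d, l, r) and not d]
      print('hazardous input combinations:', hazards or 'none')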

  1. Managing Measurement Uncertainty in Building Acoustics

    Directory of Open Access Journals (Sweden)

    Chiara Scrosati

    2015-12-01

    Full Text Available In general, uncertainties should preferably be determined following the principles laid down in ISO/IEC Guide 98-3, the Guide to the expression of uncertainty in measurement (GUM:1995). According to current knowledge, it seems impossible to formulate these models for the different quantities in building acoustics. Therefore, the concepts of repeatability and reproducibility are necessary to determine the uncertainty of building acoustics measurements. This study shows the uncertainty of field measurements of a lightweight wall, a heavyweight floor, a façade with a single glazing window and a façade with a double glazing window, analyzed by a Round Robin Test (RRT) conducted in a full-scale experimental building at ITC-CNR (Construction Technologies Institute of the National Research Council of Italy). The single number quantities and their uncertainties were evaluated in both the narrow and the enlarged frequency range, and it was shown that including or excluding the low frequencies leads to very significant differences, except in the case of the sound insulation of façades with a single glazing window. The results obtained in these RRTs were compared with other results from the literature, which confirm the increase of the uncertainty of single number quantities due to the low-frequency extension. Having stated the measurement uncertainty for a single measurement, in building acoustics it is also very important to deal with sampling for the purposes of classification of buildings or building units. Therefore, this study also shows an application of the sampling included in the Italian Standard on the acoustic classification of building units on a serial-type building consisting of 47 building units. It was found that the greatest variability is observed in the façade, and that it depends both on the great variability of window typologies and on workmanship. Finally, it is suggested how to manage the uncertainty in building acoustics, both for one single
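
    A small sketch of how repeatability and reproducibility standard deviations can be extracted from round-robin data in the manner of ISO 5725-2; the laboratory results below are invented and are not the RRT data of the study.

      import numpy as np

      # Hypothetical RRT data: a sound insulation single number quantity (dB)
      # measured by 5 laboratories, 3 repetitions each (values are invented).
      labs = np.array([
          [52.1, 52.4, 51.9],
          [51.5, 51.8, 51.6],
          [52.8, 52.6, 52.9],
          [51.9, 52.0, 52.2],
          [52.3, 52.1, 52.5],
      ])
      p, n = labs.shape

      # Repeatability variance: pooled within-laboratory variance.
      s_r2 = labs.var(axis=1, ddof=1).mean()
      # Between-laboratory variance of lab means, corrected per ISO 5725-2.
      s_L2 = max(labs.mean(axis=1).var(ddof=1) - s_r2 / n, 0.0)
      # Reproducibility variance combines both components.
      s_R2 = s_r2 + s_L2

      print(f"s_r = {np.sqrt(s_r2):.2f} dB, s_R = {np.sqrt(s_R2):.2f} dB")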

  2. OpenTURNS, an open source uncertainty engineering software

    International Nuclear Information System (INIS)

    Popelin, A.L.; Dufoy, A.

    2013-01-01

    The need to assess robust performance for complex systems has led to the emergence of a new industrial simulation challenge: taking uncertainties into account when dealing with complex numerical simulation frameworks. EDF has taken part in the development of an open source software platform dedicated to uncertainty propagation by probabilistic methods, named OpenTURNS, for Open source Treatment of Uncertainty, Risk and Statistics. OpenTURNS includes a large variety of qualified algorithms for managing uncertainties in industrial studies, from the uncertainty quantification step (with possibilities to model stochastic dependence thanks to copula theory and stochastic processes), to the uncertainty propagation step (with some innovative simulation algorithms such as the ziggurat method for normal variables) and the sensitivity analysis step (with sensitivity indices based on the evaluation of means conditioned on the realization of a particular event). It also enables the user to build response surfaces that can include the stochastic modeling (for example, with the polynomial chaos method). Generic wrappers to link OpenTURNS to the modeling software are proposed. Finally, OpenTURNS is extensively documented, with guidelines to support both use and contribution
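
    A minimal sketch of what a propagation study looks like in OpenTURNS' Python interface; the distributions and the toy model y = x1 * x2 are invented, and the API names follow recent OpenTURNS releases, so they should be checked against the installed version.

      import openturns as ot

      # Two uncertain inputs and a toy model y = x1 * x2.
      x1 = ot.Normal(10.0, 0.5)      # e.g., an uncertain load
      x2 = ot.Uniform(0.9, 1.1)      # e.g., an uncertain efficiency factor
      inputs = ot.ComposedDistribution([x1, x2])

      model = ot.SymbolicFunction(['x1', 'x2'], ['x1 * x2'])

      # Uncertainty propagation by plain Monte Carlo sampling.
      in_sample = inputs.getSample(10000)
      out_sample = model(in_sample)

      mean = out_sample.computeMean()[0]
      var = out_sample.computeCovariance()[0, 0]
      print('y mean = %.3f, y std = %.3f' % (mean, var ** 0.5))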

  3. A Formalization of Linkage Analysis

    DEFF Research Database (Denmark)

    Ingolfsdottir, Anna; Christensen, A.I.; Hansen, Jens A.

    In this report a formalization of genetic linkage analysis is introduced. Linkage analysis is a computationally hard biomathematical method, which purpose is to locate genes on the human genome. It is rooted in the new area of bioinformatics and no formalization of the method has previously been ...

  4. Probabilistic accident consequence uncertainty analysis: Food chain uncertainty assessment. Volume 1: Main report

    Energy Technology Data Exchange (ETDEWEB)

    Brown, J. [National Radiological Protection Board (United Kingdom)]; Goossens, L.H.J.; Kraan, B.C.P. [Delft Univ. of Technology (Netherlands)]; and others

    1997-06-01

    This volume is the first of a two-volume document that summarizes a joint project conducted by the US Nuclear Regulatory Commission and the European Commission to assess uncertainties in the MACCS and COSYMA probabilistic accident consequence codes. These codes were developed primarily for estimating the risks presented by nuclear reactors based on postulated frequencies and magnitudes of potential accidents. This document reports on an ongoing project to assess uncertainty in the MACCS and COSYMA calculations for the offsite consequences of radionuclide releases by hypothetical nuclear power plant accidents. A panel of sixteen experts was formed to compile credible and traceable uncertainty distributions for food chain variables that affect calculations of offsite consequences. The expert judgment elicitation procedure and its outcomes are described in these volumes. Other panels were formed to consider uncertainty in other aspects of the codes. Their results are described in companion reports. Volume 1 contains background information and a complete description of the joint consequence uncertainty study. Volume 2 contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures for both panels, (3) the rationales and results for the panels on soil and plant transfer and animal transfer, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  5. Probabilistic accident consequence uncertainty analysis: Food chain uncertainty assessment. Volume 1: Main report

    International Nuclear Information System (INIS)

    Brown, J.; Goossens, L.H.J.; Kraan, B.C.P.

    1997-06-01

    This volume is the first of a two-volume document that summarizes a joint project conducted by the US Nuclear Regulatory Commission and the European Commission to assess uncertainties in the MACCS and COSYMA probabilistic accident consequence codes. These codes were developed primarily for estimating the risks presented by nuclear reactors based on postulated frequencies and magnitudes of potential accidents. This document reports on an ongoing project to assess uncertainty in the MACCS and COSYMA calculations for the offsite consequences of radionuclide releases by hypothetical nuclear power plant accidents. A panel of sixteen experts was formed to compile credible and traceable uncertainty distributions for food chain variables that affect calculations of offsite consequences. The expert judgment elicitation procedure and its outcomes are described in these volumes. Other panels were formed to consider uncertainty in other aspects of the codes. Their results are described in companion reports. Volume 1 contains background information and a complete description of the joint consequence uncertainty study. Volume 2 contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures for both panels, (3) the rationales and results for the panels on soil and plant transfer and animal transfer, (4) short biographies of the experts, and (5) the aggregated results of their responses

  6. The use of error and uncertainty methods in the medical laboratory.

    Science.gov (United States)

    Oosterhuis, Wytze P; Bayat, Hassan; Armbruster, David; Coskun, Abdurrahman; Freeman, Kathleen P; Kallner, Anders; Koch, David; Mackenzie, Finlay; Migliarino, Gabriel; Orth, Matthias; Sandberg, Sverre; Sylte, Marit S; Westgard, Sten; Theodorsson, Elvar

    2018-01-26

    Error methods - compared with uncertainty methods - offer simpler, more intuitive and practical procedures for calculating measurement uncertainty and conducting quality assurance in laboratory medicine. However, uncertainty methods are preferred in other fields of science as reflected by the guide to the expression of uncertainty in measurement. When laboratory results are used for supporting medical diagnoses, the total uncertainty consists only partially of analytical variation. Biological variation, pre- and postanalytical variation all need to be included. Furthermore, all components of the measuring procedure need to be taken into account. Performance specifications for diagnostic tests should include the diagnostic uncertainty of the entire testing process. Uncertainty methods may be particularly useful for this purpose but have yet to show their strength in laboratory medicine. The purpose of this paper is to elucidate the pros and cons of error and uncertainty methods as groundwork for future consensus on their use in practical performance specifications. Error and uncertainty methods are complementary when evaluating measurement data.
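
    As a small illustration of the point that total uncertainty extends beyond analytical variation, the sketch below combines an invented analytical coefficient of variation with within-subject biological variation and derives a reference change value, following formulas standard in laboratory medicine.

      import math

      # Illustrative (invented) coefficients of variation for a serum analyte:
      cv_analytical = 3.0   # analytical imprecision, %
      cv_biological = 5.5   # within-subject biological variation, %

      # Total variation relevant when interpreting a single patient result.
      cv_total = math.sqrt(cv_analytical**2 + cv_biological**2)

      # Reference change value: the percentage difference between two
      # consecutive results needed to call a change significant (two-sided, 95%).
      z = 1.96
      rcv = math.sqrt(2) * z * cv_total
      print(f"CV_total = {cv_total:.1f}%, RCV = {rcv:.1f}%")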

  7. Generalized Information Theory Meets Human Cognition: Introducing a Unified Framework to Model Uncertainty and Information Search.

    Science.gov (United States)

    Crupi, Vincenzo; Nelson, Jonathan D; Meder, Björn; Cevolani, Gustavo; Tentori, Katya

    2018-06-17

    Searching for information is critical in many situations. In medicine, for instance, careful choice of a diagnostic test can help narrow down the range of plausible diseases that the patient might have. In a probabilistic framework, test selection is often modeled by assuming that people's goal is to reduce uncertainty about possible states of the world. In cognitive science, psychology, and medical decision making, Shannon entropy is the most prominent and most widely used model to formalize probabilistic uncertainty and the reduction thereof. However, a variety of alternative entropy metrics (Hartley, Quadratic, Tsallis, Rényi, and more) are popular in the social and the natural sciences, computer science, and philosophy of science. Particular entropy measures have been predominant in particular research areas, and it is often an open issue whether these divergences emerge from different theoretical and practical goals or are merely due to historical accident. Cutting across disciplinary boundaries, we show that several entropy and entropy reduction measures arise as special cases in a unified formalism, the Sharma-Mittal framework. Using mathematical results, computer simulations, and analyses of published behavioral data, we discuss four key questions: How do various entropy models relate to each other? What insights can be obtained by considering diverse entropy models within a unified framework? What is the psychological plausibility of different entropy models? What new questions and insights for research on human information acquisition follow? Our work provides several new pathways for theoretical and empirical research, reconciling apparently conflicting approaches and empirical findings within a comprehensive and unified information-theoretic formalism. Copyright © 2018 Cognitive Science Society, Inc.
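
    For concreteness, a sketch of the Sharma-Mittal family with order r and degree t, whose limits recover Shannon (r, t to 1), Rényi (t to 1) and Tsallis (t = r) entropies; the formula is the standard definition, while the numerical guards are our own.

      import numpy as np

      def sharma_mittal(p, r, t, eps=1e-12):
          """Sharma-Mittal entropy of a discrete distribution p."""
          p = np.asarray(p, dtype=float)
          p = p[p > 0]
          if abs(r - 1.0) < eps and abs(t - 1.0) < eps:   # Shannon limit
              return -np.sum(p * np.log(p))
          if abs(r - 1.0) < eps:                          # r -> 1 (Gaussian case)
              h = -np.sum(p * np.log(p))
              return (np.exp((1.0 - t) * h) - 1.0) / (1.0 - t)
          s = np.sum(p ** r)
          if abs(t - 1.0) < eps:                          # Renyi limit
              return np.log(s) / (1.0 - r)
          return (s ** ((1.0 - t) / (1.0 - r)) - 1.0) / (1.0 - t)

      p = [0.5, 0.25, 0.25]
      print(sharma_mittal(p, 1.0, 1.0))   # Shannon
      print(sharma_mittal(p, 2.0, 1.0))   # Renyi of order 2
      print(sharma_mittal(p, 2.0, 2.0))   # Tsallis of order 2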

  8. On treatment of uncertainty in system planning

    International Nuclear Information System (INIS)

    Flage, R.; Aven, T.

    2009-01-01

    In system planning and operation considerable efforts and resources are spent to reduce uncertainties, as a part of project management, uncertainty management and safety management. The basic idea seems to be that uncertainties are purely negative and should be reduced. In this paper we challenge this way of thinking, using a common industry practice as an example. In accordance with this industry practice, three uncertainty interval categories are used: ±40% intervals for the feasibility phase, ±30% intervals for the concept development phase and ±20% intervals for the engineering phase. The problem is that such a scheme could easily lead to a conservative management regime that encourages the use of existing methods and tools, since new activities and novel solutions and arrangements necessarily mean increased uncertainties. In the paper we suggest an alternative approach based on uncertainty and risk descriptions, but having no predefined uncertainty reduction structures. The approach makes use of risk assessments and economic optimisation tools such as the expected net present value, but acknowledges the need for broad risk management processes which extend beyond the analyses. Different concerns need to be balanced, including economic aspects, uncertainties and risk, and practicability

  9. Fundamentals of the Pure Spinor Formalism

    CERN Document Server

    Hoogeveen, Joost

    2010-01-01

    This thesis presents recent developments within the pure spinor formalism, which has simplified amplitude computations in perturbative string theory, especially when spacetime fermions are involved. Firstly the worldsheet action of both the minimal and the non-minimal pure spinor formalism is derived from first principles, i.e. from an action with two dimensional diffeomorphism and Weyl invariance. Secondly the decoupling of unphysical states in the minimal pure spinor formalism is proved

  10. Evaluation of cutting force uncertainty components in turning

    DEFF Research Database (Denmark)

    Axinte, Dragos Aurelian; Belluco, Walter; De Chiffre, Leonardo

    2000-01-01

    A procedure is proposed for the evaluation of those uncertainty components of a single cutting force measurement in turning that are related to the contributions of the dynamometer calibration and the cutting process itself. Based on an empirical model including errors from both sources…, the uncertainty for a single measurement of cutting force is presented, and expressions for the expected uncertainty vs. cutting parameters are proposed. This approach gives the possibility of evaluating cutting force uncertainty components in turning, for a defined range of cutting parameters, based on few…
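
    A minimal sketch of the kind of combination the record describes, with invented numbers: two independent standard-uncertainty contributions (calibration and process) combined GUM-style and then expanded.

      import math

      # Illustrative uncertainty budget for one cutting-force measurement
      # (all numbers invented, not taken from the cited study).
      u_calibration = 4.0   # N, standard uncertainty from dynamometer calibration
      u_process = 7.5       # N, standard uncertainty from cutting-process scatter

      # GUM-style combination of independent components, then expansion
      # (k = 2, roughly 95% coverage for a normal distribution).
      u_combined = math.sqrt(u_calibration**2 + u_process**2)
      U_expanded = 2.0 * u_combined
      print(f"u_c = {u_combined:.1f} N, U (k=2) = {U_expanded:.1f} N")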

  11. Sources of uncertainty in flood inundation maps

    Science.gov (United States)

    Bales, J.D.; Wagner, C.R.

    2009-01-01

    Flood inundation maps typically have been used to depict inundated areas for floods having specific exceedance levels. The uncertainty associated with the inundation boundaries is seldom quantified, in part because not all sources of uncertainty are recognized and because data to quantify them are seldom available. Sources of uncertainty discussed in this paper include hydrologic data used for hydraulic model development and validation, topographic data, and the hydraulic model. The assumption of steady flow, which typically is made to produce inundation maps, has less of an effect on predicted inundation at lower flows than at higher flows, because more time typically is required to inundate areas at high flows than at low flows. Difficulties with establishing reasonable cross sections that do not intersect and that represent water-surface slopes in tributaries contribute additional uncertainties in the hydraulic modelling. As a result, uncertainty in the flood inundation polygons simulated with a one-dimensional model increases with distance from the main channel.

  12. Construction of a case for expert judgement of uncertainty in early health effects models

    International Nuclear Information System (INIS)

    Grupa, J.

    1997-11-01

    The contribution of ECN to a joint study of the European Commission (EC) and the US Nuclear Regulatory Commission (NRC), in which the uncertainty in risks and consequences of severe accidents at nuclear power plants is evaluated, is described. The procedure used to obtain these uncertainties is called expert judgement. In a formal expert judgement procedure, a panel of experts provides quantitative information about the uncertainty in given observables: quantities that describe observations concerning the phenomenon of interest (here, the relation between dose and health effects) without information or assumptions about any model describing this phenomenon. The observables are defined in a case structure, a questionnaire provided to all experts. ECN contributed to the selection of the experts for the early health effects panel, and provided assistance in drafting the case structure for this panel. This paper describes the radiological information provided by ECN and the analyses necessary for constructing the case structure. The deliverables of the expert elicitation are uncertainty distributions of the observables requested in the case structure. The results are intended to be unbiased, i.e. applicable to any model describing the relation between dose and health effects. They will be published by the project team in a joint publication of the NRC and the EC. In this way the resulting uncertainty distributions are available for further work in the joint project and to a more general public. 2 figs., 4 refs

  13. Uncertainty quantification of fast sodium current steady-state inactivation for multi-scale models of cardiac electrophysiology.

    Science.gov (United States)

    Pathmanathan, Pras; Shotwell, Matthew S; Gavaghan, David J; Cordeiro, Jonathan M; Gray, Richard A

    2015-01-01

    Perhaps the most mature area of multi-scale systems biology is the modelling of the heart. Current models are grounded in over fifty years of research in the development of biophysically detailed models of the electrophysiology (EP) of cardiac cells, but one aspect which is inadequately addressed is the incorporation of uncertainty and physiological variability. Uncertainty quantification (UQ) is the identification and characterisation of the uncertainty in model parameters derived from experimental data, and the computation of the resultant uncertainty in model outputs. It is a necessary tool for establishing the credibility of computational models, and will likely be expected of EP models for future safety-critical clinical applications. The focus of this paper is formal UQ of one major sub-component of cardiac EP models, the steady-state inactivation of the fast sodium current, INa. To better capture average behaviour and quantify variability across cells, we have applied for the first time an 'individual-based' statistical methodology to assess voltage clamp data. Advantages of this approach over a more traditional 'population-averaged' approach are highlighted. The method was used to characterise variability amongst cells isolated from canine epi- and endocardium, and this variability was then 'propagated forward' through a canine model to determine the resultant uncertainty in model predictions at different scales, such as upstroke velocity and spiral wave dynamics. Statistically significant differences between epi- and endocardial cells (greater half-inactivation and a less steep slope of the steady-state inactivation curve for endo) were observed, and the forward propagation revealed a lack of robustness of the model to underlying variability, but also surprising robustness to variability at the tissue scale. Overall, the methodology can be used to: (i) better analyse voltage clamp data; (ii) characterise underlying population variability; (iii) investigate
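
    For concreteness, a small sketch of the sub-model in question: fitting a Boltzmann steady-state inactivation curve h(V) = 1 / (1 + exp((V - V1/2)/k)) to synthetic voltage-clamp data and reading off parameter uncertainties. The data and values are invented, and the statistical treatment here is ordinary least squares, not the paper's individual-based methodology.

      import numpy as np
      from scipy.optimize import curve_fit

      def boltzmann(v, v_half, k):
          """Steady-state inactivation of INa vs. prepulse voltage (mV)."""
          return 1.0 / (1.0 + np.exp((v - v_half) / k))

      # Invented data for one cell: normalized peak current after prepulses.
      v = np.arange(-120, -35, 5.0)
      rng = np.random.default_rng(0)
      y = boltzmann(v, -80.0, 6.0) + rng.normal(0.0, 0.02, v.size)

      popt, pcov = curve_fit(boltzmann, v, y, p0=(-75.0, 5.0))
      perr = np.sqrt(np.diag(pcov))   # 1-sigma parameter uncertainties
      print(f"V1/2 = {popt[0]:.1f} +/- {perr[0]:.1f} mV, "
            f"k = {popt[1]:.2f} +/- {perr[1]:.2f} mV")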

  14. Sensitivity/uncertainty analysis for the Hiroshima dosimetry reevaluation effort

    International Nuclear Information System (INIS)

    Broadhead, B.L.; Lillie, R.A.; Pace, J.V. III; Cacuci, D.G.

    1987-01-01

    Uncertainty estimates and cross correlations by range/survivor location have been obtained for the free-in-air (FIA) tissue kerma for the Hiroshima atomic event. These uncertainties in the FIA kerma include contributions due to various modeling parameters and the basic cross section data and are given at three ground ranges, 700, 1000 and 1500 m. The estimated uncertainties are nearly constant over the given ground ranges and are approximately 27% for the prompt neutron kerma and secondary gamma kerma and 35% for the prompt gamma kerma. The total kerma uncertainty is dominated by the secondary gamma kerma uncertainties which are in turn largely due to the modeling parameter uncertainties

  15. Leibniz' First Formalization of Syllogistics

    DEFF Research Database (Denmark)

    Robering, Klaus

    2014-01-01

    of letters just those which belong to the useful, i.e., valid, modes. The set of codes of valid modes turns out to be a so-called "regular" language (in the sense of formal-language-theory). Leibniz' formalization of syllogistics in his Dissertatio thus contains an estimation of the computational complexity...

  16. Formalizing the concept phase of product development

    NARCIS (Netherlands)

    Schuts, M.; Hooman, J.

    2015-01-01

    We discuss the use of formal techniques to improve the concept phase of product realisation. As an industrial application, a new concept of interventional X-ray systems has been formalized, using model checking techniques and the simulation of formal models. cop. Springer International Publishing

  17. A survey of formal languages for contracts

    DEFF Research Database (Denmark)

    Hvitved, Tom

    2010-01-01

    In this short paper we present the current status of formal languages and models for contracts. By a formal model is meant an unambiguous and rigorous representation of contracts, in order to enable their automatic validation, execution, and analysis, activities that are collectively referred to as contract lifecycle management (CLM). We present a set of formalism requirements, which represent features that any ideal contract model should support, and based on these we present a comparative survey of existing contract formalisms.

  18. Quantification of margins and uncertainties: Alternative representations of epistemic uncertainty

    International Nuclear Information System (INIS)

    Helton, Jon C.; Johnson, Jay D.

    2011-01-01

    In 2001, the National Nuclear Security Administration of the U.S. Department of Energy in conjunction with the national security laboratories (i.e., Los Alamos National Laboratory, Lawrence Livermore National Laboratory and Sandia National Laboratories) initiated development of a process designated Quantification of Margins and Uncertainties (QMU) for the use of risk assessment methodologies in the certification of the reliability and safety of the nation's nuclear weapons stockpile. A previous presentation, 'Quantification of Margins and Uncertainties: Conceptual and Computational Basis,' describes the basic ideas that underlie QMU and illustrates these ideas with two notional examples that employ probability for the representation of aleatory and epistemic uncertainty. The current presentation introduces and illustrates the use of interval analysis, possibility theory and evidence theory as alternatives to the use of probability theory for the representation of epistemic uncertainty in QMU-type analyses. The following topics are considered: the mathematical structure of alternative representations of uncertainty, alternative representations of epistemic uncertainty in QMU analyses involving only epistemic uncertainty, and alternative representations of epistemic uncertainty in QMU analyses involving a separation of aleatory and epistemic uncertainty. Analyses involving interval analysis, possibility theory and evidence theory are illustrated with the same two notional examples used in the presentation indicated above to illustrate the use of probability to represent aleatory and epistemic uncertainty in QMU analyses.
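
    A minimal sketch of the simplest of these alternatives, interval analysis, for a toy margin check. The model, bounds and requirement are invented; corner evaluation bounds the response here only because the toy model is monotone in each input over the box.

      from itertools import product

      # Toy performance model with epistemic uncertainty on a and b expressed
      # only as intervals (no probability distributions assumed).
      def f(a, b):
          return a * b - 0.5 * a

      a_iv = (1.8, 2.2)   # hypothetical expert-given bounds
      b_iv = (0.9, 1.3)

      # Bound the response interval by evaluating all corners of the input box.
      values = [f(a, b) for a, b in product(a_iv, b_iv)]
      lo, hi = min(values), max(values)
      print(f"response interval: [{lo:.2f}, {hi:.2f}]")

      # QMU-style check: is the margin to a requirement (say f >= 1.0)
      # preserved under the worst case of the epistemic uncertainty?
      print("requirement met for all corners:", lo >= 1.0)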

  19. Improving Project Management Using Formal Models and Architectures

    Science.gov (United States)

    Kahn, Theodore; Sturken, Ian

    2011-01-01

    This talk discusses the advantages formal modeling and architecture brings to project management. These emerging technologies have both great potential and challenges for improving information available for decision-making. The presentation covers standards, tools and cultural issues needing consideration, and includes lessons learned from projects the presenters have worked on.

  20. 20 CFR 702.336 - Formal hearings; new issues.

    Science.gov (United States)

    2010-04-01

    ... 20 Employees' Benefits 3 2010-04-01 2010-04-01 false Formal hearings; new issues. 702.336 Section... Procedures Formal Hearings § 702.336 Formal hearings; new issues. (a) If, during the course of the formal hearing, the evidence presented warrants consideration of an issue or issues not previously considered...

  1. The Interrelatedness of Formal, Non-Formal and Informal Learning: Evidence from Labour Market Program Participants

    Science.gov (United States)

    Cameron, Roslyn; Harrison, Jennifer L.

    2012-01-01

    Definitions, differences and relationships between formal, non-formal and informal learning have long been contentious. There has been a significant change in language and reference from adult education to what amounts to forms of learning categorised by their modes of facilitation. Nonetheless, there is currently a renewed interest in the…

  2. Digital Resource Developments for Mathematics Education Involving Homework across Formal, Non-Formal and Informal Settings

    Science.gov (United States)

    Radovic, Slaviša; Passey, Don

    2016-01-01

    The aim of this paper is to explore further an under-developed area--how drivers of curriculum, pedagogy and assessment conceptions and practices shape the creation and uses of technologically based resources to support mathematics learning across informal, non-formal and formal learning environments. The paper considers: the importance of…

  3. NON-FORMAL EDUCATION, OVEREDUCATION AND WAGES

    OpenAIRE

    SANDRA NIETO; RAÚL RAMOS

    2013-01-01

    Why do overeducated workers participate in non-formal education activities? Do not they suffer from an excess of education? Using microdata from the Spanish sample of the 2007 Adult Education Survey, we have found that overeducated workers participate more than the rest in non-formal education and that they earn higher wages than overeducated workers who did not participate. This result can be interpreted as evidence that non-formal education allows overeducated workers to acquire new abiliti...

  4. Climate change decision-making: Model & parameter uncertainties explored

    Energy Technology Data Exchange (ETDEWEB)

    Dowlatabadi, H.; Kandlikar, M.; Linville, C.

    1995-12-31

    A critical aspect of climate change decision-making is the uncertainty in our current understanding of the socioeconomic, climatic and biogeochemical processes involved. Decision-making processes are much better informed if these uncertainties are characterized and their implications understood. Quantitative analyses of these uncertainties serve to inform decision makers about the likely outcome of policy initiatives, and help set priorities for research so that outcome ambiguities faced by the decision-makers are reduced. A family of integrated assessment models of climate change has been developed at Carnegie Mellon. These models are distinguished from other integrated assessment efforts in that they were designed from the outset to characterize and propagate parameter, model, value, and decision-rule uncertainties. The most recent of these models is ICAM 2.1. This model includes representations of the processes of demographics, economic activity, emissions, atmospheric chemistry, climate and sea level change, impacts from these changes, and policies for emissions mitigation and adaptation to change. The model has over 800 objects, of which about one half are used to represent uncertainty. In this paper we show that, when considering parameter uncertainties, the relative contribution of climatic uncertainties is most important, followed by uncertainties in damage calculations, economic uncertainties and direct aerosol forcing uncertainties. When considering model structure uncertainties we find that the choice of policy is often dominated by the model structure choice, rather than by parameter uncertainties.
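
    A sketch of the kind of parameter-uncertainty propagation described here, using an invented reduced-form damage model (not ICAM) and a rank-correlation measure of which input dominates the output spread.

      import numpy as np
      from scipy.stats import spearmanr

      rng = np.random.default_rng(42)
      n = 20_000

      # Hypothetical reduced-form model: damages (% GDP) as a product of
      # climate sensitivity, emissions growth and a damage slope.
      climate_sens = rng.lognormal(np.log(3.0), 0.4, n)      # degC per CO2 doubling
      emis_growth = rng.normal(1.5, 0.3, n)                  # %/yr
      damage_slope = rng.triangular(0.1, 0.25, 0.5, n)       # % GDP per degC

      damages = damage_slope * climate_sens * (1.0 + 0.2 * emis_growth)

      # Rank-correlation sensitivity: which input drives output uncertainty?
      for name, x in [('climate sensitivity', climate_sens),
                      ('emissions growth', emis_growth),
                      ('damage slope', damage_slope)]:
          rho, _ = spearmanr(x, damages)
          print(f"{name:20s} rho = {rho:+.2f}")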

  5. The Law, Policy, and Politics of Formal Hypnosis in the Public Community College Classroom.

    Science.gov (United States)

    Sachs, Steven Mark

    Information from printed sources, legal documents, and interviews with community college administrators formed the basis of an investigation of the legal, policy, and political implications of the use of formal hypnosis as an instructional augmentation in the community college classroom. Study findings included the following: (1) no formal policy…

  6. Influences of Formal Learning, Personal Learning Orientation, and Supportive Learning Environment on Informal Learning

    Science.gov (United States)

    Choi, Woojae; Jacobs, Ronald L.

    2011-01-01

    While workplace learning includes formal and informal learning, the relationship between the two has been overlooked, because they have been viewed as separate entities. This study investigated the effects of formal learning, personal learning orientation, and supportive learning environment on informal learning among 203 middle managers in Korean…

  7. 40 CFR 35.938-4 - Formal advertising.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 1 2010-07-01 2010-07-01 false Formal advertising. 35.938-4 Section 35... advertising. Each contract shall be awarded after formal advertising, unless negotiation is permitted in accordance with § 35.936-18. Formal advertising shall be in accordance with the following: (a) Adequate...

  8. Being Included and Excluded

    DEFF Research Database (Denmark)

    Korzenevica, Marina

    2016-01-01

    Following the civil war of 1996–2006, there was a dramatic increase in the labor mobility of young men and the inclusion of young women in formal education, which led to the transformation of the political landscape of rural Nepal. Mobility and schooling represent a level of prestige that rural… politics. It analyzes how formal education and mobility either challenge or reinforce traditional gendered norms which dictate a lowly position for young married women in the household and their absence from community politics. The article concludes that women are simultaneously excluded and included from… community politics. On the one hand, their mobility and decision-making powers decrease with the increase in the labor mobility of men and their newly gained education is politically devalued when compared to the informal education that men gain through mobility, but on the other hand, schooling strengthens

  9. A FORMALISM FOR FUZZY BUSINESS RULES

    Directory of Open Access Journals (Sweden)

    Vasile Mazilescu

    2015-05-01

    Full Text Available The aim of this paper is to provide a formalism for fuzzy rule bases, included in our prototype system FUZZY_ENTERPRISE. This framework can be used in Distributed Knowledge Management Systems (DKMSs), real-time interdisciplinary decision-making systems that often require increasing technical support for high-quality, timely decisions. The language of first-order predicates facilitates the formulation of complex knowledge in a rigorous way, imposing appropriate reasoning techniques.
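
    A minimal sketch of the kind of rule evaluation such a formalism must support, using Mamdani-style min inference; the rule, variables and membership functions are invented, not taken from FUZZY_ENTERPRISE.

      def tri(x, a, b, c):
          """Triangular membership function with support [a, c] and peak at b."""
          if x <= a or x >= c:
              return 0.0
          return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

      # Rule: IF demand IS high AND stock IS low THEN reorder IS urgent
      demand, stock = 82.0, 14.0
      mu_demand_high = tri(demand, 60, 100, 140)
      mu_stock_low = tri(stock, -20, 0, 40)

      # Fuzzy AND as minimum; the rule's firing strength grades the conclusion.
      firing = min(mu_demand_high, mu_stock_low)
      print(f"degree to which 'reorder IS urgent' holds: {firing:.2f}")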

  10. Justification for recommended uncertainties

    International Nuclear Information System (INIS)

    Pronyaev, V.G.; Badikov, S.A.; Carlson, A.D.

    2007-01-01

    The uncertainties obtained in an earlier standards evaluation were considered to be unrealistically low by experts of the US Cross Section Evaluation Working Group (CSEWG). Therefore, the CSEWG Standards Subcommittee replaced the covariance matrices of evaluated uncertainties by expanded percentage errors that were assigned to the data over wide energy groups. There are a number of reasons that might lead to low uncertainties of the evaluated data: underestimation of the correlations existing between the results of different measurements; the presence of unrecognized systematic uncertainties in the experimental data, which can lead to biases in the evaluated data as well as to underestimation of the resulting uncertainties; and the fact that uncertainties for correlated data cannot be characterized by percentage uncertainties or variances alone. Covariances between the evaluated value at 0.2 MeV and other points, obtained in model (RAC R-matrix and PADE2 analytical expansion) and non-model (GMA) fits of the 6Li(n,t) TEST1 data, are presented together with the correlation coefficients. Covariances between the evaluated value at 0.045 MeV and other points (along the line or column of the matrix), as obtained in EDA and RAC R-matrix fits of the data available for reactions that pass through the formation of the 7Li system, are also discussed. The GMA fit with the GMA database is shown for comparison. The following diagrams are discussed: percentage uncertainties of the evaluated cross section for the 6Li(n,t) reaction and for the 235U(n,f) reaction; the estimation given by CSEWG experts; the GMA result with the full GMA database, including experimental data for the 6Li(n,t), 6Li(n,n) and 6Li(n,total) reactions; uncertainties in the GMA combined fit for the standards; and the EDA and RAC R-matrix results, respectively. Uncertainties of absolute and 252Cf fission spectrum averaged cross section measurements, and deviations between measured and evaluated values for 235U(n,f) cross sections in the neutron energy range 1

  11. Modeling Uncertainty in Climate Change: A Multi-Model Comparison

    Energy Technology Data Exchange (ETDEWEB)

    Gillingham, Kenneth; Nordhaus, William; Anthoff, David; Blanford, Geoffrey J.; Bosetti, Valentina; Christensen, Peter; McJeon, Haewon C.; Reilly, J. M.; Sztorc, Paul

    2015-10-01

    The economics of climate change involves a vast array of uncertainties, complicating both the analysis and development of climate policy. This study presents the results of the first comprehensive study of uncertainty in climate change using multiple integrated assessment models. The study looks at model and parametric uncertainties for population, total factor productivity, and climate sensitivity and estimates the pdfs of key output variables, including CO2 concentrations, temperature, damages, and the social cost of carbon (SCC). One key finding is that parametric uncertainty is more important than uncertainty in model structure. Our resulting pdfs also provide insight on tail events.

  12. A Framework for Formal Modeling and Analysis of Organizations

    NARCIS (Netherlands)

    Jonker, C.M.; Sharpanskykh, O.; Treur, J.; P., Yolum

    2007-01-01

    A new, formal, role-based, framework for modeling and analyzing both real world and artificial organizations is introduced. It exploits static and dynamic properties of the organizational model and includes the (frequently ignored) environment. The transition is described from a generic framework of

  13. Seniority in projection operator formalism

    International Nuclear Information System (INIS)

    Ullah, N.

    1976-01-01

    It is shown that the concept of seniority can be introduced in projection operator formalism through the use of the operator Q, which has been defined by de-Shalit and Talmi. The usefulness of seniority concept in projection operator formalism is discussed. An example of four nucleons in j=3/2 configuration is given for illustrative purposes

  14. A Mathematical Formalization Proposal for Business Growth

    Directory of Open Access Journals (Sweden)

    Gheorghe BAILESTEANU

    2013-01-01

    Full Text Available Economic sciences have undergone a spectacular evolution in the last century, beginning to use axiomatic methods and applying mathematical instruments as decision-making tools. The quest for formalization needs to be addressed from various angles: reducing formal entry and operating costs, increasing the incentives for firms to operate formally, reducing obstacles to their growth, and searching for inexpensive approaches through which to enforce compliance with government regulations. This paper proposes a formalized approach to business growth, based on mathematics and logic, taking into consideration the particularities of the economic sector.

  15. What Determines Firms’ Decisions to Formalize?

    OpenAIRE

    Neil McCulloch; Günther G. Schulze; Janina Voss

    2010-01-01

    In this paper we analyze the decision of small and micro firms to formalize, i.e. to obtain business and other licenses in rural Indonesia. We use the rural investment climate survey (RICS) that consists of non-farm rural enterprises, most of them microenterprises, and analyze the effect of formalization on tax payments, corruption, access to credit and revenue, taking into account the endogeneity of the formalization decision to such benefits and costs. We show, contrary to most of the liter...

  16. Holistic uncertainty analysis in river basin modeling for climate vulnerability assessment

    Science.gov (United States)

    Taner, M. U.; Wi, S.; Brown, C.

    2017-12-01

    The challenges posed by an uncertain future climate are a prominent concern for water resources managers. A number of frameworks exist for assessing the impacts of climate-related uncertainty, including internal climate variability and anthropogenic climate change, such as scenario-based approaches and vulnerability-based approaches. While in many cases climate uncertainty may be dominant, other factors such as the future evolution of the river basin, the hydrologic response and reservoir operations are potentially significant sources of uncertainty. While uncertainty associated with modeling hydrologic response has received attention, very little work has focused on the range of uncertainty and possible effects of the water resources infrastructure and management. This work presents a holistic framework that allows analysis of climate, hydrologic and water management uncertainty in water resources systems analysis with the aid of a water system model designed to integrate component models for hydrology processes and water management activities. The uncertainties explored include those associated with climate variability and change, hydrologic model parameters, and water system operation rules. A Bayesian framework is used to quantify and model the uncertainties at each modeling step in an integrated fashion, including prior and likelihood information about model parameters. The framework is demonstrated in a case study for the St. Croix Basin located at the border of the United States and Canada.

  17. Evaluating measurement uncertainty in fluid phase equilibrium calculations

    Science.gov (United States)

    van der Veen, Adriaan M. H.

    2018-04-01

    The evaluation of measurement uncertainty in accordance with the ‘Guide to the expression of uncertainty in measurement’ (GUM) has not yet become widespread in physical chemistry. With only the law of the propagation of uncertainty from the GUM, many of these uncertainty evaluations would be cumbersome, as models are often non-linear and require iterative calculations. The methods from GUM supplements 1 and 2 enable the propagation of uncertainties under most circumstances. Experimental data in physical chemistry are used, for example, to derive reference property data and support trade—all applications where measurement uncertainty plays an important role. This paper aims to outline how the methods for evaluating and propagating uncertainty can be applied to some specific cases with a wide impact: deriving reference data from vapour pressure data, a flash calculation, and the use of an equation-of-state to predict the properties of both phases in a vapour-liquid equilibrium. The three uncertainty evaluations demonstrate that the methods of GUM and its supplements are a versatile toolbox that enable us to evaluate the measurement uncertainty of physical chemical measurements, including the derivation of reference data, such as the equilibrium thermodynamical properties of fluids.
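
    As an illustration of the supplement-1 style propagation for a non-linear vapour-pressure model, a Monte Carlo sketch built on the Antoine equation log10(p) = A - B/(C + T); the coefficients and their standard uncertainties below are invented for illustration only.

      import numpy as np

      rng = np.random.default_rng(1)
      n = 200_000

      # GUM Supplement 1 style: draw inputs from their assigned distributions
      # and push them through the (non-linear) model.
      A = rng.normal(7.0, 0.02, n)
      B = rng.normal(1650.0, 5.0, n)
      C = rng.normal(230.0, 1.0, n)
      T = rng.normal(25.0, 0.05, n)   # temperature (degC) with its own uncertainty

      p = 10.0 ** (A - B / (C + T))   # vapour pressure, kPa

      mean = p.mean()
      u = p.std(ddof=1)
      lo, hi = np.percentile(p, [2.5, 97.5])   # 95% coverage interval
      print(f"p = {mean:.3f} kPa, u(p) = {u:.3f} kPa, "
            f"95% interval [{lo:.3f}, {hi:.3f}]")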

  18. Lifelong Learning to Empowerment: Beyond Formal Education

    Science.gov (United States)

    Carr, Alexis; Balasubramanian, K.; Atieno, Rosemary; Onyango, James

    2018-01-01

    This paper discusses the relevance of lifelong learning vis-à-vis the Sustainable Development Goals (SDGs) and stresses the need for an approach blending formal education, non-formal and informal learning. The role of Open and Distance Learning (ODL) in moving beyond formal education and the importance of integrating pedagogy, andragogy and…

  19. Incorporating outcome uncertainty and prior outcome beliefs in stated preferences

    DEFF Research Database (Denmark)

    Lundhede, Thomas; Jacobsen, Jette Bredahl; Hanley, Nick

    2015-01-01

    Stated preference studies tell respondents that policies create environmental changes with varying levels of uncertainty. However, respondents may include their own a priori assessments of uncertainty when making choices among policy options. Using a choice experiment eliciting respondents......’ preferences for conservation policies under climate change, we find that higher outcome uncertainty reduces utility. When accounting for endogeneity, we find that prior beliefs play a significant role in this cost of uncertainty. Thus, merely stating “objective” levels of outcome uncertainty...

  20. $\delta N$ formalism from superpotential and holography

    CERN Document Server

    Garriga, Jaume; Vernizzi, Filippo

    2016-02-16

    We consider the superpotential formalism to describe the evolution of scalar fields during inflation, generalizing it to include the case with non-canonical kinetic terms. We provide a characterization of the attractor behaviour of the background evolution in terms of first and second slow-roll parameters (which need not be small). We find that the superpotential is useful in justifying the separate universe approximation from the gradient expansion, and also in computing the spectra of primordial perturbations around attractor solutions in the $\delta N$ formalism. As an application, we consider a class of models where the background trajectories for the inflaton fields are derived from a product separable superpotential. In the perspective of the holographic inflation scenario, such models are dual to a deformed CFT boundary theory, with $D$ mutually uncorrelated deformation operators. We compute the bulk power spectra of primordial adiabatic and entropy cosmological perturbations, and show that the results...

  1. Formal Symplectic Groupoid of a Deformation Quantization

    Science.gov (United States)

    Karabegov, Alexander V.

    2005-08-01

    We give a self-contained algebraic description of a formal symplectic groupoid over a Poisson manifold M. To each natural star product on M we then associate a canonical formal symplectic groupoid over M. Finally, we construct a unique formal symplectic groupoid ‘with separation of variables’ over an arbitrary Kähler-Poisson manifold.

  2. Uncertainty in techno-economic estimates of cellulosic ethanol production due to experimental measurement uncertainty

    Directory of Open Access Journals (Sweden)

    Vicari Kristin J

    2012-04-01

    Full Text Available Background: Cost-effective production of lignocellulosic biofuels remains a major financial and technical challenge at the industrial scale. A critical tool in biofuels process development is the techno-economic (TE) model, which calculates biofuel production costs using a process model and an economic model. The process model solves mass and energy balances for each unit, and the economic model estimates capital and operating costs from the process model based on economic assumptions. The process model inputs include experimental data on the feedstock composition and intermediate product yields for each unit. These experimental yield data are calculated from primary measurements. Uncertainty in these primary measurements is propagated to the calculated yields, to the process model, and ultimately to the economic model. Thus, outputs of the TE model have a minimum uncertainty associated with the uncertainty in the primary measurements. Results: We calculate the uncertainty in the Minimum Ethanol Selling Price (MESP) estimate for lignocellulosic ethanol production via a biochemical conversion process: dilute sulfuric acid pretreatment of corn stover followed by enzymatic hydrolysis and co-fermentation of the resulting sugars to ethanol. We perform a sensitivity analysis on the TE model and identify the feedstock composition and conversion yields from three unit operations (xylose from pretreatment, glucose from enzymatic hydrolysis, and ethanol from fermentation) as the most important variables. The uncertainty in the pretreatment xylose yield arises from multiple measurements, whereas the glucose and ethanol yields from enzymatic hydrolysis and fermentation, respectively, are dominated by a single measurement: the fraction of insoluble solids (fIS) in the biomass slurries. Conclusions: We calculate a $0.15/gal uncertainty in MESP from the TE model due to uncertainties in primary measurements. This result sets a lower bound on the error bars of

  3. UNCERTAINTIES IN GALACTIC CHEMICAL EVOLUTION MODELS

    International Nuclear Information System (INIS)

    Côté, Benoit; Ritter, Christian; Herwig, Falk; O’Shea, Brian W.; Pignatari, Marco; Jones, Samuel; Fryer, Chris L.

    2016-01-01

    We use a simple one-zone galactic chemical evolution model to quantify the uncertainties generated by the input parameters in numerical predictions for a galaxy with properties similar to those of the Milky Way. We compiled several studies from the literature to gather the current constraints for our simulations regarding the typical value and uncertainty of the following seven basic parameters: the lower and upper mass limits of the stellar initial mass function (IMF), the slope of the high-mass end of the stellar IMF, the slope of the delay-time distribution function of Type Ia supernovae (SNe Ia), the number of SNe Ia per M ⊙ formed, the total stellar mass formed, and the final mass of gas. We derived a probability distribution function to express the range of likely values for every parameter, which were then included in a Monte Carlo code to run several hundred simulations with randomly selected input parameters. This approach enables us to analyze the predicted chemical evolution of 16 elements in a statistical manner by identifying the most probable solutions, along with their 68% and 95% confidence levels. Our results show that the overall uncertainties are shaped by several input parameters that individually contribute at different metallicities, and thus at different galactic ages. The level of uncertainty then depends on the metallicity and is different from one element to another. Among the seven input parameters considered in this work, the slope of the IMF and the number of SNe Ia are currently the two main sources of uncertainty. The thicknesses of the uncertainty bands bounded by the 68% and 95% confidence levels are generally within 0.3 and 0.6 dex, respectively. When looking at the evolution of individual elements as a function of galactic age instead of metallicity, those same thicknesses range from 0.1 to 0.6 dex for the 68% confidence levels and from 0.3 to 1.0 dex for the 95% confidence levels. The uncertainty in our chemical evolution model

  4. A tool for efficient, model-independent management optimization under uncertainty

    Science.gov (United States)

    White, Jeremy; Fienen, Michael N.; Barlow, Paul M.; Welter, Dave E.

    2018-01-01

    To fill a need for risk-based environmental management optimization, we have developed PESTPP-OPT, a model-independent tool for resource management optimization under uncertainty. PESTPP-OPT solves a sequential linear programming (SLP) problem and also implements (optional) efficient, “on-the-fly” (without user intervention) first-order, second-moment (FOSM) uncertainty techniques to estimate model-derived constraint uncertainty. Combined with a user-specified risk value, the constraint uncertainty estimates are used to form chance-constraints for the SLP solution process, so that any optimal solution includes contributions from model input and observation uncertainty. In this way, a “single answer” that includes uncertainty is yielded from the modeling analysis. PESTPP-OPT uses the familiar PEST/PEST++ model interface protocols, which makes it widely applicable to many modeling analyses. The use of PESTPP-OPT is demonstrated with a synthetic, integrated surface-water/groundwater model. The function and implications of chance constraints for this synthetic model are discussed.
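
    A minimal sketch of how a FOSM-based chance constraint works; the names and numbers are invented, not PESTPP-OPT's actual implementation. The simulated constraint value is shifted by a normal quantile scaled by its estimated standard deviation before being compared with the limit.

      from scipy.stats import norm

      depletion_sim = 8.0     # simulated constraint value (L/s), hypothetical
      sigma_fosm = 1.2        # FOSM standard deviation of that value (L/s)
      limit = 10.0            # regulatory limit (L/s)
      risk = 0.05             # accepted probability of violating the limit

      # Shift the constraint by a one-sided normal quantile so that it holds
      # with probability (1 - risk) under the estimated uncertainty.
      z = norm.ppf(1.0 - risk)
      depletion_chance = depletion_sim + z * sigma_fosm
      print(f"chance-constrained value: {depletion_chance:.2f} L/s "
            f"(feasible: {depletion_chance <= limit})")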

  5. Conclusions on measurement uncertainty in microbiology.

    Science.gov (United States)

    Forster, Lynne I

    2009-01-01

    Since its first issue in 1999, testing laboratories wishing to comply with all the requirements of ISO/IEC 17025 have been collecting data for estimating uncertainty of measurement for quantitative determinations. In the microbiological field of testing, some debate has arisen as to whether uncertainty needs to be estimated for each method performed in the laboratory for each type of sample matrix tested. Queries also arise concerning the estimation of uncertainty when plate/membrane filter colony counts are below recommended method counting range limits. A selection of water samples (with low to high contamination) was tested in replicate, with the associated uncertainty of measurement being estimated from the analytical results obtained. The analyses performed on the water samples included total coliforms, fecal coliforms, fecal streptococci by membrane filtration, and heterotrophic plate counts by the pour plate technique. For those samples where plate/membrane filter colony counts were ≥20, uncertainty estimates at a 95% confidence level were very similar for the methods, being estimated as 0.13, 0.14, 0.14, and 0.12, respectively. For those samples where plate/membrane filter colony counts were <20, estimated uncertainty values for each sample showed close agreement with published confidence limits established using a Poisson distribution approach.

  6. Analysis of Formal Methods for Specification of E-Commerce Applications

    Directory of Open Access Journals (Sweden)

    Sadiq Ali Khan

    2016-01-01

    Full Text Available E-commerce applications are highly dynamic and decentralized in nature, which places strong demands on their structural design and implementation. The research literature indicates that applying formal methods to the challenges inherent in e-commerce applications contributes to the reliability and robustness of the resulting systems. Careful specification and concurrent implementation of e-processes make application behavior more resistant to errors, fraud, and hacking, and minimize program faults during operation. Programmers find it extremely difficult, though not impossible, to guarantee correct processing under all circumstances: concealed flaws and errors, triggered only under unexpected and unanticipated scenarios, lead to subtle mistakes and catastrophic failures. Developers use various formal methods to reduce these flaws; prominent examples include ASM (Abstract State Machines), the B-Method, the Z notation, and UML (Unified Modelling Language). This paper focuses on the different formal methods applied to the cost-effective specification and verification of e-commerce applications.

  7. Mitigating Provider Uncertainty in Service Provision Contracts

    Science.gov (United States)

    Smith, Chris; van Moorsel, Aad

    Uncertainty is an inherent property of open, distributed and multiparty systems. The viability of the mutually beneficial relationships which motivate these systems relies on rational decision-making by each constituent party under uncertainty. Service provision in distributed systems is one such relationship. Uncertainty is experienced by the service provider in his ability to deliver a service with selected quality level guarantees due to inherent non-determinism, such as load fluctuations and hardware failures. Statistical estimators utilized to model this non-determinism introduce additional uncertainty through sampling error. Inability of the provider to accurately model and analyze uncertainty in the quality level guarantees can result in the formation of sub-optimal service provision contracts. Emblematic consequences include loss of revenue, inefficient resource utilization and erosion of reputation and consumer trust. We propose a utility model for contract-based service provision to provide a systematic approach to optimal service provision contract formation under uncertainty. Performance prediction methods to enable the derivation of statistical estimators for quality level are introduced, with analysis of their resultant accuracy and cost.

  8. multiPDEVS: A Parallel Multicomponent System Specification Formalism

    Directory of Open Access Journals (Sweden)

    Damien Foures

    2018-01-01

    Full Text Available Based on the multiDEVS formalism, we introduce multiPDEVS, a parallel and nonmodular formalism for discrete event system specification. This formalism combines the advantages of the PDEVS and multiDEVS approaches, such as excellent simulation capabilities for simultaneously scheduled events and components able to influence each other using exclusively their state transitions. We next show the soundness of the formalism by giving a construction showing that any multiPDEVS model is equivalent to a PDEVS atomic model. We then present the associated simulation procedure, usually called an abstract simulator. Since the formalism is well adapted to expressing cellular automata, we finally compare an implementation of the multiPDEVS formalism with a more classical Cell-DEVS implementation through a fire-spread application.

  9. Modeling of uncertainties in biochemical reactions.

    Science.gov (United States)

    Mišković, Ljubiša; Hatzimanikatis, Vassily

    2011-02-01

    Mathematical modeling is an indispensable tool for research and development in biotechnology and bioengineering. The formulation of kinetic models of biochemical networks depends on knowledge of the kinetic properties of the enzymes of the individual reactions. However, kinetic data acquired from experimental observations bring along uncertainties due to various experimental conditions and measurement methods. In this contribution, we propose a novel way to model the uncertainty in the enzyme kinetics and to predict quantitatively the responses of metabolic reactions to changes in enzyme activities under uncertainty. The proposed methodology accounts explicitly for mechanistic properties of enzymes and physico-chemical and thermodynamic constraints, and is based on formalism from systems theory and metabolic control analysis. We achieve this by observing that kinetic responses of metabolic reactions depend: (i) on the distribution of the enzymes among their free form and all reactive states; (ii) on the equilibrium displacements of the overall reaction and of the individual enzymatic steps; and (iii) on the net fluxes through the enzyme. Relying on this observation, we develop a novel, efficient Monte Carlo sampling procedure to generate all states within a metabolic reaction that satisfy the imposed constraints. Thus, we derive the statistics of the expected responses of the metabolic reactions to changes in enzyme levels and activities, in the levels of metabolites, and in the values of the kinetic parameters. We present aspects of the proposed framework through an example of the fundamental three-step reversible enzymatic reaction mechanism. We demonstrate that the equilibrium displacements of the individual enzymatic steps have an important influence on the kinetic responses of the enzyme. Furthermore, we derive the conditions that must be satisfied by a reversible three-step enzymatic reaction operating far away from the equilibrium in order to respond to

  10. Representation of analysis results involving aleatory and epistemic uncertainty.

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, Jay Dean (ProStat, Mesa, AZ); Helton, Jon Craig (Arizona State University, Tempe, AZ); Oberkampf, William Louis; Sallaberry, Cedric J.

    2008-08-01

    Procedures are described for the representation of results in analyses that involve both aleatory uncertainty and epistemic uncertainty, with aleatory uncertainty deriving from an inherent randomness in the behavior of the system under study and epistemic uncertainty deriving from a lack of knowledge about the appropriate values to use for quantities that are assumed to have fixed but poorly known values in the context of a specific study. Aleatory uncertainty is usually represented with probability and leads to cumulative distribution functions (CDFs) or complementary cumulative distribution functions (CCDFs) for analysis results of interest. Several mathematical structures are available for the representation of epistemic uncertainty, including interval analysis, possibility theory, evidence theory and probability theory. In the presence of epistemic uncertainty, there is not a single CDF or CCDF for a given analysis result. Rather, there is a family of CDFs and a corresponding family of CCDFs that derive from epistemic uncertainty and have an uncertainty structure that derives from the particular uncertainty structure (i.e., interval analysis, possibility theory, evidence theory, probability theory) used to represent epistemic uncertainty. Graphical formats for the representation of epistemic uncertainty in families of CDFs and CCDFs are investigated and presented for the indicated characterizations of epistemic uncertainty.
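
    The two-level structure described above (epistemic uncertainty producing a family of aleatory CDFs) is easy to sketch numerically. The distributions and parameters below are invented placeholders; the point is only the nesting of epistemic sampling around aleatory sampling.

```python
import numpy as np

rng = np.random.default_rng(1)

# Epistemic: a fixed but poorly known scale parameter theta (here sampled
# uniformly over its plausible interval). Aleatory: randomness of the system
# response given theta. Each epistemic draw yields one empirical CDF.
thetas = rng.uniform(0.5, 1.5, size=20)
grid = np.linspace(0.0, 10.0, 200)

family = np.empty((thetas.size, grid.size))
for j, theta in enumerate(thetas):
    samples = rng.exponential(scale=theta, size=1000)     # aleatory sampling
    family[j] = (samples[:, None] <= grid).mean(axis=0)   # empirical CDF

print("CDF envelope at x = 2:", family[:, 40].min(), family[:, 40].max())
```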

  11. Heisenberg's principle of uncertainty and the uncertainty relations

    International Nuclear Information System (INIS)

    Redei, Miklos

    1987-01-01

    The usual verbal form of the Heisenberg uncertainty principle and the usual mathematical formulation (the so-called uncertainty theorem) are not equivalent. The meaning of the concept 'uncertainty' is not unambiguous, and different interpretations are used in the literature. Recently, renewed interest has emerged in reinterpreting and reformulating the precise meaning of Heisenberg's principle and in finding an adequate mathematical form for it. The suggested new theorems are surveyed and critically analyzed. (D.Gy.) 20 refs
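
    For reference, the mathematical formulation usually meant by the "uncertainty theorem" is the Robertson inequality, of which the position-momentum relation is the special case; this is standard textbook material added here for context, not part of the record above:

```latex
\sigma_A \,\sigma_B \;\ge\; \tfrac{1}{2}\,\bigl|\langle [\hat A, \hat B] \rangle\bigr|,
\qquad \text{e.g.} \qquad
\sigma_x \,\sigma_p \;\ge\; \frac{\hbar}{2}.
```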

  12. Measurement Uncertainty

    Science.gov (United States)

    Koch, Michael

    Measurement uncertainty is one of the key issues in quality assurance. It became increasingly important for analytical chemistry laboratories with accreditation to ISO/IEC 17025. The uncertainty of a measurement is the most important criterion for deciding whether a measurement result is fit for purpose. It also helps in deciding whether a specification limit is exceeded or not. Estimation of measurement uncertainty often is not trivial. Several strategies have been developed for this purpose and are described briefly in this chapter. In addition, the different possibilities for taking uncertainty into account in compliance assessment are explained.
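
    One common way to use uncertainty in compliance assessment is a guard-banded decision rule: a result is called compliant or non-compliant only when its whole uncertainty interval lies on one side of the limit. A minimal sketch, with invented numbers; the chapter itself may describe other rules.

```python
def compliance_decision(result, expanded_u, limit):
    """Guard-banded compliance call using the expanded uncertainty (k = 2).

    Returns 'compliant' or 'non-compliant' when the whole uncertainty
    interval lies on one side of the limit, else 'inconclusive'.
    """
    if result + expanded_u < limit:
        return "compliant"
    if result - expanded_u > limit:
        return "non-compliant"
    return "inconclusive"

# Example: a result of 48 with U = 3 cannot be judged against a limit of 50.
print(compliance_decision(result=48.0, expanded_u=3.0, limit=50.0))
```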

  13. A matricial approach for the Dirac-Kahler formalism

    International Nuclear Information System (INIS)

    Goto, M.

    1987-01-01

    A matricial approach to the Dirac-Kahler formalism is considered. It is shown that the matricial approach (i) brings a great computational simplification compared to the common use of differential forms, and (ii) by an appropriate choice of notation, can be extended to the lattice, including a matrix Dirac-Kahler equation. (author)

  14. The role of formal specifications

    International Nuclear Information System (INIS)

    McHugh, J.

    1994-01-01

    The role of formal requirements specification is discussed under the premise that the primary purpose of such specifications is to facilitate clear and unambiguous communications among the communities of interest for a given project. An example is presented in which the failure to reach such an understanding resulted in an accident at a chemical plant. Following the example, specification languages based on logical formalisms and notations are considered. These are rejected as failing to serve the communications needs of diverse communities. The notion of a specification as a surrogate for a program is also considered and rejected. The paper ends with a discussion of the type of formal notation that will serve the communications role and several encouraging developments are noted

  15. Skinner-Rusk unified formalism for optimal control systems and applications

    International Nuclear Information System (INIS)

    Barbero-Linan, MarIa; EcheverrIa-EnrIquez, Arturo; Diego, David MartIn de; Munoz-Lecanda, Miguel C; Roman-Roy, Narciso

    2007-01-01

    A geometric approach to time-dependent optimal control problems is proposed. This formulation is based on the Skinner and Rusk formalism for Lagrangian and Hamiltonian systems. The corresponding unified formalism developed for optimal control systems allows us to formulate geometrically the necessary conditions given by a weak form of Pontryagin's maximum principle, provided that the differentiability with respect to controls is assumed and the space of controls is open. Furthermore, our method is also valid for implicit optimal control systems and, in particular, for the so-called descriptor systems (optimal control problems including both differential and algebraic equations)

  16. Bridging In-school and Out-of-school Learning: Formal, Non-Formal, and Informal Education

    Science.gov (United States)

    Eshach, Haim

    2007-04-01

    The present paper thoroughly examines how one can effectively bridge in-school and out-of-school learning. The first part discusses the difficulty in defining out-of-school learning. It proposes to distinguish three types of learning: formal, informal, and non-formal. The second part raises the question of whether out-of-school learning should be dealt with in the in-school system, in view of the fact that we experience informal learning anyway as well as considering the disadvantages and difficulties teachers are confronted with when planning and carrying out scientific fieldtrips. The voices of the teachers, the students, and the non-formal institution staff are heard to provide insights into the problem. The third part discusses the cognitive and affective aspects of non-formal learning. The fourth part presents some models explaining scientific fieldtrip learning and based on those models, suggests a novel explanation. The fifth part offers some recommendations of how to bridge in and out-of-school learning. The paper closes with some practical ideas as to how one can bring the theory described in the paper into practice. It is hoped that this paper will provide educators with an insight so that they will be able to fully exploit the great potential that scientific field trips may offer.

  17. Bayesian uncertainty analysis for complex systems biology models: emulation, global parameter searches and evaluation of gene functions.

    Science.gov (United States)

    Vernon, Ian; Liu, Junli; Goldstein, Michael; Rowe, James; Topping, Jen; Lindsey, Keith

    2018-01-02

    Many mathematical models have now been employed across every area of systems biology. These models increasingly involve large numbers of unknown parameters, have complex structure which can result in substantial evaluation time relative to the needs of the analysis, and need to be compared to observed data of various forms. The correct analysis of such models usually requires a global parameter search, over a high dimensional parameter space, that incorporates and respects the most important sources of uncertainty. This can be an extremely difficult task, but it is essential for any meaningful inference or prediction to be made about any biological system. It hence represents a fundamental challenge for the whole of systems biology. Bayesian statistical methodology for the uncertainty analysis of complex models is introduced, which is designed to address the high dimensional global parameter search problem. Bayesian emulators that mimic the systems biology model but which are extremely fast to evaluate are embedded within an iterative history match: an efficient method to search high dimensional spaces within a more formal statistical setting, while incorporating major sources of uncertainty. The approach is demonstrated via application to a model of hormonal crosstalk in Arabidopsis root development, which has 32 rate parameters, for which we identify the sets of rate parameter values that lead to acceptable matches between model output and observed trend data. The multiple insights into the model's structure that this analysis provides are discussed. The methodology is applied to a second related model, and the biological consequences of the resulting comparison, including the evaluation of gene functions, are described. Bayesian uncertainty analysis for complex models using both emulators and history matching is shown to be a powerful technique that can greatly aid the study of a large class of systems biology models. It both provides insight into model behaviour
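
    History matching typically rules out parameter settings via an implausibility measure: the standardized distance between the observation and the emulator prediction, accounting for all quantified variances. A minimal sketch with invented numbers; the variance decomposition follows the usual emulator/observation-error/model-discrepancy pattern rather than the paper's exact code.

```python
import math

def implausibility(z_obs, emu_mean, emu_var, obs_var, disc_var):
    """History-matching implausibility: distance between observation and
    emulator prediction, standardized by all quantified variances
    (emulator, observation error, model discrepancy)."""
    return abs(z_obs - emu_mean) / math.sqrt(emu_var + obs_var + disc_var)

I = implausibility(z_obs=1.2, emu_mean=0.7, emu_var=0.04, obs_var=0.01, disc_var=0.02)
print(f"I = {I:.2f}; a conventional cutoff such as I < 3 keeps this input")
```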

  18. The impact of inflation uncertainty on interest rates

    OpenAIRE

    Cheong, Chongcheul; Kim, Gi-Hong; Podivinsky, Jan M.

    2010-01-01

    In this paper, the impact of inflation uncertainty on interest rates is investigated for the case of the U.S. three-month Treasury bill rate. We emphasize how consistent OLS estimation can be applied to an empirical equation which includes a proxy variable of inflation uncertainty measured by an ARCH-type model. A significant negative relationship between the two variables is provided. This evidence is contrasted with the view of the inflation risk premium in which inflation uncertainty positi...

  19. Toward a formal ontology for narrative

    Directory of Open Access Journals (Sweden)

    Ciotti, Fabio

    2016-03-01

    Full Text Available In this paper the rationale and the first draft of a formal ontology for modeling narrative texts are presented. Building on the semiotic and structuralist narratology, and on the work carried out in the late 1980s by Giuseppe Gigliozzi in Italy, the focus of my research are the concepts of character and of narrative world/space. This formal model is expressed in the OWL 2 ontology language. The main reason to adopt a formal modeling approach is that I consider the purely probabilistic-quantitative methods (now widespread in digital literary studies inadequate. An ontology, on one hand provides a tool for the analysis of strictly literary texts. On the other hand (though beyond the scope of the present work, its formalization can also represent a significant contribution towards grounding the application of storytelling methods outside of scholarly contexts.

  20. Industrial Practice in Formal Methods : A Review

    DEFF Research Database (Denmark)

    Bicarregui, Juan C.; Fitzgerald, John; Larsen, Peter Gorm

    2009-01-01

    We examine the industrial application of formal methods using data gathered in a review of 62 projects taking place over the last 25 years. The review suggests that formal methods are being applied in a wide range of application domains, with increasingly strong tool support. Significant challenges remain in providing usable tools that can be integrated into established development processes; in education and training; in taking formal methods from first use to second use; and in gathering evidence to support informed selection of methods and tools.

  1. External Service Providers to the National Security Technology Incubator: Formalization of Relationships

    Energy Technology Data Exchange (ETDEWEB)

    None

    2008-04-30

    This report documents the formalization of relationships with external service providers in the development of the National Security Technology Incubator (NSTI). The technology incubator is being developed as part of the National Security Preparedness Project (NSPP), funded by a Department of Energy (DOE)/National Nuclear Security Administration (NNSA) grant. This report summarizes the process in developing and formalizing relationships with those service providers and includes a sample letter of cooperation executed with each provider.

  2. Durability reliability analysis for corroding concrete structures under uncertainty

    Science.gov (United States)

    Zhang, Hao

    2018-02-01

    This paper presents a durability reliability analysis of reinforced concrete structures subject to the action of marine chloride. The focus is to provide insight into the role of epistemic uncertainties on durability reliability. The corrosion model involves a number of variables whose probabilistic characteristics cannot be fully determined due to the limited availability of supporting data. All sources of uncertainty, both aleatory and epistemic, should be included in the reliability analysis. Two methods are available to formulate the epistemic uncertainty: the imprecise probability-based method and the purely probabilistic method in which the epistemic uncertainties are modeled as random variables. The paper illustrates how the epistemic uncertainties are modeled and propagated in the two methods, and shows how epistemic uncertainties govern the durability reliability.

  3. Perceptions of registered nurses in four state health insititutions on continuing formal education.

    Science.gov (United States)

    Richards, L; Potgieter, E

    2010-06-01

    This study investigated the perceptions of registered nurses in four selected state health institutions with regard to continuing formal education. The relevance of continuing formal education is emphasised globally by the increasing quest for quality assurance and quality management systems within an ethos of continuous improvement. According to Tlholoe (2006:5), it is important to be committed to continual learning, as people's knowledge becomes less relevant over time and skills gained early in a career are insufficient to avoid costly mistakes made through ignorance. Continuing formal education in nursing is a key element in the maintenance of quality in health care delivery. The study described registered nurses' views on continuing formal education and their perceived barriers to continuing formal education. A quantitative descriptive survey design was chosen, using a questionnaire for data collection. The sample consisted of 40 registered nurses working at four state health institutions in the Western Cape Province, South Africa. Convenience sampling was selected to include registered nurses who were on duty on the days during which the researcher visited the health institutions to distribute the questionnaires. The questionnaire contained mainly closed-ended and a few open-ended questions. Content validity of the instrument was ensured by a thorough literature review before construction of items and by a pretest. Reliability was established by the pretest and by providing the same information to all respondents before completion of the questionnaires. The ethical considerations of informed consent, anonymity and confidentiality were adhered to, and consent to conduct the study was obtained from the relevant authorities. Descriptive statistics, based on calculations using the Microsoft (MS) Excel (for Windows 2000) programme, were used to summarise and describe the research results. The research results indicated that most registered nurses perceive continuing

  4. Formal verification - Robust and efficient code: Introduction to Formal Verification

    CERN Multimedia

    CERN. Geneva

    2016-01-01

    In general, FV means "proving that certain properties hold for a given system using formal mathematics". This definition can certainly feel daunting, however, as we will learn, we can reap benefits from the paradigm without digging too deep into ...

  5. Formalization of the Resolution Calculus for First-Order Logic

    DEFF Research Database (Denmark)

    Schlichtkrull, Anders

    2016-01-01

    A formalization in Isabelle/HOL of the resolution calculus for first-order logic is presented. Its soundness and completeness are formally proven using the substitution lemma, semantic trees, Herbrand’s theorem, and the lifting lemma. In contrast to previous formalizations of resolution, it consi...

  6. Uncertainty principle for angular position and angular momentum

    International Nuclear Information System (INIS)

    Franke-Arnold, Sonja; Barnett, Stephen M; Yao, Eric; Leach, Jonathan; Courtial, Johannes; Padgett, Miles

    2004-01-01

    The uncertainty principle places fundamental limits on the accuracy with which we are able to measure the values of different physical quantities (Heisenberg 1949 The Physical Principles of the Quantum Theory (New York: Dover); Robertson 1929 Phys. Rev. 34 127). This has profound effects not only on the microscopic but also on the macroscopic level of physical systems. The most familiar form of the uncertainty principle relates the uncertainties in position and linear momentum. Other manifestations include those relating uncertainty in energy to uncertainty in time duration, phase of an electromagnetic field to photon number and angular position to angular momentum (Vaccaro and Pegg 1990 J. Mod. Opt. 37 17; Barnett and Pegg 1990 Phys. Rev. A 41 3427). In this paper, we report the first observation of the last of these uncertainty relations and derive the associated states that satisfy the equality in the uncertainty relation. We confirm the form of these states by detailed measurement of the angular momentum of a light beam after passage through an appropriate angular aperture. The angular uncertainty principle applies to all physical systems and is particularly important for systems with cylindrical symmetry

  7. Incorporating parametric uncertainty into population viability analysis models

    Science.gov (United States)

    McGowan, Conor P.; Runge, Michael C.; Larson, Michael A.

    2011-01-01

    Uncertainty in parameter estimates from sampling variation or expert judgment can introduce substantial uncertainty into ecological predictions based on those estimates. However, in standard population viability analyses, one of the most widely used tools for managing plant, fish and wildlife populations, parametric uncertainty is often ignored in or discarded from model projections. We present a method for explicitly incorporating this source of uncertainty into population models to fully account for risk in management and decision contexts. Our method involves a two-step simulation process where parametric uncertainty is incorporated into the replication loop of the model and temporal variance is incorporated into the loop for time steps in the model. Using the piping plover, a federally threatened shorebird in the USA and Canada, as an example, we compare abundance projections and extinction probabilities from simulations that exclude and include parametric uncertainty. Although final abundance was very low for all sets of simulations, estimated extinction risk was much greater for the simulation that incorporated parametric uncertainty in the replication loop. Decisions about species conservation (e.g., listing, delisting, and jeopardy) might differ greatly depending on the treatment of parametric uncertainty in population models.
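
    The two-step simulation structure described above is simple to sketch: a vital rate is drawn once per replicate (parametric uncertainty), and temporal variance is drawn anew at every time step inside the replicate. All species-specific numbers below are invented; this is not the piping plover model.

```python
import numpy as np

rng = np.random.default_rng(7)
n_reps, n_years = 1000, 50

extinct = 0
for _ in range(n_reps):                  # replication loop: parametric uncertainty
    growth = rng.normal(0.98, 0.03)      # vital rate drawn once per replicate
    n = 100.0
    for _ in range(n_years):             # time-step loop: temporal variance
        n *= growth * np.exp(rng.normal(0.0, 0.10))
        if n < 2.0:                      # quasi-extinction threshold (assumed)
            extinct += 1
            break

print(f"estimated quasi-extinction risk: {extinct / n_reps:.3f}")
```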

  8. Uncertainty Analysis of Light Water Reactor Fuel Lattices

    Directory of Open Access Journals (Sweden)

    C. Arenas

    2013-01-01

    Full Text Available The study explored the calculation of uncertainty based on available cross-section covariance data and computational tools at the fuel lattice level, which included pin cell and fuel assembly models. Uncertainty variations due to temperature changes and different fuel compositions are the main focus of this analysis. Selected assemblies and unit pin cells were analyzed according to the OECD LWR UAM benchmark specifications. Criticality and uncertainty analyses were performed using the TSUNAMI-2D sequence in SCALE 6.1. It was found that uncertainties increase with increasing temperature, while k∞ decreases. This increase in the uncertainty is due to the increase in sensitivity of the largest contributing reaction, namely, the neutron capture reaction 238U(n,γ), due to Doppler broadening. In addition, three types of fuel material composition (UOX, MOX, and UOX-Gd2O3) were analyzed. A remarkable increase in uncertainty in k∞ was observed for the case of MOX fuel; the increase in the uncertainty of k∞ in MOX fuel was nearly twice the corresponding value in UOX fuel. The neutron-nuclide reactions of 238U, mainly inelastic scattering (n,n′), contributed the most to the uncertainties in the MOX fuel, shifting the neutron spectrum to higher energy compared to the UOX fuel.

  9. Sensitivity and uncertainty analyses applied to criticality safety validation, methods development. Volume 1

    International Nuclear Information System (INIS)

    Broadhead, B.L.; Hopper, C.M.; Childs, R.L.; Parks, C.V.

    1999-01-01

    This report presents the application of sensitivity and uncertainty (S/U) analysis methodologies to the code/data validation tasks of a criticality safety computational study. Sensitivity and uncertainty analysis methods were first developed for application to fast reactor studies in the 1970s. This work has revitalized and updated the available S/U computational capabilities such that they can be used as prototypic modules of the SCALE code system, which contains criticality analysis tools currently used by criticality safety practitioners. After complete development, simplified tools are expected to be released for general use. The S/U methods that are presented in this volume are designed to provide a formal means of establishing the range (or area) of applicability for criticality safety data validation studies. The development of parameters that are analogous to the standard trending parameters forms the key to the technique. These parameters are the D parameters, which represent the differences by group of sensitivity profiles, and the c_k parameters, which are the correlation coefficients for the calculational uncertainties between systems; each set of parameters gives information relative to the similarity between pairs of selected systems, e.g., a critical experiment and a specific real-world system (the application).
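
    For orientation, sensitivity-based uncertainty analysis of this kind typically rests on the first-order "sandwich" rule and on a correlation coefficient between two systems a and b; in the hedged form below, S is the vector of sensitivity coefficients of the response to the nuclear data and C the nuclear-data covariance matrix. This is the standard textbook formulation, not a quotation from the report:

```latex
\sigma_k^2 \;=\; S^{\mathsf{T}} C \, S,
\qquad
c_k \;=\; \frac{S_a^{\mathsf{T}} C \, S_b}{\sigma_a \, \sigma_b},
```

    so that c_k near 1 indicates that the two systems share the same dominant sources of nuclear-data uncertainty.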

  10. Formal Development of the HRP Prover - Part 1: Syntax and Semantics

    International Nuclear Information System (INIS)

    Sivertsen, Terje

    1996-01-01

    This report describes the formal development of a new version of the HRP Prover. The new version of the tool will have functionality almost identical to the current version, but is developed in accordance with established principles for applying algebraic specification in formal software development. The development project provides results of relevance to the formal development of a wide range of language-oriented tools, including programming language compilers, as well as to the automatic generation of code from specifications. Since the overall scope of this report is the analysis of algebraic specifications, emphasis is given to topics related to what is usually understood as the ''front end'' of compilers. This includes lexical and syntax analysis of the specifications, static semantics through type checking, and dynamic semantics through evaluation. All the different phases are specified in algebraic specification and supported by the current version of the HRP Prover. In subsequent work, the completed parts of the new version will complement the tool support in the development. The work presented will be followed up by formal specification of theorem proving and transformation, as well as code generation into conventional programming languages. The new version of the HRP Prover is developed incrementally, in coherence with the specifications produced in these activities. At the same time, the development of the tool demonstrates the efficient use of the methodology through real application on an increasingly important class of software. (author)

  11. Formalizing Evaluation in Music Information Retrieval

    DEFF Research Database (Denmark)

    Sturm, Bob L.

    2013-01-01

    We develop a formalism to disambiguate the evaluation of music information retrieval systems. We define a ``system,'' what it means to ``analyze'' one, and make clear the aims, parts, design, execution, interpretation, and assumptions of its ``evaluation.'' We apply this formalism to discuss...

  12. Risk, unexpected uncertainty, and estimation uncertainty: Bayesian learning in unstable settings.

    Directory of Open Access Journals (Sweden)

    Elise Payzan-LeNestour

    Full Text Available Recently, evidence has emerged that humans approach learning using Bayesian updating rather than (model-free) reinforcement algorithms in a six-arm restless bandit problem. Here, we investigate what this implies for human appreciation of uncertainty. In our task, a Bayesian learner distinguishes three equally salient levels of uncertainty. First, the Bayesian perceives irreducible uncertainty or risk: even knowing the payoff probabilities of a given arm, the outcome remains uncertain. Second, there is (parameter) estimation uncertainty or ambiguity: payoff probabilities are unknown and need to be estimated. Third, the outcome probabilities of the arms change: the sudden jumps are referred to as unexpected uncertainty. We document how the three levels of uncertainty evolved during the course of our experiment and how they affected the learning rate. We then zoom in on estimation uncertainty, which has been suggested to be a driving force in exploration, in spite of evidence of widespread aversion to ambiguity. Our data corroborate the latter. We discuss neural evidence that foreshadowed the ability of humans to distinguish between the three levels of uncertainty. Finally, we investigate the boundaries of human capacity to implement Bayesian learning. We repeat the experiment with different instructions, reflecting varying levels of structural uncertainty. Under this fourth notion of uncertainty, choices were no better explained by Bayesian updating than by (model-free) reinforcement learning. Exit questionnaires revealed that participants remained unaware of the presence of unexpected uncertainty and failed to acquire the right model with which to implement Bayesian updating.
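
    A minimal sketch of the estimation-uncertainty component for a single bandit arm, using conjugate Beta-Bernoulli updating: the posterior variance shrinks with evidence (ambiguity), while risk persists even with a known payoff probability. Handling unexpected uncertainty (the jumps) would additionally require something like discounting old evidence, which is omitted here; the outcome sequence is invented.

```python
# Conjugate Bayesian updating of one arm's payoff probability in a Bernoulli
# bandit: a Beta(a, b) prior updated after each pull.
a, b = 1.0, 1.0                               # uniform Beta(1, 1) prior
for outcome in [1, 0, 0, 1, 1, 0, 1, 1]:      # 1 = payoff, 0 = no payoff
    a += outcome
    b += 1 - outcome

mean = a / (a + b)
var = a * b / ((a + b) ** 2 * (a + b + 1.0))  # estimation uncertainty
print(f"posterior mean {mean:.2f}, posterior sd {var ** 0.5:.2f}")
```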

  13. Formal specification level concepts, methods, and algorithms

    CERN Document Server

    Soeken, Mathias

    2015-01-01

    This book introduces a new level of abstraction that closes the gap between the textual specification of embedded systems and the executable model at the Electronic System Level (ESL). Readers will be enabled to operate at this new, Formal Specification Level (FSL), using models which not only allow significant verification tasks in this early stage of the design flow, but also can be extracted semi-automatically from the textual specification in an interactive manner.  The authors explain how to use these verification tasks to check conceptual properties, e.g. whether requirements are in conflict, as well as dynamic behavior, in terms of execution traces. • Serves as a single-source reference to a new level of abstraction for embedded systems, known as the Formal Specification Level (FSL); • Provides a variety of use cases which can be adapted to readers’ specific design flows; • Includes a comprehensive illustration of Natural Language Processing (NLP) techniques, along with examples of how to i...

  14. A review of uncertainty research in impact assessment

    International Nuclear Information System (INIS)

    Leung, Wanda; Noble, Bram; Gunn, Jill; Jaeger, Jochen A.G.

    2015-01-01

    This paper examines uncertainty research in Impact Assessment (IA) and the focus of attention of the IA scholarly literature. We do so by first exploring ‘outside’ the IA literature, identifying three main themes of uncertainty research, and then apply these themes to examine the focus of scholarly research on uncertainty ‘inside’ IA. Based on a search of the database Scopus, we identified 134 journal papers published between 1970 and 2013 that address uncertainty in IA, 75% of which were published since 2005. We found that 90% of IA research addressing uncertainty focused on uncertainty in the practice of IA, including uncertainty in impact predictions, models and managing environmental impacts. Notwithstanding early guidance on uncertainty treatment in IA from the 1980s, we found no common, underlying conceptual framework that was guiding research on uncertainty in IA practice. Considerably less attention, only 9% of papers, focused on uncertainty communication, disclosure and decision-making under uncertain conditions, the majority of which focused on the need to disclose uncertainties as opposed to providing guidance on how to do so and effectively use that information to inform decisions. Finally, research focused on theory building for explaining human behavior with respect to uncertainty avoidance constituted only 1% of the IA published literature. We suggest the need for further conceptual framework development for researchers focused on identifying and addressing uncertainty in IA practice; the need for guidance on how best to communicate uncertainties in practice, versus criticizing practitioners for not doing so; research that explores how best to interpret and use disclosures about uncertainty when making decisions about project approvals, and the implications of doing so; and academic theory building and exploring the utility of existing theories to better understand and explain uncertainty avoidance behavior in IA. - Highlights: • We

  15. A review of uncertainty research in impact assessment

    Energy Technology Data Exchange (ETDEWEB)

    Leung, Wanda, E-mail: wanda.leung@usask.ca [Department of Geography and Planning, University of Saskatchewan, 117 Science Place, Saskatoon, Saskatchewan S7N 5A5 (Canada); Noble, Bram, E-mail: b.noble@usask.ca [Department of Geography and Planning, School of Environment and Sustainability, University of Saskatchewan, 117 Science Place, Saskatoon, Saskatchewan S7N 5A5 (Canada); Gunn, Jill, E-mail: jill.gunn@usask.ca [Department of Geography and Planning, University of Saskatchewan, 117 Science Place, Saskatoon, Saskatchewan S7N 5A5 (Canada); Jaeger, Jochen A.G., E-mail: jochen.jaeger@concordia.ca [Department of Geography, Planning and Environment, Concordia University, 1455 de Maisonneuve W., Suite 1255, Montreal, Quebec H3G 1M8 (Canada); Loyola Sustainability Research Centre, Concordia University, 7141 Sherbrooke W., AD-502, Montreal, Quebec H4B 1R6 (Canada)

    2015-01-15

    This paper examines uncertainty research in Impact Assessment (IA) and the focus of attention of the IA scholarly literature. We do so by first exploring ‘outside’ the IA literature, identifying three main themes of uncertainty research, and then apply these themes to examine the focus of scholarly research on uncertainty ‘inside’ IA. Based on a search of the database Scopus, we identified 134 journal papers published between 1970 and 2013 that address uncertainty in IA, 75% of which were published since 2005. We found that 90% of IA research addressing uncertainty focused on uncertainty in the practice of IA, including uncertainty in impact predictions, models and managing environmental impacts. Notwithstanding early guidance on uncertainty treatment in IA from the 1980s, we found no common, underlying conceptual framework that was guiding research on uncertainty in IA practice. Considerably less attention, only 9% of papers, focused on uncertainty communication, disclosure and decision-making under uncertain conditions, the majority of which focused on the need to disclose uncertainties as opposed to providing guidance on how to do so and effectively use that information to inform decisions. Finally, research focused on theory building for explaining human behavior with respect to uncertainty avoidance constituted only 1% of the IA published literature. We suggest the need for further conceptual framework development for researchers focused on identifying and addressing uncertainty in IA practice; the need for guidance on how best to communicate uncertainties in practice, versus criticizing practitioners for not doing so; research that explores how best to interpret and use disclosures about uncertainty when making decisions about project approvals, and the implications of doing so; and academic theory building and exploring the utility of existing theories to better understand and explain uncertainty avoidance behavior in IA. - Highlights: • We

  16. Meteorological uncertainty of atmospheric dispersion model results (MUD)

    International Nuclear Information System (INIS)

    Havskov Soerensen, J.; Amstrup, B.; Feddersen, H.

    2013-08-01

    The MUD project addresses assessment of uncertainties of atmospheric dispersion model predictions, as well as possibilities for optimum presentation to decision makers. Previously, it has not been possible to estimate such uncertainties quantitatively, but merely to calculate the 'most likely' dispersion scenario. However, recent developments in numerical weather prediction (NWP) include probabilistic forecasting techniques, which can be utilised also for long-range atmospheric dispersion models. The ensemble statistical methods developed and applied to NWP models aim at describing the inherent uncertainties of the meteorological model results. These uncertainties stem from e.g. limits in meteorological observations used to initialise meteorological forecast series. By perturbing e.g. the initial state of an NWP model run in agreement with the available observational data, an ensemble of meteorological forecasts is produced from which uncertainties in the various meteorological parameters are estimated, e.g. probabilities for rain. Corresponding ensembles of atmospheric dispersion can now be computed from which uncertainties of predicted radionuclide concentration and deposition patterns can be derived. (Author)
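
    Once an ensemble of dispersion runs exists, per-cell probabilities follow directly from member-wise threshold exceedance. A toy sketch with fake lognormal values standing in for real model output; the grid, threshold, and ensemble size are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)

# Stand-in for an ensemble of dispersion runs, one row per perturbed NWP
# member: predicted deposition (arbitrary units) at 100 grid cells.
ensemble = rng.lognormal(mean=1.0, sigma=0.8, size=(51, 100))

threshold = 5.0
p_exceed = (ensemble > threshold).mean(axis=0)   # per-cell exceedance probability
print(f"cells with P(deposition > {threshold}) >= 0.5: {int((p_exceed >= 0.5).sum())}")
```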

  17. [Influence of Uncertainty and Uncertainty Appraisal on Self-management in Hemodialysis Patients].

    Science.gov (United States)

    Jang, Hyung Suk; Lee, Chang Suk; Yang, Young Hee

    2015-04-01

    This study was done to examine the relation of uncertainty, uncertainty appraisal, and self-management in patients undergoing hemodialysis, and to identify factors influencing self-management. A convenience sample of 92 patients receiving hemodialysis was selected. Data were collected using a structured questionnaire and medical records. The collected data were analyzed using descriptive statistics, t-test, ANOVA, Pearson correlations and multiple regression analysis with the SPSS/WIN 20.0 program. The participants showed a moderate level of uncertainty with the highest score being for ambiguity among the four uncertainty subdomains. Scores for uncertainty danger or opportunity appraisals were under the mid points. The participants were found to perform a high level of self-management such as diet control, management of arteriovenous fistula, exercise, medication, physical management, measurements of body weight and blood pressure, and social activity. The self-management of participants undergoing hemodialysis showed a significant relationship with uncertainty and uncertainty appraisal. The significant factors influencing self-management were uncertainty, uncertainty opportunity appraisal, hemodialysis duration, and having a spouse. These variables explained 32.8% of the variance in self-management. The results suggest that intervention programs to reduce the level of uncertainty and to increase the level of uncertainty opportunity appraisal among patients would improve the self-management of hemodialysis patients.

  18. 37 CFR 251.41 - Formal hearings.

    Science.gov (United States)

    2010-07-01

    § 251.41 Formal hearings. (a) The formal hearings that will be conducted under the rules of this subpart are rate adjustment hearings and royalty fee...

  19. Formal Engineering Hybrid Systems: Semantic Underpinnings

    NARCIS (Netherlands)

    Bujorianu, M.C.; Bujorianu, L.M.

    2008-01-01

    In this work we investigate some issues in applying formal methods to hybrid system development and develop a categorical framework. We study the themes of stochastic reasoning, heterogeneous formal specification and retrenchment. Hybrid systems raise a rich palette of aspects that need to be

  20. Computational Methods for Sensitivity and Uncertainty Analysis in Criticality Safety

    International Nuclear Information System (INIS)

    Broadhead, B.L.; Childs, R.L.; Rearden, B.T.

    1999-01-01

    Interest in the sensitivity methods that were developed and widely used in the 1970s (the FORSS methodology at ORNL among others) has increased recently as a result of potential use in the area of criticality safety data validation procedures to define computational bias, uncertainties and area(s) of applicability. Functional forms of the resulting sensitivity coefficients can be used as formal parameters in the determination of applicability of benchmark experiments to their corresponding industrial application areas. In order for these techniques to be generally useful to the criticality safety practitioner, the procedures governing their use had to be updated and simplified. This paper will describe the resulting sensitivity analysis tools that have been generated for potential use by the criticality safety community

  1. Formal Analysis Of Use Case Diagrams

    Directory of Open Access Journals (Sweden)

    Radosław Klimek

    2010-01-01

    Full Text Available Use case diagrams play an important role in modeling with UML. Careful modeling is crucial in obtaining a correct and efficient system architecture. The paper refers to the formal analysis of use case diagrams. A formal model of use cases is proposed and its construction for typical relationships between use cases is described. Two methods of formal analysis and verification are presented. The first one, based on a states' exploration, represents a model checking approach. The second one refers to symbolic reasoning using formal methods of temporal logic. A simple but representative example of use case scenario verification is discussed.

  2. The Q theory of investment : does uncertainty matter

    NARCIS (Netherlands)

    Hong Bo, [No Value

    1999-01-01

    This paper incorporates uncertainty into the Q-model of investment. A structural Q-type investment model is derived, which contains information on the uncertainty effects of random variables that affect the future profitability of a firm. We use a panel of 82 Dutch firms to test whether the presence of

  3. SCALE-6 Sensitivity/Uncertainty Methods and Covariance Data

    International Nuclear Information System (INIS)

    Williams, Mark L.; Rearden, Bradley T.

    2008-01-01

    Computational methods and data used for sensitivity and uncertainty analysis within the SCALE nuclear analysis code system are presented. The methodology used to calculate sensitivity coefficients and similarity coefficients and to perform nuclear data adjustment is discussed. A description is provided of the SCALE-6 covariance library based on ENDF/B-VII and other nuclear data evaluations, supplemented by 'low-fidelity' approximate covariances. SCALE (Standardized Computer Analyses for Licensing Evaluation) is a modular code system developed by Oak Ridge National Laboratory (ORNL) to perform calculations for criticality safety, reactor physics, and radiation shielding applications. SCALE calculations typically use sequences that execute a predefined series of executable modules to compute particle fluxes and responses like the critical multiplication factor. SCALE also includes modules for sensitivity and uncertainty (S/U) analysis of calculated responses. The S/U codes in SCALE are collectively referred to as TSUNAMI (Tools for Sensitivity and UNcertainty Analysis Methodology Implementation). SCALE-6, scheduled for release in 2008, contains significant new capabilities, including important enhancements in S/U methods and data. The main functions of TSUNAMI are to (a) compute nuclear data sensitivity coefficients and response uncertainties, (b) establish similarity between benchmark experiments and design applications, and (c) reduce uncertainty in calculated responses by consolidating integral benchmark experiments. TSUNAMI includes easy-to-use graphical user interfaces for defining problem input and viewing three-dimensional (3D) geometries, as well as an integrated plotting package.

  4. Quantum maximum-entropy principle for closed quantum hydrodynamic transport within a Wigner function formalism

    International Nuclear Information System (INIS)

    Trovato, M.; Reggiani, L.

    2011-01-01

    By introducing a quantum entropy functional of the reduced density matrix, the principle of quantum maximum entropy is asserted as a fundamental principle of quantum statistical mechanics. Accordingly, we develop a comprehensive theoretical formalism to construct rigorously a closed quantum hydrodynamic transport within a Wigner function approach. The theoretical formalism is formulated in both thermodynamic equilibrium and nonequilibrium conditions, and the quantum contributions are obtained by only assuming that the Lagrange multipliers can be expanded in powers of ℏ². In particular, by using an arbitrary number of moments, we prove that (1) on a macroscopic scale all nonlocal effects, compatible with the uncertainty principle, are imputable to high-order spatial derivatives, both of the numerical density n and of the effective temperature T; (2) the results available from the literature in the framework of both a quantum Boltzmann gas and a degenerate quantum Fermi gas are recovered as a particular case; (3) the statistics for the quantum Fermi and Bose gases at different levels of degeneracy are explicitly incorporated; (4) a set of relevant applications admitting exact analytical equations are explicitly given and discussed; (5) the quantum maximum entropy principle keeps full validity in the classical limit ℏ→0.

  5. Incorporating uncertainty in predictive species distribution modelling.

    Science.gov (United States)

    Beale, Colin M; Lennon, Jack J

    2012-01-19

    Motivated by the need to solve ecological problems (climate change, habitat fragmentation and biological invasions), there has been increasing interest in species distribution models (SDMs). Predictions from these models inform conservation policy, invasive species management and disease-control measures. However, predictions are subject to uncertainty, the degree and source of which is often unrecognized. Here, we review the SDM literature in the context of uncertainty, focusing on three main classes of SDM: niche-based models, demographic models and process-based models. We identify sources of uncertainty for each class and discuss how uncertainty can be minimized or included in the modelling process to give realistic measures of confidence around predictions. Because this has typically not been performed, we conclude that uncertainty in SDMs has often been underestimated and a false precision assigned to predictions of geographical distribution. We identify areas where development of new statistical tools will improve predictions from distribution models, notably the development of hierarchical models that link different types of distribution model and their attendant uncertainties across spatial scales. Finally, we discuss the need to develop more defensible methods for assessing predictive performance, quantifying model goodness-of-fit and for assessing the significance of model covariates.

  6. Formal balancing of chemical reaction networks

    NARCIS (Netherlands)

    van der Schaft, Abraham; Rao, S.; Jayawardhana, B.

    2016-01-01

    In this paper we recall and extend the main results of Van der Schaft, Rao, Jayawardhana (2015) concerning the use of Kirchhoff’s Matrix Tree theorem in the explicit characterization of complex-balanced reaction networks and the notion of formal balancing. The notion of formal balancing corresponds

  7. Formal methods in software development: A road less travelled

    Directory of Open Access Journals (Sweden)

    John A van der Poll

    2010-08-01

    Full Text Available An integration of traditional verification techniques and formal specifications in software engineering is presented. Advocates of such techniques claim that mathematical formalisms allow them to produce quality, verifiably correct, or at least highly dependable software, and that the testing and maintenance phases are shortened. Critics, on the other hand, maintain that software formalisms are hard to master, tedious to use and not well suited for the fast turnaround times demanded by industry. In this paper some popular formalisms and the advantages of using them during the early phases of the software development life cycle are presented. Employing the Floyd-Hoare verification principles during the formal specification phase facilitates reasoning about the properties of a specification. Some observations that may help to alleviate the formal-methods controversy are established, and a number of formal-methods successes are presented. Possible conditions for an increased acceptance of formalisms in software development are discussed.

  8. Statistical Survey of Non-Formal Education

    Directory of Open Access Journals (Sweden)

    Ondřej Nývlt

    2012-12-01

    Full Text Available Formal education is focused on a programme within a regular education system. Labour market flexibility and new requirements on employees have created a new domain of education called non-formal education. Is there a reliable statistical source with a good methodological definition for the Czech Republic? The Labour Force Survey (LFS) has been the basic statistical source for time comparison of non-formal education for the last ten years. Furthermore, a special Adult Education Survey (AES) in 2011 focused on the individual components of non-formal education in a detailed way. In general, the goal of the EU is to use data from both internationally comparable surveys for analyses of particular fields of lifelong learning, such that annual LFS data can be enlarged by detailed information from the AES in five-year periods. This article describes the reliability of statistical data about non-formal education; such an analysis is usually concerned with sampling and non-sampling errors.

  9. Propagation of experimental uncertainties using the Lipari-Szabo model-free analysis of protein dynamics

    International Nuclear Information System (INIS)

    Jin Danqing; Andrec, Michael; Montelione, Gaetano T.; Levy, Ronald M.

    1998-01-01

    In this paper we make use of the graphical procedure previously described [Jin, D. et al. (1997) J. Am. Chem. Soc., 119, 6923-6924] to analyze NMR relaxation data using the Lipari-Szabo model-free formalism. The graphical approach is advantageous in that it allows the direct visualization of the experimental uncertainties in the motional parameter space. Some general 'rules' describing the relationship between the precision of the relaxation measurements and the precision of the model-free parameters and how this relationship changes with the overall tumbling time (τm) are summarized. The effect of the precision in the relaxation measurements on the detection of internal motions not close to the extreme narrowing limit is analyzed. We also show that multiple timescale internal motions may be obscured by experimental uncertainty, and that the collection of relaxation data at very high field strength can improve the ability to detect such deviations from the simple Lipari-Szabo model
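
    For context, the Lipari-Szabo model-free formalism expresses the spectral density through a generalized order parameter S² and an effective internal correlation time τ_e; measured relaxation rates are linear combinations of J(ω) at fixed frequencies, so experimental uncertainties map through this function onto the (S², τ_e) plane. This is the standard form from the literature, added for orientation:

```latex
J(\omega) \;=\; \frac{2}{5}\!\left[
\frac{S^2 \tau_m}{1 + (\omega\tau_m)^2}
+ \frac{(1 - S^2)\,\tau}{1 + (\omega\tau)^2}
\right],
\qquad
\frac{1}{\tau} \;=\; \frac{1}{\tau_m} + \frac{1}{\tau_e}.
```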

  10. A structured analysis of uncertainty surrounding modeled impacts of groundwater-extraction rules

    Science.gov (United States)

    Guillaume, Joseph H. A.; Qureshi, M. Ejaz; Jakeman, Anthony J.

    2012-08-01

    Integrating economic and groundwater models for groundwater-management can help improve understanding of trade-offs involved between conflicting socioeconomic and biophysical objectives. However, there is significant uncertainty in most strategic decision-making situations, including in the models constructed to represent them. If not addressed, this uncertainty may be used to challenge the legitimacy of the models and decisions made using them. In this context, a preliminary uncertainty analysis was conducted of a dynamic coupled economic-groundwater model aimed at assessing groundwater extraction rules. The analysis demonstrates how a variety of uncertainties in such a model can be addressed. A number of methods are used including propagation of scenarios and bounds on parameters, multiple models, block bootstrap time-series sampling and robust linear regression for model calibration. These methods are described within the context of a theoretical uncertainty management framework, using a set of fundamental uncertainty management tasks and an uncertainty typology.
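
    Of the uncertainty-management methods listed, block bootstrap resampling is the most mechanical, so a minimal sketch is given below; the series, block length, and statistic are invented placeholders, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(11)

def block_bootstrap(series, block_len):
    """One block-bootstrap replicate: concatenate randomly placed contiguous
    blocks until the original length is reached; autocorrelation is preserved
    within each block."""
    n = len(series)
    n_blocks = -(-n // block_len)                          # ceiling division
    starts = rng.integers(0, n - block_len + 1, size=n_blocks)
    return np.concatenate([series[s:s + block_len] for s in starts])[:n]

# Fake autocorrelated series standing in for a calibration time series.
data = np.sin(np.linspace(0.0, 20.0, 200)) + rng.normal(0.0, 0.2, 200)
means = np.array([block_bootstrap(data, 20).mean() for _ in range(500)])
print(f"block-bootstrap standard error of the mean: {means.std(ddof=1):.3f}")
```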

  11. Instrument uncertainty predictions

    International Nuclear Information System (INIS)

    Coutts, D.A.

    1991-07-01

    The accuracy of measurements and correlations should normally be provided for most experimental activities. The uncertainty is a measure of the accuracy of a stated value or equation. The uncertainty term reflects a combination of instrument errors, modeling limitations, and phenomena understanding deficiencies. This report provides several methodologies to estimate an instrument's uncertainty when used in experimental work. Methods are shown to predict both the pretest and post-test uncertainty
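
    A generic building block for such estimates is the first-order (root-sum-square) combination of independent input uncertainties; the sketch below is the standard GUM-style rule under an independence assumption, not a procedure taken from the report itself.

```python
import numpy as np

def combined_standard_uncertainty(sensitivities, u_inputs):
    """First-order combination of independent input uncertainties:
    u_c**2 = sum_i (df/dx_i)**2 * u_i**2."""
    s = np.asarray(sensitivities, dtype=float)
    u = np.asarray(u_inputs, dtype=float)
    return float(np.sqrt(np.sum((s * u) ** 2)))

# Example: two inputs with sensitivities 1.0 and 0.5 and standard
# uncertainties 0.02 and 0.04 (all values illustrative).
print(f"u_c = {combined_standard_uncertainty([1.0, 0.5], [0.02, 0.04]):.3f}")
```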

  12. WE-B-19A-01: SRT II: Uncertainties in SRT

    International Nuclear Information System (INIS)

    Dieterich, S; Schlesinger, D; Geneser, S

    2014-01-01

    SRS delivery has undergone major technical changes in the last decade, transitioning from predominantly frame-based treatment delivery to image-guided, frameless SRS. It is important for medical physicists working in SRS to understand the magnitude and sources of uncertainty involved in delivering SRS treatments for a multitude of technologies (Gamma Knife, CyberKnife, linac-based SRS and protons). Sources of SRS planning and delivery uncertainty include dose calculation, image fusion, and intra- and inter-fraction motion. Dose calculations for small fields are particularly difficult because of the lack of electronic equilibrium and the greater effect of inhomogeneities within and near the PTV. Going frameless introduces greater setup uncertainties and allows for potentially increased intra- and inter-fraction motion. The increased use of multiple imaging modalities to determine the tumor volume necessitates (deformable) image and contour fusion, and the uncertainties introduced in the image registration process further contribute to overall treatment planning uncertainty. Each of these uncertainties must be quantified and their impact on treatment delivery accuracy understood. If necessary, the uncertainties may then be accounted for during treatment planning, either through techniques that make the uncertainty explicit or by the appropriate addition of PTV margins. Further complicating matters, the statistics of 1-5 fraction SRS treatments differ from traditional margin recipes, which rely on Poisson statistics. In this session, we will discuss the uncertainties introduced during each step of the SRS treatment planning and delivery process and present margin recipes to appropriately account for them. Learning Objectives: To understand the major contributors to the total delivery uncertainty in SRS for Gamma Knife, CyberKnife, and linac-based SRS. Learn the various uncertainties introduced by image fusion, deformable image registration, and contouring

  13. Formalization of Many-Valued Logics

    DEFF Research Database (Denmark)

    Villadsen, Jørgen; Schlichtkrull, Anders

    2017-01-01

    Partiality is a key challenge for computational approaches to artificial intelligence in general and natural language in particular. Various extensions of classical two-valued logic to many-valued logics have been investigated in order to meet this challenge. We use the proof assistant Isabelle...... to formalize the syntax and semantics of many-valued logics with determinate as well as indeterminate truth values. The formalization allows for a concise presentation and makes automated verification possible....

  14. Helicity formalism and spin effects

    International Nuclear Information System (INIS)

    Anselmino, M.; Caruso, F.; Piovano, U.

    1990-01-01

    The helicity formalism and the technique to compute amplitudes for interaction processes involving leptons, quarks, photons and gluons are reviewed. Explicit calculations and examples of exploitation of symmetry properties are shown. The formalism is then applied to the discussion of several hadronic processes and spin effects: the experimental data, when related to the properties of the elementary constituent interactions, show many not understood features. Also the nucleon spin problem is briefly reviewed. (author)

  15. Experimental uncertainty estimation and statistics for data having interval uncertainty.

    Energy Technology Data Exchange (ETDEWEB)

    Kreinovich, Vladik (Applied Biomathematics, Setauket, New York); Oberkampf, William Louis (Applied Biomathematics, Setauket, New York); Ginzburg, Lev (Applied Biomathematics, Setauket, New York); Ferson, Scott (Applied Biomathematics, Setauket, New York); Hajagos, Janos (Applied Biomathematics, Setauket, New York)

    2007-05-01

    This report addresses the characterization of measurements that include epistemic uncertainties in the form of intervals. It reviews the application of basic descriptive statistics to data sets which contain intervals rather than exclusively point estimates. It describes algorithms to compute various means, the median and other percentiles, variance, interquartile range, moments, confidence limits, and other important statistics and summarizes the computability of these statistics as a function of sample size and characteristics of the intervals in the data (degree of overlap, size and regularity of widths, etc.). It also reviews the prospects for analyzing such data sets with the methods of inferential statistics such as outlier detection and regressions. The report explores the tradeoff between measurement precision and sample size in statistical results that are sensitive to both. It also argues that an approach based on interval statistics could be a reasonable alternative to current standard methods for evaluating, expressing and propagating measurement uncertainties.
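
    Some of the descriptive statistics discussed are straightforward to bound for interval data: because the mean and median are monotone in every observation, their tightest bounds come from the all-lower and all-upper endpoint configurations. A minimal sketch with hypothetical intervals; variance bounds are deliberately omitted because they are harder to compute.

```python
import numpy as np

# Hypothetical measurements reported as intervals [low, high].
data = np.array([[1.0, 1.4], [2.1, 2.3], [0.8, 1.5], [1.9, 2.6], [1.2, 1.3]])
lo, hi = data[:, 0], data[:, 1]

# Mean and median are monotone in each observation, so their tightest
# bounds come from taking all-lower and all-upper endpoints.
print("mean   in", [lo.mean(), hi.mean()])
print("median in", [np.median(lo), np.median(hi)])
# Variance bounds are not this simple: the exact upper bound is NP-hard
# for heavily overlapping intervals, as the report's computability
# discussion indicates.
```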

  16. Explaining Delusions: Reducing Uncertainty Through Basic and Computational Neuroscience.

    Science.gov (United States)

    Feeney, Erin J; Groman, Stephanie M; Taylor, Jane R; Corlett, Philip R

    2017-03-01

    Delusions, the fixed false beliefs characteristic of psychotic illness, have long defied understanding despite their response to pharmacological treatments (e.g., D2 receptor antagonists). However, it can be challenging to discern what makes beliefs delusional compared with other unusual or erroneous beliefs. We suggest mapping the putative biology to clinical phenomenology with a cognitive psychology of belief, culminating in a teleological approach to beliefs and brain function supported by animal and computational models. We argue that organisms strive to minimize uncertainty about their future states by forming and maintaining a set of beliefs (about the organism and the world) that are robust, but flexible. If uncertainty is generated endogenously, beliefs begin to depart from consensual reality and can manifest into delusions. Central to this scheme is the notion that formal associative learning theory can provide an explanation for the development and persistence of delusions. Beliefs, in animals and humans, may be associations between representations (e.g., of cause and effect) that are formed by minimizing uncertainty via new learning and attentional allocation. Animal research has equipped us with a deep mechanistic basis of these processes, which is now being applied to delusions. This work offers the exciting possibility of completing revolutions of translation, from the bedside to the bench and back again. The more we learn about animal beliefs, the more we may be able to apply to human beliefs and their aberrations, enabling a deeper mechanistic understanding. © The Author 2017. Published by Oxford University Press on behalf of the Maryland Psychiatric Research Center. All rights reserved. For permissions, please email: journals.permissions@oup.com.
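
    The appeal to formal associative learning theory can be illustrated with the classic Rescorla-Wagner error-driven update; this is a textbook model offered purely as an illustration, not necessarily the computational model the authors use.

```python
# Minimal Rescorla-Wagner update: belief strength V is nudged by the
# prediction error (outcome - V); aberrant endogenous noise in this
# error signal is one proposed route to delusion-like fixed beliefs.
def rescorla_wagner(outcomes, alpha=0.2, v0=0.0):
    v = v0
    trace = []
    for outcome in outcomes:
        v += alpha * (outcome - v)   # error-driven belief update
        trace.append(v)
    return trace

# A cue reliably predicts the outcome for 20 trials, then never does.
history = rescorla_wagner([1.0] * 20 + [0.0] * 20)
print([round(v, 2) for v in history[::5]])
```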

  17. Measurement Errors and Uncertainties Theory and Practice

    CERN Document Server

    Rabinovich, Semyon G

    2006-01-01

    Measurement Errors and Uncertainties addresses the most important problems that physicists and engineers encounter when estimating errors and uncertainty. Building from the fundamentals of measurement theory, the author develops the theory of accuracy of measurements and offers a wealth of practical recommendations and examples of applications. This new edition covers a wide range of subjects, including: - Basic concepts of metrology - Measuring instruments characterization, standardization and calibration - Estimation of errors and uncertainty of single and multiple measurements - Modern probability-based methods of estimating measurement uncertainty. With this new edition, the author completes the development of the new theory of indirect measurements. This theory provides more accurate and efficient methods for processing indirect measurement data. It eliminates the need to calculate the correlation coefficient - a stumbling block in measurement data processing - and offers for the first time a way to obtain...

  18. Managing Innovation In View Of The Uncertainties

    Directory of Open Access Journals (Sweden)

    Anton Igorevich Mosalev

    2012-12-01

    Full Text Available The study of uncertainty in innovation is highly topical at present. Existing approaches to its definition rest primarily on assumptions and known parameters, which essentially amounts to a game-theoretic approach to assessment. How to govern innovation while accounting for uncertainty remains an open and highly relevant question, especially since innovation is one of the drivers of growth of national economies. This paper presents a methodological approach to determining the degree of uncertainty and an approach to the management of innovation through a system of mathematical modeling based on the criterion of gross errors.

  19. The Uncertainty Test for the MAAP Computer Code

    International Nuclear Information System (INIS)

    Park, S. H.; Song, Y. M.; Park, S. Y.; Ahn, K. I.; Kim, K. R.; Lee, Y. J.

    2008-01-01

    After the Three Mile Island Unit 2 (TMI-2) and Chernobyl accidents, safety issues for severe accidents have been treated from various perspectives. A major issue in our research is the level 2 PSA. The difficulty in expanding the level 2 PSA as a risk-informed activity is the uncertainty. In the past, weight was attached to improving the quality of the internal-events PSA, but that effort is insufficient to decrease the phenomenological uncertainty in the level 2 PSA. In our country, the degree of uncertainty in level 2 PSA models is high, and it is necessary to secure a model that decreases this uncertainty. Since we have little experience with uncertainty assessment technology, the assessment system itself depends on advanced nations. In advanced nations, severe accident simulators are implemented at the hardware level; in our case, basic functions can be implemented at the software level. Given these circumstances at home and abroad, similar instances such as UQM and MELCOR were surveyed. Drawing on these instances, the SAUNA (Severe Accident UNcertainty Analysis) system is being developed in our project to assess and decrease the uncertainty in a level 2 PSA. It uses the MAAP code to analyze the uncertainty in a severe accident

  20. Estimating Coastal Digital Elevation Model (DEM) Uncertainty

    Science.gov (United States)

    Amante, C.; Mesick, S.

    2017-12-01

    Integrated bathymetric-topographic digital elevation models (DEMs) are representations of the Earth's solid surface and are fundamental to the modeling of coastal processes, including tsunami, storm surge, and sea-level rise inundation. Deviations in elevation values from the actual seabed or land surface constitute errors in DEMs, which originate from numerous sources, including: (i) the source elevation measurements (e.g., multibeam sonar, lidar), (ii) the interpolative gridding technique (e.g., spline, kriging) used to estimate elevations in areas unconstrained by source measurements, and (iii) the datum transformation used to convert bathymetric and topographic data to common vertical reference systems. The magnitude and spatial distribution of the errors from these sources are typically unknown, and the lack of knowledge regarding these errors represents the vertical uncertainty in the DEM. The National Oceanic and Atmospheric Administration (NOAA) National Centers for Environmental Information (NCEI) has developed DEMs for more than 200 coastal communities. This study presents a methodology developed at NOAA NCEI to derive accompanying uncertainty surfaces that estimate DEM errors at the individual cell-level. The development of high-resolution (1/9th arc-second), integrated bathymetric-topographic DEMs along the southwest coast of Florida serves as the case study for deriving uncertainty surfaces. The estimated uncertainty can then be propagated into the modeling of coastal processes that utilize DEMs. Incorporating the uncertainty produces more reliable modeling results, and in turn, better-informed coastal management decisions.
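
    One way such an uncertainty surface can be propagated is by Monte Carlo: sample plausible elevations per cell and re-run the downstream computation. A minimal sketch with a hypothetical DEM tile, a hypothetical per-cell sigma surface, and an illustrative 1.0 m inundation threshold:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical DEM tile (metres) and a per-cell standard uncertainty
# surface of the kind described in the record.
dem = np.array([[0.8, 1.2, 2.5],
                [0.4, 0.9, 1.8],
                [0.1, 0.6, 1.1]])
sigma = np.array([[0.3, 0.3, 0.5],
                  [0.2, 0.3, 0.4],
                  [0.2, 0.2, 0.3]])

# Propagate the uncertainty into an inundation question: probability
# that each cell lies below a 1.0 m sea-level-rise scenario.
n = 5000
samples = rng.normal(dem, sigma, size=(n, *dem.shape))
p_flooded = (samples < 1.0).mean(axis=0)
print(p_flooded.round(2))
```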

  1. Research of Uncertainty Reasoning in Pineapple Disease Identification System

    Science.gov (United States)

    Liu, Liqun; Fan, Haifeng

    In order to deal with the uncertainty of the evidence that pervades a pineapple disease identification system, a reasoning model based on evidence credibility factors was established. The uncertainty reasoning method is discussed, including: uncertain representation of knowledge, uncertain representation of rules, uncertain representation of multiple pieces of evidence, and updating of reasoning rules. The reasoning can fully reflect the uncertainty in disease identification and reduce the influence of subjective factors on the accuracy of the system.
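
    The record does not give its exact combination rule, so the sketch below uses the classic MYCIN-style certainty-factor calculus as a stand-in for pooling evidence credibility factors; the evidence values and the diagnosis name are hypothetical.

```python
def combine_cf(cf1, cf2):
    """MYCIN-style combination of two certainty factors in [-1, 1]."""
    if cf1 >= 0 and cf2 >= 0:
        return cf1 + cf2 * (1 - cf1)
    if cf1 < 0 and cf2 < 0:
        return cf1 + cf2 * (1 + cf1)
    return (cf1 + cf2) / (1 - min(abs(cf1), abs(cf2)))

# Two pieces of evidence for a hypothetical diagnosis, one against it.
cf = combine_cf(combine_cf(0.6, 0.5), -0.3)
print(round(cf, 3))   # pooled credibility of the diagnosis
```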

  2. Uncertainty Analyses and Strategy

    International Nuclear Information System (INIS)

    Kevin Coppersmith

    2001-01-01

    The DOE identified a variety of uncertainties, arising from different sources, during its assessment of the performance of a potential geologic repository at the Yucca Mountain site. In general, the number and detail of process models developed for the Yucca Mountain site, and the complex coupling among those models, make the direct incorporation of all uncertainties difficult. The DOE has addressed these issues in a number of ways using an approach to uncertainties that is focused on producing a defensible evaluation of the performance of a potential repository. The treatment of uncertainties oriented toward defensible assessments has led to analyses and models with so-called "conservative" assumptions and parameter bounds, where conservative implies lower performance than might be demonstrated with a more realistic representation. The varying maturity of the analyses and models, and uneven level of data availability, result in total system level analyses with a mix of realistic and conservative estimates (for both probabilistic representations and single values). That is, some inputs have realistically represented uncertainties, and others are conservatively estimated or bounded. However, this approach is consistent with the "reasonable assurance" approach to compliance demonstration, which was called for in the U.S. Nuclear Regulatory Commission's (NRC) proposed 10 CFR Part 63 regulation (64 FR 8640 [DIRS 101680]). A risk analysis that includes conservatism in the inputs will result in conservative risk estimates. Therefore, the approach taken for the Total System Performance Assessment for the Site Recommendation (TSPA-SR) provides a reasonable representation of processes and conservatism for purposes of site recommendation. However, mixing unknown degrees of conservatism in models and parameter representations reduces the transparency of the analysis and makes the development of coherent and consistent probability statements about projected repository

  3. Dist-Orc: A Rewriting-based Distributed Implementation of Orc with Formal Analysis

    Directory of Open Access Journals (Sweden)

    José Meseguer

    2010-09-01

    Full Text Available Orc is a theory of orchestration of services that allows structured programming of distributed and timed computations. Several formal semantics have been proposed for Orc, including a rewriting logic semantics developed by the authors. Orc also has a fully fledged implementation in Java with functional programming features. However, as with descriptions of most distributed languages, there exists a fairly substantial gap between Orc's formal semantics and its implementation, in that: (i) programs in Orc are not easily deployable in a distributed implementation just by using Orc's formal semantics, and (ii) they are not readily formally analyzable at the level of a distributed Orc implementation. In this work, we overcome problems (i) and (ii) for Orc. Specifically, we describe an implementation technique based on rewriting logic and Maude that narrows this gap considerably. The enabling feature of this technique is Maude's support for external objects through TCP sockets. We describe how sockets are used to implement Orc site calls and returns, and to provide real-time timing information to Orc expressions and sites. We then show how Orc programs in the resulting distributed implementation can be formally analyzed at a reasonable level of abstraction by defining an abstract model of time and the socket communication infrastructure, and discuss the assumptions under which the analysis can be deemed correct. Finally, the distributed implementation and the formal analysis methodology are illustrated with a case study.

  4. Formalization of the classification pattern: survey of classification modeling in information systems engineering.

    Science.gov (United States)

    Partridge, Chris; de Cesare, Sergio; Mitchell, Andrew; Odell, James

    2018-01-01

    Formalization is becoming more common in all stages of the development of information systems, as a better understanding of its benefits emerges. Classification systems are ubiquitous, no more so than in domain modeling. The classification pattern that underlies these systems provides a good case study of the move toward formalization in part because it illustrates some of the barriers to formalization, including the formal complexity of the pattern and the ontological issues surrounding the "one and the many." Powersets are a way of characterizing the (complex) formal structure of the classification pattern, and their formalization has been extensively studied in mathematics since Cantor's work in the late nineteenth century. One can use this formalization to develop a useful benchmark. There are various communities within information systems engineering (ISE) that are gradually working toward a formalization of the classification pattern. However, for most of these communities, this work is incomplete, in that they have not yet arrived at a solution with the expressiveness of the powerset benchmark. This contrasts with the early smooth adoption of powerset by other information systems communities to, for example, formalize relations. One way of understanding the varying rates of adoption is recognizing that the different communities have different historical baggage. Many conceptual modeling communities emerged from work done on database design, and this creates hurdles to the adoption of the high level of expressiveness of powersets. Another relevant factor is that these communities also often feel, particularly in the case of domain modeling, a responsibility to explain the semantics of whatever formal structures they adopt. This paper aims to make sense of the formalization of the classification pattern in ISE and surveys its history through the literature, starting from the relevant theoretical works of the mathematical literature and gradually shifting focus

  5. Survey of Existing Tools for Formal Verification.

    Energy Technology Data Exchange (ETDEWEB)

    Punnoose, Ratish J.; Armstrong, Robert C.; Wong, Matthew H.; Jackson, Mayo

    2014-12-01

    Formal methods have come into wide use because of their effectiveness in verifying "safety and security" requirements of digital systems; a set of requirements for which testing is mostly ineffective. Formal methods are routinely used in the design and verification of high-consequence digital systems in industry. This report outlines our work in assessing the capabilities of commercial and open source formal tools and the ways in which they can be leveraged in digital design workflows.

  6. Lived Experiences of "Illness Uncertainty" of Iranian Cancer Patients: A Phenomenological Hermeneutic Study.

    Science.gov (United States)

    Sajjadi, Moosa; Rassouli, Maryam; Abbaszadeh, Abbas; Brant, Jeannine; Majd, Hamid Alavi

    2016-01-01

    For cancer patients, uncertainty is a pervasive experience and a major psychological stressor that affects many aspects of their lives. Uncertainty is a multifaceted concept, and its understanding for patients depends on many factors, including factors associated with various sociocultural contexts. Unfortunately, little is known about the concept of uncertainty in Iranian society and culture. This study aimed to clarify the concept and explain lived experiences of illness uncertainty in Iranian cancer patients. In this hermeneutic phenomenological study, 8 cancer patients participated in semistructured in-depth interviews about their experiences of uncertainty in illness. Interviews continued until data saturation was reached. All interviews were recorded, transcribed, analyzed, and interpreted using 6 stages of the van Manen phenomenological approach. Seven main themes emerged from patients' experiences of illness uncertainty of cancer. Four themes contributed to uncertainty including "Complexity of Cancer," "Confusion About Cancer," "Contradictory Information," and "Unknown Future." Two themes facilitated coping with uncertainty including "Seeking Knowledge" and "Need for Spiritual Peace." One theme, "Knowledge Ambivalence," revealed the struggle between wanting to know and not wanting to know, especially if bad news was delivered. Uncertainty experience for cancer patients in different societies is largely similar. However, some experiences (eg, ambiguity in access to medical resources) seemed unique to Iranian patients. This study provided an outlook of cancer patients' experiences of illness uncertainty in Iran. Cancer patients' coping ability to deal with uncertainty can be improved.

  7. An exploration of student midwives' language to describe non-formal learning in professional practice.

    Science.gov (United States)

    Finnerty, Gina; Pope, Rosemary

    2005-05-01

    The essence of non-formal learning in midwifery practice has not been previously explored. This paper provides an in-depth analysis of the language of a sample of student midwives' descriptions of their practice learning in a range of clinical settings. The students submitted audio-diaries as part of a national study (Pope, R., Graham, L., Finnerty, G., Magnusson, C., 2003. An investigation of the preparation and assessment for midwifery practice within a range of settings. Project Report. University of Surrey). Participants detailed their learning activities and the support obtained whilst working with their named mentors for approximately 10 days or shifts. The rich audio-diary data have been analysed using Discourse Analysis. A typology of non-formal learning (Eraut, M. 2000. Non-formal learning and implicit knowledge in professional work. British Journal of Educational Psychology 70, 113-136) has been used to provide a framework for the analysis. Non-formal learning is defined as any learning which does not take place within a formally organised learning programme (Eraut, 2000). Findings indicate that fear and ambiguity hindered students' learning. Recommendations include the protection of time by mentors within the clinical curriculum to guide and supervise students in both formal and non-formal elements of midwifery practice. This paper will explore the implications of the findings for practice-based education.

  8. Multiverse in the Third Quantized Formalism

    International Nuclear Information System (INIS)

    Faizal Mir

    2014-01-01

    In this paper we will analyze the third quantization of gravity in the path integral formalism. We will use the time-dependent version of the Wheeler-DeWitt equation to analyze the multiverse in this formalism. We will propose a mechanism for baryogenesis to occur in the multiverse, without violating the baryon number conservation. (general)

  9. Standardization and Confluence in Pure Lambda-Calculus Formalized for the Matita Theorem Prover

    Directory of Open Access Journals (Sweden)

    Ferruccio Guidi

    2012-01-01

    Full Text Available We present a formalization of pure lambda-calculus for the Matita interactive theorem prover, including the proofs of two relevant results in reduction theory: the confluence theorem and the standardization theorem. The proof of the latter is based on a new approach recently introduced by Xi and refined by Kashima that, avoiding the notion of development and having a neat inductive structure, is particularly suited for formalization in theorem provers.

  10. An Entry Point for Formal Methods: Specification and Analysis of Event Logs

    Directory of Open Access Journals (Sweden)

    Howard Barringer

    2010-03-01

    Full Text Available Formal specification languages have long languished, due to the grave scalability problems faced by complete verification methods. Runtime verification promises to use formal specifications to automate part of the more scalable art of testing, but has not been widely applied to real systems, and often falters due to the cost and complexity of instrumentation for online monitoring. In this paper we discuss work in progress to apply an event-based specification system to the logging mechanism of the Mars Science Laboratory mission at JPL. By focusing on log analysis, we exploit the "instrumentation" already implemented and required for communicating with the spacecraft. We argue that this work both shows a practical method for using formal specifications in testing and opens interesting research avenues, including a challenging specification learning problem.
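
    A log-analysis specification of the kind described can be as small as a response property ("every command dispatched is eventually completed"). The sketch below is a minimal monitor over a hypothetical event log; the event names are illustrative stand-ins, not actual MSL telemetry.

```python
# Minimal log monitor for a response property: every `dispatch` event
# must eventually be followed by a matching `complete` event.
def check_response(log, trigger="dispatch", response="complete"):
    pending = set()
    for event, cmd_id in log:
        if event == trigger:
            pending.add(cmd_id)
        elif event == response:
            pending.discard(cmd_id)
    return pending   # command ids still unanswered at end of log

log = [("dispatch", 7), ("complete", 7), ("dispatch", 9),
       ("dispatch", 12), ("complete", 12)]
print("violations:", check_response(log))   # {9} was never completed
```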

  11. Uncertainty analysis of neutron transport calculation

    International Nuclear Information System (INIS)

    Oka, Y.; Furuta, K.; Kondo, S.

    1987-01-01

    A cross-section sensitivity-uncertainty analysis code, SUSD, was developed. The code calculates sensitivity coefficients for one- and two-dimensional transport problems based on first-order perturbation theory. The variance and standard deviation of detector responses or design parameters can be obtained using the cross-section covariance matrix. The code is able to perform sensitivity-uncertainty analysis for the secondary neutron angular distribution (SAD) and the secondary neutron energy distribution (SED). Covariances of the 6Li and 7Li neutron cross sections in JENDL-3PR1 were evaluated, including SAD and SED. Covariances of Fe and Be were also evaluated. The uncertainty of the tritium breeding ratio, fast neutron leakage flux and neutron heating was analysed for four types of blanket concepts for a commercial tokamak fusion reactor. The uncertainty of the tritium breeding ratio was less than 6 percent. Contributions from SAD/SED uncertainties are significant for some parameters. Formulas to estimate the errors of the numerical solution of the transport equation were derived based on perturbation theory. This method enables us to deterministically estimate the numerical errors due to iterative solution, spatial discretization and Legendre polynomial expansion of transfer cross sections. The calculational errors of the tritium breeding ratio and the fast neutron leakage flux of the fusion blankets were analysed. (author)
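
    The variance computation from sensitivity coefficients and a covariance matrix is the first-order "sandwich rule", var(R) = s^T C s. A minimal sketch with illustrative numbers (not JENDL-3PR1 data):

```python
import numpy as np

# First-order sandwich rule: var(R) = s^T C s, where s holds the
# sensitivity coefficients dR/dx_i and C is the cross-section
# covariance matrix (all values below are illustrative only).
s = np.array([0.8, -0.3, 0.5])            # sensitivities of a response
C = np.array([[0.0004, 0.0001, 0.0],
              [0.0001, 0.0009, 0.0002],
              [0.0,    0.0002, 0.0016]])  # relative covariance matrix

variance = s @ C @ s
print(f"relative std. dev. of response = {np.sqrt(variance):.4f}")
```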

  12. Uncertainty and Sensitivity Analyses Plan

    International Nuclear Information System (INIS)

    Simpson, J.C.; Ramsdell, J.V. Jr.

    1993-04-01

    Hanford Environmental Dose Reconstruction (HEDR) Project staff are developing mathematical models to be used to estimate the radiation dose that individuals may have received as a result of emissions since 1944 from the US Department of Energy's (DOE) Hanford Site near Richland, Washington. An uncertainty and sensitivity analyses plan is essential to understand and interpret the predictions from these mathematical models. This is especially true in the case of the HEDR models where the values of many parameters are unknown. This plan gives a thorough documentation of the uncertainty and hierarchical sensitivity analysis methods recommended for use on all HEDR mathematical models. The documentation includes both technical definitions and examples. In addition, an extensive demonstration of the uncertainty and sensitivity analysis process is provided using actual results from the Hanford Environmental Dose Reconstruction Integrated Codes (HEDRIC). This demonstration shows how the approaches used in the recommended plan can be adapted for all dose predictions in the HEDR Project

  13. Singular formalism and admissible control of spacecraft with rotating flexible solar array

    Directory of Open Access Journals (Sweden)

    Lu Dongning

    2014-02-01

    Full Text Available This paper is concerned with the attitude control of a three-axis-stabilized spacecraft which consists of a central rigid body and a flexible sun-tracking solar array driven by a solar array drive assembly. Based on the linearization of the dynamics of the spacecraft and the modal identities about the flexible and rigid coupling matrices, the spacecraft attitude dynamics is reduced to a formally singular system with periodically varying parameters, which is quite different from a spacecraft with fixed appendages. In the framework of the singular control theory, the regularity and impulse-freeness of the singular system is analyzed and then admissible attitude controllers are designed by Lyapunov’s method. To improve the robustness against system uncertainties, an H∞ optimal control is designed by optimizing the H∞ norm of the system transfer function matrix. Comparative numerical experiments are performed to verify the theoretical results.

  14. 43 CFR 30.235 - What will the judge's decision in a formal probate proceeding contain?

    Science.gov (United States)

    2010-10-01

    ....235 What will the judge's decision in a formal probate proceeding contain? The judge must decide the... requirements of this section. (a) In all cases, the judge's decision must: (1) Include the name, birth date...

  15. A boundary integral formalism for stochastic ray tracing in billiards

    International Nuclear Information System (INIS)

    Chappell, David J.; Tanner, Gregor

    2014-01-01

    Determining the flow of rays or non-interacting particles driven by a force or velocity field is fundamental to modelling many physical processes. These include particle flows arising in fluid mechanics and ray flows arising in the geometrical optics limit of linear wave equations. In many practical applications, the driving field is not known exactly and the dynamics are determined only up to a degree of uncertainty. This paper presents a boundary integral framework for propagating flows including uncertainties, which is shown to systematically interpolate between a deterministic and a completely random description of the trajectory propagation. A simple but efficient discretisation approach is applied to model uncertain billiard dynamics in an integrable rectangular domain

  16. Meteorological uncertainty of atmospheric dispersion model results (MUD)

    Energy Technology Data Exchange (ETDEWEB)

    Havskov Soerensen, J.; Amstrup, B.; Feddersen, H. [Danish Meteorological Institute, Copenhagen (Denmark)] [and others]

    2013-08-15

    The MUD project addresses assessment of uncertainties of atmospheric dispersion model predictions, as well as possibilities for optimum presentation to decision makers. Previously, it has not been possible to estimate such uncertainties quantitatively, but merely to calculate the 'most likely' dispersion scenario. However, recent developments in numerical weather prediction (NWP) include probabilistic forecasting techniques, which can be utilised also for long-range atmospheric dispersion models. The ensemble statistical methods developed and applied to NWP models aim at describing the inherent uncertainties of the meteorological model results. These uncertainties stem from e.g. limits in meteorological observations used to initialise meteorological forecast series. By perturbing e.g. the initial state of an NWP model run in agreement with the available observational data, an ensemble of meteorological forecasts is produced from which uncertainties in the various meteorological parameters are estimated, e.g. probabilities for rain. Corresponding ensembles of atmospheric dispersion can now be computed from which uncertainties of predicted radionuclide concentration and deposition patterns can be derived. (Author)
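
    Once an ensemble of dispersion runs exists, probabilistic statements reduce to counting members. A minimal sketch with a hypothetical 50-member deposition ensemble and an illustrative intervention level:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical ensemble of dispersion runs: each member gives the
# deposition (Bq/m^2) at one receptor, driven by a perturbed NWP member.
ensemble = rng.lognormal(mean=6.0, sigma=0.8, size=50)

threshold = 1000.0   # illustrative intervention level
print(f"median deposition: {np.median(ensemble):.0f} Bq/m^2")
print(f"P(deposition > {threshold:.0f}) = "
      f"{(ensemble > threshold).mean():.2f}")
```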

  17. Planning for robust reserve networks using uncertainty analysis

    Science.gov (United States)

    Moilanen, A.; Runge, M.C.; Elith, Jane; Tyre, A.; Carmel, Y.; Fegraus, E.; Wintle, B.A.; Burgman, M.; Ben-Haim, Y.

    2006-01-01

    Planning land-use for biodiversity conservation frequently involves computer-assisted reserve selection algorithms. Typically such algorithms operate on matrices of species presence-absence in sites, or on species-specific distributions of model-predicted probabilities of occurrence in grid cells. There are practically always errors in input data: erroneous species presence-absence data, structural and parametric uncertainty in predictive habitat models, and lack of correspondence between temporal presence and long-run persistence. Despite these uncertainties, typical reserve selection methods proceed as if there were no uncertainty in the data or models. Having two conservation options of apparently equal biological value, one would prefer the option whose value is relatively insensitive to errors in planning inputs. In this work we show how uncertainty analysis for reserve planning can be implemented within a framework of information-gap decision theory, generating reserve designs that are robust to uncertainty. Consideration of uncertainty involves modifications to the typical objective functions used in reserve selection. Search for robust-optimal reserve structures can still be implemented via typical reserve selection optimization techniques, including stepwise heuristics, integer programming and stochastic global search.
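
    Info-gap robustness can be sketched as the largest uncertainty horizon h for which the worst-case reserve value still meets a performance target. The occupancy probabilities and target below are hypothetical; this illustrates the decision rule, not the authors' model.

```python
import numpy as np

# Info-gap robustness sketch: how far can predicted occurrence
# probabilities err (horizon h) before a candidate reserve fails to
# cover a target expected number of species? Values are illustrative.
p = np.array([0.9, 0.7, 0.6, 0.8])   # predicted occupancy in the reserve
target = 2.0                          # required expected species count

def worst_case_value(h):
    return np.clip(p - h, 0.0, 1.0).sum()

# Robustness = largest horizon whose worst case still meets the target.
hs = np.linspace(0.0, 1.0, 1001)
robustness = hs[[worst_case_value(h) >= target for h in hs]].max()
print(f"robustness horizon = {robustness:.3f}")
```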

  18. Enhancing Formal Modelling Tool Support with Increased Automation

    DEFF Research Database (Denmark)

    Lausdahl, Kenneth

    Progress report for the qualification exam of PhD student Kenneth Lausdahl. Initial work on enhancing tool support for the formal method VDM and the concept of unifying an abstract syntax tree with the ability for isolated extensions is described. The tool support includes a connection to ...... to UML and a test automation principle based on traces written as a kind of regular expressions....

  19. Parametric uncertainty in optical image modeling

    Science.gov (United States)

    Potzick, James; Marx, Egon; Davidson, Mark

    2006-10-01

    Optical photomask feature metrology and wafer exposure process simulation both rely on optical image modeling for accurate results. While it is fair to question the accuracies of the available models, model results also depend on several input parameters describing the object and imaging system. Errors in these parameter values can lead to significant errors in the modeled image. These parameters include wavelength, illumination and objective NA's, magnification, focus, etc. for the optical system, and topography, complex index of refraction n and k, etc. for the object. In this paper each input parameter is varied over a range about its nominal value and the corresponding images simulated. Second order parameter interactions are not explored. Using the scenario of the optical measurement of photomask features, these parametric sensitivities are quantified by calculating the apparent change of the measured linewidth for a small change in the relevant parameter. Then, using reasonable values for the estimated uncertainties of these parameters, the parametric linewidth uncertainties can be calculated and combined to give a lower limit to the linewidth measurement uncertainty for those parameter uncertainties.
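
    The paper's procedure, varying one parameter at a time and combining the resulting linewidth changes, can be sketched with central finite differences and a root-sum-square combination. The response surface below is a toy stand-in for the optical image model, and all parameter values and uncertainties are illustrative:

```python
# Finite-difference estimate of how each model input perturbs the
# modeled linewidth, combined by root-sum-square. `simulate_linewidth`
# is a toy stand-in for the full optical image model.
def simulate_linewidth(wavelength, na, focus):
    return (100.0 + 40.0 * (wavelength - 0.365)
            - 25.0 * (na - 0.9) + 8.0 * focus ** 2)   # nm, toy model

nominal = {"wavelength": 0.365, "na": 0.9, "focus": 0.0}
u = {"wavelength": 0.001, "na": 0.005, "focus": 0.05}  # input uncertainties

total_var = 0.0
for name, du in u.items():
    hi = dict(nominal); hi[name] += du
    lo = dict(nominal); lo[name] -= du
    dL = (simulate_linewidth(**hi) - simulate_linewidth(**lo)) / 2.0
    total_var += dL ** 2    # first order only: no parameter interactions
print(f"linewidth uncertainty ~ {total_var ** 0.5:.3f} nm")
```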

  20. The Formalization of Cultural Psychology. Reasons and Functions.

    Science.gov (United States)

    Salvatore, Sergio

    2017-03-01

    In this paper I discuss two basic theses about the formalization of cultural psychology. First, I claim that formalization is a relevant, even necessary stage of development of this domain of science. This is so because formalization allows the scientific language to achieve a much needed autonomy from the commonsensical language of the phenomena that this science deals with. Second, I envisage the two main functions that formalization has to perform in the field of cultural psychology: on the one hand, it has to provide formal rules grounding and constraining the deductive construction of the general theory; on the other hand, it has to provide the devices for supporting the interpretation of local phenomena, in terms of the abductive reconstruction of the network of linkages among empirical occurrences comprising the local phenomena.

  1. Formal specifications for safety grade systems

    International Nuclear Information System (INIS)

    Chisholm, G.H.; Smith, B.T.; Wojcik, A.S.

    1992-01-01

    The authors describe the findings of a study into the application of formal methods to the specification of a safety system for an operating nuclear reactor. They developed a formal specification that is used to verify and validate that no unsafe condition will result from action or inaction of the system. For this reason, the specification must facilitate thinking about, talking about, and implementing the system. In fact, the specification must provide a bridge between people (designers, engineers, policy makers) and diverse implementations (hardware, software, sensors, power supplies) at all levels. For a specification to serve as an effective linkage, it must have the following properties: (1) completeness, (2) conciseness, (3) unambiguity, and (4) communicativeness. In this paper they describe the development of a specification that has these properties. This development is based on the use of formal methods, i.e., methods that add mathematical rigor to the development, analysis and operation of computer systems and to applications based thereon (Neumann). They demonstrate that a specification derived from a formal basis facilitates development of the design and its subsequent verification

  2. Formal truncations of connected kernel equations

    International Nuclear Information System (INIS)

    Dixon, R.M.

    1977-01-01

    The Connected Kernel Equations (CKE) of Alt, Grassberger and Sandhas (AGS); Kouri, Levin and Tobocman (KLT); and Bencze, Redish and Sloan (BRS) are compared against reaction theory criteria after formal channel space and/or operator truncations have been introduced. The Channel Coupling Class concept is used to study the structure of these CKE's. The related wave function formalism of Sandhas, of L'Huillier, Redish and Tandy and of Kouri, Krueger and Levin are also presented. New N-body connected kernel equations which are generalizations of the Lovelace three-body equations are derived. A method for systematically constructing fewer body models from the N-body BRS and generalized Lovelace (GL) equations is developed. The formally truncated AGS, BRS, KLT and GL equations are analyzed by employing the criteria of reciprocity and two-cluster unitarity. Reciprocity considerations suggest that formal truncations of BRS, KLT and GL equations can lead to reciprocity-violating results. This study suggests that atomic problems should employ three-cluster connected truncations and that the two-cluster connected truncations should be a useful starting point for nuclear systems

  3. Formalizing Implementation Strategies for First-Class Continuations

    DEFF Research Database (Denmark)

    Danvy, Olivier

    1999-01-01

    We present the first formalization of implementation strategies for first-class continuations. The formalization hinges on abstract machines for continuation-passing style (CPS) programs with a special treatment for the current continuation, accounting for the essence of first-class continuations......-class continuations and that second-class continuations are stackable. A large body of work exists on implementing continuations, but it is predominantly empirical and implementation-oriented. In contrast, our formalization abstracts the essence of first-class continuations and provides a uniform setting...

  5. Identification and communication of uncertainties of phenomenological models in PSA

    International Nuclear Information System (INIS)

    Pulkkinen, U.; Simola, K.

    2001-11-01

    This report aims at presenting a view on the uncertainty analysis of phenomenological models, with an emphasis on the identification and documentation of the various types of uncertainties and assumptions in the modelling of the phenomena. In an uncertainty analysis, it is essential to include and document all unclear issues in order to obtain maximal coverage of unresolved issues. This holds independently of the nature or type of the issues. The classification of uncertainties is needed in the decomposition of the problem, and it helps in the identification of means for uncertainty reduction. Further, enhanced documentation serves to evaluate the applicability of the results to various risk-informed applications. (au)

  6. Teaching Formal Reasoning in a College Biology Course for Preservice Teachers.

    Science.gov (United States)

    Lawson, Anton E.; Snitgen, Donald A.

    1982-01-01

    Assessed the effect of a one-semester college biology course on the development of students' (N=72) ability to reason formally and on interactions among intelligence, cognitive style, and cognitive level. Includes implications for science instruction. (SK)

  7. Estimating uncertainty of inference for validation

    Energy Technology Data Exchange (ETDEWEB)

    Booker, Jane M [Los Alamos National Laboratory; Langenbrunner, James R [Los Alamos National Laboratory; Hemez, Francois M [Los Alamos National Laboratory; Ross, Timothy J [UNM

    2010-09-30

    We present a validation process based upon the concept that validation is an inference-making activity. This has always been true, but the association has not been as important before as it is now. Previously, theory had been confirmed by more data, and predictions were possible based on data. The process today is to infer from theory to code and from code to prediction, making the role of prediction somewhat automatic, and a machine function. Validation is defined as determining the degree to which a model and code is an accurate representation of experimental test data. Embedded in validation is the intention to use the computer code to predict. To predict is to accept the conclusion that an observable final state will manifest; therefore, prediction is an inference whose goodness relies on the validity of the code. Quantifying the uncertainty of a prediction amounts to quantifying the uncertainty of validation, and this involves the characterization of uncertainties inherent in theory/models/codes and the corresponding data. An introduction to inference making and its associated uncertainty is provided as a foundation for the validation problem. A mathematical construction for estimating the uncertainty in the validation inference is then presented, including a possibility distribution constructed to represent the inference uncertainty for validation under uncertainty. The estimation of inference uncertainty for validation is illustrated using data and calculations from Inertial Confinement Fusion (ICF). The ICF measurements of neutron yield and ion temperature were obtained for direct-drive inertial fusion capsules at the Omega laser facility. The glass capsules, containing the fusion gas, were systematically selected with the intent of establishing a reproducible baseline of high-yield 10^13-10^14 neutron output. The deuterium-tritium ratio in these experiments was varied to study its influence upon yield. This paper on validation inference is the

  8. Development of Evaluation Code for MUF Uncertainty

    International Nuclear Information System (INIS)

    Won, Byung Hee; Han, Bo Young; Shin, Hee Sung; Ahn, Seong-Kyu; Park, Geun-Il; Park, Se Hwan

    2015-01-01

    Material Unaccounted For (MUF) is the material balance evaluated from the measured nuclear material in a Material Balance Area (MBA). Assuming perfect measurements and no diversion from a facility, one can expect a zero MUF. However, a non-zero MUF always occurs because of measurement uncertainty, even when the facility is under normal operating conditions. Furthermore, there are many measurements using different equipment at various Key Measurement Points (KMPs), and the MUF uncertainty is affected by the errors of those measurements. Evaluating the MUF uncertainty is essential for developing a safeguards system, including the nuclear measurement system, for pyroprocessing, which is being developed at the Korea Atomic Energy Research Institute (KAERI) to reduce the radioactive waste from spent fuel. An evaluation code for analyzing MUF uncertainty has been developed, and it was verified using a sample problem from the IAEA reference. The MUF uncertainty can be calculated simply and quickly with this evaluation code, which is built around a graphical user interface for ease of use. It is also expected that the code will make sensitivity analyses of the MUF uncertainty for various safeguards systems easier and more systematic. It is suitable for users who want to evaluate a conventional safeguards system as well as to develop a new system for facilities under development

  9. Development of Evaluation Code for MUF Uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Won, Byung Hee; Han, Bo Young; Shin, Hee Sung; Ahn, Seong-Kyu; Park, Geun-Il; Park, Se Hwan [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-05-15

    Material Unaccounted For (MUF) is the material balance evaluated from the measured nuclear material in a Material Balance Area (MBA). Assuming perfect measurements and no diversion from a facility, one can expect a zero MUF. However, a non-zero MUF always occurs because of measurement uncertainty, even when the facility is under normal operating conditions. Furthermore, there are many measurements using different equipment at various Key Measurement Points (KMPs), and the MUF uncertainty is affected by the errors of those measurements. Evaluating the MUF uncertainty is essential for developing a safeguards system, including the nuclear measurement system, for pyroprocessing, which is being developed at the Korea Atomic Energy Research Institute (KAERI) to reduce the radioactive waste from spent fuel. An evaluation code for analyzing MUF uncertainty has been developed, and it was verified using a sample problem from the IAEA reference. The MUF uncertainty can be calculated simply and quickly with this evaluation code, which is built around a graphical user interface for ease of use. It is also expected that the code will make sensitivity analyses of the MUF uncertainty for various safeguards systems easier and more systematic. It is suitable for users who want to evaluate a conventional safeguards system as well as to develop a new system for facilities under development.
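
    The core arithmetic behind a MUF evaluation is short: MUF = beginning inventory + receipts - ending inventory - shipments, and with independent measurement errors the variances of the terms simply add. A minimal sketch with hypothetical values (not the IAEA sample problem):

```python
import math

# MUF = beginning inventory + receipts - ending inventory - shipments.
# With independent measurement errors, the variances add.
terms = {                      # (value kg, standard uncertainty kg)
    "beginning inventory": (120.0, 0.40),
    "receipts":            ( 35.0, 0.25),
    "ending inventory":    (118.5, 0.45),
    "shipments":           ( 36.2, 0.20),
}
bi, r, ei, s = (terms[k][0] for k in terms)
muf = bi + r - ei - s
sigma = math.sqrt(sum(u ** 2 for _, u in terms.values()))
print(f"MUF = {muf:.2f} kg, sigma_MUF = {sigma:.2f} kg")
print(f"|MUF| < 3*sigma: {abs(muf) < 3 * sigma}")   # simple alarm test
```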

  10. Formal education of patients about to undergo laparoscopic cholecystectomy.

    Science.gov (United States)

    Gurusamy, Kurinchi Selvan; Vaughan, Jessica; Davidson, Brian R

    2014-02-28

    Generally, before being operated on, patients are given informal information by the healthcare providers involved in their care (doctors, nurses, ward clerks, or healthcare assistants). This information can also be provided formally in different formats, including written information, formal lectures, or audio-visual recorded information. To compare the benefits and harms of formal preoperative patient education for patients undergoing laparoscopic cholecystectomy. We searched the Cochrane Central Register of Controlled Trials (CENTRAL) (Issue 2, 2013), MEDLINE, EMBASE, and Science Citation Index Expanded to March 2013. We included only randomised clinical trials irrespective of language and publication status. Two review authors independently extracted the data. We planned to calculate the risk ratio with 95% confidence intervals (CI) for dichotomous outcomes, and mean difference (MD) or standardised mean difference (SMD) with 95% CI for continuous outcomes, based on intention-to-treat analyses when data were available. A total of 431 participants undergoing elective laparoscopic cholecystectomy were randomised to formal patient education (215 participants) versus standard care (216 participants) in four trials. The patient education included verbal education, a multimedia DVD programme, a computer-based multimedia programme, and a PowerPoint presentation in the four trials. All the trials were at high risk of bias. One trial including 212 patients reported mortality. There was no mortality in either group in this trial. None of the trials reported surgery-related morbidity, quality of life, the proportion of patients discharged as day-procedure laparoscopic cholecystectomy, the length of hospital stay, return to work, or the number of unplanned visits to the doctor. There were insufficient details to calculate the mean difference and 95% CI for the difference in pain scores at 9 to 24 hours (1 trial; 93 patients); and we did not identify clear evidence of

  11. New Technologies and Learning Environments: A Perspective from Formal and Non-Formal Education in Baja California, Mexico

    Science.gov (United States)

    Zamora, Julieta Lopez; Reynaga, Francisco Javier Arriaga

    2010-01-01

    This paper presents the results of two research studies: the first approaches non-formal education and the second addresses formal education. In both studies in-depth interview techniques were used. There were some points of convergence between them on aspects such as the implementation of learning environments and the integration of ICT. The interview…

  12. Formal verification of industrial control systems

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    Verification of critical software is a high priority but a challenging task for industrial control systems. For many kinds of problems, testing is not an efficient method. Formal methods, such as model checking, appear to be an appropriate complementary method. However, it is not yet common to use model checking in industry, as this method typically needs formal-methods expertise and huge computing power. In the EN-ICE-PLC section, we are working on a [methodology][1] and a tool ([PLCverif][2]) to overcome these challenges and to integrate formal verification into the development process of our PLC-based control systems. [1]: http://cern.ch/project-plc-formalmethods [2]: http://cern.ch/plcverif

  13. Uncertainty as Knowledge: Constraints on Policy Choices Provided by Analysis of Uncertainty

    Science.gov (United States)

    Lewandowsky, S.; Risbey, J.; Smithson, M.; Newell, B. R.

    2012-12-01

    Uncertainty forms an integral part of climate science, and it is often cited in connection with arguments against mitigative action. We argue that an analysis of uncertainty must consider existing knowledge as well as uncertainty, and the two must be evaluated with respect to the outcomes and risks associated with possible policy options. Although risk judgments are inherently subjective, an analysis of the role of uncertainty within the climate system yields two constraints that are robust to a broad range of assumptions. Those constraints are that (a) greater uncertainty about the climate system is necessarily associated with greater expected damages from warming, and (b) greater uncertainty translates into a greater risk of the failure of mitigation efforts. These ordinal constraints are unaffected by subjective or cultural risk-perception factors, they are independent of the discount rate, and they are independent of the magnitude of the estimate for climate sensitivity. The constraints mean that any appeal to uncertainty must imply a stronger, rather than weaker, need to cut greenhouse gas emissions than in the absence of uncertainty.
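
    Constraint (a) is essentially Jensen's inequality: for any convex damage function, widening the uncertainty about warming raises expected damages even when the best estimate is unchanged. A minimal numeric check, using a quadratic damage function as an illustrative convex choice:

```python
import numpy as np

rng = np.random.default_rng(4)

def damages(warming):
    return warming ** 2    # any convex damage function will do

mean_warming = 3.0
for sigma in (0.5, 1.0, 2.0):   # widening uncertainty, same best guess
    w = rng.normal(mean_warming, sigma, 100_000)
    print(f"sigma={sigma}: expected damages = {damages(w).mean():.2f}")
# Expected damages grow with sigma even though the mean stays at 3.0,
# which is constraint (a): more uncertainty, higher expected damages.
```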

  14. Perceptions of the value of traditional ecological knowledge to formal school curricula: opportunities and challenges from Malekula Island, Vanuatu

    Directory of Open Access Journals (Sweden)

    McCarter Joe

    2011-11-01

    Full Text Available Background The integration of traditional ecological knowledge (TEK) into formal school curricula may be a key tool for the revitalisation of biocultural diversity, and has the potential to improve the delivery of educational objectives. This paper explores perceptions of the value of TEK to formal education curricula on Malekula Island, Vanuatu. We conducted 49 interviews with key stakeholders (local TEK experts, educators, and officials) regarding the use of the formal school system to transmit, maintain, and revitalise TEK. Interviews also gathered information on the areas where TEK might add value to school curricula and on the perceived barriers to maintaining and revitalising TEK via formal education programs. Results Participants reported that TEK had eroded on Malekula, and identified the formal school system as a principal driver. Most interviewees believed that if an appropriate format could be developed, TEK could be included in the formal education system. Such an approach has potential to maintain customary knowledge and practice in the focus communities. Participants identified several specific domains of TEK for inclusion in school curricula, including ethnomedical knowledge, agricultural knowledge and practice, and the reinforcement of respect for traditional authority and values. However, interviewees also noted a number of practical and epistemological barriers to teaching TEK in school. These included the cultural diversity of Malekula, tensions between public and private forms of knowledge, and multiple values of TEK within the community. Conclusions TEK has potential to add value to formal education systems in Vanuatu by contextualising the content and process of curricular delivery, and by facilitating character development and self-awareness in students. These benefits are congruent with UNESCO-mandated goals for curricular reform and provide a strong argument for the inclusion of TEK in formal school systems. Such

  15. Towards a Formal Model of Social Data

    DEFF Research Database (Denmark)

    Mukkamala, Raghava Rao; Vatrapu, Ravi; Hussain, Abid

    , transform, analyse, and report social data from social media platforms such as Facebook and twitter. Formal methods, models and tools for social data are largely limited to graph theoretical approaches informing conceptual developments in relational sociology and methodological developments in social...... network analysis. As far as we know, there are no integrated modeling approaches to social data across the conceptual, formal and software realms. Social media analytics can be undertaken in two main ways - "Social Graph Analytics" and "Social Text Analytics" (Vatrapu, in press/2013). Social graph......, we exemplify the semantics of the formal model with real-world social data examples. Third, we briefly present and discuss the Social Data Analytics Tool (SODATO) that realizes the conceptual model in software and provisions social data for computational social science analysis based on the formal...

  16. Asymmetrical peer interaction and formal operational development: Dialogue dimensions analysis

    Directory of Open Access Journals (Sweden)

    Stepanović-Ilić Ivana

    2015-01-01

    The main goal of the study is to define dialogue dimensions in order to describe the interaction within peer dyads and potentially connect them with the development of formal operations in the less competent participants. Its significance lies in the rarity of investigations of this subject in the context of formal operations development and in the practical implications regarding peer involvement in the education process. The sample included 316 students aged 12 and 14. The research had an experimental design: pre-test, intervention and post-test. In the pre-test and post-test phases students solved the formal operations test BLOT. According to the pre-test results, 47 dyads were formed in which less and more competent students jointly solved tasks from BLOT. Their dialogues were coded along 14 dimensions operationalized for this purpose. Correlations between the dialogue dimensions indicate clearly distinguished positive and negative interaction patterns. There are no connections between dialogue dimensions and the progress of less competent adolescents on BLOT in the entire sample, but several are found in the subsamples. The exchange of arguments seems to be the dialogue feature most encouraging for formal operations development, particularly in older students. This confirms relevant research data and the expectations about peers' constructive role in fostering cognitive development. [Projekat Ministarstva nauke Republike Srbije, br. 179018: Identification, measurement and development of cognitive and emotional competences important for a society oriented towards European integrations]

  17. "Passing It On": Beyond Formal or Informal Pedagogies

    Science.gov (United States)

    Cain, Tim

    2013-01-01

    Informal pedagogies are a subject of debate in music education, and there is some evidence of teachers abandoning formal pedagogies in favour of informal ones. This article presents a case of one teacher's formal pedagogy and theorises it by comparing it with a case of informal pedagogy. The comparison reveals affordances of formal pedagogies…

  18. An approach of requirements tracing in formal refinement

    DEFF Research Database (Denmark)

    Jastram, Michael; Hallerstede, Stefan; Leuschel, Michael

    2010-01-01

    Formal modeling of computing systems yields models that are intended to be correct with respect to the requirements that have been formalized. The complexity of typical computing systems can be addressed by formal refinement introducing all the necessary details piecemeal. We report on preliminary … changes, making use of corresponding techniques already built into the Event-B method.

  19. Uncertainty in social dilemmas

    OpenAIRE

    Kwaadsteniet, Erik Willem de

    2007-01-01

    This dissertation focuses on social dilemmas, and more specifically, on environmental uncertainty in these dilemmas. Real-life social dilemma situations are often characterized by uncertainty. For example, fishermen mostly do not know the exact size of the fish population (i.e., resource size uncertainty). Several researchers have therefore asked themselves the question as to how such uncertainty influences people’s choice behavior. These researchers have repeatedly concluded that uncertainty...

  20. Uncertainty theory

    CERN Document Server

    Liu, Baoding

    2015-01-01

    When no samples are available to estimate a probability distribution, we have to invite some domain experts to evaluate the belief degree that each event will happen. Perhaps some people think that the belief degree should be modeled by subjective probability or fuzzy set theory. However, it is usually inappropriate because both of them may lead to counterintuitive results in this case. In order to rationally deal with belief degrees, uncertainty theory was founded in 2007 and subsequently studied by many researchers. Nowadays, uncertainty theory has become a branch of axiomatic mathematics for modeling belief degrees. This is an introductory textbook on uncertainty theory, uncertain programming, uncertain statistics, uncertain risk analysis, uncertain reliability analysis, uncertain set, uncertain logic, uncertain inference, uncertain process, uncertain calculus, and uncertain differential equation. This textbook also shows applications of uncertainty theory to scheduling, logistics, networks, data mining, c...
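
    For orientation, the uncertain measure M at the core of this theory is usually introduced through four axioms, sketched here in our own notation (a paraphrase, not the book's exact statement):

```latex
% Sketch of the axioms of an uncertain measure \mathcal{M} (paraphrased)
\begin{align*}
&\text{Normality:}     && \mathcal{M}\{\Gamma\} = 1
  \quad\text{for the universal set } \Gamma,\\
&\text{Duality:}       && \mathcal{M}\{\Lambda\} + \mathcal{M}\{\Lambda^{c}\} = 1,\\
&\text{Subadditivity:} && \mathcal{M}\Bigl\{\bigcup_{i=1}^{\infty}\Lambda_{i}\Bigr\}
  \le \sum_{i=1}^{\infty}\mathcal{M}\{\Lambda_{i}\},\\
&\text{Product:}       && \mathcal{M}\Bigl\{\prod_{k=1}^{\infty}\Lambda_{k}\Bigr\}
  = \bigwedge_{k=1}^{\infty}\mathcal{M}_{k}\{\Lambda_{k}\}
  \quad\text{for events } \Lambda_{k} \text{ in the respective spaces.}
\end{align*}
```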

  1. Uncertainties in Safety Analysis. A literature review

    International Nuclear Information System (INIS)

    Ekberg, C.

    1995-05-01

    The purpose of the presented work has been to give a short summary of the origins of many uncertainties arising in the design and performance assessment of a repository for spent nuclear fuel. Some different methods to treat these uncertainties are also included. The methods and conclusions are in many cases general, in the sense that they are applicable to many other disciplines where simulations are used. As a conclusion it may be noted that uncertainties of different origin have been discussed and debated, but one large group, e.g. computer simulations, for which methods for a more explicit investigation exist, has not been investigated in a satisfactory way. 50 refs

  2. Uncertainties in Safety Analysis. A literature review

    Energy Technology Data Exchange (ETDEWEB)

    Ekberg, C [Chalmers Univ. of Technology, Goeteborg (Sweden). Dept. of Nuclear Chemistry

    1995-05-01

    The purpose of the presented work has been to give a short summary of the origins of many uncertainties arising in the design and performance assessment of a repository for spent nuclear fuel. Some different methods to treat these uncertainties are also included. The methods and conclusions are in many cases general, in the sense that they are applicable to many other disciplines where simulations are used. As a conclusion it may be noted that uncertainties of different origin have been discussed and debated, but one large group, e.g. computer simulations, for which methods for a more explicit investigation exist, has not been investigated in a satisfactory way. 50 refs.

  3. Uncertainty during breast diagnostic evaluation: state of the science.

    Science.gov (United States)

    Montgomery, Mariann

    2010-01-01

    To present the state of the science on uncertainty in relation to the experiences of women undergoing diagnostic evaluation for suspected breast cancer. Published articles from Medline, CINAHL, PubMed, and PsycINFO from 1983-2008 using the following key words: breast biopsy, mammography, uncertainty, reframing, inner strength, and disruption. Fifty research studies were examined, all reporting the presence of anxiety persisting throughout the diagnostic evaluation until certitude is achieved through the establishment of a definitive diagnosis. Indirect determinants of uncertainty for women undergoing breast diagnostic evaluation include measures of anxiety, depression, social support, emotional responses, defense mechanisms, and the psychological impact of events. Understanding and influencing the uncertainty experience have been suggested to be key in relieving psychosocial distress and positively influencing future screening behaviors. Several studies examine correlational relationships among anxiety, selection of coping methods, and demographic factors that influence uncertainty. A gap exists in the literature with regard to the relationship of inner strength and uncertainty. Nurses can be invaluable in assisting women in coping with the uncertainty experience by providing positive communication and support. Nursing interventions should be designed and tested for their effects on uncertainty experienced by women undergoing a breast diagnostic evaluation.

  4. Y-formalism and curved β-γ systems

    Energy Technology Data Exchange (ETDEWEB)

    Grassi, Pietro Antonio [DISTA, Universita del Piemonte Orientale, via Bellini 25/g, 15100 Alessandria (Italy); INFN - Sezione di Torino (Italy)], E-mail: antonio.pietro.grassi@cern.ch; Oda, Ichiro [Department of Physics, Faculty of Science, University of the Ryukyus, Nishihara, Okinawa 903-0213 (Japan); Tonin, Mario [Dipartimento di Fisica, Universita degli Studi di Padova, INFN, Sezionedi Padova, Via F. Marzolo 8, 35131 Padova (Italy)

    2009-01-01

    We adopt the Y-formalism to study β-γ systems on hypersurfaces. We compute the operator product expansions of gauge-invariant currents and we discuss some applications of the Y-formalism to models on Calabi-Yau spaces.

  5. Y-formalism and curved β-γ systems

    International Nuclear Information System (INIS)

    Grassi, Pietro Antonio; Oda, Ichiro; Tonin, Mario

    2009-01-01

    We adopt the Y-formalism to study β-γ systems on hypersurfaces. We compute the operator product expansions of gauge-invariant currents and we discuss some applications of the Y-formalism to models on Calabi-Yau spaces

  6. Uncertainties about climate

    International Nuclear Information System (INIS)

    Laval, Katia; Laval, Guy

    2013-01-01

    Like meteorology, climatology is not an exact science: climate change forecasts necessarily include a share of uncertainty. It is precisely this uncertainty which is brandished and exploited by the opponents of the global warming theory to call into question the estimations of its future consequences. Is it legitimate to predict the future using past climate data (well documented up to 100,000 years BP) or the climates of other planets, taking into account the imprecision of the measurements and the intrinsic complexity of the Earth's machinery? How is it possible to model such a huge and interwoven system, for which any exact description has become impossible? Why do water and precipitation play such an important role in local and global forecasts, and how should they be treated? This book, written by two physicists, answers these delicate questions in simple terms, in order to give anyone the possibility to form his or her own opinion about global warming and the need to act rapidly

  7. The formal operations: Piaget’s concept, researches and main critics

    Directory of Open Access Journals (Sweden)

    Stepanović Ivana Ž.

    2004-01-01

    This paper deals with Piaget's concept of formal operations, research on formal operations, and critiques related to the concept. The first part of the work is dedicated to the formal operations concept. The main characteristics of formal operational thought and the structure of formal operations, as well as the structure's logical model, are presented in that part of the work. The second part is a review of formal operations research and is divided in three parts: (1) problems of the research, (2) characteristics of the applied methodology, and (3) authors' approaches as a specific research context. In the last part of the work the main critiques of the formal operations concept are presented and discussed.

  8. Calculations of the transport properties within the PAW formalism

    Energy Technology Data Exchange (ETDEWEB)

    Mazevet, S.; Torrent, M.; Recoules, V.; Jollet, F. [CEA Bruyeres-le-Chatel, DIF, 91 (France)

    2010-07-01

    We implemented the calculation of the transport properties within the PAW formalism in the ABINIT code. This feature allows the calculation of the electrical and optical properties, including the XANES spectrum, as well as the electronic contribution to the thermal conductivity. We present here the details of the implementation and results obtained for warm dense aluminum plasma. (authors)
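
    The electrical and optical properties referred to here are typically obtained from a Kubo-Greenwood expression; schematically (prefactor and conventions vary between implementations):

```latex
\sigma_{1}(\omega) \;=\; \frac{2\pi e^{2}\hbar^{2}}{3\,m_{e}^{2}\,\omega\,\Omega}
\sum_{i,j}\bigl(f_{i}-f_{j}\bigr)\,
\bigl|\langle\psi_{j}|\nabla|\psi_{i}\rangle\bigr|^{2}\,
\delta\bigl(E_{j}-E_{i}-\hbar\omega\bigr)
```

    where the f_i are occupation numbers, Ω is the cell volume, and the matrix elements are evaluated with the PAW wavefunctions.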

  9. Moving Beyond 2% Uncertainty: A New Framework for Quantifying Lidar Uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Newman, Jennifer F.; Clifton, Andrew

    2017-03-08

    Remote sensing of wind using lidar is revolutionizing wind energy. However, current generations of wind lidar are ascribed a climatic value of uncertainty, which is based on a poor description of lidar sensitivity to external conditions. In this presentation, we show how important it is to consider the complete lidar measurement process in order to define the measurement uncertainty, which in turn offers the ability to define a much more granular and dynamic measurement uncertainty. This approach is a progression from the 'white box' lidar uncertainty method.

  10. Formal Analysis of Graphical Security Models

    DEFF Research Database (Denmark)

    Aslanyan, Zaruhi

    … software components and human actors interacting with each other to form so-called socio-technical systems. The importance of socio-technical systems to modern societies requires verifying their security properties formally, while their inherent complexity makes manual analyses impracticable. Graphical models for security offer an unrivalled opportunity to describe socio-technical systems, for they allow one to represent different aspects like human behaviour, computation and physical phenomena in an abstract yet uniform manner. Moreover, these models can be assigned a formal semantics, thereby allowing … formal verification of their properties. Finally, their appealing graphical notations enable one to communicate security concerns in an understandable way also to non-experts, often in charge of the decision making. This dissertation argues that automated techniques can be developed on graphical security …

  11. Uncertainty assessment of equations of state with application to an organic Rankine cycle

    DEFF Research Database (Denmark)

    Frutiger, Jerome; Bell, Ian; O’Connell, John P.

    2017-01-01

    Evaluations of equations of state (EoS) should include uncertainty. This study presents a generic method to analyse an EoS through a detailed uncertainty analysis of its mathematical form and of the data used to obtain the EoS parameter values. The method is illustrated by a comparison of the Soave–Redlich–Kwong (SRK) cubic EoS with the perturbed-chain statistical associating fluid theory (PC-SAFT) EoS for an organic Rankine cycle (ORC) recovering heat to power from the exhaust gas of a marine diesel engine, using cyclopentane as the working fluid. Uncertainties of the EoS input parameters, including … the accounting of uncertainty are compared with respect to different objectives. (C) 2015 Elsevier Ltd. All rights reserved.
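
    A minimal sketch of the kind of input-uncertainty propagation described here, with a hypothetical stand-in for the EoS-plus-cycle model and illustrative parameter uncertainties (the study itself uses SRK and PC-SAFT inside a full ORC model):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical stand-in for an EoS-based cycle model: maps EoS input
# parameters to an output of interest (e.g., net power of the ORC).
def cycle_output(tc, pc, omega):
    return 100.0 * (tc / 511.7) * (pc / 45.1e5) ** 0.3 * (1.0 - 0.2 * omega)

# Assumed parameter means and standard uncertainties for cyclopentane
# (illustrative values only).
params = {"tc": (511.7, 1.0), "pc": (45.1e5, 0.5e5), "omega": (0.20, 0.01)}

n = 50_000
samples = {k: rng.normal(mu, sd, n) for k, (mu, sd) in params.items()}
out = cycle_output(samples["tc"], samples["pc"], samples["omega"])

low, high = np.percentile(out, [2.5, 97.5])
print(f"output = {out.mean():.2f}, 95% interval = [{low:.2f}, {high:.2f}]")
```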

  12. Potentiometric determination of the 'formal' hydrolysis ratio of aluminium species in aqueous solutions

    International Nuclear Information System (INIS)

    Fournier, Agathe C.; Shafran, Kirill L.; Perry, Carole C.

    2008-01-01

    The 'formal' hydrolysis ratio (h = C(OH⁻)_added / C(Al)_total) of hydrolysed aluminium ions is an important parameter required for the exhaustive and quantitative speciation-fractionation of aluminium in aqueous solutions. This paper describes a potentiometric method for determination of the formal hydrolysis ratio based on an automated alkaline titration procedure. The method uses the point of precipitation of aluminium hydroxide as a reference (h = 3.0) in order to calculate the initial formal hydrolysis ratio of hydrolysed aluminium-ion solutions. Several solutions of pure hydrolytic species, including aluminium monomers (AlCl₃), the Al₁₃ polynuclear cluster ([Al₁₃O₄(OH)₂₄(H₂O)₁₂]⁷⁺), the Al₃₀ polynuclear cluster ([Al₃₀O₈(OH)₅₆(H₂O)₂₆]¹⁸⁺) and a suspension of nanoparticulate aluminium hydroxide, have been used as 'reference standards' to validate the proposed potentiometric method. Other important variables in the potentiometric determination of the hydrolysis ratio have also been optimised, including the concentration of aluminium and the type and strength of alkali (Trizma base, NH₃, NaHCO₃, Na₂CO₃ and KOH). The results of the potentiometric analysis have been cross-verified by quantitative ²⁷Al solution nuclear magnetic resonance (²⁷Al NMR) measurements. The 'formal' hydrolysis ratio of a commercial basic aluminium chloride has been measured as an example of a practical application of the developed technique
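
    The arithmetic behind the method reduces to one line: titrate to the precipitation point (taken as h = 3.0) and subtract the base consumed per mole of aluminium from 3.0. A sketch with hypothetical variable names and example numbers:

```python
# Minimal sketch of the arithmetic described above (variable names are
# hypothetical). The titration endpoint, where Al(OH)3 precipitates, is
# taken as h = 3.0; the base consumed to reach it then fixes the initial
# formal hydrolysis ratio of the sample.
def initial_hydrolysis_ratio(c_base, v_base_to_precip, c_al, v_sample):
    """h0 = 3.0 - (moles of OH- added to reach precipitation) / (moles of Al)."""
    moles_oh = c_base * v_base_to_precip   # mol of OH- titrated in
    moles_al = c_al * v_sample             # mol of Al in the aliquot
    return 3.0 - moles_oh / moles_al

# Example: 0.25 mol/L Al solution, 20 mL aliquot, titrated with 0.5 mol/L
# KOH; precipitation observed after 6.0 mL of titrant.
h0 = initial_hydrolysis_ratio(0.5, 6.0e-3, 0.25, 20.0e-3)
print(f"initial formal hydrolysis ratio h = {h0:.2f}")  # -> 2.40
```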

  13. Formal verification of Simulink/Stateflow diagrams a deductive approach

    CERN Document Server

    Zhan, Naijun; Zhao, Hengjun

    2017-01-01

    This book presents a state-of-the-art technique for formal verification of continuous-time Simulink/Stateflow diagrams, featuring an expressive hybrid system modelling language, a powerful specification logic and deduction-based verification approach, and some impressive, realistic case studies. Readers will learn the HCSP/HHL-based deductive method and the use of corresponding tools for formal verification of Simulink/Stateflow diagrams. They will also gain some basic ideas about fundamental elements of formal methods such as formal syntax and semantics, and especially the common techniques applied in formal modelling and verification of hybrid systems. By investigating the successful case studies, readers will realize how to apply the pure theory and techniques to real applications, and hopefully will be inspired to start to use the proposed approach, or even develop their own formal methods in their future work.

  14. Adolescent thinking à la Piaget: The formal stage.

    Science.gov (United States)

    Dulit, E

    1972-12-01

    Two of the formal-stage experiments of Piaget and Inhelder, selected largely for their closeness to the concepts defining the stage, were replicated with groups of average and gifted adolescents. This report describes the relevant Piagetian concepts (formal stage, concrete stage) in context, gives the methods and findings of this study, and concludes with a section discussing implications and making some reformulations which generally support but significantly qualify some of the central themes of the Piaget-Inhelder work. Fully developed formal-stage thinking emerges as far from commonplace among normal or average adolescents (by marked contrast with the impression created by the Piaget-Inhelder text, which chooses to report no middle or older adolescents who function at less than fully formal levels). In this respect, the formal stage differs appreciably from the earlier Piagetian stages, and early adolescence emerges as the age for which a "single path" model of cognitive development becomes seriously inadequate and a more complex model becomes essential. Formal-stage thinking seems best conceptualized, like most other aspects of psychological maturity, as a potentiality only partially attained by most and fully attained only by some.

  15. Policy Uncertainty and the US Ethanol Industry

    Directory of Open Access Journals (Sweden)

    Jason P. H. Jones

    2017-11-01

    The Renewable Fuel Standard (RFS2), as implemented, has introduced uncertainty for US ethanol producers and the supporting commodity markets. First, the fixed mandate for what is mainly cornstarch-based ethanol has increased feedstock price volatility and exerts a general effect across the agricultural sector. Second, the large discrepancy between the original Energy Independence and Security Act (EISA) intentions and the actual RFS2 implementation for some fuel classes has increased the uncertainty facing investors in biofuel production, distribution, and consumption. Here we discuss and analyze the sources of uncertainty and evaluate the effect of potential RFS2 adjustments as they influence these uncertainties. This includes the use of a flexible, production-dependent mandate on cornstarch ethanol. We find that a flexible mandate on cornstarch ethanol, relaxed during drought, could significantly reduce commodity price spikes and alleviate the decline of livestock production in cases of feedstock production shortfalls, but it would increase the risk for ethanol investors.

  16. State or nature? Endogenous formal versus informal sanctions in the voluntary provision of public goods

    DEFF Research Database (Denmark)

    Kamei, Kenju; Putterman, Louis; Tyran, Jean-Robert Karl

    2015-01-01

    We investigate the endogenous formation of sanctioning institutions supposed to improve efficiency in the voluntary provision of public goods. Our paper parallels Markussen et al. (Rev Econ Stud 81:301–324, 2014) in that our experimental subjects vote over formal versus informal sanctions......, but it goes beyond that paper by endogenizing the formal sanction scheme. We find that self-determined formal sanctions schemes are popular and efficient when they carry no up-front cost, but as in Markussen et al. informal sanctions are more popular and efficient than formal sanctions when adopting...... the latter entails such a cost. Practice improves the performance of sanction schemes: they become more targeted and deterrent with learning. Voters’ characteristics, including their tendency to engage in perverse informal sanctioning, help to predict individual voting....

  17. Effects of utility demand-side management programs on uncertainty

    International Nuclear Information System (INIS)

    Hirst, E.

    1994-01-01

    Electric utilities face a variety of uncertainties that complicate their long-term resource planning. These uncertainties include future economic and load growth, fuel prices, environmental and economic regulations, performance of existing power plants, cost and availability of purchased power, and the costs and performance of new demand and supply resources. As utilities increasingly turn to demand-side management (DSM) programs to provide resources, it becomes more important to analyze the interactions between these programs and the uncertainties facing utilities. This paper uses a dynamic planning model to quantify the uncertainty effects of supply-only vs DSM + supply resource portfolios. The analysis considers four sets of uncertainties: economic growth, fuel prices, the costs to build new power plants, and the costs to operate DSM programs. The two types of portfolios are tested against these four sets of uncertainties for the period 1990 to 2010. Sensitivity, scenario, and worst-case analysis methods are used. The sensitivity analyses show that the DSM + supply resource portfolio is less sensitive to unanticipated changes in economic growth, fuel prices, and power-plant construction costs than is the supply-only portfolio. The supply-only resource mix is better only with respect to uncertainties about the costs of DSM programs. The base-case analysis shows that including DSM programs in the utility's resource portfolio reduces the net present value of revenue requirements (NPV-RR) by 490 million dollars. The scenario-analysis results show an additional 30 million dollars (6%) in benefits associated with reduction in these uncertainties. In the worst-case analysis, the DSM + supply portfolio again reduces the cost penalty associated with guessing wrong in both cases: when the utility plans for high needs and learns it has low needs, and vice versa. 20 refs
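
    The worst-case comparison described here can be phrased as a minimax-regret calculation. A toy sketch with hypothetical costs (only the 490 M$ base-case difference echoes the abstract):

```python
# Illustrative scenario / worst-case comparison in the spirit of the study:
# compare a supply-only portfolio with a DSM + supply portfolio. Costs are
# hypothetical NPV revenue requirements in M$.
costs = {
    # scenario: (supply_only, dsm_plus_supply)
    "base case":        (10_000, 9_510),
    "high fuel prices": (11_200, 10_450),
    "low load growth":  (9_400,  9_050),
    "high DSM costs":   (10_000, 10_150),
}

best = {s: min(pair) for s, pair in costs.items()}
for name, idx in (("supply-only", 0), ("DSM + supply", 1)):
    worst_regret = max(costs[s][idx] - best[s] for s in costs)
    print(f"{name:13s}: worst-case regret = {worst_regret} M$")
# The portfolio with the smaller worst-case regret is the more robust hedge
# against guessing wrong about future conditions.
```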

  18. Uncertainty Quantification in High Throughput Screening ...

    Science.gov (United States)

    Using uncertainty quantification, we aim to improve the quality of modeling data from high throughput screening assays for use in risk assessment. ToxCast is a large-scale screening program that analyzes thousands of chemicals using over 800 assays representing hundreds of biochemical and cellular processes, including endocrine disruption, cytotoxicity, and zebrafish development. Over 2.6 million concentration response curves are fit to models to extract parameters related to potency and efficacy. Models built on ToxCast results are being used to rank and prioritize the toxicological risk of tested chemicals and to predict the toxicity of tens of thousands of chemicals not yet tested in vivo. However, the data size also presents challenges. When fitting the data, the choice of models, model selection strategy, and hit call criteria must reflect the need for computational efficiency and robustness, requiring hard and somewhat arbitrary cutoffs. When coupled with unavoidable noise in the experimental concentration response data, these hard cutoffs cause uncertainty in model parameters and the hit call itself. The uncertainty will then propagate through all of the models built on the data. Left unquantified, this uncertainty makes it difficult to fully interpret the data for risk assessment. We used bootstrap resampling methods to quantify the uncertainty in fitting models to the concentration response data. Bootstrap resampling determines confidence intervals for …
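
    A minimal sketch of residual-bootstrap uncertainty quantification for one concentration-response curve (a simple Hill model with unit slope; the synthetic data and details are illustrative, not ToxCast's actual pipeline):

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(7)

def hill(conc, top, ac50):
    """Hill model with unit slope, a common concentration-response form."""
    return top * conc / (ac50 + conc)

# Synthetic concentration-response data (log-spaced doses, noisy responses).
conc = np.logspace(-2, 2, 9)
resp = hill(conc, 80.0, 1.5) + rng.normal(0.0, 6.0, conc.size)

# Fit once, then bootstrap the residuals to get parameter uncertainty.
popt, _ = curve_fit(hill, conc, resp, p0=[80.0, 1.0])
resid = resp - hill(conc, *popt)

ac50_samples = []
for _ in range(1000):
    boot = hill(conc, *popt) + rng.choice(resid, resid.size, replace=True)
    try:
        p, _ = curve_fit(hill, conc, boot, p0=popt, maxfev=2000)
        ac50_samples.append(p[1])
    except RuntimeError:
        continue  # an occasional resample fails to converge

low, high = np.percentile(ac50_samples, [2.5, 97.5])
print(f"AC50 = {popt[1]:.2f}, bootstrap 95% CI = [{low:.2f}, {high:.2f}]")
```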

  19. Formal connections in deformation quantization

    DEFF Research Database (Denmark)

    Masulli, Paolo

    The field of this thesis is deformation quantization, and we consider mainly symplectic manifolds equipped with a star product. After reviewing basics in complex geometry, we introduce quantization, focusing on geometric quantization and deformation quantization. The latter is defined as a star...... characteristic class, and that formal connections form an affine space over the derivations of the star products. Moreover, if the parameter space for the family of star products is contractible, we obtain that any two flat formal connections are gauge equivalent via a self-equivalence of the family of star...

  20. What Sensing Tells Us: Towards a Formal Theory of Testing for Dynamical Systems

    Science.gov (United States)

    McIlraith, Sheila; Scherl, Richard

    2005-01-01

    Just as actions can have indirect effects on the state of the world, so too can sensing actions have indirect effects on an agent's state of knowledge. In this paper, we investigate "what sensing actions tell us", i.e., what an agent comes to know indirectly from the outcome of a sensing action, given knowledge of its actions and state constraints that hold in the world. To this end, we propose a formalization of the notion of testing within a dialect of the situation calculus that includes knowledge and sensing actions. Realizing this formalization requires addressing the ramification problem for sensing actions. We formalize simple tests as sensing actions. Complex tests are expressed in the logic programming language Golog. We examine what it means to perform a test, and how the outcome of a test affects an agent's state of knowledge. Finally, we propose automated reasoning techniques for test generation and complex-test verification, under certain restrictions. The work presented in this paper is relevant to a number of application domains including diagnostic problem solving, natural language understanding, plan recognition, and active vision.
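
    The core idea, that a sensing action plus state constraints yields indirect knowledge, can be illustrated with a toy possible-worlds filter (the domain and the constraint below are invented for illustration; the paper works in the situation calculus and Golog):

```python
# Toy illustration of "what sensing tells us": filtering possible worlds
# with state constraints so that a sensing action also yields indirect
# knowledge.
from itertools import product

# Worlds assign truth values to three fluents.
fluents = ("valve_open", "pump_on", "flow")
worlds = [dict(zip(fluents, vals)) for vals in product([True, False], repeat=3)]

# State constraint that holds in every world: flow iff valve_open and pump_on.
worlds = [w for w in worlds if w["flow"] == (w["valve_open"] and w["pump_on"])]

def sense(worlds, fluent, observed):
    """A sensing action eliminates worlds inconsistent with the reading."""
    return [w for w in worlds if w[fluent] == observed]

def known(worlds, fluent):
    vals = {w[fluent] for w in worlds}
    return vals.pop() if len(vals) == 1 else "unknown"

# The agent knows the pump is on, then senses that there is no flow.
worlds = sense(worlds, "pump_on", True)
worlds = sense(worlds, "flow", False)
print("valve_open:", known(worlds, "valve_open"))  # -> False (indirect effect)
```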

  1. Application of uncertainty analysis in conceptual fusion reactor design

    International Nuclear Information System (INIS)

    Wu, T.; Maynard, C.W.

    1979-01-01

    The theories of sensitivity and uncertainty analysis are described and applied to a new conceptual tokamak fusion reactor design, NUWMAK. The responses investigated in this study include the tritium breeding ratio, first-wall Ti dpa and gas production, nuclear heating in the blanket, energy leakage to the magnet, and the dpa rate in the superconducting magnet's aluminum stabilizer. The sensitivities and uncertainties of these responses are calculated. The cost/benefit feature of proposed integral measurements is also studied through the uncertainty reductions of these responses
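
    The uncertainty figures in studies of this kind typically come from the first-order 'sandwich' rule; schematically (our notation):

```latex
\operatorname{var}(R) \;=\; S^{\mathsf{T}} C\, S,
\qquad
S_{i} \;=\; \frac{\partial R}{\partial \sigma_{i}},
```

    where R is a response such as the tritium breeding ratio, S is the vector of sensitivities to the cross sections σ_i, and C is their covariance matrix.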

  2. A Conceptual Formalization of Crosscutting in AOSD

    NARCIS (Netherlands)

    van den Berg, Klaas; Conejero, J.M.

    2005-01-01

    We propose a formalization of crosscutting based on a conceptual framework for AOSD. Crosscutting is clearly distinguished from the related concepts scattering and tangling. The definitions of these concepts are formalized and visualized with matrices and matrix operations. This allows more precise …
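
    One way to make the matrix formulation concrete (our reading, with an invented example; the paper's precise definitions may differ):

```python
import numpy as np

# Sketch: a boolean dependency matrix maps source elements (concerns)
# to target elements (modules).
# rows = [logging, auth, billing]; columns = modules [m1..m4]
M = np.array([
    [1, 1, 1, 0],   # logging touches m1, m2, m3  -> scattered
    [0, 1, 0, 0],   # auth touches only m2
    [0, 1, 0, 1],   # billing touches m2 and m4   -> scattered
], dtype=bool)

scattered = M.sum(axis=1) > 1          # a concern mapped to several modules
tangled   = M.sum(axis=0) > 1          # a module addressed by several concerns

# One natural reading of crosscutting: a scattered concern that meets
# tangling somewhere, i.e., shares a module with another concern.
crosscutting = scattered & (M & tangled).any(axis=1)
print("scattered:   ", scattered)       # [ True False  True]
print("tangled cols:", tangled)         # [False  True False False]
print("crosscutting:", crosscutting)    # [ True False  True]
```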

  3. Uncertainty and endogenous technical change in climate policy models

    International Nuclear Information System (INIS)

    Baker, Erin; Shittu, Ekundayo

    2008-01-01

    Until recently endogenous technical change and uncertainty have been modeled separately in climate policy models. In this paper, we review the emerging literature that considers both these elements together. Taken as a whole the literature indicates that explicitly including uncertainty has important quantitative and qualitative impacts on optimal climate change technology policy. (author)

  4. Comparative study of the uncertainties in parton distribution functions

    International Nuclear Information System (INIS)

    Alekhin, S.I.

    2003-01-01

    A comparison of the methods used to extract the uncertainties in parton distributions is given, including their statistical properties and practical issues of implementation. Advantages and disadvantages of the different methods are illustrated using examples based on the analysis of real data. Available PDF sets with associated uncertainties are reviewed and critically compared

  5. An exact formalism for Doppler-broadened neutron cross-sections

    International Nuclear Information System (INIS)

    Catsaros, Nicolas.

    1985-07-01

    An exact formalism (Ψ, Φ) is proposed for the calculation of Breit-Wigner or Adler-Adler Doppler-broadened neutron cross-sections. The well-known (Ψ, Φ) formalism is shown to be a zero-order approximation of the generalized (Ψ, Φ) formalism. (author)

  6. User Interface Technology for Formal Specification Development

    Science.gov (United States)

    Lowry, Michael; Philpot, Andrew; Pressburger, Thomas; Underwood, Ian; Lum, Henry, Jr. (Technical Monitor)

    1994-01-01

    Formal specification development and modification are an essential component of the knowledge-based software life cycle. User interface technology is needed to empower end-users to create their own formal specifications. This paper describes the advanced user interface for AMPHION, a knowledge-based software engineering system that targets scientific subroutine libraries. AMPHION is a generic, domain-independent architecture that is specialized to an application domain through a declarative domain theory. Formal specification development and reuse is made accessible to end-users through an intuitive graphical interface that provides semantic guidance in creating diagrams denoting formal specifications in an application domain. The diagrams also serve to document the specifications. Automatic deductive program synthesis ensures that end-user specifications are correctly implemented. The tables that drive AMPHION's user interface are automatically compiled from a domain theory; portions of the interface can be customized by the end-user. The user interface facilitates formal specification development by hiding syntactic details, such as logical notation. It also turns some of the barriers for end-user specification development associated with strongly typed formal languages into active sources of guidance, without restricting advanced users. The interface is especially suited for specification modification. AMPHION has been applied to the domain of solar system kinematics through the development of a declarative domain theory. Testing over six months with planetary scientists indicates that AMPHION's interactive specification acquisition paradigm enables users to develop, modify, and reuse specifications at least an order of magnitude more rapidly than manual program development.

  7. Modeling of uncertainties in statistical inverse problems

    International Nuclear Information System (INIS)

    Kaipio, Jari

    2008-01-01

    In all real-world problems, the models that tie the measurements to the unknowns of interest are at best only approximations of reality. While moderate modeling and approximation errors can be tolerated with stable problems, inverse problems are a notorious exception. Typical modeling errors include inaccurate geometry, unknown boundary and initial data, properties of noise and other disturbances, and simply the numerical approximations of the physical models. In principle, the Bayesian approach to inverse problems, in which all uncertainties are modeled as random variables, is capable of handling these uncertainties. Depending on the type of uncertainties, however, different strategies may be adopted. In this paper we give an overview of typical modeling errors and related strategies within the Bayesian framework.
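
    The Bayesian treatment of modeling errors sketched here is often written as follows (the so-called approximation error decomposition; notation ours):

```latex
y \;=\; A(x) + e
  \;=\; \tilde{A}(x)
  \;+\; \underbrace{\bigl(A(x) - \tilde{A}(x)\bigr)}_{\varepsilon(x)}
  \;+\; e
```

    The combined noise ε(x) + e is then approximated, for instance as a Gaussian with estimated mean and covariance, so that the cheap model Ã can be used in the inversion without the modeling error being mistaken for measurement information.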

  8. Evaluation of the Repeatability of the DeltaQ Duct Leakage Testing Technique, Including Investigation of Robust Analysis Techniques and Estimates of Weather-Induced Uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Dickerhoff, Darryl; Walker, Iain

    2008-08-01

    … found for the pressure station approach. Walker and Dickerhoff also included estimates of DeltaQ test repeatability based on the results of field tests where two houses were tested multiple times. The two houses were quite leaky (20-25 Air Changes per Hour at 50 Pa (0.2 in. water) (ACH50)) and were located in the San Francisco Bay area. One house was tested on a calm day and the other on a very windy day. Results were also presented for two additional houses that were tested by other researchers in Minneapolis, MN and Madison, WI, that had very tight envelopes (1.8 and 2.5 ACH50). These tight houses had internal duct systems and were tested without operating the central blower, sometimes referred to as control tests. The standard deviations between the multiple tests for all four houses were found to be about 1% of the envelope air flow at 50 Pa (0.2 in. water) (Q50), which led to the suggestion of this as a rule of thumb for estimating DeltaQ uncertainty. Because DeltaQ is based on measuring envelope air flows, it makes sense for uncertainty to scale with envelope leakage. However, these tests were on a limited data set, and one of the objectives of the current study is to increase the number of tested houses. This study focuses on answering two questions: (1) What is the uncertainty associated with changes in weather (primarily wind) conditions during DeltaQ testing? (2) How can these uncertainties be reduced? The first question addresses issues of repeatability. To study this, five houses were tested as many times as possible over a day. Weather data were recorded on-site, including the local wind speed. The results from these five houses were combined with the two Bay Area homes from the previous studies. The variability of the tests (represented by the standard deviation) is the repeatability of the test method for that house under the prevailing weather conditions. Because the testing was performed over a day, a wide range of wind speeds was achieved following …

  9. Optimal processing pathway selection for microalgae-based biorefinery under uncertainty

    DEFF Research Database (Denmark)

    Rizwan, Muhammad; Zaman, Muhammad; Lee, Jay H.

    2015-01-01

    We propose a systematic framework for the selection of optimal processing pathways for a microalgae-based biorefinery under techno-economic uncertainty. The proposed framework promotes robust decision making by taking into account the uncertainties that arise due to inconsistencies among … and shortage in the available technical information. A stochastic mixed integer nonlinear programming (sMINLP) problem is formulated for determining the optimal biorefinery configurations based on a superstructure model where parameter uncertainties are modeled and included as sampled scenarios. The solution … the accounting of uncertainty are compared with respect to different objectives. (C) 2015 Elsevier Ltd. All rights reserved.

  10. Comparison of the uncertainties of several European low-dose calibration facilities

    Science.gov (United States)

    Dombrowski, H.; Cornejo Díaz, N. A.; Toni, M. P.; Mihelic, M.; Röttger, A.

    2018-04-01

    The typical uncertainty of a low-dose rate calibration of a detector, which is calibrated in a dedicated secondary national calibration laboratory, is investigated, including measurements in the photon field of metrology institutes. Calibrations at low ambient dose equivalent rates (at the level of the natural ambient radiation) are needed when environmental radiation monitors are to be characterised. The uncertainties of calibration measurements in conventional irradiation facilities above ground are compared with those obtained in a low-dose rate irradiation facility located deep underground. Four laboratories quantitatively evaluated the uncertainties of their calibration facilities, in particular for calibrations at low dose rates (250 nSv/h and 1 μSv/h). For the first time, typical uncertainties of European calibration facilities are documented in a comparison and the main sources of uncertainty are revealed. All sources of uncertainties are analysed, including the irradiation geometry, scattering, deviations of real spectra from standardised spectra, etc. As a fundamental metrological consequence, no instrument calibrated in such a facility can have a lower total uncertainty in subsequent measurements. For the first time, the need to perform calibrations at very low dose rates (< 100 nSv/h) deep underground is underpinned on the basis of quantitative data.
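
    A GUM-style uncertainty budget of the kind compared here combines the component standard uncertainties in quadrature. A sketch with invented component names and values (not any laboratory's actual budget):

```python
import math

# Sketch of a GUM-style combination for a low-dose-rate calibration
# (component names and values are illustrative).
components = {                   # relative standard uncertainties
    "reference dose rate":   0.015,
    "positioning/distance":  0.010,
    "room and air scatter":  0.012,
    "natural background":    0.020,   # tends to dominate at ~250 nSv/h above ground
    "spectral differences":  0.008,
}

u_rel = math.sqrt(sum(u ** 2 for u in components.values()))
print(f"combined relative standard uncertainty: {100 * u_rel:.1f}%")
print(f"expanded uncertainty (k=2): {200 * u_rel:.1f}%")
```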

  11. DNA expressions - A formal notation for DNA

    NARCIS (Netherlands)

    Vliet, Rudy van

    2015-01-01

    We describe a formal notation for DNA molecules that may contain nicks and gaps. The resulting DNA expressions denote formal DNA molecules. Different DNA expressions may denote the same molecule. Such DNA expressions are called equivalent. We examine which DNA expressions are minimal, which

  12. Opinion dynamics model based on quantum formalism

    Energy Technology Data Exchange (ETDEWEB)

    Artawan, I. Nengah, E-mail: nengahartawan@gmail.com [Theoretical Physics Division, Department of Physics, Udayana University (Indonesia); Trisnawati, N. L. P., E-mail: nlptrisnawati@gmail.com [Biophysics, Department of Physics, Udayana University (Indonesia)

    2016-03-11

    An opinion dynamics model based on the quantum formalism is proposed. The core of the quantum formalism is the half-spin dynamical system. In this research the implicit time evolution operators are derived. The analogy between this model and the Deffuant and Sznajd models is discussed.

  13. Formal analysis of a fair payment protocol

    NARCIS (Netherlands)

    J.G. Cederquist; M.T. Dashti (Mohammad)

    2004-01-01

    We formally specify a payment protocol. This protocol is intended for fair exchange of time-sensitive data. Here the μCRL language is used to formalize the protocol. Fair exchange properties are expressed in the regular alternation-free μ-calculus. These properties are then verified

  14. Joint analysis of epistemic and aleatory uncertainty in stability analysis for geo-hazard assessments

    Science.gov (United States)

    Rohmer, Jeremy; Verdel, Thierry

    2017-04-01

    … (e.g., Baudrit et al., 2007) for geo-hazard assessments. A graphical tool is then developed to explore: (1) the contribution of both types of uncertainty, aleatory and epistemic; (2) the regions of the imprecise or random parameters which contribute the most to the imprecision of the failure probability P. The method is applied to two case studies (a mine pillar and a steep slope stability analysis; Rohmer and Verdel, 2014) to investigate the necessity of extra data acquisition on parameters whose imprecision can hardly be modelled by probabilities due to the scarcity of the available information (respectively, the extraction ratio and the cliff geometry). References: Baudrit, C., Couso, I., & Dubois, D. (2007). Joint propagation of probability and possibility in risk analysis: Towards a formal framework. International Journal of Approximate Reasoning, 45(1), 82-105. Rohmer, J., & Verdel, T. (2014). Joint exploration of regional importance of possibilistic and probabilistic uncertainty in stability analysis. Computers and Geotechnics, 61, 308-315.
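
    The joint propagation idea can be sketched in a few lines: sample the aleatory variables, and for each sample bound the result over the epistemic (interval-valued) ones, yielding lower and upper failure probabilities. A toy limit-state illustration (not the paper's case studies):

```python
import numpy as np

rng = np.random.default_rng(3)

# Hybrid propagation sketch: failure when load L exceeds resistance R.
# L is aleatory (probabilistic); R is epistemic, known only as an interval.
def failure(load, resistance):
    return load > resistance

r_interval = (4.0, 6.0)      # epistemic: imprecise resistance
n = 100_000
load = rng.lognormal(mean=1.2, sigma=0.4, size=n)   # aleatory variability

# For every aleatory sample, take the epistemic worst and best cases.
p_upper = failure(load, r_interval[0]).mean()  # plausibility of failure
p_lower = failure(load, r_interval[1]).mean()  # belief of failure
print(f"failure probability lies in [{p_lower:.4f}, {p_upper:.4f}]")
```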

  15. Formal methods for dependable real-time systems

    Science.gov (United States)

    Rushby, John

    1993-01-01

    The motivation for using formal methods to specify and reason about real-time properties is outlined, and approaches that have been proposed and used are sketched. The formal verifications of clock synchronization algorithms show that mechanically supported reasoning about complex real-time behavior is feasible. However, there has been a significant increase in the effectiveness of verification systems since those verifications were performed, and it is to be expected that verifications of comparable difficulty will become fairly routine. The current challenge lies in developing perspicuous and economical approaches to the formalization and specification of real-time properties.

  16. General many-body formalism for composite quantum particles.

    Science.gov (United States)

    Combescot, M; Betbeder-Matibet, O

    2010-05-21

    This Letter provides a formalism capable of exactly treating Pauli blocking between n-fermion particles. This formalism is based on an operator algebra made of commutators and anticommutators which contrasts with the usual scalar formalism of Green functions developed half a century ago for elementary quantum particles. We also provide the diagrams which visualize the very specific many-body physics induced by fermion exchanges between composite quantum particles.

  17. Integration of expert knowledge and uncertainty in natural risk assessment

    Science.gov (United States)

    Baruffini, Mirko; Jaboyedoff, Michel

    2010-05-01

    Natural hazards occurring in alpine regions during the last decades have clearly shown that interruptions of the Swiss railway power supply and closures of the Gotthard highway due to such events have increased the awareness of infrastructure vulnerability in Switzerland and illustrate the potential impacts of failures on the performance of infrastructure systems. This calls for a high level of surveillance and preservation along the transalpine lines. Traditional simulation models are only partially capable of predicting the behaviour of complex systems, and the protection strategies designed and implemented on their basis cannot mitigate the full spectrum of risk consequences. They are costly, and maximal protection is most probably not economically feasible. In addition, quantitative risk assessment approaches such as fault tree analysis, event tree analysis and equivalent annual fatality analysis rely heavily on statistical information. Collecting sufficient data on which to base a statistical probability of risk is costly and, in many situations, such data do not exist; thus, expert knowledge and experience or engineering judgment can be exploited to estimate risk qualitatively. To overcome this lack of statistics, we used models based on expert knowledge, which predict qualitatively from linguistic appraisals that are more expressive and natural in risk assessment. Fuzzy reasoning (FR) can be used here, providing a mechanism of computing with words (Zadeh, 1965) for modelling qualitative human thought processes in analyzing complex systems and decisions. Uncertainty in predicting the risk levels arises in such situations because no fully formalized knowledge is available. Another possibility is to use probability based on a triangular probability density function (T-PDF), which can follow the same flow-chart as FR. We implemented the Swiss natural hazard recommendations with FR and with probability based on the T-PDF in order to obtain hazard zoning and …

  18. Shear viscosity from Kubo formalism: NJL model study

    International Nuclear Information System (INIS)

    Lang, Robert; Weise, Wolfram

    2014-01-01

    A large-N_c expansion is combined with the Kubo formalism to study the shear viscosity η of strongly interacting matter in the two-flavor NJL model. We discuss analytical and numerical approaches to η and investigate systematically its strong dependence on the spectral width and the momentum-space cutoff. Thermal effects on the constituent quark mass from spontaneous chiral symmetry breaking are included. The ratio η/s and its thermal dependence are derived for different parameterizations of the spectral width and for an explicit one-loop calculation including mesonic modes within the NJL model. (orig.)
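
    The Kubo formula underlying such calculations relates η to the zero-frequency limit of a stress-tensor correlator; schematically:

```latex
\eta \;=\; \lim_{\omega\to 0}\,\frac{1}{2\omega}
\int \mathrm{d}t\,\mathrm{d}^{3}x\; e^{i\omega t}\,
\bigl\langle\,\bigl[\hat{T}^{xy}(t,\mathbf{x}),\,\hat{T}^{xy}(0,\mathbf{0})\bigr]\,\bigr\rangle
```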

  19. Results from the Application of Uncertainty Methods in the CSNI Uncertainty Methods Study (UMS)

    International Nuclear Information System (INIS)

    Glaeser, H.

    2008-01-01

    Within licensing procedures there is an incentive to replace the conservative requirements for code application by a 'best estimate' concept supplemented by an uncertainty analysis to account for predictive uncertainties of code results. Methods have been developed to quantify these uncertainties. The Uncertainty Methods Study (UMS) Group, following a mandate from CSNI, has compared five methods for calculating the uncertainty in the predictions of advanced 'best estimate' thermal-hydraulic codes. Most of the methods identify and combine input uncertainties. The major differences between the predictions of the methods came from the choice of uncertain parameters and the quantification of the input uncertainties, i.e. the width of the uncertainty ranges. Therefore, suitable experimental and analytical information has to be selected to specify these uncertainty ranges or distributions. After the closure of the Uncertainty Methods Study (UMS) and after the report was issued, comparison calculations of experiment LSTF-SB-CL-18 were performed by the University of Pisa using different versions of the RELAP5 code. It turned out that the version used by two of the participants calculated a 170 K higher peak clad temperature compared with other versions using the same input deck. This may contribute to the differences in the upper limit of the uncertainty ranges.

  20. Information Seeking in Uncertainty Management Theory: Exposure to Information About Medical Uncertainty and Information-Processing Orientation as Predictors of Uncertainty Management Success.

    Science.gov (United States)

    Rains, Stephen A; Tukachinsky, Riva

    2015-01-01

    Uncertainty management theory outlines the processes through which individuals cope with health-related uncertainty. Information seeking has been frequently documented as an important uncertainty management strategy. The reported study investigates exposure to specific types of medical information during a search, and one's information-processing orientation as predictors of successful uncertainty management (i.e., a reduction in the discrepancy between the level of uncertainty one feels and the level one desires). A lab study was conducted in which participants were primed to feel more or less certain about skin cancer and then were allowed to search the World Wide Web for skin cancer information. Participants' search behavior was recorded and content analyzed. The results indicate that exposure to two health communication constructs that pervade medical forms of uncertainty (i.e., severity and susceptibility) and information-processing orientation predicted uncertainty management success.

  1. Fifth International Conference on Squeezed States and Uncertainty Relations

    Science.gov (United States)

    Han, D. (Editor); Janszky, J. (Editor); Kim, Y. S. (Editor); Man'ko, V. I. (Editor)

    1998-01-01

    The Fifth International Conference on Squeezed States and Uncertainty Relations was held at Balatonfured, Hungary, on 27-31 May 1997. This series was initiated in 1991 at the College Park Campus of the University of Maryland as the Workshop on Squeezed States and Uncertainty Relations. The scientific purpose of the series was initially to discuss squeezed states of light, but in recent years the scope has become broad enough to include studies of uncertainty relations and squeeze transformations in all branches of physics, including quantum optics and the foundations of quantum mechanics. Quantum optics will continue to play a pivotal role, but future meetings will include all branches of physics where squeeze transformations are basic. As the meeting attracted more participants and started covering more diversified subjects, the fourth meeting was called an international conference. The Fourth International Conference on Squeezed States and Uncertainty Relations, held in 1995, was hosted by Shanxi University in Taiyuan, China. The fifth meeting of this series, held at Balatonfured, Hungary, was also supported by the IUPAP. The Sixth International Conference will be hosted by the University of Naples in 1999 and will take place in Ravello near Naples.

  2. Sensitivity and uncertainty analysis of NET/ITER shielding blankets

    International Nuclear Information System (INIS)

    Hogenbirk, A.; Gruppelaar, H.; Verschuur, K.A.

    1990-09-01

    Results are presented of sensitivity and uncertainty calculations based upon the European fusion file (EFF-1). The effect of uncertainties in Fe, Cr and Ni cross sections on the nuclear heating in the coils of a NET/ITER shielding blanket has been studied. The analysis has been performed for the total cross section as well as partial cross sections. The correct expression for the sensitivity profile was used, including the gain term. The resulting uncertainty in the nuclear heating lies between 10 and 20 per cent. (author). 18 refs.; 2 figs.; 2 tabs

  3. Uncertainty quantification and error analysis

    Energy Technology Data Exchange (ETDEWEB)

    Higdon, Dave M [Los Alamos National Laboratory; Anderson, Mark C [Los Alamos National Laboratory; Habib, Salman [Los Alamos National Laboratory; Klein, Richard [Los Alamos National Laboratory; Berliner, Mark [OHIO STATE UNIV.; Covey, Curt [LLNL; Ghattas, Omar [UNIV OF TEXAS; Graziani, Carlo [UNIV OF CHICAGO; Seager, Mark [LLNL; Sefcik, Joseph [LLNL; Stark, Philip [UC/BERKELEY; Stewart, James [SNL

    2010-01-01

    UQ studies all sources of error and uncertainty, including: systematic and stochastic measurement error; ignorance; limitations of theoretical models; limitations of numerical representations of those models; limitations on the accuracy and reliability of computations, approximations, and algorithms; and human error. A more precise definition for UQ is suggested below.

  4. Uncertainty covariances in robotics applications

    International Nuclear Information System (INIS)

    Smith, D.L.

    1984-01-01

    The application of uncertainty covariance matrices in the analysis of robot trajectory errors is explored. First, relevant statistical concepts are reviewed briefly. Then, a simple, hypothetical robot model is considered to illustrate methods for error propagation and performance test data evaluation. The importance of including error correlations is emphasized
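
    The standard use of uncertainty covariances in this setting is first-order propagation through the kinematics Jacobian, Σ_xy = J Σ_q Jᵀ. A sketch for a planar two-link arm (geometry and numbers are illustrative, not from the report):

```python
import numpy as np

# Propagate joint-angle uncertainty to end-effector position for a
# planar two-link arm.
l1, l2 = 0.5, 0.4                      # link lengths (m)
q = np.array([0.6, 0.9])               # joint angles (rad)

# Jacobian of the forward kinematics (x, y) w.r.t. (q1, q2).
s1, s12 = np.sin(q[0]), np.sin(q.sum())
c1, c12 = np.cos(q[0]), np.cos(q.sum())
J = np.array([[-l1 * s1 - l2 * s12, -l2 * s12],
              [ l1 * c1 + l2 * c12,  l2 * c12]])

# Correlated joint-angle uncertainty (covariance in rad^2).
Sigma_q = np.array([[1e-4, 2e-5],
                    [2e-5, 4e-4]])

Sigma_xy = J @ Sigma_q @ J.T           # first-order propagation
print("end-effector position covariance (m^2):\n", Sigma_xy)
print("std devs (mm):", 1e3 * np.sqrt(np.diag(Sigma_xy)))
```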

  5. Formal Analysis of a Fair Payment Protocol

    NARCIS (Netherlands)

    Cederquist, J.G.; Dashti, M.T.

    2004-01-01

    We formally specify a payment protocol. This protocol is intended for fair exchange of timesensitive data. Here the μCRL language is used to formalize the protocol. Fair exchange properties are expressed in the regular alternation-free μ-calculus. These properties are then verified using the finite

  6. Land grabbing and formalization in Africa : a critical inquiry

    NARCIS (Netherlands)

    Stein, H.; Cunningham, S.

    2015-01-01

    Two developments in Africa have generated an extensive literature. The first focuses on investment and land grabbing and the second on the formalization of rural property rights. Less has been written on the impact of formalization on land grabbing and of land grabbing on formalization. Recently,

  7. Uncertainty and Cognitive Control

    Directory of Open Access Journals (Sweden)

    Faisal Mushtaq

    2011-10-01

    A growing body of neuroimaging, behavioural and computational research has investigated the topic of outcome uncertainty in decision-making. Although evidence to date indicates that humans are very effective in learning to adapt to uncertain situations, the nature of the specific cognitive processes involved in the adaptation to uncertainty is still a matter of debate. In this article, we review evidence suggesting that cognitive control processes are at the heart of uncertainty in decision-making contexts. Available evidence suggests that: (1) there is a strong conceptual overlap between the constructs of uncertainty and cognitive control; (2) there is a remarkable overlap between the neural networks associated with uncertainty and the brain networks subserving cognitive control; (3) the perception and estimation of uncertainty might play a key role in monitoring processes and the evaluation of the need for control; (4) potential interactions between uncertainty and cognitive control might play a significant role in several affective disorders.

  8. Sonic Boom Pressure Signature Uncertainty Calculation and Propagation to Ground Noise

    Science.gov (United States)

    West, Thomas K., IV; Bretl, Katherine N.; Walker, Eric L.; Pinier, Jeremy T.

    2015-01-01

    The objective of this study was to outline an approach for the quantification of uncertainty in sonic boom measurements and to investigate the effect of various near-field uncertainty representation approaches on ground noise predictions. These approaches included a symmetric versus asymmetric uncertainty band representation and a dispersion technique based on a partial sum Fourier series that allows for the inclusion of random error sources in the uncertainty. The near-field uncertainty was propagated to the ground level, along with additional uncertainty in the propagation modeling. Estimates of perceived loudness were obtained for the various types of uncertainty representation in the near-field. Analyses were performed on three configurations of interest to the sonic boom community: the SEEB-ALR, the 69° Delta Wing, and the LM 1021-01. Results showed that representation of the near-field uncertainty plays a key role in ground noise predictions. Using a Fourier-series-based dispersion approach can double the amount of uncertainty in the ground noise compared to a pure bias representation. Compared to previous computational fluid dynamics results, uncertainty in ground noise predictions was greater when considering the near-field experimental uncertainty.

  9. Infinitesimal Deformations of a Formal Symplectic Groupoid

    Science.gov (United States)

    Karabegov, Alexander

    2011-09-01

    Given a formal symplectic groupoid G over a Poisson manifold (M, π₀), we define a new object, an infinitesimal deformation of G, which can be thought of as a formal symplectic groupoid over the manifold M equipped with an infinitesimal deformation π₀ + επ₁ of the Poisson bivector field π₀. To any pair of natural star products (⋆, ⋆̃) having the same formal symplectic groupoid G we relate an infinitesimal deformation of G. We call it the deformation groupoid of the pair (⋆, ⋆̃). To each star product with separation of variables ⋆ on a Kähler-Poisson manifold M we relate another star product with separation of variables ⋆̂ on M. We build an algorithm for calculating the principal symbols of the components of the logarithm of the formal Berezin transform of a star product with separation of variables ⋆. This algorithm is based upon the deformation groupoid of the pair (⋆, ⋆̂).

  10. Formal verification of algorithms for critical systems

    Science.gov (United States)

    Rushby, John M.; Von Henke, Friedrich

    1993-01-01

    We describe our experience with formal, machine-checked verification of algorithms for critical applications, concentrating on a Byzantine fault-tolerant algorithm for synchronizing the clocks in the replicated computers of a digital flight control system. First, we explain the problems encountered in unsynchronized systems and the necessity, and criticality, of fault-tolerant synchronization. We give an overview of one such algorithm, and of the arguments for its correctness. Next, we describe a verification of the algorithm that we performed using our EHDM system for formal specification and verification. We indicate the errors we found in the published analysis of the algorithm, and other benefits that we derived from the verification. Based on our experience, we derive some key requirements for a formal specification and verification system adequate to the task of verifying algorithms of the type considered. Finally, we summarize our conclusions regarding the benefits of formal verification in this domain, and the capabilities required of verification systems in order to realize those benefits.
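
    The flavor of the verified algorithm can be conveyed by a toy interactive-convergence style resynchronization round (a sketch only; the actual algorithm, its fault model and its correctness argument are considerably richer):

```python
# Toy version of an interactive-convergence style clock synchronization
# round.
DELTA = 10.0   # readings further than this from our own clock are suspect

def resync(own, readings):
    """Average the clock readings obtained from all processors, replacing
    any reading that differs from our own clock by more than DELTA with
    our own value; this bounds the influence of a faulty clock."""
    screened = [r if abs(r - own) <= DELTA else own for r in readings]
    return sum(screened) / len(screened)

# Four clocks; the last one is Byzantine-faulty and reports nonsense.
readings = [100.0, 103.0, 98.0, 500.0]
for own in readings[:3]:               # the non-faulty processors
    print(f"clock at {own:6.1f} -> corrected to {resync(own, readings):7.2f}")
# The good clocks move closer together despite the faulty reading.
```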

  11. Aleatoric and epistemic uncertainties in sampling based nuclear data uncertainty and sensitivity analyses

    International Nuclear Information System (INIS)

    Zwermann, W.; Krzykacz-Hausmann, B.; Gallner, L.; Klein, M.; Pautz, A.; Velkov, K.

    2012-01-01

    Sampling-based uncertainty and sensitivity analyses due to epistemic input uncertainties, i.e. to an incomplete knowledge of uncertain input parameters, can be performed with arbitrary application programs to solve the physical problem under consideration. For the description of steady-state particle transport, direct simulations of the microscopic processes with Monte Carlo codes are often used. This introduces an additional source of uncertainty, the aleatoric sampling uncertainty, which is due to the randomness of the simulation process performed by sampling, and which adds to the total combined output sampling uncertainty. So far, this aleatoric part of the uncertainty has been minimized by running a sufficiently large number of Monte Carlo histories for each sample calculation, thus making its impact negligible compared to the impact from sampling the epistemic uncertainties. Obviously, this process may incur high computational costs. The present paper shows that in many applications reliable epistemic uncertainty results can also be obtained with substantially lower computational effort by performing and analyzing two appropriately generated series of samples, each with a much smaller number of Monte Carlo histories. The method is applied along with the nuclear data uncertainty and sensitivity code package XSUSA in combination with the Monte Carlo transport code KENO-Va to various critical assemblies and a full-scale reactor calculation. It is shown that the proposed method yields output uncertainties and sensitivities equivalent to the traditional approach, with a reduction of computing time by factors of the order of 100. (authors)
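
    The abstract does not give the estimator, but a common way to realize the two-series idea is a paired-replica construction: run each epistemic sample twice with independent Monte Carlo seeds and few histories, estimate the aleatoric noise from the paired differences, and subtract it from the observed spread. The sketch below is that standard construction, not necessarily XSUSA's exact formula.

        # Paired-replica estimate of the epistemic variance of an output such
        # as k_eff: y1[i], y2[i] are two independent short MC runs of the
        # i-th epistemic sample.
        import numpy as np

        def epistemic_variance(y1, y2):
            y_mean = 0.5 * (y1 + y2)
            var_total = np.var(y_mean, ddof=1)              # epistemic + MC noise
            var_aleatoric = 0.25 * np.mean((y1 - y2) ** 2)  # MC noise of y_mean
            return max(var_total - var_aleatoric, 0.0)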

  12. Non-Formal Educational Empowerment of Nigeria Youths for ...

    African Journals Online (AJOL)

    Religion Dept

    discussed the concept of non-formal education, entrepreneurship and development, non-formal ... introducing some developmental programmes such as poverty alleviation .... aesthetic, cultural and civic education for public enlightenment.

  13. Uncertainty, probability and information-gaps

    International Nuclear Information System (INIS)

    Ben-Haim, Yakov

    2004-01-01

    This paper discusses two main ideas. First, we focus on info-gap uncertainty, as distinct from probability. Info-gap theory is especially suited for modelling and managing uncertainty in system models: we invest all our knowledge in formulating the best possible model; this leaves the modeller with very faulty and fragmentary information about the variation of reality around that optimal model. Second, we examine the interdependence between uncertainty modelling and decision-making. Good uncertainty modelling requires contact with the end-use, namely, with the decision-making application of the uncertainty model. The most important avenue of uncertainty-propagation is from initial data- and model-uncertainties into uncertainty in the decision-domain. Two questions arise. Is the decision robust to the initial uncertainties? Is the decision prone to opportune windfall success? We apply info-gap robustness and opportunity functions to the analysis of representation and propagation of uncertainty in several of the Sandia Challenge Problems.
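
    Concretely, the info-gap robustness of a decision is the largest horizon of uncertainty alpha such that the worst case over the uncertainty set U(alpha) still satisfies the critical requirement. The toy sketch below (interval uncertainty model, illustrative reward function) shows the computation; it assumes the worst case sits on the boundary of the interval, which holds for the monotone toy reward used here.

        # Toy info-gap robustness: U(alpha) = {u : |u - u_nom| <= alpha}.
        import numpy as np

        def robustness(reward, u_nom, r_c, alphas):
            """Largest alpha whose worst-case reward still meets r_c."""
            best = 0.0
            for a in alphas:
                worst = min(reward(u_nom - a), reward(u_nom + a))
                if worst >= r_c:
                    best = a
                else:
                    break
            return best

        reward = lambda u: 10.0 - 4.0 * abs(u - 1.0)  # falls off with deviation
        print(robustness(reward, u_nom=1.0, r_c=8.0, alphas=np.linspace(0, 2, 201)))
        # -> 0.5: the requirement r_c = 8 survives deviations up to |u - 1| = 0.5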

  14. A Formal Methods Approach to the Analysis of Mode Confusion

    Science.gov (United States)

    Butler, Ricky W.; Miller, Steven P.; Potts, James N.; Carreno, Victor A.

    2004-01-01

    The goal of the new NASA Aviation Safety Program (AvSP) is to reduce the civil aviation fatal accident rate by 80% in ten years and 90% in twenty years. This program is being driven by the accident data, with a focus on the most recent history. Pilot error is the most commonly cited cause of fatal accidents (up to 70%) and obviously must be given major consideration in this program. While the greatest source of pilot error is the loss of situation awareness, mode confusion is increasingly becoming a major contributor as well. The January 30, 1995 issue of Aviation Week lists 184 incidents and accidents involving mode awareness, including the Bangalore A320 crash 2/14/90, the Strasbourg A320 crash 1/20/92, the Mulhouse-Habsheim A320 crash 6/26/88, and the Toulouse A330 crash 6/30/94. These incidents and accidents reveal that pilots sometimes become confused about what the cockpit automation is doing. Consequently, human factors research is an obvious investment area. However, even a cursory look at the accident data reveals that the mode confusion problem is much deeper than just training deficiencies and a lack of human-oriented design. This is readily acknowledged by human factors experts. It seems that further progress in human factors must come through a deeper scrutiny of the internals of the automation. It is in this arena that formal methods can contribute. Formal methods refers to the use of techniques from logic and discrete mathematics in the specification, design, and verification of computer systems, both hardware and software. The fundamental goal of formal methods is to capture requirements, designs and implementations in a mathematically based model that can be analyzed in a rigorous manner. Research in formal methods is aimed at automating this analysis as much as possible. By capturing the internal behavior of a flight deck in a rigorous and detailed formal model, the dark corners of a design can be analyzed. This paper explores how formal methods can be applied to the analysis of mode confusion.
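
    As a toy illustration (not from the paper) of analyzing the automation's internals, the sketch below casts a fragment of flight-deck mode logic as a transition table and exhaustively checks it for "silent" mode changes: transitions the pilot did not command and that are not annunciated. Mode and event names are hypothetical.

        # A mode logic as a transition table, checked exhaustively for
        # uncommanded, unannunciated mode changes (a mode-confusion trap).
        TRANSITIONS = {
            ("VS", "pilot_selects_app"): ("APP", True),  # (next mode, annunciated)
            ("APP", "glideslope_capture"): ("GS", True),
            ("VS", "altitude_capture"): ("ALT_HOLD", False),  # seeded defect
            ("ALT_HOLD", "pilot_selects_vs"): ("VS", True),
        }

        def silent_uncommanded_changes(transitions):
            return [
                (mode, event, nxt)
                for (mode, event), (nxt, annunciated) in transitions.items()
                if not event.startswith("pilot_") and not annunciated
            ]

        print(silent_uncommanded_changes(TRANSITIONS))
        # -> [('VS', 'altitude_capture', 'ALT_HOLD')]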

  15. Fundamental uncertainty limit of optical flow velocimetry according to Heisenberg's uncertainty principle.

    Science.gov (United States)

    Fischer, Andreas

    2016-11-01

    Optical flow velocity measurements are important for understanding the complex behavior of flows. Although a huge variety of methods exist, they are all based on either a Doppler or a time-of-flight measurement principle. Doppler velocimetry evaluates the velocity-dependent frequency shift of light scattered at a moving particle, whereas time-of-flight velocimetry evaluates the traveled distance of a scattering particle per time interval. Regarding the aim of achieving a minimal measurement uncertainty, it is unclear whether one principle allows lower uncertainties to be achieved or whether both principles can achieve equal uncertainties. For this reason, the natural, fundamental uncertainty limit according to Heisenberg's uncertainty principle is derived for the Doppler and time-of-flight measurement principles, respectively. The obtained limits of the velocity uncertainty are qualitatively identical, showing, e.g., a direct proportionality to the absolute value of the velocity to the power of 3/2 and an inverse proportionality to the square root of the scattered light power. Hence, both measurement principles have identical potentials regarding the fundamental uncertainty limit due to the quantum mechanical behavior of photons. This fundamental limit can be attained (at least asymptotically) in reality with either Doppler or time-of-flight methods, because the respective Cramér-Rao bounds for dominating photon shot noise, which is modeled as white Poissonian noise, coincide with the conclusions from Heisenberg's uncertainty principle.
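
    The scaling stated in the abstract can be written compactly (proportionality constants are omitted, since the abstract does not give them):

        \[
        \sigma_v \;\propto\; \frac{|v|^{3/2}}{\sqrt{P_s}},
        \]

    where $\sigma_v$ is the fundamental velocity uncertainty, $v$ the flow velocity, and $P_s$ the scattered light power.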

  16. Photometric Uncertainties

    Science.gov (United States)

    Zou, Xiao-Duan; Li, Jian-Yang; Clark, Beth Ellen; Golish, Dathon

    2018-01-01

    The OSIRIS-REx spacecraft, launched in September 2016, will study the asteroid Bennu and return a sample from its surface to Earth in 2023. Bennu is a near-Earth carbonaceous asteroid that will provide insight into the formation and evolution of the solar system. OSIRIS-REx will first approach Bennu in August 2018 and will study the asteroid for approximately two years before sampling. OSIRIS-REx will develop its photometric model of Bennu (including the Lommel-Seeliger, ROLO, McEwen, Minnaert and Akimov models) with OCAM and OVIRS during the Detailed Survey mission phase. The model developed during this phase will be used to photometrically correct the OCAM and OVIRS data. Here we present an analysis of the error in the photometric corrections. Based on our test data sets, we find:
    1. The model uncertainties are only correct when computed with the full covariance matrix, because the parameters are highly correlated.
    2. There is no evidence that any single parameter dominates in any of the models.
    3. Model error and data error contribute comparably to the final correction error.
    4. Tests of the uncertainty module on synthetic and real data sets show that model performance depends on data coverage and data quality; these tests gave us a better understanding of how the different models behave in different cases.
    5. The Lommel-Seeliger model is more reliable than the others, perhaps because the simulated data are based on it; the test on real data (SPDIF) also shows a slight advantage for Lommel-Seeliger. ROLO is not reliable for calculating the Bond albedo, the uncertainty of the McEwen model is large in most cases, and the Akimov model behaves unphysically on the SOPIE 1 data.
    6. Lommel-Seeliger is the better default choice, a conclusion based mainly on our tests on the SOPIE data and IPDIF.
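
    Finding 1 is the standard first-order propagation result. The sketch below (hypothetical numbers) shows why the off-diagonal covariance terms matter when parameters are strongly correlated: diagonal-only error bars can misstate the corrected quantity's variance by a large factor.

        # First-order propagation var(f) = J @ Sigma @ J.T for two correlated
        # model parameters; compare full covariance against diagonal-only.
        import numpy as np

        J = np.array([[0.8, -0.5]])     # sensitivities df/dp (illustrative)
        sigma = np.array([0.02, 0.03])  # 1-sigma parameter uncertainties
        rho = -0.9                      # strong parameter correlation

        Sigma_full = np.array([
            [sigma[0] ** 2, rho * sigma[0] * sigma[1]],
            [rho * sigma[0] * sigma[1], sigma[1] ** 2],
        ])
        Sigma_diag = np.diag(sigma ** 2)

        print((J @ Sigma_full @ J.T).item())  # ~9.1e-4: correlations included
        print((J @ Sigma_diag @ J.T).item())  # ~4.8e-4: correlations ignored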

  17. Sound Computational Interpretation of Formal Encryption with Composed Keys

    NARCIS (Netherlands)

    Laud, P.; Corin, R.J.; In Lim, J.; Hoon Lee, D.

    2003-01-01

    The formal and computational views of cryptography have been related by the seminal work of Abadi and Rogaway. In their work, a formal treatment of encryption that uses atomic keys is justified in the computational world. However, many proposed formal approaches allow the use of composed keys, where

  18. Comparison between conservative perturbation and sampling based methods for propagation of Non-Neutronic uncertainties

    International Nuclear Information System (INIS)

    Campolina, Daniel de A.M.; Pereira, Claubia; Veloso, Maria Auxiliadora F.

    2013-01-01

    For every physical component that comprises a nuclear system there is an associated uncertainty. Assessing the impact of uncertainties in the simulation of fissionable material systems is essential for best-estimate calculations, which have been replacing conservative model calculations as computational power increases. The propagation of uncertainty in a simulation using sampling-based methods is recent because of the huge computational effort required. In this work, a sample space of MCNP calculations was used as a black-box model to propagate the uncertainty of system parameters. The efficiency of the method was compared to that of a conservative method. The input parameter uncertainties considered were non-neutronic, including geometry dimensions and density. The effect of the uncertainties on the effective multiplication factor of the system was analyzed with respect to the possibility of including many uncertain parameters in the same input. When a case includes more than 46 uncertain parameters in the same input, the sampling-based method proves to be more efficient than the conservative method. (author)
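
    A minimal sketch of the black-box sampling idea follows; the parameter names, distributions, and the stand-in response function are assumptions, since in the study each sample would be a full MCNP case. The sample size of 59 is the classical first-order Wilks number for a one-sided 95%/95% tolerance statement, a common choice in such analyses.

        # Sampling-based propagation through a black-box model of k_eff.
        import numpy as np

        rng = np.random.default_rng(0)

        def run_model(radius_cm, density_gcc):
            # Placeholder for the black-box transport code.
            return 1.0 + 0.02 * (radius_cm - 10.0) + 0.05 * (density_gcc - 18.7)

        N = 59  # Wilks: P(sample max covers 95th percentile) >= 95% for n = 59
        keff = np.array([
            run_model(rng.normal(10.0, 0.05), rng.normal(18.7, 0.1))
            for _ in range(N)
        ])
        print(keff.mean(), keff.std(ddof=1), keff.max())  # max = 95/95 bound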

  19. Estimating the measurement uncertainty in forensic blood alcohol analysis.

    Science.gov (United States)

    Gullberg, Rod G

    2012-04-01

    For many reasons, forensic toxicologists are being asked to determine and report their measurement uncertainty in blood alcohol analysis. While understood conceptually, the elements and computations involved in determining measurement uncertainty are generally foreign to most forensic toxicologists. Several established and well-documented methods are available to determine and report the uncertainty in blood alcohol measurement. A straightforward bottom-up approach is presented that includes: (1) specifying the measurand, (2) identifying the major components of uncertainty, (3) quantifying the components, (4) statistically combining the components and (5) reporting the results. A hypothetical example is presented that employs reasonable estimates for forensic blood alcohol analysis assuming headspace gas chromatography. These computations are easily employed in spreadsheet programs as well. Determining and reporting measurement uncertainty is an important element in establishing fitness-for-purpose. Indeed, the demand for such computations and information from the forensic toxicologist will continue to increase.
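
    The five steps listed map directly onto a GUM-style spreadsheet computation: express each component as a relative standard uncertainty, combine them in quadrature, and multiply by a coverage factor. The sketch below uses hypothetical component values, not the paper's numbers.

        # Bottom-up combination of uncertainty components for a blood alcohol
        # measurement (headspace GC). Component values are illustrative.
        import math

        components = {                 # relative standard uncertainties
            "calibrator": 0.010,
            "method_bias": 0.008,
            "replicate_sd": 0.012,
            "sampling": 0.005,
        }

        u_c = math.sqrt(sum(u ** 2 for u in components.values()))
        k = 2.0                        # coverage factor, ~95% level
        mean_bac = 0.154               # measured BAC, g/100 mL

        expanded = k * u_c * mean_bac
        print(f"BAC = {mean_bac:.3f} +/- {expanded:.3f} g/100 mL (k = {k})")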

  20. Non-formal education (Educación no formal)

    Science.gov (United States)

    Tignanelli, H.

    This communication reviews the main contributions made in the field of astronomy education at the primary, secondary, and tertiary levels, as a starting point for discussing the current place of astronomical content in the new curricula of the EGB (Educación General Básica) and Polimodal levels introduced by the Educational Reform. In particular, it discusses the scope of formal and non-formal education, its importance for the training of teachers, and future prospects.