WorldWideScience

Sample records for event based uncertainty

  1. Quantile-based bias correction and uncertainty quantification of extreme event attribution statements

    Directory of Open Access Journals (Sweden)

    Soyoung Jeon

    2016-06-01

    Full Text Available Extreme event attribution characterizes how anthropogenic climate change may have influenced the probability and magnitude of selected individual extreme weather and climate events. Attribution statements often involve quantification of the fraction of attributable risk (FAR) or the risk ratio (RR) and associated confidence intervals. Many such analyses use climate model output to characterize extreme event behavior with and without anthropogenic influence. However, such climate models may have biases in their representation of extreme events. To account for discrepancies in the probabilities of extreme events between observational datasets and model datasets, we demonstrate an appropriate rescaling of the model output based on the quantiles of the datasets to estimate an adjusted risk ratio. Our methodology accounts for various components of uncertainty in estimation of the risk ratio. In particular, we present an approach to construct a one-sided confidence interval on the lower bound of the risk ratio when the estimated risk ratio is infinity. We demonstrate the methodology using the summer 2011 central US heatwave and output from the Community Earth System Model. In this example, we find that the lower bound of the risk ratio is relatively insensitive to the magnitude and probability of the actual event.
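
    The abstract describes quantile-based bias correction followed by risk-ratio estimation with a one-sided lower confidence bound. The sketch below, in Python with NumPy, illustrates that general workflow under stated assumptions: the quantile-offset correction, the exceedance threshold, the synthetic data, and the bootstrap lower bound are all illustrative choices, not the authors' actual procedure or code.

```python
import numpy as np

rng = np.random.default_rng(0)

def quantile_correction(obs, model_ref, model, n_q=99):
    """Quantile-by-quantile bias correction: shift `model` by the offset
    between observed and reference-model quantiles (illustrative form)."""
    qs = np.linspace(0.01, 0.99, n_q)
    delta = np.quantile(obs, qs) - np.quantile(model_ref, qs)
    ranks = (np.argsort(np.argsort(model)) + 0.5) / len(model)
    return model + np.interp(ranks, qs, delta)

def risk_ratio(factual, counterfactual, threshold):
    """RR = P(exceedance | factual) / P(exceedance | counterfactual)."""
    p1 = np.mean(factual > threshold)
    p0 = np.mean(counterfactual > threshold)
    return np.inf if p0 == 0 else p1 / p0

def rr_lower_bound(factual, counterfactual, threshold, alpha=0.05, n_boot=2000):
    """One-sided bootstrap lower confidence bound on the risk ratio
    (useful when the point estimate itself is infinite)."""
    rrs = [risk_ratio(rng.choice(factual, len(factual)),
                      rng.choice(counterfactual, len(counterfactual)),
                      threshold) for _ in range(n_boot)]
    return np.quantile(rrs, alpha)

# Synthetic stand-ins for observed and simulated summer temperatures (degC)
obs = rng.normal(27.0, 1.0, 60)
factual = rng.normal(28.0, 1.3, 500)          # model run with anthropogenic forcing
counterfactual = rng.normal(27.0, 1.3, 500)   # model run without

factual_adj = quantile_correction(obs, factual, factual)
counterfactual_adj = quantile_correction(obs, factual, counterfactual)
threshold = np.quantile(obs, 0.99)            # event definition: 99th percentile of obs

print(risk_ratio(factual_adj, counterfactual_adj, threshold))
print(rr_lower_bound(factual_adj, counterfactual_adj, threshold))
```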

  2. Physically-based modelling of high magnitude torrent events with uncertainty quantification

    Science.gov (United States)

    Wing-Yuen Chow, Candace; Ramirez, Jorge; Zimmermann, Markus; Keiler, Margreth

    2017-04-01

    High magnitude torrent events are associated with the rapid propagation of vast quantities of water and available sediment downslope where human settlements may be established. Assessing the vulnerability of built structures to these events is a part of consequence analysis, where hazard intensity is related to the degree of loss sustained. The specific contribution of the presented work is a procedure to simulate these damaging events by applying physically-based modelling and to include uncertainty information about the simulated results. This is a first step in the development of vulnerability curves based on several intensity parameters (i.e. maximum velocity, sediment deposition depth and impact pressure). The investigation process begins with the collection, organization and interpretation of detailed post-event documentation and photograph-based observation data of affected structures at three sites that exemplify the impact of highly destructive mudflows and flood occurrences on settlements in Switzerland. Hazard intensity proxies are then simulated with the physically-based FLO-2D model (O'Brien et al., 1993). Prior to modelling, global sensitivity analysis is conducted to support a better understanding of model behaviour, parameterization and the quantification of uncertainties (Song et al., 2015). The inclusion of information describing the degree of confidence in the simulated results supports the credibility of vulnerability curves developed with the modelled data. First, key parameters are identified and selected based on a literature review. Truncated a priori ranges of parameter values were then defined by expert solicitation. Local sensitivity analysis is performed based on manual calibration to provide an understanding of the parameters relevant to the case studies of interest. Finally, automated parameter estimation is performed to comprehensively search for optimal parameter combinations and associated values, which are evaluated using the

  3. Warning and prevention based on estimates with large uncertainties: the case of low-frequency and large-impact events like tsunamis

    Science.gov (United States)

    Tinti, Stefano; Armigliato, Alberto; Pagnoni, Gianluca; Zaniboni, Filippo

    2013-04-01

    Geoscientists often deal with hazardous processes like earthquakes, volcanic eruptions, tsunamis, hurricanes, etc., and their research aims not only at a better understanding of the physical processes, but also at assessing the spatial and temporal evolution of a given individual event (i.e. providing short-term prediction) and the expected evolution of a group of events (i.e. providing statistical estimates referred to a given return period and a given geographical area). One of the main issues of any scientific method is how to cope with measurement errors, a topic which, in the case of forecasts of ongoing or future events, translates into how to deal with forecast uncertainties. In general, the more data are available and processed to make a prediction, the more accurate the prediction is expected to be (if the scientific approach is sound) and the smaller the associated uncertainties are. However, there are several important cases where the assessment must be made with insufficient data or insufficient time for processing, which leads to large uncertainties. Two examples can be taken from tsunami science, since tsunamis are rare events that may have destructive power and very large impact. One example is the case of warning for a tsunami generated by a near-coast earthquake, an issue at the focus of the European funded project NearToWarn. Warning has to be issued before the tsunami hits the coast, that is, within a few minutes of its generation. This may imply that the data collected in such a short time are not yet enough for an accurate evaluation, also because the implemented monitoring system (if any) could be inadequate (for instance, a dense instrumental network could be judged too expensive for rare events). The second case is long-term prevention from tsunami strikes. Tsunami infrequency may imply that the historical record for a given piece of coast is too short to capture a statistical

  4. Evidence theory and differential evolution based uncertainty ...

    Indian Academy of Sciences (India)

    Gap Theory: Decisions under severe uncertainty, Second edition. London: Academic Press. Byeng D Y, Choi K, Liu D and David G 2007 Integration of possibility-based optimization to robust design for epistemic uncertainty. ASME J. Mech. Des.

  5. Adjoint-Based Uncertainty Quantification with MCNP

    Energy Technology Data Exchange (ETDEWEB)

    Seifried, Jeffrey E. [Univ. of California, Berkeley, CA (United States)

    2011-09-01

    This work serves to quantify the instantaneous uncertainties in neutron transport simulations arising from nuclear data and statistical counting uncertainties. Perturbation and adjoint theories are used to derive implicit sensitivity expressions. These expressions are transformed into forms that are convenient for construction with MCNP6, creating the ability to perform adjoint-based uncertainty quantification with MCNP6. These new tools are exercised on the depleted-uranium hybrid LIFE blanket, quantifying its sensitivities and uncertainties to important figures of merit. Overall, these uncertainty estimates are small (< 2%). Having quantified the sensitivities and uncertainties, physical understanding of the system is gained and some confidence in the simulation is acquired.
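
    As a generic illustration of the first-order ("sandwich rule") propagation that sensitivity-based uncertainty quantification relies on, the sketch below combines relative sensitivity coefficients with a nuclear-data covariance matrix. The numbers are placeholders and the code has no connection to MCNP6 or the LIFE blanket results.

```python
import numpy as np

# First-order ("sandwich rule") propagation of nuclear-data covariance through
# relative sensitivity coefficients S_i = (dR/R) / (dsigma_i/sigma_i).
# All values below are placeholders for illustration only.
S = np.array([0.12, -0.05, 0.30])          # relative sensitivities to 3 cross sections
C = np.array([[4.0e-4, 1.0e-4, 0.0],
              [1.0e-4, 9.0e-4, 0.0],
              [0.0,    0.0,    2.5e-4]])   # relative covariance matrix of the data

rel_var = S @ C @ S                         # relative variance of the response
print("relative uncertainty: %.3f%%" % (100 * np.sqrt(rel_var)))
```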

  6. Constraining the generalized uncertainty principle with the gravitational wave event GW150914

    Directory of Open Access Journals (Sweden)

    Zhong-Wen Feng

    2017-05-01

    Full Text Available In this letter, we show that the dimensionless parameters in the generalized uncertainty principle (GUP) can be constrained by the gravitational wave event GW150914, which was discovered by the LIGO Scientific and Virgo Collaborations. Firstly, according to the Heisenberg uncertainty principle (HUP) and the data of gravitational wave event GW150914, we derive the standard energy–momentum dispersion relation and calculate the difference between the propagation speed of gravitons and the speed of light, i.e., Δυ. Next, using two proposals regarding the GUP, we also generalize our study to the quantum gravity case and obtain the modified speed of gravitons. Finally, based on the modified speed of gravitons and Δυ, the improved upper bounds on the GUP parameters are obtained. The results show that the upper limits of the GUP parameters β0 and α0 are 2.3×10^60 and 1.8×10^20.

  7. Entropic uncertainty relation based on generalized uncertainty principle

    Science.gov (United States)

    Hsu, Li-Yi; Kawamoto, Shoichi; Wen, Wen-Yu

    2017-09-01

    We explore the modification of the entropic formulation of the uncertainty principle in quantum mechanics, which measures the incompatibility of measurements in terms of Shannon entropy. The deformation in question is of the type known as the generalized uncertainty principle, which is motivated by thought experiments in quantum gravity and string theory and is characterized by a parameter of Planck scale. The corrections are evaluated for small deformation parameters by use of the Gaussian wave function and numerical calculation. As the generalized uncertainty principle has proven to be useful in the study of the quantum nature of black holes, this study would be a step toward introducing an information theory viewpoint to black hole physics.

  8. Accept & Reject Statement-Based Uncertainty Models

    NARCIS (Netherlands)

    E. Quaeghebeur (Erik); G. de Cooman; F. Hermans (Felienne)

    2015-01-01

    We develop a framework for modelling and reasoning with uncertainty based on accept and reject statements about gambles. It generalises the frameworks found in the literature based on statements of acceptability, desirability, or favourability and clarifies their relative position. Next

  9. Uncertainty related to Environmental Data and Estimated Extreme Events

    DEFF Research Database (Denmark)

    Burcharth, H. F.

    The design loads on rubble mound breakwaters are almost entirely determined by the environmental conditions, i.e. sea state, water levels, sea bed characteristics, etc. It is the objective of sub-group B to identify the most important environmental parameters and evaluate the related uncertaintie...

  10. Event-Based Activity Modeling

    DEFF Research Database (Denmark)

    Bækgaard, Lars

    2004-01-01

    We present and discuss a modeling approach that supports event-based modeling of information and activity in information systems. Interacting human actors and IT-actors may carry out such activity. We use events to create meaningful relations between information structures and the related activities inside and outside an IT-system. We use event-activity diagrams to model activity. Such diagrams support the modeling of activity flow, object flow, shared events, triggering events, and interrupting events.

  11. Event-Based Conceptual Modeling

    DEFF Research Database (Denmark)

    Bækgaard, Lars

    2009-01-01

    The purpose of the paper is to obtain insight into and provide practical advice for event-based conceptual modeling. We analyze a set of event concepts and use the results to formulate a conceptual event model that is used to identify guidelines for creation of dynamic process models and static information models. We characterize events as short-duration processes that have participants, consequences, and properties, and that may be modeled in terms of information structures. The conceptual event model is used to characterize a variety of event concepts and it is used to illustrate how events can be used to integrate dynamic modeling of processes and static modeling of information structures. The results are unique in the sense that no other general event concept has been used to unify a similar broad variety of seemingly incompatible event concepts. The general event concept can be used...

  12. Event-Based Conceptual Modeling

    DEFF Research Database (Denmark)

    Bækgaard, Lars

    The paper demonstrates that a wide variety of event-based modeling approaches are based on special cases of the same general event concept, and that the general event concept can be used to unify the otherwise unrelated fields of information modeling and process modeling. A set of event-based modeling approaches are analyzed and the results are used to formulate a general event concept that can be used for unifying the seemingly unrelated event concepts. Events are characterized as short-duration processes that have participants, consequences, and properties, and that may be modeled in terms of information structures. The general event concept can be used to guide systems analysis and design and to improve modeling approaches.

  13. Event-scale power law recession analysis: quantifying methodological uncertainty

    Science.gov (United States)

    Dralle, David N.; Karst, Nathaniel J.; Charalampous, Kyriakos; Veenstra, Andrew; Thompson, Sally E.

    2017-01-01

    The study of single streamflow recession events is receiving increasing attention following the presentation of novel theoretical explanations for the emergence of power law forms of the recession relationship, and drivers of its variability. Individually characterizing streamflow recessions often involves describing the similarities and differences between model parameters fitted to each recession time series. Significant methodological sensitivity has been identified in the fitting and parameterization of models that describe populations of many recessions, but the dependence of estimated model parameters on methodological choices has not been evaluated for event-by-event forms of analysis. Here, we use daily streamflow data from 16 catchments in northern California and southern Oregon to investigate how combinations of commonly used streamflow recession definitions and fitting techniques impact parameter estimates of a widely used power law recession model. Results are relevant to watersheds that are relatively steep, forested, and rain-dominated. The highly seasonal mediterranean climate of northern California and southern Oregon ensures study catchments explore a wide range of recession behaviors and wetness states, ideal for a sensitivity analysis. In such catchments, we show the following: (i) methodological decisions, including ones that have received little attention in the literature, can impact parameter value estimates and model goodness of fit; (ii) the central tendencies of event-scale recession parameter probability distributions are largely robust to methodological choices, in the sense that differing methods rank catchments similarly according to the medians of these distributions; (iii) recession parameter distributions are method-dependent, but roughly catchment-independent, such that changing the choices made about a particular method affects a given parameter in similar ways across most catchments; and (iv) the observed correlative relationship
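
    For context, here is a minimal sketch of one common way to fit the power-law recession model -dQ/dt = a*Q^b to a single event via log-log regression. The fitting and event-definition choices shown (midpoint discharge, ordinary least squares, daily differencing) are exactly the kind of methodological decisions whose influence the paper quantifies, and this particular combination is only an assumed example, not the authors' preferred method.

```python
import numpy as np

def fit_recession(q, dt=1.0):
    """Fit the power-law recession model -dQ/dt = a * Q**b to one event
    by ordinary least squares in log-log space (one of several possible
    fitting choices)."""
    dqdt = -np.diff(q) / dt
    qm = 0.5 * (q[1:] + q[:-1])            # midpoint discharge
    mask = (dqdt > 0) & (qm > 0)           # keep strictly receding steps only
    b, log_a = np.polyfit(np.log(qm[mask]), np.log(dqdt[mask]), 1)
    return np.exp(log_a), b

# Example: a synthetic 20-day exponential recession limb (mm/day)
q = 10.0 * np.exp(-0.15 * np.arange(20))
a, b = fit_recession(q)
print(a, b)   # an exponential recession corresponds to b close to 1
```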

  14. Extreme Events in China under Climate Change: Uncertainty and related impacts (CSSP-FOREX)

    Science.gov (United States)

    Leckebusch, Gregor C.; Befort, Daniel J.; Hodges, Kevin I.

    2016-04-01

    Suitable adaptation strategies or the timely initiation of related mitigation efforts in East Asia will strongly depend on robust and comprehensive information about future near-term as well as long-term potential changes in the climate system. Therefore, understanding the driving mechanisms associated with the East Asian climate is of major importance. The FOREX project (Fostering Regional Decision Making by the Assessment of Uncertainties of Future Regional Extremes and their Linkage to Global Climate System Variability for China and East Asia) focuses on the investigation of extreme wind and rainfall related events over Eastern Asia and their possible future changes. Here, analyses focus on the link between local extreme events and their driving weather systems. This includes the coupling between local rainfall extremes and tropical cyclones, the Meiyu frontal system, extra-tropical teleconnections and monsoonal activity. Furthermore, the relation between these driving weather systems and large-scale variability modes (e.g. NAO, PDO, ENSO) is analysed. Thus, besides analysing future changes of local extreme events, the temporal variability of their driving weather systems and related large-scale variability modes will be assessed in current CMIP5 global model simulations to obtain more robust results. Beyond an overview of FOREX itself, first results regarding the link between local extremes and their steering weather systems based on observational and reanalysis data are shown. Special focus is laid on the contribution of monsoonal activity, tropical cyclones and the Meiyu frontal system to the inter-annual variability of the East Asian summer rainfall.

  15. Defining Uncertainty : A Conceptual Basis for Uncertainty Management in Model-Based Decision Support

    NARCIS (Netherlands)

    Walker, W.E.; Harremoës, P.; Rotmans, J.; Van der Sluijs, J.P.; Van Asselt, M.B.A.; Janssen, P.; Krayer von Krauss, M.P.

    2003-01-01

    The aim of this paper is to provide a conceptual basis for the systematic treatment of uncertainty in model-based decision support activities such as policy analysis, integrated assessment and risk assessment. It focuses on the uncertainty perceived from the point of view of those providing

  16. Uncertainties.

    Science.gov (United States)

    Dalla Chiara, Maria Luisa

    2010-09-01

    In contemporary science uncertainty is often represented as an intrinsic feature of natural and of human phenomena. As an example we need only think of two important conceptual revolutions that occurred in physics and logic during the first half of the twentieth century: (1) the discovery of Heisenberg's uncertainty principle in quantum mechanics; (2) the emergence of many-valued logical reasoning, which gave rise to so-called 'fuzzy thinking'. I discuss the possibility of applying the notions of uncertainty, developed in the framework of quantum mechanics, quantum information and fuzzy logics, to some problems of political and social sciences.

  17. Testing and Development of the Onsite Earthquake Early Warning Algorithm to Reduce Event Uncertainties

    Science.gov (United States)

    Andrews, J. R.; Cochran, E. S.; Hauksson, E.; Felizardo, C.; Liu, T.; Ross, Z.; Heaton, T. H.

    2015-12-01

    Primary metrics for measuring earthquake early warning (EEW) system and algorithm performance are the rate of false alarms and the uncertainty in earthquake parameters. The Onsite algorithm, currently one of three EEW algorithms implemented in ShakeAlert, uses the ground-motion period parameter (τc) and peak initial displacement parameter (Pd) to estimate the magnitude and expected ground shaking of an ongoing earthquake. It is the only algorithm originally designed to issue single station alerts, necessitating that results from individual stations be as reliable and accurate as possible. The ShakeAlert system has been undergoing testing on continuous real-time data in California for several years, and the latest version of the Onsite algorithm for several months. This permits analysis of the response to a range of signals, from environmental noise to hardware testing and maintenance procedures to moderate or large earthquake signals at varying distances from the networks. We find that our existing discriminator, relying only on τc and Pd, while performing well to exclude large teleseismic events, is less effective for moderate regional events and can also incorrectly exclude data from local events. Motivated by these experiences, we use a collection of waveforms from potentially problematic 'noise' events and real earthquakes to explore methods to discriminate real and false events, using the ground motion and period parameters available in Onsite's processing methodology. Once an event is correctly identified, a magnitude and location estimate is critical to determining the expected ground shaking. Scatter in the measured parameters translates to higher than desired uncertainty in Onsite's current calculations. We present an overview of alternative methods, including incorporation of polarization information, to improve parameter determination for a test suite including both large (M4 to M7) events and three years of small to moderate events across California.
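
    A simplified sketch of how the two Onsite parameters are commonly defined in the EEW literature: τc from the ratio of integrated squared velocity to integrated squared displacement over the first few seconds after the P pick, and Pd as the peak displacement in that window. Real processing (high-pass filtering, pick alignment, unit handling) is omitted, and this is not the ShakeAlert implementation.

```python
import numpy as np

def tau_c_and_pd(displacement, dt, window=3.0):
    """Compute the period parameter tau_c and peak initial displacement Pd
    from a P-wave displacement record, using the first `window` seconds
    after the pick (simplified; real processing high-pass filters the data)."""
    n = int(window / dt)
    u = displacement[:n]
    v = np.gradient(u, dt)                 # velocity by numerical differentiation
    r = np.sum(v**2) / np.sum(u**2)        # dt cancels in the ratio of integrals
    tau_c = 2.0 * np.pi / np.sqrt(r)
    pd = np.max(np.abs(u))
    return tau_c, pd

# Synthetic example: a decaying 1 Hz wavelet sampled at 100 Hz
t = np.arange(0, 3.0, 0.01)
u = 1e-4 * np.sin(2 * np.pi * 1.0 * t) * np.exp(-t)
print(tau_c_and_pd(u, 0.01))
```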

  18. Host Event Based Network Monitoring

    Energy Technology Data Exchange (ETDEWEB)

    Jonathan Chugg

    2013-01-01

    The purpose of INL’s research on this project is to demonstrate the feasibility of a host event based network monitoring tool and the effects on host performance. Current host based network monitoring tools work on polling which can miss activity if it occurs between polls. Instead of polling, a tool could be developed that makes use of event APIs in the operating system to receive asynchronous notifications of network activity. Analysis and logging of these events will allow the tool to construct the complete real-time and historical network configuration of the host while the tool is running. This research focused on three major operating systems commonly used by SCADA systems: Linux, WindowsXP, and Windows7. Windows 7 offers two paths that have minimal impact on the system and should be seriously considered. First is the new Windows Event Logging API, and, second, Windows 7 offers the ALE API within WFP. Any future work should focus on these methods.

  19. State-based Event Detection Optimization for Complex Event Processing

    Directory of Open Access Journals (Sweden)

    Shanglian PENG

    2014-02-01

    Full Text Available Detection of patterns in high-speed, large-volume event streams has been an important paradigm in many application areas of Complex Event Processing (CEP), including security monitoring, financial markets analysis and health-care monitoring. To assure real-time responsive complex pattern detection over high volume and speed event streams, efficient event detection techniques have to be designed. Unfortunately, evaluation of the Nondeterministic Finite Automaton (NFA) based event detection model mainly considers a single event query and its optimization. In this paper, we propose the evaluation of multiple event queries on event streams. In particular, we consider a scalable multiple event detection model that shares NFA transfer states of different event queries. Each event query is parsed into an NFA and the states of the NFA are partitioned into different units. With this partition, the same individual state of an NFA is run on different processing nodes, providing state sharing and reducing partial-match maintenance. We compare our state-based approach with Stream-based And Shared Event processing (SASE). Our experiments demonstrate that the state-based approach outperforms SASE both on CPU time usage and memory consumption.
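
    To make the NFA-based detection model concrete, here is a minimal, illustrative pattern matcher over an event stream. It implements simple skip-till-next-match semantics only; it does not reproduce the paper's state partitioning across processing nodes or the SASE engine.

```python
from dataclasses import dataclass

@dataclass
class Event:
    kind: str
    value: float

def detect(pattern, stream):
    """Minimal NFA-style detection of an ordered pattern of predicates over an
    event stream. Each active 'run' is a partial match (state index, matched
    events); non-matching events are skipped (skip-till-next-match)."""
    runs = []
    matches = []
    for ev in stream:
        next_runs = []
        for state, acc in runs + [(0, [])]:        # existing runs plus a fresh start
            if pattern[state](ev):
                acc2 = acc + [ev]
                if state + 1 == len(pattern):
                    matches.append(acc2)           # full match completed
                else:
                    next_runs.append((state + 1, acc2))
            elif acc:
                next_runs.append((state, acc))     # keep started runs alive
        runs = next_runs
    return matches

# Pattern: a price below 10 followed later by a price above 20 (hypothetical query)
pattern = [lambda e: e.kind == "price" and e.value < 10,
           lambda e: e.kind == "price" and e.value > 20]
stream = [Event("price", 12), Event("price", 8), Event("price", 15), Event("price", 22)]
print(detect(pattern, stream))
```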

  20. Reservoir Sedimentation Based on Uncertainty Analysis

    Directory of Open Access Journals (Sweden)

    Farhad Imanshoar

    2014-01-01

    Full Text Available Reservoir sedimentation can result in loss of much needed reservoir storage capacity, reducing the useful life of dams. Thus, sufficient sediment storage capacity should be provided for the reservoir design stage to ensure that sediment accumulation will not impair the functioning of the reservoir during the useful operational-economic life of the project. However, an important issue to consider when estimating reservoir sedimentation and accumulation is the uncertainty involved in reservoir sedimentation. In this paper, the basic factors influencing the density of sediments deposited in reservoirs are discussed, and uncertainties in reservoir sedimentation have been determined using the Delta method. Further, Kenny Reservoir in the White River Basin in northwestern Colorado was selected to determine the density of deposits in the reservoir and the coefficient of variation. The results of this investigation have indicated that by using the Delta method in the case of Kenny Reservoir, the uncertainty regarding accumulated sediment density, expressed by the coefficient of variation for a period of 50 years of reservoir operation, could be reduced to about 10%. Results of the Delta method suggest an applicable approach for dead storage planning via interfacing with uncertainties associated with reservoir sedimentation.
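
    A small sketch of the Delta (first-order propagation) method the abstract applies to deposited-sediment density: propagate an input covariance matrix through a numerically estimated gradient and report the coefficient of variation. The density relation and all numbers below are placeholders, not the Kenny Reservoir values.

```python
import numpy as np

def delta_method_cv(f, x, cov, eps=1e-6):
    """First-order (Delta method) coefficient of variation of f(x), given the
    covariance matrix of the inputs x; the gradient is estimated numerically."""
    x = np.asarray(x, dtype=float)
    grad = np.zeros_like(x)
    for i in range(len(x)):
        dx = np.zeros_like(x)
        dx[i] = eps * max(1.0, abs(x[i]))
        grad[i] = (f(x + dx) - f(x - dx)) / (2 * dx[i])
    var = grad @ cov @ grad
    return np.sqrt(var) / f(x)

# Illustrative deposit-density relation: unit weights of clay/silt/sand fractions
# (placeholder coefficients in kg/m^3, not the values used for Kenny Reservoir)
def density(x):
    p_clay, p_silt, p_sand = x
    return 416 * p_clay + 1120 * p_silt + 1550 * p_sand

x0 = [0.2, 0.3, 0.5]                       # assumed mean grain-size fractions
cov = np.diag([0.02, 0.03, 0.03]) ** 2     # assumed input variances
print(delta_method_cv(density, x0, cov))   # coefficient of variation of density
```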

  1. Contributions to Physics-Based Aeroservoelastic Uncertainty Analysis

    Science.gov (United States)

    Wu, Sang

    The thesis presents the development of a new fully-integrated, MATLAB based simulation capability for aeroservoelastic (ASE) uncertainty analysis that accounts for uncertainties in all disciplines as well as discipline interactions. This new capability allows probabilistic studies of complex configuration at a scope and with depth not known before. Several statistical tools and methods have been integrated into the capability to guide the tasks such as parameter prioritization, uncertainty reduction, and risk mitigation. (Abstract shortened by ProQuest.).

  2. Measuring the Higgs boson mass using event-by-event uncertainties

    NARCIS (Netherlands)

    Castelli, A.

    2015-01-01

    The thesis presents a measurement of the properties of the Higgs particle, performed by using the data collected by the ATLAS experiment in 2011 and 2012. The measurement is performed by using a three-dimensional model based on analytic functions to describe the signal produced by the Higgs boson

  3. Efficient Location Uncertainty Treatment for Probabilistic Modelling of Portfolio Loss from Earthquake Events

    Science.gov (United States)

    Scheingraber, Christoph; Käser, Martin; Allmann, Alexander

    2017-04-01

    Probabilistic seismic risk analysis (PSRA) is a well-established method for modelling loss from earthquake events. In the insurance industry, it is widely employed for probabilistic modelling of loss to a distributed portfolio. In this context, precise exposure locations are often unknown, which results in considerable loss uncertainty. The treatment of exposure uncertainty has already been identified as an area where PSRA would benefit from increased research attention. However, epistemic location uncertainty has so far received relatively little research attention. We propose a new framework for efficient treatment of location uncertainty. To demonstrate the usefulness of this novel method, a large number of synthetic portfolios resembling real-world portfolios is systematically analyzed. We investigate the effect of portfolio characteristics such as value distribution, portfolio size, or proportion of risk items with unknown coordinates on loss variability. Several sampling criteria to increase the computational efficiency of the framework are proposed and put into the wider context of well-established Monte-Carlo variance reduction techniques. The performance of each of the proposed criteria is analyzed.

  4. Heavy precipitation events in the Mediterranean: sensitivity to cloud physics parameterisation uncertainties

    Directory of Open Access Journals (Sweden)

    S. Fresnay

    2012-08-01

    Full Text Available In autumn, southeastern France is often affected by heavy precipitation events which may result in damaging flash-floods. The 20 October and 1 November 2008 are two archetypes of the meteorological situations under which these events occur: an upper-level trough directing a warm and moist flow from the Mediterranean towards the Cévennes ridge or a quasi stationary meso-scale convective complex developing over the Rhone valley. These two types of events exhibit a contrasting level of predictability; the former being usually better forecast than the latter. Control experiments performed with the Meso-NH model run with a 2.5 km resolution confirm these predictability issues. The deterministic forecast of the November case (Cévennes ridge) is found to be much more skilful than the one for the October case (Rhone valley). These two contrasting situations are used to investigate the sensitivity of the model to cloud physics parameterisation uncertainties. Three 9-member ensembles are constructed. In the first one, the rain distribution intercept parameter is varied within its range of allowed values. In the second one, random perturbations are applied to the rain evaporation rate, whereas in the third one, random perturbations are simultaneously applied to the cloud autoconversion, rain accretion, and rain evaporation rates. Results are assessed by comparing the time and space distribution of the observed and forecasted precipitation. For the Rhone valley case, it is shown that not one of the ensembles is able to drastically improve the skill of the forecast. Taylor diagrams indicate that the microphysical perturbations are more efficient in modulating the rainfall intensities than in altering their localization. Among the three ensembles, the multi-process perturbation ensemble is found to yield the largest spread for most parameters. In contrast, the results of the Cévennes case exhibit almost no sensitivity to the microphysical perturbations

  5. Reservoir Sedimentation Based on Uncertainty Analysis

    OpenAIRE

    Farhad Imanshoar; Afshin Jahangirzadeh; Hossein Basser; Shatirah Akib; Babak Kamali; Tabatabaei, Mohammad Reza M.; Masoud Kakouei

    2013-01-01

    Reservoir sedimentation can result in loss of much needed reservoir storage capacity, reducing the useful life of dams. Thus, sufficient sediment storage capacity should be provided for the reservoir design stage to ensure that sediment accumulation will not impair the functioning of the reservoir during the useful operational-economic life of the project. However, an important issue to consider when estimating reservoir sedimentation and accumulation is the uncertainty involved in reservoir ...

  6. Uncertainty in Vs30-based site response

    Science.gov (United States)

    Thompson, Eric M.; Wald, David J.

    2016-01-01

    Methods that account for site response range in complexity from simple linear categorical adjustment factors to sophisticated nonlinear constitutive models. Seismic‐hazard analysis usually relies on ground‐motion prediction equations (GMPEs); within this framework site response is modeled statistically with simplified site parameters that include the time‐averaged shear‐wave velocity to 30 m (VS30) and basin depth parameters. Because VS30 is not known in most locations, it must be interpolated or inferred through secondary information such as geology or topography. In this article, we analyze a subset of stations for which VS30 has been measured to address effects of VS30 proxies on the uncertainty in the ground motions as modeled by GMPEs. The stations we analyze also include multiple recordings, which allow us to compute the repeatable site effects (or empirical amplification factors [EAFs]) from the ground motions. Although all methods exhibit similar bias, the proxy methods only reduce the ground‐motion standard deviations at long periods when compared to GMPEs without a site term, whereas measured VS30 values reduce the standard deviations at all periods. The standard deviations of the ground motions are much lower when the EAFs are used, indicating that future refinements of the site term in GMPEs have the potential to substantially reduce the overall uncertainty in the prediction of ground motions by GMPEs.

  7. A new structural reliability index based on uncertainty theory

    Directory of Open Access Journals (Sweden)

    Pidong WANG

    2017-08-01

    Full Text Available The classical probabilistic reliability theory and fuzzy reliability theory cannot directly measure the uncertainty of structural reliability with uncertain variables, i.e., subjective random and fuzzy variables. In order to simultaneously satisfy the duality of randomness and subadditivity of fuzziness in the reliability problem, a new quantification method for the reliability of structures is presented based on uncertainty theory, and an uncertainty-theory-based perspective of classical Cornell reliability index is explored. In this paper, by introducing the uncertainty theory, we adopt the uncertain measure to quantify the reliability of structures for the subjective probability or fuzzy variables, instead of probabilistic and possibilistic measures. We utilize uncertain variables to uniformly represent the subjective random and fuzzy parameters, based on which we derive solutions to analyze the uncertainty reliability of structures with uncertainty distributions. Moreover, we propose the Cornell uncertainty reliability index based on the uncertain expected value and variance. Experimental results on three numerical applications demonstrate the validity of the proposed method.
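
    For reference, the classical Cornell reliability index that the paper reinterprets under uncertainty theory; this sketch uses ordinary probabilistic moments for an R - S limit state and does not implement the uncertain-measure version proposed in the paper. The numbers are hypothetical.

```python
import math

def cornell_index(mu_r, sigma_r, mu_s, sigma_s):
    """Classical Cornell reliability index for the limit state g = R - S with
    uncorrelated resistance R and load S: beta = E[g] / std[g]. The paper
    replaces these probabilistic moments with the expected value and variance
    defined under uncertainty theory, which this sketch does not reproduce."""
    mu_g = mu_r - mu_s
    sigma_g = math.sqrt(sigma_r**2 + sigma_s**2)
    return mu_g / sigma_g

print(cornell_index(mu_r=300.0, sigma_r=30.0, mu_s=200.0, sigma_s=25.0))
```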

  8. Uncertainties of yeast-based biofuel cell operational characteristics

    Energy Technology Data Exchange (ETDEWEB)

    Babanova, S.; Mitov, M.; Mandjukov, P. [Department of Chemistry, South-West University, 66 Ivan Mihailov str., 2700 Blagoevgrad (Bulgaria); Hubenova, Y. [Department of Biochemistry and Microbiology, Plovdiv University, 24 Tsar Asen str., 4000 Plovdiv (Bulgaria)

    2011-12-15

    The commonly used parameters characterizing fuel cells and in particular microbial fuel cells (MFCs) electrical performance are open circuit voltage (OCV), maximum power, and short circuit current. These characteristics are usually obtained from polarization and power curves. In the present study, the expanded uncertainties of operational characteristics for a yeast-based fuel cell were evaluated and the main sources of uncertainty were determined. Two approaches were used: the uncertainty budget building for sources uncertainty estimation and a statistical treatment of identical MFCs results for operational characteristics uncertainty calculation. It was found that in this particular bioelectrochemical system the major factor contributing to operational characteristics uncertainties was the electrodes' resistance. The operational characteristics uncertainties were decreased from 19 to 13% for OCV, from 42 to 14% for maximal power, and from 46 to 13% for short circuit current with the usage of electrodes with resistance in the interval 6-7 Ω. The described approaches can be used for operational characteristics expanded uncertainties calculation of all types of fuel cells using data from polarization measurements. (Copyright 2011 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim)
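
    A brief sketch of the statistical treatment described in the abstract: a Type A standard uncertainty of the mean from measurements on nominally identical cells, expanded with a coverage factor. The OCV values below are made up for illustration and are not the study's data.

```python
import numpy as np

def expanded_uncertainty(values, k=2.0):
    """Type A evaluation: standard uncertainty of the mean from repeated
    measurements on nominally identical cells, expanded with coverage
    factor k (k = 2 for roughly 95 % coverage)."""
    values = np.asarray(values, dtype=float)
    mean = values.mean()
    u = values.std(ddof=1) / np.sqrt(len(values))
    return mean, k * u, 100 * k * u / mean      # mean, U, relative U in %

ocv_mV = [410, 395, 428, 402, 417]              # OCV of five "identical" MFCs (illustrative)
print(expanded_uncertainty(ocv_mV))
```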

  9. Development of a Prototype Model-Form Uncertainty Knowledge Base

    Science.gov (United States)

    Green, Lawrence L.

    2016-01-01

    Uncertainties are generally classified as either aleatory or epistemic. Aleatory uncertainties are those attributed to random variation, either naturally or through manufacturing processes. Epistemic uncertainties are generally attributed to a lack of knowledge. One type of epistemic uncertainty is called model-form uncertainty. The term model-form means that among the choices to be made during a design process within an analysis, there are different forms of the analysis process, which each give different results for the same configuration at the same flight conditions. Examples of model-form uncertainties include the grid density, grid type, and solver type used within a computational fluid dynamics code, or the choice of the number and type of model elements within a structures analysis. The objectives of this work are to identify and quantify a representative set of model-form uncertainties and to make this information available to designers through an interactive knowledge base (KB). The KB can then be used during probabilistic design sessions, so as to enable the possible reduction of uncertainties in the design process through resource investment. An extensive literature search has been conducted to identify and quantify typical model-form uncertainties present within aerospace design. An initial attempt has been made to assemble the results of this literature search into a searchable KB, usable in real time during probabilistic design sessions. A concept of operations and the basic structure of a model-form uncertainty KB are described. Key operations within the KB are illustrated. Current limitations in the KB, and possible workarounds are explained.

  10. Structural Design Methodology Based on Concepts of Uncertainty

    Science.gov (United States)

    Lin, K. Y.; Du, Jiaji; Rusk, David

    2000-01-01

    In this report, an approach to damage-tolerant aircraft structural design is proposed based on the concept of an equivalent "Level of Safety" that incorporates past service experience in the design of new structures. The discrete "Level of Safety" for a single inspection event is defined as the complement of the probability that a single flaw size larger than the critical flaw size for residual strength of the structure exists, and that the flaw will not be detected. The cumulative "Level of Safety" for the entire structure is the product of the discrete "Level of Safety" values for each flaw of each damage type present at each location in the structure. Based on the definition of "Level of Safety", a design procedure was identified and demonstrated on a composite sandwich panel for various damage types, with results showing the sensitivity of the structural sizing parameters to the relative safety of the design. The "Level of Safety" approach has broad potential application to damage-tolerant aircraft structural design with uncertainty.
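
    The definition quoted in the abstract translates directly into a short calculation: the discrete Level of Safety is the complement of the joint probability that a critical flaw exists and goes undetected, and the cumulative value is the product over all flaws, damage types, and locations. The probability pairs below are hypothetical.

```python
def discrete_level_of_safety(p_flaw_exceeds_critical, p_missed_detection):
    """LOS for a single inspection event: the complement of the probability
    that a flaw larger than the critical size exists AND goes undetected."""
    return 1.0 - p_flaw_exceeds_critical * p_missed_detection

def cumulative_level_of_safety(events):
    """Cumulative LOS: product of the discrete values over all flaws,
    damage types, and locations, as defined in the abstract."""
    los = 1.0
    for p_exceed, p_miss in events:
        los *= discrete_level_of_safety(p_exceed, p_miss)
    return los

# Three hypothetical (flaw-exceedance, missed-detection) probability pairs
events = [(1e-3, 0.2), (5e-4, 0.35), (2e-3, 0.1)]
print(cumulative_level_of_safety(events))
```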

  11. Uncertainty Of Stream Nutrient Transport Estimates Using Random Sampling Of Storm Events From High Resolution Water Quality And Discharge Data

    Science.gov (United States)

    Scholefield, P. A.; Arnscheidt, J.; Jordan, P.; Beven, K.; Heathwaite, L.

    2007-12-01

    The uncertainties associated with stream nutrient transport estimates are frequently overlooked and the sampling strategy is rarely, if ever, investigated. Indeed, the impact of sampling strategy and estimation method on the bias and precision of stream phosphorus (P) transport calculations is little understood despite the use of such values in the calibration and testing of models of phosphorus transport. The objectives of this research were to investigate the variability and uncertainty in the estimates of total phosphorus transfers at an intensively monitored agricultural catchment. The Oona Water, which is located in the Irish border region, is part of a long-term monitoring program focusing on water quality. The Oona Water is a rural river catchment with grassland agriculture and scattered dwelling houses and has been monitored for total phosphorus (TP) at 10 min resolution for several years (Jordan et al, 2007). Concurrent sensitive measurements of discharge are also collected. The water quality and discharge data were provided at 1 hour resolution (averaged) and this meant that a robust estimate of the annual flow weighted concentration could be obtained by simple interpolation between points. A two-strata approach (Kronvang and Bruhn, 1996) was used to estimate flow weighted concentrations using randomly sampled storm events from the 400 identified within the time series and also base flow concentrations. Using a random stratified sampling approach for the selection of events, a series ranging from 10 through to the full 400 were used, each time generating a flow weighted mean using a load-discharge relationship identified through log-log regression and Monte Carlo simulation. These values were then compared to the observed total phosphorus concentration for the catchment. Analysis of these results shows the impact of sampling strategy, the inherent bias in any estimate of phosphorus concentrations and the uncertainty associated with such estimates. The
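
    A condensed sketch of a two-strata, randomly sampled storm-event estimate of the flow-weighted mean concentration, with a log-log load-discharge rating fitted to the sampled storms. The synthetic series, the 20-event split, and the treatment of baseflow samples are simplifying assumptions for illustration, not the Oona Water data or the authors' exact procedure.

```python
import numpy as np

rng = np.random.default_rng(1)

def flow_weighted_mean(conc, q):
    """Flow-weighted mean concentration = total load / total flow."""
    return np.sum(conc * q) / np.sum(q)

def two_strata_estimate(q, conc, storm_mask, event_ids, n_events):
    """Estimate the flow-weighted mean from baseflow samples plus a random
    subset of storm events, using a log-log load-discharge rating fitted to
    the sampled storms (simplified: baseflow concentrations taken as observed)."""
    chosen = rng.choice(np.unique(event_ids[storm_mask]), size=n_events, replace=False)
    sampled = storm_mask & np.isin(event_ids, chosen)
    b, log_a = np.polyfit(np.log(q[sampled]), np.log(conc[sampled] * q[sampled]), 1)
    rated_conc = np.exp(log_a) * q ** (b - 1.0)      # load/Q from the rating curve
    est_conc = np.where(storm_mask, rated_conc, conc)
    return flow_weighted_mean(est_conc, q)

# Synthetic hourly discharge / TP series split into 20 "storm events"
n = 2000
q = np.exp(rng.normal(0.0, 1.0, n)) + 0.5
event_ids = np.repeat(np.arange(20), n // 20)
storm_mask = q > np.quantile(q, 0.8)
conc = 0.05 * q ** 0.6 * np.exp(rng.normal(0.0, 0.3, n))

for k in (5, 10, 20):                                # vary the number of sampled events
    print(k, two_strata_estimate(q, conc, storm_mask, event_ids, k))
print("reference:", flow_weighted_mean(conc, q))
```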

  12. Uncertainty visualization in HARDI based on ensembles of ODFs

    KAUST Repository

    Jiao, Fangxiang

    2012-02-01

    In this paper, we propose a new and accurate technique for uncertainty analysis and uncertainty visualization based on fiber orientation distribution function (ODF) glyphs, associated with high angular resolution diffusion imaging (HARDI). Our visualization applies volume rendering techniques to an ensemble of 3D ODF glyphs, which we call SIP functions of diffusion shapes, to capture their variability due to underlying uncertainty. This rendering elucidates the complex heteroscedastic structural variation in these shapes. Furthermore, we quantify the extent of this variation by measuring the fraction of the volume of these shapes, which is consistent across all noise levels, the certain volume ratio. Our uncertainty analysis and visualization framework is then applied to synthetic data, as well as to HARDI human-brain data, to study the impact of various image acquisition parameters and background noise levels on the diffusion shapes. © 2012 IEEE.

  13. Uncertainty estimation for map-based analyses

    Science.gov (United States)

    Ronald E. McRoberts; Mark A. Hatfield; Susan J. Crocker

    2010-01-01

    Traditionally, natural resource managers have asked the question, “How much?” and have received sample-based estimates of resource totals or means. Increasingly, however, the same managers are now asking the additional question, “Where?” and are expecting spatially explicit answers in the form of maps. Recent development of natural resource databases, access to...

  14. Reliability-Based Multidisciplinary Design Optimization under Correlated Uncertainties

    Directory of Open Access Journals (Sweden)

    Huanwei Xu

    2017-01-01

    Full Text Available A complex mechanical system is usually composed of several subsystems, which are often coupled with each other. Reliability-based multidisciplinary design optimization (RBMDO) is an efficient method to design such complex systems under uncertainties. However, present RBMDO methods ignore the correlations between uncertainties. In this paper, by combining the ellipsoidal set theory and the first-order reliability method (FORM) for multidisciplinary design optimization (MDO), characteristics of correlated uncertainties are investigated. Furthermore, to improve computational efficiency, the sequential optimization and reliability assessment (SORA) strategy is utilized to obtain the optimization result. Both a mathematical example and a case study of an engineering system are provided to illustrate the feasibility and validity of the proposed method.

  15. Design of crusher liner based on time - varying uncertainty theory

    Science.gov (United States)

    Tang, J. C.; Shi, B. Q.; Yu, H. J.; Wang, R. J.; Zhang, W. Y.

    2017-09-01

    This article puts forward a time-dependent design method for the liner that considers load fluctuation factors, based on time-varying uncertainty theory. In this method, the time-varying uncertainty design model of the liner is constructed by introducing the parameters that affect the wear rate, the volatility and the drift rate. Based on the design example, the time-varying design outline of the moving cone liner is obtained. Based on the theory of minimum wear, the gap curve of the wear-resistant cavity is designed, and the optimized cavity is obtained by combining the thickness of the cone and the cavity gap. Taking the PYGB1821 multi-cylinder hydraulic cone crusher as an example, it is shown that the service life of the new liner is improved by more than 14.3%.

  16. Problems in event based engine control

    DEFF Research Database (Denmark)

    Hendricks, Elbert; Jensen, Michael; Chevalier, Alain Marie Roger

    1994-01-01

    Physically a four cycle spark ignition engine operates on the basis of four engine processes or events: intake, compression, ignition (or expansion) and exhaust. These events each occupy approximately 180° of crank angle. In conventional engine controllers, it is an accepted practice to sample the engine variables synchronously with these events (or submultiples of them). Such engine controllers are often called event-based systems. Unfortunately the main system noise (or disturbance) is also synchronous with the engine events: the engine pumping fluctuations. Since many electronic engine ... problems on accurate air/fuel ratio control of a spark ignition (SI) engine.

  17. Reducing uncertainty in Climate Response Time Scale by Bayesian Analysis of the 8.2 ka event

    Science.gov (United States)

    Lorenz, A.; Held, H.; Bauer, E.; Schneider von Deimling, T.

    2009-04-01

    We analyze the possibility of uncertainty reduction in Climate Response Time Scale by utilizing Greenland ice-core data that contain the 8.2 ka event within a Bayesian model-data intercomparison with the Earth system model of intermediate complexity CLIMBER-2.3. Within a stochastic version of the model it has been possible to mimic the 8.2 ka event within a plausible experimental setting and with relatively good accuracy considering the timing of the event in comparison to other modeling exercises [1]. The simulation of the centennial cold event is effectively determined by the oceanic cooling rate, which depends largely on the ocean diffusivity described by diffusion coefficients with relatively wide uncertainty ranges. The idea is to discriminate between the different values of diffusivities according to their likelihood of rightly representing the duration of the 8.2 ka event, and thus to exploit the paleo data to constrain uncertainty in model parameters in analogy to [2]. In implementing this inverse Bayesian analysis with this model, the technical difficulty arises of establishing the related likelihood numerically in addition to handling the uncertain model parameters: while mainstream uncertainty analyses can assume a quasi-Gaussian shape of the likelihood, with weather fluctuating around a long-term mean, the 8.2 ka event as a highly nonlinear effect precludes such an a priori assumption. As a result of this study [3], the Bayesian analysis showed a reduction of uncertainty in the vertical ocean diffusivity parameters by a factor of 2 compared to prior knowledge. This learning effect on the model parameters is propagated to other model outputs of interest, e.g. the inverse ocean heat capacity, which is important for the dominant time scale of climate response to anthropogenic forcing and which, in combination with climate sensitivity, strongly influences the climate system's reaction over the near- and medium-term future. References: [1] E. Bauer, A. Ganopolski, M. Montoya: Simulation of the

  18. Robustness-Based Design Optimization Under Data Uncertainty

    Science.gov (United States)

    Zaman, Kais; McDonald, Mark; Mahadevan, Sankaran; Green, Lawrence

    2010-01-01

    This paper proposes formulations and algorithms for design optimization under both aleatory (i.e., natural or physical variability) and epistemic uncertainty (i.e., imprecise probabilistic information), from the perspective of system robustness. The proposed formulations deal with epistemic uncertainty arising from both sparse and interval data without any assumption about the probability distributions of the random variables. A decoupled approach is proposed in this paper to un-nest the robustness-based design from the analysis of non-design epistemic variables to achieve computational efficiency. The proposed methods are illustrated for the upper stage design problem of a two-stage-to-orbit (TSTO) vehicle, where the information on the random design inputs is only available as sparse point and/or interval data. As collecting more data reduces uncertainty but increases cost, the effect of sample size on the optimality and robustness of the solution is also studied. A method is developed to determine the optimal sample size for sparse point data that leads to the solutions of the design problem that are least sensitive to variations in the input random variables.

  19. Uncertainty of lateral boundary conditions in a convection-permitting ensemble: a strategy of selection for Mediterranean heavy precipitation events

    Directory of Open Access Journals (Sweden)

    O. Nuissier

    2012-10-01

    Full Text Available This study examines the impact of lateral boundary conditions (LBCs) in convection-permitting (C-P) ensemble simulations with the AROME model driven by the ARPEGE EPS (PEARP). Particular attention is paid to two torrential rainfall episodes, observed on 15–16 June 2010 (the Var case) and 7–8 September 2010 (the Gard-Ardèche case) over the southeastern part of France. Given the substantial computing time required by convection-permitting models, a methodology for selecting a few LBCs dedicated to C-P ensemble simulations of heavy precipitation events is evaluated. Several sensitivity experiments are carried out to evaluate the skill of the AROME ensembles, using different approaches for selection of the driving PEARP members. The convective-scale predictability of the Var case is very low and it is driven primarily by a surface low over the Gulf of Lyon inducing a strong convergent low-level flow, and accordingly advecting strong moisture supply from the Mediterranean Sea toward the flooded area. The Gard-Ardèche case is better handled in ensemble simulations, as a surface cold front that moved slowly eastwards while increasing the low-level water vapour ahead is well reproduced. The selection based on a cluster analysis of the PEARP members generally performs better than a random selection. The consideration of relevant meteorological parameters for the convective events of interest (i.e. geopotential height at 500 hPa and horizontal moisture flux at 925 hPa) refined the cluster analysis. It also helps in better capturing the forecast uncertainty variability, which is spatially more localized at the "high-impact region" due to the selection of more mesoscale parameters.

  20. Uncertainty Representation and Interpretation in Model-based Prognostics Algorithms based on Kalman Filter Estimation

    Data.gov (United States)

    National Aeronautics and Space Administration — This article discusses several aspects of uncertainty representation and management for model-based prognostics methodologies based on our experience with Kalman...

  1. Measurement Uncertainty Analysis of the Strain Gauge Based Stabilographic Platform

    Directory of Open Access Journals (Sweden)

    Walendziuk Wojciech

    2014-08-01

    Full Text Available The present article describes the construction of a stabilographic platform which records a standing patient's deflection from their point of balance. The constructed device is composed of a toughened glass slab supported by 4 force sensors. The force transducers are connected to a measurement system based on a 24-bit ADC, which acquires slight body movements of a patient. The data is then transferred to the computer in real time and data analysis is conducted. The article explains the principle of operation as well as the measurement uncertainty algorithm for the COP (Centre of Pressure) surface (x, y).

  2. On-orbit servicing system assessment and optimization methods based on lifecycle simulation under mixed aleatory and epistemic uncertainties

    Science.gov (United States)

    Yao, Wen; Chen, Xiaoqian; Huang, Yiyong; van Tooren, Michel

    2013-06-01

    To assess the on-orbit servicing (OOS) paradigm and optimize its utilities by taking advantage of its inherent flexibility and responsiveness, the OOS system assessment and optimization methods based on lifecycle simulation under uncertainties are studied. The uncertainty sources considered in this paper include both the aleatory (random launch/OOS operation failure and on-orbit component failure) and the epistemic (the unknown trend of the end-used market price) types. Firstly, the lifecycle simulation under uncertainties is discussed. The chronological flowchart is presented. The cost and benefit models are established, and the uncertainties thereof are modeled. The dynamic programming method to make optimal decision in face of the uncertain events is introduced. Secondly, the method to analyze the propagation effects of the uncertainties on the OOS utilities is studied. With combined probability and evidence theory, a Monte Carlo lifecycle Simulation based Unified Uncertainty Analysis (MCS-UUA) approach is proposed, based on which the OOS utility assessment tool under mixed uncertainties is developed. Thirdly, to further optimize the OOS system under mixed uncertainties, the reliability-based optimization (RBO) method is studied. To alleviate the computational burden of the traditional RBO method which involves nested optimum search and uncertainty analysis, the framework of Sequential Optimization and Mixed Uncertainty Analysis (SOMUA) is employed to integrate MCS-UUA, and the RBO algorithm SOMUA-MCS is developed. Fourthly, a case study on the OOS system for a hypothetical GEO commercial communication satellite is investigated with the proposed assessment tool. Furthermore, the OOS system is optimized with SOMUA-MCS. Lastly, some conclusions are given and future research prospects are highlighted.

  3. Uncertainty and Sensitivity Analyses of a Pebble Bed HTGR Loss of Cooling Event

    Directory of Open Access Journals (Sweden)

    Gerhard Strydom

    2013-01-01

    Full Text Available The Very High Temperature Reactor Methods Development group at the Idaho National Laboratory identified the need for a defensible and systematic uncertainty and sensitivity approach in 2009. This paper summarizes the results of an uncertainty and sensitivity quantification investigation performed with the SUSA code, utilizing the International Atomic Energy Agency CRP 5 Pebble Bed Modular Reactor benchmark and the INL code suite PEBBED-THERMIX. Eight model input parameters were selected for inclusion in this study, and after the input parameter variations and probability density functions were specified, a total of 800 steady state and depressurized loss of forced cooling (DLOFC) transient PEBBED-THERMIX calculations were performed. The six data sets were statistically analyzed to determine the 5% and 95% DLOFC peak fuel temperature tolerance intervals with 95% confidence levels. It was found that the uncertainties in the decay heat and graphite thermal conductivities were the most significant contributors to the propagated DLOFC peak fuel temperature uncertainty. No significant differences were observed between the results of Simple Random Sampling (SRS) or Latin Hypercube Sampling (LHS) data sets, and use of uniform or normal input parameter distributions also did not lead to any significant differences between these data sets.
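
    A sketch of the sampling machinery such a study typically rests on: Latin Hypercube Sampling of the uncertain inputs and a one-sided 95 %/95 % tolerance limit taken as the sample maximum. The tolerance-limit sample size uses the first-order Wilks formula, which the abstract does not name explicitly, and the "model" below is a placeholder surrogate rather than PEBBED-THERMIX.

```python
import numpy as np

rng = np.random.default_rng(2)

def latin_hypercube(n_samples, n_params):
    """Latin Hypercube Sampling on the unit hypercube: one point per
    equal-probability stratum for each parameter, with random pairing."""
    u = np.empty((n_samples, n_params))
    for j in range(n_params):
        u[:, j] = (rng.permutation(n_samples) + rng.random(n_samples)) / n_samples
    return u

def wilks_one_sided_n(coverage=0.95, confidence=0.95):
    """Smallest sample size for which the sample maximum is a one-sided
    (coverage, confidence) tolerance limit (first-order Wilks formula)."""
    return int(np.ceil(np.log(1.0 - confidence) / np.log(coverage)))

n = wilks_one_sided_n()                       # 59 runs for a 95 %/95 % limit
x = latin_hypercube(n, 8)                     # 8 uncertain model inputs, as in the study
peak_temp = 1500 + 200 * x @ rng.random(8)    # placeholder surrogate response (degC)
print(n, np.max(peak_temp))                   # sample maximum taken as the tolerance limit
```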

  4. Advanced Methods for Determining Prediction Uncertainty in Model-Based Prognostics with Application to Planetary Rovers

    Science.gov (United States)

    Daigle, Matthew J.; Sankararaman, Shankar

    2013-01-01

    Prognostics is centered on predicting the time of and time until adverse events in components, subsystems, and systems. It typically involves both a state estimation phase, in which the current health state of a system is identified, and a prediction phase, in which the state is projected forward in time. Since prognostics is mainly a prediction problem, prognostic approaches cannot avoid uncertainty, which arises due to several sources. Prognostics algorithms must both characterize this uncertainty and incorporate it into the predictions so that informed decisions can be made about the system. In this paper, we describe three methods to solve these problems, including Monte Carlo-, unscented transform-, and first-order reliability-based methods. Using a planetary rover as a case study, we demonstrate and compare the different methods in simulation for battery end-of-discharge prediction.
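
    A minimal Monte Carlo end-of-discharge sketch in the spirit of the first method the abstract lists: sample the uncertain state and future load, propagate each sample through a discharge model, and summarize the predicted EOD distribution. The battery model here is deliberately crude and entirely hypothetical, and the unscented-transform and first-order-reliability variants are not shown.

```python
import numpy as np

rng = np.random.default_rng(3)

def simulate_eod(q0, current, v_cutoff=3.0, dt=1.0):
    """Very simplified battery discharge: voltage falls linearly with the
    remaining charge fraction. Returns time (s) until the cutoff voltage.
    Placeholder model, not the rover battery model used in the paper."""
    q, t = q0, 0.0
    while True:
        q -= current * dt
        v = 2.8 + 1.4 * (q / q0)              # crude voltage-vs-charge curve
        t += dt
        if v <= v_cutoff or q <= 0:
            return t

# Monte Carlo prediction: sample uncertain state (remaining charge, coulombs)
# and future load (current draw, amperes), propagate each sample to EOD.
eods = [simulate_eod(q0=rng.normal(7200, 200), current=rng.normal(2.0, 0.2))
        for _ in range(500)]
print(np.mean(eods), np.percentile(eods, [5, 95]))
```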

  5. Sensitivity-Uncertainty Based Nuclear Criticality Safety Validation

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Forrest B. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Univ. of New Mexico, Albuquerque, NM (United States). Nuclear Engineering Dept.

    2016-09-20

    These are slides from a seminar given to the University of Mexico Nuclear Engineering Department. Whisper is a statistical analysis package developed to support nuclear criticality safety validation. It uses the sensitivity profile data for an application as computed by MCNP6 along with covariance files for the nuclear data to determine a baseline upper-subcritical-limit for the application. Whisper and its associated benchmark files are developed and maintained as part of MCNP6, and will be distributed with all future releases of MCNP6. Although sensitivity-uncertainty methods for NCS validation have been under development for 20 years, continuous-energy Monte Carlo codes such as MCNP could not determine the required adjoint-weighted tallies for sensitivity profiles. The recent introduction of the iterated fission probability method into MCNP led to the rapid development of sensitivity analysis capabilities for MCNP6 and the development of Whisper. Sensitivity-uncertainty based methods represent the future for NCS validation – making full use of today’s computer power to codify past approaches based largely on expert judgment. Validation results are defensible, auditable, and repeatable as needed with different assumptions and process models. The new methods can supplement, support, and extend traditional validation approaches.

  6. Active Clustering with Model-Based Uncertainty Reduction.

    Science.gov (United States)

    Xiong, Caiming; Johnson, David M; Corso, Jason J

    2017-01-01

    Semi-supervised clustering seeks to augment traditional clustering methods by incorporating side information provided via human expertise in order to increase the semantic meaningfulness of the resulting clusters. However, most current methods are passive in the sense that the side information is provided beforehand and selected randomly. This may require a large number of constraints, some of which could be redundant, unnecessary, or even detrimental to the clustering results. Thus in order to scale such semi-supervised algorithms to larger problems it is desirable to pursue an active clustering method-i.e., an algorithm that maximizes the effectiveness of the available human labor by only requesting human input where it will have the greatest impact. Here, we propose a novel online framework for active semi-supervised spectral clustering that selects pairwise constraints as clustering proceeds, based on the principle of uncertainty reduction. Using a first-order Taylor expansion, we decompose the expected uncertainty reduction problem into a gradient and a step-scale, computed via an application of matrix perturbation theory and cluster-assignment entropy, respectively. The resulting model is used to estimate the uncertainty reduction potential of each sample in the dataset. We then present the human user with pairwise queries with respect to only the best candidate sample. We evaluate our method using three different image datasets (faces, leaves and dogs), a set of common UCI machine learning datasets and a gene dataset. The results validate our decomposition formulation and show that our method is consistently superior to existing state-of-the-art techniques, as well as being robust to noise and to unknown numbers of clusters.
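
    A much-simplified sketch of the query-selection idea is given below: it scores samples only by their cluster-assignment entropy and asks the user about the most ambiguous one. The full method in the paper additionally uses the matrix-perturbation gradient and step-scale terms, which are omitted here; the array of soft assignments is a hypothetical input.

```python
import numpy as np

def assignment_entropy(soft_assignments):
    """Per-sample entropy of soft cluster assignments (rows sum to 1)."""
    p = np.clip(soft_assignments, 1e-12, 1.0)
    return -(p * np.log(p)).sum(axis=1)

def pick_query_sample(soft_assignments, already_queried):
    """Return the index of the most uncertain, not-yet-queried sample."""
    h = assignment_entropy(soft_assignments)
    h[list(already_queried)] = -np.inf
    return int(np.argmax(h))

# toy example: 4 samples, 2 clusters
probs = np.array([[0.95, 0.05],
                  [0.55, 0.45],
                  [0.20, 0.80],
                  [0.50, 0.50]])
print(pick_query_sample(probs, already_queried=set()))   # -> 3 (most ambiguous)
```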

  7. A terrestrial lidar-based workflow for determining three-dimensional slip vectors and associated uncertainties

    Science.gov (United States)

    Gold, Peter O.; Cowgill, Eric; Kreylos, Oliver; Gold, Ryan D.

    2012-01-01

    Three-dimensional (3D) slip vectors recorded by displaced landforms are difficult to constrain across complex fault zones, and the uncertainties associated with such measurements become increasingly challenging to assess as landforms degrade over time. We approach this problem from a remote sensing perspective by using terrestrial laser scanning (TLS) and 3D structural analysis. We have developed an integrated TLS data collection and point-based analysis workflow that incorporates accurate assessments of aleatoric and epistemic uncertainties using experimental surveys, Monte Carlo simulations, and iterative site reconstructions. Our scanning workflow and equipment requirements are optimized for single-operator surveying, and our data analysis process is largely completed using new point-based computing tools in an immersive 3D virtual reality environment. In a case study, we measured slip vector orientations at two sites along the rupture trace of the 1954 Dixie Valley earthquake (central Nevada, United States), yielding measurements that are the first direct constraints on the 3D slip vector for this event. These observations are consistent with a previous approximation of net extension direction for this event. We find that errors introduced by variables in our survey method result in <2.5 cm of variability in components of displacement, and are eclipsed by the 10–60 cm epistemic errors introduced by reconstructing the field sites to their pre-erosion geometries. Although the higher resolution TLS data sets enabled visualization and data interactivity critical for reconstructing the 3D slip vector and for assessing uncertainties, dense topographic constraints alone were not sufficient to significantly narrow the wide (<26°) range of allowable slip vector orientations that resulted from accounting for epistemic uncertainties.

  8. A HOS-Based Blind Spectrum Sensing in Noise Uncertainty

    Directory of Open Access Journals (Sweden)

    Agus Subekti

    2015-08-01

    Full Text Available Spectrum sensing for cognitive radio is a challenging task since it has to be able to detect the primary signal at a low signal to noise ratio (SNR). At a low SNR, the variance of noise fluctuates due to noise uncertainty. Detection of the primary signal will be difficult, especially for blind spectrum sensing methods that rely on the noise variance for their threshold setting, such as energy detection. Instead of using the energy difference, we propose a spectrum sensing method based on the distribution difference. When the channel is occupied, the received signal, which propagates through a wireless fading channel, will have a non-Gaussian distribution. This is different from the Gaussian noise observed when the channel is vacant. Kurtosis, a higher order statistic (HOS) of the 4th order, is used as a normality test for the test statistic. We measured the detection rate of the proposed method by simulating the detection process. The proposed method’s performance proved superior in detecting a real digital TV signal under noise uncertainty.
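
    A minimal sketch of the kurtosis-based decision rule is shown below: the excess kurtosis of the received samples is compared against a threshold, since Gaussian noise has excess kurtosis near zero while a faded primary signal does not. The threshold value and the non-Gaussian test signal are illustrative choices, not those of the paper.

```python
import numpy as np
from scipy.stats import kurtosis

def channel_occupied(samples, threshold=0.5):
    """Declare the channel occupied when the excess kurtosis of the
    received samples deviates from the Gaussian value of zero.
    The threshold is an illustrative choice."""
    k = kurtosis(samples, fisher=True)   # ~0 for Gaussian noise
    return abs(k) > threshold

rng = np.random.default_rng(3)
noise = rng.normal(0, 1, 4096)                                    # vacant channel
faded = rng.normal(0, 1, 4096) * rng.rayleigh(1.0, 4096)          # crude non-Gaussian stand-in
print(channel_occupied(noise), channel_occupied(faded))           # False, True (typically)
```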

  9. Managing wildfire events: risk-based decision making among a group of federal fire managers

    Science.gov (United States)

    Robyn S. Wilson; Patricia L. Winter; Lynn A. Maguire; Timothy. Ascher

    2011-01-01

    Managing wildfire events to achieve multiple management objectives involves a high degree of decision complexity and uncertainty, increasing the likelihood that decisions will be informed by experience-based heuristics triggered by available cues at the time of the decision. The research reported here tests the prevalence of three risk-based biases among 206...

  10. Using Real-time Event Tracking Sensitivity Analysis to Overcome Sensor Measurement Uncertainties of Geo-Information Management in Drilling Disasters

    Science.gov (United States)

    Tavakoli, S.; Poslad, S.; Fruhwirth, R.; Winter, M.

    2012-04-01

    This paper introduces an application of a novel EventTracker platform for instantaneous Sensitivity Analysis (SA) of large scale real-time geo-information. Earth disaster management systems demand high quality information to aid a quick and timely response to their evolving environments. The idea behind the proposed EventTracker platform is the assumption that modern information management systems are able to capture data in real-time and have the technological flexibility to adjust their services to work with specific sources of data/information. However, to assure this adaptation in real time, the online data should be collected, interpreted, and translated into corrective actions in a concise and timely manner. This can hardly be handled by existing sensitivity analysis methods because they rely on historical data and lazy processing algorithms. In event-driven systems, the effect of system inputs on the system state is of value, as events could cause this state to change. This 'event triggering' situation underpins the logic of the proposed approach. The event tracking sensitivity analysis method describes the system variables and states as a collection of events. The higher the occurrence of an input variable during the triggering of an event, the greater its potential impact will be on the final analysis of the system state. Experiments were designed to compare the proposed event tracking sensitivity analysis with existing Entropy-based sensitivity analysis methods. The results showed a 10% improvement in computational efficiency with no compromise in accuracy. They also showed that the computational time needed to perform the sensitivity analysis is 0.5% of that required by the Entropy-based method. The proposed method has been applied to real world data in the context of preventing emerging crises at drilling rigs. One of the major purposes of such rigs is to drill boreholes to explore oil or gas reservoirs with the final scope of recovering the content
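
    The scoring idea described above (inputs that occur more often around triggered events are ranked as more influential) can be illustrated with the toy co-occurrence count below. This is only a schematic reading of the approach, not the actual EventTracker algorithm; the variable names and the time window are hypothetical.

```python
from collections import Counter

def event_tracking_sensitivity(input_events, state_events, window=2.0):
    """Rank input variables by how often they occur within `window`
    time units before a system-state event (illustrative scoring only)."""
    scores = Counter()
    for t_state in state_events:                 # times of state changes
        for name, t_in in input_events:          # (variable, time) pairs
            if 0.0 <= t_state - t_in <= window:
                scores[name] += 1
    return scores.most_common()

inputs = [("mud_flow", 1.0), ("hook_load", 1.5), ("mud_flow", 3.9), ("rpm", 8.0)]
states = [2.5, 4.5]
print(event_tracking_sensitivity(inputs, states))   # mud_flow co-occurs most often
```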

  11. Facing uncertainty in ecosystem services-based resource management.

    Science.gov (United States)

    Grêt-Regamey, Adrienne; Brunner, Sibyl H; Altwegg, Jürg; Bebi, Peter

    2013-09-01

    The concept of ecosystem services is increasingly used as a support for natural resource management decisions. While the science for assessing ecosystem services is improving, appropriate methods to address uncertainties in a quantitative manner are missing. Ignoring parameter uncertainties, modeling uncertainties and uncertainties related to human-environment interactions can modify decisions and lead to overlooking important management possibilities. In this contribution, we present a new approach for mapping the uncertainties in the assessment of multiple ecosystem services. The spatially explicit risk approach links Bayesian networks to a Geographic Information System for forecasting the value of a bundle of ecosystem services and quantifies the uncertainties related to the outcomes in a spatially explicit manner. We demonstrate that mapping uncertainties in ecosystem services assessments provides key information for decision-makers seeking critical areas in the delivery of ecosystem services in a case study in the Swiss Alps. The results suggest that not only the total value of the bundle of ecosystem services is highly dependent on uncertainties, but the spatial pattern of the ecosystem services values changes substantially when considering uncertainties. This is particularly important for the long-term management of mountain forest ecosystems, which have long rotation stands and are highly sensitive to pressing climate and socio-economic changes. Copyright © 2012 Elsevier Ltd. All rights reserved.

  12. Uncertainty Analysis in MRI-based Polymer Gel Dosimetry.

    Science.gov (United States)

    Keshtkar, M; Zahmatkesh, M H; Montazerabadi, A R

    2017-09-01

    Polymer gel dosimeters combined with magnetic resonance imaging (MRI) can be used for dose verification of advanced radiation therapy techniques. However, the uncertainty of the dose map measured by a gel dosimeter should be known. The purpose of this study is to investigate the uncertainty related to the calibration curve and the MRI protocol for the MAGIC (Methacrylic and Ascorbic acid in Gelatin Initiated by Copper) gel, and finally ways of optimizing the MRI protocol are introduced. The MAGIC gel was prepared according to the instructions of Fong et al. The gels were poured into calibration vials and irradiated by 18 MV photons. A 1.5 Tesla MRI scanner was used for readout. Finally, the uncertainty of the measured dose was calculated. Results show that for the MAGIC polymer gel dosimeter the estimated uncertainty is high at low doses (≈18.96% at 1 Gy) but decreases to approximately 4.17% at 10 Gy. Also, with increasing dose, the uncertainty of the measured dose decreases non-linearly. For low doses the most significant contribution is σR0 (uncertainty of the intercept), while for high doses it is σa (uncertainty of the slope). MRI protocol parameters influence the signal-to-noise ratio (SNR), and the most important source of uncertainty is the uncertainty of R2. Hence, the MRI protocol and the parameters therein should be optimized. At low doses the estimated uncertainty is high, and it decreases with increasing dose. It is suggested that in relative dosimetry gels are irradiated with high doses in the linear range of the given gel dosimeter and the results then scaled down to the desired dose range.
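
    Assuming a linear calibration of the form R2 = R0 + a·D, the dose and its standard uncertainty follow from first-order (GUM-type) propagation as sketched below. The numbers are illustrative and are not the MAGIC calibration values of the paper, but the formula reproduces the qualitative behaviour described above: the intercept term dominates at low dose and the slope term at high dose.

```python
import numpy as np

def dose_uncertainty(R2, R0, a, s_R2, s_R0, s_a):
    """First-order propagation for D = (R2 - R0) / a (linear R2-dose calibration)."""
    D = (R2 - R0) / a
    var = (s_R2 / a) ** 2 + (s_R0 / a) ** 2 + ((R2 - R0) * s_a / a ** 2) ** 2
    return D, np.sqrt(var)

# illustrative numbers only (not the calibration of the paper)
D, sD = dose_uncertainty(R2=4.0, R0=1.5, a=0.25, s_R2=0.05, s_R0=0.04, s_a=0.006)
print(f"D = {D:.2f} Gy ± {sD:.2f} Gy ({100 * sD / D:.1f} %)")
```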

  13. Polynomial Chaos Based Acoustic Uncertainty Predictions from Ocean Forecast Ensembles

    Science.gov (United States)

    Dennis, S.

    2016-02-01

    Most significant ocean acoustic propagation occurs over tens of kilometers, at scales small compared to basin scales and to most fine-scale ocean modeling. To address the increased emphasis on uncertainty quantification, for example transmission loss (TL) probability density functions (PDF) within some radius, a polynomial chaos (PC) based method is utilized. In order to capture uncertainty in ocean modeling, the Navy Coastal Ocean Model (NCOM) now includes ensembles distributed to reflect the ocean analysis statistics. Since the ensembles are included in the data assimilation for the new forecast ensembles, the acoustic modeling uses the ensemble predictions in a similar fashion for creating the sound speed distribution over an acoustically relevant domain. Within an acoustic domain, singular value decomposition over the combined time-space structure of the sound speeds can be used to create Karhunen-Loève expansions of sound speed, subject to multivariate normality testing. These sound speed expansions serve as a basis for Hermite polynomial chaos expansions of derived quantities, in particular TL. The PC expansion coefficients result from so-called non-intrusive methods, involving evaluation of TL at multi-dimensional Gauss-Hermite quadrature collocation points. Traditional TL calculation from standard acoustic propagation modeling could be prohibitively time consuming at all multi-dimensional collocation points. This method employs Smolyak order and gridding methods to allow adaptive sub-sampling of the collocation points to determine only the most significant PC expansion coefficients to within a preset tolerance. Practically, the Smolyak order and grid sizes grow only polynomially in the number of Karhunen-Loève terms, alleviating the curse of dimensionality. The resulting TL PC coefficients allow the determination of TL PDF normality and its mean and standard deviation. In the non-normal case, PC Monte Carlo methods are used to rapidly establish the PDF. This work was
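
    The toy sketch below illustrates the two building blocks described above on synthetic data: Karhunen-Loève modes of the ensemble sound-speed anomalies obtained from an SVD, and non-intrusive estimation of Hermite polynomial chaos coefficients of a scalar output by Gauss-Hermite quadrature along one KL variable. The ensemble, the stand-in output and the truncation orders are all invented, and the Smolyak sparse-grid machinery of the paper is not reproduced.

```python
import numpy as np

# ensemble of sound-speed profiles: shape (n_members, n_depth_points), synthetic
rng = np.random.default_rng(7)
ensemble = 1500.0 + rng.normal(0, 2.0, (60, 40)).cumsum(axis=1) * 0.1

mean_c = ensemble.mean(axis=0)
anom = ensemble - mean_c
# Karhunen-Loeve modes from the SVD of the scaled anomaly matrix
U, s, Vt = np.linalg.svd(anom / np.sqrt(len(ensemble) - 1), full_matrices=False)
modes, stdevs = Vt, s                     # rows of Vt are the spatial modes

def sound_speed(xi, n_modes=3):
    """Reconstruct a profile from the first n_modes standard-normal KL variables."""
    return mean_c + (xi[:n_modes] * stdevs[:n_modes]) @ modes[:n_modes]

# non-intrusive PC coefficients of a scalar output (a stand-in for TL)
# along the first KL variable, using probabilists' Gauss-Hermite quadrature
x, w = np.polynomial.hermite_e.hermegauss(8)     # weight exp(-x^2 / 2)
w = w / w.sum()                                  # normalise to a probability measure
f = np.array([sound_speed(np.array([xi, 0, 0])).mean() for xi in x])  # toy output
c0 = np.sum(w * f)                               # PC mean coefficient
c1 = np.sum(w * f * x)                           # coefficient of He_1(xi)
print(c0, c1)
```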

  14. An evaluation of the uncertainty of extreme events statistics at the WMO/CIMO Lead Centre on precipitation intensity

    Science.gov (United States)

    Colli, M.; Lanza, L. G.; La Barbera, P.

    2012-12-01

    Improving the quality of point-scale rainfall measurements is a crucial issue fostered in recent years by the WMO Commission for Instruments and Methods of Observation (CIMO) by providing recommendations on the standardization of equipment and exposure, instrument calibration and data correction as a consequence of various comparative campaigns involving manufacturers and national meteorological services from the participating countries. The WMO/CIMO Lead Centre on Precipitation Intensity (LC) was recently constituted, in a joint effort between the Dep. of Civil, Chemical and Environmental Engineering of the University of Genova and the Italian Air Force Met Service, gathering the considerable asset of data and information achieved by past field and laboratory campaigns with the aim of researching novel methodologies for improving the accuracy of rainfall intensity (RI) measurement techniques. Among the ongoing experimental activities carried out by the LC laboratory, particular attention is paid to the reliability evaluation of extreme rainfall event statistics, a common tool in engineering practice for urban and non-urban drainage system design, based on real-world observations obtained from weighing gauges. Extreme event statistics have already been shown to be highly affected by the measurement inaccuracy of traditional tipping-bucket rain gauges (La Barbera et al., 2002), and the time resolution of the available RI series certainly constitutes another key factor in the reliability of the derived hyetographs. The present work reports the LC laboratory efforts in assembling a rainfall simulation system to reproduce the inner temporal structure of the rainfall process by means of dedicated calibration and validation tests. This allowed testing of catching-type rain gauges under non-steady flow conditions and quantifying, in a first instance, the dynamic behaviour of the investigated instruments. Considerations about the influence of the dynamic response on

  15. Combining empirical approaches and error modelling to enhance predictive uncertainty estimation in extrapolation for operational flood forecasting. Tests on flood events on the Loire basin, France.

    Science.gov (United States)

    Berthet, Lionel; Marty, Renaud; Bourgin, François; Viatgé, Julie; Piotte, Olivier; Perrin, Charles

    2017-04-01

    An increasing number of operational flood forecasting centres assess the predictive uncertainty associated with their forecasts and communicate it to the end users. This information can match the end users' needs (i.e. prove useful for efficient crisis management) only if it is reliable: reliability is therefore a key quality for operational flood forecasts. In 2015, the French flood forecasting national and regional services (Vigicrues network; www.vigicrues.gouv.fr) implemented a framework to compute quantitative discharge and water level forecasts and to assess the predictive uncertainty. Among the possible technical options to achieve this goal, a statistical analysis of the past forecasting errors of deterministic models was selected (QUOIQUE method, Bourgin, 2014). It is a data-based, non-parametric approach that relies on as few assumptions as possible about the mathematical structure of the forecasting error. In particular, a very simple assumption is made regarding the predictive uncertainty distributions for large events outside the range of the calibration data: the multiplicative error distribution is assumed to be constant, whatever the magnitude of the flood. Indeed, the predictive distributions may not be reliable in extrapolation. However, estimating the predictive uncertainty for these rare events is crucial when major floods are of concern. In order to improve forecast reliability for major floods, an attempt is made to combine the operational strength of the empirical statistical analysis with simple error modelling. Since the heteroscedasticity of forecast errors can considerably weaken the predictive reliability for large floods, this error modelling is based on the log-sinh transformation, which has been shown to significantly reduce the heteroscedasticity of the transformed error in a simulation context, even for flood peaks (Wang et al., 2012). Exploratory tests on some operational forecasts issued during the recent floods experienced in
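
    For reference, the log-sinh transformation of Wang et al. (2012) referred to above has the form z = log(sinh(a + b·q)) / b, with parameters a and b fitted to the data. The short sketch below applies the transform and its inverse to illustrative discharge values; the parameter values are arbitrary placeholders.

```python
import numpy as np

def log_sinh(q, a, b):
    """Log-sinh transform of Wang et al. (2012): z = log(sinh(a + b*q)) / b."""
    return np.log(np.sinh(a + b * q)) / b

def inv_log_sinh(z, a, b):
    """Back-transform to the original discharge space."""
    return (np.arcsinh(np.exp(b * z)) - a) / b

q = np.array([50.0, 200.0, 800.0])   # discharges in m3/s (illustrative)
a, b = 1.0, 0.01                     # transform parameters (would be fitted)
z = log_sinh(q, a, b)
print(np.allclose(inv_log_sinh(z, a, b), q))   # True: the transform is invertible
```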

  16. Uncertainty budget for final assay of a pharmaceutical product based on RP-HPLC

    DEFF Research Database (Denmark)

    Heydorn, Kaj; Anglov, Thomas; Byrialsen, Kirsten

    2003-01-01

    Compliance with specified limits for the content of active substance in a pharmaceutical drug requires knowledge of the uncertainty of the final assay. The uncertainty of measurement is based on the ISO recommendation as expressed in the Guide to the Expression of Uncertainty in Measurement (GUM). The reported example illustrates the estimation of uncertainty for the final determination of a protein concentration by HPLC using UV detection, using the approach described by EURACHEM/CITAC. The combined standard uncertainty for a protein concentration of 2400 μmol/L was estimated to be 14 μmol/L. All known and potential uncertainty components are presented in Ishikawa diagrams and were carefully evaluated using Type A or Type B estimates. Special efforts were made to avoid duplication or omission of significant contributions to the combined uncertainty. Hence, before accepting the uncertainty budget...
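
    The combination step of such a budget is a root-sum-of-squares of the individual standard uncertainties (assuming independent contributions), as in the minimal sketch below. The component names and values are illustrative and are not those of the reported assay.

```python
import math

# standard uncertainties of the main contributions to the assay, expressed as
# relative values (illustrative numbers, not those of the paper)
components = {
    "sample preparation": 0.0030,
    "calibration standard": 0.0025,
    "peak integration": 0.0020,
    "injection repeatability": 0.0035,
}

value = 2400.0   # protein concentration in umol/L
u_rel = math.sqrt(sum(u ** 2 for u in components.values()))
print(f"combined standard uncertainty: {value * u_rel:.1f} umol/L "
      f"(expanded, k=2: {2 * value * u_rel:.1f} umol/L)")
```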

  17. Sensitivity of seasonal weather prediction and extreme precipitation events to soil moisture initialization uncertainty using SMOS soil moisture products

    Science.gov (United States)

    Khodayar-Pardo, Samiro; Lopez-Baeza, Ernesto; Coll Pajaron, M. Amparo

    Soil moisture is an important variable in agriculture, hydrology, meteorology and related disciplines. Despite its importance, it is complicated to obtain an appropriate representation of this variable, mainly because of its high temporal and spatial variability. SVAT (Soil-Vegetation-Atmosphere-Transfer) models can be used to simulate the temporal behaviour and spatial distribution of soil moisture in a given area and/or state of the art products such as the soil moisture measurements from the SMOS (Soil Moisture and Ocean Salinity) space mission may be also convenient. The potential role of soil moisture initialization and associated uncertainty in numerical weather prediction is illustrated in this study through sensitivity numerical experiments using the SVAT SURFEX model and the non-hydrostatic COSMO model. The aim of this investigation is twofold, (a) to demonstrate the sensitivity of model simulations of convective precipitation to soil moisture initial uncertainty, as well as the impact on the representation of extreme precipitation events, and (b) to assess the usefulness of SMOS soil moisture products to improve the simulation of water cycle components and heavy precipitation events. Simulated soil moisture and precipitation fields are compared with observations and with level-1 (~1 km), level-2 (~15 km) and level-3 (~35 km) soil moisture maps generated from SMOS over the Iberian Peninsula, the SMOS validation area (50 km x 50 km, eastern Spain) and selected stations, where in situ measurements are available covering different vegetation cover

  18. Propagating uncertainties in statistical model based shape prediction

    Science.gov (United States)

    Syrkina, Ekaterina; Blanc, Rémi; Székely, Gàbor

    2011-03-01

    This paper addresses the question of accuracy assessment and confidence regions estimation in statistical model based shape prediction. Shape prediction consists in estimating the shape of an organ based on a partial observation, due e.g. to a limited field of view or poorly contrasted images, and generally requires a statistical model. However, such predictions can be impaired by several sources of uncertainty, in particular the presence of noise in the observation, limited correlations between the predictors and the shape to predict, as well as limitations of the statistical shape model - in particular the number of training samples. We propose a framework which takes these into account and derives confidence regions around the predicted shape. Our method relies on the construction of two separate statistical shape models, for the predictors and for the unseen parts, and exploits the correlations between them assuming a joint Gaussian distribution. Limitations of the models are taken into account by jointly optimizing the prediction and minimizing the shape reconstruction error through cross-validation. An application to the prediction of the shape of the proximal part of the human tibia given the shape of the distal femur is proposed, as well as the evaluation of the reliability of the estimated confidence regions, using a database of 184 samples. Potential applications are reconstructive surgery, e.g. to assess whether an implant fits in a range of acceptable shapes, or functional neurosurgery when the target's position is not directly visible and needs to be inferred from nearby visible structures.
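
    Under the joint Gaussian assumption described above, the predicted shape and its confidence region follow from the standard conditional-Gaussian formulas (Schur complement), as in the generic sketch below; the toy four-dimensional "shape" and its covariance are invented for illustration.

```python
import numpy as np

def conditional_gaussian(mu, cov, idx_obs, x_obs):
    """Predictive mean and covariance of the unobserved block of a joint
    Gaussian, given observed values x_obs at indices idx_obs."""
    idx_obs = np.asarray(idx_obs)
    idx_mis = np.setdiff1d(np.arange(len(mu)), idx_obs)
    S_oo = cov[np.ix_(idx_obs, idx_obs)]
    S_mo = cov[np.ix_(idx_mis, idx_obs)]
    S_mm = cov[np.ix_(idx_mis, idx_mis)]
    K = np.linalg.solve(S_oo, S_mo.T).T            # regression coefficients
    mean = mu[idx_mis] + K @ (x_obs - mu[idx_obs])
    cov_pred = S_mm - K @ S_mo.T                   # Schur complement: prediction uncertainty
    return mean, cov_pred

# toy joint model of predictor (first 2 coords) and target (last 2 coords)
rng = np.random.default_rng(5)
A = rng.normal(size=(4, 4))
cov = A @ A.T + 0.1 * np.eye(4)
mu = np.zeros(4)
mean, cov_pred = conditional_gaussian(mu, cov, idx_obs=[0, 1], x_obs=np.array([0.3, -0.2]))
print(mean, np.sqrt(np.diag(cov_pred)))   # predicted "shape" and per-coordinate std
```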

  19. Uncertainty of Flood Forecasting Based on Radar Rainfall Data Assimilation

    Directory of Open Access Journals (Sweden)

    Xinchi Chen

    2016-01-01

    Full Text Available Precipitation is the core data input to hydrological forecasting. The uncertainty in precipitation forecast data can lead to poor performance of predictive hydrological models. Radar-based precipitation measurement offers advantages over ground-based measurement in the quantitative estimation of temporal and spatial aspects of precipitation, but errors inherent in this method will still act to reduce the performance. Using data from White Lotus River of Hubei Province, China, five methods were used to assimilate radar rainfall data transformed from the classified Z-R relationship, and the postassimilation data were compared with precipitation measured by rain gauges. The five sets of assimilated rainfall data were then used as input to the Xinanjiang model. The effect of precipitation data input error on runoff simulation was analyzed quantitatively by disturbing the input data using the Breeding of Growing Modes method. The results of practical application demonstrated that the statistical weight integration and variational assimilation methods were superior. The corresponding performance in flood hydrograph prediction was also better using the statistical weight integration and variational methods compared to the others. It was found that the errors of radar rainfall data disturbed by the Breeding of Growing Modes had a tendency to accumulate through the hydrological model.
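
    The radar rainfall input referred to above comes from a Z-R power law of the form Z = a·R^b. A minimal conversion from reflectivity in dBZ to rain rate is sketched below, using the common Marshall-Palmer coefficients as placeholders; the paper's classified Z-R relationships (by rainfall type) are not reproduced here.

```python
import numpy as np

def reflectivity_to_rain_rate(dbz, a=200.0, b=1.6):
    """Convert radar reflectivity (dBZ) to rain rate R (mm/h) via Z = a * R**b.
    a and b are illustrative Marshall-Palmer values, not the classified
    coefficients used in the paper."""
    z = 10.0 ** (dbz / 10.0)      # dBZ -> linear reflectivity (mm^6 m^-3)
    return (z / a) ** (1.0 / b)

print(reflectivity_to_rain_rate(np.array([20.0, 35.0, 50.0])))
```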

  20. Component-based event composition modeling for CPS

    Science.gov (United States)

    Yin, Zhonghai; Chu, Yanan

    2017-06-01

    In order to combine the event-driven model with component-based architecture design, this paper proposes a component-based event composition model to realize CPS event processing. Firstly, the formal representations of components and attribute-oriented events are defined. Every component consists of subcomponents and the corresponding event sets. The attribute “type” is added to the attribute-oriented event definition so as to describe the responsiveness to the component. Secondly, the component-based event composition model is constructed. A concept lattice-based event algebra system is built to describe the relations between events, and the rules for drawing Hasse diagrams are discussed. Thirdly, as there are redundancies among composite events, two simplification methods are proposed. Finally, a communication-based train control system is simulated to verify the event composition model. Results show that the event composition model we have constructed can be applied to express composite events correctly and effectively.

  1. An Efficient Deterministic Approach to Model-based Prediction Uncertainty

    Data.gov (United States)

    National Aeronautics and Space Administration — Prognostics deals with the prediction of the end of life (EOL) of a system. EOL is a random variable, due to the presence of process noise and uncertainty in the...

  2. Residual uncertainty estimation using instance-based learning with applications to hydrologic forecasting

    Science.gov (United States)

    Wani, Omar; Beckers, Joost V. L.; Weerts, Albrecht H.; Solomatine, Dimitri P.

    2017-08-01

    A non-parametric method is applied to quantify residual uncertainty in hydrologic streamflow forecasting. This method acts as a post-processor on deterministic model forecasts and generates a residual uncertainty distribution. Based on instance-based learning, it uses a k nearest-neighbour search for similar historical hydrometeorological conditions to determine uncertainty intervals from a set of historical errors, i.e. discrepancies between past forecast and observation. The performance of this method is assessed using test cases of hydrologic forecasting in two UK rivers: the Severn and Brue. Forecasts in retrospect were made and their uncertainties were estimated using kNN resampling and two alternative uncertainty estimators: quantile regression (QR) and uncertainty estimation based on local errors and clustering (UNEEC). Results show that kNN uncertainty estimation produces accurate and narrow uncertainty intervals with good probability coverage. Analysis also shows that the performance of this technique depends on the choice of search space. Nevertheless, the accuracy and reliability of uncertainty intervals generated using kNN resampling are at least comparable to those produced by QR and UNEEC. It is concluded that kNN uncertainty estimation is an interesting alternative to other post-processors, like QR and UNEEC, for estimating forecast uncertainty. Apart from its concept being simple and well understood, an advantage of this method is that it is relatively easy to implement.
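
    A minimal sketch of the kNN resampling idea is given below: the k historical situations most similar to the current hydrometeorological conditions are found, and the quantiles of their past forecast errors form the interval around the deterministic forecast. The predictors, error model and parameter choices are invented for illustration.

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors

def knn_uncertainty_interval(x_now, X_hist, err_hist, k=50, quantiles=(0.05, 0.95)):
    """Uncertainty interval for a deterministic forecast: find the k most
    similar historical situations and take quantiles of their past errors."""
    nn = NearestNeighbors(n_neighbors=k).fit(X_hist)
    _, idx = nn.kneighbors(np.atleast_2d(x_now))
    neighbour_errors = err_hist[idx[0]]
    return np.quantile(neighbour_errors, quantiles)

# toy history: predictors (e.g. forecast discharge, rainfall) and past errors
rng = np.random.default_rng(11)
X_hist = rng.normal(size=(2000, 2))
err_hist = rng.normal(0, 1 + 0.5 * np.abs(X_hist[:, 0]))   # heteroscedastic errors
lo, hi = knn_uncertainty_interval(np.array([1.5, 0.0]), X_hist, err_hist)
forecast = 120.0   # deterministic forecast (illustrative units)
print(f"interval: [{forecast + lo:.1f}, {forecast + hi:.1f}]")
```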

  3. Seasonality of flood events in a changing climate - An uncertainty assessment for Europe through the combination of different climate projections

    Science.gov (United States)

    Eisner, Stephanie; Voß, Frank; Schneider, Christof

    2010-05-01

    Global climate models (GCMs) project an increasing intensity and frequency of heavy rainfall events due to climate change. As a result, the frequency and magnitude of severe flood events is expected to increase in many regions. Furthermore, a change in the seasonality of flood events can be anticipated. In regions that regularly experience snowmelt floods, for instance, temperature increase will lead to decreased snow accumulation and to a shortened duration of the snowpack. Thus, the risk of spring floods may be reduced. This study aims to estimate the impact of the projected climate change on the seasonality of flood events in the European region. For this purpose large scale river discharge simulations were carried out with the integrated, global model WaterGAP3 (Water - Global Assessment and Prognosis) with a grid cell resolution of 5'. WaterGAP3 couples a hydrological model for the simulation of the terrestrial water cycle with a water use model that computes withdrawal and consumptive water use of the sectors manufacturing, electricity production, agriculture and private households. Thus, on the basis of daily climate input parameters with a spatial resolution of 0.5°, downscaled to the 5' grid scale, daily stream flow was simulated and analyzed. First, the seasonality of flood events of defined recurrence periods was determined for the reference period 1961-1990 and validated against measured river discharge data. Subsequently, WaterGAP3 was forced with bias-corrected time series originating from simulation runs of different GCMs for the scenario period 2071-2100. To assess the uncertainty that arises from the GCM output used as input forcing to the hydrological model, the calculations were carried out for three different GCMs (ECHAM5, CNRM, IPSL) and two emission scenarios (A2 and B1 of the IPCC SRES scenarios), respectively. The study demonstrates that the selection of a particular GCM is a major source of uncertainty in assessing

  4. Uncertainty Representation and Interpretation in Model-Based Prognostics Algorithms Based on Kalman Filter Estimation

    Science.gov (United States)

    Galvan, Jose Ramon; Saxena, Abhinav; Goebel, Kai Frank

    2012-01-01

    This article discusses several aspects of uncertainty representation and management for model-based prognostics methodologies, based on our experience with Kalman filters when applied to prognostics for electronics components. In particular, it explores the implications of modeling remaining useful life prediction as a stochastic process, and how this relates to uncertainty representation and management and to the role of prognostics in decision-making. A distinction between the interpretations of the estimated remaining useful life probability density function is explained, and a cautionary argument is provided against mixing the two interpretations when prognostics is used to make critical decisions.

  5. Sensitivity Analysis of Uncertainty Parameter based on MARS-LMR Code on SHRT-45R of EBR II

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Seok-Ju; Kang, Doo-Hyuk; Seo, Jae-Seung [System Engineering and Technology Co., Daejeon (Korea, Republic of); Bae, Sung-Won [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Jeong, Hae-Yong [Sejong University, Seoul (Korea, Republic of)

    2016-10-15

    In order to assess uncertainty quantification with the MARS-LMR code, the code has been improved by modifying the source code to accommodate the calculation process required for uncertainty quantification. In the present study, an Unprotected Loss of Flow (ULOF) transient is selected as a typical case of an Anticipated Transient Without Scram (ATWS), which belongs to the DEC category. The MARS-LMR input generation for EBR-II SHRT-45R and the execution work are performed using the PAPIRUS program. The sensitivity analysis is carried out with the uncertainty parameters of the MARS-LMR code for EBR-II SHRT-45R. Based on the results of the sensitivity analysis, dominant parameters with large sensitivity to the figure of merit (FoM) are picked out. The dominant parameters selected are closely related to the development of the ULOF event.

  6. Reducing uncertainty based on model fitness: Application to a ...

    African Journals Online (AJOL)

    A weakness of global sensitivity and uncertainty analysis methodologies is the often subjective definition of prior parameter probability distributions, especially ... The reservoir representing the central part of the wetland, where flood waters separate into several independent distributaries, is a keystone area within the model.

  7. A Jacobian singularity based robust controller design for structured uncertainty

    Science.gov (United States)

    Brown, Brandon

    Any real system will have differences between the mathematical model response and the response of the true system it represents. These differences can come from external disturbances, incomplete modeling of the dynamics (unstructured uncertainty) or simply incorrect or changing parameter values (structured uncertainty) in the model. Sources of unstructured uncertainty are unavoidable in real systems, so a controller design must always consider robustness to these effects. In many cases, when the sources of structured uncertainty are addressed as another source of unstructured uncertainty, the resulting controller design is conservative. By accurately addressing and designing a controller for the structured uncertainty, large benefits in controller performance can be generated since the conservative bound is reduced. The classical approach to output shaping of a system involves a feedback loop since this architecture is more robust to differences between the mathematical model and the true system. This dissertation will present an approach to design a feedback controller which is robust to structured uncertainties in a plant, in an accurate and minimal way. The approach begins by identifying a critical set of system parameters which will be proven to represent the full set of system parameters in the Nyquist plane. This critical set is populated by all parameter vectors which satisfy a developed deficiency condition and is the minimal set which will contribute to the Nyquist plane portraits. The invariance of this critical set to control structure is shown explicitly. An improvement over previous work is the addition of a numerical solution technique which guarantees that all critical points are found. The presented approach allows the designer to set minimum relative stability margins, such as gain and phase margins, which previous work could not compute accurately or with confidence in the results. A robust controller is designed with respect to this

  8. Analysis of manufacturing based on object oriented discrete event simulation

    Directory of Open Access Journals (Sweden)

    Eirik Borgen

    1990-01-01

    Full Text Available This paper describes SIMMEK, a computer-based tool for performing analysis of manufacturing systems, developed at the Production Engineering Laboratory, NTH-SINTEF. Its main use will be in the analysis of job shop type manufacturing, but certain facilities make it suitable for FMS as well as production line manufacturing. This type of simulation is very useful in the analysis of any type of change that occurs in a manufacturing system. These changes may be investments in new machines or equipment, a change in layout, a change in product mix, use of late shifts, etc. The effects these changes have on, for instance, the throughput, the amount of WIP (work in progress), the costs or the net profit can be analysed. And this can be done before the changes are made, and without disturbing the real system. Simulation takes into consideration, unlike other tools for analysis of manufacturing systems, uncertainty in arrival rates, process and operation times, and machine availability. It also shows the interaction effects that a job which is late in one machine has on the remaining machines in its route through the layout. It is these effects that cause every production plan not to be fulfilled completely. SIMMEK is based on discrete event simulation, and the modeling environment is object oriented. The object oriented models are transformed by an object linker into data structures executable by the simulation kernel. The processes of the entity objects, i.e. the products, are broken down into events and put into an event list. The user friendly graphical modeling environment makes it possible for end users to build models in a quick and reliable way, using terms from manufacturing. Various tests and a check of model logic are helpful functions when testing the validity of the models. Integration with software packages, with business graphics and statistical functions, is convenient in the result presentation phase.
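
    The event-list mechanism described above can be illustrated with the minimal single-machine sketch below, in which arrivals and uncertain processing times are drawn at random and events are processed in time order from a heap. It is a generic toy, not the SIMMEK tool itself; all distributions and parameters are invented.

```python
import heapq
import random

def simulate(n_jobs=1000, seed=1):
    """Minimal event-list simulation of one machine with random arrivals
    and uncertain processing times (a toy stand-in for a job-shop model)."""
    random.seed(seed)
    events, queue, busy_until, finished = [], 0, 0.0, 0
    # schedule all job arrivals (exponential inter-arrival times)
    t = 0.0
    for _ in range(n_jobs):
        t += random.expovariate(1.0)
        heapq.heappush(events, (t, "arrival"))
    clock = 0.0
    while events:
        clock, kind = heapq.heappop(events)
        if kind == "arrival":
            queue += 1
        else:                          # "departure": a job leaves the machine
            finished += 1
        if queue and clock >= busy_until:          # machine idle -> start next job
            queue -= 1
            busy_until = clock + random.uniform(0.5, 1.5)   # uncertain process time
            heapq.heappush(events, (busy_until, "departure"))
    return finished, clock

print(simulate())   # (jobs completed, time of last event)
```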

  9. DD4Hep based event reconstruction

    CERN Document Server

    AUTHOR|(SzGeCERN)683529; Frank, Markus; Gaede, Frank-Dieter; Hynds, Daniel; Lu, Shaojun; Nikiforou, Nikiforos; Petric, Marko; Simoniello, Rosa; Voutsinas, Georgios Gerasimos

    The DD4HEP detector description toolkit offers a flexible and easy-to-use solution for the consistent and complete description of particle physics detectors in a single system. The sub-component DDREC provides a dedicated interface to the detector geometry as needed for event reconstruction. With DDREC there is no need to define an additional, separate reconstruction geometry as is often done in HEP, but one can transparently extend the existing detailed simulation model to be also used for the reconstruction. Based on the extension mechanism of DD4HEP, DDREC allows one to attach user defined data structures to detector elements at all levels of the geometry hierarchy. These data structures define a high level view onto the detectors describing their physical properties, such as measurement layers, point resolutions, and cell sizes. For the purpose of charged particle track reconstruction, dedicated surface objects can be attached to every volume in the detector geometry. These surfaces provide the measuremen...

  10. Uncertainties Related to Extreme Event Statistics of Sewer System Surcharge and Overflow

    DEFF Research Database (Denmark)

    Schaarup-Jensen, Kjeld; Johansen, C.; Thorndahl, Søren Liedtke

    2005-01-01

    Today it is common practice - in the major part of Europe - to base the design of sewer systems in urban areas on recommended minimum values of flooding frequencies related to either pipe top level, basement level in buildings or the level of road surfaces. Thus storm water runoff in sewer systems is only proceeding in an acceptable manner if flooding of these levels has an average return period greater than a predefined value. This practice is also often used in the functional analysis of existing sewer systems. Whether or not a sewer system can fulfil the recommended flooding frequencies can only be verified

  11. Building Partnerships through Classroom-Based Events

    Science.gov (United States)

    Zacarian, Debbie; Silverstone, Michael

    2017-01-01

    Building partnerships with families can be a challenge, especially in ethnically diverse classrooms. In this article, the authors describe how to create such partnerships with three kinds of classroom events: community-building events that deepen social relationships and make families feel welcome; curriculum showcase events that give families a…

  12. Model-based uncertainty in species range prediction

    DEFF Research Database (Denmark)

    Pearson, R. G.; Thuiller, Wilfried; Bastos Araujo, Miguel

    2006-01-01

    Aim: … to identify key reasons why model output may differ and to discuss the implications that model uncertainty has for policy-guiding applications. Location: The Western Cape of South Africa. Methods: We applied nine of the most widely used modelling techniques to model potential distributions under current … Results: … with predicted changes in range size by 2030 differing in both magnitude and direction (e.g. from 92% loss to 322% gain). We explain differences with reference to two characteristics of the modelling techniques: data input requirements (presence/absence vs. presence-only approaches) and assumptions made by each algorithm when extrapolating beyond the range of data used to build the model. The effects of these factors should be carefully considered when using this modelling approach to predict species ranges. Main conclusions: We highlight an important source of uncertainty in assessments of the impacts of climate …

  13. MOPSO-based multi-objective TSO planning considering uncertainties

    OpenAIRE

    Wang, Qi; Zhang, Chunyu; Ding, Yi; Østergaard, Jacob

    2014-01-01

    The concerns of sustainability and climate change have driven a significant growth of renewable energy associated with smart grid technologies. Various uncertainties are the major problems that need to be handled in transmission system operator (TSO) planning. This paper mainly focuses on three uncertain factors, i.e. load growth, generation capacity and line faults, and aims to enhance the transmission system via the multi-objective TSO planning (MOTP) approach. The proposed MOTP approach optimizes...

  14. Fault detection based on microseismic events

    Science.gov (United States)

    Yin, Chen

    2017-09-01

    In unconventional reservoirs, small faults allow the flow of oil and gas as well as act as obstacles to exploration, since (1) fracturing facilitates fluid migration along them, (2) they can cause reservoir flooding, and (3) they can trigger small earthquakes. These small faults are generally not detected because of the low seismic resolution. However, such small faults are very active and release sufficient energy to initiate a large number of microseismic events (MEs) during hydraulic fracturing. In this study, we identified microfractures (MF) from hydraulic fracturing and natural small faults based on microseismicity characteristics, such as the time-space distribution, source mechanism, magnitude, amplitude, and frequency. First, we identified the mechanisms of small faults and MF by reservoir stress analysis and calibrated the MEs based on the microseismic magnitude. The dynamic characteristics (frequency and amplitude) of MEs triggered by natural faults and MF were analyzed; moreover, the geometry and activity types of natural faults and MF were grouped according to the source mechanism. Finally, the differences in time-space distribution, magnitude, source mechanism, amplitude, and frequency were used to differentiate natural faults from man-made fractures.

  15. Unit-based emission inventory and uncertainty assessment of coal-fired power plants

    Science.gov (United States)

    Chen, Linghong; Sun, Yangyang; Wu, Xuecheng; Zhang, Yongxin; Zheng, Chenghang; Gao, Xiang; Cen, Kefa

    2014-12-01

    A unit-based emission inventory of coal-fired power plants in China was developed which contains unit capacity, coal consumption, emission control technology and geographical location. Estimated total emissions of SO2, NOx, particulate matter (PM) and PM2.5 in 2011 were 7251 kt, 8067 kt, 1433 kt and 622 kt, respectively. Units larger than 300 MW consumed 75% of the coal, while emitting 46% of the SO2, 58% of the NOx, 55% of the PM and 63.2% of the PM2.5. Emission comparisons between key regions such as the Yangtze River Delta, the Pearl River Delta and Shandong Province showed a general downward trend from 2005 to 2011, mainly because of the growing application ratio of desulphurisation, low-NOx burners (LNBs), denitration and dust-removal facilities. The uncertainties at unit level for SO2, NOx, PM and PM2.5 were estimated to be -10.1% ∼ +5.4%, -2.1% ∼ +4.6%, -5.7% ∼ +6.9% and -4.3% ∼ +6.5%, respectively. Meanwhile, a sector-based Monte Carlo simulation was conducted for a better understanding of the uncertainties. The unit-based simulation yielded narrower uncertainty estimates, possibly because the diversity of emission characteristics is neglected in the sector-based simulation. The large number of plants narrowed the unit-based uncertainties, as large uncertainties were found in provinces with a small number of power plants, such as Qinghai. However, sector-based uncertainty analysis depends strongly on detailed source classification: small NOx uncertainties were found in Shandong due to the detailed classification of NOx emission factors. The main uncertainty sources are discussed in the sensitivity analysis, which identifies specific needs in data investigation and field measurements to improve them. Though the unit-based Monte Carlo simulation greatly narrowed the uncertainties, the possibility of underestimated uncertainties at the unit level cannot be ignored, as the correlation of emission factors between units in the same source category was neglected.
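
    The Monte Carlo step described above can be sketched generically as below: unit-level emissions are the product of activity data and emission factors, both perturbed according to assumed relative uncertainties, and the percentiles of the summed samples give the uncertainty range. All numbers are placeholders, not values from the inventory.

```python
import numpy as np

rng = np.random.default_rng(2)

# unit-level inputs: coal consumption (kt) and SO2 emission factors (kg/t),
# with relative standard deviations expressing their uncertainty (illustrative)
coal = np.array([1200.0, 800.0, 300.0])
ef = np.array([6.0, 7.5, 9.0])
rsd_coal, rsd_ef = 0.05, 0.15

n = 10000
samples = (coal * (1 + rng.normal(0, rsd_coal, (n, coal.size)))
           * ef * (1 + rng.normal(0, rsd_ef, (n, ef.size)))).sum(axis=1) / 1000.0  # kt

best = (coal * ef).sum() / 1000.0
lo, hi = np.percentile(samples, [2.5, 97.5])
print(f"{best:.1f} kt SO2, 95% range {100*(lo/best-1):+.1f}% ~ {100*(hi/best-1):+.1f}%")
```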

  16. Learning Risk-Taking and Coping with Uncertainty through Experiential, Team-Based Entrepreneurship Education

    Science.gov (United States)

    Arpiainen, Riitta-Liisa; Kurczewska, Agnieszka

    2017-01-01

    This empirical study investigates how students' perceptions of risk-taking and coping with uncertainty change while they are exposed to experience-based entrepreneurship education. The aim of the study is twofold. First, the authors set out to identify the dynamics of entrepreneurial thinking among students experiencing risk and uncertainty while…

  17. Robust LMI-Based Control of Wind Turbines with Parametric Uncertainties

    DEFF Research Database (Denmark)

    Sloth, Christoffer; Esbensen, Thomas; Niss, Michael Odgaard Kuch

    2009-01-01

    This paper considers the design of robust LMI-based controllers for a wind turbine along its entire nominal operating trajectory. The proposed robust controller design takes into account parametric uncertainties in the model using a structured uncertainty description, which makes the controllers...

  18. Practical application of uncertainty-based validation assessment

    Energy Technology Data Exchange (ETDEWEB)

    Anderson, M. C. (Mark C.); Hylok, J. E. (Jeffrey E.); Maupin, R. D. (Ryan D.); Rutherford, A. C. (Amanda C.)

    2004-01-01

    Validation of simulation results by comparison with experimental data is certainly not a new idea. However, as the capability to simulate complex physical phenomena has increased over the last few decades, the need for a systematic approach to validation assessment has become evident. Organizations such as the American Society of Mechanical Engineers (ASME) and the National Laboratories are in the process of formulating validation requirements and approaches. A typical depiction of the validation process is given in Figure 1, derived from current ASME efforts regarding computational solid mechanics. Note that uncertainty quantification plays an integral role in the validation comparison step of the process defined in the figure. This is a natural consequence of the need for verification and validation to facilitate decision-making by the customers of simulation results. Since very little is exactly known about real systems, questions of economy, reliability, and safety are best answered in the language of uncertainty. The process illustrated in the figure above is very logical, but very general. Examples of concrete applications of this are still rare. Engineers at Los Alamos National Laboratory have been applying a systematic verification and validation process, much like that of Figure 1, to structural dynamic simulations for the past several years. These applications have resulted in the realizations that there are many details not mentioned in general process descriptions that can complicate a validation assessment. Such details include the following: (1) The need for a hierarchical approach in which the interactions between components within/between assemblies are considered in addition to the overall input/output behavior of the entire system; (2) The need for system state and response data within the important elements of the hierarchy in addition to the observed characteristics at the system level; (3) Selection of appropriate response features for

  19. Study and Application on Stability Classification of Tunnel Surrounding Rock Based on Uncertainty Measure Theory

    Directory of Open Access Journals (Sweden)

    Hujun He

    2014-01-01

    Full Text Available Based on uncertainty measure theory, a stability classification and order-arranging model of surrounding rock was established. Considering the practical engineering geologic conditions, 5 factors that influence surrounding rock stability were taken into account and the uncertainty measure functions were obtained based on in situ data. In this model, the uncertainty influence factors were analyzed quantitatively and qualitatively based on the real situation; the weight of each index was determined based on information entropy theory; the surrounding rock stability level was judged based on the credible degree recognition criterion; and the surrounding rock was ordered based on the order-arranging criterion. Furthermore, this model was employed to evaluate the surrounding rock of 5 sections in the Dongshan tunnel of Huainan. The results show that the uncertainty measure method is reasonable and can be of significance for surrounding rock stability evaluation in the future.
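
    For the entropy-based weighting step mentioned above, one common formulation assigns larger weights to indices whose measured values vary more across the evaluated sections. A generic sketch with an invented measure matrix is given below; it is not the paper's data or its exact formulation.

```python
import numpy as np

def entropy_weights(X):
    """Index weights from information entropy: indices whose values vary more
    across the evaluated sections carry more information and get larger weights."""
    P = X / X.sum(axis=0)                 # column-normalised measure matrix
    n = X.shape[0]
    E = -(P * np.log(np.clip(P, 1e-12, 1))).sum(axis=0) / np.log(n)   # entropy per index
    d = 1.0 - E                           # degree of diversification
    return d / d.sum()

# rows = tunnel sections, columns = influence factors (illustrative measure values)
X = np.array([[0.8, 0.3, 0.6],
              [0.6, 0.4, 0.5],
              [0.9, 0.2, 0.7],
              [0.5, 0.5, 0.4],
              [0.7, 0.3, 0.6]])
print(entropy_weights(X))
```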

  20. Synthesis of Optimal Processing Pathway for Microalgae-based Biorefinery under Uncertainty

    DEFF Research Database (Denmark)

    Rizwan, Muhammad; Lee, Jay H.; Gani, Rafiqul

    2015-01-01

    The research in the field of microalgae-based biofuels and chemicals is in an early phase of development, and therefore a wide range of uncertainties exist due to inconsistencies among and shortage of technical information. In order to handle and address these uncertainties to ensure robust decision making, we propose a systematic framework for the synthesis and optimal design of a microalgae-based processing network under uncertainty. By incorporating major uncertainties into the biorefinery superstructure model we developed previously, a stochastic mixed integer nonlinear programming (sMINLP) problem is formulated for determining the optimal biorefinery structure under given parameter uncertainties modelled as sampled scenarios. The solution to the sMINLP problem determines the optimal decisions with respect to processing technologies, material flows, and product portfolio in the presence...

  1. Recent developments in predictive uncertainty assessment based on the model conditional processor approach

    Directory of Open Access Journals (Sweden)

    G. Coccia

    2011-10-01

    Full Text Available The work aims at discussing the role of predictive uncertainty in flood forecasting and flood emergency management, its relevance to improve the decision making process and the techniques to be used for its assessment.

    Real time flood forecasting requires taking into account predictive uncertainty for a number of reasons. Deterministic hydrological/hydraulic forecasts give useful information about real future events, but their predictions cannot, as is usually done in practice, be taken and used as real future occurrences; rather, they should be used as pseudo-measurements of future occurrences in order to reduce the uncertainty of decision makers. Predictive Uncertainty (PU) is in fact defined as the probability of occurrence of a future value of a predictand (such as water level, discharge or water volume) conditional upon prior observations and knowledge as well as on all the information we can obtain on that specific future value from model forecasts. When dealing with commensurable quantities, as in the case of floods, PU must be quantified in terms of a probability distribution function which will be used by the emergency managers in their decision process in order to improve the quality and reliability of their decisions.

    After introducing the concept of PU, the presently available processors are introduced and discussed in terms of their benefits and limitations. In this work the Model Conditional Processor (MCP) has been extended to the possibility of using two joint Truncated Normal Distributions (TNDs), in order to improve adaptation to low and high flows.

    The paper concludes by showing the results of the application of the MCP on two case studies, the Po river in Italy and the Baron Fork river, OK, USA. In the Po river case the data provided by the Civil Protection of the Emilia Romagna region have been used to implement an operational example, where the predicted variable is the observed water level. In the Baron Fork River
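
    A stripped-down illustration of the processor idea is sketched below: observations and model forecasts are mapped to the normal space with a normal quantile transform, their correlation is estimated, and the predictive distribution of the transformed observation conditional on a new forecast follows from the bivariate normal formulas. This is the basic single-distribution version only; the two joint truncated normal distributions used in the paper, and the back-transformation to water levels, are not reproduced, and all data are synthetic.

```python
import numpy as np
from scipy.stats import norm, rankdata

def nqt(x):
    """Normal quantile transform: map a sample to standard-normal scores."""
    return norm.ppf(rankdata(x) / (len(x) + 1.0))

# historical pairs of observed water levels and model forecasts (synthetic)
rng = np.random.default_rng(9)
obs = rng.gamma(4.0, 1.0, 500)
fcst = obs + rng.normal(0, 0.8, 500)

eta_o, eta_f = nqt(obs), nqt(fcst)
rho = np.corrcoef(eta_o, eta_f)[0, 1]

# predictive distribution of the (transformed) observation given a new forecast,
# here taken at the transformed value of the latest forecast in the record
eta_new = eta_f[-1]
mean, std = rho * eta_new, np.sqrt(1.0 - rho ** 2)
print(f"P(eta_obs <= eta_new | forecast) = {norm.cdf(eta_new, mean, std):.2f}")
```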

  2. Uncertainty based modeling of rainfall-runoff: Combined differential evolution adaptive Metropolis (DREAM) and K-means clustering

    Science.gov (United States)

    Zahmatkesh, Zahra; Karamouz, Mohammad; Nazif, Sara

    2015-09-01

    Simulation of rainfall-runoff process in urban areas is of great importance considering the consequences and damages of extreme runoff events and floods. The first issue in flood hazard analysis is rainfall simulation. Large scale climate signals have been proved to be effective in rainfall simulation and prediction. In this study, an integrated scheme is developed for rainfall-runoff modeling considering different sources of uncertainty. This scheme includes three main steps of rainfall forecasting, rainfall-runoff simulation and future runoff prediction. In the first step, data driven models are developed and used to forecast rainfall using large scale climate signals as rainfall predictors. Due to high effect of different sources of uncertainty on the output of hydrologic models, in the second step uncertainty associated with input data, model parameters and model structure is incorporated in rainfall-runoff modeling and simulation. Three rainfall-runoff simulation models are developed for consideration of model conceptual (structural) uncertainty in real time runoff forecasting. To analyze the uncertainty of the model structure, streamflows generated by alternative rainfall-runoff models are combined, through developing a weighting method based on K-means clustering. Model parameters and input uncertainty are investigated using an adaptive Markov Chain Monte Carlo method. Finally, calibrated rainfall-runoff models are driven using the forecasted rainfall to predict future runoff for the watershed. The proposed scheme is employed in the case study of the Bronx River watershed, New York City. Results of uncertainty analysis of rainfall-runoff modeling reveal that simultaneous estimation of model parameters and input uncertainty significantly changes the probability distribution of the model parameters. It is also observed that by combining the outputs of the hydrological models using the proposed clustering scheme, the accuracy of runoff simulation in the

  3. MOPSO-based multi-objective TSO planning considering uncertainties

    DEFF Research Database (Denmark)

    Wang, Qi; Zhang, Chunyu; Ding, Yi

    2014-01-01

    The concerns of sustainability and climate change have driven a significant growth of renewable energy associated with smart grid technologies. Various uncertainties are the major problems that need to be handled in transmission system operator (TSO) planning. This paper mainly focuses on three uncertain factors, i.e. load growth, generation capacity and line faults, and aims to enhance the transmission system via the multi-objective TSO planning (MOTP) approach. The proposed MOTP approach optimizes three objectives simultaneously, namely the probabilistic available transfer capability (PATC), investment cost and power outage cost. A two-phase MOPSO algorithm is employed to solve this optimization problem, which can accelerate the convergence and guarantee the diversity of the Pareto-optimal front set as well. The feasibility and effectiveness of both the proposed multi-objective planning approach...

  4. A Practical ANOVA Approach for Uncertainty Analysis in Population-Based Disease Microsimulation Models.

    Science.gov (United States)

    Sharif, Behnam; Wong, Hubert; Anis, Aslam H; Kopec, Jacek A

    2017-04-01

    To provide a practical approach for calculating uncertainty intervals and variance components associated with initial-condition and dynamic-equation parameters in computationally expensive population-based disease microsimulation models. In the proposed uncertainty analysis approach, we calculated the required computational time and the number of runs given a user-defined error bound on the variance of the grand mean. The equations for optimal sample sizes were derived by minimizing the variance of the grand mean using initial estimates for variance components. Finally, analysis of variance estimators were used to calculate unbiased variance estimates. To illustrate the proposed approach, we performed uncertainty analysis to estimate the uncertainty associated with total direct cost of osteoarthritis in Canada from 2010 to 2031 according to a previously published population health microsimulation model of osteoarthritis. We first calculated crude estimates for initial-population sampling and dynamic-equation parameters uncertainty by performing a small number of runs. We then calculated the optimal sample sizes and finally derived 95% uncertainty intervals of the total cost and unbiased estimates for variance components. According to our results, the contribution of dynamic-equation parameter uncertainty to the overall variance was higher than that of initial parameter sampling uncertainty throughout the study period. The proposed analysis of variance approach provides the uncertainty intervals for the mean outcome in addition to unbiased estimates for each source of uncertainty. The contributions of each source of uncertainty can then be compared with each other for validation purposes so as to improve the model accuracy. Copyright © 2017 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.

  5. Calibrating ground-based microwave radiometers: Uncertainty and drifts

    Science.gov (United States)

    Küchler, N.; Turner, D. D.; Löhnert, U.; Crewell, S.

    2016-04-01

    The quality of microwave radiometer (MWR) calibrations, including both the absolute radiometric accuracy and the spectral consistency, determines the accuracy of geophysical retrievals. The Microwave Radiometer Calibration Experiment (MiRaCalE) was conducted to evaluate the performance of MWR calibration techniques, especially of the so-called Tipping Curve Calibrations (TCC) and Liquid Nitrogen Calibrations (LN2cal), by repeatedly calibrating a fourth-generation Humidity and Temperature Profiler (HATPRO-G4) that measures downwelling radiance between 20 GHz and 60 GHz. MiRaCalE revealed two major points for improving MWR calibrations: (i) the repetition frequency needed for MWR calibration techniques to correct drifts, which ensures stable long-term measurements; and (ii) the usefulness of the spectral consistency of control measurements of a well-known reference for estimating calibration accuracy. In addition, we determined the accuracy of the temperature of the HATPRO's liquid nitrogen-cooled blackbody. TCCs and LN2cals were found to agree within 0.5 K when observing the liquid nitrogen-cooled blackbody with a physical temperature of 77 K. This agreement of two different calibration techniques suggests that the brightness temperature of the LN2-cooled blackbody is accurate to within 0.5 K, which is a significant reduction of the uncertainties that had previously been assumed to vary between 0.6 K and 1.5 K when calibrating the HATPRO-G4. The error propagation of both techniques was found to behave almost linearly, leading to maximum uncertainties of 0.7 K when observing a scene associated with a brightness temperature of 15 K.

  6. Disturbance Observer Based Control of Multirotor Helicopters Based on a Universal Model with Unstructured Uncertainties

    Directory of Open Access Journals (Sweden)

    Ye Xie

    2015-01-01

    Full Text Available To handle different perspectives of unstructured uncertainties, two robust control techniques based on a universal model are studied in this paper. Rather than building a model only applicable to a specific small-scale multirotor helicopter (MHeli), the paper proposes a modeling technique to develop a universal model-framework. In particular, it is straightforward to apply the universal model to a given MHeli because a contribution and allocation matrix is proposed in the model-framework. Among the model uncertainties, the load perturbation of the rotor is the primary focus due to its central importance for tracking performance. In contrast to common methods, it is proposed to treat this unstructured uncertainty as an external disturbance and to design a disturbance observer (DOB) for it. In addition, a class of lead compensator is specifically designed to compensate for the phase lag induced by the DOB. Compared with H∞ loop-shaping, greater robust tracking performance in rejecting load perturbation can be achieved, and the trade-off between robust stability and tracking performance is successfully avoided with the DOB-based control strategy.

  7. Uncertainty dimensions of information behaviour in a group based problem solving context

    DEFF Research Database (Denmark)

    Hyldegård, Jette

    2009-01-01

    This paper presents a study of uncertainty dimensions of information behaviour in a group based problem solving context. After a presentation of the cognitive uncertainty dimension underlying Kuhlthau's ISP-model, uncertainty factors associated with personality, the work task situation and social...... members' experiences of uncertainty differ from the individual information seeker in Kuhlthau's ISP-model, and how this experience may be related to personal, work task and social factors. A number of methods have been employed to collect data on each group member during the assignment process......: a demographic survey, a personality test, 3 process surveys, 3 diaries and 3 interviews. It was found that group members' experiences of uncertainty did not correspond with the ISP-model in that other factors beyond the mere information searching process seemed to intermingle with the complex process...

  8. Climate data induced uncertainty in model-based estimations of terrestrial primary productivity

    Science.gov (United States)

    Wu, Zhendong; Ahlström, Anders; Smith, Benjamin; Ardö, Jonas; Eklundh, Lars; Fensholt, Rasmus; Lehsten, Veiko

    2017-06-01

    Model-based estimations of historical fluxes and pools of the terrestrial biosphere differ substantially. These differences arise not only from differences between models but also from differences in the environmental and climatic data used as input to the models. Here we investigate the role of uncertainties in historical climate data by performing simulations of terrestrial gross primary productivity (GPP) using a process-based dynamic vegetation model (LPJ-GUESS) forced by six different climate datasets. We find that the climate-induced uncertainty, defined as the range among historical simulations in GPP when forcing the model with the different climate datasets, can be as high as 11 Pg C yr-1 globally (9% of mean GPP). We also assessed a hypothetical maximum climate data induced uncertainty by combining climate variables from different datasets, which resulted in significantly larger uncertainties of 41 Pg C yr-1 globally or 32% of mean GPP. The uncertainty is partitioned into components associated with the three main climatic drivers: temperature, precipitation, and shortwave radiation. Additionally, we illustrate how the uncertainty due to a given climate driver depends both on the magnitude of the forcing data uncertainty (climate data range) and the apparent sensitivity of the modeled GPP to the driver (apparent model sensitivity). We find that LPJ-GUESS overestimates GPP compared to an empirically based GPP data product in all land cover classes except for tropical forests. Tropical forests emerge as a disproportionate source of uncertainty in GPP estimation both in the simulations and in the empirical data products. The tropical forest uncertainty is most strongly associated with the shortwave radiation and precipitation forcing, for which the climate data range contributes more to the overall uncertainty than the apparent model sensitivity to the forcing. Globally, precipitation dominates the climate-induced uncertainty over nearly half of the vegetated land area, which is mainly due
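    As a minimal illustration of the "range among simulations" uncertainty measure used in this record, the toy sketch below computes the climate-induced range and its percentage of the mean for a few invented grid cells; the GPP numbers are made up and no vegetation model is run.

```python
import numpy as np

# rows = forcing datasets, columns = grid cells; GPP values are illustrative only
gpp = np.array([[2.1, 0.9, 1.4],
                [2.4, 1.0, 1.3],
                [1.9, 1.1, 1.5]])

climate_range = gpp.max(axis=0) - gpp.min(axis=0)      # climate-induced uncertainty per cell
relative = 100.0 * climate_range / gpp.mean(axis=0)    # expressed as % of the mean GPP
print(climate_range, relative)
```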

  9. Efficiency of analytical and sampling-based uncertainty propagation in intensity-modulated proton therapy.

    Science.gov (United States)

    Wahl, N; Hennig, P; Wieser, H P; Bangert, M

    2017-06-26

    The sensitivity of intensity-modulated proton therapy (IMPT) treatment plans to uncertainties can be quantified and mitigated with robust/min-max and stochastic/probabilistic treatment analysis and optimization techniques. Those methods usually rely on sparse random, importance, or worst-case sampling. Inevitably, this imposes a trade-off between computational speed and accuracy of the uncertainty propagation. Here, we investigate analytical probabilistic modeling (APM) as an alternative for uncertainty propagation and minimization in IMPT that does not rely on scenario sampling. APM propagates probability distributions over range and setup uncertainties via a Gaussian pencil-beam approximation into moments of the probability distributions over the resulting dose in closed form. It supports arbitrary correlation models and allows for efficient incorporation of fractionation effects regarding random and systematic errors. We evaluate the trade-off between run-time and accuracy of APM uncertainty computations on three patient datasets. Results are compared against reference computations facilitating importance and random sampling. Two approximation techniques to accelerate uncertainty propagation and minimization based on probabilistic treatment plan optimization are presented. Runtimes are measured on CPU and GPU platforms, dosimetric accuracy is quantified in comparison to a sampling-based benchmark (5000 random samples). APM accurately propagates range and setup uncertainties into dose uncertainties at competitive run-times (GPU ≤ 5 min). The resulting standard deviation (expectation value) of dose shows average global γ (3%/3 mm) pass rates between 94.2% and 99.9% (98.4% and 100.0%). All investigated importance sampling strategies provided less accuracy at higher run-times considering only a single fraction. Considering fractionation, APM uncertainty propagation and treatment plan optimization was proven to be possible at constant time

  10. Efficiency of analytical and sampling-based uncertainty propagation in intensity-modulated proton therapy

    Science.gov (United States)

    Wahl, N.; Hennig, P.; Wieser, H. P.; Bangert, M.

    2017-07-01

    The sensitivity of intensity-modulated proton therapy (IMPT) treatment plans to uncertainties can be quantified and mitigated with robust/min-max and stochastic/probabilistic treatment analysis and optimization techniques. Those methods usually rely on sparse random, importance, or worst-case sampling. Inevitably, this imposes a trade-off between computational speed and accuracy of the uncertainty propagation. Here, we investigate analytical probabilistic modeling (APM) as an alternative for uncertainty propagation and minimization in IMPT that does not rely on scenario sampling. APM propagates probability distributions over range and setup uncertainties via a Gaussian pencil-beam approximation into moments of the probability distributions over the resulting dose in closed form. It supports arbitrary correlation models and allows for efficient incorporation of fractionation effects regarding random and systematic errors. We evaluate the trade-off between run-time and accuracy of APM uncertainty computations on three patient datasets. Results are compared against reference computations facilitating importance and random sampling. Two approximation techniques to accelerate uncertainty propagation and minimization based on probabilistic treatment plan optimization are presented. Runtimes are measured on CPU and GPU platforms, dosimetric accuracy is quantified in comparison to a sampling-based benchmark (5000 random samples). APM accurately propagates range and setup uncertainties into dose uncertainties at competitive run-times (GPU ≤ 5 min). The resulting standard deviation (expectation value) of dose shows average global γ (3%/3 mm) pass rates between 94.2% and 99.9% (98.4% and 100.0%). All investigated importance sampling strategies provided less accuracy at higher run-times considering only a single fraction. Considering fractionation, APM uncertainty propagation and treatment plan optimization was proven to be possible at constant time complexity
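    The closed-form propagation idea behind APM can be illustrated on a one-dimensional toy: a Gaussian lateral dose profile averaged analytically over a Gaussian setup error, checked against random sampling. This is only a schematic analogy to the record above, not the authors' implementation; beam width, setup error and sample size are arbitrary.

```python
import numpy as np

x = np.linspace(-30.0, 30.0, 241)              # lateral position [mm]
sigma_beam, sigma_setup, amp = 5.0, 3.0, 1.0

# closed form: mean over shift ~ N(0, sigma_setup^2) of a Gaussian lateral profile
sigma_tot = np.hypot(sigma_beam, sigma_setup)
dose_closed_form = amp * (sigma_beam / sigma_tot) * np.exp(-x**2 / (2 * sigma_tot**2))

# brute-force sampling benchmark
rng = np.random.default_rng(1)
shifts = rng.normal(0.0, sigma_setup, 5000)
samples = amp * np.exp(-(x[None, :] - shifts[:, None])**2 / (2 * sigma_beam**2))
dose_sampled = samples.mean(axis=0)

print(np.abs(dose_closed_form - dose_sampled).max())   # agreement of the two routes
```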

  11. Articulating uncertainty as part of scientific argumentation during model-based exoplanet detection tasks

    Science.gov (United States)

    Lee, Hee-Sun; Pallant, Amy; Pryputniewicz, Sarah

    2015-08-01

    Teaching scientific argumentation has emerged as an important goal for K-12 science education. In scientific argumentation, students are actively involved in coordinating evidence with theory based on their understanding of the scientific content and thinking critically about the strengths and weaknesses of the cited evidence in the context of the investigation. We developed a one-week-long online curriculum module called "Is there life in space?" where students conduct a series of four model-based tasks to learn how scientists detect extrasolar planets through the “wobble” and transit methods. The simulation model allows students to manipulate various parameters of an imaginary star and planet system such as planet size, orbit size, planet-orbiting-plane angle, and sensitivity of telescope equipment, and to adjust the display settings for graphs illustrating the relative velocity and light intensity of the star. Students can use model-based evidence to formulate an argument on whether particular signals in the graphs guarantee the presence of a planet. Students' argumentation is facilitated by the four-part prompts consisting of multiple-choice claim, open-ended explanation, Likert-scale uncertainty rating, and open-ended uncertainty rationale. We analyzed 1,013 scientific arguments formulated by 302 high school student groups taught by 7 teachers. We coded these arguments in terms of the accuracy of their claim, the sophistication of explanation connecting evidence to the established knowledge base, the uncertainty rating, and the scientific validity of uncertainty. We found that (1) only 18% of the students' uncertainty rationale involved critical reflection on limitations inherent in data and concepts, (2) 35% of students' uncertainty rationale reflected their assessment of personal ability and knowledge, rather than scientific sources of uncertainty related to the evidence, and (3) the nature of task such as the use of noisy data or the framing of

  12. CMOS-based Stochastically Spiking Neural Network for Optimization under Uncertainties

    Science.gov (United States)

    2017-03-01

    We discuss a ‘scenario generation’ circuit to non-parametrically estimate and emulate statistics of uncertain cost/constraints ... The discussed mixed-signal, CMOS-based architecture of a stochastically spiking neural network minimizes the area/power of each cell and

  13. Status of XSUSA for Sampling Based Nuclear Data Uncertainty and Sensitivity Analysis

    Directory of Open Access Journals (Sweden)

    Pasichnyk I.

    2013-03-01

    Full Text Available In the present contribution, an overview of the sampling-based XSUSA method for sensitivity and uncertainty analysis with respect to nuclear data is given. The focus is on recent developments and applications of XSUSA. These applications include calculations for critical assemblies, fuel assembly depletion calculations, and steady-state as well as transient reactor core calculations. The analyses are partially performed in the framework of international benchmark working groups (UACSA – Uncertainty Analyses for Criticality Safety Assessment, UAM – Uncertainty Analysis in Modelling). It is demonstrated that particularly for full-scale reactor calculations the influence of the nuclear data uncertainties on the results can be substantial. For instance, for the radial fission rate distributions of mixed UO2/MOX light water reactor cores, the 2σ uncertainties in the core centre and periphery can reach values exceeding 10%. For a fast transient, the resulting time behaviour of the reactor power was covered by a wide uncertainty band. Overall, the results confirm the necessity of adding systematic uncertainty analyses to best-estimate reactor calculations.
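    A schematic of the sampling-based workflow that XSUSA represents (draw correlated nuclear data perturbations from a covariance matrix, re-evaluate a response, report a 2σ band) is sketched below. It is a stand-in only: the one-group constants, the covariance values and the k-infinity formula are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)
mean = np.array([1.0, 1.0])                          # relative fission / absorption XS factors
cov = np.array([[0.02**2, -0.5 * 0.02 * 0.03],
                [-0.5 * 0.02 * 0.03, 0.03**2]])      # illustrative covariance matrix

def k_inf(factors):
    nu_sigma_f = 2.4 * 0.05 * factors[0]             # toy one-group constants
    sigma_a = 0.12 * factors[1]
    return nu_sigma_f / sigma_a

samples = rng.multivariate_normal(mean, cov, size=1000)
k = np.array([k_inf(f) for f in samples])
print(k.mean(), 2.0 * k.std(ddof=1))                 # best estimate and 2-sigma band
```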

  14. THE UNCERTAINTIES ON THE GIS BASED LAND SUITABILITY ASSESSMENT FOR URBAN AND RURAL PLANNING

    Directory of Open Access Journals (Sweden)

    H. Liu

    2017-09-01

    Full Text Available The majority of the research on the uncertainties of spatial data and spatial analysis focuses on a specific data feature or analysis tool. Few studies have addressed the uncertainties of the whole process of an application such as planning, leaving uncertainty research detached from practical applications. This paper discusses the uncertainties of geographical information systems (GIS) based land suitability assessment in planning on the basis of a literature review. The uncertainties considered range from the establishment of the index system to the classification of the final result. Methods to reduce the uncertainties arising from the discretization of continuous raster data and from index weight determination are summarized. The paper analyzes the merits and demerits of the “Nature Breaks” method, which is broadly used by planners. It also explores other factors that impact the accuracy of the final classification, such as the selection of the number of classes, the class intervals and the autocorrelation of the spatial data. In the conclusion, the paper indicates that the adoption of machine learning methods should be modified to integrate the complexity of land suitability assessment. The work contributes to the application of spatial data and spatial analysis uncertainty research to land suitability assessment, and raises the scientific level of subsequent planning and decision-making.

  15. The Uncertainties on the GIS Based Land Suitability Assessment for Urban and Rural Planning

    Science.gov (United States)

    Liu, H.; Zhan, Q.; Zhan, M.

    2017-09-01

    The majority of the research on the uncertainties of spatial data and spatial analysis focuses on a specific data feature or analysis tool. Few studies have addressed the uncertainties of the whole process of an application such as planning, leaving uncertainty research detached from practical applications. This paper discusses the uncertainties of geographical information systems (GIS) based land suitability assessment in planning on the basis of a literature review. The uncertainties considered range from the establishment of the index system to the classification of the final result. Methods to reduce the uncertainties arising from the discretization of continuous raster data and from index weight determination are summarized. The paper analyzes the merits and demerits of the "Nature Breaks" method, which is broadly used by planners. It also explores other factors that impact the accuracy of the final classification, such as the selection of the number of classes, the class intervals and the autocorrelation of the spatial data. In the conclusion, the paper indicates that the adoption of machine learning methods should be modified to integrate the complexity of land suitability assessment. The work contributes to the application of spatial data and spatial analysis uncertainty research to land suitability assessment, and raises the scientific level of subsequent planning and decision-making.

  16. Fluctuations in Uncertainty

    OpenAIRE

    Nicholas Bloom

    2013-01-01

    Uncertainty is an amorphous concept. It reflects uncertainty in the minds of consumers, managers, and policymakers about possible futures. It is also a broad concept, including uncertainty over the path of macro phenomena like GDP growth, micro phenomena like the growth rate of firms, and noneconomic events like war and climate change. In this essay, I address four questions about uncertainty. First, what are some facts and patterns about economic uncertainty? Both macro and micro uncertainty...

  17. Naive Probability: Model-Based Estimates of Unique Events.

    Science.gov (United States)

    Khemlani, Sangeet S; Lotstein, Max; Johnson-Laird, Philip N

    2015-08-01

    We describe a dual-process theory of how individuals estimate the probabilities of unique events, such as Hillary Clinton becoming U.S. President. It postulates that uncertainty is a guide to improbability. In its computer implementation, an intuitive system 1 simulates evidence in mental models and forms analog non-numerical representations of the magnitude of degrees of belief. This system has minimal computational power and combines evidence using a small repertoire of primitive operations. It resolves the uncertainty of divergent evidence for single events, for conjunctions of events, and for inclusive disjunctions of events, by taking a primitive average of non-numerical probabilities. It computes conditional probabilities in a tractable way, treating the given event as evidence that may be relevant to the probability of the dependent event. A deliberative system 2 maps the resulting representations into numerical probabilities. With access to working memory, it carries out arithmetical operations in combining numerical estimates. Experiments corroborated the theory's predictions. Participants concurred in estimates of real possibilities. They violated the complete joint probability distribution in the predicted ways, when they made estimates about conjunctions: P(A), P(B), P(A and B), disjunctions: P(A), P(B), P(A or B or both), and conditional probabilities P(A), P(B), P(B|A). They were faster to estimate the probabilities of compound propositions when they had already estimated the probabilities of each of their components. We discuss the implications of these results for theories of probabilistic reasoning. © 2014 Cognitive Science Society, Inc.

  18. Reducing uncertainty based on model fitness: Application to a ...

    African Journals Online (AJOL)

    2015-01-07

    Jan 7, 2015 ... local or global, and global methods can be based on regression, correlation, parameter bounding, and variance decomposition (Matott et al., 2009). Methods based on ...

  19. A scenario based approach for flexible resource loading under uncertainty

    NARCIS (Netherlands)

    Wullink, Gerhard; Gademann, Noud; Hans, Elias W.; van Harten, Aart

    2003-01-01

    Order acceptance decisions in manufacture-to-order environments are often made based on incomplete or uncertain information. To promise reliable due dates and to manage resource capacity adequately, resource capacity loading is an indispensable supporting tool. We propose a scenario based approach

  20. A new MC-based method to evaluate the fission fraction uncertainty at reactor neutrino experiment

    CERN Document Server

    Ma, X B; Chen, Y X

    2016-01-01

    Uncertainty in the fission fractions is an important uncertainty source for the antineutrino flux prediction in a reactor antineutrino experiment. A new MC-based method of evaluating the covariance coefficients between isotopes is proposed. It is found that the covariance coefficients vary with reactor burnup and may change from positive to negative because of the fission balance effect; for example, the covariance coefficient between $^{235}$U and $^{239}$Pu changes from 0.15 to -0.13. Using the relation between fission fraction and atomic density, consistent estimates of the fission fraction uncertainty and the covariance matrix are obtained. The resulting antineutrino flux uncertainty is 0.55%, which does not vary with reactor burnup, and the new value is about 8.3% smaller.
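    One way to see how a sum-to-one constraint can produce negative covariance between fission fractions, as described above, is the toy Monte Carlo sketch below. The nominal fractions, the perturbation size and the renormalization step are illustrative assumptions, not the paper's burnup calculation.

```python
import numpy as np

rng = np.random.default_rng(3)
nominal = np.array([0.56, 0.30, 0.07, 0.07])         # illustrative 235U, 239Pu, 238U, 241Pu
perturbed = nominal * (1.0 + rng.normal(0.0, 0.05, size=(5000, 4)))
fractions = perturbed / perturbed.sum(axis=1, keepdims=True)   # sum-to-one renormalization

corr = np.corrcoef(fractions[:, 0], fractions[:, 1])[0, 1]
print(corr)   # negative: the constraint anti-correlates the 235U and 239Pu fractions
```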

  1. The effect of uncertainties in distance-based ranking methods for multi-criteria decision making

    Science.gov (United States)

    Jaini, Nor I.; Utyuzhnikov, Sergei V.

    2017-08-01

    Data in multi-criteria decision making are often imprecise and changeable. Therefore, it is important to carry out a sensitivity analysis test for the multi-criteria decision making problem. The paper presents a sensitivity analysis for some ranking techniques based on distance measures in multi-criteria decision making. Two types of uncertainty are considered for the sensitivity analysis test. The first uncertainty is related to the input data, while the second concerns the decision maker's preferences (weights). The ranking techniques considered in this study are TOPSIS, the relative distance and the trade-off ranking methods. TOPSIS and the relative distance method measure a distance from an alternative to the ideal and anti-ideal solutions. In turn, the trade-off ranking method calculates the distance of an alternative to the extreme solutions and to the other alternatives. Several test cases are considered to study the performance of each ranking technique under both types of uncertainty.
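    A compact TOPSIS sketch with a crude weight-perturbation loop illustrates the kind of sensitivity test discussed in this record; the decision matrix, weights and perturbation size are invented, and the relative distance and trade-off ranking methods are not implemented here.

```python
import numpy as np

def topsis(decision, weights, benefit):
    # decision: alternatives x criteria; benefit marks criteria to maximize
    v = decision / np.linalg.norm(decision, axis=0) * weights
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_plus = np.linalg.norm(v - ideal, axis=1)
    d_minus = np.linalg.norm(v - anti, axis=1)
    return d_minus / (d_plus + d_minus)              # closeness coefficient, higher is better

decision = np.array([[250.0, 16.0, 12.0],
                     [200.0, 18.0, 8.0],
                     [300.0, 14.0, 16.0]])
weights = np.array([0.4, 0.3, 0.3])
benefit = np.array([False, True, True])              # first criterion is a cost

baseline = np.argsort(-topsis(decision, weights, benefit))
rng = np.random.default_rng(4)
for _ in range(5):                                   # perturb the decision maker's weights
    w = weights * (1.0 + rng.normal(0.0, 0.1, 3))
    w /= w.sum()
    print(np.argsort(-topsis(decision, w, benefit)), "vs baseline", baseline)
```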

  2. Uncertainty-based simulation-optimization using Gaussian process emulation: Application to coastal groundwater management

    Science.gov (United States)

    Rajabi, Mohammad Mahdi; Ketabchi, Hamed

    2017-12-01

    Combined simulation-optimization (S/O) schemes have long been recognized as a valuable tool in coastal groundwater management (CGM). However, previous applications have mostly relied on deterministic seawater intrusion (SWI) simulations. This is a questionable simplification, knowing that SWI models are inevitably prone to epistemic and aleatory uncertainty, and hence a management strategy obtained through S/O without consideration of uncertainty may result in significantly different real-world outcomes than expected. However, two key issues have hindered the use of uncertainty-based S/O schemes in CGM, which are addressed in this paper. The first issue is how to solve the computational challenges resulting from the need to perform massive numbers of simulations. The second issue is how the management problem is formulated in presence of uncertainty. We propose the use of Gaussian process (GP) emulation as a valuable tool in solving the computational challenges of uncertainty-based S/O in CGM. We apply GP emulation to the case study of Kish Island (located in the Persian Gulf) using an uncertainty-based S/O algorithm which relies on continuous ant colony optimization and Monte Carlo simulation. In doing so, we show that GP emulation can provide an acceptable level of accuracy, with no bias and low statistical dispersion, while tremendously reducing the computational time. Moreover, five new formulations for uncertainty-based S/O are presented based on concepts such as energy distances, prediction intervals and probabilities of SWI occurrence. We analyze the proposed formulations with respect to their resulting optimized solutions, the sensitivity of the solutions to the intended reliability levels, and the variations resulting from repeated optimization runs.
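    The general GP-emulation idea (fit a surrogate to a handful of expensive simulator runs, then do cheap Monte Carlo on the surrogate) can be sketched as follows. The `expensive_simulator` function, its inputs and the sample sizes are hypothetical stand-ins for a seawater-intrusion model, and the sketch omits the ant colony optimization layer described in the record above. Requires scikit-learn.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

def expensive_simulator(x):
    # hypothetical stand-in for a seawater-intrusion model: x = (recharge, pumping)
    return 2.0 * np.sin(x[:, 0]) + 0.5 * x[:, 1] ** 2

rng = np.random.default_rng(5)
X_train = rng.uniform(0.0, 3.0, size=(30, 2))        # a few costly design runs
y_train = expensive_simulator(X_train)

gp = GaussianProcessRegressor(ConstantKernel() * RBF([1.0, 1.0]), normalize_y=True)
gp.fit(X_train, y_train)

X_mc = rng.uniform(0.0, 3.0, size=(20000, 2))        # cheap Monte Carlo on the emulator
y_mc, y_std = gp.predict(X_mc, return_std=True)
print(y_mc.mean(), y_mc.std(), y_std.mean())         # output statistics + emulator error
```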

  3. Survey of sampling-based methods for uncertainty and sensitivity analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, Jay Dean; Helton, Jon Craig; Sallaberry, Cedric J.; Storlie, Curt B. (Colorado State University, Fort Collins, CO)

    2006-06-01

    Sampling-based methods for uncertainty and sensitivity analysis are reviewed. The following topics are considered: (1) Definition of probability distributions to characterize epistemic uncertainty in analysis inputs, (2) Generation of samples from uncertain analysis inputs, (3) Propagation of sampled inputs through an analysis, (4) Presentation of uncertainty analysis results, and (5) Determination of sensitivity analysis results. Special attention is given to the determination of sensitivity analysis results, with brief descriptions and illustrations given for the following procedures/techniques: examination of scatterplots, correlation analysis, regression analysis, partial correlation analysis, rank transformations, statistical tests for patterns based on gridding, entropy tests for patterns based on gridding, nonparametric regression analysis, squared rank differences/rank correlation coefficient test, two dimensional Kolmogorov-Smirnov test, tests for patterns based on distance measures, top down coefficient of concordance, and variance decomposition.
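    A minimal end-to-end example of the sampling-based workflow surveyed above, using a Latin hypercube design and Spearman rank correlations as the sensitivity measure (one of the listed techniques); the toy model and input ranges are invented.

```python
import numpy as np
from scipy.stats import qmc, spearmanr

def model(x):
    return x[:, 0] ** 2 + 3.0 * x[:, 1] + 0.1 * x[:, 2]      # toy analysis

sampler = qmc.LatinHypercube(d=3, seed=6)
x = qmc.scale(sampler.random(500), l_bounds=[0, 0, 0], u_bounds=[1, 2, 5])
y = model(x)                                                 # propagation step

print("mean and 5th/95th percentiles:", y.mean(), np.percentile(y, [5, 95]))
for i, name in enumerate(["x1", "x2", "x3"]):
    rho, _ = spearmanr(x[:, i], y)                           # rank-based sensitivity measure
    print(name, "rank correlation with output:", round(rho, 2))
```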

  4. Modeling Representation Uncertainty in Concept-Based Multimedia Retrieval

    NARCIS (Netherlands)

    Aly, Robin

    2010-01-01

    This thesis considers concept-based multimedia retrieval, where documents are represented by the occurrence of concepts (also referred to as semantic concepts or high-level features). A concept can be thought of as a kind of label, which is attached to (parts of) the multimedia documents in which it

  5. Modeling representation uncertainty in concept-based multimedia retrieval

    NARCIS (Netherlands)

    Aly, Robin

    2010-01-01

    This thesis considers concept-based multimedia retrieval, where documents are represented by the occurrence of concepts (also referred to as semantic concepts or high-level features). A concept can be thought of as a kind of label, which is attached to (parts of) the multimedia documents in which it

  6. Reducing uncertainty based on model fitness: Application to a ...

    African Journals Online (AJOL)

    2015-01-07

    Jan 7, 2015 ... This general methodology is applied to a reservoir model of the Okavango ... local or global, and global methods can be based on regression ... empirically represent rooting depth and simulate a linear ... we apply an enhanced version of FAST, the extended ...

  7. Stereo based Obstacle Detection with Uncertainty in Rough Terrain

    NARCIS (Netherlands)

    Mark, W. van der; Heuvel, J.C. van den; Groen, F.C.A.

    2007-01-01

    Autonomous robot vehicles that operate in offroad terrain should avoid obstacle hazards. In this paper we present a stereo vision based method that is able to cluster reconstructed terrain points into obstacles by evaluating their relative angles and distances. In our approach, constraints are

  8. Uncertainty of scattered light roughness measurements based on speckle correlation methods

    Science.gov (United States)

    Patzelt, Stefan; Stöbener, Dirk; Ströbel, Gerald; Fischer, Andreas

    2017-06-01

    Surface microtopography measurement (e.g., form, waviness, roughness) is a precondition for assessing the surface quality of technical components with regard to their applications. Well-defined, standardized measuring devices measure and specify geometrical surface textures only under laboratory conditions. Laser speckle-based roughness measurement is a parametric optical scattered-light measuring technique that overcomes this limitation. Field-of-view dimensions of a few square millimeters and measuring frequencies in the kHz domain enable in-process roughness characterization even of moving part surfaces. However, camera exposure times of microseconds or less and a high detector pixel density mean less light energy per pixel due to the limited laser power. This affects the achievable measurement uncertainty according to the Heisenberg uncertainty principle. The influence of fundamental, inevitable noise sources such as laser shot noise and detector noise has not yet been quantified. Therefore, the uncertainty of speckle roughness measurements is estimated analytically. The result confirms the expected inverse proportionality of the measurement uncertainty to the square root of the illuminating light power and the direct proportionality to the detector readout noise, quantization noise and dark current noise, respectively. For the first time it is possible to quantify the achievable measurement uncertainty u(Sa) < 1 nm for the scattered light measuring system. The low uncertainty offers ideal preconditions for in-process roughness measurements in an industrial environment with a targeted resolution of 1 nm.

  9. Multifidelity Sparse-Grid-Based Uncertainty Quantification for the Hokkaido Nansei-oki Tsunami

    Science.gov (United States)

    de Baar, Jouke H. S.; Roberts, Stephen G.

    2017-08-01

    With uncertainty quantification, we aim to efficiently propagate the uncertainties in the input parameters of a computer simulation, in order to obtain a probability distribution of its output. In this work, we use multi-fidelity sparse grid interpolation to propagate the uncertainty in the shape of the incoming wave for the Okushiri test-case, which is a wave tank model of a part of the 1993 Hokkaido Nansei-oki tsunami. An important issue with many uncertainty quantification approaches is the `curse of dimensionality': the overall computational cost of the uncertainty propagation increases rapidly when we increase the number of uncertain input parameters. We aim to mitigate the curse of dimensionality by using a multifidelity approach. In the multifidelity approach, we combine results from a small number of accurate and expensive high-fidelity simulations with a large number of less accurate but also less expensive low-fidelity simulations. For the Okushiri test-case, we find an improved scaling when we increase the number of uncertain input parameters. This results in a significant reduction of the overall computational cost. For example, for four uncertain input parameters, accurate uncertainty quantification based on only high-fidelity simulations comes at a normalised cost of 219 high-fidelity simulations; when we use a multifidelity approach, this is reduced to a normalised cost of only 10 high-fidelity simulations.
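    The multifidelity idea can be reduced to a one-dimensional toy: correct a cheap low-fidelity map with a discrepancy interpolated from a few high-fidelity evaluations, then sample the corrected surrogate. The functions below are invented and bear no relation to the actual wave-tank model; sparse grids are replaced by simple 1-D interpolation.

```python
import numpy as np

def high_fidelity(a):      # stand-in for an expensive fine-resolution run
    return np.sin(3.0 * a) + 0.3 * a

def low_fidelity(a):       # cheap, biased approximation of the same map
    return np.sin(3.0 * a)

a_hf = np.linspace(0.0, 1.0, 5)                        # only 5 high-fidelity evaluations
discrepancy = high_fidelity(a_hf) - low_fidelity(a_hf)

rng = np.random.default_rng(7)
a_mc = rng.uniform(0.0, 1.0, 100000)                   # samples of the uncertain input
surrogate = low_fidelity(a_mc) + np.interp(a_mc, a_hf, discrepancy)

print(surrogate.mean(), surrogate.std())               # uncertainty of the output
print(np.abs(surrogate - high_fidelity(a_mc)).max())   # surrogate error check
```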

  10. An event-based account of conformity.

    Science.gov (United States)

    Kim, Diana; Hommel, Bernhard

    2015-04-01

    People often change their behavior and beliefs when confronted with deviating behavior and beliefs of others, but the mechanisms underlying such phenomena of conformity are not well understood. Here we suggest that people cognitively represent their own actions and others' actions in comparable ways (theory of event coding), so that they may fail to distinguish these two categories of actions. If so, other people's actions that have no social meaning should induce conformity effects, especially if those actions are similar to one's own actions. We found that female participants adjusted their manual judgments of the beauty of female faces in the direction consistent with distracting information without any social meaning (numbers falling within the range of the judgment scale) and that this effect was enhanced when the distracting information was presented in movies showing the actual manual decision-making acts. These results confirm that similarity between an observed action and one's own action matters. We also found that the magnitude of the standard conformity effect was statistically equivalent to the movie-induced effect. © The Author(s) 2015.

  11. Understanding uncertainty in process-based hydrological models

    Science.gov (United States)

    Clark, M. P.; Kavetski, D.; Slater, A. G.; Newman, A. J.; Marks, D. G.; Landry, C.; Lundquist, J. D.; Rupp, D. E.; Nijssen, B.

    2013-12-01

    Building an environmental model requires making a series of decisions regarding the appropriate representation of natural processes. While some of these decisions can already be based on well-established physical understanding, gaps in our current understanding of environmental dynamics, combined with incomplete knowledge of properties and boundary conditions of most environmental systems, make many important modeling decisions far more ambiguous. There is consequently little agreement regarding what a 'correct' model structure is, especially at relatively larger spatial scales such as catchments and beyond. In current practice, faced with such a range of decisions, different modelers will generally make different modeling decisions, often on an ad hoc basis, based on their balancing of process understanding, the data available to evaluate the model, the purpose of the modeling exercise, and their familiarity with or investment in an existing model infrastructure. This presentation describes development and application of multiple-hypothesis models to evaluate process-based hydrologic models. Our numerical model uses robust solutions of the hydrology and thermodynamic governing equations as the structural core, and incorporates multiple options to represent the impact of different modeling decisions, including multiple options for model parameterizations (e.g., below-canopy wind speed, thermal conductivity, storage and transmission of liquid water through soil, etc.), as well as multiple options for model architecture, that is, the coupling and organization of different model components (e.g., representations of sub-grid variability and hydrologic connectivity, coupling with groundwater, etc.). Application of this modeling framework across a collection of different research basins demonstrates that differences among model parameterizations are often overwhelmed by differences among equally-plausible model parameter sets, while differences in model architecture lead

  12. Uncertainties around incretin-based therapies: A literature review

    Directory of Open Access Journals (Sweden)

    Bader Al Tulaihi

    2017-01-01

    Full Text Available Background: Diabetes mellitus is a chronic, debilitating and non-communicable disease. It has several long-term outcomes that are associated with various types of end-organ damage, mainly to the heart, blood vessels, eyes, nerves, and kidneys. There are different modalities of treatment of diabetes. The recent incretin-based therapies provide an innovative class of drugs including GLP-1 receptor agonists and DPP-4 inhibitors. This review aims to summarize the available evidence on their effectiveness. Method: This is a narrative review. Several databases were searched. Search terms were MeSH terms and keywords combined with Boolean operators; the combinations differed by database but were comparable. Studies included were randomized controlled trials, cohort and case-control studies, health technology reports, meta-analyses, and systematic reviews. Results were analysed and reported in a narrative style with emphasis on the effectiveness and adverse effects of the various types of incretin-based therapies. Results: 17 articles that fulfilled the inclusion criteria were retrieved. They were heterogeneous in terms of interventions, participants, settings and outcomes. The studies, conducted in several settings, varied in their quality and/or the reporting of their findings. There are two types of incretin: glucose-dependent insulinotropic peptide (GIP) and glucagon-like peptide 1 (GLP-1). There is no question that incretin-based glucose-lowering medications have been demonstrated to be effective glucose-lowering drugs. They have a proven evidence-based efficacy profile and appear to stimulate weight loss with minimal hypoglycaemia. However, there are a few side effects that should not be overlooked when deciding to use such therapies. Conclusion: The findings of our review do not prove that these agents are unsafe, but they do suggest that the burden of evidence now rests with those who hope to persuade us of their safety

  13. Uncertainty theory

    CERN Document Server

    Liu, Baoding

    2015-01-01

    When no samples are available to estimate a probability distribution, we have to invite some domain experts to evaluate the belief degree that each event will happen. Perhaps some people think that the belief degree should be modeled by subjective probability or fuzzy set theory. However, it is usually inappropriate because both of them may lead to counterintuitive results in this case. In order to rationally deal with belief degrees, uncertainty theory was founded in 2007 and subsequently studied by many researchers. Nowadays, uncertainty theory has become a branch of axiomatic mathematics for modeling belief degrees. This is an introductory textbook on uncertainty theory, uncertain programming, uncertain statistics, uncertain risk analysis, uncertain reliability analysis, uncertain set, uncertain logic, uncertain inference, uncertain process, uncertain calculus, and uncertain differential equation. This textbook also shows applications of uncertainty theory to scheduling, logistics, networks, data mining, c...

  14. A Belief Rule Based Expert System to Assess Mental Disorder under Uncertainty

    DEFF Research Database (Denmark)

    Hossain, Mohammad Shahadat; Afif Monrat, Ahmed; Hasan, Mamun

    2016-01-01

    Mental disorder is a change of mental or behavioral pattern that causes sufferings and impairs the ability to function in ordinary life. In psychopathology, the assessment methods of mental disorder contain various types of uncertainties associated with signs and symptoms. This study identifies...... to ignorance, incompleteness, and randomness. So, a belief rule-based expert system (BRBES) has been designed and developed with the capability of handling the uncertainties mentioned. Evidential reasoning works as the inference engine and the belief rule base as the knowledge representation schema...

  15. Uncertainties of flood frequency estimation approaches based on continuous simulation using data resampling

    Science.gov (United States)

    Arnaud, Patrick; Cantet, Philippe; Odry, Jean

    2017-11-01

    Flood frequency analyses (FFAs) are needed for flood risk management. Many methods exist ranging from classical purely statistical approaches to more complex approaches based on process simulation. The results of these methods are associated with uncertainties that are sometimes difficult to estimate due to the complexity of the approaches or the number of parameters, especially for process simulation. This is the case of the simulation-based FFA approach called SHYREG presented in this paper, in which a rainfall generator is coupled with a simple rainfall-runoff model in an attempt to estimate the uncertainties due to the estimation of the seven parameters needed to estimate flood frequencies. The six parameters of the rainfall generator are mean values, so their theoretical distribution is known and can be used to estimate the generator uncertainties. In contrast, the theoretical distribution of the single hydrological model parameter is unknown; consequently, a bootstrap method is applied to estimate the calibration uncertainties. The propagation of uncertainty from the rainfall generator to the hydrological model is also taken into account. This method is applied to 1112 basins throughout France. Uncertainties coming from the SHYREG method and from purely statistical approaches are compared, and the results are discussed according to the length of the recorded observations, basin size and basin location. Uncertainties of the SHYREG method decrease as the basin size increases or as the length of the recorded flow increases. Moreover, the results show that the confidence intervals of the SHYREG method are relatively small despite the complexity of the method and the number of parameters (seven). This is due to the stability of the parameters and takes into account the dependence of uncertainties due to the rainfall model and the hydrological calibration. Indeed, the uncertainties on the flow quantiles are on the same order of magnitude as those associated with
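    The bootstrap step described above can be illustrated with a deliberately tiny example: resample calibration "years" with replacement, re-estimate a single runoff coefficient each time, and read off its uncertainty interval. The data generator and the one-parameter model are invented; SHYREG itself is far richer.

```python
import numpy as np

rng = np.random.default_rng(8)
rain = rng.gamma(2.0, 40.0, 30)                  # 30 "years" of seasonal rainfall (toy)
flow = 0.6 * rain + rng.normal(0.0, 10.0, 30)    # observed flows, true coefficient = 0.6

def calibrate(r, q):
    return np.sum(r * q) / np.sum(r * r)         # least-squares runoff coefficient

estimates = []
for _ in range(2000):
    idx = rng.integers(0, len(rain), len(rain))  # resample years with replacement
    estimates.append(calibrate(rain[idx], flow[idx]))

print(np.percentile(estimates, [5, 50, 95]))     # calibration uncertainty interval
```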

  16. The flood event explorer - a web based framework for rapid flood event analysis

    Science.gov (United States)

    Schröter, Kai; Lüdtke, Stefan; Kreibich, Heidi; Merz, Bruno

    2015-04-01

    Flood disaster management, recovery and reconstruction planning benefit from rapid evaluations of flood events and expected impacts. The near real-time in-depth analysis of flood causes and key drivers of flood impacts requires close monitoring and documentation of hydro-meteorological and socio-economic factors. Within CEDIM's Rapid Flood Event Analysis project, a flood event analysis system is being developed which enables the near real-time evaluation of large-scale floods in Germany. The analysis system includes functionalities to compile event-related hydro-meteorological data, to evaluate the current flood situation, to assess hazard intensity and to estimate flood damage to residential buildings. A German flood event database is under development, which contains various hydro-meteorological information - and in the future also impact information - for all large-scale floods since 1950. This database comprises data on historic flood events, which allows the classification of ongoing floods in terms of triggering processes and pre-conditions, critical controls and drivers of flood losses. The flood event analysis system has been implemented in a database system which automatically retrieves and stores data from more than 100 online discharge gauges on a daily basis. The current discharge observations are evaluated in a long-term context in terms of flood frequency analysis. The web-based frontend visualizes the current flood situation in comparison to any past flood from the flood catalogue. The regional flood database for Germany contains hydro-meteorological data and aggregated severity indices for a set of 76 historic large-scale flood events in Germany. This database has been used to evaluate the key drivers of the flood in June 2013.

  17. Model and parametric uncertainty in source-based kinematic models of earthquake ground motion

    Science.gov (United States)

    Hartzell, Stephen; Frankel, Arthur; Liu, Pengcheng; Zeng, Yuehua; Rahman, Shariftur

    2011-01-01

    Four independent ground-motion simulation codes are used to model the strong ground motion for three earthquakes: 1994 Mw 6.7 Northridge, 1989 Mw 6.9 Loma Prieta, and 1999 Mw 7.5 Izmit. These 12 sets of synthetics are used to make estimates of the variability in ground-motion predictions. In addition, ground-motion predictions over a grid of sites are used to estimate parametric uncertainty for changes in rupture velocity. We find that the combined model uncertainty and random variability of the simulations is in the same range as the variability of regional empirical ground-motion data sets. The majority of the standard deviations lie between 0.5 and 0.7 natural-log units for response spectra and 0.5 and 0.8 for Fourier spectra. The estimate of model epistemic uncertainty, based on the different model predictions, lies between 0.2 and 0.4, which is about one-half of the estimates for the standard deviation of the combined model uncertainty and random variability. Parametric uncertainty, based on variation of just the average rupture velocity, is shown to be consistent in amplitude with previous estimates, showing percentage changes in ground motion from 50% to 300% when rupture velocity changes from 2.5 to 2.9 km/s. In addition, there is some evidence that mean biases can be reduced by averaging ground-motion estimates from different methods.

  18. Predictability of prototype flash flood events in the Western Mediterranean under uncertainties of the precursor upper-level disturbance: the HYDROPTIMET case studies

    Directory of Open Access Journals (Sweden)

    R. Romero

    2005-01-01

    uncertainty in the representation of the upper-level disturbance and the necessity to cope with it within the operational context when attempting to issue short- to mid-range numerical weather predictions of these high-impact weather events, a systematic exploration of the predictability of the three selected case studies subject to uncertainties in the representation of the upper-level precursor disturbance is carried out in this paper. The study is based on an ensemble of mesoscale numerical simulations of each event with the MM5 non-hydrostatic model after perturbing the upper-level disturbance in a systematic way, that is, slightly displacing the disturbance upstream/downstream along the zonal direction and intensifying/weakening its amplitude. These perturbations are guided by a previous application of the MM5-adjoint model, which consistently shows high sensitivities of the dynamical control of the heavy rain to the flow configuration about the upper-level disturbance on the day before, thus confirming the precursor characteristics of this agent. The perturbations are introduced to the initial conditions by applying a potential vorticity (PV) inversion procedure to the positive PV anomaly associated with the upper-level disturbance, and then using the inverted fields (wind, temperature and geopotential) to modify, under a physically consistent balance, the model initial fields. The results generally show that the events dominated by mesoscale low-level disturbances (Catalogne and the last stage of the Piémont episode) are very sensitive to the initial uncertainties, such that the heavy rain location and magnitude are strongly changed in some of the experiments in response to the 'forecast errors' of the cyclone trajectory, intensity, shape and translational speed. In contrast, the other situations (Cévennes and the initial stage of the Piémont episode), dominated by a larger-scale system which basically acts to guarantee the establishment and persistence of the southerly LLJ

  19. Event-based Simulation Model for Quantum Optics Experiments

    NARCIS (Netherlands)

    De Raedt, H.; Michielsen, K.; Jaeger, G; Khrennikov, A; Schlosshauer, M; Weihs, G

    2011-01-01

    We present a corpuscular simulation model of optical phenomena that does not require the knowledge of the solution of a wave equation of the whole system and reproduces the results of Maxwell's theory by generating detection events one-by-one. The event-based corpuscular model gives a unified

  20. IBES: A Tool for Creating Instructions Based on Event Segmentation

    Directory of Open Access Journals (Sweden)

    Katharina eMura

    2013-12-01

    Full Text Available Receiving informative, well-structured, and well-designed instructions supports performance and memory in assembly tasks. We describe IBES, a tool with which users can quickly and easily create multimedia, step-by-step instructions by segmenting a video of a task into segments. In a validation study we demonstrate that the step-by-step structure of the visual instructions created by the tool corresponds to the natural event boundaries, which are assessed by event segmentation and are known to play an important role in memory processes. In one part of the study, twenty participants created instructions based on videos of two different scenarios by using the proposed tool. In the other part of the study, ten and twelve participants respectively segmented videos of the same scenarios yielding event boundaries for coarse and fine events. We found that the visual steps chosen by the participants for creating the instruction manual had corresponding events in the event segmentation. The number of instructional steps was a compromise between the number of fine and coarse events. Our interpretation of results is that the tool picks up on natural human event perception processes of segmenting an ongoing activity into events and enables the convenient transfer into meaningful multimedia instructions for assembly tasks. We discuss the practical application of IBES, for example, creating manuals for differing expertise levels, and give suggestions for research on user-oriented instructional design based on this tool.

  1. A Short Review of FDTD-Based Methods for Uncertainty Quantification in Computational Electromagnetics

    Directory of Open Access Journals (Sweden)

    Theodoros T. Zygiridis

    2017-01-01

    Full Text Available We provide a review of selected computational methodologies that are based on the deterministic finite-difference time-domain algorithm and are suitable for the investigation of electromagnetic problems involving uncertainties. As it will become apparent, several alternatives capable of performing uncertainty quantification in a variety of cases exist, each one exhibiting different qualities and ranges of applicability, which we intend to point out here. Given the numerous available approaches, the purpose of this paper is to clarify the main strengths and weaknesses of the described methodologies and help the potential readers to safely select the most suitable approach for their problem under consideration.

  2. Reduction of slope stability uncertainty based on hydraulic measurement via inverse analysis

    NARCIS (Netherlands)

    Vardon, P.J.; Liu, K.; Hicks, M.A.

    2016-01-01

    The determination of slope stability for existing slopes is challenging, partly due to the spatial variability of soils. Reliability-based design can incorporate uncertainties and yield probabilities of slope failure. Field measurements can be utilised to constrain probabilistic analyses, thereby

  3. Set-Based Approach to Design under Uncertainty and Applications to Shaping a Hydrofoil

    Science.gov (United States)

    2016-01-01

    Johannes O. Royset, Professor, Operations Research Department ... manufacturing errors. This work is supported in part by DARPA under N66001-15-2-4055.

  4. Model-based Type B uncertainty evaluations of measurement towards more objective evaluation strategies

    NARCIS (Netherlands)

    Boumans, M.

    2013-01-01

    This article proposes a more objective Type B evaluation. This can be achieved when Type B uncertainty evaluations are model-based. This implies, however, grey-box modelling and validation instead of white-box modelling and validation which are appropriate for Type A evaluation.

  5. Cost benefits of postponing time-based maintenance under lifetime distribution uncertainty

    NARCIS (Netherlands)

    de Jonge, Bram; Dijkstra, Arjan; Romeijnders, Ward

    We consider the problem of scheduling time-based preventive maintenance under uncertainty in the lifetime distribution of a unit, with the understanding that every time a maintenance action is carried out, additional information on the lifetime distribution becomes available. Under such

  6. An Optimization-Based Approach to Determine System Requirements Under Multiple-Domain Specific Uncertainties

    Science.gov (United States)

    2016-04-30

    ... employed by airlines. The expected fleet profit values return to the top-level subspace as the metrics of interest. The process continues until the ... incorporates techniques from multidisciplinary design optimization, statistical theory, and robust/reliability-based methods to develop computationally ... domain-specific uncertainties. These two competing objectives of productivity and fuel consumption (maximizing productivity increases fuel consumption

  7. Analyzing the uncertainty of ensemble-based gridded observations in land surface simulations and drought assessment

    Science.gov (United States)

    Ahmadalipour, Ali; Moradkhani, Hamid

    2017-12-01

    Hydrologic modeling is one of the primary tools utilized for drought monitoring and drought early warning systems. Several sources of uncertainty in hydrologic modeling have been addressed in the literature. However, few studies have assessed the uncertainty of gridded observation datasets from a drought monitoring perspective. This study provides a hydrologic-modeling-oriented analysis of the gridded observation data uncertainties over the Pacific Northwest (PNW) and their implications for drought assessment. We utilized a recently developed 100-member ensemble of observed forcing data to simulate hydrologic fluxes at 1/8° spatial resolution using the Variable Infiltration Capacity (VIC) model, and compared the results with a deterministic observation. Meteorological and hydrological droughts are studied at multiple timescales over the basin, and seasonal long-term trends and variations of drought extent are investigated for each case. Results reveal large uncertainty of the observed datasets at the monthly timescale, with systematic differences for temperature records, mainly due to different lapse rates. This uncertainty results in large disparities in drought characteristics. In general, an increasing trend is found for winter drought extent across the PNW. Furthermore, a ∼3% decrease per decade is detected for snow water equivalent (SWE) over the PNW, with the region being more susceptible to SWE variations in the northern Rockies than in the western Cascades. The agricultural areas of southern Idaho show a decreasing trend of natural soil moisture as a result of precipitation decline, which implies a higher demand for anthropogenic water storage and irrigation systems.
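    A schematic of the ensemble-forcing analysis is sketched below: for each observation ensemble member, flag grid cells below a drought threshold and record the spread of the resulting drought-extent estimates. The soil-moisture fields are random placeholders rather than VIC output, and the threshold is arbitrary.

```python
import numpy as np

rng = np.random.default_rng(9)
members, cells = 100, 5000
soil_moisture = rng.normal(0.25, 0.05, size=(members, cells))   # placeholder fields

threshold = 0.18                                  # arbitrary fixed drought threshold
extent = 100.0 * (soil_moisture < threshold).mean(axis=1)       # % area in drought, per member

print(extent.mean(), extent.min(), extent.max())  # spread = forcing-induced uncertainty
```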

  8. A framework for model-based optimization of bioprocesses under uncertainty: Lignocellulosic ethanol production case

    DEFF Research Database (Denmark)

    Morales Rodriguez, Ricardo; Meyer, Anne S.; Gernaey, Krist

    2012-01-01

    metrics via uncertainty analysis. Finally, stochastic programming is applied to drive the process development efforts forward subject to these uncertainties. The framework is evaluated on four different process configurations for cellulosic ethanol production including Simultaneous Saccharification and Co.......), hydrolysis (inhibition constant for xylose on conversion of cellulose and cellobiose, etc) and co-fermentation (ethanol yield on xylose, inhibition constant on microbial growth, etc.), are the most significant sources of uncertainties affecting the unit production cost of ethanol with a standard deviation...... of up to 0.13 USD/gal-ethanol. Further stochastic optimization demonstrated the options for further reduction of the production costs with different processing configurations, reaching a reduction of up to 28% in the production cost in the SHCF configuration compared to the base case operation. Further...

  9. Optimal processing pathway selection for microalgae-based biorefinery under uncertainty

    DEFF Research Database (Denmark)

    Rizwan, Muhammad; Zaman, Muhammad; Lee, Jay H.

    2015-01-01

    to the sMINLP problem determines the processing technologies, material flows, and product portfolio that are optimal with respect to all the sampled scenarios. The developed framework is implemented and tested on a specific case study. The optimal processing pathways selected with and without......We propose a systematic framework for the selection of optimal processing pathways for a microalgaebased biorefinery under techno-economic uncertainty. The proposed framework promotes robust decision making by taking into account the uncertainties that arise due to inconsistencies among...... and shortage in the available technical information. A stochastic mixed integer nonlinear programming (sMINLP) problem is formulated for determining the optimal biorefinery configurations based on a superstructure model where parameter uncertainties are modeled and included as sampled scenarios. The solution...

  10. An all-but-one entropic uncertainty relation, and application to password-based identification

    NARCIS (Netherlands)

    Bouman, N.J.; Fehr, S.; González-Guillén, C.; Schaffner, C.

    2013-01-01

    Entropic uncertainty relations are quantitative characterizations of Heisenberg’s uncertainty principle, which make use of an entropy measure to quantify uncertainty. We propose a new entropic uncertainty relation. It is the first such uncertainty relation that lower bounds the uncertainty in the
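
    As background on the form such relations take (this is the classic Maassen–Uffink bound, not the paper's all-but-one result), an entropic uncertainty relation for two measurements X and Z on the same quantum state reads

    $$ H(X) + H(Z) \;\ge\; \log_2 \frac{1}{c}, \qquad c = \max_{x,z}\, \lvert \langle x \mid z \rangle \rvert^2 , $$

    where H is the Shannon entropy of the measurement outcomes and c is the maximal overlap of the two measurement bases; the smaller the overlap, the larger the guaranteed joint uncertainty.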

  11. Power quality events recognition using a SVM-based method

    Energy Technology Data Exchange (ETDEWEB)

    Cerqueira, Augusto Santiago; Ferreira, Danton Diego; Ribeiro, Moises Vidal; Duque, Carlos Augusto [Department of Electrical Circuits, Federal University of Juiz de Fora, Campus Universitario, 36036 900, Juiz de Fora MG (Brazil)

    2008-09-15

    In this paper, a novel SVM-based method for power quality event classification is proposed. A simple approach for feature extraction is introduced, based on the subtraction of the fundamental component from the acquired voltage signal. The resulting signal is presented to a support vector machine for event classification. Results from simulation are presented and compared with two other methods, the OTFR and the LCEC. The proposed method shows improved performance at a reasonable computational cost. (author)
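
    A minimal sketch of the pipeline described above, assuming a 60 Hz system, a least-squares fit for the fundamental, and a small illustrative residual-feature set (the paper's exact features, sampling rate and SVM settings are not specified here):

```python
# Hedged sketch: remove the fundamental by least-squares fit, derive simple
# residual features, and classify events with an SVM. All constants are
# illustrative assumptions, not the authors' configuration.
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

FS = 15360   # assumed sampling rate (Hz), 256 samples per 60 Hz cycle
F0 = 60.0    # fundamental frequency (Hz)

def remove_fundamental(v, fs=FS, f0=F0):
    """Least-squares fit of the fundamental sine/cosine plus offset; return residual."""
    t = np.arange(len(v)) / fs
    basis = np.column_stack([np.sin(2 * np.pi * f0 * t),
                             np.cos(2 * np.pi * f0 * t),
                             np.ones_like(t)])
    coef, *_ = np.linalg.lstsq(basis, v, rcond=None)
    return v - basis @ coef

def features(v):
    """Simple statistics of the residual used as SVM inputs (illustrative choice)."""
    r = remove_fundamental(v)
    return [np.std(r), np.max(np.abs(r)), np.mean(r**2), np.sum(np.abs(np.diff(r)))]

def train_classifier(windows, labels):
    """windows: list of voltage windows; labels: event classes (sag, swell, ...)."""
    X = np.array([features(v) for v in windows])
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0, gamma="scale"))
    return clf.fit(X, labels)
```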

  12. Dynamic UAV-based traffic monitoring under uncertainty as a stochastic arc-inventory routing policy

    Directory of Open Access Journals (Sweden)

    Joseph Y.J. Chow

    2016-10-01

    Full Text Available Given the rapid advances in unmanned aerial vehicles, or drones, and the increasing need to monitor at a city level, one of the current research gaps is how to systematically deploy drones over multiple periods. We propose a real-time data-driven approach: we formulate the first deterministic arc-inventory routing problem and derive its stochastic dynamic policy. The policy is expected to be of greatest value in scenarios where uncertainty is highest and costliest, such as city monitoring during major events. The Bellman equation for an approximation of the proposed inventory routing policy is formulated as a selective vehicle routing problem. We propose an approximate dynamic programming algorithm based on Least Squares Monte Carlo simulation to find that policy. The algorithm has been modified so that the least squares dependent variable is defined to be the “expected stock out cost upon the next replenishment”. The new algorithm is tested on 30 simulated instances of real-time trajectories over 5 time periods of the selective vehicle routing problem to evaluate the proposed policy and algorithm. Computational results on the selected instances show that the algorithm on average outperforms the myopic policy by 23–28%, depending on the parametric design. Further tests are conducted on classic benchmark arc routing problem instances. The 11-link instance gdb19 (Golden et al., 1983) is expanded into a sequential 15-period stochastic dynamic example and used to demonstrate why a naïve static multi-period deployment plan would not be effective in real networks.

  13. Human based roots of failures in nuclear events investigations

    Energy Technology Data Exchange (ETDEWEB)

    Ziedelis, Stanislovas; Noel, Marc; Strucic, Miodrag [Commission of the European Communities, Petten (Netherlands). European Clearinghouse on Operational Experience Feedback for Nuclear Power Plants

    2012-10-15

    This paper aims at improving the quality of event investigations in the nuclear industry through analysis of existing practices, and by identifying and removing the existing Human and Organizational Factors (HOF) and management related barriers. It presents the essential results of several studies performed by the European Clearinghouse on Operational Experience. Outcomes of the studies are based on a survey of currently existing event investigation practices typical for the nuclear industry of 12 European countries, as well as on insights from analysis of numerous event investigation reports. The system of operational experience feedback based on event investigation results is not effective enough to prevent, or even to decrease the frequency of, recurring events due to existing methodological, HOF-related and/or knowledge management related constraints. Besides that, several latent root causes of unsuccessful event investigation are related to weaknesses in the safety culture of personnel and managers. These weaknesses include focus on costs or schedule, political manipulation, arrogance, ignorance, entitlement and/or autocracy. Upgrades in the safety culture of an organization's personnel, and especially of its senior management, seem to be an effective way towards improvement. Increasing the competencies, capabilities and level of independence of event investigation teams, elaboration of comprehensive software, and ensuring a positive approach, adequate support and impartiality of management could also facilitate improvement of the quality of event investigations. (orig.)

  14. Multitask Learning-Based Security Event Forecast Methods for Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Hui He

    2016-01-01

    Full Text Available Wireless sensor networks have strong dynamics and uncertainty, including network topological changes, node disappearance or addition, and exposure to various threats. First, to strengthen the detection adaptability of wireless sensor networks to various security attacks, a region similarity multitask-based security event forecast method for wireless sensor networks is proposed. This method performs topology partitioning on a large-scale sensor network and calculates the similarity degree among regional subnetworks. The trend of unknown network security events can be predicted through multitask learning of the occurrence and transmission characteristics of known network security events. Second, in the case of lacking regional data, the quantitative trend of unknown regional network security events can be calculated. This study introduces a sensor network security event forecast method, named Prediction Network Security Incomplete Unmarked Data (PNSIUD), to forecast missing attack data in the target region according to the known partial data in similar regions. Experimental results indicate that, for unknown security event forecasting, the forecast accuracy and effectiveness of the similarity forecast algorithm are better than those of the single-task learning method. At the same time, the forecast accuracy of the PNSIUD method is better than that of the traditional support vector machine method.

  15. Spatiotemporal Features for Asynchronous Event-based Data

    Directory of Open Access Journals (Sweden)

    Xavier eLagorce

    2015-02-01

    Full Text Available Bio-inspired asynchronous event-based vision sensors are currently introducing a paradigm shift in visual information processing. These new sensors rely on a stimulus-driven principle of light acquisition similar to biological retinas. They are event-driven and fully asynchronous, thereby reducing redundancy and encoding exact times of input signal changes, leading to a very precise temporal resolution. Approaches for higher-level computer vision often rely on the reliable detection of features in visual frames, but similar definitions of features for the novel dynamic and event-based visual input representation of silicon retinas have so far been lacking. This article addresses the problem of learning and recognizing features for event-based vision sensors, which capture properties of truly spatiotemporal volumes of sparse visual event information. A novel computational architecture for learning and encoding spatiotemporal features is introduced based on a set of predictive recurrent reservoir networks, competing via winner-take-all selection. Features are learned in an unsupervised manner from real-world input recorded with event-based vision sensors. It is shown that the networks in the architecture learn distinct and task-specific dynamic visual features, and can predict their trajectories over time.

  16. Static Analysis for Event-Based XML Processing

    DEFF Research Database (Denmark)

    Møller, Anders

    2008-01-01

    Event-based processing of XML data - as exemplified by the popular SAX framework - is a powerful alternative to using W3C's DOM or similar tree-based APIs. The event-based approach processes documents in a streaming fashion with minimal memory consumption. This paper discusses challenges for creating program analyses...... for SAX applications. In particular, we consider the problem of statically guaranteeing that a given SAX program always produces only well-formed and valid XML output. We propose an analysis technique based on existing analyses of Servlets, string operations, and XML graphs....
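
    For readers unfamiliar with the processing model being analyzed, the following is a minimal illustration of event-based (SAX-style) XML handling; it uses Python's standard xml.sax module only to mirror the streaming, callback-driven style (the paper itself targets Java SAX programs and their static analysis):

```python
# Minimal SAX-style example: callbacks fire per markup event, no tree is built,
# so memory use stays bounded regardless of document size.
import xml.sax

class ItemCounter(xml.sax.ContentHandler):
    def __init__(self):
        super().__init__()
        self.items = 0

    def startElement(self, name, attrs):   # fired for every opening tag
        if name == "item":
            self.items += 1

    def characters(self, content):
        pass  # text chunks stream through without being accumulated

handler = ItemCounter()
xml.sax.parseString(b"<root><item/><item/></root>", handler)
print(handler.items)  # -> 2
```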

  17. Towards a Reliable Framework of Uncertainty-Based Group Decision Support System

    OpenAIRE

    Chai, Junyi; Liu, James N. K.

    2011-01-01

    This study proposes a framework of Uncertainty-based Group Decision Support System (UGDSS). It provides a platform for multiple criteria decision analysis in six aspects including (1) decision environment, (2) decision problem, (3) decision group, (4) decision conflict, (5) decision schemes and (6) group negotiation. Based on multiple artificial intelligent technologies, this framework provides reliable support for the comprehensive manipulation of applications and advanced decision approache...

  18. Consequences of random and systematic reconstruction uncertainties in 3D image based brachytherapy in cervical cancer.

    Science.gov (United States)

    Tanderup, Kari; Hellebust, Taran Paulsen; Lang, Stefan; Granfeldt, Jørgen; Pötter, Richard; Lindegaard, Jacob Christian; Kirisits, Christian

    2008-11-01

    The purpose of this study was to evaluate the impact of random and systematic applicator reconstruction uncertainties on DVH parameters in brachytherapy for cervical cancer. Dose plans were analysed for 20 cervical cancer patients with MRI based brachytherapy. Uncertainty of applicator reconstruction was modelled by translating and rotating the applicator. Changes in DVH parameters per mm of applicator displacement were evaluated for GTV, CTV, bladder, rectum, and sigmoid. These data were used to derive patient population based estimates of delivered dose relative to expected dose. Deviations of DVH parameters depend on direction of reconstruction uncertainty. The most sensitive organs are rectum and bladder where mean DVH parameter shifts are 5-6% per mm applicator displacement in ant-post direction. For other directions and other DVH parameters, mean shifts are below 4% per mm. By avoiding systematic reconstruction errors, uncertainties on DVH parameters can be kept below 10% in 90% of a patient population. Systematic errors of a few millimetres can lead to significant deviations. Comprehensive quality control of afterloader, applicators and imaging procedures should be applied to prevent systematic errors in applicator reconstruction. Random errors should be minimised by using small slice thickness. With careful reconstruction procedures, reliable DVH parameters for target and OAR's can be obtained.
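
    As a back-of-envelope illustration (not the study's dose-plan analysis), the reported per-mm DVH sensitivity can be combined with an assumed random reconstruction error to estimate the population spread of a DVH parameter; the 1 mm error SD and 4 fractions below are assumptions:

```python
# Hedged sketch: propagate an assumed zero-mean random reconstruction error
# through the abstract's ~5-6% per mm ant-post sensitivity for bladder/rectum.
import numpy as np

rng = np.random.default_rng(0)
sens_per_mm = 0.055        # fractional DVH-parameter change per mm (from abstract)
sigma_recon_mm = 1.0       # assumed random reconstruction SD per fraction (mm)
n_fractions, n_patients = 4, 10000

shifts = rng.normal(0.0, sigma_recon_mm, size=(n_patients, n_fractions))
dvh_dev = sens_per_mm * shifts.mean(axis=1)     # random errors average over fractions
print(np.percentile(np.abs(dvh_dev), 90))       # deviation not exceeded by 90% of patients
```

    With these illustrative numbers the 90th-percentile deviation stays well below 10%, consistent with the abstract's statement that avoiding systematic errors keeps DVH uncertainties below 10% for 90% of a patient population.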

  19. Active disturbance rejection based trajectory linearization control for hypersonic reentry vehicle with bounded uncertainties.

    Science.gov (United States)

    Shao, Xingling; Wang, Honglun

    2015-01-01

    This paper investigates a novel compound control scheme that combines the advantages of trajectory linearization control (TLC) and alternative active disturbance rejection control (ADRC) for a hypersonic reentry vehicle (HRV) attitude tracking system with bounded uncertainties. Firstly, in order to overcome the actuator saturation problem, a nonlinear tracking differentiator (TD) is applied in the attitude loop to reduce control consumption. Then, linear extended state observers (LESO) are constructed to estimate the uncertainties acting on the LTV system in the attitude and angular rate loops. In addition, feedback linearization (FL) based controllers are designed using the estimates of uncertainties generated by the LESO in each loop, which enable the tracking error of the closed-loop system, in the presence of large uncertainties, to converge asymptotically to a residual set of the origin. Finally, the compound controllers are derived by integrating the nominal controller for the open-loop nonlinear system with the FL based controller. Also, comparisons and simulation results are presented to illustrate the effectiveness of the control strategy. Copyright © 2014 ISA. Published by Elsevier Ltd. All rights reserved.
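
    The LESO building block mentioned above has a standard bandwidth-parameterized form; the sketch below shows it for a generic second-order channel (the gains and time step are illustrative, and this is not the paper's vehicle model):

```python
# Hedged sketch of a linear extended state observer (LESO) for a plant of the
# form y'' = f(t) + b0*u, with observer poles placed at -wo (bandwidth tuning).
import numpy as np

def leso_step(z, y, u, wo=20.0, b0=1.0, dt=1e-3):
    """One Euler step. z = [z1, z2, z3] estimates [output, output rate, total disturbance]."""
    z1, z2, z3 = z
    e = z1 - y
    dz1 = z2 - 3.0 * wo * e
    dz2 = z3 - 3.0 * wo**2 * e + b0 * u
    dz3 = -wo**3 * e
    return np.array([z1 + dt * dz1, z2 + dt * dz2, z3 + dt * dz3])

# The disturbance estimate z3 is then cancelled in the control law, e.g.
# u = (u0 - z3) / b0, with u0 supplied by the nominal (feedback-linearizing) controller.
```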

  20. Event-based prospective memory performance in autism spectrum disorder.

    Science.gov (United States)

    Altgassen, Mareike; Schmitz-Hübsch, Maren; Kliegel, Matthias

    2010-03-01

    The purpose of the present study was to investigate event-based prospective memory performance in individuals with autism spectrum disorder and to explore possible relations between laboratory-based prospective memory performance and everyday performance. Nineteen children and adolescents with autism spectrum disorder and 19 matched neurotypical controls participated. The laboratory-based prospective memory test was embedded in a visuo-spatial working memory test and required participants to remember to respond to a cue-event. Everyday planning performance was assessed with proxy ratings. Although parents of the autism group rated their children's everyday performance as significantly poorer than controls' parents, no group differences were found in event-based prospective memory. Nevertheless, individual differences in laboratory-based and everyday performances were related. Clinical implications of these findings are discussed.

  1. Parameter uncertainty-based pattern identification and optimization for robust decision making on watershed load reduction

    Science.gov (United States)

    Jiang, Qingsong; Su, Han; Liu, Yong; Zou, Rui; Ye, Rui; Guo, Huaicheng

    2017-04-01

    Nutrient load reduction in watersheds is essential for lake restoration from eutrophication. Efficient and optimal decision-making on load reduction is generally based on water quality modeling and the quantitative identification of nutrient sources at the watershed scale. The modeling process is inevitably influenced by inherent uncertainties, especially by uncertain parameters due to equifinality. Therefore, the emerging question is: if there is parameter uncertainty, how can the robustness of the optimal decisions be ensured? Based on simulation-optimization models, an integrated approach of pattern identification and robustness analysis was proposed in this study that focuses on the impact of parameter uncertainty in water quality modeling. Here a pattern represents the discernable regularity of solutions for load reduction under multiple parameter sets. Pattern identification is achieved by using a hybrid clustering analysis (i.e., Ward-Hierarchical and K-means), which was flexible and efficient in analyzing Lake Bali near the Yangtze River in China. The results demonstrated that the urban domestic nutrient load is the source with the greatest potential for reduction, and there are two patterns for Total Nitrogen (TN) reduction and three patterns for Total Phosphorus (TP) reduction. The patterns indicated different total reductions of nutrient loads, which reflect diverse decision preferences. The robust solution was identified as the one with the highest attainment of the water quality targets, with water quality at the monitoring stations improved uniformly under this solution. We conducted a process analysis of robust decision-making based on pattern identification and uncertainty, which provides effective support for decision-making with preferences under uncertainty.
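
    A minimal sketch of the hybrid clustering idea (Ward hierarchical clustering to suggest the number of patterns, K-means to assign solutions); the data layout and the gap-based choice of k are assumptions for illustration, not the authors' exact procedure:

```python
# Hedged sketch: hierarchical (Ward) linkage to pick the number of patterns,
# then K-means to label load-reduction solutions obtained under many parameter sets.
import numpy as np
from scipy.cluster.hierarchy import linkage
from sklearn.cluster import KMeans

def identify_patterns(solutions, max_k=6):
    """solutions: (n_solutions, n_sources) array of reduction levels per source."""
    Z = linkage(solutions, method="ward")
    gaps = np.diff(Z[-max_k:, 2])            # gaps among the last merge distances
    k = max_k - int(np.argmax(gaps))         # largest gap -> suggested cluster count
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(solutions)
    return k, labels

# toy usage: 60 solutions over 4 sources falling into 3 rough groups
rng = np.random.default_rng(0)
solutions = np.vstack([rng.normal(m, 0.05, size=(20, 4)) for m in (0.2, 0.5, 0.8)])
print(identify_patterns(solutions)[0])   # -> typically 3
```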

  2. Scenario-based fitted Q-iteration for adaptive control of water reservoir systems under uncertainty

    Science.gov (United States)

    Bertoni, Federica; Giuliani, Matteo; Castelletti, Andrea

    2017-04-01

    Over recent years, mathematical models have been widely used to support planning and management of water resources systems. Yet, the increasing uncertainties in their inputs - due to increased variability in the hydrological regimes - are a major challenge to the optimal operation of these systems. Such uncertainty, boosted by the projected changing climate, violates the stationarity principle generally used for describing hydro-meteorological processes, which assumes time-persistent statistical characteristics of a given variable as inferred from historical data. As this principle is unlikely to be valid in the future, the probability density function used for modeling stochastic disturbances (e.g., inflows) becomes an additional uncertain parameter of the problem, which can be described in a deterministic and set-membership based fashion. This study contributes a novel method for designing optimal, adaptive policies for controlling water reservoir systems under climate-related uncertainty. The proposed method, called scenario-based Fitted Q-Iteration (sFQI), extends the original Fitted Q-Iteration algorithm by enlarging the state space to include the space of the uncertain system's parameters (i.e., the uncertain climate scenarios). As a result, sFQI embeds the set-membership uncertainty of the future inflow scenarios in the action-value function and is able to approximate, with a single learning process, the optimal control policy associated with any scenario included in the uncertainty set. The method is demonstrated on a synthetic water system, consisting of a regulated lake operated for ensuring reliable water supply to downstream users. Numerical results show that the sFQI algorithm successfully identifies adaptive solutions to operate the system under different inflow scenarios, which outperform the control policy designed under historical conditions. Moreover, the sFQI policy generalizes over inflow scenarios not directly experienced during the policy design
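
    A compact sketch of fitted Q-iteration with the state augmented by a scenario index, loosely following the sFQI idea described above; the tree-based regressor, discount factor and data layout are assumptions, not the authors' implementation:

```python
# Hedged sketch: fitted Q-iteration over transitions whose state vector already
# includes the (uncertain) scenario parameter, so one value function covers all scenarios.
import numpy as np
from sklearn.ensemble import ExtraTreesRegressor

def sfqi(transitions, actions, n_iter=30, gamma=0.99):
    """transitions: list of (state_with_scenario, action, reward, next_state_with_scenario)."""
    S = np.array([t[0] for t in transitions])
    A = np.array([[t[1]] for t in transitions])
    R = np.array([t[2] for t in transitions])
    S1 = np.array([t[3] for t in transitions])
    X = np.hstack([S, A])
    q = None
    for _ in range(n_iter):
        if q is None:
            target = R                                  # first pass: immediate reward
        else:
            q_next = np.column_stack([                  # max over candidate actions
                q.predict(np.hstack([S1, np.full((len(S1), 1), a)]))
                for a in actions])
            target = R + gamma * q_next.max(axis=1)
        q = ExtraTreesRegressor(n_estimators=50, random_state=0).fit(X, target)
    return q   # greedy policy: argmax over actions of q.predict([state, action])
```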

  3. Robust Optimization-Based Scheduling of Multi-Microgrids Considering Uncertainties

    Directory of Open Access Journals (Sweden)

    Akhtar Hussain

    2016-04-01

    Full Text Available Scheduling of multi-microgrids (MMGs) is one of the important tasks in MMG operation and it faces new challenges as the integration of demand response (DR) programs and renewable generation (wind and solar) sources increases. In order to address these challenges, robust optimization (RO)-based scheduling has been proposed in this paper considering uncertainties in both renewable energy sources and forecasted electric loads. Initially, a cost minimization deterministic model has been formulated for the MMG system. Then, it has been transformed to a min-max robust counterpart and finally, a tractable robust counterpart has been formulated using linear duality theory and Karush–Kuhn–Tucker (KKT) optimality conditions. The developed model provides immunity against the worst-case realization within the provided uncertainty bounds. A budget of uncertainty has been used to develop a trade-off between the conservatism of the solution and the probability of an infeasible solution. The effect of uncertainty gaps on internal and external trading, operation cost, unit commitment of dispatchable generators, and state of charge (SOC) of battery energy storage systems (BESSs) has also been analyzed in both grid-connected and islanded modes. Simulation results have proved the robustness of the proposed strategy.
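
    The budget-of-uncertainty idea can be made concrete with the standard Bertsimas–Sim linear robust counterpart of a single uncertain constraint; this is generic background on the technique named in the abstract, not the paper's MMG formulation. For a constraint with coefficients in $[\hat a_j - d_j,\ \hat a_j + d_j]$, of which at most $\Gamma$ deviate from their nominal values,

    $$ \sum_j \hat a_j x_j + \Gamma z + \sum_j p_j \le b, \qquad z + p_j \ge d_j y_j, \quad -y_j \le x_j \le y_j, \quad z,\ p_j,\ y_j \ge 0, $$

    where $z$ and $p_j$ are the dual variables of the inner worst-case maximization; this dualization step is exactly where the linear duality/KKT machinery mentioned in the abstract enters.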

  4. Incorporating Wind Power Forecast Uncertainties Into Stochastic Unit Commitment Using Neural Network-Based Prediction Intervals.

    Science.gov (United States)

    Quan, Hao; Srinivasan, Dipti; Khosravi, Abbas

    2015-09-01

    Penetration of renewable energy resources, such as wind and solar power, into power systems significantly increases the uncertainties on system operation, stability, and reliability in smart grids. In this paper, the nonparametric neural network-based prediction intervals (PIs) are implemented for forecast uncertainty quantification. Instead of a single level PI, wind power forecast uncertainties are represented in a list of PIs. These PIs are then decomposed into quantiles of wind power. A new scenario generation method is proposed to handle wind power forecast uncertainties. For each hour, an empirical cumulative distribution function (ECDF) is fitted to these quantile points. The Monte Carlo simulation method is used to generate scenarios from the ECDF. Then the wind power scenarios are incorporated into a stochastic security-constrained unit commitment (SCUC) model. The heuristic genetic algorithm is utilized to solve the stochastic SCUC problem. Five deterministic and four stochastic case studies incorporated with interval forecasts of wind power are implemented. The results of these cases are presented and discussed together. Generation costs, and the scheduled and real-time economic dispatch reserves of different unit commitment strategies are compared. The experimental results show that the stochastic model is more robust than deterministic ones and, thus, decreases the risk in system operations of smart grids.
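
    A minimal sketch of the scenario-generation step described above (nested prediction intervals interpreted as quantile points, then inverse-transform sampling from the implied empirical CDF); the interval levels, median proxy and interpolation are illustrative assumptions:

```python
# Hedged sketch: turn central prediction intervals into quantile points and
# sample wind-power scenarios by inverse-transform from the implied ECDF.
import numpy as np

def scenarios_from_pis(pis, n_scenarios=1000, rng=None):
    """pis: dict {level: (lower, upper)} for one hour, e.g. {0.5: (40, 60), 0.9: (28, 75)}.
    A central PI at level a contributes the (1-a)/2 and (1+a)/2 quantiles."""
    rng = np.random.default_rng(rng)
    probs = [0.5]
    values = [np.mean(pis[min(pis)])]                 # crude median proxy (assumption)
    for a, (lo, up) in pis.items():
        probs += [(1 - a) / 2, (1 + a) / 2]
        values += [lo, up]
    order = np.argsort(probs)
    probs, values = np.array(probs)[order], np.array(values)[order]
    u = rng.uniform(probs.min(), probs.max(), n_scenarios)
    return np.interp(u, probs, values)                # inverse ECDF by interpolation

# example: PIs for one hour (MW), then 5 sampled scenarios for the stochastic SCUC
pis = {0.5: (40.0, 60.0), 0.7: (35.0, 66.0), 0.9: (28.0, 75.0)}
print(scenarios_from_pis(pis, 5, rng=0))
```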

  5. Identification of GCM Uncertainty of Dynamical Cores and Physical Parameterizations by Object-Based Methods

    Science.gov (United States)

    Yorgun, M. S.; Rood, R. B.

    2012-12-01

    time and location of, for example, a particular observed cloud system. Process- and object-based evaluation preserves information in the observations by avoiding the need for extensive spatial and temporal averaging. We pose that this approach will provide sound identification of model uncertainty by comparison to observations (i.e. GPCC gauge based data), and will intrinsically link local, weather-scale phenomena to important climatological features and provide a quantitative bridge between weather and climate.

  6. A sampling-based computational strategy for the representation of epistemic uncertainty in model predictions with evidence theory.

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, J. D. (Prostat, Mesa, AZ); Oberkampf, William Louis; Helton, Jon Craig (Arizona State University, Tempe, AZ); Storlie, Curtis B. (North Carolina State University, Raleigh, NC)

    2006-10-01

    Evidence theory provides an alternative to probability theory for the representation of epistemic uncertainty in model predictions that derives from epistemic uncertainty in model inputs, where the descriptor epistemic is used to indicate uncertainty that derives from a lack of knowledge with respect to the appropriate values to use for various inputs to the model. The potential benefit, and hence appeal, of evidence theory is that it allows a less restrictive specification of uncertainty than is possible within the axiomatic structure on which probability theory is based. Unfortunately, the propagation of an evidence theory representation for uncertainty through a model is more computationally demanding than the propagation of a probabilistic representation for uncertainty, with this difficulty constituting a serious obstacle to the use of evidence theory in the representation of uncertainty in predictions obtained from computationally intensive models. This presentation describes and illustrates a sampling-based computational strategy for the representation of epistemic uncertainty in model predictions with evidence theory. Preliminary trials indicate that the presented strategy can be used to propagate uncertainty representations based on evidence theory in analysis situations where naive sampling-based (i.e., unsophisticated Monte Carlo) procedures are impracticable due to computational cost.
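
    A small sketch of what sampling-based propagation of an evidence-theory input description can look like: each input is given interval focal elements with basic probability assignments (BPAs), cells are sampled by BPA, and the model's range over each cell is approximated by inner sampling. The toy model, focal elements and sample sizes are assumptions, not the report's analysis:

```python
# Hedged sketch: estimate belief and plausibility of {model(x) <= threshold}
# by sampling focal-element combinations and approximating per-cell min/max.
import numpy as np

def belief_plausibility(model, focal, threshold, n_cells=2000, n_inner=50, seed=0):
    """focal: one list per input of ((lo, hi), bpa) pairs."""
    rng = np.random.default_rng(seed)
    bel = pl = 0.0
    for _ in range(n_cells):
        cell = []
        for elems in focal:                               # pick one focal element per input
            bpas = np.array([b for _, b in elems])
            idx = rng.choice(len(elems), p=bpas / bpas.sum())
            cell.append(elems[idx][0])
        x = np.column_stack([rng.uniform(lo, hi, n_inner) for lo, hi in cell])
        y = np.apply_along_axis(model, 1, x)              # approximate range over the cell
        bel += float(y.max() <= threshold)                # cell entirely inside the event
        pl += float(y.min() <= threshold)                 # cell intersects the event
    return bel / n_cells, pl / n_cells

# toy example: y = x1 + x2 with two focal elements per input
focal = [[((0.0, 1.0), 0.6), ((1.0, 2.0), 0.4)],
         [((0.0, 0.5), 0.5), ((0.5, 1.5), 0.5)]]
print(belief_plausibility(lambda x: x[0] + x[1], focal, threshold=2.0))  # ~ (0.3, 1.0)
```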

  7. Event-based prospective memory performance in autism spectrum disorder

    NARCIS (Netherlands)

    Altgassen, A.M.; Schmitz-Hübsch, M.; Kliegel, M.

    2010-01-01

    The purpose of the present study was to investigate event-based prospective memory performance in individuals with autism spectrum disorder and to explore possible relations between laboratory-based prospective memory performance and everyday performance. Nineteen children and adolescents with

  8. Lung Cancer Screening CT-Based Prediction of Cardiovascular Events

    NARCIS (Netherlands)

    Mets, Onno M.; Vliegenthart, Rozemarijn; Gondrie, Martijn J.; Viergever, Max A.; Oudkerk, Matthijs; de Koning, Harry J.; Mali, Willem P. Th M.; Prokop, Mathias; van Klaveren, Rob J.; van der Graaf, Yolanda; Buckens, Constantinus F. M.; Zanen, Pieter; Lammers, Jan-Willem J.; Groen, Harry J. M.; Isgum, Ivana; de Jong, Pim A.

    OBJECTIVES The aim of this study was to derive and validate a prediction model for cardiovascular events based on quantification of coronary and aortic calcium volume in lung cancer screening chest computed tomography (CT). BACKGROUND CT-based lung cancer screening in heavy smokers is a very

  9. Assessment of Uncertainty-Based Screening Volumes for NASA Robotic LEO and GEO Conjunction Risk Assessment

    Science.gov (United States)

    Narvet, Steven W.; Frigm, Ryan C.; Hejduk, Matthew D.

    2011-01-01

    Conjunction Assessment operations require screening assets against the space object catalog by placing a pre-determined spatial volume around each asset and predicting when another object will violate that volume. The selection of the screening volume used for each spacecraft is a trade-off between observing all conjunction events that may pose a potential risk to the primary spacecraft and the ability to analyze those predicted events. If the screening volumes are larger, then more conjunctions can be observed and therefore the probability of a missed detection of a high risk conjunction event is small; however, the amount of data which needs to be analyzed increases. This paper characterizes the sensitivity of screening volume size to capturing typical orbit uncertainties and the expected number of conjunction events observed. These sensitivities are quantified in the form of a trade space that allows for selection of appropriate screening volumes to fit the desired concept of operations, system limitations, and tolerable analyst workloads. This analysis will specifically highlight the screening volume determination and selection process for use in the NASA Conjunction Assessment Risk Analysis process but will also provide a general framework for other Owner/Operators faced with similar decisions.

  10. A coupled hydrological-hydraulic flood inundation model calibrated using post-event measurements and integrated uncertainty analysis in a poorly gauged Mediterranean basin

    Science.gov (United States)

    Hdeib, Rouya; Abdallah, Chadi; Moussa, Roger; Colin, Francois

    2017-04-01

    Developing flood inundation maps of defined exceedance probabilities is required to provide information on the flood hazard and the associated risk. A methodology has been developed to model flood inundation in poorly gauged basins, where reliable information on the hydrological characteristics of floods are uncertain and partially captured by the traditional rain-gauge networks. Flood inundation is performed through coupling a hydrological rainfall-runoff (RR) model (HEC-HMS) with a hydraulic model (HEC-RAS). The RR model is calibrated against the January 2013 flood event in the Awali River basin, Lebanon (300 km2), whose flood peak discharge was estimated by post-event measurements. The resulting flows of the RR model are defined as boundary conditions of the hydraulic model, which is run to generate the corresponding water surface profiles and calibrated against 20 post-event surveyed cross sections after the January-2013 flood event. An uncertainty analysis is performed to assess the results of the models. Consequently, the coupled flood inundation model is simulated with design storms and flood inundation maps are generated of defined exceedance probabilities. The peak discharges estimated by the simulated RR model were in close agreement with the results from different empirical and statistical methods. This methodology can be extended to other poorly gauged basins facing common stage-gauge failure or characterized by floods with a stage exceeding the gauge measurement level, or higher than that defined by the rating curve.

  11. Multistage stochastic programming: A scenario tree based approach to planning under uncertainty

    OpenAIRE

    Defourny, Boris; Ernst, Damien; Wehenkel, Louis

    2011-01-01

    In this chapter, we present the multistage stochastic programming framework for sequential decision making under uncertainty. We discuss its differences with Markov Decision Processes, from the point of view of decision models and solution algorithms. We describe the standard technique for solving approximately multistage stochastic problems, which is based on a discretization of the disturbance space called scenario tree. We insist on a critical issue of the approach: the decisions can be ve...

  12. Survey of radiofrequency radiation levels around GSM base stations and evaluation of measurement uncertainty

    Directory of Open Access Journals (Sweden)

    Vulević Branislav D.

    2011-01-01

    Full Text Available This paper is a summary of broadband measurement values of radiofrequency radiation around GSM base stations in the vicinity of residential areas in Belgrade and 12 other cities in Serbia. It will be useful for determining non-ionizing radiation exposure levels of the general public in the future. A further purpose of this paper is to present basic information on the evaluation of measurement uncertainty.

  13. Toward a contingent resource-based view of nonmarket capabilities under regulatory uncertainty

    OpenAIRE

    Schwark, Bastian

    2009-01-01

    The article integrates theoretical perspectives from the resource-based view of the firm, dynamic capabilities and contingency. It explains one particular characteristic of the general business environment of the firm, regulatory uncertainty, and its influence on dynamic capabilities of a corporate political strategy (nonmarket strategy) and value creation. I argue that scanning and predictive capabilities as well as institutional influence capabilities will lead to a reduced perceived uncert...

  14. An Oracle-based Event Index for ATLAS

    CERN Document Server

    Gallas, Elizabeth; The ATLAS collaboration; Petrova, Petya Tsvetanova; Baranowski, Zbigniew; Canali, Luca; Formica, Andrea; Dumitru, Andrei

    2016-01-01

    The ATLAS EventIndex System has amassed a set of key quantities for a large number of ATLAS events into a Hadoop based infrastructure for the purpose of providing the experiment with a number of event-wise services. Collecting this data in one place provides the opportunity to investigate various storage formats and technologies and assess which best serve the various use cases as well as consider what other benefits alternative storage systems provide. In this presentation we describe how the data are imported into an Oracle RDBMS, the services we have built based on this architecture, and our experience with it. We've indexed about 15 billion real data events and about 25 billion simulated events thus far and have designed the system to accommodate future data which has expected rates of 5 and 20 billion events per year for real data and simulation, respectively. We have found this system offers outstanding performance for some fundamental use cases. In addition, profiting from the co-location of this data ...

  15. Review of the Different Sources of Uncertainty in Single Polarization Radar-Based Estimates of Rainfall

    Science.gov (United States)

    Villarini, Gabriele; Krajewski, Witold F.

    2010-01-01

    It is well acknowledged that there are large uncertainties associated with radar-based estimates of rainfall. Numerous sources of these errors are due to parameter estimation, the observational system and measurement principles, and not fully understood physical processes. Propagation of these uncertainties through all models for which radar-rainfall are used as input (e.g., hydrologic models) or as initial conditions (e.g., weather forecasting models) is necessary to enhance the understanding and interpretation of the obtained results. The aim of this paper is to provide an extensive literature review of the principal sources of error affecting single polarization radar-based rainfall estimates. These include radar miscalibration, attenuation, ground clutter and anomalous propagation, beam blockage, variability of the Z-R relation, range degradation, vertical variability of the precipitation system, vertical air motion and precipitation drift, and temporal sampling errors. Finally, the authors report some recent results from empirically-based modeling of the total radar-rainfall uncertainties. The bibliography comprises over 200 peer reviewed journal articles.

  16. Event-based prospective memory performance in autism spectrum disorder

    OpenAIRE

    Altgassen, Mareike; Schmitz-H?bsch, Maren; Kliegel, Matthias

    2009-01-01

    The purpose of the present study was to investigate event-based prospective memory performance in individuals with autism spectrum disorder and to explore possible relations between laboratory-based prospective memory performance and everyday performance. Nineteen children and adolescents with autism spectrum disorder and 19 matched neurotypical controls participated. The laboratory-based prospective memory test was embedded in a visuo-spatial working memory test and required participants to ...

  17. Parallel genetic algorithm with population-based sampling approach to discrete optimization under uncertainty

    Science.gov (United States)

    Subramanian, Nithya

    Optimization under uncertainty accounts for design variables and external parameters or factors with probabilistic distributions instead of fixed deterministic values; it enables problem formulations that might maximize or minimize an expected value while satisfying constraints using probabilities. For discrete optimization under uncertainty, a Monte Carlo Sampling (MCS) approach enables high-accuracy estimation of expectations but it also results in high computational expense. The Genetic Algorithm (GA) with a Population-Based Sampling (PBS) technique enables optimization under uncertainty with discrete variables at a lower computational expense than using Monte Carlo sampling for every fitness evaluation. Population-Based Sampling uses fewer samples in the exploratory phase of the GA and a larger number of samples when `good designs' start emerging over the generations. This sampling technique therefore reduces the computational effort spent on `poor designs' found in the initial phase of the algorithm. Parallel computation evaluates the expected value of the objective and constraints in parallel to facilitate reduced wall-clock time. A customized stopping criterion is also developed for the GA with Population-Based Sampling. The stopping criterion requires that the design with the minimum expected fitness value to have at least 99% constraint satisfaction and to have accumulated at least 10,000 samples. The average change in expected fitness values in the last ten consecutive generations is also monitored. The optimization of composite laminates using ply orientation angle as a discrete variable provides an example to demonstrate further developments of the GA with Population-Based Sampling for discrete optimization under uncertainty. The focus problem aims to reduce the expected weight of the composite laminate while treating the laminate's fiber volume fraction and externally applied loads as uncertain quantities following normal distributions. Construction of

  18. Using the DOE Knowledge Base for Special Event Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Armstrong, H.M.; Harris, J.M.; Young, C.J.

    1998-10-20

    The DOE Knowledge Base is a library of detailed information whose purpose is to support the United States National Data Center (USNDC) in its mission to monitor compliance with the Comprehensive Test Ban Treaty (CTBT). One of the important tasks which the USNDC must accomplish is to periodically perform detailed analysis of events of high interest, so-called "Special Events", to provide the national authority with information needed to make policy decisions. In this paper we investigate some possible uses of the Knowledge Base for Special Event Analysis (SEA), and make recommendations for improving Knowledge Base support for SEA. To analyze an event in detail, there are two basic types of data which must be used: sensor-derived data (waveforms, arrivals, events, etc.) and regionalized contextual data (known sources, geological characteristics, etc.). Currently there is no single package which can provide full access to both types of data, so for our study we use a separate package for each: MatSeis, the Sandia Labs-developed MATLAB-based seismic analysis package, for waveform data analysis, and ArcView, an ESRI product, for contextual data analysis. Both packages are well-suited to prototyping because they provide a rich set of currently available functionality and yet are also flexible and easily extensible. Using these tools and Phase I Knowledge Base data sets, we show how the Knowledge Base can improve both the speed and the quality of SEA. Empirically-derived interpolated correction information can be accessed to improve both location estimates and associated error estimates. This information can in turn be used to identify any known nearby sources (e.g. mines, volcanos), which may then trigger specialized processing of the sensor data. Based on the location estimate, preferred magnitude formulas and discriminants can be retrieved, and any known blockages can be identified to prevent miscalculations. Relevant historic events can be identified either by

  19. Accelerometer-Based Event Detector for Low-Power Applications

    Directory of Open Access Journals (Sweden)

    József Smidla

    2013-10-01

    Full Text Available In this paper, an adaptive, autocovariance-based event detection algorithm is proposed, which can be used with micro-electro-mechanical systems (MEMS) accelerometer sensors to build inexpensive and power-efficient event detectors. The algorithm works well with low signal-to-noise ratio input signals, and its computational complexity is very low, allowing its utilization on inexpensive low-end embedded sensor devices. The proposed algorithm decreases its energy consumption by lowering its duty cycle, as much as the event to be detected allows. The performance of the algorithm is tested and compared to the conventional filter-based approach. The comparison was performed in an application where illegal entering of vehicles into restricted areas was detected.
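
    A rough sketch of an autocovariance-based detector of this flavor, with an adaptively tracked noise baseline; window length, lag and threshold factor are illustrative assumptions and the adaptation rule is not the authors' algorithm:

```python
# Hedged sketch: flag accelerometer windows whose lag-k autocovariance rises
# well above an adaptively tracked noise baseline.
import numpy as np

def detect_events(signal, win=256, lag=4, k=5.0):
    flags = []
    baseline, spread = None, None
    for start in range(0, len(signal) - win, win // 2):      # 50% window overlap
        w = signal[start:start + win]
        w = w - np.mean(w)
        acov = np.mean(w[:-lag] * w[lag:])                    # lag-k autocovariance
        if baseline is None:
            baseline, spread = acov, abs(acov) + 1e-12
        is_event = acov > baseline + k * spread
        flags.append((start, bool(is_event)))
        if not is_event:                                      # adapt only on noise windows
            baseline = 0.95 * baseline + 0.05 * acov
            spread = 0.95 * spread + 0.05 * abs(acov - baseline)
    return flags
# Duty cycling (the power-saving idea in the abstract) would lower the window
# rate whenever no events have been flagged for a while.
```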

  20. Detection of comfortable temperature based on thermal events detection indoors

    Science.gov (United States)

    Szczurek, Andrzej; Maciejewska, Monika; Uchroński, Mariusz

    2017-11-01

    This work focussed on thermal comfort as the basis to control indoor conditions. Its objective is a method to determine the thermal preferences of office occupants. The method is based on the detection of thermal events. They occur when indoor conditions are under the control of occupants. Thermal events are associated with the use of local heating/cooling sources which have user-adjustable settings. The detection is based on Fourier analysis of indoor temperature time series. The relevant data are collected by a temperature sensor. We achieved a thermal event recognition rate of 86%. Conditions when indoor conditions were beyond control were detected with a 95.6% success rate. Using experimental data it was demonstrated that the method makes it possible to reproduce key elements of the temperature statistics associated with conditions when occupants are in control of thermal comfort.
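
    One simple way to act on the spectral idea (only a loosely related illustration, since the abstract does not specify the decision rule): compare the energy at periods shorter than the building's slow thermal response against the total variability; the band limit and ratio below are assumptions:

```python
# Hedged sketch: flag likely occupant-driven "thermal events" when short-period
# spectral energy in an indoor temperature series is disproportionately large.
import numpy as np

def thermal_event_present(temps, dt_minutes=5.0, short_period_h=2.0, ratio=0.15):
    t = np.asarray(temps, dtype=float)
    t = t - np.mean(t)
    spec = np.abs(np.fft.rfft(t)) ** 2
    freqs = np.fft.rfftfreq(len(t), d=dt_minutes * 60.0)        # Hz
    fast = freqs > 1.0 / (short_period_h * 3600.0)              # periods < 2 h
    return bool(spec[fast].sum() > ratio * spec[1:].sum())      # skip the DC bin
```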

  1. Quantification of dose uncertainties for the bladder in prostate cancer radiotherapy based on dominant eigenmodes

    Science.gov (United States)

    Rios, Richard; Acosta, Oscar; Lafond, Caroline; Espinosa, Jairo; de Crevoisier, Renaud

    2017-11-01

    In radiotherapy for prostate cancer, the planning dose for the bladder may be a poor surrogate of the actually delivered dose, as the bladder presents the largest inter-fraction shape variations during treatment. This paper presents PCA models as a virtual tool to estimate dosimetric uncertainties for the bladder produced by motion and deformation between fractions. Our goal is to propose a methodology to determine the minimum number of modes required to quantify dose uncertainties of the bladder for motion/deformation models based on PCA. We trained individual PCA models using the bladder contours available from three patients with a planning computed tomography (CT) and on-treatment cone-beam CTs (CBCTs). Based on the above models and via deformable image registration (DIR), we estimated two accumulated doses: firstly, an accumulated dose obtained by integrating the planning dose over the Gaussian probability distribution of the PCA model; and secondly, an accumulated dose obtained by simulating treatment courses via a Monte Carlo approach. We also computed a reference accumulated dose for each patient using his available images via DIR. Finally, we compared the planning dose with the three accumulated doses, and we calculated local dose variability and dose-volume histogram uncertainties.
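
    A minimal sketch of the PCA motion/deformation modelling step (assuming the per-fraction bladder surfaces have already been put into point correspondence; the sampling and mode count are illustrative, and the dose-accumulation step is only indicated in a comment):

```python
# Hedged sketch: build a PCA shape model from corresponded bladder surfaces and
# sample plausible geometries from the Gaussian over a few dominant modes.
import numpy as np

def fit_pca_model(shapes):
    """shapes: (n_fractions, n_points*3) flattened, corresponded surface points."""
    mean = shapes.mean(axis=0)
    X = shapes - mean
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    var = s**2 / (len(shapes) - 1)            # variance captured by each mode
    return mean, Vt, var

def sample_shapes(mean, modes, var, n_modes=3, n_samples=100, seed=0):
    rng = np.random.default_rng(seed)
    coeffs = rng.normal(0.0, np.sqrt(var[:n_modes]), size=(n_samples, n_modes))
    return mean + coeffs @ modes[:n_modes]

# Each sampled geometry would then be used to re-accumulate the planning dose
# (e.g. DVH recomputation after deformable registration) to estimate the spread.
```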

  2. Uncertainty estimates of purity measurements based on current information: toward a "live validation" of purity methods.

    Science.gov (United States)

    Apostol, Izydor; Kelner, Drew; Jiang, Xinzhao Grace; Huang, Gang; Wypych, Jette; Zhang, Xin; Gastwirt, Jessica; Chen, Kenneth; Fodor, Szilan; Hapuarachchi, Suminda; Meriage, Dave; Ye, Frank; Poppe, Leszek; Szpankowski, Wojciech

    2012-12-01

    The aim of this work is to predict precision and other performance characteristics of chromatographic purity methods, which represent the most widely used form of analysis in the biopharmaceutical industry. We have conducted a comprehensive survey of purity methods, and show that all performance characteristics fall within narrow measurement ranges. This observation was used to develop a model called Uncertainty Based on Current Information (UBCI), which expresses these performance characteristics as a function of the signal and noise levels, hardware specifications, and software settings. We applied the UBCI model to assess the uncertainty of purity measurements, and compared the results to those from conventional qualification. We demonstrated that the UBCI model is suitable to dynamically assess method performance characteristics, based on information extracted from individual chromatograms. The model provides an opportunity for streamlining qualification and validation studies by implementing a "live validation" of test results utilizing UBCI as a concurrent assessment of measurement uncertainty. Therefore, UBCI can potentially mitigate the challenges associated with laborious conventional method validation and facilitates the introduction of more advanced analytical technologies during the method lifecycle.

  3. Forecasting dose-time profiles of solar particle events using a dosimetry-based forecasting methodology

    Science.gov (United States)

    Neal, John Stuart

    2001-10-01

    A dosimetry-based Bayesian methodology for forecasting astronaut radiation doses in deep space due to radiologically significant solar particle event proton fluences is developed. Three non-linear sigmoidal growth curves (Gompertz, Weibull, logistic) are used with hierarchical, non-linear, regression models to forecast solar particle event dose-time profiles from doses obtained early in the development of the event. Since there are no detailed measurements of dose versus time for actual events, surrogate dose data are provided by calculational methods. Proton fluence data are used as input to the deterministic, coupled neutron-proton space radiation computer code, BRYNTRN, for transporting protons and their reaction products (protons, neutrons, 2H, 3H, 3He, and 4He) through aluminum shielding material and water. Calculated doses and dose rates for ten historical solar particle events are used as the input data by grouping similar historical solar particle events, using asymptotic dose and maximum dose rate as the grouping criteria. These historical data are then used to lend strength to predictions of dose and dose rate-time profiles for new solar particle events. Bayesian inference techniques are used to make parameter estimates and predictive forecasts. Markov Chain Monte Carlo (MCMC) methods are used to sample from the posterior distributions. Hierarchical, non-linear regression models provide useful predictions of asymptotic dose and dose-time profiles for the November 8, 2000 and August 12, 1989 solar particle events. Predicted dose rate-time profiles are adequate for the November 8, 2000 solar particle event. Predictions of dose rate-time profiles for the August 12, 1989 solar particle event suffer due to a more complex dose rate-time profile. Forecasts provide a valuable tool to space operations planners when making recommendations concerning operations in which radiological exposure might jeopardize personal safety or mission completion. This work
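
    A stripped-down sketch of the core ingredients (a Gompertz dose-time profile and a random-walk Metropolis sampler for its parameters); the priors, proposal scales and noise level are illustrative, and the full hierarchical pooling across historical events described in the abstract is omitted:

```python
# Hedged sketch: fit D(t) = A*exp(-b*exp(-c*t)) to early dose points with a
# simple Metropolis sampler, so the asymptotic dose A gets a posterior forecast.
import numpy as np

def gompertz(t, A, b, c):
    return A * np.exp(-b * np.exp(-c * t))

def log_post(theta, t, d, sigma=0.05):
    A, b, c = theta
    if A <= 0 or b <= 0 or c <= 0:
        return -np.inf                        # flat prior on the positive octant (assumption)
    resid = d - gompertz(t, A, b, c)
    return -0.5 * np.sum((resid / sigma) ** 2)

def metropolis(t, d, theta0=(1.0, 5.0, 0.3), steps=20000, seed=0):
    rng = np.random.default_rng(seed)
    step = np.array([0.05, 0.2, 0.02])        # per-parameter proposal SDs (illustrative)
    theta = np.array(theta0, dtype=float)
    lp = log_post(theta, t, d)
    chain = []
    for _ in range(steps):
        prop = theta + step * rng.normal(size=3)
        lp_prop = log_post(prop, t, d)
        if np.log(rng.uniform()) < lp_prop - lp:
            theta, lp = prop, lp_prop
        chain.append(theta.copy())
    return np.array(chain[steps // 2:])       # discard burn-in

# usage: chain = metropolis(hours_since_onset, dose_values); A_samples = chain[:, 0]
```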

  4. An Oracle-based event index for ATLAS

    Science.gov (United States)

    Gallas, E. J.; Dimitrov, G.; Vasileva, P.; Baranowski, Z.; Canali, L.; Dumitru, A.; Formica, A.; ATLAS Collaboration

    2017-10-01

    The ATLAS EventIndex System has amassed a set of key quantities for a large number of ATLAS events into a Hadoop based infrastructure for the purpose of providing the experiment with a number of event-wise services. Collecting this data in one place provides the opportunity to investigate various storage formats and technologies and assess which best serve the various use cases as well as consider what other benefits alternative storage systems provide. In this presentation we describe how the data are imported into an Oracle RDBMS (relational database management system), the services we have built based on this architecture, and our experience with it. We’ve indexed about 26 billion real data events thus far and have designed the system to accommodate future data which has expected rates of 5 and 20 billion events per year. We have found this system offers outstanding performance for some fundamental use cases. In addition, profiting from the co-location of this data with other complementary metadata in ATLAS, the system has been easily extended to perform essential assessments of data integrity and completeness and to identify event duplication, including at what step in processing the duplication occurred.

  5. Wavelet based denoising of power quality events for characterization

    African Journals Online (AJOL)

    The effectiveness of wavelet transform (WT) methods for analyzing different power quality (PQ) events with or without noise has been demonstrated in this paper. Multi-resolution signal decomposition based on discrete WT is used to localize and to classify different power quality disturbances. The energy distribution at ...
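
    A minimal sketch of the kind of multi-resolution denoising step that typically precedes PQ-event characterization (soft thresholding of detail coefficients with the universal threshold); the wavelet family, level and threshold rule are common defaults, not necessarily the paper's choices:

```python
# Hedged sketch: multi-level DWT, soft-threshold the detail coefficients,
# reconstruct the denoised signal with PyWavelets.
import numpy as np
import pywt

def wavelet_denoise(signal, wavelet="db4", level=5):
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745          # noise estimate from finest scale
    thr = sigma * np.sqrt(2 * np.log(len(signal)))          # universal threshold
    coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[:len(signal)]      # trim possible padding sample
```

    The detail coefficients themselves (their energy per decomposition level) are what is then used to localize and classify the disturbance.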

  6. Towards an event-based corpuscular model for optical phenomena

    NARCIS (Netherlands)

    De Raedt, H.; Jin, F.; Michielsen, K.; Roychoudhuri, C; Khrennikov, AY; Kracklauer, AF

    2011-01-01

    We discuss an event-based corpuscular model of optical phenomena that does not require the knowledge of the solution of a wave equation of the whole system and reproduces the results of Maxwell's theory through a series of cause-and-effect processes, starting with the emission and ending with the

  7. Training Team Problem Solving Skills: An Event-Based Approach.

    Science.gov (United States)

    Oser, R. L.; Gualtieri, J. W.; Cannon-Bowers, J. A.; Salas, E.

    1999-01-01

    Discusses how to train teams in problem-solving skills. Topics include team training, the use of technology, instructional strategies, simulations and training, theoretical framework, and an event-based approach for training teams to perform in naturalistic environments. Contains 68 references. (Author/LRW)

  8. Deterministic event-based simulation of universal quantum computation

    NARCIS (Netherlands)

    Michielsen, K.; Raedt, H. De; Raedt, K. De; Landau, DP; Lewis, SP; Schuttler, HB

    2006-01-01

    We demonstrate that locally connected networks of classical processing units that have primitive learning capabilities can be used to perform a deterministic, event-based simulation of universal quantum computation. The new simulation method is applied to implement Shor's factoring algorithm.

  9. An XML-Based Protocol for Distributed Event Services

    Science.gov (United States)

    Smith, Warren; Gunter, Dan; Quesnel, Darcy; Biegel, Bryan (Technical Monitor)

    2001-01-01

    This viewgraph presentation provides information on the application of an XML (extensible mark-up language)-based protocol to the developing field of distributed processing by way of a computational grid which resembles an electric power grid. XML tags would be used to transmit events between the participants of a transaction, namely, the consumer and the producer of the grid scheme.

  10. Simulation of quantum computation : A deterministic event-based approach

    NARCIS (Netherlands)

    Michielsen, K; De Raedt, K; De Raedt, H

    We demonstrate that locally connected networks of machines that have primitive learning capabilities can be used to perform a deterministic, event-based simulation of quantum computation. We present simulation results for basic quantum operations such as the Hadamard and the controlled-NOT gate, and

  11. Simulation of Quantum Computation : A Deterministic Event-Based Approach

    NARCIS (Netherlands)

    Michielsen, K.; Raedt, K. De; Raedt, H. De

    2005-01-01

    We demonstrate that locally connected networks of machines that have primitive learning capabilities can be used to perform a deterministic, event-based simulation of quantum computation. We present simulation results for basic quantum operations such as the Hadamard and the controlled-NOT gate, and

  12. An uncertainty and sensitivity analysis approach for GIS-based multicriteria landslide susceptibility mapping.

    Science.gov (United States)

    Feizizadeh, Bakhtiar; Blaschke, Thomas

    2014-03-04

    GIS-based multicriteria decision analysis (MCDA) methods are increasingly being used in landslide susceptibility mapping. However, the uncertainties that are associated with MCDA techniques may significantly impact the results. This may sometimes lead to inaccurate outcomes and undesirable consequences. This article introduces a new GIS-based MCDA approach. We illustrate the consequences of applying different MCDA methods within a decision-making process through uncertainty analysis. Three GIS-MCDA methods in conjunction with Monte Carlo simulation (MCS) and Dempster-Shafer theory are analyzed for landslide susceptibility mapping (LSM) in the Urmia lake basin in Iran, which is highly susceptible to landslide hazards. The methodology comprises three stages. First, the LSM criteria are ranked and a sensitivity analysis is implemented to simulate error propagation based on the MCS. The resulting weights are expressed through probability density functions. Accordingly, within the second stage, three MCDA methods, namely analytical hierarchy process (AHP), weighted linear combination (WLC) and ordered weighted average (OWA), are used to produce the landslide susceptibility maps. In the third stage, accuracy assessments are carried out and the uncertainties of the different results are measured. We compare the accuracies of the three MCDA methods based on (1) the Dempster-Shafer theory and (2) a validation of the results using an inventory of known landslides and their respective coverage based on object-based image analysis of IRS-ID satellite images. The results of this study reveal that through the integration of GIS and MCDA models, it is possible to identify strategies for choosing an appropriate method for LSM. Furthermore, our findings indicate that the integration of MCDA and MCS can significantly improve the accuracy of the results. In LSM, the AHP method performed best, while the OWA reveals better performance in the reliability assessment. The WLC operation
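
    A minimal sketch of propagating criterion-weight uncertainty through a weighted linear combination (WLC) susceptibility map by Monte Carlo simulation; perturbing the AHP weights with a Dirichlet distribution and summarizing per-cell mean and standard deviation are assumptions used for illustration, not the authors' exact scheme:

```python
# Hedged sketch: Monte Carlo over criterion weights for a WLC susceptibility map,
# returning a mean susceptibility surface and a per-cell uncertainty surface.
import numpy as np

def wlc_uncertainty(criteria, ahp_weights, concentration=200.0, n_draws=500, seed=0):
    """criteria: (n_criteria, rows, cols) standardized layers in [0, 1]."""
    rng = np.random.default_rng(seed)
    w0 = np.asarray(ahp_weights, dtype=float)
    maps = []
    for _ in range(n_draws):
        w = rng.dirichlet(concentration * w0)            # perturbed weights, sum to 1
        maps.append(np.tensordot(w, criteria, axes=1))   # per-cell weighted sum
    maps = np.array(maps)
    return maps.mean(axis=0), maps.std(axis=0)           # susceptibility + its spread

# usage: mean_map, sd_map = wlc_uncertainty(criteria_stack, [0.35, 0.25, 0.2, 0.2])
```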

  13. Event Recognition Based on Deep Learning in Chinese Texts.

    Directory of Open Access Journals (Sweden)

    Yajun Zhang

    Full Text Available Event recognition is the most fundamental and critical task in event-based natural language processing systems. Existing event recognition methods based on rules and shallow neural networks have certain limitations. For example, extracting features using methods based on rules is difficult; methods based on shallow neural networks converge too quickly to a local minimum, resulting in low recognition precision. To address these problems, we propose the Chinese emergency event recognition model based on deep learning (CEERM). Firstly, we use a word segmentation system to segment sentences. According to event elements labeled in the CEC 2.0 corpus, we classify words into five categories: trigger words, participants, objects, time and location. Each word is vectorized according to the following six feature layers: part of speech, dependency grammar, length, location, distance between trigger word and core word, and trigger word frequency. We obtain deep semantic features of words by training a feature vector set using a deep belief network (DBN), then analyze those features in order to identify trigger words by means of a back propagation neural network. Extensive testing shows that the CEERM achieves excellent recognition performance, with a maximum F-measure value of 85.17%. Moreover, we propose the dynamic-supervised DBN, which adds supervised fine-tuning to a restricted Boltzmann machine layer by monitoring its training performance. Test analysis reveals that the new DBN improves recognition performance and effectively controls the training time. Although the F-measure increases to 88.11%, the training time increases by only 25.35%.

  14. A GIS based spatially-explicit sensitivity and uncertainty analysis approach for multi-criteria decision analysis.

    Science.gov (United States)

    Feizizadeh, Bakhtiar; Jankowski, Piotr; Blaschke, Thomas

    2014-03-01

    GIS multicriteria decision analysis (MCDA) techniques are increasingly used in landslide susceptibility mapping for the prediction of future hazards, land use planning, as well as for hazard preparedness. However, the uncertainties associated with MCDA techniques are inevitable and model outcomes are open to multiple types of uncertainty. In this paper, we present a systematic approach to uncertainty and sensitivity analysis. We assess the uncertainty of landslide susceptibility maps produced with GIS-MCDA techniques. A new spatially-explicit approach and Dempster-Shafer Theory (DST) are employed to assess the uncertainties associated with two MCDA techniques, namely Analytical Hierarchical Process (AHP) and Ordered Weighted Averaging (OWA) implemented in GIS. The methodology is composed of three different phases. First, weights are computed to express the relative importance of factors (criteria) for landslide susceptibility. Next, the uncertainty and sensitivity of landslide susceptibility is analyzed as a function of weights using Monte Carlo Simulation and Global Sensitivity Analysis. Finally, the results are validated using a landslide inventory database and by applying DST. The comparisons of the obtained landslide susceptibility maps of both MCDA techniques with known landslides show that the AHP outperforms OWA. However, the OWA-generated landslide susceptibility map shows lower uncertainty than the AHP-generated map. The results demonstrate that further improvement in the accuracy of GIS-based MCDA can be achieved by employing an integrated uncertainty-sensitivity analysis approach, in which the uncertainty of the landslide susceptibility model is decomposed and attributed to the model's criteria weights.

  16. Events

    Directory of Open Access Journals (Sweden)

    Igor V. Karyakin

    2016-02-01

    Full Text Available The 9th ARRCN Symposium 2015 was held during 21st–25th October 2015 at the Novotel Hotel, Chumphon, Thailand, one of the most favored travel destinations in Asia. The 10th ARRCN Symposium 2017 will be held during October 2017 in Davao, Philippines. The International Symposium on the Montagu's Harrier (Circus pygargus), «The Montagu's Harrier in Europe. Status. Threats. Protection», organized by the environmental organization «Landesbund für Vogelschutz in Bayern e.V.» (LBV), was held on November 20-22, 2015 in Germany. The location of this event was the city of Würzburg in Bavaria.

  17. Topic Modeling Based Image Clustering by Events in Social Media

    Directory of Open Access Journals (Sweden)

    Bin Xu

    2016-01-01

    Full Text Available Social event detection in large photo collections is very challenging, and multimodal clustering is an effective methodology for dealing with the problem. Geographic information is important in event detection. This paper proposes a topic-model-based approach to estimate the missing geographic information for photos. The approach utilizes a supervised multimodal topic model to estimate the joint distribution of time, geographic, content, and attached textual information. We then annotate the photos with missing geographic information with a predicted geographic coordinate. Experimental results indicate that the clustering performance is improved by the annotated geographic information.

  18. Decision Model of Flight Safety Based on Flight Event

    Science.gov (United States)

    Xiao-yu, Zhang; Jiu-sheng, Chen

    To improve the management of flight safety for airline companies, a hierarchy model is established for the evaluation of flight safety based on flight events. Flight safety is evaluated by an improved analytical hierarchy process (AHP). A method to rectify the consistency of the judgment matrix is given to improve the AHP, so that the weights can be given directly without a consistency check of the judgment matrix; this ensures an absolutely consistent judgment matrix. Based on statistics of historical flight event incidence data, the flight safety analysis is carried out by means of static evaluation and dynamic evaluation. The hierarchy structure model is implemented based on .NET, and the simulation results prove the validity of the method.

  19. Development of a GCR Event-based Risk Model

    Science.gov (United States)

    Cucinotta, Francis A.; Ponomarev, Artem L.; Plante, Ianik; Carra, Claudio; Kim, Myung-Hee

    2009-01-01

    A goal at NASA is to develop event-based systems biology models of space radiation risks that will replace the current dose-based empirical models. Complex and varied biochemical signaling processes transmit the initial DNA and oxidative damage from space radiation into cellular and tissue responses. Mis-repaired damage or aberrant signals can lead to genomic instability, persistent oxidative stress or inflammation, which are causative of cancer and CNS risks. Protective signaling through adaptive responses or cell repopulation is also possible. We are developing a computational simulation approach to galactic cosmic ray (GCR) effects that is based on biological events rather than average quantities such as dose, fluence, or dose equivalent. The goal of the GCR Event-based Risk Model (GERMcode) is to provide a simulation tool to describe and integrate physical and biological events into stochastic models of space radiation risks. We used the quantum multiple scattering model of heavy ion fragmentation (QMSFRG) and well-known energy loss processes to develop a stochastic Monte-Carlo based model of GCR transport in spacecraft shielding and tissue. We validated the accuracy of the model by comparing with physical data from the NASA Space Radiation Laboratory (NSRL). Our simulation approach allows us to time-tag each GCR proton or heavy ion interaction in tissue, including correlated secondary ions, often of high multiplicity. Conventional space radiation risk assessment employs average quantities, and assumes linearity and additivity of responses over the complete range of GCR charge and energies. To investigate possible deviations from these assumptions, we studied several biological response pathway models of varying induction and relaxation times, including the ATM, TGF-Smad, and WNT signaling pathways. We then considered small volumes of interacting cells and the time-dependent biophysical events that the GCR would produce within these tissue volumes to estimate how

  20. Determination of the ATLAS jet energy measurement uncertainty using photon-jet events in proton-proton collisions at √s = 7 TeV

    CERN Document Server

    The ATLAS collaboration

    2011-01-01

    The ATLAS jet energy calibration is validated in situ by exploiting the conservation of transverse momentum pT in events which contain a photon and a hadronic jet in the central region of the detector. Three jet calibration schemes derived from Monte Carlo simulation are validated using 38 pb⁻¹ of proton-proton (pp) collisions at √s = 7 TeV. Two techniques, which exhibit different systematic uncertainties, are used to measure the jet response: the direct pT balance, pT(jet)/pT(γ), and the missing ET projection fraction. For pT(γ) > 45 GeV, the jet response measured in situ agrees with simulation within 3%. With the current amount of data, this analysis is statistically limited beyond pT(γ) of 250 GeV.
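
    A minimal illustration (not the ATLAS analysis code) of the direct pT-balance technique on synthetic photon-jet events: the response pT(jet)/pT(γ) is averaged in bins of photon pT. The smearing, binning and assumed response value are arbitrary.

```python
# Illustrative direct pT-balance measurement on synthetic photon-jet events:
# response = pT(jet) / pT(photon), averaged in bins of photon pT.
import numpy as np

rng = np.random.default_rng(2)
pt_gamma = rng.uniform(45.0, 400.0, 20000)                 # photon pT in GeV
response_true = 0.97                                       # assumed average jet response
pt_jet = response_true * pt_gamma * (1 + rng.normal(0.0, 0.1, pt_gamma.size))

bins = np.array([45, 60, 80, 110, 160, 250, 400], dtype=float)
idx = np.digitize(pt_gamma, bins) - 1
for i in range(len(bins) - 1):
    r = pt_jet[idx == i] / pt_gamma[idx == i]
    print(f"{bins[i]:.0f}-{bins[i+1]:.0f} GeV: response = {r.mean():.3f} "
          f"+/- {r.std(ddof=1) / np.sqrt(r.size):.3f}")
```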

  1. Uncertainty estimation with bias-correction for flow series based on rating curve

    Science.gov (United States)

    Shao, Quanxi; Lerat, Julien; Podger, Geoff; Dutta, Dushmanta

    2014-03-01

    Streamflow discharge constitutes one of the fundamental data required to perform water balance studies and develop hydrological models. A rating curve, designed based on a series of concurrent stage and discharge measurements at a gauging location, provides a way to generate complete discharge time series of reasonable quality if sufficient measurement points are available. However, the associated uncertainty is frequently not available even though it has a significant impact on hydrological modelling. In this paper, we identify discrepancies in the hydrographers' rating curves used to derive the historical discharge data series and propose a bias-correction modification which, like the traditional rating curve, takes the form of a power function. In order to obtain an uncertainty estimate, we further propose a Box-Cox transformation applied to both sides of the relationship to bring the regression residuals as close to a normal distribution as possible, so that a proper uncertainty can be attached to the whole discharge series in the ensemble generation. We demonstrate the proposed method by applying it to the gauging stations in the Flinders and Gilbert rivers in north-west Queensland, Australia.
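
    A hedged sketch of the idea: a power-law rating curve Q = a(h - h0)^b is fitted to synthetic gaugings, the same Box-Cox transform is applied to both the observed and the fitted discharge so the residuals are roughly normal, and an approximate uncertainty band is mapped back to discharge units. The cease-to-flow stage, noise level and gauging data are made up for illustration.

```python
# Sketch: fit a power-law rating curve Q = a * (h - h0)^b to synthetic gaugings,
# then apply the same Box-Cox transform to observed and fitted discharge
# ("both sides") so that residuals are closer to normal and an uncertainty
# band can be attached to the derived discharge series.
import numpy as np
from scipy import stats, special

rng = np.random.default_rng(3)
h0 = 0.20                                     # assumed cease-to-flow stage (m)
stage = rng.uniform(0.5, 4.0, 80)             # synthetic concurrent gaugings
discharge = 12.0 * (stage - h0) ** 1.8 * rng.lognormal(0.0, 0.08, stage.size)

# power-law rating curve fitted by linear regression in log space
b, log_a = np.polyfit(np.log(stage - h0), np.log(discharge), 1)
fitted = np.exp(log_a) * (stage - h0) ** b

# "both sides" Box-Cox: a single lambda applied to observed and fitted discharge
_, lam = stats.boxcox(discharge)
resid = stats.boxcox(discharge, lam) - stats.boxcox(fitted, lam)
sigma = resid.std(ddof=1)

# approximate 95% uncertainty band for the discharge at a 2.0 m stage
q_hat = np.exp(log_a) * (2.0 - h0) ** b
z_hat = special.boxcox(q_hat, lam)
band = special.inv_boxcox(np.array([z_hat - 1.96 * sigma, z_hat + 1.96 * sigma]), lam)
print(f"a = {np.exp(log_a):.2f}, b = {b:.2f}; Q at 2.0 m stage: {q_hat:.1f} "
      f"(95% band {band[0]:.1f}-{band[1]:.1f})")
```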

  2. An integrated uncertainty and ensemble-based data assimilation approach for improved operational streamflow predictions

    Directory of Open Access Journals (Sweden)

    M. He

    2012-03-01

    Full Text Available The current study proposes an integrated uncertainty and ensemble-based data assimilation framework (ICEA) and evaluates its viability in providing operational streamflow predictions via assimilating snow water equivalent (SWE) data. This step-wise framework applies a parameter uncertainty analysis algorithm (ISURF) to identify the uncertainty structure of sensitive model parameters, which is subsequently formulated into an Ensemble Kalman Filter (EnKF) to generate updated snow states for streamflow prediction. The framework is coupled to the US National Weather Service (NWS) snow and rainfall-runoff models. Its applicability is demonstrated for an operational basin of a western River Forecast Center (RFC) of the NWS. Performance of the framework is evaluated against the existing operational baseline (RFC predictions), the stand-alone ISURF and the stand-alone EnKF. Results indicate that the ensemble-mean prediction of ICEA considerably outperforms predictions from the other three scenarios investigated, particularly in the context of predicting high flows (top 5th percentile). The ICEA streamflow ensemble predictions capture the variability of the observed streamflow well; however, the ensemble is not wide enough to consistently contain the range of streamflow observations in the study basin. Our findings indicate that the ICEA has the potential to supplement the current operational (deterministic) forecasting method in terms of providing improved single-valued (e.g., ensemble mean) streamflow predictions as well as meaningful ensemble predictions.
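
    The EnKF update at the core of such a framework can be sketched generically as below: a forecast ensemble of snow water equivalent (SWE) states is nudged toward a perturbed SWE observation using a gain computed from the ensemble variance. The numbers are toy values; this is not the NWS or ICEA implementation.

```python
# Generic Ensemble Kalman Filter analysis step, as used conceptually in ICEA to
# update model snow water equivalent (SWE) states with an SWE observation.
# Toy numbers only; this is not the NWS or ICEA implementation.
import numpy as np

rng = np.random.default_rng(4)
n_ens = 50
swe_forecast = rng.normal(120.0, 15.0, n_ens)      # forecast ensemble of SWE (mm)
obs, obs_err = 105.0, 8.0                          # observed SWE and its error std

# perturb the observation for each member (stochastic EnKF)
obs_perturbed = obs + rng.normal(0.0, obs_err, n_ens)

# Kalman gain from ensemble variance (state and observation are both SWE here)
var_f = swe_forecast.var(ddof=1)
gain = var_f / (var_f + obs_err ** 2)

swe_analysis = swe_forecast + gain * (obs_perturbed - swe_forecast)
print(f"forecast mean {swe_forecast.mean():.1f} -> analysis mean {swe_analysis.mean():.1f}, "
      f"spread {swe_forecast.std(ddof=1):.1f} -> {swe_analysis.std(ddof=1):.1f}")
```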

  3. Event-Based control of depth of hypnosis in anesthesia.

    Science.gov (United States)

    Merigo, Luca; Beschi, Manuel; Padula, Fabrizio; Latronico, Nicola; Paltenghi, Massimiliano; Visioli, Antonio

    2017-08-01

    In this paper, we propose the use of an event-based control strategy for the closed-loop control of the depth of hypnosis in anesthesia by using propofol administration and the bispectral index as a controlled variable. A new event generator with high noise-filtering properties is employed in addition to a PIDPlus controller. The tuning of the parameters is performed off-line using genetic algorithms on a given data set of patients. The effectiveness and robustness of the method are verified in simulation by implementing a Monte Carlo method to address the intra-patient and inter-patient variability. A comparison with a standard PID control structure shows that the event-based control system achieves a reduction of the total variation of the manipulated variable of 93% in the induction phase and of 95% in the maintenance phase. The use of event-based automatic control in anesthesia yields a fast induction phase with bounded overshoot and an acceptable disturbance rejection. A comparison with a standard PID control structure shows that the technique effectively mimics the behavior of the anesthesiologist by providing a significant decrement of the total variation of the manipulated variable. Copyright © 2017 Elsevier B.V. All rights reserved.
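
    A minimal sketch of the event-based idea, assuming a simple send-on-delta event generator feeding a PI controller with basic anti-windup; the paper's PIDPlus controller, its noise-filtering event generator and the pharmacological patient model are not reproduced here.

```python
# Minimal event-based control loop: a send-on-delta event generator decides when
# the measurement (a BIS-like depth-of-hypnosis signal) is forwarded to a simple
# PI controller. Conceptual sketch only; not the paper's PIDPlus design.
import random

random.seed(0)
setpoint, delta, dt = 50.0, 1.5, 1.0   # target signal, event threshold, time step
kp, ki = 0.25, 0.03                    # PI gains (illustrative values)

y, u, integral = 95.0, 0.0, 0.0        # measurement, infusion command, integrator
last_sent, events = None, 0

for step in range(200):
    noisy_y = y + random.gauss(0.0, 0.4)
    # send-on-delta event generator: update the controller only on significant change
    if last_sent is None or abs(noisy_y - last_sent) > delta:
        events += 1
        last_sent = noisy_y
        err = noisy_y - setpoint           # positive error -> increase infusion
        candidate = kp * err + ki * (integral + err * dt)
        if 0.0 <= candidate <= 12.0:       # simple anti-windup: integrate only if unsaturated
            integral += err * dt
        u = min(12.0, max(0.0, kp * err + ki * integral))
    # toy first-order patient response: higher infusion lowers the measured signal
    y += ((95.0 - 4.0 * u) - y) * 0.08 * dt

print(f"controller updates: {events} of 200 samples, final signal ~ {y:.1f}")
```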

  4. Error modeling based on geostatistics for uncertainty analysis in crop mapping using Gaofen-1 multispectral imagery

    Science.gov (United States)

    You, Jiong; Pei, Zhiyuan

    2015-01-01

    With the development of remote sensing technology, its applications in agriculture monitoring systems, crop mapping accuracy, and spatial distribution are increasingly being explored by administrators and users. Uncertainty in crop mapping is profoundly affected by the spatial pattern of spectral reflectance values obtained from the applied remote sensing data. Errors in remotely sensed crop cover information and their propagation into derivative products need to be quantified and handled correctly. Therefore, this study discusses methods of error modeling for uncertainty characterization in crop mapping using GF-1 multispectral imagery. An error modeling framework based on geostatistics is proposed, which introduces the sequential Gaussian simulation algorithm to explore the relationship between classification errors and the spectral signature from the remote sensing data source. On this basis, a misclassification probability model is developed to produce a spatially explicit classification error probability surface for the map of a crop, which realizes the uncertainty characterization for crop mapping. In this process, trend surface analysis was carried out to generate a spatially varying mean response and the corresponding residual response with spatial variation for the spectral bands of GF-1 multispectral imagery. Variogram models were employed to measure the spatial dependence in the spectral bands and the derived misclassification probability surfaces. Simulated spectral data and classification results were quantitatively analyzed. Through experiments using data sets from a region in the low rolling country located in the Yangtze River valley, it was found that GF-1 multispectral imagery can be used for crop mapping with a good overall performance, the proposed error modeling framework can be used to quantify the uncertainty in crop mapping, and the misclassification probability model can summarize the spatial variation in map accuracy and is helpful for

  5. Event-based cluster synchronization of coupled genetic regulatory networks

    Science.gov (United States)

    Yue, Dandan; Guan, Zhi-Hong; Li, Tao; Liao, Rui-Quan; Liu, Feng; Lai, Qiang

    2017-09-01

    In this paper, the cluster synchronization of coupled genetic regulatory networks with a directed topology is studied by using the event-based strategy and pinning control. An event-triggered condition with a threshold consisting of the neighbors' discrete states at their own event time instants and a state-independent exponential decay function is proposed. The intra-cluster states information and extra-cluster states information are involved in the threshold in different ways. By using the Lyapunov function approach and the theories of matrices and inequalities, we establish the cluster synchronization criterion. It is shown that both the avoidance of continuous transmission of information and the exclusion of the Zeno behavior are ensured under the presented triggering condition. Explicit conditions on the parameters in the threshold are obtained for synchronization. The stability criterion of a single GRN is also given under the reduced triggering condition. Numerical examples are provided to validate the theoretical results.

  6. Event-based state estimation a stochastic perspective

    CERN Document Server

    Shi, Dawei; Chen, Tongwen

    2016-01-01

    This book explores event-based estimation problems. It shows how several stochastic approaches are developed to maintain estimation performance when sensors perform their updates at slower rates only when needed. The self-contained presentation makes this book suitable for readers with no more than a basic knowledge of probability analysis, matrix algebra and linear systems. The introduction and literature review provide information, while the main content deals with estimation problems from four distinct angles in a stochastic setting, using numerous illustrative examples and comparisons. The text elucidates both theoretical developments and their applications, and is rounded out by a review of open problems. This book is a valuable resource for researchers and students who wish to expand their knowledge and work in the area of event-triggered systems. At the same time, engineers and practitioners in industrial process control will benefit from the event-triggering technique that reduces communication costs ...

  7. Event-Based User Classification in Weibo Media

    Science.gov (United States)

    Wang, Wendong; Cheng, Shiduan; Que, Xirong

    2014-01-01

    Weibo media, known as a real-time microblogging service, has attracted massive attention and support from social network users. The Weibo platform offers an opportunity for people to access information and changes the way people acquire and disseminate information significantly. Meanwhile, it enables people to respond to social events in a more convenient way. Much of the information in Weibo media is related to some event. Users who post different content and exhibit different behaviors or attitudes may contribute differently to a specific event. Therefore, automatically classifying the large number of uncategorized social circles generated in Weibo media from the perspective of events has become a promising task. Under these circumstances, in order to effectively organize and manage the huge number of users, and thereby further manage their contents, we address the task of user classification in a more granular, event-based approach in this paper. By analyzing real data collected from Sina Weibo, we investigate the Weibo properties and utilize both content information and social network information to classify the numerous users into four primary groups: celebrities, organizations/media accounts, grassroots stars, and ordinary individuals. The experimental results show that our method identifies the user categories accurately. PMID:25133235

  8. Uncertainty Quantification for Adjoint-Based Inverse Problems with Sparse Data

    Science.gov (United States)

    Loose, Nora; Heimbach, Patrick; Nisancioglu, Kerim

    2017-04-01

    The adjoint method of data assimilation (DA) is used in many fields of Geosciences. It fits a dynamical model to observations in a least-squares optimization problem, leading to a solution that follows the model equations exactly. While the physical consistency of the obtained solution makes the adjoint method an attractive DA technique for many applications, one of its major drawbacks is that an accompanying uncertainty quantification is computationally challenging. In theory, the Hessian of the model-data misfit function can provide such an error estimate on the solution of the inverse problem because, under certain assumptions, it can be associated with the inverse of the error covariance matrix. In practice, however, studies that apply adjoint-based DA to ocean GCMs usually do not provide a quantification of uncertainties, mostly because an analysis of the Hessian is often intractable due to its high dimensionality. This work is motivated by the fact that an increasing number of studies apply the adjoint-based DA machinery to paleoceanographic problems without considering the accompanying uncertainties. In such applications, the number of observations can be of the order of 10², while the dimension of the control space is still as high as 10⁶ to 10⁸. An uncertainty quantification in such heavily underdetermined inverse problems seems even more crucial, an objective that we pursue here. We take advantage of the fact that in such situations the Hessian is of very low rank (while still of high dimension). This enables us to explore in great detail to what extent paleo proxy data from ocean sediment cores informs the solution of the inverse problem. We use the MIT general circulation model (MITgcm) and sample a sparse set of observations from a control simulation, corresponding to available data from ocean sediment cores. We then quantify how well the synthetic data constrains different quantities of interest, such as heat content of specific ocean
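
    A hedged sketch of why the low-rank structure helps: for a linearized problem with far fewer observations than control variables, the Gauss-Newton Hessian of the misfit has rank at most equal to the number of observations, and the posterior covariance follows from the Woodbury identity without forming any large inverse. The observation operator, prior and noise levels below are synthetic.

```python
# Sketch of uncertainty quantification with a low-rank data-misfit Hessian:
# with far fewer observations (m) than control variables (n), the Gauss-Newton
# Hessian J^T R^-1 J has rank <= m, and the posterior covariance follows from
# the Woodbury identity without ever forming an n x n inverse.
import numpy as np

rng = np.random.default_rng(5)
n, m = 2000, 20                          # control space dim vs. number of proxy observations
sigma_prior, sigma_obs = 1.0, 0.5

J = rng.normal(0.0, 1.0 / np.sqrt(n), (m, n))     # linearized observation operator
B = sigma_prior ** 2 * np.eye(n)                  # prior covariance (diagonal here)

# Woodbury: (B^-1 + J^T R^-1 J)^-1 = B - B J^T (R + J B J^T)^-1 J B
S = sigma_obs ** 2 * np.eye(m) + J @ B @ J.T      # m x m, cheap to factorize
posterior_var = np.diag(B) - np.einsum("ij,jk,ki->i", B @ J.T, np.linalg.inv(S), J @ B)

reduction = 1.0 - posterior_var / np.diag(B)      # fraction of prior variance removed
print(f"variance reduced by >1% in {int(np.sum(reduction > 0.01))} of {n} directions; "
      f"max reduction {reduction.max():.2f}")
```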

  9. New Multi-objective Uncertainty-based Algorithm for Water Resource Models' Calibration

    Science.gov (United States)

    Keshavarz, Kasra; Alizadeh, Hossein

    2017-04-01

    Water resource models are powerful tools to support the water management decision-making process and are developed to deal with a broad range of issues, including land use and climate change impact analysis, water allocation, systems design and operation, waste load control and allocation, etc. These models are divided into the two categories of simulation and optimization models, whose calibration has been addressed in the literature, where considerable effort in recent decades has led to two main categories of auto-calibration methods: uncertainty-based algorithms such as GLUE, MCMC and PEST, and optimization-based algorithms, including single-objective optimization such as SCE-UA and multi-objective optimization such as MOCOM-UA and MOSCEM-UA. Although algorithms which benefit from the capabilities of both types, such as SUFI-2, have also been developed, this paper proposes a new auto-calibration algorithm which is capable of both finding optimal parameter values with regard to multiple objectives, like optimization-based algorithms, and providing interval estimations of parameters, like uncertainty-based algorithms. The algorithm is actually developed to improve the quality of SUFI-2 results. Based on a single objective, e.g. NSE or RMSE, SUFI-2 proposes a routine to find the best point and interval estimation of parameters and the corresponding prediction intervals (95PPU) of the time series of interest. To assess the goodness of calibration, final results are presented using two uncertainty measures: the p-factor, quantifying the percentage of observations covered by the 95PPU, and the r-factor, quantifying the degree of uncertainty; the analyst then has to select the point and interval estimation of parameters which are non-dominated with regard to both uncertainty measures. Based on the described properties of SUFI-2, two important questions are raised, the answers to which motivate our research: Given that in SUFI-2 the final selection is based on the two measures or objectives and on the other
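
    The two uncertainty measures can be computed in a few lines; the sketch below uses synthetic observations and an ensemble of simulations, takes the 2.5th and 97.5th percentiles as the 95PPU, and reports the p-factor (coverage) and r-factor (relative band width). It is an illustration of the definitions, not the SUFI-2 code.

```python
# Sketch of the two SUFI-2-style uncertainty measures: the p-factor (share of
# observations bracketed by the 95PPU band) and the r-factor (average band width
# relative to the standard deviation of the observations). Synthetic data only.
import numpy as np

rng = np.random.default_rng(6)
n_time, n_runs = 365, 300
truth = 10.0 + 5.0 * np.sin(np.linspace(0.0, 6.28, n_time))
observed = truth + rng.normal(0.0, 1.0, n_time)
# ensemble of simulations from sampled parameter sets (toy surrogate)
simulations = truth + rng.normal(0.0, 1.5, (n_runs, n_time)) + rng.normal(0.0, 0.5, (n_runs, 1))

lower = np.percentile(simulations, 2.5, axis=0)
upper = np.percentile(simulations, 97.5, axis=0)

p_factor = np.mean((observed >= lower) & (observed <= upper))
r_factor = np.mean(upper - lower) / observed.std(ddof=1)
print(f"p-factor = {p_factor:.2f}, r-factor = {r_factor:.2f}")
```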

  10. Biomedical event trigger detection by dependency-based word embedding.

    Science.gov (United States)

    Wang, Jian; Zhang, Jianhai; An, Yuan; Lin, Hongfei; Yang, Zhihao; Zhang, Yijia; Sun, Yuanyuan

    2016-08-10

    In biomedical research, events revealing complex relations between entities play an important role. Biomedical event trigger identification has become a research hotspot owing to its important role in biomedical event extraction. Traditional machine learning methods, such as support vector machines (SVM) and maxent classifiers, rely on manually designed features fed to the classifiers; they depend on an understanding of the specific task and cannot generalize to new domains or new examples. In this paper, we propose an approach which utilizes a neural network model based on dependency-based word embedding to automatically learn significant features from raw input for trigger classification. First, we employ Word2vecf, the modified version of Word2vec, to learn word embeddings with rich semantic and functional information based on the dependency relation tree. Then a neural network architecture is used to learn a more significant feature representation based on the raw dependency-based word embedding. Meanwhile, we dynamically adjust the embedding during training to adapt to the trigger classification task. Finally, a softmax classifier labels the examples with a specific trigger class using the features learned by the model. The experimental results show that our approach achieves a micro-averaged F1 score of 78.27% and a macro-averaged F1 score of 76.94% on the significant trigger classes, and performs better than baseline methods. In addition, we can achieve a semantic distributed representation of every trigger word.

  11. PREVENTING MEDICATION ERROR BASED ON KNOWLEDGE MANAGEMENT AGAINST ADVERSE EVENT

    Directory of Open Access Journals (Sweden)

    Apriyani Puji Hastuti

    2017-06-01

    Full Text Available Introduction: Medication error is one of many types of errors that can decrease the quality and safety of healthcare. An increasing number of adverse events (AE) reflects the number of medication errors. This study aimed to develop a model of medication error prevention based on knowledge management. This model is expected to improve the knowledge and skill of nurses to prevent medication error, which is characterized by a decrease in adverse events (AE). Methods: This study consisted of two stages. The first stage was an explanative survey using a cross-sectional approach involving 15 respondents selected by purposive sampling. The second stage was a pre-test experiment involving 29 respondents selected with cluster sampling. Partial least squares (PLS) was used to examine the factors affecting the medication error prevention model, while the Wilcoxon signed rank test was used to test the effect of the medication error prevention model on adverse events (AE). Results: Individual factors (path coefficient 12:56, t = 4.761) play an important role in nurse behavioral changes regarding medication error prevention based on knowledge management, as do organizational factors (path coefficient = 0.276, t = 2.504) and work characteristic factors (path coefficient = 0.309, t = 1.98). The medication error prevention model based on knowledge management also significantly decreased adverse events (p = 0.000, α < 0.05). Discussion: Individual, organizational and work characteristic factors were important in the development of medication error prevention models based on knowledge management.

  12. An Oracle-based event index for ATLAS

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00083337; The ATLAS collaboration; Dimitrov, Gancho

    2017-01-01

    The ATLAS EventIndex system has amassed a set of key quantities for a large number of ATLAS events into a Hadoop-based infrastructure for the purpose of providing the experiment with a number of event-wise services. Collecting these data in one place provides the opportunity to investigate various storage formats and technologies, to assess which best serve the various use cases, and to consider what other benefits alternative storage systems provide. In this presentation we describe how the data are imported into an Oracle RDBMS (relational database management system), the services we have built based on this architecture, and our experience with it. We have indexed about 26 billion real data events thus far and have designed the system to accommodate future data, which is expected at rates of 5 to 20 billion events per year. We have found this system offers outstanding performance for some fundamental use cases. In addition, profiting from the co-location of this data with other complementary metadata in AT...

  13. Voxel-based statistical analysis of uncertainties associated with deformable image registration.

    Science.gov (United States)

    Li, Shunshan; Glide-Hurst, Carri; Lu, Mei; Kim, Jinkoo; Wen, Ning; Adams, Jeffrey N; Gordon, James; Chetty, Indrin J; Zhong, Hualiang

    2013-09-21

    Deformable image registration (DIR) algorithms have inherent uncertainties in their displacement vector fields (DVFs). The purpose of this study is to develop an optimal metric to estimate DIR uncertainties. Six computational phantoms have been developed from the CT images of lung cancer patients using a finite element method (FEM). The FEM-generated DVFs were used as a standard for registrations performed on each of these phantoms. A mechanics-based metric, unbalanced energy (UE), was developed to evaluate these registration DVFs. The potential correlation between UE and DIR errors was explored using multivariate analysis, and the results were validated by a landmark approach and compared with two other error metrics: DVF inverse consistency (IC) and image intensity difference (ID). Landmark-based validation was performed using the POPI-model. The results show that the Pearson correlation coefficient between UE and DIR error is r(UE, error) = 0.50. This is higher than r(IC, error) = 0.29 for IC and DIR error and r(ID, error) = 0.37 for ID and DIR error. The Pearson correlation coefficient between UE and the product of the DIR displacements and errors is r(UE, error × DVF) = 0.62 for the six patients and 0.73 for the POPI-model data. It has been demonstrated that UE has a strong correlation with DIR errors, and the UE metric outperforms the IC and ID metrics in estimating DIR uncertainties. The quantified UE metric can be a useful tool for adaptive treatment strategies, including probability-based adaptive treatment planning.

  14. Kinematic source inversions of teleseismic data based on the QUESO library for uncertainty quantification and prediction

    Science.gov (United States)

    Zielke, O.; McDougall, D.; Mai, P. M.; Babuska, I.

    2014-12-01

    One fundamental aspect of seismic hazard mitigation is gaining a better understanding of the rupture process. Because direct observation of the relevant parameters and properties is not possible, other means such as kinematic source inversions are used instead. By constraining the spatial and temporal evolution of fault slip during an earthquake, those inversion approaches may enable valuable insights into the physics of the rupture process. However, due to the underdetermined nature of this inversion problem (i.e., inverting a kinematic source model for an extended fault based on seismic data), the provided solutions are generally non-unique. Here we present a statistical (Bayesian) inversion approach based on an open-source library for uncertainty quantification (UQ) called QUESO that was developed at ICES (UT Austin). The approach has advantages with respect to deterministic inversion approaches as it provides not only a single (non-unique) solution but also uncertainty bounds with it. Those uncertainty bounds help to qualitatively and quantitatively judge how well constrained an inversion solution is and how much rupture complexity the data reliably resolve. The presented inversion scheme uses only tele-seismically recorded body waves, but future developments may lead us towards joint inversion schemes. After giving an insight into the inversion scheme itself (based on delayed rejection adaptive Metropolis, DRAM), we explore the method's resolution potential. For that, we synthetically generate tele-seismic data, add, for example, different levels of noise and/or change the fault plane parameterization, and then apply our inversion scheme in an attempt to extract the (known) kinematic rupture model. We conclude by inverting, as an example, real tele-seismic data of a recent large earthquake and comparing the results with deterministically derived kinematic source models provided by other research groups.

  15. A Belief Rule Based Expert System to Assess Bronchiolitis Suspicion from Signs and Symptoms under Uncertainty

    DEFF Research Database (Denmark)

    Karim, Rezuan; Hossain, Mohammad Shahadat; Khalid, Md. Saifuddin

    2017-01-01

    Bronchiolitis is a common disease in children and an acute viral infection of the bronchioles that affects millions of children around the world. The assessment of the suspicion of this disease is usually carried out by looking at its signs and symptoms. However, these signs and symptoms cannot be measured with complete certainty, resulting in inaccuracy in determining the occurrence of Bronchiolitis. Therefore, this paper presents the development of a Belief Rule-Based Expert System (BRBES) to assess the suspicion of Bronchiolitis by using signs and symptoms under uncertainty. The recently...

  16. Quantification of Uncertainties in Turbulence Modeling: A Comparison of Physics-Based and Random Matrix Theoretic Approaches

    CERN Document Server

    Wang, Jian-Xun; Xiao, Heng

    2016-01-01

    Numerical models based on Reynolds-Averaged Navier-Stokes (RANS) equations are widely used in engineering turbulence modeling. However, the RANS predictions have large model-form uncertainties for many complex flows. Quantification of these large uncertainties originating from the modeled Reynolds stresses has attracted attention in the turbulence modeling community. Recently, a physics-based Bayesian framework for quantifying model-form uncertainties has been proposed with successful applications to several flows. Nonetheless, how to specify proper priors without introducing unwarranted, artificial information remains challenging in the current form of the physics-based approach. Another recently proposed method based on random matrix theory provides the prior distributions with the maximum entropy, which is an alternative for model-form uncertainty quantification in RANS simulations. In this work, we utilize the random matrix theoretic approach to assess and possibly improve the specification of priors used in ...

  17. A framework for model-based optimization of bioprocesses under uncertainty: Identifying critical parameters and operating variables

    DEFF Research Database (Denmark)

    Morales Rodriguez, Ricardo; Meyer, Anne S.; Gernaey, Krist

    2011-01-01

    This study presents the development and application of a systematic model-based framework for bioprocess optimization, evaluated on a cellulosic ethanol production case study. The implementation of the framework involves the use of dynamic simulations, sophisticated uncertainty analysis (Monte...

  18. Effect of geometrical uncertainties on the performance of heat exchangers using an efficient POD-based model reduction technique

    Science.gov (United States)

    Abraham, S.; Ghorbaniasl, G.; Raisee, M.; Lacor, C.

    2016-06-01

    The present paper aims at assessing the effect of manufacturing tolerances on the performance of heat exchangers. To this end, a two-dimensional square rib-roughened cooling channel is considered and uncertainties are introduced along the rib profile, using a Karhunen-Loève expansion including 20 uncertain variables. In order to break the curse of dimensionality and keep the overall computational cost within acceptable limits, an efficient uncertainty quantification strategy is followed. A sensitivity analysis is first performed on a coarse grid, enabling the most important dimensions to be identified and those without any significant effect on the output of interest to be removed. Afterwards, an efficient Proper Orthogonal Decomposition based dimension reduction technique is implemented in order to propagate uncertainties through the CFD model. It is shown that heat transfer predictions are strongly affected by geometrical uncertainties, while no significant effect was found for the pressure drop.

  19. Mars Science Laboratory; A Model for Event-Based EPO

    Science.gov (United States)

    Mayo, Louis; Lewis, E.; Cline, T.; Stephenson, B.; Erickson, K.; Ng, C.

    2012-10-01

    The NASA Mars Science Laboratory (MSL) and its Curiosity Rover, a part of NASA's Mars Exploration Program, represent the most ambitious undertaking to date to explore the red planet. MSL/Curiosity was designed primarily to determine whether Mars ever had an environment capable of supporting microbial life. NASA's MSL education program was designed to take advantage of existing, highly successful event-based education programs to communicate Mars science and education themes to worldwide audiences through live webcasts, video interviews with scientists, TV broadcasts, professional development for teachers, and the latest social media frameworks. We report here on the success of the MSL education program and discuss how this methodological framework can be used to enhance other event-based education programs.

  20. A Physics-Based Vibrotactile Feedback Library for Collision Events.

    Science.gov (United States)

    Park, Gunhyuk; Choi, Seungmoon

    2017-01-01

    We present PhysVib: a software solution on the mobile platform extending an open-source physics engine in a multi-rate rendering architecture for automatic vibrotactile feedback upon collision events. PhysVib runs concurrently with a physics engine at a low update rate and generates vibrotactile feedback commands at a high update rate based on the simulation results of the physics engine, using an exponentially-decaying sinusoidal model. We demonstrate through a user study that this vibration model is more appropriate to our purpose in terms of perceptual quality than more complex models based on sound synthesis. We also evaluated the perceptual performance of PhysVib by comparing eight vibrotactile rendering methods. Experimental results suggested that PhysVib enables more realistic vibrotactile feedback than the other methods with respect to perceived similarity to the visual events. PhysVib is an effective solution for providing physically plausible vibrotactile responses while reducing application development time to a great extent.
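
    The vibration model named in the abstract is simple enough to sketch directly: an exponentially decaying sinusoid a(t) = A * exp(-t/tau) * sin(2*pi*f*t) whose amplitude is scaled by the impact speed reported by the physics engine. Frequency, decay time and the speed-to-amplitude mapping below are illustrative guesses, not PhysVib's parameters.

```python
# Sketch of the exponentially-decaying sinusoidal vibration model used for
# collision feedback: a(t) = A * exp(-t / tau) * sin(2*pi*f*t), with the
# amplitude A scaled by the impact speed reported by the physics engine.
# Frequencies, decay times and scaling are illustrative, not PhysVib's values.
import math

def collision_vibration(impact_speed, freq_hz=150.0, tau_s=0.05,
                        sample_rate=1000, duration_s=0.2):
    amplitude = min(1.0, 0.2 * impact_speed)   # hypothetical speed-to-amplitude map
    samples = []
    for i in range(int(sample_rate * duration_s)):
        t = i / sample_rate
        samples.append(amplitude * math.exp(-t / tau_s) * math.sin(2 * math.pi * freq_hz * t))
    return samples

wave = collision_vibration(impact_speed=3.0)
print(f"{len(wave)} samples, peak |a| = {max(abs(s) for s in wave):.3f}")
```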

  1. A Weighted Belief Entropy-Based Uncertainty Measure for Multi-Sensor Data Fusion

    Science.gov (United States)

    Tang, Yongchuan; Zhou, Deyun; Xu, Shuai; He, Zichang

    2017-01-01

    In real applications, how to measure the degree of uncertainty of sensor reports before applying sensor data fusion is a big challenge. In this paper, within the frame of Dempster–Shafer evidence theory, a weighted belief entropy based on Deng entropy is proposed to quantify the uncertainty of uncertain information. The weight of the proposed belief entropy is based on the relative scale of a proposition with regard to the frame of discernment (FOD). Compared with some other uncertainty measures in the Dempster–Shafer framework, the new measure focuses on the uncertain information represented by not only the mass function but also the scale of the FOD, which means less information loss in information processing. After that, a new multi-sensor data fusion approach based on the weighted belief entropy is proposed. The rationality and superiority of the new multi-sensor data fusion method are verified through an experiment on artificial data and an application to fault diagnosis of a motor rotor. PMID:28441736
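
    For concreteness, the sketch below computes Deng entropy for a small body of evidence and a cardinality-weighted variant in the spirit of the abstract (weighting each proposition by |A|/|X|). The weighted form shown is an illustrative reading of the abstract, not necessarily the exact published formula.

```python
# Deng entropy for a body of evidence, plus a cardinality-weighted variant in the
# spirit of the abstract (weight |A|/|X| reflecting the scale of each proposition
# relative to the frame of discernment). The weighted form here is an assumption.
import math

def deng_entropy(bba):
    # bba: dict mapping frozenset propositions -> mass
    return -sum(m * math.log2(m / (2 ** len(a) - 1)) for a, m in bba.items() if m > 0)

def weighted_belief_entropy(bba, frame):
    return -sum((len(a) / len(frame)) * m * math.log2(m / (2 ** len(a) - 1))
                for a, m in bba.items() if m > 0)

frame = frozenset({"a", "b", "c"})
bba = {frozenset({"a"}): 0.5, frozenset({"a", "b"}): 0.3, frame: 0.2}
print(f"Deng entropy = {deng_entropy(bba):.3f}, "
      f"weighted = {weighted_belief_entropy(bba, frame):.3f}")
```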

  2. A GIS based spatially-explicit sensitivity and uncertainty analysis approach for multi-criteria decision analysis☆

    Science.gov (United States)

    Feizizadeh, Bakhtiar; Jankowski, Piotr; Blaschke, Thomas

    2014-01-01

    GIS multicriteria decision analysis (MCDA) techniques are increasingly used in landslide susceptibility mapping for the prediction of future hazards, for land use planning, as well as for hazard preparedness. However, the uncertainties associated with MCDA techniques are inevitable and model outcomes are open to multiple types of uncertainty. In this paper, we present a systematic approach to uncertainty and sensitivity analysis. We assess the uncertainty of landslide susceptibility maps produced with GIS-MCDA techniques. A new spatially-explicit approach and Dempster–Shafer Theory (DST) are employed to assess the uncertainties associated with two MCDA techniques, namely Analytical Hierarchical Process (AHP) and Ordered Weighted Averaging (OWA) implemented in GIS. The methodology is composed of three different phases. First, weights are computed to express the relative importance of factors (criteria) for landslide susceptibility. Next, the uncertainty and sensitivity of landslide susceptibility is analyzed as a function of weights using Monte Carlo Simulation and Global Sensitivity Analysis. Finally, the results are validated using a landslide inventory database and by applying DST. The comparisons of the obtained landslide susceptibility maps of both MCDA techniques with known landslides show that the AHP outperforms OWA. However, the OWA-generated landslide susceptibility map shows lower uncertainty than the AHP-generated map. The results demonstrate that further improvement in the accuracy of GIS-based MCDA can be achieved by employing an integrated uncertainty–sensitivity analysis approach, in which the uncertainty of the landslide susceptibility model is decomposed and attributed to the model's criteria weights. PMID:25843987

  3. Track-based event recognition in a realistic crowded environment

    Science.gov (United States)

    van Huis, Jasper R.; Bouma, Henri; Baan, Jan; Burghouts, Gertjan J.; Eendebak, Pieter T.; den Hollander, Richard J. M.; Dijk, Judith; van Rest, Jeroen H.

    2014-10-01

    Automatic detection of abnormal behavior in CCTV cameras is important to improve security in crowded environments, such as shopping malls, airports and railway stations. This behavior can be characterized at different time scales, e.g., by small-scale subtle and obvious actions or by large-scale walking patterns and interactions between people. For example, pickpocketing can be recognized by the actual snatch (small scale), by the pickpocket following the victim, or by interaction with an accomplice before and after the incident (longer time scale). This paper focuses on event recognition by detecting large-scale track-based patterns. Our event recognition method consists of several steps: pedestrian detection, object tracking, track-based feature computation and rule-based event classification. In the experiment, we focused on single-track actions (walk, run, loiter, stop, turn) and track interactions (pass, meet, merge, split). The experiment includes a controlled setup, where 10 actors perform these actions. The method is also applied to all tracks that are generated in a crowded shopping mall in a selected time frame. The results show that most of the actions can be detected reliably (on average 90%) at a low false positive rate (1.1%), and that the interactions obtain lower detection rates (70% at 0.3% FP). This method may become one of the components that assist operators in finding threatening behavior and enrich the selection of videos that are to be observed.

  4. Out of the black box: expansion of a theory-based intervention to self-manage the uncertainty associated with active surveillance (AS) for prostate cancer.

    Science.gov (United States)

    Kazer, Meredith Wallace; Bailey, Donald E; Whittemore, Robin

    2010-01-01

    Active surveillance (AS) (sometimes referred to as watchful waiting) is an alternative approach to managing low-risk forms of prostate cancer. This management approach allows men to avoid expensive prostate cancer treatments and their well-documented adverse events of erectile dysfunction and incontinence. However, AS is associated with illness uncertainty and reduced quality of life (QOL; Wallace, 2003). An uncertainty management intervention (UMI) was developed by Mishel et al. (2002) to manage uncertainty in women treated for breast cancer and men treated for prostate cancer. However, the UMI was not developed for men undergoing AS for prostate cancer and has not been adequately tested in this population. This article reports on the expansion of a theory-based intervention to manage the uncertainty associated with AS for prostate cancer. Intervention Theory (Sidani & Braden, 1998) is discussed as a framework for revising the UMI intervention for men undergoing AS for prostate cancer (UMI-AS). The article concludes with plans for testing of the expanded intervention and implications for the extended theory.

  5. A Genetic-Algorithms-Based Approach for Programming Linear and Quadratic Optimization Problems with Uncertainty

    Directory of Open Access Journals (Sweden)

    Weihua Jin

    2013-01-01

    Full Text Available This paper proposes a genetic-algorithms-based approach as an all-purpose problem-solving method for operation programming problems under uncertainty. The proposed method was applied to the management of a municipal solid waste treatment system. Compared to the traditional interactive binary analysis, this approach has fewer limitations and is able to reduce the complexity in solving inexact linear programming problems and inexact quadratic programming problems. The implementation of this approach was performed using the Genetic Algorithm Solver of MATLAB (trademark of MathWorks). The paper explains the genetic-algorithms-based method and presents details on the computation procedures for each type of inexact operation programming problem. A comparison of the results generated by the proposed method based on genetic algorithms with those produced by the traditional interactive binary analysis method is also presented.

  6. Optimal design and planning of glycerol-based biorefinery supply chains under uncertainty

    DEFF Research Database (Denmark)

    Loureiro da Costa Lira Gargalo, Carina; Carvalho, Ana; Gernaey, Krist V.

    2017-01-01

    The optimal design and planning of glycerol-based biorefinery supply chains is critical for the development and implementation of this concept in a sustainable manner. To achieve this, a decision-making framework is proposed in this work, to holistically optimize the design and planning... -echelon mixed integer linear programming problem is proposed based upon a previous model, GlyThink. In the new formulation, market uncertainties are taken into account at the strategic planning level. The robustness of the supply chain structures is analyzed based on statistical data provided... consequences. Therefore, the proposed framework ultimately leads to the identification of the optimal design and planning decisions for the development of environmentally conscious biorefinery supply chains. The effectiveness of the presented approach is demonstrated through its application to the realistic...

  7. Address-event-based platform for bioinspired spiking systems

    Science.gov (United States)

    Jiménez-Fernández, A.; Luján, C. D.; Linares-Barranco, A.; Gómez-Rodríguez, F.; Rivas, M.; Jiménez, G.; Civit, A.

    2007-05-01

    Address Event Representation (AER) is an emergent neuromorphic interchip communication protocol that allows real-time virtual massive connectivity between a huge number of neurons located on different chips. By exploiting high-speed digital communication circuits (with nanosecond timings), synaptic neural connections can be time-multiplexed, while neural activity signals (with millisecond timings) are sampled at low frequencies. Also, neurons generate "events" according to their activity levels. More active neurons generate more events per unit time and access the interchip communication channel more frequently, while neurons with low activity consume less communication bandwidth. When building multi-chip multi-layered AER systems, it is absolutely necessary to have a computer interface that allows (a) reading AER interchip traffic into the computer and visualizing it on the screen, and (b) converting a conventional frame-based video stream in the computer into AER and injecting it at some point of the AER structure. This is necessary for testing and debugging complex AER systems. On the other hand, the use of a commercial personal computer implies a dependence on software tools and operating systems that can make the system slower and less robust. This paper addresses the problem of communicating several AER-based chips to compose a powerful processing system. The problem was discussed at the Neuromorphic Engineering Workshop of 2006. The platform is based on an embedded computer, a powerful FPGA and serial links, to make the system faster and stand-alone (independent from a PC). A new platform is presented that allows connecting up to eight AER-based chips to a Spartan 3 4000 FPGA. The FPGA is responsible for the network communication based on Address-Event and, at the same time, for mapping and transforming the address space of the traffic to implement a pre-processing stage. An MMU microprocessor (Intel XScale 400 MHz Gumstix Connex computer) is also connected to the FPGA
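
    A toy sketch of what AER traffic looks like at the software level: each spike becomes an address word plus a timestamp, and event streams from several sender chips are merged in time order, with more active neurons naturally occupying more of the channel. The bit layout and rates are invented for illustration and are not the platform's actual mapping.

```python
# Sketch of Address-Event Representation (AER) traffic: each spike is sent as an
# address word plus a timestamp, and streams from several sender chips are merged
# in time order. The bit layout below is illustrative, not the platform's mapping.
import heapq
import random

def encode_address(chip_id, x, y, polarity):
    # pack chip (3 bits), y (7 bits), x (7 bits), polarity (1 bit) into one word
    return (chip_id << 15) | (y << 8) | (x << 1) | polarity

def decode_address(word):
    return (word >> 15) & 0x7, (word >> 1) & 0x7F, (word >> 8) & 0x7F, word & 0x1

random.seed(0)
streams = []
for chip in range(3):                        # three hypothetical sender chips
    t, events = 0.0, []
    for _ in range(5):
        t += random.expovariate(1000.0)      # more active neurons emit more events
        events.append((t, encode_address(chip, random.randrange(128),
                                         random.randrange(128), random.randrange(2))))
    streams.append(events)

for t, word in heapq.merge(*streams):        # time-multiplexed merged event stream
    chip, x, y, pol = decode_address(word)
    print(f"t={t*1e3:7.3f} ms  chip={chip} x={x:3d} y={y:3d} pol={pol}")
```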

  8. Valueloading And Uncertainty In A Sector-based Differentiation Scheme For Emission Allowances

    Energy Technology Data Exchange (ETDEWEB)

    Groenenberg, H. [Institute for Prospective Technological Studies, Edificio Expo, C/Inca Garcilaso, s/n, E41092, Seville (Spain); Van der Sluijs, J. [Copernicus Institute, Utrecht University (Netherlands)

    2005-07-01

    The Triptych approach is a sectoral approach for the differentiation of quantitative greenhouse gas emission reduction objectives. In this study we investigate the ranges in emission reduction targets that result from differences in value-laden assumptions and uncertainties in input data and parameters. In order to assess the effect of highly value-laden assumptions on the resulting objectives, we used two approaches. First we performed a sensitivity analysis. Then we elaborated the approach from four ideal-typical value-orientations: the administrator, the businessman, the campaigner and the survivor. For each of these value-orientations we specified corresponding sets of assumptions for the highly value-laden parameters. Within each set, we also assessed uncertainties for the remaining parameters and input data: we assessed their strength and quantified their inexactness with probability distribution functions. Next, we carried out Monte Carlo simulations in each of the four value-orientations to quantify error propagation from the inexactness in input data and parameters. We found that targets for the year 2015 for Annex I countries differed by up to around 20%-points over the four value-orientations. For developing countries, differences in allowances were found up to the order of four. In addition, results are affected to a large extent by uncertainties in the other input data and parameters. Ranges in the outcome resulting from uncertainties are between 10 and 35%-points for Annex I countries, depending on the value-orientation chosen, and between 20 and 120%-points for non-Annex I countries. However, the ranking of countries within the calculated differentiation remains roughly the same, an exception being the ranking that resulted from the businessman's perspective. Other consistent combinations of value-laden assumptions may result in objectives that are outside the range that we based on the four value-orientations. We concluded that care should be taken when assessing

  9. Uncertainties in neural network model based on carbon dioxide concentration for occupancy estimation

    Energy Technology Data Exchange (ETDEWEB)

    Alam, Azimil Gani; Rahman, Haolia; Kim, Jung-Kyung; Han, Hwataik [Kookmin University, Seoul (Korea, Republic of)

    2017-05-15

    Demand control ventilation is employed to save energy by adjusting the airflow rate according to the ventilation load of a building. This paper investigates a method for occupancy estimation using a dynamic neural network model based on the carbon dioxide concentration in an occupied zone. The method can be applied to most commercial and residential buildings where human effluents need to be ventilated. The indoor simulation program CONTAMW is used to generate indoor CO2 data corresponding to various occupancy schedules and airflow patterns to train the neural network models. Coefficients of variation are obtained depending on the complexities of the physical parameters as well as the system parameters of the neural networks, such as the numbers of hidden neurons and tapped delay lines. We intend to identify the uncertainties caused by the model parameters themselves, by excluding uncertainties in input data inherent in measurement. Our results show that estimation accuracy is highly influenced by the frequency of occupancy variation but not significantly influenced by fluctuations in the airflow rate. Furthermore, we discuss the applicability and validity of the present method, based on passive environmental conditions, for estimating occupancy in a room from the viewpoint of demand control ventilation applications.

  10. An optimization method based on scenario analysis for watershed management under uncertainty.

    Science.gov (United States)

    Liu, Yong; Guo, Huaicheng; Zhang, Zhenxing; Wang, Lijing; Dai, Yongli; Fan, Yingying

    2007-05-01

    In conjunction with socioeconomic development in watersheds, increasingly challenging problems, such as scarcity of water resources and environmental deterioration, have arisen. Watershed management is a useful tool for dealing with these issues and maintaining sustainable development at the watershed scale. The complex and uncertain characteristics of watershed systems have a great impact on decisions about countermeasures and other techniques that will be applied in the future. An optimization method based on scenario analysis is proposed in this paper as a means of handling watershed management under uncertainty. This method integrates system analysis, forecast methods, and scenario analysis, as well as the contributions of stakeholders and experts, into a comprehensive framework. The proposed method comprises four steps: system analyses, a listing of potential engineering techniques and countermeasures, scenario analyses, and the optimal selection of countermeasures and engineering techniques. The proposed method was applied to the case of the Lake Qionghai watershed in southwestern China, and the results are reported in this paper. This case study demonstrates that the proposed method can be used to deal efficiently with uncertainties at the watershed level. Moreover, this method takes into consideration the interests of different groups, which is crucial for successful watershed management. In particular, social, economic, environmental, and resource systems are all considered in order to improve the applicability of the method. In short, the optimization method based on scenario analysis proposed here is a valuable tool for watershed management.

  11. A Recourse-Based Type-2 Fuzzy Programming Method for Water Pollution Control under Uncertainty

    Directory of Open Access Journals (Sweden)

    Jing Liu

    2017-11-01

    Full Text Available In this study, a recourse-based type-2 fuzzy programming (RTFP) method is developed for supporting water pollution control of basin systems under uncertainty. The RTFP method incorporates type-2 fuzzy programming (TFP) within a two-stage stochastic programming with recourse (TSP) framework to handle uncertainties expressed as type-2 fuzzy sets (i.e., fuzzy sets in which the membership function is itself fuzzy) and probability distributions, as well as to reflect the trade-offs between conflicting economic benefits and penalties due to violated policies. The RTFP method is then applied to a real case of water pollution control in the Heshui River Basin (a rural area of China), where chemical oxygen demand (COD), total nitrogen (TN), total phosphorus (TP) and soil loss are selected as major indicators to identify the water pollution control strategies. Solutions of optimal production plans of economic activities under each probabilistic pollutant discharge allowance level and membership grade are obtained. The results are helpful for the authorities in exploring the trade-off between the economic objective and pollutant discharge decision-making based on river water pollution control.

  12. High-Frequency Replanning Under Uncertainty Using Parallel Sampling-Based Motion Planning.

    Science.gov (United States)

    Sun, Wen; Patil, Sachin; Alterovitz, Ron

    2015-02-01

    As sampling-based motion planners become faster, they can be re-executed more frequently by a robot during task execution to react to uncertainty in robot motion, obstacle motion, sensing noise, and uncertainty in the robot's kinematic model. We investigate and analyze high-frequency replanning (HFR), where, during each period, fast sampling-based motion planners are executed in parallel as the robot simultaneously executes the first action of the best motion plan from the previous period. We consider discrete-time systems with stochastic nonlinear (but linearizable) dynamics and observation models with noise drawn from zero mean Gaussian distributions. The objective is to maximize the probability of success (i.e., avoid collision with obstacles and reach the goal) or to minimize path length subject to a lower bound on the probability of success. We show that, as parallel computation power increases, HFR offers asymptotic optimality for these objectives during each period for goal-oriented problems. We then demonstrate the effectiveness of HFR for holonomic and nonholonomic robots including car-like vehicles and steerable medical needles.

  13. Reduction and Uncertainty Analysis of Chemical Mechanisms Based on Local and Global Sensitivities

    Science.gov (United States)

    Esposito, Gaetano

    Numerical simulations of critical reacting flow phenomena in hypersonic propulsion devices require accurate representation of finite-rate chemical kinetics. The chemical kinetic models available for hydrocarbon fuel combustion are rather large, involving hundreds of species and thousands of reactions. As a consequence, they cannot be used in multi-dimensional computational fluid dynamic calculations in the foreseeable future due to the prohibitive computational cost. In addition to the computational difficulties, it is also known that some fundamental chemical kinetic parameters of detailed models have a significant level of uncertainty, due to the limited experimental data available and to poor understanding of interactions among kinetic parameters. In the present investigation, local and global sensitivity analysis techniques are employed to develop a systematic approach to reducing and analyzing detailed chemical kinetic models. Unlike previous studies in which skeletal model reduction was based on the separate analysis of simple cases, in this work a novel strategy based on Principal Component Analysis of local sensitivity values is presented. This new approach is capable of simultaneously taking into account all the relevant canonical combustion configurations over different composition, temperature and pressure conditions. Moreover, the procedure developed in this work represents the first documented inclusion of non-premixed extinction phenomena, which are of great relevance in hypersonic combustors, in an automated reduction algorithm. The application of the skeletal reduction to a detailed kinetic model consisting of 111 species and 784 reactions is demonstrated. The resulting reduced skeletal model of 37--38 species showed that the global ignition/propagation/extinction phenomena of ethylene-air mixtures can be predicted within an accuracy of 2% of the full detailed model. The problems of both understanding non-linear interactions between kinetic parameters and
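
    A minimal sketch of the kind of principal-component screening of local sensitivity coefficients described above is shown below. The sensitivity matrix is random placeholder data; in practice each row would hold normalized sensitivities of one target (ignition delay, flame speed, extinction strain rate, etc.) at one composition/temperature/pressure condition with respect to every reaction. This is an illustration of the general technique, not the author's reduction code.

```python
# PCA of a local sensitivity matrix (in the spirit of Vajda-type analysis):
# reactions that contribute strongly to the dominant principal components of
# S^T S are retained in the skeletal model.  S here is a random placeholder.
import numpy as np

rng = np.random.default_rng(0)
n_conditions, n_reactions = 40, 784
S = rng.normal(size=(n_conditions, n_reactions))      # placeholder sensitivities

F = S.T @ S                                            # response "covariance"
eigvals, eigvecs = np.linalg.eigh(F)
order = np.argsort(eigvals)[::-1]                      # descending eigenvalues
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Keep the components explaining (say) 99% of the total variance
frac = np.cumsum(eigvals) / eigvals.sum()
n_keep = int(np.searchsorted(frac, 0.99)) + 1

# A reaction is retained if it contributes strongly to any retained component
weights = np.abs(eigvecs[:, :n_keep]) * np.sqrt(eigvals[:n_keep])
importance = weights.max(axis=1)
retained = np.where(importance > 0.1 * importance.max())[0]
print(f"{n_keep} principal components, {retained.size} reactions retained")
```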

  14. GLUE Based Uncertainty Estimation of Urban Drainage Modeling Using Weather Radar Precipitation Estimates

    DEFF Research Database (Denmark)

    Nielsen, Jesper Ellerbæk; Thorndahl, Søren Liedtke; Rasmussen, Michael R.

    2011-01-01

    … the uncertainty of the weather radar rainfall input. The main finding of this work is that the input uncertainty propagates through the urban drainage model with significant effects on the model result. The GLUE methodology is in general a usable way to explore this uncertainty, although the exact width...
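
    For readers unfamiliar with GLUE (used both here and in the radar calibration record below), the sketch that follows shows the bare procedure: Monte Carlo sampling of parameters, a likelihood measure to separate behavioural from non-behavioural sets, and likelihood-weighted quantiles as uncertainty bounds. The "urban drainage model" is replaced by a toy linear-reservoir runoff model, and all data and thresholds are hypothetical.

```python
# Minimal GLUE sketch with a toy rainfall-runoff model (not the authors' setup).
import numpy as np

rng = np.random.default_rng(1)
rain = rng.gamma(shape=0.3, scale=5.0, size=200)       # synthetic rainfall input

def toy_model(rain, k, s0=0.0):
    """Linear reservoir: storage gains rainfall, outflow = k * storage."""
    q, s = np.empty_like(rain), s0
    for t, r in enumerate(rain):
        s += r
        q[t] = k * s
        s -= q[t]
    return q

q_obs = toy_model(rain, k=0.35) + rng.normal(0, 0.3, rain.size)  # "observations"

# 1. Monte Carlo sampling of the parameter from its prior range
k_samples = rng.uniform(0.05, 0.95, size=5000)
sims = np.array([toy_model(rain, k) for k in k_samples])

# 2. Likelihood measure (Nash-Sutcliffe efficiency) and behavioural threshold
nse = 1.0 - np.sum((sims - q_obs) ** 2, axis=1) / np.sum((q_obs - q_obs.mean()) ** 2)
behavioural = nse > 0.5
weights = nse[behavioural] / nse[behavioural].sum()

# 3. Likelihood-weighted 5-95% prediction bounds at each time step
def weighted_quantile(values, q, w):
    order = np.argsort(values)
    cw = np.cumsum(w[order])
    return np.interp(q, cw / cw[-1], values[order])

bounds = np.array([[weighted_quantile(sims[behavioural, t], q, weights)
                    for q in (0.05, 0.95)] for t in range(rain.size)])
print(f"{behavioural.sum()} behavioural sets; mean 90% band width "
      f"{np.mean(bounds[:, 1] - bounds[:, 0]):.2f}")
```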

  15. Managing wildfire events: risk-based decision making among a group of federal fire managers.

    Science.gov (United States)

    Wilson, Robyn S; Winter, Patricia L; Maguire, Lynn A; Ascher, Timothy

    2011-05-01

    Managing wildfire events to achieve multiple management objectives involves a high degree of decision complexity and uncertainty, increasing the likelihood that decisions will be informed by experience-based heuristics triggered by available cues at the time of the decision. The research reported here tests the prevalence of three risk-based biases among 206 individuals in the USDA Forest Service with authority to choose how to manage a wildfire event (i.e., line officers and incident command personnel). The results indicate that the subjects exhibited loss aversion, choosing the safe option more often when the consequences of the choice were framed as potential gains, but this tendency was less pronounced among those with risk seeking attitudes. The subjects also exhibited discounting, choosing to minimize short-term over long-term risk due to a belief that future risk could be controlled, but this tendency was less pronounced among those with more experience. Finally, the subjects, in particular those with more experience, demonstrated a status quo bias, choosing suppression more often when their reported status quo was suppression. The results of this study point to a need to carefully construct the decision process to ensure that the uncertainty and conflicting objectives inherent in wildfire management do not result in the overuse of common heuristics. Individual attitudes toward risk or an agency culture of risk aversion may counterbalance such heuristics, whereas increased experience may lead to overconfident intuitive judgments and a failure to incorporate new and relevant information into the decision. © 2010 Society for Risk Analysis.

  16. Multi Agent System Based Wide Area Protection against Cascading Events

    DEFF Research Database (Denmark)

    Liu, Zhou; Chen, Zhe; Liu, Leo

    2012-01-01

    In this paper, a multi-agent system (MAS) based wide area protection scheme is proposed in order to prevent cascading events induced by long-term voltage instability. The distributed relays and controllers work as device agents which not only execute their normal functions automatically but also can be modified to fulfill extra functions according to external requirements. The control center is designed as the highest-level agent in the MAS to coordinate all the lower agents to prevent system-wide voltage disturbance. A hybrid simulation platform with MATLAB and RTDS is set up to demonstrate the effectiveness of the proposed protection strategy. The simulation results indicate that the proposed multi-agent control system can effectively coordinate the distributed relays and controllers to prevent cascading events induced by long-term voltage instability.

  17. Temporal and Location Based RFID Event Data Management and Processing

    Science.gov (United States)

    Wang, Fusheng; Liu, Peiya

    Advances in sensor and RFID technology provide significant new power for humans to sense, understand and manage the world. RFID provides fast data collection with precise identification of objects with unique IDs without line of sight, thus it can be used for identifying, locating, tracking and monitoring physical objects. Despite these benefits, RFID poses many challenges for data processing and management. RFID data are temporal and history oriented, multi-dimensional, and carry implicit semantics. Moreover, RFID applications are heterogeneous. RFID data management or data warehouse systems need to support generic and expressive data modeling for tracking and monitoring physical objects, and provide automated data interpretation and processing. We develop a powerful temporal and location oriented data model for modeling and querying RFID data, and a declarative event and rule based framework for automated complex RFID event processing. The approach is general and can be easily adapted for different RFID-enabled applications, thus significantly reducing the cost of RFID data integration.

  18. A global parallel model based design of experiments method to minimize model output uncertainty.

    Science.gov (United States)

    Bazil, Jason N; Buzzard, Gregory T; Rundell, Ann E

    2012-03-01

    Model-based experiment design specifies the data to be collected that will most effectively characterize the biological system under study. Existing model-based design of experiment algorithms have primarily relied on Fisher Information Matrix-based methods to choose the best experiment in a sequential manner. However, these are largely local methods that require an initial estimate of the parameter values, which are often highly uncertain, particularly when data is limited. In this paper, we provide an approach to specify an informative sequence of multiple design points (parallel design) that will constrain the dynamical uncertainty of the biological system responses to within experimentally detectable limits as specified by the estimated experimental noise. The method is based upon computationally efficient sparse grids and requires only a bounded uncertain parameter space; it does not rely upon initial parameter estimates. The design sequence emerges through the use of scenario trees with experimental design points chosen to minimize the uncertainty in the predicted dynamics of the measurable responses of the system. The algorithm was illustrated herein using a T cell activation model for three problems that ranged in dimension from 2D to 19D. The results demonstrate that it is possible to extract useful information from a mathematical model where traditional model-based design of experiments approaches most certainly fail. The experiments designed via this method fully constrain the model output dynamics to within experimentally resolvable limits. The method is effective for highly uncertain biological systems characterized by deterministic mathematical models with limited data sets. Also, it is highly modular and can be modified to include a variety of methodologies such as input design and model discrimination.

  19. GLUE Based Marine X-Band Weather Radar Data Calibration and Uncertainty Estimation

    DEFF Research Database (Denmark)

    Nielsen, Jesper Ellerbæk; Beven, Keith; Thorndahl, Søren Liedtke

    2015-01-01

    The Generalized Likelihood Uncertainty Estimation methodology (GLUE) is investigated for radar rainfall calibration and uncertainty assessment. The method is used to calibrate radar data collected by a Local Area Weather Radar (LAWR). In contrast to other LAWR data calibrations, the method combines calibration with uncertainty estimation. Instead of searching for a single set of calibration parameters, the method uses the observations to construct distributions of the calibration parameters. These parameter sets provide valuable knowledge of parameter sensitivity and uncertainty. Two approaches … improves the performance significantly. It is found that even if the dynamic adjustment method is used, the uncertainty of rainfall estimates can still be significant.

  20. New Active Control Method Based on Using Multiactuators and Sensors Considering Uncertainty of Parameters

    Directory of Open Access Journals (Sweden)

    Babak Karimpour

    2014-01-01

    Full Text Available A new approach is presented for controlling structural vibrations. The proposed active control method is based on structural dynamics theories in which multiple actuators and sensors are utilized. Each actuator force is modeled as an equivalent viscous damper so that several lower vibration modes are damped critically. This is achieved by a simple mathematical formulation. The proposed method does not depend on the type of dynamic load and it can be applied to control structures with multiple degrees of freedom. For numerical verification of the proposed method, several criteria such as maximum displacement, maximum kinetic energy, maximum drift, and time history of controlled force and displacement are evaluated in two-, five-, and seven-story shear buildings subjected to harmonic load, impact force, and the El Centro base excitation. This study shows that the proposed method has suitable efficiency for reducing structural vibrations. Moreover, the uncertainty effect of different parameters is investigated here.

  1. Robust optimization based energy dispatch in smart grids considering demand uncertainty

    Science.gov (United States)

    Nassourou, M.; Puig, V.; Blesa, J.

    2017-01-01

    In this study we discuss the application of robust optimization to the problem of economic energy dispatch in smart grids. Robust optimization based MPC strategies for tackling uncertain load demands are developed. Unexpected additive disturbances are modelled by defining an affine dependence between the control inputs and the uncertain load demands. The developed strategies were applied to a hybrid power system connected to an electrical power grid. Furthermore, to demonstrate the superiority of the standard Economic MPC over MPC tracking, a comparison (e.g., average daily cost) between standard MPC tracking, standard Economic MPC, and the integration of both in one-layer and two-layer approaches was carried out. The goal of this research is to design a controller based on Economic MPC strategies that tackles uncertainties, in order to minimise economic costs and guarantee service reliability of the system.

  2. Uncertainty-Based Approach for Dynamic Aerodynamic Data Acquisition and Analysis

    Science.gov (United States)

    Heim, Eugene H. D.; Bandon, Jay M.

    2004-01-01

    Development of improved modeling methods to provide increased fidelity of flight predictions for aircraft motions during flight in flow regimes with large nonlinearities requires improvements in test techniques for measuring and characterizing wind tunnel data. This paper presents a method for providing a measure of data integrity for static and forced oscillation test techniques. Data integrity is particularly important when attempting to accurately model and predict the flight of today's high-performance aircraft, which are operating in expanded flight envelopes, often maneuvering at high angular rates and high angles-of-attack, even above maximum lift. Current aerodynamic models are inadequate in predicting flight characteristics in the expanded envelope, such as rapid aircraft departures and other unusual motions. Present wind tunnel test methods do not factor changes of flow physics into data acquisition schemes, so in many cases data are obtained over more iterations than required, or insufficient data may be obtained to determine a valid estimate with statistical significance. Additionally, forced oscillation test techniques, one of the primary tools used to develop dynamic models, do not currently provide estimates of the uncertainty of the results during an oscillation cycle. A method to optimize the required number of forced oscillation cycles based on decay of uncertainty gradients and balance tolerances is also presented.
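
    The idea of stopping data acquisition once the uncertainty stops improving can be illustrated with the short sketch below: the running standard error of a cycle-averaged coefficient is tracked cycle by cycle, and acquisition stops when the expected further reduction falls below a fraction of the balance tolerance. The numbers and the exact stopping rule are hypothetical, not the paper's criterion.

```python
# Sketch of an uncertainty-gradient stopping rule for forced-oscillation cycles.
import numpy as np

rng = np.random.default_rng(2)
balance_tolerance = 0.002      # balance repeatability, in coefficient units (assumed)
max_cycles = 200

samples = []
for cycle in range(1, max_cycles + 1):
    # One forced-oscillation cycle yields one cycle-averaged coefficient estimate
    samples.append(0.85 + rng.normal(0.0, 0.01))
    if cycle < 3:
        continue
    sem = np.std(samples, ddof=1) / np.sqrt(cycle)       # standard error of the mean
    # Expected reduction in the standard error from acquiring one more cycle
    expected_gain = sem * (1.0 - np.sqrt(cycle / (cycle + 1)))
    if expected_gain < 0.05 * balance_tolerance:         # the gradient has decayed
        break

print(f"stopped after {cycle} cycles; estimate = {np.mean(samples):.4f} +/- {sem:.4f}")
```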

  3. Determination of uncertainties of PWR spent fuel radionuclide inventory based on real operational history data

    Energy Technology Data Exchange (ETDEWEB)

    Fast, Ivan; Bosbach, Dirk [Institute of Energy and Climate Research, Nuclear Waste Management and Reactor Safety Research (IEK-6), Forschungszentrum Jülich GmbH (Germany); Aksyutina, Yuliya; Tietze-Jaensch, Holger [German Product Control Office for Radioactive Waste (PKS), Institute of Energy and Climate Research, Nuclear Waste Management and Reactor Safety Research (IEK-6), Forschungszentrum Jülich GmbH (Germany)

    2015-07-01

    A prerequisite for the official approval of the safe final disposal of spent nuclear fuel (SNF) is a comprehensive specification and declaration of the nuclear inventory in the SNF by the waste supplier. In the verification process, both the values of the radionuclide (RN) activities and their uncertainties are required. Burn-up (BU) calculations based on typical and generic reactor operational parameters do not encompass the possible variations observed in real reactor operations. At the same time, the details of the irradiation history are often not well known, which complicates the assessment of declared RN inventories. Here, we have compiled a set of burn-up calculations accounting for the operational history of 339 published or anonymized real PWR fuel assemblies (FA). These histories were used as the basis for an analysis of the associated secondary reactor parameters (SRPs), to provide information about the range of their values. Hence, we can calculate the realistic variation, or spectrum, of RN inventories. SCALE 6.1 has been employed for the burn-up calculations. The results have been validated using experimental data from the online databases SFCOMPO-1 and -2. (authors)

  4. Impact of Mindfulness-Based Cognitive Therapy on Intolerance of Uncertainty in Patients with Panic Disorder

    Science.gov (United States)

    Kim, Min Kuk; Lee, Kang Soo; Kim, Borah; Choi, Tai Kiu

    2016-01-01

    Objective Intolerance of uncertainty (IU) is a transdiagnostic construct in various anxiety and depressive disorders. However, the relationship between IU and panic symptom severity is not yet fully understood. We examined the relationship between IU, panic, and depressive symptoms during mindfulness-based cognitive therapy (MBCT) in patients with panic disorder. Methods We screened 83 patients with panic disorder and subsequently enrolled 69 of them in the present study. Patients participating in MBCT for panic disorder were evaluated at baseline and at 8 weeks using the Intolerance of Uncertainty Scale (IUS), Panic Disorder Severity Scale-Self Report (PDSS-SR), and Beck Depression Inventory (BDI). Results There was a significant decrease in scores on the IUS after MBCT for panic disorder. Pre-treatment IUS scores significantly correlated with pre-treatment PDSS (p=0.003) and BDI (p=0.003) scores. We also found a significant association between the reduction in IU and the reduction in PDSS after controlling for the reduction in the BDI score. Conclusion MBCT is effective in lowering IU in patients with panic disorder. PMID:27081380

  5. Effect of uncertainties on probabilistic-based design capacity of hydrosystems

    Science.gov (United States)

    Tung, Yeou-Koung

    2018-02-01

    Hydrosystems engineering designs involve analysis of hydrometric data (e.g., rainfall, floods) and use of hydrologic/hydraulic models, all of which contribute various degrees of uncertainty to the design process. Uncertainties in hydrosystem designs can be generally categorized into aleatory and epistemic types. The former arises from the natural randomness of hydrologic processes whereas the latter are due to knowledge deficiency in model formulation and model parameter specification. This study shows that the presence of epistemic uncertainties induces uncertainty in determining the design capacity. Hence, the designer needs to quantify the uncertainty features of design capacity to determine the capacity with a stipulated performance reliability under the design condition. Using detention basin design as an example, the study illustrates a methodological framework by considering aleatory uncertainty from rainfall and epistemic uncertainties from the runoff coefficient, curve number, and sampling error in design rainfall magnitude. The effects of including different items of uncertainty and performance reliability on the design detention capacity are examined. A numerical example shows that the mean value of the design capacity of the detention basin increases with the design return period and this relation is found to be practically the same regardless of the uncertainty types considered. The standard deviation associated with the design capacity, when subject to epistemic uncertainty, increases with both design frequency and items of epistemic uncertainty involved. It is found that the epistemic uncertainty due to sampling error in rainfall quantiles should not be ignored. Even with a sample size of 80 (relatively large for a hydrologic application) the inclusion of sampling error in rainfall quantiles resulted in a standard deviation about 2.5 times higher than that considering only the uncertainty of the runoff coefficient and curve number. Furthermore, the
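
    The way epistemic uncertainty turns the design capacity itself into a random quantity, from which a capacity with a stipulated reliability is read off, can be sketched as follows. The "hydrologic model" is reduced to a trivial volume relation, and all distributions are hypothetical placeholders for the runoff coefficient and the sampling error of the design rainfall quantile; this is an illustration of the general idea, not the paper's detention basin methodology.

```python
# Monte Carlo propagation of epistemic uncertainty into a design-capacity quantile.
import numpy as np

rng = np.random.default_rng(3)
n = 100_000
area_ha = 25.0                                     # catchment area (assumed)

# Epistemic uncertainties (hypothetical distributions)
runoff_coef = rng.triangular(0.55, 0.65, 0.80, n)  # runoff coefficient C
design_rain = rng.normal(95.0, 8.0, n)             # design rainfall depth [mm];
                                                   # std reflects sampling error

# Required detention volume [m^3]: depth [mm] * area [ha] * C * 10
capacity = 10.0 * design_rain * area_ha * runoff_coef

for reliability in (0.50, 0.90, 0.95):
    print(f"capacity with {reliability:.0%} reliability: "
          f"{np.quantile(capacity, reliability):,.0f} m^3")
print(f"mean = {capacity.mean():,.0f} m^3, std = {capacity.std():,.0f} m^3")
```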

  6. Evaluation of uncertainty in capturing the spatial variability and magnitudes of extreme hydrological events for the uMngeni catchment, South Africa

    Science.gov (United States)

    Kusangaya, Samuel; Warburton Toucher, Michele L.; van Garderen, Emma Archer

    2018-02-01

    Output from downscaled General Circulation Models (GCMs) is used to forecast climate change and provides information used as input for hydrological modelling. Given that our understanding of climate change points towards an increasing frequency, altered timing, and greater intensity of extreme hydrological events, there is a need to assess the ability of downscaled GCMs to capture these extreme hydrological events. Extreme hydrological events play a significant role in regulating the structure and function of rivers and associated ecosystems. In this study, the Indicators of Hydrologic Alteration (IHA) method was adapted to assess the ability of streamflow simulated using downscaled GCMs (dGCMs) to capture extreme river dynamics (high and low flows), as compared to streamflow simulated using historical climate data from 1960 to 2000. The ACRU hydrological model was used for simulating streamflow for the 13 water management units of the uMngeni Catchment, South Africa. Statistically downscaled climate models obtained from the Climate System Analysis Group at the University of Cape Town were used as input for the ACRU model. Results indicated that high flows and extreme high flows (one-in-ten-year high flows/large flood events) were poorly represented in terms of timing, frequency, and magnitude. Streamflow simulated using dGCM data also captures more low flows and extreme low flows (one-in-ten-year lowest flows) than streamflow simulated using historical climate data. The overall conclusion was that although dGCM output can reasonably be used to simulate overall streamflow, it performs poorly when simulating extreme high and low flows. Streamflow simulations from dGCMs must thus be used with caution in hydrological applications, particularly for design hydrology, as extreme high and low flows are still poorly represented. This arguably calls for further improvement of downscaling techniques in order to generate climate data more relevant and
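
    A minimal sketch of the kind of comparison described above is shown below: 1-in-10-year high and low flows are estimated from annual maxima and minima of two simulated daily streamflow series, one standing in for the historically driven run and one for the dGCM-driven run. Both series are synthetic placeholders, not ACRU output.

```python
# Compare 1-in-10-year high/low flows from two simulated daily flow series.
import numpy as np

rng = np.random.default_rng(4)
n_years, days = 40, 365

def annual_extremes(daily_flow):
    yearly = daily_flow.reshape(n_years, days)
    return yearly.max(axis=1), yearly.min(axis=1)

flow_hist = rng.lognormal(mean=2.0, sigma=0.6, size=n_years * days)
flow_gcm  = rng.lognormal(mean=2.0, sigma=0.5, size=n_years * days)  # damped extremes

for name, flow in (("historical-driven", flow_hist), ("dGCM-driven", flow_gcm)):
    amax, amin = annual_extremes(flow)
    q10_high = np.quantile(amax, 0.9)    # exceeded on average once in 10 years
    q10_low  = np.quantile(amin, 0.1)    # lowest flow with ~10-year recurrence
    print(f"{name:>18}: 1-in-10-yr high = {q10_high:6.1f}, low = {q10_low:5.2f} m^3/s")
```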

  7. A Discussion on Uncertainty Representation and Interpretation in Model-based Prognostics Algorithms based on Kalman Filter Estimation Applied to Prognostics of Electronics Components

    Data.gov (United States)

    National Aeronautics and Space Administration — This article presents a discussion on uncertainty representation and management for model-based prognostics methodologies based on the Bayesian tracking framework...

  8. Event-based internet biosurveillance: relation to epidemiological observation

    Directory of Open Access Journals (Sweden)

    Nelson Noele P

    2012-06-01

    Full Text Available Abstract Background The World Health Organization (WHO) collects and publishes surveillance data and statistics for select diseases, but traditional methods of gathering such data are time and labor intensive. Event-based biosurveillance, which utilizes a variety of Internet sources, complements traditional surveillance. In this study we assess the reliability of Internet biosurveillance and evaluate disease-specific alert criteria against epidemiological data. Methods We reviewed and compared WHO epidemiological data and Argus biosurveillance system data for pandemic (H1N1) 2009 (April 2009 – January 2010) from 8 regions and 122 countries to: identify reliable alert criteria among 15 Argus-defined categories; determine the degree of data correlation for disease progression; and assess timeliness of Internet information. Results Argus generated a total of 1,580 unique alerts; 5 alert categories generated statistically significant results. Conclusion Confirmed pandemic (H1N1) 2009 cases collected by Argus and WHO methods returned consistent results and confirmed the reliability and timeliness of Internet information. Disease-specific alert criteria provide situational awareness and may serve as proxy indicators of event progression and escalation in lieu of traditional surveillance data; alerts may identify early-warning indicators of another pandemic, preparing the public health community for disease events.

  9. Impact of model and dose uncertainty on model-based selection of oropharyngeal cancer patients for proton therapy.

    Science.gov (United States)

    Bijman, Rik G; Breedveld, Sebastiaan; Arts, Tine; Astreinidou, Eleftheria; de Jong, Martin A; Granton, Patrick V; Petit, Steven F; Hoogeman, Mischa S

    2017-11-01

    Proton therapy is becoming increasingly available, so it is important to apply objective and individualized patient selection to identify those who are expected to benefit most from proton therapy compared to conventional intensity modulated radiation therapy (IMRT). Comparative treatment planning using normal tissue complication probability (NTCP) evaluation has recently been proposed. This work investigates the impact of NTCP model and dose uncertainties on model-based patient selection. We used IMRT and intensity modulated proton therapy (IMPT) treatment plans of 78 oropharyngeal cancer patients, which were generated based on automated treatment planning and evaluated based on three published NTCP models. A reduction in NTCP of more than a certain threshold (e.g. 10% lower NTCP) leads to patient selection for IMPT, referred to as 'nominal' selection. To simulate the effect of uncertainties in NTCP-model coefficients (based on reported confidence intervals) and planned doses on the accuracy of model-based patient selection, the Monte Carlo method was used to sample NTCP-model coefficients and doses from a probability distribution centered at their nominal values. Patient selection accuracy within a certain sample was defined as the fraction of patients which had similar selection in both the 'nominal' and 'sampled' scenario. For all three NTCP models, the median patient selection accuracy was found to be above 70% when only NTCP-model uncertainty was considered. Selection accuracy decreased with increasing uncertainty resulting from differences between planned and delivered dose. In case of excessive dose uncertainty, selection accuracy decreased to 60%. Model and dose uncertainty highly influence the accuracy of model-based patient selection for proton therapy. A reduction of NTCP-model uncertainty is necessary to reach more accurate model-based patient selection.
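
    The Monte Carlo procedure outlined above can be sketched in a few lines: NTCP-model coefficients and planned doses are perturbed around nominal values, patients are re-selected for IMPT whenever the sampled NTCP reduction exceeds the 10% threshold, and selection accuracy is the fraction of patients whose selection matches the nominal one. The logistic dose-response curve, its coefficients and the dose data below are hypothetical placeholders, not the published NTCP models or patient data.

```python
# Selection-accuracy estimate under NTCP-model and dose uncertainty (sketch).
import numpy as np

rng = np.random.default_rng(5)
n_patients, n_samples = 78, 2000
threshold = 0.10                       # NTCP reduction required to select IMPT

# Nominal mean doses to an organ at risk [Gy] for the two modalities (assumed)
dose_imrt = rng.uniform(30, 60, n_patients)
dose_impt = dose_imrt - rng.uniform(2, 15, n_patients)

def ntcp(dose, td50, gamma):
    """Simple logistic dose-response curve (illustrative, not a published model)."""
    return 1.0 / (1.0 + np.exp(-4.0 * gamma * (dose - td50) / td50))

td50_nom, gamma_nom = 45.0, 1.5
nominal_select = (ntcp(dose_imrt, td50_nom, gamma_nom)
                  - ntcp(dose_impt, td50_nom, gamma_nom)) > threshold

accuracy = np.empty(n_samples)
for i in range(n_samples):
    td50 = rng.normal(td50_nom, 3.0)                      # model-coefficient uncertainty
    gamma = rng.normal(gamma_nom, 0.2)
    d_imrt = dose_imrt + rng.normal(0, 2.0, n_patients)   # dose uncertainty
    d_impt = dose_impt + rng.normal(0, 2.0, n_patients)
    sampled_select = (ntcp(d_imrt, td50, gamma) - ntcp(d_impt, td50, gamma)) > threshold
    accuracy[i] = np.mean(sampled_select == nominal_select)

print(f"median selection accuracy: {np.median(accuracy):.2f}")
```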

  10. Event-based image recognition applied in tennis training assistance

    Science.gov (United States)

    Wawrzyniak, Zbigniew M.; Kowalski, Adam

    2016-09-01

    This paper presents a concept of a real-time system for individual tennis training assistance. The system is supposed to provide user (player) with information on his strokes accuracy as well as other training quality parameters such as velocity and rotation of the ball during its flight. The method is based on image processing methods equipped with developed explorative analysis of the events and their description by parameters of the movement. There has been presented the concept for further deployment to create a complete system that could assist tennis player during individual training.

  11. Intelligent Transportation Control based on Proactive Complex Event Processing

    Directory of Open Access Journals (Sweden)

    Wang Yongheng

    2016-01-01

    Full Text Available Complex Event Processing (CEP) has become a key part of the Internet of Things (IoT). Proactive CEP can predict future system states and execute actions to avoid unwanted states, which brings new hope to intelligent transportation control. In this paper, we propose a proactive CEP architecture and method for intelligent transportation control. Based on basic CEP technology and predictive analytic technology, a networked distributed Markov decision process model with state prediction is proposed as the sequential decision model. A Q-learning method is proposed for this model. The experimental evaluations show that this method works well when used to control congestion in intelligent transportation systems.

  12. A Bayesian Model for Event-based Trust

    DEFF Research Database (Denmark)

    Nielsen, Mogens; Krukow, Karl; Sassone, Vladimiro

    2007-01-01

    The application scenarios envisioned for ‘global ubiquitous computing’ have unique requirements that are often incompatible with traditional security paradigms. One alternative currently being investigated is to support security decision-making by explicit representation of principals' trusting … of the systems from the computational trust literature; the comparison is derived formally, rather than obtained via experimental simulation as traditionally done. With this foundation in place, we formalise a general notion of information about past behaviour, based on event structures. This yields a flexible...

  13. MAS Based Event-Triggered Hybrid Control for Smart Microgrids

    DEFF Research Database (Denmark)

    Dou, Chunxia; Liu, Bin; Guerrero, Josep M.

    2013-01-01

    This paper is focused on an advanced control for autonomous microgrids. In order to improve the performance regarding security and stability, a hierarchical decentralized coordinated control scheme is proposed based on a multi-agent structure. Moreover, corresponding to the multi-mode and hybrid characteristics of microgrids, an event-triggered hybrid control, including three kinds of switching controls, is designed to intelligently reconstruct the operation mode when the security and stability assessment indexes or the constraint conditions are violated. The validity of the proposed control scheme is demonstrated...

  14. Identification of new events in Apollo 16 lunar seismic data by Hidden Markov Model-based event detection and classification

    Science.gov (United States)

    Knapmeyer-Endrun, Brigitte; Hammer, Conny

    2015-10-01

    Detection and identification of interesting events in single-station seismic data with little prior knowledge and under tight time constraints is a typical scenario in planetary seismology. The Apollo lunar seismic data, with the only confirmed events recorded on any extraterrestrial body yet, provide a valuable test case. Here we present the application of a stochastic event detector and classifier to the data of station Apollo 16. Based on a single-waveform example for each event class and some hours of background noise, the system is trained to recognize deep moonquakes, impacts, and shallow moonquakes and performs reliably over 3 years of data. The algorithm's demonstrated ability to detect rare events and flag previously undefined signal classes as new event types is of particular interest in the analysis of the first seismic recordings from a completely new environment. We are able to classify more than 50% of previously unclassified lunar events, and additionally find over 200 new events not listed in the current lunar event catalog. These events include deep moonquakes as well as impacts and could be used to update studies on temporal variations in event rate or the deep moonquake stacks used in phase picking for localization. No unambiguous new shallow moonquake was detected, but application to data of the other Apollo stations has the potential for additional new discoveries 40 years after the data were recorded. In addition, the classification system could be useful for future seismometer missions to other planets, e.g., the InSight mission to Mars.
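
    A very compressed sketch of the classification side of such a system is given below: one hidden Markov model per event class is trained on a feature sequence and an unknown segment is assigned to the class whose model gives the highest per-sample log-likelihood. It assumes the third-party hmmlearn package; the feature sequences are random placeholders standing in for spectrogram-derived features of moonquakes, impacts and noise, and this is not the authors' implementation.

```python
# One HMM per event class; classify a segment by the best per-sample log-likelihood.
import numpy as np
from hmmlearn import hmm     # assumed third-party dependency

rng = np.random.default_rng(6)
classes = ["deep_moonquake", "impact", "shallow_moonquake", "noise"]

def synthetic_sequence(offset, length=300, n_features=5):
    """Placeholder feature sequence (rows = time frames, cols = features)."""
    return offset + rng.normal(0.0, 1.0, size=(length, n_features))

models = {}
for i, name in enumerate(classes):
    train = synthetic_sequence(offset=i)         # a single training example per class
    m = hmm.GaussianHMM(n_components=3, covariance_type="diag", n_iter=25,
                        random_state=0)
    m.fit(train)
    models[name] = m

segment = synthetic_sequence(offset=1)           # truly an "impact" here
scores = {name: m.score(segment) / len(segment) for name, m in models.items()}
print("classified as:", max(scores, key=scores.get))
```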

  15. A Novel SHLNN Based Robust Control and Tracking Method for Hypersonic Vehicle under Parameter Uncertainty

    Directory of Open Access Journals (Sweden)

    Chuanfeng Li

    2017-01-01

    Full Text Available The hypersonic vehicle is a typical parameter-uncertain system with significant characteristics of strong coupling, nonlinearity, and external disturbance. In this paper, a combined system modeling approach is proposed to approximate the actual vehicle system. The state feedback control strategy is adopted based on robust guaranteed cost control (RGCC) theory, where the Lyapunov function is applied to obtain the control law for the nonlinear system and the problem is transformed into a feasible solution by the linear matrix inequality (LMI) method. In addition, a nonfragile guaranteed cost controller solved by the LMI optimization approach is applied to the linear error system, where a single hidden layer neural network (SHLNN) is employed as an additive gain compensator to reduce excessive performance degradation caused by perturbations and uncertainties. Simulation results show the stability and good tracking performance of the proposed strategy in controlling the vehicle system.

  16. Combination of anti-optimization and fuzzy-set-based analysis for structural optimization under uncertainty

    Directory of Open Access Journals (Sweden)

    J. Fang

    1998-01-01

    Full Text Available An approach to the optimum design of structures, in which uncertainties with a fuzzy nature in the magnitude of the loads are considered, is proposed in this study. The optimization process under fuzzy loads is transformed into a fuzzy optimization problem based on the notion of Werners' maximizing set by defining membership functions of the objective function and constraints. In this paper, Werners' maximizing set is defined using the results obtained by first conducting an optimization through anti-optimization modeling of the uncertain loads. An example of a ten-bar truss is used to illustrate the present optimization process. The results are compared with those yielded by other optimization methods.

  17. Uncertainty in measurements by counting

    Science.gov (United States)

    Bich, Walter; Pennecchi, Francesca

    2012-02-01

    Counting is at the base of many high-level measurements, such as, for example, frequency measurements. In some instances the measurand itself is a number of events, such as spontaneous decays in activity measurements, or objects, such as colonies of bacteria in microbiology. Countings also play a fundamental role in everyday life. In any case, a counting is a measurement. A measurement result, according to its present definition, as given in the 'International Vocabulary of Metrology—Basic and general concepts and associated terms (VIM)', must include a specification concerning the estimated uncertainty. As concerns measurements by counting, this specification is not easy to encompass in the well-known framework of the 'Guide to the Expression of Uncertainty in Measurement', known as GUM, in which there is no guidance on the topic. Furthermore, the issue of uncertainty in countings has received little or no attention in the literature, so that it is commonly accepted that this category of measurements constitutes an exception in which the concept of uncertainty is not applicable, or, alternatively, that results of measurements by counting have essentially no uncertainty. In this paper we propose a general model for measurements by counting which allows an uncertainty evaluation compliant with the general framework of the GUM.
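
    For orientation, the familiar special case is the one in which the number of counts is modelled as Poisson; the snippet below shows that textbook evaluation, which is the usual starting point rather than the more general model proposed in the paper.

```python
# Poisson-based counting uncertainty: u(n) ~ sqrt(n); a rate r = n/t
# inherits u(r) = sqrt(n)/t.  (Textbook evaluation, not the paper's model.)
import math

n_counts = 412          # e.g. decays registered in the counting interval (assumed)
t = 60.0                # counting time [s]

u_n = math.sqrt(n_counts)
rate, u_rate = n_counts / t, math.sqrt(n_counts) / t
print(f"n = {n_counts} +/- {u_n:.1f}, rate = {rate:.2f} +/- {u_rate:.2f} s^-1")
```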

  18. An approach based on Hierarchical Bayesian Graphical Models for measurement interpretation under uncertainty

    Science.gov (United States)

    Skataric, Maja; Bose, Sandip; Zeroug, Smaine; Tilke, Peter

    2017-02-01

    It is not uncommon in the field of non-destructive evaluation that multiple measurements encompassing a variety of modalities are available for analysis and interpretation for determining the underlying states of nature of the materials or parts being tested. Despite and sometimes due to the richness of data, significant challenges arise in the interpretation manifested as ambiguities and inconsistencies due to various uncertain factors in the physical properties (inputs), environment, measurement device properties, human errors, and the measurement data (outputs). Most of these uncertainties cannot be described by any rigorous mathematical means, and modeling of all possibilities is usually infeasible for many real time applications. In this work, we will discuss an approach based on Hierarchical Bayesian Graphical Models (HBGM) for the improved interpretation of complex (multi-dimensional) problems with parametric uncertainties that lack usable physical models. In this setting, the input space of the physical properties is specified through prior distributions based on domain knowledge and expertise, which are represented as Gaussian mixtures to model the various possible scenarios of interest for non-destructive testing applications. Forward models are then used offline to generate the expected distribution of the proposed measurements which are used to train a hierarchical Bayesian network. In Bayesian analysis, all model parameters are treated as random variables, and inference of the parameters is made on the basis of posterior distribution given the observed data. Learned parameters of the posterior distribution obtained after the training can therefore be used to build an efficient classifier for differentiating new observed data in real time on the basis of pre-trained models. We will illustrate the implementation of the HBGM approach to ultrasonic measurements used for cement evaluation of cased wells in the oil industry.

  19. Power distribution system diagnosis with uncertainty information based on rough sets and clouds model

    Science.gov (United States)

    Sun, Qiuye; Zhang, Huaguang

    2006-11-01

    During a distribution system fault, the explosive growth of signals, which carry both fuzziness and randomness, is usually too redundant for the dispatcher to make the right decision. The volume of data, with its uncertainties, overwhelms classic information systems in the distribution control center and exacerbates the existing knowledge acquisition process of expert systems. Intelligent methods must therefore be developed to aid users in maintaining and using this abundance of information effectively. An important issue in a distribution fault diagnosis system (DFDS) is to allow the discovered knowledge to be as close as possible to natural language, to satisfy user needs with tractability and to offer the DFDS robustness. At this juncture, the paper describes a systematic approach for detecting superfluous data. The approach therefore offers the user both the opportunity to learn about the data and to validate the extracted knowledge. It is considered a "white box" rather than a "black box" as in the case of a neural network. Cloud theory is introduced, and its mathematical description of a cloud effectively integrates the fuzziness and randomness of linguistic terms in a unified way. Based on it, a method of knowledge representation in the DFDS is developed which bridges the gap between quantitative knowledge and qualitative knowledge. In relation to classical rough sets, the cloud-rough method can deal with the uncertainty of attributes and perform a soft discretization of continuous ones (such as current and voltage). A novel approach, including discretization, attribute reduction, rule reliability computation and equipment reliability computation, is presented. The data redundancy is greatly reduced based on an integrated use of cloud theory and rough set theory. An illustration with a power distribution DFDS shows the effectiveness and practicality of the proposed approach.
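
    The cloud model referred to above is usually generated with the forward normal cloud generator, sketched below: a linguistic term is described by an expectation Ex, an entropy En and a hyper-entropy He, and "cloud drops" (a value and its membership) jointly carry the term's fuzziness and randomness. The parameter values model a hypothetical linguistic term and are not taken from the paper.

```python
# Forward normal cloud generator (Ex, En, He) -> cloud drops (x, membership).
import numpy as np

rng = np.random.default_rng(7)

def normal_cloud(ex, en, he, n_drops=2000):
    en_prime = rng.normal(en, he, n_drops)                 # randomized entropy
    x = rng.normal(ex, np.abs(en_prime))                   # drop positions
    mu = np.exp(-(x - ex) ** 2 / (2.0 * en_prime ** 2))    # drop memberships
    return x, mu

# Hypothetical linguistic term such as "voltage around 10 kV"
x, mu = normal_cloud(ex=10.0, en=0.5, he=0.05)
print(f"drops in [9, 11] kV: {np.mean((x > 9) & (x < 11)):.2%}, "
      f"mean membership: {mu.mean():.2f}")
```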

  20. A Bayesian-based multilevel factorial analysis method for analyzing parameter uncertainty of hydrological model

    Science.gov (United States)

    Liu, Y. R.; Li, Y. P.; Huang, G. H.; Zhang, J. L.; Fan, Y. R.

    2017-10-01

    In this study, a Bayesian-based multilevel factorial analysis (BMFA) method is developed to assess parameter uncertainties and their effects on hydrological model responses. In BMFA, the Differential Evolution Adaptive Metropolis (DREAM) algorithm is employed to approximate the posterior distributions of model parameters with Bayesian inference; the factorial analysis (FA) technique is used for measuring the specific variations of hydrological responses in terms of posterior distributions to investigate the individual and interactive effects of parameters on model outputs. BMFA is then applied to a case study of the Jinghe River watershed in the Loess Plateau of China to display its validity and applicability. The uncertainties of four sensitive parameters, including the Soil Conservation Service runoff curve number for moisture condition II (CN2), soil hydraulic conductivity (SOL_K), plant available water capacity (SOL_AWC), and soil depth (SOL_Z), are investigated. Results reveal that (i) CN2 has a positive effect on peak flow, implying that the concentrated rainfall during the rainy season can cause infiltration-excess surface flow, which is a considerable contributor to peak flow in this watershed; (ii) SOL_K has a positive effect on average flow, implying that the widely distributed cambisols can lead to medium percolation capacity; (iii) the interaction between SOL_AWC and SOL_Z has a noticeable effect on peak flow and their effects are dependent upon each other, which discloses that soil depth can significantly influence the processes of plant uptake of soil water in this watershed. Based on the above findings, the significant parameters and the relationships among uncertain parameters can be specified, such that the hydrological model's capability for simulating/predicting water resources of the Jinghe River watershed can be improved.
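
    The factorial-analysis step can be sketched as follows: posterior parameter samples (which in the paper come from DREAM) are split into low/high levels at their medians, and main and two-way interaction effects on a model response are estimated from the level means. The "posterior" samples and the response function below are synthetic placeholders, not SWAT or DREAM output.

```python
# Main and two-way interaction effects from posterior samples (sketch).
import itertools
import numpy as np

rng = np.random.default_rng(8)
names = ["CN2", "SOL_K", "SOL_AWC", "SOL_Z"]
samples = rng.normal(size=(5000, 4))                       # stand-in posterior draws
response = (1.2 * samples[:, 0] + 0.4 * samples[:, 1]
            + 0.8 * samples[:, 2] * samples[:, 3]          # built-in interaction
            + rng.normal(0, 0.3, 5000))                    # e.g. peak-flow anomaly

levels = samples > np.median(samples, axis=0)              # low/high coding

print("main effects:")
for j, name in enumerate(names):
    effect = response[levels[:, j]].mean() - response[~levels[:, j]].mean()
    print(f"  {name:8s}: {effect:+.2f}")

print("two-way interaction effects:")
for j, k in itertools.combinations(range(4), 2):
    # Difference between same-level and opposite-level means (approximately the
    # classical two-level interaction effect when the split is balanced)
    same = levels[:, j] == levels[:, k]
    effect = response[same].mean() - response[~same].mean()
    print(f"  {names[j]} x {names[k]}: {effect:+.2f}")
```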

  1. UNCERTAINTY MANAGEMENT IN SEISMIC VULNERABILITY ASSESSMENT USING GRANULAR COMPUTING BASED ON COVERING OF UNIVERSE

    Directory of Open Access Journals (Sweden)

    F. Khamespanah

    2013-05-01

    The granular computing (GrC) model concentrates on a general theory and methodology for problem solving as well as information processing by assuming multiple levels of granularity. The basic elements in granular computing are granules such as subsets, classes, and clusters of a universe of elements. In this research, GrC is used for extracting classification rules based on seismic vulnerability with minimum entropy, to handle uncertainty related to earthquake data. Tehran was selected as the study area. In our previous research, a granular computing model based on a partition model of the universe was employed. That model has limitations in defining similarity between elements of the universe and in defining granules. In the model, similarity between elements is defined based on an equivalence relation; according to this relation, two objects are similar with respect to some attributes provided that, for each attribute, the values of these objects are equal. In this research, a more general relation for defining similarity between elements of the universe is proposed. This general relation is used for defining similarity and, instead of partitioning the universe, granulation is done based on a covering of the universe. As a result of the study, a physical seismic vulnerability map of Tehran has been produced based on the granular computing model. The accuracy of the seismic vulnerability map is evaluated using the granular computing model based on a covering of the universe. A comparison between this model and the granular computing model based on a partition model of the universe is undertaken, which verified the superiority of the GrC based on a covering of the universe in terms of the match between the achieved results and those confirmed by the related experts' judgments.

  2. Electrophysiological correlates of strategic monitoring in event-based and time-based prospective memory.

    Directory of Open Access Journals (Sweden)

    Giorgia Cona

    Full Text Available Prospective memory (PM is the ability to remember to accomplish an action when a particular event occurs (i.e., event-based PM, or at a specific time (i.e., time-based PM while performing an ongoing activity. Strategic Monitoring is one of the basic cognitive functions supporting PM tasks, and involves two mechanisms: a retrieval mode, which consists of maintaining active the intention in memory; and target checking, engaged for verifying the presence of the PM cue in the environment. The present study is aimed at providing the first evidence of event-related potentials (ERPs associated with time-based PM, and at examining differences and commonalities in the ERPs related to Strategic Monitoring mechanisms between event- and time-based PM tasks.The addition of an event-based or a time-based PM task to an ongoing activity led to a similar sustained positive modulation of the ERPs in the ongoing trials, mainly expressed over prefrontal and frontal regions. This modulation might index the retrieval mode mechanism, similarly engaged in the two PM tasks. On the other hand, two further ERP modulations were shown specifically in an event-based PM task. An increased positivity was shown at 400-600 ms post-stimulus over occipital and parietal regions, and might be related to target checking. Moreover, an early modulation at 130-180 ms post-stimulus seems to reflect the recruitment of attentional resources for being ready to respond to the event-based PM cue. This latter modulation suggests the existence of a third mechanism specific for the event-based PM; that is, the "readiness mode".

  3. Event-based total suspended sediment particle size distribution model

    Science.gov (United States)

    Thompson, Jennifer; Sattar, Ahmed M. A.; Gharabaghi, Bahram; Warner, Richard C.

    2016-05-01

    One of the most challenging modelling tasks in hydrology is prediction of the total suspended sediment particle size distribution (TSS-PSD) in stormwater runoff generated from exposed soil surfaces at active construction sites and surface mining operations. The main objective of this study is to employ gene expression programming (GEP) and artificial neural networks (ANN) to develop a new model with the ability to more accurately predict the TSS-PSD by taking advantage of both event-specific and site-specific factors in the model. To compile the data for this study, laboratory scale experiments using rainfall simulators were conducted on fourteen different soils to obtain TSS-PSD. This data is supplemented with field data from three construction sites in Ontario over a period of two years to capture the effect of transport and deposition within the site. The combined data sets provide a wide range of key overlooked site-specific and storm event-specific factors. Both parent soil and TSS-PSD in runoff are quantified by fitting each to a lognormal distribution. Compared to existing regression models, the developed model more accurately predicted the TSS-PSD using a more comprehensive list of key model input parameters. Employment of the new model will increase the efficiency of deployment of required best management practices, designed based on TSS-PSD, to minimize potential adverse effects of construction site runoff on aquatic life in the receiving watercourses.

  4. Enhanced extended state observer-based control for systems with mismatched uncertainties and disturbances.

    Science.gov (United States)

    Castillo, A; García, P; Sanz, R; Albertos, P

    2017-12-19

    This paper presents an enhanced Extended State Observer (ESO)-based control strategy to deal with the disturbance attenuation problem for a class of non integral-chain systems subject to non-linear mismatched uncertainties and external disturbances. The proposed control strategy does not assume the integral-chain form and it is formed by a state-feedback plus a dynamic disturbance compensation term, which is designed to reject the disturbance effect in the system output. From a theoretical point of view, the proposed strategy is reduced to the conventional ESO when the integral chain form and the matched condition hold. In this sense, this paper is presented as an extension of the ESO principles to cover a wider class of systems. The theoretical results show that the internal zero-dynamics plays an important role in ESO-based control design. Also, the closed-loop stability is analyzed and some numerical simulations show the effectiveness of the proposal in comparison with previous ESO-based techniques. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.

  5. Landslide Displacement Prediction With Uncertainty Based on Neural Networks With Random Hidden Weights.

    Science.gov (United States)

    Lian, Cheng; Zeng, Zhigang; Yao, Wei; Tang, Huiming; Chen, Chun Lung Philip

    2016-12-01

    In this paper, we propose a new approach to establish a landslide displacement forecasting model based on artificial neural networks (ANNs) with random hidden weights. To quantify the uncertainty associated with the predictions, a framework for probabilistic forecasting of landslide displacement is developed. The aim of this paper is to construct prediction intervals (PIs) instead of deterministic forecasting. A lower-upper bound estimation (LUBE) method is adopted to construct ANN-based PIs, while a new single hidden layer feedforward ANN with random hidden weights for LUBE is proposed. Unlike the original implementation of LUBE, the input weights and hidden biases of the ANN are randomly chosen, and only the output weights need to be adjusted. Combining particle swarm optimization (PSO) and gravitational search algorithm (GSA), a hybrid evolutionary algorithm, PSOGSA, is utilized to optimize the output weights. Furthermore, a new ANN objective function, which combines a modified combinational coverage width-based criterion with one-norm regularization, is proposed. Two benchmark data sets and two real-world landslide data sets are presented to illustrate the capability and merit of our method. Experimental results reveal that the proposed method can construct high-quality PIs.

  6. Pu239 Cross-Section Variations Based on Experimental Uncertainties and Covariances

    Energy Technology Data Exchange (ETDEWEB)

    Sigeti, David Edward [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Williams, Brian J. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Parsons, D. Kent [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-10-18

    Algorithms and software have been developed for producing variations in plutonium-239 neutron cross sections based on experimental uncertainties and covariances. The varied cross-section sets may be produced as random samples from the multi-variate normal distribution defined by an experimental mean vector and covariance matrix, or they may be produced as Latin-Hypercube/Orthogonal-Array samples (based on the same means and covariances) for use in parametrized studies. The variations obey two classes of constraints that are obligatory for cross-section sets and which put related constraints on the mean vector and covariance matrix that determine the sampling. Because the experimental means and covariances do not obey some of these constraints to sufficient precision, imposing the constraints requires modifying the experimental mean vector and covariance matrix. Modification is done with an algorithm based on linear algebra that minimizes changes to the means and covariances while ensuring that the operations that impose the different constraints do not conflict with each other.
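
    The basic sampling step can be sketched as follows: correlated cross-section variations are drawn from a multivariate normal defined by a mean vector and covariance matrix, and one of the sum-rule constraints (total = capture + fission) is satisfied by construction. The 3-group means, correlations and uncertainties below are hypothetical, and the paper's constraint-preserving adjustment of the means and covariances is more involved than this simple illustration.

```python
# Correlated sampling of cross-section variations (hypothetical 3-group example).
import numpy as np

rng = np.random.default_rng(9)

mean = np.array([0.30, 0.25, 0.20,      # capture, groups 1-3 [barns] (assumed)
                 1.80, 1.50, 1.00])     # fission, groups 1-3 [barns] (assumed)
corr = 0.5 * np.ones((6, 6)) + 0.5 * np.eye(6)      # assumed correlation structure
rel_unc = 0.03 * mean                                # 3% relative uncertainties
cov = np.outer(rel_unc, rel_unc) * corr

samples = rng.multivariate_normal(mean, cov, size=1000)
samples = np.clip(samples, 0.0, None)                # cross sections are non-negative

capture, fission = samples[:, :3], samples[:, 3:]
total = capture + fission                            # sum rule holds by construction

print("group-wise mean total cross section:", total.mean(axis=0).round(3))
print("group-wise std  of total           :", total.std(axis=0).round(4))
```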

  7. Life cycle cost optimization of biofuel supply chains under uncertainties based on interval linear programming

    DEFF Research Database (Denmark)

    Ren, Jingzheng; Dong, Liang; Sun, Lu

    2015-01-01

    The aim of this work was to develop a model for optimizing the life cycle cost of a biofuel supply chain under uncertainties. Multiple agriculture zones, multiple transportation modes for the transport of grain and biofuel, multiple biofuel plants, and multiple market centers were considered in the model … and the results showed that the proposed model is feasible for designing a biofuel supply chain under uncertainties.
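
    The interval-linear-programming idea behind this kind of model can be illustrated on a toy problem: uncertain unit costs are given as intervals, and solving the cost-minimizing LP once with the lower bounds and once with the upper bounds yields an interval for the optimal life cycle cost. The tiny two-zone, one-plant transport problem and all numbers below are hypothetical.

```python
# Interval LP sketch: best-case and worst-case optimal costs for interval costs.
import numpy as np
from scipy.optimize import linprog

# Decision variables: grain shipped from agriculture zones 1 and 2 to the plant
supply = np.array([400.0, 300.0])      # zone capacities [t]
demand = 500.0                         # plant demand [t]
c_lo = np.array([18.0, 25.0])          # lower-bound unit costs [$/t]
c_hi = np.array([24.0, 33.0])          # upper-bound unit costs [$/t]

A_eq = np.ones((1, 2))                 # shipments must meet plant demand
b_eq = [demand]
bounds = [(0.0, s) for s in supply]

results = {}
for label, c in (("optimistic", c_lo), ("pessimistic", c_hi)):
    res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds, method="highs")
    results[label] = res.fun

print(f"optimal life cycle cost lies in "
      f"[{results['optimistic']:.0f}, {results['pessimistic']:.0f}] $")
```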

  8. A Multi-Band Uncertainty Set Based Robust SCUC With Spatial and Temporal Budget Constraints

    Energy Technology Data Exchange (ETDEWEB)

    Dai, Chenxi; Wu, Lei; Wu, Hongyu

    2016-11-01

    The dramatic increase of renewable energy resources in recent years, together with long-existing load forecast errors and increasingly involved price-sensitive demands, has introduced significant uncertainties into power system operation. In order to guarantee the operational security of power systems with such uncertainties, robust optimization has been extensively studied in security-constrained unit commitment (SCUC) problems, for immunizing the system against worst-case uncertainty realizations. However, traditional robust SCUC models with single-band uncertainty sets may yield over-conservative solutions in most cases. This paper proposes a multi-band robust model to accurately formulate various uncertainties with higher resolution. By properly tuning band intervals and weight coefficients of individual bands, the proposed multi-band robust model can rigorously and realistically reflect spatial/temporal relationships and asymmetric characteristics of various uncertainties, and in turn could effectively leverage the tradeoff between robustness and economics of robust SCUC solutions. The proposed multi-band robust SCUC model is solved by Benders decomposition (BD) and outer approximation (OA), while taking advantage of the integral property of the proposed multi-band uncertainty set. In addition, several accelerating techniques are developed for enhancing the computational performance and the convergence speed. Numerical studies on a 6-bus system and the modified IEEE 118-bus system verify the effectiveness of the proposed robust SCUC approach for enhancing uncertainty modeling capabilities and mitigating conservativeness of the robust SCUC solution.

  9. Application of stochastic programming to reduce uncertainty in quality-based supply planning of slaughterhouses

    NARCIS (Netherlands)

    Rijpkema, W.A.; Hendrix, E.M.T.; Rossi, R.; Vorst, van der J.G.A.J.

    2016-01-01

    To match products of different quality with end market preferences under supply uncertainty, it is crucial to integrate product quality information in logistics decision making. We present a case of this integration in a meat processing company that faces uncertainty in delivered livestock quality.

  10. Evaluating uncertainty in 7Be-based soil erosion estimates: an experimental plot approach

    Science.gov (United States)

    Blake, Will; Taylor, Alex; Abdelli, Wahid; Gaspar, Leticia; Barri, Bashar Al; Ryken, Nick; Mabit, Lionel

    2014-05-01

    Soil erosion remains a major concern for the international community and there is a growing need to improve the sustainability of agriculture to support future food security. High resolution soil erosion data are a fundamental requirement for underpinning soil conservation and management strategies but representative data on soil erosion rates are difficult to achieve by conventional means without interfering with farming practice and hence compromising the representativeness of results. Fallout radionuclide (FRN) tracer technology offers a solution since FRN tracers are delivered to the soil surface by natural processes and, where irreversible binding can be demonstrated, redistributed in association with soil particles. While much work has demonstrated the potential of short-lived 7Be (half-life 53 days), particularly in quantification of short-term inter-rill erosion, less attention has focussed on sources of uncertainty in derived erosion measurements and sampling strategies to minimise these. This poster outlines and discusses potential sources of uncertainty in 7Be-based soil erosion estimates and the experimental design considerations taken to quantify these in the context of a plot-scale validation experiment. Traditionally, gamma counting statistics have been the main element of uncertainty propagated and reported but recent work has shown that other factors may be more important such as: (i) spatial variability in the relaxation mass depth that describes the shape of the 7Be depth distribution for an uneroded point; (ii) spatial variability in fallout (linked to rainfall patterns and shadowing) over both reference site and plot; (iii) particle size sorting effects; (iv) preferential mobility of fallout over active runoff contributing areas. To explore these aspects in more detail, a plot of 4 x 35 m was ploughed and tilled to create a bare, sloped soil surface at the beginning of winter 2013/2014 in southwest UK. The lower edge of the plot was bounded by

  11. Barometric pressure and triaxial accelerometry-based falls event detection.

    Science.gov (United States)

    Bianchi, Federico; Redmond, Stephen J; Narayanan, Michael R; Cerutti, Sergio; Lovell, Nigel H

    2010-12-01

    Falls and fall related injuries are a significant cause of morbidity, disability, and health care utilization, particularly among the age group of 65 years and over. The ability to detect falls events in an unsupervised manner would lead to improved prognoses for falls victims. Several wearable accelerometry and gyroscope-based falls detection devices have been described in the literature; however, they all suffer from unacceptable false positive rates. This paper investigates the augmentation of such systems with a barometric pressure sensor, as a surrogate measure of altitude, to assist in discriminating real fall events from normal activities of daily living. The acceleration and air pressure data are recorded using a wearable device attached to the subject's waist and analyzed offline. The study incorporates several protocols including simulated falls onto a mattress and simulated activities of daily living, in a cohort of 20 young healthy volunteers (12 male and 8 female; age: 23.7 ±3.0 years). A heuristically trained decision tree classifier is used to label suspected falls. The proposed system demonstrated considerable improvements in comparison to an existing accelerometry-based technique; showing an accuracy, sensitivity and specificity of 96.9%, 97.5%, and 96.5%, respectively, in the indoor environment, with no false positives generated during extended testing during activities of daily living. This is compared to 85.3%, 75%, and 91.5% for the same measures, respectively, when using accelerometry alone. The increased specificity of this system may enhance the usage of falls detectors among the elderly population.
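
    The sensor-fusion idea can be sketched with two features per candidate event: the peak acceleration magnitude and the barometric-pressure rise after the event (near sea level, a rise of roughly 0.1 hPa corresponds to about a 1 m drop in altitude), so that vigorous activities without a height change are not flagged as falls. Data, feature choices and thresholds below are synthetic placeholders, not the authors' heuristically trained classifier.

```python
# Decision tree on accelerometry + barometric features for fall detection (sketch).
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(10)
n = 400

# Class 0: activities of daily living, class 1: falls (all synthetic)
peak_acc_adl   = rng.normal(2.0, 0.6, n)        # peak |a| in g
dpressure_adl  = rng.normal(0.00, 0.03, n)      # hPa change after event
peak_acc_fall  = rng.normal(3.5, 0.8, n)
dpressure_fall = rng.normal(0.10, 0.03, n)      # ~1 m altitude drop

X = np.vstack([np.column_stack([peak_acc_adl, dpressure_adl]),
               np.column_stack([peak_acc_fall, dpressure_fall])])
y = np.concatenate([np.zeros(n), np.ones(n)])

clf = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)

# A vigorous sit-down: large acceleration but no sustained altitude drop
print("fall?", bool(clf.predict([[3.4, 0.01]])[0]))
# A genuine fall: large acceleration and a ~1 m pressure-equivalent drop
print("fall?", bool(clf.predict([[3.4, 0.11]])[0]))
```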

  12. Synthetic drought event sets: thousands of meteorological drought events for risk-based management under present and future conditions

    Science.gov (United States)

    Guillod, Benoit P.; Massey, Neil; Otto, Friederike E. L.; Allen, Myles R.; Jones, Richard; Hall, Jim W.

    2016-04-01

    Droughts and related water scarcity can have large impacts on societies and consist of interactions between a number of natural and human factors. Meteorological conditions are usually the first natural trigger of droughts, and climate change is expected to impact these and thereby the frequency and intensity of the events. However, extreme events such as droughts are, by definition, rare, and accurately quantifying the risk related to such events is therefore difficult. The MaRIUS project (Managing the Risks, Impacts and Uncertainties of drought and water Scarcity) aims at quantifying the risks associated with droughts in the UK under present and future conditions. To do so, a large number of drought events, from climate model simulations downscaled at 25km over Europe, are being fed into hydrological models of various complexity and used for the estimation of drought risk associated with human and natural systems, including impacts on the economy, industry, agriculture, terrestrial and aquatic ecosystems, and socio-cultural aspects. Here, we present the hydro-meteorological drought event set that has been produced by weather@home [1] for MaRIUS. Using idle processor time on volunteers' computers around the world, we have run a very large number (10'000s) of Global Climate Model (GCM) simulations, downscaled at 25km over Europe by a nested Regional Climate Model (RCM). Simulations include the past 100 years as well as two future horizons (2030s and 2080s), and provide a large number of sequences of spatio-temporally consistent weather, which are consistent with the boundary forcing such as the ocean, greenhouse gases and solar forcing. The drought event set for use in impact studies is constructed by extracting sequences of dry conditions from these model runs, leading to several thousand drought events. In addition to describing methodological and validation aspects of the synthetic drought event sets, we provide insights into drought risk in the UK, its

  13. Coverage-based treatment planning to accommodate delineation uncertainties in prostate cancer treatment.

    Science.gov (United States)

    Xu, Huijun; Gordon, J James; Siebers, Jeffrey V

    2015-09-01

    To compare two coverage-based planning (CP) techniques with fixed margin-based (FM) planning for high-risk prostate cancer treatments, with the exclusive consideration of the dosimetric impact of delineation uncertainties of target structures and normal tissues. In this work, 19-patient data sets were involved. To estimate structure dose for each delineated contour under the influence of interobserver contour variability and CT image quality limitations, 1000 alternative structures were simulated by an average-surface-of-standard-deviation model, which utilized the patient-specific information of delineated structure and CT image contrast. An IMRT plan with zero planning-target-volume (PTV) margin on the delineated prostate and seminal vesicles [clinical-target-volume (CTV prostate) and CTVSV] was created and dose degradation due to contour variability was quantified by the dosimetric consequences of 1000 alternative structures. When D98 failed to achieve a 95% coverage probability objective D98,95 ≥ 78 Gy (CTV prostate) or D98,95 ≥ 66 Gy (CTVSV), replanning was performed using three planning techniques: (1) FM (PTV prostate margin = 4,5,6 mm and PTVSV margin = 4,5,7 mm for RL, PA, and SI directions, respectively), (2) CPOM which optimized uniform PTV margins for CTV prostate and CTVSV to meet the D98,95 objectives, and (3) CPCOP which directly optimized coverage-based objectives for all the structures. These plans were intercompared by computing percentile dose-volume histograms and tumor-control probability/normal tissue complication probability (TCP/NTCP) distributions. Inherent contour variability resulted in unacceptable CTV coverage for the zero-PTV-margin plans for all patients. For plans designed to accommodate contour variability, 18/19 CP plans were most favored by achieving desirable D98,95 and TCP/NTCP values. The average improvement of probability of complication free control was 9.3% for CPCOP plans and 3.4% for CPOM plans. When the delineation

  14. Hybrid uncertainty-based design optimization and its application to hybrid rocket motors for manned lunar landing

    Directory of Open Access Journals (Sweden)

    Hao Zhu

    2017-04-01

    Full Text Available Design reliability and robustness are becoming increasingly important for the general design of aerospace systems with many inherently uncertain design parameters. This paper presents a hybrid uncertainty-based design optimization (UDO) method developed from probability theory and interval theory. Most of the uncertain design parameters which have sufficient information or experimental data are classified as random variables using probability theory, while the others are defined as interval variables with interval theory. Then a hybrid uncertainty analysis method based on Monte Carlo simulation and Taylor series interval analysis is developed to obtain the uncertainty propagation from the design parameters to system responses. Three design optimization strategies, including deterministic design optimization (DDO), probabilistic UDO and hybrid UDO, are applied to the conceptual design of a hybrid rocket motor (HRM) used as the ascent propulsion system in the Apollo lunar module. By comparison, the hybrid UDO is a feasible method and can be effectively applied to the general design of aerospace systems.
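
    The hybrid propagation idea described above can be sketched generically: random inputs are sampled by Monte Carlo while interval inputs are propagated with a first-order Taylor bound around the interval midpoint. The response function g() below is a toy stand-in, not the hybrid rocket motor model, and all distributions and intervals are assumptions.

```python
import numpy as np

# Illustrative sketch of hybrid uncertainty propagation: random variables are
# sampled by Monte Carlo, while interval variables are propagated with a
# first-order Taylor (sensitivity) bound around the interval midpoint.

def g(x_rand, x_int):
    """Toy system response depending on random inputs x_rand and interval inputs x_int."""
    return x_rand[0] * np.sqrt(abs(x_int[0])) + 0.5 * x_rand[1] * x_int[1]


def hybrid_propagation(n_mc=10000, seed=0):
    rng = np.random.default_rng(seed)
    intervals = np.array([[0.9, 1.1],    # interval variable 1: [lo, hi]
                          [4.0, 6.0]])   # interval variable 2
    mid = intervals.mean(axis=1)
    half = 0.5 * np.diff(intervals, axis=1).ravel()

    lo, hi = np.empty(n_mc), np.empty(n_mc)
    for k in range(n_mc):
        x_rand = rng.normal([10.0, 2.0], [1.0, 0.2])   # random design parameters
        y_mid = g(x_rand, mid)
        # finite-difference sensitivities w.r.t. the interval variables
        grad = np.array([(g(x_rand, mid + np.eye(2)[j] * 1e-6) - y_mid) / 1e-6
                         for j in range(2)])
        spread = np.sum(np.abs(grad) * half)           # first-order interval half-width
        lo[k], hi[k] = y_mid - spread, y_mid + spread
    return lo, hi


lo, hi = hybrid_propagation()
print("response lower bound: mean %.3f, std %.3f" % (lo.mean(), lo.std()))
print("response upper bound: mean %.3f, std %.3f" % (hi.mean(), hi.std()))
```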

  15. Enhancing emotion-based learning in decision-making under uncertainty.

    Science.gov (United States)

    Alarcón, David; Amián, Josué G; Sánchez-Medina, José A

    2015-01-01

    The Iowa Gambling Task (IGT) is widely used to study decision-making differences between several clinical and healthy populations. Unlike the healthy participants, clinical participants have difficulty choosing between advantageous options, which yield long-term benefits, and disadvantageous options, which give high immediate rewards but lead to negative profits. However, recent studies have found that healthy participants avoid the options with a higher frequency of losses regardless of whether or not they are profitable in the long run. The aim of this study was to control for the confounding effect of the frequency of losses between options to improve the performance of healthy participants on the IGT. Eighty healthy participants were randomly assigned to the original IGT or a modified version of the IGT that diminished the gap in the frequency of losses between options. The participants who used the modified IGT version learned to make better decisions based on long-term profit, as indicated by an earlier ability to discriminate good from bad options, and took less time to make their choices. This research represents an advance in the study of decision making under uncertainty by showing that emotion-based learning is improved by controlling for the loss-frequency bias effect.

  16. Reliability-Based Marginal Cost Pricing Problem Case with Both Demand Uncertainty and Travelers’ Perception Errors

    Directory of Open Access Journals (Sweden)

    Shaopeng Zhong

    2013-01-01

    Full Text Available Focusing on the first-best marginal cost pricing (MCP) in a stochastic network with both travel demand uncertainty and stochastic perception errors within the travelers’ route choice decision processes, this paper develops a perceived risk-based stochastic network marginal cost pricing (PRSN-MCP) model. Numerical examples based on an integrated method combining the moment analysis approach, the fitting distribution method, and the reliability measures are also provided to demonstrate the importance and properties of the proposed model. The main finding is that ignoring the effect of travel time reliability and travelers’ perception errors may significantly reduce the performance of the first-best MCP tolls, especially under high travelers’ confidence and network congestion levels. The analysis result could also enhance our understanding of (1) the effect of stochastic perception error (SPE) on the perceived travel time distribution and the components of road toll; (2) the effect of road toll on the actual travel time distribution and its reliability measures; (3) the effect of road toll on the total network travel time distribution and its statistics; and (4) the effect of travel demand level and the value of reliability (VoR) level on the components of road toll.

  17. Ontology-Based Vaccine Adverse Event Representation and Analysis.

    Science.gov (United States)

    Xie, Jiangan; He, Yongqun

    2017-01-01

    Vaccines are among the greatest inventions of modern medicine and have contributed enormously to the relief of human misery and the remarkable increase in life expectancy. In 1796, an English country physician, Edward Jenner, discovered that inoculating people with cowpox can protect them from smallpox (Riedel S, Edward Jenner and the history of smallpox and vaccination. Proceedings (Baylor University. Medical Center) 18(1):21, 2005). Thanks to worldwide vaccination, smallpox was finally eradicated in 1977 (Henderson, Vaccine 29:D7-D9, 2011). Other disabling and lethal diseases, like poliomyelitis and measles, are targeted for eradication (Bonanni, Vaccine 17:S120-S125, 1999). Although vaccine development and administration are tremendously successful and cost-effective practices for human health, no vaccine is 100% safe for everyone because each person reacts to vaccinations differently given different genetic backgrounds and health conditions. Although all licensed vaccines are generally safe for the majority of people, vaccinees may still suffer adverse events (AEs) in reaction to various vaccines, some of which can be serious or even fatal (Haber et al., Drug Saf 32(4):309-323, 2009). Hence, the double-edged sword of vaccination remains a concern. To support integrative AE data collection and analysis, it is critical to adopt an AE normalization strategy. In the past decades, different controlled terminologies, including the Medical Dictionary for Regulatory Activities (MedDRA) (Brown EG, Wood L, Wood S, et al., Drug Saf 20(2):109-117, 1999), the Common Terminology Criteria for Adverse Events (CTCAE) (NCI, The Common Terminology Criteria for Adverse Events (CTCAE). Available from: http://evs.nci.nih.gov/ftp1/CTCAE/About.html . Accessed on 7 Oct 2015), and the World Health Organization (WHO) Adverse Reactions Terminology (WHO-ART) (WHO, The WHO Adverse Reaction Terminology - WHO-ART. Available from: https://www.umc-products.com/graphics/28010.pdf

  18. Photometric Uncertainties

    Science.gov (United States)

    Zou, Xiao-Duan; Li, Jian-Yang; Clark, Beth Ellen; Golish, Dathon

    2018-01-01

    The OSIRIS-REx spacecraft, launched in September 2016, will study the asteroid Bennu and return a sample from its surface to Earth in 2023. Bennu is a near-Earth carbonaceous asteroid that will provide insight into the formation and evolution of the solar system. OSIRIS-REx will first approach Bennu in August 2018 and will study the asteroid for approximately two years before sampling. OSIRIS-REx will develop its photometric model of Bennu (including the Lommel-Seeliger, ROLO, McEwen, Minnaert and Akimov models) with OCAM and OVIRS during the Detailed Survey mission phase, and the model developed during this phase will be used to photometrically correct the OCAM and OVIRS data. Here we present an analysis of the error in the photometric corrections. Based on our test data sets, we find: (1) the model uncertainties are only estimated correctly when the covariance matrix is used, because the parameters are highly correlated; (2) there is no evidence that any single parameter dominates in any of the models; (3) model error and data error contribute comparably to the final correction error; (4) testing the uncertainty module on synthetic and real data sets shows that model performance depends on data coverage and data quality, and these tests gave us a better understanding of how the different models behave in different cases; (5) the Lommel-Seeliger model is more reliable than the others, perhaps because the simulated data are based on it, although the test on real data (SPDIF) also shows a slight advantage for Lommel-Seeliger; ROLO is not reliable for calculating Bond albedo, the uncertainty of the McEwen model is large in most cases, and the Akimov model behaves unphysically on the SOPIE 1 data; (6) Lommel-Seeliger is therefore the preferred default choice, a conclusion based mainly on our tests on the SOPIE and IPDIF data.

  19. Statistical uncertainty of eddy flux-based estimates of gross ecosystem carbon exchange at Howland Forest, Maine

    Science.gov (United States)

    Hagen, S. C.; Braswell, B. H.; Linder, E.; Frolking, S.; Richardson, A. D.; Hollinger, D. Y.

    2006-04-01

    We present an uncertainty analysis of gross ecosystem carbon exchange (GEE) estimates derived from 7 years of continuous eddy covariance measurements of forest-atmosphere CO2 fluxes at Howland Forest, Maine, USA. These data, which have high temporal resolution, can be used to validate process modeling analyses, remote sensing assessments, and field surveys. However, separation of tower-based net ecosystem exchange (NEE) into its components (respiration losses and photosynthetic uptake) requires at least one application of a model, which is usually a regression model fitted to nighttime data and extrapolated for all daytime intervals. In addition, the existence of a significant amount of missing data in eddy flux time series requires a model for daytime NEE as well. Statistical approaches for analytically specifying prediction intervals associated with a regression require, among other things, constant variance of the data, normally distributed residuals, and linearizable regression models. Because the NEE data do not conform to these criteria, we used a Monte Carlo approach (bootstrapping) to quantify the statistical uncertainty of GEE estimates and present this uncertainty in the form of 90% prediction limits. We explore two examples of regression models for modeling respiration and daytime NEE: (1) a simple, physiologically based model from the literature and (2) a nonlinear regression model based on an artificial neural network. We find that uncertainty at the half-hourly timescale is generally on the order of the observations themselves (i.e., ~100%) but is much less at annual timescales (~10%). On the other hand, this small absolute uncertainty is commensurate with the interannual variability in estimated GEE. The largest uncertainty is associated with choice of model type, which raises basic questions about the relative roles of models and data.
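
    A minimal sketch of the bootstrap idea applied to a respiration-style regression is given below, using synthetic data rather than the Howland Forest record; the Q10-type model form and noise level are assumptions chosen only to show how resampling yields 90% limits on the fitted curve.

```python
import numpy as np

# Minimal sketch of bootstrap-based uncertainty limits for a respiration model
# of the kind used to partition NEE, here R = a * exp(b * T) fitted to
# synthetic nighttime data (not the Howland data).

rng = np.random.default_rng(1)
T = rng.uniform(-5, 25, 400)                       # air temperature (degC)
R_true = 1.2 * np.exp(0.08 * T)
R_obs = R_true + rng.normal(0, 0.4, T.size)        # noisy "observed" respiration


def fit(T, R):
    # Linearize: log(R) = log(a) + b*T (only where R > 0).
    ok = R > 0
    b, loga = np.polyfit(T[ok], np.log(R[ok]), 1)
    return np.exp(loga), b


a_hat, b_hat = fit(T, R_obs)
T_grid = np.linspace(-5, 25, 50)

n_boot = 1000
preds = np.empty((n_boot, T_grid.size))
for i in range(n_boot):
    idx = rng.integers(0, T.size, T.size)          # resample (T, R) pairs
    a_b, b_b = fit(T[idx], R_obs[idx])
    preds[i] = a_b * np.exp(b_b * T_grid)

lower, upper = np.percentile(preds, [5, 95], axis=0)  # 90% limits on the fitted curve
print("fitted a=%.2f b=%.3f" % (a_hat, b_hat))
print("mean width of 90%% limits: %.2f" % np.mean(upper - lower))
```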

  20. An adaptive strategy on the error of the objective functions for uncertainty-based derivative-free optimization

    Science.gov (United States)

    Fusi, F.; Congedo, P. M.

    2016-03-01

    In this work, a strategy is developed to deal with the error affecting the objective functions in uncertainty-based optimization. We refer to the problems where the objective functions are the statistics of a quantity of interest computed by an uncertainty quantification technique that propagates some uncertainties of the input variables through the system under consideration. In real problems, the statistics are computed by a numerical method and therefore they are affected by a certain level of error, depending on the chosen accuracy. The errors on the objective function can be interpreted with the abstraction of a bounding box around the nominal estimation in the objective functions space. In addition, in some cases the uncertainty quantification methods providing the objective functions also supply the possibility of adaptive refinement to reduce the error bounding box. The novel method relies on the exchange of information between the outer loop based on the optimization algorithm and the inner uncertainty quantification loop. In particular, in the inner uncertainty quantification loop, a control is performed to decide whether a refinement of the bounding box for the current design is appropriate or not. In single-objective problems, the current bounding box is compared to the current optimal design. In multi-objective problems, the decision is based on the comparison of the error bounding box of the current design and the current Pareto front. With this strategy, fewer computations are made for clearly dominated solutions and an accurate estimate of the objective function is provided for the interesting, non-dominated solutions. The results presented in this work prove that the proposed method improves the efficiency of the global loop, while preserving the accuracy of the final Pareto front.

  1. Flood risk assessment and associated uncertainty

    Directory of Open Access Journals (Sweden)

    H. Apel

    2004-01-01

    Full Text Available Flood disaster mitigation strategies should be based on a comprehensive assessment of the flood risk combined with a thorough investigation of the uncertainties associated with the risk assessment procedure. Within the 'German Research Network of Natural Disasters' (DFNK), the working group 'Flood Risk Analysis' investigated the flood process chain from precipitation, runoff generation and concentration in the catchment, flood routing in the river network, possible failure of flood protection measures, inundation to economic damage. The working group represented each of these processes by deterministic, spatially distributed models at different scales. While these models provide the necessary understanding of the flood process chain, they are not suitable for risk and uncertainty analyses due to their complex nature and high CPU-time demand. We have therefore developed a stochastic flood risk model consisting of simplified model components associated with the components of the process chain. We parameterised these model components based on the results of the complex deterministic models and used them for the risk and uncertainty analysis in a Monte Carlo framework. The Monte Carlo framework is hierarchically structured in two layers representing two different sources of uncertainty, aleatory uncertainty (due to natural and anthropogenic variability) and epistemic uncertainty (due to incomplete knowledge of the system). The model allows us to calculate probabilities of occurrence for events of different magnitudes along with the expected economic damage in a target area in the first layer of the Monte Carlo framework, i.e. to assess the economic risks, and to derive uncertainty bounds associated with these risks in the second layer. It is also possible to identify the contributions of individual sources of uncertainty to the overall uncertainty. It could be shown that the uncertainty caused by epistemic sources significantly alters the results
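
    The two-layer Monte Carlo structure described above can be sketched as a nested loop: the outer loop samples epistemic parameters (here, uncertain flood-frequency parameters), the inner loop samples aleatory variability, and the spread of the inner-loop risk estimates across the outer loop gives epistemic bounds. The damage model and all parameter values below are toy assumptions, not the DFNK model chain.

```python
import numpy as np

# Sketch of a two-layer (nested) Monte Carlo separating epistemic and aleatory
# uncertainty. The flood "damage model" is a toy stand-in.

rng = np.random.default_rng(42)
N_EPISTEMIC = 200      # outer loop: incomplete-knowledge parameters
N_ALEATORY = 2000      # inner loop: natural variability (annual maxima)

damage_threshold = 3.0  # discharge level above which damage occurs (arbitrary units)

expected_damage = np.empty(N_EPISTEMIC)
for i in range(N_EPISTEMIC):
    # Epistemic layer: uncertain Gumbel parameters of the flood-peak distribution.
    loc = rng.normal(2.0, 0.2)
    scale = rng.normal(0.5, 0.05)
    # Aleatory layer: annual flood peaks for fixed epistemic parameters.
    peaks = rng.gumbel(loc, abs(scale), N_ALEATORY)
    damage = np.where(peaks > damage_threshold,
                      10.0 * (peaks - damage_threshold), 0.0)
    expected_damage[i] = damage.mean()

# The spread over the outer loop gives epistemic uncertainty bounds on the risk.
print("expected annual damage: median %.2f, 5-95%% bounds [%.2f, %.2f]"
      % (np.median(expected_damage),
         np.percentile(expected_damage, 5), np.percentile(expected_damage, 95)))
```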

  2. Uncertainty Modeling and Evaluation of CMM Task Oriented Measurement Based on SVCMM

    Science.gov (United States)

    Li, Hongli; Chen, Xiaohuai; Cheng, Yinbao; Liu, Houde; Wang, Hanbin; Cheng, Zhenying; Wang, Hongtao

    2017-10-01

    Due to the variety of measurement tasks and the complexity of the errors of a coordinate measuring machine (CMM), it is very difficult to reasonably evaluate the uncertainty of CMM measurement results, and this has limited the application of CMMs. Task-oriented uncertainty evaluation has become a difficult problem to be solved. Taking dimension measurement as an example, this paper puts forward a practical method for uncertainty modeling and evaluation of CMM task-oriented measurement (called the SVCMM method). This method makes full use of the CMM acceptance or reinspection report and the Monte Carlo computer simulation method (MCM). An evaluation example is presented, and the results are assessed using the traditional method given in the GUM and the proposed method, respectively. The SVCMM method is verified to be feasible and practical. It can help CMM users conveniently complete measurement uncertainty evaluation through a single measurement cycle.

  3. Value-based decision making under uncertainty in hoarding and obsessive- compulsive disorders.

    Science.gov (United States)

    Pushkarskaya, Helen; Tolin, David; Ruderman, Lital; Henick, Daniel; Kelly, J MacLaren; Pittenger, Christopher; Levy, Ifat

    2017-12-01

    Difficulties in decision making are a core impairment in a range of disease states. For instance, both obsessive- compulsive disorder (OCD) and hoarding disorder (HD) are associated with indecisiveness, inefficient planning, and enhanced uncertainty intolerance, even in contexts unrelated to their core symptomology. We examined decision-making patterns in 19 individuals with OCD, 19 individuals with HD, 19 individuals with comorbid OCD and HD, and 57 individuals from the general population, using a well-validated choice task grounded in behavioral economic theory. Our results suggest that difficulties in decision making in individuals with OCD (with or without comorbid HD) are linked to reduced fidelity of value-based decision making (i.e. increase in inconsistent choices). In contrast, we find that performance of individuals with HD on our laboratory task is largely intact. Overall, these results support our hypothesis that decision-making impairments in OCD and HD, which can appear quite similar clinically, have importantly different underpinnings. Systematic investigation of different aspects of decision making, under varying conditions, may shed new light on commonalities between and distinctions among clinical syndromes. Copyright © 2017. Published by Elsevier B.V.

  4. An enhanced export coefficient based optimization model for supporting agricultural nonpoint source pollution mitigation under uncertainty.

    Science.gov (United States)

    Rong, Qiangqiang; Cai, Yanpeng; Chen, Bing; Yue, Wencong; Yin, Xin'an; Tan, Qian

    2017-02-15

    In this research, an export coefficient based dual inexact two-stage stochastic credibility constrained programming (ECDITSCCP) model was developed through integrating an improved export coefficient model (ECM), interval linear programming (ILP), fuzzy credibility constrained programming (FCCP) and a fuzzy expected value equation within a general two-stage programming (TSP) framework. The proposed ECDITSCCP model can effectively address multiple uncertainties expressed as random variables, fuzzy numbers, and pure and dual intervals. Also, the model can provide a direct linkage between pre-regulated management policies and the associated economic implications. Moreover, solutions under multiple credibility levels can be obtained, providing potential decision alternatives for decision makers. The proposed model was then applied to identify optimal land use structures for agricultural NPS pollution mitigation in a representative upstream subcatchment of the Miyun Reservoir watershed in north China. Optimal solutions of the model were successfully obtained, indicating desired land use patterns and nutrient discharge schemes that maximize agricultural system benefits under a limited discharge permit. Also, the numerous results under multiple credibility levels could provide policy makers with several options, which could help strike an appropriate balance between system benefits and pollution mitigation. The developed ECDITSCCP model can effectively address uncertain information in agricultural systems and shows great applicability to land use adjustment for agricultural NPS pollution mitigation. Copyright © 2016 Elsevier B.V. All rights reserved.

  5. FIRM: Sampling-based feedback motion-planning under motion uncertainty and imperfect measurements

    KAUST Repository

    Agha-mohammadi, A.-a.

    2013-11-15

    In this paper we present feedback-based information roadmap (FIRM), a multi-query approach for planning under uncertainty which is a belief-space variant of probabilistic roadmap methods. The crucial feature of FIRM is that the costs associated with the edges are independent of each other, and in this sense it is the first method that generates a graph in belief space that preserves the optimal substructure property. From a practical point of view, FIRM is a robust and reliable planning framework. It is robust since the solution is a feedback and there is no need for expensive replanning. It is reliable because accurate collision probabilities can be computed along the edges. In addition, FIRM is a scalable framework, where the complexity of planning with FIRM is a constant multiplier of the complexity of planning with PRM. In this paper, FIRM is introduced as an abstract framework. As a concrete instantiation of FIRM, we adopt stationary linear quadratic Gaussian (SLQG) controllers as belief stabilizers and introduce the so-called SLQG-FIRM. In SLQG-FIRM we focus on kinematic systems and then extend to dynamical systems by sampling in the equilibrium space. We investigate the performance of SLQG-FIRM in different scenarios. © The Author(s) 2013.

  6. Low-Carbon Based Multi-Objective Bi-Level Power Dispatching under Uncertainty

    Directory of Open Access Journals (Sweden)

    Xiaoyang Zhou

    2016-06-01

    Full Text Available This research examines a low-carbon power dispatch problem under uncertainty. A hybrid uncertain multi-objective bi-level model with one leader and multiple followers is established to support the decision making of power dispatch and generation. The upper level decision maker is the regional power grid corporation which allocates power quotas to each follower based on the objectives of reasonable returns, a small power surplus and low carbon emissions. The lower level decision makers are the power generation groups which decide on their respective power generation plans and prices to ensure the highest total revenue under consideration of government subsidies, environmental costs and the carbon trading. Random and fuzzy variables are adopted to describe the uncertain factors and chance constrained and expected value programming are used to handle the hybrid uncertain model. The bi-level models are then transformed into solvable single level models using a satisfaction method. Finally, a detailed case study and comparative analyses are presented to test the proposed models and approaches to validate the effectiveness and illustrate the advantages.

  7. Stochastic Extended LQR for Optimization-based Motion Planning Under Uncertainty.

    Science.gov (United States)

    Sun, Wen; van den Berg, Jur; Alterovitz, Ron

    2016-04-01

    We introduce a novel optimization-based motion planner, Stochastic Extended LQR (SELQR), which computes a trajectory and associated linear control policy with the objective of minimizing the expected value of a user-defined cost function. SELQR applies to robotic systems that have stochastic non-linear dynamics with motion uncertainty modeled by Gaussian distributions that can be state- and control-dependent. In each iteration, SELQR uses a combination of forward and backward value iteration to estimate the cost-to-come and the cost-to-go for each state along a trajectory. SELQR then locally optimizes each state along the trajectory at each iteration to minimize the expected total cost, which results in smoothed states that are used for dynamics linearization and cost function quadratization. SELQR progressively improves the approximation of the expected total cost, resulting in higher quality plans. For applications with imperfect sensing, we extend SELQR to plan in the robot's belief space. We show that our iterative approach achieves fast and reliable convergence to high-quality plans in multiple simulated scenarios involving a car-like robot, a quadrotor, and a medical steerable needle performing a liver biopsy procedure.

  8. Grey-Theory-Based Optimization Model of Emergency Logistics Considering Time Uncertainty

    Science.gov (United States)

    Qiu, Bao-Jian; Zhang, Jiang-Hua; Qi, Yuan-Tao; Liu, Yang

    2015-01-01

    Natural disasters have occurred frequently in recent years, causing heavy casualties and property losses, and emergency logistics problems have therefore attracted increasing attention. This paper studies an emergency logistics problem with multiple centers, multiple commodities, and a single affected point. Considering that paths near the disaster point may be damaged, that information on the state of the paths is incomplete, and that travel times are uncertain, we establish a nonlinear programming model whose objective function maximizes the time-satisfaction degree. To address the incomplete information and uncertain travel times, the paper first evaluates the multiple roads of the transportation network based on grey theory and selects the reliable, optimal path. The original model is then simplified under the scenario in which vehicles follow only the optimal path from the emergency logistics center to the affected point, and it is solved with the Lingo software. Numerical experiments are presented to show the feasibility and effectiveness of the proposed method. PMID:26417946

  9. Grey-Theory-Based Optimization Model of Emergency Logistics Considering Time Uncertainty.

    Directory of Open Access Journals (Sweden)

    Bao-Jian Qiu

    Full Text Available Natural disasters have occurred frequently in recent years, causing heavy casualties and property losses, and emergency logistics problems have therefore attracted increasing attention. This paper studies an emergency logistics problem with multiple centers, multiple commodities, and a single affected point. Considering that paths near the disaster point may be damaged, that information on the state of the paths is incomplete, and that travel times are uncertain, we establish a nonlinear programming model whose objective function maximizes the time-satisfaction degree. To address the incomplete information and uncertain travel times, the paper first evaluates the multiple roads of the transportation network based on grey theory and selects the reliable, optimal path. The original model is then simplified under the scenario in which vehicles follow only the optimal path from the emergency logistics center to the affected point, and it is solved with the Lingo software. Numerical experiments are presented to show the feasibility and effectiveness of the proposed method.

  10. Model-Based Heterogeneous Data Fusion for Reliable Force Estimation in Dynamic Structures under Uncertainties.

    Science.gov (United States)

    Khodabandeloo, Babak; Melvin, Dyan; Jo, Hongki

    2017-11-17

    Direct measurements of external forces acting on a structure are infeasible in many cases. The Augmented Kalman Filter (AKF) has several attractive features that can be utilized to solve the inverse problem of identifying applied forces, as it requires the dynamic model and the measured responses of structure at only a few locations. But, the AKF intrinsically suffers from numerical instabilities when accelerations, which are the most common response measurements in structural dynamics, are the only measured responses. Although displacement measurements can be used to overcome the instability issue, the absolute displacement measurements are challenging and expensive for full-scale dynamic structures. In this paper, a reliable model-based data fusion approach to reconstruct dynamic forces applied to structures using heterogeneous structural measurements (i.e., strains and accelerations) in combination with AKF is investigated. The way of incorporating multi-sensor measurements in the AKF is formulated. Then the formulation is implemented and validated through numerical examples considering possible uncertainties in numerical modeling and sensor measurement. A planar truss example was chosen to clearly explain the formulation, while the method and formulation are applicable to other structures as well.
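
    A minimal sketch of the augmented-filter idea is shown below for a single-degree-of-freedom oscillator: the unknown force is appended to the state as a random walk and estimated by fusing an acceleration measurement with a displacement measurement standing in for strain. The system matrices, noise levels and force history are illustrative assumptions, not the planar truss example of the paper.

```python
import numpy as np

# Minimal sketch of an Augmented Kalman Filter (AKF) estimating an unknown
# input force on a 1-DOF oscillator by fusing an acceleration measurement with
# a displacement (strain-proxy) measurement. All values are illustrative.

rng = np.random.default_rng(0)
m, c, k = 1.0, 0.4, 50.0          # mass, damping, stiffness
dt, n = 0.005, 4000               # time step (s), number of steps

# Continuous-time model with the force appended to the state: x = [u, v, f]
Ac = np.array([[0.0, 1.0, 0.0],
               [-k / m, -c / m, 1.0 / m],
               [0.0, 0.0, 0.0]])
A = np.eye(3) + Ac * dt           # simple forward-Euler discretization
H = np.array([[-k / m, -c / m, 1.0 / m],   # acceleration measurement
              [1.0, 0.0, 0.0]])            # displacement (strain proxy)

Q = np.diag([1e-12, 1e-12, 5e-1])  # force modeled as a random walk
R = np.diag([1e-3, 1e-8])

x_true = np.zeros(3)
x_hat, P = np.zeros(3), np.eye(3)
f_est = np.empty(n)
for t in range(n):
    # Simulate the "true" response under a sinusoidal force and noisy measurements.
    f_now = 5.0 * np.sin(2 * np.pi * 1.5 * t * dt)
    x_true[2] = f_now
    x_true = A @ x_true
    z = H @ x_true + rng.normal(0, [np.sqrt(R[0, 0]), np.sqrt(R[1, 1])])

    # AKF prediction and update
    x_hat = A @ x_hat
    P = A @ P @ A.T + Q
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x_hat = x_hat + K @ (z - H @ x_hat)
    P = (np.eye(3) - K @ H) @ P
    f_est[t] = x_hat[2]

print("RMS force reconstruction error: %.3f" % np.sqrt(np.mean(
    (f_est - 5.0 * np.sin(2 * np.pi * 1.5 * np.arange(n) * dt)) ** 2)))
```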

  11. Stochastic Extended LQR for Optimization-based Motion Planning Under Uncertainty

    Science.gov (United States)

    Sun, Wen; van den Berg, Jur; Alterovitz, Ron

    2016-01-01

    We introduce a novel optimization-based motion planner, Stochastic Extended LQR (SELQR), which computes a trajectory and associated linear control policy with the objective of minimizing the expected value of a user-defined cost function. SELQR applies to robotic systems that have stochastic non-linear dynamics with motion uncertainty modeled by Gaussian distributions that can be state- and control-dependent. In each iteration, SELQR uses a combination of forward and backward value iteration to estimate the cost-to-come and the cost-to-go for each state along a trajectory. SELQR then locally optimizes each state along the trajectory at each iteration to minimize the expected total cost, which results in smoothed states that are used for dynamics linearization and cost function quadratization. SELQR progressively improves the approximation of the expected total cost, resulting in higher quality plans. For applications with imperfect sensing, we extend SELQR to plan in the robot's belief space. We show that our iterative approach achieves fast and reliable convergence to high-quality plans in multiple simulated scenarios involving a car-like robot, a quadrotor, and a medical steerable needle performing a liver biopsy procedure. PMID:28163662

  12. Comparing Robust Decision-Making and Dynamic Adaptive Policy Pathways for model-based decision support under deep uncertainty

    NARCIS (Netherlands)

    Kwakkel, J.H.; Haasnoot, M.; Walker, W.E.

    2016-01-01

    A variety of model-based approaches for supporting decision-making under deep uncertainty have been suggested, but they are rarely compared and contrasted. In this paper, we compare Robust Decision-Making with Dynamic Adaptive Policy Pathways. We apply both to a hypothetical case inspired by a

  13. A Convex Model of Risk-Based Unit Commitment for Day-Ahead Market Clearing Considering Wind Power Uncertainty

    DEFF Research Database (Denmark)

    Zhang, Ning; Kang, Chongqing; Xia, Qing

    2015-01-01

    This paper presents a novel risk-based day-ahead unit commitment (RUC) model that considers the risks of the loss of load, wind curtailment and branch overflow caused by wind power uncertainty. These risks are formulated in detail using the probabilistic distributions of wind power probabilistic forecast...

  14. Search for gamma-ray events in the BATSE data base

    Science.gov (United States)

    Lewin, Walter

    1994-01-01

    We find large location errors and error radii in the locations of channel 1 Cygnus X-1 events. These errors and their associated uncertainties are a result of low signal-to-noise ratios (a few sigma) in the two brightest detectors for each event. The untriggered events suffer from similarly low signal-to-noise ratios, and their location errors are expected to be at least as large as those found for Cygnus X-1 with a given signal-to-noise ratio. The statistical error radii are consistent with those found for Cygnus X-1 and with the published estimates. We therefore expect approximately 20 - 30 deg location errors for the untriggered events. Hence, many of the untriggered events occurring within a few months of the triggered activity from SGR 1900 plus 14 are indeed consistent with the SGR source location, although Cygnus X-1 is also a good candidate.

  15. Quantitative Analysis of Variability and Uncertainty in Environmental Data and Models. Volume 1. Theory and Methodology Based Upon Bootstrap Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Frey, H. Christopher [North Carolina State University, Raleigh, NC (United States); Rhodes, David S. [North Carolina State University, Raleigh, NC (United States)

    1999-04-30

    This is Volume 1 of a two-volume set of reports describing work conducted at North Carolina State University sponsored by Grant Number DE-FG05-95ER30250 by the U.S. Department of Energy. The title of the project is “Quantitative Analysis of Variability and Uncertainty in Acid Rain Assessments.” The work conducted under sponsorship of this grant pertains primarily to two main topics: (1) development of new methods for quantitative analysis of variability and uncertainty applicable to any type of model; and (2) analysis of variability and uncertainty in the performance, emissions, and cost of electric power plant combustion-based NOx control technologies. These two main topics are reported separately in Volumes 1 and 2.

  16. Disentangling the effect of event-based cues on children's time-based prospective memory performance.

    Science.gov (United States)

    Redshaw, Jonathan; Henry, Julie D; Suddendorf, Thomas

    2016-10-01

    Previous time-based prospective memory research, both with children and with other groups, has measured the ability to perform an action with the arrival of a time-dependent yet still event-based cue (e.g., the occurrence of a specific clock pattern) while also engaged in an ongoing activity. Here we introduce a novel means of operationalizing time-based prospective memory and assess children's growing capacities when the availability of an event-based cue is varied. Preschoolers aged 3, 4, and 5years (N=72) were required to ring a bell when a familiar 1-min sand timer had completed a cycle under four conditions. In a 2×2 within-participants design, the timer was either visible or hidden and was either presented in the context of a single task or embedded within a dual picture-naming task. Children were more likely to ring the bell before 2min had elapsed in the visible-timer and single-task conditions, with performance improving with age across all conditions. These results suggest a divergence in the development of time-based prospective memory in the presence versus absence of event-based cues, and they also suggest that performance on typical time-based tasks may be partly driven by event-based prospective memory. Copyright © 2016 Elsevier Inc. All rights reserved.

  17. A process-oriented event-based programming language

    DEFF Research Database (Denmark)

    Hildebrandt, Thomas; Zanitti, Francesco

    2012-01-01

    We present the first version of PEPL, a declarative Process-oriented, Event-based Programming Language based on the recently introduced Dynamic Condition Response (DCR) Graphs model. DCR Graphs allow the specification, distributed execution and verification of pervasive event-based... defined and executed in an ordinary web browser.

  18. Deterministic event-based simulation of quantum phenomena

    NARCIS (Netherlands)

    De Raedt, K; De Raedt, H; Michielsen, K

    2005-01-01

    We propose and analyse simple deterministic algorithms that can be used to construct machines that have primitive learning capabilities. We demonstrate that locally connected networks of these machines can be used to perform blind classification on an event-by-event basis, without storing the

  19. Robust Initial Wetness Condition Framework of an Event-Based Rainfall–Runoff Model Using Remotely Sensed Soil Moisture

    Directory of Open Access Journals (Sweden)

    Wooyeon Sunwoo

    2017-01-01

    Full Text Available Runoff prediction in limited-data areas is vital for hydrological applications, such as the design of infrastructure and flood defenses, runoff forecasting, and water management. Rainfall–runoff models may be useful for simulation of runoff generation, particularly event-based models, which offer a practical modeling scheme because of their simplicity. However, there is a need to reduce the uncertainties related to the estimation of the initial wetness condition (IWC) prior to a rainfall event. Soil moisture is one of the most important variables in rainfall–runoff modeling, and remotely sensed soil moisture is recognized as an effective way to improve the accuracy of runoff prediction. In this study, the IWC was evaluated based on remotely sensed soil moisture by using the Soil Conservation Service-Curve Number (SCS-CN) method, which is one of the representative event-based models used for reducing the uncertainty of runoff prediction. Four proxy variables for the IWC were determined from the measurements of total rainfall depth (API5), ground-based soil moisture (SSMinsitu), remotely sensed surface soil moisture (SSM), and soil water index (SWI) provided by the advanced scatterometer (ASCAT). To obtain a robust IWC framework, this study consists of two main parts: the validation of remotely sensed soil moisture, and the evaluation of runoff prediction using four proxy variables with a set of rainfall–runoff events in the East Asian monsoon region. The results showed an acceptable agreement between remotely sensed soil moisture (SSM and SWI) and ground-based soil moisture data (SSMinsitu). In the proxy variable analysis, the SWI indicated the optimal value among the proposed proxy variables. In the runoff prediction analysis considering various infiltration conditions, the SSM and SWI proxy variables significantly reduced the runoff prediction error as compared with API5 by 60% and 66%, respectively. Moreover, the proposed IWC framework with
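
    The SCS-CN relation that such a framework builds on is compact enough to sketch; the mapping from a soil-moisture proxy to an adjusted curve number below is purely illustrative and is not the paper's calibration.

```python
# Sketch of the SCS-CN rainfall-runoff relation. The linear adjustment of the
# curve number from a 0-1 wetness proxy (the IWC) is purely illustrative; the
# paper's actual mapping may differ.

def scs_cn_runoff(p_mm, cn, lam=0.2):
    """Direct runoff depth (mm) for event rainfall p_mm and curve number cn."""
    s = 25400.0 / cn - 254.0          # potential maximum retention (mm)
    ia = lam * s                      # initial abstraction
    if p_mm <= ia:
        return 0.0
    return (p_mm - ia) ** 2 / (p_mm - ia + s)


def adjust_cn_for_wetness(cn_avg, wetness_index):
    """Toy adjustment: shift CN between dry (~0.8*CN) and wet (~1.1*CN) bounds
    according to a normalized soil-moisture proxy such as the SWI (assumption)."""
    cn = cn_avg * (0.8 + 0.3 * min(max(wetness_index, 0.0), 1.0))
    return min(cn, 98.0)


if __name__ == "__main__":
    for w in (0.1, 0.5, 0.9):
        cn = adjust_cn_for_wetness(75.0, w)
        print("wetness %.1f -> CN %.1f -> runoff %.1f mm for a 60 mm storm"
              % (w, cn, scs_cn_runoff(60.0, cn)))
```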

  20. Uncertainties in TRMM-Era multisatellite-based tropical rainfall estimates over the Maritime Continent

    Science.gov (United States)

    Rauniyar, S. P.; Protat, A.; Kanamori, H.

    2017-05-01

    This study investigates the regional and seasonal rainfall rate retrieval uncertainties within nine state-of-the-art satellite-based rainfall products over the Maritime Continent (MC) region. The results show consistently larger differences in mean daily rainfall among products over land, especially over mountains and along coasts, compared to over ocean, by about 20% for low to medium rain rates and 5% for heavy rain rates. However, rainfall differences among the products do not exhibit any seasonal dependency over both surface types (land and ocean) of the MC region. The differences between products largely depends on the rain rate itself, with a factor 2 difference for light rain and 30% for intermediate and high rain rates over ocean. The rain-rate products dominated by microwave measurements showed less spread among themselves over ocean compared to the products dominated by infrared measurements. Conversely, over land, the rain gauge-adjusted post-real-time products dominated by microwave measurements produced the largest spreads, due to the usage of different gauge analyses for the bias corrections. Intercomparisons of rainfall characteristics of these products revealed large discrepancies in detecting the frequency and intensity of rainfall. These satellite products are finally evaluated at subdaily, daily, monthly, intraseasonal, and seasonal temporal scales against high-quality gridded rainfall observations in the Sarawak (Malaysia) region for the 4 year period 2000-2003. No single satellite-based rainfall product clearly outperforms the other products at all temporal scales. General guidelines are provided for selecting a product that could be best suited for a particular application and/or temporal resolution.

  1. Helping families thrive in the face of uncertainty: Strengths based approaches to working with families affected by progressive neurological illness.

    Science.gov (United States)

    Tams, Rachel; Prangnell, Simon J; Daisley, Audrey

    2016-03-23

    Management of the uncertainty inherent in a diagnosis of a progressive neurological illness is one of the major adjustment tasks facing those affected and their families. A causal relationship has been demonstrated between perceived illness uncertainty and negative psychological outcomes for individuals with progressive neurological illness. Whilst there is a small and promising intervention literature on the use of a range of individually focused, strengths-based psychological interventions, there appears to be little guidance available on how clinicians might help the family members of those affected. The aim was to undertake a systematic review of the evidence on the use of strengths-based, family-focused interventions that target illness uncertainty. A systematic literature search was undertaken using the National Library for Health abstract database. Five papers were included in the review, only two of which were published in peer-reviewed journals. All five reported on strengths-based approaches that could be used with families, but only two explicitly identified illness uncertainty as a target. Outcome measures were heterogeneous, so data could not be aggregated for meta-analysis. The results suggested that these interventions show promise, but the review highlighted a number of methodological issues which mean that the results must be interpreted with caution. There is very little evidence on the use of strengths-based approaches to helping families manage the uncertainty associated with progressive neurological illness, despite it having been identified as a key target for intervention. The review highlights the need for the development of an intervention framework to address this key clinical issue and suggests one model that might show promise.

  2. Verification of uncertainty budgets

    DEFF Research Database (Denmark)

    Heydorn, Kaj; Madsen, B.S.

    2005-01-01

    The quality of analytical results is expressed by their uncertainty, as it is estimated on the basis of an uncertainty budget; little effort is, however, often spent on ascertaining the quality of the uncertainty budget. The uncertainty budget is based on circumstantial or historical data, and therefore it is essential that the applicability of the overall uncertainty budget to actual measurement results be verified on the basis of current experimental data. This should be carried out by replicate analysis of samples taken in accordance with the definition of the measurand, but representing the full range of matrices and concentrations for which the budget is assumed to be valid. In this way the assumptions made in the uncertainty budget can be experimentally verified, both as regards sources of variability that are assumed negligible, and dominant uncertainty components. Agreement between...

  3. Beam steering uncertainty analysis for Risley prisms based on Monte Carlo simulation

    Science.gov (United States)

    Zhang, Hao; Yuan, Yan; Su, Lijuan; Huang, Fengzhen

    2017-01-01

    The Risley-prism system is applied in imaging LADAR to achieve precision directing of laser beams. The image quality of LADAR is affected deeply by the laser beam steering quality of Risley prisms. The ray-tracing method was used to predict the pointing error. The beam steering uncertainty of Risley prisms was investigated through Monte Carlo simulation under the effects of rotation axis jitter and prism rotation error. Case examples were given to elucidate the probability distribution of pointing error. Furthermore, the effect of scan pattern on the beam steering uncertainty was also studied. It is found that the demand for the bearing rotational accuracy of the second prism is much more stringent than that of the first prism. Under the effect of rotation axis jitter, the pointing uncertainty in the field of regard is related to the altitude angle of the emerging beam, but it has no relationship with the azimuth angle. The beam steering uncertainty will be affected by the original phase if the scan pattern is a circle. The proposed method can be used to estimate the beam steering uncertainty of Risley prisms, and the conclusions will be helpful in the design and manufacture of this system.
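
    A simplified Monte Carlo of this kind can be sketched with the first-order thin-prism approximation, in which each prism deviates the beam by (n - 1)α toward its rotational orientation; the paper itself uses exact ray tracing, so the numbers below are only indicative, and the wedge angles, refractive index and jitter levels are assumptions.

```python
import numpy as np

# Monte Carlo sketch of Risley-prism pointing uncertainty under rotation-angle
# jitter, using the first-order thin-prism approximation. This is a simplified
# stand-in for the full ray-tracing model used in the paper.

rng = np.random.default_rng(3)
n_glass = 1.517                   # refractive index (assumed)
alpha1 = alpha2 = np.deg2rad(10)  # prism wedge angles (assumed)
delta1 = (n_glass - 1) * alpha1   # small-angle deviation of each prism
delta2 = (n_glass - 1) * alpha2


def pointing(theta1, theta2):
    """Net deviation vector (rad) for prism rotation angles theta1, theta2."""
    return np.array([delta1 * np.cos(theta1) + delta2 * np.cos(theta2),
                     delta1 * np.sin(theta1) + delta2 * np.sin(theta2)])


def mc_pointing_error(theta1, theta2, jitter1_urad, jitter2_urad, n_mc=20000):
    nominal = pointing(theta1, theta2)
    t1 = theta1 + rng.normal(0, jitter1_urad * 1e-6, n_mc)
    t2 = theta2 + rng.normal(0, jitter2_urad * 1e-6, n_mc)
    dev = np.stack([delta1 * np.cos(t1) + delta2 * np.cos(t2),
                    delta1 * np.sin(t1) + delta2 * np.sin(t2)])
    err = np.linalg.norm(dev - nominal[:, None], axis=0)
    return np.percentile(err, 95) * 1e6          # 95th-percentile error (urad)


# Example: identical jitter on both axes, scanned over relative prism orientation.
for dtheta in (0, 90, 180):
    e95 = mc_pointing_error(0.0, np.deg2rad(dtheta), 50.0, 50.0)
    print("relative orientation %3d deg -> 95%% pointing error %.1f urad" % (dtheta, e95))
```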

  4. A frequency/consequence-based technique for visualizing and communicating uncertainty and perception of risk.

    Science.gov (United States)

    Slavin, David; Troy Tucker, W; Ferson, Scott

    2008-04-01

    This chapter presents an approach under development for communicating uncertainty regarding risk. The approach relies on a risk imaging technology that decomposes risk into two basic elements: (i) the frequency of each kind of harm associated with a hazard and (ii) the adversity of each of those harms. Because different kinds of harm are often measured along incompatible dimensions, adversity is quantified on an ordinal scale. Frequency is quantified on a ratio scale. Sampling error, measurement error, and bias all contribute to uncertainty about frequency. Differences in opinion, measurement error, and choice of dimensions lead to uncertainty about adversity. In this chapter, risk is imaged as an area circumscribed by uncertainty bounds around all of the harms. This area is called the risk profile of a hazard. Different individuals and groups respond to uncertainty and risk differently, and the risk profile can be further focused to visualize particular risk perceptions. These alternate risk visualizations may be contrasted and compared across management choices or across different risk perceivers to facilitate communication and decision making. To illustrate the method, we image published clinical trial data.

  5. Stochastic goal programming based groundwater remediation management under human-health-risk uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Li, Jing; He, Li, E-mail: li.he@ncepu.edu.cn; Lu, Hongwei; Fan, Xing

    2014-08-30

    Highlights:
    • We propose an integrated optimal groundwater remediation design approach.
    • The approach can address stochasticity in carcinogenic risks.
    • Goal programming is used to make the system approach the ideal operation and remediation effects.
    • The uncertainty in the slope factor is evaluated under different confidence levels.
    • Optimal strategies are obtained to support remediation design under uncertainty.

    Abstract: An optimal design approach for groundwater remediation is developed through incorporating numerical simulation, health risk assessment, uncertainty analysis and nonlinear optimization within a general framework. Stochastic analysis and goal programming are introduced into the framework to handle uncertainties in real-world groundwater remediation systems. Carcinogenic risks associated with remediation actions are further evaluated at four confidence levels. The differences between ideal and predicted constraints are minimized by goal programming. The approach is then applied to a contaminated site in western Canada for creating a set of optimal remediation strategies. Results from the case study indicate that factors including environmental standards, health risks and technical requirements mutually affected and constrained one another. Stochastic uncertainty existed in the entire process of remediation optimization, which should be taken into consideration in groundwater remediation design.

  6. Event-based home safety problem detection under the CPS home safety architecture

    OpenAIRE

    Yang, Zhengguo; Lim, Azman Osman; Tan, Yasuo

    2013-01-01

    This paper presents a CPS (Cyber-Physical System) home safety architecture for home safety problem detection and reaction and shows some example cases. For home safety problem detection, three levels of events are defined: elementary events, semantic events and entire events, representing, respectively, the meaning of a single parameter, a single safety problem, and the whole safety status of a house. For the relationship between these events and raw data, a Finite State Machine (FSM) based m...
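
    The three event levels can be illustrated with a small sketch in which threshold checks produce elementary events, a per-hazard finite state machine produces semantic events, and the entire event is the worst per-hazard state; the sensors, thresholds and states below are hypothetical examples, not those of the paper.

```python
from enum import Enum, auto

# Illustrative sketch of mapping raw sensor readings to three event levels
# (elementary -> semantic -> entire) with a small finite state machine per
# hazard. Thresholds, sensors and states are hypothetical examples.


class SafetyState(Enum):
    SAFE = auto()
    WARNING = auto()
    DANGER = auto()


def elementary_events(reading):
    """Level 1: threshold checks on single parameters."""
    events = set()
    if reading.get("co_ppm", 0) > 30:
        events.add("CO_HIGH")
    if reading.get("temp_c", 20) > 55:
        events.add("TEMP_HIGH")
    if reading.get("smoke", 0) > 0.1:
        events.add("SMOKE_DETECTED")
    return events


def semantic_event(events, state):
    """Level 2: FSM per safety problem (here: fire risk)."""
    if "SMOKE_DETECTED" in events and "TEMP_HIGH" in events:
        return SafetyState.DANGER
    if "SMOKE_DETECTED" in events or "TEMP_HIGH" in events:
        # escalate only from SAFE; stay in DANGER until readings clear
        return SafetyState.WARNING if state == SafetyState.SAFE else state
    return SafetyState.SAFE


def entire_event(semantic_states):
    """Level 3: overall house status is the worst of the per-problem states."""
    return max(semantic_states, key=lambda s: s.value)


state = SafetyState.SAFE
for reading in [{"temp_c": 24}, {"temp_c": 58}, {"temp_c": 60, "smoke": 0.3}]:
    state = semantic_event(elementary_events(reading), state)
    print(reading, "->", state, "| house:", entire_event([state, SafetyState.SAFE]))
```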

  7. Finite element model validation of bridge based on structural health monitoring—Part II: Uncertainty propagation and model validation

    Directory of Open Access Journals (Sweden)

    Xiaosong Lin

    2015-08-01

    Full Text Available Because of uncertainties involved in modeling, construction, and measurement systems, the assessment of FE model validity must be conducted based on stochastic measurements to provide designers with confidence for further applications. In this study, based on the model updated using response surface methodology, a practical model validation methodology via uncertainty propagation is presented. Several criteria of testing/analysis correlation are introduced, and the sources of model and testing uncertainties are also discussed. After that, the Monte Carlo stochastic finite element (FE) method is employed to perform the uncertainty quantification and propagation. The proposed methodology is illustrated with the examination of the validity of a large-span prestressed concrete continuous rigid frame bridge monitored under operational conditions. It can be concluded that the calculated frequencies and vibration modes of the updated FE model of Xiabaishi Bridge are consistent with the measured ones. The relative errors of each frequency are all less than 3.7%; meanwhile, the overlap ratio indexes of each frequency are all more than 75%, and the MAC values of each calculated vibration mode are all more than 90%. The model of Xiabaishi Bridge is valid in the whole operation space, including the experimental design space, and its confidence level is higher than 95%. The validated FE model of Xiabaishi Bridge reflects the current condition of the bridge and can also be used as a basis for bridge health monitoring, damage identification and safety assessment.

  8. A joint renewal process used to model event based data

    National Research Council Canada - National Science Library

    Mergenthaler, Wolfgang; Jaroszewski, Daniel; Feller, Sebastian; Laumann, Larissa

    2016-01-01

    .... Event data, herein defined as a collection of triples containing a time stamp, a failure code and, optionally, a descriptive text, can best be evaluated by using the paradigm of joint renewal processes...
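
    The event-triple representation lends itself to a very small sketch: store (time stamp, failure code, description) records and summarize the empirical inter-event times per failure code, which is the raw material a renewal-process model would be fitted to. The simple per-code mean below is illustrative and is not the joint renewal model of the paper.

```python
from dataclasses import dataclass
from datetime import datetime
from collections import defaultdict

# Sketch of the event-data representation described above and of a per-code
# renewal view obtained from the empirical inter-event times.


@dataclass
class Event:
    timestamp: datetime
    failure_code: str
    description: str = ""


def mean_inter_event_hours(events):
    """Mean time between events per failure code (simple renewal summary)."""
    by_code = defaultdict(list)
    for e in sorted(events, key=lambda e: e.timestamp):
        by_code[e.failure_code].append(e.timestamp)
    means = {}
    for code, times in by_code.items():
        gaps = [(t2 - t1).total_seconds() / 3600.0
                for t1, t2 in zip(times, times[1:])]
        if gaps:
            means[code] = sum(gaps) / len(gaps)
    return means


events = [Event(datetime(2016, 1, d, h), code)
          for d, h, code in [(1, 3, "E42"), (2, 9, "E42"), (4, 1, "E42"),
                             (1, 12, "F07"), (3, 12, "F07")]]
print(mean_inter_event_hours(events))
```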

  9. Inverse regression-based uncertainty quantification algorithms for high-dimensional models: Theory and practice

    Energy Technology Data Exchange (ETDEWEB)

    Li, Weixuan; Lin, Guang; Li, Bing

    2016-09-01

    A well-known challenge in uncertainty quantification (UQ) is the "curse of dimensionality". However, many high-dimensional UQ problems are essentially low-dimensional, because the randomness of the quantity of interest (QoI) is caused only by uncertain parameters varying within a low-dimensional subspace, known as the sufficient dimension reduction (SDR) subspace. Motivated by this observation, we propose and demonstrate in this paper an inverse regression-based UQ approach (IRUQ) for high-dimensional problems. Specifically, we use an inverse regression procedure to estimate the SDR subspace and then convert the original problem to a low-dimensional one, which can be efficiently solved by building a response surface model such as a polynomial chaos expansion. The novelty and advantages of the proposed approach are seen in its computational efficiency and practicality. Compared with Monte Carlo, the traditionally preferred approach for high-dimensional UQ, IRUQ at comparable cost generally gives much more accurate solutions even for high-dimensional problems, and even when the dimension reduction is not exactly sufficient. Theoretically, IRUQ is proved to converge twice as fast as the approach it uses to seek the SDR subspace. For example, while a sliced inverse regression method converges to the SDR subspace at the rate of $O(n^{-1/2})$, the corresponding IRUQ converges at $O(n^{-1})$. IRUQ also provides several desired conveniences in practice. It is non-intrusive, requiring only a simulator to generate realizations of the QoI, and there is no need to compute the high-dimensional gradient of the QoI. Finally, error bars can be derived for the estimation results reported by IRUQ.
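
    The core inverse-regression step can be sketched with a basic sliced inverse regression followed by a cheap surrogate in the reduced coordinate; the toy test function, slicing scheme and polynomial surrogate below are assumptions standing in for an expensive simulator and a polynomial chaos expansion.

```python
import numpy as np

# Sketch of the inverse-regression idea: estimate a low-dimensional subspace
# with sliced inverse regression (SIR), project the inputs, and fit a cheap
# surrogate in the reduced coordinate. The test function is a toy stand-in.

rng = np.random.default_rng(7)
d, n = 20, 2000
X = rng.normal(size=(n, d))
beta = np.zeros(d)
beta[0], beta[1] = 1.0, 0.5
y = np.sin(X @ beta) + 0.05 * rng.normal(size=n)       # QoI varies in a 1-D subspace


def sir_directions(X, y, n_slices=20, n_dirs=1):
    """Return leading SIR directions (columns)."""
    mu, sd = X.mean(0), X.std(0)
    Z = (X - mu) / sd
    order = np.argsort(y)
    slices = np.array_split(order, n_slices)
    M = np.zeros((X.shape[1], X.shape[1]))
    for idx in slices:
        m = Z[idx].mean(0)
        M += len(idx) / len(y) * np.outer(m, m)          # weighted slice-mean covariance
    vals, vecs = np.linalg.eigh(M)
    dirs = vecs[:, ::-1][:, :n_dirs]                     # leading eigenvectors
    return dirs / sd[:, None]                            # back to original scale


dirs = sir_directions(X, y)
t = X @ dirs                                             # reduced coordinate(s)
coef = np.polyfit(t.ravel(), y, deg=5)                   # cheap 1-D surrogate
y_hat = np.polyval(coef, t.ravel())
print("surrogate R^2: %.3f" % (1 - np.var(y - y_hat) / np.var(y)))
```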

  10. Artificial intelligence based event detection in wireless sensor networks

    OpenAIRE

    Bahrepour, M.

    2013-01-01

    Wireless sensor networks (WSNs) are composed of large number of small, inexpensive devices, called sensor nodes, which are equipped with sensing, processing, and communication capabilities. While traditional applications of wireless sensor networks focused on periodic monitoring, the focus of more recent applications is on fast and reliable identification of out-of-ordinary situations and events. This new functionality of wireless sensor networks is known as event detection. Due to the fact t...

  11. Understanding uncertainty

    CERN Document Server

    Lindley, Dennis V

    2013-01-01

    Praise for the First Edition: "...a reference for everyone who is interested in knowing and handling uncertainty." (Journal of Applied Statistics). The critically acclaimed First Edition of Understanding Uncertainty provided a study of uncertainty addressed to scholars in all fields, showing that uncertainty could be measured by probability, and that probability obeyed three basic rules that enabled uncertainty to be handled sensibly in everyday life. These ideas were extended to embrace the scientific method and to show how decisions, containing an uncertain element, could be rationally made.

  12. RELATIONSHIP BETWEEN CULTURAL/ARTISTIC EVENTS VISITATION AND OTHER ACTIVITY-BASED TOURISM SEGMENTS

    National Research Council Canada - National Science Library

    Ana Tezak; Darko Saftic; Zdravko Sergo

    2011-01-01

    .... One of these specific forms of tourism is event tourism. The aim of this research is to determine the relationship between cultural/artistic events visitation and other activity-based tourism segments...

  13. Agreement between event-based and trend-based glaucoma progression analyses.

    Science.gov (United States)

    Rao, H L; Kumbar, T; Kumar, A U; Babu, J G; Senthil, S; Garudadri, C S

    2013-07-01

    To evaluate the agreement between event- and trend-based analyses to determine visual field (VF) progression in glaucoma. VFs of 175 glaucoma eyes with ≥5 VFs were analyzed by the proprietary software of the VF analyzer to determine progression. Agreement (κ) between trend-based analysis of the VF index (VFI) and event-based analysis (glaucoma progression analysis, GPA) was evaluated. For eyes progressing by both event- and trend-based methods, time to progression by the two methods was calculated. Median number of VFs per eye was 7 and median follow-up was 7.5 years. GPA classified 101 eyes (57.7%) as stable, 30 eyes (17.1%) as possible and 44 eyes (25.2%) as likely progression. Trend-based analysis classified 122 eyes (69.7%) as stable (slope >-1% per year or any slope magnitude with P>0.05) and 53 eyes (30.3%) as progressing (slope <-1% per year with P<0.05). Agreement between the sensitive criteria of GPA (possible and likely progression considered together) and trend-based analysis was 0.48, and between the specific criteria of GPA (possible clubbed with no progression) and trend-based analysis was 0.50. In eyes progressing by the sensitive criteria of both methods (42 eyes), median time to progression by GPA (4.9 years) was similar (P=0.30) to the trend-based method (5.0 years). This was also similar in eyes progressing by the specific criteria of both methods (25 eyes; 5.6 years versus 5.9 years, P=0.23). Agreement between event- and trend-based progression analysis was moderate. GPA seemed to detect progression earlier than trend-based analysis, but this was not statistically significant.
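
    The agreement statistic reported above is Cohen's kappa; a minimal sketch of how such an agreement figure is computed from two per-eye classifications is shown below. The labels are synthetic, not the study data.

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score

# Hypothetical per-eye classifications (1 = progressing, 0 = stable) from the
# event-based (GPA) and trend-based (VFI slope) analyses.
rng = np.random.default_rng(1)
gpa = rng.integers(0, 2, size=175)
trend = np.where(rng.random(175) < 0.8, gpa, 1 - gpa)   # ~80 % raw agreement

kappa = cohen_kappa_score(gpa, trend)
print(f"Cohen's kappa = {kappa:.2f}")   # values near 0.5 indicate moderate agreement
```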

  14. SParSE++: improved event-based stochastic parameter search.

    Science.gov (United States)

    Roh, Min K; Daigle, Bernie J

    2016-11-25

    Despite the increasing availability of high performance computing capabilities, analysis and characterization of stochastic biochemical systems remain a computational challenge. To address this challenge, the Stochastic Parameter Search for Events (SParSE) was developed to automatically identify reaction rates that yield a probabilistic user-specified event. SParSE consists of three main components: the multi-level cross-entropy method, which identifies biasing parameters to push the system toward the event of interest, the related inverse biasing method, and an optional interpolation of identified parameters. While effective for many examples, SParSE depends on the existence of a sufficient amount of intrinsic stochasticity in the system of interest. In the absence of this stochasticity, SParSE can either converge slowly or not at all. We have developed SParSE++, a substantially improved algorithm for characterizing target events in terms of system parameters. SParSE++ makes use of a series of novel parameter leaping methods that accelerate the convergence rate to the target event, particularly in low stochasticity cases. In addition, the interpolation stage is modified to compute multiple interpolants and to choose the optimal one in a statistically rigorous manner. We demonstrate the performance of SParSE++ on four example systems: a birth-death process, a reversible isomerization model, SIRS disease dynamics, and a yeast polarization model. In all four cases, SParSE++ shows significantly improved computational efficiency over SParSE, with the largest improvements resulting from analyses with the strictest error tolerances. As researchers continue to model realistic biochemical systems, the need for efficient methods to characterize target events will grow. The algorithmic advancements provided by SParSE++ fulfill this need, enabling characterization of computationally intensive biochemical events that are currently resistant to analysis.
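
    A rough sketch of the multi-level cross-entropy idea underlying SParSE, not the SParSE/SParSE++ code itself: rate parameters are sampled, the event probability is estimated by simulation, the samples whose estimated probabilities lie closest to the user-specified target form an elite set, and the sampling distribution is refit to that set. The toy birth process, target probability and tolerances are assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy target event: a pure birth process with rate k produces >= 25 births
# within T = 1 time unit.  We search for k such that P(event) ~= 0.6.
T, THRESHOLD, P_TARGET = 1.0, 25, 0.6

def event_probability(k, n_traj=400):
    """Estimate P(N(T) >= THRESHOLD) by sampling Poisson counts."""
    counts = rng.poisson(k * T, size=n_traj)
    return np.mean(counts >= THRESHOLD)

# Cross-entropy search over log(k): sample, rank by closeness to the target
# probability, refit a Gaussian to the elite fraction, repeat.
mu, sigma, n_samp, elite_frac = np.log(10.0), 1.0, 50, 0.2
for it in range(15):
    log_k = rng.normal(mu, sigma, size=n_samp)
    scores = np.array([abs(event_probability(np.exp(lk)) - P_TARGET)
                       for lk in log_k])
    elite = log_k[np.argsort(scores)[:int(elite_frac * n_samp)]]
    mu, sigma = elite.mean(), max(elite.std(), 1e-3)
    print(f"iter {it:2d}: k ~ {np.exp(mu):6.2f}, "
          f"P(event) ~ {event_probability(np.exp(mu), 2000):.3f}")
```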

  15. Uncertainty analysis of life cycle greenhouse gas emissions from petroleum-based fuels and impacts on low carbon fuel policies.

    Science.gov (United States)

    Venkatesh, Aranya; Jaramillo, Paulina; Griffin, W Michael; Matthews, H Scott

    2011-01-01

    The climate change impacts of U.S. petroleum-based fuel consumption have contributed to the development of legislation supporting the introduction of low carbon alternatives, such as biofuels. However, the potential greenhouse gas (GHG) emissions reductions estimated for these policies using life cycle assessment methods are predominantly based on deterministic approaches that do not account for any uncertainty in outcomes. This may lead to unreliable and expensive decision making. In this study, the uncertainty in life cycle GHG emissions associated with petroleum-based fuels consumed in the U.S. is determined using a process-based framework and statistical modeling methods. Probability distributions fitted to available data were used to represent uncertain parameters in the life cycle model. Where data were not readily available, a partial least-squares (PLS) regression model based on existing data was developed. This was used in conjunction with probability mixture models to select appropriate distributions for specific life cycle stages. Finally, a Monte Carlo simulation was performed to generate sample output distributions. As an example of results from using these methods, the uncertainty range in life cycle GHG emissions from gasoline was shown to be approximately 13%, which is higher than the typical 10% minimum emissions reduction targets specified by low carbon fuel policies.
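
    A minimal sketch of the general Monte Carlo propagation step described above: probability distributions represent the uncertain life cycle stages and are sampled to produce an output distribution for the total emissions. The stage names and distribution parameters below are illustrative placeholders, not the values used in the study.

```python
import numpy as np

rng = np.random.default_rng(3)
N = 100_000   # Monte Carlo samples

# Illustrative (not the paper's) life cycle stages for gasoline, g CO2e per MJ.
# Each uncertain stage is represented by a fitted probability distribution.
crude_recovery = rng.lognormal(mean=np.log(5.0), sigma=0.25, size=N)
refining = rng.lognormal(mean=np.log(13.0), sigma=0.20, size=N)
transport = rng.normal(loc=1.0, scale=0.2, size=N)
combustion = rng.normal(loc=73.0, scale=1.0, size=N)

total = crude_recovery + refining + transport + combustion
low, med, high = np.percentile(total, [2.5, 50, 97.5])
print(f"life cycle GHG intensity: {med:.1f} g CO2e/MJ "
      f"(95 % interval {low:.1f}-{high:.1f})")
print(f"relative 95 % range: {(high - low) / med * 100:.0f} % of the median")
```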

  16. Exploring the uncertainty associated with satellite-based estimates of premature mortality due to exposure to fine particulate matter

    Directory of Open Access Journals (Sweden)

    B. Ford

    2016-03-01

    Full Text Available The negative impacts of fine particulate matter (PM2.5) exposure on human health are a primary motivator for air quality research. However, estimates of the air pollution health burden vary considerably and strongly depend on the data sets and methodology. Satellite observations of aerosol optical depth (AOD) have been widely used to overcome limited coverage from surface monitoring and to assess the global population exposure to PM2.5 and the associated premature mortality. Here we quantify the uncertainty in determining the burden of disease using this approach, discuss different methods and data sets, and explain sources of discrepancies among values in the literature. For this purpose we primarily use the MODIS satellite observations in concert with the GEOS-Chem chemical transport model. We contrast results in the United States and China for the years 2004–2011. Using the Burnett et al. (2014) integrated exposure response function, we estimate that in the United States, exposure to PM2.5 accounts for approximately 2 % of total deaths compared to 14 % in China (using satellite-based exposure), which falls within the range of previous estimates. The difference in estimated mortality burden based solely on a global model vs. that derived from satellite is approximately 14 % for the US and 2 % for China on a nationwide basis, although regionally the differences can be much greater. This difference is overshadowed by the uncertainty in the methodology for deriving PM2.5 burden from satellite observations, which we quantify to be on the order of 20 % due to uncertainties in the AOD-to-surface-PM2.5 relationship, 10 % due to the satellite observational uncertainty, and 30 % or greater uncertainty associated with the application of concentration response functions to estimated exposure.

  17. Event-based text mining for biology and functional genomics

    Science.gov (United States)

    Thompson, Paul; Nawaz, Raheel; McNaught, John; Kell, Douglas B.

    2015-01-01

    The assessment of genome function requires a mapping between genome-derived entities and biochemical reactions, and the biomedical literature represents a rich source of information about reactions between biological components. However, the increasingly rapid growth in the volume of literature provides both a challenge and an opportunity for researchers to isolate information about reactions of interest in a timely and efficient manner. In response, recent text mining research in the biology domain has been largely focused on the identification and extraction of ‘events’, i.e. categorised, structured representations of relationships between biochemical entities, from the literature. Functional genomics analyses necessarily encompass events as so defined. Automatic event extraction systems facilitate the development of sophisticated semantic search applications, allowing researchers to formulate structured queries over extracted events, so as to specify the exact types of reactions to be retrieved. This article provides an overview of recent research into event extraction. We cover annotated corpora on which systems are trained, systems that achieve state-of-the-art performance and details of the community shared tasks that have been instrumental in increasing the quality, coverage and scalability of recent systems. Finally, several concrete applications of event extraction are covered, together with emerging directions of research. PMID:24907365

  18. Mapping Heat-related Risks for Community-based Adaptation Planning under Uncertainty

    Science.gov (United States)

    Bai, Yingjiu; Kaneko, Ikuyo; Kobayashi, Hikaru; Kurihara, Kazuo; Sasaki, Hidetaka; Murata, Akihiko; Takayabu, Izuru

    2016-04-01

    Climate change is leading to more frequent and intense heat waves. Recently, epidemiologic findings on heat-related health impacts have reinforced our understanding of the mortality impacts of extreme heat. This research has several aims: 1) to promote climate prediction services with spatial and temporal information on heat-related risks, using GIS (Geographical Information System) and digital mapping techniques; 2) to propose a visualization approach to articulating the evolution of local heat-health responses over time and the evaluation of new interventions for the implementation of valid community-based adaptation strategies and reliable actionable planning; and 3) to provide an appropriate and simple method of adjusting bias and quantifying the uncertainty in future outcomes, so that regional climate projections may be transcribed into useful forms for a wide variety of different users. Following the 2003 European heat wave, climatologists, medical specialists, and social scientists expedited efforts to revise and integrate risk governance frameworks for communities to take appropriate and effective actions themselves. Recently, the Coupled Model Intercomparison Project (CMIP) has made projections available to anyone wanting to openly access state-of-the-art climate model outputs and climate data to provide the backbone for decisions. Furthermore, the latest high-resolution regional climate models (RCMs) have brought a huge increase in the volumes of data available. In this study, we used high-quality hourly projections (5-km resolution) from the Non-Hydrostatic Regional Climate Model (NHRCM-5km), following the SRES-A1B scenario developed by the Meteorological Research Institute (MRI) and observational data from the Automated Meteorological Data Acquisition System, Japan Meteorological Agency (JMA). The NHRCM-5km is a dynamic downscaling of results from the MRI-AGCM3.2S (20-km resolution), an atmospheric general circulation model (AGCM) driven by the

  19. Decentralized Event-Based Communication Strategy on Leader-Follower Consensus Control

    Directory of Open Access Journals (Sweden)

    Duosi Xie

    2016-01-01

    Full Text Available This paper addresses the leader-follower consensus problem of networked systems by using a decentralized event-based control strategy. The event-based control strategy makes the controllers of agents update at aperiodic event instants. Two decentralized event functions are designed to generate these event instants. In particular, the second event function only uses its own information and the neighbors’ states at their latest event instants. By using this event function, no continuous communication among followers is required. As the followers only communicate at these discrete event instants, this strategy is able to save communication and to reduce channel occupation. It is analytically shown that the leader-follower networked system is able to reach consensus by utilizing the proposed control strategy. Simulation examples are shown to illustrate effectiveness of the proposed control strategy.
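
    A minimal simulation sketch of event-based leader-follower consensus for single-integrator followers. A simple static trigger threshold stands in for the decentralized event functions designed in the paper; the graph, gains and threshold are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(4)
A = np.array([[0, 1, 0, 1],           # follower communication graph (a cycle)
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], float)
b = np.array([1.0, 0.0, 0.0, 0.0])    # only follower 0 receives the leader state
x_leader = 5.0
x = rng.uniform(-5.0, 5.0, size=4)    # follower states
x_hat = x.copy()                      # values broadcast at the last event instants
dt, threshold, n_events = 0.01, 0.05, 0

for _ in range(3000):
    # Event condition: broadcast only when the local error grows too large.
    for i in range(4):
        if abs(x[i] - x_hat[i]) >= threshold:
            x_hat[i] = x[i]
            n_events += 1
    # Controllers are piecewise constant between events and use only the
    # neighbours' last broadcast states, not continuous measurements.
    u = -(A * (x_hat[:, None] - x_hat[None, :])).sum(axis=1) \
        - b * (x_hat - x_leader)
    x += dt * u

print("final follower states:", np.round(x, 2), " leader:", x_leader)
print(f"broadcasts: {n_events} (vs {4 * 3000} with continuous communication)")
```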

  20. Managing Technical and Cost Uncertainties During Product Development in a Simulation-Based Design Environment

    Science.gov (United States)

    Karandikar, Harsh M.

    1997-01-01

    An approach for objective and quantitative technical and cost risk analysis during product development, which is applicable from the earliest stages, is discussed. The approach is supported by a software tool called the Analytical System for Uncertainty and Risk Estimation (ASURE). Details of ASURE, the underlying concepts and its application history, are provided.

  1. Effect of emissivity uncertainty on surface temperature retrieval over urban areas: Investigations based on spectral libraries

    NARCIS (Netherlands)

    Chen, F.; Yang, S.; Su, Zhongbo; Wang, K.

    2016-01-01

    Land surface emissivity (LSE) is a prerequisite for retrieving land surface temperature (LST) through single channel methods. According to the error model, a 0.01 (1%) uncertainty in LSE may result in a 0.5 K error in LST under moderate conditions, while a substantial error (approximately 1 K) is possible

  2. Uncertainty quantification and stochastic-based viscoelastic modeling of finite deformation elastomers

    Science.gov (United States)

    Oates, William S.; Hays, Michael; Miles, Paul; Smith, Ralph

    2013-04-01

    Material parameter uncertainty is a key aspect of model development. Here we quantify parameter uncertainty of a viscoelastic model through validation on rate-dependent deformation of a dielectric elastomer that undergoes finite deformation. These materials are known for their large field-induced deformation and applications in smart structures, although the rate-dependent viscoelastic effects are not well understood. To address this issue, we first quantify hyperelastic and viscoelastic model uncertainty using Bayesian statistics by comparing a linear viscoelastic model to uniaxial rate-dependent experiments. The probability densities, obtained from the Bayesian statistics, are then used to formulate a refined model that incorporates the probability densities directly within the model using homogenization methods. We focus on the uncertainty of the viscoelastic aspect of the model to show in which regimes the stochastic homogenization framework provides improvements in predicting viscoelastic constitutive behavior. It is shown that VHB has a relatively narrow probability distribution on the viscoelastic time constants. This supports use of a discrete viscoelastic model over the homogenized model.
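
    A minimal sketch of Bayesian quantification of viscoelastic parameters, in the spirit of the approach described above but not the authors' implementation: a random-walk Metropolis sampler recovers the posterior of a single modulus and relaxation time from synthetic stress-relaxation data. The model form, data, noise level and priors are assumptions.

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic stress-relaxation data from a single-branch linear viscoelastic
# model sigma(t) = E * exp(-t / tau); E, tau and the noise level are
# illustrative values, not VHB measurements.
t = np.linspace(0.0, 10.0, 60)
E_true, tau_true, noise = 1.2, 2.5, 0.02
data = E_true * np.exp(-t / tau_true) + rng.normal(0, noise, t.size)

def log_post(theta):
    """Gaussian likelihood with flat priors on E > 0 and tau > 0."""
    E, tau = theta
    if E <= 0 or tau <= 0:
        return -np.inf
    resid = data - E * np.exp(-t / tau)
    return -0.5 * np.sum(resid ** 2) / noise ** 2

# Random-walk Metropolis sampling of the posterior density
theta = np.array([1.0, 1.0])
lp = log_post(theta)
chain = []
for _ in range(20000):
    prop = theta + rng.normal(0, [0.02, 0.05])
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    chain.append(theta.copy())
chain = np.array(chain[5000:])                   # discard burn-in
print("posterior mean [E, tau]:", chain.mean(0).round(3))
print("posterior std  [E, tau]:", chain.std(0).round(3))
```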

  3. Impact of Uncertainty from Load-Based Reserves and Renewables on Dispatch Costs and Emissions

    Energy Technology Data Exchange (ETDEWEB)

    Li, Bowen; Maroukis, Spencer D.; Lin, Yashen; Mathieu, Johanna L.

    2016-11-21

    Aggregations of controllable loads are considered to be a fast-responding, cost-efficient, and environmentally friendly candidate for power system ancillary services. Unlike conventional service providers, the potential capacity from the aggregation is highly affected by factors like ambient conditions and load usage patterns. Previous work modeled aggregations of controllable loads (such as air conditioners) as thermal batteries, which are capable of providing reserves but with uncertain capacity. A stochastic optimal power flow problem was formulated to manage this uncertainty, as well as uncertainty in renewable generation. In this paper, we explore how the types and levels of uncertainty, generation reserve costs, and controllable load capacity affect the dispatch solution, operational costs, and CO2 emissions. We also compare the results of two methods for solving the stochastic optimization problem, namely the probabilistically robust method and analytical reformulation assuming Gaussian distributions. Case studies are conducted on a modified IEEE 9-bus system with renewables, controllable loads, and congestion. We find that different types and levels of uncertainty have significant impacts on dispatch and emissions. More controllable loads and less conservative solution methodologies lead to lower costs and emissions.
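
    The "analytical reformulation assuming Gaussian distributions" mentioned above typically replaces a linear chance constraint by a deterministic equivalent with a Gaussian quantile margin. The sketch below illustrates this for a single constraint, with illustrative numbers rather than the 9-bus case study.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(6)

# Single linear chance constraint  P(a^T xi <= b) >= 1 - eps  with Gaussian
# uncertainty xi ~ N(mu, Sigma).  Its deterministic equivalent is
#   a^T mu + Phi^{-1}(1 - eps) * sqrt(a^T Sigma a) <= b.
a = np.array([1.0, 0.6])
mu = np.array([20.0, 5.0])
Sigma = np.array([[4.0, 0.5],
                  [0.5, 1.0]])
eps = 0.05

margin = a @ mu + norm.ppf(1 - eps) * np.sqrt(a @ Sigma @ a)
print(f"constraint holds with prob >= {1 - eps:.2f} iff  b >= {margin:.2f}")

# Monte Carlo check of the reformulation
xi = rng.multivariate_normal(mu, Sigma, size=200_000)
print("empirical P(a^T xi <= margin):", np.mean(xi @ a <= margin))
```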

  4. Predicting Future Random Events Based on Past Performance

    OpenAIRE

    Donald G. Morrison; David C. Schmittlein

    1981-01-01

    There are many situations where one is interested in predicting the expected number of events in period 2 given that x events occurred in period 1. For example, insurance companies must decide whether or not to cancel the insurance of drivers who had 3 or more accidents during the previous year. In analyzing marketing research data an analyst may wish to predict the number of future purchases to be made by those customers who made x purchases in the previous 3 months. The owner of a baseball ...
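
    One classical way to make such predictions is an empirical Bayes Gamma-Poisson (negative binomial) model, under which the expected period-2 count given x events in period 1 shrinks the observed count toward the population mean. The sketch below verifies the shrinkage formula by simulation; the model choice and parameters are assumptions for illustration, not necessarily the authors' specification.

```python
import numpy as np

rng = np.random.default_rng(7)

# Illustrative setup: individual rates lambda ~ Gamma(alpha, beta) and counts
# given lambda are Poisson(lambda).  For equal-length periods,
#   E[period-2 count | x events in period 1] = (alpha + x) / (beta + 1),
# i.e. the raw count x is shrunk toward the population mean alpha / beta.
alpha, beta, n = 2.0, 1.0, 1_000_000
lam = rng.gamma(alpha, 1.0 / beta, size=n)
x1 = rng.poisson(lam)            # events observed in period 1
x2 = rng.poisson(lam)            # events in period 2

for x in range(5):
    pred = (alpha + x) / (beta + 1.0)
    emp = x2[x1 == x].mean()
    print(f"x = {x}:  formula {pred:.2f}   simulated {emp:.2f}")
```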

  5. A Systematic Methodology for Uncertainty Analysis of Group Contribution Based and Atom Connectivity Index Based Models for Estimation of Properties of Pure Components

    DEFF Research Database (Denmark)

    Hukkerikar, Amol; Sarup, Bent; Sin, Gürkan

    The objective of this work is to develop a systematic methodology to carry out uncertainty analysis of group contribution based...... and atom connectivity index based property prediction models. This includes: (i) parameter estimation using available MG based property prediction models and large training sets to determine improved group and atom contributions; and (ii) uncertainty analysis to establish statistical information...... and atomic connectivity index method) has been employed to create the missing groups and predict their contributions through the regressed contributions of connectivity indices...... The application of the developed methodology is highlighted through a set of molecules not used in the parameter estimation step. The developed methodology can be used to assist uncertainty and sensitivity analysis of product/process design to obtain rationally the risk/safety factors...

  6. Satellite-based emission constraint for nitrogen oxides: Capability and uncertainty

    Science.gov (United States)

    Lin, J.; McElroy, M. B.; Boersma, F.; Nielsen, C.; Zhao, Y.; Lei, Y.; Liu, Y.; Zhang, Q.; Liu, Z.; Liu, H.; Mao, J.; Zhuang, G.; Roozendael, M.; Martin, R.; Wang, P.; Spurr, R. J.; Sneep, M.; Stammes, P.; Clemer, K.; Irie, H.

    2013-12-01

    Vertical column densities (VCDs) of tropospheric nitrogen dioxide (NO2) retrieved from satellite remote sensing have been employed widely to constrain emissions of nitrogen oxides (NOx). A major strength of satellite-based emission constraints is the analysis of emission trends and variability, while a crucial limitation is errors both in satellite NO2 data and in model simulations relating NOx emissions to NO2 columns. Through a series of studies, we have explored these aspects over China. We separate anthropogenic from natural sources of NOx by exploiting their different seasonality. We infer trends of NOx emissions in recent years and effects of a variety of socioeconomic events at different spatiotemporal scales including the general economic growth, global financial crisis, Chinese New Year, and Beijing Olympics. We further investigate the impact of growing NOx emissions on particulate matter (PM) pollution in China. As part of recent developments, we identify and correct errors in both the satellite NO2 retrieval and the model simulation that ultimately affect the NOx emission constraint. We improve the treatments of aerosol optical effects, clouds and surface reflectance in the NO2 retrieval process, using ground-based MAX-DOAS measurements as a reference to evaluate the improved retrieval results. We analyze the sensitivity of simulated NO2 to errors in the model representation of major meteorological and chemical processes with a subsequent correction of model bias. Future studies will implement these improvements to re-constrain NOx emissions.

  7. A Studio Project Based on the Events of September 11

    Science.gov (United States)

    Ruby, Nell

    2004-01-01

    A week after the 9/11 WTC event, the collage project that Nell Ruby and her class had been working on in a basic design classroom lacked relevance. They had been working from master works, analyzing hue and value relationships, color schemes, shape, and composition. The master works seemed unimportant because of the immense emotional impact of the…

  8. Stable Integration of Power Electronics-Based DG Links to the Utility Grid with Interfacing Impedance Uncertainties

    OpenAIRE

    Kazem Hoseini, S.; Pouresmaeil, Edris; Adabi, Jafar; Catalão, João

    2015-01-01

    Part 16: Energy: Power Conversion II; International audience; For the integration of distributed generation (DG) units to the utility grid, voltage source converter (VSC) is the key technology. In order to realize high quality power injection, different control techniques have been adopted. However, the converter-based DG interface is subject to inevitable uncertainties, which adversely influence the performance of the controller. The interfacing impedance seen by the VSC may considerably var...

  9. A Discussion on Uncertainty Representation and Interpretation in Model-Based Prognostics Algorithms based on Kalman Filter Estimation Applied to Prognostics of Electronics Components

    Science.gov (United States)

    Celaya, Jose R.; Saxen, Abhinav; Goebel, Kai

    2012-01-01

    This article discusses several aspects of uncertainty representation and management for model-based prognostics methodologies based on our experience with Kalman Filters when applied to prognostics for electronics components. In particular, it explores the implications of modeling remaining useful life prediction as a stochastic process and how it relates to uncertainty representation, management, and the role of prognostics in decision-making. A distinction between the interpretations of estimated remaining useful life probability density function and the true remaining useful life probability density function is explained and a cautionary argument is provided against mixing interpretations for the two while considering prognostics in making critical decisions.

  10. An annually resolved marine proxy record for the 8.2K cold event from the northern North Sea based on bivalve shells

    Science.gov (United States)

    Butler, Paul; Estrella-Martínez, Juan; Scourse, James

    2017-04-01

    The so-called 8.2K cold event is a rapid cooling of about 6 ± 2 degrees recorded in the Greenland ice core record and thought to be a consequence of a freshwater pulse from the Laurentide ice sheet which reduced deepwater formation in the North Atlantic. In the Greenland ice cores the event is characterized by a maximum extent of 159 years and a central event lasting for 70 years. As discussed by Thomas et al. (QSR, 2007), the low resolution and dating uncertainty of much palaeoclimate data make it difficult to determine the rates of change and causal sequence that characterise the event at different locations. We present here a bivalve shell chronology based on four shells of Arctica islandica from the northern North Sea which (within radiocarbon uncertainty) is coeval with the 8.2K event recorded in the Greenland ice cores. The years of death of each shell based on radiocarbon analysis and crossmatching are 8094, 8134, 8147, and 8208 yrs BP (where "present" = AD 1950), with an associated radiocarbon uncertainty of ±80 yrs, and their longevities are 106, 122, 112 and 79 years respectively. The total length of the chronology is 192 years (8286-8094 yrs BP ± 80 yrs). The most noticeable feature of the chronology is a 60-year period of increasing growth which may correspond to a similar period of decreasing ice accumulation in the GRIP (central Greenland) ice core record. We tentatively suggest that this reflects increasing food supply to the benthos as summer stratification is weakened by colder seawater temperatures. Stable isotope analyses (results expected to be available when this abstract is presented) will show changes at annual and seasonal resolution, potentially giving a very detailed insight into the causal factors associated with the 8.2K event and its impact in the northern North Sea.

  11. Measurement Uncertainty

    Science.gov (United States)

    Koch, Michael

    Measurement uncertainty is one of the key issues in quality assurance. It became increasingly important for analytical chemistry laboratories with accreditation to ISO/IEC 17025. The uncertainty of a measurement is the most important criterion for deciding whether a measurement result is fit for purpose. It also helps in deciding whether a specification limit is exceeded or not. Estimating measurement uncertainty is often not trivial. Several strategies that have been developed for this purpose are briefly described in this chapter. In addition, the different possibilities for taking uncertainty into account in compliance assessment are explained.

  12. A Dynamically Configurable Log-based Distributed Security Event Detection Methodology using Simple Event Correlator

    Science.gov (United States)

    2010-06-01


  13. A quantum uncertainty relation based on Fisher's information

    Energy Technology Data Exchange (ETDEWEB)

    Sanchez-Moreno, P; Plastino, A R; Dehesa, J S, E-mail: pablos@ugr.es, E-mail: arplastino@ugr.es, E-mail: dehesa@ugr.es [Departamento de Fisica Atomica, Molecular y Nuclear and Instituto Carlos I de Fisica Teorica y Computacional, University of Granada, Granada (Spain)

    2011-02-11

    We explore quantum uncertainty relations involving the Fisher information functionals $I_x$ and $I_p$ evaluated, respectively, on a wavefunction $\Psi(x)$ defined on a D-dimensional configuration space and the concomitant wavefunction $\tilde{\Psi}(p)$ on the conjugate momentum space. We prove that the associated Fisher functionals obey the uncertainty relation $I_x I_p \geq 4D^2$ when either $\Psi(x)$ or $\tilde{\Psi}(p)$ is real. On the other hand, there is no lower bound to the above product for arbitrary complex wavefunctions. We give explicit examples of complex wavefunctions not obeying the above bound. In particular, we provide a parametrized wavefunction for which the product $I_x I_p$ can be made arbitrarily small.

  14. Life cycle cost optimization of biofuel supply chains under uncertainties based on interval linear programming.

    Science.gov (United States)

    Ren, Jingzheng; Dong, Liang; Sun, Lu; Goodsite, Michael Evan; Tan, Shiyu; Dong, Lichun

    2015-01-01

    The aim of this work was to develop a model for optimizing the life cycle cost of a biofuel supply chain under uncertainties. Multiple agriculture zones, multiple transportation modes for the transport of grain and biofuel, multiple biofuel plants, and multiple market centers were considered in this model, and the price of the resources, the yield of grain and the market demands were regarded as interval numbers instead of constants. An interval linear programming model was developed, and a method for solving interval linear programming was presented. An illustrative case was studied by the proposed model, and the results showed that the proposed model is feasible for designing biofuel supply chains under uncertainties. Copyright © 2015 Elsevier Ltd. All rights reserved.
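
    A minimal sketch of the basic interval linear programming idea: with interval objective coefficients and nonnegative variables, solving the problem at the lower and upper cost bounds brackets the optimal cost. The two-variable toy problem below is an assumption for illustration, not the supply-chain model, and full interval LP methods also treat interval coefficients in the constraints.

```python
import numpy as np
from scipy.optimize import linprog

# Toy interval LP: minimize c^T x with interval costs c in [c_lo, c_hi],
# subject to x1 + x2 >= 10 and x >= 0.  Solving at the two cost bounds
# brackets the optimal objective value.
c_lo = np.array([3.0, 5.0])      # e.g. cheapest feedstock and transport prices
c_hi = np.array([4.5, 6.5])      # most expensive prices
A_ub = np.array([[-1.0, -1.0]])  # -x1 - x2 <= -10  (demand constraint)
b_ub = np.array([-10.0])

best = linprog(c_lo, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 2)
worst = linprog(c_hi, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 2)
print(f"optimal life cycle cost lies in [{best.fun:.1f}, {worst.fun:.1f}]")
```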

  15. Uncertainty in artificial intelligence

    CERN Document Server

    Kanal, LN

    1986-01-01

    How to deal with uncertainty is a subject of much controversy in Artificial Intelligence. This volume brings together a wide range of perspectives on uncertainty, many of the contributors being the principal proponents in the controversy.Some of the notable issues which emerge from these papers revolve around an interval-based calculus of uncertainty, the Dempster-Shafer Theory, and probability as the best numeric model for uncertainty. There remain strong dissenting opinions not only about probability but even about the utility of any numeric method in this context.

  16. Event Management for Teacher-Coaches: Risk and Supervision Considerations for School-Based Sports

    Science.gov (United States)

    Paiement, Craig A.; Payment, Matthew P.

    2011-01-01

    A professional sports event requires considerable planning in which years are devoted to the success of that single activity. School-based sports events do not have that luxury, because high schools across the country host athletic events nearly every day. It is not uncommon during the fall sports season for a combination of boys' and girls'…

  17. From low-level events to activities - A pattern-based approach

    NARCIS (Netherlands)

    Mannhardt, Felix; De Leoni, Massimiliano; Reijers, Hajo A.; Van Der Aalst, Wil M P; Toussaint, Pieter J.

    2016-01-01

    Process mining techniques analyze processes based on event data. A crucial assumption for process analysis is that events correspond to occurrences of meaningful activities. Often, low-level events recorded by information systems do not directly correspond to these. Abstraction methods, which

  18. Sunway Medical Laboratory Quality Control Plans Based on Six Sigma, Risk Management and Uncertainty.

    Science.gov (United States)

    Jairaman, Jamuna; Sakiman, Zarinah; Li, Lee Suan

    2017-03-01

    Sunway Medical Centre (SunMed) implemented Six Sigma, measurement uncertainty, and risk management after the CLSI EP23 Individualized Quality Control Plan approach. Despite the differences in all three approaches, each implementation was beneficial to the laboratory, and none was in conflict with another approach. A synthesis of these approaches, built on a solid foundation of quality control planning, can help build a strong quality management system for the entire laboratory. Copyright © 2016 Elsevier Inc. All rights reserved.

  19. A scenario-based modeling approach for emergency evacuation management and risk analysis under multiple uncertainties.

    Science.gov (United States)

    Lv, Y; Huang, G H; Guo, L; Li, Y P; Dai, C; Wang, X W; Sun, W

    2013-02-15

    Nuclear emergency evacuation is important to prevent radioactive harm from hazardous materials and to limit the consequences of accidents; however, uncertainties are involved in the components and processes of such a management system. In this study, an interval-parameter joint-probabilistic integer programming (IJIP) method is developed for emergency evacuation management under uncertainties. Optimization techniques of interval-parameter programming (IPP) and joint-probabilistic constrained (JPC) programming are incorporated into an integer linear programming framework, so that the approach can deal with uncertainties expressed as joint probabilities and interval values. The IJIP method can schedule the optimal routes to guarantee that the maximum population is evacuated away from the affected zone within a finite time. Furthermore, it can also facilitate post-optimization analysis to enhance robustness in controlling the system violation risk imposed on the joint-probabilistic constraints. The developed method has been applied to a case study of nuclear emergency management; meanwhile, a number of scenarios under different system conditions have been analyzed. It is indicated that the solutions are useful for evacuation management practices. The result of the IJIP method can not only help to raise the capability of disaster responses in a systematic manner, but also provide an insight into complex relationships among evacuation planning, resource utilization, policy requirements and system risks. Copyright © 2012 Elsevier B.V. All rights reserved.

  20. Qualitative Event-based Diagnosis with Possible Conflicts Applied to Spacecraft Power Distribution Systems

    Data.gov (United States)

    National Aeronautics and Space Administration — Model-based diagnosis enables efficient and safe operation of engineered systems. In this paper, we describe two algorithms based on a qualitative event-based fault...

  1. NEBULAS a high performance data-driven event-building architecture based on an asynchronous self-routing packet-switching network

    CERN Document Server

    Christiansen, J; Letheren, M F; Marchioro, A; Tenhunen, H; Nummela, A; Nurmi, J; Gomes, P; Mandjavidze, I D; CERN. Geneva. Detector Research and Development Committee

    1992-01-01

    We propose a new approach to event building in future high rate experiments such as those at the LHC. We use a real-time, hierarchical event filtering paradigm based on pipelined triggering and data buffering at level 1, followed by farms of several hundreds of independent processors operating at level 2 and level 3. In view of the uncertainty in the rates and event sizes expected after the first level trigger in LHC experiments, it is important that data acquisition architectures can be open-endedly scaled to handle higher global bandwidths and accommodate more processors. We propose to apply the principle of self-routing packet-switching networks (currently under industrial development for telecommunications and multi-processor applications) to event building. We plan to implement a conceptually simple, distributed, asynchronous, data-driven, scalable, bottleneck-free architecture. An important feature of the architecture is that it can satisfy the data acquisition system's performance requirements using o...

  2. Gaussian Mixture Random Coefficient model based framework for SHM in structures with time-dependent dynamics under uncertainty

    Science.gov (United States)

    Avendaño-Valencia, Luis David; Fassois, Spilios D.

    2017-12-01

    The problem of vibration-based damage diagnosis in structures characterized by time-dependent dynamics under significant environmental and/or operational uncertainty is considered. A stochastic framework consisting of a Gaussian Mixture Random Coefficient model of the uncertain time-dependent dynamics under each structural health state, proper estimation methods, and Bayesian or minimum distance type decision making, is postulated. The Random Coefficient (RC) time-dependent stochastic model with coefficients following a multivariate Gaussian Mixture Model (GMM) allows for significant flexibility in uncertainty representation. Certain of the model parameters are estimated via a simple procedure which is founded on the related Multiple Model (MM) concept, while the GMM weights are explicitly estimated for optimizing damage diagnostic performance. The postulated framework is demonstrated via damage detection in a simple simulated model of a quarter-car active suspension with time-dependent dynamics and considerable uncertainty on the payload. Comparisons with a simpler Gaussian RC model based method are also presented, with the postulated framework shown to be capable of offering considerable improvement in diagnostic performance.
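
    A minimal sketch of the likelihood-based decision step described above: fit one Gaussian mixture to coefficient features from each health state and classify a new observation by comparing the (log-)likelihoods. The synthetic two-dimensional features below stand in for the random-coefficient model parameters and are assumptions for illustration.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(8)

# Synthetic 2-D "model coefficient" features under two health states; in the
# framework these would come from the time-dependent random coefficient model.
healthy = np.vstack([rng.normal([0.0, 0.0], 0.5, (300, 2)),
                     rng.normal([2.0, 1.0], 0.4, (300, 2))])   # multimodal spread
damaged = np.vstack([rng.normal([0.8, -0.5], 0.5, (300, 2)),
                     rng.normal([3.0, 0.2], 0.4, (300, 2))])

gmm_h = GaussianMixture(n_components=2, random_state=0).fit(healthy)
gmm_d = GaussianMixture(n_components=2, random_state=0).fit(damaged)

# Bayesian-type decision for a new measurement: pick the state with the
# larger average log-likelihood (equal priors assumed here).
x_new = np.array([[2.7, 0.1]])
ll_h, ll_d = gmm_h.score(x_new), gmm_d.score(x_new)
print("log-likelihoods  healthy: %.2f  damaged: %.2f" % (ll_h, ll_d))
print("diagnosis:", "damaged" if ll_d > ll_h else "healthy")
```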

  3. Probability-based Prevention of Voltage Violation and Momentary Interruption due to Uncertainty of Renewable Energy Resources

    Science.gov (United States)

    Chaitusaney, Surachai; Yokoyama, Akihiko

    Distributed Generation (DG) provides many advantages to distribution systems. However, the presence of DG units usually causes voltage fluctuation in the systems, and can even cause voltage violation. A number of studies have paid attention to this issue, which has been widely known for years. A further consequence, highlighted in this paper, is momentary electricity interruption due to voltage violation. To prevent damage resulting from voltage violation, sensitive loads are normally equipped with over/under-voltage relays. As a result, the more frequently bus voltages violate their limits, the more momentary electricity interruptions tend to occur. In this paper, the Momentary Average Interruption Event Frequency Index (MAIFIE) is evaluated to study this influence. In addition, uncertainty from renewable resources is integrated into the proposed voltage regulation method by using Probabilistic Load Flow (PLF). For the IEEE 34-bus test system, the numerical examples show that the uncertainty from renewable energy resources causes considerable MAIFIE values, and the proposed voltage regulation method can serve as an effective prevention measure for this problem.

  4. Uncertainty in runoff based on Global Climate Model precipitation and temperature data - Part 1: Assessment of Global Climate Models

    Science.gov (United States)

    McMahon, T. A.; Peel, M. C.; Karoly, D. J.

    2014-05-01

    Two key sources of uncertainty in projections of future runoff for climate change impact assessments are uncertainty between Global Climate Models (GCMs) and within a GCM. Uncertainty between GCM projections of future climate can be assessed through analysis of runs of a given scenario from a wide range of GCMs. Within GCM uncertainty is the variability in GCM output that occurs when running a scenario multiple times but each run has slightly different, but equally plausible, initial conditions. The objective of this, the first of two complementary papers, is to reduce between-GCM uncertainty by identifying and removing poorly performing GCMs prior to the analysis presented in the second paper. Here we assess how well 46 runs from 22 Coupled Model Intercomparison Project phase 3 (CMIP3) GCMs are able to reproduce observed precipitation and temperature climatological statistics. The performance of each GCM in reproducing these statistics was ranked and better performing GCMs identified for later analyses. Observed global land surface precipitation and temperature data were drawn from the CRU 3.10 gridded dataset and re-sampled to the resolution of each GCM for comparison. Observed and GCM based estimates of mean and standard deviation of annual precipitation, mean annual temperature, mean monthly precipitation and temperature and Köppen climate type were compared. The main metrics for assessing GCM performance were the Nash-Sutcliffe efficiency index and RMSE between modelled and observed long-term statistics. This information combined with a literature review of the performance of the CMIP3 models identified the following five models as the better performing models for the next phase of our analysis in assessing the uncertainty in runoff estimated from GCM projections of precipitation and temperature: HadCM3 (Hadley Centre for Climate Prediction and Research), MIROCM (Center for Climate System Research (The University of Tokyo), National Institute for
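
    The two ranking metrics named above are straightforward to compute; minimal reference implementations are sketched below with hypothetical precipitation values rather than the CRU or CMIP3 data.

```python
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is a perfect match, 0 means the model does
    no better than the observed mean, and negative values are worse than the mean."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def rmse(obs, sim):
    """Root-mean-square error between modelled and observed statistics."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return np.sqrt(np.mean((obs - sim) ** 2))

# Hypothetical example: observed vs GCM-derived mean annual precipitation (mm)
obs = [820, 640, 1210, 300, 950]
gcm = [780, 700, 1150, 340, 990]
print(f"NSE = {nse(obs, gcm):.2f},  RMSE = {rmse(obs, gcm):.1f} mm")
```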

  5. Monte-Carlo based Uncertainty Analysis For CO2 Laser Microchanneling Model

    Science.gov (United States)

    Prakash, Shashi; Kumar, Nitish; Kumar, Subrata

    2016-09-01

    CO2 laser microchanneling has emerged as a potential technique for the fabrication of microfluidic devices on PMMA (poly-methyl-methacrylate). PMMA directly vaporizes when subjected to a high-intensity focused CO2 laser beam. This process results in a clean cut and an acceptable surface finish on the microchannel walls. Overall, the CO2 laser microchanneling process is cost-effective and easy to implement. While fabricating microchannels on PMMA using a CO2 laser, the maximum depth of the fabricated microchannel is the key feature. A few analytical models are available to predict the maximum depth of the microchannels and the cut channel profile on a PMMA substrate using a CO2 laser. These models depend upon the values of the thermophysical properties of PMMA and the laser beam parameters. There are a number of variants of transparent PMMA available in the market with different values of thermophysical properties. Therefore, for applying such analytical models, the values of these thermophysical properties are required to be known exactly. Although the values of the laser beam parameters are readily available, extensive experiments are required to determine the values of the thermophysical properties of PMMA. The unavailability of exact values of these property parameters restricts proper control over the microchannel dimensions for a given power and scanning speed of the laser beam. In order to have dimensional control over the maximum depth of fabricated microchannels, it is necessary to have an idea of the uncertainty associated with the predicted microchannel depth. In this research work, the uncertainty associated with the maximum depth dimension has been determined using the Monte Carlo method (MCM). The propagation of uncertainty with different power and scanning speed has been predicted. The relative impact of each thermophysical property has been determined using sensitivity analysis.

  6. Microseismic Event Grouping Based on PageRank Linkage at the Newberry Volcano Geothermal Site

    Science.gov (United States)

    Aguiar, A. C.; Myers, S. C.

    2016-12-01

    The Newberry Volcano DOE FORGE site in Central Oregon has been stimulated two times using high-pressure fluid injection to study the Enhanced Geothermal Systems (EGS) technology. Several hundred microseismic events were generated during the first stimulation in the fall of 2012. Initial locations of this microseismicity do not show well defined subsurface structure in part because event location uncertainties are large (Foulger and Julian, 2013). We focus on this stimulation to explore the spatial and temporal development of microseismicity, which is key to understanding how subsurface stimulation modifies stress, fractures rock, and increases permeability. We use PageRank, Google's initial search algorithm, to determine connectivity within the events (Aguiar and Beroza, 2014) and assess signal-correlation topology for the micro-earthquakes. We then use this information to create signal families and compare these to the spatial and temporal proximity of associated earthquakes. We relocate events within families (identified by PageRank linkage) using the Bayesloc approach (Myers et al., 2007). Preliminary relocations show tight spatial clustering of event families as well as evidence of events relocating to a different cluster than originally reported. We also find that signal similarity (linkage) at several stations, not just one or two, is needed in order to determine that events are in close proximity to one another. We show that indirect linkage of signals using PageRank is a reliable way to increase the number of events that are confidently determined to be similar to one another, which may lead to efficient and effective grouping of earthquakes with similar physical characteristics, such as focal mechanisms and stress drop. Our ultimate goal is to determine whether changes in the state of stress and/or changes in the generation of subsurface fracture networks can be detected using PageRank topology as well as aid in the event relocation to obtain more accurate
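
    A minimal sketch of the linkage idea described above: event waveforms are connected in a graph when their correlation exceeds a threshold, and PageRank scores indicate how strongly each event is linked into a signal family. The synthetic waveforms, zero-lag correlation, threshold and use of networkx are assumptions for illustration, not the authors' processing chain.

```python
import numpy as np
import networkx as nx

rng = np.random.default_rng(9)

# Synthetic "event waveforms": two families share a template, plus one outlier.
t = np.linspace(0, 1, 200)
template_a = np.sin(2 * np.pi * 12 * t) * np.exp(-4 * t)
template_b = np.sin(2 * np.pi * 7 * t) * np.exp(-6 * t)
waveforms = [template_a + 0.15 * rng.normal(size=t.size) for _ in range(5)] + \
            [template_b + 0.15 * rng.normal(size=t.size) for _ in range(4)] + \
            [rng.normal(size=t.size)]                    # uncorrelated event

def ncc(a, b):
    """Zero-mean normalized correlation at zero lag."""
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    return float(np.mean(a * b))

# Graph: connect event pairs whose correlation exceeds a threshold.
G = nx.Graph()
G.add_nodes_from(range(len(waveforms)))
for i in range(len(waveforms)):
    for j in range(i + 1, len(waveforms)):
        c = ncc(waveforms[i], waveforms[j])
        if c > 0.5:
            G.add_edge(i, j, weight=c)

rank = nx.pagerank(G, weight="weight")   # well-connected events rank highest
families = list(nx.connected_components(G))
print("signal families:", families)
print("PageRank scores:", {k: round(v, 3) for k, v in rank.items()})
```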

  7. Evaluation of satellite and reanalysis-based global net surface energy flux and uncertainty estimates

    Science.gov (United States)

    Allan, Richard; Liu, Chunlei

    2017-04-01

    The net surface energy flux is central to the climate system yet observational limitations lead to substantial uncertainty (Trenberth and Fasullo, 2013; Roberts et al., 2016). A combination of satellite-derived radiative fluxes at the top of atmosphere (TOA), adjusted using the latest estimation of the net heat uptake of the Earth system, and the atmospheric energy tendencies and transports from the ERA-Interim reanalysis are used to estimate the surface energy flux globally (Liu et al., 2015). Land surface fluxes are adjusted through a simple energy balance approach using relations at each grid point with the consideration of snowmelt to improve regional realism. The energy adjustment is redistributed over the oceans using a weighting function to avoid meridional discontinuities. Uncertainties in surface fluxes are investigated using a variety of approaches including comparison with a range of atmospheric reanalysis input data and products. Zonal multiannual mean surface flux uncertainty is estimated to be less than 5 Wm-2, but much larger uncertainty is likely for regional monthly values. The meridional energy transport is calculated using the net surface heat fluxes estimated in this study, and the result shows better agreement with observations in the Atlantic than before. The derived turbulent fluxes (the difference between the net heat flux and the CERES EBAF radiative flux at the surface) also agree well with those from the OAFLUX dataset and buoy observations. Decadal changes in the global energy budget and the hemispheric energy imbalances are quantified, and the present-day cross-equatorial heat transport is re-evaluated as 0.22±0.15 PW southward by the atmosphere and 0.32±0.16 PW northward by the ocean, considering the observed ocean heat sinks (Roemmich et al., 2006). Liu et al. (2015) Combining satellite observations and reanalysis energy transports to estimate global net surface energy fluxes 1985-2012. J. Geophys. Res., Atmospheres. ISSN 2169-8996 doi: 10.1002/2015JD

  8. Applying an animal model to quantify the uncertainties of an image-based 4D-CT algorithm

    Science.gov (United States)

    Pierce, Greg; Wang, Kevin; Battista, Jerry; Lee, Ting-Yim

    2012-06-01

    The purpose of this paper is to use an animal model to quantify the spatial displacement uncertainties and test the fundamental assumptions of an image-based 4D-CT algorithm in vivo. Six female Landrace cross pigs were ventilated and imaged using a 64-slice CT scanner (GE Healthcare) operating in axial cine mode. The breathing amplitude pattern of the pigs was varied by periodically crimping the ventilator gas return tube during the image acquisition. The image data were used to determine the displacement uncertainties that result from matching CT images at the same respiratory phase using normalized cross correlation (NCC) as the matching criterion. Additionally, the ability to match the respiratory phase of a 4.0 cm subvolume of the thorax to a reference subvolume using only a single overlapping 2D slice from the two subvolumes was tested by varying the location of the overlapping matching image within the subvolume and examining the effect this had on the displacement relative to the reference volume. The displacement uncertainty resulting from matching two respiratory images using NCC ranged from 0.54 ± 0.10 mm per match to 0.32 ± 0.16 mm per match in the lung of the animal. The uncertainty was found to propagate in quadrature, increasing with the number of NCC matches performed. In comparison, the minimum displacement achievable if two respiratory images were matched perfectly in phase ranged from 0.77 ± 0.06 to 0.93 ± 0.06 mm in the lung. The assumption that subvolumes from separate cine scans can be matched by matching a single overlapping 2D image between the two subvolumes was validated. An in vivo animal model was developed to test an image-based 4D-CT algorithm. The uncertainties associated with using NCC to match the respiratory phase of two images were quantified, and the assumption that a 4.0 cm 3D subvolume can be matched in respiratory phase by matching a single 2D image from the 3D subvolume was validated. The work in this paper shows the image-based 4D
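
    The matching criterion itself is simple to state; a minimal zero-mean normalized cross correlation sketch on synthetic slices is shown below. The image sizes and test images are assumptions for illustration.

```python
import numpy as np

def normalized_cross_correlation(img_a, img_b):
    """Zero-mean NCC between two equally sized images: 1.0 for identical
    images, values near 0 for unrelated ones."""
    a = img_a.astype(float).ravel()
    b = img_b.astype(float).ravel()
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    return float(np.mean(a * b))

# Toy check with synthetic "CT slices": a smooth reference slice, a slightly
# shifted noisy copy (similar respiratory phase) and an unrelated slice.
rng = np.random.default_rng(10)
y, x = np.mgrid[0:64, 0:64]
reference = np.exp(-((x - 32) ** 2 + (y - 32) ** 2) / 200.0)
similar = np.roll(reference, 2, axis=0) + 0.05 * rng.normal(size=(64, 64))
unrelated = rng.normal(size=(64, 64))
print("NCC(reference, similar)   =",
      round(normalized_cross_correlation(reference, similar), 2))
print("NCC(reference, unrelated) =",
      round(normalized_cross_correlation(reference, unrelated), 2))
```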

  9. Sustainability assessment of roadway projects under uncertainty using Green Proforma: An index-based approach

    Directory of Open Access Journals (Sweden)

    Adil Umer

    2016-12-01

    Full Text Available Growing environmental and socioeconomic concerns due to rapid urbanization, population growth and climate change impacts have motivated decision-makers to incorporate sustainable best practices for transportation infrastructure development and management. A “sustainable” transportation infrastructure implies that all the sustainability objectives (i.e., mobility, safety, resource efficiency, economy, ecological protection, environmental quality) are adequately met during the infrastructure life cycle. State-of-the-art sustainability rating tools contain the best practices for the sustainability assessment of infrastructure projects. Generally, the existing rating tools are not well equipped to handle uncertainties associated with data limitations and expert opinion and cannot effectively adapt to site specific constraints for reliable sustainability assessment. This paper presents the development of a customizable tool, called “Green Proforma”, for the sustainability assessment of roadway projects under uncertainties. For evaluating how well the project meets sustainability objectives, a hierarchical framework is used to develop the sustainability objective indices by aggregating the selected indicators with the help of the fuzzy synthetic evaluation technique. These indices are further aggregated to attain an overall sustainability index for a roadway project. To facilitate the decision makers, a “Roadway Project Sustainometer” has been developed to illustrate how well the roadway project is meeting its sustainability objectives. By linking the sustainability objectives to measurable indicators, the “Green Proforma” paves the way for a practical approach in sustainable planning and management of roadway projects.
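
    A minimal sketch of the hierarchical aggregation described above: indicator membership grades are combined with weights by a fuzzy synthetic evaluation (weighted-average) operator, defuzzified into an objective index, and the objective indices are aggregated again into an overall sustainability index. All weights and membership grades below are hypothetical.

```python
import numpy as np

# Hypothetical membership grades of three mobility indicators over five
# linguistic sustainability grades (very poor ... excellent); rows sum to 1.
R_mobility = np.array([[0.0, 0.1, 0.3, 0.4, 0.2],
                       [0.1, 0.2, 0.4, 0.2, 0.1],
                       [0.0, 0.0, 0.2, 0.5, 0.3]])
w_indicators = np.array([0.5, 0.2, 0.3])          # indicator weights (sum to 1)

# Fuzzy synthetic evaluation with the weighted-average operator: B = w . R
B_mobility = w_indicators @ R_mobility            # membership of the objective

# Defuzzify to a crisp objective index using grade scores 1..5
grade_scores = np.array([1, 2, 3, 4, 5])
mobility_index = B_mobility @ grade_scores
print("mobility objective index (1-5):", round(mobility_index, 2))

# Second-level aggregation: combine objective indices into an overall index.
objective_indices = np.array([mobility_index, 3.1, 4.2, 2.8, 3.6, 3.9])
w_objectives = np.array([0.2, 0.15, 0.2, 0.15, 0.15, 0.15])
print("overall sustainability index:", round(objective_indices @ w_objectives, 2))
```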

  10. Characterizing model uncertainties in the life cycle of lignocellulose-based ethanol fuels.

    Science.gov (United States)

    Spatari, Sabrina; MacLean, Heather L

    2010-11-15

    Renewable and low carbon fuel standards being developed at federal and state levels require an estimation of the life cycle carbon intensity (LCCI) of candidate fuels that can substitute for gasoline, such as second generation bioethanol. Estimating the LCCI of such fuels with a high degree of confidence requires the use of probabilistic methods to account for known sources of uncertainty. We construct life cycle models for the bioconversion of agricultural residue (corn stover) and energy crops (switchgrass) and explicitly examine uncertainty using Monte Carlo simulation. Using statistical methods to identify significant model variables from public data sets and Aspen Plus chemical process models, we estimate stochastic life cycle greenhouse gas (GHG) emissions for the two feedstocks combined with two promising fuel conversion technologies. The approach can be generalized to other biofuel systems. Our results show potentially high and uncertain GHG emissions for switchgrass-ethanol due to uncertain CO₂ flux from land use change and N₂O flux from N fertilizer. However, corn stover-ethanol, with its low-in-magnitude, tight-in-spread LCCI distribution, shows considerable promise for reducing life cycle GHG emissions relative to gasoline and corn-ethanol. Coproducts are important for reducing the LCCI of all ethanol fuels we examine.

  11. Formal Uncertainty and Dispersion of Single and Double Difference Models for GNSS-Based Attitude Determination.

    Science.gov (United States)

    Chen, Wen; Yu, Chao; Dong, Danan; Cai, Miaomiao; Zhou, Feng; Wang, Zhiren; Zhang, Lei; Zheng, Zhengqi

    2017-02-20

    With multi-antenna synchronized global navigation satellite system (GNSS) receivers, the single difference (SD) between two antennas is able to eliminate both satellite and receiver clock error, thus it becomes necessary to reconsider the equivalency problem between the SD and double difference (DD) models. In this paper, we quantitatively compared the formal uncertainties and dispersions between multiple SD models and the DD model, and also carried out static and kinematic short baseline experiments. The theoretical and experimental results show that under a non-common clock scheme the SD and DD model are equivalent. Under a common clock scheme, if we estimate stochastic uncalibrated phase delay (UPD) parameters every epoch, this SD model is still equivalent to the DD model, but if we estimate only one UPD parameter for all epochs or take it as a known constant, the SD (here called SD2) and DD models are no longer equivalent. For the vertical component of baseline solutions, the formal uncertainties of the SD2 model are two times smaller than those of the DD model, and the dispersions of the SD2 model are even more than twice smaller than those of the DD model. In addition, to obtain baseline solutions, the SD2 model requires a minimum of three satellites, while the DD model requires a minimum of four satellites, which makes the SD2 more advantageous in attitude determination under sheltered environments.

  12. Phenomena-based Uncertainty Quantification in Predictive Coupled- Physics Reactor Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Adams, Marvin [Texas A & M Univ., College Station, TX (United States)

    2017-06-12

    This project has sought to develop methodologies, tailored to phenomena that govern nuclearreactor behavior, to produce predictions (including uncertainties) for quantities of interest (QOIs) in the simulation of steady-state and transient reactor behavior. Examples of such predictions include, for each QOI, an expected value as well as a distribution around this value and an assessment of how much of the distribution stems from each major source of uncertainty. The project has sought to test its methodologies by comparing against measured experimental outcomes. The main experimental platform has been a 1-MW TRIGA reactor. This is a flexible platform for a wide range of experiments, including steady state with and without temperature feedback, slow transients with and without feedback, and rapid transients with strong feedback. The original plan was for the primary experimental data to come from in-core neutron detectors. We made considerable progress toward this goal but did not get as far along as we had planned. We have designed, developed, installed, and tested vertical guide tubes, each able to accept a detector or stack of detectors that can be moved axially inside the tube, and we have tested several new detector designs. One of these shows considerable promise.

  13. Formal Uncertainty and Dispersion of Single and Double Difference Models for GNSS-Based Attitude Determination

    Directory of Open Access Journals (Sweden)

    Wen Chen

    2017-02-01

    Full Text Available With multi-antenna synchronized global navigation satellite system (GNSS) receivers, the single difference (SD) between two antennas is able to eliminate both satellite and receiver clock error, thus it becomes necessary to reconsider the equivalency problem between the SD and double difference (DD) models. In this paper, we quantitatively compared the formal uncertainties and dispersions between multiple SD models and the DD model, and also carried out static and kinematic short baseline experiments. The theoretical and experimental results show that under a non-common clock scheme the SD and DD model are equivalent. Under a common clock scheme, if we estimate stochastic uncalibrated phase delay (UPD) parameters every epoch, this SD model is still equivalent to the DD model, but if we estimate only one UPD parameter for all epochs or take it as a known constant, the SD (here called SD2) and DD models are no longer equivalent. For the vertical component of baseline solutions, the formal uncertainties of the SD2 model are two times smaller than those of the DD model, and the dispersions of the SD2 model are even more than twice smaller than those of the DD model. In addition, to obtain baseline solutions, the SD2 model requires a minimum of three satellites, while the DD model requires a minimum of four satellites, which makes the SD2 more advantageous in attitude determination under sheltered environments.

  14. Possibility of reinforcement learning based on event-related potential.

    Science.gov (United States)

    Yamagishi, Yuya; Tsubone, Tadashi; Wada, Yasuhiro

    2008-01-01

    We applied the event-related potential (ERP) to reinforcement signals that are equivalent to reward and punishment signals. We conducted an electroencephalogram (EEG) experiment in which volunteers identified the success or failure of a task. We confirmed that there were differences in the EEG depending on whether the task was successful or not, suggesting that the ERP might be used as a reward signal for reinforcement learning. We used a support vector machine (SVM) to recognize the P300. The SVM feature vector was composed of 50 ms averages from each of six channels (C3, Cz, C4, P3, Pz, P4) over a total of 700 ms. We suggest that reinforcement learning using the P300 can be performed accurately.
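
    A hypothetical sketch of the kind of feature-extraction and classification pipeline the abstract describes (the sampling rate, epoch data, and labels below are invented, not the study's data): each channel is averaged in consecutive 50 ms windows over a 700 ms epoch, and the resulting vector is fed to an SVM.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

fs = 200                       # assumed sampling rate [Hz]
win = int(0.05 * fs)           # 50 ms window -> 10 samples
n_win = int(0.70 * fs) // win  # 700 ms epoch -> 14 windows
channels = ["C3", "Cz", "C4", "P3", "Pz", "P4"]

def features(epoch):
    """epoch: (6, 140) array of EEG samples -> (6 * 14,) vector of window means."""
    trimmed = epoch[:, : n_win * win]
    return trimmed.reshape(len(channels), n_win, win).mean(axis=2).ravel()

# Synthetic stand-in data: 100 epochs labelled success (1) / failure (0).
rng = np.random.default_rng(0)
X = np.array([features(rng.normal(size=(6, 140))) for _ in range(100)])
y = rng.integers(0, 2, size=100)

clf = SVC(kernel="rbf", C=1.0)
print(cross_val_score(clf, X, y, cv=5).mean())   # chance level on random data
```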

  15. Using participatory agent-based models to measure flood managers' decision thresholds in extreme event response

    Science.gov (United States)

    Metzger, A.; Douglass, E.; Gray, S. G.

    2016-12-01

    Extreme flooding impacts on coastal cities are not only a function of storm characteristics, but are heavily influenced by decision-making and preparedness in event-level response. While recent advances in climate and hydrological modeling make it possible to predict the influence of climate change on storm and flooding patterns, flood managers still face a great deal of uncertainty related to adapting organizational responses and decision thresholds to these changing conditions. Some decision thresholds related to mitigation of extreme flood impacts are well understood and defined by organizational protocol, but others are difficult to quantify due to reliance on contextual expert knowledge, experience, and the complexity of information necessary to make certain decisions. Our research attempts to address this issue by demonstrating participatory modeling methods designed to help flood managers (1) better understand and parameterize local decision thresholds in extreme flood management situations, (2) collectively learn about scaling management decision thresholds to future local flooding scenarios, and (3) identify effective strategies for adapting flood mitigation actions and organizational response to climate change-intensified flooding. Our agent-based system dynamics models rely on expert knowledge from local flood managers and sophisticated, climate change-informed hydrological models to simulate current and future flood scenarios. Local flood managers interact with these models by receiving dynamic information and making management decisions as a flood scenario progresses, allowing parameterization of decision thresholds under different scenarios. Flooding impacts are calculated in each iteration as a means of discussing the effectiveness of responses and prioritizing response alternatives. We discuss the findings of this participatory modeling and educational process from a case study of Boston, MA, and discuss the transferability of these methods to other types

  16. Calibration uncertainty

    DEFF Research Database (Denmark)

    Heydorn, Kaj; Anglov, Thomas

    2002-01-01

    Methods recommended by the International Standardization Organisation and Eurachem are not satisfactory for the correct estimation of calibration uncertainty. A novel approach is introduced and tested on actual calibration data for the determination of Pb by ICP-AES. The improved calibration...... uncertainty was verified from independent measurements of the same sample by demonstrating statistical control of analytical results and the absence of bias. The proposed method takes into account uncertainties of the measurement, as well as of the amount of calibrant. It is applicable to all types...
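
    Because the abstract above is truncated, the following is only a generic Monte Carlo sketch (not the authors' method) of how uncertainty in both the measured signals and the amounts of calibrant can be propagated through a straight-line calibration into the result for an unknown sample; all values and uncertainty levels are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

amount = np.array([0.0, 2.0, 4.0, 6.0, 8.0])      # calibrant amounts (e.g. mg/L)
u_amount = 0.02 * amount                          # assumed 2 % standard uncertainty
signal = np.array([0.01, 0.41, 0.79, 1.22, 1.60]) # instrument responses
u_signal = 0.01                                   # assumed absolute signal uncertainty

y_unknown, u_y_unknown = 0.95, 0.01               # response of the unknown sample

pred = []
for _ in range(10000):
    a = amount + rng.normal(0.0, u_amount)                    # perturb calibrant amounts
    s = signal + rng.normal(0.0, u_signal, size=signal.size)  # perturb responses
    slope, intercept = np.polyfit(a, s, 1)                    # straight-line calibration
    y = y_unknown + rng.normal(0.0, u_y_unknown)
    pred.append((y - intercept) / slope)                      # back-calculated concentration

pred = np.array(pred)
print(f"concentration = {pred.mean():.3f} +/- {pred.std(ddof=1):.3f}")
```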

  17. Asteroid! An Event-Based Science Module. Teacher's Guide. Astronomy Module.

    Science.gov (United States)

    Wright, Russell G.

    This book is designed for middle school earth science or general science teachers to help their students learn scientific literacy through event-based science. Unlike traditional curricula, the event-based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning, teamwork,…

  18. Asteroid! An Event-Based Science Module. Student Edition. Astronomy Module.

    Science.gov (United States)

    Wright, Russell G.

    This book is designed for middle school students to learn scientific literacy through event-based science. Unlike traditional curricula, the event-based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning, teamwork, independent research, hands-on investigations, and…

  19. Oil Spill! An Event-Based Science Module. Student Edition. Oceanography Module.

    Science.gov (United States)

    Wright, Russell G.

    This book is designed for middle school students to learn scientific literacy through event-based science. Unlike traditional curricula, the event-based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning, teamwork, independent research, hands-on investigations, and…

  20. Oil Spill!: An Event-Based Science Module. Teacher's Guide. Oceanography Module.

    Science.gov (United States)

    Wright, Russell G.

    This book is designed for middle school earth science or general science teachers to help their students learn scientific literacy through event-based science. Unlike traditional curricula, the event-based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning, teamwork,…

  1. Volcano!: An Event-Based Science Module. Teacher's Guide. Geology Module.

    Science.gov (United States)

    Wright, Russell G.

    This book is designed for middle school earth science teachers to help their students learn scientific literacy through event-based science. Unlike traditional curricula, the event-based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning, teamwork, independent research,…

  2. Volcano!: An Event-Based Science Module. Student Edition. Geology Module.

    Science.gov (United States)

    Wright, Russell G.

    This book is designed for middle school students to learn scientific literacy through event-based science. Unlike traditional curricula, the event-based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning, teamwork, independent research, hands-on investigations, and…

  3. Hurricane! An Event-Based Science Module. Student Edition. Meteorology Module.

    Science.gov (United States)

    Wright, Russell G.

    This book is designed for middle school students to learn scientific literacy through event-based science. Unlike traditional curricula, the event-based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning, teamwork, independent research, hands-on investigations, and…

  4. Hurricane!: An Event-Based Science Module. Teacher's Guide. Meteorology Module.

    Science.gov (United States)

    Wright, Russell G.

    This book is designed for middle school earth science teachers to help their students learn about problems with hurricanes and scientific literacy through event-based science. Unlike traditional curricula, the event-based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning,…

  5. Fraud! An Event-Based Science Module. Student Edition. Chemistry Module.

    Science.gov (United States)

    Wright, Russell G.

    This book is designed for middle school students to learn scientific literacy through event-based science. Unlike traditional curricula, the event-based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning, teamwork, independent research, hands-on investigations, and…

  6. Fraud! An Event-Based Science Module. Teacher's Guide. Chemistry Module.

    Science.gov (United States)

    Wright, Russell G.

    This book is designed for middle school life science or physical science teachers to help their students learn scientific literacy through event-based science. Unlike traditional curricula, the event-based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning, teamwork,…

  7. Automatic detection of esophageal pressure events. Is there an alternative to rule-based criteria?

    DEFF Research Database (Denmark)

    Kruse-Andersen, S; Rütz, K; Kolberg, Jens Godsk

    1995-01-01

    curves generated by muscular contractions, rule-based criteria do not always select the pressure events most relevant for further analysis. We have therefore been searching for a new concept for automatic event recognition. The present study describes a new system, based on the method of neurocomputing...

  8. Event detection using population-based health care databases in randomized clinical trials

    DEFF Research Database (Denmark)

    Thuesen, Leif; Jensen, Lisette Okkels; Tilsted, Hans Henrik

    2013-01-01

    To describe a new research tool, designed to reflect routine clinical practice and relying on population-based health care databases to detect clinical events in randomized clinical trials.

  9. Parton Shower Uncertainties with Herwig 7: Benchmarks at Leading Order

    CERN Document Server

    Bellm, Johannes; Plätzer, Simon; Schichtel, Peter; Siódmok, Andrzej

    2016-01-01

    We perform a detailed study of the sources of perturbative uncertainty in parton shower predictions within the Herwig 7 event generator. We benchmark two rather different parton shower algorithms, based on angular-ordered and dipole-type evolution, against each other. We deliberately choose leading order plus parton shower as the benchmark setting to identify a controllable set of uncertainties. This will enable us to reliably assess improvements by higher-order contributions in a follow-up work.

  10. Model-Checking of Component-Based Event-Driven Real-Time Embedded Software

    National Research Council Canada - National Science Library

    Gu, Zonghua; Shin, Kang G

    2005-01-01

    .... We discuss application of model-checking to verify system-level concurrency properties of component-based real-time embedded software based on CORBA Event Service, using Avionics Mission Computing...

  11. Customization of UWB 3D-RTLS Based on the New Uncertainty Model of the AoA Ranging Technique

    Directory of Open Access Journals (Sweden)

    Bartosz Jachimczyk

    2017-01-01

    Full Text Available The increased potential and effectiveness of Real-time Locating Systems (RTLSs) substantially influence their application spectrum. They are widely used, inter alia, in the industrial sector, healthcare, home care, and in logistic and security applications. The research aims to develop an analytical method to customize UWB-based RTLSs in order to improve their localization performance in terms of accuracy and precision. The analytical uncertainty model of Angle of Arrival (AoA) localization in a 3D indoor space, which is the foundation of the customization concept, is established in a working environment. Additionally, a suitable angular-based 3D localization algorithm is introduced. The paper investigates the following issues: the influence of the proposed correction vector on the localization accuracy, and the impact of the system's configuration and the location sensors' relative deployment on the localization precision distribution map. The advantages of the method are verified by comparison with a reference commercial RTLS localization engine. The results of simulations and physical experiments prove the value of the proposed customization method. The research confirms that the analytical uncertainty model is a valid representation of the RTLS's localization uncertainty in terms of accuracy and precision and can be useful for performance improvement. The research shows that Angle of Arrival localization in a 3D indoor space, applying the simple angular-based localization algorithm and correction vector, improves localization accuracy and precision to the point where the system rivals the reference hardware's advanced localization engine. Moreover, the research guides the deployment of location sensors to enhance localization precision.
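
    A hedged sketch of one angular-based localization step of the kind mentioned above (not the paper's exact algorithm or correction vector): each location sensor provides an azimuth/elevation pair, i.e. a line through the sensor, and the tag position is taken as the least-squares intersection of those lines. Sensor positions and the true tag position below are assumed values.

```python
import numpy as np

def aoa_position(sensors, azimuths, elevations):
    """Least-squares intersection of bearing lines; sensors: (n, 3), angles in radians."""
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for p, az, el in zip(sensors, azimuths, elevations):
        d = np.array([np.cos(el) * np.cos(az),
                      np.cos(el) * np.sin(az),
                      np.sin(el)])             # unit direction of arrival
        P = np.eye(3) - np.outer(d, d)         # projector orthogonal to the bearing line
        A += P
        b += P @ p
    return np.linalg.solve(A, b)

sensors = np.array([[0.0, 0.0, 3.0], [10.0, 0.0, 3.0], [0.0, 10.0, 3.0]])
true = np.array([4.0, 5.0, 1.2])               # assumed tag position [m]
vec = true - sensors
az = np.arctan2(vec[:, 1], vec[:, 0])
el = np.arcsin(vec[:, 2] / np.linalg.norm(vec, axis=1))
print(aoa_position(sensors, az, el))           # recovers approximately [4, 5, 1.2]
```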

  12. Uncertainty Estimator based Nonlinear Feedback Control for Tracking Trajectories in a Class of Continuous Bioreactor

    Directory of Open Access Journals (Sweden)

    Maria Isabel Neria-Gonzále

    2015-04-01

    Full Text Available The main goal of this work is to present an alternative design of a class of nonlinear controllers for tracking trajectories in a class of continuous bioreactors. It is assumed that the reaction rate of the controlled variable is unknown; therefore, an uncertainty estimator is proposed to infer this important term, and the observer is coupled with a class of nonlinear feedback. The considered controller contains a class of continuous sigmoid feedback in order to provide a smooth closed-loop response of the considered bioreactor. A kinetic model of a sulfate-reducing system is experimentally corroborated and employed as a benchmark for further modeling and simulation of the continuous operation. A linear PI controller, a class of sliding-mode controller, and the proposed controller are compared, and it is shown that the proposed controller yields the best performance. The closed-loop behavior of the process is analyzed via numerical experiments.
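
    A minimal numerical sketch of the general idea described above, using a generic Monod chemostat rather than the paper's sulfate-reducing model: the substrate consumption rate is treated as unknown, estimated online by a simple uncertainty observer, and a smooth tanh (sigmoid-like) feedback manipulates the dilution rate to track a substrate set point. All kinetic parameters, gains, and initial conditions are assumptions for illustration only.

```python
import numpy as np

mu_max, Ks, Y, s_in = 0.4, 1.0, 0.5, 10.0     # assumed kinetics and feed concentration
k1, k2 = 5.0, 6.0                             # observer gains
g, eps, D_max = 0.8, 0.2, 1.0                 # controller gain, sigmoid width, input bound
s_ref = 2.0                                   # substrate set point

dt, T = 0.001, 40.0
x, s = 0.5, 8.0                               # biomass, substrate
s_hat, eta_hat = s, 0.0                       # observer states

for _ in range(int(T / dt)):
    # controller: feedback with the estimated (unknown) reaction rate and tanh feedback
    D = (eta_hat - g * np.tanh((s - s_ref) / eps)) / (s_in - s)
    D = float(np.clip(D, 0.0, D_max))
    # true plant (its reaction rate is hidden from the controller)
    mu = mu_max * s / (Ks + s)
    eta = mu * x / Y                          # substrate consumption rate
    x += dt * (mu - D) * x
    s += dt * (D * (s_in - s) - eta)
    # uncertainty estimator (reduced-order observer for eta)
    e = s - s_hat
    s_hat += dt * (D * (s_in - s) - eta_hat + k1 * e)
    eta_hat += dt * (-k2 * e)

print(f"substrate s = {s:.3f} (set point {s_ref}), estimated rate = {eta_hat:.3f}")
```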

  13. Module-based Hybrid Uncertainty Quantification for Multi-physics Applications: Theory and Software

    Energy Technology Data Exchange (ETDEWEB)

    Tong, Charles [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Chen, Xiao [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Iaccarino, Gianluca [Stanford Univ., CA (United States); Mittal, Akshay [Stanford Univ., CA (United States)

    2013-10-08

    In this project we proposed to develop an innovative uncertainty quantification methodology that captures the best of the two competing approaches in UQ, namely, intrusive and non-intrusive approaches. The idea is to develop the mathematics and the associated computational framework and algorithms to facilitate the use of intrusive or non-intrusive UQ methods in different modules of a multi-physics, multi-module simulation model, in a way that physics code developers for different modules are shielded (as much as possible) from the chores of accounting for the uncertainties introduced by the other modules. As a result of our research and development, we have produced a number of publications, conference presentations, and a software product.

  14. A PCE-based multiscale framework for the characterization of uncertainties in complex systems

    Science.gov (United States)

    Mehrez, Loujaine; Fish, Jacob; Aitharaju, Venkat; Rodgers, Will R.; Ghanem, Roger

    2017-11-01

    This paper presents a framework for the modeling and analysis of material systems that exhibit uncertainties in their constituents at all scales. The framework integrates multiscale formalism with a polynomial chaos construction enabling an explicit representation of quantities of interest, at any scale, in terms of any form of underlying uncertain parameters, a key feature for modeling multiscale dependencies. It is demonstrated how the framework can successfully tackle settings where a hierarchy of scales must be explicitly modeled. The application of this framework is illustrated in the construction of stochastic models of mesoscale and macroscale properties of non-crimp fabric composites. Joint statistical properties of upscaled components of the composite, including properties of the tow, laminae, and laminate, are computed.
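
    A generic, one-dimensional sketch of a polynomial chaos expansion (the paper builds far richer multiscale, multivariate constructions): a quantity of interest q(xi) depending on a standard normal parameter is expanded in probabilists' Hermite polynomials, with coefficients obtained by Gauss-Hermite projection. The toy function q below is invented.

```python
import numpy as np
from math import factorial
from numpy.polynomial.hermite_e import hermegauss, hermeval

def q(xi):
    # toy "upscaled property" as a function of a standard normal parameter xi
    return np.exp(0.3 * xi) + 0.1 * xi**2

order, nquad = 6, 40
nodes, weights = hermegauss(nquad)            # quadrature for weight exp(-x^2/2)
weights = weights / np.sqrt(2.0 * np.pi)      # normalize to the standard normal pdf

coeffs = np.zeros(order + 1)
for k in range(order + 1):
    ek = np.zeros(k + 1); ek[k] = 1.0
    Hek = hermeval(nodes, ek)                 # He_k evaluated at the quadrature nodes
    coeffs[k] = np.sum(weights * q(nodes) * Hek) / factorial(k)  # E[He_k^2] = k!

# Mean and variance of the quantity of interest follow directly from the coefficients.
mean = coeffs[0]
var = sum(coeffs[k] ** 2 * factorial(k) for k in range(1, order + 1))
print(mean, var)
```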

  15. Tighter Einstein-Podolsky-Rosen steering inequality based on the sum-uncertainty relation

    Science.gov (United States)

    Maity, Ananda G.; Datta, Shounak; Majumdar, A. S.

    2017-11-01

    We consider the uncertainty bound on the sum of variances of two incompatible observables in order to derive a corresponding steering inequality. Our steering criterion, when applied to discrete variables, yields the optimum steering range for two-qubit Werner states in the two-measurement and two-outcome scenario. We further employ the derived steering relation for several classes of continuous-variable systems. We show that non-Gaussian entangled states such as the photon-subtracted squeezed vacuum state and the two-dimensional harmonic-oscillator state furnish greater violation of the sum steering relation compared to the Reid criterion as well as the entropic steering criterion. The sum steering inequality provides a tighter steering condition to reveal the steerability of continuous-variable states.
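
    The logic sketched above can be illustrated with a generic qubit example (this is an illustration of the general sum-variance approach, not the specific inequality derived in the paper):

```latex
% Generic illustration, not the paper's inequality: every qubit state satisfies the
% sum-uncertainty relation Var(\sigma_x) + Var(\sigma_y) >= 1. If Bob's conditional
% states admitted a local-hidden-state model, concavity of the variance would force
% the measurement-inferred (conditional) variances to obey the same bound,
\[
  \sum_{a} p(a)\,\operatorname{Var}\!\bigl(\sigma_x^{B} \mid A = a\bigr)
  \;+\; \sum_{b} p(b)\,\operatorname{Var}\!\bigl(\sigma_y^{B} \mid B = b\bigr) \;\ge\; 1,
\]
% so any violation of this sum-variance bound certifies steering from Alice to Bob.
```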

  16. You Never Walk Alone: Recommending Academic Events Based on Social Network Analysis

    Science.gov (United States)

    Klamma, Ralf; Cuong, Pham Manh; Cao, Yiwei

    Combining Social Network Analysis and recommender systems is a challenging research field. In scientific communities, recommender systems have been applied to provide useful tools for paper, book, and expert finding. However, academic events (conferences, workshops, international symposiums, etc.) are an important driving force for cooperation among research communities. We realize an SNA-based approach to the academic event recommendation problem. Scientific community analysis and visualization are performed to provide insight into the communities of event series. A prototype is implemented based on data from DBLP and EventSeer.net, and the results are examined in order to validate the approach.

  17. Uncertainty-Based Map Matching: The Space-Time Prism and k-Shortest Path Algorithm

    Directory of Open Access Journals (Sweden)

    Bart Kuijpers

    2016-11-01

    Full Text Available Location-aware devices can be used to record the positions of moving objects for further spatio-temporal data analysis. For instance, we can analyze the routes followed by a person or a group of people, to discover hidden patterns in trajectory data. Typically, the positions of moving objects are registered by GPS devices, and most of the time, the recorded positions do not match the road actually followed by the object carrying the device, due to different sources of errors. Thus, matching the moving object’s actual position to a location on a digital map is required. The problem of matching GPS-recorded positions to a road network is called map matching (MM. Although many algorithms have been proposed to solve this problem, few of them consider the uncertainty caused by the absence of information about the moving object’s position in-between consecutive recorded locations. In this paper, we study the relationship between map matching and uncertainty, and we propose a novel MM algorithm that uses space-time prisms in combination with weighted k-shortest path algorithms. We applied our algorithm to real-world cases and to computer-generated trajectory samples with a variety of properties. We compare our results against a number of well-known algorithms that we have also implemented and show that it outperforms existing algorithms, allowing us to obtain better matches, with a negligible loss in performance. In addition, we propose a novel accuracy measure that allows a better comparison between different MM algorithms. We applied this novel measure to compare our algorithm against existing algorithms.
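
    A small sketch of the k-shortest-path ingredient mentioned above (the paper combines it with space-time prisms): candidate routes between two map-matched anchor nodes are enumerated in order of increasing travel time using Yen's algorithm as implemented in networkx. The toy road graph and weights below are invented.

```python
import itertools
import networkx as nx

G = nx.DiGraph()
edges = [("A", "B", 4.0), ("B", "D", 6.0), ("A", "C", 3.0),
         ("C", "D", 8.0), ("B", "C", 1.0), ("C", "B", 1.0)]
G.add_weighted_edges_from(edges, weight="travel_time")

def k_shortest_paths(graph, source, target, k):
    """Return up to k loopless paths ordered by total travel time."""
    gen = nx.shortest_simple_paths(graph, source, target, weight="travel_time")
    return list(itertools.islice(gen, k))

for path in k_shortest_paths(G, "A", "D", 3):
    cost = sum(G[u][v]["travel_time"] for u, v in zip(path, path[1:]))
    print(path, cost)

# A space-time prism check would then discard any candidate whose travel time
# exceeds the time budget between two consecutive GPS fixes.
```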

  18. Nanoparticles: Uncertainty Risk Analysis

    DEFF Research Database (Denmark)

    Grieger, Khara Deanne; Hansen, Steffen Foss; Baun, Anders

    2012-01-01

    Scientific uncertainty plays a major role in assessing the potential environmental risks of nanoparticles. Moreover, there is uncertainty within fundamental data and information regarding the potential environmental and health risks of nanoparticles, hampering risk assessments based on standard...... approaches. To date, there have been a number of different approaches to assess uncertainty of environmental risks in general, and some have also been proposed in the case of nanoparticles and nanomaterials. In recent years, others have also proposed that broader assessments of uncertainty are also needed...... in order to handle the complex potential risks of nanoparticles, including more descriptive characterizations of uncertainty. Some of these approaches are presented and discussed herein, in which the potential strengths and limitations of these approaches are identified along with further challenges...

  19. Quantifying uncertainties in tracer-based hydrograph separations: a case study for two-, three- and five-component hydrograph separations in a mountainous catchment

    Science.gov (United States)

    Uhlenbrook, Stefan; Hoeg, Simon

    2003-02-01

    The hydrograph separation technique using natural tracers, in which different runoff components are quantified according to their chemical signature, is a widely used method for investigating runoff generation processes at the catchment scale. The first objective of this study is to demonstrate a modified methodology for separating three and five runoff components using 18O and dissolved silica as tracers. The second is to evaluate, with an uncertainty propagation technique using Gaussian error estimators, the hydrograph separation uncertainties that arise due to different error effects. During four summer storm events, an interaction among three main runoff components having distinct dissolved silica concentrations was demonstrated for the mountainous Zastler catchment (18.4 km2, southern Black Forest Mountains, southwest Germany). The three main runoff components are surface storage (low silica, saturated and impermeable areas), shallow ground water (medium silica, periglacial and glacial drift cover), and deep ground water (high silica, crystalline detritus and hard rock aquifer). Together with the event and pre-event water fractions of surface runoff and shallow ground water runoff, five runoff components are considered in all. Pre-event water from shallow ground water storage dominated the total discharge during floods and was also important during low flows. Event water from shallow ground water was detectable only during the falling limb of a larger flood with high antecedent moisture conditions and during the peaks of three events with low antecedent moisture conditions. Runoff from surface storage is only significant during floods and can be composed of event and pre-event water. The latter reacts later and is important only during the peak of the large event with high antecedent moisture conditions. Runoff from the deeper ground water behaves quite consistently (pure pre-event water). It is demonstrated that large relative uncertainties must be considered
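
    A minimal illustration of a two-component tracer separation with Gaussian (first-order) error propagation, in the spirit of the uncertainty technique mentioned above; the tracer concentrations and uncertainties below are invented, not the Zastler catchment data.

```python
import numpy as np

c_total, u_total = 4.8, 0.2     # dissolved silica in streamwater [mg/L]
c_event, u_event = 1.0, 0.3     # event-water end member
c_pre,   u_pre   = 6.0, 0.4     # pre-event-water end member

# Mass balance Q_t*C_t = Q_e*C_e + Q_p*C_p with Q_t = Q_e + Q_p gives the
# event-water fraction f_e = (C_t - C_p) / (C_e - C_p).
f_e = (c_total - c_pre) / (c_event - c_pre)

# Gaussian error propagation through the partial derivatives of f_e.
d_total = 1.0 / (c_event - c_pre)
d_event = -(c_total - c_pre) / (c_event - c_pre) ** 2
d_pre = (c_total - c_event) / (c_event - c_pre) ** 2
u_f = np.sqrt((d_total * u_total) ** 2 + (d_event * u_event) ** 2 + (d_pre * u_pre) ** 2)

print(f"event-water fraction = {f_e:.2f} +/- {u_f:.2f}")
```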

  20. A Statistical Framework for Microbial Source Attribution: Measuring Uncertainty in Host Transmission Events Inferred from Genetic Data (Part 2 of a 2 Part Report)

    Energy Technology Data Exchange (ETDEWEB)

    Allen, J; Velsko, S

    2009-11-16

    This report explores the question of whether meaningful conclusions can be drawn regarding the transmission relationship between two microbial samples on the basis of differences observed between the two samples' respective genomes. Unlike similar forensic applications using human DNA, the rapid rate of microbial genome evolution combined with the dynamics of infectious disease requires a shift in thinking on what it means for two samples to 'match' in support of a forensic hypothesis. Previous outbreaks of SARS-CoV, FMDV and HIV were examined to investigate the question of how microbial sequence data can be used to draw inferences that link two infected individuals by direct transmission. The results are counterintuitive with respect to human DNA forensic applications in that some genetic change, rather than exact matching, improves confidence in inferring direct transmission links; however, too much genetic change poses challenges, which can weaken confidence in inferred links. High rates of infection coupled with relatively weak selective pressure observed in the SARS-CoV and FMDV data lead to fairly low confidence for direct transmission links. Confidence values for forensic hypotheses increased when testing for the possibility that samples are separated by at most a few intermediate hosts. Moreover, the observed outbreak conditions support the potential to provide high confidence values for hypotheses that exclude direct transmission links. Transmission inferences are based on the total number of observed or inferred genetic changes separating two sequences rather than uniquely weighing the importance of any one genetic mismatch. Thus, inferences are surprisingly robust in the presence of sequencing errors provided the error rates are randomly distributed across all samples in the reference outbreak database and the novel sequence samples in question. When the number of observed nucleotide mutations is limited due to characteristics of the

  1. Event-Based Proof of the Mutual Exclusion Property of Peterson’s Algorithm

    Directory of Open Access Journals (Sweden)

    Ivanov Ievgen

    2015-12-01

    Full Text Available Proving properties of distributed algorithms is still a highly challenging problem and various approaches that have been proposed to tackle it [1] can be roughly divided into state-based and event-based proofs. Informally speaking, state-based approaches define the behavior of a distributed algorithm as a set of sequences of memory states during its executions, while event-based approaches treat the behaviors by means of events which are produced by the executions of an algorithm. Of course, combined approaches are also possible.
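
    For reference, a minimal executable encoding of Peterson's algorithm for two threads (the record above is about proving its mutual-exclusion property, not about this particular encoding; CPython's interpreter makes the shared reads and writes effectively sequentially consistent, which this sketch relies on).

```python
import threading

flag = [False, False]
turn = 0
counter = 0          # shared resource touched only inside the critical section

def worker(i):
    global turn, counter
    other = 1 - i
    for _ in range(10000):
        flag[i] = True          # announce intent to enter
        turn = other            # give priority to the other thread
        while flag[other] and turn == other:
            pass                # busy-wait while the other thread has priority
        counter += 1            # critical section
        flag[i] = False         # exit protocol

threads = [threading.Thread(target=worker, args=(i,)) for i in (0, 1)]
for t in threads: t.start()
for t in threads: t.join()
print(counter)  # 20000 if mutual exclusion held for every increment
```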

  2. Addressing global uncertainty and sensitivity in first-principles based microkinetic models by an adaptive sparse grid approach

    Science.gov (United States)

    Döpking, Sandra; Plaisance, Craig P.; Strobusch, Daniel; Reuter, Karsten; Scheurer, Christoph; Matera, Sebastian

    2018-01-01

    In the last decade, first-principles-based microkinetic modeling has been developed into an important tool for a mechanistic understanding of heterogeneous catalysis. A commonly known, but hitherto barely analyzed issue in this kind of modeling is the presence of sizable errors from the use of approximate Density Functional Theory (DFT). We here address the propagation of these errors to the catalytic turnover frequency (TOF) by global sensitivity and uncertainty analysis. Both analyses require the numerical quadrature of high-dimensional integrals. To achieve this efficiently, we utilize and extend an adaptive sparse grid approach and exploit the confinement of the strongly non-linear behavior of the TOF to local regions of the parameter space. We demonstrate the methodology on a model of the oxygen evolution reaction at the Co3O4 (110)-A surface, using a maximum entropy error model that imposes nothing but reasonable bounds on the errors. For this setting, the DFT errors lead to an absolute uncertainty of several orders of magnitude in the TOF. We nevertheless find that it is still possible to draw conclusions from such uncertain models about the atomistic aspects controlling the reactivity. A comparison with derivative-based local sensitivity analysis instead reveals that this more established approach provides incomplete information. Since the adaptive sparse grids allow for the evaluation of the integrals with only a modest number of function evaluations, this approach opens the way for a global sensitivity analysis of more complex models, for instance, models based on kinetic Monte Carlo simulations.
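
    A toy illustration of the error-propagation problem described above, using plain Monte Carlo purely for brevity (the paper's point is that adaptive sparse grids evaluate the same integrals with far fewer model evaluations): bounded DFT errors on two activation barriers of an invented two-step rate model are propagated to the turnover frequency (TOF).

```python
import numpy as np

kB, T = 8.617e-5, 600.0            # Boltzmann constant [eV/K], temperature [K]
rng = np.random.default_rng(0)

def tof(dE1, dE2):
    # toy rate model: two activated steps in series with nominal barriers 0.9 / 1.1 eV
    k1 = 1e13 * np.exp(-(0.9 + dE1) / (kB * T))
    k2 = 1e13 * np.exp(-(1.1 + dE2) / (kB * T))
    return 1.0 / (1.0 / k1 + 1.0 / k2)

# Maximum-entropy-style error model: nothing but bounds of +/- 0.2 eV -> uniform errors.
dE = rng.uniform(-0.2, 0.2, size=(100000, 2))
samples = tof(dE[:, 0], dE[:, 1])

logs = np.log10(samples)
print(f"TOF spans 10^{logs.min():.1f} .. 10^{logs.max():.1f} s^-1, "
      f"median 10^{np.median(logs):.1f} s^-1")
```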

  3. Uncertainty Calculation for Spectral-Responsivity Measurements

    National Research Council Canada - National Science Library

    Lehman, John H; Wang, C M; Dowell, Marla L; Hadler, Joshua A

    2009-01-01

    .... Relative expanded uncertainties based on the methods from the Guide to the Expression of Uncertainty in Measurement and from Supplement 1 to the "Guide to the Expression of Uncertainty in Measurement...

  4. Impact of input data uncertainty on environmental exposure assessment models: A case study for electromagnetic field modelling from mobile phone base stations.

    Science.gov (United States)

    Beekhuizen, Johan; Heuvelink, Gerard B M; Huss, Anke; Bürgi, Alfred; Kromhout, Hans; Vermeulen, Roel

    2014-11-01

    With the increased availability of spatial data and computing power, spatial prediction approaches have become a standard tool for exposure assessment in environmental epidemiology. However, such models are largely dependent on accurate input data. Uncertainties in the input data can therefore have a large effect on model predictions, but are rarely quantified. With Monte Carlo simulation we assessed the effect of input uncertainty on the prediction of radio-frequency electromagnetic fields (RF-EMF) from mobile phone base stations at 252 receptor sites in Amsterdam, The Netherlands. The impact on ranking and classification was determined by computing the Spearman correlations and weighted Cohen's Kappas (based on tertiles of the RF-EMF exposure distribution) between modelled values and RF-EMF measurements performed at the receptor sites. The uncertainty in modelled RF-EMF levels was large with a median coefficient of variation of 1.5. Uncertainty in receptor site height, building damping and building height contributed most to model output uncertainty. For exposure ranking and classification, the heights of buildings and receptor sites were the most important sources of uncertainty, followed by building damping, antenna- and site location. Uncertainty in antenna power, tilt, height and direction had a smaller impact on model performance. We quantified the effect of input data uncertainty on the prediction accuracy of an RF-EMF environmental exposure model, thereby identifying the most important sources of uncertainty and estimating the total uncertainty stemming from potential errors in the input data. This approach can be used to optimize the model and better interpret model output. Copyright © 2014 Elsevier Inc. All rights reserved.
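
    A generic sketch of this kind of Monte Carlo input-uncertainty analysis (a toy exposure model with invented error magnitudes, not the RF-EMF propagation model used in the study): the uncertain inputs are perturbed, the model is rerun, and each realisation is compared against measurements by rank correlation.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(42)
n_sites, n_runs = 252, 1000

# toy "true" exposure field and noisy measurements at the receptor sites
true_exposure = rng.lognormal(mean=0.0, sigma=1.0, size=n_sites)
measurements = true_exposure * rng.lognormal(0.0, 0.2, size=n_sites)

def model(height_error_sd, damping_error_sd):
    """Toy exposure model whose predictions degrade with larger input errors."""
    noise = rng.lognormal(0.0, np.hypot(height_error_sd, damping_error_sd), n_sites)
    return true_exposure * noise

rhos = np.array([spearmanr(model(0.5, 0.3), measurements).correlation
                 for _ in range(n_runs)])
print(f"Spearman rho: median {np.median(rhos):.2f}, "
      f"90% interval [{np.quantile(rhos, 0.05):.2f}, {np.quantile(rhos, 0.95):.2f}]")
```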

  5. Uncertainty analysis for effluent trading planning using a Bayesian estimation-based simulation-optimization modeling approach.

    Science.gov (United States)

    Zhang, J L; Li, Y P; Huang, G H; Baetz, B W; Liu, J

    2017-06-01

    In this study, a Bayesian estimation-based simulation-optimization modeling approach (BESMA) is developed for identifying effluent trading strategies. BESMA incorporates nutrient fate modeling with soil and water assessment tool (SWAT), Bayesian estimation, and probabilistic-possibilistic interval programming with fuzzy random coefficients (PPI-FRC) within a general framework. Based on the water quality protocols provided by SWAT, posterior distributions of parameters can be analyzed through Bayesian estimation; stochastic characteristic of nutrient loading can be investigated which provides the inputs for the decision making. PPI-FRC can address multiple uncertainties in the form of intervals with fuzzy random boundaries and the associated system risk through incorporating the concept of possibility and necessity measures. The possibility and necessity measures are suitable for optimistic and pessimistic decision making, respectively. BESMA is applied to a real case of effluent trading planning in the Xiangxihe watershed, China. A number of decision alternatives can be obtained under different trading ratios and treatment rates. The results can not only facilitate identification of optimal effluent-trading schemes, but also gain insight into the effects of trading ratio and treatment rate on decision making. The results also reveal that decision maker's preference towards risk would affect decision alternatives on trading scheme as well as system benefit. Compared with the conventional optimization methods, it is proved that BESMA is advantageous in (i) dealing with multiple uncertainties associated with randomness and fuzziness in effluent-trading planning within a multi-source, multi-reach and multi-period context; (ii) reflecting uncertainties existing in nutrient transport behaviors to improve the accuracy in water quality prediction; and (iii) supporting pessimistic and optimistic decision making for effluent trading as well as promoting diversity of decision

  6. A novel probabilistic framework for event-based speech recognition

    Science.gov (United States)

    Juneja, Amit; Espy-Wilson, Carol

    2003-10-01

    One of the reasons for unsatisfactory performance of the state-of-the-art automatic speech recognition (ASR) systems is the inferior acoustic modeling of low-level acoustic-phonetic information in the speech signal. An acoustic-phonetic approach to ASR, on the other hand, explicitly targets linguistic information in the speech signal, but such a system for continuous speech recognition (CSR) is not known to exist. A probabilistic and statistical framework for CSR based on the idea of the representation of speech sounds by bundles of binary valued articulatory phonetic features is proposed. Multiple probabilistic sequences of linguistically motivated landmarks are obtained using binary classifiers of manner phonetic features (syllabic, sonorant, and continuant) and the knowledge-based acoustic parameters (APs) that are acoustic correlates of those features. The landmarks are then used for the extraction of knowledge-based APs for source and place phonetic features and their binary classification. Probabilistic landmark sequences are constrained using manner class language models for isolated or connected word recognition. The proposed method could overcome the disadvantages encountered by the early acoustic-phonetic knowledge-based systems that led the ASR community to switch to systems highly dependent on statistical pattern analysis methods and probabilistic language or grammar models.

  7. Wavelet based denoising of power quality events for characterization

    African Journals Online (AJOL)

    user

    Angrisani L., Daponte P., D'Apuuo M. and Testa A., 1996, A new wavelet transform based procedure for electrical power quality analysis, Proceedings of the International Conference on Harmonics and Quality of Power (ICHQP), Las Vegas, Nevada,. USA, pp. 608-614. Bollen Math H.J., 2000, Understanding power quality ...

  8. Event Highlight: Nigeria Evidence-based Health System Initiative

    International Development Research Centre (IDRC) Digital Library (Canada)

    2012-06-01

    Since limited resources are available for life-saving health services in Nigeria, those who plan health programs need to know which interventions are most effective and how to prioritise them. An important objective of the Nigeria Evidence-based Health System Initiative (NEHSI) is to build the capacity of.

  9. Event-based media processing and analysis: A survey of the literature

    OpenAIRE

    Tzelepis, Christos; Ma, Zhigang; MEZARIS, Vasileios; Ionescu, Bogdan; Kompatsiaris, Ioannis; Boato, Giulia; Sebe, Nicu; Yan, Shuicheng

    2016-01-01

    Research on event-based processing and analysis of media is receiving an increasing attention from the scientific community due to its relevance for an abundance of applications, from consumer video management and video surveillance to lifelogging and social media. Events have the ability to semantically encode relationships of different informational modalities, such as visual-audio-text, time, involved agents and objects, with the spatio-temporal component of events being a key feature for ...

  10. Personalized Event-Based Surveillance and Alerting Support for the Assessment of Risk

    OpenAIRE

    Stewar, Avaré; Lage, Ricardo; Diaz-Aviles, Ernesto; Dolog, Peter

    2011-01-01

    In a typical Event-Based Surveillance setting, a stream of web documents is continuously monitored for disease reporting. A structured representation of the disease reporting events is extracted from the raw text, and the events are then aggregated to produce signals, which are intended to represent early warnings against potential public health threats. To public health officials, these warnings represent an overwhelming list of "one-size-fits-all" information for risk assessment. To reduce ...

  11. Galactic Cosmic Ray Event-Based Risk Model (GERM) Code

    Science.gov (United States)

    Cucinotta, Francis A.; Plante, Ianik; Ponomarev, Artem L.; Kim, Myung-Hee Y.

    2013-01-01

    This software describes the transport and energy deposition of galactic cosmic rays in astronaut tissues during space travel, or of heavy ion beams in patients undergoing cancer therapy. Space radiation risk is a probability distribution, and time-dependent biological events must be accounted for in the physical description of space radiation transport in tissues and cells. A stochastic model can calculate the probability density directly, without unverified assumptions about the shape of the probability density function. Prior-art transport codes calculate the average flux and dose of particles behind spacecraft and tissue shielding. Because of the signaling times for activation and relaxation in the cell and tissue, a transport code must describe the temporal and microspatial density functions needed to correlate DNA and oxidative damage with non-targeted effects such as signaling and bystander effects; these are ignored in, or impossible for, the prior art. The GERM code provides scientists with data interpretation of experiments; modeling of the beam line, shielding of target samples, and sample holders; and estimation of the basic physical and biological outputs of their experiments. For mono-energetic ion beams, basic physical and biological properties are calculated for a selected ion type, such as kinetic energy, mass, charge number, absorbed dose, or fluence. Evaluated quantities are linear energy transfer (LET), range (R), absorption and fragmentation cross-sections, and the probability of nuclear interactions after 1 or 5 cm of water-equivalent material. In addition, a set of biophysical properties is evaluated, such as the Poisson distribution for a specified cellular area, cell survival curves, and DNA damage yields per cell. The GERM code also calculates the radiation transport of the beam line for either a fixed number of user-specified depths or at multiple positions along the Bragg curve of the particle in a selected material. The GERM code makes numerical estimates of basic
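
    A small numerical illustration of one of the quantities the abstract mentions, the Poisson distribution of particle traversals for a specified cellular area; the fluence and cell area below are assumed values for illustration, not GERM outputs.

```python
from math import exp, factorial

fluence = 0.02         # assumed particle fluence [particles per um^2]
cell_area = 100.0      # assumed nucleus cross-section [um^2]
mean_hits = fluence * cell_area   # Poisson mean number of traversals per cell

def p_hits(k, lam):
    """Probability of exactly k traversals for a Poisson mean lam."""
    return lam**k * exp(-lam) / factorial(k)

print(f"mean hits per cell = {mean_hits:.1f}")
print(f"P(0 hits)  = {p_hits(0, mean_hits):.3f}")
print(f"P(>=1 hit) = {1.0 - p_hits(0, mean_hits):.3f}")
```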

  12. Tag and Neighbor based Recommender systems for Medical events

    DEFF Research Database (Denmark)

    Bayyapu, Karunakar Reddy; Dolog, Peter

    2010-01-01

    This paper presents an extension of a multifactor recommendation approach based on user tagging, with term neighbours. Neighbours of words in tag vectors and documents provide for hitting a larger set of documents, and not only those matching the direct tag vectors or the content of the documents. Tag...... in the situations where the quality of tags is lower. We discuss the approach on examples from the existing Medworm system to indicate its usefulness....

  13. Location-Based Events Detection on Micro-Blogs

    OpenAIRE

    Santos, Augusto Dias Pereira dos; Wives, Leandro Krug; Alvares, Luis Otavio

    2012-01-01

    The increasing use of social networks generates enormous amounts of data that can be used for many types of analysis. Some of these data have temporal and geographical information, which can be used for comprehensive examination. In this paper, we propose a new method to analyze the massive volume of messages available in Twitter to identify places in the world where topics such as TV shows, climate change, disasters, and sports are emerging. The proposed method is based on a neural network t...

  14. GPS-based PWV for precipitation forecasting and its application to a typhoon event

    Science.gov (United States)

    Zhao, Qingzhi; Yao, Yibin; Yao, Wanqiang

    2018-01-01

    The temporal variability of precipitable water vapour (PWV) derived from Global Navigation Satellite System (GNSS) observations can be used to forecast precipitation events. A number of case studies of precipitation events were analysed in Zhejiang Province, and a forecasting method for precipitation events is proposed. The PWV time series retrieved from Global Positioning System (GPS) observations was processed using a least-squares fitting method to obtain the tendency of ascents and descents in PWV. The increment of PWV over a short time (two to six hours) and the PWV slope over a longer time (a few hours to more than ten hours) during the PWV ascending period are considered as predictive factors with which to forecast precipitation events. The numerical results show that about 80%-90% of precipitation events and more than 90% of heavy rain events can be forecasted two to six hours in advance based on the proposed method. Five-minute PWV data derived from GPS observations based on real-time precise point positioning (RT-PPP) were used for the typhoon event that passed over Zhejiang Province between 10 and 12 July 2015. Good results were obtained with the proposed method: about 74% of precipitation events were predicted some ten to thirty minutes before their onset, with a false alarm rate of 18%. This study shows that GPS-based PWV is promising for short-term and now-casting precipitation forecasting.
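
    A hedged sketch of the two predictive factors described above, the short-window PWV increment and the longer-window least-squares PWV slope; the alert thresholds, window lengths, and synthetic PWV series below are invented for illustration and are not the study's values.

```python
import numpy as np

def pwv_predictors(times_h, pwv_mm, short_h=3.0, long_h=9.0):
    """times_h: hours (ascending); pwv_mm: PWV series in mm."""
    t_end = times_h[-1]
    short = times_h >= t_end - short_h
    increment = pwv_mm[short][-1] - pwv_mm[short][0]               # mm over the short window
    long_win = times_h >= t_end - long_h
    slope = np.polyfit(times_h[long_win], pwv_mm[long_win], 1)[0]  # mm per hour
    return increment, slope

# toy 12 h PWV series sampled every 5 minutes with a steady moisture build-up
t = np.arange(0, 12, 5 / 60.0)
pwv = 45.0 + 0.8 * t + np.random.default_rng(3).normal(0, 0.3, t.size)

inc, slope = pwv_predictors(t, pwv)
# hypothetical alert thresholds, for illustration only
if inc > 2.0 and slope > 0.5:
    print(f"precipitation alert: increment {inc:.1f} mm, slope {slope:.2f} mm/h")
```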

  15. Predictive Event Triggered Control based on Heuristic Dynamic Programming for Nonlinear Continuous Time Systems

    Science.gov (United States)

    2015-08-17

    In this paper, a novel predictive event-triggered control method based on the heuristic dynamic programming (HDP) algorithm is developed for nonlinear continuous-time systems. A model network is used to estimate...

  16. Knowledge-Driven Event Extraction in Russian: Corpus-Based Linguistic Resources.

    Science.gov (United States)

    Solovyev, Valery; Ivanov, Vladimir

    2016-01-01

    Automatic event extraction from text is an important step in knowledge acquisition and knowledge base population. Manual work in the development of an extraction system is indispensable, either in corpus annotation or in the creation of vocabularies and patterns for a knowledge-based system. Recent works have focused on the adaptation of existing systems (for extraction from English texts) to new domains. Event extraction in other languages has not been studied, due to the lack of resources and algorithms necessary for natural language processing. In this paper we define a set of linguistic resources that are necessary in the development of a knowledge-based event extraction system for Russian: a vocabulary of subordination models, a vocabulary of event triggers, and a vocabulary of Frame Elements that are the basic building blocks for semantic patterns. We propose a set of methods for the creation of such vocabularies in Russian and other languages using the Google Books Ngram Corpus. The methods are evaluated in the development of an event extraction system for Russian.

  17. Randomized control trial investigating the efficacy of a computer-based intolerance of uncertainty intervention.

    Science.gov (United States)

    Oglesby, Mary E; Allan, Nicholas P; Schmidt, Norman B

    2017-08-01

    Intolerance of uncertainty (IU) is an important transdiagnostic variable within various anxiety and mood disorders. Theory suggests that individuals high in IU interpret ambiguous information in a more threatening manner. A parallel line of research has shown that interpretive biases can be modified through cognitive training, and previous research aimed at modifying negative interpretations through Cognitive Bias Modification (CBM-I) has yielded promising results. Despite these findings, no research to date has examined the efficacy of an IU-focused CBM-I paradigm. The current study investigated the impact of a brief IU-focused CBM-I on reductions in IU. Participants selected for a high IU interpretation bias (IU-IB) were randomly assigned to an active (IU CBM-I) or control CBM-I condition. Results indicated that the active IU CBM-I was associated with significant changes in IU-IB from pre- to post-intervention, as well as with significant reductions in IU at post-intervention and one-month follow-up. Results also indicated that the IU CBM-I led to reductions in self-reported IU via the hypothesized mechanism. This study is the first to provide evidence that a CBM-I focused on IU is effective in reducing IU-IB and IU across time, and suggests that IU CBM-I paradigms may be a novel prevention/intervention treatment for anxiety. Copyright © 2017 Elsevier Ltd. All rights reserved.

  18. Declarative Event-Based Workflow as Distributed Dynamic Condition Response Graphs

    DEFF Research Database (Denmark)

    Hildebrandt, Thomas; Mukkamala, Raghava Rao

    2010-01-01

    We present Dynamic Condition Response Graphs (DCR Graphs) as a declarative, event-based process model inspired by the workflow language employed by our industrial partner and conservatively generalizing prime event structures. A dynamic condition response graph is a directed graph with nodes repr...

  19. Projection of changes in the frequency of heavy rain events over Hawaii based on leading Pacific climate modes

    Science.gov (United States)

    Elison Timm, O.; Diaz, H. F.; Giambelluca, T. W.; Takahashi, M.

    2011-02-01

    This study investigates the frequency of heavy rainfall events in Hawaii during the wet season (October-April) 1958-2005 and their conditional dependence on the Pacific-North American (PNA) pattern and El Niño-Southern Oscillation (ENSO). Heavy rain events are defined by the 95% quantile in the rainfall distribution of the wet seasons. Twelve stations with daily reports of rainfall amounts were used to count the number of heavy rain days during wet seasons. Multiple linear regression (MLR) indicated that the PNA index (PNAI) and the Southern Oscillation Index (SOI) can explain a significant amount of the interannual to interdecadal variability for 9 out of 12 stations. Cross validation showed that PNAI and SOI together explain about 18-44% of the variability in the number of heavy rain events. Furthermore, the MLR model reproduces the trend toward fewer heavy rain events in the years after the Pacific climate shift in the mid-1970s. The MLR model was applied to the projected PNAI and SOI indices that were obtained from six IPCC AR4 climate models. The current suite of AR4 simulations based on the A1B and A2 emissions scenarios projects small and equivocal changes in the mean state of the SOI and PNAI during the 21st century. The covariance between PNAI and SOI in these simulations appears to be stable. To the extent that variations in the frequency and magnitude of ENSO and the PNA mode are responsible for modulating extreme rainfall occurrence in Hawaii, our results indicate small changes in the projected number of heavy rainfall days with large uncertainties resulting from disparities among the climate models.
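
    A generic sketch of the statistical setup described above (synthetic seasonal data and invented coefficients, not the Hawaii station records): the seasonal count of heavy-rain days is regressed on the PNA index (PNAI) and the Southern Oscillation Index (SOI), with leave-one-out cross-validation of the explained variance.

```python
import numpy as np

rng = np.random.default_rng(7)
n_seasons = 48
pnai = rng.normal(size=n_seasons)
soi = rng.normal(size=n_seasons)
heavy_days = 6.0 - 1.5 * pnai + 2.0 * soi + rng.normal(0, 1.5, n_seasons)

X = np.column_stack([np.ones(n_seasons), pnai, soi])
beta, *_ = np.linalg.lstsq(X, heavy_days, rcond=None)

# leave-one-out cross-validated skill
pred = np.empty(n_seasons)
for i in range(n_seasons):
    keep = np.arange(n_seasons) != i
    b, *_ = np.linalg.lstsq(X[keep], heavy_days[keep], rcond=None)
    pred[i] = X[i] @ b
r2_cv = 1.0 - np.sum((heavy_days - pred) ** 2) / np.sum((heavy_days - heavy_days.mean()) ** 2)
print(f"coefficients (intercept, PNAI, SOI): {beta.round(2)}, cross-validated R^2 = {r2_cv:.2f}")
```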

  20. Adaptive grid based multi-objective Cauchy differential evolution for stochastic dynamic economic emission dispatch with wind power uncertainty

    Science.gov (United States)

    Lei, Xiaohui; Wang, Chao; Yue, Dong; Xie, Xiangpeng

    2017-01-01

    Since wind power is integrated into the thermal power operation system, dynamic economic emission dispatch (DEED) has become a new challenge due to its uncertain characteristics. This paper proposes an adaptive grid based multi-objective Cauchy differential evolution (AGB-MOCDE) for solving stochastic DEED with wind power uncertainty. To properly deal with wind power uncertainty, scenarios are generated to simulate possible situations by dividing the uncertainty domain into intervals; the probability of each interval can be calculated using the cumulative distribution function, and a stochastic DEED model can be formulated under the different scenarios. To enhance optimization efficiency, a Cauchy mutation operation is utilized to improve differential evolution by adjusting the population diversity during the evolution process, and an adaptive grid is constructed to retain the diversity distribution of the Pareto front. In view of the large number of generated scenarios, a reduction mechanism based on covariance relationships is applied to decrease the number of scenarios, which greatly reduces the computational complexity. Moreover, a constraint-handling technique is utilized to deal with the system load balance while considering transmission losses among thermal units and wind farms; all the constraint limits can be satisfied within the permitted accuracy. After the proposed method is simulated on three test systems, the obtained results reveal that, in comparison with other alternatives, the proposed AGB-MOCDE can optimize the DEED problem while handling all constraint limits, and the optimal scheme of stochastic DEED can decrease the conservativeness of interval optimization, which can provide a more valuable optimal scheme for real-world applications. PMID:28961262
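
    A small sketch of the scenario-generation step described above (a normal wind-power forecast error is assumed here purely for illustration; the forecast value, spread, and number of intervals are invented): the uncertainty domain is split into intervals and each scenario's probability is obtained from the cumulative distribution function.

```python
import numpy as np
from scipy.stats import norm

forecast_mw, sigma_mw = 120.0, 25.0
edges = np.linspace(forecast_mw - 3 * sigma_mw, forecast_mw + 3 * sigma_mw, 7)

cdf = norm.cdf(edges, loc=forecast_mw, scale=sigma_mw)
probs = np.diff(cdf)
probs /= probs.sum()                         # renormalize over the truncated domain
scenarios = 0.5 * (edges[:-1] + edges[1:])   # representative wind power per interval

for s, p in zip(scenarios, probs):
    print(f"scenario {s:6.1f} MW  probability {p:.3f}")
```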

  1. Adaptive grid based multi-objective Cauchy differential evolution for stochastic dynamic economic emission dispatch with wind power uncertainty.

    Science.gov (United States)

    Zhang, Huifeng; Lei, Xiaohui; Wang, Chao; Yue, Dong; Xie, Xiangpeng

    2017-01-01

    Since wind power is integrated into the thermal power operation system, dynamic economic emission dispatch (DEED) has become a new challenge due to its uncertain characteristics. This paper proposes an adaptive grid based multi-objective Cauchy differential evolution (AGB-MOCDE) for solving stochastic DEED with wind power uncertainty. To properly deal with wind power uncertainty, scenarios are generated to simulate possible situations by dividing the uncertainty domain into intervals; the probability of each interval can be calculated using the cumulative distribution function, and a stochastic DEED model can be formulated under the different scenarios. To enhance optimization efficiency, a Cauchy mutation operation is utilized to improve differential evolution by adjusting the population diversity during the evolution process, and an adaptive grid is constructed to retain the diversity distribution of the Pareto front. In view of the large number of generated scenarios, a reduction mechanism based on covariance relationships is applied to decrease the number of scenarios, which greatly reduces the computational complexity. Moreover, a constraint-handling technique is utilized to deal with the system load balance while considering transmission losses among thermal units and wind farms; all the constraint limits can be satisfied within the permitted accuracy. After the proposed method is simulated on three test systems, the obtained results reveal that, in comparison with other alternatives, the proposed AGB-MOCDE can optimize the DEED problem while handling all constraint limits, and the optimal scheme of stochastic DEED can decrease the conservativeness of interval optimization, which can provide a more valuable optimal scheme for real-world applications.

  2. Optimizing graph-based patterns to extract biomedical events from the literature

    OpenAIRE

    Liu, Haibin; Verspoor, Karin; Comeau, Donald C; MacKinlay, Andrew D; Wilbur, W John

    2015-01-01

    We participated in the BioNLP-ST 2013 shared tasks on event extraction. Our extraction method is based on the search for an approximate subgraph isomorphism between key context dependencies of events and the graphs of input sentences. Our system was able to address both the GENIA (GE) task, focusing on 13 molecular-biology-related event types, and the Cancer Genetics (CG) task, targeting a challenging group of 40 cancer-biology-related event types with varying arguments concerning 18 ...

  3. Vocalizations during post-conflict affiliations from victims toward aggressors based on uncertainty in Japanese macaques.

    Directory of Open Access Journals (Sweden)

    Noriko Katsu

    Full Text Available We investigated the use of vocalizations called "grunts," "girneys," and "coos" accompanied by post-conflict affiliative interaction between former opponents (reconciliation) in Japanese macaques (Macaca fuscata). Although reconciliation functions to repair bonds, such interactions sometimes entail risks of receiving further aggression. Vocalizations can be used at a distance from the former opponent; thus, we predict that vocalizations are used particularly by victims of a conflict, and are frequently used in situations of uncertainty when it is difficult for them to estimate whether the former opponent will resume aggression. In addition, we predict that vocalizations are effective in preventing further aggression. To test these hypotheses, we conducted observations of post-conflict and matched-control situations in female Japanese macaques living in a free-ranging group. We found that former opponents tended to be attracted to each other within the first minute following a conflict, thus demonstrating reconciliation behavior. Vocalizations were more frequently used by the victims in post-conflict interactions than under control situations; however, this tendency was not found in aggressors. When affiliation with the former opponent occurred, victims were more likely to use vocalizations towards less familiar opponents. These findings suggest that Japanese macaques used vocalizations more often when interacting with less predictable former opponents. Victims were more likely to receive aggression from former aggressors when engaged in affiliations with them than under no such affiliations. No significant differences were found in the probability of the victims receiving aggression, regardless of whether they used vocalizations; thus, whether the victim benefits from using vocalizations in these contexts remains unclear. Japanese macaques form despotic societies and therefore, further aggression was inevitable, to some degree, after a conflict

  4. Aerosol direct radiative forcing based on GEOS-Chem-APM and uncertainties

    Directory of Open Access Journals (Sweden)

    X. Ma

    2012-06-01

    Full Text Available Aerosol direct radiative forcing (DRF) plays an important role in global climate change but has a large uncertainty. Here we investigate aerosol DRF with GEOS-Chem-APM, a recently developed global aerosol microphysical model that is designed to capture key particle properties (size, composition, coating of primary particles by volatile species, etc.). The model, with comprehensive chemistry, microphysics and up-to-date emission inventories, is driven by assimilated meteorology, which is presumably more realistic compared to model-predicted meteorology. For this study, the model is extended by incorporating a radiation transfer model. Optical properties are calculated using Mie theory, where the core-shell configuration could be treated with the refractive indices from the recently updated values available in the literature. The surface albedo is taken from MODIS satellite retrievals for the simulation year, in which the data set for the 8-day mean at 0.05° (5600 m) resolution for 7 wavebands is provided. We derive the total and anthropogenic aerosol DRF, mainly focus on the results of anthropogenic aerosols, and then compare with those values reported in previous studies. In addition, we examine the anthropogenic aerosol DRF's dependence on several key factors, including the particle size of black carbon (BC) and primary organic carbon (POC), the density of BC and the mixing state. Our studies show that the anthropogenic aerosol DRF at top of atmosphere (TOA) for all sky is −0.41 W m−2. However, the sensitivity experiments suggest that the magnitude could vary from −0.08 W m−2 to −0.61 W m−2, depending on assumptions regarding the mixing state, size and density of particles.

  5. Projected changes, climate change signal, and uncertainties in the CMIP5-based projections of ocean surface wave heights

    Science.gov (United States)

    Wang, Xiaolan; Feng, Yang; Swail, Val R.

    2016-04-01

    Ocean surface waves can be major hazards in coastal and offshore activities. However, wave observations are available only at limited locations and cover only the recent few decades. Also, there exists very limited information on ocean wave behavior in response to climate change, because such information is not simulated in current global climate models. In a recent study, we used a multivariate regression model with a lagged dependent variable to make statistical global projections of changes in significant wave heights (Hs) using mean sea level pressure (SLP) information from 20 CMIP5 climate models for the twenty-first century. The statistical model was calibrated and validated using the ERA-Interim reanalysis of Hs and SLP for the period 1981-2010. The results show Hs increases in the tropics (especially in the eastern tropical Pacific) and in Southern Hemisphere high latitudes. Under the projected 2070-2099 climate condition of the RCP8.5 scenario, the occurrence frequency of the present-day one-in-10-year extreme wave heights is likely to double or triple in several coastal regions around the world (e.g., the Chilean coast, Gulf of Oman, Bay of Bengal, Gulf of Mexico). More recently, we used analysis of variance approaches to quantify the climate change signal and uncertainty in multi-model ensembles of statistical Hs simulations globally, which are based on the CMIP5 historical, RCP4.5 and RCP8.5 forcing scenario simulations of SLP. In a 4-model, 3-run ensemble, the 4-model common signal of climate change is found to strengthen over time, as would be expected. For the historical followed by RCP8.5 scenario, the common signal in annual mean Hs is found to be significant over 16.6%, 55.0% and 82.2% of the area by year 2005, 2050 and 2099, respectively. For the annual maximum, the signal is much weaker. The signal is strongest in the eastern tropical Pacific, featuring significant increases in both the annual mean and maximum of Hs in this region. The climate
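
    As a rough illustration of the statistical approach described above, the sketch below fits a regression with a lagged dependent variable relating significant wave height (Hs) to an SLP-based predictor, then projects Hs under a shifted predictor series. All data, coefficients and the shift are synthetic stand-ins, not the calibrated ERA-Interim/CMIP5 model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic monthly series standing in for ERA-Interim Hs and an SLP-based predictor
# (e.g. a squared SLP gradient index); these are not the authors' data.
n = 360
slp_index = rng.normal(size=n)
hs = np.empty(n)
hs[0] = 2.0
for t in range(1, n):
    hs[t] = 0.5 * hs[t - 1] + 0.8 * slp_index[t] + 2.0 + rng.normal(scale=0.3)

# Regression with a lagged dependent variable:
# Hs_t = b0 + b1 * Hs_{t-1} + b2 * SLP_t + error
X = np.column_stack([np.ones(n - 1), hs[:-1], slp_index[1:]])
y = hs[1:]
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print("fitted coefficients (b0, b1, b2):", np.round(beta, 3))

# Project Hs forward using a hypothetical CMIP5-like shift in the SLP index.
slp_future = rng.normal(loc=0.3, size=120)
hs_proj = [hs[-1]]
for s in slp_future:
    hs_proj.append(beta[0] + beta[1] * hs_proj[-1] + beta[2] * s)
print("projected mean Hs change: %.2f m" % (np.mean(hs_proj[1:]) - hs.mean()))
```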

  6. Risk-based Operation and Maintenance Approach for Wave Energy Converters Taking Weather Forecast Uncertainties into Account

    DEFF Research Database (Denmark)

    Ambühl, Simon; Kramer, Morten Mejlhede; Sørensen, John Dalsgaard

    2016-01-01

    Inspection and maintenance costs are significant contributors to the cost of energy for wave energy converters. Maintenance can be performed after failure (corrective) or before a breakdown (preventive) occurs. Furthermore, helicopter and boat can be used to transport equipment and personnel to the device for operation and maintenance actions. This article focusses on a risk-based inspection and maintenance planning approach involving minimization of the overall repair costs including costs due to lost electricity production. The study includes real weather data and damage accumulation as well as uncertainties related with imperfect weather forecasts, costs, structural damage accumulation, inspection accuracy and the applied maintenance strategies. This article contains a case study where the risk-based maintenance strategy is applied for the Wavestar device.

  8. Pymote: High Level Python Library for Event-Based Simulation and Evaluation of Distributed Algorithms

    National Research Council Canada - National Science Library

    Arbula, Damir; Lenac, Kristijan

    2013-01-01

    Simulation is a fundamental part of the distributed algorithm design and evaluation process. In this paper, we present a library for event-based simulation and evaluation of distributed algorithms...

  9. Teaching Uncertainties

    Science.gov (United States)

    Duerdoth, Ian

    2009-01-01

    The subject of uncertainties (sometimes called errors) is traditionally taught (to first-year science undergraduates) towards the end of a course on statistics that defines probability as the limit of many trials, and discusses probability distribution functions and the Gaussian distribution. We show how to introduce students to the concepts of…

  10. Knowledge-Driven Event Extraction in Russian: Corpus-Based Linguistic Resources

    OpenAIRE

    Solovyev, Valery; Ivanov, Vladimir

    2016-01-01

    Automatic event extraction from text is an important step in knowledge acquisition and knowledge base population. Manual work in the development of an extraction system is indispensable, either in corpus annotation or in the creation of vocabularies and patterns for a knowledge-based system. Recent works have focused on the adaptation of existing systems (for extraction from English texts) to new domains. Event extraction in other languages was not studied due to the lack of resources and algorithms necessar...

  11. Real-Time Gait Event Detection Based on Kinematic Data Coupled to a Biomechanical Model

    OpenAIRE

    Lambrecht, Stefan; Harutyunyan, Anna; Tanghe, Kevin; Afschrift, Maarten; De Schutter, Joris; Jonkers, Ilse

    2017-01-01

    Real-time detection of multiple stance events, more specifically initial contact (IC), foot flat (FF), heel off (HO), and toe off (TO), could greatly benefit neurorobotic (NR) and neuroprosthetic (NP) control. Three real-time threshold-based algorithms have been developed, detecting the aforementioned events based on kinematic data in combination with a biomechanical model. Data from seven subjects walking at three speeds on an instrumented treadmill were used to validate the presented algori...
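
    The following sketch illustrates the general idea of threshold-based gait event detection from kinematic data, using a synthetic vertical heel-marker trajectory and an arbitrary height threshold. It is not the published algorithms, which additionally use a biomechanical model and detect foot flat and heel off.

```python
import numpy as np

fs = 100.0                                  # sampling rate (Hz), assumed
t = np.arange(0, 5, 1 / fs)
# Synthetic vertical heel-marker height (m): periodic swing with flat stance phases.
heel_z = 0.05 + 0.04 * np.clip(np.sin(2 * np.pi * 1.0 * t), 0, None)

height_threshold = 0.055                    # illustrative stance threshold (m)
on_ground = heel_z < height_threshold

# Initial contact (IC): transition from swing to stance; toe off (TO): stance to swing.
ic_idx = np.where(~on_ground[:-1] & on_ground[1:])[0] + 1
to_idx = np.where(on_ground[:-1] & ~on_ground[1:])[0] + 1
print("IC times (s):", np.round(t[ic_idx], 2))
print("TO times (s):", np.round(t[to_idx], 2))
```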

  12. Matters of uncertainty

    DEFF Research Database (Denmark)

    Lassen, Inger

    The relationship between lexico-grammar, semantics and context, mapped as scales of delicacy, has been shown to be crucial in the description of the complex ways in which semiotic events change with changing situations. In such processes the three context parameters Field, Tenor and Mode are equally important, but some semiotic events tend to foreground one parameter more than others. For instance, texts (spoken as well as written) with salient tenor relationships tend to select interpersonal lexico-grammatical resources to express intersubjective stance and psychological states of mind such as probability, uncertainty, insecurity, etc. Primary resources for this purpose are the system of speech functions described by Halliday (1994) and the Appraisal framework developed primarily by Martin and White (2005). This paper explores how uncertainty is expressed by different actors in different...

  13. THE EFFECT OF DEVOTEE-BASED BRAND EQUITY ON RELIGIOUS EVENTS

    Directory of Open Access Journals (Sweden)

    MUHAMMAD JAWAD IQBAL

    2016-04-01

    Full Text Available The objective of this research is to apply the DBBE model to discover the constructs that measure a religious event as a business brand on the basis of devotees' perception. The SEM technique was applied to test the hypothesized model, with CFA used to analyze the model, and a theoretical model was specified to assess model fit. The sample size was 500. Brand loyalty was directly affected by image and quality. This information might be beneficial to event management and sponsors in brand building and in operating visitor destinations. More importantly, the brand of these religious events in Pakistan can be built into a strong tourism product.

  14. Event-Based Control for Average Consensus of Wireless Sensor Networks with Stochastic Communication Noises

    Directory of Open Access Journals (Sweden)

    Chuan Ji

    2013-01-01

    Full Text Available This paper focuses on the average consensus problem for wireless sensor networks (WSNs) with fixed and Markovian-switching, undirected and connected network topologies in a noisy environment. An event-based protocol is applied to each sensor node to reach consensus. An event-triggering strategy is designed based on a Lyapunov function. Under the event-trigger condition, some sufficient conditions for average consensus in mean square are obtained. Finally, some numerical simulations are given to illustrate the effectiveness of the results derived in this paper.
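
    A minimal numerical sketch of event-triggered average consensus under communication noise is given below. The trigger used here is a simple deviation threshold between a node's current state and its last broadcast value, standing in for the Lyapunov-based triggering condition of the paper; topology, gains and the noise level are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
A = np.array([[0, 1, 0, 1],          # undirected ring of 4 sensor nodes (adjacency matrix)
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
deg = A.sum(axis=1)

x = rng.uniform(0, 10, size=4)        # initial measurements
target = x.mean()                     # average consensus value
x_hat = x.copy()                      # last broadcast values
eps, gain, noise_std, steps = 0.05, 0.1, 0.05, 400
events = 0

for _ in range(steps):
    # Event trigger: a node broadcasts only when its state drifts from its last broadcast.
    trigger = np.abs(x - x_hat) > eps
    x_hat = np.where(trigger, x, x_hat)
    events += int(trigger.sum())
    # Consensus update uses neighbours' last broadcast values corrupted by channel noise.
    noisy = x_hat + rng.normal(scale=noise_std, size=4)
    x = x + gain * (A @ noisy - deg * x)

print("final states:", np.round(x, 2), " initial average:", round(target, 2))
print("broadcast events:", events, "of", 4 * steps, "possible periodic broadcasts")
```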

  15. A semi-supervised learning framework for biomedical event extraction based on hidden topics.

    Science.gov (United States)

    Zhou, Deyu; Zhong, Dayou

    2015-05-01

    Scientists have devoted decades of effort to understanding the interaction between proteins or RNA production. This information might enrich current knowledge of drug reactions or of the development of certain diseases. Nevertheless, due to its lack of explicit structure, the literature in the life sciences, one of the most important sources of this information, is difficult for computer-based systems to access. Therefore, biomedical event extraction, the automatic acquisition of knowledge about molecular events from research articles, has attracted community-wide efforts recently. Most approaches are based on statistical models, requiring large-scale annotated corpora to precisely estimate the models' parameters, which are usually difficult to obtain in practice. Therefore, employing un-annotated data based on semi-supervised learning for biomedical event extraction is a feasible solution and has attracted increasing interest. In this paper, a semi-supervised learning framework based on hidden topics for biomedical event extraction is presented. In this framework, sentences in the un-annotated corpus are automatically assigned event annotations based on their distances to sentences in the annotated corpus. More specifically, not only the structures of the sentences, but also the hidden topics embedded in the sentences are used for describing the distance. The sentences and newly assigned event annotations, together with the annotated corpus, are employed for training. Experiments were conducted on the Multi-Level Event Extraction (MLEE) corpus, a gold-standard corpus. Experimental results show that an improvement of more than 2.2% in F-score on biomedical event extraction is achieved by the proposed framework when compared to the state-of-the-art approach. The results suggest that by incorporating un-annotated data, the proposed framework indeed improves the performance of the state-of-the-art event extraction system and the similarity between sentences might be precisely

  16. Dosimetric impact of contouring and needle reconstruction uncertainties in US-, CT- and MRI-based high-dose-rate prostate brachytherapy treatment planning.

    Science.gov (United States)

    Rylander, Susanne; Buus, Simon; Pedersen, Erik M; Bentzen, Lise; Tanderup, Kari

    2017-04-01

    The purpose was to evaluate the dosimetric impact of target contouring and needle reconstruction uncertainties in US-, CT- and MRI-based HDR prostate BT treatment planning. US, CT, and MR images were acquired post-needle insertion in 22 HDR-BT procedures for 11 consecutive patients. Dose plans were simulated for US-, CT- and MRI-based HDR-BT treatment planning procedures. Planning uncertainties in US- and CT-based plans were evaluated using MRI-based planning as reference. The target (CTV_Prostate) was re-contoured on MRI. Dose results were expressed as total equivalent dose in 2 Gy fractions for EBRT (46 Gy) plus 2 HDR-BT fractions. Uncertainties in US- and CT-based planning caused the planned CTV_Prostate D90% to decrease by a mean of 2.9±5.0 Gy (p=0.03) and 2.9±2.9 Gy (p=0.001), respectively. The intra-observer contouring variation on MRI resulted in a mean variation of 1.6±1.5 Gy in CTV_Prostate D90%. Reconstruction uncertainties on US resulted in a dose variation of ±3 Gy to the urethra, whereas corresponding data for CT were not available. Uncertainties related to contouring and reconstruction in US- and CT-based HDR-BT treatment plans resulted in a systematic overestimation of the prescribed target dose. Inter-modality uncertainties (US and CT versus MR) were larger than MR intra-observer uncertainties. Copyright © 2017 Elsevier B.V. All rights reserved.

  17. Uncertainty, Pluralism, and the Knowledge-Based Theory of the Firm

    DEFF Research Database (Denmark)

    Reihlen, Markus; Ringberg, Torsten

    2013-01-01

    J.-C. Spender’s award-winning, knowledge-based theory of the firm is based on four premises: (1) The firm can be sufficiently understood as a system of knowledge, (2) explicit and implicit knowing can be clearly dissociated, (3) organizations are conceived as cognizing entities, and (4) intuition...

  18. Prospective memory while driving: comparison of time- and event-based intentions.

    Science.gov (United States)

    Trawley, Steven L; Stephens, Amanda N; Rendell, Peter G; Groeger, John A

    2017-06-01

    Prospective memories can divert attentional resources from ongoing activities. However, it is unclear whether these effects and the theoretical accounts that seek to explain them will generalise to a complex real-world task such as driving. Twenty-four participants drove two simulated routes while maintaining a fixed headway with a lead vehicle. Drivers were given either event-based (e.g. arriving at a filling station) or time-based errands (e.g. on-board clock shows 3:30). In contrast to the predominant view in the literature which suggests time-based tasks are more demanding, drivers given event-based errands showed greater difficulty in mirroring lead vehicle speed changes compared to the time-based group. Results suggest that common everyday secondary tasks, such as scouting the roadside for a bank, may have a detrimental impact on driving performance. The additional finding that this cost was only evident with the event-based task highlights a potential area of both theoretical and practical interest. Practitioner Summary: Drivers were given either time- or event-based errands whilst engaged in a simulated drive. We examined the effect of errands on an ongoing vehicle follow task. In contrast to previous non-driving studies, event-based errands are more disruptive. Common everyday errands may have a detrimental impact on driving performance.

  19. Central FPGA-based Destination and Load Control in the LHCb MHz Event Readout

    CERN Document Server

    Jacobsson, Richard

    2012-01-01

    The readout strategy of the LHCb experiment [1] is based on complete event readout at 1 MHz [2]. Over 300 sub-detector readout boards transmit event fragments at 1 MHz over a commercial 70 Gigabyte/s switching network to a distributed event building and trigger processing farm with 1470 individual multi-core computer nodes [3]. In the original specifications, the readout was based on a pure push protocol. This paper describes the proposal, implementation, and experience of a powerful non-conventional mixture of a push and a pull protocol, akin to credit-based flow control. A high-speed FPGA-based central master module controls the event fragment packing in the readout boards, the assignment of the farm node destination for each event, and controls the farm load based on an asynchronous pull mechanism from each farm node. This dynamic readout scheme relies on generic event requests and the concept of node credit allowing load balancing and trigger rate regulation as a function of the global farm load. It also ...
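
    The credit-based (push/pull) idea can be sketched with a toy simulation in which a central master assigns events only to farm nodes that have advertised free credits, and nodes return credits as they finish processing. This is a schematic illustration with made-up node counts and processing probabilities, not the FPGA implementation described above.

```python
import random
from collections import defaultdict

random.seed(42)
N_NODES, N_EVENTS, INITIAL_CREDITS = 8, 10_000, 4
credits = {n: INITIAL_CREDITS for n in range(N_NODES)}   # credits granted by each farm node
in_flight = defaultdict(int)                             # events sent but not yet processed
assigned = defaultdict(int)
delayed = 0

for event in range(N_EVENTS):
    # Master picks a destination among nodes that still hold credits (load balancing).
    eligible = [n for n, c in credits.items() if c > 0]
    if eligible:
        dest = max(eligible, key=lambda n: credits[n])   # favour the least loaded node
        credits[dest] -= 1
        in_flight[dest] += 1
        assigned[dest] += 1
    else:
        delayed += 1                  # no credit anywhere: throttle the trigger rate
    # Farm nodes asynchronously finish events and return the corresponding credits.
    for n in range(N_NODES):
        if in_flight[n] and random.random() < 0.5:
            in_flight[n] -= 1
            credits[n] += 1

print("events per node:", dict(assigned))
print("events delayed for lack of credit:", delayed)
```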

  20. Uncertainty quantification in flood risk assessment

    Science.gov (United States)

    Blöschl, Günter; Hall, Julia; Kiss, Andrea; Parajka, Juraj; Perdigão, Rui A. P.; Rogger, Magdalena; Salinas, José Luis; Viglione, Alberto

    2017-04-01

    Uncertainty is inherent to flood risk assessments because of the complexity of the human-water system, which is characterised by nonlinearities and interdependencies, because of limited knowledge about system properties and because of cognitive biases in human perception and decision-making. On top of the uncertainty associated with the assessment of the existing risk to extreme events, additional uncertainty arises because of temporal changes in the system due to climate change, modifications of the environment, population growth and the associated increase in assets. Novel risk assessment concepts are needed that take into account all these sources of uncertainty. They should be based on the understanding of how flood extremes are generated and how they change over time. They should also account for the dynamics of risk perception of decision makers and population in the floodplains. In this talk we discuss these novel risk assessment concepts through examples from Flood Frequency Hydrology, Socio-Hydrology and Predictions Under Change. We believe that uncertainty quantification in flood risk assessment should lead to a robust approach of integrated flood risk management aiming at enhancing resilience rather than searching for optimal defense strategies.

  1. Digital disease detection: A systematic review of event-based internet biosurveillance systems.

    Science.gov (United States)

    O'Shea, Jesse

    2017-05-01

    Internet access and usage have changed how people seek and report health information. Meanwhile, infectious diseases continue to threaten humanity. The analysis of Big Data, or vast digital data, presents an opportunity to improve disease surveillance and epidemic intelligence. Epidemic intelligence contains two components: indicator-based and event-based. A relatively new surveillance type has emerged, called event-based Internet biosurveillance systems. These systems use information on events impacting health from Internet sources, such as social media or news aggregates. They circumvent the limitations of traditional reporting systems by being inexpensive, transparent, and flexible. Yet, innovations and the functionality of these systems can change rapidly. The aim was to update the current state of knowledge on event-based Internet biosurveillance systems by identifying all such systems, including their current functionality, in the hope of aiding decision makers in deciding whether to incorporate new methods into comprehensive programmes of surveillance. A systematic review was performed through the PubMed, Scopus, and Google Scholar databases, while also including grey literature and other publication types. 50 event-based Internet systems were identified, and 15 attributes were extracted for each system, described in 99 articles. Each system uses different innovative technology and data sources to gather, process, and disseminate data to detect infectious disease outbreaks. The review emphasises the importance of using both formal and informal sources for timely and accurate infectious disease outbreak surveillance, cataloguing all event-based Internet biosurveillance systems. By doing so, future researchers will be able to use this review as a library for referencing systems, with the hope of learning from, building, and expanding Internet-based surveillance systems. Event-based Internet biosurveillance should act as an extension of traditional systems, to be utilised as an

  2. Assessing flood forecast uncertainty with fuzzy arithmetic

    Directory of Open Access Journals (Sweden)

    de Bruyn Bertrand

    2016-01-01

    Full Text Available Forecasts of flow rates and water levels during floods have to be accompanied by uncertainty estimates. The sources of forecast uncertainty are plural. For hydrological forecasts (rainfall-runoff) performed using a deterministic hydrological model with basic physics, two main sources can be identified. The first obvious source is the forcing data: rainfall forecast data are supplied in real time by meteorological forecasting services to the Flood Forecasting Service within a range between a lowest and a highest predicted rainfall. These two values define an uncertainty interval for the rainfall variable provided on a given watershed. The second source of uncertainty is related to the complexity of the modeled system (the catchment impacted by the hydro-meteorological phenomenon), the number of variables that may describe the problem and their spatial and temporal variability. The model simplifies the system by reducing the number of variables to a few parameters, and thus it contains an intrinsic uncertainty. This model uncertainty is assessed by comparing simulated and observed flow rates for a large number of hydro-meteorological events. We propose a method based on fuzzy arithmetic to estimate the possible range of flow rates (and water levels), making a forecast based on the possible rainfalls provided by the forcing and on the model uncertainty. The model uncertainty is here expressed as a range of possible values. Both rainfall and model uncertainties are combined with fuzzy arithmetic. This method allows the prediction uncertainty range to be evaluated. The Flood Forecasting Service of the Oise and Aisne rivers, in particular, monitors the upstream watershed of the Oise at Hirson. This watershed's area is 310 km2. Its response time is about 10 hours. Several hydrological models are calibrated for flood forecasting in this watershed and use the rainfall forecast. This method presents the advantage of being easily implemented. Moreover, it can be carried out
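
    The fuzzy-arithmetic idea can be illustrated with a small alpha-cut computation: rainfall is given as a triangular fuzzy number spanning the lowest and highest forecast values, model uncertainty as a fuzzy multiplicative factor, and both are propagated through a crude runoff relation. The numbers and the peak-flow formula below are illustrative assumptions, not the operational Hirson models.

```python
import numpy as np

def tri_alpha_cut(low, mode, high, alpha):
    """Interval of a triangular fuzzy number at membership level alpha."""
    return low + alpha * (mode - low), high - alpha * (high - mode)

# Forecast rainfall over the event (mm): lowest / best / highest scenario (illustrative).
rain = (20.0, 35.0, 60.0)
# Model uncertainty as a fuzzy multiplicative factor on the simulated peak flow.
model_factor = (0.8, 1.0, 1.3)

area_km2, runoff_coeff, duration_h = 310.0, 0.4, 10.0

def peak_flow(rain_mm):
    """Crude rational-method-like peak flow (m3/s) for a rain depth over the event."""
    volume_m3 = rain_mm / 1000.0 * area_km2 * 1e6 * runoff_coeff
    return volume_m3 / (duration_h * 3600.0)

for alpha in (0.0, 0.5, 1.0):
    r_lo, r_hi = tri_alpha_cut(*rain, alpha)
    f_lo, f_hi = tri_alpha_cut(*model_factor, alpha)
    q_lo = peak_flow(r_lo) * f_lo          # interval arithmetic: monotone in both inputs
    q_hi = peak_flow(r_hi) * f_hi
    print(f"alpha={alpha:.1f}: peak flow in [{q_lo:.0f}, {q_hi:.0f}] m3/s")
```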

  3. An uncertainty-based framework to quantifying climate change impacts on coastal flood vulnerability: case study of New York City.

    Science.gov (United States)

    Zahmatkesh, Zahra; Karamouz, Mohammad

    2017-10-17

    The continued development efforts around the world, growing population, and the increased probability of occurrence of extreme hydrologic events have adversely affected natural and built environments. Flood damages and loss of lives from the devastating storms, such as Irene and Sandy on the East Coast of the USA, are examples of the vulnerability to flooding that even developed countries have to face. The odds of coastal flooding disasters have been increased due to accelerated sea level rise, climate change impacts, and communities' interest to live near the coastlines. Climate change, for instance, is becoming a major threat to sustainable development because of its adverse impacts on the hydrologic cycle. Effective management strategies are thus required for flood vulnerability reduction and disaster preparedness. This paper is an extension to the flood resilience studies in the New York City coastal watershed. Here, a framework is proposed to quantify coastal flood vulnerability while accounting for climate change impacts. To do so, a multi-criteria decision making (MCDM) approach that combines watershed characteristics (factors) and their weights is proposed to quantify flood vulnerability. Among the watershed characteristics, potential variation in the hydrologic factors under climate change impacts is modeled utilizing the general circulation models' (GCMs) outputs. The considered factors include rainfall, extreme water level, and sea level rise that exacerbate flood vulnerability through increasing exposure and susceptibility to flooding. Uncertainty in the weights as well as values of factors is incorporated in the analysis using the Monte Carlo (MC) sampling method by selecting the best-fitted distributions to the parameters with random nature. A number of low impact development (LID) measures are then proposed to improve watershed adaptive capacity to deal with coastal flooding. Potential range of current and future vulnerability to flooding is
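
    A stripped-down version of the MCDM-plus-Monte-Carlo idea is sketched below: a weighted sum of normalised flood-vulnerability factors, with both the factor values and the criteria weights sampled from assumed distributions. The factor names follow the abstract; all distributions and ranges are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 10_000

# Normalised (0-1) vulnerability factors with assumed distributions reflecting their uncertainty.
rainfall  = rng.triangular(0.4, 0.6, 0.9, n)   # extreme rainfall factor
surge     = rng.triangular(0.3, 0.5, 0.8, n)   # extreme water level factor
sea_level = rng.triangular(0.2, 0.4, 0.7, n)   # sea level rise factor (GCM-driven)

# Uncertain criteria weights: sample and renormalise so they sum to 1.
w = rng.uniform([0.2, 0.2, 0.1], [0.5, 0.5, 0.4], size=(n, 3))
w /= w.sum(axis=1, keepdims=True)

vulnerability = w[:, 0] * rainfall + w[:, 1] * surge + w[:, 2] * sea_level
lo, med, hi = np.percentile(vulnerability, [5, 50, 95])
print(f"flood vulnerability index: median {med:.2f}, 90% interval [{lo:.2f}, {hi:.2f}]")
```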

  4. An Event-Based Approach to Distributed Diagnosis of Continuous Systems

    Science.gov (United States)

    Daigle, Matthew; Roychoudhurry, Indranil; Biswas, Gautam; Koutsoukos, Xenofon

    2010-01-01

    Distributed fault diagnosis solutions are becoming necessary due to the complexity of modern engineering systems, and the advent of smart sensors and computing elements. This paper presents a novel event-based approach for distributed diagnosis of abrupt parametric faults in continuous systems, based on a qualitative abstraction of measurement deviations from the nominal behavior. We systematically derive dynamic fault signatures expressed as event-based fault models. We develop a distributed diagnoser design algorithm that uses these models for designing local event-based diagnosers based on global diagnosability analysis. The local diagnosers each generate globally correct diagnosis results locally, without a centralized coordinator, and by communicating a minimal number of measurements between themselves. The proposed approach is applied to a multi-tank system, and results demonstrate a marked improvement in scalability compared to a centralized approach.

  5. Quantifying acoustic doppler current profiler discharge uncertainty: A Monte Carlo based tool for moving-boat measurements

    Science.gov (United States)

    Mueller, David S.

    2017-01-01

    This paper presents a method using Monte Carlo simulations for assessing uncertainty of moving-boat acoustic Doppler current profiler (ADCP) discharge measurements using a software tool known as QUant, which was developed for this purpose. Analysis was performed on 10 data sets from four Water Survey of Canada gauging stations in order to evaluate the relative contribution of a range of error sources to the total estimated uncertainty. The factors that differed among data sets included the fraction of unmeasured discharge relative to the total discharge, flow nonuniformity, and operator decisions about instrument programming and measurement cross section. As anticipated, it was found that the estimated uncertainty is dominated by uncertainty of the discharge in the unmeasured areas, highlighting the importance of appropriate selection of the site, the instrument, and the user inputs required to estimate the unmeasured discharge. The main contributor to uncertainty was invalid data, but spatial inhomogeneity in water velocity and bottom-track velocity also contributed, as did variation in the edge velocity, uncertainty in the edge distances, edge coefficients, and the top and bottom extrapolation methods. To a lesser extent, spatial inhomogeneity in the bottom depth also contributed to the total uncertainty, as did uncertainty in the ADCP draft at shallow sites. The estimated uncertainties from QUant can be used to assess the adequacy of standard operating procedures. They also provide quantitative feedback to the ADCP operators about the quality of their measurements, indicating which parameters are contributing most to uncertainty, and perhaps even highlighting ways in which uncertainty can be reduced. Additionally, QUant can be used to account for self-dependent error sources such as heading errors, which are a function of heading. The results demonstrate the importance of a Monte Carlo method tool such as QUant for quantifying random and bias errors when
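
    The sketch below mimics the general Monte Carlo strategy (not QUant itself): perturb each discharge component of a hypothetical moving-boat measurement with an assumed error distribution, recompute the total many times, and report the resulting interval and variance contributions. All component values and uncertainty magnitudes are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 20_000

# Nominal discharge components from one hypothetical moving-boat measurement (m3/s).
q_measured, q_top, q_bottom, q_left, q_right = 180.0, 25.0, 20.0, 3.0, 4.0

# Assumed relative standard uncertainties for each component (illustrative values).
q_meas_mc   = q_measured * (1 + rng.normal(0, 0.02, n))   # invalid data / velocity noise
q_top_mc    = q_top      * (1 + rng.normal(0, 0.08, n))   # top extrapolation method
q_bottom_mc = q_bottom   * (1 + rng.normal(0, 0.10, n))   # bottom extrapolation method
q_left_mc   = q_left     * (1 + rng.normal(0, 0.20, n))   # edge distance and coefficient
q_right_mc  = q_right    * (1 + rng.normal(0, 0.20, n))

q_total = q_meas_mc + q_top_mc + q_bottom_mc + q_left_mc + q_right_mc
mean = q_total.mean()
lo, hi = np.percentile(q_total, [2.5, 97.5])
print(f"discharge = {mean:.1f} m3/s, 95% interval [{lo:.1f}, {hi:.1f}] m3/s")
# Relative contribution of each independent component to the total variance:
for name, comp in [("measured", q_meas_mc), ("top", q_top_mc), ("bottom", q_bottom_mc),
                   ("left edge", q_left_mc), ("right edge", q_right_mc)]:
    print(f"  {name:10s}: {comp.var() / q_total.var():5.1%} of variance")
```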

  6. Uncertainties in Instantaneous Rainfall Rate Estimates: Satellite vs. Ground-Based Observations

    Science.gov (United States)

    Amitai, E.; Huffman, G. J.; Goodrich, D. C.

    2012-12-01

    High-resolution precipitation intensities are significant in many fields. For example, hydrological applications such as flood forecasting, runoff accommodation, erosion prediction, and urban hydrological studies depend on an accurate representation of the rainfall that does not infiltrate the soil, which is controlled by the rain intensities. Changes in the rain rate pdf over long periods are important for climate studies. Are our estimates accurate enough to detect such changes? While most evaluation studies are focusing on the accuracy of rainfall accumulation estimates, evaluation of instantaneous rainfall intensity estimates is relatively rare. Can a speceborne radar help in assessing ground-based radar estimates of precipitation intensities or is it the other way around? In this presentation we will provide some insight on the relative accuracy of instantaneous precipitation intensity fields from satellite and ground-based observations. We will examine satellite products such as those from the TRMM Precipitation Radar and those from several passive microwave imagers and sounders by comparing them with advanced high-resolution ground-based products taken at overpass time (snapshot comparisons). The ground based instantaneous rain rate fields are based on in situ measurements (i.e., the USDA/ARS Walnut Gulch dense rain gauge network), remote sensing observations (i.e., the NOAA/NSSL NMQ/Q2 radar-only national mosaic), and multi-sensor products (i.e., high-resolution gauge adjusted radar national mosaics, which we have developed by applying a gauge correction on the Q2 products).

  7. Trust Index Based Fault Tolerant Multiple Event Localization Algorithm for WSNs

    Directory of Open Access Journals (Sweden)

    Jian Wan

    2011-06-01

    Full Text Available This paper investigates the use of wireless sensor networks for multiple event source localization using binary information from the sensor nodes. The events could continually emit signals whose strength is attenuated inversely proportional to the distance from the source. In this context, faults occur due to various reasons and are manifested when a node reports a wrong decision. In order to reduce the impact of node faults on the accuracy of multiple event localization, we introduce a trust index model to evaluate the fidelity of the information which the nodes report and use in the event detection process, and propose the Trust Index based Subtract on Negative Add on Positive (TISNAP) localization algorithm, which reduces the impact of faulty nodes on event localization by decreasing their trust index, in order to improve the accuracy of event localization and the performance of fault tolerance for multiple event source localization. The algorithm includes three phases: first, the sink identifies the cluster nodes to determine the number of events that occurred in the entire region by analyzing the binary data reported by all nodes; then, it constructs the likelihood matrix related to the cluster nodes and estimates the location of all events according to the alarmed status and trust index of the nodes around the cluster nodes. Finally, the sink updates the trust index of all nodes according to the fidelity of their information in the previous reporting cycle. The algorithm improves the accuracy of localization and the performance of fault tolerance in multiple event source localization. The experimental results show that when the probability of node fault is close to 50%, the algorithm can still accurately determine the number of events and has better localization accuracy compared with other algorithms.

  8. Robust optimization-based DC optimal power flow for managing wind generation uncertainty

    Science.gov (United States)

    Boonchuay, Chanwit; Tomsovic, Kevin; Li, Fangxing; Ongsakul, Weerakorn

    2012-11-01

    Integrating wind generation into the wider grid causes a number of challenges to traditional power system operation. Given the relatively large wind forecast errors, congestion management tools based on optimal power flow (OPF) need to be improved. In this paper, a robust optimization (RO)-based DCOPF is proposed to determine the optimal generation dispatch and locational marginal prices (LMPs) for a day-ahead competitive electricity market considering the risk of dispatch cost variation. The basic concept is to use the dispatch to hedge against the possibility of reduced or increased wind generation. The proposed RO-based DCOPF is compared with a stochastic non-linear programming (SNP) approach on a modified PJM 5-bus system. Primary test results show that the proposed DCOPF model can provide lower dispatch cost than the SNP approach.
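
    As a toy counterpart to the robust formulation, the sketch below solves a single-bus economic dispatch with scipy's linprog, once with the wind forecast and once with wind fixed at the lower end of its forecast band (a crude worst-case hedge). Generator data, the load and the wind error band are invented; the network model, LMPs and the full RO formulation are omitted.

```python
import numpy as np
from scipy.optimize import linprog

# Two conventional generators plus one wind farm serving a 500 MW load (illustrative numbers).
cost = np.array([20.0, 35.0])        # $/MWh for G1, G2
p_max = np.array([300.0, 300.0])     # MW limits
load = 500.0
wind_forecast, wind_error = 120.0, 60.0   # forecast +/- error band (MW), assumed

def dispatch(wind):
    # minimise cost @ p  subject to  p1 + p2 = load - wind,  0 <= p <= p_max
    res = linprog(cost, A_eq=[[1.0, 1.0]], b_eq=[load - wind],
                  bounds=list(zip([0.0, 0.0], p_max)), method="highs")
    return res.x, res.fun

for label, wind in [("deterministic (forecast)", wind_forecast),
                    ("robust (worst-case wind)", wind_forecast - wind_error)]:
    p, c = dispatch(wind)
    print(f"{label}: G1={p[0]:.0f} MW, G2={p[1]:.0f} MW, cost=${c:.0f}/h")
```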

  9. Research on reverse logistics location under uncertainty environment based on grey prediction

    Science.gov (United States)

    Zhenqiang, Bao; Congwei, Zhu; Yuqin, Zhao; Quanke, Pan

    This article constructs a reverse logistics network for an uncertain environment, integrates the reverse logistics network with the distribution network, and forms a closed-loop network. An optimization model based on cost is established to help intermediate centers, manufacturing centers and remanufacturing centers make location decisions. A grey model GM(1,1) is used to predict the product holdings of the collection points, and the prediction results are then fed into the cost optimization model to obtain a solution. Finally, an example is given to verify the effectiveness and feasibility of the model.
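
    A compact GM(1,1) grey prediction routine is sketched below on an assumed series of product holdings at one collection point; the series and the three-step forecast horizon are illustrative only.

```python
import numpy as np

def gm11(x0, horizon):
    """Classic GM(1,1) grey model: fit on series x0 and forecast `horizon` steps ahead."""
    x0 = np.asarray(x0, dtype=float)
    x1 = np.cumsum(x0)                                   # accumulated generating operation
    z1 = 0.5 * (x1[1:] + x1[:-1])                        # mean sequence of x1
    B = np.column_stack([-z1, np.ones(len(z1))])
    Y = x0[1:]
    a, b = np.linalg.lstsq(B, Y, rcond=None)[0]          # development coefficient a, grey input b
    k = np.arange(len(x0) + horizon)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a    # time response function
    return np.diff(x1_hat, prepend=0.0)                  # inverse accumulation -> x0_hat

# Assumed yearly product holdings (units) at one collection point.
holdings = [312, 335, 361, 390, 422]
forecast = gm11(holdings, horizon=3)
print("fitted values + 3-step forecast:", np.round(forecast, 1))
```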

  10. Tracing the Spatial-Temporal Evolution of Events Based on Social Media Data

    Directory of Open Access Journals (Sweden)

    Xiaolu Zhou

    2017-03-01

    Full Text Available Social media data provide a great opportunity to investigate event flow in cities. Despite the advantages of social media data in these investigations, data heterogeneity and big data size pose challenges to researchers seeking to identify useful information about events from the raw data. In addition, few studies have used social media posts to capture how events develop in space and time. This paper demonstrates an efficient approach based on machine learning and geovisualization to identify events and trace their development in real time. We conducted an empirical study to delineate the temporal and spatial evolution of a natural event (heavy precipitation) and a social event (Pope Francis' visit to the US) in the New York City-Washington, DC region. By investigating multiple features of Twitter data (message, author, time, and geographic location information), this paper demonstrates how voluntary local knowledge from tweets can be used to depict city dynamics, discover spatiotemporal characteristics of events, and convey real-time information.

  11. Improving the extraction of complex regulatory events from scientific text by using ontology-based inference

    Directory of Open Access Journals (Sweden)

    Kim Jung-jae

    2011-10-01

    Full Text Available Background: The extraction of complex events from biomedical text is a challenging task and requires in-depth semantic analysis. Previous approaches associate lexical and syntactic resources with ontologies for the semantic analysis, but fall short in testing the benefits from the use of domain knowledge. Results: We developed a system that deduces implicit events from explicitly expressed events by using inference rules that encode domain knowledge. We evaluated the system with the inference module on three tasks: First, when tested against a corpus with manually annotated events, the inference module of our system contributes 53.2% of correct extractions, but does not cause any incorrect results. Second, the system overall reproduces 33.1% of the transcription regulatory events contained in RegulonDB (up to 85.0% precision) and the inference module is required for 93.8% of the reproduced events. Third, we applied the system with minimum adaptations to the identification of cell activity regulation events, confirming that the inference improves the performance of the system also on this task. Conclusions: Our research shows that inference based on domain knowledge plays a significant role in extracting complex events from text. This approach has great potential in recognizing the complex concepts of such biomedical ontologies as the Gene Ontology in the literature.

  12. Confronting uncertainty in model-based geostatistics using Markov Chain Monte Carlo simulation

    NARCIS (Netherlands)

    Minasny, B.; Vrugt, J.A.; McBratney, A.B.

    2011-01-01

    This paper demonstrates for the first time the use of Markov Chain Monte Carlo (MCMC) simulation for parameter inference in model-based soil geostatistics. We implemented the recently developed DiffeRential Evolution Adaptive Metropolis (DREAM) algorithm to jointly summarize the posterior
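
    To make the idea concrete without reproducing DREAM, the sketch below runs a plain random-walk Metropolis sampler to infer the sill and range of an exponential variogram-like curve from synthetic data; the priors, proposal scales and noise level are assumptions.

```python
import numpy as np

rng = np.random.default_rng(11)

# Synthetic "empirical semivariogram": gamma(h) = sill * (1 - exp(-h / range)) + noise.
h = np.linspace(1, 50, 25)
true_sill, true_range, noise_sd = 2.0, 12.0, 0.15
gamma_obs = true_sill * (1 - np.exp(-h / true_range)) + rng.normal(0, noise_sd, h.size)

def log_post(theta):
    sill, rng_par = theta
    if sill <= 0 or rng_par <= 0:                 # flat priors on the positive axis
        return -np.inf
    resid = gamma_obs - sill * (1 - np.exp(-h / rng_par))
    return -0.5 * np.sum(resid**2) / noise_sd**2  # Gaussian likelihood, known noise

# Random-walk Metropolis: a much simpler cousin of the DREAM sampler.
theta = np.array([1.0, 5.0])
lp = log_post(theta)
samples = []
for it in range(20_000):
    prop = theta + rng.normal(0, [0.1, 0.8])
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:      # accept/reject step
        theta, lp = prop, lp_prop
    if it > 5_000:                                # discard burn-in
        samples.append(theta.copy())

samples = np.array(samples)
print("posterior mean (sill, range):", np.round(samples.mean(axis=0), 2))
print("posterior std  (sill, range):", np.round(samples.std(axis=0), 2))
```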

  13. Base cation deposition in Europe - Part I. Model description, results and uncertainties

    NARCIS (Netherlands)

    Draaijers, G.P.J.; Leeuwen, E.P. van; Jong, P.G.H. de; Erisman, J.W.

    1997-01-01

    Deposition of base cations (Na+, Mg2+, Ca2+, K+) in Europe was mapped for 1989 with a spatial resolution of 10 x 20 km using the so-called inferential modeling technique. Deposition fields resembled the geographic variability of sources, land-use and climate. Dry deposition constituted on average

  14. Analysis of Sensitivity and Uncertainty in an Individual-Based Model of a Threatened Wildlife Species

    Science.gov (United States)

    We present a multi-faceted sensitivity analysis of a spatially explicit, individual-based model (IBM) (HexSim) of a threatened species, the Northern Spotted Owl (Strix occidentalis caurina) on a national forest in Washington, USA. Few sensitivity analyses have been conducted on ...

  15. Uncertainty and Sensitivity Analysis for an Ibuprofen Synthesis Model Based on Hoechst Path

    DEFF Research Database (Denmark)

    da Conceicao Do Carmo Montes, Frederico; Gernaey, Krist V.; Sin, Gürkan

    2017-01-01

    To this end, we integrated different models in this work to obtain a comprehensive synthesis model for ibuprofen in a MATLAB/Simulink model interface. The process flowsheet is based on the Hoechst path, starting from the Friedel-Crafts acetylation of isobutylbenzene to 4-isobutylphenylacetophenone, its...

  16. A robust neural network-based approach for microseismic event detection

    KAUST Repository

    Akram, Jubran

    2017-08-17

    We present an artificial neural network based approach for robust event detection from low S/N waveforms. We use a feed-forward network with a single hidden layer that is tuned on a training dataset and later applied to the entire example dataset for event detection. The input features used include the average of absolute amplitudes, variance, energy ratio and polarization rectilinearity. These features are calculated in moving windows of the same length over the entire waveform. The output is set as a user-specified relative probability curve, which provides a robust way of distinguishing between weak and strong events. An optimal network is selected by studying the weight-based saliency and the effect of the number of neurons on the predicted results. Using synthetic data examples, we demonstrate that this approach is effective in detecting weaker events and reduces the number of false positives.
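
    The sketch below reproduces the flavour of this approach on purely synthetic single-component traces: moving-window features (mean absolute amplitude, variance and a short/long energy ratio) are fed to a small feed-forward network whose output is interpreted as a relative event probability. Polarization rectilinearity is omitted because it needs three-component data; the window lengths, event shapes and network size are arbitrary choices, not those of the paper.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(5)

def make_trace(n=2000, events=()):
    """Synthetic noisy trace with decaying sinusoidal 'events' at given sample indices."""
    x = rng.normal(0, 1.0, n)
    for i0, amp in events:
        t = np.arange(200)
        x[i0:i0 + 200] += amp * np.exp(-t / 60.0) * np.sin(2 * np.pi * t / 20.0)
    return x

def window_features(x, win=100):
    """Mean |amplitude|, variance and short/long energy ratio in half-overlapping windows."""
    energy, feats, starts = x**2, [], []
    for i in range(win, len(x) - win, win // 2):
        w = x[i:i + win]
        sta, lta = energy[i:i + win].mean(), energy[i - win:i + win].mean()
        feats.append([np.abs(w).mean(), w.var(), sta / (lta + 1e-12)])
        starts.append(i)
    return np.array(feats), np.array(starts)

# Training trace with two known events; windows overlapping an event are labelled 1.
train = make_trace(events=[(500, 3.0), (1400, 1.5)])
X, idx = window_features(train)
y = ((np.abs(idx - 500) < 200) | (np.abs(idx - 1400) < 200)).astype(int)

clf = make_pipeline(StandardScaler(),
                    MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0))
clf.fit(X, y)

# Apply the trained network to a new trace containing one weaker event.
test = make_trace(events=[(900, 1.2)])
Xt, idxt = window_features(test)
prob = clf.predict_proba(Xt)[:, 1]          # relative event probability per window
print("max event probability on test trace: %.2f" % prob.max())
print("window starts with probability > 0.5:", idxt[prob > 0.5])
```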

  17. Non-Cooperative Regulation Coordination Based on Game Theory for Wind Farm Clusters during Ramping Events

    DEFF Research Database (Denmark)

    Qi, Yongzhi; Liu, Yutian; Wu, Qiuwei

    2017-01-01

    With the increasing penetration of wind power in power systems, it is important to track scheduled wind power output as closely as possible during ramping events to ensure security of the system. In this paper, a non-cooperative coordination strategy based on game theory is proposed for the regulation of wind farm clusters (WFCs) in order to track the scheduled wind power of the WFC during ramping events. In the proposed strategy, a non-cooperative game is formulated and wind farms compete to provide regulation to the WFC during ramping events. A regulation revenue function is proposed to evaluate

  18. A Simulation Based Approach to Optimize Berth Throughput Under Uncertainty at Marine Container Terminals

    Science.gov (United States)

    Golias, Mihalis M.

    2011-01-01

    Berth scheduling is a critical function at marine container terminals and determining the best berth schedule depends on several factors including the type and function of the port, size of the port, location, nearby competition, and type of contractual agreement between the terminal and the carriers. In this paper we formulate the berth scheduling problem as a bi-objective mixed-integer problem with the objective to maximize customer satisfaction and reliability of the berth schedule under the assumption that vessel handling times are stochastic parameters following a discrete and known probability distribution. A combination of an exact algorithm, a Genetic Algorithm-based heuristic and a simulation post-Pareto analysis is proposed as the solution approach to the resulting problem. Based on a number of experiments it is concluded that the proposed berth scheduling policy outperforms the berth scheduling policy where reliability is not considered.

  19. FCI: an R-based algorithm for evaluating uncertainty of absolute real-time PCR quantification

    OpenAIRE

    Gallo Fabio; Pizzamiglio Sara; Verderio Paolo; Ramsden Simon C

    2008-01-01

    Background: FCI is an R code for analyzing data from real-time PCR experiments. This algorithm estimates standard curve features as well as nucleic acid concentrations and confidence intervals according to Fieller's theorem. Results: In order to describe the features of FCI, four situations were selected from real data collected during an international external quality assessment program for quantitative assays based on real-time PCR. The code generates a diagnostic figure suitable for ...
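
    Fieller's theorem itself is easy to reproduce outside R; the Python sketch below solves the underlying quadratic for the confidence limits of a ratio of two correlated estimates. The numerical example is hypothetical and is not taken from the FCI package.

```python
import numpy as np
from scipy import stats

def fieller_ci(a, b, var_a, var_b, cov_ab, df, level=0.95):
    """Fieller confidence interval for the ratio a/b of two (possibly correlated) estimates.

    Solves (a - theta*b)^2 <= t^2 * (var_a - 2*theta*cov_ab + theta^2*var_b) for theta.
    """
    t = stats.t.ppf(0.5 + level / 2, df)
    A = b**2 - t**2 * var_b
    B = -2 * (a * b - t**2 * cov_ab)
    C = a**2 - t**2 * var_a
    disc = B**2 - 4 * A * C
    if A <= 0 or disc < 0:
        raise ValueError("confidence set is unbounded: denominator not significantly nonzero")
    roots = np.sort((-B + np.array([-1.0, 1.0]) * np.sqrt(disc)) / (2 * A))
    return a / b, tuple(roots)

# Hypothetical example: a quantity read off a regression-based standard curve,
# expressed as the ratio of two estimated (and correlated) regression quantities.
ratio, (lo, hi) = fieller_ci(a=5.2, b=1.05, var_a=0.04, var_b=0.0025, cov_ab=0.001, df=10)
print(f"estimate {ratio:.2f}, 95% Fieller CI [{lo:.2f}, {hi:.2f}]")
```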

  20. An AIS-based high-resolution ship emission inventory and its uncertainty in Pearl River Delta region, China.

    Science.gov (United States)

    Li, Cheng; Yuan, Zibing; Ou, Jiamin; Fan, Xiaoli; Ye, Siqi; Xiao, Teng; Shi, Yuqi; Huang, Zhijiong; Ng, Simon K W; Zhong, Zhuangmin; Zheng, Junyu

    2016-12-15

    Ship emissions contribute significantly to air pollution and impose health risks on residents along the coastal area. Using refined data from the Automatic Identification System (AIS), this study developed a highly resolved ship emission inventory for the Pearl River Delta (PRD) region, China, home to three of the ten busiest ports in the world. The region-wide SO2, NOX, CO, PM10, PM2.5, and VOC emissions in 2013 were estimated to be 61,484, 103,717, 10,599, 7155, 6605, and 4195 t, respectively. Ocean-going vessels were the largest contributors to the total emissions, followed by coastal vessels and river vessels. In terms of ship type, container ships were the leading contributor, followed by conventional cargo ships, dry bulk carriers, fishing ships, and oil tankers. These five ship types accounted for >90% of total emissions. The spatial distributions of emissions revealed that the key emission hot spots are all concentrated within the newly proposed emission control area (ECA) and that ship emissions within the ECA covered >80% of total ship emissions in the PRD, highlighting the importance of the ECA for emission reduction in the PRD. The uncertainties of the emission estimates were quantified, with lower bounds of -24.5% to -21.2% and upper bounds of 28.6% to 33.3% at 95% confidence intervals. The lower uncertainties in this study highlight the value of AIS data in improving ship emission estimates. The AIS-based bottom-up methodology can be used for developing and upgrading ship emission inventories and formulating effective control measures on ship emissions in other port regions wherever possible. Copyright © 2016 Elsevier B.V. All rights reserved.

  1. A multiple distributed representation method based on neural network for biomedical event extraction.

    Science.gov (United States)

    Wang, Anran; Wang, Jian; Lin, Hongfei; Zhang, Jianhai; Yang, Zhihao; Xu, Kan

    2017-12-20

    Biomedical event extraction is one of the frontier domains in biomedical research. The two main subtasks of biomedical event extraction are trigger identification and argument detection, which can both be considered as classification problems. However, traditional state-of-the-art methods are based on support vector machines (SVM) with massive, manually designed one-hot features, which require enormous work but lack semantic relations among words. In this paper, we propose a multiple distributed representation method for biomedical event extraction. The method combines context, in the form of dependency-based word embeddings, with task-based features represented in a distributed way, and uses them as the input for training deep learning models. Finally, we use a softmax classifier to label the example candidates. The experimental results on the Multi-Level Event Extraction (MLEE) corpus show higher F-scores of 77.97% in trigger identification and 58.31% overall compared to the state-of-the-art SVM method. Our distributed representation method for biomedical event extraction avoids the problems of the semantic gap and the dimension disaster of traditional one-hot representation methods. The promising results demonstrate that our proposed method is effective for biomedical event extraction.

  2. Object-Based Land Use Classification of Agricultural Land by Coupling Multi-Temporal Spectral Characteristics and Phenological Events in Germany

    Science.gov (United States)

    Knoefel, Patrick; Loew, Fabian; Conrad, Christopher

    2015-04-01

    Crop maps based on classification of remotely sensed data are receiving increased attention in agricultural management. This calls for more detailed knowledge about the reliability of such spatial information. However, classification of agricultural land use is often limited by the high spectral similarity of the studied crop types. Moreover, spatially and temporally varying agro-ecological conditions can introduce confusion in crop mapping. Classification errors in crop maps in turn may influence model outputs, such as agricultural production monitoring. One major goal of the PhenoS project ("Phenological structuring to determine optimal acquisition dates for Sentinel-2 data for field crop classification") is the detection of optimal phenological time windows for land cover classification purposes. Since many crop species are spectrally highly similar, accurate classification requires the right selection of satellite images for a certain classification task. In the course of one growing season, phenological phases exist during which crops are separable with higher accuracy. For this purpose, coupling multi-temporal spectral characteristics and phenological events is promising. The focus of this study is the separation of spectrally similar cereal crops such as winter wheat, barley, and rye at two test sites in Germany called "Harz/Central German Lowland" and "Demmin". This study uses object-based random forest (RF) classification to investigate the impact of image acquisition frequency and timing on crop classification uncertainty by permuting all possible combinations of available RapidEye time series recorded at the test sites between 2010 and 2014. The permutations were applied to different segmentation parameters. Classification uncertainty was then assessed and analysed, based on the probabilistic soft output from the RF algorithm at the per-field basis. From this soft output, entropy was calculated as a spatial measure of classification uncertainty
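
    The per-object uncertainty measure described above can be emulated with scikit-learn: train a random forest, take its class-probability output, and compute a normalised Shannon entropy per object. The features below are synthetic stand-ins for multi-temporal spectral statistics; they are not the RapidEye data.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Synthetic stand-in for per-field, multi-temporal spectral features of three cereal classes.
X, y = make_classification(n_samples=1500, n_features=12, n_informative=6,
                           n_classes=3, n_clusters_per_class=1, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

rf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X_tr, y_tr)
proba = rf.predict_proba(X_te)                       # probabilistic soft output per field object

# Normalised Shannon entropy of the class-membership probabilities as per-object uncertainty.
entropy = -(proba * np.log(proba + 1e-12)).sum(axis=1) / np.log(proba.shape[1])
print(f"accuracy: {rf.score(X_te, y_te):.2f}")
print(f"mean normalised entropy: {entropy.mean():.2f}")
print(f"share of objects with entropy > 0.5 (uncertain): {(entropy > 0.5).mean():.1%}")
```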

  3. Methods for Addressing Uncertainty and Variability to Characterize Potential Health Risk From Trichloroethylene-Contaminated Ground Water at Beale Air Force Base in California: Integration of Uncertainty and Variability in Pharmacokinetics and Dose-Response

    Energy Technology Data Exchange (ETDEWEB)

    Bogen, K.T.

    1999-09-29

    Traditional estimates of health risk are typically inflated, particularly if cancer is the dominant endpoint and there is fundamental uncertainty as to mechanism(s) of action. Risk is more realistically characterized if it accounts for joint uncertainty and interindividual variability after applying a unified probabilistic approach to the distributed parameters of all (linear as well as nonlinear) risk-extrapolation models involved. Such an approach was applied to characterize risks to potential future residents posed by trichloroethylene (TCE) in ground water at an inactive landfill site on Beale Air Force Base in California. Variability and uncertainty were addressed in exposure-route-specific estimates of applied dose, in pharmacokinetically based estimates of route-specific metabolized fractions of absorbed TCE, and in corresponding biologically effective doses estimated under a genotoxic/linear (MA_G) vs. a cytotoxic/nonlinear (MA_c) mechanistic assumption for TCE-induced cancer. Increased risk conditional on effective dose was estimated under MA_G based on seven rodent-bioassay data sets, and under MA_c based on mouse hepatotoxicity data. Mean and upper-bound estimates of combined risk calculated by the unified approach were <10^-6 and <10^-4, respectively, while corresponding estimates based on traditional deterministic methods were >10^-5 and >10^-4, respectively. It was estimated that no TCE-related harm is likely to occur due to any plausible residential exposure scenario involving the site. The unified approach illustrated is particularly suited to characterizing risks that involve uncertain and/or diverse mechanisms of action.

  4. Methods for Addressing Uncertainty and Variability to Characterize Potential Health Risk from Trichloroethylene-Contaminated Ground Water at Beale Air Force Base in California:Integration of Uncertainty and Variability in Pharmacokinetics and Dose-Response

    Energy Technology Data Exchange (ETDEWEB)

    Bogen, K T

    2001-05-24

    Traditional estimates of health risk are typically inflated, particularly if cancer is the dominant endpoint and there is fundamental uncertainty as to mechanism(s) of action. Risk is more realistically characterized if it accounts for joint uncertainty and interindividual variability within a systematic probabilistic framework to integrate the joint effects on risk of distributed parameters of all (linear as well as nonlinear) risk-extrapolation models involved. Such a framework was used to characterize risks to potential future residents posed by trichloroethylene (TCE) in ground water at an inactive landfill site on Beale Air Force Base in California. Variability and uncertainty were addressed in exposure-route-specific estimates of applied dose, in pharmacokinetically based estimates of route-specific metabolized fractions of absorbed TCE, and in corresponding biologically effective doses estimated under a genotoxic/linear (MA_G) vs. a cytotoxic/nonlinear (MA_c) mechanistic assumption for TCE-induced cancer. Increased risk conditional on effective dose was estimated under MA_G based on seven rodent-bioassay data sets, and under MA_c based on mouse hepatotoxicity data. Mean and upper-bound estimates of combined risk calculated by the unified approach were <10^-6 and 10^-4, respectively, while corresponding estimates based on traditional deterministic methods were >10^-5 and 10^-4, respectively. It was estimated that no TCE-related harm is likely to occur due to any plausible residential exposure scenario involving the site. The systematic probabilistic framework illustrated is particularly suited to characterizing risks that involve uncertain and/or diverse mechanisms of action.

  5. Efficiency of Event-Based Sampling According to Error Energy Criterion

    Directory of Open Access Journals (Sweden)

    Marek Miskowicz

    2010-03-01

    Full Text Available The paper belongs to the studies that deal with the effectiveness of a particular event-based sampling scheme compared to conventional periodic sampling as a reference. In the present study, event-based sampling according to a constant energy of the sampling error is analyzed. This criterion is suitable for applications where the energy of the sampling error should be bounded (i.e., in building automation, or in greenhouse climate monitoring and control). Compared to the integral sampling criteria, the error energy criterion gives more weight to extreme sampling error values. The proposed sampling principle extends the range of event-based sampling schemes and makes the choice of a particular sampling criterion more flexible with respect to application requirements. In the paper, it is proved analytically that the proposed event-based sampling criterion is more effective than periodic sampling by a factor defined by the ratio of the maximum to the mean of the cubic root of the squared signal time-derivative in the analyzed time interval. Furthermore, it is shown that sampling according to the energy criterion is less effective than the send-on-delta scheme but more effective than sampling according to the integral criterion. On the other hand, it is indicated that the higher effectiveness of sampling according to the selected event-based criterion is obtained at the cost of increasing the total sampling error, defined as the sum of errors for all the samples taken.
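
    A numerical version of the criterion is sketched below: a new sample is taken whenever the accumulated energy of the zero-order-hold sampling error since the last sample reaches a fixed threshold, and the resulting sample count is compared with periodic sampling. The test signal, threshold and periodic rate are arbitrary; the paper's analytical effectiveness factor is not computed here.

```python
import numpy as np

dt = 1e-3
t = np.arange(0, 2.0, dt)
signal = np.sin(2 * np.pi * 1.0 * t) + 0.3 * np.sin(2 * np.pi * 5.0 * t)

def energy_triggered_samples(x, dt, threshold):
    """Sample when the energy of the zero-order-hold error since the last sample
    reaches `threshold` (the constant-error-energy criterion)."""
    idx, last, energy = [0], x[0], 0.0
    for i in range(1, len(x)):
        energy += (x[i] - last)**2 * dt
        if energy >= threshold:
            idx.append(i)
            last, energy = x[i], 0.0
    return np.array(idx)

idx = energy_triggered_samples(signal, dt, threshold=2e-4)
periodic_n = len(t[::50])                     # reference: periodic sampling every 50 ms
print(f"event-based samples: {len(idx)}, periodic samples: {periodic_n}")
# The event-based scheme concentrates samples where the signal changes fastest.
print("mean / max inter-sample interval (ms):",
      round(np.diff(idx).mean() * dt * 1e3, 1), "/", round(np.diff(idx).max() * dt * 1e3, 1))
```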

  6. Efficiency of event-based sampling according to error energy criterion.

    Science.gov (United States)

    Miskowicz, Marek

    2010-01-01

    The paper belongs to the studies that deal with the effectiveness of a particular event-based sampling scheme compared to conventional periodic sampling as a reference. In the present study, event-based sampling according to a constant energy of the sampling error is analyzed. This criterion is suitable for applications where the energy of the sampling error should be bounded (i.e., in building automation, or in greenhouse climate monitoring and control). Compared to the integral sampling criteria, the error energy criterion gives more weight to extreme sampling error values. The proposed sampling principle extends the range of event-based sampling schemes and makes the choice of a particular sampling criterion more flexible with respect to application requirements. In the paper, it is proved analytically that the proposed event-based sampling criterion is more effective than periodic sampling by a factor defined by the ratio of the maximum to the mean of the cubic root of the squared signal time-derivative in the analyzed time interval. Furthermore, it is shown that sampling according to the energy criterion is less effective than the send-on-delta scheme but more effective than sampling according to the integral criterion. On the other hand, it is indicated that the higher effectiveness of sampling according to the selected event-based criterion is obtained at the cost of increasing the total sampling error, defined as the sum of errors for all the samples taken.

  7. FCI: an R-based algorithm for evaluating uncertainty of absolute real-time PCR quantification

    Directory of Open Access Journals (Sweden)

    Gallo Fabio

    2008-01-01

    Full Text Available Background: FCI is an R code for analyzing data from real-time PCR experiments. This algorithm estimates standard curve features as well as nucleic acid concentrations and confidence intervals according to Fieller's theorem. Results: In order to describe the features of FCI, four situations were selected from real data collected during an international external quality assessment program for quantitative assays based on real-time PCR. The code generates a diagnostic figure suitable for assessing the quality of the quantification process. Conclusion: We have provided a freeware programme using this algorithm specifically designed to increase the information content of the real-time PCR assay.

  8. FCI: an R-based algorithm for evaluating uncertainty of absolute real-time PCR quantification.

    Science.gov (United States)

    Verderio, Paolo; Pizzamiglio, Sara; Gallo, Fabio; Ramsden, Simon C

    2008-01-10

    FCI is an R code for analyzing data from real-time PCR experiments. This algorithm estimates standard curve features as well as nucleic acid concentrations and confidence intervals according to Fieller's theorem. In order to describe the features of FCI four situations were selected from real data collected during an international external quality assessment program for quantitative assays based on real-time PCR. The code generates a diagnostic figure suitable for assessing the quality of the quantification process. We have provided a freeware programme using this algorithm specifically designed to increase the information content of the real-time PCR assay.

  9. Full-waveform detection of non-impulsive seismic events based on time-reversal methods

    Science.gov (United States)

    Solano, Ericka Alinne; Hjörleifsdóttir, Vala; Liu, Qinya

    2017-12-01

    We present a full-waveform detection method for non-impulsive seismic events, based on time-reversal principles. We use the strain Green's tensor as a matched filter, correlating it with continuous observed seismograms, to detect non-impulsive seismic events. We show that this is mathematically equivalent to an adjoint method for detecting earthquakes. We define the detection function, a scalar-valued function that depends on the stacked correlations for a group of stations. Event detections are given by the times at which the amplitude of the detection function exceeds a given value relative to the noise level. The method can make use of the whole seismic waveform or any combination of time windows with different filters. It is expected to have an advantage over traditional detection methods for events that do not produce energetic and impulsive P waves, for example glacial events, landslides, volcanic events and transform-fault earthquakes, provided the velocity structure along the path is relatively well known. Furthermore, the method has advantages over empirical Green's function template-matching methods, as it does not depend on records from previously detected events and is therefore not limited to events occurring in similar regions and with similar focal mechanisms as those events. The method is not specific to any particular way of calculating the synthetic seismograms, and therefore complicated structural models can be used. This is particularly beneficial for intermediate-size events that are registered on regional networks, for which the effect of lateral structure on the waveforms can be significant. To demonstrate the feasibility of the method, we apply it to two different areas located along the mid-oceanic ridge system west of Mexico where non-impulsive events have been reported. The first study area is between the Clipperton and Siqueiros transform faults (9°N), during the time of two earthquake swarms, occurring in March 2012 and May
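
    A minimal sketch of the detection idea, under strong simplifications: per-station synthetic templates (standing in for the strain Green's tensor) are correlated with continuous traces, the correlations are stacked, and detections are declared where the stack exceeds a multiple of a robust noise estimate. Function names, the threshold and the synthetic data are illustrative, not the authors' implementation.

```python
import numpy as np

def detection_function(traces, templates):
    """Stack matched-filter correlations of continuous traces with per-station
    synthetic templates (a simplified stand-in for the strain Green's tensor)."""
    stack = None
    for u, g in zip(traces, templates):
        c = np.correlate(u, g, mode="valid")      # correlation of data with template
        stack = c if stack is None else stack + c
    return stack / len(traces)

def detect(stack, n_sigma=8.0):
    """Return indices where the stack exceeds n_sigma times a robust noise level
    (median absolute deviation), i.e. candidate event origin times."""
    noise = 1.4826 * np.median(np.abs(stack - np.median(stack)))
    return np.flatnonzero(stack > n_sigma * noise)

# Synthetic example: a template buried in noise at the same offset on 5 stations.
rng = np.random.default_rng(0)
template = np.sin(2 * np.pi * np.arange(200) / 50.0) * np.hanning(200)
traces, templates = [], []
for _ in range(5):
    u = rng.normal(0.0, 1.0, 5000)
    u[2300:2500] += 3.0 * template                # hidden "event" at sample 2300
    traces.append(u)
    templates.append(template)
stack = detection_function(traces, templates)
print(detect(stack)[:5])                          # indices near 2300
```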

  10. Explorative research of methods for discrete space/time simulation integrated with the event-based approach and agent concept

    Science.gov (United States)

    Zhou, Yuhong; de By, Rolf; Augustijn, Ellen-Wien

    2006-10-01

    Geographic Information Science (GIS) has provided the methodological and technical support for modeling and simulation in the geographical domain. However, research methods for building complex simulations in which agents behave and interact in discrete time and space are lacking. The existing simulation systems/software are application-oriented and do not provide a theoretical (conceptual) view. The simulation theories and methods that exist do not incorporate spatial issues, which are the key to linking GIS with simulation theory and practice. This paper introduces a method for developing a conceptual theoretical framework for a spatial simulation system which can potentially be integrated with GIS. Firstly, based on classical discrete event simulation and agent technology, a simulation theory is proposed, which is represented by a conceptual simulation model using UML-based visual syntax. In this theoretical framework, spatial issues including spatial setting, spatial constraints, spatial effects and spatial awareness are emphasized. Next, a testing scenario in the microscopic traffic simulation domain is set up to examine the feasibility of the simulation philosophy. Finally, the method is evaluated from the aspects of feasibility, uncertainty and applicability.

  11. A Risk-Based Interval Two-Stage Programming Model for Agricultural System Management under Uncertainty

    Directory of Open Access Journals (Sweden)

    Ye Xu

    2016-01-01

    Full Text Available Nonpoint source (NPS) pollution caused by agricultural activities is a main reason that water quality in a watershed deteriorates. Moreover, pollution control is accompanied by a fall in revenue for the agricultural system. How to design and generate a cost-effective and environmentally friendly agricultural production pattern is therefore a critical issue for local managers. In this study, a risk-based interval two-stage programming (RBITSP) model was developed. Compared to the general ITSP model, the significant contribution of the RBITSP model is that it emphasizes the importance of financial risk under various probabilistic levels, rather than concentrating only on expected economic benefit, where risk is expressed as the probability of not meeting a target profit under each individual scenario realization. This effectively avoids the inaccuracy of solutions caused by a traditional expected-value objective function and generates a variety of solutions through the adjustment of weight coefficients, reflecting the trade-off between system economy and reliability. A case study of agricultural production management in the Tai Lake watershed was used to demonstrate the superiority of the proposed model. The obtained results could serve as a basis for designing land-structure adjustment patterns and farmland retirement schemes and for balancing system benefit, system-failure risk, and water-body protection.

  12. Extreme rainfall analysis based on precipitation events classification in Northern Italy

    Science.gov (United States)

    Campo, Lorenzo; Fiori, Elisabetta; Molini, Luca

    2016-04-01

    Extreme rainfall statistical analysis comprises a consolidated family of techniques for studying the frequency and the statistical properties of high-intensity meteorological events. These techniques are well established and include standard approaches such as fitting the GEV (Generalized Extreme Value) or TCEV (Two-Component Extreme Value) probability distribution to the data recorded by a given raingauge at a given location. Regionalization techniques, which aim to spatialize the analysis over medium-to-large regions, are also well established and used operationally. In this work a novel procedure is proposed to statistically characterize the rainfall extremes in a given region, based on an "event-based" approach. Given a temporal sequence of continuous rain maps, an "event" is defined as an aggregate of cells, continuous in time and space, whose rainfall height exceeds a certain threshold. Based on this definition it is possible to classify, for a given region and period, a population of events and to characterize them with a number of statistics, such as total volume, maximum spatial extension, duration, and average intensity. The population of events so obtained constitutes the input of a novel extreme-value characterization technique: given a certain spatial scale, a moving-window analysis is performed and all the events that fall within the window are analysed from an extreme-value point of view. For each window, the extreme annual events are considered: maximum total volume, maximum spatial extension, maximum intensity and maximum duration are all considered for an extreme-value analysis and the corresponding probability distributions are fitted. In this way the analysis statistically characterizes the most intense events and, at the same time, spatializes these rain characteristics, exploring their variability in space. This methodology was employed on rainfall fields obtained by interpolation of
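
    The event definition above translates naturally into connected-component labelling of a space-time rainfall array followed by an extreme-value fit of annual maxima of the per-event statistics. The sketch below assumes a 3-D array `rain` ordered as (time, y, x); the threshold, the toy annual maxima and the GEV fit via scipy are illustrative assumptions, not the authors' exact workflow.

```python
import numpy as np
from scipy import ndimage, stats

def classify_events(rain, threshold):
    """Label space-time rainfall 'events': connected cells of a (time, y, x)
    array above a threshold, and return simple per-event statistics."""
    labels, n = ndimage.label(rain > threshold)     # 3-D connected components
    events = []
    for k in range(1, n + 1):
        cells = rain[labels == k]
        t_idx = np.nonzero(labels == k)[0]          # time indices of the event
        events.append({
            "volume": float(cells.sum()),
            "max_intensity": float(cells.max()),
            "duration": int(t_idx.max() - t_idx.min() + 1),
        })
    return events

# Tiny synthetic example: random rain field, threshold at the 95th percentile.
rng = np.random.default_rng(0)
rain = rng.gamma(shape=0.5, scale=2.0, size=(48, 20, 20))
events = classify_events(rain, threshold=np.percentile(rain, 95))
print(len(events), max(e["volume"] for e in events))

# Extreme-value step: fit a GEV to (here, made-up) annual maxima of one statistic.
annual_max_volume = np.array([120.0, 95.0, 210.0, 160.0, 300.0, 180.0, 140.0, 250.0])
shape, loc, scale = stats.genextreme.fit(annual_max_volume)
print(round(shape, 3), round(loc, 1), round(scale, 1))
```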

  13. A rigorous methodology for development and uncertainty analysis of group contribution based property models

    DEFF Research Database (Denmark)

    Frutiger, Jerome; Abildskov, Jens; Sin, Gürkan

    ) assessment of property model prediction errors, (iii) effect of outliers and data pre-treatment, (iv) formulation of parameter estimation problem (e.g. weighted least squares, ordinary least squares, robust regression, etc.) In this study a comprehensive methodology is developed to perform a rigorous...... and step-by-step assessment and solution of the pitfalls involved in developing models. The methodology takes into account of the following steps. 1) Experimental data collection and providing structural information of molecules. 2) Choice of the regression model: a) ordinary least square b) robust or c...... covariance matrix b) based on boot strap method. Providing 95%-confidence intervals of parameters and predicted property. 6) Performance statistics analysis and model application. The application of the methodology is shown for a new GC model built to predict lower flammability limit (LFL) for refrigerants...

  14. Group-Contribution based Property Estimation and Uncertainty analysis for Flammability-related Properties

    DEFF Research Database (Denmark)

    Frutiger, Jerome; Marcarie, Camille; Abildskov, Jens

    2016-01-01

    .0% and 0.99 for FP as well as 6.4% and 0.76 for AIT. Moreover, the temperature-dependence of LFL property was studied. A compound specific proportionality constant (KLFL) between LFL and temperature is introduced and an MG GC model to estimate KLFL is developed. Overall the ability to predict flammability......This study presents new group contribution (GC) models for the prediction of Lower and Upper Flammability Limits (LFL and UFL), Flash Point (FP) and Auto Ignition Temperature (AIT) of organic chemicals applying the Marrero/Gani (MG) method. Advanced methods for parameter estimation using robust...... regression and outlier treatment have been applied to achieve high accuracy. Furthermore, linear error propagation based on covariance matrix of estimated parameters was performed. Therefore, every estimated property value of the flammability-related properties is reported together with its corresponding 95...

  15. Demand Uncertainty

    DEFF Research Database (Denmark)

    Nguyen, Daniel Xuyen

    This paper presents a model of trade that explains why firms wait to export and why many exporters fail. Firms face uncertain demands that are only realized after the firm enters the destination. The model retools the timing of uncertainty resolution found in productivity heterogeneity models...... the high rate of exit seen in the first years of exporting. Finally, when faced with multiple countries in which to export, some firms will choose to sequentially export in order to slowly learn more about their chances for success in untested markets.

  16. Neural correlates of attentional and mnemonic processing in event-based prospective memory

    Directory of Open Access Journals (Sweden)

    Justin B Knight

    2010-02-01

    Full Text Available Prospective memory, or memory for realizing delayed intentions, was examined with an event-based paradigm while simultaneously measuring neural activity with high-density EEG recordings. Specifically, the neural substrates of monitoring for an event-based cue were examined, as well as those perhaps associated with the cognitive processes supporting detection of cues and fulfillment of intentions. Participants engaged in a baseline lexical decision task (LDT, followed by a LDT with an embedded prospective memory (PM component. Event-based cues were constituted by color and lexicality (red words. Behavioral data provided evidence that monitoring, or preparatory attentional processes, were used to detect cues. Analysis of the event-related potentials (ERP revealed visual attentional modulations at 140 and 220 ms post-stimulus associated with preparatory attentional processes. In addition, ERP components at 220, 350, and 400 ms post-stimulus were enhanced for intention-related items. Our results suggest preparatory attention may operate by selectively modulating processing of features related to a previously formed event-based intention, as well as provide further evidence for the proposal that dissociable component processes support the fulfillment of delayed intentions.

  17. PHUIMUS: A Potential High Utility Itemsets Mining Algorithm Based on Stream Data with Uncertainty

    Directory of Open Access Journals (Sweden)

    Ju Wang

    2017-01-01

    Full Text Available High utility itemset (HUI) mining has recently been a hot topic; it can be used to mine profitable itemsets by considering both quantity and profit factors. Up to now, HUI mining over uncertain datasets and HUI mining over data streams have been studied separately. However, to the best of our knowledge, HUI mining over uncertain data streams has seldom been studied. In this paper, the PHUIMUS (potential high utility itemsets mining over uncertain data stream) algorithm is proposed to mine potential high utility itemsets (PHUIs), which represent itemsets with high utilities and high existential probabilities, over an uncertain data stream based on sliding windows. To realize the algorithm, a potential utility list over the uncertain data stream (PUS-list) is designed to mine PHUIs without rescanning the analyzed uncertain data stream. A transaction weighted probability and utility tree over the uncertain data stream (TWPUS-tree) is also designed to decrease the number of candidate itemsets generated by the PHUIMUS algorithm. Extensive experiments are conducted in terms of run-time, number of discovered PHUIs, memory consumption, and scalability on real-life and synthetic databases. The results show that our proposed algorithm is reasonable and acceptable for mining meaningful PHUIs from uncertain data streams.
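
    The notion of a "potential" high utility itemset can be illustrated with a brute-force computation of expected utility over a sliding window of uncertain transactions: for each transaction containing the itemset, its utility is weighted by the product of the items' existential probabilities. This naive enumeration is only a conceptual sketch; PHUIMUS avoids it precisely by means of the PUS-list and TWPUS-tree structures. The data and the minimum-utility threshold below are made up.

```python
from itertools import combinations

# Unit profits and a sliding window of uncertain transactions:
# item -> (quantity, existential probability). Illustrative data only.
profit = {"a": 5.0, "b": 2.0, "c": 1.0}
window = [
    {"a": (2, 0.9), "b": (1, 0.7)},
    {"a": (1, 0.6), "c": (4, 0.8)},
    {"b": (3, 0.9), "c": (2, 0.5)},
]

def potential_utility(itemset, window):
    """Expected utility of an itemset over the window: for each transaction
    containing the whole itemset, its utility is weighted by the product of
    the items' existential probabilities."""
    total = 0.0
    for t in window:
        if all(i in t for i in itemset):
            u = sum(t[i][0] * profit[i] for i in itemset)
            p = 1.0
            for i in itemset:
                p *= t[i][1]
            total += u * p
    return total

items = sorted(profit)
for size in range(1, len(items) + 1):
    for itemset in combinations(items, size):
        pu = potential_utility(itemset, window)
        if pu >= 5.0:                      # illustrative minimum-utility threshold
            print(itemset, round(pu, 2))
```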

  18. Selecting Tanker Steaming Speeds under Uncertainty: A Rule-Based Bayesian Reasoning Approach

    Directory of Open Access Journals (Sweden)

    N.S.F. Abdul Rahman

    2015-06-01

    Full Text Available In the tanker industry, there are many uncertain conditions that tanker companies have to deal with, for example the global financial crisis and economic recession, increases in bunker fuel prices, and global climate change. Such conditions have forced tanker companies to change tanker speeds from full speed to slow speed, extra slow speed and super slow speed. The objective of this paper is therefore to present a methodology for determining the vessel speeds of tankers that minimize the cost of the vessels under such conditions. The four levels of vessel speed in the tanker industry are investigated, incorporating a number of uncertain conditions. This is done by developing a scientific model using a rule-based Bayesian reasoning method. The proposed model has produced 96 rules that can be used as guidance in the decision-making process. Such results help tanker companies to determine the appropriate vessel speed to be used in a dynamic operational environment.

  19. A process mining-based investigation of adverse events in care processes.

    Science.gov (United States)

    Caron, Filip; Vanthienen, Jan; Vanhaecht, Kris; Van Limbergen, Erik; Deweerdt, Jochen; Baesens, Bart

    2014-01-01

    This paper proposes the Clinical Pathway Analysis Method (CPAM) approach that enables the extraction of valuable organisational and medical information on past clinical pathway executions from the event logs of healthcare information systems. The method deals with the complexity of real-world clinical pathways by introducing a perspective-based segmentation of the date-stamped event log. CPAM enables the clinical pathway analyst to effectively and efficiently acquire a profound insight into the clinical pathways. By comparing the specific medical conditions of patients with the factors used for characterising the different clinical pathway variants, the medical expert can identify the best therapeutic option. Process mining-based analytics enables the acquisition of valuable insights into clinical pathways, based on the complete audit traces of previous clinical pathway instances. Additionally, the methodology is suited to assess guideline compliance and analyse adverse events. Finally, the methodology provides support for eliciting tacit knowledge and providing treatment selection assistance.

  20. Pinning cluster synchronization in an array of coupled neural networks under event-based mechanism.

    Science.gov (United States)

    Li, Lulu; Ho, Daniel W C; Cao, Jinde; Lu, Jianquan

    2016-04-01

    Cluster synchronization is a typical collective behavior in coupled dynamical systems, where the synchronization occurs within one group, while there is no synchronization among different groups. In this paper, under event-based mechanism, pinning cluster synchronization in an array of coupled neural networks is studied. A new event-triggered sampled-data transmission strategy, where only local and event-triggering states are utilized to update the broadcasting state of each agent, is proposed to realize cluster synchronization of the coupled neural networks. Furthermore, a self-triggered pinning cluster synchronization algorithm is proposed, and a set of iterative procedures is given to compute the event-triggered time instants. Hence, this will reduce the computational load significantly. Finally, an example is given to demonstrate the effectiveness of the theoretical results. Crown Copyright © 2015. Published by Elsevier Ltd. All rights reserved.

  1. Uncertainty analysis

    Energy Technology Data Exchange (ETDEWEB)

    Thomas, R.E.

    1982-03-01

    An evaluation is made of the suitability of analytical and statistical sampling methods for making uncertainty analyses. The adjoint method is found to be well-suited for obtaining sensitivity coefficients for computer programs involving large numbers of equations and input parameters. For this purpose the Latin Hypercube Sampling method is found to be inferior to conventional experimental designs. The Latin hypercube method can be used to estimate output probability density functions, but requires supplementary rank transformations followed by stepwise regression to obtain uncertainty information on individual input parameters. A simple Cork and Bottle problem is used to illustrate the efficiency of the adjoint method relative to certain statistical sampling methods. For linear models of the form Ax=b it is shown that a complete adjoint sensitivity analysis can be made without formulating and solving the adjoint problem. This can be done either by using a special type of statistical sampling or by reformulating the primal problem and using suitable linear programming software.
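
    For readers unfamiliar with the statistical-sampling side of the comparison, the sketch below generates a basic Latin hypercube design and computes rank correlations of each input with a toy model output, a simple stand-in for the rank-transformation-plus-regression step mentioned above. The toy model, sample sizes and seed are illustrative assumptions.

```python
import numpy as np

def latin_hypercube(n_samples, n_params, seed=None):
    """Basic Latin hypercube sample on the unit hypercube: one point per
    stratum in every dimension, with strata permuted independently per column."""
    rng = np.random.default_rng(seed)
    u = (np.arange(n_samples)[:, None] + rng.random((n_samples, n_params))) / n_samples
    for j in range(n_params):
        u[:, j] = u[rng.permutation(n_samples), j]
    return u

def rank_sensitivity(x, y):
    """Spearman-type rank correlation of each input column with the output,
    a simple stand-in for rank regression on the sampled design."""
    def ranks(v):
        r = np.empty_like(v)
        r[np.argsort(v)] = np.arange(len(v))
        return r
    ry = ranks(y)
    return np.array([np.corrcoef(ranks(x[:, j]), ry)[0, 1] for j in range(x.shape[1])])

x = latin_hypercube(200, 3, seed=0)
y = 3.0 * x[:, 0] + 0.5 * x[:, 1] ** 2 + 0.1 * np.sin(10 * x[:, 2])   # toy model
print(rank_sensitivity(x, y))
```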

  2. A Risk Explicit Interval Linear Programming Model for Uncertainty-Based Environmental Economic Optimization in the Lake Fuxian Watershed, China

    Directory of Open Access Journals (Sweden)

    Xiaoling Zhang

    2013-01-01

    Full Text Available The conflict between water environment protection and economic development has brought severe water pollution and restricted sustainable development in the watershed. A risk explicit interval linear programming (REILP) method was used to solve the integrated watershed environmental-economic optimization problem. Interval linear programming (ILP) and REILP models for uncertainty-based environmental economic optimization at the watershed scale were developed for the management of the Lake Fuxian watershed, China. Scenario analysis was introduced into the model solution process to ensure the practicality and operability of the optimization schemes. Decision makers’ preferences for risk levels can be expressed by inputting different discrete aspiration level values into the REILP model in three periods under two scenarios. By balancing the optimal system returns and corresponding system risks, decision makers can develop an efficient industrial restructuring scheme based directly on the window of “low risk and high return efficiency” in the trade-off curve. The representative schemes at the turning points of the two scenarios were interpreted and compared to identify a preferable planning alternative, which has relatively low risks and nearly maximum benefits. This study provides new insights and proposes a tool, REILP, for decision makers to develop an effective environmental economic optimization scheme in integrated watershed management.

  3. A risk explicit interval linear programming model for uncertainty-based environmental economic optimization in the Lake Fuxian watershed, China.

    Science.gov (United States)

    Zhang, Xiaoling; Huang, Kai; Zou, Rui; Liu, Yong; Yu, Yajuan

    2013-01-01

    The conflict between water environment protection and economic development has brought severe water pollution and restricted sustainable development in the watershed. A risk explicit interval linear programming (REILP) method was used to solve the integrated watershed environmental-economic optimization problem. Interval linear programming (ILP) and REILP models for uncertainty-based environmental economic optimization at the watershed scale were developed for the management of the Lake Fuxian watershed, China. Scenario analysis was introduced into the model solution process to ensure the practicality and operability of the optimization schemes. Decision makers' preferences for risk levels can be expressed by inputting different discrete aspiration level values into the REILP model in three periods under two scenarios. By balancing the optimal system returns and corresponding system risks, decision makers can develop an efficient industrial restructuring scheme based directly on the window of "low risk and high return efficiency" in the trade-off curve. The representative schemes at the turning points of the two scenarios were interpreted and compared to identify a preferable planning alternative, which has relatively low risks and nearly maximum benefits. This study provides new insights and proposes a tool, REILP, for decision makers to develop an effective environmental economic optimization scheme in integrated watershed management.

  4. RSS-Based Method for Sensor Localization with Unknown Transmit Power and Uncertainty in Path Loss Exponent.

    Science.gov (United States)

    Huang, Jiyan; Liu, Peng; Lin, Wei; Gui, Guan

    2016-09-08

    The localization of a sensor in wireless sensor networks (WSNs) has now gained considerable attention. Since the transmit power and path loss exponent (PLE) are two critical parameters in the received signal strength (RSS) localization technique, many RSS-based location methods, considering the case that both the transmit power and PLE are unknown, have been proposed in the literature. However, these methods require a search process, and cannot give a closed-form solution to sensor localization. In this paper, a novel RSS localization method with a closed-form solution based on a two-step weighted least squares estimator is proposed for the case with the unknown transmit power and uncertainty in PLE. Furthermore, the complete performance analysis of the proposed method is given in the paper. Both the theoretical variance and Cramer-Rao lower bound (CRLB) are derived. The relationships between the deterministic CRLB and the proposed stochastic CRLB are presented. The paper also proves that the proposed method can reach the stochastic CRLB.
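
    For context, RSS-based localization of this kind is usually built on the log-distance path-loss model shown below, in which both the reference power (absorbing the unknown transmit power) and the path loss exponent appear as unknowns alongside the sensor position. The notation is generic and not necessarily that of the paper.

```latex
% Log-distance RSS measurement model (generic notation):
%   P_i   RSS (dBm) observed at the i-th anchor located at s_i
%   P_0   reference power at distance d_0 (absorbs the unknown transmit power)
%   beta  path loss exponent (PLE), treated as uncertain
%   x     unknown sensor position
P_i \;=\; P_0 \;-\; 10\,\beta\,\log_{10}\!\frac{\lVert \mathbf{x}-\mathbf{s}_i\rVert}{d_0} \;+\; n_i,
\qquad n_i \sim \mathcal{N}\!\left(0,\sigma_i^{2}\right), \quad i = 1,\dots,N.
```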

  5. Knowledge-Driven Event Extraction in Russian: Corpus-Based Linguistic Resources

    Directory of Open Access Journals (Sweden)

    Valery Solovyev

    2016-01-01

    Full Text Available Automatic event extraction from text is an important step in knowledge acquisition and knowledge base population. Manual work in the development of an extraction system is indispensable, either for corpus annotation or for the creation of vocabularies and patterns for a knowledge-based system. Recent work has focused on adapting existing systems (for extraction from English texts) to new domains. Event extraction in other languages has not been studied, due to the lack of resources and algorithms necessary for natural language processing. In this paper we define a set of linguistic resources that are necessary for the development of a knowledge-based event extraction system in Russian: a vocabulary of subordination models, a vocabulary of event triggers, and a vocabulary of Frame Elements that are the basic building blocks for semantic patterns. We propose a set of methods for the creation of such vocabularies in Russian and other languages using the Google Books NGram Corpus. The methods are evaluated in the development of an event extraction system for Russian.

  6. Quality of life and uncertainty in illness for chronic patients

    Directory of Open Access Journals (Sweden)

    Valeria Caruso

    2014-09-01

    Full Text Available The experience of chronic illness, together with physical impairment and in some cases hospitalization, can be difficult to manage. Illness brings changes in patients' lifestyle and limitations that often cause psychological distress. Patients may neither understand the meaning of the events correlated with their illness nor be able to predict when such events will occur. This uncertainty augments the negative impact of chronic illness on patients' quality of life. The present study examines the correlations between uncertainty due to chronic disease and patients' quality of life, taking into account the different coping strategies adopted and the anxiety/depression developed during hospitalization. There is an inverse correlation between chronic patients' quality of life and the different dimensions of uncertainty in illness as identified by the Mishel Uncertainty in Illness Scale. The paper suggests that uncertainty hampers patients' ability to choose coping strategies involving active management of their illness. The lower the uncertainty, the higher the possibility of activating coping mechanisms based on acceptance of the illness, together with a reflective attitude concerning the actions to be taken to reduce the risk of anxiety/depression during hospitalization. Finally, the study presents some policy implications, suggesting that medical staff should not only treat patients but also help them to elaborate problem-solving strategies and to positively accept their chronic health state.

  7. An event-based neurobiological recognition system with orientation detector for objects in multiple orientations

    Directory of Open Access Journals (Sweden)

    Hanyu Wang

    2016-11-01

    Full Text Available In this paper a new multiple-orientation, event-based neurobiological recognition system is proposed that integrates recognition and tracking functions and is intended for asynchronous address-event representation (AER) image sensors. The system can recognize objects in multiple orientations using only training samples moving in a single orientation. It extracts multi-scale and multi-orientation line features inspired by models of the primate visual cortex. An orientation detector based on a modified Gaussian blob tracking algorithm is introduced for object tracking and orientation detection. The orientation detector and the feature extraction block work simultaneously, without any increase in categorization time. An address lookup table (address LUT) is also presented to adjust the feature maps by address mapping and reordering before they are categorized in the trained spiking neural network. The recognition system is evaluated with the MNIST dataset, which has played an important role in the development of computer vision, and the accuracy is increased owing to the use of both ON and OFF events. AER data acquired by a DVS, such as moving digits, poker cards, and vehicles, are also tested on the system. The experimental results show that the proposed system can realize event-based multi-orientation recognition. The work presented in this paper makes a number of contributions to event-based vision processing systems for multi-orientation object recognition. It develops a new tracking-recognition architecture for a feedforward categorization system and an address-reordering approach to classify multi-orientation objects using event-based data. It provides a new way to recognize objects in multiple orientations with training samples in a single orientation only.

  8. Limits on the Efficiency of Event-Based Algorithms for Monte Carlo Neutron Transport

    Energy Technology Data Exchange (ETDEWEB)

    Romano, Paul K.; Siegel, Andrew R.

    2017-04-16

    The traditional form of parallelism in Monte Carlo particle transport simulations, wherein each individual particle history is considered a unit of work, does not lend itself well to data-level parallelism. Event-based algorithms, which were originally used for simulations on vector processors, may offer a path toward better utilizing data-level parallelism in modern computer architectures. In this study, a simple model is developed for estimating the efficiency of the event-based particle transport algorithm under two sets of assumptions. Data collected from simulations of four reactor problems using OpenMC was then used in conjunction with the models to calculate the speedup due to vectorization as a function of two parameters: the size of the particle bank and the vector width. When each event type is assumed to have constant execution time, the achievable speedup is directly related to the particle bank size. We observed that the bank size generally needs to be at least 20 times greater than vector size in order to achieve vector efficiency greater than 90%. When the execution times for events are allowed to vary, however, the vector speedup is also limited by differences in execution time for events being carried out in a single event-iteration. For some problems, this implies that vector efficiencies over 50% may not be attainable. While there are many factors impacting performance of an event-based algorithm that are not captured by our model, it nevertheless provides insights into factors that may be limiting in a real implementation.
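
    The qualitative dependence of vector efficiency on bank size can be reproduced with a toy simulation: particles need a random number of events, alive particles are processed in full vector batches each iteration, and partially filled batches waste lanes as the bank drains. This is an illustrative model only, with assumed geometric event counts; it is not the model developed in the paper.

```python
import math
import numpy as np

def vector_efficiency(bank_size, vector_width, mean_events=20.0, seed=None):
    """Toy estimate of SIMD lane utilisation for an event-based transport loop:
    each particle needs a geometric number of events; per iteration the alive
    particles are processed in full vector batches, so partially filled
    batches waste lanes. Illustrative model only, not the paper's."""
    rng = np.random.default_rng(seed)
    remaining = rng.geometric(1.0 / mean_events, size=bank_size)
    useful = int(remaining.sum())
    padded = 0
    while (alive := int(np.count_nonzero(remaining))) > 0:
        padded += math.ceil(alive / vector_width) * vector_width
        remaining[remaining > 0] -= 1          # every alive particle does one event
    return useful / padded

for bank in (256, 2048, 16384):
    print(bank, round(vector_efficiency(bank, vector_width=512, seed=1), 3))
```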

  9. Measurement of the underlying event using track-based event shapes in Z→l⁺l⁻ events with ATLAS

    Energy Technology Data Exchange (ETDEWEB)

    Schulz, Holger

    2014-09-11

    This thesis describes a measurement of hadron-collider event shapes in proton-proton collisions at a centre-of-momentum energy of 7 TeV at the Large Hadron Collider (LHC) at CERN (Conseil Européen pour la Recherche Nucléaire), located near Geneva, Switzerland. The analysed data (integrated luminosity: 1.1 fb⁻¹) were recorded in 2011 with the ATLAS experiment. Events in which a Z boson was produced in the hard sub-process and subsequently decays into an electron-positron or muon-antimuon pair were selected for this analysis. The observables are calculated using all reconstructed tracks of charged particles within the acceptance of the inner detector of ATLAS, except those of the leptons from the Z decay. Thus, this is the first measurement of its kind. The observables were corrected for background processes using data-driven methods. For the correction of so-called "pile-up" (multiple overlapping proton-proton collisions) a novel technique was developed and successfully applied. The data were further unfolded to correct for remaining detector effects. The obtained distributions are especially sensitive to the so-called "underlying event" and can be compared with predictions of Monte Carlo event generators directly, i.e. without the need to run time-consuming simulations of the ATLAS detector. Finally, an attempt was made to improve the predictions of the event generators Pythia8 and Sherpa by finding an optimised setting of relevant model parameters using a technique called "tuning". It became apparent, however, that the underlying Sjöstrand-Zijl model is unable to give a good description of the measured event-shape distributions.

  10. Multi-scale event synchronization analysis for unravelling climate processes: a wavelet-based approach

    Science.gov (United States)

    Agarwal, Ankit; Marwan, Norbert; Rathinasamy, Maheswaran; Merz, Bruno; Kurths, Jürgen

    2017-10-01

    The temporal dynamics of climate processes are spread across different timescales and, as such, the study of these processes at only one selected timescale might not reveal the complete mechanisms and interactions within and between the (sub-)processes. To capture the non-linear interactions between climatic events, the method of event synchronization has found increasing attention recently. The main drawback with the present estimation of event synchronization is its restriction to analysing the time series at one reference timescale only. The study of event synchronization at multiple scales would be of great interest to comprehend the dynamics of the investigated climate processes. In this paper, the wavelet-based multi-scale event synchronization (MSES) method is proposed by combining the wavelet transform and event synchronization. Wavelets are used extensively to comprehend multi-scale processes and the dynamics of processes across various timescales. The proposed method allows the study of spatio-temporal patterns across different timescales. The method is tested on synthetic and real-world time series in order to check its replicability and applicability. The results indicate that MSES is able to capture relationships that exist between processes at different timescales.

  11. Uncertainty and innovation: Understanding the role of cell-based manufacturing facilities in shaping regulatory and commercialization environments.

    Science.gov (United States)

    Isasi, Rosario; Rahimzadeh, Vasiliki; Charlebois, Kathleen

    2016-12-01

    The purpose of this qualitative study is to elucidate stakeholder perceptions of, and institutional practices related to cell-based therapies and products (CTP) regulation and commercialization in Canada. The development of reproducible, safe and effective CTPs is predicated on regulatory and commercialization environments that enable innovation. Manufacturing processes constitute a critical step for CTP development in this regard. The road from CTP manufacturing to translation in the clinic, however, has yet to be paved. This study aims to fill an empirical gap in the literature by exploring how CTP manufacturing facilities navigate Canadian regulatory and commercialization environments, which together drive the translation of novel CTPs from bench to bedside. Using the multi-level model of practice-driven institutional change proposed by Smets et al., we demonstrate how CTP manufacturing practices are governed by established standards, yet meaningfully shape higher-order regulatory and commercial norms in CTP research and development. We identify four key themes that undergird such processes of innovation: 1) managing regulatory uncertainty, which stems from an inability to classify CTPs within existing regulatory categories for approval and commercialization purposes; 2) building a 'business case' whereby a CTP's market potential is determined in large part by proving its safety and effectiveness; 3) standardizing manufacturing procedures that mobilize CTPs from a research and development phase to a commercialization one; and 4) networking between researchers and regulators to develop responsible commercialization processes that reflect the uniqueness of CTPs as distinct from other biologics and medical devices.

  12. A Generalized Perturbation Theory Solver In Rattlesnake Based On PETSc With Application To TREAT Steady State Uncertainty Quantification

    Energy Technology Data Exchange (ETDEWEB)

    Schunert, Sebastian; Wang, Congjian; Wang, Yaqi; Kong, Fande; Ortensi, Javier; Baker, Benjamin; Gleicher, Frederick; DeHart, Mark; Martineau, Richard

    2017-04-01

    Rattlesnake and MAMMOTH are the designated TREAT analysis tools currently being developed at the Idaho National Laboratory. Concurrent with the development of the multi-physics, multi-scale capabilities, sensitivity analysis and uncertainty quantification (SA/UQ) capabilities are required for predictive modeling of the TREAT reactor. For steady-state SA/UQ, which is essential for setting initial conditions for the transients, generalized perturbation theory (GPT) will be used. This work describes the implementation of a PETSc-based solver for the generalized adjoint equations, which constitute an inhomogeneous, rank-deficient problem. The standard approach is to use an outer iteration strategy with repeated removal of the fundamental mode contamination. The described GPT algorithm directly solves the GPT equations without the need for an outer iteration procedure by using Krylov subspaces that are orthogonal to the operator's nullspace. Three test problems are solved and provide sufficient verification of Rattlesnake's GPT capability. We conclude with a preliminary example evaluating the impact of the boron distribution in the TREAT reactor using perturbation theory.

  13. Uncertainty of large-area estimates of indicators of forest structural gamma diversity: A study based on national forest inventory data

    Science.gov (United States)

    Susanne Winter; Andreas Böck; Ronald E. McRoberts

    2012-01-01

    Tree diameter and height are commonly measured forest structural variables, and indicators based on them are candidates for assessing forest diversity. We conducted our study on the uncertainty of estimates for mostly large geographic scales for four indicators of forest structural gamma diversity: mean tree diameter, mean tree height, and standard deviations of tree...

  14. Comparison of Two Methods for Estimating the Sampling-Related Uncertainty of Satellite Rainfall Averages Based on a Large Radar Data Set

    Science.gov (United States)

    Lau, William K. M. (Technical Monitor); Bell, Thomas L.; Steiner, Matthias; Zhang, Yu; Wood, Eric F.

    2002-01-01

    The uncertainty of rainfall estimated from averages of discrete samples collected by a satellite is assessed using a multi-year radar data set covering a large portion of the United States. The sampling-related uncertainty of rainfall estimates is evaluated for all combinations of 100 km, 200 km, and 500 km space domains, 1 day, 5 day, and 30 day rainfall accumulations, and regular sampling time intervals of 1 h, 3 h, 6 h, 8 h, and 12 h. These extensive analyses are combined to characterize the sampling uncertainty as a function of space and time domain, sampling frequency, and rainfall characteristics by means of a simple scaling law. Moreover, it is shown that both parametric and non-parametric statistical techniques of estimating the sampling uncertainty produce comparable results. Sampling uncertainty estimates, however, do depend on the choice of technique for obtaining them. They can also vary considerably from case to case, reflecting the great variability of natural rainfall, and should therefore be expressed in probabilistic terms. Rainfall calibration errors are shown to affect comparison of results obtained by studies based on data from different climate regions and/or observation platforms.

  15. Ensemble-based analysis of Front Range severe convection on 6-7 June 2012: Forecast uncertainty and communication of weather information to Front Range decision-makers

    Science.gov (United States)

    Vincente, Vanessa

    The variation of topography in Colorado not only adds to the beauty of its landscape, but also tests our ability to predict warm season severe convection. Deficient radar coverage and limited observations make quantitative precipitation forecasting quite a challenge. Past studies have suggested that greater forecast skill for mesoscale convection initiation and precipitation characteristics is achievable with an ensemble with explicitly predicted convection compared to one with parameterized convection. The range of uncertainty and probabilities in these forecasts can help forecasters in their precipitation predictions and in the communication of weather information to emergency managers (EMs). EMs serve an integral role in informing and protecting communities in anticipation of hazardous weather. An example of such an event occurred on the evening of 6 June 2012, when areas to the lee of the Rocky Mountain Front Range were impacted by flash-flood-producing severe convection that included heavy rain and copious amounts of hail. Despite discrepancies in the timing, location and evolution of convection, the convection-allowing ensemble forecasts generally outperformed those of the convection-parameterized ensemble in representing the mesoscale processes responsible for the 6-7 June severe convective event. Key features sufficiently reproduced by several of the convection-allowing ensemble members resembled the observations: 1) the general location of a convergence boundary east of Denver, 2) convective initiation along the boundary, 3) the general location of a weak cold front near the Wyoming/Nebraska border, and 4) cold pools and moist upslope characteristics that contributed to the backbuilding of convection. Members from the convection-parameterized ensemble that failed to reproduce these results displaced the convergence boundary, produced a cold front that moved southeast too quickly, and used the cold front for convective initiation. The convection

  16. A browser-based event display for the CMS experiment at the LHC

    Science.gov (United States)

    Hategan, M.; McCauley, T.; Nguyen, P.

    2012-12-01

    The line between native and web applications is becoming increasingly blurred as modern web browsers are becoming powerful platforms on which applications can be run. Such applications are trivial to install and are readily extensible and easy to use. In an educational setting, web applications permit a way to deploy tools in a highly-restrictive computing environment. The I2U2 collaboration has developed a browser-based event display for viewing events in data collected and released to the public by the CMS experiment at the LHC. The application itself reads a JSON event format and uses the JavaScript 3D rendering engine pre3d. The only requirement is a modern browser using HTML5 canvas. The event display has been used by thousands of high school students in the context of programs organized by I2U2, QuarkNet, and IPPOG. This browser-based approach to display of events can have broader usage and impact for experts and public alike.

  17. Event-based Plausibility Immediately Influences On-line Language Comprehension

    Science.gov (United States)

    Matsuki, Kazunaga; Chow, Tracy; Hare, Mary; Elman, Jeffrey L.; Scheepers, Christoph; McRae, Ken

    2011-01-01

    In some theories of sentence comprehension, linguistically-relevant lexical knowledge such as selectional restrictions is privileged in terms of the time-course of its access and influence. We examined whether event knowledge computed by combining multiple concepts can rapidly influence language understanding even in the absence of selectional restriction violations. Specifically, we investigated whether instruments can combine with actions to influence comprehension of ensuing patients. Instrument-verb-patient triplets were created in a norming study designed to tap directly into event knowledge. In self-paced reading (Experiment 1), participants were faster to read patient nouns such as hair when they were typical of the instrument-action pair (Donna used the shampoo to wash vs. the hose to wash). Experiment 2 showed that these results were not due to direct instrument-patient relations. Experiment 3 replicated Experiment 1 using eyetracking, with effects of event typicality observed in first fixation and gaze durations on the patient noun. This research demonstrates that conceptual event-based expectations are computed and used rapidly and dynamically during on-line language comprehension. We discuss relationships among plausibility and predictability, as well as their implications. We conclude that selectional restrictions may be best considered as event-based conceptual knowledge, rather than lexical-grammatical knowledge. PMID:21517222

  18. A browser-based event display for the CMS experiment at the LHC

    Energy Technology Data Exchange (ETDEWEB)

    Hategan, M. [Chicago U.]; McCauley, T. [Fermilab]; Nguyen, P. [Fermilab]

    2012-01-01

    The line between native and web applications is becoming increasingly blurred as modern web browsers are becoming powerful platforms on which applications can be run. Such applications are trivial to install and are readily extensible and easy to use. In an educational setting, web applications permit a way to deploy tools in a highly-restrictive computing environment. The I2U2 collaboration has developed a browser-based event display for viewing events in data collected and released to the public by the CMS experiment at the LHC. The application itself reads a JSON event format and uses the JavaScript 3D rendering engine pre3d. The only requirement is a modern browser using HTML5 canvas. The event display has been used by thousands of high school students in the context of programs organized by I2U2, QuarkNet, and IPPOG. This browser-based approach to display of events can have broader usage and impact for experts and public alike.

  19. Uncertainties in Biologically-Based Modeling of Formaldehyde-Induced Respiratory Cancer Risk: Identification of Key Issues

    Science.gov (United States)

    Subramaniam, Ravi P.; Chen, Chao; Crump, Kenny S.; DeVoney, Danielle; Fox, John F.; Portier, Christopher J.; Schlosser, Paul M.; Thompson, Chad M.; White, Paul

    2009-01-01

    In a series of articles and a health-risk assessment report, scientists at the CIIT Hamner Institutes developed a model (CIIT model) for estimating respiratory cancer risk due to inhaled formaldehyde within a conceptual framework incorporating extensive mechanistic information and advanced computational methods at the toxicokinetic and toxicodynamic levels. Several regulatory bodies have utilized predictions from this model; on the other hand, upon detailed evaluation the California EPA has decided against doing so. In this article, we study the CIIT model to identify key biological and statistical uncertainties that need careful evaluation if such two-stage clonal expansion models are to be used for extrapolation of cancer risk from animal bioassays to human exposure. Broadly, these issues pertain to the use and interpretation of experimental labeling index and tumor data, the evaluation and biological interpretation of estimated parameters, and uncertainties in model specification, in particular that of initiated cells. We also identify key uncertainties in the scale-up of the CIIT model to humans, focusing on assumptions underlying model parameters for cell replication rates and formaldehyde-induced mutation. We discuss uncertainties in identifying parameter values in the model used to estimate and extrapolate DNA protein cross-link levels. The authors of the CIIT modeling endeavor characterized their human risk estimates as “conservative in the face of modeling uncertainties.” The uncertainties discussed in this article indicate that such a claim is premature. PMID:18564991

  20. Multi-agent system-based event-triggered hybrid control scheme for energy internet

    DEFF Research Database (Denmark)

    Dou, Chunxia; Yue, Dong; Han, Qing Long

    2017-01-01

    This paper is concerned with an event-triggered hybrid control for the energy Internet based on a multi-agent system approach with which renewable energy resources can be fully utilized to meet load demand with high security and well dynamical quality. In the design of control, a multi-agent system...

  1. Component-Based Data-Driven Predictive Maintenance to Reduce Unscheduled Maintenance Events

    NARCIS (Netherlands)

    Verhagen, W.J.C.; Curran, R.; de Boer, L.W.M.; Chen, C.H.; Trappey, A.C.; Peruzzini, M.; Stjepandić, J.; Wognum, N.

    2017-01-01

    Costs associated with unscheduled and preventive maintenance can contribute significantly to an airline's expenditure. Reliability analysis can help to identify and plan for maintenance events. Reliability analysis in industry is often limited to statistically based

  2. Crisis response simulation combining discrete-event and agent-based modeling

    NARCIS (Netherlands)

    Gonzalez, R.A.

    2009-01-01

    This paper presents a crisis response simulation model architecture combining a discrete-event simulation (DES) environment for a crisis scenario with an agent-based model of the response organization. In multi-agent systems (MAS) as a computational organization, agents are modeled and implemented

  3. Event-based prospective memory in depression: The impact of cue focality

    NARCIS (Netherlands)

    Altgassen, A.M.; Kliegel, M.; Martin, M.

    2009-01-01

    This study is the first to compare event-based prospective memory performance in individuals with depression and healthy controls. The degree to which self-initiated processing is required to perform the prospective memory task was varied. Twenty-eight individuals with depression and 32 healthy

  4. Using Story-Based Causal Diagrams to Analyze Disagreements about Complex Events.

    Science.gov (United States)

    Shapiro, Brian P.; And Others

    1995-01-01

    Describes procedures for constructing story-based causal diagrams. Discusses the cognitive and pragmatic constraints that govern the tendency to attribute events to incomplete causes. Uses causal diagrams to analyze major disagreements about the 1987 stock market crash. Explores how causal diagrams may mitigate the constraints on causal…

  5. Ontology-based combinatorial comparative analysis of adverse events associated with killed and live influenza vaccines.

    Directory of Open Access Journals (Sweden)

    Sirarat Sarntivijai

    Full Text Available Vaccine adverse events (VAEs) are adverse bodily changes occurring after vaccination. Understanding the adverse event (AE) profiles is a crucial step to identify serious AEs. Two different types of seasonal influenza vaccines have been used on the market: trivalent (killed) inactivated influenza vaccine (TIV) and trivalent live attenuated influenza vaccine (LAIV). The different adverse event profiles induced by these two groups of seasonal influenza vaccines were studied based on data drawn from the CDC Vaccine Adverse Event Reporting System (VAERS). Extracted from VAERS were 37,621 AE reports for four TIVs (Afluria, Fluarix, Fluvirin, and Fluzone) and 3,707 AE reports for the only LAIV (FluMist). The AE report data were analyzed by a novel combinatorial, ontology-based detection of AE method (CODAE). CODAE detects AEs using the Proportional Reporting Ratio (PRR), a Chi-square significance test, and base level filtration, and groups identified AEs by ontology-based hierarchical classification. In total, 48 TIV-enriched and 68 LAIV-enriched AEs were identified (PRR > 2, Chi-square score > 4, and number of cases > 0.2% of total reports). These AE terms were classified using the Ontology of Adverse Events (OAE), MedDRA, and SNOMED-CT. The OAE method provided better classification results than the two other methods. Thirteen out of 48 TIV-enriched AEs were related to neurological and muscular processing, such as paralysis, movement disorders, and muscular weakness. In contrast, 15 out of 68 LAIV-enriched AEs were associated with inflammatory response and respiratory system disorders. There was evidence of two severe adverse events (Guillain-Barre Syndrome and paralysis) present in TIV. Although these severe adverse events occurred at a low incidence rate, they were found to be significantly more enriched in TIV-vaccinated patients than in LAIV-vaccinated patients. Therefore, our novel combinatorial bioinformatics analysis discovered that LAIV had a lower chance of inducing these
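
    The PRR-plus-Chi-square screen described above operates on a 2x2 contingency table of AE counts. The sketch below computes the PRR and a Yates-corrected Chi-square statistic for hypothetical counts (not VAERS figures) and applies thresholds mirroring those quoted in the abstract.

```python
def prr_and_chi2(a, b, c, d):
    """Proportional Reporting Ratio and Yates-corrected Chi-square for a 2x2
    table: a/b = target-vaccine reports with/without the AE of interest,
    c/d = the same counts for the comparator vaccine(s)."""
    prr = (a / (a + b)) / (c / (c + d))
    n = a + b + c + d
    chi2 = n * (abs(a * d - b * c) - n / 2.0) ** 2 / (
        (a + b) * (c + d) * (a + c) * (b + d))
    return prr, chi2

# Hypothetical counts, not VAERS figures.
a, b, c, d = 300, 9700, 400, 39600
prr, chi2 = prr_and_chi2(a, b, c, d)
# Signal criteria mirroring the abstract: PRR > 2, Chi-square > 4,
# and number of cases > 0.2% of total reports.
signal = prr > 2 and chi2 > 4 and a > 0.002 * (a + b + c + d)
print(round(prr, 2), round(chi2, 1), signal)
```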

  6. The role of musical training in emergent and event-based timing

    Directory of Open Access Journals (Sweden)

    Lawrence eBaer

    2013-05-01

    Full Text Available Musical performance is thought to rely predominantly on event-based timing involving a clock-like neural process and an explicit internal representation of the time interval. Some aspects of musical performance may rely on emergent timing, which is established through the optimization of movement kinematics, and can be maintained without reference to any explicit representation of the time interval. We predicted that musical training would have its largest effect on event-based timing, supporting the dissociability of these timing processes and the dominance of event-based timing in musical performance. We compared 22 musicians and 17 non-musicians on the prototypical event-based timing task of finger tapping and on the typically emergently timed task of circle drawing. For each task, participants first responded in synchrony with a metronome (Paced) and then responded at the same rate without the metronome (Unpaced). Analyses of the Unpaced phase revealed that non-musicians were more variable in their inter-response intervals for finger tapping compared to circle drawing. Musicians did not differ between the two tasks. Between groups, non-musicians were more variable than musicians for tapping but not for drawing. We were able to show that the differences were due to less timer variability in musicians on the tapping task. Correlational analyses of movement jerk and inter-response interval variability revealed a negative association for tapping and a positive association for drawing in non-musicians only. These results suggest that musical training affects temporal variability in tapping but not drawing. Additionally, musicians and non-musicians may be employing different movement strategies to maintain accurate timing in the two tasks. These findings add to our understanding of how musical training affects timing and support the dissociability of event-based and emergent timing modes.

  7. Visual Sensor Based Abnormal Event Detection with Moving Shadow Removal in Home Healthcare Applications

    OpenAIRE

    Young-Sook Lee; Wan-Young Chung

    2012-01-01

    Vision-based abnormal event detection for home healthcare systems can be greatly improved using visual sensor-based techniques able to detect, track and recognize objects in the scene. However, in moving object detection and tracking processes, moving cast shadows can be misclassified as part of objects or as moving objects. Shadow removal is an essential step for developing video surveillance systems. The primary goal is to design novel computer vision techniques that can extract objects...

  8. Detection of vulnerable relays and sensitive controllers under cascading events based on performance indices

    DEFF Research Database (Denmark)

    Liu, Zhou; Chen, Zhe; Hu, Yanting

    2014-01-01

    ) based detection strategy is proposed to identify the vulnerable relays and sensitive controllers under the overloading situation during cascading events. Based on the impedance margin sensitivity, diverse performance indices are proposed to help improving this detection. A study case of voltage...... instability induced cascaded blackout built in real time digital simulator (RTDS) will be used to demonstrate the proposed strategy. The simulation results indicate this strategy can effectively detect the vulnerable relays and sensitive controllers under overloading situations....

  9. Multi-agent system-based event-triggered hybrid control scheme for energy internet

    DEFF Research Database (Denmark)

    Dou, Chunxia; Yue, Dong; Han, Qing Long

    2017-01-01

    This paper is concerned with an event-triggered hybrid control for the energy Internet based on a multi-agent system approach with which renewable energy resources can be fully utilized to meet load demand with high security and well dynamical quality. In the design of control, a multi-agent system...... of event-triggered hybrid control strategies whereby the multi-agent system implements the hierarchical hybrid control to achieve multiple control objectives. Finally, the effectiveness of the proposed control is validated by means of simulation results....

  10. A mobile robots experimental environment with event-based wireless communication.

    Science.gov (United States)

    Guinaldo, María; Fábregas, Ernesto; Farias, Gonzalo; Dormido-Canto, Sebastián; Chaos, Dictino; Sánchez, José; Dormido, Sebastián

    2013-07-22

    An experimental platform to communicate between a set of mobile robots through a wireless network has been developed. The mobile robots obtain their positions through a camera, which acts as the sensor. The video images are processed in a PC and a Waspmote card sends the corresponding position to each robot using the ZigBee standard. A distributed control algorithm based on event-triggered communications has been designed and implemented to bring the robots into the desired formation. Each robot communicates with its neighbors only at event times. Furthermore, a simulation tool has been developed to design and perform experiments with the system. An example of usage is presented.
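
    The communicate-only-at-event-times idea can be sketched with a simple discrete-time consensus loop in which each agent rebroadcasts its state only when it has drifted beyond a threshold from its last transmitted value. The trigger rule, gain and network below are illustrative assumptions, not the distributed formation controller implemented on the platform.

```python
import numpy as np

def event_triggered_consensus(x0, adjacency, threshold=0.05, gain=0.2, steps=200):
    """Discrete-time consensus in which each agent broadcasts its state only
    when it has drifted more than `threshold` from its last broadcast value;
    the control law always uses the last broadcast states. Illustrative sketch
    of event-triggered communication only."""
    x = np.asarray(x0, dtype=float).copy()
    a = np.asarray(adjacency, dtype=float)
    xb = x.copy()                                 # last broadcast states
    deg = a.sum(axis=1)
    events = 0
    for _ in range(steps):
        trigger = np.abs(x - xb) > threshold      # per-agent event condition
        xb[trigger] = x[trigger]
        events += int(trigger.sum())
        x = x + gain * (a @ xb - deg * xb)        # consensus step on broadcast data
    return x, events

x_final, n_events = event_triggered_consensus(
    x0=[0.0, 1.0, 3.0, -2.0],
    adjacency=[[0, 1, 0, 1],
               [1, 0, 1, 0],
               [0, 1, 0, 1],
               [1, 0, 1, 0]])
print(np.round(x_final, 3), n_events)             # near-agreement with few broadcasts
```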

  11. Neural bases of event knowledge and syntax integration in comprehension of complex sentences.

    Science.gov (United States)

    Malaia, Evie; Newman, Sharlene

    2015-01-01

    Comprehension of complex sentences is necessarily supported by both syntactic and semantic knowledge, but what linguistic factors trigger a reader's reliance on a specific system? This functional neuroimaging study orthogonally manipulated argument plausibility and verb event type to investigate cortical bases of the semantic effect on argument comprehension during reading. The data suggest that telic verbs facilitate online processing by means of consolidating the event schemas in episodic memory and by easing the computation of syntactico-thematic hierarchies in the left inferior frontal gyrus. The results demonstrate that syntax-semantics integration relies on trade-offs among a distributed network of regions for maximum comprehension efficiency.

  12. Bayesian integration of flux tower data into a process-based simulator for quantifying uncertainty in simulated output

    Science.gov (United States)

    Raj, Rahul; van der Tol, Christiaan; Hamm, Nicholas Alexander Samuel; Stein, Alfred

    2018-01-01

    Parameters of a process-based forest growth simulator are difficult or impossible to obtain from field observations. Reliable estimates can be obtained using calibration against observations of output and state variables. In this study, we present a Bayesian framework to calibrate the widely used process-based simulator Biome-BGC against estimates of gross primary production (GPP) data. We used GPP partitioned from flux tower measurements of net ecosystem exchange over a 55-year-old Douglas fir stand as an example. The uncertainties of both the Biome-BGC parameters and the simulated GPP values were estimated. The calibrated parameters leaf and fine root turnover (LFRT), ratio of fine root carbon to leaf carbon (FRC:LC), ratio of carbon to nitrogen in leaf (C:Nleaf), canopy water interception coefficient (Wint), fraction of leaf nitrogen in RuBisCO (FLNR), and effective soil rooting depth (SD) characterize the photosynthesis and carbon and nitrogen allocation in the forest. The calibration improved the root mean square error and the Nash-Sutcliffe efficiency between simulated and flux tower daily GPP compared to the uncalibrated Biome-BGC. Nevertheless, the seasonal cycle for flux tower GPP was not reproduced exactly and some overestimation in spring and underestimation in summer remained after calibration. We hypothesized that the phenology exhibited a seasonal cycle that was not accurately reproduced by the simulator. We investigated this by calibrating the Biome-BGC to each month's flux tower GPP separately. As expected, the simulated GPP improved, but the calibrated parameter values suggested that the seasonal cycle of state variables in the simulator could be improved. It was concluded that the Bayesian framework for calibration can reveal features of the modelled physical processes and identify aspects of the process simulator that are too rigid.
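
    As a rough illustration of Bayesian calibration of a simulator against daily GPP, the sketch below replaces Biome-BGC with a toy seasonal surrogate model and calibrates two illustrative parameters with a random-walk Metropolis sampler. The parameter names, priors and data are synthetic assumptions, not the study's configuration.

      # Minimal sketch of Bayesian calibration against daily GPP observations.
      # A toy seasonal surrogate stands in for Biome-BGC; parameters, priors and
      # data are synthetic assumptions, not the paper's configuration.
      import numpy as np

      rng = np.random.default_rng(0)
      days = np.arange(365)
      true_base, true_amp = 2.0, 6.0
      season = np.clip(np.sin(2 * np.pi * days / 365), 0, None)
      obs_gpp = true_base + true_amp * season + rng.normal(0, 0.8, days.size)

      def surrogate(theta):
          base, amp = theta
          return base + amp * season

      def log_post(theta, sigma=0.8):
          if np.any(theta < 0) or np.any(theta > 20):      # flat prior on [0, 20]^2
              return -np.inf
          resid = obs_gpp - surrogate(theta)
          return -0.5 * np.sum((resid / sigma) ** 2)

      # Random-walk Metropolis sampler
      theta = np.array([1.0, 1.0])
      lp = log_post(theta)
      samples = []
      for _ in range(20000):
          prop = theta + rng.normal(0, 0.1, 2)
          lp_prop = log_post(prop)
          if np.log(rng.uniform()) < lp_prop - lp:         # accept/reject step
              theta, lp = prop, lp_prop
          samples.append(theta.copy())
      post = np.array(samples[5000:])                      # drop burn-in
      print("posterior mean (base, amp):", post.mean(axis=0).round(2))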

  13. Real-Time Gait Event Detection Based on Kinematic Data Coupled to a Biomechanical Model.

    Science.gov (United States)

    Lambrecht, Stefan; Harutyunyan, Anna; Tanghe, Kevin; Afschrift, Maarten; De Schutter, Joris; Jonkers, Ilse

    2017-03-24

    Real-time detection of multiple stance events, more specifically initial contact (IC), foot flat (FF), heel off (HO), and toe off (TO), could greatly benefit neurorobotic (NR) and neuroprosthetic (NP) control. Three real-time threshold-based algorithms have been developed, detecting the aforementioned events based on kinematic data in combination with a biomechanical model. Data from seven subjects walking at three speeds on an instrumented treadmill were used to validate the presented algorithms, accumulating to a total of 558 steps. The reference for the gait events was obtained using marker and force plate data. All algorithms had excellent precision and no false positives were observed. Timing delays of the presented algorithms were similar to current state-of-the-art algorithms for the detection of IC and TO, whereas smaller delays were achieved for the detection of FF. Our results indicate that, based on their high precision and low delays, these algorithms can be used for the control of an NR/NP, with the exception of the HO event. Kinematic data is used in most NR/NP control schemes and is thus available at no additional cost, resulting in a minimal computational burden. The presented methods can also be applied for screening pathological gait or gait analysis in general in/outside of the laboratory.
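
    A minimal sketch of the threshold idea follows. The signal, threshold value and sampling rate are hypothetical, and the biomechanical-model coupling used in the paper is not reproduced; only the detection of rising and falling threshold crossings as gait events is shown.

      # Illustrative sketch of threshold-based gait event detection from a kinematic
      # signal. The signal, threshold and sampling rate are hypothetical.
      import numpy as np

      def detect_events(marker_height, fs=100.0, contact_thresh=0.03):
          """Return initial-contact and toe-off times from foot-marker height (m)."""
          on_ground = marker_height < contact_thresh
          ic = np.flatnonzero(~on_ground[:-1] & on_ground[1:]) + 1   # falls below
          to = np.flatnonzero(on_ground[:-1] & ~on_ground[1:]) + 1   # rises above
          return ic / fs, to / fs                                    # event times [s]

      if __name__ == "__main__":
          t = np.arange(0, 5, 0.01)                            # 100 Hz, 5 s of walking
          marker = 0.05 * (1 + np.sin(2 * np.pi * 1.0 * t))    # synthetic 1 Hz signal
          ic_times, to_times = detect_events(marker)
          print("IC [s]:", np.round(ic_times, 2), " TO [s]:", np.round(to_times, 2))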

  14. Towards a quantitative, measurement-based estimate of the uncertainty in photon mass attenuation coefficients at radiation therapy energies.

    Science.gov (United States)

    Ali, E S M; Spencer, B; McEwen, M R; Rogers, D W O

    2015-02-21

    In this study, a quantitative estimate is derived for the uncertainty in the XCOM photon mass attenuation coefficients in the energy range of interest to external beam radiation therapy-i.e. 100 keV (orthovoltage) to 25 MeV-using direct comparisons of experimental data against Monte Carlo models and theoretical XCOM data. Two independent datasets are used. The first dataset is from our recent transmission measurements and the corresponding EGSnrc calculations (Ali et al 2012 Med. Phys. 39 5990-6003) for 10-30 MV photon beams from the research linac at the National Research Council Canada. The attenuators are graphite and lead, with a total of 140 data points and an experimental uncertainty of ∼0.5% (k = 1). An optimum energy-independent cross section scaling factor that minimizes the discrepancies between measurements and calculations is used to deduce cross section uncertainty. The second dataset is from the aggregate of cross section measurements in the literature for graphite and lead (49 experiments, 288 data points). The dataset is compared to the sum of the XCOM data plus the IAEA photonuclear data. Again, an optimum energy-independent cross section scaling factor is used to deduce the cross section uncertainty. Using the average result from the two datasets, the energy-independent cross section uncertainty estimate is 0.5% (68% confidence) and 0.7% (95% confidence). The potential for energy-dependent errors is discussed. Photon cross section uncertainty is shown to be smaller than the current qualitative 'envelope of uncertainty' of the order of 1-2%, as given by Hubbell (1999 Phys. Med. Biol 44 R1-22).
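
    For a weighted least-squares criterion, an optimal energy-independent scaling factor has a closed form; the sketch below illustrates the idea with synthetic placeholder data rather than the paper's graphite and lead measurements, and assumes a simple linear relation between measured and calculated coefficients.

      # Hedged sketch: the single energy-independent scaling factor s that minimises
      # the chi-square between measured and calculated attenuation coefficients has a
      # closed form for a weighted least-squares fit of mu_meas ~ s * mu_calc.
      # The data below are synthetic placeholders, not the actual measurements.
      import numpy as np

      def optimal_scale(mu_meas, mu_calc, sigma):
          """Weighted least-squares estimate of s in mu_meas = s * mu_calc."""
          w = 1.0 / sigma**2
          return np.sum(w * mu_meas * mu_calc) / np.sum(w * mu_calc**2)

      rng = np.random.default_rng(1)
      mu_calc = np.linspace(0.02, 0.08, 20)              # cm^2/g, placeholder values
      mu_meas = 1.004 * mu_calc * (1 + rng.normal(0, 0.005, 20))
      sigma = 0.005 * mu_meas                            # ~0.5% experimental uncertainty
      s = optimal_scale(mu_meas, mu_calc, sigma)
      print(f"optimal scaling factor: {s:.4f} (deviation from unity: {abs(s-1)*100:.2f}%)")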

  15. An electronic trigger based on care escalation to identify preventable adverse events in hospitalised patients.

    Science.gov (United States)

    Bhise, Viraj; Sittig, Dean F; Vaghani, Viralkumar; Wei, Li; Baldwin, Jessica; Singh, Hardeep

    2017-09-21

    Methods to identify preventable adverse events typically have low yield and efficiency. We refined the methods of the Institute of Healthcare Improvement's Global Trigger Tool (GTT) application and leveraged electronic health record (EHR) data to improve detection of preventable adverse events, including diagnostic errors. We queried the EHR data repository of a large health system to identify an 'index hospitalization' associated with care escalation (defined as transfer to the intensive care unit (ICU) or initiation of a rapid response team (RRT) within 15 days of admission) between March 2010 and August 2015. To enrich the record review sample with unexpected events, we used EHR clinical data to modify the GTT algorithm and limited eligible patients to those at lower risk for care escalation based on younger age and presence of minimal comorbid conditions. We modified the GTT review methodology; two physicians independently reviewed eligible 'e-trigger' positive records to identify preventable diagnostic and care management events. Of 88 428 hospitalisations, 887 were associated with care escalation (712 ICU transfers and 175 RRTs), of which 92 were flagged as trigger-positive and reviewed. Preventable adverse events were detected in 41 cases, yielding a trigger positive predictive value of 44.6% (reviewer agreement 79.35%; Cohen's kappa 0.573). We identified 7 (7.6%) diagnostic errors and 34 (37.0%) care management-related events: 24 (26.1%) adverse drug events, 4 (4.3%) patient falls, 4 (4.3%) procedure-related complications and 2 (2.2%) hospital-associated infections. In most events (73.1%), there was potential for temporary harm. We developed an approach using an EHR data-based trigger and modified review process to efficiently identify hospitalised patients with preventable adverse events, including diagnostic errors. Such e-triggers can help overcome limitations of currently available methods to detect preventable harm in hospitalised patients.
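
    The e-trigger logic reduces, in essence, to a query over admissions and escalation events. The sketch below shows such a query on a hypothetical schema; the table and column names are assumptions, and the age and comorbidity cut-offs are placeholders, not the study's criteria.

      # Sketch of an EHR "e-trigger" query: flag index hospitalisations with a care
      # escalation (ICU transfer or rapid response team) within 15 days of admission,
      # restricted to younger, low-comorbidity patients. Schema and cut-offs are
      # hypothetical, not the health system's actual repository.
      import sqlite3

      conn = sqlite3.connect(":memory:")
      conn.executescript("""
      CREATE TABLE admissions  (hadm_id INTEGER, admit_day INTEGER, age INTEGER,
                                comorbidity_count INTEGER);
      CREATE TABLE escalations (hadm_id INTEGER, event_day INTEGER, event_type TEXT);
      INSERT INTO admissions  VALUES (1, 0, 45, 1), (2, 0, 83, 5), (3, 0, 37, 0);
      INSERT INTO escalations VALUES (1, 4, 'ICU'), (2, 2, 'RRT'), (3, 20, 'ICU');
      """)

      trigger_positive = conn.execute("""
          SELECT a.hadm_id
          FROM admissions a JOIN escalations e ON a.hadm_id = e.hadm_id
          WHERE e.event_day - a.admit_day <= 15     -- escalation within 15 days
            AND a.age < 60                          -- enrich for unexpected events
            AND a.comorbidity_count <= 2
      """).fetchall()
      print("trigger-positive hospitalisations:", [row[0] for row in trigger_positive])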

  16. Improvement of hydrological flood forecasting through an event based output correction method

    Science.gov (United States)

    Klotz, Daniel; Nachtnebel, Hans Peter

    2014-05-01

    This contribution presents an output correction method for hydrological models. A conceptualisation of the method is presented and tested in an alpine basin in Salzburg, Austria. The aim is to develop a method which is not prone to the drawbacks of autoregressive models. Output correction methods are an attractive option for improving hydrological predictions. They are complementary to the main modelling process and do not interfere with the modelling process itself. In general, output correction models estimate the future error of a prediction and use the estimate to improve the given prediction. Different estimation techniques are available depending on the utilized information and the estimation procedure itself. Autoregressive error models are widely used for such corrections. Autoregressive models with exogenous inputs (ARX) allow the use of additional information for the error modelling, e.g. measurements from upper basins or predicted input signals. Autoregressive models do however exhibit deficiencies, since the errors of hydrological models generally do not behave in an autoregressive manner. The decay of the error is usually different from an autoregressive function, and furthermore the residuals exhibit different patterns under different circumstances. As an example, one might consider different error-propagation behaviours under high- and low-flow situations or snow-melt-driven conditions. This contribution presents a conceptualisation of an event-based correction model and focuses on flood events only. The correction model uses information about the history of the residuals and exogenous variables to give an error estimate. The structure and parameters of the correction models can be adapted to given event classes. An event class is a set of flood events that exhibit a similar pattern for the residuals or the hydrological conditions. In total, four different event classes have been identified in this study. Each of them represents a different
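
    A minimal sketch of such an output correction is given below: a least-squares error model is fitted on lagged residuals plus an exogenous signal and then used to adjust a raw forecast. The regression form, lag length and data are illustrative assumptions, not the event-class models developed in the study.

      # Hedged sketch of an event-based output correction: a least-squares error model
      # is fitted on lagged residuals plus an exogenous signal (e.g. an upstream gauge)
      # and used to adjust a raw forecast. The regression form and data are illustrative.
      import numpy as np

      def fit_error_model(residuals, exog, lag=3):
          """Least-squares fit of e[t] on (e[t-lag..t-1], exogenous inputs at t)."""
          X = [np.concatenate([residuals[t - lag:t], exog[t]])
               for t in range(lag, len(residuals))]
          y = residuals[lag:]
          coef, *_ = np.linalg.lstsq(np.asarray(X), y, rcond=None)
          return coef

      def corrected_forecast(raw_forecast, recent_residuals, exog_now, coef):
          error_estimate = np.concatenate([recent_residuals, exog_now]) @ coef
          return raw_forecast - error_estimate

      rng = np.random.default_rng(2)
      exog = rng.normal(size=(200, 1))                     # synthetic exogenous input
      resid = 0.6 * np.roll(exog[:, 0], 1) + rng.normal(0, 0.1, 200)
      coef = fit_error_model(resid, exog)
      print("corrected flow:", round(corrected_forecast(10.0, resid[-3:], exog[-1], coef), 3))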

  17. A Novel Idea for Optimizing Condition-Based Maintenance Using Genetic Algorithms and Continuous Event Simulation Techniques

    Directory of Open Access Journals (Sweden)

    Mansoor Ahmed Siddiqui

    2017-01-01

    Full Text Available Effective maintenance strategies are of utmost significance for systems engineering due to their direct linkage with the financial aspects and safety of plant operation. Where the state of a system, for instance its level of deterioration, can be continuously observed, a condition-based maintenance (CBM) strategy may be adopted, wherein upkeep of the system is carried out progressively on the basis of its monitored state. In this article, a multicomponent system that is kept under continuous observation is considered. A Genetic Algorithm (GA) is used to determine the optimal deterioration stage at which preventive maintenance of the system should be carried out. The problem is formulated as a multiobjective optimization aimed at the two desired objectives, namely profitability and accessibility. For realism, a prognostic model portraying the evolution of the deteriorating system is employed, based on continuous event simulation techniques. Monte Carlo (MC) simulation is selected because it can take into account a wide range of probable outcomes and thereby help reduce uncertainty. The inherent benefits of this simulation technique are fully utilized to represent the various elements of a deteriorating system operating under a stressed environment. The proposed synergic model (GA and MC) is considered more effective due to its "drop-by-drop" approach, which successfully drives the search process towards the best optimal solutions.

  18. Life events, salivary cortisol, and cognitive performance in nondemented subjects: a population-based study.

    Science.gov (United States)

    Ouanes, Sami; Castelao, Enrique; Gebreab, Sirak; von Gunten, Armin; Preisig, Martin; Popp, Julius

    2017-03-01

    Older people are particularly exposed to stressful events, known to activate the hypothalamus-pituitary-adrenal axis resulting in increased cortisol levels. High cortisol has been associated with deleterious effects on cognition. We hypothesized that stressful life events could increase cortisol secretion leading to cognitive impairment. A cross-sectional analysis was conducted using data from Colaus/PsyColaus, a longitudinal population-based study among Lausanne residents. Salivary cortisol samples were obtained from 796 nondemented subjects aged at least 65. A neuropsychological battery was used to assess cognitive performance and determine the Clinical Dementia Rating Sum of Boxes (CDRSOB). Lifetime life events and their subjective impact were assessed using a validated questionnaire. The total impact of life events was associated neither with cortisol area under the curve (AUC) nor with CDRSOB nor with any cognitive domain performance. The CDRSOB was associated with the cortisol AUC, controlling for age, sex, body mass index, education and depressive symptoms (p = 0.003; B = 0.686 [0.240; 1.333]; r = 0.114). This association between CDRSOB and the cortisol AUC remained significant after controlling for life events total impact (p = 0.040; B = 0.591 [0.027; 1.155]; r = 0.106). These findings do not support the hypothesis that stressful life events increase cortisol secretion leading to cognitive impairment. The association of higher cortisol levels with poorer cognition might be not a mere reflection of stressful events but rather explained by other factors, yet to be elucidated. Copyright © 2016 Elsevier Inc. All rights reserved.

  19. A Two-Account Life Insurance Model for Scenario-Based Valuation Including Event Risk

    DEFF Research Database (Denmark)

    Jensen, Ninna Reitzel; Schomacker, Kristian Juul

    2015-01-01

    Using a two-account model with event risk, we model life insurance contracts taking into account both guaranteed and non-guaranteed payments in participating life insurance as well as in unit-linked insurance. Here, event risk is used as a generic term for life insurance events, such as death… In addition to easy computations, our model offers a common framework for the valuation of life insurance payments across… and unit-linked insurance. By use of a two-account model, we are able to illustrate general concepts without making the model too abstract. To allow for complicated financial markets without dramatically increasing the mathematical complexity, we focus on economic scenarios. We illustrate the use of our model by conducting scenario analysis based on Monte Carlo simulation, but the model applies to scenarios in general and to worst-case and best-estimate scenarios in particular.

  20. Power and sample size calculation for paired recurrent events data based on robust nonparametric tests.

    Science.gov (United States)

    Su, Pei-Fang; Chung, Chia-Hua; Wang, Yu-Wen; Chi, Yunchan; Chang, Ying-Ju

    2017-05-20

    The purpose of this paper is to develop a formula for calculating the required sample size for paired recurrent events data. The developed formula is based on robust non-parametric tests for comparing the marginal mean function of events between paired samples. This calculation can accommodate the associations among a sequence of paired recurrent event times with a specification of correlated gamma frailty variables for a proportional intensity model. We evaluate the performance of the proposed method with comprehensive simulations including the impacts of paired correlations, homogeneous or nonhomogeneous processes, marginal hazard rates, censoring rate, accrual and follow-up times, as well as the sensitivity analysis for the assumption of the frailty distribution. The use of the formula is also demonstrated using a premature infant study from the neonatal intensive care unit of a tertiary center in southern Taiwan. Copyright © 2017 John Wiley & Sons, Ltd.

  1. Stabilization of Networked Distributed Systems with Partial and Event-Based Couplings

    Directory of Open Access Journals (Sweden)

    Sufang Zhang

    2015-01-01

    Full Text Available The stabilization problem of networked distributed systems with partial and event-based couplings is investigated. The channels, which are used to transmit different levels of information about the agents, are considered. The channel matrix is introduced to indicate the working state of the channels. An event condition is designed for each channel to govern the sampling instants of that channel. Since the event conditions are given separately for different channels, the sampling instants of the channels are mutually independent. To stabilize the system, state feedback controllers are implemented. The control signals also suffer from the two communication constraints. Sufficient conditions in terms of linear matrix inequalities are proposed to ensure the stabilization of the controlled system. Finally, a numerical example is given to demonstrate the advantage of our results.

  2. Economic risk-based analysis: Effect of technical and market price uncertainties on the production of glycerol-based isobutanol

    DEFF Research Database (Denmark)

    Loureiro da Costa Lira Gargalo, Carina; Gernaey, Krist; Sin, Gürkan

    2016-01-01

    In this study, the production of glycerol-based isobutanol is critically assessed in terms of its techno-economic performance through the estimation of economic indicators, net present value (NPV) and minimum selling price (MSP). The Monte Carlo method with Latin Hypercube Sampling (LHS) is used...

  3. Lessons Learned from Real-Time, Event-Based Internet Science Communications

    Science.gov (United States)

    Phillips, T.; Myszka, E.; Gallagher, D. L.; Adams, M. L.; Koczor, R. J.; Whitaker, Ann F. (Technical Monitor)

    2001-01-01

    For the last several years the Science Directorate at Marshall Space Flight Center has carried out a diverse program of Internet-based science communication. The Directorate's Science Roundtable includes active researchers, NASA public relations, educators, and administrators. The Science@NASA award-winning family of Web sites features science, mathematics, and space news. The program includes extended stories about NASA science, a curriculum resource for teachers tied to national education standards, on-line activities for students, and webcasts of real-time events. The focus of sharing science activities in real-time has been to involve and excite students and the public about science. Events have involved meteor showers, solar eclipses, natural very low frequency radio emissions, and amateur balloon flights. In some cases, broadcasts accommodate active feedback and questions from Internet participants. Through these projects a pattern has emerged in the level of interest or popularity with the public. The pattern differentiates projects that include science from those that do not. All real-time, event-based Internet activities have captured public interest at a level not achieved through science stories or educator resource material exclusively. The worst event-based activity attracted more interest than the best written science story. One truly rewarding lesson learned through these projects is that the public recognizes the importance and excitement of being part of scientific discovery. Flying a camera to 100,000 feet altitude isn't as interesting to the public as searching for viable life-forms at these oxygen-poor altitudes. The details of these real-time, event-based projects and lessons learned will be discussed.

  4. Gait-Event-Based Synchronization Method for Gait Rehabilitation Robots via a Bioinspired Adaptive Oscillator.

    Science.gov (United States)

    Chen, Gong; Qi, Peng; Guo, Zhao; Yu, Haoyong

    2017-06-01

    In the field of gait rehabilitation robotics, achieving human-robot synchronization is very important. In this paper, a novel human-robot synchronization method using gait event information is proposed. This method includes two steps. First, seven gait events in one gait cycle are detected in real time with a hidden Markov model; second, an adaptive oscillator is utilized to estimate the stride percentage of human gait using any one of the gait events. Synchronous reference trajectories for the robot are then generated with the estimated stride percentage. This method is based on a bioinspired adaptive oscillator, which is a mathematical tool, first proposed to explain the phenomenon of synchronous flashing among fireflies. The proposed synchronization method is implemented in a portable knee-ankle-foot robot and tested in 15 healthy subjects. This method has the advantages of simple structure, flexible selection of gait events, and fast adaptation. Gait event is the only information needed, and hence the performance of synchronization holds when an abnormal gait pattern is involved. The results of the experiments reveal that our approach is efficient in achieving human-robot synchronization and feasible for rehabilitation robotics application.
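
    The sketch below shows a generic error-coupled adaptive phase oscillator of the kind described, locking onto a periodic gait signal and returning a continuous stride-percentage estimate. The gains, frequencies and coupling form are illustrative and not the paper's exact formulation, which resynchronizes on discrete gait events detected by a hidden Markov model.

      # Generic error-coupled adaptive phase oscillator (a sketch, not the paper's
      # exact formulation): phase and frequency are nudged by the error between the
      # measured gait signal and the oscillator's own output, so the oscillator
      # locks onto the stride period and yields a continuous stride percentage.
      import numpy as np

      def adaptive_oscillator(signal, dt, omega0=2*np.pi*0.8, k_phase=5.0, k_freq=2.0):
          phi, omega = 0.0, omega0
          stride_pct = np.empty(signal.size)
          for i, s in enumerate(signal):
              err = s - np.sin(phi)                      # tracking error
              phi += dt * (omega + k_phase * err * np.cos(phi))
              omega += dt * k_freq * err * np.cos(phi)   # frequency adaptation
              stride_pct[i] = (phi % (2 * np.pi)) / (2 * np.pi)
          return stride_pct, omega

      if __name__ == "__main__":
          dt = 0.01
          t = np.arange(0, 30, dt)
          gait = np.sin(2 * np.pi * 1.1 * t)             # hypothetical 1.1 Hz gait signal
          pct, omega = adaptive_oscillator(gait, dt)
          print("adapted frequency [Hz]:", round(omega / (2 * np.pi), 2))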

  5. Development of a time-oriented data warehouse based on a medical information event model.

    Science.gov (United States)

    Yamamoto, Yuichiro; Namikawa, Hirokazu; Inamura, Kiyonari

    2002-01-01

    We designed a new medical information event model and developed a time-oriented data warehouse based on the model. Here, the medical information event is the basic data unit handled by a medical information system. Because the timing of decision making and treatment for a patient is sometimes very critical, the time-oriented data warehouse was developed to provide a search feature along the time axis. Our medical information event model has a unique, simple data structure. PC-ORDERING2000, developed by NEC and based on Oracle, had about 600 tables; we reduced these 600 complicated data structures to one unique and simple event model. By migrating clinical data from the old order entry system into the new order entry system based on the medical information event model, we produced a simple and flexible system and enabled easy secondary use of patients' clinical data. Evaluation of our system revealed improved data retrieval efficiency and a roughly 600-fold shorter response time at a terminal, owing to the corresponding 1:600 reduction in the number of tables mentioned above.
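
    The "single event table" idea can be sketched with a few lines of SQL: every clinical action becomes one time-stamped row, so time-axis queries need no joins across hundreds of order-entry tables. The schema and column names below are illustrative assumptions, not the actual system's design.

      # Sketch of the "one simple event table" idea: every clinical action is stored as
      # a time-stamped event row, so time-axis queries are simple range scans.
      # The schema and column names are illustrative assumptions.
      import sqlite3

      conn = sqlite3.connect(":memory:")
      conn.execute("""CREATE TABLE medical_event (
          patient_id INTEGER, event_time TEXT, event_type TEXT, detail TEXT)""")
      conn.executemany("INSERT INTO medical_event VALUES (?,?,?,?)", [
          (7, "2002-03-01T09:00", "order",  "chest X-ray"),
          (7, "2002-03-01T11:30", "result", "X-ray: no abnormality"),
          (7, "2002-03-02T08:15", "order",  "amoxicillin 500 mg"),
      ])

      # All events for patient 7 on 1 March 2002, in time order
      rows = conn.execute("""
          SELECT event_time, event_type, detail FROM medical_event
          WHERE patient_id = ? AND event_time BETWEEN ? AND ?
          ORDER BY event_time""",
          (7, "2002-03-01T00:00", "2002-03-01T23:59")).fetchall()
      for row in rows:
          print(row)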

  6. The InfiniBand based Event Builder implementation for the LHCb upgrade

    Science.gov (United States)

    Falabella, A.; Giacomini, F.; Manzali, M.; Marconi, U.; Neufeld, N.; Valat, S.; Voneki, B.

    2017-10-01

    The LHCb experiment will undergo a major upgrade during the second long shutdown (2019 - 2020). The upgrade will concern both the detector and the Data Acquisition system, which are to be rebuilt in order to optimally exploit the foreseen higher event rate. The Event Builder is the key component of the DAQ system, for it gathers data from the sub-detectors and builds up the whole event. The Event Builder network has to manage an incoming data rate of 32 Tb/s from a 40 MHz bunch-crossing frequency, with a cardinality of about 500 nodes. In this contribution we present an Event Builder implementation based on the InfiniBand network technology. This software relies on the InfiniBand verbs, which offer a user-space interface to the Remote Direct Memory Access capabilities provided by InfiniBand network devices. We will present the performance of the software on a cluster connected with a 100 Gb/s InfiniBand network.

  7. Research on Crowdsourcing Emergency Information Extraction Based on Event Frames

    Science.gov (United States)

    Yang, Bo; Wang, Jizhou; Ma, Weijun; Mao, Xi

    2018-01-01

    At present, common information extraction methods cannot accurately extract structured emergency event information, general information retrieval tools cannot fully identify emergency geographic information, and neither approach provides an accurate assessment of the extracted results. This paper therefore proposes an emergency information extraction technique based on an event framework to address the problem of emergency information collection. It mainly includes an emergency information extraction model (EIEM), a complete address recognition method (CARM) and an accuracy evaluation model of emergency information (AEMEI). EIEM extracts emergency information in a structured form and compensates for the lack of network data acquisition in emergency mapping. CARM uses a hierarchical model and a shortest path algorithm to join toponym pieces into a full address. AEMEI analyzes the results for an emergency event and summarizes the advantages and disadvantages of the event framework. Experiments show that the event-frame technique can solve the problem of emergency information extraction and provides reference cases for other applications. When an emergency disaster is about to occur, the relevant departments can query data on past emergencies and make arrangements for defence and disaster reduction ahead of time, decreasing the number of casualties and the property damage. This is of great significance to the state and society.

  8. Adverse Event extraction from Structured Product Labels using the Event-based Text-mining of Health Electronic Records (ETHER) system.

    Science.gov (United States)

    Pandey, Abhishek; Kreimeyer, Kory; Foster, Matthew; Botsis, Taxiarchis; Dang, Oanh; Ly, Thomas; Wang, Wei; Forshee, Richard

    2018-01-01

    Structured Product Labels follow an XML-based document markup standard approved by the Health Level Seven organization and adopted by the US Food and Drug Administration as a mechanism for exchanging medical products information. Their current organization makes their secondary use rather challenging. We used the Side Effect Resource database and DailyMed to generate a comparison dataset of 1159 Structured Product Labels. We processed the Adverse Reaction section of these Structured Product Labels with the Event-based Text-mining of Health Electronic Records system and evaluated its ability to extract and encode Adverse Event terms to Medical Dictionary for Regulatory Activities Preferred Terms. A small sample of 100 labels was then selected for further analysis. Of the 100 labels, Event-based Text-mining of Health Electronic Records achieved a precision and recall of 81 percent and 92 percent, respectively. This study demonstrated Event-based Text-mining of Health Electronic Record's ability to extract and encode Adverse Event terms from Structured Product Labels which may potentially support multiple pharmacoepidemiological tasks.
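
    The reported precision and recall follow the usual set-based definitions; a small sketch with placeholder terms (not MedDRA Preferred Terms from the study) is given below.

      # Minimal sketch of the evaluation arithmetic: precision and recall of extracted
      # adverse-event terms against a manually annotated reference set. The terms are
      # placeholders, not MedDRA Preferred Terms from the study.
      def precision_recall(extracted, gold):
          extracted, gold = set(extracted), set(gold)
          tp = len(extracted & gold)                     # true positives
          precision = tp / len(extracted) if extracted else 0.0
          recall = tp / len(gold) if gold else 0.0
          return precision, recall

      extracted = {"headache", "nausea", "rash", "dizziness"}
      gold = {"headache", "nausea", "rash", "vomiting", "fatigue"}
      p, r = precision_recall(extracted, gold)
      print(f"precision={p:.2f} recall={r:.2f}")          # 0.75 and 0.60 here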

  9. Ontology-based time information representation of vaccine adverse events in VAERS for temporal analysis

    Directory of Open Access Journals (Sweden)

    Tao Cui

    2012-12-01

    Full Text Available Background: The U.S. FDA/CDC Vaccine Adverse Event Reporting System (VAERS) provides a valuable data source for post-vaccination adverse event analyses. The structured data in the system has been widely used, but the information in the write-up narratives is rarely included in these kinds of analyses. In fact, the unstructured nature of the narratives makes the data embedded in them difficult to use for any further studies. Results: We developed an ontology-based approach to represent the data in the narratives in a “machine-understandable” way, so that it can be easily queried and further analyzed. Our focus is the time aspect in the data for time-trending analysis. The Time Event Ontology (TEO), Ontology of Adverse Events (OAE), and Vaccine Ontology (VO) are leveraged for the semantic representation for this purpose. A VAERS case report is presented as a use case for the ontological representations. The advantages of using our ontology-based Semantic Web representation and data analysis are emphasized. Conclusions: We believe that representing both the structured data and the data from write-up narratives in an integrated, unified, and “machine-understandable” way can improve research for vaccine safety analyses, causality assessments, and retrospective studies.

  10. Ecological status of seagrass ecosystems: An uncertainty analysis of the meadow classification based on the Posidonia oceanica multivariate index (POMI).

    Science.gov (United States)

    Bennett, Scott; Roca, Guillem; Romero, Javier; Alcoverro, Teresa

    2011-08-01

    Quantifying the uncertainty associated with monitoring protocols is essential to prevent the misclassification of ecological status and to improve sampling design. We assessed the Posidonia oceanica multivariate index (POMI) bio-monitoring program for its robustness in classifying the ecological status of coastal waters within the Water Framework Directive. We used a 7-year data set covering 30 sites along 500 km of the Catalonian coastline to examine which version of POMI (14 or 9 metrics) maximises precision in classifying the ecological status of meadows. Five factors (zones within a site, sites within a water body, depth, years and surveyors) that potentially generate classification uncertainty were examined in detail. Of these, depth was a major source of uncertainty, while all the remaining spatial and temporal factors displayed low variability. POMI 9 matched POMI 14 in all factors, and could effectively replace it in future monitoring programs. Copyright © 2011 Elsevier Ltd. All rights reserved.

  11. 3-D uncertainty-based topographic change detection with structure-from-motion photogrammetry and precision maps

    Science.gov (United States)

    James, Mike R.; Robson, Stuart; Smith, Mark W.

    2017-04-01

    Structure-from-motion (SfM) software greatly facilitates the generation of 3-D surface models from photographs, but doesn't provide the detailed error metrics that are characteristic of rigorous photogrammetry. Here, we present a novel approach to generate maps of 3-D survey precision which describe the spatial variability in 3-D photogrammetric and georeferencing precision across surveys. Such maps then enable confidence-bounded quantification of 3-D topographic change that, for the first time, specifically accounts for the precision characteristics of photo-based surveys. Precision maps for surveys georeferenced either directly using camera positions or by ground control illustrate the spatial variability in precision that is associated with the relative influences of photogrammetric (e.g. image network geometry, tie point quality) and georeferencing considerations. For common SfM-based software (which does not provide precision estimates directly), precision maps can be generated using a Monte Carlo procedure. Confidence-bounded full 3-D change detection between repeat surveys with associated precision maps is then derived through adapting a state-of-the-art point-cloud comparison (M3C2; Lague et al., 2013). We demonstrate the approach using annual aerial SfM surveys of an eroding badland, benchmarked against TLS data for validation. 3-D precision maps enable more probable erosion patterns to be identified than existing analyses. If precision is limited by weak georeferencing (e.g. using direct georeferencing with camera positions of multi-metre precision, such as from a consumer UAV), then overall survey precision scales as n^(-1/2) of the control precision (where n is the number of images). However, direct georeferencing results from SfM software (PhotoScan) were not consistent with those from rigorous photogrammetric analysis. Our method not only enables confidence-bounded 3-D change detection and uncertainty-based DEM processing, but also provides covariance
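
    The quoted n^(-1/2) scaling is the familiar standard-error-of-the-mean behaviour; the short Monte Carlo check below illustrates it with synthetic camera-position noise. The 3 m precision and survey sizes are assumptions, not the paper's data.

      # Hedged Monte Carlo check of the n^(-1/2) scaling: if georeferencing is
      # controlled by n camera positions each with standard deviation sigma_c, the
      # precision of the mean georeferencing solution improves roughly as
      # sigma_c / sqrt(n). Values below are assumptions.
      import numpy as np

      rng = np.random.default_rng(3)
      sigma_c = 3.0                                        # multi-metre camera precision (m)
      for n in (10, 40, 160):
          # repeat the survey many times; each survey has n noisy camera positions
          mean_offsets = rng.normal(0, sigma_c, size=(5000, n)).mean(axis=1)
          print(f"n={n:4d}  empirical={mean_offsets.std():.3f} m"
                f"  theory={sigma_c / np.sqrt(n):.3f} m")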

  12. Analysis of ISO NE Balancing Requirements: Uncertainty-based Secure Ranges for ISO New England Dynamic Interchange Adjustments

    Energy Technology Data Exchange (ETDEWEB)

    Etingov, Pavel V.; Makarov, Yuri V.; Wu, Di; Hou, Zhangshuan; Sun, Yannan; Maslennikov, S.; Luo, X.; Zheng, T.; George, S.; Knowland, T.; Litvinov, E.; Weaver, S.; Sanchez, E.

    2013-01-31

    This document describes the detailed uncertainty quantification (UQ) methodology developed by PNNL to estimate secure ranges of potential dynamic intra-hour interchange adjustments in the ISO-NE system, and provides a description of the dynamic interchange adjustment (DINA) tool developed under the same contract. The overall system ramping up and down capability, spinning reserve requirements, interchange schedules, load variations and uncertainties from various sources that are relevant to the ISO-NE system are incorporated into the methodology and the tool. The DINA tool has been tested by PNNL and ISO-NE staff engineers using ISO-NE data.

  13. Event-triggered hybrid control based on multi-Agent systems for Microgrids

    DEFF Research Database (Denmark)

    Dou, Chun-xia; Liu, Bin; Guerrero, Josep M.

    2014-01-01

    This paper is focused on a multi-agent-system-based event-triggered hybrid control for intelligently restructuring the operating mode of a microgrid (MG) to ensure the energy supply with high security, stability and cost effectiveness. Since the microgrid is composed of different types of distributed energy resources, it is a typical hybrid dynamic network. Considering the complex hybrid behaviors, a hierarchical decentralized coordinated control scheme is first constructed based on a multi-agent system; then, the hybrid model of the microgrid is built by using differential hybrid Petri nets. Based on the hybrid models, an event-triggered hybrid control including three kinds of switching controls is constructed by designing multiple enabling functions that can be activated by different triggering conditions, and the interactive coordination among different switching controls is im...

  14. Bayesian analysis of rare events

    Science.gov (United States)

    Straub, Daniel; Papaioannou, Iason; Betz, Wolfgang

    2016-06-01

    In many areas of engineering and science there is an interest in predicting the probability of rare events, in particular in applications related to safety and security. Increasingly, such predictions are made through computer models of physical systems in an uncertainty quantification framework. Additionally, with advances in IT, monitoring and sensor technology, an increasing amount of data on the performance of the systems is collected. This data can be used to reduce uncertainty, improve the probability estimates and consequently enhance the management of rare events and associated risks. Bayesian analysis is the ideal method to include the data into the probabilistic model. It ensures a consistent probabilistic treatment of uncertainty, which is central in the prediction of rare events, where extrapolation from the domain of observation is common. We present a framework for performing Bayesian updating of rare event probabilities, termed BUS. It is based on a reinterpretation of the classical rejection-sampling approach to Bayesian analysis, which enables the use of established methods for estimating probabilities of rare events. By drawing upon these methods, the framework makes use of their computational efficiency. These methods include the First-Order Reliability Method (FORM), tailored importance sampling (IS) methods and Subset Simulation (SuS). In this contribution, we briefly review these methods in the context of the BUS framework and investigate their applicability to Bayesian analysis of rare events in different settings. We find that, for some applications, FORM can be highly efficient and is surprisingly accurate, enabling Bayesian analysis of rare events with just a few model evaluations. In a general setting, BUS implemented through IS and SuS is more robust and flexible.
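
    The rejection-sampling view of Bayesian updating that BUS builds on can be sketched in a few lines: draw from the prior, accept each sample with probability proportional to its likelihood, and estimate the (rare) event probability from the accepted samples. The plain Monte Carlo below is only a stand-in for the efficient FORM/IS/SuS machinery, and the model, observation and failure threshold are illustrative assumptions.

      # Minimal sketch of the rejection-sampling view of Bayesian updating behind BUS:
      # draw from the prior, accept with probability L(theta)/c, and estimate the event
      # probability from the accepted (posterior) samples. Model and thresholds are
      # illustrative assumptions.
      import numpy as np

      rng = np.random.default_rng(4)

      def likelihood(theta, obs=1.2, sigma=0.3):
          return np.exp(-0.5 * ((obs - theta) / sigma) ** 2)   # bounded above by c = 1

      c = 1.0
      prior = rng.normal(0.0, 1.0, size=200_000)               # prior samples of theta
      accept = rng.uniform(size=prior.size) < likelihood(prior) / c
      posterior = prior[accept]

      # "Failure" is the (rare) event theta > 1.8
      print(f"accepted {posterior.size} posterior samples")
      print(f"P(F) prior     = {np.mean(prior > 1.8):.4f}")
      print(f"P(F) posterior = {np.mean(posterior > 1.8):.4f}")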

  15. Knowledge-based extraction of adverse drug events from biomedical text.

    Science.gov (United States)

    Kang, Ning; Singh, Bharat; Bui, Chinh; Afzal, Zubair; van Mulligen, Erik M; Kors, Jan A

    2014-03-04

    Many biomedical relation extraction systems are machine-learning based and have to be trained on large annotated corpora that are expensive and cumbersome to construct. We developed a knowledge-based relation extraction system that requires minimal training data, and applied the system for the extraction of adverse drug events from biomedical text. The system consists of a concept recognition module that identifies drugs and adverse effects in sentences, and a knowledge-base module that establishes whether a relation exists between the recognized concepts. The knowledge base was filled with information from the Unified Medical Language System. The performance of the system was evaluated on the ADE corpus, consisting of 1644 abstracts with manually annotated adverse drug events. Fifty abstracts were used for training, the remaining abstracts were used for testing. The knowledge-based system obtained an F-score of 50.5%, which was 34.4 percentage points better than the co-occurrence baseline. Increasing the training set to 400 abstracts improved the F-score to 54.3%. When the system was compared with a machine-learning system, jSRE, on a subset of the sentences in the ADE corpus, our knowledge-based system achieved an F-score that is 7 percentage points higher than the F-score of jSRE trained on 50 abstracts, and still 2 percentage points higher than jSRE trained on 90% of the corpus. A knowledge-based approach can be successfully used to extract adverse drug events from biomedical text without need for a large training set. Whether use of a knowledge base is equally advantageous for other biomedical relation-extraction tasks remains to be investigated.

  16. Public understanding of visual representations of uncertainty in temperature forecasts

    NARCIS (Netherlands)

    Tak, Susanne; Toet, Alexander; van Erp, Johannes Bernardus Fransiscus

    Multiday weather forecasts often include graphical representations of uncertainty. However, visual representations of probabilistic events are often misinterpreted by the general public. Although various uncertainty visualizations are now in use, the parameters that determine their successful

  17. Public understanding of visual representations of uncertainty in temperature forecasts

    NARCIS (Netherlands)

    Tak, S.W.; Toet, A.; Erp, J.B.F. van

    2015-01-01

    Multiday weather forecasts often include graphical representations of uncertainty. However, visual representations of probabilistic events are often misinterpreted by the general public. Although various uncertainty visualizations are now in use, the parameters that determine their successful

  18. Event-based prospective memory in children with sickle cell disease: effect of cue distinctiveness.

    Science.gov (United States)

    McCauley, Stephen R; Pedroza, Claudia

    2010-01-01

    Event-based prospective memory (EB-PM) is the formation of an intention and remembering to perform it in response to a specific event. Currently, EB-PM performance in children with sickle cell disease (SCD) is unknown. In this study, we designed a computer-based task of EB-PM; No-Stroke, Silent-Infarct, and Overt-Stroke groups performed significantly below the demographically similar control group without SCD. Cue distinctiveness was varied to determine if EB-PM could be improved. All groups, with the exception of the Overt-Stroke group, performed significantly better with a perceptually distinctive cue. Overall, these results suggest that EB-PM can be improved significantly in many children with SCD.

  19. Signature Based Detection of User Events for Post-mortem Forensic Analysis

    Science.gov (United States)

    James, Joshua Isaac; Gladyshev, Pavel; Zhu, Yuandong

    This paper introduces a novel approach to user event reconstruction by showing the practicality of generating and implementing signature-based analysis methods to reconstruct high-level user actions from a collection of low-level traces found during a post-mortem forensic analysis of a system. Traditional forensic analysis, and the inferences an investigator normally makes when given digital evidence, are examined. It is then demonstrated that this natural process of inferring high-level events from low-level traces may be encoded using signature-matching techniques. Simple signatures using the defined method are created and applied for three popular Windows-based programs as a proof of concept.
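
    The signature-matching step can be sketched as a simple subset test: a high-level user action is inferred when all of its characteristic low-level traces are present in the evidence. The signatures and trace names below are hypothetical, not those derived in the paper.

      # Illustrative sketch of signature matching: a high-level user action is inferred
      # when all of its characteristic low-level traces are present in the evidence.
      # Signatures and trace names are hypothetical.
      SIGNATURES = {
          "USB device attached": {"registry:USBSTOR_key", "log:setupapi_install"},
          "File opened with viewer": {"registry:RecentDocs_entry", "prefetch:VIEWER.EXE"},
      }

      def reconstruct(traces):
          """Return the high-level actions whose signature traces are all present."""
          traces = set(traces)
          return [action for action, sig in SIGNATURES.items() if sig <= traces]

      found = ["registry:USBSTOR_key", "log:setupapi_install", "prefetch:NOTEPAD.EXE"]
      print(reconstruct(found))                # -> ['USB device attached']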

  20. Event-Based Control Strategy for Mobile Robots in Wireless Environments.

    Science.gov (United States)

    Socas, Rafael; Dormido, Sebastián; Dormido, Raquel; Fabregas, Ernesto

    2015-12-02

    In this paper, a new event-based control strategy for mobile robots is presented. It has been designed to work in wireless environments where a centralized controller has to interchange information with the robots over an RF (radio frequency) interface. The event-based architectures have been developed for differential wheeled robots, although they can be applied to other kinds of robots in a simple way. The solution has been checked with classical navigation algorithms, like wall following and obstacle avoidance, using scenarios with a single robot or multiple robots. A comparison between the proposed architectures and the classical discrete-time strategy is also carried out. The experimental results show that the proposed solution has a higher efficiency in communication resource usage than the classical discrete-time strategy with the same accuracy.