Probabilistic Flood Defence Assessment Tools
Directory of Open Access Journals (Sweden)
Slomp Robert
2016-01-01
institutions managing the flood defences, and not just by a small number of experts in probabilistic assessment. Therefore, data management and use of software are main issues that have been covered in courses and training in 2016 and 2017. All in all, this is the largest change in the assessment of Dutch flood defences since 1996, when probabilistic techniques were first introduced to determine hydraulic boundary conditions (water levels and waves: wave height, wave period and direction) for different return periods. To simplify the process, the assessment continues to consist of a three-step approach, moving from simple decision rules, to methods for semi-probabilistic assessment, and finally to a fully probabilistic analysis to compare the strength of flood defences with the hydraulic loads. The formal assessment results are thus mainly based on the fully probabilistic analysis and the ultimate limit state of the strength of a flood defence. For complex flood defences, additional models and software were developed. The current Hydra software suite (for policy analysis, formal flood defence assessment and design) will be replaced by the model Ringtoets. New stand-alone software has been developed for revetments, geotechnical analysis and slope stability of the foreshore. Design software and policy analysis software, including the Delta model, will be updated in 2018. A fully probabilistic method results in more precise assessments and more transparency in the process of assessment and reconstruction of flood defences. This is of increasing importance, as large-scale infrastructural projects in a highly urbanized environment are increasingly subject to political and societal pressure to add additional features. For this reason, it is of increasing importance to be able to determine which new feature really adds to flood protection, to quantify how much it adds to the level of flood protection and to evaluate whether it is really worthwhile.
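The assessment steps above all rest on hydraulic loads defined per return period. As a generic sketch of how a return-period water level can be estimated from annual maxima (a simple method-of-moments Gumbel fit; this is an illustration, not the actual Hydra/Ringtoets computation, and the data are invented):

```python
import math

def gumbel_level(annual_maxima, return_period):
    """Water level for a given return period from a Gumbel fit
    (method of moments) to annual maximum levels."""
    n = len(annual_maxima)
    mean = sum(annual_maxima) / n
    var = sum((x - mean) ** 2 for x in annual_maxima) / (n - 1)
    beta = math.sqrt(6 * var) / math.pi   # Gumbel scale parameter
    mu = mean - 0.5772 * beta             # location (Euler-Mascheroni constant)
    p_exceed = 1.0 / return_period
    # Invert the Gumbel CDF: F(x) = exp(-exp(-(x - mu) / beta))
    return mu - beta * math.log(-math.log(1.0 - p_exceed))
```

The 100-year level from such a fit lies well above the observed maxima, which is exactly why extrapolation uncertainty matters in the assessment.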
Probabilistic flood extent estimates from social media flood observations
Brouwer, Tom; Eilander, Dirk; van Loenen, Arnejan; Booij, Martijn J.; Wijnberg, Kathelijne M.; Verkade, Jan S.; Wagemaker, Jurjen
2017-05-01
The increasing number and severity of floods, driven by phenomena such as urbanization, deforestation, subsidence and climate change, create a growing need for accurate and timely flood maps. In this paper we present and evaluate a method to create deterministic and probabilistic flood maps from Twitter messages that mention locations of flooding. A deterministic flood map created for the December 2015 flood in the city of York (UK) showed good performance (F(2) = 0.69; a statistic ranging from 0 to 1, with 1 expressing a perfect fit with validation data). The probabilistic flood maps we created showed that, in the York case study, the uncertainty in flood extent was mainly induced by errors in the precise locations of flood observations as derived from Twitter data. Errors in the terrain elevation data or in the parameters of the applied algorithm contributed less to flood extent uncertainty. Although these maps tended to overestimate the actual probability of flooding, they gave a reasonable representation of flood extent uncertainty in the area. This study illustrates that inherently uncertain data from social media can be used to derive information about flooding.
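The F(2) statistic quoted above is the F-beta score with beta = 2, which weights recall (missed flooded cells) more heavily than precision. A minimal implementation of the standard formula (not code from the study):

```python
def f_beta(tp, fp, fn, beta=2.0):
    """F-beta score from true positives, false positives and false
    negatives; beta > 1 emphasizes recall over precision."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    b2 = beta ** 2
    return (1 + b2) * precision * recall / (b2 * precision + recall)
```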
Probabilistic, meso-scale flood loss modelling
Kreibich, Heidi; Botto, Anna; Schröter, Kai; Merz, Bruno
2016-04-01
Flood risk analyses are an important basis for decisions on flood risk management and adaptation. However, such analyses are associated with significant uncertainty, even more so if changes in risk due to global change are expected. Although uncertainty analysis and probabilistic approaches have received increased attention during the last years, they are still not standard practice for flood risk assessments, and even less so for flood loss modelling. State of the art in flood loss modelling is still the use of simple, deterministic approaches like stage-damage functions. Novel probabilistic, multi-variate flood loss models have been developed and validated on the micro-scale using a data-mining approach, namely bagging decision trees (Merz et al. 2013). In this presentation we demonstrate and evaluate the upscaling of the approach to the meso-scale, namely on the basis of land-use units. The model is applied in 19 municipalities which were affected during the 2002 flood by the River Mulde in Saxony, Germany (Botto et al. submitted). The application of bagging decision tree based loss models provides a probability distribution of estimated loss per municipality. Validation is undertaken on the one hand via a comparison with eight deterministic loss models, including stage-damage functions as well as multi-variate models. On the other hand, the results are compared with official loss data provided by the Saxon Relief Bank (SAB). The results show that uncertainties of loss estimation remain high. Thus, the significant advantage of this probabilistic flood loss estimation approach is that it inherently provides quantitative information about the uncertainty of the prediction. References: Merz, B.; Kreibich, H.; Lall, U. (2013): Multi-variate flood damage assessment: a tree-based data-mining approach. NHESS, 13(1), 53-64. Botto, A.; Kreibich, H.; Merz, B.; Schröter, K. (submitted): Probabilistic, multi-variable flood loss modelling on the meso-scale with BT-FLEMO. Risk Analysis.
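The bagging idea behind such loss models can be sketched with a toy base learner: each bootstrap resample of the damage records yields one prediction, and the ensemble of predictions forms the loss distribution. This is a deliberately simplified stand-in (a 1-nearest-neighbour lookup instead of the decision trees of BT-FLEMO; the records are invented):

```python
import random

def bagged_predictions(train, x, n_models=200, seed=42):
    """Bootstrap-aggregated 1-nearest-neighbour loss model: each
    bootstrap resample yields one prediction, so the ensemble gives
    a predictive distribution rather than a point estimate.
    `train` is a list of (water_depth, relative_loss) records."""
    rng = random.Random(seed)
    preds = []
    for _ in range(n_models):
        boot = [rng.choice(train) for _ in train]  # resample with replacement
        # predict with the record whose depth is closest to x
        depth, loss = min(boot, key=lambda rec: abs(rec[0] - x))
        preds.append(loss)
    return preds
```

Quantiles of the returned list then serve as the predictive interval of the loss estimate.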
Probabilistic Flood Mapping using Volunteered Geographical Information
Rivera, S. J.; Girons Lopez, M.; Seibert, J.; Minsker, B. S.
2016-12-01
Flood extent maps are widely used by decision makers and first responders to provide critical information that prevents economic impacts and the loss of human lives. These maps are usually obtained from sensor data and/or hydrologic models, which often have limited coverage in space and time. Recent developments in social media and communication technology have created a wealth of near-real-time, user-generated content during flood events in many urban areas, such as flooded locations, pictures of flooding extent and height, etc. These data could improve decision-making and response operations as events unfold. However, the integration of these data sources has been limited due to the need for methods that can extract and translate the data into useful information for decision-making. This study presents an approach that uses volunteer geographic information (VGI) and non-traditional data sources (e.g., Twitter, Flickr, YouTube, and 911 and 311 calls) to generate/update the flood extent maps in areas where no models and/or gauge data are operational. The approach combines Web-crawling and computer vision techniques to gather information about the location, extent, and water height of the flood from unstructured textual data, images, and videos. These estimates are then used to provide an updated flood extent map for areas surrounding the geo-coordinate of the VGI through the application of a Hydro Growing Region Algorithm (HGRA). HGRA combines hydrologic and image segmentation concepts to estimate a probabilistic flooding extent along the corresponding creeks. Results obtained for a case study in Austin, TX (i.e., 2015 Memorial Day flood) were comparable to those obtained by a calibrated hydrologic model and had good spatial correlation with flooding extents estimated by the Federal Emergency Management Agency (FEMA).
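The growing-region idea can be illustrated with a minimal sketch: starting from the cell of a geo-located observation, 4-connected neighbouring cells are added while their terrain elevation stays below the estimated water surface. This is an assumption-laden simplification of the HGRA, which additionally applies hydrologic constraints along the creeks:

```python
from collections import deque

def grow_flood_extent(dem, seed, water_level):
    """Grow a flood extent from a geo-located observation (seed cell):
    breadth-first search over 4-connected cells whose terrain
    elevation lies at or below the estimated water surface."""
    rows, cols = len(dem), len(dem[0])
    flooded, queue = set(), deque([seed])
    while queue:
        r, c = queue.popleft()
        if (r, c) in flooded or not (0 <= r < rows and 0 <= c < cols):
            continue
        if dem[r][c] > water_level:
            continue  # dry: terrain sits above the water surface
        flooded.add((r, c))
        queue.extend([(r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)])
    return flooded
```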
Adaptive Probabilistic Flooding for Multipath Routing
Betoule, Christophe; Clavier, Remi; Rossi, Dario; Rossini, Giuseppe; Thouenon, Gilles
2011-01-01
In this work, we develop a distributed source routing algorithm for topology discovery suitable for ISP transport networks, which is nevertheless inspired by opportunistic algorithms used in ad hoc wireless networks. We propose a plug-and-play control plane able to find multiple paths toward the same destination, and introduce a novel algorithm, called adaptive probabilistic flooding, to achieve this goal. By keeping a small amount of state in routers taking part in the discovery process, our technique significantly limits the amount of control messages exchanged with flooding -- and, at the same time, it only minimally affects the quality of the discovered multiple paths with respect to the optimal solution. Simple analytical bounds, confirmed by results gathered with extensive simulations on four realistic topologies, show our approach to be of high practical interest.
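A bare-bones sketch of probabilistic flooding with a rebroadcast probability that decays per hop (the paper's adaptive scheme maintains richer per-router state; the decay rule and graph here are illustrative assumptions):

```python
import random

def probabilistic_flood(graph, source, p0, decay=0.7, seed=1):
    """Each node that accepts the discovery packet rebroadcasts it to
    its neighbours; acceptance probability decays with hop count,
    trading discovered paths against control-message overhead."""
    rng = random.Random(seed)
    received = {source}
    frontier = [(source, p0)]
    messages = 0
    while frontier:
        nxt = []
        for node, p in frontier:
            for nb in graph[node]:
                messages += 1  # every broadcast costs one control message
                if nb not in received and rng.random() < p:
                    received.add(nb)
                    nxt.append((nb, p * decay))
        frontier = nxt
    return received, messages
```

With `p0 = 1.0` and `decay = 1.0` this degenerates to plain flooding; lowering either cuts message overhead at the risk of missing nodes.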
Opportunities of probabilistic flood loss models
Schröter, Kai; Kreibich, Heidi; Lüdtke, Stefan; Vogel, Kristin; Merz, Bruno
2016-04-01
Oftentimes, traditional uni-variate damage models such as depth-damage curves fail to reproduce the variability of observed flood damage. However, reliable flood damage models are a prerequisite for the practical usefulness of the model results. Innovative multi-variate probabilistic modelling approaches are promising to capture and quantify the uncertainty involved and thus to improve the basis for decision making. In this study we compare the predictive capability of two probabilistic modelling approaches, namely Bagging Decision Trees and Bayesian Networks, and traditional stage-damage functions. For model evaluation we use empirical damage data which are available from computer-aided telephone interviews compiled after the floods of 2002, 2005, 2006 and 2013 in the Elbe and Danube catchments in Germany. We carry out a split sample test by sub-setting the damage records. One sub-set is used to derive the models and the remaining records are used to evaluate the predictive performance of the model. Further, we stratify the sample according to catchments, which allows studying model performance in a spatial transfer context. Flood damage estimation is carried out on the scale of the individual buildings in terms of relative damage. The predictive performance of the models is assessed in terms of systematic deviations (mean bias), precision (mean absolute error), as well as in terms of sharpness and reliability of the predictions, the latter represented by the proportion of observations that fall within the 5%- to 95%-quantile predictive interval. The comparison of the uni-variate stage-damage function and the multi-variate model approaches emphasises the importance of quantifying predictive uncertainty. With each explanatory variable, the multi-variate model reveals an additional source of uncertainty. However, the predictive performance in terms of systematic deviations (MBE), precision (MAE) and reliability (HR) is clearly improved.
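The reliability measure described above, coverage of the central predictive interval, can be computed directly from ensemble predictions. A minimal sketch with approximate empirical quantiles (not the study's code):

```python
def interval_hit_rate(observations, ensembles, lo=0.05, hi=0.95):
    """Share of observations falling inside the central 5%-95%
    predictive interval of the corresponding ensemble forecast."""
    hits = 0
    for obs, ens in zip(observations, ensembles):
        s = sorted(ens)
        n = len(s)
        q_lo = s[round(lo * (n - 1))]  # approximate empirical quantiles
        q_hi = s[round(hi * (n - 1))]
        hits += q_lo <= obs <= q_hi
    return hits / len(observations)
```

A well-calibrated 5%-95% interval should yield a hit rate near 0.9; much lower values indicate overconfident predictions.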
A probabilistic bridge safety evaluation against floods.
Liao, Kuo-Wei; Muto, Yasunori; Chen, Wei-Lun; Wu, Bang-Ho
2016-01-01
To further capture the influences of uncertain factors on river bridge safety evaluation, a probabilistic approach is adopted. Because this is a systemic and nonlinear problem, reliability analyses based on the most probable point (MPP) are not suitable. A sampling approach such as a Monte Carlo simulation (MCS) or importance sampling is often adopted. To enhance the efficiency of the sampling approach, this study utilizes Bayesian least squares support vector machines to construct a response surface followed by an MCS, providing a more precise safety index. Although there are several factors impacting the flood-resistant reliability of a bridge, previous experiences and studies show that the reliability of the bridge itself plays a key role. Thus, the goal of this study is to analyze the system reliability of a selected bridge that includes five limit states. The random variables considered here include the water surface elevation, water velocity, local scour depth, soil property and wind load. Because the first three variables are deeply affected by river hydraulics, a probabilistic HEC-RAS-based simulation is performed to capture the uncertainties in those random variables. The accuracy and variation of our solutions are confirmed by a direct MCS to ensure the applicability of the proposed approach. The results of a numerical example indicate that the proposed approach can efficiently provide an accurate bridge safety evaluation and maintain satisfactory variation.
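The sampling step can be illustrated with a crude Monte Carlo estimate of a failure probability for a toy limit state (the study evaluates five limit states through a Bayesian LS-SVM response surface fed by HEC-RAS-derived distributions; the lognormal parameters below are invented for illustration):

```python
import random

def mc_failure_probability(n, seed=7):
    """Crude Monte Carlo estimate of P(g < 0) for a toy limit state
    g = capacity - demand, with lognormal capacity (resistance) and
    demand (load effect)."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n):
        capacity = rng.lognormvariate(1.0, 0.25)  # resistance
        demand = rng.lognormvariate(0.5, 0.4)     # load effect
        failures += demand > capacity             # limit state violated
    return failures / n
```

A response surface replaces the expensive model call inside the loop; the MCS logic itself is unchanged.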
Imprecise probabilistic estimation of design floods with epistemic uncertainties
Qi, Wei; Zhang, Chi; Fu, Guangtao; Zhou, Huicheng
2016-06-01
An imprecise probabilistic framework for design flood estimation is proposed on the basis of the Dempster-Shafer theory to handle different epistemic uncertainties from data, probability distribution functions, and probability distribution parameters. These uncertainties are incorporated in cost-benefit analysis to generate the lower and upper bounds of the total cost for flood control, thus presenting improved information for decision making on design floods. Within the total cost bounds, a new robustness criterion is proposed to select a design flood that can tolerate higher levels of uncertainty. A variance decomposition approach is used to quantify individual and interactive impacts of the uncertainty sources on total cost. Results from three case studies, with 127, 104, and 54 year flood data sets, respectively, show that the imprecise probabilistic approach effectively combines aleatory and epistemic uncertainties from the various sources and provides upper and lower bounds of the total cost. Between the total cost and the robustness of design floods, a clear trade-off which is beyond the information that can be provided by the conventional minimum cost criterion is identified. The interactions among data, distributions, and parameters have a much higher contribution than parameters to the estimate of the total cost. It is found that the contributions of the various uncertainty sources and their interactions vary with different flood magnitude, but remain roughly the same with different return periods. This study demonstrates that the proposed methodology can effectively incorporate epistemic uncertainties in cost-benefit analysis of design floods.
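The core idea of imprecise probabilities, bounding a decision quantity when an input probability is only known as an interval, can be shown in miniature (a one-scenario cost model; the Dempster-Shafer machinery of the paper generalizes this to belief structures over data, distributions and parameters):

```python
def cost_bounds(protection_cost, damage, p_exceed_interval):
    """Lower and upper expected total cost when the annual exceedance
    probability of the design flood is only known as an interval
    [p_lo, p_hi] due to epistemic uncertainty."""
    p_lo, p_hi = p_exceed_interval
    lower = protection_cost + p_lo * damage
    upper = protection_cost + p_hi * damage
    return lower, upper
```

A robust design flood is then one whose upper-bound cost stays acceptable, which is the trade-off the paper's robustness criterion formalizes.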
Testing probabilistic adaptive real-time flood forecasting models
Smith, P.J.; Beven, K.J.; Leedal, D.; Weerts, A.H.; Young, P.C.
2014-01-01
Operational flood forecasting has become a complex and multifaceted task, increasingly being treated in probabilistic ways to allow for the inherent uncertainties in the forecasting process. This paper reviews recent applications of data-based mechanistic (DBM) models within the operational UK Natio
Probabilistic modeling of financial exposure to flood in France
Moncoulon, David; Quantin, Antoine; Leblois, Etienne
2014-05-01
CCR is a French reinsurance company which offers natural catastrophe covers with the State guarantee. Within this framework, CCR develops its own models to assess its financial exposure to floods, droughts, earthquakes and other perils, and thus the exposure of insurers and the French State. A probabilistic flood model has been developed in order to estimate the financial exposure of the Nat Cat insurance market to flood events, depending on their annual occurrence probability. This presentation is organized in two parts. The first part is dedicated to the development of a flood hazard and damage model (ARTEMIS). The model calibration and validation on historical events are then described. In the second part, the coupling of ARTEMIS with two generators of probabilistic events is achieved: a stochastic flow generator and a stochastic spatialized precipitation generator, adapted from the SAMPO model developed by IRSTEA. An analysis of the complementary nature of these two generators is proposed: the first one allows generating floods on the French hydrological station network; the second allows simulating surface water runoff and small-river floods, even on ungauged rivers. Thus, the simulation of thousands of non-occurred, but possible, events allows us to provide for the first time an estimate of the financial exposure to flooding in France at different scales (commune, department, country) and from different points of view (hazard, vulnerability and damages).
Framework for probabilistic flood risk assessment in an Alpine region
Schneeberger, Klaus; Huttenlau, Matthias; Steinberger, Thomas; Achleitner, Stefan; Stötter, Johann
2014-05-01
Flooding is among the natural hazards that regularly cause significant losses to property and human lives. The assessment of flood risk delivers crucial information for all participants involved in flood risk management, and especially for local authorities and insurance companies, in order to estimate the possible flood losses. Therefore a framework for assessing flood risk has been developed and is introduced in the presented contribution. Flood risk is thereby defined as the combination of the probability of flood events and of potential flood damages. The probability of occurrence is described through the spatial and temporal characterisation of floods. The potential flood damages are determined in the course of a vulnerability assessment, whereby the exposure and the vulnerability of the elements at risk are considered. Direct costs caused by flooding, with a focus on residential buildings, are analysed. The innovative part of this contribution lies in the development of a framework which takes the probability of flood events and their spatio-temporal characteristics into account. Usually the probability of flooding is determined by means of recurrence intervals for an entire catchment without any spatial variation. This may lead to a misinterpretation of the flood risk. Within the presented framework the probabilistic flood risk assessment is based on the analysis of a large number of spatially correlated flood events. Since the number of historic flood events is relatively small, additional events have to be generated synthetically. This temporal extrapolation is realised by means of the method proposed by Heffernan and Tawn (2004). It is used to generate a large number of possible spatially correlated flood events within a larger catchment. The approach is based on the modelling of multivariate extremes considering the spatial dependence structure of flood events. The input for this approach are time series derived from river gauging stations. In a next step the
Probabilistic modelling of flood events using the entropy copula
Li, Fan; Zheng, Qian
2016-11-01
The estimation of flood frequency is vital for flood control strategies and hydraulic structure design. Generating synthetic flood events according to the statistical properties of observations is one of the plausible methods to analyze flood frequency. Due to the statistical dependence among the flood event variables (i.e. the flood peak, volume and duration), a multidimensional joint probability estimation is required. Recently, the copula method has been widely used for constructing multivariate dependence structures; however, the copula family must be chosen before application, and the choice process is sometimes rather subjective. The entropy copula, a new copula family employed in this research, provides a way to avoid this relatively subjective process by combining the theories of copula and entropy. The analysis shows the effectiveness of the entropy copula for probabilistic modelling of the flood events of two hydrological gauges, and a comparison of accuracy with popular copulas was made. The Gibbs sampling technique was applied for trivariate flood event simulation in order to mitigate the calculation difficulties of extending directly to three dimensions. The simulation results indicate that the entropy copula is a simple and effective copula family for trivariate flood simulation.
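The separation of dependence structure from marginals that any copula provides can be sketched with a bivariate Gaussian copula (used here purely as an illustration of the concept; it is not the entropy copula of the study):

```python
import math
import random

def gaussian_copula_sample(n, rho, seed=3):
    """Sample dependent (u, v) pairs of uniforms via a bivariate
    Gaussian copula: draw correlated standard normals, then map each
    through the normal CDF. The uniforms can afterwards be mapped
    through any flood-peak/volume marginals."""
    rng = random.Random(seed)
    phi = lambda z: 0.5 * (1 + math.erf(z / math.sqrt(2)))  # standard normal CDF
    out = []
    for _ in range(n):
        z1 = rng.gauss(0, 1)
        z2 = rho * z1 + math.sqrt(1 - rho ** 2) * rng.gauss(0, 1)
        out.append((phi(z1), phi(z2)))
    return out
```

Extending this to three dependent variables is where direct sampling becomes awkward, which motivates the Gibbs sampling used in the study.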
What do we gain with Probabilistic Flood Loss Models?
Schroeter, K.; Kreibich, H.; Vogel, K.; Merz, B.; Lüdtke, S.
2015-12-01
The reliability of flood loss models is a prerequisite for their practical usefulness. Oftentimes, traditional uni-variate damage models such as depth-damage curves fail to reproduce the variability of observed flood damage. Innovative multi-variate probabilistic modelling approaches are promising to capture and quantify the uncertainty involved and thus to improve the basis for decision making. In this study we compare the predictive capability of two probabilistic modelling approaches, namely Bagging Decision Trees and Bayesian Networks, and traditional stage-damage functions which are cast in a probabilistic framework. For model evaluation we use empirical damage data which are available from computer-aided telephone interviews compiled after the floods of 2002, 2005, 2006 and 2013 in the Elbe and Danube catchments in Germany. We carry out a split sample test by sub-setting the damage records. One sub-set is used to derive the models and the remaining records are used to evaluate the predictive performance of the model. Further, we stratify the sample according to catchments, which allows studying model performance in a spatial transfer context. Flood damage estimation is carried out on the scale of the individual buildings in terms of relative damage. The predictive performance of the models is assessed in terms of systematic deviations (mean bias), precision (mean absolute error), as well as in terms of reliability, which is represented by the proportion of observations that fall within the 5%- to 95%-quantile predictive interval. The reliability of the probabilistic predictions within validation runs decreases only slightly and achieves a very good coverage of observations within the predictive interval. Probabilistic models provide quantitative information about prediction uncertainty, which is crucial to assess the reliability of model predictions and improves the usefulness of model results.
Flood forecasting using medium-range probabilistic weather prediction
Directory of Open Access Journals (Sweden)
B. T. Gouweleeuw
2005-01-01
Following the developments in short- and medium-range weather forecasting over the last decade, operational flood forecasting also appears to show a shift from a so-called single-solution or 'best guess' deterministic approach towards a probabilistic approach based on ensemble techniques. While this probabilistic approach is now more or less common practice and well established in the meteorological community, operational flood forecasters have only started to look for ways to interpret, and mitigate for end-users, the prediction products obtained by combining so-called Ensemble Prediction Systems (EPS) of Numerical Weather Prediction (NWP) models with rainfall-runoff models. This paper presents initial results obtained by combining deterministic and EPS hindcasts of the global NWP model of the European Centre for Medium-Range Weather Forecasts (ECMWF) with the large-scale hydrological model LISFLOOD for two historic flood events: the river Meuse flood in January 1995 and the river Odra flood in July 1997. In addition, a possible way to interpret the obtained ensemble-based streamflow prediction is proposed.
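Once an ensemble of streamflow hindcasts exists, the probabilistic product end-users most often need is an exceedance probability for a critical stage or discharge; a minimal sketch of that interpretation step:

```python
def exceedance_probability(ensemble_peaks, threshold):
    """Probability of exceeding a critical flood threshold, read
    directly off an ensemble of simulated flood peaks."""
    return sum(peak > threshold for peak in ensemble_peaks) / len(ensemble_peaks)
```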
Forecaster priorities for improving probabilistic flood forecasts
Wetterhall, Fredrik; Pappenberger, Florian; Alfieri, Lorenzo; Cloke, Hannah; Thielen, Jutta
2014-05-01
Hydrological ensemble prediction systems (HEPS) have in recent years been increasingly used for the operational forecasting of floods by European hydrometeorological agencies. The most obvious advantage of HEPS is that more of the uncertainty in the modelling system can be assessed. In addition, ensemble prediction systems generally have better skill than deterministic systems, both in terms of the mean forecast performance and the potential forecasting of extreme events. Research efforts have so far mostly been devoted to the improvement of the physical and technical aspects of the model systems, such as increased resolution in time and space and better description of physical processes. Developments like these are certainly needed; however, in this paper we argue that there are other areas of HEPS that need urgent attention. This was also the result of a group exercise and a survey conducted among operational forecasters within the European Flood Awareness System (EFAS) to identify the top priorities for improvement regarding their own system. These turned out to span a range of areas, the most popular being the inclusion of verification and assessment of past forecast performance, a multi-model approach for hydrological modelling, increased forecast skill on the medium range (>3 days), and more focus on education and training on the interpretation of forecasts. In light of limited resources, we suggest a simple model to classify the identified priorities in terms of their cost and complexity, to decide in which order to tackle them. This model is then used to create an action plan of short-, medium- and long-term research priorities, with the ultimate goal of an optimal improvement of EFAS in particular and to spur the development of operational HEPS in general.
A framework for probabilistic pluvial flood nowcasting for urban areas
Ntegeka, Victor; Murla, Damian; Wang, Lipen; Foresti, Loris; Reyniers, Maarten; Delobbe, Laurent; Van Herk, Kristine; Van Ootegem, Luc; Willems, Patrick
2016-04-01
Pluvial flood nowcasting is gaining ground not least because of the advancements in rainfall forecasting schemes. Short-term forecasts and applications have benefited from the availability of such forecasts with high resolution in space (~1km) and time (~5min). In this regard, it is vital to evaluate the potential of nowcasting products for urban inundation applications. One of the most advanced Quantitative Precipitation Forecasting (QPF) techniques is the Short-Term Ensemble Prediction System, which was originally co-developed by the UK Met Office and Australian Bureau of Meteorology. The scheme was further tuned to better estimate extreme and moderate events for the Belgian area (STEPS-BE). Against this backdrop, a probabilistic framework has been developed that consists of: (1) rainfall nowcasts; (2) sewer hydraulic model; (3) flood damage estimation; and (4) urban inundation risk mapping. STEPS-BE forecasts are provided at high resolution (1km/5min) with 20 ensemble members with a lead time of up to 2 hours using a 4 C-band radar composite as input. Forecasts' verification was performed over the cities of Leuven and Ghent and biases were found to be small. The hydraulic model consists of the 1D sewer network and an innovative 'nested' 2D surface model to model 2D urban surface inundations at high resolution. The surface components are categorized into three groups and each group is modelled using triangular meshes at different resolutions; these include streets (3.75 - 15 m2), high flood hazard areas (12.5 - 50 m2) and low flood hazard areas (75 - 300 m2). Functions describing urban flood damage and social consequences were empirically derived based on questionnaires to people in the region that were recently affected by sewer floods. Probabilistic urban flood risk maps were prepared based on spatial interpolation techniques of flood inundation. The method has been implemented and tested for the villages Oostakker and Sint-Amandsberg, which are part of the
Probabilistic flood forecast: Exact and approximate predictive distributions
Krzysztofowicz, Roman
2014-09-01
For quantification of predictive uncertainty at the forecast time t0, the future hydrograph is viewed as a discrete-time continuous-state stochastic process {Hn: n=1,…,N}, where Hn is the river stage at time instance tn>t0. The probabilistic flood forecast (PFF) should specify a sequence of exceedance functions {F‾n: n=1,…,N} such that F‾n(h)=P(Zn>h), where P stands for probability, and Zn is the maximum river stage within time interval (t0,tn], practically Zn=max{H1,…,Hn}. This article presents a method for deriving the exact PFF from a probabilistic stage transition forecast (PSTF) produced by the Bayesian forecasting system (BFS). It then recalls (i) the bounds on F‾n, which can be derived cheaply from a probabilistic river stage forecast (PRSF) produced by a simpler version of the BFS, and (ii) an approximation to F‾n, which can be constructed from the bounds via a recursive linear interpolator (RLI) without information about the stochastic dependence in the process {H1,…,Hn}, as this information is not provided by the PRSF. The RLI is substantiated by comparing the approximate PFF against the exact PFF. Being reasonably accurate and very simple, the RLI may be attractive for real-time flood forecasting in systems of lesser complexity. All methods are illustrated with a case study for a 1430 km headwater basin wherein the PFF is produced for a 72-h interval discretized into 6-h steps.
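The empirical counterpart of the exceedance functions F̄n can be read directly off Monte Carlo stage trajectories (a sketch of the definition F̄n(h) = P(max{H1,…,Hn} > h), not of the Bayesian forecasting system itself):

```python
def pff_exceedance(paths, h):
    """Empirical exceedance probabilities of the running maximum:
    for each lead time n, the fraction of simulated stage
    trajectories whose maximum over steps 1..n exceeds h."""
    n_steps = len(paths[0])
    out = []
    for n in range(1, n_steps + 1):
        count = sum(max(path[:n]) > h for path in paths)
        out.append(count / len(paths))
    return out
```

Because the maximum can only grow with n, the resulting sequence is non-decreasing, matching the bounds discussed in the article.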
Juarez, A. M.; Kibler, K. M.; Sayama, T.; Ohara, M.
2016-12-01
Flood management decision-making is often supported by risk assessment, which may overlook the role of coping capacity and the potential benefits derived from direct use of flood-prone land. Alternatively, risk-benefit analysis can support floodplain management to yield maximum socio-ecological benefits for the minimum flood risk. We evaluate flood risk-probabilistic benefit tradeoffs of livelihood practices compatible with direct human use of flood-prone land (agriculture/wild fisheries) and nature conservation (wild fisheries only) in Candaba, Philippines. Located north-west of Metro Manila, the Candaba area is a multi-functional landscape that provides a temporally-variable mix of possible land uses, benefits and ecosystem services of local and regional value. To characterize inundation from 1.3- to 100-year recurrence intervals we couple frequency analysis with rainfall-runoff-inundation modelling and remotely-sensed data. By combining simulated probabilistic floods with both damage and benefit functions (e.g. fish capture and rice yield with flood intensity) we estimate potential damages and benefits over varying probabilistic flood hazards. We find that although direct human uses of flood-prone land are associated with damages, for all the investigated magnitudes of flood events with different frequencies the probabilistic benefits ($91 million) exceed risks ($33 million) by a large margin. Even considering risk, probabilistic livelihood benefits of direct human uses far exceed the benefits provided by scenarios that exclude direct "risky" human uses (difference of $85 million). In addition, we find that individual coping strategies, such as adapting crop planting periods to the flood pulse or fishing rather than cultivating rice in the wet season, minimize flood losses ($6 million) while allowing for valuable livelihood benefits ($125 million) in flood-prone land. Analysis of societal benefits and local capacities to cope with regular floods demonstrate the
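Probabilistic damages and benefits of this kind are typically summarized by integrating against annual exceedance probability over the simulated flood magnitudes. A standard trapezoidal sketch (illustrative numbers, not the Candaba data):

```python
def expected_annual_value(return_periods, values):
    """Expected annual damage (or benefit): trapezoidal integration
    of the value associated with each flood magnitude against its
    annual exceedance probability (1 / return period)."""
    probs = [1.0 / t for t in return_periods]
    ead = 0.0
    for i in range(len(probs) - 1):
        # area of the trapezoid between two consecutive magnitudes
        ead += 0.5 * (values[i] + values[i + 1]) * (probs[i] - probs[i + 1])
    return ead
```

Running it separately on the damage and the benefit functions yields the annualized risk and benefit figures that the tradeoff analysis compares.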
Development of a Probabilistic Flood Hazard Assessment (PFHA) for the nuclear safety
Ben Daoued, Amine; Guimier, Laurent; Hamdi, Yasser; Duluc, Claire-Marie; Rebour, Vincent
2016-04-01
The purpose of this study is to lay the basis for a probabilistic flood hazard assessment (PFHA). Probabilistic assessment of external floods has become a current topic of interest to the nuclear scientific community. Probabilistic approaches complement deterministic approaches by exploring a set of scenarios and associating a probability with each of them. These approaches aim to identify all possible failure scenarios, combining their probabilities, in order to cover all possible sources of risk. They are based on the distributions of initiators and/or the variables characterizing these initiators. The PFHA can characterize, for example, the water level at a defined point of interest in the nuclear site. This probabilistic flood hazard characterization takes into account all the phenomena that can contribute to the flooding of the site. The main steps of the PFHA are: i) identification of flooding phenomena (rains, sea water level, etc.) and screening of the phenomena relevant to the nuclear site, ii) identification and probabilization of parameters associated with the selected flooding phenomena, iii) propagation of the probabilized parameters from the source to the point of interest in the site, iv) derivation of hazard curves and aggregation of the contributions of the flooding phenomena at the point of interest, taking into account the uncertainties. Within this framework, the methodology of the PFHA has been developed for several flooding phenomena (rain and/or sea water level, etc.) and then implemented and tested with a simplified case study. Following the same logic, our study is still in progress to take into account other flooding phenomena and to carry out more case studies.
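The aggregation step of a PFHA can be illustrated with a toy calculation: assuming the contributing phenomena are independent, their hazard curves at the point of interest combine through the complement of the joint non-exceedance probability. All numbers below are assumed for illustration:

```python
import numpy as np

# Illustrative annual exceedance probabilities of a given water level at the
# point of interest, for two independent flooding phenomena (assumed values).
levels = np.array([1.0, 1.5, 2.0, 2.5, 3.0])          # water level (m)
p_rain = np.array([1e-1, 3e-2, 1e-2, 3e-3, 1e-3])      # rain-induced flooding
p_sea  = np.array([5e-2, 2e-2, 5e-3, 1e-3, 2e-4])      # sea-level-induced flooding

# Under independence, a level is exceeded if at least one phenomenon
# causes an exceedance: P = 1 - (1 - P_rain)(1 - P_sea).
p_total = 1.0 - (1.0 - p_rain) * (1.0 - p_sea)

for z, p in zip(levels, p_total):
    print(f"level {z:.1f} m: annual exceedance probability {p:.2e}")
```

The combined curve always lies above each individual curve but below their sum, which is a useful sanity check on any aggregation code.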
A framework for probabilistic pluvial flood nowcasting for urban areas
DEFF Research Database (Denmark)
Ntegeka, Victor; Murla, Damian; Wang, Lipen
2016-01-01
-BE forecasts are provided at high resolution (1km/5min) with 20 ensemble members with a lead time of up to 2 hours using a 4 C-band radar composite as input. Forecasts’ verification was performed over the cities of Leuven and Ghent and biases were found to be small. The hydraulic model consists of the 1D sewer...... for recent historical flood events. The rainfall nowcasting, hydraulic sewer and 2D inundation modelling and socio-economical flood risk results each could be partly evaluated: the rainfall nowcasting results based on radar data and rain gauges; the hydraulic sewer model results based on water level...... and discharge data at pumping stations; the 2D inundation modelling results based on limited data on some recent flood locations and inundation depths; the results for the socio-economical flood consequences of the most extreme events based on claims in the database of the national disaster agency. Different...
Verification of a probabilistic flood forecasting system for an Alpine Region of northern Italy
Laiolo, P.; Gabellani, S.; Rebora, N.; Rudari, R.; Ferraris, L.; Ratto, S.; Stevenin, H.
2012-04-01
Probabilistic hydrometeorological forecasting chains are increasingly becoming an operational tool used by civil protection centres for issuing flood alerts. One of the most important requests of decision makers is to have reliable systems; for this reason, an accurate verification of their predictive performance becomes essential. The aim of this work is to validate a probabilistic flood forecasting system, Flood-PROOFS. The system has been working in real time since 2008 in an alpine region of northern Italy, Valle d'Aosta. It is used by the regional Civil Protection service to issue warnings and by the local water company to protect its facilities. Flood-PROOFS uses as input the Quantitative Precipitation Forecast (QPF) derived from the Italian limited-area meteorological model (COSMO-I7) and forecasts issued by regional expert meteorologists. Furthermore, the system manages and uses both real-time meteorological and satellite data and real-time data on the maneuvers performed by the water company on dams and river devices. The main outputs produced by the computational chain are deterministic and probabilistic discharge forecasts at different cross sections of the considered river network. The validation of the flood prediction system has been conducted over a 25-month period using different statistical methods such as the Brier score, rank histograms and verification scores. The results highlight good performance of the system as a support for issuing warnings, but there is a lack of statistics, especially for large discharge events.
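The Brier score mentioned above is the mean squared difference between the forecast probability of a binary event and its observed outcome. A minimal sketch, with toy ensemble discharges and a hypothetical warning threshold:

```python
import numpy as np

def brier_score(p_forecast, observed):
    """Mean squared error of probabilistic forecasts of a binary event (0 = perfect)."""
    p = np.asarray(p_forecast, dtype=float)
    o = np.asarray(observed, dtype=float)
    return np.mean((p - o) ** 2)

# Forecast probability that discharge exceeds the warning threshold, taken as
# the fraction of ensemble members exceeding it (toy values, not Flood-PROOFS output).
ensemble = np.array([
    [120, 140, 160, 180, 200],   # day 1 ensemble discharges (m3/s)
    [ 80,  90, 100, 110, 115],   # day 2
    [200, 220, 250, 260, 300],   # day 3
])
threshold = 150.0
p_exceed = (ensemble > threshold).mean(axis=1)

observed = np.array([1, 0, 1])   # did observed discharge exceed the threshold?
print("forecast probabilities:", p_exceed)
print("Brier score:", brier_score(p_exceed, observed))
```

A rank histogram complements this by counting where each observation falls within the sorted ensemble; a flat histogram indicates a statistically reliable ensemble.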
Probabilistic mapping of urban flood risk: Application to extreme events in Surat, India
Ramirez, Jorge; Rajasekar, Umamaheshwaran; Coulthard, Tom; Keiler, Margreth
2016-04-01
Surat, India is a coastal city that lies on the banks of the river Tapti, downstream of the Ukai dam. Given Surat's geographic location, its population of five million people is repeatedly exposed to flooding caused by high tide combined with large emergency dam releases into the Tapti river. In 2006 such a flood event occurred, when intense rainfall in the Tapti catchment caused a dam release of nearly 25,000 m3 s-1 that flooded 90% of the city. A first step towards strengthening resilience in Surat requires a robust method for mapping potential flood risk that considers the uncertainty in future dam releases. In this study we develop many combinations of dam release magnitude and duration for the Ukai dam. We then use these dam releases to drive a two-dimensional flood model (CAESAR-Lisflood) of Surat that also considers tidal effects. Our flood model of Surat utilizes fine-spatial-resolution (30 m) topography produced from an extensive differential global positioning system survey and measurements of river cross-sections. Within the city we have modelled scenarios that include extreme conditions with near-maximum dam release levels (e.g. the 1:250-year flood) and high tides. Results from all scenarios have been summarized into probabilistic flood risk maps for Surat. These maps are currently being integrated within the city disaster management plan to support both mitigation and adaptation measures for different flood scenarios.
Probabilistic mapping of flood-induced backscatter changes in SAR time series
Schlaffer, Stefan; Chini, Marco; Giustarini, Laura; Matgen, Patrick
2017-04-01
The information content of flood extent maps can be increased considerably by including information on the uncertainty of the flood area delineation. This additional information can be of benefit in flood forecasting and monitoring. Furthermore, flood probability maps can be converted to binary maps showing flooded and non-flooded areas by applying a threshold probability value pF = 0.5. In this study, a probabilistic change detection approach for flood mapping based on synthetic aperture radar (SAR) time series is proposed. For this purpose, conditional probability density functions (PDFs) for land and open water surfaces were estimated from ENVISAT ASAR Wide Swath (WS) time series containing >600 images using a reference mask of permanent water bodies. A pixel-wise harmonic model was used to account for seasonality in backscatter from land areas caused by soil moisture and vegetation dynamics. The approach was evaluated for a large-scale flood event along the River Severn, United Kingdom. The retrieved flood probability maps were compared to a reference flood mask derived from high-resolution aerial imagery by means of reliability diagrams. The obtained performance measures indicate both high reliability and confidence although there was a slight under-estimation of the flood extent, which may in part be attributed to topographically induced radar shadows along the edges of the floodplain. Furthermore, the results highlight the importance of local incidence angle for the separability between flooded and non-flooded areas as specular reflection properties of open water surfaces increase with a more oblique viewing geometry.
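The probabilistic classification step described above can be sketched with Bayes' rule applied to two class-conditional backscatter distributions. The Gaussian parameters and the flat prior below are assumptions for illustration, not the values estimated from the ENVISAT ASAR time series:

```python
import numpy as np
from scipy.stats import norm

# Assumed class-conditional backscatter distributions (dB); parameters are
# illustrative stand-ins for PDFs estimated from a SAR time series.
pdf_water = norm(loc=-18.0, scale=2.0)   # open water: low backscatter
pdf_land  = norm(loc=-8.0,  scale=3.0)   # land surface

def flood_probability(sigma0_db, prior_flood=0.5):
    """Posterior probability of open water given a backscatter value (Bayes' rule)."""
    lw = pdf_water.pdf(sigma0_db) * prior_flood
    ll = pdf_land.pdf(sigma0_db) * (1.0 - prior_flood)
    return lw / (lw + ll)

pixels = np.array([-20.0, -13.0, -6.0])  # toy backscatter values (dB)
pF = flood_probability(pixels)
flood_mask = pF > 0.5                    # binary map at the pF = 0.5 threshold
print(pF.round(3), flood_mask)
```

Thresholding the posterior at pF = 0.5 reproduces the binary flooded/non-flooded map, while the posterior itself carries the delineation uncertainty.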
Use of Geologic and Paleoflood Information for INL Probabilistic Flood Hazard Decisions
Ostenaa, D.; O'Connell, D.; Creed, B.
2009-05-01
The Big Lost River is a western U.S. closed-basin stream which flows through and terminates on the Idaho National Laboratory. Historic flows are highly regulated, and peak flows decline downstream through natural and anthropogenic influences. Glaciated headwater regions were the source of Pleistocene outburst floods which traversed the site. A wide range of DOE facilities (including a nuclear research reactor) require flood stage estimates for flow exceedance probabilities over a range from 1/100/yr to 1/100,000/yr per DOE risk-based standards. These risk management objectives required the integration of geologic and geomorphic paleoflood data into Bayesian nonparametric flood frequency analyses that incorporated measurement uncertainties in gaged, historical, and paleoflood discharges and non-exceedance bounds to produce fully probabilistic flood frequency estimates for annual exceedance probabilities of specific discharges of interest. Two-dimensional hydraulic flow modeling with scenarios for varied hydraulic parameters, infiltration, and culvert blockages on the site was conducted for a range of discharges from 13-700 m3/s. High-resolution topographic grids and two-dimensional flow modeling allowed detailed evaluation of the potential impacts of numerous secondary channels and flow paths resulting from flooding in extreme events. These results were used to construct stage probability curves for 15 key locations on the site consistent with DOE standards. These probability curves resulted from the systematic inclusion of contributions of uncertainty from flood sources, hydraulic modeling, and flood-frequency analyses. These products also provided a basis to develop weights for logic tree branches associated with infiltration and culvert performance scenarios to produce probabilistic inundation maps. The flood evaluation process was structured using Senior Seismic Hazard Analysis Committee processes (NRC-NUREG/CR-6372) concepts, evaluating and integrating the
Directory of Open Access Journals (Sweden)
L. Mediero
2012-12-01
Opportunities offered by high-performance computing provide significant promise for enhancing the performance of real-time flood forecasting systems. In this paper, a real-time framework for probabilistic flood forecasting through data assimilation is presented. The distributed rainfall-runoff Real-time Interactive Basin Simulator (RIBS) model is selected to simulate the hydrological process in the basin. Although the RIBS model is deterministic, it is run in a probabilistic way through the results of a calibration developed in a previous work by the authors that identifies the probability distribution functions that best characterise the most relevant model parameters. Adaptive techniques improve the result of flood forecasts because the model can be adapted to observations in real time as new information becomes available. The new adaptive forecast model, based on genetic programming as a data assimilation technique, is compared with the previously developed flood forecast model based on the calibration results. Both models are probabilistic, as they generate an ensemble of hydrographs, taking the different uncertainties inherent in any forecast process into account.
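Running a deterministic model "in a probabilistic way" by sampling calibrated parameter distributions can be sketched as follows. The toy power-law model and both parameter distributions are hypothetical placeholders; the real RIBS model is a distributed rainfall-runoff model, not a one-line function:

```python
import numpy as np

rng = np.random.default_rng(42)

def deterministic_model(rainfall_mm, k, c):
    """Toy stand-in for a deterministic rainfall-runoff model.
    k and c are hypothetical calibrated parameters."""
    return c * rainfall_mm ** k

# Parameter distributions identified by a (hypothetical) prior calibration.
n_members = 500
k = rng.normal(1.2, 0.1, n_members)
c = rng.lognormal(mean=0.0, sigma=0.3, size=n_members)

rainfall = 50.0                                # forecast rainfall depth (mm)
ensemble_peaks = deterministic_model(rainfall, k, c)

# The ensemble spread expresses parameter uncertainty in the forecast.
print("median peak:", np.median(ensemble_peaks))
print("90% band:", np.percentile(ensemble_peaks, [5, 95]))
```

Each draw of (k, c) yields one member of the hydrograph ensemble, which is why the approach is computationally intensive in real time.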
A probabilistic view on the August 2005 floods in the upper Rhine catchment
Directory of Open Access Journals (Sweden)
S. Jaun
2008-04-01
Appropriate precautions in the case of flood occurrence often require long lead times (several days) in hydrological forecasting. This in turn implies large uncertainties that are mainly inherited from the meteorological precipitation forecast. Here we present a case study of the extreme flood event of August 2005 in the Swiss part of the Rhine catchment (total area 34 550 km2). This event caused tremendous damage and was associated with precipitation amounts and flood peaks with return periods beyond 10 to 100 years. To deal with the underlying intrinsic predictability limitations, a probabilistic forecasting system is tested, which is based on a hydrological-meteorological ensemble prediction system. The meteorological component of the system is the operational limited-area COSMO-LEPS, which downscales the ECMWF ensemble prediction system to a horizontal resolution of 10 km, while the hydrological component is based on the semi-distributed hydrological model PREVAH with a spatial resolution of 500 m. We document the setup of the coupled system and assess its performance for the flood event under consideration.
We show that the probabilistic meteorological-hydrological ensemble prediction chain is quite effective and provides additional guidance for extreme event forecasting, in comparison to a purely deterministic forecasting system. For the case studied, it is also shown that most of the benefits of the probabilistic approach may be realized with a comparatively small ensemble size of 10 members.
Muis, Sanne; Güneralp, Burak; Jongman, Brenden; Aerts, Jeroen; Ward, Philip
2015-04-01
In recent years, global flood losses have been increasing due to socio-economic development and climate change, with the largest risk increases in developing countries such as Indonesia. For countries to undertake effective risk management, an accurate understanding of both current and future risk is required. However, detailed information is rarely available, particularly for developing countries. We present a first-of-its-kind country-scale analysis of flood risk using globally available data that combines a global inundation model with a land use change model and more local data on flood damages. To assess the contribution and uncertainty of different drivers of future risk, we integrate thousands of socio-economic and climate projections in a probabilistic way and include multiple adaptation strategies. Indonesia is used as a case study, as it is a country that already faces high flood risk and is undergoing rapid urbanization. We developed probabilistic and spatially-explicit urban expansion projections from 2000 to 2030 that show that the increase in urban extent ranges from 215% to 357% (5th and 95th percentile). We project rapidly rising flood risk, both for coastal and river floods. This increase is largely driven by economic growth and urban expansion (i.e. increasing exposure). Whilst sea level rise will amplify this trend, the response of river floods to climate change is uncertain, with the impact of the mean ensemble of 20 climate projections (5 GCMs and 4 RCPs) being close to zero. However, as urban expansion is the main driving force of future risk, we argue that the implementation of adaptation measures is increasingly pressing, regardless of the wide uncertainty in climate projections. Hence, we evaluated the effectiveness of two adaptation measures: spatial planning in flood-prone areas and enhanced flood protection. Both strategies have a large potential to effectively offset the increasing risk trend. The risk reduction is in the range of 22-85% and 53
Design flood estimation in ungauged basins: probabilistic extension of the design-storm concept
Berk, Mario; Špačková, Olga; Straub, Daniel
2016-04-01
Design flood estimation in ungauged basins is an important hydrological task, which in engineering practice is typically solved with the design storm concept. However, neglecting the uncertainty in the hydrological response of the catchment through the assumption of average-recurrence-interval (ARI) neutrality between rainfall and runoff can lead to flawed design flood estimates. Additionally, selecting a single critical rainfall duration neglects the contribution of other rainfall durations to the probability of extreme flood events. In this study, the design flood problem is approached with concepts from structural reliability that enable a consistent treatment of multiple uncertainties in estimating the design flood. The uncertainties of key model parameters are represented probabilistically, and the First-Order Reliability Method (FORM) is used to compute the flood exceedance probability. As an important by-product, the FORM analysis provides the most likely parameter combination to lead to a flood with a certain exceedance probability; i.e. it enables one to find representative scenarios for, e.g., a 100-year or a 1000-year flood. Possible different rainfall durations are incorporated by formulating the event of a given design flood as a series system. The method is directly applicable in practice, since the description of the rainfall depth-duration characteristics requires the same inputs as the classical design storm methods, which are commonly provided by meteorological services. The proposed methodology is applied to a case study of the Trauchgauer Ach catchment in Bavaria; SCS Curve Number (CN) and unit hydrograph models are used for modelling the hydrological process. The results indicate, in accordance with past experience, that the traditional design storm concept underestimates design floods.
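A FORM analysis of this kind can be sketched with the Hasofer-Lind-Rackwitz-Fiessler (HLRF) iteration in standard normal space. The two-variable limit state below (lognormal rainfall depth times a normal runoff factor against a fixed discharge threshold) and all its parameters are illustrative assumptions, far simpler than the paper's SCS-CN/unit-hydrograph setup:

```python
import numpy as np
from scipy.stats import norm, lognorm

# Toy limit state: flood peak q = c * r, with rainfall depth r (lognormal)
# and a runoff factor c (normal). All parameters are assumed for illustration.
r_dist = lognorm(s=0.4, scale=60.0)       # rainfall depth (mm)
c_mu, c_sigma = 2.0, 0.3                  # runoff factor (m3/s per mm)

def g(u):
    """Limit-state function in standard normal space: g <= 0 means flooding."""
    r = r_dist.ppf(norm.cdf(u[0]))        # map u1 to physical rainfall
    c = c_mu + c_sigma * u[1]             # map u2 to physical runoff factor
    return 250.0 - c * r                  # assumed design threshold 250 m3/s

def form_hlrf(g, n_dim=2, h=1e-5, n_iter=50):
    """HLRF iteration: returns the design point and the reliability index beta."""
    u = np.zeros(n_dim)
    for _ in range(n_iter):
        grad = np.array([(g(u + h * e) - g(u - h * e)) / (2 * h)
                         for e in np.eye(n_dim)])
        u = (grad @ u - g(u)) / (grad @ grad) * grad
    return u, np.linalg.norm(u)

u_star, beta = form_hlrf(g)
print("reliability index beta:", beta)
print("annual exceedance probability:", norm.cdf(-beta))
print("design-point rainfall (mm):", r_dist.ppf(norm.cdf(u_star[0])))
```

The design point u_star is exactly the "most likely parameter combination" the abstract refers to: the representative scenario for a flood at the computed exceedance probability.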
HESS Opinions "Forecaster priorities for improving probabilistic flood forecasts"
Wetterhall, F.; Pappenberger, F.; Alfieri, L.; Cloke, H. L.; Thielen-del Pozo, J.; Balabanova, S.; Daňhelka, J.; Vogelbacher, A.; Salamon, P.; Carrasco, I.; Cabrera-Tordera, A. J.; Corzo-Toscano, M.; Garcia-Padilla, M.; Garcia-Sanchez, R. J.; Ardilouze, C.; Jurela, S.; Terek, B.; Csik, A.; Casey, J.; Stankūnavičius, G.; Ceres, V.; Sprokkereef, E.; Stam, J.; Anghel, E.; Vladikovic, D.; Alionte Eklund, C.; Hjerdt, N.; Djerv, H.; Holmberg, F.; Nilsson, J.; Nyström, K.; Sušnik, M.; Hazlinger, M.; Holubecka, M.
2013-11-01
Hydrological ensemble prediction systems (HEPS) have in recent years been increasingly used for the operational forecasting of floods by European hydrometeorological agencies. The most obvious advantage of HEPS is that more of the uncertainty in the modelling system can be assessed. In addition, ensemble prediction systems generally have better skill than deterministic systems, both in terms of mean forecast performance and in the potential forecasting of extreme events. Research efforts have so far mostly been devoted to the improvement of the physical and technical aspects of the model systems, such as increased resolution in time and space and better description of physical processes. Developments like these are certainly needed; however, in this paper we argue that there are other areas of HEPS that need urgent attention. This was also the outcome of a group exercise and a survey conducted among operational forecasters within the European Flood Awareness System (EFAS) to identify the top priorities for improving their own system. The priorities turned out to span a range of areas, the most popular being the inclusion of a verification of past forecast performance, a multi-model approach for hydrological modelling, increased forecast skill in the medium range (>3 days) and more focus on education and training on the interpretation of forecasts. In light of limited resources, we suggest a simple model to classify the identified priorities in terms of their cost and complexity, to decide in which order to tackle them. This model is then used to create an action plan of short-, medium- and long-term research priorities, with the ultimate goal of an optimal improvement of EFAS in particular and spurring the development of operational HEPS in general.
Probabilistic, Multivariable Flood Loss Modeling on the Mesoscale with BT-FLEMO.
Kreibich, Heidi; Botto, Anna; Merz, Bruno; Schröter, Kai
2017-04-01
Flood loss modeling is an important component for risk analyses and decision support in flood risk management. Commonly, flood loss models describe complex damaging processes by simple, deterministic approaches like depth-damage functions and are associated with large uncertainty. To improve flood loss estimation and to provide quantitative information about the uncertainty associated with loss modeling, a probabilistic, multivariable Bagging decision Tree Flood Loss Estimation MOdel (BT-FLEMO) for residential buildings was developed. The application of BT-FLEMO provides a probability distribution of estimated losses to residential buildings per municipality. BT-FLEMO was applied and validated at the mesoscale in 19 municipalities that were affected during the 2002 flood by the River Mulde in Saxony, Germany. Validation was undertaken on the one hand via a comparison with six deterministic loss models, including both depth-damage functions and multivariable models. On the other hand, the results were compared with official loss data. BT-FLEMO outperforms deterministic, univariable, and multivariable models with regard to model accuracy, although the prediction uncertainty remains high. An important advantage of BT-FLEMO is the quantification of prediction uncertainty. The probability distribution of loss estimates by BT-FLEMO well represents the variation range of loss estimates of the other models in the case study. © 2016 Society for Risk Analysis.
Quinn, Niall; Freer, Jim; Coxon, Gemma; Dunne, Toby; Neal, Jeff; Bates, Paul; Sampson, Chris; Smith, Andy; Parkin, Geoff
2017-04-01
Computationally efficient flood inundation modelling systems capable of representing important hydrological and hydrodynamic flood-generating processes over relatively large regions are vital for those interested in flood preparation, response, and real-time forecasting. However, such systems are currently not readily available. This can be particularly important where flood predictions from intense rainfall are considered, as the processes leading to flooding often involve localised, non-linear, spatially connected hillslope-catchment responses. Therefore, this research introduces a novel hydrological-hydraulic modelling framework for the provision of probabilistic flood inundation predictions across catchment to regional scales that explicitly accounts for spatial variability in rainfall-runoff and routing processes. Approaches have been developed to automate the provision of required input datasets and estimate essential catchment characteristics from freely available, national datasets. This is an essential component of the framework, as when making predictions over multiple catchments or at relatively large scales, where data are often scarce, obtaining local information and manually incorporating it into the model quickly becomes infeasible. An extreme flooding event in the town of Morpeth, NE England, in 2008 was used as a first case-study evaluation of the modelling framework introduced. The results demonstrated a high degree of prediction accuracy when comparing modelled and reconstructed characteristics of the event, while the efficiency of the modelling approach enabled the generation of relatively large ensembles of realisations from which uncertainty within the prediction may be represented. This research supports previous literature highlighting the importance of probabilistic forecasting, particularly during extreme events, which can often be poorly characterised or even missed by deterministic predictions due to the inherent
Probabilistic approach to estimating the effects of channel reaches on flood frequencies
Guo, Yiping; Hansen, David; Li, Chuan
2009-08-01
A host of physical parameters and characteristics of catchments and channel reaches are normally needed in watershed planning and stormwater management studies. Some of these are also design variables, such as channel cross-section size, shape, roughness, and (to a lesser extent) bed slope. Conventional channel routing techniques employ continuity and some form of the momentum equation to determine the downstream impacts of individual flood events. With the introduction of the concept of storage-induced delay time, a probabilistic approach is developed wherein the effect of a given channel reach on the frequency distribution of floods from the upstream catchment can be directly determined. The approach uses the same kinds of channel-reach parameters as are typically used by many conventional flood routing algorithms. Its physically based nature makes it suitable for watershed planning and stormwater management studies wherein little or no flow data are available for parameter estimation or flow frequency analysis. The validity of this probabilistic approach is demonstrated by comparing its outcomes with the results of a suite of conventional continuous simulations using rainfall data from Halifax, Canada.
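The effect of a reach on the downstream flood frequency distribution can be illustrated with a crude Monte Carlo derived-distribution sketch. The Gumbel inflow parameters and the one-line attenuation rule below are invented stand-ins for the paper's storage-delay-time formulation, not its actual equations:

```python
import numpy as np

rng = np.random.default_rng(1)

# Upstream flood peaks: Gumbel-distributed annual maxima (assumed parameters).
loc, scale = 80.0, 25.0                    # m3/s
n_years = 100_000
u = rng.uniform(1e-12, 1.0, n_years)
q_in = loc - scale * np.log(-np.log(u))

def route(q_in, delay_h, duration_h=6.0):
    """Toy storage-induced attenuation: a longer delay time damps the peak
    more strongly. A stand-in for the paper's concept, not its equations."""
    return q_in * duration_h / (duration_h + delay_h)

q_out = route(q_in, delay_h=2.0)

# Compare flood quantiles upstream and downstream of the reach.
for T in (2, 10, 100):
    p = 1.0 - 1.0 / T
    print(f"T={T:>3} yr: inflow {np.quantile(q_in, p):7.1f}, "
          f"outflow {np.quantile(q_out, p):7.1f} m3/s")
```

Because the toy attenuation is a monotone transform, the whole inflow frequency curve maps directly onto an outflow frequency curve, which is the kind of direct determination the probabilistic approach provides analytically.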
Performance Evaluation with Different Mobility Models for Dynamic Probabilistic Flooding in MANETs
Directory of Open Access Journals (Sweden)
Abdalla M. Hanashi
2009-01-01
Broadcasting is an essential and effective data propagation mechanism, with several important applications such as route discovery, address resolution, and many other network services. While broadcasting has many advantages, it also causes a lot of contention, collision, and congestion, which induces what is known as the "broadcast storm problem". Broadcasting has traditionally been based on the flooding protocol, which simply floods the network with a high number of rebroadcast messages until the messages reach all network nodes. A good probabilistic broadcasting protocol can achieve a higher saved-rebroadcast ratio, fewer collisions and a smaller number of relays. In this paper, we propose a dynamic probabilistic approach that dynamically fine-tunes the rebroadcasting probability for route request packets (RREQs) according to the number of neighbouring nodes distributed in the ad hoc network. The performance of the proposed approach is investigated and compared with the simple AODV and fixed probabilistic schemes using the GloMoSim network simulator under different mobility models. The performance results reveal that the improved approach is able to achieve a higher saved-rebroadcast ratio and fewer collisions, as well as a smaller number of relays, than the fixed probabilistic scheme and simple AODV.
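The density-adaptive idea, rebroadcast with high probability in sparse neighbourhoods and low probability in dense ones, can be sketched as below. The specific tuning rule, the average-density constant and the clamping bounds are illustrative assumptions, not the formula from the paper:

```python
import random

def rebroadcast_probability(n_neighbours, avg_neighbours=8.0,
                            p_min=0.2, p_max=1.0):
    """Dynamically tune the rebroadcast probability to local node density:
    sparse neighbourhoods rebroadcast almost surely, dense ones rarely.
    The exact tuning rule here is illustrative."""
    if n_neighbours <= 0:
        return p_max                       # isolated node: always rebroadcast
    p = avg_neighbours / n_neighbours      # inverse proportionality to density
    return max(p_min, min(p_max, p))       # clamp into [p_min, p_max]

def should_rebroadcast(n_neighbours, rng=random):
    """Bernoulli decision for a received RREQ."""
    return rng.random() < rebroadcast_probability(n_neighbours)

for k in (2, 8, 32):
    print(f"{k:>2} neighbours -> p = {rebroadcast_probability(k):.2f}")
```

In dense regions most nodes suppress their rebroadcast (raising the saved-rebroadcast ratio), while in sparse regions p stays near 1 so the broadcast still reaches the whole network.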
Mebarki, A.; Valencia, N.; Salagnac, J. L.; Barroca, B.
2012-05-01
This paper deals with the failure risk of masonry constructions under the effect of floods. It is developed within a probabilistic framework, with loads and resistances considered as random variables. Two complementary approaches have been investigated for this purpose: - a global approach based on combined effects of several governing parameters with individual weighted contribution (material quality and geometry, presence and distance between columns, beams, openings, resistance of the soil and its slope. . .), - and a reliability method using the failure mechanism of masonry walls standing out-plane pressure. The evolution of the probability of failure of masonry constructions according to the flood water level is analysed. The analysis of different failure probability scenarios for masonry walls is conducted to calibrate the influence of each "vulnerability governing parameter" in the global approach that is widely used in risk assessment at the urban or regional scale. The global methodology is implemented in a GIS that provides the spatial distribution of damage risk for different flood scenarios. A real case is considered for the simulations, i.e. Cheffes sur Sarthe (France), for which the observed river discharge, the hydraulic load according to the Digital Terrain Model, and the structural resistance are considered as random variables. The damage probability values provided by both approaches are compared. Discussions are also developed about reduction and mitigation of the flood disaster at various scales (set of structures, city, region) as well as resilience.
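The load-versus-resistance reliability formulation in the abstract below can be sketched as a Monte Carlo estimate of the failure probability of a wall under hydrostatic pressure. Every distribution parameter and the resultant-force expression are illustrative assumptions, not values from the study:

```python
import numpy as np

rng = np.random.default_rng(7)

# Out-of-plane wall failure when the hydrostatic load exceeds the resistance.
# All distribution parameters below are illustrative assumptions.
n = 200_000
water_level = rng.gumbel(loc=1.0, scale=0.4, size=n)           # m above floor
# Resultant of the triangular hydrostatic pressure distribution (N per m of wall):
load = 0.5 * 1000 * 9.81 * np.clip(water_level, 0, None) ** 2
resistance = rng.lognormal(mean=np.log(25_000), sigma=0.3, size=n)  # N/m

p_fail = np.mean(load > resistance)
print(f"estimated annual failure probability: {p_fail:.4f}")

# Evolution of the failure probability with the flood water level:
for h in (0.5, 1.0, 1.5, 2.0, 2.5):
    pf_h = np.mean(0.5 * 1000 * 9.81 * h**2 > resistance)
    print(f"h = {h:.1f} m -> P(failure) = {pf_h:.3f}")
```

The second loop reproduces, in miniature, the paper's analysis of how the failure probability of a wall evolves with the flood water level.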
Probabilistic assessment of erosion and flooding risk in the northern Gulf of Mexico
Plant, Nathaniel G.; Wahl, Thomas; Long, Joseph W.
2016-01-01
We assess erosion and flooding risk in the northern Gulf of Mexico by identifying interdependencies among oceanographic drivers and probabilistically modeling the resulting potential for coastal change. Wave and water level observations are used to determine relationships between six hydrodynamic parameters that influence total water level and therefore erosion and flooding, through consideration of a wide range of univariate distribution functions and multivariate elliptical copulas. Using these relationships, we explore how different our interpretation of the present-day erosion/flooding risk could be if we had seen more or fewer extreme realizations of individual and combinations of parameters in the past by simulating 10,000 physically and statistically consistent sea-storm time series. We find that seasonal total water levels associated with the 100 year return period could be up to 3 m higher in summer and 0.6 m higher in winter relative to our best estimate based on the observational records. Impact hours of collision and overwash—where total water levels exceed the dune toe or dune crest elevations—could be on average 70% (collision) and 100% (overwash) larger than inferred from the observations. Our model accounts for non-stationarity in a straightforward, non-parametric way that can be applied (with minor adjustments) to many other coastlines. The probabilistic model presented here, which accounts for observational uncertainty, can be applied to other coastlines where short record lengths limit the ability to identify the full range of possible wave and water level conditions that coastal managers and planners must consider to develop sustainable management strategies.
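Simulating statistically consistent sea-storm series with an elliptical copula can be sketched for two of the hydrodynamic parameters: sample correlated Gaussians, map them to the unit square, then apply the marginal inverse CDFs. The correlation coefficient and both marginal distributions below are assumed values, not those fitted to the Gulf of Mexico observations:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(3)

# Gaussian (elliptical) copula linking wave height and surge level.
# The correlation and the marginals are assumed for illustration.
rho = 0.7
cov = np.array([[1.0, rho], [rho, 1.0]])
z = rng.multivariate_normal(mean=[0, 0], cov=cov, size=50_000)
u = norm.cdf(z)                          # copula samples in (0, 1)^2

# Transform to physical marginals (Gumbel wave height, lognormal surge).
wave_height = 2.0 - 0.8 * np.log(-np.log(u[:, 0]))        # m
surge = np.exp(np.log(0.5) + 0.5 * norm.ppf(u[:, 1]))     # m

total_proxy = wave_height + surge        # crude total-water-level proxy
print("99.9th percentile of combined proxy:", np.quantile(total_proxy, 0.999))
```

Because the copula preserves the dependence between the drivers, joint extremes (high waves coinciding with high surge) occur at realistic rates, which is what independent sampling of the marginals would miss.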
Muis, Sanne; Güneralp, Burak; Jongman, Brenden; Aerts, Jeroen C J H; Ward, Philip J
2015-12-15
An accurate understanding of flood risk and its drivers is crucial for effective risk management. Detailed risk projections, including uncertainties, are however rarely available, particularly in developing countries. This paper presents a method that integrates recent advances in global-scale modeling of flood hazard and land change, which enables the probabilistic analysis of future trends in national-scale flood risk. We demonstrate its application to Indonesia. We develop 1000 spatially-explicit projections of urban expansion from 2000 to 2030 that account for uncertainty associated with population and economic growth projections, as well as uncertainty in where urban land change may occur. The projections show that the urban extent increases by 215%-357% (5th and 95th percentiles). Urban expansion is particularly rapid on Java, which accounts for 79% of the national increase. From 2000 to 2030, increases in exposure will elevate flood risk by, on average, 76% and 120% for river and coastal floods. While sea level rise will further increase the exposure-induced trend by 19%-37%, the response of river floods to climate change is highly uncertain. However, as urban expansion is the main driver of future risk, the implementation of adaptation measures is increasingly urgent, regardless of the wide uncertainty in climate projections. Using probabilistic urban projections, we show that spatial planning can be a very effective adaptation strategy. Our study emphasizes that global data can be used successfully for probabilistic risk assessment in data-scarce countries.
Directory of Open Access Journals (Sweden)
J. C. Bartholmes
2009-02-01
Full Text Available Since 2005 the European Flood Alert System (EFAS) has been producing probabilistic hydrological forecasts in pre-operational mode at the Joint Research Centre (JRC) of the European Commission. EFAS aims at increasing preparedness for floods in trans-national European river basins by providing medium-range deterministic and probabilistic flood forecasting information, from 3 to 10 days in advance, to national hydro-meteorological services.
This paper is Part 2 of a study presenting the development and skill assessment of EFAS. In Part 1, the scientific approach adopted in the development of the system has been presented, as well as its basic principles and forecast products. In the present article, two years of existing operational EFAS forecasts are statistically assessed and the skill of EFAS forecasts is analysed with several skill scores. The analysis is based on the comparison of threshold exceedances between proxy-observed and forecasted discharges. Skill is assessed both with and without taking into account the persistence of the forecasted signal during consecutive forecasts.
Skill assessment approaches are mostly adopted from meteorology and the analysis also compares probabilistic and deterministic aspects of EFAS. Furthermore, the utility of different skill scores is discussed and their strengths and shortcomings illustrated. The analysis shows the benefit of incorporating past forecasts in the probability analysis, for medium-range forecasts, which effectively increases the skill of the forecasts.
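One widely used score of the kind adopted from meteorology for threshold-exceedance forecasts is the Brier score and its associated skill score; the sketch below (with invented numbers, not EFAS data) shows how such a verification is computed against binary proxy-observed exceedances.

```python
import numpy as np

def brier_score(p_forecast, observed):
    """Mean squared difference between forecast exceedance
    probabilities and binary observed exceedances (0/1)."""
    p = np.asarray(p_forecast, dtype=float)
    o = np.asarray(observed, dtype=float)
    return np.mean((p - o) ** 2)

def brier_skill_score(p_forecast, observed):
    """Skill relative to a climatological reference forecast
    (the observed base rate issued every time)."""
    o = np.asarray(observed, dtype=float)
    bs = brier_score(p_forecast, o)
    bs_ref = brier_score(np.full_like(o, o.mean()), o)
    return 1.0 - bs / bs_ref

# Forecast probabilities that discharge exceeds a flood alert
# threshold, against proxy-observed exceedances (illustrative).
p = [0.9, 0.7, 0.2, 0.1, 0.8, 0.05]
obs = [1, 1, 0, 0, 1, 0]
print(brier_score(p, obs))        # 0 is perfect
print(brier_skill_score(p, obs))  # 1 is perfect, <= 0 means no skill
```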
DEFF Research Database (Denmark)
Mirzapour, S. A.; Wong, K. Y.; Govindan, K.
2013-01-01
Potential consequences of flood disasters, including severe loss of life and property, induce emergency managers to find the appropriate locations of relief rooms to evacuate people from the origin points to a safe place in order to lessen the possible impact of flood disasters. In this research, a p-center location problem is considered in order to determine the locations of some relief rooms in a city and their corresponding allocation clusters. This study presents a mixed integer nonlinear programming model of a capacitated facility location-allocation problem which simultaneously considers the probabilistic distribution of demand locations and a fixed line barrier in a region. The proposed model aims at minimizing the maximum expected weighted distance from the relief rooms to all the demand regions in order to decrease the evacuation time of people from the affected areas before flood occurrence.
Willingness-to-pay for a probabilistic flood forecast: a risk-based decision-making game
Arnal, Louise; Ramos, Maria-Helena; Coughlan de Perez, Erin; Cloke, Hannah Louise; Stephens, Elisabeth; Wetterhall, Fredrik; van Andel, Schalk Jan; Pappenberger, Florian
2016-08-01
Probabilistic hydro-meteorological forecasts have over the last decades been used more frequently to communicate forecast uncertainty. This uncertainty is twofold, as it constitutes both an added value and a challenge for the forecaster and the user of the forecasts. Many authors have demonstrated the added (economic) value of probabilistic over deterministic forecasts across the water sector (e.g. flood protection, hydroelectric power management and navigation). However, the richness of the information is also a source of challenges for operational uses, due partially to the difficulty in transforming the probability of occurrence of an event into a binary decision. This paper presents the results of a risk-based decision-making game on the topic of flood protection mitigation, called "How much are you prepared to pay for a forecast?". The game was played at several workshops in 2015, which were attended by operational forecasters and academics working in the field of hydro-meteorology. The aim of this game was to better understand the role of probabilistic forecasts in decision-making processes and their perceived value by decision-makers. Based on the participants' willingness-to-pay for a forecast, the results of the game show that the value (or the usefulness) of a forecast depends on several factors, including the way users perceive the quality of their forecasts and link it to the perception of their own performances as decision-makers.
Mediero, L.; Garrote, L.; Requena, A.; Chávez, A.
2012-04-01
Flood events are among the natural disasters that cause the most economic and social damage in Europe. Information and Communication Technology (ICT) developments in recent years have made hydrometeorological observations available in real time. High performance computing promises to improve real-time flood forecasting systems and eases the use of post-processing techniques. This is the case for data assimilation techniques, which are used to develop an adaptive forecast model. In this paper, a real-time framework for probabilistic flood forecasting is presented and two data assimilation techniques are compared. The first data assimilation technique uses genetic programming to adapt the model to the observations as new information becomes available, updating the estimation of the probability distribution of the model parameters. The second data assimilation technique uses an ensemble Kalman filter to quantify errors in both the hydrologic model and the observations, updating estimates of system states. Both forecast models take the result of the hydrologic model calibration as a starting point and adapt the individuals of this first population to the new observations in each operation time step. Data assimilation techniques have great potential when they are used in distributed hydrological models. The distributed RIBS (Real-time Interactive Basin Simulator) rainfall-runoff model was selected to simulate the hydrological process in the basin. The RIBS model is deterministic, but it is run in a probabilistic way through Monte Carlo simulations over the probability distribution functions that best characterise the most relevant model parameters, which were identified by a probabilistic multi-objective calibration developed in a previous work. The Manzanares River basin was selected as a case study. Data assimilation processes are computationally intensive. Therefore, they are well suited to test the applicability of the potential of the Grid technology to
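The ensemble Kalman filter analysis step mentioned above has a compact generic form. The sketch below is a minimal stochastic (perturbed-observation) EnKF update on a toy two-state system; the state names, dimensions, and error covariances are assumptions for illustration, not the RIBS configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

def enkf_update(X, y, H, R):
    """One ensemble Kalman filter analysis step.
    X : (n_state, n_ens) ensemble of model states
    y : (n_obs,) observation vector
    H : (n_obs, n_state) linear observation operator
    R : (n_obs, n_obs) observation error covariance
    """
    n_ens = X.shape[1]
    A = X - X.mean(axis=1, keepdims=True)          # ensemble anomalies
    P = A @ A.T / (n_ens - 1)                      # sample covariance
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)   # Kalman gain
    # Perturbed-observation update: each member assimilates a noisy
    # replicate of the observation, keeping ensemble spread consistent.
    Y = y[:, None] + rng.multivariate_normal(
        np.zeros(len(y)), R, size=n_ens).T
    return X + K @ (Y - H @ X)

# Toy example: two states (storage, discharge), discharge observed.
X = rng.normal([[10.0], [5.0]], 2.0, size=(2, 100))
H = np.array([[0.0, 1.0]])
R = np.array([[0.1]])
Xa = enkf_update(X, np.array([7.0]), H, R)
print(Xa[1].mean())   # discharge mean pulled toward the observation 7.0
```

Because the gain is built from the ensemble covariance, observed variables also correct unobserved but correlated states, which is what makes the approach attractive for distributed hydrological models.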
A Probabilistic Framework for Risk Analysis of Widespread Flood Events: A Proof-of-Concept Study.
Schneeberger, Klaus; Huttenlau, Matthias; Winter, Benjamin; Steinberger, Thomas; Achleitner, Stefan; Stötter, Johann
2017-07-27
This article presents a flood risk analysis model that considers the spatially heterogeneous nature of flood events. The basic concept of this approach is to generate a large sample of flood events that can be regarded as temporal extrapolation of flood events. These are combined with cumulative flood impact indicators, such as building damages, to finally derive time series of damages for risk estimation. Therefore, a multivariate modeling procedure that is able to take into account the spatial characteristics of flooding, the regionalization method top-kriging, and three different impact indicators are combined in a model chain. Eventually, the expected annual flood impact (e.g., expected annual damages) and the flood impact associated with a low probability of occurrence are determined for a study area. The risk model has the potential to augment the understanding of flood risk in a region and thereby contribute to enhanced risk management of, for example, risk analysts and policymakers or insurance companies. The modeling framework was successfully applied in a proof-of-concept exercise in Vorarlberg (Austria). The results of the case study show that risk analysis has to be based on spatially heterogeneous flood events in order to estimate flood risk adequately. © 2017 Society for Risk Analysis.
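The final risk-estimation step described above reduces to simple statistics once a long synthetic damage series exists: the expected annual impact is the mean of the annual damages, and the impact with a low occurrence probability is a high quantile. The sketch below uses an invented synthetic series (Poisson event counts with lognormal per-event damages), not the Vorarlberg data.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic annual damage series (illustrative): most years see little
# damage, a few see large, spatially widespread events.
n_years = 10_000
n_events = rng.poisson(0.4, size=n_years)          # events per year
annual_damage = np.array([
    rng.lognormal(mean=2.0, sigma=1.5, size=k).sum() for k in n_events
])

# Expected annual damage, and the damage exceeded with 1% probability
# per year (a low-probability, high-impact indicator).
ead = annual_damage.mean()
d100 = np.quantile(annual_damage, 0.99)
print(ead, d100)   # the 1%-exceedance damage far exceeds the expectation
```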
Xu, X.
2016-12-01
The uncertainty associated with inflow boundary forcing data has been recognized as an important and dominant source of uncertainties in hydraulic model. Here, we develop a real-time probabilistic channel flood forecasting model with a novel function to incorporate the uncertainty of forcing inflow. This new approach couples a hydraulic model with the Sequential Monte Carlo (SMC), Particle filter (PF), data assimilation algorithm. Stage observations at hydrological stations along the channel are assimilated at each time step to update the model states in order to improve next time step's forecasting. We test this new approach for a real flood event occurred during June 27, 2009 and July 9, 2009 in the river reach from upstream Cuntan station to downstream Zhongxian station of the Middle Yangtze River, China. As compared with open loop model simulations, model evaluations with several quantitative deterministic and probabilistic metrics indicate that accuracy of the ensemble mean prediction and reliability of the uncertainty quantification are improved pronouncedly as a result of the PF assimilation. Further assessment of forecasting performance at different lead times shows that the degree of model improvement weakens with the increase of lead time due to the gradual diminishing of updating effect on initial conditions. The examination of different number of particles shows that the optimal number of particles can be chosen as a tradeoff between model performance and computation burden. The analysis of different assimilation frequency indicates that higher assimilation frequency can help improve the model performance by incorporating more observation information and updating model states to better represent instantaneous flood conditions.
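A bootstrap particle filter step of the kind used above can be sketched generically: propagate the particles, reweight them by the likelihood of the stage observation, then resample. The toy stage model, observation error, and numbers below are assumptions for illustration, not the Yangtze configuration.

```python
import numpy as np

rng = np.random.default_rng(1)

def pf_step(particles, weights, observation, obs_std, propagate):
    """One SMC / particle filter assimilation step: propagate,
    reweight by the observation likelihood, then resample."""
    particles = propagate(particles)
    # Gaussian likelihood of the stage observation given each particle.
    lik = np.exp(-0.5 * ((observation - particles) / obs_std) ** 2)
    weights = weights * lik
    weights /= weights.sum()
    # Systematic resampling to avoid weight degeneracy.
    n = len(particles)
    positions = (rng.random() + np.arange(n)) / n
    idx = np.searchsorted(np.cumsum(weights), positions)
    idx = np.minimum(idx, n - 1)   # guard against float round-off
    return particles[idx], np.full(n, 1.0 / n)

# Toy channel-stage model: slow recession plus process noise.
def propagate(x):
    return 0.95 * x + rng.normal(0.0, 0.05, size=x.shape)

particles = rng.normal(10.0, 1.0, size=500)
weights = np.full(500, 1.0 / 500)
for obs in [9.6, 9.2, 8.7]:
    particles, weights = pf_step(particles, weights, obs, 0.2, propagate)
print(particles.mean())   # ensemble mean tracks the observed stages
```

Increasing the particle count improves the state estimate at the cost of computation, which is the performance/burden trade-off the study examines.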
Cole, Steven J.; Moore, Robert J.; Robson, Alice J.; Mattingley, Paul S.
2014-05-01
prediction (NWP) model can provide realistic looking rainfall forecasts, significant uncertainties remain in timing, location and whether a particular feature develops or not. Generally, the smaller the scale of the rainfall feature, the shorter the lead-time at which these uncertainties become important. Therefore ensembles are needed to provide uncertainty context for longer lead-time G2G flow forecasts, particularly for small-scale RRCs. A systematic assessment framework has been developed for exploring and understanding the utility of G2G flood forecasts for RRCs. Firstly, perfect knowledge of rainfall observations is assumed for past and future times, so as not to confound the hydrological model analysis with errors from rainfall forecasts. Secondly, an assessment is made of using deterministic rainfall forecasts (from NWP UKV) in a full emulation of real-time G2G forecasts, and using foreknowledge of rainfall observations as a reference baseline. Finally, use of rainfall forecast ensembles with G2G to produce probabilistic flood forecasts is considered, employing a combination of case-study and longer-term analyses. Blended Ensemble rainfall forecasts (combining radar ensemble nowcasts and NWP rainfalls) are assessed in two forms: forecasts out to 24 hours updated 4 times a day, and nowcasts out to 7 hours updated every 15 minutes. Results from the assessment will be presented along with candidates for new operational products and tools that can support flood warning for RRCs, taking account of the inherent uncertainty in the forecasts.
Directory of Open Access Journals (Sweden)
Seyed Ali Mirzapour
2013-01-01
Full Text Available Potential consequences of flood disasters, including severe loss of life and property, induce emergency managers to find the appropriate locations of relief rooms to evacuate people from the origin points to a safe place in order to lessen the possible impact of flood disasters. In this research, a p-center location problem is considered in order to determine the locations of some relief rooms in a city and their corresponding allocation clusters. This study presents a mixed integer nonlinear programming model of a capacitated facility location-allocation problem which simultaneously considers the probabilistic distribution of demand locations and a fixed line barrier in a region. The proposed model aims at minimizing the maximum expected weighted distance from the relief rooms to all the demand regions in order to decrease the evacuation time of people from the affected areas before flood occurrence. A real-world case study has been carried out to examine the effectiveness and applicability of the proposed model.
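The p-center objective described above, minimizing the maximum expected weighted distance, can be brute-forced on a toy instance to make the idea concrete. The sketch below ignores the capacity constraints and the fixed line barrier of the full MINLP, and all coordinates and weights are invented.

```python
import itertools
import numpy as np

# Toy instance (illustrative): expected demand-region locations with
# weights (e.g. population), and candidate relief-room sites.
demand = np.array([[1.0, 1.0], [4.0, 2.0], [2.0, 5.0], [6.0, 6.0]])
weight = np.array([3.0, 1.0, 2.0, 2.0])
candidates = np.array([[1.5, 2.0], [4.0, 4.0], [5.5, 5.0], [2.5, 3.5]])

def max_weighted_distance(sites):
    """p-center objective: each demand region is served by its
    nearest open site; return the worst weighted distance."""
    d = np.linalg.norm(demand[:, None, :] - sites[None, :, :], axis=2)
    return (weight * d.min(axis=1)).max()

# Brute-force the best pair of open sites (p = 2).
best = min(itertools.combinations(range(len(candidates)), 2),
           key=lambda c: max_weighted_distance(candidates[list(c)]))
print(best, max_weighted_distance(candidates[list(best)]))
```

Real instances replace enumeration with a mathematical-programming solver, but the objective evaluated per candidate set is exactly this minimax quantity.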
Directory of Open Access Journals (Sweden)
J. Dietrich
2008-03-01
Full Text Available Flood forecasts are essential to issue reliable flood warnings and to initiate flood control measures on time. The accuracy and the lead time of the predictions for head waters primarily depend on the meteorological forecasts. Ensemble forecasts are a means of framing the uncertainty of the potential future development of the hydro-meteorological situation.
This contribution presents a flood management strategy based on probabilistic hydrological forecasts driven by operational meteorological ensemble prediction systems. The meteorological ensemble forecasts are transformed into discharge ensemble forecasts by a rainfall-runoff model. Exceedance probabilities for critical discharge values and probabilistic maps of inundation areas can be computed and presented to decision makers. These results can support decision makers in issuing flood alerts. The flood management system integrates ensemble forecasts with different spatial resolution and different lead times. The hydrological models are controlled in an adaptive way, mainly depending on the lead time of the forecast, the expected magnitude of the flood event and the availability of measured data.
The aforementioned flood forecast techniques have been applied to a case study. The Mulde River Basin (South-Eastern Germany, Czech Republic has often been affected by severe flood events including local flash floods. Hindcasts for the large scale extreme flood in August 2002 have been computed using meteorological predictions from both the COSMO-LEPS ensemble prediction system and the deterministic COSMO-DE local model. The temporal evolution of a the meteorological forecast uncertainty and b the probability of exceeding flood alert levels is discussed. Results from the hindcast simulations demonstrate, that the systems would have predicted a high probability of an extreme flood event, if they would already have been operational in 2002. COSMO-LEPS showed a reasonably good
Willingness-to-pay for a probabilistic flood forecast: a risk-based decision-making game
Arnal, Louise; Ramos, Maria-Helena; Coughlan, Erin; Cloke, Hannah L.; Stephens, Elisabeth; Wetterhall, Fredrik; van Andel, Schalk-Jan; Pappenberger, Florian
2016-04-01
Forecast uncertainty is a twofold issue, as it constitutes both an added value and a challenge for the forecaster and the user of the forecasts. Many authors have demonstrated the added (economic) value of probabilistic forecasts over deterministic forecasts for a diversity of activities in the water sector (e.g. flood protection, hydroelectric power management and navigation). However, the richness of the information is also a source of challenges for operational uses, due partially to the difficulty of transforming the probability of occurrence of an event into a binary decision. The setup and the results of a risk-based decision-making experiment, designed as a game on the topic of flood protection mitigation, called ``How much are you prepared to pay for a forecast?'', will be presented. The game was played at several workshops in 2015, including during this session at the EGU conference in 2015, and a total of 129 worksheets were collected and analysed. The aim of this experiment was to contribute to the understanding of the role of probabilistic forecasts in decision-making processes and their perceived value by decision-makers. Based on the participants' willingness-to-pay for a forecast, the results of the game showed that the value (or the usefulness) of a forecast depends on several factors, including the way users perceive the quality of their forecasts and link it to the perception of their own performances as decision-makers. Balancing avoided costs and the cost (or the benefit) of having forecasts available for making decisions is not straightforward, even in a simplified game situation, and is a topic that deserves more attention from the hydrological forecasting community in the future.
INSYDE: a synthetic, probabilistic flood damage model based on explicit cost analysis
Dottori, Francesco; Figueiredo, Rui; Martina, Mario L. V.; Molinari, Daniela; Scorzini, Anna Rita
2016-12-01
Methodologies to estimate economic flood damages are increasingly important for flood risk assessment and management. In this work, we present a new synthetic flood damage model based on a component-by-component analysis of physical damage to buildings. The damage functions are designed using an expert-based approach with the support of existing scientific and technical literature, loss adjustment studies, and damage surveys carried out for past flood events in Italy. The model structure is designed to be transparent and flexible, and therefore it can be applied in different geographical contexts and adapted to the actual knowledge of hazard and vulnerability variables. The model has been tested in a recent flood event in northern Italy. Validation results provided good estimates of post-event damages, with similar or superior performances when compared with other damage models available in the literature. In addition, a local sensitivity analysis was performed in order to identify the hazard variables that have more influence on damage assessment results.
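The component-by-component structure described above can be sketched generically: each building component gets its own damage function of the hazard variables, and total damage is the sum of component damages. The component list, replacement costs, and curves below are invented for the sketch; INSYDE's expert-derived functions depend on many more hazard and vulnerability variables than water depth alone.

```python
# Illustrative per-component damage curves: damage fraction as a
# function of water depth in metres (invented, not INSYDE's functions).
def frac_finishes(h):   # plaster, paint, flooring
    return min(1.0, 0.8 * h)

def frac_systems(h):    # electrical / heating systems
    return 0.0 if h < 0.2 else min(1.0, 0.5 + 0.3 * h)

def frac_structure(h):  # load-bearing elements, capped at 30%
    return min(0.3, 0.05 * h)

components = {
    "finishes": (12_000.0, frac_finishes),   # (replacement cost, curve)
    "systems": (8_000.0, frac_systems),
    "structure": (60_000.0, frac_structure),
}

def building_damage(depth_m):
    """Total damage as the sum of per-component damages."""
    return sum(cost * curve(depth_m) for cost, curve in components.values())

print(building_damage(0.5))  # -> 11500.0 with these illustrative curves
```

Keeping the components explicit is what makes such a model transparent and adaptable: individual curves or costs can be swapped to match local construction practice without refitting the whole model.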
Floods are common in the United States. Weather such as heavy rain, thunderstorms, hurricanes, or tsunamis can ... is breached, or when a dam breaks. Flash floods, which can develop quickly, often have a dangerous ...
U.S. Geological Survey, Department of the Interior — PROBZONES is a generalized polygon layer outlining areas in the Seaside-Gearhart, Oregon, area subject to the 100-year and 500-year flood as determined by...
Douglas, E. M.; Kirshen, P. H.; Bosma, K.; Watson, C.; Miller, S.; McArthur, K.
2015-12-01
There now exists a plethora of information attesting to the reality of our changing climate and its impacts on both human and natural systems. There also exists a growing literature linking climate change impacts and transportation infrastructure (highways, bridges, tunnels, railway, shipping ports, etc.) which largely agrees that the nation's transportation systems are vulnerable. To assess this vulnerability along the coast, flooding due to sea level rise and storm surge has most commonly been evaluated by simply increasing the water surface elevation and then estimating flood depth by comparing the new water surface elevation with the topographic elevations of the land surface. While this rudimentary "bathtub" approach may provide a first order identification of potential areas of vulnerability, accurate assessment requires a high resolution, physically-based hydrodynamic model that can simulate inundation due to the combined effects of sea level rise, storm surge, tides and wave action for site-specific locations. Furthermore, neither the "bathtub" approach nor other scenario-based approaches can quantify the probability of flooding due to these impacts. We developed a high resolution coupled ocean circulation-wave model (ADCIRC/SWAN) that utilizes a Monte Carlo approach for predicting the depths and associated exceedance probabilities of flooding due to both tropical (hurricanes) and extra-tropical storms under current and future climate conditions. This required the development of an entirely new database of meteorological forcing (e.g. pressure, wind speed, etc.) for historical Nor'easters in the North Atlantic basin. Flooding due to hurricanes and Nor'easters was simulated separately and then composite flood probability distributions were developed. Model results were used to assess the vulnerability of the Central Artery/Tunnel system in Boston, Massachusetts to coastal flooding now and in the future. Local and regional adaptation strategies were
Probabilistic flood forecasting considering rainfall uncertainty
Institute of Scientific and Technical Information of China (English)
梁忠民; 蒋晓蕾; 曹炎煦; 彭顺风; 王凯; 王栋
2016-01-01
Based on the principle of the rainfall station sampling method, the probability distribution of the true value of areal rainfall was deduced, using the estimated areal rainfall, in order to describe the uncertainty of areal rainfall calculation under the conditions of the existing rainfall stations. Then, probabilistic flood forecasting was performed in combination with deterministic prediction models. Application of this method to the Huangnizhuang Basin of the Huaihe River shows that it can obtain estimates of the probability distribution of the true value of areal rainfall for any period of time in the basin and describe the uncertainty of the areal rainfall calculation. Furthermore, using a hydrologic model (e.g., the Xinanjiang model) and Monte-Carlo sampling, the probability distribution of the predicted flow rate can be estimated, and probabilistic flood forecasting can be carried out.
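The Monte-Carlo step described above can be sketched generically: sample the uncertain areal rainfall from its distribution, push each sample through the rainfall-runoff model, and summarize the resulting flows as a predictive distribution. The toy runoff relation and the Gaussian rainfall-uncertainty assumption below are illustrative stand-ins, not the paper's derived distribution or the Xinanjiang model.

```python
import numpy as np

rng = np.random.default_rng(3)

def runoff(rainfall_mm, soil_moisture=0.6):
    """Toy rainfall-runoff relation standing in for a full hydrologic
    model such as the Xinanjiang model (illustrative only)."""
    effective = np.maximum(rainfall_mm - 10.0 * (1 - soil_moisture), 0.0)
    return 0.8 * effective   # flow index per mm of effective rainfall

# Areal rainfall truth is uncertain given a finite gauge network:
# sample around the estimated value (Gaussian assumed here purely
# for illustration of the sampling step).
estimated_rainfall = 50.0   # mm
samples = rng.normal(estimated_rainfall, 8.0, size=20_000)

flows = runoff(samples)
q05, q50, q95 = np.quantile(flows, [0.05, 0.5, 0.95])
print(q05, q50, q95)   # a predictive interval instead of a single value
```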
Geist, Eric; Jones, Henry; McBride, Mark; Fedors, Randy
2013-01-01
Panel 5 focused on tsunami flooding with an emphasis on Probabilistic Tsunami Hazard Analysis (PTHA) as derived from its counterpart, Probabilistic Seismic Hazard Analysis (PSHA) that determines seismic ground-motion hazards. The Panel reviewed current practices in PTHA and determined the viability of extending the analysis to extreme design probabilities (i.e., 10-4 to 10-6). In addition to earthquake sources for tsunamis, PTHA for extreme events necessitates the inclusion of tsunamis generated by submarine landslides, and treatment of the large attendant uncertainty in source characterization and recurrence rates. Tsunamis can be caused by local and distant earthquakes, landslides, volcanism, and asteroid/meteorite impacts. Coastal flooding caused by storm surges and seiches is covered in Panel 7. Tsunamis directly tied to earthquakes, the similarities with (and path forward offered by) the PSHA approach for PTHA, and especially submarine landslide tsunamis were a particular focus of Panel 5.
Institute of Scientific and Technical Information of China (English)
蒋晓蕾; 梁忠民; 王春青; 刘晓伟; 刘龙庆
2015-01-01
On the basis of deterministic forecasting with the Muskingum routing approach, the hydrologic uncertainty processor (HUP) of the Bayesian forecasting system (BFS) was applied to obtain the probability distribution of the predicted variable, enabling probabilistic flood forecasting for Tongguan Station on the Yellow River. Taking the median of the predictive probability distribution as a single-valued forecast and comparing it with the deterministic forecast shows improved accuracy, indicating the strong forecast-correcting ability of the Bayesian model. The influence of deterministic forecast accuracy on the reliability of the probabilistic forecast was then investigated through scenarios with different levels of deterministic forecast accuracy. The results demonstrate that the width and dispersion of the probabilistic forecast interval decrease as the accuracy of the deterministic forecast improves. Meanwhile, the reliability of the HUP-based probabilistic forecast is sensitive to random error in the deterministic forecast, but relatively insensitive to systematic error.
Application of probabilistic precipitation forecasts from a ...
African Journals Online (AJOL)
Application of probabilistic precipitation forecasts from a deterministic model towards increasing the lead-time of flash flood forecasts in South Africa. ... An ensemble set of 30 adjacent basins is then identified as ensemble members for each ...
Wakker, P. P.; Thaler, R.H.; Tversky, A.
1997-01-01
Probabilistic insurance is an insurance policy involving a small probability that the consumer will not be reimbursed. Survey data suggest that people dislike probabilistic insurance and demand more than a 20% reduction in the premium to compensate for a 1% default risk. While these preferences are intuitively appealing they are difficult to reconcile with expected utility theory. Under highly plausible assumptions about the utility function, willingness to pay for probabilistic i...
Wakker, P.P.; Thaler, R.H.; Tversky, A.
1997-01-01
Probabilistic insurance is an insurance policy involving a small probability that the consumer will not be reimbursed. Survey data suggest that people dislike probabilistic insurance and demand more than a 20% reduction in premium to compensate for a 1% default risk. These observations cannot be
P.P. Wakker (Peter); R.H. Thaler (Richard); A. Tversky (Amos)
1997-01-01
Probabilistic insurance is an insurance policy involving a small probability that the consumer will not be reimbursed. Survey data suggest that people dislike probabilistic insurance and demand more than a 20% reduction in the premium to compensate for a 1% default risk. While these
DEFF Research Database (Denmark)
Jensen, Finn Verner; Lauritzen, Steffen Lilholt
2001-01-01
This article describes the basic ideas and algorithms behind specification and inference in probabilistic networks based on directed acyclic graphs, undirected graphs, and chain graphs.
P.P. Wakker (Peter); R.H. Thaler (Richard); A. Tversky (Amos)
1997-01-01
Probabilistic insurance is an insurance policy involving a small probability that the consumer will not be reimbursed. Survey data suggest that people dislike probabilistic insurance and demand more than a 20% reduction in the premium to compensate for a 1% default risk. While these pref
DEFF Research Database (Denmark)
Thorndahl, Søren; Willems, Patrick
2007-01-01
Failure of urban drainage systems may occur due to surcharge or flooding at specific manholes in the system, or due to overflows from combined sewer systems to receiving waters. To quantify the probability or return period of failure, standard approaches make use of the simulation of design storm...
DEFF Research Database (Denmark)
Liu, Dedi; Li, Xiang; Guo, Shenglian;
2015-01-01
inflow values and their uncertainties obtained from the BFS, the reservoir operation results from different schemes can be analyzed in terms of benefits, dam safety, and downstream impacts during the flood season. When the reservoir FLWL dynamic control operation is implemented, there are two fundamental...
Floods and flash flooding Now is the time to determine your area’s flood risk. If you are not sure whether you ... If you are in a floodplain, consider buying flood insurance. Do not drive around barricades. If your ...
Suciu, Dan; Koch, Christop
2011-01-01
Probabilistic databases are databases where the value of some attributes or the presence of some records are uncertain and known only with some probability. Applications in many areas such as information extraction, RFID and scientific data management, data cleaning, data integration, and financial risk assessment produce large volumes of uncertain data, which are best modeled and processed by a probabilistic database. This book presents the state of the art in representation formalisms and query processing techniques for probabilistic data. It starts by discussing the basic principles for rep
Kopp, R. E., III; Delgado, M.; Horton, R. M.; Houser, T.; Little, C. M.; Muir-Wood, R.; Oppenheimer, M.; Rasmussen, D. M., Jr.; Strauss, B.; Tebaldi, C.
2014-12-01
Global mean sea level (GMSL) rise projections are insufficient for adaptation planning; local decisions require local projections that characterize risk over a range of timeframes and tolerances. We present a global set of local sea level (LSL) projections to inform decisions on timescales ranging from the coming decades through the 22nd century. We present complete probability distributions, informed by a combination of expert community assessment, expert elicitation, and process modeling [1]. We illustrate the application of this framework by estimating the joint distribution of future sea-level change and coastal flooding, and associated economic costs [1,2]. In much of the world in the current century, differences in median LSL projections are due primarily to varying levels of non-climatic uplift or subsidence. In the 22nd century and in the high-end tails, larger ice sheet contributions, particularly from the Antarctic ice sheet (AIS), contribute significantly to site-to-site differences. Uncertainty in GMSL and most LSL projections is dominated by the uncertain AIS component. Sea-level rise dramatically reshapes flood risk. For example, at the New York City (Battery) tide gauge, our projections indicate a likely (67% probability) 21st century LSL rise under RCP 8.5 of 65--129 cm (1-in-20 chance of exceeding 154 cm). Convolving the distribution of projected sea-level rise with the extreme value distribution of flood return periods indicates that this rise will cause the current 1.80 m `1-in-100 year' flood event to occur an expected nine times over the 21st century -- equivalent to the expected number of `1-in-11 year' floods in the absence of sea-level change. Projected sea-level rise for 2100 under RCP 8.5 would likely place $80-160 billion of current property in New York below the high tide line, with a 1-in-20 chance of losses >$190 billion. Even without accounting for potential changes in storms themselves, it would likely increase average annual storm
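The return-period calculation above can be sketched numerically. The Gumbel parameters and the single sea-level-rise value below are assumptions chosen for illustration (they happen to yield a 1.80 m 1-in-100-year height, the same order as the Battery example); the study itself convolves full probability distributions rather than a single shift.

```python
import numpy as np

# Illustrative Gumbel fit to annual-maximum flood heights at a tide
# gauge (parameters assumed, not the study's fit).
loc, scale = 1.0, 0.174

def annual_exceedance_prob(z):
    """P(annual maximum flood height > z) under the Gumbel fit."""
    return 1.0 - np.exp(-np.exp(-(z - loc) / scale))

z100 = loc - scale * np.log(-np.log(1 - 1 / 100))  # 1-in-100-year height

# A permanent rise in mean sea level shifts the distribution upward,
# so the fixed height z100 is exceeded far more often.
slr = 0.4   # m of local sea-level rise (illustrative)
p_new = annual_exceedance_prob(z100 - slr)
print(1 / p_new)    # new effective return period, in years
print(100 * p_new)  # expected exceedances of the old z100 per century
```

With these assumed numbers the old 1-in-100-year height is exceeded roughly every 10 years, i.e. about nine or ten expected times per century, illustrating how a modest mean shift dramatically reshapes flood risk.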
Brooks, K.N.; Fallon, J.D.; Lorenz, D.L.; Stark, J.R.; Menard, Jason; Easter, K.W.; Perry, Jim
2011-01-01
Floods result in great human disasters globally and nationally, causing an average of $4 billion of damages each year in the United States. Minnesota has its share of floods and flood damages, and the state has awarded nearly $278 million to local units of government for flood mitigation projects through its Flood Hazard Mitigation Grant Program. Since 1995, flood mitigation in the Red River Valley has exceeded $146 million. Considerable local and state funding has been provided to manage and mitigate problems of excess stormwater in urban areas, flooding of farmlands, and flood damages at road crossings. The cumulative costs involved with floods and flood mitigation in Minnesota are not known precisely, but it is safe to conclude that flood mitigation is a costly business. This chapter begins with a description of floods in Minnesota to provide examples and contrasts across the state. Background material is presented to provide a basic understanding of floods and flood processes, prediction, and management and mitigation. Methods of analyzing and characterizing floods are presented because they affect how we respond to flooding and can influence relevant practices. The understanding and perceptions of floods and flooding commonly differ between those who work in flood forecasting, flood protection, or water resource management and the citizens and businesses affected by floods. These differences can become magnified following a major flood, pointing to the need for better understanding of flooding as well as common language to describe flood risks and the uncertainty associated with determining such risks. Expectations of accurate and timely flood forecasts and our ability to control floods do not always match reality. Striving for clarity is important in formulating policies that can help avoid recurring flood damages and costs.
Douven, Igor; Horsten, Leon; Romeijn, Jan-Willem
2010-01-01
Until now, antirealists have offered sketches of a theory of truth, at best. In this paper, we present a probabilist account of antirealist truth in some formal detail, and we assess its ability to deal with the problems that are standardly taken to beset antirealism.
DEFF Research Database (Denmark)
Sørensen, John Dalsgaard; Burcharth, H. F.
This chapter describes how partial safety factors can be used in design of vertical wall breakwaters and an example of a code format is presented. The partial safety factors are calibrated on a probabilistic basis. The code calibration process used to calibrate some of the partial safety factors...
Bod, R.; Heine, B.; Narrog, H.
2010-01-01
Probabilistic linguistics takes all linguistic evidence as positive evidence and lets statistics decide. It allows for accurate modelling of gradient phenomena in production and perception, and suggests that rule-like behaviour is no more than a side effect of maximizing probability. This chapter
Do probabilistic forecasts lead to better decisions?
Ramos, M. H.; van Andel, S. J.; Pappenberger, F.
2013-06-01
The last decade has seen growing research in producing probabilistic hydro-meteorological forecasts and increasing their reliability. This followed the promise that, supplied with information about uncertainty, people would take better risk-based decisions. In recent years, therefore, research and operational developments have also started focusing attention on ways of communicating the probabilistic forecasts to decision-makers. Communicating probabilistic forecasts includes preparing tools and products for visualisation, but also requires understanding how decision-makers perceive and use uncertainty information in real time. At the EGU General Assembly 2012, we conducted a laboratory-style experiment in which several cases of flood forecasts and a choice of actions to take were presented as part of a game to participants, who acted as decision-makers. Answers were collected and analysed. In this paper, we present the results of this exercise and discuss if we indeed make better decisions on the basis of probabilistic forecasts.
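The question of whether probabilistic forecasts lead to better decisions is often formalized with the classic cost-loss model: take protective action when the forecast probability exceeds the cost/loss ratio. A minimal simulation under idealized assumptions (perfectly reliable forecasts, invented cost and loss figures):

```python
import random

random.seed(42)
COST, LOSS = 10.0, 100.0  # illustrative protection cost vs flood loss

def expense(p_forecast, flood_occurs, act_threshold):
    """Cost paid if we act; loss paid if we don't act and the flood occurs."""
    if p_forecast >= act_threshold:
        return COST
    return LOSS if flood_occurs else 0.0

# Reliable forecasts: the event occurs with exactly the forecast probability.
trials = []
for _ in range(20_000):
    p = random.random() * 0.5
    trials.append((p, random.random() < p))

# Optimal probabilistic rule acts when p >= COST/LOSS; a deterministic
# yes/no system is emulated here by a fixed 0.5 threshold.
e_prob = sum(expense(p, f, COST / LOSS) for p, f in trials) / len(trials)
e_det = sum(expense(p, f, 0.5) for p, f in trials) / len(trials)
```

Under these assumptions the probabilistic decision rule has a clearly lower average expense, which is the premise the experiment described in the abstract set out to examine with real decision-makers.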
40 CFR Appendix F to Part 112 - Facility-Specific Response Plan
2010-07-01
... 40 Protection of Environment 21 2010-07-01 2010-07-01 false Facility-Specific Response Plan F Appendix F to Part 112 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) WATER PROGRAMS...., topography, drainage); (5) Location of the material discharged (i.e., on a concrete pad or directly on the...
Directory of Open Access Journals (Sweden)
Mikaël Cozic
2016-11-01
Full Text Available The modeling of awareness and unawareness is a significant topic in the doxastic logic literature, where it is usually tackled in terms of full belief operators. The present paper aims at a treatment in terms of partial belief operators. It draws upon the modal probabilistic logic that was introduced by Aumann (1999) at the semantic level, and then axiomatized by Heifetz and Mongin (2001). The paper embodies in this framework those properties of unawareness that have been highlighted in the seminal paper by Modica and Rustichini (1999). Their paper deals with full belief, but we argue that the properties in question also apply to partial belief. Our main result is a (soundness and completeness) theorem that reunites the two strands—modal and probabilistic—of doxastic logic.
Flooding Fragility Experiments and Prediction
Energy Technology Data Exchange (ETDEWEB)
Smith, Curtis L. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Tahhan, Antonio [Idaho National Lab. (INL), Idaho Falls, ID (United States); Muchmore, Cody [Idaho National Lab. (INL), Idaho Falls, ID (United States); Nichols, Larinda [Idaho National Lab. (INL), Idaho Falls, ID (United States); Bhandari, Bishwo [Idaho National Lab. (INL), Idaho Falls, ID (United States); Pope, Chad [Idaho National Lab. (INL), Idaho Falls, ID (United States)
2016-09-01
This report describes the work that has been performed on flooding fragility, both the experimental tests being carried out and the probabilistic fragility predictive models being produced in order to use the test results. Flooding experiments involving full-scale doors have commenced in the Portal Evaluation Tank. The goal of these experiments is to develop a full-scale component flooding experiment protocol and to acquire data that can be used to create Bayesian regression models representing the fragility of these components. This work is in support of the Risk-Informed Safety Margin Characterization (RISMC) Pathway external hazards evaluation research and development.
Use of documentary sources on past flood events for flood risk management and land planning
Cœur, Denis; Lang, Michel
2008-09-01
The knowledge of past catastrophic events can improve flood risk mitigation policy, with a better awareness against risk. As such historical information is usually available in Europe for the past five centuries, historians are able to understand how past society dealt with flood risk, and hydrologists can include information on past floods into an adapted probabilistic framework. In France, Flood Risk Mitigation Maps are based either on the largest historical known flood event or on the 100-year flood event if it is greater. Two actions can be suggested in terms of promoting the use of historical information for flood risk management: (1) the development of a regional flood data base, with both historical and current data, in order to get a good feedback on recent events and to improve the flood risk education and awareness; (2) the commitment to keep a persistent/perennial management of a reference network of hydrometeorological observations for climate change studies.
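Folding historical floods into a probabilistic framework usually starts from a frequency analysis of the systematic gauge record. A simplified sketch with synthetic data and a method-of-moments Gumbel fit (the adapted framework the authors describe is more sophisticated; all figures here are invented):

```python
import math
import random
import statistics

random.seed(1)
# Synthetic 50-year systematic record of annual maximum discharge (m3/s).
systematic = [800 + random.gammavariate(2.0, 150.0) for _ in range(50)]

def gumbel_fit(sample):
    """Method-of-moments fit: beta = s*sqrt(6)/pi, mu = mean - 0.5772*beta."""
    m, s = statistics.mean(sample), statistics.stdev(sample)
    beta = s * math.sqrt(6) / math.pi
    return m - 0.5772 * beta, beta

def return_level(mu, beta, T):
    """Discharge exceeded on average once every T years."""
    return mu - beta * math.log(-math.log(1 - 1 / T))

mu, beta = gumbel_fit(systematic)
q100 = return_level(mu, beta, 100)

# A documented historical flood, the largest in a 300-year window, bounds
# the 300-year return level from below and can flag an underfitting model.
historical_flood = 2400.0
consistent = return_level(mu, beta, 300) >= historical_flood
```

This is the crudest possible use of a historical maximum, as a consistency check on a fit to the systematic record; formal approaches treat the historical period as censored data in the likelihood.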
Developing a Malaysia flood model
Haseldine, Lucy; Baxter, Stephen; Wheeler, Phil; Thomson, Tina
2014-05-01
Faced with growing exposures in Malaysia, insurers have a need for models to help them assess their exposure to flood losses. The need for an improved management of flood risks has been further highlighted by the 2011 floods in Thailand and recent events in Malaysia. The increasing demand for loss accumulation tools in Malaysia has led to the development of the first nationwide probabilistic Malaysia flood model, which we present here. The model is multi-peril, including river flooding for thousands of kilometres of river and rainfall-driven surface water flooding in major cities, which may cause losses equivalent to river flood in some high-density urban areas. The underlying hazard maps are based on a 30m digital surface model (DSM) and 1D/2D hydraulic modelling in JFlow and RFlow. Key mitigation schemes such as the SMART tunnel and drainage capacities are also considered in the model. The probabilistic element of the model is driven by a stochastic event set based on rainfall data, hence enabling per-event and annual figures to be calculated for a specific insurance portfolio and a range of return periods. Losses are estimated via depth-damage vulnerability functions which link the insured damage to water depths for different property types in Malaysia. The model provides a unique insight into Malaysian flood risk profiles and provides insurers with return period estimates of flood damage and loss to property portfolios through loss exceedance curve outputs. It has been successfully validated against historic flood events in Malaysia and is now being successfully used by insurance companies in the Malaysian market to obtain reinsurance cover.
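Depth-damage vulnerability functions of the kind described can be represented as a piecewise-linear curve mapping water depth to a damage ratio. The knots and the small portfolio below are invented for illustration, not calibrated Malaysian values:

```python
# Illustrative depth-damage curve: (depth in m, damage ratio) knots.
CURVE = [(0.0, 0.0), (0.5, 0.15), (1.0, 0.30), (2.0, 0.55), (4.0, 0.85)]

def damage_ratio(depth):
    """Linear interpolation on the depth-damage curve, clamped at the ends."""
    if depth <= CURVE[0][0]:
        return CURVE[0][1]
    for (d0, r0), (d1, r1) in zip(CURVE, CURVE[1:]):
        if depth <= d1:
            return r0 + (r1 - r0) * (depth - d0) / (d1 - d0)
    return CURVE[-1][1]

# Event loss for a toy portfolio: sum of insured value x damage ratio.
portfolio = [(250_000, 0.8), (400_000, 1.6), (150_000, 0.0)]  # (value, depth)
event_loss = sum(v * damage_ratio(d) for v, d in portfolio)
```

Repeating this per-location calculation over every event in the stochastic set, then ranking annual losses, is what produces the loss exceedance curve outputs mentioned in the abstract.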
Schweizer, B
2005-01-01
Topics include special classes of probabilistic metric spaces, topologies, and several related structures, such as probabilistic normed and inner-product spaces. 1983 edition, updated with 3 new appendixes. Includes 17 illustrations.
Probabilistic Concurrent Kleene Algebra
Directory of Open Access Journals (Sweden)
Annabelle McIver
2013-06-01
Full Text Available We provide an extension of concurrent Kleene algebras to account for probabilistic properties. The algebra yields a unified framework containing nondeterminism, concurrency and probability and is sound with respect to the set of probabilistic automata modulo probabilistic simulation. We use the resulting algebra to generalise the algebraic formulation of a variant of Jones' rely/guarantee calculus.
Palán, Ladislav; Punčochář, Petr
2017-04-01
Looking at the impact of flooding from a worldwide perspective, in the last 50 years flooding has caused over 460,000 fatalities and serious material damage. Combining the economic loss from the ten costliest flood events of the same period returns a loss (in present value) exceeding 300bn USD. Locally, in Brazil, flood is the most damaging natural peril, with an alarming increase in event frequencies, as 5 out of the 10 biggest flood losses ever recorded have occurred after 2009. The amount of economic and insured losses caused by the various flood types was the key driver of the local probabilistic flood model development. Considering the area of Brazil (the 5th biggest country in the world) and the scattered distribution of insured exposure, the domain covered by the model was limited to the entire state of São Paulo and 53 additional regions. The model quantifies losses on approx. 90% of the exposure (for regular property lines) of key insurers. Based on detailed exposure analysis, Impact Forecasting has developed this tool using long-term local hydrological data series (Agencia Nacional de Aguas) from riverine gauge stations and a digital elevation model (Instituto Brasileiro de Geografia e Estatística). To provide the most accurate representation of the local hydrological behaviour needed for the probabilistic simulation, hydrological data processing focused on frequency analyses of seasonal peak flows, done by fitting an appropriate extreme value statistical distribution, and on stochastic event set generation, consisting of synthetically derived flood events respecting the realistic spatial and frequency patterns visible in the entire period of hydrological observation. Data were tested for homogeneity, consistency and any significant breakpoint occurrence in the time series, so that either the entire observation record or only subparts of it were used for further analysis. The realistic spatial patterns of stochastic events are reproduced through the innovative use of d-vine copula
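The stochastic step described here, drawing synthetic flood events from a fitted extreme-value law, can be sketched with inverse-CDF sampling from a Gumbel distribution. The parameters are assumptions, and the real model's multi-station d-vine copula dependence structure is omitted:

```python
import math
import random

random.seed(7)
MU, BETA = 1200.0, 300.0  # assumed Gumbel parameters from a gauge analysis

def sample_annual_peak():
    """Inverse-CDF sampling of a Gumbel-distributed annual maximum flow."""
    u = random.random()
    return MU - BETA * math.log(-math.log(u))

# Stochastic event set: 10,000 synthetic years of peak flow.
event_set = [sample_annual_peak() for _ in range(10_000)]

# Sanity check of the generator: the empirical frequency of exceeding the
# theoretical 10-year flow should land near 1/10.
q10 = MU - BETA * math.log(-math.log(1 - 1 / 10))
freq = sum(p > q10 for p in event_set) / len(event_set)
```

The same inverse-CDF idea extends to correlated multi-site sampling once marginal fits are coupled through a copula, which is the role the d-vine construction plays in the model.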
Punčochář, P.; Podlaha, A.
2012-04-01
A new flood model for Austria quantifying fluvial flood losses, based on a probabilistic event set developed by Impact Forecasting (Aon Benfield's model development centre), was released in June 2011. It was successfully validated against two serious past flood events, August 2002 and August 2005. The model is based on a 10-metre cell size digital terrain model with a 1 cm vertical step and uses daily mean flows from 548 gauge stations, with series of average length ~60 years. The event set is based on the correlation of monthly maximum flows, generating 12 stochastic events per year, and allows annual and occurrence exceedance probability loss estimates to be calculated. The model contains flood extents for more than 24,000 km of modelled river network, compatible with the HORA project (HOchwasserRisikoflächen Austria), for design flows ranging from 2 to 10,000 years. The model is primarily constructed to work with postal-level-resolution insurance data, reducing positional uncertainty by weighting over more than 2.5 million address points from Austria Post's ACGeo database. Countrywide flood protection data were provided by the Austrian Ministry of Environment. The model was successfully tested with property portfolios of 8 global and local insurance companies and was validated against the August 2002 and August 2005 events, evaluating their return periods on a probabilistic simulation basis.
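Annual (AEP) and occurrence (OEP) exceedance probability losses of the kind this model outputs can be estimated empirically from a simulated event catalogue: AEP ranks the aggregate loss per year, OEP the largest single event per year. A toy version with invented event frequencies and loss distributions:

```python
import random

random.seed(3)
N_YEARS = 5_000
# Simulated catalogue: each year has up to 12 flood events (one candidate
# per month) with lognormal losses; all parameters are illustrative.
years = []
for _ in range(N_YEARS):
    n = sum(random.random() < 0.3 for _ in range(12))
    years.append([random.lognormvariate(13, 1) for _ in range(n)])

agg = sorted((sum(y) for y in years), reverse=True)               # AEP basis
occ = sorted((max(y) if y else 0.0 for y in years), reverse=True) # OEP basis

def loss_at_return_period(sorted_losses, T):
    """Loss exceeded with annual probability 1/T (empirical quantile)."""
    return sorted_losses[int(N_YEARS / T) - 1]

aep_100 = loss_at_return_period(agg, 100)
oep_100 = loss_at_return_period(occ, 100)
```

Because the yearly aggregate is never smaller than the yearly maximum, the AEP curve sits at or above the OEP curve at every return period.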
Probabilistic Algorithms in Robotics
Thrun, Sebastian
2000-01-01
This article describes a methodology for programming robots known as probabilistic robotics. The probabilistic paradigm pays tribute to the inherent uncertainty in robot perception, relying on explicit representations of uncertainty when determining what to do. This article surveys some of the progress in the field, using in-depth examples to illustrate some of the nuts and bolts of the basic approach. My central conjecture is that the probabilistic approach to robotics scales better to compl...
Probabilistic liver atlas construction
Dura, Esther; Domingo, Juan; Ayala, Guillermo; Marti-Bonmati, Luis; Goceri, E.
2017-01-01
Background Anatomical atlases are 3D volumes or shapes representing an organ or structure of the human body. They contain either the prototypical shape of the object of interest together with other shapes representing its statistical variations (statistical atlas) or a probability map of belonging to the object (probabilistic atlas). Probabilistic atlases are mostly built with simple estimations only involving the data at each spatial location. Results A new method for probabilistic atlas con...
Probabilistic Logical Characterization
DEFF Research Database (Denmark)
Hermanns, Holger; Parma, Augusto; Segala, Roberto;
2011-01-01
Probabilistic automata exhibit both probabilistic and non-deterministic choice. They are therefore a powerful semantic foundation for modeling concurrent systems with random phenomena arising in many applications ranging from artificial intelligence, security, and systems biology to performance modeling. Several variations of bisimulation and simulation relations have proved to be useful as means to abstract and compare different automata. This paper develops a taxonomy of logical characterizations of these relations on image-finite and image-infinite probabilistic automata.
Methods and tools to support real time risk-based flood forecasting - a UK pilot application
Directory of Open Access Journals (Sweden)
Brown Emma
2016-01-01
Full Text Available Flood managers have traditionally used probabilistic models to assess potential flood risk for strategic planning and non-operational applications. Computational restrictions on data volumes and simulation times have meant that information on the risk of flooding has not been available for operational flood forecasting purposes. In practice, however, the operational flood manager has probabilistic questions to answer, which are not completely supported by the outputs of traditional, deterministic flood forecasting systems. In a collaborative approach, HR Wallingford and Deltares have developed methods, tools and techniques to extend existing flood forecasting systems with elements of strategic flood risk analysis, including probabilistic failure analysis, two dimensional flood spreading simulation and the analysis of flood impacts and consequences. This paper presents the results of the application of these new operational flood risk management tools to a pilot catchment in the UK. It discusses the problems of performing probabilistic flood risk assessment in real time and how these have been addressed in this study. It also describes the challenges of the communication of risk to operational flood managers and to the general public, and how these new methods and tools can provide risk-based supporting evidence to assist with this process.
Duplicate Detection in Probabilistic Data
Panse, Fabian; van Keulen, Maurice; de Keijzer, Ander; Ritter, Norbert
2009-01-01
Collected data often contains uncertainties. Probabilistic databases have been proposed to manage uncertain data. To combine data from multiple autonomous probabilistic databases, an integration of probabilistic data has to be performed. Until now, however, data integration approaches have focused o
Probabilistic Dynamic Epistemic Logic
Kooi, B.P.
2003-01-01
In this paper I combine the dynamic epistemic logic of Gerbrandy (1999) with the probabilistic logic of Fagin and Halpern (1999). The result is a new probabilistic dynamic epistemic logic, a logic for reasoning about probability, information, and information change that takes higher order informatio
DEFF Research Database (Denmark)
Larsen, Kim Guldstrand; Mardare, Radu Iulian; Xue, Bingtian
2016-01-01
We introduce a version of the probabilistic µ-calculus (PMC) built on top of a probabilistic modal logic that allows encoding n-ary inequational conditions on transition probabilities. PMC extends previously studied calculi and we prove that, despite its expressiveness, it enjoys a series of good...
Leijala, Ulpu; Björkqvist, Jan-Victor; Johansson, Milla M.; Pellikka, Havu
2017-04-01
Future coastal management continuously strives for more location-exact and precise methods to investigate possible extreme sea level events and to face flooding hazards in the most appropriate way. Evaluating future flooding risks by understanding the behaviour of the joint effect of sea level variations and wind waves is one means of making a more comprehensive flooding hazard analysis, and may at first seem like a straightforward task to solve. Nevertheless, challenges and limitations such as the availability of time series of the sea level and wave height components, the quality of data, significant locational variability of coastal wave height, as well as assumptions to be made depending on the study location, make the task more complicated. In this study, we present a statistical method for combining location-specific probability distributions of water level variations (including local sea level observations and global mean sea level rise) and wave run-up (based on wave buoy measurements). The goal of our method is to account for waves more accurately when performing flooding hazard analysis on the coast, compared to the approach of adding a separate fixed wave action height on top of sea-level-based flood risk estimates. As a result of our new method, we obtain maximum elevation heights, with different return periods, of the continuous water mass caused by a combination of both phenomena, "the green water". We also introduce a sensitivity analysis to evaluate the properties and functioning of our method. The sensitivity test is based on using theoretical wave distributions representing different alternatives of wave behaviour in relation to sea level variations. As these wave distributions are merged with the sea level distribution, we get information on how the different wave height conditions and the shape of the wave height distribution influence the joint results. Our method presented here can be used as an advanced tool to minimize over- and
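Merging a sea-level distribution with a wave run-up distribution, rather than adding a fixed wave height on top of a sea-level quantile, can be illustrated by Monte Carlo convolution of two independent samples. Both distributions below are stand-ins, not the observational distributions used in the study, and independence is itself an assumption:

```python
import random

random.seed(11)
N = 100_000
# Independent draws (illustrative): sea level anomaly ~ normal (m),
# wave run-up ~ Weibull (m).
sea = [random.gauss(0.0, 0.25) for _ in range(N)]
runup = [random.weibullvariate(0.4, 1.5) for _ in range(N)]

# Distribution of the total elevation of the continuous water mass.
total = sorted(s + r for s, r in zip(sea, runup))

def level_exceeded_with_prob(p):
    """Total elevation exceeded with probability p (empirical quantile)."""
    return total[int((1 - p) * N)]

# Joint-distribution estimate vs the naive "sea-level quantile plus an
# equally rare fixed run-up" approach criticized in the abstract.
joint_1pct = level_exceeded_with_prob(0.01)
naive_1pct = sorted(sea)[int(0.99 * N)] + sorted(runup)[int(0.99 * N)]
```

Because both components are rarely extreme at the same time, the joint quantile sits below the stacked single-component quantiles, which is exactly the over-estimation the combined method avoids.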
Probabilistic Ensemble Forecast of Summertime Temperatures in Pakistan
Directory of Open Access Journals (Sweden)
Muhammad Hanif
2014-01-01
Full Text Available Snowmelt flooding triggered by intense heat is a major temperature-related weather hazard in northern Pakistan, and the frequency of such extreme flood events has increased in recent years. In this study, the probabilistic temperature forecasts at seasonal and subseasonal time scales based on hindcast simulations from three state-of-the-art models within the DEMETER project are assessed by the relative operating characteristic (ROC) verification method. Results based on direct model outputs reveal significant skill for hot summers in February 3-5 (ROC area = 0.707, with lower 95% confidence limit of 0.538) and February 4-5 (ROC area = 0.771, with lower 95% confidence limit of 0.623) forecasts when validated against observations. Results for ERA-40 reanalysis also show skill for hot summers. Skilful probabilistic ensemble forecasts of summertime temperatures may be valuable in providing foreknowledge of snowmelt flooding and for water management in Pakistan.
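The ROC area used for verification here equals the probability that a randomly chosen event year received a higher forecast probability than a randomly chosen non-event year (the Mann-Whitney statistic). A small self-contained check with synthetic forecasts of modest skill (the beta distributions are invented):

```python
import random

random.seed(5)
# Synthetic probabilistic forecasts: draws for observed hot summers are
# shifted slightly higher than draws for normal summers.
events = [random.betavariate(3, 2) for _ in range(300)]     # observed hot
nonevents = [random.betavariate(2, 3) for _ in range(300)]  # observed normal

def roc_area(pos, neg):
    """P(forecast for an event > forecast for a non-event); ties count half."""
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

area = roc_area(events, nonevents)  # > 0.5 indicates skill over chance
```

An area of 0.5 corresponds to no skill and 1.0 to perfect discrimination, which is why the paper reports lower confidence limits above 0.5 as evidence of significant skill.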
Probabilistic Structural Analysis Program
Pai, Shantaram S.; Chamis, Christos C.; Murthy, Pappu L. N.; Stefko, George L.; Riha, David S.; Thacker, Ben H.; Nagpal, Vinod K.; Mital, Subodh K.
2010-01-01
NASA/NESSUS 6.2c is a general-purpose, probabilistic analysis program that computes probability of failure and probabilistic sensitivity measures of engineered systems. Because NASA/NESSUS uses highly computationally efficient and accurate analysis techniques, probabilistic solutions can be obtained even for extremely large and complex models. Once the probabilistic response is quantified, the results can be used to support risk-informed decisions regarding reliability for safety-critical and one-of-a-kind systems, as well as for maintaining a level of quality while reducing manufacturing costs for larger-quantity products. NASA/NESSUS has been successfully applied to a diverse range of problems in aerospace, gas turbine engines, biomechanics, pipelines, defense, weaponry, and infrastructure. This program combines state-of-the-art probabilistic algorithms with general-purpose structural analysis and lifing (life-prediction) methods to compute the probabilistic response and reliability of engineered structures. Uncertainties in load, material properties, geometry, boundary conditions, and initial conditions can be simulated. The structural analysis methods include non-linear finite-element methods, heat-transfer analysis, polymer/ceramic matrix composite analysis, monolithic (conventional metallic) materials life-prediction methodologies, boundary element methods, and user-written subroutines. Several probabilistic algorithms are available such as the advanced mean value method and the adaptive importance sampling method. NASA/NESSUS 6.2c is structured in a modular format with 15 elements.
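The core probabilistic computation, a probability of failure for a limit state g = R - S, can be approximated by plain Monte Carlo. NESSUS itself uses more efficient algorithms such as the advanced mean value method and adaptive importance sampling; the distributions below are illustrative stand-ins for its random variables:

```python
import math
import random
from statistics import NormalDist

random.seed(9)
N = 200_000
# Limit state g = R - S: failure when the load S exceeds the resistance R.
fails = 0
for _ in range(N):
    R = random.lognormvariate(math.log(50.0), 0.1)  # resistance (assumed)
    S = random.gauss(30.0, 5.0)                     # load (assumed)
    fails += S > R
pf = fails / N  # Monte Carlo probability of failure

# Generalized reliability index, beta = -Phi^-1(pf).
beta = -NormalDist().inv_cdf(pf)
```

The reliability index is the usual single-number summary reported alongside the failure probability; sensitivity measures follow from repeating the calculation with perturbed distribution parameters.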
An operational procedure for rapid flood risk assessment in Europe
Dottori, Francesco; Kalas, Milan; Salamon, Peter; Bianchi, Alessandra; Alfieri, Lorenzo; Feyen, Luc
2017-07-01
The development of methods for rapid flood mapping and risk assessment is a key step to increase the usefulness of flood early warning systems and is crucial for effective emergency response and flood impact mitigation. Currently, flood early warning systems rarely include real-time components to assess potential impacts generated by forecasted flood events. To overcome this limitation, this study describes the benchmarking of an operational procedure for rapid flood risk assessment based on predictions issued by the European Flood Awareness System (EFAS). Daily streamflow forecasts produced for major European river networks are translated into event-based flood hazard maps using a large map catalogue derived from high-resolution hydrodynamic simulations. Flood hazard maps are then combined with exposure and vulnerability information, and the impacts of the forecasted flood events are evaluated in terms of flood-prone areas, economic damage and affected population, infrastructures and cities. An extensive testing of the operational procedure has been carried out by analysing the catastrophic floods of May 2014 in Bosnia-Herzegovina, Croatia and Serbia. The reliability of the flood mapping methodology is tested against satellite-based and report-based flood extent data, while modelled estimates of economic damage and affected population are compared against ground-based estimations. Finally, we evaluate the skill of risk estimates derived from EFAS flood forecasts with different lead times and combinations of probabilistic forecasts. Results highlight the potential of the real-time operational procedure in helping emergency response and management.
Probabilistic transmission system planning
Li, Wenyuan
2011-01-01
"The book is composed of 12 chapters and three appendices, and can be divided into four parts. The first part includes Chapters 2 to 7, which discuss the concepts, models, methods and data in probabilistic transmission planning. The second part, Chapters 8 to 11, addresses four essential issues in probabilistic transmission planning applications using actual utility systems as examples. Chapter 12, as the third part, focuses on a special issue, i.e. how to deal with uncertainty of data in probabilistic transmission planning. The fourth part consists of three appendices, which provide the basic knowledge in mathematics for probabilistic planning. Please refer to the attached table of contents which is given in a very detailed manner"--
Conditioning Probabilistic Databases
Koch, Christoph
2008-01-01
Past research on probabilistic databases has studied the problem of answering queries on a static database. Application scenarios of probabilistic databases however often involve the conditioning of a database using additional information in the form of new evidence. The conditioning problem is thus to transform a probabilistic database of priors into a posterior probabilistic database which is materialized for subsequent query processing or further refinement. It turns out that the conditioning problem is closely related to the problem of computing exact tuple confidence values. It is known that exact confidence computation is an NP-hard problem. This has lead researchers to consider approximation techniques for confidence computation. However, neither conditioning nor exact confidence computation can be solved using such techniques. In this paper we present efficient techniques for both problems. We study several problem decomposition methods and heuristics that are based on the most successful search techn...
Directory of Open Access Journals (Sweden)
Azad Wan Hazdy
2017-01-01
Full Text Available Flood disasters occur quite frequently in Malaysia and have been categorized as the most threatening natural disaster compared to landslides, hurricanes, tsunamis, haze and others. A study by the Department of Irrigation and Drainage (DID) shows that 9% of land areas in Malaysia are prone to flooding, which may affect approximately 4.9 million of the population. Two-dimensional flood routing modelling is becoming broadly utilized for floodplain analysis and is a very effective tool for evaluating floods. Flood propagation can be better understood by simulating the flow and water level using hydrodynamic modelling. Hydrodynamic flood routing can be characterized by the spatial complexity of the schematization, such as a 1D model or a 2D model. It was found that most available hydrological models for flood forecasting focus on short durations, as compared to long-duration hydrological modelling using the Probabilistic Distribution Moisture Model (PDM). The aim of this paper is to discuss preliminary findings on the development of a flood forecasting model using the Probabilistic Distribution Moisture Model (PDM) for the Kelantan river basin. The findings discussed in this paper include a preliminarily calibrated PDM model, which performed reasonably for the December 2014 event but underestimated the peak flows. Apart from that, this paper also discusses findings on Soil Moisture Deficit (SMD) and floodplain analysis. Flood forecasting is a complex process that begins with an understanding of the geographical makeup of the catchment and knowledge of the preferential regions of heavy rainfall and flood behaviour for the area of responsibility. Therefore, to decrease the uncertainty in the model output, it is important to increase the complexity of the model.
Forecasting Extreme Flooding in South Asia (Invited)
Webster, P. J.
2010-12-01
In most years there is extensive flooding across India, Pakistan and Bangladesh. On average, 40 million people are displaced by floods in India and half that many again in Bangladesh. Occasionally, even more extensive and severe flooding occurs across South Asia. In 2007 and 2008 the Brahmaputra flooded three times, causing severe disruption of commerce, agriculture and life in general. Systems set up by an international collaboration predicted these Bangladesh floods with an operational system at 10- and 15-day horizons. These forecasts determined the risk of flooding and allowed the Bangladeshis in peril to prepare, harvesting crops and storing household and agricultural assets. Savings amounting to increments of annual income resulted from the forecasts. In July and August 2010, severe flooding occurred in Pakistan causing horrendous damage and loss of life. But these floods were also predictable at the 10-day time scale, had the forecasting system developed for Bangladesh been implemented there. Similar systems could be implemented in India but would require local cooperation. We describe the manner in which quantified probabilistic precipitation forecasts, coupled with hydrological models, can provide useful and timely extended warnings of flooding.
Bayesian flood forecasting methods: A review
Han, Shasha; Coulibaly, Paulin
2017-08-01
Over the past few decades, floods have been among the most common and most widely distributed natural disasters in the world. If floods could be accurately forecasted in advance, their negative impacts could be greatly minimized. It is widely recognized that quantification and reduction of the uncertainty associated with hydrologic forecasts is of great importance for flood estimation and rational decision making. The Bayesian forecasting system (BFS) offers an ideal theoretical framework for uncertainty quantification that can be developed for probabilistic flood forecasting via any deterministic hydrologic model. It provides a suitable theoretical structure, empirically validated models and reasonable analytic-numerical computation methods, and can be developed into various Bayesian forecasting approaches. This paper presents a comprehensive review of Bayesian forecasting approaches applied to flood forecasting from 1999 to the present. The review starts with an overview of the fundamentals of BFS and recent advances in BFS, followed by BFS applications in river stage forecasting and real-time flood forecasting; it then moves to a critical analysis evaluating the advantages and limitations of Bayesian forecasting methods and other predictive uncertainty assessment approaches in flood forecasting, and finally discusses future research directions in Bayesian flood forecasting. Results show that the Bayesian flood forecasting approach is an effective and advanced way to perform flood estimation: it considers all sources of uncertainty and produces a predictive distribution of the river stage, river discharge or runoff, thus giving more accurate and reliable flood forecasts. Some emerging Bayesian forecasting methods (e.g. the ensemble Bayesian forecasting system and Bayesian multi-model combination) were shown to overcome the limitations of a single model or fixed model weights and to effectively reduce predictive uncertainty. In recent years, various Bayesian flood forecasting approaches have been
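A minimal stand-in for a Bayesian uncertainty processor of the kind reviewed: update a model-bias parameter from past forecast errors with a conjugate normal prior, then issue a predictive distribution around the deterministic forecast. All numbers are invented, and real Bayesian forecasting systems model hydrologic and input uncertainty far more carefully:

```python
import statistics
from statistics import NormalDist

# Past errors (observed minus modelled stage, m) of a deterministic model.
past_errors = [0.12, -0.05, 0.20, 0.08, -0.02, 0.15, 0.05, 0.10]

# Conjugate normal update of the bias: vague prior N(0, 1), error spread
# taken as known and estimated from the sample.
sigma = statistics.stdev(past_errors)
n = len(past_errors)
prior_mean, prior_var = 0.0, 1.0
post_var = 1 / (1 / prior_var + n / sigma**2)
post_mean = post_var * (prior_mean / prior_var + sum(past_errors) / sigma**2)

# Predictive distribution of the true stage around a 3.40 m model forecast:
# posterior bias uncertainty plus residual error variance.
model_forecast = 3.40
pred = NormalDist(model_forecast + post_mean, (post_var + sigma**2) ** 0.5)
p_exceed_danger = 1 - pred.cdf(3.60)  # chance of topping a 3.60 m danger level
```

The point of the exercise is the output type: instead of a single stage value, the forecaster gets a full predictive distribution and can quote exceedance probabilities for any threshold of interest.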
Probabilistic Belief Logic and Its Probabilistic Aumann Semantics
Institute of Scientific and Technical Information of China (English)
CAO ZiNing(曹子宁); SHI ChunYi(石纯一)
2003-01-01
In this paper, we present a logic system for probabilistic belief named PBL, which expands the language of belief logic by introducing probabilistic belief. Furthermore, we give the probabilistic Aumann semantics of PBL. We also list some valid properties of belief and probabilistic belief, which form the deduction system of PBL. Finally, we prove the soundness and completeness of these properties with respect to probabilistic Aumann semantics.
Formalizing Probabilistic Safety Claims
Herencia-Zapana, Heber; Hagen, George E.; Narkawicz, Anthony J.
2011-01-01
A safety claim for a system is a statement that the system, which is subject to hazardous conditions, satisfies a given set of properties. Following work by John Rushby and Bev Littlewood, this paper presents a mathematical framework that can be used to state and formally prove probabilistic safety claims. It also enables hazardous conditions, their uncertainties, and their interactions to be integrated into the safety claim. This framework provides a formal description of the probabilistic composition of an arbitrary number of hazardous conditions and their effects on system behavior. An example is given of a probabilistic safety claim for a conflict detection algorithm for aircraft in a 2D airspace. The motivation for developing this mathematical framework is that it can be used in an automated theorem prover to formally verify safety claims.
Probabilistic approach to mechanisms
Sandler, BZ
1984-01-01
This book discusses the application of probabilistics to the investigation of mechanical systems. The book shows, for example, how random function theory can be applied directly to the investigation of random processes in the deflection of cam profiles, pitch or gear teeth, pressure in pipes, etc. The author also deals with some other technical applications of probabilistic theory, including, amongst others, those relating to pneumatic and hydraulic mechanisms and roller bearings. Many of the aspects are illustrated by examples of applications of the techniques under discussion.
Probabilistic conditional independence structures
Studeny, Milan
2005-01-01
Probabilistic Conditional Independence Structures provides the mathematical description of probabilistic conditional independence structures; the author uses non-graphical methods of their description, and takes an algebraic approach. The monograph presents the methods of structural imsets and supermodular functions, and deals with independence implication and equivalence of structural imsets. Motivation, mathematical foundations and areas of application are included, and a rough overview of graphical methods is also given. In particular, the author has been careful to use suitable terminology, and presents the work so that it will be understood both by statisticians and by researchers in artificial intelligence. The necessary elementary mathematical notions are recalled in an appendix.
Identification and classification of Serbia's historic floods
Directory of Open Access Journals (Sweden)
Prohaska Stevan
2009-01-01
River flooding in Serbia is a natural phenomenon which largely exceeds the scope of water management and hydraulic engineering, and has considerable impact on the development of Serbian society. Today, the importance and value of areas threatened by floods are among the key considerations of sustainable development. As a result, flood protection techniques and procedures need to be continually refined and updated, following innovations in the fields of science and technology. Knowledge of high flows is key for sizing hydraulic structures and for gauging the cost-effectiveness and safety of the component structures of flood protection systems. However, sizing of hydraulic structures based on computed high flows does not ensure absolute safety; there is a residual flood risk and a risk of structural failure if a flood exceeds computed levels. In hydrological practice, such floods are often referred to as historic floods. The goal of this paper is to present a calculation procedure for the objective identification of historic floods, using long, multiple-year series of data on high flows of natural watercourses in Serbia. At its current stage of development, the calculation procedure is based on maximum annual discharges recorded at key monitoring stations of the Hydro-Meteorological Service of Serbia (HMS Serbia). When applied, the procedure results in the identification of specific historic maximum stages/floods (if any) at all gauge sites included in the analysis. Probabilistic theory is then applied to assess the statistical significance of each identified historic flood and to classify the historic flood, as appropriate. At the end of the paper, the results of the applied methodology are shown in tabular and graphic form for various Serbian rivers. All identified historic floods are ranked based on their probability of occurrence (i.e., return period).
Institute of Scientific and Technical Information of China (English)
YIN PUMIN
2010-01-01
Drenched riverside towns in the central and southern parts of China were preparing for even worse flooding as water levels in the country's huge rivers surged and rainstorms continued. As of July 27, accumulated precipitation since June 16 in 70 percent of the drainage areas of the Yangtze River had exceeded 50 mm, after three rounds of rainstorms, said Cai Qihua, Deputy Director of the Yangtze River Flood Control and Drought Relief Headquarters.
Probabilistic Causation without Probability.
Holland, Paul W.
The failure of Hume's "constant conjunction" to describe apparently causal relations in science and everyday life has led to various "probabilistic" theories of causation of which the study by P. C. Suppes (1970) is an important example. A formal model that was developed for the analysis of comparative agricultural experiments…
Probabilistic simple sticker systems
Selvarajoo, Mathuri; Heng, Fong Wan; Sarmin, Nor Haniza; Turaev, Sherzod
2017-04-01
A model for DNA computing using the recombination behavior of DNA molecules, known as a sticker system, was introduced by L. Kari, G. Paun, G. Rozenberg, A. Salomaa, and S. Yu in the paper "DNA computing, sticker systems and universality" (Acta Informatica, vol. 35, pp. 401-420, 1998). A sticker system uses the Watson-Crick complementarity of DNA molecules: starting from incomplete double-stranded sequences, sticking operations are applied iteratively until a complete double-stranded sequence is obtained. It is known that sticker systems with finite sets of axioms and sticker rules generate only regular languages. Hence, different types of restrictions have been considered to increase the computational power of sticker systems. Recently, a variant of restricted sticker systems, called probabilistic sticker systems, has been introduced [4]. In this variant, probabilities are initially associated with the axioms, and the probability of a generated string is computed by multiplying the probabilities of all occurrences of the initial strings in the computation of the string. Strings for the language are selected according to some probabilistic requirements. In this paper, we study fundamental properties of probabilistic simple sticker systems. We prove that the probabilistic enhancement increases the computational power of simple sticker systems.
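The probability rule described in the abstract (string probability as the product of the probabilities of all axiom occurrences used in its derivation) can be illustrated with a toy sketch; the axiom names, probabilities, and derivation below are hypothetical, not from the cited paper.

```python
# Toy illustration of the probabilistic sticker rule: each axiom carries
# a probability, and a generated string's probability is the product of
# the probabilities of all initial strings used in its computation.

def string_probability(axiom_probs, derivation):
    """derivation lists the axioms used, with repetition."""
    p = 1.0
    for axiom in derivation:
        p *= axiom_probs[axiom]
    return p

axiom_probs = {"A1": 0.5, "A2": 0.25}
# A derivation that used axiom A1 twice and A2 once:
p = string_probability(axiom_probs, ["A1", "A1", "A2"])
```

Selecting only strings whose probability clears a cut-point is one of the probabilistic requirements that lifts such systems beyond the regular languages.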
Probabilistic parsing strategies
Nederhof, Mark-Jan; Satta, Giorgio
We present new results on the relation between purely symbolic context-free parsing strategies and their probabilistic counterparts. Such parsing strategies are seen as constructions of push-down devices from grammars. We show that preservation of probability distribution is possible under two
Bergstra, J.A.; Middelburg, C.A.
2015-01-01
We add probabilistic features to basic thread algebra and its extensions with thread-service interaction and strategic interleaving. Here, threads represent the behaviours produced by instruction sequences under execution and services represent the behaviours exhibited by the components of execution
DEFF Research Database (Denmark)
Chen, Peiyuan; Chen, Zhe; Bak-Jensen, Birgitte
2008-01-01
This paper reviews the development of the probabilistic load flow (PLF) techniques. Applications of the PLF techniques in different areas of power system steady-state analysis are also discussed. The purpose of the review is to identify different available PLF techniques and their corresponding...
Probabilistic dynamic belief revision
Baltag, A.; Smets, S.
2008-01-01
We investigate the discrete (finite) case of the Popper-Renyi theory of conditional probability, introducing discrete conditional probabilistic models for knowledge and conditional belief, and comparing them with the more standard plausibility models. We also consider a related notion, that of safe
Confronting uncertainty in flood damage predictions
Schröter, Kai; Kreibich, Heidi; Vogel, Kristin; Merz, Bruno
2015-04-01
Reliable flood damage models are a prerequisite for the practical usefulness of model results. Oftentimes, traditional uni-variate damage models, such as depth-damage curves, fail to reproduce the variability of observed flood damage. Innovative multi-variate probabilistic modelling approaches are promising ways to capture and quantify the uncertainty involved and thus to improve the basis for decision making. In this study we compare the predictive capability of two probabilistic modelling approaches, namely Bagging Decision Trees and Bayesian Networks. For model evaluation we use empirical damage data compiled through computer-aided telephone interviews after the floods of 2002, 2005 and 2006 in the Elbe and Danube catchments in Germany. We carry out a split-sample test by sub-setting the damage records: one sub-set is used to derive the models and the remaining records are used to evaluate their predictive performance. Further, we stratify the sample by catchment, which allows us to study model performance in a spatial transfer context. Flood damage estimation is carried out at the scale of individual buildings in terms of relative damage. The predictive performance of the models is assessed in terms of systematic deviation (mean bias), precision (mean absolute error) and reliability, represented by the proportion of observations that fall within the 5%- to 95%-quantile predictive interval. The reliability of the probabilistic predictions within validation runs decreases only slightly and achieves a very good coverage of observations within the predictive interval. Probabilistic models provide quantitative information about prediction uncertainty, which is crucial for assessing the reliability of model predictions and improves the usefulness of model results.
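The three evaluation scores used above (mean bias, mean absolute error, and reliability as interval coverage) are simple to compute from a set of probabilistic predictions. A minimal sketch with synthetic placeholder data:

```python
# Evaluation of probabilistic damage predictions: mean bias, mean absolute
# error of the predictive median, and reliability as the share of
# observations inside the 5%-95% predictive interval. Data are synthetic.

def evaluate(observed, pred_median, pred_lo, pred_hi):
    n = len(observed)
    bias = sum(p - o for p, o in zip(pred_median, observed)) / n
    mae = sum(abs(p - o) for p, o in zip(pred_median, observed)) / n
    coverage = sum(lo <= o <= hi
                   for o, lo, hi in zip(observed, pred_lo, pred_hi)) / n
    return bias, mae, coverage

obs = [0.10, 0.30, 0.50, 0.20]    # observed relative damage per building
med = [0.12, 0.25, 0.55, 0.20]    # predictive median
lo  = [0.05, 0.20, 0.40, 0.10]    # 5%-quantile of the prediction
hi  = [0.20, 0.40, 0.70, 0.35]    # 95%-quantile of the prediction
bias, mae, cov = evaluate(obs, med, lo, hi)
```

A well-calibrated 5%-95% interval should cover roughly 90% of observations; coverage far below that signals overconfident predictions.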
Institute of Scientific and Technical Information of China (English)
1998-01-01
In the summer and autumn of 1998, the river valleys of the Changjiang, Songhua and Nenjiang rivers were stricken by exceptionally serious floods. As of August 22, the flooded areas stretched over 52.4 million acres. More than 223 million people were affected by the flood, 4.97 million houses were ruined, economic losses totaled RMB 166 billion, and most tragically, 3,004 people lost their lives. It was one of the costliest disasters in Chinese history. Millions of People's Liberation Army soldiers and local people joined hands to battle the floodwaters. Thanks to their unified efforts and tenacious struggle, they successfully withstood the rising water, resumed production and began to rebuild their homes.
Probabilistic authenticated quantum dialogue
Hwang, Tzonelih; Luo, Yi-Ping
2015-12-01
This work proposes a probabilistic authenticated quantum dialogue (PAQD) based on Bell states with the following notable features. (1) In our proposed scheme, the dialogue is encoded in a probabilistic way, i.e., the same messages can be encoded into different quantum states, whereas in the state-of-the-art authenticated quantum dialogue (AQD), the dialogue is encoded in a deterministic way; (2) the pre-shared secret key between two communicants can be reused without any security loophole; (3) each dialogue in the proposed PAQD can be exchanged within only one-step quantum communication and one-step classical communication. However, in the state-of-the-art AQD protocols, both communicants have to run a QKD protocol for each dialogue and each dialogue requires multiple quantum as well as classical communicational steps; (4) nevertheless, the proposed scheme can resist the man-in-the-middle attack, the modification attack, and even other well-known attacks.
Probabilistic Event Categorization
Wiebe, J; Duan, L; Wiebe, Janyce; Bruce, Rebecca; Duan, Lei
1997-01-01
This paper describes the automation of a new text categorization task. The categories assigned in this task are more syntactically, semantically, and contextually complex than those typically assigned by fully automatic systems that process unseen test data. Our system for assigning these categories is a probabilistic classifier, developed with a recent method for formulating a probabilistic model from a predefined set of potential features. This paper focuses on feature selection. It presents a number of fully automatic features. It identifies and evaluates various approaches to organizing collocational properties into features, and presents the results of experiments covarying type of organization and type of property. We find that one organization is not best for all kinds of properties, so this is an experimental parameter worth investigating in NLP systems. In addition, the results suggest a way to take advantage of properties that are low frequency but strongly indicative of a class. The problems of rec...
Probabilistic approaches to recommendations
Barbieri, Nicola; Ritacco, Ettore
2014-01-01
The importance of accurate recommender systems has been widely recognized by academia and industry, and recommendation is rapidly becoming one of the most successful applications of data mining and machine learning. Understanding and predicting the choices and preferences of users is a challenging task: real-world scenarios involve users behaving in complex situations, where prior beliefs, specific tendencies, and reciprocal influences jointly contribute to determining the preferences of users toward huge amounts of information, services, and products. Probabilistic modeling represents a robus
Geothermal probabilistic cost study
Energy Technology Data Exchange (ETDEWEB)
Orren, L.H.; Ziman, G.M.; Jones, S.C.; Lee, T.K.; Noll, R.; Wilde, L.; Sadanand, V.
1981-08-01
A tool is presented to quantify the risks of geothermal projects, the Geothermal Probabilistic Cost Model (GPCM). The GPCM model is used to evaluate a geothermal reservoir for a binary-cycle electric plant at Heber, California. Three institutional aspects of the geothermal risk which can shift the risk among different agents are analyzed. The leasing of geothermal land, contracting between the producer and the user of the geothermal heat, and insurance against faulty performance are examined. (MHR)
On probabilistic Mandelbrot maps
Energy Technology Data Exchange (ETDEWEB)
Andreadis, Ioannis [International School of The Hague, Wijndaelerduin 1, 2554 BX The Hague (Netherlands)], E-mail: i.andreadis@ish-rijnlandslyceum.nl; Karakasidis, Theodoros E. [Department of Civil Engineering, University of Thessaly, GR-38334 Volos (Greece)], E-mail: thkarak@uth.gr
2009-11-15
In this work, we propose a definition for a probabilistic Mandelbrot map in order to extend and support the study initiated by Argyris et al. [Argyris J, Andreadis I, Karakasidis Th. On perturbations of the Mandelbrot map. Chaos, Solitons and Fractals 2000;11:1131-1136.] with regard to the numerical stability of the Mandelbrot and Julia set of the Mandelbrot map when subjected to noise.
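The perturbation idea behind such probabilistic maps can be sketched concretely: add a small random term at each iteration and estimate the escape probability of a point over repeated trials. The uniform noise model, amplitude, and trial counts below are illustrative assumptions, not the construction from the cited paper.

```python
import random

# Noise-perturbed Mandelbrot iteration: z <- z^2 + c + noise, with the
# escape probability of c estimated over repeated randomized trials.

def escapes(c, eps, rng, max_iter=50):
    z = 0j
    for _ in range(max_iter):
        noise = complex(rng.uniform(-eps, eps), rng.uniform(-eps, eps))
        z = z * z + c + noise
        if abs(z) > 2.0:
            return True
    return False

def escape_probability(c, eps, trials=200):
    rng = random.Random(42)   # fixed seed for reproducibility
    return sum(escapes(c, eps, rng) for _ in range(trials)) / trials

# c = 2 lies far outside the Mandelbrot set, so small noise cannot keep it
# bounded; c = 0 stays bounded for noise amplitudes this small.
p_out = escape_probability(2.0 + 0j, eps=0.01)
p_in = escape_probability(0j, eps=0.01)
```

Points near the set boundary, by contrast, yield escape probabilities strictly between 0 and 1, which is where the numerical-stability questions studied by Argyris et al. arise.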
Amplification of flood frequencies with local sea level rise and emerging flood regimes
Buchanan, Maya K.; Oppenheimer, Michael; Kopp, Robert E.
2017-06-01
The amplification of flood frequencies by sea level rise (SLR) is expected to become one of the most economically damaging impacts of climate change for many coastal locations. Understanding the magnitude and pattern by which the frequency of current flood levels increase is important for developing more resilient coastal settlements, particularly since flood risk management (e.g. infrastructure, insurance, communications) is often tied to estimates of flood return periods. The Intergovernmental Panel on Climate Change’s Fifth Assessment Report characterized the multiplication factor by which the frequency of flooding of a given height increases (referred to here as an amplification factor; AF). However, this characterization neither rigorously considered uncertainty in SLR nor distinguished between the amplification of different flooding levels (such as the 10% versus 0.2% annual chance floods); therefore, it may be seriously misleading. Because both historical flood frequency and projected SLR are uncertain, we combine joint probability distributions of the two to calculate AFs and their uncertainties over time. Under probabilistic relative sea level projections, while maintaining storm frequency fixed, we estimate a median 40-fold increase (ranging from 1- to 1314-fold) in the expected annual number of local 100-year floods for tide-gauge locations along the contiguous US coastline by 2050. While some places can expect disproportionate amplification of higher frequency events and thus primarily a greater number of historically precedented floods, others face amplification of lower frequency events and thus a particularly fast growing risk of historically unprecedented flooding. For example, with 50 cm of SLR, the 10%, 1%, and 0.2% annual chance floods are expected respectively to recur 108, 335, and 814 times as often in Seattle, but 148, 16, and 4 times as often in Charleston, SC.
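The amplification-factor calculation can be sketched under a simplifying assumption: if flood exceedances follow a Gumbel (exponential-tail) law, a rise s multiplies the exceedance frequency of any level by exp(s/beta), and the expected AF integrates this over the SLR distribution. The scale parameter and the discrete SLR projection below are hypothetical, not values from the paper.

```python
import math

# Amplification factor under an assumed exponential exceedance tail:
# the expected annual number of floods above height z is
# N(z) = exp(-(z - mu) / beta), so sea-level rise s gives
# AF(s) = N(z - s) / N(z) = exp(s / beta), independent of z.

def amplification_factor(slr, beta):
    return math.exp(slr / beta)

def expected_af(slr_dist, beta):
    """slr_dist: list of (sea-level rise in m, probability) pairs."""
    return sum(p * amplification_factor(s, beta) for s, p in slr_dist)

beta = 0.15                                   # Gumbel scale of the gauge (m)
slr_2050 = [(0.15, 0.25), (0.25, 0.50), (0.40, 0.25)]   # assumed projection
af = expected_af(slr_2050, beta)              # expected frequency multiplier
```

Because AF is convex in s, the expectation over an uncertain SLR distribution exceeds the AF of the median rise, which is one reason ignoring SLR uncertainty understates the amplification.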
Probabilistic Tsunami Hazard Analysis
Thio, H. K.; Ichinose, G. A.; Somerville, P. G.; Polet, J.
2006-12-01
The recent tsunami disaster caused by the 2004 Sumatra-Andaman earthquake has focused our attention to the hazard posed by large earthquakes that occur under water, in particular subduction zone earthquakes, and the tsunamis that they generate. Even though these kinds of events are rare, the very large loss of life and material destruction caused by this earthquake warrant a significant effort towards the mitigation of the tsunami hazard. For ground motion hazard, Probabilistic Seismic Hazard Analysis (PSHA) has become a standard practice in the evaluation and mitigation of seismic hazard to populations in particular with respect to structures, infrastructure and lifelines. Its ability to condense the complexities and variability of seismic activity into a manageable set of parameters greatly facilitates the design of effective seismic resistant buildings but also the planning of infrastructure projects. Probabilistic Tsunami Hazard Analysis (PTHA) achieves the same goal for hazards posed by tsunami. There are great advantages of implementing such a method to evaluate the total risk (seismic and tsunami) to coastal communities. The method that we have developed is based on the traditional PSHA and therefore completely consistent with standard seismic practice. Because of the strong dependence of tsunami wave heights on bathymetry, we use a full waveform tsunami waveform computation in lieu of attenuation relations that are common in PSHA. By pre-computing and storing the tsunami waveforms at points along the coast generated for sets of subfaults that comprise larger earthquake faults, we can efficiently synthesize tsunami waveforms for any slip distribution on those faults by summing the individual subfault tsunami waveforms (weighted by their slip). This efficiency make it feasible to use Green's function summation in lieu of attenuation relations to provide very accurate estimates of tsunami height for probabilistic calculations, where one typically computes
Operational Satellite Based Flood Mapping Using the Delft-FEWS System
Westerhoff, Rogier; Huizinga, Jan; Kleuskens, Marco; Burren, Richard; Casey, Simon
2010-12-01
Reliable and timely information is essential for appropriate flood management. This article describes a probabilistic method to assess flood extent from SAR data. It also addresses the derivation of flood levels and flood depth based on probabilistic flood extents and SRTM data. The methods are tested on Envisat ASAR images in a hydrological open-standard IT platform (Delft-FEWS). Providing flood extent maps in terms of probabilities using multiple-angle data offers advantages for operational purposes, such as a major improvement of revisit time from 35 days to 1-2 days, weighted merging of various data sources (in-situ, optical and SAR) and uncertainty propagation in models. Using medium- or high-resolution SAR data instead of 1x1 km pixels, and a high-resolution digital terrain model instead of SRTM data, are important recommendations.
Probabilistic Modeling of Timber Structures
DEFF Research Database (Denmark)
Köhler, J.D.; Sørensen, John Dalsgaard; Faber, Michael Havbro
2005-01-01
The present paper contains a proposal for the probabilistic modeling of timber material properties. It is produced in the context of the Probabilistic Model Code (PMC) of the Joint Committee on Structural Safety (JCSS) and of the COST action E24 'Reliability of Timber Structures'. A probabilistic model for the basic material properties is presented, and possible refinements are given related to updating of the probabilistic model given new information, modeling of the spatial variation of strength properties, and the duration of load effects.
Skakun, Sergii; Kussul, Nataliia; Shelestov, Andrii; Kussul, Olga
2014-08-01
In this article, the use of time series of satellite imagery for flood hazard mapping and flood risk assessment is presented. Flooded areas are extracted from satellite images of the flood-prone territory, and a maximum flood extent image for each flood event is produced. These maps are then fused to determine the relative frequency of inundation (RFI). The study shows that RFI values and relative water depth exhibit the same probabilistic distribution, which is confirmed by a Kolmogorov-Smirnov test. The produced RFI map can be used as a flood hazard map, especially in cases when flood modeling is complicated by a lack of available data and high uncertainties. The derived RFI map is further used for flood risk assessment. The efficiency of the presented approach is demonstrated for the Katima Mulilo region (Namibia). A time series of Landsat-5/7 satellite images acquired from 1989 to 2012 is processed to derive the RFI map using the presented approach. The following direct damage categories are considered in the study for flood risk assessment: dwelling units, roads, health facilities, and schools. The produced flood risk map shows that the risk is distributed uniformly all over the region. The cities and villages with the highest risk are identified. The proposed approach has minimal data requirements, and RFI maps can be generated rapidly to assist rescuers and decision-makers in case of emergencies. On the other hand, limitations include strong dependence on the available data sets and limitations in simulations with extrapolated water depth values.
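The RFI computation itself is straightforward: per pixel, the fraction of maximum-flood-extent maps in which that pixel was inundated. A minimal sketch on synthetic 3x3 binary grids (placeholders, not the Landsat-derived maps of the study):

```python
# Relative frequency of inundation (RFI): per-pixel fraction of flood
# events in which the pixel was flooded, over a stack of binary maps.

def rfi(flood_maps):
    """flood_maps: list of equally sized 2D binary grids (1 = flooded)."""
    n = len(flood_maps)
    rows, cols = len(flood_maps[0]), len(flood_maps[0][0])
    return [[sum(m[r][c] for m in flood_maps) / n for c in range(cols)]
            for r in range(rows)]

maps = [
    [[1, 1, 0], [0, 1, 0], [0, 0, 0]],   # event 1
    [[1, 1, 1], [0, 1, 0], [0, 0, 0]],   # event 2
    [[1, 0, 0], [0, 1, 0], [0, 1, 0]],   # event 3
    [[1, 1, 0], [0, 0, 0], [0, 0, 0]],   # event 4
]
rfi_map = rfi(maps)   # rfi_map[0][0] is 1.0: flooded in every event
```

Pixels with high RFI are treated as high-hazard zones even when no hydraulic model of the reach is available.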
A surface water flooding impact library for flood risk assessment
Directory of Open Access Journals (Sweden)
Aldridge Timothy
2016-01-01
The growing demand for improved risk-based Surface Water Flooding (SWF) warning systems is evident in EU directives and in the UK Government's Pitt Review of the 2007 summer floods. This paper presents a novel approach for collating receptor and vulnerability datasets via the concept of an Impact Library, developed by the Health and Safety Laboratory as a depository of pre-calculated impact information on SWF risk for use in a real-time SWF Hazard Impact Model (HIM). This has potential benefits for the Flood Forecasting Centre (FFC) as the organisation responsible for issuing flood guidance information for England and Wales. The SWF HIM takes a pixel-based approach to link probabilistic surface water runoff forecasts produced by CEH's Grid-to-Grid hydrological model with Impact Library information to generate impact assessments. These are combined to estimate flood risk as a combination of impact severity and forecast likelihood, at 1 km pixel level, and summarised for counties and local authorities. The SWF HIM takes advantage of recent advances in operational ensemble forecasting of rainfall by the Met Office and of SWF by the Environment Agency and CEH, working together through the FFC. Results are presented for a case study event which affected the North East of England during 2012. The work has been developed through the UK's Natural Hazards Partnership (NHP), a group of organisations gathered to provide information, research and analysis on natural hazards for civil contingencies, government and responders across the UK.
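Combining impact severity with forecast likelihood into a risk level is commonly done with a lookup matrix. The sketch below shows the idea only; the 4x4 matrix and category names are illustrative assumptions, not the FFC's actual guidance table.

```python
# Risk as a combination of impact severity and forecast likelihood,
# via an illustrative (hypothetical) risk matrix lookup.

SEVERITY = ["minimal", "minor", "significant", "severe"]
LIKELIHOOD = ["very low", "low", "medium", "high"]

RISK_MATRIX = [
    # very low    low         medium      high
    ["very low", "very low", "very low", "low"],      # minimal impact
    ["very low", "low",      "low",      "medium"],   # minor impact
    ["low",      "medium",   "medium",   "high"],     # significant impact
    ["low",      "medium",   "high",     "high"],     # severe impact
]

def risk_level(severity, likelihood):
    return RISK_MATRIX[SEVERITY.index(severity)][LIKELIHOOD.index(likelihood)]

level = risk_level("significant", "medium")
```

Evaluating this lookup per 1 km pixel against the pre-calculated Impact Library is what makes the real-time assessment cheap enough for operational use.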
Staged decision making based on probabilistic forecasting
Booister, Nikéh; Verkade, Jan; Werner, Micha; Cranston, Michael; Cumiskey, Lydia; Zevenbergen, Chris
2016-04-01
Flood forecasting systems reduce, but cannot eliminate, uncertainty about the future. Probabilistic forecasts explicitly show that uncertainty remains. However, compared to deterministic forecasts, a dimension is added ('probability' or 'likelihood'), and this added dimension makes decision making slightly more complicated. One technique of decision support is the cost-loss approach, which determines whether or not to issue a warning or implement mitigation measures (a risk-based method). With the cost-loss method, a warning is issued when the ratio of the response costs to the damage reduction is less than or equal to the probability of the possible flood event. This cost-loss method is not widely used, because it motivates decisions based only on economic values and is relatively static (a yes/no decision without further reasoning). Nevertheless, it has high potential to improve risk-based decision making based on probabilistic flood forecasting, because no other methods are known that deal with probabilities in decision making. The main aim of this research was to explore ways of making decision making based on probabilities with the cost-loss method better applicable in practice. The exploration began by identifying other situations in which decisions are taken based on uncertain forecasts or predictions. These cases spanned a range of degrees of uncertainty, from known uncertainty to deep uncertainty. Based on the types of uncertainties, concepts for dealing with such situations were analysed and applicable concepts were chosen. From this analysis, the concepts of flexibility and robustness appeared to fit the existing method. Instead of taking big decisions with bigger consequences at once, the idea is that actions and decisions are cut up into smaller pieces, and the decision to implement is finally made based on the economic costs of decisions and measures and the reduced effect of flooding. The more lead-time there is in
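The cost-loss rule stated above reduces to a one-line trigger: warn when the ratio of response cost C to avoidable loss L does not exceed the forecast probability p. A minimal sketch with hypothetical numbers:

```python
# Cost-loss decision rule: issue a warning iff C / L <= p, where C is the
# cost of responding, L the damage the response would avoid, and p the
# forecast probability of the flood event. Values are illustrative.

def issue_warning(cost, avoidable_loss, flood_probability):
    """Risk-based warning trigger from the cost-loss method."""
    return cost / avoidable_loss <= flood_probability

# Response measures cost 50k and would prevent 400k of damage, so the
# warning threshold is C/L = 0.125; a 20% flood probability triggers it.
decide = issue_warning(50_000, 400_000, 0.20)
```

Staging the decision, as the abstract proposes, amounts to applying this rule repeatedly to smaller actions as lead time shrinks and p sharpens, rather than once to a single large measure.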
Storing and Querying Probabilistic XML Using a Probabilistic Relational DBMS
Hollander, E.S.; Keulen, van M.
2010-01-01
This work explores the feasibility of storing and querying probabilistic XML in a probabilistic relational database. Our approach is to adapt known techniques for mapping XML to relational data such that the possible worlds are preserved. We show that this approach can work for any XML-to-relational
Quantum probability for probabilists
Meyer, Paul-André
1993-01-01
In recent years, the classical theory of stochastic integration and stochastic differential equations has been extended to a non-commutative set-up to develop models for quantum noises. The author, a specialist in classical stochastic calculus and martingale theory, tries to provide an introduction to this rapidly expanding field in a way which should be accessible to probabilists familiar with the Ito integral. It can also, on the other hand, provide a means of access to the methods of stochastic calculus for physicists familiar with Fock space analysis.
Learning Probabilistic Decision Graphs
DEFF Research Database (Denmark)
Jaeger, Manfred; Dalgaard, Jens; Silander, Tomi
2004-01-01
Probabilistic decision graphs (PDGs) are a representation language for probability distributions based on binary decision diagrams. PDGs can encode (context-specific) independence relations that cannot be captured in a Bayesian network structure, and can sometimes provide computationally more efficient representations than Bayesian networks. In this paper we present an algorithm for learning PDGs from data. First experiments show that the algorithm is capable of learning optimal PDG representations in some cases, and that the computational efficiency of PDG models learned from real-life data...
Machine Learning Predictions of Flash Floods
Clark, R. A., III; Flamig, Z.; Gourley, J. J.; Hong, Y.
2016-12-01
This study concerns the development, assessment, and use of machine learning (ML) algorithms to automatically generate predictions of flash floods around the world from numerical weather prediction (NWP) output. Using an archive of NWP outputs from the Global Forecast System (GFS) model and a historical archive of reports of flash floods across the U.S. and Europe, we developed a set of ML models that output forecasts of the probability of a flash flood given a certain set of atmospheric conditions. Using these ML models, real-time global flash flood predictions from NWP data have been generated in research mode since February 2016. These ML models provide information about which atmospheric variables are most important in the flash flood prediction process. The raw ML predictions can be calibrated against historical events to generate reliable flash flood probabilities. The automatic system was tested in a research-to-operations testbed environment with National Weather Service forecasters. The ML models are quite successful at incorporating large amounts of information in a computationally efficient manner and result in reasonably skillful predictions. The system is largely successful at identifying flash floods resulting from synoptically forced events, but struggles with isolated flash floods that arise from weather systems largely unresolvable by the coarse resolution of a global NWP system. The results from this collection of studies suggest that automatic probabilistic predictions of flash floods are a plausible way forward in operational forecasting, but future research could focus on applying these methods to finer-scale NWP guidance, to NWP ensembles, and to forecast lead times beyond 24 hours.
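The calibration step mentioned above can be sketched with a simple reliability-binning approach: raw ML probabilities are mapped to the flash-flood frequency actually observed in each probability bin. This is one standard calibration technique, not necessarily the one the authors used, and the forecast/observation pairs below are synthetic.

```python
# Reliability-binning calibration: map raw model probabilities to the
# observed event frequency within each probability bin.

def fit_calibration(raw_probs, outcomes, n_bins=5):
    """Return the observed flash-flood frequency per probability bin."""
    bins = [[0, 0] for _ in range(n_bins)]        # [events, forecasts]
    for p, y in zip(raw_probs, outcomes):
        i = min(int(p * n_bins), n_bins - 1)
        bins[i][0] += y
        bins[i][1] += 1
    return [e / n if n else None for e, n in bins]

def calibrate(p, freq_by_bin, n_bins=5):
    i = min(int(p * n_bins), n_bins - 1)
    return freq_by_bin[i]

raw = [0.05, 0.15, 0.15, 0.45, 0.55, 0.85, 0.95, 0.90]   # raw ML output
obs = [0,    0,    1,    0,    1,    1,    1,    0]       # event observed?
freq = fit_calibration(raw, obs)
p_cal = calibrate(0.90, freq)    # observed frequency in the top bin
```

With enough historical cases per bin, the calibrated output is reliable by construction: a forecast of 0.6 then corresponds to events that verified about 60% of the time.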
A General Framework for Probabilistic Characterizing Formulae
DEFF Research Database (Denmark)
Sack, Joshua; Zhang, Lijun
2012-01-01
Recently, a general framework on characteristic formulae was proposed by Aceto et al. It offers a simple theory that allows one to easily obtain characteristic formulae of many non-probabilistic behavioral relations. Our paper studies their techniques in a probabilistic setting. We provide a general method for determining characteristic formulae of behavioral relations for probabilistic automata using fixed-point probability logics. We consider such behavioral relations as simulations and bisimulations, probabilistic bisimulations, probabilistic weak simulations, and probabilistic forward...
Probabilistic population aging
2017-01-01
We merge two methodologies, prospective measures of population aging and probabilistic population forecasts. We compare the speed of change and variability in forecasts of the old age dependency ratio and the prospective old age dependency ratio as well as the same comparison for the median age and the prospective median age. While conventional measures of population aging are computed on the basis of the number of years people have already lived, prospective measures are computed also taking account of the expected number of years they have left to live. Those remaining life expectancies change over time and differ from place to place. We compare the probabilistic distributions of the conventional and prospective measures using examples from China, Germany, Iran, and the United States. The changes over time and the variability of the prospective indicators are smaller than those that are observed in the conventional ones. A wide variety of new results emerge from the combination of methodologies. For example, for Germany, Iran, and the United States the likelihood that the prospective median age of the population in 2098 will be lower than it is today is close to 100 percent. PMID:28636675
Quantum probabilistic logic programming
Balu, Radhakrishnan
2015-05-01
We describe a quantum mechanics based logic programming language that supports Horn clauses, random variables, and covariance matrices to express and solve problems in probabilistic logic. The Horn clauses of the language wrap random variables, including infinite valued, to express probability distributions and statistical correlations, a powerful feature to capture relationships between distributions that are not independent. The expressive power of the language is based on a mechanism to implement statistical ensembles and to solve the underlying SAT instances using quantum mechanical machinery. We exploit the fact that classical random variables have quantum decompositions to build the Horn clauses. We establish the semantics of the language in a rigorous fashion by considering an existing probabilistic logic language called PRISM with classical probability measures defined on the Herbrand base and extending it to the quantum context. In the classical case H-interpretations form the sample space and probability measures defined on them lead to a consistent definition of probabilities for well-formed formulae. In the quantum counterpart, we define probability amplitudes on H-interpretations, facilitating model generation and verification via quantum mechanical superpositions and entanglements. We cast the well-formed formulae of the language as quantum mechanical observables, thus providing an elegant interpretation for their probabilities. We discuss several examples to combine statistical ensembles and predicates of first order logic to reason with situations involving uncertainty.
Passage Retrieval: A Probabilistic Technique.
Melucci, Massimo
1998-01-01
Presents a probabilistic technique to retrieve passages from texts having a large size or heterogeneous semantic content. Results of experiments comparing the probabilistic technique to one based on a text segmentation algorithm revealed that the passage size affects passage retrieval performance; text organization and query generality may have an…
Probabilistic modeling of timber structures
DEFF Research Database (Denmark)
Köhler, Jochen; Sørensen, John Dalsgaard; Faber, Michael Havbro
2007-01-01
The present paper contains a proposal for the probabilistic modeling of timber material properties. It is produced in the context of the Probabilistic Model Code (PMC) of the Joint Committee on Structural Safety (JCSS) [Joint Committee of Structural Safety. Probabilistic Model Code, Internet Publ. ...] and comments from participants of the COST E24 action and the members of the JCSS. The paper contains a description of the basic reference properties for timber strength parameters and ultimate limit state equations for timber components. The recommended probabilistic model for these basic properties is presented and possible refinements are given related to updating of the probabilistic model given new information, modeling of the spatial variation of strength properties and the duration of load effects.
Rollason, Edward; Bracken, Louise; Hardy, Richard; Large, Andy
2017-04-01
find confusing or lacking in realistic grounding. This means users do not have information they find useful to make informed decisions about how to prepare for and respond to floods. Working together with at-risk participants, the research has developed new approaches for communicating flood risk. These approaches focus on understanding flood mechanisms and dynamics, to help participants imagine their flood risk and link potential scenarios to reality, and provide forecasts of predicted flooding at a variety of scales, allowing participants to assess the significance of predicted flooding and make more informed judgments on what action to take in response. The findings presented have significant implications for the way in which flood risk is communicated, changing the focus of mapping from probabilistic future scenarios to understanding flood dynamics and mechanisms. Such ways of communicating flood risk embrace how people would like to see risk communicated, and help those at risk grow their resilience. Communicating in such a way has wider implications for flood modelling and data collection. However, these represent potential opportunities to build more effective local partnerships for assessing and managing flood risks.
Garavaglia, F.; Paquet, E.; Lang, M.; Renard, B.; Arnaud, P.; Aubert, Y.; Carre, J.
2013-12-01
In flood risk assessment the methods can be divided into two families: deterministic methods and probabilistic methods. In the French hydrologic community the probabilistic methods have historically been preferred to the deterministic ones. Presently a French research project named EXTRAFLO (RiskNat Program of the French National Research Agency, https://extraflo.cemagref.fr) deals with the design values for extreme rainfall and floods. The objective of this project is to carry out a comparison of the main methods used in France for estimating extreme values of rainfall and floods, to obtain a better grasp of their respective fields of application. In this framework we present the results of Task 7 of the EXTRAFLO project. Focusing on French watersheds, we compare the main extreme flood estimation methods used in the French context: (i) standard flood frequency analysis (Gumbel and GEV distribution), (ii) regional flood frequency analysis (regional Gumbel and GEV distribution), (iii) local and regional flood frequency analysis improved by historical information (Naulet et al., 2005), (iv) simplified probabilistic methods based on rainfall information (i.e. the Gradex method (CFGB, 1994), the Agregee method (Margoum, 1992) and the Speed method (Cayla, 1995)), (v) flood frequency analysis by a continuous simulation approach based on rainfall information (i.e. the Schadex method (Paquet et al., 2013, Garavaglia et al., 2010) and the Shyreg method (Lavabre et al., 2003)) and (vi) a multifractal approach. The main result of this comparative study is that probabilistic methods based on additional information (i.e. regional, historical and rainfall information) provide better estimations than standard flood frequency analysis. Another interesting result is that the differences between the various extreme flood quantile estimations of the compared methods increase with return period, staying relatively moderate up to 100-year return levels. Results and discussions are illustrated throughout with the example
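As a minimal illustration of the standard flood frequency analysis the study takes as its baseline, a Gumbel distribution can be fitted to annual maximum discharges and used to estimate return levels. The method-of-moments fit and the sample series below are illustrative, not data from the study:

```python
import math

def gumbel_fit_moments(sample):
    """Method-of-moments Gumbel fit: scale beta = s * sqrt(6) / pi,
    location mu = mean - gamma * beta (gamma = Euler-Mascheroni)."""
    n = len(sample)
    mean = sum(sample) / n
    var = sum((x - mean) ** 2 for x in sample) / (n - 1)
    beta = math.sqrt(6.0 * var) / math.pi
    mu = mean - 0.5772156649 * beta
    return mu, beta

def gumbel_return_level(mu, beta, T):
    """Discharge exceeded on average once every T years:
    x_T = mu - beta * ln(-ln(1 - 1/T))."""
    return mu - beta * math.log(-math.log(1.0 - 1.0 / T))

# Invented annual maximum discharges (m^3/s).
annual_maxima = [310, 280, 450, 390, 520, 300, 610, 340, 420, 480]
mu, beta = gumbel_fit_moments(annual_maxima)
q100 = gumbel_return_level(mu, beta, 100)  # 100-year return level
```

Regional, historical and rainfall-based methods then refine exactly this kind of quantile estimate with additional information.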
Energy-Efficient Probabilistic Routing Algorithm for Internet of Things
Directory of Open Access Journals (Sweden)
Sang-Hyun Park
2014-01-01
Full Text Available In the future network with Internet of Things (IoT), each of the things communicates with the others and acquires information by itself. In distributed networks for IoT, the energy efficiency of the nodes is a key factor in the network performance. In this paper, we propose an energy-efficient probabilistic routing (EEPR) algorithm, which controls the transmission of the routing request packets stochastically in order to increase the network lifetime and decrease the packet loss under the flooding algorithm. The proposed EEPR algorithm adopts energy-efficient probabilistic control by simultaneously using the residual energy of each node and the ETX metric in the context of the typical AODV protocol. In the simulations, we verify that the proposed algorithm has a longer network lifetime and consumes the residual energy of each node more evenly when compared with the typical AODV protocol.
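A sketch of the kind of probabilistic forwarding control described, combining residual energy with the ETX link metric, might look as follows. The weighting scheme and the ETX cap are assumptions for illustration, not the paper's exact formula:

```python
def forwarding_probability(residual_energy, initial_energy, etx,
                           etx_max=10.0, w_energy=0.5):
    """Probability of rebroadcasting a route-request packet, combining
    normalised residual energy with an inverted, capped ETX metric
    (good links -> high ETX score -> more likely to forward)."""
    e = max(0.0, min(1.0, residual_energy / initial_energy))
    q = max(0.0, 1.0 - min(etx, etx_max) / etx_max)
    return w_energy * e + (1.0 - w_energy) * q

# A node would then forward a RREQ when
#   random.random() < forwarding_probability(...)
# so depleted nodes and poor links are pruned from route discovery.
p = forwarding_probability(residual_energy=0.8, initial_energy=1.0, etx=2.0)
```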
Probabilistic quantum multimeters
Fiurasek, J; Fiurasek, Jaromir; Dusek, Miloslav
2004-01-01
We propose quantum devices that can realize probabilistically different projective measurements on a qubit. The desired measurement basis is selected by the quantum state of a program register. First we analyze the phase-covariant multimeters for a large class of program states, then the universal multimeters for a special choice of program. In both cases we start with deterministic but erroneous devices and then proceed to devices that never make a mistake but from time to time they give an inconclusive result. These multimeters are optimized (for a given type of a program) with respect to the minimum probability of inconclusive result. This concept is further generalized to the multimeters that minimize the error rate for a given probability of an inconclusive result (or vice versa). Finally, we propose a generalization for qudits.
Probabilistic retinal vessel segmentation
Wu, Chang-Hua; Agam, Gady
2007-03-01
Optic fundus assessment is widely used for diagnosing vascular and non-vascular pathology. Inspection of the retinal vasculature may reveal hypertension, diabetes, arteriosclerosis, cardiovascular disease and stroke. Due to various imaging conditions retinal images may be degraded. Consequently, the enhancement of such images and vessels in them is an important task with direct clinical applications. We propose a novel technique for vessel enhancement in retinal images that is capable of enhancing vessel junctions in addition to linear vessel segments. This is an extension of vessel filters we have previously developed for vessel enhancement in thoracic CT scans. The proposed approach is based on probabilistic models which can discern vessels and junctions. Evaluation shows the proposed filter is better than several known techniques and is comparable to the state of the art when evaluated on a standard dataset. A ridge-based vessel tracking process is applied on the enhanced image to demonstrate the effectiveness of the enhancement filter.
Directory of Open Access Journals (Sweden)
Amabile Alessia
2016-01-01
Full Text Available Flooding is a worldwide phenomenon. Over the last few decades the world has experienced a rising number of devastating flood events and the trend in such natural disasters is increasing. Furthermore, escalations in both the probability and magnitude of flood hazards are expected as a result of climate change. Flood defence embankments are one of the major flood defence measures and reliability assessment for these structures is therefore a very important process. Routine hydro-mechanical models for the stability of flood embankments are based on the assumptions of steady-state through-flow and zero pore-pressures above the phreatic surface, i.e. negative capillary pressure (suction) is ignored. Despite common belief, these assumptions may not always lead to conservative design. In addition, hydraulic loading is stochastic in nature and flood embankment stability should therefore be assessed in probabilistic terms. This cannot be accommodated by steady-state flow models. The paper presents an approach for reliability analysis of flood embankments, taking into account the transient water through-flow. The factor of safety of the embankment is assessed in probabilistic terms based on a stochastic distribution for the hydraulic loading. Two different probabilistic approaches are tested to compare and validate the results.
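Assessing the factor of safety in probabilistic terms, as proposed, amounts to estimating the failure probability under a stochastic hydraulic load. A minimal Monte Carlo sketch, with an invented linear strength model and an invented Gumbel flood-level distribution standing in for the paper's transient-flow analysis:

```python
import math
import random

def failure_probability(fs_model, n_trials=100_000, seed=1):
    """Monte Carlo estimate of P(FS < 1), given a function that returns
    the factor of safety for one random realisation of the load."""
    random.seed(seed)
    failures = sum(1 for _ in range(n_trials) if fs_model() < 1.0)
    return failures / n_trials

def fs_one_realisation():
    """Illustrative stand-in: flood level ~ Gumbel(mu=2 m, beta=0.5 m)
    via inverse-transform sampling, FS falling linearly with level.
    All coefficients are invented for the sketch."""
    u = random.random()
    level = 2.0 - 0.5 * math.log(-math.log(u))
    return 1.8 - 0.25 * level

pf = failure_probability(fs_one_realisation)  # annual failure probability
```

A transient-flow model would replace `fs_one_realisation` with a full seepage and slope-stability computation per realisation.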
Why is Probabilistic Seismic Hazard Analysis (PSHA) still used?
Mulargia, Francesco; Stark, Philip B.; Geller, Robert J.
2017-03-01
Even though it has never been validated by objective testing, Probabilistic Seismic Hazard Analysis (PSHA) has been widely used for almost 50 years by governments and industry in applications with lives and property hanging in the balance, such as deciding safety criteria for nuclear power plants, making official national hazard maps, developing building code requirements, and determining earthquake insurance rates. PSHA rests on assumptions now known to conflict with earthquake physics; many damaging earthquakes, including the 1988 Spitak, Armenia, event and the 2011 Tohoku, Japan, event, have occurred in regions rated relatively low-risk by PSHA hazard maps. No extant method, including PSHA, produces reliable estimates of seismic hazard. Earthquake hazard mitigation should be recognized to be inherently political, involving a tradeoff between uncertain costs and uncertain risks. Earthquake scientists, engineers, and risk managers can make important contributions to the hard problem of allocating limited resources wisely, but government officials and stakeholders must take responsibility for the risks of accidents due to natural events that exceed the adopted safety criteria. "Without an analysis of the physical causes of recorded floods, and of the whole geophysical, biophysical and anthropogenic context which circumscribes the potential for flood formation, results of flood frequency analysis as [now practiced], rather than providing information useful for coping with the flood hazard, themselves represent an additional hazard that can contribute to damages caused by floods. This danger is very real since decisions made on the basis of wrong numbers presented as good estimates of flood probabilities will generally be worse than decisions made with an awareness of an impossibility to make a good estimate and with the aid of merely qualitative information on the general flooding potential."
Unexpected flood loss correlations across Europe
Booth, Naomi; Boyd, Jessica
2017-04-01
Floods don't observe country borders, as highlighted by major events across Europe that resulted in heavy economic and insured losses in 1999, 2002, 2009 and 2013. Flood loss correlations between some countries occur along multi-country river systems or between neighbouring nations affected by the same weather systems. However, correlations are not so obvious and whilst flooding in multiple locations across Europe may appear independent, for a re/insurer providing cover across the continent, these unexpected correlations can lead to high loss accumulations. A consistent, continental-scale method that allows quantification and comparison of losses, and identifies correlations in loss between European countries is therefore essential. A probabilistic model for European river flooding was developed that allows estimation of potential losses to pan-European property portfolios. By combining flood hazard and exposure information in a catastrophe modelling platform, we can consider correlations between river basins across Europe rather than being restricted to country boundaries. A key feature of the model is its statistical event set based on extreme value theory. Using historical river flow data, the event set captures spatial and temporal patterns of flooding across Europe and simulates thousands of events representing a full range of possible scenarios. Some known correlations were identified, such as between neighbouring Belgium and Luxembourg where 28% of events that affect either country produce a loss in both. However, our model identified some unexpected correlations including between Austria and Poland, and Poland and France, which are geographically distant. These correlations in flood loss may be missed by traditional methods and are key for re/insurers with risks in multiple countries. The model also identified that 46% of European river flood events affect more than one country. For more extreme events with a return period higher than 200 years, all events
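The joint-loss statistics quoted (such as the 28% Belgium/Luxembourg figure, or the 46% share of multi-country events) can be computed directly from a simulated event-loss table. A toy sketch with an invented table and country set:

```python
import numpy as np

# Hypothetical event-loss table: rows are simulated flood events,
# columns are per-country losses (names and numbers are invented).
countries = ["BE", "LU", "AT", "PL"]
losses = np.array([
    # BE    LU    AT    PL
    [1.0,  0.4,  0.0,  0.0],
    [0.0,  0.0,  2.0,  0.5],
    [0.3,  0.1,  0.0,  0.0],
    [0.0,  0.0,  0.0,  1.2],
    [0.8,  0.0,  1.5,  0.9],
])

def joint_loss_share(losses, i, j):
    """Of the events that hit country i or country j, the share
    that produces a loss in both."""
    hit_i, hit_j = losses[:, i] > 0, losses[:, j] > 0
    either = (hit_i | hit_j).sum()
    return (hit_i & hit_j).sum() / either if either else 0.0

be_lu = joint_loss_share(losses, 0, 1)
multi = ((losses > 0).sum(axis=1) > 1).mean()  # multi-country share
```

With a statistical event set of thousands of events, the same two lines surface the non-obvious country pairs.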
Probabilistic Decision Graphs - Combining Verification and AI Techniques for Probabilistic Inference
DEFF Research Database (Denmark)
Jaeger, Manfred
2004-01-01
We adopt probabilistic decision graphs developed in the field of automated verification as a tool for probabilistic model representation and inference. We show that probabilistic inference has linear time complexity in the size of the probabilistic decision graph, that the smallest probabilistic ...
González, F. I.; Leveque, R. J.; Hatheway, D.; Metzger, N.
2011-12-01
Risk is defined in many ways, but most are consistent with Crichton's [1999] definition based on the "risk triangle" concept and the explicit identification of three risk elements: "Risk is the probability of a loss, and this depends on three elements: hazard, vulnerability, and exposure. If any of these three elements in risk increases or decreases, then the risk increases or decreases respectively." The World Meteorological Organization, for example, cites Crichton [1999] and then defines risk as [WMO, 2008] Risk = function (Hazard x Vulnerability x Exposure) while the Asian Disaster Reduction Center adopts the more general expression [ADRC, 2005] Risk = function (Hazard, Vulnerability, Exposure) In practice, probabilistic concepts are invariably invoked, and at least one of the three factors is specified as probabilistic in nature. The Vulnerability and Exposure factors are defined in multiple ways in the relevant literature; but the Hazard factor, which is the focus of our presentation, is generally understood to deal only with the physical aspects of the phenomena and, in particular, the ability of the phenomena to inflict harm [Thywissen, 2006]. A Hazard factor can be estimated by a methodology known as Probabilistic Tsunami Hazard Assessment (PTHA) [González, et al., 2009]. We will describe the PTHA methodology and provide an example -- the results of a previous application to Seaside, OR. We will also present preliminary results for a PTHA of Crescent City, CA -- a pilot project and coastal modeling/mapping effort funded by the Federal Emergency Management Agency (FEMA) Region IX office as part of the new California Coastal Analysis and Mapping Project (CCAMP). CCAMP and the PTHA in Crescent City are being conducted under the nationwide FEMA Risk Mapping, Assessment, and Planning (Risk MAP) Program which focuses on providing communities with flood information and tools they can use to enhance their mitigation plans and better protect their citizens.
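The multiplicative form of the risk triangle cited above is straightforward to express in code; the numbers in the usage line are purely illustrative:

```python
def risk(hazard, vulnerability, exposure):
    """Crichton's risk triangle in its simplest multiplicative form:
    annual hazard probability x fraction of value lost x value exposed."""
    return hazard * vulnerability * exposure

# E.g. a 1%-per-year hazard, 40% damage ratio, $5M of exposed value
# gives an expected annual loss of $20,000 (numbers invented).
annual_risk = risk(0.01, 0.4, 5_000_000)
```

PTHA supplies the `hazard` term probabilistically; the other two factors come from separate vulnerability and exposure studies.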
New developments at the Flood Forecasting Centre: operational flood risk assessment and guidance
Pilling, Charlie
2017-04-01
The Flood Forecasting Centre (FFC) is a partnership between the UK Met Office, the Environment Agency and Natural Resources Wales. The FFC was established in 2009 to provide an overview of flood risk across England and Wales and to provide flood guidance services primarily for the emergency response community. The FFC provides forecasts for all natural sources of flooding, these being fluvial, surface water, coastal and groundwater. This involves an assessment of possible hydrometeorological events and their impacts over the next five days. During times of heightened flood risk, the close communication between the FFC, the Environment Agency and Natural Resources Wales allows mobilization and deployment of staff and flood defences. Following a number of severe flood events during winters 2013-14 and 2015-16, coupled with a drive from the changing landscape in national incident response, there is a desire to identify flood events at even longer lead time. This earlier assessment and mobilization is becoming increasingly important and high profile within Government. For example, following the exceptional flooding across the north of England in December 2015 the Environment Agency have invested in 40 km of temporary barriers that will be moved around the country to help mitigate against the impacts of large flood events. Efficient and effective use of these barriers depends on identifying the broad regions at risk well in advance of the flood, as well as scaling the magnitude and duration of large events. Partly in response to this, the FFC now produce a flood risk assessment for a month ahead. In addition, since January 2017, the 'new generation' daily flood guidance statement includes an assessment of flood risk for the 6 to 10 day period. Examples of both these new products will be introduced, as will some of the new developments in science and technical capability that underpin these assessments. Examples include improvements to fluvial forecasting from 'fluvial
14th International Probabilistic Workshop
Taerwe, Luc; Proske, Dirk
2017-01-01
This book presents the proceedings of the 14th International Probabilistic Workshop that was held in Ghent, Belgium in December 2016. Probabilistic methods are currently of crucial importance for research and developments in the field of engineering, which face challenges presented by new materials and technologies and rapidly changing societal needs and values. Contemporary needs related to, for example, performance-based design, service-life design, life-cycle analysis, product optimization, assessment of existing structures and structural robustness give rise to new developments as well as accurate and practically applicable probabilistic and statistical engineering methods to support these developments. These proceedings are a valuable resource for anyone interested in contemporary developments in the field of probabilistic engineering applications.
Common Difficulties with Probabilistic Reasoning.
Hope, Jack A.; Kelly, Ivan W.
1983-01-01
Several common errors reflecting difficulties in probabilistic reasoning are identified, relating to ambiguity, previous outcomes, sampling, unusual events, and estimating. Knowledge of these mistakes and interpretations may help mathematics teachers understand the thought processes of their students. (MNS)
Directory of Open Access Journals (Sweden)
Paola Bianucci
2015-03-01
Full Text Available A useful tool is proposed in this paper to assist dam managers in comparing and selecting suitable operating rules. This procedure is based on well-known multiobjective and probabilistic methodologies, which were jointly applied here to assess and compare flood control strategies in hydropower reservoirs. The procedure consisted of evaluating the operating rules’ performance using a simulation fed by a representative and sufficiently large flood event series. These flood events were obtained from a synthetic rainfall series stochastically generated by using the RainSimV3 model coupled with a deterministic hydrological model. The performance of the assessed strategies was characterized using probabilistic variables. Finally, evaluation and comparison were conducted by analyzing objective functions which synthesize different aspects of the rules’ performance. These objectives were probabilistically defined in terms of risk and expected values. To assess the applicability and flexibility of the tool, it was implemented in a hydropower dam located in Galicia (Northern Spain. This procedure allowed alternative operating rules to be derived which provided a reasonable trade-off between dam safety, flood control, operability and energy production.
Interval probabilistic neural network.
Kowalski, Piotr A; Kulczycki, Piotr
2017-01-01
Automated classification systems have allowed for the rapid development of exploratory data analysis. Such systems increase the independence of human intervention in obtaining the analysis results, especially when inaccurate information is under consideration. The aim of this paper is to present a novel neural network approach for classifying interval information. The presented neural methodology is a generalization of the probabilistic neural network for interval data processing. The simple structure of this neural classification algorithm makes it applicable for research purposes. The procedure is based on the Bayes approach, ensuring minimal potential losses arising from classification errors. In this article, the topological structure of the network and the learning process are described in detail. Of note, the correctness of the procedure proposed here has been verified by way of numerical tests. These tests include examples of both synthetic data, as well as benchmark instances. The results of numerical verification, carried out for different shapes of data sets, as well as a comparative analysis with other methods of similar conditioning, have validated both the concept presented here and its positive features.
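The classical probabilistic neural network, of which the interval method above is a generalization, places one Gaussian kernel on each training pattern and scores each class by its mean kernel response. A minimal point-data sketch (the bandwidth and toy data are invented):

```python
import numpy as np

def pnn_classify(x, train_X, train_y, sigma=0.5):
    """Probabilistic neural network classifier: one Gaussian kernel per
    training pattern; class score = mean kernel response over the class;
    return the class with the highest score (Bayes decision under
    equal priors)."""
    x = np.asarray(x, dtype=float)
    scores = {}
    for c in np.unique(train_y):
        pts = train_X[train_y == c]
        d2 = ((pts - x) ** 2).sum(axis=1)      # squared distances
        scores[c] = np.exp(-d2 / (2.0 * sigma ** 2)).mean()
    return max(scores, key=scores.get)

# Two toy classes on a line.
X = np.array([[0.0], [0.2], [0.1], [2.0], [2.1], [1.9]])
y = np.array([0, 0, 0, 1, 1, 1])
```

The interval generalization replaces the point kernel evaluation with one integrated over each interval-valued attribute.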
Probabilistic disaggregation model with application to natural hazard risk assessment of portfolios
DEFF Research Database (Denmark)
Custer, Rocco; Nishijima, Kazuyoshi
are usually deterministic and make use of auxiliary indicators, such as land cover, to spatially distribute exposures. As the dependence between auxiliary indicators and the disaggregated number of exposures is generally imperfect, uncertainty arises in disaggregation. This paper therefore proposes a probabilistic disaggregation model that considers the uncertainty in the disaggregation, taking basis in the scaled Dirichlet distribution. The proposed probabilistic disaggregation model is applied to a portfolio of residential buildings in the Canton Bern, Switzerland, subject to flood risk. Thereby, the model is verified...
Weighing costs and losses: A decision making game using probabilistic forecasts
Werner, Micha; Ramos, Maria-Helena; Wetterhall, Frederik; Cranston, Michael; van Andel, Schalk-Jan; Pappenberger, Florian; Verkade, Jan
2017-04-01
Probabilistic forecasts are increasingly recognised as an effective and reliable tool to communicate uncertainties. The economic value of probabilistic forecasts has been demonstrated by several authors, showing the benefit to using probabilistic forecasts over deterministic forecasts in several sectors, including flood and drought warning, hydropower, and agriculture. Probabilistic forecasting is also central to the emerging concept of risk-based decision making, and underlies emerging paradigms such as impact-based forecasting. Although the economic value of probabilistic forecasts is easily demonstrated in academic works, its evaluation in practice is more complex. The practical use of probabilistic forecasts requires decision makers to weigh the cost of an appropriate response to a probabilistic warning against the projected loss that would occur if the event forecast becomes reality. In this paper, we present the results of a simple game that aims to explore how decision makers are influenced by the costs required for taking a response and the potential losses they face in case the forecast flood event occurs. Participants play the role of one of three possible different shop owners. Each type of shop has losses of quite different magnitude, should a flood event occur. The shop owners are presented with several forecasts, each with a probability of a flood event occurring, which would inundate their shop and lead to those losses. In response, they have to decide if they want to do nothing, raise temporary defences, or relocate their inventory. Each action comes at a cost; and the different shop owners therefore have quite different cost/loss ratios. The game was played on four occasions. Players were attendees of the ensemble hydro-meteorological forecasting session of the 2016 EGU Assembly, professionals participating at two other conferences related to hydrometeorology, and a group of students. All audiences were familiar with the principles of forecasting
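The decision problem the game poses has a classical cost-loss structure: taking protective action pays off when the forecast probability exceeds the ratio of the action's cost to the loss it avoids. A sketch with invented shop-owner numbers:

```python
def best_action(p_flood, actions):
    """Pick the action minimising expected expense:
    upfront cost + p_flood * loss remaining if the flood occurs."""
    def expected(option):
        cost, residual_loss = option
        return cost + p_flood * residual_loss
    return min(actions, key=lambda name: expected(actions[name]))

# Hypothetical options: (upfront cost, loss if the flood occurs).
actions = {
    "do nothing":        (0.0, 100.0),
    "temporary defence": (10.0, 30.0),
    "relocate stock":    (35.0, 0.0),
}
```

As the forecast probability rises, the optimal choice shifts from doing nothing, to raising defences, to relocating; different cost/loss ratios shift those thresholds, which is exactly what separates the three shop owners in the game.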
National Clearinghouse for Educational Facilities, 2011
2011-01-01
According to the Federal Emergency Management Agency, flooding is the nation's most common natural disaster. Some floods develop slowly during an extended period of rain or in a warming trend following a heavy snow. Flash floods can occur quickly, without any visible sign of rain. Catastrophic floods are associated with burst dams and levees,…
Multivariate postprocessing techniques for probabilistic hydrological forecasting
Hemri, Stephan; Lisniak, Dmytro; Klein, Bastian
2016-04-01
Hydrologic ensemble forecasts driven by atmospheric ensemble prediction systems need statistical postprocessing in order to account for systematic errors in terms of both mean and spread. Runoff is an inherently multivariate process with typical events lasting from hours in case of floods to weeks or even months in case of droughts. This calls for multivariate postprocessing techniques that yield well calibrated forecasts in univariate terms and ensure a realistic temporal dependence structure at the same time. To this end, the univariate ensemble model output statistics (EMOS; Gneiting et al., 2005) postprocessing method is combined with two different copula approaches that ensure multivariate calibration throughout the entire forecast horizon. These approaches comprise ensemble copula coupling (ECC; Schefzik et al., 2013), which preserves the dependence structure of the raw ensemble, and a Gaussian copula approach (GCA; Pinson and Girard, 2012), which estimates the temporal correlations from training observations. Both methods are tested in a case study covering three subcatchments of the river Rhine that represent different sizes and hydrological regimes: the Upper Rhine up to the gauge Maxau, the river Moselle up to the gauge Trier, and the river Lahn up to the gauge Kalkofen. The results indicate that both ECC and GCA are suitable for modelling the temporal dependences of probabilistic hydrologic forecasts (Hemri et al., 2015). References Gneiting, T., A. E. Raftery, A. H. Westveld, and T. Goldman (2005), Calibrated probabilistic forecasting using ensemble model output statistics and minimum CRPS estimation, Monthly Weather Review, 133(5), 1098-1118, DOI: 10.1175/MWR2904.1. Hemri, S., D. Lisniak, and B. Klein, Multivariate postprocessing techniques for probabilistic hydrological forecasting, Water Resources Research, 51(9), 7436-7451, DOI: 10.1002/2014WR016473. Pinson, P., and R. Girard (2012), Evaluating the quality of scenarios of short-term wind power
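The ECC step referenced above (Schefzik et al., 2013) reorders the calibrated samples at each lead time according to the rank order of the raw ensemble, so the postprocessed forecast inherits the raw ensemble's temporal dependence. A minimal sketch with invented numbers:

```python
import numpy as np

def ensemble_copula_coupling(raw_ensemble, calibrated_samples):
    """Reorder calibrated samples so that, at each lead time, they
    follow the rank order of the raw ensemble.
    Both arrays have shape (members, lead_times)."""
    out = np.empty_like(calibrated_samples, dtype=float)
    for t in range(raw_ensemble.shape[1]):
        # Double argsort yields the rank of each member at lead time t.
        ranks = raw_ensemble[:, t].argsort().argsort()
        out[:, t] = np.sort(calibrated_samples[:, t])[ranks]
    return out

# Tiny 3-member, 2-lead-time example (values invented).
raw = np.array([[1.0, 5.0], [3.0, 2.0], [2.0, 8.0]])
cal = np.array([[10.0, 20.0], [30.0, 40.0], [50.0, 60.0]])
coupled = ensemble_copula_coupling(raw, cal)
```

GCA differs only in where the rank template comes from: a Gaussian copula fitted to training observations instead of the raw ensemble.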
Hurricane Sandy's flood frequency increasing from year 1800 to 2100
Lin, Ning; Kopp, Robert E.; Horton, Benjamin P.; Donnelly, Jeffrey P.
2016-10-01
Coastal flood hazard varies in response to changes in storm surge climatology and in sea level. Here we combine probabilistic projections of sea level and storm surge climatology to estimate the temporal evolution of flood hazard. We find that New York City's flood hazard has increased significantly over the past two centuries and is very likely to increase more sharply over the 21st century. Due to the effect of sea-level rise, the return period of Hurricane Sandy's flood height decreased by a factor of ~3 from year 1800 to 2000 and is estimated to decrease by a further factor of ~4.4 from 2000 to 2100 under a moderate-emissions pathway. When potential storm climatology change over the 21st century is also accounted for, Sandy's return period is estimated to decrease by a factor of ~3 to 17 from 2000 to 2100.
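The return-period arithmetic behind statements like these can be made concrete with a toy model: if annual-maximum flood heights follow a Gumbel distribution, sea-level rise shifts the location parameter, and a fixed flood height becomes correspondingly more frequent. All parameter values below are invented for illustration, not taken from the study:

```python
import math

def return_period(height, loc, scale):
    """Return period (years) of exceeding `height` under a Gumbel
    distribution of annual-maximum flood height (illustrative model)."""
    p_exceed = 1.0 - math.exp(-math.exp(-(height - loc) / scale))
    return 1.0 / p_exceed

# Hypothetical numbers: a Sandy-like flood height of 2.8 m, Gumbel
# location 0.9 m and scale 0.35 m, and 0.5 m of sea-level rise.
h, loc, scale, slr = 2.8, 0.9, 0.35, 0.5
t_now = return_period(h, loc, scale)
t_future = return_period(h, loc + slr, scale)  # sea-level rise shifts the location
factor = t_now / t_future                      # how much more frequent the same height becomes
```

With these invented parameters the return period of the fixed flood height shrinks by a factor of roughly four, qualitatively echoing the kind of change the abstract reports.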
Probabilistic aspects of Wigner function
Usenko, C V
2004-01-01
The Wigner function of quantum systems is an effective instrument for constructing an approximate classical description of systems for which a classical approximation is possible. More recently, the Wigner function formalism has also been applied to seek indications of specifically quantum properties that make a classical approximation impossible. The indication most often used is the existence of negative values of the Wigner function for specific states of the quantum system under study. The existence of such values itself prejudices the probabilistic interpretation of the Wigner function, even though for an arbitrary observable depending jointly on the coordinate and the momentum of the quantum system it is precisely the Wigner function that gives an effective instrument to calculate the average value and other statistical characteristics. In this paper a probabilistic interpretation of the Wigner function based on coordination of the theoretical-probabilistic definition of the ...
Implications of probabilistic risk assessment
Energy Technology Data Exchange (ETDEWEB)
Cullingford, M.C.; Shah, S.M.; Gittus, J.H. (eds.)
1987-01-01
Probabilistic risk assessment (PRA) is an analytical process that quantifies the likelihoods, consequences and associated uncertainties of the potential outcomes of postulated events. Starting with planned or normal operation, probabilistic risk assessment covers a wide range of potential accidents and considers the whole plant and the interactions of systems and human actions. Probabilistic risk assessment can be applied in safety decisions in design, licensing and operation of industrial facilities, particularly nuclear power plants. The proceedings include a review of PRA procedures, methods and technical issues in treating uncertainties, operating and licensing issues, and future trends. Risk assessments for specific reactor types or components and specific risks (e.g. an aircraft crashing onto a reactor) are used to illustrate the points raised. All 52 articles are indexed separately. (U.K.).
Quantum probabilistically cloning and computation
Institute of Scientific and Technical Information of China (English)
2008-01-01
In this article we review the usefulness of probabilistic cloning and present examples of quantum computation tasks for which quantum cloning offers an advantage that cannot be matched by any approach that does not resort to it. In these quantum computations, one needs to distribute quantum information contained in states about which we have some partial information. To perform quantum computations, one uses a state-dependent probabilistic quantum cloning procedure to distribute quantum information in the middle of a quantum computation. We discuss the achievable efficiencies and the efficient quantum logic network for probabilistically cloning the quantum states used in implementing quantum computation tasks for which cloning provides an enhancement in performance.
HISTORY BASED PROBABILISTIC BACKOFF ALGORITHM
Directory of Open Access Journals (Sweden)
Narendran Rajagopalan
2012-01-01
Performance of wireless LANs can be improved at each layer of the protocol stack with respect to energy efficiency. The media access control layer is responsible for key functions such as access control and flow control. During contention, a backoff algorithm is used to gain access to the medium with minimum probability of collision. After studying different variations of backoff algorithms that have been proposed, a new variant called the History-based Probabilistic Backoff algorithm is proposed. Mathematical analysis and simulation results using NS-2 show that the proposed History-based Probabilistic Backoff algorithm performs better than the Binary Exponential Backoff algorithm.
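The abstract does not spell out the algorithm, so the following is only a hypothetical sketch of the general idea of history-informed backoff: the contention window doubles on collision as in binary exponential backoff, but after a success it decays toward a target set by an exponentially weighted collision history instead of resetting to the minimum. The class name, constants and update rules are ours, not the paper's:

```python
import random

class HistoryBackoff:
    """Hypothetical history-informed backoff sketch (not the paper's algorithm):
    the window grows on collision, and its reset after success is damped in
    proportion to the recent collision rate, so a busy medium keeps a
    larger window."""

    def __init__(self, cw_min=16, cw_max=1024, memory=0.9):
        self.cw_min, self.cw_max, self.memory = cw_min, cw_max, memory
        self.cw = cw_min
        self.collision_rate = 0.0  # exponentially weighted collision history

    def on_collision(self):
        self.collision_rate = self.memory * self.collision_rate + (1 - self.memory)
        self.cw = min(self.cw * 2, self.cw_max)  # standard doubling on collision

    def on_success(self):
        self.collision_rate = self.memory * self.collision_rate
        # Instead of resetting to cw_min, shrink toward a history-weighted target.
        target = self.cw_min + self.collision_rate * (self.cw_max - self.cw_min)
        self.cw = max(self.cw_min, int(min(self.cw, target)))

    def slots(self):
        """Draw a uniform backoff slot count from the current window."""
        return random.randint(0, self.cw - 1)
```

After a burst of collisions the window stays wide for a while even across successes, which is the qualitative behaviour a history-based scheme aims for.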
Probabilistic Design of Wind Turbines
DEFF Research Database (Denmark)
Toft, Henrik Stensgaard
, new and more refined design methods must be developed. These methods can for instance be developed using probabilistic design where the uncertainties in all phases of the design life are taken into account. The main aim of the present thesis is to develop models for probabilistic design of wind....... The uncertainty related to the existing methods for estimating the loads during operation is assessed by applying these methods to a case where the load response is assumed to be Gaussian. In this case an approximate analytical solution exists for a statistical description of the extreme load response. In general...
Probabilistic methods in combinatorial analysis
Sachkov, Vladimir N
2014-01-01
This 1997 work explores the role of probabilistic methods for solving combinatorial problems. These methods not only provide the means of efficiently using such notions as characteristic and generating functions, the moment method and so on but also let us use the powerful technique of limit theorems. The basic objects under investigation are nonnegative matrices, partitions and mappings of finite sets, with special emphasis on permutations and graphs, and equivalence classes specified on sequences of finite length consisting of elements of partially ordered sets; these specify the probabilist
Probabilistic reasoning in data analysis.
Sirovich, Lawrence
2011-09-20
This Teaching Resource provides lecture notes, slides, and a student assignment for a lecture on probabilistic reasoning in the analysis of biological data. General probabilistic frameworks are introduced, and a number of standard probability distributions are described using simple intuitive ideas. Particular attention is focused on random arrivals that are independent of prior history (Markovian events), with an emphasis on waiting times, Poisson processes, and Poisson probability distributions. The use of these various probability distributions is applied to biomedical problems, including several classic experimental studies.
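The link the lecture draws between Markovian (history-independent) arrivals, exponential waiting times and Poisson counts can be demonstrated directly: simulating exponential inter-arrival times over a unit window reproduces the Poisson distribution of event counts. A small self-contained check:

```python
import math
import random

def poisson_pmf(k, lam):
    """P(N = k) for a Poisson distribution with mean lam."""
    return math.exp(-lam) * lam**k / math.factorial(k)

random.seed(1)
lam = 4.0          # mean number of arrivals per unit time window
n_trials = 20000
counts = []
for _ in range(n_trials):
    t, n = 0.0, 0
    while True:
        t += random.expovariate(lam)  # exponential waiting time between events
        if t > 1.0:
            break
        n += 1
    counts.append(n)

# Empirical frequency of exactly 4 arrivals vs. the Poisson prediction
empirical_p4 = sum(1 for c in counts if c == 4) / n_trials
```

The empirical frequency of any count matches the corresponding Poisson probability up to sampling noise, which is the waiting-time/Poisson duality the abstract emphasises.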
Probabilistic Approach to Rough Set Theory
Institute of Scientific and Technical Information of China (English)
Wojciech Ziarko
2006-01-01
The presentation introduces the basic ideas and investigates the probabilistic approach to rough set theory. The major aspects of the probabilistic approach to rough set theory to be explored during the presentation are: the probabilistic view of the approximation space; the probabilistic approximations of sets, as expressed via variable precision and Bayesian rough set models; and probabilistic dependencies between sets and multi-valued attributes, as expressed by the absolute certainty gain and expected certainty gain measures, respectively. The probabilistic dependency measures allow for representation of subtle stochastic associations between attributes. They also allow for more comprehensive evaluation of rules computed from data and for computation of attribute reduct, core and significance factors in probabilistic decision tables. It will be shown that the probabilistic dependency measure-based attribute reduction techniques are also extendible to hierarchies of decision tables. The presentation will include computational examples to illustrate presented concepts and to indicate possible practical applications.
Probabilistic Logic Programming under Answer Sets Semantics
Institute of Scientific and Technical Information of China (English)
王洁; 鞠实儿
2003-01-01
Although traditional logic programming languages provide powerful tools for knowledge representation, they cannot deal with uncertain information (e.g. probabilistic information). In this paper, we propose a probabilistic logic programming language by introducing probability into a general logic programming language. The work combines 4-valued logic with probability. Conditional probability can be easily represented in a probabilistic logic program. The semantics of such a probabilistic logic program i...
Probabilistic modelling of sea surges in coastal urban areas
Georgiadis, Stylianos; Jomo Danielsen Sørup, Hjalte; Arnbjerg-Nielsen, Karsten; Nielsen, Bo Friis
2016-04-01
Urban floods are a major issue for coastal cities, with severe impacts on the economy, society and environment. A main cause of floods is sea surges stemming from extreme weather conditions. In the context of urban flooding, certain standards have to be met by critical infrastructures in order to protect them from floods. These standards can be so strict that no empirical data is available: for instance, protection plans for sub-surface railways against floods are established with 10,000-year return levels. Furthermore, the long technical lifetime of such infrastructures is a critical issue that should be considered, along with the associated climate change effects over this lifetime. We present a case study of Copenhagen, where the metro system is at present being expanded with several stations close to the sea. The current critical sea levels for the metro have never been exceeded, and Copenhagen has only been severely flooded by pluvial events in the period where measurements have been conducted. However, due to the very high return period that the metro has to be able to withstand, and due to the expected sea-level rise under climate change, reliable estimates of the occurrence rate and magnitude of sea surges have to be established, as the current protection is expected to be insufficient at some point within the technical lifetime of the metro. The objective of this study is to probabilistically model sea level in Copenhagen, as opposed to extrapolating the extreme statistics as is often the practice. A better understanding and more realistic description of the phenomena leading to sea surges can then be given. The application of hidden Markov models to high-resolution sea-level data from different meteorological stations in and around Copenhagen is an effective tool to address uncertainty. For sea surge studies, the hidden states of the model may reflect the hydrological processes that contribute to coastal floods. Also, the states of the hidden Markov
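The hidden-state idea can be illustrated with a toy two-regime Markov chain ("calm" vs. "surge-prone") standing in for the hidden layer of such a model; the transition probabilities below are invented for illustration, not fitted to Copenhagen data:

```python
import numpy as np

# Hypothetical two-state weather-regime chain: row = current state,
# column = next state; states are ("calm", "surge-prone").
P = np.array([[0.95, 0.05],
              [0.40, 0.60]])

def stationary_distribution(P):
    """Left eigenvector of the transition matrix for eigenvalue 1,
    i.e. the long-run fraction of time spent in each hidden state."""
    vals, vecs = np.linalg.eig(P.T)
    v = np.real(vecs[:, np.argmax(np.real(vals))])
    return v / v.sum()

pi = stationary_distribution(P)  # long-run occupancy of (calm, surge-prone)
```

With these made-up numbers the chain spends about 89% of its time in the calm regime; in a fitted HMM each regime would additionally carry its own sea-level emission distribution, with extremes generated almost exclusively in the surge-prone state.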
Institute of Scientific and Technical Information of China (English)
[No author listed]
2010-01-01
Vehicles traverse a flooded street in Liuzhou, Guangxi Zhuang Autonomous Region, on May 19. Heavy rainstorms repeatedly struck China this month, triggering floods, mudflows and landslides. Hunan, Guangdong and Jiangxi provinces and Chongqing Municipality were the worst hit.
Earth Data Analysis Center, University of New Mexico — The National Flood Hazard Layer (NFHL) data incorporates all Digital Flood Insurance Rate Map(DFIRM) databases published by FEMA, and any Letters Of Map Revision...
Flooding: Prioritizing protection?
Peduzzi, Pascal
2017-09-01
With climate change, urban development and economic growth, more assets and infrastructures will be exposed to flooding. Now research shows that investments in flood protection are globally beneficial, but have varied levels of benefit locally.
Flood Risk Regional Flood Defences: Technical report
Lendering, K.T.
2015-01-01
Historically the Netherlands have always had to deal with the threat of flooding, both from the rivers and the sea as well as from heavy rainfall. The country consists of a large amount of polders, which are low lying areas of land protected from flooding by embankments. These polders require an
A Probabilistic Ontology Development Methodology
2014-06-01
Probabilistic aspects of ocean waves
Battjes, J.A.
1977-01-01
Background material for a special lecture on probabilistic aspects of ocean waves for a seminar in Trondheim. It describes long term statistics and short term statistics. Statistical distributions of waves, directional spectra and frequency spectra. Sea state parameters, response peaks, encounter
Sound Probabilistic #SAT with Projection
Directory of Open Access Journals (Sweden)
Vladimir Klebanov
2016-10-01
We present an improved method for sound probabilistic estimation of the model count of a Boolean formula under projection. The problem solved can be used to encode a variety of quantitative program analyses, such as those concerning the security of resource consumption. We implement the technique and discuss its application to quantifying information flow in programs.
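The core idea of probabilistic model counting can be sketched in a few lines: sample assignments uniformly and scale the satisfaction rate by the size of the assignment space. This toy deliberately omits the soundness guarantees and projection that are the paper's actual contribution:

```python
import random

def mc_model_count(clauses, n_vars, n_samples=20000, seed=0):
    """Monte Carlo estimate of a CNF formula's model count.

    clauses: list of clauses, each a list of nonzero ints; literal k means
    variable k is true, -k means it is false (DIMACS-style).
    Samples assignments uniformly and scales the hit rate by 2**n_vars.
    """
    rng = random.Random(seed)

    def satisfied(assignment):
        # Every clause needs at least one literal made true by the assignment.
        return all(any(assignment[abs(l) - 1] == (l > 0) for l in cl)
                   for cl in clauses)

    hits = sum(satisfied([rng.random() < 0.5 for _ in range(n_vars)])
               for _ in range(n_samples))
    return hits / n_samples * 2 ** n_vars
```

For (x1 OR x2) over two variables the true model count is 3, and the estimate converges there; a sound method additionally bounds the estimation error with stated confidence, which plain sampling does not.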
Probabilistic localisation in repetitive environments
Vroegindeweij, Bastiaan A.; IJsselmuiden, Joris; Henten, van Eldert J.
2016-01-01
One of the problems in loose housing systems for laying hens is the laying of eggs on the floor, which need to be collected manually. In previous work, PoultryBot was presented to assist in this and other tasks. Here, probabilistic localisation with a particle filter is evaluated for use inside p
Probabilistic Tsunami Hazard Assessment: the Seaside, Oregon Pilot Study
Gonzalez, F. I.; Geist, E. L.; Synolakis, C.; Titov, V. V.
2004-12-01
A pilot study of Seaside, Oregon is underway, to develop methodologies for probabilistic tsunami hazard assessments that can be incorporated into Flood Insurance Rate Maps (FIRMs) developed by FEMA's National Flood Insurance Program (NFIP). Current NFIP guidelines for tsunami hazard assessment rely on the science, technology and methodologies developed in the 1970s; although generally regarded as groundbreaking and state-of-the-art for its time, this approach is now superseded by modern methods that reflect substantial advances in tsunami research achieved in the last two decades. In particular, post-1990 technical advances include: improvements in tsunami source specification; improved tsunami inundation models; better computational grids by virtue of improved bathymetric and topographic databases; a larger database of long-term paleoseismic and paleotsunami records and short-term, historical earthquake and tsunami records that can be exploited to develop improved probabilistic methodologies; better understanding of earthquake recurrence and probability models. The NOAA-led U.S. National Tsunami Hazard Mitigation Program (NTHMP), in partnership with FEMA, USGS, NSF and Emergency Management and Geotechnical agencies of the five Pacific States, incorporates these advances into site-specific tsunami hazard assessments for coastal communities in Alaska, California, Hawaii, Oregon and Washington. NTHMP hazard assessment efforts currently focus on developing deterministic, "credible worst-case" scenarios that provide valuable guidance for hazard mitigation and emergency management. The NFIP focus, on the other hand, is on actuarial needs that require probabilistic hazard assessments such as those that characterize 100- and 500-year flooding events. There are clearly overlaps in NFIP and NTHMP objectives. NTHMP worst-case scenario assessments that include an estimated probability of occurrence could benefit the NFIP; NFIP probabilistic assessments of 100- and 500-yr
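The actuarial framing of "100- and 500-year flooding events" is easy to make concrete: a T-year event has annual exceedance probability 1/T, so the chance of seeing at least one within a planning horizon follows from assuming independent years (an assumption of this toy calculation, not a claim from the study):

```python
def prob_at_least_one(return_period_years, horizon_years):
    """Chance of at least one event of the given return period occurring
    within a planning horizon, assuming independent years."""
    annual_p = 1.0 / return_period_years
    return 1.0 - (1.0 - annual_p) ** horizon_years

# A "100-year flood" is far from improbable over an infrastructure lifetime:
p100_over_30 = prob_at_least_one(100, 30)  # roughly a 1-in-4 chance
p500_over_50 = prob_at_least_one(500, 50)  # roughly a 1-in-10 chance
```

This is why probabilistic hazard assessment, rather than a single worst-case scenario, is the natural input for insurance-rate products like FIRMs.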
A probabilistic Hu-Washizu variational principle
Liu, W. K.; Belytschko, T.; Besterfield, G. H.
1987-01-01
A Probabilistic Hu-Washizu Variational Principle (PHWVP) for the Probabilistic Finite Element Method (PFEM) is presented. This formulation is developed for both linear and nonlinear elasticity. The PHWVP allows incorporation of the probabilistic distributions for the constitutive law, compatibility condition, equilibrium, domain and boundary conditions into the PFEM. Thus, a complete probabilistic analysis can be performed where all aspects of the problem are treated as random variables and/or fields. The Hu-Washizu variational formulation is available in many conventional finite element codes thereby enabling the straightforward inclusion of the probabilistic features into present codes.
Model Checking with Probabilistic Tabled Logic Programming
Gorlin, Andrey; Smolka, Scott A
2012-01-01
We present a formulation of the problem of probabilistic model checking as one of query evaluation over probabilistic logic programs. To the best of our knowledge, our formulation is the first of its kind, and it covers a rich class of probabilistic models and probabilistic temporal logics. The inference algorithms of existing probabilistic logic-programming systems are well defined only for queries with a finite number of explanations. This restriction prohibits the encoding of probabilistic model checkers, where explanations correspond to executions of the system being model checked. To overcome this restriction, we propose a more general inference algorithm that uses finite generative structures (similar to automata) to represent families of explanations. The inference algorithm computes the probability of a possibly infinite set of explanations directly from the finite generative structure. We have implemented our inference algorithm in XSB Prolog, and use this implementation to encode probabilistic model...
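The basic reachability query at the heart of probabilistic model checking reduces, for a finite discrete-time Markov chain, to solving a linear system over the transient states. A minimal example with an invented four-state chain (this illustrates the underlying query, not the paper's tabled logic-programming machinery):

```python
import numpy as np

# States 0 and 1 are transient; state 2 is the target and state 3 a
# failure state, both absorbing. Entries are invented for illustration.
P = np.array([
    [0.0, 0.5, 0.3, 0.2],
    [0.4, 0.0, 0.5, 0.1],
    [0.0, 0.0, 1.0, 0.0],
    [0.0, 0.0, 0.0, 1.0],
])
target, transient = 2, [0, 1]

# For each transient s: x_s = sum_t P[s, t] * x_t, with x_target = 1 and
# x_fail = 0, i.e. (I - P_tt) x = P[:, target] restricted to transients.
A = np.eye(len(transient)) - P[np.ix_(transient, transient)]
b = P[np.ix_(transient, [target])].ravel()
reach = np.linalg.solve(A, b)  # P(eventually reach target | start in s)
```

Here the model checker's answer for "eventually target" from state 0 is 0.6875; logic-programming formulations recover the same quantity by summing probabilities over (possibly infinitely many) explanations, which is where the paper's finite generative structures come in.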
Flood Impact Modelling and Natural Flood Management
Owen, Gareth; Quinn, Paul; O'Donnell, Greg
2016-04-01
Local implementation of Natural Flood Management (NFM) methods is now being proposed in many flood schemes. In principle it offers a cost-effective solution to a number of catchment-based problems, as NFM tackles both flood risk and WFD issues. However, within larger catchments there is the issue of which subcatchments to target first and how much NFM to implement. If each catchment has its own configuration of subcatchments and rivers, how can the issues of flood synchronisation and strategic investment be addressed? In this study we show two key aspects to resolving these issues. First, a multi-scale network of water level recorders is placed throughout the system to capture the flow concentration and travel times operating in the catchment being studied. The second is a Flood Impact Model (FIM), a subcatchment-based model that can generate runoff in any location using any hydrological model. The key aspect of the model is that it has a function to represent the impact of NFM in any subcatchment and the ability to route that flood wave to the outfall. This function allows a realistic representation of the synchronisation issues for that catchment. By running the model in interactive mode, the user can define an appropriate scheme that minimises or removes the risk of synchronisation and gives confidence that the NFM investment is having a good level of impact downstream in large flood events.
Urban pluvial flood prediction
DEFF Research Database (Denmark)
Thorndahl, Søren Liedtke; Nielsen, Jesper Ellerbæk; Jensen, David Getreuer
2016-01-01
Flooding produced by high-intensive local rainfall and drainage system capacity exceedance can have severe impacts in cities. In order to prepare cities for these types of flood events – especially in the future climate – it is valuable to be able to simulate these events numerically both...... historically and in real-time. There is a rather untested potential in real-time prediction of urban floods. In this paper radar data observations with different spatial and temporal resolution, radar nowcasts of 0–2 h lead time, and numerical weather models with lead times up to 24 h are used as inputs...... to an integrated flood and drainage systems model in order to investigate the relative difference between different inputs in predicting future floods. The system is tested on a small town Lystrup in Denmark, which has been flooded in 2012 and 2014. Results show it is possible to generate detailed flood maps...
Probabilistic Design of Wind Turbines
DEFF Research Database (Denmark)
Toft, Henrik Stensgaard
, new and more refined design methods must be developed. These methods can for instance be developed using probabilistic design where the uncertainties in all phases of the design life are taken into account. The main aim of the present thesis is to develop models for probabilistic design of wind......, the uncertainty is dependent on the method used for load extrapolation, the number of simulations and the distribution fitted to the extracted peaks. Another approach for estimating the uncertainty on the estimated load effects during operation is to use field measurements. A new method for load extrapolation......, which is based on average conditional exceedence rates, is applied to wind turbine response. The advantage of this method is that it can handle dependence in the response and use exceedence rates instead of extracted peaks which normally are more stable. The results show that the method estimates...
Probabilistic Design of Wind Turbines
DEFF Research Database (Denmark)
Sørensen, John Dalsgaard; Toft, H.S.
2010-01-01
Probabilistic design of wind turbines requires definition of the structural elements to be included in the probabilistic basis: e.g., blades, tower, foundation; identification of important failure modes; careful stochastic modeling of the uncertain parameters; recommendations for target reliability....... It is described how uncertainties in wind turbine design related to computational models, statistical data from test specimens, results from a few full-scale tests and from prototype wind turbines can be accounted for using the Maximum Likelihood Method and a Bayesian approach. Assessment of the optimal...... reliability level by cost-benefit optimization is illustrated by an offshore wind turbine example. Uncertainty modeling is illustrated by an example where physical, statistical and model uncertainties are estimated....
Probabilistic Design of Wind Turbines
Directory of Open Access Journals (Sweden)
Henrik S. Toft
2010-02-01
Probabilistic design of wind turbines requires definition of the structural elements to be included in the probabilistic basis: e.g., blades, tower, foundation; identification of important failure modes; careful stochastic modeling of the uncertain parameters; recommendations for target reliability levels; and recommendations for consideration of system aspects. The uncertainties are characterized as aleatoric (physical uncertainty) or epistemic (statistical, measurement and model uncertainties). Methods for uncertainty modeling consistent with methods for estimating the reliability are described. It is described how uncertainties in wind turbine design related to computational models, statistical data from test specimens, results from a few full-scale tests and from prototype wind turbines can be accounted for using the Maximum Likelihood Method and a Bayesian approach. Assessment of the optimal reliability level by cost-benefit optimization is illustrated by an offshore wind turbine example. Uncertainty modeling is illustrated by an example where physical, statistical and model uncertainties are estimated.
Modified Claus process probabilistic model
Energy Technology Data Exchange (ETDEWEB)
Larraz Mora, R. [Chemical Engineering Dept., Univ. of La Laguna (Spain)
2006-03-15
A model is proposed for the simulation of an industrial Claus unit with a straight-through configuration and two catalytic reactors. Process plant design evaluations based on deterministic calculations do not take into account the uncertainties that are associated with the different input variables. A probabilistic simulation method was applied in the Claus model to obtain an impression of how some of these inaccuracies influence plant performance. (orig.)
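The probabilistic simulation approach described here — propagating input uncertainty through a deterministic plant model by sampling instead of using fixed design values — can be sketched generically. The "plant model" below is a made-up formula standing in for a real simulator, not Claus process chemistry, and all distributions are invented:

```python
import random
import statistics

def simulate_plant(feed_h2s_frac, air_ratio):
    """Hypothetical stand-in for a deterministic plant-performance model:
    returns a notional sulphur recovery efficiency that peaks at the
    design point. Illustrative formula only."""
    return (0.97
            - 0.5 * (air_ratio - 1.0) ** 2
            - 0.1 * (feed_h2s_frac - 0.7) ** 2)

random.seed(2)
# Sample the uncertain inputs (invented means/spreads) and push each
# draw through the deterministic model.
samples = [simulate_plant(random.gauss(0.7, 0.03), random.gauss(1.0, 0.05))
           for _ in range(10000)]

mean_eff = statistics.mean(samples)                          # expected performance
p_below_spec = sum(e < 0.965 for e in samples) / len(samples)  # risk of missing a spec
```

The payoff over a single deterministic run is the last line: instead of one efficiency number, the designer gets a distribution and a probability of missing the specification.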
Probabilistic Cloning and Quantum Computation
Institute of Scientific and Technical Information of China (English)
GAO Ting; YAN Feng-Li; WANG Zhi-Xi
2004-01-01
We discuss the usefulness of quantum cloning and present examples of quantum computation tasks for which cloning offers an advantage that cannot be matched by any approach that does not resort to quantum cloning. In these quantum computations, we need to distribute quantum information contained in states about which we have some partial information. To perform quantum computations, we use a state-dependent probabilistic quantum cloning procedure to distribute quantum information in the middle of a quantum computation.
Probabilistic analysis and related topics
Bharucha-Reid, A T
1983-01-01
Probabilistic Analysis and Related Topics, Volume 3 focuses on the continuity, integrability, and differentiability of random functions, including operator theory, measure theory, and functional and numerical analysis. The selection first offers information on the qualitative theory of stochastic systems and Langevin equations with multiplicative noise. Discussions focus on phase-space evolution via direct integration, phase-space evolution, linear and nonlinear systems, linearization, and generalizations. The text then ponders on the stability theory of stochastic difference systems and Marko
Probabilistic methods for rotordynamics analysis
Wu, Y.-T.; Torng, T. Y.; Millwater, H. R.; Fossum, A. F.; Rheinfurth, M. H.
1991-01-01
This paper summarizes the development of the methods and a computer program to compute the probability of instability of dynamic systems that can be represented by a system of second-order ordinary linear differential equations. Two instability criteria based upon the eigenvalues or Routh-Hurwitz test functions are investigated. Computational methods based on a fast probability integration concept and an efficient adaptive importance sampling method are proposed to perform efficient probabilistic analysis. A numerical example is provided to demonstrate the methods.
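The instability-probability computation can be sketched with a one-degree-of-freedom example: sample the uncertain damping coefficient, form the companion (state-space) matrix of the second-order equation, and count samples whose eigenvalues have positive real part. Parameter values and distributions are illustrative only, not from the paper:

```python
import numpy as np

def instability_probability(n_samples=4000, seed=0):
    """Monte Carlo sketch: for x'' + c x' + k x = 0 with uncertain damping c,
    estimate the probability that an eigenvalue has positive real part
    (i.e. the system is unstable). Illustrative parameters."""
    rng = np.random.default_rng(seed)
    unstable = 0
    k = 4.0
    for _ in range(n_samples):
        c = rng.normal(0.1, 0.2)  # uncertain damping; may go negative
        # Companion form of x'' + c x' + k x = 0
        A = np.array([[0.0, 1.0],
                      [-k, -c]])
        if np.real(np.linalg.eigvals(A)).max() > 0.0:
            unstable += 1
    return unstable / n_samples
```

For this system instability occurs exactly when c < 0, so the plain Monte Carlo estimate converges to the Gaussian tail probability (about 0.31 here); the fast probability integration and adaptive importance sampling methods of the paper exist to get such estimates far more cheaply, especially for small failure probabilities.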
Probabilistic analysis and related topics
Bharucha-Reid, A T
1979-01-01
Probabilistic Analysis and Related Topics, Volume 2 focuses on the integrability, continuity, and differentiability of random functions, as well as functional analysis, measure theory, operator theory, and numerical analysis.The selection first offers information on the optimal control of stochastic systems and Gleason measures. Discussions focus on convergence of Gleason measures, random Gleason measures, orthogonally scattered Gleason measures, existence of optimal controls without feedback, random necessary conditions, and Gleason measures in tensor products. The text then elaborates on an
Earthquake and Flood Risk Assessments for Europe and Central Asia
Murnane, R. J.; Daniell, J. E.; Ward, P.; Winsemius, H.; Tijssen, A.; Toro, J.
2015-12-01
We report on a flood and earthquake risk assessment for 32 countries in Europe and Central Asia with a focus on how current flood and earthquake risk might evolve in the future due to changes in climate, population, and GDP. The future hazard and exposure conditions used for the risk assessment are consistent with selected IPCC AR5 Representative Concentration Pathways (RCPs) and Shared Socioeconomic Pathways (SSPs). Estimates of 2030 and 2080 population and GDP are derived using the IMAGE model forced by the socioeconomic conditions associated with the SSPs. Flood risk is modeled using the probabilistic GLOFRIS global flood risk modeling cascade which starts with meteorological fields derived from reanalysis data or climate models. For 2030 and 2080 climate conditions, the meteorological fields are generated from five climate models forced by the RCP4.5 and RCP8.5 scenarios. Future flood risk is estimated using population and GDP exposures consistent with the SSP2 and SSP3 scenarios. Population and GDP are defined as being affected by a flood when a grid cell receives any depth of flood inundation. The earthquake hazard is quantified using a 10,000-year stochastic catalog of over 15.8 million synthetic earthquake events of at least magnitude 5. Ground motion prediction and estimates of local site conditions are used to determine PGA. Future earthquake risk is estimated using population and GDP exposures consistent with all five SSPs. Population and GDP are defined as being affected by an earthquake when a grid cell experiences ground motion equaling or exceeding MMI VI. For most countries, changes in exposure alter flood risk to a greater extent than changes in climate. For both flood and earthquake, the spread in risk grows over time. There are large uncertainties due to the methodology; however, the results are not meant to be definitive. Instead they will be used to initiate discussions with governments regarding efforts to manage disaster risk.
Probabilistic interpretation of resonant states
Indian Academy of Sciences (India)
Naomichi Hatano; Tatsuro Kawamoto; Joshua Feinberg
2009-09-01
We provide a probabilistic interpretation of resonant states. We do this by showing that the integral of the modulus square of resonance wave functions (i.e., the conventional norm) over a properly expanding spatial domain is independent of time, and therefore leads to probability conservation. This is in contrast with the conventional employment of a bi-orthogonal basis that precludes probabilistic interpretation, since wave functions of resonant states diverge exponentially in space. On the other hand, resonant states decay exponentially in time, because momentum leaks out of the central scattering area. This momentum leakage is also the reason for the spatial exponential divergence of resonant states. It is by combining the opposite temporal and spatial behaviours of resonant states that we arrive at our probabilistic interpretation of these states. The physical need to normalize resonant wave functions over an expanding spatial domain arises because particles leak out of the region which contains the potential range and escape to infinity, and one has to include them in the total count of particles.
Lawrence, D.; Paquet, E.; Gailhard, J.; Fleig, A. K.
2014-05-01
Simulation methods for extreme flood estimation represent an important complement to statistical flood frequency analysis because a spectrum of catchment conditions potentially leading to extreme flows can be assessed. In this paper, stochastic, semi-continuous simulation is used to estimate extreme floods in three catchments located in Norway, all of which are characterised by flood regimes in which snowmelt often has a significant role. The simulations are based on SCHADEX, which couples a precipitation probabilistic model with a hydrological simulation such that an exhaustive set of catchment conditions and responses is simulated. The precipitation probabilistic model is conditioned by regional weather patterns, and a bottom-up classification procedure was used to define a set of weather patterns producing extreme precipitation in Norway. SCHADEX estimates for the 1000-year (Q1000) discharge are compared with those of several standard methods, including event-based and long-term simulations which use a single extreme precipitation sequence as input to a hydrological model, statistical flood frequency analysis based on the annual maximum series, and the GRADEX method. The comparison suggests that the combination of a precipitation probabilistic model with a long-term simulation of catchment conditions, including snowmelt, produces estimates for given return periods which are more in line with those based on statistical flood frequency analysis, as compared with the standard simulation methods, in two of the catchments. In the third case, the SCHADEX method gives higher estimates than statistical flood frequency analysis and further suggests that the seasonality of the most likely Q1000 events differs from that of the annual maximum flows. The semi-continuous stochastic simulation method highlights the importance of considering the joint probability of extreme precipitation, snowmelt rates and catchment saturation states when assigning return periods to floods.
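The semi-continuous idea, crossing a probabilistic precipitation model with a long-term simulation of catchment states, can be caricatured in a few lines. This is not SCHADEX itself: the weather-pattern statistics, wet-day frequency, and bucket-model parameters below are all invented.

```python
import numpy as np

rng = np.random.default_rng(42)

# Invented weather-pattern rainfall means (mm/day when wet)
pattern_mean = {"zonal": 8.0, "convective": 20.0}
P_CONVECTIVE, P_WET = 0.4, 0.3

def simulate_year():
    """One synthetic year of daily rainfall through a toy bucket model;
    returns the annual maximum daily runoff (mm)."""
    storage, peak = 50.0, 0.0
    for _ in range(365):
        pattern = "convective" if rng.random() < P_CONVECTIVE else "zonal"
        rain = rng.exponential(pattern_mean[pattern]) if rng.random() < P_WET else 0.0
        storage = min(storage + rain, 150.0)             # bucket fills, spills
        runoff = 0.05 * storage + max(rain - 30.0, 0.0)  # slow + fast response
        storage -= 0.05 * storage                        # drainage
        peak = max(peak, runoff)
    return peak

# Long-term stochastic simulation -> empirical annual-maximum sample
annual_max = np.array([simulate_year() for _ in range(2000)])
q1000 = float(np.quantile(annual_max, 1 - 1 / 1000))  # rough Q1000 estimate
print(round(q1000, 1))
```

The point of the exhaustive simulation is that extreme floods arise from joint combinations of rainfall, antecedent storage, and (in the Norwegian catchments) snowmelt, which a single design storm cannot represent.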
Prototypes of risk-based flood forecasting systems in the Netherlands and Italy
Directory of Open Access Journals (Sweden)
Bachmann D.
2016-01-01
Full Text Available Flood forecasting, warning and emergency response are important components of flood management. Currently, the model-based prediction of discharge and/or water level in a river is common practice for operational flood forecasting. Based on the prediction of these values, decisions about specific emergency measures are made within emergency response. However, the information provided for decision support is often restricted to purely hydrological or hydraulic aspects of a flood. Information about weak sections within the flood defences, flood-prone areas and assets at risk in the protected areas is rarely used in current early warning and response systems. This information is often available for strategic planning, but is not in an appropriate format for operational purposes. This paper presents the extension of existing flood forecasting systems with elements of strategic flood risk analysis, such as probabilistic failure analysis, two-dimensional flood spreading simulation and the analysis of flood impacts and consequences. It also presents the first results from two prototype applications of the newly developed concept: the first prototype is applied to the Rotterdam area, situated in the western part of the Netherlands; the second pilot study focusses on a rural area between the cities of Mantua and Ferrara along the Po river (Italy).
Estimating the benefits of single value and probability forecasting for flood warning
Directory of Open Access Journals (Sweden)
J. S. Verkade
2011-12-01
Full Text Available Flood risk can be reduced by means of flood forecasting, warning and response systems (FFWRS). These systems include a forecasting sub-system which is imperfect, meaning that inherent uncertainties in hydrological forecasts may result in false alarms and missed events. This forecasting uncertainty decreases the potential reduction of flood risk, but is seldom accounted for in estimates of the benefits of FFWRSs. In the present paper, a method to estimate the benefits of (imperfect) FFWRSs in reducing flood risk is presented. The method is based on a hydro-economic model of expected annual damage (EAD) due to flooding, combined with the concept of Relative Economic Value (REV). The estimated benefits include not only the reduction of flood losses due to a warning response, but also consider the costs of the warning response itself, as well as the costs associated with forecasting uncertainty. The method allows for estimation of the benefits of FFWRSs that use either deterministic or probabilistic forecasts. Through application to a case study, it is shown that FFWRSs using a probabilistic forecast have the potential to realise higher benefits at all lead times. However, it is also shown that provision of warning at increasing lead time does not necessarily lead to an increasing reduction of flood risk; rather, an optimal lead time at which warnings are provided can be established as a function of forecast uncertainty and the cost-loss ratio of the user receiving and responding to the warning.
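The REV concept referred to above is conventionally computed from a user's cost-loss ratio and the warning system's contingency statistics. The sketch below shows that standard calculation with invented numbers; the paper's hydro-economic EAD model is richer than this.

```python
def relative_economic_value(hit_rate, false_alarm_rate, base_rate, cost_loss):
    """Standard REV for a warning system (all arguments in [0, 1]).

    cost_loss = C/L (cost of protective response / avoidable loss);
    base_rate = climatological event frequency. Expected expenses are
    per occasion, normalised by the loss L.
    """
    a, s = cost_loss, base_rate
    e_clim = min(a, s)            # always/never protect, whichever is cheaper
    e_perf = s * a                # perfect forecasts: protect only for events
    hits = hit_rate * s
    misses = (1 - hit_rate) * s
    false_alarms = false_alarm_rate * (1 - s)
    e_fc = (hits + false_alarms) * a + misses   # imperfect warning system
    # REV = fraction of the perfect-forecast benefit actually realised
    return (e_clim - e_fc) / (e_clim - e_perf)

# Illustrative numbers (not from the paper): a fairly reliable system,
# a user with C/L = 0.2, and a 10% event frequency.
print(round(relative_economic_value(0.8, 0.05, 0.10, 0.2), 4))  # 0.6875
```

Repeating the calculation with contingency statistics that degrade with lead time reproduces the paper's point that an optimal warning lead time exists for a given cost-loss ratio.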
Flow ensemble prediction for flash flood warnings at ungauged basins
Demargne, Julie; Javelle, Pierre; Organde, Didier; Caseri, Angelica; Ramos, Maria-Helena; de Saint Aubin, Céline; Jurdy, Nicolas
2015-04-01
Flash floods, which are typically triggered by severe rainfall events, are difficult to monitor and predict at the spatial and temporal scales of interest due to large meteorological and hydrologic uncertainties. In particular, uncertainties in quantitative precipitation forecasts (QPF) and quantitative precipitation estimates (QPE) need to be taken into account to provide skillful flash flood warnings with increased warning lead time. In France, the AIGA discharge-threshold flood warning system is currently being enhanced to ingest high-resolution ensemble QPFs from convection-permitting numerical weather prediction (NWP) models, as well as probabilistic QPEs, to improve flash flood warnings for small-to-medium (from 10 to 1000 km²) ungauged basins. The current deterministic AIGA system has been operational in the south of France since 2005. It ingests the operational radar-gauge QPE grids from Météo-France to run a simplified hourly distributed hydrologic model at a 1-km² resolution every 15 minutes (Javelle et al. 2014). This produces real-time peak discharge estimates along the river network, which are subsequently compared to regionalized flood frequency estimates of given return periods. Warnings are then provided to the French national hydro-meteorological and flood forecasting centre (SCHAPI) and regional flood forecasting offices, based on the estimated severity of ongoing events. The calibration and regionalization of the hydrologic model have recently been enhanced to implement an operational flash flood warning system for the entire French territory. To quantify the QPF uncertainty, the COSMO-DE-EPS rainfall ensembles from the Deutscher Wetterdienst (20 members at a 2.8-km resolution for a lead time of 21 hours), which are available over the north-eastern part of France, were ingested in the hydrologic model of the AIGA system. Streamflow ensembles were produced and probabilistic flash flood warnings were derived for the Meuse and Moselle river basins and
Accounting For Greenhouse Gas Emissions From Flooded ...
Nearly three decades of research has demonstrated that the inundation of rivers and terrestrial ecosystems behind dams can lead to enhanced rates of greenhouse gas emissions, particularly methane. The 2006 IPCC Guidelines for National Greenhouse Gas Inventories includes a methodology for estimating methane emissions from flooded lands, but the methodology was published as an appendix to be used as a ‘basis for future methodological development’ due to a lack of data. Since the 2006 Guidelines were published there has been a 6-fold increase in the number of peer-reviewed papers published on the topic, including reports from reservoirs in India, China, Africa, and Russia. Furthermore, several countries, including Iceland, Switzerland, and Finland, have developed country-specific methodologies for including flooded lands methane emissions in their National Greenhouse Gas Inventories. This presentation will include a review of the literature on flooded land methane emissions and approaches that have been used to upscale emissions for national inventories. We will also present ongoing research in the United States to develop a country-specific methodology. The research approaches include 1) an effort to develop predictive relationships between methane emissions and reservoir characteristics that are available in national databases, such as reservoir size and drainage area, and 2) a national-scale probabilistic survey of reservoir methane emissions. To inform th
Directory of Open Access Journals (Sweden)
J. Dietrich
2009-08-01
Full Text Available Ensemble forecasts aim at framing the uncertainties of the potential future development of the hydro-meteorological situation. A probabilistic evaluation can be used to communicate forecast uncertainty to decision makers. Here an operational system for ensemble-based flood forecasting is presented, which combines forecasts from the European COSMO-LEPS, SRNWP-PEPS and COSMO-DE prediction systems. A multi-model lagged average super-ensemble is generated by recombining members from different runs of these meteorological forecast systems. A subset of the super-ensemble is selected based on a priori model weights, which are obtained from ensemble calibration. Flood forecasts are simulated by the conceptual rainfall-runoff model ArcEGMO. Parameter uncertainty of the model is represented by a parameter ensemble, which is generated a priori from a comprehensive uncertainty analysis during model calibration. The use of a computationally efficient hydrological model within a flood management system allows us to compute the hydro-meteorological model chain for all members of the sub-ensemble. The model chain is not re-computed before new ensemble forecasts are available, but the probabilistic assessment of the output is updated when new information from deterministic short-range forecasts or from assimilation of measured data becomes available. For hydraulic modelling, with the desired result of a probabilistic inundation map with high spatial resolution, a replacement model can help to overcome computational limitations. A prototype of the developed framework has been applied for a case study in the Mulde river basin. However, these techniques, in particular the probabilistic assessment and the derivation of decision rules, are still in their infancy. Further research is necessary and promising.
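The recombination of members from different models and runs into a weighted super-ensemble can be illustrated as follows. The model names match the abstract, but the member values and weights are invented (the paper obtains weights from ensemble calibration).

```python
# Each model contributes forecast peak discharges (m^3/s) from its
# recent runs (a lagged multi-model super-ensemble); values invented.
forecasts = {
    "COSMO-LEPS": [410.0, 455.0, 390.0, 520.0],
    "SRNWP-PEPS": [480.0, 505.0],
    "COSMO-DE":   [435.0, 470.0, 498.0],
}
# A-priori model weights, e.g. from ensemble calibration (invented)
weights = {"COSMO-LEPS": 0.5, "SRNWP-PEPS": 0.2, "COSMO-DE": 0.3}

threshold = 450.0  # warning threshold (m^3/s)

# Weighted exceedance probability: each model's weight is split
# equally among its members (one simple weighting convention).
p_exceed = sum(
    weights[model] * sum(q > threshold for q in members) / len(members)
    for model, members in forecasts.items()
)
print(round(p_exceed, 3))  # 0.65
```

Updating such a probabilistic statement when a new deterministic short-range run arrives amounts to recomputing the weighted sum with refreshed members, without re-running the full hydrological chain.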
Probabilistic models of language processing and acquisition.
Chater, Nick; Manning, Christopher D
2006-07-01
Probabilistic methods are providing new explanatory approaches to fundamental cognitive science questions of how humans structure, process and acquire language. This review examines probabilistic models defined over traditional symbolic structures. Language comprehension and production involve probabilistic inference in such models; and acquisition involves choosing the best model, given innate constraints and linguistic and other input. Probabilistic models can account for the learning and processing of language, while maintaining the sophistication of symbolic models. A recent burgeoning of theoretical developments and online corpus creation has enabled large models to be tested, revealing probabilistic constraints in processing, undermining acquisition arguments based on a perceived poverty of the stimulus, and suggesting fruitful links with probabilistic theories of categorization and ambiguity resolution in perception.
ENSO impacts on flood risk at the global scale
Ward, Philip; Dettinger, Michael; Jongman, Brenden; Kummu, Matti; Winsemius, Hessel
2014-05-01
We present the impacts of the El Niño Southern Oscillation (ENSO) on society and the economy, via relationships between ENSO and the hydrological cycle. We also discuss ways in which this knowledge can be used in disaster risk management and risk reduction. This contribution provides the most recent results of an ongoing 4-year collaborative research initiative to assess and map the impacts of large-scale interannual climate variability on flood hazard and risk at the global scale. We have examined anomalies in flood risk between ENSO phases, whereby flood risk is expressed in terms of indicators such as: annual expected damage; annual expected affected population; annual expected affected Gross Domestic Product (GDP). We show that large anomalies in flood risk occur during El Niño or La Niña years in basins covering large parts of the Earth's surface. These anomalies reach statistical significance in river basins covering almost two-thirds of the Earth's surface. Particularly strong anomalies exist in southern Africa, parts of western Africa, Australia, parts of Central Eurasia (especially for El Niño), the western USA (especially La Niña anomalies), and parts of South America. We relate these anomalies to possible causal relationships between ENSO and flood hazard, using both modelled and observed data on flood occurrence and extremity. The implications for flood risk management are manifold. In those regions where disaster risk is strongly influenced by ENSO, the potential predictability of ENSO could be used to develop probabilistic flood risk projections with lead times up to several seasons. Such data could be used by the insurance industry in managing risk portfolios and by multinational companies for assessing the robustness of their supply chains to potential flood-related interruptions. Seasonal forecasts of ENSO influence on peak flows could also allow for improved flood early warning and regulation by dam operators, which could also reduce overall risks.
Maximizing Statistical Power When Verifying Probabilistic Forecasts of Hydrometeorological Events
DeChant, C. M.; Moradkhani, H.
2014-12-01
Hydrometeorological events (i.e. floods, droughts, precipitation) are increasingly being forecasted probabilistically, owing to the uncertainties in the underlying causes of the phenomenon. In these forecasts, the probability of the event, over some lead time, is estimated based on some model simulations or predictive indicators. By issuing probabilistic forecasts, agencies may communicate the uncertainty in the event occurring. Assuming that the assigned probability of the event is correct, which is referred to as a reliable forecast, the end user may perform some risk management based on the potential damages resulting from the event. Alternatively, an unreliable forecast may give false impressions of the actual risk, leading to improper decision making when protecting resources from extreme events. Due to this requisite for reliable forecasts to perform effective risk management, this study takes a renewed look at reliability assessment in event forecasts. Illustrative experiments will be presented, showing deficiencies in the commonly available approaches (Brier Score, Reliability Diagram). Overall, it is shown that the conventional reliability assessment techniques do not maximize the ability to distinguish between a reliable and unreliable forecast. In this regard, a theoretical formulation of the probabilistic event forecast verification framework will be presented. From this analysis, hypothesis testing with the Poisson-Binomial distribution is the most exact model available for the verification framework, and therefore maximizes one's ability to distinguish between a reliable and unreliable forecast. Application of this verification system was also examined within a real forecasting case study, highlighting the additional statistical power provided with the use of the Poisson-Binomial distribution.
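The Poisson-Binomial distribution advocated above has an exact pmf computable by dynamic programming, which permits a direct hypothesis test of forecast reliability. The forecast probabilities and observed count below are invented for illustration.

```python
def poisson_binomial_pmf(probs):
    """Exact pmf of the number of events when event i occurs
    independently with probability probs[i] (dynamic programming)."""
    pmf = [1.0]
    for p in probs:
        new = [0.0] * (len(pmf) + 1)
        for k, mass in enumerate(pmf):
            new[k] += mass * (1 - p)      # event i does not occur
            new[k + 1] += mass * p        # event i occurs
        pmf = new
    return pmf

# Reliability check: is the observed event count consistent with the
# issued forecast probabilities? (Numbers invented for illustration.)
forecast_probs = [0.9, 0.8, 0.7, 0.2, 0.1, 0.1]
observed_events = 5
pmf = poisson_binomial_pmf(forecast_probs)
p_value = sum(pmf[observed_events:])   # P(K >= observed) if reliable
print(round(p_value, 4))
```

A small p-value suggests the forecasts were unreliable (events occurred more often than the stated probabilities imply), which is the distinction the Brier score and reliability diagram blur.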
Energy Efficient Probabilistic Broadcasting for Mobile Ad-Hoc Network
Kumar, Sumit; Mehfuz, Shabana
2016-08-01
In mobile ad-hoc networks (MANETs), a flooding method is used for broadcasting route request (RREQ) packets from one node to another for route discovery. This is the simplest method of broadcasting RREQ packets, but it often results in the broadcast storm problem, originating collisions and congestion of packets in the network. Probabilistic broadcasting is one of the widely used broadcasting schemes for route discovery in MANETs and provides a solution to the broadcast storm problem. But it does not consider the limited battery energy of the nodes. In this paper, a new energy-efficient probabilistic broadcasting (EEPB) is proposed in which the probability of broadcasting RREQs is calculated with respect to the remaining energy of nodes. The analysis of simulation results clearly indicates that an EEPB route discovery scheme in ad-hoc on-demand distance vector (AODV) can increase the network lifetime with a decrease in the average power consumption and RREQ packet overhead. It also decreases the number of dropped packets in the network, in comparison to other energy-aware schemes like energy constraint gossip (ECG), energy aware gossip (EAG), energy based gossip (EBG) and network lifetime through energy efficient broadcast gossip (NEBG).
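The core idea, scaling the rebroadcast probability by remaining battery energy, can be sketched as below. The linear scaling and the floor value are illustrative assumptions, not the paper's exact EEPB formula.

```python
import random

def rebroadcast_probability(base_p, remaining_energy, initial_energy,
                            min_p=0.1):
    """Illustrative EEPB-style rule (not the paper's exact formula):
    nodes with less remaining battery rebroadcast RREQs with lower
    probability, never dropping below a floor min_p."""
    energy_ratio = remaining_energy / initial_energy
    return max(min_p, base_p * energy_ratio)

def should_rebroadcast(base_p, remaining, initial, rng=random.random):
    """Bernoulli trial a node runs on receiving a fresh RREQ."""
    return rng() < rebroadcast_probability(base_p, remaining, initial)

# A node at 40% battery with base probability 0.7:
p = rebroadcast_probability(0.7, 40.0, 100.0)
print(round(p, 2))  # 0.28
```

Low-energy nodes thus conserve battery and drop out of route discovery early, which is how schemes in this family extend network lifetime relative to plain probabilistic flooding.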
Beckers, Joost; Buckman, Lora; Bachmann, Daniel; Visser, Martijn; Tollenaar, Daniel; Vatvani, Deepak; Kramer, Nienke; Goorden, Neeltje
2015-04-01
Decision making in disaster management requires fast access to reliable and relevant information. We believe that online information and services will become increasingly important in disaster management. Within the EU FP7 project RASOR (Rapid Risk Assessment and Spatialisation of Risk), an online platform is being developed for rapid multi-hazard risk analyses to support disaster management anywhere in the world. The platform will provide access to a plethora of GIS data that are relevant to risk assessment. It will also enable the user to run numerical flood models to simulate historical and newly defined flooding scenarios. The results of these models are maps of flood extent, flood depths and flow velocities. The RASOR platform will enable the user to overlay historical event flood maps with observations and Earth Observation (EO) imagery to fill in gaps and assess the accuracy of the flood models. New flooding scenarios can be defined by the user and simulated to investigate the potential impact of future floods. A series of flood models have been developed within RASOR for selected case study areas around the globe that are subject to very different flood hazards: • The city of Bandung in Indonesia, which is prone to fluvial flooding induced by heavy rainfall. The flood hazard is exacerbated by land subsidence. • The port of Cilacap on the south coast of Java, subject to tsunami hazard from submarine earthquakes in the Sunda trench. • The area south of the city of Rotterdam in the Netherlands, prone to coastal and/or riverine flooding. • The island of Santorini in Greece, which is subject to tsunamis induced by landslides. Flood models have been developed for each of these case studies using mostly EO data, augmented by local data where necessary. Particular use was made of the new TanDEM-X (TerraSAR-X add-on for Digital Elevation Measurement) product from the German Aerospace Centre (DLR) and EADS Astrium. The presentation will describe the flood models and the
Probabilistic Planning with Imperfect Sensing Actions Using Hybrid Probabilistic Logic Programs
Saad, Emad
Effective planning in uncertain environments is important to agents and multi-agent systems. In this paper, we introduce a new logic-based approach to probabilistic contingent planning (probabilistic planning with imperfect sensing actions), by relating probabilistic contingent planning to normal hybrid probabilistic logic programs with probabilistic answer set semantics [24]. We show that any probabilistic contingent planning problem can be encoded as a normal hybrid probabilistic logic program. We formally prove the correctness of our approach. Moreover, we show that the complexity of finding a probabilistic contingent plan in our approach is NP-complete. In addition, we show that any probabilistic contingent planning problem, PP, can be encoded as a classical normal logic program with answer set semantics, whose answer sets correspond to valid trajectories in PP. We show that probabilistic contingent planning problems can be encoded as SAT problems. We present a new high-level probabilistic action description language that allows the representation of sensing actions with probabilistic outcomes.
NASA Global Flood Mapping System
Policelli, Fritz; Slayback, Dan; Brakenridge, Bob; Nigro, Joe; Hubbard, Alfred
2017-01-01
Key factors for product utility: near-real-time, automated production; flood spatial extent; cloudiness; pixel resolution (250 m); flood temporal extent (flash floods may be of short duration on the ground); land cover (water under vegetation cover vs. open water).
Advances in Global Flood Forecasting Systems
Thielen-del Pozo, J.; Pappenberger, F.; Burek, P.; Alfieri, L.; Kreminski, B.; Muraro, D.
2012-12-01
A trend of an increasing number of heavy precipitation events over many regions in the world during the past century has been observed (IPCC, 2007), but conclusive results on a changing frequency or intensity of floods have not yet been established. However, the socio-economic impact, particularly of floods, is increasing at an alarming rate. Thus anticipation of severe events is becoming key for society to react in time and effectively reduce socio-economic damage. Anticipation is essential at the local as well as the national or trans-national level, since management of response and aid for major disasters requires a substantial amount of planning and information on different levels. Continental and trans-national flood forecasting systems already exist. The European Flood Awareness System (EFAS) has been developed in close collaboration with the national services and is going operational in 2012, enhancing the national forecasting centres with medium-range probabilistic added-value information while at the same time providing the European Civil Protection with harmonised information on ongoing and upcoming floods for improved aid management. Building on experiences and methodologies from EFAS, a Global Flood Awareness System (GloFAS) has now been developed jointly between researchers from the European Commission Joint Research Centre (JRC) and the European Centre for Medium-Range Weather Forecasts (ECMWF). The prototype couples HTESSEL, the land-surface scheme of the ECMWF NWP model, with the LISFLOOD hydrodynamic model for the flow routing in the river network. GloFAS is set up on a global scale with horizontal grid spacing of 0.1 degree. The system is driven with 51 ensemble members from VAREPS with a time horizon of 15 days. In order to allow for the routing in the large rivers, the coupled model is run for 45 days assuming zero rainfall after day 15. Comparison with observations has shown that in some rivers the system performs quite well while in others the hydro
Hazard function analysis for flood planning under nonstationarity
Read, Laura K.; Vogel, Richard M.
2016-05-01
The field of hazard function analysis (HFA) involves a probabilistic assessment of the "time to failure" or "return period," T, of an event of interest. HFA is used in epidemiology, manufacturing, medicine, actuarial statistics, reliability engineering, economics, and elsewhere. For a stationary process, the probability distribution function (pdf) of the return period always follows an exponential distribution; the same is not true for nonstationary processes. When the process of interest, X, exhibits nonstationary behavior, HFA can provide a complementary approach to risk analysis with analytical tools particularly useful for hydrological applications. After a general introduction to HFA, we describe a new mathematical linkage between the magnitude of the flood event, X, and its return period, T, for nonstationary processes. We derive the probabilistic properties of T for a nonstationary one-parameter exponential model of X, and then use both Monte Carlo simulation and HFA to generalize the behavior of T when X arises from a nonstationary two-parameter lognormal distribution. For this case, our findings suggest that a two-parameter Weibull distribution provides a reasonable approximation for the pdf of T. We document how HFA can provide an alternative approach to characterize the probabilistic properties of both nonstationary flood series and the resulting pdf of T.
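The stationary baseline is easy to verify by simulation: in discrete (annual) time the return period T is geometric, the discrete analogue of the exponential pdf noted above, so with annual exceedance probability p the waiting time to the first exceedance has mean 1/p.

```python
import random

random.seed(1)

p = 0.01  # annual exceedance probability of the "100-year" event

def waiting_time(p):
    """Years until the first exceedance for a stationary iid process."""
    t = 1
    while random.random() > p:
        t += 1
    return t

samples = [waiting_time(p) for _ in range(20000)]
mean_T = sum(samples) / len(samples)
print(round(mean_T))  # close to the 100-year return period
```

Under nonstationarity, p changes from year to year and this equivalence breaks down, which is precisely the case the paper's HFA machinery addresses.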
-Boundedness and -Compactness in Finite Dimensional Probabilistic Normed Spaces
Indian Academy of Sciences (India)
Reza Saadati; Massoud Amini
2005-11-01
In this paper, we prove that in a finite dimensional probabilistic normed space, every two probabilistic norms are equivalent and we study the notion of -compactness and -boundedness in probabilistic normed spaces.
Benaloh's Dense Probabilistic Encryption Revisited
Fousse, Laurent; Alnuaimi, Mohamed
2010-01-01
In 1994, Josh Benaloh proposed a probabilistic homomorphic encryption scheme, improving on the poor expansion factor of Goldwasser and Micali's scheme. Since then, numerous papers have taken advantage of Benaloh's homomorphic encryption function, including voting schemes, non-interactive verifiable secret sharing, online poker and others. In this paper we show that the original description of the scheme is incorrect, possibly resulting in ambiguous decryption of ciphertexts. We give a corrected description of the scheme, provide a complete proof of correctness and an analysis of the probability of failure in the initial description.
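A toy version of the scheme with artificially small parameters is sketched below. Note that the block size r is chosen prime here, a case where the original validity condition on y already suffices; the ambiguity analysed in the paper concerns composite r.

```python
import random

# Artificially small parameters; never use sizes like this in practice.
r = 5                       # block size: messages are residues mod r
p, q = 11, 3                # r | p-1, gcd(r, (p-1)//r) = gcd(r, q-1) = 1
n, phi = p * q, (p - 1) * (q - 1)
y = 2
assert pow(y, phi // r, n) != 1   # validity condition on y (r prime)

def encrypt(m):
    """E(m) = y^m * u^r mod n for a random unit u."""
    u = random.randrange(1, n)
    while u % p == 0 or u % q == 0:
        u = random.randrange(1, n)
    return (pow(y, m, n) * pow(u, r, n)) % n

def decrypt(c):
    """Exhaustive search over the r candidate messages (r is tiny)."""
    for m in range(r):
        if pow(c * pow(y, -m, n), phi // r, n) == 1:
            return m

print(decrypt(encrypt(3)))  # 3

# Additively homomorphic: a ciphertext product decrypts to the
# sum of the plaintexts mod r.
assert decrypt(encrypt(1) * encrypt(3) % n) == 4
```

The homomorphic property in the last line is what the voting and secret-sharing applications mentioned in the abstract rely on.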
Probabilistic Analysis of Crack Width
Directory of Open Access Journals (Sweden)
J. Marková
2000-01-01
Full Text Available Probabilistic analysis of the crack width of a reinforced concrete element is based on the formulas accepted in Eurocode 2 and European Model Code 90. The obtained values of the reliability index β seem to be satisfactory for the reinforced concrete slab that fulfils the requirements for the crack width specified in Eurocode 2. However, the reliability of the slab seems to be insufficient when the European Model Code 90 is considered; the reliability index is less than the recommended value of 1.5 for serviceability limit states indicated in Eurocode 1. Analysis of the sensitivity factors of basic variables makes it possible to identify the variables that significantly affect the total crack width.
Institute of Scientific and Technical Information of China (English)
无
2010-01-01
China braces for a particularly dangerous flood season in the wake of disastrous rainstorms. A series of heavy storms since early May led to severe flooding and landslides in south and southwest China, causing heavy casualties and economic losses. Severe convective weather such as downpours,
Discover Floods Educators Guide
Project WET Foundation, 2009
2009-01-01
Now available as a Download! This valuable resource helps educators teach students about both the risks and benefits of flooding through a series of engaging, hands-on activities. Acknowledging the different roles that floods play in both natural and urban communities, the book helps young people gain a global understanding of this common--and…
Looking at the big scale - Global Flood Forecasting
Burek, P.; Alfieri, L.; Thielen-del Pozo, J.; Muraro, D.; Pappenberger, F.; Krzeminsk, B.
2012-04-01
Reacting to the increasing need for better preparedness for worldwide hydrological extremes, the Joint Research Centre has joined forces with the European Centre for Medium-Range Weather Forecasts (ECMWF) to couple state-of-the-art weather forecasts with a hydrological model on a global scale. On a pre-operational basis, a full hydro-meteorological flood forecasting model has been running since July 2011, producing daily probabilistic discharge forecasts with worldwide coverage and a forecast horizon of about 1 month. An important aspect of this global system is that it is set up on a continental scale and therefore independent of administrative and political boundaries, providing downstream countries with information on upstream river conditions as well as continental and global overviews. The prototype of a Global Flood Alert System consists of the HTESSEL land surface scheme coupled with the LISFLOOD hydrodynamic model for the flow routing in the river network. Both hydrological models are set up with global coverage, a horizontal grid resolution of 0.1° and a daily time step for input and output data. To estimate corresponding discharge warning thresholds for selected return periods, the coupled HTESSEL-LISFLOOD hydrological model is driven with ERA-Interim input meteorological data for a 21-year period from 1989 onward. For daily forecasts, the ensemble stream flow predictions are run by feeding Variable Resolution Ensemble Prediction System (VarEPS) weather forecasts into the coupled model. VarEPS consists of 51-member ensemble global forecasts for 15 days. The hydrological simulations are computed for a 45-day time horizon, to account for the routing of flood waves through large river basins with times of concentration on the order of one month. Both results, the discharge thresholds from the long-term run and the multiple hydrographs of the daily ensemble stream flow prediction, are joined together to produce probabilistic information of critical threshold exceedance. Probabilistic
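The final step described, joining long-term return-period discharge thresholds with the daily ensemble hydrographs to produce probabilistic threshold-exceedance information, can be sketched as follows; the thresholds and synthetic hydrographs are invented.

```python
import numpy as np

rng = np.random.default_rng(7)

# Return-period discharge thresholds (m^3/s), as would come from the
# long-term ERA-Interim-driven run; values invented.
thresholds = {"2-year": 1200.0, "5-year": 1800.0, "20-year": 2600.0}

# 51 synthetic 45-day ensemble hydrographs of daily discharge (m^3/s)
members = rng.gamma(shape=2.0, scale=300.0, size=(51, 45))
peaks = members.max(axis=1)   # peak discharge per ensemble member

# Probability of critical threshold exceedance = fraction of members
# whose forecast hydrograph crosses each threshold.
probs = {label: float((peaks > q).mean()) for label, q in thresholds.items()}
for label, pr in probs.items():
    print(f"{label} threshold: {pr:.0%} of members exceed")
```

Higher thresholds are exceeded by fewer members, so the resulting warning levels are naturally nested by severity.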
Probabilistic analysis of tsunami hazards
Geist, E.L.; Parsons, T.
2006-01-01
Determining the likelihood of a disaster is a key component of any comprehensive hazard assessment. This is particularly true for tsunamis, even though most tsunami hazard assessments have in the past relied on scenario-based or deterministic models. We discuss probabilistic tsunami hazard analysis (PTHA) from the standpoint of integrating computational methods with empirical analysis of past tsunami runup. PTHA is derived from probabilistic seismic hazard analysis (PSHA), with the main difference being that PTHA must account for far-field sources. The computational methods rely on numerical tsunami propagation models rather than the empirical attenuation relationships used in PSHA to determine ground motions. Because a number of source parameters affect local tsunami runup height, PTHA can become complex and computationally intensive. Empirical analysis can function in one of two ways, depending on the length and completeness of the tsunami catalog. For site-specific studies where there is sufficient tsunami runup data available, hazard curves can primarily be derived from empirical analysis, with computational methods used to highlight deficiencies in the tsunami catalog. For region-wide analyses and sites where there are little to no tsunami data, a computationally based method such as Monte Carlo simulation is the primary method to establish tsunami hazards. Two case studies that describe how computational and empirical methods can be integrated are presented for Acapulco, Mexico (site-specific) and the U.S. Pacific Northwest coastline (region-wide analysis).
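A Monte Carlo treatment along the lines described, aggregating runup exceedance rates over near- and far-field sources, might look like the sketch below; the source rates and lognormal runup parameters are invented.

```python
import math
import random

random.seed(3)

# Invented source terms: annual event rate and a lognormal runup model
# (mu, sigma of log-runup in metres) for each source zone.
sources = [
    {"rate": 0.01, "mu": 1.5, "sigma": 0.6},   # local source
    {"rate": 0.05, "mu": 0.2, "sigma": 0.5},   # far-field sources
]

def exceedance_rate(h, n=100_000):
    """Annual rate of tsunami runup exceeding h metres at the site,
    aggregated over all sources (Monte Carlo estimate of
    P(runup > h | event) for each source)."""
    lam = 0.0
    for s in sources:
        hits = sum(random.lognormvariate(s["mu"], s["sigma"]) > h
                   for _ in range(n))
        lam += s["rate"] * hits / n
    return lam

lam2 = exceedance_rate(2.0)
prob_annual = 1 - math.exp(-lam2)   # Poissonian annual probability
print(f"rate = {lam2:.4f}/yr, annual exceedance prob = {prob_annual:.4f}")
```

Sweeping h produces a hazard curve, the probabilistic counterpart of a single deterministic scenario.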
Why do probabilistic finite element analysis ?
Thacker, B H
2008-01-01
The intention of this book is to provide an introduction to performing probabilistic finite element analysis. As a short guideline, the objective is to inform the reader of the use, benefits and issues associated with performing probabilistic finite element analysis without excessive theory or mathematical detail.
Function Approximation Using Probabilistic Fuzzy Systems
J.H. van den Berg (Jan); U. Kaymak (Uzay); R.J. Almeida e Santos Nogueira (Rui Jorge)
2011-01-01
We consider function approximation by fuzzy systems. Fuzzy systems are typically used for approximating deterministic functions, in which the stochastic uncertainty is ignored. We propose probabilistic fuzzy systems in which the probabilistic nature of uncertainty is taken into account.
Probabilistic Remaining Useful Life Prediction of Composite Aircraft Components Project
National Aeronautics and Space Administration — A Probabilistic Fatigue Damage Assessment Network (PFDAN) toolkit for Abaqus will be developed for probabilistic life management of a laminated composite structure...
Semantics of sub-probabilistic programs
Institute of Scientific and Technical Information of China (English)
Yixing CHEN; Hengyang WU
2008-01-01
The aim of this paper is to extend the probabilistic choice in probabilistic programs to sub-probabilistic choice, i.e., of the form (p)P (q)Q where p + q ≤ 1. It means that program P is executed with probability p and program Q is executed with probability q. Then, starting from an initial state, the execution of a sub-probabilistic program results in a sub-probability distribution. This paper presents two equivalent semantics for a sub-probabilistic while-programming language. One of these interprets programs as sub-probability distributions on state spaces via denotational semantics. The other interprets programs as bounded expectation transformers via wp-semantics. This paper proposes an axiomatic system for total logic, and proves its soundness and completeness in a classical pattern on the structure of programs.
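The denotational reading of sub-probabilistic choice can be made concrete with a small sketch. Assuming programs are represented as sub-probability distributions over final states (dictionaries mapping state to mass), the choice (p)P (q)Q combines them with weights p and q; the names below are illustrative, not from the paper.

```python
# Hedged sketch of sub-probabilistic choice (p)P (q)Q with p + q <= 1:
# execution from an initial state yields a sub-probability distribution
# over final states, where the missing mass models divergence/abortion.

def sub_choice(p, dist_P, q, dist_Q):
    """Combine the sub-distributions of programs P and Q with weights p and q."""
    assert p >= 0 and q >= 0 and p + q <= 1
    result = {}
    for state, mass in dist_P.items():
        result[state] = result.get(state, 0.0) + p * mass
    for state, mass in dist_Q.items():
        result[state] = result.get(state, 0.0) + q * mass
    return result

# P terminates in state 'a' with certainty; Q terminates in state 'b'.
d = sub_choice(0.5, {'a': 1.0}, 0.3, {'b': 1.0})
# total mass 0.8 < 1: the remaining 0.2 is the probability of non-termination
```

Because p + q may be strictly less than 1, the total mass of the result can drop below 1, which is exactly the sub-probability behaviour the semantics formalises.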
Distillation Column Flooding Predictor
Energy Technology Data Exchange (ETDEWEB)
George E. Dzyacky
2010-11-23
The Flooding Predictor™ is a patented advanced control technology proven in research at the Separations Research Program, University of Texas at Austin, to increase distillation column throughput by over 6%, while also increasing energy efficiency by 10%. The research was conducted under a U.S. Department of Energy Cooperative Agreement awarded to George Dzyacky of 2ndpoint, LLC. The Flooding Predictor™ works by detecting the incipient flood point and controlling the column closer to its actual hydraulic limit than historical practices have allowed. Further, the technology uses existing column instrumentation, meaning no additional refining infrastructure is required. Refiners often push distillation columns to maximize throughput, improve separation, or simply to achieve day-to-day optimization. Attempting to achieve such operating objectives is a tricky undertaking that can result in flooding. Operators and advanced control strategies alike rely on the conventional use of delta-pressure instrumentation to approximate the column’s approach to flood. But column delta-pressure is more an inference of the column’s approach to flood than it is an actual measurement of it. As a consequence, delta-pressure limits are established conservatively in order to operate in a regime where the column is never expected to flood. As a result, there is much “left on the table” when operating in such a regime, i.e., the capacity difference between controlling the column to an upper delta-pressure limit and controlling it to the actual hydraulic limit. The Flooding Predictor™, an innovative pattern recognition technology, controls columns at their actual hydraulic limit, which research shows leads to a throughput increase of over 6%. Controlling closer to the hydraulic limit also permits operation in a sweet spot of increased energy-efficiency. In this region of increased column loading, the Flooding Predictor is able to exploit the benefits of higher liquid
Inland and coastal flooding: developments in prediction and prevention.
Hunt, J C R
2005-06-15
We review the scientific and engineering understanding of various types of inland and coastal flooding by considering the different causes and dynamic processes involved, especially in extreme events. Clear progress has been made in the accuracy of numerical modelling of meteorological causes of floods, hydraulics of flood water movement and coastal wind-wave-surge. Probabilistic estimates from ensemble predictions and the simultaneous use of several models are recent techniques in meteorological prediction that could be considered for hydraulic and oceanographic modelling. The contribution of remotely sensed data from aircraft and satellites is also considered. The need to compare and combine statistical and computational modelling methodologies for long range forecasts and extreme events is emphasized, because this has become possible with the aid of kilometre scale computations and network grid facilities to simulate and analyse time-series and extreme events. It is noted that despite the adverse effects of climatic trends on flooding, appropriate planning of rapidly growing urban areas could mitigate some of the worst effects. However, resources for flood prevention, including research, have to be considered in relation to those for other natural disasters. Policies have to be relevant to the differing geology, meteorology and cultures of the countries affected.
A Dynamic Probabilistic Broadcasting Scheme based on Cross-Layer design for MANETs
Directory of Open Access Journals (Sweden)
Qing-wen WANG
2010-11-01
Broadcasting plays a fundamental role in transmitting a message from the sender to the rest of the network nodes in Mobile Ad hoc Networks (MANETs). The blind flooding scheme causes a broadcast storm problem, which leads to significant network performance degradation. To solve this problem, a dynamic probabilistic broadcasting scheme based on cross-layer design for MANETs (DPBSC) is proposed. DPBSC adopts a cross-layer design in which the routing layer shares the received signal power information from the MAC layer while still maintaining separation between the two layers. The additional transmission range that can benefit from a rebroadcast is calculated from the received signal power and is used to dynamically adjust the rebroadcast probability. DPBSC reduces redundant retransmissions and the chance of contention and collision in the network. Simulation results reveal that DPBSC achieves better performance in terms of saved rebroadcasts, average packet drop fraction, average number of collisions, and average end-to-end delay, at the expense of throughput, compared with blind flooding and fixed probabilistic flooding applied at the routing layer with IEEE 802.11 at the MAC layer.
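The core adjustment rule can be sketched in a few lines. This is a minimal illustration of the idea, not the paper's exact formulation: the power-law path-loss model, its constants, and the linear mapping to a probability are all assumptions.

```python
# Hedged sketch of the DPBSC idea: the farther a node is from the
# broadcaster (inferred from received signal power), the larger the extra
# area its rebroadcast would cover, so the higher its rebroadcast
# probability. The propagation model and constants are illustrative.
import math

def estimated_distance(p_tx, p_rx, alpha=2.0, c=1.0):
    """Invert a simple power-law path-loss model p_rx = c * p_tx / d**alpha."""
    return (c * p_tx / p_rx) ** (1.0 / alpha)

def rebroadcast_probability(p_tx, p_rx, radio_range, p_min=0.1, p_max=1.0):
    """Map estimated distance to a rebroadcast probability in [p_min, p_max]."""
    d = min(estimated_distance(p_tx, p_rx), radio_range)
    gain = d / radio_range  # proxy for the additional coverage a rebroadcast adds
    return p_min + (p_max - p_min) * gain

# A node near the edge of the radio range rebroadcasts almost surely;
# a node right next to the sender rarely does.
p_edge = rebroadcast_probability(p_tx=1.0, p_rx=1e-4, radio_range=100.0)
p_near = rebroadcast_probability(p_tx=1.0, p_rx=1.0, radio_range=100.0)
```

Keeping a floor `p_min` above zero preserves reachability in sparse regions, a design concern shared by most probabilistic flooding schemes.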
Crochemore, Louise; Ramos, Maria-Helena; Pappenberger, Florian; van Andel, Schalk-Jan; Wood, Andy
2014-05-01
Probabilistic streamflow forecasts have been increasingly used or requested by practitioners in the operation of multipurpose water reservoirs. They usually integrate hydrologic inflow forecasts into their operational management rules to optimize water allocation or its economic value, to mitigate droughts, and for flood and ecological control, among other purposes. In this paper, we present an experiment conducted to investigate the use of probabilistic forecasts to make decisions on water reservoir outflows. The experiment was set up as a risk-based decision-making game. In the game, each participant acted as a water manager. A sequence of probabilistic inflow forecasts was presented, to be used to make a reservoir release decision at a monthly time step, subject to a few constraints. After each decision, the actual inflow was presented and the consequences of the decisions made were discussed. Results from the application of the game to different groups of scientists and operational managers during conferences and meetings in 2013 (a total of about 150 participants) illustrate the different strategies adopted by the players. This game experiment allowed participants to experience first-hand the challenges of probabilistic, quantitative decision-making.
A Dynamic Probabilistic Based Broadcasting Scheme for MANETs
Shanmugam, Kannan; Subburathinam, Karthik; Velayuthampalayam Palanisamy, Arunachalam
2016-01-01
MANET is commonly known as Mobile Ad Hoc Network in which cluster of mobile nodes can communicate with each other without having any basic infrastructure. The basic characteristic of MANET is dynamic topology. Due to the dynamic behavior nature, the topology of the network changes very frequently, and this will lead to the failure of the valid route repeatedly. Thus, the process of finding the valid route leads to notable drop in the throughput of the network. To identify a new valid path to the targeted mobile node, available proactive routing protocols use simple broadcasting method known as simple flooding. The simple flooding method broadcasts the RREQ packet from the source to the rest of the nodes in mobile network. But the problem with this method is disproportionate repetitive retransmission of RREQ packet which could result in high contention on the available channel and packet collision due to extreme traffic in the network. A reasonable number of routing algorithms have been suggested for reducing the lethal impact of flooding the RREQ packets. However, most of the algorithms have resulted in considerable amount of complexity and deduce the throughput by depending on special hardware components and maintaining complex information which will be less frequently used. By considering routing complexity with the goal of increasing the throughput of the network, in this paper, we have introduced a new approach called Dynamic Probabilistic Route (DPR) discovery. The Node's Forwarding Probability (NFP) is dynamically calculated by the DPR mobile nodes using Probability Function (PF) which depends on density of local neighbor nodes and the cumulative number of its broadcast covered neighbors. PMID:27019868
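A probability function in the spirit of the DPR scheme can be sketched as follows. The exact PF used in the paper is not given in the abstract; the functional form, the `p_min`/`p_max` bounds, and the density normalisation below are illustrative assumptions only.

```python
# Hedged sketch of a DPR-style forwarding probability: it falls as more of
# a node's neighbours are already covered by the broadcast and as local
# density rises above the network average. Illustrative form, not the
# paper's exact Probability Function.

def forwarding_probability(num_neighbors, covered_neighbors, avg_density,
                           p_min=0.1, p_max=1.0):
    if num_neighbors == 0:
        return 0.0  # isolated node: rebroadcast reaches nobody
    uncovered = 1.0 - covered_neighbors / num_neighbors
    # Nodes denser than the network average forward less often.
    density = min(1.0, avg_density / num_neighbors)
    return max(p_min, min(p_max, uncovered * density))

# A frontier node (few covered neighbours) forwards with high probability;
# a node whose neighbourhood is mostly covered rarely does.
p_frontier = forwarding_probability(10, 1, avg_density=10)
p_redundant = forwarding_probability(10, 9, avg_density=10)
```

The floor `p_min` plays the same role as in other probabilistic flooding schemes: it keeps sparse parts of the network reachable even when coverage estimates suggest a rebroadcast is redundant.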
Implementation of external hazards in Probabilistic Safety Assessment for nuclear power plants
Kumar, Manorma; Klug, Joakim; Raimond, Emmanuel
2015-04-01
The paper focuses on the implementation of external hazards in probabilistic safety assessment (PSA) methods for extreme external hazards, mainly seismic, flooding, meteorological hazards (e.g., storms, extreme temperatures, snow pack), biological infestation, lightning hazards, accidental aircraft crash, and man-made hazards including natural external fire and external explosion. This includes a discussion of good practices for the implementation of external hazards in Level 1 PSA, with a perspective of developing extended PSA and introducing relevant modelling of external hazards into an existing Level 1 PSA. This paper is associated with the European project ASAMPSAE (www.asampsa.eu), which gathers more than 30 organizations (industry, research, safety control) from Europe, the US and Japan, and which aims at identifying meaningful practices to extend the scope and quality of the existing probabilistic safety analyses developed for nuclear power plants.
Oulahen, Greg
2015-03-01
Insurance coverage of damage caused by overland flooding is currently not available to Canadian homeowners. As flood disaster losses and water damage claims both trend upward, insurers in Canada are considering offering residential flood coverage in order to properly underwrite the risk and extend their business. If private flood insurance is introduced in Canada, it will have implications for the current regime of public flood management and for residential vulnerability to flood hazards. This paper engages many of the competing issues surrounding the privatization of flood risk by addressing questions about whether flood insurance can be an effective tool in limiting exposure to the hazard and how it would exacerbate already unequal vulnerability. A case study investigates willingness to pay for flood insurance among residents in Metro Vancouver and how attitudes about insurance relate to other factors that determine residential vulnerability to flood hazards. Findings indicate that demand for flood insurance is part of a complex, dialectical set of determinants of vulnerability.
Du, Weiwei; FitzGerald, Gerard Joseph; Clark, Michele; Hou, Xiang-Yu
2010-01-01
Floods are the most common hazard to cause disasters and have led to extensive morbidity and mortality throughout the world. The impact of floods on the human community is related directly to the location and topography of the area, as well as human demographics and characteristics of the built environment. The aim of this study is to identify the health impacts of disasters and the underlying causes of health impacts associated with floods. A conceptual framework is developed that may assist with the development of a rational and comprehensive approach to prevention, mitigation, and management. This study involved an extensive literature review that located >500 references, which were analyzed to identify common themes, findings, and expert views. The findings then were distilled into common themes. The health impacts of floods are wide ranging, and depend on a number of factors. However, the health impacts of a particular flood are specific to the particular context. The immediate health impacts of floods include drowning, injuries, hypothermia, and animal bites. Health risks also are associated with the evacuation of patients, loss of health workers, and loss of health infrastructure including essential drugs and supplies. In the medium-term, infected wounds, complications of injury, poisoning, poor mental health, communicable diseases, and starvation are indirect effects of flooding. In the long-term, chronic disease, disability, poor mental health, and poverty-related diseases including malnutrition are the potential legacy. This article proposes a structured approach to the classification of the health impacts of floods and a conceptual framework that demonstrates the relationships between floods and the direct and indirect health consequences.
A probabilistic strategy for parametric catastrophe insurance
Figueiredo, Rui; Martina, Mario; Stephenson, David; Youngman, Benjamin
2017-04-01
Economic losses due to natural hazards have shown an upward trend since 1980, which is expected to continue. Recent years have seen a growing worldwide commitment towards the reduction of disaster losses. This requires effective management of disaster risk at all levels, a part of which involves reducing financial vulnerability to disasters ex-ante, ensuring that necessary resources will be available following such events. One way to achieve this is through risk transfer instruments. These can be based on different types of triggers, which determine the conditions under which payouts are made after an event. This study focuses on parametric triggers, where payouts are determined by the occurrence of an event exceeding specified physical parameters at a given location, or at multiple locations, or over a region. This type of product offers a number of important advantages, and its adoption is increasing. The main drawback of parametric triggers is their susceptibility to basis risk, which arises when there is a mismatch between triggered payouts and the occurrence of loss events. This is unavoidable in said programmes, as their calibration is based on models containing a number of different sources of uncertainty. Thus, a deterministic definition of the loss event triggering parameters appears flawed. However, often for simplicity, this is the way in which most parametric models tend to be developed. This study therefore presents an innovative probabilistic strategy for parametric catastrophe insurance. It is advantageous as it recognizes uncertainties and minimizes basis risk while maintaining a simple and transparent procedure. A logistic regression model is constructed here to represent the occurrence of loss events based on certain loss index variables, obtained through the transformation of input environmental variables. Flood-related losses due to rainfall are studied. The resulting model is able, for any given day, to issue probabilities of occurrence of loss
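The probabilistic trigger described above can be sketched numerically. This is a minimal illustration assuming an invented logistic model: the coefficients and the loss-index scale are hypothetical, not the fitted values from the study.

```python
# Hedged sketch of a probabilistic parametric trigger: instead of a hard
# threshold on a loss index, a logistic model issues the probability that
# a day is a loss event, and the payout is scaled by that probability.
# Coefficients are invented for illustration.
import math

def loss_event_probability(loss_index, beta0=-6.0, beta1=2.0):
    """Logistic regression on a (rainfall-derived) loss index."""
    z = beta0 + beta1 * loss_index
    return 1.0 / (1.0 + math.exp(-z))

def expected_payout(loss_index, max_payout):
    """Probabilistic payout: maximum payout scaled by the event probability."""
    return max_payout * loss_event_probability(loss_index)

p = expected_payout(3.0, max_payout=100.0)
```

Smoothing the trigger this way is what reduces basis risk: near-miss events that a hard threshold would map to zero payout instead receive a partial payout proportional to their estimated probability of being loss events.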
Jonkman, S.N.; Mooyaart, L.F.; Van Ledden, M.; Stoeten, K.J.; De Vries, P.A.L.; Lendering, K.T.; Van der Toorn, A.; Willems, A.
2014-01-01
The Houston-Galveston area is at significant risk from hurricane-induced storm surges. This paper summarizes ongoing studies on flood risk reduction for the region. Firstly, based on a simplified probabilistic hurricane surge model, the return periods of surges within the bay have been estimated.
Development of flood index by characterisation of flood hydrographs
Bhattacharya, Biswa; Suman, Asadusjjaman
2015-04-01
In recent years the world has experienced deaths, large-scale displacement of people, billions of euros of economic damage, mental stress and ecosystem impacts due to flooding. Global changes (climate change, population and economic growth, and urbanisation) are exacerbating the severity of flooding. The 2010 floods in Pakistan and the 2011 floods in Australia and Thailand demonstrate the need for concerted action in the face of global societal and environmental changes to strengthen resilience against flooding. Due to climatological characteristics there are catchments where flood forecasting may have a relatively limited role and flood event management may have to be relied upon. For example, in flash flood catchments, which may be tiny and un-gauged, flood event management often depends on approximate prediction tools such as flash flood guidance (FFG). There are catchments fed largely by flood waters coming from upstream catchments which are un-gauged, or where, due to data-sharing issues in transboundary catchments, the flow of information from the upstream catchment is limited. Hydrological and hydraulic modelling of these downstream catchments will never be sufficient to provide the required forecasting lead time, and alternative tools to support flood event management will be required. In FFG, or similar approaches, the primary aim is to provide guidance by synthesising historical data. We follow a similar approach and characterise past flood hydrographs to determine a flood index (FI), which varies in space and time with flood magnitude and its propagation. By studying the variation of the index, the pockets of high flood risk requiring attention can be earmarked beforehand. This approach can be very useful in flood risk management of catchments where information about hydro-meteorological variables is inadequate for any forecasting system. This paper presents the development of the FI and its application to several catchments, including in Kentucky in the USA.
Norman, Laura M.; Levick, Lainie; Guertin, D. Phillip; Callegary, James; Guadarrama, Jesus Quintanar; Anaya, Claudia Zulema Gil; Prichard, Andrea; Gray, Floyd; Castellanos, Edgar; Tepezano, Edgar; Huth, Hans; Vandervoet, Prescott; Rodriguez, Saul; Nunez, Jose; Atwood, Donald; Granillo, Gilberto Patricio Olivero; Ceballos, Francisco Octavio Gastellum
2010-01-01
Flooding in Ambos Nogales often exceeds the capacity of the channel and adjacent land areas, endangering many people. The Nogales Wash is being studied to prevent future flood disasters and detention features are being installed in tributaries of the wash. This paper describes the application of the KINEROS2 model and efforts to understand the capacity of these detention features under various flood and urbanization scenarios. Results depict a reduction in peak flow for the 10-year, 1-hour event based on current land use in tributaries with detention features. However, model results also demonstrate that larger storm events and increasing urbanization will put a strain on the features and limit their effectiveness.
Probabilistic Fatigue Damage Program (FATIG)
Michalopoulos, Constantine
2012-01-01
FATIG computes fatigue damage/fatigue life using the stress rms (root mean square) value, the total number of cycles, and S-N curve parameters. The damage is computed by the following methods: (a) the traditional method using Miner's rule, with stress cycles determined from a Rayleigh distribution up to 3*sigma; and (b) the classical fatigue damage formula involving the Gamma function, which is derived from the integral version of Miner's rule. The integration is carried out over all stress amplitudes. This software solves the problem of probabilistic fatigue damage using the integral form of the Palmgren-Miner rule. The software computes fatigue life using an approach involving all stress amplitudes, up to N*sigma, as specified by the user. It can be used in the design of structural components subjected to random dynamic loading, or by any stress analyst with minimal training for fatigue life estimates of structural components.
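The two computations described above can be sketched side by side. This is a textbook illustration of the underlying mathematics, not the FATIG code: it assumes an S-N curve of the form N(S) = A·S^(-m) and Rayleigh-distributed stress amplitudes, with all numbers invented.

```python
# Hedged sketch of the two damage estimates: a discrete Miner's-rule sum
# over Rayleigh-distributed stress amplitudes, and the closed-form
# Gamma-function expression from the integral form of the Palmgren-Miner
# rule. S-N curve N(S) = A * S**(-m); parameters are illustrative.
import math

def damage_closed_form(sigma_rms, n_cycles, A, m):
    """D = (n/A) * (sqrt(2)*sigma)^m * Gamma(1 + m/2)."""
    return n_cycles / A * (math.sqrt(2.0) * sigma_rms) ** m * math.gamma(1.0 + m / 2.0)

def damage_miner_sum(sigma_rms, n_cycles, A, m, s_max_factor=8.0, bins=20000):
    """Discrete Miner sum D = sum(n_i / N(S_i)) with Rayleigh-distributed S_i."""
    s_max = s_max_factor * sigma_rms
    ds = s_max / bins
    damage = 0.0
    for i in range(bins):
        s = (i + 0.5) * ds                 # midpoint of the amplitude bin
        pdf = s / sigma_rms**2 * math.exp(-s * s / (2.0 * sigma_rms**2))
        n_i = n_cycles * pdf * ds          # cycles expected at amplitude s
        damage += n_i / (A * s ** (-m))    # n_i / N(S_i)
    return damage

d_closed = damage_closed_form(1.0, 1e6, A=1e12, m=3.0)
d_sum = damage_miner_sum(1.0, 1e6, A=1e12, m=3.0)
```

With the integration carried far enough into the Rayleigh tail, the discrete sum converges to the Gamma-function formula, which is the consistency check the two methods in the abstract provide for each other.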
The Complexity of Probabilistic Lobbying
Erdélyi, Gábor; Goldsmith, Judy; Mattei, Nicholas; Raible, Daniel; Rothe, Jörg
2009-01-01
We propose various models for lobbying in a probabilistic environment, in which an actor (called "The Lobby") seeks to influence the voters' preferences of voting for or against multiple issues when the voters' preferences are represented in terms of probabilities. In particular, we provide two evaluation criteria and three bribery methods to formally describe these models, and we consider the resulting forms of lobbying with and without issue weighting. We provide a formal analysis for these problems of lobbying in a stochastic environment, and determine their classical and parameterized complexity depending on the given bribery/evaluation criteria. Specifically, we show that some of these problems can be solved in polynomial time, some are NP-complete but fixed-parameter tractable, and some are W[2]-complete. Finally, we provide (in)approximability results.
Machine learning a probabilistic perspective
Murphy, Kevin P
2012-01-01
Today's Web-enabled deluge of electronic data calls for automated methods of data analysis. Machine learning provides these, developing methods that can automatically detect patterns in data and then use the uncovered patterns to predict future data. This textbook offers a comprehensive and self-contained introduction to the field of machine learning, based on a unified, probabilistic approach. The coverage combines breadth and depth, offering necessary background material on such topics as probability, optimization, and linear algebra as well as discussion of recent developments in the field, including conditional random fields, L1 regularization, and deep learning. The book is written in an informal, accessible style, complete with pseudo-code for the most important algorithms. All topics are copiously illustrated with color images and worked examples drawn from such application domains as biology, text processing, computer vision, and robotics. Rather than providing a cookbook of different heuristic method...
Probabilistic simulation of fire scenarios
Energy Technology Data Exchange (ETDEWEB)
Hostikka, Simo; Keski-Rahkonen, Olavi
2003-10-01
A risk analysis tool is developed for computation of the distributions of fire model output variables. The tool, called Probabilistic Fire Simulator (PFS), combines Monte Carlo simulation and CFAST, a two-zone fire model. In this work, the tool is used to estimate the failure probability of redundant cables in a cable tunnel fire, and the failure and smoke filling probabilities in an electronics room during an electronics cabinet fire. Sensitivity of the output variables to the input variables is calculated in terms of the rank order correlations. The use of the rank order correlations allows the user to identify both modelling parameters and actual facility properties that have the most influence on the results. Various steps of the simulation process, i.e. data collection, generation of the input distributions, modelling assumptions, definition of the output variables and the actual simulation, are described.
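The workflow of sampling uncertain inputs and ranking their influence by rank-order correlation can be sketched as follows. The "fire model" here is an invented one-line placeholder standing in for CFAST, and the input distributions are illustrative.

```python
# Hedged sketch of the PFS workflow: Monte Carlo sampling of uncertain
# fire-model inputs, then Spearman rank-order correlation of each input
# with the output to identify the dominant parameters. The toy "model"
# below is a placeholder, not CFAST.
import random

def ranks(values):
    """Rank of each value (0 = smallest); inputs here have no ties."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0] * len(values)
    for rank, i in enumerate(order):
        r[i] = rank
    return r

def spearman(x, y):
    """Spearman correlation = Pearson correlation of the ranks.
    Both rank vectors are permutations of 0..n-1, so their variances
    are equal and cov/var gives the coefficient directly."""
    rx, ry = ranks(x), ranks(y)
    mean = (len(x) - 1) / 2.0
    cov = sum((a - mean) * (b - mean) for a, b in zip(rx, ry))
    var = sum((a - mean) ** 2 for a in rx)
    return cov / var

random.seed(1)
n = 500
hrr = [random.uniform(0.5, 2.0) for _ in range(n)]   # heat release rate (uncertain)
vent = [random.uniform(0.1, 1.0) for _ in range(n)]  # vent opening (uncertain)
# Toy output: cable failure time shortens with heat release, lengthens with venting.
t_fail = [10.0 / h + 5.0 * v for h, v in zip(hrr, vent)]

rho_hrr = spearman(hrr, t_fail)
rho_vent = spearman(vent, t_fail)
```

As in PFS, the signs and magnitudes of the rank correlations tell the analyst which inputs (modelling parameters or facility properties) most influence the output distribution and therefore deserve better data.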
Probabilistic direct counterfactual quantum communication
Zhang, Sheng
2017-02-01
It is striking that the quantum Zeno effect can be used to launch a direct counterfactual communication between two spatially separated parties, Alice and Bob. So far, existing protocols of this type only provide a deterministic counterfactual communication service. However, this counterfactuality comes at a price. Firstly, the transmission takes much longer than a classical transmission. Secondly, the chained-cycle structure makes such protocols more sensitive to channel noise. Here, we extend the idea of counterfactual communication and present a probabilistic-counterfactual quantum communication protocol, which is proved to have advantages over the deterministic ones. Moreover, the presented protocol can evolve into a deterministic one solely by adjusting the parameters of the beam splitters. Project supported by the National Natural Science Foundation of China (Grant No. 61300203).
Probabilistic cloning with supplementary information
Azuma, Koji; Shimamura, Junichi; Koashi, Masato; Imoto, Nobuyuki
2005-01-01
We consider probabilistic cloning of a state chosen from a mutually nonorthogonal set of pure states, with the help of a party holding supplementary information in the form of pure states. When the number of states is two, we show that the best efficiency of producing m copies is always achieved by a two-step protocol in which the helping party first attempts to produce m-1 copies from the supplementary state, and if it fails, then the original state is used to produce m copies. On the other hand, when the number of states exceeds two, the best efficiency is not always achieved by such a protocol. We give examples in which the best efficiency is not achieved even if we allow any amount of one-way classical communication from the helping party.
Federal Emergency Management Agency, Department of Homeland Security — The Floodplain Mapping/Redelineation study deliverables depict and quantify the flood risks for the study area. The primary risk classifications used are the...
practitioners will cover a range of practices that can help communities build flood resilience, from small scale interventions such as rain gardens and permeable pavement to coordinated open space and floodplain preservation
Mold growth may be a problem after flooding. Excess moisture in the home is cause for concern about indoor air quality primarily because it provides breeding conditions for pests, molds and other microorganisms.
Probabilistic machine learning and artificial intelligence.
Ghahramani, Zoubin
2015-05-28
How can a machine learn from experience? Probabilistic modelling provides a framework for understanding what learning is, and has therefore emerged as one of the principal theoretical and practical approaches for designing machines that learn from data acquired through experience. The probabilistic framework, which describes how to represent and manipulate uncertainty about models and predictions, has a central role in scientific data analysis, machine learning, robotics, cognitive science and artificial intelligence. This Review provides an introduction to this framework, and discusses some of the state-of-the-art advances in the field, namely, probabilistic programming, Bayesian optimization, data compression and automatic model discovery.
A History of Probabilistic Inductive Logic Programming
Directory of Open Access Journals (Sweden)
Fabrizio eRiguzzi
2014-09-01
Full Text Available The field of Probabilistic Logic Programming (PLP) has seen significant advances in the last 20 years, with many proposals for languages that combine probability with logic programming. Since the start, the problem of learning probabilistic logic programs has been the focus of much attention. Learning these programs represents a whole subfield of Inductive Logic Programming (ILP). In Probabilistic ILP (PILP) two problems are considered: learning the parameters of a program given the structure (the rules), and learning both the structure and the parameters. Usually structure learning systems use parameter learning as a subroutine. In this article we present an overview of PILP and discuss the main results.
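The distribution semantics underlying most PLP languages can be sketched by brute-force world enumeration: each probabilistic fact is independently true with its stated probability, and a query's probability is the total weight of the possible worlds where the query succeeds. The sketch below is an illustrative toy, not ProbLog or any PILP system; the burglary/earthquake facts and the `alarm` rule are invented for the example:

```python
from itertools import product

# Probabilistic facts: each is independently true with the given probability.
prob_facts = {"burglary": 0.1, "earthquake": 0.2}

# A deterministic rule: alarm holds if burglary or earthquake holds.
def alarm(world):
    return world["burglary"] or world["earthquake"]

def query_probability(query):
    """Sum the probability of every possible world in which the query holds."""
    names = list(prob_facts)
    total = 0.0
    for truth in product([True, False], repeat=len(names)):
        world = dict(zip(names, truth))
        p = 1.0
        for name in names:
            p *= prob_facts[name] if world[name] else 1.0 - prob_facts[name]
        if query(world):
            total += p
    return total

print(round(query_probability(alarm), 4))  # P(alarm) = 1 - 0.9*0.8 = 0.28
```

Parameter learning in PILP then amounts to adjusting the fact probabilities so that such query probabilities match training data; real systems replace the exponential enumeration with knowledge compilation.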
Probabilistic Modeling of Graded Timber Material Properties
DEFF Research Database (Denmark)
Faber, M. H.; Köhler, J.; Sørensen, John Dalsgaard
2004-01-01
The probabilistic modeling of timber material characteristics is considered with special emphasis to the modeling of the effect of different quality control and selection procedures used as means for quality grading in the production line. It is shown how statistical models may be established...... an important role in the overall probabilistic modeling. Therefore a scheme for estimating the parameters of probability distribution parameters focusing on the tail behavior has been established using a censored Maximum Likelihood estimation technique. The proposed probabilistic models have been formulated...
Real-time forecasts of flood hazard and impact: some UK experiences
Directory of Open Access Journals (Sweden)
Cole Steven J.
2016-01-01
Full Text Available Major UK floods over the last decade have motivated significant technological and scientific advances in operational flood forecasting and warning. New joint forecasting centres between the national hydrological and meteorological operating agencies have been formed that issue a daily, national Flood Guidance Statement (FGS to the emergency response community. The FGS is based on a Flood Risk Matrix approach that is a function of potential impact severity and likelihood. It has driven an increased demand for robust, accurate and timely forecast and alert information on fluvial and surface water flooding along with impact assessments. The Grid-to-Grid (G2G distributed hydrological model has been employed across Britain at a 1km resolution to support the FGS. Novel methods for linking dynamic gridded estimates of river flow and surface runoff with more detailed offline flood risk maps have been developed to obtain real-time probabilistic forecasts of potential impacts, leading to operational trials. Examples of the national-scale G2G application are provided along with case studies of forecast flood impact from (i an operational Surface Water Flooding (SWF trial during the Glasgow 2014 Commonwealth Games, (ii SWF developments under the Natural Hazards Partnership over England & Wales, and (iii fluvial applications in Scotland.
Probabilistic UML statecharts for specification and verification: a case study
Jansen, D.N.; Jürjens, J.; Cengarle, M.V.; Fernandez, E.B.; Rumpe, B.; Sander, R.
2002-01-01
This paper introduces a probabilistic extension of UML statecharts. A requirements-level semantics of statecharts is extended to include probabilistic elements. Desired properties of probabilistic statecharts are expressed in the probabilistic logic PCTL and verified using the model checker PRISM.
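The core computation behind checking a PCTL reachability property on a finite-state probabilistic model is solving a linear system for the probability of eventually reaching a target state from each transient state. A minimal sketch over an invented four-state Markov chain (tools like PRISM do this, and much more, symbolically):

```python
import numpy as np

# Transition matrix of a small discrete-time Markov chain.
# States: 0 = init, 1 = trying, 2 = goal (absorbing), 3 = error (absorbing).
P = np.array([
    [0.0, 0.5, 0.0, 0.5],
    [0.4, 0.0, 0.6, 0.0],
    [0.0, 0.0, 1.0, 0.0],
    [0.0, 0.0, 0.0, 1.0],
])

goal, absorbing = 2, [2, 3]
transient = [s for s in range(len(P)) if s not in absorbing]

# For transient states, the reach probabilities satisfy x = Q x + b, where Q
# restricts P to transient states and b holds one-step probabilities into goal.
Q = P[np.ix_(transient, transient)]
b = P[np.ix_(transient, [goal])].ravel()
x = np.linalg.solve(np.eye(len(transient)) - Q, b)

print(dict(zip(transient, np.round(x, 3))))  # state 0 -> 0.375, state 1 -> 0.75
```

A PCTL formula such as P<0.5 [F error] is then checked by comparing the solved probability against the threshold.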
Institute of Scientific and Technical Information of China (English)
Dorine; Houston
1998-01-01
Dear Xiao Lan, Several times a week, no matter which of the major television news networks I turn to, the screen is filled with tragic pictures of flooding along the Yangtze River, and I grieve for the suffering people whose lives are being so terribly disrupted by this disaster. Even more to be grieved is the terrible number of people who have been killed by the floods and their effects.
Hurricane Sandy’s flood frequency increasing from year 1800 to 2100
Horton, Benjamin P.; Donnelly, Jeffrey P.
2016-01-01
Coastal flood hazard varies in response to changes in storm surge climatology and the sea level. Here we combine probabilistic projections of the sea level and storm surge climatology to estimate the temporal evolution of flood hazard. We find that New York City’s flood hazard has increased significantly over the past two centuries and is very likely to increase more sharply over the 21st century. Due to the effect of sea level rise, the return period of Hurricane Sandy’s flood height decreased by a factor of ∼3× from year 1800 to 2000 and is estimated to decrease by a further ∼4.4× from 2000 to 2100 under a moderate-emissions pathway. When potential storm climatology change over the 21st century is also accounted for, Sandy’s return period is estimated to decrease by ∼3× to 17× from 2000 to 2100. PMID:27790992
Flood Bypass Capacity Optimization
Siclari, A.; Hui, R.; Lund, J. R.
2015-12-01
Large river flows can damage adjacent flood-prone areas by exceeding river channel and levee capacities. Particularly large floods are difficult to contain within leveed river banks alone. Flood bypasses can often reduce flood risk efficiently: excess river flow is diverted over a weir into a bypass, where it incurs much less damage and cost. Additional benefits of bypasses include ecosystem protection, agriculture, groundwater recharge and recreation. Constructing or expanding an existing bypass incurs costs in land purchase, easements, and levee setbacks. Accounting for such benefits and costs, this study develops a simple mathematical model for optimizing flood bypass capacity using benefit-cost and risk analysis. Application to the Yolo Bypass, an existing bypass along the Sacramento River in California, estimates the optimal capacity that economically reduces flood damage and increases various benefits, especially for agriculture. Land availability is likely to limit bypass expansion; compensation for landowners could relax such limitations. Other economic values could affect the optimal results, as shown by sensitivity analysis on major parameters. By including land geography in the model, the locations of promising capacity expansions can be identified.
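The benefit-cost trade-off described above can be sketched numerically: expected annual flood damage falls as bypass capacity grows, annualized land and levee costs rise, and the optimal capacity minimizes their sum. All numbers below (the flow distribution, unit damage, and unit cost) are invented for illustration and are not from the Yolo Bypass study:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic annual peak flows (m^3/s); the Gumbel distribution is a common
# flood-frequency model.
flows = rng.gumbel(loc=2000.0, scale=600.0, size=100_000)

CHANNEL_CAPACITY = 2500.0  # flow the existing channel and levees pass safely
UNIT_DAMAGE = 50.0         # damage per m^3/s of flow exceeding total capacity
UNIT_LAND_COST = 2.0       # annualized land/easement cost per m^3/s of bypass

def expected_total_cost(bypass_capacity):
    """Annualized bypass cost plus Monte Carlo expected annual flood damage."""
    excess = np.maximum(flows - (CHANNEL_CAPACITY + bypass_capacity), 0.0)
    return UNIT_LAND_COST * bypass_capacity + UNIT_DAMAGE * excess.mean()

capacities = np.arange(0, 3000, 25)
best = capacities[int(np.argmin([expected_total_cost(c) for c in capacities]))]
print(best, round(expected_total_cost(best), 1))
```

The grid search stands in for the paper's analytic optimization; with real damage and cost functions the same marginal-benefit-equals-marginal-cost logic applies.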
Explorers Presentation: Flooding and Coastal Communities
Institute, Marine
2015-01-01
Explorers Flooding and Coastal Communities presentation provides an introduction to flooding. This can be used with the lesson plan on building flood defences. It covers: What is a flood? Why does it flood? Where does the water come from? The water cycle; Where is water stored? Examples of pluvial vs. coastal flooding; Impacts of flooding; Flood defences; What else influences flooding: human impacts, urbanisation, deforestation, sea level rise.
Shastri, Hiteshri; Ghosh, Subimal; Karmakar, Subhankar
2017-02-01
Forecasting of extreme precipitation events at a regional scale is of high importance due to their severe impacts on society. The impacts are stronger in urban regions due to high flood potential as well as high population density leading to high vulnerability. Although significant scientific improvements have taken place in global weather forecasting models, they are still not adequate at a regional scale (e.g., for an urban region), with high false-alarm and low detection rates. There has been a need to improve the weather forecast skill at a local scale with a probabilistic outcome. Here we develop a methodology with quantile regression, where the reliably simulated variables from the Global Forecast System are used as predictors and different quantiles of rainfall are generated corresponding to that set of predictors. We apply this method to a flood-prone coastal city of India, Mumbai, which has experienced severe floods in recent years. We find significant improvements in the forecast with high detection and skill scores. We apply the methodology to 10 ensemble members of the Global Ensemble Forecast System and find a reduction in ensemble uncertainty of precipitation across realizations with respect to the original precipitation forecasts. We validate our model for the monsoon seasons of 2006 and 2007, which are independent of the training/calibration data set used in the study. We find promising results and emphasize the value of implementing such data-driven methods for better probabilistic forecasts at an urban scale, primarily for early flood warning.
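The quantile-regression idea can be illustrated with a minimal sketch: fit a linear 0.9-quantile of a skewed response by subgradient descent on the pinball (quantile) loss. The data are synthetic and the setup is not the authors' Global Forecast System configuration:

```python
import numpy as np

rng = np.random.default_rng(42)
x = rng.uniform(0.0, 10.0, 500)  # stand-in predictor (e.g. a model-derived variable)
y = 2.0 * x + rng.gamma(shape=2.0, scale=1.0, size=500)  # skewed synthetic "rainfall"

def fit_quantile(x, y, tau, lr=0.01, epochs=3000):
    """Linear quantile regression by subgradient descent on the pinball loss."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        resid = y - (w * x + b)
        # Pinball-loss subgradient: tau above the fitted line, tau - 1 below it.
        g = np.where(resid > 0, tau, tau - 1.0)
        w += lr * np.mean(g * x)
        b += lr * np.mean(g)
    return w, b

w, b = fit_quantile(x, y, tau=0.9)
coverage = float(np.mean(y <= w * x + b))  # should be close to 0.9
print(round(coverage, 2))
```

Repeating the fit for several quantile levels (0.1, 0.5, 0.9, ...) yields the probabilistic rainfall forecast the abstract describes.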
Probabilistic analysis of linear elastic cracked structures
Institute of Scientific and Technical Information of China (English)
无
2007-01-01
This paper presents a probabilistic methodology for linear fracture mechanics analysis of cracked structures. The main focus is on probabilistic aspects related to the nature of cracks in the material. The methodology involves finite element analysis; statistical models for uncertainty in material properties, crack size, fracture toughness and loads; and standard reliability methods for evaluating probabilistic characteristics of the linear elastic fracture parameter. The uncertainty in the crack size can have a significant effect on the probability of failure, particularly when the crack size has a large coefficient of variation. A numerical example is presented to show that the probabilistic methodology based on Monte Carlo simulation provides accurate estimates of failure probability for use in linear elastic fracture mechanics.
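The Monte Carlo approach described can be sketched for a single limit state, K = Y·σ·√(πa) exceeding the fracture toughness K_Ic. The lognormal and normal input models below are invented placeholders, not the paper's calibrated statistical models:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 200_000

# Illustrative input uncertainty models (not calibrated data).
a = rng.lognormal(mean=np.log(5e-3), sigma=0.4, size=N)     # crack size (m)
K_Ic = rng.lognormal(mean=np.log(60.0), sigma=0.1, size=N)  # toughness (MPa*sqrt(m))
sigma = rng.normal(300.0, 20.0, size=N)                     # applied stress (MPa)

Y = 1.12  # geometry factor for a shallow surface crack
K = Y * sigma * np.sqrt(np.pi * a)  # linear elastic stress intensity factor
p_fail = float(np.mean(K > K_Ic))   # Monte Carlo estimate of failure probability
print(p_fail)
```

Because the crack size enters under a square root inside an exceedance event, widening its coefficient of variation visibly inflates `p_fail`, which is the sensitivity the abstract highlights.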
Structural reliability codes for probabilistic design
DEFF Research Database (Denmark)
Ditlevsen, Ove Dalager
1997-01-01
difficulties of ambiguity and definition show up when attempting to make the transition from a given authorized partial safety factor code to a superior probabilistic code. For any chosen probabilistic code format there is a considerable variation of the reliability level over the set of structures defined...... considerable variation of the reliability measure as defined by a specific probabilistic code format. Decision theoretical principles are applied to get guidance about which of these different reliability levels of existing practice to choose as target reliability level. Moreover, it is shown that the chosen...... probabilistic code format has not only strong influence on the formal reliability measure, but also on the formal cost of failure to be associated if a design made to the target reliability level is considered to be optimal. In fact, the formal cost of failure can be different by several orders of size for two...
Revising incompletely specified convex probabilistic belief bases
CSIR Research Space (South Africa)
Rens, G
2016-04-01
Full Text Available International Workshop on Non-Monotonic Reasoning (NMR), 22-24 April 2016, Cape Town, South Africa Revising Incompletely Specified Convex Probabilistic Belief Bases Gavin Rens CAIR_, University of KwaZulu-Natal, School of Mathematics, Statistics...
A logic for inductive probabilistic reasoning
DEFF Research Database (Denmark)
Jaeger, Manfred
2005-01-01
Inductive probabilistic reasoning is understood as the application of inference patterns that use statistical background information to assign (subjective) probabilities to single events. The simplest such inference pattern is direct inference: from "70% of As are Bs" and "a is an A" infer...... that a is a B with probability 0.7. Direct inference is generalized by Jeffrey's rule and the principle of cross-entropy minimization. To adequately formalize inductive probabilistic reasoning is an interesting topic for artificial intelligence, as an autonomous system acting in a complex environment may have...... to base its actions on a probabilistic model of its environment, and the probabilities needed to form this model can often be obtained by combining statistical background information with particular observations made, i.e., by inductive probabilistic reasoning. In this paper a formal framework......
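Jeffrey's rule, named above as a generalization of direct inference, revises P(A) when evidence shifts the probability of a partition cell B while the conditionals P(A|B) and P(A|¬B) stay fixed. A minimal sketch with invented numbers:

```python
# Jeffrey's rule: when new evidence shifts P(B) from its prior to a new value,
# revise P(A) while keeping the conditionals P(A|B) and P(A|not-B) fixed.
def jeffrey_update(p_a_given_b, p_a_given_not_b, new_p_b):
    return p_a_given_b * new_p_b + p_a_given_not_b * (1.0 - new_p_b)

# Invented example: statistical background says P(wet|rain) = 0.9 and
# P(wet|no rain) = 0.1; an observation shifts P(rain) from 0.3 to 0.8.
p_wet = jeffrey_update(0.9, 0.1, 0.8)
print(round(p_wet, 2))  # 0.74
```

Direct inference is the degenerate case where the new P(B) is 1: from "70% of As are Bs" and certainty that a is an A, `jeffrey_update(0.7, 0.0, 1.0)` returns 0.7.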
Non-unitary probabilistic quantum computing
Gingrich, Robert M.; Williams, Colin P.
2004-01-01
We present a method for designing quantum circuits that perform non-unitary quantum computations on n-qubit states probabilistically, and give analytic expressions for the success probability and fidelity.
Safety Verification for Probabilistic Hybrid Systems
DEFF Research Database (Denmark)
Zhang, Lijun; She, Zhikun; Ratschan, Stefan;
2010-01-01
The interplay of random phenomena and continuous real-time control deserves increased attention for instance in wireless sensing and control applications. Safety verification for such systems thus needs to consider probabilistic variations of systems with hybrid dynamics. In safety verification...... hybrid systems and develop a general abstraction technique for verifying probabilistic safety problems. This gives rise to the first mechanisable technique that can, in practice, formally verify safety properties of non-trivial continuous-time stochastic hybrid systems-without resorting to point...... of classical hybrid systems we are interested in whether a certain set of unsafe system states can be reached from a set of initial states. In the probabilistic setting, we may ask instead whether the probability of reaching unsafe states is below some given threshold. In this paper, we consider probabilistic...
Safety Verification for Probabilistic Hybrid Systems
DEFF Research Database (Denmark)
Zhang, Lijun; She, Zhikun; Ratschan, Stefan;
2012-01-01
The interplay of random phenomena and continuous dynamics deserves increased attention, especially in the context of wireless sensing and control applications. Safety verification for such systems thus needs to consider probabilistic variants of systems with hybrid dynamics. In safety verification...... hybrid systems and develop a general abstraction technique for verifying probabilistic safety problems. This gives rise to the first mechanisable technique that can, in practice, formally verify safety properties of non-trivial continuous-time stochastic hybrid systems. Moreover, being based...... of classical hybrid systems, we are interested in whether a certain set of unsafe system states can be reached from a set of initial states. In the probabilistic setting, we may ask instead whether the probability of reaching unsafe states is below some given threshold. In this paper, we consider probabilistic...
Improved transformer protection using probabilistic neural network ...
African Journals Online (AJOL)
user
This article presents a novel technique to distinguish between magnetizing inrush ... Protective relaying, Probabilistic neural network, Active power relays, Power ... Forward Neural Network (MFFNN) with back-propagation learning technique.
Probabilistic composition of preferences, theory and applications
Parracho Sant'Anna, Annibal
2015-01-01
Putting forward a unified presentation of the features and possible applications of probabilistic preference composition, and serving as a methodology for decisions employing multiple criteria, this book maximizes reader insight into evaluation in probabilistic terms and the development of composition approaches that do not depend on assigning weights to the criteria. With key applications in important areas of management such as failure modes and effects analysis and productivity analysis – together with explanations about the application of the concepts involved – this book makes available numerical examples of probabilistic transformation development and probabilistic composition. Useful not only as a reference source for researchers, but also in teaching classes of graduate courses in Production Engineering and Management Science, the key themes of the book will be of especial interest to researchers in the field of Operational Research.
Strategic Team AI Path Plans: Probabilistic Pathfinding
Directory of Open Access Journals (Sweden)
Tng C. H. John
2008-01-01
Full Text Available This paper proposes a novel method to generate strategic team AI pathfinding plans for computer games and simulations using probabilistic pathfinding. The method is inspired by genetic algorithms (Russell and Norvig, 2002), in that a fitness function is used to test the quality of the path plans. The method generates high-quality path plans by eliminating the low-quality ones. The path plans are generated by probabilistic pathfinding, and the elimination is done by a fitness test of the path plans. This path plan generation method has the ability to generate varied high-quality paths, which is desirable in games to increase replay value. This work is an extension of our earlier work on team AI: probabilistic pathfinding (John et al., 2006). We explore ways to combine probabilistic pathfinding and genetic algorithms to create a new method to generate strategic team AI pathfinding plans.
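The two ingredients named above, probabilistic path generation plus fitness-based elimination, can be sketched on an invented grid with a single wall (this is not the authors' game environment or their exact algorithm):

```python
import random

random.seed(7)
GRID = 8
OBSTACLES = {(3, y) for y in range(1, 7)}  # a wall with gaps at top and bottom
START, GOAL = (0, 4), (7, 4)

def random_path(max_steps=80):
    """Probabilistic pathfinding: a random walk biased toward the goal."""
    pos, path = START, [START]
    for _ in range(max_steps):
        moves = [(pos[0] + dx, pos[1] + dy)
                 for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))]
        moves = [m for m in moves
                 if 0 <= m[0] < GRID and 0 <= m[1] < GRID and m not in OBSTACLES]
        # Bias: with probability 0.7 take the move closest to the goal.
        if random.random() < 0.7:
            pos = min(moves, key=lambda m: abs(m[0] - GOAL[0]) + abs(m[1] - GOAL[1]))
        else:
            pos = random.choice(moves)
        path.append(pos)
        if pos == GOAL:
            return path
    return None  # this sample did not reach the goal

def fitness(path):
    return -len(path)  # shorter complete paths are fitter

candidates = [p for p in (random_path() for _ in range(200)) if p]
best = max(candidates, key=fitness)
print(len(best))
```

Because generation is stochastic, repeated runs with different seeds yield different high-fitness paths, which is the replay-value property the abstract emphasizes.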
An operational flash-flood forecasting chain applied to the test cases of the EU project HYDROPTIMET
Directory of Open Access Journals (Sweden)
A. C. Taramasso
2005-01-01
Full Text Available The application of a flash-flood prediction chain, developed by CIMA, to some test cases for the Tanaro river basin in the framework of the EU project HYDROPTIMET is presented here. The components of the CIMA chain are: forecast rainfall depths, a stochastic downscaling procedure and a hydrological model. Different meteorological Limited Area Models (LAMs) provide the rainfall input to the hydrological component. The flash-flood prediction chain is run both in a deterministic and in a probabilistic configuration. The sensitivity of the forecasting chain's performance to the different LAMs providing rainfall forecasts is discussed. The results of the application show how the probabilistic forecasting system can give, especially in the case of convective events, a valuable contribution to addressing the uncertainty at the different spatio-temporal scales involved in the flash-flood forecasting problem in small and medium basins with complex orography.
Hoang, Long; Nguyen Viet, Dung; Kummu, Matti; Lauri, Hannu; Koponen, Jorma; van Vliet, Michelle T. H.; Supit, Iwan; Leemans, Rik; Kabat, Pavel; Ludwig, Fulco
2016-04-01
Extreme floods cause huge damage to human lives and infrastructure, and hamper socio-economic development in the Mekong River Delta in Vietnam. Induced by climate change, upstream hydrological changes and sea level rise are expected to further exacerbate future flood hazard and thereby pose critical challenges for securing safety and sustainability. This paper provides a probabilistic quantification of future flood hazard for the Mekong Delta, focusing on extreme events under climate change. We developed a model chain to simulate the separate and combined impacts of two drivers, namely upstream hydrological changes and sea level rise, on flood magnitude and frequency. Simulation results show that upstream changes and sea level rise substantially increase flood hazard throughout the whole Mekong Delta. Due to differences in their nature, the two drivers show different features in their impacts on floods. Impacts of upstream changes are more dominant in floodplains in the upper delta, causing an increase of up to +0.80 m in flood depth. Sea level rise introduces flood hazard to currently safe areas in the middle and coastal delta zones. A 0.6 m rise in relative sea level causes an increase in flood depth between 0.10 and 0.70 m, depending on location, by the 2050s. Upstream hydrological changes and sea level rise tend to intensify each other's impacts on floods, resulting in stronger combined impacts than the linear sum of each individual driver's impacts. The substantial increase in future flood hazard strongly calls for better flood protection and more flood-resilient development for the Mekong Delta. Findings from this study can be used as quantified physical boundary conditions to develop flood management strategies and strategic delta management plans.
Assessment of global flood exposures - developing an appropriate approach
Millinship, Ian; Booth, Naomi
2015-04-01
Increasingly complex probabilistic catastrophe models have become the standard for quantitative flood risk assessments by re/insurance companies. On the one hand, probabilistic modelling of this nature is extremely useful; a large range of risk metrics can be output. However, they can be time consuming and computationally expensive to develop and run. Levels of uncertainty are persistently high despite, or perhaps because of, attempts to increase resolution and complexity. A cycle of dependency between modelling companies and re/insurers has developed whereby available models are purchased, models run, and both portfolio and model data 'improved' every year. This can lead to potential exposures in perils and territories that are not currently modelled being largely overlooked by companies, who may then face substantial and unexpected losses when large events occur in these areas. We present here an approach to assessing global flood exposures which reduces the scale and complexity of approach used and begins with the identification of hotspots where there is a significant exposure to flood risk. The method comprises four stages: i) compile consistent exposure information, ii) to apply reinsurance terms and conditions to calculate values exposed, iii) to assess the potential hazard using a global set of flood hazard maps, and iv) to identify potential risk 'hotspots' which include considerations of spatially and/or temporally clustered historical events, and local flood defences. This global exposure assessment is designed as a scoping exercise, and reveals areas or cities where the potential for accumulated loss is of significant interest to a reinsurance company, and for which there is no existing catastrophe model. These regions are then candidates for the development of deterministic scenarios, or probabilistic models. The key advantages of this approach will be discussed. These include simplicity and ability of business leaders to understand results, as well as
Probabilistic Analysis Methods for Hybrid Ventilation
DEFF Research Database (Denmark)
Brohus, Henrik; Frier, Christian; Heiselberg, Per
This paper discusses a general approach for the application of probabilistic analysis methods in the design of ventilation systems. The aims and scope of probabilistic versus deterministic methods are addressed with special emphasis on hybrid ventilation systems. A preliminary application of stoc...... of stochastic differential equations is presented comprising a general heat balance for an arbitrary number of loads and zones in a building to determine the thermal behaviour under random conditions....
PROBABILISTIC METHODOLOGY OF LOW CYCLE FATIGUE ANALYSIS
Institute of Scientific and Technical Information of China (English)
Jin Hui; Wang Jinnuo; Wang Libin
2003-01-01
The cyclic stress-strain response (CSSR), Neuber's rule (NR) and the cyclic strain-life relation (CSLR) are treated as probabilistic curves in the local stress-strain method of low cycle fatigue analysis. The randomness of loading and the theory of fatigue damage accumulation (TOFDA) are considered. The probabilistic analyses of local stress, local strain and fatigue life are constructed based on first-order Taylor series expansions. With the proposed method, fatigue reliability analysis can be accomplished.
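The first-order Taylor approach described can be sketched generically: propagate input means and standard deviations through a response function via a numerically estimated gradient. The strain-life function and its parameters below are invented placeholders, not the paper's calibrated relations:

```python
import numpy as np

def first_order_moments(f, means, stds, h=1e-6):
    """First-order Taylor (mean value) estimate of the mean and standard
    deviation of f(X) for independent inputs X with given means and stds."""
    means = np.asarray(means, dtype=float)
    f0 = f(means)
    grad = np.empty_like(means)
    for i in range(len(means)):
        x = means.copy()
        x[i] += h
        grad[i] = (f(x) - f0) / h  # forward-difference partial derivative
    var = np.sum((grad * np.asarray(stds, dtype=float)) ** 2)
    return f0, np.sqrt(var)

# Illustrative power-law strain-life relation: N = C * strain**(-k).
def strain_life(p):
    return p[0] * p[1] ** (-p[2])

mean, std = first_order_moments(strain_life,
                                means=[0.5, 0.01, 1.6],
                                stds=[0.05, 0.001, 0.1])
print(round(float(mean), 1), round(float(std), 1))
```

Chaining such propagations through the CSSR, Neuber, and strain-life curves in turn yields the probabilistic life estimate the abstract outlines.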
DEMPSTER-SHAFER THEORY BY PROBABILISTIC REASONING
Directory of Open Access Journals (Sweden)
Chiranjib Mukherjee
2015-10-01
Full Text Available Probabilistic reasoning is used when outcomes are unpredictable. We examine methods that use probabilistic representations for all knowledge and that reason by propagating the uncertainties arising from evidence and assertions to conclusions. The uncertainties can arise from an inability to predict outcomes due to unreliable, vague, incomplete or inconsistent knowledge. We also survey some approaches taken in artificial intelligence systems to deal with reasoning under these types of uncertain conditions.
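The combination machinery the title refers to is Dempster's rule: two mass functions over the same frame of discernment are merged by multiplying masses of intersecting focal elements and renormalizing away the conflicting mass. A minimal sketch over a two-element frame, with invented masses:

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule of combination for two mass functions whose focal
    elements are frozensets over the same frame of discernment."""
    combined, conflict = {}, 0.0
    for (b, p), (c, q) in product(m1.items(), m2.items()):
        inter = b & c
        if inter:
            combined[inter] = combined.get(inter, 0.0) + p * q
        else:
            conflict += p * q  # mass assigned to the empty intersection
    norm = 1.0 - conflict
    return {focal: v / norm for focal, v in combined.items()}

A, B = frozenset({"a"}), frozenset({"b"})
AB = A | B
m1 = {A: 0.6, AB: 0.4}  # one body of evidence, favouring "a"
m2 = {B: 0.5, AB: 0.5}  # another body of evidence, favouring "b"
m = dempster_combine(m1, m2)
print({tuple(sorted(k)): round(v, 3) for k, v in m.items()})
# masses: {a} = 0.3/0.7 ≈ 0.429, {b} = 0.2/0.7 ≈ 0.286, {a,b} ≈ 0.286
```

Note how mass left on the whole frame {a, b} represents ignorance, which a plain probability distribution cannot express.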
Probabilistic nature in L/H transition
Energy Technology Data Exchange (ETDEWEB)
Toda, Shinichiro; Itoh, Sanae-I.; Yagi, Masatoshi [Kyushu Univ., Fukuoka (Japan). Research Inst. for Applied Mechanics; Itoh, Kimitaka; Fukuyama, Atsushi
1999-11-01
A statistical picture of the excitation of a plasma transition, which occurs in a strongly turbulent state, is examined. The physical picture of transition phenomena is extended to include statistical variances. The dynamics of the plasma density and turbulence-driven flux are studied, with hysteresis in the flux-density relation. The probabilistic excitation is predicted and the critical conditions are described by a probabilistic distribution function. The stability of the model equations is also discussed. (author)
Semantics of probabilistic processes an operational approach
Deng, Yuxin
2015-01-01
This book discusses the semantic foundations of concurrent systems with nondeterministic and probabilistic behaviour. Particular attention is given to clarifying the relationship between testing and simulation semantics and characterising bisimulations from metric, logical, and algorithmic perspectives. Besides presenting recent research outcomes in probabilistic concurrency theory, the book exemplifies the use of many mathematical techniques to solve problems in computer science, which is intended to be accessible to postgraduate students in Computer Science and Mathematics. It can also be us
Shultz, James M; McLean, Andrew; Herberman Mash, Holly B; Rosen, Alexa; Kelly, Fiona; Solo-Gabriele, Helena M; Youngs Jr, Georgia A; Jensen, Jessica; Bernal, Oscar; Neria, Yuval
2013-01-01
Introduction. In 2011, following heavy winter snowfall, two cities bordering two rivers in North Dakota, USA faced major flood threats. Flooding was foreseeable and predictable, although the extent of risk was uncertain. One community, Fargo, situated in a shallow river basin, successfully mitigated and prevented flooding. For the other community, Minot, located in a deep river valley, prevention was not possible, and downtown businesses and one-quarter of the homes were inundated in the city's worst flood on record. We aimed to contrast the respective hazards, vulnerabilities, stressors, psychological risk factors, psychosocial consequences, and disaster risk reduction strategies under conditions where flood prevention was, and was not, possible. Methods. We applied the "trauma signature analysis" (TSIG) approach to compare the hazard profiles, identify salient disaster stressors, document the key components of the disaster risk reduction response, and examine indicators of community resilience. Results. Two demographically comparable communities, Fargo and Minot, faced challenging river flood threats and exhibited effective coordination across community sectors. We examined the implementation of disaster risk reduction strategies in situations where coordinated citizen action was able to prevent disaster impact (hazard avoidance) compared to the more common scenario when unpreventable disaster strikes, causing destruction, harm, and distress. Across a range of indicators, it is clear that successful mitigation diminishes both physical and psychological impact, thereby reducing the trauma signature of the event. Conclusion. In contrast to the experience of historic flooding in Minot, the city of Fargo succeeded in reducing the trauma signature by reducing risk through mitigation. PMID:28228985
Probabilistic Prediction of Lifetimes of Ceramic Parts
Nemeth, Noel N.; Gyekenyesi, John P.; Jadaan, Osama M.; Palfi, Tamas; Powers, Lynn; Reh, Stefan; Baker, Eric H.
2006-01-01
ANSYS/CARES/PDS is a software system that combines the ANSYS Probabilistic Design System (PDS) software with a modified version of the Ceramics Analysis and Reliability Evaluation of Structures Life (CARES/Life) Version 6.0 software. [A prior version of CARES/Life was reported in Program for Evaluation of Reliability of Ceramic Parts (LEW-16018), NASA Tech Briefs, Vol. 20, No. 3 (March 1996), page 28.] CARES/Life models effects of stochastic strength, slow crack growth, and stress distribution on the overall reliability of a ceramic component. The essence of the enhancement in CARES/Life 6.0 is the capability to predict the probability of failure using results from transient finite-element analysis. ANSYS PDS models the effects of uncertainty in material properties, dimensions, and loading on the stress distribution and deformation. ANSYS/CARES/PDS accounts for the effects of probabilistic strength, probabilistic loads, probabilistic material properties, and probabilistic tolerances on the lifetime and reliability of the component. Even failure probability becomes a stochastic quantity that can be tracked as a response variable. ANSYS/CARES/PDS enables tracking of all stochastic quantities in the design space, thereby enabling more precise probabilistic prediction of lifetimes of ceramic components.
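The stochastic-strength core of a CARES-style analysis is the two-parameter Weibull model, in which the failure probability of a brittle component rises steeply with applied stress. A minimal sketch with invented parameters (real CARES/Life analyses integrate this model over the transient finite-element stress field rather than evaluating a single uniform stress):

```python
import numpy as np

def weibull_failure_probability(stress, sigma_0, m, volume=1.0):
    """Two-parameter Weibull model of brittle failure probability: the kind of
    stochastic-strength model underlying CARES-style ceramic reliability."""
    return 1.0 - np.exp(-volume * (stress / sigma_0) ** m)

# Illustrative values: characteristic strength 400 MPa, Weibull modulus 10.
for stress in (200.0, 300.0, 400.0):
    print(stress, round(float(weibull_failure_probability(stress, 400.0, 10.0)), 4))
```

The high Weibull modulus makes failure probability climb sharply near the characteristic strength, which is why tracking stress uncertainty (as ANSYS PDS does) matters so much for lifetime prediction.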
Probabilistic Choice, Reversibility, Loops, and Miracles
Stoddart, Bill; Bell, Pete
We consider an addition of probabilistic choice to Abrial's Generalised Substitution Language (GSL) in a form that accommodates the backtracking interpretation of non-deterministic choice. Our formulation is introduced as an extension of the Prospective Values formalism we have developed to describe the results from a backtracking search. Significant features are that probabilistic choice is governed by feasibility, and non-termination is strict. The former property allows us to use probabilistic choice to generate search heuristics. In this paper we are particularly interested in iteration. By demonstrating sub-conjunctivity and monotonicity properties of expectations we give the basis for a fixed point semantics of iterative constructs, and we consider the practical proof treatment of probabilistic loops. We discuss loop invariants, loops with probabilistic behaviour, and probabilistic termination in the context of a formalism in which a small probability of non-termination can dominate our calculations, proposing a method of limits to avoid this problem. The formal programming constructs described have been implemented in a reversible virtual machine (RVM).
Refinement for Probabilistic Systems with Nondeterminism
Directory of Open Access Journals (Sweden)
David Streader
2011-06-01
Before we combine actions and probabilities, two very obvious questions should be asked. Firstly, what does "the probability of an action" mean? Secondly, how does probability interact with nondeterminism? Neither question has a single universally agreed upon answer, but by considering these questions at the outset we build a novel and hopefully intuitive probabilistic event-based formalism. In previous work we have characterised refinement via the notion of testing. Basically, if one system passes all the tests that another system passes (and maybe more) we say the first system is a refinement of the second. This is, in our view, an important way of characterising refinement, as it answers the question "what sort of refinement should I be using?" We use testing in this paper as the basis for our refinement. We develop tests for probabilistic systems by analogy with the tests developed for non-probabilistic systems. We make sure that our probabilistic tests, when performed on non-probabilistic automata, give us refinement relations which agree with those for non-probabilistic automata. We formalise this property as a vertical refinement.
Flood Risk Analysis and Flood Potential Losses Assessment
Institute of Scientific and Technical Information of China (English)
无
2003-01-01
The heavy floods in the Taihu Basin showed an increasing trend in recent years. In this work, a typical area in the northern Taihu Basin was selected for flood risk analysis and potential flood losses assessment. Human activities have a strong impact on the study area's flood situation (as affected by the polders built, deforestation, population increase, urbanization, etc.), and have made water levels higher, flood durations shorter, and flood peaks sharper. Five years of different flood return periods [(1970), 5 (1962), 10 (1987), 20 (1954), 50 (1991)] were used to calculate the potential flood risk area and its losses. The potential flood risk map, economic losses, and flood-impacted population were also calculated. The study's main conclusions are: 1) Human activities have strongly changed the natural flood situation in the study area, increasing runoff and flooding; 2) The flood risk area is closely related to the precipitation center; 3) Polder construction has successfully protected land from flood, shortened the flood duration, and elevated water levels in rivers outside the polders; 4) Economic and social development have caused flood losses to increase in recent years.
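Per-return-period loss estimates like those above are commonly condensed into an expected annual damage (EAD) by integrating loss over exceedance probability. The sketch below uses the return periods from the study but entirely hypothetical loss figures, and a simple trapezoidal rule:

```python
# Sketch: expected annual damage (EAD) from per-return-period loss estimates,
# integrating damage over exceedance probability (trapezoidal rule).
# The loss figures below are hypothetical, not taken from the study.

return_periods = [5, 10, 20, 50]           # years
losses = [1.0, 2.5, 4.0, 7.0]              # loss per event, hypothetical units

# Exceedance probability p = 1/T, ordered from most to least frequent event
probs = [1.0 / t for t in return_periods]  # [0.2, 0.1, 0.05, 0.02]

# Trapezoidal integration of loss over exceedance probability.
# probs decrease along the list, so each interval width is p[i] - p[i+1].
ead = 0.0
for i in range(len(probs) - 1):
    width = probs[i] - probs[i + 1]
    ead += 0.5 * (losses[i] + losses[i + 1]) * width

print(ead)
```

A fuller treatment would also extrapolate the tail beyond the largest return period, which this sketch omits.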
A Bayesian Network approach for flash flood risk assessment
Boutkhamouine, Brahim; Roux, Hélène; Pérès, François
2017-04-01
influencing variables. Each node of the graph corresponds to a variable, and arcs represent the probabilistic dependencies between these variables. Both the quantification of the strength of these probabilistic dependencies and the computation of inferences are based on Bayes' theorem. In order to use BNs for the assessment of flooding risks, the modelling work is divided into two parts. The first is identifying all the factors controlling flood generation. The qualitative explanation of this issue is then reached by establishing the cause-and-effect relationships between these factors. These underlying relationships are represented in what are called Conditional Probability Tables (CPTs). The next step is to estimate these CPTs using information coming from networks of sensors, databases and expertise. By using this basic cognitive structure, we will be able to estimate the magnitude of flood risk in a small geographical area with a homogeneous hydrological system. The second part of our work will be dedicated to the estimation of this risk on the scale of a basin. To do so, we will create a spatio-temporal model able to take into consideration both the spatial and temporal variability of all factors involved in flood generation. Key words: Flash flood forecasting - Uncertainty modelling - Flood risk management - Bayesian Networks.
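The mechanics of CPT-based inference can be sketched with a deliberately tiny discrete network. The variables, structure and probability values below are invented for illustration and are not the paper's model:

```python
# Minimal sketch of a discrete Bayesian network for flood risk, with
# hypothetical variables and CPT values (not from the paper).
# Structure: Rain (heavy/light) and Soil (saturated/dry) -> Flood (yes/no).

p_rain = {'heavy': 0.3, 'light': 0.7}
p_soil = {'saturated': 0.4, 'dry': 0.6}

# Conditional probability table: P(Flood=yes | Rain, Soil)
cpt_flood = {
    ('heavy', 'saturated'): 0.9,
    ('heavy', 'dry'): 0.5,
    ('light', 'saturated'): 0.3,
    ('light', 'dry'): 0.05,
}

# Marginal P(Flood=yes), by exact enumeration over the parent states
p_flood = sum(p_rain[r] * p_soil[s] * cpt_flood[(r, s)]
              for r in p_rain for s in p_soil)

# Diagnostic inference via Bayes' theorem: P(Rain=heavy | Flood=yes)
p_heavy_and_flood = sum(p_rain['heavy'] * p_soil[s] * cpt_flood[('heavy', s)]
                        for s in p_soil)
p_heavy_given_flood = p_heavy_and_flood / p_flood

print(p_flood, p_heavy_given_flood)
```

Real applications would use a BN library and estimate the CPT entries from sensor records and expert elicitation, as the abstract describes; exact enumeration as above only scales to small networks.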
Computing Distances between Probabilistic Automata
Directory of Open Access Journals (Sweden)
Mathieu Tracol
2011-07-01
We present relaxed notions of simulation and bisimulation on Probabilistic Automata (PA) that allow some error epsilon. When epsilon is zero we retrieve the usual notions of bisimulation and simulation on PAs. We give logical characterisations of these notions by choosing suitable logics which differ from the elementary ones, L with negation and L without negation, by the modal operator. Using flow networks, we show how to compute the relations in PTIME. This allows the definition of an efficiently computable non-discounted distance between the states of a PA. A natural modification of this distance is introduced to obtain a discounted distance, which weakens the influence of long-term transitions. We compare our notions of distance to others previously defined and illustrate our approach on various examples. We also show that our distance is not expansive with respect to process algebra operators. Although L without negation is a suitable logic to characterise epsilon-(bi)simulation on deterministic PAs, it is not for general PAs; interestingly, we prove that it does characterise weaker notions, called a priori epsilon-(bi)simulation, which we prove to be NP-hard to decide.
Pearl A Probabilistic Chart Parser
Magerman, D M; Magerman, David M.; Marcus, Mitchell P.
1994-01-01
This paper describes a natural language parsing algorithm for unrestricted text which uses a probability-based scoring function to select the "best" parse of a sentence. The parser, Pearl, is a time-asynchronous bottom-up chart parser with Earley-type top-down prediction which pursues the highest-scoring theory in the chart, where the score of a theory represents the extent to which the context of the sentence predicts that interpretation. This parser differs from previous attempts at stochastic parsers in that it uses a richer form of conditional probabilities based on context to predict likelihood. Pearl also provides a framework for incorporating the results of previous work in part-of-speech assignment, unknown word models, and other probabilistic models of linguistic features into one parsing tool, interleaving these techniques instead of using the traditional pipeline architecture. In preliminary tests, Pearl has been successful at resolving part-of-speech and word (in speech processing) ambiguity, dete...
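The core idea of selecting the highest-scoring analysis from a chart can be illustrated with a much simpler probabilistic CKY parser. This is not Pearl's algorithm (which is bottom-up with top-down prediction and context-conditioned probabilities); the toy CNF grammar, lexicon and probabilities below are invented:

```python
# Toy probabilistic CKY parser over a CNF grammar: each chart cell keeps,
# per label, the probability of its best derivation; the best full parse is
# read off the top cell. Grammar and probabilities are illustrative only.

grammar = {            # binary rules: (B, C) -> list of (A, prob) for A -> B C
    ('NP', 'VP'): [('S', 1.0)],
    ('V', 'NP'): [('VP', 1.0)],
    ('Det', 'N'): [('NP', 0.6)],
}
lexicon = {            # word -> list of (tag, prob)
    'the': [('Det', 1.0)],
    'dog': [('N', 0.5), ('NP', 0.4)],
    'saw': [('V', 1.0)],
    'cat': [('N', 0.5)],
}

def cky(words):
    n = len(words)
    # chart[i][j]: best derivation probability per label spanning words[i:j]
    chart = [[{} for _ in range(n + 1)] for _ in range(n + 1)]
    for i, w in enumerate(words):
        for tag, p in lexicon[w]:
            chart[i][i + 1][tag] = max(chart[i][i + 1].get(tag, 0.0), p)
    for span in range(2, n + 1):
        for i in range(n - span + 1):
            j = i + span
            for k in range(i + 1, j):          # split point
                for b, pb in chart[i][k].items():
                    for c, pc in chart[k][j].items():
                        for a, pr in grammar.get((b, c), []):
                            cand = pr * pb * pc
                            if cand > chart[i][j].get(a, 0.0):
                                chart[i][j][a] = cand
    return chart[0][n]

print(cky(['the', 'dog', 'saw', 'the', 'cat']))
```

Pearl's contribution, by contrast, was conditioning these probabilities on richer sentence context and interleaving the parser with other probabilistic models rather than running them as a pipeline.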
Optimal probabilistic dense coding schemes
Kögler, Roger A.; Neves, Leonardo
2017-04-01
Dense coding with non-maximally entangled states has been investigated in many different scenarios. We revisit this problem for protocols adopting the standard encoding scheme. In this case, the set of possible classical messages cannot be perfectly distinguished due to the non-orthogonality of the quantum states carrying them. So far, the decoding process has been approached in two ways: (i) The message is always inferred, but with an associated (minimum) error; (ii) the message is inferred without error, but only sometimes; in case of failure, nothing else is done. Here, we generalize on these approaches and propose novel optimal probabilistic decoding schemes. The first uses quantum-state separation to increase the distinguishability of the messages with an optimal success probability. This scheme is shown to include (i) and (ii) as special cases and continuously interpolate between them, which enables the decoder to trade-off between the level of confidence desired to identify the received messages and the success probability for doing so. The second scheme, called multistage decoding, applies only for qudits ( d-level quantum systems with d>2) and consists of further attempts in the state identification process in case of failure in the first one. We show that this scheme is advantageous over (ii) as it increases the mutual information between the sender and receiver.
Probabilistic description of traffic flow
Mahnke, R.; Kaupužs, J.; Lubashevsky, I.
2005-03-01
A stochastic description of traffic flow, called probabilistic traffic flow theory, is developed. The general master equation is applied to relatively simple models to describe the formation and dissolution of traffic congestions. Our approach is mainly based on spatially homogeneous systems like periodically closed circular rings without on- and off-ramps. We consider a stochastic one-step process of growth or shrinkage of a car cluster (jam). As generalization we discuss the coexistence of several car clusters of different sizes. The basic problem is to find a physically motivated ansatz for the transition rates of the attachment and detachment of individual cars to a car cluster consistent with the empirical observations in real traffic. The emphasis is put on the analogy with first-order phase transitions and nucleation phenomena in physical systems like supersaturated vapour. The results are summarized in the flux-density relation, the so-called fundamental diagram of traffic flow, and compared with empirical data. Different regimes of traffic flow are discussed: free flow, congested mode as stop-and-go regime, and heavy viscous traffic. The traffic breakdown is studied based on the master equation as well as the Fokker-Planck approximation to calculate mean first passage times or escape rates. Generalizations are developed to allow for on-ramp effects. The calculated flux-density relation and characteristic breakdown times coincide with empirical data measured on highways. Finally, a brief summary of the stochastic cellular automata approach is given.
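The one-step cluster process described above can be simulated directly with the Gillespie algorithm. The rate expressions below are illustrative placeholders, not the specific physically motivated ansatz the paper develops:

```python
# Sketch: one-step stochastic process for a single car cluster (jam) on a
# closed ring. Cars attach to the cluster at a rate that here simply shrinks
# with cluster size, and the car at the cluster head escapes after a mean
# time tau. Rates are illustrative, not the paper's empirical ansatz.
import random

def simulate_cluster(n0=0, n_cars=60, t_end=500.0, w_plus=0.9, tau=1.0, seed=1):
    random.seed(seed)
    n, t = n0, 0.0
    while t < t_end:
        # attachment rate proportional to the number of free cars (placeholder)
        attach = w_plus * (n_cars - n) / n_cars
        # detachment: one car leaves the cluster head per mean time tau
        detach = (1.0 / tau) if n > 0 else 0.0
        total = attach + detach
        if total == 0.0:
            break
        t += random.expovariate(total)            # waiting time to next event
        if random.random() < attach / total:
            n = min(n + 1, n_cars)                # one car joins the jam
        else:
            n -= 1                                # one car escapes the jam
    return n

final_size = simulate_cluster()
print(final_size)
```

Averaging many such runs for varying density gives the cluster-size statistics that the master-equation and Fokker-Planck treatments describe analytically.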
Probabilistic description of traffic breakdowns.
Kühne, Reinhart; Mahnke, Reinhard; Lubashevsky, Ihor; Kaupuzs, Jevgenijs
2002-06-01
We analyze the characteristic features of traffic breakdown. To describe this phenomenon we apply the probabilistic model regarding the jam emergence as the formation of a large car cluster on a highway. In these terms, the breakdown occurs through the formation of a certain critical nucleus in the metastable vehicle flow, which enables us to confine ourselves to one cluster model. We assume that, first, the growth of the car cluster is governed by attachment of cars to the cluster whose rate is mainly determined by the mean headway distance between the cars in the vehicle flow and, maybe, also by the headway distance in the cluster. Second, the cluster dissolution is determined by the car escape from the cluster whose rate depends on the cluster size directly. The latter is justified using the available experimental data for the correlation properties of the synchronized mode. We write the appropriate master equation, converted then into the Fokker-Planck equation for the cluster distribution function, and analyze the formation of the critical car cluster due to the climb over a certain potential barrier. The further cluster growth irreversibly causes jam formation. Numerical estimates of the obtained characteristics and the experimental data of the traffic breakdown are compared. In particular, we draw the conclusion that the characteristic intrinsic time scale of the breakdown phenomenon should be about 1 min, and explain why the traffic volume interval inside which traffic breakdown is observed is fairly wide.
Dynamical systems probabilistic risk assessment
Energy Technology Data Exchange (ETDEWEB)
Denman, Matthew R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Ames, Arlo Leroy [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)
2014-03-01
Probabilistic Risk Assessment (PRA) is the primary tool used to risk-inform nuclear power regulatory and licensing activities. Risk-informed regulations are intended to reduce inherent conservatism in regulatory metrics (e.g., allowable operating conditions and technical specifications) which are built into the regulatory framework by quantifying both the total risk profile as well as the change in the risk profile caused by an event or action (e.g., in-service inspection procedures or power uprates). Dynamical Systems (DS) analysis has been used to understand unintended time-dependent feedbacks in both industrial and organizational settings. In dynamical systems analysis, feedback loops can be characterized and studied as a function of time to describe the changes to the reliability of plant Structures, Systems and Components (SSCs). While DS has been used in many subject areas, some even within the PRA community, it has not been applied toward creating long-time horizon, dynamic PRAs (with time scales ranging between days and decades depending upon the analysis). Understanding slowly developing dynamic effects, such as wear-out, on SSC reliabilities may be instrumental in ensuring a safely and reliably operating nuclear fleet. Improving the estimation of a plant's continuously changing risk profile will allow for more meaningful risk insights, greater stakeholder confidence in risk insights, and increased operational flexibility.
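One slowly developing dynamic effect mentioned above, wear-out, can be sketched with a time-dependent failure-rate model. The Weibull parameterisation and the numbers below are illustrative assumptions, not values from the report:

```python
# Illustrative sketch (not from the report): Weibull wear-out model for a
# component's time-dependent reliability, the kind of slowly developing
# effect a long-horizon dynamic PRA would track. beta and eta are invented.
import math

def weibull_reliability(t_years, beta=2.0, eta=40.0):
    """Survival probability R(t) = exp(-(t/eta)^beta); beta > 1 means wear-out."""
    return math.exp(-((t_years / eta) ** beta))

def hazard_rate(t_years, beta=2.0, eta=40.0):
    """Instantaneous failure rate h(t) = (beta/eta) * (t/eta)^(beta-1)."""
    return (beta / eta) * (t_years / eta) ** (beta - 1)

# With beta > 1 the hazard rate grows with component age, so a static PRA
# that assumes a constant failure rate understates late-life risk.
for t in (1, 10, 40):
    print(t, weibull_reliability(t), hazard_rate(t))
```

A dynamic PRA would propagate such ageing SSC reliabilities through the plant fault trees over a days-to-decades horizon rather than treating them as constants.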
Crowdsourcing detailed flood data
Walliman, Nicholas; Ogden, Ray; Amouzad*, Shahrzhad
2015-04-01
Over the last decade the average annual loss across the European Union due to flooding has been 4.5bn Euros, but increasingly intense rainfall, as well as population growth, urbanisation and the rising costs of asset replacements, may see this rise to 23bn Euros a year by 2050. Equally disturbing are the profound social costs to individuals, families and communities, which in addition to loss of lives include: loss of livelihoods, decreased purchasing and production power, relocation and migration, adverse psychosocial effects, and hindrance of economic growth and development. Flood prediction, management and defence strategies rely on the availability of accurate information and flood modelling. Whilst automated data gathering (by measurement and satellite) of the extent of flooding is already advanced, it is least reliable in urban and physically complex geographies where the need for precise estimation is often most acute. Crowdsourced data on actual flood events is a potentially critical complement, allowing improved accuracy in such situations and identification of the effects of local landscape and topography, where the height of a simple kerb or a discontinuity in a boundary wall can be of profound importance. Mobile 'App' based data acquisition using crowdsourcing in critical areas can combine camera records with GPS positional data and time, as well as descriptive data relating to the event. This will automatically produce a dataset, managed in ArcView GIS, with the potential for follow-up calls to get more information through structured scripts for each strand. Through this, local residents can provide highly detailed information that can be reflected in sophisticated flood protection models and be core to framing urban resilience strategies and optimising the effectiveness of investment. This paper will describe this pioneering approach that will develop flood event data in support of systems that will advance existing approaches such as those developed in the UK
Developments of the European Flood Awareness System (EFAS)
Thiemig, Vera; Olav Skøien, Jon; Salamon, Peter; Pappenberger, Florian; Wetterhall, Fredrik; Holst, Bo; Asp, Sara-Sophia; Garcia Padilla, Mercedes; Garcia, Rafael J.; Schweim, Christoph; Ziese, Markus
2017-04-01
EFAS (http://www.efas.eu) is an operational system for flood forecasting and early warning for the entire Europe, which is fully operational as part of the Copernicus Emergency Management Service since 2012. The prime aim of EFAS is to gain time for preparedness measures before major flood events - particularly in trans-national river basins - strike. This is achieved by providing complementary, added value information to the national and regional services holding the mandate for flood warning as well as to the ERCC (European Response and Coordination Centre). Using a coherent model for all of Europe forced with a range of deterministic and ensemble weather forecasts, the system can give a probabilistic flood forecast for a medium range lead time (up to 10 days) independent of country borders. The system is under continuous development, and we will present the basic set up, some prominent examples of recent and ongoing developments (such as the rapid impact assessment, seasonal outlook and the extended domain) and the future challenges.
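The probabilistic element of an ensemble-driven system like EFAS reduces, at its simplest, to counting how many ensemble members exceed a warning threshold. The discharge values and threshold below are invented for illustration:

```python
# Sketch: turning an ensemble forecast into a probabilistic flood statement.
# The exceedance probability of a warning threshold is estimated as the
# fraction of ensemble members above it. All numbers are hypothetical.

ensemble_discharge = [820, 990, 1150, 1230, 870, 1410, 1055, 930, 1520, 1190]  # m^3/s
warning_threshold = 1100.0  # m^3/s, e.g. a return-level threshold (hypothetical)

n_exceed = sum(q > warning_threshold for q in ensemble_discharge)
p_exceed = n_exceed / len(ensemble_discharge)

print(p_exceed)
```

Operational systems refine this raw fraction, for example by comparing against model-climatological return levels per grid cell and by requiring persistence across consecutive forecast runs before a notification is issued.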
Assessing Flood Risk Using Reservoir Flood Control Rules
Institute of Scientific and Technical Information of China (English)
Xiang Fu; Yadong Mei; Zhihuai Xiao
2016-01-01
The application of conventional flood operation regulation for the Pubugou Reservoir in southern China is restricted by an insufficient description of its flood control rules. Based on the requirements of different flood control objects, this paper proposes to optimize the flood control rules with a punishment mechanism, by defining different parameters of the flood control rules in response to the flood inflow forecast and reservoir water level. A genetic algorithm is adopted for solving the parameter optimization problem. The failure risk and overflow volume associated with insufficient downstream flood control capacity are assessed through the reservoir operation policies. The results show that the optimized regulation provides better performance than the current flood control rules.
Believe it or not? The challenge of validating large scale probabilistic risk models
Directory of Open Access Journals (Sweden)
Sayers Paul
2016-01-01
The National Flood Risk Assessment (NaFRA) for England and Wales was initially undertaken in 2002, with frequent updates since. NaFRA has become a key source of information on flood risk, informing policy and investment decisions as well as communicating risk to the public and insurers. To make well-informed decisions based on these data, users rightfully demand to know the confidence they can place in them. The probability of inundation and associated damage, however, cannot be validated in the traditional sense, due to the rare and random nature of damaging floods and the lack of a long (and widespread) stationary observational record (reflecting not only changes in climate but also the significant changes in land use and flood defence infrastructure that are likely to have occurred). To explore the validity of NaFRA this paper therefore provides a bottom-up qualitative exploration of the potential errors within the supporting methods and data. The paper concludes by underlining the need for further research to understand how to robustly validate probabilistic risk models.
Cranston, Michael; Speight, Linda; Maxey, Richard; Tavendale, Amy; Buchanan, Peter
2015-04-01
One of the main challenges for the flood forecasting community remains the provision of reliable early warnings of surface (or pluvial) flooding. The Scottish Flood Forecasting Service has been developing approaches for forecasting the risk of surface water flooding, including capitalising on the latest developments in quantitative precipitation forecasting from the Met Office. A probabilistic Heavy Rainfall Alert decision support tool helps operational forecasters assess the likelihood of surface water flooding against regional rainfall depth-duration estimates from MOGREPS-UK linked to historical short-duration flooding in Scotland. The surface water flood risk is communicated through the daily Flood Guidance Statement to emergency responders. A more recent development is an innovative risk-based hydrometeorological approach that links 24-hour ensemble rainfall forecasts through a hydrological model (Grid-to-Grid) to a library of impact assessments (Speight et al., 2015). The early warning tool - FEWS Glasgow - presents the risk of flooding to people, property and transport across a 1km grid over the city of Glasgow with a lead time of 24 hours. Communication of the risk was presented in a bespoke surface water flood forecast product designed on the basis of emergency responder requirements and trialled during the 2014 Commonwealth Games in Glasgow. The development of new approaches to surface water flood forecasting is leading to improved methods of communicating the risk and better performance in early warning, with a reduction in false alarm rates for summer flood guidance from 2013 (81%) to 2014 (67%), although verification of instances of surface water flooding remains difficult. However, the introduction of more demanding hydrometeorological capabilities, with associated greater levels of uncertainty, does lead to an increased demand on operational flood forecasting skills and resources. Speight, L., Cole, S.J., Moore, R.J., Pierce, C., Wright, B., Golding, B
Orhan, A Emin; Ma, Wei Ji
2017-07-26
Animals perform near-optimal probabilistic inference in a wide range of psychophysical tasks. Probabilistic inference requires trial-to-trial representation of the uncertainties associated with task variables and subsequent use of this representation. Previous work has implemented such computations using neural networks with hand-crafted and task-dependent operations. We show that generic neural networks trained with a simple error-based learning rule perform near-optimal probabilistic inference in nine common psychophysical tasks. In a probabilistic categorization task, error-based learning in a generic network simultaneously explains a monkey's learning curve and the evolution of qualitative aspects of its choice behavior. In all tasks, the number of neurons required for a given level of performance grows sublinearly with the input population size, a substantial improvement on previous implementations of probabilistic inference. The trained networks develop a novel sparsity-based probabilistic population code. Our results suggest that probabilistic inference emerges naturally in generic neural networks trained with error-based learning rules. Behavioural tasks often require probability distributions to be inferred about task specific variables. Here, the authors demonstrate that generic neural networks can be trained using a simple error-based learning rule to perform such probabilistic computations efficiently without any need for task specific operations.
Composite Flood Risk for New Jersey
U.S. Environmental Protection Agency — The Composite Flood Risk layer combines flood hazard datasets from Federal Emergency Management Agency (FEMA) flood zones, NOAA's Shallow Coastal Flooding, and the...
Composite Flood Risk for Virgin Islands
U.S. Environmental Protection Agency — The Composite Flood Risk layer combines flood hazard datasets from Federal Emergency Management Agency (FEMA) flood zones, NOAA's Shallow Coastal Flooding, and the...
Flood Risk Management In Europe: European flood regulation
Hegger, D.L.T.; Bakker, M.H.; Green, C.; Driessen, Peter; Delvaux, B.; Rijswick, H.F.M.W. van; Suykens, C.; Beyers, J-C.; Deketelaere, K.; Doorn-Hoekveld, W. van; Dieperink, C.
2013-01-01
In Europe, water management is moving from flood defense to a risk management approach, which takes both the probability and the potential consequences of flooding into account. In this report, we will look at Directives and (non-)EU- initiatives in place to deal with flood risk in Europe indirectly
Improving Global Flood Forecasting using Satellite Detected Flood Extent
Revilla Romero, B.
2016-01-01
Flooding is a natural global phenomenon but in many cases is exacerbated by human activity. Although flooding generally affects humans in a negative way, bringing death, suffering, and economic impacts, it also has potentially beneficial effects. Early flood warning and forecasting systems, as well
Merging information from multi-model flood projections in a hierarchical Bayesian framework
Le Vine, Nataliya
2016-04-01
Multi-model ensembles are becoming widely accepted for flood frequency change analysis. The use of multiple models results in large uncertainty around estimates of flood magnitudes, due to both uncertainty in model selection and natural variability of river flow. The challenge is therefore to extract the most meaningful signal from the multi-model predictions, accounting for both model quality and uncertainties in individual model estimates. The study demonstrates the potential of a recently proposed hierarchical Bayesian approach to combine information from multiple models. The approach facilitates explicit treatment of shared multi-model discrepancy as well as the probabilistic nature of the flood estimates, by treating the available models as a sample from a hypothetical complete (but unobserved) set of models. The advantages of the approach are: 1) to ensure adequate 'baseline' conditions with which to compare future changes; 2) to reduce flood estimate uncertainty; 3) to maximize use of statistical information in circumstances where multiple weak predictions individually lack power, but collectively provide meaningful information; 4) to adjust multi-model consistency criteria when model biases are large; and 5) to explicitly consider the influence of the (model performance) stationarity assumption. Moreover, the analysis indicates that reducing shared model discrepancy is the key to further reduction of uncertainty in the flood frequency analysis. The findings are of value regarding how conclusions about changing exposure to flooding are drawn, and to flood frequency change attribution studies.
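The role of a shared multi-model discrepancy term can be sketched with a two-level normal model: each model's flood estimate is treated as noisy around a common true value, with an extra inter-model variance added to every model's own uncertainty. All numbers below are invented, and the paper's actual hierarchical model (unobserved model set, full priors) is considerably richer:

```python
# Toy sketch of combining multi-model flood magnitude estimates:
# y_i ~ N(mu, s_i^2 + tau^2), where tau^2 is a shared inter-model
# discrepancy variance. All values are hypothetical illustrations.
import math

estimates = [1200.0, 1450.0, 1320.0, 1600.0]   # 100-yr flood, m^3/s (hypothetical)
std_errs  = [150.0, 200.0, 120.0, 250.0]       # per-model uncertainty (hypothetical)
tau = 180.0                                    # assumed shared model discrepancy

# Precision-weighted posterior for mu under a flat prior: inflating every
# model's variance by tau^2 downweights overconfident individual models.
weights = [1.0 / (s**2 + tau**2) for s in std_errs]
mu_post = sum(w * y for w, y in zip(weights, estimates)) / sum(weights)
sd_post = math.sqrt(1.0 / sum(weights))

print(mu_post, sd_post)
```

Note how a large tau makes the combined estimate closer to a plain average: when shared discrepancy dominates, differences in the models' stated precisions matter less, which is why reducing that discrepancy is the key to tightening the combined flood estimate.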
Real-time updating of the flood frequency distribution through data assimilation
Aguilar, Cristina; Montanari, Alberto; Polo, María-José
2017-07-01
We explore the memory properties of catchments for predicting the likelihood of floods based on observations of average flows in pre-flood seasons. Our approach assumes that flood formation is driven by the superimposition of short- and long-term perturbations. The former is given by the short-term meteorological forcing leading to infiltration and/or saturation excess, while the latter is originated by higher-than-usual storage in the catchment. To exploit the above sensitivity to long-term perturbations, a meta-Gaussian model and a data assimilation approach are implemented for updating the flood frequency distribution a season in advance. Accordingly, the peak flow in the flood season is predicted in probabilistic terms by exploiting its dependence on the average flow in the antecedent seasons. We focus on the Po River at Pontelagoscuro and the Danube River at Bratislava. We found that the shape of the flood frequency distribution is noticeably impacted by higher-than-usual flows occurring up to several months earlier. The proposed technique may allow one to reduce the uncertainty associated with the estimation of flood frequency.
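The essence of such a meta-Gaussian update can be sketched in the bivariate normal case: if the normal scores of the flood-season peak and the antecedent-season average flow are jointly Gaussian with correlation rho, conditioning on the antecedent score shifts and narrows the peak-flow distribution. The rho value and thresholds below are invented for illustration:

```python
# Sketch of the meta-Gaussian idea: normal scores Z_q (seasonal peak flow)
# and Z_a (antecedent average flow) are bivariate normal with correlation
# rho; observing Z_a = z gives Z_q | Z_a=z ~ N(rho*z, 1 - rho^2).
# rho = 0.45 is an invented illustration, not an estimate from the paper.
import math

def conditional_peak_score(z_antecedent, rho=0.45):
    """Posterior mean and sd of the peak-flow normal score given Z_a = z."""
    mean = rho * z_antecedent
    sd = math.sqrt(1.0 - rho**2)
    return mean, sd

def exceedance_prob(z_threshold, z_antecedent, rho=0.45):
    """P(Z_q > z_threshold | Z_a = z), via the standard normal tail (erfc)."""
    mean, sd = conditional_peak_score(z_antecedent, rho)
    z = (z_threshold - mean) / sd
    return 0.5 * math.erfc(z / math.sqrt(2.0))

# A wetter-than-usual pre-flood season (Z_a = +1.5) raises the probability
# of exceeding the unconditional 10-year level (z = 1.2816) well above the
# probability under average antecedent conditions (Z_a = 0):
print(exceedance_prob(1.2816, 0.0), exceedance_prob(1.2816, 1.5))
```

Mapping flows to normal scores and back is done through the marginal flow distributions; the sketch above shows only the Gaussian core of the update.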
Williams, P.; Huddelston, M.; Michel, G.; Thompson, S.; Heynert, K.; Pickering, C.; Abbott Donnelly, I.; Fewtrell, T.; Galy, H.; Sperna Weiland, F.; Winsemius, H.; Weerts, A.; Nixon, S.; Davies, P.; Schiferli, D.
2012-04-01
Recently, a Global Flood Model (GFM) initiative has been proposed by Willis, UK Met Office, Esri, Deltares and IBM. The idea is to create a global community platform that enables better understanding of the complexities of flood risk assessment to better support the decisions, education and communication needed to mitigate flood risk. The GFM will provide tools for assessing the risk of floods, for devising mitigation strategies such as land-use changes and infrastructure improvements, and for enabling effective pre- and post-flood event response. The GFM combines humanitarian and commercial motives. It will benefit: - The public, seeking to preserve personal safety and property; - State and local governments, seeking to safeguard economic activity, and improve resilience; - NGOs, similarly seeking to respond proactively to flood events; - The insurance sector, seeking to understand and price flood risk; - Large corporations, seeking to protect global operations and supply chains. The GFM is an integrated and transparent set of modules, each composed of models and data. For each module, there are two core elements: a live "reference version" (a worked example) and a framework of specifications, which will allow development of alternative versions. In the future, users will be able to work with the reference version or substitute their own models and data. If these meet the specification for the relevant module, they will interoperate with the rest of the GFM. Some "crowd-sourced" modules could even be accredited and published to the wider GFM community. Our intent is to build on existing public, private and academic work, improve local adoption, and stimulate the development of multiple - but compatible - alternatives, so strengthening mankind's ability to manage flood impacts. The GFM is being developed and managed by a non-profit organization created for the purpose. The business model will be inspired from open source software (e.g. Linux): - for non-profit usage
Directory of Open Access Journals (Sweden)
H. Apel
2015-08-01
Many urban areas experience both fluvial and pluvial floods, because locations next to rivers are preferred settlement areas, and the predominantly sealed urban surface prevents infiltration and facilitates surface inundation. The latter problem is enhanced in cities with insufficient or non-existent sewer systems. While there are a number of approaches to analyse either fluvial or pluvial flood hazard, studies of combined fluvial and pluvial flood hazard are hardly available. Thus, this study aims at the analysis of fluvial and pluvial flood hazard individually, but also at developing a method for the analysis of combined pluvial and fluvial flood hazard. This combined fluvial-pluvial flood hazard analysis is performed taking Can Tho city, the largest city in the Vietnamese part of the Mekong Delta, as example. In this tropical environment the annual monsoon-triggered floods of the Mekong River can coincide with heavy local convective precipitation events, causing both fluvial and pluvial flooding at the same time. Fluvial flood hazard was estimated with a copula-based bivariate extreme value statistic for the gauge Kratie at the upper boundary of the Mekong Delta and a large-scale hydrodynamic model of the Mekong Delta. This provided the boundaries for 2-dimensional hydrodynamic inundation simulation for Can Tho city. Pluvial hazard was estimated by a peak-over-threshold frequency estimation based on local rain gauge data, and a stochastic rain storm generator. Inundation was simulated by a 2-dimensional hydrodynamic model implemented on a Graphical Processor Unit (GPU) for time-efficient flood propagation modelling. All hazards – fluvial, pluvial and combined – were accompanied by an uncertainty estimation considering the natural variability of the flood events. This resulted in probabilistic flood hazard maps showing the maximum inundation depths for a selected set of probabilities of occurrence, with maps showing the expectation (median
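The peak-over-threshold step for pluvial hazard can be sketched with a deliberately simplified fit: excesses over a threshold modelled with an exponential tail (a generalized Pareto distribution with shape xi = 0) fitted by the method of moments. The rainfall series, threshold and record length below are invented, and the paper's actual analysis uses a full stochastic rainstorm generator:

```python
# Sketch of a peak-over-threshold (POT) rainfall frequency estimate with
# invented daily values (mm/day). Excesses over the threshold are modelled
# with an exponential tail (GPD with shape xi = 0), fitted by the method of
# moments -- a simplification of the paper's stochastic analysis.
import math

daily_rain = [3, 0, 12, 55, 0, 71, 8, 64, 0, 90, 2, 60, 0, 110, 5, 75, 0, 0, 66, 4]
years_of_record = 10.0
u = 50.0                                     # threshold, mm/day (hypothetical)

excesses = [x - u for x in daily_rain if x > u]
lam = len(excesses) / years_of_record        # mean exceedances per year
sigma = sum(excesses) / len(excesses)        # exponential scale (moment fit)

def return_level(T):
    """T-year event: the level exceeded on average once every T years."""
    return u + sigma * math.log(lam * T)

print(return_level(10), return_level(100))
```

A full analysis would fit the GPD shape parameter as well (heavy or bounded tails), check threshold sensitivity, and attach confidence intervals; the exponential case above just makes the return-level formula transparent.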
Minnesota Department of Natural Resources — FEMA flood hazard delineations are used by the Federal Emergency Management Agency (FEMA) to designate the Special Flood Hazard Area (SFHA) and for insurance rating...
FEMA DFIRM Base Flood Elevations
Minnesota Department of Natural Resources — The Base Flood Elevation (BFE) table is required for any digital data where BFE lines will be shown on the corresponding Flood Insurance Rate Map (FIRM). Normally,...
2013 FEMA Flood Hazard Boundaries
Earth Data Analysis Center, University of New Mexico — The National Flood Hazard Layer (NFHL) data incorporates all Digital Flood Insurance Rate Map (DFIRM) databases published by FEMA, and any Letters of Map Revision...
FLOOD CHARACTERISTICS AND MANAGEMENT ADAPTATIONS ...
African Journals Online (AJOL)
Dr Osondu
2011-10-26
, bearing flood losses and land ... Engineering control of the major tributaries of the Imo River system is required to ... on previous knowledge of physical nature of flood ... uptake; other factors include a lack of formal titles to.
2013 FEMA Base Flood Elevation
Earth Data Analysis Center, University of New Mexico — The National Flood Hazard Layer (NFHL) data incorporates all Digital Flood Insurance Rate Map (DFIRM) databases published by FEMA, and any Letters of Map Revision...
Base Flood Elevation (BFE) Lines
Department of Homeland Security — The Base Flood Elevation (BFE) table is required for any digital data where BFE lines will be shown on the corresponding Flood Insurance Rate Map (FIRM). Normally if...
National Flood Hazard Layer (NFHL)
Federal Emergency Management Agency, Department of Homeland Security — The National Flood Hazard Layer (NFHL) is a compilation of GIS data that comprises a nationwide digital Flood Insurance Rate Map. The GIS data and services are...
Kansas Data Access and Support Center — The Q3 Flood Data are derived from the Flood Insurance Rate Maps (FIRMS) published by the Federal Emergency Management Agency (FEMA). The file is georeferenced to...
California Department of Resources — The Q3 Flood Data product is a digital representation of certain features of FEMA's Flood Insurance Rate Map (FIRM) product, intended for use with desktop mapping...
2013 FEMA Flood Control Structures
Earth Data Analysis Center, University of New Mexico — The National Flood Hazard Layer (NFHL) data incorporates all Digital Flood Insurance Rate Map (DFIRM) databases published by FEMA, and any Letters of Map Revision...
Multivariate pluvial flood damage models
Energy Technology Data Exchange (ETDEWEB)
Van Ootegem, Luc [HIVA — University of Louvain (Belgium); SHERPPA — Ghent University (Belgium); Verhofstadt, Elsy [SHERPPA — Ghent University (Belgium); Van Herck, Kristine; Creten, Tom [HIVA — University of Louvain (Belgium)
2015-09-15
Depth–damage functions, relating the monetary flood damage to the depth of the inundation, are commonly used in the case of fluvial floods (floods caused by a river overflowing). We construct four multivariate damage models for pluvial floods (caused by extreme rainfall) by differentiating on the one hand between ground floor floods and basement floods, and on the other hand between damage to residential buildings and damage to housing contents. We not only take into account the effect of flood depth on damage, but also incorporate the effects of non-hazard indicators (building characteristics, behavioural indicators and socio-economic variables). By using a Tobit estimation technique on identified victims of pluvial floods in Flanders (Belgium), we take into account cases of reported zero damage. Our results show that flood depth is an important predictor of damage, but with a diverging impact between ground floor floods and basement floods. Non-hazard indicators are also important. For example, being aware of the risk just before the water enters the building reduces content damage considerably, underlining the importance of warning systems and policy in the case of pluvial floods. - Highlights: • Prediction of pluvial flood damage using also non-hazard information • We include 'no damage' cases using a Tobit model. • The effect of flood depth is stronger for ground floor than for basement floods. • Non-hazard indicators are especially important for content damage. • Potential gain of policies that increase awareness of flood risks.
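The Tobit model handles the zero-damage reports as censoring: a reported zero means the latent damage fell at or below zero. A minimal sketch of the censored log-likelihood, with hypothetical data and a single depth regressor rather than the authors' full multivariate specification, might look like:

```python
import math

def norm_pdf(z):
    """Standard normal density."""
    return math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)

def norm_cdf(z):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def tobit_loglik(beta0, beta1, sigma, depths, damages):
    """Tobit log-likelihood censored at zero: latent damage is
    beta0 + beta1 * depth + Normal(0, sigma) noise, and observed
    damage is the latent value truncated below at zero."""
    ll = 0.0
    for x, y in zip(depths, damages):
        mu = beta0 + beta1 * x
        if y > 0.0:   # uncensored observation: normal density term
            ll += math.log(norm_pdf((y - mu) / sigma) / sigma)
        else:         # censored at zero: P(latent damage <= 0)
            ll += math.log(norm_cdf(-mu / sigma))
    return ll

# Hypothetical flood depths (m) and damages; two zero-damage (censored) cases
depths = [0.0, 0.1, 0.5, 1.0]
damages = [0.0, 0.0, 200.0, 600.0]
fit = tobit_loglik(-100.0, 700.0, 150.0, depths, damages)
```

Maximising this likelihood over the coefficients recovers the depth effect while properly weighting the zero-damage cases, which ordinary least squares would mishandle.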
Application of probabilistic precipitation forecasts from a ...
African Journals Online (AJOL)
2014-02-14
Feb 14, 2014 ... potential flash floods in support of the flash flood warning system of the South African Weather Service (SAWS). The aim of this ... life in the form of water, can become extremely hostile and violent in ... scenarios for operational decision making. ... extrapolation techniques or statistical modelling using radar.
Apel, Heiko; Garschagen, Matthias; Delgado, José Miguel; Viet Dung, Nguyen; Van Tuan, Vo; Thanh Binh, Nguyen; Birkmann, Joern; Merz, Bruno
2013-04-01
Low-lying estuaries such as the Mekong Delta in Vietnam are among the areas most vulnerable to climate change impacts. While regular floods are not a threat but an opportunity for livelihoods and income generation, extreme flood events can pose considerable risks to the people living in deltas. Climate change is expected to increase the frequency of extreme floods globally, which in combination with sea level rise and a likely intensification of cyclone activity creates increased and/or entirely new hazard exposure in deltas. Yet, in line with the risk literature and especially the recent IPCC SREX report, flooding risk needs to be understood as deriving from the interaction of physical hazards and the vulnerabilities of exposed elements. Therefore, the paper aims for an integrated risk assessment by combining the most up-to-date estimates of flood hazard projections under climate change conditions in the Mekong Delta with an assessment of vulnerability patterns. Projections of flood hazard are estimated based on the modulation of the flood frequency distribution by atmospheric circulation patterns. Future projections of these patterns are calculated from an ensemble of climate models. A quasi two-dimensional hydrodynamic model of the Delta is then applied to estimate water levels and flood extent. This model is fed with a set of hydrographs based on both the derived climate model uncertainty and the bivariate nature of floods in the Mekong Delta. Flood peak is coupled with flood volume in the probabilistic framework to derive synthetic extreme future floods with associated probabilities of occurrence. This flood hazard analysis is combined with static sea level rise scenarios, which alter the lower boundary of the hydrodynamic model and give estimates of the impact of sea level rise on inundation extent and depths. The vulnerability assessment is based on a three-step approach. Firstly, vulnerability profiles are developed for different
Capturing spatial and temporal patterns of widespread, extreme flooding across Europe
Busby, Kathryn; Raven, Emma; Liu, Ye
2013-04-01
Statistical characterisation of physical hazards is an integral part of probabilistic catastrophe models used by the reinsurance industry to estimate losses from large-scale events. Extreme flood events are not restricted by country boundaries, which poses an issue for reinsurance companies as their exposures often extend beyond them. We discuss challenges and solutions that allow us to appropriately capture the spatial and temporal dependence of extreme hydrological events on a continental scale, which in turn enables us to generate an industry-standard stochastic event set for estimating financial losses for widespread flooding. In presenting our event set methodology, we focus on explaining how extreme value theory (EVT) and dependence modelling are used to account for short, inconsistent hydrological data from different countries, and how to make appropriate statistical decisions that best characterise the nature of flooding across Europe. The consistency of input data is of vital importance when identifying historical flood patterns. Collating data from numerous sources inherently causes inconsistencies, and we demonstrate our robust approach to assessing the data and refining it to compile a single consistent dataset. This dataset is then extrapolated using a parameterised EVT distribution to estimate extremes. Our method then captures the dependence of flood events across countries using an advanced multivariate extreme value model. Throughout, important statistical decisions are explored, including: (1) distribution choice; (2) the threshold to apply for extracting extreme data points; (3) a regional analysis; (4) the definition of a flood event, which is often linked to the reinsurance industry's hours clause; and (5) handling of missing values. Finally, having modelled the historical patterns of flooding across Europe, we sample from this model to generate our stochastic event set comprising thousands of events over thousands of years. We then briefly
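The peaks-over-threshold step in such an event-set workflow fits a generalised Pareto distribution (GPD) to exceedances above a chosen threshold. Given fitted parameters (the values below are hypothetical, not from this work), the T-year return level follows from a standard closed form:

```python
import math

def gpd_return_level(u, sigma, xi, rate, T):
    """T-year return level for a peaks-over-threshold model with
    threshold u, GPD scale sigma and shape xi, and an average of
    'rate' threshold exceedances per year."""
    if abs(xi) < 1e-12:  # exponential-tail limit as xi -> 0
        return u + sigma * math.log(rate * T)
    return u + (sigma / xi) * ((rate * T) ** xi - 1.0)

# Hypothetical fit: threshold 1000 m^3/s, 2 exceedances per year on average
q100 = gpd_return_level(1000.0, 300.0, 0.1, 2.0, 100.0)
q500 = gpd_return_level(1000.0, 300.0, 0.1, 2.0, 500.0)
```

The threshold choice (decision 2 in the list above) trades bias against variance: too low and the GPD assumption fails, too high and few exceedances remain to fit.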
Duband, D.
2009-09-01
historical daily situations responsible for extreme floods with large discharges, with the associated conditional precipitations on catchments with good and up-to-date observations of precipitation (daily, hourly). Such complete studies would be very useful for: - statistical-physical studies of extreme rainfall-flood events (peak discharge, volume), their frequency, probability and uncertainty (GRADEX and SHADEX methodology); - better forecasting of meteorological (precipitation) and hydrological (flood) events during crisis situations; - better understanding of the historical variability over the past two centuries (atmospheric features, precipitation, high/low discharges); - better adjustment of modelling simulation; - better identification and a probabilistic treatment of uncertainties.
Optimal strategies for flood prevention
Eijgenraam, Carel; Brekelmans, Ruud; den Hertog, Dick; Roos, C.
2016-01-01
Flood prevention policy is of major importance to the Netherlands since a large part of the country is below sea level and high water levels in rivers may also cause floods. In this paper we propose a dike height optimization model to determine economically efficient flood protection standards. We i
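The economic trade-off behind such dike-height optimization can be sketched with stylised, entirely illustrative numbers (not the paper's calibrated model): a linear investment cost in extra height against an exponentially decaying flood probability times expected discounted damage, minimised over a grid:

```python
import math

def total_cost(h, fixed=50.0, per_m=3.0, p0=0.01, alpha=2.0,
               damage=1.0e6, discount=0.04):
    """Stylised total cost of heightening a dike by h metres:
    investment plus expected discounted flood damage, with the
    annual flood probability decaying as p0 * exp(-alpha * h).
    All parameter values are illustrative assumptions."""
    investment = fixed + per_m * h if h > 0 else 0.0
    p_flood = p0 * math.exp(-alpha * h)
    return investment + p_flood * damage / discount

# Grid search over heightenings of 0 to 10 m in 1 cm steps
best_h = min((i * 0.01 for i in range(0, 1001)), key=total_cost)
```

With these numbers the optimum sits where the marginal investment cost equals the marginal reduction in expected damage, the same first-order condition that drives economically efficient protection standards.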
Theresa K. Andersen; Marshall J. Shepherd
2013-01-01
Atmospheric warming and associated hydrological changes have implications for regional flood intensity and frequency. Climate models and hydrological models have the ability to integrate various contributing factors and assess potential changes to hydrology at global to local scales through the century. This survey of floods in a changing climate reviews flood...
Probabilistic tsunami hazard assessment at Seaside, Oregon, for near- and far-field seismic sources
GonzáLez, F. I.; Geist, E. L.; Jaffe, B.; KâNoǧLu, U.; Mofjeld, H.; Synolakis, C. E.; Titov, V. V.; Arcas, D.; Bellomo, D.; Carlton, D.; Horning, T.; Johnson, J.; Newman, J.; Parsons, T.; Peters, R.; Peterson, C.; Priest, G.; Venturato, A.; Weber, J.; Wong, F.; Yalciner, A.
2009-11-01
The first probabilistic tsunami flooding maps have been developed. The methodology, called probabilistic tsunami hazard assessment (PTHA), integrates tsunami inundation modeling with methods of probabilistic seismic hazard assessment (PSHA). Application of the methodology to Seaside, Oregon, has yielded estimates of the spatial distribution of 100- and 500-year maximum tsunami amplitudes, i.e., amplitudes with 1% and 0.2% annual probability of exceedance. The 100-year tsunami is generated most frequently by far-field sources in the Alaska-Aleutian Subduction Zone and is characterized by maximum amplitudes that do not exceed 4 m, with an inland extent of less than 500 m. In contrast, the 500-year tsunami is dominated by local sources in the Cascadia Subduction Zone and is characterized by maximum amplitudes in excess of 10 m and an inland extent of more than 1 km. The primary sources of uncertainty in these results include those associated with interevent time estimates, modeling of background sea level, and accounting for temporal changes in bathymetry and topography. Nonetheless, PTHA represents an important contribution to tsunami hazard assessment techniques; viewed in the broader context of risk analysis, PTHA provides a method for quantifying estimates of the likelihood and severity of the tsunami hazard, which can then be combined with vulnerability and exposure to yield estimates of tsunami risk.
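The 100- and 500-year amplitudes above correspond to 1% and 0.2% annual exceedance probabilities. A common companion calculation, under a standard Poisson-arrivals assumption rather than anything specific to this study, is the chance of seeing at least one such event during a finite exposure window:

```python
import math

def prob_in_window(return_period, years):
    """Probability of at least one exceedance of the T-year level
    during an exposure window, assuming Poisson event arrivals."""
    return 1.0 - math.exp(-years / return_period)

# Chance of at least one 500-year tsunami within a 50-year horizon
p50 = prob_in_window(500.0, 50.0)
print(round(p50, 3))  # 0.095
```

This is why "500-year event" does not mean negligible: over a typical facility lifetime the exceedance chance is close to ten percent.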
Residual ultimate strength of a very large crude carrier considering probabilistic damage extents
Directory of Open Access Journals (Sweden)
Choung Joonmo
2014-03-01
Full Text Available This paper provides predictions of the ultimate longitudinal strength of the hull girder of a very large crude carrier, considering probabilistic damage extents due to collision and grounding accidents based on the IMO Guidelines (2003). The probability density functions of damage extent are expressed as a function of non-dimensional damage variables. The accumulated probability levels of 10%, 30%, 50%, and 70% are taken into account for the estimation of damage extent. The ultimate strengths have been calculated using the in-house software called Ultimate Moment Analysis of Damaged Ships, which is based on the progressive collapse method with a new convergence criterion of force vector equilibrium. Damage indices are provided for several probable heeling angles from 0° (sagging) to 180° (hogging) due to collision- and grounding-induced structural failures and consequent flooding of compartments. The residual strength analyses show that the second moment of area of a damaged section can be a reliable index for the estimation of the residual ultimate strength. A simple polynomial formula is also proposed based on minimum residual ultimate strengths.
Probabilistic numerics and uncertainty in computations.
Hennig, Philipp; Osborne, Michael A; Girolami, Mark
2015-07-08
We deliver a call to arms for probabilistic numerical methods: algorithms for numerical tasks, including linear algebra, integration, optimization and solving differential equations, that return uncertainties in their calculations. Such uncertainties, arising from the loss of precision induced by numerical calculation with limited time or hardware, are important for much of contemporary science and industry. Within applications such as climate science and astrophysics, the need to make decisions on the basis of computations with large and complex data has led to a renewed focus on the management of numerical uncertainty. We describe how several seminal classic numerical methods can be interpreted naturally as probabilistic inference. We then show that the probabilistic view suggests new algorithms that can flexibly be adapted to suit application specifics, while delivering improved empirical performance. We provide concrete illustrations of the benefits of probabilistic numeric algorithms on real scientific problems from astrometry and astronomical imaging, while highlighting open problems with these new algorithms. Finally, we describe how probabilistic numerical methods provide a coherent framework for identifying the uncertainty in calculations performed with a combination of numerical algorithms (e.g. both numerical optimizers and differential equation solvers), potentially allowing the diagnosis (and control) of error sources in computations.
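A deliberately simple example of a numerical routine that reports its own uncertainty is Monte Carlo quadrature, which returns a standard error alongside the integral estimate; the paper's programme goes much further, reinterpreting classic deterministic methods themselves as probabilistic inference, but the flavour is similar:

```python
import math
import random
import statistics

def mc_integrate(f, a, b, n=20000, seed=1):
    """Monte Carlo quadrature returning (estimate, standard error):
    the error term quantifies the uncertainty introduced by using
    a finite random sample in place of the exact integral."""
    rng = random.Random(seed)
    samples = [f(a + (b - a) * rng.random()) for _ in range(n)]
    mean = statistics.fmean(samples)
    stderr = statistics.stdev(samples) / math.sqrt(n)
    return (b - a) * mean, (b - a) * stderr

est, err = mc_integrate(lambda x: x * x, 0.0, 1.0)  # true value: 1/3
```

A downstream decision can then weigh the estimate against its reported error, exactly the kind of uncertainty propagation across chained numerical routines the abstract argues for.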
Pan-European stochastic flood event set
Kadlec, Martin; Pinto, Joaquim G.; He, Yi; Punčochář, Petr; Kelemen, Fanni D.; Manful, Desmond; Palán, Ladislav
2017-04-01
Impact Forecasting (IF), the model development center of Aon Benfield, has been developing a large suite of probabilistic catastrophe flood models for individual countries in Europe. Such natural catastrophes do not follow national boundaries: for example, the major flood in 2016 was responsible for Europe's largest insured loss of USD 3.4bn and affected Germany, France, Belgium, Austria and parts of several other countries. Reflecting such needs, IF initiated the development of a pan-European flood event set which combines cross-country exposures with country-based loss distributions to provide more insightful data to re/insurers. Because observed discharge data are not available across the whole of Europe in sufficient quantity and quality for detailed loss evaluation, a top-down approach was chosen. This approach is based on simulating precipitation from a GCM/RCM model chain, followed by a calculation of discharges using rainfall-runoff modelling. IF set up this project in close collaboration with the Karlsruhe Institute of Technology (KIT) regarding the precipitation estimates and with the University of East Anglia (UEA) for the rainfall-runoff modelling. KIT's main objective is to provide high-resolution daily historical and stochastic time series of key meteorological variables. A purely dynamical downscaling approach with the regional climate model COSMO-CLM (CCLM) is used to generate the historical time series, using re-analysis data as boundary conditions. The resulting time series are validated against the gridded observational dataset E-OBS, and different bias-correction methods are employed. The generation of the stochastic time series requires transfer functions between large-scale atmospheric variables and regional temperature and precipitation fields. These transfer functions are developed for the historical time series using reanalysis data as predictors and bias-corrected CCLM simulated precipitation and temperature as
The European Flood Alert System – Part 1: Concept and development
Directory of Open Access Journals (Sweden)
A. de Roo
2008-02-01
Full Text Available This paper presents the development of the European Flood Alert System (EFAS), which aims at increasing preparedness for floods in trans-national European river basins by providing local water authorities with medium-range and probabilistic flood forecasting information 3 to 10 days in advance. The EFAS research project started in 2003 with the development of a prototype at the European Commission Joint Research Centre (JRC), in close collaboration with the national hydrological and meteorological services. The prototype covers the whole of Europe on a 5 km grid. In parallel, different high-resolution data sets have been collected for the Elbe and Danube river basins, allowing the potential of the system under optimum conditions and at a higher resolution to be assessed. Flood warning lead times of 3–10 days are achieved through the incorporation of medium-range weather forecasts from the Deutscher Wetterdienst (DWD) and the European Centre for Medium-Range Weather Forecasts (ECMWF), comprising a full set of 51 probabilistic forecasts from the Ensemble Prediction System (EPS) provided by ECMWF. The ensemble of different hydrographs is analysed and combined to produce early flood warning information, which is disseminated to the hydrological services that have agreed to participate in the development of the system. In Part 1 of this paper, the scientific approach adopted in the development of the system is presented. The rationale of the project, the system's set-up, its underlying components, basic principles, and products are described. In Part 2, results of a detailed statistical analysis of the performance of the system are shown, with regard to both probabilistic and deterministic forecasts
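The core of turning an EPS-driven hydrograph ensemble into warning information is the fraction of members exceeding a critical level. A minimal sketch with synthetic discharges and an assumed warning threshold (not EFAS's actual criteria):

```python
import random

def ensemble_exceedance(peak_discharges, threshold):
    """Fraction of ensemble members whose simulated peak discharge
    exceeds a critical threshold -- a simple probabilistic flood
    warning signal derived from the forecast ensemble."""
    hits = sum(1 for q in peak_discharges if q > threshold)
    return hits / len(peak_discharges)

# 51 synthetic members standing in for EPS-driven hydrograph peaks (m^3/s)
rng = random.Random(7)
members = [rng.gauss(1500.0, 400.0) for _ in range(51)]
p_warn = ensemble_exceedance(members, 2000.0)  # value in [0, 1]
```

Issuing alerts only when this fraction stays high across consecutive forecast runs is one way such systems suppress false alarms at 3–10 day lead times.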
The European Flood Alert System – Part 1: Concept and development
Directory of Open Access Journals (Sweden)
J. Thielen
2009-02-01
Full Text Available This paper presents the development of the European Flood Alert System (EFAS), which aims at increasing preparedness for floods in trans-national European river basins by providing local water authorities with medium-range and probabilistic flood forecasting information 3 to 10 days in advance. The EFAS research project started in 2003 with the development of a prototype at the European Commission Joint Research Centre (JRC), in close collaboration with the national hydrological and meteorological services. The prototype covers the whole of Europe on a 5 km grid. In parallel, different high-resolution data sets have been collected for the Elbe and Danube river basins, allowing the potential of the system under optimum conditions and at a higher resolution to be assessed. Flood warning lead times of 3–10 days are achieved through the incorporation of medium-range weather forecasts from the German Weather Service (DWD) and the European Centre for Medium-Range Weather Forecasts (ECMWF), comprising a full set of 51 probabilistic forecasts from the Ensemble Prediction System (EPS) provided by ECMWF. The ensemble of different hydrographs is analysed and combined to produce early flood warning information, which is disseminated to the hydrological services that have agreed to participate in the development of the system. In Part 1 of this paper, the scientific approach adopted in the development of the system is presented. The rationale of the project, the system's set-up, its underlying components, basic principles and products are described. In Part 2, results of a detailed statistical analysis of the performance of the system are shown, with regard to both probabilistic and deterministic forecasts.
Rethinking the relationship between flood risk perception and flood management.
Birkholz, S; Muro, M; Jeffrey, P; Smith, H M
2014-04-15
Although flood risk perceptions and their concomitant motivations for behaviour have long been recognised as significant features of community resilience in the face of flooding events, there has, for some time now, been a poorly appreciated fissure in the accompanying literature. Specifically, rationalist and constructivist paradigms in the broader domain of risk perception provide different (though not always conflicting) contexts for interpreting evidence and developing theory. This contribution reviews the major constructs that have been applied to understanding flood risk perceptions and contextualises these within broader conceptual developments around risk perception theory and contemporary thinking around flood risk management. We argue that there is a need to re-examine and re-invigorate flood risk perception research, in a manner that is comprehensively underpinned by more constructivist thinking around flood risk management as well as by developments in broader risk perception research. We draw attention to an historical over-emphasis on the cognitive perceptions of those at risk to the detriment of a richer understanding of a wider range of flood risk perceptions such as those of policy-makers or of tax-payers who live outside flood affected areas as well as the linkages between these perspectives and protective measures such as state-supported flood insurance schemes. Conclusions challenge existing understandings of the relationship between risk perception and flood management, particularly where the latter relates to communication strategies and the extent to which those at risk from flooding feel responsible for taking protective actions.
2002-01-01
Heavy rains in Central Europe over the past few weeks have led to some of the worst flooding the region has witnessed in more than a century. The floods have killed more than 100 people in Germany, Russia, Austria, Hungary, and the Czech Republic and have led to as much as $20 billion in damage. This false-color image of the Elbe River and its tributaries was taken on August 20, 2002, by the Moderate Resolution Imaging Spectroradiometer (MODIS), flying aboard NASA's Terra satellite. The floodwaters that inundated Dresden, Germany, earlier this week have moved north. As can be seen, the river resembles a fairly large lake in the center of the image just south of the town of Wittenberg. Flooding was also bad further downriver in the towns of Magdeburg and Hitzacker. Roughly 20,000 people were evacuated from their homes in northern Germany. Fifty thousand troops, border police, and technical assistance workers were called in to combat the floods, along with 100,000 volunteers. The floodwaters are not expected to badly affect Hamburg, which sits at the mouth of the river on the North Sea. Credit: Image courtesy Jacques Descloitres, MODIS Land Rapid Response Team at NASA GSFC
Institute of Scientific and Technical Information of China (English)
LI LI
2010-01-01
A series of heavy storms since early May led to severe flooding and landslides in south and southwest China, causing heavy casualties and economic losses. Severe convective weather such as downpours, gusts, hail and thunderstorms struck these areas for over a week from May 5.
Clementi, Andrea; Silvestri, Riccardo
2010-01-01
We consider a Mobile Ad-hoc NETwork (MANET) formed by n agents that move at speed V according to the Manhattan Random-Way Point model over a square region of side length L. The resulting stationary (agent) spatial probability distribution is far from uniform: the average density over the "central zone" is asymptotically higher than that over the "suburb". Agents exchange data iff they are at distance at most R from each other. We study the flooding time of this MANET: the number of time steps required to broadcast a message from one source agent to all agents of the network in the stationary phase. We prove the first asymptotic upper bound on the flooding time. This bound holds with high probability, it is a decreasing function of R and V, and it is tight for a wide and relevant range of the network parameters (i.e. L, R and V). A consequence of our result is that flooding over the sparse and highly-disconnected suburb can be as fast as flooding over the dense and connected central zone. Rather surprisin...
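The flooding process itself is easy to simulate on a static snapshot of agent positions (the paper's agents additionally move each step, which this sketch omits): count synchronous rounds until no uninformed agent lies within range R of an informed one.

```python
import math
import random

def flooding_rounds(n=150, side=1.0, radius=0.2, seed=5):
    """Simulate flooding from agent 0 over a static random placement:
    each round, every uninformed agent within 'radius' of some
    informed agent receives the message. Returns (rounds, reached)."""
    rng = random.Random(seed)
    pos = [(rng.random() * side, rng.random() * side) for _ in range(n)]
    informed = {0}
    rounds = 0
    while True:
        fresh = {j for j in range(n) if j not in informed
                 and any(math.dist(pos[j], pos[i]) <= radius
                         for i in informed)}
        if not fresh:  # no newly reachable agents: flooding has stopped
            return rounds, len(informed)
        informed |= fresh
        rounds += 1

rounds, reached = flooding_rounds()
```

In the mobile setting the paper analyses, movement lets the message escape disconnected components, which is why flooding over the sparse suburb can match the dense central zone.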
Advanced Small Modular Reactor (SMR) Probabilistic Risk Assessment (PRA) Demonstration
Energy Technology Data Exchange (ETDEWEB)
Curtis Smith; Steven Prescott; Tony Koonce
2014-04-01
A key area of the Advanced Small Modular Reactor (SMR) Probabilistic Risk Assessment (PRA) strategy is the development of methodologies and tools that will be used to predict the safety, security, safeguards, performance, and deployment viability of SMRs. The goal of the SMR PRA activity is to develop quantitative methods and tools and the associated analysis framework for assessing a variety of risks. Development and implementation of SMR-focused safety assessment methods may require new analytic methods or adaptation of traditional methods to the advanced design and operational features of SMRs. We will need to move beyond current limitations such as static, logic-based models in order to provide more integrated, scenario-based models built on predictive modeling and tied to causal factors. The development of SMR-specific safety models for margin determination will provide a safety case that describes potential accidents and design options (including postulated controls), and will support licensing activities by providing a technical basis for the safety envelope. This report documents the progress made to implement the PRA framework, specifically by demonstrating an advanced 3D approach to representing, quantifying and understanding flooding risks to a nuclear power plant.
Probabilistic Aspects in Spoken Document Retrieval
Directory of Open Access Journals (Sweden)
Macherey Wolfgang
2003-01-01
Full Text Available Accessing information in multimedia databases encompasses a wide range of applications in which spoken document retrieval (SDR) plays an important role. In SDR, a set of automatically transcribed speech documents constitutes the set of files for retrieval, to which a user may address a request in natural language. This paper deals with two probabilistic aspects of SDR. The first part investigates the effect of recognition errors on retrieval performance and asks why recognition errors have only a small effect on retrieval performance. In the second part, we present a new probabilistic approach to SDR that is based on interpolations between document representations. Experiments performed on the TREC-7 and TREC-8 SDR tasks show comparable or even better results for the newly proposed method than for other advanced heuristic and probabilistic retrieval metrics.
A Model-Driven Probabilistic Parser Generator
Quesada, Luis; Cortijo, Francisco J
2012-01-01
Existing probabilistic scanners and parsers impose hard constraints on the way lexical and syntactic ambiguities can be resolved. Furthermore, traditional grammar-based parsing tools are limited in the mechanisms they allow for taking context into account. In this paper, we propose a model-driven tool that allows for statistical language models with arbitrary probability estimators. Our work on model-driven probabilistic parsing is built on top of ModelCC, a model-based parser generator, and enables the probabilistic interpretation and resolution of anaphoric, cataphoric, and recursive references in the disambiguation of abstract syntax graphs. In order to demonstrate the expressive power of ModelCC, we describe the design of a general-purpose natural language parser.
Modal Specifications for Probabilistic Timed Systems
Directory of Open Access Journals (Sweden)
Tingting Han
2013-06-01
Full Text Available Modal automata are a classic formal model for component-based systems that comes equipped with a rich specification theory supporting abstraction, refinement and compositional reasoning. In recent years, quantitative variants of modal automata were introduced for specifying and reasoning about component-based designs for embedded and mobile systems. These respectively generalize modal specification theories for timed and probabilistic systems. In this paper, we define a modal specification language for combined probabilistic timed systems, called abstract probabilistic timed automata, which generalizes existing formalisms. We introduce appropriate syntactic and semantic refinement notions and discuss consistency of our specification language, also with respect to time-divergence. We identify a subclass of our models for which we define the fundamental operations for abstraction, conjunction and parallel composition, and show several compositionality results.
Probabilistic Modeling and Visualization for Bankruptcy Prediction
DEFF Research Database (Denmark)
Antunes, Francisco; Ribeiro, Bernardete; Pereira, Francisco Camara
2017-01-01
In accounting and finance domains, bankruptcy prediction is of great utility for all of the economic stakeholders. The challenge of accurately predicting business failure, especially under scenarios of financial crisis, is known to be complicated. Although there have been many successful studies on bankruptcy detection, probabilistic approaches have seldom been carried out. In this paper we assume a probabilistic point of view by applying Gaussian Processes (GP) in the context of bankruptcy prediction, comparing them against Support Vector Machines (SVM) and Logistic Regression (LR). Using real-world bankruptcy data, an in-depth analysis is conducted showing that, in addition to a probabilistic interpretation, the GP can effectively improve the bankruptcy prediction performance with high accuracy when compared to the other approaches. We additionally generate a complete graphical...
Probabilistic inversion for chicken processing lines
Energy Technology Data Exchange (ETDEWEB)
Cooke, Roger M. [Department of Mathematics, Delft University of Technology, Delft (Netherlands)]. E-mail: r.m.cooke@ewi.tudelft.nl; Nauta, Maarten [Microbiological Laboratory for Health Protection RIVM, Bilthoven (Netherlands); Havelaar, Arie H. [Microbiological Laboratory for Health Protection RIVM, Bilthoven (Netherlands); Fels, Ine van der [Microbiological Laboratory for Health Protection RIVM, Bilthoven (Netherlands)
2006-10-15
We discuss an application of probabilistic inversion techniques to a model of campylobacter transmission in chicken processing lines. Such techniques are indicated when we wish to quantify a model which is new and perhaps unfamiliar to the expert community. In this case there are no measurements for estimating model parameters, and experts are typically unable to give a considered judgment. In such cases, experts are asked to quantify their uncertainty regarding variables which can be predicted by the model. The experts' distributions (after combination) are then pulled back onto the parameter space of the model, a process termed 'probabilistic inversion'. This study illustrates two such techniques, iterative proportional fitting (IPF) and PARmeter fitting for uncertain models (PARFUM). In addition, we illustrate how expert judgement on predicted observable quantities in combination with probabilistic inversion may be used for model validation and/or model criticism.
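Iterative proportional fitting, one of the two techniques named above, repeatedly rescales a contingency table so its margins match prescribed targets. A minimal sketch follows; the 2x2 table and targets are illustrative, not the chicken-processing model:

```python
def ipf(table, row_targets, col_targets, iters=50):
    """Iterative proportional fitting: alternately rescale each row and
    each column of the table until its margins match the targets."""
    for _ in range(iters):
        for i, rt in enumerate(row_targets):
            s = sum(table[i])
            table[i] = [x * rt / s for x in table[i]]
        for j, ct in enumerate(col_targets):
            s = sum(row[j] for row in table)
            for row in table:
                row[j] *= ct / s
    return table

# Fit a uniform starting table to target margins (3, 1) and (2, 2).
fitted = ipf([[1.0, 1.0], [1.0, 1.0]], row_targets=[3.0, 1.0], col_targets=[2.0, 2.0])
```

In probabilistic inversion the "margins" are the experts' distributions on observable model outputs, and the table entries are probability weights on parameter samples.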
Scalable group level probabilistic sparse factor analysis
DEFF Research Database (Denmark)
Hinrich, Jesper Løve; Nielsen, Søren Føns Vind; Riis, Nicolai Andre Brogaard
2017-01-01
Many data-driven approaches exist to extract neural representations of functional magnetic resonance imaging (fMRI) data, but most of them lack a proper probabilistic formulation. We propose a scalable group level probabilistic sparse factor analysis (psFA) allowing spatially sparse maps, component pruning using automatic relevance determination (ARD) and subject specific heteroscedastic spatial noise modeling. For task-based and resting state fMRI, we show that the sparsity constraint gives rise to components similar to those obtained by group independent component analysis. The noise modeling shows that noise is reduced in areas typically associated with activation by the experimental design. The psFA model identifies sparse components and the probabilistic setting provides a natural way to handle parameter uncertainties. The variational Bayesian framework easily extends to more complex...
Probabilistic Grammar: The view from Cognitive Sociolinguistics
Directory of Open Access Journals (Sweden)
Jeroen Claes
2017-06-01
Full Text Available In this paper, I propose that Probabilistic Grammar may benefit from incorporating theoretical insights from Cognitive (Socio)Linguistics. I begin by introducing Cognitive Linguistics. Then, I propose a model of the domain-general cognitive constraints (markedness of coding, statistical preemption, and structural priming) that condition language (variation). Subsequently, three case studies are presented that test the predictions of this model on three distinct alternations in English and Spanish (variable agreement with existential 'haber', variable agreement with existential 'there be', and Spanish subject pronoun expression). For each case study, the model generates empirically correct predictions. I conclude that, with the support of Cognitive Sociolinguistics, Probabilistic Grammar may move beyond description towards explanation. This article is part of the special collection: Probabilistic grammars: Syntactic variation in a comparative perspective
bayesPop: Probabilistic Population Projections
Directory of Open Access Journals (Sweden)
Hana Ševčíková
2016-12-01
Full Text Available We describe bayesPop, an R package for producing probabilistic population projections for all countries. This uses probabilistic projections of total fertility and life expectancy generated by Bayesian hierarchical models. It produces a sample from the joint posterior predictive distribution of future age- and sex-specific population counts, fertility rates and mortality rates, as well as future numbers of births and deaths. It provides graphical ways of summarizing this information, including trajectory plots and various kinds of probabilistic population pyramids. An expression language is introduced which allows the user to produce the predictive distribution of a wide variety of derived population quantities, such as the median age or the old age dependency ratio. The package produces aggregated projections for sets of countries, such as UN regions or trading blocs. The methodology has been used by the United Nations to produce their most recent official population projections for all countries, published in the World Population Prospects.
Probabilistic Transcriptome Assembly and Variant Graph Genotyping
DEFF Research Database (Denmark)
Sibbesen, Jonas Andreas
the resulting sequencing data should be interpreted. This has over the years spurred the development of many probabilistic methods that are capable of modelling different aspects of the sequencing process. Here, I present two such methods that were developed to each tackle a different problem in bioinformatics, together with an application of the latter method to a large Danish sequencing project. The first is a probabilistic method for transcriptome assembly that is based on a novel generative model of the RNA sequencing process and provides confidence estimates on the assembled transcripts. We show that this approach outperforms existing state-of-the-art methods measured using sensitivity and precision on both simulated and real data. The second is a novel probabilistic method that uses exact alignment of k-mers to a set of variant graphs to provide unbiased estimates of genotypes in a population...
Probabilistic Forecasting of the Wave Energy Flux
DEFF Research Database (Denmark)
Pinson, Pierre; Reikard, G.; Bidlot, J.-R.
2012-01-01
Wave energy will certainly have a significant role to play in the deployment of renewable energy generation capacities. As with wind and solar, probabilistic forecasts of wave power over horizons of a few hours to a few days are required for power system operation as well as trading in electricity markets. A methodology for the probabilistic forecasting of the wave energy flux is introduced, based on a log-Normal assumption for the shape of predictive densities. It uses meteorological forecasts (from the European Centre for Medium-range Weather Forecasts – ECMWF) and local wave measurements as input. The parameters of the models involved are adaptively and recursively estimated. The methodology is evaluated for 13 locations around North America over a period of 15 months. The issued probabilistic forecasts substantially outperform the various benchmarks considered, with improvements between 6...
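Under the log-Normal assumption for predictive densities, quantile forecasts of the flux follow directly from Gaussian quantiles in log-space. The sketch below shows only that step; the location and scale values are illustrative, and the paper's adaptive, recursive parameter estimation is not reproduced:

```python
import math
from statistics import NormalDist

def lognormal_quantiles(mu_log, sigma_log, probs=(0.05, 0.5, 0.95)):
    """Predictive quantiles of a log-Normal density: exponentiate the
    Gaussian quantiles of the log-transformed variable."""
    z = NormalDist(mu_log, sigma_log)
    return [math.exp(z.inv_cdf(p)) for p in probs]

# A 90% central prediction interval and the median for assumed parameters.
q05, q50, q95 = lognormal_quantiles(2.0, 0.5)
```

Note that the median of a log-Normal variable is exp(mu_log), so the point forecast and the probabilistic forecast are linked by a closed form.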
Constraint Processing in Lifted Probabilistic Inference
Kisynski, Jacek
2012-01-01
First-order probabilistic models combine representational power of first-order logic with graphical models. There is an ongoing effort to design lifted inference algorithms for first-order probabilistic models. We analyze lifted inference from the perspective of constraint processing and, through this viewpoint, we analyze and compare existing approaches and expose their advantages and limitations. Our theoretical results show that the wrong choice of constraint processing method can lead to exponential increase in computational complexity. Our empirical tests confirm the importance of constraint processing in lifted inference. This is the first theoretical and empirical study of constraint processing in lifted inference.
The probabilistic approach to human reasoning.
Oaksford, M; Chater, N
2001-08-01
A recent development in the cognitive science of reasoning has been the emergence of a probabilistic approach to the behaviour observed on ostensibly logical tasks. According to this approach the errors and biases documented on these tasks occur because people import their everyday uncertain reasoning strategies into the laboratory. Consequently participants' apparently irrational behaviour is the result of comparing it with an inappropriate logical standard. In this article, we contrast the probabilistic approach with other approaches to explaining rationality, and then show how it has been applied to three main areas of logical reasoning: conditional inference, Wason's selection task and syllogistic reasoning.
Probabilistic Design of Wave Energy Devices
DEFF Research Database (Denmark)
Sørensen, John Dalsgaard; Kofoed, Jens Peter; Ferreira, C.B.
2011-01-01
Wave energy has a large potential for contributing significantly to production of renewable energy. However, the wave energy sector is still not able to deliver cost-competitive and reliable solutions, though it has already demonstrated several proofs of concept. The design of wave energy...... and advocate for a probabilistic design approach, as it is assumed (in other areas this has been demonstrated) that this leads to more economical designs compared to designs based on deterministic methods. In the present paper a general framework for probabilistic design and reliability analysis of wave energy...
Probabilistic Durability Analysis in Advanced Engineering Design
Directory of Open Access Journals (Sweden)
A. Kudzys
2000-01-01
Full Text Available Expedience of probabilistic durability concepts and approaches in advanced engineering design of building materials, structural members and systems is considered. Target margin values of structural safety and serviceability indices are analyzed and their draft values are presented. Analytical methods of the cumulative coefficient of correlation and the limit transient action effect for calculation of reliability indices are given. The analysis can be used for probabilistic durability assessment of carrying and enclosure metal, reinforced concrete, wood, plastic and masonry structures, both homogeneous and sandwich or composite, and some kinds of equipment. The analysis models can be applied in other engineering fields.
Probabilistic assessment of uncertain adaptive hybrid composites
Shiao, Michael C.; Singhal, Surendra N.; Chamis, Christos C.
1994-01-01
Adaptive composite structures using actuation materials, such as piezoelectric fibers, were assessed probabilistically utilizing intraply hybrid composite mechanics in conjunction with probabilistic composite structural analysis. Uncertainties associated with the actuation material as well as the uncertainties in the regular (traditional) composite material properties were quantified and considered in the assessment. Static and buckling analyses were performed for rectangular panels with various boundary conditions and different control arrangements. The probability density functions of the structural behavior, such as maximum displacement and critical buckling load, were computationally simulated. The results of the assessment indicate that improved design and reliability can be achieved with actuation material.
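The computational simulation of behaviour densities described above can be sketched as plain Monte Carlo propagation of a random material property through a closed-form response, here the Euler buckling load. The column properties and the modulus distribution below are invented for illustration; the paper's intraply hybrid composite mechanics are far richer:

```python
import math
import random

def buckling_load_samples(n=10000, seed=1):
    """Propagate an uncertain elastic modulus E through the Euler
    buckling formula P_cr = pi**2 * E * I / L**2 to obtain samples
    from the distribution of the critical load."""
    rng = random.Random(seed)
    I, L = 8.3e-6, 2.0              # second moment of area (m^4), length (m), assumed
    samples = []
    for _ in range(n):
        E = rng.gauss(70e9, 3.5e9)  # modulus mean and spread (Pa), assumed
        samples.append(math.pi ** 2 * E * I / L ** 2)
    return samples

loads = buckling_load_samples()
mean_load = sum(loads) / len(loads)
```

The empirical histogram of `loads` approximates the probability density function of the critical buckling load, the kind of output the assessment reports for each structural response.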
Quantum logic networks for probabilistic teleportation
Institute of Scientific and Technical Information of China (English)
刘金明; 张永生; 郭光灿
2003-01-01
By means of the primitive operations consisting of single-qubit gates, two-qubit controlled-not gates, von Neumann measurement and classically controlled operations, we construct efficient quantum logic networks for implementing probabilistic teleportation of a single qubit, a two-particle entangled state, and an N-particle entanglement. Based on the quantum networks, we show that after the partially entangled states are concentrated into maximal entanglement, the above three kinds of probabilistic teleportation are the same as the standard teleportation using the corresponding maximally entangled states as the quantum channels.
Why are probabilistic laws governing quantum mechanics and neurobiology?
Kröger, H
2004-01-01
We address the question: Why are the dynamical laws governing quantum mechanics and neuroscience probabilistic in nature rather than deterministic? We discuss some ideas showing that the probabilistic option offers advantages over the deterministic one.
Why are probabilistic laws governing quantum mechanics and neurobiology?
Kröger, Helmut
2005-08-01
We address the question: Why are the dynamical laws governing quantum mechanics and neuroscience probabilistic in nature rather than deterministic? We discuss some ideas showing that the probabilistic option offers advantages over the deterministic one.
DEFF Research Database (Denmark)
Liang, Gengsheng; Mioc, Darka; Anton, François
2007-01-01
Under flood events, ground traffic is blocked in and around the flooded area due to damage to roads and bridges. The traditional transportation network may not always help people to make the right decision for evacuation. In order to provide the dynamic road information needed for flood rescue, we developed an adaptive web-based transportation network application using Oracle technology. Moreover, the geographic relationships between the road network and flood areas are taken into account. The overlay between the road network and flood polygons is computed on the fly. This application allows users to retrieve the shortest and safest route in the Fredericton road network during a flood event. It enables users to make a timely decision for flood rescue. We are using Oracle Spatial to deal with emergency situations; the approach can be applied to other constrained network applications as well.
The state of the art of flood forecasting - Hydrological Ensemble Prediction Systems
Thielen-Del Pozo, J.; Pappenberger, F.; Salamon, P.; Bogner, K.; Burek, P.; de Roo, A.
2010-09-01
Flood forecasting systems form a key part of 'preparedness' strategies for disastrous floods and provide hydrological services, civil protection authorities and the public with information on upcoming events. Provided the warning lead time is sufficiently long, adequate preparatory actions can be taken to efficiently reduce the impacts of the flooding. Because of the specific characteristics of each catchment, varying data availability and end-user demands, the design of the best flood forecasting system may differ from catchment to catchment. However, despite the differences in concept and data needs, there is one underlying issue that spans all systems. There has been a growing awareness and acceptance that uncertainty is a fundamental issue of flood forecasting and needs to be dealt with at the different spatial and temporal scales as well as at the different stages of the flood generating processes. Today, operational flood forecasting centres are increasingly changing from single deterministic forecasts to probabilistic forecasts with various representations of the different contributions of uncertainty. The move towards these so-called Hydrological Ensemble Prediction Systems (HEPS) in flood forecasting represents the state of the art in forecasting science, following on the success of the use of ensembles for weather forecasting (Buizza et al., 2005) and paralleling the move towards ensemble forecasting in other related disciplines such as climate change predictions. The use of HEPS has been internationally fostered by initiatives such as "The Hydrologic Ensemble Prediction Experiment" (HEPEX), created with the aim of investigating how best to produce, communicate and use hydrologic ensemble forecasts in short-, medium- and long-term prediction of hydrological processes. The advantages of quantifying the different contributions of uncertainty as well as the overall uncertainty to obtain reliable and useful flood forecasts also for extreme events
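The most basic probabilistic product of a HEPS can be reduced to counting ensemble members above a critical threshold. A minimal sketch, with invented discharge values and alert level:

```python
def exceedance_probability(ensemble, threshold):
    """Probability of exceeding a flood threshold, estimated as the
    fraction of ensemble members that forecast a value above it."""
    return sum(member > threshold for member in ensemble) / len(ensemble)

# Five hypothetical discharge forecasts (m^3/s) against a 200 m^3/s alert level.
p_flood = exceedance_probability([100.0, 250.0, 300.0, 180.0, 220.0], 200.0)
```

A single deterministic forecast can only say "above" or "below" the alert level; the ensemble yields a graded probability on which warning decisions can be based.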
Flood marks of the 1813 flood in the Central Europe
Miklanek, Pavol; Pekárová, Pavla; Halmová, Dana; Pramuk, Branislav; Bačová Mitková, Veronika
2014-05-01
In August 2013, 200 years had passed since the greatest and most destructive floods known in the Slovak river basins. The flood affected almost the entire territory of Slovakia, northeastern Moravia and the south of Poland. The river basins of the Váh (Orava, Kysuca), Poprad, Nitra, Hron, Torysa and Hornád, and the upper and middle Vistula and Odra, were most affected. The aim of this paper is to map the flood marks documenting this catastrophic flood in Slovakia. Flood marks and registrations of the 1813 flood in the Váh river basin are characterized by great diversity and are written in the Bernolák modification of Slovak, in Latin, German and Hungarian. Their descriptions are stored in municipal chronicles and in Slovak and Hungarian state archives. The flood in 1813 devastated the entire Váh valley, as well as its tributaries. The following flood marks were known in the Váh river basin: one in the village of Dolná Lehota in the Orava river basin, a historical map from 1817 covering the village of Sučany and showing three different cross-sections of the Váh river during the 1813 flood, a flood mark in the city of Trenčín, a flood mark in the gate of the Brunovce mansion, a cross preserved on the old linden tree at Drahovce, and some records in written documents, e.g. in the village of Cifer. The second part of the study deals with flood marks mapping in the Hron, Hnilec and Poprad river basins, and in the Vistula river basin in Krakow. On the basis of literary documents and actual measurements, we summarize the peak flow rates reached during the 1813 floods at the profile Hron: Banská Bystrica. According to the recent situation, the 1813 flood peak was approximately 1.22 m higher than the flood of 1974. In the Poprad basin, too, the August 1813 flood is referred to as the most devastating flood of the last 400 years. The position of its flood mark is known, but the building was unfortunately removed later. The water level in 1813 was much higher than the water level during the recent flood in June 2010. In Cracow the water level
A common fixed point for operators in probabilistic normed spaces
Energy Technology Data Exchange (ETDEWEB)
Ghaemi, M.B. [Faculty of Mathematics, Iran University of Science and Technology, Narmak, Tehran (Iran, Islamic Republic of)], E-mail: mghaemi@iust.ac.ir; Lafuerza-Guillen, Bernardo [Department of Applied Mathematics, University of Almeria, Almeria (Spain)], E-mail: blafuerz@ual.es; Razani, A. [Department of Mathematics, Faculty of Science, I. Kh. International University, P.O. Box 34194-288, Qazvin (Iran, Islamic Republic of)], E-mail: razani@ikiu.ac.ir
2009-05-15
Probabilistic metric spaces were introduced by Karl Menger. Alsina, Schweizer and Sklar gave a general definition of a probabilistic normed space based on the definition of Menger [Alsina C, Schweizer B, Sklar A. On the definition of a probabilistic normed space. Aequationes Math 1993;46:91-8]. Here, we consider the equicontinuity of a class of linear operators in probabilistic normed spaces and, finally, a common fixed point theorem is proved. An application to quantum mechanics is considered.
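For readers unfamiliar with Menger's construction, the probabilistic analogue of the triangle inequality, with $T$ a triangular norm and $F_{p,q}$ the distance distribution function of the pair $(p,q)$, is commonly stated as (a textbook formulation, not quoted from the paper):

$$F_{p,r}(s+t) \ge T\big(F_{p,q}(s),\, F_{q,r}(t)\big),$$

and the corresponding condition on the probabilistic norm $\nu$ of a probabilistic normed space reads $\nu_{x+y}(s+t) \ge T\big(\nu_x(s),\, \nu_y(t)\big)$. Here $F_{p,q}(t)$ is interpreted as the probability that the distance between $p$ and $q$ is less than $t$.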
Citizen involvement in flood risk governance: flood groups and networks
Directory of Open Access Journals (Sweden)
Twigger-Ross Clare
2016-01-01
Full Text Available Over the past decade there has been a policy shift within UK flood risk management towards localism, with an emphasis on communities taking ownership of flood risk. There is also an increased focus on resilience and, more specifically, on community resilience to flooding. This paper draws on research carried out for the UK Department for Environment, Food and Rural Affairs to evaluate the Flood Resilience Community Pathfinder (FRCP) scheme in England. Resilience is conceptualised as multidimensional and linked to existing capacities within a community. Creating resilience to flooding is an ongoing process of adaptation, learning from past events and preparing for future risks. This paper focuses on the development of formal and informal institutions to support improved flood risk management: institutional resilience capacity. It includes new institutions, e.g. flood groups, as well as activities that help to build inter- and intra-institutional resilience capacity, e.g. community flood planning. The pathfinder scheme consisted of 13 projects across England led by local authorities, aimed at developing community resilience to flood risk between 2013 and 2015. This paper discusses the nature and structure of flood groups, the process of their development, and the extent of their linkages with formal institutions, drawing out the barriers and facilitators to developing institutional resilience at the local level.
Llewellyn, Mark
2006-06-01
Floods and tsunamis cause few severe injuries, but those injuries can overwhelm local areas, depending on the magnitude of the disaster. Most injuries are extremity fractures, lacerations, and sprains. Because of the mechanism of soft tissue and bone injuries, infection is a significant risk. Aspiration pneumonias are also associated with tsunamis. Appropriate precautionary interventions prevent communicable disease outbreaks. Psychosocial health issues must be considered.
Probabilistic programming: a true verification challenge
Katoen, Joost-Pieter; Finkbeiner, Bernd; Pu, Geguang; Zhang, Lijun
2015-01-01
Probabilistic programs [6] are sequential programs, written in languages like C, Java, Scala, or ML, with two added constructs: (1) the ability to draw values at random from probability distributions, and (2) the ability to condition values of variables in a program through observations. For a compr
Probabilistic methods for service life predictions
Siemes, A.J.M.
1999-01-01
Nowadays it is commonly accepted that the safety of structures should be expressed in terms of reliability, that is, as the probability of failure. In the literature [1, 2, 3, and 4] the bases have been given for the calculation of the failure probability. Making probabilistic calculations can be don
Probabilistic Meteorological Characterization for Turbine Loads
DEFF Research Database (Denmark)
Kelly, Mark C.; Larsen, Gunner Chr.; Dimitrov, Nikolay Krasimirov
2014-01-01
Beyond the existing, limited IEC prescription to describe fatigue loads on wind turbines, we look towards probabilistic characterization of the loads via analogous characterization of the atmospheric flow, particularly for today's "taller" turbines with rotors well above the atmospheric surface layer...... These are used as input to loads calculations, and with a statistical description of the loads output, they allow for improved design and loads calculations....
On Probabilistic Automata in Continuous Time
DEFF Research Database (Denmark)
Eisentraut, Christian; Hermanns, Holger; Zhang, Lijun
2010-01-01
their compositionality properties. Weak bisimulation is partly oblivious to the probabilistic branching structure, in order to reflect some natural equalities in this spectrum of models. As a result, the standard way to associate a stochastic process to a generalised stochastic Petri net can be proven sound with respect...
Pigeons' Discounting of Probabilistic and Delayed Reinforcers
Green, Leonard; Myerson, Joel; Calvert, Amanda L.
2010-01-01
Pigeons' discounting of probabilistic and delayed food reinforcers was studied using adjusting-amount procedures. In the probability discounting conditions, pigeons chose between an adjusting number of food pellets contingent on a single key peck and a larger, fixed number of pellets contingent on completion of a variable-ratio schedule. In the…
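The two discounting forms studied in this literature are commonly modelled hyperbolically, with probability discounting expressed over the odds against reinforcement. A sketch of those standard equations; parameter values are illustrative, not fitted to the pigeon data:

```python
def delay_discounted_value(amount, delay, k):
    """Hyperbolic delay discounting: V = A / (1 + k * D)."""
    return amount / (1 + k * delay)

def probability_discounted_value(amount, p, h):
    """Probability discounting over the odds against reinforcement,
    theta = (1 - p) / p, giving V = A / (1 + h * theta)."""
    theta = (1 - p) / p
    return amount / (1 + h * theta)

v_delay = delay_discounted_value(100.0, delay=10.0, k=0.1)  # value halved at D = 1/k
v_prob = probability_discounted_value(100.0, p=0.5, h=1.0)  # even odds, h = 1
```

Adjusting-amount procedures estimate the indifference points to which these curves are then fitted, with `k` and `h` as the free discounting-rate parameters.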
Ignorability in Statistical and Probabilistic Inference
DEFF Research Database (Denmark)
Jaeger, Manfred
2005-01-01
When dealing with incomplete data in statistical learning, or incomplete observations in probabilistic inference, one needs to distinguish the fact that a certain event is observed from the fact that the observed event has happened. Since the modeling and computational complexities entailed...
Enhancing Automated Test Selection in Probabilistic Networks
Sent, D.; van der Gaag, L.C.; Bellazzi, R; Abu-Hanna, A; Hunter, J
2007-01-01
Most test-selection algorithms currently in use with probabilistic networks select variables myopically, that is, test variables are selected sequentially, on a one-by-one basis, based upon expected information gain. While myopic test selection is not realistic for many medical applications, non-myo
Relevance feedback in probabilistic multimedia retrieval
Boldareva, L.; Hiemstra, D.; Jonker, W.
2003-01-01
In this paper we explore a new view on data organisation and retrieval in a (multimedia) collection. We use a probabilistic framework for indexing and interactive retrieval of the data, which enables us to fill the semantic gap. Semi-automated experiments with the TREC-2002 video collection showed that our ap
Sampling Techniques for Probabilistic Roadmap Planners
Geraerts, R.J.; Overmars, M.H.
2004-01-01
The probabilistic roadmap approach is a commonly used motion planning technique. A crucial ingredient of the approach is a sampling algorithm that samples the configuration space of the moving object for free configurations. Over the past decade many sampling techniques have been proposed. It is
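The core sampling step of a probabilistic roadmap planner, uniform rejection sampling of collision-free configurations, can be sketched as follows. The 2D workspace and the disc obstacle are assumptions for illustration; real planners also connect the sampled nodes into a roadmap:

```python
import random

def sample_free_configurations(n, is_free, bounds=((0.0, 1.0), (0.0, 1.0)), seed=0):
    """Draw uniform random configurations within the given bounds and
    keep those the collision checker `is_free` reports as free."""
    rng = random.Random(seed)
    nodes = []
    while len(nodes) < n:
        q = tuple(rng.uniform(lo, hi) for lo, hi in bounds)
        if is_free(q):
            nodes.append(q)
    return nodes

# Hypothetical obstacle: a disc of radius 0.2 centred at (0.5, 0.5).
def outside_disc(q):
    return (q[0] - 0.5) ** 2 + (q[1] - 0.5) ** 2 > 0.2 ** 2

roadmap_nodes = sample_free_configurations(50, outside_disc)
```

The sampling strategies surveyed in such work replace the uniform draw above with biased alternatives (e.g. near obstacles or in narrow passages) while keeping this reject-if-colliding skeleton.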
Probabilistic Damage Stability Calculations for Ships
DEFF Research Database (Denmark)
Jensen, Jørgen Juncher
1996-01-01
The aim of these notes is to provide background material for the present probabilistic damage stability rules for dry cargo ships. The formulas for the damage statistics are derived, and shortcomings as well as possible improvements are discussed. The advantage of the definition of fictitious...
Strong Ideal Convergence in Probabilistic Metric Spaces
Indian Academy of Sciences (India)
Celaleddin Şençimen; Serpil Pehlivan
2009-06-01
In the present paper we introduce the concepts of strongly ideal convergent sequence and strong ideal Cauchy sequence in a probabilistic metric (PM) space endowed with the strong topology, and establish some basic facts. Next, we define the strong ideal limit points and the strong ideal cluster points of a sequence in this space and investigate some properties of these concepts.
Financial Markets Analysis by Probabilistic Fuzzy Modelling
J.H. van den Berg (Jan); W.-M. van den Bergh (Willem-Max); U. Kaymak (Uzay)
2003-01-01
For successful trading in financial markets, it is important to develop financial models where one can identify different states of the market for modifying one's actions. In this paper, we propose to use probabilistic fuzzy systems for this purpose. We concentrate on Takagi–Sugeno (
Probabilistic decision graphs for optimization under uncertainty
DEFF Research Database (Denmark)
Jensen, Finn V.; Nielsen, Thomas Dyhre
2011-01-01
This paper provides a survey on probabilistic decision graphs for modeling and solving decision problems under uncertainty. We give an introduction to influence diagrams, which is a popular framework for representing and solving sequential decision problems with a single decision maker. As the me...
Probabilistic safety goals. Phase 3 - Status report
Energy Technology Data Exchange (ETDEWEB)
Holmberg, J.-E. (VTT (Finland)); Knochenhauer, M. (Relcon Scandpower AB, Sundbyberg (Sweden))
2009-07-15
The first phase of the project (2006) described the status, concepts and history of probabilistic safety goals for nuclear power plants. The second and third phases (2007-2008) have provided guidance related to the resolution of some of the problems identified, and resulted in a common understanding regarding the definition of safety goals. The basic aim of phase 3 (2009) has been to increase the scope and level of detail of the project, and to start preparations of a guidance document. Based on the conclusions from the previous project phases, the following issues have been covered: 1) Extension of international overview. Analysis of results from the questionnaire performed within the ongoing OECD/NEA WGRISK activity on probabilistic safety criteria, including participation in the preparation of the working report for OECD/NEA/WGRISK (to be finalised in phase 4). 2) Use of subsidiary criteria and relations between these (to be finalised in phase 4). 3) Numerical criteria when using probabilistic analyses in support of deterministic safety analysis (to be finalised in phase 4). 4) Guidance for the formulation, application and interpretation of probabilistic safety criteria (to be finalised in phase 4). (LN)
Probabilistic Resource Analysis by Program Transformation
DEFF Research Database (Denmark)
Kirkeby, Maja Hanne; Rosendahl, Mads
2016-01-01
The aim of a probabilistic resource analysis is to derive a probability distribution of possible resource usage for a program from a probability distribution of its input. We present an automated multi-phase rewriting based method to analyze programs written in a subset of C. It generates...
A Probabilistic Framework for Curve Evolution
DEFF Research Database (Denmark)
Dahl, Vedrana Andersen
2017-01-01
approach include the ability to handle textured images, simple generalization to multiple regions, and efficiency in computation. We test our probabilistic framework in combination with parametric (snakes) and geometric (level-set) curves. The experimental results on composed and natural images demonstrate...
A Comparative Study of Probabilistic Roadmap Planners
Geraerts, R.J.; Overmars, M.H.
2004-01-01
The probabilistic roadmap approach is one of the leading motion planning techniques. Over the past eight years the technique has been studied by many different researchers. This has led to a large number of variants of the approach, each with its own merits. It is difficult to compare the different
Dialectical Multivalued Logic and Probabilistic Theory
Directory of Open Access Journals (Sweden)
José Luis Usó Doménech
2017-02-01
Full Text Available There are two probabilistic algebras: one for classical probability and the other for quantum mechanics. Naturally, it is the relation to the object that decides, as in the case of logic, which algebra is to be used. From a paraconsistent multivalued logic therefore, one can derive a probability theory, adding the correspondence between truth value and fortuity.
Mastering probabilistic graphical models using Python
Ankan, Ankur
2015-01-01
If you are a researcher or a machine learning enthusiast, or are working in the data science field and have a basic idea of Bayesian learning or probabilistic graphical models, this book will help you to understand the details of graphical models and use them in your data science problems.
Balkanization and Unification of Probabilistic Inferences
Yu, Chong-Ho
2005-01-01
Many research-related classes in social sciences present probability as a unified approach based upon mathematical axioms, but neglect the diversity of various probability theories and their associated philosophical assumptions. Although currently the dominant statistical and probabilistic approach is the Fisherian tradition, the use of Fisherian…
Fontanazza, C M; Freni, G; Notaro, V
2012-01-01
Flood damage in urbanized watersheds may be assessed by combining flood depth-damage curves with the outputs of urban flood models. The complexity of the physical processes that must be simulated and the limited amount of data available for model calibration may lead to high uncertainty in the model results and, consequently, in the damage estimation. Moreover, depth-damage functions are usually affected by significant uncertainty related to the collected data and to the simplified structure of the regression law that is used. The present paper analyses the uncertainty connected to flood damage estimates obtained by combining hydraulic models and depth-damage curves. A Bayesian inference analysis was proposed, along with a probabilistic approach to parameter estimation. The analysis demonstrated that the Bayesian approach is very effective considering that the available databases are usually short.
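A minimal sketch of the kind of Bayesian parameter estimation described above, assuming a hypothetical power-law depth-damage curve damage = a·depth^b fitted with a Metropolis sampler on synthetic data (the paper's actual model, priors and databases differ):

```python
import math, random

# Synthetic observations of a hypothetical depth-damage relation:
# damage = a * depth^b + noise, with a=0.2, b=1.5 assumed for illustration.
random.seed(42)
depths = [0.5, 1.0, 1.5, 2.0, 2.5, 3.0]
obs = [0.2 * d**1.5 + random.gauss(0, 0.05) for d in depths]

def log_post(a, b, sigma=0.05):
    """Log-posterior: flat priors on a, b > 0, Gaussian likelihood."""
    if a <= 0 or b <= 0:
        return -math.inf
    return -sum((y - a * d**b) ** 2 for d, y in zip(depths, obs)) / (2 * sigma**2)

# Metropolis random-walk sampler for the posterior of (a, b)
a, b = 0.5, 1.0
lp = log_post(a, b)
samples = []
for _ in range(20000):
    a_new, b_new = a + random.gauss(0, 0.02), b + random.gauss(0, 0.02)
    lp_new = log_post(a_new, b_new)
    if math.log(random.random()) < lp_new - lp:  # accept/reject step
        a, b, lp = a_new, b_new, lp_new
    samples.append((a, b))

# Posterior means after burn-in: a full posterior, not a point estimate
tail = samples[5000:]
a_mean = sum(s[0] for s in tail) / len(tail)
b_mean = sum(s[1] for s in tail) / len(tail)
```

The posterior spread (not shown) is what carries the uncertainty of the short database into the damage estimate.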
Allowances for evolving coastal flood risk under uncertain local sea-level rise
Buchanan, M. K.; Kopp, R. E.; Oppenheimer, M.; Tebaldi, C.
2015-12-01
Sea-level rise (SLR) causes estimates of flood risk made under the assumption of stationary mean sea level to be biased low. However, adjustments to flood return levels made assuming fixed increases of sea level are also inaccurate when applied to sea level that is rising over time at an uncertain rate. To accommodate both the temporal dynamics of SLR and their uncertainty, we develop an Average Annual Design Life Level (AADLL) metric and associated SLR allowances [1,2]. The AADLL is the flood level corresponding to a time-integrated annual expected probability of occurrence (AEP) under uncertainty over the lifetime of an asset; AADLL allowances are the adjustment from 2000 levels that maintain current risk. Given non-stationary and uncertain SLR, AADLL flood levels and allowances provide estimates of flood protection heights and offsets for different planning horizons and different levels of confidence in SLR projections in coastal areas. Allowances are a function primarily of local SLR and are nearly independent of AEP. Here we employ probabilistic SLR projections [3] to illustrate the calculation of AADLL flood levels and allowances with a representative set of long-duration tide gauges along U.S. coastlines. [1] Rootzen et al., 2014, Water Resources Research 49: 5964-5972. [2] Hunter, 2013, Ocean Engineering 71: 17-27. [3] Kopp et al., 2014, Earth's Future 2: 383-406.
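The AADLL construction can be illustrated with a toy calculation: a Gumbel flood-level distribution with assumed parameters, a three-scenario stand-in for a probabilistic SLR projection, and a bisection search for the level whose lifetime-averaged AEP matches the current 1% level. All numbers below are illustrative, not from the cited projections:

```python
import math

def aep(height, loc=0.0, scale=0.15):
    """Annual exceedance probability from a Gumbel fit (assumed parameters)."""
    return 1.0 - math.exp(-math.exp(-(height - loc) / scale))

# Hypothetical probabilistic SLR projection: three weighted scenarios, each a
# linear rise in local mean sea level (m/yr) over the asset lifetime.
slr_rates = [(0.003, 0.3), (0.006, 0.4), (0.010, 0.3)]  # (rate, probability)

def lifetime_avg_aep(height, years=50):
    """AEP integrated over the design life and averaged over SLR uncertainty."""
    total = 0.0
    for rate, w in slr_rates:
        total += w * sum(aep(height - rate * t) for t in range(years)) / years
    return total

target = aep(0.691)  # flood level with ~1% AEP at year-2000 mean sea level
# Bisection for the AADLL flood level keeping lifetime-average AEP at target
lo, hi = 0.0, 3.0
for _ in range(60):
    mid = 0.5 * (lo + hi)
    if lifetime_avg_aep(mid) > target:
        lo = mid
    else:
        hi = mid
allowance = 0.5 * (lo + hi) - 0.691  # offset above the year-2000 design level
```

Consistent with the abstract, the resulting allowance is driven by local SLR and changes little if `target` is set to a different AEP.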
GloFAS – global ensemble streamflow forecasting and flood early warning
Directory of Open Access Journals (Sweden)
L. Alfieri
2013-03-01
Full Text Available Anticipation and preparedness for large-scale flood events have a key role in mitigating their impact and optimizing the strategic planning of water resources. Although several developed countries have well-established systems for river monitoring and flood early warning, figures of populations affected every year by floods in developing countries are unsettling. This paper presents the Global Flood Awareness System (GloFAS, which has been set up to provide an overview on upcoming floods in large world river basins. GloFAS is based on distributed hydrological simulation of numerical ensemble weather predictions with global coverage. Streamflow forecasts are compared statistically to climatological simulations to detect probabilistic exceedance of warning thresholds. In this article, the system setup is described, together with an evaluation of its performance over a two-year test period and a qualitative analysis of a case study for the Pakistan flood, in summer 2010. It is shown that hazardous events in large river basins can be skilfully detected with a forecast horizon of up to 1 month. In addition, results suggest that an accurate simulation of initial model conditions and an improved parameterization of the hydrological model are key components to reproduce accurately the streamflow variability in the many different runoff regimes of the earth.
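The threshold-exceedance step described above — comparing ensemble streamflow forecasts against climatological warning thresholds — reduces to a simple fraction; the member values and threshold below are invented for illustration, not GloFAS output:

```python
# Ensemble streamflow forecast (m^3/s) for one river point and lead time
ensemble = [820, 950, 1010, 1100, 1230, 760, 990, 1340, 880, 1050]

# Warning threshold derived from climatological simulations, e.g. the flow
# exceeded on average once every 5 years (assumed value).
threshold_5yr = 1000.0

def exceedance_probability(members, threshold):
    """Probabilistic exceedance: fraction of members above the threshold."""
    return sum(m > threshold for m in members) / len(members)

p = exceedance_probability(ensemble, threshold_5yr)
# p = 0.5 here: 5 of the 10 members exceed the 5-year threshold
```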
Atmospheric Rivers, Floods, and Flash Floods in California
Skelly, Klint T.
Atmospheric Rivers (ARs) are long (>2000 km), narrow (<1000 km) corridors of enhanced vertically integrated water vapor (IWV) and enhanced IWV transport (IVT). The landfall of ARs along the U.S. West Coast has been linked to extreme precipitation and flooding/flash flooding in regions of complex topography. The objective of this study is to investigate the relationship between a 10-water-year (2005-2014) climatology of floods and flash floods and landfalling ARs. The ARs in this study are defined using IVT following the Rutz et al. (2013) methodology, whereas floods and flash floods are identified from the National Centers for Environmental Information (NCEI) Storm Events Database. The results indicate that landfalling ARs are present on a majority of days with floods in northern California. Landfalling ARs are also present on a majority of days with flash flood reports during the cold season (November-March); the North American monsoon, however, is present on days with flash flood reports during the warm season (April-October). Two case studies illustrate the hydrologic impact of landfalling ARs. The first is a flood event that occurred in association with three landfalling ARs that produced 800 mm of precipitation over the Russian River watershed in northern California; the second is a flash flood event that occurred in association with a landfalling AR that produced ~225 mm of precipitation over the Santa Ynez watershed, producing a flash flood over the southern portions of Santa Barbara County in southern California.
Action-based flood forecasting for triggering humanitarian action
Coughlan de Perez, Erin; van den Hurk, Bart; van Aalst, Maarten K.; Amuron, Irene; Bamanya, Deus; Hauser, Tristan; Jongma, Brenden; Lopez, Ana; Mason, Simon; Mendler de Suarez, Janot; Pappenberger, Florian; Rueth, Alexandra; Stephens, Elisabeth; Suarez, Pablo; Wagemaker, Jurjen; Zsoter, Ervin
2016-09-01
Too often, credible scientific early warning information of increased disaster risk does not result in humanitarian action. With financial resources tilted heavily towards response after a disaster, disaster managers have limited incentive and ability to process complex scientific data, including uncertainties. These incentives are beginning to change, with the advent of several new forecast-based financing systems that provide funding based on a forecast of an extreme event. Given the changing landscape, here we demonstrate a method to select and use appropriate forecasts for specific humanitarian disaster prevention actions, even in a data-scarce location. This action-based forecasting methodology takes into account the parameters of each action, such as action lifetime, when verifying a forecast. Forecasts are linked with action based on an understanding of (1) the magnitude of previous flooding events and (2) the willingness to act "in vain" for specific actions. This is applied in the context of the Uganda Red Cross Society forecast-based financing pilot project, with forecasts from the Global Flood Awareness System (GloFAS). Using this method, we define the "danger level" of flooding, and we select the probabilistic forecast triggers that are appropriate for specific actions. Results from this methodology can be applied globally across hazards and fed into a financing system that ensures that automatic, pre-funded early action will be triggered by forecasts.
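The trigger-selection logic — picking the lowest forecast probability whose false-alarm ratio stays within the stated willingness to act "in vain" for a given action — can be sketched as follows, with an invented verification record standing in for the GloFAS hindcasts:

```python
# Hypothetical verification record: (forecast probability, flood occurred)
history = [(0.9, True), (0.8, True), (0.7, False), (0.7, True),
           (0.6, False), (0.5, True), (0.5, False), (0.4, False),
           (0.3, False), (0.2, False)]

def false_alarm_ratio(trigger):
    """Fraction of triggered actions that would have been 'in vain'."""
    acted = [(p, obs) for p, obs in history if p >= trigger]
    if not acted:
        return 0.0
    return sum(1 for _, obs in acted if not obs) / len(acted)

def select_trigger(max_far):
    """Lowest probability trigger whose false-alarm ratio stays within the
    stated willingness to act 'in vain' for this specific action."""
    for trigger in sorted({p for p, _ in history}):
        if false_alarm_ratio(trigger) <= max_far:
            return trigger
    return None

t = select_trigger(max_far=0.34)  # -> 0.7 for this record
```

In the methodology described above, `max_far` would differ per action (an action with a long lifetime and low cost tolerates more acting in vain).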
Medium Range Ensembles Flood Forecasts for Community Level Applications
Fakhruddin, S.; Kawasaki, A.; Babel, M. S.; AIT
2013-05-01
Early warning is a key element of disaster risk reduction. In recent decades there has been major advancement in medium-range and seasonal forecasting. This provides a great opportunity to improve early warning systems and advisories for early action in strategic and long-term planning, placing increasing emphasis on proactive rather than reactive management of the adverse consequences of flood events. It can also be very helpful for the agricultural sector by offering farmers a diversity of options (e.g. changing cropping pattern, planting timing, etc.). An experimental medium-range (1-10 days) flood forecasting model has been developed for Bangladesh which provides a 51-member ensemble of discharge forecasts at lead times of one to ten days with significant persistence and high certainty. This can help communities (e.g. farmers) with gain/loss estimation as well as crop savings. This paper describes the application of ensemble probabilistic flood forecasts at the community level for differential decision making focused on agriculture. The framework allows users to interactively specify the objectives and criteria that are germane to a particular situation, and to obtain the management options that are possible and the exogenous influences that should be taken into account before planning and decision making. A risk and vulnerability assessment was conducted through community consultation. The forecast lead-time requirements, users' needs, and the impacts and management options for the crop, livestock and fisheries sectors were identified through focus group discussions, informal interviews and a questionnaire survey.
Challenges of Modeling Flood Risk at Large Scales
Guin, J.; Simic, M.; Rowe, J.
2009-04-01
algorithm propagates the flows for each simulated event. The model incorporates a digital terrain model (DTM) at 10 m horizontal resolution, which is used to extract flood-plain cross-sections so that a one-dimensional hydraulic model can be used to estimate the extent and elevation of flooding. In doing so, the effect of flood defenses in mitigating floods is accounted for. Finally, a suite of vulnerability relationships has been developed to estimate flood losses for a portfolio of properties exposed to flood hazard. Historical experience indicates that for recent floods in Great Britain more than 50% of insurance claims occur outside the flood plain, primarily as a result of excess surface flow, hillside flooding, and flooding due to inadequate drainage. A sub-component of the model addresses this issue by considering several parameters that best explain the variability of claims off the flood plain. The challenges of modeling such a complex phenomenon at a large scale largely dictate the choice of modeling approaches that need to be adopted for each of these model components. While detailed numerically-based physical models exist and have been used for conducting flood hazard studies, they are generally restricted to small geographic regions. In a probabilistic risk estimation framework like our current model, a blend of deterministic and statistical techniques has to be employed such that each model component is independent, physically sound, and able to maintain the statistical properties of observed historical data. This is particularly important because of the highly non-linear behavior of the flooding process. With respect to vulnerability modeling, both on and off the flood plain, the challenges include the appropriate scaling of a damage relationship when applied to a portfolio of properties. This arises from the fact that the estimated hazard parameter used for damage assessment, namely maximum flood depth, has considerable uncertainty.
Recent advances in flood forecasting and flood risk assessment
Directory of Open Access Journals (Sweden)
G. Arduino
2005-01-01
Full Text Available Recent large floods in Europe have led to increased interest in research and development of flood forecasting systems. Some of these events were provoked by some of the wettest rainfall periods on record, which has led to speculation that such extremes are attributable in some measure to anthropogenic global warming and represent the beginning of a period of higher flood frequency. Whilst current trends in extreme-event statistics will be difficult to discern conclusively, there has been a substantial increase in the frequency of high floods in the 20th century for basins greater than 2×10^5 km². There is also increasing evidence that anthropogenic forcing of climate change may lead to an increased probability of extreme precipitation and, hence, of flooding. There is, therefore, major emphasis on the improvement of operational flood forecasting systems in Europe, with significant European Community spending on research and development on prototype forecasting systems and flood risk management projects. This Special Issue synthesises the most relevant scientific and technological results presented at the International Conference on Flood Forecasting in Europe held in Rotterdam from 3-5 March 2003. During that meeting 150 scientists, forecasters and stakeholders from four continents assembled to present their work and current operational best practice and to discuss future directions of scientific and technological efforts in flood prediction and prevention. The papers presented at the conference fall into seven themes, as follows.
Apel, Heiko; Martínez Trepat, Oriol; Nghia Hung, Nguyen; Thi Chinh, Do; Merz, Bruno; Viet Dung, Nguyen
2016-04-01
coincidence into account. All hazards - fluvial, pluvial and combined - were accompanied by an uncertainty estimation taking into account the natural variability of the flood events. This resulted in probabilistic flood hazard maps showing the maximum inundation depths for a selected set of probabilities of occurrence, with the expectation (median) shown alongside percentile maps quantifying the uncertainty. The results are critically discussed and their usage in flood risk management is outlined.
Fault tree analysis for urban flooding
Ten Veldhuis, J.A.E.; Clemens, F.H.L.R.; Van Gelder, P.H.A.J.M.
2008-01-01
Traditional methods to evaluate flood risk mostly focus on storm events as the main cause of flooding. Fault tree analysis is a technique that is able to model all potential causes of flooding and to quantify both the overall probability of flooding and the contributions of all causes of flooding to
New challenges on uncertainty propagation assessment of flood risk analysis
Martins, Luciano; Aroca-Jiménez, Estefanía; Bodoque, José M.; Díez-Herrero, Andrés
2016-04-01
Natural hazards, such as floods, cause considerable damage to human life and to material and functional assets every year around the world. Risk assessment procedures carry a set of uncertainties, mainly of two types: natural, derived from the stochastic character inherent in flood process dynamics; and epistemic, associated with a lack of knowledge or with inadequate procedures employed in the study of these processes. There is abundant scientific and technical literature on uncertainty estimation in each step of flood risk analysis (e.g. rainfall estimates, hydraulic modelling variables), but very little experience on the propagation of uncertainties along the full flood risk assessment. Epistemic uncertainties are therefore the main goal of this work: in particular, understanding the extent to which uncertainties propagate throughout the process, from inundation studies to risk analysis, and how far they alter a proper analysis of flood risk. Methodologies such as Polynomial Chaos Theory (PCT), the Method of Moments or Monte Carlo simulation are used to evaluate different sources of error, such as data records (precipitation gauges, flow gauges, ...), hydrologic and hydraulic modelling (inundation estimation) and socio-demographic data (damage estimation), in order to evaluate the uncertainty propagation (UP) in design flood risk estimation, in both numerical and cartographic expression. In order to account for the total uncertainty and to understand which factors contribute most to the final uncertainty, we used Polynomial Chaos Theory (PCT). It represents an interesting way to handle the inclusion of uncertainty in the modelling and simulation process. PCT allows for the development of a probabilistic model of the system in a deterministic setting, using random variables and polynomials to handle the effects of uncertainty. Application of the method yields results more robust than traditional analyses
Flood Risk and Flood hazard maps - Visualisation of hydrological risks
Energy Technology Data Exchange (ETDEWEB)
Spachinger, Karl; Dorner, Wolfgang; Metzka, Rudolf [University of Applied Sciences Deggendorf (Germany); Serrhini, Kamal [Universite de Technologie de Compiegne, Genie des Systemes Urbains, France, and Universite Francois Rabelais, Unite Mixte de Recherche, Tours (France); Fuchs, Sven [Institute of Mountain Risk Engineering, University of Natural Resources and Applied Life Sciences, Vienna (Austria)], E-mail: karl.spachinger@fhd.edu
2008-11-01
Hydrological models are an important basis of flood forecasting and early warning systems. They provide significant data on hydrological risks. In combination with other modelling techniques, such as hydrodynamic models, they can be used to assess the extent and impact of hydrological events. The new European Flood Directive forces all member states to evaluate flood risk on a catchment scale, to compile maps of flood hazard and flood risk for prone areas, and to inform on a local level about these risks. Flood hazard and flood risk maps are important tools to communicate flood risk to different target groups. They provide compiled information to relevant public bodies such as water management authorities, municipalities, or civil protection agencies, but also to the broader public. For almost every section of a river basin, run-off and water levels can be defined based on the likelihood of annual recurrence, using a combination of hydrological and hydrodynamic models, supplemented by an analysis of historical records and mappings. In combination with data related to the vulnerability of a region, risk maps can be derived. The project RISKCATCH addressed these issues of hydrological risk and vulnerability assessment, focusing on the flood risk management process. Flood hazard maps and flood risk maps were compiled for Austrian and German test sites taking into account existing national and international guidelines. These maps were evaluated by eye-tracking using experimental graphic semiology. Sets of small-scale as well as large-scale risk maps were presented to test persons in order to (1) study reading behaviour as well as understanding and (2) deduce the most attractive components that are essential for target-oriented risk communication. A cognitive survey asking for negative and positive aspects and the complexity of each single map complemented the experimental graphic semiology. The results indicate how risk maps can be improved to fit the needs of different user
Flood resilience of urban territories.
Beraud, Hélène; Barroca, Bruno; Hubert, Gilles
2010-05-01
The impact of floods on French territory over the last twenty years reveals our lack of preparation for large-scale floods, which might halt companies' activity and services, or leave housing unavailable for several months. New Orleans' case should serve as an example: four years after the disaster, the city still has not regained its dynamism. In France, more than 300 towns are exposed to floods. While these towns are the mainspring of the territory's development, it is likely that the majority of them could not recover quickly after a large-scale flood. Understanding and improving the flood resilience of urban territories is therefore a real stake for territorial development. Urban technical networks supply, unify and irrigate all the constituents of urban territories. Characterising their flood resilience can help in better understanding urban resilience. In this context, waste management during and after floods is crucial. During a flood, the waste management network can become dysfunctional (roads cut, waste storage or treatment installations flooded). How can the mayor meet his obligation to guarantee health and safety in his city? After the flood, the question is even more problematic. The waste management network presents a real stake for restarting the territory. After a flood, building materials, lopped-off branches, furniture, business stocks, farm stocks, mud, rubble and animal carcasses are wet, mixed, and even polluted by hydrocarbons or toxic substances. The volume of waste can be significant, and the sanitary and environmental risks crucial. In view of this situation, waste management in the post-crisis period raises a real problem. What is to be done with this waste? How to collect it? Where to store it? How to process it? Who is responsible? Answering these questions is all the more strategic since this waste is the mark of the disaster. Thus, cleaning will be the first reflex of the population and local actors in order to forget the
Capturing changes in flood risk with Bayesian approaches for flood damage assessment
Vogel, Kristin; Schröter, Kai; Kreibich, Heidi; Thieken, Annegret; Müller, Meike; Sieg, Tobias; Laudan, Jonas; Kienzler, Sarah; Weise, Laura; Merz, Bruno; Scherbaum, Frank
2016-04-01
Flood risk is a function of hazard as well as of exposure and vulnerability. All three components change over space and time and have to be considered for reliable damage estimations and risk analyses, since these are the basis for efficient, adaptable risk management. Hitherto, models for estimating flood damage are comparatively simple and cannot sufficiently account for changing conditions. The Bayesian network approach allows for multivariate modeling of complex systems without relying on expert knowledge about physical constraints. In a Bayesian network each model component is considered to be a random variable. The interactions between those variables can be learned from observations or defined by expert knowledge; even a combination of both is possible. Moreover, the probabilistic framework captures uncertainties related to the prediction and provides a probability distribution for the damage instead of a point estimate. The graphical representation of Bayesian networks helps to study the change of probabilities under changing circumstances and may thus simplify the communication between scientists and public authorities. In the framework of the DFG Research Training Group "NatRiskChange" we aim to develop Bayesian networks for flood damage and vulnerability assessments of residential buildings and companies under changing conditions. A Bayesian network learned from data collected over the last 15 years in flooded regions in the Elbe and Danube catchments (Germany) reveals the impact of many variables, such as building characteristics, precaution and warning situation, on flood damage to residential buildings. While the handling of incomplete and hybrid (discrete mixed with continuous) data is the most challenging issue in the study on residential buildings, a similar study, which focuses on the vulnerability of small to medium-sized companies, bears new challenges. Relying on a much smaller data set for the determination of the model
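The inference step a Bayesian network supports can be shown on a toy three-variable network (warning, precaution, damage) with hand-set conditional probability tables; these CPTs are illustrative assumptions, not the ones learned from the Elbe/Danube data:

```python
from itertools import product

# Toy Bayesian network for flood damage:
#   Warning (W) -> Precaution (P), and both W and P -> high Damage (D)
P_W = {True: 0.6, False: 0.4}                      # P(warning)
P_P = {True: {True: 0.7, False: 0.3},              # P(precaution | warning)
       False: {True: 0.2, False: 0.8}}
P_D = {(True, True): 0.1, (True, False): 0.4,      # P(high damage | W, P)
       (False, True): 0.3, (False, False): 0.7}

def joint(w, p, d):
    """Joint probability of one full assignment of the network."""
    pd = P_D[(w, p)]
    return P_W[w] * P_P[w][p] * (pd if d else 1.0 - pd)

def prob_damage_given_warning(w):
    """P(high damage | W = w) by enumerating the joint distribution."""
    num = sum(joint(w, p, True) for p in (True, False))
    den = sum(joint(w, p, d) for p, d in product((True, False), repeat=2))
    return num / den

# A warning (which raises the chance of precaution) lowers expected damage:
# P(D | warning) = 0.19 vs P(D | no warning) = 0.62
```

The output is a probability distribution rather than a point estimate, which is exactly the property the abstract highlights.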
Estimation of flood frequency by SCHADEX method - in Nysa Kłodzka catchment
Osuch, M.; Romanowicz, R. J.; Paquet, E.; Garavaglia, F.
2012-04-01
Estimation of design floods using Continuous Simulation (CS) has emerged as a very active research topic across academic institutions in Europe. CS is based on the use of rainfall-runoff models, of various complexity, for transforming precipitation data into river flow. By coupling a rainfall-runoff model with a stochastic rainfall model, Monte Carlo simulations can generate long series of synthetic rainfall that are transformed into river flow, from which flood frequency characteristics can be deduced. This approach is favoured by politicians and water managers, as it allows the influence of water management and climatic changes to be taken into account during the estimation of flood frequency curves. The other approach to FFA is based on the available historical maximum annual or seasonal flow data and consists of fitting theoretical cumulative distributions to observations. These theoretical, parameterised distributions are used in practical applications to derive flow quantiles with a desired probability of exceedance for the purpose of water management. The aim of this work is the application of a continuous simulation approach to flood frequency analysis (FFA), using the Nysa Kłodzka catchment as a case study. The applied method is SCHADEX, a probabilistic method for extreme flood estimation which combines a weather-pattern-based probabilistic rainfall model and a conceptual rainfall-runoff model within a stochastic event simulation framework. In that method, the distribution of areal precipitation is described by a compound probabilistic distribution based on weather-pattern sub-sampling (MEWP distribution). These patterns represent synoptic situations and allow for disaggregation of heavy rainfall data into homogeneous subsamples (Garavaglia et al. 2010 a and b). Extreme flood estimation is then achieved by stochastic simulation using the MORDOR rainfall-runoff model. The resulting FFA curve is compared to the outcome of a seasonal maxima approach (recommended
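The stochastic event simulation at the heart of such a CS approach can be caricatured in a few lines: a weather-pattern mixture standing in for the MEWP rainfall distribution, a crude runoff function standing in for MORDOR, and an empirical extreme quantile read from many synthetic events. All parameters below are invented for illustration:

```python
import random

random.seed(1)

# Weather-pattern mixture for daily event rainfall (mm): each pattern gets
# its own exponential tail, weighted by its frequency (values illustrative,
# in the spirit of a MEWP-type compound distribution, not calibrated).
patterns = [(0.7, 12.0), (0.25, 25.0), (0.05, 45.0)]  # (weight, mean rainfall)

def sample_rainfall():
    """Draw event rainfall from the weather-pattern mixture."""
    u, acc = random.random(), 0.0
    for weight, mean in patterns:
        acc += weight
        if u <= acc:
            return random.expovariate(1.0 / mean)
    return random.expovariate(1.0 / patterns[-1][1])

def runoff(rain, soil=0.5):
    """Crude stand-in for the rainfall-runoff step: the runoff coefficient
    grows with rainfall as the (fixed) soil store saturates."""
    coeff = min(0.9, soil + rain / 200.0)
    return coeff * rain

# Stochastic event simulation: many synthetic events -> peak-flow proxy,
# from which an extreme quantile is read off empirically.
peaks = sorted(runoff(sample_rainfall()) for _ in range(100_000))
q999 = peaks[int(0.999 * len(peaks))]  # ~1000-event return level
```

A real SCHADEX run crosses rainfall events with many observed hydrological states rather than a fixed soil store.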
Bolotnov, V. P.
2007-01-01
The concept of regional hydroecological monitoring has been developed for the flood-plain of the Middle Ob. Its objective is to monitor the state of flood-plain ecosystem productivity for the organization of scientific, regionally adapted and ecologically regulated nature management. For this purpose, hydroecological zoning of the flood-plain territory was performed, the most representative water-gauge observation stations for each flood-plain zone were selected, and a scheme of flood-plain flooding was prepared...
Temporal clustering of floods in Germany: Do flood-rich and flood-poor periods exist?
Merz, Bruno; Nguyen, Viet Dung; Vorogushyn, Sergiy
2016-10-01
The repeated occurrence of exceptional floods within a few years, such as the Rhine floods in 1993 and 1995 and the Elbe and Danube floods in 2002 and 2013, suggests that floods in Central Europe may be organized in flood-rich and flood-poor periods. This hypothesis is studied by testing the significance of temporal clustering in flood occurrence (peak-over-threshold) time series for 68 catchments across Germany for the period 1932-2005. To assess the robustness of the results, different methods are used: Firstly, the index of dispersion, which quantifies the departure from a homogeneous Poisson process, is investigated. Further, the time-variation of the flood occurrence rate is derived by non-parametric kernel implementation and the significance of clustering is evaluated via parametric and non-parametric tests. Although the methods give consistent overall results, the specific results differ considerably. Hence, we recommend applying different methods when investigating flood clustering. For flood estimation and risk management, it is of relevance to understand whether clustering changes with flood severity and time scale. To this end, clustering is assessed for different thresholds and time scales. It is found that the majority of catchments show temporal clustering at the 5% significance level for low thresholds and time scales of one to a few years. However, clustering decreases substantially with increasing threshold and time scale. We hypothesize that flood clustering in Germany is mainly caused by catchment memory effects along with intra- to inter-annual climate variability, and that decadal climate variability plays a minor role.
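The index of dispersion used above as the first clustering test is simply the variance-to-mean ratio of annual flood counts; a sketch with invented count series:

```python
from statistics import mean, pvariance

def index_of_dispersion(counts):
    """Variance-to-mean ratio of annual flood counts: ~1 for a homogeneous
    Poisson process, >1 indicates temporal clustering, <1 regularity."""
    return pvariance(counts) / mean(counts)

# Hypothetical peak-over-threshold counts per year for two catchments
clustered = [0, 0, 4, 5, 0, 0, 0, 3, 4, 0]  # flood-rich / flood-poor periods
regular   = [1, 2, 1, 1, 2, 1, 2, 1, 2, 1]  # evenly spread occurrences

d_clustered = index_of_dispersion(clustered)  # ≈ 2.53, well above 1
d_regular = index_of_dispersion(regular)      # ≈ 0.17, below 1
```

A significance statement, as in the study, would additionally compare these values against the dispersion of simulated Poisson series.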
Towards Interactive Flood Governance: changing approaches in Dutch flood policy
J.A. van Ast (Jacko)
2013-01-01
In the course of history, flooding of rivers and the sea has brought misery to humanity. Low-lying deltas of large rivers, such as those of Bangladesh, New Orleans, the Nile delta or the Netherlands, belong to the most vulnerable to flood disasters. Since ancient times people pondered
FLOOD AND FLOOD CONTROL OF THE YELLOW RIVER
Institute of Scientific and Technical Information of China (English)
Wenxue LI; Huirang WANG; Yunqi SU; Naiqian JIANG; Yuanfeng ZHANG
2002-01-01
The Yellow River is the cradle of China. It had long been the center of politics, economics and culture of China in history. Large-coverage flood disasters occurred frequently in the Yellow River basin and the losses were often heavy. Thus, the Yellow River is also considered a serious hidden danger to China. Since the founding of new China, structural and non-structural flood control systems have been established. Tremendous successes have been achieved in flood control. Into the 21st century, the flood control standard of the Lower Yellow River has been increased significantly with the operation of the Xiaolangdi Reservoir. However, the problems of the Yellow River are complicated and the tasks for solving them are arduous. In particular, the sedimentation problem cannot be solved completely in the near future. The situation of the "suspended river" and the threat of flooding will long exist. Therefore, supported by the rapid social and economic development of the nation and relying on advanced technology, the flood control system shall be perfected. Meanwhile, study of the Yellow River shall be enhanced in order to better understand floods, cope with them and use them, thus reducing flood disasters.
Carter, R.W.; Godfrey, R.G.
1960-01-01
The basic equations used in flood routing are developed from the law of continuity. In each method the assumptions are discussed to enable the user to select an appropriate technique. In the stage-storage method the storage is related to the mean gage height in the reach under consideration. In the discharge-storage method the storage is determined from weighted values of inflow and outflow discharge. In the reservoir-storage method the storage is considered a function of outflow discharge alone. A detailed example is given for each method to illustrate that particular technique.
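The discharge-storage method with weighted inflow and outflow is essentially Muskingum routing; a sketch with assumed coefficients k and x (the report's own worked examples are not reproduced here):

```python
def muskingum_route(inflow, k=2.0, x=0.2, dt=1.0):
    """Discharge-storage routing with S = k*(x*I + (1-x)*O): storage as a
    weighted function of inflow and outflow (k, x are illustrative)."""
    denom = 2 * k * (1 - x) + dt
    c0 = (dt - 2 * k * x) / denom
    c1 = (dt + 2 * k * x) / denom
    c2 = (2 * k * (1 - x) - dt) / denom  # c0 + c1 + c2 = 1 (continuity)
    out = [inflow[0]]  # assume initial steady state
    for i in range(1, len(inflow)):
        out.append(c0 * inflow[i] + c1 * inflow[i - 1] + c2 * out[-1])
    return out

hydrograph = [10, 30, 60, 45, 25, 15, 10, 10]
routed = muskingum_route(hydrograph)
# The routed peak is attenuated and delayed relative to the inflow peak
```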
NSGIC State | GIS Inventory — This polyline layer indicates the approximate effective FEMA Base Flood Elevation (BFE) associated with the corresponding Special Flood Hazard Area (SFHA). Each line...
NSGIC GIS Inventory (aka Ramona) — This polyline layer indicates the approximate effective FEMA Base Flood Elevations (BFE) associated with the corresponding Special Flood Hazard Area (SFHA). Each...
Estancia Special Flood Hazard Areas (SFHA)
Earth Data Analysis Center, University of New Mexico — This vector dataset depicts the 1% annual flood boundary (otherwise known as special flood hazard area or 100 year flood boundary) for its specified area. The data...
Elephant Butte Special Flood Hazard Areas (SFHA)
Earth Data Analysis Center, University of New Mexico — This vector dataset depicts the 1% annual flood boundary (otherwise known as special flood hazard area or 100 year flood boundary) for its specified area. The data...
Sierra County Special Flood Hazard Areas (SFHA)
Earth Data Analysis Center, University of New Mexico — This vector dataset depicts the 1% annual flood boundary (otherwise known as special flood hazard area or 100 year flood boundary) for its specified area. The data...
Climate and change: simulating flooding impacts on urban transport network
Pregnolato, Maria; Ford, Alistair; Dawson, Richard
2015-04-01
National-scale climate projections indicate that in the future there will be hotter and drier summers, warmer and wetter winters, together with rising sea levels. The frequency of extreme weather events is expected to increase, causing severe damage to the built environment and disruption of infrastructures (Dawson, 2007), whilst population growth and changed demographics are placing new demands on urban infrastructure. It is therefore essential to ensure infrastructure networks are robust to these changes. This research addresses these challenges by focussing on the development of probabilistic tools for managing risk by modelling urban transport networks within the context of extreme weather events. This paper presents a methodology to investigate the impacts of extreme weather events on the urban environment, in particular infrastructure networks, through a combination of climate simulations and spatial representations. By overlaying spatial data on hazard thresholds from a flood model and a flood safety function, mitigated by potential adaptation strategies, different levels of disruption to commuting journeys on road networks are evaluated. The method follows the Catastrophe Modelling approach and consists of a spatial model, combining deterministic loss models and probabilistic risk assessment techniques. It can be applied to present conditions as well as future uncertain scenarios, allowing the examination of the impacts alongside socio-economic and climate changes. The hazard is determined by simulating free surface water flooding, with the software CityCAT (Glenis et al., 2013). The outputs are overlaid on the spatial locations of a simple network model in GIS, which uses journey-to-work (JTW) observations, supplemented with speed and capacity information. To calculate the disruptive effect of flooding on transport networks, a function relating water depth to safe driving car speed has been developed by combining data from experimental reports (Morris et
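A depth-to-safe-speed function of the kind described can be sketched as below. The linear decay and the 0.3 m impassability cut-off are illustrative assumptions, not the curve fitted in the paper:

```python
def safe_speed_kmh(depth_m, free_speed_kmh=50.0):
    """Safe driving speed as a function of standing-water depth.

    Assumed: speed falls linearly with depth and the link becomes
    impassable for cars at 0.3 m (both illustrative, not the fitted curve).
    """
    if depth_m >= 0.3:
        return 0.0
    return free_speed_kmh * (1.0 - depth_m / 0.3)

def link_delay_minutes(length_km, depth_m, free_speed_kmh=50.0):
    """Extra travel time on one flooded road link (inf = link severed)."""
    v = safe_speed_kmh(depth_m, free_speed_kmh)
    if v == 0.0:
        return float("inf")
    return 60.0 * length_km * (1.0 / v - 1.0 / free_speed_kmh)
```

Summing such delays over JTW routes gives the network-level disruption measure.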
The Aqueduct Global Flood Analyzer
Iceland, Charles
2015-04-01
As population growth and economic growth take place, and as climate change accelerates, many regions across the globe are finding themselves increasingly vulnerable to flooding. A recent OECD study of the exposure of the world's large port cities to coastal flooding found that 40 million people were exposed to a 1-in-100-year coastal flood event in 2005, and the total value of exposed assets was about US$3,000 billion, or 5% of global GDP. By the 2070s, those numbers were estimated to increase to 150 million people and US$35,000 billion, or roughly 9% of projected global GDP. Impoverished people in developing countries are particularly at risk because they often live in flood-prone areas and lack the resources to respond. WRI and its Dutch partners - Deltares, IVM-VU University Amsterdam, Utrecht University, and PBL Netherlands Environmental Assessment Agency - are in the initial stages of developing a robust set of river flood and coastal storm surge risk measures that show the extent of flooding under a variety of scenarios (both current and future), together with the projected human and economic impacts of these flood scenarios. These flood risk data and information will be accessible via an online, easy-to-use Aqueduct Global Flood Analyzer. We will also investigate the viability, benefits, and costs of a wide array of flood risk reduction measures that could be implemented in a variety of geographic and socio-economic settings. Together, the activities we propose have the potential for saving hundreds of thousands of lives and strengthening the resiliency and security of many millions more, especially those who are most vulnerable. Mr. Iceland will present Version 1.0 of the Aqueduct Global Flood Analyzer and provide a preview of additional elements of the Analyzer to be released in the coming years.
Fan, Qin; Davlasheridze, Meri
2016-06-01
Climate change is expected to worsen the negative effects of natural disasters like floods. The negative impacts, however, can be mitigated by individuals' adjustments through migration and relocation behaviors. Previous literature has identified flood risk as one significant driver in relocation decisions, but no prior study examines the effect of the National Flood Insurance Program's voluntary program, the Community Rating System (CRS), on residential location choice. This article fills this gap and tests the hypothesis that flood risk and the CRS-creditable flood control activities affect residential location choices. We employ a two-stage sorting model to empirically estimate the effects. In the first stage, individuals' risk perception and preference heterogeneity for the CRS activities are considered, while mean effects of flood risk and the CRS activities are estimated in the second stage. We then estimate heterogeneous marginal willingness to pay (WTP) for the CRS activities by category. Results show that age, ethnicity and race, educational attainment, and prior exposure to risk explain risk perception. We find significant values for the CRS-creditable mitigation activities, which provides empirical evidence for the benefits associated with the program. The marginal WTP for an additional credit point earned for public information activities, including hazard disclosure, is found to be the highest. Results also suggest that water amenities dominate flood risk. Thus, high amenity values may increase exposure to flood risk, and flood mitigation projects should be strategized in coastal regions accordingly.
Water NOT wanted - Coastal Floods and Flooding Protection in Denmark
DEFF Research Database (Denmark)
Sørensen, Carlo Sass
2016-01-01
vulnerability towards coastal flooding, the country has experienced severe storm surges throughout history, and hitherto safe areas will become increasingly at risk this century as the climate changes. Historically a seafarers’ nation, Denmark has always been connected with the sea. From medieval time ports...... acceptance of floods has decreased from a “this is a natural consequence of living by the sea” to an explicit: Water Not Wanted! This paper provides a brief overview of floods and flooding protection issues in Denmark (Ch. 2 & Ch. 3), the current legislation (Ch. 4), and discusses challenges in relation...... to climate change adaptation, risk reduction, and to potential ways of rethinking flooding protection in strategies that also incorporate other uses (Ch. 5)....
Agent-Oriented Probabilistic Logic Programming
Institute of Scientific and Technical Information of China (English)
Jie Wang; Shi-Er Ju; Chun-Nian Liu
2006-01-01
Currently, agent-based computing is an active research area, and great efforts have been made towards agent-oriented programming from both a theoretical and a practical view. However, most approaches assume that there is no uncertainty in agents' mental state and their environment. In other words, under this assumption agent developers are only allowed to specify how their agent acts when the agent is 100% sure about what is true/false. In this paper, this unrealistic assumption is removed and a new agent-oriented probabilistic logic programming language is proposed, which can deal with uncertain information about the world. The programming language is based on a combination of features of probabilistic logic programming and imperative programming.
Probabilistic forecasting and Bayesian data assimilation
Reich, Sebastian
2015-01-01
In this book the authors describe the principles and methods behind probabilistic forecasting and Bayesian data assimilation. Instead of focusing on particular application areas, the authors adopt a general dynamical systems approach, with a profusion of low-dimensional, discrete-time numerical examples designed to build intuition about the subject. Part I explains the mathematical framework of ensemble-based probabilistic forecasting and uncertainty quantification. Part II is devoted to Bayesian filtering algorithms, from classical data assimilation algorithms such as the Kalman filter, variational techniques, and sequential Monte Carlo methods, through to more recent developments such as the ensemble Kalman filter and ensemble transform filters. The McKean approach to sequential filtering in combination with coupling of measures serves as a unifying mathematical framework throughout Part II. Assuming only some basic familiarity with probability, this book is an ideal introduction for graduate students in ap...
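The ensemble Kalman filter mentioned in Part II can be illustrated with a minimal scalar analysis step. The sketch below is the stochastic (perturbed-observation) variant with a directly observed state; it is a low-dimensional intuition-builder in the spirit of the book's examples, not the authors' code:

```python
import numpy as np

def enkf_analysis(ensemble, obs, obs_noise_std, rng):
    """One stochastic EnKF analysis step for a scalar, directly
    observed state (H = I): nudge each member toward a perturbed
    copy of the observation with the ensemble Kalman gain."""
    x = np.asarray(ensemble, dtype=float)
    p = x.var(ddof=1)                      # forecast ensemble variance
    k = p / (p + obs_noise_std ** 2)       # Kalman gain
    perturbed = obs + rng.normal(0.0, obs_noise_std, size=x.shape)
    return x + k * (perturbed - x)

rng = np.random.default_rng(0)
prior = rng.normal(0.0, 2.0, size=500)    # forecast ensemble around 0
posterior = enkf_analysis(prior, obs=3.0, obs_noise_std=1.0, rng=rng)
# posterior mean is pulled toward the observation; spread shrinks
```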
Exact and Approximate Probabilistic Symbolic Execution
Luckow, Kasper; Pasareanu, Corina S.; Dwyer, Matthew B.; Filieri, Antonio; Visser, Willem
2014-01-01
Probabilistic software analysis seeks to quantify the likelihood of reaching a target event under uncertain environments. Recent approaches compute probabilities of execution paths using symbolic execution, but do not support nondeterminism. Nondeterminism arises naturally when no suitable probabilistic model can capture a program behavior, e.g., for multithreading or distributed systems. In this work, we propose a technique, based on symbolic execution, to synthesize schedulers that resolve nondeterminism to maximize the probability of reaching a target event. To scale to large systems, we also introduce approximate algorithms to search for good schedulers, speeding up established random sampling and reinforcement learning results through the quantification of path probabilities based on symbolic execution. We implemented the techniques in Symbolic PathFinder and evaluated them on nondeterministic Java programs. We show that our algorithms significantly improve upon a state-of-the-art statistical model checking algorithm, originally developed for Markov Decision Processes.
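The scheduler-synthesis idea can be shown on a toy program. The sketch below replaces symbolic execution with brute-force input enumeration (feasible only because the input space is tiny); the two-way `choice` stands in for a nondeterministic scheduling decision:

```python
def max_reach_probability(n_bits=4):
    """For each resolution of one nondeterministic choice, count the
    uniform random inputs whose execution path reaches the target,
    and keep the choice maximizing that probability. A brute-force
    stand-in for the paper's symbolic-execution-based synthesis."""
    def reaches_target(x, choice):
        # toy program: nondeterministic branch, then data-dependent test
        if choice == "a":
            return x > 5           # target reached when x > 5
        return x % 2 == 0          # target reached when x is even
    probs = {}
    for choice in ("a", "b"):
        hits = sum(reaches_target(x, choice) for x in range(2 ** n_bits))
        probs[choice] = hits / 2 ** n_bits
    return max(probs, key=probs.get), probs

choice, probs = max_reach_probability()
# probs == {'a': 0.625, 'b': 0.5}; the synthesized scheduler picks 'a'
```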
Probabilistic Parsing Using Left Corner Language Models
Manning, C D; Manning, Christopher D.; Carpenter, Bob
1997-01-01
We introduce a novel parser based on a probabilistic version of a left-corner parser. The left-corner strategy is attractive because rule probabilities can be conditioned on both top-down goals and bottom-up derivations. We develop the underlying theory and explain how a grammar can be induced from analyzed data. We show that the left-corner approach provides an advantage over simple top-down probabilistic context-free grammars in parsing the Wall Street Journal using a grammar induced from the Penn Treebank. We also conclude that the Penn Treebank provides a fairly weak testbed due to the flatness of its bracketings and to the obvious overgeneration and undergeneration of its induced grammar.
Probabilistic Universality in two-dimensional Dynamics
Lyubich, Mikhail
2011-01-01
In this paper we continue to explore infinitely renormalizable Hénon maps with small Jacobian. It was shown in [CLM] that contrary to the one-dimensional intuition, the Cantor attractor of such a map is non-rigid and the conjugacy with the one-dimensional Cantor attractor is at most 1/2-Hölder. Another formulation of this phenomenon is that the scaling structure of the Hénon Cantor attractor differs from its one-dimensional counterpart. However, in this paper we prove that the weight assigned by the canonical invariant measure to the spots where the scaling structure deviates from the one-dimensional one (the "bad spots") tends to zero on microscopic scales. This phenomenon is called Probabilistic Universality. It implies, in particular, that the Hausdorff dimension of the canonical measure is universal. In this way, universality and rigidity phenomena of one-dimensional dynamics assume a probabilistic nature in the two-dimensional world.
Lipschitz Parametrization of Probabilistic Graphical Models
Honorio, Jean
2012-01-01
We show that the log-likelihood of several probabilistic graphical models is Lipschitz continuous with respect to the lp-norm of the parameters. We discuss several implications of Lipschitz parametrization. We present an upper bound of the Kullback-Leibler divergence that allows understanding methods that penalize the lp-norm of differences of parameters as the minimization of that upper bound. The expected log-likelihood is lower bounded by the negative lp-norm, which allows understanding the generalization ability of probabilistic models. The exponential of the negative lp-norm is involved in the lower bound of the Bayes error rate, which shows that it is reasonable to use parameters as features in algorithms that rely on metric spaces (e.g. classification, dimensionality reduction, clustering). Our results do not rely on specific algorithms for learning the structure or parameters. We show preliminary results for activity recognition and temporal segmentation.
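The Lipschitz claim can be checked numerically on the simplest case. The sketch below uses a one-parameter logistic (Bernoulli) model with feature value 1, for which the log-likelihood derivative is bounded by |feature| = 1; this toy model is my choice of illustration, not one of the paper's graphical models:

```python
import math
import random

def bernoulli_loglik(theta, x):
    """Log-likelihood of one Bernoulli observation under a logistic
    model parameterized by the log-odds theta (feature value 1)."""
    p = 1.0 / (1.0 + math.exp(-theta))
    return math.log(p) if x == 1 else math.log(1.0 - p)

def check_lipschitz(x=1, bound=1.0, trials=200):
    """Spot-check |l(t1) - l(t2)| <= bound * |t1 - t2| at random
    parameter pairs -- the Lipschitz property the paper establishes
    for the log-likelihood of several graphical models."""
    rng = random.Random(0)
    for _ in range(trials):
        t1, t2 = rng.uniform(-5, 5), rng.uniform(-5, 5)
        lhs = abs(bernoulli_loglik(t1, x) - bernoulli_loglik(t2, x))
        if lhs > bound * abs(t1 - t2) + 1e-9:
            return False
    return True
```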
Efficient Probabilistic Inference with Partial Ranking Queries
Huang, Jonathan; Guestrin, Carlos E
2012-01-01
Distributions over rankings are used to model data in various settings such as preference analysis and political elections. The factorial size of the space of rankings, however, typically forces one to make structural assumptions, such as smoothness, sparsity, or probabilistic independence about these underlying distributions. We approach the modeling problem from the computational principle that one should make structural assumptions which allow for efficient calculation of typical probabilistic queries. For ranking models, "typical" queries predominantly take the form of partial ranking queries (e.g., given a user's top-k favorite movies, what are his preferences over remaining movies?). In this paper, we argue that riffled independence factorizations proposed in recent literature [7, 8] are a natural structural assumption for ranking distributions, allowing for particularly efficient processing of partial ranking queries.
Probabilistic Dynamic Logic of Phenomena and Cognition
Vityaev, Evgenii; Perlovsky, Leonid; Smerdov, Stanislav
2011-01-01
The purpose of this paper is to develop further the main concepts of Phenomena Dynamic Logic (P-DL) and Cognitive Dynamic Logic (C-DL), presented in the previous paper. The specific character of these logics is in matching vagueness or fuzziness of similarity measures to the uncertainty of models. These logics are based on the following fundamental notions: generality relation, uncertainty relation, simplicity relation, similarity maximization problem with empirical content and enhancement (learning) operator. We develop these notions in terms of logic and probability and developed a Probabilistic Dynamic Logic of Phenomena and Cognition (P-DL-PC) that relates to the scope of probabilistic models of brain. In our research the effectiveness of suggested formalization is demonstrated by approximation of the expert model of breast cancer diagnostic decisions. The P-DL-PC logic was previously successfully applied to solving many practical tasks and also for modelling of some cognitive processes.
A probabilistic model of RNA conformational space
DEFF Research Database (Denmark)
Frellsen, Jes; Moltke, Ida; Thiim, Martin;
2009-01-01
The increasing importance of non-coding RNA in biology and medicine has led to a growing interest in the problem of RNA 3-D structure prediction. As is the case for proteins, RNA 3-D structure prediction methods require two key ingredients: an accurate energy function and a conformational sampling......, the discrete nature of the fragments necessitates the use of carefully tuned, unphysical energy functions, and their non-probabilistic nature impairs unbiased sampling. We offer a solution to the sampling problem that removes these important limitations: a probabilistic model of RNA structure that allows...... conformations for 9 out of 10 test structures, solely using coarse-grained base-pairing information. In conclusion, the method provides a theoretical and practical solution for a major bottleneck on the way to routine prediction and simulation of RNA structure and dynamics in atomic detail....
Significance testing as perverse probabilistic reasoning
Directory of Open Access Journals (Sweden)
Westover Kenneth D
2011-02-01
Truth claims in the medical literature rely heavily on statistical significance testing. Unfortunately, most physicians misunderstand the underlying probabilistic logic of significance tests and consequently often misinterpret their results. This near-universal misunderstanding is highlighted by means of a simple quiz which we administered to 246 physicians at two major academic hospitals, on which the proportion of incorrect responses exceeded 90%. A solid understanding of the fundamental concepts of probability theory is becoming essential to the rational interpretation of medical information. This essay provides a technically sound review of these concepts that is accessible to a medical audience. We also briefly review the debate in the cognitive sciences regarding physicians' aptitude for probabilistic inference.
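The probabilistic logic at stake can be made concrete with a positive-predictive-value calculation. The numbers below (80% power, 10% prior) are illustrative, not taken from the quiz:

```python
def ppv(prior, power=0.8, alpha=0.05):
    """Probability that a 'significant' (p < alpha) result reflects a
    true effect, given the prior probability the hypothesis is true.
    Illustrates the inversion error: a p-value is P(data | no effect),
    not P(no effect | data)."""
    true_pos = prior * power
    false_pos = (1 - prior) * alpha
    return true_pos / (true_pos + false_pos)

# With only 1 in 10 tested hypotheses true, a significant result is
# far from the "95% certain" many physicians assume:
round(ppv(0.10), 2)   # -> 0.64
```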
Incorporating psychological influences in probabilistic cost analysis
Energy Technology Data Exchange (ETDEWEB)
Kujawski, Edouard; Alvaro, Mariana; Edwards, William
2004-01-08
Today's typical probabilistic cost analysis assumes an "ideal" project that is devoid of the human and organizational considerations that heavily influence the success and cost of real-world projects. In the real world "Money Allocated Is Money Spent" (MAIMS principle); cost underruns are rarely available to protect against cost overruns while task overruns are passed on to the total project cost. Realistic cost estimates therefore require a modified probabilistic cost analysis that simultaneously models the cost management strategy including budget allocation. Psychological influences such as overconfidence in assessing uncertainties and dependencies among cost elements and risks are other important considerations that are generally not addressed. It should then be no surprise that actual project costs often exceed the initial estimates and are delivered late and/or with a reduced scope. This paper presents a practical probabilistic cost analysis model that incorporates recent findings in human behavior and judgment under uncertainty, dependencies among cost elements, the MAIMS principle, and project management practices. Uncertain cost elements are elicited from experts using the direct fractile assessment method and fitted with three-parameter Weibull distributions. The full correlation matrix is specified in terms of two parameters that characterize correlations among cost elements in the same and in different subsystems. The analysis is readily implemented using standard Monte Carlo simulation tools such as @Risk and Crystal Ball®. The analysis of a representative design and engineering project substantiates that today's typical probabilistic cost analysis is likely to severely underestimate project cost for probability of success values of importance to contractors and procuring activities. The proposed approach provides a framework for developing a viable cost management strategy for
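The MAIMS principle can be sketched in a few lines of Monte Carlo: an element never comes in under its budget, while overruns pass through. Two-parameter Weibull elements and independent sampling stand in for the paper's expert-elicited three-parameter fits and correlation structure:

```python
import random

def project_cost_maims(element_params, budgets, n=20000, seed=1):
    """Monte Carlo cost roll-up under MAIMS: each element costs
    max(sampled cost, budget), since allocated money is spent.
    Independent two-parameter Weibull elements are a simplifying
    assumption (the paper uses correlated three-parameter fits)."""
    rng = random.Random(seed)
    totals = []
    for _ in range(n):
        total = 0.0
        for (shape, scale), budget in zip(element_params, budgets):
            cost = rng.weibullvariate(scale, shape)  # (alpha=scale, beta=shape)
            total += max(cost, budget)               # underruns not returned
        totals.append(total)
    return totals

params = [(2.0, 100.0), (2.0, 150.0)]     # (shape, scale) per element
budgets = [90.0, 140.0]
totals = sorted(project_cost_maims(params, budgets))
p80 = totals[int(0.8 * len(totals))]      # 80th-percentile project cost
```

Because no element can underrun, the total is always at least the allocated budget, which is exactly why the "ideal" analysis underestimates cost.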
Probabilistic Output Analysis by Program Manipulation
Rosendahl, Mads; Kirkeby, Maja H.
2015-01-01
The aim of a probabilistic output analysis is to derive a probability distribution of possible output values for a program from a probability distribution of its input. We present a method for performing static output analysis, based on program transformation techniques. It generates a probability function as a possibly uncomputable expression in an intermediate language. This program is then analyzed, transformed, and approximated. The result is a closed form expression that computes an over...
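The object the analysis derives can be shown concretely. The sketch below obtains the output distribution by enumerating a finite input distribution; the paper derives it statically by program transformation, so this is only the semantic target, not the method:

```python
from collections import defaultdict
from fractions import Fraction

def output_distribution(program, input_dist):
    """Push a probability distribution over inputs through a program
    to get the distribution of its outputs (by enumeration)."""
    out = defaultdict(Fraction)
    for value, prob in input_dist.items():
        out[program(value)] += prob
    return dict(out)

# Distribution of a die roll mod 3, with exact rational probabilities
die = {i: Fraction(1, 6) for i in range(1, 7)}
dist = output_distribution(lambda x: x % 3, die)
# dist == {0: 1/3, 1: 1/3, 2: 1/3}
```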
Learning Probabilistic Models of Word Sense Disambiguation
Pedersen, Ted
1998-01-01
This dissertation presents several new methods of supervised and unsupervised learning of word sense disambiguation models. The supervised methods focus on performing model searches through a space of probabilistic models, and the unsupervised methods rely on the use of Gibbs Sampling and the Expectation Maximization (EM) algorithm. In both the supervised and unsupervised case, the Naive Bayesian model is found to perform well. An explanation for this success is presented in terms of learning rates and bias-variance decompositions.
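The Naive Bayesian model the dissertation finds successful can be sketched for word sense disambiguation. The tiny training set and add-one smoothing are illustrative choices, not the dissertation's corpora or estimators:

```python
import math
from collections import Counter, defaultdict

class NaiveBayesWSD:
    """Naive Bayesian WSD: context words assumed independent given
    the sense; add-one smoothing over a shared vocabulary."""
    def fit(self, examples):                     # [(sense, context_words)]
        self.sense_counts = Counter(s for s, _ in examples)
        self.word_counts = defaultdict(Counter)
        self.vocab = set()
        for sense, words in examples:
            self.word_counts[sense].update(words)
            self.vocab.update(words)
        return self

    def predict(self, context):
        def score(sense):
            total = sum(self.word_counts[sense].values())
            logp = math.log(self.sense_counts[sense])
            for w in context:
                logp += math.log((self.word_counts[sense][w] + 1)
                                 / (total + len(self.vocab)))
            return logp
        return max(self.sense_counts, key=score)

train = [("finance", ["money", "loan", "deposit"]),
         ("finance", ["interest", "account", "money"]),
         ("river", ["water", "shore", "fishing"]),
         ("river", ["mud", "water", "erosion"])]
model = NaiveBayesWSD().fit(train)
model.predict(["loan", "money"])    # -> "finance"
```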
Bayesian Probabilistic Projection of International Migration.
Azose, Jonathan J; Raftery, Adrian E
2015-10-01
We propose a method for obtaining joint probabilistic projections of migration for all countries, broken down by age and sex. Joint trajectories for all countries are constrained to satisfy the requirement of zero global net migration. We evaluate our model using out-of-sample validation and compare point projections to the projected migration rates from a persistence model similar to the method used in the United Nations' World Population Prospects, and also to a state-of-the-art gravity model.
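The zero-global-net-migration constraint can be sketched as a post-hoc adjustment. Spreading the imbalance proportionally to population is one simple scheme; the authors' actual adjustment within their Bayesian model may differ:

```python
def enforce_zero_net_migration(net_migration, populations):
    """Adjust projected net migration counts so they sum to zero
    worldwide, allocating the discrepancy proportionally to each
    country's population (an assumed allocation rule)."""
    imbalance = sum(net_migration)
    total_pop = sum(populations)
    return [m - imbalance * p / total_pop
            for m, p in zip(net_migration, populations)]

raw = [120.0, -50.0, -40.0]      # thousands of migrants; sums to +30
pops = [1000.0, 2000.0, 3000.0]
adjusted = enforce_zero_net_migration(raw, pops)
# adjusted sums to zero: [115.0, -60.0, -55.0]
```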
A Probabilistic Approach to Knowledge Translation
Jiang, Shangpu; Lowd, Daniel; Dou, Dejing
2015-01-01
In this paper, we focus on a novel knowledge reuse scenario where the knowledge in the source schema needs to be translated to a semantically heterogeneous target schema. We refer to this task as "knowledge translation" (KT). Unlike data translation and transfer learning, KT does not require any data from the source or target schema. We adopt a probabilistic approach to KT by representing the knowledge in the source schema, the mapping between the source and target schemas, and the resulting ...
Treatment of Uncertainties in Probabilistic Tsunami Hazard
Thio, H. K.
2012-12-01
Over the last few years, we have developed a framework for developing probabilistic tsunami inundation maps, which includes comprehensive quantification of earthquake recurrence as well as uncertainties, and applied it to the development of a tsunami hazard map of California. The various uncertainties in tsunami source and propagation models are an integral part of a comprehensive probabilistic tsunami hazard analysis (PTHA), and often drive the hazard at low probability levels (i.e. long return periods). There is no unique manner in which uncertainties are included in the analysis, although in general we distinguish between "natural" or aleatory variability, such as slip distribution and event magnitude, and uncertainties due to an incomplete understanding of the behavior of the earth, called epistemic uncertainties, such as scaling relations and rupture segmentation. Aleatory uncertainties are typically included through integration over distribution functions based on regression analyses, whereas epistemic uncertainties are included using logic trees. We will discuss how the different uncertainties were included in our recent probabilistic tsunami inundation maps for California, and their relative importance on the final results. Including these uncertainties in offshore exceedance wave heights is straightforward, but the problem becomes more complicated once the non-linearity of near-shore propagation and inundation are encountered. By using the probabilistic offshore wave heights as input level for the inundation models, the uncertainties up to that point can be included in the final maps. PTHA provides a consistent analysis of tsunami hazard and will become an important tool in diverse areas such as coastal engineering and land use planning. The inclusive nature of the analysis, where few assumptions are made a priori as to which sources are significant, means that a single analysis can provide a comprehensive view of the hazard and its dominant sources.
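The logic-tree treatment of epistemic uncertainty can be sketched as a weighted combination of branch hazard curves. The branch curves and weights below are invented for illustration:

```python
def mean_hazard_curve(branch_curves, weights):
    """Combine exceedance curves from logic-tree branches (epistemic
    uncertainty, e.g. alternative scaling relations) into a
    weighted-mean hazard curve. All curves share the same
    waveheight levels; weights must sum to one."""
    assert abs(sum(weights) - 1.0) < 1e-9
    n = len(branch_curves[0])
    return [sum(w * curve[i] for w, curve in zip(weights, branch_curves))
            for i in range(n)]

# Three branches; annual exceedance probabilities at 1 m, 3 m, 5 m
branches = [[0.020, 0.004, 0.0008],
            [0.010, 0.002, 0.0004],
            [0.005, 0.001, 0.0002]]
curve = mean_hazard_curve(branches, weights=[0.5, 0.3, 0.2])
# curve[0] == 0.014: the 1 m level is exceeded more often than 3 m or 5 m
```

Aleatory variability, by contrast, would already be integrated inside each branch curve.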
PRISMATIC: Unified Hierarchical Probabilistic Verification Tool
2011-09-01
Probabilistic and quantum finite automata with postselection
Yakaryilmaz, Abuzer
2011-01-01
We prove that endowing a real-time probabilistic or quantum computer with the ability of postselection increases its computational power. For this purpose, we provide a new model of finite automata with postselection, and compare it with the model of Lāce et al. We examine the related language classes, and also establish separations between the classical and quantum versions, and between the zero-error vs. bounded-error modes of recognition in this model.
Probabilistic forecast of daily areal precipitation focusing on extreme events
Bliefernicht, J.; Bárdossy, A.
2007-04-01
A dynamical downscaling scheme is usually used to provide a short range flood forecasting system with high-resolved precipitation fields. Unfortunately, a single forecast of this scheme has a high uncertainty concerning intensity and location, especially during extreme events. Alternatively, statistical downscaling techniques like the analogue method can be used, which can supply probabilistic forecasts. However, the performance of the analogue method is affected by the similarity criterion, which is used to identify similar weather situations. To investigate this issue, three different similarity measures are tested in this work: the Euclidean distance (1), the Pearson correlation (2) and a combination of both measures (3). The predictor variables are geopotential height at 1000 and 700 hPa-level and specific humidity fluxes at 700 hPa-level derived from the NCEP/NCAR-reanalysis project. The study is performed for three mesoscale catchments located in the Rhine basin in Germany. It is validated by a jackknife method for a period of 44 years (1958-2001). The ranked probability skill score, the Brier skill score, the Heidke skill score and the confidence interval of the Cramer association coefficient are calculated to evaluate the system for extreme events. The results show that the combined similarity measure yields the best results in predicting extreme events. However, the confidence interval of the Cramer coefficient indicates that this improvement is only significant compared to the Pearson correlation but not for the Euclidean distance. Furthermore, the performance of the presented forecasting system is very low during the summer and new predictors have to be tested to overcome this problem.
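The three similarity measures can be sketched directly. Treating each weather field as a flat vector and weighting the two terms equally in the combined measure are assumptions of this sketch, not the paper's calibration:

```python
import math

def euclidean(a, b):
    """Euclidean distance between two flattened predictor fields."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def pearson(a, b):
    """Pearson correlation between two flattened predictor fields."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = math.sqrt(sum((x - ma) ** 2 for x in a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (sa * sb)

def best_analogues(target, archive, k=3):
    """Rank archived weather situations by the combined measure:
    small Euclidean distance (amplitude) plus high Pearson
    correlation (pattern); equal weighting is assumed."""
    def score(field):
        return euclidean(target, field) - pearson(target, field)
    return sorted(range(len(archive)), key=lambda i: score(archive[i]))[:k]

target = [1.0, 2.0, 3.0, 4.0]
archive = [[1.0, 2.0, 3.0, 4.0],
           [4.0, 3.0, 2.0, 1.0],
           [1.1, 2.1, 3.1, 3.9]]
best_analogues(target, archive)   # -> [0, 2, 1]
```

The precipitation observed on the top-k analogue days then forms the probabilistic forecast.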
Flood Risk and Asset Management
2012-09-01
Within the UK, for example, the flooding of the village of Boscastle (August 2004) took place over a single day (Roca-Collel and Davison, 2010). Roca-Collel, M. and Davison, M. (2010). "Two dimensional model analysis of flash-flood processes: application to the Boscastle
Geomorphological factors of flash floods
Kuznetsova, Yulia
2016-04-01
Growing anthropogenic load, the rising frequency of extreme meteorological events and increasing total precipitation depth often lead to greater danger of catastrophic fluvial processes worldwide. Flash floods are among the most dangerous and least understood of these processes. Difficulties in studying them are mainly related to the short duration of single events and the remoteness and hard access of their origin areas. Most detailed research on flash floods focuses on the hydrological parameters of the flow itself and its meteorological factors. At the same time, the importance of the basin's geological and geomorphological structure for flash flood generation, and the role these floods play in global sediment redistribution, are still poorly understood. However, understanding and quantitative assessment of these features is a real basis for a complete concept of the factors, characteristics and dynamics of flash floods. This work is a review of published data on flash floods, and focuses on the geomorphological factors of the phenomenon. We consider both the individual roles of and interactions between different geomorphological features (whole-basin parameters, characteristics of single slopes, and the valley bottom). Special attention is paid to critical values of certain factors. This approach also highlights the gaps, that is, the less studied factors of flash floods. Finally, all the data are organized into a complex diagram that may be used for flash flood modeling. This may also help to reach a new level of flash flood prediction and risk assessment.
Extreme flooding tolerance in Rorippa
Akman, M.; Bhikharie, A.; Mustroph, A.; Sasidharan, Rashmi
2014-01-01
Low oxygen stress imposed by floods creates a strong selection force shaping plant ecosystems in flood-prone areas. Plants inhabiting these environments adopt various adaptations and survival strategies to cope with increasing water depths. Two Rorippa species, R. sylvestris and R. amphibia that gro
Utilizing Probabilistic Linear Equations in Cube Attacks
Institute of Scientific and Technical Information of China (English)
Yuan Yao; Bin Zhang; Wen-Ling Wu
2016-01-01
Cube attacks, proposed by Dinur and Shamir at EUROCRYPT 2009, have shown huge power against stream ciphers. In the original cube attacks, a linear system of secret key bits is exploited for key recovery attacks. However, we find that a number of equations claimed to be linear in the previous literature are actually nonlinear and do not fit into the theoretical framework of cube attacks. Moreover, cube attacks are hard to apply if linear equations are rare. Therefore, it is of significance to make use of probabilistic linear equations, namely nonlinear superpolys that can be approximated by linear expressions effectively. In this paper, we suggest a way to test out and utilize these probabilistic linear equations, thus extending cube attacks to a wider scope. Concretely, we employ the standard parameter estimation approach and the sequential probability ratio test (SPRT) for linearity testing in the preprocessing phase, and use maximum likelihood decoding (MLD) for solving the probabilistic linear equations in the online phase. As an application, we exhibit our new attack against 672 rounds of Trivium and reduce the number of key bits to search by 7.
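How "close to linear" a Boolean superpoly is can be estimated by sampling. The sketch below uses the classic BLR check (a linear function over GF(2) satisfies f(x) ⊕ f(y) ⊕ f(x ⊕ y) ⊕ f(0) = 0 for all pairs); the paper's SPRT-based test is a sequential refinement of this idea, and the example functions are toys:

```python
import random

def blr_linearity_rate(f, n_vars, trials=2000, seed=7):
    """Estimated fraction of random pairs (x, y) on which f behaves
    linearly over GF(2) -- the 'probabilistic linearity' a cube
    attack can exploit when exactly linear superpolys are rare."""
    rng = random.Random(seed)
    passes = 0
    for _ in range(trials):
        x = rng.getrandbits(n_vars)
        y = rng.getrandbits(n_vars)
        if f(x) ^ f(y) ^ f(x ^ y) ^ f(0) == 0:
            passes += 1
    return passes / trials

parity = lambda v: bin(v).count("1") & 1                 # truly linear
noisy = lambda v: (bin(v).count("1") & 1) ^ (v % 16 == 1)  # mostly linear
blr_linearity_rate(parity, 8)   # -> 1.0
```

A high but sub-1.0 rate marks a superpoly worth keeping for maximum likelihood decoding in the online phase.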
Symbolic Computing in Probabilistic and Stochastic Analysis
Directory of Open Access Journals (Sweden)
Kamiński Marcin
2015-12-01
The main aim is to present recent developments in applications of symbolic computing in probabilistic and stochastic analysis, and this is done using the example of the well-known MAPLE system. The key theoretical methods discussed are (i) analytical derivations, (ii) the classical Monte-Carlo simulation approach, (iii) the stochastic perturbation technique, as well as (iv) some semi-analytical approaches. It is demonstrated in particular how to engage the basic symbolic tools implemented in any system to derive the basic equations for the stochastic perturbation technique and how to make an efficient implementation of the semi-analytical methods using an automatic differentiation and integration provided by the computer algebra program itself. The second important illustration is a probabilistic extension of the finite element and finite difference methods coded in MAPLE, showing how to solve boundary value problems with random parameters in the environment of symbolic computing. The response function method belongs to the third group, where interference of classical deterministic software with the non-linear fitting numerical techniques available in various symbolic environments is displayed. We recover in this context the probabilistic structural response in engineering systems and show how to solve partial differential equations including Gaussian randomness in their coefficients.
Asteroid Risk Assessment: A Probabilistic Approach.
Reinhardt, Jason C; Chen, Xi; Liu, Wenhao; Manchev, Petar; Paté-Cornell, M Elisabeth
2016-02-01
Following the 2013 Chelyabinsk event, the risks posed by asteroids attracted renewed interest, from both the scientific and policy-making communities. It reminded the world that impacts from near-Earth objects (NEOs), while rare, have the potential to cause great damage to cities and populations. Point estimates of the risk (such as mean numbers of casualties) have been proposed, but because of the low-probability, high-consequence nature of asteroid impacts, these averages provide limited actionable information. While more work is needed to further refine its input distributions (e.g., NEO diameters), the probabilistic model presented in this article allows a more complete evaluation of the risk of NEO impacts because the results are distributions that cover the range of potential casualties. This model is based on a modularized simulation that uses probabilistic inputs to estimate probabilistic risk metrics, including those of rare asteroid impacts. Illustrative results of this analysis are presented for a period of 100 years. As part of this demonstration, we assess the effectiveness of civil defense measures in mitigating the risk of human casualties. We find that they are likely to be beneficial but not a panacea. We also compute the probability (but not the consequences) of an impact with global effects ("cataclysm"). We conclude that there is a continued need for NEO observation, and for analyses of the feasibility and risk-reduction effectiveness of space missions designed to deflect or destroy asteroids that threaten the Earth. © 2015 Society for Risk Analysis.
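The distribution-in/distribution-out structure described above can be sketched in a few lines of Monte Carlo; every rate, exponent, and exposure figure below is an invented placeholder, not a value from the article.

```python
import random

def simulate_casualties(n_years=100, runs=10_000, seed=42):
    """Toy modularized simulation: each run draws impact occurrences and
    sizes from probabilistic inputs and accumulates casualties over the
    period, so the output is a distribution rather than a point estimate."""
    rng = random.Random(seed)
    totals = []
    for _ in range(runs):
        total = 0
        for _ in range(n_years):
            if rng.random() < 0.01:                       # assumed annual impact prob.
                diameter = 20.0 * rng.paretovariate(1.5)  # metres; heavy-tailed sizes
                exposed = rng.random() * 1e6              # people under the footprint
                total += int(exposed * min(1.0, (diameter / 500.0) ** 2))
        totals.append(total)
    return totals

dist = sorted(simulate_casualties())
median, p99 = dist[len(dist) // 2], dist[int(0.99 * len(dist))]
```

The point is that the mean hides the tail: quantiles such as the 99th percentile carry the actionable low-probability, high-consequence information.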
Probabilistic Graph Layout for Uncertain Network Visualization.
Schulz, Christoph; Nocaj, Arlind; Goertler, Jochen; Deussen, Oliver; Brandes, Ulrik; Weiskopf, Daniel
2017-01-01
We present a novel uncertain network visualization technique based on node-link diagrams. Nodes expand spatially in our probabilistic graph layout, depending on the underlying probability distributions of edges. The visualization is created by computing a two-dimensional graph embedding that combines samples from the probabilistic graph. A Monte Carlo process is used to decompose a probabilistic graph into its possible instances and to continue with our graph layout technique. Splatting and edge bundling are used to visualize point clouds and network topology. The results provide insights into probability distributions for the entire network, not only for individual nodes and edges. We validate our approach using three data sets that represent a wide range of network types: synthetic data, protein-protein interactions from the STRING database, and travel times extracted from Google Maps. Our approach reveals general limitations of the force-directed layout and allows the user to recognize that some nodes of the graph are at a specific position just by chance.
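The Monte Carlo decomposition step can be sketched minimally as independent edge sampling; the toy edge set and probabilities below are invented, and a real pipeline would hand each sampled instance to the embedding/layout stage.

```python
import random

def sample_instances(edge_probs, n_samples, seed=7):
    """Monte Carlo decomposition of a probabilistic graph: each sample keeps
    every uncertain edge independently with its probability, yielding one
    concrete graph instance per draw."""
    rng = random.Random(seed)
    return [{e for e, p in edge_probs.items() if rng.random() < p}
            for _ in range(n_samples)]

# Invented toy graph: edge -> existence probability.
g = {("a", "b"): 0.9, ("b", "c"): 0.5, ("a", "c"): 0.1}
instances = sample_instances(g, n_samples=5)
```

In the paper's approach the instances are embedded jointly, and the per-node sample clouds are then visualized by splatting.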
Designing and rehabilitating concrete structures - probabilistic approach
Energy Technology Data Exchange (ETDEWEB)
Edvardsen, C.; Mohr, L. [COWI Consulting Engineers and Planners AS, Lyngby (Denmark)]
2000-07-01
Four examples dealing with corrosion of steel reinforcement in concrete due to chloride ingress are described, using a probabilistic approach which was developed in the recently published DuraCrete Report. The first example illustrates the difference in the required concrete cover dictated by environmental considerations. The second example concerns the update of the service life of the Great Belt Link in Denmark on the basis of measurements made five years after construction. The third example provides some design details of a tunnel in the Netherlands, while the fourth one concerns design of a column taking into account the initiation of corrosion both by means of a partial safety factor and by a probabilistic analysis. Differences in using the probabilistic approach in designing a new structure where the service life and reliability are pre-determined, and rehabilitating an existing structure where an analysis may give the answer to an estimate of the remaining service life and reliability level, are demonstrated. 9 refs., 8 tabs., 6 figs.
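A probabilistic chloride-ingress check of this kind can be sketched with a Fick's-second-law concentration profile and Monte Carlo sampling; all distribution parameters below are illustrative assumptions, not the calibrated DuraCrete values.

```python
import math
import random

def corrosion_initiation_prob(cover_mm, years, runs=20_000, seed=1):
    """Monte Carlo estimate of P(corrosion initiated), i.e. the probability
    that the chloride content at rebar depth exceeds the critical content,
    using the profile C(x, t) = Cs * erfc(x / (2*sqrt(D*t)))."""
    rng = random.Random(seed)
    t = years * 365.25 * 24.0 * 3600.0              # exposure time in seconds
    failures = 0
    for _ in range(runs):
        cs = rng.gauss(0.40, 0.08)                  # surface chloride (% wt. binder)
        ccrit = rng.gauss(0.06, 0.01)               # critical chloride content
        d = 10.0 ** rng.gauss(-12.0, 0.2)           # diffusion coefficient (m^2/s)
        x = rng.gauss(cover_mm, 5.0) / 1000.0       # concrete cover (m)
        if cs * math.erfc(x / (2.0 * math.sqrt(d * t))) >= ccrit:
            failures += 1
    return failures / runs

# A thicker cover gives a lower initiation probability at the same age.
p50 = corrosion_initiation_prob(50, 100)
p75 = corrosion_initiation_prob(75, 100)
```

The same machinery answers both design questions in the abstract: pick the cover for a target probability (new structure), or evaluate the probability at the as-built cover and measured D (existing structure).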
Directory of Open Access Journals (Sweden)
Bellier Joseph
2016-01-01
Full Text Available Hydrological ensemble forecasting performances are analysed over 5 basins up to 2000 km² in the French Upper Rhone region. Streamflow forecasts are issued at an hourly time step from lumped ARX rainfall-runoff models forced by different precipitation forecasts. Ensemble meteorological forecasts from ECMWF and NCEP are considered, as well as analogue-based forecasts fed by their corresponding control forecast. Analogue forecasts are rearranged using an adaptation of the Schaake-Shuffle method in order to ensure temporal coherence. A new evaluation approach is proposed, separating forecasting performances on peak amplitudes and peak timings for high flow events. Evaluation is conducted against both simulated and observed streamflow (so that relative meteorological and hydrological uncertainties can be assessed), by means of CRPS and rank histograms, over the 2007-2014 period. Results show a general agreement of the forecasting performances when averaged over the 5 basins. However, ensemble-based and analogue-based streamflow forecasts produce a different signature on peak events in terms of bias, spread and reliability. Strengths and weaknesses of both approaches are discussed as well as potential improvements, notably towards their merging.
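The CRPS used for evaluation has a convenient sample-based form for an m-member ensemble, CRPS ≈ (1/m) Σ|xᵢ − y| − (1/2m²) ΣΣ|xᵢ − xⱼ|; a minimal sketch with invented streamflow values:

```python
def crps_ensemble(members, obs):
    """Sample-based CRPS for one forecast-observation pair:
    mean |x_i - obs| minus half the mean pairwise member distance.
    Lower is better; for a single member it reduces to absolute error."""
    m = len(members)
    term1 = sum(abs(x - obs) for x in members) / m
    term2 = sum(abs(x - y) for x in members for y in members) / (2.0 * m * m)
    return term1 - term2

# Invented hourly streamflow values (m^3/s) for a 3-member ensemble.
good = crps_ensemble([9.0, 10.0, 11.0], obs=10.0)   # centred on the observation
bad = crps_ensemble([14.0, 15.0, 16.0], obs=10.0)   # same spread, biased
```

Averaging this score over many forecast-observation pairs, and pairing it with rank histograms for reliability, reproduces the kind of evaluation described above.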
230Th/U ages Supporting Hanford Site-Wide Probabilistic Seismic Hazard Analysis
Paces, James B.
2014-01-01
This product represents a USGS Administrative Report that discusses samples and methods used to conduct uranium-series isotope analyses and resulting ages and initial 234U/238U activity ratios of pedogenic cements developed in several different surfaces in the Hanford area during the middle to late Pleistocene. Samples were collected and dated to provide calibration of soil development in surface deposits that are being used in the Hanford Site-Wide probabilistic seismic hazard analysis conducted by AMEC. The report includes description of sample locations and physical characteristics, sample preparation, chemical processing and mass spectrometry, analytical results, and calculated ages for individual sites. Ages of innermost rinds on a number of samples from five sites in eastern Washington are consistent with a range of minimum depositional ages from 17 ka for cataclysmic flood deposits to greater than 500 ka for alluvium at several sites.
230Th/U ages Supporting Hanford Site-Wide Probabilistic Seismic Hazard Analysis
Energy Technology Data Exchange (ETDEWEB)
Paces, James B. [U.S. Geological Survey]
2014-08-31
This product represents a USGS Administrative Report that discusses samples and methods used to conduct uranium-series isotope analyses and resulting ages and initial 234U/238U activity ratios of pedogenic cements developed in several different surfaces in the Hanford area during the middle to late Pleistocene. Samples were collected and dated to provide calibration of soil development in surface deposits that are being used in the Hanford Site-Wide probabilistic seismic hazard analysis conducted by AMEC. The report includes description of sample locations and physical characteristics, sample preparation, chemical processing and mass spectrometry, analytical results, and calculated ages for individual sites. Ages of innermost rinds on a number of samples from five sites in eastern Washington are consistent with a range of minimum depositional ages from 17 ka for cataclysmic flood deposits to greater than 500 ka for alluvium at several sites.
Comparison of a Traditional Probabilistic Risk Assessment Approach with Advanced Safety Analysis
Energy Technology Data Exchange (ETDEWEB)
Smith, Curtis L; Mandelli, Diego; Zhegang Ma
2014-11-01
As part of the Light Water Reactor Sustainability (LWRS) Program [1], the purpose of the Risk Informed Safety Margin Characterization (RISMC) [2] Pathway research and development (R&D) is to support plant decisions for risk-informed margin management, with the aim of improving the economics and reliability, and sustaining the safety, of current NPPs. In this paper, we describe the RISMC analysis process, illustrating how mechanistic and probabilistic approaches are combined in order to estimate a safety margin. We use the scenario of a “station blackout” (SBO), wherein offsite and onsite power are lost, thereby causing a challenge to plant safety systems. We describe the RISMC approach, illustrate the station blackout modeling, and contrast this with traditional risk analysis modeling for this type of accident scenario. We also describe the approach we are using to represent advanced flooding analysis.
Energy Technology Data Exchange (ETDEWEB)
Mandelli, Diego; Prescott, Steven R; Smith, Curtis L; Alfonsi, Andrea; Rabiti, Cristian; Cogliati, Joshua J; Kinoshita, Robert A
2011-07-01
In the Risk Informed Safety Margin Characterization (RISMC) approach we want to understand not just the frequency of an event like core damage, but how close we are (or are not) to key safety-related events and how we might increase our safety margins. The RISMC Pathway uses the probabilistic margin approach to quantify impacts to reliability and safety by coupling both probabilistic (via stochastic simulation) and mechanistic (via physics models) approaches. This coupling takes place through the interchange of physical parameters and operational or accident scenarios. In this paper we apply the RISMC approach to evaluate the impact of a power uprate on a pressurized water reactor (PWR) for a tsunami-induced flooding test case. This analysis is performed using the RISMC toolkit: the RELAP-7 and RAVEN codes. RELAP-7 is the new generation of system analysis codes responsible for simulating the thermal-hydraulic dynamics of PWR and boiling water reactor systems. RAVEN has two capabilities: to act as a controller of the RELAP-7 simulation (e.g., system activation) and to perform statistical analyses (e.g., run multiple RELAP-7 simulations where the sequencing/timing of events has been changed according to a set of stochastic distributions). By using the RISMC toolkit, we can evaluate how the power uprate affects the system recovery measures needed to avoid core damage after the PWR has lost all available AC power due to tsunami-induced flooding. The simulation of the actual flooding is performed using a smoothed-particle hydrodynamics code: NEUTRINO.
Community-based early warning systems for flood risk mitigation in Nepal
Smith, Paul J.; Brown, Sarah; Dugar, Sumit
2017-03-01
This paper focuses on the use of community-based early warning systems for flood resilience in Nepal. The first part of the work outlines the evolution and current status of these community-based systems, highlighting the limited lead times currently available for early warning. The second part of the paper focuses on the development of a robust operational flood forecasting methodology for use by the Nepal Department of Hydrology and Meteorology (DHM) to enhance early warning lead times. The methodology uses data-based physically interpretable time series models and data assimilation to generate probabilistic forecasts, which are presented in a simple visual tool. The approach is designed to work in situations of limited data availability with an emphasis on sustainability and appropriate technology. The successful application of the forecast methodology to the flood-prone Karnali River basin in western Nepal is outlined, increasing lead times from 2-3 to 7-8 h. The challenges faced in communicating probabilistic forecasts to the last mile of the existing community-based early warning systems across Nepal are discussed. The paper concludes with an assessment of the applicability of this approach in basins and countries beyond Karnali and Nepal and an overview of key lessons learnt from this initiative.
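A data-based probabilistic forecast of this flavour can be sketched with a first-order autoregressive stage model whose ensemble spread grows with lead time; `phi`, `sigma`, and the initial stage below are placeholders, not DHM-calibrated values, and a real system would also assimilate the latest gauge data before each issue.

```python
import random

def probabilistic_forecast(last_obs, phi, sigma, horizon, n_members=50, seed=3):
    """Ensemble forecast from a first-order autoregressive stage model
    x[t+1] = phi * x[t] + noise; each member is one noise realisation,
    so the spread (forecast uncertainty) grows with lead time."""
    rng = random.Random(seed)
    members = []
    for _ in range(n_members):
        x, traj = last_obs, []
        for _ in range(horizon):
            x = phi * x + rng.gauss(0.0, sigma)
            traj.append(x)
        members.append(traj)
    return members

# Hypothetical river stage (m) forecast 8 steps ahead from the last observation.
ens = probabilistic_forecast(last_obs=2.0, phi=0.95, sigma=0.05, horizon=8)
```

Per-lead-time quantiles of `ens` give the probability bounds that a simple visual tool can display to forecasters and communities.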
Flash Flooding and 'Muddy Floods' on Arable Land
Boardman, J.
2012-04-01
Flash flooding is often associated with upland, grazed catchments. It does, however, occur in lowland arable-dominated areas. In southern England, notable examples have occurred at Rottingdean (Brighton) in 1987, at Faringdon (Oxfordshire) in 1993 and at Breaky Bottom vineyard (near Brighton) in 1987 and 2000. All resulted in damage to nearby property. Runoff was largely from recently cultivated ground. The characteristics of such floods are: rapid runoff from bare soil surfaces (saturation-excess overland flow is likely in the early parts of storms, but high-intensity rainfall on loamy soils results in crusting and Hortonian overland flow); high rates of erosion; and sediment transport to downvalley sites causing property damage ('muddy flooding'). Muddy floods are known from several areas of Europe, e.g. Belgium, northern France, South Limburg (Netherlands) and Slovakia (Boardman et al., 2006). In other areas they occur but have gone unreported or are classified under different terms. The necessary conditions for occurrence are areas of arable land that are bare at times of the year when there is a risk of storms. For muddy floods to cause damage (and hence be reported), vulnerable property must lie downstream of such areas of arable land. In some areas the incidence of muddy floods relates to autumn and early winter rainfall and winter cereal crops (e.g. southern England). In continental Europe, flooding is more common in summer and is associated with convectional storms and land uses including sugar beet, maize and potatoes. Predictions of increased numbers of high-intensity storms under future climate change suggest that arable areas will continue to generate both flash floods and muddy floods.
Ho Chi Minh City adaptation to increasing risk of coastal and fluvial floods
Scussolini, Paolo; Lasage, Ralph
2016-04-01
Coastal megacities in southeast Asia are a hotspot of vulnerability to floods. In such contexts, the combination of fast socio-economic development and of climate change impacts on precipitation and sea level generates concerns about the flood damage to people and assets. This work focuses on Ho Chi Minh City, Vietnam, for which we estimate the present and future direct risk from river and coastal floods. A model cascade is used that comprises the Saigon river basin and the urban network, plus the land-use-dependent damaging process. Changes in discharge for five return periods are simulated, enabling the probabilistic calculation of the expected annual economic damage to assets, for different scenarios of global emissions, local socio-economic growth, and land subsidence, up to year 2100. The implementation of a range of adaptation strategies is simulated, including building dykes, elevating, creating reservoirs, managing water and sediment upstream, flood-proofing, and halting groundwater abstraction. Results are presented on 1) the relative weight of each future driver in determining the flood risk of Ho Chi Minh City, and 2) the efficiency and feasibility of each adaptation strategy.
Performance of the ARPA-SMR limited-area ensemble prediction system: two flood cases
Directory of Open Access Journals (Sweden)
A. Montani
2001-01-01
Full Text Available The performance of the ARPA-SMR Limited-area Ensemble Prediction System (LEPS), generated by nesting a limited-area model on selected members of the ECMWF targeted ensemble, is evaluated for two flood events that occurred during September 1992. The predictability of the events is studied for forecast times ranging from 2 to 4 days. The extent to which floods localised in time and space can be forecast at high resolution in probabilistic terms is investigated. Rainfall probability maps generated by both LEPS and the ECMWF targeted ensembles are compared for different precipitation thresholds in order to assess the impact of enhanced resolution. At all considered forecast ranges, LEPS performs better, providing a more accurate description of the event with respect to its spatio-temporal location as well as its intensity. In both flood cases, LEPS probability maps turn out to be a very valuable tool to assist forecasters in issuing flood alerts at different forecast ranges. It is also shown that at the shortest forecast range, the deterministic prediction provided by the limited-area model, when run in a higher-resolution configuration, provides a very accurate rainfall pattern and a good quantitative estimate of the total rainfall over the flooded regions.
Improving Gas Flooding Efficiency
Energy Technology Data Exchange (ETDEWEB)
Reid Grigg; Robert Svec; Zheng Zeng; Alexander Mikhalin; Yi Lin; Guoqiang Yin; Solomon Ampir; Rashid Kassim
2008-03-31
This study focuses on laboratory studies with related analytical and numerical models, as well as work with operators on field tests, to enhance our understanding of and capabilities for more efficient enhanced oil recovery (EOR). Much of the work has been performed at reservoir conditions. This includes a bubble chamber and several core-flood apparatuses developed or modified to measure interfacial tension (IFT), critical micelle concentration (CMC), foam durability, surfactant sorption at reservoir conditions, and pressure and temperature effects on foam systems. Carbon dioxide and N2 systems have been considered, under both miscible and immiscible conditions. The injection of CO2 into brine-saturated sandstone and carbonate cores results in brine saturation reduction in the range of 62 to 82% in the tests presented in this paper. In each test, over 90% of the reduction occurred with less than 0.5 PV of CO2 injected, with very little additional brine production after 0.5 PV of CO2 injected. Adsorption of all considered surfactants is a significant problem. Most of the effect is reversible, but the amount required for foaming is large in terms of volume and cost for all considered surfactants. Some foams increase resistance to values beyond what is practical in the reservoir. Sandstone, limestone, and dolomite core samples were tested. Dissolution of reservoir rock and/or cement, especially carbonates, under the acidic conditions of CO2 injection is a potential problem in CO2 injection into geological formations. Another potential change in reservoir injectivity and productivity is the precipitation of dissolved carbonates as the brine flows and pressure decreases. The results of this report provide methods for determining surfactant sorption and can be used to aid in the determination of surfactant requirements for reservoir use in a CO2-foam flood for mobility control. It also provides data to be used to determine rock permeability
The National Flood Interoperability Experiment: Bridging Research and Operations
Salas, F. R.
2015-12-01
The National Weather Service's new National Water Center, located on the University of Alabama campus in Tuscaloosa, will become the nation's hub for comprehensive water resources forecasting. In conjunction with its federal partners the US Geological Survey, Army Corps of Engineers and Federal Emergency Management Agency, the National Weather Service will operationally support both short-term flood prediction and long-term seasonal forecasting of water resource conditions. By summer 2016, the National Water Center will begin evaluating four streamflow data products at the scale of the NHDPlus river reaches (approximately 2.67 million). In preparation for the release of these products, from September 2014 to August 2015, the National Weather Service partnered with the Consortium of Universities for the Advancement of Hydrologic Science, Inc. to support the National Flood Interoperability Experiment, which included a seven-week in-residence Summer Institute in Tuscaloosa for university students interested in learning about operational hydrology and flood forecasting. As part of the experiment, 15-hour forecasts from the operational High Resolution Rapid Refresh atmospheric model were used to drive a three-kilometer Noah-MP land surface model loosely coupled to a RAPID river routing model operating on the NHDPlus dataset. This workflow was run every three hours during the Summer Institute and the results were made available to those engaged to pursue a range of research topics focused on flood forecasting (e.g. reservoir operations, ensemble forecasting, probabilistic flood inundation mapping, rainfall product evaluation, etc.). Although the National Flood Interoperability Experiment was finite in length, it provided a platform through which the academic community could engage federal agencies and vice versa to narrow the gap between research and operations and demonstrate how state-of-the-art research infrastructure, models, services, datasets, etc. could be utilized
Flood hazard and management: a UK perspective.
Wheater, Howard S
2006-08-15
This paper discusses whether flood hazard in the UK is increasing and considers issues of flood risk management. Urban development is known to increase fluvial flood frequency, hence design measures are routinely implemented to minimize the impact. Studies suggest that historical effects, while potentially large at small scale, are not significant for large river basins. Storm water flooding within the urban environment is an area where flood hazard is inadequately defined; new methods are needed to assess and manage flood risk. Development on flood plains has led to major capital expenditure on flood protection, but government is attempting to strengthen the planning role of the environmental regulator to prevent this. Rural land use management has intensified significantly over the past 30 years, leading to concerns that flood risk has increased, at least at local scale; the implications for catchment-scale flooding are unclear. New research is addressing this issue, and more broadly, the role of land management in reducing flood risk. Climate change impacts on flooding and current guidelines for UK practice are reviewed. Large uncertainties remain, not least for the occurrence of extreme precipitation, but precautionary guidance is in place. Finally, current levels of flood protection are discussed. Reassessment of flood hazard has led to targets for increased flood protection, but despite important developments to communicate flood risk to the public, much remains to be done to increase public awareness of flood hazard.
Somerset County Flood Information System
Hoppe, Heidi L.
2007-01-01
The timely warning of a flood is crucial to the protection of lives and property. One has only to recall the floods of August 2, 1973, September 16 and 17, 1999, and April 16, 2007, in Somerset County, New Jersey, in which lives were lost and major property damage occurred, to realize how costly, especially in terms of human life, an unexpected flood can be. Accurate forecasts and warnings cannot be made, however, without detailed information about precipitation and streamflow in the drainage basin. Since the mid-1960s, the National Weather Service (NWS) has been able to forecast flooding on larger streams in Somerset County, such as the Raritan and Millstone Rivers. Flooding on smaller streams in urban areas was more difficult to predict. In response to this problem, the NWS, in cooperation with the Green Brook Flood Control Commission, installed a precipitation gage in North Plainfield, and two flash-flood alarms, one on Green Brook at Seeley Mills and one on Stony Brook at Watchung, in the early 1970s. In 1978, New Jersey's first countywide flood-warning system was installed by the U.S. Geological Survey (USGS) in Somerset County. This system consisted of a network of eight stage and discharge gages equipped with precipitation gages linked by telephone telemetry and eight auxiliary precipitation gages. The gages were installed throughout the county to collect precipitation and runoff data that could be used to improve flood-monitoring capabilities and flood-frequency estimates. Recognizing the need for more detailed hydrologic information for Somerset County, the USGS, in cooperation with Somerset County, designed and installed the Somerset County Flood Information System (SCFIS) in 1990. This system is part of a statewide network of stream gages, precipitation gages, weather stations, and tide gages that collect data in real time. The data provided by the SCFIS improve the flood forecasting ability of the NWS and aid Somerset County and municipal agencies in
Energy Technology Data Exchange (ETDEWEB)
Kwag, Shinyoung [North Carolina State University, Raleigh, NC 27695 (United States); Korea Atomic Energy Research Institute, Daejeon 305-353 (Korea, Republic of); Gupta, Abhinav, E-mail: agupta1@ncsu.edu [North Carolina State University, Raleigh, NC 27695 (United States)
2017-04-15
Highlights: • This study presents the development of a Bayesian framework for probabilistic risk assessment (PRA) of structural systems under multiple hazards. • The concepts of Bayesian network and Bayesian inference are combined by mapping the traditionally used fault trees into a Bayesian network. • The proposed mapping allows for consideration of dependencies as well as correlations between events. • Incorporation of Bayesian inference permits a novel way for exploration of a scenario that is likely to result in a system-level “vulnerability.” - Abstract: Conventional probabilistic risk assessment (PRA) methodologies (USNRC, 1983; IAEA, 1992; EPRI, 1994; Ellingwood, 2001) conduct risk assessment for different external hazards by considering each hazard separately and independently of the others. The risk metric for a specific hazard is evaluated by a convolution of the fragility and the hazard curves. The fragility curve for a basic event is obtained by using empirical, experimental, and/or numerical simulation data for a particular hazard. Treating each hazard independently can be inappropriate in some cases, as certain hazards are statistically correlated or dependent. Examples of such correlated events include, but are not limited to, flooding-induced fire, seismically induced internal or external flooding, or even seismically induced fire. In the current practice, system-level risk and consequence sequences are typically calculated using logic trees to express the causative relationship between events. In this paper, we present the results from a study on multi-hazard risk assessment that is conducted using a Bayesian network (BN) with Bayesian inference. The framework can consider statistical dependencies among risks from multiple hazards, allows updating by considering newly available data/information at any level, and provides a novel way to explore alternative failure scenarios that may exist due to vulnerabilities.
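The single-hazard convolution of hazard and fragility curves mentioned above can be sketched as a discrete sum over intensity bins; the power-law hazard and lognormal fragility below are invented illustrations, not values from any plant study.

```python
import math

def annual_failure_freq(hazard, fragility, levels):
    """Discrete convolution of a hazard curve (annual exceedance frequency
    H(a) of intensity a) with a fragility curve P(failure | a): sum the
    fragility at each bin midpoint times the annual frequency of that bin."""
    freq = 0.0
    for lo, hi in zip(levels, levels[1:]):
        freq += fragility(0.5 * (lo + hi)) * (hazard(lo) - hazard(hi))
    return freq

# Invented curves: power-law hazard, lognormal fragility (median 0.5 g, beta 0.4).
hazard = lambda a: 1e-4 * (0.1 / a) ** 2
fragility = lambda a: 0.5 * (1.0 + math.erf(math.log(a / 0.5) / (0.4 * math.sqrt(2.0))))
levels = [0.1 * 1.2 ** i for i in range(40)]
risk = annual_failure_freq(hazard, fragility, levels)
```

The Bayesian-network formulation in the paper generalizes exactly this per-hazard computation so that statistically dependent hazards and shared basic events can be treated jointly and updated with new data.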
Lateral Flooding Associated with Wave Flood Generation on River Surface
Ramírez-Núñez, C.; Parrot, J.-F.
2016-06-01
This research provides a wave flood simulation using a high-resolution LiDAR Digital Terrain Model. The simulation is based on the generation of waves of different amplitudes that modify the river level in such a way that water invades the adjacent areas. The proposed algorithm first reconstitutes the original river surface of the studied river section and then defines the percentage of water loss as the wave floods move downstream. This procedure was applied to a gently sloping area in the lower basin of the Coatzacoalcos River, Veracruz (Mexico), defining the successive areas where lateral flooding occurs during the downstream movement.
Developing consistent scenarios to assess flood hazards in mountain streams.
Mazzorana, B; Comiti, F; Scherer, C; Fuchs, S
2012-02-01
The characterizing feature of extreme events in steep mountain streams is the multiplicity of possible tipping process patterns, such as those involving sudden morphological changes due to intense local erosion, aggradation, or clogging of critical flow sections by wood accumulations. Resolving a substantial part of the uncertainties underlying these hydrological cause-effect chains is a major challenge for flood risk management. Our contribution is methodological: an expert-based procedure to unfold natural hazard process scenarios in mountain streams and retrace their probabilistic structure. As a first step we set up a convenient system representation for natural hazard process routing. In this setting, as a second step, we derive the possible, and thus consistent, natural hazard process patterns by means of Formative Scenario Analysis. In a last step, hazard assessment is refined by providing, through expert elicitation, the spatial probabilistic structure of individual scenario trajectories. As a complement to the theory, the applicability of the method is shown through embedded examples. To conclude, we discuss the major advantages of the presented methodological approach for hazard assessment compared to traditional approaches, and with respect to the risk governance process.
The Decidability Frontier for Probabilistic Automata on Infinite Words
Chatterjee, Krishnendu; Tracol, Mathieu
2011-01-01
We consider probabilistic automata on infinite words with acceptance defined by safety, reachability, Büchi, co-Büchi, and limit-average conditions. We consider quantitative and qualitative decision problems. We present extensions and adaptations of proofs for probabilistic finite automata and present a complete characterization of the decidability and undecidability frontier of the quantitative and qualitative decision problems for probabilistic automata on infinite words.
Towards the development of a global probabilistic tsunami risk assessment methodology
Schaefer, Andreas; Daniell, James; Wenzel, Friedemann
2017-04-01
The assessment of tsunami risk is on many levels still ambiguous and under discussion. Over the last two decades, various methodologies and models have been developed to quantify tsunami risk, most of the time on a local or regional level, with either a deterministic or probabilistic background. Probabilistic modelling faces significant difficulties, as the underlying tsunami hazard modelling demands an immense amount of computational time and thus limits the assessment substantially: it is often restricted to institutes with supercomputing access, or modellers are forced to reduce the modelling resolution quantitatively or qualitatively. Furthermore, data on the vulnerability of infrastructure and buildings are empirically limited to a few disasters in recent years. Thus, a reliable quantification of socio-economic vulnerability is still questionable. Nonetheless, significant improvements have recently been made on both the methodological and the computational side. This study introduces a methodological framework for a globally uniform probabilistic tsunami risk assessment. Here, the power of recently developed hardware for desktop-based parallel computing plays a crucial role in the calculation of numerical tsunami wave propagation, while large-scale parametric models and paleo-seismological data enhance the return period assessment of tsunamigenic megathrust earthquake events. Adaptation of empirical tsunami vulnerability functions in conjunction with methodologies from flood modelling supports a more reliable vulnerability quantification. In addition, methodologies for exposure modelling in coastal areas are introduced, focusing on the diversity of coastal exposure landscapes and data availability. Overall, this study gives a first overview of how a global tsunami risk modelling framework may be accomplished, covering methodological, computational and data-driven aspects.
Probabilistic sensitivity analysis of dune erosion calculations
Den Heijer, C.; Van de Graaff, J.; Van Gelder, P.H.A.J.M.
2008-01-01
Coastal dunes protect low lying coastal areas against the sea. Extreme waves and water levels during severe storms may cause breaching of the dunes. Consequently, serious damage due to flooding and direct wave attack could occur, resulting in loss of life and property. Proper coastal management impl
Probabilistic Projections of Climate Change Impacts on the Agricultural Sector in Bangladesh
Ruane, A. C.; Rosenzweig, C.; Major, D. C.
2008-12-01
We describe a novel approach to impact assessment that generates probabilistic distributions of climate change impacts by passing model and societal uncertainties in a continuous manner throughout the assessment process. Rather than driving impact models with conditions based upon summary statistics from an ensemble of global climate models (GCMs) or relying on a prescribed range of inputs, end-to-end assessment is conducted for a wide variety of GCMs and emissions scenarios. The resulting distribution of impacts may be used to elucidate internal dynamics of the system and to attach model and societal-based probabilities to individual outcomes. To demonstrate the method, preliminary results from a World Bank project on the effect of climate change on Bangladesh's agricultural sector are presented. Working with a wide range of collaborators in Bangladesh, 48 climate change scenarios (16 GCMs and 3 emissions scenarios) were generated from 2020-2100 for each of 16 regions in Bangladesh. These scenarios were then used to drive the Decision Support System for Agrotechnology Transfer (DSSAT) biophysical model for major cereal crops. Output generated from a smaller subset of hydrologic and coastal model scenarios is then used to adjust the yield production to account for projected river floods in the Ganges/Brahmaputra/Meghna basin and coastal inundation from the Bay of Bengal, respectively. The result is a probabilistic distribution of agricultural impacts for Bangladesh that retains model and societal uncertainties throughout the assessment process.
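The end-to-end approach above can be sketched in miniature: drive an impact model with each of the 48 climate members and report the resulting empirical distribution rather than summary statistics. The impact function and warming values below are illustrative stand-ins, not DSSAT or the project's actual scenarios.

```python
import random
import statistics

def impact_model(delta_t, rng):
    """Stand-in crop-yield impact model: yield change (%) as a noisy
    decreasing function of warming (illustrative, not DSSAT)."""
    return -5.0 * delta_t + rng.gauss(0.0, 2.0)

rng = random.Random(7)
# 16 GCMs x 3 emissions scenarios = 48 end-to-end members (assumed warmings)
warmings = [rng.uniform(1.0, 4.0) for _ in range(48)]
impacts = sorted(impact_model(dt, rng) for dt in warmings)

# Report the distribution, not a single central estimate
lo, med, hi = impacts[4], statistics.median(impacts), impacts[-5]
print(f"~10th/50th/90th percentile yield change: {hi:.1f} / {med:.1f} / {lo:.1f} %")
```

Because every member is carried through to the end, probabilities can later be attached to individual outcomes instead of being lost in an ensemble mean.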
Probabilistic high-resolution forecast of heavy precipitation over Central Europe
Directory of Open Access Journals (Sweden)
C. Marsigli
2004-01-01
Full Text Available The limited-area ensemble prediction system COSMO-LEPS has been running operationally at ECMWF since November 2002. Five runs of the non-hydrostatic limited-area model Lokal Modell (LM) are available every day, nested on five selected members of three consecutive 12-h lagged ECMWF global ensembles. The limited-area ensemble forecasts range up to 120 h, and LM-based probabilistic products are disseminated to several national weather services. COSMO-LEPS has been constructed to provide a probabilistic system with high resolution, focusing attention on extreme events in regions with complex orography. In this paper, the performance of COSMO-LEPS for a heavy precipitation event that affected Central Europe in August 2002 is examined. At the 4-day forecast range, the probability maps indicate the possibility of exceeding high precipitation thresholds (up to 150 mm/24 h) over the region actually affected by the flood. Furthermore, one out of the five ensemble members predicts, 4 days ahead, a precipitation structure very similar to the observed one.
Probabilistic Dressing of a Storm Surge Prediction in the Adriatic Sea
Directory of Open Access Journals (Sweden)
R. Mel
2016-01-01
Full Text Available Providing a reliable, accurate, and fully informative storm surge forecast is of paramount importance for managing the hazards threatening coastal environments. Specifically, a reliable probabilistic forecast is crucial for the management of the movable barriers that are planned to become operational in 2018 for the protection of Venice and its lagoon. However, a probabilistic forecast requires multiple simulations and considerable computational time, which makes it expensive in real-time applications. This paper describes the ensemble dressing method, a cheap operational flood prediction system that includes information about the uncertainty of the ensemble members by computing it directly from the meteorological input and the local spread distribution, without requiring multiple forecasts. A sophisticated error distribution form is developed that superposes two sources of uncertainty: inaccuracies of the ensemble prediction system, which depend on surge level and lead time, and uncertainty of the meteorological forcing, which is described using a combination of cross-basin pressure gradients. The ensemble dressing is validated over a 3-month-long period in 2010, during which an exceptional sequence of storm surges occurred. Results demonstrate that this computationally cheap method can provide an acceptably realistic estimate of the uncertainty.
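A minimal sketch of the dressing idea: a single deterministic surge forecast is turned into a probabilistic one by adding samples from an error distribution whose spread grows with surge level and lead time. The spread coefficients below are assumptions for illustration, not the paper's calibrated error model.

```python
import random

def dress_forecast(surge_m, lead_h, n=1000, seed=0):
    """Dress one deterministic surge forecast with Gaussian errors whose
    standard deviation depends on surge level and lead time.
    The spread model is an illustrative assumption, not the paper's form."""
    sigma = 0.05 + 0.02 * surge_m + 0.01 * lead_h   # metres (assumed)
    rng = random.Random(seed)
    return [surge_m + rng.gauss(0.0, sigma) for _ in range(n)]

# Probability of exceeding a barrier-closure level from the dressed members
members = dress_forecast(surge_m=1.2, lead_h=12)
exceed = sum(m > 1.3 for m in members) / len(members)
print(f"P(surge > 1.30 m) ~ {exceed:.2f}")
```

The appeal is the cost: one model run plus cheap sampling, instead of a full multi-member ensemble.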
Socio-hydrological flood models
Barendrecht, Marlies; Viglione, Alberto; Blöschl, Günter
2017-04-01
Long-term feedbacks between humans and floods may lead to complex phenomena such as coping strategies, levee effects, call effects, adaptation effects, and poverty traps. Such phenomena cannot be represented by traditional flood risk approaches that are based on scenarios. Instead, dynamic models of the coupled human-flood interactions are needed. These types of models should include both social and hydrological variables as well as other relevant variables, such as economic, environmental, political or technical, in order to adequately represent the feedbacks and processes that are of importance in human-flood systems. These socio-hydrological models may play an important role in integrated flood risk management by exploring a wider range of possible futures, including unexpected phenomena, than is possible by creating and studying scenarios. New insights might come to light about the long term effects of certain measures on society and the natural system. Here we discuss a dynamic framework for flood risk and review the models that are presented in literature. We propose a way forward for socio-hydrological modelling of the human-flood system.
Probabilistic logic networks a comprehensive framework for uncertain inference
Goertzel, Ben; Goertzel, Izabela Freire; Heljakka, Ari
2008-01-01
This comprehensive book describes Probabilistic Logic Networks (PLN), a novel conceptual, mathematical and computational approach to uncertain inference. A broad scope of reasoning types are considered.
Probabilistic structural analysis algorithm development for computational efficiency
Wu, Y.-T.
1991-01-01
The PSAM (Probabilistic Structural Analysis Methods) program is developing a probabilistic structural risk assessment capability for the SSME components. An advanced probabilistic structural analysis software system, NESSUS (Numerical Evaluation of Stochastic Structures Under Stress), is being developed as part of the PSAM effort to accurately simulate stochastic structures operating under severe random loading conditions. One of the challenges in developing the NESSUS system is devising probabilistic algorithms that provide both efficiency and accuracy. The main probability algorithms developed and implemented in the NESSUS system are efficient but approximate in nature. Over the last six years, these algorithms have been improved significantly.
Omira, R.; Matias, L.; Baptista, M. A.
2016-12-01
This study constitutes a preliminary assessment of probabilistic tsunami inundation in the NE Atlantic region. We developed an event-tree approach to calculate the likelihood of tsunami flood occurrence and exceedance of a specific near-shore wave height for a given exposure time. Only tsunamis of tectonic origin are considered here, taking into account local, regional, and far-field sources. The approach used here consists of an event-tree method that gathers probability models for seismic sources, tsunami numerical modeling, and statistical methods. It also includes a treatment of aleatoric uncertainties related to source location and tidal stage. Epistemic uncertainties are not addressed in this study. The methodology is applied to the coastal test-site of Sines located in the NE Atlantic coast of Portugal. We derive probabilistic high-resolution maximum wave amplitudes and flood distributions for the study test-site considering 100- and 500-year exposure times. We find that the probability that maximum wave amplitude exceeds 1 m somewhere along the Sines coasts reaches about 60 % for an exposure time of 100 years and is up to 97 % for an exposure time of 500 years. The probability of inundation occurrence (flow depth >0 m) varies between 10 % and 57 %, and from 20 % up to 95 % for 100- and 500-year exposure times, respectively. No validation has been performed here with historical tsunamis. This paper illustrates a methodology through a case study, which is not an operational assessment.
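The exposure-time probabilities quoted above are consistent with a stationary Poisson occurrence model, in which an annual exceedance rate λ maps to P = 1 − exp(−λT) over an exposure time of T years. A sketch with an assumed rate (not a value from the paper):

```python
import math

def exceedance_probability(annual_rate, exposure_years):
    """Probability of at least one exceedance during the exposure window,
    assuming a stationary Poisson occurrence model."""
    return 1.0 - math.exp(-annual_rate * exposure_years)

# Illustrative: an assumed annual rate of 0.0092 for waves above 1 m
# yields probabilities of the order reported in the abstract.
p100 = exceedance_probability(0.0092, 100)
p500 = exceedance_probability(0.0092, 500)
print(f"100-year exposure: {p100:.0%}, 500-year exposure: {p500:.0%}")
```

Note how modest annual rates compound into near-certain exceedance over long exposure times, which is why the 500-year figures approach 100%.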
Salagnac, J.-L.; Diez, J.; Tourbier, J.
2012-04-01
Flooding has always been a major risk world-wide. Humans chose to live and develop settlements close to water (rivers, seas) because of the resources water brings: food, energy, the capacity to transport persons and goods economically, and recreation. However, the risk from flooding, including pluvial flooding, often offsets these huge advantages. Floods sometimes have terrible consequences from both a human and an economic point of view. The permanence and growth of urban areas in flood-prone zones despite these risks is a clear indication of the choices of the human groups concerned. The observed growing concentration of population along the sea shore, the increase of urban population worldwide, the exponential growth of the world population and possibly climate change are factors confirming that flooding will remain a major issue for the coming decades. Flood management systems are designed and implemented to cope with such situations. In spite of frequent events, lessons appear difficult to draw and progress is rather slow. The list of potential triggers to improve flood management systems is nevertheless well established: information, education, awareness raising, alert, prevention, protection, feedback from events, ... Many disciplines are involved, covering a wide range of soft and hard sciences. A huge amount of both printed and electronic literature is available. Regulations are abundant. In spite of all these potentially favourable elements, similar questions spring up after each new significant event: • Was the event forecast precise enough? • Was the alert system efficient? • Why were buildings built in identified flood-prone areas? • Why did the concerned population not follow instructions? • Why did the dike break? • What should we do to prevent it happening again? • What about damage evaluation, waste and debris evacuation, infrastructure and building repair, activity recovery, temporary relocation of inhabitants, health concerns, insurance
Adaptive flood risk management in urban areas
Mees, H.L.P.; Driessen, P.P.J.; Runhaar, H.A.C.
2012-01-01
In recent times a shift has occurred from traditional flood management focused on the prevention of flooding (reduction of the probability) only, to more adaptive strategies focused on the reduction of the impacts of floods as a means to improve the resilience of occupied flood plains to increased r
Safety in the Chemical Laboratory: Flood Control.
Pollard, Bruce D.
1983-01-01
Describes events leading to a flood in the Wehr Chemistry Laboratory at Marquette University, discussing steps taken to minimize damage upon discovery. Analyzes the problem of flooding in the chemical laboratory and outlines seven steps of flood control: prevention; minimization; early detection; stopping the flood; evaluation; clean-up; and…
Local Flood Action Groups: Governance And Resilience
Forrest, Steven; Trell, Elen-Maarja; Woltjer, Johan; Macoun, Milan; Maier, Karel
2015-01-01
A diverse range of citizen groups focusing on flood risk management have been identified in several European countries. The paper discusses the role of flood action (citizen) groups in the context of flood resilience and will do this by analysing the UK and its diverse range of flood groups. These c
Probabilistic forecasts based on radar rainfall uncertainty
Liguori, S.; Rico-Ramirez, M. A.
2012-04-01
The potential advantages resulting from integrating weather radar rainfall estimates in hydro-meteorological forecasting systems are limited by the inherent uncertainty affecting radar rainfall measurements, which is due to various sources of error [1-3]. The improvement of quality control and correction techniques is recognized to play a role in the future improvement of radar-based flow predictions. However, knowledge of the uncertainty affecting radar rainfall data can also be used effectively to build a hydro-meteorological forecasting system in a probabilistic framework. This work discusses the results of the implementation of a novel probabilistic forecasting system developed to improve ensemble predictions over a small urban area located in the North of England. An ensemble of radar rainfall fields can be determined as the sum of a deterministic component and a perturbation field, the latter being informed by knowledge of the spatial-temporal characteristics of the radar error assessed with reference to rain-gauge measurements. This approach is similar to the REAL system [4] developed for use in the Southern Alps. The radar uncertainty estimate can then be propagated with a nowcasting model, used to extrapolate an ensemble of radar rainfall forecasts, which can ultimately drive hydrological ensemble predictions. A radar ensemble generator has been calibrated using radar rainfall data made available by the UK Met Office after applying post-processing and correction algorithms [5-6]. One-hour rainfall accumulations from 235 rain gauges recorded during the year 2007 have provided the reference to determine the radar error. Statistics describing the spatial characteristics of the error (i.e. mean and covariance) have been computed off-line at gauge locations, along with the parameters describing the error temporal correlation. A system has then been set up to impose the space-time error properties on stochastic perturbations, generated in real-time at
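The ensemble construction described above (deterministic field plus spatially correlated perturbations with gauge-derived mean and covariance) can be sketched as follows; the field and covariance values are assumptions for illustration, not the calibrated UK statistics.

```python
import numpy as np

def radar_ensemble(deterministic_field, error_mean, error_cov, n_members, seed=0):
    """Generate an ensemble of radar rainfall fields as the deterministic
    estimate plus spatially correlated Gaussian perturbations whose mean and
    covariance would, in practice, be assessed against rain-gauge data."""
    rng = np.random.default_rng(seed)
    perts = rng.multivariate_normal(error_mean, error_cov, size=n_members)
    return deterministic_field[None, :] + perts

# Toy 3-pixel field with spatially correlated errors (assumed values, mm/h)
det = np.array([2.0, 3.0, 1.5])
mean = np.zeros(3)
cov = np.array([[0.4, 0.2, 0.1],
                [0.2, 0.4, 0.2],
                [0.1, 0.2, 0.4]])
ens = radar_ensemble(det, mean, cov, n_members=500)
print(ens.shape)
```

Each ensemble member can then be advected by a nowcasting model to produce the rainfall forecasts that drive the hydrological ensemble.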
Flood damage, vulnerability and risk perception - challenges for flood damage research
2005-01-01
The current state-of-the-art in flood damage analysis mainly focuses on the economic evaluation of tangible flood effects. It is contended in this discussion paper that important economic, social and ecological aspects of flood-related vulnerabilities are neglected. It is a challenge for flood research to develop a wider perspective for flood damage evaluation.
A Framework for Flood Risk Analysis and Benefit Assessment of Flood Control Measures in Urban Areas.
Li, Chaochao; Cheng, Xiaotao; Li, Na; Du, Xiaohe; Yu, Qian; Kan, Guangyuan
2016-08-05
Flood risk analysis is more complex in urban areas than in rural areas because of their closely packed buildings, varied land uses, and large number of flood control works and drainage systems. The purpose of this paper is to propose a practical framework for flood risk analysis and benefit assessment of flood control measures in urban areas. Based on the concept of the disaster risk triangle (hazard, vulnerability and exposure), a comprehensive analysis method and a general procedure were proposed for urban flood risk analysis. The Urban Flood Simulation Model (UFSM) and the Urban Flood Damage Assessment Model (UFDAM) were integrated to estimate the flood risk in the Pudong flood protection area (Shanghai, China). S-shaped functions were adopted to represent flood return period and damage (R-D) curves. The study results show that flood control works could significantly reduce the flood risk within the 66-year flood return period, by 15.59%. However, the flood risk was reduced by only 7.06% when the flood return period exceeded 66 years. Hence, it is difficult to meet the increasing demands for flood control solely by relying on structural measures. The R-D function is suitable for describing the changes in flood control capacity. This framework can assess the flood risk reduction due to flood control measures and provide crucial information for strategy development and planning adaptation.
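Risk-reduction figures like those above can be derived by integrating a return-period/damage (R-D) curve over annual exceedance probability to obtain expected annual damage. A hedged sketch with an illustrative logistic (S-shaped) curve, not the study's calibrated one:

```python
import math

def damage(return_period, d_max=100.0, k=0.08, t_mid=50.0):
    """Illustrative S-shaped R-D curve (logistic in return period);
    all parameter values are assumptions, not the study's fit."""
    return d_max / (1.0 + math.exp(-k * (return_period - t_mid)))

def expected_annual_damage(damage_fn, t_min=2, t_max=1000, steps=2000):
    """Trapezoidal integration of damage over annual exceedance
    probability p = 1/T, between return periods t_min and t_max."""
    step = (1.0 / t_min - 1.0 / t_max) / steps
    ps = [1.0 / t_min - i * step for i in range(steps + 1)]   # descending p
    vals = [damage_fn(1.0 / p) for p in ps]
    ead = 0.0
    for i in range(steps):
        ead += 0.5 * (vals[i] + vals[i + 1]) * (ps[i] - ps[i + 1])
    return ead

print(f"EAD ~ {expected_annual_damage(damage):.2f} (damage units/yr)")
```

Comparing the integral with and without a flood-control measure (i.e. with a shifted R-D curve) gives the percentage risk reduction reported in studies of this kind.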
Anthropogenic greenhouse gas contribution to flood risk in England and Wales in autumn 2000.
Pall, Pardeep; Aina, Tolu; Stone, Dáithí A; Stott, Peter A; Nozawa, Toru; Hilberts, Arno G J; Lohmann, Dag; Allen, Myles R
2011-02-17
Interest in attributing the risk of damaging weather-related events to anthropogenic climate change is increasing. Yet climate models used to study the attribution problem typically do not resolve the weather systems associated with damaging events such as the UK floods of October and November 2000. Occurring during the wettest autumn in England and Wales since records began in 1766, these floods damaged nearly 10,000 properties across that region, disrupted services severely, and caused insured losses estimated at £1.3 billion (refs 5, 6). Although the flooding was deemed a 'wake-up call' to the impacts of climate change at the time, such claims are typically supported only by general thermodynamic arguments that suggest increased extreme precipitation under global warming, but fail to account fully for the complex hydrometeorology associated with flooding. Here we present a multi-step, physically based 'probabilistic event attribution' framework showing that it is very likely that global anthropogenic greenhouse gas emissions substantially increased the risk of flood occurrence in England and Wales in autumn 2000. Using publicly volunteered distributed computing, we generate several thousand seasonal-forecast-resolution climate model simulations of autumn 2000 weather, both under realistic conditions, and under conditions as they might have been had these greenhouse gas emissions and the resulting large-scale warming never occurred. Results are fed into a precipitation-runoff model that is used to simulate severe daily river runoff events in England and Wales (proxy indicators of flood events). The precise magnitude of the anthropogenic contribution remains uncertain, but in nine out of ten cases our model results indicate that twentieth-century anthropogenic greenhouse gas emissions increased the risk of floods occurring in England and Wales in autumn 2000 by more than 20%, and in two out of three cases by more than 90%.
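At its core, probabilistic event attribution compares how often a threshold-exceeding event occurs in the factual (with greenhouse gas forcing) ensemble versus the counterfactual one. A minimal sketch, with synthetic Gaussian ensembles standing in for the thousands of simulated runoff values:

```python
import random

def risk_ratio(factual, counterfactual, threshold):
    """Ratio of exceedance probabilities between the factual ensemble
    (with anthropogenic forcing) and the counterfactual ensemble."""
    p1 = sum(x > threshold for x in factual) / len(factual)
    p0 = sum(x > threshold for x in counterfactual) / len(counterfactual)
    return p1 / p0

# Synthetic ensembles standing in for simulated severe runoff (assumed)
rng = random.Random(1)
factual = [rng.gauss(1.1, 0.3) for _ in range(5000)]
counterfactual = [rng.gauss(1.0, 0.3) for _ in range(5000)]
rr = risk_ratio(factual, counterfactual, threshold=1.5)
print(f"risk ratio ~ {rr:.2f}")  # >1 indicates increased flood risk
```

A risk ratio above 1.2 corresponds to the "more than 20%" increase quoted in the abstract; the fraction of attributable risk is FAR = 1 − 1/RR.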
A probabilistic tsunami hazard assessment for Indonesia
Horspool, N.; Pranantyo, I.; Griffin, J.; Latief, H.; Natawidjaja, D. H.; Kongko, W.; Cipta, A.; Bustaman, B.; Anugrah, S. D.; Thio, H. K.
2014-11-01
Probabilistic hazard assessments are a fundamental tool for assessing the threats posed by hazards to communities and are important for underpinning evidence-based decision-making regarding risk mitigation activities. Indonesia has been the focus of intense tsunami risk mitigation efforts following the 2004 Indian Ocean tsunami, but this has been largely concentrated on the Sunda Arc, with little attention to other tsunami-prone areas of the country such as eastern Indonesia. We present the first nationally consistent probabilistic tsunami hazard assessment (PTHA) for Indonesia. This assessment produces time-independent forecasts of tsunami hazards at the coast using data from tsunamis generated by local, regional and distant earthquake sources. The methodology is based on the established Monte Carlo approach to probabilistic seismic hazard assessment (PSHA) and has been adapted to tsunami. We account for sources of epistemic and aleatory uncertainty in the analysis through the use of logic trees and sampling of probability density functions. For short return periods (100 years) the highest tsunami hazard is along the west coast of Sumatra, the south coast of Java and the north coast of Papua. For longer return periods (500-2500 years), the tsunami hazard is highest along the Sunda Arc, reflecting the larger maximum magnitudes there. The annual probability of experiencing a tsunami with a height of >0.5 m at the coast is greater than 10% for Sumatra, Java, the Sunda islands (Bali, Lombok, Flores, Sumba) and north Papua. The annual probability of experiencing a tsunami with a height of >3.0 m, which would cause significant inundation and fatalities, is 1-10% in Sumatra, Java, Bali, Lombok and north Papua, and 0.1-1% for north Sulawesi, Seram and Flores. The results of this national-scale hazard assessment provide evidence for disaster managers to prioritise regions for risk mitigation activities and/or more detailed hazard or risk assessment.
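The Monte Carlo PSHA-style machinery referred to above repeatedly samples event magnitudes from a magnitude-frequency law. A sketch of inverse-transform sampling from a truncated Gutenberg-Richter distribution (the b-value and magnitude bounds are assumptions for illustration):

```python
import math
import random

def sample_gr_magnitude(b_value, m_min, m_max, rng):
    """Inverse-transform sample from a truncated Gutenberg-Richter
    magnitude-frequency distribution (exponential in magnitude)."""
    beta = b_value * math.log(10.0)
    u = rng.random()
    # Invert the truncated-exponential CDF for magnitude
    c = 1.0 - math.exp(-beta * (m_max - m_min))
    return m_min - math.log(1.0 - u * c) / beta

rng = random.Random(42)
mags = [sample_gr_magnitude(1.0, 7.0, 9.5, rng) for _ in range(100000)]
# Smaller magnitudes dominate, as expected for a GR distribution
print(sum(m < 8.0 for m in mags) / len(mags))
```

In a full PTHA each sampled event would then be paired with a simulated tsunami, and logic-tree branches would vary the b-value and maximum magnitude to capture epistemic uncertainty.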
A comparison of confluence and ample sets in probabilistic and non-probabilistic branching time
Hansen, Henri; Timmer, Mark; Massink, M.; Norman, G.; Wiklicky, H.
2014-01-01
Confluence reduction and partial order reduction by means of ample sets are two different techniques for state space reduction in both traditional and probabilistic model checking. This paper provides an extensive comparison between these two methods, and answers the question how they relate in term
Hansen, Henri; Timmer, Mark
2012-01-01
Confluence reduction and partial order reduction by means of ample sets are two different techniques for state space reduction in both traditional and probabilistic model checking. This presentation provides an extensive comparison between these two methods, answering the long-standing question of h
Urban flood risk assessment using sewer flooding databases.
Caradot, Nicolas; Granger, Damien; Chapgier, Jean; Cherqui, Frédéric; Chocat, Bernard
2011-01-01
Sustainable water management is a global challenge for the 21st century. One key aspect remains protection against urban flooding. The main objective is to ensure or maintain an adequate level of service for all inhabitants. However, level of service is still difficult to assess and the high-risk locations difficult to identify. In this article, we propose a methodology, which (i) allows water managers to measure the service provided by the urban drainage system with regard to protection against urban flooding; and (ii) helps stakeholders to determine effective strategies for improving the service provided. One key aspect of this work is to use a database of sewer flood event records to assess flood risk. Our methodology helps urban water managers to assess the risk of sewer flooding; this approach does not seek to predict flooding but rather to inform decision makers on the current level of risk and on actions which need to be taken to reduce the risk. This work is based on a comprehensive definition of risk, including territorial vulnerability and perceptions of urban water stakeholders. This paper presents the results and the methodological contributions from implementing the methodology on two case studies: the cities of Lyon and Mulhouse.
Dugar, Sumit; Smith, Paul; Parajuli, Binod; Khanal, Sonu; Brown, Sarah; Gautam, Dilip; Bhandari, Dinanath; Gurung, Gehendra; Shakya, Puja; Kharbuja, RamGopal; Uprety, Madhab
2017-04-01
Operationalising effective Flood Early Warning Systems (EWS) in developing countries like Nepal poses numerous challenges: complex topography and geology, a sparse network of river and rainfall gauging stations, and diverse socio-economic conditions. Despite these challenges, simple real-time monitoring-based EWSs have been in place for the past decade. A key constraint of these simple systems is the very limited lead time for response - as little as 2-3 hours, especially for rivers originating from steep mountainous catchments. Efforts to increase lead time for early warning are focusing on embedding forecasts into the existing early warning systems. In 2016, the Nepal Department of Hydrology and Meteorology (DHM) piloted an operational Probabilistic Flood Forecasting Model in major river basins across Nepal. This comprised a low-data approach to forecast water levels, developed jointly through a research/practitioner partnership between Lancaster University, WaterNumbers (UK) and the international NGO Practical Action. Using Data-Based Mechanistic (DBM) modelling techniques, the model assimilated rainfall and water levels to generate localised hourly flood predictions, which are presented as probabilistic forecasts, increasing lead times from 2-3 hours to 7-8 hours. The Nepal DHM has simultaneously started utilising forecasts from the Global Flood Awareness System (GloFAS), which provides streamflow predictions at the global scale based upon distributed hydrological simulations using numerical ensemble weather forecasts from the ECMWF (European Centre for Medium-Range Weather Forecasts). The aforementioned global and local models have already affected the approach to early warning in Nepal, being operational during the 2016 monsoon in the West Rapti basin in Western Nepal. On 24 July 2016, GloFAS hydrological forecasts for the West Rapti indicated a sharp rise in river discharge above 1500 m3/s (equivalent to the river warning level at 5 meters) with 53
Maximum confidence measurements via probabilistic quantum cloning
Institute of Scientific and Technical Information of China (English)
Zhang Wen-Hai; Yu Long-Bao; Cao Zhuo-Liang; Ye Liu
2013-01-01
Probabilistic quantum cloning (PQC) cannot copy a set of linearly dependent quantum states. In this paper, we show that if incorrect copies are allowed to be produced, linearly dependent quantum states may also be cloned by the PQC. By exploiting this kind of PQC to clone a special set of three linearly dependent quantum states, we derive the upper bound of the maximum confidence measure of a set. An explicit transformation of the maximum confidence measure is presented.
Probabilistic analysis of a thermosetting pultrusion process
DEFF Research Database (Denmark)
Baran, Ismet; Tutum, Cem C.; Hattel, Jesper Henri
2016-01-01
In the present study, the effects of uncertainties in the material properties of the processing composite material and the resin kinetic parameters, as well as process parameters such as pulling speed and inlet temperature, on product quality (exit degree of cure) are investigated for a pultrusion process. A new application for the probabilistic analysis of the pultrusion process is introduced using the response surface method (RSM). The results obtained from the RSM are validated by employing the Monte Carlo simulation (MCS) with Latin hypercube sampling technique. According to the results…
Quantum correlations support probabilistic pure state cloning
Energy Technology Data Exchange (ETDEWEB)
Roa, Luis, E-mail: lroa@udec.cl [Departamento de Física, Universidad de Concepción, Casilla 160-C, Concepción (Chile); Alid-Vaccarezza, M.; Jara-Figueroa, C. [Departamento de Física, Universidad de Concepción, Casilla 160-C, Concepción (Chile); Klimov, A.B. [Departamento de Física, Universidad de Guadalajara, Avenida Revolución 1500, 44420 Guadalajara, Jalisco (Mexico)
2014-02-01
The probabilistic scheme for making two copies of two nonorthogonal pure states requires two auxiliary systems, one for copying and one for attempting to project onto the suitable subspace. The process is performed by means of a unitary-reduction scheme which allows having a success probability of cloning different from zero. The scheme becomes optimal when the probability of success is maximized. In this case, a bipartite state remains as a free degree which does not affect the probability. We find bipartite states for which the unitarity does not introduce entanglement, but does introduce quantum discord between some involved subsystems.
Probabilistic Analysis of the Quality Calculus
DEFF Research Database (Denmark)
Nielson, Hanne Riis; Nielson, Flemming
2013-01-01
We consider a fragment of the Quality Calculus, previously introduced for defensive programming of software components such that it becomes natural to plan for default behaviour in case the ideal behaviour fails due to unreliable communication. This paper develops a probabilistically based trust analysis supporting the Quality Calculus. It uses information about the probabilities that expected input will be absent in order to determine the trustworthiness of the data used for controlling the distributed system; the main challenge is to take account of the stochastic dependency between some…
Signature recognition using neural network probabilistic
Directory of Open Access Journals (Sweden)
Heri Nurdiyanto
2016-03-01
Full Text Available The signature of each person is different and has unique characteristics. This paper therefore discusses the development of a personal identification system based on a person's unique digital signature. Preprocessing used a grayscale method, while Shannon entropy and a Probabilistic Neural Network were used for feature extraction and identification, respectively. This study uses five signature types with five signatures per type. When the test results were compared with the actual data, the proposed system's performance was only 40%.
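Shannon entropy as a global feature of a grayscale signature image can be computed from the intensity histogram; a minimal sketch:

```python
import math
from collections import Counter

def shannon_entropy(gray_pixels):
    """Shannon entropy (bits) of a grayscale intensity histogram,
    used here as a simple global feature of a signature image."""
    counts = Counter(gray_pixels)
    n = len(gray_pixels)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Toy 'images': a flat patch has zero entropy, a varied one does not
flat = [128] * 64
varied = [0, 64, 128, 192] * 16
print(shannon_entropy(flat), shannon_entropy(varied))  # 0.0 and 2.0
```

In a full pipeline, such per-image (or per-block) entropy values would form the feature vector fed to the Probabilistic Neural Network classifier.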
Probabilistic results for a mobile service scenario
DEFF Research Database (Denmark)
Møller, Jesper; Yiu, Man Lung
We consider the following stochastic model for a mobile service scenario. Consider a stationary Poisson process in Rd, with its points radially ordered with respect to the origin (the anchor); if d = 2, the points may correspond to locations of e.g. restaurants. A user, with a location different...... the inferred privacy region is a random set obtained by an adversary who only knows the anchor and the points received from the server, where the adversary ‘does the best' to infer the possible locations of the user. Probabilistic results related to the communication cost and the inferred privacy region...
Domain Knowledge Uncertainty and Probabilistic Parameter Constraints
Mao, Yi
2012-01-01
Incorporating domain knowledge into the modeling process is an effective way to improve learning accuracy. However, as it is provided by humans, domain knowledge can only be specified with some degree of uncertainty. We propose to explicitly model such uncertainty through probabilistic constraints over the parameter space. In contrast to hard parameter constraints, our approach is effective also when the domain knowledge is inaccurate and generally results in superior modeling accuracy. We focus on generative and conditional modeling where the parameters are assigned a Dirichlet or Gaussian prior and demonstrate the framework with experiments on both synthetic and real-world data.
Probabilistic Recovery Guarantees for Sparsely Corrupted Signals
Pope, Graeme; Studer, Christoph
2012-01-01
We consider the recovery of sparse signals subject to sparse interference, as introduced in Studer et al., IEEE Trans. IT, 2012. We present novel probabilistic recovery guarantees for this framework, covering varying degrees of knowledge of the signal and interference support, which are relevant for a large number of practical applications. Our results assume that the sparsifying dictionaries are solely characterized by coherence parameters and we require randomness only in the signal and/or interference. The obtained recovery guarantees show that one can recover sparsely corrupted signals with overwhelming probability, even if the sparsity of both the signal and interference scale (near) linearly with the number of measurements.
Probabilistic double guarantee kidnapping detection in SLAM.
Tian, Yang; Ma, Shugen
2016-01-01
For determining whether kidnapping has happened, and which type of kidnapping it is, while a robot performs autonomous tasks in an unknown environment, a double guarantee kidnapping detection (DGKD) method has been proposed. DGKD performs well in relatively small environments; however, our recent work found that it has limitations in large-scale environments. In order to increase the adaptability of DGKD to large-scale environments, an improved method called probabilistic double guarantee kidnapping detection is proposed in this paper, combining the probability of features' positions with the robot's posture. Simulation results demonstrate the validity and accuracy of the proposed method.
Probabilistic assessment of pressurised thermal shocks
Energy Technology Data Exchange (ETDEWEB)
Pištora, Vladislav, E-mail: pis@ujv.cz; Pošta, Miroslav; Lauerová, Dana
2014-04-01
The reactor pressure vessel (RPV) is a key component of all PWR and VVER nuclear power plants (NPPs). Assuring its integrity is therefore of high importance. Due to high neutron fluence, the RPV material is embrittled during NPP operation. The embrittled RPV may undergo severe loading during potential events of the pressurised thermal shock (PTS) type, possibly occurring in the NPP. The resistance of the RPV against fast fracture has to be proven by comprehensive analyses. In most countries (with the exception of the USA), proving RPV integrity is based on deterministic PTS assessment. In the USA, the “screening criteria” for maximum allowable embrittlement of RPV material, which form part of the US regulations, are based on probabilistic PTS assessment. In other countries, probabilistic PTS assessment is performed only at the research level or as a supplement to the deterministic PTS assessment of individual RPVs. This paper presents a complete probabilistic PTS assessment for a VVER 1000 RPV, covering both the methodology and the results. The methodology corresponds to the Unified Procedure for Lifetime Assessment of Components and Piping in WWER NPPs, “VERLIFE”, Version 2008. The main parameters entering the analysis, which are treated as statistical distributions, are as follows: the initial value of the material reference temperature T0; the reference temperature shift ΔT0 due to neutron fluence; the neutron fluence; the size, shape, position and density of cracks in the RPV wall; and the fracture toughness of the RPV material (the Master Curve concept is used). The first step of the analysis consists of the selection of sequences potentially leading to PTS, their grouping, establishing their frequencies, and the selection of representative scenarios within all groups. A modified PSA model is used for this purpose. The second step consists of thermal-hydraulic analyses of the representative scenarios, with the goal to prepare input data for the
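The core of such a probabilistic fracture assessment is a Monte Carlo comparison of sampled fracture toughness against the applied stress intensity. The sketch below is a heavily simplified, hypothetical illustration of that idea only: the Weibull parameters, the constant applied stress intensity, and the function name are assumptions for demonstration, not values from VERLIFE or the analysed VVER 1000 RPV.

```python
import random

# Illustrative Monte Carlo sketch of a probabilistic fracture check:
# sample fracture toughness K_Ic from a Master-Curve-like three-parameter
# Weibull distribution and count samples where an assumed applied stress
# intensity exceeds the sampled toughness. All numbers are hypothetical.

def failure_probability(k_applied, k_min=20.0, k0=110.0, shape=4.0,
                        n=100_000, seed=1):
    random.seed(seed)
    failures = 0
    for _ in range(n):
        # three-parameter Weibull sample for toughness [MPa*sqrt(m)]
        k_ic = k_min + (k0 - k_min) * random.weibullvariate(1.0, shape)
        if k_applied > k_ic:
            failures += 1
    return failures / n

# Higher applied stress intensity (e.g. a colder, more severe PTS
# transient) yields a higher conditional failure probability.
print(failure_probability(60.0), failure_probability(100.0))
```

In a real assessment the applied stress intensity would come from the thermal-hydraulic and stress analyses of each representative scenario, and the toughness distribution would be shifted by the sampled ΔT0; this sketch only shows the sampling-and-counting skeleton.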
Probabilistic Design of Offshore Structural Systems
DEFF Research Database (Denmark)
Sørensen, John Dalsgaard
1988-01-01
Probabilistic design of structural systems is considered in this paper. The reliability is estimated using first-order reliability methods (FORM). The design problem is formulated as the optimization problem to minimize a given cost function such that the reliability of the single elements...... satisfies given requirements or such that the systems reliability satisfies a given requirement. Based on a sensitivity analysis optimization procedures to solve the optimization problems are presented. Two of these procedures solve the system reliability-based optimization problem sequentially using quasi...
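FORM, as used in the abstract above, can be shown in its simplest closed-form special case: a linear limit state g = R - S with independent normal resistance R and load S, where the Hasofer-Lind reliability index has an exact expression. This is a textbook illustration under assumed means and standard deviations, not the paper's system-level optimization procedure.

```python
import math

# Minimal FORM-style sketch: for the linear limit state g = R - S with
# independent normal R (resistance) and S (load), the Hasofer-Lind
# reliability index is beta = (mu_R - mu_S) / sqrt(sig_R^2 + sig_S^2),
# and the failure probability is Pf = Phi(-beta). Values are illustrative.

def reliability_index(mu_r, sig_r, mu_s, sig_s):
    return (mu_r - mu_s) / math.sqrt(sig_r**2 + sig_s**2)

def failure_prob(beta):
    # standard normal CDF evaluated at -beta
    return 0.5 * math.erfc(beta / math.sqrt(2))

beta = reliability_index(mu_r=10.0, sig_r=1.5, mu_s=6.0, sig_s=2.0)
print(beta, failure_prob(beta))  # beta = 1.6
```

A reliability-based design optimization of the kind described would then minimize cost subject to constraints such as beta >= beta_target for each element or for the system; for nonlinear limit states FORM finds beta iteratively at the design point rather than in closed form.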
Quantitative analysis of probabilistic BPMN workflows
DEFF Research Database (Denmark)
Herbert, Luke Thomas; Sharp, Robin
2012-01-01
We present a framework for modelling and analysis of realworld business workflows. We present a formalised core subset of the Business Process Modelling and Notation (BPMN) and then proceed to extend this language with probabilistic nondeterministic branching and general-purpose reward annotations...... of events, reward-based properties and best- and worst- case scenarios. We develop a simple example of medical workflow and demonstrate the utility of this analysis in accurate provisioning of drug stocks. Finally, we suggest a path to building upon these techniques to cover the entire BPMN language, allow...
Probabilistic modeling of solar power systems
Safie, Fayssal M.
1989-01-01
The author presents a probabilistic approach based on Markov chain theory to model stand-alone photovoltaic power systems and predict their long-term service performance. The major advantage of this approach is that it allows designers and developers of these systems to analyze the system performance as well as the battery subsystem performance in the long run and determine the system design requirements that meet a specified service performance level. The methodology presented is illustrated by using data for a radio repeater system for the Boston, Massachusetts, location.
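The Markov chain idea behind such long-term service prediction can be sketched with the smallest possible case: a two-state model (service up / service down, e.g. battery charged or depleted) whose stationary distribution gives the long-run fraction of time the system delivers service. The transition probabilities below are assumptions for illustration, not values from the Boston radio-repeater study.

```python
# Hedged two-state Markov chain sketch: with daily probability p_fail of
# dropping out of service and p_recover of returning, the stationary
# distribution gives long-run availability in closed form.

def stationary_two_state(p_fail, p_recover):
    """Long-run probabilities for the (up, down) states."""
    up = p_recover / (p_fail + p_recover)
    return up, 1.0 - up

up, down = stationary_two_state(p_fail=0.05, p_recover=0.40)
print(up)  # long-run service availability ≈ 0.889
```

A realistic photovoltaic model would use many battery-charge states and weather-dependent transition probabilities, but the design-question workflow is the same: adjust array and battery sizing until the stationary availability meets the specified service performance level.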
Ensemble postprocessing for probabilistic quantitative precipitation forecasts
Bentzien, S.; Friederichs, P.
2012-12-01
Precipitation is one of the most difficult weather variables to predict in hydrometeorological applications. In order to assess the uncertainty inherent in deterministic numerical weather prediction (NWP), meteorological services around the globe develop ensemble prediction systems (EPS) based on high-resolution NWP systems. With non-hydrostatic model dynamics and without parameterization of deep moist convection, high-resolution NWP models are able to describe convective processes in more detail and provide more realistic mesoscale structures. However, precipitation forecasts are still affected by displacement errors, systematic biases and fast error growth on small scales. Probabilistic guidance can be achieved from an ensemble setup which accounts for model error and uncertainty of initial and boundary conditions. The German Meteorological Service (Deutscher Wetterdienst, DWD) provides such an ensemble system based on the German-focused limited-area model COSMO-DE. With a horizontal grid-spacing of 2.8 km, COSMO-DE is the convection-permitting high-resolution part of the operational model chain at DWD. The COSMO-DE-EPS consists of 20 realizations of COSMO-DE, driven by initial and boundary conditions derived from 4 global models and 5 perturbations of model physics. Ensemble systems like COSMO-DE-EPS are often limited with respect to ensemble size due to the immense computational costs. As a consequence, they can be biased and exhibit insufficient ensemble spread, and probabilistic forecasts may not be well calibrated. In this study, probabilistic quantitative precipitation forecasts are derived from COSMO-DE-EPS and evaluated at more than 1000 rain gauges located all over Germany. COSMO-DE-EPS is a frequently updated ensemble system, initialized 8 times a day. We use the time-lagged approach to inexpensively increase ensemble spread, which results in more reliable forecasts especially for extreme precipitation events. Moreover, we will show that statistical
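The two ensemble operations described above, turning members into an exceedance probability and pooling members from earlier initializations (time-lagging), reduce to very little code. The member values below are invented illustrative rain amounts, not COSMO-DE-EPS output.

```python
# Sketch of a raw probabilistic precipitation forecast from an ensemble:
# the exceedance probability is the fraction of members above a
# threshold, and time-lagging pools members from an earlier model run to
# enlarge the ensemble at no extra computational cost.

def exceedance_prob(members, threshold):
    return sum(1 for m in members if m > threshold) / len(members)

latest_run = [0.0, 1.2, 5.4, 0.3, 7.1]   # mm of rain, 5 members
lagged_run = [2.2, 0.0, 6.0, 3.1, 0.1]   # members from the previous init
pooled = latest_run + lagged_run          # time-lagged ensemble

print(exceedance_prob(latest_run, 5.0))  # 0.4
print(exceedance_prob(pooled, 5.0))      # 0.3
```

Statistical postprocessing of the kind the study evaluates would then calibrate these raw frequencies against rain-gauge observations, since small ensembles tend to be under-dispersive and biased.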
Probabilistic remote state preparation by W states
Institute of Scientific and Technical Information of China (English)
Liu Jin-Ming; Wang Yu-Zhu
2004-01-01
In this paper we consider a scheme for probabilistic remote state preparation of a general qubit by using W states. The scheme consists of the sender, Alice, and two remote receivers Bob and Carol. Alice performs a projective measurement on her qubit in the basis spanned by the state she wants to prepare and its orthocomplement. This allows either Bob or Carol to reconstruct the state with finite success probability. It is shown that for some special ensembles of qubits, the remote state preparation scheme requires only two classical bits, unlike the case in the scheme of quantum teleportation where three classical bits are needed.
Flood Progression Modelling and Impact Analysis
DEFF Research Database (Denmark)
Mioc, Darka; Anton, François; Nickerson, B.
People living in the lower valley of the St. John River, New Brunswick, Canada, frequently experience flooding when the river overflows its banks during spring ice melt and rain. To better prepare the population of New Brunswick for extreme flooding, we developed a new flood prediction model...... that computes floodplain polygons before the flood occurs. This allows emergency managers to assess the impact of the flood before it occurs and make early decisions on evacuation of the population and flood rescue. This research shows that the use of GIS and LiDAR technologies combined with hydrological...... modelling can significantly improve the decision making and visualization of flood impact needed for emergency planning and flood rescue. Furthermore, the 3D GIS application we developed for modelling flooded buildings and infrastructure provides a better platform for modelling and visualizing flood...
Smoky River coal flood risk mapping study
Energy Technology Data Exchange (ETDEWEB)
NONE
2004-06-01
The Canada-Alberta Flood Damage Reduction Program (FDRP) is designed to reduce flood damage by identifying areas susceptible to flooding and by encouraging application of suitable land use planning, zoning, and flood preparedness and proofing. The purpose of this study is to define flood risk and floodway limits along the Smoky River near the former Smoky River Coal (SRC) plant. Alberta Energy has been responsible for the site since the mine and plant closed in 2000. The study describes flooding history, available data, features of the river and valley, calculation of flood levels, and floodway determination, and includes flood risk maps. The HEC-RAS program is used for the calculations. The flood risk area was calculated using the 1:100 year return period flood as the hydrological event. 7 refs., 11 figs., 7 tabs., 3 apps.
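The 1:100-year flood used as the hydrological event in such studies is typically estimated by fitting an extreme-value distribution to annual peak flows. The sketch below uses a Gumbel (EV1) fit by the method of moments; the flow series is invented for illustration and is not Smoky River data, and the HEC-RAS hydraulic computation itself is not reproduced here.

```python
import math
import statistics

# Illustrative Gumbel (EV1) flood-frequency fit by the method of moments:
# the T-year flood is the quantile at non-exceedance probability 1 - 1/T.

def gumbel_quantile(annual_peaks, return_period):
    mean = statistics.mean(annual_peaks)
    std = statistics.stdev(annual_peaks)
    alpha = math.sqrt(6) * std / math.pi      # scale parameter
    u = mean - 0.5772 * alpha                 # location (Euler-gamma term)
    p = 1.0 - 1.0 / return_period             # non-exceedance probability
    return u - alpha * math.log(-math.log(p))

peaks = [820, 640, 1100, 930, 760, 1500, 690, 880, 1210, 1010]  # m^3/s
print(round(gumbel_quantile(peaks, 100), 1))  # 1:100-year design flow
```

The resulting design discharge would then be routed through the river hydraulics model (HEC-RAS in this study) to obtain the corresponding flood levels and floodway limits.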
Flood Resilient Systems and their Application for Flood Resilient Planning
Manojlovic, N.; Gabalda, V.; Antanaskovic, D.; Gershovich, I.; Pasche, E.
2012-04-01
Following the paradigm shift in flood management from traditional to more integrated approaches, and considering the uncertainties of future development due to drivers such as climate change, one of the main emerging tasks of flood managers becomes the development of (flood) resilient cities. It can be achieved by the application of non-structural flood resilience measures, summarised in the 4As: assistance, alleviation, awareness and avoidance (FIAC, 2007). As a part of this strategy, the key aspect of developing resilient cities - a resilient built environment - can be reached by efficient application of Flood Resilience Technology (FReT) and its meaningful combination into flood resilient systems (FRS). FRS are defined as "an interconnecting network of FReT which facilitates resilience (including both restorative and adaptive capacity) to flooding, addressing physical and social systems and considering different flood typologies" (SMARTeST, http://www.floodresilience.eu/). Applying the system approach (e.g. Zevenbergen, 2008), FRS can be developed at different scales from the building to the city level. Still, a method to define and systematise different FRS across those scales remains a matter of research. Further, the decision on which resilient system is to be applied for the given conditions and given scale is a complex task, calling for utilisation of decision support tools. This process of decision-making should follow the steps of flood risk assessment (1) and development of a flood resilience plan (2) (Manojlovic et al, 2009). The key problem in (2) is how to match the input parameters that describe the physical and social systems and flood typology to the appropriate flood resilient system. Additionally, an open issue is how to integrate the advances in FReT and findings on its efficiency into decision support tools. This paper presents a way to define, systematise and make decisions on FRS at different scales of an urban system developed within the 7th FP Project
A climate emulator for coastal flooding events
Rueda Zamora, A. C.; Méndez Incera, F. J.; Camus, P.; Tomas, A.
2014-12-01
The evaluation of coastal flooding requires the definition of the multivariate marine climate conditions (wave height, wave period, wave direction, wind, surge levels). Historical reanalysis databases are a valuable information source. However, the limited time period covered implies uncertainty in the statistical characterization of extremes. Besides, downscaling is needed to extend data to climate change scenarios or long-term historical periods, or even to understand the interannual variability. A statistical downscaling approach is adopted due to its low computational cost. The relationship between large-scale atmospheric variables (predictor) and local marine climate variables (predictand) is established by means of a physical division of the predictand based on weather types. The multivariate dependence structure of the predictand (extreme events) is introduced linking the independent marginal distributions of the variables by a probabilistic copula regression. Therefore, the climate emulator is a stochastic hybrid model with the following steps: 1) Collecting historical data for the predictor (atmospheric variables) and predictand (sea state parameters, storm-surge); 2) Predictor definition, i.e. using ESTELA method in the case of wave generation characteristics (Pérez et al., 2014a); 3) Defining the most appropriate statistical model (distribution modeling based on weather-type); 4) Stochastic simulation of the present climate; 5) Marine climate downscaling under climate change scenarios (selecting the best GCMs from CMIP5, Pérez et al., 2014b). References: Perez, J., Menéndez, M., Méndez, F.J., Losada, I.J. (2014a). Evaluating the performance of CMIP3 and CMIP5 global climate models over the north-east Atlantic region, Climate Dynamics, DOI 10.1007/s00382-014-2078-8. Perez, J., Menéndez, M., Méndez, F.J., Losada, I.J. (2014b) ESTELA: A method for evaluating the source and travel-time of the wave energy reaching a local area. Ocean Dynamics, DOI 10
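The copula step in the emulator above links independent marginal distributions into a dependent multivariate sample. A minimal sketch of that mechanism is a bivariate Gaussian copula: generate correlated standard normals, then map them to dependent uniforms that any marginal (wave height, surge, etc.) can be applied to by inverse transform. The correlation value and variable pairing are assumptions for illustration, not fitted parameters from the study.

```python
import math
import random

# Minimal Gaussian-copula sketch: dependent uniform marginals for two
# sea-state variables (e.g. wave height and surge) with an assumed
# correlation rho. Marginals would then be applied by inverse transform.

def norm_cdf(x):
    return 0.5 * math.erfc(-x / math.sqrt(2))

def gaussian_copula_pair(rho, rng):
    z1 = rng.gauss(0, 1)
    # conditional construction of a correlated standard normal
    z2 = rho * z1 + math.sqrt(1 - rho**2) * rng.gauss(0, 1)
    return norm_cdf(z1), norm_cdf(z2)

rng = random.Random(7)
for _ in range(5):
    u, v = gaussian_copula_pair(0.8, rng)
    print(round(u, 2), round(v, 2))
```

In the emulator this sampling would be done per weather type, so that both the marginal distributions and the dependence structure are conditioned on the large-scale atmospheric predictor.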
Omira, Rachid; Baptista, Maria Ana; Matias, Luis
2015-04-01
This study constitutes the first assessment of probabilistic tsunami inundation in the NE Atlantic region, using an event-tree approach. It aims to develop a probabilistic tsunami inundation approach for the NE Atlantic coast with an application to two test sites of the ASTARTE project, Tangier-Morocco and Sines-Portugal. Only tsunamis of tectonic origin are considered here, taking into account near-, regional- and far-field sources. The multidisciplinary approach proposed here consists of an event-tree method that gathers seismic hazard assessment, tsunami numerical modelling, and statistical methods. It also presents a treatment of uncertainties related to source location and tidal stage in order to derive the likelihood of tsunami flood occurrence and exceedance of a specific near-shore wave height during a given return period. We derive high-resolution probabilistic maximum wave heights and flood distributions for both test sites, Tangier and Sines, considering 100-, 500-, and 1000-year return periods. We find that the probability that a maximum wave height exceeds 1 m somewhere along the Sines coast reaches about 55% for the 100-year return period, and is up to 100% for the 1000-year return period. Along the Tangier coast, the probability of inundation occurrence (flow depth > 0 m) is up to 45% for the 100-year return period and reaches 96% in some near-shore coastal locations for the 500-year return period. Acknowledgements: This work is funded by project ASTARTE - Assessment, STrategy And Risk Reduction for Tsunamis in Europe. Grant 603839, 7th FP (ENV.2013.6.4-3).
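Return-period probabilities like those quoted above can be related to an exposure window through the standard Poisson-occurrence assumption: the chance of at least one exceedance of a T-year event in t years is 1 - exp(-t/T). This is the generic textbook relation, not the ASTARTE event tree itself, and the window lengths below are arbitrary examples.

```python
import math

# Probability of at least one exceedance of a T-year event within an
# exposure window of t years, assuming Poisson occurrence in time.

def prob_exceedance(window_years, return_period_years):
    return 1.0 - math.exp(-window_years / return_period_years)

print(round(prob_exceedance(50, 100), 3))   # ~0.393
print(round(prob_exceedance(50, 1000), 3))  # ~0.049
```

This kind of conversion is what makes return-period hazard maps usable for decisions over, say, the 50-year design life of coastal infrastructure.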
Flood Fighting Products Research Facility
Federal Laboratory Consortium — A wave research basin at the ERDC Coastal and Hydraulics Laboratory has been modified specifically for testing of temporary, barrier-type, flood fighting products....
Cyber surveillance for flood disasters
National Research Council Canada - National Science Library
Lo, Shi-Wei; Wu, Jyh-Horng; Lin, Fang-Pang; Hsu, Ching-Han
2015-01-01
... river areas and sections. Therefore, in this paper, we propose an easy method to automatically monitor the flood object of a specific area, based on the currently widely used remote cyber surveillance systems and image...
Flash floods: forecasting and warning
National Research Council Canada - National Science Library
Sene, Kevin
2013-01-01
.... Floods of this type are often characterised by fast flowing deep water and a high debris content which - combined with the short lead time available for warnings - add to the risk to people and property...
FEMA Flood Insurance Studies Inventory
Kansas Data Access and Support Center — This digital data set provides an inventory of Federal Emergency Management Agency (FEMA) Flood Insurance Studies (FIS) that have been conducted for communities and...
NASA's Support to Flood Response
Green, D. S.; Murray, J. J.; Stough, T.
2016-12-01
The extent of flood and inundation, the impacts on people and infrastructure, and generally the situational awareness on all scales for decision making are areas where NASA is mobilizing scientific results, advanced sensing and technologies, experts and partnerships to support response. NASA has targeted mature application science and ready technology for flood and inundation monitoring and assessment. This includes supporting timely data management and product dissemination with users and partners. Requirements are captured in the form of science-area questions, while solutions measure readiness for use by considering standard tools and approaches that make information more accessible, interoperable, understandable and reliable. The program collaborates with capacity building and areas of education and outreach needed to create and leverage non-traditional partnerships in transdisciplinary areas including socio-economic practice, preparedness and resilience assessment, early warning and forecast response, and emergency management, relief and recovery. The program outcomes also seek alignment with and support to global and community priorities related to water resources and food security. This presentation will examine the achievements of individual projects and the challenges and opportunities of more comprehensive and collaborative teams behind NASA's response to global flooding. Examples from recent event mobilization will be reviewed, including the series of domestic floods across the southern and midwestern United States throughout 2015 and 2016. Progress on the combined use of optical, microwave and SAR remote sensing measurements, topographic and geodetic data and mapping, and data sharing practices will be reviewed. Other response case studies will examine global flood events monitored, characterized and supported in various boundary regions and nations. Achievements and future plans will be described for capabilities including global flood modeling, near real
Elk River Watershed - Flood Study
Barnes, C. C.; Byrne, J. M.; MacDonald, R. J.; Lewis, D.
2014-12-01
Flooding has the potential to cause significant impacts to economic activities as well as to disrupt or displace populations. Changing climate regimes such as extreme precipitation events increase flood vulnerability and put additional stresses on infrastructure. Potential flooding from just under 100 toxic tailings ponds located in Canada (2009 NPRI Reviewed Facility Data Release, Environment Canada) increases risk to human safety and the environment. One such geotechnical failure spilt billions of litres of toxic tailings into the Fraser River watershed, British Columbia, when a tailings pond dam breach occurred in August 2014. Damaged and washed out roadways cut access to essential services, as seen in the extensive floods that occurred in Saskatchewan and Manitoba in July 2014, and in Southern Alberta in 2013. Recovery efforts from events such as these can be lengthy, and have substantial social and economic impacts both in loss of revenue and cost of repair. The objective of this study is to investigate existing conditions in the Elk River watershed and model potential future hydrological changes that can increase flood risk hazards. By analyzing existing hydrology, meteorology, land cover, land use, economic, and settlement patterns, a baseline is established for existing conditions in the Elk River watershed. Coupling the Generate Earth Systems Science (GENESYS) high-resolution spatial hydrometeorological model with flood hazard analysis methodology, high-resolution flood vulnerability baseline maps are created using historical climate conditions. Further work in 2015 will examine possible impacts for a range of climate change and land use change scenarios to define changes to future flood risk and vulnerability.
Flooding Effect on Earth Walls
Directory of Open Access Journals (Sweden)
Meysam Banimahd
2010-12-01
Full Text Available Earth building is a sustainable, environmentally friendly and economical method of construction that has been used worldwide for many centuries. For the past three decades, earth has seen a revival as a building material for a modern construction method due to its benefits in terms of low carbon content, low cost and energy involved during construction, as well as the fact that it is a sustainable technology of building. Climate change is influencing precipitation levels and patterns around the world, and as a consequence, flood risk is increasing rapidly. When flooding occurs, earth buildings are exposed to water by submersion, causing an increase in the degree of saturation of the earth structures and therefore a decrease of the suction between particles. This study investigated the effect of cycles of flooding (consecutive events of flooding followed by dry periods) on earth walls. A series of characterization tests were carried out to obtain the physical and mechanical properties of the studied earth material. In a second stage, Flooding Simulation Tests (FST) were performed to explore the earth walls’ response to repeated flooding events. The results obtained for the tested earth walls/samples with reinforced material (straw) reveal hydraulic hysteresis when the walls/samples are subject to cycles of wetting and drying.
Extreme flooding tolerance in Rorippa.
Akman, Melis; Bhikharie, Amit; Mustroph, Angelika; Sasidharan, Rashmi
2014-01-01
Low oxygen stress imposed by floods creates a strong selection force shaping plant ecosystems in flood-prone areas. Plants inhabiting these environments adopt various adaptations and survival strategies to cope with increasing water depths. Two Rorippa species, R. sylvestris and R. amphibia that grow in naturally flooded areas, have high submergence tolerance achieved by the so-called quiescence and escape strategies, respectively. In order to dissect the molecular mechanisms involved in these strategies, we investigated submergence-induced changes in gene expression in flooded roots of Rorippa species. There was a higher induction of glycolysis and fermentation genes and faster carbohydrate reduction in R. amphibia, indicating a higher demand for energy potentially leading to faster mortality by starvation. Moreover, R. sylvestris showed induction of genes improving submergence tolerance, potentially enhancing survival in prolonged floods. Additionally, we compared transcript profiles of these 2 tolerant species to relatively intolerant Arabidopsis and found that only Rorippa species induced various inorganic pyrophosphate dependent genes, alternatives to ATP demanding pathways, thereby conserving energy, and potentially explaining the difference in flooding survival between Rorippa and Arabidopsis.
FLOODING ATTACK AWARE SECURE AODV
Directory of Open Access Journals (Sweden)
S. Madhavi
2013-01-01
Full Text Available Providing security in a Mobile Ad hoc Network (MANET) is a challenging task due to its inherent nature. Flooding is a type of Denial of Service (DoS) attack in MANET. Intentional flooding may lead to disturbances in the networking operation. This kind of attack consumes battery power, storage space and bandwidth. Flooding an excessive number of packets may degrade the performance of the network. This study considers the hello flooding attack. As hello packets are continuously flooded by the malicious node, the neighbour node is not able to process other packets. The functioning of the legitimate node is diverted and the networking operation is destroyed. The absence of a hello packet during the periodic hello interval may lead to the wrong assumption that the neighbour node has moved away. So one of the intermediate neighbour nodes sends a Route Error (RERR) message and the source node reinitiates the route discovery process. The hello interval values are changed in a random fashion, and this information is conveyed to other nodes in the network in a secured manner. This study identifies and prevents the flooding attack. This methodology considers performance parameters such as packet delivery ratio, delay and throughput. The algorithm is implemented in Secure AODV and tested in an ad hoc environment. The proposed algorithm decreases the control overhead by 2%.
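The detection side of a hello-flood defence can be sketched as simple per-neighbour rate accounting: count HELLO packets seen from each node within an observation window and flag nodes that exceed a threshold implied by the advertised hello interval. The trace, threshold, and function name below are hypothetical illustrations, not parameters from the Secure AODV implementation described above.

```python
from collections import defaultdict

# Hypothetical rate-based hello-flood detector: a node sending far more
# HELLO packets per window than the hello interval allows is flagged as
# a likely flooder.

def flag_flooders(hello_log, window_limit):
    counts = defaultdict(int)
    for node in hello_log:
        counts[node] += 1
    return sorted(n for n, c in counts.items() if c > window_limit)

# Simulated HELLO arrivals within one observation window; with a normal
# hello interval, at most ~3 HELLOs per neighbour would be expected.
trace = ["A", "B", "A", "A", "C", "A", "B", "A"]
print(flag_flooders(trace, window_limit=3))  # ['A']
```

Randomising the hello interval, as the study proposes, makes the expected per-window count unpredictable to an attacker while legitimate nodes, informed securely of the new interval, still pass the rate check.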
Scales of Natural Flood Management
Nicholson, Alex; Quinn, Paul; Owen, Gareth; Hetherington, David; Piedra Lara, Miguel; O'Donnell, Greg
2016-04-01
The scientific field of Natural Flood Management (NFM) is receiving much attention and is now widely seen as a valid solution to sustainably manage flood risk whilst offering significant multiple benefits. However, few examples exist looking at NFM on a large scale (>10 km2). Well-implemented NFM has the effect of restoring more natural catchment hydrological and sedimentological processes, which in turn can have significant flood risk and WFD benefits for catchment waterbodies. These catchment-scale improvements in turn allow more 'natural' processes to be returned to rivers and streams, creating a more resilient system. Although certain NFM interventions may appear distant and disconnected from main-stem waterbodies, they will undoubtedly be contributing to WFD at the catchment waterbody scale. This paper offers examples of NFM and explains how they can be maximised through practical design across many scales (from individual features up to the whole catchment). New tools are presented to assist in the selection of measures and their locations, to appreciate the flooding benefit at the local catchment scale, and a Flood Impact Model is introduced that can best reflect the impacts of local changes further downstream. The tools will be discussed in the context of our most recent experiences on NFM projects, including river catchments in the north east of England and in Scotland. This work has encouraged a more integrated approach to flood management planning that can use both traditional and novel NFM strategies in an effective and convincing way.