Estimation of Internal Flooding Frequency for Screening Analysis of Flooding PSA
International Nuclear Information System (INIS)
Choi, Sun Yeong; Yang, Jun Eon
2005-01-01
The purpose of this paper is to estimate the internal flooding frequency for the quantitative screening analysis of the flooding PSA (Probabilistic Safety Assessment) with appropriate data and estimation methods. In the existing flood PSAs for domestic NPPs (Nuclear Power Plants), a screening analysis was performed first, and a detailed analysis was then performed for the areas not screened out. For the quantitative screening analysis, a plant-area-based flood frequency estimated by the MLE (Maximum Likelihood Estimation) method was used, while a component-based flood frequency was used for the detailed analysis. The existing quantitative screening analyses for domestic NPPs have used data from all LWRs (Light Water Reactors), that is, both PWRs (Pressurized Water Reactors) and BWRs (Boiling Water Reactors), for the internal flood frequency of the auxiliary and turbine buildings. In the case of the primary auxiliary building, however, the applicability of data from all LWRs needs to be examined carefully because of the significant differences in equipment between PWR and BWR structures. NUREG/CR-5750 suggested a Bayesian update with the Jeffreys noninformative prior to estimate the initiating-event frequency for floods; it did not, however, describe any flood PSA procedure. Recently, Fleming and Lydell estimated internal flooding frequencies in units of plant-operation-years per meter of pipe, by pipe size, for each system susceptible to flooding, such as the service water system and the circulating water system. They used the failure rate and the conditional probability of rupture given failure to estimate the internal flooding frequency, and a Bayesian update to reduce uncertainties. Performing the quantitative screening analysis with this method requires the pipe length, by pipe size, of each specific system in each divided area, to change the concept of the component-based frequency to the concept of the plant area
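For illustration, the two estimators contrasted above, a straight MLE of a Poisson event rate and the NUREG/CR-5750-style Jeffreys-prior Bayesian update, can be sketched in a few lines. The event count and exposure below are hypothetical, not values from the paper.

```python
# Minimal sketch (illustrative numbers): flood-frequency point estimates
# for a Poisson occurrence process over an observed exposure period.

def mle_frequency(n_events: int, exposure_years: float) -> float:
    """Maximum-likelihood estimate of the event rate (events per year)."""
    return n_events / exposure_years

def jeffreys_update(n_events: int, exposure_years: float) -> float:
    """Posterior-mean rate under the Jeffreys noninformative prior for a
    Poisson process: the posterior is Gamma(n + 0.5, T), whose mean is
    (n + 0.5) / T."""
    return (n_events + 0.5) / exposure_years

# Hypothetical: 2 internal floods observed over 400 reactor-years.
print(mle_frequency(2, 400.0))    # 0.005 per reactor-year
print(jeffreys_update(2, 400.0))  # 0.00625 per reactor-year
```

The Jeffreys update is slightly higher than the MLE for small counts, which is why it is preferred for rare initiating events: it does not collapse to zero when no events have been observed.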
Uncertainty Measures of Regional Flood Frequency Estimators
DEFF Research Database (Denmark)
Rosbjerg, Dan; Madsen, Henrik
1995-01-01
Regional flood frequency models have different assumptions regarding homogeneity and inter-site independence. Thus, uncertainty measures of T-year event estimators are not directly comparable. However, having chosen a particular method, the reliability of the estimate should always be stated, e...
FEH Local: Improving flood estimates using historical data
Directory of Open Access Journals (Sweden)
Prosdocimi Ilaria
2016-01-01
The traditional approach to design flood estimation (for example, to derive the 100-year flood) is to apply a statistical model to time series of peak river flow measured by gauging stations. Such records are typically not very long; in the UK, for example, only about 10% of stations have records longer than 50 years. A long-explored way to augment the data available from a gauging station is to derive information about historical flood events and paleo-floods, which can be obtained from careful exploration of archives, old newspapers, flood marks or other signs of past flooding that are still discernible in the catchment, and from the history of settlements. The inclusion of historical data in flood frequency estimation has been shown to substantially reduce the uncertainty around estimated design events and is likely to provide insight into the rarest events, which might pre-date the relatively short systematic records. Among other things, the FEH Local project, funded by the Environment Agency, aims to develop methods to easily incorporate historical information into the standard method of statistical flood frequency estimation in the UK. Different statistical estimation procedures are explored, namely maximum likelihood and partial probability weighted moments, and the strengths and weaknesses of each method are investigated. The project assesses the usefulness of historical data and aims to provide practitioners with guidelines indicating the circumstances in which the inclusion of historical data is likely to be beneficial in terms of reducing both the bias and the variability of the estimated flood frequency curves. The guidelines are based on the results of a large Monte Carlo simulation study, in which different estimation procedures and data-availability scenarios are studied. The study provides some indication of the situations under which different estimation procedures might give a better performance.
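The maximum-likelihood route to incorporating historical information can be sketched as a censored-data likelihood: the gauged record contributes density terms, while a historical period contributes only the knowledge that a perception threshold was exceeded a known number of times. This is a generic sketch with synthetic data and a Gumbel model, not the FEH Local implementation.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import gumbel_r

# Sketch: ML fit of a Gumbel distribution to systematic annual maxima,
# augmented with a historical period of h years in which a perception
# threshold x_thr is known to have been exceeded k times (binomial term).

def neg_log_lik(params, sys_data, x_thr, h, k):
    loc, scale = params
    if scale <= 0:
        return np.inf
    ll = gumbel_r.logpdf(sys_data, loc, scale).sum()   # gauged record
    p_exc = gumbel_r.sf(x_thr, loc, scale)             # P(annual max > x_thr)
    ll += k * np.log(p_exc) + (h - k) * np.log(1.0 - p_exc)
    return -ll

rng = np.random.default_rng(1)
sys_data = gumbel_r.rvs(loc=100.0, scale=30.0, size=40, random_state=rng)
# Hypothetical historical info: threshold 200 exceeded twice in 150 years.
res = minimize(neg_log_lik, x0=[90.0, 20.0],
               args=(sys_data, 200.0, 150, 2), method="Nelder-Mead")
loc_hat, scale_hat = res.x
q100 = gumbel_r.ppf(0.99, loc_hat, scale_hat)  # 100-year design flood
```

The historical binomial term tightens (and here slightly lowers) the tail relative to a fit on the 40-year gauged record alone, which is the uncertainty-reduction effect the abstract describes.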
Directory of Open Access Journals (Sweden)
Fazlul Karim
2017-06-01
Understanding the nature of frequent floods is important for characterising channel morphology, riparian and aquatic habitat, and informing river restoration efforts. This paper presents results from an analysis of frequency estimates of low-magnitude floods using annual maximum and partial series data compared to the actual flood series. Five frequency distribution models were fitted to data from 24 gauging stations in the Great Barrier Reef (GBR) lagoon catchments in north-eastern Australia. Based on goodness-of-fit tests, Generalised Extreme Value, Generalised Pareto and Log Pearson Type 3 models were used to estimate flood frequencies across the study region. Results suggest that frequency estimates based on a partial series are better than those based on an annual series for small to medium floods, while both methods produce similar results for large floods. Although both methods converge at a higher recurrence interval, the convergence recurrence interval varies between catchments. Results also suggest that frequency estimates vary slightly between two or more partial series, depending on the flood threshold, and the differences are large for catchments that experience less frequent floods. While a partial series produces better frequency estimates, it can underestimate or overestimate the frequency if the flood threshold differs greatly from bankfull discharge. These results have significant implications for calculating the dependency of floodplain ecosystems on the frequency of flooding and their subsequent management.
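The divergence at short recurrence intervals and convergence at long ones has a classical quantitative expression: Langbein's conversion between partial-duration and annual-series recurrence intervals. A short sketch:

```python
import math

# Langbein's conversion: the annual-maximum series recurrence interval
# T_A equivalent to a partial-duration (peaks-over-threshold) recurrence
# interval T_P, assuming Poisson arrivals of independent peaks.

def annual_from_partial(t_partial: float) -> float:
    return 1.0 / (1.0 - math.exp(-1.0 / t_partial))

for t in (1.0, 2.0, 5.0, 10.0, 50.0):
    print(f"partial {t:5.1f} yr  ->  annual {annual_from_partial(t):6.2f} yr")
```

For a 1-year partial-series event the equivalent annual-series interval is about 1.58 years (a large relative difference, matching the abstract's finding for small floods), while at 50 years the two differ by only about 0.5 year, i.e. the methods converge for large floods.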
SHYREG, a national database of flood frequency estimation
Directory of Open Access Journals (Sweden)
Arnaud Patrick
2016-01-01
The SHYREG method is a regionalized method for rainfall and flood frequency analysis (FFA) based on process simulation. It couples an hourly rainfall generator with a rainfall-runoff model simple enough to be regionalized. The method has been calibrated using all hydrometeorological data available at the national level; in France, that represents about 2800 rain gauges of the French Weather Service network and about 1800 stations of the national hydrometric network. The method was then regionalized to provide a database of rainfall and flow quantiles. The method was evaluated in several theses and, more recently, in the ANR project Extraflo, which aimed to compare different FFA approaches. The accuracy of the method in estimating rainfall and flow quantiles has been demonstrated, as well as its stability, owing to a parameterization based on average values. The link with rainfall seems preferable to extrapolation based solely on flows; thus, another advantage of the method is that it accounts for extreme flood behaviour with the help of rainfall frequency estimation. In addition, the approach is implicitly multi-durational, and a single regionalization meets all needs in terms of hydrological hazard characterisation. For engineering needs, and to avoid repeated implementation of the method, it has been applied over a 50-meter-resolution mesh to provide a complete database of flood quantiles over French territory, providing regional information on hydrological hazards. However, the database is subject to restrictions related to the nature of the method: the SHYREG flows are "natural" and do not take into account specific cases such as basins highly influenced by hydraulic works, flood expansion areas, strong snowmelt or karst. Information about these restrictions and uncertainty estimation is provided with the database, which can be consulted via web access.
Methodology for Estimation of Flood Magnitude and Frequency for New Jersey Streams
Watson, Kara M.; Schopp, Robert D.
2009-01-01
Methodologies were developed for estimating flood magnitudes at the 2-, 5-, 10-, 25-, 50-, 100-, and 500-year recurrence intervals for unregulated or slightly regulated streams in New Jersey. Regression equations that incorporate basin characteristics were developed to estimate flood magnitude and frequency for streams throughout the State by use of a generalized least squares regression analysis. Relations between flood-frequency estimates based on streamflow-gaging-station discharge and basin characteristics were determined by multiple regression analysis, and weighted by effective years of record. The State was divided into five hydrologically similar regions to refine the regression equations. The regression analysis indicated that flood discharge, as determined by the streamflow-gaging-station annual peak flows, is related to the drainage area, main channel slope, percentage of lake and wetland areas in the basin, population density, and the flood-frequency region, at the 95-percent confidence level. The standard errors of estimate for the various recurrence-interval floods ranged from 48.1 to 62.7 percent. Annual-maximum peak flows observed at streamflow-gaging stations through water year 2007 and basin characteristics determined using geographic information system techniques for 254 streamflow-gaging stations were used for the regression analysis. Drainage areas of the streamflow-gaging stations range from 0.18 to 779 mi². Peak-flow data and basin characteristics for 191 streamflow-gaging stations located in New Jersey were used, along with peak-flow data for stations located in adjoining States, including 25 stations in Pennsylvania, 17 stations in New York, 16 stations in Delaware, and 5 stations in Maryland. Streamflow records for selected stations outside of New Jersey were included in the present study because hydrologic, physiographic, and geologic boundaries commonly extend beyond political boundaries. The StreamStats web application was developed
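Regional regression of this kind is usually fit in log space, with flood quantiles regressed on log-transformed basin characteristics. The sketch below uses synthetic data and ordinary least squares; the actual study used generalized least squares with weights based on effective record length, and more basin characteristics than shown here.

```python
import numpy as np

# Sketch of a regional flood-frequency regression (synthetic data):
# log10(Q100) as a linear function of log10(drainage area) and
# log10(main-channel slope).

rng = np.random.default_rng(0)
n = 40
log_da = rng.uniform(-0.5, 3.0, n)        # log10 drainage area (mi^2)
log_slope = rng.uniform(0.0, 2.0, n)      # log10 main-channel slope
# Synthetic "true" relation with residual scatter:
log_q100 = 1.8 + 0.75 * log_da + 0.30 * log_slope + rng.normal(0.0, 0.1, n)

X = np.column_stack([np.ones(n), log_da, log_slope])
coef, *_ = np.linalg.lstsq(X, log_q100, rcond=None)
a, b, c = coef

# Apply to a hypothetical ungaged basin: 50 mi^2 draining at slope 20.
q100_est = 10.0 ** (a + b * np.log10(50.0) + c * np.log10(20.0))
```

The fitted exponents recover the synthetic values closely; in a real study the residual standard error in log space is what produces the quoted 48-63 percent standard errors of estimate.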
Flood frequency analysis of historical flood data under stationary and non-stationary modelling
Machado, M. J.; Botero, B. A.; López, J.; Francés, F.; Díez-Herrero, A.; Benito, G.
2015-06-01
analysis using documentary data (plus gauged records) improved the estimates of the probabilities of rare floods (return intervals of 100 yr and higher). Under non-stationary modelling, flood occurrence associated with an exceedance probability of 0.01 (i.e., a return period of 100 yr) has changed over the last 500 yr due to decadal and multi-decadal variability of the NAO. Yet frequency analysis under stationary models was successful in providing an average discharge around which the flood quantiles estimated by non-stationary models fluctuate through time.
Flood-frequency characteristics of Wisconsin streams
Walker, John F.; Peppler, Marie C.; Danz, Mari E.; Hubbard, Laura E.
2017-05-22
Flood-frequency characteristics for 360 gaged sites on unregulated rural streams in Wisconsin are presented for percent annual exceedance probabilities ranging from 0.2 to 50, using a statewide skewness map developed for this report. Equations relating flood frequency to drainage-basin characteristics were developed by multiple-regression analyses. Flood-frequency characteristics for ungaged sites on unregulated, rural streams can be estimated by use of the equations presented in this report. The State was divided into eight areas of similar physiographic characteristics. The most significant basin characteristics are drainage area, soil saturated hydraulic conductivity, main-channel slope, and several land-use variables. The standard error of prediction for the equation for the 1-percent annual exceedance probability flood ranges from 56 to 70 percent for Wisconsin streams; these values are larger than results presented in previous reports. The increase in the standard error of prediction is likely due to increased variability of the annual-peak discharges, resulting in increased variability in the magnitude of flood peaks at higher frequencies. For each of the unregulated rural streamflow-gaging stations, a weighted estimate based on the at-site log-Pearson Type III analysis and the multiple-regression results was determined. The weighted estimate generally has a lower uncertainty than either the log-Pearson Type III or the multiple-regression estimate. For regulated streams, a graphical method for estimating flood-frequency characteristics was developed from the relations of discharge and drainage area for selected annual exceedance probabilities. Graphs for the major regulated streams in Wisconsin are presented in the report.
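The weighted estimate described above is, in the usual formulation, an inverse-variance average of the at-site and regression quantiles in log space. A minimal sketch with illustrative variances (not values from the report):

```python
# Sketch: inverse-variance weighting of an at-site (e.g. log-Pearson
# Type III) quantile with a regional-regression quantile, in log10 space.

def weighted_estimate(q_site: float, var_site: float,
                      q_reg: float, var_reg: float) -> float:
    """The estimate with the smaller variance receives the larger weight;
    the weighted result has lower variance than either input."""
    w = var_reg / (var_site + var_reg)
    return w * q_site + (1.0 - w) * q_reg

# Hypothetical log10 discharges: at-site 3.00 (var 0.02), regression 3.20
# (var 0.06). The result lands closer to the lower-variance at-site value.
q_weighted = weighted_estimate(3.00, 0.02, 3.20, 0.06)
```

Here `q_weighted` is 3.05, three quarters of the way toward the at-site estimate, reflecting its three-times-smaller variance.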
Paretti, Nicholas V.; Kennedy, Jeffrey R.; Turney, Lovina A.; Veilleux, Andrea G.
2014-01-01
Flooding is among the worst natural disasters responsible for loss of life and property in Arizona, underscoring the importance of accurate estimation of flood magnitude for proper structural design and floodplain mapping. Twenty-four years of additional peak-flow data have been recorded since the last comprehensive regional flood frequency analysis conducted in Arizona. Periodically, flood frequency estimates and regional regression equations must be revised to maintain the accurate estimation of flood frequency and magnitude.
The index-flood and the GRADEX methods combination for flood frequency analysis.
Fuentes, Diana; Di Baldassarre, Giuliano; Quesada, Beatriz; Xu, Chong-Yu; Halldin, Sven; Beven, Keith
2017-04-01
Flood frequency analysis is used in many applications, including flood risk management, design of hydraulic structures, and urban planning. However, such analysis requires long series of observed discharge data, which are often not available in many basins around the world. In this study, we tested the usefulness of combining regional discharge and local precipitation data to estimate the event flood-volume frequency curve for 63 catchments in Mexico, Central America and the Caribbean. This was achieved by combining two existing flood frequency analysis methods: the regional index-flood approach and the GRADEX method. For return periods of up to 10 years, a similar shape of the scaled flood frequency curve was assumed for catchments with similar flood behaviour, following the index-flood approach. For return periods larger than 10 years, the probability distributions of rainfall and discharge volumes were assumed to be asymptotically exponential with the same scale parameter, following the GRADEX method. Results showed that if the mean annual flood (MAF), used as the index flood, is known, the index-flood approach performed well for return periods of up to 10 years, resulting in a 25% mean relative error in prediction. For larger return periods the predictive capability decreased, but could be improved by use of the GRADEX method. As the MAF is unknown in ungauged basins and basins with short records, we tested predicting the MAF from climatic and physical catchment characteristics, and from discharge statistics, the latter when observations were available for only 8 years. Only the use of discharge statistics resulted in acceptable predictions.
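The combination described above can be sketched in a few lines: the index-flood step scales a regional growth curve by the MAF up to a pivot return period, and the GRADEX step then grows the quantile beyond the pivot using the Gumbel scale parameter of rainfall (the "gradex"), under the assumption that additional rainfall converts fully to runoff. All numbers below are illustrative, not the study's data.

```python
import math

def gumbel_reduced_variate(T: float) -> float:
    """u(T) = -ln(-ln(1 - 1/T)) for an annual-maximum Gumbel model."""
    return -math.log(-math.log(1.0 - 1.0 / T))

def gradex_quantile(q_pivot: float, t_pivot: float, t_target: float,
                    rain_gradex: float) -> float:
    """GRADEX extrapolation: beyond the pivot return period, the flood
    quantile grows parallel to the rainfall frequency curve."""
    return q_pivot + rain_gradex * (gumbel_reduced_variate(t_target)
                                    - gumbel_reduced_variate(t_pivot))

maf = 250.0              # hypothetical mean annual flood (index flood)
growth_10yr = 2.0        # hypothetical regional growth factor at T = 10
q10 = maf * growth_10yr  # index-flood estimate at the pivot (10 yr)
q100 = gradex_quantile(q10, 10.0, 100.0, rain_gradex=80.0)
```

With these numbers the 100-year quantile is about 688, i.e. the 10-year index-flood estimate plus the rainfall gradex times the increase in the Gumbel reduced variate from T = 10 to T = 100.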
Garavaglia, F.; Paquet, E.; Lang, M.; Renard, B.; Arnaud, P.; Aubert, Y.; Carre, J.
2013-12-01
In flood risk assessment, methods can be divided into two families: deterministic and probabilistic. In the French hydrologic community, probabilistic methods have historically been preferred to deterministic ones. A French research project named EXTRAFLO (RiskNat Program of the French National Research Agency, https://extraflo.cemagref.fr) deals with design values for extreme rainfall and floods. The object of this project is to compare the main methods used in France for estimating extreme values of rainfall and floods, to obtain a better grasp of their respective fields of application. In this framework we present the results of Task 7 of the EXTRAFLO project. Focusing on French watersheds, we compare the main extreme-flood estimation methods used in the French context: (i) standard flood frequency analysis (Gumbel and GEV distributions), (ii) regional flood frequency analysis (regional Gumbel and GEV distributions), (iii) local and regional flood frequency analysis improved by historical information (Naulet et al., 2005), (iv) simplified probabilistic methods based on rainfall information (i.e., the Gradex method (CFGB, 1994), the Agregee method (Margoum, 1992) and the Speed method (Cayla, 1995)), (v) flood frequency analysis by continuous simulation based on rainfall information (i.e., the Schadex method (Paquet et al., 2013; Garavaglia et al., 2010) and the Shyreg method (Lavabre et al., 2003)), and (vi) a multifractal approach. The main result of this comparative study is that probabilistic methods based on additional information (i.e., regional, historical and rainfall information) provide better estimates than standard flood frequency analysis. Another interesting result is that the differences between the extreme-flood quantile estimates of the compared methods increase with return period, remaining relatively moderate up to 100-year return levels. Results and discussions are illustrated throughout with the example
Amplification of flood frequencies with local sea level rise and emerging flood regimes
Buchanan, Maya K.; Oppenheimer, Michael; Kopp, Robert E.
2017-06-01
The amplification of flood frequencies by sea level rise (SLR) is expected to become one of the most economically damaging impacts of climate change for many coastal locations. Understanding the magnitude and pattern by which the frequency of current flood levels increase is important for developing more resilient coastal settlements, particularly since flood risk management (e.g. infrastructure, insurance, communications) is often tied to estimates of flood return periods. The Intergovernmental Panel on Climate Change’s Fifth Assessment Report characterized the multiplication factor by which the frequency of flooding of a given height increases (referred to here as an amplification factor; AF). However, this characterization neither rigorously considered uncertainty in SLR nor distinguished between the amplification of different flooding levels (such as the 10% versus 0.2% annual chance floods); therefore, it may be seriously misleading. Because both historical flood frequency and projected SLR are uncertain, we combine joint probability distributions of the two to calculate AFs and their uncertainties over time. Under probabilistic relative sea level projections, while maintaining storm frequency fixed, we estimate a median 40-fold increase (ranging from 1- to 1314-fold) in the expected annual number of local 100-year floods for tide-gauge locations along the contiguous US coastline by 2050. While some places can expect disproportionate amplification of higher frequency events and thus primarily a greater number of historically precedented floods, others face amplification of lower frequency events and thus a particularly fast growing risk of historically unprecedented flooding. For example, with 50 cm of SLR, the 10%, 1%, and 0.2% annual chance floods are expected respectively to recur 108, 335, and 814 times as often in Seattle, but 148, 16, and 4 times as often in Charleston, SC.
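The spread of amplification factors quoted above is consistent with the exponential tail of extreme-value fits to sea levels: under a Gumbel model, raising every flood level by an amount s multiplies the exceedance frequency of any fixed height by roughly exp(s/scale). A minimal sketch with an illustrative scale parameter (not a value from the paper):

```python
import math

def amplification_factor(slr: float, gumbel_scale: float) -> float:
    """Tail approximation under a Gumbel fit to extreme sea levels: the
    expected exceedance frequency of a fixed height z scales like
    exp(-(z - loc)/scale), so adding `slr` to all levels multiplies the
    frequency by exp(slr / scale)."""
    return math.exp(slr / gumbel_scale)

# Hypothetical site with a 0.1 m Gumbel scale and 0.5 m of SLR:
print(amplification_factor(0.5, 0.1))  # exp(5) ~ 148.4
```

Sites with small scale parameters (steep exceedance curves) thus see enormous amplification of a given flood level, while sites with broad, heavy-tailed exceedance curves see modest factors, which is the cross-site contrast the abstract highlights.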
Climate, orography and scale controls on flood frequency in Triveneto (Italy
Directory of Open Access Journals (Sweden)
S. Persiano
2016-05-01
The growing concern about the possible effects of climate change on the flood frequency regime is leading authorities to review previously proposed reference procedures for design-flood estimation, such as national flood frequency models. Our study focuses on Triveneto, a broad geographical region in north-eastern Italy. A reference procedure for design flood estimation in Triveneto is available from the Italian CNR research project "VA.PI.", which considered Triveneto as a single homogeneous region and developed a regional model using annual maximum series (AMS) of peak discharges collected up to the 1980s by the former Italian Hydrometeorological Service. We consider a very detailed AMS database that we recently compiled for 76 catchments located in Triveneto. All 76 study catchments are characterized in terms of several geomorphologic and climatic descriptors. The objective of our study is threefold: (1) to inspect climatic and scale controls on the flood frequency regime; (2) to check for possible changes in the flood frequency regime by looking at changes in time of the regional L-moments of annual maximum floods; (3) to develop an updated reference procedure for design flood estimation in Triveneto using a focused-pooling approach (i.e., Region of Influence, RoI). Our study leads to the following conclusions: (1) climatic and scale controls on the flood frequency regime in Triveneto are similar to the controls recently found in Europe; (2) a single year characterized by extreme floods can have a remarkable influence on regional flood frequency models and on analyses for detecting possible changes in the flood frequency regime; (3) no significant change was detected in the flood frequency regime, yet an update of the existing reference procedure for design flood estimation is highly recommended, and we propose the RoI approach for properly representing climate and scale controls on flood frequency in Triveneto, which cannot be regarded
Feaster, Toby D.; Gotvald, Anthony J.; Weaver, J. Curtis
2014-01-01
Reliable estimates of the magnitude and frequency of floods are essential for the design of transportation and water-conveyance structures, flood-insurance studies, and flood-plain management. The flood-frequency estimates are particularly important in densely populated urban areas. A multistate approach was used to update methods for determining the magnitude and frequency of floods in urban and small, rural streams that are not substantially affected by regulation or tidal fluctuations in Georgia, South Carolina, and North Carolina. The multistate approach has the advantage over a single-state approach of increasing the number of stations available for analysis, expanding the geographical coverage to allow application of regional regression equations across state boundaries, and building on a previous flood-frequency investigation of rural streamflow-gaging stations (streamgages) in the Southeastern United States. In addition, streamgages from the inner Coastal Plain of New Jersey were included in the analysis. Generalized least-squares regression techniques were used to generate predictive equations for estimating the 50-, 20-, 10-, 4-, 2-, 1-, 0.5-, and 0.2-percent annual exceedance probability flows for urban and small, rural ungaged basins for three hydrologic regions: the Piedmont-Ridge and Valley, Sand Hills, and Coastal Plain. Incorporation of urban streamgages from New Jersey also allowed for the expansion of the applicability of the predictive equations in the Coastal Plain from 2.1 to 53.5 square miles. Explanatory variables in the regression equations included drainage area (DA) and percent impervious area (IA) for the Piedmont-Ridge and Valley region; DA and percent developed land for the Sand Hills; and DA, IA, and 24-hour, 50-year maximum precipitation for the Coastal Plain. An application spreadsheet also was developed that can be used to compute the flood-frequency estimates along with the 95-percent prediction
Curran, Janet H.; Barth, Nancy A.; Veilleux, Andrea G.; Ourso, Robert T.
2016-03-16
Estimates of the magnitude and frequency of floods are needed across Alaska for engineering design of transportation and water-conveyance structures, flood-insurance studies, flood-plain management, and other water-resource purposes. This report updates methods for estimating flood magnitude and frequency in Alaska and conterminous basins in Canada. Annual peak-flow data through water year 2012 were compiled from 387 streamgages on unregulated streams with at least 10 years of record. Flood-frequency estimates were computed for each streamgage using the Expected Moments Algorithm to fit a Pearson Type III distribution to the logarithms of annual peak flows. A multiple Grubbs-Beck test was used to identify potentially influential low floods in the time series of peak flows for censoring in the flood frequency analysis. For two new regional skew areas, flood-frequency estimates using station skew were computed for stations with at least 25 years of record for use in a Bayesian least-squares regression analysis to determine a regional skew value. The consideration of basin characteristics as explanatory variables for regional skew resulted in improvements in precision too small to warrant the additional model complexity, and a constant model was adopted. Regional Skew Area 1 in eastern-central Alaska had a regional skew of 0.54 and an average variance of prediction of 0.45, corresponding to an effective record length of 22 years. Regional Skew Area 2, encompassing coastal areas bordering the Gulf of Alaska, had a regional skew of 0.18 and an average variance of prediction of 0.12, corresponding to an effective record length of 59 years. Station flood-frequency estimates for study sites in regional skew areas were then recomputed using a weighted skew incorporating the station skew and regional skew. In a new regional skew exclusion area outside the regional skew areas, the density of long-record streamgages was too sparse for regional analysis and station skew was used
Arnaud, Patrick; Cantet, Philippe; Odry, Jean
2017-11-01
Flood frequency analyses (FFAs) are needed for flood risk management. Many methods exist, ranging from classical purely statistical approaches to more complex approaches based on process simulation. The results of these methods are associated with uncertainties that are sometimes difficult to estimate due to the complexity of the approaches or the number of parameters, especially for process simulation. This is the case for the simulation-based FFA approach called SHYREG presented in this paper, which couples a rainfall generator with a simple rainfall-runoff model; here we attempt to estimate the uncertainties arising from the estimation of the seven parameters needed to estimate flood frequencies. The six parameters of the rainfall generator are mean values, so their theoretical distribution is known and can be used to estimate the generator uncertainties. In contrast, the theoretical distribution of the single hydrological-model parameter is unknown; consequently, a bootstrap method is applied to estimate the calibration uncertainties. The propagation of uncertainty from the rainfall generator to the hydrological model is also taken into account. This method is applied to 1112 basins throughout France. Uncertainties coming from the SHYREG method and from purely statistical approaches are compared, and the results are discussed according to the length of the recorded observations, basin size and basin location. Uncertainties of the SHYREG method decrease as the basin size increases or as the length of the recorded flow increases. Moreover, the results show that the confidence intervals of the SHYREG method are relatively small despite the complexity of the method and the number of parameters (seven). This is due to the stability of the parameters and to accounting for the dependence between uncertainties from the rainfall model and the hydrological calibration. Indeed, the uncertainties on the flow quantiles are on the same order of magnitude as those associated with
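The bootstrap step described for the single hydrological-model parameter can be sketched generically: resample the calibration record with replacement, recalibrate on each resample, and read the spread of the refitted parameter as calibration uncertainty. The `calibrate` function below is a stand-in (a simple median), not the SHYREG model, and the data are synthetic.

```python
import numpy as np

# Generic bootstrap sketch for a calibrated parameter whose theoretical
# sampling distribution is unknown.

def calibrate(flows: np.ndarray) -> float:
    """Stand-in calibration: returns a single parameter from the record."""
    return float(np.median(flows))

rng = np.random.default_rng(42)
obs = rng.lognormal(mean=3.0, sigma=0.5, size=30)   # 30 years of flows
boot = np.array([calibrate(rng.choice(obs, size=obs.size, replace=True))
                 for _ in range(1000)])
lo, hi = np.percentile(boot, [2.5, 97.5])           # 95% calibration interval
```

The width of `[lo, hi]` shrinks as the record lengthens, which mirrors the abstract's finding that SHYREG uncertainties decrease with the length of the recorded flow.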
2014-03-01
Reliable estimates of the magnitude and frequency : of floods are essential for the design of transportation and : water-conveyance structures, flood-insurance studies, and : flood-plain management. Such estimates are particularly : important in dens...
Evaluation of design flood estimates with respect to sample size
Kobierska, Florian; Engeland, Kolbjorn
2016-04-01
Estimation of design floods forms the basis for hazard management related to flood risk and is a legal obligation when building infrastructure such as dams, bridges and roads close to water bodies. Flood inundation maps used for land-use planning are also produced based on design flood estimates. In Norway, the current guidelines for design flood estimation give recommendations on which data, probability distribution, and estimation method to use depending on the length of the local record. If less than 30 years of local data are available, an index-flood approach is recommended, where the local observations are used for estimating the index flood and regional data are used for estimating the growth curve. For 30-50 years of data, a 2-parameter distribution is recommended, and for more than 50 years of data, a 3-parameter distribution should be used. Many countries have national guidelines for flood frequency estimation; recommended distributions include the log-Pearson type III, generalized logistic and generalized extreme value distributions. For estimating distribution parameters, ordinary moments, L-moments, maximum likelihood and Bayesian methods are used. The aim of this study is to re-evaluate the guidelines for local flood frequency estimation. In particular, we wanted to answer the following questions: (i) Which distribution gives the best fit to the data? (ii) Which estimation method provides the best fit to the data? (iii) Do the answers to (i) and (ii) depend on local data availability? To answer these questions we set up a test bench for local flood frequency analysis using data-based cross-validation methods. The criteria were based on indices describing the stability and reliability of design flood estimates. Stability is used as a criterion since design flood estimates should not depend excessively on the data sample. The reliability indices describe the degree to which design flood predictions can be trusted.
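One simple stability index of the kind such a test bench might use: fit the same distribution (here a Gumbel, by method of moments) to two halves of the record and compare the resulting 100-year design floods. This is an illustrative sketch with synthetic data, not the Norwegian test bench, which cross-validates many distributions and estimation methods.

```python
import math
import numpy as np

def gumbel_quantile(sample: np.ndarray, T: float) -> float:
    """Method-of-moments Gumbel fit, then the T-year quantile."""
    scale = math.sqrt(6.0) * sample.std(ddof=1) / math.pi
    loc = sample.mean() - 0.5772 * scale   # 0.5772: Euler-Mascheroni constant
    return loc - scale * math.log(-math.log(1.0 - 1.0 / T))

rng = np.random.default_rng(7)
record = rng.gumbel(loc=100.0, scale=30.0, size=60)  # 60 synthetic annual maxima
q_a = gumbel_quantile(record[:30], 100.0)
q_b = gumbel_quantile(record[30:], 100.0)
stability = abs(q_a - q_b) / (0.5 * (q_a + q_b))     # smaller = more stable
```

A design estimate that swings strongly between the two halves (large `stability`) depends excessively on the sample, which is exactly what the stability criterion is meant to penalize.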
Estimation of Flood-Frequency Discharges for Rural, Unregulated Streams in West Virginia
Wiley, Jeffrey B.; Atkins, John T.
2010-01-01
Flood-frequency discharges were determined for 290 streamgage stations having a minimum of 9 years of record in West Virginia and surrounding states through the 2006 or 2007 water year. No trend was determined in the annual peaks used to calculate the flood-frequency discharges. Multiple and simple least-squares regression equations for the 100-year (1-percent annual-occurrence probability) flood discharge with independent variables that describe the basin characteristics were developed for 290 streamgage stations in West Virginia and adjacent states. The regression residuals for the models were evaluated and used to define three regions of the State, designated as Eastern Panhandle, Central Mountains, and Western Plateaus. Exploratory data analysis procedures identified 44 streamgage stations that were excluded from the development of regression equations representative of rural, unregulated streams in West Virginia. Regional equations for the 1.1-, 1.5-, 2-, 5-, 10-, 25-, 50-, 100-, 200-, and 500-year flood discharges were determined by generalized least-squares regression using data from the remaining 246 streamgage stations. Drainage area was the only significant independent variable determined for all equations in all regions. Procedures developed to estimate flood-frequency discharges on ungaged streams were based on (1) regional equations and (2) drainage-area ratios between gaged and ungaged locations on the same stream. The procedures are applicable only to rural, unregulated streams within the boundaries of West Virginia that have drainage areas within the limits of the stations used to develop the regional equations (from 0.21 to 1,461 square miles in the Eastern Panhandle, from 0.10 to 1,619 square miles in the Central Mountains, and from 0.13 to 1,516 square miles in the Western Plateaus). The accuracy of the equations is quantified by measuring the average prediction error (from 21.7 to 56.3 percent) and equivalent years of record (from 2.0 to 70
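Procedure (2) above, the drainage-area-ratio transfer between gaged and ungaged locations on the same stream, has a standard one-line form. The exponent below is a typical illustrative value, not one derived in the report.

```python
# Sketch: transfer a flood-frequency discharge from a gaged site to an
# ungaged site on the same stream via a drainage-area ratio.

def transfer_by_area_ratio(q_gaged: float, area_gaged: float,
                           area_ungaged: float,
                           exponent: float = 0.85) -> float:
    """Q_ungaged = Q_gaged * (A_ungaged / A_gaged) ** exponent.
    The exponent (here 0.85, hypothetical) is typically taken from the
    regional regression's drainage-area coefficient."""
    return q_gaged * (area_ungaged / area_gaged) ** exponent

# Hypothetical: 100-year flood of 1,000 ft3/s at a gage draining 100 mi^2,
# transferred to an upstream site draining 50 mi^2:
q_ungaged = transfer_by_area_ratio(1000.0, 100.0, 50.0)  # ~555 ft3/s
```

Because the exponent is below 1, discharge scales less than proportionally with drainage area, so halving the area reduces the quantile by only about 45 percent.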
Development of a customised design flood estimation tool to ...
African Journals Online (AJOL)
The estimation of design flood events, i.e., floods characterised by a specific magnitude-frequency relationship, at a particular site in a specific region is necessary for the planning, design and operation of hydraulic structures. Both the occurrence and frequency of flood events, along with the uncertainty involved in the ...
FEH Local: improving flood estimates using historical data
Prosdocimi, Ilaria; Stewart, Lisa; Faulkner, Duncan; Mitchell, Chrissy
2016-01-01
The traditional approach to design flood estimation (for example, to derive the 100-year flood) is to apply a statistical model to time series of peak river flow measured by gauging stations. Such records are typically not very long; for example, in the UK only about 10% of the stations have records that are more than 50 years in length. A long-explored way to augment the data available from a gauging station is to derive information about historical flood events and paleo-floods, which can be ...
Do regional methods really help reduce uncertainties in flood frequency analyses?
Cong Nguyen, Chi; Payrastre, Olivier; Gaume, Eric
2013-04-01
Flood frequency analyses are often based on continuous measured series at gauge sites. However, the length of the available data sets is usually too short to provide reliable estimates of extreme design floods. To reduce the estimation uncertainties, the analyzed data sets have to be extended either in time, making use of historical and paleoflood data, or in space, merging data sets considered as statistically homogeneous to build large regional data samples. Nevertheless, the main advantage of regional analyses, the substantial increase in the size of the studied data sets, may be counterbalanced by possible heterogeneities of the merged sets. The application and comparison of four different flood frequency analysis methods to two regions affected by flash floods in the south of France (Ardèche and Var) illustrates how this balance between the number of records and possible heterogeneities plays out in real-world applications. The four tested methods are: (1) a local statistical analysis based on the existing series of measured discharges, (2) a local analysis incorporating the existing information on historical floods, (3) a standard regional flood frequency analysis based on existing measured series at gauged sites and (4) a modified regional analysis including estimated extreme peak discharges at ungauged sites. Monte Carlo simulations are conducted to simulate a large number of discharge series with characteristics similar to the observed ones (type of statistical distributions, number of sites and records) to evaluate the extent to which the results obtained on these case studies can be generalized. These two case studies indicate that even small statistical heterogeneities, which are not detected by the standard homogeneity tests implemented in regional flood frequency studies, may drastically limit the usefulness of such approaches. On the other hand, these results show that the incorporation of information on extreme events, either historical flood events at gauged
Nobert, Joel; Mugo, Margaret; Gadain, Hussein
Reliable estimation of flood magnitudes corresponding to required return periods, vital for structural design purposes, is impacted by the lack of hydrological data in the study area of Lake Victoria Basin in Kenya. Use of regional information, derived from data at gauged sites and regionalized for use at any location within a homogeneous region, would improve the reliability of design flood estimation. Therefore, the regional index flood method has been applied. Based on data from 14 gauged sites, a delineation of the basin into two homogeneous regions was achieved using elevation variation (90-m DEM), spatial annual rainfall pattern and Principal Component Analysis of seasonal rainfall patterns (from 94 rainfall stations). At-site annual maximum series were modelled using the Log Normal (LN) (3P), Log Logistic Distribution (LLG), Generalized Extreme Value (GEV) and Log Pearson Type 3 (LP3) distributions. The parameters of the distributions were estimated using the method of probability weighted moments. Goodness-of-fit tests were applied and the GEV was identified as the most appropriate model for each site. Based on the GEV model, flood quantiles were estimated and regional frequency curves derived from the averaged at-site growth curves. Using the least squares regression method, relationships were developed between the index flood, which is defined as the Mean Annual Flood (MAF), and catchment characteristics. The relationships indicated that area, mean annual rainfall and altitude were the three significant variables that greatly influence the index flood. Thereafter, flood magnitudes in ungauged catchments within a homogeneous region were estimated from the derived equations for the index flood and quantiles from the regional curves. These estimates will improve flood risk estimation and support water management and engineering decisions and actions.
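The core of the regional index flood method described above can be sketched as follows. The growth-curve data and GEV parameters here are synthetic placeholders, not the study's: the design flood at a site is the index flood (MAF) multiplied by a dimensionless growth factor read from a regional GEV curve.

```python
import numpy as np
from scipy.stats import genextreme

# Synthetic pooled, rescaled annual maxima (at-site series divided by MAF),
# standing in for the regional sample used to derive the growth curve.
rng = np.random.default_rng(42)
pooled = genextreme.rvs(c=-0.1, loc=1.0, scale=0.35, size=500, random_state=rng)

# Fit the regional growth curve (here by maximum likelihood; the study
# used probability weighted moments).
shape, loc, scale = genextreme.fit(pooled)

def design_flood(maf, return_period):
    """Quantile = index flood (MAF) x regional growth factor."""
    gf = genextreme.ppf(1 - 1 / return_period, shape, loc=loc, scale=scale)
    return maf * gf
```

In an ungauged catchment, the MAF itself would come from the regression on area, rainfall, and altitude described in the abstract.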
Restrepo-Estrada, Camilo; de Andrade, Sidgley Camargo; Abe, Narumi; Fava, Maria Clara; Mendiondo, Eduardo Mario; de Albuquerque, João Porto
2018-02-01
Floods are one of the most devastating types of worldwide disasters in terms of human, economic, and social losses. If authoritative data is scarce, or unavailable for some periods, other sources of information are required to improve streamflow estimation and early flood warnings. Georeferenced social media messages are increasingly being regarded as an alternative source of information for coping with flood risks. However, existing studies have mostly concentrated on the links between geo-social media activity and flooded areas. Thus, there is still a gap in research with regard to the use of social media as a proxy for rainfall-runoff estimations and flood forecasting. To address this, we propose using a transformation function that creates a proxy variable for rainfall by analysing geo-social media messages and rainfall measurements from authoritative sources, which are later incorporated within a hydrological model for streamflow estimation. We found that the combined use of official rainfall values with the social media proxy variable as input for the Probability Distributed Model (PDM), improved streamflow simulations for flood monitoring. The combination of authoritative sources and transformed geo-social media data during flood events achieved a 71% degree of accuracy and a 29% underestimation rate in a comparison made with real streamflow measurements. This is a significant improvement on the respective values of 39% and 58%, achieved when only authoritative data were used for the modelling. This result is clear evidence of the potential use of derived geo-social media data as a proxy for environmental variables for improving flood early-warning systems.
Doubling of coastal flooding frequency within decades due to sea-level rise
Vitousek, Sean; Barnard, Patrick L.; Fletcher, Charles H.; Frazer, Neil; Erikson, Li; Storlazzi, Curt D.
2017-01-01
Global climate change drives sea-level rise, increasing the frequency of coastal flooding. In most coastal regions, the amount of sea-level rise occurring over years to decades is significantly smaller than normal ocean-level fluctuations caused by tides, waves, and storm surge. However, even gradual sea-level rise can rapidly increase the frequency and severity of coastal flooding. So far, global-scale estimates of increased coastal flooding due to sea-level rise have not considered elevated water levels due to waves, and thus underestimate the potential impact. Here we use extreme value theory to combine sea-level projections with wave, tide, and storm surge models to estimate increases in coastal flooding on a continuous global scale. We find that regions with limited water-level variability, i.e., short-tailed flood-level distributions, located mainly in the Tropics, will experience the largest increases in flooding frequency. The 10 to 20 cm of sea-level rise expected no later than 2050 will more than double the frequency of extreme water-level events in the Tropics, impairing the developing economies of equatorial coastal cities and the habitability of low-lying Pacific island nations.
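The mechanism in the abstract above can be made concrete with a back-of-envelope extreme-value calculation (parameter values are illustrative, not from the paper). Assuming a Gumbel upper tail for extreme water levels with scale beta, a sea-level rise of delta multiplies the exceedance frequency of any fixed flood threshold by exp(delta / beta), so coasts with small water-level variability (small beta, as in the Tropics) see the largest amplification.

```python
import math

def frequency_multiplier(slr_m, gumbel_scale_m):
    """Factor by which sea-level rise slr_m raises the exceedance
    frequency of a fixed threshold under a Gumbel tail of scale beta."""
    return math.exp(slr_m / gumbel_scale_m)

# 10 cm of sea-level rise:
tropical = frequency_multiplier(0.10, 0.10)  # small variability, beta ~ 0.1 m
midlat = frequency_multiplier(0.10, 0.40)    # large variability, beta ~ 0.4 m
```

With these assumed scales, the same 10 cm of rise more than doubles flooding frequency on the low-variability coast while raising it only modestly on the high-variability one, which is the short-tailed-distribution effect the paper reports.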
Conditional flood frequency and catchment state: a simulation approach
Brettschneider, Marco; Bourgin, François; Merz, Bruno; Andreassian, Vazken; Blaquiere, Simon
2017-04-01
Catchments have memory and the conditional flood frequency distribution for a time period ahead can be seen as non-stationary: it varies with the catchment state and climatic factors. From a risk management perspective, understanding the link of conditional flood frequency to catchment state is a key to anticipate potential periods of higher flood risk. Here, we adopt a simulation approach to explore the link between flood frequency obtained by continuous rainfall-runoff simulation and the initial state of the catchment. The simulation chain is based on i) a three state rainfall generator applied at the catchment scale, whose parameters are estimated for each month, and ii) the GR4J lumped rainfall-runoff model, whose parameters are calibrated with all available data. For each month, a large number of stochastic realizations of the continuous rainfall generator for the next 12 months are used as inputs for the GR4J model in order to obtain a large number of stochastic realizations for the next 12 months. This process is then repeated for 50 different initial states of the soil moisture reservoir of the GR4J model and for all the catchments. Thus, 50 different conditional flood frequency curves are obtained for the 50 different initial catchment states. We will present an analysis of the link between the catchment states, the period of the year and the strength of the conditioning of the flood frequency compared to the unconditional flood frequency. A large sample of diverse catchments in France will be used.
Feaster, Toby D.; Gotvald, Anthony J.; Weaver, J. Curtis
2014-01-01
Reliable estimates of the magnitude and frequency of floods are essential for the design of transportation and water-conveyance structures, flood-insurance studies, and flood-plain management. Such estimates are particularly important in densely populated urban areas. In order to increase the number of streamflow-gaging stations (streamgages) available for analysis, expand the geographical coverage that would allow for application of regional regression equations across State boundaries, and build on a previous flood-frequency investigation of rural U.S. Geological Survey streamgages in the Southeast United States, a multistate approach was used to update methods for determining the magnitude and frequency of floods in urban and small, rural streams that are not substantially affected by regulation or tidal fluctuations in Georgia, South Carolina, and North Carolina. The at-site flood-frequency analysis of annual peak-flow data for urban and small, rural streams (through September 30, 2011) included 116 urban streamgages and 32 small, rural streamgages, defined in this report as basins draining less than 1 square mile. The regional regression analysis included annual peak-flow data from an additional 338 rural streamgages previously included in U.S. Geological Survey flood-frequency reports and 2 additional rural streamgages in North Carolina that were not included in the previous Southeast rural flood-frequency investigation, for a total of 488 streamgages included in the urban and small, rural regression analysis. The at-site flood-frequency analyses for the urban and small, rural streamgages included the expected moments algorithm, which is a modification of the Bulletin 17B log-Pearson Type III method for fitting the statistical distribution to the logarithms of the annual peak flows. Where applicable, the flood-frequency analysis also included low-outlier and historic information. Additionally, the application of a generalized Grubbs-Beck test allowed for the
Lind, Greg D.; Stonewall, Adam J.
2018-02-13
In this study, “naturalized” daily streamflow records, created by the U.S. Army Corps of Engineers and the Bureau of Reclamation, were used to compute 1-, 3-, 7-, 10-, 15-, 30-, and 60-day annual maximum streamflow durations, which are running averages of daily streamflow for the number of days in each duration. Once the annual maximum durations were computed, the flood-duration frequencies could be estimated. The estimated flood-duration frequencies correspond to the 50-, 20-, 10-, 4-, 2-, 1-, 0.5-, and 0.2-percent probabilities of their occurring or being exceeded each year. For this report, the focus was on the Willamette River Basin in Oregon, which is a subbasin of the Columbia River Basin. This study is part of a larger one encompassing the entire Columbia Basin.
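The duration computation described above reduces to taking the largest n-day running average of daily streamflow within each year. The sketch below uses synthetic daily flows; only the running-average logic follows the report's definition.

```python
import numpy as np

def annual_max_duration(daily_flow, ndays):
    """Largest ndays-day running average of a year's daily streamflow."""
    kernel = np.ones(ndays) / ndays
    running = np.convolve(daily_flow, kernel, mode="valid")
    return running.max()

# Synthetic daily flows for one water year (illustrative values only).
rng = np.random.default_rng(0)
year = rng.gamma(shape=2.0, scale=50.0, size=365)

amax = {n: annual_max_duration(year, n) for n in (1, 3, 7, 15, 30, 60)}
```

Repeating this for every year of record yields the annual-maximum duration series to which a frequency distribution can then be fitted for each duration.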
Quantification of Uncertainty in the Flood Frequency Analysis
Kasiapillai Sudalaimuthu, K.; He, J.; Swami, D.
2017-12-01
Flood frequency analysis (FFA) is usually carried out for the planning and design of water resources and hydraulic structures. Owing to variability in sample representation, selection of distribution, and estimation of distribution parameters, the estimation of flood quantiles has always been uncertain. Hence, suitable approaches must be developed to quantify the uncertainty in the form of a prediction interval as an alternative to the deterministic approach. The framework developed in the present study to include uncertainty in the FFA uses a multi-objective optimization approach to construct the prediction interval from an ensemble of flood quantiles. Through this approach, an optimal variability of distribution parameters is identified to carry out the FFA. To demonstrate the proposed approach, annual maximum flow data from two gauge stations (Bow River at Calgary and at Banff, Canada) are used. The major focus of the present study was to evaluate the changes in the magnitude of flood quantiles due to the extreme flood event that occurred during the year 2013. In addition, the efficacy of the proposed method was further verified using standard bootstrap-based sampling approaches, and it was found that the proposed method is more reliable in modeling extreme floods than the bootstrap methods.
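The standard bootstrap baseline mentioned above (not the authors' multi-objective method) can be sketched as follows: resample the annual-maximum series with replacement, refit the distribution each time, and take percentiles of the resulting quantile ensemble as the prediction interval. The data and the method-of-moments Gumbel fit here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic annual maximum series (60 years).
amax = rng.gumbel(loc=1000.0, scale=300.0, size=60)

def gumbel_q(sample, T):
    """Method-of-moments Gumbel fit, then the T-year quantile."""
    scale = np.std(sample, ddof=1) * np.sqrt(6) / np.pi
    loc = np.mean(sample) - 0.5772 * scale
    return loc - scale * np.log(-np.log(1 - 1 / T))

# Bootstrap ensemble of 100-year quantile estimates.
boots = [gumbel_q(rng.choice(amax, size=amax.size, replace=True), 100)
         for _ in range(2000)]
lo, hi = np.percentile(boots, [2.5, 97.5])  # 95% prediction interval
```

The width of (lo, hi) is the uncertainty measure against which an alternative interval-construction method, such as the one proposed in the abstract, would be compared.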
Real-time updating of the flood frequency distribution through data assimilation
Aguilar, Cristina; Montanari, Alberto; Polo, María-José
2017-07-01
We explore the memory properties of catchments for predicting the likelihood of floods based on observations of average flows in pre-flood seasons. Our approach assumes that flood formation is driven by the superimposition of short- and long-term perturbations. The former is given by the short-term meteorological forcing leading to infiltration and/or saturation excess, while the latter is originated by higher-than-usual storage in the catchment. To exploit the above sensitivity to long-term perturbations, a meta-Gaussian model and a data assimilation approach are implemented for updating the flood frequency distribution a season in advance. Accordingly, the peak flow in the flood season is predicted in probabilistic terms by exploiting its dependence on the average flow in the antecedent seasons. We focus on the Po River at Pontelagoscuro and the Danube River at Bratislava. We found that the shape of the flood frequency distribution is noticeably impacted by higher-than-usual flows occurring up to several months earlier. The proposed technique may allow one to reduce the uncertainty associated with the estimation of flood frequency.
An improved method for estimating the frequency correlation function
Chelli, Ali; Pätzold, Matthias
2012-01-01
For time-invariant frequency-selective channels, the transfer function is a superposition of waves having different propagation delays and path gains. In order to estimate the frequency correlation function (FCF) of such channels, the frequency averaging technique can be utilized. The obtained FCF can be expressed as a sum of auto-terms (ATs) and cross-terms (CTs). The ATs are caused by the autocorrelation of individual path components. The CTs are due to the cross-correlation of different path components. These CTs have no physical meaning and lead to an estimation error. We propose a new estimation method aiming to improve the estimation accuracy of the FCF of a band-limited transfer function. The basic idea behind the proposed method is to introduce a kernel function that reduces the CT effect while preserving the ATs. In this way, we can improve the estimation of the FCF. The performance of the proposed method and the frequency averaging technique is analyzed using a synthetically generated transfer function. We show that the proposed method is more accurate than the frequency averaging technique. The accurate estimation of the FCF is crucial for the system design. In fact, we can determine the coherence bandwidth from the FCF. The exact knowledge of the coherence bandwidth is beneficial in both the design and the optimization of frequency interleaving and pilot arrangement schemes. © 2012 IEEE.
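The baseline frequency averaging technique discussed above can be sketched on a synthetic two-path transfer function (the paper's kernel-based improvement is not reproduced here; path delays and gains below are assumed for illustration). The FCF estimate at lag Δf averages the products H(f)·H*(f + Δf) across the band.

```python
import numpy as np

# Synthetic band-limited transfer function: two paths with assumed
# delays (seconds) and gains, sampled over a 20 MHz band.
f = np.arange(0.0, 20e6, 20e3)                  # 1000 frequency samples
delays = np.array([0.1e-6, 0.5e-6])
gains = np.array([1.0, 0.6])
H = (gains[:, None] * np.exp(-2j * np.pi * delays[:, None] * f)).sum(axis=0)

def fcf(H, max_lag):
    """Frequency-averaged correlation for lags 0 .. max_lag-1 samples."""
    n = len(H)
    return np.array([np.mean(H[: n - k] * np.conj(H[k:]))
                     for k in range(max_lag)])

R = fcf(H, 200)  # lag k corresponds to a frequency separation of k * 20 kHz
```

At lag zero the estimate recovers the total path power (here 1.0² + 0.6² = 1.36); the decay of |R| with lag gives the coherence bandwidth, and the residual oscillations around the AT-only curve are the CT error the proposed kernel method is designed to suppress.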
Formetta, Giuseppe; Stewart, Elizabeth; Bell, Victoria; Reynard, Nick
2017-04-01
Estimation of peak discharge for an assigned return period is a crucial issue in engineering hydrology. It is required for designing and managing hydraulic infrastructure such as dams, reservoirs and bridges. In the UK, the Flood Estimation Handbook (FEH) recommends the use of the index flood method to estimate the design flood as the product of a local scale factor (the index flood, IF) and a dimensionless regional growth factor (GF). For gauged catchments the IF is usually estimated as the median annual maximum flood (QMED), while for ungauged catchments it is computed through multiple linear regression models based on a set of morpho-climatic indices of the basin. The GF is estimated by fitting the annual maxima with the generalised logistic distribution (GL) using two methods depending on the record length and the target return period: single-site or pooled analysis. The single-site analysis estimates the GF from the annual maxima of the subject site alone; the pooled analysis uses data from a set of catchments hydrologically similar to the subject site. In this work, estimates of floods up to the 100-year return period obtained from the FEH approach are compared to those obtained using Grid-to-Grid, a continuous physically-based hydrological model. The model converts rainfall and potential evapotranspiration into river flows by modelling surface/sub-surface runoff, lateral water movements, and snow-pack. It is configured on a 1 km2 grid resolution and it uses spatial datasets of topography, soil, and land cover. It was set up in Great Britain and has been evaluated for the period 1960-2014 in forward-mode (i.e. without parameter calibration) using daily meteorological forcing data. The modelled floods with a given return period (5, 10, 30, 50, and 100 years) were computed from the modelled discharge annual maxima and compared to the FEH estimates for 100 catchments in Great Britain. Preliminary results suggest that there is a good agreement between modelled and
A non-stationary cost-benefit based bivariate extreme flood estimation approach
Qi, Wei; Liu, Junguo
2018-02-01
Cost-benefit analysis and flood frequency analysis have been integrated into a comprehensive framework to estimate cost-effective design values. However, previous cost-benefit based extreme flood estimation relies on stationary assumptions and analyzes dependent flood variables separately. A Non-Stationary Cost-Benefit based bivariate design flood estimation (NSCOBE) approach is developed in this study to investigate the influence of non-stationarities in both the dependence of flood variables and the marginal distributions on extreme flood estimation. The dependence is modeled utilizing copula functions. Previous design flood selection criteria are not suitable for NSCOBE since they ignore the time-changing dependence of flood variables. Therefore, a risk calculation approach is proposed based on non-stationarities in both marginal probability distributions and copula functions. A case study with 54 years of observed data is utilized to illustrate the application of NSCOBE. Results show that NSCOBE can effectively integrate non-stationarities in both copula functions and marginal distributions into cost-benefit based design flood estimation. It is also found that there is a trade-off between the maximum probability of exceedance calculated from copula functions and from marginal distributions. This study for the first time provides a new approach towards a better understanding of the influence of non-stationarities in both copula functions and marginal distributions on extreme flood estimation, and could be beneficial to cost-benefit based non-stationary bivariate design flood estimation across the world.
Paleoflood Data, Extreme Floods and Frequency: Data and Models for Dam Safety Risk Scenarios
England, J. F.; Godaire, J.; Klinger, R.
2007-12-01
Extreme floods and probability estimates are crucial components in dam safety risk analysis and scenarios for water-resources decision making. The field-based collection of paleoflood data provides needed information on the magnitude and probability of extreme floods at locations of interest in a watershed or region. The stratigraphic record present along streams in the form of terrace and floodplain deposits represents a direct indicator of the magnitude of large floods on a river, and may provide 10 to 100 times longer records than conventional stream gaging records of large floods. Paleoflood data are combined with gage and historical streamflow estimates to gain insights into flood frequency scaling, model extrapolations and uncertainty, and provide input scenarios to risk analysis event trees. We illustrate current data collection and flood frequency modeling approaches via case studies in the western United States, including the American River in California and the Arkansas River in Colorado. These studies demonstrate the integration of applied field geology, hydraulics, and surface-water hydrology. Results from these studies illustrate the gains in information content on extreme floods, provide data-based means to separate flood generation processes, guide flood frequency model extrapolations, and reduce uncertainties. These data and scenarios strongly influence water resources management decisions.
Formetta, Giuseppe; Bell, Victoria; Stewart, Elizabeth
2018-02-01
Regional flood frequency analysis is one of the most commonly applied methods for estimating extreme flood events at ungauged sites or locations with short measurement records. It is based on: (i) the definition of a homogeneous group (pooling-group) of catchments, and on (ii) the use of the pooling-group data to estimate flood quantiles. Although many methods to define a pooling-group (pooling schemes, PS) are based on catchment physiographic similarity measures, in the last decade methods based on flood seasonality similarity have been contemplated. In this paper, two seasonality-based PS are proposed and tested both in terms of the homogeneity of the pooling-groups they generate and in terms of the accuracy in estimating extreme flood events. The method has been applied in 420 catchments in Great Britain (considered as both gauged and ungauged) and compared against the current Flood Estimation Handbook (FEH) PS. Results for gauged sites show that, compared to the current PS, the seasonality-based PS performs better both in terms of homogeneity of the pooling-group and in terms of the accuracy of flood quantile estimates. For ungauged locations, a national-scale hydrological model has been used for the first time to quantify flood seasonality. Results show that in 75% of the tested locations the seasonality-based PS provides an improvement in the accuracy of the flood quantile estimates. The remaining 25% were located in highly urbanized, groundwater-dependent catchments. The promising results support the aspiration that large-scale hydrological models complement traditional methods for estimating design floods.
Towards a systematic approach to comparing distributions used in flood frequency analysis
Bobée, B.; Cavadias, G.; Ashkar, F.; Bernier, J.; Rasmussen, P.
1993-02-01
The estimation of flood quantiles from available streamflow records has been a topic of extensive research in this century. However, the large number of distributions and estimation methods proposed in the scientific literature has led to a state of confusion, and a gap prevails between theory and practice. This concerns both at-site and regional flood frequency estimation. To facilitate the work of "hydrologists, designers of hydraulic structures, irrigation engineers and planners of water resources", the World Meteorological Organization recently published a report which surveys and compares current methodologies, and recommends a number of statistical distributions and estimation procedures. This report is an important step towards the clarification of this difficult topic, but we think that it does not effectively satisfy the needs of practitioners as intended, because it contains some statements which are not statistically justified and which require further discussion. In the present paper we review commonly used procedures for flood frequency estimation, point out some of the reasons for the present state of confusion concerning the advantages and disadvantages of the various methods, and propose the broad lines of a possible comparison strategy. We recommend that the results of such comparisons be discussed in an international forum of experts, with the purpose of attaining a more coherent and broadly accepted strategy for estimating floods.
Flooding PSA with Plant Specific Operating Experiences of Korean PWRs
International Nuclear Information System (INIS)
Choi, Sun Yeong; Yang, Joon Yull
2006-01-01
The purpose of this paper is to update the flooding PSA with Korean plant-specific operating experience data and an appropriate estimation method for the flooding frequency to improve the PSA quality. The existing flooding PSA used the NPE (Nuclear Power Experience) database up to 1985 for the flooding frequency, which consists entirely of USA plant operating experiences, so an updated flooding frequency based on Korean plant-specific operating experience is required. We also propose a method of using only PWR (Pressurized Water Reactor) data for the flooding frequency estimation in the case of flooding areas in the primary building, even though the existing flooding PSA used both PWR and BWR (Boiling Water Reactor) data for all plant areas. We evaluate the CDF (Core Damage Frequency) with the modified flooding frequency and compare the results with those of the existing flooding PSA method.
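The two frequency estimators relevant here, the maximum-likelihood estimate n/T and the Bayesian update with the Jeffreys noninformative prior (which, for a Poisson event rate, yields a Gamma(n + 0.5, T) posterior with mean (n + 0.5)/T), can be compared numerically. The event counts and exposure below are invented for illustration, not plant data.

```python
# Assumed operating-experience numbers (illustrative only).
n_events = 2      # observed flooding events
T_ry = 150.0      # cumulative reactor-years of experience

# Maximum likelihood estimate of the flooding frequency (per reactor-year).
freq_mle = n_events / T_ry

# Jeffreys-prior Bayesian update: posterior mean of Gamma(n + 0.5, T).
freq_jeffreys = (n_events + 0.5) / T_ry

# Unlike the MLE, the Jeffreys estimate stays positive with zero observed
# events, which matters for rare initiating events.
freq_zero_events = (0 + 0.5) / T_ry
```

This is why the Jeffreys update is preferred when pooling sparse plant-specific data: a plant with no recorded flooding events still receives a nonzero, exposure-weighted frequency.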
Energy Technology Data Exchange (ETDEWEB)
Wang, Wei; Li, Hong-Yi; Leung, Lai-Yung; Yigzaw, Wondmagegn Y.; Zhao, Jianshi; Lu, Hui; Deng, Zhiqun; Demissie, Yonas; Blöschl, Günter
2017-10-01
Anthropogenic activities, e.g., reservoir operation, may alter the characteristics of Flood Frequency Curve (FFC) and challenge the basic assumption of stationarity used in flood frequency analysis. This paper presents a combined data-modeling analysis of the nonlinear filtering effects of reservoirs on the FFCs over the contiguous United States. A dimensionless Reservoir Impact Index (RII), defined as the total upstream reservoir storage capacity normalized by the annual streamflow volume, is used to quantify reservoir regulation effects. Analyses are performed for 388 river stations with an average record length of 50 years. The first two moments of the FFC, mean annual maximum flood (MAF) and coefficient of variations (CV), are calculated for the pre- and post-dam periods and compared to elucidate the reservoir regulation effects as a function of RII. It is found that MAF generally decreases with increasing RII but stabilizes when RII exceeds a threshold value, and CV increases with RII until a threshold value beyond which CV decreases with RII. The processes underlying the nonlinear threshold behavior of MAF and CV are investigated using three reservoir models with different levels of complexity. All models capture the non-linear relationships of MAF and CV with RII, suggesting that the basic flood control function of reservoirs is key to the non-linear relationships. The relative roles of reservoir storage capacity, operation objectives, available storage prior to a flood event, and reservoir inflow pattern are systematically investigated. Our findings may help improve flood-risk assessment and mitigation in regulated river systems at the regional scale.
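The diagnostic quantities defined above can be sketched with invented numbers: the Reservoir Impact Index is upstream storage capacity divided by annual streamflow volume, and the first two FFC moments (MAF and CV) are computed separately for pre- and post-dam annual maxima.

```python
import numpy as np

# Assumed values (illustrative only).
storage_capacity = 2.0e9       # total upstream reservoir storage, m^3
annual_flow_volume = 8.0e9     # mean annual streamflow volume, m^3/yr
rii = storage_capacity / annual_flow_volume   # dimensionless RII

# Synthetic annual maximum floods before and after dam construction.
pre_dam = np.array([4200.0, 3900.0, 5100.0, 4600.0, 3800.0])
post_dam = np.array([2900.0, 3100.0, 2700.0, 3300.0, 2800.0])

def maf_cv(peaks):
    """Mean annual maximum flood and coefficient of variation."""
    maf = peaks.mean()
    return maf, peaks.std(ddof=1) / maf

(maf_pre, cv_pre), (maf_post, cv_post) = maf_cv(pre_dam), maf_cv(post_dam)
```

Plotting the pre/post change in MAF and CV against RII across many stations is what reveals the nonlinear threshold behavior the abstract describes.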
Feasibility of estimating generalized extreme-value distribution of floods
International Nuclear Information System (INIS)
Ferreira de Queiroz, Manoel Moises
2004-01-01
Flood frequency analysis by the generalized extreme-value probability distribution (GEV) has found increased application in recent years, given its flexibility in dealing with the three asymptotic forms of extreme distribution derived from different initial probability distributions. Estimation of higher quantiles of floods is usually accomplished by extrapolating one of the three inverse forms of the GEV distribution fitted to the experimental data for return periods much higher than those actually observed. This paper studies the feasibility of fitting the GEV distribution by moments of linear combinations of higher order statistics (LH moments) using synthetic annual flood series with varying characteristics and lengths. As hydrologic events in nature such as daily discharge occur with finite values, their annual maxima are expected to follow the asymptotic form of the limited GEV distribution. Synthetic annual flood series were thus obtained from stochastic sequences of 365 daily discharges generated by Monte Carlo simulation on the basis of the limited probability distribution underlying the limited GEV distribution. The results show that parameter estimation by LH moments of this distribution, fitted to annual flood samples shorter than 100 years derived from an initially limited distribution, may indicate any form of extreme-value distribution, not just the limited form as expected, with large uncertainty in the fitted parameters. A frequency analysis, on the basis of the GEV distribution and LH moments, of annual flood series of lengths varying between 13 and 73 years observed at 88 gauge stations on the Parana River in Brazil indicated all three forms of the GEV distribution.(Author)
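Fitting a GEV distribution to synthetic annual maxima and extrapolating an upper quantile, as discussed above, can be sketched with scipy. Note two assumptions: scipy fits by maximum likelihood rather than the LH moments used in the paper, and the sample below is synthetic rather than from the study; in scipy's `genextreme` parameterization, shape `c > 0` corresponds to the upper-bounded (type III) case:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Synthetic annual maxima from an upper-bounded (type III) GEV parent
sample = stats.genextreme.rvs(c=0.2, loc=100.0, scale=25.0, size=80, random_state=rng)

# Fit GEV by maximum likelihood (scipy has no built-in L/LH-moment fitting)
c_hat, loc_hat, scale_hat = stats.genextreme.fit(sample)

# Extrapolated 100-year flood quantile from the fitted distribution
q100 = stats.genextreme.ppf(1 - 1 / 100, c_hat, loc=loc_hat, scale=scale_hat)
```

Repeating this over many synthetic samples of varying length would reproduce the paper's experiment on how often a bounded parent is misidentified as another GEV form.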
Techniques for estimating flood-depth frequency relations for streams in West Virginia
Wiley, J.B.
1987-01-01
Multiple regression analyses are applied to data from 119 U.S. Geological Survey streamflow stations to develop equations that estimate baseline depth (depth of 50% flow duration) and 100-yr flood depth on unregulated streams in West Virginia. Drainage basin characteristics determined from the 100-yr flood depth analysis were used to develop 2-, 10-, 25-, 50-, and 500-yr regional flood depth equations. Two regions with distinct baseline depth equations and three regions with distinct flood depth equations are delineated. Drainage area is the most significant independent variable found in the central and northern areas of the state, where mean basin elevation also is significant. The equations are applicable to any unregulated site in West Virginia where values of independent variables are within the range evaluated for the region. Examples of inapplicable sites include those in reaches below dams, within and directly upstream from bridge or culvert constrictions, within encroached reaches, in karst areas, and where streams flow through lakes or swamps. (Author's abstract)
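Regional depth equations of the kind described, with drainage area and mean basin elevation as explanatory variables, are typically fit as log-linear regressions. A sketch with invented station data (not the West Virginia values), purely to show the mechanics:

```python
import numpy as np

# Hypothetical station data: drainage area (mi^2), mean basin elevation (ft),
# and observed 100-yr flood depth (ft)
area = np.array([12.0, 45.0, 130.0, 300.0, 720.0, 1500.0])
elev = np.array([900.0, 1200.0, 1500.0, 2100.0, 2600.0, 3000.0])
depth100 = np.array([6.1, 9.0, 12.5, 16.8, 22.0, 27.5])

# Log-linear model: log(depth) = b0 + b1*log(area) + b2*log(elev)
X = np.column_stack([np.ones_like(area), np.log(area), np.log(elev)])
coef, *_ = np.linalg.lstsq(X, np.log(depth100), rcond=None)

def predict_depth(a, e):
    # Back-transform the regression to a power-law depth equation
    return float(np.exp(coef[0] + coef[1] * np.log(a) + coef[2] * np.log(e)))
```

The back-transformed model is the familiar power-law form depth = 10^b0 * A^b1 * E^b2 used in USGS regional equations; a real application would also carry the regression standard error.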
2014-03-01
The central purpose of this report is to present methods for estimating the magnitude and frequency of floods on urban and small, rural streams in the Southeast United States with particular focus on Georgia, South Carolina, and North Carolin...
Estimation of initiating event frequency for external flood events by extreme value theorem
International Nuclear Information System (INIS)
Chowdhury, Sourajyoti; Ganguly, Rimpi; Hari, Vibha
2017-01-01
External flood is an important common cause initiating event in nuclear power plants (NPPs). It may potentially lead to severe core damage (SCD) by first causing the failure of the systems required for maintaining the heat sinks and then by contributing to failures of the engineered systems designed to mitigate such failures. The sample NPP taken here is a twin 220 MWe Indian standard pressurized heavy water reactor (PHWR) situated inland. A comprehensive in-house Level-1 internal event PSA for full power had already been performed. External flood assessment was further conducted in the area of external hazard risk assessment in response to post-Fukushima measures taken in nuclear industries. The present paper describes the methodology to calculate the initiating event (IE) frequency for external flood events for the sample inland Indian NPP. Generalized extreme value (GEV) theory based on the maximum likelihood method (MLM) and the order statistics approach (OSA) is used to analyse the rainfall data for the site. The thousand-year return level and necessary return periods for extreme rainfall are evaluated. These results, along with plant-specific topographical calculations, quantitatively establish that external flooding resulting from upstream dam break, river flooding and heavy rainfall (flash flood) would be unlikely for the sample NPP in consideration.
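The thousand-year return level follows directly from fitted GEV parameters. A minimal sketch, with hypothetical parameter values rather than those of the study site:

```python
import math

def gev_return_level(mu, sigma, xi, T):
    """T-year return level of a GEV(mu, sigma, xi) annual-maximum distribution,
    using F(x) = exp(-(1 + xi*(x - mu)/sigma)**(-1/xi))."""
    p = 1.0 - 1.0 / T      # annual non-exceedance probability
    y = -math.log(p)       # reduced variate
    if abs(xi) < 1e-9:     # Gumbel (xi -> 0) limit
        return mu - sigma * math.log(y)
    return mu + (sigma / xi) * (y ** (-xi) - 1.0)

# Hypothetical MLM-fitted parameters for annual maximum daily rainfall (mm)
mu, sigma, xi = 180.0, 45.0, 0.12
r1000 = gev_return_level(mu, sigma, xi, 1000.0)  # 1000-year rainfall return level
```

The IE frequency argument then compares such return levels (and the corresponding return periods of flood-causing rainfall) against plant grade elevation and drainage capacity.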
Large-scale derived flood frequency analysis based on continuous simulation
Dung Nguyen, Viet; Hundecha, Yeshewatesfa; Guse, Björn; Vorogushyn, Sergiy; Merz, Bruno
2016-04-01
There is an increasing need for spatially consistent flood risk assessments at the regional scale (several 100,000 km2), in particular in the insurance industry and for national risk reduction strategies. However, most large-scale flood risk assessments are composed of smaller-scale assessments and show spatial inconsistencies. To overcome this deficit, a large-scale flood model composed of a weather generator and catchment models was developed, reflecting the spatially inherent heterogeneity. The weather generator is a multisite and multivariate stochastic model capable of generating synthetic meteorological fields (precipitation, temperature, etc.) at daily resolution for the regional scale. These fields respect the observed autocorrelation, spatial correlation and covariance between the variables. They are used as input into the catchment models. A long-term simulation of this combined system makes it possible to derive very long discharge series at many catchment locations, serving as a basis for spatially consistent flood risk estimates at the regional scale. This combined model was set up and validated for major river catchments in Germany. The weather generator was trained on 53 years of observation data at 528 stations covering not only all of Germany but also parts of France, Switzerland, the Czech Republic and Austria, with an aggregated spatial scale of 443,931 km2. 10,000 years of daily meteorological fields for the study area were generated. Likewise, rainfall-runoff simulations with SWIM were performed for the entire Elbe, Rhine, Weser, Donau and Ems catchments. The validation results illustrate a good performance of the combined system, as the simulated flood magnitudes and frequencies agree well with the observed flood data. Based on continuous simulation, this model chain is then used to estimate flood quantiles for the whole of Germany, including upstream headwater catchments in neighbouring countries. This continuous large scale approach overcomes the several
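The derived flood frequency chain described above (a stochastic weather generator feeding a catchment model, with flood quantiles read from the long simulated series) can be caricatured in a few lines. The occurrence/gamma rainfall model and the linear-reservoir catchment below are deliberately toy stand-ins for the multisite generator and SWIM:

```python
import numpy as np

rng = np.random.default_rng(0)
n_years, days = 1000, 365

# Toy weather generator: Bernoulli wet/dry occurrence + gamma daily rainfall (mm)
wet = rng.random((n_years, days)) < 0.3
rain = np.where(wet, rng.gamma(shape=0.8, scale=12.0, size=(n_years, days)), 0.0)

# Toy catchment model: single linear reservoir, storage S, runoff Q = k*S
k, S = 0.1, 0.0
annual_max = np.empty(n_years)
for y in range(n_years):
    qmax = 0.0
    for d in range(days):
        S += rain[y, d]      # daily rainfall input
        q = k * S            # linear-reservoir outflow
        S -= q
        qmax = max(qmax, q)
    annual_max[y] = qmax     # annual maximum daily runoff

# Derived flood quantile: empirical 100-year event from the simulated series
q100 = np.quantile(annual_max, 1 - 1 / 100)
```

The point of continuous simulation is exactly this last line: with 1,000 (or 10,000) simulated years, the 100-year quantile is read off the series empirically instead of being extrapolated from a short gauged record.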
Probabilistic Design Storm Method for Improved Flood Estimation in Ungauged Catchments
Berk, Mario; Špačková, Olga; Straub, Daniel
2017-12-01
The design storm approach with event-based rainfall-runoff models is a standard method for design flood estimation in ungauged catchments. The approach is conceptually simple and computationally inexpensive, but the underlying assumptions can lead to flawed design flood estimations. In particular, the implied average recurrence interval (ARI) neutrality between rainfall and runoff neglects uncertainty in other important parameters, leading to an underestimation of design floods. The selection of a single representative critical rainfall duration in the analysis leads to an additional underestimation of design floods. One way to overcome these nonconservative approximations is the use of a continuous rainfall-runoff model, which is associated with significant computational cost and requires rainfall input data that are often not readily available. As an alternative, we propose a novel Probabilistic Design Storm method that combines event-based flood modeling with basic probabilistic models and concepts from reliability analysis, in particular the First-Order Reliability Method (FORM). The proposed methodology overcomes the limitations of the standard design storm approach, while utilizing the same input information and models without excessive computational effort. Additionally, the Probabilistic Design Storm method allows deriving so-called design charts, which summarize representative design storm events (combinations of rainfall intensity and other relevant parameters) for floods with different return periods. These can be used to study the relationship between rainfall and runoff return periods. We demonstrate, investigate, and validate the method by means of an example catchment located in the Bavarian Pre-Alps, in combination with a simple hydrological model commonly used in practice.
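FORM, the reliability method underlying the Probabilistic Design Storm approach, searches for the design point (the most probable failure point) on the limit-state surface in standard normal space. A minimal HLRF iteration on a toy limit state, not the paper's hydrological model:

```python
import numpy as np
from scipy.stats import norm

def hlrf(g, grad_g, u0, tol=1e-8, itmax=200):
    """HLRF iteration: find the design point u* (closest point to the origin
    on the limit-state surface g(u) = 0 in standard normal space)."""
    u = np.asarray(u0, dtype=float)
    for _ in range(itmax):
        gval, grad = g(u), grad_g(u)
        u_new = (grad @ u - gval) / (grad @ grad) * grad
        if np.linalg.norm(u_new - u) < tol:
            return u_new, np.linalg.norm(u_new)
        u = u_new
    return u, np.linalg.norm(u)

# Toy limit state (hypothetical, for illustration): failure when u1 + 0.5*u2^2 >= 3
g = lambda u: 3.0 - u[0] - 0.5 * u[1] ** 2
grad_g = lambda u: np.array([-1.0, -u[1]])

u_star, beta = hlrf(g, grad_g, u0=np.array([0.1, 0.1]))
pf = float(norm.cdf(-beta))  # first-order failure probability estimate
```

In the design storm setting, the components of u would represent transformed rainfall intensity, duration and other uncertain parameters, and the design point itself yields the "representative design storm" for a given return period.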
Historical floods in flood frequency analysis: Is this game worth the candle?
Strupczewski, Witold G.; Kochanek, Krzysztof; Bogdanowicz, Ewa
2017-11-01
In flood frequency analysis (FFA) the profit from inclusion of historical information on the largest pre-instrumental floods depends primarily on the reliability of the information, i.e. the accuracy of the magnitude and return period of the floods. This study is focused on the possible theoretical maximum gain in accuracy of estimates of upper quantiles that can be obtained by incorporating the largest historical floods of known return periods into the FFA. We assumed a simple case: N years of systematic records of annual maximum flows and either the one largest (XM1) or the two largest (XM1 and XM2) flood peak flows in a historical M-year-long period. The problem is explored by Monte Carlo simulations with the maximum likelihood (ML) method. Both correct and false distributional assumptions are considered. In the first case the two-parameter extreme value models (Gumbel, log-Gumbel, Weibull) with various coefficients of variation serve as parent distributions. In the case of an unknown parent distribution, the Weibull distribution was assumed as the estimating model and the truncated Gumbel as the parent distribution. The return periods of XM1 and XM2 are determined from the parent distribution. The results are then compared with the case when the return periods of XM1 and XM2 are defined by their plotting positions. The results are presented in terms of bias, root mean square error and the probability of overestimation of the quantile with a 100-year return period. The results of the research indicate that the maximal profit of inclusion of pre-instrumental floods in the FFA may prove smaller than the cost of reconstruction of the historical hydrological information.
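The ML estimation with one largest historical flood can be written down directly: the likelihood combines the systematic observations, the density of XM1, and the probability that the remaining M-1 historical annual maxima stayed below XM1. A sketch for a Gumbel parent with simulated data (all numbers hypothetical):

```python
import numpy as np
from scipy import optimize, stats

rng = np.random.default_rng(7)
# N = 40 years of systematic annual maxima from a Gumbel(500, 150) parent
systematic = stats.gumbel_r.rvs(loc=500, scale=150, size=40, random_state=rng)
XM1, M = 1600.0, 200  # largest flood in a 200-year historical period (hypothetical)

def neg_log_lik(theta):
    loc, scale = theta
    if scale <= 0:
        return np.inf
    ll = stats.gumbel_r.logpdf(systematic, loc, scale).sum()
    # Historical term: XM1 observed, remaining M-1 annual maxima below XM1
    ll += stats.gumbel_r.logpdf(XM1, loc, scale)
    ll += (M - 1) * stats.gumbel_r.logcdf(XM1, loc, scale)
    return -ll

res = optimize.minimize(neg_log_lik, x0=[500.0, 150.0], method="Nelder-Mead")
loc_hat, scale_hat = res.x
q100 = stats.gumbel_r.ppf(0.99, loc_hat, scale_hat)  # 100-year quantile
```

Comparing the sampling variance of q100 with and without the historical term is exactly the Monte Carlo experiment the study runs, repeated over parents and (mis)specified models.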
RainyDay: An Online, Open-Source Tool for Physically-based Rainfall and Flood Frequency Analysis
Wright, D.; Yu, G.; Holman, K. D.
2017-12-01
Flood frequency analysis in ungaged or changing watersheds typically requires rainfall intensity-duration-frequency (IDF) curves combined with hydrologic models. IDF curves only depict point-scale rainfall depth, while true rainstorms exhibit complex spatial and temporal structures. Floods result from these rainfall structures interacting with watershed features such as land cover, soils, and variable antecedent conditions as well as river channel processes. Thus, IDF curves are traditionally combined with a variety of "design storm" assumptions such as area reduction factors and idealized rainfall space-time distributions to translate rainfall depths into inputs that are suitable for flood hydrologic modeling. The impacts of such assumptions are relatively poorly understood. Meanwhile, modern precipitation estimates from gridded weather radar, grid-interpolated rain gages, satellites, and numerical weather models provide more realistic depictions of rainfall space-time structure. Usage of such datasets for rainfall and flood frequency analysis, however, is hindered by relatively short record lengths. We present RainyDay, an open-source stochastic storm transposition (SST) framework for generating large numbers of realistic rainfall "scenarios." SST "lengthens" the rainfall record by temporal resampling and geospatial transposition of observed storms to extract space-time information from regional gridded rainfall data. Relatively short (10-15 year) records of bias-corrected radar rainfall data are sufficient to estimate rainfall and flood events with much longer recurrence intervals, including 100-year and 500-year events. We describe the SST methodology as implemented in RainyDay and compare rainfall IDF results from RainyDay to conventional estimates from NOAA Atlas 14. Then, we demonstrate some of the flood frequency analysis properties that are possible when RainyDay is integrated with a distributed hydrologic model, including robust estimation of flood
Hamdi, Y.; Bardet, L.; Duluc, C.-M.; Rebour, V.
2014-09-01
Nuclear power plants located on the French Atlantic coast are designed to be protected against extreme environmental conditions. The French authorities remain cautious by adopting a strict policy of nuclear plant flood prevention. Although coastal nuclear facilities in France are designed for very low probabilities of failure (e.g. a 1000-year surge), exceptional surges (outliers induced by exceptional climatic events) have shown that the extreme sea levels estimated with the current statistical approaches could be underestimated. The estimation of extreme surges then requires the use of a statistical analysis approach having a more solid theoretical motivation. This paper deals with extreme surge frequency estimation using historical information (HI) about events that occurred before the systematic record period. It also contributes to addressing the problem of the presence of outliers in data sets. The frequency models presented in this paper have been quite successful in the field of hydrometeorology and river flooding but they have not been applied to sea level data sets to prevent marine flooding. In this work, we suggest two methods of incorporating the HI: the Peaks-Over-Threshold method with HI (POTH) and the Block Maxima method with HI (BMH). Two kinds of historical data can be used in the POTH method: classical Historical Maxima (HMax) data, and Over a Threshold Supplementary (OTS) data. In both cases, the data are structured in historical periods and can be used only as a complement to the main systematic data. On the other hand, in the BMH method, the basic hypothesis in statistical modeling of HI is that at least one threshold of perception exists for the whole period (historical and systematic) and that during a given historical period preceding the period of tide gauging, only information about surges above this threshold has been recorded or archived. The two frequency models were applied to a case study from France, at the La Rochelle site where
Hamdi, Y.; Bardet, L.; Duluc, C.-M.; Rebour, V.
2015-07-01
Nuclear power plants located on the French Atlantic coast are designed to be protected against extreme environmental conditions. The French authorities remain cautious by adopting a strict policy of nuclear-plant flood prevention. Although coastal nuclear facilities in France are designed for very low probabilities of failure (e.g., a 1000-year surge), exceptional surges (outliers induced by exceptional climatic events) have shown that the extreme sea levels estimated with the current statistical approaches could be underestimated. The estimation of extreme surges then requires the use of a statistical analysis approach having a more solid theoretical motivation. This paper deals with extreme-surge frequency estimation using historical information (HI) about events that occurred before the systematic record period. It also contributes to addressing the problem of the presence of outliers in data sets. The frequency models presented in this paper have been quite successful in the field of hydrometeorology and river flooding but they have not been applied to sea level data sets to prevent marine flooding. In this work, we suggest two methods of incorporating the HI: the peaks-over-threshold method with HI (POTH) and the block maxima method with HI (BMH). Two kinds of historical data can be used in the POTH method: classical historical maxima (HMax) data, and over-a-threshold supplementary (OTS) data. In both cases, the data are structured in historical periods and can be used only as a complement to the main systematic data. On the other hand, in the BMH method, the basic hypothesis in statistical modeling of HI is that at least one threshold of perception exists for the whole period (historical and systematic) and that during a given historical period preceding the period of tide gauging, only information about surges above this threshold has been recorded or archived. The two frequency models were applied to a case study from France, at the La Rochelle site where
Frequency and seasonality of flash floods in Slovenia
Directory of Open Access Journals (Sweden)
Trobec Tajan
2017-01-01
Full Text Available The purpose of this paper is to assess and analyse the dynamics of flash flooding events in Slovenia. The paper examines in particular the frequency of flash floods and their seasonal distribution. The methodology is based on the analysis of historical records and modern flood data. The results of a long-term frequency analysis of 138 flash floods that occurred between 1550 and 2015 are presented. Because of the lack of adequate historical flood data prior to 1950, the main analysis is based on data for the period between 1951 and 2015, while the analysis of data for the period between 1550 and 1950 is added as a supplement to the main analysis. Analysis of data for the period after 1950 shows that on average 1.3 flash floods occur each year in Slovenia. The linear trend for the number of flash floods is increasing but is not statistically significant. Despite the fact that the majority of Slovenian rivers have one of their peaks in spring and one of their lows in summer, 90% of flash floods actually occur during meteorological summer or autumn - i.e. between June and November, which shows that discharge regimes and flood regimes are not necessarily related. Because of the lack of flood records from the more distant past as well as the large variability of flash flood events in the last several decades, we cannot provide a definitive answer to the question about possible changes in their frequency and seasonality by relying solely on the detected trends. Nevertheless, considering the results of the analysis and future climate change scenarios, the frequency of flash floods in Slovenia could increase while the period of flash flood occurrence could be extended.
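The frequency and trend statistics reported above amount to a mean annual event rate and a monotonic-trend test on annual counts. A sketch with synthetic (not Slovenian) data, using Kendall's tau as a simple nonparametric trend test:

```python
import numpy as np
from scipy import stats

# Hypothetical annual counts of flash floods, 1951-2015 (65 years)
rng = np.random.default_rng(3)
counts = rng.poisson(1.3, size=65)
years = np.arange(1951, 2016)

mean_rate = counts.mean()                      # events per year
tau, p_value = stats.kendalltau(years, counts) # monotonic-trend test
trend_significant = p_value < 0.05
```

With the real record, the same two numbers (rate and trend significance) plus a month-of-occurrence histogram reproduce the paper's core quantitative results.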
Component external leakage and rupture frequency estimates
International Nuclear Information System (INIS)
Eide, S.A.; Khericha, S.T.; Calley, M.B.; Johnson, D.A.; Marteeny, M.L.
1991-11-01
In order to perform detailed internal flooding risk analyses of nuclear power plants, external leakage and rupture frequencies are needed for various types of components - piping, valves, pumps, flanges, and others. However, there appears to be no up-to-date, comprehensive source for such frequency estimates. This report attempts to fill that void. Based on a comprehensive search of Licensee Event Reports (LERs) contained in Nuclear Power Experience (NPE), and estimates of component populations and exposure times, component external leakage and rupture frequencies were generated. The remainder of this report covers the specifics of the NPE search for external leakage and rupture events, analysis of the data, a comparison with frequency estimates from other sources, and a discussion of the results.
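Once event counts and exposure times are tallied, a per-component frequency can be estimated in the Bayesian framework mentioned in the screening context above (Jeffreys noninformative prior for a Poisson rate, which yields a Gamma posterior). The counts below are hypothetical:

```python
from scipy import stats

# Hypothetical: n rupture events observed in T component-years of exposure
n_events, T = 3, 1.2e4

# Jeffreys noninformative prior for a Poisson rate -> posterior Gamma(n + 1/2, T)
posterior = stats.gamma(a=n_events + 0.5, scale=1.0 / T)

mean_freq = posterior.mean()                # posterior mean = (n + 0.5) / T per year
lower, upper = posterior.ppf([0.05, 0.95])  # 90% credible interval
```

The same posterior update applies whether the exposure is reactor-years, component-years, or (as in the Fleming and Lydell approach cited in the head abstract) pipe-length-years.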
A Fresh Start for Flood Estimation in Ungauged Basins
Woods, R. A.
2017-12-01
The two standard methods for flood estimation in ungauged basins, regression-based statistical models and rainfall-runoff models using a design rainfall event, have survived relatively unchanged as the methods of choice for more than 40 years. Their technical implementation has developed greatly, but the models' representation of hydrological processes has not, despite a large volume of hydrological research. I suggest it is time to introduce more hydrology into flood estimation. The reliability of the current methods can be unsatisfactory. For example, despite the UK's relatively straightforward hydrology, regression estimates of the index flood are uncertain by +/- a factor of two (for a 95% confidence interval), an impractically large uncertainty for design. The standard error of rainfall-runoff model estimates is not usually known, but available assessments indicate poorer reliability than statistical methods. There is a practical need for improved reliability in flood estimation. Two promising candidates to supersede the existing methods are (i) continuous simulation by rainfall-runoff modelling and (ii) event-based derived distribution methods. The main challenge with continuous simulation methods in ungauged basins is to specify the model structure and parameter values when calibration data are not available. This has been an active area of research for more than a decade, and this activity is likely to continue. The major challenges for the derived distribution method in ungauged catchments include not only the correct specification of model structure and parameter values, but also antecedent conditions (e.g. seasonal soil water balance). However, a much smaller community of researchers is active in developing or applying the derived distribution approach, and as a result slower progress is being made. A change is needed: surely we have learned enough about hydrology in the last 40 years that we can make a practical hydrological advance on our methods for
Influences on flood frequency distributions in Irish river catchments
Directory of Open Access Journals (Sweden)
S. Ahilan
2012-04-01
Full Text Available This study explores influences on flood frequency distributions in Irish rivers. A Generalised Extreme Value (GEV) type I distribution is recommended in Ireland for estimating flood quantiles in a single-site flood frequency analysis. This paper presents the findings of an investigation that identified the GEV statistical distributions that best fit the annual maximum (AM) data series extracted from 172 gauging stations of 126 rivers in Ireland. Analysis of these data was undertaken to explore hydraulic and hydro-geological factors that influence flood frequency distributions. A hierarchical approach of increasing statistical power that used probability plots, moment and L-moment diagrams, the Hosking goodness-of-fit algorithm and a modified Anderson-Darling (A-D) statistical test was followed to determine whether a type I, type II or type III distribution was valid. Results of the Hosking et al. method indicated that of the 143 stations with flow records exceeding 25 yr, data for 95 (67%) were best represented by GEV type I distributions and a further 9 (6%) and 39 (27%) stations followed type II and type III distributions respectively. Type I, type II and type III distributions were determined for 83 (58%), 16 (11%) and 34 (24%) stations respectively using the modified A-D method (data from 10 stations were not represented by GEV family distributions). The influence of karst terrain on these flood frequency distributions was assessed by incorporating results on an Arc-GIS platform showing karst features and using Monte Carlo simulations to assess the significance of the number and clustering of the observed distributions. Floodplain effects were identified by using two-sample t-tests to identify statistical correlations between the distributions and catchment properties that are indicative of strong floodplain activity. The data reveal that type I distributions are spatially well represented throughout the country. While also well represented throughout
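The L-moment diagram step can be reproduced from order statistics: sample probability-weighted moments give the L-skewness, and Hosking's approximation maps it to the GEV shape parameter, whose sign separates type III (bounded), type I and type II. A sketch with a synthetic Gumbel (type I) sample rather than the Irish records:

```python
import numpy as np

def sample_l_moments(x):
    # First three sample L-moments via probability-weighted moments (Hosking)
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    j = np.arange(1, n + 1)
    b0 = x.mean()
    b1 = np.sum((j - 1) / (n - 1) * x) / n
    b2 = np.sum((j - 1) * (j - 2) / ((n - 1) * (n - 2)) * x) / n
    l1, l2, l3 = b0, 2 * b1 - b0, 6 * b2 - 6 * b1 + b0
    return l1, l2, l3 / l2  # mean, L-scale, L-skewness (tau3)

def gev_shape_from_tau3(tau3):
    # Hosking's approximation for the GEV shape kappa from L-skewness:
    # kappa > 0 -> type III (bounded), kappa ~ 0 -> type I, kappa < 0 -> type II
    c = 2.0 / (3.0 + tau3) - np.log(2) / np.log(3)
    return 7.8590 * c + 2.9554 * c ** 2

rng = np.random.default_rng(1)
amax = rng.gumbel(loc=300, scale=80, size=60)  # type I parent (tau3 ~ 0.17)
l1, l2, tau3 = sample_l_moments(amax)
kappa = gev_shape_from_tau3(tau3)
```

For a Gumbel parent, tau3 is close to 0.1699 and kappa close to zero; applying this station by station, then testing the sign of kappa, is essentially the classification exercise the paper performs before its goodness-of-fit tests.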
A Bayesian Analysis of the Flood Frequency Hydrology Concept
2016-02-01
ERDC/CHL CHETN-X-1, February 2016. Approved for public release; distribution is unlimited. A Bayesian Analysis of the Flood Frequency Hydrology ... flood frequency hydrology concept as a formal probabilistic-based means by which to coherently combine and also evaluate the worth of different types ... and development. INTRODUCTION: Merz and Blöschl (2008a,b) proposed the concept of flood frequency hydrology, which emphasizes the importance of
Costa, Veber; Fernandes, Wilson
2017-11-01
Extreme flood estimation has been a key research topic in hydrological sciences. Reliable estimates of such events are necessary as structures for flood conveyance are continuously evolving in size and complexity and, as a result, their failure-associated hazards become more and more pronounced. Due to this fact, several estimation techniques intended to improve flood frequency analysis and reduce uncertainty in extreme quantile estimation have been addressed in the literature in the last decades. In this paper, we develop a Bayesian framework for the indirect estimation of extreme flood quantiles from rainfall-runoff models. In the proposed approach, an ensemble of long daily rainfall series is simulated with a stochastic generator, which models extreme rainfall amounts with an upper-bounded distribution function, namely, the 4-parameter lognormal model. The rationale behind the generation model is that physical limits for rainfall amounts, and consequently for floods, exist and, by imposing an appropriate upper bound on the probabilistic model, more plausible estimates can be obtained for those rainfall quantiles with very low exceedance probabilities. Daily rainfall time series are converted into streamflows by routing each realization of the synthetic ensemble through a conceptual hydrologic model, the Rio Grande rainfall-runoff model. Calibration of parameters is performed through a nonlinear regression model, by means of the specification of a statistical model for the residuals that is able to accommodate autocorrelation, heteroscedasticity and nonnormality. By combining the outlined steps in a Bayesian structure of analysis, one is able to properly summarize the resulting uncertainty and estimate more accurate credible intervals for a set of flood quantiles of interest. The method for extreme flood indirect estimation was applied to the American River catchment, at the Folsom dam, in the state of California, USA. Results show that most floods
DONG, Q.; Zhang, X.; Lall, U.; Sang, Y. F.; Xie, P.
2017-12-01
With global climate change and intensifying human activities, the uncertainties and danger of floods have increased significantly. However, current flood frequency analysis is still based on the stationarity assumption. This assumption not only limits the benefits of water conservancy projects, but also brings hazard because it ignores the risk of flooding under climate change. In this paper, we relax the stationarity hypothesis in the flood frequency analysis model based on teleconnection and use the intrinsic relation of flood elements to improve the annual flood frequency results by Bayesian inference approaches. Daily discharges of the Three Gorges Dam (TGD) in 1953-2013 are used as an example. Firstly, according to the linear correlation between the climate indices and the distribution parameters, the prior distributions of peak and volume are established with the selected large-scale climate predictors. After that, by using the copula function and predictands, the conditional probability function of peak and volume is obtained. Then, Bayesian theory links the prior distributions and conditional distributions to obtain the posterior distributions. We compare the differences under different prior distributions and find the optimal flood frequency distribution model. Finally, we discuss the impact of dynamic flood frequency analysis on the planning and management of hydraulic engineering. The results show that, compared with the prior probability, the posterior probability considering the correlation of the flood elements is more accurate and its uncertainty is smaller. The dynamic flood frequency model has a great impact on the management of existing hydraulic engineering, as it can improve the engineering operation benefit and reduce flood risk, but it has little influence on the planning of hydraulic engineering. The study is helpful to the dynamic flood risk management of TGD, and provide reference for the
Consistency of extreme flood estimation approaches
Felder, Guido; Paquet, Emmanuel; Penot, David; Zischg, Andreas; Weingartner, Rolf
2017-04-01
Estimations of low-probability flood events are frequently used for the planning of infrastructure as well as for determining the dimensions of flood protection measures. There are several well-established methodical procedures to estimate low-probability floods. However, a global assessment of the consistency of these methods is difficult to achieve, the "true value" of an extreme flood not being observable. Nevertheless, a detailed comparison performed on a given case study brings useful information about the statistical and hydrological processes involved in the different methods. In this study, the following three approaches for estimating low-probability floods are compared: a purely statistical approach (ordinary extreme value statistics), a statistical approach based on stochastic rainfall-runoff simulation (the SCHADEX method), and a deterministic approach (physically based PMF estimation). These methods are tested for two different Swiss catchments. The results and some intermediate variables are used for assessing the potential strengths and weaknesses of each method, as well as for evaluating the consistency of these methods.
Estimating flood discharge using witness movies in post-flood hydrological surveys
Le Coz, Jérôme; Hauet, Alexandre; Le Boursicaud, Raphaël; Pénard, Lionel; Bonnifait, Laurent; Dramais, Guillaume; Thollet, Fabien; Braud, Isabelle
2015-04-01
surveys were achieved. Identifying fixed GCPs is more difficult in rural environments than in urban areas. Image processing was performed using free software only; in particular, Fudaa-LSPIV (Le Coz et al., 2014) was used for steps (v), (vi), and (vii). The results illustrate the typical issues and advantages of flood home movies taken by witnesses for improving post-flood discharge estimation. In spite of the non-ideal conditions related to such movies, the LSPIV technique was successfully applied. Corrections for lens distortion and limited camera movements (shake) are not difficult to achieve. Locating the video viewpoint precisely is often easy whereas precise timing may not be, especially when the author cannot be contacted or when the camera clock is wrong. Based on sensitivity analysis, the determination of the water level appears to be the main source of uncertainty in the results. Nevertheless, the information content of the results remains highly valuable for post-flood studies, in particular for improving the high-flow extrapolation of hydrometric rating curves. This kind of application opens interesting avenues for participative research in flood hydrology, as well as the study of other extreme geophysical events. Typically, as part of the FloodScale ANR research project (2012-2015), specific communication actions have been focused on the determination of flood discharges within the Ardèche river catchment (France) using home movies shared by observers and volunteers. Safety instructions and a simplified field procedure were shared through local media and were made available in French and English on the project website. This way, simple flood observers or even some enthusiastic flood chasers can contribute to participative hydrological science in the same way the so-called storm chasers have significantly contributed to meteorological science since the Tornado Intercept Project (1972). Website : http
Soong, David T.; Straub, Timothy D.; Murphy, Elizabeth A.
2006-01-01
Results of hydrologic modeling, flood-frequency analysis, hydraulic modeling, and flood-hazard analysis of the Blackberry Creek watershed in Kane County, Illinois, indicate that the 100-year and 500-year flood plains range from approximately 25 acres in the tributary F watershed (a headwater subbasin at the northeastern corner of the watershed) to almost 1,800 acres along the Blackberry Creek main stem. Based on 1996 land-cover data, most of the land in the 100-year and 500-year flood plains was cropland, forested and wooded land, and grassland. A relatively small percentage of urban land was in the flood plains. The Blackberry Creek watershed has undergone rapid urbanization in recent decades. The population and urbanized lands in the watershed are projected to double from the 1990 condition by 2020. Recently, flood-induced damage has occurred more frequently in urbanized areas of the watershed. There are concerns about the effect of urbanization on flood peaks and volumes, future flood-mitigation plans, and potential effects on water quality and stream habitats. This report describes the procedures used in developing the hydrologic models, estimating the flood-peak discharge magnitudes and recurrence intervals for flood-hazard analysis, developing the hydraulic model, and the results of the analysis in graphical and tabular form. The hydrologic model, Hydrological Simulation Program-FORTRAN (HSPF), was used to simulate continuous water movements through various patterns of land uses in the watershed. Flood-frequency analysis was applied to an annual maximum series to determine flood quantiles in subbasins for flood-hazard analysis. The Hydrologic Engineering Center-River Analysis System (HEC-RAS) hydraulic model was used to determine the 100-year and 500-year flood elevations, and to determine the 100-year floodway. The hydraulic model was calibrated and verified using high water marks and observed inundation maps for the July 17-18, 1996, flood event. Digital
BN-FLEMOps pluvial - A probabilistic multi-variable loss estimation model for pluvial floods
Roezer, V.; Kreibich, H.; Schroeter, K.; Doss-Gollin, J.; Lall, U.; Merz, B.
2017-12-01
Pluvial flood events, such as in Copenhagen (Denmark) in 2011, Beijing (China) in 2012 or Houston (USA) in 2016, have caused severe losses to urban dwellings in recent years. These floods are caused by storm events with high rainfall rates well above the design levels of urban drainage systems, which lead to inundation of streets and buildings. A projected increase in frequency and intensity of heavy rainfall events in many areas and ongoing urbanization may increase pluvial flood losses in the future. For an efficient risk assessment and adaptation to pluvial floods, a quantification of the flood risk is needed. Few loss models have been developed particularly for pluvial floods. These models usually use simple water-level- or rainfall-loss functions and come with very high uncertainties. To account for these uncertainties and improve the loss estimation, we present a probabilistic multi-variable loss estimation model for pluvial floods based on empirical data. The model was developed in a two-step process using a machine learning approach and a comprehensive database comprising 783 records of direct building and content damage of private households. The data were gathered through surveys after four different pluvial flood events in Germany between 2005 and 2014. In a first step, linear and non-linear machine learning algorithms, such as tree-based and penalized regression models, were used to identify the most important loss-influencing factors among a set of 55 candidate variables. These variables comprise hydrological and hydraulic aspects, early warning, precaution, building characteristics and the socio-economic status of the household. In a second step, the most important loss-influencing variables were used to derive a probabilistic multi-variable pluvial flood loss estimation model based on Bayesian Networks. Two different networks were tested: a score-based network learned from the data and a network based on expert knowledge. Loss predictions are made
Using cost-benefit concepts in design floods improves communication of uncertainty
Ganora, Daniele; Botto, Anna; Laio, Francesco; Claps, Pierluigi
2017-04-01
Flood frequency analysis, i.e. the study of the relationships between the magnitude and the rarity of high flows in a river, is the usual procedure adopted to assess flood hazard, preliminary to the plan/design of flood protection measures. It is grounded in fitting a probability distribution to the peak discharge values recorded at gauging stations, and the final estimates over a region are thus affected by uncertainty, due both to the limited sample availability and to the possible alternatives in terms of the probabilistic model and the parameter estimation methods used. In the last decade, the scientific community dealt with this issue by developing a number of methods to quantify such uncertainty components. Usually, uncertainty is visually represented through confidence bands, which are easy to understand, but have not yet been demonstrated to be useful for design purposes: they usually disorient decision makers, as the design flood is no longer univocally defined, making the decision process undetermined. These considerations motivated the development of the uncertainty-compliant design flood estimator (UNCODE) procedure (Botto et al., 2014) that allows one to select meaningful flood design values accounting for the associated uncertainty by considering additional constraints based on cost-benefit criteria. This method suggests an explicit multiplication factor that corrects the traditional (without uncertainty) design flood estimates to incorporate the effects of uncertainty in the estimate at the same safety level. Even though the UNCODE method was developed for design purposes, it can represent a powerful and robust tool to help clarify the effects of uncertainty in statistical estimation. As the process produces increased design flood estimates, this outcome demonstrates how uncertainty leads to more expensive flood protection measures, or insufficiency of current defenses. Moreover, the UNCODE approach can be used to assess the "value" of data, as the costs
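The cost-benefit idea behind UNCODE can be made concrete with a small numerical sketch. The snippet below is a hedged illustration, not Botto et al.'s actual formulation: it fits a Gumbel distribution to a synthetic annual-maximum record, bootstraps the parameter uncertainty, and selects the design discharge that minimizes a hypothetical total cost (protection cost plus expected damage), yielding a multiplication factor relative to the traditional 100-year estimate. All cost coefficients and data are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)

def gumbel_mom(x):
    # Method-of-moments Gumbel fit: scale from the std, location from the mean
    beta = x.std(ddof=1) * np.sqrt(6) / np.pi
    return x.mean() - 0.5772 * beta, beta

def exceed_prob(q, mu, beta):
    # Annual exceedance probability under a Gumbel distribution
    return 1.0 - np.exp(-np.exp(-(q - mu) / beta))

# Synthetic 40-year annual-maximum record (m3/s); values are illustrative
peaks = rng.gumbel(loc=300.0, scale=80.0, size=40)

mu, beta = gumbel_mom(peaks)
q100 = mu - beta * np.log(-np.log(1 - 1 / 100))  # traditional 100-yr design flood

# Parametric bootstrap to represent estimation uncertainty
boot = [gumbel_mom(rng.gumbel(mu, beta, peaks.size)) for _ in range(500)]

# Cost-benefit selection: protection cost grows linearly with the design
# discharge; expected damage is a hypothetical damage value times the
# exceedance probability averaged over the bootstrap samples
cost_per_m3s, damage = 1.0, 5e4
grid = np.linspace(q100, 2.5 * q100, 200)
exp_p = np.array([np.mean([exceed_prob(q, m, b) for m, b in boot]) for q in grid])
total_cost = cost_per_m3s * grid + damage * exp_p
q_design = grid[np.argmin(total_cost)]
factor = q_design / q100  # UNCODE-style multiplication factor (>= 1 here)
```

Accounting for the bootstrap spread pushes the cost-optimal design discharge above the plain 100-year quantile, which is the qualitative behavior the abstract describes.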
Combining Empirical and Stochastic Models for Extreme Floods Estimation
Zemzami, M.; Benaabidate, L.
2013-12-01
Hydrological models can be defined as physical, mathematical or empirical. The latter class uses mathematical equations independent of the physical processes involved in the hydrological system. Linear regression and Gradex (Gradient of Extreme values) are classic examples of empirical models. However, conventional empirical models are still used as tools for hydrological analysis by probabilistic approaches. In many regions of the world, watersheds are not gauged. This is true even in developed countries, where the gauging network has continued to decline as a result of the lack of human and financial resources. Indeed, the obvious lack of data in these watersheds makes it impossible to apply some basic empirical models for daily forecasting. We therefore had to find a combination of rainfall-runoff models with which it would be possible to create our own data and use them to estimate the flow. The estimated design floods illustrate well the difficulties facing the hydrologist in constructing a standard empirical model in basins where hydrological information is rare. A climate-hydrological model based on frequency analysis was constructed to estimate the design flood in the Anseghmir catchments, Morocco. This complex model was chosen for its ability to be applied in watersheds where hydrological information is not sufficient. It was found that this method is a powerful tool for estimating the design flood of the watershed and also other hydrological elements (runoff, volumes of water...). The hydrographic characteristics and climatic parameters were used to estimate the runoff, water volumes and design flood for different return periods.
Flood quantile estimation at ungauged sites by Bayesian networks
Mediero, L.; Santillán, D.; Garrote, L.
2012-04-01
Estimating flood quantiles at a site for which no observed measurements are available is essential for water resources planning and management. Ungauged sites have no observations of the magnitude of floods, but some site and basin characteristics are known. The most common technique used is multiple regression analysis, which relates physical and climatic basin characteristics to flood quantiles. Regression equations are fitted from flood frequency data and basin characteristics at gauged sites. Regression equations are a rigid technique that assumes linear relationships between variables and cannot take measurement errors into account. In addition, the prediction intervals are estimated in a very simplistic way from the variance of the residuals of the estimated model. Bayesian networks are a probabilistic computational structure from the field of Artificial Intelligence that has been widely and successfully applied in many scientific fields, such as medicine and informatics, but whose application to hydrology is recent. Bayesian networks infer the joint probability distribution of several related variables from observations through nodes, which represent random variables, and links, which represent causal dependencies between them. A Bayesian network is more flexible than regression equations, as it captures non-linear relationships between variables. In addition, the probabilistic nature of Bayesian networks allows taking the different sources of estimation uncertainty into account, as they give a probability distribution as a result. A homogeneous region in the Tagus Basin was selected as case study. A regression equation was fitted taking the basin area, the annual maximum 24-hour rainfall for a given recurrence interval and the mean height as explanatory variables. Flood quantiles at ungauged sites were estimated by Bayesian networks. Bayesian networks need to be learned from a sufficiently large data set. As observational data are reduced, a
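The regression baseline that the abstract criticizes can be made concrete. Below is a minimal sketch with entirely synthetic data, not the Tagus Basin study itself: a log-linear multiple regression of 100-year quantiles on basin descriptors, with the "simplistic" prediction interval built from the residual variance that the abstract points out.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical gauged sites: basin area (km2), design rainfall (mm), mean elevation (m)
n = 30
area = rng.uniform(50, 2000, n)
rain = rng.uniform(40, 120, n)
elev = rng.uniform(200, 1500, n)
# Synthetic 100-yr quantiles with a known log-linear structure plus noise
q100 = 0.05 * area**0.8 * rain**1.2 * np.exp(rng.normal(0, 0.2, n))

# Log-linear multiple regression: ln(Q100) = b0 + b1 ln(A) + b2 ln(P) + b3 ln(H)
X = np.column_stack([np.ones(n), np.log(area), np.log(rain), np.log(elev)])
y = np.log(q100)
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
sigma = np.sqrt(resid @ resid / (n - X.shape[1]))  # residual std error

# Prediction at an ungauged site, with the simplistic +/- 2 sigma interval
x_new = np.array([1.0, np.log(500.0), np.log(80.0), np.log(700.0)])
q_hat = np.exp(x_new @ beta)
lo, hi = np.exp(x_new @ beta - 2 * sigma), np.exp(x_new @ beta + 2 * sigma)
```

A Bayesian network would instead return a full predictive distribution at the ungauged site rather than this single symmetric-in-log interval.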
Flood frequency analysis for nonstationary annual peak records in an urban drainage basin
Villarini, Gabriele; Smith, James A.; Serinaldi, Francesco; Bales, Jerad; Bates, Paul D.; Krajewski, Witold F.
2009-08-01
Flood frequency analysis in urban watersheds is complicated by nonstationarities of annual peak records associated with land use change and evolving urban stormwater infrastructure. In this study, a framework for flood frequency analysis is developed based on the Generalized Additive Models for Location, Scale and Shape parameters (GAMLSS), a tool for modeling time series under nonstationary conditions. GAMLSS is applied to annual maximum peak discharge records for Little Sugar Creek, a highly urbanized watershed which drains the urban core of Charlotte, North Carolina. It is shown that GAMLSS is able to describe the variability in the mean and variance of the annual maximum peak discharge by modeling the parameters of the selected parametric distribution as a smooth function of time via cubic splines. Flood frequency analyses for Little Sugar Creek (at a drainage area of 110 km2) show that the maximum flow with a 0.01-annual probability (corresponding to 100-year flood peak under stationary conditions) over the 83-year record has ranged from a minimum unit discharge of 2.1 m3 s-1 km-2 to a maximum of 5.1 m3 s-1 km-2. An alternative characterization can be made by examining the estimated return interval of the peak discharge that would have an annual exceedance probability of 0.01 under the assumption of stationarity (3.2 m3 s-1 km-2). Under nonstationary conditions, alternative definitions of return period should be adapted. Under the GAMLSS model, the return interval of an annual peak discharge of 3.2 m3 s-1 km-2 ranges from a maximum value of more than 5000 years in 1957 to a minimum value of almost 8 years for the present time (2007). The GAMLSS framework is also used to examine the links between population trends and flood frequency, as well as trends in annual maximum rainfall. These analyses are used to examine evolving flood frequency over future decades.
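A lightweight stand-in for this kind of nonstationary analysis, much simpler than GAMLSS with cubic splines, is a two-stage fit of a Gumbel model whose location drifts linearly in time. The sketch below uses synthetic data mimicking an urbanizing basin; it is an illustration of the concept, not the Little Sugar Creek computation.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic 83-year annual-maximum series with an upward trend in the
# Gumbel location parameter (an urbanizing basin)
years = np.arange(1925, 2008)
t = years - years[0]
true_mu = 100.0 + 1.5 * t          # location drifts upward (m3/s per yr)
x = rng.gumbel(true_mu, 40.0)

# Two-stage estimation (a crude stand-in for the GAMLSS machinery):
# 1) linear trend in the mean by least squares; for a Gumbel with constant
#    scale, the trend in the mean equals the trend in the location
b, a = np.polyfit(t, x, 1)
resid = x - (a + b * t)
# 2) scale from the detrended residuals (Gumbel: std = pi*beta/sqrt(6))
beta = resid.std(ddof=2) * np.sqrt(6) / np.pi
mu_t = a + b * t - 0.5772 * beta   # mean = mu + 0.5772*beta

# Time-varying flow with annual exceedance probability 0.01
q01 = mu_t - beta * np.log(-np.log(0.99))
```

Under this model the "100-year flood" is no longer a single number: the 0.01-probability flow at the end of the record is substantially larger than at its start, which is the qualitative message of the abstract.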
Flood extent and water level estimation from SAR using data-model integration
Ajadi, O. A.; Meyer, F. J.
2017-12-01
Synthetic Aperture Radar (SAR) images have long been recognized as a valuable data source for flood mapping. Compared to other sources, SAR's weather and illumination independence and large area coverage at high spatial resolution support reliable, frequent, and detailed observations of developing flood events. Accordingly, SAR has the potential to greatly aid in the near real-time monitoring of natural hazards, such as flood detection, if combined with automated image processing. This research works towards increasing the reliability and temporal sampling of SAR-derived flood hazard information by integrating information from multiple SAR sensors and SAR modalities (images and Interferometric SAR (InSAR) coherence) and by combining SAR-derived change detection information with hydrologic and hydraulic flood forecast models. First, the combination of multi-temporal SAR intensity images and coherence information for generating flood extent maps is introduced. The application of least-squares estimation integrates flood information from multiple SAR sensors, thus increasing the temporal sampling. SAR-based flood extent information is combined with a Digital Elevation Model (DEM) to reduce false alarms and to estimate water depth and flood volume. The SAR-based flood extent map is assimilated into the Hydrologic Engineering Center River Analysis System (HEC-RAS) model to aid in hydraulic model calibration. The developed technology improves the accuracy of flood information by exploiting information from both data and models. It also provides enhanced flood information to decision-makers, supporting flood response and improving emergency relief efforts.
Improving Flood Management Planning in Thailand | IDRC ...
International Development Research Centre (IDRC) Digital Library (Canada)
According to World Bank estimates, this disaster caused US$46.5 billion in ... This project seeks to improve the Flood Management Master Plan, proposing ... New Dutch-Canadian funding for the Climate and Development Knowledge Network.
El Alaoui El Fels, Abdelhafid; Alaa, Noureddine; Bachnou, Ali; Rachidi, Said
2018-05-01
Statistical models and flood risk modeling approaches have improved remarkably. Their application in arid and semi-arid regions, particularly in developing countries, can be extremely useful for better assessment and planning of flood risk in order to reduce the catastrophic impacts of this phenomenon. This study focuses on the Setti Fadma region (Ourika basin, Morocco), which is potentially threatened by floods and is subject to climatic and anthropogenic forcing. The study is based on two main axes: (i) extreme flow frequency analysis, using 12 probability laws adjusted by the Maximum Likelihood method, and (ii) the generation of flood risk indicator maps, based on the solution proposed by the Nays2DFlood solver of the hydrodynamic model of the two-dimensional Saint-Venant equations. The study uses a high-resolution spatial digital model (Lidar) in order to bring the hydrological simulation as close to reality as possible. The results showed that the GEV is the most appropriate law for estimating extreme flows at different return periods. Taking into consideration the mapping of the 100-year flood area, the study revealed that the fluvial overflow extends toward the banks of the Ourika and consequently affects some living areas, cultivated fields and the roads that connect the valley to the city of Marrakech. The aim of this study is to propose new techniques for flood risk management, allowing better planning of the flooded areas.
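The model-selection step (12 laws adjusted by maximum likelihood, with the GEV retained) can be miniaturized. The sketch below compares only two candidate laws by AIC on synthetic peaks, with closed-form likelihoods at moment-based estimates; it is a toy version of the selection logic, not the study's actual procedure or data.

```python
import numpy as np

rng = np.random.default_rng(7)
q = rng.gumbel(120.0, 35.0, size=500)  # synthetic annual flood peaks (m3/s)

def loglik_normal(x):
    # Gaussian log-likelihood evaluated at its ML estimates (closed form)
    s2 = x.var(ddof=0)
    return -0.5 * len(x) * (np.log(2 * np.pi * s2) + 1.0)

def loglik_gumbel(x):
    # Gumbel log-likelihood at method-of-moments estimates
    beta = x.std(ddof=1) * np.sqrt(6) / np.pi
    mu = x.mean() - 0.5772 * beta
    z = (x - mu) / beta
    return np.sum(-np.log(beta) - z - np.exp(-z))

# AIC = 2k - 2 ln L, with k = 2 parameters for both candidates
aic = {"normal": 4 - 2 * loglik_normal(q), "gumbel": 4 - 2 * loglik_gumbel(q)}
best = min(aic, key=aic.get)
```

Because the synthetic peaks are skewed, the extreme-value law wins the AIC comparison, mirroring how the GEV outperformed lighter-tailed candidates in the study.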
Flood frequency approach in a Mediterranean Flash Flood basin. A case study in the Besòs catchment
Velasco, D.; Zanon, F.; Corral, C.; Sempere-Torres, D.; Borga, M.
2009-04-01
Flash floods are one of the most devastating natural disasters in the Mediterranean areas. In particular, the region of Catalonia (North-East Spain) is one of the most affected by flash floods in the Iberian Peninsula. The high rainfall intensities generating these events, the specific terrain characteristics giving rise to very fast hydrological responses, and the high variability in space and time of both rain and land surface are the main features of flash floods (FF) and also the main cause of their extreme complexity. Distributed hydrological models have been developed to increase the flow forecast resolution in order to implement effective operational warning systems. Some studies have shown that distributed-model accuracy is highly sensitive to the computational grid scale, so hydrological model uncertainties must be studied. In these conditions, an estimation of the modeling uncertainty (whatever the accuracy is) becomes highly valuable information to enhance our ability to predict the occurrence of flash flooding. The statistical-distributed modeling approach (Reed, 2004) is proposed in the present study to simulate floods on a small basin and account for hydrologic modeling uncertainty. The Besòs catchment (1020 km2), near Barcelona, has been selected in this study to apply the proposed flood frequency methodology. Hydrometeorological data are available from 11 rain gauges and 6 streamflow gauges for the last 12 years, and a total of 9 flood events have been identified and analyzed in this study. The DiCHiTop hydrological model (Corral, 2004) was developed to fit operational requirements in the Besòs catchment: distributed, robust and easy to implement. It is a grid-based model that works at a given resolution (here at 1 × 1 km2, the hydrological cell), defining a simplified drainage system at this scale. A loss function is applied at the hydrological cell resolution, provided by a coupled storage model between the SCS model (Mockus, 1957) in urban areas and
Boudaghpour, Siamak; Bagheri, Majid; Bagheri, Zahra
2014-01-01
High flood occurrences with large environmental damages have a growing trend in Iran. Dynamic movements of water during a flood cause different environmental damages in geographical areas with different characteristics, such as topographic conditions. In general, the environmental effects and damages caused by a flood in an area can be investigated from different points of view. The current essay aims to detect the environmental effects of flood occurrences in the Halilrood catchment area of Kerman province in Iran using flood zone mapping techniques. The intended flood zone map was produced in four steps. Steps 1 to 3 pave the way to calculate and estimate the flood zone map in the study area, while step 4 determines the estimation of the environmental effects of flood occurrence. Based on our studies, a wide range of accuracy in estimating the environmental effects of flood occurrence was obtained using flood zone mapping techniques. Moreover, it was identified that the existence of the Jiroft dam in the study area can decrease the flood zone from 260 hectares to 225 hectares and can also decrease flood peak intensity by 20%. As a result, 14% of the flood zone in the study area can be saved environmentally.
Flood Frequency Analysis For Partial Duration Series In Ganjiang River Basin
zhangli, Sun; xiufang, Zhu; yaozhong, Pan
2016-04-01
Accurate estimation of flood frequency is key to effective, nationwide flood damage abatement programs. The partial duration series (PDS) method is widely used in hydrologic studies because it considers all events above a certain threshold level, as compared to the annual maximum series (AMS) method, which considers only the annual maximum value. However, the PDS has a drawback in that it is difficult to define the thresholds and maintain an independent and identically distributed partial duration series; this drawback is discussed in this paper. The Ganjiang River is the seventh largest tributary of the Yangtze River, the longest river in China. The Ganjiang River covers a drainage area of 81,258 km2 at the Wanzhou hydrologic station, the basin outlet. In this work, 56 years of daily flow data (1954-2009) from the Wanzhou station were used to analyze flood frequency, and the Pearson-III model was employed as the hydrologic probability distribution. Three tasks were accomplished: (1) the PDS threshold was obtained from the percentile rank of daily runoff; (2) trend analysis of the flow series was conducted using the PDS; and (3) flood frequency analysis was conducted for the partial duration flow series. The results showed a slight upward trend of the annual runoff in the Ganjiang River basin. The maximum flow with a 0.01 exceedance probability (corresponding to a 100-year flood peak under stationary conditions) was 20,000 m3/s, while that with a 0.1 exceedance probability was 15,000 m3/s. These results will serve as a guide to hydrological engineering planning, design, and management for policymakers and decision makers associated with hydrology.
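The PDS mechanics described above can be sketched in a few lines. This toy version uses a percentile-rank threshold and a simple 7-day declustering window on synthetic daily flows, and replaces the study's Pearson-III fit with empirical Weibull plotting positions; the window length and percentile are illustrative choices, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic 20-year daily-flow record (m3/s); lognormal noise stands in
# for an observed series
years = 20
flow = rng.lognormal(mean=5.0, sigma=0.6, size=years * 365)

# (1) threshold chosen as a high percentile rank of the daily runoff
threshold = np.percentile(flow, 99)

# (2) peaks-over-threshold with a 7-day declustering window so that the
# retained events are approximately independent
idx = np.where(flow > threshold)[0]
peaks, last = [], -10**9
for i in idx:
    if i - last > 7:
        peaks.append(flow[i])      # new independent event
    elif flow[i] > peaks[-1]:
        peaks[-1] = flow[i]        # same event: keep its maximum
    last = i
peaks = np.sort(peaks)[::-1]       # descending: rank 1 = largest event

# (3) empirical return periods from the PDS: T = 1 / (lambda * P_exceed),
# with lambda events per year and Weibull plotting positions
lam = len(peaks) / years
ranks = np.arange(1, len(peaks) + 1)
T = (len(peaks) + 1) / (lam * ranks)  # return period in years, by rank
```

Tightening the declustering window or lowering the threshold trades sample size against the independence assumption, which is exactly the PDS drawback the abstract discusses.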
Flood frequency matters: Why climate change degrades deep-water quality of peri-alpine lakes
Fink, Gabriel; Wessels, Martin; Wüest, Alfred
2016-09-01
Sediment-laden riverine floods transport large quantities of dissolved oxygen into the receiving deep layers of lakes. Hence, the water quality of deep lakes is strongly influenced by the frequency of riverine floods. Although flood frequency reflects climate conditions, the effects of climate variability on the water quality of deep lakes are largely unknown. We quantified the effects of climate variability on the potential shifts in the flood regime of the Alpine Rhine, the main catchment of Lake Constance, and determined the intrusion depths of riverine density-driven underflows and the subsequent effects on water exchange rates in the lake. A simplified hydrodynamic underflow model was developed and validated with observed river inflow and underflow events. The model was implemented to estimate underflow statistics for different river inflow scenarios. Using this approach, we translated present and possible future flood frequencies into underflow occurrences and intrusion depths in Lake Constance. The results indicate that more floods will increase the number of underflows and the intensity of deep-water renewal, and consequently will cause higher deep-water dissolved oxygen concentrations. Vice versa, fewer floods weaken deep-water renewal and lead to lower deep-water dissolved oxygen concentrations. Meanwhile, a change from a glacial-nival regime (present) to a nival-pluvial regime (future) is expected to decrease deep-water renewal. While flood frequencies are not expected to change noticeably for the next decades, it is most likely that increased winter discharge and decreased summer discharge will reduce the number of deep density-driven underflows by 10% and favour shallower riverine interflows in the upper hypolimnion. The renewal in the deepest layers is expected to be reduced by nearly 27%. This study underlines potential consequences of climate change on the occurrence of deep river underflows and water residence times in deep lakes.
A comparison of three approaches to non-stationary flood frequency analysis
Debele, S. E.; Strupczewski, W. G.; Bogdanowicz, E.
2017-08-01
Non-stationary flood frequency analysis (FFA) is applied to statistical analysis of seasonal flow maxima from Polish and Norwegian catchments. Three non-stationary estimation methods, namely maximum likelihood (ML), two-stage (WLS/TS) and GAMLSS (generalized additive model for location, scale and shape parameters), are compared in the context of capturing the effect of non-stationarity on the estimation of time-dependent moments and design quantiles. The use of a multimodel approach is recommended to reduce quantile errors due to model misspecification. The results of calculations based on observed seasonal daily flow maxima and computer simulation experiments showed that GAMLSS gave the best results with respect to the relative bias and root mean square error in the estimates of the trend in the standard deviation and the constant shape parameter, while WLS/TS provided better accuracy in the estimates of the trend in the mean value. Among the three compared methods, WLS/TS is recommended for dealing with non-stationarity in short time series. Some practical aspects of the GAMLSS package application are also presented. The detailed discussion of general issues related to consequences of climate change in the FFA is presented in the second part of the article entitled "Around and about an application of the GAMLSS package in non-stationary flood frequency analysis".
Improving the flash flood frequency analysis applying dendrogeomorphological evidences
Ruiz-Villanueva, V.; Ballesteros, J. A.; Bodoque, J. M.; Stoffel, M.; Bollschweiler, M.; Díez-Herrero, A.
2009-09-01
Flash floods are one of the natural hazards that cause major damage worldwide. Especially in Mediterranean areas, they cause high economic losses every year. In mountain areas with high stream gradients, flood events are characterized by extremely high flow and debris transport rates. Flash flood analysis in mountain areas presents specific scientific challenges. On the one hand, there is a lack of information on precipitation and discharge due to a lack of spatially well-distributed gauge stations with long records. On the other hand, gauge stations may not record correctly during extreme events, when they are damaged or the discharge exceeds the recordable level. In this case, no systematic data are available to improve the understanding of the spatial and temporal occurrence of the process. Since historic documentation is normally scarce or even completely missing in mountain areas, tree-ring analysis can provide an alternative approach. Flash floods may influence trees in different ways: (1) tilting of the stem through the unilateral pressure of the flowing mass or individual boulders; (2) root exposure through erosion of the banks; (3) injuries and scars caused by boulders and wood transported in the flow; (4) decapitation of the stem and resulting candelabra growth through the severe impact of boulders; (5) stem burial through deposition of material. The trees react to these disturbances with specific growth changes such as abrupt changes of the yearly increment and anatomical changes like reaction wood or callus tissue. In this study, we sampled 90 cross sections and 265 increment cores of trees heavily affected by past flash floods in order to date past events and to reconstruct recurrence intervals in two torrent channels located in the Spanish Central System. The first study site is located along the Pelayo River, a torrent in natural conditions. Based on the external disturbances of trees and their geomorphological position, 114 Pinus pinaster (Ait
Assessment of homogeneity of regions for regional flood frequency analysis
Lee, Jeong Eun; Kim, Nam Won
2016-04-01
This paper analyzed the effect of rainfall on hydrological similarity, which is an important step for regional flood frequency analysis (RFFA). For the RFFA, the storage function method (SFM) with a spatial extension technique was applied to the 22 sub-catchments partitioned from the Chungju dam watershed in the Republic of Korea. We used the SFM to generate the annual maximum floods for the 22 sub-catchments using annual maximum storm events (1986~2010) as input data. Then the quantiles of rainfall and flood were estimated using the annual maximum series for the 22 sub-catchments. Finally, spatial variations of the two quantiles were analyzed. As a result, there was a significant correlation between the spatial variations of the two quantiles. This result demonstrates that the spatial variation of rainfall is an important factor in explaining the homogeneity of regions when applying RFFA. Acknowledgements: This research was supported by a grant (11-TI-C06) from the Advanced Water Management Research Program funded by the Ministry of Land, Infrastructure and Transport of the Korean government.
Guidelines for determining flood flow frequency—Bulletin 17C
England, John F.; Cohn, Timothy A.; Faber, Beth A.; Stedinger, Jery R.; Thomas, Wilbert O.; Veilleux, Andrea G.; Kiang, Julie E.; Mason, Robert R.
2018-03-29
Accurate estimates of flood frequency and magnitude are a key component of any effective nationwide flood risk management and flood damage abatement program. In addition to accuracy, methods for estimating flood risk must be uniformly and consistently applied because management of the Nation’s water and related land resources is a collaborative effort involving multiple actors, including most levels of government and the private sector. Flood frequency guidelines have been published in the United States since 1967, and have undergone periodic revisions. In 1967, the U.S. Water Resources Council presented a coherent approach to flood frequency with Bulletin 15, “A Uniform Technique for Determining Flood Flow Frequencies.” The method it recommended involved fitting the log-Pearson Type III distribution to annual peak flow data by the method of moments. The first extension and update of Bulletin 15 was published in 1976 as Bulletin 17, “Guidelines for Determining Flood Flow Frequency” (Guidelines). It extended the Bulletin 15 procedures by introducing methods for dealing with outliers, historical flood information, and regional skew. Bulletin 17A was published the following year to clarify the computation of weighted skew. The next revision, Bulletin 17B, provided a host of improvements and new techniques designed to address situations that often arise in practice, including better methods for estimating and using regional skew, weighting station and regional skew, detection of outliers, and use of the conditional probability adjustment. The current version of these Guidelines is presented in this document, denoted Bulletin 17C. It incorporates changes motivated by four of the items listed as “Future Work” in Bulletin 17B and 30 years of post-17B research on flood processes and statistical methods. The updates include: adoption of a generalized representation of flood data that allows for interval and censored data types; a new method
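The core recipe that the Bulletins standardize, fitting the log-Pearson Type III distribution to annual peaks by the method of moments, can be sketched in a few lines. This is a simplified illustration on synthetic peaks: the frequency factor uses Kite's polynomial approximation in place of the published K tables, and none of Bulletin 17B/17C's refinements (regional-skew weighting, outlier tests, interval or censored data) are included.

```python
import numpy as np

Z100 = 2.326  # standard-normal quantile for 1% annual exceedance (100-yr event)

def lp3_quantile(peaks, z):
    """Flood quantile from a log-Pearson Type III fit by moments (sketch)."""
    y = np.log10(np.asarray(peaks, dtype=float))
    n = len(y)
    m, s = y.mean(), y.std(ddof=1)
    # station skew with the small-sample correction n / ((n-1)(n-2))
    g = n * np.sum((y - m) ** 3) / ((n - 1) * (n - 2) * s ** 3)
    # Kite's polynomial approximation of the Pearson III frequency factor
    k = g / 6.0
    K = (z + (z**2 - 1) * k + (z**3 - 6 * z) * k**2 / 3
         - (z**2 - 1) * k**3 + z * k**4 + k**5 / 3)
    return 10 ** (m + K * s)

# Synthetic 50-year annual-peak record (m3/s), roughly lognormal
rng = np.random.default_rng(5)
peaks = 10 ** rng.normal(3.0, 0.25, size=50)
q100 = lp3_quantile(peaks, Z100)
```

When the log-space skew is zero the frequency factor K reduces to the normal quantile z, so the lognormal case falls out as a special case of the fit.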
Ono, T.; Takahashi, T.
2017-12-01
to predict efficiently and accurately. River flood analysis using the proposed method will contribute to mitigating flood disasters by improving the accuracy of the estimated inundation area.
Directory of Open Access Journals (Sweden)
Wang Xudong
2012-01-01
Full Text Available An automatic-pairing joint direction-of-arrival (DOA) and frequency estimation method is presented to overcome the unsatisfactory performance of the estimation of signal parameters via rotational invariance techniques (ESPRIT)-like algorithm of Wang (2010), which requires an additional pairing step. By using the multiple-delay output of a uniform linear array (ULA), the proposed algorithm can jointly estimate angles and frequencies with an improved ESPRIT. Compared with Wang’s ESPRIT algorithm, the angle estimation performance of the proposed algorithm is greatly improved, while its frequency estimation performance is the same. Furthermore, the proposed algorithm obtains automatically paired DOA and frequency parameters at comparable computational complexity, and it also works well for nonuniform linear arrays. The behavior of the proposed algorithm is verified by simulations.
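The rotational-invariance idea behind ESPRIT-type estimators can be illustrated with a minimal single-tone sketch: the dominant subspace of a Hankel data matrix is shift-invariant up to a factor whose phase encodes the frequency. The paper's joint DOA-frequency algorithm with multiple delays and multiple sources is considerably more involved; this toy version handles one noiseless tone only.

```python
import numpy as np

def esprit_freq(x, L=40):
    """Single-tone ESPRIT: estimate a normalized frequency from the shift
    invariance of the dominant subspace of a Hankel data matrix."""
    N = len(x)
    # columns are length-L sliding windows of the signal
    X = np.column_stack([x[m:m + L] for m in range(N - L + 1)])
    U, _, _ = np.linalg.svd(X, full_matrices=False)
    u = U[:, 0]                                   # dominant left singular vector
    # least-squares solution of u[1:] ≈ phi * u[:-1] (rotational invariance)
    phi = np.vdot(u[:-1], u[1:]) / np.vdot(u[:-1], u[:-1])
    return np.angle(phi) / (2.0 * np.pi)

n = np.arange(200)
x = np.exp(2j * np.pi * 0.12 * n)                 # noiseless complex exponential
f_hat = esprit_freq(x)                            # recovers 0.12
```

Because the ratio u[m+1]/u[m] is invariant to the arbitrary phase of the singular vector, the estimate is exact for a noiseless tone.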
Performance of regional flood frequency analysis methods in ...
African Journals Online (AJOL)
2015-04-03
Available on website http://www.wrc.org.za. ISSN 1816-7950 ... Estimates of design floods are required for the design of hydraulic structures and to quantify the risk of failure of the ... performance when compared to design floods estimated from the annual maximum series extracted from the observed data.
Directory of Open Access Journals (Sweden)
G. Moretti
2008-08-01
The estimation of peak river flow for ungauged river sections is a topical issue in applied hydrology. Spatially distributed rainfall-runoff models can be a useful tool to this end, since they are potentially able to simulate the river flow at any location of the watershed drainage network. However, it is not fully clear to what extent these models can provide reliable simulations over a wide range of spatial scales. This issue is investigated here by applying a spatially distributed, continuous-simulation rainfall-runoff model to infer the flood frequency distribution of the Riarbero River, an ungauged mountain creek located in northern Italy with a drainage area of 17 km^{2}. The hydrological model is first calibrated using a 1-year record of hourly meteorological data and river flows observed at the outlet of the 1294 km^{2} Secchia River basin, of which the Riarbero is a tributary. The model is then validated by performing a 100-year-long simulation of synthetic river flow data, which allowed us to compare the simulated and observed flood frequency distributions at the Secchia River outlet and at the internal river cross section of Cavola Bridge, where the basin area is 337 km^{2}. Finally, another simulation of hourly river flows was performed for the outlet of the Riarbero River, allowing us to estimate the related flood frequency distribution. The results were validated against estimates of peak river flow obtained by applying hydrological similarity principles and a regional method. They show that the flood flow estimated through the application of the distributed model is consistent with the estimate provided by the regional procedure as well as with the behavior of the river banks. Conversely, the method based on hydrological similarity delivers an estimate that does not seem as reliable. The analysis highlights interesting perspectives for the application of
Gaál, Ladislav; Kohnová, Silvia; Szolgay, Ján.
2010-05-01
During the last 10-15 years, Slovak hydrologists and water resources managers have devoted considerable effort to developing statistical tools for modelling probabilities of flood occurrence in a regional context. Initially, these models followed concepts of regional flood frequency analysis based on fixed regions; later the Hosking and Wallis (HW; 1997) theory was adopted and modified. Nevertheless, it turned out that delineating homogeneous regions using these approaches is not a straightforward task, mostly due to the complex orography of the country. In this poster we aim at revisiting the flood frequency analyses so far accomplished for Slovakia by adopting one of the pooling approaches, the region-of-influence (ROI) approach (Burn, 1990). In the ROI approach, a unique pooling group of similar sites is defined for each site under study. The similarity of sites is defined through Euclidean distance in the space of site attributes that had also proven applicable in earlier cluster analyses: catchment area, afforested area, hydrogeological catchment index and mean annual precipitation. The homogeneity of the proposed pooling groups is evaluated by the built-in homogeneity test of Lu and Stedinger (1992). Two alternatives of the ROI approach are examined: in the first, the target size of the pooling groups is adjusted to the target return period T of the estimated flood quantiles, while in the other, the target size is fixed regardless of the target T. The statistical models of the ROI approach are compared with the conventional regionalization approach based on the HW methodology, in which the parameters of flood frequency distributions were derived by means of L-moment statistics and a regional formula for the estimation of the index flood was derived by multiple regression using physiographic and climatic catchment characteristics. The inter-comparison of the different frequency models is evaluated by means of the
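The ROI pooling step, ranking sites by Euclidean distance in the space of site attributes, can be sketched as follows. The attribute values and the group size are made-up illustrations; real applications would also iterate on the group until the homogeneity test is satisfied.

```python
import numpy as np

def roi_pooling_group(attributes, target, size):
    """Region-of-influence pooling: rank sites by Euclidean distance to the
    target site in standardized attribute space and keep the nearest ones."""
    A = np.asarray(attributes, dtype=float)
    Z = (A - A.mean(axis=0)) / A.std(axis=0)      # standardize each attribute
    d = np.linalg.norm(Z - Z[target], axis=1)     # distance of every site to target
    return np.argsort(d)[:size]                   # includes the target itself (d = 0)

# columns: catchment area (km2), afforested area (%), hydrogeol. index, MAP (mm)
sites = [[120, 35, 0.4,  700],
         [110, 40, 0.5,  720],
         [900, 10, 0.1, 1100],
         [130, 30, 0.4,  690],
         [850, 15, 0.2, 1050]]
group = roi_pooling_group(sites, target=0, size=3)
```

Standardizing the attributes first keeps the large-magnitude attribute (catchment area) from dominating the distance.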
Coupling Radar Rainfall Estimation and Hydrological Modelling For Flash-flood Hazard Mitigation
Borga, M.; Creutin, J. D.
Flood risk mitigation is accomplished through managing either or both the hazard and vulnerability. Flood hazard may be reduced through structural measures which alter the frequency of flood levels in the area. The vulnerability of a community to flood loss can be mitigated through changing or regulating land use and through flood warning and effective emergency response. When dealing with flash-flood hazard, it is generally accepted that the most effective way (and in many instances the only affordable in a sustainable perspective) to mitigate the risk is by reducing the vulnerability of the involved communities, in particular by implementing flood warning systems and community self-help programs. However, both the inherent characteristics of the atmospheric and hydrologic processes involved in flash-flooding and the changing societal needs provide a tremendous challenge to traditional flood forecasting and warning concepts. In fact, the targets of these systems are traditionally localised like urbanised sectors or hydraulic structures. Given the small spatial scale that characterises flash floods and the development of dispersed urbanisation, transportation, green tourism and water sports, human lives and property are exposed to flash flood risk in a scattered manner. This must be taken into consideration in flash flood warning strategies and the investigated region should be considered as a whole and every section of the drainage network as a potential target for hydrological warnings. Radar technology offers the potential to provide information describing rain intensities almost continuously in time and space. Recent research results indicate that coupling radar information to distributed hydrologic modelling can provide hydrologic forecasts at all potentially flooded points of a region. Nevertheless, very few flood warning services use radar data more than on a qualitative basis. After a short review of current understanding in this area, two
Forest cover, socioeconomics, and reported flood frequency in developing countries
Ferreira, Susana; Ghimire, Ramesh
2012-08-01
In this paper, we analyze the determinants of the number of large floods reported since 1990. Using the same sample of countries as Bradshaw et al. (2007), and, like them, omitting socioeconomic characteristics from the analysis, we found that a reduction in natural forest cover is associated with an increase in the reported count of large floods. This result does not hold in any of three new analyses we perform. First, we expand the sample to include all the developing countries and all countries for which data were available but were omitted in their study. Second, and more importantly, since forest management is just one possible channel through which humans can influence reported flood frequency, we account for other important human-flood interactions. People are typically responsible for deforestation, but they are also responsible for other land use changes (e.g., urbanization), for floodplain and flood emergency management, and for reporting the floods. Thus, in our analysis we account for population, urban population growth, income, and corruption. Third, we exploit the panel nature of the data to control for unobserved country and time heterogeneity. We conclude that not only is the link between forest cover and reported flood frequency at the country level not robust, it also seems to be driven by sample selection and omitted variable bias. The human impact on the reported frequency of large floods at the country level is not through deforestation.
Flooding PSA by considering the operating experience data of Korean PWRs
International Nuclear Information System (INIS)
Choi, Sun Yeong; Yang, Joon Eon
2007-01-01
The existing flooding Probabilistic Safety Analysis (PSA) was updated to reflect Korean plant-specific operating experience data in the flooding frequency and thereby improve the PSA quality. Both the Nuclear Power Experience (NPE) database and the Korea Nuclear PIPE Failure (NuPIPE) database were used in this study, and from these databases only the Pressurized Water Reactor (PWR) data were used for the flooding frequencies of the flooding areas in the primary auxiliary building. With these databases and a Bayesian method, the flooding frequencies for the flooding areas were estimated. Subsequently, the Core Damage Frequency (CDF) for the flooding PSA of the UlChiN (UCN) unit 3 and 4 plants, based on the Korean Standard Nuclear power Plant (KSNP) internal full-power PSA model, was recalculated. The evaluation results showed that sixteen flooding events are potentially significant according to the screening criterion, whereas only two flooding events exceeded the screening criterion in the existing UCN 3 and 4 flooding PSA. The result was compared with two other cases: 1) the flooding frequency and CDF from the method of the existing flooding PSA, with the PWR and Boiling Water Reactor (BWR) data of the NPE database and the Maximum Likelihood Estimation (MLE) method, and 2) the flooding frequency and CDF with the NPE database (PWR and BWR data), the NuPIPE database, and a Bayesian method. From the comparison, the difference in CDF results was more pronounced between this study and case 2) than between case 1) and case 2). That is, the number of flooding events exceeding the screening criterion increased further when only the PWR data were used for the primary auxiliary building than when the Korean specific data were used
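The abstract does not spell out the Bayesian update it uses; a common choice for a Poisson event rate, consistent with the Jeffreys noninformative prior mentioned for NUREG/CR-5750 in the head of this collection, is the conjugate Gamma update sketched below. The event count and exposure time are illustrative numbers, not data from the study.

```python
from scipy import stats

def flood_frequency_jeffreys(n_events, exposure_years):
    """Bayesian update of a Poisson occurrence rate with the Jeffreys
    noninformative prior: the posterior is Gamma(n + 1/2, exposure).
    Returns the posterior mean and a 90% credible interval (per year)."""
    shape = n_events + 0.5
    scale = 1.0 / exposure_years
    mean = shape * scale                      # (n + 0.5) / T
    lo, hi = stats.gamma.ppf([0.05, 0.95], a=shape, scale=scale)
    return mean, (lo, hi)

# e.g. 2 flooding events observed over 40 plant-years (illustrative)
mean, (lo, hi) = flood_frequency_jeffreys(2, 40.0)   # mean = 0.0625 per year
```

The half-count in the shape parameter is what keeps the estimate nonzero even for areas with no recorded flooding events.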
Harden, Tessa M.; O'Connor, Jim E.; Driscoll, Daniel G.; Stamm, John F.
2011-01-01
, gentle topography, and extensive floodplain storage. Results of the paleoflood investigations are directly applicable only to the specific study reaches and in the case of Rapid Creek, only to pre-regulation conditions. Thus, approaches for broader applications were developed from inferences of overall flood-generation processes, and appropriate domains for application of results were described. Example applications were provided by estimating flood quantiles for selected streamgages, which also allowed direct comparison with results of at-site flood-frequency analyses from a previous study. Several broad issues and uncertainties were examined, including potential biases associated with stratigraphic records that inherently are not always complete, uncertainties regarding statistical approaches, and the unknown applicability of paleoflood records to future watershed conditions. The results of the paleoflood investigations, however, provide much better physically based information on low-probability floods than has been available previously, substantially improving estimates of the magnitude and frequency of large floods in these basins and reducing associated uncertainty.
A space-time hybrid hourly rainfall model for derived flood frequency analysis
Directory of Open Access Journals (Sweden)
U. Haberlandt
2008-12-01
For derived flood frequency analysis based on hydrological modelling long continuous precipitation time series with high temporal resolution are needed. Often, the observation network with recording rainfall gauges is poor, especially regarding the limited length of the available rainfall time series. Stochastic precipitation synthesis is a good alternative either to extend or to regionalise rainfall series to provide adequate input for long-term rainfall-runoff modelling with subsequent estimation of design floods. Here, a new two step procedure for stochastic synthesis of continuous hourly space-time rainfall is proposed and tested for the extension of short observed precipitation time series.
First, a single-site alternating renewal model is presented to simulate independent hourly precipitation time series for several locations. The alternating renewal model describes wet spell durations, dry spell durations and wet spell intensities using univariate frequency distributions separately for two seasons. The dependence between wet spell intensity and duration is accounted for by 2-copulas. For disaggregation of the wet spells into hourly intensities a predefined profile is used. In the second step a multi-site resampling procedure is applied on the synthetic point rainfall event series to reproduce the spatial dependence structure of rainfall. Resampling is carried out successively on all synthetic event series using simulated annealing with an objective function considering three bivariate spatial rainfall characteristics. In a case study synthetic precipitation is generated for some locations with short observation records in two mesoscale catchments of the Bode river basin located in northern Germany. The synthetic rainfall data are then applied for derived flood frequency analysis using the hydrological model HEC-HMS. The results show good performance in reproducing average and extreme rainfall characteristics as well as in
Water mobility key to improved floods
Energy Technology Data Exchange (ETDEWEB)
Pamenter, C B
1967-03-01
The use of polymer floods in the U.S. and Canada is discussed. A 2-yr laboratory study conducted by Dow Chemical Co. early in the life of polymer flooding showed that polymers improved the mobility ratio without damage to porosity or permeability of reservoir rock. A pilot test was made in the Niagara Field, Ky., and the results of this pilot compared to the performance of a waterflood that had been operating in this field for about 4 yr. The results showed that polymer flooding was superior to conventional waterflooding and had a distinct behavior. Another pilot flood conducted by Dow in the Albrecht Field, Starr County, Tex., showed similar results. Union Oil Co. of California also conducted pilot tests in 4 of their California reservoirs. Additional recoverable reserves resulting from polymer flooding for 2 of these reservoirs were estimated at 95,000 and 70,000 bbl. The other 2 tests were not as satisfactory, but this behavior is thought to be the result of not using enough polymer. Two other projects discussed are the NE. Hallsville Field unit in East Texas and the Squirrel sand reservoir in Woodson County, Kans., which were conducted by Hunt Oil Co. and Brazos Oil and Gas Co., respectively.
Ghazavi, Reza; Moafi Rabori, Ali; Ahadnejad Reveshty, Mohsen
2016-01-01
Estimating the design storm based on rainfall intensity–duration–frequency (IDF) curves is an important step in the hydrologic planning of urban areas. The main aim of this study was to estimate rainfall intensities of the Zanjan city watershed based on the overall relationship of rainfall IDF curves and an appropriate model of hourly rainfall estimation (the Sherman method and the Ghahreman and Abkhezr method). The hydrologic and hydraulic impacts of changes in the rainfall IDF curves on flood properties were evaluated via Stormw...
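A Sherman-type IDF relation has the form i = a/(t + b)^c, with coefficients fitted separately for each return period. The sketch below evaluates it for a hypothetical coefficient set; the study's fitted values are not given in the abstract.

```python
def sherman_intensity(duration_min, a, b, c):
    """Sherman-type IDF relation i = a / (t + b)**c. The coefficients a, b,
    c are fitted per return period; the values used below are illustrative
    placeholders, not results from the Zanjan study."""
    return a / (duration_min + b) ** c

# rainfall intensity (mm/h) for a 30-minute storm, hypothetical coefficients
i30 = sherman_intensity(30.0, a=1200.0, b=10.0, c=0.8)
```

As expected of any IDF curve, the computed intensity decreases monotonically as the storm duration grows.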
Extent and frequency of floods on Delaware River in vicinity of Belvidere, New Jersey
Farlekas, George M.
1966-01-01
A stream overflowing its banks is a natural phenomenon. This natural phenomenon of flooding has occurred on the Delaware River in the past and will occur in the future. T' o resulting inundation of large areas can cause property damage, business losses and possible loss of life, and may result in emergency costs for protection, rescue, and salvage work. For optimum development of the river valley consistent with the flood risk, an evaluation of flood conditions is necessary. Basic data and the interpretation of the data on the regimen of the streams, particularly the magnitude of floods to be expected, the frequency of their occurrence, and the areas inundated, are essential for planning and development of flood-prone areas.This report presents information relative to the extent, depth, and frequency of floods on the Delaware River and its tributaries in the vicinity of Belvidere, N.J. Flooding on the tributaries detailed in the report pertains only to the effect of backwater from the Delaware River. Data are presented for several past floods with emphasis given to the floods of August 19, 1955 and May 24, 1942. In addition, information is given for a hypothetical flood based on the flood of August 19, 1955 modified by completed (since 1955) and planned flood-control works.By use of relations presented in this report the extent, depth, and frequency of flooding can be estimated for any site along the reach of the Delaware River under study. Flood data and the evaluation of the data are presented so that local and regional agencies, organizations, and individuals may have a technical basis for making decisions on the use of flood-prone areas. The Delaware River Basin Commission and the U.S. Geological Survey regard this program of flood-plain inundation studies as a positive step toward flood-damage prevention. Flood-plain inundation studies, when followed by appropriate land-use regulations, are a valuable and economical supplement to physical works for flood
Comparison between changes in flood hazard and risk in Spain using historical information
Llasat, Maria-Carmen; Mediero, Luis; Garrote, Luis; Gilabert, Joan
2015-04-01
Recently, the COST Action ES0901 "European procedures for flood frequency estimation (FloodFreq)" had as its objective "the comparison and evaluation of methods for flood frequency estimation under the various climatologic and geographic conditions found in Europe". It highlighted the improvement of regional analyses over at-site estimates in terms of the uncertainty of quantile estimates. In the case of Spain, a regional analysis was carried out at the national scale, which allows identifying the flow threshold corresponding to a given return period from the observed flow series recorded at a gauging station. In addition, Mediero et al. (2014) studied the possible influence of non-stationarity on flood series for the period 1942-2009. In parallel, Barnolas and Llasat (2007), among others, collected documentary information on catastrophic flood events in Spain over the last centuries. Traditionally, the first approach ("top-down") identifies a flood as catastrophic when it exceeds the 500-year return period flood, whereas the second one ("bottom-up") accounts for flood damages (Llasat et al., 2005). This study presents a comparison between both approaches, discussing the potential factors that can lead to discrepancies between them, as well as accounting for information about major changes experienced in the catchment that could lead to changes in flood hazard and risk.
Zhang, Qiang; Gu, Xihui; Singh, Vijay P.; Shi, Peijun; Sun, Peng
2018-05-01
Flood risks across the Pearl River basin, China, were evaluated using a peak flood flow dataset covering the period 1951-2014 from 78 stations and historical flood records of the past 1000 years. The generalized extreme value (GEV) model and the kernel estimation method were used to evaluate frequencies and risks of hazardous flood events. Results indicated that (1) no abrupt changes or significant trends could be detected in peak flood flow series at most of the stations; only 16 out of 78 stations exhibited significant peak flood flow changes, with change points around 1990. Peak flood flow in the West River basin increased, with significant increasing trends during 1981-2010; decreasing peak flood flow was found in coastal regions, with significant trends during 1951-2014 and 1966-2014. (2) The largest three flood events were found to cluster in both space and time. Generally, basin-scale flood hazards can be expected in the West and North River basins. (3) The occurrence rate of floods increased in the middle Pearl River basin but decreased in the lower Pearl River basin; hazardous flood events were nevertheless observed in the middle and lower Pearl River basin, particularly during the past 100 years. Precipitation extremes were subject to only moderate variations, and human activities, such as the building of levees, channelization of river systems, and rapid urbanization, were the factors behind the amplification of floods in the middle and lower Pearl River basin, posing serious challenges for the mitigation of flood hazards in the lower Pearl River basin, particularly the Pearl River Delta (PRD) region.
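The GEV-based frequency step described above can be sketched with SciPy. The synthetic record below stands in for an observed peak-flow series, and the shape, location, and scale values are arbitrary illustrations, not fitted values from the Pearl River study.

```python
import numpy as np
from scipy import stats

# synthetic 60-year annual peak-flow record (stand-in for observed data)
rng = np.random.default_rng(42)
peaks = stats.genextreme.rvs(c=-0.1, loc=5000, scale=1500,
                             size=60, random_state=rng)

# fit a GEV by maximum likelihood and read off return levels
c, loc, scale = stats.genextreme.fit(peaks)
q50 = stats.genextreme.ppf(1 - 1/50, c, loc=loc, scale=scale)    # 50-yr flood
q100 = stats.genextreme.ppf(1 - 1/100, c, loc=loc, scale=scale)  # 100-yr flood
```

Note SciPy's sign convention: `genextreme` uses c = -k relative to the common hydrological (Hosking) shape parameter, so c < 0 corresponds to a heavy (Fréchet-type) upper tail.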
Floods on small streams in North Carolina, probable magnitude and frequency
Hinson, Herbert G.
1965-01-01
The magnitude and frequency of floods are defined regionally for small streams (drainage area, 1 to 150 sq mi) in North Carolina. Composite frequency curves for each of two regions relate the magnitude of the annual flood, in ratio to the mean annual flood, to recurrence intervals of 1.1 to 50 years. In North Carolina, the mean annual flood (Q2.33) is related to drainage area (A) by the following equation: Q2.33 = G A^{0.66}, where G, the geographic factor, is the product of a statewide coefficient (US) times a correction which reflects differences in basin characteristics. Isograms of the G factor covering the State are presented.
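Under the report's relation Q2.33 = G A^{0.66}, a flood quantile for a given recurrence interval follows by scaling the mean annual flood with the composite frequency-curve ratio. The G value and the ratio below are placeholder numbers for illustration, not values read from the report's isograms or curves.

```python
def mean_annual_flood(G, drainage_area_sq_mi):
    """Regional mean annual flood Q2.33 = G * A**0.66; G is the geographic
    factor read from the isogram maps (the value used below is made up)."""
    return G * drainage_area_sq_mi ** 0.66

def flood_quantile(ratio_T, G, area):
    """Scale the mean annual flood by the composite frequency-curve ratio
    Q_T / Q2.33 for the desired recurrence interval (ratio is illustrative)."""
    return ratio_T * mean_annual_flood(G, area)

q2_33 = mean_annual_flood(G=100.0, drainage_area_sq_mi=100.0)   # ~2089 cfs
q50 = flood_quantile(ratio_T=3.0, G=100.0, area=100.0)          # hypothetical ratio
```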
The effect of coupling hydrologic and hydrodynamic models on probable maximum flood estimation
Felder, Guido; Zischg, Andreas; Weingartner, Rolf
2017-07-01
Deterministic rainfall-runoff modelling usually assumes a stationary hydrological system, as model parameters are calibrated with, and are therefore dependent on, observed data. However, runoff processes are probably not stationary in the case of a probable maximum flood (PMF), where discharge greatly exceeds observed flood peaks. Developing hydrodynamic models and using them to build coupled hydrologic-hydrodynamic models can potentially improve the plausibility of PMF estimations. This study aims to assess the potential benefits and constraints of coupled modelling compared to standard deterministic hydrologic modelling for PMF estimation. The two modelling approaches are applied using a set of 100 spatio-temporal probable maximum precipitation (PMP) distribution scenarios. The resulting hydrographs and peak discharges, as well as the reliability and plausibility of the estimates, are evaluated. The discussion of the results shows that coupling hydrologic and hydrodynamic models substantially improves the physical plausibility of PMF modelling, although both modelling approaches lead to PMF estimations for the catchment outlet that fall within a similar range. Using a coupled model is particularly suggested in cases where considerable flood-prone areas are situated within a catchment.
Flood frequency analysis at ungauged sites in the KwaZulu-Natal ...
African Journals Online (AJOL)
Use of the index-flood method at ungauged sites requires methods for estimation of the index-flood parameter at these sites. This study attempts to relate the mean annual flood to site characteristics of catchments in KwaZulu-Natal, South Africa. The ordinary, weighted and generalised least square methods for estimating ...
Biondi, Daniela; De Luca, Davide Luciano
2015-04-01
The use of rainfall-runoff models represents an alternative to statistical approaches (such as at-site or regional flood frequency analysis) for design flood estimation, and constitutes an answer to the increasing need for synthetic design hydrographs (SDHs) associated with a specific return period. However, the lack of streamflow observations and the consequent high uncertainty associated with parameter estimation usually pose serious limitations to the use of process-based approaches in ungauged catchments, which nonetheless represent the majority in practical applications. This work presents the application of a Bayesian procedure that, for a predefined rainfall-runoff model, allows for the assessment of the posterior parameter distribution, using the limited and uncertain information available for the response of an ungauged catchment (Bulygina et al. 2009; 2011). The use of regional estimates of river flow statistics, interpreted as hydrological signatures that measure theoretically relevant system process behaviours (Gupta et al. 2008), within this framework represents a valuable option and has shown significant developments in recent literature to constrain the plausible model response and to reduce the uncertainty in ungauged basins. In this study we rely on the first three L-moments of annual streamflow maxima, for which regressions are available from previous studies (Biondi et al. 2012; Laio et al. 2011). The methodology was carried out for a catchment located in southern Italy, and used within a Monte Carlo scheme (MCs) considering both event-based and continuous simulation approaches for design flood estimation. The applied procedure offers promising perspectives for model calibration and uncertainty analysis in ungauged basins; moreover, in the context of design flood estimation, process-based methods coupled with the MCs approach have the advantage of providing an uncertainty analysis of the simulated floods, which represents an asset in risk-based decision
Harden, Tessa M.; O'Connor, Jim E.
2017-06-14
Stratigraphic analysis, coupled with geochronologic techniques, indicates that a rich history of large Tennessee River floods is preserved in the Tennessee River Gorge area. Deposits of flood sediment from the 1867 peak discharge of record (460,000 cubic feet per second at Chattanooga, Tennessee) are preserved at many locations throughout the study area at sites with flood-sediment accumulation. Small exposures at two boulder overhangs reveal evidence of three to four other floods similar in size to, or larger than, the 1867 flood in the last 3,000 years, one possibly 50 percent or more larger. Records of floods also are preserved in stratigraphic sections at the mouth of the gorge at Williams Island and near Eaves Ferry, about 70 river miles upstream of the gorge. These stratigraphic records may extend as far back as about 9,000 years ago, giving a long history of Tennessee River floods. Although more evidence is needed to confirm these findings, a more in-depth comprehensive paleoflood study is feasible for the Tennessee River.
Directory of Open Access Journals (Sweden)
J. A. Ortega
2009-02-01
The Guadiana River has a significant record of historical floods, but the systematic data record is only 59 years long. From the layers left by ancient floods, we can add new data to the record and estimate the maximum discharges of other floods known only by their moment of occurrence and the damage they caused. A hydraulic model was built for the Pulo do Lobo area and calibrated by means of the rating curve of the Pulo do Lobo station. The palaeofloods have been dated by means of ^{14}C and ^{137}Cs. Because non-systematic information was used to calculate the distribution functions, the quantiles changed with respect to the same functions fitted with systematic information alone. The results show a variation in the curves that can be attributed to the human transformations that changed the hydrologic conditions as well as to the latest climate changes. High-magnitude floods are related to cold periods, especially at transitional moments of change from cold to warm periods. This tendency has changed since the last medium-high-magnitude flood, which took place in the systematic period. Both reasons seem to justify a change in the frequency curves, indicating a recent decrease in the return period of big floods over 8000 m^{3} s^{−1}. The palaeofloods indicate a longer return period for the same discharge, thus providing reference values for the river basin in its natural condition, prior to the transformation of the basin by anthropic action.
An at-site flood estimation method in the context of nonstationarity I. A simulation study
Gado, Tamer A.; Nguyen, Van-Thanh-Van
2016-04-01
The stationarity of annual flood peak records is the traditional assumption of flood frequency analysis. In some cases, however, as a result of land-use and/or climate change, this assumption is no longer valid. Therefore, new statistical models are needed to capture dynamically the change of probability density functions over time, in order to obtain reliable flood estimation. In this study, an innovative method for nonstationary flood frequency analysis was presented. The new method is based on detrending the flood series and applying the L-moments along with the GEV distribution to the transformed "stationary" series (hereafter, this is called the LM-NS). The LM-NS method was assessed through a comparative study with the maximum likelihood (ML) method for the nonstationary GEV model, as well as with the stationary (S) GEV model. The comparative study, based on Monte Carlo simulations, was carried out for three nonstationary GEV models: a linear dependence of the mean on time (GEV1), a quadratic dependence of the mean on time (GEV2), and linear dependence of both the mean and log standard deviation on time (GEV11). The simulation results indicated that the LM-NS method performs better than the ML method for most of the cases studied, whereas the stationary method provides the least accurate results. An additional advantage of the LM-NS method is to avoid the numerical problems (e.g., convergence problems) that may occur with the ML method when estimating parameters for small data samples.
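The L-moments/GEV fitting step of the LM-NS method (applied after the series has been detrended) can be sketched as follows, using Hosking's well-known approximation for the GEV shape parameter. The Gumbel test sample at the end is synthetic, built from plotting positions so the fitted shape should come out near zero.

```python
import numpy as np
from math import gamma, log

def sample_lmoments(x):
    """First three sample L-moments (l1, l2, t3) via unbiased
    probability-weighted moments."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    j = np.arange(n)
    b0 = x.mean()
    b1 = np.sum(j * x) / (n * (n - 1))
    b2 = np.sum(j * (j - 1) * x) / (n * (n - 1) * (n - 2))
    l1, l2 = b0, 2.0 * b1 - b0
    l3 = 6.0 * b2 - 6.0 * b1 + b0
    return l1, l2, l3 / l2

def gev_from_lmoments(x):
    """GEV parameters (mu, sigma, k in Hosking's sign convention) from
    sample L-moments, via Hosking's rational approximation for k."""
    l1, l2, t3 = sample_lmoments(x)
    c = 2.0 / (3.0 + t3) - log(2.0) / log(3.0)
    k = 7.8590 * c + 2.9554 * c**2
    sigma = l2 * k / ((1.0 - 2.0**(-k)) * gamma(1.0 + k))
    mu = l1 - sigma * (1.0 - gamma(1.0 + k)) / k
    return mu, sigma, k

# standard Gumbel sample built from plotting positions (k should be ~0,
# sigma ~1, mu ~0)
u = (np.arange(1, 1001) - 0.35) / 1000.0
x = -np.log(-np.log(u))
mu, sigma, k = gev_from_lmoments(x)
```

In the LM-NS setting, `x` would be the detrended annual maxima; the fitted time trend is added back when quantiles for a given year are required.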
Estimated flood-inundation maps for Cowskin Creek in western Wichita, Kansas
Studley, Seth E.
2003-01-01
The October 31, 1998, flood on Cowskin Creek in western Wichita, Kansas, caused millions of dollars in damages. Emergency management personnel and flood mitigation teams had difficulty in efficiently identifying areas affected by the flooding, and no warning was given to residents because flood-inundation information was not available. To provide detailed information about future flooding on Cowskin Creek, high-resolution estimated flood-inundation maps were developed using geographic information system technology and advanced hydraulic analysis. Two-foot-interval land-surface elevation data from a 1996 flood insurance study were used to create a three-dimensional topographic representation of the study area for hydraulic analysis. The data computed from the hydraulic analyses were converted into geographic information system format with software from the U.S. Army Corps of Engineers' Hydrologic Engineering Center. The results were overlaid on the three-dimensional topographic representation of the study area to produce maps of estimated flood-inundation areas and estimated depths of water in the inundated areas for 1-foot increments on the basis of stream stage at an index streamflow-gaging station. A Web site (http://ks.water.usgs.gov/Kansas/cowskin.floodwatch) was developed to provide the public with information pertaining to flooding in the study area. The Web site shows graphs of the real-time streamflow data for U.S. Geological Survey gaging stations in the area and monitors the National Weather Service Arkansas-Red Basin River Forecast Center for Cowskin Creek flood-forecast information. When a flood is forecast for the Cowskin Creek Basin, an estimated flood-inundation map is displayed for the stream stage closest to the National Weather Service's forecasted peak stage. Users of the Web site are able to view the estimated flood-inundation maps for selected stages at any time and to access information about this report and about flooding in general. Flood
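The map-selection logic described for the Web site, choosing the pre-computed 1-foot-increment inundation map closest to the forecasted peak stage, reduces to a nearest-value lookup. The function name and stage values below are assumptions for illustration, not part of the USGS implementation.

```python
def select_inundation_map(forecast_stage_ft, available_stages_ft):
    """Pick the pre-computed 1-foot-increment inundation map whose index
    stage is closest to the forecasted peak stage (illustrative sketch of
    the behavior described in the report)."""
    return min(available_stages_ft, key=lambda s: abs(s - forecast_stage_ft))

maps = list(range(10, 26))                    # maps prepared for stages 10-25 ft
chosen = select_inundation_map(18.4, maps)    # nearest prepared stage is 18 ft
```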
Roland, Mark A.; Stuckey, Marla H.
2007-01-01
The Delaware and North Branch Susquehanna River Basins in Pennsylvania experienced severe flooding as a result of intense rainfall during June 2006. The height of the flood waters on the rivers and tributaries approached or exceeded the peak of record at many locations. Updated flood-magnitude and flood-frequency data for streamflow-gaging stations on tributaries in the Delaware and North Branch Susquehanna River Basins were analyzed using data through the 2006 water year to determine if there were any major differences in the flood-discharge data. Flood frequencies for return intervals of 2, 5, 10, 50, 100, and 500 years (Q2, Q5, Q10, Q50, Q100, and Q500) were determined from annual maximum series (AMS) data from continuous-record gaging stations (stations) and were compared to flood discharges obtained from previously published Flood Insurance Studies (FIS) and to flood frequencies using partial-duration series (PDS) data. A Wilcoxon signed-rank test was performed to determine any statistically significant differences between flood frequencies computed from updated AMS station data and those obtained from FIS. Percentage differences between flood frequencies computed from updated AMS station data and those obtained from FIS also were determined for the 10, 50, 100, and 500 return intervals. A Mann-Kendall trend test was performed to determine statistically significant trends in the updated AMS peak-flow data for the period of record at the 41 stations. In addition to AMS station data, PDS data were used to determine flood-frequency discharges. The AMS and PDS flood-frequency data were compared to determine any differences between the two data sets. An analysis also was performed on AMS-derived flood frequencies for four stations to evaluate the possible effects of flood-control reservoirs on peak flows. Additionally, flood frequencies for three stations were evaluated to determine possible effects of urbanization on peak flows. The results of the Wilcoxon signed
Improving Flood Predictions in Data-Scarce Basins
Vimal, Solomon; Zanardo, Stefano; Rafique, Farhat; Hilberts, Arno
2017-04-01
Flood modeling methodology at Risk Management Solutions Ltd. has evolved over several years with the development of continental scale flood risk models spanning most of Europe, the United States and Japan. Pluvial (rain fed) and fluvial (river fed) flood maps represent the basis for the assessment of regional flood risk. These maps are derived by solving the 1D energy balance equation for river routing and 2D shallow water equation (SWE) for overland flow. The models are run with high performance computing and GPU based solvers as the time taken for simulation is large in such continental scale modeling. These results are validated with data from authorities and business partners, and have been used in the insurance industry for many years. While this methodology has been proven extremely effective in regions where the quality and availability of data are high, its application is very challenging in other regions where data are scarce. This is generally the case for low and middle income countries, where simpler approaches are needed for flood risk modeling and assessment. In this study we explore new methods to make use of modeling results obtained in data-rich contexts to improve predictive ability in data-scarce contexts. As an example, based on our modeled flood maps in data-rich countries, we identify statistical relationships between flood characteristics and topographic and climatic indicators, and test their generalization across physical domains. Moreover, we apply the Height Above Nearest Drainage (HAND) approach to estimate "probable" saturated areas for different return period flood events as functions of basin characteristics. This work falls into the well-established research field of Predictions in Ungauged Basins.
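The HAND idea referred to above can be illustrated on a toy grid. The sketch below is a deliberate simplification, not the authors' implementation: it takes the Euclidean-nearest drainage cell, whereas operational HAND traces flow paths along the drainage network.

```python
import numpy as np

def hand_simplified(dem, drainage_mask):
    """Height Above Nearest Drainage for a small elevation grid.

    Simplified sketch: 'nearest' is the Euclidean-nearest drainage cell,
    whereas the full HAND method follows flow paths to the drainage
    network; this version is fine for illustration on a toy grid only.
    """
    rows, cols = np.indices(dem.shape)
    dr, dc = np.nonzero(drainage_mask)          # coordinates of channel cells
    # squared distance from every grid cell to every drainage cell
    d2 = (rows[..., None] - dr) ** 2 + (cols[..., None] - dc) ** 2
    nearest = np.argmin(d2, axis=-1)            # index of nearest channel cell
    return dem - dem[dr[nearest], dc[nearest]]  # elevation above that cell

# Toy 3x3 DEM (m); cells at or below 1 m are treated as the channel.
dem = np.array([[5.0, 4.0, 3.0],
                [4.0, 2.0, 1.0],
                [6.0, 3.0, 1.0]])
drainage = dem <= 1.0
hand = hand_simplified(dem, drainage)
```

A flood stage of h metres above the channel then inundates the cells with HAND ≤ h, which is how incremental flood-event layers can be assembled.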
Flood Finder: Mobile-based automated water level estimation and mapping during floods
International Nuclear Information System (INIS)
Pongsiriyaporn, B; Jariyavajee, C; Laoharawee, N; Narkthong, N; Pitichat, T; Goldin, S E
2014-01-01
Every year, Southeast Asia faces numerous flooding disasters, resulting in very high human and economic loss. Responding to a sudden flood is difficult due to the lack of accurate and up-to- date information about the incoming water status. We have developed a mobile application called Flood Finder to solve this problem. Flood Finder allows smartphone users to measure, share and search for water level information at specified locations. The application uses image processing to compute the water level from a photo taken by users. The photo must be of a known reference object with a standard size. These water levels are more reliable and consistent than human estimates since they are derived from an algorithmic measuring function. Flood Finder uploads water level readings to the server, where they can be searched and mapped by other users via the mobile phone app or standard browsers. Given the widespread availability of smartphones in Asia, Flood Finder can provide more accurate and up-to-date information for better preparation for a flood disaster as well as life safety and property protection
LiDAR and IFSAR-Based Flood Inundation Model Estimates for Flood-Prone Areas of Afghanistan
Johnson, W. C.; Goldade, M. M.; Kastens, J.; Dobbs, K. E.; Macpherson, G. L.
2014-12-01
Extreme flood events are not unusual in semi-arid to hyper-arid regions of the world, and Afghanistan is no exception. Recent flashfloods and flashflood-induced landslides took nearly 100 lives and destroyed or damaged nearly 2000 homes in 12 villages within Guzargah-e-Nur district of Baghlan province in northeastern Afghanistan. With available satellite imagery, flood-water inundation estimation can be accomplished remotely, thereby providing a means to reduce the impact of such flood events by improving shared situational awareness during major flood events. Satellite orbital considerations, weather, cost, data licensing restrictions, and other issues can often complicate the acquisition of appropriately timed imagery. Given the need for tools to supplement imagery where not available, complement imagery when it is available, and bridge the gap between imagery based flood mapping and traditional hydrodynamic modeling approaches, we have developed a topographic floodplain model (FLDPLN), which has been used to identify and map river valley floodplains with elevation data ranging from 90-m SRTM to 1-m LiDAR. Floodplain "depth to flood" (DTF) databases generated by FLDPLN are completely seamless and modular. FLDPLN has been applied in Afghanistan to flood-prone areas along the northern and southern flanks of the Hindu Kush mountain range to generate a continuum of 1-m increment flood-event models up to 10 m in depth. Elevation data used in this application of FLDPLN included high-resolution, drone-acquired LiDAR (~1 m) and IFSAR (5 m; INTERMAP). Validation of the model has been accomplished using the best available satellite-derived flood inundation maps, such as those issued by Unitar's Operational Satellite Applications Programme (UNOSAT). Results provide a quantitative approach to evaluating the potential risk to urban/village infrastructure as well as to irrigation systems, agricultural fields and archaeological sites.
Estimation of antecedent wetness conditions for flood modelling in northern Morocco
Directory of Open Access Journals (Sweden)
Y. Tramblay
2012-11-01
Most of Morocco's dams and reservoirs are located in the north of the country, a region affected by severe rainfall events that cause floods. To improve the management of the water regulation structures, there is a need to develop rainfall-runoff models that both maximize the storage capacity and reduce the risks caused by floods. In this study, a model is developed to reproduce the flood events for a 655 km² catchment located upstream of the 6th largest dam in Morocco. Constrained by data availability, a standard event-based model combining an SCS-CN (Soil Conservation Service Curve Number) loss model and a Clark unit hydrograph was developed for hourly discharge simulation using 16 flood events that occurred between 1984 and 2008. The model was found to reproduce satisfactorily the runoff and the temporal evolution of floods, even with limited rainfall data. Several antecedent wetness condition estimators for the catchment were compared with the initial condition of the model. These estimators include an antecedent discharge index, an antecedent precipitation index and a continuous daily soil moisture accounting (SMA) model based on precipitation and evapotranspiration. The SMA model performed best in estimating the initial conditions of the event-based hydrological model (R² = 0.9). Its daily output has been compared with ASCAT and AMSR-E remote sensing data products, which were both able to reproduce accurately the daily simulated soil moisture dynamics at the catchment scale. The same approach could be implemented in other catchments of this region for operational purposes. The results of this study suggest that remote sensing data are potentially useful for estimating soil moisture conditions in ungauged catchments in northern Africa.
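The SCS-CN loss model used in the event-based model above reduces to a single closed-form expression for event runoff. A minimal sketch, assuming the common initial-abstraction ratio Ia = 0.2S and a purely illustrative curve number:

```python
def scs_cn_runoff(p_mm, cn):
    """Direct runoff depth (mm) from event rainfall P (mm) via SCS-CN.

    Assumes the common initial-abstraction ratio Ia = 0.2*S; the curve
    number passed in examples is illustrative, not a value from the study.
    """
    s = 25400.0 / cn - 254.0          # potential maximum retention (mm)
    ia = 0.2 * s                      # initial abstraction (mm)
    if p_mm <= ia:
        return 0.0                    # rainfall fully absorbed, no runoff
    return (p_mm - ia) ** 2 / (p_mm - ia + s)
```

For example, an 80 mm storm on a CN = 75 catchment yields roughly 27 mm of direct runoff, which the Clark unit hydrograph would then distribute in time.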
An improved Q estimation approach: the weighted centroid frequency shift method
Li, Jingnan; Wang, Shangxu; Yang, Dengfeng; Dong, Chunhui; Tao, Yonghui; Zhou, Yatao
2016-06-01
Seismic wave propagation in subsurface media suffers from absorption, which can be quantified by the quality factor Q. Accurate estimation of the Q factor is of great importance for the resolution enhancement of seismic data, precise imaging and interpretation, and reservoir prediction and characterization. The centroid frequency shift method (CFS) is currently one of the most commonly used Q estimation methods. However, for seismic data that contain noise, the accuracy and stability of Q extracted using CFS depend on the choice of frequency band. In order to reduce the influence of frequency band choices and obtain Q with greater precision and robustness, we present an improved CFS Q measurement approach—the weighted CFS method (WCFS), which incorporates a Gaussian weighting coefficient into the calculation procedure of the conventional CFS. The basic idea is to enhance the proportion of advantageous frequencies in the amplitude spectrum and reduce the weight of disadvantageous frequencies. In this novel method, we first construct a Gaussian function using the centroid frequency and variance of the reference wavelet. Then we employ it as the weighting coefficient for the amplitude spectrum of the original signal. Finally, the conventional CFS is adopted for the weighted amplitude spectrum to extract the Q factor. Numerical tests of noise-free synthetic data demonstrate that the WCFS is feasible and efficient, and produces more accurate results than the conventional CFS. Tests for noisy synthetic data indicate that the new method has better anti-noise capability than the CFS. The application to field vertical seismic profile (VSP) data further demonstrates its validity.
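The three steps of the method (construct a Gaussian from the reference wavelet's centroid frequency and variance, weight the amplitude spectrum, apply the conventional CFS) can be sketched as follows. This is one plausible reading of the procedure, not the authors' code; here the weight is applied to both spectra so that the standard formula Q = pi*t*var_ref/(fc_ref - fc_rec) remains consistent for Gaussian-shaped spectra.

```python
import numpy as np

def centroid_var(freqs, amp):
    """Centroid frequency and variance of an amplitude spectrum."""
    w = amp / np.sum(amp)
    fc = np.sum(freqs * w)
    return fc, np.sum((freqs - fc) ** 2 * w)

def q_weighted_cfs(freqs, amp_ref, amp_rec, travel_time):
    """Q from the centroid shift between weighted reference/received spectra."""
    fc0, var0 = centroid_var(freqs, amp_ref)
    # Gaussian weight built from the reference centroid and variance
    weight = np.exp(-((freqs - fc0) ** 2) / (2.0 * var0))
    # conventional CFS applied to the weighted amplitude spectra
    fc_ref, var_ref = centroid_var(freqs, amp_ref * weight)
    fc_rec, _ = centroid_var(freqs, amp_rec * weight)
    return np.pi * travel_time * var_ref / (fc_ref - fc_rec)
```

On a noise-free synthetic Gaussian spectrum attenuated by exp(-pi*f*t/Q), this recovers Q closely; the weighting pays off mainly when band-edge noise would otherwise bias the centroids.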
Kelly, Brian P.; Huizinga, Richard J.
2008-01-01
In the interest of improved public safety during flooding, the U.S. Geological Survey, in cooperation with the city of Kansas City, Missouri, completed a flood-inundation study of the Blue River in Kansas City, Missouri, from the U.S. Geological Survey streamflow gage at Kenneth Road to 63rd Street, of Indian Creek from the Kansas-Missouri border to its mouth, and of Dyke Branch from the Kansas-Missouri border to its mouth, to determine the estimated extent of flood inundation at selected flood stages on the Blue River, Indian Creek, and Dyke Branch. The results of this study spatially interpolate information provided by U.S. Geological Survey gages, Kansas City Automated Local Evaluation in Real Time gages, and the National Weather Service flood-peak prediction service that comprise the Blue River flood-alert system and are a valuable tool for public officials and residents to minimize flood deaths and damage in Kansas City. To provide public access to the information presented in this report, a World Wide Web site (http://mo.water.usgs.gov/indep/kelly/blueriver) was created that displays the results of two-dimensional modeling between Hickman Mills Drive and 63rd Street, estimated flood-inundation maps for 13 flood stages, the latest gage heights, and National Weather Service stage forecasts for each forecast location within the study area. The results of a previous study of flood inundation on the Blue River from 63rd Street to the mouth also are available. In addition the full text of this report, all tables and maps are available for download (http://pubs.usgs.gov/sir/2008/5068). Thirteen flood-inundation maps were produced at 2-foot intervals for water-surface elevations from 763.8 to 787.8 feet referenced to the Blue River at the 63rd Street Automated Local Evaluation in Real Time stream gage operated by the city of Kansas City, Missouri. Each map is associated with gages at Kenneth Road, Blue Ridge Boulevard, Kansas City (at Bannister Road), U.S. Highway 71
Freni, G; La Loggia, G; Notaro, V
2010-01-01
Due to the increased occurrence of flooding events in urban areas, many procedures for flood damage quantification have been defined in recent decades. The lack of large databases in most cases is overcome by combining the output of urban drainage models and damage curves linking flooding to expected damage. The application of advanced hydraulic models as diagnostic, design and decision-making support tools has become a standard practice in hydraulic research and application. Flooding damage functions are usually evaluated by a priori estimation of potential damage (based on the value of exposed goods) or by interpolating real damage data (recorded during historical flooding events). Hydraulic models have undergone continuous advancements, pushed forward by increasing computer capacity. The details of the flooding propagation process on the surface and of the interconnections between underground and surface drainage systems have been studied extensively in recent years, resulting in progressively more reliable models. The same level of advancement has not been reached with regard to damage curves, for which improvements are highly connected to data availability; this remains the main bottleneck in expected flooding damage estimation. Such functions are usually affected by significant uncertainty intrinsically related to the collected data and to the simplified structure of the adopted functional relationships. The present paper aimed to evaluate this uncertainty by comparing the intrinsic uncertainty connected to the construction of the damage-depth function to the hydraulic model uncertainty. In this way, the paper sought to evaluate the role of hydraulic model detail level in the wider context of flood damage estimation. This paper demonstrated that the use of detailed hydraulic models might not be justified because of the higher computational cost and the significant uncertainty in damage estimation curves. This uncertainty occurs mainly
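A damage-depth function of the kind discussed above is, in its simplest form, an interpolated curve mapping water depth to a damaged fraction of exposed value. The curve points below are hypothetical, chosen only to show the mechanics:

```python
import numpy as np

# Hypothetical damage-depth curve: flood depth (m) -> damaged fraction of
# the exposed value.  Real curves are fitted to recorded damage data and
# carry the uncertainty discussed in the paper.
depths = np.array([0.0, 0.5, 1.0, 2.0, 3.0])
damaged_fraction = np.array([0.0, 0.15, 0.30, 0.55, 0.70])

def flood_damage(depth_m, exposed_value):
    """Expected damage for a given water depth via linear interpolation."""
    return np.interp(depth_m, depths, damaged_fraction) * exposed_value
```

For instance, 1.5 m of water over a property worth 200,000 gives a damaged fraction of 0.425, i.e. 85,000; uncertainty in the curve points propagates linearly into this estimate.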
Real Time Estimation of the Calgary Floods Using Limited Remote Sensing Data
Directory of Open Access Journals (Sweden)
Emily Schnebele
2014-02-01
Every year, flood disasters are responsible for widespread destruction and loss of human life. Remote sensing data are capable of providing valuable, synoptic coverage of flood events but are not always available because of satellite revisit limitations, obstructions from cloud cover or vegetation canopy, or expense. In addition, knowledge of road accessibility is imperative during all phases of a flood event. In June 2013, the City of Calgary experienced sudden and extensive flooding but lacked comprehensive remote sensing coverage. Using this event as a case study, this work illustrates how data from non-authoritative sources can be used to augment traditional data and methods to estimate flood extent and identify affected roads during a flood disaster. The application of these data, which may have varying resolutions and uncertainties, provides an estimation of flood extent when traditional data and methods are lacking or incomplete. When flooding occurs over multiple days, it is possible to construct an estimate of the advancement and recession of the flood event. Non-authoritative sources also provide flood information at the micro-level, which can be difficult to capture from remote sensing data; however, the distribution and quantity of data collected from these sources will affect the quality of the flood estimations.
Future flood risk estimates along the river Rhine
Directory of Open Access Journals (Sweden)
A. H. te Linde
2011-02-01
In Europe, water management is moving from flood defence to a risk management approach, which takes both the probability and the potential consequences of flooding into account. It is expected that climate change and socio-economic development will lead to an increase in flood risk in the Rhine basin. To optimize spatial planning and flood management measures, studies are needed that quantify future flood risks and estimate their uncertainties. In this paper, we estimated the current fluvial flood risk and the future risk in 2030 for the entire Rhine basin in a scenario study. The change in value at risk is based on two land-use projections derived from a land-use model representing two different socio-economic scenarios. Potential damage was calculated by a damage model, and changes in flood probabilities were derived from two climate scenarios and hydrological modeling. We aggregated the results into seven sections along the Rhine. It was found that the annual expected damage in the Rhine basin may increase by between 54% and 230%, of which the major part (roughly three-quarters) can be accounted for by climate change. The highest current potential damage can be found in the Netherlands (110 billion €), compared with the second (80 billion €) and third (62 billion €) highest values in two areas in Germany. Results further show that the area with the highest fluvial flood risk is located in the Lower Rhine in Nordrhein-Westfalen in Germany, and not in the Netherlands, as is often perceived. This is mainly due to the higher flood protection standards in the Netherlands as compared to Germany.
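The annual expected damage reported above is, conceptually, the integral of damage over annual exceedance probability. A minimal sketch with hypothetical return periods and damages (not the study's figures):

```python
import numpy as np

# Hypothetical damages for floods of given return periods (years).
return_periods = np.array([10.0, 100.0, 1000.0])
damages_eur = np.array([0.0, 2.0e9, 8.0e9])

exceed_prob = 1.0 / return_periods           # annual exceedance probability
# Trapezoidal integration of damage over (decreasing) exceedance probability
ead = np.sum(0.5 * (damages_eur[:-1] + damages_eur[1:])
             * (exceed_prob[:-1] - exceed_prob[1:]))
```

With these illustrative numbers the expected annual damage is 135 million EUR per year; climate-driven changes in the probabilities and land-use-driven changes in the damages both propagate directly through this integral.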
Higher moments method for generalized Pareto distribution in flood frequency analysis
Zhou, C. R.; Chen, Y. F.; Huang, Q.; Gu, S. H.
2017-08-01
The generalized Pareto distribution (GPD) has proven to be an ideal distribution for fitting peaks-over-threshold series in flood frequency analysis. Several moments-based estimators are applied to estimate the parameters of the GPD. Higher linear moments (LH moments) and higher probability weighted moments (HPWM) are linear combinations of probability weighted moments (PWM). In this study, the relationship between them is explored, and a series of statistical experiments and a case study are used to compare their performances. The results show that if the same PWM are used in the LH moments and HPWM methods, the parameter estimated by these two methods is unbiased. In particular, when the same PWM are used, the PWM method (or the HPWM method when the order equals 0) gives parameter estimates identical to those of the linear moments (L-moments) method. Additionally, this phenomenon is significant when r ≥ 1 and the same-order PWM are used in the HPWM and LH moments methods.
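As an illustration of moments-based GPD fitting, the PWM estimators of Hosking and Wallis can be written in a few lines. This is a generic sketch, not the paper's higher-order variants; it uses the parametrisation F(x) = 1 - (1 - k*x/a)**(1/k), in which k = 0 is the exponential limit and k < 0 gives the heavy tail typical of flood peaks.

```python
import numpy as np

def gpd_pwm(sample):
    """PWM estimators for the generalized Pareto distribution.

    Hosking-Wallis parametrisation F(x) = 1 - (1 - k*x/a)**(1/k);
    returns (scale a, shape k).  k = 0 is the exponential limit.
    """
    x = np.sort(sample)                  # ascending order statistics
    n = len(x)
    a0 = np.mean(x)                      # M(1,0,0): the sample mean
    # unbiased estimator of M(1,0,1) = E[X * (1 - F(X))]
    a1 = np.sum(x * (n - np.arange(1, n + 1)) / (n - 1)) / n
    shape = a0 / (a0 - 2.0 * a1) - 2.0
    scale = 2.0 * a0 * a1 / (a0 - 2.0 * a1)
    return scale, shape
```

On an exponential sample (a GPD with k = 0) the estimators recover the scale and a near-zero shape, which is a convenient sanity check.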
Methods for design flood estimation in South Africa | Smithers ...
African Journals Online (AJOL)
The estimation of design floods is necessary for the design of hydraulic structures and to quantify the risk of failure of the structures. Most of the methods used for design flood estimation in South Africa were developed in the late 1960s and early 1970s and are in need of updating with more than 40 years of additional data ...
Mousa, Mustafa
2014-04-01
This article describes a machine learning approach to water level estimation in a dual ultrasonic/passive infrared urban flood sensor system. We first show that an ultrasonic rangefinder alone is unable to accurately measure the level of water on a road due to thermal effects. Using additional passive infrared sensors, we show that ground temperature and local sensor temperature measurements are sufficient to correct the rangefinder readings and improve the flood detection performance. Since floods occur very rarely, we use a supervised learning approach to estimate the correction to the ultrasonic rangefinder caused by temperature fluctuations. Preliminary data shows that water level can be estimated with an absolute error of less than 2 cm. © 2014 IEEE.
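The temperature correction described above can be emulated with a simple supervised model. The sketch below fits a linear model on synthetic data; the paper's actual learner and coefficients are not specified here, so the error model and all numbers are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic training data: a hypothetical linear error model standing in
# for whatever relationship the deployed sensors actually exhibit.
t_ground = rng.uniform(20.0, 55.0, 200)      # ground temperature (deg C)
t_sensor = rng.uniform(20.0, 50.0, 200)      # sensor temperature (deg C)
range_error_cm = (0.08 * t_ground - 0.03 * t_sensor + 1.0
                  + rng.normal(0.0, 0.05, 200))

# Least-squares fit of error ~ a*t_ground + b*t_sensor + c
X = np.column_stack([t_ground, t_sensor, np.ones_like(t_ground)])
coef, *_ = np.linalg.lstsq(X, range_error_cm, rcond=None)

def corrected_level(raw_cm, tg, ts):
    """Subtract the temperature-induced error predicted by the fit."""
    return raw_cm - (coef[0] * tg + coef[1] * ts + coef[2])
```

Since floods are rare, training on error data collected in dry conditions, as here, is exactly the supervised setting the abstract describes.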
Estimates of present and future flood risk in the conterminous United States
Wing, Oliver E. J.; Bates, Paul D.; Smith, Andrew M.; Sampson, Christopher C.; Johnson, Kris A.; Fargione, Joseph; Morefield, Philip
2018-03-01
Past attempts to estimate rainfall-driven flood risk across the US either have incomplete coverage, coarse resolution or use overly simplified models of the flooding process. In this paper, we use a new 30 m resolution model of the entire conterminous US with a 2D representation of flood physics to produce estimates of flood hazard, which match to within 90% accuracy the skill of local models built with detailed data. These flood depths are combined with exposure datasets of commensurate resolution to calculate current and future flood risk. Our data show that the total US population exposed to serious flooding is 2.6-3.1 times higher than previous estimates, and that nearly 41 million Americans live within the 1% annual exceedance probability floodplain (compared to only 13 million when calculated using FEMA flood maps). We find that population and GDP growth alone are expected to lead to significant future increases in exposure, and this change may be exacerbated in the future by climate change.
Estimation of Damage Costs Associated with Flood Events
Andrews, T. A.; Wauthier, C.; Zipp, K.
2017-12-01
This study investigates the possibility of creating a mathematical function that enables the estimation of flood-damage costs. We begin by examining the costs associated with past flood events in the United States. The data on these tropical storms and hurricanes are provided by the National Oceanic and Atmospheric Administration. With the location, extent of flooding, and damage reparation costs identified, we analyze variables such as: number of inches rained, land elevation, type of landscape, region development with regard to building density and infrastructure, and population concentration. We seek to identify the leading drivers of high flood-damage costs and understand which variables play a large role in the costliness of these weather events. Upon completion of our mathematical analysis, we turn our attention to the 2017 natural disaster in Texas. We divide the region, as above, by land elevation, type of landscape, region development with regard to building density and infrastructure, and population concentration. Then, we overlay the number of inches rained in those regions onto the divided landscape and apply our function. We hope to use these findings to estimate the potential flood-damage costs of Hurricane Harvey. This information is then transformed into a hazard map that could provide citizens and businesses of flood-stricken zones additional resources for their insurance selection process.
Risk to life due to flooding in post-Katrina New Orleans
Miller, A.; Jonkman, S. N.; Van Ledden, M.
2015-01-01
Since the catastrophic flooding of New Orleans due to Hurricane Katrina in 2005, the city's hurricane protection system has been improved to provide protection against a hurricane load with a 1/100 per year exceedance frequency. This paper investigates the risk to life in post-Katrina New Orleans. In a flood risk analysis the probabilities and consequences of various flood scenarios have been analyzed for the central area of the city (the metro bowl) to give a preliminary estimate of the risk to life in the post-Katrina situation. A two-dimensional hydrodynamic model has been used to simulate flood characteristics of various breaches. The model for estimation of fatality rates is based on the loss of life data for Hurricane Katrina. Results indicate that - depending on the flood scenario - the estimated loss of life in case of flooding ranges from about 100 to nearly 500, with the highest life loss due to breaching of the river levees leading to large flood depths. The probability and consequence estimates are combined to determine the individual risk and societal risk for New Orleans. When compared to risks of other large-scale engineering systems (e.g., other flood prone areas, dams and the nuclear sector) and acceptable risk criteria found in literature, the risks for the metro bowl are found to be relatively high. Thus, despite major improvements to the flood protection system, the flood risk to life of post-Katrina New Orleans is still expected to be significant. Indicative effects of reduction strategies on the risk level are discussed as a basis for further evaluation and discussion.
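Combining scenario probabilities and fatality estimates into risk measures, as done above, reduces to simple sums. The scenario numbers below are hypothetical, chosen only to mirror the order of magnitude quoted in the abstract:

```python
# Hypothetical breach scenarios: (annual probability, estimated fatalities).
scenarios = [
    (1.0e-3, 100),
    (5.0e-4, 250),
    (1.0e-4, 500),
]

# Expected annual loss of life (one ingredient of societal risk)
expected_loss = sum(p * n for p, n in scenarios)

def freq_exceeding(n_threshold):
    """F-N curve ordinate: annual frequency of >= n_threshold fatalities."""
    return sum(p for p, n in scenarios if n >= n_threshold)
```

Plotting freq_exceeding against N gives the F-N curve that is then compared with acceptable-risk criteria from the literature and with other large-scale engineering systems.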
Floods in the United States: Magnitude and frequency
Jarvis, Clarence S.; ,
1936-01-01
From time immemorial floods have transformed beneficent river waters into a menace to humanity. Man's progress toward economic stability has been repeatedly halted or even thrown backward by the interruption of his efforts to make effective use of rivers and of valley lands. This handicap is not imposed by the destructiveness of large rivers alone, or of rivers in widely separated areas, for there are few if any streams, brooks, or rivulets that are not subject to flows beyond their channel capacities. Yet, though man for ages has suffered seriously from recurring floods, he has not been deterred from continuing to extend his activities in areas that are virtually foredoomed to flood damage.Today in the United States serious floods may occur in any section in any year, and even, in some regions, several times a year. Many of these floods leave behind them the tragedy of death and disease and of property irreparably damaged. The aggregate direct property damage caused by floods in this country has been estimated roughly to average $35,000,000 a year. In addition there are serious indirect and intangible losses of great but not precisely calculable magnitude.
DEFF Research Database (Denmark)
Kjeldsen, Thomas Rødding; Smithers, J.C.; Schulze, R.E.
2002-01-01
A regional frequency analysis of annual maximum series (AMS) of flood flows from relatively unregulated rivers in the KwaZulu-Natal province of South Africa has been conducted, including identification of homogeneous regions and suitable regional frequency distributions for the regions. The study...
Combining Neural Networks with Existing Methods to Estimate 1 in 100-Year Flood Event Magnitudes
Newson, A.; See, L.
2005-12-01
Over the last fifteen years artificial neural networks (ANNs) have been shown to be advantageous for the solution of many hydrological modelling problems. The use of ANNs for flood magnitude estimation in ungauged catchments, however, is a relatively new and under-researched area. In this paper ANNs are used to make estimates of the magnitude of the 100-year flood event (Q100) for a number of ungauged catchments. The data used in this study were provided by the Centre for Ecology and Hydrology's Flood Estimation Handbook (FEH), which contains information on catchments across the UK. Sixteen catchment descriptors for 719 catchments were used to train an ANN; the data were split into training, validation and test sets. The goodness-of-fit statistics on the test data set indicated good model performance, with an r-squared value of 0.8 and a coefficient of efficiency of 79 percent. Data for twelve ungauged catchments were then put through the trained ANN to produce estimates of Q100. Two other accepted methodologies were also employed: the FEH statistical method and the FSR (Flood Studies Report) design storm technique, both of which are used to produce flood frequency estimates. The advantage of developing an ANN model is that it provides a third figure to aid a hydrologist in making an accurate estimate. For six of the twelve catchments, there was a relatively low spread between estimates. In these instances, an estimate of Q100 could be made with a fair degree of certainty. Of the remaining six catchments, three had areas greater than 1000 km², which means the FSR design storm estimate cannot be used. Armed with the ANN model and the FEH statistical method the hydrologist still has two possible estimates to consider. For these three catchments, the estimates were also fairly similar, providing additional confidence to the estimation. In summary, the findings of this study have shown that an accurate estimation of Q100 can be made using the catchment descriptors of
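A one-hidden-layer network of the kind used for Q100 estimation can be sketched in plain NumPy. The 16 descriptors and the target below are synthetic stand-ins (the FEH data are not reproduced here), so this illustrates only the training mechanics, not the paper's results:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-ins for 16 catchment descriptors and a Q100-like target.
X = rng.uniform(0.0, 1.0, (719, 16))
y = (X @ rng.uniform(0.5, 2.0, 16) / 16.0).reshape(-1, 1)

# One hidden layer with tanh activation, trained by batch gradient descent.
W1 = rng.normal(0.0, 0.5, (16, 8)); b1 = np.zeros(8)
W2 = rng.normal(0.0, 0.5, (8, 1));  b2 = np.zeros(1)
lr, losses = 0.05, []
for _ in range(2000):
    h = np.tanh(X @ W1 + b1)          # forward pass
    err = (h @ W2 + b2) - y
    losses.append(float(np.mean(err ** 2)))
    g = 2.0 * err / len(X)            # gradient of MSE w.r.t. predictions
    gh = (g @ W2.T) * (1.0 - h ** 2)  # backprop through tanh
    W2 -= lr * (h.T @ g);  b2 -= lr * g.sum(0)
    W1 -= lr * (X.T @ gh); b1 -= lr * gh.sum(0)
```

A network like this, once trained on real catchment descriptors, supplies the "third figure" alongside the FEH statistical and FSR design storm estimates.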
Improved estimation of leak location of pipelines using frequency band variation
Energy Technology Data Exchange (ETDEWEB)
Lee, Young Sup [Embedded System Engineering Department, Incheon National University, Incheon (Korea, Republic of); Yoon, Dong Jin [Safety Measurement Center, Korea Research Institute of Standards and Science, Daejeon (Korea, Republic of)
2014-02-15
Leakage is an important factor to be considered for the management of underground water supply pipelines in a smart water grid system, especially if the pipelines are aged and buried under the pavement or various structures of a highly populated city. Because the exact detection of the location of such leaks in pipelines is essential for their efficient operation, a new methodology for leak location detection based on frequency band variation, windowing filters, and probability is proposed in this paper. Because the exact detection of the leak location depends on the precision of estimation of time delay between sensor signals due to leak noise, some window functions that offer weightings at significant frequencies are applied for calculating the improved cross-correlation function. Experimental results obtained by applying this methodology to an actual buried water supply pipeline, ∼ 253.9 m long and made of cast iron, revealed that the approach of frequency band variation with those windows and probability offers better performance for leak location detection.
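The baseline calculation behind this approach, locating the leak from the cross-correlation time delay between two sensors, can be sketched as follows. The frequency band variation and window weighting proposed in the paper refine the delay estimate; the spacing, wave speed, and signals used in testing are synthetic.

```python
import numpy as np

def leak_location(sig1, sig2, fs, spacing_m, wave_speed):
    """Distance (m) of a leak from sensor 1 via cross-correlation.

    Baseline method only: the paper additionally varies the analysis
    frequency band and applies window weightings before evaluating the
    improved cross-correlation.
    """
    n = len(sig1)
    xcorr = np.correlate(sig1, sig2, mode="full")
    lag = int(np.argmax(xcorr)) - (n - 1)   # >0 means leak noise reached sensor 2 first
    tau = lag / fs                          # time delay (s): t1 - t2
    # tau = (d1 - d2)/v and d1 + d2 = spacing  =>  d1 = (spacing + v*tau)/2
    return 0.5 * (spacing_m + wave_speed * tau)
```

With a known propagation speed in the pipe wall and fluid, the precision of the delay estimate translates directly into location precision, which is why the weighting of significant frequencies matters.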
Longo, Elisa; Tito Aronica, Giuseppe; Di Baldassarre, Giuliano; Mukolwe, Micah
2015-04-01
Flooding is one of the most impactful natural hazards. In particular, the damage data for natural hazards in Europe collected in the International Disaster Database (EM-DAT) show a significant increase over the past four decades in both the frequency of floods and the associated economic damages. Similarly dramatic trends are found by analyzing other types of flood losses, such as the number of people affected by floods, made homeless, injured or killed. To deal with this increase in flood risk, growing efforts are being made to promote integrated flood risk management; for instance, at the end of 2007 the European Community (EC) issued the Flood Directive (F.D.) 2007/60/EC. One of its major innovations is that the F.D. 2007/60/EC requires Member States to produce risk maps and then take appropriate measures to reduce the evaluated risk. The main goal of this research was to estimate flood damage using a computer code based on a recently developed method (KULTURisk, www.kulturisk.eu) and to compare the estimated damage with the observed damage. The study area was the municipality of Eilenburg, which was subjected to a destructive flood event in 2002. Flood damage maps were produced with the new procedure (KULTURisk) and the estimates were compared with observed data. This study showed the possibility of extending the lessons learned from the Eilenburg case study to other similar contexts. The outcomes of this test provided interesting insights into flood risk mapping, which are expected to contribute to raising awareness of flooding issues, to planning (structural and/or non-structural) flood risk reduction measures, and to supporting better land-use and urban planning.
Estimation of phosphorus flux in rivers during flooding.
Chen, Yen-Chang; Liu, Jih-Hung; Kuo, Jan-Tai; Lin, Cheng-Fang
2013-07-01
Reservoirs in Taiwan are inundated with nutrients that cause algal growth and thus reservoir eutrophication. Controlling the phosphorus load has always been the most crucial issue for maintaining reservoir water quality. Numerous agricultural activities, especially tea production in riparian areas, are conducted in watersheds in Taiwan. Nutrients from such activities, including phosphorus, are typically flushed into rivers during flooding, when over 90% of the yearly total amount of phosphorus enters reservoirs. Excessive or enhanced soil erosion from rainstorms can dramatically increase the river sediment load and the amount of particulate phosphorus flushed into rivers. When flow rates are high, particulate phosphorus is the dominant form of phosphorus, but sediment and discharge measurements are difficult during flooding, which makes estimating phosphorus flux in rivers difficult. This study determines the total amount of phosphorus transported by measuring flood discharge and phosphorus levels during flooding. Changes in particulate phosphorus, dissolved phosphorus, and their adsorption behavior during a 24-h period are analyzed, because particulate phosphorus adsorption and desorption take about 16 h to approach equilibrium. Suspended solids in the river, produced by erosion of the reservoir watershed, adsorb and desorb phosphorus, a process which can be summarily described using the Langmuir isotherm. A method for estimating the phosphorus flux in the Daiyujay Creek during Typhoon Bilis in 2006 is presented in this study. Both sediment and phosphorus are affected by the drastic changes in discharge during flooding. Water quality data were collected during two flood events, a flood on June 9, 2006 and Typhoon Bilis, to show that the concentrations of suspended solids and total phosphorus during floods are much higher than at normal stages. Therefore, the drastic changes of total phosphorus, particulate phosphorus, and dissolved phosphorus in
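The Langmuir isotherm mentioned above is simple to state and fit. The sketch below uses made-up parameter values (`q_max`, `k`) purely for illustration; it is not the paper's calibration.

```python
import numpy as np

def langmuir(c, q_max, k):
    """Langmuir isotherm: adsorbed phosphorus per unit sediment (q)
    as a function of dissolved concentration (c)."""
    return q_max * k * c / (1.0 + k * c)

# Recover the parameters from (c, q) pairs with the common
# linearisation  c/q = c/q_max + 1/(k*q_max).
c = np.linspace(0.1, 5.0, 20)          # mg/L dissolved phosphorus (synthetic)
q = langmuir(c, q_max=2.5, k=1.8)      # mg/g on suspended sediment
slope, intercept = np.polyfit(c, c / q, 1)
q_max_fit = 1.0 / slope                # recovers 2.5
k_fit = slope / intercept              # recovers 1.8
```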
Challenges estimating the return period of extreme floods for reinsurance applications
Raven, Emma; Busby, Kathryn; Liu, Ye
2013-04-01
Mapping and modelling extreme natural events is fundamental within the insurance and reinsurance industry for assessing risk. For example, insurers might use a 1-in-100-year flood hazard map to set the annual premium of a property, whilst a reinsurer might assess the national-scale loss associated with the 1-in-200-year return period for capital and regulatory requirements. Using examples from a range of international flood projects, we focus on exploring how to define what the n-year flood looks like for predictive uses in re/insurance applications, whilst considering challenges posed by short historical flow records and the spatial and temporal complexities of flooding. First, we explore the use of extreme value theory (EVT) statistics for extrapolating data beyond the range of observations in a marginal analysis. In particular, we discuss how to estimate the return period of historical flood events and explore the impact that a range of statistical decisions have on these estimates. Decisions include: (1) selecting which distribution type to apply (e.g. generalised Pareto distribution (GPD) vs. generalised extreme value distribution (GEV)); (2) if the former, the choice of the threshold above which the GPD is fitted to the data; and (3) the necessity to perform a cluster analysis to group flow peaks so that they temporally represent individual flood events. Second, we summarise a specialised multivariate extreme value model, which combines the marginal analysis above with dependence modelling to generate industry-standard event sets containing thousands of simulated, equi-probable floods across a region/country. These events represent the typical range of anticipated flooding across a region and can be used to estimate the largest or most widespread events that are expected to occur. Finally, we summarise how a reinsurance catastrophe model combines the event set with detailed flood hazard maps to estimate the financial cost of floods; both the full event set and also
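Decision (2), fitting a GPD above a chosen threshold and converting the fit into return periods, can be sketched as follows. The flow series and threshold choice are stand-ins, and declustering of dependent peaks (decision (3)) is skipped:

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(42)
years = 50
flows = rng.gumbel(loc=300.0, scale=80.0, size=years * 365)  # synthetic daily flows

# Threshold and excesses (a real analysis would decluster peaks first).
u = np.quantile(flows, 0.995)
excess = flows[flows > u] - u
rate = len(excess) / years                     # exceedances per year

# Fit the GPD to the excesses, with the location fixed at zero.
shape, _, scale = genpareto.fit(excess, floc=0.0)

# Return period of a flow x:  T = 1 / (rate * P(excess > x - u))
x = u + 250.0
T = 1.0 / (rate * genpareto.sf(x - u, shape, loc=0.0, scale=scale))
print(f"threshold {u:.0f}; a flow of {x:.0f} is roughly a {T:.0f}-year event")
```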
Estimating floodwater depths from flood inundation maps and topography
Cohen, Sagy; Brakenridge, G. Robert; Kettner, Albert; Bates, Bradford; Nelson, Jonathan M.; McDonald, Richard R.; Huang, Yu-Fen; Munasinghe, Dinuke; Zhang, Jiaqi
2018-01-01
Information on flood inundation extent is important for understanding societal exposure, water storage volumes, flood wave attenuation, future flood hazard, and other variables. A number of organizations now provide flood inundation maps based on satellite remote sensing. These data products can efficiently and accurately provide the areal extent of a flood event, but do not provide floodwater depth, an important attribute for first responders and damage assessment. Here we present a new methodology and a GIS-based tool, the Floodwater Depth Estimation Tool (FwDET), for estimating floodwater depth based solely on an inundation map and a digital elevation model (DEM). We compare the FwDET results against water depth maps derived from hydraulic simulation of two flood events, a large-scale event for which we use medium resolution input layer (10 m) and a small-scale event for which we use a high-resolution (LiDAR; 1 m) input. Further testing is performed for two inundation maps with a number of challenging features that include a narrow valley, a large reservoir, and an urban setting. The results show FwDET can accurately calculate floodwater depth for diverse flooding scenarios but also leads to considerable bias in locations where the inundation extent does not align well with the DEM. In these locations, manual adjustment or higher spatial resolution input is required.
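The core idea of FwDET, taking the water level at each wet cell from the ground elevation of the nearest flood-boundary cell and subtracting the local DEM, can be sketched on a toy grid. This is a simplified reconstruction (brute-force nearest neighbour, 4-connectivity); the published tool adds boundary smoothing and GIS handling not shown here.

```python
import numpy as np

def floodwater_depth(dem, inundated):
    """FwDET-style sketch: each wet cell takes the ground elevation of its
    nearest flood-boundary cell as the local water level; depth is that
    water level minus the DEM, clipped at zero."""
    rows, cols = dem.shape
    wet = np.argwhere(inundated)
    # Boundary cells: wet cells with at least one dry in-grid 4-neighbour.
    boundary = []
    for r, c in wet:
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            rr, cc = r + dr, c + dc
            if 0 <= rr < rows and 0 <= cc < cols and not inundated[rr, cc]:
                boundary.append((r, c))
                break
    boundary = np.array(boundary)
    depth = np.zeros_like(dem)
    for r, c in wet:
        d2 = (boundary[:, 0] - r) ** 2 + (boundary[:, 1] - c) ** 2
        water_level = dem[tuple(boundary[np.argmin(d2)])]
        depth[r, c] = max(water_level - dem[r, c], 0.0)
    return depth

# A valley deepening toward the middle column, flooded in the centre.
dem = np.array([[5., 4., 3., 4., 5.]] * 3)
wet_mask = dem <= 4.0
print(floodwater_depth(dem, wet_mask))   # 1 m depth in the valley floor
```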
Filipova, Valeriya; Lawrence, Deborah; Klempe, Harald
2018-02-01
Applying copula-based bivariate flood frequency analysis is advantageous because the results provide information on both the flood peak and volume. More data are, however, required for such an analysis, and it is often the case that only data series with a limited record length are available. To overcome this issue of limited record length, data regarding climatic and geomorphological properties can be used to complement statistical methods. In this paper, we present a study of 27 catchments located throughout Norway, in which we assess whether catchment properties, flood generation processes and flood regime have an effect on the correlation between flood peak and volume and, in turn, on the selection of copulas. To achieve this, the annual maximum flood events were first classified into events generated primarily by rainfall, snowmelt or a combination of these. The catchments were then classified into flood regime, depending on the predominant flood generation process producing the annual maximum flood events. A contingency table and Fisher's exact test were used to determine the factors that affect the selection of copulas in the study area. The results show that the two-parameter copulas BB1 and BB7 are more commonly selected in catchments with high steepness, high mean annual runoff and rainfall flood regime. These findings suggest that in these types of catchments, the dependence structure between flood peak and volume is more complex and cannot be modeled effectively using a one-parameter copula. The results illustrate that by relating copula types to flood regime and catchment properties, additional information can be supplied for selecting copulas in catchments with limited data.
Aronica, G. T.; Candela, A.
2007-12-01
In this paper a Monte Carlo procedure for deriving frequency distributions of peak flows using a semi-distributed stochastic rainfall-runoff model is presented. The rainfall-runoff model used here is a very simple one with a limited number of parameters; it requires practically no calibration, making it a robust tool for catchments which are partially or poorly gauged. The procedure is based on three modules: a stochastic rainfall generator module, a hydrologic loss module and a flood routing module. In the rainfall generator module the rainfall storm, i.e. the maximum rainfall depth for a fixed duration, is assumed to follow the two-component extreme value (TCEV) distribution, whose parameters have been estimated at the regional scale for Sicily. The catchment response has been modelled by using the Soil Conservation Service Curve Number (SCS-CN) method, in a semi-distributed form, for the transformation of total rainfall to effective rainfall, and a simple form of the IUH for the flood routing. Here, the SCS-CN method is implemented in probabilistic form with respect to prior-to-storm conditions, allowing relaxation of the classical iso-frequency assumption between rainfall and peak flow. The procedure is tested on six practical case studies in which synthetic FFCs (flood frequency curves) were obtained starting from the distributions of the model variables by simulating 5000 flood events, combining 5000 values of total rainfall depth for the storm duration with AMC (antecedent moisture condition) values. The application of this procedure showed how the Monte Carlo simulation technique can reproduce the observed flood frequency curves with reasonable accuracy over a wide range of return periods using a simple and parsimonious approach, limited data input and no calibration of the rainfall-runoff model.
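The three-module chain (stochastic rainfall, SCS-CN loss, frequency curve) can be sketched as below. To stay self-contained, the regional TCEV rainfall model and the IUH routing are replaced by stand-ins (a Gumbel storm depth and a clipped-normal curve number), so the output is an effective-rainfall frequency curve rather than a peak-flow one:

```python
import numpy as np

def scs_runoff(p_mm, cn):
    """SCS-CN effective rainfall (mm) for storm depth p_mm and curve number cn."""
    s = 25400.0 / cn - 254.0               # potential retention, mm
    ia = 0.2 * s                           # initial abstraction
    return np.where(p_mm > ia, (p_mm - ia) ** 2 / (p_mm + 0.8 * s), 0.0)

rng = np.random.default_rng(1)
n = 5000                                   # simulated events, as in the paper
p = rng.gumbel(loc=60.0, scale=20.0, size=n)             # storm depth, mm (stand-in)
cn = np.clip(rng.normal(75.0, 8.0, size=n), 40.0, 98.0)  # probabilistic AMC/CN stand-in
q = scs_runoff(np.maximum(p, 0.0), cn)     # effective rainfall per event

# Empirical flood frequency curve via Weibull plotting positions.
q_sorted = np.sort(q)[::-1]
T = (n + 1.0) / np.arange(1, n + 1)        # return period of each ranked event
q100 = np.interp(100.0, T[::-1], q_sorted[::-1])   # ~100-year effective rainfall
print(f"100-year effective rainfall: {q100:.1f} mm")
```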
Kohn, Michael S.; Stevens, Michael R.; Mommandi, Amanullah; Khan, Aziz R.
2017-12-14
The U.S. Geological Survey (USGS), in cooperation with the Colorado Department of Transportation, determined the peak discharge, annual exceedance probability (flood frequency), and peak stage of two floods that took place on Big Cottonwood Creek at U.S. Highway 50 near Coaldale, Colorado (hereafter referred to as “Big Cottonwood Creek site”), on August 23, 2016, and on Fountain Creek below U.S. Highway 24 in Colorado Springs, Colorado (hereafter referred to as “Fountain Creek site”), on August 29, 2016. A one-dimensional hydraulic model was used to estimate the peak discharge. To define the flood frequency of each flood, peak-streamflow regional-regression equations or statistical analyses of USGS streamgage records were used to estimate annual exceedance probability of the peak discharge. A survey of the high-water mark profile was used to determine the peak stage, and the limitations and accuracy of each component also are presented in this report. Collection and computation of flood data, such as peak discharge, annual exceedance probability, and peak stage at structures critical to Colorado’s infrastructure are an important addition to the flood data collected annually by the USGS. The peak discharge of the August 23, 2016, flood at the Big Cottonwood Creek site was 917 cubic feet per second (ft3/s) with a measurement quality of poor (uncertainty plus or minus 25 percent or greater). The peak discharge of the August 29, 2016, flood at the Fountain Creek site was 5,970 ft3/s with a measurement quality of poor (uncertainty plus or minus 25 percent or greater). The August 23, 2016, flood at the Big Cottonwood Creek site had an annual exceedance probability of less than 0.01 (return period greater than the 100-year flood) and had an annual exceedance probability of greater than 0.005 (return period less than the 200-year flood). The August 23, 2016, flood event was caused by a precipitation event having an annual exceedance probability of 1.0 (return
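The annual exceedance probabilities quoted above convert directly to return periods, and to the chance of seeing such a flood over a structure's design life (standard relations, not from the report):

```python
def return_period(aep: float) -> float:
    """Return period (years) from annual exceedance probability."""
    return 1.0 / aep

def prob_in_n_years(aep: float, n: int) -> float:
    """Chance of at least one exceedance in n years."""
    return 1.0 - (1.0 - aep) ** n

# The Big Cottonwood Creek flood: AEP between 0.005 and 0.01.
print(return_period(0.01), return_period(0.005))      # 100.0 200.0
print(round(prob_in_n_years(0.01, 30), 3))            # ~0.26 over a 30-year span
```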
Application of Artificial Neural Networks for estimating index floods
Šimor, Viliam; Hlavčová, Kamila; Kohnová, Silvia; Szolgay, Ján
2012-12-01
This article presents an application of Artificial Neural Networks (ANNs) and multiple regression models for estimating the mean annual maximum discharge (index flood) at ungauged sites. Both approaches were tested for 145 small basins in Slovakia with areas ranging from 20 to 300 km2. Using an objective clustering method, the catchments were divided into ten homogeneous pooling groups; for each pooling group, mutually independent predictors (catchment characteristics) were selected for both models. The neural network was applied as a simple multilayer perceptron with one hidden layer and a back-propagation learning algorithm. The hyperbolic tangent was used as the activation function in the hidden layer. Estimation of index floods by the multiple regression models was based on deriving relationships between the index floods and the catchment predictors. The efficiency of both approaches was tested using the Nash-Sutcliffe and correlation coefficients. The results showed the comparable applicability of both models, with slightly better results for the index floods achieved using the ANN methodology.
Wardah, T.; Abu Bakar, S. H.; Bardossy, A.; Maznorizan, M.
2008-07-01
Frequent flash floods causing immense devastation in the Klang River Basin of Malaysia necessitate an improvement in the real-time forecasting systems being used. The use of meteorological satellite images in estimating rainfall has become an attractive option for improving the performance of flood forecasting-and-warning systems. In this study, a rainfall estimation algorithm using the infrared (IR) information from the Geostationary Meteorological Satellite-5 (GMS-5) is developed for potential input to a flood forecasting system. Data from the records of GMS-5 IR images have been retrieved for selected convective cells to be trained against the radar rain rate in a back-propagation neural network. The data selected as inputs to the neural network are five parameters having a significant correlation with the radar rain rate: namely, the cloud-top brightness temperature of the pixel of interest, the mean and the standard deviation of the temperatures of the surrounding five-by-five pixels, the rate of temperature change, and the Sobel operator that indicates the temperature gradient. In addition, three numerical weather prediction (NWP) products, namely the precipitable water content, relative humidity, and vertical wind, are also included as inputs. The algorithm is applied to areal rainfall estimation in the upper Klang River Basin and compared with another technique that uses a power-law regression between the cloud-top brightness temperature and the radar rain rate. Results from both techniques are validated against previously recorded Thiessen areal-averaged rainfall values, with correlation coefficient values of 0.77 and 0.91 for the power-law regression and the artificial neural network (ANN) technique, respectively. An extra lead time of around 2 h is gained when the satellite-based ANN rainfall estimation is coupled with a rainfall-runoff model to forecast a flash-flood event in the upper Klang River Basin.
International Nuclear Information System (INIS)
Kohut, P.
1994-07-01
The major objective of the Surry internal flood analysis was to provide an improved understanding of the core damage scenarios arising from internal flood-related events. The mean core damage frequency of the Surry plant due to internal flood events during mid-loop operations is 4.8E-06 per year, and the 5th and 95th percentiles are 2.2E-07 and 1.8E-05 per year, respectively. Some limited sensitivity calculations were performed on three plant improvement options. The most significant result involves modifications of the intake-level structure on the canal, which reduced the core damage frequency contribution from floods during mid-loop operation by about 75%.
Peak flood estimation using gene expression programming
Zorn, Conrad R.; Shamseldin, Asaad Y.
2015-12-01
As a case study for the Auckland Region of New Zealand, this paper investigates the potential use of gene expression programming (GEP) in predicting specific return-period events, in comparison to the established and widely used Regional Flood Estimation (RFE) method. Initially calibrated to 14 gauged sites, the GEP-derived model was further validated against 10- and 100-year flood events, with relative errors of 29% and 18%, respectively. This compares to errors of 48% and 44% for the RFE method for the same flood events. While the effectiveness of GEP in predicting specific return-period events is made apparent, it is argued that the derived equations should be used in conjunction with existing methodologies rather than as a replacement.
Haberlandt, U.; Radtke, I.
2014-01-01
Derived flood frequency analysis allows the estimation of design floods with hydrological modeling for poorly observed basins, considering change and taking into account flood protection measures. There are several possible choices regarding precipitation input, discharge output and, consequently, the calibration of the model. The objective of this study is to compare different calibration strategies for a hydrological model considering various types of rainfall input and runoff output data sets, and to propose the most suitable approach. Event-based and continuous observed hourly rainfall data, as well as disaggregated daily rainfall and stochastically generated hourly rainfall data, are used as input for the model. As output, short hourly and longer daily continuous flow time series as well as probability distributions of annual maximum peak flow series are employed. The performance of the strategies is evaluated using the different model parameter sets obtained for continuous simulation of discharge in an independent validation period, and by comparing the model-derived flood frequency distributions with the observed one. The investigations are carried out for three mesoscale catchments in northern Germany with the hydrological model HEC-HMS (Hydrologic Engineering Center's Hydrologic Modeling System). The results show that (I) the same type of precipitation input data should be used for calibration and application of the hydrological model, (II) a model calibrated using a small sample of extreme values works quite well for the simulation of continuous time series with moderate length but not vice versa, and (III) the best performance with small uncertainty is obtained when stochastic precipitation data and the observed probability distribution of peak flows are used for model calibration. This outcome suggests calibrating a hydrological model directly on probability distributions of observed peak flows, using stochastic rainfall as input, if its purpose is the
Mapping flood hazards under uncertainty through probabilistic flood inundation maps
Stephens, T.; Bledsoe, B. P.; Miller, A. J.; Lee, G.
2017-12-01
Changing precipitation, rapid urbanization, and population growth interact to create unprecedented challenges for flood mitigation and management. Standard methods for estimating risk from flood inundation maps generally involve simulations of floodplain hydraulics for an established regulatory discharge of specified frequency. Hydraulic model results are then geospatially mapped and depicted as a discrete boundary of flood extents and a binary representation of the probability of inundation (in or out) that is assumed constant over a project's lifetime. Consequently, existing methods utilized to define flood hazards and assess risk management are hindered by deterministic approaches that assume stationarity in a nonstationary world, failing to account for spatio-temporal variability of climate and land use as they translate to hydraulic models. This presentation outlines novel techniques for portraying flood hazards and the results of multiple flood inundation maps spanning hydroclimatic regions. Flood inundation maps generated through modeling of floodplain hydraulics are probabilistic reflecting uncertainty quantified through Monte-Carlo analyses of model inputs and parameters under current and future scenarios. The likelihood of inundation and range of variability in flood extents resulting from Monte-Carlo simulations are then compared with deterministic evaluations of flood hazards from current regulatory flood hazard maps. By facilitating alternative approaches of portraying flood hazards, the novel techniques described in this presentation can contribute to a shifting paradigm in flood management that acknowledges the inherent uncertainty in model estimates and the nonstationary behavior of land use and climate.
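The probabilistic map described here reduces, per grid cell, to the fraction of Monte-Carlo simulations that inundate the cell. A minimal stand-in, with the hydraulic model replaced by a random flood stage over a sloping DEM:

```python
import numpy as np

rng = np.random.default_rng(7)
n_sims, rows, cols = 500, 20, 20

# Stand-in ensemble: each Monte-Carlo draw would perturb discharge and
# roughness; here that is reduced to a random stage over a sloping DEM.
dem = np.tile(np.linspace(0.0, 5.0, cols), (rows, 1))
stages = rng.normal(2.5, 0.8, size=n_sims)          # uncertain water level
ensemble = stages[:, None, None] > dem[None, :, :]  # wet/dry per simulation

# Probabilistic inundation map: fraction of simulations flooding each cell,
# versus the binary map a single "design" stage would give.
p_inundation = ensemble.mean(axis=0)
binary_map = dem < 2.5
print(p_inundation[0, ::5])    # probabilities taper off with elevation
```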
Combining information from multiple flood projections in a hierarchical Bayesian framework
Le Vine, Nataliya
2016-04-01
This study demonstrates, in the context of flood frequency analysis, the potential of a recently proposed hierarchical Bayesian approach to combine information from multiple models. The approach explicitly accommodates shared multimodel discrepancy as well as the probabilistic nature of the flood estimates, and treats the available models as a sample from a hypothetical complete (but unobserved) set of models. The methodology is applied to flood estimates from multiple hydrological projections (the Future Flows Hydrology data set) for 135 catchments in the UK. The advantages of the approach are shown to be: (1) to ensure adequate "baseline" with which to compare future changes; (2) to reduce flood estimate uncertainty; (3) to maximize use of statistical information in circumstances where multiple weak predictions individually lack power, but collectively provide meaningful information; (4) to diminish the importance of model consistency when model biases are large; and (5) to explicitly consider the influence of the (model performance) stationarity assumption. Moreover, the analysis indicates that reducing shared model discrepancy is the key to further reduction of uncertainty in the flood frequency analysis. The findings are of value regarding how conclusions about changing exposure to flooding are drawn, and to flood frequency change attribution studies.
Flood Damage and Loss Estimation for Iowa on Web-based Systems using HAZUS
Yildirim, E.; Sermet, M. Y.; Demir, I.
2016-12-01
The importance of decision support systems for flood emergency response and loss estimation increases with the social and economic impacts of floods. Several software systems are available to researchers and decision makers for estimating flood damage. HAZUS-MH is one of the most widely used desktop programs, developed by FEMA (Federal Emergency Management Agency), to estimate the economic loss and social impacts of disasters such as earthquakes, hurricanes and flooding (riverine and coastal). HAZUS uses a loss estimation methodology implemented through a geographic information system (GIS). HAZUS contains structural, demographic, and vehicle information across the United States. Thus, it allows decision makers to understand and predict possible casualties and damage from floods by running flood simulations through the GIS application. However, it does not represent real-time conditions because it uses static data. To close this gap, this research presents an overview of a web-based infrastructure coupling HAZUS with real-time data provided by IFIS (Iowa Flood Information System). IFIS, developed by the Iowa Flood Center, is a one-stop web platform for access to community-based flood conditions, forecasts, visualizations, inundation maps and flood-related data, information, and applications. A large volume of real-time observational data from a variety of sensors and remote sensing resources (radars, rain gauges, stream sensors, etc.) and flood inundation models is staged in a user-friendly map environment that is accessible to the general public. By providing cross-sectional analyses between HAZUS-MH and IFIS datasets, emergency managers are able to evaluate flood damage during flood events more easily and in real time. By matching data from the HAZUS-MH census tract layer with IFC gauges, the economic effects of flooding can be observed and evaluated by decision makers. The system will also provide visualization of the data by using augmented reality for
The Sensitivity of Flood Frequency Analysis to Record Length in the Conterminous United States
Hu, L.; Nikolopoulos, E. I.; Anagnostou, E. N.
2017-12-01
In flood frequency analysis (FFA), sufficiently long data series are important for obtaining reliable results. Relative to the return periods of interest, at-site FFA usually needs large data sets. Generally, the precision of at-site estimators and the time-sampling errors are associated with the length of the gauged record. In this work, we quantify the differences associated with various record lengths. We use the generalized extreme value (GEV) and Log-Pearson type III (LP3) distributions, two traditional methods applied to annual maximum streamflows, to undertake FFA, and propose quantitative measures, the relative difference in the median and the interquartile range (IQR), to compare flood frequency performance across different record lengths at 350 selected USGS gauges, each with more than 70 years of record, in the conterminous United States. We also group those gauges into regions based on the hydrologic unit map and discuss geographic effects. The results indicate that a long record length avoids imposing an upper limit on the degree of sophistication, and that working with longer records leads to more accurate results than working with shorter ones. Furthermore, the influence on those gauges of the hydrologic units of the watershed boundary dataset is also presented. The California region is the most sensitive to record length, while gauges in the east perform steadily.
Estimates of Present and Future Flood Risk in the Conterminous United States
Wing, O.; Bates, P. D.; Smith, A.; Sampson, C. C.; Johnson, K.; Fargione, J.; Morefield, P.
2017-12-01
Past attempts to estimate flood risk across the USA either have incomplete coverage, coarse resolution or use overly simplified models of the flooding process. In this paper, we use a new 30 m resolution model of the entire conterminous US (CONUS) with realistic flood physics to produce estimates of flood hazard which match, to within 90% accuracy, the skill of local models built with detailed data. Socio-economic data of commensurate resolution are combined with these flood depths to estimate current and future flood risk. Future population and land-use projections from the US Environmental Protection Agency (USEPA) are employed to indicate how flood risk might change through the 21st century, while present-day estimates utilize the Federal Emergency Management Agency (FEMA) National Structure Inventory and a USEPA map of population distribution. Our data show that the total CONUS population currently exposed to serious flooding is 2.6 to 3.1 times higher than previous estimates, with nearly 41 million Americans living within the so-called 1-in-100-year (1% annual probability) floodplain, compared to only 13 million according to FEMA flood maps. Moreover, socio-economic change alone leads to significant future increases in flood exposure and risk, even before climate change impacts are accounted for. The share of the population living on the 1-in-100-year floodplain is projected to increase from 13.3% in the present day to 15.6 - 15.8% in 2050 and 16.4 - 16.8% in 2100. The area of developed land within this floodplain, currently at 150,000 km2, is likely to increase by 37 - 72% by 2100 based on the scenarios selected. $5.5 trillion worth of assets currently lie on the 1% floodplain; we project that by 2100 this number will exceed $10 trillion. With this detailed spatial information on present-day flood risk, federal and state agencies can take appropriate action to mitigate losses. Use of USEPA population and land-use projections means that particular attention can be
Bartiko, Daniel; Chaffe, Pedro; Bonumá, Nadia
2017-04-01
Floods may be strongly affected by changes in climate, land use, land cover and water infrastructure. However, it is common to model this process as stationary. This approach has been questioned, especially when it involves estimating the frequency and magnitude of extreme events for designing and maintaining hydraulic structures, such as those responsible for flood control and dam safety. Brazil is the third largest producer of hydroelectricity in the world, and many of the country's dams are located in the Southern Region, so it seems appropriate to investigate the presence of non-stationarity in the inflows to these plants. In our study, we used historical flood data from the Brazilian National Grid Operator (ONS) to explore trends in the annual maximum flows of the 38 main rivers flowing into Southern Brazilian reservoirs (records range from 43 to 84 years). In the analysis, we assumed a two-parameter log-normal distribution, and a linear regression model was applied to allow the mean to vary with time. We computed recurrence reduction factors to characterize changes in the return period of a 100-year flood initially estimated with a stationary log-normal model. To evaluate whether or not a particular site exhibits a positive trend, we only considered data series whose linear regression slope coefficients are significant at the p < 0.05 level, with the significance level calculated using a one-sided Student's t-test. The trend-model residuals were analyzed using the Anderson-Darling normality test, the Durbin-Watson test for independence and the Breusch-Pagan test for heteroscedasticity. Our results showed that 22 of the 38 data series analyzed have a significant positive trend. The trends were mainly in three large basins: Iguazu, Uruguay and Paranapanema, which have undergone changes in land use and flow regularization in recent years. The calculated return period for the series that presented a positive trend varied from 50 to 77 years for a 100-year flood.
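The recurrence-reduction idea, refitting the log-normal with a time-varying mean and asking what return period the "stationary" 100-year flood has today, can be sketched with synthetic data (the trend size, noise level and years below are invented, not ONS records):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
years = np.arange(1950, 2015)
# Synthetic annual maxima with an upward trend in the mean of the logs.
log_q = 6.0 + 0.008 * (years - years[0]) + rng.normal(0.0, 0.3, size=years.size)

# Stationary two-parameter log-normal -> initially estimated 100-year flood.
mu, sigma = log_q.mean(), log_q.std(ddof=1)
log_q100 = stats.norm.ppf(0.99, mu, sigma)

# Non-stationary model: the mean of the logs varies linearly with time.
res = stats.linregress(years, log_q)
mu_now = res.intercept + res.slope * years[-1]
resid_sd = np.std(log_q - (res.intercept + res.slope * years), ddof=2)

# Recurrence reduction: return period of the "old" 100-year flood today.
T_now = 1.0 / stats.norm.sf(log_q100, mu_now, resid_sd)
print(f"trend slope {res.slope:.4f} per year (p = {res.pvalue:.1e})")
print(f"the stationary 100-year flood is now a ~{T_now:.0f}-year event")
```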
How are flood risk estimates affected by the choice of return-periods?
Ward, P. J.; de Moel, H.; Aerts, J. C. J. H.
2011-12-01
Flood management is increasingly adopting a risk-based approach, whereby flood risk is the product of the probability and consequences of flooding. One of the most common approaches in flood risk assessment is to estimate the damage that would occur for floods of several exceedance probabilities (or return periods), to plot these on an exceedance probability-loss curve (risk curve) and to estimate risk as the area under the curve. However, there is little insight into how the selection of the return periods (which ones and how many) used to calculate risk actually affects the final risk calculation. To gain such insights, we developed and validated an inundation model capable of rapidly simulating inundation extent and depth, and dynamically coupled it to an existing damage model. The method was applied to a section of the River Meuse in the southeast of the Netherlands. First, we estimated risk based on a risk curve using yearly return periods from 2 to 10 000 yr (€34 million p.a.). We found that the overall risk is greatly affected by the number of return periods used to construct the risk curve, with over-estimations of annual risk between 33% and 100% when only three return periods are used. In addition, binary assumptions on dike failure can have a large effect (a factor-of-two difference) on risk estimates. The minimum and maximum return periods considered in the curve also affect the risk estimate considerably. The results suggest that more research is needed to develop relatively simple inundation models that can be used to produce large numbers of inundation maps, complementary to more complex 2-D/3-D hydrodynamic models. They also suggest that research into flood risk could benefit from paying more attention to the damage caused by relatively high-probability floods.
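The "area under the risk curve" calculation, and its sensitivity to how many return periods are used, can be reproduced in a few lines with an invented damage curve (the numbers below are illustrative, not the Meuse results):

```python
import numpy as np

def expected_annual_damage(return_periods, damages):
    """EAD = area under the exceedance probability-loss curve,
    integrated with the trapezoidal rule over probability."""
    p = 1.0 / np.asarray(return_periods, dtype=float)
    d = np.asarray(damages, dtype=float)
    order = np.argsort(p)
    p, d = p[order], d[order]
    return float(np.sum(np.diff(p) * (d[1:] + d[:-1]) / 2.0))

# Hypothetical damage model: loss grows with the log of the return
# period, with no damage below the 2-year flood.
T_full = np.array([2, 5, 10, 25, 50, 100, 250, 1000, 10000])
dmg = 40.0 * np.log10(T_full / 2.0)        # million EUR per event

ead_full = expected_annual_damage(T_full, dmg)
ead_3pts = expected_annual_damage(T_full[[0, 4, -1]], dmg[[0, 4, -1]])
print(f"EAD with 9 return periods: {ead_full:.2f} M EUR/yr")
print(f"EAD with 3 return periods: {ead_3pts:.2f} M EUR/yr")
```

With this concave damage curve, the three-point risk curve over-estimates the EAD, consistent in direction with the 33-100% over-estimation reported above.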
Estimating the benefits of single value and probability forecasting for flood warning
Directory of Open Access Journals (Sweden)
J. S. Verkade
2011-12-01
Full Text Available Flood risk can be reduced by means of flood forecasting, warning and response systems (FFWRSs). These systems include a forecasting sub-system which is imperfect, meaning that inherent uncertainties in hydrological forecasts may result in false alarms and missed events. This forecasting uncertainty decreases the potential reduction of flood risk, but is seldom accounted for in estimates of the benefits of FFWRSs. In the present paper, a method to estimate the benefits of (imperfect) FFWRSs in reducing flood risk is presented. The method is based on a hydro-economic model of expected annual damage (EAD) due to flooding, combined with the concept of Relative Economic Value (REV). The estimated benefits include not only the reduction of flood losses due to a warning response, but also the costs of the warning response itself, as well as the costs associated with forecasting uncertainty. The method allows for estimation of the benefits of FFWRSs that use either deterministic or probabilistic forecasts. Through application to a case study, it is shown that FFWRSs using a probabilistic forecast have the potential to realise higher benefits at all lead-times. However, it is also shown that provision of warning at increasing lead-time does not necessarily lead to an increasing reduction of flood risk; rather, an optimal lead-time at which warnings are provided can be established as a function of forecast uncertainty and the cost-loss ratio of the user receiving and responding to the warning.
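The cost-loss reasoning behind Relative Economic Value can be sketched in a few lines. This is the standard textbook formulation in terms of hit rate, false-alarm rate, event base rate and the user's cost-loss ratio, a simplification of the paper's EAD-based analysis; the rates in the example are invented.

```python
def relative_economic_value(hit_rate, false_alarm_rate, base_rate, cost_loss_ratio):
    """REV of a warning system in the standard cost-loss framework:
    0 = no better than climatology, 1 = as good as a perfect forecast."""
    r = cost_loss_ratio                         # protection cost C / avoidable loss L
    # Expected expense per event period, in units of L
    e_forecast = (false_alarm_rate * (1 - base_rate) * r   # needless protection
                  + hit_rate * base_rate * r               # justified protection
                  + (1 - hit_rate) * base_rate)            # missed events: full loss
    e_climate = min(r, base_rate)               # best of "always/never protect"
    e_perfect = base_rate * r                   # protect exactly when needed
    return (e_climate - e_forecast) / (e_climate - e_perfect)

rev_imperfect = relative_economic_value(0.8, 0.1, 0.1, 0.3)   # ~ 0.41
```

A perfect forecast (hit rate 1, no false alarms) yields REV = 1, and a warning system only has value for users whose cost-loss ratio makes protection worthwhile, which mirrors the paper's user-dependent optimal lead-time result.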
International Nuclear Information System (INIS)
Dandini, V.; Staple, B.; Kirk, H.; Whitehead, D.; Forester, J.
1994-07-01
An estimate of the contribution of internal flooding to the mean core damage frequency at the Grand Gulf Nuclear Station was calculated for Plant Operational State 5 (POS 5) during a refueling outage. To this end, flood zones and sources were identified and flood volumes were calculated. Equipment necessary for the maintenance of plant safety was identified and its vulnerability to flooding was determined. Event trees and fault trees were modified or developed as required, and PRA quantification was performed using the IRRAS code. The mean core damage frequency estimate for GGNS during POS 5 was found to be 2.3E-8 per year.
Demissie, Y.; Mortuza, M. R.; Moges, E.; Yan, E.; Li, H. Y.
2017-12-01
Due to the lack of historical and future streamflow data for flood frequency analysis at or near most drainage sites, it is common practice to directly estimate the design flood (maximum discharge or volume of stream for a given return period) based on storm frequency analysis and the resulting Intensity-Duration-Frequency (IDF) curves. Such analysis assumes a direct relationship between storms and floods with, for example, the 10-year rainfall expected to produce the 10-year flood. However, in reality, a storm is just one factor among the many other hydrological and meteorological factors that can affect the peak flow and hydrograph. Consequently, a heavy storm does not necessarily always lead to flooding, or to a flood event with the same frequency. This is evident in the observed difference in the seasonality of heavy storms and floods in most regions. In order to understand the site-specific cause-effect relationship between heavy storms and floods and improve the flood analysis for stormwater drainage design and management, we have examined the contributions of various factors that affect floods using statistical and information theory methods. Based on the identified dominant cause-effect relationships, hydrologic and probability analyses were conducted to develop the runoff IDF curves, taking into consideration the snowmelt and rain-on-snow effect, the difference in storm and flood seasonality, soil moisture conditions, and catchment potential for flash and riverine flooding. The approach was demonstrated using data from military installations located in different parts of the United States. The accuracy of the flood frequency analysis and the resulting runoff IDF curves was evaluated based on the runoff IDF curves developed from streamflow measurements.
Urban micro-scale flood risk estimation with parsimonious hydraulic modelling and census data
Directory of Open Access Journals (Sweden)
C. Arrighi
2013-05-01
Full Text Available The adoption of the 2007/60/EC Directive requires European countries to implement flood hazard and flood risk maps by the end of 2013. Flood risk is the product of flood hazard, vulnerability and exposure, all three to be estimated with a comparable level of accuracy. The route to flood risk assessment is consequently much more than hydraulic modelling of inundation, that is, hazard mapping. While hazard maps have already been implemented in many countries, quantitative damage and risk maps are still at a preliminary level. A parsimonious quasi-2-D hydraulic model is here adopted, which has many advantages in terms of easy set-up, and is shown to estimate flood depth accurately in urban areas when combined with a high-resolution and up-to-date Digital Surface Model (DSM). The accuracy, estimated by comparison with marble-plate records of a historic flood in the city of Florence, is characterized in the downtown's most flooded area by a bias of a few centimetres and a determination coefficient of 0.73. The average risk is found to be about 14 € m⁻² yr⁻¹, corresponding to about 8.3% of residents' income. The spatial distribution of estimated risk highlights a complex interaction between the flood pattern and the building characteristics. As a final example application, the estimated risk values have been used to compare different retrofitting measures. Proceeding through the risk estimation steps, a new micro-scale potential damage assessment method is proposed. This is based on the georeferenced census system as the optimal compromise between spatial detail and open availability of socio-economic data. The results of flood risk assessment at the census section scale resolve most of the risk spatial variability, and they can be easily aggregated to whatever upper scale is needed given that they are geographically defined as contiguous polygons. Damage is calculated through stage-damage curves, starting from census data on building type and
Estimation of probability of coastal flooding: A case study in the Norton Sound, Alaska
Kim, S.; Chapman, R. S.; Jensen, R. E.; Azleton, M. T.; Eisses, K. J.
2010-12-01
Along the Norton Sound, Alaska, coastal communities have been exposed to flooding induced by extra-tropical storms. A lack of observational data, especially on long-term variability, makes it difficult to assess the probability of coastal flooding, which is critical in planning for development and evacuation of the coastal communities. We estimated the probability of coastal flooding with the help of an existing storm surge model using ADCIRC and a wave model using WAM for Western Alaska, which includes the Norton Sound as well as the adjacent Bering Sea and Chukchi Sea. The surface pressure and winds as well as ice coverage were analyzed and put in a gridded format at 3-hour intervals over the entire Alaskan Shelf by Ocean Weather Inc. (OWI) for the period between 1985 and 2009. OWI also analyzed the surface conditions for the storm events over the 31-year period between 1954 and 1984. The correlation between water levels recorded by the NOAA tide gage and local meteorological conditions at Nome between 1992 and 2005 suggested that strong local winds with prevailing southerly components are good proxies for high-water events. We also heuristically selected local winds with prevailing westerly components at Shaktoolik, located at the eastern end of the Norton Sound, to provide additional flood events during the continuous meteorological record between 1985 and 2009. The frequency analyses were performed using the simulated water levels and wave heights for the 56-year period between 1954 and 2009. Different methods of estimating return periods were compared, including the method according to the FEMA guideline, extreme value statistics, and fitting to statistical distributions such as Weibull and Gumbel. The estimates are, as expected, similar, but show some variation.
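One of the distribution-fitting steps mentioned above, a Gumbel fit to annual maxima followed by a return-level estimate, can be sketched with a method-of-moments fit. The synthetic record below stands in for the simulated water levels; the location and scale values are arbitrary.

```python
import math, random

EULER_GAMMA = 0.5772156649015329

def gumbel_fit(maxima):
    """Method-of-moments fit of a Gumbel (EV1) distribution to annual maxima."""
    n = len(maxima)
    mean = sum(maxima) / n
    var = sum((x - mean) ** 2 for x in maxima) / (n - 1)
    beta = math.sqrt(6.0 * var) / math.pi        # scale
    mu = mean - EULER_GAMMA * beta               # location
    return mu, beta

def gumbel_return_level(mu, beta, T):
    """Level exceeded on average once every T years."""
    return mu - beta * math.log(-math.log(1.0 - 1.0 / T))

# Synthetic 4000-value record drawn from Gumbel(3.0, 0.5) via the inverse CDF
rng = random.Random(1)
record = [3.0 - 0.5 * math.log(-math.log(rng.random())) for _ in range(4000)]
mu_hat, beta_hat = gumbel_fit(record)
level_100 = gumbel_return_level(mu_hat, beta_hat, 100)
```

In practice maximum-likelihood or L-moment fits are often preferred, and a Weibull fit would follow the same fit-then-invert pattern; method-of-moments keeps the sketch short.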
Recent advances in flood forecasting and flood risk assessment
Directory of Open Access Journals (Sweden)
G. Arduino
2005-01-01
Full Text Available Recent large floods in Europe have led to increased interest in research and development of flood forecasting systems. Some of these events have been provoked by some of the wettest rainfall periods on record, which has led to speculation that such extremes are attributable in some measure to anthropogenic global warming and represent the beginning of a period of higher flood frequency. Whilst current trends in extreme event statistics will be difficult to discern conclusively, there has been a substantial increase in the frequency of high floods in the 20th century for basins greater than 2×10^5 km². There is also increasing evidence that anthropogenic forcing of climate change may lead to an increased probability of extreme precipitation and, hence, of flooding. There is, therefore, major emphasis on the improvement of operational flood forecasting systems in Europe, with significant European Community spending on research and development on prototype forecasting systems and flood risk management projects. This Special Issue synthesises the most relevant scientific and technological results presented at the International Conference on Flood Forecasting in Europe held in Rotterdam from 3-5 March 2003. During that meeting 150 scientists, forecasters and stakeholders from four continents assembled to present their work and current operational best practice and to discuss future directions of scientific and technological efforts in flood prediction and prevention. The papers presented at the conference fall into seven themes, as follows.
Frequency Estimator Performance for a Software-Based Beacon Receiver
Zemba, Michael J.; Morse, Jacquelynne Rose; Nessel, James A.; Miranda, Felix
2014-01-01
As propagation terminals have evolved, their design has trended more toward a software-based approach that facilitates convenient adjustment and customization of the receiver algorithms. One potential improvement is the implementation of a frequency estimation algorithm, through which the primary frequency component of the received signal can be estimated with a much greater resolution than with a simple peak search of the FFT spectrum. To select an estimator for usage in a QV-band beacon receiver, analysis of six frequency estimators was conducted to characterize their effectiveness as they relate to beacon receiver design.
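A common way to obtain frequency resolution beyond a simple FFT peak search, as discussed above, is parabolic interpolation of the log-magnitude spectrum around the peak bin. The sketch below uses a direct DFT and a Hann window for clarity rather than speed; the 51.7 Hz test tone and 1 kHz sampling rate are arbitrary, and this is not necessarily the estimator selected in the study.

```python
import cmath, math

def estimate_frequency(x, fs):
    """Refine an FFT-style peak search with parabolic interpolation of the
    log-magnitudes; a Hann window keeps the interpolation bias small."""
    n = len(x)
    w = [xi * (0.5 - 0.5 * math.cos(2 * math.pi * i / (n - 1)))
         for i, xi in enumerate(x)]
    mag = [abs(sum(w[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n)))
           for k in range(n // 2)]
    k = max(range(1, n // 2 - 1), key=lambda i: mag[i])      # coarse peak search
    a, b, c = math.log(mag[k - 1]), math.log(mag[k]), math.log(mag[k + 1])
    delta = 0.5 * (a - c) / (a - 2 * b + c)                  # fractional-bin offset
    return (k + delta) * fs / n

# Refined estimate of a 51.7 Hz tone sampled at 1 kHz with only 256 samples
est = estimate_frequency([math.sin(2 * math.pi * 51.7 * t / 1000.0)
                          for t in range(256)], 1000.0)
```

With a 256-point transform the bin spacing is about 3.9 Hz, so the coarse peak alone is nearly 1 Hz off; the interpolated estimate lands much closer to the true tone.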
Haberlandt, U.; Radtke, I.
2013-08-01
Derived flood frequency analysis allows design floods to be estimated with hydrological modelling for poorly observed basins, considering change and taking into account flood protection measures. There are several possible choices regarding precipitation input, discharge output and, consequently, the calibration of the model. The objective of this study is to compare different calibration strategies for a hydrological model considering various types of rainfall input and runoff output data sets. Event-based and continuous observed hourly rainfall data as well as disaggregated daily rainfall and stochastically generated hourly rainfall data are used as input for the model. As output, short hourly and longer daily continuous flow time series as well as probability distributions of annual maximum peak flow series are employed. The performance of the strategies is evaluated using the obtained model parameter sets for continuous simulation of discharge in an independent validation period and by comparing the model-derived flood frequency distributions with the observed one. The investigations are carried out for three mesoscale catchments in Northern Germany with the hydrological model HEC-HMS. The results show that: (i) the same type of precipitation input data should be used for calibration and application of the hydrological model; (ii) a model calibrated on a small sample of extreme values works quite well for the simulation of continuous time series of moderate length, but not vice versa; (iii) the best performance with small uncertainty is obtained when stochastic precipitation data and the observed probability distribution of peak flows are used for model calibration. This outcome suggests calibrating a hydrological model directly on probability distributions of observed peak flows, using stochastic rainfall as input, if it is intended for derived flood frequency analysis.
Paretti, Nicholas V.; Kennedy, Jeffrey R.; Cohn, Timothy A.
2014-01-01
Flooding is among the costliest natural disasters in terms of loss of life and property in Arizona, which is why the accurate estimation of flood frequency and magnitude is crucial for proper structural design and accurate floodplain mapping. Current guidelines for flood frequency analysis in the United States are described in Bulletin 17B (B17B), yet since B17B's publication in 1982 (Interagency Advisory Committee on Water Data, 1982), several improvements have been proposed as updates for future guidelines. Two proposed updates are the Expected Moments Algorithm (EMA) to accommodate historical and censored data, and a generalized multiple Grubbs-Beck (MGB) low-outlier test. The current guidelines use a standard Grubbs-Beck (GB) method to identify low outliers; the choice changes the determination of the moment estimators because B17B uses a conditional probability adjustment to handle low outliers, while EMA censors them. B17B and EMA estimates are identical if no historical information, censored values or low outliers are present in the peak-flow data. EMA with the MGB test (EMA-MGB) was compared to the standard B17B (B17B-GB) method for flood frequency analysis at 328 streamgaging stations in Arizona. The methods were compared using the relative percent difference (RPD) between annual exceedance probabilities (AEPs), goodness-of-fit assessments, random resampling procedures, and Monte Carlo simulations. The AEPs were calculated and compared using both station skew and weighted skew. Streamgaging stations were classified by U.S. Geological Survey (USGS) National Water Information System (NWIS) qualification codes, used to denote historical and censored peak-flow data, to better understand the effect that nonstandard flood information has on the flood frequency analysis for each method. Streamgaging stations were also grouped according to geographic flood regions and analyzed separately to better understand regional differences caused by physiography and climate. The B
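The role of a low-outlier screen can be illustrated with a simplified one-sided sweep in the spirit of the Grubbs-Beck test: the smallest log-transformed peaks are compared against the mean and standard deviation of the remaining record. The critical multiplier below is a placeholder, not the sample-size-dependent significance value tabulated for B17B, and the peak-flow series is invented.

```python
import math, statistics

def low_outlier_sweep(peaks, k_crit=2.5):
    """Simplified one-sided low-outlier screen in the spirit of Grubbs-Beck.
    k_crit is a placeholder; the real test uses a tabulated critical value
    that depends on the record length and significance level."""
    logs = sorted(math.log10(q) for q in peaks)   # B17B works in log10 space
    outliers = []
    for i, x in enumerate(logs):
        rest = logs[i + 1:]          # test the smallest values against the rest
        if len(rest) < 3:
            break
        m, s = statistics.mean(rest), statistics.stdev(rest)
        if x < m - k_crit * s:
            outliers.append(10.0 ** x)
        else:
            break                    # stop at the first non-outlying value
    return outliers

# A hypothetical record with one anomalously small annual peak (cfs)
screened = low_outlier_sweep([1.0, 500.0, 600.0, 700.0, 800.0,
                              900.0, 1000.0, 1100.0, 1200.0])
```

Once flagged, B17B adjusts the moments conditionally while EMA treats the flagged values as censored, which is the methodological difference the study quantifies.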
International Nuclear Information System (INIS)
Prasad, Rajiv; Hibler, Lyle F.; Coleman, Andre M.; Ward, Duane L.
2011-01-01
The purpose of this document is to describe approaches and methods for estimation of the design-basis flood at nuclear power plant sites. Chapter 1 defines the design-basis flood and lists the U.S. Nuclear Regulatory Commission's (NRC) regulations that require estimation of the design-basis flood. For comparison, the design-basis flood estimation methods used by other Federal agencies are also described. A brief discussion of the recommendations of the International Atomic Energy Agency for estimation of the design-basis floods in its member States is also included.
Energy Technology Data Exchange (ETDEWEB)
Prasad, Rajiv; Hibler, Lyle F.; Coleman, Andre M.; Ward, Duane L.
2011-11-01
The purpose of this document is to describe approaches and methods for estimation of the design-basis flood at nuclear power plant sites. Chapter 1 defines the design-basis flood and lists the U.S. Nuclear Regulatory Commission's (NRC) regulations that require estimation of the design-basis flood. For comparison, the design-basis flood estimation methods used by other Federal agencies are also described. A brief discussion of the recommendations of the International Atomic Energy Agency for estimation of the design-basis floods in its member States is also included.
Merging information from multi-model flood projections in a hierarchical Bayesian framework
Le Vine, Nataliya
2016-04-01
Multi-model ensembles are becoming widely accepted for flood frequency change analysis. The use of multiple models results in large uncertainty around estimates of flood magnitudes, due both to uncertainty in model selection and to natural variability of river flow. The challenge is therefore to extract the most meaningful signal from the multi-model predictions, accounting for both model quality and uncertainties in individual model estimates. The study demonstrates the potential of a recently proposed hierarchical Bayesian approach to combine information from multiple models. The approach facilitates explicit treatment of shared multi-model discrepancy as well as the probabilistic nature of the flood estimates, by treating the available models as a sample from a hypothetical complete (but unobserved) set of models. The advantages of the approach are: 1) to ensure adequate 'baseline' conditions with which to compare future changes; 2) to reduce flood estimate uncertainty; 3) to maximize use of statistical information in circumstances where multiple weak predictions individually lack power, but collectively provide meaningful information; 4) to adjust multi-model consistency criteria when model biases are large; and 5) to explicitly consider the influence of the (model performance) stationarity assumption. Moreover, the analysis indicates that reducing shared model discrepancy is the key to further reduction of uncertainty in the flood frequency analysis. The findings are of value regarding how conclusions about changing exposure to flooding are drawn, and to flood frequency change attribution studies.
Improving Flash Flood Prediction in Multiple Environments
Broxton, P. D.; Troch, P. A.; Schaffner, M.; Unkrich, C.; Goodrich, D.; Wagener, T.; Yatheendradas, S.
2009-12-01
Flash flooding is a major concern in many fast-responding headwater catchments. There are many efforts to model and to predict these flood events, though it is not currently possible to adequately predict the nature of flash flood events with a single model; furthermore, many of these efforts do not even consider snow, which can, by itself or in combination with rainfall events, cause destructive floods. The current research is aimed at broadening the applicability of flash flood modeling. Specifically, we will take a state-of-the-art flash flood model that is designed to work with warm season precipitation in arid environments, the KINematic runoff and EROSion model (KINEROS2), and combine it with a continuous subsurface flow model and an energy balance snow model. This should improve its predictive capacity in humid environments where lateral subsurface flow significantly contributes to streamflow, and it will make possible the prediction of flooding events that involve rain-on-snow or rapid snowmelt. By modeling changes in the hydrologic state of a catchment before a flood begins, we can also better understand the factors or combination of factors that are necessary to produce large floods. Broadening the applicability of an already state-of-the-art flash flood model, such as KINEROS2, is logical because flash floods can occur in all types of environments, and it may lead to better predictions, which are necessary to preserve life and property.
Flood frequency analysis at ungauged sites in the KwaZulu-Natal Province, South Africa
DEFF Research Database (Denmark)
Kjeldsen, Thomas Rodding; Smithers, J.C.; Schulze, R.E.
2001-01-01
Use of the index-flood method at ungauged sites requires methods for estimation of the index-flood parameter at these sites. This study attempts to relate the mean annual flood to site characteristics of catchments in KwaZulu-Natal, South Africa. The ordinary, weighted and generalised least square...
Design flood estimation in ungauged basins: probabilistic extension of the design-storm concept
Berk, Mario; Špačková, Olga; Straub, Daniel
2016-04-01
Design flood estimation in ungauged basins is an important hydrological task, which in engineering practice is typically solved with the design storm concept. However, neglecting the uncertainty in the hydrological response of the catchment, through the assumption of average-recurrence-interval (ARI) neutrality between rainfall and runoff, can lead to flawed design flood estimates. Additionally, selecting a single critical rainfall duration neglects the contribution of other rainfall durations to the probability of extreme flood events. In this study, the design flood problem is approached with concepts from structural reliability that enable a consistent treatment of multiple uncertainties in estimating the design flood. The uncertainties of key model parameters are represented probabilistically and the First-Order Reliability Method (FORM) is used to compute the flood exceedance probability. As an important by-product, the FORM analysis provides the most likely parameter combination leading to a flood with a given exceedance probability; i.e. it enables one to find representative scenarios for, e.g., a 100-year or a 1000-year flood. Possible different rainfall durations are incorporated by formulating the event of a given design flood as a series system. The method is directly applicable in practice, since the description of the rainfall depth-duration characteristics needs the same inputs as the classical design storm methods, which are commonly provided by meteorological services. The proposed methodology is applied to a case study of the Trauchgauer Ach catchment in Bavaria; SCS Curve Number (CN) and unit hydrograph models are used for modeling the hydrological process. The results indicate, in accordance with past experience, that the traditional design storm concept underestimates design floods.
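The reliability vocabulary above can be grounded with the simplest FORM case: a linear limit state g = R - S with independent normal capacity R (e.g. channel capacity) and load S (flood peak), for which the reliability index and exceedance probability are available in closed form. The study itself applies FORM to a nonlinear rainfall-runoff model via an iterative search; the numbers here are invented.

```python
import math

def form_linear(mu_r, sd_r, mu_s, sd_s):
    """First-order reliability for the linear limit state g = R - S with
    independent normal R (capacity) and S (load): the simplest FORM case,
    where FORM is exact rather than an approximation."""
    beta = (mu_r - mu_s) / math.sqrt(sd_r ** 2 + sd_s ** 2)  # reliability index
    p_f = 0.5 * math.erfc(beta / math.sqrt(2))               # exceedance probability
    return beta, p_f

# Hypothetical capacity 500 +/- 50 m3/s against a flood peak of 350 +/- 60 m3/s
beta, p_f = form_linear(500.0, 50.0, 350.0, 60.0)
return_period = 1.0 / p_f
```

For a nonlinear limit state, FORM linearizes at the most probable failure point, and that design point is exactly the "representative scenario" the abstract describes for a given return period.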
Flooding Experiments and Modeling for Improved Reactor Safety
International Nuclear Information System (INIS)
Solmos, M.; Hogan, K.J.; Vierow, K.
2008-01-01
Countercurrent two-phase flow and 'flooding' phenomena in light water reactor systems are being investigated experimentally and analytically to improve reactor safety of current and future reactors. The aspects to be better clarified are the effects of condensation and tube inclination on flooding in large-diameter tubes. The current project aims to improve the level of understanding of flooding mechanisms and to develop an analysis model for more accurate evaluations of flooding in the pressurizer surge line of a Pressurized Water Reactor (PWR). Interest in flooding has recently increased because Countercurrent Flow Limitation (CCFL) in the AP600 pressurizer surge line can affect the vessel refill rate following a small-break LOCA, and because analysis of hypothetical severe accidents with the current flooding models in reactor safety codes shows that these models represent the largest uncertainty in the analysis of steam generator tube creep rupture. During a hypothetical station blackout without auxiliary feedwater recovery, should the hot leg become voided, the pressurizer liquid will drain to the hot leg and flooding may occur in the surge line. The flooding model heavily influences the pressurizer emptying rate and the potential for surge line structural failure due to overheating and creep rupture. The air-water test results in vertical tubes are presented in this paper along with a semi-empirical correlation for the onset of flooding. The unique aspects of the study include careful experimentation on large-diameter tubes and an integrated program in which air-water testing provides benchmark knowledge and visualization data from which to conduct steam-water testing.
Power system frequency estimation based on an orthogonal decomposition method
Lee, Chih-Hung; Tsai, Men-Shen
2018-06-01
In recent years, several frequency estimation techniques have been proposed by which to estimate the frequency variations in power systems. In order to properly identify power quality issues under asynchronously-sampled signals that are contaminated with noise, flicker, and harmonic and inter-harmonic components, a good frequency estimator that is able to estimate the frequency as well as the rate of frequency changes precisely is needed. However, accurately estimating the fundamental frequency becomes a very difficult task without a priori information about the sampling frequency. In this paper, a better frequency evaluation scheme for power systems is proposed. This method employs a reconstruction technique in combination with orthogonal filters, which may maintain the required frequency characteristics of the orthogonal filters and improve the overall efficiency of power system monitoring through two-stage sliding discrete Fourier transforms. The results showed that this method can accurately estimate the power system frequency under different conditions, including asynchronously sampled signals contaminated by noise, flicker, and harmonic and inter-harmonic components. The proposed approach also provides high computational efficiency.
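The two-stage sliding DFT mentioned above builds on the basic sliding-DFT recurrence, in which one spectral bin of the latest N samples is updated with O(1) work per incoming sample instead of recomputing the whole transform. The sketch below is the textbook single-bin recurrence, not the paper's full orthogonal-decomposition scheme; the 50 Hz test tone and window length are arbitrary.

```python
import cmath, math

def direct_dft_bin(window, k):
    """Reference: bin k of the DFT of one window, computed directly."""
    n = len(window)
    return sum(window[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))

def sliding_dft_bin(samples, n, k):
    """Track DFT bin k over a length-n sliding window, one output per sample
    once the window is full."""
    twiddle = cmath.exp(2j * math.pi * k / n)
    buffer = [0.0] * n                   # circular buffer of the last n samples
    s = 0j
    outputs = []
    for i, x in enumerate(samples):
        oldest = buffer[i % n]
        buffer[i % n] = x
        s = (s - oldest + x) * twiddle   # classic sliding-DFT update
        if i >= n - 1:
            outputs.append(s)
    return outputs

# 50 Hz tone sampled at 800 Hz lands exactly in bin k = 4 of a 64-sample window
sig = [math.sin(2 * math.pi * 50.0 * t / 800.0) for t in range(200)]
track = sliding_dft_bin(sig, 64, 4)
```

In a real monitoring application the recursive update slowly accumulates rounding error, which is one reason practical schemes add stabilization or periodic reinitialization.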
Methods for design flood estimation in South Africa
African Journals Online (AJOL)
2012-07-04
Jul 4, 2012 ... 1970s and are in need of updating with more than 40 years of additional data ... This paper reviews methods used for design flood estimation in South Africa and .... transposition of past experience, or a deterministic approach,.
Estimated value of insurance premium due to Citarum River flood by using Bayesian method
Sukono; Aisah, I.; Tampubolon, Y. R. H.; Napitupulu, H.; Supian, S.; Subiyanto; Sidi, P.
2018-03-01
Citarum river flooding in South Bandung, West Java, Indonesia, occurs almost every year. It causes property damage, producing economic loss. The risk of loss can be mitigated by joining a flood insurance program. In this paper, we discuss the estimated value of insurance premiums due to Citarum river flooding using the Bayesian method. It is assumed that the flood loss data follow a Pareto distribution with a fat right tail. The estimation of the distribution model parameters is done using the Bayesian method. First, parameter estimation is done under the assumption that the prior comes from the Gamma distribution family, while the observed data follow a Pareto distribution. Second, flood loss data are simulated based on the probability of damage in each flood-affected area. The result of the analysis shows that the estimated premium values based on the pure premium principle are as follows: for a loss value of IDR 629.65 million, a premium of IDR 338.63 million; for a loss of IDR 584.30 million, a premium of IDR 314.24 million; and for a loss value of IDR 574.53 million, a premium of IDR 308.95 million. These premium estimates can serve as a reference for setting reasonable premiums that neither overburden the insured nor cause losses to the insurer.
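The conjugate structure used in the paper can be sketched directly: with the Pareto scale (minimum loss) known, a Gamma prior on the tail index updates in closed form, and a pure premium follows by plugging in the posterior-mean tail index. The prior hyperparameters and loss values below are invented, and using the posterior mean is a simplification of a full posterior-predictive premium.

```python
import math

def pareto_posterior(losses, xm, a0=2.0, b0=1.0):
    """Gamma(a0, b0) prior on the Pareto tail index alpha, updated with
    observed losses; the Gamma prior is conjugate when the scale xm is known."""
    a_post = a0 + len(losses)
    b_post = b0 + sum(math.log(x / xm) for x in losses)
    return a_post, b_post

def pure_premium(a_post, b_post, xm):
    """Pure premium = expected loss under the posterior-mean tail index."""
    alpha = a_post / b_post              # posterior mean of alpha
    if alpha <= 1.0:
        raise ValueError("expected loss is infinite for alpha <= 1")
    return alpha * xm / (alpha - 1.0)    # Pareto mean, valid for alpha > 1

# Hypothetical losses (arbitrary units), minimum insurable loss xm = 1.0
a_post, b_post = pareto_posterior([1.2, 1.5, 2.0, 1.1, 1.3], 1.0)
premium = pure_premium(a_post, b_post, 1.0)
```

The guard on alpha matters for fat-tailed loss models: if the posterior puts the tail index at or below 1, the pure premium principle breaks down because the expected loss diverges.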
Flood susceptibility analysis through remote sensing, GIS and frequency ratio model
Samanta, Sailesh; Pal, Dilip Kumar; Palsamanta, Babita
2018-05-01
Papua New Guinea (PNG) is saddled with frequent natural disasters like earthquakes, volcanic eruptions, landslides, droughts, floods etc. Flood, as a hydrological disaster to humankind's niche, brings about a powerful and often sudden, pernicious change in the surface distribution of water on land, while the benevolence of flood manifests in restoring the health of the thalweg from excessive siltation by redistributing the fertile sediments on the riverine floodplains. From social, economic and environmental perspectives, flood is one of the most devastating disasters in PNG. This research was conducted to investigate the usefulness of remote sensing, geographic information systems and the frequency ratio (FR) model for flood susceptibility mapping. The FR model was used to handle different independent variables via weighted-based bivariate probability values to generate a plausible flood susceptibility map. This study was conducted in the Markham riverine precinct under Morobe province in PNG. A historical flood inventory database of the PNG resource information system (PNGRIS) was used to generate 143 flood locations based on "create fishnet" analysis. 100 (70%) flood sample locations were selected randomly for model building. Ten independent variables, namely land use/land cover, elevation, slope, topographic wetness index, surface runoff, landform, lithology, distance from the main river, soil texture and soil drainage, were used in the FR model for flood vulnerability analysis. Finally, the database was developed for areas vulnerable to flood. The result demonstrated a span of FR values ranging from 2.66 (least flood prone) to 19.02 (most flood prone) for the study area. The developed database was reclassified into five (5) flood vulnerability zones based on the FR values, namely very low (less than 5.0), low (5.0-7.5), moderate (7.5-10.0), high (10.0-12.5) and very high susceptibility (more than 12.5). The result indicated that about 19.4% land area as `very high
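The FR computation itself is a compact bivariate ratio: for each class of a conditioning factor, the share of flood cells falling in the class divided by the share of all cells in the class, with values above 1 marking classes over-represented among floods. The toy elevation classes and cell counts below are invented, not PNGRIS data.

```python
def frequency_ratio(flood_counts, class_counts):
    """FR per class of one conditioning factor:
    (flood cells in class / all flood cells) / (cells in class / all cells)."""
    total_flood = sum(flood_counts.values())
    total_cells = sum(class_counts.values())
    return {c: (flood_counts.get(c, 0) / total_flood) / (class_counts[c] / total_cells)
            for c in class_counts}

# Hypothetical elevation classes: counts of flood-inventory cells and of all cells
fr = frequency_ratio({'low': 80, 'mid': 15, 'high': 5},
                     {'low': 400, 'mid': 300, 'high': 300})
```

In a full analysis these per-class FR values are computed for each of the ten factors and summed cell by cell to form the susceptibility index that is then binned into vulnerability zones.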
Flood damage estimation of companies: A comparison of Stage-Damage-Functions and Random Forests
Sieg, Tobias; Kreibich, Heidi; Vogel, Kristin; Merz, Bruno
2017-04-01
The development of appropriate flood damage models plays an important role not only for damage assessment after an event but also for developing adaptation and risk mitigation strategies. So-called Stage-Damage-Functions (SDFs) are often applied as a standard approach to estimate flood damage. These functions assign a certain damage to the water depth, depending on the use or other characteristics of the exposed objects. Recent studies apply machine learning algorithms like Random Forests (RFs) to model flood damage. These algorithms usually consider more influencing variables and promise a more detailed insight into the damage processes. In addition, they provide an inherent validation scheme. Our study focuses on direct, tangible damage to single companies. The objective is to model and validate the flood damage suffered by single companies with SDFs and RFs. The data sets used are taken from two surveys conducted after the floods in the Elbe and Danube catchments in the years 2002 and 2013 in Germany. Damage to buildings (n = 430), equipment (n = 651) as well as goods and stock (n = 530) is taken into account. The model outputs are validated via a comparison with the actual flood damage acquired by the surveys and subsequently compared with each other. This study investigates the gain in model performance with the use of additional data and the advantages and disadvantages of the RFs compared to SDFs. RFs show an increase in model performance with an increasing amount of data records over a comparatively large range, while the model performance of the SDFs is already saturated for a small set of records. In addition, the RFs are able to identify damage-influencing variables, which improves the understanding of damage processes. Hence, RFs can slightly improve flood damage predictions and provide additional insight into the underlying mechanisms compared to SDFs.
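A Stage-Damage-Function in its simplest form is just piecewise-linear interpolation of a depth-damage curve; the Random Forest alternative replaces this single-variable lookup with a multi-variable learned regressor. The curve points below are illustrative, not taken from the Elbe/Danube survey data.

```python
def stage_damage(depth, curve):
    """Piecewise-linear Stage-Damage-Function.
    curve: list of (water depth in m, damage fraction of asset value) points."""
    pts = sorted(curve)
    if depth <= pts[0][0]:
        return pts[0][1]                 # below the curve: minimum damage
    if depth >= pts[-1][0]:
        return pts[-1][1]                # beyond the curve: saturated damage
    for (d0, f0), (d1, f1) in zip(pts, pts[1:]):
        if d0 <= depth <= d1:
            return f0 + (f1 - f0) * (depth - d0) / (d1 - d0)

# Hypothetical depth-damage curve for one building class
curve = [(0.0, 0.0), (1.0, 0.3), (2.0, 0.5), (5.0, 1.0)]
loss_fraction = stage_damage(0.5, curve)
```

The saturation at the last curve point mirrors the paper's observation that SDF performance plateaus: the function cannot exploit additional records or variables beyond the fixed depth-damage relation.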
Fundamental Frequency and Model Order Estimation Using Spatial Filtering
DEFF Research Database (Denmark)
Karimian-Azari, Sam; Jensen, Jesper Rindom; Christensen, Mads Græsbøll
2014-01-01
In signal processing applications of harmonic-structured signals, estimates of the fundamental frequency and number of harmonics are often necessary. In real scenarios, a desired signal is contaminated by different levels of noise and interferers, which complicate the estimation of the signal parameters. In this paper, we present an estimation procedure for harmonic-structured signals in situations with strong interference using spatial filtering, or beamforming. We jointly estimate the fundamental frequency and the constrained model order through the output of the beamformers. Besides that, we extend this procedure to account for inharmonicity using unconstrained model order estimation. The simulations show that beamforming improves the performance of the joint estimates of fundamental frequency and the number of harmonics at low signal-to-interference (SIR) levels, and an experiment...
CHANGING FLOOD FREQUENCY IN SCOTLAND: IMPLICATIONS FOR CHANNEL GEOMORPHOLOGY, ECOLOGY AND MANAGEMENT
Thompson, Fiona Hilary
2017-01-01
The effect of climate on the fluvial system has long been investigated due to the significant impact it can have on a river's hydrological regime and fluvial processes. In recent years this interest has increased, as global changes in climate are expected to bring more frequent high-magnitude flood events globally, and to North West Europe in particular. Despite the knowledge that the frequency and magnitude of floods are expected to increase, less is known about the geomorphological implicat…
A systematic intercomparison of regional flood frequency analysis models in a simulation framework
Ganora, Daniele; Laio, Francesco; Claps, Pierluigi
2015-04-01
Regional frequency analysis (RFA) is a well-established methodology to provide an estimate of the flood frequency curve (or other discharge-related variables), based on the fundamental concept of substituting temporal information at a site (no data or short time series) by exploiting observations at other sites (spatial information). Different RFA paradigms exist, depending on the way the information is transferred to the site of interest. Despite the wide use of such methodology, a systematic comparison between these paradigms has not been performed. The aim of this study is to provide a framework in which to carry out the intercomparison: we thus synthetically generate data through Monte Carlo simulations for a number of (virtual) stations, following a GEV parent distribution; different scenarios can be created to represent different spatial heterogeneity patterns by manipulating the parameters of the parent distribution at each station (e.g. with a linear variation in space of the shape parameter of the GEV). A special case is the homogeneous scenario, where each station record is sampled from the same parent distribution. For each scenario and each simulation, different regional models are applied to evaluate the 200-year growth factor at each station. Results are then compared to the exact growth factor of each station, which is known in our virtual world. The regional approaches considered include: (i) a single growth curve for the whole region; (ii) a multiple-region model based on cluster analysis, which searches for an adequate number of homogeneous subregions; (iii) a Region-of-Influence model, which defines a homogeneous subregion for each site; and (iv) a spatially smooth estimation procedure based on linear regressions. A further benchmark model is the at-site estimate based on the analysis of the local record. A comprehensive analysis of the results of the simulations shows that, if the scenario is homogeneous (no spatial variability), all the regional approaches
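The virtual-world setup described above (GEV parent with a spatially varying shape parameter, and an exact 200-year growth factor per station) can be sketched directly from the GEV inverse CDF. The parameter values and region size below are illustrative assumptions.

```python
import math
import random

def gev_quantile(p, mu, sigma, xi):
    """Inverse CDF of the GEV (xi != 0): F(x) = exp(-(1 + xi*(x-mu)/sigma)^(-1/xi))."""
    return mu + sigma / xi * ((-math.log(p)) ** (-xi) - 1.0)

def sample_gev(n, mu, sigma, xi, rng):
    """Draw n annual maxima by inversion sampling."""
    return [gev_quantile(rng.random(), mu, sigma, xi) for _ in range(n)]

# A "virtual region" of 5 stations whose GEV shape parameter varies linearly
# in space (a heterogeneous scenario); mu and sigma held fixed. Values illustrative.
rng = random.Random(42)
stations = []
for k in range(5):
    xi_k = 0.05 + 0.05 * k                           # shape varies from 0.05 to 0.25
    record = sample_gev(50, 100.0, 30.0, xi_k, rng)  # 50 "years" of annual maxima
    # Exact 200-year growth factor: the T=200 quantile divided by the index flood
    # (mean annual maximum). Mean of the GEV: mu + sigma*(Gamma(1-xi) - 1)/xi.
    mean_k = 100.0 + 30.0 * (math.gamma(1 - xi_k) - 1) / xi_k
    gf_k = gev_quantile(1 - 1 / 200, 100.0, 30.0, xi_k) / mean_k
    stations.append((record, gf_k))
```

Each competing regional model would then be asked to recover these exact growth factors from the synthetic records alone.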
Evaluation of Satellite Rainfall Estimates for Drought and Flood Monitoring in Mozambique
Directory of Open Access Journals (Sweden)
Carolien Toté
2015-02-01
Full Text Available Satellite-derived rainfall products are useful for drought and flood early warning and overcome the problem of sparse, unevenly distributed and erratic rain gauge observations, provided their accuracy is well known. Mozambique is highly vulnerable to extreme weather events such as major droughts and floods, and thus an understanding of the strengths and weaknesses of different rainfall products is valuable. Three dekadal (10-day) gridded satellite rainfall products (TAMSAT African Rainfall Climatology And Time-series (TARCAT) v2.0, Famine Early Warning System NETwork (FEWS NET) Rainfall Estimate (RFE) v2.0, and Climate Hazards Group InfraRed Precipitation with Stations (CHIRPS)) are compared to independent gauge data (2001–2012). This is done using pairwise comparison statistics to evaluate the performance in estimating rainfall amounts and categorical statistics to assess rain-detection capabilities. The analysis was performed for different rainfall categories, over the seasonal cycle and for regions dominated by different weather systems. Overall, satellite products overestimate low and underestimate high dekadal rainfall values. The RFE and CHIRPS products perform comparably well, generally outperforming TARCAT on the majority of statistical measures of skill. TARCAT best detects the relative frequency of rainfall events, while RFE underestimates and CHIRPS overestimates the rainfall event frequency. Differences in product performance disappear with higher rainfall, and all products achieve better results during the wet season. During the cyclone season, CHIRPS shows the best results, while RFE outperforms the other products for lower dekadal rainfall. Products blending thermal infrared and passive microwave imagery perform better than infrared-only products, particularly when meteorological patterns are more complex, such as over the coastal, central and south regions of Mozambique, where precipitation is influenced by frontal systems.
Evaluation of satellite rainfall estimates for drought and flood monitoring in Mozambique
Tote, Carolien; Patricio, Domingos; Boogaard, Hendrik; van der Wijngaart, Raymond; Tarnavsky, Elena; Funk, Christopher C.
2015-01-01
Satellite-derived rainfall products are useful for drought and flood early warning and overcome the problem of sparse, unevenly distributed and erratic rain gauge observations, provided their accuracy is well known. Mozambique is highly vulnerable to extreme weather events such as major droughts and floods, and thus an understanding of the strengths and weaknesses of different rainfall products is valuable. Three dekadal (10-day) gridded satellite rainfall products (TAMSAT African Rainfall Climatology And Time-series (TARCAT) v2.0, Famine Early Warning System NETwork (FEWS NET) Rainfall Estimate (RFE) v2.0, and Climate Hazards Group InfraRed Precipitation with Stations (CHIRPS)) are compared to independent gauge data (2001–2012). This is done using pairwise comparison statistics to evaluate the performance in estimating rainfall amounts and categorical statistics to assess rain-detection capabilities. The analysis was performed for different rainfall categories, over the seasonal cycle and for regions dominated by different weather systems. Overall, satellite products overestimate low and underestimate high dekadal rainfall values. The RFE and CHIRPS products perform comparably well, generally outperforming TARCAT on the majority of statistical measures of skill. TARCAT best detects the relative frequency of rainfall events, while RFE underestimates and CHIRPS overestimates the rainfall event frequency. Differences in product performance disappear with higher rainfall, and all products achieve better results during the wet season. During the cyclone season, CHIRPS shows the best results, while RFE outperforms the other products for lower dekadal rainfall. Products blending thermal infrared and passive microwave imagery perform better than infrared-only products, particularly when meteorological patterns are more complex, such as over the coastal, central and south regions of Mozambique, where precipitation is influenced by frontal systems.
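The categorical rain-detection statistics this kind of validation relies on come from a simple satellite-vs-gauge contingency table. The short series and the 1 mm event threshold below are illustrative assumptions, not values from the study.

```python
def contingency(sat, gauge, threshold=1.0):
    """Count hits/misses/false alarms between satellite and gauge rainfall series.
    A 'rain event' is a dekadal total at or above `threshold` mm (assumed)."""
    hits = sum(1 for s, g in zip(sat, gauge) if s >= threshold and g >= threshold)
    misses = sum(1 for s, g in zip(sat, gauge) if s < threshold and g >= threshold)
    false_alarms = sum(1 for s, g in zip(sat, gauge) if s >= threshold and g < threshold)
    return hits, misses, false_alarms

def skill_scores(hits, misses, false_alarms):
    pod = hits / (hits + misses)                     # probability of detection
    far = false_alarms / (hits + false_alarms)       # false alarm ratio
    bias = (hits + false_alarms) / (hits + misses)   # frequency bias (>1: product
                                                     # overestimates event frequency)
    return pod, far, bias

sat   = [0.0, 3.2, 5.1, 0.4, 2.0, 0.0]   # illustrative dekadal totals (mm)
gauge = [0.0, 4.0, 0.2, 1.5, 2.5, 0.0]
pod, far, bias = skill_scores(*contingency(sat, gauge))
```

A frequency bias above 1 corresponds to the CHIRPS behaviour described above (overestimated event frequency), below 1 to the RFE behaviour.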
Probabilistic flood extent estimates from social media flood observations
Brouwer, Tom; Eilander, Dirk; Van Loenen, Arnejan; Booij, Martijn J.; Wijnberg, Kathelijne M.; Verkade, Jan S.; Wagemaker, Jurjen
2017-01-01
The increasing number and severity of floods, driven by phenomena such as urbanization, deforestation, subsidence and climate change, create a growing need for accurate and timely flood maps. In this paper we present and evaluate a method to create deterministic and probabilistic flood maps from
Probabilistic flood extent estimates from social media flood observations
Brouwer, Tom; Eilander, Dirk; Van Loenen, Arnejan; Booij, Martijn J.; Wijnberg, Kathelijne M.; Verkade, Jan S.; Wagemaker, Jurjen
2017-01-01
The increasing number and severity of floods, driven by phenomena such as urbanization, deforestation, subsidence and climate change, create a growing need for accurate and timely flood maps. This research focussed on creating flood maps using user-generated content from Twitter. Twitter data has
How Metastrategic Considerations Influence the Selection of Frequency Estimation Strategies
Brown, Norman R.
2008-01-01
Prior research indicates that enumeration-based frequency estimation strategies become increasingly common as memory for relevant event instances improves and that moderate levels of context memory are associated with moderate rates of enumeration [Brown, N. R. (1995). Estimation strategies and the judgment of event frequency. Journal of…
Challenges in estimating the health impact of Hurricane Sandy using macro-level flood data.
Lieberman-Cribbin, W.; Liu, B.; Schneider, S.; Schwartz, R.; Taioli, E.
2016-12-01
Background: Hurricane Sandy caused extensive physical and economic damage, but the long-term health impacts are unknown. Flooding is a central component of hurricane exposure, influencing health through multiple pathways that unfold over the months after flooding recedes. This study assesses concordance between Federal Emergency Management Agency (FEMA) and self-reported flood exposure after Hurricane Sandy to elucidate discrepancies in flood exposure assessments. Methods: Three-meter-resolution New York State flood data were obtained from the FEMA Modeling Task Force Hurricane Sandy Impact Analysis. FEMA data were compared to self-reported flood data obtained through validated questionnaires from New York City and Long Island residents following Sandy. Flooding was defined as both a dichotomous and a continuous variable, and analyses were performed in SAS v9.4 and ArcGIS 10.3.1. Results: There was moderate agreement between FEMA and self-reported flood exposure for both the dichotomous (Kappa statistic 0.46) and continuous (Spearman's correlation coefficient 0.50) measures. Flooding was both self-reported and recorded by FEMA in 23.6% of cases, while agreement between the two measures on no flooding was 51.1%. Flooding was self-reported but not recorded by FEMA in 8.5% of cases, while flooding was not self-reported but indicated by FEMA in 16.8% of cases. In this last instance, most respondents (173/207; 83.6%) resided in an apartment (no flooding reported). Spatially, concordance was greatest in the interior of New York City / Long Island, while the greatest areas of discordance were concentrated in the Rockaway Peninsula and Long Beach, especially among those living in apartments. Conclusions: There were significant discrepancies between FEMA and self-reported flood data. While macro-level FEMA flood data is a relatively less expensive and faster way to provide exposure estimates spanning larger geographic areas affected by Hurricane Sandy than micro-level estimates from cohort studies, macro
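The Kappa statistic quoted above measures agreement between two dichotomous exposure ratings beyond chance. A minimal sketch for binary FEMA-vs-self-report flags; the ten household records below are invented for illustration.

```python
def cohens_kappa(a, b):
    """Cohen's kappa for two binary (0/1) ratings of the same cases:
    observed agreement corrected for the agreement expected by chance."""
    n = len(a)
    po = sum(1 for x, y in zip(a, b) if x == y) / n   # observed agreement
    pa1, pb1 = sum(a) / n, sum(b) / n                 # marginal "flooded" rates
    pe = pa1 * pb1 + (1 - pa1) * (1 - pb1)            # chance agreement
    return (po - pe) / (1 - pe)

# Illustrative FEMA vs self-reported flood flags for 10 households (1 = flooded)
fema = [1, 1, 0, 0, 1, 0, 1, 0, 0, 0]
self_report = [1, 0, 0, 0, 1, 1, 1, 0, 0, 0]
kappa = cohens_kappa(fema, self_report)
```

Values around 0.4 to 0.6, as in the study, are conventionally read as moderate agreement.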
A novel velocity estimator using multiple frequency carriers
DEFF Research Database (Denmark)
Zhang, Zhuo; Jakobsson, Andreas; Nikolov, Svetoslav
2004-01-01
Most modern ultrasound scanners use the so-called pulsed-wave Doppler technique to estimate the blood velocities. Among the narrowband-based methods, the autocorrelation estimator and the Fourier-based method are the most commonly used approaches. Due to the low level of the blood echo, the signal-to-noise ratio is low, and some averaging in depth is applied to improve the estimate. Further, due to velocity gradients in space and time, the spectrum may get smeared. An alternative approach is to use a pulse with multiple frequency carriers, and do some form of averaging in the frequency domain. However, … In this paper, we propose a nonlinear least squares (NLS) estimator. Typically, NLS estimators are computationally cumbersome, in general requiring the minimization of a multidimensional and often multimodal cost function. Here, by noting that the unknown velocity will result in a common known frequency …
Adler, Robert
2007-01-01
Floods impact more people globally than any other type of natural disaster. It has been established by experience that the most effective means to reduce the property damage and life loss caused by floods is the development of flood early warning systems. However, advances for such a system have been constrained by the difficulty in estimating rainfall continuously over space (catchment-, national-, continental-, or even global-scale areas) and time (hourly to daily). In particular, insufficient in situ data, long delays in data transmission and the absence of real-time data sharing agreements in many trans-boundary basins hamper the development of a real-time system at the regional to global scale. In many countries around the world, particularly in the tropics where rainfall and flooding co-exist in abundance, satellite-based precipitation estimation may be the best source of rainfall data for those data-scarce (ungauged) areas and trans-boundary basins. Satellite remote sensing data acquired and processed in real time can now provide the space-time information on rainfall fluxes needed to monitor severe flood events around the world. This can be achieved by integrating the satellite-derived forcing data with hydrological models, which can be parameterized by a tailored geospatial database. An example that is a key to this progress is NASA's contribution to the Tropical Rainfall Measuring Mission (TRMM), launched in November 1997. Hence, in an effort to evolve toward a more hydrologically-relevant flood alert system, this talk articulates a module-structured framework for quasi-global flood potential naming, that is 'up to date' with the state of the art on satellite rainfall estimation and the improved geospatial datasets. The system is modular in design, with a flexibility that permits changes in the model structure and in the choice of components. Four major components included in the system are: 1) multi-satellite precipitation estimation; 2) characterization of
DEFF Research Database (Denmark)
Kjeldsen, Thomas Rodding; Rosbjerg, Dan
2002-01-01
A comparison of different methods for estimating T-year events is presented, all based on the Extreme Value Type I distribution. Series of annual maximum floods from ten gauging stations on the New Zealand South Island have been used. Different methods of predicting the 100-year event and the connected uncertainty have been applied: at-site estimation and regional index-flood estimation with and without accounting for intersite correlation, using either the method of moments or the method of probability weighted moments for parameter estimation. Furthermore, estimation at ungauged sites was … the prediction uncertainty, and that the presence of intersite correlation tends to increase the uncertainty. A simulation study revealed that in regional index-flood estimation the method of probability weighted moments is preferable to method-of-moments estimation with regard to bias and RMSE.
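Probability-weighted-moments estimation for the Extreme Value Type I (Gumbel) distribution, and the resulting T-year event, can be sketched in a few lines. The "observed" record below is synthetic, drawn from a known Gumbel parent so the estimates can be checked; the true parameters are assumptions for the demonstration.

```python
import math
import random

def gumbel_pwm(sample):
    """Estimate Gumbel (EV1) parameters by probability weighted moments:
    b0 is the sample mean, b1 the first sample PWM; then
    alpha = (2*b1 - b0)/ln 2 and xi = b0 - gamma_E * alpha (standard EV1 relations)."""
    x = sorted(sample)
    n = len(x)
    b0 = sum(x) / n
    b1 = sum((i / (n - 1)) * x[i] for i in range(n)) / n  # unbiased-type plotting positions
    alpha = (2 * b1 - b0) / math.log(2)
    xi = b0 - 0.5772156649 * alpha   # Euler-Mascheroni constant
    return xi, alpha

def t_year_event(xi, alpha, T):
    """EV1 quantile for return period T: x_T = xi - alpha * ln(-ln(1 - 1/T))."""
    return xi - alpha * math.log(-math.log(1 - 1 / T))

# Synthetic annual maxima from a known Gumbel(xi=100, alpha=25) via the inverse CDF
rng = random.Random(1)
true_xi, true_alpha = 100.0, 25.0
sample = [true_xi - true_alpha * math.log(-math.log(rng.random())) for _ in range(2000)]
xi_hat, alpha_hat = gumbel_pwm(sample)
q100 = t_year_event(xi_hat, alpha_hat, 100)   # estimated 100-year event
```

In the regional index-flood setting, the same PWM fit would be applied to the pooled, index-scaled records rather than a single site.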
Can assimilation of crowdsourced data in hydrological modelling improve flood prediction?
Mazzoleni, Maurizio; Verlaan, Martin; Alfonso, Leonardo; Monego, Martina; Norbiato, Daniele; Ferri, Miche; Solomatine, Dimitri P.
2017-02-01
Monitoring stations have been used for decades to properly measure hydrological variables and better predict floods. To this end, methods to incorporate these observations into mathematical water models have also been developed. Moreover, in recent years, continued technological advances, in combination with the growing inclusion of citizens in participatory processes related to water resources management, have encouraged the growth of citizen science projects around the globe. In turn, this has stimulated the spread of low-cost sensors that allow citizens to participate in the collection of hydrological data in a more distributed way than the classic static physical sensors do. However, two main disadvantages of such crowdsourced data are their irregular availability and their variable accuracy from sensor to sensor, which make them challenging to use in hydrological modelling. This study aims to demonstrate that streamflow data, derived from crowdsourced water level observations, can improve flood prediction if integrated in hydrological models. Two different hydrological models, applied to four case studies, are considered. Realistic (albeit synthetic) time series are used to represent crowdsourced data in all case studies. In this study, it is found that data accuracy has much more influence on the model results than the irregular frequency at which the streamflow data are assimilated. This study demonstrates that data collected by citizens, characterized by being asynchronous and inaccurate, can still complement traditional networks formed by a few accurate, static sensors and improve the accuracy of flood forecasts.
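One generic way to blend irregular, variable-accuracy citizen observations into a model estimate, in the spirit of the assimilation described above, is a precision-weighted update in which each observation is down-weighted by its assumed error variance. This is a simplified scalar sketch, not the assimilation scheme used in the study; all variances below are assumptions.

```python
def assimilate(model_value, observations, model_var=100.0):
    """Precision-weighted update of a modelled streamflow value with
    crowdsourced observations of varying accuracy. `observations` is a list
    of (value, error_variance) pairs; an empty list (no citizen data at this
    time step, i.e. irregular availability) leaves the model value unchanged."""
    num = model_value / model_var + sum(v / var for v, var in observations)
    den = 1.0 / model_var + sum(1.0 / var for _, var in observations)
    return num / den

# Model says 50 m3/s; two citizen readings disagree and have different accuracies:
# an accurate reading of 60 (variance 25) and a noisy reading of 40 (variance 400)
updated = assimilate(50.0, [(60.0, 25.0), (40.0, 400.0)])
```

The accurate observation dominates the update, which mirrors the study's finding that observation accuracy matters more than how often observations arrive.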
Floods and climate: emerging perspectives for flood risk assessment and management
DEFF Research Database (Denmark)
Merz, B.; Aerts, J.; Arnbjerg-Nielsen, Karsten
2014-01-01
Flood estimation and flood management have traditionally been the domain of hydrologists, water resources engineers and statisticians, and disciplinary approaches abound. Dominant views have been shaped; one example is the catchment perspective: floods are formed and influenced by the interaction … context of floods. We come to the following conclusions: (1) extending the traditional system boundaries (local catchment, recent decades, hydrological/hydraulic processes) opens up exciting possibilities for better understanding and improved tools for flood risk assessment and management. (2) Statistical …, and this variation may be partially quantifiable and predictable, with the perspective of dynamic, climate-informed flood risk management. (4) Efforts are needed to fully account for factors that contribute to changes in all three risk components (hazard, exposure, vulnerability) and to better understand …
Barbetta, Silvia; Coccia, Gabriele; Moramarco, Tommaso; Todini, Ezio
2015-04-01
…-Curve Model in Real Time (RCM-RT) (Barbetta and Moramarco, 2014) are used to this end. Both models, without considering rainfall information explicitly, provide at each time of forecast an estimate of the lateral contribution along the river reach for which the stage forecast is performed at the downstream end. The analysis is performed for several reaches using different lead times according to the channel length. Barbetta, S., Moramarco, T., Brocca, L., Franchini, M. and Melone, F. 2014. Confidence interval of real-time forecast stages provided by the STAFOM-RCM model: the case study of the Tiber River (Italy). Hydrological Processes, 28(3), 729-743. Barbetta, S. and Moramarco, T. 2014. Real-time flood forecasting by relating local stage and remote discharge. Hydrological Sciences Journal, 59(9), 1656-1674. Coccia, G. and Todini, E. 2011. Recent developments in predictive uncertainty assessment based on the Model Conditional Processor approach. Hydrology and Earth System Sciences, 15, 3253-3274. doi:10.5194/hess-15-3253-2011. Krzysztofowicz, R. 1999. Bayesian theory of probabilistic forecasting via deterministic hydrologic model. Water Resources Research, 35, 2739-2750. Todini, E. 2004. Role and treatment of uncertainty in real-time flood forecasting. Hydrological Processes, 18(14), 2743-2746. Todini, E. 2008. A model conditional processor to assess predictive uncertainty in flood forecasting. Intl. J. River Basin Management, 6(2), 123-137.
Flood Scenario Simulation and Disaster Estimation of Ba-Ma Creek Watershed in Nantou County, Taiwan
Peng, S. H.; Hsu, Y. K.
2018-04-01
The present study proposed several scenario simulations of flood disaster according to the historical flood event and planning requirement in Ba-Ma Creek Watershed located in Nantou County, Taiwan. The simulations were made using the FLO-2D model, a numerical model which can compute the velocity and depth of flood on a two-dimensional terrain. Meanwhile, the calculated data were utilized to estimate the possible damage incurred by the flood disaster. The results thus obtained can serve as references for disaster prevention. Moreover, the simulated results could be employed for flood disaster estimation using the method suggested by the Water Resources Agency of Taiwan. Finally, the conclusions and perspectives are presented.
Improving flood risk mapping in Italy: the FloodRisk open-source software
Albano, Raffaele; Mancusi, Leonardo; Craciun, Iulia; Sole, Aurelia; Ozunu, Alexandru
2017-04-01
… management process, enhancing their awareness. This FOSS approach can promote transparency and accountability through a process of "guided discovery". Moreover, the immediacy with which information is presented by the qualitative flood risk map can facilitate and speed up the process of knowledge acquisition. An application of the FloodRisk model is shown for a pilot case in the Serio Valley (Northern Italy), and its strengths and limits, in terms of the additional effort required in its application compared with the EDQ procedure, are highlighted, focusing on the utility of the results provided for the development of FRMPs. Although limits remain that prevent applying FloodRisk without critically considering the peculiarities of the investigated area in terms of available knowledge on hazard, exposure and vulnerability, the proposed approach surely produces an increase in the available knowledge of flood risk and its drivers. This further information cannot be neglected when defining risk mitigation objectives and strategies. Hence, considering the ongoing efforts to improve data availability and quality, FloodRisk could be a suitable tool for the next revision of flood risk maps due by December 2019, effectively supporting Italian and EU practitioners in the delineation of FRMPs (and in flood risk management in general).
FloodProBE: technologies for improved safety of the built environment in relation to flood events
International Nuclear Information System (INIS)
Ree, C.C.D.F. van; Van, M.A.; Heilemann, K.; Morris, M.W.; Royet, P.; Zevenbergen, C.
2011-01-01
The FloodProBE project started as an FP7 research project in November 2009. Floods, together with wind-related storms, are considered the major natural hazard in the EU in terms of risk to people and assets. In order to adapt urban areas (in river and coastal zones) to prevent flooding or to be better prepared for floods, decision makers need to determine how to upgrade flood defences, increase the flood resilience of protected buildings and critical infrastructure (power supplies, communications, water, transport, etc.), and assess the expected risk reduction from these measures. The aim of the FloodProBE project is to improve knowledge on flood resilience and flood protection performance for balancing investments in flood risk management in urban areas. To this end, technologies, methods and tools for assessment purposes and for the adaptation of new and existing buildings and critical infrastructure are developed, tested and disseminated. Three priority areas are addressed by FloodProBE. These are: (i) vulnerability of critical infrastructure and high-density value assets, including direct and indirect damage; (ii) the assessment and reliability of urban flood defences, including the use of geophysical methods and remote sensing techniques; and (iii) concepts and technologies for upgrading weak links in flood defences, as well as construction technologies for flood-proofing buildings and infrastructure networks to increase the flood resilience of the urban system. The primary impact of FloodProBE in advancing knowledge in these areas is an increase in the cost-effectiveness (i.e. performance) of new and existing flood protection structures and flood resilience measures.
Energy Technology Data Exchange (ETDEWEB)
Grigg, Reid B.; Schechter, David S.
1999-10-15
The goal of this project is to improve the efficiency of miscible CO2 floods and enhance the prospects for flooding heterogeneous reservoirs. This report provides results of the second year of the three-year project, which explores three principles: (1) fluid and matrix interactions (understanding the problems); (2) conformance control/sweep efficiency (solving the problems); and (3) reservoir simulation for improved oil recovery (predicting results).
A Probabilistic Analysis of Surface Water Flood Risk in London.
Jenkins, Katie; Hall, Jim; Glenis, Vassilis; Kilsby, Chris
2017-10-30
Flooding in urban areas during heavy rainfall, often characterized by short-duration, high-intensity events, is known as "surface water flooding." Analyzing surface water flood risk is complex, as it requires understanding of biophysical and human factors, such as the localized scale and nature of heavy precipitation events, characteristics of the urban area affected (including detailed topography and drainage networks), and the spatial distribution of economic and social vulnerability. Climate change is recognized as having the potential to enhance the intensity and frequency of heavy rainfall events. This study develops a methodology to link high spatial resolution probabilistic projections of hourly precipitation with detailed surface water flood depth maps and characterization of urban vulnerability to estimate surface water flood risk. It incorporates probabilistic information on the range of uncertainties in future precipitation in a changing climate. The method is applied to a case study of Greater London and highlights that both the frequency and spatial extent of surface water flood events are set to increase under future climate change. The expected annual damage from surface water flooding is estimated to be £171 million, £343 million, and £390 million/year under the baseline, 2030 high, and 2050 high climate change scenarios, respectively. © 2017 Society for Risk Analysis.
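An expected-annual-damage figure of the kind quoted above is conventionally the area under the damage-vs-exceedance-probability curve, approximated by trapezoidal integration over a set of modelled return periods. The return periods and damage figures below are invented for illustration, not the study's values.

```python
def expected_annual_damage(events):
    """Trapezoidal integration of damage over annual exceedance probability.
    `events` is a list of (return_period_years, damage) pairs; the expected
    annual damage (EAD) is the area under the damage-probability curve."""
    pts = sorted(((1.0 / T, d) for T, d in events), reverse=True)  # prob descending
    ead = 0.0
    for (p0, d0), (p1, d1) in zip(pts, pts[1:]):
        ead += 0.5 * (d0 + d1) * (p0 - p1)
    return ead

# Illustrative damages (million GBP) for a set of surface-water flood return periods
ead = expected_annual_damage([(2, 10.0), (10, 80.0), (30, 200.0), (100, 400.0)])
```

Re-running the same integration under each climate scenario's damage curve yields scenario-specific EADs like the baseline/2030/2050 figures reported.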
Directory of Open Access Journals (Sweden)
Ramni Kumar SARKAR
2012-12-01
Full Text Available Farmers in South East Asia are shifting rice crop establishment methods from transplanting to direct wet or dry seeding, as these require less labour, time, and energy than transplanting. In contrast to irrigated conditions, direct seeding is already common practice in rainfed lowlands. Early flooding controls weeds but decreases seedling establishment in direct-seeded rice. Anaerobic germination is an important trait to counteract the damage caused by early flooding. Management options that help crop establishment and improve crop growth under flooding might remove the constraints related to direct seeding. The investigation was carried out with two near-isogenic lines, Swarna and Swarna-Sub1. Swarna-Sub1 is tolerant to submergence, whereas Swarna is susceptible. Seed priming was done with water and 2% Jamun (Syzygium cumini) leaf extract, and it improved seedling establishment under flooding. Seed pretreatment accelerated growth, resulting in longer seedlings and greater biomass accumulation. Seed priming increased the activities of total amylase and alcohol dehydrogenase more in Swarna-Sub1 than in Swarna. Swarna-Sub1 outperformed Swarna when the plants were cultivated under flooding. Weed biomass decreased significantly under flooding compared to non-flooding conditions. Seed priming had positive effects on yield and yield-attributing parameters under both non-flooding and early flooding conditions.
A new framework for estimating return levels using regional frequency analysis
Winter, Hugo; Bernardara, Pietro; Clegg, Georgina
2017-04-01
We propose a new framework for incorporating more spatial and temporal information into the estimation of extreme return levels. Currently, most studies use extreme value models applied to data from a single site; an approach which is inefficient statistically and leads to return level estimates that are less physically realistic. We aim to highlight the benefits that could be obtained by using methodology based upon regional frequency analysis as opposed to classic single-site extreme value analysis. This motivates a shift in thinking, which permits the evaluation of local and regional effects and makes use of the wide variety of data that are now available on high temporal and spatial resolutions. The recent winter storms over the UK during the winters of 2013-14 and 2015-16, which caused wide-ranging disruption and damaged important infrastructure, provide the main motivation for the current work. One of the most impactful natural hazards is flooding, which is often initiated by extreme precipitation. In this presentation, we focus on extreme rainfall, but shall discuss other meteorological variables alongside potentially damaging hazard combinations. To understand the risks posed by extreme precipitation, we need reliable statistical models which can be used to estimate quantities such as the T-year return level, i.e. the level which is expected to be exceeded once every T years. Extreme value theory provides the main collection of statistical models that can be used to estimate the risks posed by extreme precipitation events. Broadly, at a single site, a statistical model is fitted to exceedances of a high threshold and the model is used to extrapolate to levels beyond the range of the observed data. However, when we have data at many sites over a spatial domain, fitting a separate model for each site makes little sense, and it would be better if we could incorporate all this information to improve the reliability of return level estimates. Here
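The single-site peaks-over-threshold extrapolation described above is conventionally done with a Generalized Pareto fit to the threshold exceedances; the T-year return level then follows in closed form. The threshold, fitted parameters, and exceedance rate below are illustrative assumptions, not fitted values from the work.

```python
def gpd_return_level(u, sigma, xi, rate_per_year, T):
    """T-year return level from a Generalized Pareto fit to exceedances of a
    threshold u (for shape xi != 0):
        z_T = u + (sigma / xi) * ((T * rate)^xi - 1),
    where `rate_per_year` is the mean number of threshold exceedances per year."""
    return u + (sigma / xi) * ((T * rate_per_year) ** xi - 1.0)

# Illustrative single-site fit: 30 mm/day threshold, about 5 exceedances per year
z100 = gpd_return_level(u=30.0, sigma=8.0, xi=0.1, rate_per_year=5.0, T=100)
z10 = gpd_return_level(u=30.0, sigma=8.0, xi=0.1, rate_per_year=5.0, T=10)
```

The regional-frequency-analysis alternative pools scaled exceedances from many sites before fitting, which reduces the variance of the estimated sigma and xi and hence of z_T.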
Luke, Adam; Vrugt, Jasper A.; AghaKouchak, Amir; Matthew, Richard; Sanders, Brett F.
2017-07-01
Nonstationary extreme value analysis (NEVA) can improve the statistical representation of observed flood peak distributions compared to stationary (ST) analysis, but management of flood risk relies on predictions of out-of-sample distributions for which NEVA has not been comprehensively evaluated. In this study, we apply split-sample testing to 1250 annual maximum discharge records in the United States and compare the predictive capabilities of NEVA relative to ST extreme value analysis using a log-Pearson Type III (LPIII) distribution. The parameters of the LPIII distribution in the ST and nonstationary (NS) models are estimated from the first half of each record using Bayesian inference. The second half of each record is reserved to evaluate the predictions under the ST and NS models. The NS model is applied for prediction by (1) extrapolating the trend of the NS model parameters throughout the evaluation period and (2) using the NS model parameter values at the end of the fitting period to predict with an updated ST model (uST). Our analysis shows that the ST predictions are preferred, overall. NS model parameter extrapolation is rarely preferred. However, if fitting period discharges are influenced by physical changes in the watershed, for example from anthropogenic activity, the uST model is strongly preferred relative to ST and NS predictions. The uST model is therefore recommended for evaluation of current flood risk in watersheds that have undergone physical changes. Supporting information includes a MATLAB® program that estimates the (ST/NS/uST) LPIII parameters from annual peak discharge data through Bayesian inference.
International Nuclear Information System (INIS)
Grigg, Reid B.; Schechter, David S.
1999-01-01
The goal of this project is to improve the efficiency of miscible CO2 floods and enhance the prospects for flooding heterogeneous reservoirs. This report provides results of the second year of the three-year project, which explores three principles: (1) fluid and matrix interactions (understanding the problems); (2) conformance control/sweep efficiency (solving the problems); and (3) reservoir simulation for improved oil recovery (predicting results).
Improving Flood Damage Assessment Models in Italy
Amadio, M.; Mysiak, J.; Carrera, L.; Koks, E.
2015-12-01
The use of Stage-Damage Curve (SDC) models is prevalent in ex-ante assessments of flood risk. To assess the potential damage of a flood event, SDCs describe a relation between water depth and the associated potential economic damage over land use. This relation is normally developed and calibrated through site-specific analysis based on ex-post damage observations. In some cases (e.g. Italy) SDCs are transferred from other countries, undermining the accuracy and reliability of simulation results. Against this background, we developed a refined SDC model for Northern Italy, underpinned by damage compensation records from a recent flood event. Our analysis considers both damage to physical assets and production losses from business interruptions. While the former is calculated from land use information, production losses are measured through the spatial distribution of Gross Value Added (GVA). An additional component of the model assesses crop-specific agricultural losses as a function of flood seasonality. Our results show an overestimation of asset damage from non-calibrated SDC values by up to a factor of 4.5 for the tested land use categories. Furthermore, we estimate that production losses amount to around 6 per cent of the annual GVA. Also, maximum yield losses are less than half the amount predicted by the standard SDC methods.
Why continuous simulation? The role of antecedent moisture in design flood estimation
Pathiraja, S.; Westra, S.; Sharma, A.
2012-06-01
Continuous simulation for design flood estimation is increasingly becoming a viable alternative to traditional event-based methods. The advantage of continuous simulation approaches is that the catchment moisture state prior to the flood-producing rainfall event is implicitly incorporated within the modeling framework, provided the model has been calibrated and validated to produce reasonable simulations. This contrasts with event-based models, in which both information about the expected sequence of rainfall and evaporation preceding the flood-producing rainfall event and the catchment storage and infiltration properties are commonly pooled together into a single set of "loss" parameters which require adjustment through the process of calibration. To identify the importance of accounting for antecedent moisture in flood modeling, this paper uses a continuous rainfall-runoff model calibrated to 45 catchments in the Murray-Darling Basin in Australia. Flood peaks derived using the historical daily rainfall record are compared with those derived using resampled daily rainfall, for which the sequencing of wet and dry days preceding the heavy rainfall event is removed. The analysis shows a consistent underestimation of design flood events when antecedent moisture is not properly simulated, which can be as much as 30% when only 1 or 2 days of antecedent rainfall are considered, compared to 5% when this is extended to 60 days of prior rainfall. These results show that, in general, it is necessary to consider both short-term memory in rainfall associated with synoptic-scale dependence and longer-term memory associated with seasonal or longer timescale variability in order to obtain accurate design flood estimates.
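The historical-versus-resampled comparison can be caricatured with a toy experiment (this is not the paper's calibrated model; the one-bucket catchment, Markov rainfall generator, and all parameters are invented for illustration): shuffling the daily rainfall preserves its marginal distribution but destroys the wet-spell sequencing that sets antecedent storage.

```python
# Toy illustration: a one-bucket rainfall-runoff model driven by (a) persistent
# "historical" rainfall and (b) the same values with day-to-day sequencing
# shuffled away. Catchment, rainfall model and parameters are all hypothetical.
import random

def simulate_peaks(rain, k=0.2, days_per_year=365):
    s, flows = 0.0, []
    for r in rain:
        s += r
        q = k * s      # linear-reservoir outflow
        s -= q
        flows.append(q)
    n = len(rain) // days_per_year
    # annual maxima of simulated flow
    return [max(flows[i*days_per_year:(i+1)*days_per_year]) for i in range(n)]

random.seed(1)
# Markov-persistent rainfall: wet days cluster, giving wet antecedent conditions.
rain, wet = [], False
for _ in range(50 * 365):
    wet = random.random() < (0.8 if wet else 0.2)
    rain.append(random.expovariate(1 / 10.0) if wet else 0.0)

shuffled = rain[:]
random.shuffle(shuffled)  # same totals; antecedent sequencing destroyed

hist = simulate_peaks(rain)
resamp = simulate_peaks(shuffled)
print(sum(hist) / len(hist), sum(resamp) / len(resamp))
```

Because storage before large storms sits above average in the persistent series, the shuffled series typically yields smaller annual maxima, in the direction of the underestimation the paper reports.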
Directory of Open Access Journals (Sweden)
Gehendra Kharel
2018-04-01
Full Text Available Background Water level fluctuations in endorheic lakes are highly susceptible to even slight changes in climate and land use. Devils Lake (DL) in North Dakota, USA is an endorheic system that has undergone multi-decade flooding driven by changes in regional climate. Flooding mitigation strategies have centered on the release of lake water to a nearby river system through artificial outlets, resulting in legal challenges and environmental concerns related to water quality, downstream flooding, species migration, stakeholder opposition, and transboundary water conflicts between the US and Canada. Despite these drawbacks, running the outlets would result in low overspill risks in the next 30 years. Methods In this study we evaluated the efficacy of this outlet-based mitigation strategy under scenarios based on the latest IPCC future climate projections. We used Coupled Model Intercomparison Project (CMIP-5) weather patterns from 17 general circulation models (GCMs) obtained under four representative concentration pathway (RCP) scenarios and downscaled to the DL region. Then, we simulated the changes in lake water levels using a Soil and Water Assessment Tool (SWAT) based hydrological model of the watershed. We estimated the probability of future flood risks under those scenarios and compared those with previously estimated overspill risks under the CMIP-3 climate. Results The CMIP-5 ensemble projected a mean annual temperature of 5.78 °C and mean daily precipitation of 1.42 mm/day; both are higher than the existing CMIP-3 future estimates of 4.98 °C and 1.40 mm/day, respectively. The increased precipitation and higher temperature resulted in a significant increase of DL’s overspill risks: 24.4–47.1% without release from outlets and 3.5–14.4% even if the outlets are operated at their combined full 17 m3/s capacity. Discussion The modeled increases in overspill risks indicate a greater frequency of water releases through the artificial outlets. Future
Qi, Wei
2017-11-01
Cost-benefit analysis is commonly used for engineering planning and design problems in practice. However, previous cost-benefit based design flood estimation rests on a stationarity assumption. This study develops a non-stationary cost-benefit based design flood estimation approach. The approach integrates a non-stationary probability distribution function into cost-benefit analysis, so that the influence of non-stationarity on the expected total cost (including flood damage and construction costs) and on design flood estimation can be quantified. To facilitate design flood selection, a 'Risk-Cost' analysis approach is developed, which reveals the nexus of extreme flood risk, expected total cost and design life period. Two basins, with 54-year and 104-year flood records respectively, are used to illustrate the application. It is found that the developed approach can effectively reveal changes in expected total cost and extreme floods over different design life periods. In addition, trade-offs are found between extreme flood risk and expected total cost, reflecting increases in cost to mitigate risk. Compared with stationary approaches, which generate only one expected total cost curve and therefore a single design flood estimate, the proposed approach generates design flood estimation intervals, and the 'Risk-Cost' analysis selects a design flood value from these intervals based on the trade-offs between extreme flood risk and expected total cost. This study provides a new approach towards a better understanding of the influence of non-stationarity on expected total cost and design floods, and could benefit cost-benefit based non-stationary design flood estimation across the world.
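A minimal sketch of the expected-total-cost idea under non-stationarity, assuming a Gumbel annual-maximum distribution whose location drifts linearly in time; the cost functions, trend, and all parameter values are illustrative assumptions, not the study's fitted model.

```python
# Hedged sketch of cost-benefit design flood selection under non-stationarity.
import math

def gumbel_exceed(z, mu, beta):
    """P(annual maximum > z) for a Gumbel(mu, beta) distribution."""
    return 1.0 - math.exp(-math.exp(-(z - mu) / beta))

def expected_total_cost(z, life, mu0=100.0, trend=0.5, beta=20.0,
                        build_cost=2.0, damage=500.0):
    construction = build_cost * z  # construction cost assumed linear in z
    expected_damage = sum(damage * gumbel_exceed(z, mu0 + trend * t, beta)
                          for t in range(life))  # location drifts upward
    return construction + expected_damage

def optimal_design(life, grid=range(100, 401)):
    """Design flood minimizing expected total cost over the design life."""
    return min(grid, key=lambda z: expected_total_cost(z, life))

# Under an upward trend, longer design lives call for higher design floods.
print(optimal_design(20), optimal_design(50))
```

The spread of optima across design-life periods is a crude analogue of the design flood estimation intervals the paper discusses.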
Joint Angle and Frequency Estimation Using Multiple-Delay Output Based on ESPRIT
Xudong, Wang
2010-12-01
This paper presents a novel ESPRIT-based algorithm for joint angle and frequency estimation using multiple-delay output (MDJAFE). The algorithm can estimate angles and frequencies jointly, and the use of multiple delayed outputs greatly improves the estimation accuracy compared with a conventional algorithm. The useful behavior of the proposed algorithm is verified by simulations.
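The core subspace step of ESPRIT can be shown in its simplest frequency-only form (the paper's MDJAFE algorithm estimates angle and frequency jointly from multiple delayed outputs, which is beyond this sketch; the noise-free two-tone signal and all parameters are hypothetical):

```python
# Minimal 1-D ESPRIT frequency estimator: Hankel data matrix, signal subspace
# from the SVD, and the rotational-invariance equation solved in least squares.
import numpy as np

def esprit_freqs(x, p):
    """Estimate p normalized frequencies from complex samples x via ESPRIT."""
    n = len(x)
    m = n // 2
    # Hankel data matrix whose column space spans the signal subspace
    X = np.array([x[i:i + m] for i in range(n - m + 1)]).T
    U, _, _ = np.linalg.svd(X, full_matrices=False)
    Us = U[:, :p]
    # Rotational invariance: Us[1:] ≈ Us[:-1] @ Phi, eig(Phi) = exp(j*2*pi*f)
    Phi = np.linalg.pinv(Us[:-1]) @ Us[1:]
    return sorted(np.angle(np.linalg.eigvals(Phi)) / (2 * np.pi))

n = 128
t = np.arange(n)
x = np.exp(2j * np.pi * 0.11 * t) + np.exp(2j * np.pi * 0.23 * t)
print(esprit_freqs(x, 2))  # close to [0.11, 0.23]
```

In the joint angle-frequency setting, the same invariance structure is exploited along both the array and delay dimensions.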
International Nuclear Information System (INIS)
1980-05-01
Estimation of the design basis flood for Nuclear Power Plants can be carried out using either deterministic or stochastic techniques. Stochastic techniques, while widely used for the solution of a variety of hydrological and other problems, have not been used to date (1980) in connection with the estimation of design basis flood for NPP siting. This study compares the two techniques against one specific river site (Galt on the Grand River, Ontario). The study concludes that both techniques lead to comparable results, but that stochastic techniques have the advantage of extracting maximum information from available data and presenting the results (flood flow) as a continuous function of probability together with estimation of confidence limits. (author)
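The stochastic output described here, flood flow as a continuous function of probability with confidence limits, can be sketched with a Gumbel fit and a bootstrap; the 40-year record below is synthetic and the distribution choice is an assumption of this sketch, not the study's method.

```python
# Sketch: fit a Gumbel distribution to annual maximum flows, express flood flow
# as a continuous function of exceedance probability, and attach bootstrap
# confidence limits. All data are synthetic.
import math, random, statistics

def gumbel_fit(xs):
    """Method-of-moments Gumbel fit: location mu, scale beta."""
    beta = statistics.stdev(xs) * math.sqrt(6) / math.pi
    mu = statistics.mean(xs) - 0.5772 * beta
    return mu, beta

def gumbel_quantile(p_exceed, mu, beta):
    """Flood flow exceeded with annual probability p_exceed."""
    return mu - beta * math.log(-math.log(1.0 - p_exceed))

random.seed(7)
# Synthetic 40-year annual maximum flow record (Gumbel-distributed stand-in).
obs = [500.0 - 120.0 * math.log(-math.log(random.random())) for _ in range(40)]

mu, beta = gumbel_fit(obs)
# 90% bootstrap confidence limits for the 100-year flood (exceedance p = 0.01).
boots = sorted(
    gumbel_quantile(0.01, *gumbel_fit([random.choice(obs) for _ in obs]))
    for _ in range(2000)
)
print(gumbel_quantile(0.01, mu, beta), boots[100], boots[1899])
```

Evaluating `gumbel_quantile` over a range of probabilities yields the continuous flood-probability curve the abstract refers to.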
Correcting the bias of empirical frequency parameter estimators in codon models.
Directory of Open Access Journals (Sweden)
Sergei Kosakovsky Pond
2010-07-01
Full Text Available Markov models of codon substitution are powerful inferential tools for studying biological processes such as natural selection and preferences in amino acid substitution. The equilibrium character distributions of these models are almost always estimated using nucleotide frequencies observed in a sequence alignment, primarily as a matter of historical convention. In this note, we demonstrate that a popular class of such estimators is biased, and that this bias has an adverse effect on goodness of fit and estimates of substitution rates. We propose a "corrected" empirical estimator that begins with observed nucleotide counts, but accounts for the nucleotide composition of stop codons. We show via simulation that the corrected estimates outperform the de facto standard estimates not just by providing better estimates of the frequencies themselves, but also by leading to improved estimation of other parameters in the evolutionary models. On a curated collection of sequence alignments, our estimators show a significant improvement in goodness of fit compared to the de facto standard approach. Maximum likelihood estimation of the frequency parameters appears to be warranted in many cases, albeit at a greater computational cost. Our results demonstrate that there is little justification, either statistical or computational, for continued use of the de facto standard estimators.
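The stop-codon correction can be illustrated in miniature: build codon frequencies from position-specific nucleotide frequencies, then renormalize over sense codons only so stop codons carry zero mass. This is a sketch of the idea, not the authors' exact estimator.

```python
# Stop-codon-aware empirical codon frequency sketch (universal genetic code).
from itertools import product

STOPS = {"TAA", "TAG", "TGA"}

def codon_frequencies(pos_freqs):
    """pos_freqs: three dicts nucleotide -> frequency (one per codon position)."""
    raw = {"".join(c): pos_freqs[0][c[0]] * pos_freqs[1][c[1]] * pos_freqs[2][c[2]]
           for c in product("ACGT", repeat=3)}
    # Renormalize over the 61 sense codons; stop codons get zero probability.
    sense_mass = sum(v for k, v in raw.items() if k not in STOPS)
    return {k: (0.0 if k in STOPS else v / sense_mass) for k, v in raw.items()}

uniform = {n: 0.25 for n in "ACGT"}
freqs = codon_frequencies([uniform, uniform, uniform])
print(freqs["TAA"], round(sum(freqs.values()), 6))  # 0.0 1.0
```

Without the renormalization step, probability mass assigned to stop codons would distort the equilibrium distribution, which is the bias the note describes.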
International Nuclear Information System (INIS)
Solomon, S.I.; Harvey, K.D.
1982-12-01
The IAEA Safety Guide 50-SG-S10A recommends that design basis floods be estimated by deterministic techniques using probable maximum precipitation and a rainfall runoff model to evaluate the corresponding flood. The Guide indicates that stochastic techniques are also acceptable, in which case floods of very low probability have to be estimated. The paper compares the results of applying the two techniques in two river basins at a number of locations and concludes that the uncertainty of the results of both techniques is of the same order of magnitude. However, the use of the unit hydrograph as the rainfall runoff model may lead in some cases to nonconservative estimates. A distributed non-linear rainfall runoff model leads to estimates of probable maximum flood flows which are very close to values of flows having a 10^6 to 10^7 year return interval estimated using a conservative and relatively simple stochastic technique. Recommendations on the practical application of Safety Guide 50-SG-S10A are made and the extension of the stochastic technique to ungauged sites and other design parameters is discussed.
Improving Global Flood Forecasting using Satellite Detected Flood Extent
Revilla Romero, B.
2016-01-01
Flooding is a natural global phenomenon but in many cases is exacerbated by human activity. Although flooding generally affects humans in a negative way, bringing death, suffering, and economic impacts, it also has potentially beneficial effects. Early flood warning and forecasting systems, as well
Technical note: Design flood under hydrological uncertainty
Botto, Anna; Ganora, Daniele; Claps, Pierluigi; Laio, Francesco
2017-07-01
Planning and verification of hydraulic infrastructures require a design estimate of hydrologic variables, usually provided by frequency analysis while neglecting hydrologic uncertainty. However, when hydrologic uncertainty is accounted for, the design flood value for a specific return period is no longer a unique value but is represented by a distribution of values. As a consequence, the design flood is no longer univocally defined, making the design process undetermined. The Uncertainty Compliant Design Flood Estimation (UNCODE) procedure is a novel approach that, starting from a range of possible design flood estimates obtained under uncertain conditions, converges to a single design value. This is obtained through a cost-benefit criterion with additional constraints, solved numerically in a simulation framework. This paper promotes practical use of the UNCODE procedure without resorting to numerical computation. A modified procedure is proposed that uses a correction coefficient to modify the standard (i.e., uncertainty-free) design value on the basis of sample length and return period only. The procedure is robust and parsimonious, as it does not require additional parameters with respect to the traditional uncertainty-free analysis. Simple equations to compute the correction term are provided for a number of probability distributions commonly used to represent the flood frequency curve. The UNCODE procedure, when coupled with this simple correction factor, provides a robust way to manage hydrologic uncertainty and to go beyond the use of traditional safety factors. With all other parameters being equal, an increase in sample length reduces the correction factor, and thus the construction costs, while still keeping the same safety level.
Lazrus, Heather; Morss, Rebecca E; Demuth, Julie L; Lazo, Jeffrey K; Bostrom, Ann
2016-02-01
Understanding how people view flash flood risks can help improve risk communication, ultimately improving outcomes. This article analyzes data from 26 mental models interviews about flash floods with members of the public in Boulder, Colorado, to understand their perspectives on flash flood risks and mitigation. The analysis includes a comparison between public and professional perspectives by referencing a companion mental models study of Boulder-area professionals. A mental models approach can help to diagnose what people already know about flash flood risks and responses, as well as any critical gaps in their knowledge that might be addressed through improved risk communication. A few public interviewees mentioned most of the key concepts discussed by professionals as important for flash flood warning decision making. However, most interviewees exhibited some incomplete understandings and misconceptions about aspects of flash flood development and exposure, effects, or mitigation that may lead to ineffective warning decisions when a flash flood threatens. These include important misunderstandings about the rapid evolution of flash floods, the speed of water in flash floods, the locations and times that pose the greatest flash flood risk in Boulder, the value of situational awareness and environmental cues, and the most appropriate responses when a flash flood threatens. The findings point to recommendations for ways to improve risk communication, over the long term and when an event threatens, to help people quickly recognize and understand threats, obtain needed information, and make informed decisions in complex, rapidly evolving extreme weather events such as flash floods. © 2015 Society for Risk Analysis.
Adaptive OFDM Radar Waveform Design for Improved Micro-Doppler Estimation
Energy Technology Data Exchange (ETDEWEB)
Sen, Satyabrata [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Center for Engineering Science Advanced Research, Computer Science and Mathematics Division
2014-07-01
Here we analyze the performance of a wideband orthogonal frequency division multiplexing (OFDM) signal in estimating the micro-Doppler frequency of a rotating target having multiple scattering centers. The use of a frequency-diverse OFDM signal enables us to independently analyze the micro-Doppler characteristics with respect to a set of orthogonal subcarrier frequencies. We characterize the accuracy of micro-Doppler frequency estimation by computing the Cramer-Rao bound (CRB) on the angular-velocity estimate of the target. Additionally, to improve the accuracy of the estimation procedure, we formulate and solve an optimization problem by minimizing the CRB on the angular-velocity estimate with respect to the OFDM spectral coefficients. We present several numerical examples to demonstrate the CRB variations with respect to the signal-to-noise ratios, number of temporal samples, and number of OFDM subcarriers. We also analyzed numerically the improvement in estimation accuracy due to the adaptive waveform design. A grid-based maximum likelihood estimation technique is applied to evaluate the corresponding mean-squared error performance.
Estimation of climate change impacts on hydrology and floods in Finland
Energy Technology Data Exchange (ETDEWEB)
Veijalainen, N.
2012-07-01
Climate scenarios project increases in air temperature and precipitation in Finland during the 21st century, and these will result in changes in hydrology. In this thesis, climate change impacts on hydrology and floods in Finland were estimated with hydrological modelling and several climate scenarios. One of the goals was to understand the influence of different processes and catchment characteristics on the hydrological response to climate change in boreal conditions. The tool of the climate change impact assessment was the conceptual hydrological model WSFS (Watershed Simulation and Forecasting System). The studies employed and compared two methods of transferring the climate change signal from climate models to the WSFS hydrological model (the delta change approach and direct bias-corrected Regional Climate Model (RCM) data). Direct RCM data were used to simulate transient hydrological scenarios for 1951-2100, and the simulation results were analysed to detect changes in water balance components and trends in discharge series. The results revealed that seasonal changes in discharges in Finland were the clearest impacts of climate change. Air temperature increase will affect snow accumulation and melt, increase winter discharge and decrease spring snowmelt discharge. The impacts of climate change on floods in Finland by 2070-2099 varied considerably depending on the location, catchment characteristics, timing of the floods and climate scenario. Floods caused by spring snowmelt decreased or remained unchanged, whereas autumn and winter floods caused by precipitation increased, especially in large lakes and their outflow rivers. Since estimation of climate change impacts includes uncertainties in every step of the long modelling process, the accumulated uncertainties by the end of the process become large. The large differences between results from different climate scenarios highlight the need to use several climate scenarios in climate change impact studies.
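The delta change transfer mentioned in the abstract is conceptually simple: observed temperature is shifted by the modelled temperature change, and observed precipitation is scaled by the relative precipitation change. The numbers below are illustrative, not Finnish scenario values.

```python
# Minimal delta-change sketch: additive change for temperature, multiplicative
# change for precipitation. Inputs and change factors are hypothetical.
def delta_change(obs_temp, obs_precip, d_temp, precip_factor):
    temp = [t + d_temp for t in obs_temp]              # shift temperatures
    precip = [p * precip_factor for p in obs_precip]   # scale precipitation
    return temp, precip

obs_t = [-5.0, 0.0, 10.0]   # observed daily temperatures (deg C)
obs_p = [2.0, 0.0, 5.0]     # observed daily precipitation (mm)
t, p = delta_change(obs_t, obs_p, d_temp=3.0, precip_factor=1.5)
print(t, p)  # [-2.0, 3.0, 13.0] [3.0, 0.0, 7.5]
```

The perturbed series then drives the hydrological model, in contrast to the direct use of bias-corrected RCM output.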
Socio-economic Impact Analysis for Near Real-Time Flood Detection in the Lower Mekong River Basin
Oddo, P.; Ahamed, A.; Bolten, J. D.
2017-12-01
Flood events pose a severe threat to communities in the Lower Mekong River Basin. The combination of population growth, urbanization, and economic development exacerbate the impacts of these flood events. Flood damage assessments are frequently used to quantify the economic losses in the wake of storms. These assessments are critical for understanding the effects of flooding on the local population, and for informing decision-makers about future risks. Remote sensing systems provide a valuable tool for monitoring flood conditions and assessing their severity more rapidly than traditional post-event evaluations. The frequency and severity of extreme flood events are projected to increase, further illustrating the need for improved flood monitoring and impact analysis. In this study we implement a socio-economic damage model into a decision support tool with near real-time flood detection capabilities (NASA's Project Mekong). Surface water extent for current and historical floods is found using multispectral Moderate-resolution Imaging Spectroradiometer (MODIS) 250-meter imagery and the spectral Normalized Difference Vegetation Index (NDVI) signatures of permanent water bodies (MOD44W). Direct and indirect damages to populations, infrastructure, and agriculture are assessed using the 2011 Southeast Asian flood as a case study. Improved land cover and flood depth assessments result in a more refined understanding of losses throughout the Mekong River Basin. Results suggest that rapid initial estimates of flood impacts can provide valuable information to governments, international agencies, and disaster responders in the wake of extreme flood events.
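The flood-detection logic sketched in this abstract reduces, in its simplest form, to thresholding a vegetation index and masking out permanent water. The threshold value and the tiny arrays below are illustrative stand-ins for MODIS NDVI imagery and the MOD44W water mask.

```python
# Simplified flood-extent logic: pixels with NDVI below a threshold are flagged
# as water; flood extent is water minus the permanent water mask.
NDVI_WATER_THRESHOLD = 0.0  # open water typically has NDVI <= 0 (assumed here)

def flood_mask(ndvi, permanent_water):
    return [[ndvi[i][j] < NDVI_WATER_THRESHOLD and not permanent_water[i][j]
             for j in range(len(ndvi[0]))] for i in range(len(ndvi))]

ndvi = [[0.6, -0.2],
        [-0.1, 0.3]]
perm = [[False, True],    # top-right pixel is a permanent water body
        [False, False]]
print(flood_mask(ndvi, perm))  # [[False, False], [True, False]]
```

Overlaying such a mask on population, infrastructure, and land-cover layers is what turns detected extent into the damage estimates the study describes.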
Going beyond the flood insurance rate map: insights from flood hazard map co-production
Luke, Adam; Sanders, Brett F.; Goodrich, Kristen A.; Feldman, David L.; Boudreau, Danielle; Eguiarte, Ana; Serrano, Kimberly; Reyes, Abigail; Schubert, Jochen E.; AghaKouchak, Amir; Basolo, Victoria; Matthew, Richard A.
2018-04-01
Flood hazard mapping in the United States (US) is deeply tied to the National Flood Insurance Program (NFIP). Consequently, publicly available flood maps provide essential information for insurance purposes, but they do not necessarily provide relevant information for non-insurance aspects of flood risk management (FRM) such as public education and emergency planning. Recent calls for flood hazard maps that support a wider variety of FRM tasks highlight the need to deepen our understanding about the factors that make flood maps useful and understandable for local end users. In this study, social scientists and engineers explore opportunities for improving the utility and relevance of flood hazard maps through the co-production of maps responsive to end users' FRM needs. Specifically, two-dimensional flood modeling produced a set of baseline hazard maps for stakeholders of the Tijuana River valley, US, and Los Laureles Canyon in Tijuana, Mexico. Focus groups with natural resource managers, city planners, emergency managers, academia, non-profit, and community leaders refined the baseline hazard maps by triggering additional modeling scenarios and map revisions. Several important end user preferences emerged, such as (1) legends that frame flood intensity both qualitatively and quantitatively, and (2) flood scenario descriptions that report flood magnitude in terms of rainfall, streamflow, and its relation to an historic event. Regarding desired hazard map content, end users' requests revealed general consistency with mapping needs reported in European studies and guidelines published in Australia. However, requested map content that is not commonly produced included (1) standing water depths following the flood, (2) the erosive potential of flowing water, and (3) pluvial flood hazards, or flooding caused directly by rainfall. We conclude that the relevance and utility of commonly produced flood hazard maps can be most improved by illustrating pluvial flood hazards
Remote sensing estimates of impervious surfaces for pluvial flood modelling
DEFF Research Database (Denmark)
Kaspersen, Per Skougaard; Drews, Martin
This paper investigates the accuracy of medium resolution (MR) satellite imagery in estimating impervious surfaces for European cities at the detail required for pluvial flood modelling. Using remote sensing techniques enables precise and systematic quantification of the influence of the past 30...
International Nuclear Information System (INIS)
Solomon, S.I.; Harvey, K.D.; Asmis, G.J.K.
1983-01-01
The IAEA Safety Guide 50-SG-S10A recommends that design basis floods be estimated by deterministic techniques using probable maximum precipitation and a rainfall runoff model to evaluate the corresponding flood. The Guide indicates that stochastic techniques are also acceptable in which case floods of very low probability have to be estimated. The paper compares the results of applying the two techniques in two river basins at a number of locations and concludes that the uncertainty of the results of both techniques is of the same order of magnitude. However, the use of the unit hydrograph as the rain fall runoff model may lead in some cases to non-conservative estimates. A distributed non-linear rainfall runoff model leads to estimates of probable maximum flood flows which are very close to values of flows having a 10 6 to 10 7 years return interval estimated using a conservative and relatively simple stochastic technique. Recommendations on the practical application of Safety Guide 50-SG-10A are made and the extension of the stochastic technique to ungauged sites and other design parameters is discussed
Quantitative flood risk assessment for Polders
International Nuclear Information System (INIS)
Manen, Sipke E. van; Brinkhuis, Martine
2005-01-01
In the Netherlands, the design of dikes and other water retaining structures is based on an acceptable probability (frequency) of overtopping. In 1993 a new safety concept was introduced based on total flood risk. Risk was defined as the product of probability and consequences. In recent years advanced tools have become available to calculate the actual flood risk of a polder. This paper describes the application of these tools to an existing lowland river area. The complete chain of calculations necessary to estimate the risk of flooding of a polder (or dike ring) is presented. The difficulties in applying the present day tools and the largest uncertainties in the calculations are shown.
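The risk definition used here, probability times consequences, summed over flood scenarios, gives an expected annual damage; the scenario probabilities and damages below are hypothetical.

```python
# Expected annual damage as the probability-weighted sum of consequences.
def expected_annual_damage(scenarios):
    """scenarios: list of (annual probability, damage) pairs."""
    return sum(p * d for p, d in scenarios)

# Hypothetical polder scenarios (probabilities and damages are illustrative).
polder = [(1 / 100, 50e6),     # 100-year flood
          (1 / 1000, 400e6),   # 1000-year flood
          (1 / 10000, 2e9)]    # 10000-year flood
print(expected_annual_damage(polder))  # 1100000.0 per year
```

A full polder calculation chains dike-failure probabilities, inundation modelling, and damage curves to obtain these pairs; the sum itself is the simple part.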
Statistical analysis of the uncertainty related to flood hazard appraisal
Notaro, Vincenza; Freni, Gabriele
2015-12-01
The estimation of flood hazard frequency statistics for an urban catchment is of great practical interest. It provides an evaluation of potential flood risk and related damage and supports decision making for flood risk management. Flood risk is usually defined as a function of the probability that a system deficiency causes flooding (hazard) and the expected damage due to the flooding magnitude (damage), taking into account both the exposure and the vulnerability of the goods at risk. The expected flood damage can be evaluated by an a priori estimation of potential damage caused by flooding or by interpolating real damage data. With regard to flood hazard appraisal, several procedures propose to identify a hazard indicator (HI), such as flood depth or the combination of flood depth and velocity, and to assess the flood hazard of the analyzed area by comparing the HI variables with user-defined threshold values or curves (penalty curves or matrices). However, flooding data are usually unavailable or too piecemeal to allow a reliable flood hazard analysis, so hazard analysis is often performed by means of mathematical simulations aimed at evaluating water levels and flow velocities over the catchment surface. As a result, a great part of the uncertainty intrinsic to flood risk appraisal can be traced to the hazard evaluation, owing to the uncertainty inherent in modelling results and to the subjectivity of the user-defined hazard thresholds applied to link flood depth to a hazard level. In the present work, a statistical methodology is proposed for evaluating and reducing the uncertainties connected with hazard level estimation. The methodology has been applied to a real urban watershed as a case study.
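The hazard-indicator classification described above can be sketched as a depth-velocity index compared against user-defined thresholds; the index form and threshold values below are illustrative assumptions, not standards from the paper.

```python
# Hazard indicator (HI) classification sketch: combine flood depth and velocity
# into an index and map it to a hazard level through threshold values.
def hazard_level(depth_m, velocity_ms, thresholds=(0.5, 1.5, 2.5)):
    hi = depth_m * (velocity_ms + 0.5)  # a common depth-velocity index form
    low, med, high = thresholds
    if hi < low:
        return "low"
    if hi < med:
        return "medium"
    if hi < high:
        return "high"
    return "extreme"

print(hazard_level(0.3, 0.2), hazard_level(1.0, 1.5), hazard_level(2.0, 2.0))
# low high extreme
```

The subjectivity the paper targets lives precisely in the choice of these thresholds, which is why it treats them statistically.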
A system for generating long streamflow records for study of floods of long return period: Phase 2
International Nuclear Information System (INIS)
Franz, D.D.; Kraeger, B.A.; Linsley, R.K.
1989-02-01
Knowledge of the return periods of large floods is required to make risk analyses for nuclear power plants subject to flooding from rivers. The system reported here combined the stochastic simulation of hourly rainfall data and daily pan evaporation data with the deterministic simulation of streamflow by using the synthetic rainfall and evaporation data as input to a calibrated rainfall runoff model. The sequence of annual maximum flood peaks from a synthetic record of 10,000 years or more was then analyzed to obtain estimates of flood frequency. The reasonableness of the flood frequency results must be evaluated on the basis of the degree of mimicry of the key characteristics of the observed rainfall data and the ability of the rainfall-runoff model to mimic the observed flood frequency during the calibration period. On this basis, the flood frequency results appeared to be a reasonable extrapolation of the data used in defining the model parameters. There is a need to develop regional parameters for the stochastic models and to conduct research on the relationship between the stochastic structure of rainfall and the stochastic structure of flood frequency. The methodology is applicable, assuming a highly skilled analyst, to watersheds similar to those already tested.
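The final step described here, reading flood frequency off a long synthetic record, can be sketched with empirical plotting positions; the synthetic Gumbel generator below is a stand-in for the report's full rainfall-evaporation-runoff chain, and all parameters are invented.

```python
# Sketch: estimate the T-year flood directly from a 10,000-year synthetic
# record of annual maximum peaks via Weibull plotting positions.
import math, random

random.seed(42)
# Stand-in synthetic record of annual peaks (Gumbel-like, hypothetical units).
peaks = [800 - 150 * math.log(-math.log(random.random())) for _ in range(10_000)]

def flood_of_return_period(peaks, T):
    """Empirical T-year flood: rank m has return period (n + 1) / m."""
    ranked = sorted(peaks, reverse=True)
    m = round((len(ranked) + 1) / T)
    return ranked[m - 1]

print(flood_of_return_period(peaks, 1000) > flood_of_return_period(peaks, 100))
```

With 10,000 synthetic years, floods up to roughly the 1,000-year return period can be read off empirically rather than extrapolated from a fitted distribution.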
The weighted function method: A handy tool for flood frequency analysis or just a curiosity?
Bogdanowicz, Ewa; Kochanek, Krzysztof; Strupczewski, Witold G.
2018-04-01
The idea of the Weighted Function (WF) method for estimation of the Pearson type 3 (Pe3) distribution, introduced by Ma in 1984, has been revised and successfully applied to the shifted inverse Gaussian (IGa3) distribution. The conditions of WF applicability to a shifted distribution have also been formulated. The accuracy of WF flood quantiles for both the Pe3 and IGa3 distributions was assessed by Monte Carlo simulations under true and false distribution assumptions versus the maximum likelihood (MLM), moment (MOM) and L-moments (LMM) methods. Three datasets of annual peak flows of Polish catchments serve as case studies to compare the results of WF, MOM, MLM and LMM performance for real flood data. For the hundred-year flood, the WF method revealed explicit superiority only over the MLM, while the MOM and especially the LMM surpassed it, both for the true and false distributional assumption with respect to relative bias and relative root mean square error values. Generally, the WF method performs well and, for hydrological sample sizes, constitutes a good alternative for the estimation of upper flood quantiles.
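The Monte Carlo comparison design used in the study can be sketched in miniature. As an assumption of this sketch, the Gumbel distribution replaces Pe3/IGa3 for tractability (its quantile function and moment estimators are closed-form), and only MOM and MLM are compared on the 100-year quantile.

```python
# Monte Carlo RMSE comparison of the 100-year quantile from moment (MOM)
# versus maximum likelihood (MLM) estimation, for a Gumbel parent.
import math, random, statistics

def q_gumbel(T, mu, beta):
    return mu - beta * math.log(-math.log(1 - 1 / T))

def fit_mom(xs):
    beta = statistics.stdev(xs) * math.sqrt(6) / math.pi
    return statistics.mean(xs) - 0.5772 * beta, beta

def fit_mle(xs):
    beta = fit_mom(xs)[1]                  # start from the MOM estimate
    for _ in range(200):                   # fixed-point iteration for beta
        w = [math.exp(-x / beta) for x in xs]
        beta = statistics.mean(xs) - sum(x * wi for x, wi in zip(xs, w)) / sum(w)
    w = [math.exp(-x / beta) for x in xs]
    mu = -beta * math.log(statistics.mean(w))
    return mu, beta

random.seed(3)
true_q = q_gumbel(100, 100.0, 30.0)
err_mom, err_mle = [], []
for _ in range(300):   # 300 synthetic 50-year samples from the true parent
    xs = [100.0 - 30.0 * math.log(-math.log(random.random())) for _ in range(50)]
    err_mom.append(q_gumbel(100, *fit_mom(xs)) - true_q)
    err_mle.append(q_gumbel(100, *fit_mle(xs)) - true_q)
rmse = lambda e: math.sqrt(sum(x * x for x in e) / len(e))
print(rmse(err_mom), rmse(err_mle))
```

The study's design adds the WF and LMM estimators, the Pe3/IGa3 parents, and the false-distribution case, but the relative-RMSE bookkeeping is the same.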
López, J.; Francés, F.
2013-08-01
Recent evidence of the impact of persistent modes of regional climate variability, coupled with the intensification of human activities, has led hydrologists to study the flood regime without applying the hypothesis of stationarity. In this study, a framework for flood frequency analysis is developed on the basis of a tool that enables us to address the modelling of non-stationary time series, namely the "generalized additive models for location, scale and shape" (GAMLSS). Two approaches to non-stationary modelling in GAMLSS were applied to the annual maximum flood records of 20 continental Spanish rivers. The results of the first approach, in which the parameters of the selected distributions were modelled as a function of time only, show the presence of clear non-stationarities in the flood regime. In the second approach, the parameters of the flood distributions are modelled as functions of climate indices (Arctic Oscillation, North Atlantic Oscillation, Mediterranean Oscillation and the Western Mediterranean Oscillation) and a reservoir index that is proposed in this paper. The results obtained when incorporating external covariates highlight the important role of interannual variability in low-frequency climate forcings when modelling the flood regime in continental Spanish rivers. With this approach it is also possible to properly introduce the impact of intensified reservoir regulation strategies on the flood regime. The inclusion of external covariates permits the use of these models as predictive tools. Finally, the application of non-stationary analysis shows that the differences between the non-stationary quantiles and their stationary equivalents may be important over long periods of time.
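The difference between stationary and non-stationary quantiles can be shown in its simplest form: a Gumbel location parameter modelled as a linear function of a covariate (here simply time, as in the paper's first approach). The parameter values are assumed for illustration, not fitted GAMLSS output.

```python
# Non-stationary quantile sketch: the 100-year flood when the Gumbel location
# drifts linearly with a covariate. All parameter values are hypothetical.
import math

def gumbel_quantile(p_nonexceed, mu, beta):
    return mu - beta * math.log(-math.log(p_nonexceed))

def nonstationary_q100(t, a=500.0, b=2.0, beta=120.0):
    mu_t = a + b * t          # location as a linear function of time/covariate
    return gumbel_quantile(0.99, mu_t, beta)

stationary = gumbel_quantile(0.99, 500.0, 120.0)
print(round(nonstationary_q100(0) - stationary, 6),
      round(nonstationary_q100(50) - nonstationary_q100(0), 6))
# 0.0 100.0
```

In the paper's second approach, the covariate would be a climate index or the reservoir index rather than time, but the quantile mechanics are identical.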
Paquet, Emmanuel; Lawrence, Deborah
2013-04-01
value and assessment of the role of snowmelt in extreme floods are presented. This study illustrates the complexity of extreme flood estimation in snow-driven catchments, and the need for a good representation of snow accumulation and melting processes in simulations for design flood estimations. In particular, the SCHADEX method is able to represent a range of possible catchment conditions (representing both soil moisture and snowmelt) in which extreme flood events can occur. This study is part of a collaboration between NVE and EDF, initiated within the FloodFreq COST Action (http://www.cost-floodfreq.eu/). References: Fleig, A., Scientific Report of the Short Term Scientific Mission of Anne Fleig visiting Électricité de France, FloodFreq COST Action - STSM report, 2012. Garavaglia, F., Gailhard, J., Paquet, E., Lang, M., Garçon, R., and Bernardara, P., Introducing a rainfall compound distribution model based on weather patterns sub-sampling, Hydrol. Earth Syst. Sci., 14, 951-964, doi:10.5194/hess-14-951-2010, 2010. Garçon, R., Modèle global pluie-débit pour la prévision et la prédétermination des crues, La Houille Blanche, 7-8, 88-95, doi:10.1051/lhb/1999088. Paquet, E., Gailhard, J. and Garçon, R. (2006), Evolution of the GRADEX method: improvement by atmospheric circulation classification and hydrological modeling, La Houille Blanche, 5, 80-90, doi:10.1051/lhb/2006091. Paquet, E., Garavaglia, F., Garçon, R. and Gailhard, J. (2012), The SCHADEX method: a semi-continuous rainfall-runoff simulation for extreme flood estimation, Journal of Hydrology, under revision.
Adige river in Trento flooding map, 1892: private or public risk transfer?
Ranzi, Roberto
2016-04-01
In the determination of flood risk, hydrologists and hydraulic engineers focus their attention mainly on the estimation of the physical factors determining the flood hazard, while economists and social scientists deal mainly with the estimation of vulnerability and exposure. The fact that flood zoning involves both hydrological and socio-economic aspects, however, was already clear in the XIX century, when the impact of floods on inundated areas started to appear in flood maps, for instance in the UK and in Italy. A pioneering 'flood risk' map for the Adige river in Trento, Italy, was published as early as 1892, taking into account in detail hazard intensity in terms of velocity and depth, frequency of occurrence, vulnerability and the economic costs of flood protection with river embankments. This map can certainly be reinterpreted as a pioneering flood risk map, and possibly as the first for an Italian river and worldwide. Risk levels were divided into three categories and seven sub-categories, depending on flood water depth, velocity, frequency and damage costs. It is interesting to notice that at that time the map was used to share the cost of levee repair and enhancement after the severe September 1882 flood as a function of the estimated level of protection of the respective areas against the flood risk. The sharing of costs between public bodies, the railway company and private owners was debated for about 20 years, and in the end the public bore the major costs. This shows that already at that time the economic assessment of structural flood protections was based on objective and rational cost-benefit criteria, that hydraulic risk mapping was perceived by society as fundamental for the design of flood protection systems, and that a balanced cost sharing between public and private parties was an accepted approach, although some protests arose at the time.
A Multi-Faceted Debris-Flood Hazard Assessment for Cougar Creek, Alberta, Canada
Directory of Open Access Journals (Sweden)
Matthias Jakob
2017-01-01
A destructive debris flood occurred between 19 and 21 June 2013 on Cougar Creek, located in Canmore, Alberta. The Cougar Creek fan is likely the most densely developed alluvial fan in Canada. While no lives were lost, the event resulted in approximately $40 M of damage and closed both the Trans-Canada Highway (Highway 1) and the Canadian Pacific Railway line for a period of several days. The debris flood triggered a comprehensive hazard assessment, which is the focus of this paper. Debris-flood frequencies and magnitudes are determined by combining several quantitative methods including photogrammetry, dendrochronology, radiometric dating, test pit logging, empirical relationships between rainfall volumes and sediment volumes, and landslide dam outburst flood modeling. The data analysis suggests that three distinct process types act in the watershed. The most frequent process is normal or "clearwater" floods. Less frequent but more damaging are debris floods, during which excessive amounts of bedload are transported on the fan, typically associated with rapid and extensive bank erosion and channel infilling and widening. The third and most destructive process is interpreted to be landslide dam outbreak floods. This event type is estimated to occur at return periods exceeding 300 years. Using a cumulative magnitude-frequency technique, the data for conventional debris floods were plotted up to the 100–300-year return period. A peak-over-threshold approach was used for landslide dam outbreak floods occurring at return periods exceeding 300 years, as not all such events were identified during test trenching. Hydrographs for 6 return period classes were approximated by using the estimated peak discharges and fitting the hydrograph shape to integrate to the debris-flood volumes as determined from the frequency-magnitude relationship. The fan volume was calculated and compared with the integrated frequency-magnitude curve to check the validity of
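The peak-over-threshold idea used above for the rarest events can be sketched in a highly simplified form: exceedances over a threshold are assumed exponentially distributed and events arrive at a constant annual rate. The volumes, threshold and exponential excess model below are illustrative assumptions; the actual study combines several dating and volume-reconstruction methods.

```python
import numpy as np

def pot_return_level(volumes, years, threshold, T):
    """Peak-over-threshold return level assuming exponential exceedances:
    events above `threshold` occur at rate lam per year, and the T-year
    magnitude is threshold + mean_excess * ln(lam * T)."""
    exc = volumes[volumes > threshold] - threshold
    lam = len(exc) / years               # events per year above threshold
    return threshold + exc.mean() * np.log(lam * T)

# Hypothetical reconstructed debris-flood volumes (10^3 m3) over 300 years
vols = np.array([12, 18, 25, 31, 40, 55, 62, 90, 130, 210], dtype=float)
v100 = pot_return_level(vols, years=300.0, threshold=10.0, T=100.0)
```

In practice a generalized Pareto excess distribution and an incomplete-catalogue correction (events missed during trenching) would replace the exponential assumption.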
Juarez, A. M.; Kibler, K. M.; Sayama, T.; Ohara, M.
2016-12-01
Flood management decision-making is often supported by risk assessment, which may overlook the role of coping capacity and the potential benefits derived from direct use of flood-prone land. Alternatively, risk-benefit analysis can support floodplain management to yield maximum socio-ecological benefits for the minimum flood risk. We evaluate flood risk-probabilistic benefit tradeoffs of livelihood practices compatible with direct human use of flood-prone land (agriculture/wild fisheries) and nature conservation (wild fisheries only) in Candaba, Philippines. Located north-west of Metro Manila, the Candaba area is a multi-functional landscape that provides a temporally-variable mix of possible land uses, benefits and ecosystem services of local and regional value. To characterize inundation from 1.3- to 100-year recurrence intervals we couple frequency analysis with rainfall-runoff-inundation modelling and remotely-sensed data. By combining simulated probabilistic floods with both damage and benefit functions (e.g. fish capture and rice yield with flood intensity) we estimate potential damages and benefits over varying probabilistic flood hazards. We find that although direct human uses of flood-prone land are associated with damages, for all the investigated flood magnitudes and frequencies the probabilistic benefits ($91 million) exceed risks ($33 million) by a large margin. Even considering risk, the probabilistic livelihood benefits of direct human uses far exceed the benefits provided by scenarios that exclude direct "risky" human uses (a difference of $85 million). In addition, we find that individual coping strategies, such as adapting crop planting periods to the flood pulse or fishing rather than cultivating rice in the wet season, minimize flood losses ($6 million) while allowing for valuable livelihood benefits ($125 million) in flood-prone land. Analysis of societal benefits and local capacities to cope with regular floods demonstrate the
Directory of Open Access Journals (Sweden)
T. Matingo
2018-05-01
Flash floods are experienced almost annually in the ungauged Mbire District of the Middle Zambezi Basin. Studies related to hydrological modelling (rainfall-runoff) and flood forecasting require major inputs such as precipitation which, owing to the shortage of observed data, are increasingly estimated by indirect methods. This study therefore evaluated the performance of CMORPH and TRMM satellite rainfall estimates (SREs) at 30 min, 1 h, 3 h and daily intensities through hydrologic and flash flood modelling in the Lower Middle Zambezi Basin for the period 2013–2016. On a daily timestep, uncorrected CMORPH and TRMM show a Probability of Detection (POD) of 61 and 59 %, respectively, when compared to rain gauge observations. The best performance in terms of Correlation Coefficient (CC) was 70 and 60 % on daily timesteps for CMORPH and TRMM, respectively. The best RMSE was 0.81 % for CMORPH at the 30 min timestep and 2.11 % for TRMM at the 3 h timestep. For the year 2014 to 2015, the HEC-HMS (Hydrological Engineering Centre-Hydrological Modelling System) daily model calibration Nash-Sutcliffe efficiency (NSE) for the Musengezi sub-catchment was 59 % whilst for Angwa it was 55 %. The Angwa sub-catchment daily NSE for the period 2015–2016 was 61 %. HEC-RAS flash flood modelling at 100-, 50- and 25-year return periods for the Angwa sub-catchment inundated 811 and 867 ha for TRMM-simulated discharge at 3 h and daily timesteps, respectively. For CMORPH-generated rainfall, the inundation was 818, 876, 890 and 891 ha at daily, 3 h, 1 h and 30 min timesteps. The 30 min timestep for CMORPH effectively captures flash floods, with a measure of agreement between simulated flood extent and ground control points of 69 %. For TRMM, the 3 h timestep effectively captures flash floods, with a coefficient of 67 %. The study therefore concludes that satellite products are most effective in capturing localized
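The verification scores above can be sketched in a few lines. POD is a categorical score (how often observed rain is detected above a wet/dry threshold), while CC is a continuous score over the paired series. The 0.1 mm threshold and the toy series below are assumptions for illustration.

```python
import numpy as np

def verify_sre(gauge, satellite, wet=0.1):
    """Probability of Detection and Pearson correlation for a satellite
    rainfall estimate against gauge observations (wet = rain threshold, mm)."""
    g_wet = gauge >= wet
    s_wet = satellite >= wet
    hits   = np.sum(g_wet & s_wet)       # rain observed and detected
    misses = np.sum(g_wet & ~s_wet)      # rain observed, not detected
    pod = hits / (hits + misses)
    cc = np.corrcoef(gauge, satellite)[0, 1]
    return pod, cc

# Toy paired daily series (mm): 5 wet gauge days, 4 of them detected
gauge     = np.array([0.0, 5.2, 12.1, 0.0, 3.4, 0.0, 8.8, 1.2])
satellite = np.array([0.0, 4.1, 10.5, 0.3, 0.0, 0.0, 7.9, 1.5])
pod, cc = verify_sre(gauge, satellite)   # POD = 4/5 = 0.8 here
```

Operational verification would add further categorical scores (false alarm ratio, critical success index) from the same contingency table.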
The influence of climate change on flood risks in France - first estimates and uncertainty analysis
Dumas, P.; Hallegatte, S.; Quintana-Seguì, P.; Martin, E.
2013-03-01
This paper proposes a methodology to project the possible evolution of river flood damages due to climate change, and applies it to mainland France. Its main contributions are (i) to demonstrate a methodology to investigate the full causal chain from global climate change to local economic flood losses; (ii) to show that future flood losses may change in a very significant manner over France; (iii) to show that a very large uncertainty arises from the climate downscaling technique, since two techniques with comparable skills at reproducing reference river flows give very different estimates of future flows, and thus of future local losses. The main conclusion is thus that estimating future flood losses is still out of reach, especially at local scale, but that future national-scale losses may change significantly over this century, requiring policy changes in terms of risk management and land-use planning.
Croissant, Thomas; Lague, Dimitri; Davy, Philippe
2016-04-01
Climate fluctuations at geological timescales control the capacity of rivers to transport sediment, with consequences for geochemical cycles, sedimentary basin dynamics and sedimentation/tectonics interactions. While the impact of differential friction generated by riparian vegetation has been studied for individual flood events, its impact on the long-term sediment transport capacity of rivers, modulated by the frequency of floods, remains unknown. Here, we investigate this effect on a simplified river-floodplain configuration obeying observed hydraulic scaling laws. We numerically integrate the full frequency-magnitude distribution of discharge events and its impact on the transport capacity of bedload and suspended material for various levels of vegetation-linked differential friction. We demonstrate that riparian vegetation, by acting as a virtual confinement of the flow, (i) significantly increases the instantaneous transport capacity of the river independently of the transport mode and (ii) increases the long-term bedload transport rates as a function of discharge variability. Our results expose the dominance of flood frequency over riparian vegetation in the long-term sediment transport capacity. Therefore, flood frequency has to be considered when evaluating long-term bedload transport capacity, while floodplain vegetation is important only in high discharge variability regimes. By comparing the transport capacity of unconfined alluvial rivers and confined bedrock gorges, we demonstrate that the latter always presents the highest long-term transport capacity at equivalent width and slope. The loss of confinement at the transition between bedrock and alluvial reaches must be compensated by a widening or a steepening of the alluvial channel to avoid infinite storage. Because steepening is never observed in natural systems, we compute the alluvial widening factor, which varies between 3 and 11 times the width of the bedrock channel depending on riparian
Bösmeier, Annette; Glaser, Rüdiger; Stahl, Kerstin; Himmelsbach, Iso; Schönbein, Johannes
2017-04-01
Future estimates of flood hazard and risk for developing optimal coping and adaptation strategies inevitably include considerations of the frequency and magnitude of past events. Methods of historical climatology represent one way of assessing flood occurrences beyond the period of instrumental measurements and can thereby substantially help to extend the view into the past and to improve modern risk analysis. Such historical information can be of additional value and has been used in statistical approaches such as Bayesian flood frequency analyses during recent years. However, the derivation of quantitative values from vague descriptive information in historical sources remains a crucial challenge. We explored possibilities for the parametrization of descriptive flood-related data, specifically for the assessment of historical floods, in a framework that combines a hermeneutical approach with mathematical and statistical methods. This study forms part of the transnational Franco-German research project TRANSRISK2 (2014-2017), funded by ANR and DFG, with the focus on exploring the flood history of the last 300 years for the Upper and Middle Rhine regions. A broad database of flood events has been compiled, dating back to AD 1500. The events have been classified based on hermeneutical methods, depending on intensity, spatial dimension, temporal structure, damages and mitigation measures associated with the specific events. This indexed database allowed the exploration of a link between descriptive data and quantitative information for the overlapping time period of classified floods and instrumental measurements since the end of the 19th century. Thereby, flood peak discharges, as a quantitative measure of the severity of a flood, were used to assess the discharge intervals for flood classes (upper and lower thresholds) within different time intervals, validating the flood classification as well as examining the trend in the perception threshold over time
Alberola, Armando; Barriendos, Mariano; Gil-Guirado, Salvador; Pérez-Morales, Alfredo; Balasch, Carles; Castelltort, Xavier; Mazón, Jordi; Pino, David; Lluís Ruiz-Bellet, Josep; Tuset, Jordi
2016-04-01
Historical flood data series of the Eastern Spanish Coast (14th-20th centuries): improving identification of climatic patterns and human factors of flood events from primary documentary sources. Historical flood events on the eastern Spanish coast have been studied by different research groups and projects. The complexity of flood processes, involving atmospheric, surface and human factors, is not easily understandable when long time series are required. The present analysis from the PREDIFLOOD Project Consortium defines a new step in flood event databases: improved access to primary (documentary) and secondary (bibliographical) sources, data collection for all possible locations where floods are detected, and an improved system of classification (Barriendos et al., 2014). A first analysis is applied to 8 selected flood series: long chronologies from the PREDIFLOOD Project for the Catalonia region (Girona, Barcelona, Tarragona, Lleida, Tortosa). In addition, to cover all sectors of the Spanish Mediterranean coast, we introduce Valencia city in the Turia River basin. The south-eastern sector is covered by the cities of Murcia and Caravaca, in the Segura River basin. Extension of the area under study required contributions from research teams experienced in working with primary documentary sources (Alberola, 2006; Gil-Guirado, 2013). Flood frequency analysis over long periods shows natural climatic oscillations within the so-called Little Ice Age. There are general patterns, affecting most of the basins, but also some local anomalies or singularities. To explain these differences and analogies it is not enough to use purely climatic factors. Accordingly, we analyze human factors that may have influenced the variability of floods over the last 6 centuries (demography, hydraulic infrastructures, urban development...). This approach strongly improves understanding of the mechanisms producing
Fast fundamental frequency estimation
DEFF Research Database (Denmark)
Nielsen, Jesper Kjær; Jensen, Tobias Lindstrøm; Jensen, Jesper Rindom
2017-01-01
Modelling signals as being periodic is common in many applications. Such periodic signals can be represented by a weighted sum of sinusoids with frequencies at integer multiples of the fundamental frequency. Due to its widespread use, numerous methods have been proposed to estimate the fundamental frequency.
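One common estimator family for this problem, harmonic summation over a candidate-frequency grid, can be sketched as follows. This is a naive baseline, not the fast method of the paper, whose contribution is precisely an efficient exact computation; the sampling rate, grid and test signal are assumptions.

```python
import numpy as np

def estimate_f0(x, fs, f0_grid, n_harm=5):
    """Grid search for the fundamental: pick the candidate f0 whose first
    n_harm harmonics carry the most spectral energy (harmonic summation)."""
    n = len(x)
    spec = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(n, 1.0 / fs)
    scores = []
    for f0 in f0_grid:
        # sum spectral energy at the bins nearest each harmonic k*f0
        idx = [np.argmin(np.abs(freqs - k * f0)) for k in range(1, n_harm + 1)]
        scores.append(spec[idx].sum())
    return f0_grid[int(np.argmax(scores))]

fs = 8000.0
t = np.arange(4096) / fs
# periodic test signal: 220 Hz fundamental plus two weaker harmonics
x = (np.sin(2*np.pi*220*t) + 0.5*np.sin(2*np.pi*440*t)
     + 0.25*np.sin(2*np.pi*660*t))
f0 = estimate_f0(x, fs, np.arange(100.0, 400.0, 1.0))
```

Summing over several harmonics is what guards against octave errors: a sub-harmonic candidate misses the energy at the odd harmonics.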
Joint angle and Doppler frequency estimation of coherent targets in monostatic MIMO radar
Cao, Renzheng; Zhang, Xiaofei
2015-05-01
This paper discusses the problem of joint direction of arrival (DOA) and Doppler frequency estimation of coherent targets in a monostatic multiple-input multiple-output radar. In the proposed algorithm, we perform a reduced dimension (RD) transformation on the received signal first and then use forward spatial smoothing (FSS) technique to decorrelate the coherence and obtain joint estimation of DOA and Doppler frequency by exploiting the estimation of signal parameters via rotational invariance techniques (ESPRIT) algorithm. The joint estimated parameters of the proposed RD-FSS-ESPRIT are automatically paired. Compared with the conventional FSS-ESPRIT algorithm, our RD-FSS-ESPRIT algorithm has much lower complexity and better estimation performance of both DOA and frequency. The variance of the estimation error and the Cramer-Rao Bound of the DOA and frequency estimation are derived. Simulation results show the effectiveness and improvement of our algorithm.
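A single-dimension temporal analogue of the ESPRIT step can be sketched as below: the signal subspace of a sliding-window covariance is rotationally invariant, and the eigenvalues of the invariance operator encode the frequencies. This sketch omits the reduced-dimension transformation and forward spatial smoothing of the paper and assumes non-coherent components; the window length and test signal are arbitrary choices.

```python
import numpy as np

def esprit_freqs(x, n_sources, m=20):
    """Estimate normalized frequencies of complex exponentials in noise
    via 1-D ESPRIT (rotational invariance of the signal subspace)."""
    n = len(x)
    # snapshot matrix from sliding windows of length m
    X = np.column_stack([x[i:i + m] for i in range(n - m + 1)])
    R = X @ X.conj().T / X.shape[1]            # sample covariance
    _, vecs = np.linalg.eigh(R)
    Us = vecs[:, -n_sources:]                  # signal subspace (largest eigs)
    # rotational invariance: Us without last row maps onto Us without first row
    phi = np.linalg.lstsq(Us[:-1], Us[1:], rcond=None)[0]
    w = np.angle(np.linalg.eigvals(phi))       # radians per sample
    return np.sort(w / (2 * np.pi))            # normalized frequencies

rng = np.random.default_rng(1)
n = 200
t = np.arange(n)
x = (np.exp(2j*np.pi*0.12*t) + 0.8*np.exp(2j*np.pi*0.27*t)
     + 0.05*(rng.standard_normal(n) + 1j*rng.standard_normal(n)))
f_hat = esprit_freqs(x, n_sources=2)           # approx. [0.12, 0.27]
```

In the monostatic MIMO setting the same invariance argument is applied jointly over the angle and Doppler dimensions, with spatial smoothing restoring the rank lost to coherent targets.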
Li, C.-H.; Li, N.; Wu, L.-C.; Hu, A.-J.
2013-07-01
The vulnerability to flood disaster is addressed by a number of studies. It is of great importance to analyze the vulnerability of different regions and various periods to enable the government to make policies for distributing relief funds and to help the regions improve their capabilities against disasters, yet a recognized paradigm for such studies seems missing. Vulnerability is defined and evaluated from either a physical or an economic-ecological perspective, depending on the field of the researcher concerned. Vulnerability, however, is the core of both systems, as it entails systematic descriptions of flood severities or disaster management units. The research mentioned often has a development perspective; in this article we decompose the overall flood system into several factors: disaster driver, disaster environment, disaster bearer, and disaster intensity, and take the interaction mechanism among all factors as an indispensable function. The conditions of the flood disaster components are characterized by the disaster driver risk level, the disaster environment stability level and the disaster bearer sensitivity, respectively. The flood system vulnerability is expressed as vulnerability = f(risk, stability, sensitivity). Based on this theory, the data envelopment analysis (DEA) method is used to detail the spatiotemporal variation of the relative vulnerability of a flood disaster system and its components in the Dongting Lake region. The study finds that although a flood disaster system's relative vulnerability is closely associated with the conditions of its components, the flood system and its components have different vulnerability levels. The overall vulnerability is not the aggregation of its components' vulnerabilities. On a spatial scale, zones central and adjacent to Dongting Lake and/or river zones are characterized by very high vulnerability. Zones with low and very low vulnerability are mainly distributed on the periphery of the Dongting Lake region. On a temporal
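The DEA machinery behind such relative-efficiency (here, relative-vulnerability) scores can be sketched with a basic input-oriented CCR model solved as a linear program. The toy inputs and outputs below are illustrative placeholders, not the Dongting Lake indicators.

```python
import numpy as np
from scipy.optimize import linprog

def dea_efficiency(X, Y, j0):
    """Input-oriented CCR efficiency of unit j0 (rows of X/Y are
    decision-making units; columns are inputs/outputs). Returns theta <= 1."""
    n = X.shape[0]
    c = np.zeros(n + 1)
    c[0] = 1.0                                   # minimize theta
    A, b = [], []
    for i in range(X.shape[1]):                  # sum_j lam_j x_ij <= theta*x_i0
        A.append(np.concatenate(([-X[j0, i]], X[:, i])))
        b.append(0.0)
    for r in range(Y.shape[1]):                  # sum_j lam_j y_rj >= y_r0
        A.append(np.concatenate(([0.0], -Y[:, r])))
        b.append(-Y[j0, r])
    res = linprog(c, A_ub=np.array(A), b_ub=np.array(b),
                  bounds=[(0, None)] * (n + 1))
    return res.fun

# Toy data: 4 units, 1 input, 1 output; unit 1 defines the frontier
X = np.array([[2.0], [2.0], [4.0], [5.0]])
Y = np.array([[1.0], [2.0], [2.0], [1.0]])
eff = [dea_efficiency(X, Y, j) for j in range(4)]   # unit 1 -> theta = 1
```

With one input and one output the score reduces to each unit's output/input ratio divided by the best ratio, which makes the toy case easy to check by hand.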
Sabatier, Pierre; Wilhelm, Bruno; Ficetola, Gentile Francesco; Moiroux, Fanny; Poulenard, Jérôme; Develle, Anne-Lise; Bichet, Adeline; Chen, Wentao; Pignol, Cécile; Reyss, Jean-Louis; Gielly, Ludovic; Bajard, Manon; Perrette, Yves; Malet, Emmanuel; Taberlet, Pierre; Arnaud, Fabien
2017-08-01
The high-resolution sedimentological and geochemical analysis of a sediment sequence from Lake Savine (Western Mediterranean Alps, France) led to the identification of 220 event layers for the last 6000 years. 200 were triggered by flood events and 20 by underwater mass movements, possibly related to earthquakes, that occurred in five clusters of increased seismicity. Because human activity could influence the flood chronicle, the presence of pastures was reconstructed through ancient DNA, which suggested that the flood chronicle was mainly driven by hydroclimate variability. Weather reanalysis of historical floods allowed us to identify mesoscale precipitation events, called "East Return" events, as the main triggers of floods recorded in Lake Savine. The first part of this palaeoflood record (6-4 kyr BP) was characterized by increases in flood frequency and intensity in phase with Northern Alpine palaeoflood records. By contrast, the second part of the record (i.e., since 4 kyr BP) was in phase with Southern Alpine palaeoflood records. These results suggest a palaeohydrological transition at approximately 4 kyr BP, as has been previously described for the Mediterranean region. This may have resulted in a change of flood-prone hydro-meteorological processes, i.e., in the balance between the occurrence and intensity of local convective climatic phenomena and their influence on Mediterranean mesoscale precipitation events in this part of the Alps. At a centennial timescale, increases in flood frequency and intensity corresponded to periods of solar minima, affecting climate through atmospheric changes in the Euro-Atlantic sector.
Going beyond the flood insurance rate map: insights from flood hazard map co-production
Directory of Open Access Journals (Sweden)
A. Luke
2018-04-01
Flood hazard mapping in the United States (US) is deeply tied to the National Flood Insurance Program (NFIP). Consequently, publicly available flood maps provide essential information for insurance purposes, but they do not necessarily provide relevant information for non-insurance aspects of flood risk management (FRM) such as public education and emergency planning. Recent calls for flood hazard maps that support a wider variety of FRM tasks highlight the need to deepen our understanding of the factors that make flood maps useful and understandable for local end users. In this study, social scientists and engineers explore opportunities for improving the utility and relevance of flood hazard maps through the co-production of maps responsive to end users' FRM needs. Specifically, two-dimensional flood modeling produced a set of baseline hazard maps for stakeholders of the Tijuana River valley, US, and Los Laureles Canyon in Tijuana, Mexico. Focus groups with natural resource managers, city planners, emergency managers, academia, non-profits, and community leaders refined the baseline hazard maps by triggering additional modeling scenarios and map revisions. Several important end user preferences emerged, such as (1) legends that frame flood intensity both qualitatively and quantitatively, and (2) flood scenario descriptions that report flood magnitude in terms of rainfall, streamflow, and its relation to a historic event. Regarding desired hazard map content, end users' requests revealed general consistency with mapping needs reported in European studies and guidelines published in Australia. However, requested map content that is not commonly produced included (1) standing water depths following the flood, (2) the erosive potential of flowing water, and (3) pluvial flood hazards, or flooding caused directly by rainfall. We conclude that the relevance and utility of commonly produced flood hazard maps can be most improved by illustrating
Directory of Open Access Journals (Sweden)
Chen Cao
2016-09-01
This study focused on producing flash flood hazard susceptibility maps (FFHSM) using frequency ratio (FR) and statistical index (SI) models in the Xiqu Gully (XQG) of Beijing, China. First, a total of 85 flash flood hazard locations (n = 85) were surveyed in the field and plotted using geographic information system (GIS) software. Based on the flash flood hazard locations, a flood hazard inventory map was built. Seventy percent (n = 60) of the flood hazard locations were randomly selected for building the models. The remaining 30% (n = 25) of the flood hazard locations were used for validation. Considering that the XQG used to be a coal mining area, coalmine caves and subsidence caused by coal mining exist in this catchment, as well as many ground fissures. Thus, this study took the subsidence risk level into consideration for the FFHSM. The ten conditioning parameters were elevation, slope, curvature, land use, geology, soil texture, subsidence risk area, stream power index (SPI), topographic wetness index (TWI), and short-term heavy rain. This study also tested different classification schemes for the values of each conditioning parameter and checked their impacts on the results. The accuracy of the FFHSM was validated using area under the curve (AUC) analysis. Classification accuracies were 86.61%, 83.35%, and 78.52% using the FR-natural breaks, SI-natural breaks and FR-manual classification schemes, respectively. Associated prediction accuracies were 83.69%, 81.22%, and 74.23%, respectively. It was found that FR modeling using a natural breaks classification method was more appropriate for generating FFHSM for the Xiqu Gully.
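The frequency ratio model above reduces to a simple per-class statistic: the share of hazard locations falling in a class divided by the class's share of the total area, with FR > 1 flagging flood-prone classes. A pixel's susceptibility is then the sum of the FR values of its classes across all conditioning parameters. The one-factor toy data below are assumptions for illustration.

```python
import numpy as np

def frequency_ratio(class_map, flood_mask):
    """FR per class: percentage of flood cells in the class divided by the
    class's percentage of the whole area (FR > 1 => flood-prone class)."""
    fr = {}
    n_cells = class_map.size
    n_flood = flood_mask.sum()
    for c in np.unique(class_map):
        in_c = class_map == c
        pct_flood = (flood_mask & in_c).sum() / n_flood
        pct_area = in_c.sum() / n_cells
        fr[c] = pct_flood / pct_area
    return fr

# Toy 1-factor example: slope classes 0 (flat), 1 (moderate), 2 (steep)
slope = np.array([0, 0, 0, 0, 1, 1, 1, 1, 2, 2])
flood = np.array([1, 1, 1, 0, 1, 0, 0, 0, 0, 0], dtype=bool)
fr = frequency_ratio(slope, flood)
# class 0: (3/4 of floods) / (4/10 of area) = 1.875
```

A full FFHSM would repeat this for each of the ten conditioning parameters and sum the resulting FR layers cell by cell.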
Viero, Daniele P.
2018-01-01
Citizen science and crowdsourcing are gaining increasing attention among hydrologists. In a recent contribution, Mazzoleni et al. (2017) investigated the integration of crowdsourced data (CSD) into hydrological models to improve the accuracy of real-time flood forecasts. The authors used synthetic CSD (i.e. not actually measured), because real CSD were not available at the time of the study. In their work, which is a proof-of-concept study, Mazzoleni et al. (2017) showed that assimilation of CSD improves the overall model performance; the impact of the irregular frequency of available CSD, and that of data uncertainty, were also assessed in depth. However, the use of synthetic CSD in conjunction with (semi-)distributed hydrological models deserves further discussion. As a result of equifinality, poor model identifiability, and deficiencies in model structure, the internal states of (semi-)distributed models can hardly mimic the actual states of complex systems away from calibration points. Accordingly, the use of synthetic CSD that are drawn from model internal states under best-fit conditions can lead to overestimation of the effectiveness of CSD assimilation in improving flood prediction. Operational flood forecasting, which results in decisions of high societal value, requires robust knowledge of the model behaviour and an in-depth assessment of both model structure and forcing data. Additional guidelines are given that are useful for the a priori evaluation of CSD for real-time flood forecasting and, hopefully, for planning apt design strategies for both model calibration and collection of CSD.
Comparing the index-flood and multiple-regression methods using L-moments
Malekinezhad, H.; Nachtnebel, H. P.; Klik, A.
In arid and semi-arid regions, the length of records is usually too short to ensure reliable quantile estimates. Comparing index-flood and multiple-regression analyses based on L-moments was the main objective of this study. Factor analysis was applied to determine the main variables influencing flood magnitude. Ward's clustering and L-moments approaches were applied to several sites in the Namak-Lake basin in central Iran to delineate homogeneous regions based on site characteristics. The homogeneity test was performed using L-moments-based measures. Several distributions were fitted to the regional flood data, and the index-flood and multiple-regression methods were compared as two regional flood frequency approaches. The results of the factor analysis showed that the length of the main waterway, the compactness coefficient, mean annual precipitation, and mean annual temperature were the main variables affecting flood magnitude. The study area was divided into three regions based on Ward's clustering approach. The homogeneity test based on L-moments showed that all three regions were acceptably homogeneous. Five distributions were fitted to the annual peak flood data of the three homogeneous regions. Using the L-moment ratios and the Z-statistic criteria, the GEV distribution was identified as the most robust among the five candidate distributions for all the proposed sub-regions of the study area; in general, it was concluded that the generalised extreme value distribution was the best-fit distribution for all three regions. The relative root mean square error (RRMSE) measure was applied to evaluate the performance of the index-flood and multiple-regression methods in comparison with the curve-fitting (plotting position) method. In general, the index-flood method gives more reliable estimations for various flood magnitudes of different recurrence intervals. Therefore, this method should be adopted as the regional flood frequency method for the study area and the Namak-Lake basin.
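The L-moments machinery underlying such regional analyses can be sketched in a few lines: sample L-moments are computed from probability-weighted moments, and GEV parameters follow from Hosking's rational approximation. The Gumbel test sample below is an assumption used only to check that the fitted GEV shape comes out near zero.

```python
import numpy as np
from math import gamma, log

def sample_lmoments(x):
    """First two sample L-moments and L-skewness (l1, l2, t3) via
    unbiased probability-weighted moments."""
    x = np.sort(x)
    n = len(x)
    j = np.arange(n)
    b0 = x.mean()
    b1 = np.sum(j * x) / (n * (n - 1))
    b2 = np.sum(j * (j - 1) * x) / (n * (n - 1) * (n - 2))
    l1, l2, l3 = b0, 2 * b1 - b0, 6 * b2 - 6 * b1 + b0
    return l1, l2, l3 / l2

def gev_from_lmoments(l1, l2, t3):
    """GEV parameters (location xi, scale a, shape k; Hosking's sign
    convention, k = 0 reduces to Gumbel) from the first L-moments."""
    c = 2.0 / (3.0 + t3) - log(2) / log(3)
    k = 7.8590 * c + 2.9554 * c**2          # Hosking's approximation
    a = l2 * k / ((1 - 2 ** (-k)) * gamma(1 + k))
    xi = l1 - a * (1 - gamma(1 + k)) / k
    return xi, a, k

# Gumbel(loc=100, scale=20) sample: the fitted GEV shape should be near 0
rng = np.random.default_rng(2)
x = 100.0 - 20.0 * np.log(-np.log(rng.uniform(size=5000)))
xi, a, k = gev_from_lmoments(*sample_lmoments(x))
```

In the index-flood procedure the same fit is applied to pooled, rescaled records of a homogeneous region, and site quantiles are recovered by multiplying the regional growth curve by each site's index flood.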
Flood risk assessment and mapping for the Lebanese watersheds
Abdallah, Chadi; Hdeib, Rouya
2016-04-01
Of all natural disasters, floods affect the greatest number of people worldwide and have the greatest potential to cause damage. Nowadays, with the emerging global warming phenomenon, this number is expected to increase. The Eastern Mediterranean area, including Lebanon (10,452 km2, 4.5 M inhabitants), has witnessed in the past few decades an increased frequency of flooding events. This study assesses in depth the flood risk over Lebanon, covering all 17 major watersheds and a number of small sub-catchments, and evaluates the physical direct tangible damages caused by floods. The risk assessment and evaluation process was carried out in three stages: i) evaluating assets at risk, where the areas and assets vulnerable to flooding are identified; ii) vulnerability assessment, where the causes of vulnerability are assessed and the value of the assets is provided; iii) risk assessment, where damage functions are established and the consequent damages of flooding are estimated. A detailed land cover/use map was prepared at a scale of 1/1000 using 0.4 m resolution satellite images within the flood hazard zones. The detailed field verification made it possible to locate and characterize all elements at risk, identify hotspots, interview local witnesses, and correlate and calibrate previous flood damages with the utilized models. All field-gathered information was collected through a mobile application and transformed to be standardized and classified in a GIS environment. Consequently, general damage evaluation and risk maps at different flood recurrence periods (10, 50, 100 years) were established. Major results showed that floods in a winter season (December, January, and February) of 10-year recurrence and of water retention ranging from 1 to 3 days can cause total damages (losses) that reach 1.14 M for crop lands and 2.30 M for greenhouses, whereas they may cause 0.2 M in losses for fruit trees for a flood retention ranging from 3 to 5 days. These numbers differ
Evaluation of critical storm duration rainfall estimates used in flood ...
African Journals Online (AJOL)
The results showed that the RMLA&SI approach can be considered as the preferred DDF relationship in future design flood estimations. The results also showed that a direct relationship exists between the catchment area and TC, thus ARFs can be explicitly expressed in terms of only the catchment area. Keywords: Rainfall ...
DEFF Research Database (Denmark)
Skovgård Olsen, Anders; Zhou, Qianqian; Linde, Jens Jørgen
Estimating the expected annual damage (EAD) due to flooding in an urban area is of great interest for urban water managers and other stakeholders. It is a strong indicator for a given area, showing how it will be affected by climate change and how much can be gained by implementing adaptation measures. This study investigates three different methods for estimating the EAD based on a log-linear relation between the damage costs and the return periods, one of which has been used in previous studies. The results show that with the increased amount of data points there appears to be a shift in the log-linear relation, which could be attributed to the Danish design standards for drainage systems. Three different methods for estimating the EAD were tested, and the choice of method is less important than accounting for the log-linear shift. This then also means that the statistical approximation of the EAD used...
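The log-linear EAD estimation described in this abstract can be sketched as follows. This is an illustrative reconstruction, not the study's actual method or data: the damage model D(T) = a + b·ln(T) is fitted to a few hypothetical (return period, damage) pairs, and the EAD is obtained by integrating damage over annual exceedance probability p = 1/T.

```python
import numpy as np

def fit_loglinear_damage(return_periods, damages):
    """Fit the log-linear damage model D(T) = a + b*ln(T)."""
    b, a = np.polyfit(np.log(return_periods), damages, 1)
    return a, b

def expected_annual_damage(return_periods, damages, t_max=1000.0):
    """EAD = integral of damage over annual exceedance probability p = 1/T."""
    a, b = fit_loglinear_damage(return_periods, damages)
    T = np.logspace(np.log10(min(return_periods)), np.log10(t_max), 500)
    p = 1.0 / T                                  # exceedance probability (decreasing)
    D = np.clip(a + b * np.log(T), 0.0, None)    # no negative damages
    # trapezoidal integration of D over p; p decreases, so negate the sum
    return float(-np.sum(0.5 * (D[1:] + D[:-1]) * np.diff(p)))

# hypothetical damages (e.g. MEUR) at 10-, 50- and 100-year return periods
T_obs = [10.0, 50.0, 100.0]
D_obs = [1.0, 3.0, 4.0]
ead = expected_annual_damage(T_obs, D_obs)
```

With these toy numbers the EAD comes out as a small fraction of the 10-year damage, because rare large losses are weighted by their low annual probability.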
Berthet, Lionel; Marty, Renaud; Bourgin, François; Viatgé, Julie; Piotte, Olivier; Perrin, Charles
2017-04-01
An increasing number of operational flood forecasting centres assess the predictive uncertainty associated with their forecasts and communicate it to the end users. This information can match the end users' needs (i.e. prove useful for efficient crisis management) only if it is reliable: reliability is therefore a key quality for operational flood forecasts. In 2015, the French national and regional flood forecasting services (Vigicrues network; www.vigicrues.gouv.fr) implemented a framework to compute quantitative discharge and water level forecasts and to assess the predictive uncertainty. Among the possible technical options to achieve this goal, a statistical analysis of past forecasting errors of deterministic models was selected (QUOIQUE method, Bourgin, 2014). It is a data-based and non-parametric approach based on as few assumptions as possible about the mathematical structure of the forecasting error. In particular, a very simple assumption is made regarding the predictive uncertainty distributions for large events outside the range of the calibration data: the multiplicative error distribution is assumed to be constant, whatever the magnitude of the flood. Indeed, the predictive distributions may not be reliable in extrapolation. However, estimating the predictive uncertainty for these rare events is crucial when major floods are of concern. In order to improve forecast reliability for major floods, an attempt is made at combining the operational strength of the empirical statistical analysis with a simple error model. Since the heteroscedasticity of forecast errors can considerably weaken the predictive reliability for large floods, this error model is based on the log-sinh transformation, which proved to significantly reduce the heteroscedasticity of the transformed error in a simulation context, even for flood peaks (Wang et al., 2012). Exploratory tests on some operational forecasts issued during the recent floods experienced in
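The core idea of a data-based, non-parametric analysis of past forecasting errors can be sketched in a few lines. This is a simplified illustration with invented numbers, not the actual QUOIQUE implementation: past multiplicative errors (observation/forecast) yield empirical quantiles, which are then applied to a new deterministic forecast to produce an uncertainty band.

```python
import numpy as np

# Hypothetical archive of past observed vs. forecast discharges (m3/s)
past_obs      = np.array([102., 250., 180.,  95., 310., 140.])
past_forecast = np.array([ 90., 270., 160., 100., 280., 150.])

# Multiplicative forecast errors, assumed (as in the abstract) to have a
# distribution independent of flood magnitude
ratios = past_obs / past_forecast

# Empirical quantiles of the error distribution
q10, q50, q90 = np.quantile(ratios, [0.1, 0.5, 0.9])

# Scale a new deterministic forecast by the error quantiles to get a
# predictive interval around it
new_forecast = 200.0
bounds = (new_forecast * q10, new_forecast * q50, new_forecast * q90)
```

The interval simply stretches with the forecast, which is exactly the constant-multiplicative-error assumption the abstract flags as questionable in extrapolation to major floods.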
Bayesian flood forecasting methods: A review
Han, Shasha; Coulibaly, Paulin
2017-08-01
Over the past few decades, floods have been among the most common and most widely distributed natural disasters in the world. If floods could be accurately forecasted in advance, their negative impacts could be greatly minimized. It is widely recognized that quantification and reduction of the uncertainty associated with hydrologic forecasts is of great importance for flood estimation and rational decision making. The Bayesian forecasting system (BFS) offers an ideal theoretical framework for uncertainty quantification that can be developed for probabilistic flood forecasting via any deterministic hydrologic model. It provides a suitable theoretical structure, empirically validated models and reasonable analytic-numerical computation methods, and can be developed into various Bayesian forecasting approaches. This paper presents a comprehensive review of Bayesian forecasting approaches applied in flood forecasting from 1999 to the present. The review starts with an overview of the fundamentals of BFS and recent advances in BFS, followed by BFS applications in river stage forecasting and real-time flood forecasting; it then moves to a critical analysis evaluating the advantages and limitations of Bayesian forecasting methods and other predictive uncertainty assessment approaches in flood forecasting, and finally discusses future research directions in Bayesian flood forecasting. Results show that the Bayesian flood forecasting approach is an effective and advanced way for flood estimation: it considers all sources of uncertainty and produces a predictive distribution of the river stage, river discharge or runoff, and thus gives more accurate and reliable flood forecasts. Some emerging Bayesian forecasting methods (e.g. the ensemble Bayesian forecasting system and Bayesian multi-model combination) have been shown to overcome the limitations of a single model or fixed model weights and to effectively reduce predictive uncertainty. In recent years, various Bayesian flood forecasting approaches have been
Climate change and flood hazard: Evaluation of the SCHADEX methodology in a non-stationary context
International Nuclear Information System (INIS)
Brigode, Pierre
2013-01-01
Since 2006, Electricité de France (EDF) has applied a new hydro-climatological approach to extreme rainfall and flood predetermination, the SCHADEX method, for the design of dam spillways. In a context of a potential increase in extreme event intensity and frequency due to climate change, the use of the SCHADEX method in non-stationary conditions is a topic of major interest for EDF hydrologists. Thus, the scientific goal of this Ph.D. thesis work has been to evaluate the ability of the SCHADEX method to take future climate simulations into account for the estimation of future extreme floods. The recognized inability of climate models and downscaling methods to simulate (extreme) rainfall distributions at the catchment scale has been avoided by developing and testing new methodological approaches. Moreover, the decomposition of the flood-producing factors proposed by the SCHADEX method has been used for considering different simulated climatic evolutions and for quantifying the relative impact of these factors on the extreme flood estimation. First, the SCHADEX method was applied in present time over different climatic contexts (France, Austria, Canada and Norway), thanks to several collaborations with academic and industrial partners. A sensitivity analysis made it possible to quantify, independently, the sensitivity of extreme flood estimation to rainfall hazard, catchment saturation hazard and the rainfall-runoff transformation. The results showed a large sensitivity of SCHADEX flood estimations to the rainfall hazard and to the rainfall-runoff transformation. Using the sensitivity analysis results, tests were done in order to estimate the future evolution of the 'key' variables previously identified. New climate model outputs (produced within the CMIP5 project) were analyzed and used for determining the future frequency of rainfall events and future catchment saturation conditions. Considering these simulated evolutions within the SCHADEX method led to a significant decrease of
Tanaka, T.; Tachikawa, Y.; Ichikawa, Y.; Yorozu, K.
2017-12-01
Flood is one of the most hazardous disasters and causes serious damage to people and property around the world. To prevent and mitigate flood damage through early warning systems and/or river management planning, numerical modelling of flood-inundation processes is essential. In the literature, flood-inundation models have been extensively developed and improved to achieve flood flow simulation with complex topography at high resolution. With increasing demands on flood-inundation modelling, its computational burden is now one of the key issues. Improvements to the computational efficiency of the full shallow water equations have been made from various perspectives, such as approximations of the momentum equations, parallelization techniques, and coarsening approaches. To complement these techniques and further improve the computational efficiency of flood-inundation simulations, this study proposes an Automatic Domain Updating (ADU) method for 2-D flood-inundation simulation. The ADU method traces the wet and dry interface and automatically updates the simulation domain in response to the progress and recession of flood propagation. The updating algorithm is as follows: first, register the simulation cells potentially flooded at the initial stage (such as floodplains near river channels); then, if a registered cell is flooded, register its surrounding cells. The time for this additional process is saved by checking only cells at the wet and dry interface, and the computation time is reduced by skipping the processing of non-flooded areas. This algorithm is easily applied to any type of 2-D flood inundation model. The proposed ADU method is implemented in 2-D local inertial equations for the Yodo River basin, Japan. Case studies for two flood events show that the simulation finishes in 2 to 10 times less computation time while producing the same results as the simulation without the ADU method.
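The domain-updating rule described in the abstract can be sketched on a toy grid. This is only an illustration of the registration logic, with a trivially simple wetting rule standing in for the actual hydraulic solver: each step processes only the registered (active) cells, and when a registered cell is wet, its neighbours are registered in turn, so the simulation domain tracks the wet/dry interface instead of covering the whole grid.

```python
def simulate(nrows, ncols, source, steps):
    """Toy ADU-style flood spread: process only registered cells and
    grow the registered domain from wet cells' neighbours."""
    wet = {source}                    # currently wet cells
    registered = {source}             # active simulation domain
    for _ in range(steps):
        newly_wet = set()
        for (r, c) in list(registered):
            if (r, c) in wet:         # only wet cells push the interface
                for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    rr, cc = r + dr, c + dc
                    if 0 <= rr < nrows and 0 <= cc < ncols:
                        registered.add((rr, cc))   # extend the domain
                        newly_wet.add((rr, cc))    # toy wetting rule
        wet |= newly_wet
    return wet, registered

wet, registered = simulate(9, 9, source=(4, 4), steps=2)
```

After two steps the wet region is the diamond of Manhattan radius 2 around the source (13 cells); all never-registered cells were skipped entirely, which is where the real method saves its computation time.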
Assessing flood risk at the global scale: model setup, results, and sensitivity
International Nuclear Information System (INIS)
Ward, Philip J; Jongman, Brenden; Weiland, Frederiek Sperna; Winsemius, Hessel C; Bouwman, Arno; Ligtvoet, Willem; Van Beek, Rens; Bierkens, Marc F P
2013-01-01
Globally, economic losses from flooding exceeded $19 billion in 2012, and are rising rapidly. Hence, there is an increasing need for global-scale flood risk assessments, also within the context of integrated global assessments. We have developed and validated a model cascade for producing global flood risk maps, based on numerous flood return-periods. Validation results indicate that the model simulates interannual fluctuations in flood impacts well. The cascade involves: hydrological and hydraulic modelling; extreme value statistics; inundation modelling; flood impact modelling; and estimating annual expected impacts. The initial results estimate global impacts for several indicators, for example annual expected exposed population (169 million); and annual expected exposed GDP ($1383 billion). These results are relatively insensitive to the extreme value distribution employed to estimate low frequency flood volumes. However, they are extremely sensitive to the assumed flood protection standard; developing a database of such standards should be a research priority. Also, results are sensitive to the use of two different climate forcing datasets. The impact model can easily accommodate new, user-defined, impact indicators. We envisage several applications, for example: identifying risk hotspots; calculating macro-scale risk for the insurance industry and large companies; and assessing potential benefits (and costs) of adaptation measures. (letter)
Development and validation of a two-dimensional fast-response flood estimation model
Energy Technology Data Exchange (ETDEWEB)
Judi, David R [Los Alamos National Laboratory; Mcpherson, Timothy N [Los Alamos National Laboratory; Burian, Steven J [UNIV OF UTAH
2009-01-01
A finite difference formulation of the shallow water equations using an upwind differencing method was developed, maintaining computational efficiency and accuracy such that it can be used as a fast-response flood estimation tool. The model was validated using both laboratory controlled experiments and an actual dam breach. Through the laboratory experiments, the model was shown to give good estimates of depth and velocity when compared to the measured data, as well as when compared to a more complex two-dimensional model. Additionally, the model was compared to high water mark data obtained from the failure of the Taum Sauk dam. The simulated inundation extent agreed well with the observed extent, with the most notable differences resulting from the inability to model sediment transport. The results of these validation studies show that a relatively simple numerical scheme used to solve the complete shallow water equations can be used to accurately estimate flood inundation. Future work will focus on further reducing the computation time needed to provide flood inundation estimates for fast-response analyses. This will be accomplished through the efficient use of multi-core, multi-processor computers coupled with an efficient domain-tracking algorithm, as well as an understanding of the impacts of grid resolution on model results.
Directory of Open Access Journals (Sweden)
Ling Kang
Full Text Available Heuristic search algorithms, which are characterized by faster convergence rates and can obtain better solutions than traditional mathematical methods, are extensively used in engineering optimizations. In this paper, a newly developed elitist-mutated particle swarm optimization (EMPSO) technique and an improved gravitational search algorithm (IGSA) are successively applied to parameter estimation problems of Muskingum flood routing models. First, the global optimization performance of the EMPSO and IGSA is validated on nine standard benchmark functions. Then, to further analyse the applicability of the EMPSO and IGSA to various forms of Muskingum models, three typical structures are considered: the basic two-parameter linear Muskingum model (LMM), a three-parameter nonlinear Muskingum model (NLMM) and a four-parameter nonlinear Muskingum model which incorporates lateral flow (NLMM-L). The problems are formulated as optimization procedures to minimize the sum of the squared deviations (SSQ) or the sum of the absolute deviations (SAD) between the observed and the estimated outflows. Comparative results for the selected numerical cases (Cases 1-3) show that the EMPSO and IGSA not only converge rapidly but also obtain the same best optimal parameter vector in every run. The EMPSO and IGSA exhibit superior robustness and provide two efficient alternative approaches that can be confidently employed to estimate the parameters of both linear and nonlinear Muskingum models in engineering applications.
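The two-parameter LMM and its SSQ-based calibration can be sketched directly. This is an illustrative reconstruction with synthetic data: the standard Muskingum routing recursion is used, and a brute-force grid search over (K, x) stands in for the EMPSO/IGSA heuristics described in the paper.

```python
import numpy as np

def muskingum_route(inflow, K, x, dt=1.0):
    """Linear Muskingum routing: O[t+1] = C0*I[t+1] + C1*I[t] + C2*O[t]."""
    D = K * (1 - x) + dt / 2
    C0 = (dt / 2 - K * x) / D
    C1 = (dt / 2 + K * x) / D
    C2 = (K * (1 - x) - dt / 2) / D
    O = np.empty_like(inflow, dtype=float)
    O[0] = inflow[0]                     # assume initial steady state
    for t in range(len(inflow) - 1):
        O[t + 1] = C0 * inflow[t + 1] + C1 * inflow[t] + C2 * O[t]
    return O

def fit_lmm(inflow, outflow, dt=1.0):
    """Minimise SSQ over a (K, x) grid -- a stand-in for EMPSO/IGSA."""
    best = (None, None, np.inf)
    for K in np.linspace(0.5, 5.0, 91):
        for x in np.linspace(0.0, 0.5, 51):
            ssq = np.sum((muskingum_route(inflow, K, x, dt) - outflow) ** 2)
            if ssq < best[2]:
                best = (K, x, ssq)
    return best

# synthetic check: route with known parameters, then recover them
I = np.array([10, 30, 68, 50, 40, 31, 23, 17, 12, 10], dtype=float)
O = muskingum_route(I, K=2.0, x=0.25)
K_hat, x_hat, ssq = fit_lmm(I, O)
```

On noise-free synthetic data the search recovers the true (K, x); on real hydrographs the heuristic optimizers in the paper are used because the nonlinear variants (NLMM, NLMM-L) make the objective surface harder to search.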
The Value Estimation of an HFGW Frequency Time Standard for Telecommunications Network Optimization
Harper, Colby; Stephenson, Gary
2007-01-01
The emerging technology of gravitational wave control is used to augment a communication system using a development roadmap suggested in Stephenson (2003) for applications emphasized in Baker (2005). In the present paper, consideration is given to the value of a High Frequency Gravitational Wave (HFGW) channel purely as a method of frequency and time reference distribution for use within conventional Radio Frequency (RF) telecommunications networks. Specifically, the native value of conventional telecommunications networks may be optimized by using an unperturbed frequency time standard (FTS) to (1) improve terminal navigation and Doppler estimation performance via improved time difference of arrival (TDOA) from a universal time reference, and (2) improve acquisition speed, coding efficiency, and dynamic bandwidth efficiency through the use of a universal frequency reference. A model utilizing a discounted cash flow technique provides an estimate of the additional value that HFGW FTS technology could bring to a mixed-technology HFGW/RF network. By applying a simple net present value analysis with supporting reference valuations to such a network, it is demonstrated that an HFGW FTS could create a sizable improvement within an otherwise conventional RF telecommunications network. Our conservative model establishes a low-side value estimate of approximately 50 B USD net present value for an HFGW FTS service, with reasonable potential high-side values at significant multiples of this low-side value floor.
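The net present value analysis mentioned in the abstract reduces to the standard discounted cash flow formula. The cash flows and discount rate below are purely illustrative, not the paper's valuation inputs:

```python
def npv(cash_flows, rate):
    """Net present value of annual cash flows (year 0 first):
    NPV = sum(CF_t / (1 + r)^t)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

# toy example: an up-front cost of 100 followed by two annual returns of 60,
# discounted at 10% per year
value = npv([-100.0, 60.0, 60.0], 0.10)
```

A positive NPV under a conservative discount rate is the decision criterion implied by the abstract's "low-side value estimate".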
Directory of Open Access Journals (Sweden)
Le Bihan Guillaume
2016-01-01
Full Text Available Flash flood monitoring systems developed up to now generally enable a real-time assessment of potential flash-flood magnitudes based on highly distributed hydrological models and weather radar records. The approach presented here aims to go one step further by offering a direct assessment of the potential impacts of flash floods on inhabited areas. This approach is based on an a priori analysis of the considered area in order (1) to evaluate, based on a semi-automatic hydraulic approach (the Cartino method), the potentially flooded areas for different discharge levels, and (2) to identify the associated buildings and/or population at risk based on geographic databases. This preliminary analysis enables the construction of a simplified impact model (discharge-impact curve) for each river reach, which can be used to directly estimate the importance of potentially affected assets based on the outputs of a distributed rainfall-runoff model. This article presents a first case study conducted in the Gard region (south-eastern France). The first validation results are presented in terms of (1) accuracy of the delineation of the flooded areas estimated based on the Cartino method and using a high-resolution DTM, and (2) relevance and usefulness of the impact model obtained. The impacts estimated at the event scale will now be evaluated in the near future based on insurance claim data provided by CCR (Caisse Centrale de Réassurance).
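A discharge-impact curve of the kind described above is just a precomputed lookup per river reach. The numbers below are hypothetical, not taken from the Gard case study: impacts are precomputed at a few discharge levels, and real-time outputs of a rainfall-runoff model are mapped to impacts by interpolation.

```python
import numpy as np

# Hypothetical discharge-impact curve for one river reach: buildings within
# the flooded area at a few precomputed discharge levels (Cartino-style
# a priori hydraulic analysis)
discharge = np.array([50.0, 100.0, 200.0, 400.0])   # m3/s
buildings = np.array([0.0, 12.0, 60.0, 140.0])      # buildings affected

def impact(q):
    """Affected buildings for a simulated discharge q, by linear
    interpolation along the precomputed discharge-impact curve."""
    return float(np.interp(q, discharge, buildings))
```

At run time, only this cheap lookup is needed per reach, which is what makes a direct real-time impact assessment feasible on top of a distributed hydrological model.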
Flood Hazard Recurrence Frequencies for the Savannah River Site
International Nuclear Information System (INIS)
Chen, K.F.
2001-01-01
Department of Energy (DOE) regulations outline the requirements for Natural Phenomena Hazard (NPH) mitigation for new and existing DOE facilities. The NPH considered in this report is flooding. The facility-specific probabilistic flood hazard curve defines, as a function of water elevation, the annual probability of occurrence or the return period in years. Facility-specific probabilistic flood hazard curves provide a basis to avoid unnecessary facility upgrades, to establish appropriate design criteria for new facilities, and to develop emergency preparedness plans to mitigate the consequences of floods. A method based on precipitation, basin runoff and open channel hydraulics was developed to determine probabilistic flood hazard curves for the Savannah River Site. The calculated flood hazard curves show that the probabilities of flooding at existing major SRS facilities are significantly less than 1.0E-05 per year
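The two quantities a hazard curve relates, annual exceedance probability and return period (T = 1/p), extend naturally to a planning horizon. As an illustration of how a value like the 1.0E-05 per year figure above is interpreted over a facility lifetime (assuming independent annual maxima):

```python
def exceedance_prob(annual_p, years):
    """Probability of at least one exceedance of a given water elevation
    within a planning horizon, from its annual probability."""
    return 1.0 - (1.0 - annual_p) ** years

# a level with annual probability 1.0E-05 over a 50-year facility life
p50 = exceedance_prob(1.0e-5, 50)   # roughly 50x the annual probability
```

For small annual probabilities the horizon risk is close to annual_p * years, which is why an annual probability well below 1.0E-05 keeps lifetime flood risk negligible.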
A comparison of regional flood frequency analysis approaches in a simulation framework
Ganora, D.; Laio, F.
2016-07-01
Regional frequency analysis (RFA) is a well-established methodology for estimating the flood frequency curve at ungauged (or scarcely gauged) sites. Different RFA approaches exist, depending on the way the information is transferred to the site of interest, but it is not clear from the literature whether a specific method systematically outperforms the others. The aim of this study is to provide a framework in which to carry out the intercomparison, by building a virtual environment based on synthetically generated data. The considered regional approaches include: (i) a unique regional curve for the whole region; (ii) a multiple-region model where homogeneous subregions are determined through cluster analysis; (iii) a Region-of-Influence model which defines a homogeneous subregion for each site; (iv) a spatially smooth estimation procedure where the parameters of the regional model vary continuously in space. Virtual environments are generated considering different patterns of heterogeneity, including step changes and smooth variations. If the region is heterogeneous, with the parent distribution changing continuously within the region, the spatially smooth regional approach outperforms the others, with overall errors 10-50% lower than the other methods. In the case of a step change, the spatially smooth and clustering procedures perform similarly if the heterogeneity is moderate, while clustering procedures work better when the step change is severe. To extend our findings, an extensive sensitivity analysis was performed to investigate the effect of sample length, number of virtual stations, return period of the predicted quantile, variability of the scale parameter of the parent distribution, number of predictor variables and different parent distributions. Overall, the spatially smooth approach appears to be the most robust approach, as its performance is more stable across different patterns of heterogeneity, especially when short records are
Kourgialas, N. N.; Karatzas, G. P.
2014-03-01
A modeling system for the estimation of flash flood flow velocity and sediment transport is developed in this study. The system comprises three components: (a) a modeling framework based on the hydrological model HSPF, (b) the hydrodynamic module of the hydraulic model MIKE 11 (quasi-2-D), and (c) the advection-dispersion module of MIKE 11 as a sediment transport model. An important parameter in hydraulic modeling is the Manning's coefficient, an indicator of the channel resistance which is directly dependent on riparian vegetation changes. Riparian vegetation's effect on flood propagation parameters such as water depth (inundation), discharge, flow velocity, and sediment transport load is investigated in this study. Based on the obtained results, when the weed-cutting percentage is increased, the flood wave depth decreases while flow discharge, velocity and sediment transport load increase. The proposed modeling system is used to evaluate and illustrate the flood hazard for different riparian vegetation cutting scenarios. For the estimation of flood hazard, a combination of the flood propagation characteristics of water depth, flow velocity and sediment load was used. Next, a well-balanced selection of the most appropriate agricultural cutting practices of riparian vegetation was performed. Ultimately, the model results obtained for different agricultural cutting practice scenarios can be employed to create flood protection measures for flood-prone areas. The proposed methodology was applied to the downstream part of a small Mediterranean river basin in Crete, Greece.
International Nuclear Information System (INIS)
Khan, B.
2007-01-01
High flows and stream discharge have long been measured and used by engineers in the design of hydraulic structures and flood-protection works and in planning for flood-plain use. Probability analysis is the basis for the engineering design of many projects and provides advance information for flood forecasting. High-flow analysis, or flood-frequency studies, interprets a past record of events to predict the future probability of occurrence. In many countries, including the author's country, the long-term flow data required for the design of hydraulic structures and flood-protection works are not available. In such cases, the only tool available to hydrologists is to extend the short-term flow data available at some other site in the region. The present study aims to find a reliable estimate of the maximum instantaneous flood for higher frequencies of the Kabul River at Warsak weir. The Kabul River at the Nowshera gaging station is used for this purpose, and regression analysis is performed to extend the instantaneous peak-flow record up to 29 years at Warsak. The frequency curves of high flows are plotted on normal probability paper using different probability distributions. The Gumbel distribution proved to be the best fit for the observed data points, and is used here for estimation of floods for different return periods. (author)
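A Gumbel-based return-period flood estimate, as used in the study above, can be sketched with a method-of-moments fit. The annual peaks below are invented for illustration; the study's own record and fitting procedure may differ:

```python
import math

def gumbel_fit(peaks):
    """Method-of-moments Gumbel fit: scale alpha = sqrt(6)*s/pi,
    location u = mean - 0.5772*alpha (Euler-Mascheroni correction)."""
    n = len(peaks)
    mean = sum(peaks) / n
    s = math.sqrt(sum((x - mean) ** 2 for x in peaks) / (n - 1))
    alpha = math.sqrt(6) * s / math.pi
    u = mean - 0.5772 * alpha
    return u, alpha

def gumbel_quantile(u, alpha, T):
    """Flood magnitude with return period T years:
    x_T = u - alpha * ln(-ln(1 - 1/T))."""
    return u - alpha * math.log(-math.log(1.0 - 1.0 / T))

# toy annual peak-flow record (m3/s)
u, alpha = gumbel_fit([100.0, 120.0, 140.0, 160.0, 180.0])
q2, q10, q100 = (gumbel_quantile(u, alpha, T) for T in (2, 10, 100))
```

The 2-year quantile sits near the median of the record, while the 100-year quantile extrapolates well beyond it, which is where distribution choice (Gumbel vs. alternatives) matters most.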
Directory of Open Access Journals (Sweden)
C.-H. Li
2013-07-01
Full Text Available The vulnerability to flood disaster is addressed by a number of studies. It is of great importance to analyze the vulnerability of different regions and various periods to enable the government to make policies for distributing relief funds and to help regions improve their capabilities against disasters, yet a recognized paradigm for such studies seems to be missing. Vulnerability is defined and evaluated through either physical or economic-ecological perspectives, depending on the field of the researcher concerned. Vulnerability, however, is the core of both systems, as it entails systematic descriptions of flood severities or disaster management units. The research mentioned often has a development perspective, and in this article we decompose the overall flood system into several factors: disaster driver, disaster environment, disaster bearer, and disaster intensity, and take the interaction mechanism among all factors as an indispensable function. The conditions of the flood disaster components are characterized by the disaster driver risk level, the disaster environment stability level and the disaster bearer sensitivity, respectively. The flood system vulnerability is expressed as vulnerability = f(risk, stability, sensitivity). Based on this theory, the data envelopment analysis (DEA) method is used to detail the spatiotemporal variation of the relative vulnerability of a flood disaster system and its components in the Dongting Lake region. The study finds that although a flood disaster system's relative vulnerability is closely associated with its components' conditions, the flood system and its components have different vulnerability levels. The overall vulnerability is not the aggregation of its components' vulnerability. On a spatial scale, zones central and adjacent to Dongting Lake and/or river zones are characterized by very high vulnerability. Zones with low and very low vulnerability are mainly distributed in the periphery of the Dongting Lake region
2017-06-30
ERDC/EL SR-17-3, June 2017. Levee Setbacks: An Innovative, Cost-Effective, and Sustainable Solution for Improved Flood Risk Management. David L. Smith ... an alternative view point is necessary. Levee setbacks are a relatively recent innovation in Corps flood risk management practice
Frequency domain based LS channel estimation in OFDM based Power line communications
Bogdanović, Mario
2015-01-01
This paper is focused on low-voltage power line communication (PLC) realization, with an emphasis on channel estimation techniques. The Orthogonal Frequency Division Multiplexing (OFDM) scheme is the preferred technology in PLC systems because it effectively combats the frequency-selective fading properties of the PLC channel. As channel estimation is one of the crucial problems in an OFDM-based PLC system because of PLC signal attenuation and interference, the improved LS est...
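The basic frequency-domain LS channel estimate the abstract builds on can be illustrated with a toy OFDM simulation. The channel taps, pilot constellation and noise level below are all assumed values, not parameters from the paper: with known pilots X on each subcarrier and received Y = H·X + n, the LS estimate is simply Ĥ[k] = Y[k]/X[k].

```python
import numpy as np

rng = np.random.default_rng(0)
n_sc = 64                                    # subcarriers

# assumed 3-tap channel impulse response and its frequency response
h = np.array([0.8, 0.4j, 0.2])
H = np.fft.fft(h, n_sc)

# known QPSK pilot symbols on every subcarrier
X = rng.choice(np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]), n_sc)

# received frequency-domain signal with additive noise
noise = 0.01 * (rng.standard_normal(n_sc) + 1j * rng.standard_normal(n_sc))
Y = H * X + noise

# frequency-domain least-squares channel estimate, per subcarrier
H_ls = Y / X
err = np.mean(np.abs(H_ls - H) ** 2)
```

The LS estimate is cheap and unbiased but passes noise straight through (error variance proportional to the noise power), which is why improved LS variants such as the one the abstract refers to add smoothing or exploit the channel's limited delay spread.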
Viganotti, Matteo; Jackson, Ruth; Krahn, Hartmut; Dyer, Mark
2013-05-01
Earthen flood defence embankments are linear structures, raised above the flood plain, that are commonly used as flood defences in rural settings; these are often relatively old structures constructed using locally garnered material and of which little is known in terms of design and construction. Alarmingly, it is generally reported that a number of urban developments have expanded to previously rural areas; hence, acquiring knowledge about the flood defences protecting these areas has risen significantly in the agendas of basin and asset managers. This paper focusses, by reporting two case studies, on electromagnetic induction (EMI) methods that would efficiently complement routine visual inspections and would represent a first step to more detailed investigations. Evaluation of the results is presented by comparison with ERT profiles and intrusive investigation data. The EM data, acquired using a GEM-2 apparatus for frequency sounding and an EM-31 apparatus for geometrical sounding, has been handled using the prototype eGMS software tool, being developed by the eGMS international research consortium; the depth sounding data interpretation was assisted by 1D inversions obtained with the EM1DFM software developed by the University of British Columbia. Although both sounding methods showed some limitations, the models obtained were consistent with ERT models and the techniques were useful screening methods for the identification of areas of interest, such as material interfaces or potential seepage areas, within the embankment structure: 1D modelling improved the rapid assessment of earthen flood defence embankments in an estuarine environment; evidence that EMI sounding could play an important role as a monitoring tool or as a first step towards more detailed investigations.
Thirty Years Later: Reflections of the Big Thompson Flood, Colorado, 1976 to 2006
Jarrett, R. D.; Costa, J. E.; Brunstein, F. C.; Quesenberry, C. A.; Vandas, S. J.; Capesius, J. P.; O'Neill, G. B.
2006-12-01
When substantial flooding occurs, the USGS mobilizes personnel to collect streamflow data in affected areas. Streamflow data improve flood forecasting and provide data for flood-frequency analysis for floodplain management, design of structures located in floodplains, and related water studies. An important lesson learned is that nature provides environmental signs before and during floods that can help people avoid hazard areas. Important contributions to flood science as a result of the 1976 flood include the development of paleoflood methods to interpret the preserved flood-plain stratigraphy and document the number, magnitude, and age of floods that occurred prior to streamflow monitoring. These methods and data on large floods can be used in many mountain-river systems to help us better understand flood hazards and plan for the future. For example, according to conventional flood-frequency analysis, the 1976 Big Thompson flood had a flood recurrence interval of about 100 years. However, paleoflood research indicated the 1976 flood was the largest in about the last 10,000 years in the basin and had a flood recurrence interval in excess of 1,000 years.
The Importance of Studying Past Extreme Floods to Prepare for Uncertain Future Extremes
Burges, S. J.
2016-12-01
Hoyt and Langbein, in their 1955 book 'Floods', wrote: "...meteorologic and hydrologic conditions will combine to produce superfloods of unprecedented magnitude. We have every reason to believe that in most rivers past floods may not be an accurate measure of ultimate flood potentialities. It is this superflood with which we are always most concerned." I provide several examples to offer some historical perspective on assessing extreme floods. In one example, flooding in the Miami Valley, OH, in 1913 claimed 350 lives. The engineering and socio-economic challenges facing the Morgan Engineering Co. in how to mitigate future flood damage and loss of life when limited information was available provide guidance about ways to face an uncertain hydroclimatic future, particularly one of a changed climate. A second example forces us to examine mixed flood populations and illustrates the huge uncertainty in assigning flood magnitude and exceedance probability to extreme floods in such cases. There is large uncertainty in flood frequency estimates; knowledge of the total flood hydrograph, not the peak flood flow rate alone, is what is needed for hazard mitigation assessment or design. Some challenges in estimating the complete flood hydrograph in an uncertain future climate, including demands on hydrologic models and their inputs, are addressed.
Estimation of wind stress using dual-frequency TOPEX data
Elfouhaily, Tanos; Vandemark, Douglas; Gourrion, Jérôme; Chapron, Bertrand
1998-10-01
The TOPEX/POSEIDON satellite carries the first dual-frequency radar altimeter. Monofrequency (Ku-band) algorithms are presently used to retrieve surface wind speed from the altimeter's radar cross-section measurement (σ0Ku). These algorithms work reasonably well, but it is also known that altimeter wind estimates can be contaminated by residual effects, such as sea state, embedded in the σ0Ku measurement. Investigating the potential benefit of using two frequencies for wind retrieval, it is shown that a simple evaluation of TOPEX data yields previously unavailable information, particularly for high and low wind speeds. As the wind speed increases, the dual-frequency data provide a measurement more directly linked to the short-scale surface roughness, which in turn is associated with the local surface wind stress. Using a global TOPEX σ0 data set and TOPEX's significant wave height (Hs) estimate as a surrogate for the sea state's degree of development, it is also shown that differences between the two TOPEX σ0 measurements provide strong evidence of a nonlocal sea state signature. A composite scattering theory is used to show how the dual-frequency data can provide an improved friction velocity model, especially for winds above 7 m/s. A wind speed conversion is included using a sea-state-dependent drag coefficient fed with TOPEX Hs data. Two colocated TOPEX-buoy data sets (from the National Data Buoy Center (NDBC) and the Structure des Echanges Mer-Atmosphère, Propriétés des Hétérogénéités Océaniques: Recherche Expérimentale (SEMAPHORE) campaign) are employed to test the new wind speed algorithm. A measurable improvement in wind speed estimation is obtained when compared to the monofrequency Witter and Chelton [1991] model.
Developing a Malaysia flood model
Haseldine, Lucy; Baxter, Stephen; Wheeler, Phil; Thomson, Tina
2014-05-01
Faced with growing exposures in Malaysia, insurers have a need for models to help them assess their exposure to flood losses. The need for an improved management of flood risks has been further highlighted by the 2011 floods in Thailand and recent events in Malaysia. The increasing demand for loss accumulation tools in Malaysia has lead to the development of the first nationwide probabilistic Malaysia flood model, which we present here. The model is multi-peril, including river flooding for thousands of kilometres of river and rainfall-driven surface water flooding in major cities, which may cause losses equivalent to river flood in some high-density urban areas. The underlying hazard maps are based on a 30m digital surface model (DSM) and 1D/2D hydraulic modelling in JFlow and RFlow. Key mitigation schemes such as the SMART tunnel and drainage capacities are also considered in the model. The probabilistic element of the model is driven by a stochastic event set based on rainfall data, hence enabling per-event and annual figures to be calculated for a specific insurance portfolio and a range of return periods. Losses are estimated via depth-damage vulnerability functions which link the insured damage to water depths for different property types in Malaysia. The model provides a unique insight into Malaysian flood risk profiles and provides insurers with return period estimates of flood damage and loss to property portfolios through loss exceedance curve outputs. It has been successfully validated against historic flood events in Malaysia and is now being successfully used by insurance companies in the Malaysian market to obtain reinsurance cover.
Fundamental Frequency Estimation using Polynomial Rooting of a Subspace-Based Method
DEFF Research Database (Denmark)
Jensen, Jesper Rindom; Christensen, Mads Græsbøll; Jensen, Søren Holdt
2010-01-01
improvements compared to HMUSIC. First, using the proposed method we can obtain an estimate of the fundamental frequency without a grid search as in HMUSIC, because the fundamental frequency is estimated as the argument of the root lying closest to the unit circle. Second, we obtain a higher spectral resolution compared to HMUSIC, which is a property of polynomial rooting methods. Our simulation results show that the proposed method is applicable to real-life signals, and that in most cases we obtain a higher spectral resolution than HMUSIC.
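The rooting step the abstract describes can be sketched for a single complex tone (a simplification of the harmonic HMUSIC setting); the covariance estimate, subspace split, and polynomial construction below are standard root-MUSIC, with all sizes (m = 8, N = 256) and the noise level chosen purely for illustration:

```python
import numpy as np

def rootmusic_freq(x, m=8, p=1):
    """Estimate p complex-exponential frequencies (cycles/sample) by
    polynomial rooting of the MUSIC noise subspace (root-MUSIC)."""
    N = len(x)
    # covariance matrix from length-m snapshots: R = E[s s^H]
    X = np.array([x[i:i + m] for i in range(N - m + 1)])
    R = (X.T @ X.conj()) / X.shape[0]
    # eigh returns ascending eigenvalues; noise subspace = m - p smallest
    w, V = np.linalg.eigh(R)
    En = V[:, :m - p]
    C = En @ En.conj().T
    # polynomial coefficients are the sums along the diagonals of C,
    # ordered from highest degree to lowest for np.roots
    coeffs = np.array([np.trace(C, offset=k) for k in range(m - 1, -m, -1)])
    roots = np.roots(coeffs)
    # roots come in conjugate-reciprocal pairs; keep those inside the
    # unit circle and take the p closest to it (no grid search needed)
    roots = roots[np.abs(roots) < 1]
    closest = roots[np.argsort(1 - np.abs(roots))[:p]]
    return np.angle(closest) / (2 * np.pi)

rng = np.random.default_rng(0)
n = np.arange(256)
f_true = 0.12
x = np.exp(2j * np.pi * f_true * n) \
    + 0.05 * (rng.standard_normal(256) + 1j * rng.standard_normal(256))
f_hat = rootmusic_freq(x, m=8, p=1)[0]
```

The frequency falls directly out of the root's argument, which is the grid-search-free property the abstract highlights.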
Zhao, Ling; Xia, Huifen
2018-01-01
The polymer flooding project in the Daqing oilfield has achieved great success, improving recovery in the main oil reservoirs by more than 15%. However, when polymer flooding is applied to reservoirs with strong heterogeneity, the polymer solution circulates inefficiently and ineffectively through the high-permeability layers, which increases the polymer volume required and significantly reduces the polymer flooding efficiency. To address this problem, we studied a method that improves the polymer flooding effect in heterogeneous oil reservoirs by positively-charged gel profile control. The results show that the physical and chemical reaction of the positively-charged gel with the residual polymer in the high-permeability layer generates a three-dimensional polymer network that plugs the high-permeability layer and increases the injection pressure gradient, thereby improving polymer flooding development. At the same dosage, positively-charged gel profile control improves the polymer flooding recovery factor by 2.3∼3.8 percentage points. For the same increase in recovery factor, profile control reduces the required polymer volume by 50%. The positively-charged gel profile control technology is feasible, saves cost, is simple to apply, and causes no environmental pollution, and therefore has good application prospects.
Fan, Qin; Davlasheridze, Meri
2016-06-01
Climate change is expected to worsen the negative effects of natural disasters like floods. The negative impacts, however, can be mitigated by individuals' adjustments through migration and relocation behaviors. Previous literature has identified flood risk as one significant driver in relocation decisions, but no prior study examines the effect of the National Flood Insurance Program's voluntary program-the Community Rating System (CRS)-on residential location choice. This article fills this gap and tests the hypothesis that flood risk and the CRS-creditable flood control activities affect residential location choices. We employ a two-stage sorting model to empirically estimate the effects. In the first stage, individuals' risk perception and preference heterogeneity for the CRS activities are considered, while mean effects of flood risk and the CRS activities are estimated in the second stage. We then estimate heterogeneous marginal willingness to pay (WTP) for the CRS activities by category. Results show that age, ethnicity and race, educational attainment, and prior exposure to risk explain risk perception. We find significant values for the CRS-creditable mitigation activities, which provides empirical evidence for the benefits associated with the program. The marginal WTP for an additional credit point earned for public information activities, including hazard disclosure, is found to be the highest. Results also suggest that water amenities dominate flood risk. Thus, high amenity values may increase exposure to flood risk, and flood mitigation projects should be strategized in coastal regions accordingly. © 2015 Society for Risk Analysis.
Rydlund, Jr., Paul H.
2006-01-01
The Taum Sauk pump-storage hydroelectric power plant located in Reynolds County, Missouri, uses turbines that operate as pumps and hydraulic head generated by discharging water from an upper to a lower reservoir to produce electricity. A 55-acre upper reservoir with a 1.5-billion-gallon capacity was built on top of Proffit Mountain, approximately 760 feet above the floodplain of the East Fork Black River. At approximately 5:16 a.m. on December 14, 2005, a 680-foot-wide section of the upper reservoir embankment failed suddenly, sending water rushing down the western side of Proffit Mountain and emptying into the floodplain of the East Fork Black River. Flood waters from the upper reservoir flowed downstream through Johnson's Shut-Ins State Park and into the lower reservoir of the East Fork Black River. Floods such as this present unique challenges and opportunities to analyze and document peak-flow characteristics, flood profiles, inundation extents, and debris movement. On December 16, 2005, Light Detection and Ranging (LiDAR) data were collected and used to support hydraulic analyses, forensic failure analyses, damage-extent mapping, and mitigation of future disasters. To evaluate the impact of sedimentation in the lower reservoir, a bathymetric survey conducted on December 22 and 23, 2005, was compared to a previous bathymetric survey conducted in April 2005. Survey results indicated a maximum reservoir capacity difference of 147 acre-feet at a pool elevation of 730 feet. Peak discharge estimates of 289,000 cubic feet per second along Proffit Mountain and 95,000 cubic feet per second along the East Fork Black River were determined through indirect measurement techniques. The magnitude of the embankment failure flood along the East Fork Black River was approximately 4 times greater than the 100-year flood frequency estimate of 21,900 cubic feet per second, and approximately 3 times greater than the 500-year flood frequency estimate of 30,500 cubic feet per second.
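Indirect peak-discharge measurements of the kind mentioned above are commonly made with slope-area methods based on Manning's equation, Q = (1.486/n)·A·R^(2/3)·S^(1/2) in US customary units. The sketch below uses invented channel geometry and roughness, not the surveyed Taum Sauk cross sections:

```python
def manning_peak_discharge(area_ft2, wetted_perimeter_ft, slope, n):
    """Slope-area peak discharge (cfs) from a surveyed cross section:
    Q = (1.486/n) * A * R^(2/3) * S^(1/2), with R = A / wetted perimeter."""
    R = area_ft2 / wetted_perimeter_ft  # hydraulic radius (ft)
    return (1.486 / n) * area_ft2 * R ** (2.0 / 3.0) * slope ** 0.5

# Hypothetical cross-section reconstructed from high-water marks:
q = manning_peak_discharge(area_ft2=9000.0, wetted_perimeter_ft=700.0,
                           slope=0.02, n=0.045)
```

In practice, several cross sections are surveyed and the energy equation is solved between them; the single-section form above only illustrates the idea.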
Magnitude of flood flows for selected annual exceedance probabilities in Rhode Island through 2010
Zarriello, Phillip J.; Ahearn, Elizabeth A.; Levin, Sara B.
2012-01-01
in the flood magnitudes from 20- to 0.2-percent AEPs. Estimates of uncertainty of the at-site and regression flood magnitudes are provided and were combined with their respective estimated flood quantiles to improve estimates of flood flows at streamgages. This region has a long history of urban development, which is considered to have an important effect on flood flows. This study includes basins that have an impervious area ranging from 0.5 to 37 percent. Although imperviousness provided some explanatory power in the regression, it was not statistically significant at the 95-percent confidence level for any of the AEPs examined. Influence of urbanization on flood flows indicates a complex interaction with other characteristics that confounds a statistical explanation of its effects. Standard methods for calculating magnitude of floods for given AEP are based on the assumption of stationarity, that is, the annual peak flows exhibit no significant trend over time. A subset of 16 streamgages with 70 or more years of unregulated systematic record indicates all but 4 streamgages have a statistically significant positive trend at the 95-percent confidence level; three of these are statistically significant at about the 90-percent confidence level or above. If the trend continues linearly in time, the estimated magnitude of floods for any AEP, on average, will increase by 6, 13, and 21 percent in 10, 20, and 30 years' time, respectively. In 2010, new peaks of record were set at 18 of the 21 active streamgages in Rhode Island. The updated flood frequency analysis indicates the peaks at these streamgages ranged from 2- to 0.2-percent AEP. Many streamgages in the State peaked at a 0.5- and 0.2-percent AEP, except for streamgages in the Blackstone River Basin, which peaked from a 4- to 2-percent AEP.
Development of Integrated Flood Analysis System for Improving Flood Mitigation Capabilities in Korea
Moon, Young-Il; Kim, Jong-suk
2016-04-01
Recently, people's needs are growing for a safer life and a homeland secure from unexpected natural disasters. Flood damages have been recorded every year in Korea, and those damages have exceeded an annual average of 2 trillion won since 2000. Casualties and property damages due to flooding have increased as hydrometeorological extremes intensify under climate change. Although the importance of the flooding problem is emerging rapidly, studies on developing an integrated management system for reducing floods are insufficient in Korea. In addition, it is difficult to reduce floods effectively without an integrated operation system that takes into account the sewage pipe network configuration together with the river level. Since floods cause increasing damage to infrastructure as well as to life and property, structural and non-structural measures should be urgently established. Therefore, in this study, we developed an integrated flood analysis system that systematizes techniques for quantifying flood risk and forecasting floods, supporting synthetic decision-making through real-time monitoring and prediction of flash or short-term rainfall using radar and satellite information in Korea. Keywords: Flooding, Integrated flood analysis system, Rainfall forecasting, Korea Acknowledgments This work was carried out with the support of the "Cooperative Research Program for Agriculture Science & Technology Development (Project No. PJ011686022015)", Rural Development Administration, Republic of Korea.
Wobus, C. W.; Gutmann, E. D.; Jones, R.; Rissing, M.; Mizukami, N.; Lorie, M.; Mahoney, H.; Wood, A.; Mills, D.; Martinich, J.
2017-12-01
A growing body of recent work suggests that the extreme weather events that drive inland flooding are likely to increase in frequency and magnitude in a warming climate, thus increasing monetary damages from flooding in the future. We use hydrologic projections based on the Coupled Model Intercomparison Project Phase 5 (CMIP5) to estimate changes in the frequency of modeled 1% annual exceedance probability flood events at 57,116 locations across the contiguous United States (CONUS). We link these flood projections to a database of assets within mapped flood hazard zones to model changes in inland flooding damages throughout the CONUS over the remainder of the 21st century, under two greenhouse gas (GHG) emissions scenarios. Our model generates early 21st century flood damages that reasonably approximate the range of historical observations, and trajectories of future damages that vary substantially depending on the GHG emissions pathway. The difference in modeled flood damages between higher and lower emissions pathways approaches $4 billion per year by 2100 (in undiscounted 2014 dollars), suggesting that aggressive GHG emissions reductions could generate significant monetary benefits over the long-term in terms of reduced flood risk. Although the downscaled hydrologic data we used have been applied to flood impacts studies elsewhere, this research expands on earlier work to quantify changes in flood risk by linking future flood exposure to assets and damages at a national scale. Our approach relies on a series of simplifications that could ultimately affect damage estimates (e.g., use of statistical downscaling, reliance on a nationwide hydrologic model, and linking damage estimates only to 1% AEP floods). Although future work is needed to test the sensitivity of our results to these methodological choices, our results suggest that monetary damages from inland flooding could be substantially reduced through more aggressive GHG mitigation policies.
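The damage-linkage step described above can be illustrated with a toy expected-annual-damage (EAD) calculation. The AEP/damage table below is entirely hypothetical, and trapezoidal integration of the damage-exceedance curve is one common convention, not necessarily the study's method:

```python
import numpy as np

# Illustrative damage-exceedance table for one location (values hypothetical):
# damage (in $M) if the flood with the given AEP occurs, AEP sorted ascending.
aep    = np.array([0.002, 0.01, 0.02, 0.04, 0.10, 0.20, 0.50])
damage = np.array([60.0, 35.0, 22.0, 12.0, 4.0, 1.0, 0.0])

def expected_annual_damage(aep, damage):
    # EAD = integral of D(p) dp over exceedance probability p,
    # approximated with the trapezoidal rule over the tabulated points
    return float(np.sum(0.5 * (damage[1:] + damage[:-1]) * np.diff(aep)))

ead_today = expected_annual_damage(aep, damage)
# A hypothetical warmer climate making every event 50% more frequent:
ead_future = expected_annual_damage(np.minimum(1.0, 1.5 * aep), damage)
```

Shifting the AEPs while holding the damage function fixed is exactly how changed flood frequency propagates into changed monetary damages in this style of analysis.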
Flash floods in Europe: state of the art and research perspectives
Gaume, Eric
2014-05-01
Flash floods, i.e. floods induced by severe rainfall events generally affecting watersheds of limited area, are the most frequent, destructive and deadly kind of natural hazard in Europe and throughout the world. Flash floods are especially intense across the Mediterranean zone, where rainfall accumulations exceeding 500 mm within a few hours may be observed. Despite this state of affairs, the study of extremes in hydrology essentially went unexplored until the recent past, with the exception of some rare factual reports on individual flood events, with the sporadic inclusion of isolated estimated peak discharges. Floods of extraordinary magnitude are in fact hardly ever captured by existing standard measurement networks, either because they are too heavily concentrated in space and time or because their discharges greatly exceed the design and calibration ranges of the measurement devices employed (stream gauges). This situation has gradually evolved over the last decade for two main reasons. First, the expansion and densification of weather radar networks, combined with improved radar quantitative precipitation estimates, now provide ready access to rainfall measurements at spatial and temporal scales that, while not perfectly accurate, are compatible with the study of extreme events. Heavy rainfall events no longer fail to be recorded by existing rain gauge and radar networks. Second, pioneering research efforts on extreme floods, based on precise post-flood surveys, have helped overcome the limitations imposed by a small base of directly measured data. This activity has already yielded significant progress in expanding the knowledge and understanding of extreme flash floods. This presentation will provide a review of recent research progress in the area of flash flood studies, mainly based on the outcomes of the European research projects FLOODsite, HYDRATE and Hymex. It will show how intensive collation of field data helped better define
Phillips, R.; Samadi, S. Z.; Meadows, M.
2017-12-01
The potential for the intensity of extreme rainfall to increase under climate change nonstationarity has emerged as a prevailing issue for the design of engineering infrastructure, underscoring the need to better characterize the statistical assumptions underlying hydrological frequency analysis. The focus of this study is on developing probabilistic rainfall intensity-duration-frequency (IDF) curves for the major catchments in South Carolina (SC), where the October 02-05, 2015 floods caused infrastructure damage and the loss of several lives. Several probability distributions, including the Weibull, generalized extreme value (GEV), generalized Pareto (GP), Gumbel, Fréchet, normal, and log-normal functions, were fitted to the short-duration (i.e., 24-hr) intense rainfall. The analysis suggests that the GEV distribution provided the most adequate fit to the rainfall records. Rainfall frequency analysis indicated return periods above 500 years for urban drainage systems, with a maximum return level of approximately 2,744 years, whereas rainfall magnitude was much lower in rural catchments. Further, the return levels (i.e., 2, 20, 50, 100, 500, and 1000 years) computed by the Monte Carlo method were consistently higher than the NOAA design IDF curves. Given the potential increase in the magnitude of intense rainfall, current IDF curves can substantially underestimate the frequency of extremes, indicating the susceptibility of the storm drainage and flood control structures in SC that were designed under assumptions of a stationary climate.
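A minimal version of the GEV fit and return-level computation can be written with `scipy.stats.genextreme`; the synthetic annual maxima and their parameters below are illustrative, not the South Carolina rainfall records:

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(42)
# Synthetic annual-maximum 24-hr rainfall (mm); parameters are invented.
true_c, true_loc, true_scale = -0.1, 100.0, 25.0
annual_max = genextreme.rvs(true_c, loc=true_loc, scale=true_scale,
                            size=80, random_state=rng)

# Fit the GEV by maximum likelihood, then read T-year return levels off
# the fitted quantile function: level(T) = F^-1(1 - 1/T)
c, loc, scale = genextreme.fit(annual_max)
for T in (2, 20, 50, 100, 500):
    level = genextreme.ppf(1 - 1 / T, c, loc, scale)
    print(f"{T:>4}-yr return level: {level:6.1f} mm")
```

Note that scipy's shape parameter `c` is the negative of the Coles-convention GEV shape ξ, so heavy-tailed rainfall corresponds to negative `c` here.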
Determination of Flood Reduction Alternatives for Climate Change Adaptation in Gyeongancheon basin
Han, D.; Joo, H. J.; Jung, J.; Kim, H. S.
2017-12-01
Recently, the frequency of extreme rainfall events has increased due to climate change, and the impermeable area in urban watersheds has also increased due to rapid urbanization. The flood risk is therefore increasing, and countermeasures for flood damage reduction must be prepared. To determine appropriate measures or alternatives, this study first estimated the frequency-based rainfall considering climate change for each target period (reference: 1971-2010; target period I: 2011-2040; target period II: 2041-2070; target period III: 2071-2100). The future flood discharge was then computed using the HEC-HMS model. We set five sizes of drainage pumps and detention ponds, respectively, as the flood reduction alternatives and obtained the flood level in the river for each alternative with the HEC-RAS model. The flood inundation map was constructed using topographical data and the flood water level in the river, and an economic analysis was conducted for the flood damage reduction studies using the Multi Dimensional Flood Damage Analysis (MD-FDA) tool. The effectiveness analysis of the flood reduction alternatives showed that the flood level was reduced by 0.06 m up to 0.44 m by the drainage pumps and by 0.01 m up to 1.86 m by the detention ponds. The flooded area shrank by 0.3% up to 32.64%, and the inundation depth also dropped. Comparing the benefit/cost ratios from the economic analysis, detention pond E in target period I and pump D in periods II and III were considered the appropriate alternatives for flood damage reduction under climate change. Acknowledgements This research was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Science, ICT & Future Planning (2017R1A2B3005695).
Directory of Open Access Journals (Sweden)
Bambang Dwi Dasanto
2014-06-01
From 1931 to 2010 the flood frequency in the Upper Citarum Watershed increased sharply, indicating a decline in watershed quality. With climate change, the flood risk may worsen. This study aims to determine the effective rainfall that causes flooding and to evaluate the impact of future rainfall changes on flood-prone areas. The effective rainfall that contributes to direct runoff (DRO) and leads to flooding was determined using a regression equation relating the DRO to the cumulative rainfall over a number of consecutive days. A map of flood-prone areas was developed using GIS techniques. Results showed that the effective rainfall causing flooding was the rainfall accumulated over the four consecutive days before the occurrence of the DRO peak. The agreement between estimated and actual flood maps was about 76.9%. According to historical rainfall, the flood-prone areas spread to the right and left of the upstream Citarum River. If this area experiences climate change, the flood frequency and extents will increase. This study can only identify locations and the possibility of flood occurrence; it cannot precisely delineate the extent of flood inundation. Nevertheless, this simple approach evaluates flood frequency and intensity quite well.
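The windowing idea (relate direct runoff to the cumulative rainfall of n consecutive days and pick the best n) can be sketched with synthetic data; the 4-day response, the gamma rainfall model, and every parameter below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(11)
days = 365
rain = rng.gamma(shape=0.4, scale=12.0, size=days)  # synthetic daily rain (mm)

# Synthetic "observed" direct runoff that responds to the previous 4 days of
# rain (illustrative; real DRO comes from hydrograph separation).
true_window = 4
dro = 0.3 * np.convolve(rain, np.ones(true_window))[:days] \
      + rng.normal(0, 2.0, days)

def best_window(rain, dro, max_days=10):
    """Pick the number of consecutive rainfall days whose cumulative total
    correlates best with direct runoff, as in the regression approach."""
    best_n, best_r = 1, -1.0
    for n in range(1, max_days + 1):
        cum = np.convolve(rain, np.ones(n))[:len(rain)]  # n-day running sum
        r = np.corrcoef(cum, dro)[0, 1]
        if r > best_r:
            best_n, best_r = n, r
    return best_n, best_r

n, r = best_window(rain, dro)
```

With real data, the chosen window (four days in the study) then defines the effective rainfall fed into the flood-prone-area mapping.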
Shima, Tomoyuki; Tomeba, Hiromichi; Adachi, Fumiyuki
Orthogonal multi-carrier direct sequence code division multiple access (orthogonal MC DS-CDMA) is a combination of time-domain spreading and orthogonal frequency division multiplexing (OFDM). In orthogonal MC DS-CDMA, the frequency diversity gain can be obtained by applying frequency-domain equalization (FDE) based on minimum mean square error (MMSE) criterion to a block of OFDM symbols and can improve the bit error rate (BER) performance in a severe frequency-selective fading channel. FDE requires an accurate estimate of the channel gain. The channel gain can be estimated by removing the pilot modulation in the frequency domain. In this paper, we propose a pilot-assisted channel estimation suitable for orthogonal MC DS-CDMA with FDE and evaluate, by computer simulation, the BER performance in a frequency-selective Rayleigh fading channel.
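The pilot-based estimate ("removing the pilot modulation in the frequency domain") and the MMSE weight it feeds can be sketched as follows; the block size, number of channel taps, and noise level are illustrative choices, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)
Nc = 64  # subcarriers in the OFDM block

# Frequency-selective channel: a few random taps, frequency response via FFT
h = (rng.standard_normal(4) + 1j * rng.standard_normal(4)) / np.sqrt(8)
H = np.fft.fft(h, Nc)

# Known unit-amplitude QPSK pilot block sent through the channel plus noise
pilot = np.exp(1j * 0.5 * np.pi * rng.integers(0, 4, Nc))
noise_var = 1e-3
rx = H * pilot + np.sqrt(noise_var / 2) * (rng.standard_normal(Nc)
                                           + 1j * rng.standard_normal(Nc))

# Channel estimate: remove the pilot modulation in the frequency domain
H_est = rx / pilot

# MMSE-FDE weights built from the estimated channel gains
w = H_est.conj() / (np.abs(H_est) ** 2 + noise_var)
```

Because the pilot has unit amplitude, dividing by it leaves the channel gain plus rotated noise, which is why pilot-assisted estimation works well at reasonable SNR.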
Using Internet search engines to estimate word frequency.
Blair, Irene V; Urland, Geoffrey R; Ma, Jennifer E
2002-05-01
The present research investigated Internet search engines as a rapid, cost-effective alternative for estimating word frequencies. Frequency estimates for 382 words were obtained and compared across four methods: (1) Internet search engines, (2) the Kucera and Francis (1967) analysis of a traditional linguistic corpus, (3) the CELEX English linguistic database (Baayen, Piepenbrock, & Gulikers, 1995), and (4) participant ratings of familiarity. The results showed that Internet search engines produced frequency estimates that were highly consistent with those reported by Kucera and Francis and those calculated from CELEX, highly consistent across search engines, and very reliable over a 6-month period of time. Additional results suggested that Internet search engines are an excellent option when traditional word frequency analyses do not contain the necessary data (e.g., estimates for forenames and slang). In contrast, participants' familiarity judgments did not correspond well with the more objective estimates of word frequency. Researchers are advised to use search engines with large databases (e.g., AltaVista) to ensure the greatest representativeness of the frequency estimates.
Estimating design flood and HEC-RAS modelling approach for flood analysis in Bojonegoro city
Prastica, R. M. S.; Maitri, C.; Hermawan, A.; Nugroho, P. C.; Sutjiningsih, D.; Anggraheni, E.
2018-03-01
Bojonegoro faces floods every year, with little advanced prevention in place. The city's development cannot reach its potential because the floods cause material losses affecting every sector in Bojonegoro: education, politics, economy, social affairs, and infrastructure development. This research aims to analyse and confirm that river capacity is highly likely to be the main factor behind flooding in Bojonegoro. The flood discharge analysis uses the Nakayasu synthetic unit hydrograph for return periods of 5, 10, 25, 50, and 100 years. These discharges are compared to the maximum capacity of the downstream reach of the Bengawan Solo River in Bojonegoro. According to the analysis, the Bengawan Solo River in Bojonegoro is not able to convey the flood discharges. The HEC-RAS analysis leads to the same conclusion: the flood water level exceeds the full-bank capacity elevation of the river. To conclude, the main factor the government should address to solve the flood problem is river capacity.
Comparison of Flood Frequency Analysis Methods for Ungauged Catchments in France
Directory of Open Access Journals (Sweden)
Jean Odry
2017-09-01
The objective of flood frequency analysis (FFA) is to associate flood intensity with a probability of exceedance. Many methods are currently employed for this, ranging from statistical distribution fitting to simulation approaches. In many cases the site of interest is actually ungauged, and a regionalisation scheme has to be associated with the FFA method, leading to a multiplication of the number of possible methods available. This paper presents the results of a wide-ranging comparison of FFA methods from the statistical and simulation families, associated with different regionalisation schemes based on regression or on spatial or physical proximity. The methods are applied to a set of 1535 French catchments, and a k-fold cross-validation procedure is used to represent the ungauged configuration. The results suggest that FFA methods from the statistical family rely largely on the regionalisation step, whereas the simulation-based method is more stable with respect to regionalisation. This conclusion emphasises the difficulty of the regionalisation process. The results also contrast depending on the type of climate: the Mediterranean catchments tend to aggravate the differences between the methods.
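The k-fold "ungauged" validation can be sketched with a toy regional regression (a log-log power law of a flood quantile on catchment area); the data and scatter are synthetic, and the real study compares far richer regionalisation schemes:

```python
import numpy as np

rng = np.random.default_rng(2)
n_catch = 200
# Synthetic catchments: the 100-yr flood scales with area (power law + scatter)
area = rng.lognormal(4.0, 1.0, n_catch)                  # km^2
q100 = 2.5 * area ** 0.8 * rng.lognormal(0.0, 0.3, n_catch)

def kfold_regional_error(area, q100, k=5):
    """Leave-fold-out validation of a log-log regression Q100 = a * A^b,
    mimicking the ungauged configuration: each fold is predicted from a
    model fitted to the remaining catchments only."""
    idx = rng.permutation(len(area))
    folds = np.array_split(idx, k)
    errs = []
    for i in range(k):
        test = folds[i]
        train = np.concatenate([folds[j] for j in range(k) if j != i])
        b, log_a = np.polyfit(np.log(area[train]), np.log(q100[train]), 1)
        pred = np.exp(log_a) * area[test] ** b
        errs.append(np.mean(np.abs(np.log(pred / q100[test]))))
    return float(np.mean(errs))

err = kfold_regional_error(area, q100)
```

Holding each fold out entirely is what makes the error estimate representative of a genuinely ungauged site rather than of refitting at a gauged one.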
Li, Ruixiao; Li, Kun; Zhao, Changming
2018-01-01
Coherent dual-frequency Lidar (CDFL) is a recent development of Lidar that uses a dual-frequency laser to measure range and velocity with high precision while dramatically reducing the influence of atmospheric interference. Based on the nature of CDFL signals, we propose to apply the multiple signal classification (MUSIC) algorithm in place of the fast Fourier transform (FFT) to estimate the phase differences in dual-frequency Lidar. In the presence of Gaussian white noise, the simulation results show that the signal peaks are more evident when using the MUSIC algorithm instead of the FFT under low signal-to-noise ratio (SNR) conditions, which helps to improve the precision of range and velocity detection, especially for long-distance measurement systems.
Dahl, Kristina A; Fitzpatrick, Melanie F; Spanger-Siegfried, Erika
2017-01-01
Tidal flooding is among the most tangible present-day effects of global sea level rise. Here, we utilize a set of NOAA tide gauges along the U.S. East and Gulf Coasts to evaluate the potential impact of future sea level rise on the frequency and severity of tidal flooding. Using the 2001-2015 time period as a baseline, we first determine how often tidal flooding currently occurs. Using localized sea level rise projections based on the Intermediate-Low, Intermediate-High, and Highest projections from the U.S. National Climate Assessment, we then determine the frequency and extent of such flooding at these locations for two near-term time horizons: 2030 and 2045. We show that increases in tidal flooding will be substantial and nearly universal at the 52 locations included in our analysis. Long before areas are permanently inundated, the steady creep of sea level rise will force many communities to grapple with chronic high tide flooding in the next 15 to 30 years.
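The counting exercise behind such projections can be sketched with a synthetic hourly record; the tide amplitude, weather noise, flood threshold, and sea level rise offsets below are all made-up illustrative numbers, not NOAA gauge data or NCA projections:

```python
import numpy as np

rng = np.random.default_rng(7)
hours = 24 * 365
t = np.arange(hours)

# Toy hourly water level (m above a local datum): semidiurnal tide (12.42 h
# period) plus weather-driven noise. All constants are illustrative.
level = 0.5 * np.cos(2 * np.pi * t / 12.42) + 0.15 * rng.standard_normal(hours)
threshold = 0.8  # hypothetical minor-flood threshold

def flood_days(level, slr):
    """Days per year with at least one hourly exceedance of the threshold,
    after adding a uniform sea level rise offset slr (m)."""
    exceed = (level + slr) > threshold
    return int(np.sum(exceed.reshape(-1, 24).any(axis=1)))

baseline = flood_days(level, 0.0)
by_2030 = flood_days(level, 0.15)  # hypothetical localized SLR by 2030
by_2045 = flood_days(level, 0.30)  # hypothetical localized SLR by 2045
```

Because adding a larger offset can only grow the set of exceedance hours, flood-day counts are monotone in sea level rise, which is why even modest offsets produce the sharp increases the abstract describes.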
Saleh, F.; Garambois, P. A.; Biancamaria, S.
2017-12-01
Floods are considered the major natural threats to human societies across all continents. Consequences of floods in highly populated areas are more dramatic with losses of human lives and substantial property damage. This risk is projected to increase with the effects of climate change, particularly sea-level rise, increasing storm frequencies and intensities and increasing population and economic assets in such urban watersheds. Despite the advances in computational resources and modeling techniques, significant gaps exist in predicting complex processes and accurately representing the initial state of the system. Improving flood prediction models and data assimilation chains through satellite has become an absolute priority to produce accurate flood forecasts with sufficient lead times. The overarching goal of this work is to assess the benefits of the Surface Water Ocean Topography SWOT satellite data from a flood prediction perspective. The near real time methodology is based on combining satellite data from a simulator that mimics the future SWOT data, numerical models, high resolution elevation data and real-time local measurement in the New York/New Jersey area.
Simplified approach for estimating large early release frequency
International Nuclear Information System (INIS)
Pratt, W.T.; Mubayi, V.; Nourbakhsh, H.; Brown, T.; Gregory, J.
1998-04-01
The US Nuclear Regulatory Commission (NRC) Policy Statement related to Probabilistic Risk Analysis (PRA) encourages greater use of PRA techniques to improve safety decision-making and enhance regulatory efficiency. One activity in response to this policy statement is the use of PRA in support of decisions related to modifying a plant's current licensing basis (CLB). Risk metrics such as core damage frequency (CDF) and Large Early Release Frequency (LERF) are recommended for use in making risk-informed regulatory decisions and also for establishing acceptance guidelines. This paper describes a simplified approach for estimating LERF, and changes in LERF resulting from changes to a plant's CLB.
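At its crudest, a simplified LERF estimate reduces to weighting accident-sequence (or plant damage state) frequencies by conditional large early release probabilities. The four entries below are invented placeholders to show the arithmetic, not values from any real PRA or from this paper:

```python
# Illustrative only: frequencies (per reactor-year) and conditional large
# early release probabilities (CLERPs) are made-up numbers.
sequences = {
    # plant damage state: (frequency, CLERP)
    "station_blackout":             (2.0e-6, 0.10),
    "small_LOCA":                   (5.0e-6, 0.02),
    "steam_generator_tube_rupture": (1.0e-6, 0.80),
    "transient":                    (1.0e-5, 0.01),
}

cdf = sum(f for f, _ in sequences.values())            # core damage frequency
lerf = sum(f * clerp for f, clerp in sequences.values())
```

A change to the licensing basis that shifts any frequency or CLERP changes LERF through exactly this sum, which is what makes the metric convenient for risk-informed decisions.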
Lopez, M.A.; Woodham, W.M.
1983-01-01
Hydrologic data collected on nine small urban watersheds in the Tampa Bay area of west-central Florida and a method for estimating peak discharges in the study area are described. The watersheds have mixed land use and range in size from 0.34 to 3.45 square miles. Watershed soils, land use, and storm-drainage system data are described. Urban development ranged from a sparsely populated area with open-ditch storm sewers and 19% impervious area to a completely sewered watershed with 61% impervious cover. The U.S. Geological Survey natural-basin and urban-watershed models were calibrated for the nine watersheds using 5-minute interval rainfall data from the Tampa, Florida, National Weather Service rain gage to simulate annual peak discharge for the period 1906-52. A log-Pearson Type III frequency analysis of the simulated annual maximum discharge was used to determine the 2-, 5-, 10-, 25-, 50-, and 100-year flood discharges for each watershed. Flood discharges were related in a multiple-linear regression to drainage area, channel slope, detention storage area, and an urban-development factor determined by the extent of curb and gutter street drainage and storm-sewer system. The average standard error for the regional relations ranged from + or - 32 to + or - 42%. (USGS)
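The log-Pearson Type III step used above can be sketched with the standard frequency-factor form log10 Q_T = mean + K·std, where K depends on the skew of the log-transformed peaks and the return period (Kite's polynomial approximation is used below). The synthetic peaks are illustrative, not the Tampa Bay simulations:

```python
import numpy as np
from scipy.stats import norm, skew

def lp3_quantile(peaks, T):
    """T-year flood from annual peaks via log-Pearson Type III with the
    frequency-factor method: log10 Q_T = m + K(g, T) * s."""
    y = np.log10(peaks)
    m, s, g = y.mean(), y.std(ddof=1), skew(y, bias=False)
    z = norm.ppf(1 - 1 / T)          # standard normal variate
    k = g / 6.0
    # Kite's approximation to the Pearson III frequency factor
    K = (z + (z**2 - 1) * k + (z**3 - 6 * z) * k**2 / 3
         - (z**2 - 1) * k**3 + z * k**4 + k**5 / 3)
    return 10 ** (m + K * s)

rng = np.random.default_rng(3)
# Synthetic annual peak discharges (cfs); the lognormal model is illustrative.
peaks = rng.lognormal(mean=7.0, sigma=0.5, size=60)
q2 = lp3_quantile(peaks, 2)
q100 = lp3_quantile(peaks, 100)
```

In the study, quantiles computed this way for each watershed were then regressed on drainage area, slope, storage, and the urban-development factor.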
Accurate Frequency Estimation Based On Three-Parameter Sine-Fitting With Three FFT Samples
Directory of Open Access Journals (Sweden)
Liu Xin
2015-09-01
Full Text Available This paper presents a simple DFT-based golden section searching algorithm (DGSSA) for single-tone frequency estimation. Because of truncation and discreteness in signal samples, the Fast Fourier Transform (FFT) and Discrete Fourier Transform (DFT) inevitably introduce spectrum leakage and the fence effect, which lead to low estimation accuracy. This method can improve the estimation accuracy under conditions of a low signal-to-noise ratio (SNR) and a low resolution. The method first uses three FFT samples to determine the frequency searching scope; then, besides the frequency, the estimated values of amplitude, phase and DC component are obtained by minimizing the least squares (LS) fitting error of three-parameter sine fitting. By setting reasonable stop conditions or a number of iterations, an accurate frequency estimate can be obtained. The accuracy of this method, when applied to observed single-tone sinusoid samples corrupted by white Gaussian noise, is investigated against other methods with respect to the unbiased Cramer-Rao Lower Bound (CRLB). The simulation results show that the root mean square error (RMSE) of the frequency estimate follows the trend of the CRLB as the SNR increases, even for a small number of samples. The average RMSE of the frequency estimate is less than 1.5 times the CRLB with SNR = 20 dB and N = 512.
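The core idea, a coarse FFT estimate refined by a golden-section search over the three-parameter sine-fit residual, can be sketched as follows (a minimal illustration under the same signal model, not the authors' DGSSA code):

```python
import numpy as np

def three_param_fit(x, f, fs):
    """Least-squares fit of A*cos + B*sin + C at a candidate frequency f.
    Returns the squared residual (the golden-section search objective)."""
    n = np.arange(len(x))
    w = 2 * np.pi * f / fs
    M = np.column_stack([np.cos(w * n), np.sin(w * n), np.ones(len(x))])
    p, *_ = np.linalg.lstsq(M, x, rcond=None)
    r = x - M @ p
    return r @ r

def estimate_freq(x, fs, tol=1e-6):
    """Coarse FFT peak, then golden-section search on the sine-fit residual."""
    X = np.abs(np.fft.rfft(x))
    k = int(np.argmax(X[1:]) + 1)            # skip the DC bin
    df = fs / len(x)
    lo, hi = (k - 1) * df, (k + 1) * df      # searching scope: neighbouring bins
    g = (np.sqrt(5) - 1) / 2                 # golden ratio
    a, b = hi - g * (hi - lo), lo + g * (hi - lo)
    fa, fb = three_param_fit(x, a, fs), three_param_fit(x, b, fs)
    while hi - lo > tol:
        if fa < fb:                          # minimum lies in [lo, b]
            hi, b, fb = b, a, fa
            a = hi - g * (hi - lo)
            fa = three_param_fit(x, a, fs)
        else:                                # minimum lies in [a, hi]
            lo, a, fa = a, b, fb
            b = lo + g * (hi - lo)
            fb = three_param_fit(x, b, fs)
    return 0.5 * (lo + hi)

fs, n = 1000.0, np.arange(512)
sig = 1.2 * np.cos(2 * np.pi * 123.4 * n / fs + 0.7) \
      + 0.05 * np.random.default_rng(0).standard_normal(512)
print(f"estimated frequency: {estimate_freq(sig, fs):.4f} Hz")
```

The search interval spans the two FFT bins flanking the spectral peak, so the refinement cannot wander off to a different tone.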
Basnayake, S. B.; Jayasinghe, S.; Meechaiya, C.; Markert, K. N.; Lee, H.; Towashiraporn, P.; Anderson, E.; Okeowo, M. A.
2017-12-01
area. Model results (inundations) are compared with the estimates of water levels of Jason 2/3 measurements from two locations along the river. The results encourage us to use satellite-derived rainfall data over upstream areas to improve flood modeling, which contributes to improved flood early warning in Myanmar and other lower Mekong countries.
Wobus, Cameron; Gutmann, Ethan; Jones, Russell; Rissing, Matthew; Mizukami, Naoki; Lorie, Mark; Mahoney, Hardee; Wood, Andrew W.; Mills, David; Martinich, Jeremy
2017-12-01
A growing body of work suggests that the extreme weather events that drive inland flooding are likely to increase in frequency and magnitude in a warming climate, thus potentially increasing flood damages in the future. We use hydrologic projections based on the Coupled Model Intercomparison Project Phase 5 (CMIP5) to estimate changes in the frequency of modeled 1 % annual exceedance probability (1 % AEP, or 100-year) flood events at 57 116 stream reaches across the contiguous United States (CONUS). We link these flood projections to a database of assets within mapped flood hazard zones to model changes in inland flooding damages throughout the CONUS over the remainder of the 21st century. Our model generates early 21st century flood damages that reasonably approximate the range of historical observations and trajectories of future damages that vary substantially depending on the greenhouse gas (GHG) emissions pathway. The difference in modeled flood damages between higher and lower emissions pathways approaches USD 4 billion per year by 2100 (in undiscounted 2014 dollars), suggesting that aggressive GHG emissions reductions could generate significant monetary benefits over the long term in terms of reduced flood damages. Although the downscaled hydrologic data we used have been applied to flood impacts studies elsewhere, this research expands on earlier work to quantify changes in flood risk by linking future flood exposure to assets and damages on a national scale. Our approach relies on a series of simplifications that could ultimately affect damage estimates (e.g., use of statistical downscaling, reliance on a nationwide hydrologic model, and linking damage estimates only to 1 % AEP floods). Although future work is needed to test the sensitivity of our results to these methodological choices, our results indicate that monetary damages from inland flooding could be significantly reduced through substantial GHG mitigation.
Identification of flood-rich and flood-poor periods in flood series
Mediero, Luis; Santillán, David; Garrote, Luis
2015-04-01
Recently, a general concern about the non-stationarity of flood series has arisen, as changes in catchment response can be driven by several factors, such as climatic and land-use changes. Several studies to detect trends in flood series at either national or trans-national scales have been conducted. Trends are usually detected by the Mann-Kendall test. However, the results of this test depend on the starting and ending year of the series, and can therefore differ depending on the period considered. The results can be conditioned by flood-poor and flood-rich periods located at the beginning or end of the series. A methodology to identify statistically significant flood-rich and flood-poor periods is developed, based on the comparison between the expected sampling variability of floods when stationarity is assumed and the observed variability of floods in a given series. The methodology is applied to a set of long series of annual maximum floods, peaks over threshold and counts of annual occurrences in peaks over threshold series observed in Spain in the period 1942-2009. Mediero et al. (2014) found a general decreasing trend in flood series in some parts of Spain that could be caused by a flood-rich period observed in 1950-1970, placed at the beginning of the flood series. The results of this study support the findings of Mediero et al. (2014), as a flood-rich period in 1950-1970 was identified in most of the selected sites. References: Mediero, L., Santillán, D., Garrote, L., Granados, A. Detection and attribution of trends in magnitude, frequency and timing of floods in Spain, Journal of Hydrology, 517, 1072-1088, 2014.
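The Mann-Kendall test mentioned above can be sketched as follows (normal approximation without tie correction; the annual-maximum series is illustrative, not the Spanish record):

```python
import numpy as np
from scipy import stats

def mann_kendall(x):
    """Mann-Kendall trend test (normal approximation, no tie correction).
    Returns the S statistic, the Z score and a two-sided p-value."""
    x = np.asarray(x, float)
    n = x.size
    # S = sum over all pairs i < j of sign(x[j] - x[i]).
    s = np.triu(np.sign(x[None, :] - x[:, None]), k=1).sum()
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0  # continuity correction
    return s, z, 2 * stats.norm.sf(abs(z))

# Hypothetical annual-maximum series with a flood-rich start and a later decline.
x = [30.0, 28.0, 35.0, 26.0, 24.0, 22.0, 25.0, 20.0, 21.0, 18.0, 19.0, 15.0]
s, z, p = mann_kendall(x)
print(f"S = {s:.0f}, Z = {z:.2f}, p = {p:.4f}")
```

Rerunning the test on subperiods of the same series (e.g. dropping the flood-rich first years) changes S and p, which is exactly the start/end-year sensitivity the abstract describes.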
Probabilistic Flood Mapping using Volunteered Geographical Information
Rivera, S. J.; Girons Lopez, M.; Seibert, J.; Minsker, B. S.
2016-12-01
Flood extent maps are widely used by decision makers and first responders to provide critical information that prevents economic impacts and the loss of human lives. These maps are usually obtained from sensory data and/or hydrologic models, which often have limited coverage in space and time. Recent developments in social media and communication technology have created a wealth of near-real-time, user-generated content during flood events in many urban areas, such as flooded locations, pictures of flooding extent and height, etc. These data could improve decision-making and response operations as events unfold. However, the integration of these data sources has been limited due to the need for methods that can extract and translate the data into useful information for decision-making. This study presents an approach that uses volunteer geographic information (VGI) and non-traditional data sources (i.e., Twitter, Flickr, YouTube, and 911 and 311 calls) to generate/update the flood extent maps in areas where no models and/or gauge data are operational. The approach combines Web-crawling and computer vision techniques to gather information about the location, extent, and water height of the flood from unstructured textual data, images, and videos. These estimates are then used to provide an updated flood extent map for areas surrounding the geo-coordinate of the VGI through the application of a Hydro Growing Region Algorithm (HGRA). HGRA combines hydrologic and image segmentation concepts to estimate a probabilistic flooding extent along the corresponding creeks. Results obtained for a case study in Austin, TX (i.e., 2015 Memorial Day flood) were comparable to those obtained by a calibrated hydrologic model and had good spatial correlation with flooding extents estimated by the Federal Emergency Management Agency (FEMA).
Evaluation of internal flooding in a BWR
International Nuclear Information System (INIS)
Shiu, K.; Papazoglou, I.A.; Sun, Y.H.; Anavim, E.; Ilberg, D.
1985-01-01
Flooding inside a nuclear power station is capable of concurrently disabling redundant safety systems. This paper presents the results of a recent review study performed on internally-generated floods inside a boiling water reactor (BWR) reactor building. The study evaluated the flood initiator frequency due to either maintenance or ruptures using Markovian models. A time phased event tree approach was adopted to quantify the core damage frequency based on the flood initiator frequency. It is found in the study that the contribution to the total core damage due to internal flooding events is not insignificant and is comparable to other transient contributors. The findings also indicate that the operator plays an important role in the prevention as well as the mitigation of a flooding event
Flood Risk Assessment Based On Security Deficit Analysis
Beck, J.; Metzger, R.; Hingray, B.; Musy, A.
Risk is a human perception: a given risk may be considered as acceptable or unacceptable depending on the group that has to face that risk. Flood risk analysis often estimates economic losses from damages, but neglects the question of acceptable/unacceptable risk. With input from land use managers, politicians and other stakeholders, risk assessment based on security deficit analysis determines objects with unacceptable risk and their degree of security deficit. Such a risk assessment methodology, initially developed by the Swiss federal authorities, is illustrated by its application on a reach of the Alzette River (Luxembourg) in the framework of the IRMA-SPONGE FRHYMAP project. Flood risk assessment always involves a flood hazard analysis, an exposed object vulnerability analysis, and an analysis combining the results of these two previous analyses. The flood hazard analysis was done with the quasi-2D hydraulic model FldPln to produce flood intensity maps. Flood intensity was determined by the water height and velocity. Object data for the vulnerability analysis, provided by the Luxembourg government, were classified according to their potential damage. Potential damage is expressed in terms of direct, human life and secondary losses. A thematic map was produced to show the object classification. Protection goals were then attributed to the object classes. Protection goals are assigned in terms of an acceptable flood intensity for a certain flood frequency. This is where input from land use managers and politicians comes into play. The perception of risk in the region or country influences the protection goal assignment. Protection goals as used in Switzerland were used in this project. Thematic maps showing the protection goals of each object in the case study area for a given flood frequency were produced. Comparison between an object's protection goal and the intensity of the flood that touched the object determines the acceptability of the risk and the
Future flood risk estimates along the river Rhine
te Linde, A.H.; Bubeck, P.; Dekkers, J.E.C.; de Moel, H.; Aerts, J.C.J.H.
2011-01-01
In Europe, water management is moving from flood defence to a risk management approach, which takes both the probability and the potential consequences of flooding into account. It is expected that climate change and socio-economic development will lead to an increase in flood risk in the Rhine
A Kalman-based Fundamental Frequency Estimation Algorithm
DEFF Research Database (Denmark)
Shi, Liming; Nielsen, Jesper Kjær; Jensen, Jesper Rindom
2017-01-01
Fundamental frequency estimation is an important task in speech and audio analysis. Harmonic model-based methods typically have superior estimation accuracy. However, such methods usually assume that the fundamental frequency and amplitudes are stationary over a short time frame. In this pape...
A statistical approach to evaluate flood risk at the regional level: an application to Italy
Rossi, Mauro; Marchesini, Ivan; Salvati, Paola; Donnini, Marco; Guzzetti, Fausto; Sterlacchini, Simone; Zazzeri, Marco; Bonazzi, Alessandro; Carlesi, Andrea
2016-04-01
Floods are frequent and widespread in Italy, causing every year multiple fatalities and extensive damages to public and private structures. A pre-requisite for the development of mitigation schemes, including financial instruments such as insurance, is the ability to quantify their costs starting from the estimation of the underlying flood hazard. However, comprehensive and coherent information on flood prone areas, and estimates on the frequency and intensity of flood events, are not often available at scales appropriate for risk pooling and diversification. In Italy, River Basins Hydrogeological Plans (PAI), prepared by basin administrations, are the basic descriptive, regulatory, technical and operational tools for environmental planning in flood prone areas. Nevertheless, such plans do not cover the entire Italian territory, having significant gaps along the minor hydrographic network and in ungauged basins. Several process-based modelling approaches have been used by different basin administrations for the flood hazard assessment, resulting in an inhomogeneous hazard zonation of the territory. As a result, flood hazard assessments and expected damage estimations across the different Italian basin administrations are not always coherent. To overcome these limitations, we propose a simplified multivariate statistical approach for the regional flood hazard zonation coupled with a flood impact model. This modelling approach has been applied in different Italian basin administrations, allowing a preliminary but coherent and comparable estimation of the flood hazard and the relative impact. Model performances are evaluated comparing the predicted flood prone areas with the corresponding PAI zonation. The proposed approach will provide standardized information (following the EU Floods Directive specifications) on flood risk at a regional level which can in turn be more readily applied to assess flood economic impacts. Furthermore, in the assumption of an appropriate
Stephenson, V.; D'Ayala, D.
2013-10-01
The recent increase in frequency and severity of flooding in the UK has led to a shift in the perception of risk associated with flood hazards. This has extended to the conservation community, and the risks posed to historic structures that suffer from flooding are particularly concerning for those charged with preserving and maintaining such buildings. In order to fully appraise the risks in a manner appropriate to the complex issue of preservation, a new methodology is proposed that studies the nature of vulnerability of such structures, and places it in the context of risk assessment, accounting for the vulnerable object and the subsequent exposure of that object to flood hazards. The testing of the methodology is carried out using three urban case studies and the results of the survey analysis provide key findings and guidance on the development of fragility curves for historic structures exposed to flooding. This occurs through appraisal of key vulnerability indicators related to building form, structural and fabric integrity, and preservation of architectural and archaeological values. This in turn facilitates the production of strategies for mitigating and managing the losses threatened by such extreme climate events.
Le Bihan, Guillaume; Payrastre, Olivier; Gaume, Eric; Pons, Frederic; Moncoulon, David
2016-04-01
Hydrometeorological forecasting is an essential component of real-time flood management. The information it provides is of great help for crisis managers to anticipate the inundations and the associated risks. In the particular case of flash-floods, which may affect a large number of small watersheds spread over the territory (up to 300 000 km of waterways considering a drained area of 5 km² minimum in France), appropriate flood forecasting systems are still under development. In France, highly distributed hydrological models have been implemented, enabling a real-time assessment of the potential intensity of flash-floods from the records of weather radars: AIGA-hydro system (Lavabre et al., 2005; Javelle et al., 2014), PreDiFlood project (Naulin et al., 2013). The approach presented here aims to go one step further by offering a direct assessment of the potential impacts of the simulated floods on inhabited areas. This approach is based on an a priori analysis of the study area in order (1) to evaluate with a simplified hydraulic approach (DTM treatment) the potentially flooded areas for different discharge levels, and (2) to identify the associated buildings and/or population at risk from geographic databases. This preliminary analysis makes it possible to build an impact model (discharge-impact curve) for each river reach, which is then used to directly estimate the potentially affected assets based on a distributed rainfall runoff model. The overall principle of this approach was already presented at the 8th Hymex workshop. The presentation will therefore focus on the first validation results in terms of (1) accuracy of flooded areas simulated from DTM treatments, and (2) relevance of estimated impacts. The inundated areas simulated were compared to the European Directive cartography results (where available), showing an overall good correspondence in a large majority of cases, but also very significant errors for approximately 10% of the river reaches
Floods and food security: A method to estimate the effect of inundation on crops availability
Pacetti, Tommaso; Caporali, Enrica; Rulli, Maria Cristina
2017-12-01
The inner connections between floods and food security are extremely relevant, especially in developing countries where food availability can be highly jeopardized by extreme events that damage the primary access to food, i.e. agriculture. A method for the evaluation of the effects of floods on food supply, consisting of the integration of remote sensing data, agricultural statistics and water footprint databases, is proposed and applied to two different case studies. Based on the existing literature related to extreme floods, the events in Bangladesh (2007) and in Pakistan (2010) have been selected as exemplary case studies. Results show that the use of remote sensing data combined with other sources of onsite information is particularly useful to assess the effects of flood events on food availability. The damages caused by floods on agricultural areas are estimated in terms of crop losses and then converted into lost calories and water footprint as complementary indicators. The method is fully repeatable: the remote sensing data sources are valid worldwide, whereas the land use and crop characteristics data are strongly site specific and need to be carefully evaluated. A sensitivity analysis has been carried out on the critical water depth for crops in Bangladesh, varying the assumed level by ±20%. The results show a difference of 12% in the estimated energy content losses, underlining the importance of an accurate data choice.
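The conversion of crop losses into lost calories and water footprint can be illustrated with the arithmetic below; every figure is a hypothetical placeholder, not the paper's data:

```python
# All figures are hypothetical placeholders, not the paper's data.
flooded_rice_ha = 120_000      # inundated rice area (ha)
yield_t_per_ha = 2.7           # average paddy yield (t/ha)
kcal_per_kg = 1300             # energy content of paddy rice (kcal/kg)
wf_m3_per_t = 1670             # water footprint of rice (m^3/t)

lost_crop_t = flooded_rice_ha * yield_t_per_ha      # lost production (t)
lost_kcal = lost_crop_t * 1000 * kcal_per_kg        # lost calories
lost_wf_m3 = lost_crop_t * wf_m3_per_t              # lost embedded water (m^3)

# Sensitivity: a +/-20% shift in the critical water depth scales the flooded area.
for factor in (0.8, 1.0, 1.2):
    print(f"area x {factor}: {lost_kcal * factor / 1e12:.2f} x 1e12 kcal lost")
```

Because both indicators are linear in the flooded area, the ±20% sensitivity propagates proportionally, which is the kind of dependence the abstract's sensitivity analysis probes.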
Probabilistic, Multivariable Flood Loss Modeling on the Mesoscale with BT-FLEMO.
Kreibich, Heidi; Botto, Anna; Merz, Bruno; Schröter, Kai
2017-04-01
Flood loss modeling is an important component for risk analyses and decision support in flood risk management. Commonly, flood loss models describe complex damaging processes by simple, deterministic approaches like depth-damage functions and are associated with large uncertainty. To improve flood loss estimation and to provide quantitative information about the uncertainty associated with loss modeling, a probabilistic, multivariable Bagging decision Tree Flood Loss Estimation MOdel (BT-FLEMO) for residential buildings was developed. The application of BT-FLEMO provides a probability distribution of estimated losses to residential buildings per municipality. BT-FLEMO was applied and validated at the mesoscale in 19 municipalities that were affected during the 2002 flood by the River Mulde in Saxony, Germany. Validation was undertaken on the one hand via a comparison with six deterministic loss models, including both depth-damage functions and multivariable models. On the other hand, the results were compared with official loss data. BT-FLEMO outperforms deterministic, univariable, and multivariable models with regard to model accuracy, although the prediction uncertainty remains high. An important advantage of BT-FLEMO is the quantification of prediction uncertainty. The probability distribution of loss estimates by BT-FLEMO well represents the variation range of loss estimates of the other models in the case study. © 2016 Society for Risk Analysis.
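The bagging idea behind BT-FLEMO, refitting a loss model on bootstrap resamples so the ensemble of predictions yields a probability distribution rather than a point estimate, can be sketched as follows. This sketch substitutes a one-variable square-root depth-damage function for the paper's multivariable decision trees, and all data are synthetic:

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic training data: water depth (m) vs. observed relative building loss.
depth = rng.uniform(0.1, 3.0, 200)
loss = np.clip(0.25 * np.sqrt(depth) + rng.normal(0, 0.05, 200), 0, 1)

def fit_sqrt_damage(d, l):
    """Least-squares coefficient of a simple sqrt depth-damage function."""
    return np.dot(np.sqrt(d), l) / np.dot(np.sqrt(d), np.sqrt(d))

# Bagging: refit on bootstrap resamples; the spread of the ensemble's
# predictions quantifies the prediction uncertainty.
n_boot = 500
query_depth = 1.5
preds = np.empty(n_boot)
for i in range(n_boot):
    idx = rng.integers(0, len(depth), len(depth))   # bootstrap resample
    c = fit_sqrt_damage(depth[idx], loss[idx])
    preds[i] = c * np.sqrt(query_depth)

lo, med, hi = np.percentile(preds, [5, 50, 95])
print(f"loss at {query_depth} m depth: median {med:.3f}, "
      f"90% interval [{lo:.3f}, {hi:.3f}]")
```

The percentile interval is the analogue of BT-FLEMO's per-municipality loss distribution, here collapsed to a single depth query.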
Directory of Open Access Journals (Sweden)
Rehan Balqis M.
2016-01-01
Full Text Available Current practice in flood frequency analysis assumes that the stochastic properties of extreme floods follow those of stationary conditions. As the influences of human intervention and anthropogenic climate change on hydrometeorological variables are becoming evident in some places, there have been suggestions that nonstationary statistics would better represent the stochastic properties of extreme floods. The probabilistic estimation of non-stationary models, however, is surrounded by uncertainty related to the scarcity of observations and modelling complexities, making future conditions difficult to project. In the face of an uncertain future and the subjectivity of model choices, this study attempts to demonstrate the practical implications of applying a nonstationary model and compares it with a stationary model in flood risk assessment. A fully integrated framework to simulate decision makers' behaviour in flood frequency analysis is thereby developed. The framework is applied to hypothetical flood risk management decisions and the outcomes are compared with those of known underlying future conditions. Uncertainty of the economic performance of the risk-based decisions is assessed through Monte Carlo simulations. Sensitivity of the results is also tested by varying the possible magnitude of future changes. The application provides quantitative and qualitative comparative results that satisfy a preliminary analysis of whether the nonstationary model complexity should be applied to improve the economic performance of decisions. Results obtained from the case study show that the relative differences of competing models for all considered possible future changes are small, suggesting that stationary assumptions are preferred to a shift to nonstationary statistics for practical application of flood risk management. Nevertheless, nonstationary assumption should also be considered during a planning stage in addition to stationary assumption
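A minimal Monte Carlo comparison of stationary versus nonstationary flood statistics can be sketched as follows; the GEV parameters, the location trend, and the linear damage model are illustrative assumptions, not the study's values:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

def annual_damage(peaks, threshold=100.0, rate=2.0):
    """Hypothetical damage model: losses grow linearly above a protection level."""
    return np.where(peaks > threshold, rate * (peaks - threshold), 0.0)

n_sim, horizon = 2000, 50
years = np.arange(horizon)
loc0, scale, shape = 80.0, 15.0, -0.1   # GEV params (scipy c = -xi), illustrative
trend = 0.3                              # assumed upward drift of the location

results = {}
for name, locs in (("stationary", np.full(horizon, loc0)),
                   ("nonstationary", loc0 + trend * years)):
    # Simulate annual-maximum peaks over the planning horizon.
    peaks = stats.genextreme.rvs(shape, loc=locs, scale=scale,
                                 size=(n_sim, horizon), random_state=rng)
    results[name] = annual_damage(peaks).mean()
    print(f"{name:14s} expected annual damage: {results[name]:8.2f}")
```

Comparing the two expected annual damages is the simplest version of the economic-performance comparison the framework automates; the study's conclusion is precisely that this gap can be small relative to the added model complexity.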
Directory of Open Access Journals (Sweden)
D. P. Viero
2018-01-01
Full Text Available Citizen science and crowdsourcing are gaining increasing attention among hydrologists. In a recent contribution, Mazzoleni et al. (2017) investigated the integration of crowdsourced data (CSD) into hydrological models to improve the accuracy of real-time flood forecasts. The authors used synthetic CSD (i.e. not actually measured), because real CSD were not available at the time of the study. In their work, which is a proof-of-concept study, Mazzoleni et al. (2017) showed that assimilation of CSD improves the overall model performance; the impact of irregular frequency of available CSD, and that of data uncertainty, were also deeply assessed. However, the use of synthetic CSD in conjunction with (semi-)distributed hydrological models deserves further discussion. As a result of equifinality, poor model identifiability, and deficiencies in model structure, internal states of (semi-)distributed models can hardly mimic the actual states of complex systems away from calibration points. Accordingly, the use of synthetic CSD that are drawn from model internal states under best-fit conditions can lead to overestimation of the effectiveness of CSD assimilation in improving flood prediction. Operational flood forecasting, which results in decisions of high societal value, requires robust knowledge of the model behaviour and an in-depth assessment of both model structure and forcing data. Additional guidelines are given that are useful for the a priori evaluation of CSD for real-time flood forecasting and, hopefully, for planning apt design strategies for both model calibration and collection of CSD.
Directory of Open Access Journals (Sweden)
Changjiang Xu
2016-01-01
Full Text Available Design flood hydrograph (DFH) for a dam is the flood of suitable probability and magnitude adopted to ensure safety of the dam in accordance with appropriate design standards. Estimated quantiles of peak discharge and flood volumes are necessary for deriving the DFH; they are mutually correlated and need to be described by multivariate analysis methods. The joint probability distributions of peak discharge and flood volumes were established using copula functions. Then the general formulae of the conditional most likely composition (CMLC) and conditional expectation composition (CEC) methods, which consider the inherent relationship between flood peak and volumes, were derived for estimating the DFH. The Danjiangkou reservoir in the Hanjiang basin was selected as a case study. The design values of flood volumes and 90% confidence intervals for different peak discharges were estimated by the proposed methods. The performance of the CMLC and CEC methods was also compared with conventional flood frequency analysis, and the results show that the CMLC method performs best for both bivariate and trivariate distributions, with the smallest relative error and root mean square error. The proposed CMLC method has a strong statistical basis with a unique design flood composition scheme and provides an alternative way of deriving the DFH.
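The idea of conditioning flood volume on a design peak through a copula can be sketched as follows. A Clayton copula is used here because its conditional CDF inverts in closed form, whereas the paper fits copulas and margins to observed data; every parameter below is an illustrative assumption:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)
theta = 2.0   # Clayton dependence parameter (illustrative; Kendall tau = 0.5)

# Assumed marginal distributions of peak discharge and flood volume.
peak_dist = stats.lognorm(s=0.4, scale=3000.0)   # m^3/s
vol_dist = stats.lognorm(s=0.5, scale=800.0)     # 10^6 m^3

def clayton_conditional_v(u, w, theta):
    """Invert the Clayton conditional CDF C(v|u) at probability w."""
    return ((w ** (-theta / (1.0 + theta)) - 1.0)
            * u ** (-theta) + 1.0) ** (-1.0 / theta)

# Design peak: the 100-year quantile; sample flood volume conditioned on it.
u_design = 1.0 - 1.0 / 100.0
qp = peak_dist.ppf(u_design)
w = rng.uniform(size=100_000)
v = clayton_conditional_v(u_design, w, theta)
vol_given_peak = vol_dist.ppf(v)

print(f"100-yr peak {qp:.0f} m^3/s; conditional expected volume "
      f"{vol_given_peak.mean():.0f} "
      f"(90% interval {np.percentile(vol_given_peak, 5):.0f}-"
      f"{np.percentile(vol_given_peak, 95):.0f}) x 10^6 m^3")
```

The mean of the conditional sample corresponds to the CEC idea (expected volume given the design peak), and the mode of the conditional density would correspond to CMLC.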
Lee, J. Y.; Chae, B. S.; Wi, S.; Kim, T. W.
2017-12-01
Various climate change scenarios project that rainfall in South Korea will increase by 3-10% in the future. This increased rainfall will also significantly affect future flood frequency. This study analyzed the probability of future floods to investigate the stability of existing and newly installed hydraulic structures and the possibility of increasing flood damage in mid-sized watersheds in South Korea. To achieve this goal, we first clarified the relationship between flood quantiles acquired from the flood-frequency analysis (FFA) and design rainfall-runoff analysis (DRRA) in gauged watersheds. Then, after synthetically generating regional natural flow data according to RCP climate change scenarios, we developed mathematical formulas to estimate future flood quantiles based on the regression between DRRA and FFA incorporated with regional natural flows in ungauged watersheds. Finally, we developed a flood risk map to investigate the change of flood risk in terms of the return period for the past, present, and future. The results identified that future flood quantiles and risks would increase in accordance with the RCP climate change scenarios. Because the regional flood risk was identified to increase in the future compared with the present status, comprehensive flood control will be needed to cope with extreme floods in the future.
FLASH-FLOOD MODELLING WITH ARTIFICIAL NEURAL NETWORKS USING RADAR RAINFALL ESTIMATES
Directory of Open Access Journals (Sweden)
Dinu Cristian
2017-09-01
Full Text Available The use of artificial neural networks (ANNs) in modelling hydrological processes has become a common approach in the last two decades, alongside the traditional methods. In regard to rainfall-runoff modelling, in both traditional and ANN models the use of ground rainfall measurements is prevalent, which can be challenging in areas with a low rain gauging station density, especially in catchments where strong, focused rainfall can generate flash-floods. Weather radar technology can prove to be a solution for such areas by providing rain estimates with good time and space resolution. This paper presents a comparison between different ANN setups using as input both ground and radar observations for modelling the rainfall-runoff process for the Bahluet catchment, with focus on a flash-flood observed in the catchment.
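A rainfall-runoff ANN of the kind described can be sketched with a single hidden layer trained by plain gradient descent; the data are synthetic (runoff as a lagged function of rainfall), not the Bahluet catchment observations:

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic rainfall-runoff pairs: runoff depends on current and previous rainfall.
rain = rng.gamma(2.0, 2.0, 500)
runoff = 0.6 * rain + 0.3 * np.roll(rain, 1) + rng.normal(0, 0.2, 500)
X = np.column_stack([rain, np.roll(rain, 1)])[1:]   # drop the wrap-around sample
y = runoff[1:]

# One hidden layer of tanh units, linear output, full-batch gradient descent.
W1 = rng.normal(0, 0.1, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.1, 8); b2 = 0.0
lr = 1e-3
for _ in range(3000):
    h = np.tanh(X @ W1 + b1)              # hidden activations
    err = h @ W2 + b2 - y                 # prediction error
    gW2 = h.T @ err / len(y); gb2 = err.mean()
    gh = np.outer(err, W2) * (1 - h ** 2) # backprop through tanh
    gW1 = X.T @ gh / len(y); gb1 = gh.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2; W1 -= lr * gW1; b1 -= lr * gb1

mse = np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2)
print(f"training MSE: {mse:.3f}")
```

Feeding the network radar-derived rain estimates instead of (or alongside) gauge series only changes the input columns, which is why the paper can compare ANN setups on a common footing.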
Determination of soil degradation from flooding for estimating ecosystem services in Slovakia
Hlavcova, Kamila; Szolgay, Jan; Karabova, Beata; Kohnova, Silvia
2015-04-01
Floods as natural hazards are related to soil health, land-use and land management. They not only represent threats on their own, but can also be triggered, controlled and amplified by interactions with other soil threats and soil degradation processes. Among the many direct impacts of flooding on soil health, including soil texture, structure, changes in the soil's chemical properties, deterioration of soil aggregation and water holding capacity, etc., are soil erosion, mudflows, depositions of sediment and debris. Flooding is initiated by a combination of predispositive and triggering factors and apart from climate drivers it is related to the physiographic conditions of the land, state of the soil, land use and land management. Due to the diversity and complexity of their potential interactions, diverse methodologies and approaches are needed for describing a particular type of event in a specific environment, especially in ungauged sites. In engineering studies and also in many rainfall-runoff models, the SCS-CN method has remained widely applied for soil and land use-based estimations of direct runoff and flooding potential. The SCS-CN method is an empirical rainfall-runoff model developed by the USDA Natural Resources Conservation Service (formerly called the Soil Conservation Service or SCS). The runoff curve number (CN) is based on the hydrological soil characteristics, land use, land management and antecedent saturation conditions of soil. Since the method and curve numbers were derived on the basis of an empirical analysis of rainfall-runoff events from small catchments and hillslope plots monitored by the USDA, the use of the method under the conditions of Slovakia raises uncertainty and can cause inaccurate results in determining direct runoff. The objective of the study presented (also within the framework of the EU-FP7 RECARE Project) was to develop the SCS-CN methodology for the flood conditions in Slovakia (and especially for the RECARE pilot site
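The SCS-CN runoff equation underlying the study is compact enough to state directly; this is the standard USDA form in metric units, with an example curve number chosen for illustration:

```python
def scs_cn_runoff(p_mm, cn, lambda_ia=0.2):
    """Direct runoff depth (mm) from the SCS-CN method.
    p_mm: storm rainfall depth (mm); cn: curve number (1-100);
    lambda_ia: initial abstraction ratio (0.2 in the classic USDA form)."""
    s = 25400.0 / cn - 254.0      # potential maximum retention S (mm)
    ia = lambda_ia * s            # initial abstraction Ia
    if p_mm <= ia:
        return 0.0                # all rainfall absorbed before runoff starts
    return (p_mm - ia) ** 2 / (p_mm - ia + s)

# Example: an 80 mm storm on a hypothetical CN = 74 land-use/soil combination.
print(f"{scs_cn_runoff(80.0, cn=74):.1f} mm direct runoff")
```

Adapting the method to Slovak conditions, as the study does, amounts to re-deriving the CN values (and possibly the initial abstraction ratio) from local rainfall-runoff events rather than changing this equation.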
An Estimation Method for number of carrier frequency
Directory of Open Access Journals (Sweden)
Xiong Peng
2015-01-01
Full Text Available This paper proposes a method that uses AR-model power spectrum estimation based on the Burg algorithm to estimate the number of carrier frequencies in a single pulse. In modern electronic and information warfare, radar pulse waveforms are complex and changeable, among which the single pulse with multiple carrier frequencies is the most typical, such as the frequency shift keying (FSK) signal, the frequency shift keying with linear frequency modulation (FSK-LFM) hybrid modulation signal and the frequency shift keying with binary phase shift keying (FSK-BPSK) hybrid modulation signal. For this kind of multi-carrier single pulse, the paper models the complex signal as an AR process and computes its power spectrum with the Burg algorithm. Experimental results show that the estimation method can still determine the number of carrier frequencies accurately even when the signal-to-noise ratio (SNR) is very low.
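For readers unfamiliar with the approach, its core (an AR model fitted with Burg's recursion, whose power spectrum shows one peak per carrier) can be sketched as follows. This is a generic Burg implementation, not the authors' code; the model order and any test frequencies are illustrative.

```python
import numpy as np

def burg_ar(x, order):
    """Fit an AR(order) model to x with Burg's method.

    Returns (a, e): a = AR polynomial coefficients [1, a1, ..., ap] and
    e = final prediction-error power. Works for complex (IQ) signals.
    """
    x = np.asarray(x, dtype=complex)
    n = len(x)
    f = x.copy()                      # forward prediction errors
    b = x.copy()                      # backward prediction errors
    a = np.array([1.0 + 0j])
    e = np.mean(np.abs(x) ** 2)
    for m in range(order):
        f_old = f[m + 1:n].copy()
        b_old = b[m:n - 1].copy()
        # Reflection coefficient minimizing forward + backward error power
        k = -2.0 * np.sum(f_old * np.conj(b_old)) / (
            np.sum(np.abs(f_old) ** 2) + np.sum(np.abs(b_old) ** 2))
        # Levinson recursion for the AR polynomial
        a = np.append(a, 0.0) + k * np.conj(np.append(a, 0.0)[::-1])
        f[m + 1:n] = f_old + k * b_old
        b[m + 1:n] = b_old + np.conj(k) * f_old
        e *= 1.0 - np.abs(k) ** 2
    return a, e

def ar_psd(a, e, nfft=2048):
    """AR-model power spectrum on nfft bins of the unit frequency grid."""
    return e / np.abs(np.fft.fft(a, nfft)) ** 2
```

Counting the dominant peaks of `ar_psd` then gives an estimate of the number of carriers.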
Revising time series of the Elbe river discharge for flood frequency determination at gauge Dresden
Directory of Open Access Journals (Sweden)
S. Bartl
2009-11-01
Full Text Available The German research programme RIsk MAnagement of eXtreme flood events (RIMAX) has improved regional hazard assessment for the large rivers in Germany. Here we focus on the Elbe river at its gauge Dresden, one of the oldest gauges in Europe, with an officially available daily discharge time series beginning on 1 January 1890. The project aimed on the one hand to extend and revise the existing time series, and on the other to examine the variability of the Elbe river discharge conditions on a longer time scale. One major task was therefore the historical search for, and examination of, the retrieved documents and the information they contain. After analysing this information, the development of the river course and the discharge conditions are discussed. Using the knowledge thus provided, a historical hydraulic model was established in another subproject; its results were in turn used here. A further purpose was the determination of flood frequency based on all pre-processed data. The knowledge obtained about historical changes was also used to assess possible future variations under climate-change conditions; in particular, variations in the runoff characteristics of the Elbe river over the course of the year were analysed. The outcome is a much longer discharge time series containing fewer errors and uncertainties, and hence an optimised regional hazard assessment.
Revising time series of the Elbe river discharge for flood frequency determination at gauge Dresden
Bartl, S.; Schümberg, S.; Deutsch, M.
2009-11-01
Sava, E.; Cervone, G.; Kalyanapu, A. J.; Sampson, K. M.
2017-12-01
The increasing trend in flooding events, paired with rapid urbanization and an aging infrastructure, is projected to increase the risk of catastrophic losses and the frequency of both flash floods and large-area floods. During such events it is critical for decision makers and emergency responders to have access to timely, actionable knowledge regarding preparedness, emergency response and recovery before, during and after a disaster. Large volumes of data derived from sophisticated sensors, mobile phones and social media feeds are increasingly being used to improve citizen services and to suggest the best way to respond to emergencies through visualization and GIS mapping. Such data, coupled with recent advances in fusing remote sensing with near-real-time heterogeneous datasets, allow decision makers to extract precise, relevant knowledge more efficiently and to better understand how disaster damage affects urban populations in real time. This research assesses the feasibility of integrating multiple sources of contributed data into hydrodynamic models for flood inundation simulation and damage estimation. It integrates high-resolution physiographic data and satellite remote sensing imagery with non-authoritative data such as Civil Air Patrol (CAP) imagery and during-event social media observations of flood inundation in order to improve flood mapping. The goal is to augment remote sensing imagery with new open-source datasets to generate flood extent maps at higher temporal and spatial resolution. The proposed methodology is applied to two test cases: the 2013 Boulder, Colorado flood and the 2015 floods in Texas.
Load Estimation by Frequency Domain Decomposition
DEFF Research Database (Denmark)
Pedersen, Ivar Chr. Bjerg; Hansen, Søren Mosegaard; Brincker, Rune
2007-01-01
When performing operational modal analysis the dynamic loading is unknown; however, once the modal properties of the structure have been estimated, the transfer matrix can be obtained and the loading can be estimated by ...
Floods and climate: emerging perspectives for flood risk assessment and management
Merz, B.; Aerts, J.C.J.H.; Arnbjerg-Nielsen, K.; Baldi, M.; Becker, A.; Bichet, A.; Blöschl, G.; Bouwer, L.M.; Brauer, A.; Cioffi, F.; Delgado, J.M.; Gocht, M.; Guzetti, F.; Harrigan, S.; Hirschboeck, K.; Kilsby, C.; Kron, W.; Kwon, H. -H.; Lall, U.; Merz, R.; Nissen, K.; Salvatti, P.; Swierczynski, T.; Ulbrich, U.; Viglione, A.; Ward, P.J.; Weiler, M.; Wilhelm, B.; Nied, M.
2014-01-01
Flood estimation and flood management have traditionally been the domain of hydrologists, water resources engineers and statisticians, and disciplinary approaches abound. Dominant views have been shaped; one example is the catchment perspective: floods are formed and influenced by the interaction of
Application of Flood Nomograph for Flood Forecasting in Urban Areas
Directory of Open Access Journals (Sweden)
Eui Hoon Lee
2018-01-01
Full Text Available Imperviousness has increased due to urbanization, as has the frequency of extreme rainfall events due to climate change. Various countermeasures, both structural and nonstructural, are required to prepare for these effects; flood forecasting is a representative nonstructural measure. Flood forecasting techniques have been developed to prevent repeated flood damage in urban areas. Techniques that rely on a training process are difficult to apply because the training must be repeated for every use, while techniques that use radar-predicted rainfall are not appropriate for small areas such as single drainage basins. In this study, a new flood forecasting technique is suggested to reduce flood damage in urban areas. The flood nomograph is built from the first flooding node in rainfall-runoff simulations driven by synthetic rainfall data at each duration: the synthetic rainfall starts at 1 mm and is increased in 1 mm increments until flooding occurs. The advantage of this flood forecasting technique is its simple application with real-time rainfall data. It can be used to prepare a preemptive response in urban flood management.
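The nomograph construction loop described above is simple to sketch. The rainfall-runoff simulator below is a hypothetical stand-in (a real application would call a model such as SWMM), and the node capacities are invented for illustration.

```python
def find_flood_threshold(simulate_flooding, max_rain_mm=200, step_mm=1):
    """Return the smallest synthetic rainfall depth (mm) that floods any node,
    together with the first flooding node.

    `simulate_flooding(rain_mm)` stands in for a rainfall-runoff run
    returning the sorted list of flooded node ids for that rainfall depth.
    """
    rain = step_mm
    while rain <= max_rain_mm:
        flooded = simulate_flooding(rain)
        if flooded:
            return rain, flooded[0]
        rain += step_mm
    return None, None

# Toy drainage network: a node floods once rainfall exceeds its capacity (mm).
CAPACITY_MM = {"n1": 42, "n2": 57, "n3": 35}

def toy_simulate(rain_mm):
    return sorted(n for n, cap in CAPACITY_MM.items() if rain_mm > cap)
```

Repeating the search for each storm duration yields the (duration, threshold-depth) pairs that make up the nomograph.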
Dittes, Beatrice; Špačková, Olga; Ebrahimian, Negin; Kaiser, Maria; Rieger, Wolfgang; Disse, Markus; Straub, Daniel
2017-04-01
Flood risk estimates are subject to significant uncertainties, e.g. due to limited records of historic flood events, uncertainty in flood modeling, the uncertain impact of climate change or uncertainty in the exposure and loss estimates. In traditional design of flood protection systems, these uncertainties are typically accounted for only implicitly, based on engineering judgment. In the AdaptRisk project, we develop a fully quantitative framework for planning flood protection systems under current and future uncertainties using quantitative pre-posterior Bayesian decision analysis. In this contribution, we focus on the quantification of the uncertainties and study their relative influence on the flood risk estimate and on the planning of flood protection systems. The following uncertainty components are included using a Bayesian approach: 1) inherent and statistical (i.e. limited record length) uncertainty; 2) climate uncertainty that can be learned from an ensemble of GCM-RCM models; 3) estimates of climate uncertainty components not covered in 2), such as bias correction, incomplete ensembles, and local specifics not captured by the GCM-RCM models; 4) uncertainty in the inundation modelling; 5) uncertainty in damage estimation. We also investigate how these uncertainties may be reduced in the future when new evidence, such as new climate models, observed extreme events, and socio-economic data, becomes available. Finally, we look into how this new evidence influences the risk assessment and the effectiveness of flood protection systems. We demonstrate our methodology for a pre-alpine catchment in southern Germany: the Mangfall catchment in Bavaria, which includes the city of Rosenheim, which suffered significant losses during the 2013 flood event.
Application of the Unbounded Probability Distribution of the Johnson System for Floods Estimation
Directory of Open Access Journals (Sweden)
Campos-Aranda Daniel Francisco
2015-09-01
Full Text Available Design floods are key to sizing new hydraulic works and to reviewing the hydrological safety of existing ones. The most reliable method for estimating their magnitudes for given return periods is to fit a probabilistic model to the available record of maximum annual flows. Since the appropriate model is not known a priori, several models need to be tested in order to select the most suitable one according to a statistical index, commonly the standard error of fit. Several probability distributions have shown versatility and consistency of results when processing flood records and their application has therefore been established as a norm or precept. The Johnson system comprises three families of distributions, one of which is the Log-Normal model with three fitting parameters, which is also the border between the bounded distributions and those with no upper limit. These families of distributions have four fitting parameters and converge to the standard normal distribution, so their predictions are obtained with that model. Having contrasted the three probability distributions established by precept on 31 historical records of hydrological events, the Johnson system is applied to the same data. The results of the unbounded distribution of the Johnson system (SJU) are compared with the best results from the three precept distributions. The predictions of the SJU distribution were found to be similar to those obtained with the other models for low return periods (< 1000 years). Because of its theoretical support, the SJU model is recommended for flood estimation.
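As a hedged illustration of the approach, SciPy's `johnsonsu` distribution can be fitted to an annual-maximum series and queried for a design quantile; the synthetic record below is a stand-in for the 31 historical records used in the study.

```python
import numpy as np
from scipy import stats

def design_flood_johnson_su(annual_max, return_period_years):
    """Fit the unbounded Johnson (SU) distribution to an annual-maximum
    flow record and return the design flood for a given return period."""
    gamma, delta, loc, scale = stats.johnsonsu.fit(annual_max)
    p_non_exceedance = 1.0 - 1.0 / return_period_years
    return stats.johnsonsu.ppf(p_non_exceedance, gamma, delta, loc, scale)
```

In practice the fit would be compared against the precept distributions via the standard error of fit before adopting the SJU quantiles.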
Predicting Coastal Flood Severity using Random Forest Algorithm
Sadler, J. M.; Goodall, J. L.; Morsy, M. M.; Spencer, K.
2017-12-01
Coastal floods have become more common recently and are predicted to further increase in frequency and severity due to sea level rise. Predicting floods in coastal cities can be difficult due to the number of environmental and geographic factors which can influence flooding events. Built stormwater infrastructure and irregular urban landscapes add further complexity. This paper demonstrates the use of machine learning algorithms in predicting street flood occurrence in an urban coastal setting. The model is trained and evaluated using data from Norfolk, Virginia USA from September 2010 - October 2016. Rainfall, tide levels, water table levels, and wind conditions are used as input variables. Street flooding reports made by city workers after named and unnamed storm events, ranging from 1-159 reports per event, are the model output. Results show that Random Forest provides predictive power in estimating the number of flood occurrences given a set of environmental conditions with an out-of-bag root mean squared error of 4.3 flood reports and a mean absolute error of 0.82 flood reports. The Random Forest algorithm performed much better than Poisson regression. From the Random Forest model, total daily rainfall was by far the most important factor in flood occurrence prediction, followed by daily low tide and daily higher high tide. The model demonstrated here could be used to predict flood severity based on forecast rainfall and tide conditions and could be further enhanced using more complete street flooding data for model training.
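A minimal sketch of this kind of model is shown below with scikit-learn. The data are synthetic stand-ins for the Norfolk observations (rainfall is deliberately made the dominant driver, mirroring the reported variable importance); this is not the city's flood-report dataset.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Synthetic stand-ins: daily rainfall (mm), daily higher-high tide (m),
# and wind speed (km/h); the response is a count of street-flooding reports.
rng = np.random.default_rng(42)
n = 600
rain = rng.gamma(2.0, 10.0, n)
tide = rng.normal(1.0, 0.3, n)
wind = rng.normal(15.0, 5.0, n)
reports = np.maximum(
    0.0, 0.3 * rain + 4.0 * tide + 0.05 * wind - 12.0 + rng.normal(0, 1, n)
).round()

X = np.column_stack([rain, tide, wind])
rf = RandomForestRegressor(n_estimators=200, oob_score=True, random_state=0)
rf.fit(X, reports)
```

The out-of-bag score (`rf.oob_score_`) plays the role of the study's out-of-bag error, and `rf.feature_importances_` ranks the drivers, with rainfall expected to dominate in this construction.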
FREQUENCY ANALYSIS OF RAINFALL FOR FLOOD CONTROL IN ...
African Journals Online (AJOL)
The Niger Delta Region of Nigeria is within the mangrove forest region and is crisscrossed by series of streams and creeks. As a result of the high rainfall volume within this region there is a tendency for severe flooding to occur. These flood events have severe consequences on lives and properties. It is therefore necessary ...
Joint Spatio-Temporal Filtering Methods for DOA and Fundamental Frequency Estimation
DEFF Research Database (Denmark)
Jensen, Jesper Rindom; Christensen, Mads Græsbøll; Benesty, Jacob
2015-01-01
some attention in the community and is quite promising for several applications. The proposed methods are based on optimal, adaptive filters that leave the desired signal, having a certain DOA and fundamental frequency, undistorted and suppress everything else. The filtering methods simultaneously...... operate in space and time, whereby it is possible to resolve cases that are otherwise problematic for pitch estimators or DOA estimators based on beamforming. Several special cases and improvements are considered, including a method for estimating the covariance matrix based on the recently proposed...
Frequency of posttraumatic stress disorder (ptsd) among flood affected individuals
International Nuclear Information System (INIS)
Aslam, N.; Kamal, A.
2014-01-01
Objectives: To investigate the relationship between exposure to a traumatic event and the subsequent onset of Posttraumatic Stress Disorder (PTSD) in the population exposed to floods in Pakistan. Study Design: Cross-sectional study. Place and duration of study: Individuals exposed to the 2010 flood in district Shadadkot, Sindh, from April 2012 to September 2012. Methodology: The sample comprised 101 individuals from flood-affected areas of Pakistan. The age range of the participants was 15 to 50 years (M = 27.73, SD = 7.19), with both males and females participating. PTSD was assessed using the self-report Impact of Event Scale (IES), and subjective and objective flood exposure was assessed with the Flood Related Exposure Scale (FRES) devised by the authors. Results: The prevalence of PTSD among the flood-affected population was 35.5%. Trauma had a significant positive relation with objective flood exposure (r = .27) and subjective flood exposure (r = .38). An inverse relation appeared between age and PTSD (r = -.20). PTSD was higher among females than among males. Conclusion: Understanding the prevalence of PTSD helps mental health professionals devise intervention strategies. A longitudinal study design is recommended for a better understanding of trajectories of the trauma response over time. Our findings may help identify populations at risk for treatment research. (author)
Estimating Coastal Lagoon Tidal Flooding and Repletion with Multidate ASTER Thermal Imagery
Directory of Open Access Journals (Sweden)
Thomas R. Allen
2012-10-01
Full Text Available Coastal lagoons mix inflowing freshwater and tidal marine waters in complex spatial patterns. This project sought to detect and measure the temperature and spatial variability of flood tides in a constricted coastal lagoon using multitemporal remote sensing. Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) thermal infrared data provided estimates of surface temperature for delineating repletion zones in portions of Chincoteague Bay, Virginia. ASTER high-spatial-resolution sea-surface temperature imagery, in conjunction with in situ observations and tidal predictions, helped determine the optimal seasonal data for analysis. The selected time-series ASTER data sets were analyzed at different tidal phases and seasons in 2004–2006. Skin surface temperatures of ocean and estuarine waters were differentiated by flood-tidal penetration and ebb flows. Spatially variable tidal flood penetration was evaluated using discrete seed-pixel area analysis and time-series Principal Components Analysis. Results from these techniques provide the spatial extent and variability dynamics of tidal repletion, flushing and mixing, important factors in eutrophication assessment, water quality and resource monitoring, and the application of hydrodynamic modeling for coastal estuary science and management.
REVIEW OF IMPROVEMENTS IN RADIO FREQUENCY PHOTONICS
2017-09-01
AFRL-RY-WP-TR-2017-0156. Preetpaul S. Devgan, RF/EO Subsystems Branch, Aerospace Components Division. Approved for public release; available from the Defense Technical Information Center (DTIC, http://www.dtic.mil).
Renkewitz, Frank
2004-01-01
Contemporary theories on frequency processing have been developed in different sub-disciplines of psychology and have shown remarkable discrepancies. Thus, in judgment and decision making, frequency estimates on serially encoded events are mostly traced back to the availability heuristic (Tversky & Kahneman, 1973). Evidence for the use of this heuristic comes from several popular demonstrations of biased frequency estimates. In the area of decision making, these demonstrations led to the ...
International Nuclear Information System (INIS)
Head, J.W.
1982-01-01
Estimates of lava volumes on planetary surfaces provide important data on the lava flooding history and thermal evolution of a planet. The lack of information concerning the configuration of the topography prior to volcanic flooding requires a variety of techniques to estimate lava thicknesses and volumes. A technique is described and developed which provides volume estimates by artificially flooding unflooded lunar topography characteristic of certain geological environments, and tracking the area covered, lava thicknesses, and lava volumes. Map patterns of incompletely buried topography in these artificially flooded areas are then compared to lava-flooded topography on the Moon in order to estimate the actual lava volumes. This technique is applied to two areas related to lunar impact basins: the relatively unflooded Orientale basin, and the Archimedes-Apennine Bench region of the Imbrium basin. (Auth.)
Kalyanapu, A. J.; Dullo, T. T.; Gangrade, S.; Kao, S. C.; Marshall, R.; Islam, S. R.; Ghafoor, S. K.
2017-12-01
Hurricane Harvey, which made landfall in southern Texas in August 2017, is one of the most destructive hurricanes of the 2017 hurricane season. During its active period, many areas in the coastal Texas region received more than 40 inches of rain. This downpour caused significant flooding resulting in about 77 casualties, displacing more than 30,000 people, inundating hundreds of thousands of homes, and is currently estimated to have caused more than $70 billion in direct damage. One of the most significantly affected areas is Harris County, where the city of Houston, TX is located. Covering two HUC-8 drainage basins (~2,702 mi2), this county received more than 80% of its average annual rainfall during this event. This study presents an effort to reconstruct the flooding caused by the extreme rainfall of Hurricane Harvey in Harris County, Texas. This computationally intensive task was performed at 30-m spatial resolution using a rapid flood model called Flood2D-GPU, a graphics processing unit (GPU) accelerated model, on Oak Ridge National Laboratory's (ORNL) Titan supercomputer. For this task, hourly rainfall estimates from the National Centers for Environmental Prediction Stage IV Quantitative Precipitation Estimate were fed into the Variable Infiltration Capacity (VIC) hydrologic model and the Routing Application for Parallel computation of Discharge (RAPID) routing model to estimate flow hydrographs at 69 locations for the Flood2D-GPU simulation. Preliminary results of the simulation, including flood inundation extents and maps of flood depths and inundation duration, will be presented. Future efforts will focus on calibrating and validating the simulation results and assessing the flood damage to better understand the impacts of Hurricane Harvey.
Default Bayesian Estimation of the Fundamental Frequency
DEFF Research Database (Denmark)
Nielsen, Jesper Kjær; Christensen, Mads Græsbøll; Jensen, Søren Holdt
2013-01-01
Joint fundamental frequency and model order estimation is an important problem in several applications. In this paper, a default estimation algorithm based on a minimum of prior information is presented. The algorithm is developed in a Bayesian framework, and it can be applied to both real....... Moreover, several approximations of the posterior distributions on the fundamental frequency and the model order are derived, and one of the state-of-the-art joint fundamental frequency and model order estimators is demonstrated to be a special case of one of these approximations. The performance...
Achieving Optimal Quantum Acceleration of Frequency Estimation Using Adaptive Coherent Control.
Naghiloo, M; Jordan, A N; Murch, K W
2017-11-03
Precision measurements of frequency are critical to accurate time keeping and are fundamentally limited by quantum measurement uncertainties. While for time-independent quantum Hamiltonians the uncertainty of any parameter scales at best as 1/T, where T is the duration of the experiment, recent theoretical works have predicted that explicitly time-dependent Hamiltonians can yield a 1/T^{2} scaling of the uncertainty for an oscillation frequency. This quantum acceleration in precision requires coherent control, which is generally adaptive. We experimentally realize this quantum improvement in frequency sensitivity with superconducting circuits, using a single transmon qubit. With optimal control pulses, the theoretically ideal frequency precision scaling is reached for times shorter than the decoherence time. This result demonstrates a fundamental quantum advantage for frequency estimation.
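The scaling claim can be stated compactly (a paraphrase of the standard result, not a formula quoted from the paper), with T the total measurement duration:

```latex
\underbrace{\delta\omega \sim \frac{1}{T}}_{\text{time-independent } H}
\qquad\longrightarrow\qquad
\underbrace{\delta\omega \sim \frac{1}{T^{2}}}_{\text{time-dependent } H(t) \text{ with adaptive control}}
```

The quadratic improvement holds only while the coherent control remains effective, i.e. for times shorter than the decoherence time, as the abstract notes.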
Multi-catchment rainfall-runoff simulation for extreme flood estimation
Paquet, Emmanuel
2017-04-01
The SCHADEX method (Paquet et al., 2013) is a reference method in France for the estimation of extreme floods for dam design. The method is based on a semi-continuous rainfall-runoff simulation process: hundreds of different rainy events, randomly drawn up to extreme values, are simulated independently in the hydrological conditions of each day on which a rainy event was actually observed. This generates an exhaustive set of crossings between precipitation and soil-saturation hazards, and builds a complete distribution of flood discharges up to extreme quantiles. The hydrological model used within SCHADEX, the MORDOR model (Garçon, 1996), is a lumped model, which implies that hydrological processes, e.g. rainfall and soil saturation, are assumed to be homogeneous throughout the catchment. Snow processes are nevertheless represented as a function of altitude. This hypothesis of homogeneity is questionable, especially as the size of the catchment increases, or in areas of highly contrasted climatology (like mountainous areas). Conversely, modeling the catchment with a fully distributed approach would raise different problems, in particular distributing the rainfall-runoff model parameters through space and, within the SCHADEX stochastic framework, generating extreme rain fields with credible spatio-temporal features. An intermediate solution is presented here. It provides a better representation of the hydro-climatic diversity of the studied catchment (especially regarding flood processes) while keeping the SCHADEX simulation framework. It consists of dividing the catchment into several, more homogeneous sub-catchments. Rainfall-runoff models are parameterized individually for each of them, using local discharge data if available. A first SCHADEX simulation is done at the global scale, which allows assigning a probability to each simulated event, mainly based on the global areal rainfall drawn for the event (see Paquet et al., 2013 for details). Then the
An Accurate Link Correlation Estimator for Improving Wireless Protocol Performance
Zhao, Zhiwei; Xu, Xianghua; Dong, Wei; Bu, Jiajun
2015-01-01
Wireless link correlation has shown significant impact on the performance of various sensor network protocols. Many works have been devoted to exploiting link correlation for protocol improvements. However, the effectiveness of these designs heavily relies on the accuracy of link correlation measurement. In this paper, we investigate state-of-the-art link correlation measurement and analyze the limitations of existing works. We then propose a novel lightweight and accurate link correlation estimation (LACE) approach based on the reasoning of link correlation formation. LACE combines both long-term and short-term link behaviors for link correlation estimation. We implement LACE as a stand-alone interface in TinyOS and incorporate it into both routing and flooding protocols. Simulation and testbed results show that LACE: (1) achieves more accurate and lightweight link correlation measurements than the state-of-the-art work; and (2) greatly improves the performance of protocols exploiting link correlation. PMID:25686314
An experimental system for flood risk forecasting at global scale
Alfieri, L.; Dottori, F.; Kalas, M.; Lorini, V.; Bianchi, A.; Hirpa, F. A.; Feyen, L.; Salamon, P.
2016-12-01
Global flood forecasting and monitoring systems are nowadays a reality and are being applied by an increasing range of users and practitioners in disaster risk management. Furthermore, there is an increasing demand from users to integrate flood early warning systems with risk-based forecasts, combining streamflow estimations with expected inundated areas and flood impacts. To this end, we have developed an experimental procedure for near-real-time flood mapping and impact assessment based on the daily forecasts issued by the Global Flood Awareness System (GloFAS). The methodology translates GloFAS streamflow forecasts into event-based flood hazard maps based on the predicted flow magnitude, the forecast lead time and a database of flood hazard maps with global coverage. Flood hazard maps are then combined with exposure and vulnerability information to derive flood risk. Impacts of the forecasted flood events are evaluated in terms of flood-prone areas, potential economic damage, and affected population, infrastructure and cities. To further increase the reliability of the proposed methodology we integrated model-based estimations with an innovative methodology for social media monitoring, which allows for real-time verification of impact forecasts. The preliminary tests provided good results and showed the potential of the developed real-time operational procedure in helping emergency response and management. In particular, the link with social media is crucial for improving the accuracy of impact predictions.
Improvements in fast-response flood modeling: desktop parallel computing and domain tracking
Energy Technology Data Exchange (ETDEWEB)
Judi, David R. [Los Alamos National Laboratory]; McPherson, Timothy N. [Los Alamos National Laboratory]; Burian, Steven J. [University of Utah]
2009-01-01
It is becoming increasingly important to be able to forecast flooding accurately, as flooding accounts for the greatest losses due to natural disasters in the world and in the United States. Flood inundation modeling has been dominated by one-dimensional approaches. These models are computationally efficient and are considered by many engineers to produce reasonably accurate water surface profiles. However, because the profiles estimated by these models must be superimposed on digital elevation data to create a two-dimensional map, the result may be sensitive to the ability of the elevation data to capture relevant features (e.g. dikes/levees, roads, walls, etc.). Moreover, one-dimensional models do not explicitly represent the complex flow processes present in floodplains and urban environments, and because two-dimensional models based on the shallow water equations have a significantly greater ability to determine flow velocity and direction, the National Research Council (NRC) has recommended that two-dimensional models be used over one-dimensional models for flood inundation studies. This paper shows that two-dimensional flood modeling computation time can be greatly reduced through the use of Java multithreading on multi-core computers, which effectively provides a means of parallel computing on a desktop computer. In addition, it shows that when desktop parallel computing is coupled with a domain tracking algorithm, significant computation time can be eliminated by performing computations only on inundated cells. The drastic reduction in computation time shown here enhances the ability of two-dimensional flood inundation models to be used as near-real-time flood forecasting, engineering design, or planning tools. Perhaps of even greater significance, the reduction in computation time makes the incorporation of risk and uncertainty/ensemble forecasting more feasible for flood inundation modeling (NRC 2000; Sayers et al
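The domain-tracking idea (updating only wet cells and their neighbours rather than the full grid) can be sketched with a toy spreading rule. This is not the paper's shallow-water solver, and the 2.5% exchange coefficient is invented for illustration.

```python
import numpy as np

def flood_step(depth, active):
    """One explicit spreading step evaluated only over the tracked wet
    cells ('domain tracking'). Each wet cell sends a fixed share of its
    depth to each in-grid neighbour, so total water volume is conserved.
    """
    new_depth = depth.copy()
    next_active = set()
    rows, cols = depth.shape
    for i, j in active:
        h = depth[i, j]
        if h <= 0.0:
            continue
        next_active.add((i, j))
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ni, nj = i + di, j + dj
            if 0 <= ni < rows and 0 <= nj < cols:
                new_depth[ni, nj] += 0.025 * h   # neighbour gains
                new_depth[i, j] -= 0.025 * h     # source cell loses
                next_active.add((ni, nj))        # neighbour joins the domain
    return new_depth, next_active
```

Because each step visits only the active set, that set can also be partitioned across worker threads, which is the desktop-parallel idea the paper pairs with domain tracking.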
Giguet-Covex, Charline; Arnaud, Fabien; Enters, Dirk; Poulenard, Jérôme; Millet, Laurent; Francus, Pierre; David, Fernand; Rey, Pierre-Jérôme; Wilhelm, Bruno; Delannoy, Jean-Jacques
2012-01-01
In central Western Europe, several studies have shown that colder Holocene periods, such as the Little Ice Age, also correspond to wet periods. However, in mountain areas which are highly sensitive to erosion processes and where precipitation events can be localized, past evolution of hydrological activity might be more complicated. To assess these past hydrological changes, a paleolimnological approach was applied on a 13.4-m-long sediment core taken in alpine Lake Anterne (2063 m asl) and representing the last 3.5 ka. Lake sedimentation is mainly composed of flood deposits triggered by precipitation events. Sedimentological and geochemical analyses show that floods were more frequent during cold periods while high-intensity flood events occurred preferentially during warmer periods. In mild temperature conditions, both flood patterns are present. This underlines the complex relationship between flood hazards and climatic change in mountain areas. During the warmer and/or dryer times of the end of Iron Age and the Roman Period, both the frequency and intensity of floods increased. This is interpreted as an effect of human-induced clearing for grazing activities and reveals that anthropogenic interferences must be taken into account when reconstructing climatic signals from natural archives.
Demand analysis of flood insurance by using logistic regression model and genetic algorithm
Sidi, P.; Mamat, M. B.; Sukono; Supian, S.; Putra, A. S.
2018-03-01
Citarum River floods in the area of South Bandung, Indonesia, often result in damage to buildings belonging to people living in the vicinity. One way to alleviate the risk of building damage is to have flood insurance. The main obstacle is that not all people in the Citarum basin decide to buy flood insurance. In this paper, we analyse the decision to buy flood insurance. It is assumed that eight variables influence the purchase decision: income level, education level, house distance from the river, building elevation relative to the road, experienced flood frequency, flood prediction, perception of the insurance company, and perception of government efforts in handling floods. The analysis was done using a logistic regression model, with model parameters estimated by a genetic algorithm. The results of the analysis show that all eight variables significantly influence the demand for flood insurance. These results are expected to be considered by insurance companies seeking to influence the community's willingness to buy flood insurance.
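The estimation strategy described above (a logistic regression whose coefficients are found by a genetic algorithm rather than Newton-type optimization) can be sketched as follows. The data, population size, and mutation scale are all invented for illustration; the paper's actual variables and GA settings are not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in data: 8 explanatory variables (income, education,
# distance to river, etc.) and a binary insurance-purchase decision.
n, p = 200, 8
X = rng.normal(size=(n, p))
true_beta = np.array([0.8, -0.5, 1.2, 0.0, 0.6, -0.9, 0.3, 0.4])
y = (rng.random(n) < 1 / (1 + np.exp(-(X @ true_beta)))).astype(float)

def neg_log_likelihood(beta):
    z = X @ beta
    # -log L = sum log(1 + exp(z)) - y*z, computed stably
    return np.sum(np.logaddexp(0.0, z) - y * z)

# Minimal real-coded genetic algorithm: tournament selection,
# blend crossover, Gaussian mutation, elitism.
pop_size, n_gen = 60, 300
pop = rng.normal(size=(pop_size, p))
for gen in range(n_gen):
    fit = np.array([neg_log_likelihood(b) for b in pop])
    new_pop = [pop[np.argmin(fit)]]                 # keep the best (elitism)
    while len(new_pop) < pop_size:
        i, j = rng.integers(pop_size, size=2)
        a = pop[i] if fit[i] < fit[j] else pop[j]   # tournament parent 1
        i, j = rng.integers(pop_size, size=2)
        b = pop[i] if fit[i] < fit[j] else pop[j]   # tournament parent 2
        w = rng.random(p)
        child = w * a + (1 - w) * b                 # blend crossover
        child += rng.normal(scale=0.1, size=p)      # Gaussian mutation
        new_pop.append(child)
    pop = np.array(new_pop)

best = pop[np.argmin([neg_log_likelihood(b) for b in pop])]
print("estimated coefficients:", np.round(best, 2))
```

The GA minimizes the negative log-likelihood directly, which is useful when gradient-based fitting is awkward; for a plain logistic model, standard iteratively reweighted least squares would give the same optimum faster.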
Directory of Open Access Journals (Sweden)
L. Cenci
2017-11-01
The assimilation of satellite-derived soil moisture estimates (soil moisture–data assimilation, SM–DA) into hydrological models has the potential to reduce the uncertainty of streamflow simulations. The improved capacity to monitor the closeness to saturation of small catchments, such as those characterizing the Mediterranean region, can be exploited to enhance flash flood predictions. When compared to other microwave sensors that have been exploited for SM–DA in recent years (e.g. the Advanced SCATterometer – ASCAT), characterized by low spatial/high temporal resolution, the Sentinel 1 (S1) mission provides an excellent opportunity to monitor systematically soil moisture (SM) at high spatial resolution and moderate temporal resolution. The aim of this research was thus to evaluate the impact of S1-based SM–DA for enhancing flash flood predictions of a hydrological model (Continuum) that is currently exploited for civil protection applications in Italy. The analysis was carried out in a representative Mediterranean catchment prone to flash floods, located in north-western Italy, during the time period October 2014–February 2015. It provided some important findings: (i) revealing the potential provided by S1-based SM–DA for improving discharge predictions, especially for higher flows; (ii) suggesting a more appropriate pre-processing technique to be applied to S1 data before the assimilation; and (iii) highlighting that even though high spatial resolution does provide an important contribution in a SM–DA system, the temporal resolution has the most crucial role. S1-derived SM maps are still a relatively new product and, to our knowledge, this is the first work published in an international journal dealing with their assimilation within a hydrological model to improve continuous streamflow simulations and flash flood predictions. Even though the reported results were obtained by analysing a relatively short time period, and thus should be
Cenci, Luca; Pulvirenti, Luca; Boni, Giorgio; Chini, Marco; Matgen, Patrick; Gabellani, Simone; Squicciarino, Giuseppe; Pierdicca, Nazzareno
2017-11-01
The assimilation of satellite-derived soil moisture estimates (soil moisture-data assimilation, SM-DA) into hydrological models has the potential to reduce the uncertainty of streamflow simulations. The improved capacity to monitor the closeness to saturation of small catchments, such as those characterizing the Mediterranean region, can be exploited to enhance flash flood predictions. When compared to other microwave sensors that have been exploited for SM-DA in recent years (e.g. the Advanced SCATterometer - ASCAT), characterized by low spatial/high temporal resolution, the Sentinel 1 (S1) mission provides an excellent opportunity to monitor systematically soil moisture (SM) at high spatial resolution and moderate temporal resolution. The aim of this research was thus to evaluate the impact of S1-based SM-DA for enhancing flash flood predictions of a hydrological model (Continuum) that is currently exploited for civil protection applications in Italy. The analysis was carried out in a representative Mediterranean catchment prone to flash floods, located in north-western Italy, during the time period October 2014-February 2015. It provided some important findings: (i) revealing the potential provided by S1-based SM-DA for improving discharge predictions, especially for higher flows; (ii) suggesting a more appropriate pre-processing technique to be applied to S1 data before the assimilation; and (iii) highlighting that even though high spatial resolution does provide an important contribution in a SM-DA system, the temporal resolution has the most crucial role. S1-derived SM maps are still a relatively new product and, to our knowledge, this is the first work published in an international journal dealing with their assimilation within a hydrological model to improve continuous streamflow simulations and flash flood predictions. Even though the reported results were obtained by analysing a relatively short time period, and thus should be supported by further
Fragmented patterns of flood change across the United States
Archfield, Stacey A.; Hirsch, Robert M.; Viglione, A.; Blöschl, G.
2016-01-01
Trends in the peak magnitude, frequency, duration, and volume of frequent floods (floods occurring at an average of two events per year relative to a base period) across the United States show large changes; however, few trends are found to be statistically significant. The multidimensional behavior of flood change across the United States can be described by four distinct groups, with streamgages experiencing (1) minimal change, (2) increasing frequency, (3) decreasing frequency, or (4) increases in all flood properties. Yet group membership shows only weak geographic cohesion. Lack of geographic cohesion is further demonstrated by weak correlations between the temporal patterns of flood change and large-scale climate indices. These findings reveal a complex, fragmented pattern of flood change that, therefore, clouds the ability to make meaningful generalizations about flood change across the United States.
Gray bootstrap method for estimating frequency-varying random vibration signals with small samples
Directory of Open Access Journals (Sweden)
Wang Yanqing
2014-04-01
During environmental testing, the estimation of random vibration signals (RVS) is an important technique for airborne platform safety and reliability. However, the available methods, including the extreme value envelope method (EVEM), the statistical tolerance method (STM) and the improved statistical tolerance method (ISTM), require large samples and a typical probability distribution. Moreover, the frequency-varying characteristic of RVS is usually not taken into account. The gray bootstrap method (GBM) is proposed to solve the problem of estimating frequency-varying RVS with small samples. First, the estimation indexes are obtained, including the estimated interval, the estimated uncertainty, the estimated value, the estimated error and the estimated reliability. GBM is then applied to estimation for a single flight test of a certain aircraft. Finally, in order to evaluate the estimation performance, GBM is compared with the bootstrap method (BM) and the gray method (GM) in the test analysis. The results show that GBM is superior for estimating dynamic signals with small samples, and the estimated reliability is shown to be 100% at the given confidence level.
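The bootstrap component of the approach above (producing an estimated value and an estimated interval from a small sample) can be sketched in a few lines. This shows only the plain bootstrap, on an invented small sample; the coupling with the gray GM(1,1) predictor that makes the method "gray" is not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(1)

# Small (hypothetical) sample of vibration amplitudes at one frequency bin.
sample = np.array([2.1, 2.4, 1.9, 2.6, 2.2, 2.0, 2.5])

# Resample with replacement many times and collect the statistic of interest.
B = 2000
boot_means = np.empty(B)
for b in range(B):
    resample = rng.choice(sample, size=sample.size, replace=True)
    boot_means[b] = resample.mean()

# Estimated value and 95% estimated interval from the bootstrap distribution.
lo, hi = np.percentile(boot_means, [2.5, 97.5])
estimate = boot_means.mean()
print(f"estimated value: {estimate:.2f}, 95% interval: [{lo:.2f}, {hi:.2f}]")
```

The estimated uncertainty and error indexes mentioned in the abstract follow from the spread of `boot_means`; the gray extension applies a predictor to each resampled series so that the frequency-varying trend is captured as well.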
Flood rich periods, flood poor periods and the need to look beyond instrumental records
Lane, S. N.
2009-04-01
For many, the late 20th century and early 21st century have become synonymous with a growing experience of flood risk. Scientists, politicians and the media have ascribed this to changing climate, and there are good hypothetical reasons for human-induced climate change to be impacting upon the magnitude and frequency of extreme weather events. In this paper, I interrogate this claim more carefully, using the UK's instrumental records of river flow, most of which begin after 1960, but a smaller number of which extend back into the 19th century. Those records that extend back to the 19th century suggest that major flood events tend to cluster into periods that are relatively flood rich and relatively flood poor, most notably in larger drainage basins: i.e. there is a clear scale issue. The timing (onset, duration, termination) of these periods varies systematically by region, although there is a marked flood poor period for much of the UK during the late 1960s, 1970s and 1980s. It follows that at least some of the current experience of flooding, including why it has taken so many policy-makers and flood victims by surprise, may reflect a transition from a flood poor to a flood rich period, exacerbated by possible climate change impacts. These results point to the need to rethink what drives flood risk. First, they point to the need to look at some of the fundamental oscillations in core atmospheric drivers, such as the North Atlantic Multidecadal Oscillation, in explaining what drives flood risk. Consideration of precipitation, as opposed to river flow, is more advanced in this respect, and those of us working on rivers need to engage much more thoughtfully with atmospheric scientists. Second, they point to the severe inadequacies of using records of only a few decades' duration. Even where these are pooled across adjacent sub-catchments, there is likely to be a severe bias in the estimation of flood return periods when we look at instrumental
Theresa K. Andersen; Marshall J. Shepherd
2013-01-01
Atmospheric warming and associated hydrological changes have implications for regional flood intensity and frequency. Climate models and hydrological models have the ability to integrate various contributing factors and assess potential changes to hydrology at global to local scales through the century. This survey of floods in a changing climate reviews flood...
Yin, Yixing; Chen, Haishan; Xu, Chongyu; Xu, Wucheng; Chen, Changchun
2014-05-01
The regionalization methods, which 'trade space for time' by including several at-site data records in the frequency analysis, are an efficient tool to improve the reliability of extreme quantile estimates. With the main aims of improving the understanding of the regional frequency of extreme precipitation and of providing scientific and practical background and assistance in formulating regional development strategies for water resources management in the Yangtze River Delta (YRD) region, one of the most developed and flood-prone regions in China, this paper uses the L-moment-based index-flood (LMIF) method, one of the popular regionalization methods, for regional frequency analysis of extreme precipitation; attention was paid to inter-site dependence and its influence on the accuracy of quantile estimates, which has not been considered in most studies using the LMIF method. Extensive data screening for stationarity, serial dependence and inter-site dependence was carried out first. The entire YRD region was then categorized into four homogeneous regions through cluster analysis and homogeneity analysis. Based on goodness-of-fit statistics and L-moment ratio diagrams, the generalized extreme-value (GEV) and generalized normal (GNO) distributions were identified as the best-fit distributions for most of the subregions, and estimated quantiles for each region were obtained. Monte Carlo simulation was used to evaluate the accuracy of the quantile estimates, taking inter-site dependence into consideration. The results showed that the root mean square errors (RMSEs) were bigger and the 90% error bounds wider with inter-site dependence than without it, for both the regional growth curve and the quantile curve. The spatial patterns of extreme precipitation with a return period of 100 years were obtained, indicating that there are two regions with the highest precipitation extremes (southeastern coastal area of Zhejiang Province and the
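The sample L-moments at the heart of the LMIF method can be computed directly from probability-weighted moments. The following minimal sketch computes the first two L-moments for an invented annual-maximum precipitation series (the data are purely illustrative):

```python
import numpy as np

def sample_l_moments(x):
    """First two sample L-moments via probability-weighted moments b0, b1."""
    x = np.sort(np.asarray(x, dtype=float))   # order statistics, ascending
    n = x.size
    b0 = x.mean()
    # b1 = n^-1 * sum_{j=1..n} ((j-1)/(n-1)) * x_(j)
    b1 = np.sum((np.arange(1, n + 1) - 1) / (n - 1) * x) / n
    l1 = b0              # L-location (mean)
    l2 = 2 * b1 - b0     # L-scale
    return l1, l2

# Hypothetical annual-maximum precipitation series (mm) at one site.
amax = [88, 102, 95, 130, 77, 160, 110, 99, 85, 121]
l1, l2 = sample_l_moments(amax)
print(f"L-mean = {l1:.1f} mm, L-scale = {l2:.1f} mm, L-CV = {l2 / l1:.3f}")
```

In the index-flood step, a site's quantile estimate is the site mean (the index, `l1` here) multiplied by a regional growth factor fitted from the pooled, rescaled L-moment ratios of all sites in the homogeneous region.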
Mousa, Mustafa; Claudel, Christian G.
2014-01-01
floods occur very rarely, we use a supervised learning approach to estimate the correction to the ultrasonic rangefinder caused by temperature fluctuations. Preliminary data shows that water level can be estimated with an absolute error of less than 2 cm
Improving operational flood forecasting through data assimilation
Rakovec, Oldrich; Weerts, Albrecht; Uijlenhoet, Remko; Hazenberg, Pieter; Torfs, Paul
2010-05-01
Accurate flood forecasts have been a challenging topic in hydrology for decades. Uncertainty in hydrological forecasts is due to errors in the initial state (e.g. forcing errors in historical mode), errors in model structure and parameters, and, last but not least, errors in the model forcings (weather forecasts) during the forecast mode. More accurate flood forecasts can be obtained through data assimilation by merging observations with model simulations, which makes it possible to identify the sources of uncertainty in the flood forecasting system. Our aim is to assess the different sources of error that affect the initial state and to investigate how they propagate through hydrological models with different levels of spatial variation, starting from lumped models. The knowledge thus obtained can then be used in a data assimilation scheme to improve the flood forecasts. This study presents the first results of this framework and focuses on quantifying precipitation errors and their effect on discharge simulations within the Ourthe catchment (1600 km²), which is situated in the Belgian Ardennes and is one of the larger subbasins of the Meuse River. Inside the catchment, hourly rain gauge information from 10 different locations is available over a period of 15 years. Based on these time series, the bootstrap method has been applied to generate precipitation ensembles, which were then used to simulate the catchment's discharges at the outlet. The corresponding streamflow ensembles were further assimilated with observed river discharges to update the model states of lumped hydrological models (R-PDM, HBV) through residual resampling. This particle filtering technique is a sequential data assimilation method and makes no prior assumption about the probability density function of the model states, which, in contrast to the ensemble Kalman filter, does not have to be Gaussian. Our further research will be aimed at quantifying and reducing the sources of uncertainty that affect the initial
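The residual resampling step used in the particle filter above can be sketched in a few lines. The ensemble of model states, the observed discharge, and the Gaussian observation-error model below are all invented for illustration; only the resampling algorithm itself is the technique named in the abstract:

```python
import numpy as np

def residual_resample(weights, rng):
    """Residual resampling: deterministically copy floor(N*w_i) particles,
    then multinomially resample the fractional remainder."""
    n = weights.size
    counts = np.floor(n * weights).astype(int)      # deterministic part
    residual = n * weights - counts                 # fractional part
    n_left = n - counts.sum()
    if n_left > 0:
        residual /= residual.sum()
        extra = rng.choice(n, size=n_left, p=residual)
        np.add.at(counts, extra, 1)
    return np.repeat(np.arange(n), counts)          # resampled indices

rng = np.random.default_rng(2)
# Hypothetical ensemble of model states and one observed discharge.
states = rng.normal(50.0, 10.0, size=500)           # particle states
obs, sigma_obs = 62.0, 5.0
# Weight each particle by the Gaussian likelihood of the observation.
w = np.exp(-0.5 * ((states - obs) / sigma_obs) ** 2)
w /= w.sum()
idx = residual_resample(w, rng)
updated = states[idx]
print(f"prior mean {states.mean():.1f} -> posterior mean {updated.mean():.1f}")
```

Because much of the copying is deterministic, residual resampling has lower Monte Carlo variance than plain multinomial resampling while leaving the weighted distribution unchanged in expectation.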
Revision of regional maximum flood (RMF) estimation in Namibia ...
African Journals Online (AJOL)
Extreme flood hydrology in Namibia for the past 30 years has largely been based on the South African Department of Water Affairs Technical Report 137 (TR 137) of 1988. This report proposes an empirically established upper limit of flood peaks for regions called the regional maximum flood (RMF), which could be ...
Odry, Jean; Arnaud, Patrick
2016-04-01
The SHYREG method (Aubert et al., 2014) associates a stochastic rainfall generator and a rainfall-runoff model to produce rainfall and flood quantiles on a 1 km² mesh covering the whole French territory. The rainfall generator is based on the description of rainy events by descriptive variables following probability distributions and is characterised by high stability. This stochastic generator is fully regionalised, and the rainfall-runoff transformation is calibrated with a single parameter. Thanks to the stability of the approach, calibration can be performed against only the flood quantiles associated with observed frequencies, which can be extracted from relatively short time series. The aggregation of SHYREG flood quantiles to the catchment scale is performed using an areal reduction factor technique that is uniform over the whole territory. Past studies demonstrated the accuracy of SHYREG flood quantile estimation for catchments where flow data are available (Arnaud et al., 2015). Nevertheless, the parameter of the rainfall-runoff model is calibrated independently for each target catchment. As a consequence, this parameter plays a corrective role and compensates for approximations and modelling errors, which makes it difficult to identify its proper spatial pattern. It is an inherent objective of the SHYREG approach to be completely regionalised in order to provide a complete and accurate flood quantile database throughout France. Consequently, it appears necessary to identify the model configuration in which the calibrated parameter can be regionalised with acceptable performance. The re-evaluation of some of the method's hypotheses is a necessary step before regionalisation. In particular, the inclusion or modification of the spatial variability of imposed parameters (such as production and transfer reservoir sizes, base flow addition and the quantile aggregation function) should lead to more realistic values of the only calibrated parameter. The objective of the work presented
Bachand, Philip A M; Roy, Sujoy B; Choperena, Joe; Cameron, Don; Horwath, William R
2014-12-02
The agriculturally productive San Joaquin Valley faces two severe hydrologic issues: persistent groundwater overdraft and flooding risks. Capturing flood flows for groundwater recharge could help address both of these issues, yet flood flow frequency, duration, and magnitude vary greatly as upstream reservoir releases are affected by snowpack, precipitation type, reservoir volume, and flood risks. This variability makes dedicated, engineered recharge approaches expensive. Our work evaluates leveraging private farmlands in the Kings River Basin to capture flood flows for direct and in-lieu recharge, calculates on-farm infiltration rates, assesses logistics, and considers potential water quality issues. The Natural Resources Conservation Service (NRCS) soil series suggested that a cementing layer would hinder recharge. The standard practice of deep ripping fractured the layer, resulting in infiltration rates averaging 2.5 in d⁻¹ (6 cm d⁻¹) throughout the farm. Based on these rates, 10 acres are needed to infiltrate 1 cfs (100 m³ h⁻¹) of flood flows. Our conceptual model predicts that salinity and nitrate pulses flush initially to the groundwater but that groundwater quality improves in the long term due to pristine flood flows low in salts and nitrate. Flood flow capture, when integrated with irrigation, is more cost-effective than groundwater pumping.
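The sizing rule quoted above (about 10 acres per cfs at an infiltration rate of 2.5 in d⁻¹) follows from straightforward unit arithmetic, as this short check shows:

```python
# Check of the rule of thumb: area needed to infiltrate 1 cfs of flood flow
# at an infiltration rate of 2.5 inches per day.
CFS_TO_CFD = 86_400            # 1 ft^3/s sustained for a day -> ft^3/day
SQFT_PER_ACRE = 43_560

rate_ft_per_day = 2.5 / 12               # 2.5 in/day expressed in ft/day
volume_per_day = 1 * CFS_TO_CFD          # daily volume from 1 cfs, ft^3
area_sqft = volume_per_day / rate_ft_per_day
area_acres = area_sqft / SQFT_PER_ACRE
print(f"{area_acres:.1f} acres per cfs")
```

The result is roughly 9.5 acres, which the authors round to 10 acres per cfs.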
Directory of Open Access Journals (Sweden)
L. Altarejos-García
2012-07-01
This paper addresses the use of reliability techniques such as Rosenblueth's point-estimate method (PEM) as a practical alternative to more precise Monte Carlo approaches to obtain estimates of the mean and variance of the uncertain flood parameters water depth and velocity. These parameters define the flood severity, which is a concept used for decision-making in the context of flood risk assessment. The method proposed is particularly useful when the degree of complexity of the hydraulic models makes Monte Carlo inapplicable in terms of computing time, but a measure of the variability of these parameters is still needed. The capacity of PEM, which is a special case of numerical quadrature based on orthogonal polynomials, to evaluate the first two moments of performance functions such as the water depth and velocity is demonstrated in the case of a single river reach using a 1-D HEC-RAS model. It is shown that in some cases, using a simple variable transformation, the statistical distributions of both water depth and velocity approximate the lognormal. As this distribution is fully defined by its mean and variance, PEM can be used to define the full probability distribution function of these flood parameters, thus allowing probability estimates of flood severity. An application of the method to the same river reach using a 2-D shallow water equations (SWE) model is then performed. Flood maps of the mean and standard deviation of water depth and velocity are obtained, and uncertainty in the extent of flooded areas with different severity levels is assessed. It is recognized, though, that whenever application of the Monte Carlo method is practically feasible, it is the preferred approach.
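For independent inputs, Rosenblueth's PEM evaluates the performance function at the 2ⁿ corner points μᵢ ± σᵢ with equal weights 2⁻ⁿ and recovers the first two moments from those evaluations. The sketch below applies this to a stand-in performance function (a Manning-type velocity with invented means and standard deviations); the paper's actual HEC-RAS and SWE models are of course far more involved:

```python
import numpy as np
from itertools import product

def rosenblueth_pem(g, means, stds):
    """Rosenblueth's two-point estimate method for independent inputs:
    evaluate g at every (mu_i +/- sigma_i) corner with equal weights 2^-n."""
    n = len(means)
    vals = []
    for signs in product([-1.0, 1.0], repeat=n):
        x = [m + s * sd for m, s, sd in zip(means, signs, stds)]
        vals.append(g(*x))
    vals = np.array(vals)
    mean = vals.mean()                     # first moment
    var = (vals ** 2).mean() - mean ** 2   # second central moment
    return mean, var

# Hypothetical performance function: normal-depth velocity from Manning's
# equation for a wide channel, with uncertain roughness n and slope S.
def velocity(n_manning, slope, depth=2.0):
    return depth ** (2 / 3) * np.sqrt(slope) / n_manning

mean, var = rosenblueth_pem(velocity, means=[0.035, 0.001], stds=[0.005, 0.0002])
print(f"E[v] = {mean:.2f} m/s, sd = {np.sqrt(var):.2f} m/s")
```

With only 2ⁿ model runs instead of the thousands a Monte Carlo analysis needs, this is the computational saving the abstract emphasizes; if depth or velocity is then assumed lognormal, these two moments pin down the whole distribution.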
Improving Radar QPEs in Complex Terrain for Improved Flash Flood Monitoring and Prediction
Cifelli, R.; Streubel, D. P.; Reynolds, D.
2010-12-01
Quantitative Precipitation Estimation (QPE) is extremely challenging in regions of complex terrain due to a combination of issues related to sampling. In particular, radar beams are often blocked or scan above the liquid precipitation zone, while rain gauge density is often too low to properly characterize the spatial distribution of precipitation. Due to poor radar coverage, rain gauge networks are used by the National Weather Service (NWS) River Forecast Centers as the principal source for QPE across the western U.S. The California Nevada River Forecast Center (CNRFC) uses point rainfall measurements and historical rainfall-runoff relationships to derive river stage forecasts. The point measurements are interpolated to a 4 km grid using Parameter-elevation Regressions on Independent Slopes Model (PRISM) data to develop a gridded 6-hour QPE product (hereafter referred to as RFC QPE). Local forecast offices can utilize the Multi-sensor Precipitation Estimator (MPE) software to improve local QPEs and thus local flash flood monitoring and prediction. MPE uses radar and rain gauge data to develop a combined QPE product at 1-hour intervals. The rain gauge information is used to bias correct the radar precipitation estimates so that, in situations where the rain gauge density and radar coverage are adequate, MPE can take advantage of the spatial coverage of the radar and the “ground truth” of the rain gauges to provide an accurate QPE. The MPE 1-hour QPE analysis should provide better spatial and temporal resolution for short duration hydrologic events as compared to 6-hour analyses. These hourly QPEs are then used to correct radar derived rain rates used by the Flash Flood Monitoring and Prediction (FFMP) software in forecast offices for issuance of flash flood warnings. Although widely used by forecasters across the eastern U.S., MPE is not used extensively by the NWS in the west. Part of the reason for the lack of use of MPE across the west is that there has
A methodology to derive Synthetic Design Hydrographs for river flood management
Tomirotti, Massimo; Mignosa, Paolo
2017-12-01
The design of flood protection measures requires in many cases not only the estimation of the peak discharges, but also of the volume of the floods and its time distribution. A typical solution to this kind of problem is the formulation of Synthetic Design Hydrographs (SDHs). In this paper a methodology to derive SDHs is proposed on the basis of the estimation of the Flow Duration Frequency (FDF) reduction curve and of a Peak-Duration (PD) relationship, furnishing respectively the quantiles of the maximum average discharge and the average peak position for each duration. The methodology is intended to synthesize the main features of the historical floods in a unique SDH for each return period. The shape of the SDH is not selected a priori but results from the behaviour of the FDF and PD curves, allowing one to account very conveniently for the variability of the shapes of the observed hydrographs at the local time scale. The validation of the methodology is performed with reference to flood routing problems in reservoirs, lakes and rivers. The results obtained demonstrate the capability of the SDHs to describe the effects of different hydraulic systems on the statistical regime of floods, even in the presence of strong modifications induced on the probability distribution of peak flows.
Payrastre, Olivier; Bourgin, François; Lebouc, Laurent; Le Bihan, Guillaume; Gaume, Eric
2017-04-01
The October 2015 flash-floods in south eastern France caused more than twenty fatalities, high damages and large economic losses in high density urban areas of the Mediterranean coast, including the cities of Mandelieu-La Napoule, Cannes and Antibes. Following a post event survey and preliminary analyses conducted within the framework of the Hymex project, we set up an entire simulation chain at the regional scale to better understand this outstanding event. Rainfall-runoff simulations, inundation mapping and a first estimation of the impacts are conducted following the approach developed and successfully applied for two large flash-flood events in two different French regions (Gard in 2002 and Var in 2010) by Le Bihan (2016). A distributed rainfall-runoff model applied at high resolution for the whole area - including numerous small ungauged basins - is used to feed a semi-automatic hydraulic approach (Cartino method) applied along the river network - including small tributaries. Estimation of the impacts is then performed based on the delineation of the flooded areas and geographic databases identifying buildings and population at risk.
Delivering Integrated Flood Risk Management : Governance for collaboration, learning and adaptation
Van Herk, S.
2014-01-01
The frequency and consequences of extreme flood events have increased rapidly worldwide in recent decades and climate change and economic growth are likely to exacerbate this trend. Flood protection measures alone cannot accommodate the future frequencies and impacts of flooding. Integrated flood
Climate change track in river floods in Europe
Directory of Open Access Journals (Sweden)
Z. W. Kundzewicz
2015-06-01
A holistic perspective on changing river flood risk in Europe is provided. Economic losses from floods have increased, principally driven by the expanding exposure of assets at risk. Climate change (i.e. the observed increase in precipitation intensity, decrease of snowpack, and other observed climate changes) might already have had an impact on floods. However, no gauge-based evidence has been found for a climate-driven, widespread change in the magnitude/frequency of floods during the last decades. There are strong regional and sub-regional variations in the trends. Moreover, it has not generally been possible to attribute rain-generated peak streamflow trends to anthropogenic climate change. Physical reasoning suggests that projected increases in the frequency and intensity of heavy rainfall would contribute to increases in rain-generated local floods, while less snowmelt flooding and earlier spring peak flows are expected in snowmelt-fed rivers. However, there is low confidence in future changes in flood magnitude and frequency resulting from climate change. The impacts of climate change on flood characteristics are highly sensitive to the detailed nature of those changes. A discussion of projections of flood hazard in Europe is offered. Attention is drawn to considerable uncertainty: over the last decade or so, projections of flood hazard in Europe have changed substantially.
Okada, T.; McAneney, K. J.; Chen, K.
2011-12-01
Flooding on the Tone River, which drains the largest catchment area in Japan and is now home to 12 million people, poses significant risk to the Greater Tokyo Area. In April 2010, an expert panel in Japan, the Central Disaster Prevention Council, examined the potential for large-scale flooding and outlined possible mitigation measures in the Greater Tokyo Area. One of the scenarios considered closely mimics the pattern of flooding that occurred with the passage of Typhoon Kathleen in 1947 and would potentially flood some 680 000 households above floor level. Building upon that report, this study presents a Geographical Information System (GIS)-based data integration approach to estimate the insurance losses for residential buildings and contents as just one component of the potential financial cost. Using a range of publicly available data - census information, location reference data, insurance market information and flood water elevation data - this analysis finds that insurance losses for residential property alone could reach approximately 1 trillion JPY (US$12.5 billion). Total insurance losses, including commercial and industrial lines of business, are likely to be at least double this figure, with total economic costs being much greater still. The results are sensitive to the flood scenario assumed, position of levee failures, local flood depths and extents, population and building heights. The Average Recurrence Interval (ARI) of the rainfall following Typhoon Kathleen has been estimated to be on the order of 200 yr; however, at this juncture it is not possible to put an ARI on the modelled loss since we cannot know the relative or joint probability of the different flooding scenarios. It is possible that more than one of these scenarios could occur simultaneously or that levee failure at one point might lower water levels downstream and avoid a failure at all other points. In addition to insurance applications, spatial analyses like that presented here have
Cohn, T.A.; England, J.F.; Berenbrock, C.E.; Mason, R.R.; Stedinger, J.R.; Lamontagne, J.R.
2013-01-01
The Grubbs-Beck test is recommended by the federal guidelines for detection of low outliers in flood flow frequency computation in the United States. This paper presents a generalization of the Grubbs-Beck test for normal data (similar to the Rosner (1983) test; see also Spencer and McCuen (1996)) that can provide a consistent standard for identifying multiple potentially influential low flows. In cases where low outliers have been identified, they can be represented as “less-than” values, and a frequency distribution can be developed using censored-data statistical techniques, such as the Expected Moments Algorithm. This approach can improve the fit of the right-hand tail of a frequency distribution and provide protection from lack-of-fit due to unimportant but potentially influential low flows (PILFs) in a flood series, thus making the flood frequency analysis procedure more robust.
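For context, the classic single-value Grubbs-Beck screening that the paper generalizes applies a critical deviate to the base-10 logs of the annual peaks (the K_N approximation below is the Bulletin 17B form at the 10% significance level). The peak series here is invented, with two deliberately small values; this sketch is the classic test, not the paper's generalized multiple-outlier procedure:

```python
import numpy as np

def grubbs_beck_threshold(flows):
    """Classic (single) Grubbs-Beck low-outlier threshold, Bulletin 17B:
    log10-space mean minus K_N sample standard deviations."""
    logq = np.log10(np.asarray(flows, dtype=float))
    n = logq.size
    # Bulletin 17B approximation of the 10%-level critical deviate
    k_n = -0.9043 + 3.345 * np.sqrt(np.log10(n)) - 0.4046 * np.log10(n)
    return 10 ** (logq.mean() - k_n * logq.std(ddof=1))

# Hypothetical annual peak series (m^3/s) with two suspiciously small values.
peaks = [5.0, 8.0, 210, 340, 180, 290, 410, 260, 330, 240, 310, 275]
thr = grubbs_beck_threshold(peaks)
low_outliers = [q for q in peaks if q < thr]
print(f"threshold = {thr:.1f}, flagged as low outliers: {low_outliers}")
```

Note that only the smallest value is flagged here even though two values are clearly separated from the rest: the pair inflates the sample variance and masks itself. Avoiding exactly this masking effect, by testing multiple candidate low outliers in a consistent sweep, is the motivation for the generalized test presented in the paper.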
Why are decisions in flood disaster management so poorly supported by information from flood models?
Leskens, Anne; Brugnach, Marcela Fabiana; Hoekstra, Arjen Ysbert; Schuurmans, W.
2014-01-01
Flood simulation models can provide practitioners of Flood Disaster Management with sophisticated estimates of floods. Despite the advantages that flood simulation modeling may provide, experiences have proven that these models are of limited use. Until now, this problem has mainly been investigated
Directory of Open Access Journals (Sweden)
Xiao Qi Ye
Full Text Available Operation of the Three Gorges Reservoir (TGR, China imposes a new water fluctuation regime, including a prolonged winter submergence in contrast to the natural short summer flooding of the rivers. The contrasting water temperature regimes may remarkably affect the survival of submerged plants in the TGR. Plant survival in such prolonged flooding might depend on the carbohydrate status of the plants. Therefore, we investigated the effects of water temperature on survival and carbohydrate status in a flood-tolerant plant species and predicted that both survival and carbohydrate status would be improved by lower water temperatures.A growth chamber experiment with controlled water temperature were performed with the flood-tolerant species Arundinella anomala from the TGR region. The plants were submerged (80 cm deep water above soil surface with a constant water temperature at 30°C, 20°C or 10°C. The water temperature effects on survival, plant biomass and carbohydrate content (glucose, fructose and sucrose and starch in the viable and dead tissues were investigated.The results showed that the survival percentage of A.anomala plants was greatly dependent on water temperature. The two-month submergence survival percentage was 100% at 10°C, 40% at 20°C and 0% at 30°C. Decreasing the water temperature led to both later leaf death and slower biomass loss. Temperature decrease also induced less reduction in glucose, fructose and sucrose in the roots and leaves (before decay, p 0.05. Different water temperatures did not alter the carbon pool size in the stems, leaves and whole plants (p > 0.05, but a clear difference was found in the roots (p < 0.05, with a larger pool size at a lower temperature.We concluded that (1 A. anomala is characterized by high flooding tolerance and sustained capability to mobilize carbohydrate pool. (2 The survival percentage and carbohydrate status of submerged A. anomala plants were remarkably improved by lower water
Hong, Yang; Adler, Robert F.; Huffman, George J.; Pierce, Harold
2008-01-01
Advances in flood monitoring/forecasting have been constrained by the difficulty in estimating rainfall continuously over space (catchment-, national-, continental-, or even global-scale areas) and at flood-relevant time scales. With the recent availability of satellite rainfall estimates at fine time and space resolution, this paper describes a prototype research framework for global flood monitoring by combining real-time satellite observations with a database of global terrestrial characteristics through a hydrologically relevant modeling scheme. Four major components included in the framework are (1) real-time precipitation input from the NASA TRMM-based Multi-satellite Precipitation Analysis (TMPA); (2) a central geospatial database to preprocess the land surface characteristics: water divides, slopes, soils, land use, flow directions, flow accumulation, drainage network, etc.; (3) a modified distributed hydrological model to convert rainfall to runoff and route the flow through the stream network in order to predict the timing and severity of the flood wave; and (4) an open-access web interface to quickly disseminate flood alerts for potential decision-making. Retrospective simulations for 1998-2006 demonstrate that the Global Flood Monitor (GFM) system performs consistently at both station and catchment levels. The GFM website (experimental version) has been running in near real time in an effort to offer a cost-effective solution to the ultimate challenge of building natural disaster early warning systems for the data-sparse regions of the world. The interactive GFM website shows close-up maps of the flood risks overlaid on topography/population or integrated with the Google-Earth visualization tool. One additional capability, which extends forecast lead-time by assimilating quantitative precipitation forecasts (QPF) into the GFM, will also be implemented in the future.
Crowdsourcing detailed flood data
Walliman, Nicholas; Ogden, Ray; Amouzad, Shahrzhad
2015-04-01
Over the last decade the average annual loss across the European Union due to flooding has been 4.5bn Euros, but increasingly intense rainfall, as well as population growth, urbanisation and the rising costs of asset replacement, may see this rise to 23bn Euros a year by 2050. Equally disturbing are the profound social costs to individuals, families and communities, which in addition to loss of lives include: loss of livelihoods, decreased purchasing and production power, relocation and migration, adverse psychosocial effects, and hindrance of economic growth and development. Flood prediction, management and defence strategies rely on the availability of accurate information and flood modelling. Whilst automated data gathering (by measurement and satellite) of the extent of flooding is already advanced, it is least reliable in urban and physically complex geographies, where the need for precise estimation is often most acute. Crowdsourced data on actual flood events is a potentially critical component of this, allowing improved accuracy and helping to identify the effects of local landscape and topography, where the height of a simple kerb or a discontinuity in a boundary wall can have profound importance. Mobile 'App' based data acquisition using crowdsourcing in critical areas can combine camera records with GPS positional data and time, as well as descriptive data relating to the event. This will automatically produce a dataset, managed in ArcView GIS, with the potential for follow-up calls to get more information through structured scripts for each strand. Through this, local residents can provide highly detailed information that can be reflected in sophisticated flood protection models, be core to framing urban resilience strategies, and optimise the effectiveness of investment. This paper will describe this pioneering approach, which will develop flood event data in support of systems that advance existing approaches such as those developed in the UK
Increasing stress on disaster risk finance due to large floods
Jongman, Brenden; Hochrainer-Stigler, Stefan; Feyen, Luc; Aerts, Jeroen; Mechler, Reinhard; Botzen, Wouter; Bouwer, Laurens; Pflug, Georg; Rojas, Rodrigo; Ward, Philip
2014-05-01
Recent major flood disasters have shown that single extreme events can affect multiple countries simultaneously, which puts high pressure on trans-national risk reduction and risk transfer mechanisms. To date, little is known about such flood hazard interdependencies across regions, and the corresponding joint risks at regional to continental scales. Reliable information on correlated loss probabilities is crucial for developing robust insurance schemes and public adaptation funds, and for enhancing our understanding of climate change impacts. Here we show that extreme discharges are strongly correlated across European river basins and that these correlations can, or should, be used in national to continental scale risk assessment. We present probabilistic trends in continental flood risk, and demonstrate that currently observed extreme flood losses could more than double in frequency by 2050 under future climate change and socioeconomic development. The results demonstrate that accounting for tail dependencies leads to higher estimates of extreme losses than estimates based on the traditional assumption of independence between basins. We suggest that risk management for these increasing losses is largely feasible, and we demonstrate that risk can be shared by expanding risk transfer financing, reduced by investing in flood protection, or absorbed by enhanced solidarity between countries. We conclude that these measures have vastly different efficiency, equity and acceptability implications, which need to be taken into account in broader consultation, for which our analysis provides a basis.
Sibari, Hayat; Haida, Souad; Foutlane, Mohamed
2018-05-01
This work aims to estimate the contributions of the Inaouene River during floods. In this context, the dissolved and particulate matter flows were measured during the flood periods of the 1996/97 study year at the two hydrological stations Bab Marzouka (upstream) and El Kouchat (downstream). The specific flows of dissolved materials calculated upstream and downstream of the Inaouene watershed correspond to 257 t/km2/year and 117 t/km2/year, respectively. Chlorides represent 30% and 41%, respectively, of the total dissolved transport upstream and downstream. The potential mechanical degradation affecting the Inaouene watershed can deliver a solid load estimated at 6 × 10^6 t/year, corresponding to a specific flow of 2142 t/km2/year.
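The reported load and specific-flow figures are internally consistent, as a quick back-of-envelope check shows. A minimal sketch in Python; the watershed area below is implied by the abstract's numbers, not stated in it:

```python
# Area-normalised flux (specific flow) in t/km2/year.
def specific_flow(load_t_per_year, area_km2):
    return load_t_per_year / area_km2

# The abstract's 6e6 t/year and 2142 t/km2/year imply a drained area of
# roughly 2800 km2 (back-calculated, hypothetical value).
implied_area_km2 = 6e6 / 2142
print(round(implied_area_km2))                      # ~2801 km2
print(round(specific_flow(6e6, implied_area_km2)))  # 2142
```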
Towards a Flood Severity Index
Kettner, A.; Chong, A.; Prades, L.; Brakenridge, G. R.; Muir, S.; Amparore, A.; Slayback, D. A.; Poungprom, R.
2017-12-01
Flooding is the most common natural hazard worldwide, affecting 21 million people every year. In the immediate moments following a flood event, humanitarian actors like the World Food Program need to make rapid decisions (< 72 hrs) on how to prioritize the areas impacted by such an event. For other natural disasters like hurricanes/cyclones and earthquakes, there are industry-recognized standards on how the impacted areas are to be classified. Shake maps, quantifying peak ground motion, from, for example, the US Geological Survey are widely used for assessing earthquakes. Similarly, cyclones are tracked by the Joint Typhoon Warning Center (JTWC) and the Global Disaster Alert and Coordination System (GDACS), which release storm nodes and tracks (forecasted and actual) with wind buffers and classify the event according to the Saffir-Simpson Hurricane Wind Scale. For floods, the community is usually able to acquire unclassified data of the flood extent as identified from satellite imagery. Most often no water discharge hydrograph is available to classify the event into recurrence intervals, simply because there is no gauging station, or the gauging station was unable to record the maximum discharge due to overtopping or flood damage. So, the question remains: how do we methodically turn a flooded area into classified areas of different gradations of impact? Here, we present a first approach towards developing a globally applicable flood severity index. The flood severity index is set up such that it considers relatively easily obtainable physical parameters in a short period of time, such as flood frequency (relating the current flood to historical events) and magnitude, as well as land cover, slope, and, where available, pre-event simulated flood depth. The scale includes categories ranging from very minor flooding to catastrophic flooding. We test and evaluate the postulated classification scheme against a set of past flood events. Once a severity category is determined, socio
Magnitude of flood flows for selected annual exceedance probabilities for streams in Massachusetts
Zarriello, Phillip J.
2017-05-11
The U.S. Geological Survey, in cooperation with the Massachusetts Department of Transportation, determined the magnitude of flood flows at selected annual exceedance probabilities (AEPs) at streamgages in Massachusetts and from these data developed equations for estimating flood flows at ungaged locations in the State. Flood magnitudes were determined for the 50-, 20-, 10-, 4-, 2-, 1-, 0.5-, and 0.2-percent AEPs at 220 streamgages, 125 of which are in Massachusetts and 95 in the adjacent States of Connecticut, New Hampshire, New York, Rhode Island, and Vermont. AEP flood flows were computed for streamgages using the expected moments algorithm weighted with a recently computed regional skewness coefficient for New England. Regional regression equations were developed to estimate the magnitude of floods for selected AEP flows at ungaged sites from 199 selected streamgages and 60 potential explanatory basin characteristics. AEP flows for 21 of the 125 streamgages in Massachusetts were not used in the final regional regression analysis, primarily because of regulation or redundancy. The final regression equations used generalized least squares methods to account for streamgage record length and correlation. Drainage area, mean basin elevation, and basin storage explained 86 to 93 percent of the variance in flood magnitude from the 50- to 0.2-percent AEPs, respectively. The estimates of AEP flows at streamgages can be improved by using a weighted estimate that is based on the magnitude of the flood and the associated uncertainty from the at-site analysis and the regional regression equations. Weighting procedures for estimating AEP flows at an ungaged site on a gaged stream also are provided that improve estimates of flood flows at the ungaged site when hydrologic characteristics do not abruptly change. Urbanization expressed as the percentage of imperviousness provided some explanatory power in the regional regression; however, it was not statistically
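The regional-regression and weighting ideas above can be sketched as follows. This is an illustrative sketch only: the report fits generalized least squares equations to real basin characteristics, while the snippet uses ordinary least squares on synthetic data to show the log-linear form Q_p = a·A^b1·E^b2·S^b3 and the variance-weighted combination of at-site and regression estimates (all numbers invented):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50
logA = rng.uniform(0.0, 3.0, n)      # log10 drainage area (synthetic)
logE = rng.uniform(1.5, 3.0, n)      # log10 mean basin elevation (synthetic)
logS = rng.uniform(-2.0, 0.0, n)     # log10 basin storage fraction (synthetic)
# Synthetic "true" relation plus noise, in log10 units.
logQ = 1.0 + 0.8*logA - 0.2*logE - 0.3*logS + rng.normal(0, 0.05, n)

# Ordinary least squares in log space (the report uses GLS instead).
X = np.column_stack([np.ones(n), logA, logE, logS])
coef, *_ = np.linalg.lstsq(X, logQ, rcond=None)
print(np.round(coef, 2))             # close to [1.0, 0.8, -0.2, -0.3]

# Weighted estimate at a gaged site: combine at-site and regression
# estimates inversely to their variances (illustrative numbers).
q_site, var_site = 3.2, 0.04
q_reg,  var_reg  = 3.0, 0.02
w = var_reg / (var_site + var_reg)
q_weighted = w*q_site + (1 - w)*q_reg
print(round(q_weighted, 3))          # between the two, nearer the regression
```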
Yin, Yixing; Chen, Haishan; Xu, Chong-Yu; Xu, Wucheng; Chen, Changchun; Sun, Shanlei
2016-05-01
Regionalization methods, which "trade space for time" by pooling information from different locations in the frequency analysis, are efficient tools to enhance the reliability of extreme quantile estimates. This paper aims at improving the understanding of the regional frequency of extreme precipitation by using regionalization methods, and at providing scientific background and practical assistance in formulating regional development strategies for water resources management in one of the most developed and flood-prone regions in China, the Yangtze River Delta (YRD) region. To achieve the main goals, the L-moment-based index-flood (LMIF) method, one of the most popular regionalization methods, is used in the regional frequency analysis of extreme precipitation, with special attention paid to inter-site dependence and its influence on the accuracy of quantile estimates, which has not been considered by most studies using the LMIF method. Extensive data screening for stationarity, serial dependence, and inter-site dependence was carried out first. The entire YRD region was then categorized into four homogeneous regions through cluster analysis and homogeneity analysis. Based on goodness-of-fit statistics and L-moment ratio diagrams, the generalized extreme-value (GEV) and generalized normal (GNO) distributions were identified as the best-fitting distributions for most of the sub-regions, and estimated quantiles for each region were obtained. Monte Carlo simulation was used to evaluate the accuracy of the quantile estimates taking inter-site dependence into consideration. The results showed that the root-mean-square errors (RMSEs) were larger and the 90% error bounds were wider with inter-site dependence than without it, for both the regional growth curve and the quantile curve. The spatial patterns of extreme precipitation with a return period of 100 years were finally obtained, which indicated that there are two regions with highest precipitation
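The first step of an L-moment-based index-flood analysis, estimating sample L-moments at each site, can be sketched as below using Hosking-style unbiased probability-weighted moments. The data are synthetic annual maxima, not YRD precipitation records:

```python
import numpy as np

def sample_l_moments(x):
    """First two L-moments and the L-skewness ratio t3 of a sample,
    via unbiased probability-weighted moments b0, b1, b2."""
    x = np.sort(np.asarray(x, float))
    n = len(x)
    i = np.arange(n)                                   # 0-based ranks
    b0 = x.mean()
    b1 = np.sum(i * x) / (n * (n - 1))
    b2 = np.sum(i * (i - 1) * x) / (n * (n - 1) * (n - 2))
    l1 = b0                  # mean: the "index flood" at this site
    l2 = 2*b1 - b0           # L-scale
    l3 = 6*b2 - 6*b1 + b0
    return l1, l2, l3 / l2   # t3 = L-skewness ratio

# Synthetic annual maxima from a Gumbel distribution (loc=100, scale=20);
# for Gumbel the theoretical t3 is ln(9/8)/ln(2) ≈ 0.17.
rng = np.random.default_rng(1)
l1, l2, t3 = sample_l_moments(rng.gumbel(100, 20, size=5000))
print(round(l1, 1), round(l2, 1), round(t3, 2))
```

In the LMIF method these site statistics are then scaled by the index flood l1 and pooled regionally; t3 is one of the ratios compared against candidate distributions on the L-moment ratio diagram.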
Flood risk analysis procedure for nuclear power plants
International Nuclear Information System (INIS)
Wagner, D.P.
1982-01-01
This paper describes a methodology and procedure for determining the impact of floods on nuclear power plant risk. The procedures are based on techniques of fault tree and event tree analysis and use the logic of these techniques to determine the effects of a flood on system failure probability and accident sequence occurrence frequency. The methodology can be applied independently or as an add-on analysis for an existing risk assessment. Each stage of the analysis yields useful results such as the critical flood level, failure flood level, and the flood's contribution to accident sequence occurrence frequency. The results of applications show the effects of floods on the risk from nuclear power plants analyzed in the Reactor Safety Study
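The add-on logic can be illustrated with a toy cut-set calculation: below a flood's critical level an accident sequence keeps its nominal component failure probabilities, while above the failure flood level the flooded components are set to failed. This is a sketch of the idea only; all numbers are invented:

```python
# Frequency of an accident sequence = initiating event frequency times the
# product of the (independent, illustrative) component failure probabilities.
def sequence_freq(init_freq, p_components):
    p = 1.0
    for pc in p_components:
        p *= pc
    return init_freq * p

# No flood: nominal random-failure probabilities (hypothetical values).
base = sequence_freq(1e-2, [1e-3, 5e-2])
# Flood above the failure flood level: both components assumed failed (p = 1),
# so the sequence frequency collapses to the flood-initiator frequency.
flooded = sequence_freq(1e-2, [1.0, 1.0])
print(base, flooded)   # the flood raises the sequence frequency by ~4 orders
```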
Current and future pluvial flood hazard analysis for the city of Antwerp
Willems, Patrick; Tabari, Hossein; De Niel, Jan; Van Uytven, Els; Lambrechts, Griet; Wellens, Geert
2016-04-01
For the city of Antwerp in Belgium, higher rainfall extremes were observed in comparison with surrounding areas. The differences were found statistically significant for some areas and may be the result of the heat island effect in combination with higher concentrations of aerosols. A network of 19 rain gauges with varying record lengths (the longest since the 1960s) and 10 years of continuous radar data were combined to map the spatial variability of rainfall extremes over the city, at durations from 15 minutes to 1 day, together with the uncertainty. The improved spatial rainfall information was used as input in the sewer system model of the city to analyze the frequency of urban pluvial floods. Comparison with historical flood observations from various sources (fire brigade and media) confirmed that the improved spatial rainfall information also improved the sewer impact results on both the magnitude and frequency of sewer floods. Next to these improved urban flood impact results for recent and current climatological conditions, the new insights into the local rainfall microclimate were also helpful to enhance future projections of rainfall extremes and pluvial floods in the city. This was done by improved statistical downscaling of all available CMIP5 global climate model runs (160 runs) for the 4 RCP scenarios, as well as the available EURO-CORDEX regional climate model runs. Two types of statistical downscaling methods were applied for that purpose (a weather-typing based method and a quantile perturbation approach), making use of the microclimate results and their dependency on specific weather types. Changes in extreme rainfall intensities were analyzed and mapped as a function of the RCP scenario, together with the uncertainty, decomposed into the uncertainties related to the climate models, the climate model initialization or limited length of the 30-year time series (natural climate variability), and the statistical downscaling (albeit limited
Tracing the value of data for flood loss modelling
Directory of Open Access Journals (Sweden)
Schröter Kai
2016-01-01
Flood loss modelling is associated with considerable uncertainty. If the prediction uncertainty of flood loss models is large, the reliability of model outcomes is questionable, which challenges their practical usefulness. A key problem in flood loss estimation is the transfer of models to geographical regions and to flood events that may differ from the ones used for model development. Variations in local characteristics and continuous system changes require regional adjustments and continuous updating with current evidence. However, acquiring data on damage-influencing factors is usually very costly. Therefore, it is of relevance to assess the value of additional data in terms of model performance improvement. We use empirical flood loss data on direct damage to residential buildings, available from computer-aided telephone interviews that were compiled after major floods in Germany. This unique database allows us to trace the changes in predictive model performance by incrementally extending the database used to derive flood loss models. Two models are considered: a uni-variable stage-damage function and RF-FLEMO, a multi-variable probabilistic model approach using Random Forests. Additional data are useful to improve model predictive performance and increase model reliability; however, the gains also seem to depend on the model approach.
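The simpler of the two model types, a uni-variable stage-damage function, can be sketched as below. The square-root functional form and all data are illustrative assumptions, not the paper's fitted model or its interview records:

```python
import numpy as np

# Synthetic loss records: relative building loss vs. water depth, generated
# from an assumed loss = 0.27*sqrt(depth) relation plus noise, clipped to [0, 1].
rng = np.random.default_rng(2)
depth = rng.uniform(0.1, 3.0, 200)                 # water depth above ground (m)
loss = np.clip(0.27*np.sqrt(depth) + rng.normal(0, 0.03, 200), 0, 1)

# Least-squares slope for the one-parameter model loss = a*sqrt(depth):
# a = sum(loss*sqrt(d)) / sum(d) minimizes the squared residuals.
a = np.sum(loss * np.sqrt(depth)) / np.sum(depth)
predict = lambda d: np.clip(a * np.sqrt(d), 0, 1)  # fitted stage-damage curve
print(round(a, 2))                                 # recovers roughly 0.27
```

Extending the synthetic dataset and refitting would mimic the paper's experiment of tracing predictive performance against database size.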
DEFF Research Database (Denmark)
Madsen, Henrik; Rosbjerg, Dan
1997-01-01
A regional estimation procedure that combines the index-flood concept with an empirical Bayes method for inferring regional information is introduced. The model is based on the partial duration series approach with generalized Pareto (GP) distributed exceedances. The prior information of the model parameters is inferred from regional data using generalized least squares (GLS) regression. Two different Bayesian T-year event estimators are introduced: a linear estimator that requires only some moments of the prior distributions to be specified and a parametric estimator that is based on specified…
Accurate Estimation of Low Fundamental Frequencies from Real-Valued Measurements
DEFF Research Database (Denmark)
Christensen, Mads Græsbøll
2013-01-01
In this paper, the difficult problem of estimating low fundamental frequencies from real-valued measurements is addressed. The methods commonly employed do not take the phenomena encountered in this scenario into account and thus fail to deliver accurate estimates. The reason for this is that they employ asymptotic approximations that are violated when the harmonics are not well-separated in frequency, something that happens when the observed signal is real-valued and the fundamental frequency is low. To mitigate this, we analyze the problem and present some exact fundamental frequency estimators…
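The setting described above can be explored with a simple grid-search nonlinear least-squares (harmonic model fitting) estimator on a synthetic real-valued signal with a low fundamental. This is a generic sketch, not the exact estimators derived in the paper:

```python
import numpy as np

# Real-valued test signal: 3 harmonics of a low 60 Hz fundamental plus noise.
fs, n = 8000, 2048
t = np.arange(n) / fs
f0_true = 60.0
x = sum(np.cos(2*np.pi*f0_true*h*t + 0.3*h) for h in range(1, 4))
x = x + np.random.default_rng(3).normal(0, 0.1, n)

def nls_cost(f0, x, t, n_harm=3):
    """Energy captured by projecting x onto the real harmonic basis for f0."""
    Z = np.column_stack([fn(2*np.pi*f0*h*t)
                         for h in range(1, n_harm + 1)
                         for fn in (np.cos, np.sin)])
    amp, *_ = np.linalg.lstsq(Z, x, rcond=None)
    return np.sum((Z @ amp)**2)

# Grid search over candidate fundamentals; the maximizer is the NLS estimate.
grid = np.arange(40.0, 120.0, 0.25)
f0_hat = grid[np.argmax([nls_cost(f, x, t) for f in grid])]
print(f0_hat)
```

Because the fit uses the exact finite-length harmonic basis rather than asymptotic (periodogram-style) approximations, it remains accurate even when neighboring harmonics overlap spectrally at low f0.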
Improved recovery potential in mature heavy oil fields by Alkali-surfactant flooding
Energy Technology Data Exchange (ETDEWEB)
Bryan, J.; Kantzas, A. [Calgary Univ., AB (Canada). Tomographic Imaging and Porous Media Laboratory
2008-10-15
Primary and secondary alkali-surfactant (AS) chemical flooding techniques were optimized in this study. Core flooding experiments were conducted in order to investigate the formation of emulsions in bulk liquid systems due to flow through rock pores. Cores were dried and then saturated with water or brine in order to measure permeability. Waterfloods were then performed at various injection rates, followed by injection of the AS solution. Solutions were also injected without previous waterflooding. Individual oil and water mobilities were then calculated from the experimental data, using the total pressure gradient measured across the core. Nuclear magnetic resonance (NMR) studies were conducted in order to determine emulsion formation within the porous media from in situ flooding tests at 4 different locations. Data from the NMR studies were used to calculate fluid distributions and measurements of in situ emulsification during the chemical floods. The study demonstrated that the use of the surfactants resulted in the in situ formation of oil-in-water and water-in-oil emulsions. Responses from de-ionized alkali and brine AS systems were similar. The recovery mechanism blocked off water channels and provided improved sweep efficiency in the core. It was concluded that injection rates and pressure gradients for chemical floods should be lowered in order to optimize their efficiency. 26 refs., 6 tabs., 15 figs.
Dvory, N. Z.; Ronen, A.; Livshitz, Y.; Adar, E.; Kuznetsov, M.; Yakirevich, A.
2017-12-01
Sustainable groundwater production from karstic aquifers is primarily dictated by the recharge rate. Therefore, in order to limit over-exploitation, it is essential to accurately quantify groundwater recharge. Infiltration during erratic floods in karstic basins may contribute a substantial amount to aquifer recharge. However, the complicated nature of karst systems, which are characterized in part by multiple springs, sinkholes, and losing/gaining streams, presents a large obstacle to accurately assessing the actual contribution of flood water to groundwater recharge. In this study, we aim to quantify the proportion of groundwater recharge during flood events relative to the annual recharge for karst aquifers. The role of karst conduits in flash flood infiltration was examined during four flood and artificial runoff events in the Sorek creek near Jerusalem, Israel. The events were monitored in short time steps (four minutes). This high-resolution analysis is essential for accurately estimating surface flow volumes, which are of particular importance in arid and semi-arid climates where ephemeral flows may provide a substantial contribution to groundwater reservoirs. For the present investigation, we distinguished between direct infiltration, percolation through karst conduits, and diffuse infiltration, which is most affected by evapotranspiration. A water balance was then calculated for the 2014/15 hydrologic year using the Hydrologic Engineering Center - Hydrologic Modelling System (HEC-HMS). Simulations show that an additional 8% to 24% of the annual recharge volume is added from runoff losses along the creek that infiltrate through the karst system into the aquifer. The results improve the understanding of recharge processes and support the use of the proposed methodology for quantifying groundwater recharge.
The influence of hydroclimatic variability on flood frequency in the Lower Rhine
Toonen, W.H.J.; Middelkoop, H.; Konijnendijk, T.Y.M.; Macklin, M.G.; Cohen, K.M.
Climate change is expected to significantly affect flooding regimes of river systems in the future. For Western Europe, flood risk assessments generally assume an increase in extreme events and flood risk, and as a result major investments are planned to reduce their impacts. However, flood risk
Multi-temporal clustering of continental floods and associated atmospheric circulations
Liu, Jianyu; Zhang, Yongqiang
2017-12-01
Investigating the clustering of floods has important social, economic and ecological implications. This study examines the clustering of Australian floods at different temporal scales and its possible physical mechanisms. Flood series with different severities are obtained by peaks-over-threshold (POT) sampling with four flood thresholds. At the intra-annual scale, Cox regression and monthly frequency methods are used to examine whether and when flood clustering exists, respectively. At the inter-annual scale, dispersion indices computed over four time windows are applied to investigate inter-annual flood clustering and its variation. Furthermore, the kernel occurrence rate estimate and bootstrap resampling methods are used to identify flood-rich/flood-poor periods. Finally, the seasonal variation of horizontal wind at 850 hPa and vertical wind velocity at 500 hPa is used to investigate the possible mechanisms causing the temporal flood clustering. Our results show that: (1) flood occurrences exhibit clustering at the intra-annual scale, regulated by climate indices representing the impacts of the Pacific and Indian Oceans; (2) the flood-rich months occur from January to March over northern Australia, and from July to September over southwestern and southeastern Australia; (3) stronger inter-annual clustering takes place across southern Australia than northern Australia; and (4) Australian floods are characterised by regional flood-rich and flood-poor periods, with 1987-1992 identified as a flood-rich period across southern Australia but a flood-poor period across northern Australia, and 2001-2006 being a flood-poor period across most regions of Australia. The intra-annual and inter-annual clustering and temporal variation of flood occurrences are in accordance with the variation of atmospheric circulation. These results provide relevant information for flood management under the influence of climate variability and, therefore, are helpful for developing
Future Nuisance Flooding at Boston Caused by Astronomical Tides Alone
Ray, Richard D.; Foster, Grant
2016-01-01
Sea level rise necessarily triggers more occurrences of minor, or nuisance, flooding events along coastlines, a fact well documented in recent studies. At some locations nuisance flooding can be brought about merely by high spring tides, independent of storms, winds, or other atmospheric conditions. Analysis of observed water levels at Boston indicates that tidal flooding began to occur there in 2011 and will become more frequent in subsequent years. A compilation of all predicted nuisance-flooding events, induced by astronomical tides alone, is presented through year 2050. The accuracy of the tide prediction is improved when several unusual properties of Gulf of Maine tides, including secular changes, are properly accounted for. Future mean sea-level rise at Boston cannot be predicted with comparable confidence, so two very different climate scenarios are adopted; both predict a large increase in the frequency and the magnitude of tidal flooding events.
Chao, Y.; Cheng, C. T.; Hsiao, Y. H.; Hsu, C. T.; Yeh, K. C.; Liu, P. L.
2017-12-01
On average, 5.3 typhoons hit Taiwan per year over the last decade. Typhoon Morakot in 2009, the most severe, caused huge damage in Taiwan, including 677 casualties and roughly NT$110 billion (3.3 billion USD) in economic loss. Some studies have documented that typhoon frequency will decrease but intensity will increase in the western North Pacific region. High-resolution dynamical models are usually preferred for projecting extreme events, because coarse-resolution models cannot simulate intense extremes. Under that consideration, dynamically downscaled climate data were chosen to describe typhoons satisfactorily; this research used the simulation data from the AGCM of the Meteorological Research Institute (MRI-AGCM). Because dynamical downscaling methods consume massive computing power, and typhoon numbers are very limited in a single model simulation, using dynamically downscaled data can introduce uncertainty into disaster risk assessment. In order to mitigate this problem, this research used four sea surface temperatures (SSTs) to increase the number of climate change scenarios under RCP 8.5. In this way, the MRI-AGCMs project 191 extreme typhoons in Taiwan (when the typhoon center touches the 300 km sea area of Taiwan) in the late 21st century. SOBEK, a two-dimensional flood simulation model, was used to assess the flood risk under the four SST climate change scenarios in Tainan, Taiwan. The results show that the uncertainty of future flood risk assessment is significantly decreased for Tainan, Taiwan in the late 21st century. Four SSTs can efficiently alleviate the problem of limited typhoon numbers in a single model simulation.
Joint channel/frequency offset estimation and correction for coherent optical FBMC/OQAM system
Wang, Daobin; Yuan, Lihua; Lei, Jingli; wu, Gang; Li, Suoping; Ding, Runqi; Wang, Dongye
2017-12-01
In this paper, we focus on the analysis of preamble-based joint estimation of the channel and the laser frequency offset (LFO) in coherent optical filter bank multicarrier systems with offset quadrature amplitude modulation (CO-FBMC/OQAM). In order to reduce the impact of noise on the estimation accuracy, we propose an estimation method based on inter-frame averaging. This method averages the cross-correlation function of real-valued pilots within multiple FBMC frames. The laser frequency offset is estimated from the phase of this average. After correcting the LFO, the final channel response is acquired by averaging the channel estimation results over multiple frames. The principle of the proposed method is analyzed theoretically, and the preamble structure is thoroughly designed and optimized to suppress the impact of inherent imaginary interference (IMI). The effectiveness of our method is demonstrated numerically for different fibers and LFO values. The obtained results show that the proposed method can improve transmission performance significantly.
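The phase-based estimation idea can be illustrated with a generic toy model (not the paper's FBMC/OQAM preamble design): a frequency offset rotates repeated pilot samples by a fixed phase per lag, so averaging conjugate products over many frames and taking the angle of the average recovers the offset. All parameter values below are invented:

```python
import numpy as np

fs = 1e6          # sample rate, Hz (illustrative)
lag = 64          # sample lag between correlated pilot samples
f_off = 1200.0    # true frequency offset, Hz
rng = np.random.default_rng(4)

acc = 0j
for _ in range(50):                                  # average over 50 "frames"
    pilot = np.exp(1j*rng.uniform(0, 2*np.pi, 256))  # known unit-modulus pilot
    # Received pilot: rotated by the offset, plus complex noise.
    rx = pilot * np.exp(2j*np.pi*f_off*np.arange(256)/fs)
    rx = rx + 0.05*(rng.normal(size=256) + 1j*rng.normal(size=256))
    # Conjugate product at the chosen lag, with the pilot modulation removed;
    # each term carries the constant phase 2*pi*f_off*lag/fs.
    acc += np.sum(rx[lag:] * np.conj(rx[:-lag])
                  * np.conj(pilot[lag:]) * pilot[:-lag])

f_hat = np.angle(acc) * fs / (2*np.pi*lag)
print(round(f_hat, 1))   # close to the true 1200 Hz
```

Averaging before taking the angle (rather than averaging per-frame angles) is what suppresses the noise, which is the role the inter-frame averaging plays in the proposed method.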
Deciphering flood frequency curves from a coupled human-nature system perspective
Li, H. Y.; Abeshu, G. W.; Wang, W.; Ye, S.; Guo, J.; Bloeschl, G.; Leung, L. R.
2017-12-01
Most previous studies and applications in deriving or applying flood frequency curves (FFCs) are underpinned by the stationarity assumption. To examine the theoretical robustness of this basic assumption, we analyzed the observed FFCs at hundreds of catchments in the contiguous United States along gradients of climate conditions and human influences. The shape of FFCs is described using three similarity indices: the mean annual flood (MAF), the coefficient of variation (CV), and a seasonality index defined using circular statistics. The characteristics of catchments are quantified with a small number of dimensionless indices, including in particular: 1) the climatic aridity index, AI, which is a measure of the competition between energy and water availability; and 2) a reservoir impact index, defined as the total upstream reservoir storage capacity normalized by the annual streamflow volume. The linkages between these two sets of indices are then explored based on a combination of mathematical derivations from the Budyko formula, simple but physically based reservoir operation models, and other auxiliary data. It is found that the shape of FFCs shifts from arid to humid climates, and from periods with weak human influences to periods with strong influences. The seasonality of floods is found to be largely controlled by the synchronization between the seasonal cycles of precipitation and solar radiation in pristine catchments, but also by the reservoir regulation capacity in managed catchments. Our findings may help improve flood-risk assessment and mitigation in both natural and regulated river systems across various climate gradients.
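A seasonality index of the circular-statistics kind mentioned above might be computed as below: each flood date becomes an angle on the annual circle, and the mean resultant length R measures how concentrated flood timing is (R = 1 when all floods fall on the same day). The exact definition in the paper may differ, and the flood dates are synthetic:

```python
import numpy as np

def seasonality(day_of_year, year_len=365.25):
    """Circular mean flood date and mean resultant length R."""
    theta = 2*np.pi*np.asarray(day_of_year, float)/year_len
    C, S = np.cos(theta).mean(), np.sin(theta).mean()
    R = np.hypot(C, S)                                   # strength of seasonality
    mean_day = (np.arctan2(S, C) % (2*np.pi)) * year_len / (2*np.pi)
    return mean_day, R

days = [152, 160, 149, 171, 158, 166]   # hypothetical June-ish flood dates
mean_day, R = seasonality(days)
print(round(mean_day), round(R, 2))     # a tightly clustered early-June regime
```

Working on the circle avoids the wrap-around problem of averaging calendar dates directly, e.g. floods in late December and early January average correctly to around New Year rather than to mid-summer.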
Fast and Statistically Efficient Fundamental Frequency Estimation
DEFF Research Database (Denmark)
Nielsen, Jesper Kjær; Jensen, Tobias Lindstrøm; Jensen, Jesper Rindom
2016-01-01
Fundamental frequency estimation is a very important task in many applications involving periodic signals. For computational reasons, fast autocorrelation-based estimation methods are often used despite parametric estimation methods having superior estimation accuracy. However, these parametric … a recursive solver. Via benchmarks, we demonstrate that the computation time is reduced by approximately two orders of magnitude. The proposed fast algorithm is available for download online.
Temporal clustering of floods in Germany: Do flood-rich and flood-poor periods exist?
Merz, Bruno; Nguyen, Viet Dung; Vorogushyn, Sergiy
2016-10-01
The repeated occurrence of exceptional floods within a few years, such as the Rhine floods in 1993 and 1995 and the Elbe and Danube floods in 2002 and 2013, suggests that floods in Central Europe may be organized in flood-rich and flood-poor periods. This hypothesis is studied by testing the significance of temporal clustering in flood occurrence (peak-over-threshold) time series for 68 catchments across Germany for the period 1932-2005. To assess the robustness of the results, different methods are used: Firstly, the index of dispersion, which quantifies the departure from a homogeneous Poisson process, is investigated. Further, the time-variation of the flood occurrence rate is derived by non-parametric kernel implementation and the significance of clustering is evaluated via parametric and non-parametric tests. Although the methods give consistent overall results, the specific results differ considerably. Hence, we recommend applying different methods when investigating flood clustering. For flood estimation and risk management, it is of relevance to understand whether clustering changes with flood severity and time scale. To this end, clustering is assessed for different thresholds and time scales. It is found that the majority of catchments show temporal clustering at the 5% significance level for low thresholds and time scales of one to a few years. However, clustering decreases substantially with increasing threshold and time scale. We hypothesize that flood clustering in Germany is mainly caused by catchment memory effects along with intra- to inter-annual climate variability, and that decadal climate variability plays a minor role.
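The index of dispersion used as the first clustering test is simply the variance-to-mean ratio of event counts; a sketch, assuming yearly counting windows:

```python
import numpy as np

def index_of_dispersion(annual_counts):
    """Variance-to-mean ratio of annual flood counts.

    Equals 1 in expectation for a homogeneous Poisson process; values
    above 1 indicate temporal clustering, i.e. flood-rich and
    flood-poor periods.
    """
    c = np.asarray(annual_counts, dtype=float)
    return c.var(ddof=1) / c.mean()
```

A formal test would compare the observed ratio against the sampling distribution of the ratio under the Poisson null, as the study does alongside its kernel-based analysis.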
Ermolieva, T; Filatova, T; Ermoliev, Y; Obersteiner, M; de Bruijn, K M; Jeuken, A
2017-01-01
As flood risks grow worldwide, a well-designed insurance program engaging various stakeholders becomes a vital instrument in flood risk management. The main challenge concerns the applicability of standard approaches for calculating insurance premiums of rare catastrophic losses. This article focuses on the design of a flood-loss-sharing program involving private insurance based on location-specific exposures. The analysis is guided by an integrated catastrophe risk management (ICRM) model developed for this purpose, consisting of a GIS-based flood model and a stochastic optimization procedure with respect to location-specific risk exposures. To achieve stability and robustness of the program toward floods of various recurrences, the ICRM uses a stochastic optimization procedure that relies on quantile-related risk functions of a systemic insolvency involving overpayments and underpayments of the stakeholders. Two alternative ways of calculating insurance premiums are compared: the robust premiums derived with the ICRM and the traditional average annual loss approach. The applicability of the proposed model is illustrated in a case study of a Rotterdam area outside the main flood protection system in the Netherlands. Our numerical experiments demonstrate essential advantages of the robust premiums, namely, that they: (1) guarantee the program's solvency under all relevant flood scenarios rather than one average event; (2) establish a tradeoff between the security of the program and the welfare of locations; and (3) decrease the need for other risk transfer and risk reduction measures. © 2016 Society for Risk Analysis.
Directory of Open Access Journals (Sweden)
T. Okada
2011-12-01
Flooding on the Tone River, which drains the largest catchment area in Japan and is now home to 12 million people, poses significant risk to the Greater Tokyo Area. In April 2010, an expert panel in Japan, the Central Disaster Prevention Council, examined the potential for large-scale flooding and outlined possible mitigation measures in the Greater Tokyo Area. One of the scenarios considered closely mimics the pattern of flooding that occurred with the passage of Typhoon Kathleen in 1947 and would potentially flood some 680 000 households above floor level. Building upon that report, this study presents a Geographical Information System (GIS)-based data integration approach to estimate the insurance losses for residential buildings and contents as just one component of the potential financial cost. Using a range of publicly available data – census information, location reference data, insurance market information and flood water elevation data – this analysis finds that insurance losses for residential property alone could reach approximately 1 trillion JPY (US$ 12.5 billion). Total insurance losses, including commercial and industrial lines of business, are likely to be at least double this figure, with total economic costs greater still. The results are sensitive to the assumed flood scenario, the position of levee failures, local flood depths and extents, population and building heights. The Average Recurrence Interval (ARI) of the rainfall following Typhoon Kathleen has been estimated to be on the order of 200 yr; however, at this juncture it is not possible to assign an ARI to the modelled loss, since we cannot know the relative or joint probability of the different flooding scenarios. It is possible that more than one of these scenarios could occur simultaneously, or that levee failure at one point might lower water levels downstream and prevent failures at all other points. In addition to insurance applications, spatial analyses like
Flood Disaster Mitigation as Revealed by Cawang-Manggarai River Improvement of Ciliwung River
Directory of Open Access Journals (Sweden)
Airlangga Mardjono
2015-06-01
The final result of this simulation shows that Scenario 3 gives the lowest water surface elevation profile. Scenario 3 combines river normalization, revetment works along the river, and flood control structure improvement through an additional sluice gate on the Manggarai Barrage. This scenario yields maximum water level reductions of 167 cm, 163 cm, 172 cm, 179 cm, 167 cm and 171 cm (17.60%, 17.16%, 18.09%, 18.76%, 17.38% and 17.72%, respectively) over cross sections S 20 to S 25, for simulations with a 100-year design discharge. Keywords: Simulation, river improvement, flood water surface elevation.
The Impact of Changing Storage Area on Flood Magnitude and Occurrence
Directory of Open Access Journals (Sweden)
Kusumastuti D.I.
2012-01-01
This study focuses on the impact of combined catchment and storage characteristics upon flood occurrences and flood peaks. A key factor is the ratio of contributing catchment area to storage area (AC/AS): the frequency of storage overflow and the magnitude of flood peaks increase with increasing AC/AS. The case studies examined in this work, the Way Pegadungan (Lampung, Sumatra) and Nagara River (South Kalimantan) catchments, show similar behaviour. Swamps located along the downstream reaches of the Way Pegadungan and the Nagara River act as storages during flood events. The dyke that was planned to be built would increase the AC/AS ratio significantly as the storage area is reduced considerably, with a correspondingly large increase in flood peaks. The improved understanding of these process controls will be useful in assisting the management of such catchments, particularly in flood prevention and mitigation.
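The role of the AC/AS ratio can be illustrated with a toy daily water balance for a storage fed by catchment runoff (all parameter values below are invented for illustration; this is not the study's model):

```python
import numpy as np

def overflow_days(rain, ac_over_as, storage_depth=2.0, drain_rate=0.05):
    """Count days a flood storage overflows for a given AC/AS ratio.

    rain: daily rainfall depth (m) over the catchment; the inflow
    depth over the storage scales linearly with ac_over_as.
    storage_depth is the capacity expressed as a depth over the
    storage area (m); drain_rate is the fraction of stored water
    released per day. All numbers are illustrative.
    """
    level, overflows = 0.0, 0
    for r in rain:
        level += r * ac_over_as        # runoff concentrated on storage
        level *= (1.0 - drain_rate)    # slow release / evaporation
        if level > storage_depth:      # excess spills: a flood day
            overflows += 1
            level = storage_depth
    return overflows
```

Running the same rainfall series with a larger AC/AS (e.g. after diking off part of the swamp storage) produces more overflow days, the qualitative behaviour the abstract describes.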
Swift delineation of flood-prone areas over large European regions
Tavares da Costa, Ricardo; Castellarin, Attilio; Manfreda, Salvatore; Samela, Caterina; Domeneghetti, Alessio; Mazzoli, Paolo; Luzzi, Valerio; Bagli, Stefano
2017-04-01
According to the European Environment Agency (EEA Report No 1/2016), a significant share of the European population is estimated to be living on or near a floodplain, with Italy having the highest population density in flood-prone areas among the countries analysed. This tendency, tied with event frequency and magnitude (e.g., the 24 November 2016 floods in Italy) and the fact that river floods may occur at large scales and at a transboundary level, where data is often sparse, presents a challenge in flood-risk management. The availability of consistent flood hazard and risk maps during the prevention, preparedness, response and recovery phases is a valuable and important step forward in improving the effectiveness, efficiency and robustness of evidence-based decision making. The present work aims at testing and discussing the usefulness of pattern recognition techniques based on geomorphologic indices (Manfreda et al., J. Hydrol. Eng., 2011; Degiorgis et al., J. Hydrol., 2012; Samela et al., J. Hydrol. Eng., 2015) for the simplified mapping of river flood-prone areas at large scales. The techniques are applied to 25 m Digital Elevation Models (DEMs) of the Danube, Po and Severn river watersheds, obtained from the Copernicus data and information funded by the European Union (EU-DEM layers). Results are compared to the Pan-European flood hazard maps derived by Alfieri et al. (Hydrol. Proc., 2013) using a set of distributed hydrological models (LISFLOOD, van der Knijff et al., Int. J. Geogr. Inf. Sci., 2010, employed within the European Flood Awareness System, www.efas.eu) and hydraulic models (LISFLOOD-FP, Bates and De Roo, J. Hydrol., 2000). Our study presents different calibration and cross-validation exercises of the DEM-based mapping algorithms to assess to which extent, and with which accuracy, they can be reproduced over different regions of Europe. This work is being developed under the System-Risk project (www.system-risk.eu), which received funding from the European Union
Directory of Open Access Journals (Sweden)
Liu KJ Ray
2002-01-01
Orthogonal frequency division multiplexing (OFDM) is an effective technique for future 3G communications because of its great immunity to impulse noise and intersymbol interference. Channel estimation is a crucial aspect of the design of OFDM systems. In this work, we propose a channel estimation algorithm based on a time-frequency polynomial model of the fading multipath channels. The algorithm exploits the correlation of the channel responses in both the time and frequency domains and hence reduces more noise than methods using only a time or frequency polynomial model. The estimator is also more robust than existing methods based on the Fourier transform. Simulations show an improvement in terms of mean-squared estimation error under some practical channel conditions. The algorithm needs little prior knowledge about the delay and fading properties of the channel, can be implemented recursively, and can adjust itself to follow the variation of the channel statistics.
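A one-dimensional simplification of the polynomial-model idea (frequency direction only; the paper's estimator is a joint time-frequency model) is to fit low-order polynomials to noisy per-subcarrier channel estimates:

```python
import numpy as np

def polynomial_channel_estimate(raw_est, degree=3):
    """Denoise per-subcarrier channel estimates with polynomial fits.

    Fits low-order polynomials to the real and imaginary parts across
    subcarriers, exploiting the smoothness (correlation) of the channel
    frequency response to average out estimation noise.
    """
    k = np.arange(len(raw_est))
    fit_re = np.polyval(np.polyfit(k, np.real(raw_est), degree), k)
    fit_im = np.polyval(np.polyfit(k, np.imag(raw_est), degree), k)
    return fit_re + 1j * fit_im
```

Because the fit has only degree + 1 free coefficients per component, the noise variance of the smoothed estimate is roughly (degree + 1)/N of the raw per-subcarrier estimate, at the price of a small model bias.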
Kojima, Yohei; Takeda, Kazuaki; Adachi, Fumiyuki
Frequency-domain equalization (FDE) based on the minimum mean square error (MMSE) criterion can provide better downlink bit error rate (BER) performance for direct sequence code division multiple access (DS-CDMA) than conventional rake combining in a frequency-selective fading channel. FDE requires accurate channel estimation. In this paper, we propose a new 2-step maximum likelihood channel estimation (MLCE) for DS-CDMA with FDE in a very slow frequency-selective fading environment. The 1st step uses the conventional pilot-assisted MMSE-CE and the 2nd step carries out the MLCE using decision feedback from the 1st step. The BER performance improvement achieved by 2-step MLCE over pilot-assisted MMSE-CE is confirmed by computer simulation.
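The MMSE-FDE step referred to above amounts, per frequency bin, to the familiar weight W_k = conj(H_k) / (|H_k|^2 + noise variance); a sketch under the assumption of known channel coefficients (the paper's contribution is estimating them):

```python
import numpy as np

def mmse_fde(received_freq, channel_est, noise_var):
    """Equalize one frequency-domain block with MMSE weights.

    W_k = conj(H_k) / (|H_k|^2 + noise_var): a compromise between the
    zero-forcing inverse (which enhances noise in deep fades) and a
    matched filter (which leaves residual interference).
    """
    w = np.conj(channel_est) / (np.abs(channel_est) ** 2 + noise_var)
    return w * received_freq
```

As noise_var tends to zero the weights reduce to the zero-forcing inverse 1/H_k, which is why accurate channel and noise estimates matter for BER.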
State-of-the-art for evaluating the potential impact of flooding on a radioactive waste repository
International Nuclear Information System (INIS)
1980-01-01
This report is a review of the state-of-the-art for evaluating the potential impact of flooding on a deep radioactive-waste repository, namely, for predicting the future occurrence of catastrophic flooding and for estimating the effect of such flooding on waste containment characteristics. Several detrimental effects are identified: flooding can increase groundwater seepage velocities through a repository within the framework of the existing hydrologic system and thus increase the rate of radioactive-waste leakage to the biosphere; flooding may alter repository hydrology by reversing flow gradients, relocating sources of groundwater recharge and discharge, or shortening seepage paths, thereby producing unpredictable leakage; saturation of a vadose-zone repository during flooding can increase groundwater seepage velocities by several orders of magnitude; and flooding can damage repository-media containment properties by inducing seismic or chemical instability or increasing fracture permeability in relatively shallow repository rock as a result of redistributing in-situ stresses. Short-term flooding frequency and magnitude can be predicted statistically by analyzing historical records of flooding. However, long-term flooding events that could damage a permanent repository cannot be predicted with confidence because the geologic record is neither unique nor sufficiently complete for statistical analysis. It is more important to identify parameters characterizing containment properties (such as permeability, groundwater gradient, and shortest seepage path length to the biosphere) that could be affected by future flooding, estimate the maximum magnitude of flooding that could occur within the life of the repository by examining the geologic record, and determine the impact such flooding could have on the parameter values.
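The seepage-velocity effect can be made concrete with Darcy's law: pore velocity v = K i / n, so a flood-induced steepening of the hydraulic gradient shortens the advective travel time along the seepage path proportionally (the values in the usage note are illustrative, not from the report):

```python
def seepage_travel_time(path_length_m, hydraulic_gradient,
                        conductivity_m_per_s, porosity):
    """Advective travel time along a seepage path via Darcy's law.

    Pore (seepage) velocity v = K * i / n; travel time t = L / v.
    A back-of-the-envelope sketch only; real repository assessments
    involve far more than this single relation.
    """
    v = conductivity_m_per_s * hydraulic_gradient / porosity
    return path_length_m / v
```

For example, steepening the gradient from 0.01 to 0.05 over a 500 m path cuts the travel time by the same factor of five, regardless of K and n.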
Carrier Frequency Offset Estimation and I/Q Imbalance Compensation for OFDM Systems
Directory of Open Access Journals (Sweden)
M. Omair Ahmad
2007-01-01
Two types of radio-frequency front-end imperfections, namely carrier frequency offset and inphase/quadrature (I/Q) imbalance, are considered for orthogonal frequency division multiplexing (OFDM) communication systems. A preamble-assisted carrier frequency estimator is proposed along with an I/Q imbalance compensation scheme. The new frequency estimator reveals the relationship between the inphase and quadrature components of the received preamble and extracts the frequency offset from the phase shift caused by the frequency offset and the cross-talk interference due to the I/Q imbalance. The proposed frequency estimation algorithm is fast, efficient, and robust to I/Q imbalance. An I/Q imbalance estimation/compensation algorithm is also presented by solving a least-squares problem formulated using the same preamble as employed for the frequency offset estimation. The computational complexity of the I/Q estimation scheme is further reduced by using part of the short symbols with only a small sacrifice in estimation accuracy. Computer simulations and comparisons with some existing algorithms are conducted, showing the effectiveness of the proposed method.
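A common preamble-based CFO estimator (Schmidl-Cox style, used here only as a stand-in; the paper's estimator additionally handles the I/Q cross-talk term) correlates two identical transmitted preamble halves and reads the offset from the phase of the result:

```python
import numpy as np

def estimate_cfo(rx, half_len, fs):
    """Carrier frequency offset from a twice-repeated preamble half.

    With two identical transmitted halves of length half_len, the CFO
    rotates the second by 2*pi*f_off*half_len/fs relative to the
    first; the angle of their correlation recovers f_off
    (unambiguously for |f_off| < fs / (2 * half_len)).
    """
    z = np.vdot(rx[:half_len], rx[half_len:2 * half_len])  # sum conj(a)*b
    return np.angle(z) * fs / (2.0 * np.pi * half_len)
```

With I/Q imbalance present, the image component biases this phase measurement, which is the coupling the proposed joint scheme is designed to remove.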
Best Statistical Distribution of flood variables for Johor River in Malaysia
Salarpour Goodarzi, M.; Yusop, Z.; Yusof, F.
2012-12-01
A complex flood event is always characterized by a few characteristics such as flood peak, flood volume, and flood duration, which might be mutually correlated. This study explored the statistical distributions of peakflow, flood duration and flood volume at the Rantau Panjang gauging station on the Johor River in Malaysia. Hourly data were recorded for 45 years and analysed by water year (July-June). Five distributions, namely Log Normal, Generalized Pareto, Log Pearson, Normal and Generalized Extreme Value (GEV), were used to model the distribution of all three variables. Anderson-Darling and Kolmogorov-Smirnov goodness-of-fit tests were used to evaluate the best fit. Goodness-of-fit tests at the 5% level of significance indicate that all the models can be used to model the distributions of peakflow, flood duration and flood volume. However, the Generalized Pareto distribution was found to be the most suitable model when tested with the Anderson-Darling test, while the Kolmogorov-Smirnov test suggested that GEV is the best for peakflow. The results of this research can be used to improve flood frequency analysis. [Figure: comparison between the Generalized Extreme Value, Generalized Pareto and Log Pearson distributions in the cumulative distribution function of peakflow.]
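The goodness-of-fit comparison can be sketched with SciPy by fitting each candidate distribution by maximum likelihood and ranking on the Kolmogorov-Smirnov statistic (the candidate list and sample below are illustrative; a full study would also apply Anderson-Darling, as the abstract does):

```python
import numpy as np
from scipy import stats

def rank_distributions(data, candidates=("genextreme", "genpareto", "lognorm")):
    """Rank candidate distributions for a flood variable.

    Fits each candidate by maximum likelihood and sorts by the
    Kolmogorov-Smirnov statistic (smaller = better fit).
    """
    results = []
    for name in candidates:
        params = getattr(stats, name).fit(data)
        ks = stats.kstest(data, name, args=params).statistic
        results.append((name, ks))
    return sorted(results, key=lambda t: t[1])
```

Note that the KS critical values are optimistic when parameters are fitted from the same sample, one reason different tests can disagree on the "best" model, as they do in the study.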
The Generation of a Stochastic Flood Event Catalogue for Continental USA
Quinn, N.; Wing, O.; Smith, A.; Sampson, C. C.; Neal, J. C.; Bates, P. D.
2017-12-01
Recent advances in the acquisition of spatiotemporal environmental data and improvements in computational capabilities have enabled the generation of large-scale, even global, flood hazard layers which serve as a critical decision-making tool for a range of end users. However, these datasets are designed to indicate only the probability and depth of inundation at a given location and are unable to describe the likelihood of concurrent flooding across multiple sites. Recent research has highlighted that although the estimation of large, widespread flood events is of great value to the flood mitigation and insurance industries, to date it has been difficult to deal with this spatial dependence structure in flood risk over relatively large scales. Many existing approaches have been restricted to empirical estimates of risk based on historic events, limiting their capability of assessing risk over the full range of plausible scenarios. Therefore, this research utilises a recently developed model-based approach to describe the multisite joint distribution of extreme river flows across continental USA river gauges. Given an extreme event at a site, the model characterises the likelihood that neighbouring sites are also impacted. This information is used to simulate an ensemble of plausible synthetic extreme event footprints from which flood depths are extracted from an existing global flood hazard catalogue. Expected economic losses are then estimated by overlaying flood depths with national datasets defining asset locations, characteristics and depth-damage functions. The ability of this approach to quantify probabilistic economic risk and rare threshold-exceeding events is expected to be of value to those interested in the flood mitigation and insurance sectors. This work describes the methodological steps taken to create the flood loss catalogue over a national scale and highlights the uncertainty in the expected annual economic vulnerability within the USA from extreme river flows.
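The multisite dependence idea can be illustrated with a two-site Gaussian copula (an assumption for illustration; the study's model-based approach is more elaborate): a correlated latent field makes concurrent exceedances far more likely than the independent product of marginal probabilities:

```python
import numpy as np
from scipy.stats import norm

def joint_exceedance_prob(rho, p=0.99, n=200_000, seed=8):
    """P(two sites both exceed their p-quantile) under a Gaussian copula.

    Correlated latent normals (correlation rho) drive concurrent
    extremes; rho = 0 recovers the independent product (1 - p)**2.
    Monte Carlo estimate, so subject to sampling error.
    """
    rng = np.random.default_rng(seed)
    z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=n)
    thr = norm.ppf(p)
    return float(np.mean((z[:, 0] > thr) & (z[:, 1] > thr)))
```

Ignoring this dependence (treating gauges as independent) drastically understates the probability of the large, widespread footprints that dominate portfolio-level losses.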
On identifying relationships between the flood scaling exponent and basin attributes.
Medhi, Hemanta; Tripathi, Shivam
2015-07-01
Floods are known to exhibit self-similarity and follow scaling laws that form the basis of regional flood frequency analysis. However, the relationship between basin attributes and the scaling behavior of floods is still not fully understood. Identifying these relationships is essential for drawing connections between hydrological processes in a basin and the flood response of the basin. Existing studies mostly rely on simulation models to draw these connections. This paper proposes a new methodology that draws connections between basin attributes and flood scaling exponents by using observed data. In the proposed methodology, a region-of-influence approach is used to delineate homogeneous regions for each gaging station. Ordinary least squares regression is then applied to estimate the flood scaling exponent for each homogeneous region, and finally stepwise regression is used to identify basin attributes that affect flood scaling exponents. The effectiveness of the proposed methodology is tested by applying it to data from river basins in the United States. The results suggest that the flood scaling exponent is small for regions having (i) large abstractions from precipitation in the form of large soil moisture storage and high evapotranspiration losses, and (ii) large fractions of overland flow compared to base flow, i.e., regions with fast-responding basins. Analysis of simple scaling and multiscaling of floods showed evidence of simple scaling for regions in which snowfall dominates the total precipitation.
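The OLS step of the methodology fits the scaling law Q = c * A**theta in log-log space; a sketch with synthetic data (function name and inputs are illustrative):

```python
import numpy as np

def flood_scaling_exponent(areas_km2, mean_annual_floods):
    """Estimate theta and c in the scaling law Q = c * A**theta.

    Ordinary least squares on the log-log relation between basin area
    and mean annual flood across a homogeneous region: the slope is
    the flood scaling exponent, the intercept gives the prefactor.
    """
    x = np.log(np.asarray(areas_km2, dtype=float))
    y = np.log(np.asarray(mean_annual_floods, dtype=float))
    theta, log_c = np.polyfit(x, y, 1)
    return theta, np.exp(log_c)
```

Under simple scaling the same theta applies to all flood quantiles; under multiscaling the exponent varies with the quantile, which is what the study's simple-scaling check probes.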
Integrating lateral contributions along river reaches to improve SWOT discharge estimates
Beighley, E.; Zhao, Y.; Feng, D.; Fisher, C. K.; Raoufi, R.; Durand, M. T.; David, C. H.; Lee, H.; Boone, A. A.; Cretaux, J. F.
2016-12-01
Understanding the potential impacts of climate and land cover change at continental to global scales, with a resolution sufficient for community-scale planning and management, requires an improved representation of the hydrologic cycle beyond what is possible with existing measurement networks and current Earth system models. The Surface Water and Ocean Topography (SWOT) mission, scheduled to launch in 2021, has the potential to address this challenge by providing measurements of water surface elevation, slope and extent for rivers wider than roughly 50-100 meters at a temporal sampling frequency ranging from days to weeks. The global uniformity and space/time resolution of the proposed SWOT measurements will enable hydrologic discovery, model advancements and new applications addressing the above challenges that are not currently possible or likely even conceivable. One derived data product planned for the SWOT mission is river discharge. Although there are several discharge algorithms that perform well for a range of conditions, this effort focuses on the MetroMan discharge algorithm, in which lateral inflow assumptions have been shown to impact performance. Here, the role of lateral inflows on discharge estimate performance is investigated. Preliminary results are presented for the Ohio River Basin. Lateral inflows are quantified for SWOT-observable river reaches using surface and subsurface runoff from the North American Land Data Assimilation System (NLDAS) and lateral routing in the Hillslope River Routing (HRR) model. Frequency distributions for the fraction of reach-averaged discharge resulting from lateral inflow are presented. Future efforts will integrate lateral inflow characteristics into the MetroMan discharge algorithm and quantify the potential value of SWOT measurements in flood insurance applications.
Effectiveness of flood damage mitigation measures: Empirical evidence from French flood disasters
Poussin, J.K.; Botzen, W.J.W.; Aerts, J.C.J.H.
2015-01-01
Recent destructive flood events and projected increases in flood risks as a result of climate change in many regions around the world demonstrate the importance of improving flood risk management. Flood-proofing of buildings is often advocated as an effective strategy for limiting damage caused by
Improving a DSM Obtained by Unmanned Aerial Vehicles for Flood Modelling
Mourato, Sandra; Fernandez, Paulo; Pereira, Luísa; Moreira, Madalena
2017-12-01
According to the EU Floods Directive, flood hazard maps must be used to assess flood risk. These maps can be developed with hydraulic modelling tools using a Digital Surface Runoff Model (DSRM). During the last decade, important advances in spatial data processing have been made which will certainly improve hydraulic model results. Currently, images acquired with a Red/Green/Blue (RGB) camera carried by Unmanned Aerial Vehicles (UAVs) are seen as a good alternative data source to represent the terrain surface with a high level of resolution and precision. The question is whether the digital surface model obtained with these data is adequate for a good representation of the hydraulic flood characteristics. For this purpose, the hydraulic model HEC-RAS was run with 4 different DSRMs for an 8.5 km reach of the Lis River in Portugal, and the computational performance of the 4 modelling implementations was evaluated. Water level records from two hydrometric stations were used as boundary conditions of the hydraulic model, and records from a third hydrometric station were used to validate the optimal DSRM. The HEC-RAS results that performed best during the validation step were those obtained with the DSRM integrating the two altimetry data sources.
Burns, Douglas A.; Smith, Martyn J.; Freehafer, Douglas A.
2015-12-31
A new Web-based application, titled “Application of Flood Regressions and Climate Change Scenarios To Explore Estimates of Future Peak Flows”, has been developed by the U.S. Geological Survey, in cooperation with the New York State Department of Transportation, that allows a user to apply a set of regression equations to estimate the magnitude of future floods for any stream or river in New York State (exclusive of Long Island) and the Lake Champlain Basin in Vermont. The regression equations that are the basis of the current application were developed in previous investigations by the U.S. Geological Survey (USGS) and are described at the USGS StreamStats Web sites for New York (http://water.usgs.gov/osw/streamstats/new_york.html) and Vermont (http://water.usgs.gov/osw/streamstats/Vermont.html). These regression equations include several fixed landscape metrics that quantify aspects of watershed geomorphology, basin size, and land cover as well as a climate variable—either annual precipitation or annual runoff.
Improving precipitation estimates over the western United States using GOES-R precipitation data
Karbalaee, N.; Kirstetter, P. E.; Gourley, J. J.
2017-12-01
Satellite remote sensing data with fine spatial and temporal resolution are widely used for precipitation estimation in applications such as hydrological modeling, storm prediction, and flash flood monitoring. The Geostationary Operational Environmental Satellites-R series (GOES-R) is the next generation of environmental satellites that provides hydrologic, atmospheric, and climatic information every 30 seconds over the western hemisphere. The high resolution and low latency of GOES-R observations are essential for the monitoring and prediction of floods, specifically in the western United States, where the vantage point of space can complement the degraded weather radar coverage of the NEXRAD network. The GOES-R rainfall rate algorithm will yield deterministic quantitative precipitation estimates (QPEs). Accounting for inherent uncertainties will further advance the GOES-R QPEs since, with quantifiable error bars, the rainfall estimates can be more readily fused with ground radar products. On the ground, the high-resolution NEXRAD-based precipitation estimation from the Multi-Radar/Multi-Sensor (MRMS) system, which is now operational in the National Weather Service (NWS), is challenged by a lack of suitable coverage of operational weather radars over complex terrain. Distributions of the QPE uncertainties associated with the GOES-R deterministic retrievals are derived and analyzed using MRMS over regions with good radar coverage. They will be merged with MRMS-based probabilistic QPEs developed to advance multisensor QPE integration. This research aims at improving precipitation estimation over the CONUS by combining observations from GOES-R and MRMS to provide consistent, accurate and fine-resolution precipitation rates with uncertainties over the CONUS.
Caporali, E.; Chiarello, V.; Galeati, G.
2014-12-01
Peak discharge estimates for a given return period are of primary importance in engineering practice for risk assessment and hydraulic structure design. Different statistical methods are chosen here for the assessment of the flood frequency curve: an indirect technique based on extreme rainfall event analysis, and the Peak Over Threshold (POT) model and the Annual Maxima approach as direct techniques using river discharge data. In the framework of the indirect method, a Monte Carlo simulation approach is adopted to determine a derived frequency distribution of peak runoff using a probabilistic formulation of the SCS-CN method as the stochastic rainfall-runoff model. The Monte Carlo simulation is used to generate a sample of runoff events from different stochastic combinations of rainfall depth, storm duration, and initial loss inputs. The distribution of the rainfall storm events is assumed to follow the GP law, whose parameters are estimated from the GEV parameters of the annual maximum data. The evaluation of the initial abstraction ratio is investigated, since it is one of the most questionable assumptions in the SCS-CN model and plays a key role in river basins characterized by high-permeability soils, mainly governed by the infiltration excess mechanism. In order to take into account the uncertainty of the model parameters, a modified approach that is able to revise and re-evaluate the original value of the initial abstraction ratio is implemented. In the POT model the choice of the threshold is an essential issue, mainly based on a compromise between bias and variance. The Generalized Extreme Value (GEV) distribution fitted to the annual maximum discharges is therefore compared with the Pareto-distributed peaks to check the suitability of the frequency-of-occurrence representation. The methodology is applied to a large dam in the Serchio river basin, located in the Tuscany Region. The application has shown that the Monte Carlo simulation technique can be a useful
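The indirect method can be sketched as follows: draw storm depths, pass them through the SCS-CN relation Q = (P - Ia)^2 / (P - Ia + S) with S = 25400/CN - 254 (mm) and Ia = lambda * S, and read return levels off the simulated sample. The rainfall model below is a simple exponential stand-in for the study's GP law, and all parameter values are illustrative:

```python
import numpy as np

def simulate_runoff_depths(n_events, cn=70.0, ia_ratio=0.2, seed=0):
    """Monte Carlo sample of SCS-CN event runoff depths in mm.

    S = 25400 / CN - 254, Ia = ia_ratio * S, and
    Q = (P - Ia)**2 / (P - Ia + S) for P > Ia, else 0.
    Storm depths P are drawn from an exponential distribution here as
    a stand-in for the GP rainfall model of the study.
    """
    rng = np.random.default_rng(seed)
    s = 25400.0 / cn - 254.0
    ia = ia_ratio * s
    p = rng.exponential(40.0, n_events)          # storm depths, mm
    return np.where(p > ia, (p - ia) ** 2 / (p - ia + s), 0.0)

def return_level(q, return_period_events):
    """Empirical depth exceeded once per return_period_events events."""
    return np.quantile(q, 1.0 - 1.0 / return_period_events)
```

Varying ia_ratio in such a loop is one way to see why the initial abstraction ratio is so influential for high-permeability basins: it sets the rainfall threshold below which no runoff is produced at all.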
Floods in Central Texas, September 7-14, 2010
Winters, Karl E.
2012-01-01
Severe flooding occurred near the Austin metropolitan area in central Texas September 7–14, 2010, because of heavy rainfall associated with Tropical Storm Hermine. The U.S. Geological Survey, in cooperation with the Upper Brushy Creek Water Control and Improvement District, determined rainfall amounts and annual exceedance probabilities for rainfall resulting in flooding in Bell, Williamson, and Travis counties in central Texas during September 2010. We documented peak streamflows and the annual exceedance probabilities for peak streamflows recorded at several streamflow-gaging stations in the study area. The 24-hour rainfall total exceeded 12 inches at some locations, with one report of 14.57 inches at Lake Georgetown. Rainfall probabilities were estimated using previously published depth-duration frequency maps for Texas. At 4 sites in Williamson County, the 24-hour rainfall had an annual exceedance probability of 0.002. Streamflow measurement data and flood-peak data from U.S. Geological Survey surface-water monitoring stations (streamflow and reservoir gaging stations) are presented, along with a comparison of September 2010 flood peaks to previous known maximums in the periods of record. Annual exceedance probabilities for peak streamflow were computed for 20 streamflow-gaging stations based on an analysis of streamflow-gaging station records. The annual exceedance probability was 0.03 for the September 2010 peak streamflow at the Geological Survey's streamflow-gaging stations 08104700 North Fork San Gabriel River near Georgetown, Texas, and 08154700 Bull Creek at Loop 360 near Austin, Texas. The annual exceedance probability was 0.02 for the peak streamflow for the Geological Survey's streamflow-gaging station 08104500 Little River near Little River, Texas. The lack of similarity in the annual exceedance probabilities computed for precipitation and streamflow might be attributed to the small areal extent of the heaviest rainfall over these and the other gaged watersheds.
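An empirical annual exceedance probability of the kind reported can be obtained from a plotting position; the sketch below uses the Weibull formula rank/(n + 1) (the USGS analysis itself is based on fitting a distribution to each station's record, not on this shortcut):

```python
import numpy as np

def annual_exceedance_probability(annual_peaks, flood_peak):
    """Empirical AEP of a flood peak via the Weibull plotting position.

    The peak's rank among the observed annual maxima (largest = 1)
    gives AEP = rank / (n + 1); a formal analysis would instead fit a
    distribution (e.g., log-Pearson Type III) to the record.
    """
    peaks = np.asarray(annual_peaks, dtype=float)
    rank = int(np.sum(peaks > flood_peak)) + 1
    return rank / (len(peaks) + 1)
```

With a 49-year record, the largest observed peak gets AEP 1/50 = 0.02, the same order as the 0.02-0.03 values reported for the September 2010 peaks.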
Climate Change Impacts on Flooding in Southeastern Austria
Switanek, Matt; Truhetz, Heimo; Reszler, Christian
2015-04-01
Floods in southeastern Austria can cause significant damage to life, property and infrastructure. These flood events are often the result of extreme precipitation from small-scale convective storms. In order to model changes to flood magnitude and frequency more accurately, Regional Climate Models (RCMs) must be able to simulate small-scale convective storms similar to those that have been observed. Even as computational resources have increased, RCMs are only now achieving the high spatial and temporal resolutions necessary to physically resolve the processes that govern small-scale convection. With increased resolution, RCMs can rely on their internal physics to model convective precipitation and need not depend on parameterization. This study uses historical and future scenarios from RCMs run at a spatial scale of 3 km and a temporal scale of 1 hr. In order to subsequently force a hydrological flood model, the sub-daily precipitation and temperature data from the RCMs are first bias corrected. A newly proposed bias correction method is presented and compared to the commonly used quantile mapping. The proposed method performs better in its ability to preserve the model-projected climate change signal (measured by changes in mean and variance). Lastly, the changes in the quantity and frequency of projected extreme precipitation, at the watershed level, are analyzed with respect to the historic time period. With these improvements in dynamical modeling and bias correction methods, a clearer picture emerges revealing the more likely impacts climate change will have on floods in southeastern Austria.
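The benchmark method, empirical quantile mapping, replaces each model value by the observed value at the quantile it occupies in the model's historical distribution; a minimal sketch (the newly proposed method of the study is not reproduced here):

```python
import numpy as np

def quantile_map(model_hist, obs_hist, model_future):
    """Empirical quantile mapping bias correction.

    Each future model value is replaced by the observed value at the
    quantile that the model value occupies in the historical model
    distribution, so the corrected series inherits the observed
    distribution over the calibration period.
    """
    q = np.interp(model_future,
                  np.sort(model_hist),
                  np.linspace(0.0, 1.0, len(model_hist)))
    return np.quantile(obs_hist, q)
```

Because the mapping is anchored to the historical distribution, it can distort the model-projected change signal in mean and variance, which is the shortcoming the study's proposed method targets.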
Directory of Open Access Journals (Sweden)
Daniela Molinari
2017-09-01
IN-depth SYnthetic Model for Flood Damage Estimation (INSYDE) is a model for the estimation of flood damage to residential buildings at the micro-scale. This study investigates the sensitivity of INSYDE to the accuracy of input data. Starting from the knowledge of input parameters at the scale of individual buildings for a case study, the level of detail of the input data is progressively downgraded until a single representative value is defined for all inputs at the census block scale. The analysis reveals that two conditions are required to limit the errors in damage estimation: the representativeness of the aggregated values with respect to the micro-scale values, and local knowledge of the footprint area of the buildings, the latter being the main extensive variable adopted by INSYDE. This result allows the usability of the model to be extended to the meso-scale, including in other countries, depending on the availability of aggregated building data.
Coping capacities for improving adaptation pathways for flood protection in Can Tho, Vietnam
Pathirana, A.; Radhakrishnan, M.; Quan, N. H.; Gersonius, B.; Ashley, R.; Zevenbergen, C.
2016-12-01
Studying the evolution of coping and adaptation capacities is a prerequisite for preparing an effective flood management plan for the future, especially in the dynamic and fast-changing cities of developing countries. The objectives, requirements, targets, design and performance of flood protection measures have to be determined taking into account, or in conjunction with, these coping capacities. A methodology based on adaptation pathways is presented to account for coping capacities and to assess their effect on flood protection measures. The adaptation pathways method determines the point of failure of a particular strategy based on the change in an external driver: a point in time, or a socio-economic situation, at which the strategy can no longer meet its objective. Pathways derived with this methodology reflect future reality by considering changing engineering standards along with future uncertainties, risk-taking abilities and adaptation capacities. This pathways-based methodology determines the adaptation tipping points (ATPs) and the time of occurrence of each ATP for flood protection measures after accounting for coping capacities, evaluates the measures, and then provides the means to determine the adaptation pathways. Application of this methodology to flood protection measures in Can Tho city in the Mekong delta reveals the effect of coping capacity on the usefulness of flood protection measures and the delay in the occurrence of tipping points. Consideration of the coping capacity provided by elevated property floor levels led to the postponement of tipping points and improved the adaptation pathways comprising flood protection measures such as dikes. This information is useful to decision makers for the planning and phasing of investments in flood protection.
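A minimal sketch of the adaptation-tipping-point idea, with hypothetical driver values and thresholds: coping capacity (for example, elevated property floor levels) acts as an extra margin that postpones the year of failure:

```python
def adaptation_tipping_point(driver_by_year, design_threshold, coping_margin=0.0):
    """First year in which the external driver (e.g. flood level) exceeds the
    measure's design threshold plus the coping-capacity margin; None if never."""
    for year in sorted(driver_by_year):
        if driver_by_year[year] > design_threshold + coping_margin:
            return year
    return None
```

With a driver rising from 0.5 m to 1.6 m over the century and a 0.9 m design threshold, adding a 0.3 m coping margin moves the tipping point decades later, which is the qualitative effect the paper reports for Can Tho.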
Kiss, Andrea
2016-04-01
The present paper is based on a recently developed database comprising contemporary original administrative, legal and private source materials (published and archival) as well as media reports related to the floods that occurred on the lower sections of the Tisza river in Hungary, with special emphasis on the area of the town of Szeged. The study area is well represented by contemporary source evidence from the late 17th century onwards, when the town and its broader area were reoccupied from the Ottoman Turkish Empire. Concerning the applied source materials, the main bases of the investigation are administrative (archival) sources such as the town council protocols of Szeged and the county meeting protocols of Csanád and Csongrád Counties; in these (legal-)administrative documents, damaging events (natural/environmental hazards) were systematically recorded. Moreover, other source types such as taxation-related damage accounts as well as private and official reports, letters and correspondence (published and unpublished) were also included. Among published evidence, the most important sources are flood reports in contemporary newspapers as well as town chronicles and other contemporary narratives. In the presentation, the main focus is on the analysis of flood-rich and flood-poor periods of the last ca. 330 years; moreover, the seasonal distribution and the magnitude of Tisza flood events are also discussed. Another important aim of the poster is to provide a short overview, in the form of case studies, of the greatest flood events (e.g. duration, magnitude, damage, multi-annual consequences) and their further impacts on urban and countryside development as well as on (changes in) flood defence strategies. In this respect, two flood events in particular, the great (1815-)1816 flood and the catastrophic 1879 flood (with its causes and consequences), which practically erased Szeged from the ground, are presented in more detail.
Prompt Proxy Mapping of Flood Damaged Rice Fields Using MODIS-Derived Indices
Directory of Open Access Journals (Sweden)
Youngjoo Kwak
2015-11-01
Flood mapping, particularly hazard and risk mapping, is an imperative process and a fundamental part of emergency response and risk management. This paper aims to produce a flood risk proxy map of damaged rice fields over the whole of Bangladesh, where monsoon river floods are dominant and frequent, affecting over 80% of the total population. This proxy risk map was developed to meet a request of the government at the national level. This study presents a rapid, straightforward methodology for estimating rice-crop damage in flood areas of Bangladesh during the large flood from July to September 2007, despite the lack of primary data. We improved a water detection algorithm to achieve a better capacity to discern flood areas by using a modified land surface water index (MLSWI). Then, rice fields were estimated utilizing a hybrid rice field map from land-cover classification and MODIS-derived indices, such as the normalized difference vegetation index (NDVI) and enhanced vegetation index (EVI). The results showed that the developed method is capable of providing instant, comprehensive, nationwide mapping of flood risks, such as rice field damage. The detected flood areas and damaged rice fields during the 2007 flood were verified by comparing them with Advanced Land Observing Satellite (ALOS) AVNIR-2 images (10 m spatial resolution) and in situ field survey data, with moderate agreement (K = 0.57).
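The MLSWI is the authors' modified index and its formula is not given in the abstract; the standard MODIS-derived NDVI and EVI used in the hybrid rice field map can, however, be sketched directly from surface reflectances:

```python
import numpy as np

def ndvi(nir, red):
    """Normalized difference vegetation index."""
    nir, red = np.asarray(nir, float), np.asarray(red, float)
    return (nir - red) / (nir + red)

def evi(nir, red, blue):
    """Enhanced vegetation index with the standard MODIS coefficients
    (G=2.5, C1=6, C2=7.5, L=1)."""
    nir, red, blue = (np.asarray(a, float) for a in (nir, red, blue))
    return 2.5 * (nir - red) / (nir + 6.0 * red - 7.5 * blue + 1.0)
```

Both functions broadcast over whole reflectance arrays, so a national mosaic can be processed in one call per index.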
Institute of Scientific and Technical Information of China (English)
ZHOU Yuliang; LU Guihua; JIN Juliang; TONG Fang; ZHOU Ping
2006-01-01
Precise comprehensive evaluation of flood disaster loss is significant for the prevention and mitigation of flood disasters. One of the difficulties involved is how to establish a model capable of describing the complex relation between the input and output data of the system of flood disaster loss. Genetic programming (GP) solves problems by using ideas from genetic algorithms and generates computer programs automatically. In this study a new method for the evaluation of the grade of flood disaster loss (EGFD) on the basis of improved genetic programming (IGP) is presented (IGP-EGFD). The flood disaster area and the direct economic loss are taken as the evaluation indexes of flood disaster loss. Clearly, the larger the evaluation index value, the larger the corresponding grade of flood disaster loss; consequently, the IGP code is designed to make the grade of flood disaster loss an increasing function of the index value. The result of applying the IGP-EGFD model to Henan Province shows that a good function expression can be obtained within a larger searched function space, and that the model is of high precision and considerable practical significance. Thus, IGP-EGFD can be widely used in automatic modeling and other evaluation systems.
A free and open source QGIS plugin for flood risk analysis: FloodRisk
Albano, Raffaele; Sole, Aurelia; Mancusi, Leonardo
2016-04-01
information and to generate knowledge in the stakeholders for improving flood risk management. In particular, FloodRisk comprises a set of calculators capable of computing human or economic losses for a collection of assets caused by a given scenario event, explicitly covering mitigation and adaptation measures (Mancusi et al., 2015). It is important to mention that, although some models in the literature incorporate calculator philosophies identical to the ones implemented in the FloodRisk engine, its implementation differs significantly in aspects such as a user-friendly and intuitive interface, the capability of running the calculations on any platform (Windows, Mac, Linux, etc.), and the ability to promote extensibility, efficient testability and scientific operability. FloodRisk has been designed as an initiative for implementing a standard and harmonized procedure to determine flood impacts. Albano, R.; Mancusi, L.; Sole, A.; Adamowski, J. Collaborative Strategies for Sustainable EU Flood Risk Management: FOSS and Geospatial Tools - Challenges and Opportunities for Operative Risk Analysis. ISPRS Int. J. Geo-Inf. 2015, 4, 2704-2727. Mancusi, L.; Albano, R.; Sole, A. FloodRisk: a QGIS plugin for flood consequences estimation. In: Geomatics Workbooks n°12 - FOSS4G Europe, Como, 2015.
Assessment of flooding in a best estimate thermal hydraulic code (WCOBRA/TRAC)
International Nuclear Information System (INIS)
Takeuchi, K.; Young, M.Y.
1998-01-01
The performance of the WCOBRA/TRAC code in predicting flooding, the counter-current flow limit, is evaluated in three geometries important to nuclear reactor loss-of-coolant accident evaluation: a vertical pipe, a perforated plate, and a downcomer annulus. These flow limits are computationally evaluated through transient conditions. The flooding in the vertical pipe is compared with the classical Wallis flooding limit. The flooding on the perforated plate is compared with the Northwestern flooding data correlation. The downcomer flooding in 1/15th and 1/5th scale models is compared with the Creare data. Finally, full-scale downcomer flooding is compared with the UPTF test data. The prediction capability of the code for flooding is found to be very good. (orig.)
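The classical Wallis flooding limit used in the vertical-pipe comparison can be sketched as follows; the constants m and C are typical empirical values (assumptions for illustration, not values from the paper):

```python
import math

def wallis_flooding_jg(j_f, D, rho_f, rho_g, m=1.0, C=0.725, g=9.81):
    """Superficial gas velocity (m/s) at the Wallis flooding limit in a
    vertical pipe: sqrt(jg*) + m*sqrt(jf*) = C, where the dimensionless
    superficial velocities are jk* = jk * sqrt(rho_k / (g*D*(rho_f - rho_g)))."""
    scale = g * D * (rho_f - rho_g)
    jf_star = j_f * math.sqrt(rho_f / scale)
    root = C - m * math.sqrt(jf_star)
    if root <= 0.0:
        return 0.0   # liquid downflow already at or above the flooding limit
    return root ** 2 / math.sqrt(rho_g / scale)
```

The correlation captures the basic trade-off tested in the code assessment: the more liquid flowing down, the less gas upflow is needed to reach the counter-current flow limit.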
Future changes in atmospheric rivers and their implications for winter flooding in Britain
International Nuclear Information System (INIS)
Lavers, David A; Allan, Richard P; Brayshaw, David J; Villarini, Gabriele; Lloyd-Hughes, Benjamin; Wade, Andrew J
2013-01-01
Within the warm conveyor belt of extra-tropical cyclones, atmospheric rivers (ARs) are the key synoptic features which deliver the majority of poleward water vapour transport, and are associated with episodes of heavy and prolonged rainfall. ARs are responsible for many of the largest winter floods in the mid-latitudes resulting in major socioeconomic losses; for example, the loss from United Kingdom (UK) flooding in summer/winter 2012 is estimated to be about $1.6 billion in damages. Given the well-established link between ARs and peak river flows for the present day, assessing how ARs could respond under future climate projections is of importance in gauging future impacts from flooding. We show that North Atlantic ARs are projected to become stronger and more numerous in the future scenarios of multiple simulations from five state-of-the-art global climate models (GCMs) in the fifth Climate Model Intercomparison Project (CMIP5). The increased water vapour transport in projected ARs implies a greater risk of higher rainfall totals and therefore larger winter floods in Britain, with increased AR frequency leading to more flood episodes. In the high emissions scenario (RCP8.5) for 2074–2099 there is an approximate doubling of AR frequency in the five GCMs. Our results suggest that the projected change in ARs is predominantly a thermodynamic response to warming resulting from anthropogenic radiative forcing. (letter)
Order Tracking Based on Robust Peak Search Instantaneous Frequency Estimation
International Nuclear Information System (INIS)
Gao, Y; Guo, Y; Chi, Y L; Qin, S R
2006-01-01
Order tracking plays an important role in the non-stationary vibration analysis of rotating machinery, especially during run-up or coast-down. An instantaneous frequency estimation (IFE) based order tracking of rotating machinery is introduced, in which a peak search algorithm applied to the spectrogram from time-frequency analysis is employed to obtain the IFE of vibrations. An improvement to the peak search is proposed, which prevents strong non-order components or noise from disturbing the peak search. Compared with traditional methods of order tracking, IFE-based order tracking is simpler in application and depends only on software. Tests verify the validity of the method. This method is an effective supplement to traditional methods, and its application in condition monitoring and diagnosis of rotating machinery is conceivable.
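A minimal sketch of spectrogram peak-search IFE, assuming the improvement amounts to constraining each peak to a band around the previous one so that strong non-order components or noise cannot capture the track (the paper's exact constraint is not specified in the abstract):

```python
import numpy as np

def peak_search_if(signal, fs, win=256, hop=128, max_jump=5):
    """Instantaneous frequency (Hz) per frame by spectrogram peak search.
    After the first frame, the peak is searched only within max_jump bins
    of the previous peak, rejecting distant non-order components."""
    window = np.hanning(win)
    freqs = np.fft.rfftfreq(win, 1.0 / fs)
    track, prev = [], None
    for start in range(0, len(signal) - win + 1, hop):
        mag = np.abs(np.fft.rfft(signal[start:start + win] * window))
        if prev is None:
            k = int(np.argmax(mag))
        else:
            lo = max(prev - max_jump, 0)
            hi = min(prev + max_jump + 1, mag.size)
            k = lo + int(np.argmax(mag[lo:hi]))
        prev = k
        track.append(freqs[k])
    return np.array(track)
```

On a run-up-like chirp the constrained search follows the rising order frequency instead of jumping to unrelated spectral peaks.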
Directory of Open Access Journals (Sweden)
Seung Oh Lee
2013-10-01
Collection and investigation of flood information are essential to understand the nature of floods, but this has proved difficult in data-poor environments, or in developing or under-developed countries, due to economic and technological limitations. The development of remote sensing data, GIS, and modeling techniques has, therefore, proved to be a useful tool in the analysis of the nature of floods. Accordingly, this study attempts to estimate flood discharge using the generalized likelihood uncertainty estimation (GLUE) methodology and a 1D hydraulic model, with remote sensing data and topographic data, under the assumed condition that there is no gauge station on the Missouri River, Nebraska, and the Wabash River, Indiana, in the United States. The results show that the use of Landsat leads to a better discharge approximation on a large-scale reach than on a small-scale one. Discharge approximation using GLUE depended on the selection of likelihood measures; consideration of the physical conditions in the study reaches could, therefore, contribute to an appropriate selection of informal likelihood measures. The river discharge assessed by using Landsat imagery and the GLUE methodology could be useful in supplementing flood information for flood risk management at the planning level in ungauged basins. However, it should be noted that real-time application of this approach might be difficult due to the GLUE procedure.
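A compact sketch of the GLUE idea with a hypothetical width-discharge relation standing in for the 1D hydraulic model: candidate discharges are scored by an informal likelihood against the remotely sensed flood extent, and the behavioural candidates are likelihood-weighted:

```python
import numpy as np

def glue_estimate(observed_width, simulate_width, candidate_q, threshold=0.7):
    """GLUE sketch: score candidate discharges with an informal likelihood
    (1 - relative error between simulated and observed flood width), keep
    'behavioural' candidates above the threshold, and return the
    likelihood-weighted discharge estimate."""
    q = np.asarray(candidate_q, float)
    widths = np.array([simulate_width(v) for v in q])
    likelihood = 1.0 - np.abs(widths - observed_width) / observed_width
    keep = likelihood > threshold
    w = likelihood[keep]
    return float(np.sum(w * q[keep]) / np.sum(w))
```

The choice of the informal likelihood measure and the behavioural threshold drives the result, which is exactly the sensitivity the abstract reports.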
Large Scale Processes and Extreme Floods in Brazil
Ribeiro Lima, C. H.; AghaKouchak, A.; Lall, U.
2016-12-01
Persistent large-scale anomalies in the atmospheric circulation and ocean state have been associated with heavy rainfall and extreme floods in water basins of different sizes across the world. Such studies have emerged in recent years as a new tool to improve the traditional, stationarity-based approach to flood frequency analysis and flood prediction. Here we seek to advance previous studies by evaluating the dominance of large-scale processes (e.g. atmospheric rivers/moisture transport) over local processes (e.g. local convection) in producing floods. We consider flood-prone regions in Brazil as case studies, and the role of large-scale climate processes in generating extreme floods in these regions is explored by means of observed streamflow, reanalysis data and machine learning methods. The dynamics of the large-scale atmospheric circulation in the days prior to the flood events are evaluated based on the vertically integrated moisture flux and its divergence field, which are interpreted in a low-dimensional space obtained by machine learning techniques, particularly supervised kernel principal component analysis. In this reduced dimensional space, clusters are obtained in order to better understand the role of regional moisture recycling or teleconnected moisture in producing floods of a given magnitude. The convective available potential energy (CAPE) is also used as a measure of local convective activity. For individual sites we investigate the exceedance probability at which large-scale atmospheric fluxes dominate the flood process. Finally, we analyze regional patterns of floods and how the scaling law of floods with drainage area responds to changes in the climate forcing mechanisms (e.g. local vs. large-scale).
Ermolieva, T.; Filatova, Tatiana; Ermoliev, Y.; Obersteiner, M.; de Bruijn, K.M.; Jeuken, A.
2017-01-01
As flood risks grow worldwide, a well-designed insurance program engaging various stakeholders becomes a vital instrument in flood risk management. The main challenge concerns the applicability of standard approaches for calculating insurance premiums of rare catastrophic losses. This article
Simulating groundwater-induced sewer flooding
Mijic, A.; Mansour, M.; Stanic, M.; Jackson, C. R.
2016-12-01
During the last decade, Chalk catchments of southern England experienced severe groundwater flooding. High groundwater levels resulted in groundwater ingress into the sewer network, which led to restricted toilet use and the overflow of diluted, but untreated, sewage onto road surfaces and into rivers and water courses. In response to these events the water and sewerage company Thames Water Utilities Ltd (TWUL) had to allocate significant funds to mitigate the impacts. It was estimated that approximately £19m was spent responding to the extreme wet weather of 2013-14, along with the use of a fleet of over 100 tankers. However, the magnitude of the event was so large that these efforts could not stop the discharge of sewage to the environment. This work presents an analysis of the risk of groundwater-induced sewer flooding within the Chalk catchment of the River Lambourn, Berkshire. A spatially distributed groundwater model was used to assess historic groundwater flood risk and the potential impacts of changes in future climate. We then linked this model to an urban groundwater model to enable us to simulate groundwater-sewer interaction in detail. The modelling setup was used to identify relationships between infiltration into sewers and groundwater levels at specific points on TWUL's sewer network, and to estimate historic and future groundwater flood risk and how this varies across the catchment. The study showed the significance of understanding the impact of groundwater on urban water systems, and of producing information that can inform a water company's response to groundwater flood risk, its decision-making process and its asset management planning. However, the knowledge gained through integrated modelling of groundwater-sewer interactions has highlighted limitations of existing approaches for the simulation of these coupled systems. We conclude this work with a number of recommendations on how to improve such hydrological/sewer analyses.
International Nuclear Information System (INIS)
Mic, Rodica; Gaida, Gilles
2004-01-01
A flow-duration-frequency (QdF) regionalisation is carried out on the Timis and Bega river sub-catchments in the west of Romania. This regionalisation concerns 28 sub-catchments having about thirty years of stream flow measurements (daily flows, instantaneous flood peaks and hydrographs). This work on flood regionalisation was carried out in the framework of the European project Riverlife. The regional model allows the design hydrographs needed for hydraulic modelling to be defined; this hydraulic work is necessary in order to protect the city of Timisoara against floods. The method uses the hypotheses of the converging QdF model and adapts the index flood method to obtain a regional dimensionless distribution. For long return periods, this approach uses the GRADEX method, which extrapolates discharge distributions according to the rainfall distributions. The dimensionless regional QdF model needs two local descriptors of the target site in order to be denormed: QIXA10, the annual maximum instantaneous flow with a 10% probability of being exceeded (the 10-year peak flood), and Δ, a characteristic duration. For both variables, the relations obtained by regression, involving morphologic and climatic basin characteristics, are presented. (Author)
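The denorming step can be sketched in one line: the regional dimensionless distribution is scaled by the local index flood QIXA10 (the growth factors below are hypothetical, and the duration descriptor Δ and the GRADEX extrapolation are omitted):

```python
def denorm_qdf(regional_growth_factors, qixa10):
    """Index-flood denorming: the local quantile (m3/s) for each return
    period T is the regional dimensionless growth factor times the
    target site's index flood QIXA10."""
    return {T: g * qixa10 for T, g in regional_growth_factors.items()}
```

The regression relations mentioned in the abstract supply QIXA10 (and Δ) at ungauged sites from morphologic and climatic basin characteristics, after which the same regional curve serves every sub-catchment.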
Robust k-mer frequency estimation using gapped k-mers.
Ghandi, Mahmoud; Mohammad-Noori, Morteza; Beer, Michael A
2014-08-01
Oligomers of fixed length, k, commonly known as k-mers, are often used as fundamental elements in the description of DNA sequence features of diverse biological function, or as intermediate elements in the construction of more complex descriptors of sequence features such as position weight matrices. k-mers are very useful as general sequence features because they constitute a complete and unbiased feature set, and do not require parameterization based on incomplete knowledge of biological mechanisms. However, a fundamental limitation in the use of k-mers as sequence features is that as k is increased, larger spatial correlations in DNA sequence elements can be described, but the frequency of observing any specific k-mer becomes very small, and rapidly approaches a sparse matrix of binary counts. Thus any statistical learning approach using k-mers will be susceptible to noisy estimation of k-mer frequencies once k becomes large. Because all molecular DNA interactions have limited spatial extent, gapped k-mers often carry the relevant biological signal. Here we use gapped k-mer counts to more robustly estimate the ungapped k-mer frequencies, by deriving an equation for the minimum norm estimate of k-mer frequencies given an observed set of gapped k-mer frequencies. We demonstrate that this approach provides a more accurate estimate of the k-mer frequencies in real biological sequences using a sample of CTCF binding sites in the human genome.
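The counting side can be sketched as below; the paper's actual contribution, the minimum-norm inversion from gapped back to ungapped k-mer frequencies, is not reproduced here:

```python
from collections import Counter
from itertools import combinations

def gapped_kmer_counts(seq, l, k):
    """Count gapped k-mers: in each length-l window, keep k informative
    positions and mask the remaining l-k positions with the wildcard 'N'."""
    counts = Counter()
    keep_sets = list(combinations(range(l), k))
    for i in range(len(seq) - l + 1):
        window = seq[i:i + l]
        for keep in keep_sets:
            counts["".join(window[j] if j in keep else "N" for j in range(l))] += 1
    return counts
```

Because each length-l window contributes C(l, k) gapped k-mers, every gapped feature is observed far more often than the corresponding ungapped l-mer, which is the source of the estimator's robustness.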
Apel, H.; Trepat, O. M.; Hung, N. N.; Chinh, D. T.; Merz, B.; Dung, N. V.
2015-08-01
Many urban areas experience both fluvial and pluvial floods, because locations next to rivers are preferred settlement areas, and the predominantly sealed urban surface prevents infiltration and facilitates surface inundation. The latter problem is enhanced in cities with insufficient or non-existent sewer systems. While there are a number of approaches to analyse either fluvial or pluvial flood hazard, studies of combined fluvial and pluvial flood hazard are hardly available. Thus, this study aims at the analysis of fluvial and pluvial flood hazard individually, but also at developing a method for the analysis of combined pluvial and fluvial flood hazard. This combined fluvial-pluvial flood hazard analysis is performed taking Can Tho city, the largest city in the Vietnamese part of the Mekong Delta, as an example. In this tropical environment the annual monsoon-triggered floods of the Mekong River can coincide with heavy local convective precipitation events, causing both fluvial and pluvial flooding at the same time. Fluvial flood hazard was estimated with a copula-based bivariate extreme value statistic for the gauge Kratie at the upper boundary of the Mekong Delta and a large-scale hydrodynamic model of the Mekong Delta. This provided the boundaries for a 2-dimensional hydrodynamic inundation simulation for Can Tho city. Pluvial hazard was estimated by a peak-over-threshold frequency estimation based on local rain gauge data and a stochastic rain storm generator. Inundation was simulated by a 2-dimensional hydrodynamic model implemented on a Graphical Processor Unit (GPU) for time-efficient flood propagation modelling. All hazards - fluvial, pluvial and combined - were accompanied by an uncertainty estimation considering the natural variability of the flood events. This resulted in probabilistic flood hazard maps showing the maximum inundation depths for a selected set of probabilities of occurrence, with maps showing the expectation (median) and the uncertainty by
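The peak-over-threshold frequency estimation used for the pluvial hazard can be sketched under the simplifying assumption of exponentially distributed excesses (the study's actual statistical model is not detailed in the abstract):

```python
import numpy as np

def pot_model(series, threshold, years):
    """Peak-over-threshold sketch: the annual exceedance rate (treated as a
    Poisson rate) and the mean excess above the threshold."""
    x = np.asarray(series, float)
    excess = x[x > threshold] - threshold
    rate = excess.size / years
    mean_excess = float(excess.mean()) if excess.size else 0.0
    return rate, mean_excess

def pot_return_level(threshold, rate, mean_excess, T):
    """T-year return level assuming exponentially distributed excesses."""
    return threshold + mean_excess * np.log(rate * T)
```

A generalized Pareto distribution for the excesses is the more common choice in practice; the exponential case shown here is its shape-zero special case.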
Directory of Open Access Journals (Sweden)
A. M. Hashemi
2000-01-01
Regionalized and at-site flood frequency curves exhibit considerable variability in their shapes, but the factors controlling this variability (other than sampling effects) are not well understood. An application of the Monte Carlo simulation-based derived distribution approach is presented in this two-part paper to explore the influence of climate, described by simulated rainfall and evapotranspiration time series, and of basin factors on the flood frequency curve (ffc). The sensitivity analysis conducted in the paper should not be interpreted as reflecting possible climate changes, but the results can provide an indication of the changes to which the flood frequency curve might be sensitive. A single-site Neyman-Scott point process model of rainfall, with convective and stratiform cells (Cowpertwait, 1994; 1995), has been employed to generate synthetic rainfall inputs to a rainfall-runoff model. The time series of the potential evapotranspiration (ETp) demand has been represented through an AR(n) model with a seasonal component, while a simplified version of the ARNO rainfall-runoff model (Todini, 1996) has been employed to simulate the continuous discharge time series. All these models have been parameterised in a realistic manner using observed data and results from previous applications, to obtain 'reference' parameter sets for a synthetic case study. Subsequently, perturbations to the model parameters have been made one at a time and the sensitivities of the generated annual maximum rainfall and flood frequency curves (unstandardised, and standardised by the mean) have been assessed. Overall, the sensitivity analysis described in this paper suggests that the soil moisture regime and, in particular, the probability distribution of soil moisture content at the storm arrival time can be considered as a unifying link between the perturbations to the several parameters and their effects on the standardised and unstandardised ffcs, thus revealing the
A methodology for urban flood resilience assessment
Lhomme, Serge; Serre, Damien; Diab, Youssef; Laganier, Richard
2010-05-01
In Europe, river floods have been increasing in frequency and severity [Szöllösi-Nagy and Zevenbergen, 2005]. Moreover, climate change is expected to exacerbate the frequency and intensity of hydro-meteorological disasters [IPCC, 2007]. Despite efforts made to maintain flood defense assets, we often observe levee failures that ultimately increase flood risk in protected areas. Furthermore, flood forecasting models, although benefiting from continuous improvements, remain partly inaccurate due to uncertainties arising all along the data calculation processes. At the same time, the year 2007 marks a turning point in history: half of the world population now lives in cities (UN-Habitat, 2007). Moreover, the total urban population is expected to double from two to four billion over the next 30 to 35 years (United Nations, 2006). This growth rate is equivalent to the creation of a new city of one million inhabitants every week during the next four decades [Flood Resilience Group]. This rapid urban development, coupled with technical failures and climate change, has increased flood risk and the corresponding challenges to urban flood risk management [Ashley et al., 2007], [Nie et al., 2009]. These circumstances oblige us to manage flood risk by integrating new concepts like urban resilience. In recent years, resilience has become a central concept for risk management. This concept has emerged because a more resilient system is less vulnerable to risk and, therefore, more sustainable [Serre et al., 2010]. But urban flood resilience is a concept that has not yet been directly assessed. Therefore, when decision makers decide to use the resilience concept to manage urban floods, they have no tool to help them. That is why this paper proposes a methodology to assess urban flood resilience in order to make this concept operational. Networks affect the well-being of the people and the smooth functioning of services and, more generally, of economic activities. Yet
Flood early warning system in I.R. of Iran
International Nuclear Information System (INIS)
Samadi, Slina; Jamali, Javad B.; Javanmard, Soheila
2004-01-01
At the close of the twentieth century, natural hazards and disasters are among the most common forms of disaster around the world. Natural disasters cause significant loss of life and serious economic, environmental and social impacts that greatly retard the development process. Careful hazard assessment and planning, and a range of social, economic and political measures, can significantly contain these threats. Risk is defined as the potential for loss or damage as the result of a particular action or decision, and risk management is a process consisting of well-defined steps which, when taken in sequence, support better decision making by contributing to a greater insight into risks and their impacts. Most commonly, there are three components in a natural disaster plan: monitoring and early warning; risk assessment; and mitigation and response. Given the improved tools and technologies available today, it is possible to provide disaster information and minimize the potential damage of disasters. In the following parts of the report, the national early warning system for floods is discussed as one of the important components of natural disaster risk management. In I. R. of Iran, different types of natural disasters also occur, such as drought, flood, earthquake, sea-level rise, dust storm, hail and freezing, but flood is one of the most frequent and damaging types of natural disaster. Floods have been the most common type of geophysical disaster in the latter half of the twentieth century in Iran, generating an estimated more than 20 percent of all disasters from 1950 to 2003. One of the most hazardous floods in Iran occurred in Golestan and north Khorasan provinces, located in the north-east of the country, in August 2001 and 2002. In this regard, according to the responsibility of the I. R. of Iran Meteorological Organization (IRIMO) for flood forecasting, the early warning for the mentioned flood was issued within 48 hours in
Kinoshita, Youhei; Tanoue, Masahiro; Watanabe, Satoshi; Hirabayashi, Yukiko
2018-01-01
This study represents the first attempt to quantify the effects of autonomous adaptation on the projection of global flood hazards and to assess future flood risk by including this effect. A vulnerability scenario, which varies according to the autonomous adaptation effect for conventional disaster mitigation efforts, was developed based on historical vulnerability values derived from flood damage records and a river inundation simulation. Coupled with general circulation model outputs and future socioeconomic scenarios, potential future flood fatalities and economic losses were estimated. By including the effect of autonomous adaptation, our multimodel ensemble estimates projected a 2.0% decrease in potential flood fatalities and an 821% increase in potential economic losses by 2100 under the highest emission scenario together with a large population increase. Vulnerability changes reduced potential flood consequences by 64%-72% in terms of potential fatalities and 28%-42% in terms of potential economic losses by 2100. Although socioeconomic changes made the greatest contribution to the potential increased consequences of future floods, about half of the increase in potential economic losses was mitigated by autonomous adaptation. There is a clear and positive relationship between the global temperature increase from the pre-industrial level and the estimated mean potential flood economic loss, while there is a negative relationship with potential fatalities due to the autonomous adaptation effect. A bootstrapping analysis suggests a significant increase in potential flood fatalities (+5.7%) without any adaptation if the temperature increases by 1.5 °C-2.0 °C, whereas the increase in potential economic loss (+0.9%) was not significant. Our method enables the effects of autonomous adaptation and additional adaptation efforts on climate-induced hazards to be distinguished, which would be essential for the accurate estimation of the cost of adaptation to
Methodology for flood risk analysis for nuclear power plants
International Nuclear Information System (INIS)
Wagner, D.P.; Casada, M.L.; Fussell, J.B.
1984-01-01
The methodology for flood risk analysis described here addresses the effects of a flood on nuclear power plant safety systems. Combining the results of this method with the probability of a flood allows the effects of flooding to be included in a probabilistic risk assessment. The five-step methodology includes accident sequence screening to focus the detailed analysis efforts on the accident sequences that are significantly affected by a flood event. The quantitative results include the flood's contribution to system failure probability, accident sequence occurrence frequency and consequence category occurrence frequency. The analysis can be added to existing risk assessments without a significant loss in efficiency. The results of two example applications show the usefulness of the methodology. Both examples rely on the Reactor Safety Study for the required risk assessment inputs and present changes in the Reactor Safety Study results as a function of flood probability
Costache, Romulus; Zaharia, Liliana
2017-06-01
Given the significant worldwide human and economic losses caused by floods annually, reducing the negative consequences of these hazards is a major concern in development strategies at different spatial scales. A basic step in flood risk management is identifying areas susceptible to flood occurrences. This paper proposes a methodology for identifying areas with a high potential for accelerated surface run-off and, consequently, for flash-flood occurrence. The methodology involves assessment and mapping, in a GIS environment, of a flash flood potential index (FFPI), integrating two statistical methods: frequency ratio and weights-of-evidence. The methodology was applied to the Bâsca Chiojdului River catchment (340 km2), located in the Carpathian Curvature region of Romania. Firstly, the areas with torrential phenomena were identified and the main factors controlling surface run-off were selected (nine geographical factors were considered in this study). Based on the features of the considered factors, several classes were defined for each of them. In the next step, the weight of each class/category was determined by identifying its spatial relationship with the presence or absence of torrential phenomena. Finally, the weights for each class/category of geographical factors were summed in GIS, yielding the FFPI values for each of the two statistical methods. These values were divided into five intensity classes and mapped. The final results were used to estimate the flash-flood potential and to identify the areas most susceptible to this phenomenon: high and very high FFPI values characterize more than one-third of the study catchment. The result validation was performed by (i) quantifying the proportion of pixels corresponding to the torrential phenomena considered for the study (training area) and for the results' testing (validating area) and (ii) plotting the ROC (receiver operating
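The frequency ratio step described above can be sketched compactly. This is an illustrative implementation under assumed inputs (a per-pixel factor class and a flood/no-flood flag), not the authors' GIS workflow:

```python
from collections import Counter

def frequency_ratios(pixel_classes, flood_flags):
    """Frequency ratio per factor class: the class's share of flood
    pixels divided by its share of all pixels. FR > 1 marks classes
    over-represented among observed torrential phenomena."""
    total_pixels = len(pixel_classes)
    total_floods = sum(flood_flags)
    class_counts = Counter(pixel_classes)
    flood_counts = Counter(c for c, f in zip(pixel_classes, flood_flags) if f)
    return {
        c: (flood_counts.get(c, 0) / total_floods) / (class_counts[c] / total_pixels)
        for c in class_counts
    }

# Summing each pixel's FR values over all nine factors yields its FFPI score.
```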
Flood hazards for nuclear power plants
International Nuclear Information System (INIS)
Yen, B.C.
1988-01-01
Flooding hazards for nuclear power plants may be caused by various external geophysical events. In this paper the hydrologic hazards from flash floods, river floods and heavy rain at the plant site are considered. Depending on the mode of analysis, two types of hazard evaluation are identified: 1) design hazard, the probability of flooding over an expected service period, and 2) operational hazard, which deals with real-time forecasting of the probability of flooding for an incoming event. Hazard evaluation techniques based on flood frequency analysis can only be used for the design hazard (type 1). Techniques based on rainfall-runoff simulation or multi-station correlation can be used for both types of hazard prediction. (orig.)
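The design hazard described above follows directly from the annual exceedance probability, assuming independent years; a minimal sketch for a T-year flood over an n-year service period:

```python
def design_hazard(return_period_years: float, service_years: int) -> float:
    """Probability of at least one exceedance of the T-year flood
    during an n-year service period (independent years assumed)."""
    annual_p = 1.0 / return_period_years
    return 1.0 - (1.0 - annual_p) ** service_years

# A 100-year flood over a 40-year plant service life exceeds
# the naive n/T figure's intuition: the hazard is about 0.33.
hazard = design_hazard(100, 40)
```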
Flash Floods Simulation using a Physical-Based Hydrological Model at Different Hydroclimatic Regions
Saber, Mohamed; Kamil Yilmaz, Koray
2016-04-01
Currently, flash floods are increasing in frequency and severity, affecting many regions of the world. This study therefore focuses on two case studies: Wadi Abu Subeira, Egypt, an arid environment, and the Karpuz basin, Turkey, a Mediterranean environment. The main objective of this work is to simulate flash floods in both catchments, considering the hydrometeorological differences between them, which in turn affect their flash-flood behavior. An integrated methodology incorporating the Hydrological River Basin Environmental Assessment Model (Hydro-BEAM) and remote sensing observations was devised. Global Satellite Mapping of Precipitation (GSMaP) data were compared with the rain gauge networks in the target basins to estimate the bias, in an effort to use the satellite data effectively in flash-flood simulation. Based on the preliminary simulation results for both basins, we found that flash-flood runoff behavior differs owing to climatic, hydrological and topographical conditions. The simulated surface-runoff hydrographs also coincide reasonably with the observed ones. Consequently, mitigation strategies drawing on this study could help reduce flash-flood disasters in different climatic regions. This comparison of basins in different climates also offers insight into the potential impact of climate change on flash-flood frequency and occurrence.
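One simple form the gauge-satellite bias estimate mentioned above can take is a multiplicative factor over matched accumulations. The sketch below is an assumption for illustration; the abstract does not specify the study's actual bias estimator:

```python
def multiplicative_bias(gauge_mm, satellite_mm):
    """Bias factor between co-located gauge and satellite (e.g. GSMaP)
    rainfall accumulations; a factor > 1 means the satellite
    underestimates rainfall relative to the gauges."""
    return sum(gauge_mm) / sum(satellite_mm)

def bias_correct(satellite_mm, factor):
    """Rescale satellite rainfall before driving the hydrological model."""
    return [v * factor for v in satellite_mm]
```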
Dominant wave frequency and amplitude estimation for adaptive control of wave energy converters
Nguyen , Hoai-Nam; Tona , Paolino; Sabiron , Guillaume
2017-01-01
International audience; Adaptive control is of great interest for wave energy converters (WEC) due to the inherent time-varying nature of sea conditions. Robust and accurate estimation algorithms are required to improve the knowledge of the current sea state on a wave-to-wave basis in order to ensure power harvesting as close as possible to optimal behavior. In this paper, we present a simple but innovative approach for estimating the wave force dominant frequency and wave force dominant ampl...
A framework for global river flood risk assessments
Winsemius, H. C.; Van Beek, L. P. H.; Jongman, B.; Ward, P. J.; Bouwman, A.
2013-05-01
There is an increasing need for strategic global assessments of flood risks in current and future conditions. In this paper, we propose a framework for global flood risk assessment for river floods, which can be applied in current conditions, as well as in future conditions due to climate and socio-economic changes. The framework's goal is to establish flood hazard and impact estimates at a high enough resolution to allow for their combination into a risk estimate, which can be used for strategic global flood risk assessments. The framework estimates hazard at a resolution of ~ 1 km2 using global forcing datasets of the current (or in scenario mode, future) climate, a global hydrological model, a global flood-routing model, and more importantly, an inundation downscaling routine. The second component of the framework combines hazard with flood impact models at the same resolution (e.g. damage, affected GDP, and affected population) to establish indicators for flood risk (e.g. annual expected damage, affected GDP, and affected population). The framework has been applied using the global hydrological model PCR-GLOBWB, which includes an optional global flood routing model DynRout, combined with scenarios from the Integrated Model to Assess the Global Environment (IMAGE). We performed downscaling of the hazard probability distributions to 1 km2 resolution with a new downscaling algorithm, applied on Bangladesh as a first case study application area. We demonstrate the risk assessment approach in Bangladesh based on GDP per capita data, population, and land use maps for 2010 and 2050. Validation of the hazard estimates has been performed using the Dartmouth Flood Observatory database. This was done by comparing a high return period flood with the maximum observed extent, as well as by comparing a time series of a single event with Dartmouth imagery of the event. Validation of modelled damage estimates was performed using observed damage estimates from the EM
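One of the risk indicators named above, annual expected damage, is typically obtained by integrating damage over annual exceedance probability. A minimal sketch with hypothetical return-period/damage pairs (the framework's actual impact models are far richer):

```python
def expected_annual_damage(return_periods, damages):
    """Annual expected damage: trapezoidal integration of damage
    over annual exceedance probability p = 1/T."""
    pairs = sorted((1.0 / t, d) for t, d in zip(return_periods, damages))
    ead = 0.0
    for (p0, d0), (p1, d1) in zip(pairs, pairs[1:]):
        ead += 0.5 * (d0 + d1) * (p1 - p0)  # trapezoid on the p-axis
    return ead

# Hypothetical damages (monetary units) for the 2-, 10- and 100-year floods:
ead = expected_annual_damage([2, 10, 100], [0.0, 100.0, 500.0])
```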
Flood loss assessment in the Kota Tinggi
International Nuclear Information System (INIS)
Tam, T H; Ibrahim, A L; Rahman, M Z A; Mazura, Z
2014-01-01
Malaysia is free from most destructive and widespread natural disasters but is frequently affected by floods, which cause massive damage. In 2006 and 2007, extreme rainfall occurred in many parts of Peninsular Malaysia, causing severe flooding in several major cities. Kota Tinggi was chosen as the study area because it is one of the most seriously affected areas in Johor state. The aim of this study is to estimate potential flood damage to physical elements in Kota Tinggi. The flood damage map contains both qualitative and quantitative information corresponding to the consequences of flooding. This study focuses only on physical elements. Three different damage functions, from the United States, the Netherlands and Malaysia, were adopted to calculate the potential flood damage, with flood depth as the main parameter. The estimated flood damage for housing using the United States, Netherlands and Malaysian functions was RM 350/m2, RM 200/m2 and RM 100/m2 respectively. These results show the average flood damage to physical elements. Such information is needed by local authorities and government for urban spatial planning aimed at reducing flood risk
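Depth-damage functions of the kind applied here map flood depth to a damage fraction (or unit cost) by piecewise-linear interpolation between tabulated points. The curve below is hypothetical, not the US, Dutch, or Malaysian curve from the study:

```python
# Hypothetical stage-damage curve: (depth in m, damage as fraction of value)
CURVE = [(0.0, 0.0), (0.5, 0.2), (1.0, 0.4), (2.0, 0.7), (3.0, 1.0)]

def damage_fraction(depth_m, curve=CURVE):
    """Piecewise-linear interpolation on a stage-damage curve,
    clamped at the curve's end points."""
    if depth_m <= curve[0][0]:
        return curve[0][1]
    if depth_m >= curve[-1][0]:
        return curve[-1][1]
    for (d0, f0), (d1, f1) in zip(curve, curve[1:]):
        if d0 <= depth_m <= d1:
            return f0 + (f1 - f0) * (depth_m - d0) / (d1 - d0)

# Damage per m2 for a house flooded to 1.5 m, assuming a value of RM 700/m2:
loss_per_m2 = damage_fraction(1.5) * 700
```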
Teranishi, Masaru; Omatu, Sigeru; Kosaka, Toshihisa
Fatigued monetary bills adversely affect the daily operation of automated teller machines (ATMs). In order to make the classification of fatigued bills more efficient, the development of an automatic fatigued monetary bill classification method is desirable. We propose a new method by which to estimate the fatigue level of monetary bills from the feature-selected frequency band acoustic energy pattern of banking machines. By using a supervised self-organizing map (SOM), we effectively estimate the fatigue level using only the feature-selected frequency band acoustic energy pattern. Furthermore, the feature-selected frequency band acoustic energy pattern improves the estimation accuracy of the fatigue level of monetary bills by adding frequency domain information to the acoustic energy pattern. The experimental results with real monetary bill samples reveal the effectiveness of the proposed method.
Improving Gas Flooding Efficiency
Energy Technology Data Exchange (ETDEWEB)
Reid Grigg; Robert Svec; Zheng Zeng; Alexander Mikhalin; Yi Lin; Guoqiang Yin; Solomon Ampir; Rashid Kassim
2008-03-31
This study focuses on laboratory studies with related analytical and numerical models, as well as work with operators on field tests, to enhance our understanding of and capabilities for more efficient enhanced oil recovery (EOR). Much of the work has been performed at reservoir conditions, using a bubble chamber and several core-flood apparatus developed or modified to measure interfacial tension (IFT), critical micelle concentration (CMC), foam durability, surfactant sorption at reservoir conditions, and pressure and temperature effects on foam systems. Carbon dioxide and N2 systems have been considered, under both miscible and immiscible conditions. The injection of CO2 into brine-saturated sandstone and carbonate cores resulted in brine saturation reductions in the range of 62 to 82% in the tests presented in this paper. In each test, over 90% of the reduction occurred with less than 0.5 PV of CO2 injected, with very little additional brine production thereafter. Adsorption of all considered surfactants is a significant problem. Most of the effect is reversible, but the amount required for foaming is large in terms of volume and cost for all surfactants considered. Some foams increase resistance beyond what is practical in the reservoir. Sandstone, limestone, and dolomite core samples were tested. Dissolution of reservoir rock and/or cement, especially carbonates, under the acid conditions of CO2 injection is a potential problem in CO2 injection into geological formations. Another potential change in reservoir injectivity and productivity is the precipitation of dissolved carbonates as the brine flows and pressure decreases. This report provides methods for determining surfactant sorption, which can be used to help determine surfactant requirements for a CO2-foam flood for mobility control. It also provides data to be used to determine rock permeability
Impact of the Three-Gorges Dam and water transfer project on Changjiang floods
Nakayama, Tadanobu; Shankman, David
2013-01-01
Increasing frequency of severe floods on the middle and lower Changjiang (Yangtze) River during the past few decades can be attributed both to abnormal monsoon rainfall and to landscape changes, including extensive deforestation affecting river sedimentation, shrinking lakes, and levee construction that reduced the areas available for floodwater storage. The Three-Gorges Dam (TGD) and the South-to-North Water Transfer Project (SNWTP) will also affect the frequency and intensity of severe floods in the Poyang Lake region of the middle Changjiang. The process-based National Integrated Catchment-based Eco-hydrology (NICE) model predicts that the TGD will increase flood risk during the early summer monsoon, contrary to the original justifications for building the dam, owing to complex river-lake-groundwater interactions. Several scenarios predict that morphological change will increase flood risk around the lake. This indicates the importance of managing both flood discharge and sediment deposition for the entire basin. Further, the authors assessed the impact of sand mining in the lake after its prohibition on the Changjiang, and showed that an alternative scenario of sand mining in lakes currently disconnected from the mainstream would reduce the flood risk to a greater extent than intensive dredging along the junction channel. Because dry biomasses simulated by the model were linearly related to the Time-Integrated Normalized Difference Vegetation Index (TINDVI) estimated from satellite images, its decadal gradient during 1982-1999 showed a spatially heterogeneous distribution and generally decreasing trends beside the lakes, indicating that increases in lake reclamation and the resultant decrease in rice productivity are closely related to the hydrologic changes. This integrated approach could help to minimize flood damage and promote better decisions addressing sustainable development.
Robust Pitch Estimation Using an Optimal Filter on Frequency Estimates
DEFF Research Database (Denmark)
Karimian-Azari, Sam; Jensen, Jesper Rindom; Christensen, Mads Græsbøll
2014-01-01
of such signals from unconstrained frequency estimates (UFEs). A minimum variance distortionless response (MVDR) method is proposed as an optimal solution to minimize the variance of UFEs considering the constraint of integer harmonics. The MVDR filter is designed based on noise statistics making it robust...
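The MVDR filter itself is beyond a short sketch, but the integer-harmonic constraint it enforces can be illustrated with a weighted least-squares fusion of unconstrained frequency estimates (UFEs), weighting each harmonic estimate f_k ≈ k·f0 by the inverse of an assumed known variance. This is a simplified stand-in, not the paper's filter:

```python
def fuse_pitch(ufes_hz, variances):
    """Weighted least-squares estimate of the fundamental f0 from
    unconstrained harmonic frequency estimates f_k ~ k * f0
    (k = 1, 2, ...), each weighted by 1/variance."""
    num = sum(k * f / v for k, (f, v) in enumerate(zip(ufes_hz, variances), start=1))
    den = sum(k * k / v for k, (_, v) in enumerate(zip(ufes_hz, variances), start=1))
    return num / den

# Three noisy harmonic estimates of a ~100 Hz fundamental, equal variances:
f0 = fuse_pitch([100.5, 199.0, 301.5], [1.0, 1.0, 1.0])
```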
Cigrand, Charles V.
2018-03-26
The U.S. Geological Survey (USGS), in cooperation with the city of West Branch and the Herbert Hoover National Historic Site of the National Park Service, assessed flood-mitigation scenarios within the West Branch Wapsinonoc Creek watershed. The scenarios are intended to demonstrate several means of decreasing peak streamflows and improving the conveyance of overbank flows from the West Branch Wapsinonoc Creek and its tributary Hoover Creek where they flow through the city and the Herbert Hoover National Historic Site located within the city. Hydrologic and hydraulic models of the watershed were constructed to assess the flood-mitigation scenarios. To accomplish this, the models used the U.S. Army Corps of Engineers Hydrologic Engineering Center-Hydrologic Modeling System (HEC–HMS) version 4.2 to simulate the amount of runoff and streamflow produced from single rain events. The Hydrologic Engineering Center-River Analysis System (HEC–RAS) version 5.0 was then used to construct an unsteady-state model that may be used for routing streamflows, mapping areas that may be inundated during floods, and simulating the effects of different measures taken to decrease the effects of floods on people and infrastructure. Both models were calibrated to three historic rainfall events that produced peak streamflows ranging between the 2-year and 10-year flood-frequency recurrence intervals at the USGS streamgage (05464942) on Hoover Creek, using data from two USGS streamgages along with surveyed high-water marks from one of the events. The calibrated HEC–HMS model was then used to simulate streamflows from design rainfall events of 24-hour duration ranging from a 20-percent to a 1-percent annual exceedance probability. These simulated streamflows were incorporated into the HEC–RAS model. The unsteady-state HEC–RAS model was calibrated to represent existing conditions within the watershed. HEC–RAS model simulations with the
Distillation Column Flooding Predictor
Energy Technology Data Exchange (ETDEWEB)
George E. Dzyacky
2010-11-23
The Flooding Predictor™ is a patented advanced control technology proven in research at the Separations Research Program, University of Texas at Austin, to increase distillation column throughput by over 6%, while also increasing energy efficiency by 10%. The research was conducted under a U. S. Department of Energy Cooperative Agreement awarded to George Dzyacky of 2ndpoint, LLC. The Flooding Predictor™ works by detecting the incipient flood point and controlling the column closer to its actual hydraulic limit than historical practices have allowed. Further, the technology uses existing column instrumentation, meaning no additional refining infrastructure is required. Refiners often push distillation columns to maximize throughput, improve separation, or simply to achieve day-to-day optimization. Attempting to achieve such operating objectives is a tricky undertaking that can result in flooding. Operators and advanced control strategies alike rely on the conventional use of delta-pressure instrumentation to approximate the column’s approach to flood. But column delta-pressure is more an inference of the column’s approach to flood than it is an actual measurement of it. As a consequence, delta pressure limits are established conservatively in order to operate in a regime where the column is never expected to flood. As a result, there is much “left on the table” when operating in such a regime, i.e. the capacity difference between controlling the column to an upper delta-pressure limit and controlling it to the actual hydraulic limit. The Flooding Predictor™, an innovative pattern recognition technology, controls columns at their actual hydraulic limit, which research shows leads to a throughput increase of over 6%. Controlling closer to the hydraulic limit also permits operation in a sweet spot of increased energy-efficiency. In this region of increased column loading, the Flooding Predictor is able to exploit the benefits of higher liquid
Akyurek, Z.; Bozoglu, B.; Girayhan, T.
2015-12-01
Flooding has the potential to cause significant impacts to economic activities as well as to disrupt or displace populations. Changing climate regimes, such as extreme precipitation events, increase flood vulnerability and put additional stresses on infrastructure. In this study, flood modelling was performed for an urbanized area, Samsun-Terme, in the Black Sea region of Turkey. MIKE21 with a flexible grid was used for two-dimensional shallow water flow modelling. Maps at 1/1000 scale, including buildings, were used for the urbanized area and 1/5000 scale maps for the rural parts to obtain the DTM needed for flood modelling. The bathymetry of the river was obtained from additional surveys. The main river passing through the urbanized area has a capacity corresponding to the 5-year flood (Q5), according to a design discharge estimated for the ungauged catchment from catchment area alone. The effects of existing structures, such as bridges across the river, on flooding are presented. Upstream structural measures were studied on a scenario basis. Four sub-catchments of the Terme River are considered as contributing to the downstream flooding. Under existing circumstances, the meanders of the Terme River have a major effect on the flood situation and lead to an approximately 35% reduction in peak discharge between the upstream and downstream reaches. It is observed that if the flow from the upstream catchments is retarded through detention ponds constructed in at least two of the upstream catchments, the estimated Q100 flood can be conveyed by the river without overtopping the river channel. The operation of the upstream detention ponds and the scenarios to convey Q500 without causing flooding are also presented. Structural management measures to address changes in flood characteristics in water management planning are discussed. Flood risk is obtained by using flood hazard maps and water depth-damage functions plotted for a variety of building types and occupancies
Analysis of magnitude and duration of floods and droughts in the context of climate change
Eshetu Debele, Sisay; Bogdanowicz, Ewa; Strupczewski, Witold
2016-04-01
Research and scientific information are key elements of any decision-making process. There is also a strong need for tools to describe and compare, in a concise way, the regime of hydrological extreme events in the context of presumed climate change. To meet these demands, two complementary methods for estimating high- and low-flow frequency characteristics are proposed. Both methods deal with the duration and magnitude of extreme events. The first, "flow-duration-frequency" (known as QdF), has already been applied successfully to low-flow analysis, flood flows and rainfall intensity. The second, "duration-flow-frequency" (DqF), was proposed by Strupczewski et al. in 2010 for flood frequency analysis. The two methods differ in their treatment of flow and duration. In the QdF method the duration (d consecutive days) is a chosen fixed value, and the frequency analysis concerns the annual or seasonal series of mean flows exceeded (in the case of floods) or not exceeded (in the case of droughts) within the d-day period. In the second method, DqF, the flows are treated as fixed thresholds, and the durations of flows exceeding (floods) or not exceeding (droughts) these thresholds are the subject of frequency analysis. The comparison of characteristics of floods and droughts in the reference period and under future climate conditions for catchments studied within the CHIHE project is presented, and a simple way to show the results to non-professionals and decision-makers is proposed. The work was undertaken within the project "Climate Change Impacts on Hydrological Extremes (CHIHE)", which is supported by the Norway-Poland Grants Program administered by the Norwegian Research Council. The observed time series were provided by the Institute of Meteorology and Water Management (IMGW), Poland. Strupczewski, W. G., Kochanek, K., Markiewicz, I., Bogdanowicz, E., Weglarczyk, S., & Singh, V. P. (2010). On the Tails of Distributions of Annual Peak Flow. Hydrology Research, 42, 171
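The QdF series described above (annual maxima of d-day mean flows) and a subsequent frequency fit can be sketched as follows; the Gumbel method-of-moments fit is an illustrative choice, not necessarily the distribution used in the CHIHE project:

```python
import math
from statistics import mean, stdev

def qdf_annual_maxima(daily_flows_by_year, d):
    """Annual maxima of the d-day moving-average flow (the flood QdF
    series); for droughts, min() of the moving average would be used."""
    maxima = []
    for year in daily_flows_by_year:
        means = [sum(year[i:i + d]) / d for i in range(len(year) - d + 1)]
        maxima.append(max(means))
    return maxima

def gumbel_quantile(sample, return_period):
    """T-year quantile from a Gumbel distribution fitted by moments."""
    beta = stdev(sample) * math.sqrt(6) / math.pi
    mu = mean(sample) - 0.5772 * beta  # 0.5772: Euler-Mascheroni constant
    return mu - beta * math.log(-math.log(1.0 - 1.0 / return_period))
```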
Valent, Peter; Paquet, Emmanuel
2017-09-01
A reliable estimate of extreme flood characteristics has always been an active topic in hydrological research. Over the decades a large number of approaches and their modifications have been proposed and used, with methods utilizing continuous simulation of catchment runoff being the subject of the most intensive research in the last decade. In this paper a new and promising stochastic semi-continuous method is used to estimate extreme discharges in two mountainous Slovak catchments of the rivers Váh and Hron, in which snow-melt processes need to be taken into account. The SCHADEX method used here couples a probabilistic precipitation model with a rainfall-runoff model, used both to continuously simulate catchment hydrological conditions and to transform generated synthetic rainfall events into corresponding discharges. The stochastic nature of the method means that a wide range of synthetic rainfall events is simulated on various historical catchment conditions, taking into account not only the saturation of the soil but also the amount of snow accumulated in the catchment. The results showed that the SCHADEX extreme discharge estimates with return periods of up to 100 years were comparable to those estimated by statistical approaches. In addition, two reconstructed historical floods with corresponding return periods of 100 and 1000 years were compared to the SCHADEX estimates. The results confirmed the usability of the method for estimating design discharges with a recurrence interval of more than 100 years and its applicability in Slovak conditions.
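The stochastic coupling described above can be caricatured in a few lines: synthetic rainfall events are drawn at random, crossed with randomly sampled catchment saturation states, and the design discharge is read from the empirical distribution of simulated peaks. Everything here (the distributions and the runoff transform) is a toy stand-in for SCHADEX's actual probabilistic and rainfall-runoff models:

```python
import random

def semi_continuous_quantile(return_period, events_per_year=20,
                             n_events=200_000, seed=42):
    """Toy SCHADEX-style estimate: empirical T-year peak from synthetic
    rainfall events applied to random catchment wetness states."""
    rng = random.Random(seed)
    peaks = []
    for _ in range(n_events):
        rain_mm = rng.expovariate(1 / 20.0)    # synthetic event rainfall depth
        wetness = rng.random()                 # soil/snow saturation state, 0..1
        runoff_coeff = 0.1 + 0.8 * wetness     # wetter catchment -> more runoff
        peaks.append(rain_mm * runoff_coeff)   # surrogate peak discharge
    peaks.sort()
    # rank of the peak whose exceedance probability matches 1/T per year
    rank = int(len(peaks) * (1.0 - 1.0 / (return_period * events_per_year)))
    return peaks[rank]
```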
Directory of Open Access Journals (Sweden)
B. Büchele
2006-01-01
Full Text Available Currently, a shift from classical flood protection as engineering task towards integrated flood risk management concepts can be observed. In this context, a more consequent consideration of extreme events which exceed the design event of flood protection structures and failure scenarios such as dike breaches have to be investigated. Therefore, this study aims to enhance existing methods for hazard and risk assessment for extreme events and is divided into three parts. In the first part, a regionalization approach for flood peak discharges was further developed and substantiated, especially regarding recurrence intervals of 200 to 10 000 years and a large number of small ungauged catchments. Model comparisons show that more confidence in such flood estimates for ungauged areas and very long recurrence intervals may be given as implied by statistical analysis alone. The hydraulic simulation in the second part is oriented towards hazard mapping and risk analyses covering the whole spectrum of relevant flood events. As the hydrodynamic simulation is directly coupled with a GIS, the results can be easily processed as local inundation depths for spatial risk analyses. For this, a new GIS-based software tool was developed, being presented in the third part, which enables estimations of the direct flood damage to single buildings or areas based on different established stage-damage functions. Furthermore, a new multifactorial approach for damage estimation is presented, aiming at the improvement of damage estimation on local scale by considering factors like building quality, contamination and precautionary measures. The methods and results from this study form the base for comprehensive risk analyses and flood management strategies.