WorldWideScience

Sample records for demonstration flood big

  1. Thirty Years Later: Reflections of the Big Thompson Flood, Colorado, 1976 to 2006

    Science.gov (United States)

    Jarrett, R. D.; Costa, J. E.; Brunstein, F. C.; Quesenberry, C. A.; Vandas, S. J.; Capesius, J. P.; O'Neill, G. B.

    2006-12-01

    Thirty years ago, more than 300 mm of rain fell in about 4 to 6 hours in the middle reaches of the Big Thompson River Basin during the devastating flash flood of July 31, 1976. The rainstorm produced flood discharges that exceeded 40 m3/s/km2, and a peak discharge of 883 m3/s was estimated at the Big Thompson River near Drake streamflow-gaging station. The raging waters left 144 people dead and 250 injured, and over 800 people had to be evacuated by helicopter. Four hundred eighteen homes and businesses were destroyed, as well as 438 automobiles, and damage to infrastructure left the canyon reachable only by helicopter. Total damage was estimated in excess of $116 million (2006 dollars). Natural hazards similar to the Big Thompson flood are rare, but the probability of a similar event hitting the Front Range, other parts of Colorado, or other parts of the Nation is real. Although much smaller in scale than the Big Thompson flood, several flash floods during the monsoon in early July 2006 in the Colorado foothills reemphasized the hazards associated with flash flooding. The U.S. Geological Survey (USGS) conducts flood research to help understand and predict the magnitude and likelihood of large streamflow events such as the Big Thompson flood. This summary discusses the hydrologic conditions of the 1976 flood, what that flood can teach us about flash floods, some of the advances in USGS flood science made as a consequence of the disaster, and lessons learned to help reduce loss of life in extraordinary flash floods. In the 30 years since the Big Thompson flood, there have been important advances in streamflow monitoring and flood warning. The National Weather Service (NWS) NEXRAD radar allows real-time monitoring of precipitation in most places in the United States. The USGS currently (2006) operates about 7,250 real-time streamflow-gaging stations in the United States, which are monitored by the USGS, the NWS, and emergency managers.

  2. Flood-inundation maps for a 12.5-mile reach of Big Papillion Creek at Omaha, Nebraska

    Science.gov (United States)

    Strauch, Kellan R.; Dietsch, Benjamin J.; Anderson, Kayla J.

    2016-03-22

    Digital flood-inundation maps for a 12.5-mile reach of the Big Papillion Creek from 0.6 mile upstream from the State Street Bridge to the 72nd Street Bridge in Omaha, Nebraska, were created by the U.S. Geological Survey (USGS) in cooperation with the Papio-Missouri River Natural Resources District. The flood-inundation maps, which can be accessed through the USGS Flood Inundation Mapping Science Web site at http://water.usgs.gov/osw/flood_inundation/, depict estimates of the areal extent and depth of flooding corresponding to selected water levels (stages) at the USGS streamgage on the Big Papillion Creek at Fort Street at Omaha, Nebraska (station 06610732). Near-real-time stages at this streamgage may be obtained on the Internet from the USGS National Water Information System at http://waterdata.usgs.gov/ or the National Weather Service Advanced Hydrologic Prediction Service at http://water.weather.gov/ahps/, which also forecasts flood hydrographs at this site.

  3. Assessment of big floods in the Eastern Black Sea Basin of Turkey.

    Science.gov (United States)

    Yüksek, Ömer; Kankal, Murat; Üçüncü, Osman

    2013-01-01

    In this study, general knowledge and some details of floods in the Eastern Black Sea Basin of Turkey are presented. A brief hydro-meteorological analysis of nine selected floods and a detailed analysis of the greatest flood are given. In the studied area, 51 big floods took place between 1955 and 2005, causing 258 deaths and nearly US $500,000,000 in damage. Most of the floods occurred in June, July, and August. For the rainstorms that caused especially significant damage, the return periods of the rainfall depths and of the resultant flood discharges reached 250 and 500 years, respectively. A general agreement is observed between the return periods of the rains and of the resultant floods. It is concluded that there has been no significant climate change that would increase flood damage. The most important human factors increasing the damage are improper and illegal land use, deforestation, improper urbanization and settlement, and psychological and technical factors. Some structural and non-structural measures to mitigate flood damage are also included in the paper. Structural measures include dykes and flood levees. The main non-structural measures include a flood warning system, modification of land use, watershed management and improvement, flood insurance, organization of flood management studies, coordination between related institutions, and education and informing of the people and stakeholders.

  4. Effectiveness and reliability of emergency measures for flood prevention

    NARCIS (Netherlands)

    Lendering, K.T.; Jonkman, S.N.; Kok, M.

    2014-01-01

    Floods in the summer of 2013 in Central Europe demonstrated once again that floods account for a large part of damage and loss of life caused by natural disasters. During flood threats emergency measures, such as sand bags and big bags, are often applied to strengthen the flood defences and attempt

  5. Flood-inundation maps for Big Creek from the McGinnis Ferry Road bridge to the confluence of Hog Wallow Creek, Alpharetta and Roswell, Georgia

    Science.gov (United States)

    Musser, Jonathan W.

    2015-08-20

    Digital flood-inundation maps for a 12.4-mile reach of Big Creek that extends from 260 feet above the McGinnis Ferry Road bridge to the U.S. Geological Survey (USGS) streamgage at Big Creek below Hog Wallow Creek at Roswell, Georgia (02335757), were developed by the USGS in cooperation with the cities of Alpharetta and Roswell, Georgia. The inundation maps, which can be accessed through the USGS Flood Inundation Mapping Science Web site at http://water.usgs.gov/osw/flood_inundation/, depict estimates of the areal extent and depth of flooding corresponding to selected water levels (stages) at the USGS streamgage at Big Creek near Alpharetta, Georgia (02335700). Real-time stage information from this USGS streamgage may be obtained at http://waterdata.usgs.gov/ and can be used in conjunction with these maps to estimate near real-time areas of inundation. The National Weather Service (NWS) is incorporating results from this study into the Advanced Hydrologic Prediction Service (AHPS) flood-warning system (http://water.weather.gov/ahps/). The NWS forecasts flood hydrographs for many streams where the USGS operates streamgages and provides flow data. The forecasted peak-stage information for the USGS streamgage at Big Creek near Alpharetta (02335700), available through the AHPS Web site, may be used in conjunction with the maps developed for this study to show predicted areas of flood inundation.

  6. Technical note: River modelling to infer flood management framework

    African Journals Online (AJOL)

    River hydraulic models have successfully identified the weaknesses and areas for improvement with respect to flooding in the Sarawak River system, and can also be used to support decisions on flood management measures. Often, the big question is 'how'. This paper demonstrates a theoretical flood management ...

  7. Peak discharge, flood frequency, and peak stage of floods on Big Cottonwood Creek at U.S. Highway 50 near Coaldale, Colorado, and Fountain Creek below U.S. Highway 24 in Colorado Springs, Colorado, 2016

    Science.gov (United States)

    Kohn, Michael S.; Stevens, Michael R.; Mommandi, Amanullah; Khan, Aziz R.

    2017-12-14

    The U.S. Geological Survey (USGS), in cooperation with the Colorado Department of Transportation, determined the peak discharge, annual exceedance probability (flood frequency), and peak stage of two floods that took place on Big Cottonwood Creek at U.S. Highway 50 near Coaldale, Colorado (hereafter referred to as “Big Cottonwood Creek site”), on August 23, 2016, and on Fountain Creek below U.S. Highway 24 in Colorado Springs, Colorado (hereafter referred to as “Fountain Creek site”), on August 29, 2016. A one-dimensional hydraulic model was used to estimate the peak discharge. To define the flood frequency of each flood, peak-streamflow regional-regression equations or statistical analyses of USGS streamgage records were used to estimate annual exceedance probability of the peak discharge. A survey of the high-water mark profile was used to determine the peak stage, and the limitations and accuracy of each component also are presented in this report. Collection and computation of flood data, such as peak discharge, annual exceedance probability, and peak stage at structures critical to Colorado’s infrastructure are an important addition to the flood data collected annually by the USGS. The peak discharge of the August 23, 2016, flood at the Big Cottonwood Creek site was 917 cubic feet per second (ft3/s) with a measurement quality of poor (uncertainty plus or minus 25 percent or greater). The peak discharge of the August 29, 2016, flood at the Fountain Creek site was 5,970 ft3/s with a measurement quality of poor (uncertainty plus or minus 25 percent or greater). The August 23, 2016, flood at the Big Cottonwood Creek site had an annual exceedance probability of less than 0.01 (return period greater than the 100-year flood) and had an annual exceedance probability of greater than 0.005 (return period less than the 200-year flood). The August 23, 2016, flood event was caused by a precipitation event having an annual exceedance probability of 1.0 (return
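
    The annual exceedance probabilities quoted above are simply reciprocals of the return periods (the 0.01 and 0.005 thresholds correspond to the 100- and 200-year floods). A minimal sketch of that arithmetic, including the often-overlooked probability of seeing at least one such flood over a multi-year horizon:

```python
# Annual exceedance probability (AEP) and return period T are reciprocals:
# AEP = 1 / T. The chance of at least one exceedance in n years is
# 1 - (1 - AEP)**n. The values mirror the 100- and 200-year floods above.

def aep(return_period_years: float) -> float:
    """Annual exceedance probability for a T-year flood."""
    return 1.0 / return_period_years

def prob_at_least_one(return_period_years: float, horizon_years: int) -> float:
    """Probability of at least one exceedance within the horizon."""
    p = aep(return_period_years)
    return 1.0 - (1.0 - p) ** horizon_years

print(aep(100))   # 0.01  (the 100-year flood)
print(aep(200))   # 0.005 (the 200-year flood)
print(round(prob_at_least_one(100, 30), 2))  # ~0.26 over 30 years
```

    Note that a "100-year flood" is far from a once-in-a-lifetime event: over a 30-year span the chance of at least one exceedance is roughly one in four.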

  8. Coastal Flooding in Florida's Big Bend Region with Application to Sea Level Rise Based on Synthetic Storms Analysis

    Directory of Open Access Journals (Sweden)

    Scott C. Hagen; Peter Bacopoulos

    2012-01-01

    Flooding is examined by comparing maximum envelopes of water against the 0.2% (1-in-500-year) return-period flooding surface generated as part of revising the Federal Emergency Management Agency's flood insurance rate maps for Franklin, Wakulla, and Jefferson counties in Florida's Big Bend Region. The analysis condenses the number of storms to a small fraction of the original 159 used in production. The analysis is performed by assessing which synthetic storms contributed to inundation extent (the extent of inundation into the floodplain), coverage (the overall surface area of the inundated floodplain), and the spatially variable 0.2% flooding surface. The results are interpreted in terms of storm attributes (pressure deficit, radius to maximum winds, translation speed, storm heading, and landfall location) and the physical processes occurring within the natural system (storm surge and waves); both are contextualized against existing and new hurricane scales. The approach identifies what types of storms and storm attributes lead to what types of inundation, as measured in terms of extent and coverage, in Florida's Big Bend Region, and provides a basis for identifying a select subset of synthetic storms for studying the impact of sea level rise. The sea level rise application provides a clear contrast between a dynamic approach and a static approach.

  9. From Big Data to Small Transportable Products for Decision Support for Floods in Namibia

    Science.gov (United States)

    Mandl, D.; Frye, S.; Cappelaere, P.; Policelli, F.; Handy, M.; Sohlberg, R. A.; Grossman, R.

    2013-12-01

    During the past four years, a team from NASA, Oklahoma University, the University of Maryland, and the University of Chicago, in collaboration with the Namibia Hydrological Services (NHS), has explored ways to provide decision support products for floods. The products draw on a variety of data, including a hydrological model, ground measurements such as river gauges, and earth remote sensing data. This presentation highlights the lessons learned in acquiring, storing, and managing big data on the cloud and turning it into relevant products for GEOSS users. Technology that has been explored includes the use of Hadoop/MapReduce and Accumulo to process and manage the large data sets. OpenStreetMap was explored for cataloging water boundaries and enabling collaborative mapping of the base water mask and floods. A Flood Dashboard was created to customize displays of various data products. Finally, a higher-level Geo-Social Application Programming Interface (API) was developed so that users can discover and generate products dynamically for their specific needs/societal benefit areas and then share them with their Community of Practice over social networks. Results of this experiment include a 100x reduction in the size of some flood products, making it possible to distribute these products to mobile platforms and/or bandwidth-limited users.

  10. The ordered network structure and its prediction for the big floods of the Changjiang River Basins

    Energy Technology Data Exchange (ETDEWEB)

    Men, Ke-Pei; Zhao, Kai; Zhu, Shu-Dan [Nanjing Univ. of Information Science and Technology, Nanjing (China). College of Mathematics and Statistics

    2013-12-15

    According to the latest statistical hydrological data, a total of 21 floods took place over the Changjiang (Yangtze) River Basins from 1827 to 2012, showing an obvious commensurable orderliness. Guided by the information forecasting theory of Wen-Bo Weng and building on previous research results, we combine ordered analysis with complex network techniques to summarize the ordered network structure of the Changjiang floods, supplement new information, further optimize the networks, construct the 2D- and 3D-ordered network structures, and carry out prediction research. The predictions show that future big deluges will probably occur over the Changjiang River Basin around 2013-2014, 2020-2021, 2030, 2036, 2051, and 2058. (orig.)

  11. The ordered network structure and its prediction for the big floods of the Changjiang River Basins

    International Nuclear Information System (INIS)

    Men, Ke-Pei; Zhao, Kai; Zhu, Shu-Dan

    2013-01-01

    According to the latest statistical hydrological data, a total of 21 floods took place over the Changjiang (Yangtze) River Basins from 1827 to 2012, showing an obvious commensurable orderliness. Guided by the information forecasting theory of Wen-Bo Weng and building on previous research results, we combine ordered analysis with complex network techniques to summarize the ordered network structure of the Changjiang floods, supplement new information, further optimize the networks, construct the 2D- and 3D-ordered network structures, and carry out prediction research. The predictions show that future big deluges will probably occur over the Changjiang River Basin around 2013-2014, 2020-2021, 2030, 2036, 2051, and 2058. (orig.)

  12. The geomorphic effectiveness of a large flood on the Rio Grande in the Big Bend region: insights on geomorphic controls and post-flood geomorphic response

    Science.gov (United States)

    Dean, David J.; Schmidt, John C.

    2013-01-01

    Since the 1940s, the Rio Grande in the Big Bend region has undergone long periods of channel narrowing, occasionally interrupted by rare, large floods that widen the channel (termed a channel reset). The most recent channel reset occurred in 2008, following a 17-year period of extremely low stream flow and rapid channel narrowing. Flooding was caused by precipitation associated with the remnants of tropical depression Lowell in the Rio Conchos watershed, the largest tributary to the Rio Grande. Floodwaters approached 1500 m3/s (between a 13- and 15-year recurrence interval) and breached levees, inundated communities, and flooded the alluvial valley of the Rio Grande; the wetted width exceeded 2.5 km in some locations. The 2008 flood had the 7th largest magnitude of record but conveyed a larger volume of water than any other flood. Because of the narrow pre-flood channel conditions, record flood stages occurred. We used pre- and post-flood aerial photographs, channel and floodplain surveys, and one-dimensional hydraulic models to quantify the magnitude of channel change, investigate the controls on flood-induced geomorphic changes, and measure the post-flood response of the widened channel. These analyses show that geomorphic changes included channel widening, meander migration, avulsions, extensive bar formation, and vertical floodplain accretion. Reach-averaged channel widening of between 26 and 52% occurred, but in some localities it exceeded 500%. The degree and style of channel response was related to, but not limited to, three factors: 1) bed-load supply and transport, 2) pre-flood channel planform, and 3) rapid declines in specific stream power downstream of constrictions and areas of high channel-bed slope. The post-flood channel response has consisted of channel contraction through aggradation of the channel bed and the formation of fine-grained benches inset within the widened channel margins. The most significant post-flood geomorphic
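
    The specific stream power invoked above as a control on geomorphic change has a simple standard definition, omega = rho * g * Q * S / w (W/m2). A minimal sketch, with all numeric values purely illustrative rather than taken from the Rio Grande study:

```python
# Specific stream power omega = rho * g * Q * S / w, in W/m^2.
# rho: water density [kg/m^3], g: gravity [m/s^2], Q: discharge [m^3/s],
# S: energy slope [-], w: channel width [m]. Values below are illustrative.

def specific_stream_power(discharge_m3_s: float, slope: float,
                          width_m: float, rho: float = 1000.0,
                          g: float = 9.81) -> float:
    """Stream power per unit bed area [W/m^2]."""
    return rho * g * discharge_m3_s * slope / width_m

# A flow expanding from a narrow reach into a wide one loses specific
# stream power, consistent with the 'rapid declines downstream of
# constrictions' noted in the abstract:
narrow = specific_stream_power(1500.0, 0.001, 100.0)
wide = specific_stream_power(1500.0, 0.001, 1000.0)
print(narrow > wide)  # True
```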

  13. Processing big remote sensing data for fast flood detection in a distributed computing environment

    Directory of Open Access Journals (Sweden)

    A. Olasz

    2017-07-01

    The Earth observation (EO) missions of the space agencies and the space industry (ESA, NASA, national and commercial companies) are evolving as never before. These missions develop and launch next-generation series of satellites and sensors and often provide huge amounts of data, even free of charge, to enable novel monitoring services. The geospatial sector faces new challenges in storing, processing, and visualizing these data, which reach the level of Big Data in their volume, variety, and velocity, along with the need for multi-source spatio-temporal geospatial data processing. Handling and analysis of remote sensing data has always been a cumbersome task due to the ever-increasing size and frequency of the collected information. This paper presents the achievements of the IQmulus EU FP7 research and development project with respect to the processing and analysis of geospatial big data in the context of flood and waterlogging detection.
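
    A common building block of satellite-based flood detection of the kind described above is per-pixel water masking by a spectral index. The sketch below uses NDWI thresholding; the IQmulus pipeline is not described at this level of detail, so the band values and the 0.0 threshold are illustrative assumptions, not the project's actual method:

```python
# Hedged sketch: water masking of a raster tile via the Normalized
# Difference Water Index, NDWI = (Green - NIR) / (Green + NIR).
# Pixels with NDWI above a threshold are flagged as likely water.
import numpy as np

def ndwi(green: np.ndarray, nir: np.ndarray) -> np.ndarray:
    """NDWI per pixel; guards against division by zero."""
    g = green.astype(np.float64)
    n = nir.astype(np.float64)
    return (g - n) / np.maximum(g + n, 1e-9)

def water_mask(green: np.ndarray, nir: np.ndarray,
               threshold: float = 0.0) -> np.ndarray:
    """Boolean mask: True where the pixel is likely water."""
    return ndwi(green, nir) > threshold

# Tiny illustrative 2x2 tile of surface reflectances:
green = np.array([[0.30, 0.05], [0.20, 0.02]])
nir   = np.array([[0.10, 0.40], [0.05, 0.30]])
print(water_mask(green, nir))  # [[ True False], [ True False]]
```

    In a distributed setting the same function would simply be mapped over tiles; the per-tile computation is embarrassingly parallel, which is what makes it a good fit for MapReduce-style processing.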

  14. WBP: The wood Brazilian BIG-GT demonstration project

    Energy Technology Data Exchange (ETDEWEB)

    Carpentieri, E. [Companhia Hidro Eletrica do Sao Francisco, Recife (Brazil)

    1993-12-31

    Brazil is one of the leading countries in the use of renewable energy. Most of its electricity comes from hydro power; about 200,000 barrels a day of ethanol from sugar cane are used as fuel; and around 38% of pig iron production and 20% of steel production use charcoal as a reducing medium. Located in the tropics, with the sun shining all year round, and with its vast territory, the country may be regarded as having all the basic conditions to develop a modern biomass-for-electricity industry. The conjunction of these characteristics with the need to develop new energy resources for electricity production in the Northeast of the country, the results of studies made by Princeton University, Shell, and Chesf, the progress achieved by BIG-GT (Biomass Integrated Gasification Gas Turbine) technology in Europe, and the organization of the Global Environment Facility (GEF) provided a unique opportunity for the implementation of a commercial demonstration in Brazil. This paper describes the idea, the scope, the technical challenges, and the actual status of development of the WBP, a project which aims to demonstrate the commercial viability of BIG-GT technology. It also highlights the project management structure; the roles of the GEF, the World Bank, and the United Nations Development Program (UNDP); and the participation of the Brazilian Federal Government through the Ministry of Science and Technology (MCT). Finally, it describes the participants (ELETROBRAS, CVRD, CIENTEC, SHELL, and CHESF), their roles in the project, and how the group was formed and operates.

  15. The analysis on the flood property of Weihe River in 2003

    International Nuclear Information System (INIS)

    Liu Longqing; Jiang Xinhui

    2004-01-01

    From the end of August to October 2003, serious rainfall occurred over the Weihe River, the largest tributary of the Yellow River. The rainfall was historically rare for its long duration in the Weihe River valley, so that five successive floods formed at the controlling hydrological station, Huaxian. These floods overflowed the floodplain, broke the dykes, and inundated a large area of the lower Weihe River. The natural disaster forced nearly 200,000 people to leave their homes and caused serious economic losses. The durations of the floods were long, the water levels were high, and the flood volumes were large, all of which is rare in the historical record. The flood peak at Huaxian station reached 3570 m3/s, the biggest peak since 1992. In recent years, because big floods on the Weihe River had been rare, the main channel had clearly withered, the propagation time of floods had lengthened, and the discharge at which flow spills onto the floodplain had fallen to only 800-1000 m3/s. The water-producing areas of these floods had little sediment production, and the sediment content of the river was low. As a result, the main channel was eroded, the discharge capacity of the river course gradually increased, and the overbank discharge threshold recovered to above 2000 m3/s. From the analyses of flood components and flood progress, the conclusion is that sediment deposition and the rising of the channel bed, the withering of the main channel, the decrease of the overbank discharge threshold, the increase of the peak attenuation rate, and the prolonging of the propagation duration have all become universal phenomena of rivers in arid and semi-arid districts. These phenomena can very easily produce serious calamity during a big flood, and the flood behaviour of the local area should be researched further. (Author)

  16. Exploitation of Documented Historical Floods for Achieving Better Flood Defense

    Directory of Open Access Journals (Sweden)

    Slobodan Kolaković

    2016-01-01

    Establishing the Base Flood Elevation for a stream network corresponding to a big catchment is feasible with an interdisciplinary approach involving stochastic hydrology, river hydraulics, and computer-aided simulation. A numerical model calibrated with historical floods has been exploited in this study. A short presentation of the catchment of the Tisza River is followed by an overview of the historical floods which hit the region in the documented period of 130 years. Several well-documented historical floods provided the opportunity to calibrate the chosen numerical model. Once established, the model could be used to investigate different extreme flood scenarios and to establish the Base Flood Elevation. The calibration has shown that the coefficient of friction in the case of the Tisza River depends both on the actual water level and on the preceding flood events. The effects of flood plain maintenance, as well as of the activation of six potential detention ponds, on flood mitigation have been examined. Furthermore, the expected maximum water levels have also been determined for the case in which the biggest flood ever observed (1888) were to hit the region again. The investigated cases of flood superposition highlighted the impact of the tributary Maros on flood mitigation along the Tisza River.
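
    In river hydraulics the "coefficient of friction" being calibrated is typically Manning's n, which enters the discharge through Q = (1/n) * A * R^(2/3) * S^(1/2). A minimal sketch for a rectangular channel; the numeric values are illustrative, not Tisza River data:

```python
# Manning's equation for a rectangular channel, SI units.
# Q = (1/n) * A * R^(2/3) * S^(1/2), where A is flow area, R = A/P is the
# hydraulic radius (P = wetted perimeter), and S is the energy slope.
# Calibration tunes n so that modelled stages match observed floods.

def manning_discharge(n: float, width: float, depth: float,
                      slope: float) -> float:
    """Discharge [m^3/s] for a rectangular channel section."""
    area = width * depth                     # A [m^2]
    wetted_perimeter = width + 2.0 * depth   # P [m]
    hydraulic_radius = area / wetted_perimeter
    return (1.0 / n) * area * hydraulic_radius ** (2.0 / 3.0) * slope ** 0.5

# A rougher channel (larger n) conveys less flow at the same stage,
# which is why n must be calibrated rather than assumed:
q_smooth = manning_discharge(n=0.025, width=100.0, depth=4.0, slope=1e-4)
q_rough  = manning_discharge(n=0.040, width=100.0, depth=4.0, slope=1e-4)
print(q_smooth > q_rough)  # True
```

    The abstract's finding that n depends on stage and on flood history means a single constant n cannot be used; the calibrated model carries a stage-dependent roughness instead.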

  17. Understanding the allure of big infrastructure: Jakarta’s Great Garuda Sea Wall Project

    Directory of Open Access Journals (Sweden)

    Emma Colven

    2017-06-01

    In response to severe flooding in Jakarta, a consortium of Dutch firms in collaboration with the Indonesian government has designed the 'Great Garuda Sea Wall' project. The master plan proposes to construct a sea wall to enclose Jakarta Bay. A new waterfront city will be built on over 1000 hectares (ha) of reclaimed land in the shape of the Garuda, Indonesia’s national symbol. By redeveloping North Jakarta, the project promises to realise the world-class city aspirations of Indonesia’s political elites. Heavily reliant on hydrological engineering, hard infrastructure and private capital, the project has been presented by proponents as the optimum way to protect the city from flooding. The project retains its allure among political elites despite not directly addressing land subsidence, understood to be a primary cause of flooding. I demonstrate how this project is driven by a techno-political network that brings together political and economic interests, world-class city discourses, engineering expertise, colonial histories, and postcolonial relations between Jakarta and the Netherlands. Due in part to this network, big infrastructure has long constituted the preferred state response to flooding in Jakarta. I thus make a case for provincialising narratives that claim we are witnessing a return to big infrastructure in water management.

  18. Big data analytics turning big data into big money

    CERN Document Server

    Ohlhorst, Frank J

    2012-01-01

    Unique insights to implement big data analytics and reap big returns for your bottom line. Focusing on the business and financial value of big data analytics, respected technology journalist Frank J. Ohlhorst shares his insights on the newly emerging field of big data analytics in Big Data Analytics. This breakthrough book demonstrates the importance of analytics, defines the processes, highlights the tangible and intangible values, and discusses how you can turn a business liability into actionable material that can be used to redefine markets, improve profits and identify new business opportuni

  19. Enhancing Big Data Value Using Knowledge Discovery Techniques

    OpenAIRE

    Mai Abdrabo; Mohammed Elmogy; Ghada Eltaweel; Sherif Barakat

    2016-01-01

    The world has been drowned by floods of data due to technological development. Consequently, the term Big Data has emerged to describe this gigantic volume. Many kinds of fast-moving data are doubling every second. We have to profit from this enormous surge of data by converting it into knowledge. Knowledge Discovery (KDD) can enhance the detection of value in Big Data based on techniques and technologies like Hadoop, MapReduce, and NoSQL. The use of Big D...

  20. Application of Indigenous Knowledge to Flood Prevention and ...

    African Journals Online (AJOL)

    In the last three decades, flooding has become a nightmare associated with rainfall in all the continents of the world, as it records heavy casualties everywhere each time it occurs. Flooding is now a big and seemingly unstoppable environmental threat to rural and urban settlements, in both developed and developing ...

  1. Results from the Big Spring basin water quality monitoring and demonstration projects, Iowa, USA

    Science.gov (United States)

    Rowden, R.D.; Liu, H.; Libra, R.D.

    2001-01-01

    Agricultural practices, hydrology, and water quality of the 267-km2 Big Spring groundwater drainage basin in Clayton County, Iowa, have been monitored since 1981. Land use is agricultural; nitrate-nitrogen (nitrate-N) and herbicides are the resulting contaminants in groundwater and surface water. Ordovician Galena Group carbonate rocks comprise the main aquifer in the basin. Recharge to this karstic aquifer is by infiltration, augmented by sinkhole-captured runoff. Groundwater is discharged at Big Spring, where the quantity and quality of the discharge are monitored. Monitoring has shown a threefold increase in groundwater nitrate-N concentrations from the 1960s to the early 1980s. The nitrate-N discharged from the basin typically is equivalent to over one-third of the nitrogen fertilizer applied, with larger losses during wetter years. Atrazine is present in groundwater all year; however, contaminant concentrations in the groundwater respond directly to recharge events, and unique chemical signatures of infiltration versus runoff recharge are detectable in the discharge from Big Spring. Education and demonstration efforts have reduced nitrogen fertilizer application rates by one-third since 1981. Relating declines in nitrate and pesticide concentrations to inputs of nitrogen fertilizer and pesticides at Big Spring is problematic: annual recharge has varied five-fold during monitoring, overshadowing any water-quality improvements resulting from incrementally decreased inputs. © Springer-Verlag 2001.
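
    Monitoring both the quantity and the quality of the spring discharge, as described above, is what allows a contaminant mass flux (load) to be computed. A minimal sketch of that unit conversion; the numeric inputs are illustrative, not measured Big Spring values:

```python
# Load [kg/day] from discharge Q [m^3/s] and concentration C [mg/L].
# 1 m^3/s at 1 mg/L carries 1 g/s, i.e. 86,400 g/day = 86.4 kg/day.

def nitrate_load_kg_per_day(discharge_m3_s: float, conc_mg_l: float) -> float:
    """Mass flux of nitrate-N past the monitoring point, in kg/day."""
    return discharge_m3_s * conc_mg_l * 86.4

print(nitrate_load_kg_per_day(1.0, 1.0))   # 86.4
print(nitrate_load_kg_per_day(1.5, 10.0))  # a wetter-year scenario
```

    Because the load is the product of flow and concentration, a five-fold swing in annual recharge can swamp a modest decline in concentration, which is exactly the attribution problem the abstract notes.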

  2. Harvesting Social Media for Generation of Near Real-time Flood Maps

    NARCIS (Netherlands)

    Eilander, Dirk; Trambauer, Patricia; Wagemaker, Jurjen; Van Loenen, Arnejan

    2016-01-01

    Social media are a new, big and exciting source of data. Rather than from traditional sensors and models, this data is from local people experiencing real-world phenomena, such as flood events. During floods, disaster managers often have trouble getting an accurate overview of the current situation.

  3. 76 FR 21664 - Final Flood Elevation Determinations

    Science.gov (United States)

    2011-04-18

    ... proof Flood Insurance Study and FIRM available at the address cited below for each community. [Fragment of the Base Flood Elevations table: approximately 2,100 feet upstream of 11th Street, +861 feet; Big Duck Creek at South P Street, +843 feet; City of ...]

  4. A Cloud-Based Global Flood Disaster Community Cyber-Infrastructure: Development and Demonstration

    Science.gov (United States)

    Wan, Zhanming; Hong, Yang; Khan, Sadiq; Gourley, Jonathan; Flamig, Zachary; Kirschbaum, Dalia; Tang, Guoqiang

    2014-01-01

    Flood disasters have significant impacts on the development of communities globally. This study describes a public cloud-based flood cyber-infrastructure (CyberFlood) that collects, organizes, visualizes, and manages several global flood databases for authorities and the public in real-time, providing location-based eventful visualization as well as statistical analysis and graphing capabilities. In order to expand and update the existing flood inventory, a crowdsourcing data collection methodology is employed for the public with smartphones or Internet to report new flood events, which is also intended to engage citizen-scientists so that they may become motivated and educated about the latest developments in satellite remote sensing and hydrologic modeling technologies. Our shared vision is to better serve the global water community with comprehensive flood information, aided by the state-of-the-art cloud computing and crowdsourcing technology. The CyberFlood presents an opportunity to eventually modernize the existing paradigm used to collect, manage, analyze, and visualize water-related disasters.

  5. Effectiveness of flood damage mitigation measures: Empirical evidence from French flood disasters

    NARCIS (Netherlands)

    Poussin, J.K.; Botzen, W.J.W.; Aerts, J.C.J.H.

    2015-01-01

    Recent destructive flood events and projected increases in flood risks as a result of climate change in many regions around the world demonstrate the importance of improving flood risk management. Flood-proofing of buildings is often advocated as an effective strategy for limiting damage caused by

  6. Hydrological forecast of maximal water level in Lepenica river basin and flood control measures

    Directory of Open Access Journals (Sweden)

    Milanović Ana

    2006-01-01

Full Text Available The Lepenica river basin has become an axis of economic and urban development of the Šumadija district. However, the Lepenica River and its tributaries have a disordered flow regime: there is insufficient water for water supply and irrigation, while on the other hand the area suffers severe flood and torrent damage (especially the Kragujevac basin). The paper presents flood problems in the river basin, maximum water level forecasts, and the flood control measures carried out until now. Some potential solutions aimed at achieving effective flood control are suggested as well.

  7. Hyper-resolution monitoring of urban flooding with social media and crowdsourcing data

    Science.gov (United States)

    Wang, Ruo-Qian; Mao, Huina; Wang, Yuan; Rae, Chris; Shaw, Wesley

    2018-02-01

Hyper-resolution datasets for urban flooding are rare. This gap prevents detailed flood risk analysis, urban flood control, and the validation of hyper-resolution numerical models. We employed social media and crowdsourcing data to address this issue. Natural Language Processing and Computer Vision techniques were applied to data collected from Twitter and MyCoast (a crowdsourcing app). We found that these big-data-based flood monitoring approaches can complement existing means of flood data collection. The extracted information is validated against precipitation data and road closure reports to examine data quality. The two data collection approaches are compared and the two data mining methods are discussed. A series of suggestions is given to improve the data collection strategy.
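The abstract does not detail the text-mining pipeline; as a minimal hedged sketch (all rules and names hypothetical, not from the paper, which uses trained NLP and Computer Vision models), a keyword-based pre-filter for candidate flood posts might look like this:

```python
import re

# Hypothetical minimal pre-filter; a real system would use trained
# classifiers rather than hand-written rules.
FLOOD_TERMS = re.compile(r"\b(flood(ed|ing)?|inundat\w+|submerged)\b", re.I)
NEGATION = re.compile(r"\b(no|not|never)\s+flood", re.I)

def is_candidate_flood_report(text: str) -> bool:
    """Flag a social media post as a candidate flood observation."""
    return bool(FLOOD_TERMS.search(text)) and not NEGATION.search(text)

posts = [
    "Main St is completely flooded, avoid downtown!",
    "No flooding here, just heavy rain.",
    "Basement submerged after the storm surge.",
]
flagged = [p for p in posts if is_candidate_flood_report(p)]
```

Posts passing such a filter would then be geolocated and cross-checked against precipitation and road-closure records, as the abstract describes.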

  8. Global fluctuation spectra in big-crunch-big-bang string vacua

    International Nuclear Information System (INIS)

    Craps, Ben; Ovrut, Burt A.

    2004-01-01

    We study big-crunch-big-bang cosmologies that correspond to exact world-sheet superconformal field theories of type II strings. The string theory spacetime contains a big crunch and a big bang cosmology, as well as additional 'whisker' asymptotic and intermediate regions. Within the context of free string theory, we compute, unambiguously, the scalar fluctuation spectrum in all regions of spacetime. Generically, the big crunch fluctuation spectrum is altered while passing through the bounce singularity. The change in the spectrum is characterized by a function Δ, which is momentum and time dependent. We compute Δ explicitly and demonstrate that it arises from the whisker regions. The whiskers are also shown to lead to 'entanglement' entropy in the big bang region. Finally, in the Milne orbifold limit of our superconformal vacua, we show that Δ→1 and, hence, the fluctuation spectrum is unaltered by the big-crunch-big-bang singularity. We comment on, but do not attempt to resolve, subtleties related to gravitational back reaction and light winding modes when interactions are taken into account

  9. Google BigQuery analytics

    CERN Document Server

    Tigani, Jordan

    2014-01-01

    How to effectively use BigQuery, avoid common mistakes, and execute sophisticated queries against large datasets Google BigQuery Analytics is the perfect guide for business and data analysts who want the latest tips on running complex queries and writing code to communicate with the BigQuery API. The book uses real-world examples to demonstrate current best practices and techniques, and also explains and demonstrates streaming ingestion, transformation via Hadoop in Google Compute engine, AppEngine datastore integration, and using GViz with Tableau to generate charts of query results. In addit

  10. The Semantic Network of Flood Hydrological Data for Kelantan, Malaysia

    Science.gov (United States)

    Yusoff, Aziyati; Din, Norashidah Md; Yussof, Salman; Ullah Khan, Samee

    2016-03-01

Every year, authorities in Malaysia put effort into disaster management mechanisms, including for the floods that hit the east coast of Peninsular Malaysia. This includes the state of Kelantan, where flooding is reported to be a normal annual occurrence. However, the aftermath has often been unmanageable and has left the state struggling with its own recovery. Although floods occur every year, among the worst were those of 1967, 1974, 1982 and, recently, December 2014. This study proposes a semantic network as an approach to applying big data analytics to the huge volumes of data from the state's flood reading stations. It is expected that current cutting-edge computing can also help in mitigating this particular disaster.

  11. Nursing Management Minimum Data Set: Cost-Effective Tool To Demonstrate the Value of Nurse Staffing in the Big Data Science Era.

    Science.gov (United States)

    Pruinelli, Lisiane; Delaney, Connie W; Garciannie, Amy; Caspers, Barbara; Westra, Bonnie L

    2016-01-01

There is a growing body of evidence of the relationship of nurse staffing to patient, nurse, and financial outcomes. With the advent of big data science and developing big data analytics in nursing, data science with the reuse of big data is emerging as a timely and cost-effective approach to demonstrate nursing value. The Nursing Management Minimum Data Set (NMMDS) provides standard administrative data elements, definitions, and codes to measure the context where care is delivered and, consequently, the value of nursing. The integration of the NMMDS elements in the current health system provides evidence for nursing leaders to measure and manage decisions, leading to better patient, staffing, and financial outcomes. It also enables the reuse of data for clinical scholarship and research.

  12. Development of flood index by characterisation of flood hydrographs

    Science.gov (United States)

    Bhattacharya, Biswa; Suman, Asadusjjaman

    2015-04-01

In recent years the world has experienced deaths, large-scale displacement of people, billions of euros of economic damage, mental stress and ecosystem impacts due to flooding. Global changes (climate change, population and economic growth, and urbanisation) are exacerbating the severity of flooding. The 2010 floods in Pakistan and the 2011 floods in Australia and Thailand demonstrate the need for concerted action, in the face of global societal and environmental changes, to strengthen resilience against flooding. Due to climatological characteristics there are catchments where flood forecasting may have a relatively limited role and flood event management may have to be relied upon. For example, in flash flood catchments, which may be tiny and ungauged, flood event management often depends on approximate prediction tools such as flash flood guidance (FFG). There are catchments fed largely by flood waters coming from upstream catchments which are ungauged, or where, due to data sharing issues in transboundary catchments, the flow of information from the upstream catchment is limited. Hydrological and hydraulic modelling of these downstream catchments will never be sufficient to provide the required forecasting lead time, and alternative tools to support flood event management will be required. In FFG and similar approaches, the primary motive is to provide guidance by synthesising historical data. We follow a similar approach and characterise past flood hydrographs to determine a flood index (FI), which varies in space and time with flood magnitude and its propagation. By studying the variation of the index, pockets of high flood risk requiring attention can be earmarked beforehand. This approach can be very useful in flood risk management of catchments where information about hydro-meteorological variables is inadequate for any forecasting system. This paper presents the development of the FI and its application to several catchments, including in Kentucky in the USA.
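The abstract does not give the FI formula; purely as an illustration of characterising a hydrograph with a single index, one could normalise the excess flow volume above a reference threshold (a hypothetical definition, not the authors'):

```python
import numpy as np

def flood_index(hydrograph, threshold):
    """Hypothetical flood index: total excess flow above a reference
    threshold, normalized by that threshold. The paper's actual FI
    definition is not specified in the abstract."""
    q = np.asarray(hydrograph, dtype=float)
    excess = np.clip(q - threshold, 0.0, None)  # zero out flows below threshold
    return float(excess.sum() / threshold)

# Synthetic daily flows (m3/s) around a single flood peak
flows = [50, 80, 200, 450, 300, 120, 60]
fi = flood_index(flows, threshold=100.0)  # larger, longer floods give larger FI
```

An index of this kind varies with both flood magnitude and duration, so mapping it across stations and time would highlight the "pockets of high flood risk" the abstract mentions.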

  13. Big Data and Neuroimaging.

    Science.gov (United States)

    Webb-Vargas, Yenny; Chen, Shaojie; Fisher, Aaron; Mejia, Amanda; Xu, Yuting; Crainiceanu, Ciprian; Caffo, Brian; Lindquist, Martin A

    2017-12-01

    Big Data are of increasing importance in a variety of areas, especially in the biosciences. There is an emerging critical need for Big Data tools and methods, because of the potential impact of advancements in these areas. Importantly, statisticians and statistical thinking have a major role to play in creating meaningful progress in this arena. We would like to emphasize this point in this special issue, as it highlights both the dramatic need for statistical input for Big Data analysis and for a greater number of statisticians working on Big Data problems. We use the field of statistical neuroimaging to demonstrate these points. As such, this paper covers several applications and novel methodological developments of Big Data tools applied to neuroimaging data.

  14. WBP/SIGAME the Brazilian BIG-GT demonstration project actual status and perspectives

    International Nuclear Information System (INIS)

    Carpentier, E.; Silva, A.

    1998-01-01

    Located in the tropics, with the sun shining all year round, and with its vast territory, Brazil may be regarded as having all the basic conditions to develop a modern Biomass for Electricity industry. Those characteristics together with: (a) the necessity of developing new energy resources for electricity production, in the northeast of the country; (b) the results of studies made by various entities, including CHESF; (c) the progress achieved by the BIG-GT technology; (d) the organisation of the Global Environment Facility (GEF); (e) and the support of the Brazilian government, through the Ministry of Science and Technology (MCT), provided the unique opportunity for the implementation of a commercial demonstration of that technology in Brazil. This paper describes the idea, scope, challenges, lessons, and actual status of development of the WBP/SIGAME project. It also highlights some institutional issues, budget figures, and energy prices. (author)

  15. Flooding and Flood Management

    Science.gov (United States)

    Brooks, K.N.; Fallon, J.D.; Lorenz, D.L.; Stark, J.R.; Menard, Jason; Easter, K.W.; Perry, Jim

    2011-01-01

Floods result in great human disasters globally and nationally, causing an average of $4 billion of damages each year in the United States. Minnesota has its share of floods and flood damages, and the state has awarded nearly $278 million to local units of government for flood mitigation projects through its Flood Hazard Mitigation Grant Program. Since 1995, flood mitigation in the Red River Valley has exceeded $146 million. Considerable local and state funding has been provided to manage and mitigate problems of excess stormwater in urban areas, flooding of farmlands, and flood damages at road crossings. The cumulative costs involved with floods and flood mitigation in Minnesota are not known precisely, but it is safe to conclude that flood mitigation is a costly business. This chapter begins with a description of floods in Minnesota to provide examples and contrasts across the state. Background material is presented to provide a basic understanding of floods and flood processes, prediction, and management and mitigation. Methods of analyzing and characterizing floods are presented because they affect how we respond to flooding and can influence relevant practices. The understanding and perceptions of floods and flooding commonly differ between those who work in flood forecasting, flood protection, or water resource management and the citizens and businesses affected by floods. These differences can become magnified following a major flood, pointing to the need for a better understanding of flooding as well as a common language to describe flood risks and the uncertainty associated with determining such risks. Expectations of accurate and timely flood forecasts and of our ability to control floods do not always match reality. Striving for clarity is important in formulating policies that can help avoid recurring flood damages and costs.

  16. Extended burnup demonstration: reactor fuel program. Pre-irradiation characterization and summary of pre-program poolside examinations. Big Rock Point extended burnup fuel

    International Nuclear Information System (INIS)

    Exarhos, C.A.; Van Swam, L.F.; Wahlquist, F.P.

    1981-12-01

    This report is a resource document characterizing the 64 fuel rods being irradiated at the Big Rock Point reactor as part of the Extended Burnup Demonstration being sponsored jointly by the US Department of Energy, Consumers Power Company, Exxon Nuclear Company, and General Public Utilities. The program entails extending the exposure of standard BWR fuel to a discharge average of 38,000 MWD/MTU to demonstrate the feasibility of operating fuel of standard design to levels significantly above current limits. The fabrication characteristics of the Big Rock Point EBD fuel are presented along with measurement of rod length, rod diameter, pellet stack height, and fuel rod withdrawal force taken at poolside at burnups up to 26,200 MWD/MTU. A review of the fuel examination data indicates no performance characteristics which might restrict the continued irradiation of the fuel

  17. Methods and tools for big data visualization

    OpenAIRE

    Zubova, Jelena; Kurasova, Olga

    2015-01-01

    In this paper, methods and tools for big data visualization have been investigated. Challenges faced by the big data analysis and visualization have been identified. Technologies for big data analysis have been discussed. A review of methods and tools for big data visualization has been done. Functionalities of the tools have been demonstrated by examples in order to highlight their advantages and disadvantages.

  18. Big data-driven business how to use big data to win customers, beat competitors, and boost profits

    CERN Document Server

    Glass, Russell

    2014-01-01

    Get the expert perspective and practical advice on big data The Big Data-Driven Business: How to Use Big Data to Win Customers, Beat Competitors, and Boost Profits makes the case that big data is for real, and more than just big hype. The book uses real-life examples-from Nate Silver to Copernicus, and Apple to Blackberry-to demonstrate how the winners of the future will use big data to seek the truth. Written by a marketing journalist and the CEO of a multi-million-dollar B2B marketing platform that reaches more than 90% of the U.S. business population, this book is a comprehens

  19. Statistics and Analysis of the Relations between Rainstorm Floods and Earthquakes

    Directory of Open Access Journals (Sweden)

    Baodeng Hou

    2016-01-01

Full Text Available The frequent occurrence of geophysical disasters under climate change has drawn Chinese scholars to pay attention to disaster relations. If the occurrence sequence of disasters could be identified, long-term disaster forecasting could be realized. Based on the Earth Degassing Effect (EDE), which is taken as valid, this paper took the magnitude, epicenter, and occurrence time of earthquakes, as well as the epicenter and occurrence time of rainstorm floods, as basic factors to establish an integrated model for studying the correlation between rainstorm floods and earthquakes. The 2461 severe earthquakes that occurred in China or within 3000 km of China and the 169 heavy rainstorm floods that occurred in China over the past 200+ years served as the input data of the model. The computational results showed that although most of the rainstorm floods have nothing to do with severe earthquakes from a statistical perspective, some floods might be related to earthquakes. This is especially true when the earthquakes happen in the vapor transmission zone, where rainstorms draw on abundant water vapor. In such cases, earthquakes are more likely to cause big rainstorm floods. However, many cases of rainstorm floods found after severe earthquakes carry a large degree of uncertainty.

  20. Big Data as Governmentality

    DEFF Research Database (Denmark)

    Flyverbom, Mikkel; Madsen, Anders Koed; Rasche, Andreas

This paper conceptualizes how large-scale data and algorithms condition and reshape knowledge production when addressing international development challenges. The concept of governmentality and four dimensions of an analytics of government are proposed as a theoretical framework to examine how big data is constituted as an aspiration to improve the data and knowledge underpinning development efforts. Based on this framework, we argue that big data's impact on how relevant problems are governed is enabled by (1) new techniques of visualizing development issues, (2) linking aspects … shows that big data problematizes selected aspects of traditional ways to collect and analyze data for development (e.g. via household surveys). We also demonstrate that using big data analyses to address development challenges raises a number of questions that can deteriorate its impact.

  1. On the stationarity of Floods in west African rivers

    Science.gov (United States)

    NKA, B. N.; Oudin, L.; Karambiri, H.; Ribstein, P.; Paturel, J. E.

    2014-12-01

West Africa has undergone a big change since the period 1970-1990, which was characterized by very low precipitation amounts, leading to low stream flows in river basins, except in the Sahelian region, where the impact of human activities was pointed out to explain the substantial increase of floods in some catchments. More recently, studies showed an increase in the frequency of intense rainfall events and, according to observations made over the region, an increase of flood events is also noticeable during the rainy season. Therefore, the assumption of stationarity of flood events is questionable, and an analysis of flood evolution and climatic patterns is justified. In this work, we analyzed the trends of flood events for several catchments in the Sahelian and Sudanian regions of Burkina Faso. We used thirteen tributaries of large river basins (Niger, Nakambe, Mouhoun, Comoé) for which daily rainfall and flow data were collected from the national hydrological and meteorological services of the country. We used Mann-Kendall and Pettitt tests to detect trends and break points in the annual time series of 8 rainfall indices and the annual maximum discharge records. We compared the trends of precipitation indices and flood size records to analyze the possible causal link between flood size and rainfall pattern. We also analyzed the stationarity of the frequency of floods exceeding the ten-year return period level. The samples were extracted by a peak-over-threshold method, and the change in flood frequency was quantified using a test developed by Lang M. (1995). The results exhibit two principal behaviors. Generally speaking, no trend is detected in the catchments' annual maximum discharge, but positive break points are found in a group of three right-bank tributaries of the Niger river located in the Sahelian region between 300 mm and 650 mm. These same catchments also show an increase of the yearly number of floods greater than the ten-year flood since
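The Mann-Kendall test named in this record is a standard rank-based trend test; a minimal sketch of its S statistic and normal approximation (ignoring tie corrections, which a production analysis on real discharge series should include) is:

```python
import math

def mann_kendall_s(x):
    """Mann-Kendall S statistic: the sum of the signs of all
    pairwise differences x[j] - x[i] with j > i."""
    n = len(x)
    return sum(
        (x[j] > x[i]) - (x[j] < x[i])
        for i in range(n - 1)
        for j in range(i + 1, n)
    )

def mann_kendall_z(x):
    """Standard normal approximation of the MK test statistic
    (no tie correction; the approximation is reasonable for n >= 10)."""
    n = len(x)
    s = mann_kendall_s(x)
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        return (s - 1) / math.sqrt(var_s)
    if s < 0:
        return (s + 1) / math.sqrt(var_s)
    return 0.0

# A monotonically increasing annual-maximum series yields a large positive Z,
# rejecting stationarity at the 5% level (|Z| > 1.96).
z = mann_kendall_z([1, 2, 3, 4, 5, 6, 7, 8, 9, 10])
```

The Pettitt break-point test used alongside it is built from a similar rank-based construction; both are available in standard hydrological statistics packages.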

  2. A little big history of Tiananmen

    NARCIS (Netherlands)

    Quaedackers, E.; Grinin, L.E.; Korotayev, A.V.; Rodrigue, B.H.

    2011-01-01

    This contribution aims at demonstrating the usefulness of studying small-scale subjects such as Tiananmen, or the Gate of Heavenly Peace, in Beijing - from a Big History perspective. By studying such a ‘little big history’ of Tiananmen, previously overlooked yet fundamental explanations for why

  3. Behaviour of liquid films and flooding in counter-current two-phase flow, (1)

    International Nuclear Information System (INIS)

    Suzuki, Shin-ichi; Ueda, Tatsuhiro.

    1978-01-01

This paper reports the results of a study of liquid film behavior and flooding in counter-current two-phase flow, in which the gas-phase velocity was measured over wide ranges of tube diameter, tube length, liquid flow rate, viscosity and surface tension. The liquids used in the experiment were water, glycerol, and sec-octyl alcohol. The phenomena were observed with a high-speed camera. The maximum thickness of the liquid film was measured, and the effects of various factors on flooding were investigated. The results were as follows. The large waves that cause flooding develop through the interaction of waves on the liquid film surface with the gas-phase flow. The gas-phase velocity at the onset of flooding increases as the liquid flow rate is reduced and as the tube diameter is increased. The flooding velocity is reduced as the tube length increases. A larger maximum film thickness in the absence of gas-phase flow leads to flooding at a lower gas-phase velocity. (Kato, T.)

  4. Environmental impact of flood: the study of arsenic speciation in exchangeable fraction of flood deposits of Warta river (Poland) in determination of "finger prints" of the pollutants origin and the ways of the migration.

    Science.gov (United States)

    Kozak, Lidia; Skolasińska, Katarzyna; Niedzielski, Przemysław

    2012-09-01

The paper presents the application of a hyphenated technique - high-performance liquid chromatography with atomic absorption spectrometry detection with hydride generation (HPLC-HG-AAS) - to the determination of the inorganic forms of arsenic, As(III) and As(V), in the exchangeable fraction of flood deposits. The separation of the analytical signals of the determined arsenic forms was obtained using an ion-exchange column in a chromatographic system with the atomic absorption spectrometer as detector, at determination limits of 5 ng g(-1) for As(III) and 10 ng g(-1) for As(V). Flood deposits were collected after the big flood event in the valley of the Warta river that took place in summer 2010. Samples of overbank deposits were taken in the Poznań agglomeration and vicinity (NW Poland). The determinations of arsenic forms in the exchangeable fraction of the flood deposits allowed indication of a hypothetical path of migration of deposits transported by a river during a flood, and of the environmental threats posed by their deposition. Copyright © 2012 Elsevier Ltd. All rights reserved.

  5. Interactive Web-based Floodplain Simulation System for Realistic Experiments of Flooding and Flood Damage

    Science.gov (United States)

    Demir, I.

    2013-12-01

Recent developments in web technologies make it easy to manage and visualize large data sets and share them with the general public. Novel visualization techniques and dynamic user interfaces allow users to create realistic environments and interact with data to gain insight from simulations and environmental observations. The floodplain simulation system is a web-based 3D interactive flood simulation environment for creating real-world flooding scenarios. The simulation system provides a visually striking platform with realistic terrain information and water simulation. Students can create and modify predefined scenarios, control environmental parameters, and evaluate flood mitigation techniques. The web-based simulation system provides an environment in which children and adults can learn about flooding, flood damage, and the effects of development and human activity in the floodplain. The system provides various scenarios customized to fit the age and education level of the users. This presentation provides an overview of the web-based flood simulation system and demonstrates its capabilities for various flooding and land use scenarios.

  6. Big Opportunities and Big Concerns of Big Data in Education

    Science.gov (United States)

    Wang, Yinying

    2016-01-01

    Against the backdrop of the ever-increasing influx of big data, this article examines the opportunities and concerns over big data in education. Specifically, this article first introduces big data, followed by delineating the potential opportunities of using big data in education in two areas: learning analytics and educational policy. Then, the…

  7. Impacts of dyke development in flood prone areas in the Vietnamese Mekong Delta to downstream flood hazard

    Science.gov (United States)

    Khanh Triet Nguyen, Van; Dung Nguyen, Viet; Fujii, Hideto; Kummu, Matti; Merz, Bruno; Apel, Heiko

    2016-04-01

The Vietnamese Mekong Delta (VMD) plays an important role in food security and socio-economic development of the country. Being a low-lying coastal region, the VMD is particularly susceptible to both riverine and tidal floods, which provide, on the one hand, the basis for the rich agricultural production and the livelihood of the people, but on the other hand pose a considerable hazard depending on the severity of the floods. Despite the potentially hazardous floods, the area remains an active rice granary due to its nutrient-rich soils and sediment input, its dense network of waterways and canals, and the long-standing experience of the population in living with floods. In response to both farmers' requests and governmental plans, the construction of flood protection infrastructure in the delta progressed rapidly in the last twenty years, notably in areas prone to deep flooding, i.e. the Plain of Reeds (PoR) and the Long Xuyen Quadrangle (LXQ). Triple rice cropping has become possible in farmlands enclosed by "full dykes", i.e. dykes strong and high enough to prevent flooding of the flood plains during most floods. In these protected flood plains rice can be grown even during the peak flood period (September to November). However, little is known about the possible (and already alleged) negative impacts of this full flood protection on downstream areas. This study aims at quantifying how the flood regime in the lower part of the VMD (e.g. Can Tho, My Thuan, …) changed during the two recent "big flood" events of 2000 and 2011 due to the construction of the full-dyke system in the upper part. First, an evaluation of 35 years of daily water level data was performed in order to detect trends at key gauging stations: Kratie, the upper boundary of the Delta; Tan Chau and Chau Doc, areas with full-dyke construction; and Can Tho and My Thuan, downstream. Results from the Mann-Kendall (MK) test show a decreasing trend of the annual maximum water level at 3 stations: Kratie, Tan

  8. Societal and economic impacts of flood hazards in Turkey – an overview

    Directory of Open Access Journals (Sweden)

    Koç Gamze

    2016-01-01

Full Text Available Turkey has been severely affected by many natural hazards, in particular earthquakes and floods. Although there is a large body of literature on earthquake hazards and risks in Turkey, comparatively little is known about flood hazards and risks. Therefore, this study aims to investigate flood patterns and the societal and economic impacts of flood hazards in Turkey, as well as to provide a comparative overview of the temporal and spatial distribution of flood losses, by analysing the EM-DAT (Emergency Events Database) and TABB (Turkey Disaster Data Base) databases on disaster losses throughout Turkey for the years 1960-2014. The comparison of these two databases reveals big mismatches in the flood data; e.g. the reported number of events, number of affected people and economic losses differ dramatically. This paper explores the reasons for these mismatches and discusses biases and fallacies in the loss data of the two databases. Since loss data collection is gaining more and more attention, e.g. in the Sendai Framework for Disaster Risk Reduction 2015-2030 (SFDRR), the study could offer groundwork for developing guidelines and procedures on how to standardize loss databases and extend them across other hazard events. It also provides substantial insights for flood risk mitigation and adaptation studies in Turkey and valuable insights for other (European) countries.

  9. Floods and Flash Flooding

    Science.gov (United States)

Floods and flash flooding. Now is the time to determine your area's flood risk. If you are not sure whether you ... If you are in a floodplain, consider buying flood insurance. Do not drive around barricades. If your ...

  10. Flexibility in flood management design: proactive planning under uncertainty

    Science.gov (United States)

    Smet, K.; de Neufville, R.; van der Vlist, M.

    2016-12-01

This paper presents a value-enhancing approach for proactive planning and design of long-lived flood management infrastructure under uncertain future flooding threats. Designing infrastructure that can be adapted over time is a way to safeguard the efficacy of current design decisions given future uncertainties. We explore the value of embedding "options" in a physical structure, where an option is the right, but not the obligation, to do something at a later date (e.g. over-dimensioning a floodwall foundation now facilitates a future height addition in response to observed increases in sea level; building extra pump bays in a drainage pumping station enables the easy addition of pumping capacity whenever increased precipitation warrants an expansion). The proposed approach couples a simulation model that captures future climate-induced changes to the hydrologic operating environment of a structure with an economic model that estimates the lifetime economic performance of alternative investment strategies. The economic model uses Real "In" Options analysis, a type of cash flow analysis that quantifies the implicit value of options and the flexibility they provide. We demonstrate the approach using replacement planning for the multi-functional pumping station IJmuiden on the North Sea Canal in the Netherlands. The analysis models flexibility in design decisions, varying the size of and the specific options included in the new structure. Results indicate that incorporating options in the structural design has the potential to improve its economic performance compared to more traditional "build it once and build it big" designs, where flexibility is not an explicit design criterion. The added value from incorporating flexibility varies with the range of future conditions considered and the specific options examined. This approach could be applied to explore investment strategies for the design of other flood management structures as well.
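The abstract gives no formulas; as a hedged sketch of the underlying idea, the value of an embedded option can be illustrated by comparing the expected lifetime cost of a flexible design against a fixed one under an uncertain sea-level scenario (all costs and probabilities below are hypothetical, not from the IJmuiden study):

```python
import random

random.seed(42)  # reproducible scenario sampling

def expected_cost(flexible: bool, n_scenarios: int = 10_000) -> float:
    """Expected lifetime cost in hypothetical monetary units.
    Flexible design: pay a small option premium now (e.g. an
    over-dimensioned foundation) and raise the wall later only in
    scenarios where sea level rises. Fixed design: build the
    full-height wall immediately, whatever happens."""
    base, premium, raise_later, full_now = 100.0, 10.0, 30.0, 60.0
    p_rise = 0.4  # assumed probability of a sea-level-rise scenario
    total = 0.0
    for _ in range(n_scenarios):
        sea_level_rises = random.random() < p_rise
        if flexible:
            total += base + premium + (raise_later if sea_level_rises else 0.0)
        else:
            total += base + full_now
    return total / n_scenarios

# The option value is the expected saving from deferring the decision
# until the uncertainty is (partly) resolved.
option_value = expected_cost(flexible=False) - expected_cost(flexible=True)
```

In this toy setting the flexible design wins because the premium plus the probability-weighted upgrade cost is cheaper than committing to full capacity up front; a real Real "In" Options analysis would discount cash flows over time and sample sea-level paths rather than a single binary outcome.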

  11. Emerging technology and architecture for big-data analytics

    CERN Document Server

    Chang, Chip; Yu, Hao

    2017-01-01

    This book describes the current state of the art in big-data analytics, from a technology and hardware architecture perspective. The presentation is designed to be accessible to a broad audience, with general knowledge of hardware design and some interest in big-data analytics. Coverage includes emerging technology and devices for data-analytics, circuit design for data-analytics, and architecture and algorithms to support data-analytics. Readers will benefit from the realistic context used by the authors, which demonstrates what works, what doesn’t work, and what are the fundamental problems, solutions, upcoming challenges and opportunities. Provides a single-source reference to hardware architectures for big-data analytics; Covers various levels of big-data analytics hardware design abstraction and flow, from device, to circuits and systems; Demonstrates how non-volatile memory (NVM) based hardware platforms can be a viable solution to existing challenges in hardware architecture for big-data analytics.

  12. A global flash flood forecasting system

    Science.gov (United States)

    Baugh, Calum; Pappenberger, Florian; Wetterhall, Fredrik; Hewson, Tim; Zsoter, Ervin

    2016-04-01

The sudden and devastating nature of flash flood events makes it imperative to provide early warnings, such as those derived from Numerical Weather Prediction (NWP) forecasts. Currently such systems exist at basin, national and continental scales in Europe, North America and Australia, but they rely on high-resolution NWP forecasts or rainfall-radar nowcasting, neither of which has global coverage. To produce global flash flood forecasts, this work investigates the possibility of using forecasts from a global NWP system. In particular we: (i) discuss how global NWP can be used for flash flood forecasting, along with its strengths and weaknesses; (ii) demonstrate how a robust evaluation can be performed given the rarity of the event; (iii) highlight the challenges and opportunities in communicating flash flood uncertainty to decision makers; and (iv) explore future developments which would significantly improve global flash flood forecasting. The proposed forecast system uses ensemble surface runoff forecasts from the ECMWF H-TESSEL land surface scheme. A flash flood index is generated using the ERIC (Enhanced Runoff Index based on Climatology) methodology [Raynaud et al., 2014]. This global methodology is applied to a series of flash floods across southern Europe. Results from the system are compared against warnings produced using the higher-resolution COSMO-LEPS limited-area model. The global system is evaluated by comparing forecasted warning locations against a flash flood database of media reports created in partnership with floodlist.com. To deal with the lack of objectivity in media reports, we carefully assess the suitability of different skill scores and apply spatial uncertainty thresholds to the observations. To communicate the uncertainties of the flash flood system output, we experiment with a dynamic region-growing algorithm. This automatically clusters regions of similar return-period exceedance probabilities, thus presenting the at-risk areas at a spatial
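For rare events such as flash floods, verification typically relies on categorical scores computed from a 2x2 contingency table of warnings versus observations; a sketch of the common ones (illustrative code, not the paper's own, with hypothetical counts) is:

```python
def categorical_scores(hits, misses, false_alarms):
    """POD, FAR and CSI from a 2x2 contingency table of forecast
    warnings versus observed flash floods. Correct negatives are
    deliberately omitted: for rare events they dominate the table
    and make accuracy-style scores misleadingly optimistic."""
    pod = hits / (hits + misses)                 # probability of detection
    far = false_alarms / (hits + false_alarms)   # false alarm ratio
    csi = hits / (hits + misses + false_alarms)  # critical success index
    return pod, far, csi

# Hypothetical verification counts against a media-report database
pod, far, csi = categorical_scores(hits=30, misses=10, false_alarms=20)
```

Because media-report databases under-record events, a high apparent FAR may partly reflect missing observations rather than bad forecasts, which is one reason the abstract applies spatial uncertainty thresholds before scoring.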

  13. Flood Risk Management in Iowa through an Integrated Flood Information System

    Science.gov (United States)

    Demir, Ibrahim; Krajewski, Witold

    2013-04-01

    communities in advance to help minimize damage of floods. This presentation provides an overview and live demonstration of the tools and interfaces in the IFIS developed to date to provide a platform for one-stop access to flood-related data, visualizations, flood conditions, and forecasts.

  14. The August 2002 flood in Salzburg/Austria: experience gained and lessons learned from the "Flood of the century"?

    Science.gov (United States)

    Wiesenegger, H.

    2003-04-01

    On the 12th of August 2002 a low pressure system moved slowly from northern Italy towards Slovakia. It continuously carried moist air from the Mediterranean towards the northern rim of the Alps, with the effect of widespread heavy rainfall in Salzburg and other parts of Austria. Daily precipitation amounts of 100-160 mm, in some parts even more, as well as rainfall intensities of 5-10 mm/h, combined with well-saturated soils, led to a rare flood with a return period of 100 years and more. This rare hydrological event not only caused a national catastrophe with damages of several billion Euro, but also endangered more than 200,000 people, and even killed some. As floods are dangerous, life-threatening, destructive, and certainly amongst the most frequent and costly natural disasters in terms of human hardship as well as economic loss, a great effort therefore has to be made to protect people against the negative impacts of floods. In order to achieve this objective, various regulations in land use planning (flood maps), constructive measures (river regulations and technical constructions) as well as flood warning systems, which are not suitable to prevent big floods but offer in-time warnings to minimize the loss of human lives, are used in Austria. HYDRIS (Hydrological Information System for flood forecasting in Salzburg), a modular river basin model developed at the Technical University of Vienna and operated by the Hydrological Service of Salzburg, was used during the August 2002 flood, providing accurate 3- to 4-hour forecasts within 3% of the real peak discharge of the fast-flowing River Salzach. The August 12th flood was in many ways an exceptional, very fast-developing event which took many people by surprise. At the gauging station Salzburg/Salzach (catchment area 4425 km²) it took only eighteen hours from mean annual discharge (178 m³/s) to the hundred-year flood (2300 m³/s). The August flood made clear that there is a strong need for

  15. Time-dependent reliability analysis of flood defences

    International Nuclear Information System (INIS)

    Buijs, F.A.; Hall, J.W.; Sayers, P.B.; Gelder, P.H.A.J.M. van

    2009-01-01

    This paper describes the underlying theory and a practical process for establishing time-dependent reliability models for components in a realistic and complex flood defence system. Though time-dependent reliability models have been applied frequently in, for example, the offshore, structural safety and nuclear industries, application in the safety-critical field of flood defence has to date been limited. The modelling methodology involves identifying relevant variables and processes, characterisation of those processes in appropriate mathematical terms, numerical implementation, parameter estimation and prediction. A combination of stochastic, hierarchical and parametric processes is employed. The approach is demonstrated for selected deterioration mechanisms in the context of a flood defence system. The paper demonstrates that this structured methodology enables the definition of credible statistical models for the time-dependence of flood defences in data-scarce situations. In the application of those models, one of the main findings is that the time variability in the deterioration process tends to be governed by the time-dependence of one or a small number of critical attributes. It is demonstrated how the need for further data collection depends upon the relevance of the time-dependence in the performance of the flood defence system.
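
As a toy illustration of time-dependent reliability, the Monte Carlo sketch below tracks the failure probability of a hypothetical dike whose crest settles at an uncertain rate while annual-maximum water levels follow a Gumbel distribution. All distributions and parameters are invented for illustration and are not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000                                  # Monte Carlo samples
years = np.arange(0, 51, 10)

crest0 = 5.0                                 # initial crest level (m), hypothetical
rate = rng.normal(0.01, 0.003, n)            # uncertain settlement rate (m/yr)
wl = rng.gumbel(3.0, 0.4, (len(years), n))   # annual-max water level (m)

pfs = []
for t, w in zip(years, wl):
    crest = crest0 - rate * t                # deteriorated crest level at year t
    pf = float(np.mean(w > crest))           # P(load exceeds resistance)
    pfs.append(pf)
    print(f"year {t:2d}: P(failure) ~ {pf:.4f}")
```

The failure probability climbs as the single deteriorating attribute (crest level) degrades, mirroring the paper's finding that time variability tends to be governed by a small number of critical attributes.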

  16. Comparative validity of brief to medium-length Big Five and Big Six Personality Questionnaires.

    Science.gov (United States)

    Thalmayer, Amber Gayle; Saucier, Gerard; Eigenhuis, Annemarie

    2011-12-01

    A general consensus on the Big Five model of personality attributes has been highly generative for the field of personality psychology. Many important psychological and life outcome correlates with Big Five trait dimensions have been established. But researchers must choose between multiple Big Five inventories when conducting a study and are faced with a variety of options as to inventory length. Furthermore, a 6-factor model has been proposed to extend and update the Big Five model, in part by adding a dimension of Honesty/Humility or Honesty/Propriety. In this study, 3 popular brief to medium-length Big Five measures (NEO Five Factor Inventory, Big Five Inventory [BFI], and International Personality Item Pool), and 3 six-factor measures (HEXACO Personality Inventory, Questionnaire Big Six Scales, and a 6-factor version of the BFI) were placed in competition to best predict important student life outcomes. The effect of test length was investigated by comparing brief versions of most measures (subsets of items) with original versions. Personality questionnaires were administered to undergraduate students (N = 227). Participants' college transcripts and student conduct records were obtained 6-9 months after data were collected. Six-factor inventories demonstrated better predictive ability for life outcomes than did some Big Five inventories. Additional behavioral observations made on participants, including their Facebook profiles and cell-phone text usage, were predicted similarly by Big Five and 6-factor measures. A brief version of the BFI performed surprisingly well; across inventory platforms, increasing test length had little effect on predictive validity. Comparative validity of the models and measures in terms of outcome prediction and parsimony is discussed.

  17. Predicting Coastal Flood Severity using Random Forest Algorithm

    Science.gov (United States)

    Sadler, J. M.; Goodall, J. L.; Morsy, M. M.; Spencer, K.

    2017-12-01

    Coastal floods have become more common recently and are predicted to further increase in frequency and severity due to sea level rise. Predicting floods in coastal cities can be difficult due to the number of environmental and geographic factors which can influence flooding events. Built stormwater infrastructure and irregular urban landscapes add further complexity. This paper demonstrates the use of machine learning algorithms in predicting street flood occurrence in an urban coastal setting. The model is trained and evaluated using data from Norfolk, Virginia USA from September 2010 - October 2016. Rainfall, tide levels, water table levels, and wind conditions are used as input variables. Street flooding reports made by city workers after named and unnamed storm events, ranging from 1-159 reports per event, are the model output. Results show that Random Forest provides predictive power in estimating the number of flood occurrences given a set of environmental conditions with an out-of-bag root mean squared error of 4.3 flood reports and a mean absolute error of 0.82 flood reports. The Random Forest algorithm performed much better than Poisson regression. From the Random Forest model, total daily rainfall was by far the most important factor in flood occurrence prediction, followed by daily low tide and daily higher high tide. The model demonstrated here could be used to predict flood severity based on forecast rainfall and tide conditions and could be further enhanced using more complete street flooding data for model training.
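A minimal sketch of this kind of model, using synthetic storm events rather than the Norfolk data (the feature distributions and the report-generating rule below are assumptions chosen only to make the example run):

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
n = 300  # synthetic storm events (the real study used far fewer)

# Hypothetical drivers: rainfall (mm), peak tide (m), water table (m), wind (m/s)
X = np.column_stack([
    rng.gamma(2.0, 15.0, n),   # total rainfall
    rng.normal(1.0, 0.3, n),   # peak tide level
    rng.normal(0.5, 0.2, n),   # groundwater table level
    rng.rayleigh(5.0, n),      # wind speed
])
# Assumed rule: reports grow with rainfall and tide, plus noise, floored at zero
y = np.maximum(0, 0.08 * X[:, 0] + 6.0 * X[:, 1] - 6.0
               + rng.normal(0, 1.5, n)).round()

model = RandomForestRegressor(n_estimators=200, oob_score=True, random_state=0)
model.fit(X, y)
print("out-of-bag R^2:", round(model.oob_score_, 2))
print("feature importances:", model.feature_importances_.round(2))
```

As in the abstract, the feature importances recover which drivers dominate, and the out-of-bag score doubles as a built-in validation metric when events are too few to hold out a test set.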

  18. 2 Dimensional Hydrodynamic Flood Routing Analysis on Flood Forecasting Modelling for Kelantan River Basin

    Directory of Open Access Journals (Sweden)

    Azad Wan Hazdy

    2017-01-01

    Full Text Available Flood disaster occurs quite frequently in Malaysia and has been categorized as the most threatening natural disaster compared to landslides, hurricanes, tsunamis, haze and others. A study by the Department of Irrigation and Drainage (DID) shows that 9% of land areas in Malaysia are prone to flood, which may affect approximately 4.9 million of the population. Two-dimensional flood routing modelling is becoming broadly utilized for flood plain modelling and is a very effective tool for evaluating floods. Flood propagation can be better understood by simulating the flow and water level using hydrodynamic modelling. The hydrodynamic flood routing can be recognized by the spatial complexity of the schematization, such as a 1D model or a 2D model. It was found that most available hydrological models for flood forecasting focus more on short durations as compared to long-duration hydrological models using the Probabilistic Distribution Moisture Model (PDM). The aim of this paper is to discuss preliminary findings on the development of a flood forecasting model using the PDM for the Kelantan river basin. Among the findings discussed in this paper is the preliminarily calibrated PDM model, which performed reasonably for the December 2014 event but underestimated the peak flows. Apart from that, this paper also discusses findings on Soil Moisture Deficit (SMD) and flood plain analysis. Flood forecasting is a complex process that begins with an understanding of the geographical makeup of the catchment and knowledge of the preferential regions of heavy rainfall and flood behaviour for the area of responsibility. Therefore, to decrease the uncertainty in the model output, it is important to increase the complexity of the model.

  19. Identification of flood-rich and flood-poor periods in flood series

    Science.gov (United States)

    Mediero, Luis; Santillán, David; Garrote, Luis

    2015-04-01

    Recently, a general concern about non-stationarity of flood series has arisen, as changes in catchment response can be driven by several factors, such as climatic and land-use changes. Several studies to detect trends in flood series at either national or trans-national scales have been conducted. Trends are usually detected by the Mann-Kendall test. However, the results of this test depend on the starting and ending year of the series, which can lead to different conclusions depending on the period considered. The results can be conditioned by flood-poor and flood-rich periods located at the beginning or end of the series. A methodology to identify statistically significant flood-rich and flood-poor periods is developed, based on the comparison between the expected sampling variability of floods when stationarity is assumed and the observed variability of floods in a given series. The methodology is applied to a set of long series of annual maximum floods, peaks over threshold and counts of annual occurrences in peaks-over-threshold series observed in Spain in the period 1942-2009. Mediero et al. (2014) found a general decreasing trend in flood series in some parts of Spain that could be caused by a flood-rich period observed in 1950-1970, placed at the beginning of the flood series. The results of this study support the findings of Mediero et al. (2014), as a flood-rich period in 1950-1970 was identified in most of the selected sites. References: Mediero, L., Santillán, D., Garrote, L., Granados, A. Detection and attribution of trends in magnitude, frequency and timing of floods in Spain, Journal of Hydrology, 517, 1072-1088, 2014.
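
The Mann-Kendall test mentioned above fits in a few lines; this minimal version uses the normal approximation and omits the tie correction:

```python
import numpy as np
from math import erfc, sqrt

def mann_kendall(x):
    """Two-sided Mann-Kendall trend test (normal approx., no tie correction)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    s = 0.0                                   # sum of sign(x[j]-x[i]) over pairs i<j
    for i in range(n - 1):
        s += np.sign(x[i + 1:] - x[i]).sum()
    var = n * (n - 1) * (2 * n + 5) / 18.0    # Var(S) under the null hypothesis
    z = (s - np.sign(s)) / sqrt(var) if s != 0 else 0.0   # continuity-corrected
    p = erfc(abs(z) / sqrt(2))                # two-sided p-value
    return s, z, p

rng = np.random.default_rng(1)
trend = 0.5 * np.arange(60) + rng.normal(0, 3, 60)   # series with an upward trend
noise = rng.normal(0, 3, 60)                          # stationary series
for name, series in [("trend", trend), ("stationary", noise)]:
    s, z, p = mann_kendall(series)
    print(f"{name}: S={s:.0f}  z={z:.2f}  p={p:.4g}")
```

Re-running the test on subseries (e.g. dropping the first decades) illustrates the abstract's point that the detected trend depends on the start and end years chosen.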

  20. Flood Foresight: A near-real time flood monitoring and forecasting tool for rapid and predictive flood impact assessment

    Science.gov (United States)

    Revilla-Romero, Beatriz; Shelton, Kay; Wood, Elizabeth; Berry, Robert; Bevington, John; Hankin, Barry; Lewis, Gavin; Gubbin, Andrew; Griffiths, Samuel; Barnard, Paul; Pinnell, Marc; Huyck, Charles

    2017-04-01

    The hours and days immediately after a major flood event are often chaotic and confusing, with first responders rushing to mobilise emergency responders, provide alleviation assistance and assess loss to assets of interest (e.g., population, buildings or utilities). Preparations in advance of a forthcoming event are becoming increasingly important; early warning systems have been demonstrated to be useful tools for decision makers. The extent of damage, human casualties and economic loss estimates can vary greatly during an event, and the timely availability of an accurate flood extent allows emergency response and resources to be optimised, reduces impacts, and helps prioritise recovery. In the insurance sector, for example, insurers are under pressure to respond in a proactive manner to claims rather than waiting for policyholders to report losses. Even though there is a great demand for flood inundation extents and severity information in different sectors, generating flood footprints for large areas from hydraulic models in real time remains a challenge. While such footprints can be produced in real time using remote sensing, weather conditions and sensor availability limit their ability to capture every single flood event across the globe. In this session, we will present Flood Foresight (www.floodforesight.com), an operational tool developed to meet the universal requirement for rapid geographic information, before, during and after major riverine flood events. The tool provides spatial data with which users can measure their current or predicted impact from an event - at building, basin, national or continental scales. Within Flood Foresight, the Screening component uses global rainfall predictions to provide a regional- to continental-scale view of heavy rainfall events up to a week in advance, alerting the user to potentially hazardous situations relevant to them. The Forecasting component enhances the predictive suite of tools by providing a local

  1. Nogales flood detention study

    Science.gov (United States)

    Norman, Laura M.; Levick, Lainie; Guertin, D. Phillip; Callegary, James; Guadarrama, Jesus Quintanar; Anaya, Claudia Zulema Gil; Prichard, Andrea; Gray, Floyd; Castellanos, Edgar; Tepezano, Edgar; Huth, Hans; Vandervoet, Prescott; Rodriguez, Saul; Nunez, Jose; Atwood, Donald; Granillo, Gilberto Patricio Olivero; Ceballos, Francisco Octavio Gastellum

    2010-01-01

    Flooding in Ambos Nogales often exceeds the capacity of the channel and adjacent land areas, endangering many people. The Nogales Wash is being studied to prevent future flood disasters and detention features are being installed in tributaries of the wash. This paper describes the application of the KINEROS2 model and efforts to understand the capacity of these detention features under various flood and urbanization scenarios. Results depict a reduction in peak flow for the 10-year, 1-hour event based on current land use in tributaries with detention features. However, model results also demonstrate that larger storm events and increasing urbanization will put a strain on the features and limit their effectiveness.

  2. What is beyond the big five?

    Science.gov (United States)

    Saucier, G; Goldberg, L R

    1998-08-01

    Previous investigators have proposed that various kinds of person-descriptive content--such as differences in attitudes or values, in sheer evaluation, in attractiveness, or in height and girth--are not adequately captured by the Big Five Model. We report on a rather exhaustive search for reliable sources of Big Five-independent variation in data from person-descriptive adjectives. Fifty-three candidate clusters were developed in a college sample using diverse approaches and sources. In a nonstudent adult sample, clusters were evaluated with respect to a minimax criterion: minimum multiple correlation with factors from Big Five markers and maximum reliability. The most clearly Big Five-independent clusters referred to Height, Girth, Religiousness, Employment Status, Youthfulness and Negative Valence (or low-base-rate attributes). Clusters referring to Fashionableness, Sensuality/Seductiveness, Beauty, Masculinity, Frugality, Humor, Wealth, Prejudice, Folksiness, Cunning, and Luck appeared to be potentially beyond the Big Five, although each of these clusters demonstrated Big Five multiple correlations of .30 to .45, and at least one correlation of .20 and over with a Big Five factor. Of all these content areas, Religiousness, Negative Valence, and the various aspects of Attractiveness were found to be represented by a substantial number of distinct, common adjectives. Results suggest directions for supplementing the Big Five when one wishes to extend variable selection outside the domain of personality traits as conventionally defined.
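
The first half of the minimax criterion, the multiple correlation of a candidate cluster with Big Five marker factors, is just the correlation between the cluster score and its linear-regression prediction from the five factors. A sketch on simulated scores (all data and loadings here are invented, not the study's):

```python
import numpy as np

rng = np.random.default_rng(7)
n = 500
big5 = rng.normal(size=(n, 5))                 # simulated Big Five factor scores

# Two hypothetical clusters: one loading on a factor, one nearly independent
dependent = 0.6 * big5[:, 0] + rng.normal(0, 0.8, n)
independent = rng.normal(size=n)               # e.g. a Religiousness-like cluster

def multiple_r(y, X):
    """Multiple correlation of y with the columns of X (intercept included)."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    return np.corrcoef(y, X1 @ beta)[0, 1]

print("dependent cluster:   R =", round(multiple_r(dependent, big5), 2))
print("independent cluster: R =", round(multiple_r(independent, big5), 2))
```

Clusters with low R against the markers (and adequate reliability) are the "beyond the Big Five" candidates the abstract describes.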

  3. Health impacts of floods.

    Science.gov (United States)

    Du, Weiwei; FitzGerald, Gerard Joseph; Clark, Michele; Hou, Xiang-Yu

    2010-01-01

    Floods are the most common hazard to cause disasters and have led to extensive morbidity and mortality throughout the world. The impact of floods on the human community is related directly to the location and topography of the area, as well as human demographics and characteristics of the built environment. The aim of this study is to identify the health impacts of disasters and the underlying causes of health impacts associated with floods. A conceptual framework is developed that may assist with the development of a rational and comprehensive approach to prevention, mitigation, and management. This study involved an extensive literature review that located >500 references, which were analyzed to identify common themes, findings, and expert views. The findings then were distilled into common themes. The health impacts of floods are wide ranging, and depend on a number of factors. However, the health impacts of a particular flood are specific to the particular context. The immediate health impacts of floods include drowning, injuries, hypothermia, and animal bites. Health risks also are associated with the evacuation of patients, loss of health workers, and loss of health infrastructure including essential drugs and supplies. In the medium-term, infected wounds, complications of injury, poisoning, poor mental health, communicable diseases, and starvation are indirect effects of flooding. In the long-term, chronic disease, disability, poor mental health, and poverty-related diseases including malnutrition are the potential legacy. This article proposes a structured approach to the classification of the health impacts of floods and a conceptual framework that demonstrates the relationships between floods and the direct and indirect health consequences.

  4. How Big Are "Martin's Big Words"? Thinking Big about the Future.

    Science.gov (United States)

    Gardner, Traci

    "Martin's Big Words: The Life of Dr. Martin Luther King, Jr." tells of King's childhood determination to use "big words" through biographical information and quotations. In this lesson, students in grades 3 to 5 explore information on Dr. King to think about his "big" words, then they write about their own…

  5. Quantification of uncertainty in flood risk assessment for flood protection planning: a Bayesian approach

    Science.gov (United States)

    Dittes, Beatrice; Špačková, Olga; Ebrahimian, Negin; Kaiser, Maria; Rieger, Wolfgang; Disse, Markus; Straub, Daniel

    2017-04-01

    Flood risk estimates are subject to significant uncertainties, e.g. due to limited records of historic flood events, uncertainty in flood modeling, uncertain impact of climate change or uncertainty in the exposure and loss estimates. In traditional design of flood protection systems, these uncertainties are typically just accounted for implicitly, based on engineering judgment. In the AdaptRisk project, we develop a fully quantitative framework for planning of flood protection systems under current and future uncertainties using quantitative pre-posterior Bayesian decision analysis. In this contribution, we focus on the quantification of the uncertainties and study their relative influence on the flood risk estimate and on the planning of flood protection systems. The following uncertainty components are included using a Bayesian approach: 1) inherent and statistical (i.e. limited record length) uncertainty; 2) climate uncertainty that can be learned from an ensemble of GCM-RCM models; 3) estimates of climate uncertainty components not covered in 2), such as bias correction, incomplete ensemble, local specifics not captured by the GCM-RCM models; 4) uncertainty in the inundation modelling; 5) uncertainty in damage estimation. We also investigate how these uncertainties may be reduced in the future when new evidence - such as new climate models, observed extreme events, and socio-economic data - becomes available. Finally, we look into how this new evidence influences the risk assessment and effectiveness of flood protection systems. We demonstrate our methodology for a pre-alpine catchment in southern Germany: the Mangfall catchment in Bavaria that includes the city of Rosenheim, which suffered significant losses during the 2013 flood event.
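
A crude forward-propagation sketch of how such uncertainty components might be combined by Monte Carlo (the distributions, the Gumbel flood-frequency model and the damage rule below are all invented placeholders, not the AdaptRisk models):

```python
import numpy as np

rng = np.random.default_rng(11)
n = 100_000

# Invented uncertainty components (placeholders for items 1-5 in the abstract):
gumbel_loc = rng.normal(300, 20, n)      # statistical uncertainty (short record)
gumbel_scale = rng.normal(80, 10, n)
climate_fac = rng.normal(1.1, 0.1, n)    # climate-change scaling from an ensemble
damage_fac = rng.lognormal(0.0, 0.2, n)  # damage-model uncertainty

# Gumbel quantile q_p = loc - scale * ln(-ln(p)) at p = 0.99 (the 100-yr flood),
# scaled per climate sample, then a toy threshold damage rule
q100 = (gumbel_loc - gumbel_scale * np.log(-np.log(0.99))) * climate_fac
damage = damage_fac * np.maximum(0.0, q100 - 500.0) * 0.05   # M EUR, hypothetical

print("median 100-yr discharge:", round(float(np.median(q100))))
print("90% interval of damage:", np.percentile(damage, [5, 95]).round(1))
```

The Bayesian pre-posterior step of the paper goes further, asking how these sampled distributions would tighten once new climate models or observed extremes arrive.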

  6. ARSENIC REMOVAL FROM DRINKING WATER BY IRON REMOVAL USEPA DEMONSTRATION PROJECT AT BIG SAUK LAKE MOBILE HOME PARK IN SAUK CENTRE, MN. SIX MONTH EVALUATION REPORT

    Science.gov (United States)

    This report documents the activities performed and the results obtained from the first six months of the arsenic removal treatment technology demonstration project at the Big Sauk Lake Mobile Home Park (BSLMHP) in Sauk Centre, MN. The objectives of the project are to evaluate the...

  7. The Berlin Inventory of Gambling behavior - Screening (BIG-S): Validation using a clinical sample.

    Science.gov (United States)

    Wejbera, Martin; Müller, Kai W; Becker, Jan; Beutel, Manfred E

    2017-05-18

    Published diagnostic questionnaires for gambling disorder in German are either based on DSM-III criteria or focus on aspects other than life time prevalence. This study was designed to assess the usability of the DSM-IV criteria based Berlin Inventory of Gambling Behavior Screening tool in a clinical sample and adapt it to DSM-5 criteria. In a sample of 432 patients presenting for behavioral addiction assessment at the University Medical Center Mainz, we checked the screening tool's results against clinical diagnosis and compared a subsample of n=300 clinically diagnosed gambling disorder patients with a comparison group of n=132. The BIG-S produced a sensitivity of 99.7% and a specificity of 96.2%. The instrument's unidimensionality and the diagnostic improvements of DSM-5 criteria were verified by exploratory and confirmatory factor analysis as well as receiver operating characteristic analysis. The BIG-S is a reliable and valid screening tool for gambling disorder and demonstrated its concise and comprehensible operationalization of current DSM-5 criteria in a clinical setting.
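Sensitivity and specificity of a screening tool follow directly from confusion-matrix counts. The counts below are a hypothetical split of the n=300 gambling-disorder and n=132 comparison participants, chosen only to be consistent with the reported 99.7% / 96.2%; the study's actual tabulation may differ:

```python
def sens_spec(tp, fn, tn, fp):
    """Sensitivity and specificity from confusion-matrix counts."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical counts: 299 of 300 patients flagged, 5 of 132 controls misflagged
sens, spec = sens_spec(tp=299, fn=1, tn=127, fp=5)
print(f"sensitivity = {sens:.1%}, specificity = {spec:.1%}")
```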

  8. Big Data, Big Problems: A Healthcare Perspective.

    Science.gov (United States)

    Househ, Mowafa S; Aldosari, Bakheet; Alanazi, Abdullah; Kushniruk, Andre W; Borycki, Elizabeth M

    2017-01-01

    Much has been written on the benefits of big data for healthcare such as improving patient outcomes, public health surveillance, and healthcare policy decisions. Over the past five years, Big Data, and the data sciences field in general, has been hyped as the "Holy Grail" for the healthcare industry, promising a more efficient healthcare system and improved healthcare outcomes. However, more recently, healthcare researchers are exposing the potentially harmful effects Big Data can have on patient care, associating it with increased medical costs, patient mortality, and misguided decision making by clinicians and healthcare policy makers. In this paper, we review the current Big Data trends with a specific focus on the inadvertent negative impacts that Big Data could have on healthcare, in general, and specifically, as it relates to patient and clinical care. Our study results show that although Big Data is held up as the "Holy Grail" for healthcare, small data techniques using traditional statistical methods are, in many cases, more accurate and can lead to better healthcare outcomes than Big Data methods. In sum, Big Data for healthcare may cause more problems for the healthcare industry than solutions, and in short, when it comes to the use of data in healthcare, "size isn't everything."

  9. Flexibility in Flood Management Design: Proactive Planning Under Climate Change Uncertainty

    Science.gov (United States)

    Smet, K.; de Neufville, R.; van der Vlist, M.

    2015-12-01

    This paper presents an innovative, value-enhancing procedure for effective planning and design of long-lived flood management infrastructure given uncertain future flooding threats due to climate change. Designing infrastructure that can be adapted over time is a method to safeguard the efficacy of current design decisions given uncertainty about rates and future impacts of climate change. This paper explores the value of embedding "options" in a physical structure, where an option is the right but not the obligation to do something at a later date (e.g. over-dimensioning a floodwall foundation now facilitates a future height addition in response to observed increases in sea level; building of extra pump bays in a pumping station now enables the addition of pumping capacity whenever increased precipitation warrants an expansion.) The proposed procedure couples a simulation model that captures future climate induced changes to the hydrologic operating environment of a structure, with an economic model that estimates the lifetime economic performance of alternative investments. The economic model uses Real "In" Options analysis, a type of cash flow analysis that quantifies the implicit value of options and the flexibility they provide. This procedure is demonstrated using replacement planning for the multi-functional pumping station IJmuiden on the North Sea Canal in the Netherlands. Flexibility in design decisions is modelled, varying the size and specific options included in the new structure. Results indicate that the incorporation of options within the structural design has the potential to improve its economic performance, as compared to more traditional, "build it once and build it big" designs where flexibility is not an explicit design criterion. The added value resulting from the incorporation of flexibility varies with the range of future conditions considered, as well as the options examined. This procedure could be applied more broadly to explore
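
The economic intuition can be sketched with a toy expected-cost comparison: pay a small option premium now (e.g. oversized foundations) and exercise the expansion only in scenarios where climate change demands it. All costs, probabilities and rates below are invented for illustration, not values from the IJmuiden case:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 50_000                   # climate scenarios
r = 0.04                     # discount rate
expand_year = 20

# Hypothetical: in 40% of scenarios climate change makes an expansion necessary
needs_expansion = rng.random(n) < 0.4

cost_big_now = 120.0                            # build full capacity today (M EUR)
cost_small = 80.0                               # smaller structure today
cost_option = 8.0                               # option premium: oversized foundations
cost_exercise = 30.0 / (1 + r) ** expand_year   # discounted cost of later expansion

npv_big = cost_big_now
npv_flexible = cost_small + cost_option + needs_expansion.mean() * cost_exercise
print(f"build-big: {npv_big:.1f} M EUR, flexible: {npv_flexible:.1f} M EUR")
```

Because the expansion cost is both deferred (discounted) and only incurred when needed, the flexible design can dominate the "build it once and build it big" design, which is the core Real "In" Options argument.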

  10. Big Surveys, Big Data Centres

    Science.gov (United States)

    Schade, D.

    2016-06-01

    Well-designed astronomical surveys are powerful and have consistently been keystones of scientific progress. The Byurakan Surveys using a Schmidt telescope with an objective prism produced a list of about 3000 UV-excess Markarian galaxies but these objects have stimulated an enormous amount of further study and appear in over 16,000 publications. The CFHT Legacy Surveys used a wide-field imager to cover thousands of square degrees and those surveys are mentioned in over 1100 publications since 2002. Both ground and space-based astronomy have been increasing their investments in survey work. Survey instrumentation strives toward fair samples and large sky coverage and therefore tends to produce massive datasets. Thus we are faced with the "big data" problem in astronomy. Survey datasets require specialized approaches to data management. Big data places additional challenging requirements for data management. If the term "big data" is defined as data collections that are too large to move then there are profound implications for the infrastructure that supports big data science. The current model of data centres is obsolete. In the era of big data the central problem is how to create architectures that effectively manage the relationship between data collections, networks, processing capabilities, and software, given the science requirements of the projects that need to be executed. A stand-alone data silo cannot support big data science. I'll describe the current efforts of the Canadian community to deal with this situation and our successes and failures. I'll talk about how we are planning in the next decade to try to create a workable and adaptable solution to support big data science.

  11. Recht voor big data, big data voor recht

    NARCIS (Netherlands)

    Lafarre, Anne

    Big data is a phenomenon that can no longer be ignored in our society. It is past the hype cycle, and the first implementations of big data techniques are being carried out. But what exactly is big data? What do the five V's that are often mentioned in relation to big data stand for? By way of introduction to

  12. Modeling urban coastal flood severity from crowd-sourced flood reports using Poisson regression and Random Forest

    Science.gov (United States)

    Sadler, J. M.; Goodall, J. L.; Morsy, M. M.; Spencer, K.

    2018-04-01

    Sea level rise has already caused more frequent and severe coastal flooding and this trend will likely continue. Flood prediction is an essential part of a coastal city's capacity to adapt to and mitigate this growing problem. Complex coastal urban hydrological systems, however, do not always lend themselves easily to physically-based flood prediction approaches. This paper presents a method for using a data-driven approach to estimate flood severity in an urban coastal setting using crowd-sourced data, a non-traditional but growing data source, along with environmental observation data. Two data-driven models, Poisson regression and Random Forest regression, are trained to predict the number of flood reports per storm event as a proxy for flood severity, given extensive environmental data (i.e., rainfall, tide, groundwater table level, and wind conditions) as input. The method is demonstrated using data from Norfolk, Virginia USA from September 2010 to October 2016. Quality-controlled, crowd-sourced street flooding reports ranging from 1 to 159 per storm event for 45 storm events are used to train and evaluate the models. Random Forest performed better than Poisson regression at predicting the number of flood reports and had a lower false negative rate. From the Random Forest model, total cumulative rainfall was by far the most dominant input variable in predicting flood severity, followed by low tide and lower low tide. These methods serve as a first step toward using data-driven methods for spatially and temporally detailed coastal urban flood prediction.
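
A sketch of the two-model comparison on synthetic storm data; the count-generating process, with a rainfall threshold that a log-linear Poisson model cannot represent exactly, is an assumption chosen to make the comparison interesting, not the Norfolk data:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import PoissonRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(5)
n = 400
rain = rng.gamma(2.0, 20.0, n)          # cumulative rainfall (mm)
tide = rng.normal(1.0, 0.3, n)          # peak tide (m)
X = np.column_stack([rain, tide])
# Assumed count process: no reports until a rainfall threshold, then linear growth
lam = np.where(rain > 40, 0.15 * (rain - 40), 0.0) + np.exp(tide - 1)
y = rng.poisson(lam)                    # flood reports per storm event

Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.25, random_state=5)
maes = {}
for model in (PoissonRegressor(alpha=1e-4, max_iter=1000),
              RandomForestRegressor(random_state=5)):
    pred = model.fit(Xtr, ytr).predict(Xte)
    maes[type(model).__name__] = mean_absolute_error(yte, pred)
    print(f"{type(model).__name__}: MAE = {maes[type(model).__name__]:.2f}")
```

On this toy data the relative ranking may vary run to run; in the study itself, Random Forest outperformed Poisson regression.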

  13. Determining tropical cyclone inland flooding loss on a large scale through a new flood peak ratio-based methodology

    International Nuclear Information System (INIS)

    Czajkowski, Jeffrey; Michel-Kerjan, Erwann; Villarini, Gabriele; Smith, James A

    2013-01-01

    In recent years, the United States has been severely affected by numerous tropical cyclones (TCs) which have caused massive damages. While media attention mainly focuses on coastal losses from storm surge, these TCs have inflicted significant devastation inland as well. Yet, little is known about the relationship between TC-related inland flooding and economic losses. Here we introduce a novel methodology that first successfully characterizes the spatial extent of inland flooding, and then quantifies its relationship with flood insurance claims. Hurricane Ivan in 2004 is used as illustration. We empirically demonstrate in a number of ways that our quantified inland flood magnitude produces a very good representation of the number of inland flood insurance claims experienced. These results highlight the new technological capabilities that can lead to a better risk assessment of inland TC flood. This new capacity will be of tremendous value to a number of public and private sector stakeholders dealing with disaster preparedness. (letter)
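The flood peak ratio idea normalizes an event's peak discharge by a reference flood quantile at each gauge, making magnitudes comparable across basins of different sizes. A sketch with hypothetical gauge values; using the 10-year flood as the reference is an assumption for illustration:

```python
import numpy as np

# Hypothetical gauges: observed event peak vs each gauge's 10-year flood (m^3/s)
event_peak = np.array([850.0, 420.0, 1300.0, 95.0])
q10 = np.array([600.0, 500.0, 650.0, 120.0])

# Flood peak ratio: values > 1 mean the event exceeded the 10-year flood
fpr = event_peak / q10
for g, ratio in enumerate(fpr):
    print(f"gauge {g}: FPR = {ratio:.2f}")
```

Mapping such ratios over a storm's footprint gives a spatial inland flood magnitude of the kind the letter relates to flood insurance claim counts.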

  14. Flood simulation and verification with IoT sensors

    Science.gov (United States)

    Chang, Che-Hao; Hsu, Chih-Tsung; Wu, Shiang-Jen; Huang, Sue-Wei

    2017-04-01

    2D dynamic flood simulation is a vivid tool for demonstrating the area likely to be exposed to the impact of high water levels. With progress in high-resolution digital terrain models, simulation results look convincing, yet they are rarely verified against what actually happened. Because of the dynamic and uncertain nature of flooding, the exposed area usually cannot be well delineated during a flood event. Recent developments in IoT sensors provide low-power, long-range communication that helps us collect real-time flood depths. With these time series of flood depths at different locations, we are able to verify the simulation results for the corresponding flood event. Sixteen flood gauges built to IoT specifications, together with two flood events in Annan District, Tainan City, Taiwan, are examined in this study. During the event of 11 June 2016, 12 flood gauges worked well and 8 of them provided observations that matched the simulation.

  15. Big Data Analytics in the Management of Business

    Directory of Open Access Journals (Sweden)

    Jelonek Dorota

    2017-01-01

    Full Text Available Data, information, knowledge have always played a critical role in business. The amount of various data that can be collected and stored is increasing, therefore companies need new solutions for data processing and analysis. The paper presents considerations on the concept of Big Data. The aim of the paper is to demonstrate that Big Data analytics is an effective support in managing the company. It also indicates the areas and activities where the use of Big Data analytics can bring the greatest benefits to companies.

  16. Variations in flood magnitude-effect relations and the implications for flood risk assessment and river management

    Science.gov (United States)

    Hooke, J. M.

    2015-12-01

    In spite of major physical impacts from large floods, present river management rarely takes into account the possible dynamics and variation of magnitude-impact relations over time in flood risk mapping and assessment, nor does it incorporate the feedback effects of changes into modelling. Using examples from the literature and from field measurements over several decades in two contrasting environments, a semi-arid region and a humid-temperate region, temporal variations in channel response to flood events are evaluated. The evidence demonstrates how the physical impacts of floods can vary at a location over time. The factors influencing that variation on differing timescales are examined. The analysis indicates the importance of morphological changes and the trajectory of adjustment in relation to thresholds, and shows that trends in force or resistance can take place over various timescales, altering those thresholds. Sediment supply can also change with altered connectivity upstream and changes in the state of hillslope-channel coupling. Seasonal timing and the sequence of events can affect response, particularly deposition through sediment supply. Duration can also have a significant effect and modify the magnitude relation. Lack of response, or of deposits, in some events can mean that flood frequency estimated from such evidence is underestimated. A framework for assessment of both past and possible future changes is provided which emphasises the uncertainty and inconstancy of the magnitude-impact relation and highlights the dynamic factors and nature of variability that should be considered in sustainable management of river channels.

  17. Risk-trading in flood management: An economic model.

    Science.gov (United States)

    Chang, Chiung Ting

    2017-09-15

    Although flood management is no longer exclusively a topic of engineering, flood mitigation continues to be associated with hard engineering options. Flood adaptation or the capacity to adapt to flood risk, as well as a demand for internalizing externalities caused by flood risk between regions, complicate flood management activities. Even though integrated river basin management has long been recommended to resolve the above issues, it has proven difficult to apply widely, and sometimes even to bring into existence. This article explores how internalization of externalities as well as the realization of integrated river basin management can be encouraged via the use of a market-based approach, namely a flood risk trading program. In addition to maintaining efficiency of optimal resource allocation, a flood risk trading program may also provide a more equitable distribution of benefits by facilitating decentralization. This article employs a graphical analysis to show how flood risk trading can be implemented to encourage mitigation measures that increase infiltration and storage capacity. A theoretical model is presented to demonstrate the economic conditions necessary for flood risk trading. Copyright © 2017 Elsevier Ltd. All rights reserved.

  18. Flood hazard assessment in areas prone to flash flooding

    Science.gov (United States)

    Kvočka, Davor; Falconer, Roger A.; Bray, Michaela

    2016-04-01

    Contemporary climate projections suggest that there will be an increase in the occurrence of high-intensity rainfall events in the future. These precipitation extremes are usually the main cause for the emergence of extreme flooding, such as flash flooding. Flash floods are among the most unpredictable, violent and fatal natural hazards in the world. Furthermore, it is expected that flash flooding will occur even more frequently in the future due to more frequent development of extreme weather events, which will greatly increase the danger to people caused by flash flooding. This being the case, there will be a need for high resolution flood hazard maps in areas susceptible to flash flooding. This study investigates what type of flood hazard assessment methods should be used for assessing the flood hazard to people caused by flash flooding. Two different types of flood hazard assessment methods were tested: (i) a widely used method based on an empirical analysis, and (ii) a new, physically based and experimentally calibrated method. Two flash flood events were considered herein, namely: the 2004 Boscastle flash flood and the 2007 Železniki flash flood. The results obtained in this study suggest that in the areas susceptible to extreme flooding, the flood hazard assessment should be conducted using methods based on a mechanics-based analysis. In comparison to standard flood hazard assessment methods, these physically based methods: (i) take into account all of the physical forces, which act on a human body in floodwater, (ii) successfully adapt to abrupt changes in the flow regime, which often occur for flash flood events, and (iii) rapidly assess a flood hazard index in a relatively short period of time.
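The "empirical analysis" family of methods mentioned above commonly takes the form of a depth-velocity hazard rating. A minimal sketch, assuming the UK formulation HR = d * (v + 0.5) + DF and its usual class thresholds (both are assumptions here, not details taken from this abstract):

```python
def hazard_rating(depth_m, velocity_ms, debris_factor=0.5):
    """Empirical flood hazard rating: depth (m) * (velocity (m/s) + 0.5) + debris factor."""
    return depth_m * (velocity_ms + 0.5) + debris_factor

def hazard_class(hr):
    """Map a hazard rating to a danger class (thresholds vary between guidance documents)."""
    if hr < 0.75:
        return "low"
    if hr < 1.25:
        return "moderate"
    if hr < 2.5:
        return "significant"
    return "extreme"

# Illustrative depth/velocity pairs, from shallow slow flow to flash-flood conditions.
for d, v in [(0.2, 0.5), (0.8, 1.5), (1.5, 3.0)]:
    hr = hazard_rating(d, v)
    print(d, v, round(hr, 2), hazard_class(hr))
```

A mechanics-based method of the kind the abstract favours would replace this single formula with explicit force balances on a human body, which is why it adapts better to abrupt flow-regime changes.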

  19. A little big history of Tiananmen

    OpenAIRE

    Quaedackers, E.; Grinin, L.E.; Korotayev, A.V.; Rodrigue, B.H.

    2011-01-01

    This contribution aims at demonstrating the usefulness of studying small-scale subjects such as Tiananmen, or the Gate of Heavenly Peace, in Beijing - from a Big History perspective. By studying such a ‘little big history’ of Tiananmen, previously overlooked yet fundamental explanations for why people built the gate the way they did can be found. These explanations are useful in their own right and may also be used to deepen our understanding of more traditional explanations of why Tiananmen ...

  20. Automated Big Traffic Analytics for Cyber Security

    OpenAIRE

    Miao, Yuantian; Ruan, Zichan; Pan, Lei; Wang, Yu; Zhang, Jun; Xiang, Yang

    2018-01-01

    Network traffic analytics technology is a cornerstone for cyber security systems. We demonstrate its use through three popular and contemporary cyber security applications in intrusion detection, malware analysis and botnet detection. However, automated traffic analytics faces the challenges raised by big traffic data. In terms of big data's three characteristics --- volume, variety and velocity, we review three state of the art techniques to mitigate the key challenges including real-time tr...

  1. Principles of big data preparing, sharing, and analyzing complex information

    CERN Document Server

    Berman, Jules J

    2013-01-01

    Principles of Big Data helps readers avoid the common mistakes that endanger all Big Data projects. By stressing simple, fundamental concepts, this book teaches readers how to organize large volumes of complex data, and how to achieve data permanence when the content of the data is constantly changing. General methods for data verification and validation, as specifically applied to Big Data resources, are stressed throughout the book. The book demonstrates how adept analysts can find relationships among data objects held in disparate Big Data resources, when the data objects are endo

  2. Fragmented patterns of flood change across the United States

    Science.gov (United States)

    Archfield, Stacey A.; Hirsch, Robert M.; Viglione, A.; Blöschl, G.

    2016-01-01

    Trends in the peak magnitude, frequency, duration, and volume of frequent floods (floods occurring at an average of two events per year relative to a base period) across the United States show large changes; however, few trends are found to be statistically significant. The multidimensional behavior of flood change across the United States can be described by four distinct groups, with streamgages experiencing (1) minimal change, (2) increasing frequency, (3) decreasing frequency, or (4) increases in all flood properties. Yet group membership shows only weak geographic cohesion. Lack of geographic cohesion is further demonstrated by weak correlations between the temporal patterns of flood change and large-scale climate indices. These findings reveal a complex, fragmented pattern of flood change that, therefore, clouds the ability to make meaningful generalizations about flood change across the United States.
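A single-gauge version of the kind of trend screening described above can be sketched with a Kendall tau test, used here as a stand-in for the study's actual trend methodology; the peak-flow series below is synthetic:

```python
import numpy as np
from scipy.stats import kendalltau

rng = np.random.default_rng(1)
years = np.arange(1966, 2016)
# Synthetic stationary annual peak flows (m^3/s), Gumbel-distributed.
peaks = rng.gumbel(loc=500.0, scale=120.0, size=years.size)

# Kendall's tau between time and peaks approximates a Mann-Kendall trend test.
tau, p_value = kendalltau(years, peaks)
significant = p_value < 0.05
print(f"tau={tau:.3f}, p={p_value:.3f}, significant={significant}")
```

Applied gauge by gauge, a screen like this typically flags few significant trends in stationary series, which is consistent with the abstract's finding that large apparent changes are rarely statistically significant.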

  3. The use of Natural Flood Management to mitigate local flooding in the rural landscape

    Science.gov (United States)

    Wilkinson, Mark; Quinn, Paul; Ghimire, Sohan; Nicholson, Alex; Addy, Steve

    2014-05-01

    The past decade has seen increases in the occurrence of flood events across Europe, putting a growing number of settlements of varying sizes at risk. The issue of flooding in smaller villages is usually not well publicised. In these small communities, the costs of constructing and maintaining traditional flood defences often outweigh the potential benefits, which has led to a growing quest for more cost-effective and sustainable approaches. Here we aim to provide such an approach that, alongside flood risk reduction, also has the multipurpose benefits of sediment control, water quality amelioration, and habitat creation. Natural flood management (NFM) aims to reduce flooding by working with natural features and characteristics to slow down or temporarily store flood waters. NFM measures include dynamic water storage ponds and wetlands, interception bunds, channel restoration and instream wood placement, and increasing soil infiltration through soil management and tree planting. Based on integrated monitoring and modelling studies, we demonstrate the potential to manage runoff locally using NFM in rural systems by effectively managing flow pathways (hill slopes and small channels) and by exploiting floodplains and buffer strips. Case studies from across the UK show that temporary storage ponds (ranging from 300 to 3000 m³) and other NFM measures can reduce peak flows in small catchments (5 to 10 km²) by up to 15 to 30 percent. In addition, increasing the overall effective storage capacity through a network of NFM measures was found to be most effective for the total reduction of local flood peaks. Hydraulic modelling has shown that the positioning of such features within the catchment, and how they are connected to the main channel, may also affect their effectiveness. Field evidence has shown that these ponds can collect significant accumulations of fine sediment during flood events. On the other hand, measures such as wetlands could also play an important role during low flow

  4. Flood Risk and Probabilistic Benefit Assessment to Support Management of Flood-Prone Lands: Evidence From Candaba Floodplains, Philippines

    Science.gov (United States)

    Juarez, A. M.; Kibler, K. M.; Sayama, T.; Ohara, M.

    2016-12-01

    Flood management decision-making is often supported by risk assessment, which may overlook the role of coping capacity and the potential benefits derived from direct use of flood-prone land. Alternatively, risk-benefit analysis can support floodplain management to yield maximum socio-ecological benefits for the minimum flood risk. We evaluate flood risk-probabilistic benefit tradeoffs of livelihood practices compatible with direct human use of flood-prone land (agriculture/wild fisheries) and nature conservation (wild fisheries only) in Candaba, Philippines. Located north-west of Metro Manila, the Candaba area is a multi-functional landscape that provides a temporally-variable mix of possible land uses, benefits and ecosystem services of local and regional value. To characterize inundation from 1.3- to 100-year recurrence intervals we couple frequency analysis with rainfall-runoff-inundation modelling and remotely-sensed data. By combining simulated probabilistic floods with both damage and benefit functions (e.g. fish capture and rice yield with flood intensity) we estimate potential damages and benefits over varying probabilistic flood hazards. We find that although direct human uses of flood-prone land are associated with damages, for all the investigated magnitudes of flood events with different frequencies, the probabilistic benefits ($91 million) exceed risks by a large margin ($33 million). Even considering risk, probabilistic livelihood benefits of direct human uses far exceed benefits provided by scenarios that exclude direct "risky" human uses (difference of $85 million). In addition, we find that individual coping strategies, such as adapting crop planting periods to the flood pulse or fishing rather than cultivating rice in the wet season, minimize flood losses ($6 million) while allowing for valuable livelihood benefits ($125 million) in flood-prone land. Analysis of societal benefits and local capacities to cope with regular floods demonstrate the
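The risk-benefit tradeoff described above reduces to integrating event damages and benefits over exceedance probability. A hedged sketch with invented numbers (not Candaba's), using the trapezoidal rule to get expected annual values:

```python
import numpy as np

return_periods = np.array([1.3, 2.0, 5.0, 10.0, 25.0, 50.0, 100.0])  # years
p_exceed = 1.0 / return_periods                                      # events/yr
damages  = np.array([0.1, 0.4, 1.0, 2.5, 5.0, 8.0, 12.0])            # $M per event (invented)
benefits = np.array([3.0, 4.0, 5.5, 6.0, 6.5, 6.8, 7.0])             # $M per event (invented)

def expected_annual(values, probs):
    """Trapezoidal integral of event value over exceedance probability."""
    order = np.argsort(probs)
    p, v = probs[order], values[order]
    return float(np.sum(0.5 * (v[1:] + v[:-1]) * np.diff(p)))

ead = expected_annual(damages, p_exceed)   # expected annual damage
eab = expected_annual(benefits, p_exceed)  # expected annual benefit
print(f"expected annual damage  = {ead:.2f} $M/yr")
print(f"expected annual benefit = {eab:.2f} $M/yr")
```

Because benefits accrue in the frequent, low-magnitude events while large damages are confined to rare floods, the expected annual benefit dominates, the same qualitative result the abstract reports.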

  5. BigOP: Generating Comprehensive Big Data Workloads as a Benchmarking Framework

    OpenAIRE

    Zhu, Yuqing; Zhan, Jianfeng; Weng, Chuliang; Nambiar, Raghunath; Zhang, Jinchao; Chen, Xingzhen; Wang, Lei

    2014-01-01

    Big Data is considered a proprietary asset of companies, organizations, and even nations. Turning big data into real treasure requires the support of big data systems. A variety of commercial and open source products have been released for big data storage and processing. While big data users are facing the choice of which system best suits their needs, big data system developers are facing the question of how to evaluate their systems with regard to general big data processing needs. System b

  6. Cognitive computing and big data analytics

    CERN Document Server

    Hurwitz, Judith; Bowles, Adrian

    2015-01-01

    MASTER THE ABILITY TO APPLY BIG DATA ANALYTICS TO MASSIVE AMOUNTS OF STRUCTURED AND UNSTRUCTURED DATA Cognitive computing is a technique that allows humans and computers to collaborate in order to gain insights and knowledge from data by uncovering patterns and anomalies. This comprehensive guide explains the underlying technologies, such as artificial intelligence, machine learning, natural language processing, and big data analytics. It then demonstrates how you can use these technologies to transform your organization. You will explore how different vendors and different industries are a

  7. Arsenic Removal from Drinking Water by Iron Removal - U.S. EPA Demonstration Project at Big Sauk Lake Mobile Home Park in Sauk Centre, MN Final Performance Evaluation Report

    Science.gov (United States)

    This report documents the activities performed and the results obtained from the one-year arsenic removal treatment technology demonstration project at the Big Sauk Lake Mobile Home Park (BSLMHP) in Sauk Centre, MN. The objectives of the project are to evaluate (1) the effective...

  8. Temporal clustering of floods in Germany: Do flood-rich and flood-poor periods exist?

    Science.gov (United States)

    Merz, Bruno; Nguyen, Viet Dung; Vorogushyn, Sergiy

    2016-10-01

    The repeated occurrence of exceptional floods within a few years, such as the Rhine floods in 1993 and 1995 and the Elbe and Danube floods in 2002 and 2013, suggests that floods in Central Europe may be organized in flood-rich and flood-poor periods. This hypothesis is studied by testing the significance of temporal clustering in flood occurrence (peak-over-threshold) time series for 68 catchments across Germany for the period 1932-2005. To assess the robustness of the results, different methods are used: Firstly, the index of dispersion, which quantifies the departure from a homogeneous Poisson process, is investigated. Further, the time-variation of the flood occurrence rate is derived by non-parametric kernel implementation and the significance of clustering is evaluated via parametric and non-parametric tests. Although the methods give consistent overall results, the specific results differ considerably. Hence, we recommend applying different methods when investigating flood clustering. For flood estimation and risk management, it is of relevance to understand whether clustering changes with flood severity and time scale. To this end, clustering is assessed for different thresholds and time scales. It is found that the majority of catchments show temporal clustering at the 5% significance level for low thresholds and time scales of one to a few years. However, clustering decreases substantially with increasing threshold and time scale. We hypothesize that flood clustering in Germany is mainly caused by catchment memory effects along with intra- to inter-annual climate variability, and that decadal climate variability plays a minor role.
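The index of dispersion mentioned above has a compact form: for a homogeneous Poisson process, the variance of event counts per time window equals their mean, so a variance-to-mean ratio well above 1 signals clustering. A minimal sketch on synthetic occurrence series (not the German gauge data):

```python
import numpy as np

rng = np.random.default_rng(2)

def dispersion_index(event_years, start, end, window=1):
    """Variance/mean of event counts per window; ~1 for a homogeneous Poisson process."""
    edges = np.arange(start, end + window, window)
    counts, _ = np.histogram(event_years, bins=edges)
    return counts.var(ddof=1) / counts.mean()

# Poisson-like occurrences vs. occurrences concentrated in "flood-rich" periods.
poisson_events = rng.uniform(1932, 2005, size=150)
clustered_events = np.concatenate([rng.uniform(1940, 1950, 100),
                                   rng.uniform(1990, 2000, 50)])

print("no clustering:", round(dispersion_index(poisson_events, 1932, 2005), 2))
print("clustered:   ", round(dispersion_index(clustered_events, 1932, 2005), 2))
```

Raising the peak-over-threshold level thins the event series, which is one reason the clustering signal weakens with increasing threshold, as the abstract reports.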

  9. How Big Is Too Big?

    Science.gov (United States)

    Cibes, Margaret; Greenwood, James

    2016-01-01

    Media Clips appears in every issue of Mathematics Teacher, offering readers contemporary, authentic applications of quantitative reasoning based on print or electronic media. This issue features "How Big is Too Big?" (Margaret Cibes and James Greenwood) in which students are asked to analyze the data and tables provided and answer a…

  10. Big domains are novel Ca²+-binding modules: evidence from Big domains of Leptospira immunoglobulin-like (Lig) proteins.

    Science.gov (United States)

    Raman, Rajeev; Rajanikanth, V; Palaniappan, Raghavan U M; Lin, Yi-Pin; He, Hongxuan; McDonough, Sean P; Sharma, Yogendra; Chang, Yung-Fu

    2010-12-29

    Many bacterial surface-exposed proteins mediate the host-pathogen interaction more effectively in the presence of Ca²+. Leptospiral immunoglobulin-like (Lig) proteins, LigA and LigB, are surface-exposed proteins containing Bacterial immunoglobulin-like (Big) domains. The function of proteins containing the Big fold is not known. Based on the possible similarities of immunoglobulin and βγ-crystallin folds, we here explore the important question of whether Ca²+ binds to Big domains, which would provide a novel functional role for proteins containing the Big fold. We selected six individual Big domains for this study (three from the conserved part of LigA and LigB, denoted Lig A3, Lig A4, and LigBCon5; two from the variable region of LigA, i.e., the 9th (Lig A9) and 10th (Lig A10) repeats; and one from the variable region of LigB, i.e., LigBCen2). We have also studied the conserved regions covering the three and six repeats (LigBCon1-3 and LigCon). All these proteins bind the calcium-mimic dye Stains-all. All four of the selected domains bind Ca²+ with dissociation constants of 2-4 µM. The Lig A9 and Lig A10 domains fold well with moderate thermal stability, have β-sheet conformation and form homodimers. Fluorescence spectra of the Big domains show a specific doublet (at 317 and 330 nm), probably due to Trp interaction with a Phe residue. Equilibrium unfolding of the selected Big domains is similar and follows a two-state model, suggesting similarity in their fold. We demonstrate that the Lig proteins are Ca²+-binding proteins, with the Big domains harbouring the binding motif. We conclude that despite differences in sequence, a Big motif binds Ca²+. This work thus sets up a strong possibility for classifying proteins containing Big domains as a novel family of Ca²+-binding proteins. Since the Big domain is a part of many proteins in the bacterial kingdom, we suggest a possible function of these proteins via Ca²+ binding.

  11. An Evaluation of Selected Extraordinary Floods in the United States Reported by the U.S. Geological Survey and Implications for Future Advancement of Flood Science

    Science.gov (United States)

    Costa, John E.; Jarrett, Robert D.

    2008-01-01

    discharges that were estimated by an inappropriate method (slope-area) (Big Creek near Waynesville, North Carolina; Day Creek near Etiwanda, California). Original field notes and records could not be found for three of the floods, however, some data (copies of original materials, records of reviews) were available for two of these floods. A rating was assigned to each of seven peak discharges that had no rating. Errors identified in the reviews include misidentified flow processes, incorrect drainage areas for very small basins, incorrect latitude and longitude, improper field methods, arithmetic mistakes in hand calculations, omission of measured high flows when developing rating curves, and typographical errors. Common problems include use of two-section slope-area measurements, poor site selection, uncertainties in Manning's n-values, inadequate review, lost data files, and insufficient and inadequately described high-water marks. These floods also highlight the extreme difficulty in making indirect discharge measurements following extraordinary floods. Significantly, none of the indirect measurements are rated better than fair, which indicates the need to improve methodology to estimate peak discharge. Highly unsteady flow and resulting transient hydraulic phenomena, two-dimensional flow patterns, debris flows at streamflow-gaging stations, and the possibility of disconnected flow surfaces are examples of unresolved problems not well handled by current indirect discharge methodology. On the basis of a comprehensive review of 50,000 annual peak discharges and miscellaneous floods in California, problems with individual flood peak discharges would be expected to require a revision of discharge or rating curves at a rate no greater than about 0.10 percent of all floods. Many extraordinary floods create complex flow patterns and processes that cannot be adequately documented with quasi-steady, uniform one-dimensional analyses. These floods are most accura

  12. Considerations on the extreme flood produced in Raul Mare Basin (Retezat Mountains, Romania)

    International Nuclear Information System (INIS)

    Barbuc, Mihai

    2004-01-01

    The aim of this paper is to illustrate the major impact of an extreme flood on the landscape of the upper basin of Raul Mare, in the Retezat Mountains, Romania, and what 'hazardous phenomenon' means. Romania is one of the European countries most severely affected by natural hazards, which have a big social and economic impact. Among them, floods are very frequent and have some of the most important effects on settlements, agriculture and communications. Raul Mare has three main sources: Lapusnicul Mare, Lapusnicul Mic and Raul Ses. It springs from glacier lakes at high altitude, over 2000 m, and has torrential, narrow valleys. At present, their confluence at Gura Apelor is covered by an anthropic lake, formed behind a great dam, 173 m high. This dam had a major role in attenuating the otherwise uncontrollable extreme flood of July 1990 and, at the same time, in significantly reducing the damages in the Hateg depression, a low area with many settlements and economic objectives. Upstream of the Gura Apelor lake, on the Lapusnicul Mare and Mic valleys, the flash flood covered the whole channel, its effects on the landscape were devastating, and the flood probability was between 0.1-0.1 %. The maps, graphics and pictures presented in this paper emphasize the situation before and after the event. Furthermore, some standard forms to be filled in by authorities for immediate and unitary recording of extreme phenomena are presented. (Author)

  13. Sequential planning of flood protection infrastructure under limited historic flood record and climate change uncertainty

    Science.gov (United States)

    Dittes, Beatrice; Špačková, Olga; Straub, Daniel

    2017-04-01

    Flood protection is often designed to safeguard people and property following regulations and standards, which specify a target design flood protection level, such as the 100-year flood level prescribed in Germany (DWA, 2011). In practice, the magnitude of such an event is only known within a range of uncertainty, which is caused by limited historic records and uncertain climate change impacts, among other factors (Hall & Solomatine, 2008). As more observations and improved climate projections become available in the future, the design flood estimate changes and the capacity of the flood protection may be deemed insufficient at a future point in time. This problem can be mitigated by the implementation of flexible flood protection systems (that can easily be adjusted in the future) and/or by adding an additional reserve to the flood protection, i.e. by applying a safety factor to the design. But how high should such a safety factor be? And how much should the decision maker be willing to pay to make the system flexible, i.e. what is the Value of Flexibility (Špačková & Straub, 2017)? We propose a decision model that identifies cost-optimal decisions on flood protection capacity in the face of uncertainty (Dittes et al. 2017). It considers sequential adjustments of the protection system during its lifetime, taking into account its flexibility. The proposed framework is based on pre-posterior Bayesian decision analysis, using Decision Trees and Markov Decision Processes, and is fully quantitative. It can include a wide range of uncertainty components such as uncertainty associated with limited historic record or uncertain climate or socio-economic change. It is shown that since flexible systems are less costly to adjust when flood estimates are changing, they justify initially lower safety factors. Investigation on the Value of Flexibility (VoF) demonstrates that VoF depends on the type and degree of uncertainty, on the learning effect (i.e. kind and quality of
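The core tradeoff of the sequential framework can be illustrated with a toy two-stage expected-cost calculation. All capacities, probabilities and unit costs below are invented, and the sketch deliberately collapses the paper's Decision Trees / Markov Decision Processes into a single adjustment stage:

```python
def expected_cost(initial_capacity, adjust_cost_per_unit,
                  flood_scenarios, build_cost_per_unit=1.0):
    """Build now, then pay to close any capacity shortfall once the design flood is learned.

    flood_scenarios: list of (probability, required_capacity) pairs.
    """
    cost = build_cost_per_unit * initial_capacity
    for p, required in flood_scenarios:
        shortfall = max(0.0, required - initial_capacity)
        cost += p * adjust_cost_per_unit * shortfall
    return cost

# Uncertain future design-flood requirement (invented distribution).
scenarios = [(0.5, 90.0), (0.3, 110.0), (0.2, 140.0)]

def best_capacity(adjust_cost):
    candidates = [90.0, 100.0, 110.0, 120.0, 130.0, 140.0]
    return min(candidates, key=lambda c: expected_cost(c, adjust_cost, scenarios))

rigid_capacity = best_capacity(adjust_cost=5.0)     # retrofitting is expensive
flexible_capacity = best_capacity(adjust_cost=1.5)  # cheap to adjust later
print("rigid optimum:", rigid_capacity, " flexible optimum:", flexible_capacity)
```

Even in this toy setting, the flexible system's optimal initial capacity is lower than the rigid one's, reproducing the paper's observation that flexibility justifies initially lower safety factors.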

  14. Evaluating the impact and risk of pluvial flash flood on intra-urban road network: A case study in the city center of Shanghai, China

    Science.gov (United States)

    Yin, Jie; Yu, Dapeng; Yin, Zhane; Liu, Min; He, Qing

    2016-06-01

    Urban pluvial floods are attracting growing public concern due to increasingly intense precipitation and growing consequences. Accurate risk assessment is critical to efficient urban pluvial flood management, particularly in the transportation sector. This paper describes an integrated methodology, which makes use of high-resolution 2D inundation modeling and a flood-depth-dependent measure to evaluate the potential impact and risk of pluvial flash floods on the road network in the city center of Shanghai, China. Intensity-Duration-Frequency relationships of Shanghai rainstorms and the Chicago Design Storm are combined to generate ensemble rainfall scenarios. A hydrodynamic model (FloodMap-HydroInundation2D) is used to simulate overland flow and flood inundation for each scenario. Furthermore, road impact and risk assessments are conducted by a newly proposed algorithm and proxy, respectively. Results suggest that the flood response is a function of the spatio-temporal distribution of precipitation and local characteristics (i.e. drainage and topography), and pluvial flash flooding is found to have a proportionate but nonlinear impact on intra-urban road inundation risk. The approach tested here provides more detailed flood information for smart management of urban street networks and may be applied to other big cities where road flood risk is evolving in the context of climate change and urbanization.
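The ensemble rainfall scenarios above are built from IDF relationships and a design storm. As a simpler stand-in for the Chicago Design Storm, a design hyetograph can be derived from an assumed IDF curve i = a / (t + b)^c with the alternating block method (the coefficients below are invented, not Shanghai's):

```python
import numpy as np

def alternating_block_hyetograph(a, b, c, duration_min, dt_min):
    """Design hyetograph from IDF curve i = a / (t + b)**c via the alternating block method."""
    t = np.arange(dt_min, duration_min + dt_min, dt_min)  # durations (min)
    intensity = a / (t + b) ** c                          # mean intensity (mm/h)
    cum_depth = intensity * t / 60.0                      # cumulative depth (mm)
    increments = np.diff(cum_depth, prepend=0.0)          # depth per block (mm)
    # Place the largest block in the middle, then alternate right/left around it.
    blocks = np.sort(increments)[::-1]
    n = len(blocks)
    hyeto = np.zeros(n)
    mid = (n - 1) // 2
    for k, blk in enumerate(blocks):
        offset = (k + 1) // 2
        idx = mid + offset if k % 2 == 1 else mid - offset
        hyeto[idx] = blk
    return t, hyeto

t, hyeto = alternating_block_hyetograph(a=1000.0, b=10.0, c=0.7,
                                        duration_min=120, dt_min=10)
print(np.round(hyeto, 2))  # mm per 10-minute block, peaked at the storm centre
```

The resulting hyetograph conserves the total IDF depth while concentrating intensity at the storm centre, which is the property that matters when driving a 2D inundation model with design storms of varying return period.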

  15. Nursing Needs Big Data and Big Data Needs Nursing.

    Science.gov (United States)

    Brennan, Patricia Flatley; Bakken, Suzanne

    2015-09-01

    Contemporary big data initiatives in health care will benefit from greater integration with nursing science and nursing practice; in turn, nursing science and nursing practice have much to gain from data science initiatives. Big data arises secondary to scholarly inquiry (e.g., -omics) and everyday observations like cardiac flow sensors or Twitter feeds. Emerging data science methods ensure that these data are leveraged to improve patient care. Big data encompasses data that exceed human comprehension, that exist at a volume unmanageable by standard computer systems, that arrive at a velocity not under the control of the investigator, and that possess a level of imprecision not found in traditional inquiry. Data science methods are emerging to manage and gain insights from big data. The primary methods included investigation of emerging federal big data initiatives and exploration of exemplars from nursing informatics research to benchmark where nursing is already poised to participate in the big data revolution. We provide observations and reflections on experiences in the emerging big data initiatives. Existing approaches to large-data-set analysis provide a necessary but not sufficient foundation for nursing to participate in the big data revolution. Nursing's Social Policy Statement guides a principled, ethical perspective on big data and data science. There are implications for basic and advanced practice clinical nurses, for the nurse scientist who collaborates with data scientists, and for the nurse data scientist. Big data and data science have the potential to provide greater richness in understanding patient phenomena and in tailoring interventional strategies that are personalized to the patient. © 2015 Sigma Theta Tau International.

  16. August, 2002 - floods events, affected areas revitalisation and prevention for the future in the central Bohemian region, Czech Republic

    Science.gov (United States)

    Bina, L.; Vacha, F.; Vodova, J.

    2003-04-01

    The Central Bohemian Region is shaped like a ring surrounding the capital, Prague. Its total territorial area is 11,014 sq. km and its population is 1,130,000 inhabitants. According to the EU nomenclature of regional statistical units, the Central Bohemian Region is classified as an independent NUTS II. Bohemia's biggest rivers, the Vltava and Labe, form the region's backbone, dividing it along a north-south line; besides these, the Sazava and Berounka, the two big tributaries of the Vltava, flow through the region, and there are also some man-made cascade lakes and two important big dams, Orlik and Slapy, on the Vltava River within the region. Overflowing of these rivers and their feeders, including the failure of flood-protection dams, during the floods of August 2002 caused total or partial destruction of, or damage to, more than 200 towns and villages and total losses of 450 million EUR. The worst impact was on damaged or destroyed human dwellings, social infrastructure (schools, kindergartens, humanitarian facilities) and technical infrastructure (roads, waterworks, power distribution). Businesses were also considerably damaged, including transport terminals in the area of the river ports. Flooding of the Spolana Neratovice chemical works caused critical environmental havoc. The regional crisis staff, with the regional Governor in the lead, worked continuously during the floods, and the regional integrated rescue system was subordinated to it. Due to the huge extent of the floods, the crisis staff coordinated its work with central state bodies, including the Government and the individual 'power' ministries (army, interior, transport). Immediately after the floods a regional management structure was set up, including an executive body for regional revitalisation, which is connected to the state coordinating ministry (the Ministry for Local Development), EU sources and humanitarian aid. In addition to the program of regional revitalisation, additional preventive flood control programs are being developed

  17. Surrogate modeling of joint flood risk across coastal watersheds

    Science.gov (United States)

    Bass, Benjamin; Bedient, Philip

    2018-03-01

    This study discusses the development and performance of a rapid prediction system capable of representing the joint rainfall-runoff and storm surge flood response of tropical cyclones (TCs) for probabilistic risk analysis. Due to the computational demand required for accurately representing storm surge with the high-fidelity ADvanced CIRCulation (ADCIRC) hydrodynamic model and its coupling with additional numerical models to represent rainfall-runoff, a surrogate or statistical model was trained to represent the relationship between hurricane wind- and pressure-field characteristics and their peak joint flood response, typically determined from physics-based numerical models. This builds upon past studies that have only evaluated surrogate models for predicting peak surge, and provides the first system capable of probabilistically representing joint flood levels from TCs. The utility of this joint flood prediction system is then demonstrated by improving upon probabilistic TC flood risk products, which currently account for storm surge but not for TC-associated rainfall-runoff. Results demonstrate the source apportionment of rainfall-runoff versus storm surge and highlight that slight increases in flood risk levels may occur due to the interaction between rainfall-runoff and storm surge as compared to the Federal Emergency Management Agency's (FEMA's) current practices.
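The abstract does not specify which surrogate form the authors trained; a Gaussian process (kriging) regressor is one common choice for emulating peak flood response from storm descriptors. The sketch below is illustrative only: the storm parameters, their ranges, and the synthetic response surface are all invented for the example.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(0)

# Toy training set: storm descriptors -> peak flood level (m).
# Columns: pressure deficit (hPa), radius of max winds (km),
# forward speed (m/s), landfall offset (km). All values synthetic.
X = rng.uniform([20, 20, 2, -50], [90, 80, 10, 50], size=(60, 4))
# Synthetic "truth": stronger, larger, slower storms -> higher peak flood.
y = (0.04 * X[:, 0] + 0.02 * X[:, 1] - 0.05 * X[:, 2]
     - 0.005 * np.abs(X[:, 3]) + rng.normal(0, 0.05, 60))

gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(length_scale=[10.0] * 4),
                              normalize_y=True).fit(X, y)

# Query one hypothetical storm; the GP returns a mean and an uncertainty,
# which is what makes the surrogate usable for probabilistic risk analysis.
mean, std = gp.predict([[60.0, 50.0, 5.0, 0.0]], return_std=True)
print(mean[0], std[0])
```

In a real application, `X` and `y` would come from a designed set of coupled ADCIRC and rainfall-runoff simulations, and the predictive distribution would feed the probabilistic risk products.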

  18. BIG Data - BIG Gains? Understanding the Link Between Big Data Analytics and Innovation

    OpenAIRE

    Niebel, Thomas; Rasel, Fabienne; Viete, Steffen

    2017-01-01

    This paper analyzes the relationship between firms’ use of big data analytics and their innovative performance for product innovations. Since big data technologies provide new data information practices, they create new decision-making possibilities, which firms can use to realize innovations. Applying German firm-level data we find suggestive evidence that big data analytics matters for the likelihood of becoming a product innovator as well as the market success of the firms’ product innovat...

  19. Probabilistic, meso-scale flood loss modelling

    Science.gov (United States)

    Kreibich, Heidi; Botto, Anna; Schröter, Kai; Merz, Bruno

    2016-04-01

    Flood risk analyses are an important basis for decisions on flood risk management and adaptation. However, such analyses are associated with significant uncertainty, even more so if changes in risk due to global change are expected. Although uncertainty analysis and probabilistic approaches have received increased attention in recent years, they are still not standard practice for flood risk assessments, let alone for flood loss modelling. The state of the art in flood loss modelling is still the use of simple, deterministic approaches such as stage-damage functions. Novel probabilistic, multi-variate flood loss models have been developed and validated on the micro-scale using a data-mining approach, namely bagging decision trees (Merz et al. 2013). In this presentation we demonstrate and evaluate the upscaling of the approach to the meso-scale, namely on the basis of land-use units. The model is applied in 19 municipalities which were affected during the 2002 flood by the River Mulde in Saxony, Germany (Botto et al. submitted). The application of bagging decision tree based loss models provides a probability distribution of estimated loss per municipality. Validation is undertaken, on the one hand, via a comparison with eight deterministic loss models, including stage-damage functions as well as multi-variate models; on the other hand, the results are compared with official loss data provided by the Saxon Relief Bank (SAB). The results show that uncertainties of loss estimation remain high. Thus, the significant advantage of this probabilistic flood loss estimation approach is that it inherently provides quantitative information about the uncertainty of the prediction. References: Merz, B.; Kreibich, H.; Lall, U. (2013): Multi-variate flood damage assessment: a tree-based data-mining approach. NHESS, 13(1), 53-64. Botto A, Kreibich H, Merz B, Schröter K (submitted) Probabilistic, multi-variable flood loss modelling on the meso-scale with BT-FLEMO. Risk Analysis.
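Bagging decision trees, the approach named in the abstract, yield a loss distribution simply by reading each bootstrap tree's prediction separately rather than only their average. A minimal scikit-learn sketch with entirely synthetic predictors and losses (not the BT-FLEMO model itself):

```python
import numpy as np
from sklearn.ensemble import BaggingRegressor

rng = np.random.default_rng(1)

# Synthetic training data: predictors (water depth in m, building area in m2,
# precaution indicator) -> relative loss in [0, 1]. All values illustrative.
X = np.column_stack([rng.uniform(0, 3, 300),
                     rng.uniform(50, 400, 300),
                     rng.integers(0, 2, 300)])
y = np.clip(0.2 * X[:, 0] - 0.1 * X[:, 2] + rng.normal(0, 0.03, 300), 0, 1)

# BaggingRegressor's default base estimator is a decision tree.
model = BaggingRegressor(n_estimators=100, random_state=1).fit(X, y)

case = np.array([[1.5, 120.0, 0.0]])
# One prediction per bootstrap tree: an empirical distribution of the loss
# estimate instead of a single deterministic value.
per_tree = np.array([tree.predict(case)[0] for tree in model.estimators_])
print(per_tree.mean(), np.quantile(per_tree, [0.05, 0.95]))
```

The spread of `per_tree` is what provides the quantitative uncertainty information the abstract highlights.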

  20. Networking for big data

    CERN Document Server

    Yu, Shui; Misic, Jelena; Shen, Xuemin (Sherman)

    2015-01-01

    Networking for Big Data supplies an unprecedented look at cutting-edge research on the networking and communication aspects of Big Data. Starting with a comprehensive introduction to Big Data and its networking issues, it offers deep technical coverage of both theory and applications.The book is divided into four sections: introduction to Big Data, networking theory and design for Big Data, networking security for Big Data, and platforms and systems for Big Data applications. Focusing on key networking issues in Big Data, the book explains network design and implementation for Big Data. It exa

  1. Flood Catastrophe Model for Designing Optimal Flood Insurance Program: Estimating Location-Specific Premiums in the Netherlands.

    Science.gov (United States)

    Ermolieva, T; Filatova, T; Ermoliev, Y; Obersteiner, M; de Bruijn, K M; Jeuken, A

    2017-01-01

    As flood risks grow worldwide, a well-designed insurance program engaging various stakeholders becomes a vital instrument in flood risk management. The main challenge concerns the applicability of standard approaches for calculating insurance premiums of rare catastrophic losses. This article focuses on the design of a flood-loss-sharing program involving private insurance based on location-specific exposures. The analysis is guided by a developed integrated catastrophe risk management (ICRM) model consisting of a GIS-based flood model and a stochastic optimization procedure with respect to location-specific risk exposures. To achieve the stability and robustness of the program towards floods of various recurrences, the ICRM uses a stochastic optimization procedure that relies on quantile-related risk functions of systemic insolvency involving overpayments and underpayments of the stakeholders. Two alternative ways of calculating insurance premiums are compared: the robust premiums derived with the ICRM and the traditional average annual loss approach. The applicability of the proposed model is illustrated in a case study of a Rotterdam area outside the main flood protection system in the Netherlands. Our numerical experiments demonstrate essential advantages of the robust premiums, namely, that they: (1) guarantee the program's solvency under all relevant flood scenarios rather than one average event; (2) establish a tradeoff between the security of the program and the welfare of locations; and (3) decrease the need for other risk transfer and risk reduction measures. © 2016 Society for Risk Analysis.
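The contrast between average-annual-loss pricing and quantile-based "robust" pricing can be sketched with a toy Monte Carlo portfolio. This is a much-simplified stand-in for the ICRM's stochastic optimization: the loss process, the 99% solvency target, and the uniform per-location premium are all assumptions of the example.

```python
import numpy as np

rng = np.random.default_rng(2)
n_loc, n_years = 200, 10_000

# Synthetic annual flood losses for a portfolio of locations: a 2% chance
# of an event per location-year, with heavy-tailed severities. Illustrative only.
event = rng.random((n_years, n_loc)) < 0.02
severity = rng.lognormal(mean=12.0, sigma=1.0, size=(n_years, n_loc))
loss = np.where(event, severity, 0.0)

aggregate = loss.sum(axis=1)               # portfolio loss per simulated year
aal_premium = loss.mean()                  # average-annual-loss pricing
# Quantile-based pricing: per-location premium sized so that collected
# premiums cover the aggregate loss in 99% of simulated years (solvency).
robust_premium = np.quantile(aggregate, 0.99) / n_loc

print(f"AAL premium: {aal_premium:,.0f}  robust premium: {robust_premium:,.0f}")
```

Because the aggregate loss is right-skewed, the quantile-based premium exceeds the AAL premium, mirroring the solvency guarantee of the robust approach.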

  2. Survey of Cyber Crime in Big Data

    Science.gov (United States)

    Rajeswari, C.; Soni, Krishna; Tandon, Rajat

    2017-11-01

    Big data involves performing computation and database operations over very large volumes of data drawn automatically from the data owner's business. Since a critical strategic appeal of big data is access to information from numerous and varied domains, security and privacy will play an essential part in big data research and innovation. The limits of standard IT security practices are well known; in particular, the risk that malicious software is incorporated into applications and operating systems through the software supply chain is a serious and growing threat that is difficult to counter, and its impact propagates even faster with big data. A central question, therefore, is whether current security and privacy technology is sufficient to provide controlled assurance for very large numbers of direct accesses, since effective use of large-scale information requires authorized access to data both within a domain and across domains. For many years, trustworthy system development has produced a rich set of proven security concepts designed to deal with determined adversaries, but this work has largely been dismissed by vendors as "needless excess". In this survey, the essential considerations that would allow big data to take advantage of this mature security and privacy technology are examined, and the remaining research challenges are explored.

  3. iFLOOD: A Real Time Flood Forecast System for Total Water Modeling in the National Capital Region

    Science.gov (United States)

    Sumi, S. J.; Ferreira, C.

    2017-12-01

    Extreme flood events are the costliest natural hazards impacting the US and frequently cause extensive damage to infrastructure, disruption to the economy and loss of lives. In 2016, Hurricane Matthew brought severe damage to South Carolina and demonstrated the importance of accurate flood hazard predictions, which requires the integration of riverine and coastal model forecasts for total water prediction in coastal and tidal areas. The National Weather Service (NWS) and the National Ocean Service (NOS) provide flood forecasts for almost the entire US; still, there are service-gap areas in tidal regions where no official flood forecast is available. The National Capital Region is vulnerable to multi-flood hazards, including high flows from annual inland precipitation events and surge-driven coastal inundation along the tidal Potomac River. Predicting flood levels in such tidal areas in the river-estuarine zone is extremely challenging. The main objective of this study is to develop the next generation of flood forecast systems capable of providing accurate and timely information to support emergency management and response in areas impacted by multi-flood hazards. This forecast system is capable of simulating flood levels in the Potomac and Anacostia River, incorporating the effects of riverine flooding from the upstream basins, urban storm water and tidal oscillations from the Chesapeake Bay. Flood forecast models developed so far have used riverine data to simulate water levels for the Potomac River. Therefore, the idea is to use forecasted storm surge data from a coastal model as a boundary condition for this system. The final output of this validated model will capture water behavior in the river-estuary transition zone far better than a model driven by riverine data only. The challenge for this iFLOOD forecast system is to understand the complex dynamics of multi-flood hazards caused by storm surges, riverine flow, tidal oscillation and urban storm water. Automated system

  4. Application of Flood Nomograph for Flood Forecasting in Urban Areas

    Directory of Open Access Journals (Sweden)

    Eui Hoon Lee

    2018-01-01

    Full Text Available Imperviousness has increased due to urbanization, as has the frequency of extreme rainfall events due to climate change. Various countermeasures, both structural and nonstructural, are required to prepare for these effects. Flood forecasting is a representative nonstructural measure. Flood forecasting techniques have been developed for the prevention of repetitive flood damage in urban areas. Some flood forecasting techniques that rely on training processes are difficult to apply because retraining is needed for every use. Other flood forecasting techniques that use rainfall data predicted by radar are not appropriate for small areas, such as single drainage basins. In this study, a new flood forecasting technique is suggested to reduce flood damage in urban areas. The flood nomograph consists of the first flooding nodes in rainfall runoff simulations with synthetic rainfall data at each duration. When selecting the first flooding node, the initial amount of synthetic rainfall is 1 mm, which increases in 1 mm increments until flooding occurs. The advantage of this flood forecasting technique is its simple application using real-time rainfall data. This technique can be used to prepare a preemptive response in the process of urban flood management.
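The nomograph construction described above amounts to a simple search: for each duration, raise the synthetic rainfall depth in 1 mm steps and record the depth at which the first node floods. In the sketch below, a toy rational-method check stands in for the full rainfall-runoff simulation (e.g. SWMM), and all catchment parameters are invented:

```python
def node_floods(rain_mm: float, duration_hr: float) -> bool:
    """Toy check standing in for a full rainfall-runoff simulation:
    the node floods when peak inflow exceeds the conduit capacity.
    All parameters are invented for illustration."""
    runoff_coeff, area_km2, capacity_m3s = 0.7, 1.2, 4.0
    intensity = rain_mm / duration_hr                          # mm/h
    peak_inflow = 0.278 * runoff_coeff * intensity * area_km2  # rational method
    return peak_inflow > capacity_m3s

def nomograph_point(duration_hr: float, max_mm: int = 500) -> int:
    """Smallest synthetic rainfall depth (1 mm steps) that floods the node."""
    for rain_mm in range(1, max_mm + 1):
        if node_floods(rain_mm, duration_hr):
            return rain_mm
    raise ValueError("no flooding below max_mm")

# One nomograph value per storm duration.
print({d: nomograph_point(d) for d in (1.0, 2.0, 3.0)})  # → {1.0: 18, 2.0: 35, 3.0: 52}
```

In operation, an observed or forecast rainfall depth for a given duration is simply compared against the precomputed threshold, which is what makes the technique usable in real time.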

  5. Flood-Fighting Structures Demonstration and Evaluation Program: Laboratory and Field Testing in Vicksburg, Mississippi

    Science.gov (United States)

    2007-07-01

    then it should be disposed of by recycling or land-filling. This material should not be burned due to the formation of carbon dioxide and carbon...

  6. Characterization of remarkable floods in France, a transdisciplinary approach applied on generalized floods of January 1910

    Science.gov (United States)

    Boudou, Martin; Lang, Michel; Vinet, Freddy; Coeur, Denis

    2014-05-01

    The January 1910 flood is one of these remarkable floods. The event is best known for its consequences in the Seine basin, where it remains the largest flood recorded in Paris since 1658. However, its impacts also extended to the eastern regions of France (Martin, 2001). To demonstrate the value of the evaluation grid, we propose an in-depth analysis of the 1910 river flood incorporating historical documentation. The approach focuses on eastern France, where the flood remains the highest recorded for several rivers but has often been neglected by scientists in favour of the Paris flood. Through transdisciplinary research based on the evaluation grid method, we describe the January 1910 flood event and establish why it can be considered a remarkable flood for these regions.

  7. Burgernomics: a Big Mac™ guide to purchasing power parity

    OpenAIRE

    Michael R. Pakko; Patricia S. Pollard

    2003-01-01

    The theory of purchasing power parity (PPP) has long been a staple of international economic analysis. Recent years have seen the rise in popularity of a tongue-in-cheek, fast-food version of PPP: The Big Mac™ index. In this article, Michael Pakko and Patricia Pollard describe how comparisons of Big Mac prices around the world contain the ingredients necessary to demonstrate the fundamental principles of PPP. They show that the Big Mac index does nearly as well as more comprehensive measures ...
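The index's arithmetic is simple: the implied PPP exchange rate is the ratio of the local to the U.S. Big Mac price, and comparing it with the market exchange rate gives the currency's over- or undervaluation. The prices and rates below are illustrative, not actual survey data:

```python
us_price = 5.15                          # illustrative Big Mac price in USD
local = {"euro area": (5.39, 0.93),      # (local price, market rate per USD)
         "japan": (450.0, 150.0)}        # illustrative values, not survey data

valuation = {}
for name, (price, rate) in local.items():
    implied_ppp = price / us_price       # local currency per USD implied by PPP
    valuation[name] = implied_ppp / rate - 1.0  # + overvalued, - undervalued
    print(f"{name}: implied PPP {implied_ppp:.2f}, {valuation[name]:+.1%} vs USD")
```

With these example numbers the euro comes out overvalued and the yen undervalued against the dollar, which is exactly the kind of comparison the index is used to demonstrate.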

  8. Big Argumentation?

    Directory of Open Access Journals (Sweden)

    Daniel Faltesek

    2013-08-01

    Full Text Available Big Data is nothing new. Public concern regarding the mass diffusion of data has appeared repeatedly with computing innovations; in the formation before Big Data, the phenomenon was most recently referred to as the "information explosion." In this essay, I argue that the appeal of Big Data is not a function of computational power, but of a synergistic relationship between aesthetic order and a politics evacuated of meaningful public deliberation. Understanding, and challenging, Big Data requires attention to the aesthetics of data visualization and the ways in which those aesthetics would seem to depoliticize information. The conclusion proposes an alternative argumentative aesthetic as the appropriate response to the depoliticization posed by the popular imaginary of Big Data.

  9. Aligning Natural Resource Conservation and Flood Hazard Mitigation in California.

    Science.gov (United States)

    Calil, Juliano; Beck, Michael W; Gleason, Mary; Merrifield, Matthew; Klausmeyer, Kirk; Newkirk, Sarah

    2015-01-01

    Flooding is the most common and damaging of all natural disasters in the United States, and was a factor in almost all declared disasters in the U.S. Direct flood losses in the U.S. in 2011 totaled $8.41 billion, and flood damage has also been on the rise globally over the past century. The National Flood Insurance Program has paid out more than $38 billion in claims since its inception in 1968, more than a third of which has gone to the one percent of policies that experienced multiple losses and are classified as "repetitive loss." During the same period, the loss of coastal wetlands and other natural habitat has continued, and funds for conservation and restoration of these habitats are very limited. This study demonstrates that flood losses could be mitigated through action that meets both flood risk reduction and conservation objectives. We found that there are at least 11,243 km2 of land in coastal California which is both flood-prone and has natural resource conservation value, and where a property/structure buyout and habitat restoration project could meet multiple objectives. For example, our results show that in Sonoma County, the extent of land that meets these criteria is 564 km2. Further, we explore flood mitigation grant programs that can be a significant source of funds for such projects. We demonstrate that government-funded buyouts followed by restoration of targeted lands can support social, environmental, and economic objectives: reduction of flood exposure, restoration of natural resources, and efficient use of limited governmental funds.
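The core spatial operation behind figures like the 11,243 km2 estimate is an intersection of two criteria layers: flood-prone land and land with conservation value. A minimal raster sketch with NumPy, using random toy layers and an assumed cell size (in practice, both layers would come from GIS data):

```python
import numpy as np

rng = np.random.default_rng(3)
cell_km2 = 0.09  # e.g. 300 m x 300 m grid cells (assumed for illustration)

# Toy boolean rasters over the same grid: flood-prone cells and cells
# with natural resource conservation value.
flood_prone = rng.random((500, 500)) < 0.30
conservation = rng.random((500, 500)) < 0.25

# Candidate buyout/restoration cells satisfy both criteria at once.
candidates = flood_prone & conservation
area_km2 = candidates.sum() * cell_km2
print(f"{area_km2:.0f} km2 meet both objectives")
```

Real analyses add further filters (parcel boundaries, repetitive-loss status, grant eligibility), but the multi-objective screening reduces to this boolean overlay.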

  10. Aligning Natural Resource Conservation and Flood Hazard Mitigation in California.

    Directory of Open Access Journals (Sweden)

    Juliano Calil

    Full Text Available Flooding is the most common and damaging of all natural disasters in the United States, and was a factor in almost all declared disasters in the U.S. Direct flood losses in the U.S. in 2011 totaled $8.41 billion, and flood damage has also been on the rise globally over the past century. The National Flood Insurance Program has paid out more than $38 billion in claims since its inception in 1968, more than a third of which has gone to the one percent of policies that experienced multiple losses and are classified as "repetitive loss." During the same period, the loss of coastal wetlands and other natural habitat has continued, and funds for conservation and restoration of these habitats are very limited. This study demonstrates that flood losses could be mitigated through action that meets both flood risk reduction and conservation objectives. We found that there are at least 11,243 km2 of land in coastal California which is both flood-prone and has natural resource conservation value, and where a property/structure buyout and habitat restoration project could meet multiple objectives. For example, our results show that in Sonoma County, the extent of land that meets these criteria is 564 km2. Further, we explore flood mitigation grant programs that can be a significant source of funds for such projects. We demonstrate that government-funded buyouts followed by restoration of targeted lands can support social, environmental, and economic objectives: reduction of flood exposure, restoration of natural resources, and efficient use of limited governmental funds.

  11. Motivation for helping behavior after the floods in 2014: The role of personality and national identity

    Directory of Open Access Journals (Sweden)

    Otašević Biljana

    2014-01-01

    Full Text Available The severe floods which struck Serbia in May 2014 inflicted enormous material damage and forced tens of thousands of people to leave their homes. The goal of this research was to investigate the motivation of citizens who helped, i.e. to establish the predictors of the motivation for helping behavior as well as its effects on different ways of providing help. The Volunteer Functions Inventory, National Identity Scale, Big Five Inventory, and Helping Behavior Questionnaire were administered to a sample of 183 people who provided the flood victims with help. A multivariate analysis of covariance, in which the predictors were gender, prior acquaintance with the victims, the six volunteer functions (Social, Understanding, Protective, Enhancement, Values, Career), age, education and financial status, and the criteria were the dimensions of helping behavior, shows that the model provides a statistically significant explanation of all dimensions of helping behavior, with the Social and Understanding motives having significant multivariate effects. MANCOVA was used for establishing the predictors of helping motivation. Along with the control variables from the first analysis, the predictors were the Big Five personality traits and national identity, and the criteria were the six volunteer functions. The results showed significant multivariate effects of Openness, Agreeableness, Extraversion and national identity. Practical and theoretical implications are discussed.

  12. Catchment scale multi-objective flood management

    Science.gov (United States)

    Rose, Steve; Worrall, Peter; Rosolova, Zdenka; Hammond, Gene

    2010-05-01

    Rural land management is known to affect both the generation and propagation of flooding at the local scale, but there is still a general lack of good evidence that this impact remains significant at the larger catchment scale, given the complexity of physical interactions and climatic variability at this level. The National Trust, in partnership with the Environment Agency, is managing an innovative project on the Holnicote Estate in south-west England to demonstrate the benefits of using good rural land management practices to reduce flood risk at both the catchment and sub-catchment scales. The Holnicote Estate is owned by the National Trust and comprises about 5,000 hectares of land, from the uplands of Exmoor to the sea, incorporating most of the catchments of the river Horner and Aller Water. There are nearly 100 houses across three villages that are at risk from flooding and could potentially benefit from changes in land management practices in the surrounding catchment, providing a more sustainable flood attenuation function. In addition to the contribution being made to flood risk management, there is a range of other ecosystem services that will be enhanced through these targeted land management changes. Alterations in land management will create new opportunities for wildlife and habitats and help to improve local surface water quality. Such improvements will not only create additional wildlife resources locally but also support the landscape's response to climate change by creating and enhancing wildlife networks within the region. Land management changes will also restore and sustain landscape heritage resources and provide opportunities for amenity, recreation and tourism. The project delivery team is working with the National Trust from source to sea across the entire Holnicote Estate, to identify and subsequently implement suitable land management techniques to manage local flood risk within the catchments. These

  13. Big domains are novel Ca²+-binding modules: evidence from big domains of Leptospira immunoglobulin-like (Lig) proteins.

    Directory of Open Access Journals (Sweden)

    Rajeev Raman

    Full Text Available BACKGROUND: Many bacterial surface-exposed proteins mediate the host-pathogen interaction more effectively in the presence of Ca²+. The leptospiral immunoglobulin-like (Lig) proteins LigA and LigB are surface-exposed proteins containing bacterial immunoglobulin-like (Big) domains. The function of proteins containing the Big fold is not known. Based on the possible similarities of the immunoglobulin and βγ-crystallin folds, we here explore the important question of whether Ca²+ binds to Big domains, which would provide a novel functional role for proteins containing the Big fold. PRINCIPAL FINDINGS: We selected six individual Big domains for this study (three from the conserved part of LigA and LigB, denoted Lig A3, Lig A4, and LigBCon5; two from the variable region of LigA, i.e., the 9th (Lig A9) and 10th (Lig A10) repeats; and one from the variable region of LigB, i.e., LigBCen2). We have also studied the conserved region covering the three and six repeats (LigBCon1-3 and LigCon). All these proteins bind the calcium-mimic dye Stains-all. All the selected four domains bind Ca²+ with dissociation constants of 2-4 µM. The Lig A9 and Lig A10 domains fold well with moderate thermal stability, have β-sheet conformation and form homodimers. Fluorescence spectra of the Big domains show a specific doublet (at 317 and 330 nm), probably due to a Trp interaction with a Phe residue. Equilibrium unfolding of the selected Big domains is similar and follows a two-state model, suggesting similarity in their fold. CONCLUSIONS: We demonstrate that the Lig proteins are Ca²+-binding proteins, with the Big domains harbouring the binding motif. We conclude that despite differences in sequence, a Big motif binds Ca²+. This work thus sets up a strong possibility for classifying proteins containing Big domains as a novel family of Ca²+-binding proteins. Since the Big domain is part of many proteins in the bacterial kingdom, we suggest a possible function for these proteins via Ca²+ binding.

  14. Big data analytics to improve cardiovascular care: promise and challenges.

    Science.gov (United States)

    Rumsfeld, John S; Joynt, Karen E; Maddox, Thomas M

    2016-06-01

    The potential for big data analytics to improve cardiovascular quality of care and patient outcomes is tremendous. However, the application of big data in health care is at a nascent stage, and the evidence to date demonstrating that big data analytics will improve care and outcomes is scant. This Review provides an overview of the data sources and methods that comprise big data analytics, and describes eight areas of application of big data analytics to improve cardiovascular care, including predictive modelling for risk and resource use, population management, drug and medical device safety surveillance, disease and treatment heterogeneity, precision medicine and clinical decision support, quality of care and performance measurement, and public health and research applications. We also delineate the important challenges for big data applications in cardiovascular care, including the need for evidence of effectiveness and safety, the methodological issues such as data quality and validation, and the critical importance of clinical integration and proof of clinical utility. If big data analytics are shown to improve quality of care and patient outcomes, and can be successfully implemented in cardiovascular practice, big data will fulfil its potential as an important component of a learning health-care system.

  15. Characterisation of seasonal flood types according to timescales in mixed probability distributions

    Science.gov (United States)

    Fischer, Svenja; Schumann, Andreas; Schulte, Markus

    2016-08-01

    When flood statistics are based on annual maximum series (AMS), the sample often contains flood peaks which differ in their genesis. If the ratios among event types change over the range of observations, the extrapolation of a probability distribution function (pdf) can be dominated by a majority of events that belong to a certain flood type. If this type is not typical for extraordinarily large extremes, such an extrapolation of the pdf is misleading. To avoid this breach of the assumption of homogeneity, seasonal models were developed that distinguish between winter and summer floods. We show that a distinction between summer and winter floods is not always sufficient if seasonal series include events with different geneses. Here, we differentiate floods by their timescales into groups of long and short events. A statistical method for such a distinction of events is presented. To demonstrate their applicability, timescales for winter and summer floods in a German river basin were estimated. It is shown that summer floods can be separated into two main groups, but in our study region, the sample of winter floods consists of at least three different flood types. The pdfs of the two groups of summer floods are combined via a new mixing model. This model accounts for the fact that information about parallel events based only on their maximum values is incomplete, because some of the realisations are overlaid. A statistical method resulting in an amendment of statistical parameters is proposed. The application in a German case study demonstrates the advantages of the new model, with specific emphasis on flood types.
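One classical building block for such mixed models: if the flood types are independent, the annual maximum stays below a level x only when the maxima of all types do, so the annual CDF is the product of the per-type CDFs. The sketch below uses invented Gumbel parameters for two event types; it illustrates this product rule, not the authors' amended-parameter mixing model:

```python
import numpy as np
from scipy import stats

# Invented Gumbel distributions for the seasonal maxima of two flood types,
# e.g. short convective events vs long advective events.
short_events = stats.gumbel_r(loc=100, scale=20)   # discharge in m3/s
long_events = stats.gumbel_r(loc=140, scale=35)

x = np.linspace(0.0, 600.0, 1201)
# For independent types, the annual maximum is below x only if both
# seasonal maxima are, so the annual CDF is the product of the type CDFs.
F_annual = short_events.cdf(x) * long_events.cdf(x)

# 100-year level from the mixed model vs from the dominant type alone.
q100_mixed = x[np.searchsorted(F_annual, 0.99)]
q100_long = long_events.ppf(0.99)
print(q100_mixed, q100_long)  # the mixed quantile is slightly higher
```

The gap between the two quantiles is exactly the extrapolation error one makes by fitting a single pdf to the dominant flood type alone.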

  16. Mapping flood and flooding potential indices: a methodological approach to identifying areas susceptible to flood and flooding risk. Case study: the Prahova catchment (Romania)

    Science.gov (United States)

    Zaharia, Liliana; Costache, Romulus; Prăvălie, Remus; Ioana-Toroimac, Gabriela

    2017-04-01

    Given that floods continue to cause significant worldwide human and material damage every year, flood risk mitigation is a key issue and a permanent challenge in developing policies and strategies at various spatial scales. A basic phase is therefore the elaboration of hazard and flood risk maps, documents which are an essential support for flood risk management. The aim of this paper is to develop an approach that allows for the identification of flash-flood- and flood-prone susceptible areas based on the computation and mapping of two indices: the FFPI (Flash-Flood Potential Index) and the FPI (Flooding Potential Index). These indices are obtained by integrating in a GIS environment several geographical variables which control runoff (in the case of the FFPI) and favour flooding (in the case of the FPI). The methodology was applied in the upper (mountainous) and middle (hilly) catchment of the Prahova River, a densely populated and socioeconomically well-developed area which has been affected repeatedly by water-related hazards over the past decades. The resulting maps showing the spatialization of the FFPI and FPI allow for the identification of areas with high susceptibility to flash-floods and flooding. This approach can provide useful mapped information, especially for areas (generally large) where there are no flood/hazard risk maps. Moreover, the FFPI and FPI maps can constitute a preliminary step for flood risk and vulnerability assessment.
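Indices such as the FFPI are typically computed as a weighted overlay of normalized factor rasters in a GIS. A minimal NumPy sketch with invented factor layers and assumed weights (the paper's actual variables and weighting scheme may differ):

```python
import numpy as np

rng = np.random.default_rng(4)
shape = (200, 200)

# Toy normalized factor rasters (0 = low runoff potential, 1 = high):
# in practice these would be slope, land use, soil, and drainage layers
# derived from GIS data and rescaled to a common range.
factors = {name: rng.random(shape) for name in
           ("slope", "land_use", "soil", "drainage_density")}
weights = {"slope": 0.35, "land_use": 0.25, "soil": 0.25,
           "drainage_density": 0.15}  # illustrative weights, summing to 1

# Weighted overlay: cell-by-cell weighted sum of the factor rasters.
ffpi = sum(weights[name] * factors[name] for name in factors)

# Classify the top decile as highly susceptible to flash-floods.
high_risk = ffpi > np.quantile(ffpi, 0.90)
print(ffpi.min(), ffpi.max(), high_risk.mean())
```

The same pattern applies to the FPI, only with variables that favour flooding (e.g. low-lying terrain, proximity to channels) instead of those that control runoff.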

  17. The Global Flood Model

    Science.gov (United States)

    Williams, P.; Huddelston, M.; Michel, G.; Thompson, S.; Heynert, K.; Pickering, C.; Abbott Donnelly, I.; Fewtrell, T.; Galy, H.; Sperna Weiland, F.; Winsemius, H.; Weerts, A.; Nixon, S.; Davies, P.; Schiferli, D.

    2012-04-01

    Recently, a Global Flood Model (GFM) initiative has been proposed by Willis, UK Met Office, Esri, Deltares and IBM. The idea is to create a global community platform that enables better understanding of the complexities of flood risk assessment to better support the decisions, education and communication needed to mitigate flood risk. The GFM will provide tools for assessing the risk of floods, for devising mitigation strategies such as land-use changes and infrastructure improvements, and for enabling effective pre- and post-flood event response. The GFM combines humanitarian and commercial motives. It will benefit: - The public, seeking to preserve personal safety and property; - State and local governments, seeking to safeguard economic activity, and improve resilience; - NGOs, similarly seeking to respond proactively to flood events; - The insurance sector, seeking to understand and price flood risk; - Large corporations, seeking to protect global operations and supply chains. The GFM is an integrated and transparent set of modules, each composed of models and data. For each module, there are two core elements: a live "reference version" (a worked example) and a framework of specifications, which will allow development of alternative versions. In the future, users will be able to work with the reference version or substitute their own models and data. If these meet the specification for the relevant module, they will interoperate with the rest of the GFM. Some "crowd-sourced" modules could even be accredited and published to the wider GFM community. Our intent is to build on existing public, private and academic work, improve local adoption, and stimulate the development of multiple - but compatible - alternatives, so strengthening mankind's ability to manage flood impacts. The GFM is being developed and managed by a non-profit organization created for the purpose. 
The business model will be inspired by open-source software (e.g., Linux): - for non-profit usage

  18. Influence of Flood Detention Capability in Flood Prevention for Flood Disaster of Depression Area

    OpenAIRE

    Chia Lin Chan; Yi Ju Yang; Chih Chin Yang

    2011-01-01

    Rainfall records from a rainfall station, including the hourly rainfall potential and the rainfall mass of five heavy storms from 2001 to 2010, are explored. The rational formula is used to investigate the capability of the flood detention pond over the flood peak duration under different rainfall conditions. A stable flood detention model, based on system dynamic control theory, is also proposed to characterize the flood detention pond in this research. When rainfall freque...

  19. Big data analysis new algorithms for a new society

    CERN Document Server

    Stefanowski, Jerzy

    2016-01-01

    This edited volume is devoted to Big Data Analysis from a Machine Learning standpoint as presented by some of the most eminent researchers in this area. It demonstrates that Big Data Analysis opens up new research problems which were either never considered before, or were only considered within a limited range. In addition to providing methodological discussions on the principles of mining Big Data and the difference between traditional statistical data analysis and newer computing frameworks, this book presents recently developed algorithms affecting such areas as business, financial forecasting, human mobility, the Internet of Things, information networks, bioinformatics, medical systems and life science. It explores, through a number of specific examples, how the study of Big Data Analysis has evolved and how it has started and will most likely continue to affect society. While the benefits brought upon by Big Data Analysis are underlined, the book also discusses some of the warnings that have been issued...

  20. Big data

    DEFF Research Database (Denmark)

    Madsen, Anders Koed; Flyverbom, Mikkel; Hilbert, Martin

    2016-01-01

    The claim that big data can revolutionize strategy and governance in the context of international relations is increasingly hard to ignore. Scholars of international political sociology have mainly discussed this development through the themes of security and surveillance. The aim of this paper is to outline a research agenda that can be used to raise a broader set of sociological and practice-oriented questions about the increasing datafication of international relations and politics. First, it proposes a way of conceptualizing big data that is broad enough to open fruitful investigations into the emerging use of big data in these contexts. This conceptualization includes the identification of three moments contained in any big data practice. Second, it suggests a research agenda built around a set of subthemes that each deserve dedicated scrutiny when studying the interplay between big data...

  1. Effects of Flood Control Strategies on Flood Resilience Under Sociohydrological Disturbances

    Science.gov (United States)

    Sung, Kyungmin; Jeong, Hanseok; Sangwan, Nikhil; Yu, David J.

    2018-04-01

    A community capacity to cope with flood hazards, or community flood resilience, emerges from the interplay of hydrological and social processes. This interplay can be significantly influenced by the flood control strategy adopted by a society, i.e., how a society sets its desired flood protection level and strives to achieve this goal. And this interplay can be further complicated by rising land-sea level differences, seasonal water level fluctuations, and economic change. But not much research has been done on how various forms of flood control strategies affect human-flood interactions under these disturbances and therefore flood resilience in the long run. The current study is an effort to address these issues by developing a conceptual model of human-flood interaction mediated by flood control strategies. Our model extends the existing model of Yu et al. (2017), who investigated the flood resilience of a community-based flood protection system in coastal Bangladesh. The major extensions made in this study are inclusions of various forms of flood control strategies (both adaptive and nonadaptive ones), the challenge of rising land-sea level differences, and various high tide level scenarios generated from modifying the statistical variances and averages. Our results show that adaptive forms of flood control strategies tend to outperform nonadaptive ones for maintaining the model community's flood protection system. Adaptive strategies that dynamically adjust target flood protection levels through close monitoring of flood damages and social memories of flood risk can help the model community deal with various disturbances.

  2. Public perception of flood risks, flood forecasting and mitigation

    Directory of Open Access Journals (Sweden)

    M. Brilly

    2005-01-01

    Full Text Available A multidisciplinary and integrated approach to the flood mitigation decision-making process should provide the best response of society in a flood hazard situation, including preparation works and post-hazard mitigation. In Slovenia, there is a great lack of data on social aspects and public response to flood mitigation measures and information management. In this paper, two studies of flood perception in the Slovenian town of Celje are presented. During its history, Celje was often exposed to floods, the most recent serious floods being in 1990 and in 1998, with a hundred-and-fifty-year return period and a more-than-ten-year return period, respectively. Two surveys were conducted in 1997 and 2003, with 157 participants from different areas of the town in the first and 208 in the second study, aiming at finding the general attitude toward the floods. The surveys revealed that floods present a serious threat in the eyes of the inhabitants, and that the perception of threat depends, to a certain degree, on the place of residence. The surveys also highlighted, among the other measures, solidarity and the importance of insurance against floods.

  3. Reactor safety under design basis flood condition for inland sites

    International Nuclear Information System (INIS)

    Hajela, S.; Bajaj, S.S.; Samota, A.; Verma, U.S.P.; Warudkar, A.S.

    2002-01-01

    Full text: In June 1994, there was an incident of flooding at Kakrapar Atomic Power Station (KAPS) due to a combination of heavy rains and mechanical failure in the operation of gates at the adjoining weir. An in-depth review of the incident was carried out, and a number of flood protection measures were recommended and implemented at the site. As part of this review, a safety analysis was also done to demonstrate reactor safety with a series of failures considered in the flood protection features. For each inland NPP site, as part of design, different flood scenarios are analysed to arrive at the design basis flood (DBF) level. This level is estimated based on the worst combination of heavy local precipitation, flooding in the river, and failure of upstream/downstream water control structures

  4. Unstructured mesh adaptivity for urban flooding modelling

    Science.gov (United States)

    Hu, R.; Fang, F.; Salinas, P.; Pain, C. C.

    2018-05-01

    Over the past few decades, urban floods have been gaining more attention due to their increase in frequency. To provide reliable flooding predictions in urban areas, various numerical models have been developed to perform high-resolution flood simulations. However, the use of high-resolution meshes across the whole computational domain causes a high computational burden. In this paper, a 2D control-volume and finite-element flood model using adaptive unstructured mesh technology has been developed. This adaptive unstructured mesh technique enables meshes to be adapted optimally in time and space in response to the evolving flow features, thus providing sufficient mesh resolution where and when it is required. It has the advantage of capturing the details of local flows and the wetting and drying front while reducing the computational cost. Complex topographic features are represented accurately during the flooding process. For example, high-resolution meshes are placed around buildings and steep regions when the flood water reaches them. In this work, a flooding event that happened in 2002 in Glasgow, Scotland, United Kingdom has been simulated to demonstrate the capability of the adaptive unstructured mesh flooding model. The simulations have been performed using both fixed and adaptive unstructured meshes, and the results have been compared with previously published 2D and 3D results. The presented method shows that the 2D adaptive mesh model provides accurate results while having a low computational cost.

  5. Impact of stream restoration on flood waves

    Science.gov (United States)

    Sholtes, J.; Doyle, M.

    2008-12-01

    Restoration of channelized or incised streams has the potential to reduce downstream flooding via storing and dissipating the energy of flood waves. Restoration design elements such as restoring meanders, reducing slope, restoring floodplain connectivity, re-introducing in-channel woody debris, and re-vegetating banks and the floodplain have the capacity to attenuate flood waves via energy dissipation and channel and floodplain storage. Flood discharge hydrographs measured upstream and downstream of several restored reaches of varying stream order, located in both urban and rural catchments, are coupled with direct measurements of stream roughness at various stages to directly measure changes to peak discharge, flood wave celerity, and dispersion. A one-dimensional unsteady flow routing model, HEC-RAS, is calibrated and used to compare attenuation characteristics between pre- and post-restoration conditions. Modeled sensitivity results indicate that a restoration project placed on a smaller order stream demonstrates a higher relative reduction in peak discharge of routed flood waves than one of equal length on a higher order stream. Reductions in bed slope, extensions in channel length, and increases in channel and floodplain roughness follow restoration placement within the watershed in relative importance. By better understanding how the design, scale, and location of restored reaches within a catchment hydraulically impact flood flows, this study contributes both to restoration design and to site decision making. It also quantifies the effect of reach-scale stream restoration on flood wave attenuation.
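
    The attenuation mechanism described above can be sketched with the classic Muskingum storage-routing scheme, a much simpler stand-in for the HEC-RAS model used in the study. The hydrograph, the K (storage time) and X (weighting) values, and the pre/post-restoration labels below are all invented for illustration; a restored reach is loosely represented by a larger K (more channel and floodplain storage).

```python
def muskingum_route(inflow, K, X, dt=1.0):
    """Muskingum routing of an hourly inflow hydrograph (m^3/s)."""
    denom = 2.0 * K * (1.0 - X) + dt
    c0 = (dt - 2.0 * K * X) / denom
    c1 = (dt + 2.0 * K * X) / denom
    c2 = (2.0 * K * (1.0 - X) - dt) / denom   # c0 + c1 + c2 == 1
    out = [inflow[0]]
    for t in range(1, len(inflow)):
        out.append(c0 * inflow[t] + c1 * inflow[t - 1] + c2 * out[-1])
    return out

# Triangular storm hydrograph: 10 m^3/s baseflow rising to a 100 m^3/s peak.
inflow = ([10 + 15 * t for t in range(7)]
          + [100 - 7.5 * (t - 6) for t in range(7, 19)]
          + [10.0] * 10)

pre = muskingum_route(inflow, K=1.5, X=0.3)    # channelized: little storage
post = muskingum_route(inflow, K=4.0, X=0.1)   # "restored": slower, more storage
```

Comparing `max(pre)` with `max(post)` shows the larger-storage reach delivering a lower, later peak, which is the attenuation effect the study measures in the field.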

  6. Big data computing

    CERN Document Server

    Akerkar, Rajendra

    2013-01-01

    Due to market forces and technological evolution, Big Data computing is developing at an increasing rate. A wide variety of novel approaches and tools have emerged to tackle the challenges of Big Data, creating both more opportunities and more challenges for students and professionals in the field of data computation and analysis. Presenting a mix of industry cases and theory, Big Data Computing discusses the technical and practical issues related to Big Data in intelligent information management. Emphasizing the adoption and diffusion of Big Data tools and technologies in industry, the book i

  7. Increasing stress on disaster risk finance due to large floods

    Science.gov (United States)

    Jongman, Brenden; Hochrainer-Stigler, Stefan; Feyen, Luc; Aerts, Jeroen; Mechler, Reinhard; Botzen, Wouter; Bouwer, Laurens; Pflug, Georg; Rojas, Rodrigo; Ward, Philip

    2014-05-01

    Recent major flood disasters have shown that single extreme events can affect multiple countries simultaneously, which puts high pressure on trans-national risk reduction and risk transfer mechanisms. To date, little is known about such flood hazard interdependencies across regions, and the corresponding joint risks at regional to continental scales. Reliable information on correlated loss probabilities is crucial for developing robust insurance schemes and public adaptation funds, and for enhancing our understanding of climate change impacts. Here we show that extreme discharges are strongly correlated across European river basins and that these correlations can, or should, be used in national to continental scale risk assessment. We present probabilistic trends in continental flood risk, and demonstrate that currently observed extreme flood losses could more than double in frequency by 2050 under future climate change and socioeconomic development. The results demonstrate that accounting for tail dependencies leads to higher estimates of extreme losses than estimates based on the traditional assumption of independence between basins. We suggest that risk management for these increasing losses is largely feasible, and we demonstrate that risk can be shared by expanding risk transfer financing, reduced by investing in flood protection, or absorbed by enhanced solidarity between countries. We conclude that these measures have vastly different efficiency, equity and acceptability implications, which need to be taken into account in broader consultation, for which our analysis provides a basis.
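
    The role of tail dependence can be illustrated with a minimal Monte-Carlo sketch (not the paper's model): total losses from two basins are compared under independence and under full correlation. The Pareto loss distribution, its parameters, the exceedance threshold, and the sample size are all assumptions for illustration.

```python
import random

random.seed(42)
N = 50_000
THRESHOLD = 6.0  # "extreme" combined loss, arbitrary units

def pareto(u, alpha=3.0):
    """Inverse-CDF sample of a Pareto(alpha, x_m=1) loss."""
    return (1.0 - u) ** (-1.0 / alpha)

exceed_indep = exceed_dep = 0
for _ in range(N):
    u1, u2 = random.random(), random.random()
    if pareto(u1) + pareto(u2) > THRESHOLD:   # basins flooded independently
        exceed_indep += 1
    if pareto(u1) + pareto(u1) > THRESHOLD:   # basins fully correlated
        exceed_dep += 1

freq_indep = exceed_indep / N
freq_dep = exceed_dep / N
```

Under these toy assumptions the fully correlated case exceeds the threshold noticeably more often than the independent case, mirroring the paper's point that assuming independence between basins understates extreme joint losses.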

  8. PAI-OFF: A new proposal for online flood forecasting in flash flood prone catchments

    Science.gov (United States)

    Schmitz, G. H.; Cullmann, J.

    2008-10-01

    The Process Modelling and Artificial Intelligence for Online Flood Forecasting (PAI-OFF) methodology combines the reliability of physically based, hydrologic/hydraulic modelling with the operational advantages of artificial intelligence. These operational advantages are extremely low computation times and straightforward operation. The basic principle of the methodology is to portray process models by means of ANN. We propose to train ANN flood forecasting models with synthetic data that reflect the possible range of storm events. To this end, establishing PAI-OFF requires first setting up a physically based hydrologic model of the considered catchment and - optionally, if backwater effects have a significant impact on the flow regime - a hydrodynamic flood routing model of the river reach in question. Both models are subsequently used for simulating all meaningful and flood-relevant storm scenarios, which are obtained from a catchment-specific meteorological data analysis. This provides a database of corresponding input/output vectors, which is then completed by generally available hydrological and meteorological data characterizing the catchment state prior to each storm event. This database subsequently serves for training both a polynomial neural network (PoNN) - portraying the rainfall-runoff process - and a multilayer neural network (MLFN), which mirrors the hydrodynamic flood wave propagation in the river. These two ANN models replace the hydrological and hydrodynamic model in the operational mode. After presenting the theory, we apply PAI-OFF - essentially consisting of the coupled "hydrologic" PoNN and "hydrodynamic" MLFN - to the Freiberger Mulde catchment in the Erzgebirge (Ore Mountains) in East Germany (3000 km²). Both the demonstrated computational efficiency and the prediction reliability underline the potential of the new PAI-OFF methodology for online flood forecasting.
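
    A toy version of the surrogate-training idea, with every detail invented: a cheap polynomial model (standing in for the paper's polynomial neural network) is fitted by ordinary least squares to input/output pairs generated by a stand-in "process model" over synthetic storm scenarios, after which the fast surrogate can replace the slow model operationally.

```python
import random

random.seed(0)

def process_model(rain_mm, wetness):
    """Stand-in for a slow physically based model: peak discharge from
    storm depth and antecedent wetness (entirely invented)."""
    return 0.002 * rain_mm ** 2 + 1.5 * rain_mm * wetness + 5.0

# Synthetic training scenarios spanning the flood-relevant range.
scenarios = [(random.uniform(10, 300), random.uniform(0.1, 1.0))
             for _ in range(200)]
X = [[1.0, r, w, r * r, r * w] for r, w in scenarios]
y = [process_model(r, w) for r, w in scenarios]

def lstsq(X, y):
    """Solve the normal equations X^T X b = X^T y by Gaussian elimination."""
    n = len(X[0])
    A = [[sum(row[i] * row[j] for row in X) for j in range(n)] for i in range(n)]
    b = [sum(row[i] * yk for row, yk in zip(X, y)) for i in range(n)]
    for col in range(n):                       # elimination with pivoting
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    coef = [0.0] * n
    for r in range(n - 1, -1, -1):             # back substitution
        coef[r] = (b[r] - sum(A[r][c] * coef[c]
                              for c in range(r + 1, n))) / A[r][r]
    return coef

coef = lstsq(X, y)

def surrogate(rain_mm, wetness):
    feats = [1.0, rain_mm, wetness, rain_mm ** 2, rain_mm * wetness]
    return sum(c * f for c, f in zip(coef, feats))

err = abs(surrogate(150, 0.5) - process_model(150, 0.5))
```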

  9. From big bang to big crunch and beyond

    International Nuclear Information System (INIS)

    Elitzur, Shmuel; Rabinovici, Eliezer; Giveon, Amit; Kutasov, David

    2002-01-01

    We study a quotient Conformal Field Theory, which describes a 3+1 dimensional cosmological spacetime. Part of this spacetime is the Nappi-Witten (NW) universe, which starts at a 'big bang' singularity, expands and then contracts to a 'big crunch' singularity at a finite time. The gauged WZW model contains a number of copies of the NW spacetime, with each copy connected to the preceding one and to the next one at the respective big bang/big crunch singularities. The sequence of NW spacetimes is further connected at the singularities to a series of non-compact static regions with closed timelike curves. These regions contain boundaries, on which the observables of the theory live. This suggests a holographic interpretation of the physics. (author)

  10. BIG data - BIG gains? Empirical evidence on the link between big data analytics and innovation

    OpenAIRE

    Niebel, Thomas; Rasel, Fabienne; Viete, Steffen

    2017-01-01

    This paper analyzes the relationship between firms’ use of big data analytics and their innovative performance in terms of product innovations. Since big data technologies provide new data information practices, they create novel decision-making possibilities, which are widely believed to support firms’ innovation process. Applying German firm-level data within a knowledge production function framework we find suggestive evidence that big data analytics is a relevant determinant for the likel...

  11. Morphodynamic Response of the Unregulated Yampa River at Deerlodge to the 2011 Flood

    Science.gov (United States)

    Wheaton, J. M.; Scott, M.; Perkins, D.; DeMeurichy, K.

    2011-12-01

    The Yampa River, a tributary to the Green River, is the last undammed major tributary in the upper Colorado River Basin. The Yampa River at Deerlodge is actively braiding in an unconfined park valley setting, just upstream of the confined Yampa Canyon in Dinosaur National Monument. Deerlodge is a critical indicator site, which is monitored closely for signs of potential channel narrowing and associated invasions of non-native tamarisk or salt cedar (Tamarix) by the National Park Service's Northern Colorado Plateau Network (NPS-NCPN). Like many rivers draining the Rockies, the Yampa was fed by record snowpack in the 2011 spring runoff and produced the second largest flood of record at 748 cms (the largest flood of record was 940 cms in 1984). In contrast to most major rivers in the Colorado Basin, which are now dammed, the Yampa's natural, unregulated floods are thought to be of critical importance in rejuvenating the floodplain and reorganizing habitat in a manner favorable to native riparian vegetation and unfavorable to tamarisk. As part of the Big Rivers Monitoring Protocol, a 1.5 km reach of the braided river was surveyed with sub-centimeter resolution ground-based LiDAR and a total station in September 2010 and was resurveyed after the 2011 floods. The ground-based LiDAR captures the vegetation as well as the topography. Additionally, vegetation surveys were performed to identify the plant species present, percent cover, and relative abundance before and after the flood. The Geomorphic Change Detection software was used to distinguish real net changes from noise and segregate the budget by specific mechanisms of geomorphic change associated with different channel and vegetative patterns. This quantitative study of the morphodynamic response to a major flood highlights a critical potential positive feedback the flood plays on native riparian vegetation recruitment and a potential negative feedback on non-native tamarisk.

  12. Predicting the impact of urban flooding using open data.

    Science.gov (United States)

    Tkachenko, Nataliya; Procter, Rob; Jarvis, Stephen

    2016-05-01

    This paper aims to explore whether there is a relationship between search patterns for flood risk information on the Web and how badly localities have been affected by flood events. We hypothesize that localities where people stay more actively informed about potential flooding experience less negative impact than localities where people make less effort to be informed. Being informed, of course, does not hold the waters back; however, it may stimulate (or serve as an indicator of) such resilient behaviours as timely use of sandbags, relocation of possessions from basements to upper floors and/or temporary evacuation from flooded homes to alternative accommodation. We make use of open data to test this relationship empirically. Our results demonstrate that although aggregated Web search reflects average rainfall patterns, its eigenvectors predominantly consist of locations with similar flood impacts during 2014-2015. These results are also consistent with statistically significant correlations of Web search eigenvectors with flood warning and incident reporting datasets.

  13. Mapping flood hazards under uncertainty through probabilistic flood inundation maps

    Science.gov (United States)

    Stephens, T.; Bledsoe, B. P.; Miller, A. J.; Lee, G.

    2017-12-01

    Changing precipitation, rapid urbanization, and population growth interact to create unprecedented challenges for flood mitigation and management. Standard methods for estimating risk from flood inundation maps generally involve simulations of floodplain hydraulics for an established regulatory discharge of specified frequency. Hydraulic model results are then geospatially mapped and depicted as a discrete boundary of flood extents and a binary representation of the probability of inundation (in or out) that is assumed constant over a project's lifetime. Consequently, existing methods utilized to define flood hazards and assess risk management are hindered by deterministic approaches that assume stationarity in a nonstationary world, failing to account for spatio-temporal variability of climate and land use as they translate to hydraulic models. This presentation outlines novel techniques for portraying flood hazards and the results of multiple flood inundation maps spanning hydroclimatic regions. Flood inundation maps generated through modeling of floodplain hydraulics are probabilistic reflecting uncertainty quantified through Monte-Carlo analyses of model inputs and parameters under current and future scenarios. The likelihood of inundation and range of variability in flood extents resulting from Monte-Carlo simulations are then compared with deterministic evaluations of flood hazards from current regulatory flood hazard maps. By facilitating alternative approaches of portraying flood hazards, the novel techniques described in this presentation can contribute to a shifting paradigm in flood management that acknowledges the inherent uncertainty in model estimates and the nonstationary behavior of land use and climate.
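
    The core idea of the probabilistic maps can be sketched in one dimension: sample many plausible water levels (standing in for Monte-Carlo draws over model inputs and parameters) and record, per cell, the fraction of simulations that inundate it. The terrain profile, the Gaussian stage distribution, and the 1.0 m deterministic regulatory stage are invented for illustration.

```python
import random

random.seed(7)

terrain = [2.0, 1.2, 0.8, 0.5, 0.9, 1.5, 2.4]   # cell elevations (m)
N_SIM = 5_000

counts = [0] * len(terrain)
for _ in range(N_SIM):
    # Uncertain flood stage: mean 1.0 m with model/parameter spread.
    stage = random.gauss(1.0, 0.3)
    for i, z in enumerate(terrain):
        if stage > z:
            counts[i] += 1

prob_map = [c / N_SIM for c in counts]               # probability of inundation
binary_map = [1 if z < 1.0 else 0 for z in terrain]  # deterministic analogue
```

Where `binary_map` declares each cell simply in or out, `prob_map` grades the same cells continuously, which is the contrast between the regulatory and probabilistic portrayals the abstract describes.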

  14. Guiding rational reservoir flood operation using penalty-type genetic algorithm

    Science.gov (United States)

    Chang, Li-Chiu

    2008-06-01

    Real-time flood control of a multi-purpose reservoir should consider decreasing the flood peak stage downstream and storing floodwaters for future usage during typhoon seasons. This study proposes a reservoir flood control optimization model with linguistic description of requirements and existing regulations for rational operating decisions. The approach involves formulating reservoir flood operation as an optimization problem and using the genetic algorithm (GA) as a search engine. The optimizing formulation is expressed not only by mathematical forms of objective function and constraints, but also by non-analytic expressions in terms of parameters. GA is used to search for a global optimum of a mixture of mathematical and nonmathematical formulations. Due to the great number of constraints and flood control requirements, it is difficult to reach a solution without violating constraints. To tackle this bottleneck, a proper penalty strategy for each parameter is proposed to guide the GA searching process. The proposed approach is applied to the Shihmen reservoir in North Taiwan for finding the rational release and desired storage as a case study. The hourly historical data sets of 29 typhoon events that have hit the area in the last thirty years are investigated by the proposed method. To demonstrate the effectiveness of the proposed approach, the simplex method was also performed for comparison. The results demonstrated that a penalty-type genetic algorithm could effectively provide rational hydrographs to reduce flood damage during flood operation and to increase final storage for future usage.
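
    A minimal sketch of the penalty idea, assuming a single reservoir with an invented inflow hydrograph, storage limits, and penalty weights (the Shihmen formulation in the paper is far richer): the fitness is the peak release plus penalties for violating storage and final-storage requirements, so infeasible schedules are penalized rather than forbidden, which keeps the GA search moving.

```python
import random

random.seed(1)

inflow = [50, 120, 300, 520, 400, 250, 140, 80, 60, 50]  # hourly inflow (m^3/s)
CAPACITY, INITIAL, TARGET = 3000.0, 1500.0, 1800.0       # storage (volume units)

def fitness(releases):
    """Peak release plus weighted penalties for violated operating rules
    (penalty weights are invented; lower fitness is better)."""
    storage, penalty, peak = INITIAL, 0.0, 0.0
    for q_in, q_out in zip(inflow, releases):
        storage += q_in - q_out
        peak = max(peak, q_out)
        if storage > CAPACITY:                   # overtopping risk
            penalty += 10.0 * (storage - CAPACITY)
        if storage < 0.0:                        # infeasible drawdown
            penalty += 10.0 * (-storage)
    penalty += 2.0 * max(0.0, TARGET - storage)  # keep water for future use
    return peak + penalty

def evolve(pop_size=60, gens=150):
    n = len(inflow)
    pop = [[random.uniform(0, 600) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(gens):
        elite = sorted(pop, key=fitness)[: pop_size // 3]
        children = list(elite)
        while len(children) < pop_size:
            a, b = random.sample(elite, 2)
            w = random.random()                  # arithmetic crossover
            child = [w * x + (1 - w) * y for x, y in zip(a, b)]
            i = random.randrange(n)              # single-gene mutation
            child[i] = max(0.0, child[i] + random.gauss(0, 30))
            children.append(child)
        pop = children
    return min(pop, key=fitness)

best = evolve()
```

The evolved schedule releases water early enough to avoid the overtopping penalty while keeping the peak release well below the peak inflow.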

  15. Damaging Rainfall and Flooding. The Other Sahel Hazards

    Energy Technology Data Exchange (ETDEWEB)

    Tarhule, A. [Department of Geography, University of Oklahoma, 100 East Boyd Street, Norman, OK, 73079 (United States)

    2005-10-01

    Damaging rainfall and rain-induced flooding occur from time to time in the drought-prone Sahel savannah zone of Niger in West Africa but official records of these events and their socioeconomic impacts do not exist. This paper utilized newspaper accounts between 1970 and 2000 to survey and illustrate the range of these flood hazards in the Sahel. During the study interval, 53 newspaper articles reported 79 damaging rainfall and flood events in 47 different communities in the Sahel of Niger. Collectively, these events destroyed 5,580 houses and rendered 27,289 people homeless. Cash losses and damage to infrastructure in only three events exceeded $4 million. Sahel residents attribute these floods to five major causes including both natural and anthropogenic, but they view the flood problem as driven primarily by land use patterns. Despite such awareness, traditional coping strategies appear inadequate for dealing with the problems in part because of significant climatic variability. Analysis of several rainfall measures indicates that the cumulative rainfall in the days prior to a heavy rain event is an important factor influencing whether or not heavy rainfall results in flooding. Thus, despite some limitations, newspaper accounts of historical flooding are largely consistent with measured climatic variables. The study demonstrates that concerted effort is needed to improve the status of knowledge concerning flood impacts and indeed other natural and human hazards in the Sahel.
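
    The closing observation, that cumulative rainfall in the days before a heavy-rain event matters, can be illustrated with a simple antecedent precipitation index (API); the decay constant, rainfall series, and flooding threshold below are hypothetical.

```python
def api_series(daily_rain_mm, k=0.9):
    """Recursive API: API_t = k * API_{t-1} + P_t (decay constant k assumed)."""
    api, out = 0.0, []
    for p in daily_rain_mm:
        api = k * api + p
        out.append(api)
    return out

dry_spell = [0, 0, 0, 0, 0, 0, 60]        # 60 mm falling on a dry catchment
wet_spell = [20, 15, 30, 10, 25, 18, 60]  # the same 60 mm after a wet week

FLOOD_API = 100.0  # hypothetical flooding threshold
flood_dry = api_series(dry_spell)[-1] > FLOOD_API
flood_wet = api_series(wet_spell)[-1] > FLOOD_API
```

The same 60 mm storm crosses the threshold only after the wet week, matching the paper's finding that antecedent rainfall helps determine whether heavy rain produces flooding.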

  16. Increasing stress on disaster-risk finance due to large floods

    Science.gov (United States)

    Jongman, Brenden; Hochrainer-Stigler, Stefan; Feyen, Luc; Aerts, Jeroen C. J. H.; Mechler, Reinhard; Botzen, W. J. Wouter; Bouwer, Laurens M.; Pflug, Georg; Rojas, Rodrigo; Ward, Philip J.

    2014-04-01

    Recent major flood disasters have shown that single extreme events can affect multiple countries simultaneously, which puts high pressure on trans-national risk reduction and risk transfer mechanisms. So far, little is known about such flood hazard interdependencies across regions and the corresponding joint risks at regional to continental scales. Reliable information on correlated loss probabilities is crucial for developing robust insurance schemes and public adaptation funds, and for enhancing our understanding of climate change impacts. Here we show that extreme discharges are strongly correlated across European river basins. We present probabilistic trends in continental flood risk, and demonstrate that observed extreme flood losses could more than double in frequency by 2050 under future climate change and socio-economic development. We suggest that risk management for these increasing losses is largely feasible, and we demonstrate that risk can be shared by expanding risk transfer financing, reduced by investing in flood protection, or absorbed by enhanced solidarity between countries. We conclude that these measures have vastly different efficiency, equity and acceptability implications, which need to be taken into account in broader consultation, for which our analysis provides a basis.

  17. Flood Risk Management In Europe: European flood regulation

    NARCIS (Netherlands)

    Hegger, D.L.T.; Bakker, M.H.; Green, C.; Driessen, Peter; Delvaux, B.; Rijswick, H.F.M.W. van; Suykens, C.; Beyers, J-C.; Deketelaere, K.; Doorn-Hoekveld, W. van; Dieperink, C.

    2013-01-01

    In Europe, water management is moving from flood defense to a risk management approach, which takes both the probability and the potential consequences of flooding into account. In this report, we will look at Directives and (non-)EU initiatives in place to deal with flood risk in Europe indirectly

  18. Citizen involvement in flood risk governance: flood groups and networks

    Directory of Open Access Journals (Sweden)

    Twigger-Ross Clare

    2016-01-01

    Full Text Available Over the past decade there has been a policy shift within UK flood risk management towards localism, with an emphasis on communities taking ownership of flood risk. There is also an increased focus on resilience and, more specifically, on community resilience to flooding. This paper draws on research carried out for the UK Department for Environment, Food and Rural Affairs to evaluate the Flood Resilience Community Pathfinder (FRCP) scheme in England. Resilience is conceptualised as multidimensional and linked to existing capacities within a community. Creating resilience to flooding is an ongoing process of adaptation, learning from past events and preparing for future risks. This paper focusses on the development of formal and informal institutions to support improved flood risk management: institutional resilience capacity. It includes new institutions, e.g. flood groups, as well as activities that help to build inter- and intra-institutional resilience capacity, e.g. community flood planning. The pathfinder scheme consisted of 13 projects across England, led by local authorities, aimed at developing community resilience to flood risk between 2013 and 2015. This paper discusses the nature and structure of flood groups, the process of their development, and the extent of their linkages with formal institutions, drawing out the barriers and facilitators to developing institutional resilience at the local level.

  19. 18 CFR 1304.407 - Development within flood control storage zones of TVA reservoirs.

    Science.gov (United States)

    2010-04-01

    ... flood control storage zones of TVA reservoirs. 1304.407 Section 1304.407 Conservation of Power and Water... documentation related to flood control storage, provided the loss of flood control storage caused by the project... control storage. If this determination can be made, the applicant must then demonstrate how the loss of...

  20. A Flood Risk Assessment of Quang Nam, Vietnam Using Spatial Multicriteria Decision Analysis

    Directory of Open Access Journals (Sweden)

    Chinh Luu

    2018-04-01

    Full Text Available Vietnam is highly vulnerable to flood and storm impacts. Holistic flood risk assessment maps that adequately consider flood risk factors of hazard, exposure, and vulnerability are not available. These are vital for flood risk preparedness and disaster mitigation measures at the local scale. Unfortunately, there is a lack of knowledge about spatial multicriteria decision analysis and flood risk analysis more broadly in Vietnam. In response to this need, we identify and quantify flood risk components in Quang Nam province through spatial multicriteria decision analysis. The study presents a new approach to local flood risk assessment mapping, which combines historical flood marks with exposure and vulnerability data. The flood risk map output could assist and empower decision-makers in undertaking flood risk management activities in the province. Our study demonstrates a methodology to build flood risk assessment maps using flood mark, exposure and vulnerability data, which could be applied in other provinces in Vietnam.
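
    The weighted-overlay step of a spatial multicriteria analysis can be sketched as follows; the per-cell layer values, the weights, and the class breaks are all invented for illustration (a real analysis would derive them from flood marks, census data, and stakeholder weighting).

```python
# Normalized criterion layers, one value per grid cell (invented).
hazard =        [0.9, 0.6, 0.2, 0.8]   # e.g. from historical flood marks
exposure =      [0.7, 0.9, 0.1, 0.3]   # e.g. population / assets per cell
vulnerability = [0.5, 0.8, 0.4, 0.9]   # e.g. housing type, poverty rate

# Criterion weights summing to 1 (hypothetical).
WEIGHTS = {"hazard": 0.5, "exposure": 0.3, "vulnerability": 0.2}

# Weighted linear combination per cell.
risk = [WEIGHTS["hazard"] * h + WEIGHTS["exposure"] * e
        + WEIGHTS["vulnerability"] * v
        for h, e, v in zip(hazard, exposure, vulnerability)]

def risk_class(score):
    """Map a continuous risk score onto discrete map classes."""
    return "high" if score >= 0.6 else "medium" if score >= 0.3 else "low"

classes = [risk_class(s) for s in risk]
```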

  1. Benchmarking Big Data Systems and the BigData Top100 List.

    Science.gov (United States)

    Baru, Chaitanya; Bhandarkar, Milind; Nambiar, Raghunath; Poess, Meikel; Rabl, Tilmann

    2013-03-01

    "Big data" has become a major force of innovation across enterprises of all sizes. New platforms with increasingly more features for managing big datasets are being announced almost on a weekly basis. Yet, there is currently a lack of any means of comparability among such platforms. While the performance of traditional database systems is well understood and measured by long-established institutions such as the Transaction Processing Performance Council (TPC), there is neither a clear definition of the performance of big data systems nor a generally agreed upon metric for comparing these systems. In this article, we describe a community-based effort for defining a big data benchmark. Over the past year, a Big Data Benchmarking Community has become established in order to fill this void. The effort focuses on defining an end-to-end application-layer benchmark for measuring the performance of big data applications, with the ability to easily adapt the benchmark specification to evolving challenges in the big data space. This article describes the efforts that have been undertaken thus far toward the definition of a BigData Top100 List. While highlighting the major technical as well as organizational challenges, through this article, we also solicit community input into this process.

  2. Engineering Study for a Full Scale Demonstration of Steam Reforming Black Liquor Gasification at Georgia-Pacific's Mill in Big Island, Virginia; FINAL

    International Nuclear Information System (INIS)

    Robert De Carrera; Mike Ohl

    2002-01-01

    Georgia-Pacific Corporation performed an engineering study to determine the feasibility of installing a full-scale demonstration project of steam reforming black liquor chemical recovery at Georgia-Pacific's mill in Big Island, Virginia. The technology considered was the Pulse Enhanced Steam Reforming technology that was developed and patented by Manufacturing and Technology Conversion International (MTCI) and is currently licensed to StoneChem, Inc., for use in North America. Pilot studies of steam reforming have been carried out on a 25-ton per day reformer at Inland Container's Ontario, California mill and on a 50-ton per day unit at Weyerhaeuser's New Bern, North Carolina mill.

  3. Big data, big knowledge: big data for personalized healthcare.

    Science.gov (United States)

    Viceconti, Marco; Hunter, Peter; Hose, Rod

    2015-07-01

    The idea that the purely phenomenological knowledge that we can extract by analyzing large amounts of data can be useful in healthcare seems to contradict the desire of VPH researchers to build detailed mechanistic models for individual patients. But in practice no model is ever entirely phenomenological or entirely mechanistic. We propose in this position paper that big data analytics can be successfully combined with VPH technologies to produce robust and effective in silico medicine solutions. In order to do this, big data technologies must be further developed to cope with some specific requirements that emerge from this application. Such requirements are: working with sensitive data; analytics of complex and heterogeneous data spaces, including nontextual information; distributed data management under security and performance constraints; specialized analytics to integrate bioinformatics and systems biology information with clinical observations at tissue, organ and organism scales; and specialized analytics to define the "physiological envelope" during the daily life of each patient. These domain-specific requirements suggest a need for targeted funding, in which big data technologies for in silico medicine become the research priority.

  4. BigDataBench: a Big Data Benchmark Suite from Internet Services

    OpenAIRE

    Wang, Lei; Zhan, Jianfeng; Luo, Chunjie; Zhu, Yuqing; Yang, Qiang; He, Yongqiang; Gao, Wanling; Jia, Zhen; Shi, Yingjie; Zhang, Shujie; Zheng, Chen; Lu, Gang; Zhan, Kent; Li, Xiaona; Qiu, Bizhu

    2014-01-01

    As architecture, systems, and data management communities pay greater attention to innovative big data systems and architectures, the pressure of benchmarking and evaluating these systems rises. Considering the broad use of big data systems, big data benchmarks must include diversity of data and workloads. Most of the state-of-the-art big data benchmarking efforts target evaluating specific types of applications or system software stacks, and hence they are not qualified for serving the purpo...

  5. Entering the 'big data' era in medicinal chemistry: molecular promiscuity analysis revisited.

    Science.gov (United States)

    Hu, Ye; Bajorath, Jürgen

    2017-06-01

    The 'big data' concept plays an increasingly important role in many scientific fields. Big data involves more than unprecedentedly large volumes of data that become available. Different criteria characterizing big data must be carefully considered in computational data mining, as we discuss herein focusing on medicinal chemistry. This is a scientific discipline where big data is beginning to emerge and provide new opportunities. For example, the ability of many drugs to specifically interact with multiple targets, termed promiscuity, forms the molecular basis of polypharmacology, a hot topic in drug discovery. Compound promiscuity analysis is an area that is much influenced by big data phenomena. Different results are obtained depending on chosen data selection and confidence criteria, as we also demonstrate.

  6. Quantification of Uncertainty in the Flood Frequency Analysis

    Science.gov (United States)

    Kasiapillai Sudalaimuthu, K.; He, J.; Swami, D.

    2017-12-01

    Flood frequency analysis (FFA) is usually carried out for the planning and design of water resources and hydraulic structures. Owing to variability in sample representation, selection of distribution, and estimation of distribution parameters, the estimation of flood quantiles has always been uncertain. Hence, suitable approaches must be developed to quantify the uncertainty in the form of a prediction interval as an alternative to the deterministic approach. The framework developed in the present study to include uncertainty in the FFA discusses a multi-objective optimization approach to construct the prediction interval using an ensemble of flood quantiles. Through this approach, an optimal variability of distribution parameters is identified to carry out FFA. To demonstrate the proposed approach, annual maximum flow data from two gauge stations (Bow River at Calgary and Banff, Canada) are used. The major focus of the present study was to evaluate the changes in the magnitude of flood quantiles due to the recent extreme flood event that occurred during the year 2013. In addition, the efficacy of the proposed method was further verified using standard bootstrap-based sampling approaches, and the proposed method was found to be more reliable in modeling extreme floods than the bootstrap methods.
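
The quantile-plus-interval idea in this abstract can be sketched in a few lines. The snippet below is an illustrative stand-in, not the authors' multi-objective optimization approach: it fits a Gumbel (EV1) distribution to a synthetic annual maximum series by the method of moments, and builds the kind of percentile-bootstrap interval the study uses as a baseline for comparison.

```python
import numpy as np

def gumbel_fit(ams):
    """Fit a Gumbel (EV1) distribution by the method of moments."""
    mean, std = np.mean(ams), np.std(ams, ddof=1)
    beta = std * np.sqrt(6) / np.pi          # scale parameter
    mu = mean - 0.5772 * beta                # location (Euler-Mascheroni constant)
    return mu, beta

def gumbel_quantile(mu, beta, T):
    """Flood quantile for return period T (years)."""
    p = 1.0 - 1.0 / T                        # non-exceedance probability
    return mu - beta * np.log(-np.log(p))

def bootstrap_interval(ams, T, n_boot=2000, alpha=0.10, seed=0):
    """Percentile bootstrap prediction interval for the T-year flood."""
    rng = np.random.default_rng(seed)
    q = []
    for _ in range(n_boot):
        sample = rng.choice(ams, size=len(ams), replace=True)
        q.append(gumbel_quantile(*gumbel_fit(sample), T))
    return np.percentile(q, [100 * alpha / 2, 100 * (1 - alpha / 2)])

# Synthetic annual maximum series (m3/s) standing in for a gauge record
rng = np.random.default_rng(42)
ams = 300 + 80 * rng.gumbel(size=60)
mu, beta = gumbel_fit(ams)
q100 = gumbel_quantile(mu, beta, 100)
lo, hi = bootstrap_interval(ams, 100)
print(f"Q100 = {q100:.0f} m3/s, 90% interval [{lo:.0f}, {hi:.0f}]")
```

A real FFA would also compare candidate distributions (GEV, log-Pearson III) before settling on one; the bootstrap here only captures sampling variability, which is why the paper argues for a richer treatment of parameter uncertainty.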

  7. Conociendo Big Data

    Directory of Open Access Journals (Sweden)

    Juan José Camargo-Vega

    2014-12-01

    Full Text Available Given the importance the term Big Data has acquired, this research sought to study and exhaustively analyze the state of the art of Big Data; in addition, as a second objective, it analyzed the characteristics, tools, technologies, models, and standards related to Big Data; finally, it sought to identify the most relevant characteristics of Big Data management, so as to cover everything concerning the central topic of the research. The methodology included reviewing the state of the art of Big Data and presenting its current situation; surveying Big Data technologies; presenting some of the NoSQL databases, which are those that allow data in unstructured formats to be processed; and showing the data models and the technologies for analyzing them, closing with some benefits of Big Data. The methodological design used for the research was non-experimental, since no variables were manipulated, and exploratory, since this research is a first exploration of the Big Data landscape.

  8. Flood-rich and flood-poor periods in Spain in 1942-2009

    Science.gov (United States)

    Mediero, Luis; Santillán, David; Garrote, Luis

    2016-04-01

    Several studies to detect trends in flood series at either national or trans-national scales have been conducted. Mediero et al. (2015) studied flood trends by using the longest streamflow records available in Europe. They found a decreasing trend in the Atlantic, Continental and Scandinavian regions. More specifically, Mediero et al. (2014) found a general decreasing trend in flood series in Spain in the period 1959-2009. Trends in flood series are usually detected by the Mann-Kendall test applied to a given period. However, the result of the Mann-Kendall test can change depending on the starting and ending year of the series. Flood oscillations can occur, and flood-rich and flood-poor periods could condition the results, especially when they are located at the beginning or end of the series. A methodology to identify statistically significant flood-rich and flood-poor periods is developed, based on the comparison between the expected sampling variability of floods when stationarity is assumed and the observed variability of floods in a given series. The methodology is applied to the longest series of annual maximum floods, peaks over threshold and counts of annual occurrences in peaks over threshold series observed in Spain in the period 1942-2009. A flood-rich period in 1950-1970 and a flood-poor period in 1970-1990 are identified in most of the selected sites. The generalised decreasing trend in flood series found by Mediero et al. (2014) could be explained by a flood-rich period placed at the beginning of the series and a flood-poor period located at the end of the series. References: Mediero, L., Kjeldsen, T.R., Macdonald, N., Kohnova, S., Merz, B., Vorogushyn, S., Wilson, D., Alburquerque, T., Blöschl, G., Bogdanowicz, E., Castellarin, A., Hall, J., Kobold, M., Kriauciuniene, J., Lang, M., Madsen, H., Onuşluel Gül, G., Perdigão, R.A.P., Roald, L.A., Salinas, J.L., Toumazis, A.D., Veijalainen, N., Óðinn Þórarinsson. Identification of coherent flood
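
The trend detection described in this record rests on the Mann-Kendall test. As a minimal illustration (a bare-bones implementation without the tie correction a production analysis would include), the sketch below applies the test to a synthetic declining annual-flood series spanning the same 1959-2009 window:

```python
import math
import numpy as np

def mann_kendall(x):
    """Two-sided Mann-Kendall trend test (no tie correction, for brevity)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    # S statistic: sum of signs of all pairwise differences (j > i)
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)   # continuity correction
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    # Two-sided p-value from the standard normal distribution
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p

# Declining synthetic flood series: the test should flag a downward trend
rng = np.random.default_rng(1)
years = np.arange(1959, 2010)
floods = 500 - 3.0 * (years - years[0]) + 40 * rng.normal(size=len(years))
z, p = mann_kendall(floods)
print(f"z = {z:.2f}, p = {p:.4f}")
```

The record's caveat is exactly about this test: rerunning `mann_kendall` on sub-windows of the series (e.g., dropping the flood-rich 1950-1970 years) can flip the apparent trend, which is why the authors look for flood-rich and flood-poor periods instead.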

  9. Recent advances in flood forecasting and flood risk assessment

    Directory of Open Access Journals (Sweden)

    G. Arduino

    2005-01-01

    Full Text Available Recent large floods in Europe have led to increased interest in research and development of flood forecasting systems. Some of these events have been provoked by some of the wettest rainfall periods on record, which has led to speculation that such extremes are attributable in some measure to anthropogenic global warming and represent the beginning of a period of higher flood frequency. Whilst current trends in extreme event statistics will be difficult to discern conclusively, there has been a substantial increase in the frequency of high floods in the 20th century for basins greater than 2x10^5 km2. There is also increasing evidence that anthropogenic forcing of climate change may lead to an increased probability of extreme precipitation and, hence, of flooding. There is, therefore, major emphasis on the improvement of operational flood forecasting systems in Europe, with significant European Community spending on research and development on prototype forecasting systems and flood risk management projects. This Special Issue synthesises the most relevant scientific and technological results presented at the International Conference on Flood Forecasting in Europe held in Rotterdam from 3-5 March 2003. During that meeting 150 scientists, forecasters and stakeholders from four continents assembled to present their work and current operational best practice and to discuss future directions of scientific and technological efforts in flood prediction and prevention. The papers presented at the conference fall into seven themes, as follows.

  10. BigDansing

    KAUST Repository

    Khayyat, Zuhair

    2015-06-02

    Data cleansing approaches have usually focused on detecting and fixing errors with little attention to scaling to big datasets. This presents a serious impediment since data cleansing often involves costly computations such as enumerating pairs of tuples, handling inequality joins, and dealing with user-defined functions. In this paper, we present BigDansing, a Big Data Cleansing system to tackle efficiency, scalability, and ease-of-use issues in data cleansing. The system can run on top of most common general purpose data processing platforms, ranging from DBMSs to MapReduce-like frameworks. A user-friendly programming interface allows users to express data quality rules both declaratively and procedurally, with no requirement of being aware of the underlying distributed platform. BigDansing translates these rules into a series of transformations that enable distributed computations and several optimizations, such as shared scans and specialized join operators. Experimental results on both synthetic and real datasets show that BigDansing outperforms existing baseline systems by up to more than two orders of magnitude without sacrificing the quality provided by the repair algorithms.
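
The "enumerating pairs of tuples" cost this abstract mentions is easy to see on a single machine. The sketch below (illustrative only, not BigDansing's API) detects violations of a functional dependency zipcode -> city by naive pairwise enumeration; the quadratic blow-up of this loop is precisely what systems like BigDansing avoid with shared scans and specialized join operators:

```python
from itertools import combinations

def fd_violations(rows, lhs, rhs):
    """Find pairs of records violating the functional dependency lhs -> rhs.

    A pair violates the rule when both records agree on the lhs attribute
    but disagree on the rhs attribute (e.g. same zipcode, different city).
    """
    return [
        (a, b)
        for a, b in combinations(rows, 2)   # O(n^2) pair enumeration
        if a[lhs] == b[lhs] and a[rhs] != b[rhs]
    ]

records = [
    {"name": "Ann",  "zipcode": "10001", "city": "New York"},
    {"name": "Bob",  "zipcode": "10001", "city": "NYC"},       # violates the FD
    {"name": "Carl", "zipcode": "60601", "city": "Chicago"},
]
bad_pairs = fd_violations(records, "zipcode", "city")
print(len(bad_pairs))
```

A distributed implementation would instead group records by the lhs attribute (one scan, one shuffle) so that only tuples sharing a zipcode are ever compared.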

  11. Rethinking the relationship between flood risk perception and flood management.

    Science.gov (United States)

    Birkholz, S; Muro, M; Jeffrey, P; Smith, H M

    2014-04-15

    Although flood risk perceptions and their concomitant motivations for behaviour have long been recognised as significant features of community resilience in the face of flooding events, there has, for some time now, been a poorly appreciated fissure in the accompanying literature. Specifically, rationalist and constructivist paradigms in the broader domain of risk perception provide different (though not always conflicting) contexts for interpreting evidence and developing theory. This contribution reviews the major constructs that have been applied to understanding flood risk perceptions and contextualises these within broader conceptual developments around risk perception theory and contemporary thinking around flood risk management. We argue that there is a need to re-examine and re-invigorate flood risk perception research, in a manner that is comprehensively underpinned by more constructivist thinking around flood risk management as well as by developments in broader risk perception research. We draw attention to an historical over-emphasis on the cognitive perceptions of those at risk to the detriment of a richer understanding of a wider range of flood risk perceptions such as those of policy-makers or of tax-payers who live outside flood affected areas as well as the linkages between these perspectives and protective measures such as state-supported flood insurance schemes. Conclusions challenge existing understandings of the relationship between risk perception and flood management, particularly where the latter relates to communication strategies and the extent to which those at risk from flooding feel responsible for taking protective actions. Copyright © 2014 Elsevier B.V. All rights reserved.

  12. Flood Risk, Flood Mitigation, and Location Choice: Evaluating the National Flood Insurance Program's Community Rating System.

    Science.gov (United States)

    Fan, Qin; Davlasheridze, Meri

    2016-06-01

    Climate change is expected to worsen the negative effects of natural disasters like floods. The negative impacts, however, can be mitigated by individuals' adjustments through migration and relocation behaviors. Previous literature has identified flood risk as one significant driver in relocation decisions, but no prior study examines the effect of the National Flood Insurance Program's voluntary program-the Community Rating System (CRS)-on residential location choice. This article fills this gap and tests the hypothesis that flood risk and the CRS-creditable flood control activities affect residential location choices. We employ a two-stage sorting model to empirically estimate the effects. In the first stage, individuals' risk perception and preference heterogeneity for the CRS activities are considered, while mean effects of flood risk and the CRS activities are estimated in the second stage. We then estimate heterogeneous marginal willingness to pay (WTP) for the CRS activities by category. Results show that age, ethnicity and race, educational attainment, and prior exposure to risk explain risk perception. We find significant values for the CRS-creditable mitigation activities, which provides empirical evidence for the benefits associated with the program. The marginal WTP for an additional credit point earned for public information activities, including hazard disclosure, is found to be the highest. Results also suggest that water amenities dominate flood risk. Thus, high amenity values may increase exposure to flood risk, and flood mitigation projects should be strategized in coastal regions accordingly. © 2015 Society for Risk Analysis.

  13. A global framework for future costs and benefits of river-flood protection in urban areas

    Science.gov (United States)

    Ward, Philip J.; Jongman, Brenden; Aerts, Jeroen C. J. H.; Bates, Paul D.; Botzen, Wouter J. W.; Diaz Loaiza, Andres; Hallegatte, Stephane; Kind, Jarl M.; Kwadijk, Jaap; Scussolini, Paolo; Winsemius, Hessel C.

    2017-09-01

    Floods cause billions of dollars of damage each year, and flood risks are expected to increase due to socio-economic development, subsidence, and climate change. Implementing additional flood risk management measures can limit losses, protecting people and livelihoods. Whilst several models have been developed to assess global-scale river-flood risk, methods for evaluating flood risk management investments globally are lacking. Here, we present a framework for assessing costs and benefits of structural flood protection measures in urban areas around the world. We demonstrate its use under different assumptions of current and future climate change and socio-economic development. Under these assumptions, investments in dykes may be economically attractive for reducing risk in large parts of the world, but not everywhere. In some regions, economically efficient investments could reduce future flood risk below today’s levels, in spite of climate change and economic growth. We also demonstrate the sensitivity of the results to different assumptions and parameters. The framework can be used to identify regions where river-flood protection investments should be prioritized, or where other risk-reducing strategies should be emphasized.

  14. May flood-poor periods be more dangerous than flood-rich periods?

    Science.gov (United States)

    Salinas, Jose Luis; Di Baldassarre, Giuliano; Viglione, Alberto; Kuil, Linda; Bloeschl, Guenter

    2014-05-01

    River floods are among the most devastating natural hazards experienced by populations that, since the earliest recorded civilisations, have settled in floodplains because they offer favourable conditions for trade, agriculture, and economic development. The occurrence of a flood may cause loss of lives and tremendous economic damages and, therefore, is rightly seen as a very negative event by the communities involved. Occurrence of many floods in a row is, of course, even more frustrating and is rightly considered an unbearable calamity. Unfortunately, the occurrence of many floods in a limited number of consecutive years is not unusual. In many places in the world, it has been observed that extreme floods do not arrive randomly but cluster in time into flood-poor and flood-rich periods consistent with the Hurst effect. If this is the case, when are people more in danger? When should people be more scared? In flood-poor or flood-rich periods? In this work, a Socio-Hydrology model (Di Baldassarre et al., 2013; Viglione et al., 2014) is used to show that, maybe counter-intuitively, flood-poor periods may be more dangerous than flood-rich periods. The model is a conceptualisation of a hypothetical setting of a city at a river where a community evolves, making choices between flood management options on the floodplain. The most important feedbacks between the economic, political, technological and hydrological processes of the evolution of that community are represented in the model. In particular, the model also accounts in a dynamic way for the evolution of the community's awareness of flood risk. Occurrence of floods tends to increase people's recognition that their property is in an area that is potentially at risk of flooding, both at the scales of individuals and communities, which is one of the main reasons why flood coping actions are taken. 
It is shown through examples that frequent flood events may result in moderate damages because they ensure that the

  15. Swiss Re Global Flood Hazard Zones: Know your flood risk

    Science.gov (United States)

    Vinukollu, R. K.; Castaldi, A.; Mehlhorn, J.

    2012-12-01

    Floods, among all natural disasters, have a great damage potential. On a global basis, there is strong evidence of an increase in the number of people affected and in economic losses due to floods. For example, global insured flood losses have increased by 12% every year since 1970, and this is expected to increase further with growing exposure in the high-risk areas close to rivers and coastlines. Recently, the insurance industry has been surprised by the large extent of losses, because most countries lack reliable hazard information. One example has been the 2011 Thailand floods, where millions of people were affected and the total economic losses were 30 billion USD. In order to assess the flood risk across different regions and countries, the flood team at Swiss Re produced global maps of flood zones based on a Geomorphologic Regression approach, developed in house and patented. Input data for the study were obtained from NASA's Shuttle Radar Topographic Mission (SRTM) elevation data, Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) Global Digital Elevation Model (GDEM) and HydroSHEDS. The underlying assumptions of the approach are that naturally flowing rivers shape their channel and flood plain according to basin inherent forces and characteristics, and that the flood water extent strongly depends on the shape of the flood plain. On the basis of the catchment characteristics, the model finally calculates the probability of a location being flooded or not for a defined return period, which in the current study was set to 100 years. The data is produced at a 90-m resolution for latitudes 60S to 60N. This global product is now used in the insurance industry to inspect, inform and/or insure the flood risk across the world.
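
The Geomorphologic Regression itself is patented and its form is not public, but the general shape of terrain-based flood-probability models can be illustrated. The snippet below is a hypothetical logistic model (the predictors and weights are invented for illustration and are not Swiss Re's) mapping height above nearest drainage (HAND) and local slope, both derivable from SRTM/HydroSHEDS-style data, to a probability of flooding for a fixed return period:

```python
import numpy as np

def flood_probability(hand, slope, w0=2.0, w_hand=-0.8, w_slope=-0.5):
    """Hypothetical logistic model: probability that a cell floods in the
    100-year event, from height above nearest drainage (HAND, metres) and
    local slope (degrees). Weights are illustrative placeholders."""
    z = w0 + w_hand * hand + w_slope * slope
    return 1.0 / (1.0 + np.exp(-z))

# Three terrain cells: a low-lying floodplain, a terrace, a hillslope
hand = np.array([0.5, 3.0, 12.0])
slope = np.array([0.2, 1.0, 8.0])
p = flood_probability(hand, slope)
print(np.round(p, 3))
```

The qualitative behaviour is the point: probability falls off monotonically as a cell sits higher above, and steeper away from, the drainage network, matching the assumption that flood extent is controlled by flood-plain shape.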

  16. Long-term changes in community assembly, resistance, and resilience following experimental floods.

    Science.gov (United States)

    Robinson, Christopher T

    2012-10-01

    This study examined the long-term changes in community assembly, resistance, and resilience of macroinvertebrates following 10 years of experimental floods in a flow regulated river. Physico-chemistry, macroinvertebrates, and periphyton biomass were monitored before and sequentially after each of 22 floods, and drift/seston was collected during six separate floods over the study period. The floods reduced the density and taxon richness of macroinvertebrates, and a nonmetric dimensional scaling (NMDS) analysis distinguished temporal shifts in community assembly. Resistance (measured as the relative lack of loss in density) to floods varied among taxa, and the abundance of resistant taxa was related to the temporal changes in community assembly. Community resistance was inversely related to flood magnitude with all larger floods (> 25 m3/s, > 16-fold over baseflow) reducing densities by > 75% regardless of flood year, whereas smaller floods (floods. No relationship was found between flood magnitude and the relative loss in periphyton biomass. Resilience was defined as the recovery slope (positive slope of a parameter with time following each flood) and was unrelated to shifts in community assembly or resistance. Macroinvertebrate drift and seston demonstrated hysteresis (i.e., a temporal response in parameter quantity with change in discharge) during each flood, although larger floods typically had two peaks in both parameters. The first peak was a response to the initial increases in flow, whereas the second peak was associated with streambed disturbance (substrate mobility) and side-slope failure causing increased scour. Drift density was 3-9 times greater and that of seston 3-30 times greater during larger floods than smaller floods. These results demonstrate temporal shifts in macroinvertebrate community assembly toward a pre-dam assemblage following sequential floods in this flow regulated river, thus confirming the ecological role of habitat filtering in
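
The two response metrics this abstract defines are simple to compute from monitoring data. The sketch below uses invented densities, assuming (as the abstract states) that resistance is the fraction of pre-flood density remaining after a flood and resilience is the linear slope of recovery with time:

```python
import numpy as np

def resistance(pre_density, post_density):
    """Resistance: relative lack of loss in density after a flood (1 = no loss)."""
    return post_density / pre_density

def resilience(days, densities):
    """Resilience: slope of density recovery with time after a flood."""
    slope, _intercept = np.polyfit(days, densities, 1)
    return slope

# Hypothetical macroinvertebrate densities (individuals/m2) around one flood
pre = 1200.0
post = 250.0                                  # a large flood: > 75% reduction
recovery_days = np.array([5, 15, 30, 60, 90])
recovery_dens = np.array([250, 420, 640, 900, 1100])

r = resistance(pre, post)
slope = resilience(recovery_days, recovery_dens)
print(f"resistance = {r:.2f}, recovery slope = {slope:.1f} ind/m2/day")
```

Computed per taxon and per flood over the 22-flood record, these two numbers are what let the study separate which taxa resist disturbance from which merely recover quickly.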

  17. Unsupervised Tensor Mining for Big Data Practitioners.

    Science.gov (United States)

    Papalexakis, Evangelos E; Faloutsos, Christos

    2016-09-01

    Multiaspect data are ubiquitous in modern Big Data applications. For instance, different aspects of a social network are the different types of communication between people, the time stamp of each interaction, and the location associated to each individual. How can we jointly model all those aspects and leverage the additional information that they introduce to our analysis? Tensors, which are multidimensional extensions of matrices, are a principled and mathematically sound way of modeling such multiaspect data. In this article, our goal is to popularize tensors and tensor decompositions to Big Data practitioners by demonstrating their effectiveness, outlining challenges that pertain to their application in Big Data scenarios, and presenting our recent work that tackles those challenges. We view this work as a step toward a fully automated, unsupervised tensor mining tool that can be easily and broadly adopted by practitioners in academia and industry.
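
A concrete way to see what tensor decompositions buy the practitioner: the sketch below builds a three-way "user x time x location" tensor with low multilinear rank and recovers it with a truncated higher-order SVD, a basic Tucker-style decomposition. This plain-NumPy HOSVD is illustrative only; the authors' work concerns scalable CP-style mining, not this exact algorithm.

```python
import numpy as np

def unfold(tensor, mode):
    """Mode-n unfolding: move axis `mode` first, then flatten the rest."""
    return np.moveaxis(tensor, mode, 0).reshape(tensor.shape[mode], -1)

def hosvd(tensor, ranks):
    """Truncated higher-order SVD: one factor matrix per tensor mode."""
    factors = []
    for mode, r in enumerate(ranks):
        u, _s, _vt = np.linalg.svd(unfold(tensor, mode), full_matrices=False)
        factors.append(u[:, :r])
    core = tensor
    for mode, u in enumerate(factors):       # core = tensor x_n U_n^T
        core = np.tensordot(core, u.T, axes=([mode], [1]))
        core = np.moveaxis(core, -1, mode)
    return core, factors

# Toy tensor with multilinear rank (2, 2, 2)
rng = np.random.default_rng(0)
a, b, c = rng.normal(size=(20, 2)), rng.normal(size=(15, 2)), rng.normal(size=(10, 2))
tensor = np.einsum('ir,jr,kr->ijk', a, b, c)

core, factors = hosvd(tensor, ranks=(2, 2, 2))
approx = core
for mode, u in enumerate(factors):           # reconstruct: core x_n U_n
    approx = np.tensordot(approx, u, axes=([mode], [1]))
    approx = np.moveaxis(approx, -1, mode)
err = np.linalg.norm(tensor - approx) / np.linalg.norm(tensor)
print(f"relative reconstruction error: {err:.2e}")
```

Because the toy tensor genuinely has rank 2 in every mode, the rank-(2,2,2) truncation reconstructs it to machine precision; on real multiaspect data the residual instead measures how much structure the chosen ranks fail to capture.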

  18. Characterizing Big Data Management

    Directory of Open Access Journals (Sweden)

    Rogério Rossi

    2015-06-01

    Full Text Available Big data management is a reality for an increasing number of organizations in many areas and represents a set of challenges involving big data modeling, storage and retrieval, analysis and visualization. However, technological resources, people and processes are crucial to facilitate the management of big data in any kind of organization, allowing information and knowledge from a large volume of data to support decision-making. Big data management can be supported by three dimensions: technology, people and processes. Hence, this article discusses these dimensions: the technological dimension, which is related to storage, analytics and visualization of big data; the human aspects of big data; and the process management dimension, which addresses big data management from both technological and business perspectives.

  19. Big science

    CERN Multimedia

    Nadis, S

    2003-01-01

    "Big science" is moving into astronomy, bringing large experimental teams, multi-year research projects, and big budgets. If this is the wave of the future, why are some astronomers bucking the trend? (2 pages).

  20. Floods

    Science.gov (United States)

    Floods are common in the United States. Weather such as heavy rain, thunderstorms, hurricanes, or tsunamis can ... is breached, or when a dam breaks. Flash floods, which can develop quickly, often have a dangerous ...

  1. Development of Probabilistic Flood Inundation Mapping For Flooding Induced by Dam Failure

    Science.gov (United States)

    Tsai, C.; Yeh, J. J. J.

    2017-12-01

    A primary function of flood inundation mapping is to forecast flood hazards and assess potential losses. However, uncertainties limit the reliability of inundation hazard assessments. Major sources of uncertainty should be taken into consideration by an optimal flood management strategy. This study focuses on the 20 km reach downstream of the Shihmen Reservoir in Taiwan. A dam failure induced flood herein provides the upstream boundary conditions of flood routing. The two major sources of uncertainty considered in the hydraulic model and the flood inundation mapping herein are uncertainty in the dam break model and uncertainty in the roughness coefficient. The perturbance moment method is applied to a dam break model and the hydro system model to develop probabilistic flood inundation mapping. Various numbers of uncertain variables can be considered in these models, and the variability of outputs can be quantified. Probabilistic flood inundation mapping for dam break induced floods can be developed, with consideration of the variability of output, using the commonly used HEC-RAS model. Different probabilistic flood inundation mappings are discussed and compared. Probabilistic flood inundation mappings are expected to provide new physical insights in support of the evaluation of flood-prone areas downstream of reservoirs.
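
The core idea of moment-based uncertainty propagation, carrying input moments through a model analytically instead of sampling it, can be shown on a far smaller example than HEC-RAS. The sketch below (hypothetical channel geometry and roughness statistics; a first-order second-moment approximation, not the authors' perturbance moment method) propagates uncertainty in Manning's roughness n into discharge uncertainty:

```python
import numpy as np

def manning_q(n, b, h, s):
    """Manning's equation for a rectangular channel (SI units)."""
    area = b * h
    radius = b * h / (b + 2 * h)             # hydraulic radius A/P
    return (1.0 / n) * area * radius ** (2.0 / 3.0) * np.sqrt(s)

def fosm_discharge(n_mean, n_std, b, h, s):
    """First-order second-moment estimate of the mean and standard
    deviation of Q when Manning's n is the only uncertain input."""
    q_mean = manning_q(n_mean, b, h, s)
    # dQ/dn = -Q/n for Manning's formula, so sigma_Q ~ (Q/n) * sigma_n
    q_std = (q_mean / n_mean) * n_std
    return q_mean, q_std

# Illustrative channel: 50 m wide, 3 m deep, slope 0.001, n = 0.035 +/- 0.005
q_mean, q_std = fosm_discharge(n_mean=0.035, n_std=0.005, b=50.0, h=3.0, s=0.001)
print(f"Q = {q_mean:.0f} +/- {q_std:.0f} m3/s")
```

With more uncertain inputs (dam-break outflow, cross-section geometry), the same Taylor-expansion logic yields a variance for the water surface at every cross section, which is what turns a deterministic inundation map into a probabilistic one.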

  2. Probabilistic flood extent estimates from social media flood observations

    NARCIS (Netherlands)

    Brouwer, Tom; Eilander, Dirk; Van Loenen, Arnejan; Booij, Martijn J.; Wijnberg, Kathelijne M.; Verkade, Jan S.; Wagemaker, Jurjen

    2017-01-01

    The increasing number and severity of floods, driven by phenomena such as urbanization, deforestation, subsidence and climate change, create a growing need for accurate and timely flood maps. In this paper we present and evaluate a method to create deterministic and probabilistic flood maps from

  3. Probabilistic flood extent estimates from social media flood observations

    NARCIS (Netherlands)

    Brouwer, Tom; Eilander, Dirk; Van Loenen, Arnejan; Booij, Martijn J.; Wijnberg, Kathelijne M.; Verkade, Jan S.; Wagemaker, Jurjen

    2017-01-01

    The increasing number and severity of floods, driven by phenomena such as urbanization, deforestation, subsidence and climate change, creates a growing need for accurate and timely flood maps. This research focussed on creating flood maps using user generated content from Twitter. Twitter data has

  4. Improving Global Flood Forecasting using Satellite Detected Flood Extent

    NARCIS (Netherlands)

    Revilla Romero, B.

    2016-01-01

    Flooding is a natural global phenomenon but in many cases is exacerbated by human activity. Although flooding generally affects humans in a negative way, bringing death, suffering, and economic impacts, it also has potentially beneficial effects. Early flood warning and forecasting systems, as well

  5. Green River Formation Water Flood Demonstration Project: Final report. [October 21, 1992-April, 30, 1996

    Energy Technology Data Exchange (ETDEWEB)

    Deo, M.D. [Dept. of Chemical and Fuels Engineering, University of Utah, Salt Lake City (US); Dyer, J.E.; Lomax, J.D. [Inland Resources, Inc., Lomax Exploration Co., Salt Lake City, UT (US); Nielson, D.L.; Lutz, S.J. [Energy and Geoscience Institute at the University of Utah, Salt Lake City (US)

    1996-11-01

    The objectives were to understand the oil production mechanisms in the Monument Butte unit via reservoir characterization and reservoir simulations and to transfer the water flooding technology to similar units in the vicinity, particularly the Travis and the Boundary units. Comprehensive reservoir characterization and reservoir simulations of the Monument Butte, Travis and Boundary units were presented in the two published project yearly reports. The primary and the secondary production from the Monument Butte unit were typical of oil production from an undersaturated oil reservoir close to its bubble point. The water flood in the smaller Travis unit appeared affected by natural and possibly by large interconnecting hydraulic fractures. Water flooding the Boundary unit was considered more complicated due to the presence of an oil-water contact in one of the wells. The reservoir characterization activity in the project basically consisted of extraction and analysis of a full-diameter core, Formation Micro Imaging logs from several wells and Magnetic Resonance Imaging logs from two wells. In addition, several side-wall cores were drilled and analyzed, oil samples from a number of wells were physically and chemically characterized (using gas chromatography), oil-water relative permeabilities were measured and pour points and cloud points of a few oil samples were determined. The reservoir modeling activity comprised reservoir simulation of all three units at different scales and near well-bore modeling of the wax precipitation effects. The reservoir characterization efforts identified new reservoirs in the Travis and the Boundary units. The reservoir simulation activities established the extent of pressurization of the sections of the reservoirs in the immediate vicinity of the Monument Butte unit. This resulted in a major expansion of the unit, and the production from this expanded unit increased from about 300 barrels per day to about 2000 barrels per day.

  6. Green River Formation Water Flood Demonstration Project: Final report, October 21, 1992-April, 30, 1996

    International Nuclear Information System (INIS)

    Deo, M.D.; Dyer, J.E.; Lomax, J.D.; Nielson, D.L.; Lutz, S.J.

    1996-01-01

    The objectives were to understand the oil production mechanisms in the Monument Butte unit via reservoir characterization and reservoir simulations and to transfer the water flooding technology to similar units in the vicinity, particularly the Travis and the Boundary units. Comprehensive reservoir characterization and reservoir simulations of the Monument Butte, Travis and Boundary units were presented in the two published project yearly reports. The primary and the secondary production from the Monument Butte unit were typical of oil production from an undersaturated oil reservoir close to its bubble point. The water flood in the smaller Travis unit appeared affected by natural and possibly by large interconnecting hydraulic fractures. Water flooding the Boundary unit was considered more complicated due to the presence of an oil-water contact in one of the wells. The reservoir characterization activity in the project basically consisted of extraction and analysis of a full-diameter core, Formation Micro Imaging logs from several wells and Magnetic Resonance Imaging logs from two wells. In addition, several side-wall cores were drilled and analyzed, oil samples from a number of wells were physically and chemically characterized (using gas chromatography), oil-water relative permeabilities were measured and pour points and cloud points of a few oil samples were determined. The reservoir modeling activity comprised reservoir simulation of all three units at different scales and near-wellbore modeling of the wax precipitation effects. The reservoir characterization efforts identified new reservoirs in the Travis and the Boundary units. The reservoir simulation activities established the extent of pressurization of the sections of the reservoirs in the immediate vicinity of the Monument Butte unit. This resulted in a major expansion of the unit, and the production from this expanded unit increased from about 300 barrels per day to about 2000 barrels per day.

  7. Big bang and big crunch in matrix string theory

    OpenAIRE

    Bedford, J; Papageorgakis, C; Rodríguez-Gómez, D; Ward, J

    2007-01-01

    Following the holographic description of linear dilaton null Cosmologies with a Big Bang in terms of Matrix String Theory put forward by Craps, Sethi and Verlinde, we propose an extended background describing a Universe including both Big Bang and Big Crunch singularities. This belongs to a class of exact string backgrounds and is perturbative in the string coupling far away from the singularities, both of which can be resolved using Matrix String Theory. We provide a simple theory capable of...

  8. Flood Label for buildings : a tool for more flood-resilient cities

    NARCIS (Netherlands)

    Hartmann, T.; Scheibel, Marc

    2016-01-01

    River floods are among the most expensive natural disasters in Europe. Traditional flood protection methods are not sufficient anymore. It is widely acknowledged in the scholarly debate and in practice of flood risk management that traditional flood protection measures such as dikes need to be

  9. Bliver big data til big business?

    DEFF Research Database (Denmark)

    Ritter, Thomas

    2015-01-01

    Denmark has a digital infrastructure, a registration culture, and IT-competent employees and customers that make a leading position possible, but only if companies get ready for the next big data wave.

  10. Big data uncertainties.

    Science.gov (United States)

    Maugis, Pierre-André G

    2018-07-01

    Big data-the idea that an always-larger volume of information is being constantly recorded-suggests that new problems can now be subjected to scientific scrutiny. However, can classical statistical methods be used directly on big data? We analyze the problem by looking at two known pitfalls of big datasets. First, that they are biased, in the sense that they do not offer a complete view of the populations under consideration. Second, that they present a weak but pervasive level of dependence between all their components. In both cases we observe that the uncertainty of the conclusion obtained by statistical methods is increased when used on big data, either because of a systematic error (bias), or because of a larger degree of randomness (increased variance). We argue that the key challenge raised by big data is not only how to use big data to tackle new problems, but to develop tools and methods able to rigorously articulate the new risks therein. Copyright © 2016. Published by Elsevier Ltd.
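
    The "weak but pervasive dependence" effect described above can be made concrete with a small sketch (illustrative numbers only, not from the article): for n equicorrelated observations with common variance sigma^2 and pairwise correlation rho, the variance of the sample mean is sigma^2/n * (1 + (n-1)*rho), so even a tiny rho keeps the uncertainty from shrinking like 1/n.

    ```python
    # Variance of the sample mean for n observations with common variance
    # sigma^2 and pairwise correlation rho (equicorrelated model).
    def var_of_mean(n, sigma, rho):
        return sigma**2 / n * (1 + (n - 1) * rho)

    n, sigma = 10_000, 1.0
    independent = var_of_mean(n, sigma, rho=0.0)   # classical 1/n shrinkage
    dependent = var_of_mean(n, sigma, rho=0.01)    # weak pervasive dependence

    # With rho = 0.01 the variance of the mean is ~100x the independent value.
    inflation = dependent / independent
    ```

    This is the "increased variance" pitfall in miniature: a big but dependent sample behaves like a much smaller independent one.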

  11. Flooding in imagination vs flooding in vivo: A comparison with agoraphobics

    NARCIS (Netherlands)

    Emmelkamp, Paul M.G.; Wessels, Hemmy

    In this investigation of agoraphobic patients, 3 different flooding procedures were compared: (1) prolonged exposure in vivo, (2) flooding in the imagination by a ‘live’ therapist and (3) a combination of flooding in the imagination and flooding in vivo. After an intermediate test all clients were

  12. Analyzing Big Data in Psychology: A Split/Analyze/Meta-Analyze Approach

    Directory of Open Access Journals (Sweden)

    Mike W.-L. Cheung

    2016-05-01

    Full Text Available Big data is a field that has traditionally been dominated by disciplines such as computer science and business, where mainly data-driven analyses have been performed. Psychology, a discipline in which a strong emphasis is placed on behavioral theories and empirical research, has the potential to contribute greatly to the big data movement. However, one challenge to psychologists – and probably the most crucial one – is that most researchers may not have the necessary programming and computational skills to analyze big data. In this study we argue that psychologists can also conduct big data research and that, rather than trying to acquire new programming and computational skills, they should focus on their strengths, such as performing psychometric analyses and testing theories using multivariate analyses to explain phenomena. We propose a split/analyze/meta-analyze approach that allows psychologists to easily analyze big data. Two real datasets are used to demonstrate the proposed procedures in R. A new research agenda related to the analysis of big data in psychology is outlined at the end of the study.

  13. Analyzing Big Data in Psychology: A Split/Analyze/Meta-Analyze Approach.

    Science.gov (United States)

    Cheung, Mike W-L; Jak, Suzanne

    2016-01-01

    Big data is a field that has traditionally been dominated by disciplines such as computer science and business, where mainly data-driven analyses have been performed. Psychology, a discipline in which a strong emphasis is placed on behavioral theories and empirical research, has the potential to contribute greatly to the big data movement. However, one challenge to psychologists-and probably the most crucial one-is that most researchers may not have the necessary programming and computational skills to analyze big data. In this study we argue that psychologists can also conduct big data research and that, rather than trying to acquire new programming and computational skills, they should focus on their strengths, such as performing psychometric analyses and testing theories using multivariate analyses to explain phenomena. We propose a split/analyze/meta-analyze approach that allows psychologists to easily analyze big data. Two real datasets are used to demonstrate the proposed procedures in R. A new research agenda related to the analysis of big data in psychology is outlined at the end of the study.
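
    The split/analyze/meta-analyze idea above can be sketched outside R as well. The following minimal Python version uses synthetic data, an ordinary-least-squares slope as the effect of interest, and fixed-effect inverse-variance pooling; it is an illustration of the general approach, not the authors' code.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # Hypothetical "big" dataset: outcome y depends linearly on predictor x.
    n = 100_000
    x = rng.normal(size=n)
    y = 0.5 * x + rng.normal(size=n)

    # Split: partition the rows into manageable chunks.
    chunks = np.array_split(np.arange(n), 10)

    # Analyze: estimate the slope and its sampling variance in each chunk.
    estimates, variances = [], []
    for idx in chunks:
        xi, yi = x[idx], y[idx]
        b = np.cov(xi, yi)[0, 1] / np.var(xi)           # OLS slope
        resid = yi - b * xi
        var_b = np.var(resid, ddof=1) / (len(idx) * np.var(xi))
        estimates.append(b)
        variances.append(var_b)

    # Meta-analyze: fixed-effect inverse-variance weighted pooling.
    w = 1 / np.array(variances)
    pooled = np.sum(w * np.array(estimates)) / np.sum(w)
    ```

    The pooled slope recovers the value a single whole-dataset regression would give, while each analysis step only ever touches one chunk.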

  14. Finding the big bang

    CERN Document Server

    Page, Lyman A; Partridge, R Bruce

    2009-01-01

    Cosmology, the study of the universe as a whole, has become a precise physical science, the foundation of which is our understanding of the cosmic microwave background radiation (CMBR) left from the big bang. The story of the discovery and exploration of the CMBR in the 1960s is recalled for the first time in this collection of 44 essays by eminent scientists who pioneered the work. Two introductory chapters put the essays in context, explaining the general ideas behind the expanding universe and fossil remnants from the early stages of the expanding universe. The last chapter describes how the confusion of ideas and measurements in the 1960s grew into the present tight network of tests that demonstrate the accuracy of the big bang theory. This book is valuable to anyone interested in how science is done, and what it has taught us about the large-scale nature of the physical universe.

  15. Unexpected flood loss correlations across Europe

    Science.gov (United States)

    Booth, Naomi; Boyd, Jessica

    2017-04-01

    Floods don't observe country borders, as highlighted by major events across Europe that resulted in heavy economic and insured losses in 1999, 2002, 2009 and 2013. Flood loss correlations between some countries occur along multi-country river systems or between neighbouring nations affected by the same weather systems. However, correlations are not so obvious and whilst flooding in multiple locations across Europe may appear independent, for a re/insurer providing cover across the continent, these unexpected correlations can lead to high loss accumulations. A consistent, continental-scale method that allows quantification and comparison of losses, and identifies correlations in loss between European countries is therefore essential. A probabilistic model for European river flooding was developed that allows estimation of potential losses to pan-European property portfolios. By combining flood hazard and exposure information in a catastrophe modelling platform, we can consider correlations between river basins across Europe rather than being restricted to country boundaries. A key feature of the model is its statistical event set based on extreme value theory. Using historical river flow data, the event set captures spatial and temporal patterns of flooding across Europe and simulates thousands of events representing a full range of possible scenarios. Some known correlations were identified, such as between neighbouring Belgium and Luxembourg where 28% of events that affect either country produce a loss in both. However, our model identified some unexpected correlations including between Austria and Poland, and Poland and France, which are geographically distant. These correlations in flood loss may be missed by traditional methods and are key for re/insurers with risks in multiple countries. The model also identified that 46% of European river flood events affect more than one country. 
For more extreme events with a return period higher than 200 years, all events
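
    The kind of joint-occurrence statistic quoted above (e.g. the 28% figure for Belgium and Luxembourg) can be illustrated with a toy stochastic event set. The shared "weather driver" and all numbers below are invented for the sketch, not taken from the model described in the abstract.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Toy event set: per-event losses in two countries, with a common
    # weather driver inducing positive correlation between them.
    n_events = 50_000
    shared = rng.gamma(2.0, size=n_events)
    loss_a = np.maximum(shared + rng.normal(0, 1, n_events) - 3.0, 0)
    loss_b = np.maximum(shared + rng.normal(0, 1, n_events) - 3.0, 0)

    hit_a, hit_b = loss_a > 0, loss_b > 0
    either, both = hit_a | hit_b, hit_a & hit_b

    # Share of events affecting either country that produce a loss in both.
    joint_share = both.sum() / either.sum()

    # Baseline share if the two countries' losses were independent.
    p_a, p_b = hit_a.mean(), hit_b.mean()
    indep_share = p_a * p_b / (p_a + p_b - p_a * p_b)
    ```

    Because of the shared driver, `joint_share` comes out well above the independent baseline, which is exactly the accumulation risk a continental re/insurer would miss by treating countries separately.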

  16. Dealing with Uncertainty in Flood Management Through Diversification

    Directory of Open Access Journals (Sweden)

    Jeroen C. J. H. Aerts

    2008-06-01

    Full Text Available This paper shows, through a numerical example, how to develop portfolios of flood management activities that generate the highest return under an acceptable risk for an area in the central part of the Netherlands. The paper shows a method based on Modern Portfolio Theory (MPT) that contributes to developing flood management strategies. MPT aims at finding sets of investments that diversify risks, thereby reducing the overall risk of the total portfolio of investments. This paper shows that through systematically combining four different flood protection measures into portfolios containing three or four measures, risk is reduced compared with portfolios that only contain one or two measures. Adding partly uncorrelated measures to the portfolio diversifies risk. We demonstrate how MPT encourages a systematic discussion of the relationship between the return and risk of individual flood mitigation activities and the return and risk of complete portfolios. It is also shown how important it is to understand the correlation of the returns of various flood management activities. The MPT approach, therefore, fits well with the notion of adaptive water management, which perceives the future as inherently uncertain. Through applying MPT to flood protection strategies, current vulnerability will be reduced by diversifying risk.
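
    A minimal numerical sketch of the MPT mechanism the paper relies on: the returns, volatilities and correlations below are invented for four hypothetical flood measures, but they show how an equally weighted portfolio of partly uncorrelated measures carries less risk than any single measure.

    ```python
    import numpy as np

    # Hypothetical annual "returns" (risk reduction) and volatilities of four
    # flood measures, plus an assumed correlation structure (all illustrative).
    mu = np.array([0.04, 0.05, 0.06, 0.05])
    sd = np.array([0.08, 0.10, 0.12, 0.09])
    corr = np.array([
        [1.0, 0.3, 0.1, 0.0],
        [0.3, 1.0, 0.2, 0.1],
        [0.1, 0.2, 1.0, 0.2],
        [0.0, 0.1, 0.2, 1.0],
    ])
    cov = np.outer(sd, sd) * corr  # covariance matrix

    def portfolio_risk(w):
        """Standard deviation of a portfolio with weight vector w."""
        return float(np.sqrt(w @ cov @ w))

    # A one-measure "portfolio" vs. an equally weighted four-measure portfolio.
    single = portfolio_risk(np.array([0.0, 1.0, 0.0, 0.0]))
    diversified = portfolio_risk(np.full(4, 0.25))
    ```

    With these numbers the diversified portfolio's risk is below even the least volatile single measure, which is the diversification effect the paper exploits.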

  17. Flooding and Schools

    Science.gov (United States)

    National Clearinghouse for Educational Facilities, 2011

    2011-01-01

    According to the Federal Emergency Management Agency, flooding is the nation's most common natural disaster. Some floods develop slowly during an extended period of rain or in a warming trend following a heavy snow. Flash floods can occur quickly, without any visible sign of rain. Catastrophic floods are associated with burst dams and levees,…

  18. HARNESSING BIG DATA VOLUMES

    Directory of Open Access Journals (Sweden)

    Bogdan DINU

    2014-04-01

    Full Text Available Big Data can revolutionize humanity. Hidden within the huge amounts and variety of the data we are creating, we may find information, facts, social insights and benchmarks that were once virtually impossible to find or were simply inexistent. Large volumes of data allow organizations to tap in real time the full potential of all the internal or external information they possess. Big data calls for quick decisions and innovative ways to assist customers and society as a whole. Big data platforms and product portfolios will help customers harness the full value of big data volumes. This paper deals with technical and technological issues related to handling big data volumes in the Big Data environment.

  19. The Financial Benefit of Early Flood Warnings in Europe

    Science.gov (United States)

    Pappenberger, Florian; Cloke, Hannah L.; Wetterhall, Fredrik; Parker, Dennis J.; Richardson, David; Thielen, Jutta

    2015-04-01

    Effective disaster risk management relies on science-based solutions to close the gap between prevention and preparedness measures. The outcomes of consultations on the UNISDR post-2015 framework for disaster risk reduction highlight the need for cross-border early warning systems to strengthen the preparedness phases of disaster risk management, in order to save people's lives and property and reduce the overall impact of severe events. In particular, continental- and global-scale flood forecasting systems provide vital information to various decision makers with which early warnings of floods can be made. Here the potential monetary benefits of early flood warnings are calculated using the example of the European Flood Awareness System (EFAS), based on pan-European flood damage data and calculations of potential flood damage reductions. The benefits are of the order of 400 Euro for every 1 Euro invested. Because of the uncertainties which accompany the calculation, a large sensitivity analysis is performed in order to develop an envelope of possible financial benefits. Current EFAS system skill is compared against perfect forecasts to demonstrate the importance of further improving the skill of the forecasts. Improving the response to warnings is also essential in reaping the benefits of flood early warnings.

  20. Effect of Urban Green Spaces and Flooded Area Type on Flooding Probability

    Directory of Open Access Journals (Sweden)

    Hyomin Kim

    2016-01-01

    Full Text Available Countermeasures to urban flooding should consider long-term perspectives, because climate change impacts are unpredictable and complex. Urban green spaces have emerged as a potential option to reduce urban flood risks, and their effectiveness has been highlighted in notable urban water management studies. In this study, flooded areas in Seoul, Korea, were divided into four flooded area types by cluster analysis based on topographic and physical characteristics and verified using discriminant analysis. After division by flooded area type, logistic regression analysis was performed to determine how the flooding probability changes with variations in green space area. Type 1 included regions where flooding occurred in a drainage basin that had a flood risk management infrastructure (FRMI). In Type 2, the slope was steep; the TWI (Topographic Wetness Index) was relatively low; and soil drainage was favorable. Type 3 represented the gentlest sloping areas, and these were associated with the highest TWI values. In addition, these areas had the worst soil drainage. Type 4 had moderate slopes, imperfect soil drainage and lower than average TWI values. We found that green spaces exerted a considerable influence on urban flooding probabilities in Seoul, and flooding probabilities could be reduced by over 50% depending on the green space area and the locations where green spaces were introduced. Increasing the area of green spaces was the most effective method of decreasing flooding probability in Type 3 areas. In Type 2 areas, the maximum hourly precipitation affected the flooding probability significantly, and the flooding probability in these areas was high despite the extensive green space area. These findings can contribute towards establishing guidelines for urban spatial planning to respond to urban flooding.
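
    The logistic-regression relationship described above can be illustrated with made-up coefficients (not the fitted Seoul values): a negative green-space coefficient drives the flooding probability down, and with these numbers the relative reduction exceeds the 50% mentioned in the abstract.

    ```python
    import math

    # Illustrative logistic model of flooding probability as a function of
    # the green-space share g. The coefficients b0, b1 are invented for the
    # sketch; a negative b1 means more green space lowers the probability.
    b0, b1 = 0.5, -6.0

    def flood_probability(g):
        return 1.0 / (1.0 + math.exp(-(b0 + b1 * g)))

    p_low = flood_probability(0.05)    # 5% green-space share
    p_high = flood_probability(0.30)   # 30% green-space share
    reduction = 1 - p_high / p_low     # relative reduction in probability
    ```

    Under these assumed coefficients, raising the green-space share from 5% to 30% cuts the modeled flooding probability by more than half.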

  1. Entering the ‘big data’ era in medicinal chemistry: molecular promiscuity analysis revisited

    Science.gov (United States)

    Hu, Ye; Bajorath, Jürgen

    2017-01-01

    The ‘big data’ concept plays an increasingly important role in many scientific fields. Big data involves more than unprecedentedly large volumes of data that become available. Different criteria characterizing big data must be carefully considered in computational data mining, as we discuss herein focusing on medicinal chemistry. This is a scientific discipline where big data is beginning to emerge and provide new opportunities. For example, the ability of many drugs to specifically interact with multiple targets, termed promiscuity, forms the molecular basis of polypharmacology, a hot topic in drug discovery. Compound promiscuity analysis is an area that is much influenced by big data phenomena. Different results are obtained depending on chosen data selection and confidence criteria, as we also demonstrate. PMID:28670471

  2. Using cost-benefit concepts in design floods improves communication of uncertainty

    Science.gov (United States)

    Ganora, Daniele; Botto, Anna; Laio, Francesco; Claps, Pierluigi

    2017-04-01

    Flood frequency analysis, i.e. the study of the relationships between the magnitude and the rarity of high flows in a river, is the usual procedure adopted to assess flood hazard, preliminary to the planning/design of flood protection measures. It is grounded in fitting a probability distribution to the peak discharge values recorded in gauging stations, and the final estimates over a region are thus affected by uncertainty, due to the limited sample availability and to the possible alternatives in terms of the probabilistic model and the parameter estimation methods used. In the last decade, the scientific community dealt with this issue by developing a number of methods to quantify such uncertainty components. Usually, uncertainty is visually represented through confidence bands, which are easy to understand but have not yet been demonstrated to be useful for design purposes: they usually disorient decision makers, as the design flood is no longer univocally defined, making the decision process undetermined. These considerations motivated the development of the uncertainty-compliant design flood estimator (UNCODE) procedure (Botto et al., 2014), which allows one to select meaningful flood design values accounting for the associated uncertainty by considering additional constraints based on cost-benefit criteria. This method suggests an explicit multiplication factor that corrects the traditional (without uncertainty) design flood estimates to incorporate the effects of uncertainty in the estimate at the same safety level. Even though the UNCODE method was developed for design purposes, it can represent a powerful and robust tool to help clarify the effects of uncertainty in statistical estimation. As the process produces increased design flood estimates, this outcome demonstrates how uncertainty leads to more expensive flood protection measures, or to the insufficiency of current defenses.
Moreover, the UNCODE approach can be used to assess the "value" of data, as the costs
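
    A toy cost-benefit selection of a design flood in the spirit of the UNCODE idea described above: the Gumbel parameters and cost figures are invented for the sketch, not taken from Botto et al. (2014). Representing estimation uncertainty crudely as a wider flood-frequency distribution shifts the cost-optimal design discharge upward, i.e. it implies a multiplication factor greater than one on the traditional design value.

    ```python
    import math

    def exceedance(q, loc=100.0, scale=30.0):
        """Annual probability that discharge exceeds q (Gumbel distribution)."""
        return 1.0 - math.exp(-math.exp(-(q - loc) / scale))

    def total_cost(q, scale, unit_cost=500.0, damage=1e6):
        """Construction cost grows with q; expected annual damage falls with it."""
        return unit_cost * q + damage * exceedance(q, scale=scale)

    grid = [100 + i for i in range(500)]

    # Best-estimate design flood, and the design flood when estimation
    # uncertainty is (crudely) represented by a wider distribution.
    q_star = min(grid, key=lambda q: total_cost(q, scale=30.0))
    q_star_unc = min(grid, key=lambda q: total_cost(q, scale=40.0))

    # Cost-benefit optimum under uncertainty: a factor > 1 on the design value.
    factor = q_star_unc / q_star
    ```

    The grid search simply trades rising protection cost against falling expected damage; any one-dimensional minimizer would do the same job.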

  3. Big bang and big crunch in matrix string theory

    International Nuclear Information System (INIS)

    Bedford, J.; Ward, J.; Papageorgakis, C.; Rodriguez-Gomez, D.

    2007-01-01

    Following the holographic description of linear dilaton null cosmologies with a big bang in terms of matrix string theory put forward by Craps, Sethi, and Verlinde, we propose an extended background describing a universe including both big bang and big crunch singularities. This belongs to a class of exact string backgrounds and is perturbative in the string coupling far away from the singularities, both of which can be resolved using matrix string theory. We provide a simple theory capable of describing the complete evolution of this closed universe

  4. Big data a primer

    CERN Document Server

    Bhuyan, Prachet; Chenthati, Deepak

    2015-01-01

    This book is a collection of chapters written by experts on various aspects of big data. The book aims to explain what big data is and how it is stored and used. The book starts from the fundamentals and builds up from there. It is intended to serve as a review of the state-of-the-practice in the field of big data handling. The traditional framework of relational databases can no longer provide appropriate solutions for handling big data and making it available and useful to users scattered around the globe. The study of big data covers a wide range of issues including management of heterogeneous data, big data frameworks, change management, finding patterns in data usage and evolution, data as a service, service-generated data, service management, privacy and security. All of these aspects are touched upon in this book. It also discusses big data applications in different domains. The book will prove useful to students, researchers, and practicing database and networking engineers.

  5. Data assimilation of citizen collected information for real-time flood hazard mapping

    Science.gov (United States)

    Sayama, T.; Takara, K. T.

    2017-12-01

    Many studies of data assimilation in hydrology have focused on the integration of satellite remote sensing and in-situ monitoring data into hydrologic or land surface models. For flood prediction, recent studies have also demonstrated the assimilation of remotely sensed inundation information into flood inundation models. In actual flood disaster situations, citizen-collected information, including local reports by residents and rescue teams and, more recently, tweets via social media, also contains valuable information. The main interest of this study is how to use such citizen-collected information effectively for real-time flood hazard mapping. Here we propose a new data assimilation technique based on pre-conducted ensemble inundation simulations that updates inundation depth distributions sequentially as local data become available. The proposed method is composed of two steps. The first step is based on a weighted average of preliminary ensemble simulations, whose weights are updated by a Bayesian approach. The second step is based on an optimal interpolation, where the covariance matrix is calculated from the ensemble simulations. The proposed method was applied to case studies including an actual flood event. Two situations are considered: a more idealized one, which assumes that continuous flood inundation depth information is available at multiple locations, and a more realistic one for a severe flood disaster, which assumes that only uncertain and non-continuous information is available to be assimilated. The results show that, in the idealized situation, the large-scale inundation during the flooding was estimated reasonably in terms of RMSE. The applications of the proposed data assimilation method demonstrated its high potential for assimilating citizen-collected information for real-time flood hazard mapping in the future.
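
    The first (Bayesian weighting) step can be sketched as follows. The ensemble, grid size, citizen report and Gaussian error model below are all hypothetical stand-ins for the pre-conducted simulations and local reports the abstract describes.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Hypothetical pre-computed ensemble: 50 inundation-depth maps on a
    # 20x20 grid (the "preliminary ensemble simulations" of step 1).
    n_ens, ny, nx = 50, 20, 20
    ensemble = rng.gamma(2.0, 0.5, size=(n_ens, ny, nx))

    # A citizen report: observed depth at one grid cell, with an error sd.
    obs_iy, obs_ix, obs_depth, obs_sd = 10, 10, 1.2, 0.3

    # Bayesian update of the (initially uniform) member weights using a
    # Gaussian likelihood of the report given each member's simulated depth.
    weights = np.ones(n_ens) / n_ens
    lik = np.exp(-0.5 * ((ensemble[:, obs_iy, obs_ix] - obs_depth) / obs_sd) ** 2)
    weights *= lik
    weights /= weights.sum()

    # Weighted-average inundation map, conditioned on the citizen report.
    analysis = np.tensordot(weights, ensemble, axes=1)
    ```

    Each new report would repeat the likelihood update, so the weighted map is refined sequentially as information arrives; the paper's second step (optimal interpolation) would then spread the correction spatially via the ensemble covariance.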

  6. Phantom cosmology without Big Rip singularity

    Energy Technology Data Exchange (ETDEWEB)

    Astashenok, Artyom V. [Baltic Federal University of I. Kant, Department of Theoretical Physics, 236041, 14, Nevsky st., Kaliningrad (Russian Federation); Nojiri, Shin'ichi, E-mail: nojiri@phys.nagoya-u.ac.jp [Department of Physics, Nagoya University, Nagoya 464-8602 (Japan); Kobayashi-Maskawa Institute for the Origin of Particles and the Universe, Nagoya University, Nagoya 464-8602 (Japan); Odintsov, Sergei D. [Department of Physics, Nagoya University, Nagoya 464-8602 (Japan); Institucio Catalana de Recerca i Estudis Avancats - ICREA and Institut de Ciencies de l'Espai (IEEC-CSIC), Campus UAB, Facultat de Ciencies, Torre C5-Par-2a pl, E-08193 Bellaterra (Barcelona) (Spain); Tomsk State Pedagogical University, Tomsk (Russian Federation); Yurov, Artyom V. [Baltic Federal University of I. Kant, Department of Theoretical Physics, 236041, 14, Nevsky st., Kaliningrad (Russian Federation)

    2012-03-23

    We construct phantom energy models with the equation of state parameter w less than -1, w<-1, in which a finite-time future singularity does not occur. Such models can be divided into two classes: (i) energy density increases with time ('phantom energy' without 'Big Rip' singularity) and (ii) energy density tends to a constant value with time ('cosmological constant' with asymptotically de Sitter evolution). The disintegration of bound structures is confirmed in Little Rip cosmology. Surprisingly, we find that such disintegration (on the example of the Sun-Earth system) may occur even in an asymptotically de Sitter phantom universe consistent with observational data. We also demonstrate that non-singular phantom models admit wormhole solutions, as well as the possibility of a Big Trip via wormholes.

  7. Phantom cosmology without Big Rip singularity

    International Nuclear Information System (INIS)

    Astashenok, Artyom V.; Nojiri, Shin'ichi; Odintsov, Sergei D.; Yurov, Artyom V.

    2012-01-01

    We construct phantom energy models with the equation of state parameter w less than -1, w<-1, in which a finite-time future singularity does not occur. Such models can be divided into two classes: (i) energy density increases with time (“phantom energy” without “Big Rip” singularity) and (ii) energy density tends to a constant value with time (“cosmological constant” with asymptotically de Sitter evolution). The disintegration of bound structures is confirmed in Little Rip cosmology. Surprisingly, we find that such disintegration (on the example of the Sun-Earth system) may occur even in an asymptotically de Sitter phantom universe consistent with observational data. We also demonstrate that non-singular phantom models admit wormhole solutions, as well as the possibility of a Big Trip via wormholes.
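
    As a minimal illustration of how w < -1 can avoid a finite-time singularity, consider the standard "Little Rip" form (a textbook example of class (i), not the specific models constructed in the paper): the Hubble rate grows without bound, but only as t goes to infinity.

    ```latex
    % Little Rip: H grows without bound, but only as t -> infinity,
    % so no finite-time (Big Rip) singularity occurs.
    H(t) = H_0\, e^{\lambda t}, \qquad
    a(t) = a_0 \exp\!\Big[\tfrac{H_0}{\lambda}\big(e^{\lambda t} - 1\big)\Big]

    % Effective equation of state: always below -1, approaching -1 from below.
    w(t) \;=\; -1 - \frac{2\dot H}{3H^2}
          \;=\; -1 - \frac{2\lambda}{3H_0}\, e^{-\lambda t} \;<\; -1
    ```

    Since H is finite at every finite t, the scale factor and energy density stay finite at all times even though the expansion is super-accelerating.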

  8. Microsoft big data solutions

    CERN Document Server

    Jorgensen, Adam; Welch, John; Clark, Dan; Price, Christopher; Mitchell, Brian

    2014-01-01

    Tap the power of Big Data with Microsoft technologies Big Data is here, and Microsoft's new Big Data platform is a valuable tool to help your company get the very most out of it. This timely book shows you how to use HDInsight along with HortonWorks Data Platform for Windows to store, manage, analyze, and share Big Data throughout the enterprise. Focusing primarily on Microsoft and HortonWorks technologies but also covering open source tools, Microsoft Big Data Solutions explains best practices, covers on-premises and cloud-based solutions, and features valuable case studies. Best of all,

  9. Economic Levers for Mitigating Interest Flooding Attack in Named Data Networking

    Directory of Open Access Journals (Sweden)

    Licheng Wang

    2017-01-01

    Full Text Available As a kind of unwelcome, unavoidable, and malicious behavior, distributed denial of service (DDoS) is an ongoing issue in today's Internet as well as in some newly conceived future Internet architectures. Recently, a first step was made towards assessing DDoS attacks in Named Data Networking (NDN), one of the promising Internet architectures in the upcoming big data era. Among such attacks, the interest flooding attack (IFA) has become one of the most serious problems. Inspired by the extensive study on the possibility of mitigating DDoS in today's Internet by employing micropayments, in this paper we address the possibility of introducing economic levers, such as a dynamic pricing mechanism, for regulating IFA in NDN.

  10. Summary big data

    CERN Document Server

    2014-01-01

    This work offers a summary of the book "Big Data: A Revolution That Will Transform How We Live, Work, and Think" by Viktor Mayer-Schönberger and Kenneth Cukier. The summary explains that big data is where we use huge quantities of data to make better predictions, based on the fact that we identify patterns in the data rather than trying to understand the underlying causes in more detail. It highlights that big data will be a source of new economic value and innovation in the future. Moreover, it shows that it will

  11. -Omic and Electronic Health Record Big Data Analytics for Precision Medicine.

    Science.gov (United States)

    Wu, Po-Yen; Cheng, Chih-Wen; Kaddi, Chanchala D; Venugopalan, Janani; Hoffman, Ryan; Wang, May D

    2017-02-01

    Rapid advances of high-throughput technologies and wide adoption of electronic health records (EHRs) have led to fast accumulation of -omic and EHR data. These voluminous complex data contain abundant information for precision medicine, and big data analytics can extract such knowledge to improve the quality of healthcare. In this paper, we present -omic and EHR data characteristics, associated challenges, and data analytics including data preprocessing, mining, and modeling. To demonstrate how big data analytics enables precision medicine, we provide two case studies, including identifying disease biomarkers from multi-omic data and incorporating -omic information into EHR. Big data analytics is able to address -omic and EHR data challenges for paradigm shift toward precision medicine. Big data analytics makes sense of -omic and EHR data to improve healthcare outcome. It has long lasting societal impact.

  12. Tangible Results and Progress in Flood Risks Management with the PACTES Initiative

    Science.gov (United States)

    Costes, Murielle; Abadie, Jean-Paul; Ducuing, Jean-Louis; Denier, Jean-Paul; Stéphane

    The PACTES project (Prévention et Anticipation des Crues au moyen des Techniques Spatiales), initiated by CNES and the French Ministry of Research, aims at improving flood risk management over the following three main phases: - Prevention: support and facilitate the analysis of flood risks and socio-economic impacts (risk - Forecasting and alert: improve the capability to predict and anticipate the flooding event - Crisis management: allow better situation awareness, communication and sharing of In order to achieve its ambitious objectives, PACTES: - integrates state-of-the-art techniques and systems (integration of the overall processing chains, - takes advantage of integrating recent model developments in weather forecasting, rainfall, In this approach, space technology is thus used in three main ways: - radar and optical earth observation data are used to produce Digital Elevation Maps, land use - earth observation data are also an input to weather forecasting, together with ground sensors; - satellite-based telecommunication and mobile positioning. Started in December 2000, the approach taken in PACTES has been to work closely with users such as civil security and civil protection organisations, fire fighter brigades and city councils for requirements gathering and during the validation phase. It has led to the development and experimentation of an integrated pre-operational demonstrator, delivered to different types of operational users. Experimentation has taken place in three watersheds representative of different types of floods (flash and plain floods). After a brief reminder of the PACTES project organization and aims, the PACTES integrated pre-operational demonstrator is presented. The main scientific inputs to flood risk management are summarized. Validation studies for the three watersheds covered by PACTES (Moselle, Hérault and Thoré) are detailed. Feedback on the PACTES tangible results on flood risk management from a user point of view

  13. Optical and Physical Methods for Mapping Flooding with Satellite Imagery

    Science.gov (United States)

    Fayne, Jessica; Bolten, John; Lakshmi, Venkat; Ahamed, Aakash

    2016-01-01

    Flood and surface water mapping is becoming increasingly necessary, as extreme flooding events worldwide can damage crop yields and contribute to billions of dollars in economic damage, as well as social effects including fatalities and destroyed communities (Xiao et al. 2004; Kwak et al. 2015; Mueller et al. 2016). Utilizing earth observing satellite data to map standing water from space is indispensable to flood mapping for disaster response, mitigation, prevention, and warning (McFeeters 1996; Brakenridge and Anderson 2006). Since the early 1970s (Landsat, USGS 2013), researchers have been able to remotely sense surface processes such as extreme flood events to help offset some of these problems. Researchers have demonstrated countless methods, and modifications of those methods, to help increase knowledge of areas at risk and areas that are flooded, using remote sensing data from optical and radar systems as well as both freely available and costly commercial datasets.
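    The optical index approach cited in this record (McFeeters 1996) can be sketched in a few lines of NumPy. This is a minimal illustration, not the authors' pipeline; the zero threshold and toy reflectance values are assumptions:

```python
import numpy as np

def ndwi(green, nir, eps=1e-9):
    """McFeeters (1996) Normalized Difference Water Index:
    NDWI = (Green - NIR) / (Green + NIR); water pixels tend to NDWI > 0."""
    green = np.asarray(green, dtype=float)
    nir = np.asarray(nir, dtype=float)
    return (green - nir) / (green + nir + eps)  # eps avoids division by zero

def water_mask(green, nir, threshold=0.0):
    """Boolean mask of likely standing water (threshold is illustrative)."""
    return ndwi(green, nir) > threshold

# Toy 2x2 reflectance scene: water reflects more green than NIR,
# vegetation and soil the opposite.
green = np.array([[0.30, 0.05], [0.28, 0.06]])
nir   = np.array([[0.05, 0.40], [0.06, 0.35]])
mask = water_mask(green, nir)
# mask -> [[True, False], [True, False]]
```

    In practice the same index is computed per pixel on calibrated bands (e.g. Landsat or Sentinel-2), and the threshold is tuned per scene.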

  14. Flood Risk and Flood hazard maps - Visualisation of hydrological risks

    International Nuclear Information System (INIS)

    Spachinger, Karl; Dorner, Wolfgang; Metzka, Rudolf; Serrhini, Kamal; Fuchs, Sven

    2008-01-01

    Hydrological models are an important basis of flood forecasting and early warning systems. They provide significant data on hydrological risks. In combination with other modelling techniques, such as hydrodynamic models, they can be used to assess the extent and impact of hydrological events. The new European Flood Directive requires all member states to evaluate flood risk on a catchment scale, to compile maps of flood hazard and flood risk for prone areas, and to inform the local level about these risks. Flood hazard and flood risk maps are important tools to communicate flood risk to different target groups. They provide compiled information to relevant public bodies such as water management authorities, municipalities, or civil protection agencies, but also to the broader public. For almost every section of a river basin, run-off and water levels can be defined based on the likelihood of annual recurrence, using a combination of hydrological and hydrodynamic models, supplemented by an analysis of historical records and mappings. In combination with data on the vulnerability of a region, risk maps can be derived. The project RISKCATCH addressed these issues of hydrological risk and vulnerability assessment, focusing on the flood risk management process. Flood hazard maps and flood risk maps were compiled for Austrian and German test sites, taking into account existing national and international guidelines. These maps were evaluated by eye-tracking using experimental graphic semiology. Sets of small-scale as well as large-scale risk maps were presented to test persons in order to (1) study reading behaviour as well as understanding and (2) deduce the most attractive components that are essential for target-oriented risk communication. A cognitive survey asking for negative and positive aspects and the complexity of each single map complemented the experimental graphic semiology. The results indicate how risk maps can be improved to fit the needs of different user groups.

  15. Spatiotemporal hazard mapping of a flood event "migration" in a transboundary river basin as an operational tool in flood risk management

    Science.gov (United States)

    Perrou, Theodora; Papastergios, Asterios; Parcharidis, Issaak; Chini, Marco

    2017-10-01

    Flood disaster is one of the most damaging disasters in the world. It is necessary to monitor and evaluate flood disasters in order to mitigate their consequences. As floods do not recognize borders, transboundary flood risk management is imperative in shared river basins. Disaster management is highly dependent on early information and requires data from the whole river basin. Based on the hypothesis that flood events over the same area with the same magnitude have an almost identical evolution, it is crucial to develop a repository database of historical flood events. In the case of extended transboundary river basins, such a tool could constitute an operational warning system for the downstream area. The utility of SAR images for flood mapping was demonstrated by previous studies, but the SAR systems in orbit were not characterized by high operational capacity. The Copernicus system will fill this gap in operational services for risk management, especially during the emergency phase. Operational capabilities have been significantly improved by newly available satellite constellations, such as the Sentinel-1 A/B mission, which provides systematic acquisitions with a very high temporal resolution over a wide swath coverage. The present study deals with the monitoring of a transboundary flood event in the Evros basin. The objective of the study is to create the "migration story" of the flooded areas on the basis of their evolution in time for the event that occurred from October 2014 till May 2015. Flood hazard maps will be created using SAR-based semi-automatic algorithms; then, through the synthesis of the related maps in a GIS system, a spatiotemporal thematic map of the event will be produced. The thematic map, combined with the TanDEM-X DEM (12 m/pixel spatial resolution), will define the non-affected areas, which is very useful information for the emergency planning and emergency response phases. The Sentinels meet the main requirements to be an effective and suitable data source for this purpose.
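    As a rough illustration of how such a "migration story" could be assembled (this is not the authors' semi-automatic algorithm): open water appears dark in SAR backscatter, so a simple per-date threshold yields flood masks, which can then be stacked into a first-flooded-date map. The -15 dB threshold and the toy arrays below are assumptions:

```python
import numpy as np

def flood_mask(sigma0_db, threshold_db=-15.0):
    """Flag pixels darker than a backscatter threshold as open water.
    Real algorithms estimate the threshold per scene; -15 dB is a placeholder."""
    return np.asarray(sigma0_db) < threshold_db

def migration_map(masks):
    """Label each pixel with the 1-based index of the first acquisition
    in which it appears flooded; 0 means never flooded."""
    out = np.zeros(np.shape(masks[0]), dtype=int)
    for i, mask in enumerate(masks, start=1):
        out[(out == 0) & mask] = i
    return out

# Two toy acquisitions (dB): the flood grows between date 1 and date 2.
date1 = np.array([[-20.0, -10.0], [-8.0, -9.0]])
date2 = np.array([[-20.0, -18.0], [-8.0, -9.0]])
story = migration_map([flood_mask(date1), flood_mask(date2)])
# story -> [[1, 2], [0, 0]]
```

    Rendering such a labelled raster over a DEM in a GIS gives the spatiotemporal thematic map the record describes.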

  16. Modeling of Flood Risk for the Continental United States

    Science.gov (United States)

    Lohmann, D.; Li, S.; Katz, B.; Goteti, G.; Kaheil, Y. H.; Vojjala, R.

    2011-12-01

    The science of catastrophic risk modeling helps people to understand the physical and financial implications of natural catastrophes (hurricanes, floods, earthquakes, etc.), terrorism, and the risks associated with changes in life expectancy. As such it depends on simulation techniques that integrate multiple disciplines such as meteorology, hydrology, structural engineering, statistics, computer science, financial engineering, and actuarial science, in virtually every field of technology. In this talk we explain the techniques and underlying assumptions behind building the RMS US flood risk model. We pay particular attention to correlation (spatial and temporal), simulation, and uncertainty in each of the various components of the development process. Recent extreme floods (e.g. the 2008 US Midwest flood and the 2010 US Northeast flood) have heightened concern about flood risk. Consequently, there is a growing need to adequately assess flood risk. The RMS flood hazard model mainly comprises three major components. (1) A stochastic precipitation simulation module based on a Monte Carlo analogue technique, which is capable of producing correlated rainfall events for the continental US. (2) A rainfall-runoff and routing module. A semi-distributed rainfall-runoff model was developed to properly assess the antecedent conditions and determine the saturation area and runoff. The runoff is further routed downstream along the rivers by a routing model. Combined with the precipitation model, this allows us to correlate streamflow, and hence flooding, from different rivers, as well as low and high return periods across the continental US. (3) A flood inundation module. It transforms the discharge (output from the flow routing) into water level, which is further combined with a two-dimensional off-floodplain inundation model to produce a comprehensive flood hazard map. The performance of the model is demonstrated by comparison with observations and published data.
Output from

  17. The Big Fish Down Under: Examining Moderators of the "Big-Fish-Little-Pond" Effect for Australia's High Achievers

    Science.gov (United States)

    Seaton, Marjorie; Marsh, Herbert W.; Yeung, Alexander Seeshing; Craven, Rhonda

    2011-01-01

    Big-fish-little-pond effect (BFLPE) research has demonstrated that academic self-concept is negatively affected by attending high-ability schools. This article examines data from large, representative samples of 15-year-olds from each Australian state, based on the three Program for International Student Assessment (PISA) databases that focus on…

  18. Floods and climate: emerging perspectives for flood risk assessment and management

    DEFF Research Database (Denmark)

    Merz, B.; Aerts, J.; Arnbjerg-Nielsen, Karsten

    2014-01-01

    Flood estimation and flood management have traditionally been the domain of hydrologists, water resources engineers and statisticians, and disciplinary approaches abound. Dominant views have been shaped; one example is the catchment perspective: floods are formed and influenced by the interaction […] context of floods. We come to the following conclusions: (1) extending the traditional system boundaries (local catchment, recent decades, hydrological/hydraulic processes) opens up exciting possibilities for better understanding and improved tools for flood risk assessment and management. (2) Statistical […], and this variation may be partially quantifiable and predictable, with the perspective of dynamic, climate-informed flood risk management. (4) Efforts are needed to fully account for factors that contribute to changes in all three risk components (hazard, exposure, vulnerability) and to better understand […]

  19. Modeling and Analysis in Marine Big Data: Advances and Challenges

    Directory of Open Access Journals (Sweden)

    Dongmei Huang

    2015-01-01

    Full Text Available It is well recognized that big data has gathered tremendous attention from academic research institutes, governments, and enterprises in all aspects of information sciences. With the development of diverse marine data acquisition techniques, marine data have grown exponentially in the last decade, forming marine big data. As an innovation, marine big data is a double-edged sword. On the one hand, there are many potential and highly useful values hidden in the huge volume of marine data, which are widely used in marine-related fields, such as tsunami and red-tide warning, prevention and forecasting, disaster inversion, and visualization modeling after disasters. There is no doubt that future competition in marine sciences and technologies will converge on marine data exploration. On the other hand, marine big data also brings about many new challenges in data management, such as difficulties in data capture, storage, analysis, and application, as well as data quality control and data security. To highlight theoretical methodologies and practical applications of marine big data, this paper presents a broad view of marine big data and its management, surveys key methods and models, introduces an engineering instance that demonstrates the management architecture, and discusses the existing challenges.

  20. Novel flood risk assessment framework for rapid decision making

    Science.gov (United States)

    Valyrakis, Manousos; Koursari, Eftychia; Solley, Mark

    2016-04-01

    The impacts of catastrophic flooding have significantly increased over the last few decades. This is primarily due to increased urbanisation in ever-expanding mega-cities, as well as to the intensification, in both magnitude and frequency, of extreme hydrologic events. Herein a novel conceptual framework is presented that incorporates the use of real-time information to inform and update low-dimensionality hydraulic models, allowing rapid decision making towards preventing loss of life and safeguarding critical infrastructure. In particular, a case study from the recent UK floods in the area of Whitesands (Dumfries) is presented to demonstrate the utility of this approach. It is demonstrated that effectively combining a wealth of readily available qualitative information (such as crowdsourced visual documentation or live data from sensing techniques) with existing quantitative data can help appropriately update hydraulic models and reduce modelling uncertainties in future flood risk assessments. This approach is even more useful in cases where hydraulic models are limited, do not exist, or were not needed before unpredicted dynamic modifications to the river system took place (for example, in the case of reduced or eliminated hydraulic capacity due to blockages). The low computational cost and rapid assessment this framework offers render it promising for innovating in flood management.

  1. On the flood forecasting at the Bulgarian part of Struma River Basin

    International Nuclear Information System (INIS)

    Dimitrov, Dobri

    2004-01-01

    The Struma is a mountain river flowing from north to south, from Bulgaria through Greece to the Aegean Sea. It generates flash floods of mixed snowmelt-rainfall type, mainly in late spring. Flood forecasting there is needed to improve flood mitigation measures in the Bulgarian part of the basin, as well as for effective reservoir management downstream of the Bulgarian border, secure flood handling in Greek territory, and a general decrease of the flood hazard. The paper summarizes the range of activities in the basin, including: - the installation of an automatic telemetric hydrometeorological observation network; - review of the results of relevant past projects; - analysis of historical hydrometeorological data; - design and calibration of flood forecasting models; - demonstrating the possibility of issuing flood warnings with a certain lead time and accuracy; - recent efforts to increase the lead time of the hydrological forecasts by applying forecasts from High Resolution Limited Area meteorological models; and other activities in the frame of the EC 5th FP EFFS project. (Author)

  2. Hydraulic survey and scour assessment of Bridge 524, Tanana River at Big Delta, Alaska

    Science.gov (United States)

    Heinrichs, Thomas A.; Langley, Dustin E.; Burrows, Robert L.; Conaway, Jeffrey S.

    2007-01-01

    Bathymetric and hydraulic data were collected August 26–28, 1996, on the Tanana River at Big Delta, Alaska, at the Richardson Highway bridge and Trans-Alaska Pipeline crossing. Erosion along the right (north) bank of the river between the bridge and the pipeline crossing prompted the data collection. A water-surface profile hydraulic model for the 100- and 500-year recurrence-interval floods was developed using surveyed information. The Delta River enters the Tanana immediately downstream of the highway bridge, causing backwater that extends upstream of the bridge. Four scenarios were considered to simulate the influence of the backwater on flow through the bridge. Contraction and pier scour were computed from model results. Computed values of pier scour were large, but the scour during a flood may actually be less because of mitigating factors. No bank erosion was observed at the time of the survey, a low-flow period. Erosion is likely to occur during intermediate or high flows, but the actual erosion processes are unknown at this time.

  3. Flood Resilient Systems and their Application for Flood Resilient Planning

    Science.gov (United States)

    Manojlovic, N.; Gabalda, V.; Antanaskovic, D.; Gershovich, I.; Pasche, E.

    2012-04-01

    Following the paradigm shift in flood management from traditional to more integrated approaches, and considering the uncertainties of future development due to drivers such as climate change, one of the main emerging tasks of flood managers is the development of (flood) resilient cities. This can be achieved by the application of non-structural flood resilience measures, summarised in the 4As: assistance, alleviation, awareness and avoidance (FIAC, 2007). As part of this strategy, the key aspect of developing resilient cities - a resilient built environment - can be reached by efficient application of Flood Resilience Technology (FReT) and its meaningful combination into flood resilient systems (FRS). FRS are defined as "an interconnecting network of FReT which facilitates resilience (including both restorative and adaptive capacity) to flooding, addressing physical and social systems and considering different flood typologies" (SMARTeST, http://www.floodresilience.eu/). Applying the system approach (e.g. Zevenbergen, 2008), FRS can be developed at different scales, from the building to the city level. Still, a method to define and systematise different FRS across those scales remains a matter of research. Further, deciding which resilient system is to be applied for the given conditions and given scale is a complex task, calling for the utilisation of decision support tools. This process of decision-making should follow the steps of flood risk assessment (1) and development of a flood resilience plan (2) (Manojlovic et al, 2009). The key problem in (2) is how to match the input parameters that describe the physical and social system and the flood typology to the appropriate flood resilient system. Additionally, an open issue is how to integrate the advances in FReT, and findings on its efficiency, into decision support tools. This paper presents a way to define, systematise and make decisions on FRS at different scales of an urban system, developed within the 7th FP Project

  4. Estimation of flood environmental effects using flood zone mapping techniques in Halilrood Kerman, Iran.

    Science.gov (United States)

    Boudaghpour, Siamak; Bagheri, Majid; Bagheri, Zahra

    2014-01-01

    High flood occurrences with large environmental damages have a growing trend in Iran. The dynamic movement of water during a flood causes different environmental damages in geographical areas with different characteristics, such as topographic conditions. In general, the environmental effects and damages caused by a flood in an area can be investigated from different points of view. The current study aims to detect the environmental effects of flood occurrences in the Halilrood catchment area of Kerman province in Iran using flood zone mapping techniques. The intended flood zone map was produced in four steps. Steps 1 to 3 pave the way to calculate and estimate the flood zone map in the study area, while step 4 determines the estimation of the environmental effects of flood occurrence. Based on our studies, a wide range of accuracy for estimating the environmental effects of flood occurrence was obtained using flood zone mapping techniques. Moreover, it was identified that the existence of the Jiroft dam in the study area can decrease the flood zone from 260 hectares to 225 hectares and can also decrease the flood peak intensity by 20%. As a result, 14% of the flood zone in the study area can be protected from environmental damage.

  5. A web GIS based integrated flood assessment modeling tool for coastal urban watersheds

    Science.gov (United States)

    Kulkarni, A. T.; Mohanty, J.; Eldho, T. I.; Rao, E. P.; Mohan, B. K.

    2014-03-01

    Urban flooding has become an increasingly important issue in many parts of the world. In this study, an integrated flood assessment model (IFAM) is presented for coastal urban flood simulation. A web-based GIS framework has been adopted to organize the spatial datasets for the study area and to run the model within this framework. The integrated flood model consists of a mass-balance-based 1-D overland flow model, a 1-D finite element based channel flow model using the diffusion wave approximation, and a quasi-2-D raster flood inundation model based on the continuity equation. The model code is written in MATLAB and the application is integrated within a web GIS server product, viz. Web Gram Server™ (WGS), developed at IIT Bombay using Java, JSP and JQuery technologies. Its user interface is developed using OpenLayers, and the attribute data are stored in a MySQL open-source DBMS. The model is integrated within WGS and is called via JavaScript. The application has been demonstrated for two coastal urban watersheds of Navi Mumbai, India. Simulated flood extents for the extreme rainfall event of 26 July 2005 in the two urban watersheds of Navi Mumbai city are presented and discussed. The study demonstrates the effectiveness of the flood simulation tool in a web GIS environment to facilitate data access and visualization of GIS datasets and simulation results.
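    A continuity-equation raster inundation scheme of the kind this record mentions can be sketched as a deliberately simplified explicit step (the IFAM code itself is MATLAB and its exact flux rules are not given here; the transfer fraction and flat test terrain below are assumptions):

```python
import numpy as np

def spread_step(dem, depth, frac=0.5):
    """One explicit step of a quasi-2-D raster spreading scheme: each wet
    cell passes part of its water-surface-head excess to lower 4-neighbours.
    Mass is conserved; real schemes also limit flux with Manning-type
    relations and stability constraints on the time step."""
    head = dem + depth                 # water surface elevation
    new = depth.copy()
    rows, cols = dem.shape
    for r in range(rows):
        for c in range(cols):
            if depth[r, c] <= 0:
                continue
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                rr, cc = r + dr, c + dc
                if 0 <= rr < rows and 0 <= cc < cols and head[r, c] > head[rr, cc]:
                    # transfer at most a quarter of the allowed volume per neighbour
                    dv = frac * min(depth[r, c], (head[r, c] - head[rr, cc]) / 2) / 4
                    new[r, c] -= dv
                    new[rr, cc] += dv
    return new

# Flat terrain with 1 m of water ponded in the centre cell.
dem = np.zeros((3, 3))
depth = np.zeros((3, 3)); depth[1, 1] = 1.0
after = spread_step(dem, depth)
# total volume is conserved: after.sum() == depth.sum()
```

    Iterating such steps until the water surface flattens yields an inundation extent from a routed discharge volume.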

  6. Hydraulic Characteristics of Bedrock Constrictions and Evaluation of One- and Two-Dimensional Models of Flood Flow on the Big Lost River at the Idaho National Engineering and Environmental Laboratory, Idaho

    Science.gov (United States)

    Berenbrock, Charles; Rousseau, Joseph P.; Twining, Brian V.

    2007-01-01

    A 1.9-mile reach of the Big Lost River, between the Idaho National Engineering and Environmental Laboratory (INEEL) diversion dam and the Pioneer diversion structures, was investigated to evaluate the effects of streambed erosion and bedrock constrictions on model predictions of water-surface elevations. Two one-dimensional (1-D) models, a fixed-bed surface-water flow model (HEC-RAS) and a movable-bed surface-water flow and sediment-transport model (HEC-6), were used to evaluate these effects. The results of these models were compared to the results of a two-dimensional (2-D) fixed-bed model [Transient Inundation 2-Dimensional (TRIM2D)] that had previously been used to predict water-surface elevations for peak flows with sufficient stage and stream power to erode floodplain terrain features (Holocene inset terraces referred to as BLR#6 and BLR#8) dated at 300 to 500 years old, and an unmodified Pleistocene surface (referred to as the saddle area) dated at 10,000 years old; and to extend the period of record at the Big Lost River streamflow-gaging station near Arco for flood-frequency analyses. The extended record was used to estimate the magnitude of the 100-year flood and the magnitude of floods with return periods as long as 10,000 years. In most cases, the fixed-bed TRIM2D model simulated higher water-surface elevations, shallower flow depths, higher flow velocities, and higher stream powers than the fixed-bed HEC-RAS and movable-bed HEC-6 models for the same peak flows. The HEC-RAS model required flow increases of 83 percent [100 to 183 cubic meters per second (m3/s)], and 45 percent (100 to 145 m3/s) to match TRIM2D simulations of water-surface elevations at two paleoindicator sites that were used to determine peak flows (100 m3/s) with an estimated return period of 300 to 500 years; and an increase of 13 percent (150 to 169 m3/s) to match TRIM2D water-surface elevations at the saddle area that was used to establish the peak flow (150 m3/s) of a paleoflood

  7. Big Data en surveillance, deel 1 : Definities en discussies omtrent Big Data

    NARCIS (Netherlands)

    Timan, Tjerk

    2016-01-01

    Following a (fairly short) lecture on surveillance and Big Data, I was asked to go somewhat deeper into the theme, the definitions, and the various questions related to big data. In this first part I will try to set out various matters concerning Big Data theory and

  8. Should seasonal rainfall forecasts be used for flood preparedness?

    Directory of Open Access Journals (Sweden)

    E. Coughlan de Perez

    2017-09-01

    In light of strong encouragement for disaster managers to use climate services for flood preparation, we question whether seasonal rainfall forecasts should indeed be used as indicators of the likelihood of flooding. Here, we investigate the primary indicators of flooding at the seasonal timescale across sub-Saharan Africa. Given the sparsity of hydrological observations, we input bias-corrected reanalysis rainfall into the Global Flood Awareness System to identify seasonal indicators of floodiness. Results demonstrate that in some regions of western, central, and eastern Africa with typically wet climates, even a perfect tercile forecast of seasonal total rainfall would provide little to no indication of the seasonal likelihood of flooding. The number of extreme events within a season shows the highest correlations with floodiness consistently across regions. Otherwise, results vary across climate regimes: floodiness in arid regions in southern and eastern Africa shows the strongest correlations with seasonal average soil moisture and seasonal total rainfall. Floodiness in wetter climates of western and central Africa and Madagascar shows the strongest relationship with measures of the intensity of seasonal rainfall. Measures of rainfall patterns, such as the length of dry spells, are least related to seasonal floodiness across the continent. Ultimately, identifying the drivers of seasonal flooding can be used to improve forecast information for flood preparedness and to avoid misleading decision-makers.

  9. Characterizing Big Data Management

    OpenAIRE

    Rogério Rossi; Kechi Hirama

    2015-01-01

    Big data management is a reality for an increasing number of organizations in many areas and represents a set of challenges involving big data modeling, storage and retrieval, analysis and visualization. However, technological resources, people and processes are crucial to facilitate the management of big data in any kind of organization, allowing information and knowledge from a large volume of data to support decision-making. Big data management can be supported by these three dimensions: t...

  10. Development of Integrated Flood Analysis System for Improving Flood Mitigation Capabilities in Korea

    Science.gov (United States)

    Moon, Young-Il; Kim, Jong-suk

    2016-04-01

    Recently, people's need for a safer life and a homeland secure from unexpected natural disasters has been growing. Flood damages have been recorded every year in Korea, exceeding an annual average of 2 trillion won since 2000. Casualties and property damage due to flooding caused by hydrometeorological extremes have increased with climate change. Although the importance of the flooding situation is emerging rapidly, studies related to the development of an integrated management system for reducing floods are insufficient in Korea. In addition, it is difficult to effectively reduce floods without developing an integrated operation system that takes into account the sewage pipe network configuration together with the river level. Since floods result in increasing damage to infrastructure, as well as to life and property, structural and non-structural measures should be urgently established in order to effectively reduce flooding. Therefore, in this study, we developed an integrated flood analysis system that systematizes technology to quantify flood risk and provide flood forecasting, supporting synthetic decision-making through real-time monitoring and prediction of flash rain or short-term rainfall using radar and satellite information in Korea. Keywords: Flooding, Integrated flood analysis system, Rainfall forecasting, Korea. Acknowledgments: This work was carried out with the support of the "Cooperative Research Program for Agriculture Science & Technology Development (Project No. PJ011686022015)", Rural Development Administration, Republic of Korea

  11. The Total Risk Analysis of Large Dams under Flood Hazards

    Directory of Open Access Journals (Sweden)

    Yu Chen

    2018-02-01

    Dams and reservoirs are useful systems in water conservancy projects; however, they also pose a high risk potential for large downstream areas. Flood, as the driving force of dam overtopping, is the main cause of dam failure. Dam floods and their risks are of interest to researchers and managers. In hydraulic engineering, there is a growing tendency to evaluate dam flood risk based on statistical and probabilistic methods, which are unsuitable for situations with scarce historical data or low flood probability, so a more reasonable dam flood risk analysis method with fewer application restrictions is needed. Therefore, differing from previous studies, this study develops a flood risk analysis method for large dams based on the concept of the total risk factor (TRF), used initially in dam seismic risk analysis. The proposed method is not affected by the adequacy of historical data or the low probability of flood, and is capable of analyzing the influence of the dam structure, the flood vulnerability of the dam site, and the downstream risk, as well as estimating the TRF of each dam and assigning corresponding risk classes. Application to large dams in the Dadu River Basin, southwestern China, demonstrates that the proposed method provides quick risk estimation and comparison, which can help local management officials perform more detailed dam safety evaluations and obtain useful risk management information.
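    The aggregation idea behind a total risk factor can be illustrated with a toy weighted-score scheme. The component names, weights, and class boundaries below are invented for illustration and do not reproduce the paper's actual TRF definition:

```python
def total_risk_factor(structure, site_flood_vulnerability, downstream_risk,
                      weights=(0.3, 0.4, 0.3)):
    """Combine three component scores (each pre-normalised to 0-10) into a
    single comparable index by a weighted sum. Hypothetical weights."""
    w_s, w_v, w_d = weights
    return w_s * structure + w_v * site_flood_vulnerability + w_d * downstream_risk

def risk_class(trf):
    """Map the index to an illustrative qualitative class."""
    if trf < 3.0:
        return "low"
    if trf < 6.0:
        return "moderate"
    return "high"

# A dam with a moderate structural score, high site vulnerability,
# and high downstream exposure:
trf = total_risk_factor(4.0, 6.0, 8.0)   # 0.3*4 + 0.4*6 + 0.3*8 = 6.0
# risk_class(trf) -> "high"
```

    The appeal of such an index is that it needs no flood-frequency statistics: dams can be ranked and triaged for detailed safety evaluation even where discharge records are scarce.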

  12. Going beyond the flood insurance rate map: insights from flood hazard map co-production

    Science.gov (United States)

    Luke, Adam; Sanders, Brett F.; Goodrich, Kristen A.; Feldman, David L.; Boudreau, Danielle; Eguiarte, Ana; Serrano, Kimberly; Reyes, Abigail; Schubert, Jochen E.; AghaKouchak, Amir; Basolo, Victoria; Matthew, Richard A.

    2018-04-01

    Flood hazard mapping in the United States (US) is deeply tied to the National Flood Insurance Program (NFIP). Consequently, publicly available flood maps provide essential information for insurance purposes, but they do not necessarily provide relevant information for non-insurance aspects of flood risk management (FRM) such as public education and emergency planning. Recent calls for flood hazard maps that support a wider variety of FRM tasks highlight the need to deepen our understanding about the factors that make flood maps useful and understandable for local end users. In this study, social scientists and engineers explore opportunities for improving the utility and relevance of flood hazard maps through the co-production of maps responsive to end users' FRM needs. Specifically, two-dimensional flood modeling produced a set of baseline hazard maps for stakeholders of the Tijuana River valley, US, and Los Laureles Canyon in Tijuana, Mexico. Focus groups with natural resource managers, city planners, emergency managers, and academic, non-profit, and community leaders refined the baseline hazard maps by triggering additional modeling scenarios and map revisions. Several important end user preferences emerged, such as (1) legends that frame flood intensity both qualitatively and quantitatively, and (2) flood scenario descriptions that report flood magnitude in terms of rainfall, streamflow, and its relation to a historic event. Regarding desired hazard map content, end users' requests revealed general consistency with mapping needs reported in European studies and guidelines published in Australia. However, requested map content that is not commonly produced included (1) standing water depths following the flood, (2) the erosive potential of flowing water, and (3) pluvial flood hazards, or flooding caused directly by rainfall. 
We conclude that the relevance and utility of commonly produced flood hazard maps can be most improved by illustrating pluvial flood hazards.

  13. Flood Risk Regional Flood Defences : Technical report

    NARCIS (Netherlands)

    Kok, M.; Jonkman, S.N.; Lendering, K.T.

    2015-01-01

    Historically the Netherlands have always had to deal with the threat of flooding, both from the rivers and the sea as well as from heavy rainfall. The country consists of a large amount of polders, which are low lying areas of land protected from flooding by embankments. These polders require an

  14. The index-flood and the GRADEX methods combination for flood frequency analysis.

    Science.gov (United States)

    Fuentes, Diana; Di Baldassarre, Giuliano; Quesada, Beatriz; Xu, Chong-Yu; Halldin, Sven; Beven, Keith

    2017-04-01

    Flood frequency analysis is used in many applications, including flood risk management, design of hydraulic structures, and urban planning. However, such analysis requires long series of observed discharge data, which are often not available in many basins around the world. In this study, we tested the usefulness of combining regional discharge and local precipitation data to estimate the event flood volume frequency curve for 63 catchments in Mexico, Central America and the Caribbean. This was achieved by combining two existing flood frequency analysis methods: the regionalization index-flood approach and the GRADEX method. For return periods of up to 10 years, following the index-flood approach, a similar shape of the scaled flood frequency curve was assumed for catchments with similar flood behaviour. For return periods larger than 10 years, following the GRADEX method, the probability distributions of rainfall and discharge volumes were assumed to be asymptotically exponential with the same scale parameter. Results showed that if the mean annual flood (MAF), used as the index flood, is known, the index-flood approach performed well for return periods of up to 10 years, resulting in a 25% mean relative error in prediction. For larger return periods the prediction capability decreased, but could be improved by use of the GRADEX method. As the MAF is unknown in ungauged basins and basins with short records, we tested predicting the MAF using climate-physical catchment characteristics, and using discharge statistics, the latter when observations were available for only 8 years. Only the use of discharge statistics resulted in acceptable predictions.
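    The two-regime combination described above can be sketched in a few lines. Everything numeric here is an illustrative assumption (the growth curve, MAF, gradex value and the 10-year pivot are invented, not values from the study):

```python
import math

def gumbel_reduced_variate(T):
    """Gumbel reduced variate u = -ln(-ln(1 - 1/T)) for return period T (years)."""
    return -math.log(-math.log(1.0 - 1.0 / T))

def flood_quantile(T, maf, growth_curve, gradex, T_pivot=10.0):
    """Hypothetical index-flood/GRADEX combination.

    Below T_pivot: scale the regional growth curve by the mean annual flood (MAF).
    Above T_pivot: extrapolate parallel to the rainfall's exponential tail, whose
    slope (the 'gradex') is estimated from local precipitation data.
    """
    if T <= T_pivot:
        return maf * growth_curve(T)
    q_pivot = maf * growth_curve(T_pivot)
    return q_pivot + gradex * (gumbel_reduced_variate(T) - gumbel_reduced_variate(T_pivot))

# Illustrative dimensionless regional growth curve (Gumbel-shaped)
growth = lambda T: 1.0 + 0.35 * (gumbel_reduced_variate(T) - gumbel_reduced_variate(2.33))

# 100-yr flood volume for a catchment with MAF = 250 and gradex = 40 (made-up units)
q100 = flood_quantile(100, maf=250.0, growth_curve=growth, gradex=40.0)
```

    By construction the curve is continuous at the pivot return period, so the GRADEX extension simply prolongs the index-flood estimate along the rainfall tail.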

  15. Flood characteristics of the Haor area in Bangladesh

    Science.gov (United States)

    Suman, Asadusjjaman; Bhattacharya, Biswa

    2013-04-01

    In recent years the world has experienced deaths, large-scale displacement of people, billions of Euros of economic damage, mental stress and ecosystem impacts due to flooding. Global changes (climate change, population and economic growth, and urbanisation) are exacerbating the severity of flooding. The 2010 floods in Pakistan and the 2011 floods in Australia and Thailand demonstrate the need for concerted action in the face of global societal and environmental changes to strengthen resilience against flooding. Bangladesh is a country that frequently suffers from flooding. The current research is conducted in the framework of a project which focuses on the flooding issues in the Haor region in the north-east of Bangladesh. A haor is a saucer-shaped depression, which is used during the dry period (December to mid-May) for agriculture and as a fishery during the wet period (June-November), and thereby presents a very interesting socio-economic perspective of flood risk management. Pre-monsoon flooding till mid-May causes agricultural losses and much distress, whereas monsoon flooding brings benefits. The area borders India, thereby presenting trans-boundary issues as well, and is fed by some flashy Indian catchments. The area is drained mainly through the Surma-Kushiyara river system. The terrain is generally flat and the flashy characteristics die out within a short distance from the border. Limited studies on the region, particularly with the help of numerical models, have been carried out in the past. Therefore, an objective of the current research was to set up numerical models capable of reasonably emulating the physical system. Such models could, for example, relate different gauges to the spatio-temporal variation of hydrodynamic variables and help in carrying out a systematic study on the impact of climate changes.
A 1D2D model, with one-dimensional model for the rivers (based on MIKE 11 modelling tool from Danish Hydraulic Institute) and a two

  16. Big Data in der Cloud

    DEFF Research Database (Denmark)

    Leimbach, Timo; Bachlechner, Daniel

    2014-01-01

    Technology assessment of big data, in particular cloud based big data services, for the Office for Technology Assessment at the German federal parliament (Bundestag).

  17. The relevance of flood hazards and impacts in Turkey: What can be learned from different disaster loss databases?

    Science.gov (United States)

    Koc, Gamze; Thieken, Annegret H.

    2016-04-01

    classification system (IRDR, 2014). Furthermore, literature, news archives and the Global Active Archive of Large Flood Events - Dartmouth Flood Observatory (floodobservatory.colorado.edu) were used to complement loss data gaps of the databases. From 1960 to 2014, EM-DAT reported 35 flood events in Turkey (26.3 % of all natural hazard events), which caused 773 fatalities (the second most destructive type of natural hazard after earthquakes) and a total economic damage of US$ 2.2 billion. In contrast, TABB contained 1076 flood events (8.3 % of all natural hazard events), in which 795 people died. On this basis, floods are the third most destructive type of natural hazard - after earthquakes and extreme temperatures - for human losses in Turkey. A comparison of the two databases EM-DAT and TABB reveals large mismatches in the flood data; e.g., the reported number of events, the number of affected people and the economic losses differ dramatically. It is concluded that the main reason for the large differences and contradicting numbers in different natural disaster databases is the lack of standardization in data collection, peril classification and database thresholds (entry criteria). Since loss data collection is gaining more and more attention, e.g. in the Sendai Framework for Disaster Risk Reduction 2015-2030 (SFDRR), this study could offer substantial insights for flood risk mitigation and adaptation studies in Turkey. References: Gall, M., Borden, K., Cutter, S.L. (2009) When do losses count? Six fallacies of loss data from natural hazards. Bulletin of the American Meteorological Society, 90(6), 799-809. Genç, F.S. (2007) Türkiye'de Kentleşme ve Doğal Afet Riskleri ile İlişkisi, TMMOB Afet Sempozyumu. IRDR (2014) IRDR Peril Classification and Hazard Glossary. Report of the Data Group in the Integrated Research on Disaster Risk. (Available at: http://www.irdrinternational.org/2014/03/28/irdr-peril-classification-and-hazard-glossary).

  18. Improving flood risk mapping in Italy: the FloodRisk open-source software

    Science.gov (United States)

    Albano, Raffaele; Mancusi, Leonardo; Craciun, Iulia; Sole, Aurelia; Ozunu, Alexandru

    2017-04-01

    Time and again, floods around the world illustrate the devastating impact they can have on societies. Furthermore, the expectation that flood damages can increase over time with climate change, land-use change and social growth in flood-prone areas has raised the awareness of the public and of other stakeholders (governments, international organizations, re-insurance companies and emergency responders) of the need to manage risks in order to mitigate their causes and consequences. In this light, the choice of appropriate measures, the assessment of the costs and effects of such measures, and their prioritization are crucial for decision makers. As a result, a priori flood risk assessment has become a key part of flood management practices, with the aim of minimizing the total costs related to the risk management cycle. In this context, the EU Floods Directive 2007/60 requires the delineation of flood risk maps on the basis of the most appropriate and advanced tools, with particular attention on limiting the required economic efforts. The main aim of these risk maps is to provide the knowledge required for the development of flood risk management plans (FRMPs) by considering both costs and benefits of alternatives and results from consultation with all interested parties. In this context, this research project developed a free and open-source (FOSS) GIS software package, called FloodRisk, to operatively support stakeholders in their compliance with the FRMPs. FloodRisk aims to facilitate the development of risk maps and the evaluation and management of current and future flood risk for multi-purpose applications. This new approach overcomes the limits of the expert-driven qualitative (EDQ) approach currently adopted in several European countries, such as Italy, which does not permit a suitable evaluation of the effectiveness of risk mitigation strategies, because the vulnerability component cannot be properly assessed. Moreover, FloodRisk is also able to involve the citizens in the flood

  19. An analysis of cross-sectional differences in big and non-big public accounting firms' audit programs

    NARCIS (Netherlands)

    Blokdijk, J.H. (Hans); Drieenhuizen, F.; Stein, M.T.; Simunic, D.A.

    2006-01-01

    A significant body of prior research has shown that audits by the Big 5 (now Big 4) public accounting firms are quality differentiated relative to non-Big 5 audits. This result can be derived analytically by assuming that Big 5 and non-Big 5 firms face different loss functions for "audit failures"

  20. Big Data is invading big places as CERN

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    Big Data technologies are becoming more popular with the constant growth of data generation in different fields such as social networks, the internet of things and laboratories like CERN. How is CERN making use of such technologies? How is machine learning applied at CERN with Big Data technologies? How much data do we move, and how is it analyzed? All these questions will be answered during the talk.

  1. The big bang

    International Nuclear Information System (INIS)

    Chown, Marcus.

    1987-01-01

    The paper concerns the 'Big Bang' theory of the creation of the Universe 15 thousand million years ago, and traces events which physicists predict occurred soon after the creation. Unified theory of the moment of creation, evidence of an expanding Universe, the X-boson (the particle produced very soon after the big bang, which vanished from the Universe one-hundredth of a second after the big bang), and the fate of the Universe are all discussed. (U.K.)

  2. Flood frequency analysis of historical flood data under stationary and non-stationary modelling

    Science.gov (United States)

    Machado, M. J.; Botero, B. A.; López, J.; Francés, F.; Díez-Herrero, A.; Benito, G.

    2015-06-01

    Historical records are an important source of information on extreme and rare floods and are fundamental to establishing reliable flood return frequencies. The use of long historical records for flood frequency analysis brings in the question of flood stationarity, since climatic and land-use conditions can affect the relevance of past flooding as a predictor of future flooding. In this paper, a detailed 400 yr flood record from the Tagus River in Aranjuez (central Spain) was analysed under stationary and non-stationary flood frequency approaches, to assess their contribution within hazard studies. Historical flood records in Aranjuez were obtained from documents (Proceedings of the City Council, diaries, chronicles, memoirs, etc.), epigraphic marks, and indirect historical sources and reports. The water levels associated with different floods (derived from descriptions or epigraphic marks) were converted into discharge values using a one-dimensional hydraulic model. Secular variations in flood magnitude and frequency, found to respond to climate and environmental drivers, showed a good correlation between high values of historical flood discharges and a negative mode of the North Atlantic Oscillation (NAO) index. Over the systematic gauge record (1913-2008), an abrupt change in flood magnitude occurred in 1957 due to the construction of three major reservoirs in the Tagus headwaters (Bolarque, Entrepeñas and Buendia) controlling 80% of the watershed surface draining to Aranjuez. Two different models were used for the flood frequency analysis: (a) a stationary model estimating statistical distributions incorporating imprecise and categorical data based on maximum likelihood estimators, and (b) a time-varying model based on "generalized additive models for location, scale and shape" (GAMLSS) modelling, which incorporates external covariates related to climate variability (the NAO index) and catchment hydrology factors (in this paper a reservoir index, RI). Flood frequency

  3. Small Big Data Congress 2017

    NARCIS (Netherlands)

    Doorn, J.

    2017-01-01

    TNO, in collaboration with the Big Data Value Center, presents the fourth Small Big Data Congress! Our congress aims at providing an overview of practical and innovative applications based on big data. Do you want to know what is happening in applied research with big data? And what can already be

  4. Big data opportunities and challenges

    CERN Document Server

    2014-01-01

    This ebook aims to give practical guidance for all those who want to understand big data better and learn how to make the most of it. Topics range from big data analysis, mobile big data and managing unstructured data to technologies, governance and intellectual property and security issues surrounding big data.

  5. Floods and climate: emerging perspectives for flood risk assessment and management

    NARCIS (Netherlands)

    Merz, B.; Aerts, J.C.J.H.; Arnbjerg-Nielsen, K.; Baldi, M.; Becker, A.; Bichet, A.; Blöschl, G.; Bouwer, L.M.; Brauer, A.; Cioffi, F.; Delgado, J.M.; Gocht, M.; Guzetti, F.; Harrigan, S.; Hirschboeck, K.; Kilsby, C.; Kron, W.; Kwon, H. -H.; Lall, U.; Merz, R.; Nissen, K.; Salvatti, P.; Swierczynski, T.; Ulbrich, U.; Viglione, A.; Ward, P.J.; Weiler, M.; Wilhelm, B.; Nied, M.

    2014-01-01

    Flood estimation and flood management have traditionally been the domain of hydrologists, water resources engineers and statisticians, and disciplinary approaches abound. Dominant views have been shaped; one example is the catchment perspective: floods are formed and influenced by the interaction of

  6. Big Data; A Management Revolution : The emerging role of big data in businesses

    OpenAIRE

    Blasiak, Kevin

    2014-01-01

    Big data is a term that was coined in 2012 and has since then emerged as one of the top trends in business and technology. Big data is an agglomeration of different technologies resulting in data processing capabilities that have been unreached before. Big data is generally characterized by factors such as volume, velocity and variety, which distinguish it from traditional data use. The possibilities to utilize this technology are vast. Big data technology has touch points in differ...

  7. Projections of Flood Risk using Credible Climate Signals in the Ohio River Basin

    Science.gov (United States)

    Schlef, K.; Robertson, A. W.; Brown, C.

    2017-12-01

    Estimating future hydrologic flood risk under non-stationary climate is a key challenge to the design of long-term water resources infrastructure and flood management strategies. In this work, we demonstrate how projections of large-scale climate patterns can be credibly used to create projections of long-term flood risk. Our study area is the northwest region of the Ohio River Basin in the United States Midwest. In the region, three major teleconnections have been previously demonstrated to affect synoptic patterns that influence extreme precipitation and streamflow: the El Nino Southern Oscillation, the Pacific North American pattern, and the Pacific Decadal Oscillation. These teleconnections are strongest during the winter season (January-March), which also experiences the greatest number of peak flow events. For this reason, flood events are defined as the maximum daily streamflow to occur in the winter season. For each gage in the region, the location parameter of a log Pearson type 3 distribution is conditioned on the first principal component of the three teleconnections to create a statistical model of flood events. Future projections of flood risk are created by forcing the statistical model with projections of the teleconnections from general circulation models selected for skill. We compare the results of our method to the results of two other methods: the traditional model chain (i.e., general circulation model projections to downscaling method to hydrologic model to flood frequency analysis) and that of using the historic trend. We also discuss the potential for developing credible projections of flood events for the continental United States.
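    The core statistical step named above, conditioning the location parameter of a log Pearson type 3 distribution on the first principal component of the teleconnections, might be sketched as follows. The synthetic data, seeds, and the two-step fit (regression, then Pearson III on the residuals) are illustrative assumptions, not the authors' procedure:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# Hypothetical winter peak-flow record (log10 of m^3/s) driven by a climate
# index: PC1 of ENSO, PNA and PDO (synthetic data, not the study's gauges).
n = 60
pc1 = rng.normal(size=n)
noise = stats.pearson3.rvs(0.3, scale=0.2, size=n, random_state=42)
log_q = 2.5 + 0.15 * pc1 + noise

# Condition the LP3 location on the climate index: regress log-flows on PC1,
# then fit a Pearson III distribution to the regression residuals.
fit = stats.linregress(pc1, log_q)
resid = log_q - (fit.intercept + fit.slope * pc1)
skew, loc, scale = stats.pearson3.fit(resid)

def flood_quantile(T, pc1_value):
    """Winter peak flow (m^3/s) with exceedance probability 1/T, given PC1."""
    z = stats.pearson3.ppf(1.0 - 1.0 / T, skew, loc=loc, scale=scale)
    return 10.0 ** (fit.intercept + fit.slope * pc1_value + z)
```

    Feeding projected PC1 values from skill-selected general circulation models into `flood_quantile` then yields climate-conditioned flood quantiles, which is the spirit of the approach described in the abstract.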

  8. A framework for global river flood risk assessments

    Science.gov (United States)

    Winsemius, H. C.; Van Beek, L. P. H.; Jongman, B.; Ward, P. J.; Bouwman, A.

    2013-05-01

    There is an increasing need for strategic global assessments of flood risks in current and future conditions. In this paper, we propose a framework for global flood risk assessment for river floods, which can be applied in current conditions, as well as in future conditions due to climate and socio-economic changes. The framework's goal is to establish flood hazard and impact estimates at a high enough resolution to allow for their combination into a risk estimate, which can be used for strategic global flood risk assessments. The framework estimates hazard at a resolution of ~ 1 km2 using global forcing datasets of the current (or in scenario mode, future) climate, a global hydrological model, a global flood-routing model, and more importantly, an inundation downscaling routine. The second component of the framework combines hazard with flood impact models at the same resolution (e.g. damage, affected GDP, and affected population) to establish indicators for flood risk (e.g. annual expected damage, affected GDP, and affected population). The framework has been applied using the global hydrological model PCR-GLOBWB, which includes an optional global flood routing model DynRout, combined with scenarios from the Integrated Model to Assess the Global Environment (IMAGE). We performed downscaling of the hazard probability distributions to 1 km2 resolution with a new downscaling algorithm, applied on Bangladesh as a first case study application area. We demonstrate the risk assessment approach in Bangladesh based on GDP per capita data, population, and land use maps for 2010 and 2050. Validation of the hazard estimates has been performed using the Dartmouth Flood Observatory database. This was done by comparing a high return period flood with the maximum observed extent, as well as by comparing a time series of a single event with Dartmouth imagery of the event. Validation of modelled damage estimates was performed using observed damage estimates from the EM
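    One of the risk indicators named above, annual expected damage, is commonly obtained by integrating damage over annual exceedance probability. A minimal sketch, with invented return periods and damage figures (not values from the framework):

```python
import numpy as np

def expected_annual_damage(return_periods, damages):
    """Integrate damage over annual exceedance probability p = 1/T.

    Illustrative helper, not the framework's code: damages estimated for a
    handful of return periods are combined into an annual expected damage
    via the trapezoidal rule.
    """
    T = np.asarray(return_periods, dtype=float)
    D = np.asarray(damages, dtype=float)
    p = 1.0 / T
    order = np.argsort(p)                  # integrate from rare to frequent
    p, D = p[order], D[order]
    return float(np.sum(0.5 * (D[1:] + D[:-1]) * np.diff(p)))

# Damages (e.g. billion USD) for the 2-, 10-, 50-, 100- and 500-yr floods
ead = expected_annual_damage([2, 10, 50, 100, 500], [0.0, 1.2, 3.5, 5.0, 9.0])
```

    The same pattern applies to the other indicators (affected GDP, affected population): evaluate the impact model at several return periods, then integrate over probability.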

  9. A framework for global river flood risk assessments

    Directory of Open Access Journals (Sweden)

    H. C. Winsemius

    2013-05-01

    Full Text Available There is an increasing need for strategic global assessments of flood risks in current and future conditions. In this paper, we propose a framework for global flood risk assessment for river floods, which can be applied in current conditions, as well as in future conditions due to climate and socio-economic changes. The framework's goal is to establish flood hazard and impact estimates at a high enough resolution to allow for their combination into a risk estimate, which can be used for strategic global flood risk assessments. The framework estimates hazard at a resolution of ~ 1 km2 using global forcing datasets of the current (or, in scenario mode, future) climate, a global hydrological model, a global flood-routing model, and more importantly, an inundation downscaling routine. The second component of the framework combines hazard with flood impact models at the same resolution (e.g. damage, affected GDP, and affected population) to establish indicators for flood risk (e.g. annual expected damage, affected GDP, and affected population). The framework has been applied using the global hydrological model PCR-GLOBWB, which includes an optional global flood routing model DynRout, combined with scenarios from the Integrated Model to Assess the Global Environment (IMAGE). We performed downscaling of the hazard probability distributions to 1 km2 resolution with a new downscaling algorithm, applied on Bangladesh as a first case study application area. We demonstrate the risk assessment approach in Bangladesh based on GDP per capita data, population, and land use maps for 2010 and 2050. Validation of the hazard estimates has been performed using the Dartmouth Flood Observatory database. This was done by comparing a high return period flood with the maximum observed extent, as well as by comparing a time series of a single event with Dartmouth imagery of the event. Validation of modelled damage estimates was performed using observed damage estimates from

  10. Social big data mining

    CERN Document Server

    Ishikawa, Hiroshi

    2015-01-01

    Social Media. Big Data and Social Data. Hypotheses in the Era of Big Data. Social Big Data Applications. Basic Concepts in Data Mining. Association Rule Mining. Clustering. Classification. Prediction. Web Structure Mining. Web Content Mining. Web Access Log Mining, Information Extraction and Deep Web Mining. Media Mining. Scalability and Outlier Detection.

  11. Flood-Ring Formation and Root Development in Response to Experimental Flooding of Young Quercus robur Trees

    Science.gov (United States)

    Copini, Paul; den Ouden, Jan; Robert, Elisabeth M. R.; Tardif, Jacques C.; Loesberg, Walter A.; Goudzwaard, Leo; Sass-Klaassen, Ute

    2016-01-01

    Spring flooding in riparian forests can cause significant reductions in earlywood-vessel size in submerged stem parts of ring-porous tree species, leading to the presence of ‘flood rings’ that can be used as a proxy to reconstruct past flooding events, potentially over millennia. The mechanism of flood-ring formation and the relation with timing and duration of flooding are still to be elucidated. In this study, we experimentally flooded 4-year-old Quercus robur trees at three spring phenophases (late bud dormancy, budswell, and internode expansion) and over different flooding durations (2, 4, and 6 weeks) to a stem height of 50 cm. The effect of flooding on root and vessel development was assessed immediately after the flooding treatment and at the end of the growing season. Ring width and earlywood-vessel size and density were measured at 25- and 75-cm stem height and collapsed vessels were recorded. Stem flooding inhibited earlywood-vessel development in flooded stem parts. In addition, flooding upon budswell and internode expansion led to collapsed earlywood vessels below the water level. At the end of the growing season, mean earlywood-vessel size in the flooded stem parts (upon budswell and internode expansion) was always reduced by approximately 50% compared to non-flooded stem parts and 55% compared to control trees. This reduction was already present 2 weeks after flooding and occurred independently of flooding duration. Stem and root flooding were associated with significant root dieback after 4 and 6 weeks and mean radial growth was always reduced with increasing flooding duration. By comparing stem and root flooding, we conclude that flood rings only occur after stem flooding. As earlywood-vessel development was hampered during flooding, a considerable number of narrow earlywood vessels present later in the season must have been formed after the actual flooding events. Our study indicates that root dieback, together with strongly reduced hydraulic

  12. Visualizing big energy data

    DEFF Research Database (Denmark)

    Hyndman, Rob J.; Liu, Xueqin Amy; Pinson, Pierre

    2018-01-01

    Visualization is a crucial component of data analysis. It is always a good idea to plot the data before fitting models, making predictions, or drawing conclusions. As sensors of the electric grid are collecting large volumes of data from various sources, power industry professionals are facing the challenge of visualizing such data in a timely fashion. In this article, we demonstrate several data-visualization solutions for big energy data through three case studies involving smart-meter data, phasor measurement unit (PMU) data, and probabilistic forecasts, respectively.

  13. After the flood is before the next flood - post event review of the Central European Floods of June 2013. Insights, recommendations and next steps for future flood prevention

    Science.gov (United States)

    Szoenyi, Michael; Mechler, Reinhard; McCallum, Ian

    2015-04-01

    In early June 2013, severe flooding hit Central and Eastern Europe, causing extensive damage, in particular along the Danube and Elbe main watersheds. The situation was particularly severe in Eastern Germany, Austria, Hungary and the Czech Republic. Based on the Post Event Review Capability (PERC) approach, developed by Zurich Insurance's Flood Resilience Program to provide independent review of large flood events, we examine what has worked well (best practice) and opportunities for further improvement. The PERC overall aims to thoroughly examine aspects of flood resilience, flood risk management and catastrophe intervention in order to help build back better after events and learn for future events. As our research from post event analyses shows, a lot of losses are in fact avoidable by taking the right measures pre-event, and these measures are economically efficient, with a return of 4 Euros in losses saved for every Euro invested in prevention on average (Wharton/IIASA flood resilience alliance paper on cost-benefit analysis, Mechler et al. 2014) and up to 10 Euros for certain countries. For the 2013 flood events we provide analysis on the following aspects and in general identify a number of factors that worked in terms of reducing the loss and risk burden. 1. Understanding risk factors of the Central European Floods 2013: We review the precursors leading up to the floods in June, with an extremely wet May 2013 and an atypical V-b weather pattern that brought immense precipitation in a very short period to the watersheds of the Elbe, Donau and partially the Rhine in the D-A-CH countries, and we researched what happened during the flood and why. Key questions we asked revolve around which protection and risk reduction approaches worked well and which did not, and why. 2. Insights and recommendations from the post event review: The PERC identified a number of risk factors, which need attention if risk is to be reduced over time. • Yet another "100-year flood" - risk

  14. Urban flood return period assessment through rainfall-flood response modelling

    DEFF Research Database (Denmark)

    Murla, Damian; Thorndahl, Søren Liedtke

    Intense rainfall can often cause severe floods, especially in urbanized areas with high population density or large impermeable areas. In this context, floods can generate a direct impact from a social-environmental-economic viewpoint. Traditionally, in the design of Urban Drainage Systems (UDS), the correlation between the return period (RP) of a given rainfall and the RP of its consequent flood has been assumed to be linear (e.g. DS/EN752 (2008)). However, this is not always the case. Complex UDS, where diverse hydraulic infrastructures are often found, increase the heterogeneity of system response, which may cause an alteration of the mentioned correlation. Consequently, the reliability of future urban planning, design and resilience against floods may also be affected by this misassumption. In this study, an assessment of surface flood RP across rainfall RP has been carried out at Lystrup, an urbanized...
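    The assumed one-to-one mapping between rainfall RP and flood RP can be checked empirically with plotting positions. In this sketch the paired annual-maximum series are fabricated, not the Lystrup data; if the linear assumption held, the rank-based return periods of a year's rainfall and of its simulated flood would coincide:

```python
import numpy as np

def empirical_return_periods(annual_maxima):
    """Weibull plotting-position return periods T = (n+1)/rank for an AMS."""
    x = np.asarray(annual_maxima, dtype=float)
    n = len(x)
    ranks = np.empty(n)
    ranks[np.argsort(-x)] = np.arange(1, n + 1)  # rank 1 = largest value
    return (n + 1) / ranks

# Hypothetical paired series: rainfall depth (mm) and the flood volume the
# drainage model simulated for the same year (invented numbers).
rain = [42, 55, 38, 61, 47, 70, 33, 52, 45, 58]
flood = [5.0, 9.5, 3.1, 8.0, 6.8, 14.0, 2.0, 6.1, 5.5, 10.2]
T_rain = empirical_return_periods(rain)
T_flood = empirical_return_periods(flood)

# Years whose rainfall RP and flood RP disagree, contradicting linearity
mismatch = np.sum(T_rain != T_flood)
```

    A nonzero mismatch count indicates that the system reorders events between rainfall and flood response, which is exactly the heterogeneity the abstract attributes to complex UDS.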

  15. Cryptography for Big Data Security

    Science.gov (United States)

    2015-07-13

    Book chapter by Ariel Hamlin et al. for Big Data: Storage, Sharing, and Security (3S). Distribution A: Public Release.

  16. Data: Big and Small.

    Science.gov (United States)

    Jones-Schenk, Jan

    2017-02-01

    Big data is a big topic in all leadership circles. Leaders in professional development must develop an understanding of what data are available across the organization that can inform effective planning for forecasting. Collaborating with others to integrate data sets can increase the power of prediction. Big data alone is insufficient to make big decisions. Leaders must find ways to access small data and triangulate multiple types of data to ensure the best decision making. J Contin Educ Nurs. 2017;48(2):60-61. Copyright 2017, SLACK Incorporated.

  17. Combining information from multiple flood projections in a hierarchical Bayesian framework

    Science.gov (United States)

    Le Vine, Nataliya

    2016-04-01

    This study demonstrates, in the context of flood frequency analysis, the potential of a recently proposed hierarchical Bayesian approach to combine information from multiple models. The approach explicitly accommodates shared multimodel discrepancy as well as the probabilistic nature of the flood estimates, and treats the available models as a sample from a hypothetical complete (but unobserved) set of models. The methodology is applied to flood estimates from multiple hydrological projections (the Future Flows Hydrology data set) for 135 catchments in the UK. The advantages of the approach are shown to be: (1) to ensure adequate "baseline" with which to compare future changes; (2) to reduce flood estimate uncertainty; (3) to maximize use of statistical information in circumstances where multiple weak predictions individually lack power, but collectively provide meaningful information; (4) to diminish the importance of model consistency when model biases are large; and (5) to explicitly consider the influence of the (model performance) stationarity assumption. Moreover, the analysis indicates that reducing shared model discrepancy is the key to further reduction of uncertainty in the flood frequency analysis. The findings are of value regarding how conclusions about changing exposure to flooding are drawn, and to flood frequency change attribution studies.
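    The full hierarchical Bayesian machinery is beyond a short example, but the core idea of treating the available models as a sample from a larger population, with a shared between-model discrepancy, can be approximated by classical random-effects pooling of per-model flood quantile estimates. This is a simplification of the paper's approach, and all numbers are invented:

```python
import numpy as np

def pool_flood_estimates(estimates, variances):
    """Random-effects pooling of per-model 100-yr flood estimates.

    A minimal stand-in for the hierarchical Bayesian combination: a shared
    between-model variance tau^2 (DerSimonian-Laird moment estimate) inflates
    each model's own uncertainty before precision weighting, so no single
    overconfident model dominates.
    """
    y = np.asarray(estimates, dtype=float)
    s2 = np.asarray(variances, dtype=float)
    w = 1.0 / s2
    y_fixed = np.sum(w * y) / np.sum(w)
    # Cochran's Q heterogeneity statistic and the moment estimate of tau^2
    q = np.sum(w * (y - y_fixed) ** 2)
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)
    w_star = 1.0 / (s2 + tau2)
    mean = np.sum(w_star * y) / np.sum(w_star)
    var = 1.0 / np.sum(w_star)
    return mean, var
```

    When the models disagree strongly, tau^2 grows and the pooled variance stays honest rather than shrinking with the number of models, mirroring point (4) of the abstract.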

  18. Big Data Revisited

    DEFF Research Database (Denmark)

    Kallinikos, Jannis; Constantiou, Ioanna

    2015-01-01

    We elaborate on key issues of our paper New games, new rules: big data and the changing context of strategy as a means of addressing some of the concerns raised by the paper's commentators. We initially deal with the issue of social data and the role it plays in the current data revolution and the technological recording of facts. We further discuss the significance of the very mechanisms by which big data is produced, as distinct from the attributes of big data often discussed in the literature. In the final section of the paper, we qualify the alleged importance of algorithms and claim that the structures of data capture and the architectures in which data generation is embedded are fundamental to the phenomenon of big data.

  19. Big Data in industry

    Science.gov (United States)

    Latinović, T. S.; Preradović, D. M.; Barz, C. R.; Latinović, M. T.; Petrica, P. P.; Pop-Vadean, A.

    2016-08-01

    The amount of data at the global level has grown exponentially. Along with this phenomenon, we need new units of measure such as the exabyte, zettabyte and yottabyte to express the amount of data. The growth of data has created a situation in which the classic systems for the collection, storage, processing and visualization of data are losing the battle against the large amount, speed and variety of data that is generated continuously. Much of this data is created by the Internet of Things (IoT): cameras, satellites, cars, GPS navigation, etc. It is our challenge to come up with new technologies and tools for the management and exploitation of these large amounts of data. Big Data has been a hot topic in IT circles in recent years; it is also recognized in the business world and, increasingly, in public administration. This paper proposes an ontology of big data analytics and examines how to enhance business intelligence through big data analytics as a service by presenting a big data analytics service-oriented architecture. This paper also discusses the interrelationship between business intelligence and big data analytics. The proposed approach in this paper might facilitate the research and development of business analytics, big data analytics, and business intelligence as well as intelligent agents.

  20. Going beyond the flood insurance rate map: insights from flood hazard map co-production

    Directory of Open Access Journals (Sweden)

    A. Luke

    2018-04-01

    Full Text Available Flood hazard mapping in the United States (US) is deeply tied to the National Flood Insurance Program (NFIP). Consequently, publicly available flood maps provide essential information for insurance purposes, but they do not necessarily provide relevant information for non-insurance aspects of flood risk management (FRM) such as public education and emergency planning. Recent calls for flood hazard maps that support a wider variety of FRM tasks highlight the need to deepen our understanding of the factors that make flood maps useful and understandable for local end users. In this study, social scientists and engineers explore opportunities for improving the utility and relevance of flood hazard maps through the co-production of maps responsive to end users' FRM needs. Specifically, two-dimensional flood modeling produced a set of baseline hazard maps for stakeholders of the Tijuana River valley, US, and Los Laureles Canyon in Tijuana, Mexico. Focus groups with natural resource managers, city planners, emergency managers, academia, non-profit organizations, and community leaders refined the baseline hazard maps by triggering additional modeling scenarios and map revisions. Several important end user preferences emerged, such as (1) legends that frame flood intensity both qualitatively and quantitatively, and (2) flood scenario descriptions that report flood magnitude in terms of rainfall, streamflow, and its relation to a historic event. Regarding desired hazard map content, end users' requests revealed general consistency with mapping needs reported in European studies and guidelines published in Australia. However, requested map content that is not commonly produced included (1) standing water depths following the flood, (2) the erosive potential of flowing water, and (3) pluvial flood hazards, or flooding caused directly by rainfall. We conclude that the relevance and utility of commonly produced flood hazard maps can be most improved by illustrating

  1. Forecasting in an integrated surface water-ground water system: The Big Cypress Basin, South Florida

    Science.gov (United States)

    Butts, M. B.; Feng, K.; Klinting, A.; Stewart, K.; Nath, A.; Manning, P.; Hazlett, T.; Jacobsen, T.

    2009-04-01

    The South Florida Water Management District (SFWMD) manages and protects the state's water resources on behalf of 7.5 million South Floridians and is the lead agency in restoring America's Everglades - the largest environmental restoration project in US history. Many of the projects to restore and protect the Everglades ecosystem are part of the Comprehensive Everglades Restoration Plan (CERP). The region has a unique hydrological regime, with close connection between surface water and groundwater, and a complex managed drainage network with many structures. Added to the physical complexity are the conflicting needs of the ecosystem for protection and restoration, versus the substantial urban development with the accompanying water supply, water quality and flood control issues. In this paper a novel forecasting and real-time modelling system is presented for the Big Cypress Basin. The Big Cypress Basin includes 272 km of primary canals and 46 water control structures throughout the area that provide limited levels of flood protection, as well as water supply and environmental quality management. This system is linked to the South Florida Water Management District's extensive real-time (SCADA) data monitoring and collection system. Novel aspects of this system include the use of a fully distributed and integrated modeling approach and a new filter-based updating approach for accurately forecasting river levels. Because of the interaction between surface water and groundwater, a fully integrated forecast modeling approach is required. Indeed, in the results for Tropical Storm Fay in 2008, the groundwater levels show an extremely rapid response to heavy rainfall. Analysis of this storm also shows that updating levels in the river system can have a direct impact on groundwater levels.

  2. Multivariate pluvial flood damage models

    International Nuclear Information System (INIS)

    Van Ootegem, Luc; Verhofstadt, Elsy; Van Herck, Kristine; Creten, Tom

    2015-01-01

    Depth-damage functions, relating monetary flood damage to the depth of inundation, are commonly used in the case of fluvial floods (floods caused by a river overflowing). We construct four multivariate damage models for pluvial floods (caused by extreme rainfall) by differentiating between ground floor floods and basement floods on the one hand, and between damage to residential buildings and damage to housing contents on the other. We not only take into account the effect of flood depth on damage, but also incorporate the effects of non-hazard indicators (building characteristics, behavioural indicators and socio-economic variables). By using a Tobit estimation technique on identified victims of pluvial floods in Flanders (Belgium), we take into account cases of reported zero damage. Our results show that flood depth is an important predictor of damage, but with a diverging impact between ground floor floods and basement floods. Non-hazard indicators are also important. For example, being aware of the risk just before the water enters the building reduces content damage considerably, underlining the importance of warning systems and policy in the case of pluvial floods. - Highlights: • Prediction of pluvial flood damage using also non-hazard information. • We include 'no damage' cases using a Tobit model. • The effect of flood depth is stronger for ground floor than for basement floods. • Non-hazard indicators are especially important for content damage. • Potential gain of policies that increase awareness of flood risks.
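
    As an illustrative aside, the censored-regression approach described in this record can be sketched as a generic Type I Tobit model, left-censored at zero damage. The sketch below uses synthetic data; the predictors (a flood depth and a risk-awareness dummy), the coefficients, and the sample size are hypothetical and are not taken from the study.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def tobit_negloglik(params, X, y):
    """Negative log-likelihood of a Type I Tobit model, left-censored at 0."""
    beta, log_sigma = params[:-1], params[-1]
    sigma = np.exp(log_sigma)
    xb = X @ beta
    censored = y <= 0
    # zero-damage cases contribute P(latent damage <= 0);
    # positive-damage cases contribute a normal density term
    ll = np.where(censored,
                  norm.logcdf(-xb / sigma),
                  norm.logpdf((y - xb) / sigma) - np.log(sigma))
    return -ll.sum()

rng = np.random.default_rng(0)
n = 500
depth = rng.uniform(0, 2, n)                  # hypothetical flood depth (m)
aware = rng.integers(0, 2, n).astype(float)   # hypothetical awareness dummy
X = np.column_stack([np.ones(n), depth, aware])
latent = -1.0 + 3.0 * depth - 1.5 * aware + rng.normal(0, 1.0, n)
y = np.maximum(latent, 0.0)                   # reported damage; zeros stay in sample

res = minimize(tobit_negloglik, np.zeros(X.shape[1] + 1),
               args=(X, y), method="BFGS")
beta_hat, sigma_hat = res.x[:-1], np.exp(res.x[-1])
```

    A naive OLS fit on the same data would attenuate the depth coefficient, because it treats the zero-damage observations as exact outcomes rather than censored ones; the Tobit likelihood above handles them explicitly.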

  3. Multivariate pluvial flood damage models

    Energy Technology Data Exchange (ETDEWEB)

    Van Ootegem, Luc [HIVA — University of Louvain (Belgium); SHERPPA — Ghent University (Belgium); Verhofstadt, Elsy [SHERPPA — Ghent University (Belgium); Van Herck, Kristine; Creten, Tom [HIVA — University of Louvain (Belgium)

    2015-09-15

    Depth-damage functions, relating monetary flood damage to the depth of inundation, are commonly used in the case of fluvial floods (floods caused by a river overflowing). We construct four multivariate damage models for pluvial floods (caused by extreme rainfall) by differentiating between ground floor floods and basement floods on the one hand, and between damage to residential buildings and damage to housing contents on the other. We not only take into account the effect of flood depth on damage, but also incorporate the effects of non-hazard indicators (building characteristics, behavioural indicators and socio-economic variables). By using a Tobit estimation technique on identified victims of pluvial floods in Flanders (Belgium), we take into account cases of reported zero damage. Our results show that flood depth is an important predictor of damage, but with a diverging impact between ground floor floods and basement floods. Non-hazard indicators are also important. For example, being aware of the risk just before the water enters the building reduces content damage considerably, underlining the importance of warning systems and policy in the case of pluvial floods. - Highlights: • Prediction of pluvial flood damage using also non-hazard information. • We include 'no damage' cases using a Tobit model. • The effect of flood depth is stronger for ground floor than for basement floods. • Non-hazard indicators are especially important for content damage. • Potential gain of policies that increase awareness of flood risks.

  4. Big Data Analytics An Overview

    Directory of Open Access Journals (Sweden)

    Jayshree Dwivedi

    2015-08-01

    Full Text Available Big data refers to data beyond the storage capacity and processing power of conventional systems. The term is used for data sets so large or complex that traditional data-processing tools cannot handle them. The size of big data is a constantly moving target, ranging year by year from a few dozen terabytes to many petabytes; on social networking sites, for example, the amount of data produced by people is growing rapidly every year. Big data is not only data; it has become a complete subject that includes various tools, techniques, and frameworks. It encompasses the growth and evolution of data, both structured and unstructured. Big data is a set of techniques and technologies that require new forms of integration to uncover large hidden values from large datasets that are diverse, complex, and of a massive scale. Such data is difficult to work with using most relational database management systems and desktop statistics and visualization packages, requiring instead massively parallel software running on tens, hundreds, or even thousands of servers. A big data environment is used to acquire, organize, and analyse these various types of data. In this paper we describe applications, problems, and tools of big data and give an overview of big data.

  5. A European Flood Database: facilitating comprehensive flood research beyond administrative boundaries

    Directory of Open Access Journals (Sweden)

    J. Hall

    2015-06-01

    Full Text Available The current work addresses one of the key building blocks towards an improved understanding of flood processes and associated changes in flood characteristics and regimes in Europe: the development of a comprehensive, extensive European flood database. The presented work results from ongoing cross-border research collaborations initiated with data collection and joint interpretation in mind. A detailed account of the current state, characteristics, and spatial and temporal coverage of the European Flood Database is presented. At this stage, the hydrological data collection is still growing and currently consists of annual maximum and daily mean discharge series from over 7000 hydrometric stations with data series of various lengths. Moreover, the database currently comprises data from over 50 different data sources. The time series have been obtained from different national and regional data sources in a collaborative effort under a joint European flood research agreement based on the exchange of data, models and expertise, and from existing international data collections and open-source websites. These ongoing efforts are contributing to advancing the understanding of regional flood processes beyond individual country boundaries and to more coherent flood research in Europe.

  6. -Omic and Electronic Health Records Big Data Analytics for Precision Medicine

    Science.gov (United States)

    Wu, Po-Yen; Cheng, Chih-Wen; Kaddi, Chanchala D.; Venugopalan, Janani; Hoffman, Ryan; Wang, May D.

    2017-01-01

    Objective: Rapid advances of high-throughput technologies and wide adoption of electronic health records (EHRs) have led to fast accumulation of -omic and EHR data. These voluminous complex data contain abundant information for precision medicine, and big data analytics can extract such knowledge to improve the quality of health care. Methods: In this article, we present -omic and EHR data characteristics, associated challenges, and data analytics including data pre-processing, mining, and modeling. Results: To demonstrate how big data analytics enables precision medicine, we provide two case studies, including identifying disease biomarkers from multi-omic data and incorporating -omic information into EHR. Conclusion: Big data analytics is able to address -omic and EHR data challenges for a paradigm shift towards precision medicine. Significance: Big data analytics makes sense of -omic and EHR data to improve healthcare outcomes. It has long-lasting societal impact. PMID:27740470

  7. Floods in Colorado

    Science.gov (United States)

    Follansbee, Robert; Sawyer, Leon R.

    1948-01-01

    The first records of floods in Colorado antedated the settlement of the State by about 30 years. These were records of floods on the Arkansas and Republican Rivers in 1826. Other floods noted by traders, hunters and emigrants, some of whom were on their way to the Far West, occurred in 1844 on the Arkansas River, and by inference on the South Platte River. Other early floods were those on the Purgatoire, the Lower Arkansas, and the San Juan Rivers about 1859. The most serious flood since settlement began was that on the Arkansas River during June 1921, which caused the loss of about 100 lives and an estimated property loss of $19,000,000. Many floods of lesser magnitude have occurred, and some of these have caused loss of life and very considerable property damage. Topography is the chief factor in determining the location of storms and resulting floods. These occur most frequently on the eastern slope of the Front Range. In the mountains farther west precipitation is insufficient to cause floods except during periods of melting snow, in June. In the southwestern part of the State, where precipitation during periods of melting snow is insufficient to cause floods, the severest floods yet experienced resulted from heavy rains in September 1909 and October 1911. In the eastern foothills region, usually below an altitude of about 7,500 feet and extending for a distance of about 50 miles east of the mountains, is a zone subject to rainfalls of great intensity known as cloudbursts. These cloudbursts are of short duration and are confined to very small areas. At times the intensity is so great as to make breathing difficult for those exposed to a storm. The areas of intense rainfall are so small that Weather Bureau precipitation stations have not been located in them. Local residents, being cloudburst conscious, frequently measure the rainfall in receptacles in their yards, and such records constitute the only source of information regarding the intensity. A flood

  8. Socio-hydrological modelling of floods: investigating community resilience, adaptation capacity and risk

    Science.gov (United States)

    Ciullo, Alessio; Viglione, Alberto; Castellarin, Attilio

    2016-04-01

    Changes in flood risk occur because of changes in climate and hydrology, and in societal exposure and vulnerability. Research on changing flood risk has demonstrated that the mutual interactions and continuous feedbacks between floods and societies have to be taken into account in flood risk management. The present work builds on an existing conceptual model of a hypothetical city located in the proximity of a river, along whose floodplains the community evolves over time. The model reproduces the dynamic co-evolution of four variables: flooding, population density of the floodplain, amount of structural protection measures, and memory of floods. These variables are then combined in a way that mimics the temporal change of community resilience, defined as the (inverse of the) amount of time for the community to recover from a shock, and adaptation capacity, defined as the ratio between damages due to subsequent events. Temporally changing exposure, vulnerability and probability of flooding are also modelled, which results in a dynamically varying flood risk. Examples are provided that show how factors such as collective memory and risk-taking attitude influence the dynamics of community resilience, adaptation capacity and risk.
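
    As an illustrative aside, the co-evolution of the four variables described in this record can be sketched as a toy annual-step simulation. Every functional form and coefficient below (the synthetic flood series, the memory decay rate, the levee-raising and settlement rules) is an invented illustration of the feedback structure, not the model actually used in the study.

```python
import numpy as np

rng = np.random.default_rng(42)
years = 200

# state variables (dimensionless; all coefficients are illustrative assumptions)
density = 0.1      # fraction of the floodplain that is settled
levee = 0.0        # height of structural protection
memory = 0.0       # community memory of past floods

history = []
for t in range(years):
    water = rng.gumbel(0.2, 0.3)               # synthetic annual flood level
    if water > levee:                          # levee overtopped: damage occurs
        damage = min(1.0, water - levee) * density
        memory += damage                       # shocks refresh collective memory
        levee += 0.4 * damage                  # society raises protection
    else:
        damage = 0.0
    memory *= 0.9                              # memory decays ~10 % per year
    # settlement grows logistically, but memory discourages floodplain occupation
    density += 0.03 * density * (1 - density) * (1 - 2 * memory)
    density = float(np.clip(density, 0.0, 1.0))
    history.append((damage, density, levee, memory))

damages = [h[0] for h in history]
```

    Plotting `history` shows the qualitative feedbacks the abstract describes: floods refresh memory and trigger levee heightening, while fresh memory temporarily suppresses floodplain settlement.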

  9. Sustainable flood memories, lay knowledges and the development of community resilience to future flood risk

    Directory of Open Access Journals (Sweden)

    McEwen Lindsey

    2016-01-01

    Full Text Available Shifts to devolved flood risk management in the UK pose questions about how the changing role of floodplain residents in community-led adaptation planning can be supported and strengthened. This paper shares insights from an interdisciplinary research project that has proposed the concept of 'sustainable flood memory' in the context of effective flood risk management. The research aimed to increase understanding of whether and how flood memories from the UK summer 2007 extreme floods provide a platform for developing lay knowledges and flood resilience. The project investigated what factors link flood memory and lay knowledges of flooding, and how these connect and disconnect during and after flood events. In particular, and in relation to flood governance directions, we sought to explore how such memories might play a part in individual and community resilience. The research presented here explores some key themes drawn from semi-structured interviews with floodplain residents with recent flood experiences in contrasting demographic and physical settings in the lower River Severn catchment. These include changing practices in making flood memories and materialising flood knowledge, and the roles of active remembering and active forgetting.

  10. Flood Impacts on People: from Hazard to Risk Maps

    Science.gov (United States)

    Arrighi, C.; Castelli, F.

    2017-12-01

    The mitigation of adverse consequences of floods on people is crucial for civil protection and public authorities. According to several studies, in developed countries the majority of flood-related fatalities occur due to inappropriate high-risk behaviours such as driving and walking in floodwaters. In this work, the loss of stability of both vehicles and pedestrians in floodwaters is analysed. Flood hazard is evaluated based on (i) a 2D inundation model of an urban area, (ii) 3D hydrodynamic simulations of water flows around vehicles and the human body, and (iii) a dimensional analysis of experimental activity. Exposure and vulnerability of vehicles and population are assessed by exploiting several sources of open GIS data in order to produce risk maps for a test case study. The results show that a significant hazard to vehicles and pedestrians exists in the study area. The hazard to vehicles is particularly high: vehicles are likely to be swept away by the flood flow, possibly aggravating damage to structures and infrastructure and locally altering the flood propagation. The exposure and vulnerability analysis identifies some structures, such as schools and public facilities, which may attract many people. Moreover, some shopping facilities in the area, which attract both vehicular and pedestrian circulation, are located in the highest flood hazard zone. The application of the method demonstrates that, at the municipal level, such risk maps can support civil defence strategies and education for active citizenship, thus contributing to flood impact reduction for the population.

  11. Building regional early flood warning systems by AI techniques

    Science.gov (United States)

    Chang, F. J.; Chang, L. C.; Amin, M. Z. B. M.

    2017-12-01

    Building an early flood warning system is essential for protecting residents against flood hazards and taking actions to mitigate losses. This study implements AI technology for forecasting multi-step-ahead regional flood inundation maps during storm events. The methodology includes three major schemes: (1) configuring the self-organizing map (SOM) to categorize a large number of regional inundation maps into a meaningful topology; (2) building dynamic neural networks to forecast multi-step-ahead average inundated depths (AID); and (3) adjusting the weights of the selected neuron in the constructed SOM based on the forecasted AID to obtain real-time regional inundation maps. The proposed models are trained and tested on a large number of inundation data sets collected in the regions of the river basin with the most frequent and serious flooding. The results show that the SOM topological relationships between individual neurons and their neighbouring neurons are visible and clearly distinguishable, and that the hybrid model can continuously provide multi-step-ahead regional inundation maps with high resolution during storm events, with relatively small RMSE values and high R2 compared with numerical simulation data sets. The computing time is only a few seconds, thereby enabling real-time regional flood inundation forecasting and an early flood inundation warning system. We demonstrate that the proposed hybrid ANN-based model has a robust and reliable predictive ability and can be used for early warning to mitigate flood disasters.
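
    As an illustrative aside, the first scheme in this record (categorizing inundation maps with a SOM) can be sketched with a minimal NumPy implementation. The 8x8 "maps", the 4x4 lattice, and the learning-rate and radius schedules below are hypothetical choices for a self-contained toy example, not the configuration used in the study.

```python
import numpy as np

rng = np.random.default_rng(1)

# synthetic "inundation maps": 8x8 depth grids flattened to 64-vectors,
# drawn around three hypothetical flood patterns (100 noisy samples each)
patterns = rng.uniform(0, 2, size=(3, 64))
maps = np.vstack([p + rng.normal(0, 0.1, size=(100, 64)) for p in patterns])

# a small 4x4 SOM lattice: neuron positions and weight vectors
grid = np.array([(i, j) for i in range(4) for j in range(4)], dtype=float)
weights = rng.uniform(0, 2, size=(16, 64))

for epoch in range(20):
    lr = 0.5 * (1 - epoch / 20)            # learning rate decays over epochs
    radius = 2.0 * (1 - epoch / 20) + 0.5  # neighbourhood radius shrinks
    for idx in rng.permutation(len(maps)):
        v = maps[idx]
        # best-matching unit = neuron with the closest weight vector
        bmu = np.argmin(((weights - v) ** 2).sum(axis=1))
        d2 = ((grid - grid[bmu]) ** 2).sum(axis=1)
        h = np.exp(-d2 / (2 * radius ** 2))  # Gaussian neighbourhood function
        weights += lr * h[:, None] * (v - weights)

# assign each map to the neuron whose weight vector is closest
assign = np.argmin(((maps[:, None, :] - weights[None]) ** 2).sum(axis=2), axis=1)
```

    After training, maps generated from the same underlying flood pattern cluster on the same or neighbouring neurons, which is the "meaningful topology" the abstract refers to.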

  12. Urbanising Big

    DEFF Research Database (Denmark)

    Ljungwall, Christer

    2013-01-01

    Development in China raises the question of how big a city can become, and at the same time be sustainable, writes Christer Ljungwall of the Swedish Agency for Growth Policy Analysis.

  13. Big bang nucleosynthesis

    International Nuclear Information System (INIS)

    Boyd, Richard N.

    2001-01-01

    The precision of measurements in modern cosmology has made huge strides in recent years, with measurements of the cosmic microwave background and the determination of the Hubble constant now rivaling the level of precision of the predictions of big bang nucleosynthesis. However, these results are not necessarily consistent with the predictions of the Standard Model of big bang nucleosynthesis. Reconciling these discrepancies may require extensions of the basic tenets of the model, and possibly of the reaction rates that determine the big bang abundances

  14. Long-term reactions of plants and macroinvertebrates to extreme floods in floodplain grasslands.

    Science.gov (United States)

    Ilg, Christiane; Dziock, Frank; Foeckler, Francis; Follner, Klaus; Gerisch, Michael; Glaeser, Judith; Rink, Anke; Schanowski, Arno; Scholz, Mathias; Deichner, Oskar; Henle, Klaus

    2008-09-01

    Extreme summertime flood events are expected to become more frequent in European rivers due to climate change. In temperate areas, where winter floods are common, extreme floods occurring in summer, a period of high physiological activity, may seriously impact floodplain ecosystems. Here we report on the effects of the 2002 extreme summer flood on flora and fauna of the riverine grasslands of the Middle Elbe (Germany), comparing pre- and post-flooding data collected by identical methods. Plants, mollusks, and carabid beetles differed considerably in their response in terms of abundance and diversity. Plants and mollusks, displaying morphological and behavioral adaptations to flooding, showed higher survival rates than the carabid beetles, the adaptation strategies of which were mainly linked to life history. Our results illustrate the complexity of responses of floodplain organisms to extreme flood events. They demonstrate that the efficiency of resistance and resilience strategies is widely dependent on the mode of adaptation.

  15. Determining the Financial Impact of Flood Hazards in Ungaged Basins

    Science.gov (United States)

    Cotterman, K. A.; Gutenson, J. L.; Pradhan, N. R.; Byrd, A.

    2017-12-01

    Many portions of the Earth lack adequate authoritative or in situ data that is of great value in determining natural hazard vulnerability from both anthropogenic and physical perspectives. Such locations include the majority of developing nations, which do not possess adequate warning systems and protective infrastructure. The lack of warning and protection from natural hazards makes these nations vulnerable to the destructive power of events such as floods. The goal of this research is to demonstrate an initial workflow with which to characterize flood financial hazards with global datasets and crowd-sourced, non-authoritative data in ungaged river basins. This workflow includes the hydrologic and hydraulic response of the watershed to precipitation, characterized by the physics-based Gridded Surface-Subsurface Hydrologic Analysis (GSSHA) model. In addition, data infrastructure and resources are available to approximate the human impact of flooding. Open-source, volunteered geographic information (VGI) data can provide global coverage of elements at risk of flooding. Additional valuation mechanisms can then translate flood exposure into percentage and financial damage to each building. The combination of these tools allows the authors to remotely assess flood hazards with minimal computational, temporal, and financial overhead. This combination of deterministic and stochastic modeling provides the means to quickly characterize watershed flood vulnerability and will allow emergency responders and planners to better understand the implications of flooding, both spatially and financially. In either a planning, real-time, or forecasting scenario, the system will assist the user in understanding basin flood vulnerability and increasing community resiliency and preparedness.

  16. A new modelling framework and mitigation measures for increased resilience to flooding

    Science.gov (United States)

    Valyrakis, Manousos; Alexakis, Athanasios; Solley, Mark

    2015-04-01

    Flooding in rivers and estuaries is amongst the most significant challenges our society has yet to tackle effectively. The use of floodwall systems is one of the potential measures that can mitigate the detrimental socio-economic and ecological impacts and alleviate the associated costs of flooding. This work demonstrates the utility of such systems for a case study via appropriate numerical simulations, in addition to conducting scaled flume experiments towards obtaining a better understanding of the performance and efficiency of floodwall systems. First, the results of several characteristic inundation modelling scenarios and flood mitigation options are presented for a flood-prone region in Scotland. In particular, the history and hydrology of the area are discussed, and the assumptions and hydraulic model input (model geometry including in-stream hydraulic structures, such as bridges and weirs, river and floodplain roughness, and initial and boundary conditions) are presented, followed by the model results. Emphasis is placed on the potential improvements brought about by mitigating flood risk using floodwall systems. Further, the implementation of the floodwall in mitigating flood risk is demonstrated via appropriate numerical modelling, utilizing HEC-RAS to simulate the effect of a river's rising stage during a flood event for a specific area. The latter part of this work involves the design, building and utilization of a scaled physical model of a floodwall system. These experiments are carried out in one of the research flumes of the Water Engineering laboratory of the University of Glasgow. They involve an experimental investigation in which the increase of force applied on the floodwall is measured for different degrees of deflection of the water in the stream, under the maximum flow discharge that can be carried through without exceeding the floodwall height (and accounting for the effect of super-elevation). These results can be considered upon the

  17. Adaptation to flood risk: Results of international paired flood event studies

    NARCIS (Netherlands)

    Kreibich, Heidi; Di Baldassarre, G.; Vorogushyn, Sergiy; Aerts, J.C.J.H.; Apel, H.; Aronica, G.T.; Arnbjerg-Nielsen, K.; Bouwer, L.; Bubeck, P.; Caloiero, Tommaso; Chinh, Do. T.; Cortès, Maria; Gain, A.K.; Giampá, Vincenzo; Kuhlicke, C; Kundzewicz, Z.W.; Carmen Llasat, M; Mård, Johanna; Matczak, Piotr; Mazzoleni, Maurizio; Molinari, Daniela; Dung, N.V.; Petrucci, Olga; Schröter, Kai; Slager, Kymo; Thieken, A.H.; Ward, P.J.; Merz, B.

    2017-01-01

    As flood impacts are increasing in large parts of the world, understanding the primary drivers of changes in risk is essential for effective adaptation. To gain more knowledge on the basis of empirical case studies, we analyze eight paired floods, that is, consecutive flood events that occurred in

  18. Advances in mobile cloud computing and big data in the 5G era

    CERN Document Server

    Mastorakis, George; Dobre, Ciprian

    2017-01-01

    This book reports on the latest advances in the theories, practices, standards and strategies related to two modern technology paradigms, Mobile Cloud Computing (MCC) and Big Data, and their association with the emerging 5G mobile networks. The book includes 15 rigorously refereed chapters written by leading international researchers, providing readers with technical and scientific information about various aspects of Big Data and Mobile Cloud Computing, from basic concepts to advanced findings, and reporting the state of the art in Big Data management. It demonstrates and discusses methods and practices to improve multi-source Big Data manipulation techniques, as well as the integration of resource availability through the 3As (Anywhere, Anything, Anytime) paradigm, using the 5G access technologies.

  19. The ethics of big data in big agriculture

    OpenAIRE

    Carbonell (Isabelle M.)

    2016-01-01

    This paper examines the ethics of big data in agriculture, focusing on the power asymmetry between farmers and large agribusinesses like Monsanto. Following the recent purchase of Climate Corp., Monsanto is currently the most prominent biotech agribusiness to buy into big data. With wireless sensors on tractors monitoring or dictating every decision a farmer makes, Monsanto can now aggregate large quantities of previously proprietary farming data, enabling a privileged position with unique in...

  20. Prehistoric floods on the Tennessee River—Assessing the use of stratigraphic records of past floods for improved flood-frequency analysis

    Science.gov (United States)

    Harden, Tessa M.; O'Connor, Jim E.

    2017-06-14

    Stratigraphic analysis, coupled with geochronologic techniques, indicates that a rich history of large Tennessee River floods is preserved in the Tennessee River Gorge area. Deposits of flood sediment from the 1867 peak discharge of record (460,000 cubic feet per second at Chattanooga, Tennessee) are preserved at many locations throughout the study area at sites with flood-sediment accumulation. Small exposures at two boulder overhangs reveal evidence of three to four other floods similar in size to, or larger than, the 1867 flood in the last 3,000 years, one possibly as much as or more than 50 percent larger. Records of floods are also preserved in stratigraphic sections at the mouth of the gorge at Williams Island and near Eaves Ferry, about 70 river miles upstream of the gorge. These stratigraphic records may extend as far back as about 9,000 years ago, giving a long history of Tennessee River floods. Although more evidence is needed to confirm these findings, a more in-depth comprehensive paleoflood study is feasible for the Tennessee River.

  1. The big data-big model (BDBM) challenges in ecological research

    Science.gov (United States)

    Luo, Y.

    2015-12-01

    The field of ecology has become a big-data science in the past decades due to the development of new sensors used in numerous studies across the ecological community. Many sensor networks have been established to collect data. For example, satellites such as Terra and OCO-2, among others, have collected data relevant to the global carbon cycle. Thousands of field manipulative experiments have been conducted to examine feedbacks of the terrestrial carbon cycle to global changes. Networks of observations, such as FLUXNET, have measured land processes. In particular, the implementation of the National Ecological Observatory Network (NEON), which is designed to network different kinds of sensors at many locations over the nation, will generate large volumes of ecological data every day. The raw data from those sensor networks offer an unprecedented opportunity for accelerating advances in our knowledge of ecological processes, educating teachers and students, supporting decision-making, testing ecological theory, and forecasting changes in ecosystem services. Currently, ecologists do not have the infrastructure in place to synthesize massive yet heterogeneous data into resources for decision support. It is urgent to develop an ecological forecasting system that can make the best use of multiple sources of data to assess long-term biosphere change and anticipate future states of ecosystem services at regional and continental scales. Forecasting relies on big models that describe the major processes underlying complex system dynamics. Ecological system models, despite greatly simplifying the real systems, are still complex in order to address real-world problems. For example, the Community Land Model (CLM) incorporates thousands of processes related to energy balance, hydrology, and biogeochemistry. Integration of massive data from multiple big data sources with complex models has to tackle Big Data-Big Model (BDBM) challenges. Those challenges include interoperability of multiple

  2. A Big Video Manifesto

    DEFF Research Database (Denmark)

    Mcilvenny, Paul Bruce; Davidsen, Jacob

    2017-01-01

    For the last few years, we have witnessed a hype about the potential results and insights that quantitative big data can bring to the social sciences. The wonder of big data has moved into education, traffic planning, and disease control with a promise of making things better with big numbers and beautiful visualisations. However, we also need to ask what the tools of big data can do both for the Humanities and for more interpretative approaches and methods. Thus, we prefer to explore how the power of computation, new sensor technologies and massive storage can also help with video-based qualitative...

  3. Identifying Dwarfs Workloads in Big Data Analytics

    OpenAIRE

    Gao, Wanling; Luo, Chunjie; Zhan, Jianfeng; Ye, Hainan; He, Xiwen; Wang, Lei; Zhu, Yuqing; Tian, Xinhui

    2015-01-01

    Big data benchmarking is particularly important and provides applicable yardsticks for evaluating booming big data systems. However, wide coverage and great complexity of big data computing impose big challenges on big data benchmarking. How can we construct a benchmark suite using a minimum set of units of computation to represent diversity of big data analytics workloads? Big data dwarfs are abstractions of extracting frequently appearing operations in big data computing. One dwarf represen...

  4. Floods in Serbia in the 1999-2009 period: Hydrological analysis and flood protection measures

    Directory of Open Access Journals (Sweden)

    Milanović Ana

    2010-01-01

    Full Text Available This paper reviews the greatest floods recorded in Vojvodina and central Serbia in the period from 1999 to 2009. For the 13 hydrological stations that recorded the greatest floods in this period, the probability of occurrence of these floods was estimated. Based on an analysis of the time series of maximum discharges and water levels, performed by applying probability theory and mathematical statistics, theoretical probability distribution functions were fitted and the probability of flood occurrence was obtained. The best agreement with the empirical distribution function was most often achieved by the Log-Pearson III and Pearson III distributions. These results can be used for the dimensioning of hydro-technical structures for flood protection. The most significant causes of the floods recorded in this period were snowmelt and intensive rainfall. The current state of flood protection and the future development of flood protection measures are also presented.
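The Log-Pearson III fitting the abstract describes can be sketched as follows; the annual maximum discharges are invented for illustration, and SciPy's `pearson3` is used as the Pearson III family, applied to log-transformed maxima:

```python
import numpy as np
from scipy import stats

# Hypothetical annual maximum discharge series (m^3/s) for one gauge.
ams = np.array([412., 530., 365., 798., 455., 620., 340., 910., 505., 670.,
                388., 742., 560., 830., 470., 615., 395., 700., 525., 880.])

# Log-Pearson III: fit a Pearson III distribution to log10 of the maxima.
log_q = np.log10(ams)
skew, loc, scale = stats.pearson3.fit(log_q)

# Discharge with a given annual exceedance probability, e.g. the 100-year flood.
aep = 0.01
q100 = 10 ** stats.pearson3.ppf(1.0 - aep, skew, loc=loc, scale=scale)
print(f"Estimated 100-year flood: {q100:.0f} m^3/s")
```

The fitted quantiles are what would be used to dimension flood-protection structures for a chosen design return period.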

  5. Social sensing of floods in the UK.

    Science.gov (United States)

    Arthur, Rudy; Boulton, Chris A; Shotton, Humphrey; Williams, Hywel T P

    2018-01-01

    "Social sensing" is a form of crowd-sourcing that involves systematic analysis of digital communications to detect real-world events. Here we consider the use of social sensing for observing natural hazards. In particular, we present a case study that uses data from a popular social media platform (Twitter) to detect and locate flood events in the UK. In order to improve data quality we apply a number of filters (timezone, simple text filters and a naive Bayes 'relevance' filter) to the data. We then use place names in the user profile and message text to infer the location of the tweets. These two steps remove most of the irrelevant tweets and yield orders of magnitude more located tweets than would be available by relying on geo-tagged data alone. We demonstrate that high-resolution social sensing of floods is feasible and that we can produce high-quality historical and real-time maps of floods using Twitter.
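A naive Bayes 'relevance' filter of the kind described can be sketched in a few lines; the hand-labelled tweets, the equal class priors, and the Laplace-smoothed scoring below are illustrative assumptions, not the authors' actual classifier:

```python
import math
from collections import Counter

# Hypothetical hand-labelled tweets: True = relevant flood report, False = not.
TRAIN = [
    ("river burst its banks road under water in town", True),
    ("severe flooding on the high street cars stranded", True),
    ("flood warning issued heavy rain overnight", True),
    ("basement flooded again after the storm", True),
    ("flooded with emails this morning send help", False),
    ("a flood of new albums out this week", False),
    ("my timeline is flooded with memes", False),
    ("flood of tourists in the city centre today", False),
]

def train(examples):
    """Word counts per class, plus the shared vocabulary."""
    counts = {True: Counter(), False: Counter()}
    for text, label in examples:
        counts[label].update(text.split())
    vocab = set(counts[True]) | set(counts[False])
    return counts, vocab

def relevance_log_odds(text, counts, vocab):
    """Naive Bayes log-odds of relevance, Laplace smoothing, equal priors."""
    totals = {c: sum(counts[c].values()) for c in counts}
    score = 0.0
    for word in text.split():
        if word not in vocab:
            continue  # ignore words never seen in training
        p_rel = (counts[True][word] + 1) / (totals[True] + len(vocab))
        p_irr = (counts[False][word] + 1) / (totals[False] + len(vocab))
        score += math.log(p_rel) - math.log(p_irr)
    return score

counts, vocab = train(TRAIN)
tweet = "street completely under water after the river flooded"
print("relevant" if relevance_log_odds(tweet, counts, vocab) > 0 else "irrelevant")
```

Tweets scoring above zero would be kept for the location-inference step; metaphorical uses of "flood" tend to score below zero.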

  6. Spatial coherence of flood-rich and flood-poor periods across Germany

    Science.gov (United States)

    Merz, Bruno; Dung, Nguyen Viet; Apel, Heiko; Gerlitz, Lars; Schröter, Kai; Steirou, Eva; Vorogushyn, Sergiy

    2018-04-01

    Despite its societal relevance, the question of whether fluctuations in flood occurrence or magnitude are coherent in space has hardly been addressed in quantitative terms. We investigate this question for Germany by analysing fluctuations in annual maximum series (AMS) values at 68 discharge gauges for the common time period 1932-2005. We find remarkable spatial coherence across Germany given its different flood regimes. For example, flood-rich/-poor years in sub-catchments of the Rhine basin, which are dominated by winter floods, tend to coincide with flood-rich/-poor years in the southern sub-catchments of the Danube basin, which have their dominant flood season in summer. Our findings indicate that coherence is caused more by persistence in catchment wetness than by persistent periods of higher/lower event precipitation. Further, we propose to differentiate between event-type and non-event-type coherence. There are quite a number of hydrological years with considerable non-event-type coherence, i.e. the AMS values of the 68 gauges are spread out through the year but lie in the same magnitude range. Years with extreme flooding tend to be of event-type and non-coherent, i.e. there is at least one precipitation event that affects many catchments to various degrees. Although spatial coherence is a remarkable phenomenon, and large-scale flooding across Germany can lead to severe situations, extreme magnitudes across the whole country within one event or within one year were not observed in the investigated period.
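The mechanism proposed above, that a shared catchment-wetness signal produces spatially coherent flood-rich/-poor years even between gauges with different flood seasons, can be illustrated with a toy simulation; the data are synthetic, not the 68-gauge German dataset:

```python
import numpy as np

# Toy model: two gauges share a catchment-wetness signal but have
# independent event noise (e.g. winter vs summer flood generation).
rng = np.random.default_rng(0)
years = 74  # length of a common period such as 1932-2005
wetness = rng.gamma(2.0, 1.0, size=years)        # shared wetness signal
ams_rhine = wetness + rng.normal(0, 0.3, years)  # winter-flood gauge
ams_danube = wetness + rng.normal(0, 0.3, years) # summer-flood gauge

# Spearman-style coherence: correlate the year ranks of the two AMS series,
# i.e. do flood-rich/-poor years line up even if magnitudes differ?
ranks_a = np.argsort(np.argsort(ams_rhine))
ranks_b = np.argsort(np.argsort(ams_danube))
r = np.corrcoef(ranks_a, ranks_b)[0, 1]
print(f"rank correlation of flood-rich/-poor years: {r:.2f}")
```

With the shared signal dominating the noise, the rank correlation is high; setting the wetness term to independent draws per gauge would drive it toward zero.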

  7. Sex-specific responses to winter flooding, spring waterlogging and post-flooding recovery in Populus deltoides.

    Science.gov (United States)

    Miao, Ling-Feng; Yang, Fan; Han, Chun-Yu; Pu, Yu-Jin; Ding, Yang; Zhang, Li-Jia

    2017-05-31

    Winter flooding events are common in some rivers and streams due to dam constructions, and flooding and waterlogging inhibit the growth of trees in riparian zones. This study investigated sex-specific morphological, physiological and ultrastructural responses to various durations of winter flooding and spring waterlogging stresses, and post-flooding recovery characteristics, in Populus deltoides. There were no significant differences in the morphological, ultrastructural and the majority of physiological traits in trees subjected to medium and severe winter flooding stresses, suggesting that males and females of P. deltoides were winter-flooding tolerant and insensitive to winter flooding duration. Males were more tolerant to winter flooding stress in terms of photosynthesis and chlorophyll fluorescence than females. Females displayed greater oxidative damage due to flooding stress than males. Males developed more efficient antioxidant enzymatic systems to control reactive oxygen species. Both sexes had similarly strong post-flooding recovery capabilities in terms of plant growth and physiological and ultrastructural parameters. However, males had better recovery capabilities in terms of pigment content. These results increase the understanding of poplars' adaptation to winter flooding stress. They also elucidate sex-specific differences in response to flooding stress during the dormant season and during post-flooding recovery periods.

  8. Applications of Big Data in Education

    OpenAIRE

    Faisal Kalota

    2015-01-01

    Big Data and analytics have gained a huge momentum in recent years. Big Data feeds into the field of Learning Analytics (LA) that may allow academic institutions to better understand the learners' needs and proactively address them. Hence, it is important to have an understanding of Big Data and its applications. The purpose of this descriptive paper is to provide an overview of Big Data, the technologies used in Big Data, and some of the applications of Big Data in educa...

  9. Channel Shallowing as Mitigation of Coastal Flooding

    Directory of Open Access Journals (Sweden)

    Philip M. Orton

    2015-07-01

    Full Text Available Here, we demonstrate that reductions in the depth of inlets or estuary channels can be used to reduce or prevent coastal flooding. A validated hydrodynamic model of Jamaica Bay, New York City (NYC), is used to test nature-based adaptation measures in ameliorating flooding for NYC's two largest historical coastal flood events. In addition to control runs with modern bathymetry, three altered landscape scenarios are tested: (1) increasing the area of wetlands to their 1879 footprint and bathymetry, but leaving deep shipping channels unaltered; (2) shallowing all areas deeper than 2 m in the bay to be 2 m below Mean Low Water; (3) shallowing only the narrowest part of the inlet to the bay. These three scenarios are deliberately extreme and designed to evaluate the leverage each approach exerts on water levels. They result in peak water level reductions of 0.3%, 15%, and 6.8% for Hurricane Sandy, and 2.4%, 46% and 30% for the Category-3 hurricane of 1821, respectively (bay-wide averages). These results suggest that shallowing can provide greater flood protection than wetland restoration, and it is particularly effective at reducing "fast-pulse" storm surges that rise and fall quickly over several hours, like that of the 1821 storm. Nonetheless, the goal of flood mitigation must be weighed against economic, navigation, and ecological needs, and practical concerns such as the availability of sediment.

  10. Priming the Pump for Big Data at Sentara Healthcare.

    Science.gov (United States)

    Kern, Howard P; Reagin, Michael J; Reese, Bertram S

    2016-01-01

    Today's healthcare organizations are facing significant demands with respect to managing population health, demonstrating value, and accepting risk for clinical outcomes across the continuum of care. The patient's environment outside the walls of the hospital and physician's office, and outside the electronic health record (EHR), has a substantial impact on clinical care outcomes. The use of big data is key to understanding factors that affect the patient's health status and enhancing clinicians' ability to anticipate how the patient will respond to various therapies. Big data is essential to delivering sustainable, high-quality, value-based healthcare, as well as to the success of new models of care such as clinically integrated networks (CINs) and accountable care organizations. Sentara Healthcare, based in Norfolk, Virginia, has been an early adopter of the technologies that have readied us for our big data journey: EHRs, telehealth-supported electronic intensive care units, and telehealth primary care support through MDLIVE. Although we would not say Sentara is at the cutting edge of the big data trend, it certainly is among the fast followers. Use of big data in healthcare is still at an early stage compared with other industries. Tools for data analytics are maturing, but traditional challenges such as heightened data security and limited human resources remain the primary focus for regional health systems seeking to improve care and reduce costs. Sentara primarily makes actionable use of big data in our CIN, Sentara Quality Care Network, and at our health plan, Optima Health. Big data projects can be expensive, and justifying the expense organizationally has often been easier in times of crisis. We have developed an analytics strategic plan, separate from but aligned with corporate system goals, to ensure optimal investment and management of this essential asset.

  11. Return period assessment of urban pluvial floods through modelling of rainfall–flood response

    DEFF Research Database (Denmark)

    Tuyls, Damian Murla; Thorndahl, Søren Liedtke; Rasmussen, Michael Robdrup

    2018-01-01

    Intense rainfall in urban areas can often generate severe flood impacts. Consequently, it is crucial to design systems to minimize potential flood damages. Traditional, simple design of urban drainage systems assumes agreement between the rainfall return period and its consequent flood return period; however, this does not always apply. Hydraulic infrastructures found in urban drainage systems can increase system heterogeneity and perturb the impact of severe rainfall response. In this study, a surface flood return period assessment was carried out at Lystrup (Denmark), which has received the impact of flooding in recent years. A 35-year rainfall dataset together with a coupled 1D/2D surface and network model was used to analyse and assess the flood return period response. Results show an ambiguous relation between rainfall and flood return periods, indicating that linear rainfall–runoff relationships will...

  12. Big Data Semantics

    NARCIS (Netherlands)

    Ceravolo, Paolo; Azzini, Antonia; Angelini, Marco; Catarci, Tiziana; Cudré-Mauroux, Philippe; Damiani, Ernesto; Mazak, Alexandra; van Keulen, Maurice; Jarrar, Mustafa; Santucci, Giuseppe; Sattler, Kai-Uwe; Scannapieco, Monica; Wimmer, Manuel; Wrembel, Robert; Zaraket, Fadi

    2018-01-01

    Big Data technology has discarded traditional data modeling approaches as no longer applicable to distributed data processing. It is, however, largely recognized that Big Data impose novel challenges in data and infrastructure management. Indeed, multiple components and procedures must be

  13. Flood Risk Management in the People’s Republic of China: Learning to Live with Flood Risk

    OpenAIRE

    Asian Development Bank (ADB); Asian Development Bank (ADB); Asian Development Bank (ADB); Asian Development Bank (ADB)

    2012-01-01

    This publication presents a shift in the People’s Republic of China from flood control depending on structural measures to integrated flood management using both structural and non-structural measures. The core of the new concept of integrated flood management is flood risk management. Flood risk management is based on an analysis of flood hazard, exposure to flood hazard, and vulnerability of people and property to danger. It is recommended that people learn to live with flood risks, gaining...

  14. Influence of spreading urbanization in flood areas on flood damage in Slovenia

    International Nuclear Information System (INIS)

    Komac, B; Zorn, M; Natek, K

    2008-01-01

    Damage caused by natural disasters in Slovenia is frequently linked to the disregard of natural factors in spatial planning. Historically, the construction of buildings and settlements avoided dangerous flood areas, but more recently construction in dangerous areas has increased. During the floods of 1990, the most affected buildings stood on ill-considered locations, and the majority were built in recent times. A similar situation occurred during the floods of September 2007. Comparing the effects of these floods, we determined that damage was consistently greater due to the urbanization of flood areas. This process furthermore increasingly limits the 'manoeuvring space' for water management authorities, who, due to the torrential nature of Slovenia's rivers, cannot ensure the required level of safety from flooding for unsuitably located settlements and infrastructure. Every year, the Environmental Agency of the Republic of Slovenia issues more than one thousand permits for interventions in areas that affect the water regime, and through decrees the government allows construction in riparian zones, which is supposedly forbidden by the Law on Water. If we do not adopt more suitable spatial planning policies, we will no longer be able to reduce the negative consequences of floods. Given that torrential floods strike certain Slovene regions every three years on average, and that larger floods occur at least once a decade, it is senseless to lay the blame on climate change.

  15. FloodProBE: technologies for improved safety of the built environment in relation to flood events

    International Nuclear Information System (INIS)

    Ree, C.C.D.F. van; Van, M.A.; Heilemann, K.; Morris, M.W.; Royet, P.; Zevenbergen, C.

    2011-01-01

    The FloodProBE project started as an FP7 research project in November 2009. Floods, together with wind-related storms, are considered the major natural hazard in the EU in terms of risk to people and assets. In order to adapt urban areas (in river and coastal zones) to prevent flooding, or to be better prepared for floods, decision makers need to determine how to upgrade flood defences, increase the flood resilience of protected buildings and critical infrastructure (power supplies, communications, water, transport, etc.), and assess the expected risk reduction from these measures. The aim of the FloodProBE project is to improve knowledge of flood resilience and flood protection performance for balancing investments in flood risk management in urban areas. To this end, technologies, methods and tools for assessment purposes and for the adaptation of new and existing buildings and critical infrastructure are developed, tested and disseminated. Three priority areas are addressed by FloodProBE: (i) the vulnerability of critical infrastructure and high-density value assets, including direct and indirect damage; (ii) the assessment and reliability of urban flood defences, including the use of geophysical methods and remote sensing techniques; and (iii) concepts and technologies for upgrading weak links in flood defences, as well as construction technologies for flood-proofing buildings and infrastructure networks to increase the flood resilience of the urban system. The primary impact of FloodProBE in advancing knowledge in these areas is an increase in the cost-effectiveness (i.e. performance) of new and existing flood protection structures and flood resilience measures.

  16. Flood Finder: Mobile-based automated water level estimation and mapping during floods

    International Nuclear Information System (INIS)

    Pongsiriyaporn, B; Jariyavajee, C; Laoharawee, N; Narkthong, N; Pitichat, T; Goldin, S E

    2014-01-01

    Every year, Southeast Asia faces numerous flooding disasters, resulting in very high human and economic losses. Responding to a sudden flood is difficult due to the lack of accurate and up-to-date information about the incoming water status. We have developed a mobile application called Flood Finder to solve this problem. Flood Finder allows smartphone users to measure, share and search for water level information at specified locations. The application uses image processing to compute the water level from a photo taken by users. The photo must show a known reference object of standard size. These water levels are more reliable and consistent than human estimates since they are derived from an algorithmic measuring function. Flood Finder uploads water level readings to the server, where they can be searched and mapped by other users via the mobile phone app or standard browsers. Given the widespread availability of smartphones in Asia, Flood Finder can provide more accurate and up-to-date information for better preparation for a flood disaster, as well as for life safety and property protection.

  17. A National Assessment of Changes in Flood Exposure in the United States

    Science.gov (United States)

    Lam, N.; Qiang, Y.; Cai, H.; Zou, L.

    2017-12-01

    Analyzing flood exposure and its temporal trend is the first step toward understanding flood risk, flood hazard, and flood vulnerability. This presentation is based on a national, county-based study assessing the changes in population and urban areas in high-risk flood zones from 2001 to 2011 in the contiguous United States. Satellite land use land cover data, the Federal Emergency Management Agency (FEMA)'s 100-year flood maps, and census data were used to extract the proportion of developed (urban) land in flood zones by county at the two time points, and indices of difference were calculated. The Local Moran's I statistic was applied to identify hotspots of increase in urban area in flood zones, and geographically weighted regression was used to estimate the population in flood zones from the land cover data. Results show that in 2011, an estimated 25.3 million people (8.3% of the total population) lived in high-risk flood zones. Nationally, the ratio of urban development in flood zones is less than the ratio of land in flood zones, implying that Americans were responsive to flood hazards by avoiding development in flood zones. However, this trend varied from place to place, with coastal counties having less urban development in flood zones than inland counties. Furthermore, the contrast between coastal and inland counties increased during 2001-2011. Finally, several exceptions to the trend (hotspots) were detected, most notably New York City and Miami, where significant increases in urban development in flood zones were found. This assessment provides important baseline information on the spatial patterns of flood exposure and their changes from 2001 to 2011. The study pinpoints regions that may need further investigation and better policy to reduce overall flood risks. Methodologically, the study demonstrates that pixelated land cover data can be integrated with other natural and human data to investigate important societal problems. The same

  18. Estimation of Internal Flooding Frequency for Screening Analysis of Flooding PSA

    International Nuclear Information System (INIS)

    Choi, Sun Yeong; Yang, Jun Eon

    2005-01-01

    The purpose of this paper is to estimate the internal flooding frequency for the quantitative screening analysis of a flooding PSA (Probabilistic Safety Assessment) with appropriate data and estimation methods. In the existing flood PSAs for domestic NPPs (Nuclear Power Plants), a screening analysis was performed first and then a detailed analysis was performed for the areas not screened out. For the quantitative screening analysis, the plant-area-based flood frequency estimated by the MLE (Maximum Likelihood Estimation) method was used, while the component-based flood frequency was used for the detailed analysis. The existing quantitative screening analyses for domestic NPPs have used data from all LWRs (Light Water Reactors), namely PWRs (Pressurized Water Reactors) and BWRs (Boiling Water Reactors), for the internal flood frequency of the auxiliary building and turbine building. However, in the case of the primary auxiliary building, the applicability of data from all LWRs needs to be examined carefully because of the significant difference in equipment between the PWR and BWR structures. NUREG/CR-5750 suggested the Bayesian update method with Jeffreys' noninformative prior to estimate the initiating event frequency for floods. It did not, however, describe any flood PSA procedure. Recently, Fleming and Lydell suggested internal flooding frequencies in units of plant operating years and pipe length (in meters), by pipe size, for each specific system susceptible to flooding, such as the service water system and the circulating water system. They used the failure rate and the conditional probability of rupture given failure to estimate the internal flooding frequency, and a Bayesian update to reduce uncertainties. To perform the quantitative screening analysis with this method, the pipe length of each pipe size of the specific system in each divided area is required to change the concept of the component based frequency to the concept of the plant area
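The three frequency estimates discussed above (MLE, the Jeffreys-prior Bayesian update, and a Fleming-Lydell style pipe-length frequency) reduce to simple arithmetic; every number below is assumed for illustration only:

```python
# Hypothetical operating experience for one plant area.
events = 2              # internal flooding events observed
exposure_years = 450.0  # accumulated plant operating years across the fleet

# Maximum likelihood estimate of the annual flooding frequency: n / T.
mle = events / exposure_years

# Bayesian update with Jeffreys' noninformative prior for a Poisson rate:
# posterior mean (n + 0.5) / T, per the NUREG/CR-5750 approach.
jeffreys = (events + 0.5) / exposure_years

# Pipe-length-based frequency (Fleming-Lydell style): failure rate per
# metre-year times the conditional rupture probability, times the pipe
# length of a given size in the area. All three inputs are assumptions.
failure_rate_per_m_yr = 1.0e-6
p_rupture_given_failure = 0.01
pipe_length_m = 120.0
area_freq = failure_rate_per_m_yr * p_rupture_given_failure * pipe_length_m

print(f"MLE: {mle:.2e}/yr, Jeffreys: {jeffreys:.2e}/yr, pipe-based: {area_freq:.2e}/yr")
```

The pipe-length form is what lets a component-based rate be rolled up into a per-area frequency once the pipe inventory of each area is known.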

  19. Amplification of flood frequencies with local sea level rise and emerging flood regimes

    Science.gov (United States)

    Buchanan, Maya K.; Oppenheimer, Michael; Kopp, Robert E.

    2017-06-01

    The amplification of flood frequencies by sea level rise (SLR) is expected to become one of the most economically damaging impacts of climate change for many coastal locations. Understanding the magnitude and pattern by which the frequency of current flood levels increase is important for developing more resilient coastal settlements, particularly since flood risk management (e.g. infrastructure, insurance, communications) is often tied to estimates of flood return periods. The Intergovernmental Panel on Climate Change’s Fifth Assessment Report characterized the multiplication factor by which the frequency of flooding of a given height increases (referred to here as an amplification factor; AF). However, this characterization neither rigorously considered uncertainty in SLR nor distinguished between the amplification of different flooding levels (such as the 10% versus 0.2% annual chance floods); therefore, it may be seriously misleading. Because both historical flood frequency and projected SLR are uncertain, we combine joint probability distributions of the two to calculate AFs and their uncertainties over time. Under probabilistic relative sea level projections, while maintaining storm frequency fixed, we estimate a median 40-fold increase (ranging from 1- to 1314-fold) in the expected annual number of local 100-year floods for tide-gauge locations along the contiguous US coastline by 2050. While some places can expect disproportionate amplification of higher frequency events and thus primarily a greater number of historically precedented floods, others face amplification of lower frequency events and thus a particularly fast growing risk of historically unprecedented flooding. For example, with 50 cm of SLR, the 10%, 1%, and 0.2% annual chance floods are expected respectively to recur 108, 335, and 814 times as often in Seattle, but 148, 16, and 4 times as often in Charleston, SC.
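Under the common assumption that annual-maximum flood heights have a Gumbel upper tail, the amplification factor has a closed form: raising mean sea level by dz multiplies the expected exceedance frequency of any fixed height by exp(dz / lambda), where lambda is the Gumbel scale. The site parameters below are assumptions for illustration, not values from the paper:

```python
import math

def amplification_factor(slr_m: float, gumbel_scale_m: float) -> float:
    """Factor by which the frequency of exceeding a fixed flood height
    increases after sea level rise, assuming a Gumbel tail."""
    return math.exp(slr_m / gumbel_scale_m)

# Assumed numbers: 0.5 m of sea level rise at a site whose annual-maximum
# water levels have a 0.15 m Gumbel scale parameter.
af = amplification_factor(0.50, 0.15)
print(f"The historical 100-year flood recurs about {af:.0f}x as often")
```

A small Gumbel scale (a "flat" flood-height curve, typical of tropical tide-gauge sites) yields very large amplification factors, which is why the same sea level rise produces such different AFs from site to site.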

  20. Enhancement of global flood damage assessments using building material based vulnerability curves

    Science.gov (United States)

    Englhardt, Johanna; de Ruiter, Marleen; de Moel, Hans; Aerts, Jeroen

    2017-04-01

    This study discusses the development of an enhanced approach to flood damage and risk assessment using vulnerability curves based on building material information. The approach draws upon common practices in earthquake vulnerability assessment and is an alternative to the land-use or building-occupancy approaches used in flood risk assessment models. The approach is of particular importance for studies where there is large variation in building material, such as large-scale studies or studies in developing countries. A case study of Ethiopia is used to demonstrate the impact of the different methodological approaches on direct damage assessments due to flooding. Generally, flood damage assessments use damage curves for different land-use or occupancy types (i.e. urban or residential and commercial classes). However, these categories do not necessarily relate directly to the vulnerability to damage by flood waters. For this, the construction type and building material may be more important, as is recognized in earthquake risk assessment. In this study, we use the building material classification data of the PAGER project to define new building-material-based vulnerability classes for flood damage. This approach is compared to the widely applied land-use-based vulnerability curves such as those used by De Moel et al. (2011). The case of Ethiopia demonstrates the feasibility of this novel flood vulnerability method at the country level, which holds the potential to be scaled up to the global level. The study shows that flood vulnerability based on building material also allows for better differentiation between flood damage in urban and rural settings, opening the door to better links with poverty studies where such exposure data are available. Furthermore, this new approach paves the way for enhanced multi-risk assessments, as the method enables the comparison of vulnerability across different natural hazard types that also use material-based vulnerability curves
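A material-based depth-damage calculation of the kind proposed can be sketched as follows; the material classes and damage fractions are invented placeholders, not the PAGER classification or the authors' curves:

```python
import numpy as np

# Hypothetical depth-damage curves by building material class: fraction of
# replacement value lost at given flood depths (metres). Illustrative only.
DEPTHS = np.array([0.0, 0.5, 1.0, 2.0, 3.0])
CURVES = {
    "reinforced_concrete":  np.array([0.0, 0.05, 0.10, 0.20, 0.30]),
    "unreinforced_masonry": np.array([0.0, 0.15, 0.30, 0.55, 0.75]),
    "adobe_or_earth":       np.array([0.0, 0.30, 0.60, 0.90, 1.00]),
}

def flood_damage(material: str, depth_m: float, exposed_value: float) -> float:
    """Direct damage = exposed value times the interpolated damage fraction."""
    frac = np.interp(depth_m, DEPTHS, CURVES[material])
    return exposed_value * frac

# The same 1.5 m flood over the same exposed value gives very different
# damage depending on the material class, unlike a single land-use curve.
for material in CURVES:
    print(material, flood_damage(material, 1.5, 100_000))
```

Because the vulnerability axis is the material class rather than the land-use class, the same exposure table can in principle be reused for other hazards whose vulnerability curves are also material-based.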

  1. Comparative validity of brief to medium-length Big Five and Big Six personality questionnaires

    NARCIS (Netherlands)

    Thalmayer, A.G.; Saucier, G.; Eigenhuis, A.

    2011-01-01

    A general consensus on the Big Five model of personality attributes has been highly generative for the field of personality psychology. Many important psychological and life outcome correlates with Big Five trait dimensions have been established. But researchers must choose between multiple Big Five

  2. Distillation Column Flooding Predictor

    Energy Technology Data Exchange (ETDEWEB)

    George E. Dzyacky

    2010-11-23

    The Flooding Predictor™ is a patented advanced control technology proven in research at the Separations Research Program, University of Texas at Austin, to increase distillation column throughput by over 6%, while also increasing energy efficiency by 10%. The research was conducted under a U. S. Department of Energy Cooperative Agreement awarded to George Dzyacky of 2ndpoint, LLC. The Flooding Predictor™ works by detecting the incipient flood point and controlling the column closer to its actual hydraulic limit than historical practices have allowed. Further, the technology uses existing column instrumentation, meaning no additional refining infrastructure is required. Refiners often push distillation columns to maximize throughput, improve separation, or simply to achieve day-to-day optimization. Attempting to achieve such operating objectives is a tricky undertaking that can result in flooding. Operators and advanced control strategies alike rely on the conventional use of delta-pressure instrumentation to approximate the column’s approach to flood. But column delta-pressure is more an inference of the column’s approach to flood than it is an actual measurement of it. As a consequence, delta pressure limits are established conservatively in order to operate in a regime where the column is never expected to flood. As a result, there is much “left on the table” when operating in such a regime, i.e. the capacity difference between controlling the column to an upper delta-pressure limit and controlling it to the actual hydraulic limit. The Flooding Predictor™, an innovative pattern recognition technology, controls columns at their actual hydraulic limit, which research shows leads to a throughput increase of over 6%. Controlling closer to the hydraulic limit also permits operation in a sweet spot of increased energy-efficiency. In this region of increased column loading, the Flooding Predictor is able to exploit the benefits of higher liquid

  3. The state of the art of flood forecasting - Hydrological Ensemble Prediction Systems

    Science.gov (United States)

    Thielen-Del Pozo, J.; Pappenberger, F.; Salamon, P.; Bogner, K.; Burek, P.; de Roo, A.

    2010-09-01

    , has become evident. However, despite the demonstrated advantages, the incorporation of HEPS in operational flood forecasting is still limited worldwide. The applicability of HEPS for smaller river basins was tested in MAP D-PHASE, an acronym for "Demonstration of Probabilistic Hydrological and Atmospheric Simulation of flood Events in the Alpine region", which was launched in 2005 as a Forecast Demonstration Project of the World Weather Research Programme of WMO and entered a pre-operational, still active testing phase in 2007. In Europe, a comparatively high number of EPS-driven systems for medium to large rivers exist. The national flood forecasting centres of Sweden, Finland and the Netherlands have already implemented HEPS in their operational forecasting chains, while in other countries, including France, Germany, the Czech Republic and Hungary, hybrid or experimental chains have been installed. As an example of HEPS, the European Flood Alert System (EFAS) is presented. EFAS provides medium-range probabilistic flood forecasting information for large trans-national river basins. It incorporates multiple sets of weather forecasts, including different types of EPS and deterministic forecasts from different providers. EFAS products are evaluated and visualised as exceedances of critical levels only, both in the form of maps and of time series. The different sources of uncertainty and their impact on flood forecasting performance have been tested offline for every grid cell but, for computational reasons, have not yet been incorporated operationally into the forecasting chain. However, at stations where real-time discharges are available, a hydrological uncertainty processor is applied to estimate the total predictive uncertainty from the hydrological and input uncertainties. Research on long-term EFAS results has shown the need to complement statistical analysis with case studies, for which examples will be shown.

  4. Impacts of repetitive floods and satisfaction with flood relief efforts: A case study of the flood-prone districts in Thailand’s Ayutthaya province

    Directory of Open Access Journals (Sweden)

    Nawhath Thanvisitthpon

    2017-01-01

    Full Text Available This research investigates the impacts of repetitive flooding on the inhabitants of four flood-prone districts in Thailand’s central province of Ayutthaya: Pranakorn Si Ayutthaya, Sena, Bang Ban, and Pak Hai. In addition, the residents’ satisfaction with the flood relief efforts and operations of the local authorities was examined and analyzed. The research revealed that most local residents have adapted to co-exist with the repetitive floods, an example of which is the elevation of houses a few meters above the ground, with the living quarters on the upper level. The findings also indicated that the repetitive flooding incurred substantial post-flood repair costs, in light of the low income-earning capabilities of the locals. However, the flood-recovery financial aid was incommensurate with the actual expenditures, contributing to the lowest average satisfaction score among the inhabitants with regard to the adequacy of the post-flood repair and restoration financial aid. Furthermore, the research identified differences between districts in satisfaction with the flood relief efforts. The disparity could be attributed to the extent of coordination and participation of the local residents and their local leaders in the flood-related measures.

  5. Urban pluvial flood prediction

    DEFF Research Database (Denmark)

    Thorndahl, Søren Liedtke; Nielsen, Jesper Ellerbæk; Jensen, David Getreuer

    2016-01-01

    Flooding produced by high-intensive local rainfall and drainage system capacity exceedance can have severe impacts in cities. In order to prepare cities for these types of flood events – especially in the future climate – it is valuable to be able to simulate these events numerically both...... historically and in real-time. There is a rather untested potential in real-time prediction of urban floods. In this paper radar data observations with different spatial and temporal resolution, radar nowcasts of 0–2 h lead time, and numerical weather models with lead times up to 24 h are used as inputs...... to an integrated flood and drainage systems model in order to investigate the relative difference between different inputs in predicting future floods. The system is tested on a small town Lystrup in Denmark, which has been flooded in 2012 and 2014. Results show it is possible to generate detailed flood maps...

  6. FLOOD MENACE IN KADUNA METROPOLIS: IMPACTS ...

    African Journals Online (AJOL)

    Dr A.B.Ahmed

    damage, causes of flooding, human response to flooding and severity of ... from moving out. Source of ... Man responds to flood hazards through adjustment, flood abatement ... action to minimize or ameliorate flood hazards; flood abatement.

  7. Urban flood return period assessment through rainfall-flood response modelling

    Science.gov (United States)

    Murla Tuyls, Damian; Thorndahl, Søren

    2017-04-01

    Intense rainfall can often cause severe floods, especially in urbanized areas with high population density or large impermeable areas. In this context, floods can generate a direct social, environmental and economic impact. Traditionally, in the design of Urban Drainage Systems (UDS), the correlation between the return period (RP) of a given rainfall and the RP of its consequent flood has been assumed to be linear (e.g. DS/EN752 (2008)). However, this is not always the case. Complex UDS, where diverse hydraulic infrastructures are often found, increase the heterogeneity of the system response, which may alter the mentioned correlation. Consequently, the reliability of future urban planning, design and resilience against floods may also be affected by this misassumption. In this study, an assessment of surface flood RP across rainfall RP has been carried out at Lystrup, an urbanized catchment of 440 ha with 10,400 inhabitants located in Jutland (Denmark), which has been hit by several pluvial floods in recent years. A historical rainfall dataset covering the last 35 years, from two rain gauges located 2 and 10 km from the study area, has been provided by the Danish Wastewater Pollution Committee and the Danish Meteorological Institute (DMI). The 25 most extreme rainfall events have been selected through a two-step multi-criteria procedure, ensuring an adequate variability of rainfall, from short-duration high-peak storms to longer-duration moderate rainfall. In addition, a coupled 1D/2D surface and network UDS model of the catchment area, developed in an integrated MIKE URBAN and MIKE Flood model (DHI 2014) considering both permeable and impermeable areas, in combination with a DTM (2 x 2 m resolution), has been used to study and assess flood RP in detail. Results show an ambiguous relation between rainfall RP and flood response.
Local flood levels, flood area and volume RP estimates should therefore not be neglected in
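The return-period concept underlying the abstract above (RP as the inverse of the annual exceedance probability) can be illustrated with a minimal, hedged sketch using Weibull plotting positions on a synthetic annual-maximum series. None of the numbers below come from the study; the 35-year record length merely echoes the dataset it describes.

```python
import numpy as np

def empirical_return_periods(annual_maxima):
    """Return (values sorted descending, empirical return periods in years).

    Uses the Weibull plotting position: the i-th largest of n values has
    exceedance probability P_i = i / (n + 1), hence T_i = (n + 1) / i.
    """
    x = np.sort(np.asarray(annual_maxima, dtype=float))[::-1]
    n = x.size
    ranks = np.arange(1, n + 1)          # rank 1 = largest event
    p_exceed = ranks / (n + 1.0)         # Weibull plotting position
    return x, 1.0 / p_exceed

# 35 years of synthetic annual-maximum rainfall depths (mm), purely illustrative.
rng = np.random.default_rng(0)
maxima = rng.gumbel(loc=40.0, scale=10.0, size=35)
values, T = empirical_return_periods(maxima)
# The largest observed event gets T = (35 + 1) / 1 = 36 years.
```

The same machinery applies to flood levels, areas or volumes, which is why (as the abstract argues) rainfall RP and flood RP need not coincide: each series has its own empirical distribution.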

  8. Big data need big theory too.

    Science.gov (United States)

    Coveney, Peter V; Dougherty, Edward R; Highfield, Roger R

    2016-11-13

    The current interest in big data, machine learning and data analytics has generated the widespread impression that such methods are capable of solving most problems without the need for conventional scientific methods of inquiry. Interest in these methods is intensifying, accelerated by the ease with which digitized data can be acquired in virtually all fields of endeavour, from science, healthcare and cybersecurity to economics, social sciences and the humanities. In multiscale modelling, machine learning appears to provide a shortcut to reveal correlations of arbitrary complexity between processes at the atomic, molecular, meso- and macroscales. Here, we point out the weaknesses of pure big data approaches with particular focus on biology and medicine, which fail to provide conceptual accounts for the processes to which they are applied. No matter their 'depth' and the sophistication of data-driven methods, such as artificial neural nets, in the end they merely fit curves to existing data. Not only do these methods invariably require far larger quantities of data than anticipated by big data aficionados in order to produce statistically reliable results, but they can also fail in circumstances beyond the range of the data used to train them because they are not designed to model the structural characteristics of the underlying system. We argue that it is vital to use theory as a guide to experimental design for maximal efficiency of data collection and to produce reliable predictive models and conceptual knowledge. Rather than continuing to fund, pursue and promote 'blind' big data projects with massive budgets, we call for more funding to be allocated to the elucidation of the multiscale and stochastic processes controlling the behaviour of complex systems, including those of life, medicine and healthcare. This article is part of the themed issue 'Multiscale modelling at the physics-chemistry-biology interface'. © 2015 The Authors.

  9. Flood-inundation and flood-mitigation modeling of the West Branch Wapsinonoc Creek Watershed in West Branch, Iowa

    Science.gov (United States)

    Cigrand, Charles V.

    2018-03-26

    The U.S. Geological Survey (USGS), in cooperation with the city of West Branch and the Herbert Hoover National Historic Site of the National Park Service, assessed flood-mitigation scenarios within the West Branch Wapsinonoc Creek watershed. The scenarios are intended to demonstrate several means of decreasing peak streamflows and improving the conveyance of overbank flows from the West Branch Wapsinonoc Creek and its tributary Hoover Creek where they flow through the city and the Herbert Hoover National Historic Site located within the city. Hydrologic and hydraulic models of the watershed were constructed to assess the flood-mitigation scenarios. To accomplish this, the models used the U.S. Army Corps of Engineers Hydrologic Engineering Center-Hydrologic Modeling System (HEC–HMS) version 4.2 to simulate the amount of runoff and streamflow produced from single rain events. The Hydrologic Engineering Center-River Analysis System (HEC–RAS) version 5.0 was then used to construct an unsteady-state model that may be used for routing streamflows, mapping areas that may be inundated during floods, and simulating the effects of different measures taken to decrease the effects of floods on people and infrastructure. Both models were calibrated to three historic rainfall events that produced peak streamflows ranging between the 2-year and 10-year flood-frequency recurrence intervals at the USGS streamgage (05464942) on Hoover Creek. Calibration used data from two USGS streamgages along with surveyed high-water marks from one of the events. The calibrated HEC–HMS model was then used to simulate streamflows from design rainfall events of 24-hour duration ranging from a 20-percent to a 1-percent annual exceedance probability. These simulated streamflows were incorporated into the HEC–RAS model. The unsteady-state HEC–RAS model was calibrated to represent existing conditions within the watershed. HEC–RAS model simulations with the

  10. Do flood risk perceptions provide useful insights for flood risk management? Findings from central Vietnam

    OpenAIRE

    Bubeck, P.; Botzen, W.J.W.; Suu, L.T.T.; Aerts, J.C.J.H.

    2012-01-01

    Following the renewed attention for non-structural flood risk reduction measures implemented at the household level, there has been an increased interest in individual flood risk perceptions. The reason for this is the commonly-made assumption that flood risk perceptions drive the motivation of individuals to undertake flood risk mitigation measures, as well as the public's demand for flood protection, and therefore provide useful insights for flood risk management. This study empirically exa...

  11. Big Data and medicine: a big deal?

    Science.gov (United States)

    Mayer-Schönberger, V; Ingelsson, E

    2018-05-01

    Big Data promises huge benefits for medical research. Looking beyond superficial increases in the amount of data collected, we identify three key areas where Big Data differs from conventional analyses of data samples: (i) data are captured more comprehensively relative to the phenomenon under study; this reduces some bias but surfaces important trade-offs, such as between data quantity and data quality; (ii) data are often analysed using machine learning tools, such as neural networks rather than conventional statistical methods resulting in systems that over time capture insights implicit in data, but remain black boxes, rarely revealing causal connections; and (iii) the purpose of the analyses of data is no longer simply answering existing questions, but hinting at novel ones and generating promising new hypotheses. As a consequence, when performed right, Big Data analyses can accelerate research. Because Big Data approaches differ so fundamentally from small data ones, research structures, processes and mindsets need to adjust. The latent value of data is being reaped through repeated reuse of data, which runs counter to existing practices not only regarding data privacy, but data management more generally. Consequently, we suggest a number of adjustments such as boards reviewing responsible data use, and incentives to facilitate comprehensive data sharing. As data's role changes to a resource of insight, we also need to acknowledge the importance of collecting and making data available as a crucial part of our research endeavours, and reassess our formal processes from career advancement to treatment approval. © 2017 The Association for the Publication of the Journal of Internal Medicine.

  12. Assessing Big Data

    DEFF Research Database (Denmark)

    Leimbach, Timo; Bachlechner, Daniel

    2015-01-01

    In recent years, big data has been one of the most controversially discussed technologies in terms of its possible positive and negative impact. Therefore, the need for technology assessments is obvious. This paper first provides, based on the results of a technology assessment study, an overview...... of the potential and challenges associated with big data and then describes the problems experienced during the study as well as methods found helpful to address them. The paper concludes with reflections on how the insights from the technology assessment study may have an impact on the future governance of big...... data....

  13. High-resolution flood modeling of urban areas using MSN_Flood

    Directory of Open Access Journals (Sweden)

    Michael Hartnett

    2017-07-01

    Full Text Available Although existing hydraulic models have been used to simulate and predict urban flooding, most of these models are inadequate due to the high spatial resolution required to simulate flows in urban floodplains. Nesting high-resolution subdomains within coarser-resolution models is an efficient solution for enabling simultaneous calculation of flooding due to tides, surges, and high river flows. MSN_Flood has been developed to incorporate moving boundaries around nested domains, permitting alternate flooding and drying along the boundary and in the interior of the domain. Ghost cells adjacent to open boundary cells convert open boundaries, in effect, into internal boundaries. The moving boundary may be multi-segmented and non-continuous, with recirculating flow across the boundary. When combined with a bespoke adaptive interpolation scheme, this approach facilitates a dynamic internal boundary. Based on an alternating-direction semi-implicit finite difference scheme, MSN_Flood was used to hindcast a major flood event in Cork City resulting from the combined pressures of fluvial, tidal, and storm surge processes. The results show that the model is computationally efficient, as the 2-m high-resolution nest is used only in the urban flooded region. Elsewhere, lower-resolution nests are used. The results also show that the model is highly accurate when compared with measured data. The model is capable of incorporating nested sub-domains when the nested boundary is multi-segmented and highly complex with lateral gradients of elevation and velocities. This is a major benefit when modelling urban floodplains at very high resolution.

  14. High-resolution urban flood modelling - a joint probability approach

    Science.gov (United States)

    Hartnett, Michael; Olbert, Agnieszka; Nash, Stephen

    2017-04-01

    ., 2008) The methodology includes estimates of flood probabilities due to coastal- and fluvial-driven processes occurring individually or jointly, mechanisms of flooding and their impacts on urban environment. Various flood scenarios are examined in order to demonstrate that this methodology is necessary to quantify the important physical processes in coastal flood predictions. Cork City, located on the south of Ireland subject to frequent coastal-fluvial flooding, is used as a study case.

  15. Big data, big responsibilities

    Directory of Open Access Journals (Sweden)

    Primavera De Filippi

    2014-01-01

    Full Text Available Big data refers to the collection and aggregation of large quantities of data produced by and about people, things or the interactions between them. With the advent of cloud computing, specialised data centres with powerful computational hardware and software resources can be used for processing and analysing a humongous amount of aggregated data coming from a variety of different sources. The analysis of such data is all the more valuable to the extent that it allows for specific patterns to be found and new correlations to be made between different datasets, so as to eventually deduce or infer new information, as well as to potentially predict behaviours or assess the likelihood for a certain event to occur. This article will focus specifically on the legal and moral obligations of online operators collecting and processing large amounts of data, to investigate the potential implications of big data analysis on the privacy of individual users and on society as a whole.

  16. Interpreting the impact of flood forecasts by combining policy analysis studies and flood defence

    Directory of Open Access Journals (Sweden)

    Slomp Robert

    2016-01-01

    Full Text Available Flood forecasting is necessary to save lives and reduce damages. Reducing damages is important to save livelihoods and to reduce the recovery time. Flood alerts should contain the expected time of the event, and the location and extent of the event. A flood alert is not only one message but part of a rehearsed flow of information using multiple channels. First, people have to accept the fact that there might be a threat and understand what the threat is about. People need a reference to understand the situation and be aware of possible measures they can take to assure their own safety and reduce damages. Information to the general public has to be consistent with the information used by emergency services and has to be very clear about the consequences and context of possible measures (such as shelter in place or preventive evacuation). Emergency services should monitor how the public is responding in order to adapt their communication and operations during a crisis. Flood warnings and emergency services are often coordinated by different government organisations. This is an extra handicap for getting consistent information out in time for people to use. In an information-based society, where everyone has Twitter, email and a camera, public organisations may have to trust the public more and send out the correct information as it comes in. In the Netherlands, Rijkswaterstaat, the National Water Authority and National Public Works Department, is responsible for or involved in forecasting in case of floods, policy studies on flood risk, policy studies on maintenance, assessment and design of flood defences, elaborating rules and regulations for flood defences, advice on crisis management to the national government, and maintaining the main infrastructure in the Netherlands (highways and waterways). The Water Management Center in the Netherlands (WMCN) has developed a number of models to provide flood forecasts. The WMCN is run for and by all managers of flood defences and is hosted by

  17. Iowa Flood Information System

    Science.gov (United States)

    Demir, I.; Krajewski, W. F.; Goska, R.; Mantilla, R.; Weber, L. J.; Young, N.

    2011-12-01

    The Iowa Flood Information System (IFIS) is a web-based platform developed by the Iowa Flood Center (IFC) to provide access to flood inundation maps, real-time flood conditions, flood forecasts (both short-term and seasonal), flood-related data, information, and interactive visualizations for communities in Iowa. The key element of the system's architecture is the notion of community. Locations of the communities, those near streams and rivers, define basin boundaries. The IFIS provides community-centric watershed and river characteristics, weather (rainfall) conditions, and streamflow data and visualization tools. Interactive interfaces allow access to inundation maps for different stage and return period values, and flooding scenarios with contributions from multiple rivers. Real-time and historical data on water levels, gauge heights, and rainfall conditions are available in the IFIS by streaming data from automated IFC bridge sensors, USGS stream gauges, NEXRAD radars, and NWS forecasts. Simple 2D and 3D interactive visualizations in the IFIS make the data more understandable to the general public. Users are able to filter data sources for their communities and selected rivers. The data and information in the IFIS are also accessible through web services and mobile applications. The IFIS is optimized for various browsers and screen sizes to provide access through multiple platforms, including tablets and mobile devices. The IFIS includes a rainfall-runoff forecast model to provide a five-day flood risk estimate for around 500 communities in Iowa. Multiple view modes in the IFIS accommodate different user types, from the general public to researchers and decision makers, by providing different levels of tools and detail. River view mode allows users to visualize data from multiple IFC bridge sensors and USGS stream gauges to follow flooding conditions along a river.
The IFIS will help communities make better-informed decisions on the occurrence of floods, and will alert communities

  18. Predicting floods with Flickr tags.

    Science.gov (United States)

    Tkachenko, Nataliya; Jarvis, Stephen; Procter, Rob

    2017-01-01

    Increasingly, user-generated content (UGC) in social media postings and its associated metadata, such as time and location stamps, are being used to provide useful operational information during natural hazard events such as hurricanes, storms and floods. The main advantages of these new sources of data are twofold. First, in a purely additive sense, they can provide much denser geographical coverage of the hazard than traditional sensor networks. Second, they provide what physical sensors cannot: by documenting personal observations and experiences, they directly record the impact of a hazard on the human environment. For this reason, interpretation of the content (e.g., hashtags, images, text, emojis, etc.) and metadata (e.g., keywords, tags, geolocation) has been a focus of much research into social media analytics. However, as the choices of semantic tags in current methods are usually reduced to the exact name or type of the event (e.g., hashtags '#Sandy' or '#flooding'), the main limitation of such approaches remains their mere nowcasting capacity. In this study we make use of polysemous tags of images posted during several recent flood events and demonstrate how such volunteered geographic data can be used to provide early warning of an event before its outbreak.

  19. Big Machines and Big Science: 80 Years of Accelerators at Stanford

    Energy Technology Data Exchange (ETDEWEB)

    Loew, Gregory

    2008-12-16

    Longtime SLAC physicist Greg Loew will present a trip through SLAC's origins, highlighting its scientific achievements, and provide a glimpse of the lab's future in 'Big Machines and Big Science: 80 Years of Accelerators at Stanford.'

  20. Dual of big bang and big crunch

    International Nuclear Information System (INIS)

    Bak, Dongsu

    2007-01-01

    Starting from the Janus solution and its gauge theory dual, we obtain the dual gauge theory description of the cosmological solution by the procedure of double analytic continuation. The coupling is driven either to zero or to infinity at the big-bang and big-crunch singularities, which are shown to be related by the S-duality symmetry. In the dual Yang-Mills theory description, these are nonsingular as the coupling goes to zero in the N=4 super Yang-Mills theory. The cosmological singularities simply signal the failure of the supergravity description of the full type IIB superstring theory

  1. Design flood hydrographs from the relationship between flood peak and volume

    Directory of Open Access Journals (Sweden)

    L. Mediero

    2010-12-01

    Full Text Available Hydrological frequency analyses are usually focused on flood peaks. Flood volumes and durations have not been studied as extensively, although there are many practical situations, such as when designing a dam, in which the full hydrograph is of interest. A flood hydrograph may be described by a multivariate function of the peak, volume and duration. Most standard bivariate and trivariate functions do not produce univariate three-parameter functions as marginal distributions; however, three-parameter functions are required to fit highly skewed data, such as flood peak and flood volume series. In this paper, the relationship between flood peak and hydrograph volume is analysed to overcome this problem. A Monte Carlo experiment was conducted to generate an ensemble of hydrographs that maintain the statistical properties of the marginal distributions of the peaks, volumes and durations. This ensemble can be applied to determine the Design Flood Hydrograph (DFH) for a reservoir, which is not a unique hydrograph but rather a curve in the peak-volume space. All hydrographs on that curve have the same return period, which can be understood as the inverse of the probability of exceeding a certain water level in the reservoir in any given year. The procedure can also be applied to design the length of the spillway crest in terms of the risk of exceeding a given water level in the reservoir.
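The Monte Carlo idea in the abstract above (an ensemble of hydrographs preserving the marginal statistics of peaks and volumes while keeping their dependence) can be sketched, purely illustratively, with a Gaussian copula linking Gumbel-distributed peaks and volumes. The paper's actual marginals, parameters and dependence structure are not given here, so every numeric value below is an invented placeholder.

```python
import numpy as np
from math import erf, sqrt

def gumbel_ppf(u, loc, scale):
    """Inverse CDF of the Gumbel (EV1) distribution."""
    return loc - scale * np.log(-np.log(u))

def sample_peak_volume(n, rho=0.7, seed=0):
    """Draw n correlated (peak, volume) pairs via a Gaussian copula."""
    rng = np.random.default_rng(seed)
    z = rng.standard_normal((n, 2))
    # Impose correlation rho between the two standard normal columns.
    z[:, 1] = rho * z[:, 0] + np.sqrt(1.0 - rho**2) * z[:, 1]
    # Map each column to uniforms with the standard normal CDF.
    u = 0.5 * (1.0 + np.vectorize(erf)(z / sqrt(2.0)))
    # Transform uniforms through hypothetical Gumbel marginals.
    peak = gumbel_ppf(u[:, 0], loc=300.0, scale=80.0)   # peak discharge, m^3/s
    volume = gumbel_ppf(u[:, 1], loc=20.0, scale=6.0)   # hydrograph volume, hm^3
    return peak, volume

peaks, volumes = sample_peak_volume(10_000)
```

Pairs drawn this way keep both the (assumed) skewed marginals and the peak-volume dependence, so iso-frequency curves in the peak-volume plane can be traced through the ensemble in the spirit of the DFH described above.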

  2. Flood action plans

    International Nuclear Information System (INIS)

    Slopek, R.J.

    1995-01-01

    Safe operating procedures developed by TransAlta Utilities for dealing with flooding resulting from upstream dam failures or extreme rainfall were presented. Several operating curves developed by Monenco AGRA were described, among them the No Overtopping Curve (NOC), the Safe Filling Curve (SFC), the No Spill Curve (NSC) and the Guaranteed Fill Curve (GFC). The concept of an operational comfort zone was developed and defined. A flood action plan for all operating staff was created as a guide in case of a flooding incident. The staging of a flood action plan workshop was described. Dam break scenarios pertinent to the Bow River were developed for subsequent incorporation into a Flood Action Plan Manual. The technical presentations made during the workshops were found to be effective in providing operating staff with a better understanding of the procedures they would perform in an emergency. 8 figs

  3. Flood scour monitoring system using fiber Bragg grating sensors

    Science.gov (United States)

    Lin, Yung Bin; Lai, Jihn Sung; Chang, Kuo Chun; Li, Lu Sheng

    2006-12-01

    The exposure and subsequent undermining of pier/abutment foundations through the scouring action of a flood can result in the structural failure of a bridge. Bridge scour is one of the leading causes of bridge failure. Bridges subject to periods of flood/high flow require monitoring during those times in order to protect the traveling public. In this study, an innovative scour monitoring system using button-like fiber Bragg grating (FBG) sensors was developed and applied successfully in the field during the Aere typhoon period in 2004. The in situ FBG scour monitoring system has been demonstrated to be robust and reliable for real-time scour-depth measurements, and to be valid for indicating depositional depth at the Dadu Bridge. The field results show that this system can function well and survive a typhoon flood.
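One simple way to picture how a vertical array of scour sensors yields a scour-depth reading (a hedged sketch, not the paper's FBG signal-processing method) is to take the depth of the deepest sensor that reports exposure to flowing water; the sensor names and layout below are hypothetical.

```python
# Hypothetical pier-mounted sensor array: each sensor reports whether it has
# been exposed by scour (True) or is still buried in the bed (False).

def scour_depth(sensor_depths, exposed):
    """sensor_depths: depths below the original bed (m), any order.
    exposed: matching booleans from the sensor readings.
    Returns the inferred scour depth in m (0.0 if nothing is exposed)."""
    exposed_depths = [d for d, e in zip(sensor_depths, exposed) if e]
    return max(exposed_depths, default=0.0)

# Sensors every 0.5 m down to 3 m; a flood has exposed the top three.
depths = [0.5, 1.0, 1.5, 2.0, 2.5, 3.0]
readings = [True, True, True, False, False, False]
print(scour_depth(depths, readings))  # prints 1.5
```

The resolution of such a scheme is set by the sensor spacing; the FBG system in the study additionally distinguishes deposition, which this toy sketch does not attempt.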

  4. Comparative Validity of Brief to Medium-Length Big Five and Big Six Personality Questionnaires

    Science.gov (United States)

    Thalmayer, Amber Gayle; Saucier, Gerard; Eigenhuis, Annemarie

    2011-01-01

    A general consensus on the Big Five model of personality attributes has been highly generative for the field of personality psychology. Many important psychological and life outcome correlates with Big Five trait dimensions have been established. But researchers must choose between multiple Big Five inventories when conducting a study and are…

  5. Big data for health.

    Science.gov (United States)

    Andreu-Perez, Javier; Poon, Carmen C Y; Merrifield, Robert D; Wong, Stephen T C; Yang, Guang-Zhong

    2015-07-01

    This paper provides an overview of recent developments in big data in the context of biomedical and health informatics. It outlines the key characteristics of big data and how medical and health informatics, translational bioinformatics, sensor informatics, and imaging informatics will benefit from an integrated approach of piecing together different aspects of personalized information from a diverse range of data sources, both structured and unstructured, covering genomics, proteomics, metabolomics, as well as imaging, clinical diagnosis, and long-term continuous physiological sensing of an individual. It is expected that recent advances in big data will expand our knowledge for testing new hypotheses about disease management from diagnosis to prevention to personalized treatment. The rise of big data, however, also raises challenges in terms of privacy, security, data ownership, data stewardship, and governance. This paper discusses some of the existing activities and future opportunities related to big data for health, outlining some of the key underlying issues that need to be tackled.

  6. The effect of floods on anemia among reproductive age women in Afghanistan.

    Science.gov (United States)

    Oskorouchi, Hamid Reza; Nie, Peng; Sousa-Poza, Alfonso

    2018-01-01

    This study uses biomarker information from the 2013 National Nutrition Survey Afghanistan and satellite-precipitation-driven modeling results from the Global Flood Monitoring System to analyze how floods affect the probability of anemia in Afghan women of reproductive age (15-49). In addition to establishing a causal relation between the two by exploiting the quasi-random variation of floods across districts and periods, the analysis demonstrates that floods have a significant positive effect on the probability of anemia through two possible transmission mechanisms. The first is a significant effect on inflammation, probably related to waterborne diseases carried by unsafe drinking water, and the second is a significant negative effect on retinol concentrations. Because the effect of floods on anemia remains significant even after we control for anemia's most common causes, we argue that the condition may also be affected by elevated levels of psychological stress.

  7. The development of flood map in Malaysia

    Science.gov (United States)

    Zakaria, Siti Fairus; Zin, Rosli Mohamad; Mohamad, Ismail; Balubaid, Saeed; Mydin, Shaik Hussein; MDR, E. M. Roodienyanto

    2017-11-01

    In Malaysia, flash floods are common occurrences throughout the year in flood-prone areas. In terms of flood extent, flash floods affect smaller areas, but because of their tendency to occur in densely urbanized areas, the value of damaged property is high and disruption to traffic flow and businesses is substantial. In river floods, however, especially the river floods of Kelantan and Pahang, the flood extent is widespread and can extend over 1,000 square kilometers. Although the value of property and the density of the affected population are lower, the damage inflicted by these floods can also be high because the area affected is large. In order to combat these floods, various flood mitigation measures have been carried out. Structural flood mitigation alone can only provide protection levels of 10- to 100-year Average Recurrence Intervals (ARI). One of the economically effective non-structural approaches to flood mitigation and flood management is geospatial technology, which involves flood forecasting and warning services for the flood-prone areas. This approach, which involves the use of a Geographical Information Flood Forecasting system, also includes the generation of a series of flood maps. There are three types of flood maps, namely the Flood Hazard Map, the Flood Risk Map and the Flood Evacuation Map. A Flood Hazard Map is used to determine areas susceptible to flooding when discharge from a stream exceeds the bank-full stage. Early warnings of incoming flood events will enable flood victims to prepare themselves before flooding occurs. Property and lives can be saved by keeping movable property above the flood levels and, if necessary, evacuating the area early. With respect to flood fighting, an early warning referencing the series of flood maps (flood hazard map, flood risk map and flood evacuation map) of the approaching flood should be able to alert the organization in charge of the flood fighting actions and the authority to

  8. [Big data in imaging].

    Science.gov (United States)

    Sewerin, Philipp; Ostendorf, Benedikt; Hueber, Axel J; Kleyer, Arnd

    2018-04-01

    Until now, most major medical advances have been achieved through hypothesis-driven research within the scope of clinical trials. However, due to a multitude of variables, only a limited number of research questions can be addressed in a single study, rendering such studies expensive and time consuming. Big data acquisition enables a new, data-based approach in which large volumes of data can be used to investigate all variables, thus opening new horizons. Due to the universal digitalization of data as well as ever-improving hardware and software solutions, imaging would appear to be predestined for such analyses. Several small studies have already demonstrated that automated analysis algorithms and artificial intelligence can identify pathologies with high precision. Such automated systems would also seem well suited for rheumatology imaging, since a method for individualized risk stratification has long been sought for these patients. However, despite all the promising options, the heterogeneity of the data and the highly complex regulations covering data protection in Germany still render a big data solution for imaging difficult today. Overcoming these boundaries is challenging, but the enormous potential advances in clinical management and science render pursuit of this goal worthwhile.

  9. Effects of a flooding event on a threatened black bear population in Louisiana

    Science.gov (United States)

    O'Connell-Goode, Kaitlin C.; Lowe, Carrie L.; Clark, Joseph D.

    2014-01-01

    The Louisiana black bear, Ursus americanus luteolus, is listed as threatened under the Endangered Species Act as a result of habitat loss and human-related mortality. Information on population-level responses of large mammals to flooding events is scarce, and we had a unique opportunity to evaluate the viability of the Upper Atchafalaya River Basin (UARB) black bear population before and after a significant flooding event. We began collecting black bear hair samples in 2007 for a DNA mark-recapture study to estimate abundance (N) and apparent survival (φ). In 2011, the Morganza Spillway was opened to divert floodwaters from the Mississippi River through the UARB, inundating > 50% of our study area, potentially impacting recovery of this important bear population. To evaluate the effects of this flooding event on bear population dynamics, we used a robust design multistate model to estimate changes in transition rates from the flooded area to non-flooded area (ψF→NF) before (2007–2010), during (2010–2011) and after (2011–2012) the flood. Average N across all years of study was 63.2 (SE = 5.2), excluding the year of the flooding event. Estimates of ψF→NF increased from 0.014 (SE = 0.010; meaning that 1.4% of the bears moved from the flooded area to non-flooded areas) before flooding to 0.113 (SE = 0.045) during the flood year, and then decreased to 0.028 (SE= 0.035) after the flood. Although we demonstrated a flood effect on transition rates as hypothesized, the effect was small (88.7% of the bears remained in the flooded area during flooding) and φ was unchanged, suggesting that the 2011 flooding event had minimal impact on survival and site fidelity.

  10. Flood susceptibility analysis through remote sensing, GIS and frequency ratio model

    Science.gov (United States)

    Samanta, Sailesh; Pal, Dilip Kumar; Palsamanta, Babita

    2018-05-01

    Papua New Guinea (PNG) is saddled with frequent natural disasters such as earthquakes, volcanic eruptions, landslides, droughts and floods. Flood, as a hydrological disaster to humankind's niche, brings about a powerful and often sudden, pernicious change in the surface distribution of water on land, while the benevolence of flood manifests in restoring the health of the thalweg from excessive siltation by redistributing the fertile sediments on the riverine floodplains. From a social, economic and environmental perspective, flood is one of the most devastating disasters in PNG. This research was conducted to investigate the usefulness of remote sensing, geographic information systems and the frequency ratio (FR) model for flood susceptibility mapping. The FR model was used to handle different independent variables via weighted bivariate probability values to generate a plausible flood susceptibility map. The study was conducted in the Markham riverine precinct under Morobe province in PNG. A historical flood inventory database of the PNG resource information system (PNGRIS) was used to generate 143 flood locations based on "create fishnet" analysis; 100 (70%) flood sample locations were selected randomly for model building. Ten independent variables, namely land use/land cover, elevation, slope, topographic wetness index, surface runoff, landform, lithology, distance from the main river, soil texture and soil drainage, were used in the FR model for flood vulnerability analysis. Finally, the database was developed for areas vulnerable to flooding. The result demonstrated a span of FR values ranging from 2.66 (least flood prone) to 19.02 (most flood prone) for the study area. The developed database was reclassified into five flood vulnerability zones based on the FR values, namely very low (less than 5.0), low (5.0-7.5), moderate (7.5-10.0), high (10.0-12.5) and very high susceptibility (more than 12.5). The result indicated about 19.4% of the land area as `very high
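The five-zone reclassification of FR scores quoted in the abstract can be sketched directly. The snippet below is a minimal illustration in Python using the quoted thresholds; the function name, the handling of the exact boundary values, and the sample scores are assumptions for illustration, not taken from the study.

```python
# Sketch: map a summed frequency-ratio (FR) score to one of the five flood
# susceptibility zones described in the study. Thresholds follow the abstract;
# boundary handling and sample scores are illustrative assumptions.

def susceptibility_zone(fr: float) -> str:
    """Classify an FR score into a flood susceptibility zone."""
    if fr < 5.0:
        return "very low"
    elif fr < 7.5:
        return "low"
    elif fr < 10.0:
        return "moderate"
    elif fr <= 12.5:
        return "high"
    else:
        return "very high"

# The study's reported FR range was 2.66 (least prone) to 19.02 (most prone)
for score in (2.66, 6.1, 9.9, 12.0, 19.02):
    print(score, "->", susceptibility_zone(score))
```

In a GIS workflow this classification would be applied cell-by-cell to the summed FR raster to produce the susceptibility map.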

  11. Does Implementation of Big Data Analytics Improve Firms’ Market Value? Investors’ Reaction in Stock Market

    Directory of Open Access Journals (Sweden)

    Hansol Lee

    2017-06-01

    Full Text Available Recently, due to the development of social media, multimedia, and the Internet of Things (IoT), various types of data have increased. As existing data analytics tools cannot cover this huge volume of data, big data analytics has become one of the emerging technologies for business today. Considering that big data analytics is an up-to-date term, in the present study we investigated the impact of implementing big data analytics from a short-term perspective. We used an event study methodology to investigate the changes in stock price caused by announcements of big data analytics solution investments. A total of 54 investment announcements of firms publicly traded on NASDAQ and NYSE from 2010 to 2015 were collected. Our results empirically demonstrate that the announcement of a firm's investment in a big data solution leads to positive stock market reactions. In addition, we found that investments in small vendors' solutions with industry-oriented functions tend to result in higher abnormal returns than those in big vendors' solutions with general functions. Finally, our results suggest that stock market investors evaluate the big data analytics investments of big firms more highly than those of small firms.
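The core event-study computation, fitting a market model on a pre-event estimation window and summing abnormal returns over the event window, can be sketched as follows. This is a minimal illustration of the standard market-model method, not the study's actual code, and the function name and all return series used in testing are invented.

```python
# Sketch of a market-model event study: estimate alpha/beta by OLS on the
# estimation window, then sum abnormal returns (actual minus expected) over
# the event window to get the cumulative abnormal return (CAR).
# Pure-Python illustration; inputs are daily simple returns.

def car(stock, market, event_start):
    """Cumulative abnormal return of `stock` from index `event_start` on,
    with the market model fitted on returns before `event_start`."""
    xs, ys = market[:event_start], stock[:event_start]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    beta = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
           sum((x - mx) ** 2 for x in xs)
    alpha = my - beta * mx
    # Abnormal return on each event-window day: actual minus model-expected
    ar = [s - (alpha + beta * m)
          for s, m in zip(stock[event_start:], market[event_start:])]
    return sum(ar)
```

In the study's setting, a significantly positive CAR around the announcement date is the evidence of a positive market reaction.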

  12. Use of documentary sources on past flood events for flood risk management and land planning

    Science.gov (United States)

    Cœur, Denis; Lang, Michel

    2008-09-01

    The knowledge of past catastrophic events can improve flood risk mitigation policy, with a better awareness against risk. As such historical information is usually available in Europe for the past five centuries, historians are able to understand how past society dealt with flood risk, and hydrologists can include information on past floods into an adapted probabilistic framework. In France, Flood Risk Mitigation Maps are based either on the largest historical known flood event or on the 100-year flood event if it is greater. Two actions can be suggested in terms of promoting the use of historical information for flood risk management: (1) the development of a regional flood data base, with both historical and current data, in order to get a good feedback on recent events and to improve the flood risk education and awareness; (2) the commitment to keep a persistent/perennial management of a reference network of hydrometeorological observations for climate change studies.

  13. Can companies benefit from Big Science? Science and Industry

    CERN Document Server

    Autio, Erkko; Bianchi-Streit, M

    2003-01-01

    Several studies have indicated that there are significant returns on financial investment via "Big Science" centres. Financial multipliers ranging from 2.7 (ESA) to 3.7 (CERN) have been found, meaning that each Euro invested in industry by Big Science generates a two- to fourfold return for the supplier. Moreover, laboratories such as CERN are proud of their record in technology transfer, where research developments lead to applications in other fields - for example, with particle accelerators and detectors. Less well documented, however, is the effect of the experience that technological firms gain through working in the arena of Big Science. Indeed, up to now there has been no explicit empirical study of such benefits. Our findings reveal a variety of outcomes, which include technological learning, the development of new products and markets, and impact on the firm's organization. The study also demonstrates the importance of technologically challenging projects for staff at CERN. Together, these findings i...

  14. Evaluation of Flooding Risk and Engineering Protection Against Floods for Ulan-Ude

    Science.gov (United States)

    Borisova, T. A.

    2017-11-01

    The report presents the results of a study on the analysis and assessment of flood risk for Ulan-Ude and provides recommendations for the engineering protection of the population and economic installations. The current situation is reviewed and the results of a site survey are presented to identify the challenges and the areas of negative water influence, along with the existing protection system. The report gives a summary of past floods and an index-based risk assessment. It describes the scope of potential flooding and underflooding and enumerates the economic installations within the urban areas' zones of flooding at design water levels, in order to identify the probability of exceedance. An assessment of the damage from the 1% exceedance probability flood is presented.

  15. Big Biomedical data as the key resource for discovery science

    Energy Technology Data Exchange (ETDEWEB)

    Toga, Arthur W.; Foster, Ian; Kesselman, Carl; Madduri, Ravi; Chard, Kyle; Deutsch, Eric W.; Price, Nathan D.; Glusman, Gustavo; Heavner, Benjamin D.; Dinov, Ivo D.; Ames, Joseph; Van Horn, John; Kramer, Roger; Hood, Leroy

    2015-07-21

    Modern biomedical data collection is generating exponentially more data in a multitude of formats. This flood of complex data poses significant opportunities to discover and understand the critical interplay among such diverse domains as genomics, proteomics, metabolomics, and phenomics, including imaging, biometrics, and clinical data. The Big Data for Discovery Science Center is taking an “-ome to home” approach to discover linkages between these disparate data sources by mining existing databases of proteomic and genomic data, brain images, and clinical assessments. In support of this work, the authors developed new technological capabilities that make it easy for researchers to manage, aggregate, manipulate, integrate, and model large amounts of distributed data. Guided by biological domain expertise, the Center’s computational resources and software will reveal relationships and patterns, aiding researchers in identifying biomarkers for the most confounding conditions and diseases, such as Parkinson’s and Alzheimer’s.

  16. The Big Mac Standard: A statistical Illustration

    OpenAIRE

    Yukinobu Kitamura; Hiroshi Fujiki

    2004-01-01

    We demonstrate a statistical procedure for selecting the most suitable empirical model to test an economic theory, using the example of the test for purchasing power parity based on the Big Mac Index. Our results show that supporting evidence for purchasing power parity, conditional on the Balassa-Samuelson effect, depends crucially on the selection of models, sample periods and economies used for estimations.

  17. A contribution to improved flood magnitude estimation in base of palaeoflood record and climatic implications – Guadiana River (Iberian Peninsula

    Directory of Open Access Journals (Sweden)

    J. A. Ortega

    2009-02-01

    Full Text Available The Guadiana River has a significant record of historical floods, but the systematic data record covers only 59 years. From the sedimentary layers left by ancient floods, we can add new data to the record, and we can estimate the maximum discharges of other floods known only by their moment of occurrence and the damage they caused. A hydraulic model was built for the Pulo do Lobo area and calibrated by means of the rating curve of the Pulo do Lobo station. The palaeofloods were dated by means of 14C and 137Cs. Because non-systematic information was used to calculate the distribution functions, the quantiles changed with respect to the same functions fitted with systematic information only. The results show a variation in the curves that can be attributed both to the human transformations that changed the hydrologic conditions and to recent climate changes. High-magnitude floods are related to cold periods, especially at moments of transition from cold to warm periods. This tendency has changed since the last medium-high magnitude flood, which took place in the systematic period. Both reasons seem to justify a change in the frequency curves, indicating a recent decrease in the return period of big floods over 8000 m3 s−1. The palaeofloods indicate a longer return period for the same water-level discharge, thus showing the river basin's reference values in its natural condition, prior to the transformation of the basin by anthropic action.
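For readers unfamiliar with flood frequency analysis, the basic step of estimating a return-period discharge from a systematic annual-peak record can be sketched with a Gumbel (EV1) fit by the method of moments. The study's actual framework, which blends non-systematic palaeoflood data into the distribution fitting, is considerably more involved; the function name and the sample peaks below are assumptions for illustration only.

```python
# Sketch: method-of-moments Gumbel fit to an annual peak-discharge series,
# returning the discharge quantile for a given return period T (years).
import math
import statistics

EULER_GAMMA = 0.5772156649  # Euler-Mascheroni constant

def gumbel_quantile(peaks, T):
    """Discharge with return period T, from a Gumbel fit to annual peaks."""
    mean, sd = statistics.mean(peaks), statistics.stdev(peaks)
    beta = math.sqrt(6) * sd / math.pi      # scale parameter
    mu = mean - EULER_GAMMA * beta          # location parameter
    # Quantile of the Gumbel distribution at non-exceedance prob. 1 - 1/T
    return mu - beta * math.log(-math.log(1 - 1 / T))
```

Adding palaeoflood stages as censored (non-systematic) observations, as the study does, shifts these quantiles, which is precisely the effect the abstract reports.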

  18. Big Data: Implications for Health System Pharmacy.

    Science.gov (United States)

    Stokes, Laura B; Rogers, Joseph W; Hertig, John B; Weber, Robert J

    2016-07-01

    Big Data refers to datasets that are so large and complex that traditional methods and hardware for collecting, sharing, and analyzing them are not possible. Big Data that is accurate leads to more confident decision making, improved operational efficiency, and reduced costs. The rapid growth of health care information results in Big Data around health services, treatments, and outcomes, and Big Data can be used to analyze the benefit of health system pharmacy services. The goal of this article is to provide a perspective on how Big Data can be applied to health system pharmacy. It will define Big Data, describe the impact of Big Data on population health, review specific implications of Big Data in health system pharmacy, and describe an approach for pharmacy leaders to effectively use Big Data. A few strategies involved in managing Big Data in health system pharmacy include identifying potential opportunities for Big Data, prioritizing those opportunities, protecting privacy concerns, promoting data transparency, and communicating outcomes. As health care information expands in its content and becomes more integrated, Big Data can enhance the development of patient-centered pharmacy services.

  19. Big endothelin changes the cellular miRNA environment in TMOb osteoblasts and increases mineralization.

    Science.gov (United States)

    Johnson, Michael G; Kristianto, Jasmin; Yuan, Baozhi; Konicke, Kathryn; Blank, Robert

    2014-08-01

    Endothelin (ET1) promotes the growth of osteoblastic breast and prostate cancer metastases. Conversion of big ET1 to mature ET1, catalyzed primarily by endothelin converting enzyme 1 (ECE1), is necessary for ET1's biological activity. We previously identified the Ece1 locus as a positional candidate gene for a pleiotropic quantitative trait locus affecting femoral size, shape, mineralization, and biomechanical performance. We exposed TMOb osteoblasts continuously to 25 ng/ml big ET1. Cells were grown for 6 days in growth medium and then switched to mineralization medium for an additional 15 days with or without big ET1, by which time the TMOb cells form mineralized nodules. We quantified mineralization by alizarin red staining and analyzed levels of miRNAs known to affect osteogenesis. MicroRNA miR-126-3p was identified by a target search as a potential regulator of sclerostin (SOST) translation. TMOb cells exposed to big ET1 showed greater mineralization than control cells. Big ET1 repressed miRNAs targeting transcripts of osteogenic proteins and increased expression of miRNAs that target transcripts of proteins that inhibit osteogenesis. Big ET1 increased expression of miR-126-3p 121-fold versus control. To begin to assess the effect of big ET1 on SOST production, we analyzed both SOST transcription and protein production with and without big ET1, demonstrating that transcription and translation were uncoupled. Our data show that big ET1 signaling promotes mineralization. Moreover, the results suggest that big ET1's osteogenic effects are potentially mediated through changes in miRNA expression, a previously unrecognized big ET1 osteogenic mechanism.

  20. Regional flood reconstruction in Kullu District (Himachal Pradesh, India): implication for Disaster Risk Management

    Science.gov (United States)

    Ballesteros-Cánovas, Juan Antonio; Stoffel, Markus; Trappmann, Daniel; Shekhar, Mayank; Bhattacharyya, Amalava

    2016-04-01

    Floods are a common natural hazard in the Western Indian Himalayas. They usually occur when humid monsoon air is lifted along the Himalayan relief, creating intense orographic rainfall and runoff, a process often enhanced by simultaneous snowmelt. Monsoon floods are considered a major threat in the region and frequently affect inhabited valleys, disturbing the status quo of communities and stressing the future welfare and condition of their economic development. Given the assumption that ongoing and future climatic changes may affect monsoon patterns and extreme precipitation, the implementation of adaptation policies in this region is critically needed in order to improve the local resilience of Himalayan communities. However, their successful implementation is highly dependent on system knowledge and hence on reliable baseline data on past disasters. In this communication, we demonstrate how newly gained knowledge on past flood incidents may improve flood hazard and risk assessments. Based on growth-ring analysis of trees growing in the floodplains and other, more classical paleo-hydrology techniques, we reconstruct the regional flood activity for the last decades. This information is then included as non-systematic data in the regional flood frequency analysis using Bayesian Markov chain Monte Carlo (MCMC) algorithms, so as to analyse the impact of the additional data on flood hazard assessments. Moreover, through a detailed analysis of three flood risk hotspots, we demonstrate how the newly gained knowledge on past flood disasters derived from indirect proxies can explain failures in the implementation of disaster risk management (DRM). Our methodology allowed identification of thirty-four unrecorded flood events at the study sites in the upper reaches since the early 20th century, and thus completion of the existing flood history of the region based on flow measurements in the lower part of the catchment. We observe that 56% of the floods occurred

  1. Generalized formal model of Big Data

    OpenAIRE

    Shakhovska, N.; Veres, O.; Hirnyak, M.

    2016-01-01

    This article dwells on the basic characteristic features of Big Data technologies. The existing definitions of the term "big data" are analyzed. The article proposes and describes the elements of a generalized formal model of big data and analyzes the peculiarities of applying the proposed model components. The fundamental differences between Big Data technology and business analytics are described. Big Data is supported by the distributed file system Google File System ...

  2. BigWig and BigBed: enabling browsing of large distributed datasets.

    Science.gov (United States)

    Kent, W J; Zweig, A S; Barber, G; Hinrichs, A S; Karolchik, D

    2010-09-01

    BigWig and BigBed files are compressed binary indexed files containing data at several resolutions that allow the high-performance display of next-generation sequencing experiment results in the UCSC Genome Browser. The visualization is implemented using a multi-layered software approach that takes advantage of specific capabilities of web-based protocols and of Linux and UNIX operating systems, together with R trees and various indexing and compression tricks. As a result, only the data needed to support the current browser view is transmitted rather than the entire file, enabling fast remote access to large distributed data sets. Binaries for the BigWig and BigBed creation and parsing utilities may be downloaded at http://hgdownload.cse.ucsc.edu/admin/exe/linux.x86_64/. Source code for the creation and visualization software is freely available for non-commercial use at http://hgdownload.cse.ucsc.edu/admin/jksrc.zip, implemented in C and supported on Linux. The UCSC Genome Browser is available at http://genome.ucsc.edu.

  3. Review of the flood risk management system in Germany after the major flood in 2013

    Directory of Open Access Journals (Sweden)

    Annegret H. Thieken

    2016-06-01

    Full Text Available Widespread flooding in June 2013 caused damage costs of €6 to 8 billion in Germany, and awoke many memories of the floods in August 2002, which resulted in total damage of €11.6 billion and hence was the most expensive natural hazard event in Germany up to now. The event of 2002 does, however, also mark a reorientation toward an integrated flood risk management system in Germany. Therefore, the flood of 2013 offered the opportunity to review how the measures that politics, administration, and civil society have implemented since 2002 helped to cope with the flood and what still needs to be done to achieve effective and more integrated flood risk management. The review highlights considerable improvements on many levels, in particular (1) an increased consideration of flood hazards in spatial planning and urban development, (2) comprehensive property-level mitigation and preparedness measures, (3) more effective flood warnings and improved coordination of disaster response, and (4) a more targeted maintenance of flood defense systems. In 2013, this led to more effective flood management and to a reduction of damage. Nevertheless, important aspects remain unclear and need to be clarified. This particularly holds for balanced and coordinated strategies for reducing and overcoming the impacts of flooding in large catchments, cross-border and interdisciplinary cooperation, the role of the general public in the different phases of flood risk management, as well as a transparent risk transfer system. Recurring flood events reveal that flood risk management is a continuous task. Hence, risk drivers, such as climate change, land-use changes, economic developments, or demographic change and the resultant risks must be investigated at regular intervals, and risk reduction strategies and processes must be reassessed as well as adapted and implemented in a dialogue with all stakeholders.

  4. Towards a Flood Severity Index

    Science.gov (United States)

    Kettner, A.; Chong, A.; Prades, L.; Brakenridge, G. R.; Muir, S.; Amparore, A.; Slayback, D. A.; Poungprom, R.

    2017-12-01

    Flooding is the most common natural hazard worldwide, affecting 21 million people every year. In the immediate moments following a flood event, humanitarian actors like the World Food Program need to make rapid decisions (within 72 hrs) on how to prioritize affected areas impacted by such an event. For other natural disasters like hurricanes/cyclones and earthquakes, there are industry-recognized standards for classifying the impacted areas. Shake maps quantifying peak ground motion, from for example the US Geological Survey, are widely used for assessing earthquakes. Similarly, cyclones are tracked by the Joint Typhoon Warning Center (JTWC) and the Global Disaster Alert and Coordination System (GDACS), which release storm nodes and tracks (forecast and actual) with wind buffers and classify the event according to the Saffir-Simpson Hurricane Wind Scale. For floods, the community is usually only able to acquire unclassified data on the flood extent as identified from satellite imagery. Most often no water discharge hydrograph is available to classify the event into recurrence intervals, simply because there is no gauging station, or the gauging station was unable to record the maximum discharge due to overtopping or flood damage. So the question remains: how do we methodically turn a flooded area into classified areas of different gradations of impact? Here, we present a first approach towards developing a globally applicable flood severity index. The flood severity index is set up such that it considers relatively easily obtainable physical parameters available within a short period of time, such as flood frequency (relating the current flood to historical events) and magnitude, as well as land cover, slope, and, where available, pre-event simulated flood depth. The scale includes categories ranging from very minor flooding to catastrophic flooding. We test and evaluate the postulated classification scheme against a set of past flood events. Once a severity category is determined, socio
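One way to picture the proposed index is as a weighted combination of the listed physical parameters mapped onto the five severity categories. The sketch below is purely hypothetical: the weights, the 0-1 factor scales, and the category cut-offs are all invented for illustration, since the abstract only outlines an index still under development.

```python
# Hypothetical sketch of a weighted flood severity score combining the
# parameters listed in the abstract (flood frequency, magnitude, land cover,
# slope, flood depth). Weights, scales and cut-offs are invented assumptions.

CATEGORIES = ["very minor", "minor", "moderate", "severe", "catastrophic"]

def severity(scores: dict, weights: dict) -> str:
    """scores: each factor rated 0 (benign) to 1 (worst); weights sum to 1."""
    total = sum(weights[k] * scores[k] for k in weights)   # combined 0..1
    idx = min(int(total * len(CATEGORIES)), len(CATEGORIES) - 1)
    return CATEGORIES[idx]

# Illustrative weighting emphasizing frequency and magnitude
w = {"frequency": 0.3, "magnitude": 0.3, "land_cover": 0.2,
     "slope": 0.1, "depth": 0.1}
print(severity({"frequency": 0.9, "magnitude": 0.8, "land_cover": 0.9,
                "slope": 0.5, "depth": 0.7}, w))
```

The point of the design is that every input can be derived quickly from satellite extent maps, historical archives, and global terrain/land-cover layers, without needing a gauged hydrograph.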

  5. Hydrological Modelling using HEC-HMS for Flood Risk Assessment of Segamat Town, Malaysia

    Science.gov (United States)

    Romali, N. S.; Yusop, Z.; Ismail, A. Z.

    2018-03-01

    This paper presents an assessment of the applicability of the Hydrologic Modelling System developed by the Hydrologic Engineering Center (HEC-HMS) for hydrological modelling of the Segamat River. The objective of the model application is to assist in the assessment of flood risk by providing the peak flows of the 2011 Segamat flood for the generation of flood maps of Segamat town. The capability of the model was evaluated by comparing historical observed data with the simulation results for the selected flood events. The model calibration and validation efficiency was verified using the Nash-Sutcliffe model efficiency coefficient. The results demonstrate the value of implementing the hydrological model for assessing flood risk: the simulated peak flow is in agreement with the historical observed data, and the model efficiencies for the calibration and validation exercises are 0.90 and 0.76, respectively, which is acceptable.
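The Nash-Sutcliffe efficiency used here to verify calibration and validation is straightforward to compute; a minimal sketch (the function name and example series are illustrative, not from the study):

```python
# Sketch: Nash-Sutcliffe model efficiency (NSE).
# NSE = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2)
# 1.0 is a perfect fit; values <= 0 mean the model predicts no better than
# simply using the mean of the observations.

def nse(observed, simulated):
    mean_obs = sum(observed) / len(observed)
    err = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    var = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - err / var

# Identical series give a perfect score
print(nse([120.0, 340.0, 510.0], [120.0, 340.0, 510.0]))  # -> 1.0
```

Against this scale, the paper's calibration (0.90) and validation (0.76) scores indicate a good and an acceptable fit, respectively.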

  6. Riparian plant community responses to increased flooding: a meta-analysis.

    Science.gov (United States)

    Garssen, Annemarie G; Baattrup-Pedersen, Annette; Voesenek, Laurentius A C J; Verhoeven, Jos T A; Soons, Merel B

    2015-08-01

    A future higher risk of severe flooding of streams and rivers has been projected to change riparian plant community composition and species richness, but the extent and direction of the expected change remain uncertain. We conducted a meta-analysis to synthesize globally available experimental evidence and assess the effects of increased flooding on (1) riparian adult plant and seedling survival, (2) riparian plant biomass and (3) riparian plant species composition and richness. We evaluated which plant traits are of key importance for the response of riparian plant species to flooding. We identified and analysed 53 papers from ISI Web of Knowledge which presented quantitative experimental results on flooding treatments and corresponding control situations. Our meta-analysis demonstrated how longer duration of flooding, greater depth of flooding and, particularly, their combination reduce seedling survival of most riparian species. Plant height above water level, ability to elongate shoots and plasticity in root porosity were decisive for adult plant survival and growth during longer periods of flooding. Both 'quiescence' and 'escape' proved to be successful strategies promoting riparian plant survival, which was reflected in the wide variation in survival (full range between 0 and 100%) under fully submerged conditions, while plants that protrude above the water level (>20 cm) almost all survive. Our survey confirmed that the projected increase in the duration and depth of flooding periods is sufficient to result in species shifts. These shifts may lead to increased or decreased riparian species richness depending on the nutrient, climatic and hydrological status of the catchment. Species richness was generally reduced at flooded sites in nutrient-rich catchments and sites that previously experienced relatively stable hydrographs (e.g. rain-fed lowland streams). Species richness usually increased at sites in desert and semi-arid climate regions (e.g. intermittent

  7. Constructing risks – Internalisation of flood risks in the flood risk management plan

    NARCIS (Netherlands)

    Roos, Matthijs; Hartmann, T.; Spit, T.J.M.; Johann, Georg

    Traditional flood protection methods have focused efforts on different measures to keep water out of floodplains. However, the European Flood Directive challenges this paradigm (Hartmann and Driessen, 2013). Accordingly, flood risk management plans should incorporate measures brought about by

  8. Legitimizing differentiated flood protection levels

    NARCIS (Netherlands)

    Thomas, Hartmann; Spit, Tejo

    2016-01-01

    The European flood risk management plan is a new instrument introduced by the Floods Directive. It introduces a spatial turn and a scenario approach in flood risk management, ultimately leading to differentiated flood protection levels on a catchment basis. This challenges the traditional sources of

  9. Flood risk management in Flanders: from flood risk objectives to appropriate measures through state assessment

    Directory of Open Access Journals (Sweden)

    Verbeke Sven

    2016-01-01

    Full Text Available In compliance with the EU Floods Directive to reduce flood risk, flood risk management objectives are indispensable for the delineation of necessary measures. In Flanders, flood risk management objectives are part of the environmental objectives that are legally integrated by the Decree on Integrated Water Policy. Appropriate objectives were derived through supporting studies and extensive consultation at the local, regional and policy levels. Under a general flood risk objective, sub-objectives are formulated for different aspects: water management and safety, shipping, ecology, and water supply. By developing a risk matrix, it is possible to assess the current state of flood risk and to judge where action is needed to decrease the risk. Three states of flood risk are distinguished: (a) acceptable risk, where no action is needed, (b) intermediate risk, where the risk should be reduced by cost-efficient actions, and (c) unacceptable risk, where action is necessary. For each particular aspect, the severity of the consequences of flooding is assessed by quantifiable indicators, such as economic risk, people at risk and ecological flood tolerance. The framework also allows evaluation of the effects of implemented measures and of autonomous developments such as climate change and land-use change. This approach gives a quantifiable assessment of state and enables prioritization of flood risk measures for the reduction of flood risk in a cost-efficient and sustainable way.
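In its simplest form, the three-state assessment could hinge on an expected-annual-damage indicator. The sketch below is a hypothetical illustration of the risk-matrix idea only; the thresholds, the damage indicator, and the function name are invented and are not Flanders' actual criteria.

```python
# Hypothetical sketch of a three-state flood risk matrix: cross the annual
# flood probability with a quantified consequence indicator (e.g. economic
# damage per event) and return one of the three states from the abstract.
# Thresholds are invented assumptions for illustration.

def risk_state(annual_probability: float, consequence: float,
               low: float = 1e4, high: float = 1e6) -> str:
    """Classify flood risk by expected annual damage (EAD)."""
    ead = annual_probability * consequence   # e.g. EUR per year
    if ead < low:
        return "acceptable"       # no action needed
    elif ead < high:
        return "intermediate"     # reduce via cost-efficient measures
    return "unacceptable"         # action necessary
```

In practice each aspect (safety, shipping, ecology, water supply) would have its own indicator and thresholds, with the matrix re-evaluated as measures, land use, and climate evolve.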

  10. Big Game Reporting Stations

    Data.gov (United States)

    Vermont Center for Geographic Information — Point locations of big game reporting stations. Big game reporting stations are places where hunters can legally report harvested deer, bear, or turkey. These are...

  11. How useful are Swiss flood insurance data for flood vulnerability assessments?

    Science.gov (United States)

    Röthlisberger, Veronika; Bernet, Daniel; Zischg, Andreas; Keiler, Margreth

    2015-04-01

    The databases of Swiss flood insurance companies constitute a valuable but, to date, rarely used source of information on physical flood vulnerability. Detailed insight into the Swiss flood insurance system is crucial for using the full potential of the different databases for research on flood vulnerability. Flood insurance in Switzerland is organized federally; the modalities are mainly regulated at the cantonal level. However, there are some common principles that apply throughout Switzerland. First, coverage against floods (and other particular natural hazards) is an integral part of every fire insurance policy for buildings or contents. This coupling of insurance, together with the statutory obligation to insure buildings in most of the cantons and movables in some of the cantons, leads to very high penetration. Second, in case of damage, the reinstatement costs (value as new) are compensated, and third, there is no (or only a small) deductible and co-pay. High penetration and the fact that the compensations represent a large share of the direct, tangible losses of the individual policyholders make the databases of the flood insurance companies a comprehensive and therefore valuable data source for flood vulnerability research. Insurance companies electronically store not only data about losses (typically date, amount of claims payment, cause of damage, identity of the insured object or policyholder) but also about insured objects. For insured objects, the (insured) value and the details of the policy and its holder are the main features recorded. For buildings, the insurance companies usually record additional information such as location, volume, year of construction or purpose of use. For the 19 (of 26) cantons with a cantonal monopoly insurer, the data of these establishments have the additional value of representing (almost) the entire building stock of the respective canton. Spatially referenced insurance data can be used for many aspects of

  12. Mitigating flood exposure

    Science.gov (United States)

    Shultz, James M; McLean, Andrew; Herberman Mash, Holly B; Rosen, Alexa; Kelly, Fiona; Solo-Gabriele, Helena M; Youngs Jr, Georgia A; Jensen, Jessica; Bernal, Oscar; Neria, Yuval

    2013-01-01

    Introduction. In 2011, following heavy winter snowfall, two cities bordering two rivers in North Dakota, USA, faced major flood threats. Flooding was foreseeable and predictable although the extent of risk was uncertain. One community, Fargo, situated in a shallow river basin, successfully mitigated and prevented flooding. For the other community, Minot, located in a deep river valley, prevention was not possible and downtown businesses and one-quarter of the homes were inundated, in the city’s worst flood on record. We aimed to contrast the respective hazards, vulnerabilities, stressors, psychological risk factors, psychosocial consequences, and disaster risk reduction strategies under conditions where flood prevention was, and was not, possible. Methods. We applied the “trauma signature analysis” (TSIG) approach to compare the hazard profiles, identify salient disaster stressors, document the key components of the disaster risk reduction response, and examine indicators of community resilience. Results. Two demographically comparable communities, Fargo and Minot, faced challenging river flood threats and exhibited effective coordination across community sectors. We examined the implementation of disaster risk reduction strategies in situations where coordinated citizen action was able to prevent disaster impact (hazard avoidance), compared to the more common scenario in which unpreventable disaster strikes, causing destruction, harm, and distress. Across a range of indicators, it is clear that successful mitigation diminishes both physical and psychological impact, thereby reducing the trauma signature of the event. Conclusion. In contrast to the experience of historic flooding in Minot, the city of Fargo succeeded in reducing the trauma signature by reducing risk through mitigation. PMID:28228985

  13. Why are decisions in flood disaster management so poorly supported by information from flood models?

    NARCIS (Netherlands)

    Leskens, Anne; Brugnach, Marcela Fabiana; Hoekstra, Arjen Ysbert; Schuurmans, W.

    2014-01-01

    Flood simulation models can provide practitioners of Flood Disaster Management with sophisticated estimates of floods. Despite the advantages that flood simulation modeling may provide, experiences have proven that these models are of limited use. Until now, this problem has mainly been investigated

  14. Big data and high-performance analytics in structural health monitoring for bridge management

    Science.gov (United States)

    Alampalli, Sharada; Alampalli, Sandeep; Ettouney, Mohammed

    2016-04-01

    Structural Health Monitoring (SHM) can be a vital tool for effective bridge management. Combining large data sets from multiple sources to create a data-driven decision-making framework is crucial for the success of SHM. This paper presents a big data analytics framework that combines multiple data sets, correlated through functional relatedness, to convert data into actionable information that empowers risk-based decision-making. The integrated data environment incorporates near real-time streams of semi-structured data from remote sensors, historical visual inspection data, and observations from structural analysis models to monitor, assess, and manage risks associated with aging bridge inventories. Accelerated processing of datasets is made possible by four technologies: cloud computing, relational database processing, NoSQL database support, and in-memory analytics. The framework is being validated on a railroad corridor that can be subjected to multiple hazards. The framework enables computation of reliability indices for critical bridge components and individual bridge spans. In addition, the framework includes a risk-based decision-making process that enumerates the costs and consequences of poor bridge performance at the span and network levels when rail networks are exposed to natural hazard events such as floods and earthquakes. Big data and high-performance analytics enable insights that assist bridge owners in addressing problems faster.

  15. Stalin's Big Fleet Program

    National Research Council Canada - National Science Library

    Hauner, Milan

    2002-01-01

    Although Dr. Milan Hauner's study 'Stalin's Big Fleet program' has focused primarily on the formation of Big Fleets during the Tsarist and Soviet periods of Russia's naval history, there are important lessons...

  16. Case studies of extended model-based flood forecasting: prediction of dike strength and flood impacts

    Science.gov (United States)

    Stuparu, Dana; Bachmann, Daniel; Bogaard, Tom; Twigt, Daniel; Verkade, Jan; de Bruijn, Karin; de Leeuw, Annemargreet

    2017-04-01

    Flood forecasts, warnings and emergency response are important components of flood risk management. Most flood forecasting systems use models to translate weather predictions into forecasted discharges or water levels. However, this information is often not sufficient for real-time decisions. A sound understanding of the reliability of embankments and of flood dynamics is needed to react in time and reduce the negative effects of the flood. Where are the weak points in the dike system? When, where and how much water will flow? When and where is the greatest impact expected? Model-based flood impact forecasting tries to answer these questions by adding new dimensions to existing forecasting systems, providing forecast information about: (a) the dike strength during the event (reliability), (b) the flood extent in case of an overflow or a dike failure (flood spread) and (c) the assets at risk (impacts). This work presents three case studies in which such a set-up is applied. Special features are highlighted. Forecasting of dike strength. The first case study focuses on the forecast of dike strength in the Netherlands for the Rhine branches Waal, Nederrijn and IJssel. A so-called reliability transformation is used to translate the predicted water levels at selected dike sections into failure probabilities during a flood event. The reliability of a dike section is defined by fragility curves - a summary of the dike strength conditional on the water level. The reliability information enhances the emergency management and inspection of embankments. Ensemble forecasting. The second case study shows the set-up of a flood impact forecasting system in Dumfries, Scotland. The existing forecasting system is extended with a 2D flood spreading model in combination with the Delft-FIAT impact model. Ensemble forecasts are used to account for the uncertainty in the precipitation forecasts and to quantify the confidence in a forecasted flood event.
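The reliability transformation described in the first case study amounts to evaluating a fragility curve at the forecasted water level. A minimal sketch, with invented curve points (real fragility curves are derived per dike section from geotechnical analysis):

```python
# Fragility-curve lookup: translate a forecasted water level into a
# dike-section failure probability. The curve points are illustrative.
import numpy as np

# Failure probability conditional on water level (m above a local datum)
levels = np.array([3.0, 4.0, 5.0, 6.0, 7.0])
p_fail = np.array([0.001, 0.01, 0.10, 0.45, 0.90])

def failure_probability(forecast_level_m: float) -> float:
    """Linearly interpolate the fragility curve at the forecast level."""
    return float(np.interp(forecast_level_m, levels, p_fail))

print(failure_probability(5.5))  # ~0.275
```

Applied to an ensemble of water-level forecasts, the same lookup yields a distribution of failure probabilities per dike section, which is what supports prioritizing inspections.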

  17. Multi-dimensional perspectives of flood risk - using a participatory framework to develop new approaches to flood risk communication

    Science.gov (United States)

    Rollason, Edward; Bracken, Louise; Hardy, Richard; Large, Andy

    2017-04-01

    Flooding is a major hazard across Europe which, since 1998, has caused over €52 million in damages and displaced over half a million people. Climate change is predicted to increase the risks posed by flooding in the future. The 2007 EU Floods Directive cemented the use of flood risk maps as a central tool in understanding and communicating flood risk. Following recent flooding in England, an urgent need has been acknowledged to integrate people living at risk from flooding into flood management approaches, encouraging flood resilience and the uptake of resilient activities. The effective communication of flood risk information plays a major role in allowing those at risk to make effective decisions about flood risk and increase their resilience; however, there are emerging concerns over the effectiveness of current approaches. The research presented explores current approaches to flood risk communication in England and the effectiveness of these methods in encouraging resilient actions before and during flooding events. The research also investigates how flood risk communications could be undertaken more effectively, using a novel participatory framework to integrate the perspectives of those living at risk. The research uses co-production between local communities and researchers in the environmental sciences, employing a participatory framework to bring together local knowledge of flood risk and flood communications. Using a local competency group, the research explores what those living at risk from flooding want from flood communications, in order to develop new approaches to help those at risk understand and respond to floods. Suggestions for practice are refined by the communities to co-produce recommendations. The research finds that current approaches to real-time flood risk communication fail to convey the significance of predicted floods, whilst flood maps lack detailed information about how floods occur, or use scientific terminology which people at risk

  18. Flood management: prediction of microbial contamination in large-scale floods in urban environments.

    Science.gov (United States)

    Taylor, Jonathon; Lai, Ka Man; Davies, Mike; Clifton, David; Ridley, Ian; Biddulph, Phillip

    2011-07-01

    With a changing climate and increased urbanisation, the occurrence and the impact of flooding are expected to increase significantly. Floods can bring pathogens into homes and cause lingering damp and microbial growth in buildings, with the level of growth and persistence dependent on the volume and chemical and biological content of the flood water, the properties of the contaminating microbes, and the surrounding environmental conditions, including the restoration time and methods, the heat and moisture transport properties of the envelope design, and the ability of the construction material to sustain the microbial growth. The public health risk will depend on the interaction of these complex processes and the vulnerability and susceptibility of occupants in the affected areas. After the 2007 floods in the UK, the Pitt review noted that there is a lack of relevant scientific evidence and consistency with regard to the management and treatment of flooded homes, which not only put the local population at risk but also caused unnecessary delays in the restoration effort. Understanding the drying behaviour of flooded buildings in the UK building stock under different scenarios, and the ability of microbial contaminants to grow, persist, and produce toxins within these buildings, can help inform recovery efforts. To contribute to future flood management, this paper proposes the use of building simulations and biological models to predict the risk of microbial contamination in typical UK buildings. We review the state of the art with regard to biological contamination following flooding, relevant building simulation, simulation-linked microbial modelling, and current practical considerations in flood remediation. Using the city of London as an example, a methodology is proposed that uses GIS as a platform to integrate drying models and microbial risk models with the local building stock and flood models. 
The integrated tool will help local governments, health authorities

  19. Five Big, Big Five Issues : Rationale, Content, Structure, Status, and Crosscultural Assessment

    NARCIS (Netherlands)

    De Raad, Boele

    1998-01-01

    This article discusses the rationale, content, structure, status, and crosscultural assessment of the Big Five trait factors, focusing on topics of dispute and misunderstanding. Taxonomic restrictions of the original Big Five forerunner, the "Norman Five," are discussed, and criticisms regarding the

  20. Evaluation of various modelling approaches in flood routing simulation and flood area mapping

    Science.gov (United States)

    Papaioannou, George; Loukas, Athanasios; Vasiliades, Lampros; Aronica, Giuseppe

    2016-04-01

    An essential process in flood hazard analysis and mapping is floodplain modelling. The selection of the modelling approach, especially in complex riverine topographies such as urban and suburban areas and in ungauged watersheds, may affect the accuracy of the outcomes in terms of flood depths and flood inundation area. In this study, a sensitivity analysis was implemented using several hydraulic-hydrodynamic modelling approaches (1D, 2D, 1D/2D), and the effect of the modelling approach on flood modelling and flood mapping was investigated. The digital terrain model (DTM) used in this study was generated from Terrestrial Laser Scanning (TLS) point cloud data. The modelling approaches included 1-dimensional hydraulic-hydrodynamic models (1D), 2-dimensional hydraulic-hydrodynamic models (2D) and coupled 1D/2D models. The 1D models used were HEC-RAS, MIKE 11, LISFLOOD and XPSTORM. The 2D models used were MIKE 21, MIKE 21 FM, HEC-RAS (2D), XPSTORM, LISFLOOD and FLO-2D. The coupled 1D/2D models employed were HEC-RAS (1D/2D), MIKE 11/MIKE 21 (MIKE FLOOD platform), MIKE 11/MIKE 21 FM (MIKE FLOOD platform) and XPSTORM (1D/2D). The validation of flood extent was achieved using 2x2 contingency tables between the simulated and observed flooded areas for an extreme historical flash flood event. The Critical Success Index skill score was used in the validation process. The modelling approaches were also evaluated for simulation time and required computing power. The methodology was implemented in a suburban ungauged watershed of the Xerias river at Volos, Greece. The results of the analysis indicate the necessity of applying sensitivity analysis with different hydraulic-hydrodynamic modelling approaches, especially for areas with complex terrain.
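The Critical Success Index used for the flood-extent validation is computed directly from the 2x2 contingency table of simulated versus observed wet cells, CSI = hits / (hits + misses + false alarms); the cell counts below are illustrative:

```python
# Critical Success Index (CSI, also called the threat score) from a 2x2
# contingency table of flood-extent cells.

def critical_success_index(hits: int, misses: int, false_alarms: int) -> float:
    """hits: wet in both model and observation;
    misses: wet only in observation; false_alarms: wet only in model.
    Correct negatives (dry in both) do not enter the score."""
    return hits / (hits + misses + false_alarms)

# Example with illustrative counts: 800 cells flooded in both,
# 150 observed-only, 50 model-only.
print(critical_success_index(800, 150, 50))  # 0.8
```

A CSI of 1 means a perfect match of the flooded areas; note the score deliberately ignores the typically large number of correctly dry cells, which would otherwise inflate simpler accuracy measures.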

  1. Effects of climate variability on global scale flood risk

    Science.gov (United States)

    Ward, P.; Dettinger, M. D.; Kummu, M.; Jongman, B.; Sperna Weiland, F.; Winsemius, H.

    2013-12-01

    In this contribution we demonstrate the influence of climate variability on flood risk. Globally, flooding is one of the worst natural hazards in terms of economic damages; Munich Re estimates global losses in the last decade to be in excess of $240 billion. As a result, scientifically sound estimates of flood risk at the largest scales are increasingly needed by industry (including multinational companies and the insurance industry) and policy communities. Several assessments of global-scale flood risk under current conditions have recently become available, and this year has seen the first studies assessing how flood risk may change in the future due to global change. However, the influence of climate variability on flood risk has as yet hardly been studied, despite the fact that: (a) in other fields (drought, hurricane damage, food production) this variability is as important for policy and practice as long-term change; and (b) climate variability has a strong influence on peak river flows around the world. To address this issue, this contribution illustrates the influence of ENSO-driven climate variability on flood risk, at both the globally aggregated scale and the scale of countries and large river basins. Although ENSO exerts significant and widespread influences on flood peak discharges in many parts of the world, we show that it does not have a statistically significant influence on flood risk once aggregated to global totals. At the scale of individual countries, though, strong relationships exist over large parts of the Earth's surface. For example, we find particularly strong anomalies of flood risk in El Niño or La Niña years (compared to all years) in southern Africa, parts of western Africa, Australia, parts of Central Eurasia (especially for El Niño), the western USA (especially for La Niña), and parts of South America. These findings have large implications for both decadal climate-risk projections and long-term future climate change

  2. Flood-proof motors

    Energy Technology Data Exchange (ETDEWEB)

    Schmitt, Marcus [AREVA NP GmbH, Erlangen (Germany)

    2013-07-01

    Even before the Fukushima event, some German nuclear power plants (NPPs) had considered flooding scenarios. As a result of one of these studies, AREVA performed an upgrade project in NPP Isar 1 with flood-proof motors as a replacement for the existing air-cooled low-voltage and high-voltage motors of the emergency cooling chain. After the Fukushima event, in which the cooling chains failed, the topic of flood-proof equipment has come increasingly into focus. This contribution introduces different kinds of flood-proof electrical motors which are currently installed, or planned for installation, in NPPs around the world. Moreover, the process of qualification, as it was performed during the project in NPP Isar 1, is presented. (orig.)

  3. Flood-proof motors

    International Nuclear Information System (INIS)

    Schmitt, Marcus

    2013-01-01

    Even before the Fukushima event, some German nuclear power plants (NPPs) had considered flooding scenarios. As a result of one of these studies, AREVA performed an upgrade project in NPP Isar 1 with flood-proof motors as a replacement for the existing air-cooled low-voltage and high-voltage motors of the emergency cooling chain. After the Fukushima event, in which the cooling chains failed, the topic of flood-proof equipment has come increasingly into focus. This contribution introduces different kinds of flood-proof electrical motors which are currently installed, or planned for installation, in NPPs around the world. Moreover, the process of qualification, as it was performed during the project in NPP Isar 1, is presented. (orig.)

  4. Numerical analysis of the big bounce in loop quantum cosmology

    International Nuclear Information System (INIS)

    Laguna, Pablo

    2007-01-01

    Loop quantum cosmology (LQC) homogeneous models with a massless scalar field show that the big-bang singularity can be replaced by a big quantum bounce. To gain further insight on the nature of this bounce, we study the semidiscrete loop quantum gravity Hamiltonian constraint equation from the point of view of numerical analysis. For illustration purposes, we establish a numerical analogy between the quantum bounces and reflections in finite difference discretizations of wave equations triggered by the use of nonuniform grids or, equivalently, reflections found when solving numerically wave equations with varying coefficients. We show that the bounce is closely related to the method for the temporal update of the system and demonstrate that explicit time-updates in general yield bounces. Finally, we present an example of an implicit time-update devoid of bounces and show back-in-time, deterministic evolutions that reach and partially jump over the big-bang singularity

  5. Hurricane Harvey Riverine Flooding: Part 1 - Reconstruction of Hurricane Harvey Flooding for Harris County, TX using a GPU-accelerated 2D flood model for post-flood hazard analysis

    Science.gov (United States)

    Kalyanapu, A. J.; Dullo, T. T.; Gangrade, S.; Kao, S. C.; Marshall, R.; Islam, S. R.; Ghafoor, S. K.

    2017-12-01

    Hurricane Harvey, which made landfall in southern Texas in August 2017, was one of the most destructive hurricanes of the 2017 hurricane season. During its active period, many areas in the coastal Texas region received more than 40 inches of rain. This downpour caused significant flooding resulting in about 77 casualties, displacing more than 30,000 people, inundating hundreds of thousands of homes, and is currently estimated to have caused more than $70 billion in direct damage. One of the most significantly affected areas is Harris County, where the city of Houston, TX is located. Covering over two HUC-8 drainage basins (~2,702 mi2), this county experienced more than 80% of its annual average rainfall during this event. This study presents an effort to reconstruct the flooding caused by extreme rainfall due to Hurricane Harvey in Harris County, Texas. This computationally intensive task was performed at a 30-m spatial resolution using a rapid flood model called Flood2D-GPU, a graphics processing unit (GPU) accelerated model, on Oak Ridge National Laboratory's (ORNL) Titan supercomputer. For this task, hourly rainfall estimates from the National Centers for Environmental Prediction Stage IV Quantitative Precipitation Estimate were fed into the Variable Infiltration Capacity (VIC) hydrologic model and the Routing Application for Parallel computation of Discharge (RAPID) routing model to estimate flow hydrographs at 69 locations for the Flood2D-GPU simulation. Preliminary results of the simulation, including flood inundation extents and maps of flood depths and inundation duration, will be presented. Future efforts will focus on calibrating and validating the simulation results and assessing the flood damage for a better understanding of the impacts made by Hurricane Harvey.

  6. Big data challenges

    DEFF Research Database (Denmark)

    Bachlechner, Daniel; Leimbach, Timo

    2016-01-01

    Although reports on big data success stories have been accumulating in the media, most organizations dealing with high-volume, high-velocity and high-variety information assets still face challenges. Only a thorough understanding of these challenges puts organizations into a position in which...... they can make an informed decision for or against big data, and, if the decision is positive, overcome the challenges smoothly. The combination of a series of interviews with leading experts from enterprises, associations and research institutions, and focused literature reviews allowed not only...... framework are also relevant. For large enterprises and startups specialized in big data, it is typically easier to overcome the challenges than it is for other enterprises and public administration bodies....

  7. Earth Science Capability Demonstration Project

    Science.gov (United States)

    Cobleigh, Brent

    2006-01-01

    A viewgraph presentation reviewing the Earth Science Capability Demonstration Project is shown. The contents include: 1) ESCD Project; 2) Available Flight Assets; 3) Ikhana Procurement; 4) GCS Layout; 5) Baseline Predator B Architecture; 6) Ikhana Architecture; 7) UAV Capability Assessment; 8) The Big Picture; 9) NASA/NOAA UAV Demo (5/05 to 9/05); 10) NASA/USFS Western States Fire Mission (8/06); and 11) Suborbital Telepresence.

  8. Big Data and HPC collocation: Using HPC idle resources for Big Data Analytics

    OpenAIRE

    MERCIER, Michael; Glesser, David; Georgiou, Yiannis; Richard, Olivier

    2017-01-01

    International audience; Executing Big Data workloads upon High Performance Computing (HPC) infrastructures has become an attractive way to improve their performance. However, the collocation of HPC and Big Data workloads is not an easy task, mainly because of their core concepts' differences. This paper focuses on the challenges related to the scheduling of both Big Data and HPC workloads on the same computing platform. In classic HPC workloads, the rigidity of jobs tends to create holes in ...

  9. Ecosystem Approach To Flood Disaster Risk Reduction

    Directory of Open Access Journals (Sweden)

    RK Kamble

    2013-12-01

    Full Text Available India is one of the ten worst disaster-prone countries of the world. The country is prone to disasters due to a number of factors, both natural and anthropogenic, including adverse geo-climatic conditions, topographical features, environmental degradation, population growth, urbanisation, industrialisation, non-scientific development practices, etc. These factors, either directly or by accelerating the intensity and frequency of disasters, are responsible for a heavy toll of human lives and for disrupting the life support systems in the country. India has 40 million hectares of flood-prone area; on average, floods affect an area of around 7.5 million hectares per year. Knowledge of environmental systems and processes is a key factor in the management of disasters, particularly the hydro-meteorological ones. Management of flood risk and disaster is a multi-dimensional affair that calls for an interdisciplinary approach. Ecosystem-based disaster risk reduction builds on ecosystem management principles, strategies and tools in order to maximise ecosystem services for risk reduction. This perspective takes into account the integration of social and ecological systems, placing people at the centre of decision making. The present paper attempts to demonstrate how an ecosystem-based approach can help in flood disaster risk reduction. International Journal of Environment, Volume-2, Issue-1, Sep-Nov 2013, Pages 70-82 DOI: http://dx.doi.org/10.3126/ije.v2i1.9209

  10. Development of high-resolution multi-scale modelling system for simulation of coastal-fluvial urban flooding

    Science.gov (United States)

    Comer, Joanne; Indiana Olbert, Agnieszka; Nash, Stephen; Hartnett, Michael

    2017-02-01

    Urban developments in coastal zones are often exposed to natural hazards such as flooding. In this research, a state-of-the-art, multi-scale nested flood (MSN_Flood) model is applied to simulate complex coastal-fluvial urban flooding due to the combined effects of tides, surges and river discharges. Cork City, on Ireland's southwest coast, is the study case. The flood modelling system comprises a cascade of four dynamically linked models that resolve the hydrodynamics of Cork Harbour and/or its sub-region at four scales: 90, 30, 6 and 2 m. Results demonstrate that the internalization of the nested boundary through the use of ghost cells, combined with a tailored adaptive interpolation technique, creates a highly dynamic moving boundary that permits flooding and drying of the nested boundary. This novel feature of MSN_Flood provides a high degree of choice regarding the location of the boundaries of the nested domain, and therefore flexibility in model application. Through dynamic downscaling, the nested MSN_Flood model facilitates significant improvements in the accuracy of model output without incurring the computational expense of high spatial resolution over the entire model domain. The urban flood model provides the full characteristics of water levels and flow regimes necessary for flood hazard identification and flood risk assessment.

  11. Examination of flood characteristics at selected streamgages in the Meramec River Basin, eastern Missouri, December 2015–January 2016

    Science.gov (United States)

    Holmes, Robert R.; Koenig, Todd A.; Rydlund, Jr., Paul H.; Heimann, David C.

    2016-09-13

    Overview. Heavy rainfall resulted in major flooding in the Meramec River Basin in eastern Missouri during late December 2015 through early January 2016. Cumulative rainfall from December 14 to 29, 2015, ranged from 7.6 to 12.3 inches at selected precipitation stations in the basin, with flooding driven by the heaviest precipitation (3.9–9.7 inches) between December 27 and 29, 2015. Financial losses from flooding included damage to homes and other structures, damage to roads, and debris removal. Eight of the 11 counties in the basin were declared Federal Disaster Areas. The U.S. Geological Survey (USGS), in cooperation with the U.S. Army Corps of Engineers and the St. Louis Metropolitan Sewer District, operates multiple streamgages along the Meramec River and its primary tributaries, including the Bourbeuse River and Big River. The period of record for streamflow at the streamgages in the basin included in this report ranges from 24 to 102 years. Instrumentation in a streamgage shelter automatically makes observations of stage using a variety of methods (submersible pressure transducer, non-submersible pressure transducer, or non-contact radar). These observations are recorded autonomously at a predetermined programmed frequency (typically either 15 or 30 minutes) dependent on drainage-area size and the concomitant flashiness of the stream. Although stage data are important, streamflow data are equally or more important for streamflow forecasting, water-quality constituent load computation, flood-frequency analysis, and flood mitigation planning. Streamflows are computed from recorded stage data using an empirically determined relation between stage and streamflow termed a “rating.” Development and verification of the rating requires periodic onsite discrete measurements of streamflow through time and over the range of stages to define local hydraulic conditions. The purpose of this report is to examine characteristics of flooding that occurred in the Meramec River Basin in
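The stage-to-streamflow rating described above is often approximated by a power law, Q = C(h - h0)^b, with the coefficients fitted from the periodic discrete streamflow measurements. A minimal sketch with invented coefficients (not the rating for any actual Meramec Basin streamgage):

```python
# Power-law stage-discharge rating: Q = C * (h - h0)**b, where h is stage (m),
# h0 is the stage of zero flow, and C and b are fitted empirically.
# The coefficient values below are illustrative assumptions.

def rating(stage_m: float, C: float = 25.0, h0: float = 0.3, b: float = 1.8) -> float:
    """Streamflow (m^3/s) from stage (m) via a power-law rating."""
    return C * max(stage_m - h0, 0.0) ** b

for h in (1.0, 2.0, 4.0):
    print(h, round(rating(h), 1))
```

In practice a single gage may need several rating segments over its stage range, and the curve is periodically shifted or re-fitted as channel hydraulics change, which is why the onsite verification measurements are essential.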

  12. Testing an innovative framework for flood forecasting, monitoring and mapping in Europe

    Science.gov (United States)

    Dottori, Francesco; Kalas, Milan; Lorini, Valerio; Wania, Annett; Pappenberger, Florian; Salamon, Peter; Ramos, Maria Helena; Cloke, Hannah; Castillo, Carlos

    2017-04-01

    Between May and June 2016, France was hit by severe floods, particularly in the Loire and Seine river basins. In this work, we use this case study to test an innovative framework for flood forecasting, mapping and monitoring. In more detail, the system integrates in real time two components of the Copernicus Emergency Management Service, namely the European Flood Awareness System and the satellite-based Rapid Mapping, with new procedures for rapid risk assessment and for social media and news monitoring. We explore in detail the performance of each component of the system, demonstrating the improvements with respect to stand-alone flood forecasting and monitoring systems. We show how the performance of the forecasting component can be refined using real-time feedback from social media monitoring to identify which areas were flooded, to evaluate the flood intensity, and therefore to correct impact estimations. Moreover, we show how the integration with impact forecasts and social media monitoring can improve the timeliness and efficiency of satellite-based emergency mapping, and reduce the chances of missing areas where flooding is already happening. These results illustrate how the new integrated approach leads to better and earlier decision making and a timely evaluation of impacts.

  13. Boarding to Big data

    Directory of Open Access Journals (Sweden)

    Oana Claudia BRATOSIN

    2016-05-01

    Today Big data is an emerging topic, as the quantity of information grows exponentially, laying the foundation for its main challenge: the value of the information. That value is defined not only by extracting value from huge data sets, as fast and optimally as possible, but also by extracting value from uncertain and inaccurate data in an innovative manner using Big data analytics. At this point, the main challenge for businesses that use Big data tools is to clearly define the scope and the necessary output of the business so that real value can be gained. This article aims to explain the Big data concept, its various classification criteria and architecture, as well as its impact on processes worldwide.

  14. The Problem with Big Data: Operating on Smaller Datasets to Bridge the Implementation Gap.

    Science.gov (United States)

    Mann, Richard P; Mushtaq, Faisal; White, Alan D; Mata-Cervantes, Gabriel; Pike, Tom; Coker, Dalton; Murdoch, Stuart; Hiles, Tim; Smith, Clare; Berridge, David; Hinchliffe, Suzanne; Hall, Geoff; Smye, Stephen; Wilkie, Richard M; Lodge, J Peter A; Mon-Williams, Mark

    2016-01-01

    Big datasets have the potential to revolutionize public health. However, there is a mismatch between the political and scientific optimism surrounding big data and the public's perception of its benefit. We suggest a systematic and concerted emphasis on developing models derived from smaller datasets to illustrate to the public how big data can produce tangible benefits in the long term. In order to highlight the immediate value of a small data approach, we produced a proof-of-concept model predicting hospital length of stay. The results demonstrate that existing small datasets can be used to create models that generate a reasonable prediction, facilitating health-care delivery. We propose that greater attention (and funding) needs to be directed toward the utilization of existing information resources in parallel with current efforts to create and exploit "big data."

  15. Big data - a 21st century science Maginot Line? No-boundary thinking: shifting from the big data paradigm.

    Science.gov (United States)

    Huang, Xiuzhen; Jennings, Steven F; Bruce, Barry; Buchan, Alison; Cai, Liming; Chen, Pengyin; Cramer, Carole L; Guan, Weihua; Hilgert, Uwe Kk; Jiang, Hongmei; Li, Zenglu; McClure, Gail; McMullen, Donald F; Nanduri, Bindu; Perkins, Andy; Rekepalli, Bhanu; Salem, Saeed; Specker, Jennifer; Walker, Karl; Wunsch, Donald; Xiong, Donghai; Zhang, Shuzhong; Zhang, Yu; Zhao, Zhongming; Moore, Jason H

    2015-01-01

    Whether your interests lie in scientific arenas, the corporate world, or in government, you have certainly heard the praises of big data: Big data will give you new insights, allow you to become more efficient, and/or solve your problems. While big data has had some outstanding successes, many are now beginning to see that it is not the silver bullet it has been touted to be. Here our main concern is the overall impact of big data: the current manifestation of big data is constructing a Maginot Line in science in the 21st century. Big data is no longer simply "lots of data" as a phenomenon; the big data paradigm puts the spirit of the Maginot Line into lots of data. Big data overall is disconnecting researchers from science challenges. We propose No-Boundary Thinking (NBT): applying no-boundary thinking in problem definition to address science challenges.

  16. Tax Haven Networks and the Role of the Big 4 Accountancy Firms

    OpenAIRE

    Jones, Christopher M; Temouri, Yama; Cobham, Alex

    2017-01-01

    This paper investigates the association between the Big 4 accountancy firms and the extent to which multinational enterprises build, manage and maintain their networks of tax haven subsidiaries. We extend internalisation theory and derive a number of hypotheses that are tested using count models on firm-level data. Our key findings demonstrate that there is a strong correlation and causal link between the size of an MNE’s tax haven network and their use of the Big 4. We therefore argue that p...

  17. Big Egos in Big Science

    DEFF Research Database (Denmark)

    Andersen, Kristina Vaarst; Jeppesen, Jacob

    In this paper we investigate the micro-mechanisms governing the structural evolution and performance of scientific collaboration. Scientific discovery tends not to be led by so-called lone "stars", or big egos, but instead by collaboration among groups of researchers, from a multitude of institutions...

  18. Big Data and Big Science

    OpenAIRE

    Di Meglio, Alberto

    2014-01-01

    Brief introduction to the challenges of big data in scientific research based on the work done by the HEP community at CERN and how the CERN openlab promotes collaboration among research institutes and industrial IT companies. Presented at the FutureGov 2014 conference in Singapore.

  19. Flood Mitigation and Response: Comparing the Great Midwest Floods of 1993 and 2008

    Science.gov (United States)

    2010-12-01

    Robert Holmes and Heidi Koontz, “Two 500-Year Floods Within 15 Years—What are the Odds?,” http://64.233.167.104/custom?q...implies a 1-in-100 (or 1 percent) chance a flood of that magnitude will occur in a given year. Robert Holmes and Heidi Koontz, “Two 500-Year Floods...Fact Sheet 2004-3024 (U.S. Geological Survey: May 2004). 92 ______ and Koontz, Heidi. “Two 500-Year Floods Within 15 Years—What are the Odds

  20. Challenges of Big Data Analysis.

    Science.gov (United States)

    Fan, Jianqing; Han, Fang; Liu, Han

    2014-06-01

    Big Data bring new opportunities to modern society and challenges to data scientists. On one hand, Big Data hold great promise for discovering subtle population patterns and heterogeneities that are not possible with small-scale data. On the other hand, the massive sample size and high dimensionality of Big Data introduce unique computational and statistical challenges, including scalability and storage bottlenecks, noise accumulation, spurious correlation, incidental endogeneity, and measurement errors. These challenges are distinctive and require new computational and statistical paradigms. This article gives an overview of the salient features of Big Data and how these features impact paradigm change in statistical and computational methods as well as computing architectures. We also provide various new perspectives on Big Data analysis and computation. In particular, we emphasize the viability of the sparsest solution in a high-confidence set and point out that the exogeneity assumptions in most statistical methods for Big Data cannot be validated due to incidental endogeneity. They can lead to wrong statistical inferences and consequently wrong scientific conclusions.
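The spurious-correlation point can be made concrete: with a fixed small sample, the best correlation found among many pure-noise predictors creeps upward as the number of candidates grows. A small illustrative simulation (sample size, predictor counts, and the seed are arbitrary choices, not from the paper):

```python
import random

random.seed(42)

def corr(a, b):
    """Pearson correlation of two equal-length sequences (pure stdlib)."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a)
    vb = sum((y - mb) ** 2 for y in b)
    return cov / (va * vb) ** 0.5

n = 50  # small sample size
target = [random.gauss(0, 1) for _ in range(n)]

def max_abs_corr(p):
    """Largest |correlation| between the target and p pure-noise predictors."""
    return max(abs(corr(target, [random.gauss(0, 1) for _ in range(n)]))
               for _ in range(p))

# With more candidate predictors, the best spurious correlation grows,
# even though every predictor is independent of the target by construction.
few, many = max_abs_corr(10), max_abs_corr(2000)
```

With n = 50, screening 2000 noise variables typically yields a "best" correlation near 0.5, which naive variable selection would happily report as signal.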

  1. GIS Support for Flood Rescue

    DEFF Research Database (Denmark)

    Liang, Gengsheng; Mioc, Darka; Anton, François

    2007-01-01

    Under flood events, ground traffic is blocked in and around the flooded area due to damage to roads and bridges. The traditional transportation network may not always help people make the right decision for evacuation. In order to provide the dynamic road information needed for flood rescue, we developed an adaptive web-based transportation network application using Oracle technology. The geographic relationships between the road network and flood areas are taken into account, and the overlay between the road network and flood polygons is computed on the fly. This application allows users to retrieve the shortest and safest route in the Fredericton road network during a flood event, enabling them to make a timely decision for flood rescue. We are using Oracle Spatial to deal with emergency situations, an approach that can be applied to other constrained network applications as well.
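The shortest-and-safest-route idea can be sketched without Oracle Spatial: a plain Dijkstra search over the road graph that simply skips edges intersecting flood polygons. The graph, edge lengths, and flooded set below are hypothetical:

```python
import heapq

def shortest_safe_route(graph, flooded, start, goal):
    """Dijkstra over a road graph, skipping edges crossing flooded segments.

    graph: {node: [(neighbor, length), ...]}; flooded: set of (u, v) pairs
    (both orientations are checked). Returns (distance, path) or (inf, []).
    """
    dist = {start: 0.0}
    prev = {}
    pq = [(0.0, start)]
    done = set()
    while pq:
        d, u = heapq.heappop(pq)
        if u in done:
            continue
        done.add(u)
        if u == goal:
            break
        for v, w in graph.get(u, []):
            if (u, v) in flooded or (v, u) in flooded:
                continue  # impassable during the flood event
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(pq, (nd, v))
    if goal not in dist:
        return float("inf"), []
    path, node = [goal], goal
    while node != start:
        node = prev[node]
        path.append(node)
    return dist[goal], path[::-1]

roads = {"A": [("B", 1.0), ("C", 2.0)], "B": [("D", 1.0)], "C": [("D", 2.0)]}
route = shortest_safe_route(roads, {("A", "B")}, "A", "D")  # detours via C
```

Recomputing the flooded set as new flood polygons arrive is what makes the routing "adaptive" in the sense the abstract describes.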

  2. Demonstrator Flood Control Room: Inventory of the wishes of the different Deltares units and a design based on these

    NARCIS (Netherlands)

    Boertjens, G.J.; Attema-van Waas, A.R.; Guikema, M.; Schilder, C.M.C.; Veen, M.J. van der

    2009-01-01

    Based on the research carried out, TNO draws the following conclusions: • The existing room that Deltares has in mind for realizing the training facility is small. A first phase of the desired Flood Control Room can be realized in this room, bearing in mind that not all

  3. Assessment of flood Response Characteristics to Urbanization and extreme flood events-Typhoons at Cheongju, Chungbuk

    Science.gov (United States)

    Chang, HyungJoon; Lee, Hyosang; Hwang, Myunggyu; Jang, Sukhwan

    2016-04-01

    Changes in land use influence flood characteristics, which depend on rainfall-runoff processes in the catchment. This study assesses the changes in flood characteristics due to land use changes between 1997 and 2012. The catchment model (HEC-HMS) is calibrated with flood events of the 1990s and 2000s respectively; then design rainfalls with 100-, 200-, and 500-year return periods are applied to this model, which represents the catchment in the 1990s and 2000s, to assess the flood peaks. The extreme flood events (i.e., 6 typhoon events) are then applied to assess the flood responses. The comparison between the 1990s and 2000s shows that the flood peak and level of the 2000s increase and the time to peak of the 2000s decreases relative to the 1990s: a 3% to 78% increase in flood peak, 3% in flood level, and a 10.2% to 16% decrease in time to peak for the 100-year return period flood. This is due to the decrease of farmland area (2.18%) and mountainous area (8.88%), and the increase of urbanized area (5.86%). This study also estimates the responses to extreme flood events. The results for the 2000s show increases in flood peak and time to peak compared to the 1990s, indicating that extreme rainfall is more influential in the unurbanized catchment (2000s), resulting in an 11% increase in peak volume. Acknowledgement: This research was supported by a grant (11-TI-C06) from the Advanced Water Management Research Program funded by the Ministry of Land, Infrastructure and Transport of the Korean government.

  4. “Expect More Floods In 2013”: An analysis of flood preparedness in ...

    African Journals Online (AJOL)

    In 2013, the Nigerian Meteorological Agency (NIMET) issued a prediction of heavy rainfall with consequent flooding in some major cities of Nigeria particularly Ibadan. In light of the country's previous flood experiences, citizens and government were promptly alerted and advised to be fully prepared for imminent floods.

  5. Hurricane Harvey Riverine Flooding: Part 2: Integration of Heterogeneous Earth Observation Data for Comparative Analysis with High-Resolution Inundation Boundaries Reconstructed from Flood2D-GPU Model

    Science.gov (United States)

    Jackson, C.; Sava, E.; Cervone, G.

    2017-12-01

    Hurricane Harvey has been noted as the wettest cyclone on record for the US as well as the most destructive (so far) of the 2017 hurricane season. An entire year's worth of rainfall occurred over the course of a few days. The city of Houston was greatly impacted as the storm lingered over the city for five days, causing a record-breaking 50+ inches of rain as well as severe damage from flooding. Flood model simulations were performed to reconstruct the event in order to better understand, assess, and predict flooding dynamics for the future. Additionally, a number of remote sensing platforms and on-ground instruments that provide near real-time data have also been used for flood identification, monitoring, and damage assessment. Although both flood models and remote sensing techniques are able to identify inundated areas, rapid and accurate flood prediction at a high spatio-temporal resolution remains a challenge. Thus a methodological approach that fuses the two techniques can help to better validate what is being modeled and observed. Recent advancements in techniques for fusing remote sensing with near real-time heterogeneous datasets have allowed emergency responders to more efficiently extract increasingly precise and relevant knowledge from the available information. In this work, the use of multiple sources of contributed data, coupled with remotely sensed and open-source geospatial datasets, is demonstrated to generate an understanding of potential damage assessment for the floods after Hurricane Harvey in Harris County, Texas. The feasibility of integrating multiple sources at different temporal and spatial resolutions into hydrodynamic models for flood inundation simulations is assessed. Furthermore, the contributed datasets are compared against a reconstructed flood extent generated from the Flood2D-GPU model.

  6. Benchmarking flood models from space in near real-time: accommodating SRTM height measurement errors with low resolution flood imagery

    Science.gov (United States)

    Schumann, G.; di Baldassarre, G.; Alsdorf, D.; Bates, P. D.

    2009-04-01

    In February 2000, the Shuttle Radar Topography Mission (SRTM) measured the elevation of most of the Earth's surface with spatially continuous sampling and an absolute vertical accuracy better than 9 m. The vertical error has been shown to vary with topographic complexity, being smaller over flat terrain. This allows water surface slopes to be measured and associated discharge volumes to be estimated for open channels in large basins, such as the Amazon. Building on these capabilities, this paper demonstrates that near real-time coarse-resolution radar imagery of a recent flood event on a 98 km reach of the River Po (Northern Italy), combined with SRTM terrain height data, leads to a water slope remarkably similar to that derived by combining the radar image with highly accurate airborne laser altimetry. Moreover, it is shown that this space-borne flood wave approximation compares well to a hydraulic model and thus allows the performance of the latter, calibrated on a previous event, to be assessed when applied to an event of different magnitude in near real time. These results are not only of great importance to real-time flood management and flood forecasting but also support the upcoming Surface Water and Ocean Topography (SWOT) mission that will routinely provide water levels and slopes with higher precision around the globe.

  7. Big data is not a monolith

    CERN Document Server

    Ekbia, Hamid R; Mattioli, Michael

    2016-01-01

    Big data is ubiquitous but heterogeneous. Big data can be used to tally clicks and traffic on web pages, find patterns in stock trades, track consumer preferences, and identify linguistic correlations in large corpuses of texts. This book examines big data not as an undifferentiated whole but contextually, investigating the varied challenges posed by big data for health, science, law, commerce, and politics. Taken together, the chapters reveal a complex set of problems, practices, and policies. The advent of big data methodologies has challenged the theory-driven approach to scientific knowledge in favor of a data-driven one. Social media platforms and self-tracking tools change the way we see ourselves and others. The collection of data by corporations and government threatens privacy while promoting transparency. Meanwhile, politicians, policy makers, and ethicists are ill-prepared to deal with big data's ramifications. The contributors look at big data's effect on individuals as it exerts social control throu...

  8. Big universe, big data

    DEFF Research Database (Denmark)

    Kremer, Jan; Stensbo-Smidt, Kristoffer; Gieseke, Fabian Cristian

    2017-01-01

    Modern astronomy requires big data know-how; in particular, it demands highly efficient machine learning and image analysis algorithms. But scalability is not the only challenge: astronomy applications touch several current machine learning research questions, such as learning from biased data and dealing… We highlight some recent methodological advancements in machine learning and image analysis triggered by astronomical applications.

  9. Poker Player Behavior After Big Wins and Big Losses

    OpenAIRE

    Gary Smith; Michael Levere; Robert Kurtzman

    2009-01-01

    We find that experienced poker players typically change their style of play after winning or losing a big pot--most notably, playing less cautiously after a big loss, evidently hoping for lucky cards that will erase their loss. This finding is consistent with Kahneman and Tversky's (Kahneman, D., A. Tversky. 1979. Prospect theory: An analysis of decision under risk. Econometrica 47(2) 263-292) break-even hypothesis and suggests that when investors incur a large loss, it might be time to take ...

  10. Big Data and Chemical Education

    Science.gov (United States)

    Pence, Harry E.; Williams, Antony J.

    2016-01-01

    The amount of computerized information that organizations collect and process is growing so large that the term Big Data is commonly being used to describe the situation. Accordingly, Big Data is defined by a combination of the Volume, Variety, Velocity, and Veracity of the data being processed. Big Data tools are already having an impact in…

  11. Paleoflood Data, Extreme Floods and Frequency: Data and Models for Dam Safety Risk Scenarios

    Science.gov (United States)

    England, J. F.; Godaire, J.; Klinger, R.

    2007-12-01

    Extreme floods and probability estimates are crucial components in dam safety risk analysis and in scenarios for water-resources decision making. The field-based collection of paleoflood data provides needed information on the magnitude and probability of extreme floods at locations of interest in a watershed or region. The stratigraphic record present along streams in the form of terrace and floodplain deposits represents a direct indicator of the magnitude of large floods on a river, and may provide records 10 to 100 times longer than conventional stream gaging records of large floods. Paleoflood data are combined with gage and historical streamflow estimates to gain insights into flood frequency scaling, model extrapolations and uncertainty, and to provide input scenarios to risk analysis event trees. We illustrate current data collection and flood frequency modeling approaches via case studies in the western United States, including the American River in California and the Arkansas River in Colorado. These studies demonstrate the integration of applied field geology, hydraulics, and surface-water hydrology. Results from these studies illustrate the gains in information content on extreme floods, provide data-based means to separate flood generation processes, guide flood frequency model extrapolations, and reduce uncertainties. These data and scenarios strongly influence water resources management decisions.
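The flood frequency extrapolation these datasets inform can be sketched with a simple distribution fit. Below, a method-of-moments Gumbel fit to hypothetical annual peak flows; the studies above use richer models and censored paleoflood likelihoods, so this is only a first-cut illustration:

```python
import math

def gumbel_fit(peaks):
    """Method-of-moments Gumbel (EV1) fit to annual peak flows."""
    n = len(peaks)
    mean = sum(peaks) / n
    var = sum((q - mean) ** 2 for q in peaks) / (n - 1)
    beta = math.sqrt(6.0 * var) / math.pi   # scale parameter
    mu = mean - 0.5772 * beta               # location (Euler-Mascheroni constant)
    return mu, beta

def t_year_flood(mu, beta, T):
    """T-year flood: inverse Gumbel CDF at non-exceedance probability 1 - 1/T."""
    return mu - beta * math.log(-math.log(1.0 - 1.0 / T))

# Hypothetical annual peaks (m^3/s) standing in for a short gage record.
peaks = [100.0, 150.0, 130.0, 170.0, 110.0, 160.0, 140.0, 180.0, 120.0, 190.0]
mu, beta = gumbel_fit(peaks)
q100 = t_year_flood(mu, beta, 100)
```

The value of paleoflood data is precisely that a 10-year record like this one forces a long extrapolation to the 100-year quantile; stratigraphic evidence constrains that tail directly.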

  12. Flooding correlations in narrow channel

    International Nuclear Information System (INIS)

    Kim, S. H.; Baek, W. P.; Chang, S. H.

    1999-01-01

    Heat transfer in narrow gaps is considered an important phenomenon in severe accidents in nuclear power plants, and also in the heat removal of electronic chips. The critical heat flux (CHF) in a narrow gap limits the maximum heat transfer rate in a narrow channel. In the case of a closed-bottom channel, flooding-limited CHF occurrence is observed. Flooding correlations are helpful to predict the CHF in a closed-bottom channel. In the present study, flooding data for narrow channel geometry were collected, and work was performed to characterize the effects of the span, w, and gap size, s. New flooding correlations were suggested for high-aspect-ratio geometry, and a flooding correlation was applied to flooding-limited CHF data
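The paper's own correlations are not given here, but the classic Wallis-type form such correlations generalize can be sketched. The constants m and C below are typical textbook values for tubes, and the fluid properties are illustrative, not the paper's narrow-channel data:

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def wallis_jstar(j, rho, rho_f, rho_g, D):
    """Dimensionless superficial velocity used in Wallis-type flooding correlations."""
    return j * math.sqrt(rho) / math.sqrt(G * D * (rho_f - rho_g))

def flooding_gas_velocity(j_f, rho_f, rho_g, D, m=1.0, C=0.725):
    """Gas superficial velocity (m/s) at flooding onset, from
    sqrt(jg*) + m*sqrt(jf*) = C (valid while C - m*sqrt(jf*) > 0)."""
    jf_star = wallis_jstar(j_f, rho_f, rho_f, rho_g, D)
    jg_star = (C - m * math.sqrt(jf_star)) ** 2
    return jg_star * math.sqrt(G * D * (rho_f - rho_g)) / math.sqrt(rho_g)
```

For an air-water channel the correlation reproduces the expected trend: the allowable gas velocity at flooding onset drops as the downward liquid flow increases, which is what bounds the flooding-limited CHF in a closed-bottom channel.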

  13. Flood Hazard Area

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — The National Flood Hazard Layer (NFHL) data incorporates all Digital Flood Insurance Rate Map(DFIRM) databases published by FEMA, and any Letters Of Map Revision...

  14. Flood Hazard Boundaries

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — The National Flood Hazard Layer (NFHL) data incorporates all Digital Flood Insurance Rate Map(DFIRM) databases published by FEMA, and any Letters Of Map Revision...

  15. Base Flood Elevation

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — The National Flood Hazard Layer (NFHL) data incorporates all Digital Flood Insurance Rate Map(DFIRM) databases published by FEMA, and any Letters Of Map Revision...

  16. Hurricane coastal flood analysis using multispectral spectral images

    Science.gov (United States)

    Ogashawara, I.; Ferreira, C.; Curtarelli, M. P.

    2013-12-01

    for the non-flooded area, the NDWI increased after the hurricane landfall. The average value varied from 0.15 to 0.43 and the median value from 0.13 to 0.43. These results demonstrate that these differences can be exploited for the mapping of flooded areas. As the NDWI was developed to quantify the amount of water in plant leaves, an increase in the value is expected with the amount of water the leaf absorbs. However, in flooded areas the amount of water is so high that the reflectance likely follows the water spectral behavior, absorbing more than reflecting in the near-infrared region. Thus, remote sensing techniques proved to be powerful tools, since they could characterize flooded areas. However, further studies are needed, applying and validating these techniques for other regions and different storms. Optical remote sensing is promising for many applications, since it opens the door to studies of spatial and temporal analysis of flood impacts, mainly in areas with remote access and a lack of in situ data.
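The index computation behind this analysis is a one-line band ratio. A sketch assuming the leaf-water NDWI of Gao, which uses NIR and SWIR bands (the reflectance values and the threshold below are made up for illustration; thresholds are scene-specific):

```python
def ndwi(nir, swir):
    """Gao-style NDWI, sensitive to water content: (NIR - SWIR) / (NIR + SWIR)."""
    if nir + swir == 0:
        return 0.0  # avoid division by zero on empty pixels
    return (nir - swir) / (nir + swir)

def flag_flooded(pixels, threshold=0.4):
    """Per-pixel flood flags: True where NDWI exceeds a scene-specific threshold.

    pixels: iterable of (nir, swir) reflectance pairs.
    """
    return [ndwi(nir, swir) > threshold for nir, swir in pixels]
```

Comparing such flags before and after landfall, as the abstract describes, separates pixels whose index rose modestly (wetter vegetation) from those that crossed into open-water behavior.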

  17. Big data in Finnish financial services

    OpenAIRE

    Laurila, M. (Mikko)

    2017-01-01

    Abstract This thesis aims to explore the concept of big data, and create understanding of big data maturity in the Finnish financial services industry. The research questions of this thesis are “What kind of big data solutions are being implemented in the Finnish financial services sector?” and “Which factors impede faster implementation of big data solutions in the Finnish financial services sector?”. ...

  18. Flood control design requirements and flood evaluation methods of inland nuclear power plant

    International Nuclear Information System (INIS)

    Zhang Ailing; Wang Ping; Zhu Jingxing

    2011-01-01

    The effect of flooding is one of the key safety and environmental factors in inland nuclear power plant siting. To date, laws and standard systems have been established for the selection of nuclear power plant locations and flood control requirements in China. In this paper, the flood control standards of China and other countries are introduced. Several inland nuclear power plants are taken as examples to thoroughly discuss the related flood evaluation methods. Suggestions are also put forward in the paper. (authors)

  19. Big data in fashion industry

    Science.gov (United States)

    Jain, S.; Bruniaux, J.; Zeng, X.; Bruniaux, P.

    2017-10-01

    Significant work has been done in the field of big data in the last decade. The concept of big data involves analysing voluminous data to extract valuable information. In the fashion world, big data is increasingly playing a part in trend forecasting and in analysing consumer behaviour, preferences and emotions. The purpose of this paper is to introduce the term fashion data and explain why it can be considered big data. It also gives a broad classification of the types of fashion data and briefly defines them. The methodology and working of a system that will use this data are also briefly described.

  20. Integrating human behaviour dynamics into flood disaster risk assessment

    Science.gov (United States)

    Aerts, J. C. J. H.; Botzen, W. J.; Clarke, K. C.; Cutter, S. L.; Hall, J. W.; Merz, B.; Michel-Kerjan, E.; Mysiak, J.; Surminski, S.; Kunreuther, H.

    2018-03-01

    The behaviour of individuals, businesses, and government entities before, during, and immediately after a disaster can dramatically affect the impact and recovery time. However, existing risk-assessment methods rarely include this critical factor. In this Perspective, we show why this is a concern, and demonstrate that although initial efforts have inevitably represented human behaviour in limited terms, innovations in flood-risk assessment that integrate societal behaviour and behavioural adaptation dynamics into such quantifications may lead to more accurate characterization of risks and improved assessment of the effectiveness of risk-management strategies and investments. Such multidisciplinary approaches can inform flood-risk management policy development.

  1. Risk Analysis of Reservoir Flood Routing Calculation Based on Inflow Forecast Uncertainty

    Directory of Open Access Journals (Sweden)

    Binquan Li

    2016-10-01

    Possible risks in reservoir flood control and regulation cannot be objectively assessed by deterministic flood forecasts, leaving a probability of reservoir failure. We demonstrate a risk analysis of reservoir flood routing calculation accounting for inflow forecast uncertainty in a sub-basin of the Huaihe River, China. The Xinanjiang model was used to provide deterministic flood forecasts and was combined with the Hydrologic Uncertainty Processor (HUP) to quantify reservoir inflow uncertainty in probability density function (PDF) form. Furthermore, the PDFs of reservoir water level (RWL) and the risk rate of the RWL exceeding a defined safety control level could be obtained. Results suggested that the median forecast (50th percentile) of HUP showed better agreement with observed inflows than the Xinanjiang model did in terms of the performance measures of flood process, peak, and volume. In addition, most observations (77.2%) were bracketed by the uncertainty band of the 90% confidence interval, with some small exceptions for high flows. Results proved that this framework of risk analysis could provide not only the deterministic forecasts of inflow and RWL, but also fundamental uncertainty information (e.g., the 90% confidence band) for the reservoir flood routing calculation.
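The median forecast and 90% band used above can be summarized directly from a predictive ensemble. The HUP itself derives the predictive PDF by Bayesian revision; the sketch below only shows the percentile bookkeeping, on a hypothetical ensemble:

```python
def percentile(sample, p):
    """Linear-interpolation percentile of a finite sample, 0 <= p <= 100."""
    s = sorted(sample)
    k = (len(s) - 1) * p / 100.0
    lo = int(k)
    hi = min(lo + 1, len(s) - 1)
    return s[lo] + (s[hi] - s[lo]) * (k - lo)

def forecast_band(ensemble):
    """Median forecast and 90% interval (5th to 95th percentiles)."""
    return (percentile(ensemble, 50),
            (percentile(ensemble, 5), percentile(ensemble, 95)))

def coverage(observations, bands):
    """Fraction of observations bracketed by their (lo, hi) bands,
    the quantity reported as 77.2% in the abstract."""
    hits = sum(1 for obs, (lo, hi) in zip(observations, bands) if lo <= obs <= hi)
    return hits / len(observations)
```

A well-calibrated 90% band should bracket roughly 90% of observations; under-coverage concentrated at high flows, as reported above, signals that the predictive tails are too thin exactly where flood routing needs them.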

  2. Flood analyses for Department of Energy Y-12, ORNL and K-25 Plants. Flood analyses in support of flood emergency planning

    International Nuclear Information System (INIS)

    1995-05-01

    The study involved defining the flood potential and local rainfall depth and duration data for the Department of Energy's (DOE) Y-12, Oak Ridge National Laboratory (ORNL), and K-25 plants. All three plants are subject to flooding from the Clinch River. In addition, the Y-12 plant is subject to flooding from East Fork Poplar and Bear Creeks, the ORNL plant from Whiteoak Creek and Melton Branch, and the K-25 plant from Poplar Creek. Determination of flood levels included consideration of both rainfall events and postulated failures of Norris and Melton Hill Dams in seismic events

  3. Fault tree analysis for urban flooding

    NARCIS (Netherlands)

    Ten Veldhuis, J.A.E.; Clemens, F.H.L.R.; Van Gelder, P.H.A.J.M.

    2008-01-01

    Traditional methods to evaluate flood risk mostly focus on storm events as the main cause of flooding. Fault tree analysis is a technique that is able to model all potential causes of flooding and to quantify both the overall probability of flooding and the contributions of all causes of flooding to
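For independent basic events, the fault-tree quantification the abstract refers to reduces to simple gate algebra. A sketch with illustrative urban-flooding causes and probabilities (none of these numbers come from the paper):

```python
def or_gate(probs):
    """P(top event) when ANY cause suffices, assuming independent basic events."""
    p_none = 1.0
    for p in probs:
        p_none *= 1.0 - p
    return 1.0 - p_none

def and_gate(probs):
    """P(top event) when ALL causes are required, assuming independence."""
    p_all = 1.0
    for p in probs:
        p_all *= p
    return p_all

# Hypothetical annual probabilities for three flooding causes.
causes = {"storm_exceedance": 0.02, "sewer_blockage": 0.05, "pump_failure": 0.01}
p_flood = or_gate(causes.values())

# Contribution of each cause: drop in P(flooding) if that cause were eliminated.
contrib = {name: p_flood - or_gate(p for n, p in causes.items() if n != name)
           for name in causes}
```

This is exactly the point of the fault-tree view over storm-only methods: the overall probability and the per-cause contributions fall out of the same model.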

  4. Flood-resilient waterfront development in New York City: bridging flood insurance, building codes, and flood zoning.

    Science.gov (United States)

    Aerts, Jeroen C J H; Botzen, W J Wouter

    2011-06-01

    Waterfronts are attractive areas for many, often competing, uses in New York City (NYC) and are seen as multifunctional locations for economic, environmental, and social activities on the interface between land and water. The NYC waterfront plays a crucial role as a first line of flood defense and in managing flood risk and protecting the city from future climate change and sea-level rise. The city of New York has embarked on a climate adaptation program (PlaNYC) outlining the policies needed to anticipate the impacts of climate change. As part of this policy, the Department of City Planning has recently prepared Vision 2020: New York City Comprehensive Waterfront Plan for the over 500 miles of NYC waterfront (NYC-DCP, 2011). An integral part of the vision is to improve resilience to climate change and sea-level rise. This study seeks to provide guidance for advancing the goals of NYC Vision 2020 by assessing how flood insurance, flood zoning, and building code policies can contribute to waterfront development that is more resilient to climate change. © 2011 New York Academy of Sciences.

  5. Forecast-based Integrated Flood Detection System for Emergency Response and Disaster Risk Reduction (Flood-FINDER)

    Science.gov (United States)

    Arcorace, Mauro; Silvestro, Francesco; Rudari, Roberto; Boni, Giorgio; Dell'Oro, Luca; Bjorgo, Einar

    2016-04-01

    Most flood-prone areas in the globe are located in developing countries, where making communities more flood resilient is a priority. Although different flood forecasting initiatives are now available from academia and research centers, what is often missing is the connection between timely hazard detection and the community response to warnings. In order to bridge the gap between science and decision makers, UN agencies play a key role in the dissemination of information in the field and in capacity-building for local governments. In this context, having a reliable global early warning system in the UN would concretely improve existing in-house capacities for humanitarian response and disaster risk reduction. For those reasons, UNITAR-UNOSAT, together with USGS and CIMA Foundation, has developed a Global Flood EWS called "Flood-FINDER". The Flood-FINDER system is a modelling chain which includes meteorological, hydrological and hydraulic models that are accurately linked to enable the production of warnings and forecast inundation scenarios up to three weeks in advance. The system is forced with global satellite-derived precipitation products and Numerical Weather Prediction outputs. The modelling chain is based on the "Continuum" hydrological model and risk assessments produced for GAR2015. In combination with existing hydraulically reconditioned SRTM data and 1D hydraulic models, flood scenarios are derived at multiple scales and resolutions. Climate and flood data are shared through an integrated web GIS platform. A first validation of the modelling chain has been conducted through a flood hindcasting test case over the Chao Phraya river basin in Thailand, using multi-temporal satellite-based analysis derived for the exceptional flood event of 2011. In terms of humanitarian relief operations, the EO-based services of flood mapping in rush mode generally suffer from delays caused by the time required for their activation, programming, acquisitions and

  6. Applying the Flood Vulnerability Index as a Knowledge base for flood risk assessment

    NARCIS (Netherlands)

    Balica, S-F.

    2012-01-01

    Floods are one of the most common and widely distributed natural risks to life and property worldwide. An important part of modern flood risk management is to evaluate vulnerability to floods. This evaluation can be done only by using a parametric approach. Worldwide there is a need to enhance our

  7. Iowa Flood Information System: Towards Integrated Data Management, Analysis and Visualization

    Science.gov (United States)

    Demir, I.; Krajewski, W. F.; Goska, R.; Mantilla, R.; Weber, L. J.; Young, N.

    2012-04-01

    in advance to help minimize flood damage. This presentation provides an overview and live demonstration of the tools and interfaces in the IFIS developed to date to provide a platform for one-stop access to flood-related data, visualizations, flood conditions, and forecasts.

  8. Big data bioinformatics.

    Science.gov (United States)

    Greene, Casey S; Tan, Jie; Ung, Matthew; Moore, Jason H; Cheng, Chao

    2014-12-01

    Recent technological advances allow for high-throughput profiling of biological systems in a cost-efficient manner. The low cost of data generation is leading us into the "big data" era. The availability of big data provides unprecedented opportunities but also raises new challenges for data mining and analysis. In this review, we introduce key concepts in the analysis of big data, including "machine learning" algorithms in both their "unsupervised" and "supervised" forms, with examples of each. We note packages for the R programming language that are available to perform machine learning analyses. In addition to programming-based solutions, we review webservers that allow users with limited or no programming background to perform these analyses on large data compendia. © 2014 Wiley Periodicals, Inc.
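
    As a toy illustration of the supervised/unsupervised distinction the review draws (in Python rather than the R packages it surveys), the sketch below contrasts a nearest-centroid classifier trained on labeled values with a small 1-D k-means clustering of unlabeled ones. The data and function names are invented for illustration.

```python
# Minimal sketch contrasting supervised and unsupervised learning on toy
# 1-D "expression" values; illustrative only, not from the review.

def nearest_centroid_fit(xs, ys):
    """Supervised: learn one centroid per known class label."""
    cents = {}
    for label in set(ys):
        vals = [x for x, y in zip(xs, ys) if y == label]
        cents[label] = sum(vals) / len(vals)
    return cents

def nearest_centroid_predict(cents, x):
    return min(cents, key=lambda label: abs(cents[label] - x))

def kmeans_1d(xs, k=2, iters=10):
    """Unsupervised: group unlabeled values around k means."""
    cents = sorted(xs)[:k]  # naive init: the k smallest values
    for _ in range(iters):
        groups = {i: [] for i in range(k)}
        for x in xs:
            i = min(range(k), key=lambda j: abs(cents[j] - x))
            groups[i].append(x)
        cents = [sum(g) / len(g) if g else cents[i]
                 for i, g in groups.items()]
    return cents

# Supervised: labels are known at training time.
model = nearest_centroid_fit([1.0, 1.2, 5.0, 5.4], ["low", "low", "high", "high"])
print(nearest_centroid_predict(model, 4.8))  # -> high

# Unsupervised: structure is discovered without labels.
print(sorted(kmeans_1d([1.0, 1.2, 5.0, 5.4])))  # two cluster centres
```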

  9. Changing the personality of a face: Perceived Big Two and Big Five personality factors modeled in real photographs.

    Science.gov (United States)

    Walker, Mirella; Vetter, Thomas

    2016-04-01

    General, spontaneous evaluations of strangers based on their faces have been shown to reflect judgments of these persons' intention and ability to harm. These evaluations can be mapped onto a 2D space defined by the dimensions trustworthiness (intention) and dominance (ability). Here we go beyond general evaluations and focus on more specific personality judgments derived from the Big Two and Big Five personality concepts. In particular, we investigate whether Big Two/Big Five personality judgments can be mapped onto the 2D space defined by the dimensions trustworthiness and dominance. Results indicate that judgments of the Big Two personality dimensions almost perfectly map onto the 2D space. In contrast, at least 3 of the Big Five dimensions (i.e., neuroticism, extraversion, and conscientiousness) go beyond the 2D space, indicating that additional dimensions are necessary to describe more specific face-based personality judgments accurately. Building on this evidence, we model the Big Two/Big Five personality dimensions in real facial photographs. Results from 2 validation studies show that the Big Two/Big Five are perceived reliably across different samples of faces and participants. Moreover, results reveal that participants differentiate reliably between the different Big Two/Big Five dimensions. Importantly, this high level of agreement and differentiation in personality judgments from faces likely creates a subjective reality which may have serious consequences for those being perceived. Notably, these consequences ensue because the subjective reality is socially shared, irrespective of the judgments' validity. The methodological approach introduced here might prove useful in various psychological disciplines. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  10. The BigBOSS Experiment

    Energy Technology Data Exchange (ETDEWEB)

    Schlegel, D.; Abdalla, F.; Abraham, T.; Ahn, C.; Allende Prieto, C.; Annis, J.; Aubourg, E.; Azzaro, M.; Bailey, S.; Baltay, C.; Baugh, C.; /APC, Paris /Brookhaven /IRFU, Saclay /Marseille, CPPM /Marseille, CPT /Durham U. / /IEU, Seoul /Fermilab /IAA, Granada /IAC, La Laguna

    2011-01-01

    BigBOSS will obtain observational constraints that will bear on three of the four 'science frontier' questions identified by the Astro2010 Cosmology and Fundamental Physics Panel of the Decadal Survey: Why is the universe accelerating? What is dark matter? What are the properties of neutrinos? Indeed, the BigBOSS project was recommended for substantial immediate R and D support in the PASAG report. The second highest ground-based priority from the Astro2010 Decadal Survey was the creation of a funding line within the NSF to support a 'Mid-Scale Innovations' program, and it used BigBOSS as a 'compelling' example for support. This choice was the result of the Decadal Survey's Program Prioritization panels reviewing 29 mid-scale projects and recommending BigBOSS 'very highly'.

  11. Big game hunting practices, meanings, motivations and constraints: a survey of Oregon big game hunters

    Science.gov (United States)

    Suresh K. Shrestha; Robert C. Burns

    2012-01-01

    We conducted a self-administered mail survey in September 2009 with randomly selected Oregon hunters who had purchased big game hunting licenses/tags for the 2008 hunting season. Survey questions explored hunting practices, the meanings of and motivations for big game hunting, the constraints to big game hunting participation, and the effects of age, years of hunting...

  12. Indirect Damage of Urban Flooding: Investigation of Flood-Induced Traffic Congestion Using Dynamic Modeling

    Directory of Open Access Journals (Sweden)

    Jingxuan Zhu

    2018-05-01

    In many countries, industrialization has led to rapid urbanization. Increased frequency of urban flooding is one consequence of the expansion of urban areas and can seriously affect the productivity and livelihoods of urban residents. It is therefore of vital importance to study the effects of rainfall and urban flooding on traffic congestion and driver behavior. In this study, a comprehensive method to analyze the influence of urban flooding on traffic congestion was developed. First, a flood simulation was conducted to predict the spatiotemporal distribution of flooding, based on the Storm Water Management Model (SWMM) and TELEMAC-2D. Second, an agent-based model (ABM) was used to simulate driver behavior during a period of urban flooding, and a car-following model was established. Finally, to study the mechanisms by which urban flooding affects traffic congestion, the impact of flooding on urban traffic was investigated in a case study of the urban area of Lishui, China, covering an area of 4.4 km2. It was found that for most events, two-hour rainfall has a certain impact on traffic congestion over a five-hour period, with the greatest impact during the hour following the cessation of the rain. Furthermore, the effects of rainfall with 10- and 20-year return periods were found to be similar and small, whereas the effects with a 50-year return period were obvious. Based on a combined analysis of hydrology and transportation, the proposed methods and conclusions could help to reduce traffic congestion during flood seasons, to facilitate early warning and risk management of urban flooding, and to assist users in making informed decisions regarding travel.
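
    The flood-aware car-following idea can be sketched with a toy rule in which each driver's speed is capped by the local water depth. The depth-speed relation, thresholds and parameters below are hypothetical placeholders, not the calibrated model from the study.

```python
# Toy sketch of flood-aware car-following: an agent's speed is the minimum
# of a safe-headway speed, a small margin over the leader's speed, and a
# hypothetical flood-depth cap.

def speed_limit(depth_cm):
    """Hypothetical cap: full speed when dry, road closed above 30 cm."""
    if depth_cm >= 30:
        return 0.0
    return 50.0 * (1.0 - depth_cm / 30.0)  # km/h

def follow_speed(gap_m, leader_speed, depth_cm, t_headway=1.5):
    """Keep a safe time headway; never exceed the flood cap."""
    safe = (gap_m / t_headway) * 3.6          # km/h from gap and headway
    return min(safe, leader_speed + 5.0, speed_limit(depth_cm))

print(follow_speed(gap_m=40, leader_speed=45, depth_cm=0))   # near free flow
print(follow_speed(gap_m=40, leader_speed=45, depth_cm=25))  # flood-limited
```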

  13. Big data for dummies

    CERN Document Server

    Hurwitz, Judith; Halper, Fern; Kaufman, Marcia

    2013-01-01

    Find the right big data solution for your business or organization Big data management is one of the major challenges facing business, industry, and not-for-profit organizations. Data sets such as customer transactions for a mega-retailer, weather patterns monitored by meteorologists, or social network activity can quickly outpace the capacity of traditional data management tools. If you need to develop or manage big data solutions, you'll appreciate how these four experts define, explain, and guide you through this new and often confusing concept. You'll learn what it is, why it m

  14. Modeling urbanized watershed flood response changes with distributed hydrological model: key hydrological processes, parameterization and case studies

    Science.gov (United States)

    Chen, Y.

    2017-12-01

    urbanization, and the results show that urbanization has a big impact on the watershed flood responses. The peak flow increased several times after urbanization, which is much higher than previously reported.

  15. Estimating design flood and HEC-RAS modelling approach for flood analysis in Bojonegoro city

    Science.gov (United States)

    Prastica, R. M. S.; Maitri, C.; Hermawan, A.; Nugroho, P. C.; Sutjiningsih, D.; Anggraheni, E.

    2018-03-01

    Bojonegoro faces flooding every year, yet flood prevention measures remain underdeveloped. The city's development cannot reach its potential because floods cause material losses, affecting every sector in Bojonegoro: education, politics, economy, social life, and infrastructure development. This research aims to analyse whether river capacity is, with high probability, the main factor behind flooding in Bojonegoro. Flood discharges are derived with the Nakayasu synthetic unit hydrograph for return periods of 5, 10, 25, 50, and 100 years and compared with the maximum discharge that the downstream reach of the Bengawan Solo River in Bojonegoro can convey. According to the analysis, the Bengawan Solo River in Bojonegoro cannot convey the flood discharges. HEC-RAS analysis leads to the same conclusion: the flood water level exceeds the bank-full elevation of the river. To conclude, the main factor the government should address to solve the flood problem is river capacity.
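
    The capacity check at the core of the study reduces to a direct comparison between design discharges and the conveyance capacity of the reach. All numbers below are hypothetical placeholders, not the Bengawan Solo values.

```python
# Sketch of a design-flood vs. channel-capacity comparison for several
# return periods; discharges and capacity are illustrative only.

design_q = {5: 1200.0, 10: 1450.0, 25: 1700.0, 50: 1900.0, 100: 2100.0}  # m3/s
channel_capacity = 1500.0  # m3/s, hypothetical bank-full capacity

for t, q in sorted(design_q.items()):
    status = "overflows" if q > channel_capacity else "contained"
    print(f"T = {t:>3} yr: Q = {q:7.1f} m3/s -> {status}")
```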

  16. Quantifying riverine and storm-surge flood risk by single-family residence: application to Texas.

    Science.gov (United States)

    Czajkowski, Jeffrey; Kunreuther, Howard; Michel-Kerjan, Erwann

    2013-12-01

    The development of catastrophe models in recent years allows for assessment of the flood hazard much more effectively than when the federally run National Flood Insurance Program (NFIP) was created in 1968. We propose and then demonstrate a methodological approach to determine pure premiums based on the entire distribution of possible flood events. We apply hazard, exposure, and vulnerability analyses to a sample of 300,000 single-family residences in two counties in Texas (Travis and Galveston) using state-of-the-art flood catastrophe models. Even in zones of similar flood risk classification by FEMA there is substantial variation in exposure between coastal and inland flood risk. For instance, homes in the designated moderate-risk X500/B zones in Galveston are exposed to a flood risk on average 2.5 times greater than residences in X500/B zones in Travis. The results also show very similar average annual loss (corrected for exposure) for a number of residences despite their being in different FEMA flood zones. We also find significant storm-surge exposure outside of the FEMA designated storm-surge risk zones. Taken together these findings highlight the importance of a microanalysis of flood exposure. The process of aggregating risk at a flood zone level, as currently undertaken by FEMA, provides a false sense of uniformity. As our analysis indicates, the technology to delineate the flood risks exists today. © 2013 Society for Risk Analysis.
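
    A pure premium derived from the entire event distribution reduces, in its simplest form, to the average annual loss: the occurrence-rate-weighted sum of event losses. The event set below is a made-up illustration, not output of the catastrophe models used in the paper.

```python
# Minimal sketch of a pure premium as average annual loss (AAL), summed
# over the full (here: hypothetical) distribution of flood events.

events = [
    # (annual occurrence rate, loss to this residence in $)
    (0.10,   2_000.0),   # frequent, shallow flooding
    (0.02,  25_000.0),   # 50-year event
    (0.002, 150_000.0),  # 500-year event
]

aal = sum(rate * loss for rate, loss in events)
print(f"pure premium (AAL): ${aal:,.2f} per year")  # -> $1,000.00 per year
```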

  17. BIG GEO DATA MANAGEMENT: AN EXPLORATION WITH SOCIAL MEDIA AND TELECOMMUNICATIONS OPEN DATA

    Directory of Open Access Journals (Sweden)

    C. Arias Munoz

    2016-06-01

    The term Big Data has recently been used to define big, highly varied, complex data sets, which are created and updated at high speed and require faster processing, namely a reduced time to filter and analyse relevant data. These data are also increasingly becoming Open Data (data that can be freely distributed) made public by governments, agencies, private enterprises and others. There are at least two issues that can obstruct the availability and use of Open Big Datasets: firstly, the gathering and geoprocessing of these datasets is very computationally intensive; hence, it is necessary to integrate high-performance solutions, preferably internet-based, to achieve the goals. Secondly, the problems of heterogeneity and inconsistency in geospatial data are well known and affect the data integration process, but they are particularly problematic for Big Geo Data. Therefore, Big Geo Data integration will be one of the most challenging issues to solve. With these applications, we demonstrate that it is possible to provide processed Big Geo Data to common users, using open geospatial standards and technologies. NoSQL databases like MongoDB and frameworks like RASDAMAN offer different functionalities that facilitate working with larger volumes and more heterogeneous geospatial data sources.

  18. Big Geo Data Management: An Exploration with Social Media and Telecommunications Open Data

    Science.gov (United States)

    Arias Munoz, C.; Brovelli, M. A.; Corti, S.; Zamboni, G.

    2016-06-01

    The term Big Data has recently been used to define big, highly varied, complex data sets, which are created and updated at high speed and require faster processing, namely a reduced time to filter and analyse relevant data. These data are also increasingly becoming Open Data (data that can be freely distributed) made public by governments, agencies, private enterprises and others. There are at least two issues that can obstruct the availability and use of Open Big Datasets: firstly, the gathering and geoprocessing of these datasets is very computationally intensive; hence, it is necessary to integrate high-performance solutions, preferably internet-based, to achieve the goals. Secondly, the problems of heterogeneity and inconsistency in geospatial data are well known and affect the data integration process, but they are particularly problematic for Big Geo Data. Therefore, Big Geo Data integration will be one of the most challenging issues to solve. With these applications, we demonstrate that it is possible to provide processed Big Geo Data to common users, using open geospatial standards and technologies. NoSQL databases like MongoDB and frameworks like RASDAMAN offer different functionalities that facilitate working with larger volumes and more heterogeneous geospatial data sources.

  19. Flood Water Segmentation from Crowdsourced Images

    Science.gov (United States)

    Nguyen, J. K.; Minsker, B. S.

    2017-12-01

    In the United States, 176 people were killed by flooding in 2015. Along with the loss of human lives is the economic cost, estimated at $4.5 billion per flood event. Urban flooding has become a recent concern due to increases in population, urbanization, and global warming. As more and more people move into towns and cities with infrastructure incapable of coping with floods, there is a need for more scalable solutions for urban flood management. The proliferation of camera-equipped mobile devices has led to a new source of information for flood research. In-situ photographs captured by people provide information at the local level that remotely sensed images fail to capture. Applying crowdsourced images to flood research requires understanding the content of the image without the need for user input. This paper addresses the problem of how to automatically segment flooded and non-flooded regions in crowdsourced images. Previous works require two images taken at similar angle and perspective of the location when it is flooded and when it is not. We examine three different algorithms from the computer vision literature that are able to perform segmentation using a single flood image without these assumptions. The performance of each algorithm is evaluated on a collection of labeled crowdsourced flood images. We show that it is possible to achieve a segmentation accuracy of 80% using just a single image.
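
    The evaluation step, scoring a predicted flood mask against a labeled one, can be sketched as per-pixel agreement. The tiny masks below are invented for illustration; the paper's 80% figure comes from its own labeled image collection.

```python
# Sketch of per-pixel segmentation accuracy between a predicted flood mask
# and a ground-truth mask (1 = flooded pixel, 0 = not flooded).

def pixel_accuracy(pred, truth):
    total = correct = 0
    for prow, trow in zip(pred, truth):
        for p, t in zip(prow, trow):
            total += 1
            correct += (p == t)
    return correct / total

truth = [[1, 1, 0],
         [1, 0, 0]]
pred  = [[1, 1, 0],
         [0, 0, 1]]

print(pixel_accuracy(pred, truth))  # 4 of 6 pixels agree
```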

  20. Numerical Analysis of Flood modeling of upper Citarum River under Extreme Flood Condition

    Science.gov (United States)

    Siregar, R. I.

    2018-02-01

    This paper focuses on numerical methods and computation for analysing flood parameters. Water level and flood discharge are the flood parameters solved by the numerical approach. The numerical methods applied in this paper for unsteady flow conditions have strengths and weaknesses; among the strengths, they are easily applied to cases with irregular flow boundaries. The study area is the upper Citarum watershed, Bandung, West Java. This paper uses a computational approach with the Force2 program and HEC-RAS to solve the flow problem in the upper Citarum River and to investigate and forecast extreme flood conditions. The numerical analysis is based on extreme flood events that have occurred in the upper Citarum watershed. The modelled water levels and extreme flood discharges are compared with measurement data for validation. The area inundated by the 2010 flood is about 75.26 square kilometres. Comparing the two methods shows that the FEM analysis with the Force2 program agrees best with the validation data, with a Nash index of 0.84 for water level against 0.76 for HEC-RAS. For discharge, the Nash index is 0.80 with Force2 and 0.79 with HEC-RAS.
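
    The Nash index used for validation is commonly the Nash-Sutcliffe efficiency, which can be computed directly from observed and simulated series; values near 1 indicate a good fit. The series below are invented, not the Citarum data.

```python
# Nash-Sutcliffe efficiency: 1 - (sum of squared errors) /
# (sum of squared deviations of observations from their mean).

def nash_sutcliffe(observed, simulated):
    mean_obs = sum(observed) / len(observed)
    num = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    den = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - num / den

obs = [2.0, 4.0, 8.0, 6.0, 3.0]  # illustrative observed stages (m)
sim = [2.2, 3.7, 7.5, 6.4, 3.1]  # illustrative simulated stages (m)
print(round(nash_sutcliffe(obs, sim), 3))
```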

  1. Flood Hazards - A National Threat

    Science.gov (United States)

    ,

    2006-01-01

    In the late summer of 2005, the remarkable flooding brought by Hurricane Katrina, which caused more than $200 billion in losses, constituted the costliest natural disaster in U.S. history. However, even in typical years, flooding causes billions of dollars in damage and threatens lives and property in every State. Natural processes, such as hurricanes, weather systems, and snowmelt, can cause floods. Failure of levees and dams and inadequate drainage in urban areas can also result in flooding. On average, floods kill about 140 people each year and cause $6 billion in property damage. Although loss of life to floods during the past half-century has declined, mostly because of improved warning systems, economic losses have continued to rise due to increased urbanization and coastal development.

  2. A satellite and model based flood inundation climatology of Australia

    Science.gov (United States)

    Schumann, G.; Andreadis, K.; Castillo, C. J.

    2013-12-01

    To date there is no coherent and consistent database on observed or simulated flood event inundation and magnitude at large scales (continental to global). The only compiled data set showing a consistent history of flood inundation area and extent at a near global scale is provided by the MODIS-based Dartmouth Flood Observatory. However, MODIS satellite imagery is only available from 2000 and is hampered by a number of issues associated with flood mapping using optical images (e.g. classification algorithms, cloud cover, vegetation). Here, we present for the first time a proof-of-concept study in which we employ a computationally efficient 2-D hydrodynamic model (LISFLOOD-FP) complemented with a sub-grid channel formulation to generate a complete flood inundation climatology of the past 40 years (1973-2012) for the entire Australian continent. The model was built completely from freely available SRTM-derived data, including channel widths, bank heights and floodplain topography, which was corrected for vegetation canopy height using a global ICESat canopy dataset. Channel hydraulics were resolved using actual channel data and bathymetry was estimated within the model using hydraulic geometry. On the floodplain, the model simulated the flow paths and inundation variables at a 1 km resolution. The developed model was run over a period of 40 years and a floodplain inundation climatology was generated and compared to satellite flood event observations. Our proof-of-concept study demonstrates that this type of model can reliably simulate past flood events with reasonable accuracies both in time and space. The Australian model was forced with both observed flow climatology and VIC-simulated flows in order to assess the feasibility of a model-based flood inundation climatology at the global scale.
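
    The hydraulic-geometry step mentioned above can be sketched with the standard power laws relating channel width and depth to discharge (w = a*Q^b, d = c*Q^f). The coefficients and the discharge below are illustrative placeholders; the study fits such relations from actual channel data.

```python
# Sketch of hydraulic-geometry estimates of channel dimensions from
# bank-full discharge; coefficients are hypothetical, not fitted values.

def hydraulic_geometry(q, a=7.2, b=0.50, c=0.27, f=0.39):
    """Return (width_m, depth_m) for bank-full discharge q in m3/s."""
    return a * q ** b, c * q ** f

w, d = hydraulic_geometry(100.0)
print(f"Q = 100 m3/s -> width ~ {w:.0f} m, depth ~ {d:.1f} m")
```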

  3. Probabilistic Flood Defence Assessment Tools

    Directory of Open Access Journals (Sweden)

    Slomp Robert

    2016-01-01

    The WTI2017 project is responsible for the development of flood defence assessment tools for the 3600 km of Dutch primary flood defences: dikes/levees, dunes and hydraulic structures. These tools are necessary because, as of January 1st 2017, the new flood risk management policy for the Netherlands will be implemented. Then, the seven-decades-old design practice (the maximum water level methodology of 1958) and the two-decades-old safety standards (and maximum hydraulic load methodology of 1996) will formally be replaced by a more risk-based approach to the national policy in flood risk management. The formal flood defence assessment is an important part of this new policy, especially for flood defence managers, since national and regional funding for reinforcement is based on this assessment. This new flood defence policy is based on a maximum allowable probability of flooding. For this, a maximum acceptable individual risk was set at 1/100 000 per year: the probability of loss of life for an individual in any protected area in the Netherlands. Safety standards for flood defences were then determined based on this acceptable individual risk. The results were adjusted based on information from cost-benefit analysis, societal risk and large-scale societal disruption due to the failure of critical infrastructure, e.g. power stations. The resulting risk-based flood defence safety standards range from a 300-year to a 100 000-year return period for failure. Two policy studies, WV21 (Safety from floods in the 21st century) and VNK-2 (the National Flood Risk analysis of 2010), provided the essential information to determine the new risk-based safety standards for flood defences. The WTI2017 project will provide the safety assessment tools based on these new standards and is thus an essential element in the implementation of this policy change. A major issue to be tackled was the development of user-friendly tools, as the new assessment is to be carried out by personnel of the
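
    The risk arithmetic behind these standards can be sketched in two lines: an acceptable annual failure probability maps directly to a return period, and it compounds over a planning horizon. A minimal sketch, assuming statistically independent years.

```python
# Return period from annual exceedance probability, and the probability of
# at least one failure within a planning horizon (independent years).

def return_period(annual_prob):
    return 1.0 / annual_prob

def prob_in_horizon(annual_prob, years):
    return 1.0 - (1.0 - annual_prob) ** years

print(f"{return_period(1 / 100000):.0f} years")      # 1/100 000 per year
print(round(prob_in_horizon(1 / 300, 50), 3))        # 1/300 standard, 50 yr
```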

  4. Declining vulnerability to river floods and the global benefits of adaptation.

    Science.gov (United States)

    Jongman, Brenden; Winsemius, Hessel C; Aerts, Jeroen C J H; Coughlan de Perez, Erin; van Aalst, Maarten K; Kron, Wolfgang; Ward, Philip J

    2015-05-05

    The global impacts of river floods are substantial and rising. Effective adaptation to the increasing risks requires an in-depth understanding of the physical and socioeconomic drivers of risk. Whereas the modeling of flood hazard and exposure has improved greatly, compelling evidence on spatiotemporal patterns in vulnerability of societies around the world is still lacking. Due to this knowledge gap, the effects of vulnerability on global flood risk are not fully understood, and future projections of fatalities and losses available today are based on simplistic assumptions or do not include vulnerability. We show for the first time (to our knowledge) that trends and fluctuations in vulnerability to river floods around the world can be estimated by dynamic high-resolution modeling of flood hazard and exposure. We find that rising per-capita income coincided with a global decline in vulnerability between 1980 and 2010, which is reflected in decreasing mortality and losses as a share of the people and gross domestic product exposed to inundation. The results also demonstrate that vulnerability levels in low- and high-income countries have been converging, due to a relatively strong trend of vulnerability reduction in developing countries. Finally, we present projections of flood losses and fatalities under 100 individual scenario and model combinations, and three possible global vulnerability scenarios. The projections emphasize that materialized flood risk largely results from human behavior and that future risk increases can be largely contained using effective disaster risk reduction strategies.

  5. Flash Flood Detection in Urban Cities Using Ultrasonic and Infrared Sensors

    KAUST Repository

    Mousa, Mustafa; Zhang, Xiangliang; Claudel, Christian

    2016-01-01

    Floods are the most common type of natural disaster, often leading to losses of lives and property in the thousands yearly. Among these events, urban flash floods are particularly deadly because of the short timescales on which they occur and because of the population density of cities. Since most flood casualties are caused by a lack of information on the impending flood (type, location, severity), sensing these events is critical to generating accurate and detailed warnings and short-term forecasts. However, no dedicated flash flood sensing systems that could monitor the propagation of flash floods in real time currently exist in cities. In the present paper, firstly, a new sensing device that can simultaneously monitor urban flash floods and traffic congestion is presented. This sensing device is based on the combination of ultrasonic range-finding with remote temperature sensing, and can sense both phenomena with a high degree of accuracy, using a combination of L1-regularized reconstruction and artificial neural networks to process measurement data. Secondly, the corresponding algorithms have been implemented on a low-power wireless sensor platform, and their performance in water level estimation in a six-month test involving four different sensors is illustrated. The results demonstrate that urban water levels can be reliably estimated with errors of less than 2 cm, and that the preprocessing and machine learning schemes can run in real time on currently available wireless sensor platforms.

  6. Flash Flood Detection in Urban Cities Using Ultrasonic and Infrared Sensors

    KAUST Repository

    Mousa, Mustafa

    2016-07-19

    Floods are the most common type of natural disaster, often leading to losses of lives and property in the thousands yearly. Among these events, urban flash floods are particularly deadly because of the short timescales on which they occur and because of the population density of cities. Since most flood casualties are caused by a lack of information on the impending flood (type, location, severity), sensing these events is critical to generating accurate and detailed warnings and short-term forecasts. However, no dedicated flash flood sensing systems that could monitor the propagation of flash floods in real time currently exist in cities. In the present paper, firstly, a new sensing device that can simultaneously monitor urban flash floods and traffic congestion is presented. This sensing device is based on the combination of ultrasonic range-finding with remote temperature sensing, and can sense both phenomena with a high degree of accuracy, using a combination of L1-regularized reconstruction and artificial neural networks to process measurement data. Secondly, the corresponding algorithms have been implemented on a low-power wireless sensor platform, and their performance in water level estimation in a six-month test involving four different sensors is illustrated. The results demonstrate that urban water levels can be reliably estimated with errors of less than 2 cm, and that the preprocessing and machine learning schemes can run in real time on currently available wireless sensor platforms.
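
    The basic sensing geometry can be sketched as follows: a range finder mounted above the street measures the distance to the surface below, and the water level is the mounting height minus a noise-filtered distance. The mounting height and readings are hypothetical, and the median filter here merely stands in for the paper's (more involved) L1-regularized reconstruction.

```python
# Sketch of ultrasonic water-level estimation: level = mounting height
# minus a median-filtered distance reading (median rejects spikes from
# passing vehicles). All numbers are hypothetical.

from statistics import median

MOUNT_HEIGHT_CM = 250.0  # sensor height above the dry road surface

def water_level(distance_readings_cm):
    return MOUNT_HEIGHT_CM - median(distance_readings_cm)

readings = [238.0, 237.5, 120.0, 238.5, 237.0]  # one vehicle-echo outlier
print(water_level(readings))  # -> 12.5 cm of standing water
```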

  7. Numerical simulation of flood barriers

    Science.gov (United States)

    Srb, Pavel; Petrů, Michal; Kulhavý, Petr

    This paper deals with the testing and numerical simulation of flood barriers. The Czech Republic has been hit by several very devastating floods in past years. These floods caused dozens of casualties, and property damage reached billions of euros. The development of flood control measures is very important, especially to reduce the number of casualties and the amount of property damage. The aim of flood control measures is the detention of water outside populated areas and the drainage of water from populated areas as soon as possible. For a new flood barrier design it is very important to know its behaviour in a real flood. During the development of the barrier, several standardized tests have to be carried out. Based on the results of these tests, a numerical simulation was built using Abaqus software and several analyses were carried out. Based on these numerical simulations it will be possible to predict the behaviour of barriers and thus improve their design.

  8. Use of a dam break model to assess flooding at Haddam Neck Nuclear Power Plant

    International Nuclear Information System (INIS)

    Scherrer, J.S.; Chery, D.L. Jr.

    1984-01-01

    Because of their proximity to necessary supplies of cooling water, nuclear power plants are susceptible to riverine flooding. Greater flood hazards exist where plants are located downstream of large dams. The consequences of a Quabbin Reservoir dam failure for the Haddam Neck Nuclear Power Plant, situated on the Connecticut River, were investigated using a dam-break flood routing model. Reasons for selecting a particular model are presented, and the input assumptions for the modeling process are developed. Relevant information concerning the level of manpower involvement is presented. The findings of this analysis demonstrate that the plant is adequately protected from the consequences of the postulated flood event.

  9. Exploring complex and big data

    Directory of Open Access Journals (Sweden)

    Stefanowski Jerzy

    2017-12-01

    This paper shows how big data analysis opens up a range of research and technological problems and calls for new approaches. We start by defining the essential properties of big data and discussing the main types of data involved. We then survey dedicated solutions for storing and processing big data, including a data lake, virtual integration, and a polystore architecture. Difficulties in managing data quality and provenance are also highlighted. The characteristics of big data also imply specific requirements and challenges for data mining algorithms, which we address as well. Links with related areas, including data streams and deep learning, are discussed. The common theme that naturally emerges from this characterization is complexity. All in all, we consider complexity to be the truly defining feature of big data (posing particular research and technological challenges), which ultimately seems to be of greater importance than the sheer data volume.

  10. Measuring flood footprint of a regional economy - A case study for the UK flooding

    Science.gov (United States)

    Guan, D.

    2013-12-01

    Analysis of the urban economy and society is central to understanding the broad impacts of flooding and to identifying cost-effective adaptation and mitigation measures. Assessments of the flooding impacts on cities have traditionally focused on the initial impact on people and assets. These initial estimates (so-called 'direct damage') are useful both in understanding the immediate implications of damage, and in marshalling the pools of capital and supplies required for re-building after an event. Since different economies and societies are coupled, especially under the current economic crisis, any small-scale damage may be multiplied and cascaded throughout wider economic systems and social networks. This direct and indirect damage is currently not evaluated well; it can be captured by quantifying what we call the flood footprint. Flooding in one location can impact the whole UK economy. Neglecting these knock-on costs (i.e. the true footprint of the flood) means we might be ignoring the economic benefits and beneficiaries of flood risk management interventions. In 2007, for example, floods cost the economy about £3.2 bn directly, but the wider effect might add another 50% to 250% to that. The flood footprint is a measure of the exclusive total socioeconomic impact that is directly and indirectly caused by a flood event to the flooded region and wider economic systems and social networks. We take the 2012 UK flooding as a case study. An input-output basic dynamic inequalities (BDI) model is used to assess the impact of the flooding on the Yorkshire economy, accounting for interactions between industries through demand and supply of intermediate consumption goods in a circular flow. After the disaster the economy is unbalanced. The recovery process finishes when the economy is completely balanced, i.e., when labour production capacity equals demand and production and all the variables reach pre-disaster levels. The analysis is carried out
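
    The input-output core of a footprint calculation can be sketched with a two-sector Leontief system: total output x satisfies x = A x + d, so a direct demand shock in one sector propagates indirect losses to others. This is a generic Leontief sketch with invented numbers, not the BDI model or Yorkshire data from the study.

```python
# Two-sector Leontief input-output sketch: solve x = A x + d by fixed-point
# iteration (converges because A's row sums are below 1 here).

def leontief_output(a, d, iters=200):
    x = d[:]
    for _ in range(iters):
        x = [d[i] + sum(a[i][j] * x[j] for j in range(len(d)))
             for i in range(len(d))]
    return x

A = [[0.2, 0.3],
     [0.1, 0.4]]        # inter-industry requirements per unit of output
demand = [100.0, 50.0]  # final demand before the flood

x = leontief_output(A, demand)
print([round(v, 1) for v in x])

# A flood that cuts sector 1's final demand by 20 units reduces total
# output by more than 20: the excess is the indirect (knock-on) loss.
x_flood = leontief_output(A, [80.0, 50.0])
print(round(x[0] - x_flood[0], 1), round(x[1] - x_flood[1], 1))
```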

  11. Was there a big bang?

    International Nuclear Information System (INIS)

    Narlikar, J.

    1981-01-01

    In discussing the viability of the big-bang model of the Universe, relevant evidence is examined, including the discrepancies in the age of the big-bang Universe, the red shifts of quasars, the microwave background radiation, aspects of the general theory of relativity such as the change of the gravitational constant with time, and quantum theory considerations. It is argued that the big-bang picture is not as soundly established, either theoretically or observationally, as is usually claimed, that the cosmological problem is still wide open, and that alternatives to the standard big-bang picture should be seriously investigated. (U.K.)

  12. Geochemistry and flooding as determining factors of plant species composition in Dutch winter-flooded riverine grasslands

    NARCIS (Netherlands)

    Beumer, V.; Wirdum, G. van; Beltman, B.; Griffioen, J.; Grootjans, A.P.; Verhoeven, J.T.A.

    2008-01-01

    Dutch water policy aims for more frequent, controlled flooding of river valley floodplains to avoid unwanted flooding elsewhere, in anticipation of increased flooding risks resulting from climate change. Controlled flooding usually takes place in winter in parts of the valleys which had not been

  13. Promoting adaptive flood risk management: the role and potential of flood recovery mechanisms

    Directory of Open Access Journals (Sweden)

    Priest Sally J

    2016-01-01

    Full Text Available There is a high potential for recovery mechanisms to be used to incentivise the uptake of flood mitigation and loss reduction measures, undertake adaptation and promote community resilience. Indeed, creating a resilient response to flooding requires flood risk management approaches to be aligned, and it must be ensured that recovery mechanisms do not provide disincentives for individuals and businesses to take proactive action to reduce risk. However, the degree to which it is desirable and effective for insurers and governments providing compensation to promote resilience and risk reduction depends upon how the cover or compensation is organised and the premiums which are charged. A review of international flood recovery mechanisms has been undertaken to identify, firstly, the types of schemes that exist and their characteristics. Analysis of existing instruments highlights various potential approaches to encourage or require the uptake of flood mitigation and to discourage new development in areas of high flood risk. However, despite the presence of these instruments, those organising recovery mechanisms could be doing much more to incentivise increased resilience.

  14. BIG DATA-DRIVEN MARKETING: AN ABSTRACT

    OpenAIRE

    Suoniemi, Samppa; Meyer-Waarden, Lars; Munzel, Andreas

    2017-01-01

    Customer information plays a key role in managing successful relationships with valuable customers. Big data customer analytics use (BD use), i.e., the extent to which customer information derived from big data analytics guides marketing decisions, helps firms better meet customer needs for competitive advantage. This study addresses three research questions: What are the key antecedents of big data customer analytics use? How, and to what extent, does big data customer an...

  15. Flood rich periods, flood poor periods and the need to look beyond instrumental records

    Science.gov (United States)

    Lane, S. N.

    2009-04-01

    For many, the later 20th Century and early 21st Century have become synonymous with a growing experience of flood risk. Scientists, politicians and the media have ascribed this to changing climate, and there are good hypothetical reasons for human-induced climate change to be impacting upon the magnitude and frequency of extreme weather events. In this paper, I interrogate this claim more carefully, using the UK's instrumental records of river flow, most of which begin after 1960, but a smaller number of which extend back into the 19th Century. Those records that extend back to the 19th Century suggest that major flood events tend to cluster into periods that are relatively flood rich and relatively flood poor, most notably in larger drainage basins: i.e. there is a clear scale issue. The timing (onset, duration, termination) of these periods varies systematically by region, although there is a marked flood-poor period for much of the UK during the late 1960s, 1970s and 1980s. It follows that at least some of the current experience of flooding, including why it has taken so many policy-makers and flood victims by surprise, may reflect a transition from a flood-poor to a flood-rich period, exacerbated by possible climate change impacts. These results point to the need to rethink what drives flood risk. First, they point to the need to look at some of the fundamental oscillations in core atmospheric drivers, such as the North Atlantic Multidecadal Oscillation, in explaining what drives flood risk. Consideration of precipitation, as opposed to river flow, is more advanced in this respect, and those of us working in rivers need to engage much more thoughtfully with atmospheric scientists. Second, they point to the severe inadequacies of using records of only a few decades' duration. Even where these are pooled across adjacent sub-catchments, there is likely to be a severe bias in the estimation of flood return periods when we look at instrumental

  16. A free and open source QGIS plugin for flood risk analysis: FloodRisk

    Science.gov (United States)

    Albano, Raffaele; Sole, Aurelia; Mancusi, Leonardo

    2016-04-01

    An analysis of global statistics shows a substantial increase in flood damage over the past few decades. Moreover, it is expected that flood risk will continue to rise due to the combined effect of increasing numbers of people and economic assets in risk-prone areas and the effects of climate change. In order to increase the resilience of European economies and societies, improvements to risk assessment and management have been pursued in recent years. This has resulted in a wide range of flood analysis models of differing complexity, with substantial differences in the underlying components needed for their implementation, as geographical, hydrological and social differences demand specific approaches in different countries. At present, there is an emerging need to promote the creation of open, transparent, reliable and extensible tools for comprehensive, context-specific and applicable flood risk analysis. In this context, the free and open-source Quantum GIS (QGIS) plugin "FloodRisk" is a good starting point to address this objective. The vision of the developers of this free and open source software (FOSS) is to combine the main features of state-of-the-art science, collaboration, transparency and interoperability in an initiative to assess and communicate flood risk worldwide and to assist authorities in facilitating the quality and fairness of flood risk management at multiple scales. Among the scientific community, this type of activity can be labelled as "participatory research", intended as adopting a set of techniques that "are interactive and collaborative" and reproducible, "providing a meaningful research experience that both promotes learning and generates knowledge and research data through a process of guided discovery" (Albano et al., 2015). Moreover, this FOSS geospatial approach can lower the financial barriers to understanding risks at national and sub-national levels across a spatio-temporal domain and can provide better and more complete

  17. On the Use of Global Flood Forecasts and Satellite-Derived Inundation Maps for Flood Monitoring in Data-Sparse Regions

    Directory of Open Access Journals (Sweden)

    Beatriz Revilla-Romero

    2015-11-01

    Full Text Available Early flood warning and real-time monitoring systems play a key role in flood risk reduction and disaster response decisions. Global-scale flood forecasting and satellite-based flood detection systems are currently operating; however, their reliability for decision-making applications needs to be assessed. In this study, we performed comparative evaluations of several operational global flood forecasting and flood detection systems, using 10 major flood events recorded over 2012–2014. Specifically, we evaluated the spatial extent and temporal characteristics of flood detections from the Global Flood Detection System (GFDS) and the Global Flood Awareness System (GloFAS). Furthermore, we compared the GFDS flood maps with those from NASA’s two Moderate Resolution Imaging Spectroradiometer (MODIS) sensors. Results reveal that: (1) general agreement was found between the GFDS and MODIS flood detection systems; (2) large differences exist in the spatio-temporal characteristics of the GFDS detections and GloFAS forecasts; and (3) the quantitative validation of global flood disasters in data-sparse regions is highly challenging. Overall, satellite remote sensing provides near real-time flood information that can be useful for risk management. We highlight the known limitations of global flood detection and forecasting systems, and propose ways forward to improve the reliability of large-scale flood monitoring tools.

  18. Big Data Analytics in Medicine and Healthcare.

    Science.gov (United States)

    Ristevski, Blagoj; Chen, Ming

    2018-05-10

    This paper surveys big data, highlighting big data analytics in medicine and healthcare. The big data characteristics value, volume, velocity, variety, veracity and variability are described. Big data analytics in medicine and healthcare covers the integration and analysis of large amounts of complex heterogeneous data, such as various omics data (genomics, epigenomics, transcriptomics, proteomics, metabolomics, interactomics, pharmacogenomics, diseasomics), biomedical data and electronic health records data. We underline the challenging issues of big data privacy and security. Regarding big data characteristics, some directions on the use of suitable and promising open-source distributed data processing software platforms are given.

  19. The trashing of Big Green

    International Nuclear Information System (INIS)

    Felten, E.

    1990-01-01

    The Big Green initiative on California's ballot lost by a margin of 2-to-1. Green measures lost in five other states, shocking ecology-minded groups. According to the postmortem by environmentalists, Big Green was a victim of poor timing and big spending by the opposition. Now its supporters plan to break up the bill and try to pass some provisions in the Legislature

  20. Coupling Modelling of Urban Development and Flood Risk – An Attempt for a Combined Software Framework

    DEFF Research Database (Denmark)

    Löwe, Roland; Sto Domingo, Nina; Urich, Christian

    2015-01-01

    We have developed a setup that couples the urban development model DANCE4WATER with the 1D-2D hydraulic model MIKE FLOOD. The setup makes it possible to assess the impact of urban development and infrastructural change scenarios on flood risk in an automated manner. In addition, it permits us to use the results of the hydraulic simulation to condition DANCE4WATER and to account for flood risk in the simulated urban development. In an Australian case study, we demonstrate that future flood risk can be significantly reduced while maintaining the overall speed of urban development.

  1. Reconstruction of the 1945 Wieringermeer Flood

    Science.gov (United States)

    Hoes, O. A. C.; Hut, R. W.; van de Giesen, N. C.; Boomgaard, M.

    2013-03-01

    The present state-of-the-art in flood risk assessment focuses on breach models, flood propagation models, and economic modelling of flood damage. However, models need to be validated with real data to avoid erroneous conclusions. Such reference data can either be historic data or can be obtained from controlled experiments. The inundation of the Wieringermeer polder in the Netherlands in April 1945 is one of the few examples for which sufficient historical information is available. The objective of this article is to compare a flood simulation with flood data from 1945. The context, the breach growth process and the flood propagation are explained. Key findings for current flood risk management address the importance of the drainage canal network during the inundation of a polder and the uncertainty that follows from not knowing the breach growth parameters. This case study shows that historical floods provide valuable data for the validation of models and reveal lessons that are applicable in present-day flood risk management.

  2. The Aqueduct Global Flood Analyzer

    Science.gov (United States)

    Iceland, Charles

    2015-04-01

    As population growth and economic growth take place, and as climate change accelerates, many regions across the globe are finding themselves increasingly vulnerable to flooding. A recent OECD study of the exposure of the world's large port cities to coastal flooding found that 40 million people were exposed to a 1 in 100 year coastal flood event in 2005, and the total value of exposed assets was about US$3,000 billion, or 5% of global GDP. By the 2070s, those numbers were estimated to increase to 150 million people and US$35,000 billion, or roughly 9% of projected global GDP. Impoverished people in developing countries are particularly at risk because they often live in flood-prone areas and lack the resources to respond. WRI and its Dutch partners - Deltares, IVM-VU University Amsterdam, Utrecht University, and PBL Netherlands Environmental Assessment Agency - are in the initial stages of developing a robust set of river flood and coastal storm surge risk measures that show the extent of flooding under a variety of scenarios (both current and future), together with the projected human and economic impacts of these flood scenarios. These flood risk data and information will be accessible via an online, easy-to-use Aqueduct Global Flood Analyzer. We will also investigate the viability, benefits, and costs of a wide array of flood risk reduction measures that could be implemented in a variety of geographic and socio-economic settings. Together, the activities we propose have the potential for saving hundreds of thousands of lives and strengthening the resiliency and security of many millions more, especially those who are most vulnerable. Mr. Iceland will present Version 1.0 of the Aqueduct Global Flood Analyzer and provide a preview of additional elements of the Analyzer to be released in the coming years.

  3. The Big Bang Singularity

    Science.gov (United States)

    Ling, Eric

    The big bang theory is a model of the universe which makes the striking prediction that the universe began a finite amount of time in the past at the so called "Big Bang singularity." We explore the physical and mathematical justification of this surprising result. After laying down the framework of the universe as a spacetime manifold, we combine physical observations with global symmetry assumptions to deduce the FRW cosmological models, which predict a big bang singularity. Next we prove a couple of theorems due to Stephen Hawking which show that the big bang singularity exists even if one removes the global symmetry assumptions. Lastly, we investigate the conditions one needs to impose on a spacetime if one wishes to avoid a singularity. The ideas and concepts used here to study spacetimes are similar to those used to study Riemannian manifolds, therefore we compare and contrast the two geometries throughout.

  4. Reframing Open Big Data

    DEFF Research Database (Denmark)

    Marton, Attila; Avital, Michel; Jensen, Tina Blegind

    2013-01-01

    Recent developments in the techniques and technologies of collecting, sharing and analysing data are challenging the field of information systems (IS) research, let alone the boundaries of organizations and the established practices of decision-making. Coined ‘open data’ and ‘big data’, these developments introduce an unprecedented level of societal and organizational engagement with the potential of computational data to generate new insights and information. Based on the commonalities shared by open data and big data, we develop a research framework that we refer to as open big data (OBD) by employing the dimensions of ‘order’ and ‘relationality’. We argue that these dimensions offer a viable approach for IS research on open and big data because they address one of the core value propositions of IS, i.e. how to support organizing with computational data. We contrast these dimensions with two

  5. Estimating flood discharge using witness movies in post-flood hydrological surveys

    Science.gov (United States)

    Le Coz, Jérôme; Hauet, Alexandre; Le Boursicaud, Raphaël; Pénard, Lionel; Bonnifait, Laurent; Dramais, Guillaume; Thollet, Fabien; Braud, Isabelle

    2015-04-01

    The estimation of streamflow rates based on post-flood surveys is of paramount importance for the investigation of extreme hydrological events. Major uncertainties usually arise from the absence of information on the flow velocities and from the limited spatio-temporal resolution of such surveys. Nowadays, after each flood occurring in populated areas, home movies taken from bridges, river banks or even drones are shared by witnesses through Internet platforms like YouTube. Provided that some topography data and additional information are collected, image-based velocimetry techniques can be applied to some of these movies in order to estimate flood discharges. As a contribution to recent post-flood surveys conducted in France, we developed and applied a method for estimating velocities and discharges based on the Large Scale Particle Image Velocimetry (LSPIV) technique. Since the seminal work of Fujita et al. (1998), LSPIV applications to river flows have been reported by a number of authors and LSPIV can now be considered a mature technique. However, its application to non-professional movies taken by flood witnesses remains challenging and required some practical developments. The steps for applying LSPIV analysis to a flood home movie are as follows: (i) select a video of interest; (ii) contact the author for agreement and extra information; (iii) conduct a field topography campaign to georeference Ground Control Points (GCPs), water level and cross-sectional profiles; (iv) preprocess the video before LSPIV analysis: correct lens distortion, align the images, etc.; (v) orthorectify the images to correct perspective effects and determine the physical size of pixels; (vi) proceed with the LSPIV analysis to compute the surface velocity field; and (vii) compute discharge according to a user-defined velocity coefficient. Two case studies in French mountainous rivers during extreme floods are presented. The movies were collected on YouTube and field topography
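    Step (vi) above, the LSPIV analysis itself, rests on cross-correlating successive frames to track the displacement of surface tracers. A minimal sketch of that core step, using a synthetic image pair with a known shift (the frame size, pixel scale and frame interval below are invented, not taken from the surveys):

```python
import numpy as np

def displacement(frame_a, frame_b):
    """Peak of the FFT-based cross-correlation between two frames,
    returned as (dy, dx) in pixels: the PIV step at the core of LSPIV."""
    fa = np.fft.fft2(frame_a - frame_a.mean())
    fb = np.fft.fft2(frame_b - frame_b.mean())
    corr = np.real(np.fft.ifft2(fa.conj() * fb))
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # wrap indices above N/2 back to negative displacements
    return tuple(p - s if p > s // 2 else p
                 for p, s in zip(peak, corr.shape))

# Synthetic pair: a random tracer pattern advected 3 px down, 5 px right.
rng = np.random.default_rng(0)
frame_a = rng.random((64, 64))
frame_b = np.roll(np.roll(frame_a, 3, axis=0), 5, axis=1)

dy, dx = displacement(frame_a, frame_b)
pixel_size, dt = 0.05, 0.5       # assumed m/px and s between frames
print(dy, dx)                    # → 3 5
print("surface velocity ~", dx * pixel_size / dt, "m/s")
```

    Real applications tile the orthorectified images into interrogation windows and apply this correlation per window, then convert surface velocities to discharge with the velocity coefficient of step (vii).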

  6. Detecting Flood Variations in Shanghai over 1949–2009 with Mann-Kendall Tests and a Newspaper-Based Database

    Directory of Open Access Journals (Sweden)

    Shiqiang Du

    2015-04-01

    Full Text Available A valuable aid to assessing and managing flood risk lies in a reliable database of historical floods. In this study, a newspaper-based flood database for Shanghai (NFDS) for the period 1949–2009 was developed through a systematic scanning of newspapers. After calibration and validation of the database, Mann-Kendall tests and correlation analysis were applied to detect possible changes in flood frequencies. The analysis was carried out for three different flood types: overbank flood, agricultural waterlogging, and urban waterlogging. The compiled NFDS registered 146 floods, 92% of which occurred in the flood-prone season from June to September. The statistical analyses showed that both the annual floods and the floods in June–August increased significantly. Urban waterlogging showed a very strong increasing trend, probably because of the insufficient capacity of the urban drainage system and the impacts of rapid urbanization. By contrast, the decrease in overbank flooding and the slight increase in agricultural waterlogging were likely due to the construction of river levees and seawalls and the upgrading of agricultural drainage systems, respectively. This study demonstrated the usefulness of local newspapers in building a historical flood database and in assessing flood characterization.
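    The Mann-Kendall trend test used on the NFDS series can be sketched in a few lines. This version omits the tie correction and runs on invented annual flood counts, so it illustrates the statistic rather than reproducing the paper's analysis:

```python
import math
from itertools import combinations

def mann_kendall(series):
    """Mann-Kendall trend test (no tie correction): returns the S
    statistic and the standard normal score Z."""
    s = sum((b > a) - (b < a) for a, b in combinations(series, 2))
    n = len(series)
    var_s = n * (n - 1) * (2 * n + 5) / 18
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    return s, z

# Hypothetical annual flood counts with a rising trend.
counts = [1, 0, 2, 1, 3, 2, 4, 3, 5, 6]
s, z = mann_kendall(counts)
print(s, round(z, 2))   # → 34 2.95; Z > 1.96 rejects "no trend" at 5%
```

    A full analysis of count data would add the correction for tied values when computing the variance of S; for the clear upward sequence above the uncorrected Z already exceeds the 5% critical value.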

  7. Protection of Basic Nuclear Installations Against External Flooding - Guide No. 13

    International Nuclear Information System (INIS)

    2013-01-01

    The French regulations require that the flooding hazard be taken into consideration in the demonstration of nuclear safety of basic nuclear installations (BNI). This guide details the recommendations concerning the external flooding hazard, which is defined, for the purpose of this guide, as a flood whose origin is external to the structures, areas or buildings of the BNI accommodating systems or components to be protected, whatever the cause(s) of that flooding (rainfall, river spates, storms, pipe failures, etc.). An external flood therefore means any flood originating outside the perimeter of the BNI and certain floods originating within the BNI perimeter. The terms 'flood' or 'flooding' as used henceforth designate external flooding. The purpose of this guide is to: - define the situations to consider when assessing the flood hazard for the site in question; - propose an acceptable method of quantifying them; - list recommendations for defining means of protection adapted to the specifics of the flooding hazard, implemented by the licensee according to the life cycle phases of the installation. The guide takes climate change into account where the state of knowledge so allows. It is necessary to take into account - on the basis of current knowledge - the predictable climate changes for a period representative of the installations' foreseeable lifetimes, and until the next safety review. The use of this guide necessitates prior identification - for the installation in question - of the functions required to demonstrate nuclear safety which shall be preserved in the event of flooding. These functions are called 'safety functions' in this guide. This guide applies to all the basic nuclear installations defined by article L.593-2 of the Environment Code. With regard to radioactive waste disposal installations, this guide only applies to above-ground facilities. This guide can be used to assess the external flooding hazards and the associated

  8. Operational flood forecasting, warning and response for multi-scale flood risks in developing cities

    NARCIS (Netherlands)

    Rogelis Prada, M.C.

    2016-01-01

    Flood early warning systems are recognized as one of the most effective flood risk management instruments when correctly embedded in comprehensive flood risk management strategies and policies. Many efforts around the world are being put in place to advance the components that determine the

  9. Geomorphic changes caused by the 2011 flood at selected sites along the lower Missouri River and comparison to historical floods: Chapter H in 2011 floods of the central United States

    Science.gov (United States)

    Juracek, Kyle E.

    2014-01-01

    An analysis of recent and historical U.S. Geological Survey streamgage information was used to assess geomorphic changes caused by the 2011 flood, in comparison to selected historical floods, at three streamgage sites along the lower Missouri River—Sioux City, Iowa; Omaha, Nebraska; and Kansas City, Missouri. Channel-width change was not evident at the three streamgage sites following the 2011 flood and likely was inhibited by bank stabilization. Pronounced changes in channel-bed elevation were indicated. At Sioux City and Omaha, the geomorphic effects of the 2011 flood were similar in terms of the magnitude of channel-bed scour and recovery. At both sites, the 2011 flood caused pronounced scour (about 3 feet) of the channel bed; however, at Omaha, most of the channel-bed scour occurred after the flood had receded. More than 1 year after the flood, the channel bed had only partially recovered (about 1 foot) at both sites. Pronounced scour (about 3 feet at Sioux City and about 1.5 feet at Omaha) also was caused by the 1952 flood, which had a substantially larger peak discharge but was much shorter in duration at both sites. Again, at Omaha, most of the channel-bed scour occurred after the flood had receded. At Sioux City, substantial recovery of the channel bed (about 2.5 feet) was documented 1 year after the 1952 flood. Recovery to the pre-flood elevation was complete by April 1954. The greater recovery following the 1952 flood, compared to the 2011 flood, likely was related to a more abundant sediment supply because the flood predated the completion of most of the main-stem dam, channelization, and bank stabilization projects. At Omaha, following the 1952 flood, the channel bed never fully recovered to its pre-flood elevation. The geomorphic effect of the 2011 flood at Kansas City was fill (about 1 foot) on the channel bed followed by relative stability. The 1952 flood, which had a substantially larger peak discharge but was much shorter in duration, caused

  10. Using open source data for flood risk mapping and management in Brazil

    Science.gov (United States)

    Whitley, Alison; Malloy, James; Chirouze, Manuel

    2013-04-01

    Worldwide the frequency and severity of major natural disasters, particularly flooding, has increased. Concurrently, countries such as Brazil are experiencing rapid socio-economic development with growing and increasingly concentrated populations, particularly in urban areas. Hence, it is unsurprising that Brazil has experienced a number of major floods in the past 30 years, such as the January 2011 floods which killed 900 people and resulted in significant economic losses of approximately 1 billion US dollars. Understanding, mitigating against and even preventing flood risk is a high priority. There is a demand for flood models in many developing economies worldwide for a range of uses including risk management, emergency planning and provision of insurance solutions. However, developing them can be expensive. With an increasing supply of freely-available, open source data, the costs can be significantly reduced, making the tools required for natural hazard risk assessment more accessible. By presenting a flood model developed for eight urban areas of Brazil as part of a collaboration between JBA Risk Management and Guy Carpenter, we explore the value of open source data and demonstrate its usability in a business context within the insurance industry. We begin by detailing the open source data available and compare its suitability to commercially-available equivalents for datasets including digital terrain models and river gauge records. We present flood simulation outputs in order to demonstrate the impact of the choice of dataset on the results obtained and their use in a business context. Via use of the 2D hydraulic model JFlow+, our examples also show how advanced modelling techniques can be used on relatively crude datasets to obtain robust, good quality results. In combination with accessible, standard specification GPU technology and open source data, use of JFlow+ has enabled us to produce large-scale hazard maps

  11. Assessment of static flood modeling techniques: application to contrasting marshes flooded during Xynthia (western France)

    Directory of Open Access Journals (Sweden)

    J. F. Breilh

    2013-06-01

    Full Text Available This study aims to assess the performance of raster-based flood modeling methods on a wide diversity of coastal marshes. These methods are applied to the flooding associated with the storm Xynthia, which severely hit the western coast of France in February 2010. Static and semi-dynamic methods are assessed using a combination of LiDAR data, post-storm delineation of flooded areas and sea levels originating from both tide gauge measurements and storm surge modeling. Static methods are applied to 27 marshes showing a wide geomorphological diversity. It appears that these methods are suitable for marshes with a small distance between the coastline and the landward boundary of the marsh, which causes these marshes to flood rapidly. On the contrary, these methods overpredict flooded areas for large marshes where the distance between the coastline and the landward boundary of the marsh is large, because the flooding cannot be considered as instantaneous. In this case, semi-dynamic methods based on surge overflowing volume calculations can improve the flooding prediction significantly. This study suggests that static and semi-dynamic flood modeling methods can be attractive and quickly deployed to rapidly produce predictive flood maps of vulnerable areas under certain conditions, particularly for small distances between the coastline and the landward boundary of the low-lying coastal area.
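    The static ("bathtub") method assessed here amounts to thresholding the LiDAR DEM at the storm sea level, usually combined with a connectivity check from the sea. A minimal sketch on an invented toy DEM (real applications would add the semi-dynamic overflowing-volume correction the abstract describes for large marshes):

```python
import numpy as np
from collections import deque

def static_flood(dem, water_level, seeds):
    """Static 'bathtub' flood map: cells below water_level that are
    4-connected to a seaward seed cell are marked as flooded."""
    below = dem < water_level
    flooded = np.zeros_like(below)
    queue = deque(s for s in seeds if below[s])
    for s in queue:
        flooded[s] = True
    while queue:
        r, c = queue.popleft()
        for rr, cc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if (0 <= rr < dem.shape[0] and 0 <= cc < dem.shape[1]
                    and below[rr, cc] and not flooded[rr, cc]):
                flooded[rr, cc] = True
                queue.append((rr, cc))
    return flooded

# Toy DEM (m): a low marsh behind a dike row with one low gap at the right.
dem = np.array([
    [0.0, 0.2, 0.1, 0.3],
    [2.0, 2.0, 2.0, 0.4],
    [0.5, 0.3, 0.2, 0.5],
])
flooded = static_flood(dem, water_level=1.0, seeds=[(0, 0)])
print(flooded.astype(int))   # the marsh floods only through the gap cell
```

    Without the connectivity check, every low cell would be marked wet regardless of the dike, which is one reason pure thresholding overpredicts flooded areas.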

  12. Effect of catchment properties and flood generation regime on copula selection for bivariate flood frequency analysis

    Science.gov (United States)

    Filipova, Valeriya; Lawrence, Deborah; Klempe, Harald

    2018-02-01

    Applying copula-based bivariate flood frequency analysis is advantageous because the results provide information on both the flood peak and volume. More data are, however, required for such an analysis, and it is often the case that only data series with a limited record length are available. To overcome this issue of limited record length, data regarding climatic and geomorphological properties can be used to complement statistical methods. In this paper, we present a study of 27 catchments located throughout Norway, in which we assess whether catchment properties, flood generation processes and flood regime have an effect on the correlation between flood peak and volume and, in turn, on the selection of copulas. To achieve this, the annual maximum flood events were first classified into events generated primarily by rainfall, snowmelt or a combination of these. The catchments were then classified into flood regime, depending on the predominant flood generation process producing the annual maximum flood events. A contingency table and Fisher's exact test were used to determine the factors that affect the selection of copulas in the study area. The results show that the two-parameter copulas BB1 and BB7 are more commonly selected in catchments with high steepness, high mean annual runoff and rainfall flood regime. These findings suggest that in these types of catchments, the dependence structure between flood peak and volume is more complex and cannot be modeled effectively using a one-parameter copula. The results illustrate that by relating copula types to flood regime and catchment properties, additional information can be supplied for selecting copulas in catchments with limited data.

  13. Field Demonstration of Carbon Dioxide Miscible Flooding in the Lansing-Kansas City Formation, Central Kansas

    Energy Technology Data Exchange (ETDEWEB)

    Alan Byrnes; G. Paul Willhite; Don Green; Richard Pancake; JyunSyung Tsau; W. Lynn Watney; John Doveton; Willard Guy; Rodney Reynolds; Dave Murfin; James Daniels; Russell Martin; William Flanders; Dave Vander Griend; Eric Mork; Paul Cantrell

    2010-03-07

    A pilot carbon dioxide miscible flood was initiated in the Lansing Kansas City C formation in the Hall Gurney Field, Russell County, Kansas. The reservoir zone is an oomoldic carbonate located at a depth of about 2900 feet. The pilot consists of one carbon dioxide injection well and three production wells. Continuous carbon dioxide injection began on December 2, 2003. By the end of June 2005, 16.19 MM lb of carbon dioxide had been injected into the pilot area. Injection was converted to water on June 21, 2005 to reduce operating costs to a breakeven level, with the expectation that sufficient carbon dioxide had been injected to displace the oil bank to the production wells by water injection. By March 7, 2010, 8,736 bbl of oil had been produced from the pilot. Production from wells to the northwest of the pilot region indicates that oil displaced by carbon dioxide injection was produced from Colliver A7, Colliver A3, Colliver A14 and Graham A4, located on adjacent leases. About 19,166 bbl of incremental oil were estimated to have been produced from these wells as of March 7, 2010. There is evidence of a directional permeability trend toward the NW through the pilot region. The majority of the injected carbon dioxide remains in the pilot region, which has been maintained at a pressure at or above the minimum miscibility pressure. Estimated oil recovery attributed to the CO2 flood is 27,902 bbl, which is equivalent to a gross CO2 utilization of 4.8 MCF/bbl. The pilot project is not economic.
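    The reported gross utilization can be roughly cross-checked from the injected mass and the incremental oil; the unit conversions below are generic engineering assumptions (44.01 lb/lb-mol for CO2, 379.5 scf per lb-mol at standard conditions), not figures from the report:

```python
# Back-of-the-envelope check of gross CO2 utilization (MCF per bbl).
injected_lb = 16.19e6          # lb of CO2 injected (from the record)
oil_bbl = 27_902               # bbl attributed to the CO2 flood

scf_per_lb = 379.5 / 44.01     # ~8.6 scf of CO2 per lb (assumed conversion)
injected_mcf = injected_lb * scf_per_lb / 1000.0

utilization = injected_mcf / oil_bbl
print(round(utilization, 1), "MCF/bbl")
```

    This gives roughly 5.0 MCF/bbl, close to the reported 4.8 MCF/bbl; the small difference plausibly reflects the standard-conditions convention used in the report.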

  14. Medical big data: promise and challenges.

    Science.gov (United States)

    Lee, Choong Ho; Yoon, Hyung-Jin

    2017-03-01

    The concept of big data, commonly characterized by volume, variety, velocity, and veracity, goes far beyond the data type and includes aspects of data analysis, such as being hypothesis-generating rather than hypothesis-testing. Big data analysis focuses on the temporal stability of associations rather than on causal relationships, and underlying probability distribution assumptions are frequently not required. Medical big data as material to be analyzed has various features that are not only distinct from big data in other disciplines, but also distinct from traditional clinical epidemiology. Big data technology has many areas of application in healthcare, such as predictive modeling and clinical decision support, disease or safety surveillance, public health, and research. Big data analytics frequently exploits analytic methods developed in data mining, including classification, clustering, and regression. Medical big data analyses are complicated by many technical issues, such as missing values, the curse of dimensionality, and bias control, and share the inherent limitation of observational studies, namely the inability to test causality owing to residual confounding and reverse causation. Recently, propensity score analysis and instrumental variable analysis have been introduced to overcome these limitations, and they have accomplished a great deal. Many challenges, such as the absence of evidence of practical benefits of big data, methodological issues including legal and ethical issues, and clinical integration and utility issues, must be overcome to realize the promise of medical big data as the fuel of a continuous learning healthcare system that will improve patient outcomes and reduce waste in areas including nephrology.
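    The propensity-score idea mentioned above can be illustrated with a small synthetic example (made-up data, not from any medical study): estimating propensity scores by logistic regression and applying inverse-probability weighting removes the confounding that biases a naive group comparison.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000
x = rng.normal(size=(n, 2))                            # confounders
t = rng.binomial(1, 1 / (1 + np.exp(-x.sum(axis=1))))  # treatment depends on confounders
y = 2.0 * t + x[:, 0] + rng.normal(size=n)             # true treatment effect = 2.0

naive = y[t == 1].mean() - y[t == 0].mean()            # confounded estimate

# Estimate propensity scores, then apply inverse-probability weighting
ps = LogisticRegression().fit(x, t).predict_proba(x)[:, 1]
w = np.where(t == 1, 1 / ps, 1 / (1 - ps))
ipw = (np.average(y[t == 1], weights=w[t == 1])
       - np.average(y[t == 0], weights=w[t == 0]))
print(f"naive = {naive:.2f}, IPW = {ipw:.2f}")         # IPW should land near 2.0
```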

  15. Medical big data: promise and challenges

    Directory of Open Access Journals (Sweden)

    Choong Ho Lee

    2017-03-01

    Full Text Available The concept of big data, commonly characterized by volume, variety, velocity, and veracity, goes far beyond the data type and includes aspects of data analysis, such as being hypothesis-generating rather than hypothesis-testing. Big data analysis focuses on the temporal stability of associations rather than on causal relationships, and underlying probability distribution assumptions are frequently not required. Medical big data as material to be analyzed has various features that are not only distinct from big data in other disciplines, but also distinct from traditional clinical epidemiology. Big data technology has many areas of application in healthcare, such as predictive modeling and clinical decision support, disease or safety surveillance, public health, and research. Big data analytics frequently exploits analytic methods developed in data mining, including classification, clustering, and regression. Medical big data analyses are complicated by many technical issues, such as missing values, the curse of dimensionality, and bias control, and share the inherent limitation of observational studies, namely the inability to test causality owing to residual confounding and reverse causation. Recently, propensity score analysis and instrumental variable analysis have been introduced to overcome these limitations, and they have accomplished a great deal. Many challenges, such as the absence of evidence of practical benefits of big data, methodological issues including legal and ethical issues, and clinical integration and utility issues, must be overcome to realize the promise of medical big data as the fuel of a continuous learning healthcare system that will improve patient outcomes and reduce waste in areas including nephrology.

  16. Toward more flood resilience: Is a diversification of flood risk management strategies the way forward?

    Directory of Open Access Journals (Sweden)

    Dries L. T. Hegger

    2016-12-01

    Full Text Available European countries face increasing flood risks because of urbanization, increasing exposure and damage potential, and the effects of climate change. In literature and in practice, it is argued that a diversification of strategies for flood risk management (FRM, including flood risk prevention (through proactive spatial planning, flood defense, flood risk mitigation, flood preparation, and flood recovery, makes countries more flood resilient. Although this thesis is plausible, it has not yet been empirically scrutinized; we aim to do so here. Drawing on existing literature we operationalize the notion of "flood resilience" into three capacities: capacity to resist; capacity to absorb and recover; and capacity to transform and adapt. Based on findings from the EU FP7 project STAR-FLOOD, we explore the degree of diversification of FRM strategies and related flood risk governance arrangements at the national level in Belgium, England, France, the Netherlands, Poland, and Sweden, as well as these countries' achievement in terms of the three capacities. We found that the Netherlands and to a lesser extent Belgium have a strong capacity to resist, France a strong capacity to absorb and recover, and especially England a high capacity to transform and adapt. Having a diverse portfolio of FRM strategies in place may be conducive to high achievements related to the capacities to absorb/recover and to transform and adapt. Hence, we conclude that diversification of FRM strategies contributes to resilience. However, the diversification thesis should be nuanced in the sense that there are different ways to be resilient. First, the three capacities imply different rationales and normative starting points for flood risk governance, the choice between which is inherently political. Second, we found trade-offs between the three capacities, e.g., being resistant seems to lower the possibility to be absorbent. Third, to explain countries' achievements in terms of

  17. Developing a Global Database of Historic Flood Events to Support Machine Learning Flood Prediction in Google Earth Engine

    Science.gov (United States)

    Tellman, B.; Sullivan, J.; Kettner, A.; Brakenridge, G. R.; Slayback, D. A.; Kuhn, C.; Doyle, C.

    2016-12-01

    There is an increasing need to understand flood vulnerability as the societal and economic effects of flooding increase. Risk models from insurance companies and flood models from hydrologists must be calibrated against flood observations in order to make future predictions that can improve planning and help societies reduce future disasters. Specifically, both traditional physically based flood models and data-driven techniques such as machine learning require spatial flood observations to validate model outputs and quantify uncertainty. A key dataset that is missing for flood model validation is a global historical geo-database of flood event extents. Currently, the most advanced database of historical flood extent is hosted and maintained at the Dartmouth Flood Observatory (DFO), which has catalogued 4320 floods (1985-2015) but has mapped only 5% of them. We are addressing this data gap by mapping the inventory of floods in the DFO database to create a first-of-its-kind, comprehensive, global and historical geospatial database of flood events. To do so, we combine water detection algorithms on MODIS and Landsat 5, 7, and 8 imagery in Google Earth Engine to map discrete flood events. The created database will be available in the Earth Engine Catalogue for download by country, region, or time period. This dataset can be leveraged for new data-driven hydrologic modeling using machine learning algorithms in Earth Engine's highly parallelized computing environment, and we will show examples for New York and Senegal.
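    The water-detection step can be illustrated with a simple NDWI threshold on green and near-infrared reflectance, a common ingredient of optical water mapping; this NumPy sketch on toy values is an assumption for illustration, not the Earth Engine algorithm the authors used:

```python
import numpy as np

def ndwi_water_mask(green, nir, threshold=0.0):
    """Flag water pixels via the Normalized Difference Water Index:
    NDWI = (green - nir) / (green + nir); water tends to have NDWI > 0."""
    green = green.astype(float)
    nir = nir.astype(float)
    ndwi = (green - nir) / np.maximum(green + nir, 1e-9)
    return ndwi > threshold

# Toy 2x2 scene: left column "water" (high green, low NIR), right column "land"
green = np.array([[0.30, 0.10], [0.28, 0.12]])
nir = np.array([[0.05, 0.40], [0.06, 0.35]])
mask = ndwi_water_mask(green, nir)
print(mask)  # water detected in the left column only
```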

  18. Big Data Analytics and Its Applications

    Directory of Open Access Journals (Sweden)

    Mashooque A. Memon

    2017-10-01

    Full Text Available The term Big Data was coined to refer to the extensive surge of data that cannot be managed by traditional data-handling methods or techniques. Big Data plays an indispensable role in many fields, such as agriculture, banking, data mining, education, chemistry, finance, cloud computing, marketing, health care, and stocks. Big data analytics is the method of examining big data to reveal hidden patterns, previously unknown relationships, and other important information that can be used to make better decisions. There has been a perpetually expanding interest in big data because of its fast growth and because it covers diverse areas of application. Apache Hadoop, an open-source technology written in Java that runs on the Linux operating system, was used. The primary contribution of this study is to present an effective and free solution for big data applications in a distributed environment, showing its advantages and its ease of use. Going forward, there is a need for an analytical review of new developments in big data technology. Healthcare is one of the greatest concerns of the world. Big data in healthcare refers to electronic health data sets related to patient healthcare and well-being. Data in the healthcare area are growing beyond the management capacity of healthcare organizations and are expected to increase significantly in the coming years.

  19. Measuring the Promise of Big Data Syllabi

    Science.gov (United States)

    Friedman, Alon

    2018-01-01

    Growing interest in Big Data is leading industries, academics and governments to accelerate Big Data research. However, how teachers should teach Big Data has not been fully examined. This article suggests criteria for redesigning Big Data syllabi in public and private degree-awarding higher education establishments. The author conducted a survey…

  20. Observing floods from space: Experience gained from COSMO-SkyMed observations

    Science.gov (United States)

    Pierdicca, N.; Pulvirenti, L.; Chini, M.; Guerriero, L.; Candela, L.

    2013-03-01

    The COSMO-SkyMed mission offers a unique opportunity to obtain all-weather radar images characterized by short revisit time, thus being useful for flood evolution mapping. The COSMO-SkyMed system has been activated several times in the last few years during flood events all over the world in order to provide very high resolution X-band SAR images useful for flood detection purposes. This paper discusses the major outcomes of the experience gained, within the framework of the OPERA Pilot Project funded by the Italian Space Agency, from using COSMO-SkyMed data for the purpose of near real time generation of flood maps. A review of the mechanisms which determine the imprints of the inundation on the radar images and of the fundamental simulation tools able to predict these imprints and help image interpretation is provided. The approach developed to process the data and to generate the flood maps is also summarized. Then, the paper illustrates the experience gained with COSMO-SkyMed by describing and discussing a number of significant examples. These examples demonstrate the potential of the COSMO-SkyMed system and the suitability of the approach developed for generating the final products, but they also highlight some critical aspects that require further investigations to improve the reliability of the flood maps.

  1. Developing a Malaysia flood model

    Science.gov (United States)

    Haseldine, Lucy; Baxter, Stephen; Wheeler, Phil; Thomson, Tina

    2014-05-01

    Faced with growing exposures in Malaysia, insurers have a need for models to help them assess their exposure to flood losses. The need for improved management of flood risks has been further highlighted by the 2011 floods in Thailand and recent events in Malaysia. The increasing demand for loss accumulation tools in Malaysia has led to the development of the first nationwide probabilistic Malaysia flood model, which we present here. The model is multi-peril, including river flooding for thousands of kilometres of river and rainfall-driven surface water flooding in major cities, which may cause losses equivalent to river flooding in some high-density urban areas. The underlying hazard maps are based on a 30m digital surface model (DSM) and 1D/2D hydraulic modelling in JFlow and RFlow. Key mitigation schemes such as the SMART tunnel and drainage capacities are also considered in the model. The probabilistic element of the model is driven by a stochastic event set based on rainfall data, hence enabling per-event and annual figures to be calculated for a specific insurance portfolio and a range of return periods. Losses are estimated via depth-damage vulnerability functions which link the insured damage to water depths for different property types in Malaysia. The model provides a unique insight into Malaysian flood risk profiles and provides insurers with return period estimates of flood damage and loss to property portfolios through loss exceedance curve outputs. It has been successfully validated against historic flood events in Malaysia and is now being used by insurance companies in the Malaysian market to obtain reinsurance cover.
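    The probabilistic chain described above (stochastic event set, per-event losses, loss exceedance curve) can be sketched as follows; the event frequency and loss distribution here are placeholder assumptions, not the model's calibrated values:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical stochastic event set: 10,000 simulated years; Poisson event counts
# and lognormal per-event portfolio losses stand in for the depth-damage output.
years = 10_000
annual_losses = np.array([
    rng.lognormal(mean=15.0, sigma=1.0, size=rng.poisson(0.8)).sum()
    for _ in range(years)
])

# Loss exceedance: the T-year return-period loss is the (1 - 1/T) annual quantile
ep_curve = {T: np.quantile(annual_losses, 1 - 1 / T) for T in (10, 100, 200)}
for T, loss in ep_curve.items():
    print(f"{T:>4}-year loss: {loss:,.0f}")
```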

  2. The influence of climate change on flood risks in France - first estimates and uncertainty analysis

    OpenAIRE

    Dumas, Patrice; Hallegatte, Stéphane; Quintana-Seguí, Pere; Martin, Eric

    2013-01-01

    International audience; Abstract. This paper proposes a methodology to project the possible evolution of river flood damages due to climate change, and applies it to mainland France. Its main contributions are (i) to demonstrate a methodology to investigate the full causal chain from global climate change to local economic flood losses; (ii) to show that future flood losses may change in a very significant manner over France; (iii) to show that a very large uncertainty arises from the climate...

  3. Characterising Record Flooding in the United Kingdom

    Science.gov (United States)

    Cox, A.; Bates, P. D.; Smith, J. A.

    2017-12-01

    Though the most notable floods in history have been carefully explained, there remains a lack of literature exploring the nature of record floods as a whole in the United Kingdom. We characterise the seasonality, statistical and spatial distribution, and meteorological causes of peak river flows for 521 gauging stations spread across the British Isles. We use annual maximum data from the National River Flow Archive, catchment descriptors from the Flood Estimation Handbook, and historical records of large floods. We aim to determine in what ways, if any, a station's record flood differs from more 'typical' floods at that site. For each station, we calculate two indices: the seasonal anomaly and the flood index. Broadly, the seasonal anomaly is the degree to which a station's record flood happens at a different time of year compared to typical floods at that site, whilst the flood index is a station's record flood discharge divided by the discharge of the 1-in-10-year return period event. We find that while annual maximum peaks are dominated by winter frontal rainfall, record floods are disproportionately caused by summer convective rainfall. This analysis also shows that the larger the seasonal anomaly, the higher the flood index. Additionally, stations across the country have record floods that occur in the summer with no notable spatial pattern, yet the most seasonally anomalous record events are concentrated around the south and west of the British Isles. Catchment descriptors tell us little about the flood index at a particular station, but generally areas with lower mean annual precipitation have a higher flood index. The inclusion of case studies from recent and historical examples of notable floods across the UK supplements our analysis and gives insight into how typical these events are, both statistically and meteorologically. Ultimately, record floods in general happen at relatively unexpected times and with unpredictable magnitudes, which is a
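    The flood index defined above (record flood discharge divided by the 1-in-10-year discharge) can be computed from an annual-maximum series. The sketch below uses a Gumbel fit and synthetic data; the distribution choice and the series itself are assumptions, since the paper's exact estimation method for the 10-year event is not stated here:

```python
import numpy as np
from scipy.stats import gumbel_r

rng = np.random.default_rng(1)
# Hypothetical annual-maximum discharge series for one gauging station (m^3/s)
amax = gumbel_r.rvs(loc=100, scale=30, size=50, random_state=rng)

# Fit a Gumbel (EV1) distribution and take the 1-in-10-year quantile
loc, scale = gumbel_r.fit(amax)
q10 = gumbel_r.ppf(1 - 1 / 10, loc=loc, scale=scale)
flood_index = amax.max() / q10
print(f"Q10 = {q10:.1f} m3/s, flood index = {flood_index:.2f}")
```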

  4. Using Memory in the Right Way to Accelerate Big Data Processing

    Institute of Scientific and Technical Information of China (English)

    阎栋; 尹绪森; 连城; 钟翔; 周鑫; 吴甘沙

    2015-01-01

    Big data processing is becoming a standout part of data center computation. However, recent research indicates that big data workloads cannot make full use of modern memory systems. We find that the dramatic inefficiency of big data processing stems from the enormous number of cache misses and stalls on dependent memory accesses. In this paper, we introduce two optimizations to tackle these problems. The first is the slice-and-merge strategy, which reduces the cache miss rate of the sort procedure. The second is direct memory access, which reforms the data structure used in key/value storage. These optimizations are evaluated with both micro-benchmarks and the real-world benchmark HiBench. The results of our micro-benchmarks clearly demonstrate the effectiveness of our optimizations in terms of hardware event counts, and the additional results of HiBench show a 1.21X average speedup at the application level. Both results illustrate that careful hardware/software co-design can improve the memory efficiency of big data processing. Our work has already been integrated into the Intel distribution for Apache Hadoop.
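    The slice-and-merge idea can be sketched in a few lines: sort slices small enough to stay cache-resident, then k-way merge them, so most comparisons touch recently used memory. This is only an illustration of the strategy in Python, not the authors' Hadoop implementation:

```python
import heapq

def slice_and_merge_sort(data, slice_size=4096):
    """Sort fixed-size slices independently, then k-way merge the sorted runs.
    The slice size would be chosen so each run fits in cache."""
    runs = [sorted(data[i:i + slice_size])
            for i in range(0, len(data), slice_size)]
    return list(heapq.merge(*runs))

data = [5, 3, 9, 1, 8, 2, 7, 4, 6, 0]
print(slice_and_merge_sort(data, slice_size=3))  # fully sorted output
```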

  5. 77 FR 27245 - Big Stone National Wildlife Refuge, Big Stone and Lac Qui Parle Counties, MN

    Science.gov (United States)

    2012-05-09

    ... DEPARTMENT OF THE INTERIOR Fish and Wildlife Service [FWS-R3-R-2012-N069; FXRS1265030000S3-123-FF03R06000] Big Stone National Wildlife Refuge, Big Stone and Lac Qui Parle Counties, MN AGENCY: Fish and... plan (CCP) and environmental assessment (EA) for Big Stone National Wildlife Refuge (Refuge, NWR) for...

  6. Natural Flood Management in context: evaluating and enhancing the impact.

    Science.gov (United States)

    Metcalfe, Peter; Beven, Keith; Hankin, Barry; Lamb, Rob

    2016-04-01

    The series of flood events in the UK throughout December 2015 have led to calls for a reappraisal of the country's approach to flood management. In parts of Cumbria so-called "1 in 100" year floods have occurred three times in the last ten years, leading to significant infrastructure damage. Hard-engineered defences, upgraded to cope with an anticipated 20% increase in peak flows, have been overwhelmed by these 1% AEP events. It has become more widely acknowledged that unsympathetic agricultural and upland management practices, mainly since the Second World War, have led to a significant loss of storage in mid and upper catchments and of their consequent ability to retain and slow storm run-off. Natural Flood Management (NFM) is a nature-based solution for restoring this storage and flood peak attenuation through a network of small-scale features exploiting natural topography and materials. Combined with other "soft" interventions such as restoring flood plain roughness and tree-planting, NFM offers the attractive prospect of an intervention that can target both the ecological and chemical objectives of the Water Framework Directive and the resilience demanded by the Floods Directive. We developed a simple computerised physical routing model that can account for the presence of in-channel and offline features such as would be found in an NFM scheme. These add storage to the channel and floodplain and throttle the downstream discharge at storm flows. The model was applied to the heavily-modified channel network of an agricultural catchment in North Yorkshire using the run-off simulated for two storm events that caused flooding downstream in the autumn of 2012. Using up to 60 online features we demonstrated some gains in channel storage and a small impact on the flood hydrograph which would, however, have been insufficient to prevent the downstream floods in either of the storms. Complementary research at JBA has applied their hydrodynamic model JFLOW+ to identify

  7. Sex-specific responses to winter flooding, spring waterlogging and post-flooding recovery in Populus deltoides

    OpenAIRE

    Ling-Feng Miao; Fan Yang; Chun-Yu Han; Yu-Jin Pu; Yang Ding; Li-Jia Zhang

    2017-01-01

    Winter flooding events are common in some rivers and streams due to dam constructions, and flooding and waterlogging inhibit the growth of trees in riparian zones. This study investigated sex-specific morphological, physiological and ultrastructural responses to various durations of winter flooding and spring waterlogging stresses, and post-flooding recovery characteristics in Populus deltoides. There were no significant differences in the morphological, ultrastructural and the majority of ph...

  8. Application of RUNTA code in flood analyses

    International Nuclear Information System (INIS)

    Perez Martin, F.; Benitez Fonzalez, F.

    1994-01-01

    Flood probability analyses carried out to date indicate the need to evaluate a large number of flood scenarios. This necessity is due to a variety of reasons, the most important of which include: the large number of potential flood sources; the wide variety of flood source characteristics; the high possibility of flood-affected areas becoming interlinked, depending on the location of the potential flood sources; the diversity of flood flows from a given flood source, depending on the size of the rupture and the mode of operation; the applicable isolation times; uncertainties regarding the structural resistance of doors, penetration seals and floors; and the applicable degrees of obstruction of the floor drainage system. Consequently, a tool which carries out the large number of calculations usually required in flood analyses, with speed and flexibility, is considered necessary. The RUNTA Code enables the range of possible scenarios to be calculated numerically, in accordance with all those parameters which, as a result of previous flood analyses, must be taken into account in order to cover all the possible floods associated with each flood area
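    The combinatorial growth that motivates a tool like RUNTA can be illustrated by enumerating scenario parameters; the sources and parameter values below are hypothetical examples, not taken from the analyses:

```python
from itertools import product

# Hypothetical scenario grid: every combination of flood source, rupture size,
# and drainage obstruction defines one scenario to evaluate.
sources = ["service water", "fire main", "circulating water"]
rupture_sizes = ["small leak", "medium break", "guillotine break"]
obstruction = [0.0, 0.5, 1.0]  # fraction of floor drainage blocked

scenarios = list(product(sources, rupture_sizes, obstruction))
print(len(scenarios))  # 27 combinations from just three 3-valued parameters
```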

  9. The BigBoss Experiment

    Energy Technology Data Exchange (ETDEWEB)

    Schlegel, D.; Abdalla, F.; Abraham, T.; Ahn, C.; Allende Prieto, C.; Annis, J.; Aubourg, E.; Azzaro, M.; Bailey, S.; Baltay, C.; Baugh, C.; Bebek, C.; Becerril, S.; Blanton, M.; Bolton, A.; Bromley, B.; Cahn, R.; Carton, P.-H.; Cervantes-Cota, J.L.; Chu, Y.; Cortes, M.; /APC, Paris /Brookhaven /IRFU, Saclay /Marseille, CPPM /Marseille, CPT /Durham U. / /IEU, Seoul /Fermilab /IAA, Granada /IAC, La Laguna / /IAC, Mexico / / /Madrid, IFT /Marseille, Lab. Astrophys. / / /New York U. /Valencia U.

    2012-06-07

    BigBOSS is a Stage IV ground-based dark energy experiment to study baryon acoustic oscillations (BAO) and the growth of structure with a wide-area galaxy and quasar redshift survey over 14,000 square degrees. It has been conditionally accepted by NOAO in response to a call for major new instrumentation and a high-impact science program for the 4-m Mayall telescope at Kitt Peak. The BigBOSS instrument is a robotically-actuated, fiber-fed spectrograph capable of taking 5000 simultaneous spectra over a wavelength range from 340 nm to 1060 nm, with a resolution R = {lambda}/{Delta}{lambda} = 3000-4800. Using data from imaging surveys that are already underway, spectroscopic targets are selected that trace the underlying dark matter distribution. In particular, targets include luminous red galaxies (LRGs) up to z = 1.0, extending the BOSS LRG survey in both redshift and survey area. To probe the universe out to even higher redshift, BigBOSS will target bright [OII] emission line galaxies (ELGs) up to z = 1.7. In total, 20 million galaxy redshifts are obtained to measure the BAO feature, trace the matter power spectrum at smaller scales, and detect redshift space distortions. BigBOSS will provide additional constraints on early dark energy and on the curvature of the universe by measuring the Ly-alpha forest in the spectra of over 600,000 2.2 < z < 3.5 quasars. BigBOSS galaxy BAO measurements combined with an analysis of the broadband power, including the Ly-alpha forest in BigBOSS quasar spectra, achieves a FOM of 395 with Planck plus Stage III priors. This FOM is based on conservative assumptions for the analysis of broad band power (k{sub max} = 0.15), and could grow to over 600 if current work allows us to push the analysis to higher wave numbers (k{sub max} = 0.3). BigBOSS will also place constraints on theories of modified gravity and inflation, and will measure the sum of neutrino masses to 0.024 eV accuracy.

  10. Protection of base nuclear installations against external flooding - Guide nr 13, release of the 08/01/2013

    International Nuclear Information System (INIS)

    2013-01-01

    As French law requires the flooding risk to be taken into account in the demonstration of the nuclear safety of base nuclear installations (INB), this guide aims to define the situations to be considered when assessing the flooding risk for a site (identification of water sources and of flooding causes, definition of flooding situations), to propose an acceptable method for quantifying these situations (local rains, rise of water level, problems on hydraulic works, dam failure, ocean waves, and so on), and to list recommendations for defining protection means that are adapted to the specific features of the flooding risk and are implemented by the operator over the installation lifetime

  11. Big data and educational research

    OpenAIRE

    Beneito-Montagut, Roser

    2017-01-01

    Big data and data analytics offer the promise to enhance teaching and learning, improve educational research and advance education governance. This chapter aims to contribute to the conceptual and methodological understanding of big data and analytics within educational research. It describes the opportunities and challenges that big data and analytics bring to education and critically explores the perils of applying a data-driven approach to education. Despite the claimed value of the...

  12. Adjustable Robust Strategies for Flood Protection

    NARCIS (Netherlands)

    Postek, Krzysztof; den Hertog, Dick; Kind, J.; Pustjens, Chris

    2016-01-01

    Flood protection is of major importance to many flood-prone regions and involves substantial investment and maintenance costs. Modern flood risk management often requires determining a cost-efficient protection strategy, i.e., one with the lowest possible long-run cost and satisfying flood protection

  13. Applying a coupled hydrometeorological simulation system to flash flood forecasting over the Korean Peninsula

    Science.gov (United States)

    Ryu, Young; Lim, Yoon-Jin; Ji, Hee-Sook; Park, Hyun-Hee; Chang, Eun-Chul; Kim, Baek-Jo

    2017-11-01

    In flash flood forecasting, it is necessary to consider not only traditional meteorological variables such as precipitation, evapotranspiration, and soil moisture, but also hydrological components such as streamflow. To address this challenge, the application of high resolution coupled atmospheric-hydrological models is emerging as a promising alternative. This study demonstrates the feasibility of linking a coupled atmospheric-hydrological model (WRF/WRF-Hydro) with 150-m horizontal grid spacing for flash flood forecasting in Korea. The study area is the Namgang Dam basin in Southern Korea, a mountainous area located downstream of Jiri Mountain (1915 m in height). Under flash flood conditions, the simulated precipitation over the entire basin is comparable to the domain-averaged precipitation, but discharge data from WRF-Hydro shows some differences in the total available water and the temporal distribution of streamflow (given by the timing of the streamflow peak following precipitation), compared to observations. On the basis of sensitivity tests, the parameters controlling the infiltration of excess precipitation and channel roughness depending on stream order are refined, and their influence on the temporal distribution of streamflow is assessed, with the intent of applying WRF-Hydro to flash flood forecasting in the Namgang Dam basin. The simulation results from the WRF-Hydro model with optimized parameters demonstrate the potential utility of a coupled atmospheric-hydrological model for forecasting heavy rain-induced flash flooding over the Korean Peninsula.

  14. Flood Inundation Mapping and Emergency Operations during Hurricane Harvey

    Science.gov (United States)

    Fang, N. Z.; Cotter, J.; Gao, S.; Bedient, P. B.; Yung, A.; Penland, C.

    2017-12-01

    Hurricane Harvey struck the Gulf Coast as a Category 4 hurricane on August 25, 2017, bringing devastating and life-threatening floods to Texas. Harris County received up to 49 inches of rainfall over a 5-day period and experienced flooding levels and impacts beyond any previous storm in Houston's history. The depth-duration-frequency analysis reveals that the areal average rainfall for Brays Bayou surpasses the 500-year rainfall at both 24 and 48 hours. To cope with this unprecedented event, researchers at the University of Texas at Arlington and Rice University worked closely with the U.S. Army Corps of Engineers (USACE), the National Weather Service (NWS), the Texas Division of Emergency Management (TDEM), Walter P. Moore and Associates, Inc. and Halff Associates, to conduct a series of meteorological, hydrologic and hydraulic analyses to delineate flood inundation maps. Up to eight major watersheds in Harris County were delineated based on the available QPE data from the WGRFC. The inundation map over Brays Bayou, with impacts from Hurricane Harvey, was delineated in comparison with those of the 100-year, 500-year, and Probable Maximum Precipitation (PMP) design storms. This presentation will provide insights for both engineers and planners to re-evaluate the existing flood infrastructure and policy, which will help build Houston stronger for future extreme storms. The collaborative effort among the federal, academic, and private entities clearly demonstrates an effective approach for flood inundation mapping initiatives for the nation.

  15. Thick-Big Descriptions

    DEFF Research Database (Denmark)

    Lai, Signe Sophus

    The paper discusses the rewards and challenges of employing commercial audience measurements data – gathered by media industries for profitmaking purposes – in ethnographic research on the Internet in everyday life. It questions claims to the objectivity of big data (Anderson 2008), the assumption...... communication systems, language and behavior appear as texts, outputs, and discourses (data to be ‘found’) – big data then documents things that in earlier research required interviews and observations (data to be ‘made’) (Jensen 2014). However, web-measurement enterprises build audiences according...... to a commercial logic (boyd & Crawford 2011) and is as such directed by motives that call for specific types of sellable user data and specific segmentation strategies. In combining big data and ‘thick descriptions’ (Geertz 1973) scholars need to question how ethnographic fieldwork might map the ‘data not seen...

  16. Internal flooding analyses results of Slovak NPPs

    International Nuclear Information System (INIS)

    Sopira, Vladimir

    2000-01-01

    The assessment of the flood risk was the objective of the internal flooding analyses for NPPs Bohunice V1, V2 and Mochovce. All important flooding sources were identified. The rooms containing safety-important components were analyzed from the point of view of: integrity of flood boundaries; capability for drainage; flood signalisation; flood localization and liquidation; and vulnerability of safety system components. The redundancies of the safety systems are mostly located separately, and no flood can endanger more than a single train. It can be concluded that NPPs with WWER-440 reactors are very safe against the flooding initiating event

  17. Smoky River coal flood risk mapping study

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2004-06-01

    The Canada-Alberta Flood Damage Reduction Program (FDRP) is designed to reduce flood damage by identifying areas susceptible to flooding and by encouraging application of suitable land use planning, zoning, and flood preparedness and proofing. The purpose of this study is to define flood risk and floodway limits along the Smoky River near the former Smoky River Coal (SRC) plant. Alberta Energy has been responsible for the site since the mine and plant closed in 2000. The study describes flooding history, available data, features of the river and valley, calculation of flood levels, and floodway determination, and includes flood risk maps. The HEC-RAS program is used for the calculations. The flood risk area was calculated using the 1:100 year return period flood as the hydrological event. 7 refs., 11 figs., 7 tabs., 3 apps.

  18. Synoptic-scale atmospheric conditions associated with flash flooding in watersheds of the Catskill Mountains, New York, USA

    Science.gov (United States)

    Teale, N. G.; Quiring, S. M.

    2015-12-01

    Understanding flash flooding is important in unfiltered watersheds, such as portions of the New York City water supply system (NYCWSS), as water quality is degraded by turbidity associated with flooding. To further understand flash flooding in watersheds of the NYCWSS, the synoptic-scale atmospheric conditions most frequently associated with flash flooding between 1987 and 2013 were examined. Flash floods were identified during this time period using USGS 15-minute discharge data at the Esopus Creek near Allaben, NY and Neversink River at Claryville, NY gauges. Overall, 25 flash floods were detected, occurring over 17 separate flash flood days. These flash flood days were compared to the days on which flash flood warnings encompassing the study area were issued by the National Weather Service. The rate at which flash flood warnings for Ulster County coincided with a flash flood in the study watersheds was 0.09, demonstrating the highly localized nature of flash flooding in the Catskill Mountain region. The synoptic-scale atmospheric patterns influencing the study area were characterized by a principal component analysis and k-means clustering of NCEP/NCAR 500 mb geopotential height reanalysis data. This procedure was executed in Spatial Synoptic Typer Tools 4.0. While 17 unique synoptic patterns were identified, only 3 types were strongly associated with flash flooding events. A strong southwesterly flow suggesting advection of moisture from the Atlantic Ocean and Gulf of Mexico is shown in composites of these 3 types. This multiscalar study thereby links flash flooding in the NYCWSS with synoptic-scale atmospheric circulation.
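
    The pattern-classification step described above (principal component analysis followed by k-means clustering of 500 mb geopotential height fields) can be sketched as follows. The random synthetic "height fields", the number of retained components, and the cluster count k=3 are placeholders, not the study's reanalysis data or its 17 synoptic types.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic stand-in for daily 500 mb height fields: 200 days x 50 grid points.
fields = rng.normal(size=(200, 50))

# PCA via eigendecomposition of the covariance of the anomaly fields.
anom = fields - fields.mean(axis=0)
cov = np.cov(anom, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)          # eigh returns ascending order
order = np.argsort(eigvals)[::-1]
scores = anom @ eigvecs[:, order[:5]]           # retain the leading 5 PCs

# Plain k-means on the PC scores (k synoptic types; k = 3 for illustration).
k = 3
centers = scores[rng.choice(len(scores), k, replace=False)]
for _ in range(50):
    labels = np.argmin(((scores[:, None, :] - centers) ** 2).sum(-1), axis=1)
    centers = np.array([scores[labels == j].mean(axis=0) if np.any(labels == j)
                        else centers[j] for j in range(k)])

print(np.bincount(labels, minlength=k))  # days assigned to each synoptic type
```

    Composites (means of the original fields grouped by label) would then reveal which cluster corresponds to, e.g., the southwesterly-flow type linked to flash floods.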

  19. Elementary Teachers' Comprehension of Flooding through Inquiry-based Professional Development and Use of Self-regulation Strategies

    Science.gov (United States)

    Lewis, Elizabeth B.; van der Hoeven Kraft, Katrien J.; Bueno Watts, Nievita; Baker, Dale R.; Wilson, Meredith J.; Lang, Michael

    2011-07-01

    This study focuses on elementary teachers' comprehension of flooding before and after inquiry-based professional development (PD). There was an improvement in teachers' understanding toward a normative view from pre- to post-test (n = 17, mean gain = 4.3, SD = 3.27). Several misunderstandings and a general lack of knowledge about flooding emerged from the geoscience content two-tier pre-test, some of which persisted throughout the PD seminar while other responses provided evidence of teachers' improved understanding. The concepts that teachers struggled with were also apparent upon examining teachers' reflections upon their learning and teaching practices throughout the seminar. Teachers were challenged as they attempted to add new academic language, such as storm surge and discharge, to their prior understandings. Flooding concepts that teachers showed the least improvement on included analyzing a topographic region, reading a map image, and hydrograph interpretation. Teachers' greatest areas of improved understanding occurred in understanding the probability and role of ground conditions in flooding events. Teachers demonstrated considerable growth in their understanding of some flooding concepts through scaffolded inquiry lessons modeled throughout the PD. Those teachers who had greater prior knowledge and demonstrated more use of self-regulated learning showed the most change toward a normative view of flooding. The explicit modeling and participation in inquiry-based science activities and written responses to self-regulatory learning prompts throughout the seminar supported teachers' learning.

  20. A new methodology for dynamic modelling of health risks arising from wastewater influenced urban flooding

    Science.gov (United States)

    Jørgensen, Claus; Mark, Ole; Djordjevic, Slobodan; Hammond, Michael; Khan, David M.; Erichsen, Anders; Dorrit Enevoldsen, Ann; Heinicke, Gerald; Helwigh, Birgitte

    2015-04-01

    flood water, based on either measured wastewater pathogen concentrations or on assumptions regarding the prevalence of infections in the population. The exposure (dosage) to pathogens was estimated by multiplying the concentration with literature values for the ingestion of water for different exposure groups (e.g. children, adults). The probability of infection was determined by applying dose-response relations and Monte Carlo simulation. The methodology is demonstrated on two cases, i.e., one case from a developing country with poor sanitation and one case from a developed country where climate adaptation is the main issue: the risk of cholera in the city of Dhaka, Bangladesh, during a flood event in 2004, and the risk of bacterial and viral infections during a flood event in Copenhagen, Denmark, in 2011. Results: The historical flood events in Dhaka (2004) and Copenhagen (2011) were successfully modelled, and the urban flood model was successfully coupled to QMRA. An example of the results of the quantitative microbial risk assessment, given as the average estimated risk of cholera infection for children below 5 years living in slum areas of Dhaka, is shown in the figure. Similarly, the risk of infection during the flood event in Copenhagen will be presented in the article. Conclusions: We have developed a methodology for the dynamic modelling of the risk of infection during wastewater-influenced urban flooding. The outcome of the modelling exercise indicates that direct contact with polluted flood water is a likely route of transmission of cholera in Dhaka, and of bacterial and viral infectious diseases in Copenhagen. It demonstrates the applicability of, and the potential for, linking urban flood models with QMRA in order to identify interventions that reduce the burden of disease on the populations of Dhaka City and Copenhagen.
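
    The exposure and dose-response steps described above can be sketched with a minimal Monte Carlo calculation. The pathogen concentration, ingestion volumes, and the exponential dose-response parameter r below are illustrative assumptions, not values from the study.

```python
import math
import random

random.seed(42)

def infection_risk(conc_per_L, ingest_mL_median, r, n=100_000):
    """Monte Carlo estimate of per-event infection probability.

    conc_per_L       pathogen concentration in flood water (organisms/L)
    ingest_mL_median median ingested volume per exposure event (mL)
    r                exponential dose-response parameter: P(inf|dose) = 1 - exp(-r*dose)
    """
    total = 0.0
    for _ in range(n):
        # Lognormal variability in ingested volume (illustrative choice).
        ingest_mL = random.lognormvariate(math.log(ingest_mL_median), 0.5)
        dose = conc_per_L * ingest_mL / 1000.0
        total += 1.0 - math.exp(-r * dose)
    return total / n

# Hypothetical inputs: 1000 organisms/L; children ingest ~10 mL per event.
print(f"risk of infection: {infection_risk(1000, 10, r=1e-4):.4f}")
```

    Distinct exposure groups (children wading, adults walking through flood water) would simply re-run the calculation with different ingestion distributions.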

  1. Big Data's Role in Precision Public Health.

    Science.gov (United States)

    Dolley, Shawn

    2018-01-01

    Precision public health is an emerging practice to more granularly predict and understand public health risks and customize treatments for more specific and homogeneous subpopulations, often using new data, technologies, and methods. Big data is one element that has consistently helped to achieve these goals, through its ability to deliver to practitioners a volume and variety of structured or unstructured data not previously possible. Big data has enabled more widespread and specific research and trials of stratifying and segmenting populations at risk for a variety of health problems. Examples of success using big data are surveyed in surveillance and signal detection, predicting future risk, targeted interventions, and understanding disease. Using novel big data or big data approaches has risks that remain to be resolved. The continued growth in volume and variety of available data, decreased costs of data capture, and emerging computational methods mean big data success will likely be a required pillar of precision public health into the future. This review article aims to identify the precision public health use cases where big data has added value, identify classes of value that big data may bring, and outline the risks inherent in using big data in precision public health efforts.

  2. Flood risk governance arrangements in Europe

    Science.gov (United States)

    Matczak, P.; Lewandowski, J.; Choryński, A.; Szwed, M.; Kundzewicz, Z. W.

    2015-06-01

    The STAR-FLOOD (Strengthening and Redesigning European Flood Risk Practices Towards Appropriate and Resilient Flood Risk Governance Arrangements) project, funded by the European Commission, investigates strategies for dealing with flood risk in six European countries: Belgium, the UK, France, the Netherlands, Poland and Sweden and in 18 vulnerable urban regions in these countries. The project aims to describe, analyse, explain, and evaluate the main similarities and differences between the selected EU Member States in terms of development and performance of flood risk governance arrangements. It also discusses the scientific and societal importance of these similarities and differences. Attention is paid to identification and characterization of shifts in flood risk governance arrangements and in flood risk management strategies and to determination of triggering factors and restraining factors. An assessment of a change of resilience and appropriateness (legitimacy, effectiveness, efficiency) of flood risk governance arrangements in Poland is presented and comparison with other European countries is offered.

  3. Flood risk governance arrangements in Europe

    Directory of Open Access Journals (Sweden)

    P. Matczak

    2015-06-01

    Full Text Available The STAR-FLOOD (Strengthening and Redesigning European Flood Risk Practices Towards Appropriate and Resilient Flood Risk Governance Arrangements project, funded by the European Commission, investigates strategies for dealing with flood risk in six European countries: Belgium, the UK, France, the Netherlands, Poland and Sweden and in 18 vulnerable urban regions in these countries. The project aims to describe, analyse, explain, and evaluate the main similarities and differences between the selected EU Member States in terms of development and performance of flood risk governance arrangements. It also discusses the scientific and societal importance of these similarities and differences. Attention is paid to identification and characterization of shifts in flood risk governance arrangements and in flood risk management strategies and to determination of triggering factors and restraining factors. An assessment of a change of resilience and appropriateness (legitimacy, effectiveness, efficiency of flood risk governance arrangements in Poland is presented and comparison with other European countries is offered.

  4. Utility of Big Area Additive Manufacturing (BAAM) For The Rapid Manufacture of Customized Electric Vehicles

    Energy Technology Data Exchange (ETDEWEB)

    Love, Lonnie J [ORNL

    2015-08-01

    This Oak Ridge National Laboratory (ORNL) Manufacturing Demonstration Facility (MDF) technical collaboration project was conducted in two phases as a CRADA with Local Motors Inc. Phase 1, previously reported as Advanced Manufacturing of Complex Cyber Mechanical Devices through Community Engagement and Micro-manufacturing, demonstrated the integration of components onto a prototype body part for a vehicle. Phase 2, reported as Utility of Big Area Additive Manufacturing (BAAM) for the Rapid Manufacture of Customized Electric Vehicles, demonstrated the high-profile live printing of an all-electric vehicle using ORNL's Big Area Additive Manufacturing (BAAM) technology. This demonstration generated considerable national attention and successfully demonstrated the capabilities of the BAAM system as developed by ORNL and Cincinnati, Inc., and the feasibility of additive manufacturing of a full-scale electric vehicle as envisioned by the CRADA partner, Local Motors, Inc.

  5. Tackling Flood Risk from Watersheds using a Natural Flood Risk Management Toolkit

    Science.gov (United States)

    Reaney, S. M.; Pearson, C.; Barber, N.; Fraser, A.

    2017-12-01

    In the UK, flood risk management is moving beyond solely mitigating at the point of impact in towns and at key infrastructure towards tackling the problem at source through a range of landscape-based intervention measures. This natural flood risk management (NFM) approach has been trialled in a range of UK catchments and is moving towards adoption as a key part of flood risk management. The approach offers advantages including lower cost and co-benefits for water quality and habitat creation. However, for an agency or group wishing to implement NFM within a catchment, two key questions need to be addressed: where in the catchment should the measures be placed, and how many measures are needed to be effective? With this toolkit, these questions are assessed in a two-stage workflow. First, SCIMAP-Flood gives a risk-based mapping of the locations likely to contribute to the flood peak. This tool uses information on land cover, hydrological connectivity, flood-generating rainfall patterns, and hydrological travel-time distributions to impacted communities. The presented example applies the tool to the River Eden catchment, UK, at 5 m grid resolution, and hence provides sub-field-scale information at the landscape extent. SCIMAP-Flood identifies sub-catchments where physically based catchment hydrological simulation models can be applied to test different NFM-based mitigation measures. In this example, the CRUM3 catchment hydrological model has been applied within an uncertainty framework to consider the effectiveness of soil-compaction reduction and large-woody-debris dams within a sub-catchment. It was found that large-scale soil aeration to reduce soil compaction levels throughout the catchment is probably the most useful natural flood management measure for this catchment. NFM has potential for widespread application, and these tools help to ensure that the measures are correctly designed and that scheme performance can be quantitatively assessed and predicted.
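
    The risk-based mapping stage described above combines several spatial layers into a per-cell score. A minimal sketch of such an overlay follows; the 3x3 grids and weights are invented for illustration and are not SCIMAP-Flood's actual data or formulation.

```python
# Each grid cell's contribution to the flood peak is scored as the product of
# a land-cover runoff weight, a hydrological-connectivity index, and a
# rainfall weight. All values below are invented for illustration.

landcover = [  # runoff-generation weight per cell (0..1)
    [0.9, 0.7, 0.2],
    [0.8, 0.5, 0.2],
    [0.3, 0.3, 0.1],
]
connectivity = [  # fraction of runoff reaching the impacted community
    [0.2, 0.5, 0.9],
    [0.3, 0.6, 0.9],
    [0.4, 0.7, 1.0],
]
rainfall = [  # relative weight of flood-generating rainfall
    [1.0, 1.0, 0.8],
    [1.0, 0.9, 0.8],
    [0.9, 0.9, 0.7],
]

risk = [
    [l * c * r for l, c, r in zip(lrow, crow, rrow)]
    for lrow, crow, rrow in zip(landcover, connectivity, rainfall)
]
hotspot = max((v, (i, j)) for i, row in enumerate(risk) for j, v in enumerate(row))
print(f"highest-risk cell {hotspot[1]} with score {hotspot[0]:.2f}")
```

    Cells with the highest combined scores are the candidate locations for intervention measures, which the second-stage hydrological model then tests.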

  6. Big Data, indispensable today

    Directory of Open Access Journals (Sweden)

    Radu-Ioan ENACHE

    2015-10-01

    Full Text Available Big data is, and will increasingly be, used as a tool for everything that happens both online and offline. Online, of course, is a real habitat for Big Data: it is found throughout this medium, offering many advantages and providing real help for all consumers. In this paper we discuss Big Data as an asset in developing new applications, by gathering useful information about users and their behaviour. We also present the key aspects of real-time monitoring and the architecture principles of this technology. The most important benefit brought by this paper is presented in the cloud section.

  7. Antigravity and the big crunch/big bang transition

    Science.gov (United States)

    Bars, Itzhak; Chen, Shih-Hung; Steinhardt, Paul J.; Turok, Neil

    2012-08-01

    We point out a new phenomenon which seems to be generic in 4d effective theories of scalar fields coupled to Einstein gravity, when applied to cosmology. A lift of such theories to a Weyl-invariant extension allows one to define classical evolution through cosmological singularities unambiguously, and hence construct geodesically complete background spacetimes. An attractor mechanism ensures that, at the level of the effective theory, generic solutions undergo a big crunch/big bang transition by contracting to zero size, passing through a brief antigravity phase, shrinking to zero size again, and re-emerging into an expanding normal gravity phase. The result may be useful for the construction of complete bouncing cosmologies like the cyclic model.

  8. Antigravity and the big crunch/big bang transition

    Energy Technology Data Exchange (ETDEWEB)

    Bars, Itzhak [Department of Physics and Astronomy, University of Southern California, Los Angeles, CA 90089-2535 (United States); Chen, Shih-Hung [Perimeter Institute for Theoretical Physics, Waterloo, ON N2L 2Y5 (Canada); Department of Physics and School of Earth and Space Exploration, Arizona State University, Tempe, AZ 85287-1404 (United States); Steinhardt, Paul J., E-mail: steinh@princeton.edu [Department of Physics and Princeton Center for Theoretical Physics, Princeton University, Princeton, NJ 08544 (United States); Turok, Neil [Perimeter Institute for Theoretical Physics, Waterloo, ON N2L 2Y5 (Canada)

    2012-08-29

    We point out a new phenomenon which seems to be generic in 4d effective theories of scalar fields coupled to Einstein gravity, when applied to cosmology. A lift of such theories to a Weyl-invariant extension allows one to define classical evolution through cosmological singularities unambiguously, and hence construct geodesically complete background spacetimes. An attractor mechanism ensures that, at the level of the effective theory, generic solutions undergo a big crunch/big bang transition by contracting to zero size, passing through a brief antigravity phase, shrinking to zero size again, and re-emerging into an expanding normal gravity phase. The result may be useful for the construction of complete bouncing cosmologies like the cyclic model.

  9. Antigravity and the big crunch/big bang transition

    International Nuclear Information System (INIS)

    Bars, Itzhak; Chen, Shih-Hung; Steinhardt, Paul J.; Turok, Neil

    2012-01-01

    We point out a new phenomenon which seems to be generic in 4d effective theories of scalar fields coupled to Einstein gravity, when applied to cosmology. A lift of such theories to a Weyl-invariant extension allows one to define classical evolution through cosmological singularities unambiguously, and hence construct geodesically complete background spacetimes. An attractor mechanism ensures that, at the level of the effective theory, generic solutions undergo a big crunch/big bang transition by contracting to zero size, passing through a brief antigravity phase, shrinking to zero size again, and re-emerging into an expanding normal gravity phase. The result may be useful for the construction of complete bouncing cosmologies like the cyclic model.

  10. Big data: een zoektocht naar instituties

    NARCIS (Netherlands)

    van der Voort, H.G.; Crompvoets, J

    2016-01-01

    Big data is a well-known phenomenon, even a buzzword nowadays. It refers to an abundance of data and new possibilities to process and use them. Big data is subject of many publications. Some pay attention to the many possibilities of big data, others warn us for their consequences. This special

  11. Data, Data, Data : Big, Linked & Open

    NARCIS (Netherlands)

    Folmer, E.J.A.; Krukkert, D.; Eckartz, S.M.

    2013-01-01

    The entire business and IT world is currently talking about Big Data, a trend that overtook Cloud Computing in mid-2013 (based on Google Trends). Policymakers are also actively engaged with Big Data. Neelie Kroes, vice-president of the European Commission, speaks of the ‘Big Data

  12. Interconnected ponds operation for flood hazard distribution

    Science.gov (United States)

    Putra, S. S.; Ridwan, B. W.

    2016-05-01

    Climatic anomalies that bring extreme rainfall can sharply increase the flood hazard in an area within a short period of time. The capacity of a river to discharge a flood is not uniform along its reach and is sensitive to the flood peak. This paper presents alternatives for locating flood retention ponds that are physically feasible and reduce the flood peak. The ponds were designed using flood curve number criteria (TR-55, USDA), with the aim of capturing the flood peak rapidly and returning the stored water gradually to the river. As a case study, the hydrologic condition of the upper Ciliwung river basin with several presumed flood pond locations was conceptually designed. A fundamental tank model reproducing the operation of the interconnected ponds was elaborated to achieve the design flood discharge that flows to the downstream area. The flood hazard distribution status, as the model performance criterion, was computed for the Ciliwung river reach at the Manggarai Sluice Gate. The predicted hazard reduction with operation of the interconnected retention areas was benchmarked against the normal flow condition.
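
    The TR-55 curve-number criterion mentioned above estimates direct runoff from a storm depth. A minimal sketch of the standard SCS runoff equation, with an assumed curve number and rainfall depth rather than values from the paper:

```python
def scs_runoff(P_in, CN):
    """SCS (TR-55) curve-number direct runoff.

    P_in : storm rainfall depth (inches)
    CN   : curve number (30..100); higher means more runoff
    Returns direct runoff depth Q (inches).
    """
    S = 1000.0 / CN - 10.0   # potential maximum retention (inches)
    Ia = 0.2 * S             # initial abstraction (standard TR-55 ratio)
    if P_in <= Ia:
        return 0.0
    return (P_in - Ia) ** 2 / (P_in - Ia + S)

# Hypothetical urbanizing catchment: CN = 85, 4-inch design storm.
print(f"runoff: {scs_runoff(4.0, 85):.2f} in")
```

    Sizing a retention pond to capture the flood peak amounts to storing enough of this runoff depth, times the contributing area, during the rising limb.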

  13. Compound simulation of fluvial floods and storm surges in a global coupled river-coast flood model: Model development and its application to 2007 Cyclone Sidr in Bangladesh

    Science.gov (United States)

    Ikeuchi, Hiroaki; Hirabayashi, Yukiko; Yamazaki, Dai; Muis, Sanne; Ward, Philip J.; Winsemius, Hessel C.; Verlaan, Martin; Kanae, Shinjiro

    2017-08-01

    Water-related disasters, such as fluvial floods and cyclonic storm surges, are a major concern in the world's mega-delta regions. Furthermore, the simultaneous occurrence of extreme discharges from rivers and storm surges could exacerbate flood risk, compared to when they occur separately. Hence, it is of great importance to assess the compound risks of fluvial and coastal floods at a large scale, including mega-deltas. However, most studies on compound fluvial and coastal flooding have been limited to relatively small scales, and global-scale or large-scale studies have not yet addressed both of them. The objectives of this study are twofold: to develop a global coupled river-coast flood model; and to conduct a simulation of compound fluvial flooding and storm surges in Asian mega-delta regions. A state-of-the-art global river routing model was modified to represent the influence of dynamic sea surface levels on river discharges and water levels. We conducted the experiments by coupling a river model with a global tide and surge reanalysis data set. Results show that water levels in deltas and estuaries are greatly affected by the interaction between river discharge, ocean tides and storm surges. The effects of storm surges on fluvial flooding are further examined from a regional perspective, focusing on the case of Cyclone Sidr in the Ganges-Brahmaputra-Meghna Delta in 2007. Modeled results demonstrate that a >3 m storm surge propagated more than 200 km inland along rivers. We show that the performance of global river routing models can be improved by including sea level dynamics.

  14. Communicating Flood Risk with Street-Level Data

    Science.gov (United States)

    Sanders, B. F.; Matthew, R.; Houston, D.; Cheung, W. H.; Karlin, B.; Schubert, J.; Gallien, T.; Luke, A.; Contreras, S.; Goodrich, K.; Feldman, D.; Basolo, V.; Serrano, K.; Reyes, A.

    2015-12-01

    Coastal communities around the world face significant and growing flood risks that require an accelerating adaptation response, and fine-resolution urban flood models could serve a pivotal role in enabling communities to meet this need. Such models depict impacts at the level of individual buildings and land parcels or "street level" - the same spatial scale at which individuals are best able to process flood risk information - constituting a powerful tool to help communities build better understandings of flood vulnerabilities and identify cost-effective interventions. To measure understanding of flood risk within a community and the potential impact of street-level models, we carried out a household survey of flood risk awareness in Newport Beach, California, a highly urbanized coastal lowland that presently experiences nuisance flooding from high tides, waves and rainfall and is expected to experience a significant increase in flood frequency and intensity with climate change. Interviews were completed with the aid of a wireless-enabled tablet device that respondents could use to identify areas they understood to be at risk of flooding and to view either a Federal Emergency Management Agency (FEMA) flood map or a more detailed map prepared with a hydrodynamic urban coastal flood model (UCI map) built with grid cells as fine as 3 m resolution and validated with historical flood data. Results indicate differences in the effectiveness of the UCI and FEMA maps at communicating the spatial distribution of flood risk, gender differences in how the maps affect flood understanding, and spatial biases in the perception of flood vulnerabilities.

  15. An Investigation on the Sensitivity of the Parameters of Urban Flood Model

    Science.gov (United States)

    M, A. B.; Lohani, B.; Jain, A.

    2015-12-01

    Global climate change has triggered weather patterns that lead to heavy and sudden rainfall in different parts of the world. The impact of heavy rainfall is especially severe on urban areas, in the form of urban flooding. In order to understand the effect of flooding induced by heavy rainfall, it is necessary to model the entire flooding scenario accurately, which is now becoming possible with the availability of high-resolution airborne LiDAR data and other real-time observations. However, there is not much understanding of the optimal use of these data, or of the effect of other parameters on the performance of the flood model. This study aims at developing understanding of these issues. In view of the above, the aims of this study are to (i) understand how the use of high-resolution LiDAR data improves the performance of an urban flood model, and (ii) understand the sensitivity of various hydrological parameters in urban flood modelling. In this study, modelling of flooding in urban areas due to heavy rainfall is carried out considering the Indian Institute of Technology (IIT) Kanpur, India, as the study site. The existing model MIKE FLOOD, which is accepted by the Federal Emergency Management Agency (FEMA), is used along with high-resolution airborne LiDAR data. Once the model is set up, it is run while varying parameters such as the resolution of the Digital Surface Model (DSM), Manning's roughness, initial losses, catchment description, concentration time, and runoff reduction factor. In order to realize this, the results obtained from the model are compared with field observations. The parametric study carried out in this work demonstrates that the selection of catchment description plays a very important role in urban flood modelling. Results also show the significant impact of DSM resolution, initial losses, and concentration time on the urban flood model. This study will help in understanding the effect of various parameters that should be part of a

  16. Assessment of channel changes, model of historical floods, and effects of backwater on flood stage, and flood mitigation alternatives for the Wichita River at Wichita Falls, Texas

    Science.gov (United States)

    Winters, Karl E.; Baldys, Stanley

    2011-01-01

    In cooperation with the City of Wichita Falls, the U.S. Geological Survey assessed channel changes on the Wichita River at Wichita Falls, Texas, and modeled historical floods to investigate possible causes and potential mitigation alternatives to higher flood stages in recent (2007 and 2008) floods. Extreme flooding occurred on the Wichita River on June 30, 2007, inundating 167 homes in Wichita Falls. Although a record flood stage was reached in June 2007, the peak discharge was much less than some historical floods at Wichita Falls. Streamflow and stage data from two gages on the Wichita River and one on Holliday Creek were used to assess the interaction of the two streams. Changes in the Wichita River channel were evaluated using historical aerial and ground photography, comparison of recent and historical cross sections, and comparison of channel roughness coefficients with those from earlier studies. The floods of 2007 and 2008 were modeled using a one-dimensional step-backwater model. Calibrated channel roughness was larger for the 2007 flood compared to the 2008 flood, and the 2007 flood peaked about 4 feet higher than the 2008 flood. Calibration of the 1941 flood yielded a channel roughness coefficient (Manning's n) of 0.030, which represents a fairly clean natural channel. The step-backwater model was also used to evaluate the following potential mitigation alternatives: (1) increasing the capacity of the bypass channel near River Road in Wichita Falls, Texas; (2) removal of obstructions near the Scott Avenue and Martin Luther King Junior Boulevard bridges in Wichita Falls, Texas; (3) widening of aggraded channel banks in the reach between Martin Luther King Junior Boulevard and River Road; and (4) reducing channel bank and overbank roughness. Reductions in water-surface elevations ranged from 0.1 foot to as much as 3.0 feet for the different mitigation alternatives. 
The effects of implementing a combination of different flood-mitigation alternatives were
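
    The step-backwater computations and calibrated roughness coefficients referenced above rest on Manning's equation. A minimal sketch solving for normal depth in a rectangular channel by bisection; the channel width, slope, discharge, and n = 0.030 (the "fairly clean natural channel" value cited for the 1941 flood) are assumed for illustration, not taken from the study's cross sections.

```python
def manning_discharge(depth_ft, width_ft, slope, n):
    """Discharge (cfs) for a rectangular channel via Manning's equation (US units)."""
    area = width_ft * depth_ft
    wetted_perimeter = width_ft + 2 * depth_ft
    R = area / wetted_perimeter  # hydraulic radius
    return (1.49 / n) * area * R ** (2.0 / 3.0) * slope ** 0.5

def normal_depth(Q_cfs, width_ft, slope, n, lo=0.01, hi=100.0):
    """Bisection for the depth at which Manning's equation yields Q_cfs."""
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if manning_discharge(mid, width_ft, slope, n) < Q_cfs:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Hypothetical reach: 200 ft wide, slope 0.0005, n = 0.030.
d = normal_depth(10_000, 200, 0.0005, 0.030)
print(f"normal depth: {d:.2f} ft")
```

    Because discharge grows with n held fixed only if depth grows, a higher calibrated n (as in the 2007 flood) forces a higher water surface for the same discharge, which is the mechanism behind the record 2007 stage at a sub-record peak flow.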

  17. The Complex Relationship Between Heavy Storms and Floods: Implication on Stormwater Drainage design and Management

    Science.gov (United States)

    Demissie, Y.; Mortuza, M. R.; Moges, E.; Yan, E.; Li, H. Y.

    2017-12-01

    Due to the lack of historical and future streamflow data for flood frequency analysis at or near most drainage sites, it is common practice to estimate the design flood (the maximum discharge or volume of a stream for a given return period) directly from storm frequency analysis and the resulting Intensity-Duration-Frequency (IDF) curves. Such analysis assumes a direct relationship between storms and floods, with, for example, the 10-year rainfall expected to produce the 10-year flood. In reality, however, a storm is just one factor among the many hydrological and meteorological factors that can affect the peak flow and hydrograph. Consequently, a heavy storm does not necessarily lead to flooding, or to a flood event of the same frequency. This is evident from the observed difference in the seasonality of heavy storms and floods in most regions. In order to understand the site-specific cause-effect relationship between heavy storms and floods and improve flood analysis for stormwater drainage design and management, we examined the contributions of the various factors that affect floods using statistical and information-theory methods. Based on the identified dominant cause-effect relationships, hydrologic and probability analyses were conducted to develop runoff IDF curves, taking into consideration snowmelt and rain-on-snow effects, the difference in storm and flood seasonality, soil moisture conditions, and catchment potential for flash and riverine flooding. The approach was demonstrated using data from military installations located in different parts of the United States. The accuracy of the flood frequency analysis and the resulting runoff IDF curves was evaluated against runoff IDF curves developed from streamflow measurements.

  18. Merging information from multi-model flood projections in a hierarchical Bayesian framework

    Science.gov (United States)

    Le Vine, Nataliya

    2016-04-01

    Multi-model ensembles are becoming widely accepted for flood frequency change analysis. The use of multiple models results in large uncertainty around estimates of flood magnitudes, due to both uncertainty in model selection and natural variability of river flow. The challenge is therefore to extract the most meaningful signal from the multi-model predictions, accounting for both model quality and uncertainties in individual model estimates. The study demonstrates the potential of a recently proposed hierarchical Bayesian approach to combine information from multiple models. The approach facilitates explicit treatment of shared multi-model discrepancy as well as the probabilistic nature of the flood estimates, by treating the available models as a sample from a hypothetical complete (but unobserved) set of models. The advantages of the approach are: 1) to ensure adequate 'baseline' conditions with which to compare future changes; 2) to reduce flood estimate uncertainty; 3) to maximize use of statistical information in circumstances where multiple weak predictions individually lack power, but collectively provide meaningful information; 4) to adjust multi-model consistency criteria when model biases are large; and 5) to explicitly consider the influence of the (model performance) stationarity assumption. Moreover, the analysis indicates that reducing shared model discrepancy is the key to further reduction of uncertainty in the flood frequency analysis. The findings are of value regarding how conclusions about changing exposure to flooding are drawn, and to flood frequency change attribution studies.
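
    The core idea of borrowing strength across models can be illustrated with simple precision-weighted (inverse-variance) pooling under a normal-normal assumption. The per-model estimates and variances below are invented, and the paper's full hierarchical treatment (shared discrepancy, unobserved model set) is considerably richer than this sketch.

```python
# Each model m reports a 100-year flood estimate x_m with variance v_m.
# Treating them as noisy draws around a common mean gives the familiar
# inverse-variance combination. All numbers are invented for illustration.

estimates = [850.0, 910.0, 780.0, 880.0]   # per-model estimates (m^3/s)
variances = [400.0, 900.0, 2500.0, 625.0]  # per-model estimate variances

weights = [1.0 / v for v in variances]
pooled_mean = sum(w * x for w, x in zip(weights, estimates)) / sum(weights)
pooled_var = 1.0 / sum(weights)

print(f"pooled estimate: {pooled_mean:.1f} m^3/s (sd {pooled_var ** 0.5:.1f})")
```

    The pooled variance is smaller than any single model's variance, which is the sense in which weak individual predictions collectively provide meaningful information; the hierarchical version additionally inflates this variance to account for shared model discrepancy.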

  19. Big data analytics methods and applications

    CERN Document Server

    Rao, BLS; Rao, SB

    2016-01-01

    This book is a collection of articles written by Big Data experts describing some of the cutting-edge methods and applications from their respective areas of interest, and provides the reader with a detailed overview of the field of Big Data Analytics as it is practiced today. The chapters cover technical aspects of key areas that generate and use Big Data, such as management and finance; medicine and healthcare; genome, cytome and microbiome; graphs and networks; Internet of Things; Big Data standards; benchmarking of systems; and others. In addition to different applications, key algorithmic approaches such as graph partitioning, clustering and finite mixture modelling of high-dimensional data are also covered. The varied collection of themes in this volume introduces the reader to the richness of the emerging field of Big Data Analytics.

  20. The Big bang and the Quantum

    Science.gov (United States)

    Ashtekar, Abhay

    2010-06-01

    General relativity predicts that space-time comes to an end and physics comes to a halt at the big bang. Recent developments in loop quantum cosmology have shown that these predictions cannot be trusted. Quantum geometry effects can resolve singularities, thereby opening new vistas. Examples are: the big bang is replaced by a quantum bounce; the 'horizon problem' disappears; immediately after the big bounce, there is a super-inflationary phase with its own phenomenological ramifications; and, in the presence of a standard inflation potential, initial conditions are naturally set for a long, slow-roll inflation independently of what happens in the pre-big-bang branch. As in my talk at the conference, I will first discuss the foundational issues and then the implications of the new Planck scale physics near the Big Bang.

  1. Structural master plan of flood mitigation measures

    Directory of Open Access Journals (Sweden)

    A. Heidari

    2009-01-01

    Full Text Available Flood protection is one of the practical methods of damage reduction. Although it is not possible to be completely protected from flood disaster, a major part of the damage can be reduced by mitigation plans. In this paper, the optimum flood mitigation master plan is determined by economic evaluation, trading off the construction costs against the expected value of damage reduction as the benefit. The size of a given mitigation alternative is also obtained by risk analysis, accepting some possibility of flood overtopping. Different flood mitigation alternatives are investigated from various aspects in the Dez and Karun river floodplain areas as a case study in southwest Iran. The results show that detention dams and flood diversion are the best flood mitigation alternatives, along with enforcing the flood control purpose of upstream multipurpose reservoirs. Dykes and levees are mostly not justifiable because of their negative downstream impact, increasing the routed flood peak discharge and flood damages.
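The economic evaluation described above trades construction cost against expected damage reduction. The damage side of that trade-off is conventionally summarized as expected annual damage (EAD), obtained by integrating damage over exceedance probability. A minimal sketch with illustrative figures (not the paper's data):

```python
def expected_annual_damage(damage_by_return_period):
    """Expected annual damage by trapezoidal integration of damage
    over annual exceedance probability p = 1/T, given a mapping of
    return period (years) to damage for that event."""
    pts = sorted((1.0 / T, d) for T, d in damage_by_return_period.items())
    ead = 0.0
    for (p0, d0), (p1, d1) in zip(pts, pts[1:]):
        ead += 0.5 * (d0 + d1) * (p1 - p0)   # trapezoid between quantiles
    return ead
```

Running this for each mitigation alternative and subtracting from the no-action EAD gives the annual benefit to compare against annualized construction cost.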

  2. Big Bang baryosynthesis

    International Nuclear Information System (INIS)

    Turner, M.S.; Chicago Univ., IL

    1983-01-01

    In these lectures I briefly review Big Bang baryosynthesis. In the first lecture I discuss the evidence which exists for the BAU, the failure of non-GUT symmetrical cosmologies, the qualitative picture of baryosynthesis, and numerical results of detailed baryosynthesis calculations. In the second lecture I discuss the requisite CP violation in some detail, as well as the statistical mechanics of baryosynthesis, possible complications to the simplest scenario, and one cosmological implication of Big Bang baryosynthesis. (orig./HSI)

  3. Do flood risk perceptions provide useful insights for flood risk management? Findings from central Vietnam

    NARCIS (Netherlands)

    Bubeck, P.; Botzen, W.J.W.; Suu, L.T.T.; Aerts, J.C.J.H.

    2012-01-01

    Following the renewed attention for non-structural flood risk reduction measures implemented at the household level, there has been an increased interest in individual flood risk perceptions. The reason for this is the commonly-made assumption that flood risk perceptions drive the motivation of

  4. Exploiting big data for critical care research.

    Science.gov (United States)

    Docherty, Annemarie B; Lone, Nazir I

    2015-10-01

    Over recent years the digitalization, collection and storage of vast quantities of data, in combination with advances in data science, have opened up a new era of big data. In this review, we define big data, identify examples of critical care research using big data, discuss the limitations and ethical concerns of using these large datasets and finally consider scope for future research. Big data refers to datasets whose size, complexity and dynamic nature are beyond the scope of traditional data collection and analysis methods. The potential benefits to critical care are significant, with faster progress in improving health and better value for money. Although not replacing clinical trials, big data can improve their design and advance the field of precision medicine. However, there are limitations to analysing big data using observational methods. In addition, there are ethical concerns regarding maintaining the confidentiality of patients who contribute to these datasets. Big data have the potential to improve medical care and reduce costs, both by individualizing medicine and by bringing together multiple sources of data about individual patients. As big data become increasingly mainstream, it will be important to maintain public confidence by safeguarding data security, governance and confidentiality.

  5. Composite Flood Risk for the Virgin Islands

    Science.gov (United States)

    The Composite Flood Risk layer combines flood hazard datasets from Federal Emergency Management Agency (FEMA) flood zones, NOAA's Shallow Coastal Flooding, and the National Hurricane Center SLOSH model for storm surge inundation for category 1, 2, and 3 hurricanes. Geographic areas are represented by a grid of 10 by 10 meter cells, and each cell has a ranking based on variation in exposure to flooding hazards: Moderate, High and Extreme exposure. Geographic areas in each input layer are ranked based on their probability of flood risk exposure. The logic was such that areas exposed to flooding on a more frequent basis were given a higher ranking; thus the ranking incorporates the probability of the area being flooded. For example, even though a Category 3 storm surge has higher flooding elevations, the likelihood of its occurrence is lower than that of a Category 1 storm surge, and therefore the Category 3 flood area is given a lower exposure ranking. Extreme exposure areas are those exposed to relatively frequent flooding. The ranked input layers are then converted to a raster for the creation of the composite risk layer using cell statistics in spatial analysis. The highest exposure ranking for a given cell in any of the three input layers is assigned to the corresponding cell in the composite layer. For example, if a cell is ranked as Moderate in the FEMA layer, Moderate in the SLOSH layer, but Extreme in the SCF layer, the cell will be considered Extreme in the composite layer.
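The cell-statistics step described above, assigning each composite cell the highest ranking found in any input layer, reduces to an element-wise maximum over equally shaped grids. A toy sketch with integer-encoded ranks (the layer values are illustrative, not real data):

```python
# Rankings encoded as integers: 0 = none, 1 = Moderate, 2 = High, 3 = Extreme.
def composite_risk(*layers):
    """Cell-by-cell maximum across any number of equally shaped ranked grids."""
    return [
        [max(cells) for cells in zip(*rows)]   # max of aligned cells per column
        for rows in zip(*layers)               # aligned rows across layers
    ]

fema  = [[1, 2], [0, 3]]
slosh = [[2, 1], [1, 1]]
scf   = [[3, 0], [0, 2]]
# composite_risk(fema, slosh, scf) -> [[3, 2], [1, 3]]
```

In a GIS workflow the same operation is performed by a raster cell-statistics tool with the MAXIMUM statistic; the pure-Python version just makes the logic explicit.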

  6. Empathy and the Big Five

    OpenAIRE

    Paulus, Christoph

    2016-01-01

    More than 10 years ago, Del Barrio et al. (2004) attempted to establish a direct relationship between empathy and the Big Five. On average, the women in their sample had higher scores on empathy and on the Big Five factors, with the exception of Neuroticism. They found associations between empathy and Openness, Agreeableness, Conscientiousness and Extraversion. In our data, women likewise score significantly higher on both empathy and the Big Five...

  7. A methodology to derive Synthetic Design Hydrographs for river flood management

    Science.gov (United States)

    Tomirotti, Massimo; Mignosa, Paolo

    2017-12-01

    The design of flood protection measures requires in many cases not only the estimation of peak discharges, but also of the volume of floods and its time distribution. A typical solution to this kind of problem is the formulation of Synthetic Design Hydrographs (SDHs). In this paper a methodology to derive SDHs is proposed on the basis of the estimation of the Flow Duration Frequency (FDF) reduction curve and of a Peak-Duration (PD) relationship, furnishing respectively the quantiles of the maximum average discharge and the average peak position for each duration. The methodology is intended to synthesize the main features of the historical floods in a unique SDH for each return period. The shape of the SDH is not selected a priori but results from the behaviour of the FDF and PD curves, allowing the variability of the shapes of the observed hydrographs at the local time scale to be accounted for in a very convenient way. The validation of the methodology is performed with reference to flood routing problems in reservoirs, lakes and rivers. The results obtained demonstrate the capability of the SDHs to describe the effects of different hydraulic systems on the statistical regime of floods, even in the presence of strong modifications induced on the probability distribution of peak flows.

  8. Integrated Urban Flood Analysis considering Optimal Operation of Flood Control Facilities in Urban Drainage Networks

    Science.gov (United States)

    Moon, Y. I.; Kim, M. S.; Choi, J. H.; Yuk, G. M.

    2017-12-01

    Heavy rainfall has become a recent major cause of urban area flooding due to climate change and urbanization. To prevent property damage along with casualties, a system which can alert and forecast urban flooding must be developed. Urban drainage facilities can be expected to perform best at reducing flood damage when operated during smaller rainfall events rather than extreme ones. Thus, the purpose of this study is to implement: A) a flood forecasting system using runoff analysis based on short-term rainfall; and B) a flood warning system which operates based on the data from pump stations and rainwater storage in urban basins. The analysis shows that operating urban drainage facilities using short-term rainfall forecast data from radar is more effective at reducing urban flood damage than using only the inflow data of the facility. Keywords: Heavy Rainfall, Urban Flood, Short-term Rainfall Forecasting, Optimal operation of urban drainage facilities. Acknowledgments: This research was supported by a grant (17AWMP-B066744-05) from the Advanced Water Management Research Program (AWMP) funded by the Ministry of Land, Infrastructure and Transport of the Korean government.

  9. Validation of individual and aggregate global flood hazard models for two major floods in Africa.

    Science.gov (United States)

    Trigg, M.; Bernhofen, M.; Whyman, C.

    2017-12-01

    A recent intercomparison of global flood hazard models undertaken by the Global Flood Partnership shows that there is an urgent need for more validation of the models against flood observations. As part of the intercomparison, the aggregated model dataset resulting from the project was provided as open access data. We compare the individual and aggregated flood extent output from the six global models and test these against two major floods on the African continent within the last decade, namely severe flooding on the Niger River in Nigeria in 2012 and on the Zambezi River in Mozambique in 2007. We test whether aggregating different numbers and combinations of models increases model fit to the observations compared with the individual model outputs. We present results that illustrate some of the challenges of comparing imperfect models with imperfect observations, and also of defining the probability of a real event in order to test standard model output probabilities. Finally, we propose a collective set of open access validation flood events, with associated observational data and descriptions, that provide a standard set of tests across different climates and hydraulic conditions.
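Comparing a modelled flood extent against an observed one is often summarized with a single skill score over binary wet/dry grids; the Critical Success Index (threat score) is a common choice in flood-extent validation. This is a generic sketch, not necessarily the metric used in this study:

```python
def critical_success_index(modelled, observed):
    """CSI = hits / (hits + misses + false alarms) over flat binary
    wet/dry grids; 1.0 is a perfect match, 0.0 no overlap at all."""
    hits = sum(1 for m, o in zip(modelled, observed) if m and o)
    misses = sum(1 for m, o in zip(modelled, observed) if not m and o)
    false_alarms = sum(1 for m, o in zip(modelled, observed) if m and not o)
    return hits / (hits + misses + false_alarms)

# One cell correctly wet, one false alarm, one miss:
# critical_success_index([1, 1, 0, 0], [1, 0, 1, 0]) -> 1/3
```

Because dry-dry agreement is excluded from the denominator, the score is not inflated by the large dry fraction of a river-reach domain.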

  10. Crowdsourcing detailed flood data

    Science.gov (United States)

    Walliman, Nicholas; Ogden, Ray; Amouzad*, Shahrzhad

    2015-04-01

    Over the last decade the average annual loss across the European Union due to flooding has been 4.5bn Euros, but increasingly intense rainfall, as well as population growth, urbanisation and the rising costs of asset replacements, may see this rise to 23bn Euros a year by 2050. Equally disturbing are the profound social costs to individuals, families and communities which in addition to loss of lives include: loss of livelihoods, decreased purchasing and production power, relocation and migration, adverse psychosocial effects, and hindrance of economic growth and development. Flood prediction, management and defence strategies rely on the availability of accurate information and flood modelling. Whilst automated data gathering (by measurement and satellite) of the extent of flooding is already advanced it is least reliable in urban and physically complex geographies where often the need for precise estimation is most acute. Crowdsourced data of actual flood events is a potentially critical component of this allowing improved accuracy in situations and identifying the effects of local landscape and topography where the height of a simple kerb, or discontinuity in a boundary wall can have profound importance. Mobile 'App' based data acquisition using crowdsourcing in critical areas can combine camera records with GPS positional data and time, as well as descriptive data relating to the event. This will automatically produce a dataset, managed in ArcView GIS, with the potential for follow up calls to get more information through structured scripts for each strand. Through this local residents can provide highly detailed information that can be reflected in sophisticated flood protection models and be core to framing urban resilience strategies and optimising the effectiveness of investment. 
This paper will describe this pioneering approach that will develop flood event data in support of systems that will advance existing approaches such as those developed in the UK

  11. Structural evaluation of multifunctional flood defenses

    NARCIS (Netherlands)

    Voorendt, M.Z.; Kothuis, Baukje; Kok, Matthijs

    2017-01-01

    Flood risk reduction aims to minimize losses in low-lying areas. One of the ways to reduce flood risks is to protect land by means of flood defenses. The Netherlands has a long tradition of flood protection and, therefore, a wide variety of technical reports written

  12. Real-time flood monitoring and warning system

    Directory of Open Access Journals (Sweden)

    Jirapon Sunkpho

    2011-04-01

    Full Text Available Flooding is one of the major disasters occurring in various parts of the world. A system for real-time monitoring of water conditions (water level, flow, and precipitation level) was developed to be employed in monitoring flood in Nakhon Si Thammarat, a southern province in Thailand. The two main objectives of the developed system are to serve 1) as an information channel for flooding between the involved authorities and experts, to enhance their responsibilities and collaboration, and 2) as a web-based information source for the public, responding to their need for information on water conditions and flooding. The developed system is composed of three major components: a sensor network, a processing/transmission unit, and a database/application server. Real-time data on water conditions can be monitored remotely by utilizing a wireless sensor network that uses mobile General Packet Radio Service (GPRS) communication to transmit measured data to the application server. We implemented a so-called VirtualCOM, a middleware that enables the application server to communicate with the remote sensors connected to a GPRS data unit (GDU). With VirtualCOM, a GDU behaves as if it were a cable directly connecting the remote sensors to the application server. The application server is a web-based system implemented using PHP and JAVA as the web application and MySQL as its relational database. Users can view real-time water conditions, as well as forecasts of water conditions, directly from the web via a web browser or via WAP. The developed system has demonstrated the applicability of today's sensors in wirelessly monitoring real-time water conditions.

  13. Flood hazards for nuclear power plants

    International Nuclear Information System (INIS)

    Yen, B.C.

    1988-01-01

    Flooding hazards for nuclear power plants may be caused by various external geophysical events. In this paper the hydrologic hazards from flash floods, river floods and heavy rain at the plant site are considered. Depending on the mode of analysis, two types of hazard evaluation are identified: 1) design hazard, which is the probability of flooding over an expected service period, and 2) operational hazard, which deals with real-time forecasting of the probability of flooding for an incoming event. Hazard evaluation techniques using flood frequency analysis can only be used for type 1), the design hazard. Evaluation techniques using rainfall-runoff simulation or multi-station correlation can be used for both types of hazard prediction. (orig.)
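The design hazard in sense 1) has a standard closed form under the usual assumption that annual flood maxima are independent: the probability that the T-year flood is exceeded at least once during an n-year service period.

```python
def design_hazard(return_period_years, service_years):
    """Probability of at least one exceedance of the T-year flood
    during an n-year service period, assuming independent years:
    P = 1 - (1 - 1/T)**n."""
    annual_p = 1.0 / return_period_years
    return 1.0 - (1.0 - annual_p) ** service_years

# A 100-year flood over a 50-year service life:
# design_hazard(100, 50) -> ~0.395
```

The result is often counterintuitive: a facility designed against the 100-year flood still faces roughly a 40% chance of experiencing it over a 50-year life, which is why safety-critical plants use much rarer design events.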

  14. Cache Management of Big Data in Equipment Condition Assessment

    Directory of Open Access Journals (Sweden)

    Ma Yan

    2016-01-01

    Full Text Available A big data platform for equipment condition assessment is built for comprehensive analysis. The platform has various application demands. According to response time, its applications can be divided into offline, interactive and real-time types. For real-time applications, data processing efficiency is important. In general, data caching is one of the most efficient ways to improve query time. However, big data caching is different from traditional data caching. In this paper we propose a distributed cache management framework of big data for equipment condition assessment. It consists of three parts: a cache structure, a cache replacement algorithm and a cache placement algorithm. The cache structure is the basis of the latter two algorithms. Based on the framework and algorithms, we make full use of the characteristic that only some valuable data are accessed during a period of time, and put relevant data on neighboring nodes, which largely reduces network transmission cost. We also validate the performance of our proposed approaches through extensive experiments, which demonstrate that the proposed cache replacement algorithm and cache management framework achieve a higher hit rate or lower query time than the LRU and round-robin algorithms.
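The LRU baseline referenced in the comparison can be sketched in a few lines with an ordered map; this is a generic single-node illustration, unrelated to the platform's actual distributed implementation:

```python
from collections import OrderedDict

class LRUCache:
    """Minimal LRU cache: evicts the least-recently-used entry when full."""

    def __init__(self, capacity):
        self.capacity = capacity
        self._data = OrderedDict()

    def get(self, key):
        if key not in self._data:
            return None
        self._data.move_to_end(key)          # mark as most recently used
        return self._data[key]

    def put(self, key, value):
        if key in self._data:
            self._data.move_to_end(key)
        self._data[key] = value
        if len(self._data) > self.capacity:
            self._data.popitem(last=False)   # evict the LRU entry
```

Replacement policies tailored to temporal access patterns, as the paper proposes, aim to beat this baseline by predicting which entries will be re-accessed within the current time window.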

  15. Bayesian flood forecasting methods: A review

    Science.gov (United States)

    Han, Shasha; Coulibaly, Paulin

    2017-08-01

    Over the past few decades, floods have been among the most common and most widely distributed natural disasters in the world. If floods could be accurately forecasted in advance, their negative impacts could be greatly reduced. It is widely recognized that quantification and reduction of the uncertainty associated with hydrologic forecasts is of great importance for flood estimation and rational decision making. The Bayesian forecasting system (BFS) offers an ideal theoretical framework for uncertainty quantification that can be developed for probabilistic flood forecasting via any deterministic hydrologic model. It provides a suitable theoretical structure, empirically validated models and reasonable analytic-numerical computation methods, and can be developed into various Bayesian forecasting approaches. This paper presents a comprehensive review of Bayesian forecasting approaches applied in flood forecasting from 1999 to the present. The review starts with an overview of the fundamentals of BFS and recent advances in BFS, followed by BFS applications in river stage forecasting and real-time flood forecasting; it then moves to a critical analysis evaluating the advantages and limitations of Bayesian forecasting methods and other predictive uncertainty assessment approaches in flood forecasting, and finally discusses future research directions in Bayesian flood forecasting. Results show that the Bayesian flood forecasting approach is an effective and advanced way to estimate floods: it considers all sources of uncertainty and produces a predictive distribution of the river stage, river discharge or runoff, thus giving more accurate and reliable flood forecasts. Some emerging Bayesian forecasting methods (e.g. the ensemble Bayesian forecasting system and Bayesian multi-model combination) were shown to overcome the limitations of a single model or fixed model weights and effectively reduce predictive uncertainty.
In recent years, various Bayesian flood forecasting approaches have been
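At the heart of any Bayesian forecasting system is the update of a forecast distribution by new information; for the Gaussian case this has a simple closed form. The following is a textbook conjugate-normal sketch, not a specific BFS implementation:

```python
def bayes_update(prior_mean, prior_var, obs, obs_var):
    """Conjugate normal update: combine a model forecast (the prior)
    with an observation of known error variance to get the posterior.
    The posterior variance is always smaller than either input variance."""
    gain = prior_var / (prior_var + obs_var)           # weight on the observation
    post_mean = prior_mean + gain * (obs - prior_mean)
    post_var = prior_var * obs_var / (prior_var + obs_var)
    return post_mean, post_var

# Forecast stage 10.0 m (var 4.0), gauge reads 14.0 m (var 4.0):
# bayes_update(10.0, 4.0, 14.0, 4.0) -> (12.0, 2.0)
```

The same update, applied recursively as observations arrive, is the scalar core of the Kalman-filter style assimilation used in many real-time forecasting chains.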

  16. Floods in a changing climate

    Science.gov (United States)

    Theresa K. Andersen; Marshall J. Shepherd

    2013-01-01

    Atmospheric warming and associated hydrological changes have implications for regional flood intensity and frequency. Climate models and hydrological models have the ability to integrate various contributing factors and assess potential changes to hydrology at global to local scales through the century. This survey of floods in a changing climate reviews flood...

  17. Coping with Pluvial Floods by Private Households

    Directory of Open Access Journals (Sweden)

    Viktor Rözer

    2016-07-01

    Full Text Available Pluvial floods have caused severe damage to urban areas in recent years. With a projected increase in extreme precipitation as well as an ongoing urbanization, pluvial flood damage is expected to increase in the future. Therefore, further insights, especially on the adverse consequences of pluvial floods and their mitigation, are needed. To gain more knowledge, empirical damage data from three different pluvial flood events in Germany were collected through computer-aided telephone interviews. Pluvial flood awareness as well as flood experience were found to be low before the respective flood events. The level of private precaution increased considerably after all events, but is mainly focused on measures that are easy to implement. Lower inundation depths, smaller potential losses as compared with fluvial floods, as well as the fact that pluvial flooding may occur everywhere, are expected to cause a shift in damage mitigation from precaution to emergency response. However, an effective implementation of emergency measures was constrained by a low dissemination of early warnings in the study areas. Further improvements of early warning systems including dissemination as well as a rise in pluvial flood preparedness are important to reduce future pluvial flood damage.

  18. Flood loss assessment in the Kota Tinggi

    International Nuclear Information System (INIS)

    Tam, T H; Ibrahim, A L; Rahman, M Z A; Mazura, Z

    2014-01-01

    Malaysia is free from several destructive and widespread natural disasters but is frequently affected by floods, which cause massive damage. In 2006 and 2007, extreme rainfall occurred in many parts of Peninsular Malaysia, causing severe flooding in several major cities. Kota Tinggi was chosen as the study area as it is one of the most seriously affected areas in Johor state. The aim of this study is to estimate potential flood damage to physical elements in Kota Tinggi. The flood damage map contains both qualitative and quantitative information corresponding to the consequences of flooding. This study focuses only on physical elements. Three different damage functions, from the United States, the Netherlands and Malaysia, were adopted to calculate the potential flood damage, with flood depth considered as the main parameter. The estimated flood damage for housing using the United States, Netherlands and Malaysia functions was RM 350/m², RM 200/m² and RM 100/m², respectively. These results show the average flood damage to physical elements. Such information is needed by local authorities and government for urban spatial planning aimed at reducing flood risk.
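A depth-damage function of the kind adopted above maps inundation depth to a unit damage value, typically by piecewise-linear interpolation between tabulated points. The curve below is purely illustrative (loosely anchored to the RM/m² averages quoted), not any of the three adopted functions:

```python
def depth_damage(depth_m, curve):
    """Piecewise-linear interpolation on a depth-damage curve given as
    sorted (depth_m, damage_per_m2) points. Values below the first point
    and above the last are clamped to the endpoints."""
    if depth_m <= curve[0][0]:
        return curve[0][1]
    for (d0, v0), (d1, v1) in zip(curve, curve[1:]):
        if depth_m <= d1:
            t = (depth_m - d0) / (d1 - d0)   # fraction of the way to next point
            return v0 + t * (v1 - v0)
    return curve[-1][1]

# Illustrative housing curve: no damage at 0 m, RM 350/m^2 at 2 m and above.
housing_curve = [(0.0, 0.0), (0.5, 100.0), (1.0, 200.0), (2.0, 350.0)]
# depth_damage(1.5, housing_curve) -> 275.0
```

Multiplying the unit value by the flooded floor area of each building, cell by cell over a depth raster, yields the kind of quantitative damage map the study describes.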

  19. Lessons Learned from Southeast Asian Floods

    Science.gov (United States)

    Osti, R.; Tanaka, S.

    2009-04-01

    At certain scales, floods have always been the lifeline of many people in Southeast Asian countries. People are traditionally accustomed to living with such floods, and their livelihoods are adjusted accordingly to optimize the benefits the floods bring. However, a large-scale flood occasionally turns into a disaster and causes massive destruction, not only in terms of human casualties but also damage to economic, ecological and social harmony in the region. Although economic growth is prevailing in relative terms, the capacity of people to cope with such extreme events is weakening, and the flood disaster risk is therefore increasing over time. Recent examples of flood disasters in the region clearly show the increasing severity of disaster impacts. This study reveals that many factors directly or indirectly influence the change. This paper considers the most prominent natural and socio-economic factors and analyzes their trends with respect to flood disasters in each country's context. A regional-scale comparative analysis further helps in exchanging know-how and determining what kind of strategy and policy are lacking to manage floods in the long run. It is also helpful in identifying the critical sectors that should be addressed first to mitigate potential damage from floods.

  20. 44 CFR 65.14 - Remapping of areas for which local flood protection systems no longer provide base flood protection.

    Science.gov (United States)

    2010-10-01

    ... local flood protection systems no longer provide base flood protection. 65.14 Section 65.14 Emergency... § 65.14 Remapping of areas for which local flood protection systems no longer provide base flood... process of restoring a flood protection system that was: (i) Constructed using Federal funds; (ii...