WorldWideScience

Sample records for demonstration flood big

  1. Big Muddy Field Low-Tension Flood Demonstration Project. Sixth annual report, April 1983-March 1984

    Energy Technology Data Exchange (ETDEWEB)

    1984-10-01

    The Big Muddy low-tension flood is a commercial-size demonstration project consisting of nine 10-acre injection patterns in the heart of the Big Muddy Oil Field located 15 miles east of Casper, Wyoming. The main goal of the project is to provide data for commercialization of the process for the Big Muddy Field and similar Wyoming and Colorado fields. Other objectives are discussed in previous annual reports. This report discusses the project performance during the polymer drive phase with emphasis on the analyses of oil cut, pattern balance, and early tracer response. The oil rate increased only slightly during 1983 and began to flatten near year-end at about 210 barrels per day or 12% of the injection rate. The injection rate was increased in late 1982 and early 1983 but simply resulted in a net input (influx plus injection) greater than production with only slight improvement in oil rate. In fact, the imbalance is suspected of contributing to the early flattening in oil production. Though the project oil rate flattened, an increased cut was observed in the north row of wells, indicating an oil response to slug injection in all wells except Well 54. Also during 1983, the polymer drive volume increased to about 10% of pore volume or to the midpoint of the polymer drive. Tracer and slug components have still appeared in only a few wells, even after 20% pore volume injection. Oil treating was becoming more troublesome but was relieved when the new treating facility was put into use. 5 references, 91 figures, 7 tables.

  2. Big Muddy Field low-tension flood-demonstration project. Fifth annual report, April 1982-March 1983

    Energy Technology Data Exchange (ETDEWEB)

    Painter, T.R.; Borah, M.T.; Ferrell, H.H.

    1983-08-01

    The Big Muddy low-tension flood is a commercial-size demonstration project consisting of nine 10-acre injection patterns in the heart of the Big Muddy Oil Field located 15 miles east of Casper, Wyoming. The main goal of the project is to provide data for commercialization of the process for the Big Muddy Field and similar Wyoming and Colorado fields. This report discusses the project performance during the last part of slug injection with particular emphasis on the analysis of the early oil response and the injectivity. Other work discussed in this report includes the pilot testing for an oil-treating facility which led to a new design. The oil production rate increased from about 75 BPD at year-end 1981 to about 170 BPD, or from about 4 percent to about 11 percent of the injection rate, in March of 1983. During the same period, the produced oil cut increased from 2 percent to about 5 percent. The low-tension slug injection was completed in August 1982 and injection of a polymer solution having the same mobility is continuing. The total low-tension slug volume was 873,000 barrels or 10.2 percent pore volume. 4 references, 91 figures, 19 tables.
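
    The slug-volume and rate figures quoted in these Big Muddy abstracts imply the project pore volume and injection rate directly. A quick consistency check (Python; only numbers stated in the abstracts are used, and the derived values are illustrative, not from the reports):

      # Back-of-the-envelope check of figures quoted in the Big Muddy abstracts.
      slug_volume_bbl = 873_000     # total low-tension slug volume (barrels)
      slug_fraction_pv = 0.102      # slug volume as a fraction of pore volume

      # Implied pore volume of the nine-pattern project area
      pore_volume_bbl = slug_volume_bbl / slug_fraction_pv
      print(f"Implied pattern pore volume: {pore_volume_bbl:,.0f} bbl")   # ~8.6 million

      # Oil rate quoted as ~170 BPD, or ~11% of the injection rate (March 1983)
      oil_rate_bpd = 170
      oil_rate_fraction = 0.11
      print(f"Implied injection rate: {oil_rate_bpd / oil_rate_fraction:,.0f} BPD")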

  3. Big Muddy Field Low-Tension Flood Demonstration Project. Third annual report, April 1980-March 1981

    Energy Technology Data Exchange (ETDEWEB)

    Davis, J.G.; Ferrell, H.H.; Stewart, W.C.

    1981-11-01

    Objectives of the project are: evaluate a commercial-scale field test using cost-optimized chemical slug size and composition; field test a surfactant system which could be made available in commercial quantities; demonstrate oil recovery effectiveness in multiple patterns; and demonstrate the feasibility of applying a low-tension process to low-permeability sands by using propped fractures in injection and producing wells. The first annual report dealt primarily with drilling, formation evaluation, and preliminary plant design. The second annual report emphasized plant construction and completion of laboratory work to specify the chemicals needed for the project. This report discusses the project operation during the preflush and problems arising during start-up of chemical injection. The most significant operating problem during the preflush was failure of the monel filter screens due to chlorine attack. The monel screens were replaced with polyester cloth screens. The cloth screens worked very well filtering the preflush water. After a short-term test in which the 200-square-foot filter showed that the cloth screens would also filter the polymer, polyester cloth screens were ordered as replacement screens for the 800-square-foot product filter. All of the construction and installation necessary for the chemical handling and blending phase were completed, individual components were checked out, and the low-tension slug injection was scheduled to begin in the latter half of January. In spite of the preparation, low-tension slug injection has been delayed because of continued faulty filter operation. The exact cause of the erratic filter operation is still being evaluated.

  4. Looking at the big scale - Global Flood Forecasting

    Science.gov (United States)

    Burek, P.; Alfieri, L.; Thielen-del Pozo, J.; Muraro, D.; Pappenberger, F.; Krzeminsk, B.

    2012-04-01

    Discharge forecasts are compared with three warning threshold maps. Results are displayed through a password-protected web portal where the members can browse, in an easy and intuitive way, different aspects of the most recent or past forecasts as spatially distributed information. Critical points in the river channels showing an increased probability of flooding over various forecasts are linked to time series of flood threshold exceedances in order to provide more detailed information. Although the system is still in its infancy and requires further research and development, rigorous testing and adaptations, it has already demonstrated its potential in recent catastrophic floods. The severe floods in Pakistan in July-August 2010 were clearly detected by the system as a major flood event. Recent examples are the floods in South-East Asia (mainly Thailand, Cambodia and Vietnam) in September-October 2011. For the lower Mekong River, probabilistic forecasts from the global simulations on 18th September 2011 showed a probability higher than 40% of exceeding the high alert level from 2nd to 4th October, hence 14 days in advance. Regarding the devastating monsoon flooding in Thailand, the peak flow of the Chao Phraya River was forecast from mid-September 2011, about 10-15 days before the actual peak occurred and the major losses took place.
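
    The "probability higher than 40%" statement above is an ensemble exceedance probability: the fraction of ensemble members whose forecast discharge crosses a warning threshold. A minimal sketch (Python/NumPy; the ensemble values and threshold are invented for illustration, not taken from the operational system):

      import numpy as np

      def exceedance_probability(ensemble_discharge: np.ndarray, threshold: float) -> float:
          """Fraction of ensemble members whose forecast discharge exceeds a threshold."""
          return float(np.mean(ensemble_discharge > threshold))

      # Hypothetical 51-member ensemble forecast of discharge (m^3/s) for one river point
      rng = np.random.default_rng(42)
      ensemble = rng.gamma(shape=4.0, scale=250.0, size=51)

      high_alert_threshold = 1200.0     # hypothetical high-alert discharge (m^3/s)
      p = exceedance_probability(ensemble, high_alert_threshold)
      print(f"P(exceed high alert) = {p:.0%}")   # a value above 40% would raise a warning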

  5. Demonstration of Black Liquor Gasification at Big Island

    Energy Technology Data Exchange (ETDEWEB)

    Robert DeCarrera

    2007-04-14

    This Final Technical Report provides an account of the project for the demonstration of Black Liquor Gasification at Georgia-Pacific LLC's Big Island, VA facility. This report covers the period from May 5, 2000 through November 30, 2006.

  6. Assessment of big floods in the Eastern Black Sea Basin of Turkey.

    Science.gov (United States)

    Yüksek, Ömer; Kankal, Murat; Üçüncü, Osman

    2013-01-01

    In this study, general knowledge and some details of the floods in the Eastern Black Sea Basin of Turkey are presented. A brief hydro-meteorological analysis of nine selected floods and a detailed analysis of the greatest flood are given. In the studied area, 51 big floods took place between 1955 and 2005, causing 258 deaths and nearly US $500,000,000 of damage. Most of the floods occurred in June, July and August. It is concluded that, especially for the rainstorms that caused significant damage, the return periods of the rainfall heights and resultant flood discharges have gone up to 250 and 500 years, respectively. A general agreement is observed between the return periods of rains and resultant floods. It is concluded that there has been no significant climate change to cause increases in flood damage. The most important human factors increasing the damage are improper and illegal land use, deforestation, improper urbanization and settlement, and psychological and technical factors. Some structural and non-structural measures to mitigate flood damages are also included in the paper. Structural measures include dykes and flood levees. Main non-structural measures include a flood warning system, modification of land use, watershed management and improvement, flood insurance, organization of flood management studies, coordination between related institutions, and education of the people and informing of the stakeholders.
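
    Return periods like the 250- and 500-year values quoted above are conventionally estimated by fitting an extreme-value distribution to annual maxima. A minimal sketch (Python, method-of-moments Gumbel fit; the rainfall series is invented for illustration, not data from the paper):

      import math

      def gumbel_fit(annual_maxima):
          """Method-of-moments fit of a Gumbel (EV1) distribution to annual maxima."""
          n = len(annual_maxima)
          mean = sum(annual_maxima) / n
          var = sum((x - mean) ** 2 for x in annual_maxima) / (n - 1)
          beta = math.sqrt(6.0 * var) / math.pi       # scale parameter
          mu = mean - 0.5772156649 * beta             # location (Euler-Mascheroni constant)
          return mu, beta

      def return_period(x, mu, beta):
          """Return period T(x) = 1 / (1 - F(x)) under the fitted Gumbel distribution."""
          F = math.exp(-math.exp(-(x - mu) / beta))
          return 1.0 / (1.0 - F)

      # Hypothetical annual-maximum daily rainfall series (mm), not data from the paper
      maxima = [62, 55, 71, 48, 80, 66, 59, 90, 52, 75, 68, 58, 84, 61, 70]
      mu, beta = gumbel_fit(maxima)
      print(f"Return period of a 120 mm storm: ~{return_period(120, mu, beta):,.0f} years")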

  7. Flood-inundation maps for the Big Blue River at Shelbyville, Indiana

    Science.gov (United States)

    Fowler, Kathleen K.

    2017-02-13

    Digital flood-inundation maps for a 4.1-mile reach of the Big Blue River at Shelbyville, Indiana, were created by the U.S. Geological Survey (USGS) in cooperation with the Indiana Office of Community and Rural Affairs. The flood-inundation maps, which can be accessed through the USGS Flood Inundation Mapping Science Web site at https://water.usgs.gov/osw/flood_inundation/, depict estimates of the areal extent and depth of flooding corresponding to selected water levels (stages) at the USGS streamgage on the Big Blue River at Shelbyville, Ind. (station number 03361500). Near-real-time stages at this streamgage may be obtained from the USGS National Water Information System at https://waterdata.usgs.gov/ or the National Weather Service (NWS) Advanced Hydrologic Prediction Service at https://water.weather.gov/ahps/, which also forecasts flood hydrographs at this site (SBVI3). Flood profiles were computed for the stream reach by means of a one-dimensional step-backwater model. The hydraulic model was calibrated by using the most current stage-discharge relation at the Big Blue River at Shelbyville, Ind., streamgage. The calibrated hydraulic model was then used to compute 12 water-surface profiles for flood stages referenced to the streamgage datum and ranging from 9.0 feet, or near bankfull, to 19.4 feet, the highest stage of the current stage-discharge rating curve. The simulated water-surface profiles were then combined with a Geographic Information System digital elevation model (derived from light detection and ranging [lidar] data having a 0.98-foot vertical accuracy and 4.9-foot horizontal resolution) to delineate the area flooded at each water level. The availability of these maps, along with Internet information regarding current stage from the USGS streamgage at the Big Blue River at Shelbyville, Ind., and forecasted stream stages from the NWS, will provide emergency management personnel and residents with information that is critical for flood response.
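
    The final mapping step described above reduces to a grid operation: subtract the digital elevation model from each simulated water-surface elevation and keep the positive part as inundation depth. A minimal sketch (Python/NumPy; the elevations below are synthetic, whereas the real inputs would be the lidar DEM and the step-backwater profiles):

      import numpy as np

      def inundation_depth(water_surface_elev: np.ndarray, dem: np.ndarray) -> np.ndarray:
          """Depth grid for one flood stage: water surface minus ground, zero where dry."""
          depth = water_surface_elev - dem
          return np.where(depth > 0.0, depth, 0.0)

      # Synthetic valley cross section (elevations in feet); the real inputs would be
      # the lidar-derived DEM and a water-surface profile from the step-backwater model.
      dem = np.array([25.0, 18.0, 12.0, 9.5, 9.0, 9.4, 11.0, 16.0, 24.0])
      stage_wse = np.full(dem.shape, 14.2)   # hypothetical water-surface elevation (ft)

      depth = inundation_depth(stage_wse, dem)
      print("Inundated cells:", np.count_nonzero(depth), "of", depth.size)
      print("Max depth (ft):", depth.max())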

  8. Big Blue River at Shelbyville, Indiana flood-inundation geospatial datasets

    Science.gov (United States)

    Fowler, Kathleen K.

    2017-01-01

    Digital flood-inundation maps for a 4.1-mile reach of the Big Blue River at Shelbyville, Indiana, were created by the U.S. Geological Survey (USGS) in cooperation with the Indiana Office of Community and Rural Affairs. The flood-inundation maps, which can be accessed through the USGS Flood Inundation Mapping Science Web site at http://water.usgs.gov/osw/flood_inundation/, depict estimates of the areal extent and depth of flooding corresponding to selected water levels (stages) at the USGS streamgage on the Big Blue River at Shelbyville, Indiana (station number 03361500). Near-real-time stages at this streamgage may be obtained from the USGS National Water Information System at http://waterdata.usgs.gov/ or the National Weather Service (NWS) Advanced Hydrologic Prediction Service at http://water.weather.gov/ahps/, which also forecasts flood hydrographs at this site (SBVI3). Flood profiles were computed for the stream reach by means of a one-dimensional step-backwater model. The hydraulic model was calibrated by using the most current stage-discharge relation at the Big Blue River at Shelbyville, Ind., streamgage. The calibrated hydraulic model was then used to compute 12 water-surface profiles for flood stages referenced to the streamgage datum and ranging from 9.0 feet, or near bankfull, to 19.4 feet, the highest stage of the current stage-discharge rating curve. The simulated water-surface profiles were then combined with a Geographic Information System digital elevation model (derived from light detection and ranging [lidar] data having a 0.98-foot vertical accuracy and 4.9-foot horizontal resolution) to delineate the area flooded at each water level. The attached files on this landing page are the inputs and outputs for the U.S. Army Corps of Engineers HEC-RAS model used to create flood-inundation maps for the referenced report, https://doi.org/10.3133/sir20165166. There are two child items that contain final geospatial datasets for the flood-inundation maps.

  9. Processing Big Remote Sensing Data for Fast Flood Detection in a Distributed Computing Environment

    Science.gov (United States)

    Olasz, A.; Kristóf, D.; Nguyen Thai, B.; Belényesi, M.; Giachetta, R.

    2017-07-01

    The Earth observation (EO) missions of the space agencies and space industry (ESA, NASA, national and commercial companies) are evolving as never before. These missions aim to develop and launch next-generation series of satellites and sensors and often provide huge amounts of data, even free of charge, to enable novel monitoring services. The wider geospatial sector faces new challenges in storing, processing and visualizing these data, which reach the level of Big Data in their volume, variety and velocity, along with the need for multi-source spatio-temporal geospatial data processing. Handling and analysis of remote sensing data has always been a cumbersome task due to the ever-increasing size and frequency of collected information. This paper presents the achievements of the IQmulus EU FP7 research and development project with respect to processing and analysis of geospatial big data in the context of flood and waterlogging detection.
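
    The paper's distributed-processing specifics are not reproduced in this abstract, but the general pattern for fast flood detection at this scale is to tile a scene and evaluate a water index per tile in parallel. A minimal sketch (Python; the NDWI threshold, tile size, and random stand-in scene are assumptions for illustration, not IQmulus code):

      import numpy as np
      from concurrent.futures import ProcessPoolExecutor

      def detect_water(tile: np.ndarray, threshold: float = 0.0) -> int:
          """Count water pixels in one tile via an NDWI-style index.
          tile[..., 0] is a green band, tile[..., 1] a near-infrared band."""
          green, nir = tile[..., 0], tile[..., 1]
          ndwi = (green - nir) / np.clip(green + nir, 1e-6, None)
          return int(np.count_nonzero(ndwi > threshold))

      def split_tiles(scene: np.ndarray, tile_size: int):
          """Yield square tiles covering the scene (edge tiles may be smaller)."""
          for i in range(0, scene.shape[0], tile_size):
              for j in range(0, scene.shape[1], tile_size):
                  yield scene[i:i + tile_size, j:j + tile_size]

      if __name__ == "__main__":
          rng = np.random.default_rng(0)
          scene = rng.random((2048, 2048, 2)).astype(np.float32)  # stand-in scene
          with ProcessPoolExecutor() as pool:
              water_pixels = sum(pool.map(detect_water, split_tiles(scene, 512)))
          print(f"Water pixels detected: {water_pixels:,}")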

  10. The ordered network structure and its prediction for the big floods of the Changjiang River Basins

    Energy Technology Data Exchange (ETDEWEB)

    Men, Ke-Pei; Zhao, Kai; Zhu, Shu-Dan [Nanjing Univ. of Information Science and Technology, Nanjing (China). College of Mathematics and Statistics

    2013-12-15

    According to the latest statistical data of hydrology, a total of 21 floods took place over the Changjiang (Yangtze) River Basins from 1827 to 2012 and showed an obvious commensurable orderliness. Guided by the information forecasting theory of Wen-Bo Weng and building on previous research results, we combine ordered analysis with complex network technology, focusing on summarizing the ordered network structure of the Changjiang floods, supplementing new information, further optimizing the networks, constructing the 2D- and 3D-ordered network structures, and carrying out prediction research. Predictions show that future big deluges will probably occur over the Changjiang River Basin around 2013-2014, 2020-2021, 2030, 2036, 2051, and 2058. (orig.)
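
    Weng's commensurability idea, as used here, is that the gaps between historical event years cluster around integer multiples of a base period, which can then be extrapolated. A minimal sketch (Python; the scoring rule and flood-year list are invented stand-ins for the paper's 21-event catalogue and its network construction):

      def commensurability_score(years, period, tol=1):
          """Excess of year-pair differences lying within `tol` years of an integer
          multiple of `period`, over the count expected for random differences
          (a crude stand-in for Weng's commensurability analysis)."""
          hits, pairs = 0, 0
          for i, a in enumerate(years):
              for b in years[i + 1:]:
                  pairs += 1
                  d = b - a
                  k = round(d / period)
                  if k >= 1 and abs(d - k * period) <= tol:
                      hits += 1
          expected = pairs * min(1.0, (2 * tol + 1) / period)
          return hits - expected

      # Hypothetical big-flood years (illustrative; not the paper's 21-event catalogue)
      flood_years = [1849, 1860, 1870, 1887, 1905, 1917, 1931, 1935, 1954, 1969, 1980, 1998]

      best = max(range(8, 40), key=lambda p: commensurability_score(flood_years, p))
      print("Most commensurable period:", best, "years")
      print("Next events suggested around:", [flood_years[-1] + k * best for k in (1, 2)])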

  11. From Big Data to Small Transportable Products for Decision Support for Floods in Namibia

    Science.gov (United States)

    Mandl, D.; Frye, S.; Cappelaere, P.; Policelli, F.; Handy, M.; Sohlberg, R. A.; Grossman, R.

    2013-12-01

    During the past four years, a team from NASA, Oklahoma University, the University of Maryland and the University of Chicago, in collaboration with the Namibia Hydrological Services (NHS), has explored ways to provide decision support products for floods. The products draw on a variety of data, including a hydrological model, ground measurements such as river gauges, and earth remote sensing data. This poster or presentation highlights the lessons learned in acquiring, storing, and managing big data on the cloud and turning them into relevant products for GEOSS users. Technology that has been explored includes the use of Hadoop/MapReduce and Accumulo to process and manage the large data sets. OpenStreetMap was explored for use in cataloging water boundaries and enabling collaborative mapping of the base water mask and floods. A Flood Dashboard was created to customize displays of various data products. Finally, a higher-level Geo-Social Application Processing Interface (API) was developed so that users can discover and generate products dynamically for their specific needs/societal benefit areas and then share them with their Community of Practice over social networks. Results of this experiment have included a 100x reduction in the size of some flood products, making it possible to distribute these products to mobile platforms and/or bandwidth-limited users.
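
    The abstract does not say how the 100x product-size reduction was achieved; reductions of that order plausibly come from replacing dense rasters with compact encodings (vectors or run lengths) of the flood mask. As an illustration of the order of magnitude available, a run-length-encoding sketch (Python/NumPy, synthetic sparse mask):

      import numpy as np

      def rle_encode(mask: np.ndarray):
          """Run-length encode a flattened binary flood mask as (start, length) runs."""
          flat = mask.ravel().astype(np.int8)
          edges = np.flatnonzero(np.diff(flat)) + 1           # indices where value changes
          starts = np.concatenate(([0], edges))
          lengths = np.diff(np.concatenate((starts, [flat.size])))
          return [(int(s), int(l)) for s, l, v in zip(starts, lengths, flat[starts]) if v]

      rng = np.random.default_rng(1)
      mask = rng.random((1024, 1024)) > 0.995         # sparse hypothetical flood mask
      runs = rle_encode(mask)
      raw_bytes = mask.size                           # 1 byte/pixel as a naive baseline
      rle_bytes = len(runs) * 8                       # two 4-byte integers per run
      print(f"{len(runs)} runs; ~{raw_bytes / rle_bytes:.0f}x smaller than the raw mask")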

  12. Flood-inundation maps for a 12.5-mile reach of Big Papillion Creek at Omaha, Nebraska

    Science.gov (United States)

    Strauch, Kellan R.; Dietsch, Benjamin J.; Anderson, Kayla J.

    2016-03-22

    Digital flood-inundation maps for a 12.5-mile reach of the Big Papillion Creek from 0.6 mile upstream from the State Street Bridge to the 72nd Street Bridge in Omaha, Nebraska, were created by the U.S. Geological Survey (USGS) in cooperation with the Papio-Missouri River Natural Resources District. The flood-inundation maps, which can be accessed through the USGS Flood Inundation Mapping Science Web site at http://water.usgs.gov/osw/flood_inundation/, depict estimates of the areal extent and depth of flooding corresponding to selected water levels (stages) at the USGS streamgage on the Big Papillion Creek at Fort Street at Omaha, Nebraska (station 06610732). Near-real-time stages at this streamgage may be obtained on the Internet from the USGS National Water Information System at http://waterdata.usgs.gov/ or the National Weather Service Advanced Hydrologic Prediction Service at http://water.weather.gov/ahps/, which also forecasts flood hydrographs at this site.

  13. Application of Advection-Diffusion Routing Model to Flood Wave Propagation: A Case Study on the Big Piney River, Missouri, USA

    Institute of Scientific and Technical Information of China (English)

    Yang Yang; Theodore A Endreny; David J Nowak

    2016-01-01

    Flood wave propagation modeling is of critical importance to advancing water resources management and protecting human life and property. In this study, we investigated how the advection-diffusion routing model performed in flood wave propagation on a 16 km long downstream section of the Big Piney River, MO. Model performance was based on gaging station data at the upstream and downstream cross sections. We demonstrated with advection-diffusion theory that for small differences in watershed drainage area between the two river cross sections, inflow along the reach mainly contributes to the downstream hydrograph's rising limb and not to the falling limb. The downstream hydrograph's falling limb is primarily determined by the propagated flood wave originating at the upstream cross section. This research suggests the parameter for the advection-diffusion routing model can be calibrated by fitting the hydrograph falling limb. Application of the advection-diffusion model to the flood wave of January 29, 2013 supports our theoretical finding that the propagated flood wave determines the downstream cross section's falling limb, and the model has good performance in our test examples.
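
    The routing model referred to is the linear advection-diffusion equation dQ/dt + c dQ/dx = D d2Q/dx2, with wave celerity c and diffusivity D as the calibration parameters. A minimal explicit finite-difference sketch (Python/NumPy; c, D, the grid spacing and the Gaussian inflow wave are illustrative values, not the paper's calibrated ones):

      import numpy as np

      def route_advection_diffusion(q_upstream, dx, dt, nx, c, D):
          """Explicit upwind/FTCS solution of dQ/dt + c*dQ/dx = D*d2Q/dx2 on a
          uniform reach; returns the hydrograph at the downstream node."""
          q = np.full(nx, q_upstream[0])
          out = []
          for q_in in q_upstream:
              q[0] = q_in                                    # upstream boundary
              adv = -c * (q[1:-1] - q[:-2]) / dx             # upwind advection
              dif = D * (q[2:] - 2 * q[1:-1] + q[:-2]) / dx**2
              q[1:-1] += dt * (adv + dif)
              q[-1] = q[-2]                                  # simple outflow boundary
              out.append(q[-1])
          return np.array(out)

      # Illustrative parameters; in practice c and D would be calibrated against the
      # falling limb, as the paper suggests.
      dx, dt, nx = 400.0, 60.0, 41          # 16 km reach, 60 s time steps
      c, D = 1.5, 800.0                     # wave celerity (m/s), diffusivity (m^2/s)
      t = np.arange(0, 48 * 3600, dt)
      inflow = 20 + 180 * np.exp(-((t - 6 * 3600) / 7200.0) ** 2)   # Gaussian wave
      outflow = route_advection_diffusion(inflow, dx, dt, nx, c, D)
      print(f"Peak attenuated from {inflow.max():.0f} to {outflow.max():.0f} m^3/s")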

  14. Green River Formation water flood demonstration project. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Pennington, B.I.; Dyer, J.E.; Lomax, J.D. [Inland Resources, Inc. (United States); Lomax Exploration Co., Salt Lake City, UT (United States)]; Deo, M.D. [Utah Univ., Salt Lake City, UT (United States). Dept. of Chemical and Fuels Engineering]

    1996-11-01

    The objectives of the project were to understand the oil production mechanisms in the Monument Butte unit via reservoir characterization and reservoir simulations and to transfer the water flooding technology to similar units in the vicinity, particularly the Travis and the Boundary units. The reservoir characterization activity in the project basically consisted of extraction and analysis of a full-diameter core, Formation Micro Imaging (FMI) logs from several wells and Magnetic Resonance Imaging (MRI) logs from two wells. In addition, several side-wall cores were drilled and analyzed, oil samples from a number of wells were physically and chemically characterized (using high-temperature gas chromatography), oil-water relative permeabilities were measured, and pour points and cloud points of a few oil samples were determined. The reservoir modeling activity consisted of reservoir simulation of all three units at different scales and near-wellbore modeling of the wax precipitation effects. The reservoir simulation activities established the extent of pressurization of the sections of the reservoirs in the immediate vicinity of the Monument Butte unit. This resulted in a major expansion of the unit, and the production from this expanded unit increased from about 300 barrels per day to about 2,000 barrels per day.

  15. The geomorphic effectiveness of a large flood on the Rio Grande in the Big Bend region: insights on geomorphic controls and post-flood geomorphic response

    Science.gov (United States)

    Dean, David J.; Schmidt, John C.

    2013-01-01

    Since the 1940s, the Rio Grande in the Big Bend region has undergone long periods of channel narrowing, which have been occasionally interrupted by rare, large floods that widen the channel (termed a channel reset). The most recent channel reset occurred in 2008 following a 17-year period of extremely low stream flow and rapid channel narrowing. Flooding was caused by precipitation associated with the remnants of tropical depression Lowell in the Rio Conchos watershed, the largest tributary to the Rio Grande. Floodwaters approached 1500 m3/s (between a 13- and 15-year recurrence interval) and breached levees, inundated communities, and flooded the alluvial valley of the Rio Grande, with the wetted width exceeding 2.5 km in some locations. The 2008 flood had the 7th largest magnitude of record; however, it conveyed a larger volume of water than any other flood. Because of the narrow pre-flood channel conditions, record flood stages occurred. We used pre- and post-flood aerial photographs, channel and floodplain surveys, and 1-dimensional hydraulic models to quantify the magnitude of channel change, investigate the controls of flood-induced geomorphic changes, and measure the post-flood response of the widened channel. These analyses show that geomorphic changes included channel widening, meander migration, avulsions, extensive bar formation, and vertical floodplain accretion. Reach-averaged channel widening between 26 and 52% occurred, but in some localities exceeded 500%. The degree and style of channel response was related, but not limited, to three factors: 1) bed-load supply and transport, 2) pre-flood channel plan form, and 3) rapid declines in specific stream power downstream of constrictions and areas of high channel bed slope. The post-flood channel response has consisted of channel contraction through the aggradation of the channel bed and the formation of fine-grained benches inset within the widened channel margins. The most significant post-flood geomorphic
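
    The third control above involves specific (unit) stream power, omega = rho * g * Q * S / w, which falls quickly wherever the flow widens or the bed slope drops. A minimal sketch (Python; the peak discharge is the abstract's 1500 m3/s figure, while the slope and widths are invented to show the trend):

      RHO_WATER = 1000.0   # kg/m^3
      G = 9.81             # m/s^2

      def specific_stream_power(discharge_m3s: float, slope: float, width_m: float) -> float:
          """Specific (unit) stream power, omega = rho * g * Q * S / w, in W/m^2."""
          return RHO_WATER * G * discharge_m3s * slope / width_m

      # Peak discharge is the abstract's 1500 m^3/s; slope and widths are assumed
      q_peak = 1500.0
      for width in (200.0, 800.0, 2500.0):   # narrow channel vs. fully flooded valley
          omega = specific_stream_power(q_peak, slope=0.0008, width_m=width)
          print(f"width {width:6.0f} m -> omega = {omega:6.1f} W/m^2")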

  16. Coastal Flooding in Florida's Big Bend Region with Application to Sea Level Rise Based on Synthetic Storms Analysis

    Directory of Open Access Journals (Sweden)

    Scott C. Hagen and Peter Bacopoulos

    2012-01-01

    Flooding is examined by comparing maximum envelopes of water against the 0.2% (1-in-500-year) return-period flooding surface generated as part of revising the Federal Emergency Management Agency's flood insurance rate maps for Franklin, Wakulla, and Jefferson counties in Florida's Big Bend Region. The analysis condenses the number of storms to a small fraction of the original 159 used in production. The analysis is performed by assessing which synthetic storms contributed to inundation extent (the extent of inundation into the floodplain), coverage (the overall surface area of the inundated floodplain), and the spatially variable 0.2% flooding surface. The results are interpreted in terms of storm attributes (pressure deficit, radius to maximum winds, translation speed, storm heading, and landfall location) and the physical processes occurring within the natural system (storm surge and waves); both are contextualized against existing and new hurricane scales. The approach identifies what types of storms and storm attributes lead to what types of inundation, as measured in terms of extent and coverage, in Florida's Big Bend Region and provides a basis for the identification of a select subset of synthetic storms for studying the impact of sea level rise. The sea level rise application provides a clear contrast between a dynamic approach and a static approach.
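
    The storm-condensation logic lends itself to a simple grid computation: score each synthetic storm by its wetted area (coverage) and by the cells where it controls the composite maximum-envelope surface, then keep only contributing storms. A minimal sketch (Python/NumPy; the storm count and random envelopes are stand-ins for the 159 production storms):

      import numpy as np

      rng = np.random.default_rng(7)
      n_storms, ny, nx = 12, 50, 50                    # stand-in for the 159 synthetic storms
      intensity = np.linspace(0.5, 3.0, n_storms)      # hypothetical storm severities
      envelopes = rng.random((n_storms, ny, nx)) * intensity[:, None, None]
      envelopes[envelopes < 0.8] = np.nan              # NaN = cell never inundated

      # Coverage: wetted surface area of each storm (here, wet-cell counts)
      coverage = np.array([np.count_nonzero(~np.isnan(e)) for e in envelopes])

      # Contribution: cells where a storm sets the composite maximum-envelope surface
      filled = np.where(np.isnan(envelopes), -np.inf, envelopes)
      controls = filled.argmax(axis=0)
      wet = np.isfinite(filled.max(axis=0))
      contribution = np.array([np.count_nonzero((controls == s) & wet)
                               for s in range(n_storms)])

      keep = np.flatnonzero(contribution > 0.01 * wet.sum())   # the condensed subset
      print("Coverage per storm:", coverage)
      print(f"Condensed subset: storms {keep.tolist()}")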

  17. Results from the Big Spring basin water quality monitoring and demonstration projects, Iowa, USA

    Science.gov (United States)

    Rowden, R.D.; Liu, H.; Libra, R.D.

    2001-01-01

    Agricultural practices, hydrology, and water quality of the 267-km2 Big Spring groundwater drainage basin in Clayton County, Iowa, have been monitored since 1981. Land use is agricultural; nitrate-nitrogen (NO3-N) and herbicides are the resulting contaminants in groundwater and surface water. Ordovician Galena Group carbonate rocks comprise the main aquifer in the basin. Recharge to this karstic aquifer is by infiltration, augmented by sinkhole-captured runoff. Groundwater is discharged at Big Spring, where quantity and quality of the discharge are monitored. Monitoring has shown a threefold increase in groundwater nitrate-N concentrations from the 1960s to the early 1980s. The nitrate-N discharged from the basin typically is equivalent to over one-third of the nitrogen fertilizer applied, with larger losses during wetter years. Atrazine is present in groundwater all year; however, contaminant concentrations in the groundwater respond directly to recharge events, and unique chemical signatures of infiltration versus runoff recharge are detectable in the discharge from Big Spring. Education and demonstration efforts have reduced nitrogen fertilizer application rates by one-third since 1981. Relating declines in nitrate and pesticide concentrations to inputs of nitrogen fertilizer and pesticides at Big Spring is problematic. Annual recharge has varied five-fold during monitoring, overshadowing any water-quality improvements resulting from incrementally decreased inputs. © Springer-Verlag 2001.
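
    The "over one-third of the nitrogen fertilizer applied" statement is a simple mass balance: flow-weighted concentration times annual discharge volume, compared with the applied fertilizer mass. A minimal sketch (Python; every number below except the basin area is an assumed illustration, not a value from the study):

      # Hypothetical basin nitrogen balance in the spirit of the Big Spring monitoring;
      # only the basin area is taken from the abstract.
      basin_area_km2 = 267.0          # basin area quoted in the abstract
      runoff_mm = 250.0               # annual groundwater discharge depth (assumed)
      nitrate_n_mg_per_l = 12.0       # flow-weighted mean NO3-N concentration (assumed)
      fertilizer_kg_n_per_ha = 150.0  # application rate on cropland (assumed)
      cropland_fraction = 0.6         # fertilized fraction of the basin (assumed)

      # 1 m^2 receiving 1 mm of water is 1 liter, so km^2 * 1e6 * mm gives liters
      discharge_liters = basin_area_km2 * 1e6 * runoff_mm
      n_exported_kg = discharge_liters * nitrate_n_mg_per_l * 1e-6   # mg -> kg
      n_applied_kg = basin_area_km2 * 100 * cropland_fraction * fertilizer_kg_n_per_ha

      print(f"N exported: {n_exported_kg:,.0f} kg; N applied: {n_applied_kg:,.0f} kg")
      print(f"Export fraction: {n_exported_kg / n_applied_kg:.0%}")   # ~ one-third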

  18. A Demonstration of Big Data Technology for Data Intensive Earth Science (Invited)

    Science.gov (United States)

    Kuo, K.; Clune, T.; Ramachandran, R.; Rushing, J.; Fekete, G.; Lin, A.; Doan, K.; Oloso, A. O.; Duffy, D.

    2013-12-01

    Big Data technologies exhibit great potential to change the way we conduct scientific investigations, especially analysis of voluminous and diverse data sets. Obviously, not all Big Data technologies are applicable to all aspects of scientific data analysis. Our NASA Earth Science Technology Office (ESTO) Advanced Information Systems Technology (AIST) project, Automated Event Service (AES), pioneers the exploration of Big Data technologies for data intensive Earth science. Since Earth science data are largely stored and manipulated in the form of multidimensional arrays, the project first evaluates array performance of several candidate Big Data technologies, including MapReduce (Hadoop), SciDB, and a custom-built Polaris system, which have one important feature in common: shared-nothing architecture. The evaluation finds SciDB to be the most promising. In this presentation, we demonstrate SciDB using a couple of use cases, each operating on a distinct data set in the regular latitude-longitude grid. The first use case is the discovery and identification of blizzards over an 8-year period using NASA's Modern-Era Retrospective analysis for Research and Applications (MERRA) data sets. The other finds diurnal signals in the same 8-year period using SSMI data from three different instruments with different equator crossing times by correlating their retrieved parameters. In addition, the AES project is also developing a collaborative component to enable the sharing of event queries and results. Preliminary capabilities will be presented as well.
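
    The SciDB queries themselves are not given in the abstract; the flavor of the array workload, though, is windowed masking over a (time, lat, lon) grid. A NumPy stand-in sketch (the blizzard criteria and thresholds below are invented for illustration, not the project's definitions):

      import numpy as np

      # Stand-in for a (time, lat, lon) reanalysis cube; the thresholds are invented
      # illustrations, not the project's actual blizzard definition.
      rng = np.random.default_rng(3)
      wind = rng.gamma(2.0, 4.0, size=(240, 90, 180))      # hourly wind speed, m/s
      snowfall = rng.gamma(1.5, 0.4, size=wind.shape)      # snowfall rate, mm/h

      candidate = (wind > 15.0) & (snowfall > 1.0)         # per-cell event criterion

      # Require persistence: criterion met for 3 consecutive hours (moving AND window)
      persistent = candidate[:-2] & candidate[1:-1] & candidate[2:]
      events_per_cell = persistent.sum(axis=0)
      print("Cells with at least one persistent event:", np.count_nonzero(events_per_cell))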

  1. Flood-inundation maps for Big Creek from the McGinnis Ferry Road bridge to the confluence of Hog Wallow Creek, Alpharetta and Roswell, Georgia

    Science.gov (United States)

    Musser, Jonathan W.

    2015-08-20

    Digital flood-inundation maps for a 12.4-mile reach of Big Creek that extends from 260 feet above the McGinnis Ferry Road bridge to the U.S. Geological Survey (USGS) streamgage at Big Creek below Hog Wallow Creek at Roswell, Georgia (02335757), were developed by the USGS in cooperation with the cities of Alpharetta and Roswell, Georgia. The inundation maps, which can be accessed through the USGS Flood Inundation Mapping Science Web site at http://water.usgs.gov/osw/flood_inundation/, depict estimates of the areal extent and depth of flooding corresponding to selected water levels (stages) at the USGS streamgage at Big Creek near Alpharetta, Georgia (02335700). Real-time stage information from this USGS streamgage may be obtained at http://waterdata.usgs.gov/ and can be used in conjunction with these maps to estimate near real-time areas of inundation. The National Weather Service (NWS) is incorporating results from this study into the Advanced Hydrologic Prediction Service (AHPS) flood-warning system (http://water.weather.gov/ahps/). The NWS forecasts flood hydrographs for many streams where the USGS operates streamgages and provides flow data. The forecasted peak-stage information for the USGS streamgage at Big Creek near Alpharetta (02335700), available through the AHPS Web site, may be used in conjunction with the maps developed for this study to show predicted areas of flood inundation.

  2. Digital Flood Insurance Rate Map Database, Big Horn County, Wyoming (All Jurisdictions)

    Data.gov (United States)

    Federal Emergency Management Agency, Department of Homeland Security — The Digital Flood Insurance Rate Map (DFIRM) Database depicts flood risk information and supporting data used to develop the risk data. The primary risk...

  3. Effectiveness and reliability of emergency measures for flood prevention

    NARCIS (Netherlands)

    Lendering, K.T.; Jonkman, S.N.; Kok, M.

    2014-01-01

    Floods in the summer of 2013 in Central Europe demonstrated once again that floods account for a large part of damage and loss of life caused by natural disasters. During flood threats, emergency measures, such as sand bags and big bags, are often applied to strengthen the flood defences and attempt

  4. Floods

    Science.gov (United States)

    Floods are common in the United States. Weather such as heavy rain, thunderstorms, hurricanes, or tsunamis can ... is breached, or when a dam breaks. Flash floods, which can develop quickly, often have a dangerous ...

  5. Big Muddy Field Low-Tension Flood Demonstration Project. Fourth annual report, April 1981-March 1982

    Energy Technology Data Exchange (ETDEWEB)

    Painter, T.

    1982-09-01

    During 1981, about two-thirds of the low-tension slug was injected. By year-end, the oil cut had increased from 0.6 to over 2%. Injection rates were less than predicted. The viscosity of the slug was reduced from 20 to 14 cp. Following the viscosity reduction, the injection rate continued to decline from 65,000 to 50,000 barrels per month by year-end. An oil response has apparently occurred from the low-tension slug injection, with tracer appearance in only Well 88. The plant operation is now satisfactory. By year-end, the filter operation was satisfactory. The blending system produced satisfactory concentrations throughout the year. Laboratory support work is being done on field treatments of sulfonate-containing crude. A family of chemicals has been developed for a two-stage field treating operation; the first stage for demulsification and the second stage for extraction of the sulfonate from the crude oil. All of the sulfonate has been manufactured and quality control tests were run on each batch. While differences between batches could not be discerned by chemical analysis, the amount of alcohol required for phase stability at constant temperature differed between batches. It was found that if the required isobutyl alcohol (IBA) for phase stability was added, the oil recovery was about the same for each batch. 90 figures, 20 tables.

  6. Linking Surface Morphological Change to Subsurface Fluvial Architecture: What Imprints do big Floods Leave?

    Science.gov (United States)

    Ashworth, P. J.; Best, J. L.; Sambrook-Smith, G. H.; Parker, N.; Lane, S. N.; Lunt, I. A.; Simpson, C. J.; Widdison, P. E.

    2008-12-01

    Ideas concerning the origin of alluvial deposits and their paleoenvironmental interpretation have usually resulted in two schools of thought: that such deposits are either the result of ordinary 'day-to-day' processes that acted uniformly through time, or that they are related to rare events that had a disproportionate effect on erosion and deposition rates. Despite the long-running debate between gradualism and catastrophism within the Earth Sciences, there is surprisingly little quantitative data to assess what magnitude of event is represented in many fluvial sequences. This paper reports results of a unique natural 'experiment' in which surface (digital elevation models obtained from digital photogrammetry) and subsurface (ground-penetrating radar, GPR) data were taken immediately prior to, and after, a large (1-in-40-year) flood event that occurred in 2005 on the sand-bed, braided South Saskatchewan River, Canada. We surveyed several reaches of the river both before and after this major flood event, and collected repeat aerial surveys of the entire channel, as well as GPR surveys along identical survey lines. This allows us to examine the morphological change in the channel form during this flood, quantify the probability distributions of bed heights within the channels, and assess the amount of erosion and/or deposition represented within the subsurface architecture. Results indicate that although this high-magnitude flood had a marked geomorphic impact, the style and scale of both scour and deposition were the same as those measured during lower-magnitude, annual floods. Hence, rather than being a reflection of either frequent or rare events, alluvial deposits in the South Saskatchewan contain the record of both, but these different-scale events may be virtually indistinguishable in the subsurface alluvial architecture.

  7. A Cloud-Based Global Flood Disaster Community Cyber-Infrastructure: Development and Demonstration

    Science.gov (United States)

    Wan, Zhanming; Hong, Yang; Khan, Sadiq; Gourley, Jonathan; Flamig, Zachary; Kirschbaum, Dalia; Tang, Guoqiang

    2014-01-01

    Flood disasters have significant impacts on the development of communities globally. This study describes a public cloud-based flood cyber-infrastructure (CyberFlood) that collects, organizes, visualizes, and manages several global flood databases for authorities and the public in real-time, providing location-based eventful visualization as well as statistical analysis and graphing capabilities. In order to expand and update the existing flood inventory, a crowdsourcing data collection methodology is employed for the public with smartphones or Internet to report new flood events, which is also intended to engage citizen-scientists so that they may become motivated and educated about the latest developments in satellite remote sensing and hydrologic modeling technologies. Our shared vision is to better serve the global water community with comprehensive flood information, aided by the state-of-the-art cloud computing and crowdsourcing technology. The CyberFlood presents an opportunity to eventually modernize the existing paradigm used to collect, manage, analyze, and visualize water-related disasters.

  8. Storm and flood of July 31-August 1, 1976, in the Big Thompson River and Cache la Poudre River basins, Larimer and Weld Counties, Colorado

    Science.gov (United States)

    McCain, Jerald F.; Shroba, R.R.

    1979-01-01

    PART A: Devastating flash floods swept through the canyon section of Larimer County in north-central Colorado during the night of July 31-August 1, 1976, causing 139 deaths, 5 missing persons, and more than $35 million in total damages. The brunt of the storms occurred over the Big Thompson River basin between Drake and Estes Park, with rainfall amounts of as much as 12 inches being reported during the storm period. In the Cache la Poudre River basin to the north, a rainfall amount of 10 inches was reported for one locality, while 6 inches fell over a widespread area near the central part of the basin. The storms developed when strong low-level easterly winds to the rear of a polar front pushed a moist, conditionally unstable airmass upslope into the Front Range of the Rocky Mountains. Orographic uplift released the convective instability, and light south-southeasterly winds at middle and upper levels allowed the storm complex to remain nearly stationary over the foothills for several hours. Minimal entrainment of relatively moist air at middle and upper levels, very low cloud bases, and a slightly tilted updraft structure contributed to a high precipitation efficiency. Intense rainfall began soon after 1900 MDT (Mountain Daylight Time) in the Big Thompson River and the North Fork Cache la Poudre River basins. A cumulative rainfall curve developed for Glen Comfort from radar data indicates that 7.5 inches of rain fell during the period 1930-2040 MDT on July 31. In the central part of the storm area west of Fort Collins, the heaviest rainfall began about 2200 MDT on July 31 and continued until 0100 MDT on August 1. Peak discharges were extremely large on many streams in the storm area, exceeding previously recorded maximum discharges at several locations. The peak discharge of the Big Thompson River at the gaging station at the canyon mouth, near Drake, was 31,200 cubic feet per second, or more than four times the previous maximum discharge of 7,600 cubic feet per second at

  9. Flood-Fighting Structures Demonstration and Evaluation Program: Laboratory and Field Testing in Vicksburg, Mississippi

    Science.gov (United States)

    2007-07-01

    ... actual flood conditions. Log impact tests were conducted at a water elevation of 66 percent of levee height to model the impact of waterborne debris ... A total of 250 cu yd of sand was delivered to the site. An automatic-speed sandbagger, Model ASB-3 (Hogan Manufacturing, Inc.), was rented ... Local sponsors: Mr. Renold Minsky, President, Fifth Louisiana Levee Board; Mr. Bump Calloway, Director, Warren County (MS) Civil Defense.

  10. Noah's Flood and the Associated Tremendous Rainfall as a Possible Result of Collision of a Big Asteroid with the Sun

    CERN Document Server

    Shopov, Y Y; Georgiev, L N; Damyanov, Y; Damyanova, A; Ford, D C; Yonge, C J

    2009-01-01

    A good correlation between the growth rate of cave speleothems and the annual precipitation at the cave site allows quantitative reconstruction of the precipitation. Measuring the growth rate of a speleothem from Duhlata Cave, Bulgaria, we found that around 7500 B.P. the speleothem growth rate (averaged over 120 years) exceeded 53 times its recent value, suggesting that enormous precipitation flooded the Black Sea basin at that time. Its possible connection with the Bible (Noah's) Flood is discussed. We propose a possible mechanism for the flooding of the Black Sea during the Flood involving the production of a super-tsunami by the pushing of Black Sea water towards the Crimean coast by Mediterranean waters. We also propose an astronomical theory of the origin of the Bible Flood. We attribute the higher water evaporation and rainfall to a rapid increase in solar radiation resulting from a collision of a large asteroid or comet with the Sun.

  11. Green River Formation Water Flood Demonstration Project: Final report. [October 21, 1992-April, 30, 1996

    Energy Technology Data Exchange (ETDEWEB)

    Deo, M.D. [Dept. of Chemical and Fuels Engineering, University of Utah, Salt Lake City (US); Dyer, J.E.; Lomax, J.D. [Inland Resources, Inc., Lomax Exploration Co., Salt Lake City, UT (US); Nielson, D.L.; Lutz, S.J. [Energy and Geoscience Institute at the University of Utah, Salt Lake City (US)

    1996-11-01

    The objectives were to understand the oil production mechanisms in the Monument Butte unit via reservoir characterization and reservoir simulations and to transfer the water flooding technology to similar units in the vicinity, particularly the Travis and the Boundary units. Comprehensive reservoir characterization and reservoir simulations of the Monument Butte, Travis and Boundary units were presented in the two published project yearly reports. The primary and the secondary production from the Monument Butte unit were typical of oil production from an undersaturated oil reservoir close to its bubble point. The water flood in the smaller Travis unit appeared affected by natural and possibly by large interconnecting hydraulic fractures. Water flooding the Boundary unit was considered more complicated due to the presence of an oil-water contact in one of the wells. The reservoir characterization activity in the project basically consisted of extraction and analysis of a full-diameter core, Formation Micro Imaging logs from several wells and Magnetic Resonance Imaging logs from two wells. In addition, several side-wall cores were drilled and analyzed, oil samples from a number of wells were physically and chemically characterized (using gas chromatography), oil-water relative permeabilities were measured, and pour points and cloud points of a few oil samples were determined. The reservoir modeling activity consisted of reservoir simulation of all three units at different scales and near-wellbore modeling of the wax precipitation effects. The reservoir characterization efforts identified new reservoirs in the Travis and the Boundary units. The reservoir simulation activities established the extent of pressurization of the sections of the reservoirs in the immediate vicinity of the Monument Butte unit. This resulted in a major expansion of the unit, and the production from this expanded unit increased from about 300 barrels per day to about 2,000 barrels per day.

  12. Field Demonstration of Carbon Dioxide Miscible Flooding in the Lansing-Kansas City Formation, Central Kansas

    Energy Technology Data Exchange (ETDEWEB)

    Alan Byrnes; G. Paul Willhite; Don Green; Richard Pancake; JyunSyung Tsau; W. Lynn Watney; John Doveton; Willard Guy; Rodney Reynolds; Dave Murfin; James Daniels; Russell Martin; William Flanders; Dave Vander Griend; Eric Mork; Paul Cantrell

    2010-03-07

    A pilot carbon dioxide miscible flood was initiated in the Lansing Kansas City C formation in the Hall Gurney Field, Russell County, Kansas. The reservoir zone is an oomoldic carbonate located at a depth of about 2900 feet. The pilot consists of one carbon dioxide injection well and three production wells. Continuous carbon dioxide injection began on December 2, 2003. By the end of June 2005, 16.19 MM lb of carbon dioxide was injected into the pilot area. Injection was converted to water on June 21, 2005 to reduce operating costs to a breakeven level with the expectation that sufficient carbon dioxide was injected to displace the oil bank to the production wells by water injection. By March 7, 2010, 8,736 bbl of oil were produced from the pilot. Production from wells to the northwest of the pilot region indicates that oil displaced from carbon dioxide injection was produced from Colliver A7, Colliver A3, Colliver A14 and Graham A4 located on adjacent leases. About 19,166 bbl of incremental oil were estimated to have been produced from these wells as of March 7, 2010. There is evidence of a directional permeability trend toward the NW through the pilot region. The majority of the injected carbon dioxide remains in the pilot region, which has been maintained at a pressure at or above the minimum miscibility pressure. Estimated oil recovery attributed to the CO2 flood is 27,902 bbl which is equivalent to a gross CO2 utilization of 4.8 MCF/bbl. The pilot project is not economic.
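
    The gross utilization figure can be cross-checked from the injected mass with a standard-volume conversion. A minimal sketch (Python; the ideal-gas standard conditions are an assumption, which likely explains the small difference from the reported 4.8 MCF/bbl):

      # Cross-check of the reported gross CO2 utilization. Standard-volume conversion
      # assumes ideal gas at 60 F and 14.7 psia; the report's own basis may differ.
      co2_injected_lb = 16.19e6        # mass injected by end of June 2005
      oil_attributed_bbl = 27_902      # oil recovery attributed to the CO2 flood

      MOLAR_MASS_CO2 = 44.01           # lb per lb-mol
      SCF_PER_LB_MOL = 379.5           # standard cubic feet per lb-mol (ideal gas)

      mcf = co2_injected_lb / MOLAR_MASS_CO2 * SCF_PER_LB_MOL / 1000.0
      print(f"CO2 injected: {mcf:,.0f} MCF")
      print(f"Gross utilization: {mcf / oil_attributed_bbl:.1f} MCF/bbl (reported: 4.8)")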

  13. Affordable Development and Demonstration of a Small NTR Engine and Stage: How Small is Big Enough?

    Science.gov (United States)

    Borowski, Stanley K.; Sefcik, Robert J.; Fittje, James E.; McCurdy, David R.; Qualls, Arthur L.; Schnitzler, Bruce G.; Werner, James E.; Weitzberg, Abraham; Joyner, Claude R.

    2015-01-01

    The Nuclear Thermal Rocket (NTR) derives its energy from fission of uranium-235 atoms contained within fuel elements that comprise the engine's reactor core. It generates high thrust and has a specific impulse potential of approximately 900 seconds - a 100% increase over today's best chemical rockets. The Nuclear Thermal Propulsion (NTP) project, funded by NASA's AES program, includes five key task activities: (1) Recapture, demonstration, and validation of heritage graphite composite (GC) fuel (selected as the "Lead Fuel" option); (2) Engine Conceptual Design; (3) Operating Requirements Definition; (4) Identification of Affordable Options for Ground Testing; and (5) Formulation of an Affordable Development Strategy. During FY'14, a preliminary DDT&E plan and schedule for NTP development was outlined by GRC, DOE and industry that involved significant system-level demonstration projects that included GTD tests at the NNSS, followed by a FTD mission. To reduce cost for the GTD tests and FTD mission, small NTR engines, in either the 7.5 or 16.5 klbf thrust class, were considered. Both engine options used GC fuel and a "common" fuel element (FE) design. The small approximately 7.5 klbf "criticality-limited" engine produces approximately 157 megawatts of thermal power (MWt) and its core is configured with parallel rows of hexagonal-shaped FEs and tie tubes (TTs) with a FE to TT ratio of approximately 1:1. The larger approximately 16.5 klbf Small Nuclear Rocket Engine (SNRE), developed by LANL at the end of the Rover program, produces approximately 367 MWt and has a FE to TT ratio of approximately 2:1. Although both engines use a common 35 inch (approximately 89 cm) long FE, the SNRE's larger diameter core contains approximately 300 more FEs needed to produce an additional 210 MWt of power. To reduce the cost of the FTD mission, a simple "1-burn" lunar flyby mission was considered to reduce the LH2 propellant loading, the stage size and complexity. Use of existing and

  14. Nursing Management Minimum Data Set: Cost-Effective Tool To Demonstrate the Value of Nurse Staffing in the Big Data Science Era.

    Science.gov (United States)

    Pruinelli, Lisiane; Delaney, Connie W; Garciannie, Amy; Caspers, Barbara; Westra, Bonnie L

    2016-01-01

    There is a growing body of evidence of the relationship of nurse staffing to patient, nurse, and financial outcomes. With the advent of big data science and developing big data analytics in nursing, data science with the reuse of big data is emerging as a timely and cost-effective approach to demonstrate nursing value. The Nursing Management Minimum Data Set (NMMDS) provides standard administrative data elements, definitions, and codes to measure the context where care is delivered and, consequently, the value of nursing. The integration of the NMMDS elements in the current health system provides evidence for nursing leaders to measure and manage decisions, leading to better patient, staffing, and financial outcomes. It also enables the reuse of data for clinical scholarship and research.

  15. Commercial scale demonstration enhanced oil recovery by micellar-polymer flooding. M-1 project: facilities report

    Energy Technology Data Exchange (ETDEWEB)

    Knight, B.L. (ed.)

    1977-04-01

    ERDA and Marathon Oil Company contracted together for a commercial scale demonstration of enhanced oil recovery by the Maraflood (TM) oil recovery process. This M-1 Project is located within Sections 15, 16, 21 and 22, T6N, R13W, Crawford County, Illinois, encompassing approximately 407 acres of Robinson Sand reservoir developed in the first decade of the century. The area covers portions of several waterfloods developed on 10-acre spacing in the 1950's that were approaching their economic limit. This report describes all M-1 Project facilities, how they were prepared or constructed, their purpose and how they operate: (1) wells (drilling and completion); (2) production facility; (3) injection facility; and (4) various service systems required during project development and/or operation. (48 fig, 7 tables) (DLC).

  16. Demonstrator Flood Control Room: an inventory of the wishes of the various Deltares divisions and a design based on them

    NARCIS (Netherlands)

    Boertjens, G.J.; Attema-van Waas, A.R.; Guikema, M.; Schilder, C.M.C.; Veen, M.J. van der

    2009-01-01

    Based on the research performed, TNO draws the following conclusions: the existing room that Deltares has in mind for realizing the training facility is small. A first phase of the desired Flood Control Room can be realized in this room, bearing in mind that not all identified

  17. Big data analytics: turning big data into big money

    CERN Document Server

    Ohlhorst, Frank J

    2012-01-01

    Unique insights to implement big data analytics and reap big returns for your bottom line. Focusing on the business and financial value of big data analytics, respected technology journalist Frank J. Ohlhorst shares his insights on the newly emerging field of big data analytics in Big Data Analytics. This breakthrough book demonstrates the importance of analytics, defines the processes, highlights the tangible and intangible values, and discusses how you can turn a business liability into actionable material that can be used to redefine markets, improve profits, and identify new business opportunities.

  18. Floods and Flash Flooding

    Science.gov (United States)

    Floods and flash flooding Now is the time to determine your area’s flood risk. If you are not sure whether you ... If you are in a floodplain, consider buying flood insurance. Do not drive around barricades. If your ...

  1. Flooding and Flood Management

    Science.gov (United States)

    Brooks, K.N.; Fallon, J.D.; Lorenz, D.L.; Stark, J.R.; Menard, Jason; Easter, K.W.; Perry, Jim

    2011-01-01

    Floods result in great human disasters globally and nationally, causing an average of $4 billion of damages each year in the United States. Minnesota has its share of floods and flood damages, and the state has awarded nearly $278 million to local units of government for flood mitigation projects through its Flood Hazard Mitigation Grant Program. Since 1995, flood mitigation in the Red River Valley has exceeded $146 million. Considerable local and state funding has been provided to manage and mitigate problems of excess stormwater in urban areas, flooding of farmlands, and flood damages at road crossings. The cumulative costs involved with floods and flood mitigation in Minnesota are not known precisely, but it is safe to conclude that flood mitigation is a costly business. This chapter begins with a description of floods in Minnesota to provide examples and contrasts across the state. Background material is presented to provide a basic understanding of floods and flood processes, prediction, and management and mitigation. Methods of analyzing and characterizing floods are presented because they affect how we respond to flooding and can influence relevant practices. The understanding and perceptions of floods and flooding commonly differ among those who work in flood forecasting, flood protection, or water resource management and citizens and businesses affected by floods. These differences can become magnified following a major flood, pointing to the need for better understanding of flooding as well as common language to describe flood risks and the uncertainty associated with determining such risks. Expectations of accurate and timely flood forecasts and our ability to control floods do not always match reality. Striving for clarity is important in formulating policies that can help avoid recurring flood damages and costs.

  2. Big Data: Big Confusion? Big Challenges?

    Science.gov (United States)

    2015-05-01

    Presentation from the 12th Annual Acquisition Research Symposium: "Big Data: Big Confusion? Big Challenges?", by Mary Maureen... [The remainder of the extract is report documentation-page boilerplate (contract, grant, program element, project and task number fields) and slide fragments, of which the only recoverable content is the claim that 90% of the data in the world today was created in the last two years, and a chart of Big Data growth.]

  3. Affordable Development and Demonstration of a Small Nuclear Thermal Rocket (NTR) Engine and Stage: How Small Is Big Enough?

    Science.gov (United States)

    Borowski, Stanley K.; Sefcik, Robert J.; Fittje, James E.; McCurdy, David R.; Qualls, Arthur L.; Schnitzler, Bruce G.; Werner, James E.; Weitzberg, Abraham; Joyner, Claude R.

    2016-01-01

    The Nuclear Thermal Rocket (NTR) derives its energy from fission of uranium-235 atoms contained within fuel elements that comprise the engine's reactor core. It generates high thrust and has a specific impulse potential of approximately 900 seconds - a 100 percent increase over today's best chemical rockets. The Nuclear Thermal Propulsion (NTP) project, funded by NASA's Advanced Exploration Systems (AES) program, includes five key task activities: (1) Recapture, demonstration, and validation of heritage graphite composite (GC) fuel (selected as the Lead Fuel option); (2) Engine Conceptual Design; (3) Operating Requirements Definition; (4) Identification of Affordable Options for Ground Testing; and (5) Formulation of an Affordable Development Strategy. During fiscal year (FY) 2014, a preliminary Design Development Test and Evaluation (DDT&E) plan and schedule for NTP development was outlined by the NASA Glenn Research Center (GRC), Department of Energy (DOE) and industry that involved significant system-level demonstration projects, including Ground Technology Demonstration (GTD) tests at the Nevada National Security Site (NNSS), followed by a Flight Technology Demonstration (FTD) mission. To reduce cost for the GTD tests and FTD mission, small NTR engines, in either the 7.5 or 16.5 kilopound-force thrust class, were considered. Both engine options used GC fuel and a common fuel element (FE) design. The small, approximately 7.5 kilopound-force, criticality-limited engine produces approximately 157 thermal megawatts and its core is configured with parallel rows of hexagonal-shaped FEs and tie tubes (TTs) with a FE to TT ratio of approximately 1:1. The larger, approximately 16.5 kilopound-force, Small Nuclear Rocket Engine (SNRE), developed by Los Alamos National Laboratory (LANL) at the end of the Rover program, produces approximately 367 thermal megawatts and has a FE to TT ratio of approximately 2:1. Although both engines use a common 35-inch (approximately

  4. Proof-of-Concept Demonstrations for Computation-Based Human Reliability Analysis. Modeling Operator Performance During Flooding Scenarios

    Energy Technology Data Exchange (ETDEWEB)

    Joe, Jeffrey Clark [Idaho National Lab. (INL), Idaho Falls, ID (United States); Boring, Ronald Laurids [Idaho National Lab. (INL), Idaho Falls, ID (United States); Herberger, Sarah Elizabeth Marie [Idaho National Lab. (INL), Idaho Falls, ID (United States); Mandelli, Diego [Idaho National Lab. (INL), Idaho Falls, ID (United States); Smith, Curtis Lee [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-09-01

    The United States (U.S.) Department of Energy (DOE) Light Water Reactor Sustainability (LWRS) program has the overall objective to help sustain the existing commercial nuclear power plants (NPPs). To accomplish this program objective, there are multiple LWRS “pathways,” or research and development (R&D) focus areas. One LWRS focus area is called the Risk-Informed Safety Margin Characterization (RISMC) pathway. Initial efforts under this pathway to combine probabilistic and plant multi-physics models to quantify safety margins and support business decisions also included human reliability analysis (HRA), but in a somewhat simplified manner. HRA experts at Idaho National Laboratory (INL) have been collaborating with other experts to develop a computational HRA approach, called the Human Unimodel for Nuclear Technology to Enhance Reliability (HUNTER), for inclusion into the RISMC framework. The basic premise of this research is to leverage applicable computational techniques, namely simulation and modeling, to develop and then, using RAVEN as a controller, seamlessly integrate virtual operator models (HUNTER) with 1) the dynamic computational MOOSE runtime environment that includes a full-scope plant model, and 2) the RISMC framework PRA models already in use. The HUNTER computational HRA approach is a hybrid approach that leverages past work from cognitive psychology, human performance modeling, and HRA, but it is also a significant departure from existing static and even dynamic HRA methods. This report is divided into five chapters that cover the development of an external flooding event test case and associated statistical modeling considerations.
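
    The coupling described above, a virtual operator model run in lock-step with a plant simulation under a Monte Carlo controller, can be caricatured in a few lines of Python. The sketch below uses none of the real HUNTER, RAVEN or MOOSE APIs; every function, distribution and parameter in it is an invented stand-in, shown only to illustrate how a computation-based HRA loop estimates a human error probability for a flooding scenario.

    ```python
    # Hypothetical dynamic-HRA Monte Carlo loop (not the HUNTER/RAVEN API).
    import random

    def flood_level(t_min, rate_m_per_min=0.5):
        """Assumed plant-state surrogate: flood level (m) grows linearly in time."""
        return rate_m_per_min * t_min

    def operator_response_time(rng):
        """Sampled time (min) to diagnose and complete the mitigating action;
        the lognormal shape and parameters are illustrative assumptions."""
        return rng.lognormvariate(mu=2.5, sigma=0.5)  # median ~12 min

    def trial_fails(rng, critical_level_m=10.0):
        """One coupled operator/plant trial: failure if the flood reaches the
        critical level before the operator action is completed."""
        return flood_level(operator_response_time(rng)) >= critical_level_m

    rng = random.Random(42)
    n = 100_000
    hep = sum(trial_fails(rng) for _ in range(n)) / n
    print(f"estimated human error probability: {hep:.3f}")
    ```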

  5. 75 FR 14091 - Final Flood Elevation Determinations

    Science.gov (United States)

    2010-03-24

    ... participation in the National Flood Insurance Program (NFIP). DATES: The date of issuance of the Flood Insurance Rate Map (FIRM) showing BFEs and modified BFEs for each community. This date may be obtained by... [Flood-elevation table fragment: Big Creek at South Ely Street, +713, City of Bertram; Big Creek Road, +719, Cedar Lake...]

  6. Study on Big Data Technology for Unexpected Urban Flood Disasters

    Institute of Scientific and Technical Information of China (English)

    孙欣欣

    2016-01-01

    Unexpected urban flood disasters are among the main natural disasters accompanying today's urbanization, and the losses they cause are far from negligible. It is therefore of real significance to process the massive data related to urban flood disasters with big data technology and to analyse in depth the root causes behind these disasters. This paper analyses the big data related to unexpected urban flooding and its characteristics, dissects the difficulties in analysing and processing such data, introduces suitable big data analysis techniques in light of the data's features, and designs a Hadoop-based hybrid big data analysis tool, in the hope of providing a reference for applied big data research in the field of urban flood disasters.
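
    The Hadoop-style processing the paper refers to reduces, at its core, to the MapReduce pattern. The following plain-Python sketch imitates that pattern on a handful of invented waterlogging records (the record format, district names and depths are all hypothetical); it illustrates the idea only and is not the paper's actual tool.

    ```python
    # Minimal plain-Python imitation of MapReduce over flood records.
    from collections import defaultdict

    records = [
        "2016-07-20,Chaoyang,waterlogging,42",   # date, district, event, depth_cm
        "2016-07-20,Haidian,waterlogging,35",
        "2016-07-21,Chaoyang,waterlogging,18",
    ]

    def map_phase(record):
        """Emit (district, depth_cm) pairs, as a Hadoop mapper would."""
        _, district, _, depth = record.split(",")
        yield district, int(depth)

    def reduce_phase(pairs):
        """Aggregate the maximum inundation depth per district, as a reducer would."""
        grouped = defaultdict(list)
        for key, value in pairs:
            grouped[key].append(value)
        return {key: max(values) for key, values in grouped.items()}

    pairs = (pair for record in records for pair in map_phase(record))
    print(reduce_phase(pairs))  # {'Chaoyang': 42, 'Haidian': 35}
    ```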

  7. The Met Office NWP-based Nowcasting Demonstration Project for the summer 2012 floods and London Olympics 2012

    Science.gov (United States)

    Ballard, Susan P.; Li, Zhihong; Simonin, David; Tubbs, Robert; Kelly, Graeme; Caron, Jean-Francois

    2013-04-01

    The Met Office developed a high-resolution (1.5 km) NWP system covering southern England and Wales for nowcasting (NDP) for the London Olympics 2012, using the Unified Model and hourly cycling 4D-Var data assimilation. The system produces 6-hour forecasts every hour. It has been running in near-real time since March 2012, initially on the IBM Power 6, moving to the new IBM Power 7 in September 2012. The system uses latent heat nudging of radar-derived rain rates provided every 15 minutes; direct assimilation in VAR of an hourly 3D cloud cover analysis; high-time-frequency, sub-hourly radar Doppler winds (6 per hour); wind profiler and MSG SEVIRI upper-tropospheric water vapour channels every 15 minutes; and hourly surface synoptic reports and AMDAR reports. Eumetsat satellite winds (AMVs) are used, but they are very coarse horizontally and temporally, e.g. at T-30 minutes only. The domain includes 8 of the UK network radars, of which 5 were providing Doppler radial winds by the time of the Olympics, along with 4 wind profilers. Boundary condition updates were provided every 30 minutes from the 1.5 km resolution, 6-hourly forecasts of the UK-region UKV model, which runs a 3-hourly cycling 3-km 3D-Var. The NDP uses a 4D-Var data assimilation system at half UM resolution (i.e. 3 km), hourly assimilation windows with 10-minute linearization (LS) states, and a 100-second timestep. The PF model and its adjoint have dimensions of 180 x 144 x 70. Observations are extracted in the observation time window T-30 minutes to T+30 minutes. The 1.5 km UM (360 x 288 x 70) uses 50-second time-stepping on 6 nodes in a 12 x 16 decomposition. 4D-Var increments are added to the UM at the initial forecast time T-30 minutes (at the first UM time step). A 45-minute data cutoff was used, and forecasts were available within 1 hour of nominal analysis time, i.e. taking 15 minutes for observation processing, data assimilation and forecast. Summer 2012 was an excellent time to assess the skill of the system for flash flood prediction due to the extreme weather over the UK during that period.

  8. Big data, big governance

    NARCIS (Netherlands)

    Reep, Frans van der

    2016-01-01

    “Of course it is nice that my fridge orders milk by itself on the basis of data-related patterns. Deep learning based on big data holds great promise,” says Frans van der Reep of Inholland. No wonder that this will be a main theme at the Hannover Messe during ScienceGuide’s Wissenstag…

  9. Big data, big responsibilities

    Directory of Open Access Journals (Sweden)

    Primavera De Filippi

    2014-01-01

    Big data refers to the collection and aggregation of large quantities of data produced by and about people, things or the interactions between them. With the advent of cloud computing, specialised data centres with powerful computational hardware and software resources can be used for processing and analysing a humongous amount of aggregated data coming from a variety of different sources. The analysis of such data is all the more valuable to the extent that it allows for specific patterns to be found and new correlations to be made between different datasets, so as to eventually deduce or infer new information, as well as to potentially predict behaviours or assess the likelihood for a certain event to occur. This article will focus specifically on the legal and moral obligations of online operators collecting and processing large amounts of data, to investigate the potential implications of big data analysis on the privacy of individual users and on society as a whole.

  10. Big science

    CERN Multimedia

    Nadis, S

    2003-01-01

    " "Big science" is moving into astronomy, bringing large experimental teams, multi-year research projects, and big budgets. If this is the wave of the future, why are some astronomers bucking the trend?" (2 pages).

  11. Ordered network structure and its prediction for the big floods in the Changjiang River Basin

    Institute of Scientific and Technical Information of China (English)

    门可佩

    2014-01-01

    According to the latest hydrological statistics, a total of 21 major floods have occurred in the Changjiang River Basin since 1827, displaying a highly significant orderliness. Applying Wen-Bo Weng's information forecasting theory and ordered network analysis, we construct two-dimensional and three-dimensional ordered information network structures for the major Changjiang floods and carry out comprehensive analysis and prediction. The results indicate that future big floods are likely to occur in the Changjiang River Basin in the flood seasons around 2014, 2020, 2030, 2036, 2051 and 2058.

  12. Construction and demonstration of urban flood modeling in Beijing

    Institute of Scientific and Technical Information of China (English)

    王毅; 刘洪伟; 潘兴瑶; 邸苏闯

    2015-01-01

    With rapid urbanization, the ratio of impervious area in river basins has increased greatly, raising flood peak discharges, shortening times of concentration, and expanding the threat of flood disasters. In particular, although urban drainage networks and river channels have been comprehensively regulated under unified planning, the growth of built-up areas has significantly changed the service scope of flood control and drainage channels, the underlying surface conditions, and the drainage units, ditches and tributaries, so the rainfall-runoff relationship at the urban river basin scale has changed considerably. Considering the complexity and uncertainty of urban flood disasters, this paper proposes a technical framework for an urban flood model for Beijing and constructs the model taking the Qinghe River basin as a typical example. The river flood diversion capacity is rechecked under different land surface and rainfall conditions, the early-warning indicator levels for river flooding are refined by quantifying the relationship between flood peak and rainfall depth, and the watershed flood risk characteristics are quantified, building a simulation-based technical system for urban flood-control safety dispatch. This study provides guidance for flood control and stormwater utilization assessment in urban watersheds.
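
    The refinement of early-warning indicators mentioned above rests on quantifying how flood peak responds to rainfall depth. The sketch below illustrates that step with an invented linear fit; all numbers are made up for illustration, and the paper's actual relationship comes from hydrodynamic simulation, not this regression.

    ```python
    # Hypothetical early-warning refinement: fit peak-vs-rainfall, invert for a threshold.
    import numpy as np

    rain_mm = np.array([20, 40, 60, 80, 100, 120])     # design storms (assumed)
    peak_m3s = np.array([35, 80, 130, 190, 255, 330])  # modelled flood peaks (assumed)

    slope, intercept = np.polyfit(rain_mm, peak_m3s, 1)  # peak ~= slope*rain + intercept

    warning_peak = 200.0  # assumed channel-capacity threshold (m^3/s)
    rain_threshold = (warning_peak - intercept) / slope
    print(f"issue a warning when forecast rainfall exceeds ~{rain_threshold:.0f} mm")
    ```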

  13. Big universe, big data

    DEFF Research Database (Denmark)

    Kremer, Jan; Stensbo-Smidt, Kristoffer; Gieseke, Fabian Cristian

    2017-01-01

    Astrophysics and cosmology are rich with data. The advent of wide-area digital cameras on large aperture telescopes has led to ever more ambitious surveys of the sky. Data volumes of entire surveys a decade ago can now be acquired in a single night, and real-time analysis is often desired. Thus, modern astronomy requires big data know-how; in particular it demands highly efficient machine learning and image analysis algorithms. But scalability is not the only challenge: astronomy applications touch several current machine learning research questions, such as learning from biased data and dealing with label and measurement noise. We argue that this makes astronomy a great domain for computer science research, as it pushes the boundaries of data analysis. In the following, we will present this exciting application area for data scientists. We will focus on exemplary results, discuss main challenges...

  15. Development of flood index by characterisation of flood hydrographs

    Science.gov (United States)

    Bhattacharya, Biswa; Suman, Asadusjjaman

    2015-04-01

    In recent years the world has experienced deaths, large-scale displacement of people, billions of Euros of economic damage, mental stress and ecosystem impacts due to flooding. Global changes (climate change, population and economic growth, and urbanisation) are exacerbating the severity of flooding. The 2010 floods in Pakistan and the 2011 floods in Australia and Thailand demonstrate the need for concerted action in the face of global societal and environmental changes to strengthen resilience against flooding. Due to climatological characteristics there are catchments where flood forecasting may have a relatively limited role and flood event management may have to be relied upon. For example, in flash flood catchments, which may be tiny and un-gauged, flood event management often depends on approximate prediction tools such as flash flood guidance (FFG). There are also catchments fed largely by flood waters coming from upstream catchments that are un-gauged, or where data-sharing issues in transboundary basins limit the flow of information from upstream. Hydrological and hydraulic modelling of these downstream catchments will never be sufficient to provide the required forecasting lead time, and alternative tools to support flood event management will be required. In FFG, or similar approaches, the primary motive is to provide guidance by synthesising the historical data. We follow a similar approach and characterise past flood hydrographs to determine a flood index (FI), which varies in space and time with flood magnitude and its propagation. By studying the variation of the index, the pockets of high flood risk requiring attention can be earmarked beforehand. This approach can be very useful in flood risk management of catchments where information about hydro-meteorological variables is inadequate for any forecasting system. This paper presents the development of the FI and its application to several catchments including in Kentucky in the USA
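
    The FI formula itself is not reproduced in this abstract, so the sketch below shows one plausible construction under stated assumptions: scale each flow value between an assumed bankfull proxy (a high percentile of the historical record) and the historical maximum, so the index rises and falls with flood magnitude.

    ```python
    # One plausible flood-index construction (assumed, not the paper's exact FI).
    import numpy as np

    def flood_index(flows, baseline):
        """Index in [0, 1]: 0 at or below the bankfull proxy, 1 at the record maximum.

        flows    -- hydrograph to characterise (m^3/s)
        baseline -- long historical record at the same gauge (m^3/s)
        """
        bankfull = np.percentile(baseline, 95)  # assumed bankfull proxy
        q_max = baseline.max()
        fi = (flows - bankfull) / (q_max - bankfull)
        return np.clip(fi, 0.0, 1.0)

    rng = np.random.default_rng(0)
    history = rng.gamma(2.0, 50.0, size=10_000)                  # synthetic flow record
    event = np.array([60.0, 150.0, 320.0, 410.0, 280.0, 120.0])  # synthetic flood wave
    print(flood_index(event, history).round(2))
    ```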

  16. 76 FR 43923 - Final Flood Elevation Determinations

    Science.gov (United States)

    2011-07-22

    ... participation in the National Flood Insurance Program (NFIP). DATES: The date of issuance of the Flood Insurance Rate Map (FIRM) showing BFEs and modified BFEs for each community. This date may be obtained by... [Flood-elevation table fragment: approximately 275 feet downstream of Big Bethel Road, +9; approximately 20 feet upstream of the confluence...]

  17. Big Data

    Directory of Open Access Journals (Sweden)

    Prachi More

    2013-05-01

    The demand for, and spurt in, the collection and accumulation of data has coined the new term "Big Data". Data is massively generated accidentally, incidentally, and through the interactions of people, and this BIG DATA is to be used smartly and effectively. Computer scientists, physicists, economists, mathematicians, political scientists, bio-informaticians, sociologists and many other varieties of intelligentsia debate the potential benefits and costs of analysing information from Twitter, Google, Facebook, Wikipedia and every space where large groups of people leave digital traces and deposit data. Given the rise of Big Data as both a phenomenon and a methodological persuasion, it is time to start critically interrogating this phenomenon, its assumptions and its biases. Big data, which refers to data sets that are too big to be handled using existing database management tools, is emerging in many important applications, such as Internet search, business informatics, social networks, social media, genomics, and meteorology. Big data presents a grand challenge for database and data analytics research. This paper is a blend of non-technical and introductory-level technical detail, ideal for the novice. We conclude with some technical challenges as well as solutions that can be applied to them. Big Data differs from other data in five characteristics: volume, variety, value, velocity and complexity. The article will focus on some current and future cases of, and causes for, BIG DATA.

  18. Big data

    DEFF Research Database (Denmark)

    Madsen, Anders Koed; Flyverbom, Mikkel; Hilbert, Martin

    2016-01-01

    The claim that big data can revolutionize strategy and governance in the context of international relations is increasingly hard to ignore. Scholars of international political sociology have mainly discussed this development through the themes of security and surveillance. The aim of this paper is to outline a research agenda that can be used to raise a broader set of sociological and practice-oriented questions about the increasing datafication of international relations and politics. First, it proposes a way of conceptualizing big data that is broad enough to open fruitful investigations into the emerging use of big data in these contexts. This conceptualization includes the identification of three moments contained in any big data practice. Second, it suggests a research agenda built around a set of subthemes that each deserve dedicated scrutiny when studying the interplay between big data...

  19. Big data

    DEFF Research Database (Denmark)

    Madsen, Anders Koed; Ruppert, Evelyn; Flyverbom, Mikkel

    2016-01-01

    The claim that big data can revolutionize strategy and governance in the context of international relations is increasingly hard to ignore. Scholars of international political sociology have mainly discussed this development through the themes of security and surveillance. The aim of this paper is to outline a research agenda that can be used to raise a broader set of sociological and practice-oriented questions about the increasing datafication of international relations and politics. First, it proposes a way of conceptualizing big data that is broad enough to open fruitful investigations into the emerging use of big data in these contexts. This conceptualization includes the identification of three moments contained in any big data practice. Second, it suggests a research agenda built around a set of sub-themes that each deserve dedicated scrutiny when studying the interplay between big...

  20. Big Data as Governmentality

    DEFF Research Database (Denmark)

    Flyverbom, Mikkel; Madsen, Anders Koed; Rasche, Andreas

    This paper conceptualizes how large-scale data and algorithms condition and reshape knowledge production when addressing international development challenges. The concept of governmentality and four dimensions of an analytics of government are proposed as a theoretical framework to examine how big data is constituted as an aspiration to improve the data and knowledge underpinning development efforts. Based on this framework, we argue that big data’s impact on how relevant problems are governed is enabled by (1) new techniques of visualizing development issues, (2) linking aspects… shows that big data problematizes selected aspects of traditional ways to collect and analyze data for development (e.g. via household surveys). We also demonstrate that using big data analyses to address development challenges raises a number of questions that can deteriorate its impact...

  1. Big Data

    OpenAIRE

    2013-01-01

    The demand for, and spurt in, the collection and accumulation of data has coined the new term "Big Data". Data is massively generated accidentally, incidentally, and through the interactions of people, and this BIG DATA is to be used smartly and effectively. Computer scientists, physicists, economists, mathematicians, political scientists, bio-informaticians, sociologists and many other varieties of intelligentsia debate the potential benefits and costs of analysing information from Twitter, Google,...

  2. Big Egos in Big Science

    DEFF Research Database (Denmark)

    Jeppesen, Jacob; Vaarst Andersen, Kristina; Lauto, Giancarlo

    In this paper we investigate the micro-mechanisms governing the structural evolution and performance of scientific collaboration. Scientific discovery tends not to be led by so-called lone 'stars', or big egos, but instead by collaboration among groups of researchers from a multitude of institutions… we utilize a stochastic actor-oriented model (SAOM) to analyze both network-endogenous mechanisms and the individual agency driving the collaboration network, and further whether being a Big Ego in Big Science translates into increasing performance. Our findings suggest that the selection of collaborators is not based… knowledge-producing environments with more visible boundaries and higher thresholds for collaboration...

  3. Big Surveys, Big Data Centres

    Science.gov (United States)

    Schade, D.

    2016-06-01

    Well-designed astronomical surveys are powerful and have consistently been keystones of scientific progress. The Byurakan Surveys using a Schmidt telescope with an objective prism produced a list of about 3000 UV-excess Markarian galaxies, but these objects have stimulated an enormous amount of further study and appear in over 16,000 publications. The CFHT Legacy Surveys used a wide-field imager to cover thousands of square degrees, and those surveys are mentioned in over 1100 publications since 2002. Both ground- and space-based astronomy have been increasing their investments in survey work. Survey instrumentation strives toward fair samples and large sky coverage and therefore strives to produce massive datasets. Thus we are faced with the "big data" problem in astronomy. Survey datasets require specialized approaches to data management, and big data places additional challenging requirements on it. If the term "big data" is defined as data collections that are too large to move, then there are profound implications for the infrastructure that supports big data science. The current model of data centres is obsolete. In the era of big data, the central problem is how to create architectures that effectively manage the relationship between data collections, networks, processing capabilities, and software, given the science requirements of the projects that need to be executed. A stand-alone data silo cannot support big data science. I'll describe the current efforts of the Canadian community to deal with this situation, and our successes and failures. I'll talk about how we are planning in the next decade to try to create a workable and adaptable solution to support big data science.

  4. Nogales flood detention study

    Science.gov (United States)

    Norman, Laura M.; Levick, Lainie; Guertin, D. Phillip; Callegary, James; Guadarrama, Jesus Quintanar; Anaya, Claudia Zulema Gil; Prichard, Andrea; Gray, Floyd; Castellanos, Edgar; Tepezano, Edgar; Huth, Hans; Vandervoet, Prescott; Rodriguez, Saul; Nunez, Jose; Atwood, Donald; Granillo, Gilberto Patricio Olivero; Ceballos, Francisco Octavio Gastellum

    2010-01-01

    Flooding in Ambos Nogales often exceeds the capacity of the channel and adjacent land areas, endangering many people. The Nogales Wash is being studied to prevent future flood disasters and detention features are being installed in tributaries of the wash. This paper describes the application of the KINEROS2 model and efforts to understand the capacity of these detention features under various flood and urbanization scenarios. Results depict a reduction in peak flow for the 10-year, 1-hour event based on current land use in tributaries with detention features. However, model results also demonstrate that larger storm events and increasing urbanization will put a strain on the features and limit their effectiveness.

  5. Understanding the allure of big infrastructure: Jakarta’s Great Garuda Sea Wall Project

    Directory of Open Access Journals (Sweden)

    Emma Colven

    2017-06-01

    In response to severe flooding in Jakarta, a consortium of Dutch firms in collaboration with the Indonesian government has designed the 'Great Garuda Sea Wall' project. The master plan proposes to construct a sea wall to enclose Jakarta Bay. A new waterfront city will be built on over 1000 hectares (ha) of reclaimed land in the shape of the Garuda, Indonesia’s national symbol. By redeveloping North Jakarta, the project promises to realise the world-class city aspirations of Indonesia’s political elites. Heavily reliant on hydrological engineering, hard infrastructure and private capital, the project has been presented by proponents as the optimum way to protect the city from flooding. The project retains its allure among political elites despite not directly addressing land subsidence, understood to be a primary cause of flooding. I demonstrate how this project is driven by a techno-political network that brings together political and economic interests, world-class city discourses, engineering expertise, colonial histories, and postcolonial relations between Jakarta and the Netherlands. Due in part to this network, big infrastructure has long constituted the preferred state response to flooding in Jakarta. I thus make a case for provincialising narratives that claim we are witnessing a return to big infrastructure in water management.

  6. Big Opportunities and Big Concerns of Big Data in Education

    Science.gov (United States)

    Wang, Yinying

    2016-01-01

    Against the backdrop of the ever-increasing influx of big data, this article examines the opportunities and concerns over big data in education. Specifically, this article first introduces big data, followed by delineating the potential opportunities of using big data in education in two areas: learning analytics and educational policy. Then, the…

  8. Big Dreams

    Science.gov (United States)

    Benson, Michael T.

    2015-01-01

    The Keen Johnson Building is symbolic of Eastern Kentucky University's historic role as a School of Opportunity. It is a place that has inspired generations of students, many from disadvantaged backgrounds, to dream big dreams. The construction of the Keen Johnson Building was inspired by a desire to create a student union facility that would not…

  9. Google BigQuery analytics

    CERN Document Server

    Tigani, Jordan

    2014-01-01

    How to effectively use BigQuery, avoid common mistakes, and execute sophisticated queries against large datasets. Google BigQuery Analytics is the perfect guide for business and data analysts who want the latest tips on running complex queries and writing code to communicate with the BigQuery API. The book uses real-world examples to demonstrate current best practices and techniques, and also explains and demonstrates streaming ingestion, transformation via Hadoop in Google Compute Engine, AppEngine datastore integration, and using GViz with Tableau to generate charts of query results. In addit
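
    As a concrete taste of the API the book covers, the short example below runs an aggregation query against BigQuery's public Shakespeare sample table with the google-cloud-bigquery Python client. The project id is a placeholder, and application-default credentials are assumed to be configured in the environment.

    ```python
    # Hedged example: top five words in the public Shakespeare sample table.
    from google.cloud import bigquery

    client = bigquery.Client(project="my-project-id")  # placeholder project id

    sql = """
        SELECT word, SUM(word_count) AS occurrences
        FROM `bigquery-public-data.samples.shakespeare`
        GROUP BY word
        ORDER BY occurrences DESC
        LIMIT 5
    """

    for row in client.query(sql).result():  # blocks until the query job finishes
        print(f"{row.word}: {row.occurrences}")
    ```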

  10. Health impacts of floods.

    Science.gov (United States)

    Du, Weiwei; FitzGerald, Gerard Joseph; Clark, Michele; Hou, Xiang-Yu

    2010-01-01

    Floods are the most common hazard to cause disasters and have led to extensive morbidity and mortality throughout the world. The impact of floods on the human community is related directly to the location and topography of the area, as well as human demographics and characteristics of the built environment. The aim of this study is to identify the health impacts of disasters and the underlying causes of health impacts associated with floods. A conceptual framework is developed that may assist with the development of a rational and comprehensive approach to prevention, mitigation, and management. This study involved an extensive literature review that located >500 references, which were analyzed to identify common themes, findings, and expert views. The findings then were distilled into common themes. The health impacts of floods are wide ranging, and depend on a number of factors. However, the health impacts of a particular flood are specific to the particular context. The immediate health impacts of floods include drowning, injuries, hypothermia, and animal bites. Health risks also are associated with the evacuation of patients, loss of health workers, and loss of health infrastructure including essential drugs and supplies. In the medium-term, infected wounds, complications of injury, poisoning, poor mental health, communicable diseases, and starvation are indirect effects of flooding. In the long-term, chronic disease, disability, poor mental health, and poverty-related diseases including malnutrition are the potential legacy. This article proposes a structured approach to the classification of the health impacts of floods and a conceptual framework that demonstrates the relationships between floods and the direct and indirect health consequences.

  11. Networking for big data

    CERN Document Server

    Yu, Shui; Misic, Jelena; Shen, Xuemin (Sherman)

    2015-01-01

    Networking for Big Data supplies an unprecedented look at cutting-edge research on the networking and communication aspects of Big Data. Starting with a comprehensive introduction to Big Data and its networking issues, it offers deep technical coverage of both theory and applications.The book is divided into four sections: introduction to Big Data, networking theory and design for Big Data, networking security for Big Data, and platforms and systems for Big Data applications. Focusing on key networking issues in Big Data, the book explains network design and implementation for Big Data. It exa

  12. Big queues

    CERN Document Server

    Ganesh, Ayalvadi; Wischik, Damon

    2004-01-01

    Big Queues aims to give a simple and elegant account of how large deviations theory can be applied to queueing problems. Large deviations theory is a collection of powerful results and general techniques for studying rare events, and has been applied to queueing problems in a variety of ways. The strengths of large deviations theory are these: it is powerful enough that one can answer many questions which are hard to answer otherwise, and it is general enough that one can draw broad conclusions without relying on special case calculations.
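
    For flavour, the canonical large-deviations estimate for the tail of a stable single-server queue can be stated in a few lines; this is the standard textbook result of the theory the book surveys, not a quotation from the book. With A(t) the cumulative work arriving in time t and c the constant service rate:

    ```latex
    \[
      \Lambda(\theta) = \lim_{t\to\infty} \tfrac{1}{t}\log \mathbb{E}\,e^{\theta A(t)},
      \qquad
      \lim_{b\to\infty} \tfrac{1}{b}\log \mathbb{P}(Q > b) = -\theta^{*},
    \]
    \[
      \text{where } \theta^{*} = \sup\{\theta > 0 : \Lambda(\theta) \le c\,\theta\},
    \]
    ```

    so the probability that the queue exceeds a large buffer level b decays roughly like e^(-θ* b).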

  13. Big Man

    Institute of Scientific and Technical Information of China (English)

    郑秀文

    2012-01-01

    梁炳 ("Edmond") says that after his concert he will go travelling with his wife; wherever on earth the plane lands, having a companion at your side is happiness. His concert is called Big Man. At first I misread it as Big Mac and wondered why anyone would hold a "giant hamburger" concert; only later did I see my mistake. But on reflection, who on the road of growing up has not lived like a silly little loaf: a ball of dough exposed to this wide world, with time and life's experiences as the yeast, so that over the months and years you and I both ferment and grow. Friendship, too, is a yeast that spurs each other's growth. Seeing that he long ago turned from a boy into a man, I realise that I can no longer call myself a girl either. In my eyes he has changed a great deal: his playful, outgoing nature has narrowed, and the two of us now,...

  14. Big Data

    DEFF Research Database (Denmark)

    Aaen, Jon; Nielsen, Jeppe Agger

    2016-01-01

    Big Data presents itself as one of the most hyped technological innovations of our time, proclaimed to hold the seeds of new, valuable operational insights for private companies and public organisations. While the optimistic pronouncements are many, research on Big Data in the public sector has so far been limited. This article examines how the public healthcare sector can reuse and exploit an ever larger volume of data while taking public values into account. The article builds on a case study of the use of large volumes of health data in the Danish General Practice Database (DAMD). The analysis shows that (re)using data in new contexts is a multi-faceted trade-off not only between economic rationales and quality considerations, but also over control of sensitive personal data and the ethical implications for the citizen. In the DAMD case, data are on the one hand used "in the service of the good cause" to...

  15. 76 FR 19007 - Proposed Flood Elevation Determinations

    Science.gov (United States)

    2011-04-06

    ... [Flood-elevation table fragment: Williamsburg County; approximately 0.4 mile upstream of the Big Dam Swamp confluence, None, +21; Smith Swamp At...] ... section 110 of the Flood Disaster Protection Act of 1973, 42 U.S.C. 4104, and 44 CFR 67.4(a). These... Sec. 67.4... 2. The tables published under the authority of Sec. 67.4 are proposed to be amended...

  16. 75 FR 43479 - Proposed Flood Elevation Determinations

    Science.gov (United States)

    2010-07-26

    ... section 110 of the Flood Disaster Protection Act of 1973, 42 U.S.C. 4104, and 44 CFR 67.4(a). These... Sec. 67.4... 2. The tables published under the authority of Sec. 67.4 are proposed to be amended as... [Flood-elevation table fragment: ...Street; Summet Brook (backwater effects from...), approximately 2,800..., None, +363, Town of Shrewsbury; Big...]

  17. Flooding On

    Institute of Scientific and Technical Information of China (English)

    YIN PUMIN

    2010-01-01

    Drenched riverside towns in central and south parts of China were preparing for even worse flooding as water levels in the country's huge rivers surged and rainstorms continued. As of July 27, accumulated precipitation since June 16 in 70 percent of the drainage areas of the Yangtze River had exceeded 50 mm, after three rounds of rainstorms, said Cai Qihua, Deputy Director of the Yangtze River Flood Control and Drought Relief Headquarters.

  18. Finding the big bang

    CERN Document Server

    Page, Lyman A; Partridge, R Bruce

    2009-01-01

    Cosmology, the study of the universe as a whole, has become a precise physical science, the foundation of which is our understanding of the cosmic microwave background radiation (CMBR) left from the big bang. The story of the discovery and exploration of the CMBR in the 1960s is recalled for the first time in this collection of 44 essays by eminent scientists who pioneered the work. Two introductory chapters put the essays in context, explaining the general ideas behind the expanding universe and fossil remnants from the early stages of the expanding universe. The last chapter describes how the confusion of ideas and measurements in the 1960s grew into the present tight network of tests that demonstrate the accuracy of the big bang theory. This book is valuable to anyone interested in how science is done, and what it has taught us about the large-scale nature of the physical universe.

  19. A rainfall design method for spatial flood risk assessment: considering multiple flood sources

    Science.gov (United States)

    Jiang, X.; Tatano, H.

    2015-08-01

    Information about the spatial distribution of flood risk is important for integrated urban flood risk management. Focusing on urban areas, spatial flood risk assessment must reflect all risk information derived from multiple flood sources: rivers, drainage, coastal flooding etc. that may affect the area. However, conventional flood risk assessment deals with each flood source independently, which leads to an underestimation of flood risk in the floodplain. Even in floodplains that have no risk from coastal flooding, flooding from river channels and inundation caused by insufficient drainage capacity should be considered simultaneously. For integrated flood risk management, it is necessary to establish a methodology to estimate flood risk distribution across a floodplain. In this paper, a rainfall design method for spatial flood risk assessment, which considers the joint effects of multiple flood sources, is proposed. The concept of critical rainfall duration determined by the concentration time of flooding is introduced to connect response characteristics of different flood sources with rainfall. A copula method is then adopted to capture the correlation of rainfall amount with different critical rainfall durations. Rainfall events are designed taking advantage of the copula structure of correlation and marginal distribution of rainfall amounts within different critical rainfall durations. A case study in the Otsu River Basin, Osaka prefecture, Japan was conducted to demonstrate this methodology.
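
    The heart of the method, coupling rainfall amounts for different critical rainfall durations through a copula, can be sketched compactly. In the sketch below the Gaussian copula, the Gumbel margins and every parameter value are illustrative assumptions, not the paper's fitted model; the two durations stand in for a fast-responding drainage system (1 h) and a slower river (24 h).

    ```python
    # Hedged sketch: Gaussian copula with Gumbel margins for two rainfall durations.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    rho = 0.7                                    # assumed dependence parameter
    cov = [[1.0, rho], [rho, 1.0]]

    z = rng.multivariate_normal([0.0, 0.0], cov, size=10_000)
    u = stats.norm.cdf(z)                        # Gaussian copula: uniform margins

    rain_1h = stats.gumbel_r.ppf(u[:, 0], loc=30, scale=10)   # short-duration amount (mm)
    rain_24h = stats.gumbel_r.ppf(u[:, 1], loc=90, scale=25)  # long-duration amount (mm)

    # Probability that both design thresholds are exceeded in the same event,
    # i.e. drainage-driven and river-driven flooding occur together.
    p_joint = np.mean((rain_1h > 50) & (rain_24h > 140))
    print(f"joint exceedance probability: {p_joint:.3f}")
    ```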

  20. A rainfall design method for spatial flood risk assessment: considering multiple flood sources

    Directory of Open Access Journals (Sweden)

    X. Jiang

    2015-08-01

    Information about the spatial distribution of flood risk is important for integrated urban flood risk management. Focusing on urban areas, spatial flood risk assessment must reflect all risk information derived from multiple flood sources: rivers, drainage, coastal flooding etc. that may affect the area. However, conventional flood risk assessment deals with each flood source independently, which leads to an underestimation of flood risk in the floodplain. Even in floodplains that have no risk from coastal flooding, flooding from river channels and inundation caused by insufficient drainage capacity should be considered simultaneously. For integrated flood risk management, it is necessary to establish a methodology to estimate flood risk distribution across a floodplain. In this paper, a rainfall design method for spatial flood risk assessment, which considers the joint effects of multiple flood sources, is proposed. The concept of critical rainfall duration determined by the concentration time of flooding is introduced to connect response characteristics of different flood sources with rainfall. A copula method is then adopted to capture the correlation of rainfall amount with different critical rainfall durations. Rainfall events are designed taking advantage of the copula structure of correlation and marginal distribution of rainfall amounts within different critical rainfall durations. A case study in the Otsu River Basin, Osaka prefecture, Japan was conducted to demonstrate this methodology.

  1. Enhancing Big Data Value Using Knowledge Discovery Techniques

    Directory of Open Access Journals (Sweden)

    Mai Abdrabo

    2016-08-01

    The world has been drowned by floods of data due to technological development; consequently, the term Big Data was coined to describe these gigantic volumes. Many kinds of fast-arriving data are doubling every second, and we have to profit from this enormous surge of data by converting it into knowledge. Knowledge Discovery (KDD) can enhance the detection of Big Data value using techniques and technologies such as Hadoop, MapReduce, and NoSQL. Extracting value from Big Data is critical in many fields. This survey discusses the expansion of data that led the world to the Big Data expression. Big Data has distinctive characteristics such as volume, variety, velocity, value, veracity, variability, viscosity, virality, ambiguity, and complexity. We describe the connection between Big Data and KDD techniques for reaching data value, discuss Big Data applications deployed by large organizations, and introduce the characteristics of Big Data that represent a significant challenge for its management. Finally, some of the important future directions in the Big Data field are presented.

  2. Evaluating flood potential with GRACE in the United States

    Science.gov (United States)

    Molodtsova, Tatiana; Molodtsov, Sergey; Kirilenko, Andrei; Zhang, Xiaodong; VanLooy, Jeffrey

    2016-04-01

    Reager and Famiglietti (2009) proposed an index, Reager's Flood Potential Index (RFPI), for early large-scale flood risk monitoring using the Terrestrial Water Storage Anomaly (TWSA) product derived from the Gravity Recovery and Climate Experiment (GRACE). We evaluated the efficacy of the RFPI for flood risk assessment over the continental USA using multi-year flood observations from 2003 to 2012 compiled by the US Geological Survey and the Dartmouth Flood Observatory. In general, we found good agreement between the RFPI flood risks and the observed floods on regional and even local scales. The RFPI demonstrated skill in predicting large-area, long-duration floods, especially during the summer season.
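
    The published RFPI combines TWSA with an estimate of regional storage capacity and incoming precipitation; the sketch below strips that down to the bare normalization idea (scaling the monthly anomaly by its historical range) and is an assumption-laden illustration rather than the index's actual definition.

    ```python
    # Simplified flood-potential normalization in the spirit of RFPI (assumed form).
    import numpy as np

    def flood_potential(twsa):
        """Map a TWSA time series (cm equivalent water height) onto [0, 1]."""
        t_min, t_max = twsa.min(), twsa.max()
        return (twsa - t_min) / (t_max - t_min)

    twsa = np.array([-8.2, -3.1, 1.4, 6.8, 12.5, 9.9, 4.2])  # synthetic monthly TWSA
    fpi = flood_potential(twsa)
    print("months with elevated flood potential:", np.where(fpi > 0.8)[0])
    ```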

  3. Big Egos in Big Science

    DEFF Research Database (Denmark)

    Jeppesen, Jacob; Vaarst Andersen, Kristina; Lauto, Giancarlo

    In this paper we investigate the micro-mechanisms governing the structural evolution of a scientific collaboration. Empirical evidence indicates that we have transcended into a new paradigm with a new modus operandi, where scientific discoveries are not led by so-called lone 'stars', or big egos, but instead by a group of people, from a multitude of institutions, having a diverse knowledge set and capable of operating more and more complex instrumentation. Using a dataset consisting of full bibliometric coverage from a Large Scale Research Facility, we utilize a stochastic actor-oriented model to estimate both the structural and performance effects of selection, as well as the behavioral effects of crossing organizational boundaries. Preliminary results suggest that the selection of collaborators is still skewed, and identify a large assortativity effect, as well as a tendency to interact with both authors...

  4. Combating Floods

    Institute of Scientific and Technical Information of China (English)

    1998-01-01

    In summer and autumn of 1998, the river valleys of the Changjiang, Songhua and Nenjiang rivers were stricken by exceptionally serious floods. As of the 22nd of August, the flooded areas stretched over 52.4 million acres. More than 223 million people were affected by the flood. 4.97 million houses were ruined, economic losses totaled RMB 166 billion, and most tragically, 3,004 people lost their lives. It was one of the costliest disasters in Chinese history. Millions of People's Liberation Army soldiers and local people joined hands to battle the floodwaters. Thanks to their unified efforts and tenacious struggle, they successfully withstood the rising water, resumed production and began to rebuild their homes.

  5. On Big Data Benchmarking

    OpenAIRE

    Han, Rui; Lu, Xiaoyi

    2014-01-01

    Big data systems address the challenges of capturing, storing, managing, analyzing, and visualizing big data. Within this context, developing benchmarks to evaluate and compare big data systems has become an active topic for both research and industry communities. To date, most of the state-of-the-art big data benchmarks are designed for specific types of systems. Based on our experience, however, we argue that considering the complexity, diversity, and rapid evolution of big data systems, fo...

  6. Flooding On

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    Drenched riverside towns in central and south parts of China were preparing for even worse flooding as water levels in the country's huge rivers surged and rainstorms continued. As of July 27, accumulated precipitation since June 16 in 70 percent of the drainage

  7. Big data=Big marketing?!

    Institute of Scientific and Technical Information of China (English)

    肖明超

    2012-01-01

    When the Internet first took off, one saying was everywhere: "On the Internet, nobody knows you're a dog." But today, more than twenty years on, that saying has long since been thrown into the dustbin of history, because, driven by technology and the rapid development of the mobile Internet, social networks and e-commerce, consumers' tracks have become ever easier to capture. Consumers' attention, behavioural trails, conversations, preferences and shopping histories on the Internet can all be recorded, and consumers are entering an almost transparently lived "Age of Big Data". Not only is data becoming more available; artificial intelligence (AI) technologies, including natural language processing, pattern recognition and machine learning, are making data ever easier for computers to understand,...

  8. The Global Flood Model

    Science.gov (United States)

    Williams, P.; Huddelston, M.; Michel, G.; Thompson, S.; Heynert, K.; Pickering, C.; Abbott Donnelly, I.; Fewtrell, T.; Galy, H.; Sperna Weiland, F.; Winsemius, H.; Weerts, A.; Nixon, S.; Davies, P.; Schiferli, D.

    2012-04-01

    Recently, a Global Flood Model (GFM) initiative has been proposed by Willis, UK Met Office, Esri, Deltares and IBM. The idea is to create a global community platform that enables better understanding of the complexities of flood risk assessment to better support the decisions, education and communication needed to mitigate flood risk. The GFM will provide tools for assessing the risk of floods, for devising mitigation strategies such as land-use changes and infrastructure improvements, and for enabling effective pre- and post-flood event response. The GFM combines humanitarian and commercial motives. It will benefit: - The public, seeking to preserve personal safety and property; - State and local governments, seeking to safeguard economic activity, and improve resilience; - NGOs, similarly seeking to respond proactively to flood events; - The insurance sector, seeking to understand and price flood risk; - Large corporations, seeking to protect global operations and supply chains. The GFM is an integrated and transparent set of modules, each composed of models and data. For each module, there are two core elements: a live "reference version" (a worked example) and a framework of specifications, which will allow development of alternative versions. In the future, users will be able to work with the reference version or substitute their own models and data. If these meet the specification for the relevant module, they will interoperate with the rest of the GFM. Some "crowd-sourced" modules could even be accredited and published to the wider GFM community. Our intent is to build on existing public, private and academic work, improve local adoption, and stimulate the development of multiple - but compatible - alternatives, so strengthening mankind's ability to manage flood impacts. The GFM is being developed and managed by a non-profit organization created for the purpose. The business model will be inspired by open source software (e.g. Linux): - for non-profit usage

  9. STUDY OF FACTORS AFFECTING CUSTOMER BEHAVIOUR USING BIG DATA TECHNOLOGY

    OpenAIRE

    Prabin Sahoo; Dr. Nilay Yajnik

    2014-01-01

    Big data technology has been gaining momentum recently, and several articles, books, blogs and discussions point to its various facets. The study in this paper focuses on big data as a concept, offers insights into the 3 Vs (Volume, Velocity and Variety), and demonstrates their significance with respect to the factors that can be processed using big data to study the behaviour of online customers.

  10. Editorial for FGCS special issue: Big Data in the cloud

    OpenAIRE

    Chang, V; Ramachandran, M.; Wills, G.; Walters, RJ; Li, C-S; Watters, P

    2016-01-01

    Research associated with Big Data in the Cloud will be an important topic over the next few years. The topic includes work on demonstrating architectures, applications, services, experiments and simulations in the Cloud to support cases related to the adoption of Big Data. A common approach to Big Data in the Cloud, allowing better access, performance and efficiency when analysing and understanding the data, is to deliver Everything as a Service. Organisations adopting Big Data this way find the b

  11. Tsunami flooding

    Science.gov (United States)

    Geist, Eric; Jones, Henry; McBride, Mark; Fedors, Randy

    2013-01-01

    Panel 5 focused on tsunami flooding with an emphasis on Probabilistic Tsunami Hazard Analysis (PTHA) as derived from its counterpart, Probabilistic Seismic Hazard Analysis (PSHA), which determines seismic ground-motion hazards. The Panel reviewed current practices in PTHA and determined the viability of extending the analysis to extreme design probabilities (i.e., 10^-4 to 10^-6). In addition to earthquake sources for tsunamis, PTHA for extreme events necessitates the inclusion of tsunamis generated by submarine landslides, and treatment of the large attendant uncertainty in source characterization and recurrence rates. Tsunamis can be caused by local and distant earthquakes, landslides, volcanism, and asteroid/meteorite impacts. Coastal flooding caused by storm surges and seiches is covered in Panel 7. Tsunamis directly tied to earthquakes, the similarities with (and path forward offered by) the PSHA approach for PTHA, and especially submarine landslide tsunamis were a particular focus of Panel 5.

  12. Big Data, Big Problems: A Healthcare Perspective.

    Science.gov (United States)

    Househ, Mowafa S; Aldosari, Bakheet; Alanazi, Abdullah; Kushniruk, Andre W; Borycki, Elizabeth M

    2017-01-01

    Much has been written on the benefits of big data for healthcare, such as improving patient outcomes, public health surveillance, and healthcare policy decisions. Over the past five years, Big Data, and the data sciences field in general, has been hyped as the "Holy Grail" for the healthcare industry, promising a more efficient healthcare system and improved healthcare outcomes. More recently, however, healthcare researchers have been exposing the potentially harmful effects Big Data can have on patient care, associating it with increased medical costs, patient mortality, and misguided decision making by clinicians and healthcare policy makers. In this paper, we review current Big Data trends with a specific focus on the inadvertent negative impacts that Big Data could have on healthcare in general and, specifically, as it relates to patient and clinical care. Our study results show that although Big Data is built up to be the "Holy Grail" for healthcare, small data techniques using traditional statistical methods are, in many cases, more accurate and can lead to more improved healthcare outcomes than Big Data methods. In sum, Big Data for healthcare may cause more problems for the healthcare industry than solutions, and in short, when it comes to the use of data in healthcare, "size isn't everything."

  13. Big Game Reporting Stations

    Data.gov (United States)

    Vermont Center for Geographic Information — Point locations of big game reporting stations. Big game reporting stations are places where hunters can legally report harvested deer, bear, or turkey. These are...

  14. Social big data mining

    CERN Document Server

    Ishikawa, Hiroshi

    2015-01-01

    Social Media. Big Data and Social Data. Hypotheses in the Era of Big Data. Social Big Data Applications. Basic Concepts in Data Mining. Association Rule Mining. Clustering. Classification. Prediction. Web Structure Mining. Web Content Mining. Web Access Log Mining, Information Extraction and Deep Web Mining. Media Mining. Scalability and Outlier Detection.

  15. Five Big Ideas

    Science.gov (United States)

    Morgan, Debbie

    2012-01-01

    Designing quality continuing professional development (CPD) for those teaching mathematics in primary schools is a challenge. If the CPD is to be built on the scaffold of five big ideas in mathematics, what might be these five big ideas? Might it just be a case of, if you tell me your five big ideas, then I'll tell you mine? Here, there is…

  16. What makes Big Data, Big Data? Exploring the ontological characteristics of 26 datasets

    Directory of Open Access Journals (Sweden)

    Rob Kitchin

    2016-02-01

    Big Data has been variously defined in the literature. In the main, definitions suggest that Big Data possesses a suite of key traits: volume, velocity and variety (the 3Vs), but also exhaustivity, resolution, indexicality, relationality, extensionality and scalability. However, these definitions lack ontological clarity, with the term acting as an amorphous, catch-all label for a wide selection of data. In this paper, we consider the question ‘what makes Big Data, Big Data?’, applying Kitchin’s taxonomy of seven Big Data traits to 26 datasets drawn from seven domains, each of which is considered in the literature to constitute Big Data. The results demonstrate that only a handful of datasets possess all seven traits, and some do not possess volume and/or variety. Instead, there are multiple forms of Big Data. Our analysis reveals that the key definitional boundary markers are the traits of velocity and exhaustivity. We contend that Big Data as an analytical category needs to be unpacked, with the genus of Big Data further delineated and its various species identified. It is only through such ontological work that we will gain conceptual clarity about what constitutes Big Data, formulate how best to make sense of it, and identify how it might best be used to make sense of the world.

  17. Flood Loss Models and Risk Analysis for Private Households in Can Tho City, Vietnam

    National Research Council Canada - National Science Library

    Do Thi Chinh; Nguyen Viet Dung; Animesh K Gain; Heidi Kreibich

    2017-01-01

    .... To improve flood risk analyses for Vietnam, this study presents novel multi-variable flood loss models for residential buildings and contents and demonstrates their application in a flood risk...

  18. Cincinnati Big Area Additive Manufacturing (BAAM)

    Energy Technology Data Exchange (ETDEWEB)

    Duty, Chad E. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Love, Lonnie J. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2015-03-04

    Oak Ridge National Laboratory (ORNL) worked with Cincinnati Incorporated (CI) to demonstrate Big Area Additive Manufacturing which increases the speed of the additive manufacturing (AM) process by over 1000X, increases the size of parts by over 10X and shows a cost reduction of over 100X. ORNL worked with CI to transition the Big Area Additive Manufacturing (BAAM) technology from a proof-of-principle (TRL 2-3) demonstration to a prototype product stage (TRL 7-8).

  19. Big data computing

    CERN Document Server

    Akerkar, Rajendra

    2013-01-01

    Due to market forces and technological evolution, Big Data computing is developing at an increasing rate. A wide variety of novel approaches and tools have emerged to tackle the challenges of Big Data, creating both more opportunities and more challenges for students and professionals in the field of data computation and analysis. Presenting a mix of industry cases and theory, Big Data Computing discusses the technical and practical issues related to Big Data in intelligent information management. Emphasizing the adoption and diffusion of Big Data tools and technologies in industry, the book i

  20. Microsoft big data solutions

    CERN Document Server

    Jorgensen, Adam; Welch, John; Clark, Dan; Price, Christopher; Mitchell, Brian

    2014-01-01

    Tap the power of Big Data with Microsoft technologies Big Data is here, and Microsoft's new Big Data platform is a valuable tool to help your company get the very most out of it. This timely book shows you how to use HDInsight along with HortonWorks Data Platform for Windows to store, manage, analyze, and share Big Data throughout the enterprise. Focusing primarily on Microsoft and HortonWorks technologies but also covering open source tools, Microsoft Big Data Solutions explains best practices, covers on-premises and cloud-based solutions, and features valuable case studies. Best of all,

  1. Big Data approaches for the analysis of large-scale fMRI data using Apache Spark and GPU processing: A demonstration on resting-state fMRI data from the Human Connectome Project

    Directory of Open Access Journals (Sweden)

    Roland N Boubela

    2016-01-01

    Full Text Available Technologies for scalable analysis of very large datasets have emerged in the domain of internet computing, but are still only rarely used in neuroimaging despite the existence of data and research questions in need of efficient computation tools especially in fMRI. In this work, we present software tools for the application of Apache Spark and Graphics Processing Units to neuroimaging datasets, in particular providing distributed file input for 4D NIfTI fMRI datasets in Scala for use in an Apache Spark environment. Examples for using this Big Data platform in graph analysis of fMRI datasets are shown to illustrate how processing pipelines employing it can be developed. With more tools for the convenient integration of neuroimaging file formats and typical processing steps, big data technologies could find wider endorsement in the community, leading to a range of potentially useful applications especially in view of the current collaborative creation of a wealth of large data repositories including thousands of individual fMRI datasets.
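The pipeline described above (distributed ingestion of 4D NIfTI volumes followed by graph-style analysis of voxel time series) can be illustrated in a few lines. The sketch below is not the authors' Scala tooling; it is a minimal Python approximation, assuming nibabel and pyspark are installed, with a hypothetical file name and an arbitrary correlation threshold.

```python
# Minimal sketch: distribute voxel time series from a 4D NIfTI file across
# a Spark cluster and count strongly correlated voxel pairs (graph edges).
import itertools
import numpy as np
import nibabel as nib
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("fmri-graph-sketch").getOrCreate()
sc = spark.sparkContext

img = nib.load("subject01_rest.nii.gz")   # hypothetical 4D resting-state volume
data = img.get_fdata()                    # shape (x, y, z, t)
mask = data.std(axis=3) > 0               # keep voxels with signal variance
voxels = [(idx, data[idx]) for idx in zip(*np.nonzero(mask))]

# Edges of a functional connectivity graph: pairwise Pearson correlations.
pairs = sc.parallelize(list(itertools.combinations(voxels[:300], 2)))  # subsample for brevity
edges = pairs.map(lambda p: (p[0][0], p[1][0],
                             float(np.corrcoef(p[0][1], p[1][1])[0, 1])))
strong = edges.filter(lambda e: abs(e[2]) > 0.8).count()
print(f"{strong} voxel pairs above |r| = 0.8")
```

In practice the distributed file input the authors describe would replace the driver-side nibabel load, so the full volume never passes through a single node.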

  2. Big Data Approaches for the Analysis of Large-Scale fMRI Data Using Apache Spark and GPU Processing: A Demonstration on Resting-State fMRI Data from the Human Connectome Project

    Science.gov (United States)

    Boubela, Roland N.; Kalcher, Klaudius; Huf, Wolfgang; Našel, Christian; Moser, Ewald

    2016-01-01

    Technologies for scalable analysis of very large datasets have emerged in the domain of internet computing, but are still rarely used in neuroimaging despite the existence of data and research questions in need of efficient computation tools especially in fMRI. In this work, we present software tools for the application of Apache Spark and Graphics Processing Units (GPUs) to neuroimaging datasets, in particular providing distributed file input for 4D NIfTI fMRI datasets in Scala for use in an Apache Spark environment. Examples for using this Big Data platform in graph analysis of fMRI datasets are shown to illustrate how processing pipelines employing it can be developed. With more tools for the convenient integration of neuroimaging file formats and typical processing steps, big data technologies could find wider endorsement in the community, leading to a range of potentially useful applications especially in view of the current collaborative creation of a wealth of large data repositories including thousands of individual fMRI datasets. PMID:26778951

  3. Characterizing Big Data Management

    Directory of Open Access Journals (Sweden)

    Rogério Rossi

    2015-06-01

Full Text Available Big data management is a reality for an increasing number of organizations in many areas and represents a set of challenges involving big data modeling, storage and retrieval, analysis and visualization. However, technological resources, people and processes are crucial to facilitate the management of big data in any kind of organization, allowing information and knowledge from a large volume of data to support decision-making. Big data management can be supported by three dimensions: technology, people and processes. Hence, this article discusses these dimensions: the technological dimension, which is related to storage, analytics and visualization of big data; the human aspects of big data; and the process management dimension, which addresses the aspects of big data management from a combined technological and business approach.

  4. Using LiDAR surveys to document floods: A case study of the 2008 Iowa flood

    Science.gov (United States)

    Chen, Bo; Krajewski, Witold F.; Goska, Radek; Young, Nathan

    2017-10-01

Can we use Light Detection and Ranging (LiDAR), an emergent remote sensing technology with wide applications, to document floods with high accuracy? To explore the feasibility of this application, we propose a method to extract distributed inundation depths from a LiDAR survey conducted during flooding. This method consists of three steps: (1) collecting LiDAR data during flooding; (2) classifying the LiDAR observational points as flooded water surface points and non-flooded points, and generating a floodwater surface elevation model; and (3) subtracting the bare earth Digital Terrain Model (DTM) from the flood surface elevation model to obtain a flood depth map. We applied this method to the 2008 Iowa flood in the United States and evaluated the results using the high-water mark measurements, flood extent extracted from SPOT (Satellite Pour l'Observation de la Terre) imagery, and the near-simultaneously acquired aerial photography. The root mean squared error of the LiDAR-derived floodwater surface profile to high-water marks was 30 cm, the consistency between the two flooded areas derived from LiDAR and SPOT imagery was 72% (81% if suspicious isolated ponds in the SPOT-derived extent were removed), and LiDAR-derived flood extent had a horizontal resolution of ∼3 m. This work demonstrates that LiDAR technology has the potential to provide calibration and validation reference data with appreciable accuracy for improved flood inundation modeling.
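Step (3) of the method, differencing two co-registered rasters, is straightforward to express. The following is a minimal sketch rather than the authors' code, assuming rasterio is available and that the two hypothetical GeoTIFFs share the same grid and extent.

```python
# Minimal sketch of step (3): subtract the bare-earth DTM from the
# interpolated floodwater surface to obtain a flood depth map.
import rasterio

with rasterio.open("flood_surface_elevation.tif") as src:  # hypothetical file
    water_surface = src.read(1)
    profile = src.profile
with rasterio.open("bare_earth_dtm.tif") as src:           # hypothetical file
    dtm = src.read(1)

depth = water_surface - dtm
depth[depth < 0] = 0   # negative differences mean dry ground, not flooding

with rasterio.open("flood_depth.tif", "w", **profile) as dst:
    dst.write(depth.astype(profile["dtype"]), 1)
```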

  5. HARNESSING BIG DATA VOLUMES

    Directory of Open Access Journals (Sweden)

    Bogdan DINU

    2014-04-01

Full Text Available Big Data can revolutionize humanity. Hidden within the huge amounts and variety of the data we are creating, we may find information, facts, social insights and benchmarks that were once virtually impossible to find or simply did not exist. Large volumes of data allow organizations to tap in real time the full potential of all the internal or external information they possess. Big data calls for quick decisions and innovative ways to assist customers and society as a whole. Big data platforms and product portfolios will help customers harness the full value of big data volumes. This paper deals with technical and technological issues related to handling big data volumes in the Big Data environment.

  6. Rapid Exposure Assessment of Nationwide River Flood for Disaster Risk Reduction

    Science.gov (United States)

    Kwak, Y.; Park, J.; Arifuzzaman, B.; Iwami, Y.; Amirul, Md.; Kondoh, A.

    2016-06-01

… considerably increased. For flood disaster risk reduction, it is important to identify and characterize flood areas, locations (particularly lowlands along rivers), and durations. For this purpose, flood mapping and monitoring are an imperative process and a fundamental part of risk management as well as emergency response. Our ultimate goal is to detect flood inundation areas at a nationwide scale despite the limitations of optical and multispectral images, and to estimate flood risk in terms of affected people. We propose a methodological possibility to be used as a standard approach for nationwide rapid flood exposure assessment with the use of the multi-temporal Moderate Resolution Imaging Spectrometer (MODIS), a big contributor to progress in near-real-time flood mapping. The preliminary results in Bangladesh show that the propensity of flood risk to change strongly depends on the temporal and spatial dynamics of exposure, such as the distributed population.
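The core of such a rapid exposure assessment is an overlay of a satellite-derived flood mask on a gridded population surface. A toy sketch follows, with invented arrays standing in for co-registered MODIS-derived and population rasters.

```python
# Toy exposure assessment: count people in pixels flagged as inundated.
import numpy as np

flood_mask = np.array([[0, 1, 1],
                       [0, 1, 0],
                       [0, 0, 0]], dtype=bool)   # True = inundated pixel
population = np.array([[120,  80, 200],
                       [ 50, 300,  10],
                       [  0,  40,  60]])         # people per pixel (invented)

exposed = int(population[flood_mask].sum())
print(f"estimated exposed population: {exposed}")
```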

  7. Summary big data

    CERN Document Server

    2014-01-01

This work offers a summary of the book "Big Data: A Revolution That Will Transform How We Live, Work, and Think" by Viktor Mayer-Schönberger and Kenneth Cukier. The summary explains that big data is where we use huge quantities of data to make better predictions, based on the fact that we identify patterns in the data rather than trying to understand the underlying causes in more detail. This summary highlights that big data will be a source of new economic value and innovation in the future. Moreover, it shows that it will

  8. Flooding and Schools

    Science.gov (United States)

    National Clearinghouse for Educational Facilities, 2011

    2011-01-01

    According to the Federal Emergency Management Agency, flooding is the nation's most common natural disaster. Some floods develop slowly during an extended period of rain or in a warming trend following a heavy snow. Flash floods can occur quickly, without any visible sign of rain. Catastrophic floods are associated with burst dams and levees,…

  9. Bliver big data til big business?

    DEFF Research Database (Denmark)

    Ritter, Thomas

    2015-01-01

Denmark has a digital infrastructure, a culture of record-keeping, and IT-competent employees and customers, which make a leading position possible, but only if companies ready themselves for the next big data wave.

  10. Big Boss Interval Games

    NARCIS (Netherlands)

    Alparslan-Gok, S.Z.; Brânzei, R.; Tijs, S.H.

    2008-01-01

    In this paper big boss interval games are introduced and various characterizations are given. The structure of the core of a big boss interval game is explicitly described and plays an important role relative to interval-type bi-monotonic allocation schemes for such games. Specifically, each element

  11. Big Ideas in Art

    Science.gov (United States)

    Day, Kathleen

    2008-01-01

    In this article, the author shares how she was able to discover some big ideas about art education. She relates how she found great ideas to improve her teaching from the book "Rethinking Curriculum in Art." She also shares how she designed a "Big Idea" unit in her class.

  12. Big data for health.

    Science.gov (United States)

    Andreu-Perez, Javier; Poon, Carmen C Y; Merrifield, Robert D; Wong, Stephen T C; Yang, Guang-Zhong

    2015-07-01

    This paper provides an overview of recent developments in big data in the context of biomedical and health informatics. It outlines the key characteristics of big data and how medical and health informatics, translational bioinformatics, sensor informatics, and imaging informatics will benefit from an integrated approach of piecing together different aspects of personalized information from a diverse range of data sources, both structured and unstructured, covering genomics, proteomics, metabolomics, as well as imaging, clinical diagnosis, and long-term continuous physiological sensing of an individual. It is expected that recent advances in big data will expand our knowledge for testing new hypotheses about disease management from diagnosis to prevention to personalized treatment. The rise of big data, however, also raises challenges in terms of privacy, security, data ownership, data stewardship, and governance. This paper discusses some of the existing activities and future opportunities related to big data for health, outlining some of the key underlying issues that need to be tackled.

  13. Global Fluctuation Spectra in Big Crunch/Big Bang String Vacua

    CERN Document Server

    Craps, B; Craps, Ben; Ovrut, Burt A.

    2004-01-01

We study Big Crunch/Big Bang cosmologies that correspond to exact world-sheet superconformal field theories of type II strings. The string theory spacetime contains a Big Crunch and a Big Bang cosmology, as well as additional ``whisker'' asymptotic and intermediate regions. Within the context of free string theory, we compute, unambiguously, the scalar fluctuation spectrum in all regions of spacetime. Generically, the Big Crunch fluctuation spectrum is altered while passing through the bounce singularity. The change in the spectrum is characterized by a function $\Delta$, which is momentum and time-dependent. We compute $\Delta$ explicitly and demonstrate that it arises from the whisker regions. The whiskers are also shown to lead to ``entanglement'' entropy in the Big Bang region. Finally, in the Milne orbifold limit of our superconformal vacua, we show that $\Delta \to 1$ and, hence, the fluctuation spectrum is unaltered by the Big Crunch/Big Bang singularity. We comment on, but do not attempt to resolve, su...

  14. Big data, big knowledge: big data for personalized healthcare.

    Science.gov (United States)

    Viceconti, Marco; Hunter, Peter; Hose, Rod

    2015-07-01

    The idea that the purely phenomenological knowledge that we can extract by analyzing large amounts of data can be useful in healthcare seems to contradict the desire of VPH researchers to build detailed mechanistic models for individual patients. But in practice no model is ever entirely phenomenological or entirely mechanistic. We propose in this position paper that big data analytics can be successfully combined with VPH technologies to produce robust and effective in silico medicine solutions. In order to do this, big data technologies must be further developed to cope with some specific requirements that emerge from this application. Such requirements are: working with sensitive data; analytics of complex and heterogeneous data spaces, including nontextual information; distributed data management under security and performance constraints; specialized analytics to integrate bioinformatics and systems biology information with clinical observations at tissue, organ and organisms scales; and specialized analytics to define the "physiological envelope" during the daily life of each patient. These domain-specific requirements suggest a need for targeted funding, in which big data technologies for in silico medicine becomes the research priority.

  15. Big data a primer

    CERN Document Server

    Bhuyan, Prachet; Chenthati, Deepak

    2015-01-01

    This book is a collection of chapters written by experts on various aspects of big data. The book aims to explain what big data is and how it is stored and used. The book starts from  the fundamentals and builds up from there. It is intended to serve as a review of the state-of-the-practice in the field of big data handling. The traditional framework of relational databases can no longer provide appropriate solutions for handling big data and making it available and useful to users scattered around the globe. The study of big data covers a wide range of issues including management of heterogeneous data, big data frameworks, change management, finding patterns in data usage and evolution, data as a service, service-generated data, service management, privacy and security. All of these aspects are touched upon in this book. It also discusses big data applications in different domains. The book will prove useful to students, researchers, and practicing database and networking engineers.

  16. Big Data in industry

    Science.gov (United States)

    Latinović, T. S.; Preradović, D. M.; Barz, C. R.; Latinović, M. T.; Petrica, P. P.; Pop-Vadean, A.

    2016-08-01

The amount of data at the global level has grown exponentially. Along with this phenomenon comes the need for new units of measure, such as the exabyte, zettabyte, and yottabyte, to express the amount of data. The growth of data creates a situation where the classic systems for the collection, storage, processing, and visualization of data are losing the battle against the volume, velocity, and variety of data that is generated continuously. Much of this data is created by the Internet of Things, IoT (cameras, satellites, cars, GPS navigation, etc.). It is our challenge to come up with new technologies and tools for the management and exploitation of these large amounts of data. Big Data has been a hot topic in recent years in IT circles; however, it is also recognized in the business world, and increasingly in public administration. This paper proposes an ontology of big data analytics and examines how to enhance business intelligence through big data analytics as a service by presenting a big data analytics service-oriented architecture. This paper also discusses the interrelationship between business intelligence and big data analytics. The proposed approach might facilitate the research and development of business analytics, big data analytics, and business intelligence as well as intelligent agents.

  17. Millenial scale changes in flood magnitude and frequency and the role of changes in channel adjustment.

    Science.gov (United States)

    Croke, Jacky; Thompson, Christopher; Denham, Robert; Haines, Heather; Sharma, Ashneel; Pietsch, Timothy

    2016-04-01

With access to only limited gauging records (~37 years in eastern Australia), Australia, like many parts of the globe, is heavily constrained in its ability to meaningfully predict the magnitude and frequency of extreme flood events. Flood inundation data gathered during recent floods (2011 and 2013) now form an essential insight into how landscapes may respond to future floods and a guide for planning and policy. This study presents the first single-catchment flood reconstruction analyses in a region of recognised hydrological variability, as characterised by alternating extremes of floods and droughts. The resultant 'Big Flood' dataset consists of a unique combination of high-resolution topographic data on landscape changes during recent floods, and a detailed reconstruction of both the timing and estimated magnitude of past flood events derived using OSL dating of flood deposits from a range of sedimentary environments. While distinct flood and drought 'phases' are recognisable over the timescale of several thousand years, the extent to which these reflect changes in flood magnitude and/or frequency remains complicated by catchment-specific geomorphology. Issues of flood sample preservation are discussed in this talk within the context of geomorphic setting and the notably non-linear variations in the capacity for channel adjustment. This talk outlines the key factors that must be considered in evaluating the role of climate, land-use change and geomorphology in informing flood risk management in Queensland.

  18. Big data as a new approach in emergency medicine research

    Institute of Scientific and Technical Information of China (English)

Ho Ting Wong; Qian Yin; Ying Qi Guo; Kristen Murray; Dong Hau Zhou; Diana Slade

    2015-01-01

    Big data is a hot topic in the academic sector, and healthcare researchers are definitely not an exception. This article aims to provide a showcase in emergency medicine research to demonstrate the advantages of conducting such research using big data. Big data is a new and cost-effective research approach, and emergency medicine researchers could benefit from using this approach and by doing so producing high-quality research at a faster pace.

  19. Big data as a new approach in emergency medicine research

    Directory of Open Access Journals (Sweden)

    Ho Ting Wong

    2015-08-01

    Full Text Available Big data is a hot topic in the academic sector, and healthcare researchers are definitely not an exception. This article aims to provide a showcase in emergency medicine research to demonstrate the advantages of conducting such research using big data. Big data is a new and cost-effective research approach, and emergency medicine researchers could benefit from using this approach and by doing so producing high-quality research at a faster pace.

  20. Recht voor big data, big data voor recht

    NARCIS (Netherlands)

    Lafarre, Anne

Big data is a phenomenon in our society that can no longer be ignored. It is past the hype cycle, and the first implementations of big data techniques are being carried out. But what exactly is big data? What do the five V's that are often mentioned in relation to big data entail? As an introduction to

  1. Recht voor big data, big data voor recht

    NARCIS (Netherlands)

    Lafarre, Anne

    2016-01-01

Big data is a phenomenon in our society that can no longer be ignored. It is past the hype cycle, and the first implementations of big data techniques are being carried out. But what exactly is big data? What do the five V's that are often mentioned in relation to big data entail? As an introduction to this

  2. Big data for dummies

    CERN Document Server

    Hurwitz, Judith; Halper, Fern; Kaufman, Marcia

    2013-01-01

    Find the right big data solution for your business or organization Big data management is one of the major challenges facing business, industry, and not-for-profit organizations. Data sets such as customer transactions for a mega-retailer, weather patterns monitored by meteorologists, or social network activity can quickly outpace the capacity of traditional data management tools. If you need to develop or manage big data solutions, you'll appreciate how these four experts define, explain, and guide you through this new and often confusing concept. You'll learn what it is, why it m

  3. Assessing Big Data

    DEFF Research Database (Denmark)

    Leimbach, Timo; Bachlechner, Daniel

    2015-01-01

In recent years, big data has been one of the most controversially discussed technologies in terms of its possible positive and negative impact. Therefore, the need for technology assessments is obvious. This paper first provides, based on the results of a technology assessment study, an overview of the potential and challenges associated with big data and then describes the problems experienced during the study as well as methods found helpful to address them. The paper concludes with reflections on how the insights from the technology assessment study may have an impact on the future governance of big data.

  4. Inhomogeneous Big Bang Cosmology

    CERN Document Server

    Wagh, S M

    2002-01-01

    In this letter, we outline an inhomogeneous model of the Big Bang cosmology. For the inhomogeneous spacetime used here, the universe originates in the infinite past as the one dominated by vacuum energy and ends in the infinite future as the one consisting of "hot and relativistic" matter. The spatial distribution of matter in the considered inhomogeneous spacetime is {\\em arbitrary}. Hence, observed structures can arise in this cosmology from suitable "initial" density contrast. Different problems of the standard model of Big Bang cosmology are also resolved in the present inhomogeneous model. This inhomogeneous model of the Big Bang Cosmology predicts "hot death" for the universe.

  5. Assessing Big Data

    DEFF Research Database (Denmark)

    Leimbach, Timo; Bachlechner, Daniel

    2015-01-01

In recent years, big data has been one of the most controversially discussed technologies in terms of its possible positive and negative impact. Therefore, the need for technology assessments is obvious. This paper first provides, based on the results of a technology assessment study, an overview of the potential and challenges associated with big data and then describes the problems experienced during the study as well as methods found helpful to address them. The paper concludes with reflections on how the insights from the technology assessment study may have an impact on the future governance of big data.

  6. Flood simulation and verification with IoT sensors

    Science.gov (United States)

    Chang, Che-Hao; Hsu, Chih-Tsung; Wu, Shiang-Jen; Huang, Sue-Wei

    2017-04-01

2D dynamic flood simulation is a vivid tool to demonstrate the possible exposure area that sustains the impact of a high rise in water level. Along with progress in high-resolution digital terrain models, the simulation results are quite convincing, yet not proven to be close to what really happened. Due to its dynamic and uncertain essence, the exposure area usually cannot be well defined during a flood event. Recent developments in IoT sensors bring low-power and long-distance communication, which helps us to collect real-time flood depths. With these time series of flood depths at different locations, we are capable of verifying the simulation results corresponding to the flood event. Sixteen flood gauges with IoT specifications, as well as two flood events in the Annan district of Tainan City, Taiwan, are examined in this study. During the event of 11 June 2016, 12 flood gauges worked well and 8 of them provided observations matching the simulation.
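The verification step amounts to comparing gauge observations with simulated depths at matching locations and times; a common summary statistic is the root-mean-square error. A minimal sketch with invented numbers:

```python
# Compare observed flood depths from IoT gauges against simulated depths
# at the same locations/times. Arrays are hypothetical stand-ins.
import numpy as np

observed  = np.array([0.12, 0.35, 0.60, 0.48, 0.20])   # metres, from gauges
simulated = np.array([0.10, 0.30, 0.66, 0.50, 0.25])   # metres, from 2D model

rmse = float(np.sqrt(np.mean((observed - simulated) ** 2)))
bias = float(np.mean(simulated - observed))
print(f"RMSE = {rmse:.3f} m, bias = {bias:+.3f} m")
```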

  7. South China Flooded

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

Vehicles traverse a flooded street in Liuzhou, Guangxi Zhuang Autonomous Region, on May 19. Heavy rainstorms repeatedly struck China this month, triggering floods, mudflows and landslides. Hunan, Guangdong and Jiangxi provinces and Chongqing Municipality were the worst hit.

  8. Base Flood Elevation

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — The National Flood Hazard Layer (NFHL) data incorporates all Digital Flood Insurance Rate Map(DFIRM) databases published by FEMA, and any Letters Of Map Revision...

  9. Flood Control Structures

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — The National Flood Hazard Layer (NFHL) data incorporates all Digital Flood Insurance Rate Map(DFIRM) databases published by FEMA, and any Letters Of Map Revision...

  10. Flooding: Prioritizing protection?

    Science.gov (United States)

    Peduzzi, Pascal

    2017-09-01

    With climate change, urban development and economic growth, more assets and infrastructures will be exposed to flooding. Now research shows that investments in flood protection are globally beneficial, but have varied levels of benefit locally.

  11. Flood Hazard Area

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — The National Flood Hazard Layer (NFHL) data incorporates all Digital Flood Insurance Rate Map(DFIRM) databases published by FEMA, and any Letters Of Map Revision...

  12. Flood Hazard Boundaries

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — The National Flood Hazard Layer (NFHL) data incorporates all Digital Flood Insurance Rate Map(DFIRM) databases published by FEMA, and any Letters Of Map Revision...

  13. Characterization of remarkable floods in France, a transdisciplinary approach applied on generalized floods of January 1910

    Science.gov (United States)

    Boudou, Martin; Lang, Michel; Vinet, Freddy; Coeur, Denis

    2014-05-01

… The January 1910 flood is one of these remarkable floods. This event is foremost known for its impacts on the Seine basin, where the flood remains the strongest recorded in Paris since 1658. However, its impacts also extended to France's eastern regions (Martin, 2001). To demonstrate the value of the evaluation grid, we propose an in-depth analysis of the 1910 river flood incorporating historical documentation. The approach focuses on eastern France, where the flood remains the highest recorded for several rivers but was often neglected by scientists in favour of the Paris flood. Through transdisciplinary research based on the evaluation grid method, we describe the January 1910 flood event and explain why it can be considered a remarkable flood for these regions.

  14. The Big Chills

    Science.gov (United States)

    Bond, G. C.; Dwyer, G. S.; Bauch, H. A.

    2002-12-01

    At the end of the last glacial, the Earth's climate system abruptly shifted into the Younger Dryas, a 1500-year long cold snap known in the popular media as the Big Chill. Following an abrupt warming ending the Younger Dryas about 11,600 years ago, the climate system has remained in an interglacial state, thought to have been relatively stable and devoid, with possibly one or two exceptions, of abrupt climate change. A growing amount of evidence suggests that this benign view of interglacial climate is incorrect. High resolution records of North Atlantic ice rafted sediment, now regarded as evidence of extreme multiyear sea ice drift, reveal abrupt shifts on centennial and millennial time scales. These have been traced from the end of the Younger Dryas to the present, revealing evidence of significant climate variability through all of the last two millennia. Correlatives of these events have been found in drift ice records from the Arctic's Laptev Sea, in the isotopic composition of North Grip ice, and in dissolved K from the GISP2 ice core, attesting to their regional extent and imprint in proxies of very different origins. Measurements of Mg/Ca ratios in planktic foraminifera over the last two millennia in the eastern North Atlantic demonstrate that increases in drifting multiyear sea ice were accompanied by abrupt decreases in sea surface temperatures, especially during the Little Ice Age. Estimated rates of temperature change are on the order of two degrees centigrade, more than thirty percent of the regional glacial to interglacial change, within a few decades. When compared at the same resolution, these interglacial variations are as abrupt as the last glacial's Dansgaard-Oeschger cycles. The interglacial abrupt changes are especially striking because they occurred within the core of the warm North Atlantic Current. The changes may have been triggered by variations in solar irradiance, but if so their large magnitude and regional extent requires amplifying

  15. Big data opportunities and challenges

    CERN Document Server

    2014-01-01

    This ebook aims to give practical guidance for all those who want to understand big data better and learn how to make the most of it. Topics range from big data analysis, mobile big data and managing unstructured data to technologies, governance and intellectual property and security issues surrounding big data.

  16. Big Data in der Cloud

    DEFF Research Database (Denmark)

    Leimbach, Timo; Bachlechner, Daniel

    2014-01-01

Technology assessment of big data, in particular cloud based big data services, for the Office for Technology Assessment at the German federal parliament (Bundestag).

  17. Flood Risk Regional Flood Defences: Technical report

    NARCIS (Netherlands)

    Lendering, K.T.

    2015-01-01

Historically, the Netherlands have always had to deal with the threat of flooding, both from the rivers and the sea as well as from heavy rainfall. The country consists of a large number of polders, which are low-lying areas of land protected from flooding by embankments. These polders require an

  18. Flood Risk Regional Flood Defences: Technical report

    NARCIS (Netherlands)

    Lendering, K.T.

    2015-01-01

Historically, the Netherlands have always had to deal with the threat of flooding, both from the rivers and the sea as well as from heavy rainfall. The country consists of a large number of polders, which are low-lying areas of land protected from flooding by embankments. These polders require an ext

  19. Flood Foresight: A near-real time flood monitoring and forecasting tool for rapid and predictive flood impact assessment

    Science.gov (United States)

    Revilla-Romero, Beatriz; Shelton, Kay; Wood, Elizabeth; Berry, Robert; Bevington, John; Hankin, Barry; Lewis, Gavin; Gubbin, Andrew; Griffiths, Samuel; Barnard, Paul; Pinnell, Marc; Huyck, Charles

    2017-04-01

The hours and days immediately after a major flood event are often chaotic and confusing, with first responders rushing to mobilise emergency responders, provide alleviation assistance and assess loss to assets of interest (e.g., population, buildings or utilities). Preparations in advance of a forthcoming event are becoming increasingly important; early warning systems have been demonstrated to be useful tools for decision makers. The extent of damage, human casualties and economic loss estimates can vary greatly during an event, and the timely availability of an accurate flood extent allows emergency response and resources to be optimised, reduces impacts, and helps prioritise recovery. In the insurance sector, for example, insurers are under pressure to respond in a proactive manner to claims rather than waiting for policyholders to report losses. Even though there is a great demand for flood inundation extents and severity information in different sectors, generating flood footprints for large areas from hydraulic models in real time remains a challenge. While such footprints can be produced in real time using remote sensing, weather conditions and sensor availability limit their ability to capture every single flood event across the globe. In this session, we will present Flood Foresight (www.floodforesight.com), an operational tool developed to meet the universal requirement for rapid geographic information, before, during and after major riverine flood events. The tool provides spatial data with which users can measure their current or predicted impact from an event - at building, basin, national or continental scales. Within Flood Foresight, the Screening component uses global rainfall predictions to provide a regional- to continental-scale view of heavy rainfall events up to a week in advance, alerting the user to potentially hazardous situations relevant to them. The Forecasting component enhances the predictive suite of tools by providing a local

  20. Reframing Open Big Data

    DEFF Research Database (Denmark)

    Marton, Attila; Avital, Michel; Jensen, Tina Blegind

    2013-01-01

Recent developments in the techniques and technologies of collecting, sharing and analysing data are challenging the field of information systems (IS) research, let alone the boundaries of organizations and the established practices of decision-making. Coined ‘open data’ and ‘big data’, these developments introduce an unprecedented level of societal and organizational engagement with the potential of computational data to generate new insights and information. Based on the commonalities shared by open data and big data, we develop a research framework that we refer to as open big data (OBD) by employing the dimensions of ‘order’ and ‘relationality’. We argue that these dimensions offer a viable approach for IS research on open and big data because they address one of the core value propositions of IS; i.e. how to support organizing with computational data. We contrast these dimensions with two...

  1. Big Creek Pit Tags

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The BCPITTAGS database is used to store data from an Oncorhynchus mykiss (steelhead/rainbow trout) population dynamics study in Big Creek, a coastal stream along the...

  2. Big Data Analytics

    Indian Academy of Sciences (India)

    2016-08-01

The volume and variety of data being generated using computers is doubling every two years. It is estimated that in 2015, 8 zettabytes (zetta = 10^21) were generated, which consisted mostly of unstructured data such as emails, blogs, Twitter, Facebook posts, images, and videos. This is called big data. It is possible to analyse such huge data collections with clusters of thousands of inexpensive computers to discover patterns in the data that have many applications. But analysing massive amounts of data available on the Internet has the potential of impinging on our privacy. Inappropriate analysis of big data can lead to misleading conclusions. In this article, we explain what big data is, how it is analysed, and give some case studies illustrating the potentials and pitfalls of big data analytics.

  3. Reframing Open Big Data

    DEFF Research Database (Denmark)

    Marton, Attila; Avital, Michel; Jensen, Tina Blegind

    2013-01-01

Recent developments in the techniques and technologies of collecting, sharing and analysing data are challenging the field of information systems (IS) research, let alone the boundaries of organizations and the established practices of decision-making. Coined ‘open data’ and ‘big data’, these developments introduce an unprecedented level of societal and organizational engagement with the potential of computational data to generate new insights and information. Based on the commonalities shared by open data and big data, we develop a research framework that we refer to as open big data (OBD) by employing the dimensions of ‘order’ and ‘relationality’. We argue that these dimensions offer a viable approach for IS research on open and big data because they address one of the core value propositions of IS; i.e. how to support organizing with computational data. We contrast these dimensions with two...

  4. Big Data Revisited

    DEFF Research Database (Denmark)

    Kallinikos, Jannis; Constantiou, Ioanna

    2015-01-01

We elaborate on key issues of our paper New games, new rules: big data and the changing context of strategy as a means of addressing some of the concerns raised by the paper’s commentators. We initially deal with the issue of social data and the role it plays in the current data revolution and the technological recording of facts. We further discuss the significance of the very mechanisms by which big data is produced as distinct from the very attributes of big data often discussed in the literature. In the final section of the paper, we qualify the alleged importance of algorithms and claim that the structures of data capture and the architectures in which data generation is embedded are fundamental to the phenomenon of big data.

  5. The Big Bang Singularity

    Science.gov (United States)

    Ling, Eric

    The big bang theory is a model of the universe which makes the striking prediction that the universe began a finite amount of time in the past at the so called "Big Bang singularity." We explore the physical and mathematical justification of this surprising result. After laying down the framework of the universe as a spacetime manifold, we combine physical observations with global symmetrical assumptions to deduce the FRW cosmological models which predict a big bang singularity. Next we prove a couple theorems due to Stephen Hawking which show that the big bang singularity exists even if one removes the global symmetrical assumptions. Lastly, we investigate the conditions one needs to impose on a spacetime if one wishes to avoid a singularity. The ideas and concepts used here to study spacetimes are similar to those used to study Riemannian manifolds, therefore we compare and contrast the two geometries throughout.

  6. Sharing big biomedical data.

    Science.gov (United States)

    Toga, Arthur W; Dinov, Ivo D

The promise of Big Biomedical Data may be offset by the enormous challenges in handling, analyzing, and sharing it. In this paper, we provide a framework for developing practical and reasonable data sharing policies that incorporate the sociological, financial, technical and scientific requirements of a sustainable Big Data dependent scientific community. Many biomedical and healthcare studies may be significantly impacted by using large, heterogeneous and incongruent datasets; however, there are significant technical, social, regulatory, and institutional barriers that must be addressed to ensure that the power of Big Data prevails over these detrimental factors. Pragmatic policies that demand extensive sharing of data, promotion of data fusion, provenance, interoperability and balance security and protection of personal information are critical for the long term impact of translational Big Data analytics.

  7. Boarding to Big data

    Directory of Open Access Journals (Sweden)

    Oana Claudia BRATOSIN

    2016-05-01

Full Text Available Today Big data is an emerging topic, as the quantity of information grows exponentially, laying the foundation for its main challenge: the value of the information. The information value is defined not only by extracting value from huge data sets as fast and optimally as possible, but also by extracting value from uncertain and inaccurate data in an innovative manner using Big data analytics. At this point, the main challenge for businesses that use Big data tools is to clearly define the scope and the necessary output of the business so that real value can be gained. This article aims to explain the Big data concept, its various classification criteria, its architecture, and its impact on worldwide processes.

  8. Beyond 'flood hotspots': Modelling emergency service accessibility during flooding in York, UK

    Science.gov (United States)

    Coles, Daniel; Yu, Dapeng; Wilby, Robert L.; Green, Daniel; Herring, Zara

    2017-03-01

This paper describes the development of a method that couples flood modelling with network analysis to evaluate the accessibility of city districts by emergency responders during flood events. We integrate numerical modelling of flood inundation with geographical analysis of service areas for the Ambulance Service and the Fire & Rescue Service. The method was demonstrated for two flood events in the City of York, UK to assess the vulnerability of care homes and sheltered accommodation. We determine the feasibility of emergency services gaining access within the statutory 8- and 10-min targets for high-priority, life-threatening incidents 75% of the time, during flood episodes. A hydrodynamic flood inundation model (FloodMap) simulates the 2014 pluvial and 2015 fluvial flood events. Predicted floods (with depth >25 cm and areas >100 m2) were overlain on the road network to identify sites with potentially restricted access. Accessibility of the city to emergency responders during flooding was quantified and mapped using: (i) spatial coverage from individual emergency nodes within the legislated timeframes, and (ii) response times from individual emergency service nodes to vulnerable care homes and sheltered accommodation under flood and non-flood conditions. Results show that, during the 2015 fluvial flood, the area covered by two of the three Fire & Rescue Service stations reduced by 14% and 39% respectively, while the remaining station needed to increase its coverage by 39%. This amounts to an overall reduction of 6% and 20% for modelled and observed floods respectively. During the 2014 surface water flood, 7 out of 22 care homes (32%) and 15 out of 43 sheltered accommodation nodes (35%) had modelled response times above the 8-min threshold from any Ambulance station. Overall, modelled surface water flooding has a larger spatial footprint than fluvial flood events. Hence, accessibility of emergency services may be impacted differently depending on flood mechanism.
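The network analysis step can be illustrated with a toy road graph: links that the inundation model marks as impassable (depth >25 cm) are removed before computing travel times from a station to a vulnerable site. The sketch below assumes networkx and uses invented nodes and travel times, not the York network.

```python
# Toy accessibility check: drop flooded road links, then test whether the
# shortest travel time still meets the 8-minute response target.
import networkx as nx

G = nx.Graph()
# (from, to, travel time in minutes, flooded?) -- invented road network
roads = [("station", "a", 3, False), ("a", "care_home", 4, False),
         ("station", "b", 2, False), ("b", "care_home", 5, True)]
for u, v, t, flooded in roads:
    if not flooded:                       # remove links with depth > 25 cm
        G.add_edge(u, v, time=t)

t = nx.shortest_path_length(G, "station", "care_home", weight="time")
print("within 8-min target" if t <= 8 else "target missed", f"({t} min)")
```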

  9. Hydrological forecast of maximal water level in Lepenica river basin and flood control measures

    Directory of Open Access Journals (Sweden)

    Milanović Ana

    2006-01-01

Full Text Available The Lepenica river basin territory has become the axis of economic and urban development of the Šumadija district. However, considering the Lepenica River with its tributaries and their disordered river regime, there is insufficient water for water supply and irrigation, while on the other hand, this area suffers big flood and torrent damage (especially the Kragujevac basin). The paper presents flood problems in the river basin, maximum water level forecasts, and flood control measures carried out until now. Some of the potential solutions, aiming to achieve effective flood control, are suggested as well.

  10. ANALYTICS OF BIG DATA

    OpenAIRE

    Asst. Prof. Shubhada Talegaon

    2014-01-01

Big Data analytics has started to impact all types of organizations, as it carries the potential power to extract embedded knowledge from big amounts of data and react to it in real time. Current technology enables us to efficiently store and query large datasets; the focus is now on techniques that make use of the complete data set, instead of sampling. This has tremendous implications in areas like machine learning, pattern recognition and classification, senti...

  11. Conociendo Big Data

    Directory of Open Access Journals (Sweden)

    Juan José Camargo-Vega

    2014-12-01

Full Text Available Given the importance the term Big Data has acquired, this research sought to study and analyze exhaustively the state of the art of Big Data; in addition, as a second objective, it analyzed the characteristics, tools, technologies, models and standards related to Big Data, and finally it sought to identify the most relevant characteristics in the management of Big Data, so that everything concerning the central topic of the research may be known. The methodology used included reviewing the state of the art of Big Data and presenting its current situation; covering Big Data technologies; presenting some of the NoSQL databases, which are the ones that allow processing data with unstructured formats; and showing the data models and the technologies for analyzing them, ending with some benefits of Big Data. The methodological design used for the research was non-experimental, since no variables are manipulated, and exploratory, because this research begins to explore the field of Big Data.

  12. Minsky on "Big Government"

    Directory of Open Access Journals (Sweden)

    Daniel de Santana Vasconcelos

    2014-03-01

Full Text Available This paper's objective is to assess, in light of the main works of Minsky, his view and analysis of what he called "Big Government" as that huge institution which, in parallel with the "Big Bank", was capable of ensuring stability in the capitalist system and regulating its inherently unstable financial system in the mid-20th century. In this work, we analyze how Minsky proposes an active role for the government in a complex economic system flawed by financial instability.

  13. Big data need big theory too

    Science.gov (United States)

    Dougherty, Edward R.; Highfield, Roger R.

    2016-01-01

    The current interest in big data, machine learning and data analytics has generated the widespread impression that such methods are capable of solving most problems without the need for conventional scientific methods of inquiry. Interest in these methods is intensifying, accelerated by the ease with which digitized data can be acquired in virtually all fields of endeavour, from science, healthcare and cybersecurity to economics, social sciences and the humanities. In multiscale modelling, machine learning appears to provide a shortcut to reveal correlations of arbitrary complexity between processes at the atomic, molecular, meso- and macroscales. Here, we point out the weaknesses of pure big data approaches with particular focus on biology and medicine, which fail to provide conceptual accounts for the processes to which they are applied. No matter their ‘depth’ and the sophistication of data-driven methods, such as artificial neural nets, in the end they merely fit curves to existing data. Not only do these methods invariably require far larger quantities of data than anticipated by big data aficionados in order to produce statistically reliable results, but they can also fail in circumstances beyond the range of the data used to train them because they are not designed to model the structural characteristics of the underlying system. We argue that it is vital to use theory as a guide to experimental design for maximal efficiency of data collection and to produce reliable predictive models and conceptual knowledge. Rather than continuing to fund, pursue and promote ‘blind’ big data projects with massive budgets, we call for more funding to be allocated to the elucidation of the multiscale and stochastic processes controlling the behaviour of complex systems, including those of life, medicine and healthcare. This article is part of the themed issue ‘Multiscale modelling at the physics–chemistry–biology interface’. PMID:27698035

  14. Big data need big theory too

    OpenAIRE

    Coveney, Peter V.; Dougherty, Edward R; Highfield, Roger R.

    2016-01-01

    The current interest in big data, machine learning and data analytics has generated the widespread impression that such methods are capable of solving most problems without the need for conventional scientific methods of inquiry. Interest in these methods is intensifying, accelerated by the ease with which digitized data can be acquired in virtually all fields of endeavour, from science, healthcare and cybersecurity to economics, social sciences and the humanities. In multiscale modelling, ma...

  15. Big data need big theory too.

    OpenAIRE

    Coveney, P. V.; Dougherty, E. R.; Highfield, R. R.

    2016-01-01

    The current interest in big data, machine learning and data analytics has generated the widespread impression that such methods are capable of solving most problems without the need for conventional scientific methods of inquiry. Interest in these methods is intensifying, accelerated by the ease with which digitized data can be acquired in virtually all fields of endeavour, from science, healthcare and cybersecurity to economics, social sciences and the humanities. In multiscale modelling, ma...

  16. Big data need big theory too.

    Science.gov (United States)

    Coveney, Peter V; Dougherty, Edward R; Highfield, Roger R

    2016-11-13

    The current interest in big data, machine learning and data analytics has generated the widespread impression that such methods are capable of solving most problems without the need for conventional scientific methods of inquiry. Interest in these methods is intensifying, accelerated by the ease with which digitized data can be acquired in virtually all fields of endeavour, from science, healthcare and cybersecurity to economics, social sciences and the humanities. In multiscale modelling, machine learning appears to provide a shortcut to reveal correlations of arbitrary complexity between processes at the atomic, molecular, meso- and macroscales. Here, we point out the weaknesses of pure big data approaches with particular focus on biology and medicine, which fail to provide conceptual accounts for the processes to which they are applied. No matter their 'depth' and the sophistication of data-driven methods, such as artificial neural nets, in the end they merely fit curves to existing data. Not only do these methods invariably require far larger quantities of data than anticipated by big data aficionados in order to produce statistically reliable results, but they can also fail in circumstances beyond the range of the data used to train them because they are not designed to model the structural characteristics of the underlying system. We argue that it is vital to use theory as a guide to experimental design for maximal efficiency of data collection and to produce reliable predictive models and conceptual knowledge. Rather than continuing to fund, pursue and promote 'blind' big data projects with massive budgets, we call for more funding to be allocated to the elucidation of the multiscale and stochastic processes controlling the behaviour of complex systems, including those of life, medicine and healthcare.This article is part of the themed issue 'Multiscale modelling at the physics-chemistry-biology interface'.

  17. Principles of big data preparing, sharing, and analyzing complex information

    CERN Document Server

    Berman, Jules J

    2013-01-01

    Principles of Big Data helps readers avoid the common mistakes that endanger all Big Data projects. By stressing simple, fundamental concepts, this book teaches readers how to organize large volumes of complex data, and how to achieve data permanence when the content of the data is constantly changing. General methods for data verification and validation, as specifically applied to Big Data resources, are stressed throughout the book. The book demonstrates how adept analysts can find relationships among data objects held in disparate Big Data resources, when the data objects are endo

  18. Differential Privacy Preserving in Big Data Analytics for Connected Health.

    Science.gov (United States)

    Lin, Chi; Song, Zihao; Song, Houbing; Zhou, Yanhong; Wang, Yi; Wu, Guowei

    2016-04-01

In Body Area Networks (BANs), big data collected by wearable sensors usually contain sensitive information, which must be appropriately protected. Previous methods neglected the privacy protection issue, leading to privacy exposure. In this paper, a differential privacy protection scheme for big data in body sensor networks is developed. Compared with previous methods, this scheme provides privacy protection with higher availability and reliability. We introduce the concept of dynamic noise thresholds, which makes our scheme more suitable for processing big data. Experimental results demonstrate that, even when the attacker has full background knowledge, the proposed scheme can still provide enough interference to big sensitive data so as to preserve privacy.
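The scheme builds on differential privacy, whose canonical building block is the Laplace mechanism. The sketch below shows only that standard mechanism (the paper's dynamic noise thresholds are not reproduced); the sensitivity and epsilon values are illustrative assumptions.

```python
# Standard Laplace mechanism for differential privacy (illustrative only).
import numpy as np

def laplace_mechanism(true_value: float, sensitivity: float, epsilon: float) -> float:
    """Return the query answer perturbed with Laplace(sensitivity/epsilon) noise."""
    scale = sensitivity / epsilon
    return true_value + np.random.laplace(loc=0.0, scale=scale)

# e.g. releasing the mean heart rate over a window of wearable-sensor readings
readings = np.array([72, 75, 71, 78, 74], dtype=float)   # invented readings
noisy_mean = laplace_mechanism(readings.mean(), sensitivity=1.0, epsilon=0.5)
print(round(noisy_mean, 1))
```

Smaller epsilon means more noise and stronger privacy; the paper's contribution is, in effect, adapting the noise level dynamically to the data stream.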

  19. The Semantic Network of Flood Hydrological Data for Kelantan, Malaysia

    Science.gov (United States)

    Yusoff, Aziyati; Din, Norashidah Md; Yussof, Salman; Ullah Khan, Samee

    2016-03-01

Every year, authorities in Malaysia put effort into disaster management mechanisms, including for the floods that might hit the east coast of Peninsular Malaysia. This includes the state of Kelantan, for which it was reported that flooding is a normal event occurring annually. However, the aftermath was always unmanageable and left the state struggling with its own recovery. Though floods were expected every year, among the worst were those of 1967, 1974, 1982 and, recently, December 2014. This study proposes a semantic network as an approach to utilising big data analytics in analysing the huge volumes of data from the state's flood reading stations. It is expected that current computing advances can also facilitate mitigating this particular disaster.

  20. Flood Impact Modelling and Natural Flood Management

    Science.gov (United States)

    Owen, Gareth; Quinn, Paul; ODonnell, Greg

    2016-04-01

Local implementation of Natural Flood Management (NFM) methods is now being proposed in many flood schemes. In principle it offers a cost-effective solution to a number of catchment-based problems, as NFM tackles both flood risk and WFD issues. However, within larger catchments there is the issue of which subcatchments to target first and how much NFM to implement. If each catchment has its own configuration of subcatchments and rivers, how can the issues of flood synchronisation and strategic investment be addressed? In this study we show two key aspects to resolving these issues. Firstly, a multi-scale network of water level recorders is placed throughout the system to capture the flow concentration and travel times operating in the catchment being studied. The second is a Flood Impact Model (FIM), which is a subcatchment-based model that can generate runoff in any location using any hydrological model. The key aspect of the model is that it has a function to represent the impact of NFM in any subcatchment and the ability to route that flood wave to the outfall. This function allows a realistic representation of the synchronisation issues for that catchment. By running the model in interactive mode, the user can define an appropriate scheme that minimises or removes the risk of synchronisation and gives confidence that the NFM investment is having a good level of impact downstream in large flood events.
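The synchronisation effect that the FIM must capture can be illustrated by lagging and summing two subcatchment hydrographs, with NFM storage mimicked as attenuation plus extra delay. All numbers below are invented; this is a toy illustration, not the authors' model.

```python
# Toy synchronisation demo: two subcatchment hydrographs routed to an outfall.
import numpy as np

def route(hydrograph, lag_steps):
    """Delay a hydrograph by lag_steps (travel time to the outfall)."""
    return np.pad(hydrograph, (lag_steps, 0))[: len(hydrograph)]

t = np.arange(24)                                   # hourly time steps
sub_a = 10 * np.exp(-0.5 * ((t - 6) / 2.0) ** 2)    # peak at hour 6
sub_b = 12 * np.exp(-0.5 * ((t - 4) / 2.0) ** 2)    # peak at hour 4

# Without NFM: a 2 h lag on B makes both peaks arrive together (synchronised).
combined = route(sub_a, 0) + route(sub_b, 2)
# With NFM in B: attenuate the peak and add extra delay, desynchronising flow.
mitigated = route(sub_a, 0) + 0.7 * route(sub_b, 4)
print(f"peak without NFM: {combined.max():.1f}, with NFM: {mitigated.max():.1f}")
```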

  1. Big data-driven business how to use big data to win customers, beat competitors, and boost profits

    CERN Document Server

    Glass, Russell

    2014-01-01

    Get the expert perspective and practical advice on big data The Big Data-Driven Business: How to Use Big Data to Win Customers, Beat Competitors, and Boost Profits makes the case that big data is for real, and more than just big hype. The book uses real-life examples-from Nate Silver to Copernicus, and Apple to Blackberry-to demonstrate how the winners of the future will use big data to seek the truth. Written by a marketing journalist and the CEO of a multi-million-dollar B2B marketing platform that reaches more than 90% of the U.S. business population, this book is a comprehens

  2. Urban pluvial flood prediction

    DEFF Research Database (Denmark)

    Thorndahl, Søren Liedtke; Nielsen, Jesper Ellerbæk; Jensen, David Getreuer

    2016-01-01

    Flooding produced by high-intensive local rainfall and drainage system capacity exceedance can have severe impacts in cities. In order to prepare cities for these types of flood events – especially in the future climate – it is valuable to be able to simulate these events numerically both...... historically and in real-time. There is a rather untested potential in real-time prediction of urban floods. In this paper radar data observations with different spatial and temporal resolution, radar nowcasts of 0–2 h lead time, and numerical weather models with lead times up to 24 h are used as inputs...... to an integrated flood and drainage systems model in order to investigate the relative difference between different inputs in predicting future floods. The system is tested on a small town Lystrup in Denmark, which has been flooded in 2012 and 2014. Results show it is possible to generate detailed flood maps...
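
    A minimal sketch of the input-selection logic implied above, with illustrative lead-time thresholds; the function name and cut-offs are assumptions, not the paper's configuration.

    ```python
    from datetime import timedelta

    def rainfall_source(lead_time: timedelta) -> str:
        """Pick the rainfall input for the flood/drainage model by lead time:
        radar observations for the present, radar extrapolation up to ~2 h,
        and numerical weather prediction beyond that (thresholds illustrative)."""
        if lead_time <= timedelta(0):
            return "radar observation"
        if lead_time <= timedelta(hours=2):
            return "radar nowcast"
        if lead_time <= timedelta(hours=24):
            return "numerical weather model"
        raise ValueError("no forecast product beyond 24 h in this setup")

    print(rainfall_source(timedelta(hours=1)))   # -> radar nowcast
    ```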

  3. Comparative validity of brief to medium-length Big Five and Big Six Personality Questionnaires.

    Science.gov (United States)

    Thalmayer, Amber Gayle; Saucier, Gerard; Eigenhuis, Annemarie

    2011-12-01

    A general consensus on the Big Five model of personality attributes has been highly generative for the field of personality psychology. Many important psychological and life outcome correlates with Big Five trait dimensions have been established. But researchers must choose between multiple Big Five inventories when conducting a study and are faced with a variety of options as to inventory length. Furthermore, a 6-factor model has been proposed to extend and update the Big Five model, in part by adding a dimension of Honesty/Humility or Honesty/Propriety. In this study, 3 popular brief to medium-length Big Five measures (NEO Five Factor Inventory, Big Five Inventory [BFI], and International Personality Item Pool), and 3 six-factor measures (HEXACO Personality Inventory, Questionnaire Big Six Scales, and a 6-factor version of the BFI) were placed in competition to best predict important student life outcomes. The effect of test length was investigated by comparing brief versions of most measures (subsets of items) with original versions. Personality questionnaires were administered to undergraduate students (N = 227). Participants' college transcripts and student conduct records were obtained 6-9 months after data was collected. Six-factor inventories demonstrated better predictive ability for life outcomes than did some Big Five inventories. Additional behavioral observations made on participants, including their Facebook profiles and cell-phone text usage, were predicted similarly by Big Five and 6-factor measures. A brief version of the BFI performed surprisingly well; across inventory platforms, increasing test length had little effect on predictive validity. Comparative validity of the models and measures in terms of outcome prediction and parsimony is discussed.
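
    The comparison logic can be sketched as cross-validated outcome prediction. The data below are synthetic stand-ins (only the sample size matches the study), so the printed numbers say nothing about the real inventories.

    ```python
    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    n = 227  # sample size reported in the study

    # Hypothetical stand-ins for inventory scale scores and a life outcome
    # (e.g., GPA); in the study these come from questionnaires and transcripts.
    big5 = rng.normal(size=(n, 5))
    big6 = np.hstack([big5, rng.normal(size=(n, 1))])  # adds Honesty/Propriety
    gpa = 0.3 * big5[:, 0] + 0.4 * big6[:, 5] + rng.normal(scale=1.0, size=n)

    for name, X in [("Big Five", big5), ("Big Six", big6)]:
        r2 = cross_val_score(LinearRegression(), X, gpa, cv=5, scoring="r2")
        print(f"{name}: mean cross-validated R^2 = {r2.mean():.2f}")
    ```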

  4. Hydrologic versus geomorphic drivers of trends in flood hazard

    Science.gov (United States)

    Slater, Louise J.; Bliss Singer, Michael; Kirchner, James W.

    2016-04-01

    Flooding is a major threat to lives and infrastructure, yet trends in flood hazard are poorly understood. The capacity of river channels to convey flood flows is typically assumed to be stationary, so changes in flood frequency are thought to be driven primarily by trends in streamflow. However, changes in channel capacity will also modify flood hazard, even if the flow frequency distribution does not change. We developed new methods for separately quantifying how trends in both streamflow and channel capacity have affected flood frequency at gauging sites across the United States. Using daily discharge records and manual field measurements of channel cross-sectional geometry for USGS gauging stations that have defined flood stages (water levels), we present novel methods for measuring long-term trends in channel capacity of gauged rivers, and for quantifying how they affect overbank flood frequency. We apply these methods to 401 U.S. rivers and detect measurable trends in flood hazard linked to changes in channel capacity and/or the frequency of high flows. Flood frequency is generally nonstationary across these 401 U.S. rivers, with increasing flood hazard at a statistically significant majority of sites. Changes in flood hazard driven by channel capacity are smaller, but more numerous, than those driven by streamflow, with a slight tendency to compensate for streamflow changes. Our results demonstrate that accurately quantifying changes in flood hazard requires accounting separately for trends in both streamflow and channel capacity, or using water levels directly. They also show that channel capacity trends may have unforeseen consequences for flood management and for estimating flood insurance costs. Slater, L. J., M. B. Singer, and J. W. Kirchner (2015), Hydrologic versus geomorphic drivers of trends in flood hazard, Geophys. Res. Lett., 42, 370-376, doi:10.1002/2014GL062482.
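
    A hedged sketch of testing the two drivers separately, using Kendall's tau as a stand-in for the paper's trend tests on synthetic peak-flow and channel-capacity series; the data and thresholds are illustrative only.

    ```python
    import numpy as np
    from scipy.stats import kendalltau

    def mk_trend(years, series, alpha=0.05):
        """Mann-Kendall-style monotonic trend test via Kendall's tau."""
        tau, p = kendalltau(years, series)
        return tau, p, p < alpha

    # Hypothetical annual series for one gauge: peak streamflow (m^3/s) and
    # surveyed channel capacity (m^3/s at flood stage); real values come from
    # USGS discharge records and field measurements of channel geometry.
    years = np.arange(1980, 2015)
    rng = np.random.default_rng(1)
    streamflow = 500 + 2.0 * (years - 1980) + rng.normal(0, 40, years.size)
    capacity = 600 - 1.5 * (years - 1980) + rng.normal(0, 25, years.size)

    for name, s in [("streamflow", streamflow), ("channel capacity", capacity)]:
        tau, p, sig = mk_trend(years, s)
        print(f"{name}: tau={tau:+.2f}, p={p:.3f}, significant={sig}")
    # Overbank flood hazard rises if flows trend up and/or capacity trends down.
    ```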

  5. 2 Dimensional Hydrodynamic Flood Routing Analysis on Flood Forecasting Modelling for Kelantan River Basin

    Directory of Open Access Journals (Sweden)

    Azad Wan Hazdy

    2017-01-01

    Full Text Available Flood disasters occur quite frequently in Malaysia and have been categorized as the most threatening natural disaster, compared with landslides, hurricanes, tsunamis, haze and others. A study by the Department of Irrigation and Drainage (DID) shows that 9% of the land area in Malaysia is prone to flooding, which may affect approximately 4.9 million people. Two-dimensional flood routing models are becoming widely used for floodplain mapping and are a very effective tool for evaluating floods. Flood propagation can be better understood by simulating the flow and water level using hydrodynamic modelling. Hydrodynamic flood routing can be distinguished by the spatial complexity of the schematization, such as 1D and 2D models. It was found that most available hydrological models for flood forecasting focus on short-duration events; here, a long-duration hydrological model, the Probability Distributed Moisture (PDM) model, is used instead. The aim of this paper is to discuss preliminary findings on the development of a flood forecasting model using the PDM for the Kelantan river basin. The findings discussed include a preliminarily calibrated PDM model, which performed reasonably for the December 2014 event but underestimated the peak flows. This paper also discusses findings on Soil Moisture Deficit (SMD) and floodplain analysis. Flood forecasting is a complex process that begins with an understanding of the geographical makeup of the catchment and knowledge of the preferential regions of heavy rainfall and flood behaviour for the area of responsibility. Therefore, to decrease the uncertainty in the model output, it is important to increase the complexity of the model.
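
    For readers unfamiliar with the PDM, here is a heavily simplified soil-moisture-accounting sketch of its core idea (Pareto-distributed store capacities); parameters are illustrative, not calibrated Kelantan values, and the full model's drainage and routing stores are omitted.

    ```python
    import numpy as np

    def pdm_direct_runoff(rain, pet, c_max=140.0, b=0.4):
        """Minimal PDM soil-moisture accounting: point store capacities c
        follow F(c) = 1 - (1 - c/c_max)**b, so rain falling on the saturated
        fraction of the basin becomes direct runoff (after Moore, 1985)."""
        S = lambda c: c_max / (b + 1) * (1 - (1 - c / c_max) ** (b + 1))
        C = lambda s: c_max * (1 - (1 - (b + 1) * s / c_max) ** (1 / (b + 1)))
        s, q = 0.0, []
        for p, e in zip(rain, pet):
            s = max(s - e, 0.0)                  # evaporation first
            c_new = min(C(s) + p, c_max)         # wet up the stores
            gain = S(c_new) - s                  # water that infiltrates
            q.append(max(p - gain, 0.0))         # the rest is direct runoff
            s = S(c_new)
        return np.array(q)

    rain = np.array([0, 20, 60, 110, 40, 5], float)  # mm/day storm sequence
    pet = np.full_like(rain, 3.0)                    # mm/day evaporation
    print(pdm_direct_runoff(rain, pet).round(1))
    ```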

  6. Probabilistic, meso-scale flood loss modelling

    Science.gov (United States)

    Kreibich, Heidi; Botto, Anna; Schröter, Kai; Merz, Bruno

    2016-04-01

    Flood risk analyses are an important basis for decisions on flood risk management and adaptation. However, such analyses are associated with significant uncertainty, even more so if changes in risk due to global change are expected. Although uncertainty analysis and probabilistic approaches have received increased attention in recent years, they are still not standard practice for flood risk assessments, let alone for flood loss modelling. The state of the art in flood loss modelling is still the use of simple, deterministic approaches like stage-damage functions. Novel probabilistic, multi-variate flood loss models have been developed and validated on the micro-scale using a data-mining approach, namely bagging decision trees (Merz et al. 2013). In this presentation we demonstrate and evaluate the upscaling of the approach to the meso-scale, namely on the basis of land-use units. The model is applied in 19 municipalities which were affected during the 2002 flood by the River Mulde in Saxony, Germany (Botto et al. submitted). The application of bagging decision tree based loss models provides a probability distribution of estimated loss per municipality. Validation is undertaken on the one hand via a comparison with eight deterministic loss models, including stage-damage functions as well as multi-variate models, and on the other hand against official loss data provided by the Saxon Relief Bank (SAB). The results show that uncertainties of loss estimation remain high. Thus, the significant advantage of this probabilistic flood loss estimation approach is that it inherently provides quantitative information about the uncertainty of the prediction. References: Merz, B.; Kreibich, H.; Lall, U. (2013): Multi-variate flood damage assessment: a tree-based data-mining approach. NHESS, 13(1), 53-64. Botto A, Kreibich H, Merz B, Schröter K (submitted) Probabilistic, multi-variable flood loss modelling on the meso-scale with BT-FLEMO. Risk Analysis.
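
    A minimal sketch of how bagging decision trees yields a loss distribution rather than a point estimate; the predictors and loss relation below are synthetic stand-ins, not BT-FLEMO's actual variables.

    ```python
    import numpy as np
    from sklearn.ensemble import BaggingRegressor
    from sklearn.tree import DecisionTreeRegressor

    rng = np.random.default_rng(0)
    n = 400
    # Hypothetical stand-ins for loss-model predictors; real ones come from
    # post-flood surveys (depth, duration, building value, precaution, ...).
    X = np.column_stack([rng.uniform(0, 3, n),      # inundation depth [m]
                         rng.uniform(1, 96, n),     # duration [h]
                         rng.uniform(50, 500, n)])  # asset value [k EUR]
    loss = 0.2 * X[:, 0] * X[:, 2] + 0.01 * X[:, 1] + rng.normal(0, 5, n)

    model = BaggingRegressor(DecisionTreeRegressor(), n_estimators=100,
                             random_state=0).fit(X, loss)

    # Each tree in the ensemble yields one loss estimate for a new case,
    # so the ensemble gives a distribution, not a single number.
    x_new = np.array([[1.5, 48, 200]])
    per_tree = np.array([t.predict(x_new)[0] for t in model.estimators_])
    print(per_tree.mean(), np.percentile(per_tree, [5, 95]))
    ```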

  7. BigDansing

    KAUST Repository

    Khayyat, Zuhair

    2015-06-02

    Data cleansing approaches have usually focused on detecting and fixing errors with little attention to scaling to big datasets. This presents a serious impediment since data cleansing often involves costly computations such as enumerating pairs of tuples, handling inequality joins, and dealing with user-defined functions. In this paper, we present BigDansing, a Big Data Cleansing system to tackle efficiency, scalability, and ease-of-use issues in data cleansing. The system can run on top of most common general purpose data processing platforms, ranging from DBMSs to MapReduce-like frameworks. A user-friendly programming interface allows users to express data quality rules both declaratively and procedurally, with no requirement of being aware of the underlying distributed platform. BigDansing translates these rules into a series of transformations that enable distributed computations and several optimizations, such as shared scans and specialized join operators. Experimental results on both synthetic and real datasets show that BigDansing outperforms existing baseline systems by up to more than two orders of magnitude without sacrificing the quality provided by the repair algorithms.
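
    BigDansing's own programming interface is not shown in the abstract; the pandas snippet below merely illustrates the kind of declarative quality rule (a functional dependency) that such a system compiles into distributed scans and joins.

    ```python
    import pandas as pd

    # A functional-dependency-style quality rule, stated declaratively:
    # zipcode -> city (tuples with the same zipcode must agree on city).
    df = pd.DataFrame({
        "zipcode": ["10001", "10001", "60601"],
        "city":    ["New York", "Newark", "Chicago"],
    })

    # Naive detection enumerates conflicts per zipcode group; systems like
    # BigDansing instead compile the rule into shared scans and joins that
    # run on a distributed platform.
    violating = df.groupby("zipcode")["city"].nunique() > 1
    print(df[df["zipcode"].isin(violating[violating].index)])
    ```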

  8. From Big Data to Big Business

    DEFF Research Database (Denmark)

    Lund Pedersen, Carsten

    2017-01-01

    Idea in Brief: Problem: There is an enormous profit potential for manufacturing firms in big data, but one of the key barriers to obtaining data-driven growth is the lack of knowledge about which capabilities are needed to extract value and profit from data. Solution: We (BDBB research group at CBS......) have developed a research-based capability mapping tool, entitled DataProfit, which the public business consultants can use to upgrade their tool kit to enable data-driven growth in manufacturing organizations. Benefit: The DataProfit model/tool comprises insights of an extensive research project...

  9. Focus : big data, little questions?

    OpenAIRE

    Uprichard, Emma

    2013-01-01

    Big data. Little data. Deep data. Surface data. Noisy, unstructured data. Big. The world of data has gone from being analogue and digital, qualitative and quantitative, transactional and a by-product, to, simply, BIG. It is as if we couldn’t quite deal with its omnipotence and just ran out of adjectives. BIG. With all the data power it is supposedly meant to entail, one might have thought that a slightly better descriptive term might have been latched onto. But, no. BIG. Just BIG.

  10. Emerging technology and architecture for big-data analytics

    CERN Document Server

    Chang, Chip; Yu, Hao

    2017-01-01

    This book describes the current state of the art in big-data analytics, from a technology and hardware architecture perspective. The presentation is designed to be accessible to a broad audience, with general knowledge of hardware design and some interest in big-data analytics. Coverage includes emerging technology and devices for data-analytics, circuit design for data-analytics, and architecture and algorithms to support data-analytics. Readers will benefit from the realistic context used by the authors, which demonstrates what works, what doesn’t work, and what are the fundamental problems, solutions, upcoming challenges and opportunities. Provides a single-source reference to hardware architectures for big-data analytics; Covers various levels of big-data analytics hardware design abstraction and flow, from device, to circuits and systems; Demonstrates how non-volatile memory (NVM) based hardware platforms can be a viable solution to existing challenges in hardware architecture for big-data analytics.

  11. Big data challenges

    DEFF Research Database (Denmark)

    Bachlechner, Daniel; Leimbach, Timo

    2016-01-01

    Although reports on big data success stories have been accumulating in the media, most organizations dealing with high-volume, high-velocity and high-variety information assets still face challenges. Only a thorough understanding of these challenges puts organizations into a position in which...... they can make an informed decision for or against big data, and, if the decision is positive, overcome the challenges smoothly. The combination of a series of interviews with leading experts from enterprises, associations and research institutions, and focused literature reviews allowed not only...... framework are also relevant. For large enterprises and startups specialized in big data, it is typically easier to overcome the challenges than it is for other enterprises and public administration bodies....

  12. Big and Small

    CERN Document Server

    Ekers, R D

    2010-01-01

    Technology leads discovery in astronomy, as in all other areas of science, so growth in technology leads to the continual stream of new discoveries which makes our field so fascinating. Derek de Solla Price had analysed the discovery process in science in the 1960s and he introduced the terms 'Little Science' and 'Big Science' as part of his discussion of the role of exponential growth in science. I will show how the development of astronomical facilities has followed this same trend from 'Little Science' to 'Big Science' as a field matures. We can see this in the discoveries resulting in Nobel Prizes in astronomy. A more detailed analysis of discoveries in radio astronomy shows the same effect. I include a digression to look at how science progresses, comparing the roles of prediction, serendipity, measurement and explanation. Finally I comment on the differences between the 'Big Science' culture in Physics and in Astronomy.

  13. Thick-Big Descriptions

    DEFF Research Database (Denmark)

    Lai, Signe Sophus

    The paper discusses the rewards and challenges of employing commercial audience measurements data – gathered by media industries for profitmaking purposes – in ethnographic research on the Internet in everyday life. It questions claims to the objectivity of big data (Anderson 2008), the assumption...... communication systems, language and behavior appear as texts, outputs, and discourses (data to be ‘found’) – big data then documents things that in earlier research required interviews and observations (data to be ‘made’) (Jensen 2014). However, web-measurement enterprises build audiences according...... to a commercial logic (boyd & Crawford 2011) and is as such directed by motives that call for specific types of sellable user data and specific segmentation strategies. In combining big data and ‘thick descriptions’ (Geertz 1973) scholars need to question how ethnographic fieldwork might map the ‘data not seen...

  14. Cognitive computing and big data analytics

    CERN Document Server

    Hurwitz, Judith; Bowles, Adrian

    2015-01-01

    MASTER THE ABILITY TO APPLY BIG DATA ANALYTICS TO MASSIVE AMOUNTS OF STRUCTURED AND UNSTRUCTURED DATA Cognitive computing is a technique that allows humans and computers to collaborate in order to gain insights and knowledge from data by uncovering patterns and anomalies. This comprehensive guide explains the underlying technologies, such as artificial intelligence, machine learning, natural language processing, and big data analytics. It then demonstrates how you can use these technologies to transform your organization. You will explore how different vendors and different industries are a

  15. Fragmented patterns of flood change across the United States

    Science.gov (United States)

    Archfield, S. A.; Hirsch, R. M.; Viglione, A.; Blöschl, G.

    2016-10-01

    Trends in the peak magnitude, frequency, duration, and volume of frequent floods (floods occurring at an average of two events per year relative to a base period) across the United States show large changes; however, few trends are found to be statistically significant. The multidimensional behavior of flood change across the United States can be described by four distinct groups, with streamgages experiencing (1) minimal change, (2) increasing frequency, (3) decreasing frequency, or (4) increases in all flood properties. Yet group membership shows only weak geographic cohesion. Lack of geographic cohesion is further demonstrated by weak correlations between the temporal patterns of flood change and large-scale climate indices. These findings reveal a complex, fragmented pattern of flood change that, therefore, clouds the ability to make meaningful generalizations about flood change across the United States.
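
    One illustrative way to encode the four-group assignment from per-property trend statistics; the decision rule below is a plausible reading of the group definitions, not the authors' exact procedure.

    ```python
    def flood_change_group(trends, alpha=0.05):
        """Assign a gauge to one of the four groups described above, from
        (slope, p-value) pairs for the peak magnitude, frequency, duration
        and volume of frequent floods."""
        sig = {k: p < alpha for k, (slope, p) in trends.items()}
        up = {k: slope > 0 for k, (slope, p) in trends.items()}
        if not any(sig.values()):
            return "1: minimal change"
        if all(sig.values()) and all(up.values()):
            return "4: increases in all flood properties"
        if sig["frequency"]:
            return ("2: increasing frequency" if up["frequency"]
                    else "3: decreasing frequency")
        return "1: minimal change"

    gauge = {"magnitude": (0.1, 0.40), "frequency": (0.8, 0.01),
             "duration": (0.0, 0.90), "volume": (0.2, 0.30)}
    print(flood_change_group(gauge))   # -> 2: increasing frequency
    ```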

  16. Transcriptome marker diagnostics using big data.

    Science.gov (United States)

    Han, Henry; Liu, Ying

    2016-02-01

    Big omics data are challenging translational bioinformatics in an unprecedented way with their complexity and volume. How to employ big omics data to achieve a clinically rivalling, reproducible disease diagnosis from a systems approach is an urgent problem to be solved in translational bioinformatics and machine learning. In this study, the authors propose a novel transcriptome marker diagnosis to tackle this problem using big RNA-seq data, by viewing the whole transcriptome systematically as a profile marker. The systems diagnosis not only avoids the reproducibility issue of existing gene-/network-marker-based diagnostic methods, but also achieves clinically rivalling diagnostic results by extracting true signals from big RNA-seq data. By using systems information, their method attains exceptional diagnostic performance compared with its competitor methods, demonstrates a better fit for personalised diagnostics, and presents itself as a good candidate for clinical use. To the best of their knowledge, this is the first study on this topic, and it will inspire further investigation into big omics data diagnostics.

  17. Unsupervised Tensor Mining for Big Data Practitioners.

    Science.gov (United States)

    Papalexakis, Evangelos E; Faloutsos, Christos

    2016-09-01

    Multiaspect data are ubiquitous in modern Big Data applications. For instance, different aspects of a social network are the different types of communication between people, the time stamp of each interaction, and the location associated to each individual. How can we jointly model all those aspects and leverage the additional information that they introduce to our analysis? Tensors, which are multidimensional extensions of matrices, are a principled and mathematically sound way of modeling such multiaspect data. In this article, our goal is to popularize tensors and tensor decompositions to Big Data practitioners by demonstrating their effectiveness, outlining challenges that pertain to their application in Big Data scenarios, and presenting our recent work that tackles those challenges. We view this work as a step toward a fully automated, unsupervised tensor mining tool that can be easily and broadly adopted by practitioners in academia and industry.
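
    As a concrete taste of the tool being popularized, below is a self-contained CP (CANDECOMP/PARAFAC) decomposition via alternating least squares on a toy 3-way tensor; production work would typically use a library such as TensorLy rather than this bare-bones ALS.

    ```python
    import numpy as np

    def cp_als(X, rank, n_iter=100, seed=0):
        """CP decomposition of a 3-way tensor via alternating least squares;
        returns factors A, B, C with X[i,j,k] ~= sum_r A[i,r]*B[j,r]*C[k,r]."""
        I, J, K = X.shape
        rng = np.random.default_rng(seed)
        A, B, C = (rng.normal(size=(d, rank)) for d in (I, J, K))

        def update(unfold, F1, F2):
            # Khatri-Rao product and Gram matrix for one least-squares update.
            kr = np.einsum('ir,jr->ijr', F1, F2).reshape(-1, rank)
            gram = (F1.T @ F1) * (F2.T @ F2)
            return np.linalg.solve(gram, (unfold @ kr).T).T

        for _ in range(n_iter):
            A = update(X.reshape(I, -1), B, C)
            B = update(X.transpose(1, 0, 2).reshape(J, -1), A, C)
            C = update(X.transpose(2, 0, 1).reshape(K, -1), A, B)
        return A, B, C

    # Toy "who talks to whom, when" tensor: person x person x time-slot.
    rng = np.random.default_rng(1)
    a, b, c = rng.random((8, 2)), rng.random((8, 2)), rng.random((6, 2))
    X = np.einsum('ir,jr,kr->ijk', a, b, c)
    A, B, C = cp_als(X, rank=2)
    approx = np.einsum('ir,jr,kr->ijk', A, B, C)
    print(np.linalg.norm(X - approx) / np.linalg.norm(X))  # near zero
    ```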

  18. Architecture technology for Big Data

    National Research Council Canada - National Science Library

    Juan José Camargo Vega; Jonathan Felipe Camargo Ortega; Luis Joyanes Aguilar

    2015-01-01

    The term Big Data becomes more important with each passing day, which is why this research studies, analyzes and discloses in a comprehensive manner the different architectures of Big Data...

  19. RASOR flood modelling

    Science.gov (United States)

    Beckers, Joost; Buckman, Lora; Bachmann, Daniel; Visser, Martijn; Tollenaar, Daniel; Vatvani, Deepak; Kramer, Nienke; Goorden, Neeltje

    2015-04-01

    Decision making in disaster management requires fast access to reliable and relevant information. We believe that online information and services will become increasingly important in disaster management. Within the EU FP7 project RASOR (Rapid Risk Assessment and Spatialisation of Risk), an online platform is being developed for rapid multi-hazard risk analyses to support disaster management anywhere in the world. The platform will provide access to a plethora of GIS data that are relevant to risk assessment. It will also enable the user to run numerical flood models to simulate historical and newly defined flooding scenarios. The results of these models are maps of flood extent, flood depths and flow velocities. The RASOR platform will enable users to overlay historical flood event maps with observations and Earth Observation (EO) imagery to fill in gaps and assess the accuracy of the flood models. New flooding scenarios can be defined by the user and simulated to investigate the potential impact of future floods. A series of flood models have been developed within RASOR for selected case study areas around the globe that are subject to very different flood hazards: • The city of Bandung in Indonesia, which is prone to fluvial flooding induced by heavy rainfall. The flood hazard is exacerbated by land subsidence. • The port of Cilacap on the south coast of Java, subject to tsunami hazard from submarine earthquakes in the Sunda trench. • The area south of the city of Rotterdam in the Netherlands, prone to coastal and/or riverine flooding. • The island of Santorini in Greece, which is subject to tsunamis induced by landslides. Flood models have been developed for each of these case studies using mostly EO data, augmented by local data where necessary. Particular use was made of the new TanDEM-X (TerraSAR-X add-on for Digital Elevation Measurement) product from the German Aerospace Centre (DLR) and EADS Astrium. The presentation will describe the flood models and the

  20. Identifying Dwarfs Workloads in Big Data Analytics

    OpenAIRE

    Gao, Wanling; Luo, Chunjie; Zhan, Jianfeng; Ye, Hainan; He, Xiwen; Wang, Lei; Zhu, Yuqing; Tian, Xinhui

    2015-01-01

    Big data benchmarking is particularly important and provides applicable yardsticks for evaluating booming big data systems. However, wide coverage and great complexity of big data computing impose big challenges on big data benchmarking. How can we construct a benchmark suite using a minimum set of units of computation to represent diversity of big data analytics workloads? Big data dwarfs are abstractions of extracting frequently appearing operations in big data computing. One dwarf represen...

  1. Flood regimes in a changing world: What do we know?

    Science.gov (United States)

    Bloeschl, G.

    2015-12-01

    There has been a surprisingly large number of major floods around the world in recent years, which suggests that floods may have increased and will continue to increase in the coming decades. However, the realism of such changes is still hotly debated in the literature. In this presentation I will argue that a fresh look is needed at the flood change problem in terms of the causal factors, including river training, land use changes and climate variability. Analysing spatial patterns of dynamic flood characteristics helps us learn from the rich diversity of flood processes across the landscape. I will present a number of examples across Europe to illustrate the range of flood generation processes and the causal factors of changes in the flood regime. On the basis of these examples, I will demonstrate how comparative hydrology can assist in learning from the differences in flood characteristics between catchments, both for present and future conditions. A focus on the interactions of the natural and human water systems will be instrumental in making meaningful statements about future floods in a changing world. References Hall et al. (2014) Understanding Flood Regime Changes in Europe: A state of the art assessment. Hydrol. Earth Sys. Sc., 18, 2735-2772. Blöschl et al. (2015) Increasing river floods: fiction or reality? Wiley Interdisciplinary Reviews: Water. doi: 10.1002/wat2.1079

  2. Big Data and Cycling

    NARCIS (Netherlands)

    Romanillos, Gustavo; Zaltz Austwick, Martin; Ettema, Dick; De Kruijf, Joost

    2016-01-01

    Big Data has begun to create significant impacts in urban and transport planning. This paper covers the explosion in data-driven research on cycling, most of which has occurred in the last ten years. We review the techniques, objectives and findings of a growing number of studies we have classified

  3. The big bang

    Science.gov (United States)

    Silk, Joseph

    Our universe was born billions of years ago in a hot, violent explosion of elementary particles and radiation - the big bang. What do we know about this ultimate moment of creation, and how do we know it? Drawing upon the latest theories and technology, this new edition of The big bang, is a sweeping, lucid account of the event that set the universe in motion. Joseph Silk begins his story with the first microseconds of the big bang, on through the evolution of stars, galaxies, clusters of galaxies, quasars, and into the distant future of our universe. He also explores the fascinating evidence for the big bang model and recounts the history of cosmological speculation. Revised and updated, this new edition features all the most recent astronomical advances, including: Photos and measurements from the Hubble Space Telescope, Cosmic Background Explorer Satellite (COBE), and Infrared Space Observatory; the latest estimates of the age of the universe; new ideas in string and superstring theory; recent experiments on neutrino detection; new theories about the presence of dark matter in galaxies; new developments in the theory of the formation and evolution of galaxies; the latest ideas about black holes, worm holes, quantum foam, and multiple universes.

  4. A Big Bang Lab

    Science.gov (United States)

    Scheider, Walter

    2005-01-01

    The February 2005 issue of The Science Teacher (TST) reminded everyone that by learning how scientists study stars, students gain an understanding of how science measures things that can not be set up in lab, either because they are too big, too far away, or happened in a very distant past. The authors of "How Far are the Stars?" show how the…

  5. Big Java late objects

    CERN Document Server

    Horstmann, Cay S

    2012-01-01

    Big Java: Late Objects is a comprehensive introduction to Java and computer programming, which focuses on the principles of programming, software engineering, and effective learning. It is designed for a two-semester first course in programming for computer science students.

  6. Big Data ethics

    NARCIS (Netherlands)

    Zwitter, Andrej

    2014-01-01

    The speed of development in Big Data and associated phenomena, such as social media, has surpassed the capacity of the average consumer to understand his or her actions and their knock-on effects. We are moving towards changes in how ethics has to be perceived: away from individual decisions with

  7. The Big Bang

    CERN Multimedia

    Moods, Patrick

    2006-01-01

    How did the Universe begin? The favoured theory is that everything - space, time, matter - came into existence at the same moment, around 13.7 thousand million years ago. This event was scornfully referred to as the "Big Bang" by Sir Fred Hoyle, who did not believe in it and maintained that the Universe had always existed.

  8. The Big Sky inside

    Science.gov (United States)

    Adams, Earle; Ward, Tony J.; Vanek, Diana; Marra, Nancy; Hester, Carolyn; Knuth, Randy; Spangler, Todd; Jones, David; Henthorn, Melissa; Hammill, Brock; Smith, Paul; Salisbury, Rob; Reckin, Gene; Boulafentis, Johna

    2009-01-01

    The University of Montana (UM)-Missoula has implemented a problem-based program in which students perform scientific research focused on indoor air pollution. The Air Toxics Under the Big Sky program (Jones et al. 2007; Adams et al. 2008; Ward et al. 2008) provides a community-based framework for understanding the complex relationship between poor…

  9. Big Data ethics

    NARCIS (Netherlands)

    Zwitter, Andrej

    2014-01-01

    The speed of development in Big Data and associated phenomena, such as social media, has surpassed the capacity of the average consumer to understand his or her actions and their knock-on effects. We are moving towards changes in how ethics has to be perceived: away from individual decisions with sp

  10. Governing Big Data

    Directory of Open Access Journals (Sweden)

    Andrej J. Zwitter

    2014-04-01

    Full Text Available 2.5 quintillion bytes of data are created every day through pictures, messages, GPS data, etc. "Big Data" is seen simultaneously as the new Philosopher's Stone and Pandora's box: a source of great knowledge and power, but equally, the root of serious problems.

  11. Big data in history

    CERN Document Server

    Manning, Patrick

    2013-01-01

    Big Data in History introduces the project to create a world-historical archive, tracing the last four centuries of historical dynamics and change. Chapters address the archive's overall plan, how to interpret the past through a global archive, the missions of gathering records, linking local data into global patterns, and exploring the results.

  12. Big Data ethics

    NARCIS (Netherlands)

    Zwitter, Andrej

    2014-01-01

    The speed of development in Big Data and associated phenomena, such as social media, has surpassed the capacity of the average consumer to understand his or her actions and their knock-on effects. We are moving towards changes in how ethics has to be perceived: away from individual decisions with sp

  13. Space big book

    CERN Document Server

    Homer, Charlene

    2007-01-01

    Our Combined resource includes all necessary areas of Space for grades five to eight. Get the big picture about the Solar System, Galaxies and the Universe as your students become fascinated by the interesting information about the Sun, Earth, Moon, Comets, Asteroids, Meteoroids, Stars and Constellations. Also, thrill your young astronomers as they connect Earth and space cycles with their daily life.

  14. Big Data and Chemical Education

    Science.gov (United States)

    Pence, Harry E.; Williams, Antony J.

    2016-01-01

    The amount of computerized information that organizations collect and process is growing so large that the term Big Data is commonly being used to describe the situation. Accordingly, Big Data is defined by a combination of the Volume, Variety, Velocity, and Veracity of the data being processed. Big Data tools are already having an impact in…

  15. Big Data and Chemical Education

    Science.gov (United States)

    Pence, Harry E.; Williams, Antony J.

    2016-01-01

    The amount of computerized information that organizations collect and process is growing so large that the term Big Data is commonly being used to describe the situation. Accordingly, Big Data is defined by a combination of the Volume, Variety, Velocity, and Veracity of the data being processed. Big Data tools are already having an impact in…

  16. The 1999 Flood on Changjiang River and Some Thoughts on It

    Institute of Scientific and Technical Information of China (English)

    WANG, Sheng-fu

    2001-01-01

    Following the basin-wide heavy flood on Changjiang River in 1998, a significant flood occurred in 1999. Comparative analysis of both floods in terms of flows and flooding situations shows that both floods had one common feature, namely that the flood stages were fairly high. But they differed greatly at the same time: the 1998 flood was a basin-wide heavy one, while the 1999 flood was a significant local one. At Yichang station there occurred eight flood peaks in 1998, while in 1999 only three peaks took place. The maximum peak discharge at this station in 1999 was 57,600 cubic meters per second, which was smaller than that in 1998. The maximum 30-day flood volume of the 1998 flood at this station equaled that in 1954, when an extraordinarily heavy flood happened on the River, while the maximum 30-day flood volume in 1999 was 25.8 billion cubic meters smaller than the 1998 one. It is seen that inflow floods from the upstream Changjiang River (above Yichang) in 1999 were not so big. Comparison of flood volumes over a longer period shows that the 1999 flood was relatively concentrated, while the 1998 one lasted a longer duration. Analysis shows that flooding situations in both years differed significantly in terms of the flood volumes diverted from river channels due to dyke breaches and collapses, the cases of polder embankment collapses, the areas of inundated cultivated land and the numbers of dangerous events for hydraulic structures. These differences resulted from the different properties of both floods and the dyke strengthening efforts made after the 1998 flood. It is seen that the flood control engineering constructions initiated in the days following the 1998 flood played an important role in fighting the 1999 flood.

  17. Scaling Big Data Cleansing

    KAUST Repository

    Khayyat, Zuhair

    2017-07-31

    Data cleansing approaches have usually focused on detecting and fixing errors with little attention to big data scaling. This presents a serious impediment since identifying and repairing dirty data often involves processing huge input datasets, handling sophisticated error discovery approaches and managing huge arbitrary errors. With large datasets, error detection becomes overly expensive and complicated especially when considering user-defined functions. Furthermore, a distinctive algorithm is desired to optimize inequality joins in sophisticated error discovery rather than naïvely parallelizing them. Also, when repairing large errors, their skewed distribution may obstruct effective error repairs. In this dissertation, I present solutions to overcome the above three problems in scaling data cleansing. First, I present BigDansing as a general system to tackle efficiency, scalability, and ease-of-use issues in data cleansing for Big Data. It automatically parallelizes the user's code on top of general-purpose distributed platforms. Its programming interface allows users to express data quality rules independently from the requirements of parallel and distributed environments. Without sacrificing their quality, BigDansing also enables parallel execution of serial repair algorithms by exploiting the graph representation of discovered errors. The experimental results show that BigDansing outperforms existing baselines by up to more than two orders of magnitude. Although BigDansing scales cleansing jobs, it still lacks the ability to handle sophisticated error discovery requiring inequality joins. Therefore, I developed IEJoin as an algorithm for fast inequality joins. It is based on sorted arrays and space efficient bit-arrays to reduce the problem's search space. By comparing IEJoin against well-known optimizations, I show that it is more scalable, and several orders of magnitude faster. BigDansing depends on vertex-centric graph systems, i.e., Pregel
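
    The full IEJoin algorithm (permutation arrays plus bit arrays for two simultaneous inequality predicates) is beyond a snippet, but the sorted-array intuition behind it can be shown for a single predicate:

    ```python
    import bisect

    def inequality_join(R, S):
        """Simplified single-predicate inequality join, r < s, using a sorted
        array and binary search instead of a naive nested loop; the real
        IEJoin extends this idea with permutation and bit arrays to evaluate
        two inequality predicates at once."""
        S_sorted = sorted(S)
        out = []
        for r in R:
            i = bisect.bisect_right(S_sorted, r)   # first s with s > r
            out.extend((r, s) for s in S_sorted[i:])
        return out

    print(inequality_join([3, 7], [1, 5, 9]))   # [(3, 5), (3, 9), (7, 9)]
    ```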

  18. NASA Global Flood Mapping System

    Science.gov (United States)

    Policelli, Fritz; Slayback, Dan; Brakenridge, Bob; Nigro, Joe; Hubbard, Alfred

    2017-01-01

    Product utility key factors: near-real-time, automated production; flood spatial extent; cloudiness; pixel resolution (250 m); flood temporal extent; flash floods (short duration on the ground); land cover (water under vegetation cover vs. open water).

  19. Business and Science - Big Data, Big Picture

    Science.gov (United States)

    Rosati, A.

    2013-12-01

    Data Science is more than the creation, manipulation, and transformation of data. It is more than Big Data. The business world seems to have a hold on the term 'data science' and, for now, they define what it means. But business is very different than science. In this talk, I address how large datasets, Big Data, and data science are conceptually different in business and science worlds. I focus on the types of questions each realm asks, the data needed, and the consequences of findings. Gone are the days of datasets being created or collected to serve only one purpose or project. The trick with data reuse is to become familiar enough with a dataset to be able to combine it with other data and extract accurate results. As a Data Curator for the Advanced Cooperative Arctic Data and Information Service (ACADIS), my specialty is communication. Our team enables Arctic sciences by ensuring datasets are well documented and can be understood by reusers. Previously, I served as a data community liaison for the North American Regional Climate Change Assessment Program (NARCCAP). Again, my specialty was communicating complex instructions and ideas to a broad audience of data users. Before entering the science world, I was an entrepreneur. I have a bachelor's degree in economics and a master's degree in environmental social science. I am currently pursuing a Ph.D. in Geography. Because my background has embraced both the business and science worlds, I would like to share my perspectives on data, data reuse, data documentation, and the presentation or communication of findings. My experiences show that each can inform and support the other.

  20. Flood hazard and flood risk assessment using a time series of satellite images: a case study in Namibia.

    Science.gov (United States)

    Skakun, Sergii; Kussul, Nataliia; Shelestov, Andrii; Kussul, Olga

    2014-08-01

    In this article, the use of time series of satellite imagery for flood hazard mapping and flood risk assessment is presented. Flooded areas are extracted from satellite images of the flood-prone territory, and a maximum flood extent image is produced for each flood event. These maps are then fused to determine the relative frequency of inundation (RFI). The study shows that RFI values and relative water depth exhibit the same probabilistic distribution, which is confirmed by a Kolmogorov-Smirnov test. The produced RFI map can be used as a flood hazard map, especially in cases where flood modelling is complicated by a lack of available data and high uncertainties. The derived RFI map is further used for flood risk assessment. The efficiency of the presented approach is demonstrated for the Katima Mulilo region (Namibia). A time series of Landsat-5/7 satellite images acquired from 1989 to 2012 is processed to derive the RFI map using the presented approach. The following direct damage categories are considered in the study for flood risk assessment: dwelling units, roads, health facilities, and schools. The produced flood risk map shows that the risk is distributed uniformly across the region. The cities and villages with the highest risk are identified. The proposed approach has minimal data requirements, and RFI maps can be generated rapidly to assist rescuers and decision makers in case of emergencies. On the other hand, limitations include a strong dependence on the available data sets and limitations in simulations with extrapolated water depth values.
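
    The core RFI computation is simple enough to sketch: average the per-event binary flood masks over time. The masks and hazard-class thresholds below are toy stand-ins for the Landsat-derived maps.

    ```python
    import numpy as np

    # One maximum-flood-extent mask per observed event (1 = flooded, 0 = dry);
    # toy 2x3-pixel stand-ins for masks extracted from the 1989-2012
    # Landsat-5/7 time series.
    masks = np.array([
        [[0, 1, 1], [0, 0, 1]],
        [[0, 1, 1], [0, 1, 1]],
        [[0, 0, 1], [0, 0, 1]],
    ])

    rfi = masks.mean(axis=0)   # relative frequency of inundation per pixel
    print(rfi)                 # 1.0 = flooded in every observed event

    # A simple hazard classing of the RFI map (thresholds illustrative):
    hazard = np.digitize(rfi, bins=[0.25, 0.5, 0.75])  # 0=low ... 3=very high
    print(hazard)
    ```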

  1. How Big Are "Martin's Big Words"? Thinking Big about the Future.

    Science.gov (United States)

    Gardner, Traci

    "Martin's Big Words: The Life of Dr. Martin Luther King, Jr." tells of King's childhood determination to use "big words" through biographical information and quotations. In this lesson, students in grades 3 to 5 explore information on Dr. King to think about his "big" words, then they write about their own…

  2. Big data analysis new algorithms for a new society

    CERN Document Server

    Stefanowski, Jerzy

    2016-01-01

    This edited volume is devoted to Big Data Analysis from a Machine Learning standpoint as presented by some of the most eminent researchers in this area. It demonstrates that Big Data Analysis opens up new research problems which were either never considered before, or were only considered within a limited range. In addition to providing methodological discussions on the principles of mining Big Data and the difference between traditional statistical data analysis and newer computing frameworks, this book presents recently developed algorithms affecting such areas as business, financial forecasting, human mobility, the Internet of Things, information networks, bioinformatics, medical systems and life science. It explores, through a number of specific examples, how the study of Big Data Analysis has evolved and how it has started and will most likely continue to affect society. While the benefits brought upon by Big Data Analysis are underlined, the book also discusses some of the warnings that have been issued...

  3. On Flood Alert

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    China braces for a particularly dangerous flood season in the wake of disastrous rainstorms. A series of heavy storms since early May has led to severe flooding and landslides in south and southwest China, causing heavy casualties and economic losses. Severe convective weather such as downpours,

  4. Discover Floods Educators Guide

    Science.gov (United States)

    Project WET Foundation, 2009

    2009-01-01

    Now available as a Download! This valuable resource helps educators teach students about both the risks and benefits of flooding through a series of engaging, hands-on activities. Acknowledging the different roles that floods play in both natural and urban communities, the book helps young people gain a global understanding of this common--and…

  5. Tested Demonstrations.

    Science.gov (United States)

    Sands, Robert; And Others

    1982-01-01

    Procedures for two demonstrations are provided. The solubility of ammonia gas in water is demonstrated by introducing water into a closed can filled with the gas, collapsing the can. The second demonstration relates the scale of standard reduction potentials to the observed behavior of metals in reactions with hydrogen to produce hydrogen gas. (Author/JN)

  6. Big data analytics to improve cardiovascular care: promise and challenges.

    Science.gov (United States)

    Rumsfeld, John S; Joynt, Karen E; Maddox, Thomas M

    2016-06-01

    The potential for big data analytics to improve cardiovascular quality of care and patient outcomes is tremendous. However, the application of big data in health care is at a nascent stage, and the evidence to date demonstrating that big data analytics will improve care and outcomes is scant. This Review provides an overview of the data sources and methods that comprise big data analytics, and describes eight areas of application of big data analytics to improve cardiovascular care, including predictive modelling for risk and resource use, population management, drug and medical device safety surveillance, disease and treatment heterogeneity, precision medicine and clinical decision support, quality of care and performance measurement, and public health and research applications. We also delineate the important challenges for big data applications in cardiovascular care, including the need for evidence of effectiveness and safety, the methodological issues such as data quality and validation, and the critical importance of clinical integration and proof of clinical utility. If big data analytics are shown to improve quality of care and patient outcomes, and can be successfully implemented in cardiovascular practice, big data will fulfil its potential as an important component of a learning health-care system.

  7. Distillation Column Flooding Predictor

    Energy Technology Data Exchange (ETDEWEB)

    George E. Dzyacky

    2010-11-23

    The Flooding Predictor™ is a patented advanced control technology proven in research at the Separations Research Program, University of Texas at Austin, to increase distillation column throughput by over 6%, while also increasing energy efficiency by 10%. The research was conducted under a U. S. Department of Energy Cooperative Agreement awarded to George Dzyacky of 2ndpoint, LLC. The Flooding Predictor™ works by detecting the incipient flood point and controlling the column closer to its actual hydraulic limit than historical practices have allowed. Further, the technology uses existing column instrumentation, meaning no additional refining infrastructure is required. Refiners often push distillation columns to maximize throughput, improve separation, or simply to achieve day-to-day optimization. Attempting to achieve such operating objectives is a tricky undertaking that can result in flooding. Operators and advanced control strategies alike rely on the conventional use of delta-pressure instrumentation to approximate the column’s approach to flood. But column delta-pressure is more an inference of the column’s approach to flood than it is an actual measurement of it. As a consequence, delta pressure limits are established conservatively in order to operate in a regime where the column is never expected to flood. As a result, there is much “left on the table” when operating in such a regime, i.e. the capacity difference between controlling the column to an upper delta-pressure limit and controlling it to the actual hydraulic limit. The Flooding Predictor™, an innovative pattern recognition technology, controls columns at their actual hydraulic limit, which research shows leads to a throughput increase of over 6%. Controlling closer to the hydraulic limit also permits operation in a sweet spot of increased energy-efficiency. In this region of increased column loading, the Flooding Predictor is able to exploit the benefits of higher liquid
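
    The patented pattern set is not public, so the sketch below is only a generic illustration of flagging an incipient flood from delta-pressure behaviour (a sustained ramp plus growing high-frequency churn); the feature choices and thresholds are assumptions.

    ```python
    import numpy as np

    def incipient_flood_alarm(delta_p, window=20, slope_limit=0.05,
                              churn_limit=0.8):
        """Flag an approach to flood from a column delta-pressure series:
        fit a linear trend over the recent window and measure step-to-step
        variability; either rising fast or churning hard raises the alarm.
        All limits here are illustrative, not the Flooding Predictor's."""
        recent = np.asarray(delta_p[-window:], dtype=float)
        slope = np.polyfit(np.arange(window), recent, 1)[0]
        churn = np.std(np.diff(recent))
        return slope > slope_limit or churn > churn_limit

    steady = list(10 + 0.01 * np.arange(40))
    ramping = steady + list(12 + 0.2 * np.arange(20))
    print(incipient_flood_alarm(steady), incipient_flood_alarm(ramping))
    # -> False True
    ```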

  8. Really big numbers

    CERN Document Server

    Schwartz, Richard Evan

    2014-01-01

    In the American Mathematical Society's first-ever book for kids (and kids at heart), mathematician and author Richard Evan Schwartz leads math lovers of all ages on an innovative and strikingly illustrated journey through the infinite number system. By means of engaging, imaginative visuals and endearing narration, Schwartz manages the monumental task of presenting the complex concept of Big Numbers in fresh and relatable ways. The book begins with small, easily observable numbers before building up to truly gigantic ones, like a nonillion, a tredecillion, a googol, and even ones too huge for names! Any person, regardless of age, can benefit from reading this book. Readers will find themselves returning to its pages for a very long time, perpetually learning from and growing with the narrative as their knowledge deepens. Really Big Numbers is a wonderful enrichment for any math education program and is enthusiastically recommended to every teacher, parent and grandparent, student, child, or other individual i...

  9. ANALYTICS OF BIG DATA

    Directory of Open Access Journals (Sweden)

    Prof. Shubhada Talegaon

    2015-10-01

    Full Text Available Big Data analytics has started to impact all types of organizations, as it carries the potential to extract embedded knowledge from large amounts of data and react to it in real time. Current technology enables us to efficiently store and query large datasets; the focus is now on techniques that make use of the complete data set, instead of sampling. This has tremendous implications for areas like machine learning, pattern recognition and classification, sentiment analysis, and social network analysis, to name a few. Therefore, there are a number of requirements for moving beyond standard data mining techniques. The purpose of this paper is to survey various techniques for analysing data.

  10. Big Data-Survey

    Directory of Open Access Journals (Sweden)

    P.S.G. Aruna Sri

    2016-03-01

    Full Text Available Big data is the term for any collection of data sets so large and complex that it becomes hard to process using conventional data processing applications. The challenges include analysis, capture, curation, search, sharing, storage, transfer, visualization, and privacy violations. To spot business trends, anticipate diseases, combat crime and so on, larger data sets are required rather than smaller ones. Big data is difficult to work with using most relational database management systems and desktop statistics and visualization packages, requiring instead massively parallel software running on tens, hundreds, or even thousands of servers. This paper surveys the Hadoop architecture, the different tools used for big data, and its security issues.

  11. Big Data and reality

    Directory of Open Access Journals (Sweden)

    Ryan Shaw

    2015-11-01

    Full Text Available DNA sequencers, Twitter, MRIs, Facebook, particle accelerators, Google Books, radio telescopes, Tumblr: what do these things have in common? According to the evangelists of “data science,” all of these are instruments for observing reality at unprecedentedly large scales and fine granularities. This perspective ignores the social reality of these very different technological systems, ignoring how they are made, how they work, and what they mean in favor of an exclusive focus on what they generate: Big Data. But no data, big or small, can be interpreted without an understanding of the process that generated them. Statistical data science is applicable to systems that have been designed as scientific instruments, but is likely to lead to confusion when applied to systems that have not. In those cases, a historical inquiry is preferable.

  12. Big Data as Governmentality

    DEFF Research Database (Denmark)

    Flyverbom, Mikkel; Klinkby Madsen, Anders; Rasche, Andreas

    This paper conceptualizes how large-scale data and algorithms condition and reshape knowledge production when addressing international development challenges. The concept of governmentality and four dimensions of an analytics of government are proposed as a theoretical framework to examine how big...... data is constituted as an aspiration to improve the data and knowledge underpinning development efforts. Based on this framework, we argue that big data’s impact on how relevant problems are governed is enabled by (1) new techniques of visualizing development issues, (2) linking aspects...... of international development agendas to algorithms that synthesize large-scale data, (3) novel ways of rationalizing knowledge claims that underlie development efforts, and (4) shifts in professional and organizational identities of those concerned with producing and processing data for development. Our discussion...

  13. Geotechnologies to estimate flooded rice crop area

    Directory of Open Access Journals (Sweden)

    Marcos Adami

    2006-12-01

    Full Text Available Estimates of the main crops are fundamental for planning the agricultural sector and for generating statistics on coming harvests. Among the several products of importance in the national agricultural scene, the rice crop represents an important share of total Brazilian grain production. Most of this production is found in the southern states of the country, mainly in Rio Grande do Sul state, which accounts for about 47% of the national production. Several producing areas in this state, particularly in its extreme southern region, showed a big expansion of this crop, both in technological development and in the development of new varieties. The objective of this work was to calculate the area cultivated with flooded rice in the municipality of Santa Vitória do Palmar, Rio Grande do Sul State, using TM and ETM+ sensor images from the Landsat 5 and 7 satellites, respectively. The images were acquired throughout the crop season in order to define the most favorable period for their acquisition. The multitemporal analysis of the images allowed the flooded rice area to be estimated at 60,557 ha, underestimating by 1.67% the official figure from IRGA (the Rice Institute of Rio Grande do Sul). When the dates were analyzed individually, the March image showed an excellent result. In the multitemporal analysis, the classifications using November + December and December alone obtained satisfactory results, with the advantage of supplying a forecast of the area planted with flooded rice.
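
    The arithmetic behind such an area estimate is straightforward pixel counting. In the sketch below the pixel count is back-computed from the reported 60,557 ha, and the official figure from the stated 1.67% difference, for illustration only.

    ```python
    # Landsat TM/ETM+ pixels are 30 m x 30 m = 900 m^2 = 0.09 ha.
    PIXEL_HA = 0.09
    rice_pixels = 672_856            # back-computed from the reported area
    area_ha = rice_pixels * PIXEL_HA

    official_ha = 61_586             # implied by the reported -1.67% difference
    diff_pct = 100 * (area_ha - official_ha) / official_ha

    print(f"mapped area: {area_ha:,.0f} ha")        # ~60,557 ha
    print(f"difference vs IRGA: {diff_pct:+.2f}%")  # ~-1.67%
    ```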

  14. Big³. Editorial.

    Science.gov (United States)

    Lehmann, C U; Séroussi, B; Jaulent, M-C

    2014-05-22

    To provide an editorial introduction to the 2014 IMIA Yearbook of Medical Informatics, with an overview of the content, the new publishing scheme, and the upcoming 25th anniversary. A brief overview of the 2014 special topic, Big Data - Smart Health Strategies, and an outline of the novel publishing model are provided, in conjunction with a call for proposals to celebrate the 25th anniversary of the Yearbook. 'Big Data' has become the latest buzzword in informatics and promises new approaches and interventions that can improve health, well-being, and quality of life. This edition of the Yearbook acknowledges the fact that we have just started to explore the opportunities that 'Big Data' will bring. However, it will become apparent to the reader that its pervasive nature has invaded all aspects of biomedical informatics - some to a higher degree than others. It was our goal to provide a comprehensive view of the state of 'Big Data' today, explore its strengths and weaknesses as well as its risks, discuss emerging trends, tools, and applications, and stimulate the development of the field through the aggregation of excellent survey papers and working group contributions on the topic. For the first time in its history, the IMIA Yearbook will be published in an open access online format, allowing a broader readership, especially in resource-poor countries. Also for the first time, thanks to the online format, the Yearbook will be published twice in the year, with two different tracks of papers. We anticipate that the important role of the IMIA Yearbook will further increase with these changes, just in time for its 25th anniversary in 2016.

  15. DARPA's Big Mechanism program

    Science.gov (United States)

    Cohen, Paul R.

    2015-07-01

    Reductionist science produces causal models of small fragments of complicated systems. Causal models of entire systems can be hard to construct because what is known of them is distributed across a vast amount of literature. The Big Mechanism program aims to have machines read the literature and assemble the causal fragments found in individual papers into huge causal models, automatically. The current domain of the program is cell signalling associated with Ras-driven cancers.

  16. Big Bang 8

    CERN Document Server

    Apolin, Martin

    2008-01-01

    Physics should be understandable and fun! That is why every chapter in Big Bang begins with a motivating overview and guiding questions, and then proceeds from fundamentals to applications, from the simple to the complex. Throughout, the language remains simple, everyday-oriented and literary. Volume 8 gives an accessible account of relativity, nuclear and particle physics (and their applications in cosmology and astrophysics), nanotechnology and bionics.

  17. Big Bang 6

    CERN Document Server

    Apolin, Martin

    2008-01-01

    Physics should be understandable and fun! That is why every chapter in Big Bang begins with a motivating overview and guiding questions, and then moves from the fundamentals to the applications, from the simple to the complicated. Throughout, the language stays simple, everyday-oriented, and literary. Volume 6 RG covers gravitation, oscillations and waves, thermodynamics, and an introduction to electricity, using everyday examples and cross-connections to other disciplines.

  18. Big Bang 5

    CERN Document Server

    Apolin, Martin

    2007-01-01

    Physics should be understandable and fun! That is why every chapter in Big Bang begins with a motivating overview and guiding questions, and then moves from the fundamentals to the applications, from the simple to the complicated. Throughout, the language stays simple, everyday-oriented, and literary. Volume 5 RG covers the fundamentals (system of units, orders of magnitude) and mechanics (translation, rotation, force, conservation laws).

  19. Big Bang 7

    CERN Document Server

    Apolin, Martin

    2008-01-01

    Physics should be understandable and fun! That is why every chapter in Big Bang begins with a motivating overview and guiding questions, and then moves from the fundamentals to the applications, from the simple to the complicated. Throughout, the language stays simple, everyday-oriented, and literary. In addition to an introduction, Volume 7 also covers many current aspects of quantum mechanics (e.g. quantum teleportation) and electrodynamics (e.g. electrosmog), as well as the climate problem and chaos theory.

  20. DARPA's Big Mechanism program.

    Science.gov (United States)

    Cohen, Paul R

    2015-07-16

    Reductionist science produces causal models of small fragments of complicated systems. Causal models of entire systems can be hard to construct because what is known of them is distributed across a vast amount of literature. The Big Mechanism program aims to have machines read the literature and assemble the causal fragments found in individual papers into huge causal models, automatically. The current domain of the program is cell signalling associated with Ras-driven cancers.

  1. Big Bang Circus

    Science.gov (United States)

    Ambrosini, C.

    2011-06-01

    Big Bang Circus is an opera I composed in 2001, which was premiered at the Venice Biennale Contemporary Music Festival in 2002. A chamber group, four singers, and a ringmaster stage the story of the Universe, confronting and interweaving two threads: how early man imagined it and how scientists have described it. Surprisingly enough, fantasy, myths, and scientific explanations often end up using the same images, metaphors, and sometimes even words: a strong tension, a drumskin starting to vibrate, a shout…

  2. Edible Astronomy Demonstrations

    Science.gov (United States)

    Lubowich, Donald A.

    2007-12-01

    Astronomy demonstrations with edible ingredients are an effective way to increase student interest in and knowledge of astronomical concepts. This approach has been successful with all age groups, from elementary school through college students - and the students remember these demonstrations after they are presented. In this poster I describe edible demonstrations I have created to simulate the expansion of the universe (using big-bang chocolate chip cookies); differentiation during the formation of the Earth and planets (using chocolate or chocolate milk with marshmallows, cereal, candy pieces or nuts); and radioactivity/radioactive dating (using popcorn). Other possible demonstrations include: plate tectonics (crackers with peanut butter and jelly); convection (miso soup or hot chocolate); mud flows on Mars (melted chocolate poured over angel food cake); formation of the Galactic disk (pizza); formation of spiral arms (coffee with cream); the curvature of space (Pringles); constellation patterns with chocolate chips and chocolate chip cookies; planet-shaped cookies; star-shaped cookies with different colored frostings; coffee or chocolate milk measurement of solar radiation; and Oreo cookie lunar phases. Sometimes the students eat the results of the astronomical demonstrations. These demonstrations are an effective teaching tool and can be adapted for cultural, culinary, and ethnic differences among the students.

  3. Big Data Knowledge Mining

    Directory of Open Access Journals (Sweden)

    Huda Umar Banuqitah

    2016-11-01

    Full Text Available The Big Data (BD) era has arrived. The ascent of big data applications means that information accumulation has grown beyond the ability of present software tools to capture, manage, and process it within a tolerably short time. Volume is not the only characteristic that defines big data; so do velocity, variety, and value. Many resources contain BD that should be processed. The biomedical research literature is one among many domains hiding rich knowledge. MEDLINE is a huge biomedical research database that remains a significantly underutilized source of biological information. Discovering useful knowledge in such a huge corpus raises many problems related to the type of information, such as the related concepts of the texts' domain and the semantic relationships associated with them. In this paper, a two-level agent-based system for self-supervised relation extraction from MEDLINE using the Unified Medical Language System (UMLS) knowledge base is proposed. The model uses a self-supervised approach to relation extraction (RE), constructing enhanced training examples using information from UMLS together with hybrid text features. The model incorporates the Apache Spark and HBase big data technologies along with multiple data mining and machine learning techniques in a multi-agent system (MAS). The system shows better results than the current state of the art and a naïve approach in terms of accuracy, precision, recall, and F-score.
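
    A compact sketch of the self-supervised (distant-supervision) step the abstract describes: sentences are labelled automatically from a knowledge base of known concept pairs, and a classifier is trained on the result. The tiny knowledge base, sentences, and scikit-learn pipeline are invented stand-ins for UMLS, MEDLINE, and the Spark/HBase stack.

        # Distant supervision in miniature: auto-label, then train.
        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.linear_model import LogisticRegression

        kb = {("aspirin", "pain"): "treats", ("ibuprofen", "fever"): "treats",
              ("smoking", "cancer"): "causes", ("obesity", "diabetes"): "causes"}
        sentences = ["aspirin relieves pain quickly",
                     "smoking is linked to lung cancer",
                     "ibuprofen reduces fever in children",
                     "obesity increases diabetes risk"]

        def label(sent):
            """Return a relation label when a known concept pair co-occurs."""
            for (a, b), rel in kb.items():
                if a in sent and b in sent:
                    return rel
            return None

        X, y = zip(*[(s, label(s)) for s in sentences if label(s)])
        vec = TfidfVectorizer()
        clf = LogisticRegression().fit(vec.fit_transform(X), y)
        print(clf.predict(vec.transform(["aspirin eases joint pain"])))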

  4. Disaggregating asthma: Big investigation versus big data.

    Science.gov (United States)

    Belgrave, Danielle; Henderson, John; Simpson, Angela; Buchan, Iain; Bishop, Christopher; Custovic, Adnan

    2017-02-01

    We are facing a major challenge in bridging the gap between identifying subtypes of asthma to understand causal mechanisms and translating this knowledge into personalized prevention and management strategies. In recent years, "big data" has been sold as a panacea for generating hypotheses and driving new frontiers of health care; the idea that the data must and will speak for themselves is fast becoming a new dogma. One of the dangers of ready accessibility of health care data and computational tools for data analysis is that the process of data mining can become uncoupled from the scientific process of clinical interpretation, understanding the provenance of the data, and external validation. Although advances in computational methods can be valuable for using unexpected structure in data to generate hypotheses, there remains a need for testing hypotheses and interpreting results with scientific rigor. We argue for combining data- and hypothesis-driven methods in a careful synergy, and the importance of carefully characterized birth and patient cohorts with genetic, phenotypic, biological, and molecular data in this process cannot be overemphasized. The main challenge on the road ahead is to harness bigger health care data in ways that produce meaningful clinical interpretation and to translate this into better diagnoses and properly personalized prevention and treatment plans. There is a pressing need for cross-disciplinary research with an integrative approach to data science, whereby basic scientists, clinicians, data analysts, and epidemiologists work together to understand the heterogeneity of asthma.

  5. Risk-trading in flood management: An economic model.

    Science.gov (United States)

    Chang, Chiung Ting

    2017-09-15

    Although flood management is no longer exclusively a topic of engineering, flood mitigation continues to be associated with hard engineering options. Flood adaptation or the capacity to adapt to flood risk, as well as a demand for internalizing externalities caused by flood risk between regions, complicate flood management activities. Even though integrated river basin management has long been recommended to resolve the above issues, it has proven difficult to apply widely, and sometimes even to bring into existence. This article explores how internalization of externalities as well as the realization of integrated river basin management can be encouraged via the use of a market-based approach, namely a flood risk trading program. In addition to maintaining efficiency of optimal resource allocation, a flood risk trading program may also provide a more equitable distribution of benefits by facilitating decentralization. This article employs a graphical analysis to show how flood risk trading can be implemented to encourage mitigation measures that increase infiltration and storage capacity. A theoretical model is presented to demonstrate the economic conditions necessary for flood risk trading. Copyright © 2017 Elsevier Ltd. All rights reserved.

  6. Late Pleistocene outburst flooding from pluvial Lake Alvord into the Owyhee River, Oregon

    Science.gov (United States)

    Carter, Deron T.; Ely, Lisa L.; O'Connor, Jim E.; Fenton, Cassandra R.

    2006-05-01

    At least one large, late Pleistocene flood traveled into the Owyhee River as a result of a rise and subsequent outburst from pluvial Lake Alvord in southeastern Oregon. Lake Alvord breached Big Sand Gap in its eastern rim after reaching an elevation of 1292 m, releasing 11.3 km³ of water into the adjacent Coyote Basin as it eroded the Big Sand Gap outlet channel to an elevation of about 1280 m. The outflow filled and then spilled out of Coyote Basin through two outlets at 1278 m and into the Crooked Creek drainage, ultimately flowing into the Owyhee and Snake Rivers. Along Crooked Creek, the resulting flood eroded canyons, stripped bedrock surfaces, and deposited numerous boulder bars containing imbricated clasts up to 4.1 m in diameter, some of which are located over 30 m above the present-day channel. Critical depth calculations at Big Sand Gap show that maximum outflow from a 1292- to 1280-m drop in Lake Alvord was ~10,000 m³ s⁻¹. Flooding became confined to a single channel approximately 40 km downstream of Big Sand Gap, where step-backwater calculations show that a much larger peak discharge of 40,000 m³ s⁻¹ is required to match the highest geologic evidence of the flood in this channel. This inconsistency can be explained by (1) a single 10,000 m³ s⁻¹ flood that caused at least 13 m of vertical incision in the channel (hence enlarging the channel cross-section); (2) multiple floods of 10,000 m³ s⁻¹ or less, each producing some incision of the channel; or (3) an earlier flood of 40,000 m³ s⁻¹ creating the highest flood deposits and crossed drainage divides observed along the Crooked Creek drainage, followed by a later 10,000 m³ s⁻¹ flood associated with the most recent shorelines in the Alvord and Coyote Basins. Well-developed shorelines of Lake Alvord at 1280 m and in Coyote Basin at 1278 m suggest that after the initial flood, postflood overflow persisted for an extended period, connecting Alvord and Coyote Basins with the Owyhee River of the
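
    A hedged back-of-the-envelope version of the critical-depth estimate, assuming a broad-crested rectangular breach. The 140 m effective breach width is an invented placeholder chosen only so the toy numbers land near the reported ~10,000 m³ s⁻¹; it is not a value from the paper.

        # Critical flow over a rectangular sill: y_c = (2/3)H, q = sqrt(g*y_c^3).
        import math

        g = 9.81              # gravity, m/s^2
        head = 1292 - 1280    # lake surface minus eroded outlet sill, m
        width = 140.0         # assumed effective breach width, m

        y_c = (2 / 3) * head               # critical depth over the sill, m
        q = math.sqrt(g * y_c ** 3)        # unit discharge, m^2/s per m of width
        print(f"peak outflow ~ {width * q:,.0f} m^3/s")  # roughly 10,000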

  7. Tested Demonstrations.

    Science.gov (United States)

    Gilbert, George L., Ed.

    1983-01-01

    Free radical chlorination of methane is used in organic chemistry to introduce free radical/chain reactions. In spite of its common occurrence, demonstrations of the reaction are uncommon. Therefore, such a demonstration is provided, including background information, preparation of reactants/reaction vessel, introduction of reactants, irradiation,…

  8. Tested Demonstrations.

    Science.gov (United States)

    Gilbert, George L., Ed.

    1983-01-01

    Discusses a supplement to the "water to rose" demonstration in which a pink color is produced. Also discusses blood buffer demonstrations, including hydrolysis of sodium bicarbonate, simulated blood buffer, metabolic acidosis, natural compensation of metabolic acidosis, metabolic alkalosis, acidosis treatment, and alkalosis treatment. Procedures…

  9. Complete Demonstration.

    Science.gov (United States)

    Yelon, Stephen; Maddocks, Peg

    1986-01-01

    Describes four-step approach to educational demonstration: tell learners they will have to perform; what they should notice; describe each step before doing it; and require memorization of steps. Examples illustrate use of this process to demonstrate a general mental strategy, and industrial design, supervisory, fine motor, and specific…

  10. Tested Demonstrations.

    Science.gov (United States)

    Gilbert, George L., Ed.

    1987-01-01

    Describes two laboratory demonstrations in chemistry. One uses dry ice, freon, and freezer bags to demonstrate volume changes, vapor-liquid equilibrium, a simulation of a rain forest, and vaporization. The other uses the clock reaction technique to illustrate fast reactions and kinetic problems in releasing carbon dioxide during respiration. (TW)

  11. Tested Demonstrations.

    Science.gov (United States)

    Gilbert, George L., Ed.

    1986-01-01

    Outlines a simple, inexpensive way of demonstrating electroplating using the reaction between nickel ions and copper metal. Explains how to conduct a demonstration of the electrolysis of water by using a colored Na2SO4 solution as the electrolyte so that students can observe the pH changes. (TW)

  12. Meta-analyses of Big Six Interests and Big Five Personality Factors.

    Science.gov (United States)

    Larson, Lisa M.; Rottinghaus, Patrick J.; Borgen, Fred H.

    2002-01-01

    Meta-analysis of 24 samples demonstrated overlap between Holland's vocational interest domains (measured by the Self-Directed Search, Strong Interest Inventory, and Vocational Preference Inventory) and Big Five personality factors (measured by the Revised NEO Personality Inventory). The link is stronger for five interest-personality pairs:…

  13. Flood insurance in Canada: implications for flood management and residential vulnerability to flood hazards.

    Science.gov (United States)

    Oulahen, Greg

    2015-03-01

    Insurance coverage of damage caused by overland flooding is currently not available to Canadian homeowners. As flood disaster losses and water damage claims both trend upward, insurers in Canada are considering offering residential flood coverage in order to properly underwrite the risk and extend their business. If private flood insurance is introduced in Canada, it will have implications for the current regime of public flood management and for residential vulnerability to flood hazards. This paper engages many of the competing issues surrounding the privatization of flood risk by addressing questions about whether flood insurance can be an effective tool in limiting exposure to the hazard and how it would exacerbate already unequal vulnerability. A case study investigates willingness to pay for flood insurance among residents in Metro Vancouver and how attitudes about insurance relate to other factors that determine residential vulnerability to flood hazards. Findings indicate that demand for flood insurance is part of a complex, dialectical set of determinants of vulnerability.

  14. Flood Insurance in Canada: Implications for Flood Management and Residential Vulnerability to Flood Hazards

    Science.gov (United States)

    Oulahen, Greg

    2015-03-01

    Insurance coverage of damage caused by overland flooding is currently not available to Canadian homeowners. As flood disaster losses and water damage claims both trend upward, insurers in Canada are considering offering residential flood coverage in order to properly underwrite the risk and extend their business. If private flood insurance is introduced in Canada, it will have implications for the current regime of public flood management and for residential vulnerability to flood hazards. This paper engages many of the competing issues surrounding the privatization of flood risk by addressing questions about whether flood insurance can be an effective tool in limiting exposure to the hazard and how it would exacerbate already unequal vulnerability. A case study investigates willingness to pay for flood insurance among residents in Metro Vancouver and how attitudes about insurance relate to other factors that determine residential vulnerability to flood hazards. Findings indicate that demand for flood insurance is part of a complex, dialectical set of determinants of vulnerability.

  15. Predicting floods with Flickr tags.

    Science.gov (United States)

    Tkachenko, Nataliya; Jarvis, Stephen; Procter, Rob

    2017-01-01

    Increasingly, user generated content (UGC) in social media postings and their associated metadata, such as time and location stamps, are being used to provide useful operational information during natural hazard events such as hurricanes, storms and floods. The main advantages of these new sources of data are twofold. First, in a purely additive sense, they can provide much denser geographical coverage of the hazard as compared to traditional sensor networks. Second, they provide what physical sensors are not able to do: by documenting personal observations and experiences, they directly record the impact of a hazard on the human environment. For this reason, interpretation of the content (e.g., hashtags, images, text, emojis) and metadata (e.g., keywords, tags, geolocation) has been a focus of much research into social media analytics. However, as the choices of semantic tags in current methods are usually reduced to the exact name or type of the event (e.g., hashtags '#Sandy' or '#flooding'), the main limitation of such approaches remains their mere nowcasting capacity. In this study we make use of polysemous tags of images posted during several recent flood events and demonstrate how such volunteered geographic data can be used to provide early warning of an event before its outbreak.
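
    A toy illustration of how polysemous tags might feed an early-warning signal: count daily occurrences of weakly flood-related tags and flag days that exceed a simple threshold. The posts, tag list, and threshold are fabricated examples, not the study's data or method details.

        # Tag-count early-warning sketch over fabricated Flickr-style metadata.
        from collections import Counter
        from datetime import date

        posts = [(date(2015, 12, 24), {"york", "rain"}),
                 (date(2015, 12, 25), {"river", "rain"}),
                 (date(2015, 12, 26), {"flood", "river", "rain", "water"})]
        watch = {"rain", "river", "water"}  # polysemous, not flood-specific

        daily = Counter()
        for day, tags in posts:
            daily[day] += len(tags & watch)
        for day, hits in sorted(daily.items()):
            print(day, "ALERT" if hits >= 2 else "ok", hits)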

  16. Unexpected flood loss correlations across Europe

    Science.gov (United States)

    Booth, Naomi; Boyd, Jessica

    2017-04-01

    Floods don't observe country borders, as highlighted by major events across Europe that resulted in heavy economic and insured losses in 1999, 2002, 2009 and 2013. Flood loss correlations between some countries occur along multi-country river systems or between neighbouring nations affected by the same weather systems. However, correlations are not so obvious and whilst flooding in multiple locations across Europe may appear independent, for a re/insurer providing cover across the continent, these unexpected correlations can lead to high loss accumulations. A consistent, continental-scale method that allows quantification and comparison of losses, and identifies correlations in loss between European countries is therefore essential. A probabilistic model for European river flooding was developed that allows estimation of potential losses to pan-European property portfolios. By combining flood hazard and exposure information in a catastrophe modelling platform, we can consider correlations between river basins across Europe rather than being restricted to country boundaries. A key feature of the model is its statistical event set based on extreme value theory. Using historical river flow data, the event set captures spatial and temporal patterns of flooding across Europe and simulates thousands of events representing a full range of possible scenarios. Some known correlations were identified, such as between neighbouring Belgium and Luxembourg where 28% of events that affect either country produce a loss in both. However, our model identified some unexpected correlations including between Austria and Poland, and Poland and France, which are geographically distant. These correlations in flood loss may be missed by traditional methods and are key for re/insurers with risks in multiple countries. The model also identified that 46% of European river flood events affect more than one country. For more extreme events with a return period higher than 200 years, all events
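
    A minimal sketch of the extreme-value building block behind such a statistical event set: fit a GEV distribution to annual-maximum flows and read off a return level. The synthetic record stands in for the historical river-flow data mentioned in the abstract.

        # Fit a GEV to annual maxima and estimate the 200-year flow.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        annual_max = stats.genextreme.rvs(c=-0.1, loc=800, scale=200,
                                          size=60, random_state=rng)

        shape, loc, scale = stats.genextreme.fit(annual_max)
        q200 = stats.genextreme.ppf(1 - 1 / 200, shape, loc=loc, scale=scale)
        print(f"200-year flow: {q200:,.0f} m^3/s")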

  17. The Protection of China's Ancient Cities from Flood Damage.

    Science.gov (United States)

    Qingzhou, W

    1989-09-01

    Over many centuries, the repeated and serious flooding of many of China's ancient cities has led to the development of various measures to mitigate the impact of floods. These have included structural measures, such as the construction of walls, dams and dykes, with tree planting for soil consolidation; installation of drainage systems and water storage capacity; the raising of settlement levels and the strengthening of building materials. Non-structural measures include warning systems and planning for emergency evacuation. Urban planning and architectural design have evolved to reduce flood damage, and government officials have been appointed with specific responsibilities for managing the flood control systems. In view of the serious consequences of modern neglect of these well-tried methods, this paper examines China's historical experience of flooding and demonstrates its continuing relevance for today. A brief historical survey is followed by a detailed discussion of various flood prevention measures. The paper is illustrated by city plans from ancient local chronicles.

  18. Effect of flooding on C metabolism of flood-tolerant (Quercus robur) and non-tolerant (Fagus sylvatica) tree species.

    Science.gov (United States)

    Ferner, Eleni; Rennenberg, Heinz; Kreuzwieser, Jürgen

    2012-02-01

    Flooding is assumed to cause an energy crisis in plants because, due to a lack of O₂, mitochondrial respiration is replaced by alcoholic fermentation, which yields considerably fewer energy equivalents. In the present study, the effect of flooding on the carbon metabolism of flooding-tolerant pedunculate oak (Quercus robur L.) and flooding-sensitive European beech (Fagus sylvatica L.) seedlings was characterized. Whereas soluble carbohydrate concentrations dropped in roots of F. sylvatica, they were constant in Q. robur during flooding. At the same time, root alcohol dehydrogenase activities were decreased in beech but not in oak, suggesting substrate limitation of alcoholic fermentation in beech roots. Surprisingly, leaf and phloem sap sugar concentrations increased in both species, but to a much higher degree in beech. This finding suggests that the phloem unloading process in flooding-sensitive beech was strongly impaired. It is assumed that root-derived ethanol is transported to the leaves via the transpiration stream. This mechanism is considered an adaptation to flooding because it helps avoid the accumulation of toxic ethanol in the roots and supports the whole plant's carbon metabolism by channelling ethanol into the oxidative metabolism of the leaves. A labelling experiment demonstrated that ethanol metabolism in the leaves does not differ between flooded beech and oak, indicating that processes in the roots are crucial for the trees' flooding tolerance.

  19. Modeling of Flood Risk for the Continental United States

    Science.gov (United States)

    Lohmann, D.; Li, S.; Katz, B.; Goteti, G.; Kaheil, Y. H.; Vojjala, R.

    2011-12-01

    The science of catastrophic risk modeling helps people to understand the physical and financial implications of natural catastrophes (hurricanes, floods, earthquakes, etc.), terrorism, and the risks associated with changes in life expectancy. As such, it depends on simulation techniques that integrate multiple disciplines such as meteorology, hydrology, structural engineering, statistics, computer science, financial engineering, and actuarial science, in virtually every field of technology. In this talk we will explain the techniques and underlying assumptions of building the RMS US flood risk model. We will pay particular attention to correlation (spatial and temporal), simulation, and uncertainty in each of the various components of the development process. Recent extreme floods (e.g., the US Midwest flood of 2008 and the US Northeast flood of 2010) have increased concern about flood risk. Consequently, there is a growing need to assess flood risk adequately. The RMS flood hazard model comprises three major components. (1) A stochastic precipitation simulation module based on a Monte Carlo analogue technique, which is capable of producing correlated rainfall events for the continental US. (2) A rainfall-runoff and routing module. A semi-distributed rainfall-runoff model was developed to properly assess antecedent conditions and determine the saturation area and runoff. The runoff is further routed downstream along the rivers by a routing model. Combined with the precipitation model, this allows us to correlate streamflow, and hence flooding, from different rivers, as well as low and high return periods, across the continental US. (3) A flood inundation module. It transforms the discharge (output from the flow routing) into water level, which is further combined with a two-dimensional off-floodplain inundation model to produce a comprehensive flood hazard map. The performance of the model is demonstrated by comparison with observations and published data. Output from
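
    A compressed caricature of the three-component chain described above: stochastic rainfall, a rainfall-runoff bucket, and a damage placeholder. Every function is a one-line stand-in for a full module; the parameters are invented purely to show how the pieces connect.

        # Stochastic rainfall -> toy bucket runoff -> threshold damage.
        import numpy as np

        rng = np.random.default_rng(11)

        def simulate_rainfall(days):              # (1) stochastic precipitation
            return rng.gamma(0.4, 12.0, days) * (rng.random(days) < 0.3)

        def rainfall_runoff(rain, soil=40.0):     # (2) saturation-excess bucket
            store, q = 0.0, []
            for p in rain:
                total = store + p
                q.append(max(total - soil, 0.0))  # runoff once storage is full
                store = min(total, soil) * 0.95   # storage drains between days
            return np.array(q)

        def damage(flow, threshold=10.0):         # (3) depth/damage placeholder
            return np.maximum(flow - threshold, 0.0).sum() * 1e5

        loss = damage(rainfall_runoff(simulate_rainfall(180)))
        print(f"simulated season loss: ${loss:,.0f}")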

  20. Interactive Web-based Floodplain Simulation System for Realistic Experiments of Flooding and Flood Damage

    Science.gov (United States)

    Demir, I.

    2013-12-01

    Recent developments in web technologies make it easy to manage large data sets and share visualizations of them with the general public. Novel visualization techniques and dynamic user interfaces allow users to create realistic environments and to interact with data to gain insight from simulations and environmental observations. The floodplain simulation system is a web-based, 3D, interactive flood simulation environment for creating real-world flooding scenarios. The simulation system provides a visually striking platform with realistic terrain information and water simulation. Students can create and modify predefined scenarios, control environmental parameters, and evaluate flood mitigation techniques. The web-based simulation system provides an environment for children and adults to learn about flooding, flood damage, and the effects of development and human activity in the floodplain. The system provides various scenarios customized to fit the age and education level of the users. This presentation provides an overview of the web-based flood simulation system and demonstrates its capabilities for various flooding and land use scenarios.

  1. Dynamic Flood Vulnerability Mapping with Google Earth Engine

    Science.gov (United States)

    Tellman, B.; Kuhn, C.; Max, S. A.; Sullivan, J.

    2015-12-01

    Satellites capture the rate and character of environmental change from local to global levels, yet integrating these changes into flood exposure models can be cost- or time-prohibitive. We explore an approach to global flood modeling by leveraging satellite data with computing power in Google Earth Engine to dynamically map flood hazards. Our research harnesses satellite imagery in two main ways: first to generate a globally consistent flood inundation layer, and second to dynamically model flood vulnerability. Accurate and relevant hazard maps rely on high-quality observation data. Advances in publicly available spatial, spectral, and radar data, together with cloud computing, allow us to improve existing efforts to develop a comprehensive flood extent database to support model training and calibration. This talk will demonstrate the classification results of algorithms developed in Earth Engine designed to detect flood events by combining observations from MODIS, Landsat 8, and Sentinel-1. Our method for deriving flood footprints increases the number, resolution, and precision of spatial observations for flood events both in the US, as recorded in the NCDC (National Climatic Data Center) storm events database, and globally, as recorded in the Colorado Flood Observatory database. This improved dataset can then be used to train machine learning models that relate spatiotemporal flood observations to satellite-derived spatiotemporal predictor variables such as precipitation, antecedent soil moisture, and impervious surface. This modeling approach allows us to rapidly update models with each new flood observation, providing near-real-time vulnerability maps. We will share the water detection algorithms used with each satellite and discuss flood detection results with examples from Bihar, India and the state of New York. We will also demonstrate how these flood observations are used to train machine learning models and estimate flood exposure. The final stage of
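
    A hedged Earth Engine sketch of the Sentinel-1 part of such a water-detection workflow. The -16 dB backscatter cut-off and the rough Bihar bounding box are illustrative assumptions, not the study's calibrated values; running it requires an authenticated Earth Engine account.

        # Sentinel-1 VV backscatter threshold as a crude open-water detector.
        import ee

        ee.Initialize()
        region = ee.Geometry.Rectangle([84.0, 25.0, 87.0, 27.0])  # rough Bihar box

        s1 = (ee.ImageCollection('COPERNICUS/S1_GRD')
              .filterBounds(region)
              .filterDate('2015-07-01', '2015-08-31')
              .filter(ee.Filter.listContains('transmitterReceiverPolarisation', 'VV'))
              .select('VV'))

        water = s1.min().lt(-16)  # low VV backscatter suggests smooth water
        frac = water.reduceRegion(ee.Reducer.mean(), region, scale=500)
        print(frac.getInfo())     # fraction of the region flagged as water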

  2. Tested Demonstrations.

    Science.gov (United States)

    Gilbert, George L.

    1990-01-01

    Included are three demonstrations that include the phase change of ice when under pressure, viscoelasticity and colloid systems, and flame tests for metal ions. The materials, procedures, probable results, and applications to real life situations are included. (KR)

  3. Tested Demonstrations.

    Science.gov (United States)

    Gilbert, George L., Ed.

    1980-01-01

    Presented is a Corridor Demonstration which can be set up in readily accessible areas such as hallways or lobbies. Equipment is listed for a display of three cells (solar cells, fuel cells, and storage cells) which develop electrical energy. (CS)

  4. Tested Demonstrations.

    Science.gov (United States)

    Gilbert, George L., Ed.

    1987-01-01

    Presents three demonstrations suitable for undergraduate chemistry classes. Focuses on experiments with calcium carbide, the induction by iron of the oxidation of iodide by dichromate, and the classical iodine clock reaction. (ML)

  5. Flood Risk Management in Iowa through an Integrated Flood Information System

    Science.gov (United States)

    Demir, Ibrahim; Krajewski, Witold

    2013-04-01

    communities in advance to help minimize flood damage. This presentation provides an overview and live demonstration of the tools and interfaces in the IFIS developed to date, which provide a platform for one-stop access to flood-related data, visualizations, flood conditions, and forecasts.

  6. Random Forests for Big Data

    OpenAIRE

    Genuer, Robin; Poggi, Jean-Michel; Tuleau-Malot, Christine; Villa-Vialaneix, Nathalie

    2017-01-01

    Big Data is one of the major challenges of statistical science and has numerous consequences from algorithmic and theoretical viewpoints. Big Data always involve massive data but they also often include online data and data heterogeneity. Recently some statistical methods have been adapted to process Big Data, like linear regression models, clustering methods and bootstrapping schemes. Based on decision trees combined with aggregation and bootstrap ideas, random forests were introduced by Bre...

  7. FLOODPLAIN, FLOOD COUNTY, USA

    Data.gov (United States)

    Federal Emergency Management Agency, Department of Homeland Security — The Floodplain Mapping/Redelineation study deliverables depict and quantify the flood risks for the study area. The primary risk classifications used are the...

  8. Localized Flood Management

    Science.gov (United States)

    practitioners will cover a range of practices that can help communities build flood resilience, from small scale interventions such as rain gardens and permeable pavement to coordinated open space and floodplain preservation

  9. Floods and Mold Growth

    Science.gov (United States)

    Mold growth may be a problem after flooding. Excess moisture in the home is cause for concern about indoor air quality primarily because it provides breeding conditions for pests, molds and other microorganisms.

  10. The Terrible Flood

    Institute of Scientific and Technical Information of China (English)

    Dorine; Houston

    1998-01-01

    Dear Xiao Lan, Several times a week, no matter which of the major television news networks I turn to, the screen is filled with tragic pictures of flooding along the Yangtze River, and I grieve for the suffering people whose lives are being so terribly disrupted by this disaster. Even more to be grieved is the terrible number of people who have been killed by the floods and their effects.

  11. Achieving Natural Flood Management through collaboration

    Science.gov (United States)

    Nicholson, Alex; Byers, Samantha; Thomas, Ted; Welton, Phil

    2016-04-01

    Recent flooding in the UK has brought much attention to the field of Natural Flood Management (NFM) as a means of helping to reduce flood risk to communities. Key questions in the field include quantifying the impact of NFM and maintaining it. In addition, agencies and at-risk communities look for ways of delivering NFM in a tightly stretched financial climate. Well-implemented NFM has the effect of restoring more natural catchment hydrological and sedimentological processes, which in turn can have significant flood risk and WFD benefits for catchment waterbodies. These catchment-scale improvements in turn allow more 'natural' processes to be returned to rivers and streams, creating a more resilient system. NFM can tick many boxes and target many funding opportunities. This paper discusses the NFM component of the Lustrum Beck Flood Alleviation Scheme (Stockton-On-Tees, UK) and explains how a multi-agency approach had to be considered to allow elements of the scheme to be delivered. A startling 70 different landowners and agencies manage the land in the Lustrum Beck catchment (~40 km²). A partnership between the Environment Agency and the Forestry Commission is planning to work on a demonstration site in the centre of the catchment. The paper goes on to explain the importance of this demonstration area in the context of the wider scheme.

  12. How Big is Earth?

    Science.gov (United States)

    Thurber, Bonnie B.

    2015-08-01

    How Big is Earth celebrates the Year of Light. Using only the sunlight striking the Earth and a wooden dowel, students meet each other and then measure the circumference of the Earth. Eratosthenes did it over 2,000 years ago. In Cosmos, Carl Sagan shared the process by which Eratosthenes measured the angle of the shadow cast at local noon when sunlight strikes a stick positioned perpendicular to the ground. By comparing his measurement to another made a distance away, Eratosthenes was able to calculate the circumference of the Earth. How Big is Earth provides an online learning environment where students do science the same way Eratosthenes did. A notable project in which this was done was The Eratosthenes Project, conducted in 2005 as part of the World Year of Physics; in fact, we will be drawing on the teacher's guide developed by that project. How Big Is Earth? expands on the Eratosthenes Project by using the online learning environment provided by the iCollaboratory, www.icollaboratory.org, where teachers and students from Sweden, China, Nepal, Russia, Morocco, and the United States collaborate, share data, and reflect on their learning of science and astronomy. They share their information, discuss their ideas, and brainstorm solutions in a discussion forum. There is an ongoing database of student measurements and another database collecting data on both teacher and student learning from surveys, discussions, and self-reflection done online. We will share our research about the kinds of learning that take place only in global collaborations. The entrance address for the iCollaboratory is http://www.icollaboratory.org.
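
    The underlying arithmetic is short enough to show directly: a minimal Python version of the measurement students perform, with made-up shadow lengths and an assumed 800 km separation between sites.

        # Eratosthenes: circumference = 360 / angle difference * site separation.
        import math

        def shadow_angle(stick_height_m, shadow_length_m):
            """Sun's angle from vertical at local noon, in degrees."""
            return math.degrees(math.atan2(shadow_length_m, stick_height_m))

        angle_a = shadow_angle(1.00, 0.000)  # sun overhead (e.g., Syene)
        angle_b = shadow_angle(1.00, 0.126)  # ~7.2 degrees (e.g., Alexandria)
        distance_km = 800                    # assumed separation of the sites

        circumference = 360 / abs(angle_b - angle_a) * distance_km
        print(f"circumference ~ {circumference:,.0f} km")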

  13. Flood Bypass Capacity Optimization

    Science.gov (United States)

    Siclari, A.; Hui, R.; Lund, J. R.

    2015-12-01

    Large river flows can damage adjacent flood-prone areas by exceeding river channel and levee capacities. Particularly large floods are difficult to contain in leveed river banks alone. Flood bypasses, where excess river flow is diverted over a weir into a bypass channel that incurs much less damage and cost, can often reduce flood risks efficiently. Additional benefits of bypasses include ecosystem protection, agriculture, groundwater recharge, and recreation. Constructing or expanding a bypass incurs costs in land purchases, easements, and levee setbacks. Accounting for such benefits and costs, this study develops a simple mathematical model for optimizing flood bypass capacity using benefit-cost and risk analysis. Application to the Yolo Bypass, an existing bypass along the Sacramento River in California, estimates the optimal capacity that economically reduces flood damage and increases various benefits, especially for agriculture. Land availability is likely to limit bypass expansion. Compensation for landowners could relax such limitations. Other economic values could affect the optimal results, which are shown by sensitivity analysis on major parameters. By including land geography in the model, locations of promising capacity expansions can be identified.
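
    A toy rendering of the benefit-cost logic: pick the capacity that minimises capital cost plus the present value of residual expected annual damage. The damage curve, unit cost, and discount rate are invented for illustration and are not Yolo Bypass values.

        # Grid search for the capacity minimising cost + discounted damage.
        import numpy as np

        capacities = np.linspace(0, 5000, 501)   # candidate capacity, m^3/s
        unit_cost = 10_000                       # land/levee cost per m^3/s ($)
        rate = 0.04                              # discount rate (perpetuity)

        def expected_annual_damage(capacity):
            """Toy curve: residual damage decays as the bypass grows."""
            return 40e6 * np.exp(-capacity / 800)

        total = unit_cost * capacities + expected_annual_damage(capacities) / rate
        best = capacities[np.argmin(total)]
        print(f"economically optimal capacity ~ {best:,.0f} m^3/s")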

  14. Privacy and Big Data

    CERN Document Server

    Craig, Terence

    2011-01-01

    Much of what constitutes Big Data is information about us. Through our online activities, we leave an easy-to-follow trail of digital footprints that reveal who we are, what we buy, where we go, and much more. This eye-opening book explores the raging privacy debate over the use of personal data, with one undeniable conclusion: once data's been collected, we have absolutely no control over who uses it or how it is used. Personal data is the hottest commodity on the market today-truly more valuable than gold. We are the asset that every company, industry, non-profit, and government wants. Pri

  15. Big Data Challenges

    Directory of Open Access Journals (Sweden)

    Alexandru Adrian TOLE

    2013-10-01

    Full Text Available The amount of data traveling across the internet today is not only large but also complex. Companies, institutions, the healthcare system, and others all use piles of data, which are further used to create reports that ensure the continuity of the services they offer. The process behind the results that these entities request represents a challenge for software developers and for companies that provide IT infrastructure. The challenge is how to manipulate an impressive volume of data that has to be securely delivered through the internet and reach its destination intact. This paper treats the challenges that Big Data creates.

  16. Quantification of uncertainty in flood risk assessment for flood protection planning: a Bayesian approach

    Science.gov (United States)

    Dittes, Beatrice; Špačková, Olga; Ebrahimian, Negin; Kaiser, Maria; Rieger, Wolfgang; Disse, Markus; Straub, Daniel

    2017-04-01

    Flood risk estimates are subject to significant uncertainties, e.g. due to limited records of historic flood events, uncertainty in flood modeling, uncertain impact of climate change, or uncertainty in the exposure and loss estimates. In traditional design of flood protection systems, these uncertainties are typically accounted for only implicitly, based on engineering judgment. In the AdaptRisk project, we develop a fully quantitative framework for planning of flood protection systems under current and future uncertainties using quantitative pre-posterior Bayesian decision analysis. In this contribution, we focus on the quantification of the uncertainties and study their relative influence on the flood risk estimate and on the planning of flood protection systems. The following uncertainty components are included using a Bayesian approach: 1) inherent and statistical (i.e. limited record length) uncertainty; 2) climate uncertainty that can be learned from an ensemble of GCM-RCM models; 3) estimates of climate uncertainty components not covered in 2), such as bias correction, incomplete ensembles, and local specifics not captured by the GCM-RCM models; 4) uncertainty in the inundation modelling; 5) uncertainty in damage estimation. We also investigate how these uncertainties may be reduced in the future when new evidence - such as new climate models, observed extreme events, and socio-economic data - becomes available. Finally, we look into how this new evidence influences the risk assessment and the effectiveness of flood protection systems. We demonstrate our methodology for a pre-alpine catchment in southern Germany: the Mangfall catchment in Bavaria, which includes the city of Rosenheim, which suffered significant losses during the 2013 flood event.
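
    A small sketch of the statistical-uncertainty component (item 1): a grid-based Bayesian fit of a Gumbel flood-frequency model to a short synthetic record, yielding a posterior-predictive 100-year discharge rather than a single best estimate. The grid, flat prior, and data are illustrative simplifications of the paper's pre-posterior analysis.

        # Grid-approximate Bayesian Gumbel fit with a flat prior.
        import numpy as np
        from scipy import stats

        record = stats.gumbel_r.rvs(loc=500, scale=120, size=30, random_state=3)

        locs = np.linspace(300, 700, 81)
        scales = np.linspace(50, 250, 81)
        L, S = np.meshgrid(locs, scales)
        loglik = sum(stats.gumbel_r.logpdf(x, loc=L, scale=S) for x in record)
        post = np.exp(loglik - loglik.max())
        post /= post.sum()

        # Posterior-predictive exceedance mixes P(Q > q) over the grid.
        q = np.linspace(400, 1600, 400)
        sf = stats.gumbel_r.sf(q[:, None, None], loc=L, scale=S)
        exceed = np.tensordot(post, sf, axes=([0, 1], [1, 2]))
        print(f"100-yr flood ~ {q[np.argmin(abs(exceed - 0.01))]:.0f} m^3/s")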

  17. Building a flood climatology and rethinking flood risk at continental scales

    Science.gov (United States)

    Andreadis, Konstantinos; Schumann, Guy; Stampoulis, Dimitrios; Smith, Andrew; Neal, Jeffrey; Bates, Paul; Sampson, Christopher; Brakenridge, Robert; Kettner, Albert

    2016-04-01

    Floods are one of the costliest natural disasters, and the ability to understand their characteristics and their interactions with population, land cover, and climate changes is of paramount importance. In order to accurately reproduce flood characteristics such as water inundation and heights, both in river channels and on floodplains, hydrodynamic models are required. Most of these models operate at very high resolutions and are computationally very expensive, making their application over large areas very difficult. However, a need exists for such models to be applied at regional to global scales so that the effects of climate change with regard to flood risk can be examined. We use a modeling framework that includes the VIC hydrologic model and the LISFLOOD-FP hydrodynamic model to simulate a 40-year history of flood characteristics at the continental scale, in particular for Australia. In order to extend the simulated flood climatology to 50-100 years in a consistent manner, reanalysis datasets have to be used as meteorological forcings to the models. The objective of this study is the evaluation of multiple atmospheric reanalysis datasets (ERA, NCEP, MERRA, JRA) as inputs to the VIC/LISFLOOD-FP model. Comparisons of the simulated flood characteristics are made both with satellite observations of inundation and with a benchmark LISFLOOD-FP simulation forced by observed flows. The implications of having a climatology of flood characteristics are discussed; in particular, we found the magnitude and timing of floodplain water storage to differ significantly from streamflow in terms of their distributions. Furthermore, floodplain volume gave a much sharper discrimination between high-hazard and low-hazard periods than discharge, and using the latter can lead to significant overestimation. These results demonstrate that global streamflow statistics or precipitation should not be used to infer flood hazard and risk; instead, a flood inundation climatology is necessary.

  18. Explorers Presentation: Flooding and Coastal Communities

    OpenAIRE

    Institute, Marine

    2015-01-01

    The Explorers Flooding and Coastal Communities presentation provides an introduction to flooding. It can be used with the lesson plan on building flood defences. It covers: What is a flood? Why does it flood? Where does the water come from? The water cycle; where water is stored; examples of pluvial vs. coastal flooding; impacts of flooding; flood defences; and other influences on flooding, including human impacts, urbanisation, deforestation, and sea level rise.

  19. Can Pleasant Goat and Big Big Wolf Save China's Animation Industry?

    Institute of Scientific and Technical Information of China (English)

    Guo Liqin

    2009-01-01

    "My dreamed husband is big big wolf," claimed Miss Fang, a young lady who works in KPMG Beijing Office. This big big wolf is a lovely cartoon wolf appeared in a Pleasant Goat and Big Big Wolf produced independently by Chinese.

  20. Tested Demonstrations.

    Science.gov (United States)

    Gilbert, George L., Ed.

    1987-01-01

    Describes two demonstrations to illustrate characteristics of substances. Outlines a method to detect the changes in pH levels during the electrolysis of water. Uses water pistols, one filled with methane gas and the other filled with water, to illustrate the differences in these two substances. (TW)

  1. ICT Demonstration

    DEFF Research Database (Denmark)

    Jensen, Tine Wirenfeldt; Bay, Gina

    In this demonstration we present and discuss two interrelated on-line learning resources aimed at supporting international students at Danish universities in building study skills (the Study Metro) and avoiding plagiarism (Stopplagiarism). We emphasize the necessity of designing online learning r...

  2. Flexibility in Flood Management Design: Proactive Planning Under Climate Change Uncertainty

    Science.gov (United States)

    Smet, K.; de Neufville, R.; van der Vlist, M.

    2015-12-01

    This paper presents an innovative, value-enhancing procedure for effective planning and design of long-lived flood management infrastructure given uncertain future flooding threats due to climate change. Designing infrastructure that can be adapted over time is a method to safeguard the efficacy of current design decisions given uncertainty about the rates and future impacts of climate change. This paper explores the value of embedding "options" in a physical structure, where an option is the right, but not the obligation, to do something at a later date (e.g., over-dimensioning a floodwall foundation now facilitates a future height addition in response to observed increases in sea level; building extra pump bays in a pumping station now enables the addition of pumping capacity whenever increased precipitation warrants an expansion). The proposed procedure couples a simulation model that captures future climate-induced changes to the hydrologic operating environment of a structure with an economic model that estimates the lifetime economic performance of alternative investments. The economic model uses Real "In" Options analysis, a type of cash flow analysis that quantifies the implicit value of options and the flexibility they provide. This procedure is demonstrated using replacement planning for the multi-functional pumping station IJmuiden on the North Sea Canal in the Netherlands. Flexibility in design decisions is modelled by varying the size and the specific options included in the new structure. Results indicate that the incorporation of options within the structural design has the potential to improve its economic performance, as compared to more traditional, "build it once and build it big" designs where flexibility is not an explicit design criterion. The added value resulting from the incorporation of flexibility varies with the range of future conditions considered, as well as with the options examined. This procedure could be applied more broadly to explore
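
    A stripped-down numeric illustration of the real-options comparison: build full capacity now versus build small and keep an option to expand in year 20 if climate turns out unfavourable. All costs, probabilities, and the discount rate are invented to show the cash-flow logic only, not values from the IJmuiden study.

        # Monte Carlo expected-cost comparison of rigid vs flexible designs.
        import numpy as np

        rng = np.random.default_rng(7)
        rate = 0.03
        fast_change = rng.random(10_000) < 0.4   # scenario draw per simulation

        cost_big = 100.0                          # full capacity built today
        expand_pv = 45.0 / (1 + rate) ** 20       # expansion cost, discounted
        cost_flex = 70.0 + np.where(fast_change, expand_pv, 0.0)

        print(f"E[cost] build big now : {cost_big:6.1f}")
        print(f"E[cost] flexible design: {cost_flex.mean():6.1f}")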

  3. The Thames Gateway: planning policy and flood risk scenarios

    Science.gov (United States)

    Eldridge, Jillian; Horn, Diane

    2010-05-01

    The Thames Gateway, currently Europe's largest regeneration project, presents a valuable case study area in which to examine the interrelated issues of planning policy, flood risk, and insurance loss potential. The region is typified by significant exposure to flooding due to its location, which, as developments proceed, could result in increased areas of vulnerability, with consequent insurance losses and hotspots of risk. With 160,000 new homes planned by 2016, positive use of planning policy is fundamental to minimising potential flood risk as well as to ensuring long-term economic and social goals can be met. This project focuses on several planning scenarios within the Gateway for the areas of Barking and Medway, and uses a commercial flood model to develop the flood risk under alternative planning policy scenarios. The two areas chosen contain major regeneration and redevelopment sites located on the Thames tidal floodplain. The areas are protected by flood defences, although both are downstream of the Thames Barrier. It is expected that the defences will be maintained and upgraded over the next several years, particularly in the Medway, which is currently protected to a lower level than most other areas in the Thames Gateway. The progress of development is more advanced in Barking, with the major regeneration site, Barking Riverside, hosting 2000 new homes. The study sites have been chosen based on their location and proximity to the Thames, and allow for an analysis of planning policy and its influence in minimising risk into the future. The resulting change in flood risk due to both the planned developments and the flood defences will help in understanding change in risk over time and the intricacies expected in delivering planning policy in a multi-governed area subject to conflicting objectives. Flood risk for both sites is modelled using a commercial flood model to estimate flood risk based on several flood scenarios for both current and

  4. Asteroids Were Born Big

    CERN Document Server

    Morbidelli, Alessandro; Nesvorny, David; Levison, Harold F

    2009-01-01

    How big were the first planetesimals? We attempt to answer this question by conducting coagulation simulations in which the planetesimals grow by mutual collisions and form larger bodies and planetary embryos. The size frequency distribution (SFD) of the initial planetesimals is considered a free parameter in these simulations, and we search for the one that ends up producing objects with an SFD consistent with asteroid belt constraints. We find that, if the initial planetesimals were small (e.g. km-sized), the final SFD fails to fulfill these constraints. In particular, reproducing the bump observed at diameter D ~ 100 km in the current SFD of the asteroids requires that the minimal size of the initial planetesimals was also ~100 km. This supports the idea that planetesimals formed big, namely that the size of solids in the proto-planetary disk "jumped" from sub-meter scale to multi-kilometer scale, without passing through intermediate values. Moreover, we find evidence that the initial planetesimals ...

  5. Nursing Needs Big Data and Big Data Needs Nursing.

    Science.gov (United States)

    Brennan, Patricia Flatley; Bakken, Suzanne

    2015-09-01

    Contemporary big data initiatives in health care will benefit from greater integration with nursing science and nursing practice; in turn, nursing science and nursing practice has much to gain from the data science initiatives. Big data arises secondary to scholarly inquiry (e.g., -omics) and everyday observations like cardiac flow sensors or Twitter feeds. Data science methods that are emerging ensure that these data be leveraged to improve patient care. Big data encompasses data that exceed human comprehension, that exist at a volume unmanageable by standard computer systems, that arrive at a velocity not under the control of the investigator and possess a level of imprecision not found in traditional inquiry. Data science methods are emerging to manage and gain insights from big data. The primary methods included investigation of emerging federal big data initiatives, and exploration of exemplars from nursing informatics research to benchmark where nursing is already poised to participate in the big data revolution. We provide observations and reflections on experiences in the emerging big data initiatives. Existing approaches to large data set analysis provide a necessary but not sufficient foundation for nursing to participate in the big data revolution. Nursing's Social Policy Statement guides a principled, ethical perspective on big data and data science. There are implications for basic and advanced practice clinical nurses in practice, for the nurse scientist who collaborates with data scientists, and for the nurse data scientist. Big data and data science has the potential to provide greater richness in understanding patient phenomena and in tailoring interventional strategies that are personalized to the patient. © 2015 Sigma Theta Tau International.

  6. A method for mapping flood hazard along roads.

    Science.gov (United States)

    Kalantari, Zahra; Nickman, Alireza; Lyon, Steve W; Olofsson, Bo; Folkeson, Lennart

    2014-01-15

    A method was developed for estimating and mapping flood hazard probability along roads using road and catchment characteristics as physical catchment descriptors (PCDs). The method uses a Geographic Information System (GIS) to derive candidate PCDs and then identifies those PCDs that significantly predict road flooding using a statistical modelling approach. The method thus allows flood hazards to be estimated and also provides insights into the relative roles of landscape characteristics in determining road-related flood hazards. As a case study to demonstrate its utility, the method was applied to an area in western Sweden where severe road flooding had occurred during an intense rain event. The results suggest that for this case study area three categories of PCDs are useful for predicting critical spots prone to flooding along roads: i) topography, ii) soil type, and iii) land use. The main drivers among the PCDs considered were a topographical wetness index, road density in the catchment, soil properties in the catchment (mainly the amount of gravel substrate), and local channel slope at the site of a road-stream intersection. These can be proposed as strong indicators for predicting flood probability in ungauged river basins in this region, but some care is needed in generalising the case study results, as other potential factors are also likely to influence flood hazard probability. Overall, the proposed method represents a straightforward and consistent way to estimate flooding hazards to inform both the planning of future roadways and the maintenance of existing ones.
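
    A sketch of the statistical step under stated assumptions: logistic regression of observed road flooding on GIS-derived PCDs. The feature names follow the abstract, but the data frame is synthetic and the fitted coefficients mean nothing beyond illustrating the workflow.

        # Logistic regression of road flooding on physical catchment descriptors.
        import numpy as np
        import pandas as pd
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(42)
        n = 300
        pcds = pd.DataFrame({
            "wetness_index": rng.normal(8, 2, n),
            "road_density": rng.gamma(2, 1, n),
            "gravel_fraction": rng.random(n),
            "channel_slope": rng.gamma(1.5, 0.02, n),
        })
        # Synthetic "truth": flooding driven by wetness and road density
        logit = 0.8 * pcds.wetness_index + 0.6 * pcds.road_density - 9
        flooded = rng.random(n) < 1 / (1 + np.exp(-logit))

        model = LogisticRegression(max_iter=1000).fit(pcds, flooded)
        print(dict(zip(pcds.columns, model.coef_[0].round(2))))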

  7. Mitigating flood exposure

    Science.gov (United States)

    Shultz, James M; McLean, Andrew; Herberman Mash, Holly B; Rosen, Alexa; Kelly, Fiona; Solo-Gabriele, Helena M; Youngs Jr, Georgia A; Jensen, Jessica; Bernal, Oscar; Neria, Yuval

    2013-01-01

    Introduction. In 2011, following heavy winter snowfall, two cities bordering two rivers in North Dakota, USA, faced major flood threats. Flooding was foreseeable and predictable, although the extent of the risk was uncertain. One community, Fargo, situated in a shallow river basin, successfully mitigated and prevented flooding. For the other community, Minot, located in a deep river valley, prevention was not possible, and downtown businesses and one-quarter of the homes were inundated in the city's worst flood on record. We aimed to contrast the respective hazards, vulnerabilities, stressors, psychological risk factors, psychosocial consequences, and disaster risk reduction strategies under conditions where flood prevention was, and was not, possible. Methods. We applied the “trauma signature analysis” (TSIG) approach to compare the hazard profiles, identify salient disaster stressors, document the key components of the disaster risk reduction response, and examine indicators of community resilience. Results. Two demographically comparable communities, Fargo and Minot, faced challenging river flood threats and exhibited effective coordination across community sectors. We examined the implementation of disaster risk reduction strategies in situations where coordinated citizen action was able to prevent disaster impact (hazard avoidance) compared to the more common scenario when unpreventable disaster strikes, causing destruction, harm, and distress. Across a range of indicators, it is clear that successful mitigation diminishes both physical and psychological impact, thereby reducing the trauma signature of the event. Conclusion. In contrast to the experience of historic flooding in Minot, the city of Fargo succeeded in reducing the trauma signature by way of reducing risk through mitigation. PMID:28228985

  8. Fremtidens landbrug bliver big business

    DEFF Research Database (Denmark)

    Hansen, Henning Otte

    2016-01-01

    Agriculture's external conditions and competitive environment are changing, and this will necessitate a development towards "big business", in which farms become even larger, more industrialized, and more concentrated. Big business will become a dominant development in Danish agriculture - but not the only one...

  9. Passport to the Big Bang

    CERN Multimedia

    De Melis, Cinzia

    2013-01-01

    On 2 June 2013 CERN launched the Passport to the Big Bang, a scientific tourist trail through the Pays de Gex and the Canton of Geneva, at a large public event. Poster and programme.

  10. The Rise of Big Data in Neurorehabilitation.

    Science.gov (United States)

    Faroqi-Shah, Yasmeen

    2016-02-01

    In some fields, Big Data has been instrumental in analyzing, predicting, and influencing human behavior. However, Big Data approaches have so far been less central in speech-language pathology. This article introduces the concept of Big Data and provides examples of Big Data initiatives pertaining to adult neurorehabilitation. It also discusses the potential theoretical and clinical contributions that Big Data can make, and recognizes some impediments to building and using Big Data for scientific and clinical inquiry.

  11. Flood Risk Analysis and Flood Potential Losses Assessment

    Institute of Scientific and Technical Information of China (English)

    2003-01-01

    The heavy floods in the Taihu Basin showed an increasing trend in recent years. In this work, a typical area in the northern Taihu Basin was selected for flood risk analysis and potential flood losses assessment. Human activities have a strong impact on the study area's flood situation (as affected by the polders built, deforestation, population increase, urbanization, etc.), and have made water levels higher, flood duration shorter, and flood peaks sharper. Five years of different flood return periods [(1970), 5 (1962), 10 (1987), 20 (1954), 50 (1991)] were used to calculate the potential flood risk area and its losses. The potential flood risk map, economic losses, and flood-impacted population were also calculated. The study's main conclusions are: 1) Human activities have strongly changed the natural flood situation in the study area, increasing runoff and flooding; 2) The flood risk area is closely related with the precipitation center; 3) Polder construction has successfully protected land from flood, shortened the flood duration, and elevated water level in rivers outside the polders; 4) Economic and social development have caused flood losses to increase in recent years.

  12. Probabilistic flood extent estimates from social media flood observations

    Science.gov (United States)

    Brouwer, Tom; Eilander, Dirk; van Loenen, Arnejan; Booij, Martijn J.; Wijnberg, Kathelijne M.; Verkade, Jan S.; Wagemaker, Jurjen

    2017-05-01

    The increasing number and severity of floods, driven by phenomena such as urbanization, deforestation, subsidence and climate change, create a growing need for accurate and timely flood maps. In this paper we present and evaluate a method to create deterministic and probabilistic flood maps from Twitter messages that mention locations of flooding. A deterministic flood map created for the December 2015 flood in the city of York (UK) showed good performance (F2 = 0.69; a statistic ranging from 0 to 1, with 1 expressing a perfect fit with validation data). The probabilistic flood maps we created showed that, in the York case study, the uncertainty in flood extent was mainly induced by errors in the precise locations of flood observations as derived from Twitter data. Errors in the terrain elevation data or in the parameters of the applied algorithm contributed less to flood extent uncertainty. Although these maps tended to overestimate the actual probability of flooding, they gave a reasonable representation of flood extent uncertainty in the area. This study illustrates that inherently uncertain data from social media can be used to derive information about flooding.
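
    The core idea behind the probabilistic maps, propagating geolocation error in the observations into uncertainty in flood extent, can be sketched with a simple Monte Carlo loop. This is an illustration of the concept, not the authors' algorithm; the terrain, observations and error magnitude below are all invented:

```python
# Conceptual Monte Carlo sketch: perturb each observation's location to
# mimic geolocation error, re-estimate the water level from the DEM, and
# count how often each cell ends up flooded.
import numpy as np

rng = np.random.default_rng(42)
dem = rng.uniform(0.0, 2.0, size=(100, 100))     # stand-in terrain (m)
obs = np.array([[50, 50], [60, 40], [30, 70]])   # reported flood cells
loc_sigma = 3.0                                  # location error (cells)
n_runs = 500

flooded = np.zeros_like(dem)
for _ in range(n_runs):
    jitter = rng.normal(0.0, loc_sigma, size=obs.shape).round().astype(int)
    pts = np.clip(obs + jitter, 0, dem.shape[0] - 1)
    level = dem[pts[:, 0], pts[:, 1]].max()      # water level estimate
    flooded += dem <= level

flood_probability = flooded / n_runs             # per-cell probability map
print(f"Mean flooded fraction: {flood_probability.mean():.2f}")
```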

  13. Impacts of dyke development in flood prone areas in the Vietnamese Mekong Delta to downstream flood hazard

    Science.gov (United States)

    Khanh Triet Nguyen, Van; Dung Nguyen, Viet; Fujii, Hideto; Kummu, Matti; Merz, Bruno; Apel, Heiko

    2016-04-01

    The Vietnamese Mekong Delta (VMD) plays an important role in food security and socio-economic development of the country. Being a low-lying coastal region, the VMD is particularly susceptible to both riverine and tidal floods, which provide, on the one hand, the basis for the rich agricultural production and the livelihood of the people, but on the other hand pose a considerable hazard depending on the severity of the floods. Despite the potentially hazardous floods, the area remains active as a rice granary due to its nutrient-rich soils and sediment input, its dense network of waterways and canals, and the long-standing experience of the population living with floods. In response to both farmers' requests and governmental plans, the construction of flood protection infrastructure in the delta progressed rapidly in the last twenty years, notably in areas prone to deep flooding, i.e. the Plain of Reeds (PoR) and Long Xuyen Quadrangle (LXQ). Triple rice cropping becomes possible in farmlands enclosed by "full-dykes", i.e. dykes strong and high enough to prevent flooding of the flood plains for most of the floods. In these protected flood plains rice can be grown even during the peak flood period (September to November). However, little is known about the possible (and already alleged) negative impacts of this full flood protection measure on downstream areas. This study aims at quantifying how the flood regime in the lower part of the VMD (e.g. Can Tho, My Thuan, ...) changed during the two recent "big flood" events of 2000 and 2011 due to the construction of the full-dyke system in the upper part. First, an evaluation of 35 years of daily water level data was performed in order to detect trends at key gauging stations: Kratie (upper boundary of the Delta), Tan Chau and Chau Doc (areas with full-dyke construction), and Can Tho and My Thuan (downstream). Results from the Mann-Kendall (MK) test show a decreasing trend of the annual maximum water level at 3 stations Kratie, Tan
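
    The Mann-Kendall test used for the trend detection step is simple enough to sketch directly. A minimal implementation without tie correction, applied to a synthetic series of annual maximum water levels (not the study's data):

```python
# Minimal Mann-Kendall trend test sketch (no tie correction).
import numpy as np
from scipy.stats import norm

def mann_kendall(x):
    x = np.asarray(x, dtype=float)
    n = len(x)
    # S counts concordant minus discordant pairs across the series.
    s = sum(np.sign(x[j] - x[i])
            for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0
    p = 2 * (1 - norm.cdf(abs(z)))       # two-sided p-value
    return z, p

annual_max_level = [4.2, 4.5, 4.1, 3.9, 4.0, 3.8, 3.7, 3.9, 3.6, 3.5]
z, p = mann_kendall(annual_max_level)
print(f"Z = {z:.2f}, p = {p:.3f}  (Z < 0 suggests a decreasing trend)")
```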

  14. Kalman filter estimation model in flood forecasting

    Science.gov (United States)

    Husain, Tahir

    Elementary precipitation and runoff estimation problems associated with hydrologic data collection networks are formulated in conjunction with the Kalman Filter Estimation Model. Examples involve the estimation of runoff using data from a single precipitation station and also from a number of precipitation stations. The formulations demonstrate the role of the state-space, measurement, and estimation equations of the Kalman Filter Model in flood forecasting. To facilitate the formulation, the unit hydrograph concept and the antecedent precipitation index are adopted in the estimation model. The methodology is then applied to estimate various flood events in Carnation Creek, British Columbia.
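
    A scalar version of the predict-update cycle conveys the shape of such a formulation: the state is runoff, propagated by a simple linear rainfall-runoff relation and corrected by gauge measurements. All coefficients and data below are invented for illustration, not taken from the paper:

```python
# Minimal scalar Kalman filter sketch for runoff estimation.
import numpy as np

a, b = 0.8, 0.5          # state transition and rainfall coefficients
q_var, r_var = 0.1, 0.4  # process and measurement noise variances

precip = np.array([0.0, 5.0, 12.0, 3.0, 0.0, 0.0])   # rainfall input
gauge = np.array([1.0, 3.2, 9.5, 8.1, 5.0, 3.3])     # observed runoff

x, p = gauge[0], 1.0     # initial state estimate and its variance
estimates = []
for u, z in zip(precip, gauge):
    # Prediction: propagate the state with the rainfall-runoff model.
    x_pred = a * x + b * u
    p_pred = a * a * p + q_var
    # Update: blend the prediction with the gauge measurement.
    k = p_pred / (p_pred + r_var)        # Kalman gain
    x = x_pred + k * (z - x_pred)
    p = (1 - k) * p_pred
    estimates.append(x)
print(np.round(estimates, 2))
```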

  15. Crowdsourcing detailed flood data

    Science.gov (United States)

    Walliman, Nicholas; Ogden, Ray; Amouzad, Shahrzhad

    2015-04-01

    Over the last decade the average annual loss across the European Union due to flooding has been 4.5bn Euros, but increasingly intense rainfall, as well as population growth, urbanisation and the rising costs of asset replacements, may see this rise to 23bn Euros a year by 2050. Equally disturbing are the profound social costs to individuals, families and communities, which in addition to loss of lives include: loss of livelihoods, decreased purchasing and production power, relocation and migration, adverse psychosocial effects, and hindrance of economic growth and development. Flood prediction, management and defence strategies rely on the availability of accurate information and flood modelling. Whilst automated data gathering (by measurement and satellite) of the extent of flooding is already advanced, it is least reliable in urban and physically complex geographies where the need for precise estimation is often most acute. Crowdsourced data on actual flood events is a potentially critical component of this, allowing improved accuracy and identifying the effects of local landscape and topography, where the height of a simple kerb or a discontinuity in a boundary wall can have profound importance. Mobile app-based data acquisition using crowdsourcing in critical areas can combine camera records with GPS positional data and time, as well as descriptive data relating to the event. This will automatically produce a dataset, managed in ArcView GIS, with the potential for follow-up calls to get more information through structured scripts for each strand. Through this, local residents can provide highly detailed information that can be reflected in sophisticated flood protection models and be core to framing urban resilience strategies and optimising the effectiveness of investment. This paper will describe this pioneering approach that will develop flood event data in support of systems that will advance existing approaches such as those developed in the UK
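
    The kind of record such an app would capture can be pictured as a small data structure combining photo, GPS position, time and descriptive fields ready for loading into a GIS. The field names below are assumptions for illustration, not the project's actual schema:

```python
# Hypothetical record structure for a crowdsourced flood report.
from dataclasses import dataclass, field
from datetime import datetime
from typing import Dict, Optional

@dataclass
class FloodReport:
    latitude: float                         # from the phone's GPS
    longitude: float
    timestamp: datetime
    photo_path: str                         # camera record of the event
    water_depth_cm: Optional[float] = None  # estimated by the reporter
    description: str = ""                   # free-text descriptive data
    follow_up: Dict[str, str] = field(default_factory=dict)  # scripted Q&A

report = FloodReport(51.752, -1.258, datetime(2015, 4, 1, 14, 30),
                     "img_0042.jpg", water_depth_cm=15.0,
                     description="Kerb overtopped, water entering driveway")
```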

  16. Phantom cosmology without Big Rip singularity

    Energy Technology Data Exchange (ETDEWEB)

    Astashenok, Artyom V. [Baltic Federal University of I. Kant, Department of Theoretical Physics, 236041, 14, Nevsky st., Kaliningrad (Russian Federation); Nojiri, Shin'ichi, E-mail: nojiri@phys.nagoya-u.ac.jp [Department of Physics, Nagoya University, Nagoya 464-8602 (Japan); Kobayashi-Maskawa Institute for the Origin of Particles and the Universe, Nagoya University, Nagoya 464-8602 (Japan); Odintsov, Sergei D. [Department of Physics, Nagoya University, Nagoya 464-8602 (Japan); Institucio Catalana de Recerca i Estudis Avancats - ICREA and Institut de Ciencies de l'Espai (IEEC-CSIC), Campus UAB, Facultat de Ciencies, Torre C5-Par-2a pl, E-08193 Bellaterra (Barcelona) (Spain); Tomsk State Pedagogical University, Tomsk (Russian Federation); Yurov, Artyom V. [Baltic Federal University of I. Kant, Department of Theoretical Physics, 236041, 14, Nevsky st., Kaliningrad (Russian Federation)

    2012-03-23

    We construct phantom energy models with the equation of state parameter w less than -1, w<-1, in which no finite-time future singularity occurs. Such models can be divided into two classes: (i) energy density increases with time ('phantom energy' without a 'Big Rip' singularity) and (ii) energy density tends to a constant value with time ('cosmological constant' with asymptotically de Sitter evolution). The disintegration of bound structures is confirmed in Little Rip cosmology. Surprisingly, we find that such disintegration (using the example of the Sun-Earth system) may occur even in an asymptotically de Sitter phantom universe consistent with observational data. We also demonstrate that non-singular phantom models admit wormhole solutions as well as the possibility of a Big Trip via wormholes.
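
    For intuition, a standard "Little Rip" construction of the kind alluded to here (an illustration, not taken from the paper) keeps the effective equation-of-state parameter below -1 at all times while avoiding any finite-time singularity:

```latex
% Little Rip example: H grows without bound, but only as t -> infinity.
\[
  H(t) = H_0\, e^{\lambda t}, \qquad
  a(t) = \exp\!\left[\frac{H_0}{\lambda}\left(e^{\lambda t}-1\right)\right],
  \qquad H_0,\ \lambda > 0 ,
\]
\[
  w_{\mathrm{eff}} = -1 - \frac{2\dot H}{3H^{2}}
                   = -1 - \frac{2\lambda}{3H_0}\, e^{-\lambda t} \;<\; -1 .
\]
% The energy density rho = 3H^2/kappa^2 increases monotonically, yet H
% remains finite at every finite t, so no Big Rip singularity forms.
```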

  17. BIG DATA TECHNOLOGY ACCELERATE GENOMICS PRECISION MEDICINE

    Directory of Open Access Journals (Sweden)

    HAO LI

    2017-01-01

    In genomics life science research, the data volume of whole-genome and life science algorithms keeps growing, and is now measured in terabytes (TB), petabytes (PB) or even exabytes (EB). The key problem is how to store and analyze these data in an optimized way. This paper demonstrates how Intel Big Data Technology and Architecture help to facilitate and accelerate genomics life science research in data storage and utilization. Intel provides a high-performance GenomicsDB for variant call data query and a Lustre filesystem with Hierarchical Storage Management for genomics data storage. Based on these technologies, Intel defines a genomics knowledge share and exchange architecture, which has been deployed and validated at BGI China and Shanghai Children's Hospital with very positive feedback, and these big data technologies can be scaled to many more genomics life science partners in the world.

  18. On the Power of Randomization in Big Data Analytics

    DEFF Research Database (Denmark)

    Pham, Ninh Dang

    for big data analytics. That is, how to efficiently handle and analyze such big data in order to bridge the gap between data and information. In a wide range of application domains, data are represented as high-dimensional vectors in the Euclidean space in order to benefit from computationally advanced... techniques from numerical linear algebra. The computational efficiency and scalability of such techniques have been growing demands for not only novel platform system architectures but also efficient and effective algorithms to address the fast-paced big data needs. In the thesis we will tackle... properties will be used to solve fundamental data analysis tasks, including outlier detection, classification and similarity search. The main contribution of the PhD dissertation is the demonstration of the power of randomization in big data analytics. We illustrate a happy marriage between randomized...

  19. Channel Shallowing as Mitigation of Coastal Flooding

    Directory of Open Access Journals (Sweden)

    Philip M. Orton

    2015-07-01

    Here, we demonstrate that reductions in the depth of inlets or estuary channels can be used to reduce or prevent coastal flooding. A validated hydrodynamic model of Jamaica Bay, New York City (NYC), is used to test nature-based adaptation measures in ameliorating flooding for NYC's two largest historical coastal flood events. In addition to control runs with modern bathymetry, three altered landscape scenarios are tested: (1) increasing the area of wetlands to their 1879 footprint and bathymetry, but leaving deep shipping channels unaltered; (2) shallowing all areas deeper than 2 m in the bay to be 2 m below Mean Low Water; (3) shallowing only the narrowest part of the inlet to the bay. These three scenarios are deliberately extreme and designed to evaluate the leverage each approach exerts on water levels. They result in peak water level reductions of 0.3%, 15%, and 6.8% for Hurricane Sandy, and 2.4%, 46% and 30% for the Category-3 hurricane of 1821, respectively (bay-wide averages). These results suggest that shallowing can provide greater flood protection than wetland restoration, and it is particularly effective at reducing "fast-pulse" storm surges that rise and fall quickly over several hours, like that of the 1821 storm. Nonetheless, the goal of flood mitigation must be weighed against economic, navigation, and ecological needs, and practical concerns such as the availability of sediment.

  20. Assessing Flood Risk Using Reservoir Flood Control Rules

    Institute of Scientific and Technical Information of China (English)

    Xiang Fu; Yadong Mei; Zhihuai Xiao

    2016-01-01

    The application of conventional flood operation regulation is restricted due to the insufficient description of flood control rules for the Pubugou Reservoir in southern China. Based on the requirements of different flood control objects, this paper proposes to optimize flood control rules with a punishment mechanism by defining different parameters of the flood control rules in response to flood inflow forecasts and reservoir water level. A genetic algorithm is adopted for solving the parameter optimization problem. The failure risk and overflow volume of the downstream insufficient flood control capacity are assessed through the reservoir operation policies. The results show that an optimised regulation can provide better performance than the current flood control rules.
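
    The optimization step can be illustrated with a toy evolutionary loop (selection and mutation only) in which each individual encodes two rule parameters and fitness penalizes downstream overflow. The reservoir model and all numbers below are stand-ins for illustration, not the Pubugou system or the authors' algorithm:

```python
# Toy evolutionary optimization of two flood-control rule parameters.
import numpy as np

rng = np.random.default_rng(0)
inflow = rng.gamma(2.0, 500.0, size=48)          # synthetic flood hydrograph

def fitness(params):
    level_trigger, release_cap = params
    storage, penalty = 0.0, 0.0
    for q in inflow:
        # Rule: release hard once storage passes the trigger level.
        release = min(release_cap, q) if storage > level_trigger else 0.3 * q
        storage = max(storage + q - release, 0.0)
        penalty += max(release - 1200.0, 0.0)    # downstream capacity breach
    return -(penalty + max(storage - 50000.0, 0.0))  # higher is better

pop = rng.uniform([0, 100], [40000, 2000], size=(30, 2))
for _ in range(100):
    scores = np.array([fitness(p) for p in pop])
    parents = pop[np.argsort(scores)[-10:]]              # selection
    children = (parents[rng.integers(0, 10, 30)]
                + rng.normal(0, [500, 20], (30, 2)))     # mutation
    pop = np.clip(children, [0, 100], [40000, 2000])

best = pop[np.argmax([fitness(p) for p in pop])]
print("Best rule parameters (trigger, cap):", np.round(best, 1))
```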

  1. Target reservoirs for CO/sub 2/ miscible flooding. Task two: summary of available reservoir and geological data. Vol. II: Rocky Mountain states geological and reservoir data. Part 4: Paradox, Uinta, eastern Utah overthrust, Big Horn, Wind River, Powder River, Red Desert, and Great Divide basins; CACHE-Ismay through WERTZ-Madison fields. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Cobb, L.B.; Marlow, R.

    1981-10-01

    This report describes work performed by Gruy Federal, Inc., as the second of six tasks under contract with the US Department of Energy. The stated objective of this study is to build a solid engineering foundation to serve as the basis for field mini- and pilot tests in both high and low oil saturation carbonate reservoirs for the purpose of extending the technology base in carbon dioxide miscible flooding. The six tasks in this study are: (1) summary of available CO2 field test data; (2) summary of existing reservoir and geological data; (3) selection of target reservoirs; (4) selection of specific reservoirs for CO2 injection tests; (5) selection of specific sites for test wells in carbonate reservoirs; and (6) drilling and coring activities. The report for Task Two consists of a summary of existing reservoir and geological data on carbonate reservoirs located in west Texas, southeast New Mexico, and the Rocky Mountain states. It is contained in two volumes, each with several parts. The present volume, in four parts, is a summary of reservoir data for fields in the Rocky Mountain states. Volume One contains data for Permian basin fields in west Texas and southeast New Mexico. While a serious effort was made to obtain all publicly available data for the fields considered, sufficiently reliable data on important reservoir parameters were not always available for every field. The data in Volume II show that 143 carbonate reservoirs in the study area may be suitable for CO2 miscible flooding. Using a general estimate of enhanced oil recovery by CO2 flooding of 10% of original oil in place, some 619 million barrels of oil could be recovered by widespread application of CO2 flooding in the study area. Mississippian and Ordovician reservoirs appear to be the most promising targets for the process.

  2. Big Hero 6

    Institute of Scientific and Technical Information of China (English)

    2015-01-01

    See how Big Hero 6 turns an ordinary person into a superhero who saves the city! Hiro Hamada, 14, lives in the future city of San Fransokyo. He has a robot friend Baymax. Baymax is big and soft. His job is to nurse sick people. One day, a bad man wants to take control of San Fransokyo. Hiro hopes to save the city with Baymax. But Baymax is just a nursing robot. This is not a problem for Hiro, however. He knows a lot about robots. He makes a suit of armor for Baymax and turns him into a super robot!

  3. Avoiding a Big Catastrophe

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    Before last October, the South China tiger had almost slipped into mythical status as it had been absent for so long from the public eye. In the previous 20-plus years, these tigers could not be found in the wild in China and the number of those in captivity numbered only around 60. The species—a direct descendent of the earliest tigers thought to have originated in China 2 million years ago—is functionally extinct, according to experts. The big cat’s return to the media spotlight was completely unexpected. On October 12, 2007, a digital picture, showing a wild South China tiger

  4. Big Bounce Genesis

    CERN Document Server

    Li, Changhong; Cheung, Yeuk-Kwan E

    2014-01-01

    We report on the possibility of using dark matter mass and its interaction cross section as a smoking-gun signal of the existence of a big bounce at the early stage in the evolution of our currently observed universe. A model-independent study of dark matter production in the contraction and expansion phases of the bounce universe reveals a new venue for achieving the observed relic abundance, in which a significantly smaller amount of dark matter--compared to the standard cosmology--is produced and survives until today, diluted only by the cosmic expansion since the radiation-dominated era. Once DM mass and its interaction strength with ordinary matter are determined by experiments, this alternative route becomes a signature of the bounce universe scenario.

  5. Big Data Aesthetics

    DEFF Research Database (Denmark)

    Bjørnsten, Thomas

    2016-01-01

    This article discusses artistic practices and artifacts that are occupied with exploring data through visualization and sonification strategies as well as with translating data into materially solid formats and embodied processes. By means of these examples the overall aim of the article is to critically question how and whether such artistic practices can eventually lead to the experience and production of knowledge that could not otherwise be obtained via more traditional ways of data representation. The article thus addresses both the problems and possibilities entailed in extending the use of large data sets – or Big Data – into the sphere of art and the aesthetic. Central to the discussion is the analysis of how different structuring principles of data and the discourses that surround these principles shape our perception of data. This discussion involves considerations on various...

  6. The Last Big Bang

    Energy Technology Data Exchange (ETDEWEB)

    McGuire, Austin D. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Meade, Roger Allen [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-09-13

    As one of the very few people in the world to give the “go/no go” decision to detonate a nuclear device, Austin “Mac” McGuire holds a very special place in the history of both the Los Alamos National Laboratory and the world. As Commander of Joint Task Force Unit 8.1.1, on Christmas Island in the spring and summer of 1962, Mac directed the Los Alamos data collection efforts for twelve of the last atmospheric nuclear detonations conducted by the United States. Since data collection was at the heart of nuclear weapon testing, it fell to Mac to make the ultimate decision to detonate each test device. He calls his experience THE LAST BIG BANG, since these tests, part of Operation Dominic, were characterized by the dramatic displays of the heat, light, and sounds unique to atmospheric nuclear detonations – never, perhaps, to be witnessed again.

  7. Composite Flood Risk for New Jersey

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Composite Flood Risk layer combines flood hazard datasets from Federal Emergency Management Agency (FEMA) flood zones, NOAA's Shallow Coastal Flooding, and the...

  8. Composite Flood Risk for the Virgin Islands

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Composite Flood Risk layer combines flood hazard datasets from Federal Emergency Management Agency (FEMA) flood zones, NOAA's Shallow Coastal Flooding, and the...

  9. Flood Risk Management In Europe: European flood regulation

    NARCIS (Netherlands)

    Hegger, D.L.T.; Bakker, M.H.; Green, C.; Driessen, Peter; Delvaux, B.; Rijswick, H.F.M.W. van; Suykens, C.; Beyers, J-C.; Deketelaere, K.; Doorn-Hoekveld, W. van; Dieperink, C.

    2013-01-01

    In Europe, water management is moving from flood defense to a risk management approach, which takes both the probability and the potential consequences of flooding into account. In this report, we will look at Directives and (non-)EU- initiatives in place to deal with flood risk in Europe indirectly

  10. Improving Global Flood Forecasting using Satellite Detected Flood Extent

    NARCIS (Netherlands)

    Revilla Romero, B.

    2016-01-01

    Flooding is a natural global phenomenon but in many cases is exacerbated by human activity. Although flooding generally affects humans in a negative way, bringing death, suffering, and economic impacts, it also has potentially beneficial effects. Early flood warning and forecasting systems, as well

  12. Flood Risk Management in Hungary's Upper Tisza Basin: the Potential Use of a Flood Catastrophe Model

    Science.gov (United States)

    Linerooth-Bayer, J.; Ermoliev, Y.; Ermolieva, T.; Galambos, I.

    2001-05-01

    This paper is based on the preliminary results of an IIASA-based study of flood-risk management for the Hungarian Upper Tisza River, where recent devastating floods have been exacerbated by cyanide and heavy metal pollution episodes originating in Romania. Hungary ranks only behind countries like Bangladesh and the Netherlands with regard to the extent of its territory exposed to flood risks, yet the government does not have a clear risk-management strategy in place. In the past, the national government has taken full responsibility for flood prevention, mainly through the construction of dikes, as well as for the post-disaster compensation of losses. This policy, however, is placing an increasing strain on the national budget. Like in many other countries, Hungarians recognize that a national flood program must be developed that effectively links private and public responsibility for the losses, private insurance and loss mitigation. The development of an insurance/mitigation program, however, faces distributive-value problems (the Hungarian public is skeptical of private insurance). Moreover, if private insurance is to be a policy option, it is necessary to devise improved tools and models for estimating spatially dependent risks in cases of little historical data. This is an area in which hydrologic models can be particularly useful. In this discussion, we describe a flood catastrophe model based on Monte Carlo simulation that can be of use in analyzing policy options for reducing the losses of floods in the Upper Tisza region, as well as for improving the insurability of the losses. The policy scenarios examined in the model, which are limited by data availability, have been developed by Hungarian policy makers. While the results are modest, the study demonstrates a methodology and process that may have considerable potential for aiding Hungarian policy makers in designing a national flood program.
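
    The essence of such a Monte Carlo catastrophe model is to sample many synthetic years of peak flows, pass them through a damage relation, and read risk metrics off the resulting loss distribution. A schematic sketch with invented distributions and numbers (not the IIASA model):

```python
# Schematic Monte Carlo flood catastrophe model.
import numpy as np

rng = np.random.default_rng(1)
n_years = 100_000

# Annual peak discharge (m^3/s), heavy-tailed to allow rare large floods.
peaks = rng.lognormal(mean=6.5, sigma=0.5, size=n_years)

def loss(q, protection=900.0):
    # No loss while the dike holds; damage grows with exceedance above it.
    exceedance = np.clip(q - protection, 0.0, None)
    return 0.8 * exceedance ** 1.2

losses = loss(peaks)
print(f"Expected annual loss: {losses.mean():,.0f}")
print(f"1-in-100-year loss:   {np.quantile(losses, 0.99):,.0f}")
```

    A policy lever such as dike strengthening or an insurance deductible enters simply as a change to the loss function, after which the two loss distributions can be compared.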

  13. FEMA DFIRM Flood Hazard Areas

    Data.gov (United States)

    Minnesota Department of Natural Resources — FEMA flood hazard delineations are used by the Federal Emergency Management Agency (FEMA) to designate the Special Flood Hazard Area (SFHA) and for insurance rating...

  14. FEMA DFIRM Base Flood Elevations

    Data.gov (United States)

    Minnesota Department of Natural Resources — The Base Flood Elevation (BFE) table is required for any digital data where BFE lines will be shown on the corresponding Flood Insurance Rate Map (FIRM). Normally,...

  15. 2013 FEMA Flood Hazard Boundaries

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — The National Flood Hazard Layer (NFHL) data incorporates all Digital Flood Insurance Rate Map(DFIRM) databases published by FEMA, and any Letters Of Map Revision...

  16. FLOOD CHARACTERISTICS AND MANAGEMENT ADAPTATIONS ...

    African Journals Online (AJOL)

    Dr Osondu

    2011-10-26

    , bearing flood losses and land ... Engineering control of the major tributaries of the Imo River system is required to ... on previous knowledge of physical nature of flood ... uptake; other factors include a lack of formal titles to.

  17. 2013 FEMA Base Flood Elevation

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — The National Flood Hazard Layer (NFHL) data incorporates all Digital Flood Insurance Rate Map(DFIRM) databases published by FEMA, and any Letters Of Map Revision...

  18. Base Flood Elevation (BFE) Lines

    Data.gov (United States)

    Department of Homeland Security — The Base Flood Elevation (BFE) table is required for any digital data where BFE lines will be shown on the corresponding Flood Insurance Rate Map (FIRM). Normally if...

  19. National Flood Hazard Layer (NFHL)

    Data.gov (United States)

    Federal Emergency Management Agency, Department of Homeland Security — The National Flood Hazard Layer (NFHL) is a compilation of GIS data that comprises a nationwide digital Flood Insurance Rate Map. The GIS data and services are...

  20. FEMA Q3 Flood Data

    Data.gov (United States)

    Kansas Data Access and Support Center — The Q3 Flood Data are derived from the Flood Insurance Rate Maps (FIRMS) published by the Federal Emergency Management Agency (FEMA). The file is georeferenced to...

  1. FEMA 100 year Flood Data

    Data.gov (United States)

    California Department of Resources — The Q3 Flood Data product is a digital representation of certain features of FEMA's Flood Insurance Rate Map (FIRM) product, intended for use with desktop mapping...

  2. 2013 FEMA Flood Control Structures

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — The National Flood Hazard Layer (NFHL) data incorporates all Digital Flood Insurance Rate Map(DFIRM) databases published by FEMA, and any Letters Of Map Revision...

  3. Flood Risk and Probabilistic Benefit Assessment to Support Management of Flood-Prone Lands: Evidence From Candaba Floodplains, Philippines

    Science.gov (United States)

    Juarez, A. M.; Kibler, K. M.; Sayama, T.; Ohara, M.

    2016-12-01

    Flood management decision-making is often supported by risk assessment, which may overlook the role of coping capacity and the potential benefits derived from direct use of flood-prone land. Alternatively, risk-benefit analysis can support floodplain management to yield maximum socio-ecological benefits for the minimum flood risk. We evaluate flood risk-probabilistic benefit tradeoffs of livelihood practices compatible with direct human use of flood-prone land (agriculture/wild fisheries) and nature conservation (wild fisheries only) in Candaba, Philippines. Located north-west of Metro Manila, the Candaba area is a multi-functional landscape that provides a temporally-variable mix of possible land uses, benefits and ecosystem services of local and regional value. To characterize inundation from 1.3- to 100-year recurrence intervals we couple frequency analysis with rainfall-runoff-inundation modelling and remotely-sensed data. By combining simulated probabilistic floods with both damage and benefit functions (e.g. fish capture and rice yield with flood intensity) we estimate potential damages and benefits over varying probabilistic flood hazards. We find that although direct human uses of flood-prone land are associated with damages, for all the investigated magnitudes of flood events with different frequencies, the probabilistic benefits ($91 million) exceed the risks ($33 million) by a large margin. Even considering risk, probabilistic livelihood benefits of direct human uses far exceed the benefits provided by scenarios that exclude direct "risky" human uses (difference of $85 million). In addition, we find that individual coping strategies, such as adapting crop planting periods to the flood pulse or fishing rather than cultivating rice in the wet season, minimize flood losses ($6 million) while allowing for valuable livelihood benefits ($125 million) in flood-prone land. Analysis of societal benefits and local capacities to cope with regular floods demonstrate the

  4. Flood characteristics of the Haor area in Bangladesh

    Science.gov (United States)

    Suman, Asadusjjaman; Bhattacharya, Biswa

    2013-04-01

    In recent years the world has experienced deaths, large-scale displacement of people, billions of Euros of economic damage, mental stress and ecosystem impacts due to flooding. Global changes (climate change, population and economic growth, and urbanisation) are exacerbating the severity of flooding. The 2010 floods in Pakistan and the 2011 floods in Australia and Thailand demonstrate the need for concerted action in the face of global societal and environmental changes to strengthen resilience against flooding. Bangladesh is a country which frequently suffers from flooding. The current research is conducted in the framework of a project which focuses on the flooding issues in the Haor region in the north-east of Bangladesh. A haor is a saucer-shaped depression, which is used during the dry period (December to mid-May) for agriculture and as a fishery during the wet period (June-November), and thereby presents a very interesting socio-economic perspective on flood risk management. Pre-monsoon flooding up to mid-May causes agricultural loss and much distress, whereas monsoon flooding brings benefits. The area borders India, thereby presenting trans-boundary issues as well, and is fed by some flashy Indian catchments. The area is drained mainly through the Surma-Kushiyara river system. The terrain is generally flat and the flashy characteristics die out within a short distance from the border. Limited studies of the region, particularly with the help of numerical models, have been carried out in the past. Therefore, an objective of the current research was to set up numerical models capable of reasonably emulating the physical system. Such models could, for example, associate different gauges with the spatio-temporal variation of hydrodynamic variables and help in carrying out a systemic study of the impact of climate changes. A 1D2D model, with a one-dimensional model for the rivers (based on the MIKE 11 modelling tool from the Danish Hydraulic Institute) and a two

  5. GASIS demonstration

    Energy Technology Data Exchange (ETDEWEB)

    Vidas, E.H. [Energy and Environmental Analysis, Inc., Arlington, VA (United States)

    1995-04-01

    A prototype of the GASIS database and retrieval software has been developed and is the subject of this poster session and computer demonstration. The prototype consists of test or preliminary versions of the GASIS Reservoir Data System and Source Directory datasets and the software for query and retrieval. The prototype reservoir database covers the Rocky Mountain region and contains the full GASIS data matrix (all GASIS data elements) that will eventually be included on the CD-ROM. It is populated for development purposes primarily by the information included in the Rocky Mountain Gas Atlas. The software has been developed specifically for GASIS using FoxPro for Windows. The application is an executable file that does not require FoxPro to run. The reservoir database software includes query and retrieval, screen display, report generation, and data export functions. Basic queries by state, basin, or field name will be assisted by scrolling selection lists. A detailed query screen will allow record selection on the basis of any data field, such as depth, cumulative production, or geological age. Logical operators can be applied to any numeric data element or combination of elements. Screen display includes a "browse" display with one record per row and a detailed single-record display. Datasets can be exported in standard formats for manipulation with other software packages. The Source Directory software will allow record retrieval by database type or subject area.

  6. Multivariate pluvial flood damage models

    Energy Technology Data Exchange (ETDEWEB)

    Van Ootegem, Luc [HIVA — University of Louvain (Belgium); SHERPPA — Ghent University (Belgium); Verhofstadt, Elsy [SHERPPA — Ghent University (Belgium); Van Herck, Kristine; Creten, Tom [HIVA — University of Louvain (Belgium)

    2015-09-15

    Depth–damage functions, relating the monetary flood damage to the depth of the inundation, are commonly used in the case of fluvial floods (floods caused by a river overflowing). We construct four multivariate damage models for pluvial floods (caused by extreme rainfall) by differentiating on the one hand between ground floor floods and basement floods and on the other hand between damage to residential buildings and damage to housing contents. We not only take into account the effect of flood depth on damage, but also incorporate the effects of non-hazard indicators (building characteristics, behavioural indicators and socio-economic variables). By using a Tobit estimation technique on identified victims of pluvial floods in Flanders (Belgium), we take into account the effect of cases of reported zero damage. Our results show that the flood depth is an important predictor of damage, but with a diverging impact between ground floor floods and basement floods. Non-hazard indicators are also important. For example, being aware of the risk just before the water enters the building reduces content damage considerably, underlining the importance of warning systems and policy in this case of pluvial floods. - Highlights: • Prediction of damage of pluvial floods using also non-hazard information • We include ‘no damage cases’ using a Tobit model. • The effect of flood depth is stronger for ground floor than for basement floods. • Non-hazard indicators are especially important for content damage. • Potential gain of policies that increase awareness of flood risks.
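
    The Tobit specification handles the censoring at zero damage explicitly: uncensored observations enter the likelihood through the normal density, censored ones through the normal CDF. A minimal sketch with synthetic data (variable names illustrative, not the study's):

```python
# Minimal Tobit (censored regression) sketch for flood damage.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(3)
n = 500
depth = rng.exponential(0.4, n)            # inundation depth (m)
aware = rng.integers(0, 2, n)              # warned before water entered
latent = -1.0 + 4.0 * depth - 1.5 * aware + rng.normal(0, 1.0, n)
damage = np.maximum(latent, 0.0)           # observed damage, censored at 0

X = np.column_stack([np.ones(n), depth, aware])

def neg_loglik(theta):
    beta, log_sigma = theta[:-1], theta[-1]
    sigma = np.exp(log_sigma)
    mu = X @ beta
    pos = damage > 0
    ll_pos = norm.logpdf(damage[pos], mu[pos], sigma)   # uncensored part
    ll_zero = norm.logcdf(-mu[~pos] / sigma)            # censored at zero
    return -(ll_pos.sum() + ll_zero.sum())

res = minimize(neg_loglik, x0=np.zeros(4), method="BFGS")
print("Estimates (const, depth, aware, log sigma):", np.round(res.x, 2))
```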

  7. Big Bang of Massenergy and Negative Big Bang of Spacetime

    Science.gov (United States)

    Cao, Dayong

    2017-01-01

    There is a balance between the Big Bang of Massenergy and a Negative Big Bang of Spacetime in the universe. Some scientists have also considered that there is an anti-Big Bang which could produce antimatter. The paper supposes there is a structural balance between the Einstein field equation and a negative Einstein field equation, a balance between massenergy structure and spacetime structure, a balance between the energy of the nucleus of stellar matter and the dark energy of the nucleus of dark matter-dark energy, and a balance between the particle and the wave - a balance system between massenergy (particle) and spacetime (wave). This should explain the problems of the Big Bang. http://meetings.aps.org/Meeting/APR16/Session/M13.8

  8. Hydroclimatological Aspects of the Extreme 2011 Assiniboine River Basin Flood

    Science.gov (United States)

    Brimelow, J.; Szeto, K.; Bonsal, B. R.; Hanesiak, J.; Kochtubajda, B.; Stewart, R. E.

    2014-12-01

    In the spring and early summer of 2011, the Assiniboine River Basin in Canada experienced an extreme flood that was unprecedented in terms of duration and volume of water. The flood had significant socioeconomic impacts and caused over one billion dollars in damage. Contrary to what one might expect for such an extreme flood, individual precipitation events before and during the 2011 flood were not extreme; instead, it was the cumulative impact and timing of precipitation events going back to the summer of 2010 that played a key role in the 2011 flood. The summer and fall of 2010 were exceptionally wet, resulting in soil moisture levels being much above normal at the time of freeze up. This was followed by above-average precipitation during the winter of 2010-2011, and record-breaking basin-averaged snow-water equivalent values in March and April 2011. Abnormally cold temperatures in March delayed the spring melt by about two weeks, with the result that the above-average seasonal melt freshet occurred close to the onset of abnormally heavy rains in May and June. The large-scale atmospheric flow during May and June 2011 favoured increased cyclone activity over the central and northern U.S., which produced an anomalously large number of heavy rainfall events over the basin. All of these factors combined to generate extreme surface runoff and flooding. We used JRA-55 reanalysis data to quantify the relative importance of snowmelt, soil moisture and spring precipitation in contributing to the unprecedented flood and to demonstrate how the 2011 flood was unique compared to previous floods in the basin. Data and research from this study can be used to validate and improve flood forecasting techniques over this important basin; our findings also raise important questions regarding the impact of climate change on basins that experience pluvial and nival flooding.

  9. Big Data: present and future

    OpenAIRE

    Mircea Raducu TRIFU; Mihaela Laura IVAN

    2014-01-01

    The paper explains the importance of the Big Data concept, a concept that even now, after years of development, is for most companies just a cool keyword. The paper also describes the level of actual big data development and the things it can do, as well as the things that can be done in the near future. The paper focuses on explaining to non-technical and non-database-related technical specialists what big data basically is, presents the three most important V's, as well as the new ...

  10. Big Data Mining: Tools & Algorithms

    Directory of Open Access Journals (Sweden)

    Adeel Shiraz Hashmi

    2016-03-01

    We are now in the Big Data era, and there is a growing demand for tools which can process and analyze it. Big data analytics deals with extracting valuable information from complex data which can't be handled by traditional data mining tools. This paper surveys the available tools which can handle large volumes of data as well as evolving data streams. The data mining tools and algorithms which can handle big data have also been summarized, and one of the tools has been used for mining large datasets using distributed algorithms.

  11. Big Data: present and future

    Directory of Open Access Journals (Sweden)

    Mircea Raducu TRIFU

    2014-05-01

    The paper explains the importance of the Big Data concept, a concept that even now, after years of development, is for most companies just a cool keyword. The paper also describes the level of actual big data development and the things it can do, as well as the things that can be done in the near future. The paper focuses on explaining to non-technical and non-database-related technical specialists what big data basically is, presents the three most important V's, as well as the new ones, the most important solutions used by companies like Google or Amazon, and some interesting perceptions based on this subject.

  12. The challenges of big data

    Science.gov (United States)

    2016-01-01

    The largely untapped potential of big data analytics is a feeding frenzy that has been fueled by the production of many next-generation-sequencing-based data sets that are seeking to answer long-held questions about the biology of human diseases. Although these approaches are likely to be a powerful means of revealing new biological insights, there are a number of substantial challenges that currently hamper efforts to harness the power of big data. This Editorial outlines several such challenges as a means of illustrating that the path to big data revelations is paved with perils that the scientific community must overcome to pursue this important quest. PMID:27147249

  13. The challenges of big data.

    Science.gov (United States)

    Mardis, Elaine R

    2016-05-01

    The largely untapped potential of big data analytics is a feeding frenzy that has been fueled by the production of many next-generation-sequencing-based data sets that are seeking to answer long-held questions about the biology of human diseases. Although these approaches are likely to be a powerful means of revealing new biological insights, there are a number of substantial challenges that currently hamper efforts to harness the power of big data. This Editorial outlines several such challenges as a means of illustrating that the path to big data revelations is paved with perils that the scientific community must overcome to pursue this important quest.

  14. Optimal strategies for flood prevention

    NARCIS (Netherlands)

    Eijgenraam, Carel; Brekelmans, Ruud; den Hertog, Dick; Roos, C.

    2016-01-01

    Flood prevention policy is of major importance to the Netherlands since a large part of the country is below sea level and high water levels in rivers may also cause floods. In this paper we propose a dike height optimization model to determine economically efficient flood protection standards. We i
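
    The economic logic of dike height optimization can be sketched as minimizing construction cost plus discounted expected flood damage, where raising the dike lowers the exceedance probability. A stylized example with invented parameters, not the authors' model:

```python
# Stylized dike-height optimization: build cost vs. discounted flood risk.
import numpy as np
from scipy.optimize import minimize_scalar

p0, alpha = 0.01, 1.2        # current exceedance prob.; decay per metre
damage, rate = 20e9, 0.03    # loss if flooded (EUR); discount rate
build_cost = 50e6            # cost per metre of heightening (EUR)

def total_cost(h):
    annual_risk = p0 * np.exp(-alpha * h) * damage
    return build_cost * h + annual_risk / rate  # risk as a perpetuity

res = minimize_scalar(total_cost, bounds=(0.0, 10.0), method="bounded")
print(f"Economically efficient heightening: {res.x:.2f} m")
```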

  15. Floods in a changing climate

    Science.gov (United States)

    Theresa K. Andersen; Marshall J. Shepherd

    2013-01-01

    Atmospheric warming and associated hydrological changes have implications for regional flood intensity and frequency. Climate models and hydrological models have the ability to integrate various contributing factors and assess potential changes to hydrology at global to local scales through the century. This survey of floods in a changing climate reviews flood...

  16. The use of Natural Flood Management to mitigate local flooding in the rural landscape

    Science.gov (United States)

    Wilkinson, Mark; Quinn, Paul; Ghimire, Sohan; Nicholson, Alex; Addy, Steve

    2014-05-01

    The past decade has seen increases in the occurrence of flood events across Europe, putting a growing number of settlements of varying sizes at risk. The issue of flooding in smaller villages is usually not well publicised. In these small communities, the costs of constructing and maintaining traditional flood defences often outweigh the potential benefits, which has led to a growing quest for more cost-effective and sustainable approaches. Here we aim to provide such an approach which, alongside flood risk reduction, also has the multipurpose benefits of sediment control, water quality amelioration, and habitat creation. Natural flood management (NFM) aims to reduce flooding by working with natural features and characteristics to slow down or temporarily store flood waters. NFM measures include dynamic water storage ponds and wetlands, interception bunds, channel restoration and in-stream wood placement, and increasing soil infiltration through soil management and tree planting. Based on integrated monitoring and modelling studies, we demonstrate the potential to manage runoff locally using NFM in rural systems by effectively managing flow pathways (hill slopes and small channels) and by exploiting floodplains and buffer strips. Case studies from across the UK show that temporary storage ponds (ranging from 300 to 3,000 m3) and other NFM measures can reduce peak flows in small catchments (5 to 10 km2) by up to 15 to 30 percent. In addition, increasing the overall effective storage capacity through a network of NFM measures was found to be most effective for the total reduction of local flood peaks. Hydraulic modelling has shown that the positioning of such features within the catchment, and how they are connected to the main channel, may also affect their effectiveness. Field evidence has shown that these ponds can collect significant accumulations of fine sediment during flood events. On the other hand, measures such as wetlands could also play an important role during low flow

  17. Big Data is invading big places as CERN

    CERN Document Server

    CERN. Geneva

    2017-01-01

    Big Data technologies are becoming more popular with the constant growth of data generation in different fields such as social networks, the internet of things and laboratories like CERN. How is CERN making use of such technologies? How is machine learning applied at CERN with Big Data technologies? How much data do we move and how is it analyzed? All these questions will be answered during the talk.

  18. Rethinking the relationship between flood risk perception and flood management.

    Science.gov (United States)

    Birkholz, S; Muro, M; Jeffrey, P; Smith, H M

    2014-04-15

    Although flood risk perceptions and their concomitant motivations for behaviour have long been recognised as significant features of community resilience in the face of flooding events, there has, for some time now, been a poorly appreciated fissure in the accompanying literature. Specifically, rationalist and constructivist paradigms in the broader domain of risk perception provide different (though not always conflicting) contexts for interpreting evidence and developing theory. This contribution reviews the major constructs that have been applied to understanding flood risk perceptions and contextualises these within broader conceptual developments around risk perception theory and contemporary thinking around flood risk management. We argue that there is a need to re-examine and re-invigorate flood risk perception research, in a manner that is comprehensively underpinned by more constructivist thinking around flood risk management as well as by developments in broader risk perception research. We draw attention to an historical over-emphasis on the cognitive perceptions of those at risk to the detriment of a richer understanding of a wider range of flood risk perceptions such as those of policy-makers or of tax-payers who live outside flood affected areas as well as the linkages between these perspectives and protective measures such as state-supported flood insurance schemes. Conclusions challenge existing understandings of the relationship between risk perception and flood management, particularly where the latter relates to communication strategies and the extent to which those at risk from flooding feel responsible for taking protective actions.

  19. Flooding on Elbe River

    Science.gov (United States)

    2002-01-01

    Heavy rains in Central Europe over the past few weeks have led to some of the worst flooding the region has witnessed in more than a century. The floods have killed more than 100 people in Germany, Russia, Austria, Hungary, and the Czech Republic and have led to as much as $20 billion in damage. This false-color image of the Elbe River and its tributaries was taken on August 20, 2002, by the Moderate Resolution Imaging Spectroradiometer (MODIS), flying aboard NASA's Terra satellite. The floodwaters that inundated Dresden, Germany, earlier this week have moved north. As can be seen, the river resembles a fairly large lake in the center of the image just south of the town of Wittenberg. Flooding was also bad further downriver in the towns of Magdeburg and Hitzacker. Roughly 20,000 people were evacuated from their homes in northern Germany. Fifty thousand troops, border police, and technical assistance workers were called in to combat the floods along with 100,000 volunteers. The floodwaters are not expected to badly affect Hamburg, which sits at the mouth of the river on the North Sea. Credit: Image courtesy Jacques Descloitres, MODIS Land Rapid Response Team at NASA GSFC

  20. On Flood Alert

    Institute of Scientific and Technical Information of China (English)

    LI LI

    2010-01-01

    A series of heavy storms since early May led to severe flooding and landslides in south and southwest China, causing heavy casualties and economic losses. Severe convective weather such as downpours, gusts, hail and thunderstorms attacked these areas for over a week from May 5.

  1. Fast Flooding over Manhattan

    CERN Document Server

    Clementi, Andrea; Silvestri, Riccardo

    2010-01-01

    We consider a Mobile Ad-hoc NETwork (MANET) formed by n agents that move at speed V according to the Manhattan Random-Way Point model over a square region of side length L. The resulting stationary (agent) spatial probability distribution is far from uniform: the average density over the "central zone" is asymptotically higher than that over the "suburb". Agents exchange data iff they are within distance R of each other. We study the flooding time of this MANET: the number of time steps required to broadcast a message from one source agent to all agents of the network in the stationary phase. We prove the first asymptotic upper bound on the flooding time. This bound holds with high probability, it is a decreasing function of R and V, and it is tight for a wide and relevant range of the network parameters (i.e. L, R and V). A consequence of our result is that flooding over the sparse and highly-disconnected suburb can be as fast as flooding over the dense and connected central zone. Rather surprisin...

  2. BIG SKY CARBON SEQUESTRATION PARTNERSHIP

    Energy Technology Data Exchange (ETDEWEB)

    Susan M. Capalbo

    2005-01-31

    The Big Sky Carbon Sequestration Partnership, led by Montana State University, is comprised of research institutions, public entities and private sector organizations, and the Confederated Salish and Kootenai Tribes and the Nez Perce Tribe. Efforts under this Partnership in Phase I fall into four areas: evaluation of sources and carbon sequestration sinks that will be used to determine the location of pilot demonstrations in Phase II; development of a GIS-based reporting framework that links with national networks; designing an integrated suite of monitoring, measuring, and verification technologies and assessment frameworks; and initiating a comprehensive education and outreach program. The groundwork is in place to provide an assessment of storage capabilities for CO2 utilizing the resources found in the Partnership region (both geological and terrestrial sinks) that would complement the ongoing DOE research. Efforts are underway to showcase the architecture of the GIS framework and initial results for sources and sinks. The region has a diverse array of geological formations that could provide storage options for carbon in one or more of its three states. Likewise, initial estimates of terrestrial sinks indicate a vast potential for increasing and maintaining soil C on forested, agricultural, and reclaimed lands. Both options include the potential for offsetting economic benefits to industry and society. Steps have been taken to assure that the GIS-based framework is consistent among types of sinks within the Big Sky Partnership area and with the efforts of other western DOE partnerships. The Partnership recognizes the critical importance of measurement, monitoring, and verification technologies to support not only carbon trading but all policies and programs that DOE and other agencies may want to pursue in support of GHG mitigation. The efforts in developing and implementing MMV technologies for geological sequestration reflect this concern. Research is

  3. Imprecise probabilistic estimation of design floods with epistemic uncertainties

    Science.gov (United States)

    Qi, Wei; Zhang, Chi; Fu, Guangtao; Zhou, Huicheng

    2016-06-01

    An imprecise probabilistic framework for design flood estimation is proposed on the basis of the Dempster-Shafer theory to handle different epistemic uncertainties from data, probability distribution functions, and probability distribution parameters. These uncertainties are incorporated in cost-benefit analysis to generate the lower and upper bounds of the total cost for flood control, thus presenting improved information for decision making on design floods. Within the total cost bounds, a new robustness criterion is proposed to select a design flood that can tolerate higher levels of uncertainty. A variance decomposition approach is used to quantify individual and interactive impacts of the uncertainty sources on total cost. Results from three case studies, with 127, 104, and 54 year flood data sets, respectively, show that the imprecise probabilistic approach effectively combines aleatory and epistemic uncertainties from the various sources and provides upper and lower bounds of the total cost. Between the total cost and the robustness of design floods, a clear trade-off which is beyond the information that can be provided by the conventional minimum cost criterion is identified. The interactions among data, distributions, and parameters have a much higher contribution than parameters to the estimate of the total cost. It is found that the contributions of the various uncertainty sources and their interactions vary with different flood magnitude, but remain roughly the same with different return periods. This study demonstrates that the proposed methodology can effectively incorporate epistemic uncertainties in cost-benefit analysis of design floods.
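
    The flavour of the lower and upper total-cost bounds can be conveyed by evaluating the same cost-benefit calculation under several plausible distribution choices and reporting the envelope. This mimics the spirit, though not the Dempster-Shafer machinery, of the approach; all numbers below are synthetic:

```python
# Envelope of total cost across candidate flood-frequency distributions.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
peaks = stats.gumbel_r.rvs(loc=1000, scale=300, size=104, random_state=rng)

def total_cost(dist, params, design_q):
    exceed_p = dist.sf(design_q, *params)   # annual exceedance probability
    build = 2.0e6 + 1500.0 * design_q       # construction cost (toy)
    damage = exceed_p * 5.0e8               # expected annual damage (toy)
    return build + damage / 0.04            # damage discounted as perpetuity

design_q = 2500.0
candidates = [stats.gumbel_r, stats.genextreme, stats.lognorm]
costs = [total_cost(d, d.fit(peaks), design_q) for d in candidates]
print(f"Total cost bounds: [{min(costs):.3e}, {max(costs):.3e}]")
```

    Repeating this over a range of design floods, and picking the design whose upper bound is acceptable, corresponds loosely to the robustness criterion described above.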

  4. Land Use Scenario Modeling for Flood Risk Mitigation

    Directory of Open Access Journals (Sweden)

    José I. Barredo

    2010-05-01

    It is generally accepted that flood risk has been increasing in Europe in the last decades. Accordingly, it becomes a priority to better understand its drivers and mechanisms. Flood risk is evaluated on the basis of three factors: hazard, exposure and vulnerability. If one of these factors increases, then so does the risk. Land use change models used for ex-ante assessment of spatial trends provide planners with powerful tools for territorial decision making. However, until recently this type of model has been largely neglected in strategic planning for flood risk mitigation. Thus, ex-ante assessment of flood risk is an innovative application of land use change models. The aim of this paper is to propose a flood risk mitigation approach using exposure scenarios. The methodology is applied in the Pordenone province in northern Italy. In the past 50 years Pordenone has suffered several heavy floods, the disastrous consequences of which demonstrated the vulnerability of the area. Results of this study confirm that the main driving force of increased flood risk is found in new urban developments in flood-prone areas.

  5. Intelligent Real-Time Reservoir Operation for Flood Control

    Science.gov (United States)

    Chang, L.; Hsu, H.

    2008-12-01

    Real-time flood control of a multi-purpose reservoir should both decrease the flood peak stage downstream and store floodwaters for future use during typhoon seasons. It is a continuous, real-time decision-making process based on relevant operating rules, policy, and water law, as well as on immediate rainfall and hydrological information; the experience of senior operators, however, is difficult to pass on. The main purpose of this study is to establish an automatic reservoir flood control model that achieves these goals of reservoir operation during flood periods. We propose an intelligent reservoir operating methodology for real-time flood control. First, a genetic algorithm is used to search for optimal solutions, which can be regarded as extracting the knowledge of reservoir operation strategies. Then, the adaptive network-based fuzzy inference system (ANFIS), which uses a hybrid learning procedure for extracting knowledge in the form of fuzzy if-then rules, is used to learn the input-output patterns and to estimate the optimal flood operation. The Shihmen reservoir in northern Taiwan was used as a case study, and 26 typhoon events were investigated with the proposed method. The results demonstrate that the proposed control model performs much better than the original reservoir operator in the 26 flood events and effectively decreases the peak flood stage downstream while storing floodwaters for future use.
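
    The first stage of such a scheme can be illustrated with a toy optimization. The sketch below runs a small genetic algorithm over a release schedule for a single assumed typhoon hydrograph; the inflow series, storage limits, and penalty weights are hypothetical, and the ANFIS rule-learning stage is not shown.

```python
# A toy genetic algorithm for a release schedule, in the spirit of the paper's
# first stage; the inflow hydrograph, storage limits, and penalty weights are
# all hypothetical, and the ANFIS rule-learning stage is not shown.
import numpy as np

rng = np.random.default_rng(0)
inflow = np.array([100, 300, 800, 1200, 900, 400, 200.0])  # typhoon inflow (m^3/s)
S0, S_MAX = 2000.0, 3000.0  # initial and maximum storage, in flow-volume units

def fitness(releases):
    storage, peak, penalty = S0, 0.0, 0.0
    for q_in, q_out in zip(inflow, releases):
        storage += q_in - q_out
        peak = max(peak, q_out)
        if storage < 0 or storage > S_MAX:  # infeasible storage trajectory
            penalty += 1e6
    # minimise the downstream peak; reward water stored at the end of the event
    return -(peak + penalty) + 0.1 * storage

pop = rng.uniform(0, 1200, size=(50, inflow.size))
for _ in range(200):
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(scores)][-25:]            # keep the best half
    mutants = parents[rng.integers(0, 25, size=25)] + rng.normal(0, 30, (25, inflow.size))
    pop = np.vstack([parents, np.clip(mutants, 0, 1500)])

best = max(pop, key=fitness)
print("optimised releases (m^3/s):", np.round(best))
```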

  6. Modeling and Analysis in Marine Big Data: Advances and Challenges

    Directory of Open Access Journals (Sweden)

    Dongmei Huang

    2015-01-01

    Full Text Available It is widely recognized that big data have gathered tremendous attention from academic research institutes, governments, and enterprises in all aspects of information sciences. With the growing diversity of marine data acquisition techniques, marine data have grown exponentially in the last decade, forming marine big data. As an innovation, marine big data is a double-edged sword. On the one hand, there is much potential and highly useful value hidden in the huge volume of marine data, which is widely used in marine-related fields, such as tsunami and red-tide warning, prevention, and forecasting, disaster inversion, and visualization modeling after disasters. There is no doubt that future competition in marine science and technology will converge on marine data exploration. On the other hand, marine big data also brings about many new challenges in data management, such as the difficulties in data capture, storage, analysis, and application, as well as data quality control and data security. To highlight theoretical methodologies and practical applications of marine big data, this paper gives a broad view of marine big data and its management, surveys key methods and models, introduces an engineering instance that demonstrates the management architecture, and discusses the existing challenges.

  7. Flood model for Brazil

    Science.gov (United States)

    Palán, Ladislav; Punčochář, Petr

    2017-04-01

    Viewed from a worldwide perspective, flooding has caused over 460,000 fatalities and serious material damage in the last 50 years. The combined economic loss from the ten costliest flood events of the same period exceeds 300 bn USD in present value. Locally, in Brazil, flood is the most damaging natural peril, with an alarming increase in event frequency: 5 of the 10 biggest flood losses ever recorded have occurred after 2009. The amount of economic and insured loss caused by the various flood types was the key driver of the local probabilistic flood model development. Considering the area of Brazil (the fifth largest country in the world) and the scattered distribution of insured exposure, the domain covered by the model was limited to the entire state of São Paulo and 53 additional regions. The model quantifies losses on approx. 90% of the exposure (for regular property lines) of key insurers. Based on detailed exposure analysis, Impact Forecasting has developed this tool using long-term local hydrological data series (Agencia Nacional de Aguas) from riverine gauge stations and a digital elevation model (Instituto Brasileiro de Geografia e Estatística). To provide the most accurate representation of local hydrological behaviour needed for the probabilistic simulation, the hydrological data processing focused on frequency analysis of seasonal peak flows, done by fitting an appropriate extreme-value statistical distribution, and on generation of a stochastic event set of synthetically derived flood events that respect the realistic spatial and frequency patterns visible in the entire period of hydrological observation. Data were tested for homogeneity, consistency, and the occurrence of any significant breakpoint in the time series, so that either the entire observation record or only subparts of it were used for further analysis. The realistic spatial patterns of stochastic events are reproduced through the innovative use of d-vine copula
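
    The frequency-analysis step can be sketched compactly. The example below fits a GEV distribution to a hypothetical record of seasonal peak flows with SciPy and draws a synthetic event set; the real model's d-vine copula machinery for spatial dependence is well beyond this toy.

```python
# A minimal sketch, assuming a hypothetical gauge record: fit a GEV
# distribution to seasonal peak flows and draw a stochastic event set.
# The model's d-vine copula treatment of spatial dependence is not shown.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
seasonal_peaks = rng.gumbel(loc=800.0, scale=250.0, size=60)  # synthetic peaks (m^3/s)

shape, loc, scale = stats.genextreme.fit(seasonal_peaks)
print(f"GEV fit: shape={shape:.3f}, loc={loc:.1f}, scale={scale:.1f}")

# 10,000-year stochastic event set: one simulated seasonal peak per year
event_set = stats.genextreme.rvs(shape, loc, scale, size=10_000, random_state=rng)

# compare fitted return levels with empirical quantiles of the event set
for T in (10, 100, 500):
    fitted = stats.genextreme.ppf(1 - 1 / T, shape, loc, scale)
    empirical = np.quantile(event_set, 1 - 1 / T)
    print(f"{T:>4}-yr return level: fitted {fitted:7.1f}, simulated {empirical:7.1f}")
```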

  8. Big Data and Perioperative Nursing.

    Science.gov (United States)

    Westra, Bonnie L; Peterson, Jessica J

    2016-10-01

    Big data are large volumes of digital data that can be collected from disparate sources and are challenging to analyze. These data are often described with the five "Vs": volume, velocity, variety, veracity, and value. Perioperative nurses contribute to big data through documentation in the electronic health record during routine surgical care, and these data have implications for clinical decision making, administrative decisions, quality improvement, and big data science. This article explores methods to improve the quality of perioperative nursing data and provides examples of how these data can be combined with broader nursing data for quality improvement. We also discuss a national action plan for nursing knowledge and big data science and how perioperative nurses can engage in collaborative actions to transform health care. Standardized perioperative nursing data has the potential to affect care far beyond the original patient. Copyright © 2016 AORN, Inc. Published by Elsevier Inc. All rights reserved.

  9. Big Lake Dam Inspection Report

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — This report summarizes an inspection of the Big Lake Dam that was done in September of 1983. The inspection did not reveal any conditions that constitute and...

  10. The BigBOSS Experiment

    Energy Technology Data Exchange (ETDEWEB)

    Schlegel, D.; Abdalla, F.; Abraham, T.; Ahn, C.; Allende Prieto, C.; Annis, J.; Aubourg, E.; Azzaro, M.; Bailey, S.; Baltay, C.; Baugh, C.; /APC, Paris /Brookhaven /IRFU, Saclay /Marseille, CPPM /Marseille, CPT /Durham U. / /IEU, Seoul /Fermilab /IAA, Granada /IAC, La Laguna

    2011-01-01

    BigBOSS will obtain observational constraints that will bear on three of the four 'science frontier' questions identified by the Astro2010 Cosmology and Fundamental Physics Panel of the Decadal Survey: Why is the universe accelerating? What is dark matter? And what are the properties of neutrinos? Indeed, the BigBOSS project was recommended for substantial immediate R and D support in the PASAG report. The second highest ground-based priority from the Astro2010 Decadal Survey was the creation of a funding line within the NSF to support a 'Mid-Scale Innovations' program, and it used BigBOSS as a 'compelling' example for support. This choice was the result of the Decadal Survey's Program Prioritization panels reviewing 29 mid-scale projects and recommending BigBOSS 'very highly'.

  11. Big, Fat World of Lipids

    Science.gov (United States)

    ... Science Home Page The Big, Fat World of Lipids By Emily Carlson Posted August 9, 2012 Cholesterol ... ways to diagnose and treat lipid-related conditions. Lipid Encyclopedia Just as genomics and proteomics spurred advances ...

  12. EHR Big Data Deep Phenotyping

    National Research Council Canada - National Science Library

    L. J. Frey; L. Lenert; G. Lopez-Campos; S. M. Meystre; G. K. Savova; K. C. Kipper-Schuler; J. F. Hurdle; J. Zvárová; T. Dostálová; P. Hanzlíček; Z. Teuberová; M. Nagy; M. Pieš; M. Seydlová; Eliášová; H. Šimková; Petra Knaup; Oliver Bott; Christian Kohl; Christian Lovis; Sebastian Garde

    2014-01-01

    Objectives: Given the quickening speed of discovery of variant disease drivers from combined patient genotype and phenotype data, the objective is to provide methodology using big data technology to support...

  13. Complex optimization for big computational and experimental neutron datasets

    Science.gov (United States)

    Bao, Feng; Archibald, Richard; Niedziela, Jennifer; Bansal, Dipanshu; Delaire, Olivier

    2016-12-01

    We present a framework to use high performance computing to determine accurate solutions to the inverse optimization problem of big experimental data against computational models. We demonstrate how image processing, mathematical regularization, and hierarchical modeling can be used to solve complex optimization problems on big data. We also demonstrate how both model and data information can be used to further increase solution accuracy of optimization by providing confidence regions for the processing and regularization algorithms. We use the framework in conjunction with the software package SIMPHONIES to analyze results from neutron scattering experiments on silicon single crystals, and refine first principles calculations to better describe the experimental data.

  14. Hydrochemical characteristics of the natural waters associated with the flooding of the Meirama open pit (A Coruna, NW Spain)

    Energy Technology Data Exchange (ETDEWEB)

    Delgado, J.; Juncosa, R.; Vazquez, A.; Falcon, I.; Canal, J.; Hernandez, H.; Padilla, F.; Rodriguez-Vellando, P.; Delgado, J.L. [University of La Coruna, La Coruna (Spain). School of Civil Engineering

    2008-02-15

    In December 2007, after 30 years of operations, the mine of Meirama ceased extraction of brown lignite. Since then, operations have begun which will lead to the formation of a big mining lake (about 2 km{sup 2} surface and up to 180 m deep) after controlled flooding of the open pit. In the process of flooding, both surface and ground waters are involved, each with their own chemical signature. According to the information available, the diversion of surface waters towards the pit hole should lead to the formation of a water body of acceptable quality. However, an unassisted flooding process could eventually form an acidic lake.

  15. Big Data Comes to School

    OpenAIRE

    Bill Cope; Mary Kalantzis

    2016-01-01

    The prospect of “big data” at once evokes optimistic views of an information-rich future and concerns about surveillance that adversely impacts our personal and private lives. This overview article explores the implications of big data in education, focusing by way of example on data generated by student writing. We have chosen writing because it presents particular complexities, highlighting the range of processes for collecting and interpreting evidence of learning in the era of computer-me...

  16. The role of big laboratories

    Science.gov (United States)

    Heuer, R.-D.

    2013-12-01

    This paper presents the role of big laboratories in their function as research infrastructures. Starting from the general definition and features of big laboratories, the paper goes on to present the key ingredients and issues, based on scientific excellence, for the successful realization of large-scale science projects at such facilities. The paper concludes by taking the example of scientific research in the field of particle physics and describing the structures and methods required to be implemented for the way forward.

  17. Big Data for Precision Medicine

    OpenAIRE

    Daniel Richard Leff; Guang-Zhong Yang

    2015-01-01

    This article focuses on the potential impact of big data analysis to improve health, prevent and detect disease at an earlier stage, and personalize interventions. The role that big data analytics may have in interrogating the patient electronic health record toward improved clinical decision support is discussed. We examine developments in pharmacogenetics that have increased our appreciation of the reasons why patients respond differently to chemotherapy. We also assess the expansion of onl...

  18. The role of big laboratories

    CERN Document Server

    Heuer, Rolf-Dieter

    2013-01-01

    This paper presents the role of big laboratories in their function as research infrastructures. Starting from the general definition and features of big laboratories, the paper goes on to present the key ingredients and issues, based on scientific excellence, for the successful realization of large-scale science projects at such facilities. The paper concludes by taking the example of scientific research in the field of particle physics and describing the structures and methods required to be implemented for the way forward.

  19. Big bang darkleosynthesis

    Directory of Open Access Journals (Sweden)

    Gordan Krnjaic

    2015-12-01

    Full Text Available In a popular class of models, dark matter comprises an asymmetric population of composite particles with short-range interactions arising from a confined nonabelian gauge group. We show that coupling this sector to a well-motivated light mediator particle yields efficient darkleosynthesis, a dark-sector version of big-bang nucleosynthesis (BBN), in generic regions of parameter space. Dark matter self-interaction bounds typically require the confinement scale to be above ΛQCD, which generically yields large (≫ MeV per dark nucleon) binding energies. These bounds further suggest the mediator is relatively weakly coupled, so repulsive forces between dark-sector nuclei are much weaker than Coulomb repulsion between standard-model nuclei, which results in an exponential barrier-tunneling enhancement over standard BBN. Thus, darklei are easier to make and harder to break than visible species with comparable mass numbers. This process can efficiently yield a dominant population of states with masses significantly greater than the confinement scale and, in contrast to dark matter that is a fundamental particle, may allow the dominant form of dark matter to have high spin (S≫3/2), whose discovery would be smoking-gun evidence for dark nuclei.

  20. Big Bang Nucleosynthesis: 2015

    CERN Document Server

    Cyburt, Richard H; Olive, Keith A; Yeh, Tsung-Han

    2015-01-01

    Big-bang nucleosynthesis (BBN) describes the production of the lightest nuclides via a dynamic interplay among the four fundamental forces during the first seconds of cosmic time. We briefly overview the essentials of this physics, and present new calculations of light element abundances through Li-6 and Li-7, with updated nuclear reactions and uncertainties including those in the neutron lifetime. We provide fits to these results as a function of baryon density and of the number of neutrino flavors, N_nu. We review recent developments in BBN, particularly new, precision Planck cosmic microwave background (CMB) measurements that now probe the baryon density, helium content, and the effective number of degrees of freedom, N_eff. These measurements allow for a tight test of BBN and of cosmology using CMB data alone. Our likelihood analysis convolves the 2015 Planck data chains with our BBN output and observational data. Adding astronomical measurements of light elements strengthens the power of BBN. We include a ...

  1. Challenges of Big Data Analysis.

    Science.gov (United States)

    Fan, Jianqing; Han, Fang; Liu, Han

    2014-06-01

    Big Data bring new opportunities to modern society and challenges to data scientists. On the one hand, Big Data hold great promise for discovering subtle population patterns and heterogeneities that are not detectable with small-scale data. On the other hand, the massive sample size and high dimensionality of Big Data introduce unique computational and statistical challenges, including scalability and storage bottlenecks, noise accumulation, spurious correlation, incidental endogeneity, and measurement errors. These challenges are distinctive and require new computational and statistical paradigms. This article gives an overview of the salient features of Big Data and how these features impose a paradigm change on statistical and computational methods as well as computing architectures. We also provide various new perspectives on Big Data analysis and computation. In particular, we emphasize the viability of the sparsest solution in a high-confidence set and point out that exogenous assumptions in most statistical methods for Big Data cannot be validated due to incidental endogeneity. They can lead to wrong statistical inferences and consequently wrong scientific conclusions.
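
    One of the named challenges, spurious correlation, is easy to reproduce with synthetic data. In the sketch below, a response is correlated against an ever-larger set of pure-noise predictors; the best-looking sample correlation grows with dimensionality even though nothing is actually related.

```python
# Spurious correlation in high dimensions: the maximum sample correlation
# between a response and pure-noise predictors grows with dimensionality.
import numpy as np

rng = np.random.default_rng(0)
n = 50                              # sample size
y = rng.standard_normal(n)          # response, independent of all predictors

for p in (10, 100, 1000, 10_000):   # number of pure-noise predictors
    X = rng.standard_normal((n, p))
    corr = (X - X.mean(0)).T @ (y - y.mean()) / (n * X.std(0) * y.std())
    print(f"p={p:>6}: max |corr| with pure noise = {np.abs(corr).max():.3f}")
```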

  2. Powering Big Data for Nursing Through Partnership.

    Science.gov (United States)

    Harper, Ellen M; Parkerson, Sara

    2015-01-01

    The Big Data Principles Workgroup (Workgroup) was established with support of the Healthcare Information and Management Systems Society. Building on the Triple Aim challenge, the Workgroup sought to identify Big Data principles, barriers, and challenges to nurse-sensitive data inclusion into Big Data sets. The product of this pioneering partnership Workgroup was the "Guiding Principles for Big Data in Nursing-Using Big Data to Improve the Quality of Care and Outcomes."

  3. GIS Support for Flood Rescue

    DEFF Research Database (Denmark)

    Liang, Gengsheng; Mioc, Darka; Anton, François

    2007-01-01

    Under flood events, ground traffic is blocked in and around the flooded area due to damage to roads and bridges. The traditional transportation network may not always help people make the right decision for evacuation. In order to provide the dynamic road information needed for flood rescue, we developed an adaptive web-based transportation network application using Oracle technology. Moreover, the geographic relationships between the road network and flood areas are taken into account: the overlay between the road network and flood polygons is computed on the fly. This application allows users to retrieve the shortest and safest route in the Fredericton road network during a flood event, enabling them to make timely decisions for flood rescue. We are using Oracle Spatial to deal with emergency situations, an approach that can be applied to other constrained network applications as well.
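
    The routing idea is straightforward to sketch. The example below stands in for the Oracle Spatial implementation with a plain networkx graph: road segments assumed to intersect flood polygons are removed before the shortest-path query; the network and the flooded edge are invented.

```python
# Flood-aware routing sketch: remove road segments that overlap flood
# polygons, then answer shortest-path queries on the remaining network.
# The graph and the flooded segment are invented; the paper's implementation
# uses Oracle Spatial rather than networkx.
import networkx as nx

G = nx.Graph()  # hypothetical road network: (node, node, length in metres)
G.add_weighted_edges_from([
    ("A", "B", 400), ("B", "C", 300), ("A", "D", 500),
    ("D", "C", 350), ("C", "E", 200), ("D", "E", 650),
])

flooded_edges = {("B", "C")}  # segments intersecting flood polygons (assumed)

safe = G.copy()
safe.remove_edges_from(flooded_edges)

route = nx.shortest_path(safe, "A", "E", weight="weight")
length = nx.shortest_path_length(safe, "A", "E", weight="weight")
print(f"safest shortest route: {' -> '.join(route)} ({length} m)")
```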

  4. Flood marks of the 1813 flood in the Central Europe

    Science.gov (United States)

    Miklanek, Pavol; Pekárová, Pavla; Halmová, Dana; Pramuk, Branislav; Bačová Mitková, Veronika

    2014-05-01

    In August 2013, 200 years had passed since the greatest and most destructive floods known in the Slovak river basins. The flood affected almost the entire territory of Slovakia, northeastern Moravia, and the south of Poland. The river basins of the Váh (Orava, Kysuca), Poprad, Nitra, Hron, Torysa, and Hornád, and the upper and middle Vistula and Odra, were most affected. The aim of this paper is to map the flood marks documenting this catastrophic flood in Slovakia. Flood marks and records of the 1813 flood in the Váh river basin are characterized by great diversity and are written in the Bernolák standard of Slovak, in Latin, German, and Hungarian. Their descriptions are stored in municipal chronicles and in Slovak and Hungarian state archives. The flood in 1813 devastated the entire Váh valley as well as its tributaries. The following flood marks are known in the Váh river basin: one in the village of Dolná Lehota in the Orava river basin; a historical map from 1817 covering the village of Sučany and showing three different cross-sections of the Váh river during the 1813 flood; a flood mark in the city of Trenčín; a flood mark in the gate of the Brunovce mansion; a cross preserved on the old linden tree at Drahovce; and some records in written documents, e.g. in Cifer village. The second part of the study deals with the mapping of flood marks in the Hron, Hnilec, and Poprad river basins, and in the Vistula river basin in Krakow. On the basis of literary documents and our own measurements, we summarize the peak flow rates reached during the 1813 flood at the Hron profile at Banská Bystrica. According to the present-day situation, the 1813 flood peak was approximately 1.22 m higher than the flood of 1974. In the Poprad basin, too, the August 1813 flood is referred to as the most devastating flood of the last 400 years. The position of the flood mark is known, but the building was unfortunately removed later. The water level in 1813 was much higher than the water level during the recent flood of June 2010. In Cracow the water level

  5. Citizen involvement in flood risk governance: flood groups and networks

    Directory of Open Access Journals (Sweden)

    Twigger-Ross Clare

    2016-01-01

    Full Text Available Over the past decade there has been a policy shift within UK flood risk management towards localism, with an emphasis on communities taking ownership of flood risk. There is also an increased focus on resilience and, more specifically, on community resilience to flooding. This paper draws on research carried out for the UK Department for Environment, Food and Rural Affairs to evaluate the Flood Resilience Community Pathfinder (FRCP) scheme in England. Resilience is conceptualised as multidimensional and linked to existing capacities within a community. Creating resilience to flooding is an ongoing process of adaptation, learning from past events and preparing for future risks. This paper focuses on the development of formal and informal institutions to support improved flood risk management: institutional resilience capacity. It includes new institutions, e.g. flood groups, as well as activities that help to build inter- and intra-institutional resilience capacity, e.g. community flood planning. The pathfinder scheme consisted of 13 projects across England, led by local authorities, aimed at developing community resilience to flood risk between 2013 and 2015. This paper discusses the nature and structure of flood groups, the process of their development, and the extent of their linkages with formal institutions, drawing out the barriers to and facilitators of developing institutional resilience at the local level.

  6. Big Sky Carbon Sequestration Partnership

    Energy Technology Data Exchange (ETDEWEB)

    Susan Capalbo

    2005-12-31

    The Big Sky Carbon Sequestration Partnership, led by Montana State University, is comprised of research institutions, public entities and private sector organizations, and the Confederated Salish and Kootenai Tribes and the Nez Perce Tribe. Efforts under this Partnership in Phase I are organized into four areas: (1) evaluation of sources and carbon sequestration sinks that will be used to determine the location of pilot demonstrations in Phase II; (2) development of a GIS-based reporting framework that links with national networks; (3) design of an integrated suite of monitoring, measuring, and verification technologies, market-based opportunities for carbon management, and an economic/risk assessment framework (referred to below as the Advanced Concepts component of the Phase I efforts); and (4) initiation of a comprehensive education and outreach program. As a result of the Phase I activities, the groundwork is in place to provide an assessment of storage capabilities for CO{sub 2} utilizing the resources found in the Partnership region (both geological and terrestrial sinks) that complements the ongoing DOE research agenda in Carbon Sequestration. The geology of the Big Sky Carbon Sequestration Partnership Region is favorable for the potential sequestration of enormous volumes of CO{sub 2}. The United States Geological Survey (USGS 1995) identified 10 geologic provinces and 111 plays in the region. These provinces and plays include both sedimentary rock types characteristic of oil, gas, and coal production as well as large areas of mafic volcanic rocks. Of the 10 provinces and 111 plays, 1 province and 4 plays are located within Idaho. The remaining 9 provinces and 107 plays are dominated by sedimentary rocks and located in the states of Montana and Wyoming. The potential sequestration capacity of the 9 sedimentary provinces within the region ranges from 25,000 to almost 900,000 million metric tons of CO{sub 2}. Overall, every sedimentary formation investigated

  7. An Evaluation of Selected Extraordinary Floods in the United States Reported by the U.S. Geological Survey and Implications for Future Advancement of Flood Science

    Science.gov (United States)

    Costa, John E.; Jarrett, Robert D.

    2008-01-01

    discharges that were estimated by an inappropriate method (slope-area) (Big Creek near Waynesville, North Carolina; Day Creek near Etiwanda, California). Original field notes and records could not be found for three of the floods, however, some data (copies of original materials, records of reviews) were available for two of these floods. A rating was assigned to each of seven peak discharges that had no rating. Errors identified in the reviews include misidentified flow processes, incorrect drainage areas for very small basins, incorrect latitude and longitude, improper field methods, arithmetic mistakes in hand calculations, omission of measured high flows when developing rating curves, and typographical errors. Common problems include use of two-section slope-area measurements, poor site selection, uncertainties in Manning's n-values, inadequate review, lost data files, and insufficient and inadequately described high-water marks. These floods also highlight the extreme difficulty in making indirect discharge measurements following extraordinary floods. Significantly, none of the indirect measurements are rated better than fair, which indicates the need to improve methodology to estimate peak discharge. Highly unsteady flow and resulting transient hydraulic phenomena, two-dimensional flow patterns, debris flows at streamflow-gaging stations, and the possibility of disconnected flow surfaces are examples of unresolved problems not well handled by current indirect discharge methodology. On the basis of a comprehensive review of 50,000 annual peak discharges and miscellaneous floods in California, problems with individual flood peak discharges would be expected to require a revision of discharge or rating curves at a rate no greater than about 0.10 percent of all floods. Many extraordinary floods create complex flow patterns and processes that cannot be adequately documented with quasi-steady, uniform one-dimensional analyses. These floods are most accura
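
    The slope-area method questioned in several of these reviews rests on Manning's equation, Q = (1/n) A R^(2/3) S^(1/2). The sketch below, with hypothetical channel geometry and slope, shows how strongly such an indirect estimate depends on the choice of Manning's n, one of the common problems the report identifies.

```python
# Slope-area discharge via Manning's equation, Q = (1/n) A R^(2/3) S^(1/2)
# (SI units). Geometry, slope, and the n range are hypothetical; the spread
# of results illustrates the sensitivity to Manning's n noted in the review.
def manning_discharge(area_m2, wetted_perimeter_m, slope, n):
    hydraulic_radius = area_m2 / wetted_perimeter_m
    return (1.0 / n) * area_m2 * hydraulic_radius ** (2.0 / 3.0) * slope ** 0.5

A, P, S = 220.0, 60.0, 0.002  # cross-section inferred from high-water marks
for n in (0.030, 0.045, 0.060):  # plausible range of Manning's n
    print(f"n = {n:.3f}: Q = {manning_discharge(A, P, S, n):7.0f} m^3/s")
```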

  8. Effects of a flooding event on a threatened black bear population in Louisiana

    Science.gov (United States)

    O'Connell-Goode, Kaitlin C.; Lowe, Carrie L.; Clark, Joseph D.

    2014-01-01

    The Louisiana black bear, Ursus americanus luteolus, is listed as threatened under the Endangered Species Act as a result of habitat loss and human-related mortality. Information on population-level responses of large mammals to flooding events is scarce, and we had a unique opportunity to evaluate the viability of the Upper Atchafalaya River Basin (UARB) black bear population before and after a significant flooding event. We began collecting black bear hair samples in 2007 for a DNA mark-recapture study to estimate abundance (N) and apparent survival (φ). In 2011, the Morganza Spillway was opened to divert floodwaters from the Mississippi River through the UARB, inundating > 50% of our study area, potentially impacting recovery of this important bear population. To evaluate the effects of this flooding event on bear population dynamics, we used a robust design multistate model to estimate changes in transition rates from the flooded area to non-flooded area (ψF→NF) before (2007–2010), during (2010–2011) and after (2011–2012) the flood. Average N across all years of study was 63.2 (SE = 5.2), excluding the year of the flooding event. Estimates of ψF→NF increased from 0.014 (SE = 0.010; meaning that 1.4% of the bears moved from the flooded area to non-flooded areas) before flooding to 0.113 (SE = 0.045) during the flood year, and then decreased to 0.028 (SE = 0.035) after the flood. Although we demonstrated a flood effect on transition rates as hypothesized, the effect was small (88.7% of the bears remained in the flooded area during flooding) and φ was unchanged, suggesting that the 2011 flooding event had minimal impact on survival and site fidelity.

  9. Dip-angle influence on areal DNAPL recovery by co-solvent flooding with and without pre-flooding.

    Science.gov (United States)

    Boyd, Glen R; Li, Minghua; Husserl, Johana; Ocampo-Gómez, Ana M

    2006-01-10

    A two-dimensional (2D) laboratory model was used to study the effects of gravity on areal recovery of a representative dense non-aqueous phase liquid (DNAPL) contaminant by an alcohol pre-flood and co-solvent flood in dipping aquifers. Recent studies have demonstrated that injection of alcohol and co-solvent solutions can be used to reduce the density of DNAPL globules in situ and displace the contaminant from the source zone. However, contact with aqueous alcohol reduces interfacial tension and causes DNAPL swelling, thus increasing the risk of uncontrolled downward DNAPL migration. The 2D laboratory model was operated with constant background gradient flow, and a DNAPL spill was simulated using tetrachloroethene (PCE). The spill was dispersed to a trapped, immobile PCE saturation by a water flood. Areal PCE recovery was studied using a double-triangle well pattern to simulate a remediation scheme consisting of an alcohol pre-flood using aqueous isobutanol (approximately 10% vol.) followed by a co-solvent flood using a solution of ethylene glycol (65%) and 1-propanol (35%). Experiments were conducted with the 2D model oriented in the horizontal plane and compared to experiments at the 15 degree and 30 degree dip-angle orientations. Injection was applied either in the downward or upward direction of flow. Experimental results were compared to theoretical predictions of flood front stability and used to evaluate the effects of gravity on areal PCE recovery. Sensitivity experiments were performed to evaluate the effects of the alcohol pre-flood on areal PCE recovery. For experiments conducted with the alcohol pre-flood and the 2D model oriented in the horizontal plane, results indicate that 89-93% of source zone PCE was recovered. With injection oriented downward, areal PCE recovery was 70-77% for a 15 degree dip angle and 57-59% for a 30 degree dip angle. With injection oriented upward, areal PCE recovery was 57-60% at the 30

  10. Flood of April and May 2008 in Northern Maine

    Science.gov (United States)

    Lombard, Pamela J.

    2010-01-01

    Severe flooding occurred in Aroostook and Penobscot Counties in northern Maine between April 28 and May 1, 2008, and damage was extensive in the town of Fort Kent. Aroostook County was declared a Federal disaster area on May 9, and the declaration was expanded to include Penobscot County on May 16, qualifying the entire region for federal assistance. Water in the St. John River peaked at 30.17 feet in Fort Kent (5 feet above flood stage), hit the low steel of the International Bridge connecting Fort Kent to New Brunswick, caused closure of international bridges in Fort Kent, Van Buren, and Hamlin, and came within inches of the top of a 30-foot-high earthen dike constructed to protect the downtown area of Fort Kent. Long-term streamgages with 25 to 84 years of record on the Big Black, St. John, Allagash, Fish, and Aroostook Rivers recorded maximum streamflows for their respective periods of record. Northern Maine experienced major floods in 1923, 1973, 1974, 1979, and 1983 (Maloney and Bartlett, 1991). All of these floods were in late April or early May, when heavy rain combined with snowmelt runoff.

  11. High-resolution urban flood modelling - a joint probability approach

    Science.gov (United States)

    Hartnett, Michael; Olbert, Agnieszka; Nash, Stephen

    2017-04-01

    ., 2008) The methodology includes estimates of flood probabilities due to coastal- and fluvial-driven processes occurring individually or jointly, the mechanisms of flooding, and their impacts on the urban environment. Various flood scenarios are examined in order to demonstrate that this methodology is necessary to quantify the important physical processes in coastal flood predictions. Cork City, located on the south coast of Ireland and subject to frequent coastal-fluvial flooding, is used as a case study.
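
    The core of a joint-probability approach can be shown with a toy calculation: compare the chance that coastal and fluvial thresholds are exceeded together under independence and under an assumed Gaussian-copula dependence. All probabilities and the correlation below are invented, not values from the Cork study.

```python
# Toy joint-probability comparison for compound coastal-fluvial flooding:
# independence versus an assumed Gaussian-copula dependence between drivers.
# The marginal probabilities and correlation are invented, not Cork values.
import numpy as np
from scipy import stats

p_surge, p_flow = 0.05, 0.05  # marginal annual exceedance probabilities
rho = 0.6                     # assumed dependence between the two drivers

print(f"independent drivers: P(joint) = {p_surge * p_flow:.4f}")

# Gaussian copula: P(X > x, Y > y) equals, by symmetry, the bivariate normal
# CDF evaluated at the negated thresholds.
z = stats.norm.ppf([1 - p_surge, 1 - p_flow])
joint = stats.multivariate_normal.cdf(-z, mean=[0, 0], cov=[[1, rho], [rho, 1]])
print(f"rho = {rho} copula:  P(joint) = {joint:.4f}")
```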

  12. Unlocking the Power of Big Data at the National Institutes of Health.

    Science.gov (United States)

    Coakley, Meghan F; Leerkes, Maarten R; Barnett, Jason; Gabrielian, Andrei E; Noble, Karlynn; Weber, M Nick; Huyen, Yentram

    2013-09-01

    The era of "big data" presents immense opportunities for scientific discovery and technological progress, with the potential to have enormous impact on research and development in the public sector. In order to capitalize on these benefits, there are significant challenges to overcome in data analytics. The National Institute of Allergy and Infectious Diseases held a symposium entitled "Data Science: Unlocking the Power of Big Data" to create a forum for big data experts to present and share some of the creative and innovative methods to gleaning valuable knowledge from an overwhelming flood of biological data. A significant investment in infrastructure and tool development, along with more and better-trained data scientists, may facilitate methods for assimilation of data and machine learning, to overcome obstacles such as data security, data cleaning, and data integration.

  13. Unlocking the Power of Big Data at the National Institutes of Health

    Science.gov (United States)

    Coakley, Meghan F.; Leerkes, Maarten R.; Barnett, Jason; Gabrielian, Andrei E.; Noble, Karlynn; Weber, M. Nick

    2013-01-01

    Abstract The era of “big data” presents immense opportunities for scientific discovery and technological progress, with the potential to have enormous impact on research and development in the public sector. In order to capitalize on these benefits, there are significant challenges to overcome in data analytics. The National Institute of Allergy and Infectious Diseases held a symposium entitled “Data Science: Unlocking the Power of Big Data” to create a forum for big data experts to present and share some of the creative and innovative methods for gleaning valuable knowledge from an overwhelming flood of biological data. A significant investment in infrastructure and tool development, along with more and better-trained data scientists, may facilitate methods for assimilation of data and machine learning, to overcome obstacles such as data security, data cleaning, and data integration. PMID:27442200

  14. Floods and tsunamis.

    Science.gov (United States)

    Llewellyn, Mark

    2006-06-01

    Floods and tsunamis cause few severe injuries, but those injuries can overwhelm local areas, depending on the magnitude of the disaster. Most injuries are extremity fractures, lacerations, and sprains. Because of the mechanism of soft tissue and bone injuries, infection is a significant risk. Aspiration pneumonias are also associated with tsunamis. Appropriate precautionary interventions prevent communicable disease outbreaks. Psychosocial health issues must be considered.

  15. Aligning Natural Resource Conservation and Flood Hazard Mitigation in California.

    Directory of Open Access Journals (Sweden)

    Juliano Calil

    Full Text Available Flooding is the most common and damaging of all natural disasters in the United States, and was a factor in almost all declared disasters in the U.S. Direct flood losses in the U.S. in 2011 totaled $8.41 billion, and flood damage has also been on the rise globally over the past century. The National Flood Insurance Program has paid out more than $38 billion in claims since its inception in 1968, more than a third of which has gone to the one percent of policies that experienced multiple losses and are classified as "repetitive loss." During the same period, the loss of coastal wetlands and other natural habitat has continued, and funds for conservation and restoration of these habitats are very limited. This study demonstrates that flood losses could be mitigated through action that meets both flood risk reduction and conservation objectives. We found that there are at least 11,243 km2 of land in coastal California which are both flood-prone and of natural resource conservation value, and where a property/structure buyout and habitat restoration project could meet multiple objectives. For example, our results show that in Sonoma County, the extent of land that meets these criteria is 564 km2. Further, we explore flood mitigation grant programs that can be a significant source of funds for such projects. We demonstrate that government-funded buyouts followed by restoration of targeted lands can support social, environmental, and economic objectives: reduction of flood exposure, restoration of natural resources, and efficient use of limited governmental funds.

  16. Atmospheric Rivers, Floods, and Flash Floods in California

    Science.gov (United States)

    Skelly, Klint T.

    Atmospheric Rivers (ARs) are long (>2000 km), narrow (<1000 km) corridors of enhanced vertically integrated water vapor (IWV) and enhanced IWV transport (IVT). The landfall of ARs along the U.S. West Coast has been linked to extreme precipitation and flooding/flash flooding in regions of complex topography. The objective of this study is to investigate the relationship between a 10-water-year (2005-2014) climatology of floods and flash floods and landfalling ARs. The ARs in this study are defined using IVT following the Rutz et al. (2013) methodology, whereas floods and flash floods are identified from the National Centers for Environmental Information (NCEI) Storm Events Database. The results of this study indicate that landfalling ARs are present on a majority of days with floods in northern California. Landfalling ARs are predominantly present on days with flash flood reports during the cold season (November-March), whereas the North American monsoon is present on days with flash flood reports during the warm season (April-October). Two case studies are provided to illustrate the hydrologic impact of landfalling ARs. The first illustrates a flood event that occurred in association with three landfalling ARs that produced 800 mm of precipitation over the Russian River watershed in northern California; the second illustrates a flash flood event that occurred in association with a landfalling AR that produced ~225 mm of precipitation over the Santa Ynez watershed and caused a flash flood over the southern portions of Santa Barbara County in southern California.
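
    The IVT quantity used to define ARs is a standard column integral, IVT = (1/g) |∫ q V dp|. The sketch below evaluates it for a single hypothetical sounding; real AR studies integrate reanalysis fields, typically between 1000 and 300 hPa.

```python
# Standard IVT column integral, IVT = (1/g) * |integral of q*V dp|, evaluated
# for a single hypothetical sounding; AR studies integrate reanalysis fields,
# typically between 1000 and 300 hPa.
import numpy as np

g = 9.81  # m s^-2
p = np.array([1000, 925, 850, 700, 500, 300.0]) * 100.0       # pressure (Pa)
q = np.array([0.010, 0.009, 0.007, 0.004, 0.002, 0.0005])     # humidity (kg/kg)
u = np.array([15, 18, 20, 22, 25, 28.0])                      # zonal wind (m/s)
v = np.array([20, 22, 25, 28, 30, 32.0])                      # meridional wind (m/s)

def column_integral(y, p):
    # trapezoidal rule; p decreases upward, so negate for a positive result
    return -np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(p))

ivt = np.hypot(column_integral(q * u, p) / g, column_integral(q * v, p) / g)
print(f"IVT = {ivt:.0f} kg m^-1 s^-1 (a common AR threshold is ~250)")
```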

  17. Statistics and Analysis of the Relations between Rainstorm Floods and Earthquakes

    Directory of Open Access Journals (Sweden)

    Baodeng Hou

    2016-01-01

    Full Text Available The frequent occurrence of geophysical disasters under climate change has drawn Chinese scholars to pay attention to relations between disasters. If the occurrence sequence of disasters could be identified, long-term disaster forecasting could be realized. Based on the Earth Degassing Effect (EDE), which is valid, this paper took the magnitude, epicenter, and occurrence time of earthquakes, as well as the epicenter and occurrence time of rainstorm floods, as the basic factors of an integrated model to study the correlation between rainstorm floods and earthquakes. The 2461 severe earthquakes that occurred in China or within 3000 km of China and the 169 heavy rainstorm floods that occurred in China over the past 200+ years were used as the input data of the model. The computational results showed that although most of the rainstorm floods have nothing to do with severe earthquakes from a statistical perspective, some floods might be related to earthquakes. This is especially true when the earthquakes happen in the vapor transmission zone, where rainstorms lead to abundant water vapor. In such cases, earthquakes are more likely to cause big rainstorm floods. However, many cases of rainstorm floods following severe earthquakes could be found, albeit with a large degree of uncertainty.

  18. Societal and economic impacts of flood hazards in Turkey – an overview

    Directory of Open Access Journals (Sweden)

    Koç Gamze

    2016-01-01

    Full Text Available Turkey has been severely affected by many natural hazards, in particular earthquakes and floods. Although there is a large body of literature on earthquake hazards and risks in Turkey, comparatively little is known about flood hazards and risks. This study therefore aims to investigate flood patterns and the societal and economic impacts of flood hazards in Turkey, and to provide a comparative overview of the temporal and spatial distribution of flood losses by analysing the EM-DAT (Emergency Events Database) and TABB (Turkey Disaster Data Base) databases on disaster losses throughout Turkey for the years 1960-2014. The comparison of these two databases reveals big mismatches in the flood data; e.g., the reported number of events, the number of affected people, and the economic losses differ dramatically. This paper explores the reasons for these mismatches, and biases and fallacies in the loss data of the two databases are discussed as well. Since loss data collection is gaining more and more attention, e.g. in the Sendai Framework for Disaster Risk Reduction 2015-2030 (SFDRR), the study could offer groundwork for developing guidelines and procedures on how to standardize loss databases and extend them across other hazard events. It also provides substantial insights for flood risk mitigation and adaptation studies in Turkey and valuable insights for other (European) countries.

  19. Big Sky Carbon Sequestration Partnership

    Energy Technology Data Exchange (ETDEWEB)

    Susan M. Capalbo

    2005-11-01

    The Big Sky Carbon Sequestration Partnership, led by Montana State University, is comprised of research institutions, public entities and private sector organizations, and the Confederated Salish and Kootenai Tribes and the Nez Perce Tribe. Efforts under this Partnership in Phase I fall into four areas: evaluation of sources and carbon sequestration sinks that will be used to determine the location of pilot demonstrations in Phase II; development of a GIS-based reporting framework that links with national networks; design of an integrated suite of monitoring, measuring, and verification technologies and assessment frameworks; and initiation of a comprehensive education and outreach program. The groundwork is in place to provide an assessment of storage capabilities for CO2 utilizing the resources found in the Partnership region (both geological and terrestrial sinks) that would complement the ongoing DOE research agenda in Carbon Sequestration. The region has a diverse array of geological formations that could provide storage options for carbon in one or more of its three states. Likewise, initial estimates of terrestrial sinks indicate a vast potential for increasing and maintaining soil C on forested, agricultural, and reclaimed lands. Both options include the potential for offsetting economic benefits to industry and society. Steps have been taken to assure that the GIS-based framework is consistent among types of sinks within the Big Sky Partnership area and with the efforts of other DOE regional partnerships. The Partnership recognizes the critical importance of measurement, monitoring, and verification technologies to support not only carbon trading but all policies and programs that DOE and other agencies may want to pursue in support of GHG mitigation. The efforts in developing and implementing MMV technologies for geological sequestration reflect this concern. Research is also underway to identify and validate best management practices for soil C in the

  20. Recent advances in flood forecasting and flood risk assessment

    Directory of Open Access Journals (Sweden)

    G. Arduino

    2005-01-01

    Full Text Available Recent large floods in Europe have led to increased interest in research and development of flood forecasting systems. Some of these events were provoked by some of the wettest rainfall periods on record, which has led to speculation that such extremes are attributable in some measure to anthropogenic global warming and represent the beginning of a period of higher flood frequency. Whilst current trends in extreme event statistics are difficult to discern conclusively, there has been a substantial increase in the frequency of high floods in the 20th century for basins greater than 2x10^5 km2. There is also increasing evidence that anthropogenic forcing of climate change may lead to an increased probability of extreme precipitation and, hence, of flooding. There is, therefore, major emphasis on the improvement of operational flood forecasting systems in Europe, with significant European Community spending on research and development of prototype forecasting systems and flood risk management projects. This Special Issue synthesises the most relevant scientific and technological results presented at the International Conference on Flood Forecasting in Europe held in Rotterdam from 3-5 March 2003. During that meeting 150 scientists, forecasters and stakeholders from four continents assembled to present their work and current operational best practice and to discuss future directions of scientific and technological efforts in flood prediction and prevention. The papers presented at the conference fall into seven themes, as follows.

  1. Big data for bipolar disorder.

    Science.gov (United States)

    Monteith, Scott; Glenn, Tasha; Geddes, John; Whybrow, Peter C; Bauer, Michael

    2016-12-01

    The delivery of psychiatric care is changing with a new emphasis on integrated care, preventative measures, population health, and the biological basis of disease. Fundamental to this transformation are big data and advances in the ability to analyze these data. The impact of big data on the routine treatment of bipolar disorder today and in the near future is discussed, with examples that relate to health policy, the discovery of new associations, and the study of rare events. The primary sources of big data today are electronic medical records (EMR), claims, and registry data from providers and payers. In the near future, data created by patients from active monitoring, passive monitoring of Internet and smartphone activities, and from sensors may be integrated with the EMR. Diverse data sources from outside of medicine, such as government financial data, will be linked for research. Over the long term, genetic and imaging data will be integrated with the EMR, and there will be more emphasis on predictive models. Many technical challenges remain when analyzing big data, relating to its size, heterogeneity, complexity, and the unstructured text data in the EMR. Human judgement and subject matter expertise are critical parts of big data analysis, and the active participation of psychiatrists is needed throughout the analytical process.

  2. Considerations on Geospatial Big Data

    Science.gov (United States)

    LIU, Zhen; GUO, Huadong; WANG, Changlin

    2016-11-01

    Geospatial data, as a significant portion of big data, has recently gained the full attention of researchers. However, few researchers focus on the evolution of geospatial data and its scientific research methodologies. When entering into the big data era, fully understanding the changing research paradigm associated with geospatial data will definitely benefit future research on big data. In this paper, we look deep into these issues by examining the components and features of geospatial big data, reviewing relevant scientific research methodologies, and examining the evolving pattern of geospatial data in the scope of the four ‘science paradigms’. This paper proposes that geospatial big data has significantly shifted the scientific research methodology from ‘hypothesis to data’ to ‘data to questions’ and it is important to explore the generality of growing geospatial data ‘from bottom to top’. Particularly, four research areas that mostly reflect data-driven geospatial research are proposed: spatial correlation, spatial analytics, spatial visualization, and scientific knowledge discovery. It is also pointed out that privacy and quality issues of geospatial data may require more attention in the future. Also, some challenges and thoughts are raised for future discussion.

  3. Official statistics and Big Data

    Directory of Open Access Journals (Sweden)

    Peter Struijs

    2014-07-01

    Full Text Available The rise of Big Data changes the context in which organisations producing official statistics operate. Big Data provides opportunities, but in order to make optimal use of Big Data, a number of challenges have to be addressed. This stimulates increased collaboration between National Statistical Institutes, Big Data holders, businesses and universities. In time, this may lead to a shift in the role of statistical institutes in the provision of high-quality and impartial statistical information to society. In this paper, the changes in context, the opportunities, the challenges and the way to collaborate are addressed. The collaboration between the various stakeholders will involve each partner building on and contributing different strengths. For national statistical offices, traditional strengths include, on the one hand, the ability to collect data and combine data sources with statistical products and, on the other hand, their focus on quality, transparency and sound methodology. In the Big Data era of competing and multiplying data sources, they continue to have a unique knowledge of official statistical production methods. And their impartiality and respect for privacy as enshrined in law uniquely position them as a trusted third party. Based on this, they may advise on the quality and validity of information of various sources. By thus positioning themselves, they will be able to play their role as key information providers in a changing society.

  4. CLOUD COMPUTING WITH BIG DATA: A REVIEW

    OpenAIRE

    Anjali; Er. Amandeep Kaur; Mrs. Shakshi

    2016-01-01

    Big data is a collection of huge quantities of data, and big data analytics is the process of examining such large amounts of data. Big data and cloud computing are hot issues in information technology, and big data is one of the main problems researchers face nowadays. Researchers are focusing on how to handle huge amounts of data with cloud computing and how to achieve adequate security for big data in the cloud. To handle the big data problem, the Hadoop framework is used, in which data are fragmented and executed in parallel....

  5. Fault tree analysis for urban flooding

    NARCIS (Netherlands)

    Ten Veldhuis, J.A.E.; Clemens, F.H.L.R.; Van Gelder, P.H.A.J.M.

    2008-01-01

    Traditional methods to evaluate flood risk mostly focus on storm events as the main cause of flooding. Fault tree analysis is a technique that is able to model all potential causes of flooding and to quantify both the overall probability of flooding and the contributions of all causes of flooding to
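
    For an OR gate over independent basic events, the quantification step of a fault tree reduces to P(flooding) = 1 − Π(1 − p_i). The sketch below applies this to a set of invented causes and probabilities; the paper's actual tree and failure rates are not reproduced here.

```python
# Fault-tree quantification sketch: urban flooding as the top event of an OR
# gate over independent basic events. All probabilities below are invented.
causes = {
    "storm exceeds sewer capacity": 0.020,
    "sewer blockage":               0.012,
    "pump station failure":         0.005,
    "river backwater":              0.008,
}

# OR gate with independent events: P(top) = 1 - prod(1 - p_i)
p_no_flood = 1.0
for p in causes.values():
    p_no_flood *= 1.0 - p
p_flood = 1.0 - p_no_flood

print(f"overall annual probability of flooding: {p_flood:.4f}")
for cause, p in causes.items():
    # crude importance measure: share of the summed basic-event probabilities
    print(f"  contribution of {cause:<29s} {p / sum(causes.values()):.1%}")
```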

  6. Big Canyon Creek Ecological Restoration Strategy.

    Energy Technology Data Exchange (ETDEWEB)

    Rasmussen, Lynn; Richardson, Shannon

    2007-10-01

    then used data collected from the District's stream assessment and inventory, utilizing the Stream Visual Assessment Protocol (SVAP), to determine the treatment necessary to bring 90% of reaches ranked Poor or Fair through the SVAP up to Good or Excellent. In 10 years' time, all reaches that were previously evaluated with SVAP will be re-evaluated to determine progress and to adapt methods for continued success. Over 400 miles of stream need treatment in order to meet identified restoration goals. Treatments include practices which result in riparian habitat improvements, nutrient reductions, channel condition improvements, fish habitat improvements, invasive species control, water withdrawal reductions, improved hydrologic alterations, upland sediment reductions, and passage barrier removal. The Nez Perce Soil and Water Conservation District (District) and the Nez Perce Tribe Department of Fisheries Resource Management Watershed Division (Tribe) developed this document to guide restoration activities within the Big Canyon Creek watershed for the period of 2008-2018. This plan was created to demonstrate the ongoing need and potential for anadromous fish habitat restoration within the watershed and to ensure continued implementation of restoration actions and activities. It was developed not only to guide the District and the Tribe, but also to encourage cooperation among all stakeholders, including landowners, government agencies, private organizations, tribal governments, and elected officials. Through sharing information, skills, and resources in active, cooperative relationships, all concerned parties will have the opportunity to join together to strengthen and maintain a sustainable natural resource base for present and future generations within the watershed. The primary goal of the strategy is to address aquatic habitat restoration needs on a watershed level for resident and anadromous fish species, promoting quality habitat within a self

  7. Should seasonal rainfall forecasts be used for flood preparedness?

    Directory of Open Access Journals (Sweden)

    E. Coughlan de Perez

    2017-09-01

    Full Text Available In light of strong encouragement for disaster managers to use climate services for flood preparation, we question whether seasonal rainfall forecasts should indeed be used as indicators of the likelihood of flooding. Here, we investigate the primary indicators of flooding at the seasonal timescale across sub-Saharan Africa. Given the sparsity of hydrological observations, we input bias-corrected reanalysis rainfall into the Global Flood Awareness System to identify seasonal indicators of floodiness. Results demonstrate that in some regions of western, central, and eastern Africa with typically wet climates, even a perfect tercile forecast of seasonal total rainfall would provide little to no indication of the seasonal likelihood of flooding. The number of extreme events within a season shows the highest correlations with floodiness consistently across regions. Otherwise, results vary across climate regimes: floodiness in arid regions in southern and eastern Africa shows the strongest correlations with seasonal average soil moisture and seasonal total rainfall. Floodiness in wetter climates of western and central Africa and Madagascar shows the strongest relationship with measures of the intensity of seasonal rainfall. Measures of rainfall patterns, such as the length of dry spells, are least related to seasonal floodiness across the continent. Ultimately, identifying the drivers of seasonal flooding can be used to improve forecast information for flood preparedness and to avoid misleading decision-makers.
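
    The kind of diagnostic behind these results can be sketched with synthetic data: compute candidate seasonal indicators and rank their correlations with a floodiness measure. Everything below is a stand-in for the reanalysis-driven Global Flood Awareness System output used in the paper, including the constructed "floodiness" series.

```python
# Synthetic illustration of ranking seasonal floodiness indicators; the data
# below stand in for the reanalysis-driven Global Flood Awareness System
# output, and "floodiness" is constructed, not observed.
import numpy as np

rng = np.random.default_rng(7)
n_seasons = 30
daily_rain = rng.gamma(shape=0.4, scale=8.0, size=(n_seasons, 120))  # mm/day

seasonal_total = daily_rain.sum(axis=1)
extreme_days = (daily_rain > np.percentile(daily_rain, 95)).sum(axis=1)
# toy floodiness, driven mostly by the count of extreme days plus noise
floodiness = extreme_days + rng.normal(0, 1.5, n_seasons)

for name, x in [("seasonal total rainfall", seasonal_total),
                ("number of extreme days", extreme_days)]:
    r = np.corrcoef(x, floodiness)[0, 1]
    print(f"corr(floodiness, {name}): {r:+.2f}")
```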

  8. Uncertainty evaluation of design rainfall for urban flood risk analysis.

    Science.gov (United States)

    Fontanazza, C M; Freni, G; La Loggia, G; Notaro, V

    2011-01-01

    A reliable and long dataset describing urban flood locations, volumes, and depths would be an ideal prerequisite for assessing flood frequency distributions. However, data are often piecemeal, and long-term hydraulic modelling is often adopted to estimate floods from historical rainfall series. Long-term modelling approaches are time- and resource-consuming, and synthetically designed rainfalls are therefore often used to estimate flood frequencies. The present paper aims to assess the uncertainty of such an approach and to suggest improvements in the definition of synthetic rainfall data for flood-frequency analysis. To this end, a multivariate statistical analysis based on a copula method was applied to rainfall features (total depth, duration, and maximum intensity) to generate synthetic rainfalls that are more consistent with historical events. The procedure was applied to a real case study, and the results were compared with those obtained by simulating other typical synthetic rainfall events linked to intensity-duration-frequency (IDF) curves. The copula-based multivariate analysis is more robust and adapts well to experimental flood locations, even if it is more complex and time-consuming. This study demonstrates that statistical correlations amongst rainfall frequency, duration, volume, and peak intensity can partially explain the weak reliability of flood-frequency analyses based on synthetic rainfall events.
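
    A minimal version of the copula step looks like this: draw correlated uniforms from a Gaussian copula (the abstract does not specify the copula family, so this is an assumption) and map them through fitted marginals for depth, duration, and peak intensity; all correlations and marginal parameters below are invented.

```python
# A hedged sketch of copula-based synthetic rainfall: sample correlated
# uniforms from a Gaussian copula, then map them through assumed marginals
# for depth, duration, and peak intensity.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
# assumed rank correlations between depth, duration, and max intensity
R = np.array([[1.0, 0.6, 0.7],
              [0.6, 1.0, 0.2],
              [0.7, 0.2, 1.0]])

z = rng.multivariate_normal(np.zeros(3), R, size=1000)  # Gaussian copula draws
u = stats.norm.cdf(z)                                   # to uniform margins

depth     = stats.gamma.ppf(u[:, 0], a=2.0, scale=10.0)     # mm
duration  = stats.gamma.ppf(u[:, 1], a=1.5, scale=4.0)      # hours
intensity = stats.gumbel_r.ppf(u[:, 2], loc=8.0, scale=3.0) # mm/h

print("synthetic event sample:")
for d, t, i in zip(depth[:5], duration[:5], intensity[:5]):
    print(f"  depth={d:5.1f} mm, duration={t:4.1f} h, peak={i:4.1f} mm/h")
```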

  9. Should seasonal rainfall forecasts be used for flood preparedness?

    Science.gov (United States)

    Coughlan de Perez, Erin; Stephens, Elisabeth; Bischiniotis, Konstantinos; van Aalst, Maarten; van den Hurk, Bart; Mason, Simon; Nissan, Hannah; Pappenberger, Florian

    2017-09-01

    In light of strong encouragement for disaster managers to use climate services for flood preparation, we question whether seasonal rainfall forecasts should indeed be used as indicators of the likelihood of flooding. Here, we investigate the primary indicators of flooding at the seasonal timescale across sub-Saharan Africa. Given the sparsity of hydrological observations, we input bias-corrected reanalysis rainfall into the Global Flood Awareness System to identify seasonal indicators of floodiness. Results demonstrate that in some regions of western, central, and eastern Africa with typically wet climates, even a perfect tercile forecast of seasonal total rainfall would provide little to no indication of the seasonal likelihood of flooding. The number of extreme events within a season shows the highest correlations with floodiness consistently across regions. Otherwise, results vary across climate regimes: floodiness in arid regions in southern and eastern Africa shows the strongest correlations with seasonal average soil moisture and seasonal total rainfall. Floodiness in wetter climates of western and central Africa and Madagascar shows the strongest relationship with measures of the intensity of seasonal rainfall. Measures of rainfall patterns, such as the length of dry spells, are least related to seasonal floodiness across the continent. Ultimately, identifying the drivers of seasonal flooding can be used to improve forecast information for flood preparedness and to avoid misleading decision-makers.

  10. The role of Natural Flood Management in managing floods in large scale basins during extreme events

    Science.gov (United States)

    Quinn, Paul; Owen, Gareth; ODonnell, Greg; Nicholson, Alex; Hetherington, David

    2016-04-01

    There is a strong evidence base showing the negative impacts of land use intensification and soil degradation in NW European river basins on hydrological response and on flood impact downstream. However, the ability to target zones of high runoff production, and the extent to which we can manage flood risk using nature-based flood management solutions, are less well known. A move towards planting more trees and less intensively farmed landscapes is part of natural flood management (NFM) solutions, and these methods suggest that flood risk can be managed in alternative and more holistic ways. So which local NFM methods should be used, where in a large-scale basin should they be deployed, and how does flow propagate to any point downstream? More generally, how much intervention is needed, and will it compromise food production systems? If we are observing record levels of rainfall and flow, for example during Storm Desmond in December 2015 in the North West of England, what other flood management options are really needed to complement our traditional defences in large basins for the future? In this paper we will show examples of NFM interventions in the UK that have had an impact at local-scale sites. We will demonstrate the impact of interventions at the local scale, at the sub-catchment (meso) scale and finally at the large scale. The tools used include observations, process-based models and more generalised flood impact models. Issues of synchronisation and the design level of protection will be debated. By reworking observed rainfall and discharge (runoff) for observed extreme events in the River Eden and River Tyne during Storm Desmond, we will show how much flood protection is needed in large-scale basins. The research will thus pose a number of key questions as to how floods may have to be managed in large-scale basins in the future. We will seek to support a method of catchment systems engineering that holds water back across the whole landscape as a major opportunity to manage water

  11. Big Data: Astronomical or Genomical?

    Directory of Open Access Journals (Sweden)

    Zachary D Stephens

    2015-07-01

    Full Text Available Genomics is a Big Data science and is going to get much bigger, very soon, but it is not known whether the needs of genomics will exceed other Big Data domains. Projecting to the year 2025, we compared genomics with three other major generators of Big Data: astronomy, YouTube, and Twitter. Our estimates show that genomics is a "four-headed beast"--it is either on par with or the most demanding of the domains analyzed here in terms of data acquisition, storage, distribution, and analysis. We discuss aspects of new technologies that will need to be developed to rise up and meet the computational challenges that genomics poses for the near future. Now is the time for concerted, community-wide planning for the "genomical" challenges of the next decade.

  12. Big Data Analytics in Healthcare

    Directory of Open Access Journals (Sweden)

    Ashwin Belle

    2015-01-01

    Full Text Available The rapidly expanding field of big data analytics has started to play a pivotal role in the evolution of healthcare practices and research. It has provided tools to accumulate, manage, analyze, and assimilate large volumes of disparate, structured, and unstructured data produced by current healthcare systems. Big data analytics has been recently applied towards aiding the process of care delivery and disease exploration. However, the adoption rate and research development in this space is still hindered by some fundamental problems inherent within the big data paradigm. In this paper, we discuss some of these major challenges with a focus on three upcoming and promising areas of medical research: image, signal, and genomics based analytics. Recent research which targets utilization of large volumes of medical data while combining multimodal data from disparate sources is discussed. Potential areas of research within this field which have the ability to provide meaningful impact on healthcare delivery are also examined.

  13. Multiwavelength astronomy and big data

    Science.gov (United States)

    Mickaelian, A. M.

    2016-09-01

    Two major characteristics of modern astronomy are multiwavelength (MW) studies (from γ-ray to radio) and big data (data acquisition, storage and analysis). Present astronomical databases and archives contain billions of objects, both galactic and extragalactic, observed at various wavelengths, and the vast amount of data on them allows new studies and discoveries. Astronomers deal with big numbers. Surveys are the main source for the discovery of astronomical objects and the accumulation of observational data for further analysis, interpretation, and the achievement of scientific results. We review the main characteristics of astronomical surveys, compare the photographic and digital eras of astronomical studies (including the development of wide-field observations), describe the present state of MW surveys, and discuss Big Data in astronomy and the related topics of Virtual Observatories and Computational Astrophysics. The review includes many numbers and data that can be compared to gain an overall understanding of the Universe, cosmic numbers and their relationship to modern computational facilities.

  14. Starch Big Bang!

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

    Big Bang, also known as "the great explosion", refers to the process by which the universe, born in a primordial state of extremely high density and temperature, began to expand continuously. In other words, starting from the Big Bang, our present universe slowly took shape. OK, so beginning with this issue, "Shao Dian" will set off a Big Bang on its microblog: a Starch Big Bang! How exactly will it explode? I think that, having seen the layout of this page, you have already worked out most of the answer.

  15. Big Data: Astronomical or Genomical?

    Science.gov (United States)

    Stephens, Zachary D; Lee, Skylar Y; Faghri, Faraz; Campbell, Roy H; Zhai, Chengxiang; Efron, Miles J; Iyer, Ravishankar; Schatz, Michael C; Sinha, Saurabh; Robinson, Gene E

    2015-07-01

    Genomics is a Big Data science and is going to get much bigger, very soon, but it is not known whether the needs of genomics will exceed other Big Data domains. Projecting to the year 2025, we compared genomics with three other major generators of Big Data: astronomy, YouTube, and Twitter. Our estimates show that genomics is a "four-headed beast"--it is either on par with or the most demanding of the domains analyzed here in terms of data acquisition, storage, distribution, and analysis. We discuss aspects of new technologies that will need to be developed to rise up and meet the computational challenges that genomics poses for the near future. Now is the time for concerted, community-wide planning for the "genomical" challenges of the next decade.

  16. Big Data Analytics in Healthcare.

    Science.gov (United States)

    Belle, Ashwin; Thiagarajan, Raghuram; Soroushmehr, S M Reza; Navidi, Fatemeh; Beard, Daniel A; Najarian, Kayvan

    2015-01-01

    The rapidly expanding field of big data analytics has started to play a pivotal role in the evolution of healthcare practices and research. It has provided tools to accumulate, manage, analyze, and assimilate large volumes of disparate, structured, and unstructured data produced by current healthcare systems. Big data analytics has been recently applied towards aiding the process of care delivery and disease exploration. However, the adoption rate and research development in this space is still hindered by some fundamental problems inherent within the big data paradigm. In this paper, we discuss some of these major challenges with a focus on three upcoming and promising areas of medical research: image, signal, and genomics based analytics. Recent research which targets utilization of large volumes of medical data while combining multimodal data from disparate sources is discussed. Potential areas of research within this field which have the ability to provide meaningful impact on healthcare delivery are also examined.

  17. Flood Risk and Flood hazard maps - Visualisation of hydrological risks

    Energy Technology Data Exchange (ETDEWEB)

    Spachinger, Karl; Dorner, Wolfgang; Metzka, Rudolf [University of Applied Sciences Deggendorf (Germany); Serrhini, Kamal [Universite de Technologie de Compiegne, Genie des Systemes Urbains, France, and Universite Francois Rabelais, Unite Mixte de Recherche, Tours (France); Fuchs, Sven [Institute of Mountain Risk Engineering, University of Natural Resources and Applied Life Sciences, Vienna (Austria)], E-mail: karl.spachinger@fhd.edu

    2008-11-01

    Hydrological models are an important basis of flood forecasting and early warning systems. They provide significant data on hydrological risks. In combination with other modelling techniques, such as hydrodynamic models, they can be used to assess the extent and impact of hydrological events. The new European Flood Directive requires all member states to evaluate flood risk on a catchment scale, to compile maps of flood hazard and flood risk for prone areas, and to inform the public at a local level about these risks. Flood hazard and flood risk maps are important tools for communicating flood risk to different target groups. They provide compiled information to relevant public bodies such as water management authorities, municipalities, or civil protection agencies, but also to the broader public. For almost every section of a river basin, run-off and water levels can be defined based on the likelihood of annual recurrence, using a combination of hydrological and hydrodynamic models, supplemented by an analysis of historical records and mappings. In combination with data on the vulnerability of a region, risk maps can be derived. The project RISKCATCH addressed these issues of hydrological risk and vulnerability assessment, focusing on the flood risk management process. Flood hazard maps and flood risk maps were compiled for Austrian and German test sites, taking into account existing national and international guidelines. These maps were evaluated by eye-tracking using experimental graphic semiology. Sets of small-scale as well as large-scale risk maps were presented to test persons in order to (1) study reading behaviour as well as understanding and (2) deduce the most attractive components that are essential for target-oriented risk communication. A cognitive survey asking for negative and positive aspects and the complexity of each single map complemented the experimental graphic semiology. The results indicate how risk maps can be improved to fit the needs of different user

  18. Flood Inundation Modelling in Data Sparse Deltas

    Science.gov (United States)

    Hawker, Laurence; Bates, Paul; Neal, Jeffrey

    2017-04-01

    An estimated 7% of the global population currently lives in deltas, and this number is increasing over time. This has resulted in numerous human-induced impacts on deltas, ranging from subsidence and upstream sediment trapping to coastal erosion, amongst others. These threats have already affected flood dynamics in deltas and could intensify in line with human activities. However, the myriad of threats creates a large number of potential scenarios that need to be evaluated. A prerequisite for assessing the impacts of these scenarios is therefore a flood inundation model that is both computationally efficient and flexible in its setup, so that it can be applied in data-sparse settings. An intermediate scale, which compromises between the computational speed of a global model and the detail of a bespoke case-specific model, was chosen to achieve this. To this end, we have developed an intermediate-scale flood inundation model of the Mekong Delta at a resolution of 540 m, built with freely available data, using the LISFLOOD-FP hydrodynamic model. The purpose is to answer the following questions: (1) How much detail is required to accurately simulate flooding in the Mekong Delta? (2) What characteristics of deltas are most important to include in flood inundation models? Models were run using a vegetation-removed SRTM DEM and a hindcast of tidal heights as a downstream boundary. Results indicate the importance of vegetation removal in the DEM for inundation extent and the sensitivity of water level to roughness coefficients. The propagation of the tidal signal was found to be sensitive to bathymetry, both within the river channel and offshore, yet data availability for this is poor, meaning the modeller has to be careful in his or her choice of bathymetry interpolation. Supplementing global river channel data with more localised data demonstrated minor improvements in results, suggesting detailed channel information is not always needed to produce good results. It is

  19. Big Sky Carbon Sequestration Partnership

    Energy Technology Data Exchange (ETDEWEB)

    Susan M. Capalbo

    2005-11-01

    The Big Sky Carbon Sequestration Partnership, led by Montana State University, is comprised of research institutions, public entities and private sector organizations, and the Confederated Salish and Kootenai Tribes and the Nez Perce Tribe. Efforts under this Partnership in Phase I fall into four areas: evaluation of sources and carbon sequestration sinks that will be used to determine the location of pilot demonstrations in Phase II; development of a GIS-based reporting framework that links with national networks; design of an integrated suite of monitoring, measuring, and verification technologies and assessment frameworks; and initiation of a comprehensive education and outreach program. The groundwork is in place to provide an assessment of storage capabilities for CO2 utilizing the resources found in the Partnership region (both geological and terrestrial sinks), which would complement the ongoing DOE research agenda in carbon sequestration. The region has a diverse array of geological formations that could provide storage options for carbon in one or more of its three states. Likewise, initial estimates of terrestrial sinks indicate a vast potential for increasing and maintaining soil C on forested, agricultural, and reclaimed lands. Both options include the potential for offsetting economic benefits to industry and society. Steps have been taken to assure that the GIS-based framework is consistent among types of sinks within the Big Sky Partnership area and with the efforts of other DOE regional partnerships. The Partnership recognizes the critical importance of measurement, monitoring, and verification technologies to support not only carbon trading but all policies and programs that DOE and other agencies may want to pursue in support of GHG mitigation. The efforts in developing and implementing MMV technologies for geological sequestration reflect this concern. Research is also underway to identify and validate best management practices for soil C in the

  20. Do big gods cause anything?

    DEFF Research Database (Denmark)

    Geertz, Armin W.

    2014-01-01

    This is a contribution to a review symposium on Ara Norenzayan's book Big Gods: How Religion Transformed Cooperation and Conflict (Princeton University Press, 2013). The book is fascinating but problematic with respect to causality, atheism, and stereotypes about hunter-gatherers.

  1. [Big Data- challenges and risks].

    Science.gov (United States)

    Krauß, Manuela; Tóth, Tamás; Hanika, Heinrich; Kozlovszky, Miklós; Dinya, Elek

    2015-12-01

    The term "Big Data" is commonly used to describe the growing mass of information being created recently. New conclusions can be drawn and new services can be developed through the connection, processing and analysis of this information. This affects all aspects of life, including health and medicine. The authors review the application areas of Big Data and present examples from health and other areas. However, there are several preconditions for the effective use of these opportunities: proper infrastructure and a well-defined regulatory environment, with particular emphasis on data protection and privacy. These issues, and the current actions towards solutions, are also presented.

  2. Towards a big crunch dual

    Energy Technology Data Exchange (ETDEWEB)

    Hertog, Thomas E-mail: hertog@vulcan2.physics.ucsb.edu; Horowitz, Gary T

    2004-07-01

    We show there exist smooth asymptotically anti-de Sitter initial data which evolve to a big crunch singularity in a low energy supergravity limit of string theory. This opens up the possibility of using the dual conformal field theory to obtain a fully quantum description of the cosmological singularity. A preliminary study of this dual theory suggests that the big crunch is an endpoint of evolution even in the full string theory. We also show that any theory with scalar solitons must have negative energy solutions. The results presented here clarify our earlier work on cosmic censorship violation in N=8 supergravity. (author)

  3. Was There A Big Bang?

    CERN Document Server

    Soberman, Robert K

    2008-01-01

    The big bang hypothesis is widely accepted despite numerous physics conflicts. It rests upon two experimental supports, galactic red shift and the cosmic microwave background. Both are produced by dark matter, shown here to be hydrogen-dominated aggregates with a few percent of helium nodules. Scattering from these non-radiating intergalactic masses produces a red shift that normally correlates with distance. Warmed by our galaxy to an eigenvalue of 2.735 K, drawn near the Earth, these bodies, kept cold by ablation, resonance-radiate the Planckian microwave signal. Several tests are proposed that will distinguish between this model and the big bang.

  4. Time-dependent Reliability Analysis of Flood Defence Assets Using Generic Fragility Curve

    Directory of Open Access Journals (Sweden)

    Nepal Jaya

    2016-01-01

    Full Text Available Flood defence assets such as earth embankments comprise a vital part of the linear flood defences in many countries, including the UK, and protect inland areas from flooding. The risks of flooding are likely to increase in the future due to increasing pressure on land use, more frequent extreme rainfall events and rising sea levels caused by climate change, all of which also affect ageing flood defence assets. Therefore, it is important that flood defence assets are maintained at a high level of safety and serviceability. The high costs associated with preserving these deteriorating flood defence assets and the limited funds available for their maintenance require the development of systematic approaches to ensure a sustainable flood-risk management system. The integration of realistic deterioration measurement and reliability-based performance assessment techniques has tremendous potential for the structural safety and economic feasibility of flood defence assets. Therefore, the need for reliability-based performance assessment is evident. However, investigations on time-dependent reliability analysis of flood defence assets are limited. This paper presents a novel approach for time-dependent reliability analysis of flood defence assets. In the analysis, a time-dependent fragility curve is developed by using a state-based stochastic deterioration model. The applicability of the proposed approach is then demonstrated with a case study.
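
    As a concrete illustration of the state-based idea, the Python sketch below lets an embankment deteriorate through five condition states under an annual Markov transition matrix and reads off a time-dependent fragility as the state-weighted conditional failure probability. The transition probabilities and per-state fragility ordinates are hypothetical, not the paper's calibrated model.

      # Toy state-based stochastic deterioration model; all numbers invented.
      import numpy as np

      # Annual transition matrix over condition states 1 (good) .. 5 (poor):
      # each year the asset stays put or degrades by one state.
      P = np.array([[0.90, 0.10, 0.00, 0.00, 0.00],
                    [0.00, 0.88, 0.12, 0.00, 0.00],
                    [0.00, 0.00, 0.85, 0.15, 0.00],
                    [0.00, 0.00, 0.00, 0.80, 0.20],
                    [0.00, 0.00, 0.00, 0.00, 1.00]])

      # Assumed conditional failure probability under a given flood load,
      # by condition state (the fragility ordinate worsens with state).
      frag = np.array([0.01, 0.03, 0.08, 0.20, 0.50])

      state = np.array([1.0, 0.0, 0.0, 0.0, 0.0])  # new asset, state 1
      for year in range(0, 51, 10):
          pf = state @ frag  # time-dependent fragility at this age
          print(f"year {year:2d}: P(failure | design flood) = {pf:.3f}")
          state = state @ np.linalg.matrix_power(P, 10)  # advance 10 years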

  5. Flood resilience urban territories.

    Science.gov (United States)

    Beraud, Hélène; Barroca, Bruno; Hubert, Gilles

    2010-05-01

    The impact of floods on French territory over the last twenty years reveals our lack of preparation for large-scale floods, which can halt companies' activities and services or leave housing unavailable for several months. New Orleans should serve as an example to us: four years after the disaster, the city had still not regained its dynamism. In France, more than 300 towns are exposed to flooding. Since these towns are the mainspring of territorial development, it is likely that the majority of them could not recover quickly after a large-scale flood. Understanding and improving the resilience of urban territories facing floods is therefore a real stake for territorial development. Urban technical networks supply, unify and irrigate all the constituents of urban territories. Characterizing their flood resilience can help us better understand urban resilience. In this context, waste management during and after floods is crucial. During a flood, the waste management network can become dysfunctional (roads cut, waste storage or treatment installations flooded). How can the mayor respect his obligation to guarantee salubrity and security in his city? After the flood, the question is even more problematic. The waste management network presents a real stake for the territory's restart. After a flood, building materials, lopped-off branches, furniture, business stocks, farm stocks, mud, rubble and animal carcasses are wet, mixed together, and even polluted by hydrocarbons or toxic substances. The volume of waste can be significant, and the sanitary and environmental risks can be crucial. In view of this situation, waste management in the post-crisis period raises a real problem. What should be done with this waste? How should it be collected? Where should it be stored? How should it be processed? Who is responsible? Answering these questions is all the more strategic since this waste is the mark of the disaster. Thus, cleaning will be the first reflex of the population and local actors in order to forget the

  6. Little Science to Big Science: Big Scientists to Little Scientists?

    Science.gov (United States)

    Simonton, Dean Keith

    2010-01-01

    This article presents the author's response to Hisham B. Ghassib's essay entitled "Where Does Creativity Fit into a Productivist Industrial Model of Knowledge Production?" Professor Ghassib's (2010) essay presents a provocative portrait of how the little science of the Babylonians, Greeks, and Arabs became the Big Science of the modern industrial…

  7. Big society, big data. The radicalisation of the network society

    NARCIS (Netherlands)

    Frissen, V.

    2011-01-01

    During the British election campaign of 2010, David Cameron produced the idea of the ‘Big Society’ as a cornerstone of his political agenda. At the core of the idea is a stronger civil society and local community coupled with a more withdrawn government. Although many commentators have dismissed

  8. Big society, big data. The radicalisation of the network society

    NARCIS (Netherlands)

    Frissen, V.

    2011-01-01

    During the British election campaign of 2010, David Cameron produced the idea of the ‘Big Society’ as a cornerstone of his political agenda. At the core of the idea is a stronger civil society and local community coupled with a more withdrawn government. Although many commentators have dismissed this

  9. Application of flood index in monitoring Flood-plain ecosystems (by the example of the Middle Ob flood-plain)

    OpenAIRE

    Bolotnov, V. P.

    2007-01-01

    The concept of regional hydroecological monitoring has been developed for the flood-plain of the Middle Ob. Its objective is to monitor the productivity of flood-plain ecosystems in order to organize scientifically grounded, regionally adapted and ecologically regulated nature management. For this purpose, hydroecological zoning of the flood-plain territory was performed, the most representative water-gauge observation stations for each flood-plain zone were organized, and the scheme of flood-plain flooding was prepared...

  10. Electricity Consumption Forecasting in the Age of Big Data

    Directory of Open Access Journals (Sweden)

    Xiaojia Wang

    2013-09-01

    Full Text Available In the age of big data, information mining technology has undergone tremendous change; traditional forecasting and mining techniques cannot solve information mining problems at this scale of data. This paper puts forward a modeling mechanism for information analysis and mining in the age of big data. The mechanism first constructs a task-decomposition model for the information using the MapReduce tool; it then performs data preprocessing and mining on each single-task data sheet, using mathematical models, artificial intelligence and other methods to construct new approaches to information analysis and data mining in the age of big data. Finally, a case study is presented to demonstrate the feasibility and rationality of the approach.
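
    A minimal stand-in for the task-decomposition step can be written with the standard library alone; the sketch below partitions a hypothetical set of hourly consumption readings, maps a partial aggregate over each partition in parallel, and reduces the partials into a global statistic, which is the MapReduce pattern the paper builds on.

      # Map/reduce-style task decomposition without a Hadoop cluster; the
      # data layout (hourly kWh readings) is a hypothetical stand-in.
      from multiprocessing import Pool
      from functools import reduce
      import random

      readings = [random.uniform(0.1, 5.0) for _ in range(100_000)]

      def map_task(chunk):
          # Per-partition partial aggregate: (sum, count) for a mean feature.
          return (sum(chunk), len(chunk))

      def reduce_task(a, b):
          # Combine partial aggregates from two partitions.
          return (a[0] + b[0], a[1] + b[1])

      if __name__ == "__main__":
          n_parts = 8
          size = len(readings) // n_parts
          chunks = [readings[i*size:(i+1)*size] for i in range(n_parts)]
          with Pool(n_parts) as pool:
              partials = pool.map(map_task, chunks)
          total, count = reduce(reduce_task, partials)
          print(f"mean hourly consumption: {total/count:.3f} kWh")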

  11. Can companies benefit from Big Science? Science and Industry

    CERN Document Server

    Autio, Erkko; Bianchi-Streit, M

    2003-01-01

    Several studies have indicated that there are significant returns on financial investment via "Big Science" centres. Financial multipliers ranging from 2.7 (ESA) to 3.7 (CERN) have been found, meaning that each Euro invested in industry by Big Science generates a two- to fourfold return for the supplier. Moreover, laboratories such as CERN are proud of their record in technology transfer, where research developments lead to applications in other fields - for example, with particle accelerators and detectors. Less well documented, however, is the effect of the experience that technological firms gain through working in the arena of Big Science. Indeed, up to now there has been no explicit empirical study of such benefits. Our findings reveal a variety of outcomes, which include technological learning, the development of new products and markets, and impact on the firm's organization. The study also demonstrates the importance of technologically challenging projects for staff at CERN. Together, these findings i...

  12. The Shallow Waters of the Big-Bang

    CERN Document Server

    Laguna, P

    2006-01-01

    Loop quantum cosmology homogeneous models with a massless scalar field show that the big-bang singularity can be replaced by a big quantum bounce. To gain further insight on the nature of this bounce, we study the semi-discrete loop quantum gravity Hamiltonian constraint equation from the point of view of numerical analysis. We show that the bounce is closely related to the method for the temporal update of the system and demonstrate that, in particular, explicit time-updates in general yield bounces. These bounces can be understood as spurious reflections in finite difference discretizations of wave equations in nonuniform grids or, equivalently, as spurious reflections found when solving wave equations with varying coefficients, such as the shallow water equations. We present an implicit time-update devoid of bounces and show back-in-time, deterministic evolutions that reach and partially jump over the big-bang singularity.
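
    The reflection mechanism invoked here is easy to reproduce with an entirely classical finite-difference experiment. The sketch below (my own illustration, unrelated to the paper's loop-quantum-cosmology calculation) evolves the flat wave equation u_tt = u_xx with an explicit leapfrog scheme across an abrupt grid coarsening and prints the amplitude spuriously sent back into the fine region.

      # Spurious reflection of a wave packet at a mesh-resolution jump.
      import numpy as np

      # Nonuniform grid: fine on the left, 4x coarser right of x = 5.
      x = np.concatenate([np.arange(0.0, 5.0, 0.01),
                          np.arange(5.0, 20.0, 0.04)])
      dt = 0.005  # below the finest spacing, so the scheme is CFL-stable

      # Right-moving Gaussian pulse: u(x,0) = g(x), u(x,-dt) = g(x+dt).
      g = lambda s: np.exp(-((s - 2.5) / 0.1) ** 2)
      u_prev, u = g(x + dt), g(x)

      hl, hr = np.diff(x)[:-1], np.diff(x)[1:]  # spacings around each node
      for _ in range(int(10.0 / dt)):           # evolve to t = 10
          uxx = 2 * ((u[2:] - u[1:-1]) / hr
                     - (u[1:-1] - u[:-2]) / hl) / (hl + hr)
          u_next = np.zeros_like(u)             # Dirichlet boundaries
          u_next[1:-1] = 2 * u[1:-1] - u_prev[1:-1] + dt**2 * uxx
          u_prev, u = u, u_next

      # In the continuum the pulse just leaves; amplitude remaining left
      # of x = 5 at t = 10 is numerical, mostly reflection off the jump.
      print("spurious reflected amplitude:", np.abs(u[x < 5.0]).max())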

  13. Probabilistic Flood Defence Assessment Tools

    Directory of Open Access Journals (Sweden)

    Slomp Robert

    2016-01-01

    Full Text Available The WTI2017 project is responsible for the development of flood defence assessment tools for the 3600 km of Dutch primary flood defences: dikes/levees, dunes and hydraulic structures. These tools are necessary because, as of January 1st 2017, the new flood risk management policy for the Netherlands will be implemented. Then, the seven-decades-old design practice (the maximum water level methodology of 1958) and the two-decades-old safety standards (and maximum hydraulic load methodology of 1996) will formally be replaced by a more risk-based approach for the national policy in flood risk management. The formal flood defence assessment is an important part of this new policy, especially for flood defence managers, since national and regional funding for reinforcement is based on this assessment. This new flood defence policy is based on a maximum allowable probability of flooding. For this, a maximum acceptable individual risk was determined at 1/100 000 per year; this is the maximum probability of loss of life for every protected area in the Netherlands. Safety standards for flood defences were then determined based on this acceptable individual risk. The results were adjusted based on information from cost-benefit analysis, societal risk and large-scale societal disruption due to the failure of critical infrastructure, e.g. power stations. The resulting risk-based flood defence safety standards range from a 300- to a 100 000-year return period for failure. Two policy studies, WV21 (Safety from floods in the 21st century) and VNK-2 (the National Flood Risk in 2010), provided the essential information to determine the new risk-based safety standards for flood defences. The WTI2017 project will provide the safety assessment tools based on these new standards and is thus an essential element for the implementation of this policy change. A major issue to be tackled was the development of user-friendly tools, as the new assessment is to be carried out by personnel of the
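
    For orientation, the quoted return-period standards translate into probabilities over a planning horizon as in the back-of-envelope sketch below; the 50-year horizon and the independence of successive years are my own illustrative assumptions.

      # Return-period standard -> probability of failure in a 50-yr horizon.
      for T in (300, 3000, 30000, 100000):
          p_annual = 1.0 / T
          # P(at least one failure in 50 years), assuming independent years.
          p_50yr = 1.0 - (1.0 - p_annual) ** 50
          print(f"standard 1/{T:>6} per year -> "
                f"P(failure in 50 yr) = {p_50yr:.4f}")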

  14. Temporal clustering of floods in Germany: Do flood-rich and flood-poor periods exist?

    Science.gov (United States)

    Merz, Bruno; Nguyen, Viet Dung; Vorogushyn, Sergiy

    2016-10-01

    The repeated occurrence of exceptional floods within a few years, such as the Rhine floods in 1993 and 1995 and the Elbe and Danube floods in 2002 and 2013, suggests that floods in Central Europe may be organized in flood-rich and flood-poor periods. This hypothesis is studied by testing the significance of temporal clustering in flood occurrence (peak-over-threshold) time series for 68 catchments across Germany for the period 1932-2005. To assess the robustness of the results, different methods are used: Firstly, the index of dispersion, which quantifies the departure from a homogeneous Poisson process, is investigated. Further, the time-variation of the flood occurrence rate is derived by non-parametric kernel implementation and the significance of clustering is evaluated via parametric and non-parametric tests. Although the methods give consistent overall results, the specific results differ considerably. Hence, we recommend applying different methods when investigating flood clustering. For flood estimation and risk management, it is of relevance to understand whether clustering changes with flood severity and time scale. To this end, clustering is assessed for different thresholds and time scales. It is found that the majority of catchments show temporal clustering at the 5% significance level for low thresholds and time scales of one to a few years. However, clustering decreases substantially with increasing threshold and time scale. We hypothesize that flood clustering in Germany is mainly caused by catchment memory effects along with intra- to inter-annual climate variability, and that decadal climate variability plays a minor role.
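
    A minimal version of the first test mentioned above, the index of dispersion, can be sketched as follows. The peak-over-threshold counts here are synthetic (a gamma-mixed Poisson, over-dispersed by construction), not the German series.

      # Index-of-dispersion test for clustering of annual flood counts:
      # a homogeneous Poisson process gives variance = mean (index ~ 1).
      import numpy as np

      rng = np.random.default_rng(0)

      # Synthetic 74-year record with a year-to-year varying rate.
      rates = rng.gamma(shape=2.0, scale=0.5, size=74)
      counts = rng.poisson(rates)

      iod = counts.var(ddof=1) / counts.mean()
      print(f"index of dispersion: {iod:.2f}")

      # Crude Monte Carlo significance check under the Poisson null.
      null = [rng.poisson(counts.mean(), size=counts.size).var(ddof=1)
              / counts.mean() for _ in range(10_000)]
      p_value = np.mean(np.array(null) >= iod)
      print(f"one-sided p-value vs homogeneous Poisson: {p_value:.3f}")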

  15. Towards Interactive Flood Governance: changing approaches in Dutch flood policy

    NARCIS (Netherlands)

    J.A. van Ast (Jacko)

    2013-01-01

    In the course of history, flooding by rivers and the sea has brought misery to humanity. Low-lying deltas of large rivers, such as Bangladesh, New Orleans, the Nile delta and the Netherlands, are among the most vulnerable to flood disasters. Since ancient times people pondered

  16. FLOOD AND FLOOD CONTROL OF THE YELLOW RIVER

    Institute of Scientific and Technical Information of China (English)

    Wenxue LI; Huirang WANG; Yunqi SU; Naiqian JIANG; Yuanfeng ZHANG

    2002-01-01

    The Yellow River is the cradle of China. It was long the center of politics, economics and culture in Chinese history. Large-coverage flood disasters occurred frequently in the Yellow River basin, and the losses were often heavy. Thus, the Yellow River is also considered a serious hidden danger for China. Since the founding of new China, structural and non-structural flood control systems have been basically established, and tremendous successes have been achieved in flood control. Into the 21st century, the flood control standard of the Lower Yellow River has been raised significantly with the operation of the Xiaolangdi Reservoir. However, the problems of the Yellow River are complicated, and the tasks for solving them are arduous. In particular, the sedimentation problem cannot be solved completely in the near future; the situation of the "suspended river" and the threat of floods will long exist. Therefore, supported by the rapid social and economic development of the nation and relying on advanced technology, the flood control system shall be perfected. Meanwhile, the study of the Yellow River shall be enhanced in order to better understand the flood, cope with it and use it, and thus reduce flood disasters.

  17. Applications of ASFCM (Assessment System of Flood Control Measurement) in Typhoon Committee Members

    Science.gov (United States)

    Kim, C.

    2013-12-01

    Due to extreme weather conditions such as global warming and the greenhouse effect, the risk of flood damage has increased, along with the scale of the damage. It therefore became necessary to incorporate climate change and the changing scale of flood damage into the previous evaluation system for flood control measures. In this regard, a comprehensive and integrated system is needed to evaluate the most effective flood control measures by eliminating uncertainties in their socio-economic impacts. The Assessment System of Structural Flood Control Measures (ASFCM) was developed for determining investment priorities among flood control measures and establishing social infrastructure projects. ASFCM consists of three modules: (1) the initial setup and inputs module, (2) the flood and damage estimation module, and (3) the socio-economic analysis module. First, a database for flood damage estimation must be constructed, holding the initial and input data on the estimation unit, property, historical flood damage, and the applied area's topographic and hydrological data. After that, it is important to classify local characteristics for constructing the flood damage data. Five local types (big city, medium-sized city, small city, farming area, and mountain area) are classified by an application criterion (population density). The next step is the floodplain simulation with HEC-RAS, which was selected to simulate inundation. By combining the database with damage estimation, the system can estimate the total (direct) damage, that is, the cost of restoring socio-economic activities to the safe level that existed before the flood occurred. The last module provides the economic analysis index (B/C ratio) using Multidimensional Flood Damage Analysis. Consequently, ASFCM offers a reference index for constructing flood control measures and planning non-structural systems to reduce water-related damage. It is possible to encourage flood control planners and
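
    The socio-economic module can be illustrated with a toy expected annual damage (EAD) calculation: integrate damage over exceedance probability with and without a measure, then form the B/C ratio. All figures below are invented for illustration, and the trapezoid integration is only one common convention.

      # Toy EAD and benefit/cost computation for a flood control measure.
      import numpy as np

      # Exceedance probabilities and direct damages (arbitrary currency
      # units) with and without the hypothetical measure.
      p       = np.array([0.5, 0.1, 0.02, 0.01, 0.002])
      dmg_now = np.array([0.0, 2.0, 15.0, 30.0, 80.0])
      dmg_new = np.array([0.0, 0.0,  3.0, 12.0, 50.0])

      def ead(dmg):
          # Trapezoid rule over the (decreasing) exceedance probabilities.
          return float(-(np.diff(p) * (dmg[1:] + dmg[:-1]) / 2).sum())

      annual_benefit = ead(dmg_now) - ead(dmg_new)
      annualized_cost = 0.9  # assumed annualized project cost
      print(f"EAD without / with: {ead(dmg_now):.2f} / {ead(dmg_new):.2f}")
      print(f"B/C ratio: {annual_benefit / annualized_cost:.2f}")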

  18. Sedimentary record of Warta river floods in summer 2010 and winter 2011 nearby Poznan, W Poland

    Science.gov (United States)

    Skolasińska, Katarzyna; Szczuciński, Witold; Mitręga, Marta; Rotnicka, Joanna; Jagodziński, Robert; Lorenc, Stanisław

    2013-04-01

    The Warta River valley near Poznań (W Poland) represents a meandering lowland river changed during the last 150 years by hydro-engineering works. Floods represent a major natural hazard in the region. However, historical records are not complete, particularly for former rural areas; thus, the sedimentary record may potentially offer additional insights into the flooding history. The big floods in summer 2010 (the largest during the last 31 years) and winter 2011 offered an opportunity to study their sedimentary record. The particular purposes were to identify the sedimentary characteristics of summer and winter floods, to interpret the various phases of particular floods in the record, and to assess the impact of early post-depositional changes on the flood deposits. Surveys were conducted in six areas just after the floods and were repeated after several months, one year and two years. The spatial extent, thickness, surface bedforms and sediment type of the deposits were assessed in the field. Sediment samples were further investigated for grain size distribution, organic matter content, roundness and sand grain surface features (SEM). The sandy flood deposits mostly build natural levees and side bars, showing climbing-ripple lamination and planar cross lamination (the latter only in crevasse splays). Vertical grain size changes in the levee deposits revealed pensymmetric and/or reverse grading, interpreted as an effect of changing velocity during the rising water level. The sand grains were similar to the river channel sands and dominated by polished and sub-rounded quartz grains with preserved dissolution and dulled surface microfeatures. Further from the channel bank (a few to a few hundred metres), only a discontinuous, up to a few mm thick, organic-rich mud layer was left, which after the summer flood was covered by algal mats. However, the mud and mats were quickly reworked by new vegetation. The follow-up surveys revealed that the preservation potential of flood deposits is low to moderate (only for sandy deposits

  19. Storage and flood routing

    Science.gov (United States)

    Carter, R.W.; Godfrey, R.G.

    1960-01-01

    The basic equations used in flood routing are developed from the law of continuity. In each method the assumptions are discussed to enable the user to select an appropriate technique. In the stage-storage method the storage is related to the mean gage height in the reach under consideration. In the discharge-storage method the storage is determined from weighted values of inflow and outflow discharge. In the reservoir-storage method the storage is considered as a function of outflow discharge alone. A detailed example is given for each method to illustrate that particular technique.
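
    The discharge-storage method, with storage tied to weighted inflow and outflow, is the basis of classical Muskingum routing; the sketch below routes an illustrative inflow hydrograph, with K, x and the hydrograph values chosen arbitrarily for demonstration.

      # Muskingum routing: S = K[xI + (1-x)O] combined with dS/dt = I - O.
      K, x, dt = 12.0, 0.2, 6.0  # storage constant (h), weight, step (h)

      denom = 2 * K * (1 - x) + dt
      c0 = (dt - 2 * K * x) / denom
      c1 = (dt + 2 * K * x) / denom
      c2 = (2 * K * (1 - x) - dt) / denom  # c0 + c1 + c2 == 1

      inflow = [10, 10, 30, 80, 140, 110, 70, 40, 20, 12, 10, 10]  # m3/s
      outflow = [inflow[0]]  # assume an initial steady state
      for t in range(1, len(inflow)):
          outflow.append(c0 * inflow[t] + c1 * inflow[t-1]
                         + c2 * outflow[t-1])

      for i, o in zip(inflow, outflow):
          print(f"I = {i:5.1f}  ->  O = {o:6.1f} m3/s")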

  20. Flood Insurance Rate Maps and Base Flood Elevations, FIRM, DFIRM, BFE - MO 2014 Springfield FEMA Base Flood Elevations (SHP)

    Data.gov (United States)

    NSGIC State | GIS Inventory — This polyline layer indicates the approximate effective FEMA Base Flood Elevation (BFE) associated with the corresponding Special Flood Hazard Area (SFHA). Each line...

  1. Flood Insurance Rate Maps and Base Flood Elevations, FIRM, DFIRM, BFE - MO 2010 Springfield FEMA Base Flood Elevations (SHP)

    Data.gov (United States)

    NSGIC GIS Inventory (aka Ramona) — This polyline layer indicates the approximate effective FEMA Base Flood Elevations (BFE) associated with the corresponding Special Flood Hazard Area (SFHA). Each...

  2. Estancia Special Flood Hazard Areas (SFHA)

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — This vector dataset depicts the 1% annual flood boundary (otherwise known as special flood hazard area or 100 year flood boundary) for its specified area. The data...

  3. Elephant Butte Special Flood Hazard Areas (SFHA)

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — This vector dataset depicts the 1% annual flood boundary (otherwise known as special flood hazard area or 100 year flood boundary) for its specified area. The data...

  4. Sierra County Special Flood Hazard Areas (SFHA)

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — This vector dataset depicts the 1% annual flood boundary (otherwise known as special flood hazard area or 100 year flood boundary) for its specified area. The data...

  5. The Aqueduct Global Flood Analyzer

    Science.gov (United States)

    Iceland, Charles

    2015-04-01

    As population growth and economic growth take place, and as climate change accelerates, many regions across the globe are finding themselves increasingly vulnerable to flooding. A recent OECD study of the exposure of the world's large port cities to coastal flooding found that 40 million people were exposed to a 1 in 100 year coastal flood event in 2005, and the total value of exposed assets was about US$3,000 billion, or 5% of global GDP. By the 2070s, those numbers were estimated to increase to 150 million people and US$35,000 billion, or roughly 9% of projected global GDP. Impoverished people in developing countries are particularly at risk because they often live in flood-prone areas and lack the resources to respond. WRI and its Dutch partners - Deltares, IVM-VU University Amsterdam, Utrecht University, and PBL Netherlands Environmental Assessment Agency - are in the initial stages of developing a robust set of river flood and coastal storm surge risk measures that show the extent of flooding under a variety of scenarios (both current and future), together with the projected human and economic impacts of these flood scenarios. These flood risk data and information will be accessible via an online, easy-to-use Aqueduct Global Flood Analyzer. We will also investigate the viability, benefits, and costs of a wide array of flood risk reduction measures that could be implemented in a variety of geographic and socio-economic settings. Together, the activities we propose have the potential for saving hundreds of thousands of lives and strengthening the resiliency and security of many millions more, especially those who are most vulnerable. Mr. Iceland will present Version 1.0 of the Aqueduct Global Flood Analyzer and provide a preview of additional elements of the Analyzer to be released in the coming years.

  6. Flood Risk, Flood Mitigation, and Location Choice: Evaluating the National Flood Insurance Program's Community Rating System.

    Science.gov (United States)

    Fan, Qin; Davlasheridze, Meri

    2016-06-01

    Climate change is expected to worsen the negative effects of natural disasters like floods. The negative impacts, however, can be mitigated by individuals' adjustments through migration and relocation behaviors. Previous literature has identified flood risk as one significant driver in relocation decisions, but no prior study examines the effect of the National Flood Insurance Program's voluntary program-the Community Rating System (CRS)-on residential location choice. This article fills this gap and tests the hypothesis that flood risk and the CRS-creditable flood control activities affect residential location choices. We employ a two-stage sorting model to empirically estimate the effects. In the first stage, individuals' risk perception and preference heterogeneity for the CRS activities are considered, while mean effects of flood risk and the CRS activities are estimated in the second stage. We then estimate heterogeneous marginal willingness to pay (WTP) for the CRS activities by category. Results show that age, ethnicity and race, educational attainment, and prior exposure to risk explain risk perception. We find significant values for the CRS-creditable mitigation activities, which provides empirical evidence for the benefits associated with the program. The marginal WTP for an additional credit point earned for public information activities, including hazard disclosure, is found to be the highest. Results also suggest that water amenities dominate flood risk. Thus, high amenity values may increase exposure to flood risk, and flood mitigation projects should be strategized in coastal regions accordingly.

  7. Water NOT wanted - Coastal Floods and Flooding Protection in Denmark

    DEFF Research Database (Denmark)

    Sørensen, Carlo Sass

    2016-01-01

    vulnerability towards coastal flooding, the country has experienced severe storm surges throughout history, and hitherto safe areas will become increasingly at risk this century as the climate changes. Historically a seafarers’ nation, Denmark has always been connected with the sea. From medieval time ports...... acceptance of floods has decreased from a “this is a natural consequence of living by the sea” to an explicit: Water Not Wanted! This paper provides a brief overview of floods and flooding protection issues in Denmark (Ch. 2 & Ch. 3), the current legislation (Ch. 4), and discusses challenges in relation...... to climate change adaptation, risk reduction, and to potential ways of rethinking flooding protection in strategies that also incorporate other uses (Ch. 5)....

  8. Data Partitioning View of Mining Big Data

    OpenAIRE

    Zhang, Shichao

    2016-01-01

    There are two main approaches to mining big data in memory. One is to partition a big dataset into several subsets, so that each subset can be mined in memory; global patterns can then be obtained by synthesizing all the local patterns discovered in these subsets. The other is the statistical sampling method. This indicates that data partitioning should be an important strategy for mining big data. This paper recalls our work on mining big data via data partitioning and shows some interesti...
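
    A toy version of the partition-and-synthesize strategy, with data, thresholds and partition count all invented for illustration: mine frequent items locally in each in-memory partition, then combine the local supports into global candidates.

      # Partition a transaction set, mine each piece, synthesize globally.
      from collections import Counter
      import random

      random.seed(1)
      transactions = [random.choices("ABCDE", k=3) for _ in range(90_000)]

      def mine_partition(part, min_support=0.4):
          # Local patterns: items whose support clears the threshold here.
          counts = Counter(item for t in part for item in set(t))
          return {i: c / len(part) for i, c in counts.items()
                  if c / len(part) >= min_support}

      parts = [transactions[i::3] for i in range(3)]  # in-memory pieces
      local = [mine_partition(p) for p in parts]

      # Global candidates are items frequent in at least one partition;
      # averaging over equal-sized partitions approximates global support.
      candidates = set().union(*local)
      support = {i: sum(l.get(i, 0.0) for l in local) / len(local)
                 for i in candidates}
      print(support)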

  9. 77 FR 27245 - Big Stone National Wildlife Refuge, Big Stone and Lac Qui Parle Counties, MN

    Science.gov (United States)

    2012-05-09

    ... Fish and Wildlife Service Big Stone National Wildlife Refuge, Big Stone and Lac Qui Parle Counties, MN... comprehensive conservation plan (CCP) and environmental assessment (EA) for Big Stone National Wildlife Refuge...: r3planning@fws.gov . Include ``Big Stone Draft CCP/ EA'' in the subject line of the message. Fax:...

  10. Big Sky Carbon Sequestration Partnership

    Energy Technology Data Exchange (ETDEWEB)

    Susan Capalbo

    2005-12-31

    The Big Sky Carbon Sequestration Partnership, led by Montana State University, is comprised of research institutions, public entities and private sector organizations, and the Confederated Salish and Kootenai Tribes and the Nez Perce Tribe. Efforts under this Partnership in Phase I are organized into four areas: (1) evaluation of sources and carbon sequestration sinks that will be used to determine the location of pilot demonstrations in Phase II; (2) development of a GIS-based reporting framework that links with national networks; (3) design of an integrated suite of monitoring, measuring, and verification technologies, market-based opportunities for carbon management, and an economic/risk assessment framework (referred to below as the Advanced Concepts component of the Phase I efforts); and (4) initiation of a comprehensive education and outreach program. As a result of the Phase I activities, the groundwork is in place to provide an assessment of storage capabilities for CO{sub 2} utilizing the resources found in the Partnership region (both geological and terrestrial sinks), one that complements the ongoing DOE research agenda in carbon sequestration. The geology of the Big Sky Carbon Sequestration Partnership Region is favorable for the potential sequestration of enormous volumes of CO{sub 2}. The United States Geological Survey (USGS 1995) identified 10 geologic provinces and 111 plays in the region. These provinces and plays include both sedimentary rock types characteristic of oil, gas, and coal production as well as large areas of mafic volcanic rocks. Of the 10 provinces and 111 plays, 1 province and 4 plays are located within Idaho. The remaining 9 provinces and 107 plays are dominated by sedimentary rocks and located in the states of Montana and Wyoming. The potential sequestration capacity of the 9 sedimentary provinces within the region ranges from 25,000 to almost 900,000 million metric tons of CO{sub 2}. Overall every sedimentary formation investigated

  11. YOUNG CITY, BIG PARTY

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    The Shenzhen Universiade united the world's young people through sports. With none of the usual hoopla, no fireworks, no grand performances by celebrities and superstars, the Shenzhen Summer Universiade lowered the curtain on a big party for youth and college students on August 23.

  12. The Big European Bubble Chamber

    CERN Multimedia

    1977-01-01

    The 3.70 metre Big European Bubble Chamber (BEBC), dismantled on 9 August 1984. During operation it was one of the biggest detectors in the world, producing direct visual recordings of particle tracks. 6.3 million photos of interactions were taken with the chamber in the course of its existence.

  13. Finding errors in big data

    NARCIS (Netherlands)

    Puts, Marco; Daas, Piet; de Waal, A.G.

    No data source is perfect. Mistakes inevitably creep in. Spotting errors is hard enough when dealing with survey responses from several thousand people, but the difficulty is multiplied hugely when that mysterious beast Big Data comes into play. Statistics Netherlands is about to publish its first

  14. China: Big Changes Coming Soon

    Science.gov (United States)

    Rowen, Henry S.

    2011-01-01

    Big changes are ahead for China, probably abrupt ones. The economy has grown so rapidly for many years, over 30 years at an average of nine percent a year, that its size makes it a major player in trade and finance and increasingly in political and military matters. This growth is not only of great importance internationally, it is already having…

  15. Big data and urban governance

    NARCIS (Netherlands)

    Taylor, L.; Richter, C.; Gupta, J.; Pfeffer, K.; Verrest, H.; Ros-Tonen, M.

    2015-01-01

    This chapter examines the ways in which big data is involved in the rise of smart cities. Mobile phones, sensors and online applications produce streams of data which are used to regulate and plan the city, often in real time, but which present challenges as to how the city's functions are seen and

  16. True Randomness from Big Data

    Science.gov (United States)

    Papakonstantinou, Periklis A.; Woodruff, David P.; Yang, Guang

    2016-09-01

    Generating random bits is a difficult task, which is important for physical systems simulation, cryptography, and many applications that rely on high-quality random bits. Our contribution is to show how to generate provably random bits from uncertain events whose outcomes are routinely recorded in the form of massive data sets. These include scientific data sets, such as in astronomics, genomics, as well as data produced by individuals, such as internet search logs, sensor networks, and social network feeds. We view the generation of such data as the sampling process from a big source, which is a random variable of size at least a few gigabytes. Our view initiates the study of big sources in the randomness extraction literature. Previous approaches for big sources rely on statistical assumptions about the samples. We introduce a general method that provably extracts almost-uniform random bits from big sources and extensively validate it empirically on real data sets. The experimental findings indicate that our method is efficient enough to handle large enough sources, while previous extractor constructions are not efficient enough to be practical. Quality-wise, our method at least matches quantum randomness expanders and classical world empirical extractors as measured by standardized tests.
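
    The authors' extractor is not reproduced here, but the classic von Neumann trick conveys the core idea of distilling nearly uniform bits from a biased recorded source; the "big source" below is a simulated biased bit stream, not real log or sensor data.

      # Von Neumann extractor: pairs 01 -> 0, 10 -> 1, 00/11 discarded.
      import random

      random.seed(7)
      # Stand-in for a biased source: bits that are 1 with probability 0.8.
      raw = [1 if random.random() < 0.8 else 0 for _ in range(100_000)]

      def von_neumann(bits):
          out = []
          for a, b in zip(bits[::2], bits[1::2]):
              if a != b:         # 01 and 10 are equally likely for
                  out.append(a)  # independent pairs, so the output is fair
          return out

      fair = von_neumann(raw)
      print(f"{len(fair)} bits extracted, fraction of ones: "
            f"{sum(fair)/len(fair):.3f}")  # ~0.5 despite the 0.8 bias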

  17. BIG DATA IN BUSINESS ENVIRONMENT

    Directory of Open Access Journals (Sweden)

    Logica BANICA

    2015-06-01

    Full Text Available In recent years, dealing with large amounts of data originating from social media sites and mobile communications, alongside data from business environments and institutions, has led to the definition of a new concept, known as Big Data. The economic impact of the sheer amount of data produced in the last two years has increased rapidly. It is necessary to aggregate all types of data (structured and unstructured) in order to improve current transactions, to develop new business models, to provide a real image of supply and demand, and thereby to generate market advantages. Companies that turn to Big Data therefore have a competitive advantage over other firms. From the perspective of IT organizations, they must accommodate the storage and processing of Big Data and provide analysis tools that are easily integrated into business processes. This paper discusses the Big Data concept and the principles for building, organizing and analysing huge datasets in the business environment, offering a three-layer architecture based on current software solutions. The article also covers graphical tools for exploring and representing unstructured data, Gephi and NodeXL.

  18. The International Big History Association

    Science.gov (United States)

    Duffy, Michael; Duffy, D'Neil

    2013-01-01

    IBHA, the International Big History Association, was organized in 2010 and "promotes the unified, interdisciplinary study and teaching of history of the Cosmos, Earth, Life, and Humanity." This is the vision that Montessori embraced long before the discoveries of modern science fleshed out the story of the evolving universe. "Big…

  19. True Randomness from Big Data

    Science.gov (United States)

    Papakonstantinou, Periklis A.; Woodruff, David P.; Yang, Guang

    2016-01-01

    Generating random bits is a difficult task, which is important for physical systems simulation, cryptography, and many applications that rely on high-quality random bits. Our contribution is to show how to generate provably random bits from uncertain events whose outcomes are routinely recorded in the form of massive data sets. These include scientific data sets, such as in astronomics, genomics, as well as data produced by individuals, such as internet search logs, sensor networks, and social network feeds. We view the generation of such data as the sampling process from a big source, which is a random variable of size at least a few gigabytes. Our view initiates the study of big sources in the randomness extraction literature. Previous approaches for big sources rely on statistical assumptions about the samples. We introduce a general method that provably extracts almost-uniform random bits from big sources and extensively validate it empirically on real data sets. The experimental findings indicate that our method is efficient enough to handle large enough sources, while previous extractor constructions are not efficient enough to be practical. Quality-wise, our method at least matches quantum randomness expanders and classical world empirical extractors as measured by standardized tests. PMID:27666514

  20. Data, Data, Data : Big, Linked & Open

    NARCIS (Netherlands)

    Folmer, E.J.A.; Krukkert, D.; Eckartz, S.M.

    2013-01-01

    The entire business and IT world is currently talking about Big Data, a trend which, as of mid-2013, has overtaken Cloud Computing (based on Google Trends). Policymakers are also actively engaged with Big Data. Neelie Kroes, vice-president of the European Commission, speaks of the ‘Big Data

  1. A survey of big data research

    Science.gov (United States)

    Fang, Hua; Zhang, Zhaoyang; Wang, Chanpaul Jin; Daneshmand, Mahmoud; Wang, Chonggang; Wang, Honggang

    2015-01-01

    Big data create value for business and research, but pose significant challenges in terms of networking, storage, management, analytics and ethics. Multidisciplinary collaborations from engineers, computer scientists, statisticians and social scientists are needed to tackle, discover and understand big data. This survey presents an overview of big data initiatives, technologies and research in industry and academia, and discusses challenges and potential solutions. PMID:26504265

  2. Big sagebrush seed bank densities following wildfires

    Science.gov (United States)

    Big sagebrush (Artemisia spp.) is a critical shrub for many wildlife species including sage grouse (Centrocercus urophasianus), mule deer (Odocoileus hemionus), and pygmy rabbit (Brachylagus idahoensis). Big sagebrush is killed by wildfires, and big sagebrush seed is generally short-lived and does not s...

  3. A SWOT Analysis of Big Data

    Science.gov (United States)

    Ahmadi, Mohammad; Dileepan, Parthasarati; Wheatley, Kathleen K.

    2016-01-01

    This is the decade of data analytics and big data, but not everyone agrees with the definition of big data. Some researchers see it as the future of data analysis, while others consider it as hype and foresee its demise in the near future. No matter how it is defined, big data for the time being is having its glory moment. The most important…

  4. Big Data: Implications for Health System Pharmacy.

    Science.gov (United States)

    Stokes, Laura B; Rogers, Joseph W; Hertig, John B; Weber, Robert J

    2016-07-01

    Big Data refers to datasets that are so large and complex that traditional methods and hardware for collecting, sharing, and analyzing them are not possible. Big Data that is accurate leads to more confident decision making, improved operational efficiency, and reduced costs. The rapid growth of health care information results in Big Data around health services, treatments, and outcomes, and Big Data can be used to analyze the benefit of health system pharmacy services. The goal of this article is to provide a perspective on how Big Data can be applied to health system pharmacy. It will define Big Data, describe the impact of Big Data on population health, review specific implications of Big Data in health system pharmacy, and describe an approach for pharmacy leaders to effectively use Big Data. A few strategies involved in managing Big Data in health system pharmacy include identifying potential opportunities for Big Data, prioritizing those opportunities, protecting privacy concerns, promoting data transparency, and communicating outcomes. As health care information expands in its content and becomes more integrated, Big Data can enhance the development of patient-centered pharmacy services.

  5. A SWOT Analysis of Big Data

    Science.gov (United States)

    Ahmadi, Mohammad; Dileepan, Parthasarati; Wheatley, Kathleen K.

    2016-01-01

    This is the decade of data analytics and big data, but not everyone agrees with the definition of big data. Some researchers see it as the future of data analysis, while others consider it as hype and foresee its demise in the near future. No matter how it is defined, big data for the time being is having its glory moment. The most important…

  6. Data, Data, Data : Big, Linked & Open

    NARCIS (Netherlands)

    Folmer, E.J.A.; Krukkert, D.; Eckartz, S.M.

    2013-01-01

    The entire business and IT world is currently talking about Big Data, a trend that overtook Cloud Computing in mid-2013 (based on Google Trends). Policy makers are also actively engaged with Big Data. Neelie Kroes, vice-president of the European Commission, speaks of the ‘Big Data Revolutio

  7. A survey of big data research

    OpenAIRE

    2015-01-01

    Big data create value for business and research, but pose significant challenges in terms of networking, storage, management, analytics and ethics. Multidisciplinary collaborations from engineers, computer scientists, statisticians and social scientists are needed to tackle, discover and understand big data. This survey presents an overview of big data initiatives, technologies and research in industry and academia, and discusses challenges and potential solutions.

  8. Similarity criterion of flood discharge atomization

    Institute of Scientific and Technical Information of China (English)

    Zhou Hui; Wu Shiqiang; Chen Huiling; Zhou Jie; Wu Xiufeng

    2008-01-01

    By combining the results of prototype observations of flood discharge atomization at the Wujiangdu Hydropower Station with the serial model test method, the model scale effect was examined, the influences of the Reynolds and Weber numbers of the flow on the rain intensity of flood discharge atomization were analyzed, and a rain intensity conversion relation was established. It is demonstrated that the level of atomization follows geometric similarity relations and that the influence of the surface tension of the flow can be ignored when the Weber number is greater than 500. Despite limitations such as incomplete data sets, the work is helpful for studying the scale effect of atomized flow and for identifying rules for extrapolating model test results to prototype predictions.
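
    The Weber number criterion quoted above is straightforward to evaluate: We = ρv²L/σ compares inertial forces with surface tension. A minimal check, using illustrative flow values rather than the Wujiangdu test data:

```python
# Minimal sketch: Weber number We = rho * v**2 * L / sigma.
# All values are illustrative, not taken from the Wujiangdu tests.
rho = 1000.0    # water density, kg/m^3
sigma = 0.072   # air-water surface tension, N/m
v = 2.5         # characteristic flow velocity, m/s
L = 0.05        # characteristic length scale of the nappe, m

weber = rho * v ** 2 * L / sigma
print(f"We = {weber:.0f}; surface tension negligible: {weber > 500}")
```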

  9. The BigBoss Experiment

    Energy Technology Data Exchange (ETDEWEB)

    Schlegel, D.; Abdalla, F.; Abraham, T.; Ahn, C.; Allende Prieto, C.; Annis, J.; Aubourg, E.; Azzaro, M.; Bailey, S.; Baltay, C.; Baugh, C.; Bebek, C.; Becerril, S.; Blanton, M.; Bolton, A.; Bromley, B.; Cahn, R.; Carton, P.-H.; Cervantes-Cota, J.L.; Chu, Y.; Cortes, M.

    2012-06-07

    BigBOSS is a Stage IV ground-based dark energy experiment to study baryon acoustic oscillations (BAO) and the growth of structure with a wide-area galaxy and quasar redshift survey over 14,000 square degrees. It has been conditionally accepted by NOAO in response to a call for major new instrumentation and a high-impact science program for the 4-m Mayall telescope at Kitt Peak. The BigBOSS instrument is a robotically-actuated, fiber-fed spectrograph capable of taking 5000 simultaneous spectra over a wavelength range from 340 nm to 1060 nm, with a resolution R = λ/Δλ = 3000-4800. Using data from imaging surveys that are already underway, spectroscopic targets are selected that trace the underlying dark matter distribution. In particular, targets include luminous red galaxies (LRGs) up to z = 1.0, extending the BOSS LRG survey in both redshift and survey area. To probe the universe out to even higher redshift, BigBOSS will target bright [OII] emission line galaxies (ELGs) up to z = 1.7. In total, 20 million galaxy redshifts are obtained to measure the BAO feature, trace the matter power spectrum at smaller scales, and detect redshift space distortions. BigBOSS will provide additional constraints on early dark energy and on the curvature of the universe by measuring the Ly-alpha forest in the spectra of over 600,000 2.2 < z < 3.5 quasars. BigBOSS galaxy BAO measurements combined with an analysis of the broadband power, including the Ly-alpha forest in BigBOSS quasar spectra, achieves a FOM of 395 with Planck plus Stage III priors. This FOM is based on conservative assumptions for the analysis of broad band power (k_max = 0.15), and could grow to over 600 if current work allows us to push the analysis to higher wave numbers (k_max = 0.3). BigBOSS will also place constraints on theories of modified gravity and inflation, and will measure the sum of neutrino masses to 0.024 eV accuracy.

  10. Simple Method for Assessing Spread of Flood Prone Areas under Historical and Future Rainfall in the Upper Citarum Watershed

    Directory of Open Access Journals (Sweden)

    Bambang Dwi Dasanto

    2014-06-01

    Full Text Available From 1931 to 2010 the flood frequency in the Upper Citarum Watershed increased sharply, indicating a decline in watershed quality. With climate change, flood risk may worsen. This study aims to determine the effective rainfall that causes flooding and to evaluate the impact of future rainfall changes on flood-prone areas. Effective rainfall, which contributes to direct runoff (DRO) and leads to flooding, was determined using a regression equation relating the DRO to the cumulative rainfall of a number of consecutive days. Flood-prone areas were mapped using GIS techniques. Results showed that the effective rainfall causing flooding was the rainfall accumulated over the four consecutive days before the peak of DRO. The agreement between estimated and actual flood maps was about 76.9%. According to historical rainfall, the flood-prone areas spread to the right and left of the upstream Citarum River. If this area experiences climate change, flood frequency and extent will increase. This study can only identify locations and the possibility of flood occurrence; it cannot delineate the spread of flood inundation precisely. Nevertheless, this simple approach can evaluate flood frequency and intensity quite well.
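
    As a rough sketch of the regression step described above, one could fit a least-squares line relating direct runoff to the four-consecutive-day cumulative rainfall. All series below are synthetic placeholders, not the Upper Citarum observations:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical daily rainfall (mm/day) for one year.
rain = rng.gamma(shape=0.8, scale=12.0, size=365)

# Cumulative rainfall over each window of four consecutive days.
p4 = np.convolve(rain, np.ones(4), mode="valid")

# Synthetic direct runoff responding linearly to P4, plus noise (mm).
dro = 0.45 * p4 + rng.normal(0.0, 5.0, p4.size)

# Least-squares fit DRO = a * P4 + b, mirroring the paper's regression.
a, b = np.polyfit(p4, dro, deg=1)
print(f"DRO = {a:.2f} * P4 + {b:.2f}")
```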

  11. Flood Risk and Asset Management

    Science.gov (United States)

    2012-09-01

    Within the UK, for example, the flooding of the village of Boscastle (August 2004), which took place over a day (Roca-Collel and Davison, 2010), can...Hazard Research Centre. Roca-Collel, M. and Davison, M. (2010). "Two-dimensional model analysis of flash-flood processes: application to the Boscastle

  12. Geomorphological factors of flash floods

    Science.gov (United States)

    Kuznetsova, Yulia

    2016-04-01

    Growing anthropogenic load and the rising frequency of extreme meteorological events and total precipitation depth often increase the danger of catastrophic fluvial processes worldwide. Flash floods are among the most dangerous and least understood of these. Difficulties in studying them are mainly related to the short duration of single events and the remoteness of, and hard access to, their origin areas. Most detailed research on flash floods focuses on the hydrological parameters of the flow itself and its meteorological factors. At the same time, the importance of a basin's geological and geomorphological structure for flash flood generation, and the role flash floods play in global sediment redistribution, is still poorly understood. However, understanding and quantitative assessment of these features is the basis for a complete picture of the factors, characteristics and dynamics of flash floods. This work is a review of published data on flash floods, focusing on the geomorphological factors of the phenomenon. We consider both the individual roles of and interactions between different geomorphological features (whole-basin parameters, characteristics of single slopes and of the valley bottom). Special attention is paid to critical values of certain factors. This approach also highlights the gaps and less studied factors of flash floods. Finally, all data are organized into a complex diagram that may be used for flash flood modeling. This may also help to reach a new level of flash flood prediction and risk assessment.

  13. Extreme flooding tolerance in Rorippa

    NARCIS (Netherlands)

    Akman, M.; Bhikharie, A.; Mustroph, A.; Sasidharan, Rashmi

    2014-01-01

    Low oxygen stress imposed by floods creates a strong selection force shaping plant ecosystems in flood-prone areas. Plants inhabiting these environments adopt various adaptations and survival strategies to cope with increasing water depths. Two Rorippa species, R. sylvestris and R. amphibia that gro

  14. Use of Geologic and Paleoflood Information for INL Probabilistic Flood Hazard Decisions

    Science.gov (United States)

    Ostenaa, D.; O'Connell, D.; Creed, B.

    2009-05-01

    The Big Lost River is a western U.S. closed-basin stream which flows through and terminates on the Idaho National Laboratory. Historic flows are highly regulated, and peak flows decline downstream through natural and anthropogenic influences. Glaciated headwater regions were the source of Pleistocene outburst floods which traversed the site. A wide range of DOE facilities (including a nuclear research reactor) require flood stage estimates for flow exceedance probabilities over a range from 1/100/yr to 1/100,000/yr per DOE risk-based standards. These risk management objectives required the integration of geologic and geomorphic paleoflood data into Bayesian nonparametric flood frequency analyses that incorporated measurement uncertainties in gaged, historical, and paleoflood discharges and non-exceedance bounds to produce fully probabilistic flood frequency estimates for annual exceedance probabilities of specific discharges of interest. Two-dimensional hydraulic flow modeling with scenarios for varied hydraulic parameters, infiltration, and culvert blockages on the site was conducted for a range of discharges from 13-700 m3/s. High-resolution topographic grids and two-dimensional flow modeling allowed detailed evaluation of the potential impacts of numerous secondary channels and flow paths resulting from flooding in extreme events. These results were used to construct stage probability curves for 15 key locations on the site consistent with DOE standards. These probability curves resulted from the systematic inclusion of contributions of uncertainty from flood sources, hydraulic modeling, and flood-frequency analyses. These products also provided a basis to develop weights for logic tree branches associated with infiltration and culvert performance scenarios to produce probabilistic inundation maps. The flood evaluation process was structured using Senior Seismic Hazard Analysis Committee processes (NRC-NUREG/CR-6372) concepts, evaluating and integrating the
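
    The Bayesian nonparametric analysis with paleoflood bounds used in the study is beyond a short example, but the basic step of converting annual peak discharges into discharges at target annual exceedance probabilities can be illustrated with a simple Gumbel (EV1) fit on synthetic peaks:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical annual peak discharges (m3/s); the actual study combined
# gaged, historical, and paleoflood data with non-exceedance bounds.
peaks = rng.gumbel(loc=60.0, scale=25.0, size=80)

# Fit a two-parameter Gumbel distribution to the peaks.
loc, scale = stats.gumbel_r.fit(peaks)

# Discharge whose annual exceedance probability (AEP) equals each target.
for aep in (1e-2, 1e-3, 1e-4):
    q = stats.gumbel_r.ppf(1.0 - aep, loc=loc, scale=scale)
    print(f"AEP {aep:.0e}/yr -> Q = {q:.0f} m3/s")
```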

  15. Big biomedical data as the key resource for discovery science.

    Science.gov (United States)

    Toga, Arthur W; Foster, Ian; Kesselman, Carl; Madduri, Ravi; Chard, Kyle; Deutsch, Eric W; Price, Nathan D; Glusman, Gustavo; Heavner, Benjamin D; Dinov, Ivo D; Ames, Joseph; Van Horn, John; Kramer, Roger; Hood, Leroy

    2015-11-01

    Modern biomedical data collection is generating exponentially more data in a multitude of formats. This flood of complex data poses significant opportunities to discover and understand the critical interplay among such diverse domains as genomics, proteomics, metabolomics, and phenomics, including imaging, biometrics, and clinical data. The Big Data for Discovery Science Center is taking an "-ome to home" approach to discover linkages between these disparate data sources by mining existing databases of proteomic and genomic data, brain images, and clinical assessments. In support of this work, the authors developed new technological capabilities that make it easy for researchers to manage, aggregate, manipulate, integrate, and model large amounts of distributed data. Guided by biological domain expertise, the Center's computational resources and software will reveal relationships and patterns, aiding researchers in identifying biomarkers for the most confounding conditions and diseases, such as Parkinson's and Alzheimer's.

  16. Big Biomedical data as the key resource for discovery science

    Energy Technology Data Exchange (ETDEWEB)

    Toga, Arthur; Foster, Ian; Kesselman, Carl; Madduri, Ravi; Chard, Kyle; Deutsch, Eric W; Price, Nathan; Glusman, Gustavo; Heavner, Benjamin D.; Dinov, Ivo D.

    2015-07-21

    Modern biomedical data collection is generating exponentially more data in a multitude of formats. This flood of complex data poses significant opportunities to discover and understand the critical interplay among such diverse domains as genomics, proteomics, metabolomics, and phenomics, including imaging, biometrics, and clinical data. The Big Data for Discovery Science Center is taking an “-ome to home” approach to discover linkages between these disparate data sources by mining existing databases of proteomic and genomic data, brain images, and clinical assessments. In support of this work, the authors developed new technological capabilities that make it easy for researchers to manage, aggregate, manipulate, integrate, and model large amounts of distributed data. Guided by biological domain expertise, the Center’s computational resources and software will reveal relationships and patterns, aiding researchers in identifying biomarkers for the most confounding conditions and diseases, such as Parkinson’s and Alzheimer’s.

  17. Astrophysics and Big Data: Challenges, Methods, and Tools

    Science.gov (United States)

    Garofalo, Mauro; Botta, Alessio; Ventre, Giorgio

    2017-06-01

    Nowadays there is no field of research that is not flooded with data. Among the sciences, astrophysics has always been driven by the analysis of massive amounts of data. The development of new and more sophisticated observation facilities, both ground-based and spaceborne, has made data more and more complex (Variety) and has led to exponential growth in both data Volume (i.e., of the order of petabytes) and Velocity of production and transmission. Therefore, new and advanced processing solutions will be needed to handle this huge amount of data. We investigate some of these solutions, based on machine learning models as well as tools and architectures for Big Data analysis that can be exploited in the astrophysical context.

  18. Using participatory agent-based models to measure flood managers' decision thresholds in extreme event response

    Science.gov (United States)

    Metzger, A.; Douglass, E.; Gray, S. G.

    2016-12-01

    Extreme flooding impacts to coastal cities are not only a function of storm characteristics, but are heavily influenced by decision-making and preparedness in event-level response. While recent advances in climate and hydrological modeling make it possible to predict the influence of climate change on storm and flooding patterns, flood managers still face a great deal of uncertainty related to adapting organizational responses and decision thresholds to these changing conditions. Some decision thresholds related to mitigation of extreme flood impacts are well understood and defined by organizational protocol, but others are difficult to quantify due to reliance on contextual expert knowledge and experience, and the complexity of information necessary to make certain decisions. Our research attempts to address this issue by demonstrating participatory modeling methods designed to help flood managers (1) better understand and parameterize local decision thresholds in extreme flood management situations, (2) collectively learn about scaling management decision thresholds to future local flooding scenarios, and (3) identify effective strategies for adapting flood mitigation actions and organizational response to climate change-intensified flooding. Our agent-based system dynamics models rely on expert knowledge from local flood managers and sophisticated, climate change-informed hydrological models to simulate current and future flood scenarios. Local flood managers interact with these models by receiving dynamic information and making management decisions as a flood scenario progresses, allowing parameterization of decision thresholds under different scenarios. Flooding impacts are calculated in each iteration as a means of discussing the effectiveness of responses and prioritizing response alternatives. We discuss the findings of this participatory modeling and educational process from a case study of Boston, MA, and discuss transferability of these methods to other types
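
    A toy version of the threshold mechanics the exercise aims to parameterize: a simulated manager deploys temporary defences whenever the forecast stage crosses a personal threshold, and mean losses are compared across candidate thresholds. All numbers are hypothetical; in the real exercise the thresholds emerge from participants' decisions:

```python
import numpy as np

rng = np.random.default_rng(11)

def mean_loss(threshold_m, n_events=10_000):
    """Average loss per event for a deploy-if-forecast-exceeds rule."""
    forecast = rng.normal(2.0, 0.8, n_events)           # forecast stage, m
    actual = forecast + rng.normal(0.0, 0.3, n_events)  # forecast error
    deploy = forecast >= threshold_m
    # Damaging floods (stage > 2.5 m) cost 5 units unprotected, 1 protected.
    damage = np.where(actual > 2.5, np.where(deploy, 1.0, 5.0), 0.0)
    cost = deploy * 0.5                                 # deployment cost
    return float((damage + cost).mean())

for thr in (1.5, 2.0, 2.5, 3.0):
    print(f"threshold {thr:.1f} m -> mean loss {mean_loss(thr):.2f}")
```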

  19. Modeling tools for the assessment of microbiological risks during floods: a review

    Science.gov (United States)

    Collender, Philip; Yang, Wen; Stieglitz, Marc; Remais, Justin

    2015-04-01

    Floods are a major, recurring source of harm to global economies and public health. Projected increases in the frequency and intensity of heavy precipitation events under future climate change, coupled with continued urbanization in areas with high risk of floods, may exacerbate future impacts of flooding. Improved flood risk management is essential to support global development, poverty reduction and public health, and is likely to be a crucial aspect of climate change adaptation. Importantly, floods can facilitate the transmission of waterborne pathogens by changing social conditions (overcrowding among displaced populations, interruption of public health services), imposing physical challenges to infrastructure (sewerage overflow, reduced capacity to treat drinking water), and altering fate and transport of pathogens (transport into waterways from overland flow, resuspension of settled contaminants) during and after flood conditions. Hydrological and hydrodynamic models are capable of generating quantitative characterizations of microbiological risks associated with flooding, while accounting for these diverse and at times competing physical and biological processes. Despite a few applications of such models to the quantification of microbiological risks associated with floods, there exists limited guidance as to the relative capabilities, and limitations, of existing modeling platforms when used for this purpose. Here, we review 17 commonly used flood and water quality modeling tools that have demonstrated or implicit capabilities of mechanistically representing and quantifying microbial risk during flood conditions. We compare models with respect to their capabilities of generating outputs that describe physical and microbial conditions during floods, such as concentration or load of non-cohesive sediments or pathogens, and the dynamics of high flow conditions. Recommendations are presented for the application of specific modeling tools for assessing

  20. Developing a Malaysia flood model

    Science.gov (United States)

    Haseldine, Lucy; Baxter, Stephen; Wheeler, Phil; Thomson, Tina

    2014-05-01

    Faced with growing exposures in Malaysia, insurers need models to help them assess their exposure to flood losses. The need for improved management of flood risks has been further highlighted by the 2011 floods in Thailand and recent events in Malaysia. The increasing demand for loss accumulation tools in Malaysia has led to the development of the first nationwide probabilistic Malaysia flood model, which we present here. The model is multi-peril, including river flooding for thousands of kilometres of river and rainfall-driven surface water flooding in major cities, which may cause losses equivalent to river flood in some high-density urban areas. The underlying hazard maps are based on a 30 m digital surface model (DSM) and 1D/2D hydraulic modelling in JFlow and RFlow. Key mitigation schemes such as the SMART tunnel and drainage capacities are also considered in the model. The probabilistic element of the model is driven by a stochastic event set based on rainfall data, enabling per-event and annual figures to be calculated for a specific insurance portfolio and a range of return periods. Losses are estimated via depth-damage vulnerability functions which link the insured damage to water depths for different property types in Malaysia. The model provides a unique insight into Malaysian flood risk profiles and provides insurers with return period estimates of flood damage and loss to property portfolios through loss exceedance curve outputs. It has been successfully validated against historic flood events in Malaysia and is now being used by insurance companies in the Malaysian market to obtain reinsurance cover.
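
    As a sketch of the loss exceedance curve output mentioned above: given simulated annual portfolio losses from a stochastic event set (lognormal placeholders here, not model output), the empirical exceedance curve and return-period losses follow directly:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical annual portfolio losses over 10,000 simulated years.
annual_loss = rng.lognormal(mean=13.0, sigma=1.2, size=10_000)

# Empirical exceedance probability P(annual loss >= x).
sorted_loss = np.sort(annual_loss)[::-1]                 # descending
exceed_prob = np.arange(1, sorted_loss.size + 1) / (sorted_loss.size + 1)

# Read off losses for standard return periods.
for rp in (10, 100, 250):
    idx = np.searchsorted(exceed_prob, 1.0 / rp)
    print(f"{rp}-year loss: {sorted_loss[idx]:,.0f}")
```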

  1. Evaluating the impact and risk of pluvial flash flood on intra-urban road network: A case study in the city center of Shanghai, China

    Science.gov (United States)

    Yin, Jie; Yu, Dapeng; Yin, Zhane; Liu, Min; He, Qing

    2016-06-01

    Urban pluvial floods are attracting growing public concern due to increasingly intense precipitation and growing consequences. Accurate risk assessment is critical to efficient urban pluvial flood management, particularly in the transportation sector. This paper describes an integrated methodology, which makes use of high-resolution 2D inundation modeling and a flood depth-dependent measure to evaluate the potential impact and risk of pluvial flash floods on the road network in the city center of Shanghai, China. Intensity-Duration-Frequency relationships of Shanghai rainstorms and the Chicago Design Storm are combined to generate ensemble rainfall scenarios. A hydrodynamic model (FloodMap-HydroInundation2D) is used to simulate overland flow and flood inundation for each scenario. Furthermore, road impact and risk assessment are conducted by a newly proposed algorithm and proxy, respectively. Results suggest that the flood response is a function of the spatio-temporal distribution of precipitation and local characteristics (i.e. drainage and topography), and pluvial flash floods are found to have a proportionate but nonlinear impact on intra-urban road inundation risk. The approach tested here provides more detailed flood information for smart management of the urban street network and may be applied to other big cities where road flood risk is evolving in the context of climate change and urbanization.
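
    The Chicago Design Storm referred to above reshapes an IDF relationship, i_avg = a/(t + b)^c, into a single-peaked hyetograph (the Keifer-Chu construction). A minimal sketch with hypothetical IDF parameters rather than Shanghai's fitted values:

```python
import numpy as np

# Hypothetical IDF parameters and peak position (Keifer-Chu method).
a, b, c = 1200.0, 10.0, 0.75  # i_avg = a / (t + b)**c, i in mm/h, t in min
r = 0.4                       # fraction of the duration before the peak
duration = 120.0              # storm duration, minutes
t_peak = r * duration

def intensity(t):
    """Instantaneous design-storm intensity (mm/h) at minute t."""
    if t < t_peak:
        tau = (t_peak - t) / r          # time to peak, rising limb
    else:
        tau = (t - t_peak) / (1.0 - r)  # time since peak, falling limb
    return a * ((1.0 - c) * tau + b) / (tau + b) ** (1.0 + c)

hyetograph = [intensity(t) for t in np.arange(0.0, duration, 5.0)]
print(" ".join(f"{i:5.1f}" for i in hyetograph))
```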

  2. Entering the ‘big data’ era in medicinal chemistry: molecular promiscuity analysis revisited

    Science.gov (United States)

    Hu, Ye; Bajorath, Jürgen

    2017-01-01

    The ‘big data’ concept plays an increasingly important role in many scientific fields. Big data involves more than unprecedentedly large volumes of data that become available. Different criteria characterizing big data must be carefully considered in computational data mining, as we discuss herein focusing on medicinal chemistry. This is a scientific discipline where big data is beginning to emerge and provide new opportunities. For example, the ability of many drugs to specifically interact with multiple targets, termed promiscuity, forms the molecular basis of polypharmacology, a hot topic in drug discovery. Compound promiscuity analysis is an area that is much influenced by big data phenomena. Different results are obtained depending on chosen data selection and confidence criteria, as we also demonstrate. PMID:28670471

  3. Entering the 'big data' era in medicinal chemistry: molecular promiscuity analysis revisited.

    Science.gov (United States)

    Hu, Ye; Bajorath, Jürgen

    2017-06-01

    The 'big data' concept plays an increasingly important role in many scientific fields. Big data involves more than unprecedentedly large volumes of data that become available. Different criteria characterizing big data must be carefully considered in computational data mining, as we discuss herein focusing on medicinal chemistry. This is a scientific discipline where big data is beginning to emerge and provide new opportunities. For example, the ability of many drugs to specifically interact with multiple targets, termed promiscuity, forms the molecular basis of polypharmacology, a hot topic in drug discovery. Compound promiscuity analysis is an area that is much influenced by big data phenomena. Different results are obtained depending on chosen data selection and confidence criteria, as we also demonstrate.

  4. ISSUES, CHALLENGES, AND SOLUTIONS: BIG DATA MINING

    Directory of Open Access Journals (Sweden)

    Jaseena K.U.

    2014-12-01

    Full Text Available Data has become an indispensable part of every economy, industry, organization, business function and individual. Big Data is a term used to identify datasets whose size is beyond the ability of typical database software tools to store, manage and analyze. Big Data introduces unique computational and statistical challenges, including scalability and storage bottlenecks, noise accumulation, spurious correlations and measurement errors. These challenges are distinctive and require new computational and statistical paradigms. This paper presents a literature review of Big Data mining and its issues and challenges, with emphasis on the distinguishing features of Big Data. It also discusses some methods for dealing with big data.

  5. The Obstacles in Big Data Process

    Directory of Open Access Journals (Sweden)

    Rasim M. Alguliyev

    2017-04-01

    Full Text Available The increasing amount of data and the need to analyze it in a timely manner for multiple purposes have created a serious barrier in the big data analysis process. This article describes the challenges that big data creates at each step of the big data analysis process. These problems include typical analytical problems as well as less common challenges that are specific to big data. The article breaks down the problems for each step of the big data analysis process and discusses them separately at each stage. It also offers some simple ways to solve these problems.

  6. Subsidence and Deforestation: Implications for Flooding in Delta’s Southeast and East Asia

    Directory of Open Access Journals (Sweden)

    . Nurhamidah

    2011-01-01

    Full Text Available A delta is a low-lying area found at the mouth of a river. Nowadays, flooding is concentrated in many deltaic areas due to a combination of several factors. Meanwhile, a large number of people living on the flood plains of main rivers and river deltas are threatened by flooding. Land subsidence and deforestation are two phenomena which, until recently, have been occurring at very high rates in the SEE Asia region. Population growth strongly influences the natural hydrological processes. Due to pressure for land, substantial areas of peat swamp in SEE Asia have been, and are presently being, reclaimed for agriculture or other land use. In natural conditions, swamp areas functioned as retention areas by absorbing flood water, thereby preventing or mitigating flooding in downstream areas. Unfortunately, large areas of the original forest in the large peat swamp forests have disappeared due to human activities such as illegal logging and fires. On the other hand, a growing population, industry, agriculture and plantations will increase water demand. Groundwater extraction will increase as well, which can cause land subsidence; tides can then propagate easily into deltaic areas, compounded by the lowering of the land surface due to subsidence. Since flooding has been identified as an issue, these two phenomena need to be assessed as well.

  7. Flash Flooding and 'Muddy Floods' on Arable Land

    Science.gov (United States)

    Boardman, J.

    2012-04-01

    Flash flooding is often associated with upland, grazed catchments. It does, however, occur in lowland arable-dominated areas. In southern England, notable examples have occurred at Rottingdean (Brighton) in 1987, at Faringdon (Oxfordshire) in 1993 and at Breaky Bottom vineyard (near Brighton) in 1987 and 2000. All resulted in damage to nearby property. Runoff was largely from recently cultivated ground. The characteristics of such floods are: rapid runoff from bare soil surfaces (saturation-excess overland flow is likely in the early parts of storms, but high-intensity rainfall on loamy soils results in crusting and Hortonian overland flow); high rates of erosion; and sediment transport to down-valley sites causing property damage ('muddy flooding'). Muddy floods are known from several areas of Europe, e.g. Belgium, northern France, South Limburg (Netherlands) and Slovakia (Boardman et al 2006). In other areas they occur but have gone unreported or are classified under different terms. The necessary conditions for occurrence are areas of arable land which are bare at times of the year when there is a risk of storms. For muddy floods to cause damage (and hence be reported), vulnerable property must lie downstream from such areas of arable land. In some areas the incidence of muddy floods relates to autumn and early winter rainfall and winter cereal crops (e.g. southern England). In continental Europe, flooding is more common in summer and is associated with convectional storms and land uses including sugar beet, maize and potatoes. Predictions of increased numbers of high-intensity storms with future climate change suggest that arable areas will continue to generate both flash floods and muddy floods.

  8. Base (100-year) flood elevations for selected sites in Marion County, Missouri

    Science.gov (United States)

    Southard, Rodney E.; Wilson, Gary L.

    1998-01-01

    The primary requirement for community participation in the National Flood Insurance Program is the adoption and enforcement of floodplain management requirements that minimize the potential for flood damages to new construction and avoid aggravating existing flooding conditions. This report provides base flood elevations (BFE) for a 100-year recurrence flood for use in the management and regulation of 14 flood-hazard areas designated by the Federal Emergency Management Agency as approximate Zone A areas in Marion County, Missouri. The one-dimensional surface-water flow model, HEC-RAS, was used to compute the base (100-year) flood elevations for the 14 Zone A sites. The 14 sites were located at U.S., State, or County road crossings and the base flood elevation was determined at the upstream side of each crossing. The base (100-year) flood elevations for BFE 1, 2, and 3 on the South Fork North River near Monroe City, Missouri, are 627.7, 579.2, and 545.9 feet above sea level. The base (100-year) flood elevations for BFE 4, 5, 6, and 7 on the main stem of the North River near or at Philadelphia and Palmyra, Missouri, are 560.5, 539.7, 504.2, and 494.4 feet above sea level. BFE 8 is located on Big Branch near Philadelphia, a tributary to the North River, and the base (100-year) flood elevation at this site is 530.5 feet above sea level. One site (BFE 9) is located on the South River near Monroe City, Missouri. The base (100-year) flood elevation at this site is 619.1 feet above sea level. Site BFE 10 is located on Bear Creek near Hannibal, Missouri, and the base (100-year) elevation is 565.5 feet above sea level. The four remaining sites (BFE 11, 12, 13, and 14) are located on the South Fabius River near Philadelphia and Palmyra, Missouri. The base (100-year) flood elevations for BFE 11, 12, 13, and 14 are 591.2, 578.4, 538.7, and 506.9 feet above sea level.

  9. Antigravity and the big crunch/big bang transition

    CERN Document Server

    Bars, Itzhak; Steinhardt, Paul J; Turok, Neil

    2011-01-01

    We point out a new phenomenon which seems to be generic in 4d effective theories of scalar fields coupled to Einstein gravity, when applied to cosmology. A lift of such theories to a Weyl-invariant extension allows one to define classical evolution through cosmological singularities unambiguously, and hence construct geodesically complete background spacetimes. An attractor mechanism ensures that, at the level of the effective theory, generic solutions undergo a big crunch/big bang transition by contracting to zero size, passing through a brief antigravity phase, shrinking to zero size again, and re-emerging into an expanding normal gravity phase. The result may be useful for the construction of complete bouncing cosmologies like the cyclic model.

  10. Solution of a Braneworld Big Crunch/Big Bang Cosmology

    CERN Document Server

    McFadden, Paul; Steinhardt, Paul J.; Turok, Neil

    2005-01-01

    We solve for the cosmological perturbations in a five-dimensional background consisting of two separating or colliding boundary branes, as an expansion in the collision speed V divided by the speed of light c. Our solution permits a detailed check of the validity of four-dimensional effective theory in the vicinity of the event corresponding to the big crunch/big bang singularity. We show that the four-dimensional description fails at the first nontrivial order in (V/c)^2. At this order, there is nontrivial mixing of the two relevant four-dimensional perturbation modes (the growing and decaying modes) as the boundary branes move from the narrowly-separated limit described by Kaluza-Klein theory to the well-separated limit where gravity is confined to the positive-tension brane. We comment on the cosmological significance of the result and compute other quantities of interest in five-dimensional cosmological scenarios.

  11. Perspectives on Big Data and Big Data Analytics

    Directory of Open Access Journals (Sweden)

    Elena Geanina ULARU

    2012-12-01

    Full Text Available Nowadays companies are starting to realize the importance of using more data in order to support decisions about their strategies. It has been said, and proved through case studies, that “More data usually beats better algorithms”. With this statement, companies started to realize that they can choose to invest more in processing larger sets of data rather than in expensive algorithms. A large quantity of data is better used as a whole because of the correlations that emerge across a larger amount, correlations that can never be found if the data is analyzed in separate or smaller sets. A larger amount of data gives better output, but working with it can also become a challenge due to processing limitations. This article intends to define the concept of Big Data and stress the importance of Big Data Analytics.

  12. Big Bang Day : The Great Big Particle Adventure - 3. Origins

    CERN Multimedia

    2008-01-01

    In this series, comedian and physicist Ben Miller asks the CERN scientists what they hope to find. If the LHC is successful, it will explain the nature of the Universe around us in terms of a few simple ingredients and a few simple rules. But the Universe now was forged in a Big Bang where conditions were very different, and the rules were very different, and those early moments were crucial to determining how things turned out later. At the LHC they can recreate conditions as they were billionths of a second after the Big Bang, before atoms and nuclei existed. They can find out why matter and antimatter didn't mutually annihilate each other to leave behind a Universe of pure, brilliant light. And they can look into the very structure of space and time - the fabric of the Universe

  13. Antigravity and the big crunch/big bang transition

    Science.gov (United States)

    Bars, Itzhak; Chen, Shih-Hung; Steinhardt, Paul J.; Turok, Neil

    2012-08-01

    We point out a new phenomenon which seems to be generic in 4d effective theories of scalar fields coupled to Einstein gravity, when applied to cosmology. A lift of such theories to a Weyl-invariant extension allows one to define classical evolution through cosmological singularities unambiguously, and hence construct geodesically complete background spacetimes. An attractor mechanism ensures that, at the level of the effective theory, generic solutions undergo a big crunch/big bang transition by contracting to zero size, passing through a brief antigravity phase, shrinking to zero size again, and re-emerging into an expanding normal gravity phase. The result may be useful for the construction of complete bouncing cosmologies like the cyclic model.

  14. The ethics of big data in big agriculture

    Directory of Open Access Journals (Sweden)

    Isabelle M. Carbonell

    2016-03-01

    Full Text Available This paper examines the ethics of big data in agriculture, focusing on the power asymmetry between farmers and large agribusinesses like Monsanto. Following the recent purchase of Climate Corp., Monsanto is currently the most prominent biotech agribusiness to buy into big data. With wireless sensors on tractors monitoring or dictating every decision a farmer makes, Monsanto can now aggregate large quantities of previously proprietary farming data, enabling a privileged position with unique insights on a field-by-field basis into a third or more of the US farmland. This power asymmetry may be rebalanced through open-sourced data, and publicly-funded data analytic tools which rival Climate Corp. in complexity and innovation for use in the public domain.

  15. Antigravity and the big crunch/big bang transition

    Energy Technology Data Exchange (ETDEWEB)

    Bars, Itzhak [Department of Physics and Astronomy, University of Southern California, Los Angeles, CA 90089-2535 (United States); Chen, Shih-Hung [Perimeter Institute for Theoretical Physics, Waterloo, ON N2L 2Y5 (Canada); Department of Physics and School of Earth and Space Exploration, Arizona State University, Tempe, AZ 85287-1404 (United States); Steinhardt, Paul J., E-mail: steinh@princeton.edu [Department of Physics and Princeton Center for Theoretical Physics, Princeton University, Princeton, NJ 08544 (United States); Turok, Neil [Perimeter Institute for Theoretical Physics, Waterloo, ON N2L 2Y5 (Canada)

    2012-08-29

    We point out a new phenomenon which seems to be generic in 4d effective theories of scalar fields coupled to Einstein gravity, when applied to cosmology. A lift of such theories to a Weyl-invariant extension allows one to define classical evolution through cosmological singularities unambiguously, and hence construct geodesically complete background spacetimes. An attractor mechanism ensures that, at the level of the effective theory, generic solutions undergo a big crunch/big bang transition by contracting to zero size, passing through a brief antigravity phase, shrinking to zero size again, and re-emerging into an expanding normal gravity phase. The result may be useful for the construction of complete bouncing cosmologies like the cyclic model.

  16. Improving Gas Flooding Efficiency

    Energy Technology Data Exchange (ETDEWEB)

    Reid Grigg; Robert Svec; Zheng Zeng; Alexander Mikhalin; Yi Lin; Guoqiang Yin; Solomon Ampir; Rashid Kassim

    2008-03-31

    This study focuses on laboratory studies with related analytical and numerical models, as well as work with operators on field tests, to enhance our understanding of and capabilities for more efficient enhanced oil recovery (EOR). Much of the work has been performed at reservoir conditions. This includes a bubble chamber and several core flood apparatus developed or modified to measure interfacial tension (IFT), critical micelle concentration (CMC), foam durability, surfactant sorption at reservoir conditions, and pressure and temperature effects on foam systems. Carbon dioxide and N2 systems have been considered, under both miscible and immiscible conditions. The injection of CO2 into brine-saturated sandstone and carbonate cores resulted in brine saturation reductions in the range of 62 to 82% in the tests presented in this paper. In each test, over 90% of the reduction occurred with less than 0.5 PV of CO2 injected, with very little additional brine production after 0.5 PV of CO2 injected. Adsorption of all considered surfactants is a significant problem. Most of the effect is reversible, but the amount required for foaming is large in terms of volume and cost for all considered surfactants. Some foams increase resistance beyond what is practical in the reservoir. Sandstone, limestone, and dolomite core samples were tested. Dissolution of reservoir rock and/or cement, especially carbonates, under the acid conditions of CO2 injection is a potential problem in CO2 injection into geological formations. Another potential change in reservoir injectivity and productivity is the precipitation of dissolved carbonates as the brine flows and pressure decreases. The results of this report provide methods for determining surfactant sorption and can be used to aid in the determination of surfactant requirements for reservoir use in a CO2-foam flood for mobility control. It also provides data to be used to determine rock permeability

  17. Flood hazard and management: a UK perspective.

    Science.gov (United States)

    Wheater, Howard S

    2006-08-15

    This paper discusses whether flood hazard in the UK is increasing and considers issues of flood risk management. Urban development is known to increase fluvial flood frequency, hence design measures are routinely implemented to minimize the impact. Studies suggest that historical effects, while potentially large at small scale, are not significant for large river basins. Storm water flooding within the urban environment is an area where flood hazard is inadequately defined; new methods are needed to assess and manage flood risk. Development on flood plains has led to major capital expenditure on flood protection, but government is attempting to strengthen the planning role of the environmental regulator to prevent this. Rural land use management has intensified significantly over the past 30 years, leading to concerns that flood risk has increased, at least at local scale; the implications for catchment-scale flooding are unclear. New research is addressing this issue, and more broadly, the role of land management in reducing flood risk. Climate change impacts on flooding and current guidelines for UK practice are reviewed. Large uncertainties remain, not least for the occurrence of extreme precipitation, but precautionary guidance is in place. Finally, current levels of flood protection are discussed. Reassessment of flood hazard has led to targets for increased flood protection, but despite important developments to communicate flood risk to the public, much remains to be done to increase public awareness of flood hazard.

  18. Somerset County Flood Information System

    Science.gov (United States)

    Hoppe, Heidi L.

    2007-01-01

    The timely warning of a flood is crucial to the protection of lives and property. One has only to recall the floods of August 2, 1973, September 16 and 17, 1999, and April 16, 2007, in Somerset County, New Jersey, in which lives were lost and major property damage occurred, to realize how costly, especially in terms of human life, an unexpected flood can be. Accurate forecasts and warnings cannot be made, however, without detailed information about precipitation and streamflow in the drainage basin. Since the mid-1960s, the National Weather Service (NWS) has been able to forecast flooding on larger streams in Somerset County, such as the Raritan and Millstone Rivers. Flooding on smaller streams in urban areas was more difficult to predict. In response to this problem, the NWS, in cooperation with the Green Brook Flood Control Commission, installed a precipitation gage in North Plainfield and two flash-flood alarms, one on Green Brook at Seeley Mills and one on Stony Brook at Watchung, in the early 1970s. In 1978, New Jersey's first countywide flood-warning system was installed by the U.S. Geological Survey (USGS) in Somerset County. This system consisted of a network of eight stage and discharge gages equipped with precipitation gages linked by telephone telemetry, and eight auxiliary precipitation gages. The gages were installed throughout the county to collect precipitation and runoff data that could be used to improve flood-monitoring capabilities and flood-frequency estimates. Recognizing the need for more detailed hydrologic information for Somerset County, the USGS, in cooperation with Somerset County, designed and installed the Somerset County Flood Information System (SCFIS) in 1990. This system is part of a statewide network of stream gages, precipitation gages, weather stations, and tide gages that collect data in real time. The data provided by the SCFIS improve the flood forecasting ability of the NWS and aid Somerset County and municipal agencies in

  19. Lateral Flooding Associated to Wave Flood Generation on River Surface

    Science.gov (United States)

    Ramírez-Núñez, C.; Parrot, J.-F.

    2016-06-01

    This research provides a wave flood simulation using a high-resolution LiDAR digital terrain model. The simulation is based on the generation of waves of different amplitudes that modify the river level in such a way that water invades the adjacent areas. The proposed algorithm first reconstitutes the original river surface of the studied river section and then defines the percentage of water loss as the flood waves move downstream. This procedure was applied to a gently sloping area in the lower basin of the Coatzacoalcos River, Veracruz (Mexico), defining the successive areas where lateral flooding occurs during the waves' downstream movement.
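
    A much-reduced analogue of the lateral flooding step: a connectivity-constrained "bathtub" fill that inundates DTM cells lying below the wave-raised river stage and hydraulically connected to the channel. The grid and stage below are toys; the actual procedure works on the LiDAR DTM and also tracks the water lost as the wave moves downstream:

```python
from collections import deque

import numpy as np

# Toy elevation grid (m) and river cell; real input is the LiDAR DTM.
dtm = np.array([[3.0, 2.5, 2.2, 2.8],
                [2.6, 1.8, 1.5, 2.4],
                [2.2, 1.2, 0.9, 2.0],
                [2.5, 1.6, 1.1, 2.3]])
river = [(2, 2)]     # channel location
stage = 1.7          # river level raised by the passing wave, m

# Breadth-first flood fill from the channel over cells below the stage.
flooded = set(river)
queue = deque(river)
while queue:
    i, j = queue.popleft()
    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        ni, nj = i + di, j + dj
        if (0 <= ni < dtm.shape[0] and 0 <= nj < dtm.shape[1]
                and (ni, nj) not in flooded and dtm[ni, nj] <= stage):
            flooded.add((ni, nj))
            queue.append((ni, nj))

print(sorted(flooded))   # lateral flooding footprint at this stage
```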

  20. Big data is not a monolith

    CERN Document Server

    Ekbia, Hamid R; Mattioli, Michael

    2016-01-01

    Big data is ubiquitous but heterogeneous. Big data can be used to tally clicks and traffic on web pages, find patterns in stock trades, track consumer preferences, identify linguistic correlations in large corpuses of texts. This book examines big data not as an undifferentiated whole but contextually, investigating the varied challenges posed by big data for health, science, law, commerce, and politics. Taken together, the chapters reveal a complex set of problems, practices, and policies. The advent of big data methodologies has challenged the theory-driven approach to scientific knowledge in favor of a data-driven one. Social media platforms and self-tracking tools change the way we see ourselves and others. The collection of data by corporations and government threatens privacy while promoting transparency. Meanwhile, politicians, policy makers, and ethicists are ill-prepared to deal with big data's ramifications. The contributors look at big data's effect on individuals as it exerts social control throu...

  1. Raising awareness of the importance of engineering protections against floods with "Flood-o-poly" v.2

    Science.gov (United States)

    Valyrakis, Manousos; Cheng, Ming

    2017-04-01

    This study presents the results of a survey focusing on the use of a purpose-designed city in a sand-box model, namely "Flood-o-poly" version 2, which builds on the success of the previous model. "Flood-o-poly" has been successfully presented to thousands of students of all ages at the School of Engineering Open Days (University of Glasgow), Widening Participation events, the Glasgow Science Festival, the Glasgow Science Museum, and in Engineering Hydraulics classes and design projects over the last four years. The new design involves a new and extended 3D scaled model that accurately replicates the topography of a city along with its rivers, to demonstrate the impacts of flooding (induced artificially in the scaled physical model via small water pumps). "Flood-o-poly" is a highly visual and well-popularized engineering outreach project (developed by the applicant at the University of Glasgow), which has already been extensively used to showcase the detrimental impacts of flooding for natural ecosystems and built infrastructure alike (see https://twitter.com/WaterEngLab/status/758270564561784832 on Twitter and https://youtu.be/H5oThT6QaTc on Youtube). It involves a highly interactive session in which the students simulate "urbanization" scenarios (by placing more buildings on the floodplains) and "climate change" scenarios in which more extreme flow rates have to be routed through the river. The project demonstrates how this design can benefit cohorts of 3rd- and 4th-year Civil Engineering undergraduate students, students attending the School's Open Days and Widening Participation days, and visitors to the Glasgow Science Festival and Glasgow Science Museum events. "Flood-o-poly" focuses on personalizing the student experience with regard to flood impacts and promotes the need for resilient and sustainable flood protection designs. Further, using novel presentation and student-centered technologies, the students are given a truly unique experience

  2. Large-scale application of the flood damage model RAilway Infrastructure Loss (RAIL)

    Science.gov (United States)

    Kellermann, Patric; Schönberger, Christine; Thieken, Annegret H.

    2016-11-01

    Experience has shown that river floods can significantly hamper the reliability of railway networks and cause extensive structural damage and disruption. As a result, the national railway operator in Austria had to cope with financial losses of more than EUR 100 million due to flooding in recent years. Comprehensive information on potential flood risk hot spots as well as on expected flood damage in Austria is therefore needed for strategic flood risk management. In view of this, the flood damage model RAIL (RAilway Infrastructure Loss) was applied to estimate (1) the expected structural flood damage and (2) the resulting repair costs of railway infrastructure due to a 30-, 100- and 300-year flood in the Austrian Mur River catchment. The results were then used to calculate the expected annual damage of the railway subnetwork and subsequently analysed in terms of their sensitivity to key model assumptions. Additionally, the impact of risk aversion on the estimates was investigated, and the overall results were briefly discussed against the background of climate change and possibly resulting changes in flood risk. The findings indicate that the RAIL model is capable of supporting decision-making in risk management by providing comprehensive risk information on the catchment level. It is furthermore demonstrated that an increased risk aversion of the railway operator has a marked influence on flood damage estimates for the study area and, hence, should be considered with regard to the development of risk management strategies.
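
    The expected annual damage step can be sketched as a trapezoidal integration of damage over annual exceedance probability across the three modelled return periods. The damage figures below are placeholders, not RAIL results, and the integral ignores contributions beyond the 300-year event:

```python
import numpy as np

# Hypothetical structural damage per return period (EUR).
return_periods = np.array([30.0, 100.0, 300.0])   # years
damages = np.array([4.0e6, 11.0e6, 25.0e6])       # EUR

aep = 1.0 / return_periods                        # exceedance probability

# Integrate damage over probability, ordered by increasing probability.
ead = np.trapz(damages[::-1], x=aep[::-1])
print(f"EAD = EUR {ead:,.0f} per year")
```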

  3. A study of the climate change impacts on fluvial flood propagation in the Vietnamese Mekong Delta

    Directory of Open Access Journals (Sweden)

    V. P. Dang Tri

    2012-06-01

    Full Text Available The present paper investigates the flood propagation in the Vietnamese Mekong Delta (VMD) under different projected climate change scenarios, taking the 2000 flood event (the highest flood in recent history) as the base for computation. The analysis was done to demonstrate the particular complexity of the flood dynamics. The future floods, on a short-term horizon (year 2050), were studied by considering the projected sea level rise (SLR) of +30 cm. At the same time, future flood hydrograph changes at Kratie, Cambodia, were applied as the upstream boundary condition. In this study, the future flood hydrograph was separated into two scenarios: (i) Scenario 1 was projected to 2050 according to the adjusted regional climate model without any development in the Upper Mekong Basin; and (ii) Scenario 2 was projected as in Scenario 1 but with the development of the Upper Mekong Basin after 2030. Analyses were done to identify the highly sensitive areas in terms of flood conditions (i.e. with and without flood) according to the uncertainty of the projections of both the upstream and downstream boundary conditions. In addition, given the rice-dominated culture of the VMD, possible impacts of floods on rice-based farming systems were analysed.

  4. Management of flood victims: Chainat Province, central Thailand.

    Science.gov (United States)

    Wisitwong, Anchaleeporn; McMillan, Margaret

    2010-03-01

    This article focuses on the processes of flood management and the experiences of flood victims in Chainat Province, central Thailand, so as to develop knowledge for the future handling of such disasters. A phenomenological qualitative approach was used to describe the processes of providing assistance to flood victims. In-depth interviews and observation were used to collect the data. Criterion sampling was used to select 23 participants. Content analysis of the data revealed that some flood victims could predict flooding based on prior experience, so they prepared themselves. The data revealed six themes demonstrating that those who could not predict how floods would impact them were unprepared and suffered losses and disruption to their daily life. Damaged routes meant people could not go to work, resulting in loss of income. There was a lack of sanitary appliances and clean drinking water; people were sick and experienced stress. At the community level, people helped one another, making sandbags and building walls as a defense against the water. They formed support groups to enable the processing of stressful experiences. Later, however, the water became stagnant and contaminated, creating an offensive smell. The government provided assistance with cutting off electricity services, food and water, toilets and health services, and water drainage. In the recovery phase, the victims needed money for investment, employment opportunities, books for children, extra time to pay off loans, reconnection of electricity, surveys of damage, and pensions to deal with damage and recovery.

  5. Flash flood area mapping utilising SENTINEL-1 radar data

    Science.gov (United States)

    Psomiadis, Emmanouil

    2016-10-01

    The radar data of the new European polar-orbiting satellite system Sentinel-1 provide continuous and systematic data acquisition, enabling the monitoring and mapping of flood events. The study area is the basin of the Sperchios River in Fthiotida Prefecture, Central Greece, an area of increased ecological, environmental and socio-economic interest. The catchment, and especially the river delta, faces several problems and threats caused by anthropogenic activities and natural processes, and the geomorphology of the Sperchios catchment and the formation of its drainage network make the area prone to flooding. A large flash flood event took place in late January and early February 2015, following intense and heavy rainfall in the area. Two spaceborne radar images covering the same area were obtained from Sentinel-1, one acquired before and one during the flood event, and processed. Two different methods were applied to produce flood hazard maps showing the inundated areas. The results of the two methods were similar, and the flooded area was detected and delineated well.
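
    The abstract does not name the two mapping methods used, but a common way to derive flood extent from a pre-flood/co-flood Sentinel-1 pair is to flag pixels whose backscatter is both low and markedly lower than before the event, since smooth open water returns little energy to the radar. A minimal sketch of that idea, with invented backscatter values and thresholds:

        # Minimal sketch of one common SAR flood-mapping approach (not necessarily
        # the methods used in the study): low-backscatter thresholding combined with
        # pre-/co-flood change detection. Arrays and threshold values are assumed.
        import numpy as np

        def map_flood(sigma0_pre_db, sigma0_flood_db,
                      water_threshold_db=-18.0, drop_threshold_db=3.0):
            """Flag pixels that are dark in the co-flood image AND darkened
            relative to the pre-flood image (open water smooths the surface)."""
            dark = sigma0_flood_db < water_threshold_db
            darkened = (sigma0_pre_db - sigma0_flood_db) > drop_threshold_db
            return dark & darkened

        # Toy 2x3 backscatter grids (dB); the lower-left pixel becomes open water.
        pre = np.array([[-10.0, -11.0, -9.5], [-12.0, -10.5, -11.5]])
        post = np.array([[-10.5, -11.2, -9.8], [-20.0, -10.7, -11.0]])
        print(map_flood(pre, post))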

  6. Dealing with Uncertainty in Flood Management Through Diversification

    Directory of Open Access Journals (Sweden)

    Jeroen C. J. H. Aerts

    2008-06-01

    Full Text Available This paper shows, through a numerical example, how to develop portfolios of flood management activities that generate the highest return under an acceptable risk for an area in the central part of the Netherlands. The paper presents a method based on Modern Portfolio Theory (MPT) that contributes to developing flood management strategies. MPT aims at finding sets of investments that diversify risks, thereby reducing the overall risk of the total portfolio of investments. This paper shows that by systematically combining four different flood protection measures into portfolios containing three or four measures, risk is reduced compared with portfolios that contain only one or two measures. Adding partly uncorrelated measures to the portfolio diversifies risk. We demonstrate how MPT encourages a systematic discussion of the relationship between the return and risk of individual flood mitigation activities and the return and risk of complete portfolios. It is also shown how important it is to understand the correlation between the returns of various flood management activities. The MPT approach therefore fits well with the notion of adaptive water management, which perceives the future as inherently uncertain. By applying MPT to flood protection strategies, current vulnerability can be reduced through diversification of risk.
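
    The core MPT calculation the paper builds on can be stated compactly: portfolio return is the weighted mean of the individual returns, while portfolio risk depends on the covariances, so imperfectly correlated measures reduce overall risk. A minimal sketch with invented returns, risks and correlations for four flood protection measures:

        # Sketch of the MPT calculation behind the paper's argument: portfolio
        # return is the weighted mean of measure returns, portfolio risk depends
        # on the covariance (correlation) between measures. All numbers invented.
        import numpy as np

        returns = np.array([0.05, 0.04, 0.06, 0.03])   # expected return per measure
        stdevs  = np.array([0.10, 0.08, 0.12, 0.05])   # risk per measure
        corr = np.array([[1.0, 0.2, 0.6, 0.1],
                         [0.2, 1.0, 0.3, 0.0],
                         [0.6, 0.3, 1.0, 0.2],
                         [0.1, 0.0, 0.2, 1.0]])
        cov = np.outer(stdevs, stdevs) * corr

        w = np.array([0.25, 0.25, 0.25, 0.25])         # equal-weight portfolio
        port_return = w @ returns
        port_risk = np.sqrt(w @ cov @ w)               # below mean stdev when imperfectly correlated
        print(port_return, port_risk, stdevs.mean())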

  7. Socio-hydrological flood models

    Science.gov (United States)

    Barendrecht, Marlies; Viglione, Alberto; Blöschl, Günter

    2017-04-01

    Long-term feedbacks between humans and floods may lead to complex phenomena such as coping strategies, levee effects, call effects, adaptation effects, and poverty traps. Such phenomena cannot be represented by traditional, scenario-based flood risk approaches. Instead, dynamic models of the coupled human-flood interactions are needed. These models should include social and hydrological variables as well as other relevant variables, such as economic, environmental, political or technical factors, in order to adequately represent the feedbacks and processes that matter in human-flood systems. Such socio-hydrological models may play an important role in integrated flood risk management by exploring a wider range of possible futures, including unexpected phenomena, than is possible by creating and studying scenarios. New insights might come to light about the long-term effects of certain measures on society and the natural system. Here we discuss a dynamic framework for flood risk and review the models presented in the literature. We propose a way forward for socio-hydrological modelling of the human-flood system.
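
    As a rough illustration of what such a dynamic coupled model looks like (the review itself covers far richer formulations), the toy loop below lets societal flood memory decay between events and rise after damage, with memory in turn lowering exposure; every equation and parameter here is invented:

        # Deliberately minimal sketch of a coupled human-flood feedback loop in
        # the spirit of socio-hydrological models: flood memory decays between
        # events, jumps after damage, and high memory lowers exposure.
        import random

        random.seed(1)
        memory, exposure = 0.1, 0.5
        for year in range(50):
            flood = random.random() < 0.05             # 5 % chance of a flood each year
            damage = exposure if flood else 0.0        # damage scales with current exposure
            memory = 0.9 * memory + damage             # memory decays, then jumps after floods
            exposure = max(0.1, 0.5 - 0.4 * min(memory, 1.0))  # vivid memory lowers exposure
        print(memory, exposure)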

  8. Accounting For Greenhouse Gas Emissions From Flooded ...

    Science.gov (United States)

    Nearly three decades of research has demonstrated that the inundation of rivers and terrestrial ecosystems behind dams can lead to enhanced rates of greenhouse gas emissions, particularly methane. The 2006 IPCC Guidelines for National Greenhouse Gas Inventories includes a methodology for estimating methane emissions from flooded lands, but the methodology was published as an appendix to be used as a ‘basis for future methodological development’ due to a lack of data. Since the 2006 Guidelines were published there has been a 6-fold increase in the number of peer-reviewed papers published on the topic, including reports from reservoirs in India, China, Africa, and Russia. Furthermore, several countries, including Iceland, Switzerland, and Finland, have developed country-specific methodologies for including flooded-lands methane emissions in their National Greenhouse Gas Inventories. This presentation will include a review of the literature on flooded-land methane emissions and approaches that have been used to upscale emissions for national inventories. We will also present ongoing research in the United States to develop a country-specific methodology. The research approaches include (1) an effort to develop predictive relationships between methane emissions and reservoir characteristics that are available in national databases, such as reservoir size and drainage area, and (2) a national-scale probabilistic survey of reservoir methane emissions. To inform th
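
    Research approach (1) above amounts to a regression of measured emissions on reservoir attributes held in national databases. A minimal sketch of that kind of predictive relationship, using a log-log least-squares fit on invented data:

        # Sketch of the kind of predictive relationship described: regressing
        # measured methane emissions on reservoir characteristics available in
        # national databases. All data below are invented for illustration.
        import numpy as np

        surface_km2 = np.array([1.0, 5.0, 20.0, 80.0, 300.0])    # reservoir surface area
        ch4_g_m2_yr = np.array([4.0, 6.5, 9.0, 14.0, 22.0])      # hypothetical CH4 fluxes

        # Fit log(flux) = a * log(area) + b by least squares
        a, b = np.polyfit(np.log(surface_km2), np.log(ch4_g_m2_yr), 1)

        def predict_flux(area_km2):
            """Predicted CH4 flux (g m-2 yr-1) for a reservoir of the given area."""
            return np.exp(b) * area_km2 ** a

        print(predict_flux(50.0))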

  9. Wetland restoration, flood pulsing, and disturbance dynamics

    Science.gov (United States)

    Middleton, Beth A.

    1999-01-01

    While it is generally accepted that flood pulsing and disturbance dynamics are critical to wetland viability, there is as yet no consensus among those responsible for wetland restoration about how best to plan for those phenomena or even whether it is really necessary to do so at all. In this groundbreaking book, Dr. Beth Middleton draws upon the latest research from around the world to build a strong case for making flood pulsing and disturbance dynamics integral to the wetland restoration planning process. While the initial chapters of the book are devoted to laying the conceptual foundations, most of the coverage is concerned with demonstrating the practical implications for wetland restoration and management of the latest ecological theory and research. It includes a fascinating case history section in which Dr. Middleton explores the restoration models used in five major North American, European, Australian, African, and Asian wetland projects, and analyzes their relative success from the perspective of flood pulsing and disturbance dynamics planning. Wetland Restoration also features a wealth of practical information useful to all those involved in wetland restoration and management, including: * A compendium of water level tolerances, seed germination, seedling recruitment, adult survival rates, and other key traits of wetland plant species * A bibliography of 1,200 articles and monographs covering all aspects of wetland restoration * A comprehensive directory of wetland restoration ftp sites worldwide * An extensive glossary of essential terms

  11. Big data and ophthalmic research.

    Science.gov (United States)

    Clark, Antony; Ng, Jonathon Q; Morlet, Nigel; Semmens, James B

    2016-01-01

    Large population-based health administrative databases, clinical registries, and data linkage systems are a rapidly expanding resource for health research. Ophthalmic research has benefited from the use of these databases in expanding the breadth of knowledge in areas such as disease surveillance, disease etiology, health services utilization, and health outcomes. Furthermore, the quantity of data available for research has increased exponentially in recent times, particularly as e-health initiatives come online in health systems across the globe. We review some big data concepts, the databases and data linkage systems used in eye research (including their advantages and limitations), the types of studies previously undertaken, and the future direction for big data in eye research.

  12. Big Numbers in String Theory

    CERN Document Server

    Schellekens, A N

    2016-01-01

    This paper contains some personal reflections on several computational contributions to what is now known as the "String Theory Landscape". It consists of two parts. The first part concerns the origin of big numbers, and especially the number $10^{1500}$ that appeared in work on the covariant lattice construction (with W. Lerche and D. Luest). This part contains some new results. I correct a huge but inconsequential error, discuss some more accurate estimates, and compare with the counting for free fermion constructions. In particular I prove that the latter only provide an exponentially small fraction of all even self-dual lattices for large lattice dimensions. The second part of the paper concerns dealing with big numbers, and contains some lessons learned from various vacuum scanning projects.

  13. The big wheels of ATLAS

    CERN Multimedia

    2006-01-01

    The ATLAS cavern is filling up at an impressive rate. The installation of the first of the big wheels of the muon spectrometer, a thin gap chamber (TGC) wheel, was completed in September. The muon spectrometer will include four big moving wheels at each end, each measuring 25 metres in diameter. Of the eight wheels in total, six will be composed of thin gap chambers for the muon trigger system and the other two will consist of monitored drift tubes (MDTs) to measure the position of the muons (see Bulletin No. 13/2006). The installation of the 688 muon chambers in the barrel is progressing well, with three-quarters of them already installed between the coils of the toroid magnet.

  14. Big Data for Precision Medicine

    Directory of Open Access Journals (Sweden)

    Daniel Richard Leff

    2015-09-01

    Full Text Available This article focuses on the potential impact of big data analysis to improve health, prevent and detect disease at an earlier stage, and personalize interventions. The role that big data analytics may have in interrogating the patient electronic health record toward improved clinical decision support is discussed. We examine developments in pharmacogenetics that have increased our appreciation of the reasons why patients respond differently to chemotherapy. We also assess the expansion of online health communications and the way in which this data may be capitalized on in order to detect public health threats and control or contain epidemics. Finally, we describe how a new generation of wearable and implantable body sensors may improve wellbeing, streamline management of chronic diseases, and improve the quality of surgical implants.

  15. Big Data where N=1

    DEFF Research Database (Denmark)

    Bardram, Jakob Eyvind

    2017-01-01

    Research on the use of ‘big data’ in health care has only just begun and may, in time, become a great help in organising more personalised and holistic health care for patients with multiple chronic conditions. Personal health technology, briefly presented in this chapter, holds great potential for carrying out ‘big data’ analyses for the individual person, that is, where N=1. There are major technological challenges in building technologies and methods for collecting and handling personal data that can be shared across systems in a standardised, responsible, robust, secure and non...

  16. Big Data in Transport Geography

    DEFF Research Database (Denmark)

    Reinau, Kristian Hegner; Agerholm, Niels; Lahrmann, Harry Spaabæk

    The emergence of new tracking technologies and Big Data has caused a transformation of the transport geography field in recent years. One new data type, which is starting to play a significant role in public transport, is smart card data. Despite the growing focus on smart card data, there is a need for studies that explicitly compare the quality of this new type of data to traditional data sources. With the current focus on Big Data in the transport field, public transport planners are increasingly looking towards smart card data to analyze and optimize flows of passengers. However, in many cases not all public transport passengers in a city, region or country with a smart card system actually use the system, and in such cases it is important to know what biases smart card data has in relation to giving a complete view of passenger flows. This paper therefore analyses the quality and biases...

  17. George and the big bang

    CERN Document Server

    Hawking, Lucy; Parsons, Gary

    2012-01-01

    George has problems. He has twin baby sisters at home who demand his parents’ attention. His beloved pig Freddy has been exiled to a farm, where he’s miserable. And worst of all, his best friend, Annie, has made a new friend whom she seems to like more than George. So George jumps at the chance to help Eric with his plans to run a big experiment in Switzerland that seeks to explore the earliest moment of the universe. But there is a conspiracy afoot, and a group of evildoers is planning to sabotage the experiment. Can George repair his friendship with Annie and piece together the clues before Eric’s experiment is destroyed forever? This engaging adventure features essays by Professor Stephen Hawking and other eminent physicists about the origins of the universe and ends with a twenty-page graphic novel that explains how the Big Bang happened—in reverse!

  18. From flood management systems to flood resilient systems: integration of flood resilient technologies

    Science.gov (United States)

    Salagnac, J.-L.; Diez, J.; Tourbier, J.

    2012-04-01

    Flooding has always been a major risk world-wide. Humans have chosen to live and develop settlements close to water (rivers, seas) because of the resources water brings: food, energy, the capacity to transport persons and goods economically, and recreation. However, the risk from flooding, including pluvial flooding, often offsets these huge advantages. Floods sometimes have terrible consequences from both a human and an economic point of view. The permanence and growth of urban areas in flood-prone zones despite these risks is a clear indication of the choices of the human groups concerned. The observed growing concentration of population along the sea shore, the increase of urban population worldwide, the exponential growth of the world population and possibly climate change are factors confirming that flood will remain a major issue for the next decades. Flood management systems are designed and implemented to cope with such situations. In spite of frequent events, lessons seem difficult to draw and progress is rather slow. The list of potential triggers for improving flood management systems is nevertheless well established: information, education, awareness raising, alert, prevention, protection, feedback from events, ... Many disciplines are concerned, covering a wide range of soft and hard sciences. A huge amount of both printed and electronic literature is available, and regulations are abundant. In spite of all these potentially favourable elements, similar questions spring up after each new significant event: • Was the event forecast precise enough? • Was the alert system efficient? • Why were buildings built in identified flood-prone areas? • Why did the concerned population not follow instructions? • Why did the dike break? • What should we do to prevent this from happening again? • What about damage evaluation, waste and debris evacuation, infrastructure and building repair, activity recovery, temporary relocation of inhabitants, health concerns, insurance

  19. Flooding Fragility Experiments and Prediction

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Curtis L. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Tahhan, Antonio [Idaho National Lab. (INL), Idaho Falls, ID (United States); Muchmore, Cody [Idaho National Lab. (INL), Idaho Falls, ID (United States); Nichols, Larinda [Idaho National Lab. (INL), Idaho Falls, ID (United States); Bhandari, Bishwo [Idaho National Lab. (INL), Idaho Falls, ID (United States); Pope, Chad [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-09-01

    This report describes the work that has been performed on flooding fragility, both the experimental tests being carried out and the probabilistic fragility predictive models being produced in order to use the test results. Flooding experiments involving full-scale doors have commenced in the Portal Evaluation Tank. The goal of these experiments is to develop a full-scale component flooding experiment protocol and to acquire data that can be used to create Bayesian regression models representing the fragility of these components. This work is in support of the Risk-Informed Safety Margin Characterization (RISMC) Pathway external hazards evaluation research and development.

  20. Big data processing with Hadoop

    OpenAIRE

    Wu, Shiqi

    2015-01-01

    Computing technology has changed the way we work, study, and live. The distributed data processing technology is one of the popular topics in the IT field. It provides a simple and centralized computing platform by reducing the cost of the hardware. The characteristics of distributed data processing technology have changed the whole industry. Hadoop, as the open source project of Apache foundation, is the most representative platform of distributed big data processing. The Hadoop distribu...

  1. Statistical Inference: The Big Picture.

    Science.gov (United States)

    Kass, Robert E

    2011-02-01

    Statistics has moved beyond the frequentist-Bayesian controversies of the past. Where does this leave our ability to interpret results? I suggest that a philosophy compatible with statistical practice, labelled here statistical pragmatism, serves as a foundation for inference. Statistical pragmatism is inclusive and emphasizes the assumptions that connect statistical models with observed data. I argue that introductory courses often mis-characterize the process of statistical inference and I propose an alternative "big picture" depiction.

  2. Adaptive flood risk management in urban areas

    NARCIS (Netherlands)

    Mees, H.L.P.; Driessen, P.P.J.; Runhaar, H.A.C.

    2012-01-01

    In recent times a shift has occurred from traditional flood management focused on the prevention of flooding (reduction of the probability) only, to more adaptive strategies focused on the reduction of the impacts of floods as a means to improve the resilience of occupied flood plains to increased r

  3. Safety in the Chemical Laboratory: Flood Control.

    Science.gov (United States)

    Pollard, Bruce D.

    1983-01-01

    Describes events leading to a flood in the Wehr Chemistry Laboratory at Marquette University, discussing steps taken to minimize damage upon discovery. Analyzes the problem of flooding in the chemical laboratory and outlines seven steps of flood control: prevention; minimization; early detection; stopping the flood; evaluation; clean-up; and…

  4. Local Flood Action Groups: Governance And Resilience

    NARCIS (Netherlands)

    Forrest, Steven; Trell, Elen-Maarja; Woltjer, Johan; Macoun, Milan; Maier, Karel

    2015-01-01

    A diverse range of citizen groups focusing on flood risk management have been identified in several European countries. The paper discusses the role of flood action (citizen) groups in the context of flood resilience and will do this by analysing the UK and its diverse range of flood groups. These c

  5. A framework for global river flood risk assessments

    Directory of Open Access Journals (Sweden)

    H. C. Winsemius

    2013-05-01

    Full Text Available There is an increasing need for strategic global assessments of flood risks in current and future conditions. In this paper, we propose a framework for global flood risk assessment for river floods, which can be applied in current conditions, as well as in future conditions due to climate and socio-economic changes. The framework's goal is to establish flood hazard and impact estimates at a high enough resolution to allow for their combination into a risk estimate, which can be used for strategic global flood risk assessments. The framework estimates hazard at a resolution of ~1 km2 using global forcing datasets of the current (or, in scenario mode, future) climate, a global hydrological model, a global flood-routing model, and, more importantly, an inundation downscaling routine. The second component of the framework combines hazard with flood impact models at the same resolution (e.g. damage, affected GDP, and affected population) to establish indicators for flood risk (e.g. annual expected damage, affected GDP, and affected population). The framework has been applied using the global hydrological model PCR-GLOBWB, which includes an optional global flood routing model DynRout, combined with scenarios from the Integrated Model to Assess the Global Environment (IMAGE). We performed downscaling of the hazard probability distributions to 1 km2 resolution with a new downscaling algorithm, applied on Bangladesh as a first case study application area. We demonstrate the risk assessment approach in Bangladesh based on GDP per capita data, population, and land use maps for 2010 and 2050. Validation of the hazard estimates has been performed using the Dartmouth Flood Observatory database. This was done by comparing a high return period flood with the maximum observed extent, as well as by comparing a time series of a single event with Dartmouth imagery of the event. Validation of modelled damage estimates was performed using observed damage estimates from

  6. The BigBOSS Experiment

    CERN Document Server

    Schlegel, D; Abraham, T; Ahn, C; Prieto, C Allende; Annis, J; Aubourg, E; Azzaro, M; Baltay, S Bailey C; Baugh, C; Bebek, C; Becerril, S; Blanton, M; Bolton, A; Bromley, B; Cahn, R; Carton, P -H; Cervantes-Cota, J L; Chu, Y; Cortes, M; Dawson, K; Dey, A; Dickinson, M; Diehl, H T; Doel, P; Ealet, A; Edelstein, J; Eppelle, D; Escoffier, S; Evrard, A; Faccioli, L; Frenk, C; Geha, M; Gerdes, D; Gondolo, P; Gonzalez-Arroyo, A; Grossan, B; Heckman, T; Heetderks, H; Ho, S; Honscheid, K; Huterer, D; Ilbert, O; Ivans, I; Jelinsky, P; Jing, Y; Joyce, D; Kennedy, R; Kent, S; Kieda, D; Kim, A; Kim, C; Kneib, J -P; Kong, X; Kosowsky, A; Krishnan, K; Lahav, O; Lampton, M; LeBohec, S; Brun, V Le; Levi, M; Li, C; Liang, M; Lim, H; Lin, W; Linder, E; Lorenzon, W; de la Macorra, A; Magneville, Ch; Malina, R; Marinoni, C; Martinez, V; Majewski, S; Matheson, T; McCloskey, R; McDonald, P; McKay, T; McMahon, J; Menard, B; Miralda-Escude, J; Modjaz, M; Montero-Dorta, A; Morales, I; Mostek, N; Newman, J; Nichol, R; Nugent, P; Olsen, K; Padmanabhan, N; Palanque-Delabrouille, N; Park, I; Peacock, J; Percival, W; Perlmutter, S; Peroux, C; Petitjean, P; Prada, F; Prieto, E; Prochaska, J; Reil, K; Rockosi, C; Roe, N; Rollinde, E; Roodman, A; Ross, N; Rudnick, G; Ruhlmann-Kleider, V; Sanchez, J; Sawyer, D; Schimd, C; Schubnell, M; Scoccimaro, R; Seljak, U; Seo, H; Sheldon, E; Sholl, M; Shulte-Ladbeck, R; Slosar, A; Smith, D S; Smoot, G; Springer, W; Stril, A; Szalay, A S; Tao, C; Tarle, G; Taylor, E; Tilquin, A; Tinker, J; Valdes, F; Wang, J; Wang, T; Weaver, B A; Weinberg, D; White, M; Wood-Vasey, M; Yang, J; Yeche, X Yang Ch; Zakamska, N; Zentner, A; Zhai, C; Zhang, P

    2011-01-01

    BigBOSS is a Stage IV ground-based dark energy experiment to study baryon acoustic oscillations (BAO) and the growth of structure with a wide-area galaxy and quasar redshift survey over 14,000 square degrees. It has been conditionally accepted by NOAO in response to a call for major new instrumentation and a high-impact science program for the 4-m Mayall telescope at Kitt Peak. The BigBOSS instrument is a robotically-actuated, fiber-fed spectrograph capable of taking 5000 simultaneous spectra over a wavelength range from 340 nm to 1060 nm, with a resolution R = 3000-4800. Using data from imaging surveys that are already underway, spectroscopic targets are selected that trace the underlying dark matter distribution. In particular, targets include luminous red galaxies (LRGs) up to z = 1.0, extending the BOSS LRG survey in both redshift and survey area. To probe the universe out to even higher redshift, BigBOSS will target bright [OII] emission line galaxies (ELGs) up to z = 1.7. In total, 20 million galaxy red...

  7. Big data: the management revolution.

    Science.gov (United States)

    McAfee, Andrew; Brynjolfsson, Erik

    2012-10-01

    Big data, the authors write, is far more powerful than the analytics of the past. Executives can measure and therefore manage more precisely than ever before. They can make better predictions and smarter decisions. They can target more-effective interventions in areas that so far have been dominated by gut and intuition rather than by data and rigor. The differences between big data and analytics are a matter of volume, velocity, and variety: More data now cross the internet every second than were stored in the entire internet 20 years ago. Nearly real-time information makes it possible for a company to be much more agile than its competitors. And that information can come from social networks, images, sensors, the web, or other unstructured sources. The managerial challenges, however, are very real. Senior decision makers have to learn to ask the right questions and embrace evidence-based decision making. Organizations must hire scientists who can find patterns in very large data sets and translate them into useful business information. IT departments have to work hard to integrate all the relevant internal and external sources of data. The authors offer two success stories to illustrate how companies are using big data: PASSUR Aerospace enables airlines to match their actual and estimated arrival times. Sears Holdings directly analyzes its incoming store data to make promotions much more precise and faster.

  8. Big Data Comes to School

    Directory of Open Access Journals (Sweden)

    Bill Cope

    2016-03-01

    Full Text Available The prospect of “big data” at once evokes optimistic views of an information-rich future and concerns about surveillance that adversely impacts our personal and private lives. This overview article explores the implications of big data in education, focusing by way of example on data generated by student writing. We have chosen writing because it presents particular complexities, highlighting the range of processes for collecting and interpreting evidence of learning in the era of computer-mediated instruction and assessment as well as the challenges. Writing is significant not only because it is central to the core subject area of literacy; it is also an ideal medium for the representation of deep disciplinary knowledge across a number of subject areas. After defining what big data entails in education, we map emerging sources of evidence of learning that separately and together have the potential to generate unprecedented amounts of data: machine assessments, structured data embedded in learning, and unstructured data collected incidental to learning activity. Our case is that these emerging sources of evidence of learning have significant implications for the traditional relationships between assessment and instruction. Moreover, for educational researchers, these data are in some senses quite different from traditional evidentiary sources, and this raises a number of methodological questions. The final part of the article discusses implications for practice in an emerging field of education data science, including publication of data, data standards, and research ethics.

  9. A Big Data Approach to Computational Creativity

    CERN Document Server

    Varshney, Lav R; Varshney, Kush R; Bhattacharjya, Debarun; Schoergendorfer, Angela; Chee, Yi-Min

    2013-01-01

    Computational creativity is an emerging branch of artificial intelligence that places computers in the center of the creative process. Broadly, creativity involves a generative step to produce many ideas and a selective step to determine the ones that are the best. Many previous attempts at computational creativity, however, have not been able to achieve a valid selective step. This work shows how bringing data sources from the creative domain and from hedonic psychophysics together with big data analytics techniques can overcome this shortcoming to yield a system that can produce novel and high-quality creative artifacts. Our data-driven approach is demonstrated through a computational creativity system for culinary recipes and menus we developed and deployed, which can operate either autonomously or semi-autonomously with human interaction. We also comment on the volume, velocity, variety, and veracity of data in computational creativity.

  10. Flood damage, vulnerability and risk perception - challenges for flood damage research

    OpenAIRE

    2005-01-01

    The current state-of-the-art in flood damage analysis mainly focuses on the economic evaluation of tangible flood effects. It is contended in this discussion paper that important economic, social and ecological aspects of flood-related vulnerabilities are neglected. It is a challenge for flood research to develop a wider perspective for flood damage evaluation.

  11. A Framework for Flood Risk Analysis and Benefit Assessment of Flood Control Measures in Urban Areas.

    Science.gov (United States)

    Li, Chaochao; Cheng, Xiaotao; Li, Na; Du, Xiaohe; Yu, Qian; Kan, Guangyuan

    2016-08-05

    Flood risk analysis is more complex in urban areas than in rural areas because of their closely packed buildings, different kinds of land use, and large numbers of flood control works and drainage systems. The purpose of this paper is to propose a practical framework for flood risk analysis and benefit assessment of flood control measures in urban areas. Based on the concept of the disaster risk triangle (hazard, vulnerability and exposure), a comprehensive analysis method and a general procedure were proposed for urban flood risk analysis. An Urban Flood Simulation Model (UFSM) and an Urban Flood Damage Assessment Model (UFDAM) were integrated to estimate the flood risk in the Pudong flood protection area (Shanghai, China). S-shaped functions were adopted to represent flood return period and damage (R-D) curves. The study results show that flood control works could significantly reduce the flood risk within the 66-year flood return period, reducing it by 15.59%. However, the flood risk was reduced by only 7.06% when the flood return period exceeded 66 years. Hence, it is difficult to meet the increasing demands for flood control by relying solely on structural measures. The R-D function is suitable for describing changes in flood control capacity. This framework can assess the flood risk reduction due to flood control measures and provide crucial information for strategy development and planning adaptation.
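
    The paper does not give its S-shaped R-D functions explicitly; one plausible form is a logistic curve in the logarithm of the return period, which saturates at a maximum damage. A sketch with assumed parameters (maximum damage, midpoint at the 66-year flood, slope), none of them taken from the study:

        # One plausible S-shaped return-period/damage (R-D) curve of the kind the
        # paper adopts: a logistic function of log return period. Parameters assumed.
        import math

        def rd_curve(T, d_max=100.0, T_mid=66.0, k=3.0):
            """Damage for return period T (years), saturating at d_max."""
            return d_max / (1.0 + math.exp(-k * (math.log(T) - math.log(T_mid))))

        for T in (10, 66, 100, 1000):
            print(T, round(rd_curve(T), 1))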

  12. Evaluation of the magnitude and frequency of floods in urban watersheds in Phoenix and Tucson, Arizona

    Science.gov (United States)

    Kennedy, Jeffrey R.; Paretti, Nicholas V.

    2014-01-01

    Flooding in urban areas routinely causes severe damage to property and often results in loss of life. To investigate the effect of urbanization on the magnitude and frequency of flood peaks, a flood frequency analysis was carried out using data from urbanized streamgaging stations in Phoenix and Tucson, Arizona. Flood peaks at each station were predicted using the log-Pearson Type III distribution, fitted using the expected moments algorithm and the multiple Grubbs-Beck low outlier test. The station estimates were then compared to flood peaks estimated by rural-regression equations for Arizona, and to flood peaks adjusted for urbanization using a previously developed procedure for adjusting U.S. Geological Survey rural regression peak discharges in an urban setting. Only smaller, more common flood peaks at the 50-, 20-, 10-, and 4-percent annual exceedance probabilities (AEPs) demonstrate any increase in magnitude as a result of urbanization; the 1-, 0.5-, and 0.2-percent AEP flood estimates are predicted without bias by the rural-regression equations. Percent imperviousness was determined not to account for the difference in estimated flood peaks between stations, either when adjusting the rural-regression equations or when deriving urban-regression equations to predict flood peaks directly from basin characteristics. Comparison with urban adjustment equations indicates that flood peaks are systematically overestimated if the rural-regression-estimated flood peaks are adjusted upward to account for urbanization. At nearly every streamgaging station in the analysis, adjusted rural-regression estimates were greater than the estimates derived using station data. One likely reason for the lack of increase in flood peaks with urbanization is the presence of significant stormwater retention and detention structures within the watershed used in the study.
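
    A simplified method-of-moments version of the log-Pearson Type III station estimate (the study itself uses the expected moments algorithm and the multiple Grubbs-Beck low-outlier test, which are not reproduced here) can be sketched as follows, with invented peak discharges:

        # Simplified log-Pearson Type III quantile estimate by the method of
        # moments. Annual peak discharges below are invented for illustration.
        import numpy as np
        from scipy import stats

        peaks = np.array([120., 340., 95., 510., 230., 180., 760., 150., 290., 410.])
        logq = np.log10(peaks)
        m, s, g = logq.mean(), logq.std(ddof=1), stats.skew(logq, bias=False)

        # 1-percent AEP (100-year) flood from the standardized Pearson III quantile
        k = stats.pearson3.ppf(0.99, g)
        q100 = 10 ** (m + k * s)
        print(round(q100, 1))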

  13. Coupling Modelling of Urban Development and Flood Risk – An Attempt for a Combined Software Framework

    DEFF Research Database (Denmark)

    Löwe, Roland; Sto Domingo, Nina; Urich, Christian;

    2015-01-01

    to use the results of the hydraulic simulation to condition DANCE4WATER and to account for flood risk in the simulated urban development. In an Australian case study, we demonstrate that future flood risk can be significantly reduced while maintaining the overall speed of urban development....

  14. Sequential planning of flood protection infrastructure under limited historic flood record and climate change uncertainty

    Science.gov (United States)

    Dittes, Beatrice; Špačková, Olga; Straub, Daniel

    2017-04-01

    Flood protection is often designed to safeguard people and property following regulations and standards, which specify a target design flood protection level, such as the 100-year flood level prescribed in Germany (DWA, 2011). In practice, the magnitude of such an event is only known within a range of uncertainty, which is caused by limited historic records and uncertain climate change impacts, among other factors (Hall & Solomatine, 2008). As more observations and improved climate projections become available in the future, the design flood estimate changes and the capacity of the flood protection may be deemed insufficient at a future point in time. This problem can be mitigated by the implementation of flexible flood protection systems (that can easily be adjusted in the future) and/or by adding an additional reserve to the flood protection, i.e. by applying a safety factor to the design. But how high should such a safety factor be? And how much should the decision maker be willing to pay to make the system flexible, i.e. what is the Value of Flexibility (Špačková & Straub, 2017)? We propose a decision model that identifies cost-optimal decisions on flood protection capacity in the face of uncertainty (Dittes et al. 2017). It considers sequential adjustments of the protection system during its lifetime, taking into account its flexibility. The proposed framework is based on pre-posterior Bayesian decision analysis, using Decision Trees and Markov Decision Processes, and is fully quantitative. It can include a wide range of uncertainty components such as uncertainty associated with limited historic record or uncertain climate or socio-economic change. It is shown that since flexible systems are less costly to adjust when flood estimates are changing, they justify initially lower safety factors. Investigation on the Value of Flexibility (VoF) demonstrates that VoF depends on the type and degree of uncertainty, on the learning effect (i.e. kind and quality of
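
    A stripped-down two-stage version of this decision problem illustrates the Value of Flexibility: build now, then possibly upgrade once the design flood estimate is revised upward; a flexible design costs more initially but is cheaper to adjust. All probabilities and costs below are invented and the full framework is far richer:

        # Toy two-stage version of the decision problem: build now, then possibly
        # upgrade if the design flood estimate is revised upward. The Value of
        # Flexibility is the gap in expected lifetime cost. All numbers invented.
        p_revise_up = 0.3            # probability the design flood is revised upward

        def expected_cost(build, upgrade):
            return build + p_revise_up * upgrade

        rigid    = expected_cost(build=100.0, upgrade=80.0)   # costly retrofit
        flexible = expected_cost(build=110.0, upgrade=20.0)   # cheap adjustment
        print(rigid, flexible, "VoF =", rigid - flexible)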

  15. Direct local building inundation depth determination in 3-D point clouds generated from user-generated flood images

    Science.gov (United States)

    Griesbaum, Luisa; Marx, Sabrina; Höfle, Bernhard

    2017-07-01

    In recent years, the number of people affected by flooding caused by extreme weather events has increased considerably. In order to provide support in disaster recovery or to develop mitigation plans, accurate flood information is necessary. Particularly pluvial urban floods, characterized by high temporal and spatial variations, are not well documented. This study proposes a new, low-cost approach to determining local flood elevation and inundation depth of buildings based on user-generated flood images. It first applies close-range digital photogrammetry to generate a geo-referenced 3-D point cloud. Second, based on estimated camera orientation parameters, the flood level captured in a single flood image is mapped to the previously derived point cloud. The local flood elevation and the building inundation depth can then be derived automatically from the point cloud. The proposed method is carried out once for each of 66 different flood images showing the same building façade. An overall accuracy of 0.05 m with an uncertainty of ±0.13 m for the derived flood elevation within the area of interest as well as an accuracy of 0.13 m ± 0.10 m for the determined building inundation depth is achieved. Our results demonstrate that the proposed method can provide reliable flood information on a local scale using user-generated flood images as input. The approach can thus allow inundation depth maps to be derived even in complex urban environments with relatively high accuracies.
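
    The final step of the proposed method reduces to simple differencing: once the water line visible in the image has been mapped into the geo-referenced point cloud, the flood elevation is the (averaged) elevation of those water-line points, and the inundation depth is its difference from the ground elevation at the building. A sketch with invented elevations:

        # Sketch of the final step: local flood elevation and building inundation
        # depth by differencing elevations in the geo-referenced point cloud.
        # All values below are invented for illustration.
        import numpy as np

        waterline_z = np.array([210.42, 210.47, 210.44])  # mapped water-line points (m a.s.l.)
        ground_z_at_facade = 210.05                       # terrain elevation at the wall base

        flood_elevation = waterline_z.mean()
        inundation_depth = flood_elevation - ground_z_at_facade
        print(round(flood_elevation, 2), round(inundation_depth, 2))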

  16. City-scale accessibility of emergency responders operating during flood events

    Science.gov (United States)

    Green, Daniel; Yu, Dapeng; Pattison, Ian; Wilby, Robert; Bosher, Lee; Patel, Ramila; Thompson, Philip; Trowell, Keith; Draycon, Julia; Halse, Martin; Yang, Lili; Ryley, Tim

    2017-01-01

    Emergency responders often have to operate and respond to emergency situations during dynamic weather conditions, including floods. This paper demonstrates a novel method using existing tools and datasets to evaluate emergency responder accessibility during flood events within the city of Leicester, UK. Accessibility was quantified using the 8 and 10 min legislative targets for emergency provision for the ambulance service and the fire and rescue service respectively, under "normal" no-flood conditions as well as flood scenarios of various magnitudes (1 in 20-year, 1 in 100-year and 1 in 1000-year recurrence intervals), with both surface water and fluvial flood conditions considered. Flood restrictions were derived from previous hydrodynamic inundation modelling and input into a Network Analysis framework as restrictions for surface water and fluvial flood events. Surface water flooding was shown to cause more disruption to emergency responders operating within the city than fluvial flood events of comparable magnitude, due to its widespread and spatially distributed footprint. Fire and rescue 10 min accessibility was shown to decrease from 100 % under no-flood conditions to 66.5, 39.8 and 26.2 % under the 1 in 20-year, 1 in 100-year and 1 in 1000-year surface water flood scenarios respectively. Furthermore, total inaccessibility was shown to increase with flood magnitude, from 6.0 % under the 1 in 20-year scenario to 31.0 % under the 1 in 100-year flood scenario. Additionally, the evolution of emergency service accessibility throughout a surface water flood event is outlined, demonstrating the rapid impact on emergency service accessibility within the first 15 min of the event, with a reduction in service coverage and overlap observed for the ambulance service during a 1 in 100-year flood event. The study provides evidence to guide strategic planning for decision makers prior to and during emergency response to flood events at the city scale.
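
    The accessibility computation described here can be sketched as a shortest-path problem on a road network from which flooded links have been removed, keeping only destinations reachable within the legislative time target. A minimal example using the networkx library, with an invented network, travel times and flooded links:

        # Sketch of the accessibility computation: remove flooded road links,
        # then find which nodes a fire station can reach within the 10 min
        # target. Network, travel times and flooded links are all invented.
        import networkx as nx

        G = nx.Graph()
        G.add_weighted_edges_from([          # travel time in minutes
            ("station", "a", 3), ("a", "b", 4), ("b", "c", 5),
            ("station", "d", 6), ("d", "c", 2),
        ])

        flooded = {("a", "b")}               # links impassable during the flood
        H = G.copy()
        H.remove_edges_from(flooded)

        reach = nx.single_source_dijkstra_path_length(H, "station", cutoff=10, weight="weight")
        print(sorted(reach))                 # nodes reachable within 10 minutes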

  17. Urban flood risk assessment using sewer flooding databases.

    Science.gov (United States)

    Caradot, Nicolas; Granger, Damien; Chapgier, Jean; Cherqui, Frédéric; Chocat, Bernard

    2011-01-01

    Sustainable water management is a global challenge for the 21st century. One key aspect remains protection against urban flooding. The main objective is to ensure or maintain an adequate level of service for all inhabitants. However, level of service is still difficult to assess and the high-risk locations difficult to identify. In this article, we propose a methodology, which (i) allows water managers to measure the service provided by the urban drainage system with regard to protection against urban flooding; and (ii) helps stakeholders to determine effective strategies for improving the service provided. One key aspect of this work is to use a database of sewer flood event records to assess flood risk. Our methodology helps urban water managers to assess the risk of sewer flooding; this approach does not seek to predict flooding but rather to inform decision makers on the current level of risk and on actions which need to be taken to reduce the risk. This work is based on a comprehensive definition of risk, including territorial vulnerability and perceptions of urban water stakeholders. This paper presents the results and the methodological contributions from implementing the methodology on two case studies: the cities of Lyon and Mulhouse.

  18. Flood Progression Modelling and Impact Analysis

    DEFF Research Database (Denmark)

    Mioc, Darka; Anton, François; Nickerson, B.

    People living in the lower valley of the St. John River, New Brunswick, Canada, frequently experience flooding when the river overflows its banks during spring ice melt and rain. To better prepare the population of New Brunswick for extreme flooding, we developed a new flood prediction model that computes floodplain polygons before the flood occurs. This allows emergency managers to assess the impact of the flood before it occurs and to make early decisions on evacuation of the population and flood rescue. This research shows that the use of GIS and LiDAR technologies combined with hydrological modelling can significantly improve the decision making and visualization of flood impact needed for emergency planning and flood rescue. Furthermore, the 3D GIS application we developed for modelling flooded buildings and infrastructure provides a better platform for modelling and visualizing flood...

  19. Smoky River coal flood risk mapping study

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2004-06-01

    The Canada-Alberta Flood Damage Reduction Program (FDRP) is designed to reduce flood damage by identifying areas susceptible to flooding and by encouraging application of suitable land use planning, zoning, and flood preparedness and proofing. The purpose of this study is to define flood risk and floodway limits along the Smoky River near the former Smoky River Coal (SRC) plant. Alberta Energy has been responsible for the site since the mine and plant closed in 2000. The study describes flooding history, available data, features of the river and valley, calculation of flood levels, and floodway determination, and includes flood risk maps. The HEC-RAS program is used for the calculations. The flood risk area was calculated using the 1:100 year return period flood as the hydrological event. 7 refs., 11 figs., 7 tabs., 3 apps.

  20. Characterisation of seasonal flood types according to timescales in mixed probability distributions

    Science.gov (United States)

    Fischer, Svenja; Schumann, Andreas; Schulte, Markus

    2016-08-01

    When flood statistics are based on annual maximum series (AMS), the sample often contains flood peaks that differ in their genesis. If the ratios among event types change over the range of observations, the extrapolation of a probability distribution function (pdf) can be dominated by a majority of events that belong to a certain flood type. If this type is not typical for extraordinarily large extremes, such an extrapolation of the pdf is misleading. To avoid this breach of the assumption of homogeneity, seasonal models have been developed that distinguish between winter and summer floods. We show that a distinction between summer and winter floods is not always sufficient if seasonal series include events with different geneses. Here, we differentiate floods by their timescales into groups of long and short events, and present a statistical method for making this distinction. To demonstrate its applicability, timescales for winter and summer floods in a German river basin were estimated. It is shown that summer floods can be separated into two main groups, but in our study region the sample of winter floods consists of at least three different flood types. The pdfs of the two groups of summer floods are combined via a new mixing model. This model accounts for the fact that characterising parallel events by their maximum values alone is incomplete, because some of the realisations are overlaid. A statistical method resulting in an amendment of the statistical parameters is proposed. The application in a German case study demonstrates the advantages of the new model, with specific emphasis on flood types.
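
    The classical relation underlying such mixing models (which the paper's new model refines to deal with overlaid events) is that, for independent flood types, the annual maximum over all types has a cumulative distribution equal to the product of the per-type distributions, F(x) = F1(x) · F2(x). A sketch with invented Gumbel parameters for short and long events:

        # Classical mixing relation for independent flood types: the CDF of the
        # annual maximum over both types is the product of the per-type CDFs.
        # Gumbel parameters below are invented for illustration.
        from scipy.stats import gumbel_r

        F1 = lambda x: gumbel_r.cdf(x, loc=100, scale=30)   # e.g. short (flash) floods
        F2 = lambda x: gumbel_r.cdf(x, loc=140, scale=20)   # e.g. long (snowmelt) floods

        x = 250.0
        F = F1(x) * F2(x)
        print(F, 1.0 / (1.0 - F))   # non-exceedance probability and return period of x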