WorldWideScience

Sample records for demonstration flood big

  1. Thirty Years Later: Reflections of the Big Thompson Flood, Colorado, 1976 to 2006

    Science.gov (United States)

    Jarrett, R. D.; Costa, J. E.; Brunstein, F. C.; Quesenberry, C. A.; Vandas, S. J.; Capesius, J. P.; O'Neill, G. B.

    2006-12-01

    Thirty years ago, over 300 mm of rain fell in about 4 to 6 hours in the middle reaches of the Big Thompson River Basin during the devastating flash flood on July 31, 1976. The rainstorm produced flood discharges that exceeded 40 m3/s/km2. A peak discharge of 883 m3/s was estimated at the Big Thompson River near Drake streamflow-gaging station. The raging waters left 144 people dead, 250 injured, and over 800 people were evacuated by helicopter. Four-hundred eighteen homes and businesses were destroyed, as well as 438 automobiles, and damage to infrastructure left the canyon reachable only via helicopter. Total damage was estimated in excess of $116 million (2006 dollars). Natural hazards similar to the Big Thompson flood are rare, but the probability of a similar event hitting the Front Range, other parts of Colorado, or other parts of the Nation is real. Although much smaller in scale than the Big Thompson flood, several flash floods have happened during the monsoon in early July 2006 in the Colorado foothills that reemphasized the hazards associated with flash flooding. The U.S. Geological Survey (USGS) conducts flood research to help understand and predict the magnitude and likelihood of large streamflow events such as the Big Thompson flood. A summary of hydrologic conditions of the 1976 flood, what the 1976 flood can teach us about flash floods, a description of some of the advances in USGS flood science as a consequence of this disaster, and lessons that we learned to help reduce loss of life from this extraordinary flash flood are discussed. In the 30 years since the Big Thompson flood, there have been important advances in streamflow monitoring and flood warning. The National Weather Service (NWS) NEXRAD radar allows real-time monitoring of precipitation in most places in the United States. The USGS currently (2006) operates about 7,250 real-time streamflow-gaging stations in the United States that are monitored by the USGS, the NWS, and emergency managers
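
    The figures quoted above can be tied together with a short unit-conversion exercise. The following Python sketch is an illustrative aside added here, not part of the original fact sheet; it only converts the unit-discharge and rainfall figures given in the abstract into comparable per-hour depths.

      # Hedged, purely illustrative unit conversions based on figures quoted in the abstract.
      Q_UNIT = 40.0                              # unit discharge, m3/s per km2 (from the abstract)
      RAIN_MM, DURATIONS_H = 300.0, (4.0, 6.0)   # rainfall total and duration range (from the abstract)

      # 1 m3/s spread over 1 km2 (1e6 m2) is 1e-6 m/s of water depth; convert to mm/h.
      runoff_mm_per_h = Q_UNIT * 1.0e-6 * 1000.0 * 3600.0
      print(f"{Q_UNIT} m3/s/km2 is equivalent to about {runoff_mm_per_h:.0f} mm/h of runoff per unit area")

      for hours in DURATIONS_H:
          print(f"{RAIN_MM:.0f} mm of rain in {hours:.0f} h is an average intensity of {RAIN_MM / hours:.0f} mm/h")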

  2. Assessment of big floods in the Eastern Black Sea Basin of Turkey.

    Science.gov (United States)

    Yüksek, Ömer; Kankal, Murat; Üçüncü, Osman

    2013-01-01

    In this study, general knowledge and some details of the floods in the Eastern Black Sea Basin of Turkey are presented. A brief hydro-meteorological analysis of nine selected floods and a detailed analysis of the greatest flood are given. In the studied area, 51 big floods took place between 1955 and 2005, causing 258 deaths and nearly US $500,000,000 of damage. Most of the floods occurred in June, July and August. It is concluded that, especially for the rainstorms that caused significant damage, the return periods of the rainfall depths and the resultant flood discharges reached up to 250 and 500 years, respectively. A general agreement is observed between the return periods of the rains and the resultant floods. It is concluded that there has been no significant climate change that would cause increases in flood damage. The most important human factors increasing the damage are improper and illegal land use, deforestation, poor urbanization and settlement, and psychological and technical factors. Some structural and non-structural measures to mitigate flood damage are also included in the paper. Structural measures include dykes and flood levees. Main non-structural measures include a flood warning system, modification of land use, watershed management and improvement, flood insurance, organization of flood management studies, coordination between related institutions, and education and informing of the people and stakeholders.
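
    The return periods cited in this abstract come from frequency analysis of annual maxima. As a hedged illustration only (the paper's data and exact method are not reproduced here), the Python sketch below fits a Gumbel (EV1) distribution to a synthetic sample of annual-maximum rainfall by the method of moments and converts a chosen rainfall depth into an estimated return period.

      import math
      import statistics

      def gumbel_fit(annual_maxima):
          """Method-of-moments Gumbel (EV1) parameters for a sample of annual maxima."""
          mean = statistics.mean(annual_maxima)
          std = statistics.stdev(annual_maxima)
          beta = math.sqrt(6.0) * std / math.pi   # scale parameter
          mu = mean - 0.5772 * beta               # location parameter (Euler-Mascheroni constant)
          return mu, beta

      def return_period(x, mu, beta):
          """Return period T (years) of an annual maximum x under the fitted Gumbel CDF."""
          cdf = math.exp(-math.exp(-(x - mu) / beta))
          return 1.0 / (1.0 - cdf)

      # Synthetic annual-maximum daily rainfall (mm); NOT data from the Eastern Black Sea Basin.
      sample = [62, 75, 58, 90, 71, 66, 103, 80, 69, 95, 77, 88, 60, 110, 73]
      mu, beta = gumbel_fit(sample)
      for depth_mm in (120, 150, 180):
          print(f"{depth_mm} mm -> estimated return period of roughly {return_period(depth_mm, mu, beta):.0f} years")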

  3. WBP: The wood Brazilian BIG-GT demonstration project

    Energy Technology Data Exchange (ETDEWEB)

    Carpentieri, E. [Companhia Hidro Eletrica do Sao Francisco, Recife (Brazil)]

    1993-12-31

    Brazil is one of the leading countries in the use of renewable energy. Most of its electricity comes from hydro power, about 200,000 barrels a day of ethanol from sugar cane are used as fuel, and around 38% of pig iron and 20% of steel production use charcoal as a reducing agent. Located in the tropics, with the sun shining all year round, and with its vast territory, the country may be regarded as having all the basic conditions to develop a modern Biomass for Electricity industry. The conjunction of those characteristics with the necessity of developing new energy resources for electricity production in the Northeast of the country, the results of the studies made by Princeton University, Shell and Chesf, the progress achieved by the BIG-GT (Biomass Integrated Gasification Gas Turbine) technology in Europe, and the organization of the Global Environment Facility (GEF) provided the unique opportunity for the implementation of a commercial demonstration in Brazil. This paper describes the idea, the scope, the technical challenges, and the current status of development of the WBP, a project which aims to demonstrate the commercial viability of the BIG-GT technology. It also highlights the project management structure, the role of the GEF, the World Bank and the United Nations Development Program (UNDP), and the participation of the Brazilian Federal Government, through the Ministry of Science and Technology (MCT). Finally, it describes the participants (ELETROBRAS, CVRD, CIENTEC, SHELL, and CHESF), their roles in the project, and how the group was formed and operates.

  4. PROCESSING BIG REMOTE SENSING DATA FOR FAST FLOOD DETECTION IN A DISTRIBUTED COMPUTING ENVIRONMENT

    Directory of Open Access Journals (Sweden)

    A. Olasz

    2017-07-01

    The Earth observation (EO) missions of the space agencies and the space industry (ESA, NASA, national and commercial companies) are evolving as never before. These missions aim to develop and launch next-generation series of satellites and sensors and often provide huge amounts of data, even free of charge, to enable novel monitoring services. The wider geospatial sector faces new challenges in storing, processing and visualizing these geospatial data, which reach the level of Big Data in volume, variety and velocity, along with the need for multi-source spatio-temporal geospatial data processing. Handling and analysis of remote sensing data has always been a cumbersome task due to the ever-increasing size and frequency of the collected information. This paper presents the achievements of the IQmulus EU FP7 research and development project with respect to the processing and analysis of geospatial big data in the context of flood and waterlogging detection.
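
    As a hedged sketch of the kind of distributed processing the abstract refers to, the Python snippet below splits water detection across CPU cores with a simple NDWI threshold applied tile by tile. The threshold, tile size, and synthetic tiles are placeholders; this is a generic pattern, not the IQmulus project's actual processing chain.

      # Tile-based water detection run in parallel across worker processes (illustrative only).
      from multiprocessing import Pool
      import numpy as np

      NDWI_THRESHOLD = 0.2   # assumed threshold; real projects calibrate this per scene

      def detect_water(tile):
          """Return the fraction of water pixels in one (green, nir) reflectance tile."""
          green, nir = tile
          ndwi = (green - nir) / np.clip(green + nir, 1e-6, None)   # normalized difference water index
          return float((ndwi > NDWI_THRESHOLD).mean())

      def make_synthetic_tile(seed):
          """Stand-in for reading one raster tile from distributed storage."""
          rng = np.random.default_rng(seed)
          green = rng.uniform(0.05, 0.4, size=(512, 512))
          nir = rng.uniform(0.05, 0.4, size=(512, 512))
          return green, nir

      if __name__ == "__main__":
          tiles = [make_synthetic_tile(i) for i in range(8)]
          with Pool() as pool:                     # distribute tiles across CPU cores
              water_fractions = pool.map(detect_water, tiles)
          print([round(f, 3) for f in water_fractions])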

  5. The ordered network structure and its prediction for the big floods of the Changjiang River Basins

    Energy Technology Data Exchange (ETDEWEB)

    Men, Ke-Pei; Zhao, Kai; Zhu, Shu-Dan [Nanjing Univ. of Information Science and Technology, Nanjing (China). College of Mathematics and Statistics

    2013-12-15

    According to the latest statistical data of hydrology, a total of 21 floods took place over the Changjiang (Yangtze) River Basins from 1827 to 2012 and showed an obvious commensurable orderliness. Guided by the information forecasting theory of Wen-Bo Weng and based on previous research results, we combine ordered analysis with complex network technology to summarize the ordered network structure of the Changjiang floods, supplement new information, further optimize the networks, construct 2D and 3D ordered network structures, and carry out prediction research. Predictions show that future big deluges will probably occur over the Changjiang River Basin around 2013-2014, 2020-2021, 2030, 2036, 2051, and 2058. (orig.)

  6. The ordered network structure and its prediction for the big floods of the Changjiang River Basins

    International Nuclear Information System (INIS)

    Men, Ke-Pei; Zhao, Kai; Zhu, Shu-Dan

    2013-01-01

    According to the latest statistical data of hydrology, a total of 21 floods took place over the Changjiang (Yangtze) River Basins from 1827 to 2012 and showed an obvious commensurable orderliness. Guided by the information forecasting theory of Wen-Bo Weng and based on previous research results, we combine ordered analysis with complex network technology to summarize the ordered network structure of the Changjiang floods, supplement new information, further optimize the networks, construct 2D and 3D ordered network structures, and carry out prediction research. Predictions show that future big deluges will probably occur over the Changjiang River Basin around 2013-2014, 2020-2021, 2030, 2036, 2051, and 2058. (orig.)

  7. From Big Data to Small Transportable Products for Decision Support for Floods in Namibia

    Science.gov (United States)

    Mandl, D.; Frye, S.; Cappelaere, P.; Policelli, F.; Handy, M.; Sohlberg, R. A.; Grossman, R.

    2013-12-01

    During the past four years, a team from NASA, Oklahoma University, the University of Maryland and the University of Chicago, in collaboration with the Namibia Hydrological Services (NHS), has explored ways to provide decision support products for floods. The products draw on a variety of data, including a hydrological model, ground measurements such as river gauges, and Earth remote sensing data. This poster or presentation highlights the lessons learned in acquiring, storing, and managing big data on the cloud and turning them into relevant products for GEOSS users. Technology that has been explored includes the use of Hadoop/MapReduce and Accumulo to process and manage the large data sets. OpenStreetMap was explored for use in cataloging water boundaries and enabling collaborative mapping of the base water mask and floods. A Flood Dashboard was created to customize displays of various data products. Finally, a higher-level Geo-Social Application Processing Interface (API) was developed so that users can discover and dynamically generate products for their specific needs/societal benefit areas and then share them with their Community of Practice over social networks. Results of this experiment have included a 100x reduction in the size of some flood products, making it possible to distribute these products to mobile platforms and/or bandwidth-limited users.
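
    The "100x reduction in size" mentioned above can be illustrated, in a hedged way, by compacting a binary flood mask before distribution. The run-length encoding below is a generic stand-in chosen for brevity; the project's actual product formats and Geo-Social API are not reproduced here.

      # Illustrative shrinking of a flood product for bandwidth-limited users (not the actual format).
      import numpy as np

      def run_length_encode(mask):
          """Encode a flattened 0/1 flood mask as (value, run_length) pairs."""
          flat = mask.ravel()
          change = np.flatnonzero(np.diff(flat)) + 1
          starts = np.concatenate(([0], change))
          ends = np.concatenate((change, [flat.size]))
          return [(int(flat[s]), int(e - s)) for s, e in zip(starts, ends)]

      # Synthetic 1000 x 1000 mask with one contiguous flooded block.
      mask = np.zeros((1000, 1000), dtype=np.uint8)
      mask[300:450, 200:700] = 1

      runs = run_length_encode(mask)
      raw_bytes = mask.size          # 1 byte per pixel, uncompressed
      rle_bytes = len(runs) * 8      # rough size assuming two 4-byte integers per run
      print(f"raw: {raw_bytes} bytes, RLE: {rle_bytes} bytes, reduction of roughly {raw_bytes / rle_bytes:.0f}x")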

  8. Flood-inundation maps for a 12.5-mile reach of Big Papillion Creek at Omaha, Nebraska

    Science.gov (United States)

    Strauch, Kellan R.; Dietsch, Benjamin J.; Anderson, Kayla J.

    2016-03-22

    Digital flood-inundation maps for a 12.5-mile reach of the Big Papillion Creek from 0.6 mile upstream from the State Street Bridge to the 72nd Street Bridge in Omaha, Nebraska, were created by the U.S. Geological Survey (USGS) in cooperation with the Papio-Missouri River Natural Resources District. The flood-inundation maps, which can be accessed through the USGS Flood Inundation Mapping Science Web site at http://water.usgs.gov/osw/flood_inundation/, depict estimates of the areal extent and depth of flooding corresponding to selected water levels (stages) at the USGS streamgage on the Big Papillion Creek at Fort Street at Omaha, Nebraska (station 06610732). Near-real-time stages at this streamgage may be obtained on the Internet from the USGS National Water Information System at http://waterdata.usgs.gov/ or the National Weather Service Advanced Hydrologic Prediction Service at http://water.weather.gov/ahps/, which also forecasts flood hydrographs at this site.
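
    Conceptually, such map libraries are used by matching an observed or forecast stage at the reference streamgage to the nearest pre-computed inundation layer. The sketch below is a hedged illustration of that lookup only; the stage values and layer names are hypothetical and are not those published for station 06610732, and no live USGS or NWS service is queried.

      # Pick the pre-computed inundation map for an observed gage stage (illustrative values only).
      from bisect import bisect_left

      # (reference stage in feet, map layer name) -- hypothetical example library
      MAP_LIBRARY = [
          (18.0, "inundation_stage_18ft"),
          (20.0, "inundation_stage_20ft"),
          (22.0, "inundation_stage_22ft"),
          (24.0, "inundation_stage_24ft"),
          (26.0, "inundation_stage_26ft"),
      ]

      def select_map(observed_stage_ft):
          """Return the first library stage at or above the observed stage (a conservative choice)."""
          stages = [stage for stage, _ in MAP_LIBRARY]
          i = bisect_left(stages, observed_stage_ft)
          if i == len(stages):
              return MAP_LIBRARY[-1][1]     # above the library range: fall back to the highest map
          return MAP_LIBRARY[i][1]

      for stage_ft in (17.2, 21.3, 27.0):
          print(stage_ft, "->", select_map(stage_ft))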

  9. The geomorphic effectiveness of a large flood on the Rio Grande in the Big Bend region: insights on geomorphic controls and post-flood geomorphic response

    Science.gov (United States)

    Dean, David J.; Schmidt, John C.

    2013-01-01

    Since the 1940s, the Rio Grande in the Big Bend region has undergone long periods of channel narrowing, which have been occasionally interrupted by rare, large floods that widen the channel (termed a channel reset). The most recent channel reset occurred in 2008 following a 17-year period of extremely low stream flow and rapid channel narrowing. Flooding was caused by precipitation associated with the remnants of tropical depression Lowell in the Rio Conchos watershed, the largest tributary to the Rio Grande. Floodwaters approached 1500 m3/s (between a 13- and 15-year recurrence interval) and breached levees, inundated communities, and flooded the alluvial valley of the Rio Grande; the wetted width exceeded 2.5 km in some locations. The 2008 flood had the 7th largest magnitude of record; however, it conveyed a larger volume of water than any other flood. Because of the narrow pre-flood channel conditions, record flood stages occurred. We used pre- and post-flood aerial photographs, channel and floodplain surveys, and 1-dimensional hydraulic models to quantify the magnitude of channel change, investigate the controls of flood-induced geomorphic changes, and measure the post-flood response of the widened channel. These analyses show that geomorphic changes included channel widening, meander migration, avulsions, extensive bar formation, and vertical floodplain accretion. Reach-averaged channel widening of between 26 and 52% occurred, but in some localities it exceeded 500%. The degree and style of channel response were related to, but not limited to, three factors: 1) bed-load supply and transport, 2) pre-flood channel plan form, and 3) rapid declines in specific stream power downstream of constrictions and areas of high channel bed slope. The post-flood channel response has consisted of channel contraction through the aggradation of the channel bed and the formation of fine-grained benches inset within the widened channel margins. The most significant post-flood geomorphic

  10. Coastal Flooding in Florida's Big Bend Region with Application to Sea Level Rise Based on Synthetic Storms Analysis

    Directory of Open Access Journals (Sweden)

    Scott C. Hagen; Peter Bacopoulos

    2012-01-01

    Flooding is examined by comparing maximum envelopes of water against the 0.2% (= 1-in-500-year) return-period flooding surface generated as part of revising the Federal Emergency Management Agency's flood insurance rate maps for Franklin, Wakulla, and Jefferson counties in Florida's Big Bend Region. The analysis condenses the number of storms to a small fraction of the original 159 used in production. The analysis is performed by assessing which synthetic storms contributed to inundation extent (the extent of inundation into the floodplain), coverage (the overall surface area of the inundated floodplain) and the spatially variable 0.2% flooding surface. The results are interpreted in terms of storm attributes (pressure deficit, radius to maximum winds, translation speed, storm heading, and landfall location) and the physical processes occurring within the natural system (storm surge and waves); both are contextualized against existing and new hurricane scales. The approach identifies what types of storms and storm attributes lead to what types of inundation, as measured in terms of extent and coverage, in Florida's Big Bend Region and provides a basis for the identification of a select subset of synthetic storms for studying the impact of sea level rise. The sea level rise application provides a clear contrast between a dynamic approach and a static approach.
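
    The two inundation measures used in the analysis, extent and coverage, can be illustrated with a hedged one-dimensional sketch: a maximum water-surface envelope from each synthetic storm is compared against a ground-elevation transect. The transect, surge heights, and decay rate below are invented and only show how the two measures differ; they are not the study's data or model.

      # Compute coverage (inundated area along a transect) and extent (inland reach) per storm.
      import numpy as np

      nx = 200
      ground = np.linspace(0.0, 4.0, nx)   # 1-D transect: ground elevation rises inland (m)
      cell_width_m = 50.0

      def inundation_metrics(max_water_elev):
          """Return (coverage along the transect in m, inland extent in m) for one storm."""
          wet = max_water_elev > ground
          coverage = wet.sum() * cell_width_m
          extent = (np.flatnonzero(wet).max() + 1) * cell_width_m if wet.any() else 0.0
          return coverage, extent

      # Three synthetic storms with different peak surge heights (m), decaying inland.
      for surge_m in (1.5, 2.5, 3.5):
          surface = surge_m * np.exp(-np.arange(nx) * cell_width_m / 4000.0)
          cov, ext = inundation_metrics(surface)
          print(f"surge {surge_m} m: coverage {cov:.0f} m, extent {ext:.0f} m inland")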

  11. Results from the Big Spring basin water quality monitoring and demonstration projects, Iowa, USA

    Science.gov (United States)

    Rowden, R.D.; Liu, H.; Libra, R.D.

    2001-01-01

    Agricultural practices, hydrology, and water quality of the 267-km2 Big Spring groundwater drainage basin in Clayton County, Iowa, have been monitored since 1981. Land use is agricultural; nitrate-nitrogen (nitrate-N) and herbicides are the resulting contaminants in groundwater and surface water. Ordovician Galena Group carbonate rocks comprise the main aquifer in the basin. Recharge to this karstic aquifer is by infiltration, augmented by sinkhole-captured runoff. Groundwater is discharged at Big Spring, where quantity and quality of the discharge are monitored. Monitoring has shown a threefold increase in groundwater nitrate-N concentrations from the 1960s to the early 1980s. The nitrate-N discharged from the basin typically is equivalent to over one-third of the nitrogen fertilizer applied, with larger losses during wetter years. Atrazine is present in groundwater all year; however, contaminant concentrations in the groundwater respond directly to recharge events, and unique chemical signatures of infiltration versus runoff recharge are detectable in the discharge from Big Spring. Education and demonstration efforts have reduced nitrogen fertilizer application rates by one-third since 1981. Relating declines in nitrate and pesticide concentrations to inputs of nitrogen fertilizer and pesticides at Big Spring is problematic. Annual recharge has varied five-fold during monitoring, overshadowing any water-quality improvements resulting from incrementally decreased inputs. © Springer-Verlag 2001.
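
    The statement that the exported nitrate-N is equivalent to over one-third of the nitrogen fertilizer applied rests on a simple mass balance: load discharged at the spring versus nitrogen applied on the land. The sketch below illustrates that arithmetic with invented placeholder numbers; it does not use Big Spring monitoring data, and the fertilizer rate is an assumption.

      # Annual nitrate-N load from discharge and concentration versus fertilizer N applied (illustrative).
      SECONDS_PER_DAY = 86_400
      DAYS = 365

      # Synthetic daily mean discharge (m3/s) and nitrate-N concentration (mg/L), wetter in spring.
      discharge = [2.0 + (0.8 if 60 <= d <= 150 else 0.0) for d in range(DAYS)]
      concentration = [8.0 + (3.0 if 60 <= d <= 150 else 0.0) for d in range(DAYS)]

      # 1 mg/L equals 1 g/m3, so daily load (g) = Q (m3/s) * C (g/m3) * 86,400 s.
      load_kg = sum(q * c * SECONDS_PER_DAY for q, c in zip(discharge, concentration)) / 1000.0

      basin_area_ha = 26_700               # 267 km2 from the abstract, expressed in hectares
      fertilizer_rate_kg_n_per_ha = 50.0   # assumed application rate, placeholder only
      fertilizer_kg = basin_area_ha * fertilizer_rate_kg_n_per_ha

      print(f"annual nitrate-N load:  {load_kg:,.0f} kg")
      print(f"fertilizer N applied:   {fertilizer_kg:,.0f} kg")
      print(f"load / applied N ratio: {load_kg / fertilizer_kg:.2f}")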

  12. Effectiveness and reliability of emergency measures for flood prevention

    NARCIS (Netherlands)

    Lendering, K.T.; Jonkman, S.N.; Kok, M.

    2014-01-01

    Floods in the summer of 2013 in Central Europe demonstrated once again that floods account for a large part of damage and loss of life caused by natural disasters. During flood threats emergency measures, such as sand bags and big bags, are often applied to strengthen the flood defences and attempt

  13. Flood-inundation maps for Big Creek from the McGinnis Ferry Road bridge to the confluence of Hog Wallow Creek, Alpharetta and Roswell, Georgia

    Science.gov (United States)

    Musser, Jonathan W.

    2015-08-20

    Digital flood-inundation maps for a 12.4-mile reach of Big Creek that extends from 260 feet above the McGinnis Ferry Road bridge to the U.S. Geological Survey (USGS) streamgage at Big Creek below Hog Wallow Creek at Roswell, Georgia (02335757), were developed by the USGS in cooperation with the cities of Alpharetta and Roswell, Georgia. The inundation maps, which can be accessed through the USGS Flood Inundation Mapping Science Web site at http://water.usgs.gov/osw/flood_inundation/, depict estimates of the areal extent and depth of flooding corresponding to selected water levels (stages) at the USGS streamgage at Big Creek near Alpharetta, Georgia (02335700). Real-time stage information from this USGS streamgage may be obtained at http://waterdata.usgs.gov/ and can be used in conjunction with these maps to estimate near real-time areas of inundation. The National Weather Service (NWS) is incorporating results from this study into the Advanced Hydrologic Prediction Service (AHPS) flood-warning system (http://water.weather.gov/ahps/). The NWS forecasts flood hydrographs for many streams where the USGS operates streamgages and provides flow data. The forecasted peak-stage information for the USGS streamgage at Big Creek near Alpharetta (02335700), available through the AHPS Web site, may be used in conjunction with the maps developed for this study to show predicted areas of flood inundation.

  14. Peak discharge, flood frequency, and peak stage of floods on Big Cottonwood Creek at U.S. Highway 50 near Coaldale, Colorado, and Fountain Creek below U.S. Highway 24 in Colorado Springs, Colorado, 2016

    Science.gov (United States)

    Kohn, Michael S.; Stevens, Michael R.; Mommandi, Amanullah; Khan, Aziz R.

    2017-12-14

    The U.S. Geological Survey (USGS), in cooperation with the Colorado Department of Transportation, determined the peak discharge, annual exceedance probability (flood frequency), and peak stage of two floods that took place on Big Cottonwood Creek at U.S. Highway 50 near Coaldale, Colorado (hereafter referred to as “Big Cottonwood Creek site”), on August 23, 2016, and on Fountain Creek below U.S. Highway 24 in Colorado Springs, Colorado (hereafter referred to as “Fountain Creek site”), on August 29, 2016. A one-dimensional hydraulic model was used to estimate the peak discharge. To define the flood frequency of each flood, peak-streamflow regional-regression equations or statistical analyses of USGS streamgage records were used to estimate annual exceedance probability of the peak discharge. A survey of the high-water mark profile was used to determine the peak stage, and the limitations and accuracy of each component also are presented in this report. Collection and computation of flood data, such as peak discharge, annual exceedance probability, and peak stage at structures critical to Colorado’s infrastructure are an important addition to the flood data collected annually by the USGS. The peak discharge of the August 23, 2016, flood at the Big Cottonwood Creek site was 917 cubic feet per second (ft3/s) with a measurement quality of poor (uncertainty plus or minus 25 percent or greater). The peak discharge of the August 29, 2016, flood at the Fountain Creek site was 5,970 ft3/s with a measurement quality of poor (uncertainty plus or minus 25 percent or greater). The August 23, 2016, flood at the Big Cottonwood Creek site had an annual exceedance probability of less than 0.01 (return period greater than the 100-year flood) and had an annual exceedance probability of greater than 0.005 (return period less than the 200-year flood). The August 23, 2016, flood event was caused by a precipitation event having an annual exceedance probability of 1.0 (return
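
    The annual exceedance probabilities and return periods quoted above are related by a standard pair of identities, shown in the short sketch below. This is general background arithmetic, not a recomputation of the report's results.

      # AEP is the reciprocal of the return period; the chance of at least one exceedance
      # in N years is 1 - (1 - AEP)**N.
      def aep_from_return_period(t_years: float) -> float:
          return 1.0 / t_years

      def prob_at_least_one(aep: float, n_years: int) -> float:
          return 1.0 - (1.0 - aep) ** n_years

      for t_years in (100, 200):
          aep = aep_from_return_period(t_years)
          print(f"{t_years}-year flood: AEP = {aep:.3f}, "
                f"chance of at least one occurrence in 30 years = {prob_at_least_one(aep, 30):.2f}")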

  15. Floods

    Science.gov (United States)

    Floods are common in the United States. Weather such as heavy rain, thunderstorms, hurricanes, or tsunamis can ... is breached, or when a dam breaks. Flash floods, which can develop quickly, often have a dangerous ...

  16. WBP/SIGAME the Brazilian BIG-GT demonstration project actual status and perspectives

    International Nuclear Information System (INIS)

    Carpentier, E.; Silva, A.

    1998-01-01

    Located in the tropics, with the sun shining all year round, and with its vast territory, Brazil may be regarded as having all the basic conditions to develop a modern Biomass for Electricity industry. Those characteristics together with: (a) the necessity of developing new energy resources for electricity production in the northeast of the country; (b) the results of studies made by various entities, including CHESF; (c) the progress achieved by the BIG-GT technology; (d) the organisation of the Global Environment Facility (GEF); and (e) the support of the Brazilian government, through the Ministry of Science and Technology (MCT), provided the unique opportunity for the implementation of a commercial demonstration of that technology in Brazil. This paper describes the idea, scope, challenges, lessons, and current status of development of the WBP/SIGAME project. It also highlights some institutional issues, budget figures, and energy prices. (author)

  17. Technical note: River modelling to infer flood management framework

    African Journals Online (AJOL)

    River hydraulic models have successfully identified the weaknesses and areas for improvement with respect to flooding in the Sarawak River system, and can also be used to support decisions on flood management measures. Often, the big question is 'how'. This paper demonstrates a theoretical flood management ...

  18. A Cloud-Based Global Flood Disaster Community Cyber-Infrastructure: Development and Demonstration

    Science.gov (United States)

    Wan, Zhanming; Hong, Yang; Khan, Sadiq; Gourley, Jonathan; Flamig, Zachary; Kirschbaum, Dalia; Tang, Guoqiang

    2014-01-01

    Flood disasters have significant impacts on the development of communities globally. This study describes a public cloud-based flood cyber-infrastructure (CyberFlood) that collects, organizes, visualizes, and manages several global flood databases for authorities and the public in real-time, providing location-based eventful visualization as well as statistical analysis and graphing capabilities. In order to expand and update the existing flood inventory, a crowdsourcing data collection methodology is employed for the public with smartphones or Internet to report new flood events, which is also intended to engage citizen-scientists so that they may become motivated and educated about the latest developments in satellite remote sensing and hydrologic modeling technologies. Our shared vision is to better serve the global water community with comprehensive flood information, aided by the state-of-the-art cloud computing and crowdsourcing technology. The CyberFlood presents an opportunity to eventually modernize the existing paradigm used to collect, manage, analyze, and visualize water-related disasters.

  19. The big CGRP flood - sources, sinks and signalling sites in the trigeminovascular system.

    Science.gov (United States)

    Messlinger, Karl

    2018-03-12

    Calcitonin gene-related peptide (CGRP) has long been a focus of migraine research, since it turned out that inhibition of CGRP or CGRP receptors by antagonists or monoclonal IgG antibodies was therapeutic in frequent and chronic migraine. This contribution deals with the questions of from which sites CGRP is released, where it is drained, and where it acts to cause its headache-proliferating effects in the trigeminovascular system. The available literature suggests that the bulk of CGRP is released from trigeminal afferents both in meningeal tissues and at the first synapse in the spinal trigeminal nucleus. CGRP may be drained off into three different compartments, the venous blood plasma, the cerebrospinal fluid and possibly the glymphatic system. CGRP receptors in peripheral tissues are located on arterial vessel walls, mononuclear immune cells and possibly Schwann cells; within the trigeminal ganglion they are located on neurons and glial cells; in the spinal trigeminal nucleus they can be found on central terminals of trigeminal afferents. All these structures are potential signalling sites for CGRP, where CGRP mediates arterial vasodilatation but not direct activation of trigeminal afferents. In the spinal trigeminal nucleus a facilitating effect on synaptic transmission seems likely. In the trigeminal ganglion CGRP is thought to initiate long-term changes including cross-signalling between neurons and glial cells based on gene expression. In this way, CGRP may upregulate the production of receptor proteins and pro-nociceptive molecules. CGRP and other big molecules cannot easily pass the blood-brain barrier. These molecules may act in the trigeminal ganglion to influence the production of pronociceptive substances and receptors, which are transported along the central terminals into the spinal trigeminal nucleus. In this way, peripherally acting therapeutics can have a central antinociceptive effect.

  20. Flood-Fighting Structures Demonstration and Evaluation Program: Laboratory and Field Testing in Vicksburg, Mississippi

    Science.gov (United States)

    2007-07-01

    then it should be disposed of by recycling or land-filling. This material should not be burned due to the formation of carbon dioxide and carbon...

  1. Green River Formation Water Flood Demonstration Project: Final report. [October 21, 1992-April, 30, 1996

    Energy Technology Data Exchange (ETDEWEB)

    Deo, M.D. [Dept. of Chemical and Fuels Engineering, University of Utah, Salt Lake City (US); Dyer, J.E.; Lomax, J.D. [Inland Resources, Inc., Lomax Exploration Co., Salt Lake City, UT (US); Nielson, D.L.; Lutz, S.J. [Energy and Geoscience Institute at the University of Utah, Salt Lake City (US)

    1996-11-01

    The objectives were to understand the oil production mechanisms in the Monument Butte unit via reservoir characterization and reservoir simulations and to transfer the water flooding technology to similar units in the vicinity, particularly the Travis and the Boundary units. Comprehensive reservoir characterization and reservoir simulations of the Monument Butte, Travis and Boundary units were presented in the two published project yearly reports. The primary and the secondary production from the Monument Butte unit were typical of oil production from an undersaturated oil reservoir close to its bubble point. The water flood in the smaller Travis unit appeared affected by natural and possibly by large interconnecting hydraulic fractures. Water flooding the Boundary unit was considered more complicated due to the presence of an oil-water contact in one of the wells. The reservoir characterization activity in the project basically consisted of extraction and analysis of a full diameter core, Formation Micro Imaging logs from several wells and Magnetic Resonance Imaging logs from two wells. In addition, several side-wall cores were drilled and analyzed, oil samples from a number of wells were physically and chemically characterized (using gas chromatography), oil-water relative permeabilities were measured and pour points and cloud points of a few oil samples were determined. The reservoir modeling activity comprised reservoir simulation of all three units at different scales and near well-bore modeling of the wax precipitation effects. The reservoir characterization efforts identified new reservoirs in the Travis and the Boundary units. The reservoir simulation activities established the extent of pressurization of the sections of the reservoirs in the immediate vicinity of the Monument Butte unit. This resulted in a major expansion of the unit and the production from this expanded unit increased from about 300 barrels per day to about 2000 barrels per day.

  2. Green River Formation Water Flood Demonstration Project: Final report, October 21, 1992-April, 30, 1996

    International Nuclear Information System (INIS)

    Deo, M.D.; Dyer, J.E.; Lomax, J.D.; Nielson, D.L.; Lutz, S.J.

    1996-01-01

    The objectives were to understand the oil production mechanisms in the Monument Butte unit via reservoir characterization and reservoir simulations and to transfer the water flooding technology to similar units in the vicinity, particularly the Travis and the Boundary units. Comprehensive reservoir characterization and reservoir simulations of the Monument Butte, Travis and Boundary units were presented in the two published project yearly reports. The primary and the secondary production from the Monument Butte unit were typical of oil production from an undersaturated oil reservoir close to its bubble point. The water flood in the smaller Travis unit appeared affected by natural and possibly by large interconnecting hydraulic fractures. Water flooding the Boundary unit was considered more complicated due to the presence of an oil-water contact in one of the wells. The reservoir characterization activity in the project basically consisted of extraction and analysis of a full diameter core, Formation Micro Imaging logs from several wells and Magnetic Resonance Imaging logs from two wells. In addition, several side-wall cores were drilled and analyzed, oil samples from a number of wells were physically and chemically characterized (using gas chromatography), oil-water relative permeabilities were measured and pour points and cloud points of a few oil samples were determined. The reservoir modeling activity comprised reservoir simulation of all three units at different scales and near well-bore modeling of the wax precipitation effects. The reservoir characterization efforts identified new reservoirs in the Travis and the Boundary units. The reservoir simulation activities established the extent of pressurization of the sections of the reservoirs in the immediate vicinity of the Monument Butte unit. This resulted in a major expansion of the unit and the production from this expanded unit increased from about 300 barrels per day to about 2000 barrels per day.

  3. Field Demonstration of Carbon Dioxide Miscible Flooding in the Lansing-Kansas City Formation, Central Kansas

    Energy Technology Data Exchange (ETDEWEB)

    Alan Byrnes; G. Paul Willhite; Don Green; Richard Pancake; JyunSyung Tsau; W. Lynn Watney; John Doveton; Willard Guy; Rodney Reynolds; Dave Murfin; James Daniels; Russell Martin; William Flanders; Dave Vander Griend; Eric Mork; Paul Cantrell

    2010-03-07

    A pilot carbon dioxide miscible flood was initiated in the Lansing Kansas City C formation in the Hall Gurney Field, Russell County, Kansas. The reservoir zone is an oomoldic carbonate located at a depth of about 2900 feet. The pilot consists of one carbon dioxide injection well and three production wells. Continuous carbon dioxide injection began on December 2, 2003. By the end of June 2005, 16.19 MM lb of carbon dioxide was injected into the pilot area. Injection was converted to water on June 21, 2005 to reduce operating costs to a breakeven level with the expectation that sufficient carbon dioxide was injected to displace the oil bank to the production wells by water injection. By March 7, 2010, 8,736 bbl of oil were produced from the pilot. Production from wells to the northwest of the pilot region indicates that oil displaced from carbon dioxide injection was produced from Colliver A7, Colliver A3, Colliver A14 and Graham A4 located on adjacent leases. About 19,166 bbl of incremental oil were estimated to have been produced from these wells as of March 7, 2010. There is evidence of a directional permeability trend toward the NW through the pilot region. The majority of the injected carbon dioxide remains in the pilot region, which has been maintained at a pressure at or above the minimum miscibility pressure. Estimated oil recovery attributed to the CO2 flood is 27,902 bbl which is equivalent to a gross CO2 utilization of 4.8 MCF/bbl. The pilot project is not economic.
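
    As a rough, hedged cross-check of the figures quoted above, the injected CO2 mass can be converted to a standard gas volume and divided by the attributed oil recovery. The ideal-gas constants below are approximate and the report's own accounting basis may differ, so only order-of-magnitude agreement with the stated 4.8 MCF/bbl should be expected.

      # Order-of-magnitude check of gross CO2 utilization from the quoted mass and oil volume.
      CO2_LB = 16.19e6        # carbon dioxide injected, lb (from the abstract)
      OIL_BBL = 27_902        # oil recovery attributed to the CO2 flood, bbl (from the abstract)

      SCF_PER_LBMOL = 379.5   # approximate molar gas volume at 60 deg F and 14.7 psia
      CO2_MW = 44.01          # lb per lb-mole of CO2

      co2_mcf = CO2_LB / CO2_MW * SCF_PER_LBMOL / 1000.0
      print(f"injected CO2 is roughly {co2_mcf:,.0f} MCF")
      print(f"gross utilization is roughly {co2_mcf / OIL_BBL:.1f} MCF/bbl (report states 4.8 MCF/bbl)")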

  4. Affordable Development and Demonstration of a Small NTR Engine and Stage: How Small is Big Enough?

    Science.gov (United States)

    Borowski, Stanley K.; Sefcik, Robert J.; Fittje, James E.; McCurdy, David R.; Qualls, Arthur L.; Schnitzler, Bruce G.; Werner, James E.; Weitzberg (Abraham); Joyner, Claude R.

    2015-01-01

    The Nuclear Thermal Rocket (NTR) derives its energy from fission of uranium-235 atoms contained within fuel elements that comprise the engine's reactor core. It generates high thrust and has a specific impulse potential of approximately 900 seconds - a 100% increase over today's best chemical rockets. The Nuclear Thermal Propulsion (NTP) project, funded by NASA's AES program, includes five key task activities: (1) Recapture, demonstration, and validation of heritage graphite composite (GC) fuel (selected as the "Lead Fuel" option); (2) Engine Conceptual Design; (3) Operating Requirements Definition; (4) Identification of Affordable Options for Ground Testing; and (5) Formulation of an Affordable Development Strategy. During FY'14, a preliminary DDT&E plan and schedule for NTP development was outlined by GRC, DOE and industry that involved significant system-level demonstration projects that included GTD tests at the NNSS, followed by a FTD mission. To reduce cost for the GTD tests and FTD mission, small NTR engines, in either the 7.5 or 16.5 klbf thrust class, were considered. Both engine options used GC fuel and a "common" fuel element (FE) design. The small approximately 7.5 klbf "criticality-limited" engine produces approximately 157 megawatts of thermal power (MWt) and its core is configured with parallel rows of hexagonal-shaped FEs and tie tubes (TTs) with a FE to TT ratio of approximately 1:1. The larger approximately 16.5 klbf Small Nuclear Rocket Engine (SNRE), developed by LANL at the end of the Rover program, produces approximately 367 MWt and has a FE to TT ratio of approximately 2:1. Although both engines use a common 35 inch (approximately 89 cm) long FE, the SNRE's larger diameter core contains approximately 300 more FEs needed to produce an additional 210 MWt of power. To reduce the cost of the FTD mission, a simple "1-burn" lunar flyby mission was considered to reduce the LH2 propellant loading, the stage size and complexity. Use of existing and
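
    The quoted thrust, specific impulse, and reactor power levels can be loosely cross-checked with the ideal rocket relations, as in the hedged sketch below. Ideal jet power ignores nozzle and thermal losses, so it should come out somewhat below the quoted reactor thermal powers; this is a plausibility check, not a statement of the engines' actual energy balance.

      # Relate thrust, specific impulse, propellant flow, and ideal jet power for both engine classes.
      G0 = 9.80665          # standard gravity, m/s^2
      LBF_TO_N = 4.44822    # newtons per pound-force
      ISP_S = 900.0         # specific impulse from the abstract, seconds

      for thrust_klbf, quoted_mwt in ((7.5, 157.0), (16.5, 367.0)):
          thrust_n = thrust_klbf * 1000.0 * LBF_TO_N
          ve = ISP_S * G0                   # effective exhaust velocity, m/s
          mdot = thrust_n / ve              # propellant mass flow, kg/s
          jet_power_mw = 0.5 * mdot * ve ** 2 / 1.0e6
          print(f"{thrust_klbf} klbf: mdot of about {mdot:.1f} kg/s, ideal jet power of about "
                f"{jet_power_mw:.0f} MW (quoted reactor power {quoted_mwt:.0f} MWt)")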

  5. ARSENIC REMOVAL FROM DRINKING WATER BY IRON REMOVAL USEPA DEMONSTRATION PROJECT AT BIG SAUK LAKE MOBILE HOME PARK IN SAUK CENTRE, MN. SIX MONTH EVALUATION REPORT

    Science.gov (United States)

    This report documents the activities performed and the results obtained from the first six months of the arsenic removal treatment technology demonstration project at the Big Sauk Lake Mobile Home Park (BSLMHP) in Sauk Centre, MN. The objectives of the project are to evaluate the...

  6. Nursing Management Minimum Data Set: Cost-Effective Tool To Demonstrate the Value of Nurse Staffing in the Big Data Science Era.

    Science.gov (United States)

    Pruinelli, Lisiane; Delaney, Connie W; Garciannie, Amy; Caspers, Barbara; Westra, Bonnie L

    2016-01-01

    There is a growing body of evidence of the relationship of nurse staffing to patient, nurse, and financial outcomes. With the advent of big data science and developing big data analytics in nursing, data science with the reuse of big data is emerging as a timely and cost-effective approach to demonstrate nursing value. The Nursing Management Minimum Data Set (NMMDS) provides standard administrative data elements, definitions, and codes to measure the context where care is delivered and, consequently, the value of nursing. The integration of the NMMDS elements in the current health system provides evidence for nursing leaders to measure and manage decisions, leading to better patient, staffing, and financial outcomes. It also enables the reuse of data for clinical scholarship and research.

  7. Extended burnup demonstration: reactor fuel program. Pre-irradiation characterization and summary of pre-program poolside examinations. Big Rock Point extended burnup fuel

    International Nuclear Information System (INIS)

    Exarhos, C.A.; Van Swam, L.F.; Wahlquist, F.P.

    1981-12-01

    This report is a resource document characterizing the 64 fuel rods being irradiated at the Big Rock Point reactor as part of the Extended Burnup Demonstration being sponsored jointly by the US Department of Energy, Consumers Power Company, Exxon Nuclear Company, and General Public Utilities. The program entails extending the exposure of standard BWR fuel to a discharge average of 38,000 MWD/MTU to demonstrate the feasibility of operating fuel of standard design to levels significantly above current limits. The fabrication characteristics of the Big Rock Point EBD fuel are presented along with measurement of rod length, rod diameter, pellet stack height, and fuel rod withdrawal force taken at poolside at burnups up to 26,200 MWD/MTU. A review of the fuel examination data indicates no performance characteristics which might restrict the continued irradiation of the fuel

  8. Demonstrator Flood Control Room : Inventarisatie van de wensen van de verschillende Deltares onderdelen en een hierop gebaseerd ontwerp

    NARCIS (Netherlands)

    Boertjens, G.J.; Attema-van Waas, A.R.; Guikema, M.; Schilder, C.M.C.; Veen, M.J. van der

    2009-01-01

    Based on the research carried out, TNO draws the following conclusions: • The existing room that Deltares has in mind for realising the training facility is small. A first phase of the desired Flood Control Room can be realised in this room, taking into account that not all

  9. Floods and Flash Flooding

    Science.gov (United States)

    Floods and flash flooding Now is the time to determine your area’s flood risk. If you are not sure whether you ... If you are in a floodplain, consider buying flood insurance. Do not drive around barricades. If your ...

  10. Engineering Study for a Full Scale Demonstration of Steam Reforming Black Liquor Gasification at Georgia-Pacific's Mill in Big Island, Virginia; FINAL

    International Nuclear Information System (INIS)

    Robert De Carrera; Mike Ohl

    2002-01-01

    Georgia-Pacific Corporation performed an engineering study to determine the feasibility of installing a full-scale demonstration project of steam reforming black liquor chemical recovery at Georgia-Pacific's mill in Big Island, Virginia. The technology considered was the Pulse Enhanced Steam Reforming technology that was developed and patented by Manufacturing and Technology Conversion, International (MTCI) and is currently licensed to StoneChem, Inc., for use in North America. Pilot studies of steam reforming have been carried out on a 25-ton per day reformer at Inland Container's Ontario, California mill and on a 50-ton per day unit at Weyerhaeuser's New Bern, North Carolina mill

  11. Big data analytics turning big data into big money

    CERN Document Server

    Ohlhorst, Frank J

    2012-01-01

    Unique insights to implement big data analytics and reap big returns to your bottom line. Focusing on the business and financial value of big data analytics, respected technology journalist Frank J. Ohlhorst shares his insights on the newly emerging field of big data analytics in Big Data Analytics. This breakthrough book demonstrates the importance of analytics, defines the processes, highlights the tangible and intangible values and discusses how you can turn a business liability into actionable material that can be used to redefine markets, improve profits and identify new business opportuni

  12. Flooding and Flood Management

    Science.gov (United States)

    Brooks, K.N.; Fallon, J.D.; Lorenz, D.L.; Stark, J.R.; Menard, Jason; Easter, K.W.; Perry, Jim

    2011-01-01

    Floods result in great human disasters globally and nationally, causing an average of $4 billion of damages each year in the United States. Minnesota has its share of floods and flood damages, and the state has awarded nearly $278 million to local units of government for flood mitigation projects through its Flood Hazard Mitigation Grant Program. Since 1995, flood mitigation in the Red River Valley has exceeded $146 million. Considerable local and state funding has been provided to manage and mitigate problems of excess stormwater in urban areas, flooding of farmlands, and flood damages at road crossings. The cumulative costs involved with floods and flood mitigation in Minnesota are not known precisely, but it is safe to conclude that flood mitigation is a costly business. This chapter begins with a description of floods in Minnesota to provide examples and contrasts across the state. Background material is presented to provide a basic understanding of floods and flood processes, prediction, and management and mitigation. Methods of analyzing and characterizing floods are presented because they affect how we respond to flooding and can influence relevant practices. The understanding and perceptions of floods and flooding commonly differ between those who work in flood forecasting, flood protection, or water resource management and the citizens and businesses affected by floods. These differences can become magnified following a major flood, pointing to the need for better understanding of flooding as well as a common language to describe flood risks and the uncertainty associated with determining such risks. Expectations of accurate and timely flood forecasts and our ability to control floods do not always match reality. Striving for clarity is important in formulating policies that can help avoid recurring flood damages and costs.

  13. Proof-of-Concept Demonstrations for Computation-Based Human Reliability Analysis. Modeling Operator Performance During Flooding Scenarios

    International Nuclear Information System (INIS)

    Joe, Jeffrey Clark; Boring, Ronald Laurids; Herberger, Sarah Elizabeth Marie; Mandelli, Diego; Smith, Curtis Lee

    2015-01-01

    The United States (U.S.) Department of Energy (DOE) Light Water Reactor Sustainability (LWRS) program has the overall objective to help sustain the existing commercial nuclear power plants (NPPs). To accomplish this program objective, there are multiple LWRS 'pathways,' or research and development (R&D) focus areas. One LWRS focus area is called the Risk-Informed Safety Margin and Characterization (RISMC) pathway. Initial efforts under this pathway to combine probabilistic and plant multi-physics models to quantify safety margins and support business decisions also included HRA, but in a somewhat simplified manner. HRA experts at Idaho National Laboratory (INL) have been collaborating with other experts to develop a computational HRA approach, called the Human Unimodel for Nuclear Technology to Enhance Reliability (HUNTER), for inclusion into the RISMC framework. The basic premise of this research is to leverage applicable computational techniques, namely simulation and modeling, to develop and then, using RAVEN as a controller, seamlessly integrate virtual operator models (HUNTER) with 1) the dynamic computational MOOSE runtime environment that includes a full-scope plant model, and 2) the RISMC framework PRA models already in use. The HUNTER computational HRA approach is a hybrid approach that leverages past work from cognitive psychology, human performance modeling, and HRA, but it is also a significant departure from existing static and even dynamic HRA methods. This report is divided into five chapters that cover the development of an external flooding event test case and associated statistical modeling considerations.
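
    One idea underlying dynamic, simulation-based HRA of the kind described above is comparing the time a crew needs to complete an action with the time available before flood water defeats the equipment. The toy Monte Carlo below illustrates only that idea; it is not the HUNTER model, RAVEN, or MOOSE, and every distribution in it is invented.

      # Toy dynamic-HRA illustration: sample crew response time against time available.
      import random

      random.seed(1)
      N_TRIALS = 100_000

      failures = 0
      for _ in range(N_TRIALS):
          time_available_min = random.lognormvariate(4.0, 0.3)   # ~55 min median, invented
          diagnosis_min = random.lognormvariate(2.3, 0.4)        # crew diagnosis time, invented
          action_min = random.lognormvariate(2.7, 0.4)           # field action time, invented
          if diagnosis_min + action_min > time_available_min:
              failures += 1

      print(f"estimated human failure probability of roughly {failures / N_TRIALS:.3f}")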

  14. Proof-of-Concept Demonstrations for Computation-Based Human Reliability Analysis. Modeling Operator Performance During Flooding Scenarios

    Energy Technology Data Exchange (ETDEWEB)

    Joe, Jeffrey Clark [Idaho National Lab. (INL), Idaho Falls, ID (United States); Boring, Ronald Laurids [Idaho National Lab. (INL), Idaho Falls, ID (United States); Herberger, Sarah Elizabeth Marie [Idaho National Lab. (INL), Idaho Falls, ID (United States); Mandelli, Diego [Idaho National Lab. (INL), Idaho Falls, ID (United States); Smith, Curtis Lee [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-09-01

    The United States (U.S.) Department of Energy (DOE) Light Water Reactor Sustainability (LWRS) program has the overall objective to help sustain the existing commercial nuclear power plants (NPPs). To accomplish this program objective, there are multiple LWRS “pathways,” or research and development (R&D) focus areas. One LWRS focus area is called the Risk-Informed Safety Margin and Characterization (RISMC) pathway. Initial efforts under this pathway to combine probabilistic and plant multi-physics models to quantify safety margins and support business decisions also included HRA, but in a somewhat simplified manner. HRA experts at Idaho National Laboratory (INL) have been collaborating with other experts to develop a computational HRA approach, called the Human Unimodel for Nuclear Technology to Enhance Reliability (HUNTER), for inclusion into the RISMC framework. The basic premise of this research is to leverage applicable computational techniques, namely simulation and modeling, to develop and then, using RAVEN as a controller, seamlessly integrate virtual operator models (HUNTER) with 1) the dynamic computational MOOSE runtime environment that includes a full-scope plant model, and 2) the RISMC framework PRA models already in use. The HUNTER computational HRA approach is a hybrid approach that leverages past work from cognitive psychology, human performance modeling, and HRA, but it is also a significant departure from existing static and even dynamic HRA methods. This report is divided into five chapters that cover the development of an external flooding event test case and associated statistical modeling considerations.

  15. Affordable Development and Demonstration of a Small Nuclear Thermal Rocket (NTR) Engine and Stage: How Small Is Big Enough?

    Science.gov (United States)

    Borowski, Stanley K.; Sefcik, Robert J.; Fittje, James E.; McCurdy, David R.; Qualls, Arthur L.; Schnitzler, Bruce G.; Werner, James E.; Weitzberg, Abraham; Joyner, Claude R.

    2016-01-01

    The Nuclear Thermal Rocket (NTR) derives its energy from fission of uranium-235 atoms contained within fuel elements that comprise the engine's reactor core. It generates high thrust and has a specific impulse potential of approximately 900 seconds - a 100 percent increase over today's best chemical rockets. The Nuclear Thermal Propulsion (NTP) project, funded by NASA's Advanced Exploration Systems (AES) program, includes five key task activities: (1) Recapture, demonstration, and validation of heritage graphite composite (GC) fuel (selected as the Lead Fuel option); (2) Engine Conceptual Design; (3) Operating Requirements Definition; (4) Identification of Affordable Options for Ground Testing; and (5) Formulation of an Affordable Development Strategy. During fiscal year (FY) 2014, a preliminary Design Development Test and Evaluation (DDT&E) plan and schedule for NTP development was outlined by the NASA Glenn Research Center (GRC), Department of Energy (DOE) and industry that involved significant system-level demonstration projects that included Ground Technology Demonstration (GTD) tests at the Nevada National Security Site (NNSS), followed by a Flight Technology Demonstration (FTD) mission. To reduce cost for the GTD tests and FTD mission, small NTR engines, in either the 7.5 or 16.5 kilopound-force thrust class, were considered. Both engine options used GC fuel and a common fuel element (FE) design. The small approximately 7.5 kilopound-force criticality-limited engine produces approximately 157 thermal megawatts and its core is configured with parallel rows of hexagonal-shaped FEs and tie tubes (TTs) with a FE to TT ratio of approximately 1:1. The larger approximately 16.5 kilopound-force Small Nuclear Rocket Engine (SNRE), developed by Los Alamos National Laboratory (LANL) at the end of the Rover program, produces approximately 367 thermal megawatts and has a FE to TT ratio of approximately 2:1. Although both engines use a common 35-inch (approximately

  16. Arsenic Removal from Drinking Water by Iron Removal - U.S. EPA Demonstration Project at Big Sauk Lake Mobile Home Park in Sauk Centre, MN Final Performance Evaluation Report

    Science.gov (United States)

    This report documents the activities performed and the results obtained from the one-year arsenic removal treatment technology demonstration project at the Big Sauk Lake Mobile Home Park (BSLMHP) in Sauk Centre, MN. The objectives of the project are to evaluate (1) the effective...

  17. Big universe, big data

    DEFF Research Database (Denmark)

    Kremer, Jan; Stensbo-Smidt, Kristoffer; Gieseke, Fabian Cristian

    2017-01-01

    , modern astronomy requires big data know-how, in particular it demands highly efficient machine learning and image analysis algorithms. But scalability is not the only challenge: Astronomy applications touch several current machine learning research questions, such as learning from biased data and dealing......, and highlight some recent methodological advancements in machine learning and image analysis triggered by astronomical applications....

  18. Exploitation of Documented Historical Floods for Achieving Better Flood Defense

    Directory of Open Access Journals (Sweden)

    Slobodan Kolaković

    2016-01-01

    Establishing the Base Flood Elevation for a stream network corresponding to a big catchment is feasible by an interdisciplinary approach involving stochastic hydrology, river hydraulics, and computer-aided simulations. A numerical model calibrated by historical floods has been exploited in this study. The short presentation of the catchment of the Tisza River in this paper is followed by an overview of the historical floods which hit the region in the documented period of 130 years. Several well documented historical floods provided the opportunity for the calibration of the chosen numerical model. Once established, the model could be used for the investigation of different extreme flood scenarios and to establish the Base Flood Elevation. The calibration has shown that the coefficient of friction in the case of the Tisza River depends both on the actual water level and on the preceding flood events. The effect of flood plain maintenance as well as the activation of six potential detention ponds on flood mitigation has been examined. Furthermore, the expected maximum water levels have also been determined for the case in which the biggest flood ever observed, that of 1888, were to hit the region again. The investigated cases of flood superposition highlighted the impact of the tributary Maros on flood mitigation along the Tisza River.
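
    Calibrating the friction coefficient against documented floods, as described above, ultimately rests on a resistance relation such as Manning's equation. The sketch below shows that relation for a rectangular cross-section with invented dimensions; it is not Tisza River geometry or the study's calibrated values, but it shows why the roughness coefficient n is a natural calibration parameter.

      # Manning's equation for a rectangular channel: Q = (1/n) * A * R^(2/3) * S^(1/2).
      def manning_discharge(n, width_m, depth_m, slope):
          area = width_m * depth_m
          wetted_perimeter = width_m + 2.0 * depth_m
          hydraulic_radius = area / wetted_perimeter
          return (1.0 / n) * area * hydraulic_radius ** (2.0 / 3.0) * slope ** 0.5

      # The same stage gives noticeably different discharges as roughness changes,
      # which is why n is treated as a calibration (and flood-history-dependent) parameter.
      for n in (0.025, 0.035, 0.045):
          q = manning_discharge(n, width_m=150.0, depth_m=6.0, slope=5.0e-5)
          print(f"n = {n:.3f}: Q of about {q:.0f} m3/s")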

  19. Big data, big responsibilities

    Directory of Open Access Journals (Sweden)

    Primavera De Filippi

    2014-01-01

    Big data refers to the collection and aggregation of large quantities of data produced by and about people, things or the interactions between them. With the advent of cloud computing, specialised data centres with powerful computational hardware and software resources can be used for processing and analysing a humongous amount of aggregated data coming from a variety of different sources. The analysis of such data is all the more valuable to the extent that it allows for specific patterns to be found and new correlations to be made between different datasets, so as to eventually deduce or infer new information, as well as to potentially predict behaviours or assess the likelihood for a certain event to occur. This article will focus specifically on the legal and moral obligations of online operators collecting and processing large amounts of data, to investigate the potential implications of big data analysis on the privacy of individual users and on society as a whole.

  20. Big science

    CERN Multimedia

    Nadis, S

    2003-01-01

    " "Big science" is moving into astronomy, bringing large experimental teams, multi-year research projects, and big budgets. If this is the wave of the future, why are some astronomers bucking the trend?" (2 pages).

  1. 76 FR 21664 - Final Flood Elevation Determinations

    Science.gov (United States)

    2011-04-18

    ... proof Flood Insurance Study and FIRM available at the address cited below for each community. The BFEs... 2,100 feet +861 upstream of 11th Street. Big Duck Creek At South P Street........ +843 City of...

  2. Hydraulic Characteristics of Bedrock Constrictions and Evaluation of One- and Two-Dimensional Models of Flood Flow on the Big Lost River at the Idaho National Engineering and Environmental Laboratory, Idaho

    Science.gov (United States)

    Berenbrock, Charles; Rousseau, Joseph P.; Twining, Brian V.

    2007-01-01

    A 1.9-mile reach of the Big Lost River, between the Idaho National Engineering and Environmental Laboratory (INEEL) diversion dam and the Pioneer diversion structures, was investigated to evaluate the effects of streambed erosion and bedrock constrictions on model predictions of water-surface elevations. Two one-dimensional (1-D) models, a fixed-bed surface-water flow model (HEC-RAS) and a movable-bed surface-water flow and sediment-transport model (HEC-6), were used to evaluate these effects. The results of these models were compared to the results of a two-dimensional (2-D) fixed-bed model [Transient Inundation 2-Dimensional (TRIM2D)] that had previously been used to predict water-surface elevations for peak flows with sufficient stage and stream power to erode floodplain terrain features (Holocene inset terraces referred to as BLR#6 and BLR#8) dated at 300 to 500 years old, and an unmodified Pleistocene surface (referred to as the saddle area) dated at 10,000 years old; and to extend the period of record at the Big Lost River streamflow-gaging station near Arco for flood-frequency analyses. The extended record was used to estimate the magnitude of the 100-year flood and the magnitude of floods with return periods as long as 10,000 years. In most cases, the fixed-bed TRIM2D model simulated higher water-surface elevations, shallower flow depths, higher flow velocities, and higher stream powers than the fixed-bed HEC-RAS and movable-bed HEC-6 models for the same peak flows. The HEC-RAS model required flow increases of 83 percent [100 to 183 cubic meters per second (m3/s)], and 45 percent (100 to 145 m3/s) to match TRIM2D simulations of water-surface elevations at two paleoindicator sites that were used to determine peak flows (100 m3/s) with an estimated return period of 300 to 500 years; and an increase of 13 percent (150 to 169 m3/s) to match TRIM2D water-surface elevations at the saddle area that was used to establish the peak flow (150 m3/s) of a paleoflood

  3. Big Data and Neuroimaging.

    Science.gov (United States)

    Webb-Vargas, Yenny; Chen, Shaojie; Fisher, Aaron; Mejia, Amanda; Xu, Yuting; Crainiceanu, Ciprian; Caffo, Brian; Lindquist, Martin A

    2017-12-01

    Big Data are of increasing importance in a variety of areas, especially in the biosciences. There is an emerging critical need for Big Data tools and methods, because of the potential impact of advancements in these areas. Importantly, statisticians and statistical thinking have a major role to play in creating meaningful progress in this arena. We would like to emphasize this point in this special issue, as it highlights both the dramatic need for statistical input for Big Data analysis and for a greater number of statisticians working on Big Data problems. We use the field of statistical neuroimaging to demonstrate these points. As such, this paper covers several applications and novel methodological developments of Big Data tools applied to neuroimaging data.

  4. Methods and tools for big data visualization

    OpenAIRE

    Zubova, Jelena; Kurasova, Olga

    2015-01-01

    In this paper, methods and tools for big data visualization have been investigated. Challenges faced by the big data analysis and visualization have been identified. Technologies for big data analysis have been discussed. A review of methods and tools for big data visualization has been done. Functionalities of the tools have been demonstrated by examples in order to highlight their advantages and disadvantages.

  5. Development of flood index by characterisation of flood hydrographs

    Science.gov (United States)

    Bhattacharya, Biswa; Suman, Asadusjjaman

    2015-04-01

    In recent years the world has experienced deaths, large-scale displacement of people, billions of Euros of economic damage, mental stress and ecosystem impacts due to flooding. Global changes (climate change, population and economic growth, and urbanisation) are exacerbating the severity of flooding. The 2010 floods in Pakistan and the 2011 floods in Australia and Thailand demonstrate the need for concerted action in the face of global societal and environmental changes to strengthen resilience against flooding. Due to climatological characteristics there are catchments where flood forecasting may have a relatively limited role and flood event management may have to be relied upon. For example, in flash flood catchments, which often may be tiny and un-gauged, flood event management often depends on approximate prediction tools such as flash flood guidance (FFG). There are also catchments fed largely by flood waters coming from upstream catchments that are un-gauged, or where, because of data-sharing issues in transboundary basins, the flow of information from the upstream catchment is limited. Hydrological and hydraulic modelling of these downstream catchments will never be sufficient to provide the required forecasting lead time, and alternative tools to support flood event management will be required. In FFG, or similar approaches, the primary motive is to provide guidance by synthesising the historical data. We follow a similar approach and characterise past flood hydrographs to determine a flood index (FI), which varies in space and time with flood magnitude and its propagation. By studying the variation of the index, the pockets of high flood risk requiring attention can be earmarked beforehand. This approach can be very useful in flood risk management of catchments where information about hydro-meteorological variables is inadequate for any forecasting system. This paper presents the development of FI and its application to several catchments including in Kentucky in the USA
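
    The abstract does not spell out the formula for FI, so the sketch below only illustrates the general idea of characterising a hydrograph against a reference flow; the index definition, threshold and example hydrograph are assumptions, not the authors' formulation.

```python
import numpy as np

def flood_index(discharge, reference_flow):
    """Illustrative flood index: exceedance of a reference flow, normalised by it.

    discharge      : flows for one event (m3/s)
    reference_flow : e.g. bankfull or mean annual flood at the site (m3/s)
    """
    q = np.asarray(discharge, dtype=float)
    excess = np.clip(q - reference_flow, 0.0, None)
    peak_component = excess.max() / reference_flow   # how far above the reference
    duration_component = (excess > 0).mean()         # fraction of time above it
    return peak_component * (1.0 + duration_component)

# Hypothetical hourly hydrograph for a single flood event.
hydrograph = [120, 180, 260, 410, 530, 470, 360, 250, 190, 150]
print(f"FI = {flood_index(hydrograph, reference_flow=200.0):.2f}")
```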

  6. Urbanising Big

    DEFF Research Database (Denmark)

    Ljungwall, Christer

    2013-01-01

    Development in China raises the question of how big a city can become, and at the same time be sustainable, writes Christer Ljungwall of the Swedish Agency for Growth Policy Analysis.

  7. Big Argumentation?

    Directory of Open Access Journals (Sweden)

    Daniel Faltesek

    2013-08-01

    Full Text Available Big Data is nothing new. Public concern regarding the mass diffusion of data has appeared repeatedly with computing innovations; in the formation before Big Data, it was most recently referred to as the information explosion. In this essay, I argue that the appeal of Big Data is not a function of computational power, but of a synergistic relationship between aesthetic order and a politics evacuated of meaningful public deliberation. Understanding, and challenging, Big Data requires attention to the aesthetics of data visualization and the ways in which those aesthetics would seem to depoliticize information. The conclusion proposes an alternative argumentative aesthetic as the appropriate response to the depoliticization posed by the popular imaginary of Big Data.

  8. Big data

    DEFF Research Database (Denmark)

    Madsen, Anders Koed; Flyverbom, Mikkel; Hilbert, Martin

    2016-01-01

    The claim that big data can revolutionize strategy and governance in the context of international relations is increasingly hard to ignore. Scholars of international political sociology have mainly discussed this development through the themes of security and surveillance. The aim of this paper is to outline a research agenda that can be used to raise a broader set of sociological and practice-oriented questions about the increasing datafication of international relations and politics. First, it proposes a way of conceptualizing big data that is broad enough to open fruitful investigations into the emerging use of big data in these contexts. This conceptualization includes the identification of three moments contained in any big data practice. Second, it suggests a research agenda built around a set of subthemes that each deserve dedicated scrutiny when studying the interplay between big data...

  9. Big Data as Governmentality

    DEFF Research Database (Denmark)

    Flyverbom, Mikkel; Madsen, Anders Koed; Rasche, Andreas

    This paper conceptualizes how large-scale data and algorithms condition and reshape knowledge production when addressing international development challenges. The concept of governmentality and four dimensions of an analytics of government are proposed as a theoretical framework to examine how big data is constituted as an aspiration to improve the data and knowledge underpinning development efforts. Based on this framework, we argue that big data’s impact on how relevant problems are governed is enabled by (1) new techniques of visualizing development issues, (2) linking aspects... The analysis also shows that big data problematizes selected aspects of traditional ways to collect and analyze data for development (e.g. via household surveys). We also demonstrate that using big data analyses to address development challenges raises a number of questions that can deteriorate its impact....

  10. Copernicus Big Data and Google Earth Engine for Glacier Surface Velocity Field Monitoring: Feasibility Demonstration on San Rafael and San Quintin Glaciers

    Science.gov (United States)

    Di Tullio, M.; Nocchi, F.; Camplani, A.; Emanuelli, N.; Nascetti, A.; Crespi, M.

    2018-04-01

    Glaciers are a natural global resource and one of the principal climate change indicators at global and local scale, being influenced by temperature and snow precipitation changes. Among the parameters used for glacier monitoring, the surface velocity is a key element, since it is connected to glacier changes (mass balance, hydro balance, glacier stability, landscape erosion). The leading idea of this work is to continuously retrieve glacier surface velocity using free ESA Sentinel-1 SAR imagery and exploiting the potentialities of the Google Earth Engine (GEE) platform. GEE has recently been released by Google as a platform for petabyte-scale scientific analysis and visualization of geospatial datasets. The SAR offset-tracking algorithm developed at the Geodesy and Geomatics Division of the University of Rome La Sapienza has been integrated in a cloud-based platform that automatically processes large stacks of Sentinel-1 data to retrieve glacier surface velocity field time series. We processed about 600 Sentinel-1 image pairs to obtain a continuous time series of velocity field measurements over 3 years, from January 2015 to January 2018, for two large glaciers located in the Northern Patagonian Ice Field (NPIF), the San Rafael and the San Quintin glaciers. Several results related to these glaciers, also validated against already available and renowned software (i.e. ESA SNAP, CIAS) and against optical sensor measurements (i.e. LANDSAT8), highlight the potential of the Big Data analysis to automatically monitor glacier surface velocity fields at global scale, exploiting the synergy between GEE and Sentinel-1 imagery.
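
    The offset-tracking code itself is the authors' own integration, but the Sentinel-1 data selection step on Google Earth Engine can be sketched with the public Earth Engine Python API; the glacier coordinates, buffer, date range and polarisation filter below are illustrative assumptions, not the study's actual configuration.

```python
import ee  # earthengine-api; requires prior authentication with Google Earth Engine

ee.Initialize()

# Approximate location of the San Rafael glacier, Northern Patagonian Ice Field.
glacier = ee.Geometry.Point(-73.83, -46.68).buffer(20000)

# Sentinel-1 GRD scenes over the glacier for the study period (polarisation assumed).
s1 = (ee.ImageCollection('COPERNICUS/S1_GRD')
        .filterBounds(glacier)
        .filterDate('2015-01-01', '2018-01-31')
        .filter(ee.Filter.eq('instrumentMode', 'IW'))
        .filter(ee.Filter.listContains('transmitterReceiverPolarisation', 'VV'))
        .sort('system:time_start'))

print('Sentinel-1 scenes found:', s1.size().getInfo())
# Consecutive scene pairs from this collection would then be passed to the
# offset-tracking code to derive the surface velocity field time series.
```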

  11. How Big Is Too Big?

    Science.gov (United States)

    Cibes, Margaret; Greenwood, James

    2016-01-01

    Media Clips appears in every issue of Mathematics Teacher, offering readers contemporary, authentic applications of quantitative reasoning based on print or electronic media. This issue features "How Big is Too Big?" (Margaret Cibes and James Greenwood) in which students are asked to analyze the data and tables provided and answer a…

  12. Nogales flood detention study

    Science.gov (United States)

    Norman, Laura M.; Levick, Lainie; Guertin, D. Phillip; Callegary, James; Guadarrama, Jesus Quintanar; Anaya, Claudia Zulema Gil; Prichard, Andrea; Gray, Floyd; Castellanos, Edgar; Tepezano, Edgar; Huth, Hans; Vandervoet, Prescott; Rodriguez, Saul; Nunez, Jose; Atwood, Donald; Granillo, Gilberto Patricio Olivero; Ceballos, Francisco Octavio Gastellum

    2010-01-01

    Flooding in Ambos Nogales often exceeds the capacity of the channel and adjacent land areas, endangering many people. The Nogales Wash is being studied to prevent future flood disasters and detention features are being installed in tributaries of the wash. This paper describes the application of the KINEROS2 model and efforts to understand the capacity of these detention features under various flood and urbanization scenarios. Results depict a reduction in peak flow for the 10-year, 1-hour event based on current land use in tributaries with detention features. However, model results also demonstrate that larger storm events and increasing urbanization will put a strain on the features and limit their effectiveness.

  13. Big Surveys, Big Data Centres

    Science.gov (United States)

    Schade, D.

    2016-06-01

    Well-designed astronomical surveys are powerful and have consistently been keystones of scientific progress. The Byurakan Surveys using a Schmidt telescope with an objective prism produced a list of about 3000 UV-excess Markarian galaxies but these objects have stimulated an enormous amount of further study and appear in over 16,000 publications. The CFHT Legacy Surveys used a wide-field imager to cover thousands of square degrees and those surveys are mentioned in over 1100 publications since 2002. Both ground and space-based astronomy have been increasing their investments in survey work. Survey instrumentation strives toward fair samples and large sky coverage and therefore strives to produce massive datasets. Thus we are faced with the "big data" problem in astronomy. Survey datasets require specialized approaches to data management. Big data places additional challenging requirements for data management. If the term "big data" is defined as data collections that are too large to move then there are profound implications for the infrastructure that supports big data science. The current model of data centres is obsolete. In the era of big data the central problem is how to create architectures that effectively manage the relationship between data collections, networks, processing capabilities, and software, given the science requirements of the projects that need to be executed. A stand alone data silo cannot support big data science. I'll describe the current efforts of the Canadian community to deal with this situation and our successes and failures. I'll talk about how we are planning in the next decade to try to create a workable and adaptable solution to support big data science.

  14. Understanding the allure of big infrastructure: Jakarta’s Great Garuda Sea Wall Project

    Directory of Open Access Journals (Sweden)

    Emma Colven

    2017-06-01

    Full Text Available In response to severe flooding in Jakarta, a consortium of Dutch firms in collaboration with the Indonesian government has designed the 'Great Garuda Sea Wall' project. The master plan proposes to construct a sea wall to enclose Jakarta Bay. A new waterfront city will be built on over 1000 hectares (ha) of reclaimed land in the shape of the Garuda, Indonesia’s national symbol. By redeveloping North Jakarta, the project promises to realise the world-class city aspirations of Indonesia’s political elites. Heavily reliant on hydrological engineering, hard infrastructure and private capital, the project has been presented by proponents as the optimum way to protect the city from flooding. The project retains its allure among political elites despite not directly addressing land subsidence, understood to be a primary cause of flooding. I demonstrate how this project is driven by a techno-political network that brings together political and economic interests, world-class city discourses, engineering expertise, colonial histories, and postcolonial relations between Jakarta and the Netherlands. Due in part to this network, big infrastructure has long constituted the preferred state response to flooding in Jakarta. I thus make a case for provincialising narratives that claim we are witnessing a return to big infrastructure in water management.

  15. Effectiveness of flood damage mitigation measures: Empirical evidence from French flood disasters

    NARCIS (Netherlands)

    Poussin, J.K.; Botzen, W.J.W.; Aerts, J.C.J.H.

    2015-01-01

    Recent destructive flood events and projected increases in flood risks as a result of climate change in many regions around the world demonstrate the importance of improving flood risk management. Flood-proofing of buildings is often advocated as an effective strategy for limiting damage caused by

  16. Enhancing Big Data Value Using Knowledge Discovery Techniques

    OpenAIRE

    Mai Abdrabo; Mohammed Elmogy; Ghada Eltaweel; Sherif Barakat

    2016-01-01

    The world has been drowned by floods of data due to technological development. Consequently, the term Big Data has been coined to describe this gigantic volume. Different sorts of fast data are doubling every second. We have to profit from this enormous surge of data by converting it into knowledge. Knowledge Discovery (KDD) can enhance the detection of the value of Big Data based on techniques and technologies like Hadoop, MapReduce, and NoSQL. The use of Big D...

  17. Health impacts of floods.

    Science.gov (United States)

    Du, Weiwei; FitzGerald, Gerard Joseph; Clark, Michele; Hou, Xiang-Yu

    2010-01-01

    Floods are the most common hazard to cause disasters and have led to extensive morbidity and mortality throughout the world. The impact of floods on the human community is related directly to the location and topography of the area, as well as human demographics and characteristics of the built environment. The aim of this study is to identify the health impacts of disasters and the underlying causes of health impacts associated with floods. A conceptual framework is developed that may assist with the development of a rational and comprehensive approach to prevention, mitigation, and management. This study involved an extensive literature review that located >500 references, which were analyzed to identify common themes, findings, and expert views. The findings then were distilled into common themes. The health impacts of floods are wide ranging, and depend on a number of factors. However, the health impacts of a particular flood are specific to the particular context. The immediate health impacts of floods include drowning, injuries, hypothermia, and animal bites. Health risks also are associated with the evacuation of patients, loss of health workers, and loss of health infrastructure including essential drugs and supplies. In the medium-term, infected wounds, complications of injury, poisoning, poor mental health, communicable diseases, and starvation are indirect effects of flooding. In the long-term, chronic disease, disability, poor mental health, and poverty-related diseases including malnutrition are the potential legacy. This article proposes a structured approach to the classification of the health impacts of floods and a conceptual framework that demonstrates the relationships between floods and the direct and indirect health consequences.

  18. Big Opportunities and Big Concerns of Big Data in Education

    Science.gov (United States)

    Wang, Yinying

    2016-01-01

    Against the backdrop of the ever-increasing influx of big data, this article examines the opportunities and concerns over big data in education. Specifically, this article first introduces big data, followed by delineating the potential opportunities of using big data in education in two areas: learning analytics and educational policy. Then, the…

  19. Google BigQuery analytics

    CERN Document Server

    Tigani, Jordan

    2014-01-01

    How to effectively use BigQuery, avoid common mistakes, and execute sophisticated queries against large datasets Google BigQuery Analytics is the perfect guide for business and data analysts who want the latest tips on running complex queries and writing code to communicate with the BigQuery API. The book uses real-world examples to demonstrate current best practices and techniques, and also explains and demonstrates streaming ingestion, transformation via Hadoop in Google Compute engine, AppEngine datastore integration, and using GViz with Tableau to generate charts of query results. In addit
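
    As a flavour of the workflow the book addresses, a minimal sketch using the google-cloud-bigquery Python client against a public dataset is shown below; the dataset, query and configured credentials are assumptions on the reader's side, not an excerpt from the book.

```python
from google.cloud import bigquery  # pip install google-cloud-bigquery

# Assumes a GCP project and application-default credentials are already configured.
client = bigquery.Client()

query = """
    SELECT name, SUM(number) AS total
    FROM `bigquery-public-data.usa_names.usa_1910_2013`
    WHERE state = 'TX'
    GROUP BY name
    ORDER BY total DESC
    LIMIT 5
"""

# Run the query and iterate over the result rows.
for row in client.query(query).result():
    print(row.name, row.total)
```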

  20. Big Dreams

    Science.gov (United States)

    Benson, Michael T.

    2015-01-01

    The Keen Johnson Building is symbolic of Eastern Kentucky University's historic role as a School of Opportunity. It is a place that has inspired generations of students, many from disadvantaged backgrounds, to dream big dreams. The construction of the Keen Johnson Building was inspired by a desire to create a student union facility that would not…

  1. Big Science

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    1986-05-15

    Astronomy, like particle physics, has become Big Science where the demands of front line research can outstrip the science budgets of whole nations. Thus came into being the European Southern Observatory (ESO), founded in 1962 to provide European scientists with a major modern observatory to study the southern sky under optimal conditions.

  2. Laboratory experiment demonstrating the way in which a steam barrier prevents the dissolution of salt buried in a flooded packed bed

    International Nuclear Information System (INIS)

    Taylor, R.W.; Bowen, D.

    1977-01-01

    We have conducted a laboratory experiment to demonstrate a way in which a solid material can be prevented from dissolving in water. The differential solubility of salt (NaCl) in steam vs water is exploited. As long as the temperature of the area and water surrounding the salt is maintained above the boiling point of water, the salt cannot dissolve. This phenomenon, known as the thermal barrier, has far-reaching implications for preventing the dispersal of contaminants present near groundwater sources

  3. Big Egos in Big Science

    DEFF Research Database (Denmark)

    Andersen, Kristina Vaarst; Jeppesen, Jacob

    In this paper we investigate the micro-mechanisms governing structural evolution and performance of scientific collaboration. Scientific discovery tends not to be led by so-called lone 'stars', or big egos, but instead by collaboration among groups of researchers, from a multitude of institutions...

  4. Big Data and Big Science

    OpenAIRE

    Di Meglio, Alberto

    2014-01-01

    Brief introduction to the challenges of big data in scientific research based on the work done by the HEP community at CERN and how the CERN openlab promotes collaboration among research institutes and industrial IT companies. Presented at the FutureGov 2014 conference in Singapore.

  5. Big inquiry

    Energy Technology Data Exchange (ETDEWEB)

    Wynne, B [Lancaster Univ. (UK)

    1979-06-28

    The recently published report entitled 'The Big Public Inquiry' from the Council for Science and Society and the Outer Circle Policy Unit is considered, with especial reference to any future enquiry which may take place into the first commercial fast breeder reactor. Proposals embodied in the report include stronger rights for objectors and an attempt is made to tackle the problem that participation in a public inquiry is far too late to be objective. It is felt by the author that the CSS/OCPU report is a constructive contribution to the debate about big technology inquiries but that it fails to understand the deeper currents in the economic and political structure of technology which so influence the consequences of whatever formal procedures are evolved.

  6. Big Data

    OpenAIRE

    Bútora, Matúš

    2017-01-01

    The aim of the bachelor thesis is to describe the Big Data topic and the OLAP aggregation operations for decision support that are applied to it using the Apache Hadoop technology. The major part of the thesis is devoted to describing this technology. The last chapter deals with the way the aggregation operations are applied and with the issues of their implementation. This is followed by an overall evaluation of the work and the possibilities of using the resulting system in the future.

  7. BIG DATA

    OpenAIRE

    Abhishek Dubey

    2018-01-01

    The term 'Big Data' describes innovative methods and technologies to capture, store, distribute, manage and analyse petabyte-scale or larger sets of data with high velocity and diverse structures. Big data can be structured, unstructured or semi-structured, which renders conventional data management techniques inadequate. Data are produced from many different sources and can arrive in the system at different rates. In order to handle this...

  8. Application of Indigenous Knowledge to Flood Prevention and ...

    African Journals Online (AJOL)

    In the last three decades, flooding has become a nightmare associated with rainfall on all the continents of the world, recording heavy casualties everywhere each time it occurs. Flooding is now a big and seemingly unstoppable environmental threat to rural and urban settlements, in both developed and developing ...

  9. Keynote: Big Data, Big Opportunities

    OpenAIRE

    Borgman, Christine L.

    2014-01-01

    The enthusiasm for big data is obscuring the complexity and diversity of data in scholarship and the challenges for stewardship. Inside the black box of data are a plethora of research, technology, and policy issues. Data are not shiny objects that are easily exchanged. Rather, data are representations of observations, objects, or other entities used as evidence of phenomena for the purposes of research or scholarship. Data practices are local, varying from field to field, individual to indiv...

  10. Networking for big data

    CERN Document Server

    Yu, Shui; Misic, Jelena; Shen, Xuemin (Sherman)

    2015-01-01

    Networking for Big Data supplies an unprecedented look at cutting-edge research on the networking and communication aspects of Big Data. Starting with a comprehensive introduction to Big Data and its networking issues, it offers deep technical coverage of both theory and applications. The book is divided into four sections: introduction to Big Data, networking theory and design for Big Data, networking security for Big Data, and platforms and systems for Big Data applications. Focusing on key networking issues in Big Data, the book explains network design and implementation for Big Data. It exa

  11. Big Data

    DEFF Research Database (Denmark)

    Aaen, Jon; Nielsen, Jeppe Agger

    2016-01-01

    Big Data presents itself as one of the most hyped technological innovations of our time, proclaimed to contain the seeds of new, valuable operational insights for private companies and public organizations. While the optimistic announcements are many, research on Big Data in the public sector has so far been limited. This article examines how the public health sector can reuse and exploit an ever larger amount of data while taking public values into account. The article builds on a case study of the use of large amounts of health data in the Danish general practice database, Dansk AlmenMedicinsk Database (DAMD). The analysis shows that (re)use of data in new contexts is a many-sided trade-off involving not only economic rationales and quality considerations, but also control over personally sensitive data and ethical implications for the citizen. In the DAMD case, data is, on the one hand, used 'in the service of the good cause' to...

  12. Finding the big bang

    CERN Document Server

    Page, Lyman A; Partridge, R Bruce

    2009-01-01

    Cosmology, the study of the universe as a whole, has become a precise physical science, the foundation of which is our understanding of the cosmic microwave background radiation (CMBR) left from the big bang. The story of the discovery and exploration of the CMBR in the 1960s is recalled for the first time in this collection of 44 essays by eminent scientists who pioneered the work. Two introductory chapters put the essays in context, explaining the general ideas behind the expanding universe and fossil remnants from the early stages of the expanding universe. The last chapter describes how the confusion of ideas and measurements in the 1960s grew into the present tight network of tests that demonstrate the accuracy of the big bang theory. This book is valuable to anyone interested in how science is done, and what it has taught us about the large-scale nature of the physical universe.

  13. The Global Flood Model

    Science.gov (United States)

    Williams, P.; Huddelston, M.; Michel, G.; Thompson, S.; Heynert, K.; Pickering, C.; Abbott Donnelly, I.; Fewtrell, T.; Galy, H.; Sperna Weiland, F.; Winsemius, H.; Weerts, A.; Nixon, S.; Davies, P.; Schiferli, D.

    2012-04-01

    Recently, a Global Flood Model (GFM) initiative has been proposed by Willis, UK Met Office, Esri, Deltares and IBM. The idea is to create a global community platform that enables better understanding of the complexities of flood risk assessment to better support the decisions, education and communication needed to mitigate flood risk. The GFM will provide tools for assessing the risk of floods, for devising mitigation strategies such as land-use changes and infrastructure improvements, and for enabling effective pre- and post-flood event response. The GFM combines humanitarian and commercial motives. It will benefit: - The public, seeking to preserve personal safety and property; - State and local governments, seeking to safeguard economic activity, and improve resilience; - NGOs, similarly seeking to respond proactively to flood events; - The insurance sector, seeking to understand and price flood risk; - Large corporations, seeking to protect global operations and supply chains. The GFM is an integrated and transparent set of modules, each composed of models and data. For each module, there are two core elements: a live "reference version" (a worked example) and a framework of specifications, which will allow development of alternative versions. In the future, users will be able to work with the reference version or substitute their own models and data. If these meet the specification for the relevant module, they will interoperate with the rest of the GFM. Some "crowd-sourced" modules could even be accredited and published to the wider GFM community. Our intent is to build on existing public, private and academic work, improve local adoption, and stimulate the development of multiple - but compatible - alternatives, so strengthening mankind's ability to manage flood impacts. The GFM is being developed and managed by a non-profit organization created for the purpose. The business model will be inspired by open source software (e.g. Linux): - for non-profit usage
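
    The GFM module "specifications" are not published as code; the sketch below is only a hypothetical illustration of how a reference version and a user-supplied alternative could interoperate behind a shared interface, with all class names, methods and numbers invented for the example.

```python
from abc import ABC, abstractmethod

class HazardModule(ABC):
    """Hypothetical specification: a hazard module maps a region and a return
    period (years) to a flood depth estimate."""

    @abstractmethod
    def flood_depth(self, region: str, return_period: float) -> dict:
        ...

class ReferenceHazardModule(HazardModule):
    """Stand-in for a GFM 'reference version' (a worked example)."""

    def flood_depth(self, region: str, return_period: float) -> dict:
        # Toy numbers only; a real module would run or look up a hydraulic model.
        return {"region": region, "return_period": return_period,
                "depth_m": 0.01 * return_period}

def assess(module: HazardModule, region: str) -> None:
    # Any module meeting the specification can be swapped in here.
    for rp in (10, 100, 1000):
        print(module.flood_depth(region, rp))

assess(ReferenceHazardModule(), "Thames estuary")
```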

  14. Visualizing big energy data

    DEFF Research Database (Denmark)

    Hyndman, Rob J.; Liu, Xueqin Amy; Pinson, Pierre

    2018-01-01

    Visualization is a crucial component of data analysis. It is always a good idea to plot the data before fitting models, making predictions, or drawing conclusions. As sensors of the electric grid are collecting large volumes of data from various sources, power industry professionals are facing the challenge of visualizing such data in a timely fashion. In this article, we demonstrate several data-visualization solutions for big energy data through three case studies involving smart-meter data, phasor measurement unit (PMU) data, and probabilistic forecasts, respectively.
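
    One visualisation this kind of study typically relies on, a day-by-hour load heat map of smart-meter readings, can be sketched as follows; the synthetic demand profile stands in for real data and is not taken from the article.

```python
import numpy as np
import matplotlib.pyplot as plt

# Synthetic half-hourly smart-meter demand for 30 days (stand-in for real readings).
rng = np.random.default_rng(0)
hours = np.arange(48) / 2.0
daily_profile = 0.4 + 0.6 * np.exp(-((hours - 18.5) ** 2) / 8.0)  # evening peak
load = daily_profile + 0.1 * rng.standard_normal((30, 48))        # 30 days x 48 half-hours

fig, ax = plt.subplots(figsize=(8, 4))
im = ax.imshow(load, aspect="auto", origin="lower", cmap="viridis")
ax.set_xlabel("Half-hour of day")
ax.set_ylabel("Day")
ax.set_title("Smart-meter demand heat map (synthetic data)")
fig.colorbar(im, ax=ax, label="kW")
plt.show()
```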

  15. A little big history of Tiananmen

    NARCIS (Netherlands)

    Quaedackers, E.; Grinin, L.E.; Korotayev, A.V.; Rodrigue, B.H.

    2011-01-01

    This contribution aims at demonstrating the usefulness of studying small-scale subjects such as Tiananmen, or the Gate of Heavenly Peace, in Beijing - from a Big History perspective. By studying such a ‘little big history’ of Tiananmen, previously overlooked yet fundamental explanations for why

  16. Further demonstration of the VRLA-type UltraBattery under medium-HEV duty and development of the flooded-type UltraBattery for micro-HEV applications

    Energy Technology Data Exchange (ETDEWEB)

    Furukawa, J.; Takada, T.; Monma, D. [The Furukawa Battery Co., Ltd., R and D Division, 23-6 Kuidesaku, Shimofunao-machi, Joban, Iwaki-city, 972-8501 (Japan); Lam, L.T. [CSIRO Energy Technology, Bayview Avenue, Clayton South, Vic. 3169 (Australia)

    2010-02-15

    The UltraBattery has been invented by the CSIRO Energy Technology in Australia and has been developed and produced by the Furukawa Battery Co., Ltd., Japan. This battery is a hybrid energy storage device which combines a super capacitor and a lead-acid battery in single unit cells, taking the best from both technologies without the need of extra, expensive electronic controls. The capacitor enhances the power and lifespan of the lead-acid battery as it acts as a buffer during high-rate discharging and charging, thus enabling it to provide and absorb charge rapidly during vehicle acceleration and braking. The laboratory results of the prototype valve-regulated UltraBatteries show that the capacity, power, available energy, cold cranking and self-discharge of these batteries have met, or exceeded, all the respective performance targets set for both minimum and maximum power-assist HEVs. The cycling performance of the UltraBatteries under micro-, mild- and full-HEV duties is at least four times longer than that of the state-of-the-art lead-acid batteries. Importantly, the cycling performance of UltraBatteries is proven to be comparable or even better than that of the Ni-MH cells. On the other hand, the field trial of UltraBatteries in the Honda Insight HEV shows that the vehicle has surpassed 170,000 km and the batteries are still in a healthy condition. Furthermore, the UltraBatteries demonstrate very good acceptance of the charge from regenerative braking even at high state-of-charge, e.g., 70% during driving. Therefore, no equalization charge is required for the UltraBatteries during field trial. The HEV powered by UltraBatteries gives slightly higher fuel consumption (cf., 4.16 with 4.05 L/100 km) and CO2 emissions (cf., 98.8 with 96 g/km) compared with that by Ni-MH cells. There are no differences in driving experience between the Honda Insight powered by UltraBatteries and by Ni-MH cells. Given such comparable performance, the UltraBattery pack

  17. Flexibility in flood management design: proactive planning under uncertainty

    Science.gov (United States)

    Smet, K.; de Neufville, R.; van der Vlist, M.

    2016-12-01

    This paper presents a value-enhancing approach for proactive planning and design of long-lived flood management infrastructure given uncertain future flooding threats. Designing infrastructure that can be adapted over time is a method to safeguard the efficacy of current design decisions given future uncertainties. We explore the value of embedding "options" in a physical structure, where an option is the right but not the obligation to do something at a later date (e.g. over-dimensioning a floodwall foundation now facilitates a future height addition in response to observed increases in sea level; building extra pump bays in a drainage pumping station enables the easy addition of pumping capacity whenever increased precipitation warrants an expansion.) The proposed approach couples a simulation model that captures future climate induced changes to the hydrologic operating environment of a structure, with an economic model that estimates the lifetime economic performance of alternative investment strategies. The economic model uses Real "In" Options analysis, a type of cash flow analysis that quantifies the implicit value of options and the flexibility they provide. We demonstrate the approach using replacement planning for the multi-functional pumping station IJmuiden on the North Sea Canal in the Netherlands. The analysis models flexibility in design decisions, varying the size and specific options included in the new structure. Results indicate that the incorporation of options within the structural design has the potential to improve its economic performance, as compared to more traditional, "build it once and build it big" designs where flexibility is not an explicit design criterion. The added value resulting from the incorporation of flexibility varies with the range of future conditions considered, and the specific options examined. This approach could be applied to explore investment strategies for the design of other flood management structures, as well
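
    The Real "In" Options calculation is, at its core, a discounted cash-flow comparison of flexible and inflexible designs across uncertain futures; the Monte Carlo sketch below illustrates that logic only, with all costs, probabilities and the discount rate being invented assumptions rather than figures from the IJmuiden case.

```python
import numpy as np

rng = np.random.default_rng(42)
N, YEARS, RATE = 10_000, 50, 0.03
discount = 1.0 / (1.0 + RATE) ** np.arange(YEARS)

# Uncertain future: the year in which extra pumping capacity becomes necessary.
# Years beyond the planning horizon mean the capacity is never needed.
need_year = rng.integers(10, 80, size=N)

def expected_cost(initial, expansion):
    """Discounted lifecycle cost: build now, expand only when capacity is needed."""
    cost = np.full(N, float(initial))
    needed = need_year < YEARS
    cost[needed] += expansion * discount[need_year[needed]]
    return cost.mean()

# "Build it once and build it big": full capacity up front, no later spend.
big_once = expected_cost(initial=120.0, expansion=0.0)
# Flexible: smaller station with spare pump bays (the embedded option) plus a
# cheaper expansion only in scenarios where it is actually required.
flexible = expected_cost(initial=90.0, expansion=25.0)

print(f"Expected cost, build-big-once: {big_once:6.1f}")
print(f"Expected cost, flexible      : {flexible:6.1f}")
print(f"Value of flexibility         : {big_once - flexible:6.1f}")
```

    In the paper's framing, a difference of this kind is the implicit value of the embedded option; here it falls out of nothing more than discounting and the chance that the expansion is never needed.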

  18. Big Data, Big Problems: A Healthcare Perspective.

    Science.gov (United States)

    Househ, Mowafa S; Aldosari, Bakheet; Alanazi, Abdullah; Kushniruk, Andre W; Borycki, Elizabeth M

    2017-01-01

    Much has been written on the benefits of big data for healthcare such as improving patient outcomes, public health surveillance, and healthcare policy decisions. Over the past five years, Big Data, and the data sciences field in general, has been hyped as the "Holy Grail" for the healthcare industry, promising a more efficient healthcare system with improved healthcare outcomes. However, more recently, healthcare researchers are exposing the potential harmful effects Big Data can have on patient care, associating it with increased medical costs, patient mortality, and misguided decision making by clinicians and healthcare policy makers. In this paper, we review the current Big Data trends with a specific focus on the inadvertent negative impacts that Big Data could have on healthcare, in general, and specifically, as it relates to patient and clinical care. Our study results show that although Big Data is built up to be the "Holy Grail" for healthcare, small data techniques using traditional statistical methods are, in many cases, more accurate and can lead to more improved healthcare outcomes than Big Data methods. In sum, Big Data for healthcare may cause more problems for the healthcare industry than solutions, and in short, when it comes to the use of data in healthcare, "size isn't everything."

  19. Big Game Reporting Stations

    Data.gov (United States)

    Vermont Center for Geographic Information — Point locations of big game reporting stations. Big game reporting stations are places where hunters can legally report harvested deer, bear, or turkey. These are...

  20. Stalin's Big Fleet Program

    National Research Council Canada - National Science Library

    Hauner, Milan

    2002-01-01

    Although Dr. Milan Hauner's study 'Stalin's Big Fleet program' has focused primarily on the formation of Big Fleets during the Tsarist and Soviet periods of Russia's naval history, there are important lessons...

  1. Big Data Semantics

    NARCIS (Netherlands)

    Ceravolo, Paolo; Azzini, Antonia; Angelini, Marco; Catarci, Tiziana; Cudré-Mauroux, Philippe; Damiani, Ernesto; Mazak, Alexandra; van Keulen, Maurice; Jarrar, Mustafa; Santucci, Giuseppe; Sattler, Kai-Uwe; Scannapieco, Monica; Wimmer, Manuel; Wrembel, Robert; Zaraket, Fadi

    2018-01-01

    Big Data technology has discarded traditional data modeling approaches as no longer applicable to distributed data processing. It is, however, largely recognized that Big Data impose novel challenges in data and infrastructure management. Indeed, multiple components and procedures must be

  2. Characterizing Big Data Management

    OpenAIRE

    Rogério Rossi; Kechi Hirama

    2015-01-01

    Big data management is a reality for an increasing number of organizations in many areas and represents a set of challenges involving big data modeling, storage and retrieval, analysis and visualization. However, technological resources, people and processes are crucial to facilitate the management of big data in any kind of organization, allowing information and knowledge from a large volume of data to support decision-making. Big data management can be supported by these three dimensions: t...

  3. Social big data mining

    CERN Document Server

    Ishikawa, Hiroshi

    2015-01-01

    Social Media. Big Data and Social Data. Hypotheses in the Era of Big Data. Social Big Data Applications. Basic Concepts in Data Mining. Association Rule Mining. Clustering. Classification. Prediction. Web Structure Mining. Web Content Mining. Web Access Log Mining, Information Extraction and Deep Web Mining. Media Mining. Scalability and Outlier Detection.

  4. The analysis on the flood property of Weihe River in 2003

    International Nuclear Information System (INIS)

    Liu Longqing; Jiang Xinhui

    2004-01-01

    From the end of August to October 2003, serious rainfall occurred in the valley of the Weihe River, the largest tributary of the Yellow River. The rainfall was historically rare for its long duration, so that 5 successive floods formed at the controlling hydrological station, Huaxian station. Those floods overflowed the floodplain, broke the dykes and flooded a large area of the lower Weihe River. The disaster forced nearly 200,000 people to leave their homes and caused serious economic losses. The long duration, high water levels and large volume of the floods are rare in the historical record. The flood peak at Huaxian station reached 3570 m3/s, the biggest peak since 1992. In recent years, because big floods on the Weihe River have been rare, the main channel has shrunk markedly, flood propagation times have lengthened and the discharge at which flow spills onto the floodplain had fallen to only 800-1000 m3/s. The water-producing areas of these floods lay in regions with little sediment production, so the sediment content of the river was low. As a result, the main channel was eroded, the conveyance capacity of the river course gradually increased and the floodplain-spilling discharge recovered to above 2000 m3/s. From the analyses of the flood components and flood evolution, the conclusion is that sediment deposition and rising of the channel bed, shrinking of the main channel, decreasing floodplain-spilling discharge, increasing attenuation of large peaks and lengthening propagation times have become the common pattern of rivers in arid and semi-arid districts. This pattern can very easily create serious calamity during a big flood, and the flood behaviour of such areas should be researched further. (Author)

  5. [Big data in imaging].

    Science.gov (United States)

    Sewerin, Philipp; Ostendorf, Benedikt; Hueber, Axel J; Kleyer, Arnd

    2018-04-01

    Until now, most major medical advancements have been achieved through hypothesis-driven research within the scope of clinical trials. However, due to a multitude of variables, only a certain number of research questions could be addressed during a single study, thus rendering these studies expensive and time consuming. Big data acquisition enables a new data-based approach in which large volumes of data can be used to investigate all variables, thus opening new horizons. Due to universal digitalization of the data as well as ever-improving hard- and software solutions, imaging would appear to be predestined for such analyses. Several small studies have already demonstrated that automated analysis algorithms and artificial intelligence can identify pathologies with high precision. Such automated systems would also seem well suited for rheumatology imaging, since a method for individualized risk stratification has long been sought for these patients. However, despite all the promising options, the heterogeneity of the data and highly complex regulations covering data protection in Germany would still render a big data solution for imaging difficult today. Overcoming these boundaries is challenging, but the enormous potential advances in clinical management and science render pursuit of this goal worthwhile.

  6. Big data computing

    CERN Document Server

    Akerkar, Rajendra

    2013-01-01

    Due to market forces and technological evolution, Big Data computing is developing at an increasing rate. A wide variety of novel approaches and tools have emerged to tackle the challenges of Big Data, creating both more opportunities and more challenges for students and professionals in the field of data computation and analysis. Presenting a mix of industry cases and theory, Big Data Computing discusses the technical and practical issues related to Big Data in intelligent information management. Emphasizing the adoption and diffusion of Big Data tools and technologies in industry, the book i

  7. Microsoft big data solutions

    CERN Document Server

    Jorgensen, Adam; Welch, John; Clark, Dan; Price, Christopher; Mitchell, Brian

    2014-01-01

    Tap the power of Big Data with Microsoft technologies Big Data is here, and Microsoft's new Big Data platform is a valuable tool to help your company get the very most out of it. This timely book shows you how to use HDInsight along with HortonWorks Data Platform for Windows to store, manage, analyze, and share Big Data throughout the enterprise. Focusing primarily on Microsoft and HortonWorks technologies but also covering open source tools, Microsoft Big Data Solutions explains best practices, covers on-premises and cloud-based solutions, and features valuable case studies. Best of all,

  8. Characterizing Big Data Management

    Directory of Open Access Journals (Sweden)

    Rogério Rossi

    2015-06-01

    Full Text Available Big data management is a reality for an increasing number of organizations in many areas and represents a set of challenges involving big data modeling, storage and retrieval, analysis and visualization. However, technological resources, people and processes are crucial to facilitate the management of big data in any kind of organization, allowing information and knowledge from a large volume of data to support decision-making. Big data management can be supported by these three dimensions: technology, people and processes. Hence, this article discusses these dimensions: the technological dimension that is related to storage, analytics and visualization of big data; the human aspects of big data; and, in addition, the process management dimension that involves in a technological and business approach the aspects of big data management.

  9. Harvesting Social Media for Generation of Near Real-time Flood Maps

    NARCIS (Netherlands)

    Eilander, Dirk; Trambauer, Patricia; Wagemaker, Jurjen; Van Loenen, Arnejan

    2016-01-01

    Social media are a new, big and exciting source of data. Rather than from traditional sensors and models, this data is from local people experiencing real-world phenomena, such as flood events. During floods, disaster managers often have trouble getting an accurate overview of the current situation.

  10. What is beyond the big five?

    Science.gov (United States)

    Saucier, G; Goldberg, L R

    1998-08-01

    Previous investigators have proposed that various kinds of person-descriptive content--such as differences in attitudes or values, in sheer evaluation, in attractiveness, or in height and girth--are not adequately captured by the Big Five Model. We report on a rather exhaustive search for reliable sources of Big Five-independent variation in data from person-descriptive adjectives. Fifty-three candidate clusters were developed in a college sample using diverse approaches and sources. In a nonstudent adult sample, clusters were evaluated with respect to a minimax criterion: minimum multiple correlation with factors from Big Five markers and maximum reliability. The most clearly Big Five-independent clusters referred to Height, Girth, Religiousness, Employment Status, Youthfulness and Negative Valence (or low-base-rate attributes). Clusters referring to Fashionableness, Sensuality/Seductiveness, Beauty, Masculinity, Frugality, Humor, Wealth, Prejudice, Folksiness, Cunning, and Luck appeared to be potentially beyond the Big Five, although each of these clusters demonstrated Big Five multiple correlations of .30 to .45, and at least one correlation of .20 and over with a Big Five factor. Of all these content areas, Religiousness, Negative Valence, and the various aspects of Attractiveness were found to be represented by a substantial number of distinct, common adjectives. Results suggest directions for supplementing the Big Five when one wishes to extend variable selection outside the domain of personality traits as conventionally defined.
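
    The minimax criterion (minimise the multiple correlation with Big Five markers, maximise reliability) can be illustrated numerically; the synthetic scores below are stand-ins for real adjective ratings and the code is not taken from the original study.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500

# Synthetic Big Five factor scores, and a 4-item candidate cluster that is
# internally consistent but largely independent of the Big Five.
big5 = rng.standard_normal((n, 5))
latent = rng.standard_normal(n)
items = latent[:, None] + 0.8 * rng.standard_normal((n, 4))
cluster = items.mean(axis=1)

# Multiple correlation of the cluster score with the five factors (least-squares fit).
X = np.column_stack([np.ones(n), big5])
beta, *_ = np.linalg.lstsq(X, cluster, rcond=None)
R = np.corrcoef(X @ beta, cluster)[0, 1]

# Cronbach's alpha as the reliability index.
k = items.shape[1]
alpha = k / (k - 1) * (1 - items.var(axis=0, ddof=1).sum()
                       / items.sum(axis=1).var(ddof=1))

print(f"Multiple R with Big Five markers: {R:.2f}  (minimise)")
print(f"Cronbach's alpha                : {alpha:.2f}  (maximise)")
```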

  11. Flooding and Schools

    Science.gov (United States)

    National Clearinghouse for Educational Facilities, 2011

    2011-01-01

    According to the Federal Emergency Management Agency, flooding is the nation's most common natural disaster. Some floods develop slowly during an extended period of rain or in a warming trend following a heavy snow. Flash floods can occur quickly, without any visible sign of rain. Catastrophic floods are associated with burst dams and levees,…

  12. HARNESSING BIG DATA VOLUMES

    Directory of Open Access Journals (Sweden)

    Bogdan DINU

    2014-04-01

    Full Text Available Big Data can revolutionize humanity. Hidden within the huge amounts and variety of the data we are creating we may find information, facts, social insights and benchmarks that were once virtually impossible to find or were simply inexistent. Large volumes of data allow organizations to tap in real time the full potential of all the internal or external information they possess. Big data calls for quick decisions and innovative ways to assist customers and the society as a whole. Big data platforms and product portfolio will help customers harness to the full the value of big data volumes. This paper deals with technical and technological issues related to handling big data volumes in the Big Data environment.

  13. The big bang

    International Nuclear Information System (INIS)

    Chown, Marcus.

    1987-01-01

    The paper concerns the 'Big Bang' theory of the creation of the Universe 15 thousand million years ago, and traces events which physicists predict occurred soon after the creation. Unified theory of the moment of creation, evidence of an expanding Universe, the X-boson -the particle produced very soon after the big bang and which vanished from the Universe one-hundredth of a second after the big bang, and the fate of the Universe, are all discussed. (U.K.)

  14. A little big history of Tiananmen

    OpenAIRE

    Quaedackers, E.; Grinin, L.E.; Korotayev, A.V.; Rodrigue, B.H.

    2011-01-01

    This contribution aims at demonstrating the usefulness of studying small-scale subjects such as Tiananmen, or the Gate of Heavenly Peace, in Beijing - from a Big History perspective. By studying such a ‘little big history’ of Tiananmen, previously overlooked yet fundamental explanations for why people built the gate the way they did can be found. These explanations are useful in their own right and may also be used to deepen our understanding of more traditional explanations of why Tiananmen ...

  15. Automated Big Traffic Analytics for Cyber Security

    OpenAIRE

    Miao, Yuantian; Ruan, Zichan; Pan, Lei; Wang, Yu; Zhang, Jun; Xiang, Yang

    2018-01-01

    Network traffic analytics technology is a cornerstone for cyber security systems. We demonstrate its use through three popular and contemporary cyber security applications in intrusion detection, malware analysis and botnet detection. However, automated traffic analytics faces the challenges raised by big traffic data. In terms of big data's three characteristics --- volume, variety and velocity, we review three state of the art techniques to mitigate the key challenges including real-time tr...

  16. Summary big data

    CERN Document Server

    2014-01-01

    This work offers a summary of the book "Big Data: A Revolution That Will Transform How We Live, Work, and Think" by Viktor Mayer-Schönberger and Kenneth Cukier. The summary explains that big data is where we use huge quantities of data to make better predictions, based on identifying patterns in the data rather than trying to understand the underlying causes in more detail. This summary highlights that big data will be a source of new economic value and innovation in the future. Moreover, it shows that it will

  17. Data: Big and Small.

    Science.gov (United States)

    Jones-Schenk, Jan

    2017-02-01

    Big data is a big topic in all leadership circles. Leaders in professional development must develop an understanding of what data are available across the organization that can inform effective planning for forecasting. Collaborating with others to integrate data sets can increase the power of prediction. Big data alone is insufficient to make big decisions. Leaders must find ways to access small data and triangulate multiple types of data to ensure the best decision making. J Contin Educ Nurs. 2017;48(2):60-61. Copyright 2017, SLACK Incorporated.

  18. A Big Video Manifesto

    DEFF Research Database (Denmark)

    Mcilvenny, Paul Bruce; Davidsen, Jacob

    2017-01-01

    For the last few years, we have witnessed a hype about the potential results and insights that quantitative big data can bring to the social sciences. The wonder of big data has moved into education, traffic planning, and disease control with a promise of making things better with big numbers and beautiful visualisations. However, we also need to ask what the tools of big data can do both for the Humanities and for more interpretative approaches and methods. Thus, we prefer to explore how the power of computation, new sensor technologies and massive storage can also help with video-based qualitative...

  19. A Flood Risk Assessment of Quang Nam, Vietnam Using Spatial Multicriteria Decision Analysis

    Directory of Open Access Journals (Sweden)

    Chinh Luu

    2018-04-01

    Full Text Available Vietnam is highly vulnerable to flood and storm impacts. Holistic flood risk assessment maps that adequately consider flood risk factors of hazard, exposure, and vulnerability are not available. These are vital for flood risk preparedness and disaster mitigation measures at the local scale. Unfortunately, there is a lack of knowledge about spatial multicriteria decision analysis and flood risk analysis more broadly in Vietnam. In response to this need, we identify and quantify flood risk components in Quang Nam province through spatial multicriteria decision analysis. The study presents a new approach to local flood risk assessment mapping, which combines historical flood marks with exposure and vulnerability data. The flood risk map output could assist and empower decision-makers in undertaking flood risk management activities in the province. Our study demonstrates a methodology to build flood risk assessment maps using flood mark, exposure and vulnerability data, which could be applied in other provinces in Vietnam.
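
    The criteria and weights used for Quang Nam are specific to the study, but the core spatial multicriteria overlay of hazard, exposure and vulnerability can be sketched as follows; the toy rasters and weights are illustrative assumptions, not the study's values.

```python
import numpy as np

# Toy 4x4 rasters, each already normalised to 0-1 (stand-ins for real GIS layers).
hazard = np.array([[0.9, 0.8, 0.4, 0.1],
                   [0.7, 0.6, 0.3, 0.1],
                   [0.5, 0.4, 0.2, 0.0],
                   [0.3, 0.2, 0.1, 0.0]])                 # e.g. flood-mark depth/frequency
exposure = np.random.default_rng(3).random((4, 4))        # e.g. population, land use
vulnerability = np.random.default_rng(4).random((4, 4))   # e.g. poverty, housing quality

# Assumed criterion weights; a real study derives them from expert judgement (e.g. AHP).
weights = {"hazard": 0.5, "exposure": 0.3, "vulnerability": 0.2}

risk = (weights["hazard"] * hazard
        + weights["exposure"] * exposure
        + weights["vulnerability"] * vulnerability)

# Classify into categories for mapping: 0 = low ... 3 = very high.
classes = np.digitize(risk, bins=[0.25, 0.5, 0.75])
print(np.round(risk, 2))
print(classes)
```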

  20. Bliver big data til big business?

    DEFF Research Database (Denmark)

    Ritter, Thomas

    2015-01-01

    Denmark has a digital infrastructure, a culture of registration, and IT-competent employees and customers that make a leading position possible, but only if companies get ready for the next big data wave.

  1. Dual of big bang and big crunch

    International Nuclear Information System (INIS)

    Bak, Dongsu

    2007-01-01

    Starting from the Janus solution and its gauge theory dual, we obtain the dual gauge theory description of the cosmological solution by the procedure of double analytic continuation. The coupling is driven either to zero or to infinity at the big-bang and big-crunch singularities, which are shown to be related by the S-duality symmetry. In the dual Yang-Mills theory description, these are nonsingular as the coupling goes to zero in the N=4 super Yang-Mills theory. The cosmological singularities simply signal the failure of the supergravity description of the full type IIB superstring theory

  2. Big data for health.

    Science.gov (United States)

    Andreu-Perez, Javier; Poon, Carmen C Y; Merrifield, Robert D; Wong, Stephen T C; Yang, Guang-Zhong

    2015-07-01

    This paper provides an overview of recent developments in big data in the context of biomedical and health informatics. It outlines the key characteristics of big data and how medical and health informatics, translational bioinformatics, sensor informatics, and imaging informatics will benefit from an integrated approach of piecing together different aspects of personalized information from a diverse range of data sources, both structured and unstructured, covering genomics, proteomics, metabolomics, as well as imaging, clinical diagnosis, and long-term continuous physiological sensing of an individual. It is expected that recent advances in big data will expand our knowledge for testing new hypotheses about disease management from diagnosis to prevention to personalized treatment. The rise of big data, however, also raises challenges in terms of privacy, security, data ownership, data stewardship, and governance. This paper discusses some of the existing activities and future opportunities related to big data for health, outlining some of the key underlying issues that need to be tackled.

  3. Flood Hazard Area

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — The National Flood Hazard Layer (NFHL) data incorporates all Digital Flood Insurance Rate Map(DFIRM) databases published by FEMA, and any Letters Of Map Revision...

  4. Flood Hazard Boundaries

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — The National Flood Hazard Layer (NFHL) data incorporates all Digital Flood Insurance Rate Map(DFIRM) databases published by FEMA, and any Letters Of Map Revision...

  5. Base Flood Elevation

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — The National Flood Hazard Layer (NFHL) data incorporates all Digital Flood Insurance Rate Map(DFIRM) databases published by FEMA, and any Letters Of Map Revision...

  6. Flood simulation and verification with IoT sensors

    Science.gov (United States)

    Chang, Che-Hao; Hsu, Chih-Tsung; Wu, Shiang-Jen; Huang, Sue-Wei

    2017-04-01

    2D dynamic flood simulation is a vivid tool for demonstrating the area likely to be exposed to high water levels. With progress in high-resolution digital terrain models, the simulation results are quite convincing, yet they are rarely verified against what actually happened. Because of the dynamic and uncertain nature of flooding, the exposed area usually cannot be well defined during a flood event. Recent developments in IoT sensors provide low-power, long-range communication that helps us collect real-time flood depths. With these time series of flood depths at different locations, we are able to verify the simulation results for the corresponding flood event. Sixteen flood gauges with IoT specifications and two flood events in Annan District, Tainan City, Taiwan, are examined in this study. During the event on 11 June 2016, 12 flood gauges worked well and 8 of them provided observations matching the simulation.
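
    As a hedged sketch of the kind of verification described (the depth values below are hypothetical placeholders, not the study's data), simulated and observed peak depths at the gauge locations can be compared with simple error metrics:

```python
# Minimal sketch: compare simulated flood depths against IoT gauge observations
# at matching locations. All numbers are illustrative, not the study's data.
import numpy as np

def verify_depths(observed, simulated, tolerance_m=0.10):
    """Return RMSE and the fraction of gauges whose simulated peak depth
    falls within a tolerance of the observed peak depth."""
    observed = np.asarray(observed, dtype=float)
    simulated = np.asarray(simulated, dtype=float)
    rmse = np.sqrt(np.mean((simulated - observed) ** 2))
    hit_rate = np.mean(np.abs(simulated - observed) <= tolerance_m)
    return rmse, hit_rate

# Hypothetical peak depths (m) at 12 working gauges.
obs = [0.35, 0.10, 0.52, 0.20, 0.48, 0.15, 0.30, 0.60, 0.22, 0.41, 0.18, 0.27]
sim = [0.30, 0.14, 0.55, 0.35, 0.45, 0.12, 0.28, 0.70, 0.25, 0.38, 0.05, 0.29]
rmse, hit_rate = verify_depths(obs, sim)
print(f"RMSE = {rmse:.2f} m, gauges matching within 0.10 m: {hit_rate:.0%}")
```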

  7. Global fluctuation spectra in big-crunch-big-bang string vacua

    International Nuclear Information System (INIS)

    Craps, Ben; Ovrut, Burt A.

    2004-01-01

    We study big-crunch-big-bang cosmologies that correspond to exact world-sheet superconformal field theories of type II strings. The string theory spacetime contains a big crunch and a big bang cosmology, as well as additional 'whisker' asymptotic and intermediate regions. Within the context of free string theory, we compute, unambiguously, the scalar fluctuation spectrum in all regions of spacetime. Generically, the big crunch fluctuation spectrum is altered while passing through the bounce singularity. The change in the spectrum is characterized by a function Δ, which is momentum and time dependent. We compute Δ explicitly and demonstrate that it arises from the whisker regions. The whiskers are also shown to lead to 'entanglement' entropy in the big bang region. Finally, in the Milne orbifold limit of our superconformal vacua, we show that Δ→1 and, hence, the fluctuation spectrum is unaltered by the big-crunch-big-bang singularity. We comment on, but do not attempt to resolve, subtleties related to gravitational back reaction and light winding modes when interactions are taken into account

  8. Flood Risk Regional Flood Defences : Technical report

    NARCIS (Netherlands)

    Kok, M.; Jonkman, S.N.; Lendering, K.T.

    2015-01-01

    Historically, the Netherlands has always had to deal with the threat of flooding, both from the rivers and the sea and from heavy rainfall. The country consists of a large number of polders, which are low-lying areas of land protected from flooding by embankments. These polders require an

  9. Characterization of remarkable floods in France, a transdisciplinary approach applied on generalized floods of January 1910

    Science.gov (United States)

    Boudou, Martin; Lang, Michel; Vinet, Freddy; Coeur, Denis

    2014-05-01

    The January 1910 flood is one of these remarkable floods. This event is foremost known for its aftermath in the Seine basin, where the flood remains the strongest recorded in Paris since 1658. However, its impacts also extended widely across France's eastern regions (Martin, 2001). To demonstrate the value of the evaluation grid, we propose an in-depth analysis of the 1910 river flood, incorporating historical documentation. The approach focuses on eastern France, where the flood remains the highest recorded for several rivers but has often been neglected by scientists in favour of the Paris flood. Through transdisciplinary research based on the evaluation grid method, we describe the January 1910 flood event and explain why it can be considered a remarkable flood for these regions.

  10. Flood Foresight: A near-real time flood monitoring and forecasting tool for rapid and predictive flood impact assessment

    Science.gov (United States)

    Revilla-Romero, Beatriz; Shelton, Kay; Wood, Elizabeth; Berry, Robert; Bevington, John; Hankin, Barry; Lewis, Gavin; Gubbin, Andrew; Griffiths, Samuel; Barnard, Paul; Pinnell, Marc; Huyck, Charles

    2017-04-01

    The hours and days immediately after a major flood event are often chaotic and confusing, with first responders rushing to mobilise emergency responders, provide alleviation assistance and assess loss to assets of interest (e.g., population, buildings or utilities). Preparations in advance of a forthcoming event are becoming increasingly important; early warning systems have been demonstrated to be useful tools for decision makers. The extent of damage, human casualties and economic loss estimates can vary greatly during an event, and the timely availability of an accurate flood extent allows emergency response and resources to be optimised, reduces impacts, and helps prioritise recovery. In the insurance sector, for example, insurers are under pressure to respond in a proactive manner to claims rather than waiting for policyholders to report losses. Even though there is a great demand for flood inundation extents and severity information in different sectors, generating flood footprints for large areas from hydraulic models in real time remains a challenge. While such footprints can be produced in real time using remote sensing, weather conditions and sensor availability limit their ability to capture every single flood event across the globe. In this session, we will present Flood Foresight (www.floodforesight.com), an operational tool developed to meet the universal requirement for rapid geographic information, before, during and after major riverine flood events. The tool provides spatial data with which users can measure their current or predicted impact from an event - at building, basin, national or continental scales. Within Flood Foresight, the Screening component uses global rainfall predictions to provide a regional- to continental-scale view of heavy rainfall events up to a week in advance, alerting the user to potentially hazardous situations relevant to them. The Forecasting component enhances the predictive suite of tools by providing a local

  11. From Big Data to Big Business

    DEFF Research Database (Denmark)

    Lund Pedersen, Carsten

    2017-01-01

    Idea in Brief: Problem: There is an enormous profit potential for manufacturing firms in big data, but one of the key barriers to obtaining data-driven growth is the lack of knowledge about which capabilities are needed to extract value and profit from data. Solution: We (BDBB research group at C...

  12. Big data, big knowledge: big data for personalized healthcare.

    Science.gov (United States)

    Viceconti, Marco; Hunter, Peter; Hose, Rod

    2015-07-01

    The idea that the purely phenomenological knowledge that we can extract by analyzing large amounts of data can be useful in healthcare seems to contradict the desire of VPH researchers to build detailed mechanistic models for individual patients. But in practice no model is ever entirely phenomenological or entirely mechanistic. We propose in this position paper that big data analytics can be successfully combined with VPH technologies to produce robust and effective in silico medicine solutions. In order to do this, big data technologies must be further developed to cope with some specific requirements that emerge from this application. Such requirements are: working with sensitive data; analytics of complex and heterogeneous data spaces, including nontextual information; distributed data management under security and performance constraints; specialized analytics to integrate bioinformatics and systems biology information with clinical observations at tissue, organ and organisms scales; and specialized analytics to define the "physiological envelope" during the daily life of each patient. These domain-specific requirements suggest a need for targeted funding, in which big data technologies for in silico medicine becomes the research priority.

  13. Unstructured mesh adaptivity for urban flooding modelling

    Science.gov (United States)

    Hu, R.; Fang, F.; Salinas, P.; Pain, C. C.

    2018-05-01

    Over the past few decades, urban floods have been gaining more attention due to their increase in frequency. To provide reliable flooding predictions in urban areas, various numerical models have been developed to perform high-resolution flood simulations. However, the use of high-resolution meshes across the whole computational domain imposes a high computational burden. In this paper, a 2D control-volume and finite-element flood model using adaptive unstructured mesh technology has been developed. This adaptive unstructured mesh technique enables meshes to be adapted optimally in time and space in response to the evolving flow features, thus providing sufficient mesh resolution where and when it is required. It has the advantage of capturing the details of local flows and the wetting and drying front while reducing the computational cost. Complex topographic features are represented accurately during the flooding process; for example, high-resolution meshes are placed around buildings and steep regions when the flood water reaches them. In this work a flooding event that happened in 2002 in Glasgow, Scotland, United Kingdom, has been simulated to demonstrate the capability of the adaptive unstructured mesh flooding model. The simulations have been performed using both fixed and adaptive unstructured meshes, and the results have been compared with previously published 2D and 3D results. The presented method shows that the 2D adaptive mesh model provides accurate results while having a low computational cost.

  14. Impact of stream restoration on flood waves

    Science.gov (United States)

    Sholtes, J.; Doyle, M.

    2008-12-01

    Restoration of channelized or incised streams has the potential to reduce downstream flooding by storing and dissipating the energy of flood waves. Restoration design elements such as restoring meanders, reducing slope, restoring floodplain connectivity, re-introducing in-channel woody debris, and re-vegetating banks and the floodplain have the capacity to attenuate flood waves via energy dissipation and channel and floodplain storage. Flood discharge hydrographs measured upstream and downstream of several restored reaches of varying stream order, located in both urban and rural catchments, are coupled with direct measurements of stream roughness at various stages to directly measure changes to peak discharge, flood wave celerity, and dispersion. A one-dimensional unsteady flow routing model, HEC-RAS, is calibrated and used to compare attenuation characteristics between pre- and post-restoration conditions. Modeled sensitivity results indicate that a restoration project placed on a smaller-order stream demonstrates the highest relative reduction in peak discharge of routed flood waves compared to one of equal length on a higher-order stream. Reductions in bed slope, extensions in channel length, and increases in channel and floodplain roughness follow restoration placement within the watershed in relative importance. By better understanding how the design, scale, and location of restored reaches within a catchment hydraulically impact flood flows, this study contributes both to restoration design and to site decision making. It also quantifies the effect of reach-scale stream restoration on flood wave attenuation.
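
    The study itself relies on HEC-RAS for unsteady routing; purely as a hedged illustration of how storage and roughness attenuate and delay a flood peak, a textbook Muskingum routing of a synthetic hydrograph can be sketched as follows (parameter values are hypothetical, not taken from the study):

```python
# Illustrative Muskingum channel routing: a larger storage constant K and a
# smaller weighting factor X attenuate the peak more, loosely mimicking the
# extra storage and roughness of a restored reach. Values are hypothetical.
import numpy as np

def muskingum_route(inflow, K=2.0, X=0.2, dt=1.0):
    """Route an inflow hydrograph through a reach (K and dt in hours, 0 <= X <= 0.5)."""
    denom = 2.0 * K * (1.0 - X) + dt
    c0 = (dt - 2.0 * K * X) / denom
    c1 = (dt + 2.0 * K * X) / denom
    c2 = (2.0 * K * (1.0 - X) - dt) / denom
    outflow = np.zeros_like(inflow, dtype=float)
    outflow[0] = inflow[0]
    for t in range(1, len(inflow)):
        outflow[t] = c0 * inflow[t] + c1 * inflow[t - 1] + c2 * outflow[t - 1]
    return outflow

# Triangular inflow hydrograph (m^3/s) at hourly steps -- a hypothetical event.
inflow = np.concatenate([np.linspace(5, 100, 10), np.linspace(100, 5, 20)])
outflow = muskingum_route(inflow, K=2.0, X=0.2, dt=1.0)
print(f"peak inflow {inflow.max():.0f} m^3/s -> routed peak {outflow.max():.0f} m^3/s")
```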

  15. Big Data in industry

    Science.gov (United States)

    Latinović, T. S.; Preradović, D. M.; Barz, C. R.; Latinović, M. T.; Petrica, P. P.; Pop-Vadean, A.

    2016-08-01

    The amount of data at the global level has grown exponentially. Along with this phenomenon comes the need for new units of measure, such as the exabyte, zettabyte, and yottabyte, to describe the amount of data. This growth creates a situation in which classic systems for the collection, storage, processing, and visualization of data are losing the battle against the volume, velocity, and variety of data that is generated continuously, much of it created by the Internet of Things (cameras, satellites, cars, GPS navigation, etc.). Our challenge is to come up with new technologies and tools for the management and exploitation of these large amounts of data. Big Data has been a hot topic in IT circles in recent years, and it is increasingly recognized in the business world and in public administration. This paper proposes an ontology of big data analytics and examines how to enhance business intelligence through big data analytics as a service by presenting a big data analytics service-oriented architecture. It also discusses the interrelationship between business intelligence and big data analytics. The proposed approach might facilitate the research and development of business analytics, big data analytics, and business intelligence as well as intelligent agents.

  16. Big data a primer

    CERN Document Server

    Bhuyan, Prachet; Chenthati, Deepak

    2015-01-01

    This book is a collection of chapters written by experts on various aspects of big data. The book aims to explain what big data is and how it is stored and used. The book starts from  the fundamentals and builds up from there. It is intended to serve as a review of the state-of-the-practice in the field of big data handling. The traditional framework of relational databases can no longer provide appropriate solutions for handling big data and making it available and useful to users scattered around the globe. The study of big data covers a wide range of issues including management of heterogeneous data, big data frameworks, change management, finding patterns in data usage and evolution, data as a service, service-generated data, service management, privacy and security. All of these aspects are touched upon in this book. It also discusses big data applications in different domains. The book will prove useful to students, researchers, and practicing database and networking engineers.

  17. Recht voor big data, big data voor recht

    NARCIS (Netherlands)

    Lafarre, Anne

    Big data is a phenomenon that can no longer be ignored in our society. It has moved past the hype cycle, and the first implementations of big data techniques are being carried out. But what exactly is big data? What do the five V's that are often mentioned in relation to big data entail? As an introduction to

  18. Hydrological forecast of maximal water level in Lepenica river basin and flood control measures

    Directory of Open Access Journals (Sweden)

    Milanović Ana

    2006-01-01

    Full Text Available The Lepenica river basin has become the axis of economic and urban development of the Šumadija district. However, the Lepenica River and its tributaries have a disordered flow regime: there is insufficient water for water supply and irrigation, while on the other hand the area suffers major flood and torrent damage (especially the Kragujevac basin). The paper presents the flood problems in the river basin, maximum water level forecasts, and the flood control measures carried out so far. Some potential solutions aimed at achieving effective flood control are suggested as well.

  19. Assessing Big Data

    DEFF Research Database (Denmark)

    Leimbach, Timo; Bachlechner, Daniel

    2015-01-01

    In recent years, big data has been one of the most controversially discussed technologies in terms of its possible positive and negative impact. Therefore, the need for technology assessments is obvious. This paper first provides, based on the results of a technology assessment study, an overview of the potential and challenges associated with big data and then describes the problems experienced during the study as well as methods found helpful to address them. The paper concludes with reflections on how the insights from the technology assessment study may have an impact on the future governance of big data.

  20. Big bang nucleosynthesis

    International Nuclear Information System (INIS)

    Boyd, Richard N.

    2001-01-01

    The precision of measurements in modern cosmology has made huge strides in recent years, with measurements of the cosmic microwave background and the determination of the Hubble constant now rivaling the level of precision of the predictions of big bang nucleosynthesis. However, these results are not necessarily consistent with the predictions of the Standard Model of big bang nucleosynthesis. Reconciling these discrepancies may require extensions of the basic tenets of the model, and possibly of the reaction rates that determine the big bang abundances

  1. Big data for dummies

    CERN Document Server

    Hurwitz, Judith; Halper, Fern; Kaufman, Marcia

    2013-01-01

    Find the right big data solution for your business or organization Big data management is one of the major challenges facing business, industry, and not-for-profit organizations. Data sets such as customer transactions for a mega-retailer, weather patterns monitored by meteorologists, or social network activity can quickly outpace the capacity of traditional data management tools. If you need to develop or manage big data solutions, you'll appreciate how these four experts define, explain, and guide you through this new and often confusing concept. You'll learn what it is, why it m

  2. Big Data, indispensable today

    Directory of Open Access Journals (Sweden)

    Radu-Ioan ENACHE

    2015-10-01

    Full Text Available Big data is and will be used more and more in the future as a tool for everything that happens both online and offline. Of course, Big Data is found especially in the online medium, offering many advantages and being a real help for all consumers. In this paper we discuss Big Data as a plus in developing new applications, by gathering useful information about users and their behaviour. We also present the key aspects of real-time monitoring and the architecture principles of this technology. The most important contribution of this paper is presented in the cloud section.

  3. Urban pluvial flood prediction

    DEFF Research Database (Denmark)

    Thorndahl, Søren Liedtke; Nielsen, Jesper Ellerbæk; Jensen, David Getreuer

    2016-01-01

    Flooding produced by high-intensity local rainfall and drainage system capacity exceedance can have severe impacts in cities. In order to prepare cities for these types of flood events – especially in the future climate – it is valuable to be able to simulate these events numerically, both historically and in real time. There is a rather untested potential in real-time prediction of urban floods. In this paper radar data observations with different spatial and temporal resolution, radar nowcasts of 0–2 h lead time, and numerical weather models with lead times up to 24 h are used as inputs to an integrated flood and drainage systems model in order to investigate the relative difference between different inputs in predicting future floods. The system is tested on the small town of Lystrup in Denmark, which was flooded in 2012 and 2014. Results show it is possible to generate detailed flood maps...

  4. The Semantic Network of Flood Hydrological Data for Kelantan, Malaysia

    Science.gov (United States)

    Yusoff, Aziyati; Din, Norashidah Md; Yussof, Salman; Ullah Khan, Samee

    2016-03-01

    Every year, authorities in Malaysia put effort into disaster management mechanisms, including for the floods that may hit the east coast of Peninsular Malaysia. This includes the state of Kelantan, where flooding has been reported as a normal, annually occurring event; the aftermath, however, has always been unmanageable and has left the state struggling with its own recovery. Although flooding is expected every year, among the worst events were those of 1967, 1974, 1982 and, most recently, December 2014. This study proposes a semantic network as an approach to utilising big data analytics in analysing the huge volumes of data from the state’s flood reading stations. It is expected that current computing capabilities can also help mitigate this particular disaster.

  5. FLOOD MENACE IN KADUNA METROPOLIS: IMPACTS ...

    African Journals Online (AJOL)

    Dr A.B.Ahmed

    damage, causes of flooding, human response to flooding and severity of ... from moving out. Source of ... Man responds to flood hazards through adjustment, flood abatement ... action to minimize or ameliorate flood hazards; flood abatement.

  6. Big Data in der Cloud

    DEFF Research Database (Denmark)

    Leimbach, Timo; Bachlechner, Daniel

    2014-01-01

    Technology assessment of big data, in particular cloud based big data services, for the Office for Technology Assessment at the German federal parliament (Bundestag).

  7. Small Big Data Congress 2017

    NARCIS (Netherlands)

    Doorn, J.

    2017-01-01

    TNO, in collaboration with the Big Data Value Center, presents the fourth Small Big Data Congress! Our congress aims at providing an overview of practical and innovative applications based on big data. Do you want to know what is happening in applied research with big data? And what can already be

  8. Cryptography for Big Data Security

    Science.gov (United States)

    2015-07-13

    Book chapter for Big Data: Storage, Sharing, and Security (3S). Distribution A: Public Release. Authors: Ariel Hamlin, Nabil ... (contact: arkady@ll.mit.edu). Chapter 1, Cryptography for Big Data Security: Introduction (text truncated).

  9. Big data opportunities and challenges

    CERN Document Server

    2014-01-01

    This ebook aims to give practical guidance for all those who want to understand big data better and learn how to make the most of it. Topics range from big data analysis, mobile big data and managing unstructured data to technologies, governance and intellectual property and security issues surrounding big data.

  10. 2 Dimensional Hydrodynamic Flood Routing Analysis on Flood Forecasting Modelling for Kelantan River Basin

    Directory of Open Access Journals (Sweden)

    Azad Wan Hazdy

    2017-01-01

    Full Text Available Flood disasters occur quite frequently in Malaysia and have been categorized as the most threatening natural disaster compared to landslides, hurricanes, tsunamis, haze and others. A study by the Department of Irrigation and Drainage (DID) shows that 9% of land areas in Malaysia are prone to flooding, which may affect approximately 4.9 million of the population. Two-dimensional flood routing modelling is becoming broadly utilized for floodplain mapping and is a very effective tool for evaluating floods. Flood propagation can be better understood by simulating the flow and water level using hydrodynamic modelling. Hydrodynamic flood routing can be characterized by the spatial complexity of the schematization, such as 1D and 2D models. It was found that most available hydrological models for flood forecasting focus on short durations, as compared to long-duration hydrological modelling using the Probabilistic Distribution Moisture Model (PDM). The aim of this paper is to discuss preliminary findings on the development of a flood forecasting model using the PDM for the Kelantan river basin. The findings discussed in this paper include a preliminarily calibrated PDM model, which performed reasonably for the December 2014 event but underestimated the peak flows. Apart from that, this paper also discusses findings on Soil Moisture Deficit (SMD) and floodplain analysis. Flood forecasting is a complex process that begins with an understanding of the geographical makeup of the catchment and knowledge of the preferential regions of heavy rainfall and flood behaviour for the area of responsibility. Therefore, to decrease the uncertainty in the model output, it is important to increase the complexity of the model.
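
    The abstract gives no model equations, and PDM formulations vary; as a hedged sketch, one common ingredient of probability-distributed moisture models is a distribution of soil storage capacity across the catchment, so that the fast-runoff-producing saturated fraction grows as the basin wets up. The parameter values below are hypothetical:

```python
# Hedged sketch of a probability-distributed storage-capacity curve (not the
# authors' implementation). c_max and b are hypothetical parameters.
def saturated_fraction(critical_capacity_mm, c_max=140.0, b=0.5):
    """Fraction of the catchment whose storage capacity is already exceeded (0..1)."""
    c = min(max(critical_capacity_mm, 0.0), c_max)
    return 1.0 - (1.0 - c / c_max) ** b

for c_star in (20, 60, 100, 140):   # storage already filled, in mm
    print(f"critical capacity {c_star:>3} mm -> saturated fraction {saturated_fraction(c_star):.2f}")
```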

  11. Big Data Revisited

    DEFF Research Database (Denmark)

    Kallinikos, Jannis; Constantiou, Ioanna

    2015-01-01

    We elaborate on key issues of our paper New games, new rules: big data and the changing context of strategy as a means of addressing some of the concerns raised by the paper’s commentators. We initially deal with the issue of social data and the role it plays in the current data revolution and the technological recording of facts. We further discuss the significance of the very mechanisms by which big data is produced as distinct from the very attributes of big data, often discussed in the literature. In the final section of the paper, we qualify the alleged importance of algorithms and claim that the structures of data capture and the architectures in which data generation is embedded are fundamental to the phenomenon of big data.

  12. The Big Bang Singularity

    Science.gov (United States)

    Ling, Eric

    The big bang theory is a model of the universe which makes the striking prediction that the universe began a finite amount of time in the past at the so called "Big Bang singularity." We explore the physical and mathematical justification of this surprising result. After laying down the framework of the universe as a spacetime manifold, we combine physical observations with global symmetrical assumptions to deduce the FRW cosmological models which predict a big bang singularity. Next we prove a couple theorems due to Stephen Hawking which show that the big bang singularity exists even if one removes the global symmetrical assumptions. Lastly, we investigate the conditions one needs to impose on a spacetime if one wishes to avoid a singularity. The ideas and concepts used here to study spacetimes are similar to those used to study Riemannian manifolds, therefore we compare and contrast the two geometries throughout.

  13. BigDansing

    KAUST Repository

    Khayyat, Zuhair; Ilyas, Ihab F.; Jindal, Alekh; Madden, Samuel; Ouzzani, Mourad; Papotti, Paolo; Quiané -Ruiz, Jorge-Arnulfo; Tang, Nan; Yin, Si

    2015-01-01

    of the underlying distributed platform. BigDansing takes these rules into a series of transformations that enable distributed computations and several optimizations, such as shared scans and specialized joins operators. Experimental results on both synthetic

  14. Boarding to Big data

    Directory of Open Access Journals (Sweden)

    Oana Claudia BRATOSIN

    2016-05-01

    Full Text Available Today Big data is an emerging topic, as the quantity of information grows exponentially, laying the foundation for its main challenge: the value of the information. The value of information is defined not only by extracting value from huge data sets as quickly and optimally as possible, but also by extracting value from uncertain and inaccurate data in an innovative manner using Big data analytics. At this point, the main challenge for businesses that use Big data tools is to clearly define the scope and the necessary output of the business so that real value can be gained. This article aims to explain the Big data concept, its various classification criteria and architecture, as well as its impact on worldwide processes.

  15. Big Creek Pit Tags

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The BCPITTAGS database is used to store data from an Oncorhynchus mykiss (steelhead/rainbow trout) population dynamics study in Big Creek, a coastal stream along the...

  16. Scaling Big Data Cleansing

    KAUST Repository

    Khayyat, Zuhair

    2017-01-01

    on top of general-purpose distributed platforms. Its programming interface allows users to express data quality rules independently from the requirements of parallel and distributed environments. Without sacrificing their quality, BigDansing also

  17. Reframing Open Big Data

    DEFF Research Database (Denmark)

    Marton, Attila; Avital, Michel; Jensen, Tina Blegind

    2013-01-01

    Recent developments in the techniques and technologies of collecting, sharing and analysing data are challenging the field of information systems (IS) research let alone the boundaries of organizations and the established practices of decision-making. Coined ‘open data’ and ‘big data’, these developments introduce an unprecedented level of societal and organizational engagement with the potential of computational data to generate new insights and information. Based on the commonalities shared by open data and big data, we develop a research framework that we refer to as open big data (OBD) by employing the dimensions of ‘order’ and ‘relationality’. We argue that these dimensions offer a viable approach for IS research on open and big data because they address one of the core value propositions of IS; i.e. how to support organizing with computational data. We contrast these dimensions with two...

  18. Conociendo Big Data

    Directory of Open Access Journals (Sweden)

    Juan José Camargo-Vega

    2014-12-01

    Full Text Available Given the importance the term Big Data has acquired, this research sought to study and analyze exhaustively the state of the art of Big Data. As a second objective, it analyzed the characteristics, tools, technologies, models and standards related to Big Data, and finally it sought to identify the most relevant characteristics in the management of Big Data, so that everything concerning the central topic of the research can be known. The methodology used included reviewing the state of the art of Big Data and presenting its current situation; describing Big Data technologies; presenting some of the NoSQL databases, which are the ones that allow processing data in unstructured formats; and showing the data models and the technologies for analysing them, finishing with some benefits of Big Data. The methodological design used for the research was non-experimental, since no variables were manipulated, and exploratory, because this research begins to explore the Big Data environment.

  19. Big Bang baryosynthesis

    International Nuclear Information System (INIS)

    Turner, M.S.; Chicago Univ., IL

    1983-01-01

    In these lectures I briefly review Big Bang baryosynthesis. In the first lecture I discuss the evidence which exists for the BAU, the failure of non-GUT symmetrical cosmologies, the qualitative picture of baryosynthesis, and numerical results of detailed baryosynthesis calculations. In the second lecture I discuss the requisite CP violation in some detail, further the statistical mechanics of baryosynthesis, possible complications to the simplest scenario, and one cosmological implication of Big Bang baryosynthesis. (orig./HSI)

  20. Minsky on "Big Government"

    Directory of Open Access Journals (Sweden)

    Daniel de Santana Vasconcelos

    2014-03-01

    Full Text Available This paper's objective is to assess, in light of Minsky's main works, his view and analysis of what he called "Big Government": the huge institution which, in parallel with the "Big Bank", was capable of ensuring stability in the capitalist system and regulating its inherently unstable financial system in the mid-20th century. In this work, we analyze how Minsky proposes an active role for the government in a complex economic system flawed by financial instability.

  1. Probabilistic, meso-scale flood loss modelling

    Science.gov (United States)

    Kreibich, Heidi; Botto, Anna; Schröter, Kai; Merz, Bruno

    2016-04-01

    Flood risk analyses are an important basis for decisions on flood risk management and adaptation. However, such analyses are associated with significant uncertainty, even more so if changes in risk due to global change are expected. Although uncertainty analysis and probabilistic approaches have received increased attention during the last years, they are still not standard practice for flood risk assessments, and even less so for flood loss modelling. The state of the art in flood loss modelling is still the use of simple, deterministic approaches such as stage-damage functions. Novel probabilistic, multi-variate flood loss models have been developed and validated on the micro-scale using a data-mining approach, namely bagging decision trees (Merz et al. 2013). In this presentation we demonstrate and evaluate the upscaling of the approach to the meso-scale, namely on the basis of land-use units. The model is applied in 19 municipalities which were affected during the 2002 flood by the River Mulde in Saxony, Germany (Botto et al. submitted). The application of bagging decision tree based loss models provides a probability distribution of estimated loss per municipality. Validation is undertaken on the one hand via a comparison with eight deterministic loss models, including stage-damage functions as well as multi-variate models. On the other hand, the results are compared with official loss data provided by the Saxon Relief Bank (SAB). The results show that uncertainties of loss estimation remain high. Thus, the significant advantage of this probabilistic flood loss estimation approach is that it inherently provides quantitative information about the uncertainty of the prediction. References: Merz, B.; Kreibich, H.; Lall, U. (2013): Multi-variate flood damage assessment: a tree-based data-mining approach. NHESS, 13(1), 53-64. Botto A, Kreibich H, Merz B, Schröter K (submitted) Probabilistic, multi-variable flood loss modelling on the meso-scale with BT-FLEMO. Risk Analysis.
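
    BT-FLEMO itself is not reproduced here; as a hedged sketch of the general bagging-decision-tree idea — each tree in the ensemble returns one loss estimate, so the ensemble as a whole yields a loss distribution rather than a point value — one could use scikit-learn with hypothetical predictors and synthetic data:

```python
# Hedged sketch, not the authors' model: a bagging ensemble of regression trees
# whose per-tree predictions form a distribution of relative flood loss.
import numpy as np
from sklearn.ensemble import BaggingRegressor
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
n = 500
# Hypothetical predictors: water depth (m), flood duration (h), asset value index.
X = np.column_stack([
    rng.uniform(0.0, 3.0, n),
    rng.uniform(1.0, 72.0, n),
    rng.uniform(0.5, 2.0, n),
])
# Synthetic relative loss in [0, 1], for demonstration only.
y = np.clip(0.25 * X[:, 0] + 0.002 * X[:, 1] + 0.1 * rng.normal(size=n), 0.0, 1.0)

model = BaggingRegressor(DecisionTreeRegressor(), n_estimators=200,
                         random_state=0).fit(X, y)

# For a new land-use unit, collect one estimate per tree -> a loss distribution.
unit = np.array([[1.5, 24.0, 1.2]])
per_tree = np.array([tree.predict(unit)[0] for tree in model.estimators_])
print(f"median loss {np.median(per_tree):.2f}, "
      f"90% interval [{np.percentile(per_tree, 5):.2f}, {np.percentile(per_tree, 95):.2f}]")
```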

  2. Big data need big theory too.

    Science.gov (United States)

    Coveney, Peter V; Dougherty, Edward R; Highfield, Roger R

    2016-11-13

    The current interest in big data, machine learning and data analytics has generated the widespread impression that such methods are capable of solving most problems without the need for conventional scientific methods of inquiry. Interest in these methods is intensifying, accelerated by the ease with which digitized data can be acquired in virtually all fields of endeavour, from science, healthcare and cybersecurity to economics, social sciences and the humanities. In multiscale modelling, machine learning appears to provide a shortcut to reveal correlations of arbitrary complexity between processes at the atomic, molecular, meso- and macroscales. Here, we point out the weaknesses of pure big data approaches with particular focus on biology and medicine, which fail to provide conceptual accounts for the processes to which they are applied. No matter their 'depth' and the sophistication of data-driven methods, such as artificial neural nets, in the end they merely fit curves to existing data. Not only do these methods invariably require far larger quantities of data than anticipated by big data aficionados in order to produce statistically reliable results, but they can also fail in circumstances beyond the range of the data used to train them because they are not designed to model the structural characteristics of the underlying system. We argue that it is vital to use theory as a guide to experimental design for maximal efficiency of data collection and to produce reliable predictive models and conceptual knowledge. Rather than continuing to fund, pursue and promote 'blind' big data projects with massive budgets, we call for more funding to be allocated to the elucidation of the multiscale and stochastic processes controlling the behaviour of complex systems, including those of life, medicine and healthcare.This article is part of the themed issue 'Multiscale modelling at the physics-chemistry-biology interface'. © 2015 The Authors.

  3. Big Data and medicine: a big deal?

    Science.gov (United States)

    Mayer-Schönberger, V; Ingelsson, E

    2018-05-01

    Big Data promises huge benefits for medical research. Looking beyond superficial increases in the amount of data collected, we identify three key areas where Big Data differs from conventional analyses of data samples: (i) data are captured more comprehensively relative to the phenomenon under study; this reduces some bias but surfaces important trade-offs, such as between data quantity and data quality; (ii) data are often analysed using machine learning tools, such as neural networks rather than conventional statistical methods resulting in systems that over time capture insights implicit in data, but remain black boxes, rarely revealing causal connections; and (iii) the purpose of the analyses of data is no longer simply answering existing questions, but hinting at novel ones and generating promising new hypotheses. As a consequence, when performed right, Big Data analyses can accelerate research. Because Big Data approaches differ so fundamentally from small data ones, research structures, processes and mindsets need to adjust. The latent value of data is being reaped through repeated reuse of data, which runs counter to existing practices not only regarding data privacy, but data management more generally. Consequently, we suggest a number of adjustments such as boards reviewing responsible data use, and incentives to facilitate comprehensive data sharing. As data's role changes to a resource of insight, we also need to acknowledge the importance of collecting and making data available as a crucial part of our research endeavours, and reassess our formal processes from career advancement to treatment approval. © 2017 The Association for the Publication of the Journal of Internal Medicine.

  4. Predicting Coastal Flood Severity using Random Forest Algorithm

    Science.gov (United States)

    Sadler, J. M.; Goodall, J. L.; Morsy, M. M.; Spencer, K.

    2017-12-01

    Coastal floods have become more common recently and are predicted to further increase in frequency and severity due to sea level rise. Predicting floods in coastal cities can be difficult due to the number of environmental and geographic factors which can influence flooding events. Built stormwater infrastructure and irregular urban landscapes add further complexity. This paper demonstrates the use of machine learning algorithms in predicting street flood occurrence in an urban coastal setting. The model is trained and evaluated using data from Norfolk, Virginia USA from September 2010 - October 2016. Rainfall, tide levels, water table levels, and wind conditions are used as input variables. Street flooding reports made by city workers after named and unnamed storm events, ranging from 1-159 reports per event, are the model output. Results show that Random Forest provides predictive power in estimating the number of flood occurrences given a set of environmental conditions with an out-of-bag root mean squared error of 4.3 flood reports and a mean absolute error of 0.82 flood reports. The Random Forest algorithm performed much better than Poisson regression. From the Random Forest model, total daily rainfall was by far the most important factor in flood occurrence prediction, followed by daily low tide and daily higher high tide. The model demonstrated here could be used to predict flood severity based on forecast rainfall and tide conditions and could be further enhanced using more complete street flooding data for model training.
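
    As a hedged sketch (not the authors' code) of how such a model could be set up with scikit-learn — synthetic stand-ins for the rainfall and tide predictors, flood-report counts as the target, and the out-of-bag error used for evaluation — consider:

```python
# Hedged sketch with synthetic data: random forest regression of daily street-
# flood report counts on rainfall and tide predictors, evaluated out-of-bag.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(1)
n = 300
daily_rain_mm = rng.gamma(shape=1.5, scale=10.0, size=n)
higher_high_tide_m = rng.normal(1.2, 0.3, size=n)
low_tide_m = rng.normal(0.2, 0.2, size=n)
X = np.column_stack([daily_rain_mm, higher_high_tide_m, low_tide_m])
# Synthetic report counts loosely driven by rain and tide, for illustration only.
y = np.round(np.clip(0.15 * daily_rain_mm + 5.0 * (higher_high_tide_m - 1.0)
                     + rng.normal(0.0, 2.0, n), 0.0, None))

rf = RandomForestRegressor(n_estimators=500, oob_score=True, random_state=0).fit(X, y)
oob_rmse = np.sqrt(np.mean((rf.oob_prediction_ - y) ** 2))
print("out-of-bag RMSE (reports):", round(oob_rmse, 2))
print("feature importances [rain, higher high tide, low tide]:",
      np.round(rf.feature_importances_, 2))
```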

  5. Big data uncertainties.

    Science.gov (United States)

    Maugis, Pierre-André G

    2018-07-01

    Big data-the idea that an always-larger volume of information is being constantly recorded-suggests that new problems can now be subjected to scientific scrutiny. However, can classical statistical methods be used directly on big data? We analyze the problem by looking at two known pitfalls of big datasets. First, that they are biased, in the sense that they do not offer a complete view of the populations under consideration. Second, that they present a weak but pervasive level of dependence between all their components. In both cases we observe that the uncertainty of the conclusion obtained by statistical methods is increased when used on big data, either because of a systematic error (bias), or because of a larger degree of randomness (increased variance). We argue that the key challenge raised by big data is not only how to use big data to tackle new problems, but to develop tools and methods able to rigorously articulate the new risks therein. Copyright © 2016. Published by Elsevier Ltd.
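
    As a hedged numeric illustration of the second pitfall (not taken from the paper), assume every pair of observations shares a weak common correlation rho; the variance of the sample mean is then sigma^2/n * (1 + (n - 1) * rho), so even rho = 0.001 collapses the effective sample size of a million records to roughly a thousand:

```python
# Toy calculation: weak but pervasive dependence inflates the variance of the
# sample mean far above the naive sigma^2 / n. Values are hypothetical.
sigma2, n, rho = 1.0, 1_000_000, 0.001
naive_var = sigma2 / n
dependent_var = sigma2 / n * (1 + (n - 1) * rho)
effective_n = sigma2 / dependent_var
print(f"naive variance {naive_var:.2e}, with rho={rho}: {dependent_var:.2e}, "
      f"effective sample size ~{effective_n:.0f}")
```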

  6. Principles of big data preparing, sharing, and analyzing complex information

    CERN Document Server

    Berman, Jules J

    2013-01-01

    Principles of Big Data helps readers avoid the common mistakes that endanger all Big Data projects. By stressing simple, fundamental concepts, this book teaches readers how to organize large volumes of complex data, and how to achieve data permanence when the content of the data is constantly changing. General methods for data verification and validation, as specifically applied to Big Data resources, are stressed throughout the book. The book demonstrates how adept analysts can find relationships among data objects held in disparate Big Data resources, when the data objects are endo

  7. A global flash flood forecasting system

    Science.gov (United States)

    Baugh, Calum; Pappenberger, Florian; Wetterhall, Fredrik; Hewson, Tim; Zsoter, Ervin

    2016-04-01

    The sudden and devastating nature of flash flood events means it is imperative to provide early warnings such as those derived from Numerical Weather Prediction (NWP) forecasts. Currently such systems exist on basin, national and continental scales in Europe, North America and Australia but rely on high resolution NWP forecasts or rainfall-radar nowcasting, neither of which have global coverage. To produce global flash flood forecasts this work investigates the possibility of using forecasts from a global NWP system. In particular we: (i) discuss how global NWP can be used for flash flood forecasting and discuss strengths and weaknesses; (ii) demonstrate how a robust evaluation can be performed given the rarity of the event; (iii) highlight the challenges and opportunities in communicating flash flood uncertainty to decision makers; and (iv) explore future developments which would significantly improve global flash flood forecasting. The proposed forecast system uses ensemble surface runoff forecasts from the ECMWF H-TESSEL land surface scheme. A flash flood index is generated using the ERIC (Enhanced Runoff Index based on Climatology) methodology [Raynaud et al., 2014]. This global methodology is applied to a series of flash floods across southern Europe. Results from the system are compared against warnings produced using the higher resolution COSMO-LEPS limited area model. The global system is evaluated by comparing forecasted warning locations against a flash flood database of media reports created in partnership with floodlist.com. To deal with the lack of objectivity in media reports we carefully assess the suitability of different skill scores and apply spatial uncertainty thresholds to the observations. To communicate the uncertainties of the flash flood system output we experiment with a dynamic region-growing algorithm. This automatically clusters regions of similar return period exceedence probabilities, thus presenting the at-risk areas at a spatial
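
    The ERIC computation itself is not reproduced in this abstract; as a hedged toy sketch of the general idea — scoring an ensemble surface-runoff forecast against a high percentile of the model's own runoff climatology — one might write:

```python
# Toy illustration, not the ERIC implementation: for one grid cell, express the
# ensemble runoff forecast as the probability of exceeding the 99th percentile
# of a (synthetic) model runoff climatology. All numbers are hypothetical.
import numpy as np

rng = np.random.default_rng(2)
climatology = rng.gamma(shape=2.0, scale=1.5, size=10_000)   # mm per 6 h, model climate
threshold_99 = np.percentile(climatology, 99)                # "extreme" runoff level

ensemble_forecast = rng.gamma(shape=2.0, scale=3.0, size=51) # 51-member forecast
exceedance_prob = np.mean(ensemble_forecast > threshold_99)

print(f"99th-percentile runoff threshold: {threshold_99:.1f} mm")
print(f"fraction of members exceeding it: {exceedance_prob:.0%}")
# A flash flood warning could be considered where this probability stays high.
```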

  8. Big data-driven business how to use big data to win customers, beat competitors, and boost profits

    CERN Document Server

    Glass, Russell

    2014-01-01

    Get the expert perspective and practical advice on big data The Big Data-Driven Business: How to Use Big Data to Win Customers, Beat Competitors, and Boost Profits makes the case that big data is for real, and more than just big hype. The book uses real-life examples-from Nate Silver to Copernicus, and Apple to Blackberry-to demonstrate how the winners of the future will use big data to seek the truth. Written by a marketing journalist and the CEO of a multi-million-dollar B2B marketing platform that reaches more than 90% of the U.S. business population, this book is a comprehens

  9. Fragmented patterns of flood change across the United States

    Science.gov (United States)

    Archfield, Stacey A.; Hirsch, Robert M.; Viglione, A.; Blöschl, G.

    2016-01-01

    Trends in the peak magnitude, frequency, duration, and volume of frequent floods (floods occurring at an average of two events per year relative to a base period) across the United States show large changes; however, few trends are found to be statistically significant. The multidimensional behavior of flood change across the United States can be described by four distinct groups, with streamgages experiencing (1) minimal change, (2) increasing frequency, (3) decreasing frequency, or (4) increases in all flood properties. Yet group membership shows only weak geographic cohesion. Lack of geographic cohesion is further demonstrated by weak correlations between the temporal patterns of flood change and large-scale climate indices. These findings reveal a complex, fragmented pattern of flood change that, therefore, clouds the ability to make meaningful generalizations about flood change across the United States.
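
    The paper's own statistical workflow is not reproduced here; as a hedged sketch, a common way to ask whether a single gauge's annual peak-flow series shows a monotonic trend is the Mann-Kendall test, illustrated below on synthetic data rather than USGS records:

```python
# Hedged sketch: Mann-Kendall trend test (no tie correction) applied to a
# synthetic annual flood-peak series; random data should show no trend.
import numpy as np
from math import erf, sqrt

def mann_kendall_z(series):
    """Return the Mann-Kendall Z statistic and a two-sided p-value."""
    x = np.asarray(series, dtype=float)
    n = len(x)
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / sqrt(var_s)
    elif s < 0:
        z = (s + 1) / sqrt(var_s)
    else:
        z = 0.0
    p = 2.0 * (1.0 - 0.5 * (1.0 + erf(abs(z) / sqrt(2.0))))
    return z, p

rng = np.random.default_rng(3)
annual_peaks = rng.lognormal(mean=5.0, sigma=0.4, size=40)  # hypothetical 40-year record
z, p = mann_kendall_z(annual_peaks)
print(f"Z = {z:.2f}, p = {p:.2f}")
```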

  10. Comparative validity of brief to medium-length Big Five and Big Six Personality Questionnaires.

    Science.gov (United States)

    Thalmayer, Amber Gayle; Saucier, Gerard; Eigenhuis, Annemarie

    2011-12-01

    A general consensus on the Big Five model of personality attributes has been highly generative for the field of personality psychology. Many important psychological and life outcome correlates with Big Five trait dimensions have been established. But researchers must choose between multiple Big Five inventories when conducting a study and are faced with a variety of options as to inventory length. Furthermore, a 6-factor model has been proposed to extend and update the Big Five model, in part by adding a dimension of Honesty/Humility or Honesty/Propriety. In this study, 3 popular brief to medium-length Big Five measures (NEO Five Factor Inventory, Big Five Inventory [BFI], and International Personality Item Pool), and 3 six-factor measures (HEXACO Personality Inventory, Questionnaire Big Six Scales, and a 6-factor version of the BFI) were placed in competition to best predict important student life outcomes. The effect of test length was investigated by comparing brief versions of most measures (subsets of items) with original versions. Personality questionnaires were administered to undergraduate students (N = 227). Participants' college transcripts and student conduct records were obtained 6-9 months after data was collected. Six-factor inventories demonstrated better predictive ability for life outcomes than did some Big Five inventories. Additional behavioral observations made on participants, including their Facebook profiles and cell-phone text usage, were predicted similarly by Big Five and 6-factor measures. A brief version of the BFI performed surprisingly well; across inventory platforms, increasing test length had little effect on predictive validity. Comparative validity of the models and measures in terms of outcome prediction and parsimony is discussed.

  11. BigDansing

    KAUST Repository

    Khayyat, Zuhair

    2015-06-02

    Data cleansing approaches have usually focused on detecting and fixing errors with little attention to scaling to big datasets. This presents a serious impediment since data cleansing often involves costly computations such as enumerating pairs of tuples, handling inequality joins, and dealing with user-defined functions. In this paper, we present BigDansing, a Big Data Cleansing system to tackle efficiency, scalability, and ease-of-use issues in data cleansing. The system can run on top of most common general purpose data processing platforms, ranging from DBMSs to MapReduce-like frameworks. A user-friendly programming interface allows users to express data quality rules both declaratively and procedurally, with no requirement of being aware of the underlying distributed platform. BigDansing translates these rules into a series of transformations that enable distributed computations and several optimizations, such as shared scans and specialized join operators. Experimental results on both synthetic and real datasets show that BigDansing outperforms existing baseline systems by up to more than two orders of magnitude without sacrificing the quality provided by the repair algorithms.
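
    BigDansing's actual interface is not shown in this record; purely as a hedged, generic illustration of the kind of declarative data-quality rule such systems accept — here a functional dependency "zip determines city" checked with pandas over a toy table with hypothetical column names — one might write:

```python
# Toy illustration only (not BigDansing's API): flag rows violating the
# functional dependency zip -> city in a small, hypothetical table.
import pandas as pd

df = pd.DataFrame({
    "zip":  ["60601", "60601", "10001", "10001"],
    "city": ["Chicago", "Chicago", "New York", "Newark"],  # last pair violates zip -> city
})

cities_per_zip = df.groupby("zip")["city"].nunique()
violating_zips = cities_per_zip[cities_per_zip > 1].index
print(df[df["zip"].isin(violating_zips)])
```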

  12. The Big Mac Standard: A statistical Illustration

    OpenAIRE

    Yukinobu Kitamura; Hiroshi Fujiki

    2004-01-01

    We demonstrate a statistical procedure for selecting the most suitable empirical model to test an economic theory, using the example of the test for purchasing power parity based on the Big Mac Index. Our results show that supporting evidence for purchasing power parity, conditional on the Balassa-Samuelson effect, depends crucially on the selection of models, sample periods and economies used for estimations.
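
    As a hedged worked example of the index's basic arithmetic (illustrative prices, not the paper's data): the implied PPP exchange rate is simply the ratio of local to home Big Mac prices, and its gap to the market rate is the over- or under-valuation whose persistence such tests examine:

```python
# Toy Big Mac calculation with hypothetical prices and exchange rate.
price_home = 5.00        # Big Mac price in the home currency (e.g. USD)
price_foreign = 390.0    # Big Mac price in foreign currency units
market_rate = 110.0      # foreign currency units per unit of home currency

implied_ppp_rate = price_foreign / price_home        # 78.0
valuation = implied_ppp_rate / market_rate - 1.0     # about -29% (undervalued)
print(f"implied PPP rate: {implied_ppp_rate:.1f}, valuation vs market: {valuation:+.0%}")
```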

  13. Emerging technology and architecture for big-data analytics

    CERN Document Server

    Chang, Chip; Yu, Hao

    2017-01-01

    This book describes the current state of the art in big-data analytics, from a technology and hardware architecture perspective. The presentation is designed to be accessible to a broad audience, with general knowledge of hardware design and some interest in big-data analytics. Coverage includes emerging technology and devices for data-analytics, circuit design for data-analytics, and architecture and algorithms to support data-analytics. Readers will benefit from the realistic context used by the authors, which demonstrates what works, what doesn’t work, and what are the fundamental problems, solutions, upcoming challenges and opportunities. Provides a single-source reference to hardware architectures for big-data analytics; Covers various levels of big-data analytics hardware design abstraction and flow, from device, to circuits and systems; Demonstrates how non-volatile memory (NVM) based hardware platforms can be a viable solution to existing challenges in hardware architecture for big-data analytics.

  14. Cognitive computing and big data analytics

    CERN Document Server

    Hurwitz, Judith; Bowles, Adrian

    2015-01-01

    MASTER THE ABILITY TO APPLY BIG DATA ANALYTICS TO MASSIVE AMOUNTS OF STRUCTURED AND UNSTRUCTURED DATA Cognitive computing is a technique that allows humans and computers to collaborate in order to gain insights and knowledge from data by uncovering patterns and anomalies. This comprehensive guide explains the underlying technologies, such as artificial intelligence, machine learning, natural language processing, and big data analytics. It then demonstrates how you can use these technologies to transform your organization. You will explore how different vendors and different industries are a

  15. Big data challenges

    DEFF Research Database (Denmark)

    Bachlechner, Daniel; Leimbach, Timo

    2016-01-01

    Although reports on big data success stories have been accumulating in the media, most organizations dealing with high-volume, high-velocity and high-variety information assets still face challenges. Only a thorough understanding of these challenges puts organizations into a position in which they can make an informed decision for or against big data, and, if the decision is positive, overcome the challenges smoothly. The combination of a series of interviews with leading experts from enterprises, associations and research institutions, and focused literature reviews allowed not only ... framework are also relevant. For large enterprises and startups specialized in big data, it is typically easier to overcome the challenges than it is for other enterprises and public administration bodies.

  16. Thick-Big Descriptions

    DEFF Research Database (Denmark)

    Lai, Signe Sophus

    The paper discusses the rewards and challenges of employing commercial audience measurements data – gathered by media industries for profitmaking purposes – in ethnographic research on the Internet in everyday life. It questions claims to the objectivity of big data (Anderson 2008), the assumption...... communication systems, language and behavior appear as texts, outputs, and discourses (data to be ‘found’) – big data then documents things that in earlier research required interviews and observations (data to be ‘made’) (Jensen 2014). However, web-measurement enterprises build audiences according...... to a commercial logic (boyd & Crawford 2011) and is as such directed by motives that call for specific types of sellable user data and specific segmentation strategies. In combining big data and ‘thick descriptions’ (Geertz 1973) scholars need to question how ethnographic fieldwork might map the ‘data not seen...

  17. Big data in biomedicine.

    Science.gov (United States)

    Costa, Fabricio F

    2014-04-01

    The increasing availability and growth rate of biomedical information, also known as 'big data', provides an opportunity for future personalized medicine programs that will significantly improve patient care. Recent advances in information technology (IT) applied to biomedicine are changing the landscape of privacy and personal information, with patients getting more control of their health information. Conceivably, big data analytics is already impacting health decisions and patient care; however, specific challenges need to be addressed to integrate current discoveries into medical practice. In this article, I will discuss the major breakthroughs achieved in combining omics and clinical health data in terms of their application to personalized medicine. I will also review the challenges associated with using big data in biomedicine and translational science. Copyright © 2013 Elsevier Ltd. All rights reserved.

  18. Big data bioinformatics.

    Science.gov (United States)

    Greene, Casey S; Tan, Jie; Ung, Matthew; Moore, Jason H; Cheng, Chao

    2014-12-01

    Recent technological advances allow for high throughput profiling of biological systems in a cost-efficient manner. The low cost of data generation is leading us to the "big data" era. The availability of big data provides unprecedented opportunities but also raises new challenges for data mining and analysis. In this review, we introduce key concepts in the analysis of big data, including both "machine learning" algorithms as well as "unsupervised" and "supervised" examples of each. We note packages for the R programming language that are available to perform machine learning analyses. In addition to programming based solutions, we review webservers that allow users with limited or no programming background to perform these analyses on large data compendia. © 2014 Wiley Periodicals, Inc.

  19. Unsupervised Tensor Mining for Big Data Practitioners.

    Science.gov (United States)

    Papalexakis, Evangelos E; Faloutsos, Christos

    2016-09-01

    Multiaspect data are ubiquitous in modern Big Data applications. For instance, different aspects of a social network are the different types of communication between people, the time stamp of each interaction, and the location associated to each individual. How can we jointly model all those aspects and leverage the additional information that they introduce to our analysis? Tensors, which are multidimensional extensions of matrices, are a principled and mathematically sound way of modeling such multiaspect data. In this article, our goal is to popularize tensors and tensor decompositions to Big Data practitioners by demonstrating their effectiveness, outlining challenges that pertain to their application in Big Data scenarios, and presenting our recent work that tackles those challenges. We view this work as a step toward a fully automated, unsupervised tensor mining tool that can be easily and broadly adopted by practitioners in academia and industry.

  20. Discover Floods Educators Guide

    Science.gov (United States)

    Project WET Foundation, 2009

    2009-01-01

    Now available as a Download! This valuable resource helps educators teach students about both the risks and benefits of flooding through a series of engaging, hands-on activities. Acknowledging the different roles that floods play in both natural and urban communities, the book helps young people gain a global understanding of this common--and…

  1. Flood action plans

    International Nuclear Information System (INIS)

    Slopek, R.J.

    1995-01-01

    Safe operating procedures developed by TransAlta Utilities for dealing with flooding, resulting from upstream dam failures or extreme rainfalls, were presented. Several operating curves developed by Monenco AGRA were described, among them the No Overtopping Curve (NOC), the Safe Filling Curve (SFC), the No Spill Curve (NSC) and the Guaranteed Fill Curve (GFC). The concept of an operational comfort zone was developed and defined. A flood action plan for all operating staff was created as a guide in case of a flooding incident. Staging of a flood action plan workshop was described. Dam break scenarios pertinent to the Bow River were developed for subsequent incorporation into a Flood Action Plan Manual. Evaluation of the technical presentations made during the workshops found them to have been effective in providing operating staff with a better understanding of the procedures that they would perform in an emergency. 8 figs

  2. Distillation Column Flooding Predictor

    Energy Technology Data Exchange (ETDEWEB)

    George E. Dzyacky

    2010-11-23

    The Flooding Predictor™ is a patented advanced control technology proven in research at the Separations Research Program, University of Texas at Austin, to increase distillation column throughput by over 6%, while also increasing energy efficiency by 10%. The research was conducted under a U. S. Department of Energy Cooperative Agreement awarded to George Dzyacky of 2ndpoint, LLC. The Flooding Predictor™ works by detecting the incipient flood point and controlling the column closer to its actual hydraulic limit than historical practices have allowed. Further, the technology uses existing column instrumentation, meaning no additional refining infrastructure is required. Refiners often push distillation columns to maximize throughput, improve separation, or simply to achieve day-to-day optimization. Attempting to achieve such operating objectives is a tricky undertaking that can result in flooding. Operators and advanced control strategies alike rely on the conventional use of delta-pressure instrumentation to approximate the column’s approach to flood. But column delta-pressure is more an inference of the column’s approach to flood than it is an actual measurement of it. As a consequence, delta pressure limits are established conservatively in order to operate in a regime where the column is never expected to flood. As a result, there is much “left on the table” when operating in such a regime, i.e. the capacity difference between controlling the column to an upper delta-pressure limit and controlling it to the actual hydraulic limit. The Flooding Predictor™, an innovative pattern recognition technology, controls columns at their actual hydraulic limit, which research shows leads to a throughput increase of over 6%. Controlling closer to the hydraulic limit also permits operation in a sweet spot of increased energy-efficiency. In this region of increased column loading, the Flooding Predictor is able to exploit the benefits of higher liquid
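
    The Flooding Predictor itself is a patented, proprietary pattern-recognition technology, so the sketch below is only a generic illustration of the idea the abstract contrasts: a conservative absolute delta-pressure limit versus flagging a sustained rise in delta-pressure as a crude incipient-flood indicator. All signal values and thresholds are invented.

```python
# Generic illustration only -- not the patented Flooding Predictor algorithm.
# It contrasts a conservative absolute delta-pressure limit with flagging a
# sustained rise in delta-pressure (a crude incipient-flood indicator).
# All numbers below are invented for the sketch.
import numpy as np

rng = np.random.default_rng(0)
minutes = np.arange(240)
# Synthetic column delta-pressure (kPa): steady, then ramping toward flood.
dp = 4.0 + 0.002 * minutes + rng.normal(0, 0.05, minutes.size)
dp[180:] += 0.06 * (minutes[180:] - 180)          # incipient flooding begins

ABSOLUTE_LIMIT = 7.0        # conservative delta-pressure alarm (kPa)
WINDOW = 15                 # minutes used to estimate the local trend
TREND_LIMIT = 0.02          # kPa/min sustained rise treated as incipient flood

def first_alarm_absolute(dp):
    hits = np.flatnonzero(dp > ABSOLUTE_LIMIT)
    return int(hits[0]) if hits.size else None

def first_alarm_trend(dp):
    for t in range(WINDOW, dp.size):
        slope = np.polyfit(np.arange(WINDOW), dp[t - WINDOW:t], 1)[0]
        if slope > TREND_LIMIT:
            return t
    return None

print("absolute-limit alarm at minute:", first_alarm_absolute(dp))
print("trend-based alarm at minute:   ", first_alarm_trend(dp))
```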

  3. Big Java late objects

    CERN Document Server

    Horstmann, Cay S

    2012-01-01

    Big Java: Late Objects is a comprehensive introduction to Java and computer programming, which focuses on the principles of programming, software engineering, and effective learning. It is designed for a two-semester first course in programming for computer science students.

  4. Big ideas: innovation policy

    OpenAIRE

    John Van Reenen

    2011-01-01

    In the last CentrePiece, John Van Reenen stressed the importance of competition and labour market flexibility for productivity growth. His latest in CEP's 'big ideas' series describes the impact of research on how policy-makers can influence innovation more directly - through tax credits for business spending on research and development.

  5. Big Data ethics

    NARCIS (Netherlands)

    Zwitter, Andrej

    2014-01-01

    The speed of development in Big Data and associated phenomena, such as social media, has surpassed the capacity of the average consumer to understand his or her actions and their knock-on effects. We are moving towards changes in how ethics has to be perceived: away from individual decisions with

  6. Big data in history

    CERN Document Server

    Manning, Patrick

    2013-01-01

    Big Data in History introduces the project to create a world-historical archive, tracing the last four centuries of historical dynamics and change. Chapters address the archive's overall plan, how to interpret the past through a global archive, the missions of gathering records, linking local data into global patterns, and exploring the results.

  7. The Big Sky inside

    Science.gov (United States)

    Adams, Earle; Ward, Tony J.; Vanek, Diana; Marra, Nancy; Hester, Carolyn; Knuth, Randy; Spangler, Todd; Jones, David; Henthorn, Melissa; Hammill, Brock; Smith, Paul; Salisbury, Rob; Reckin, Gene; Boulafentis, Johna

    2009-01-01

    The University of Montana (UM)-Missoula has implemented a problem-based program in which students perform scientific research focused on indoor air pollution. The Air Toxics Under the Big Sky program (Jones et al. 2007; Adams et al. 2008; Ward et al. 2008) provides a community-based framework for understanding the complex relationship between poor…

  8. Moving Another Big Desk.

    Science.gov (United States)

    Fawcett, Gay

    1996-01-01

    New ways of thinking about leadership require that leaders move their big desks and establish environments that encourage trust and open communication. Educational leaders must trust their colleagues to make wise choices. When teachers are treated democratically as leaders, classrooms will also become democratic learning organizations. (SM)

  9. A Big Bang Lab

    Science.gov (United States)

    Scheider, Walter

    2005-01-01

    The February 2005 issue of The Science Teacher (TST) reminded everyone that by learning how scientists study stars, students gain an understanding of how science measures things that can not be set up in lab, either because they are too big, too far away, or happened in a very distant past. The authors of "How Far are the Stars?" show how the…

  10. New 'bigs' in cosmology

    International Nuclear Information System (INIS)

    Yurov, Artyom V.; Martin-Moruno, Prado; Gonzalez-Diaz, Pedro F.

    2006-01-01

    This paper contains a detailed discussion on new cosmic solutions describing the early and late evolution of a universe that is filled with a kind of dark energy that may or may not satisfy the energy conditions. The main distinctive property of the resulting space-times is that they make the single singular events predicted by the corresponding quintessential (phantom) models appear twice, in a manner which can be made symmetric with respect to the origin of cosmic time. Thus, the big bang and big rip singularities are shown to take place twice, once on the positive branch of time and once on the negative one. We have also considered dark energy and phantom energy accretion onto black holes and wormholes in the context of these new cosmic solutions. It is seen that the space-times of these holes would then undergo swelling processes leading to big trip and big hole events taking place at distinct epochs along the evolution of the universe. In this way, the possibility is considered that the past and future be connected in a non-paradoxical manner in the universes described by means of the new symmetric solutions

  11. The Big Bang

    CERN Multimedia

    Moods, Patrick

    2006-01-01

    How did the Universe begin? The favoured theory is that everything - space, time, matter - came into existence at the same moment, around 13.7 thousand million years ago. This event was scornfully referred to as the "Big Bang" by Sir Fred Hoyle, who did not believe in it and maintained that the Universe had always existed.

  12. Big Data Analytics

    Indian Academy of Sciences (India)

    The volume and variety of data being generated using computers is doubling every two years. It is estimated that in 2015, 8 Zettabytes (Zetta = 10^21) were generated, which consisted mostly of unstructured data such as emails, blogs, Twitter, Facebook posts, images, and videos. This is called big data. It is possible to analyse ...

  13. Identifying Dwarfs Workloads in Big Data Analytics

    OpenAIRE

    Gao, Wanling; Luo, Chunjie; Zhan, Jianfeng; Ye, Hainan; He, Xiwen; Wang, Lei; Zhu, Yuqing; Tian, Xinhui

    2015-01-01

    Big data benchmarking is particularly important and provides applicable yardsticks for evaluating booming big data systems. However, wide coverage and great complexity of big data computing impose big challenges on big data benchmarking. How can we construct a benchmark suite using a minimum set of units of computation to represent diversity of big data analytics workloads? Big data dwarfs are abstractions of extracting frequently appearing operations in big data computing. One dwarf represen...

  14. Big Data and Chemical Education

    Science.gov (United States)

    Pence, Harry E.; Williams, Antony J.

    2016-01-01

    The amount of computerized information that organizations collect and process is growing so large that the term Big Data is commonly being used to describe the situation. Accordingly, Big Data is defined by a combination of the Volume, Variety, Velocity, and Veracity of the data being processed. Big Data tools are already having an impact in…

  15. Scaling Big Data Cleansing

    KAUST Repository

    Khayyat, Zuhair

    2017-07-31

    Data cleansing approaches have usually focused on detecting and fixing errors with little attention to big data scaling. This presents a serious impediment since identifying and repairing dirty data often involves processing huge input datasets, handling sophisticated error discovery approaches and managing huge arbitrary errors. With large datasets, error detection becomes overly expensive and complicated especially when considering user-defined functions. Furthermore, a distinctive algorithm is desired to optimize inequality joins in sophisticated error discovery rather than naïvely parallelizing them. Also, when repairing large errors, their skewed distribution may obstruct effective error repairs. In this dissertation, I present solutions to overcome the above three problems in scaling data cleansing. First, I present BigDansing as a general system to tackle efficiency, scalability, and ease-of-use issues in data cleansing for Big Data. It automatically parallelizes the user’s code on top of general-purpose distributed platforms. Its programming interface allows users to express data quality rules independently from the requirements of parallel and distributed environments. Without sacrificing their quality, BigDansing also enables parallel execution of serial repair algorithms by exploiting the graph representation of discovered errors. The experimental results show that BigDansing outperforms existing baselines up to more than two orders of magnitude. Although BigDansing scales cleansing jobs, it still lacks the ability to handle sophisticated error discovery requiring inequality joins. Therefore, I developed IEJoin as an algorithm for fast inequality joins. It is based on sorted arrays and space efficient bit-arrays to reduce the problem’s search space. By comparing IEJoin against well-known optimizations, I show that it is more scalable, and several orders of magnitude faster. BigDansing depends on vertex-centric graph systems, i.e., Pregel
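
    The snippet below is not the IEJoin algorithm from the dissertation; it is a small sketch of the underlying idea (replace a naive nested-loop inequality join with one sorted scan plus an array over ranks), checked against a brute-force join. The table, the predicates, and the tie-handling simplifications are invented for illustration.

```python
# Illustrative sketch of an inequality self-join r.a < s.a AND r.b > s.b,
# using one sorted scan plus an array indexed by b-rank instead of a naive
# nested loop. Not the actual IEJoin algorithm; b values assumed distinct.

def inequality_self_join(rows):
    """All index pairs (i, j) with rows[i][0] < rows[j][0] and rows[i][1] > rows[j][1]."""
    n = len(rows)
    # Rank every tuple by its b value (0 = smallest b).
    b_rank = {idx: r for r, (idx, _) in
              enumerate(sorted(enumerate(rows), key=lambda t: t[1][1]))}
    seen = [None] * n          # plays the role of the bit array, indexed by b-rank
    pairs = []
    # Scan tuples in ascending order of a; process ties together so a stays strict.
    order = sorted(range(n), key=lambda idx: rows[idx][0])
    i = 0
    while i < n:
        j = i
        while j < n and rows[order[j]][0] == rows[order[i]][0]:
            j += 1
        group = order[i:j]
        # Every already-seen tuple with a larger b-rank has smaller a and larger b.
        for s_idx in group:
            for r in range(b_rank[s_idx] + 1, n):
                if seen[r] is not None:
                    pairs.append((seen[r], s_idx))
        # Insert the group only after probing it, so equal-a tuples never match.
        for s_idx in group:
            seen[b_rank[s_idx]] = s_idx
        i = j
    return pairs

if __name__ == "__main__":
    rows = [(1, 9), (2, 7), (3, 8), (3, 1), (5, 4)]   # (a, b) tuples
    fast = set(inequality_self_join(rows))
    brute = {(x, y) for x in range(len(rows)) for y in range(len(rows))
             if rows[x][0] < rows[y][0] and rows[x][1] > rows[y][1]}
    assert fast == brute
    print(sorted(fast))
```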

  16. Iowa Flood Information System

    Science.gov (United States)

    Demir, I.; Krajewski, W. F.; Goska, R.; Mantilla, R.; Weber, L. J.; Young, N.

    2011-12-01

    The Iowa Flood Information System (IFIS) is a web-based platform developed by the Iowa Flood Center (IFC) to provide access to flood inundation maps, real-time flood conditions, flood forecasts (both short-term and seasonal), flood-related data, information and interactive visualizations for communities in Iowa. The key element of the system's architecture is the notion of community. Locations of the communities, those near streams and rivers, define basin boundaries. The IFIS provides community-centric watershed and river characteristics, weather (rainfall) conditions, and streamflow data and visualization tools. Interactive interfaces allow access to inundation maps for different stage and return period values, and flooding scenarios with contributions from multiple rivers. Real-time and historical data of water levels, gauge heights, and rainfall conditions are available in the IFIS by streaming data from automated IFC bridge sensors, USGS stream gauges, NEXRAD radars, and NWS forecasts. Simple 2D and 3D interactive visualizations in the IFIS make the data more understandable to the general public. Users are able to filter data sources for their communities and selected rivers. The data and information on the IFIS are also accessible through web services and mobile applications. The IFIS is optimized for various browsers and screen sizes to provide access through multiple platforms including tablets and mobile devices. The IFIS includes a rainfall-runoff forecast model to provide a five-day flood risk estimate for around 500 communities in Iowa. Multiple view modes in the IFIS accommodate different user types, from the general public to researchers and decision makers, by providing different levels of tools and detail. River view mode allows users to visualize data from multiple IFC bridge sensors and USGS stream gauges to follow flooding conditions along a river. The IFIS will help communities make better-informed decisions on the occurrence of floods, and will alert communities

  17. Risk-trading in flood management: An economic model.

    Science.gov (United States)

    Chang, Chiung Ting

    2017-09-15

    Although flood management is no longer exclusively a topic of engineering, flood mitigation continues to be associated with hard engineering options. Flood adaptation or the capacity to adapt to flood risk, as well as a demand for internalizing externalities caused by flood risk between regions, complicate flood management activities. Even though integrated river basin management has long been recommended to resolve the above issues, it has proven difficult to apply widely, and sometimes even to bring into existence. This article explores how internalization of externalities as well as the realization of integrated river basin management can be encouraged via the use of a market-based approach, namely a flood risk trading program. In addition to maintaining efficiency of optimal resource allocation, a flood risk trading program may also provide a more equitable distribution of benefits by facilitating decentralization. This article employs a graphical analysis to show how flood risk trading can be implemented to encourage mitigation measures that increase infiltration and storage capacity. A theoretical model is presented to demonstrate the economic conditions necessary for flood risk trading. Copyright © 2017 Elsevier Ltd. All rights reserved.

  18. Predicting floods with Flickr tags.

    Science.gov (United States)

    Tkachenko, Nataliya; Jarvis, Stephen; Procter, Rob

    2017-01-01

    Increasingly, user generated content (UGC) in social media postings and their associated metadata such as time and location stamps are being used to provide useful operational information during natural hazard events such as hurricanes, storms and floods. The main advantages of these new sources of data are twofold. First, in a purely additive sense, they can provide much denser geographical coverage of the hazard as compared to traditional sensor networks. Second, they provide what physical sensors are not able to do: by documenting personal observations and experiences, they directly record the impact of a hazard on the human environment. For this reason, interpretation of the content (e.g., hashtags, images, text, emojis, etc.) and metadata (e.g., keywords, tags, geolocation) has been a focus of much research into social media analytics. However, as choices of semantic tags in the current methods are usually reduced to the exact name or type of the event (e.g., hashtags '#Sandy' or '#flooding'), the main limitation of such approaches remains their mere nowcasting capacity. In this study we make use of polysemous tags of images posted during several recent flood events and demonstrate how such volunteered geographic data can be used to provide early warning of an event before its outbreak.

  19. Unexpected flood loss correlations across Europe

    Science.gov (United States)

    Booth, Naomi; Boyd, Jessica

    2017-04-01

    Floods don't observe country borders, as highlighted by major events across Europe that resulted in heavy economic and insured losses in 1999, 2002, 2009 and 2013. Flood loss correlations between some countries occur along multi-country river systems or between neighbouring nations affected by the same weather systems. However, correlations are not so obvious and whilst flooding in multiple locations across Europe may appear independent, for a re/insurer providing cover across the continent, these unexpected correlations can lead to high loss accumulations. A consistent, continental-scale method that allows quantification and comparison of losses, and identifies correlations in loss between European countries is therefore essential. A probabilistic model for European river flooding was developed that allows estimation of potential losses to pan-European property portfolios. By combining flood hazard and exposure information in a catastrophe modelling platform, we can consider correlations between river basins across Europe rather than being restricted to country boundaries. A key feature of the model is its statistical event set based on extreme value theory. Using historical river flow data, the event set captures spatial and temporal patterns of flooding across Europe and simulates thousands of events representing a full range of possible scenarios. Some known correlations were identified, such as between neighbouring Belgium and Luxembourg where 28% of events that affect either country produce a loss in both. However, our model identified some unexpected correlations including between Austria and Poland, and Poland and France, which are geographically distant. These correlations in flood loss may be missed by traditional methods and are key for re/insurers with risks in multiple countries. The model also identified that 46% of European river flood events affect more than one country. For more extreme events with a return period higher than 200 years, all events
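
    A toy version of the co-occurrence statistic quoted above (the share of simulated events producing a loss in either of two countries that produce a loss in both) can be computed directly from an event-loss table; the table below is invented and far smaller than a real stochastic event set.

```python
# Sketch of the co-occurrence statistic quoted in the abstract: of all simulated
# events producing a loss in either of two countries, what share hits both?
# The event-loss table below is invented for illustration.
import numpy as np

countries = ["BE", "LU", "AT", "PL"]
# rows = simulated flood events, columns = modelled loss per country (EUR m)
losses = np.array([
    [12.0,  3.5, 0.0, 0.0],
    [ 0.0,  0.0, 8.1, 2.2],
    [ 4.4,  0.0, 0.0, 0.0],
    [ 7.3,  1.1, 0.0, 0.0],
    [ 0.0,  0.0, 0.0, 9.0],
])

def joint_loss_share(losses, countries, c1, c2):
    i, j = countries.index(c1), countries.index(c2)
    hit1, hit2 = losses[:, i] > 0, losses[:, j] > 0
    return (hit1 & hit2).sum() / (hit1 | hit2).sum()

print("share of BE/LU events hitting both:", joint_loss_share(losses, countries, "BE", "LU"))

# Multi-country events as a share of all events producing any loss:
any_loss = (losses > 0).any(axis=1)
multi = (losses > 0).sum(axis=1) > 1
print("multi-country share:", multi.sum() / any_loss.sum())
```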

  20. Big data analysis new algorithms for a new society

    CERN Document Server

    Stefanowski, Jerzy

    2016-01-01

    This edited volume is devoted to Big Data Analysis from a Machine Learning standpoint as presented by some of the most eminent researchers in this area. It demonstrates that Big Data Analysis opens up new research problems which were either never considered before, or were only considered within a limited range. In addition to providing methodological discussions on the principles of mining Big Data and the difference between traditional statistical data analysis and newer computing frameworks, this book presents recently developed algorithms affecting such areas as business, financial forecasting, human mobility, the Internet of Things, information networks, bioinformatics, medical systems and life science. It explores, through a number of specific examples, how the study of Big Data Analysis has evolved and how it has started and will most likely continue to affect society. While the benefits brought upon by Big Data Analysis are underlined, the book also discusses some of the warnings that have been issued...

  1. How Big Are "Martin's Big Words"? Thinking Big about the Future.

    Science.gov (United States)

    Gardner, Traci

    "Martin's Big Words: The Life of Dr. Martin Luther King, Jr." tells of King's childhood determination to use "big words" through biographical information and quotations. In this lesson, students in grades 3 to 5 explore information on Dr. King to think about his "big" words, then they write about their own…

  2. Survey of Cyber Crime in Big Data

    Science.gov (United States)

    Rajeswari, C.; Soni, Krishna; Tandon, Rajat

    2017-11-01

    Big data refers to performing computation and database operations over very large amounts of data, drawn automatically from the data owner's business. Since a critical strategic advantage of big data is access to information from numerous and varied domains, security and privacy will play an important part in big data research and technology. The limits of standard IT security practices are well known, so software deployment in which developers may have incorporated malicious code is a real and growing threat for applications and operating systems, and such threats are difficult to contain; their effects can spread faster than big data itself. One central issue, therefore, is whether current security and privacy technology is sufficient to provide controlled assurance for very large numbers of direct accesses. For effective use of big data, access from one domain to the data of that domain, or of any other domain, has to be authorized. For many years, trustworthy systems development has produced a rich set of proven security concepts for dealing largely with determined adversaries, although this approach has been widely dismissed as "needless excess" by vendors. In this discussion, the essential issues for big data to take advantage of this mature security and privacy technology are examined, while the remaining research challenges are explored.

  3. Reactor safety under design basis flood condition for inland sites

    International Nuclear Information System (INIS)

    Hajela, S.; Bajaj, S.S.; Samota, A.; Verma, U.S.P.; Warudkar, A.S.

    2002-01-01

    Full text: In June 1994, there was an incident of flooding at Kakrapar Atomic Power Station (KAPS) due to combination of heavy rains and mechanical failure in the operation of gates at the adjoining weir. An indepth review of the incident was carried out and a number of flood protection measures were recommended and were implemented at site. As part of this review, a safety analysis was also done to demonstrate reactor safety with a series of failures considered in the flood protection features. For each inland NPP site, as part of design, different flood scenarios are analysed to arrive at design basis flood (DBF) level. This level is estimated based on worst combination of heavy local precipitation, flooding in river, failure of upstream/downstream water control structures

  4. Predicting the impact of urban flooding using open data.

    Science.gov (United States)

    Tkachenko, Nataliya; Procter, Rob; Jarvis, Stephen

    2016-05-01

    This paper aims to explore whether there is a relationship between search patterns for flood risk information on the Web and how badly localities have been affected by flood events. We hypothesize that localities where people stay more actively informed about potential flooding experience less negative impact than localities where people make less effort to be informed. Being informed, of course, does not hold the waters back; however, it may stimulate (or serve as an indicator of) such resilient behaviours as timely use of sandbags, relocation of possessions from basements to upper floors and/or temporary evacuation from flooded homes to alternative accommodation. We make use of open data to test this relationship empirically. Our results demonstrate that although aggregated Web search reflects average rainfall patterns, its eigenvectors predominantly consist of locations with similar flood impacts during 2014-2015. These results are also consistent with statistically significant correlations of Web search eigenvectors with flood warning and incident reporting datasets.
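
    As a rough sketch of the eigenvector analysis described, the code below builds a synthetic locality-by-week search-volume matrix, extracts its leading component with an SVD, and correlates the locality loadings with a synthetic impact measure. All data are simulated; the study's actual datasets and preprocessing are not reproduced here.

```python
# Sketch (synthetic data): extract the leading component of a locality-by-week
# web-search matrix with an SVD and correlate localities' weights with impact.
import numpy as np

rng = np.random.default_rng(0)
n_localities, n_weeks = 30, 52
rainfall = rng.gamma(2.0, 5.0, n_weeks)                 # shared weekly driver
impact = rng.random(n_localities)                       # how flood-prone each place is
searches = np.outer(impact, rainfall) + rng.normal(0, 2.0, (n_localities, n_weeks))

X = searches - searches.mean(axis=1, keepdims=True)     # centre each locality's series
U, s, Vt = np.linalg.svd(X, full_matrices=False)
loadings = U[:, 0] * np.sign(U[:, 0].sum())             # leading eigenvector over localities

corr = np.corrcoef(loadings, impact)[0, 1]
print("correlation of leading-component loadings with impact:", round(corr, 2))
```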

  5. Legitimizing differentiated flood protection levels

    NARCIS (Netherlands)

    Thomas, Hartmann; Spit, Tejo

    2016-01-01

    The European flood risk management plan is a new instrument introduced by the Floods Directive. It introduces a spatial turn and a scenario approach in flood risk management, ultimately leading to differentiated flood protection levels on a catchment basis. This challenges the traditional sources of

  6. Modeling of Flood Risk for the Continental United States

    Science.gov (United States)

    Lohmann, D.; Li, S.; Katz, B.; Goteti, G.; Kaheil, Y. H.; Vojjala, R.

    2011-12-01

    The science of catastrophic risk modeling helps people to understand the physical and financial implications of natural catastrophes (hurricanes, flood, earthquakes, etc.), terrorism, and the risks associated with changes in life expectancy. As such it depends on simulation techniques that integrate multiple disciplines such as meteorology, hydrology, structural engineering, statistics, computer science, financial engineering, actuarial science, and more in virtually every field of technology. In this talk we will explain the techniques and underlying assumptions of building the RMS US flood risk model. We especially will pay attention to correlation (spatial and temporal), simulation and uncertainty in each of the various components in the development process. Recent extreme floods (e.g. US Midwest flood 2008, US Northeast flood, 2010) have increased the concern of flood risk. Consequently, there are growing needs to adequately assess the flood risk. The RMS flood hazard model is mainly comprised of three major components. (1) Stochastic precipitation simulation module based on a Monte-Carlo analogue technique, which is capable of producing correlated rainfall events for the continental US. (2) Rainfall-runoff and routing module. A semi-distributed rainfall-runoff model was developed to properly assess the antecedent conditions, determine the saturation area and runoff. The runoff is further routed downstream along the rivers by a routing model. Combined with the precipitation model, it allows us to correlate the streamflow and hence flooding from different rivers, as well as low and high return-periods across the continental US. (3) Flood inundation module. It transforms the discharge (output from the flow routing) into water level, which is further combined with a two-dimensional off-floodplain inundation model to produce comprehensive flood hazard map. The performance of the model is demonstrated by comparing to the observation and published data. Output from

  7. Interactive Web-based Floodplain Simulation System for Realistic Experiments of Flooding and Flood Damage

    Science.gov (United States)

    Demir, I.

    2013-12-01

    Recent developments in web technologies make it easy to manage and visualize large data sets and share them with the general public. Novel visualization techniques and dynamic user interfaces allow users to create realistic environments, and interact with data to gain insight from simulations and environmental observations. The floodplain simulation system is a web-based 3D interactive flood simulation environment for creating real-world flooding scenarios. The simulation system provides a visually striking platform with realistic terrain information and water simulation. Students can create and modify predefined scenarios, control environmental parameters, and evaluate flood mitigation techniques. The web-based simulation system provides an environment in which children and adults can learn about flooding, flood damage, and the effects of development and human activity in the floodplain. The system provides various scenarios customized to fit the age and education level of the users. This presentation provides an overview of the web-based flood simulation system, and demonstrates the capabilities of the system for various flooding and land use scenarios.

  8. Earth Science Capability Demonstration Project

    Science.gov (United States)

    Cobleigh, Brent

    2006-01-01

    A viewgraph presentation reviewing the Earth Science Capability Demonstration Project is shown. The contents include: 1) ESCD Project; 2) Available Flight Assets; 3) Ikhana Procurement; 4) GCS Layout; 5) Baseline Predator B Architecture; 6) Ikhana Architecture; 7) UAV Capability Assessment; 8) The Big Picture; 9) NASA/NOAA UAV Demo (5/05 to 9/05); 10) NASA/USFS Western States Fire Mission (8/06); and 11) Suborbital Telepresence.

  9. Big data analytics to improve cardiovascular care: promise and challenges.

    Science.gov (United States)

    Rumsfeld, John S; Joynt, Karen E; Maddox, Thomas M

    2016-06-01

    The potential for big data analytics to improve cardiovascular quality of care and patient outcomes is tremendous. However, the application of big data in health care is at a nascent stage, and the evidence to date demonstrating that big data analytics will improve care and outcomes is scant. This Review provides an overview of the data sources and methods that comprise big data analytics, and describes eight areas of application of big data analytics to improve cardiovascular care, including predictive modelling for risk and resource use, population management, drug and medical device safety surveillance, disease and treatment heterogeneity, precision medicine and clinical decision support, quality of care and performance measurement, and public health and research applications. We also delineate the important challenges for big data applications in cardiovascular care, including the need for evidence of effectiveness and safety, the methodological issues such as data quality and validation, and the critical importance of clinical integration and proof of clinical utility. If big data analytics are shown to improve quality of care and patient outcomes, and can be successfully implemented in cardiovascular practice, big data will fulfil its potential as an important component of a learning health-care system.

  10. Determining tropical cyclone inland flooding loss on a large scale through a new flood peak ratio-based methodology

    International Nuclear Information System (INIS)

    Czajkowski, Jeffrey; Michel-Kerjan, Erwann; Villarini, Gabriele; Smith, James A

    2013-01-01

    In recent years, the United States has been severely affected by numerous tropical cyclones (TCs) which have caused massive damages. While media attention mainly focuses on coastal losses from storm surge, these TCs have inflicted significant devastation inland as well. Yet, little is known about the relationship between TC-related inland flooding and economic losses. Here we introduce a novel methodology that first successfully characterizes the spatial extent of inland flooding, and then quantifies its relationship with flood insurance claims. Hurricane Ivan in 2004 is used as illustration. We empirically demonstrate in a number of ways that our quantified inland flood magnitude produces a very good representation of the number of inland flood insurance claims experienced. These results highlight the new technological capabilities that can lead to a better risk assessment of inland TC flood. This new capacity will be of tremendous value to a number of public and private sector stakeholders dealing with disaster preparedness. (letter)
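
    A simplified sketch of a flood-peak-ratio style analysis is shown below: each gauge's event peak discharge is normalized by a reference flood (here a 10-year peak, an assumption of this sketch; the letter's exact normalization may differ) and the ratio is then related to insurance claims. All numbers are invented.

```python
# Sketch of a flood-peak-ratio style analysis: normalize each gauge's event peak
# by a reference flood and relate the ratio to inland flood insurance claims.
# The normalizing flood (10-year peak) and all values are invented; the paper's
# exact definition may differ.
import numpy as np

# One row per stream gauge / county: event peak (m3/s), 10-yr peak (m3/s), claims
event_peak = np.array([310.0,  95.0, 540.0,  60.0, 220.0, 410.0])
ten_year_peak = np.array([250.0, 180.0, 300.0, 150.0, 200.0, 260.0])
claims = np.array([180, 12, 410, 5, 95, 260])

peak_ratio = event_peak / ten_year_peak
corr = np.corrcoef(peak_ratio, np.log1p(claims))[0, 1]
print("peak ratios:", np.round(peak_ratio, 2))
print("correlation of peak ratio with log(1 + claims):", round(corr, 2))
```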

  11. The August 2002 flood in Salzburg / Austria experience gained and lessons learned from the ``Flood of the century''?

    Science.gov (United States)

    Wiesenegger, H.

    2003-04-01

    On the 12th of August 2002 a low pressure system moved slowly from northern Italy towards Slovakia. It continuously carried moist air from the Mediterranean towards the northern rim of the Alps, with the effect of wide-spread heavy rainfall in Salzburg and other parts of Austria. Daily precipitation amounts of 100 - 160 mm, in some parts even more, as well as rainfall intensities of 5 - 10 mm/h, combined with well saturated soils, led to a rare flood with a return period of 100 years and more. This rare hydrological event not only caused a national catastrophe with damages of several billion euros, but also endangered more than 200,000 people, and even killed some. As floods are dangerous, life-threatening, destructive, and certainly amongst the most frequent and costly natural disasters in terms of human hardship as well as economic loss, a great effort therefore has to be made to protect people against the negative impacts of floods. In order to achieve this objective, various regulations in land use planning (flood maps), constructive measures (river regulations and technical constructions) as well as flood warning systems, which are not suitable to prevent big floods but offer in-time warnings to minimize the loss of human lives, are used in Austria. HYDRIS (Hydrological Information System for flood forecasting in Salzburg), a modular river basin model developed at Technical University Vienna and operated by the Hydrological Service of Salzburg, was used during the August 2002 flood, providing accurate 3 to 4 hour forecasts within 3 % of the real peak discharge of the fast flowing River Salzach. The August 12th flood was in many ways an exceptional, very fast-developing event which took many people by surprise. At the gauging station Salzburg / Salzach (catchment area 4425 km2) it took only eighteen hours from mean annual discharge (178 m3/s) to the hundred-year flood (2300 m3/s). The August flood made clear that there is a strong need for

  12. Flood Risk Management in Iowa through an Integrated Flood Information System

    Science.gov (United States)

    Demir, Ibrahim; Krajewski, Witold

    2013-04-01

    communities in advance to help minimize damage of floods. This presentation provides an overview and live demonstration of the tools and interfaces in the IFIS developed to date to provide a platform for one-stop access to flood related data, visualizations, flood conditions, and forecast.

  13. Tested Demonstrations.

    Science.gov (United States)

    Gilbert, George L.

    1983-01-01

    An apparatus is described in which effects of pressure, volume, and temperature changes on a gas can be observed simultaneously. Includes use of the apparatus in demonstrating Boyle's, Gay-Lussac's, and Charles' Laws, attractive forces, Dalton's Law of Partial pressures, and in illustrating measurable vapor pressures of liquids and some solids.…

  14. Tested Demonstrations.

    Science.gov (United States)

    Gilbert, George L., Ed.

    1987-01-01

    Describes two demonstrations to illustrate characteristics of substances. Outlines a method to detect the changes in pH levels during the electrolysis of water. Uses water pistols, one filled with methane gas and the other filled with water, to illustrate the differences in these two substances. (TW)

  15. Flood-proof motors

    Energy Technology Data Exchange (ETDEWEB)

    Schmitt, Marcus [AREVA NP GmbH, Erlangen (Germany)

    2013-07-01

    Even before the Fukushima event occurred, some German nuclear power plants (NPPs) had considered flooding scenarios. As a result of one of these studies, AREVA performed an upgrade project in NPP Isar 1 with flood-proof motors as a replacement for the existing air-cooled low-voltage and high-voltage motors of the emergency cooling chain. After the Fukushima event, in which the cooling chains failed, the topic of flood-proof equipment has come more and more into focus. This compact will introduce different kinds of flood-proof electrical motors which are currently installed, or planned for installation, in NPPs around the world. Moreover, the qualification process, as it was performed during the project in NPP Isar 1, will be shown. (orig.)

  16. Floods and Mold Growth

    Science.gov (United States)

    Mold growth may be a problem after flooding. Excess moisture in the home is cause for concern about indoor air quality primarily because it provides breeding conditions for pests, molds and other microorganisms.

  17. FLOODPLAIN, FLOOD COUNTY, USA

    Data.gov (United States)

    Federal Emergency Management Agency, Department of Homeland Security — The Floodplain Mapping/Redelineation study deliverables depict and quantify the flood risks for the study area. The primary risk classifications used are the...

  18. Flood-proof motors

    International Nuclear Information System (INIS)

    Schmitt, Marcus

    2013-01-01

    Even before the Fukushima event occurred, some German nuclear power plants (NPPs) had considered flooding scenarios. As a result of one of these studies, AREVA performed an upgrade project in NPP Isar 1 with flood-proof motors as a replacement for the existing air-cooled low-voltage and high-voltage motors of the emergency cooling chain. After the Fukushima event, in which the cooling chains failed, the topic of flood-proof equipment has come more and more into focus. This compact will introduce different kinds of flood-proof electrical motors which are currently installed, or planned for installation, in NPPs around the world. Moreover, the qualification process, as it was performed during the project in NPP Isar 1, will be shown. (orig.)

  19. Flood hazard assessment in areas prone to flash flooding

    Science.gov (United States)

    Kvočka, Davor; Falconer, Roger A.; Bray, Michaela

    2016-04-01

    Contemporary climate projections suggest that there will be an increase in the occurrence of high-intensity rainfall events in the future. These precipitation extremes are usually the main cause for the emergence of extreme flooding, such as flash flooding. Flash floods are among the most unpredictable, violent and fatal natural hazards in the world. Furthermore, it is expected that flash flooding will occur even more frequently in the future due to more frequent development of extreme weather events, which will greatly increase the danger to people caused by flash flooding. This being the case, there will be a need for high resolution flood hazard maps in areas susceptible to flash flooding. This study investigates what type of flood hazard assessment methods should be used for assessing the flood hazard to people caused by flash flooding. Two different types of flood hazard assessment methods were tested: (i) a widely used method based on an empirical analysis, and (ii) a new, physically based and experimentally calibrated method. Two flash flood events were considered herein, namely: the 2004 Boscastle flash flood and the 2007 Železniki flash flood. The results obtained in this study suggest that in the areas susceptible to extreme flooding, the flood hazard assessment should be conducted using methods based on a mechanics-based analysis. In comparison to standard flood hazard assessment methods, these physically based methods: (i) take into account all of the physical forces, which act on a human body in floodwater, (ii) successfully adapt to abrupt changes in the flow regime, which often occur for flash flood events, and (iii) rapidly assess a flood hazard index in a relatively short period of time.
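
    For illustration, the sketch below contrasts the two families of methods: an empirical hazard rating of the commonly cited form HR = d·(v + 0.5) + DF, and a simplified mechanics-style depth-velocity stability check with an invented critical value. Neither is the calibrated method actually used in the study.

```python
# Sketch comparing two ways of flagging flood hazard to people at a point.
# The empirical rating follows the commonly cited form HR = d*(v + 0.5) + DF;
# the "mechanics-based" check is a simplified depth-velocity stability
# threshold with an invented critical value -- both are illustrations only.

def empirical_hazard_rating(depth_m, velocity_ms, debris_factor=0.5):
    return depth_m * (velocity_ms + 0.5) + debris_factor

def mechanics_based_unstable(depth_m, velocity_ms, critical_dv=0.6):
    # Person assumed unstable once the depth-velocity product exceeds a
    # critical value (here 0.6 m^2/s, a made-up threshold for illustration).
    return depth_m * velocity_ms > critical_dv

for depth, velocity in [(0.3, 0.5), (0.5, 2.0), (1.2, 0.3)]:
    hr = empirical_hazard_rating(depth, velocity)
    unstable = mechanics_based_unstable(depth, velocity)
    print(f"d={depth} m, v={velocity} m/s -> HR={hr:.2f}, unstable={unstable}")
```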

  20. Big Data-Survey

    Directory of Open Access Journals (Sweden)

    P.S.G. Aruna Sri

    2016-03-01

    Full Text Available Big data is the term for any collection of data sets so large and complex that it becomes difficult to process using traditional data processing applications. The challenges include analysis, capture, curation, search, sharing, storage, transfer, visualization, and privacy violations. Larger data sets, as compared with smaller ones, are required to spot business trends, anticipate diseases, combat crime, and so on. Big data is difficult to work with using most relational database management systems and desktop statistics and visualization packages, requiring instead massively parallel software running on tens, hundreds, or even thousands of servers. This paper presents an overview of the Hadoop architecture, the different tools used for big data, and its security issues.

  1. Big Data as Governmentality

    DEFF Research Database (Denmark)

    Flyverbom, Mikkel; Klinkby Madsen, Anders; Rasche, Andreas

    This paper conceptualizes how large-scale data and algorithms condition and reshape knowledge production when addressing international development challenges. The concept of governmentality and four dimensions of an analytics of government are proposed as a theoretical framework to examine how big data is constituted as an aspiration to improve the data and knowledge underpinning development efforts. Based on this framework, we argue that big data’s impact on how relevant problems are governed is enabled by (1) new techniques of visualizing development issues, (2) linking aspects of international development agendas to algorithms that synthesize large-scale data, (3) novel ways of rationalizing knowledge claims that underlie development efforts, and (4) shifts in professional and organizational identities of those concerned with producing and processing data for development. Our discussion...

  2. Big nuclear accidents

    International Nuclear Information System (INIS)

    Marshall, W.; Billingon, D.E.; Cameron, R.F.; Curl, S.J.

    1983-09-01

    Much of the debate on the safety of nuclear power focuses on the large number of fatalities that could, in theory, be caused by extremely unlikely but just imaginable reactor accidents. This, along with the nuclear industry's inappropriate use of vocabulary during public debate, has given the general public a distorted impression of the risks of nuclear power. The paper reviews the way in which the probability and consequences of big nuclear accidents have been presented in the past and makes recommendations for the future, including the presentation of the long-term consequences of such accidents in terms of 'loss of life expectancy', 'increased chance of fatal cancer' and 'equivalent pattern of compulsory cigarette smoking'. The paper presents mathematical arguments, which show the derivation and validity of the proposed methods of presenting the consequences of imaginable big nuclear accidents. (author)

  3. Big Bounce and inhomogeneities

    International Nuclear Information System (INIS)

    Brizuela, David; Mena Marugan, Guillermo A; Pawlowski, Tomasz

    2010-01-01

    The dynamics of an inhomogeneous universe is studied with the methods of loop quantum cosmology, via a so-called hybrid quantization, as an example of the quantization of vacuum cosmological spacetimes containing gravitational waves (Gowdy spacetimes). The analysis of this model with an infinite number of degrees of freedom, performed at the effective level, shows that (i) the initial Big Bang singularity is replaced (as in the case of homogeneous cosmological models) by a Big Bounce, joining deterministically two large universes, (ii) the universe size at the bounce is at least of the same order of magnitude as that of the background homogeneous universe and (iii) for each gravitational wave mode, the difference in amplitude at very early and very late times has a vanishing statistical average when the bounce dynamics is strongly dominated by the inhomogeneities, whereas this average is positive when the dynamics is in a near-vacuum regime, so that statistically the inhomogeneities are amplified. (fast track communication)

  4. Big Data and reality

    Directory of Open Access Journals (Sweden)

    Ryan Shaw

    2015-11-01

    Full Text Available DNA sequencers, Twitter, MRIs, Facebook, particle accelerators, Google Books, radio telescopes, Tumblr: what do these things have in common? According to the evangelists of “data science,” all of these are instruments for observing reality at unprecedentedly large scales and fine granularities. This perspective ignores the social reality of these very different technological systems, ignoring how they are made, how they work, and what they mean in favor of an exclusive focus on what they generate: Big Data. But no data, big or small, can be interpreted without an understanding of the process that generated them. Statistical data science is applicable to systems that have been designed as scientific instruments, but is likely to lead to confusion when applied to systems that have not. In those cases, a historical inquiry is preferable.

  5. Really big numbers

    CERN Document Server

    Schwartz, Richard Evan

    2014-01-01

    In the American Mathematical Society's first-ever book for kids (and kids at heart), mathematician and author Richard Evan Schwartz leads math lovers of all ages on an innovative and strikingly illustrated journey through the infinite number system. By means of engaging, imaginative visuals and endearing narration, Schwartz manages the monumental task of presenting the complex concept of Big Numbers in fresh and relatable ways. The book begins with small, easily observable numbers before building up to truly gigantic ones, like a nonillion, a tredecillion, a googol, and even ones too huge for names! Any person, regardless of age, can benefit from reading this book. Readers will find themselves returning to its pages for a very long time, perpetually learning from and growing with the narrative as their knowledge deepens. Really Big Numbers is a wonderful enrichment for any math education program and is enthusiastically recommended to every teacher, parent and grandparent, student, child, or other individual i...

  6. Big Bang Circus

    Science.gov (United States)

    Ambrosini, C.

    2011-06-01

    Big Bang Circus is an opera I composed in 2001 and which was premiered at the Venice Biennale Contemporary Music Festival in 2002. A chamber group, four singers and a ringmaster stage the story of the Universe confronting and interweaving two threads: how early man imagined it and how scientists described it. Surprisingly enough fancy, myths and scientific explanations often end up using the same images, metaphors and sometimes even words: a strong tension, a drumskin starting to vibrate, a shout…

  7. Big Bang 5

    CERN Document Server

    Apolin, Martin

    2007-01-01

    Physics should be understandable and fun! That is why every chapter in Big Bang begins with a motivating overview and guiding questions, and then moves from the fundamentals to the applications, from the simple to the complex. Throughout, the language remains simple, everyday-oriented and narrative in style. Volume 5 RG covers the fundamentals (systems of units, orders of magnitude) and mechanics (translation, rotation, force, conservation laws).

  8. Big Bang 8

    CERN Document Server

    Apolin, Martin

    2008-01-01

    Physics should be understandable and fun! That is why every chapter in Big Bang begins with a motivating overview and guiding questions, and then moves from the fundamentals to the applications, from the simple to the complex. Throughout, the language remains simple, everyday-oriented and narrative in style. Volume 8 conveys, in an accessible way, the theory of relativity, nuclear and particle physics (and their applications in cosmology and astrophysics), nanotechnology, and bionics.

  9. Big Bang 6

    CERN Document Server

    Apolin, Martin

    2008-01-01

    Physics should be understandable and fun! That is why every chapter in Big Bang begins with a motivating overview and guiding questions, and then moves from the fundamentals to the applications, from the simple to the complex. Throughout, the language remains simple, everyday-oriented and narrative in style. Volume 6 RG covers gravitation, oscillations and waves, thermodynamics, and an introduction to electricity, using everyday examples and cross-links to other disciplines.

  10. Big Bang 7

    CERN Document Server

    Apolin, Martin

    2008-01-01

    Physics should be understandable and fun! That is why every chapter in Big Bang begins with a motivating overview and guiding questions, and then moves from the fundamentals to the applications, from the simple to the complex. Throughout, the language remains simple, everyday-oriented and narrative in style. In addition to an introduction, Volume 7 covers many current aspects of quantum mechanics (e.g., beaming) and electrodynamics (e.g., electrosmog), as well as the climate problem and chaos theory.

  11. Big Bang Darkleosynthesis

    OpenAIRE

    Krnjaic, Gordan; Sigurdson, Kris

    2014-01-01

    In a popular class of models, dark matter comprises an asymmetric population of composite particles with short range interactions arising from a confined nonabelian gauge group. We show that coupling this sector to a well-motivated light mediator particle yields efficient darkleosynthesis, a dark-sector version of big-bang nucleosynthesis (BBN), in generic regions of parameter space. Dark matter self-interaction bounds typically require the confinement scale to be above ΛQCD, which generica...

  12. Big³. Editorial.

    Science.gov (United States)

    Lehmann, C U; Séroussi, B; Jaulent, M-C

    2014-05-22

    To provide an editorial introduction to the 2014 IMIA Yearbook of Medical Informatics, with an overview of the content, the new publishing scheme, and the upcoming 25th anniversary. A brief overview of the 2014 special topic, Big Data - Smart Health Strategies, and an outline of the novel publishing model are provided in conjunction with a call for proposals to celebrate the 25th anniversary of the Yearbook. 'Big Data' has become the latest buzzword in informatics and promises new approaches and interventions that can improve health, well-being, and quality of life. This edition of the Yearbook acknowledges the fact that we have just started to explore the opportunities that 'Big Data' will bring. However, it will become apparent to the reader that its pervasive nature has invaded all aspects of biomedical informatics - some to a higher degree than others. It was our goal to provide a comprehensive view of the state of 'Big Data' today, explore its strengths and weaknesses as well as its risks, discuss emerging trends, tools, and applications, and stimulate the development of the field through the aggregation of excellent survey papers and working group contributions on the topic. For the first time in its history, the IMIA Yearbook will be published in an open access online format, allowing a broader readership, especially in resource-poor countries. For the first time, thanks to the online format, the IMIA Yearbook will be published twice in the year, with two different tracks of papers. We anticipate that the important role of the IMIA Yearbook will further increase with these changes, just in time for its 25th anniversary in 2016.

  13. On the stationarity of Floods in west African rivers

    Science.gov (United States)

    NKA, B. N.; Oudin, L.; Karambiri, H.; Ribstein, P.; Paturel, J. E.

    2014-12-01

    West Africa has undergone a major change since the years 1970-1990, characterized by very low precipitation amounts leading to low stream flows in river basins, except in the Sahelian region, where the impact of human activities was pointed out to explain the substantial increase of floods in some catchments. More recently, studies showed an increase in the frequency of intense rainfall events, and according to observations made over the region, an increase in flood events is also noticeable during the rainy season. Therefore, the assumption of stationarity of flood events is questionable, and an investigation of flood evolution and climatic patterns is justified. In this work, we analyzed the trends in flood events for several catchments in the Sahelian and Sudanian regions of Burkina Faso. We used thirteen tributaries of large river basins (Niger, Nakambe, Mouhoun, Comoé) for which daily rainfall and flow data were collected from the national hydrological and meteorological services of the country. We used the Mann-Kendall and Pettitt tests to detect trends and break points in the annual time series of 8 rainfall indices and the annual maximum discharge records. We compared the trends of the precipitation indices and flood size records to analyze the possible causal link between flood size and rainfall pattern. We also analyzed the stationarity of the frequency of floods exceeding the ten-year return period level. The samples were extracted by a peak-over-threshold method and the change in flood frequency was quantified using a test developed by Lang M. (1995). The results exhibit two principal behaviors. Generally speaking, no trend is detected in the catchments' annual maximum discharge, but positive break points are found in a group of three right-bank tributaries of the Niger River located in the Sahelian region between 300 mm and 650 mm. These same catchments also show an increase in the yearly number of floods greater than the ten-year flood since

  14. Recent big flare

    International Nuclear Information System (INIS)

    Moriyama, Fumio; Miyazawa, Masahide; Yamaguchi, Yoshisuke

    1978-01-01

    The features of three big solar flares observed at Tokyo Observatory are described in this paper. The active region, McMath 14943, caused a big flare on September 16, 1977. The flare appeared on both sides of a long dark line which runs along the boundary of the magnetic field. Two-ribbon structure was seen. The electron density of the flare observed at Norikura Corona Observatory was 3 x 10^12 /cc. Several arc lines which connect both bright regions of different magnetic polarity were seen in the H-α monochrome image. The active region, McMath 15056, caused a big flare on December 10, 1977. At the beginning, several bright spots were observed in the region between the two main solar spots. Then, the area and the brightness increased, and the bright spots became two ribbon-shaped bands. A solar flare was observed on April 8, 1978. At first, several bright spots were seen around the solar spot in the active region, McMath 15221. Then, these bright spots developed into a large bright region. On both sides of a dark line along the magnetic neutral line, bright regions were generated. These developed into a two-ribbon flare. The time required for growth was more than one hour. A bright arc which connects the two ribbons was seen, and this arc may be a loop prominence system. (Kato, T.)

  15. Big Data Technologies

    Science.gov (United States)

    Bellazzi, Riccardo; Dagliati, Arianna; Sacchi, Lucia; Segagni, Daniele

    2015-01-01

    The so-called big data revolution provides substantial opportunities to diabetes management. At least 3 important directions are currently of great interest. First, the integration of different sources of information, from primary and secondary care to administrative information, may allow depicting a novel view of patient’s care processes and of single patient’s behaviors, taking into account the multifaceted nature of chronic care. Second, the availability of novel diabetes technologies, able to gather large amounts of real-time data, requires the implementation of distributed platforms for data analysis and decision support. Finally, the inclusion of geographical and environmental information into such complex IT systems may further increase the capability of interpreting the data gathered and extract new knowledge from them. This article reviews the main concepts and definitions related to big data, it presents some efforts in health care, and discusses the potential role of big data in diabetes care. Finally, as an example, it describes the research efforts carried on in the MOSAIC project, funded by the European Commission. PMID:25910540

  16. Hyper-resolution monitoring of urban flooding with social media and crowdsourcing data

    Science.gov (United States)

    Wang, Ruo-Qian; Mao, Huina; Wang, Yuan; Rae, Chris; Shaw, Wesley

    2018-02-01

    Hyper-resolution datasets for urban flooding are rare. This problem prevents detailed flooding risk analysis, urban flooding control, and the validation of hyper-resolution numerical models. We employed social media and crowdsourcing data to address this issue. Natural Language Processing and Computer Vision techniques are applied to the data collected from Twitter and MyCoast (a crowdsourcing app). We found these big data based flood monitoring approaches can complement the existing means of flood data collection. The extracted information is validated against precipitation data and road closure reports to examine the data quality. The two data collection approaches are compared and the two data mining methods are discussed. A series of suggestions is given to improve the data collection strategy.
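
    As a toy illustration of the text-filtering step, the snippet below flags flood-related posts with simple keyword matching; this stands in for the Natural Language Processing and Computer Vision models used in the study, and the posts and coordinates are invented.

```python
# Toy sketch of the text-filtering step: pick out posts that report flooding.
# Plain keyword matching stands in for the NLP/Computer Vision models used in
# the study; the posts and coordinates below are invented.
import re

FLOOD_TERMS = re.compile(r"\b(flood(ed|ing)?|underwater|storm surge|road closed)\b", re.I)

posts = [
    {"text": "Main St is completely flooded near the park", "lat": 40.72, "lon": -74.00},
    {"text": "Lovely sunny afternoon by the river",          "lat": 40.73, "lon": -74.01},
    {"text": "Underpass underwater again, road closed",      "lat": 40.71, "lon": -73.99},
]

flood_reports = [p for p in posts if FLOOD_TERMS.search(p["text"])]
for report in flood_reports:
    print(f"possible flood report at ({report['lat']}, {report['lon']}): {report['text']}")
```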

  17. Quantification of uncertainty in flood risk assessment for flood protection planning: a Bayesian approach

    Science.gov (United States)

    Dittes, Beatrice; Špačková, Olga; Ebrahimian, Negin; Kaiser, Maria; Rieger, Wolfgang; Disse, Markus; Straub, Daniel

    2017-04-01

    Flood risk estimates are subject to significant uncertainties, e.g. due to limited records of historic flood events, uncertainty in flood modeling, uncertain impact of climate change or uncertainty in the exposure and loss estimates. In traditional design of flood protection systems, these uncertainties are typically just accounted for implicitly, based on engineering judgment. In the AdaptRisk project, we develop a fully quantitative framework for planning of flood protection systems under current and future uncertainties using quantitative pre-posterior Bayesian decision analysis. In this contribution, we focus on the quantification of the uncertainties and study their relative influence on the flood risk estimate and on the planning of flood protection systems. The following uncertainty components are included using a Bayesian approach: 1) inherent and statistical (i.e. limited record length) uncertainty; 2) climate uncertainty that can be learned from an ensemble of GCM-RCM models; 3) estimates of climate uncertainty components not covered in 2), such as bias correction, incomplete ensemble, local specifics not captured by the GCM-RCM models; 4) uncertainty in the inundation modelling; 5) uncertainty in damage estimation. We also investigate how these uncertainties may be reduced in the future when new evidence - such as new climate models, observed extreme events, and socio-economic data - becomes available. Finally, we look into how this new evidence influences the risk assessment and the effectiveness of flood protection systems. We demonstrate our methodology for a pre-alpine catchment in southern Germany: the Mangfall catchment in Bavaria that includes the city of Rosenheim, which suffered significant losses during the 2013 flood event.
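
    A minimal, self-contained illustration (not the AdaptRisk framework itself) of how a Bayesian treatment propagates limited record length into a flood risk statement: the annual probability of exceeding a protection level is given a Beta prior, updated with the number of exceedance years observed in a short record, and the resulting parameter uncertainty is carried into a predictive probability. All priors and counts below are hypothetical.

```python
import numpy as np
from scipy import stats

# Beta prior on the annual exceedance probability p (mean 0.05, i.e. a 20-year level)
prior_a, prior_b = 1.0, 19.0
years_observed, exceedances = 35, 3        # hypothetical short record

# Beta-Binomial conjugate update
post = stats.beta(prior_a + exceedances, prior_b + years_observed - exceedances)
print(f"posterior mean annual exceedance prob: {post.mean():.3f}")
print(f"95% credible interval: {post.ppf(0.025):.3f} - {post.ppf(0.975):.3f}")

# Predictive probability of at least one exceedance in the next 30 years,
# integrating over parameter uncertainty rather than using a point estimate
p_samples = post.rvs(size=100_000, random_state=np.random.default_rng(3))
print(f"P(>=1 exceedance in 30 yr): {(1 - (1 - p_samples) ** 30).mean():.2f}")
```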

  18. 18 CFR 1304.407 - Development within flood control storage zones of TVA reservoirs.

    Science.gov (United States)

    2010-04-01

    ... flood control storage zones of TVA reservoirs. 1304.407 Section 1304.407 Conservation of Power and Water... documentation related to flood control storage, provided the loss of flood control storage caused by the project... control storage. If this determination can be made, the applicant must then demonstrate how the loss of...

  19. Big bang and big crunch in matrix string theory

    OpenAIRE

    Bedford, J; Papageorgakis, C; Rodríguez-Gómez, D; Ward, J

    2007-01-01

    Following the holographic description of linear dilaton null Cosmologies with a Big Bang in terms of Matrix String Theory put forward by Craps, Sethi and Verlinde, we propose an extended background describing a Universe including both Big Bang and Big Crunch singularities. This belongs to a class of exact string backgrounds and is perturbative in the string coupling far away from the singularities, both of which can be resolved using Matrix String Theory. We provide a simple theory capable of...

  20. Big Data Analytics in the Management of Business

    Directory of Open Access Journals (Sweden)

    Jelonek Dorota

    2017-01-01

    Full Text Available Data, information, knowledge have always played a critical role in business. The amount of various data that can be collected and stored is increasing, therefore companies need new solutions for data processing and analysis. The paper presents considerations on the concept of Big Data. The aim of the paper is to demonstrate that Big Data analytics can effectively support the management of a company. It also indicates the areas and activities where the use of Big Data analytics can bring the greatest benefits to companies.

  1. Mitigating flood exposure

    Science.gov (United States)

    Shultz, James M; McLean, Andrew; Herberman Mash, Holly B; Rosen, Alexa; Kelly, Fiona; Solo-Gabriele, Helena M; Youngs Jr, Georgia A; Jensen, Jessica; Bernal, Oscar; Neria, Yuval

    2013-01-01

    Introduction. In 2011, following heavy winter snowfall, two cities bordering two rivers in North Dakota, USA faced major flood threats. Flooding was foreseeable and predictable although the extent of risk was uncertain. One community, Fargo, situated in a shallow river basin, successfully mitigated and prevented flooding. For the other community, Minot, located in a deep river valley, prevention was not possible and downtown businesses and one-quarter of the homes were inundated, in the city’s worst flood on record. We aimed to contrast the respective hazards, vulnerabilities, stressors, psychological risk factors, psychosocial consequences, and disaster risk reduction strategies under conditions where flood prevention was, and was not, possible. Methods. We applied the “trauma signature analysis” (TSIG) approach to compare the hazard profiles, identify salient disaster stressors, document the key components of disaster risk reduction response, and examine indicators of community resilience. Results. Two demographically-comparable communities, Fargo and Minot, faced challenging river flood threats and exhibited effective coordination across community sectors. We examined the implementation of disaster risk reduction strategies in situations where coordinated citizen action was able to prevent disaster impact (hazard avoidance) compared to the more common scenario when unpreventable disaster strikes, causing destruction, harm, and distress. Across a range of indicators, it is clear that successful mitigation diminishes both physical and psychological impact, thereby reducing the trauma signature of the event. Conclusion. In contrast to the experience of historic flooding in Minot, the city of Fargo succeeded in reducing the trauma signature by way of reducing risk through mitigation. PMID:28228985

  2. Flexibility in Flood Management Design: Proactive Planning Under Climate Change Uncertainty

    Science.gov (United States)

    Smet, K.; de Neufville, R.; van der Vlist, M.

    2015-12-01

    This paper presents an innovative, value-enhancing procedure for effective planning and design of long-lived flood management infrastructure given uncertain future flooding threats due to climate change. Designing infrastructure that can be adapted over time is a method to safeguard the efficacy of current design decisions given uncertainty about rates and future impacts of climate change. This paper explores the value of embedding "options" in a physical structure, where an option is the right but not the obligation to do something at a later date (e.g. over-dimensioning a floodwall foundation now facilitates a future height addition in response to observed increases in sea level; building of extra pump bays in a pumping station now enables the addition of pumping capacity whenever increased precipitation warrants an expansion.) The proposed procedure couples a simulation model that captures future climate induced changes to the hydrologic operating environment of a structure, with an economic model that estimates the lifetime economic performance of alternative investments. The economic model uses Real "In" Options analysis, a type of cash flow analysis that quantifies the implicit value of options and the flexibility they provide. This procedure is demonstrated using replacement planning for the multi-functional pumping station IJmuiden on the North Sea Canal in the Netherlands. Flexibility in design decisions is modelled, varying the size and specific options included in the new structure. Results indicate that the incorporation of options within the structural design has the potential to improve its economic performance, as compared to more traditional, "build it once and build it big" designs where flexibility is not an explicit design criterion. The added value resulting from the incorporation of flexibility varies with the range of future conditions considered, as well as the options examined. This procedure could be applied more broadly to explore
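
    The sketch below illustrates the general shape of such a simulation-plus-cash-flow comparison, not the IJmuiden study itself: an inflexible "build it big now" design is compared with a flexible design whose embedded option (a deferred expansion) is exercised only on the simulated sea-level paths that require it. All costs, the discount rate and the sea-level model are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(7)
n_paths, horizon = 10_000, 100          # years of simulated operation
discount = 0.04

# Simple stochastic sea-level-rise paths (m): uncertain linear rate plus noise
rates = rng.normal(0.005, 0.002, n_paths)                                  # m/yr
paths = np.cumsum(rates[:, None] + rng.normal(0, 0.002, (n_paths, horizon)), axis=1)

cost_big_now = 120.0                                   # build full capacity today
cost_small_now, cost_expand, trigger = 80.0, 60.0, 0.30  # flexible design, expand at 0.30 m rise

# Flexible design: pay the expansion cost in the first year the trigger is crossed
exceed = paths > trigger
first_cross = np.where(exceed.any(axis=1), exceed.argmax(axis=1), horizon)
expand_pv = np.where(first_cross < horizon,
                     cost_expand / (1 + discount) ** first_cross, 0.0)

pv_flexible = cost_small_now + expand_pv
print(f"PV big-now design : {cost_big_now:6.1f}")
print(f"PV flexible design: {pv_flexible.mean():6.1f} (mean over {n_paths} paths)")
# The difference is one simple estimate of the value added by embedding the option.
```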

  3. Disaggregating asthma: Big investigation versus big data.

    Science.gov (United States)

    Belgrave, Danielle; Henderson, John; Simpson, Angela; Buchan, Iain; Bishop, Christopher; Custovic, Adnan

    2017-02-01

    We are facing a major challenge in bridging the gap between identifying subtypes of asthma to understand causal mechanisms and translating this knowledge into personalized prevention and management strategies. In recent years, "big data" has been sold as a panacea for generating hypotheses and driving new frontiers of health care; the idea that the data must and will speak for themselves is fast becoming a new dogma. One of the dangers of ready accessibility of health care data and computational tools for data analysis is that the process of data mining can become uncoupled from the scientific process of clinical interpretation, understanding the provenance of the data, and external validation. Although advances in computational methods can be valuable for using unexpected structure in data to generate hypotheses, there remains a need for testing hypotheses and interpreting results with scientific rigor. We argue for combining data- and hypothesis-driven methods in a careful synergy, and the importance of carefully characterized birth and patient cohorts with genetic, phenotypic, biological, and molecular data in this process cannot be overemphasized. The main challenge on the road ahead is to harness bigger health care data in ways that produce meaningful clinical interpretation and to translate this into better diagnoses and properly personalized prevention and treatment plans. There is a pressing need for cross-disciplinary research with an integrative approach to data science, whereby basic scientists, clinicians, data analysts, and epidemiologists work together to understand the heterogeneity of asthma. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  4. Application of Flood Nomograph for Flood Forecasting in Urban Areas

    Directory of Open Access Journals (Sweden)

    Eui Hoon Lee

    2018-01-01

    Full Text Available Imperviousness has increased due to urbanization, as has the frequency of extreme rainfall events due to climate change. Various countermeasures, such as structural and nonstructural measures, are required to prepare for these effects. Flood forecasting is a representative nonstructural measure. Flood forecasting techniques have been developed for the prevention of repetitive flood damage in urban areas. Some flood forecasting techniques that rely on training processes are difficult to apply because the training must be repeated for every use. Other flood forecasting techniques that use rainfall data predicted by radar are not appropriate for small areas, such as single drainage basins. In this study, a new flood forecasting technique is suggested to reduce flood damage in urban areas. The flood nomograph is built from the first flooding nodes identified in rainfall-runoff simulations driven by synthetic rainfall data for each duration. When selecting the first flooding node, the initial amount of synthetic rainfall is 1 mm, and it is increased in 1 mm increments until flooding occurs. The advantage of this flood forecasting technique is its simple application using real-time rainfall data. This technique can be used to prepare a preemptive response in the process of urban flood management.
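
    A minimal sketch of the nomograph construction loop described above, assuming a hypothetical run_rainfall_runoff(duration_min, depth_mm) helper that wraps the rainfall-runoff model and returns the IDs of flooded nodes for a synthetic storm of the given duration and total depth.

```python
def first_flooding_depth(duration_min, run_rainfall_runoff, max_depth_mm=300):
    """Increase synthetic rainfall in 1 mm steps until the first node floods."""
    for depth_mm in range(1, max_depth_mm + 1):
        flooded_nodes = run_rainfall_runoff(duration_min, depth_mm)  # hypothetical wrapper
        if flooded_nodes:                        # flooding first occurs at this depth
            return depth_mm, set(flooded_nodes)
    return None, set()                           # no flooding within the tested range

# The nomograph is then the set of (duration, threshold depth) points, e.g.:
# nomograph = {d: first_flooding_depth(d, run_rainfall_runoff)[0] for d in (30, 60, 120, 180)}
```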

  5. Big data and educational research

    OpenAIRE

    Beneito-Montagut, Roser

    2017-01-01

    Big data and data analytics offer the promise to enhance teaching and learning, improve educational research and advance education governance. This chapter aims to contribute to the conceptual and methodological understanding of big data and analytics within educational research. It describes the opportunities and challenges that big data and analytics bring to education, as well as critically explores the perils of applying a data-driven approach to education. Despite the claimed value of the...

  6. The trashing of Big Green

    International Nuclear Information System (INIS)

    Felten, E.

    1990-01-01

    The Big Green initiative on California's ballot lost by a margin of 2-to-1. Green measures lost in five other states, shocking ecology-minded groups. According to the postmortem by environmentalists, Big Green was a victim of poor timing and big spending by the opposition. Now its supporters plan to break up the bill and try to pass some provisions in the Legislature

  7. Crowdsourcing detailed flood data

    Science.gov (United States)

    Walliman, Nicholas; Ogden, Ray; Amouzad*, Shahrzhad

    2015-04-01

    Over the last decade the average annual loss across the European Union due to flooding has been 4.5bn Euros, but increasingly intense rainfall, as well as population growth, urbanisation and the rising costs of asset replacements, may see this rise to 23bn Euros a year by 2050. Equally disturbing are the profound social costs to individuals, families and communities which in addition to loss of lives include: loss of livelihoods, decreased purchasing and production power, relocation and migration, adverse psychosocial effects, and hindrance of economic growth and development. Flood prediction, management and defence strategies rely on the availability of accurate information and flood modelling. Whilst automated data gathering (by measurement and satellite) of the extent of flooding is already advanced, it is least reliable in urban and physically complex geographies where often the need for precise estimation is most acute. Crowdsourced data of actual flood events is a potentially critical component of this, allowing improved accuracy in such situations and identifying the effects of local landscape and topography, where the height of a simple kerb or a discontinuity in a boundary wall can have profound importance. Mobile 'App' based data acquisition using crowdsourcing in critical areas can combine camera records with GPS positional data and time, as well as descriptive data relating to the event. This will automatically produce a dataset, managed in ArcView GIS, with the potential for follow up calls to get more information through structured scripts for each strand. Through this local residents can provide highly detailed information that can be reflected in sophisticated flood protection models and be core to framing urban resilience strategies and optimising the effectiveness of investment. This paper will describe this pioneering approach that will develop flood event data in support of systems that will advance existing approaches such as those developed in the UK

  8. Floods in Colorado

    Science.gov (United States)

    Follansbee, Robert; Sawyer, Leon R.

    1948-01-01

    The first records of floods in Colorado antedated the settlement of the State by about 30 years. These were records of floods on the Arkansas and Republican Rivers in 1826. Other floods noted by traders, hunters and emigrants, some of whom were on their way to the Far West, occurred in 1844 on the Arkansas River, and by inference on the South Platte River. Other early floods were those on the Purgatoire, the Lower Arkansas, and the San Juan Rivers about 1859. The most serious flood since settlement began was that on the Arkansas River during June 1921, which caused the loss of about 100 lives and an estimated property loss of $19,000,000. Many floods of lesser magnitude have occurred, and some of these have caused loss of life and very considerable property damage. Topography is the chief factor in determining the location of storms and resulting floods. These occur most frequently on the eastern slope of the Front Range. In the mountains farther west precipitation is insufficient to cause floods except during periods of melting snow, in June. In the southwestern part of the State, where precipitation during periods of melting snow is insufficient to cause floods, the severest floods yet experienced resulted from heavy rains in September 1909 and October 1911. In the eastern foothills region, usually below an altitude of about 7,500 feet and extending for a distance of about 50 miles east of the mountains, is a zone subject to rainfalls of great intensity known as cloudbursts. These cloudbursts are of short duration and are confined to very small areas. At times the intensity is so great as to make breathing difficult for those exposed to a storm. The areas of intense rainfall are so small that Weather Bureau precipitation stations have not been located in them. Local residents, being cloudburst conscious, frequently measure the rainfall in receptacles in their yards, and such records constitute the only source of information regarding the intensity. A flood

  9. Big data in fashion industry

    Science.gov (United States)

    Jain, S.; Bruniaux, J.; Zeng, X.; Bruniaux, P.

    2017-10-01

    Significant work has been done in the field of big data in the last decade. The concept of big data includes analysing voluminous data to extract valuable information. In the fashion world, big data is increasingly playing a part in trend forecasting and in analysing consumer behaviour, preferences and emotions. The purpose of this paper is to introduce the term fashion data and explain why it can be considered big data. It also gives a broad classification of the types of fashion data and briefly defines them. Also, the methodology and working of a system that will use this data are briefly described.

  10. Big Data Analytics An Overview

    Directory of Open Access Journals (Sweden)

    Jayshree Dwivedi

    2015-08-01

    Full Text Available Big data is data that exceeds the storage capacity and processing power of conventional systems. The term is used for data sets so large or complex that traditional data-processing tools cannot handle them. The size of big data is a constantly moving target, ranging year by year from a few dozen terabytes to many petabytes; with sources such as social networking sites, the amount of data produced by people is growing rapidly every year. Big data is not only data: it has become a complete subject that includes various tools, techniques and frameworks. It captures the explosive growth and evolution of data, both structured and unstructured. Big data is a set of techniques and technologies that require new forms of integration to uncover large hidden values from large datasets that are diverse, complex and of massive scale. Such datasets are difficult to work with using most relational database management systems and desktop statistics and visualization packages, instead requiring massively parallel software running on tens, hundreds or even thousands of servers. A big data environment is used to capture, organize and analyse these various types of data. In this paper we describe the applications, problems and tools of big data and give an overview of big data.

  11. Was there a big bang

    International Nuclear Information System (INIS)

    Narlikar, J.

    1981-01-01

    In discussing the viability of the big-bang model of the Universe, relevant evidence is examined, including the discrepancies in the age of the big-bang Universe, the red shifts of quasars, the microwave background radiation, general theory of relativity aspects such as the change of the gravitational constant with time, and quantum theory considerations. It is felt that the arguments considered show that the big-bang picture is not as soundly established, either theoretically or observationally, as it is usually claimed to be, that the cosmological problem is still wide open and alternatives to the standard big-bang picture should be seriously investigated. (U.K.)

  12. Impacts of dyke development in flood prone areas in the Vietnamese Mekong Delta to downstream flood hazard

    Science.gov (United States)

    Khanh Triet Nguyen, Van; Dung Nguyen, Viet; Fujii, Hideto; Kummu, Matti; Merz, Bruno; Apel, Heiko

    2016-04-01

    The Vietnamese Mekong Delta (VMD) plays an important role in food security and socio-economic development of the country. Being a low-lying coastal region, the VMD is particularly susceptible to both riverine and tidal floods, which provide, on the one hand, the basis for the rich agricultural production and the livelihood of the people, but on the other hand pose a considerable hazard depending on the severity of the floods. Despite the potentially hazardous floods, the area remains active as a rice granary due to its nutrient-rich soils and sediment input, its dense network of waterways and canals, and the long-standing experience of the population in living with floods. In response to both farmers' requests and governmental plans, the construction of flood protection infrastructure in the delta has progressed rapidly over the last twenty years, notably in areas prone to deep flooding, i.e. the Plain of Reeds (PoR) and the Long Xuyen Quadrangle (LXQ). Triple rice cropping becomes possible in farmlands enclosed by "full-dykes", i.e. dykes strong and high enough to prevent flooding of the flood plains for most of the floods. In these protected flood plains rice can be grown even during the peak flood period (September to November). However, little is known about the possible (and already alleged) negative impacts of this full flood protection measure on downstream areas. This study aims at quantifying how the flood regime in the lower part of the VMD (e.g. Can Tho, My Thuan, …) has changed during the two recent "big flood" events of 2000 and 2011 due to the construction of the full-dyke system in the upper part. First, an evaluation of 35 years of daily water level data was performed in order to detect trends at key gauging stations: Kratie: upper boundary of the Delta, Tan Chau and Chau Doc: areas with full-dyke construction, Can Tho and My Thuan: downstream. Results from the Mann-Kendall (MK) test show a decreasing trend of the annual maximum water level at 3 stations Kratie, Tan
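
    For reference, the Mann-Kendall test applied to annual maximum water levels can be computed as in the sketch below (a plain implementation without tie correction); the 35-year series here is synthetic, not the gauge data from the study.

```python
import numpy as np
from scipy import stats

def mann_kendall(x):
    """Mann-Kendall trend test (plain version, no tie correction) on a 1-D series."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    s = 0.0
    for i in range(n - 1):                       # S = sum of signs over all pairs
        s += np.sign(x[i + 1:] - x[i]).sum()
    var_s = n * (n - 1) * (2 * n + 5) / 18.0     # variance of S when there are no ties
    if s > 0:
        z = (s - 1) / np.sqrt(var_s)             # continuity correction
    elif s < 0:
        z = (s + 1) / np.sqrt(var_s)
    else:
        z = 0.0
    p = 2 * stats.norm.sf(abs(z))                # two-sided p-value
    return s, z, p

# Hypothetical 35-year series of annual maximum water levels (m) at a gauge
rng = np.random.default_rng(1)
h_max = 7.0 - 0.02 * np.arange(35) + rng.normal(0.0, 0.3, 35)   # weak downward trend
s, z, p = mann_kendall(h_max)
trend = "significant decreasing trend" if (z < 0 and p < 0.05) else "no significant trend"
print(f"S = {s:.0f}, Z = {z:.2f}, p = {p:.3f} -> {trend}")
```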

  13. How Big is Earth?

    Science.gov (United States)

    Thurber, Bonnie B.

    2015-08-01

    How Big is Earth celebrates the Year of Light. Using only the sunlight striking the Earth and a wooden dowel, students meet each other and then measure the circumference of the earth. Eratosthenes did it over 2,000 years ago. In Cosmos, Carl Sagan shared the process by which Eratosthenes measured the angle of the shadow cast at local noon when sunlight strikes a stick positioned perpendicular to the ground. By comparing his measurement to another made a distance away, Eratosthenes was able to calculate the circumference of the earth. How Big is Earth provides an online learning environment where students do science the same way Eratosthenes did. A notable project in which this was done was The Eratosthenes Project, conducted in 2005 as part of the World Year of Physics; in fact, we will be drawing on the teacher's guide developed by that project. How Big Is Earth? expands on the Eratosthenes project by providing an online learning environment, the iCollaboratory (www.icollaboratory.org), where teachers and students from Sweden, China, Nepal, Russia, Morocco, and the United States collaborate, share data, and reflect on their learning of science and astronomy. They are sharing their information and discussing their ideas/brainstorming the solutions in a discussion forum. There is an ongoing database of student measurements and another database to collect data on both teacher and student learning from surveys, discussions, and self-reflection done online. We will share our research about the kinds of learning that takes place only in global collaborations. The entrance address for the iCollaboratory is http://www.icollaboratory.org.
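
    A worked sketch of the calculation the students reproduce: two sites on roughly the same meridian measure the sun's shadow at local noon, and the difference between the shadow angles, as a fraction of 360 degrees, scales the known distance between the sites up to the full circumference. The numbers below are illustrative, not project measurements.

```python
import math

def shadow_angle_deg(stick_height_m, shadow_length_m):
    """Angle of the sun from the vertical, from a stick perpendicular to the ground."""
    return math.degrees(math.atan2(shadow_length_m, stick_height_m))

def earth_circumference_km(angle_a_deg, angle_b_deg, distance_km):
    """Circumference from the angle difference between two sites a known distance apart."""
    delta = abs(angle_a_deg - angle_b_deg)
    return 360.0 / delta * distance_km

# Illustrative numbers: a 1 m dowel casting a 0.127 m shadow at one site,
# no shadow at the other, with the sites 800 km apart along the meridian.
a1 = shadow_angle_deg(1.0, 0.127)   # about 7.2 degrees
a2 = shadow_angle_deg(1.0, 0.0)     # sun directly overhead
print(round(earth_circumference_km(a1, a2, 800.0)), "km")   # roughly 40,000 km
```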

  14. Social sensing of floods in the UK.

    Science.gov (United States)

    Arthur, Rudy; Boulton, Chris A; Shotton, Humphrey; Williams, Hywel T P

    2018-01-01

    "Social sensing" is a form of crowd-sourcing that involves systematic analysis of digital communications to detect real-world events. Here we consider the use of social sensing for observing natural hazards. In particular, we present a case study that uses data from a popular social media platform (Twitter) to detect and locate flood events in the UK. In order to improve data quality we apply a number of filters (timezone, simple text filters and a naive Bayes 'relevance' filter) to the data. We then use place names in the user profile and message text to infer the location of the tweets. These two steps remove most of the irrelevant tweets and yield orders of magnitude more located tweets than we have by relying on geo-tagged data. We demonstrate that high resolution social sensing of floods is feasible and we can produce high-quality historical and real-time maps of floods using Twitter.

  15. Channel Shallowing as Mitigation of Coastal Flooding

    Directory of Open Access Journals (Sweden)

    Philip M. Orton

    2015-07-01

    Full Text Available Here, we demonstrate that reductions in the depth of inlets or estuary channels can be used to reduce or prevent coastal flooding. A validated hydrodynamic model of Jamaica Bay, New York City (NYC), is used to test nature-based adaptation measures in ameliorating flooding for NYC's two largest historical coastal flood events. In addition to control runs with modern bathymetry, three altered landscape scenarios are tested: (1) increasing the area of wetlands to their 1879 footprint and bathymetry, but leaving deep shipping channels unaltered; (2) shallowing all areas deeper than 2 m in the bay to be 2 m below Mean Low Water; (3) shallowing only the narrowest part of the inlet to the bay. These three scenarios are deliberately extreme and designed to evaluate the leverage each approach exerts on water levels. They result in peak water level reductions of 0.3%, 15%, and 6.8% for Hurricane Sandy, and 2.4%, 46% and 30% for the Category-3 hurricane of 1821, respectively (bay-wide averages). These results suggest that shallowing can provide greater flood protection than wetland restoration, and it is particularly effective at reducing "fast-pulse" storm surges that rise and fall quickly over several hours, like that of the 1821 storm. Nonetheless, the goal of flood mitigation must be weighed against economic, navigation, and ecological needs, and practical concerns such as the availability of sediment.

  16. Privacy and Big Data

    CERN Document Server

    Craig, Terence

    2011-01-01

    Much of what constitutes Big Data is information about us. Through our online activities, we leave an easy-to-follow trail of digital footprints that reveal who we are, what we buy, where we go, and much more. This eye-opening book explores the raging privacy debate over the use of personal data, with one undeniable conclusion: once data's been collected, we have absolutely no control over who uses it or how it is used. Personal data is the hottest commodity on the market today-truly more valuable than gold. We are the asset that every company, industry, non-profit, and government wants. Pri

  17. Big Data Challenges

    Directory of Open Access Journals (Sweden)

    Alexandru Adrian TOLE

    2013-10-01

    Full Text Available The amount of data traveling across the internet today is not only large but complex as well. Companies, institutions, healthcare systems and others all use piles of data, which are further used for creating reports in order to ensure continuity of the services they have to offer. The process behind the results that these entities request represents a challenge for software developers and companies that provide IT infrastructure. The challenge is how to manipulate an impressive volume of data that has to be securely delivered through the internet and reach its destination intact. This paper treats the challenges that Big Data creates.

  18. Big data naturally rescaled

    International Nuclear Information System (INIS)

    Stoop, Ruedi; Kanders, Karlis; Lorimer, Tom; Held, Jenny; Albert, Carlo

    2016-01-01

    We propose that a handle could be put on big data by looking at the systems that actually generate the data, rather than the data itself, realizing that there may be only a few generic processes involved in this, each one imprinting its very specific structures in the space of systems, the traces of which translate into feature space. From this, we propose a practical computational clustering approach, optimized for coping with such data, inspired by how the human cortex is known to approach the problem.

  19. A Matrix Big Bang

    OpenAIRE

    Craps, Ben; Sethi, Savdeep; Verlinde, Erik

    2005-01-01

    The light-like linear dilaton background represents a particularly simple time-dependent 1/2 BPS solution of critical type IIA superstring theory in ten dimensions. Its lift to M-theory, as well as its Einstein frame metric, are singular in the sense that the geometry is geodesically incomplete and the Riemann tensor diverges along a light-like subspace of codimension one. We study this background as a model for a big bang type singularity in string theory/M-theory. We construct the dual Matr...

  20. Probable maximum flood control

    International Nuclear Information System (INIS)

    DeGabriele, C.E.; Wu, C.L.

    1991-11-01

    This study proposes preliminary design concepts to protect the waste-handling facilities and all shaft and ramp entries to the underground from the probable maximum flood (PMF) in the current design configuration for the proposed Nevada Nuclear Waste Storage Investigation (NNWSI) repository. Flood protection provisions were furnished by the United States Bureau of Reclamation (USBR) or developed from USBR data. Proposed flood protection provisions include site grading, drainage channels, and diversion dikes. Figures are provided to show these proposed flood protection provisions at each area investigated. These areas are the central surface facilities (including the waste-handling building and waste treatment building), tuff ramp portal, waste ramp portal, men-and-materials shaft, emplacement exhaust shaft, and exploratory shafts facility

  1. BIG Data - BIG Gains? Understanding the Link Between Big Data Analytics and Innovation

    OpenAIRE

    Niebel, Thomas; Rasel, Fabienne; Viete, Steffen

    2017-01-01

    This paper analyzes the relationship between firms’ use of big data analytics and their innovative performance for product innovations. Since big data technologies provide new data information practices, they create new decision-making possibilities, which firms can use to realize innovations. Applying German firm-level data we find suggestive evidence that big data analytics matters for the likelihood of becoming a product innovator as well as the market success of the firms’ product innovat...

  2. BIG data - BIG gains? Empirical evidence on the link between big data analytics and innovation

    OpenAIRE

    Niebel, Thomas; Rasel, Fabienne; Viete, Steffen

    2017-01-01

    This paper analyzes the relationship between firms’ use of big data analytics and their innovative performance in terms of product innovations. Since big data technologies provide new data information practices, they create novel decision-making possibilities, which are widely believed to support firms’ innovation process. Applying German firm-level data within a knowledge production function framework we find suggestive evidence that big data analytics is a relevant determinant for the likel...

  3. Probabilistic flood extent estimates from social media flood observations

    NARCIS (Netherlands)

    Brouwer, Tom; Eilander, Dirk; Van Loenen, Arnejan; Booij, Martijn J.; Wijnberg, Kathelijne M.; Verkade, Jan S.; Wagemaker, Jurjen

    2017-01-01

    The increasing number and severity of floods, driven by phenomena such as urbanization, deforestation, subsidence and climate change, create a growing need for accurate and timely flood maps. In this paper we present and evaluate a method to create deterministic and probabilistic flood maps from

  4. Probabilistic flood extent estimates from social media flood observations

    NARCIS (Netherlands)

    Brouwer, Tom; Eilander, Dirk; Van Loenen, Arnejan; Booij, Martijn J.; Wijnberg, Kathelijne M.; Verkade, Jan S.; Wagemaker, Jurjen

    2017-01-01

    The increasing number and severity of floods, driven by phenomena such as urbanization, deforestation, subsidence and climate change, creates a growing need for accurate and timely flood maps. This research focussed on creating flood maps using user generated content from Twitter. Twitter data has

  5. Mapping flood hazards under uncertainty through probabilistic flood inundation maps

    Science.gov (United States)

    Stephens, T.; Bledsoe, B. P.; Miller, A. J.; Lee, G.

    2017-12-01

    Changing precipitation, rapid urbanization, and population growth interact to create unprecedented challenges for flood mitigation and management. Standard methods for estimating risk from flood inundation maps generally involve simulations of floodplain hydraulics for an established regulatory discharge of specified frequency. Hydraulic model results are then geospatially mapped and depicted as a discrete boundary of flood extents and a binary representation of the probability of inundation (in or out) that is assumed constant over a project's lifetime. Consequently, existing methods utilized to define flood hazards and assess risk management are hindered by deterministic approaches that assume stationarity in a nonstationary world, failing to account for spatio-temporal variability of climate and land use as they translate to hydraulic models. This presentation outlines novel techniques for portraying flood hazards and the results of multiple flood inundation maps spanning hydroclimatic regions. Flood inundation maps generated through modeling of floodplain hydraulics are probabilistic reflecting uncertainty quantified through Monte-Carlo analyses of model inputs and parameters under current and future scenarios. The likelihood of inundation and range of variability in flood extents resulting from Monte-Carlo simulations are then compared with deterministic evaluations of flood hazards from current regulatory flood hazard maps. By facilitating alternative approaches of portraying flood hazards, the novel techniques described in this presentation can contribute to a shifting paradigm in flood management that acknowledges the inherent uncertainty in model estimates and the nonstationary behavior of land use and climate.
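
    A minimal sketch of the final mapping step under the stated assumptions: each Monte-Carlo realisation of the hydraulic model yields a binary flooded/dry grid, and the per-cell inundation probability is simply the fraction of realisations that flood the cell. The tiny ensemble below is hypothetical.

```python
import numpy as np

def inundation_probability(flood_extents):
    """Per-cell probability of inundation from an ensemble of binary flood grids.

    flood_extents has shape (n_realisations, n_rows, n_cols) with 1 = flooded, 0 = dry.
    """
    return flood_extents.mean(axis=0)

# Hypothetical 3-run ensemble over a tiny 2x3 grid
flood_extents = np.array([
    [[1, 1, 0], [0, 0, 0]],
    [[1, 1, 1], [1, 0, 0]],
    [[1, 0, 0], [0, 0, 0]],
], dtype=float)

p_flood = inundation_probability(flood_extents)
print(p_flood)    # 1.0 where every run floods, intermediate values at the margins
# A 0.5 contour of p_flood is one way to compare against a single deterministic extent.
```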

  6. Flood Risk Management In Europe: European flood regulation

    NARCIS (Netherlands)

    Hegger, D.L.T.; Bakker, M.H.; Green, C.; Driessen, Peter; Delvaux, B.; Rijswick, H.F.M.W. van; Suykens, C.; Beyers, J-C.; Deketelaere, K.; Doorn-Hoekveld, W. van; Dieperink, C.

    2013-01-01

    In Europe, water management is moving from flood defense to a risk management approach, which takes both the probability and the potential consequences of flooding into account. In this report, we will look at Directives and (non-)EU initiatives in place to deal with flood risk in Europe indirectly

  7. Improving Global Flood Forecasting using Satellite Detected Flood Extent

    NARCIS (Netherlands)

    Revilla Romero, B.

    2016-01-01

    Flooding is a natural global phenomenon but in many cases is exacerbated by human activity. Although flooding generally affects humans in a negative way, bringing death, suffering, and economic impacts, it also has potentially beneficial effects. Early flood warning and forecasting systems, as well

  8. Big bang nucleosynthesis

    International Nuclear Information System (INIS)

    Fields, Brian D.; Olive, Keith A.

    2006-01-01

    We present an overview of the standard model of big bang nucleosynthesis (BBN), which describes the production of the light elements in the early universe. The theoretical prediction for the abundances of D, ³He, ⁴He, and ⁷Li is discussed. We emphasize the role of key nuclear reactions and the methods by which experimental cross section uncertainties are propagated into uncertainties in the predicted abundances. The observational determination of the light nuclides is also discussed. Particular attention is given to the comparison between the predicted and observed abundances, which yields a measurement of the cosmic baryon content. The spectrum of anisotropies in the cosmic microwave background (CMB) now independently measures the baryon density to high precision; we show how the CMB data test BBN, and find that the CMB and the D and ⁴He observations paint a consistent picture. This concordance stands as a major success of the hot big bang. On the other hand, ⁷Li remains discrepant with the CMB-preferred baryon density; possible explanations are reviewed. Finally, moving beyond the standard model, primordial nucleosynthesis constraints on early universe and particle physics are also briefly discussed

  9. Camp Marmal Flood Study

    Science.gov (United States)

    2012-03-01

    ... was simulated by means of a broad-crested weir built into the topography of the mesh. There is 0.5 m of freeboard and the width of the weir is 30 m. (Report cover: ERDC/CHL TR-12-5, Camp Marmal Flood Study, Jeremy A. Sharp, Steve H. Scott, Mark R. Jourdan, and Gaurav Savant, Coastal and Hydraulics Laboratory, U.S. Army Engineer...)
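
    As a hedged aside on the weir representation mentioned in the record, the standard broad-crested weir rating (critical flow over the crest) gives Q = Cd·b·√g·(2/3)^1.5·H^1.5, roughly Q ≈ 1.7·b·H^1.5 in SI units for Cd = 1. The 30 m width comes from the record above; the head values and the discharge coefficient are illustrative assumptions, not values from the report.

```python
import math

def broad_crested_weir_q(head_m, width_m=30.0, cd=1.0, g=9.81):
    """Discharge over a broad-crested weir (SI units), assuming critical flow at the crest."""
    return cd * width_m * math.sqrt(g) * (2.0 / 3.0) ** 1.5 * head_m ** 1.5

for h in (0.1, 0.25, 0.5):   # head over the crest, up to the 0.5 m of freeboard
    print(f"H = {h:4.2f} m -> Q = {broad_crested_weir_q(h):6.1f} m3/s")
```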

  10. FEMA DFIRM Base Flood Elevations

    Data.gov (United States)

    Minnesota Department of Natural Resources — The Base Flood Elevation (BFE) table is required for any digital data where BFE lines will be shown on the corresponding Flood Insurance Rate Map (FIRM). Normally,...

  11. 2013 FEMA Flood Hazard Boundaries

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — The National Flood Hazard Layer (NFHL) data incorporates all Digital Flood Insurance Rate Map(DFIRM) databases published by FEMA, and any Letters Of Map Revision...

  12. FEMA DFIRM Flood Hazard Areas

    Data.gov (United States)

    Minnesota Department of Natural Resources — FEMA flood hazard delineations are used by the Federal Emergency Management Agency (FEMA) to designate the Special Flood Hazard Area (SFHA) and for insurance rating...

  13. Base Flood Elevation (BFE) Lines

    Data.gov (United States)

    Department of Homeland Security — The Base Flood Elevation (BFE) table is required for any digital data where BFE lines will be shown on the corresponding Flood Insurance Rate Map (FIRM). Normally if...

  14. National Flood Hazard Layer (NFHL)

    Data.gov (United States)

    Federal Emergency Management Agency, Department of Homeland Security — The National Flood Hazard Layer (NFHL) is a compilation of GIS data that comprises a nationwide digital Flood Insurance Rate Map. The GIS data and services are...

  15. FEMA 100 year Flood Data

    Data.gov (United States)

    California Natural Resource Agency — The Q3 Flood Data product is a digital representation of certain features of FEMA's Flood Insurance Rate Map (FIRM) product, intended for use with desktop mapping...

  16. 2013 FEMA Flood Control Structures

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — The National Flood Hazard Layer (NFHL) data incorporates all Digital Flood Insurance Rate Map(DFIRM) databases published by FEMA, and any Letters Of Map Revision...

  17. FEMA Q3 Flood Data

    Data.gov (United States)

    Kansas Data Access and Support Center — The Q3 Flood Data are derived from the Flood Insurance Rate Maps (FIRMS) published by the Federal Emergency Management Agency (FEMA). The file is georeferenced to...

  18. Nursing Needs Big Data and Big Data Needs Nursing.

    Science.gov (United States)

    Brennan, Patricia Flatley; Bakken, Suzanne

    2015-09-01

    Contemporary big data initiatives in health care will benefit from greater integration with nursing science and nursing practice; in turn, nursing science and nursing practice have much to gain from the data science initiatives. Big data arises secondary to scholarly inquiry (e.g., -omics) and everyday observations like cardiac flow sensors or Twitter feeds. Data science methods that are emerging ensure that these data be leveraged to improve patient care. Big data encompasses data that exceed human comprehension, that exist at a volume unmanageable by standard computer systems, that arrive at a velocity not under the control of the investigator, and that possess a level of imprecision not found in traditional inquiry. Data science methods are emerging to manage and gain insights from big data. The primary methods included investigation of emerging federal big data initiatives, and exploration of exemplars from nursing informatics research to benchmark where nursing is already poised to participate in the big data revolution. We provide observations and reflections on experiences in the emerging big data initiatives. Existing approaches to large data set analysis provide a necessary but not sufficient foundation for nursing to participate in the big data revolution. Nursing's Social Policy Statement guides a principled, ethical perspective on big data and data science. There are implications for basic and advanced practice clinical nurses in practice, for the nurse scientist who collaborates with data scientists, and for the nurse data scientist. Big data and data science has the potential to provide greater richness in understanding patient phenomena and in tailoring interventional strategies that are personalized to the patient. © 2015 Sigma Theta Tau International.

  19. Flood Risk and Probabilistic Benefit Assessment to Support Management of Flood-Prone Lands: Evidence From Candaba Floodplains, Philippines

    Science.gov (United States)

    Juarez, A. M.; Kibler, K. M.; Sayama, T.; Ohara, M.

    2016-12-01

    Flood management decision-making is often supported by risk assessment, which may overlook the role of coping capacity and the potential benefits derived from direct use of flood-prone land. Alternatively, risk-benefit analysis can support floodplain management to yield maximum socio-ecological benefits for the minimum flood risk. We evaluate flood risk-probabilistic benefit tradeoffs of livelihood practices compatible with direct human use of flood-prone land (agriculture/wild fisheries) and nature conservation (wild fisheries only) in Candaba, Philippines. Located north-west of Metro Manila, the Candaba area is a multi-functional landscape that provides a temporally-variable mix of possible land uses, benefits and ecosystem services of local and regional value. To characterize inundation from 1.3- to 100-year recurrence intervals we couple frequency analysis with rainfall-runoff-inundation modelling and remotely-sensed data. By combining simulated probabilistic floods with both damage and benefit functions (e.g. fish capture and rice yield with flood intensity) we estimate potential damages and benefits over varying probabilistic flood hazards. We find that although direct human uses of flood-prone land are associated with damages, for all the investigated magnitudes of flood events with different frequencies, the probabilistic benefits ($91 million) exceed risks ($33 million) by a large margin. Even considering risk, probabilistic livelihood benefits of direct human uses far exceed benefits provided by scenarios that exclude direct "risky" human uses (a difference of $85 million). In addition, we find that individual coping strategies, such as adapting crop planting periods to the flood pulse or fishing rather than cultivating rice in the wet season, minimize flood losses ($6 million) while allowing for valuable livelihood benefits ($125 million) in flood-prone land. Analysis of societal benefits and local capacities to cope with regular floods demonstrate the
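
    A minimal sketch of the risk-benefit integration described above: event damages and benefits estimated for a set of return periods are integrated over annual exceedance probability to give expected annual values. The return periods echo the 1.3- to 100-year range named in the abstract, but the damage and benefit figures are placeholders, not the study's data.

```python
import numpy as np

return_periods = np.array([1.3, 2, 5, 10, 25, 50, 100])       # years
damage = np.array([0.0, 0.5, 1.5, 3.0, 5.0, 7.0, 9.0])        # e.g. million USD per event
benefit = np.array([8.0, 9.0, 10.0, 10.5, 11.0, 11.0, 11.0])  # e.g. million USD per event

def expected_annual_value(return_periods, values):
    """Trapezoidal integration of event values over annual exceedance probability."""
    p_exceed = 1.0 / np.asarray(return_periods, dtype=float)
    order = np.argsort(p_exceed)                # integrate from rare to frequent events
    return np.trapz(values[order], p_exceed[order])

ead = expected_annual_value(return_periods, damage)     # expected annual damage
eab = expected_annual_value(return_periods, benefit)    # expected annual benefit
print(f"EAD = {ead:.2f}, EAB = {eab:.2f}, net = {eab - ead:.2f} (million USD/yr)")
```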

  20. Flood characteristics of the Haor area in Bangladesh

    Science.gov (United States)

    Suman, Asadusjjaman; Bhattacharya, Biswa

    2013-04-01

    In recent years the world has experienced deaths, large-scale displacement of people, billions of Euros of economic damage, mental stress and ecosystem impacts due to flooding. Global changes (climate change, population and economic growth, and urbanisation) are exacerbating the severity of flooding. The 2010 floods in Pakistan and the 2011 floods in Australia and Thailand demonstrate the need for concerted action in the face of global societal and environmental changes to strengthen resilience against flooding. Bangladesh is a country, which is frequently suffering from flooding. The current research is conducted in the framework of a project, which focuses on the flooding issues in the Haor region in the north-east of Bangladesh. A haor is a saucer-shaped depression, which is used during the dry period (December to mid-May) for agriculture and as a fishery during the wet period (June-November), and thereby presents a very interesting socio-economic perspective of flood risk management. Pre-monsoon flooding till mid-May causes agricultural loss and lot of distress whereas monsoon flooding brings benefits. The area is bordering India, thereby presenting trans-boundary issues as well, and is fed by some flashy Indian catchments. The area is drained mainly through the Surma-Kushiyara river system. The terrain generally is flat and the flashy characteristics die out within a short distance from the border. Limited studies on the region, particularly with the help of numerical models, have been carried out in the past. Therefore, an objective of the current research was to set up numerical models capable of reasonably emulating the physical system. Such models could, for example, associate different gauges to the spatio-temporal variation of hydrodynamic variables and help in carrying out a systemic study on the impact of climate changes. A 1D2D model, with one-dimensional model for the rivers (based on MIKE 11 modelling tool from Danish Hydraulic Institute) and a two

  1. Multivariate pluvial flood damage models

    International Nuclear Information System (INIS)

    Van Ootegem, Luc; Verhofstadt, Elsy; Van Herck, Kristine; Creten, Tom

    2015-01-01

    Depth–damage-functions, relating the monetary flood damage to the depth of the inundation, are commonly used in the case of fluvial floods (floods caused by a river overflowing). We construct four multivariate damage models for pluvial floods (caused by extreme rainfall) by differentiating on the one hand between ground floor floods and basement floods and on the other hand between damage to residential buildings and damage to housing contents. We do not only take into account the effect of flood-depth on damage, but also incorporate the effects of non-hazard indicators (building characteristics, behavioural indicators and socio-economic variables). By using a Tobit-estimation technique on identified victims of pluvial floods in Flanders (Belgium), we take into account the effect of cases of reported zero damage. Our results show that the flood depth is an important predictor of damage, but with a diverging impact between ground floor floods and basement floods. Also non-hazard indicators are important. For example being aware of the risk just before the water enters the building reduces content damage considerably, underlining the importance of warning systems and policy in this case of pluvial floods. - Highlights: • Prediction of damage of pluvial floods using also non-hazard information • We include ‘no damage cases’ using a Tobit model. • The damage of flood depth is stronger for ground floor than for basement floods. • Non-hazard indicators are especially important for content damage. • Potential gain of policies that increase awareness of flood risks
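
    A minimal sketch of a type-I Tobit estimation of the kind referred to above (censoring at zero reported damage), fitted by maximum likelihood with scipy; the flood-depth and risk-awareness dataset is hypothetical, and this illustrates the technique rather than the models estimated in the paper.

```python
import numpy as np
from scipy import stats, optimize

def tobit_negloglik(params, X, y):
    """Negative log-likelihood of a type-I Tobit model censored at zero."""
    beta, log_sigma = params[:-1], params[-1]
    sigma = np.exp(log_sigma)                    # keep sigma positive
    xb = X @ beta
    ll = np.where(
        y > 0,
        stats.norm.logpdf(y, loc=xb, scale=sigma),   # uncensored observations
        stats.norm.logcdf(-xb / sigma),              # zero-damage (censored) observations
    )
    return -ll.sum()

# Hypothetical example: damage depends on flood depth and a risk-awareness dummy
rng = np.random.default_rng(0)
n = 500
depth = rng.uniform(0, 1.5, n)                 # inundation depth (m)
aware = rng.integers(0, 2, n)                  # 1 = aware of risk before water entered
latent = -2.0 + 4.0 * depth - 1.5 * aware + rng.normal(0, 1.5, n)
damage = np.maximum(latent, 0.0)               # observed damage, zeros are censored

X = np.column_stack([np.ones(n), depth, aware])
start = np.zeros(X.shape[1] + 1)
res = optimize.minimize(tobit_negloglik, start, args=(X, damage), method="BFGS")
print("beta:", res.x[:-1], "sigma:", np.exp(res.x[-1]))
```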

  2. Multivariate pluvial flood damage models

    Energy Technology Data Exchange (ETDEWEB)

    Van Ootegem, Luc [HIVA — University of Louvain (Belgium); SHERPPA — Ghent University (Belgium); Verhofstadt, Elsy [SHERPPA — Ghent University (Belgium); Van Herck, Kristine; Creten, Tom [HIVA — University of Louvain (Belgium)

    2015-09-15

    Depth–damage-functions, relating the monetary flood damage to the depth of the inundation, are commonly used in the case of fluvial floods (floods caused by a river overflowing). We construct four multivariate damage models for pluvial floods (caused by extreme rainfall) by differentiating on the one hand between ground floor floods and basement floods and on the other hand between damage to residential buildings and damage to housing contents. We do not only take into account the effect of flood-depth on damage, but also incorporate the effects of non-hazard indicators (building characteristics, behavioural indicators and socio-economic variables). By using a Tobit-estimation technique on identified victims of pluvial floods in Flanders (Belgium), we take into account the effect of cases of reported zero damage. Our results show that the flood depth is an important predictor of damage, but with a diverging impact between ground floor floods and basement floods. Also non-hazard indicators are important. For example being aware of the risk just before the water enters the building reduces content damage considerably, underlining the importance of warning systems and policy in this case of pluvial floods. - Highlights: • Prediction of damage of pluvial floods using also non-hazard information • We include ‘no damage cases’ using a Tobit model. • The damage of flood depth is stronger for ground floor than for basement floods. • Non-hazard indicators are especially important for content damage. • Potential gain of policies that increase awareness of flood risks.

  3. Was the big bang hot

    International Nuclear Information System (INIS)

    Wright, E.L.

    1983-01-01

    The author considers experiments to confirm the substantial deviations from a Planck curve in the Woody and Richards spectrum of the microwave background, and to search for conducting needles in our galaxy. Spectral deviations and needle-shaped grains are expected for a cold Big Bang, but are not required by a hot Big Bang. (Auth.)

  4. Ethische aspecten van big data

    NARCIS (Netherlands)

    N. (Niek) van Antwerpen; Klaas Jan Mollema

    2017-01-01

    Big data has not only led to challenging technical questions, it is also accompanied by all kinds of new ethical and moral issues. To handle big data responsibly, these issues must also be thought through, because poor use of data can have adverse consequences for

  5. Fremtidens landbrug bliver big business

    DEFF Research Database (Denmark)

    Hansen, Henning Otte

    2016-01-01

    The conditions surrounding agriculture and its competitive terms are changing, and this will necessitate a development in the direction of "big business", in which farms become even larger, more industrialised and more concentrated. Big business will be a dominant development in Danish agriculture - but not the only one...

  6. Human factors in Big Data

    NARCIS (Netherlands)

    Boer, J. de

    2016-01-01

    Since 2014 I have been involved in various (research) projects that try to make the hype around Big Data more concrete and tangible for industry and government. Big Data is about multiple sources of (real-time) data that can be analysed, transformed to information and used to make 'smart' decisions.

  7. Passport to the Big Bang

    CERN Multimedia

    De Melis, Cinzia

    2013-01-01

    On 2 June 2013, CERN inaugurates the Passport to the Big Bang project at a major public event. Poster and programme. On 2 June 2013 CERN launches a scientific tourist trail through the Pays de Gex and the Canton of Geneva known as the Passport to the Big Bang. Poster and Programme.

  8. Optical and Physical Methods for Mapping Flooding with Satellite Imagery

    Science.gov (United States)

    Fayne, Jessica Fayne; Bolten, John; Lakshmi, Venkat; Ahamed, Aakash

    2016-01-01

    Flood and surface water mapping is becoming increasingly necessary, as extreme flooding events worldwide can damage crop yields and contribute to billions of dollars in economic damages as well as social effects including fatalities and destroyed communities (Xiao et al. 2004; Kwak et al. 2015; Mueller et al. 2016). Utilizing earth observing satellite data to map standing water from space is indispensable to flood mapping for disaster response, mitigation, prevention, and warning (McFeeters 1996; Brakenridge and Anderson 2006). Since the early 1970s (Landsat, USGS 2013), researchers have been able to remotely sense surface processes such as extreme flood events to help offset some of these problems. Researchers have demonstrated countless methods and modifications of those methods to help increase knowledge of areas at risk and areas that are flooded, using remote sensing data from optical and radar systems as well as free publicly available and costly commercial datasets.
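
    As one concrete example of the optical methods referenced above (McFeeters 1996), the sketch below computes the Normalised Difference Water Index and a simple threshold-based water mask. The green and near-infrared arrays are assumed to be co-registered reflectances (e.g. Landsat bands read with a raster library), and the tiny scene is hypothetical.

```python
import numpy as np

def ndwi_water_mask(green, nir, threshold=0.0):
    """NDWI = (green - NIR) / (green + NIR); positive values typically indicate water."""
    green = green.astype("float32")
    nir = nir.astype("float32")
    ndwi = (green - nir) / np.clip(green + nir, 1e-6, None)   # avoid divide-by-zero
    return ndwi, ndwi > threshold                              # index and boolean water mask

# Hypothetical 2x2 scene: first column water-like, second column vegetation-like
green = np.array([[0.10, 0.08], [0.11, 0.09]])
nir = np.array([[0.03, 0.30], [0.02, 0.35]])
ndwi, water = ndwi_water_mask(green, nir)
print(water)    # True where NDWI > 0 (likely standing water / flooded pixels)
```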

  9. Applications of Big Data in Education

    OpenAIRE

    Faisal Kalota

    2015-01-01

    Big Data and analytics have gained a huge momentum in recent years. Big Data feeds into the field of Learning Analytics (LA) that may allow academic institutions to better understand the learners' needs and proactively address them. Hence, it is important to have an understanding of Big Data and its applications. The purpose of this descriptive paper is to provide an overview of Big Data, the technologies used in Big Data, and some of the applications of Big Data in educa...

  10. Floods in a changing climate

    Science.gov (United States)

    Theresa K. Andersen; Marshall J. Shepherd

    2013-01-01

    Atmospheric warming and associated hydrological changes have implications for regional flood intensity and frequency. Climate models and hydrological models have the ability to integrate various contributing factors and assess potential changes to hydrology at global to local scales through the century. This survey of floods in a changing climate reviews flood...

  11. The use of Natural Flood Management to mitigate local flooding in the rural landscape

    Science.gov (United States)

    Wilkinson, Mark; Quinn, Paul; Ghimire, Sohan; Nicholson, Alex; Addy, Steve

    2014-05-01

    The past decade has seen increases in the occurrence of flood events across Europe, putting a growing number of settlements of varying sizes at risk. The issue of flooding in smaller villages is usually not well publicised. In these small communities, the cost of constructing and maintaining traditional flood defences often outweighs the potential benefits, which has led to a growing quest for more cost effective and sustainable approaches. Here we aim to provide such an approach that, alongside flood risk reduction, also has the multipurpose benefits of sediment control, water quality amelioration, and habitat creation. Natural flood management (NFM) aims to reduce flooding by working with natural features and characteristics to slow down or temporarily store flood waters. NFM measures include dynamic water storage ponds and wetlands, interception bunds, channel restoration and instream wood placement, and increasing soil infiltration through soil management and tree planting. Based on integrated monitoring and modelling studies, we demonstrate the potential to manage runoff locally using NFM in rural systems by effectively managing flow pathways (hill slopes and small channels) and by exploiting floodplains and buffer strips. Case studies from across the UK show that temporary storage ponds (ranging from 300 to 3000 m3) and other NFM measures can reduce peak flows in small catchments (5 to 10 km2) by up to 15 to 30 percent. In addition, increasing the overall effective storage capacity through a network of NFM measures was found to be most effective for the total reduction of local flood peaks. Hydraulic modelling has shown that the positioning of such features within the catchment, and how they are connected to the main channel, may also affect their effectiveness. Field evidence has shown that these ponds can collect significant accumulations of fine sediment during flood events. On the other hand, measures such as wetlands could also play an important role during low flow
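
    A minimal level-pool routing sketch showing, in principle, how a small storage pond with a restricted outlet attenuates an inflow peak. The pond geometry, orifice size and inflow hydrograph are illustrative assumptions, not data from the monitored UK sites.

```python
import numpy as np

g = 9.81
pond_area = 1500.0        # m2 (vertical-sided pond, so storage = area * depth)
orifice_area = 0.05       # m2 outlet
cd = 0.6                  # orifice discharge coefficient

dt = 60.0                                           # s
t = np.arange(0, 6 * 3600, dt)                      # 6-hour event
peak_in, t_peak = 0.3, 2 * 3600                     # triangular inflow, m3/s
inflow = np.interp(t, [0, t_peak, 4 * 3600, t[-1]], [0, peak_in, 0, 0])

depth, max_depth = 0.0, 0.0
outflow = np.zeros_like(t)
for i, q_in in enumerate(inflow):
    q_out = cd * orifice_area * np.sqrt(2 * g * depth)          # orifice equation
    depth = max(depth + (q_in - q_out) * dt / pond_area, 0.0)   # mass balance
    max_depth = max(max_depth, depth)
    outflow[i] = q_out

print(f"peak inflow : {inflow.max():.2f} m3/s")
print(f"peak outflow: {outflow.max():.2f} m3/s "
      f"({100 * (1 - outflow.max() / inflow.max()):.0f}% attenuation)")
print(f"max storage used: {max_depth * pond_area:.0f} m3")
```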

  12. Exploring complex and big data

    Directory of Open Access Journals (Sweden)

    Stefanowski Jerzy

    2017-12-01

    Full Text Available This paper shows how big data analysis opens a range of research and technological problems and calls for new approaches. We start with defining the essential properties of big data and discussing the main types of data involved. We then survey the dedicated solutions for storing and processing big data, including a data lake, virtual integration, and a polystore architecture. Difficulties in managing data quality and provenance are also highlighted. The characteristics of big data also imply specific requirements and challenges for data mining algorithms, which we address as well. The links with related areas, including data streams and deep learning, are discussed. The common theme that naturally emerges from this characterization is complexity. All in all, we consider it to be the truly defining feature of big data (posing particular research and technological challenges), which ultimately seems to be of greater importance than the sheer data volume.

  13. Phantom cosmology without Big Rip singularity

    Energy Technology Data Exchange (ETDEWEB)

    Astashenok, Artyom V. [Baltic Federal University of I. Kant, Department of Theoretical Physics, 236041, 14, Nevsky st., Kaliningrad (Russian Federation); Nojiri, Shin'ichi, E-mail: nojiri@phys.nagoya-u.ac.jp [Department of Physics, Nagoya University, Nagoya 464-8602 (Japan); Kobayashi-Maskawa Institute for the Origin of Particles and the Universe, Nagoya University, Nagoya 464-8602 (Japan); Odintsov, Sergei D. [Department of Physics, Nagoya University, Nagoya 464-8602 (Japan); Institucio Catalana de Recerca i Estudis Avancats - ICREA and Institut de Ciencies de l'Espai (IEEC-CSIC), Campus UAB, Facultat de Ciencies, Torre C5-Par-2a pl, E-08193 Bellaterra (Barcelona) (Spain); Tomsk State Pedagogical University, Tomsk (Russian Federation); Yurov, Artyom V. [Baltic Federal University of I. Kant, Department of Theoretical Physics, 236041, 14, Nevsky st., Kaliningrad (Russian Federation)

    2012-03-23

    We construct phantom energy models with the equation of state parameter w less than -1 (w<-1), but in which a finite-time future singularity does not occur. Such models can be divided into two classes: (i) energy density increases with time ('phantom energy' without a 'Big Rip' singularity) and (ii) energy density tends to a constant value with time ('cosmological constant' with asymptotically de Sitter evolution). The disintegration of bound structures is confirmed in Little Rip cosmology. Surprisingly, we find that such disintegration (using the example of the Sun-Earth system) may occur even in an asymptotically de Sitter phantom universe consistent with observational data. We also demonstrate that non-singular phantom models admit wormhole solutions as well as the possibility of a Big Trip via wormholes.

  14. Phantom cosmology without Big Rip singularity

    International Nuclear Information System (INIS)

    Astashenok, Artyom V.; Nojiri, Shin'ichi; Odintsov, Sergei D.; Yurov, Artyom V.

    2012-01-01

    We construct phantom energy models with the equation of state parameter w less than -1 (w<-1), but in which a finite-time future singularity does not occur. Such models can be divided into two classes: (i) energy density increases with time (“phantom energy” without a “Big Rip” singularity) and (ii) energy density tends to a constant value with time (“cosmological constant” with asymptotically de Sitter evolution). The disintegration of bound structures is confirmed in Little Rip cosmology. Surprisingly, we find that such disintegration (using the example of the Sun-Earth system) may occur even in an asymptotically de Sitter phantom universe consistent with observational data. We also demonstrate that non-singular phantom models admit wormhole solutions as well as the possibility of a Big Trip via wormholes.

  15. Math Fights Flooding

    NARCIS (Netherlands)

    Besseling, Niels; Bokhove, Onno; Kolechkina, Alla; Molenaar, Jaap; van Nooyen, Ronald; Rottschäfer, Vivi; Stein, Alfred; Stoorvogel, Anton

    2008-01-01

    Due to climate changes that are expected in the coming years, the characteristics of the rainfall will change. This can potentially cause flooding or have negative influences on agriculture and nature. In this research, we study the effects of this change in rainfall and investigate what can be done

  16. Flood model for Brazil

    Science.gov (United States)

    Palán, Ladislav; Punčochář, Petr

    2017-04-01

    Looking at the impact of flooding from a worldwide perspective, in the last 50 years flooding has caused over 460,000 fatalities and serious material damage. Combining the economic losses from the ten costliest flood events of the same period gives a present-value loss exceeding USD 300 billion. Locally, in Brazil, flood is the most damaging natural peril, with an alarming increase in event frequency: 5 of the 10 biggest flood losses ever recorded have occurred after 2009. The amount of economic and insured losses caused by the various flood types was the key driver for developing the local probabilistic flood model. Considering the area of Brazil (the fifth largest country in the world) and the scattered distribution of insured exposure, the domain covered by the model was limited to the entire state of São Paulo and 53 additional regions. The model quantifies losses on approx. 90% of the exposure (for regular property lines) of key insurers. Based on detailed exposure analysis, Impact Forecasting has developed this tool using long-term local hydrological data series (Agencia Nacional de Aguas) from riverine gauge stations and a digital elevation model (Instituto Brasileiro de Geografia e Estatística). To provide the most accurate representation of local hydrological behaviour needed for the probabilistic simulation, the hydrological data processing focused on frequency analysis of seasonal peak flows, done by fitting an appropriate extreme value statistical distribution, and on stochastic event set generation, consisting of synthetically derived flood events that respect the realistic spatial and frequency patterns visible in the entire period of hydrological observation. Data were tested for homogeneity, consistency and any significant breakpoint occurrence in the time series, so that either the entire observation period or only subparts of it were used for further analysis. The realistic spatial patterns of stochastic events are reproduced through the innovative use of d-vine copula
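
    The frequency-analysis step described above, fitting an extreme value distribution to gauged peak flows, can be sketched in a few lines. The snippet below is a generic illustration with synthetic data, not the Impact Forecasting implementation; the choice of a GEV distribution, the sample values and the return periods shown are assumptions.

    ```python
    import numpy as np
    from scipy.stats import genextreme

    # Synthetic annual peak flows [m3/s] standing in for a gauged series (assumed values).
    peaks = genextreme.rvs(c=-0.1, loc=800.0, scale=250.0, size=60, random_state=42)

    # Fit a GEV distribution to the observed peaks (maximum likelihood).
    shape, loc, scale = genextreme.fit(peaks)

    # Peak discharge associated with selected return periods T:
    # exceedance probability p = 1/T, quantile = inverse survival function.
    for T in (10, 50, 100, 500):
        q_T = genextreme.isf(1.0 / T, shape, loc=loc, scale=scale)
        print(f"{T:>4}-year flood: {q_T:,.0f} m3/s")
    ```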

  17. Damaging Rainfall and Flooding. The Other Sahel Hazards

    Energy Technology Data Exchange (ETDEWEB)

    Tarhule, A. [Department of Geography, University of Oklahoma, 100 East Boyd Street, Norman, OK, 73079 (United States)

    2005-10-01

    Damaging rainfall and rain-induced flooding occur from time to time in the drought-prone Sahel savannah zone of Niger in West Africa but official records of these events and their socioeconomic impacts do not exist. This paper utilized newspaper accounts between 1970 and 2000 to survey and illustrate the range of these flood hazards in the Sahel. During the study interval, 53 newspaper articles reported 79 damaging rainfall and flood events in 47 different communities in the Sahel of Niger. Collectively, these events destroyed 5,580 houses and rendered 27,289 people homeless. Cash losses and damage to infrastructure in only three events exceeded $4 million. Sahel residents attribute these floods to five major causes including both natural and anthropogenic, but they view the flood problem as driven primarily by land use patterns. Despite such awareness, traditional coping strategies appear inadequate for dealing with the problems in part because of significant climatic variability. Analysis of several rainfall measures indicates that the cumulative rainfall in the days prior to a heavy rain event is an important factor influencing whether or not heavy rainfall results in flooding. Thus, despite some limitations, newspaper accounts of historical flooding are largely consistent with measured climatic variables. The study demonstrates that concerted effort is needed to improve the status of knowledge concerning flood impacts and indeed other natural and human hazards in the Sahel.
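
    The role of antecedent rainfall noted above is often quantified with an antecedent precipitation index (API), a decayed running sum of daily rainfall. The sketch below is a generic illustration of that idea, not a calculation from the study; the decay constant and the rainfall series are assumptions.

    ```python
    import numpy as np

    def antecedent_precipitation_index(daily_rain_mm, k=0.9):
        """Decayed running sum of daily rainfall: API_t = k * API_{t-1} + P_t."""
        api = np.zeros(len(daily_rain_mm))
        for t, p in enumerate(daily_rain_mm):
            api[t] = (k * api[t - 1] if t > 0 else 0.0) + p
        return api

    # Two weeks of daily rainfall [mm]; the heavy event on the last day is identical
    # in both series, but only the wet-antecedent case would be flagged as high risk.
    dry_antecedent = [0, 0, 0, 0, 0, 2, 0, 0, 0, 0, 0, 0, 0, 60]
    wet_antecedent = [5, 12, 0, 20, 8, 15, 3, 10, 0, 18, 6, 9, 4, 60]

    for label, series in (("dry", dry_antecedent), ("wet", wet_antecedent)):
        api = antecedent_precipitation_index(series)
        print(f"{label}: API before event = {api[-2]:.1f} mm, API on event day = {api[-1]:.1f} mm")
    ```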

  18. The Total Risk Analysis of Large Dams under Flood Hazards

    Directory of Open Access Journals (Sweden)

    Yu Chen

    2018-02-01

    Dams and reservoirs are useful systems in water conservancy projects; however, they also pose a high-risk potential for large downstream areas. Flood, as the driving force of dam overtopping, is the main cause of dam failure. Dam floods and their risks are of interest to researchers and managers. In hydraulic engineering, there is a growing tendency to evaluate dam flood risk based on statistical and probabilistic methods, which are unsuitable for situations with scarce historical data or low flood probability, so a more reasonable dam flood risk analysis method with fewer application restrictions is needed. Therefore, different from previous studies, this study develops a flood risk analysis method for large dams based on the concept of the total risk factor (TRF), used initially in dam seismic risk analysis. The proposed method is not affected by the adequacy of historical data or the low probability of flood, and is capable of analyzing the influence of the dam structure, the flood vulnerability of the dam site, and the downstream risk, as well as estimating the TRF of each dam and assigning a corresponding risk class to each dam. Application to large dams in the Dadu River Basin, Southwestern China, demonstrates that the proposed method provides quick risk estimation and comparison, which can help local management officials perform more detailed dam safety evaluations for useful risk management information.
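
    A total-risk-factor style scoring can be thought of as combining normalised sub-scores for dam structure, site flood vulnerability and downstream consequences into a single index that maps to a risk class. The sketch below only illustrates that pattern; the sub-factors, weights and class thresholds are placeholders and are not the formulation used in the cited study.

    ```python
    # Illustrative only: placeholder sub-factors, weights and thresholds,
    # not the TRF formulation of the cited study.
    def total_risk_factor(structure_score, site_flood_score, downstream_score,
                          weights=(0.3, 0.3, 0.4)):
        """Weighted combination of sub-scores, each expected on a 0-10 scale."""
        scores = (structure_score, site_flood_score, downstream_score)
        return sum(w * s for w, s in zip(weights, scores))

    def risk_class(trf):
        """Map the index to a coarse risk class (thresholds are assumptions)."""
        if trf >= 7.0:
            return "high"
        if trf >= 4.0:
            return "moderate"
        return "low"

    for name, scores in {"Dam A": (6.0, 8.0, 9.0), "Dam B": (3.0, 2.0, 4.0)}.items():
        trf = total_risk_factor(*scores)
        print(f"{name}: TRF = {trf:.1f} -> {risk_class(trf)} risk")
    ```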

  19. Influence of Flood Detention Capability in Flood Prevention for Flood Disaster of Depression Area

    OpenAIRE

    Chia Lin Chan; Yi Ju Yang; Chih Chin Yang

    2011-01-01

    Rainfall records from a rain gauge station, including hourly rainfall intensity and total rainfall depth for five heavy storms between 2001 and 2010, are examined. The rational formula is used to investigate how long the flood detention pond can accommodate the flood peak under different rainfall conditions. A stable flood detention model is also proposed, using system dynamic control theory, to characterise the behaviour of the flood detention pond. When rainfall freque...
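
    The rational formula referred to above estimates a peak discharge from a runoff coefficient, rainfall intensity and drainage area. The snippet below is a generic illustration of that formula in metric units; the coefficient, intensity and area values are assumptions, not data from the study.

    ```python
    def rational_peak_discharge(c, intensity_mm_per_h, area_km2):
        """Rational formula in metric units: Q [m3/s] = 0.278 * C * i [mm/h] * A [km2]."""
        return 0.278 * c * intensity_mm_per_h * area_km2

    # Assumed values: a largely urbanised 8 km2 catchment under a 50 mm/h design storm.
    q_peak = rational_peak_discharge(c=0.7, intensity_mm_per_h=50.0, area_km2=8.0)
    print(f"Estimated peak discharge: {q_peak:.1f} m3/s")   # about 78 m3/s
    ```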

  20. The big data telescope

    International Nuclear Information System (INIS)

    Finkel, Elizabeth

    2017-01-01

    On a flat, red mulga plain in the outback of Western Australia, preparations are under way to build the most audacious telescope astronomers have ever dreamed of - the Square Kilometre Array (SKA). Next-generation telescopes usually aim to double the performance of their predecessors. The Australian arm of SKA will deliver a 168-fold leap on the best technology available today, to show us the universe as never before. It will tune into signals emitted just a million years after the Big Bang, when the universe was a sea of hydrogen gas, slowly percolating with the first galaxies. Their starlight illuminated the fledgling universe in what is referred to as the “cosmic dawn”.

  1. The Big Optical Array

    International Nuclear Information System (INIS)

    Mozurkewich, D.; Johnston, K.J.; Simon, R.S.

    1990-01-01

    This paper describes the design and the capabilities of the Naval Research Laboratory Big Optical Array (BOA), an interferometric optical array for high-resolution imaging of stars, stellar systems, and other celestial objects. There are four important differences between the BOA design and the design of the Mark III Optical Interferometer on Mount Wilson (California). These include a long passive delay line which will be used in BOA to do most of the delay compensation, so that the fast delay line will have a very short travel; the beam combination in BOA will be done in triplets, to allow measurement of closure phase; the same light will be used for both star and fringe tracking; and the fringe tracker will use several wavelength channels

  2. Big nuclear accidents

    International Nuclear Information System (INIS)

    Marshall, W.

    1983-01-01

    Much of the debate on the safety of nuclear power focuses on the large number of fatalities that could, in theory, be caused by extremely unlikely but imaginable reactor accidents. This, along with the nuclear industry's inappropriate use of vocabulary during public debate, has given the general public a distorted impression of the safety of nuclear power. The way in which the probability and consequences of big nuclear accidents have been presented in the past is reviewed and recommendations for the future are made including the presentation of the long-term consequences of such accidents in terms of 'reduction in life expectancy', 'increased chance of fatal cancer' and the equivalent pattern of compulsory cigarette smoking. (author)

  3. Nonstandard big bang models

    International Nuclear Information System (INIS)

    Calvao, M.O.; Lima, J.A.S.

    1989-01-01

    The usual FRW hot big-bang cosmologies have been generalized by considering the equation of state ρ = Anm + (γ-1)^(-1) p, where m is the rest mass of the fluid particles and A is a dimensionless constant. Explicit analytic solutions are given for the flat case (ε = 0). For large cosmological times these extended models behave as the standard Einstein-de Sitter universes regardless of the values of A and γ. Unlike the usual FRW flat case, the deceleration parameter q is a time-dependent function and its present value, q ≅ 1, obtained from the luminosity distance versus redshift relation, may be fitted by taking, for instance, A = 1 and γ = 5/3 (a monatomic relativistic gas with m >> k_B T). In all cases the universe cools obeying the same temperature law as the FRW models, and it is shown that the age of the universe is only slightly modified. (author)

  4. The Last Big Bang

    Energy Technology Data Exchange (ETDEWEB)

    McGuire, Austin D. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Meade, Roger Allen [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-09-13

    As one of the very few people in the world to give the “go/no go” decision to detonate a nuclear device, Austin “Mac” McGuire holds a very special place in the history of both the Los Alamos National Laboratory and the world. As Commander of Joint Task Force Unit 8.1.1, on Christmas Island in the spring and summer of 1962, Mac directed the Los Alamos data collection efforts for twelve of the last atmospheric nuclear detonations conducted by the United States. Since data collection was at the heart of nuclear weapon testing, it fell to Mac to make the ultimate decision to detonate each test device. He calls his experience THE LAST BIG BANG, since these tests, part of Operation Dominic, were characterized by the dramatic displays of the heat, light, and sounds unique to atmospheric nuclear detonations – never, perhaps, to be witnessed again.

  5. A matrix big bang

    International Nuclear Information System (INIS)

    Craps, Ben; Sethi, Savdeep; Verlinde, Erik

    2005-01-01

    The light-like linear dilaton background represents a particularly simple time-dependent 1/2 BPS solution of critical type-IIA superstring theory in ten dimensions. Its lift to M-theory, as well as its Einstein frame metric, are singular in the sense that the geometry is geodesically incomplete and the Riemann tensor diverges along a light-like subspace of codimension one. We study this background as a model for a big bang type singularity in string theory/M-theory. We construct the dual Matrix theory description in terms of a (1+1)-d supersymmetric Yang-Mills theory on a time-dependent world-sheet given by the Milne orbifold of (1+1)-d Minkowski space. Our model provides a framework in which the physics of the singularity appears to be under control

  6. A matrix big bang

    Energy Technology Data Exchange (ETDEWEB)

    Craps, Ben [Instituut voor Theoretische Fysica, Universiteit van Amsterdam, Valckenierstraat 65, 1018 XE Amsterdam (Netherlands); Sethi, Savdeep [Enrico Fermi Institute, University of Chicago, Chicago, IL 60637 (United States); Verlinde, Erik [Instituut voor Theoretische Fysica, Universiteit van Amsterdam, Valckenierstraat 65, 1018 XE Amsterdam (Netherlands)

    2005-10-15

    The light-like linear dilaton background represents a particularly simple time-dependent 1/2 BPS solution of critical type-IIA superstring theory in ten dimensions. Its lift to M-theory, as well as its Einstein frame metric, are singular in the sense that the geometry is geodesically incomplete and the Riemann tensor diverges along a light-like subspace of codimension one. We study this background as a model for a big bang type singularity in string theory/M-theory. We construct the dual Matrix theory description in terms of a (1+1)-d supersymmetric Yang-Mills theory on a time-dependent world-sheet given by the Milne orbifold of (1+1)-d Minkowski space. Our model provides a framework in which the physics of the singularity appears to be under control.

  7. DPF Big One

    International Nuclear Information System (INIS)

    Anon.

    1993-01-01

    At its latest venue at Fermilab from 10-14 November, the American Physical Society's Division of Particles and Fields meeting entered a new dimension. These regular meetings, which allow younger researchers to communicate with their peers, have been gaining popularity over the years (this was the seventh in the series), but nobody had expected almost a thousand participants and nearly 500 requests to give talks. Thus Fermilab's 800-seat auditorium had to be supplemented with another room with a video hookup, while the parallel sessions were organized into nine bewildering streams covering fourteen major physics topics. With the conventionality of the Standard Model virtually unchallenged, physics does not move fast these days. While most of the physics results had already been covered in principle at the International Conference on High Energy Physics held in Dallas in August (October, page 1), the Fermilab DPF meeting had a very different atmosphere. Major international meetings like Dallas attract big names from far and wide, and it is difficult in such an august atmosphere for young researchers to find a receptive audience. This was not the case at the DPF parallel sessions. The meeting also adopted a novel approach, with the parallels sandwiched between an initial day of plenaries to set the scene, and a final day of summaries. With the whole world waiting for the sixth ('top') quark to be discovered at Fermilab's Tevatron proton-antiproton collider, the meeting began with updates from Avi Yagil and Ronald Madaras from the big detectors, CDF and D0 respectively. Although rumours flew thick and fast, the Tevatron has not yet reached the top, although Yagil could show one intriguing event of a type expected from the heaviest quark

  8. DPF Big One

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    1993-01-15

    At its latest venue at Fermilab from 10-14 November, the American Physical Society's Division of Particles and Fields meeting entered a new dimension. These regular meetings, which allow younger researchers to communicate with their peers, have been gaining popularity over the years (this was the seventh in the series), but nobody had expected almost a thousand participants and nearly 500 requests to give talks. Thus Fermilab's 800-seat auditorium had to be supplemented with another room with a video hookup, while the parallel sessions were organized into nine bewildering streams covering fourteen major physics topics. With the conventionality of the Standard Model virtually unchallenged, physics does not move fast these days. While most of the physics results had already been covered in principle at the International Conference on High Energy Physics held in Dallas in August (October, page 1), the Fermilab DPF meeting had a very different atmosphere. Major international meetings like Dallas attract big names from far and wide, and it is difficult in such an august atmosphere for young researchers to find a receptive audience. This was not the case at the DPF parallel sessions. The meeting also adopted a novel approach, with the parallels sandwiched between an initial day of plenaries to set the scene, and a final day of summaries. With the whole world waiting for the sixth ('top') quark to be discovered at Fermilab's Tevatron proton-antiproton collider, the meeting began with updates from Avi Yagil and Ronald Madaras from the big detectors, CDF and D0 respectively. Although rumours flew thick and fast, the Tevatron has not yet reached the top, although Yagil could show one intriguing event of a type expected from the heaviest quark.

  9. Catchment scale multi-objective flood management

    Science.gov (United States)

    Rose, Steve; Worrall, Peter; Rosolova, Zdenka; Hammond, Gene

    2010-05-01

    Rural land management is known to affect both the generation and propagation of flooding at the local scale, but there is still a general lack of good evidence that this impact remains significant at the larger catchment scale, given the complexity of physical interactions and climatic variability taking place at this level. The National Trust, in partnership with the Environment Agency, are managing an innovative project on the Holnicote Estate in south west England to demonstrate the benefits of using good rural land management practices to reduce flood risk at both the catchment and sub-catchment scales. The Holnicote Estate is owned by the National Trust and comprises about 5,000 hectares of land, from the uplands of Exmoor to the sea, incorporating most of the catchments of the river Horner and Aller Water. There are nearly 100 houses across three villages that are at risk from flooding and which could potentially benefit from changes in land management practices in the surrounding catchment, providing a more sustainable flood attenuation function. In addition to the contribution being made to flood risk management, there are a range of other ecosystem services that will be enhanced through these targeted land management changes. Alterations in land management will create new opportunities for wildlife and habitats and help to improve local surface water quality. Such improvements will not only create additional wildlife resources locally but also support the landscape's response to climate change by creating and enhancing wildlife networks within the region. Land management changes will also restore and sustain landscape heritage resources and provide opportunities for amenity, recreation and tourism. The project delivery team is working with the National Trust from source to sea across the entire Holnicote Estate, to identify and subsequently implement suitable land management techniques to manage local flood risk within the catchments. These

  10. Big bang and big crunch in matrix string theory

    International Nuclear Information System (INIS)

    Bedford, J.; Ward, J.; Papageorgakis, C.; Rodriguez-Gomez, D.

    2007-01-01

    Following the holographic description of linear dilaton null cosmologies with a big bang in terms of matrix string theory put forward by Craps, Sethi, and Verlinde, we propose an extended background describing a universe including both big bang and big crunch singularities. This belongs to a class of exact string backgrounds and is perturbative in the string coupling far away from the singularities, both of which can be resolved using matrix string theory. We provide a simple theory capable of describing the complete evolution of this closed universe

  11. GIS Support for Flood Rescue

    DEFF Research Database (Denmark)

    Liang, Gengsheng; Mioc, Darka; Anton, François

    2007-01-01

    Under flood events, ground traffic is blocked in and around the flooded area due to damage to roads and bridges. The traditional transportation network may not always help people to make the right decision for evacuation. In order to provide the dynamic road information needed for flood rescue, we developed an adaptive web-based transportation network application using Oracle technology. Moreover, the geographic relationships between the road network and flood areas are taken into account. The overlay between the road network and flood polygons is computed on the fly. This application allows users to retrieve the shortest and safest route in the Fredericton road network during a flood event. It enables users to make a timely decision for flood rescue. We are using Oracle Spatial to deal with emergency situations, an approach that can be applied to other constrained network applications as well.
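
    The shortest-and-safest-route idea can be illustrated independently of the Oracle Spatial implementation described above: remove (or heavily penalise) road segments that intersect flood polygons, then run an ordinary shortest-path query on what remains. The sketch below uses networkx on a toy road graph; the node names, edge lengths and flooded flags are invented for illustration.

    ```python
    import networkx as nx

    # Toy road network: edges carry a length [m] and a 'flooded' flag that would,
    # in practice, come from overlaying road segments with flood polygons.
    roads = [
        ("A", "B", 400, False), ("B", "C", 300, True),   # B-C is under water
        ("A", "D", 500, False), ("D", "C", 450, False),
        ("C", "E", 200, False), ("D", "E", 800, False),
    ]

    G = nx.Graph()
    for u, v, length, flooded in roads:
        G.add_edge(u, v, length=length, flooded=flooded)

    # Keep only passable (non-flooded) segments, then query the shortest route.
    passable = nx.Graph((u, v, d) for u, v, d in G.edges(data=True) if not d["flooded"])
    route = nx.shortest_path(passable, source="A", target="E", weight="length")
    dist = nx.shortest_path_length(passable, source="A", target="E", weight="length")
    print(f"Safest shortest route: {' -> '.join(route)} ({dist} m)")
    ```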

  12. Numerical simulation of flood barriers

    Science.gov (United States)

    Srb, Pavel; Petrů, Michal; Kulhavý, Petr

    This paper deals with the testing and numerical simulation of flood barriers. The Czech Republic has been hit by several very devastating floods in past years. These floods caused several dozen casualties, and property damage reached billions of Euros. The development of flood protection measures is very important, especially for reducing the number of casualties and the amount of property damage. The aim of flood control measures is the detention of water outside populated areas and the drainage of water from populated areas as soon as possible. For a new flood barrier design it is very important to know its behaviour in case of a real flood. During the development of the barrier, several standardized tests have to be carried out. Based on the results of these tests, a numerical simulation was built in Abaqus and several analyses were carried out. Based on these numerical simulations it will be possible to predict the behaviour of barriers and thus improve their design.

  13. BigOP: Generating Comprehensive Big Data Workloads as a Benchmarking Framework

    OpenAIRE

    Zhu, Yuqing; Zhan, Jianfeng; Weng, Chuliang; Nambiar, Raghunath; Zhang, Jinchao; Chen, Xingzhen; Wang, Lei

    2014-01-01

    Big Data is considered a proprietary asset of companies, organizations, and even nations. Turning big data into real treasure requires the support of big data systems. A variety of commercial and open source products have been unleashed for big data storage and processing. While big data users are facing the choice of which system best suits their needs, big data system developers are facing the question of how to evaluate their systems with regard to general big data processing needs. System b...

  14. Citizen involvement in flood risk governance: flood groups and networks

    Directory of Open Access Journals (Sweden)

    Twigger-Ross Clare

    2016-01-01

    Over the past decade there has been a policy shift within UK flood risk management towards localism, with an emphasis on communities taking ownership of flood risk. There is also an increased focus on resilience and, more specifically, on community resilience to flooding. This paper draws on research carried out for the UK Department for Environment Food and Rural Affairs to evaluate the Flood Resilience Community Pathfinder (FRCP) scheme in England. Resilience is conceptualised as multidimensional and linked to existing capacities within a community. Creating resilience to flooding is an ongoing process of adaptation, learning from past events and preparing for future risks. This paper focusses on the development of formal and informal institutions to support improved flood risk management: institutional resilience capacity. It includes new institutions, e.g. flood groups, as well as activities that help to build inter- and intra-institutional resilience capacity, e.g. community flood planning. The pathfinder scheme consisted of 13 projects across England led by local authorities, aimed at developing community resilience to flood risk between 2013 and 2015. This paper discusses the nature and structure of flood groups, the process of their development, and the extent of their linkages with formal institutions, drawing out the barriers and facilitators to developing institutional resilience at the local level.

  15. From big bang to big crunch and beyond

    International Nuclear Information System (INIS)

    Elitzur, Shmuel; Rabinovici, Eliezer; Giveon, Amit; Kutasov, David

    2002-01-01

    We study a quotient Conformal Field Theory, which describes a 3+1 dimensional cosmological spacetime. Part of this spacetime is the Nappi-Witten (NW) universe, which starts at a 'big bang' singularity, expands and then contracts to a 'big crunch' singularity at a finite time. The gauged WZW model contains a number of copies of the NW spacetime, with each copy connected to the preceding one and to the next one at the respective big bang/big crunch singularities. The sequence of NW spacetimes is further connected at the singularities to a series of non-compact static regions with closed timelike curves. These regions contain boundaries, on which the observables of the theory live. This suggests a holographic interpretation of the physics. (author)

  16. The Berlin Inventory of Gambling behavior - Screening (BIG-S): Validation using a clinical sample.

    Science.gov (United States)

    Wejbera, Martin; Müller, Kai W; Becker, Jan; Beutel, Manfred E

    2017-05-18

    Published diagnostic questionnaires for gambling disorder in German are either based on DSM-III criteria or focus on aspects other than life time prevalence. This study was designed to assess the usability of the DSM-IV criteria based Berlin Inventory of Gambling Behavior Screening tool in a clinical sample and adapt it to DSM-5 criteria. In a sample of 432 patients presenting for behavioral addiction assessment at the University Medical Center Mainz, we checked the screening tool's results against clinical diagnosis and compared a subsample of n=300 clinically diagnosed gambling disorder patients with a comparison group of n=132. The BIG-S produced a sensitivity of 99.7% and a specificity of 96.2%. The instrument's unidimensionality and the diagnostic improvements of DSM-5 criteria were verified by exploratory and confirmatory factor analysis as well as receiver operating characteristic analysis. The BIG-S is a reliable and valid screening tool for gambling disorder and demonstrated its concise and comprehensible operationalization of current DSM-5 criteria in a clinical setting.
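
    Sensitivity and specificity as reported above are simple functions of the screening outcome crossed with the clinical diagnosis. The snippet below shows that calculation on an invented confusion matrix; the counts are chosen to reproduce the reported sample sizes and rates, but they are an illustration, not the published confusion matrix of the study.

    ```python
    def sensitivity_specificity(tp, fn, tn, fp):
        """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
        return tp / (tp + fn), tn / (tn + fp)

    # Placeholder counts for a screening tool checked against clinical diagnosis.
    tp, fn = 299, 1      # diagnosed patients: flagged vs. missed by the screen
    tn, fp = 127, 5      # comparison group: correctly cleared vs. falsely flagged

    sens, spec = sensitivity_specificity(tp, fn, tn, fp)
    print(f"sensitivity = {sens:.1%}, specificity = {spec:.1%}")
    ```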

  17. Empathy and the Big Five

    OpenAIRE

    Paulus, Christoph

    2016-01-01

    More than 10 years ago, Del Barrio et al. (2004) attempted to establish a direct relationship between empathy and the Big Five. On average, the women in their sample had higher scores on empathy and on the Big Five factors, with the exception of Neuroticism. They found associations between empathy and Openness, Agreeableness, Conscientiousness and Extraversion. In our data, women score significantly higher on both empathy and the Big Five...

  18. "Big data" in economic history.

    Science.gov (United States)

    Gutmann, Myron P; Merchant, Emily Klancher; Roberts, Evan

    2018-03-01

    Big data is an exciting prospect for the field of economic history, which has long depended on the acquisition, keying, and cleaning of scarce numerical information about the past. This article examines two areas in which economic historians are already using big data - population and environment - discussing ways in which increased frequency of observation, denser samples, and smaller geographic units allow us to analyze the past with greater precision and often to track individuals, places, and phenomena across time. We also explore promising new sources of big data: organically created economic data, high resolution images, and textual corpora.

  19. Big Data as Information Barrier

    Directory of Open Access Journals (Sweden)

    Victor Ya. Tsvetkov

    2014-07-01

    The article analyses 'Big Data', which has been discussed over the last 10 years. The reasons and factors behind the issue are revealed. It is shown that the factors creating the 'Big Data' issue have existed for quite a long time and, from time to time, have caused informational barriers. Such barriers were successfully overcome through science and technology. The analysis presented classifies the 'Big Data' issue as a form of information barrier. This issue can be solved and encourages the development of scientific and computational methods.

  20. Homogeneous and isotropic big rips?

    CERN Document Server

    Giovannini, Massimo

    2005-01-01

    We investigate the way big rips are approached in a fully inhomogeneous description of the space-time geometry. If the pressure and energy densities are connected by a (supernegative) barotropic index, the spatial gradients and the anisotropic expansion decay as the big rip is approached. This behaviour is contrasted with the usual big-bang singularities. A similar analysis is performed in the case of sudden (quiescent) singularities and it is argued that the spatial gradients may well be non-negligible in the vicinity of pressure singularities.

  1. Big Data in Space Science

    OpenAIRE

    Barmby, Pauline

    2018-01-01

    It seems like “big data” is everywhere these days. In planetary science and astronomy, we’ve been dealing with large datasets for a long time. So how “big” is our data? How does it compare to the big data that a bank or an airline might have? What new tools do we need to analyze big datasets, and how can we make better use of existing tools? What kinds of science problems can we address with these? I’ll address these questions with examples including ESA’s Gaia mission, ...

  2. Rate Change Big Bang Theory

    Science.gov (United States)

    Strickland, Ken

    2013-04-01

    The Rate Change Big Bang Theory redefines the birth of the universe with a dramatic shift in energy direction and a new vision of the first moments. With rate change graph technology (RCGT) we can look back 13.7 billion years and experience every step of the big bang through geometrical intersection technology. The analysis of the Big Bang includes a visualization of the first objects, their properties, the astounding event that created space and time as well as a solution to the mystery of anti-matter.

  3. Effects of a flooding event on a threatened black bear population in Louisiana

    Science.gov (United States)

    O'Connell-Goode, Kaitlin C.; Lowe, Carrie L.; Clark, Joseph D.

    2014-01-01

    The Louisiana black bear, Ursus americanus luteolus, is listed as threatened under the Endangered Species Act as a result of habitat loss and human-related mortality. Information on population-level responses of large mammals to flooding events is scarce, and we had a unique opportunity to evaluate the viability of the Upper Atchafalaya River Basin (UARB) black bear population before and after a significant flooding event. We began collecting black bear hair samples in 2007 for a DNA mark-recapture study to estimate abundance (N) and apparent survival (φ). In 2011, the Morganza Spillway was opened to divert floodwaters from the Mississippi River through the UARB, inundating > 50% of our study area, potentially impacting recovery of this important bear population. To evaluate the effects of this flooding event on bear population dynamics, we used a robust design multistate model to estimate changes in transition rates from the flooded area to non-flooded area (ψF→NF) before (2007–2010), during (2010–2011) and after (2011–2012) the flood. Average N across all years of study was 63.2 (SE = 5.2), excluding the year of the flooding event. Estimates of ψF→NF increased from 0.014 (SE = 0.010; meaning that 1.4% of the bears moved from the flooded area to non-flooded areas) before flooding to 0.113 (SE = 0.045) during the flood year, and then decreased to 0.028 (SE= 0.035) after the flood. Although we demonstrated a flood effect on transition rates as hypothesized, the effect was small (88.7% of the bears remained in the flooded area during flooding) and φ was unchanged, suggesting that the 2011 flooding event had minimal impact on survival and site fidelity.

  4. Big Data is invading big places as CERN

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    Big Data technologies are becoming more popular with the constant growth of data generation in different fields such as social networks, the internet of things and laboratories like CERN. How is CERN making use of such technologies? How is machine learning applied at CERN with Big Data technologies? How much data do we move and how is it analyzed? All these questions will be answered during the talk.

  5. The ethics of big data in big agriculture

    OpenAIRE

    Carbonell (Isabelle M.)

    2016-01-01

    This paper examines the ethics of big data in agriculture, focusing on the power asymmetry between farmers and large agribusinesses like Monsanto. Following the recent purchase of Climate Corp., Monsanto is currently the most prominent biotech agribusiness to buy into big data. With wireless sensors on tractors monitoring or dictating every decision a farmer makes, Monsanto can now aggregate large quantities of previously proprietary farming data, enabling a privileged position with unique in...

  6. BIG SKY CARBON SEQUESTRATION PARTNERSHIP

    Energy Technology Data Exchange (ETDEWEB)

    Susan M. Capalbo

    2005-01-31

    The Big Sky Carbon Sequestration Partnership, led by Montana State University, is comprised of research institutions, public entities and private sectors organizations, and the Confederated Salish and Kootenai Tribes and the Nez Perce Tribe. Efforts under this Partnership in Phase I fall into four areas: evaluation of sources and carbon sequestration sinks that will be used to determine the location of pilot demonstrations in Phase II; development of GIS-based reporting framework that links with national networks; designing an integrated suite of monitoring, measuring, and verification technologies and assessment frameworks; and initiating a comprehensive education and outreach program. The groundwork is in place to provide an assessment of storage capabilities for CO{sub 2} utilizing the resources found in the Partnership region (both geological and terrestrial sinks), that would complement the ongoing DOE research. Efforts are underway to showcase the architecture of the GIS framework and initial results for sources and sinks. The region has a diverse array of geological formations that could provide storage options for carbon in one or more of its three states. Likewise, initial estimates of terrestrial sinks indicate a vast potential for increasing and maintaining soil C on forested, agricultural, and reclaimed lands. Both options include the potential for offsetting economic benefits to industry and society. Steps have been taken to assure that the GIS-based framework is consistent among types of sinks within the Big Sky Partnership area and with the efforts of other western DOE partnerships. The Partnership recognizes the critical importance of measurement, monitoring, and verification technologies to support not only carbon trading but all policies and programs that DOE and other agencies may want to pursue in support of GHG mitigation. The efforts in developing and implementing MMV technologies for geological sequestration reflect this concern. Research is

  7. An Evaluation of Selected Extraordinary Floods in the United States Reported by the U.S. Geological Survey and Implications for Future Advancement of Flood Science

    Science.gov (United States)

    Costa, John E.; Jarrett, Robert D.

    2008-01-01

    discharges that were estimated by an inappropriate method (slope-area) (Big Creek near Waynesville, North Carolina; Day Creek near Etiwanda, California). Original field notes and records could not be found for three of the floods, however, some data (copies of original materials, records of reviews) were available for two of these floods. A rating was assigned to each of seven peak discharges that had no rating. Errors identified in the reviews include misidentified flow processes, incorrect drainage areas for very small basins, incorrect latitude and longitude, improper field methods, arithmetic mistakes in hand calculations, omission of measured high flows when developing rating curves, and typographical errors. Common problems include use of two-section slope-area measurements, poor site selection, uncertainties in Manning's n-values, inadequate review, lost data files, and insufficient and inadequately described high-water marks. These floods also highlight the extreme difficulty in making indirect discharge measurements following extraordinary floods. Significantly, none of the indirect measurements are rated better than fair, which indicates the need to improve methodology to estimate peak discharge. Highly unsteady flow and resulting transient hydraulic phenomena, two-dimensional flow patterns, debris flows at streamflow-gaging stations, and the possibility of disconnected flow surfaces are examples of unresolved problems not well handled by current indirect discharge methodology. On the basis of a comprehensive review of 50,000 annual peak discharges and miscellaneous floods in California, problems with individual flood peak discharges would be expected to require a revision of discharge or rating curves at a rate no greater than about 0.10 percent of all floods. Many extraordinary floods create complex flow patterns and processes that cannot be adequately documented with quasi-steady, uniform one-dimensional analyses. These floods are most accura

  8. High-resolution urban flood modelling - a joint probability approach

    Science.gov (United States)

    Hartnett, Michael; Olbert, Agnieszka; Nash, Stephen

    2017-04-01

    ., 2008) The methodology includes estimates of flood probabilities due to coastal- and fluvial-driven processes occurring individually or jointly, mechanisms of flooding and their impacts on the urban environment. Various flood scenarios are examined in order to demonstrate that this methodology is necessary to quantify the important physical processes in coastal flood predictions. Cork City, located in the south of Ireland and subject to frequent coastal-fluvial flooding, is used as a case study.
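
    The notion of joint coastal and fluvial flood probability can be illustrated with a generic dependence calculation. The sketch below samples two correlated flood drivers through a Gaussian copula and compares the joint exceedance probability with the independence assumption; the marginal distributions, the dependence strength and the thresholds are all assumptions, not the study's methodology or data.

    ```python
    import numpy as np
    from scipy.stats import norm

    # Two flood drivers (e.g. coastal water level and river flow) with assumed dependence.
    rho = 0.6
    rng = np.random.default_rng(1)
    z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=200_000)
    u = norm.cdf(z)                                  # Gaussian copula: correlated uniforms

    # Map uniforms onto assumed marginal distributions of the two drivers.
    surge_m = norm.ppf(u[:, 0], loc=1.0, scale=0.4)             # coastal level [m]
    flow_m3s = np.exp(norm.ppf(u[:, 1], loc=4.0, scale=0.5))    # river flow [m3/s], lognormal

    threshold_surge, threshold_flow = 1.8, 150.0
    p_joint = np.mean((surge_m > threshold_surge) & (flow_m3s > threshold_flow))
    p_indep = np.mean(surge_m > threshold_surge) * np.mean(flow_m3s > threshold_flow)
    print(f"joint exceedance: {p_joint:.4f} vs. independence assumption: {p_indep:.4f}")
    ```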

  9. Big climate data analysis

    Science.gov (United States)

    Mudelsee, Manfred

    2015-04-01

    The Big Data era has begun also in the climate sciences, not only in economics or molecular biology. We measure climate at increasing spatial resolution by means of satellites and look farther back in time at increasing temporal resolution by means of natural archives and proxy data. We use powerful supercomputers to run climate models. The model output of the calculations made for the IPCC's Fifth Assessment Report amounts to ~650 TB. The 'scientific evolution' of grid computing has started, and the 'scientific revolution' of quantum computing is being prepared. This will increase computing power, and data amount, by several orders of magnitude in the future. However, more data does not automatically mean more knowledge. We need statisticians, who are at the core of transforming data into knowledge. Statisticians notably also explore the limits of our knowledge (uncertainties, that is, confidence intervals and P-values). Mudelsee (2014 Climate Time Series Analysis: Classical Statistical and Bootstrap Methods. Second edition. Springer, Cham, xxxii + 454 pp.) coined the term 'optimal estimation'. Consider the hyperspace of climate estimation. It has many, but not infinite, dimensions. It consists of the three subspaces Monte Carlo design, method and measure. The Monte Carlo design describes the data generating process. The method subspace describes the estimation and confidence interval construction. The measure subspace describes how to detect the optimal estimation method for the Monte Carlo experiment. The envisaged large increase in computing power may bring the following idea of optimal climate estimation into existence. Given a data sample, some prior information (e.g. measurement standard errors) and a set of questions (parameters to be estimated), the first task is simple: perform an initial estimation on basis of existing knowledge and experience with such types of estimation problems. The second task requires the computing power: explore the hyperspace to
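
    The uncertainty quantification mentioned above (confidence intervals, as in the cited book's bootstrap methods) can be illustrated with a minimal example. The sketch below computes a percentile bootstrap confidence interval for a linear trend in a synthetic temperature series; the data and the interval level are assumptions, and serially correlated series would need a block bootstrap instead of the plain resampling shown here.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic annual temperature anomalies with a prescribed trend plus noise.
    years = np.arange(1960, 2020)
    temps = 0.015 * (years - years[0]) + rng.normal(0.0, 0.15, size=years.size)

    def fit_trend(x, y):
        """Least-squares slope of y against x."""
        return np.polyfit(x, y, 1)[0]

    # Percentile bootstrap: resample (year, temp) pairs with replacement.
    boot = np.empty(2000)
    for b in range(boot.size):
        idx = rng.integers(0, years.size, size=years.size)
        boot[b] = fit_trend(years[idx], temps[idx])

    lo, hi = np.percentile(boot, [2.5, 97.5])
    print(f"trend = {fit_trend(years, temps):.4f} degC/yr, 95% CI [{lo:.4f}, {hi:.4f}]")
    ```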

  10. Hey, big spender

    Energy Technology Data Exchange (ETDEWEB)

    Cope, G.

    2000-04-01

    Business to business electronic commerce is looming large in the future of the oil industry. It is estimated that by adopting e-commerce the industry could achieve bottom line savings of between $1.8 to $ 3.4 billion a year on annual gross revenues in excess of $ 30 billion. At present there are several teething problems to overcome such as inter-operability standards, which are at least two or three years away. Tying in electronically with specific suppliers is also an expensive proposition, although the big benefits are in fact in doing business with the same suppliers on a continuing basis. Despite these problems, 14 of the world's largest energy and petrochemical companies joined forces in mid-April to create a single Internet procurement marketplace for the industry's complex supply chain. The exchange was designed by B2B (business-to-business) software provider, Commerce One Inc., ; it will leverage the buying clout of these industry giants (BP Amoco, Royal Dutch Shell Group, Conoco, Occidental Petroleum, Phillips Petroleum, Unocal Corporation and Statoil among them), currently about $ 125 billion on procurement per year; they hope to save between 5 to 30 per cent depending on the product and the region involved. Other similar schemes such as Chevron and partners' Petrocosm Marketplace, Network Oil, a Houston-based Internet portal aimed at smaller petroleum companies, are also doing business in the $ 10 billion per annum range. e-Energy, a cooperative project between IBM Ericson and Telus Advertising is another neutral, virtual marketplace targeted at the oil and gas sector. PetroTRAX, a Calgary-based website plans to take online procurement and auction sales a big step forward by establishing a portal to handle any oil company's asset management needs. There are also a number of websites targeting specific needs: IndigoPool.com (acquisitions and divestitures) and WellBid.com (products related to upstream oil and gas operators) are just

  11. Hey, big spender

    International Nuclear Information System (INIS)

    Cope, G.

    2000-01-01

    Business to business electronic commerce is looming large in the future of the oil industry. It is estimated that by adopting e-commerce the industry could achieve bottom line savings of between $1.8 to $ 3.4 billion a year on annual gross revenues in excess of $ 30 billion. At present there are several teething problems to overcome such as inter-operability standards, which are at least two or three years away. Tying in electronically with specific suppliers is also an expensive proposition, although the big benefits are in fact in doing business with the same suppliers on a continuing basis. Despite these problems, 14 of the world's largest energy and petrochemical companies joined forces in mid-April to create a single Internet procurement marketplace for the industry's complex supply chain. The exchange was designed by B2B (business-to-business) software provider, Commerce One Inc., ; it will leverage the buying clout of these industry giants (BP Amoco, Royal Dutch Shell Group, Conoco, Occidental Petroleum, Phillips Petroleum, Unocal Corporation and Statoil among them), currently about $ 125 billion on procurement per year; they hope to save between 5 to 30 per cent depending on the product and the region involved. Other similar schemes such as Chevron and partners' Petrocosm Marketplace, Network Oil, a Houston-based Internet portal aimed at smaller petroleum companies, are also doing business in the $ 10 billion per annum range. e-Energy, a cooperative project between IBM Ericson and Telus Advertising is another neutral, virtual marketplace targeted at the oil and gas sector. PetroTRAX, a Calgary-based website plans to take online procurement and auction sales a big step forward by establishing a portal to handle any oil company's asset management needs. There are also a number of websites targeting specific needs: IndigoPool.com (acquisitions and divestitures) and WellBid.com (products related to upstream oil and gas operators) are just two examples. All in

  12. Modeling and Analysis in Marine Big Data: Advances and Challenges

    Directory of Open Access Journals (Sweden)

    Dongmei Huang

    2015-01-01

    It is widely recognized that big data has gathered tremendous attention from academic research institutes, governments, and enterprises in all aspects of information sciences. With the development of a diversity of marine data acquisition techniques, marine data have grown exponentially in the last decade, forming marine big data. As an innovation, marine big data is a double-edged sword. On the one hand, there are many potential and highly useful values hidden in the huge volume of marine data, which is widely used in marine-related fields, such as tsunami and red-tide warning, prevention, and forecasting, disaster inversion, and visualization modeling after disasters. There is no doubt that future competition in marine sciences and technologies will converge on the exploration of marine data. On the other hand, marine big data also brings about many new challenges in data management, such as the difficulties in data capture, storage, analysis, and applications, as well as data quality control and data security. To highlight theoretical methodologies and practical applications of marine big data, this paper presents a broad view of marine big data and its management, surveys key methods and models, introduces an engineering example that demonstrates the management architecture, and discusses the existing challenges.

  13. Floods and tsunamis.

    Science.gov (United States)

    Llewellyn, Mark

    2006-06-01

    Floods and tsunamis cause few severe injuries, but those injuries can overwhelm local areas, depending on the magnitude of the disaster. Most injuries are extremity fractures, lacerations, and sprains. Because of the mechanism of soft tissue and bone injuries, infection is a significant risk. Aspiration pneumonias are also associated with tsunamis. Appropriate precautionary interventions prevent communicable disease outbreaks. Psychosocial health issues must be considered.

  14. Measuring the Promise of Big Data Syllabi

    Science.gov (United States)

    Friedman, Alon

    2018-01-01

    Growing interest in Big Data is leading industries, academics and governments to accelerate Big Data research. However, how teachers should teach Big Data has not been fully examined. This article suggests criteria for redesigning Big Data syllabi in public and private degree-awarding higher education establishments. The author conducted a survey…

  15. Identification of flood-rich and flood-poor periods in flood series

    Science.gov (United States)

    Mediero, Luis; Santillán, David; Garrote, Luis

    2015-04-01

    Recently, a general concern about the non-stationarity of flood series has arisen, as changes in catchment response can be driven by several factors, such as climatic and land-use changes. Several studies to detect trends in flood series at either national or trans-national scales have been conducted. Trends are usually detected by the Mann-Kendall test. However, the results of this test depend on the starting and ending year of the series, which can lead to different results depending on the period considered. The results can be conditioned by flood-poor and flood-rich periods located at the beginning or end of the series. A methodology to identify statistically significant flood-rich and flood-poor periods is developed, based on the comparison between the expected sampling variability of floods when stationarity is assumed and the observed variability of floods in a given series. The methodology is applied to a set of long series of annual maximum floods, peaks over threshold and counts of annual occurrences in peaks over threshold series observed in Spain in the period 1942-2009. Mediero et al. (2014) found a general decreasing trend in flood series in some parts of Spain that could be caused by a flood-rich period observed in 1950-1970, placed at the beginning of the flood series. The results of this study support the findings of Mediero et al. (2014), as a flood-rich period in 1950-1970 was identified in most of the selected sites. References: Mediero, L., Santillán, D., Garrote, L., Granados, A. Detection and attribution of trends in magnitude, frequency and timing of floods in Spain, Journal of Hydrology, 517, 1072-1088, 2014.
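
    The Mann-Kendall test mentioned above can be applied to an annual maximum flood series in a few lines. The implementation below is a standard textbook version (S statistic with the normal approximation and no tie correction) run on synthetic data; it is not the code used in the study, and real series with ties or autocorrelation need the corrected variants.

    ```python
    import numpy as np
    from scipy.stats import norm

    def mann_kendall(series):
        """Mann-Kendall trend test (normal approximation, no tie correction)."""
        x = np.asarray(series, dtype=float)
        n = x.size
        s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
        var_s = n * (n - 1) * (2 * n + 5) / 18.0
        z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0
        p = 2.0 * norm.sf(abs(z))              # two-sided p-value
        return s, z, p

    # Synthetic annual maximum floods with a mild decreasing trend (illustrative only).
    rng = np.random.default_rng(7)
    years = np.arange(1942, 2010)
    floods = 600.0 - 1.5 * (years - years[0]) + rng.gumbel(0.0, 80.0, size=years.size)

    s, z, p = mann_kendall(floods)
    print(f"S = {s:.0f}, Z = {z:.2f}, p = {p:.4f} -> "
          f"{'significant trend' if p < 0.05 else 'no significant trend'} at the 5% level")
    ```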

  16. Ecosystem Approach To Flood Disaster Risk Reduction

    Directory of Open Access Journals (Sweden)

    RK Kamble

    2013-12-01

    India is one of the ten worst disaster-prone countries of the world. The country is prone to disasters due to a number of factors, both natural and anthropogenic, including adverse geo-climatic conditions, topographical features, environmental degradation, population growth, urbanisation, industrialisation, non-scientific development practices, etc. These factors, either directly or by accelerating the intensity and frequency of disasters, are responsible for a heavy toll of human lives and for disrupting the life support systems in the country. India has 40 million hectares of flood-prone area; on average, floods affect an area of around 7.5 million hectares per year. Knowledge of environmental systems and processes is a key factor in the management of disasters, particularly the hydro-meteorological ones. Management of flood risk and disaster is a multi-dimensional affair that calls for an interdisciplinary approach. Ecosystem-based disaster risk reduction builds on ecosystem management principles, strategies and tools in order to maximise ecosystem services for risk reduction. This perspective takes into account the integration of social and ecological systems, placing people at the centre of decision making. The present paper attempts to demonstrate how an ecosystem-based approach can help in flood disaster risk reduction. International Journal of Environment, Volume-2, Issue-1, Sep-Nov 2013, Pages 70-82 DOI: http://dx.doi.org/10.3126/ije.v2i1.9209

  17. The BigBOSS Experiment

    Energy Technology Data Exchange (ETDEWEB)

    Schlegel, D.; Abdalla, F.; Abraham, T.; Ahn, C.; Allende Prieto, C.; Annis, J.; Aubourg, E.; Azzaro, M.; Bailey, S.; Baltay, C.; Baugh, C.; /APC, Paris /Brookhaven /IRFU, Saclay /Marseille, CPPM /Marseille, CPT /Durham U. / /IEU, Seoul /Fermilab /IAA, Granada /IAC, La Laguna

    2011-01-01

    BigBOSS will obtain observational constraints that will bear on three of the four 'science frontier' questions identified by the Astro2010 Cosmology and Fundamental Physics Panel of the Decadal Survey: Why is the universe accelerating? What is dark matter? And what are the properties of neutrinos? Indeed, the BigBOSS project was recommended for substantial immediate R and D support by the PASAG report. The second highest ground-based priority from the Astro2010 Decadal Survey was the creation of a funding line within the NSF to support a 'Mid-Scale Innovations' program, and it used BigBOSS as a 'compelling' example for support. This choice was the result of the Decadal Survey's Program Prioritization panels reviewing 29 mid-scale projects and recommending BigBOSS 'very highly'.

  18. Aligning Natural Resource Conservation and Flood Hazard Mitigation in California.

    Science.gov (United States)

    Calil, Juliano; Beck, Michael W; Gleason, Mary; Merrifield, Matthew; Klausmeyer, Kirk; Newkirk, Sarah

    2015-01-01

    Flooding is the most common and damaging of all natural disasters in the United States, and was a factor in almost all declared disasters in the U.S. Direct flood losses in the U.S. in 2011 totaled $8.41 billion, and flood damage has also been on the rise globally over the past century. The National Flood Insurance Program has paid out more than $38 billion in claims since its inception in 1968, more than a third of which has gone to the one percent of policies that experienced multiple losses and are classified as "repetitive loss." During the same period, the loss of coastal wetlands and other natural habitat has continued, and funds for conservation and restoration of these habitats are very limited. This study demonstrates that flood losses could be mitigated through action that meets both flood risk reduction and conservation objectives. We found that there are at least 11,243 km2 of land in coastal California which is both flood-prone and has natural resource conservation value, and where a property/structure buyout and habitat restoration project could meet multiple objectives. For example, our results show that in Sonoma County, the extent of land that meets these criteria is 564 km2. Further, we explore flood mitigation grant programs that can be a significant source of funds for such projects. We demonstrate that government-funded buyouts followed by restoration of targeted lands can support social, environmental, and economic objectives: reduction of flood exposure, restoration of natural resources, and efficient use of limited governmental funds.
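
    Identifying land that is both flood-prone and of conservation value is, at its core, a polygon-overlay operation followed by an area calculation. The sketch below shows that idea with shapely on two invented rectangles in projected coordinates; the geometries are placeholders, and the study's actual analysis used state-wide GIS datasets.

    ```python
    from shapely.geometry import box

    # Placeholder polygons in a projected CRS (metres), standing in for GIS layers.
    flood_prone = box(0, 0, 4_000, 3_000)           # e.g. a mapped floodplain
    conservation = box(2_500, 1_000, 6_000, 5_000)  # e.g. priority habitat

    overlap = flood_prone.intersection(conservation)
    print(f"flood-prone area        : {flood_prone.area / 1e6:.1f} km2")
    print(f"conservation-value area : {conservation.area / 1e6:.1f} km2")
    print(f"overlap (both criteria) : {overlap.area / 1e6:.1f} km2")
    ```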

  19. Aligning Natural Resource Conservation and Flood Hazard Mitigation in California.

    Directory of Open Access Journals (Sweden)

    Juliano Calil

    Full Text Available Flooding is the most common and damaging of all natural disasters in the United States, and was a factor in almost all declared disasters in the U.S. Direct flood losses in the U.S. in 2011 totaled $8.41 billion, and flood damage has also been on the rise globally over the past century. The National Flood Insurance Program paid out more than $38 billion in claims since its inception in 1968, more than a third of which has gone to the one percent of policies that experienced multiple losses and are classified as "repetitive loss." During the same period, the loss of coastal wetlands and other natural habitat has continued, and funds for conservation and restoration of these habitats are very limited. This study demonstrates that flood losses could be mitigated through action that meets both flood risk reduction and conservation objectives. We found that there are at least 11,243 km2 of land in coastal California that is both flood-prone and has natural resource conservation value, and where a property/structure buyout and habitat restoration project could meet multiple objectives. For example, our results show that in Sonoma County, the extent of land that meets these criteria is 564 km2. Further, we explore flood mitigation grant programs that can be a significant source of funds for such projects. We demonstrate that government funded buyouts followed by restoration of targeted lands can support social, environmental, and economic objectives: reduction of flood exposure, restoration of natural resources, and efficient use of limited governmental funds.

  20. The development of flood map in Malaysia

    Science.gov (United States)

    Zakaria, Siti Fairus; Zin, Rosli Mohamad; Mohamad, Ismail; Balubaid, Saeed; Mydin, Shaik Hussein; MDR, E. M. Roodienyanto

    2017-11-01

    In Malaysia, flash floods are common occurrences throughout the year in flood-prone areas. In terms of flood extent, flash floods affect smaller areas, but because of their tendency to occur in densely urbanized areas, the value of damaged property is high and the disruption to traffic flow and businesses is substantial. However, in river floods, especially the river floods of Kelantan and Pahang, the flood extent is widespread and can extend over 1,000 square kilometers. Although the value of property and the density of affected population are lower, the damage inflicted by these floods can also be high because the area affected is large. In order to combat these floods, various flood mitigation measures have been carried out. Structural flood mitigation alone can only provide protection levels from 10 to 100 years Average Recurrence Interval (ARI). One of the economically effective non-structural approaches in flood mitigation and flood management is the use of geospatial technology, which involves flood forecasting and warning services to the flood-prone areas. This approach, which involves the use of a Geographical Information Flood Forecasting system, also includes the generation of a series of flood maps. There are three types of flood maps, namely the Flood Hazard Map, the Flood Risk Map and the Flood Evacuation Map. The Flood Hazard Map is used to determine areas susceptible to flooding when discharge from a stream exceeds the bank-full stage. Early warnings of incoming flood events will enable the flood victims to prepare themselves before flooding occurs. Property and lives can be saved by keeping movable property above the flood levels and, if necessary, by an early evacuation from the area. With respect to flood fighting, an early warning with reference to a series of flood maps, including the flood hazard map, flood risk map and flood evacuation map of the approaching flood, should be able to alert the organization in charge of the flood fighting actions and the authority to

  1. iFLOOD: A Real Time Flood Forecast System for Total Water Modeling in the National Capital Region

    Science.gov (United States)

    Sumi, S. J.; Ferreira, C.

    2017-12-01

    Extreme flood events are the costliest natural hazards impacting the US, frequently causing extensive damage to infrastructure, disruption to the economy and loss of lives. In 2016, Hurricane Matthew brought severe damage to South Carolina and demonstrated the importance of accurate flood hazard predictions, which requires the integration of riverine and coastal model forecasts for total water prediction in coastal and tidal areas. The National Weather Service (NWS) and the National Ocean Service (NOS) provide flood forecasts for almost the entire US; still, there are service-gap areas in tidal regions where no official flood forecast is available. The National Capital Region is vulnerable to multi-flood hazards, including high flows from annual inland precipitation events and surge-driven coastal inundation along the tidal Potomac River. Predicting flood levels in such tidal areas in the river-estuarine zone is extremely challenging. The main objective of this study is to develop the next generation of flood forecast systems, capable of providing accurate and timely information to support emergency management and response in areas impacted by multi-flood hazards. This forecast system is capable of simulating flood levels in the Potomac and Anacostia River, incorporating the effects of riverine flooding from the upstream basins, urban storm water and tidal oscillations from the Chesapeake Bay. Flood forecast models developed so far have been using riverine data to simulate water levels for the Potomac River. Therefore, the idea is to use forecasted storm surge data from a coastal model as the boundary condition of this system. The final output of this validated model will capture the water behavior in the river-estuary transition zone far better than one with riverine data only. The challenge for this iFLOOD forecast system is to understand the complex dynamics of multi-flood hazards caused by storm surges, riverine flow, tidal oscillation and urban storm water. Automated system

  2. Swiss Re Global Flood Hazard Zones: Know your flood risk

    Science.gov (United States)

    Vinukollu, R. K.; Castaldi, A.; Mehlhorn, J.

    2012-12-01

    Floods, among all natural disasters, have a great damage potential. On a global basis, there is strong evidence of an increase in the number of people affected and in economic losses due to floods. For example, global insured flood losses have increased by 12% every year since 1970, and this is expected to increase further with growing exposure in high-risk areas close to rivers and coastlines. Recently, the insurance industry has been surprised by the large extent of losses, because most countries lack reliable hazard information. One example has been the 2011 Thailand floods, where millions of people were affected and the total economic losses were 30 billion USD. In order to assess the flood risk across different regions and countries, the flood team at Swiss Re produced global maps of flood zones based on a Geomorphologic Regression approach, developed in house and patented. Input data for the study were obtained from NASA's Shuttle Radar Topographic Mission (SRTM) elevation data, the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) Global Digital Elevation Model (GDEM) and HydroSHEDS. The underlying assumptions of the approach are that naturally flowing rivers shape their channel and flood plain according to basin-inherent forces and characteristics and that the flood water extent strongly depends on the shape of the flood plain. On the basis of the catchment characteristics, the model finally calculates the probability of a location to be flooded or not for a defined return period, which in the current study was set to 100 years. The data are produced at a 90-m resolution for latitudes 60S to 60N. This global product is now used in the insurance industry to inspect, inform and/or insure the flood risk across the world.

  3. Modeling urban coastal flood severity from crowd-sourced flood reports using Poisson regression and Random Forest

    Science.gov (United States)

    Sadler, J. M.; Goodall, J. L.; Morsy, M. M.; Spencer, K.

    2018-04-01

    Sea level rise has already caused more frequent and severe coastal flooding, and this trend will likely continue. Flood prediction is an essential part of a coastal city's capacity to adapt to and mitigate this growing problem. Complex coastal urban hydrological systems, however, do not always lend themselves easily to physically-based flood prediction approaches. This paper presents a method for using a data-driven approach to estimate flood severity in an urban coastal setting using crowd-sourced data, a non-traditional but growing data source, along with environmental observation data. Two data-driven models, Poisson regression and Random Forest regression, are trained to predict the number of flood reports per storm event as a proxy for flood severity, given extensive environmental data (i.e., rainfall, tide, groundwater table level, and wind conditions) as input. The method is demonstrated using data from Norfolk, Virginia, USA from September 2010 to October 2016. Quality-controlled, crowd-sourced street flooding reports ranging from 1 to 159 per storm event for 45 storm events are used to train and evaluate the models. Random Forest performed better than Poisson regression at predicting the number of flood reports and had a lower false negative rate. From the Random Forest model, total cumulative rainfall was by far the most dominant input variable in predicting flood severity, followed by low tide and lower low tide. These methods serve as a first step toward using data-driven methods for spatially and temporally detailed coastal urban flood prediction.
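
    The record names the two models but not the exact features or settings; the following is a minimal sketch, assuming synthetic storm-event data and illustrative feature names in place of the Norfolk records.

```python
# Minimal sketch: predict flood-report counts per storm event from
# environmental drivers. Data and feature names below are synthetic and
# illustrative; they are not the Norfolk, VA records used in the study.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import PoissonRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_events = 45  # number of storm events, as in the record
X = np.column_stack([
    rng.gamma(2.0, 20.0, n_events),   # total cumulative rainfall (mm)
    rng.normal(0.5, 0.2, n_events),   # low-tide level (m)
    rng.normal(1.0, 0.3, n_events),   # groundwater table level (m)
    rng.normal(5.0, 2.0, n_events),   # wind speed (m/s)
])
# Synthetic counts, rainfall-dominated, to mimic the reported importance ranking.
lam = np.exp(0.02 * X[:, 0] + 1.0 * X[:, 1] - 0.2)
y = rng.poisson(lam)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
poisson = PoissonRegressor(alpha=1e-3, max_iter=1000).fit(X_tr, y_tr)
forest = RandomForestRegressor(n_estimators=300, random_state=0).fit(X_tr, y_tr)

for name, model in [("Poisson", poisson), ("RandomForest", forest)]:
    print(name, "MAE:", mean_absolute_error(y_te, model.predict(X_te)))
print("RF feature importances:", forest.feature_importances_)
```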

  4. Biophotonics: the big picture

    Science.gov (United States)

    Marcu, Laura; Boppart, Stephen A.; Hutchinson, Mark R.; Popp, Jürgen; Wilson, Brian C.

    2018-02-01

    The 5th International Conference on Biophotonics (ICOB) held April 30 to May 1, 2017, in Fremantle, Western Australia, brought together opinion leaders to discuss future directions for the field and opportunities to consider. The first session of the conference, "How to Set a Big Picture Biophotonics Agenda," was focused on setting the stage for developing a vision and strategies for translation and impact on society of biophotonic technologies. The invited speakers, panelists, and attendees engaged in discussions that focused on opportunities and promising applications for biophotonic techniques, challenges when working at the confluence of the physical and biological sciences, driving factors for advances of biophotonic technologies, and educational opportunities. We share a summary of the presentations and discussions. Three main themes from the conference are presented in this position paper that capture the current status, opportunities, challenges, and future directions of biophotonics research and key areas of applications: (1) biophotonics at the nano- to microscale level; (2) biophotonics at meso- to macroscale level; and (3) biophotonics and the clinical translation conundrum.

  5. Big Data, Small Sample.

    Science.gov (United States)

    Gerlovina, Inna; van der Laan, Mark J; Hubbard, Alan

    2017-05-20

    Multiple comparisons and small sample size, common characteristics of many types of "Big Data" including those that are produced by genomic studies, present specific challenges that affect reliability of inference. Use of multiple testing procedures necessitates calculation of very small tail probabilities of a test statistic distribution. Results based on large deviation theory provide a formal condition that is necessary to guarantee error rate control given practical sample sizes, linking the number of tests and the sample size; this condition, however, is rarely satisfied. Using methods that are based on Edgeworth expansions (relying especially on the work of Peter Hall), we explore the impact of departures of sampling distributions from typical assumptions on actual error rates. Our investigation illustrates how far the actual error rates can be from the declared nominal levels, suggesting potentially widespread problems with error rate control, specifically excessive false positives. This is an important factor that contributes to the "reproducibility crisis". We also review some other commonly used methods (such as permutation and methods based on finite sampling inequalities) in their application to multiple testing/small sample data. We point out that Edgeworth expansions, providing higher order approximations to the sampling distribution, offer a promising direction for data analysis that could improve reliability of studies relying on large numbers of comparisons with modest sample sizes.
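
    The Edgeworth-expansion analysis itself is not reproduced in the record; the short simulation below only illustrates the underlying concern, that realized false-positive rates can drift from nominal levels when many tests are run on small, skewed samples. All distributions and settings are illustrative assumptions.

```python
# Illustration only: realized vs nominal false-positive rates when many tests
# are run on small samples from a skewed null distribution. This is not the
# paper's Edgeworth-expansion analysis.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n_tests, n_samples, alpha = 10_000, 10, 0.05

# All null hypotheses are true: data have mean exactly 1.0 but are heavily skewed.
data = rng.exponential(scale=1.0, size=(n_tests, n_samples))
t_stat, p_val = stats.ttest_1samp(data, popmean=1.0, axis=1)

print(f"nominal level: {alpha:.3f}")
print(f"realized false-positive rate: {np.mean(p_val < alpha):.3f}")
# Very small tail probabilities (e.g. Bonferroni thresholds) are hit even harder.
print("rejections at alpha / n_tests:", int(np.sum(p_val < alpha / n_tests)))
```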

  6. Big bang darkleosynthesis

    Directory of Open Access Journals (Sweden)

    Gordan Krnjaic

    2015-12-01

    Full Text Available In a popular class of models, dark matter comprises an asymmetric population of composite particles with short range interactions arising from a confined nonabelian gauge group. We show that coupling this sector to a well-motivated light mediator particle yields efficient darkleosynthesis, a dark-sector version of big-bang nucleosynthesis (BBN), in generic regions of parameter space. Dark matter self-interaction bounds typically require the confinement scale to be above ΛQCD, which generically yields large (≫MeV/dark-nucleon) binding energies. These bounds further suggest the mediator is relatively weakly coupled, so repulsive forces between dark-sector nuclei are much weaker than Coulomb repulsion between standard-model nuclei, which results in an exponential barrier-tunneling enhancement over standard BBN. Thus, darklei are easier to make and harder to break than visible species with comparable mass numbers. This process can efficiently yield a dominant population of states with masses significantly greater than the confinement scale and, in contrast to dark matter that is a fundamental particle, may allow the dominant form of dark matter to have high spin (S≫3/2), whose discovery would be smoking gun evidence for dark nuclei.

  7. Predicting big bang deuterium

    Energy Technology Data Exchange (ETDEWEB)

    Hata, N.; Scherrer, R.J.; Steigman, G.; Thomas, D.; Walker, T.P. [Department of Physics, Ohio State University, Columbus, Ohio 43210 (United States)

    1996-02-01

    We present new upper and lower bounds to the primordial abundances of deuterium and ³He based on observational data from the solar system and the interstellar medium. Independent of any model for the primordial production of the elements we find (at the 95% C.L.): 1.5×10⁻⁵ ≤ (D/H)_P ≤ 10.0×10⁻⁵ and (³He/H)_P ≤ 2.6×10⁻⁵. When combined with the predictions of standard big bang nucleosynthesis, these constraints lead to a 95% C.L. bound on the primordial abundance of deuterium: (D/H)_best = (3.5 +2.7/-1.8)×10⁻⁵. Measurements of deuterium absorption in the spectra of high-redshift QSOs will directly test this prediction. The implications of this prediction for the primordial abundances of ⁴He and ⁷Li are discussed, as well as those for the universal density of baryons. © 1996 The American Astronomical Society.

  8. Big bang darkleosynthesis

    Science.gov (United States)

    Krnjaic, Gordan; Sigurdson, Kris

    2015-12-01

    In a popular class of models, dark matter comprises an asymmetric population of composite particles with short range interactions arising from a confined nonabelian gauge group. We show that coupling this sector to a well-motivated light mediator particle yields efficient darkleosynthesis, a dark-sector version of big-bang nucleosynthesis (BBN), in generic regions of parameter space. Dark matter self-interaction bounds typically require the confinement scale to be above ΛQCD, which generically yields large (≫MeV /dark-nucleon) binding energies. These bounds further suggest the mediator is relatively weakly coupled, so repulsive forces between dark-sector nuclei are much weaker than Coulomb repulsion between standard-model nuclei, which results in an exponential barrier-tunneling enhancement over standard BBN. Thus, darklei are easier to make and harder to break than visible species with comparable mass numbers. This process can efficiently yield a dominant population of states with masses significantly greater than the confinement scale and, in contrast to dark matter that is a fundamental particle, may allow the dominant form of dark matter to have high spin (S ≫ 3 / 2), whose discovery would be smoking gun evidence for dark nuclei.

  9. The role of big laboratories

    CERN Document Server

    Heuer, Rolf-Dieter

    2013-01-01

    This paper presents the role of big laboratories in their function as research infrastructures. Starting from the general definition and features of big laboratories, the paper goes on to present the key ingredients and issues, based on scientific excellence, for the successful realization of large-scale science projects at such facilities. The paper concludes by taking the example of scientific research in the field of particle physics and describing the structures and methods required to be implemented for the way forward.

  10. The role of big laboratories

    International Nuclear Information System (INIS)

    Heuer, R-D

    2013-01-01

    This paper presents the role of big laboratories in their function as research infrastructures. Starting from the general definition and features of big laboratories, the paper goes on to present the key ingredients and issues, based on scientific excellence, for the successful realization of large-scale science projects at such facilities. The paper concludes by taking the example of scientific research in the field of particle physics and describing the structures and methods required to be implemented for the way forward. (paper)

  11. Recent advances in flood forecasting and flood risk assessment

    Directory of Open Access Journals (Sweden)

    G. Arduino

    2005-01-01

    Full Text Available Recent large floods in Europe have led to increased interest in research and development of flood forecasting systems. Some of these events have been provoked by some of the wettest rainfall periods on record, which has led to speculation that such extremes are attributable in some measure to anthropogenic global warming and represent the beginning of a period of higher flood frequency. Whilst current trends in extreme event statistics will be difficult to discern conclusively, there has been a substantial increase in the frequency of high floods in the 20th century for basins greater than 2×10^5 km2. There is also increasing evidence that anthropogenic forcing of climate change may lead to an increased probability of extreme precipitation and, hence, of flooding. There is, therefore, major emphasis on the improvement of operational flood forecasting systems in Europe, with significant European Community spending on research and development on prototype forecasting systems and flood risk management projects. This Special Issue synthesises the most relevant scientific and technological results presented at the International Conference on Flood Forecasting in Europe held in Rotterdam from 3-5 March 2003. During that meeting 150 scientists, forecasters and stakeholders from four continents assembled to present their work and current operational best practice and to discuss future directions of scientific and technological efforts in flood prediction and prevention. The papers presented at the conference fall into seven themes, as follows.

  12. Public perception of flood risks, flood forecasting and mitigation

    Directory of Open Access Journals (Sweden)

    M. Brilly

    2005-01-01

    Full Text Available A multidisciplinary and integrated approach to the flood mitigation decision making process should provide the best response of society in a flood hazard situation, including preparation works and post-hazard mitigation. In Slovenia, there is a great lack of data on social aspects and public response to flood mitigation measures and information management. In this paper, two studies of flood perception in the Slovenian town of Celje are presented. During its history, Celje was often exposed to floods, the most recent serious floods being in 1990 and in 1998, with a hundred and fifty year return period and a more than ten year return period, respectively. Two surveys were conducted in 1997 and 2003, with 157 participants from different areas of the town in the first and 208 in the second study, aiming at finding the general attitude toward the floods. The surveys revealed that floods present a serious threat in the eyes of the inhabitants, and that the perception of threat depends, to a certain degree, on the place of residence. The surveys also highlighted, among the other measures, solidarity and the importance of insurance against floods.

  13. Challenges of Big Data Analysis.

    Science.gov (United States)

    Fan, Jianqing; Han, Fang; Liu, Han

    2014-06-01

    Big Data bring new opportunities to modern society and challenges to data scientists. On the one hand, Big Data hold great promise for discovering subtle population patterns and heterogeneities that are not possible with small-scale data. On the other hand, the massive sample size and high dimensionality of Big Data introduce unique computational and statistical challenges, including scalability and storage bottlenecks, noise accumulation, spurious correlation, incidental endogeneity, and measurement errors. These challenges are distinctive and require new computational and statistical paradigms. This article gives an overview of the salient features of Big Data and how these features impact the paradigm change in statistical and computational methods as well as computing architectures. We also provide various new perspectives on Big Data analysis and computation. In particular, we emphasize the viability of the sparsest solution in high-confidence sets and point out that exogenous assumptions in most statistical methods for Big Data cannot be validated due to incidental endogeneity. They can lead to wrong statistical inferences and consequently wrong scientific conclusions.

  14. Generalized formal model of Big Data

    OpenAIRE

    Shakhovska, N.; Veres, O.; Hirnyak, M.

    2016-01-01

    This article dwells on the basic characteristic features of Big Data technologies. The existing definitions of the term "big data" are analyzed. The article proposes and describes the elements of a generalized formal model of big data, and analyzes the peculiarities of applying the proposed model components. The fundamental differences between Big Data technology and business analytics are described. Big Data is supported by the distributed file system Google File System ...

  15. Long-term reactions of plants and macroinvertebrates to extreme floods in floodplain grasslands.

    Science.gov (United States)

    Ilg, Christiane; Dziock, Frank; Foeckler, Francis; Follner, Klaus; Gerisch, Michael; Glaeser, Judith; Rink, Anke; Schanowski, Arno; Scholz, Mathias; Deichner, Oskar; Henle, Klaus

    2008-09-01

    Extreme summertime flood events are expected to become more frequent in European rivers due to climate change. In temperate areas, where winter floods are common, extreme floods occurring in summer, a period of high physiological activity, may seriously impact floodplain ecosystems. Here we report on the effects of the 2002 extreme summer flood on flora and fauna of the riverine grasslands of the Middle Elbe (Germany), comparing pre- and post-flooding data collected by identical methods. Plants, mollusks, and carabid beetles differed considerably in their response in terms of abundance and diversity. Plants and mollusks, displaying morphological and behavioral adaptations to flooding, showed higher survival rates than the carabid beetles, the adaptation strategies of which were mainly linked to life history. Our results illustrate the complexity of responses of floodplain organisms to extreme flood events. They demonstrate that the efficiency of resistance and resilience strategies is widely dependent on the mode of adaptation.

  16. On the flood forecasting at the Bulgarian part of Struma River Basin

    International Nuclear Information System (INIS)

    Dimitrov, Dobri

    2004-01-01

    Struma is a mountain river flowing from north to south, from Bulgaria through Greece to the Aegean Sea. It generates flash floods of the snowmelt-rainfall type, mainly in late spring. Flood forecasting there is needed to improve flood mitigation measures in the Bulgarian territory of the basin, as well as for effective reservoir management downstream of the Bulgarian border, secure flood handling in Greek territory and a general decrease of the flood hazard. The paper summarizes the range of activities in the basin, including: - the installation of an automatic telemetric hydro-meteorological observation network; - a review of the results of relevant past projects; - analysis of historical hydro-meteorological data; - design and calibration of flood forecasting models; - demonstrating the possibility to issue flood warnings with a certain lead time and accuracy; - recent efforts to increase the lead time of the hydrological forecasts, applying forecasts from High Resolution Limited Area meteorological models and other activities in the frame of the EC 5th FP EFFS project. (Author)

  17. Flooding correlations in narrow channel

    International Nuclear Information System (INIS)

    Kim, S. H.; Baek, W. P.; Chang, S. H.

    1999-01-01

    Heat transfer in a narrow gap is considered an important phenomenon in severe accidents in nuclear power plants, as well as in heat removal from electronic chips. The critical heat flux (CHF) in a narrow gap limits the maximum heat transfer rate in a narrow channel. In the case of a closed-bottom channel, flooding-limited CHF occurrence is observed, so flooding correlations are helpful for predicting the CHF in closed-bottom channels. In the present study, flooding data for narrow channel geometries were collected and work was performed to identify the effects of the span, w, and the gap size, s. New flooding correlations were suggested for high-aspect-ratio geometries, and the flooding correlation was applied to flooding-limited CHF data
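
    The record does not give the form of the suggested correlations. As a hedged sketch, the classical Wallis-type flooding correlation is shown below; the constants m and C and the characteristic length are geometry-dependent assumptions, not the values fitted in this study.

```python
# Hedged sketch of a Wallis-type flooding correlation,
#   sqrt(j_g*) + m * sqrt(j_f*) = C,
# with dimensionless superficial velocities
#   j_k* = j_k * sqrt(rho_k / (g * L * (rho_f - rho_g))).
# The constants m, C and the characteristic length L are geometry-dependent
# assumptions; the record does not give the fitted high-aspect-ratio form.
import math

G = 9.81  # gravitational acceleration (m/s^2)

def wallis_jstar(j, rho_phase, rho_f, rho_g, length):
    """Dimensionless superficial velocity of one phase."""
    return j * math.sqrt(rho_phase / (G * length * (rho_f - rho_g)))

def gas_velocity_at_flooding(j_f, rho_f, rho_g, length, m=1.0, c=0.725):
    """Superficial gas velocity at the flooding limit for a given liquid
    downflow j_f, using assumed Wallis constants m and c."""
    jf_star = wallis_jstar(j_f, rho_f, rho_f, rho_g, length)
    jg_star_sqrt = c - m * math.sqrt(jf_star)
    if jg_star_sqrt <= 0.0:
        return 0.0  # liquid downflow already exceeds the flooding limit
    return jg_star_sqrt ** 2 / math.sqrt(rho_g / (G * length * (rho_f - rho_g)))

# Example: water/steam-like densities in a 2 mm gap (illustrative values only).
print(gas_velocity_at_flooding(j_f=0.05, rho_f=958.0, rho_g=0.6, length=0.002))
```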

  18. Flood Hazards - A National Threat

    Science.gov (United States)

    ,

    2006-01-01

    In the late summer of 2005, the remarkable flooding brought by Hurricane Katrina, which caused more than $200 billion in losses, constituted the costliest natural disaster in U.S. history. However, even in typical years, flooding causes billions of dollars in damage and threatens lives and property in every State. Natural processes, such as hurricanes, weather systems, and snowmelt, can cause floods. Failure of levees and dams and inadequate drainage in urban areas can also result in flooding. On average, floods kill about 140 people each year and cause $6 billion in property damage. Although loss of life to floods during the past half-century has declined, mostly because of improved warning systems, economic losses have continued to rise due to increased urbanization and coastal development.

  19. Big Sky Carbon Sequestration Partnership

    Energy Technology Data Exchange (ETDEWEB)

    Susan Capalbo

    2005-12-31

    The Big Sky Carbon Sequestration Partnership, led by Montana State University, is comprised of research institutions, public entities and private sector organizations, and the Confederated Salish and Kootenai Tribes and the Nez Perce Tribe. Efforts under this Partnership in Phase I are organized into four areas: (1) Evaluation of sources and carbon sequestration sinks that will be used to determine the location of pilot demonstrations in Phase II; (2) Development of a GIS-based reporting framework that links with national networks; (3) Design of an integrated suite of monitoring, measuring, and verification technologies, market-based opportunities for carbon management, and an economic/risk assessment framework (referred to below as the Advanced Concepts component of the Phase I efforts); and (4) Initiation of a comprehensive education and outreach program. As a result of the Phase I activities, the groundwork is in place to provide an assessment of storage capabilities for CO2 utilizing the resources found in the Partnership region (both geological and terrestrial sinks), which complements the ongoing DOE research agenda in Carbon Sequestration. The geology of the Big Sky Carbon Sequestration Partnership Region is favorable for the potential sequestration of enormous volumes of CO2. The United States Geological Survey (USGS 1995) identified 10 geologic provinces and 111 plays in the region. These provinces and plays include both sedimentary rock types characteristic of oil, gas, and coal production as well as large areas of mafic volcanic rocks. Of the 10 provinces and 111 plays, 1 province and 4 plays are located within Idaho. The remaining 9 provinces and 107 plays are dominated by sedimentary rocks and located in the states of Montana and Wyoming. The potential sequestration capacity of the 9 sedimentary provinces within the region ranges from 25,000 to almost 900,000 million metric tons of CO2. Overall every sedimentary formation investigated

  20. Statistics and Analysis of the Relations between Rainstorm Floods and Earthquakes

    Directory of Open Access Journals (Sweden)

    Baodeng Hou

    2016-01-01

    Full Text Available The frequent occurrence of geophysical disasters under climate change has drawn Chinese scholars to pay attention to disaster relations. If the occurrence sequence of disasters could be identified, long-term disaster forecasting could be realized. Based on the Earth Degassing Effect (EDE), which is valid, this paper took the magnitude, epicenter, and occurrence time of earthquakes, as well as the epicenter and occurrence time of rainstorm floods, as basic factors to establish an integrated model to study the correlation between rainstorm floods and earthquakes. The 2461 severe earthquakes that occurred in China or within 3000 km of China and the 169 heavy rainstorm floods that occurred in China over the past 200+ years were used as the input data of the model. The computational results showed that although most of the rainstorm floods have nothing to do with severe earthquakes from a statistical perspective, some floods might be related to earthquakes. This is especially true when the earthquakes happen in the vapor transmission zone where rainstorms lead to abundant water vapor. In this regard, earthquakes are more likely to cause big rainstorm floods. However, many cases of rainstorm floods could be found after severe earthquakes, with a large extent of uncertainty.

  1. Societal and economic impacts of flood hazards in Turkey – an overview

    Directory of Open Access Journals (Sweden)

    Koç Gamze

    2016-01-01

    Full Text Available Turkey has been severely affected by many natural hazards, in particular earthquakes and floods. Although there is a large body of literature on earthquake hazards and risks in Turkey, comparatively little is known about flood hazards and risks. Therefore, this study aims to investigate flood patterns and the societal and economic impacts of flood hazards in Turkey, as well as to provide a comparative overview of the temporal and spatial distribution of flood losses by analysing the EM-DAT (Emergency Events Database) and TABB (Turkey Disaster Data Base) databases on disaster losses throughout Turkey for the years 1960-2014. The comparison of these two databases reveals big mismatches in the flood data; e.g. the reported number of events, the number of affected people and the economic losses differ dramatically. This paper explores the reasons for these mismatches. Biases and fallacies in the loss data in the two databases are discussed as well. Since loss data collection is gaining more and more attention, e.g. in the Sendai Framework for Disaster Risk Reduction 2015-2030 (SFDRR), the study could offer groundwork for developing guidelines and procedures on how to standardize loss databases and implement them across other hazard events, and it provides substantial insights for flood risk mitigation and adaptation studies in Turkey as well as valuable insights for other (European) countries.

  2. Fault tree analysis for urban flooding

    NARCIS (Netherlands)

    Ten Veldhuis, J.A.E.; Clemens, F.H.L.R.; Van Gelder, P.H.A.J.M.

    2008-01-01

    Traditional methods to evaluate flood risk mostly focus on storm events as the main cause of flooding. Fault tree analysis is a technique that is able to model all potential causes of flooding and to quantify both the overall probability of flooding and the contributions of all causes of flooding to
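
    Although the record is truncated, the core fault-tree idea, combining the probabilities of basic causes through OR/AND gates, can be sketched as follows; the event names and probabilities are illustrative assumptions, not values from the study.

```python
# Minimal sketch of the fault-tree idea: combine basic causes of urban
# flooding through OR/AND gates. Event names and probabilities are
# illustrative assumptions, not values from the study.
from functools import reduce

def or_gate(probabilities):
    """P(at least one event occurs), assuming independent basic events."""
    return 1.0 - reduce(lambda acc, p: acc * (1.0 - p), probabilities, 1.0)

def and_gate(probabilities):
    """P(all events occur), assuming independent basic events."""
    return reduce(lambda acc, p: acc * p, probabilities, 1.0)

# Hypothetical annual probabilities of basic causes of urban flooding.
causes = {
    "rainfall exceeding sewer capacity": 0.02,
    "sewer blockage": 0.01,
    "storm AND pump station failure": and_gate([0.10, 0.05]),
}

p_flood = or_gate(causes.values())
print(f"overall probability of flooding: {p_flood:.4f}")
for name, p in causes.items():
    print(f"  rough share of '{name}': {p / p_flood:.1%}")
```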

  3. Big domains are novel Ca²+-binding modules: evidences from big domains of Leptospira immunoglobulin-like (Lig) proteins.

    Directory of Open Access Journals (Sweden)

    Rajeev Raman

    Full Text Available BACKGROUND: Many bacterial surface exposed proteins mediate the host-pathogen interaction more effectively in the presence of Ca²+. Leptospiral immunoglobulin-like (Lig) proteins, LigA and LigB, are surface exposed proteins containing Bacterial immunoglobulin-like (Big) domains. The function of proteins which contain the Big fold is not known. Based on the possible similarities of immunoglobulin and βγ-crystallin folds, we here explore the important question whether Ca²+ binds to a Big domain, which would provide a novel functional role of the proteins containing the Big fold. PRINCIPAL FINDINGS: We selected six individual Big domains for this study (three from the conserved part of LigA and LigB, denoted as Lig A3, Lig A4, and LigBCon5; two from the variable region of LigA, i.e., the 9(th) (Lig A9) and 10(th) repeats (Lig A10); and one from the variable region of LigB, i.e., LigBCen2). We have also studied the conserved region covering the three and six repeats (LigBCon1-3 and LigCon). All these proteins bind the calcium-mimic dye Stains-all. All the selected four domains bind Ca²+ with dissociation constants of 2-4 µM. Lig A9 and Lig A10 domains fold well with moderate thermal stability, have β-sheet conformation and form homodimers. Fluorescence spectra of Big domains show a specific doublet (at 317 and 330 nm), probably due to Trp interaction with a Phe residue. Equilibrium unfolding of selected Big domains is similar and follows a two-state model, suggesting the similarity in their fold. CONCLUSIONS: We demonstrate that the Lig proteins are Ca²+-binding proteins, with Big domains harbouring the binding motif. We conclude that despite differences in sequence, a Big motif binds Ca²+. This work thus sets up a strong possibility for classifying the proteins containing Big domains as a novel family of Ca²+-binding proteins. Since the Big domain is a part of many proteins in the bacterial kingdom, we suggest a possible function of these proteins via Ca²+ binding.

  4. Big domains are novel Ca²+-binding modules: evidences from big domains of Leptospira immunoglobulin-like (Lig) proteins.

    Science.gov (United States)

    Raman, Rajeev; Rajanikanth, V; Palaniappan, Raghavan U M; Lin, Yi-Pin; He, Hongxuan; McDonough, Sean P; Sharma, Yogendra; Chang, Yung-Fu

    2010-12-29

    Many bacterial surface exposed proteins mediate the host-pathogen interaction more effectively in the presence of Ca²+. Leptospiral immunoglobulin-like (Lig) proteins, LigA and LigB, are surface exposed proteins containing Bacterial immunoglobulin-like (Big) domains. The function of proteins which contain the Big fold is not known. Based on the possible similarities of immunoglobulin and βγ-crystallin folds, we here explore the important question whether Ca²+ binds to a Big domain, which would provide a novel functional role of the proteins containing the Big fold. We selected six individual Big domains for this study (three from the conserved part of LigA and LigB, denoted as Lig A3, Lig A4, and LigBCon5; two from the variable region of LigA, i.e., the 9(th) (Lig A9) and 10(th) repeats (Lig A10); and one from the variable region of LigB, i.e., LigBCen2). We have also studied the conserved region covering the three and six repeats (LigBCon1-3 and LigCon). All these proteins bind the calcium-mimic dye Stains-all. All the selected four domains bind Ca²+ with dissociation constants of 2-4 µM. Lig A9 and Lig A10 domains fold well with moderate thermal stability, have β-sheet conformation and form homodimers. Fluorescence spectra of Big domains show a specific doublet (at 317 and 330 nm), probably due to Trp interaction with a Phe residue. Equilibrium unfolding of selected Big domains is similar and follows a two-state model, suggesting the similarity in their fold. We demonstrate that the Lig proteins are Ca²+-binding proteins, with Big domains harbouring the binding motif. We conclude that despite differences in sequence, a Big motif binds Ca²+. This work thus sets up a strong possibility for classifying the proteins containing Big domains as a novel family of Ca²+-binding proteins. Since the Big domain is a part of many proteins in the bacterial kingdom, we suggest a possible function of these proteins via Ca²+ binding.

  5. Flood Impacts on People: from Hazard to Risk Maps

    Science.gov (United States)

    Arrighi, C.; Castelli, F.

    2017-12-01

    The mitigation of adverse consequences of floods on people is crucial for civil protection and public authorities. According to several studies, in developed countries the majority of flood-related fatalities occur due to inappropriate high-risk behaviours such as driving and walking in floodwaters. In this work the loss of stability of both vehicles and pedestrians in floodwaters is analysed. Flood hazard is evaluated based on (i) a 2D inundation model of an urban area, (ii) 3D hydrodynamic simulations of water flows around vehicles and the human body and (iii) a dimensional analysis of experimental activity. Exposure and vulnerability of vehicles and population are assessed exploiting several sources of open GIS data in order to produce risk maps for a test case study. The results show that a significant hazard to vehicles and pedestrians exists in the study area. Particularly high is the hazard to vehicles, which are likely to be swept away by the flood flow, possibly aggravating damage to structures and infrastructure and locally altering the flood propagation. The exposure and vulnerability analysis identifies some structures, such as schools and public facilities, which may attract several people. Moreover, some shopping facilities in the area, which attract both vehicular and pedestrian circulation, are located in the highest flood hazard zone. The application of the method demonstrates that, at the municipal level, such risk maps can support civil defence strategies and education for active citizenship, thus contributing to flood impact reduction for the population.
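
    The record describes 3D hydrodynamic simulations and a dimensional analysis; a common simplification of such stability criteria is a depth-velocity screening, sketched below with assumed thresholds that are not the authors' values.

```python
# Hedged sketch of a depth-velocity screening for pedestrian and vehicle
# stability in floodwaters. The thresholds are illustrative assumptions and
# are not the criteria derived in the study.
def hazard_class(depth_m: float, velocity_ms: float) -> dict:
    dv = depth_m * velocity_ms  # depth-velocity product (m^2/s)
    return {
        "depth_velocity_product": dv,
        "pedestrian_at_risk": dv > 0.6 or depth_m > 1.2,  # assumed thresholds
        "vehicle_at_risk": dv > 0.4 or depth_m > 0.5,     # assumed thresholds
    }

# Example grid cell from a 2D inundation model (illustrative values).
print(hazard_class(depth_m=0.4, velocity_ms=1.5))
```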

  6. Should seasonal rainfall forecasts be used for flood preparedness?

    Directory of Open Access Journals (Sweden)

    E. Coughlan de Perez

    2017-09-01

    Full Text Available In light of strong encouragement for disaster managers to use climate services for flood preparation, we question whether seasonal rainfall forecasts should indeed be used as indicators of the likelihood of flooding. Here, we investigate the primary indicators of flooding at the seasonal timescale across sub-Saharan Africa. Given the sparsity of hydrological observations, we input bias-corrected reanalysis rainfall into the Global Flood Awareness System to identify seasonal indicators of floodiness. Results demonstrate that in some regions of western, central, and eastern Africa with typically wet climates, even a perfect tercile forecast of seasonal total rainfall would provide little to no indication of the seasonal likelihood of flooding. The number of extreme events within a season shows the highest correlations with floodiness consistently across regions. Otherwise, results vary across climate regimes: floodiness in arid regions in southern and eastern Africa shows the strongest correlations with seasonal average soil moisture and seasonal total rainfall. Floodiness in wetter climates of western and central Africa and Madagascar shows the strongest relationship with measures of the intensity of seasonal rainfall. Measures of rainfall patterns, such as the length of dry spells, are least related to seasonal floodiness across the continent. Ultimately, identifying the drivers of seasonal flooding can be used to improve forecast information for flood preparedness and to avoid misleading decision-makers.

  7. Building regional early flood warning systems by AI techniques

    Science.gov (United States)

    Chang, F. J.; Chang, L. C.; Amin, M. Z. B. M.

    2017-12-01

    Building an early flood warning system is essential for protecting residents against flood hazards and for taking actions to mitigate losses. This study implements AI technology for forecasting multi-step-ahead regional flood inundation maps during storm events. The methodology includes three major schemes: (1) configuring the self-organizing map (SOM) to categorize a large number of regional inundation maps into a meaningful topology; (2) building dynamic neural networks to forecast multi-step-ahead average inundated depths (AID); and (3) adjusting the weights of the selected neuron in the constructed SOM based on the forecasted AID to obtain real-time regional inundation maps. The proposed models are trained and tested based on a large number of inundation data sets collected in the regions of the river basin with the most frequent and serious flooding. The results show that the SOM topological relationships between individual neurons and their neighbouring neurons are visible and clearly distinguishable, and that the hybrid model can continuously provide multi-step-ahead regional inundation maps with high resolution during storm events, with relatively small RMSE values and high R2 compared with numerical simulation data sets. The computing time is only a few seconds, which leads to real-time regional flood inundation forecasting and makes an early flood inundation warning system feasible. We demonstrate that the proposed hybrid ANN-based model has a robust and reliable predictive ability and can be used for early warning to mitigate flood disasters.
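
    A minimal sketch of the three-step scheme (SOM categorization, AID forecasting, neuron rescaling) is given below with synthetic data; the library choice (MiniSom, scikit-learn), network sizes and input features are assumptions, since the record does not specify them.

```python
# Minimal sketch of the three-step scheme described above, with synthetic data:
# (1) a self-organizing map (SOM) clusters regional inundation maps,
# (2) a neural network forecasts average inundated depth (AID) one step ahead,
# (3) the selected SOM neuron's weights are rescaled by the forecast AID to
#     recover a regional map. Libraries, sizes and features are assumptions.
import numpy as np
from minisom import MiniSom
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
n_maps, n_cells = 500, 400                       # inundation maps, grid cells
maps = rng.gamma(2.0, 0.3, (n_maps, n_cells))    # synthetic inundation depths

# (1) Categorize inundation maps on a small SOM topology.
som = MiniSom(4, 4, n_cells, sigma=1.0, learning_rate=0.5, random_seed=0)
som.train_random(maps, 2000)

# (2) Forecast AID from recent rainfall and current AID (illustrative inputs);
# a feed-forward net stands in for the paper's dynamic neural network.
aid = maps.mean(axis=1)
rainfall = aid * 2.0 + rng.normal(0.0, 0.1, n_maps)   # synthetic driver
features = np.column_stack([rainfall[:-1], aid[:-1]]) # inputs at time t
target = aid[1:]                                       # AID at time t+1
mlp = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
mlp.fit(features, target)

# (3) Rebuild a forecast map: pick the SOM neuron matching the latest map and
# rescale its weight vector so its mean equals the forecasted AID.
latest = maps[-1]
i, j = som.winner(latest)
neuron = som.get_weights()[i, j]
aid_forecast = mlp.predict([[rainfall[-1], aid[-1]]])[0]
forecast_map = neuron * (aid_forecast / neuron.mean())
print("forecast AID:", aid_forecast, "map cells:", forecast_map.shape)
```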

  8. The Financial Benefit of Early Flood Warnings in Europe

    Science.gov (United States)

    Pappenberger, Florian; Cloke, Hannah L.; Wetterhall, Fredrik; Parker, Dennis J.; Richardson, David; Thielen, Jutta

    2015-04-01

    Effective disaster risk management relies on science-based solutions to close the gap between prevention and preparedness measures. The outcomes of consultations on the UNISDR post-2015 framework for disaster risk reduction highlight the need for cross-border early warning systems to strengthen the preparedness phases of disaster risk management, in order to save people's lives and property and reduce the overall impact of severe events. In particular, continental and global scale flood forecasting systems provide vital information to various decision makers with which early warnings of floods can be made. Here the potential monetary benefits of early flood warnings using the example of the European Flood Awareness System (EFAS) are calculated based on pan-European flood damage data and calculations of potential flood damage reductions. The benefits are of the order of 400 Euro for every 1 Euro invested. Because of the uncertainties which accompany the calculation, a large sensitivity analysis is performed in order to develop an envelope of possible financial benefits. Current EFAS system skill is compared against perfect forecasts to demonstrate the importance of further improving the skill of the forecasts. Improving the response to warnings is also essential in reaping the benefits of flood early warnings.
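
    A minimal Monte Carlo sketch of such a sensitivity analysis is given below; all parameter ranges are illustrative assumptions and are not the EFAS figures behind the reported 400:1 ratio.

```python
# Monte Carlo sketch of a benefit-cost sensitivity analysis for early flood
# warnings. All parameter ranges are illustrative assumptions, not EFAS data.
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

annual_flood_damage = rng.uniform(3e9, 6e9, n)      # EUR/year (assumed range)
avoidable_fraction = rng.uniform(0.05, 0.35, n)     # damage avoidable given warning
warning_hit_rate = rng.uniform(0.5, 0.9, n)         # warnings issued and acted upon
response_effectiveness = rng.uniform(0.3, 0.8, n)   # share of avoidable damage saved
annual_system_cost = rng.uniform(2e6, 1e7, n)       # EUR/year (assumed range)

benefit = (annual_flood_damage * avoidable_fraction
           * warning_hit_rate * response_effectiveness)
bc_ratio = benefit / annual_system_cost

print("benefit-cost ratio (5th, 50th, 95th percentile):",
      np.percentile(bc_ratio, [5, 50, 95]).round(1))
```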

  9. Rethinking the relationship between flood risk perception and flood management.

    Science.gov (United States)

    Birkholz, S; Muro, M; Jeffrey, P; Smith, H M

    2014-04-15

    Although flood risk perceptions and their concomitant motivations for behaviour have long been recognised as significant features of community resilience in the face of flooding events, there has, for some time now, been a poorly appreciated fissure in the accompanying literature. Specifically, rationalist and constructivist paradigms in the broader domain of risk perception provide different (though not always conflicting) contexts for interpreting evidence and developing theory. This contribution reviews the major constructs that have been applied to understanding flood risk perceptions and contextualises these within broader conceptual developments around risk perception theory and contemporary thinking around flood risk management. We argue that there is a need to re-examine and re-invigorate flood risk perception research, in a manner that is comprehensively underpinned by more constructivist thinking around flood risk management as well as by developments in broader risk perception research. We draw attention to an historical over-emphasis on the cognitive perceptions of those at risk to the detriment of a richer understanding of a wider range of flood risk perceptions such as those of policy-makers or of tax-payers who live outside flood affected areas as well as the linkages between these perspectives and protective measures such as state-supported flood insurance schemes. Conclusions challenge existing understandings of the relationship between risk perception and flood management, particularly where the latter relates to communication strategies and the extent to which those at risk from flooding feel responsible for taking protective actions. Copyright © 2014 Elsevier B.V. All rights reserved.

  10. Flood Risk and Flood hazard maps - Visualisation of hydrological risks

    International Nuclear Information System (INIS)

    Spachinger, Karl; Dorner, Wolfgang; Metzka, Rudolf; Serrhini, Kamal; Fuchs, Sven

    2008-01-01

    Hydrological models are an important basis of flood forecasting and early warning systems. They provide significant data on hydrological risks. In combination with other modelling techniques, such as hydrodynamic models, they can be used to assess the extent and impact of hydrological events. The new European Flood Directive forces all member states to evaluate flood risk on a catchment scale, to compile maps of flood hazard and flood risk for prone areas, and to inform on a local level about these risks. Flood hazard and flood risk maps are important tools to communicate flood risk to different target groups. They provide compiled information to relevant public bodies such as water management authorities, municipalities, or civil protection agencies, but also to the broader public. For almost every section of a river basin, run-off and water levels can be defined based on the likelihood of annual recurrence, using a combination of hydrological and hydrodynamic models, supplemented by an analysis of historical records and mappings. In combination with data related to the vulnerability of a region, risk maps can be derived. The project RISKCATCH addressed these issues of hydrological risk and vulnerability assessment, focusing on the flood risk management process. Flood hazard maps and flood risk maps were compiled for Austrian and German test sites taking into account existing national and international guidelines. These maps were evaluated by eye-tracking using experimental graphic semiology. Sets of small-scale as well as large-scale risk maps were presented to test persons in order to (1) study reading behaviour as well as understanding and (2) deduce the most attractive components that are essential for target-oriented risk communication. A cognitive survey asking for negative and positive aspects and the complexity of each single map complemented the experimental graphic semiology. The results indicate how risk maps can be improved to fit the needs of different user

  11. Effects of climate variability on global scale flood risk

    Science.gov (United States)

    Ward, P.; Dettinger, M. D.; Kummu, M.; Jongman, B.; Sperna Weiland, F.; Winsemius, H.

    2013-12-01

    In this contribution we demonstrate the influence of climate variability on flood risk. Globally, flooding is one of the worst natural hazards in terms of economic damages; Munich Re estimates global losses in the last decade to be in excess of $240 billion. As a result, scientifically sound estimates of flood risk at the largest scales are increasingly needed by industry (including multinational companies and the insurance industry) and policy communities. Several assessments of global scale flood risk under current conditions have recently become available, and this year has seen the first studies assessing how flood risk may change in the future due to global change. However, the influence of climate variability on flood risk has as yet hardly been studied, despite the fact that: (a) in other fields (drought, hurricane damage, food production) this variability is as important for policy and practice as long-term change; and (b) climate variability has a strong influence on peak river flows around the world. To address this issue, this contribution illustrates the influence of ENSO-driven climate variability on flood risk, at both the globally aggregated scale and the scale of countries and large river basins. Although it exerts significant and widespread influences on flood peak discharges in many parts of the world, we show that ENSO does not have a statistically significant influence on flood risk once aggregated to global totals. At the scale of individual countries, though, strong relationships exist over large parts of the Earth's surface. For example, we find particularly strong anomalies of flood risk in El Niño or La Niña years (compared to all years) in southern Africa, parts of western Africa, Australia, parts of Central Eurasia (especially for El Niño), the western USA (especially for La Niña), and parts of South America. These findings have large implications for both decadal climate-risk projections and long-term future climate change
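
    The kind of country-level analysis described here can be sketched as grouping an annual flood indicator by ENSO phase and testing the anomaly; the sketch below uses synthetic data and an assumed non-parametric test, not the study's actual flood risk model output.

```python
# Sketch: compare a flood indicator in El Nino / La Nina years against the
# remaining years and test the anomaly. Data are synthetic; a real analysis
# would use modelled flood risk or observed peak discharges per year.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
n_years = 53
enso_phase = rng.choice(["el_nino", "la_nina", "neutral"], size=n_years,
                        p=[0.25, 0.25, 0.5])
damage = rng.lognormal(mean=2.0, sigma=0.8, size=n_years)
damage[enso_phase == "la_nina"] *= 1.6  # assumed amplification, for illustration

for phase in ("el_nino", "la_nina"):
    sel = damage[enso_phase == phase]
    anomaly = sel.mean() / damage.mean()
    stat, p = stats.mannwhitneyu(sel, damage[enso_phase != phase],
                                 alternative="two-sided")
    print(f"{phase}: mean anomaly x{anomaly:.2f}, Mann-Whitney p = {p:.3f}")
```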

  12. Towards a Flood Severity Index

    Science.gov (United States)

    Kettner, A.; Chong, A.; Prades, L.; Brakenridge, G. R.; Muir, S.; Amparore, A.; Slayback, D. A.; Poungprom, R.

    2017-12-01

    Flooding is the most common natural hazard worldwide, affecting 21 million people every year. In the immediate moments following a flood event, humanitarian actors like the World Food Program need to make rapid decisions (~72 hrs) on how to prioritize affected areas impacted by such an event. For other natural disasters like hurricanes/cyclones and earthquakes, there are industry-recognized standards on how the impacted areas are to be classified. Shake maps, quantifying peak ground motion, from for example the US Geological Survey, are widely used for assessing earthquakes. Similarly, cyclones are tracked by the Joint Typhoon Warning Center (JTWC) and the Global Disaster Alert and Coordination System (GDACS), who release storm nodes and tracks (forecasted and actual) with wind buffers and classify the event according to the Saffir-Simpson Hurricane Wind Scale. For floods, the community is usually able to acquire unclassified data of the flood extent as identified from satellite imagery. Most often no water discharge hydrograph is available to classify the event into recurrence intervals, simply because there is no gauging station, or the gauging station was unable to record the maximum discharge due to overtopping or flood damage. So, the question remains: how do we methodically turn a flooded area into classified areas of different gradations of impact? Here, we present a first approach towards developing a globally applicable flood severity index. The flood severity index is set up such that it considers relatively easily obtainable physical parameters in a short period of time, like flood frequency (relating the current flood to historical events) and magnitude, as well as land cover, slope, and, where available, pre-event simulated flood depth. The scale includes categories ranging from very minor flooding to catastrophic flooding. We test and evaluate the postulated classification scheme against a set of past flood events. Once a severity category is determined, socio
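
    A hedged sketch of such a severity scoring is given below; the weights, breakpoints and category boundaries are illustrative assumptions, not the postulated classification scheme itself.

```python
# Hedged sketch of a flood severity scoring combining frequency, magnitude,
# land cover, slope and (optional) modelled depth. Weights, breakpoints and
# category names are illustrative assumptions, not the proposed index.
def severity_index(return_period_yr, flooded_area_km2, urban_fraction,
                   mean_slope_deg, sim_depth_m=None):
    score = 0.0
    score += min(return_period_yr / 100.0, 1.0) * 3.0   # rarity of the event
    score += min(flooded_area_km2 / 1000.0, 1.0) * 2.0   # magnitude / extent
    score += urban_fraction * 3.0                         # exposed land cover
    score += 1.0 - min(mean_slope_deg / 10.0, 1.0)        # flat terrain drains slowly
    if sim_depth_m is not None:
        score += min(sim_depth_m / 3.0, 1.0) * 2.0        # pre-event simulated depth
    categories = ["very minor", "minor", "moderate", "severe", "catastrophic"]
    return categories[min(int(score // 2.2), len(categories) - 1)], round(score, 2)

print(severity_index(return_period_yr=50, flooded_area_km2=800,
                     urban_fraction=0.4, mean_slope_deg=1.5, sim_depth_m=1.2))
```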

  13. Flood Catastrophe Model for Designing Optimal Flood Insurance Program: Estimating Location-Specific Premiums in the Netherlands.

    Science.gov (United States)

    Ermolieva, T; Filatova, T; Ermoliev, Y; Obersteiner, M; de Bruijn, K M; Jeuken, A

    2017-01-01

    As flood risks grow worldwide, a well-designed insurance program engaging various stakeholders becomes a vital instrument in flood risk management. The main challenge concerns the applicability of standard approaches for calculating insurance premiums of rare catastrophic losses. This article focuses on the design of a flood-loss-sharing program involving private insurance based on location-specific exposures. The analysis is guided by a developed integrated catastrophe risk management (ICRM) model consisting of a GIS-based flood model and a stochastic optimization procedure with respect to location-specific risk exposures. To achieve the stability and robustness of the program towards floods with various recurrences, the ICRM uses a stochastic optimization procedure, which relies on quantile-related risk functions of a systemic insolvency involving overpayments and underpayments of the stakeholders. Two alternative ways of calculating insurance premiums are compared: the robust premiums derived with the ICRM and the traditional average annual loss approach. The applicability of the proposed model is illustrated in a case study of a Rotterdam area outside the main flood protection system in the Netherlands. Our numerical experiments demonstrate essential advantages of the robust premiums, namely, that they: (1) guarantee the program's solvency under all relevant flood scenarios rather than one average event; (2) establish a tradeoff between the security of the program and the welfare of locations; and (3) decrease the need for other risk transfer and risk reduction measures. © 2016 Society for Risk Analysis.
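
    A minimal sketch contrasting the two premium concepts is given below for a simulated portfolio; the loss distribution and the quantile rule are assumptions and do not reproduce the ICRM stochastic optimization.

```python
# Sketch: traditional average annual loss (AAL) premiums versus a
# quantile-related premium that keeps total premium income above the annual
# portfolio loss in 99% of years (a crude stand-in for the ICRM solvency
# constraint). Loss distributions and parameters are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(3)
n_years, n_locations = 20_000, 200

# Simulated annual losses: floods are rare but hit many locations in the same
# year when they occur (correlated events).
event_year = rng.random(n_years) < 0.02                        # ~1/50-year flood
event_severity = rng.lognormal(mean=0.0, sigma=0.5, size=n_years)
location_scale = rng.lognormal(mean=10.0, sigma=0.7, size=n_locations)
annual_portfolio_loss = np.where(
    event_year, event_severity * location_scale.sum(), 0.0)

aal_premium_total = annual_portfolio_loss.mean()               # traditional approach
robust_premium_total = np.quantile(annual_portfolio_loss, 0.99)  # quantile-related

print(f"AAL-based total premium:      {aal_premium_total:,.0f}")
print(f"quantile-based total premium: {robust_premium_total:,.0f}")
print("share of years with loss above AAL income:",
      np.mean(annual_portfolio_loss > aal_premium_total))
```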

  14. Quantifying riverine and storm-surge flood risk by single-family residence: application to Texas.

    Science.gov (United States)

    Czajkowski, Jeffrey; Kunreuther, Howard; Michel-Kerjan, Erwann

    2013-12-01

    The development of catastrophe models in recent years allows for assessment of the flood hazard much more effectively than when the federally run National Flood Insurance Program (NFIP) was created in 1968. We propose and then demonstrate a methodological approach to determine pure premiums based on the entire distribution of possible flood events. We apply hazard, exposure, and vulnerability analyses to a sample of 300,000 single-family residences in two counties in Texas (Travis and Galveston) using state-of-the-art flood catastrophe models. Even in zones of similar flood risk classification by FEMA there is substantial variation in exposure between coastal and inland flood risk. For instance, homes in the designated moderate-risk X500/B zones in Galveston are exposed to a flood risk on average 2.5 times greater than residences in X500/B zones in Travis. The results also show very similar average annual loss (corrected for exposure) for a number of residences despite their being in different FEMA flood zones. We also find significant storm-surge exposure outside of the FEMA designated storm-surge risk zones. Taken together these findings highlight the importance of a microanalysis of flood exposure. The process of aggregating risk at a flood zone level-as currently undertaken by FEMA-provides a false sense of uniformity. As our analysis indicates, the technology to delineate the flood risks exists today. © 2013 Society for Risk Analysis.

  15. Probabilistic Flood Defence Assessment Tools

    Directory of Open Access Journals (Sweden)

    Slomp Robert

    2016-01-01

    Full Text Available The WTI2017 project is responsible for the development of flood defence assessment tools for the 3600 km of Dutch primary flood defences: dikes/levees, dunes and hydraulic structures. These tools are necessary, as per January 1st 2017 the new flood risk management policy for the Netherlands will be implemented. Then, the seven decades old design practice (maximum water level methodology of 1958) and two decades old safety standards (and maximum hydraulic load methodology of 1996) will formally be replaced by a more risk-based approach for the national policy in flood risk management. The formal flood defence assessment is an important part of this new policy, especially for flood defence managers, since national and regional funding for reinforcement is based on this assessment. This new flood defence policy is based on a maximum allowable probability of flooding. For this, a maximum acceptable individual risk was determined at 1/100 000 per year; this is the probability of loss of life for every protected area in the Netherlands. Safety standards of flood defences were then determined based on this acceptable individual risk. The results were adjusted based on information from cost-benefit analysis, societal risk and large-scale societal disruption due to the failure of critical infrastructure, e.g. power stations. The resulting risk-based flood defence safety standards range from a 300 to a 100 000 year return period for failure. Two policy studies, WV21 (Safety from floods in the 21st century) and VNK-2 (the National Flood Risk in 2010), provided the essential information to determine the new risk-based safety standards for flood defences. The WTI2017 project will provide the safety assessment tools based on these new standards and is thus an essential element for the implementation of this policy change. A major issue to be tackled was the development of user-friendly tools, as the new assessment is to be carried out by personnel of the

  16. Temporal clustering of floods in Germany: Do flood-rich and flood-poor periods exist?

    Science.gov (United States)

    Merz, Bruno; Nguyen, Viet Dung; Vorogushyn, Sergiy

    2016-10-01

    The repeated occurrence of exceptional floods within a few years, such as the Rhine floods in 1993 and 1995 and the Elbe and Danube floods in 2002 and 2013, suggests that floods in Central Europe may be organized in flood-rich and flood-poor periods. This hypothesis is studied by testing the significance of temporal clustering in flood occurrence (peak-over-threshold) time series for 68 catchments across Germany for the period 1932-2005. To assess the robustness of the results, different methods are used: Firstly, the index of dispersion, which quantifies the departure from a homogeneous Poisson process, is investigated. Further, the time-variation of the flood occurrence rate is derived by non-parametric kernel implementation and the significance of clustering is evaluated via parametric and non-parametric tests. Although the methods give consistent overall results, the specific results differ considerably. Hence, we recommend applying different methods when investigating flood clustering. For flood estimation and risk management, it is of relevance to understand whether clustering changes with flood severity and time scale. To this end, clustering is assessed for different thresholds and time scales. It is found that the majority of catchments show temporal clustering at the 5% significance level for low thresholds and time scales of one to a few years. However, clustering decreases substantially with increasing threshold and time scale. We hypothesize that flood clustering in Germany is mainly caused by catchment memory effects along with intra- to inter-annual climate variability, and that decadal climate variability plays a minor role.
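
    A rough sketch of the first method mentioned above: the index of dispersion of annual peak-over-threshold flood counts, tested against a homogeneous Poisson process. The counts are invented and the chi-square approximation is only one of several ways to judge significance, so this is a simplified illustration rather than the authors' procedure.

        import numpy as np
        from scipy import stats

        # Hypothetical annual counts of peak-over-threshold floods (one value per year).
        counts = np.array([0, 2, 1, 0, 0, 3, 4, 0, 0, 1, 0, 0, 2, 3, 0, 0, 0, 1, 0, 2])

        mean = counts.mean()
        index_of_dispersion = counts.var(ddof=1) / mean   # close to 1 for a Poisson process

        # Approximate test: (n - 1) * I follows a chi-square distribution with n - 1
        # degrees of freedom if the counts come from a homogeneous Poisson process.
        n = len(counts)
        statistic = (n - 1) * index_of_dispersion
        p_value = stats.chi2.sf(statistic, df=n - 1)

        print(f"Index of dispersion: {index_of_dispersion:.2f}, p-value: {p_value:.3f}")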

  17. Determining the Financial Impact of Flood Hazards in Ungaged Basins

    Science.gov (United States)

    Cotterman, K. A.; Gutenson, J. L.; Pradhan, N. R.; Byrd, A.

    2017-12-01

    Many portions of the Earth lack adequate authoritative or in situ data that are of great value in determining natural hazard vulnerability from both anthropogenic and physical perspectives. Such locations include the majority of developing nations, which do not possess adequate warning systems and protective infrastructure. The lack of warning and protection from natural hazards makes these nations vulnerable to the destructive power of events such as floods. The goal of this research is to demonstrate an initial workflow with which to characterize flood financial hazards with global datasets and crowd-sourced, non-authoritative data in ungaged river basins. This workflow includes the hydrologic and hydraulic response of the watershed to precipitation, characterized by the physics-based modeling application Gridded Surface-Subsurface Hydrologic Analysis (GSSHA) model. In addition, data infrastructure and resources are available to approximate the human impact of flooding. Open source, volunteer geographic information (VGI) data can provide global coverage of elements at risk of flooding. Additional valuation mechanisms can then translate flood exposure into percentage and financial damage to each building. The combination of these tools allows the authors to remotely assess flood hazards with minimal computational, temporal, and financial overhead. This combination of deterministic and stochastic modeling provides the means to quickly characterize watershed flood vulnerability and will allow emergency responders and planners to better understand the implications of flooding, both spatially and financially. In either a planning, real-time, or forecasting scenario, the system will assist the user in understanding basin flood vulnerability and increasing community resiliency and preparedness.
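
    One step of the workflow described above, translating a modelled flood depth at each building into percentage and financial damage, might look roughly like the sketch below. The depth-damage curve, building depths and values are placeholders, not the valuation mechanisms used by the authors.

        import numpy as np

        # Placeholder residential depth-damage curve: flood depth (m) vs damage fraction.
        depths = np.array([0.0, 0.3, 0.6, 1.0, 2.0, 3.0])
        damage_fraction = np.array([0.0, 0.10, 0.25, 0.40, 0.60, 0.75])

        def building_damage(depth_m, value_usd):
            """Interpolate the depth-damage curve and convert to a monetary loss."""
            frac = np.interp(depth_m, depths, damage_fraction)
            return frac * value_usd

        # Hypothetical buildings, e.g. footprints taken from VGI with assumed values.
        buildings = [("house_1", 0.45, 80_000), ("house_2", 1.40, 120_000)]
        for name, depth, value in buildings:
            print(name, f"{building_damage(depth, value):,.0f} USD")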

  18. Reserve Special Flood Hazard Areas (SFHA)

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — This vector dataset depicts the 1% annual flood boundary (otherwise known as special flood hazard area or 100 year flood boundary) for its specified area. The data...

  19. Elephant Butte Special Flood Hazard Areas (SFHA)

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — This vector dataset depicts the 1% annual flood boundary (otherwise known as special flood hazard area or 100 year flood boundary) for its specified area. The data...

  20. The Aqueduct Global Flood Analyzer

    Science.gov (United States)

    Iceland, Charles

    2015-04-01

    As population growth and economic growth take place, and as climate change accelerates, many regions across the globe are finding themselves increasingly vulnerable to flooding. A recent OECD study of the exposure of the world's large port cities to coastal flooding found that 40 million people were exposed to a 1 in 100 year coastal flood event in 2005, and the total value of exposed assets was about US$3,000 billion, or 5% of global GDP. By the 2070s, those numbers were estimated to increase to 150 million people and US$35,000 billion, or roughly 9% of projected global GDP. Impoverished people in developing countries are particularly at risk because they often live in flood-prone areas and lack the resources to respond. WRI and its Dutch partners - Deltares, IVM-VU University Amsterdam, Utrecht University, and PBL Netherlands Environmental Assessment Agency - are in the initial stages of developing a robust set of river flood and coastal storm surge risk measures that show the extent of flooding under a variety of scenarios (both current and future), together with the projected human and economic impacts of these flood scenarios. These flood risk data and information will be accessible via an online, easy-to-use Aqueduct Global Flood Analyzer. We will also investigate the viability, benefits, and costs of a wide array of flood risk reduction measures that could be implemented in a variety of geographic and socio-economic settings. Together, the activities we propose have the potential for saving hundreds of thousands of lives and strengthening the resiliency and security of many millions more, especially those who are most vulnerable. Mr. Iceland will present Version 1.0 of the Aqueduct Global Flood Analyzer and provide a preview of additional elements of the Analyzer to be released in the coming years.

  1. Flood Risk, Flood Mitigation, and Location Choice: Evaluating the National Flood Insurance Program's Community Rating System.

    Science.gov (United States)

    Fan, Qin; Davlasheridze, Meri

    2016-06-01

    Climate change is expected to worsen the negative effects of natural disasters like floods. The negative impacts, however, can be mitigated by individuals' adjustments through migration and relocation behaviors. Previous literature has identified flood risk as one significant driver in relocation decisions, but no prior study examines the effect of the National Flood Insurance Program's voluntary program-the Community Rating System (CRS)-on residential location choice. This article fills this gap and tests the hypothesis that flood risk and the CRS-creditable flood control activities affect residential location choices. We employ a two-stage sorting model to empirically estimate the effects. In the first stage, individuals' risk perception and preference heterogeneity for the CRS activities are considered, while mean effects of flood risk and the CRS activities are estimated in the second stage. We then estimate heterogeneous marginal willingness to pay (WTP) for the CRS activities by category. Results show that age, ethnicity and race, educational attainment, and prior exposure to risk explain risk perception. We find significant values for the CRS-creditable mitigation activities, which provides empirical evidence for the benefits associated with the program. The marginal WTP for an additional credit point earned for public information activities, including hazard disclosure, is found to be the highest. Results also suggest that water amenities dominate flood risk. Thus, high amenity values may increase exposure to flood risk, and flood mitigation projects should be strategized in coastal regions accordingly. © 2015 Society for Risk Analysis.

  2. PAI-OFF: A new proposal for online flood forecasting in flash flood prone catchments

    Science.gov (United States)

    Schmitz, G. H.; Cullmann, J.

    2008-10-01

    The Process Modelling and Artificial Intelligence for Online Flood Forecasting (PAI-OFF) methodology combines the reliability of physically based, hydrologic/hydraulic modelling with the operational advantages of artificial intelligence. These operational advantages are extremely low computation times and straightforward operation. The basic principle of the methodology is to portray process models by means of ANN. We propose to train ANN flood forecasting models with synthetic data that reflects the possible range of storm events. To this end, establishing PAI-OFF requires first setting up a physically based hydrologic model of the considered catchment and - optionally, if backwater effects have a significant impact on the flow regime - a hydrodynamic flood routing model of the river reach in question. Both models are subsequently used for simulating all meaningful and flood relevant storm scenarios which are obtained from a catchment specific meteorological data analysis. This provides a database of corresponding input/output vectors which is then completed by generally available hydrological and meteorological data for characterizing the catchment state prior to each storm event. This database subsequently serves for training both a polynomial neural network (PoNN) - portraying the rainfall-runoff process - and a multilayer neural network (MLFN), which mirrors the hydrodynamic flood wave propagation in the river. These two ANN models replace the hydrological and hydrodynamic model in the operational mode. After presenting the theory, we apply PAI-OFF - essentially consisting of the coupled "hydrologic" PoNN and "hydrodynamic" MLFN - to the Freiberger Mulde catchment in the Erzgebirge (Ore-mountains) in East Germany (3000 km²). Both the demonstrated computational efficiency and the prediction reliability underline the potential of the new PAI-OFF methodology for online flood forecasting.
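
    The core idea, replacing process models by networks trained on a database of synthetic storm scenarios, can be illustrated with a toy surrogate. The "process model" below is a stand-in for the physically based rainfall-runoff model, and the small multilayer network is not the PoNN/MLFN pair used in PAI-OFF, so treat this only as a sketch of the training principle.

        import numpy as np
        from sklearn.neural_network import MLPRegressor
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(0)

        def process_model(storm_depth, storm_duration, soil_moisture):
            """Stand-in for the physically based model: returns a peak discharge."""
            return 0.8 * storm_depth * soil_moisture + 0.1 * storm_depth / storm_duration

        # Synthetic database of flood-relevant storm scenarios and catchment states:
        # columns are storm depth (mm), duration (h), antecedent soil moisture (-).
        X = rng.uniform([10.0, 1.0, 0.2], [200.0, 48.0, 1.0], size=(2000, 3))
        y = process_model(X[:, 0], X[:, 1], X[:, 2])

        X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
        surrogate = make_pipeline(StandardScaler(),
                                  MLPRegressor(hidden_layer_sizes=(32, 32),
                                               max_iter=3000, random_state=0))
        surrogate.fit(X_train, y_train)
        print("Surrogate R^2 on held-out scenarios:", surrogate.score(X_test, y_test))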

  3. Surrogate modeling of joint flood risk across coastal watersheds

    Science.gov (United States)

    Bass, Benjamin; Bedient, Philip

    2018-03-01

    This study discusses the development and performance of a rapid prediction system capable of representing the joint rainfall-runoff and storm surge flood response of tropical cyclones (TCs) for probabilistic risk analysis. Due to the computational demand required for accurately representing storm surge with the high-fidelity ADvanced CIRCulation (ADCIRC) hydrodynamic model and its coupling with additional numerical models to represent rainfall-runoff, a surrogate or statistical model was trained to represent the relationship between hurricane wind- and pressure-field characteristics and their peak joint flood response typically determined from physics-based numerical models. This builds upon past studies that have only evaluated surrogate models for predicting peak surge, and provides the first system capable of probabilistically representing joint flood levels from TCs. The utility of this joint flood prediction system is then demonstrated by improving upon probabilistic TC flood risk products, which currently account for storm surge but do not take into account TC associated rainfall-runoff. Results demonstrate the source apportionment of rainfall-runoff versus storm surge and highlight that slight increases in flood risk levels may occur due to the interaction between rainfall-runoff and storm surge as compared to the Federal Emergency Management Agency's (FEMA's) current practices.

  4. [Big data in official statistics].

    Science.gov (United States)

    Zwick, Markus

    2015-08-01

    The concept of "big data" stands to change the face of official statistics over the coming years, having an impact on almost all aspects of data production. The tasks of future statisticians will not necessarily be to produce new data, but rather to identify and make use of existing data to adequately describe social and economic phenomena. Until big data can be used correctly in official statistics, a lot of questions need to be answered and problems solved: the quality of data, data protection, privacy, and sustainable availability are some of the more pressing issues to be addressed. The essential skills of official statisticians will undoubtedly change, and this implies a number of challenges to be faced by statistical education systems, in universities, and inside the statistical offices. The national statistical offices of the European Union have adopted a concrete strategy for exploring the possibilities of big data for official statistics, by means of the Big Data Roadmap and Action Plan 1.0. This is an important first step and will have a significant influence on implementing the concept of big data inside the statistical offices of Germany.

  5. GEOSS: Addressing Big Data Challenges

    Science.gov (United States)

    Nativi, S.; Craglia, M.; Ochiai, O.

    2014-12-01

    In the sector of Earth Observation, the explosion of data is due to many factors including: new satellite constellations, the increased capabilities of sensor technologies, social media, crowdsourcing, and the need for multidisciplinary and collaborative research to face Global Changes. In this area, there are many expectations and concerns about Big Data. Vendors have attempted to use this term for their commercial purposes. It is necessary to understand whether Big Data is a radical shift or an incremental change for the existing digital infrastructures. This presentation tries to explore and discuss the impact of Big Data challenges and new capabilities on the Global Earth Observation System of Systems (GEOSS) and particularly on its common digital infrastructure called GCI. GEOSS is a global and flexible network of content providers allowing decision makers to access an extraordinary range of data and information at their desk. The impact of the Big Data dimensionalities (commonly known as 'V' axes: volume, variety, velocity, veracity, visualization) on GEOSS is discussed. The main solutions and experimentation developed by GEOSS along these axes are introduced and analyzed. GEOSS is a pioneering framework for global and multidisciplinary data sharing in the Earth Observation realm; its experience on Big Data is valuable for the many lessons learned.

  6. Official statistics and Big Data

    Directory of Open Access Journals (Sweden)

    Peter Struijs

    2014-07-01

    Full Text Available The rise of Big Data changes the context in which organisations producing official statistics operate. Big Data provides opportunities, but in order to make optimal use of Big Data, a number of challenges have to be addressed. This stimulates increased collaboration between National Statistical Institutes, Big Data holders, businesses and universities. In time, this may lead to a shift in the role of statistical institutes in the provision of high-quality and impartial statistical information to society. In this paper, the changes in context, the opportunities, the challenges and the way to collaborate are addressed. The collaboration between the various stakeholders will involve each partner building on and contributing different strengths. For national statistical offices, traditional strengths include, on the one hand, the ability to collect data and combine data sources with statistical products and, on the other hand, their focus on quality, transparency and sound methodology. In the Big Data era of competing and multiplying data sources, they continue to have a unique knowledge of official statistical production methods. And their impartiality and respect for privacy as enshrined in law uniquely position them as a trusted third party. Based on this, they may advise on the quality and validity of information of various sources. By thus positioning themselves, they will be able to play their role as key information providers in a changing society.

  7. Big data for bipolar disorder.

    Science.gov (United States)

    Monteith, Scott; Glenn, Tasha; Geddes, John; Whybrow, Peter C; Bauer, Michael

    2016-12-01

    The delivery of psychiatric care is changing with a new emphasis on integrated care, preventative measures, population health, and the biological basis of disease. Fundamental to this transformation are big data and advances in the ability to analyze these data. The impact of big data on the routine treatment of bipolar disorder today and in the near future is discussed, with examples that relate to health policy, the discovery of new associations, and the study of rare events. The primary sources of big data today are electronic medical records (EMR), claims, and registry data from providers and payers. In the near future, data created by patients from active monitoring, passive monitoring of Internet and smartphone activities, and from sensors may be integrated with the EMR. Diverse data sources from outside of medicine, such as government financial data, will be linked for research. Over the long term, genetic and imaging data will be integrated with the EMR, and there will be more emphasis on predictive models. Many technical challenges remain when analyzing big data that relates to size, heterogeneity, complexity, and unstructured text data in the EMR. Human judgement and subject matter expertise are critical parts of big data analysis, and the active participation of psychiatrists is needed throughout the analytical process.

  8. Quantum fields in a big-crunch-big-bang spacetime

    International Nuclear Information System (INIS)

    Tolley, Andrew J.; Turok, Neil

    2002-01-01

    We consider quantum field theory on a spacetime representing the big-crunch-big-bang transition postulated in ekpyrotic or cyclic cosmologies. We show via several independent methods that an essentially unique matching rule holds connecting the incoming state, in which a single extra dimension shrinks to zero, to the outgoing state in which it reexpands at the same rate. For free fields in our construction there is no particle production from the incoming adiabatic vacuum. When interactions are included the particle production for fixed external momentum is finite at the tree level. We discuss a formal correspondence between our construction and quantum field theory on de Sitter spacetime

  9. Poker Player Behavior After Big Wins and Big Losses

    OpenAIRE

    Gary Smith; Michael Levere; Robert Kurtzman

    2009-01-01

    We find that experienced poker players typically change their style of play after winning or losing a big pot--most notably, playing less cautiously after a big loss, evidently hoping for lucky cards that will erase their loss. This finding is consistent with Kahneman and Tversky's (Kahneman, D., A. Tversky. 1979. Prospect theory: An analysis of decision under risk. Econometrica 47(2) 263-292) break-even hypothesis and suggests that when investors incur a large loss, it might be time to take ...

  10. Turning big bang into big bounce: II. Quantum dynamics

    Energy Technology Data Exchange (ETDEWEB)

    Malkiewicz, Przemyslaw; Piechocki, Wlodzimierz, E-mail: pmalk@fuw.edu.p, E-mail: piech@fuw.edu.p [Theoretical Physics Department, Institute for Nuclear Studies, Hoza 69, 00-681 Warsaw (Poland)

    2010-11-21

    We analyze the big bounce transition of the quantum Friedmann-Robertson-Walker model in the setting of the nonstandard loop quantum cosmology (LQC). Elementary observables are used to quantize composite observables. The spectrum of the energy density operator is bounded and continuous. The spectrum of the volume operator is bounded from below and discrete. It has equally distant levels defining a quantum of the volume. The discreteness may imply a foamy structure of spacetime at a semiclassical level which may be detected in astro-cosmo observations. The nonstandard LQC method has a free parameter that should be fixed in some way to specify the big bounce transition.

  11. Floods in the Saguenay

    International Nuclear Information System (INIS)

    Martel, R.; Michaud, E.; Tousignant, P.M.

    1997-01-01

    Footage of a natural disaster that occurred between July 20 and 25, 1996, in the Saguenay region of Quebec was documented. A heavy downpour of rain raised the water level of the Kenogami Lake reservoir beyond its capacity. This created huge pressure on its dam that upset the fragile balance between nature and rock. The dam ruptured, resulting in a flood of previously unseen proportions. The Riviere au Sable in Jonquiere became an overwhelming body of water. The video showed how the shores of the river were eroded and how apartment buildings were engulfed by the torrent of water. A newly constructed electricity power plant had to be decommissioned, roads were washed away and entire neighborhoods were devastated. The devastation suffered by the cities of Chicoutimi, Jonquiere, Ville de la Baie, Ferland-Boileau, and L'Anse St-Jean was recorded. Thousands of victims of the disaster were evacuated with the help of the Canadian Armed Forces. Some of the reconstruction work, begun even before the floodwaters had fully receded and involving the restoration of roads, bridges and communication networks, was also shown

  12. Big Data Analytics in Medicine and Healthcare.

    Science.gov (United States)

    Ristevski, Blagoj; Chen, Ming

    2018-05-10

    This paper surveys big data, highlighting big data analytics in medicine and healthcare. The big data characteristics value, volume, velocity, variety, veracity and variability are described. Big data analytics in medicine and healthcare covers the integration and analysis of large amounts of complex heterogeneous data such as various omics data (genomics, epigenomics, transcriptomics, proteomics, metabolomics, interactomics, pharmacogenomics, diseasomics), biomedical data and electronic health records data. We underline the challenging issues of big data privacy and security. Regarding the big data characteristics, some directions for using suitable and promising open-source distributed data processing software platforms are given.

  13. Priming the Pump for Big Data at Sentara Healthcare.

    Science.gov (United States)

    Kern, Howard P; Reagin, Michael J; Reese, Bertram S

    2016-01-01

    Today's healthcare organizations are facing significant demands with respect to managing population health, demonstrating value, and accepting risk for clinical outcomes across the continuum of care. The patient's environment outside the walls of the hospital and physician's office, and outside the electronic health record (EHR), has a substantial impact on clinical care outcomes. The use of big data is key to understanding factors that affect the patient's health status and enhancing clinicians' ability to anticipate how the patient will respond to various therapies. Big data is essential to delivering sustainable, high-quality, value-based healthcare, as well as to the success of new models of care such as clinically integrated networks (CINs) and accountable care organizations. Sentara Healthcare, based in Norfolk, Virginia, has been an early adopter of the technologies that have readied us for our big data journey: EHRs, telehealth-supported electronic intensive care units, and telehealth primary care support through MDLIVE. Although we would not say Sentara is at the cutting edge of the big data trend, it certainly is among the fast followers. Use of big data in healthcare is still at an early stage compared with other industries. Tools for data analytics are maturing, but traditional challenges such as heightened data security and limited human resources remain the primary focus for regional health systems to improve care and reduce costs. Sentara primarily makes actionable use of big data in our CIN, Sentara Quality Care Network, and at our health plan, Optima Health. Big data projects can be expensive, and justifying the expense organizationally has often been easier in times of crisis. We have developed an analytics strategic plan separate from but aligned with corporate system goals to ensure optimal investment and management of this essential asset.

  14. Flood risk management in Italy

    DEFF Research Database (Denmark)

    Mysiak, J.; Testella, F.; Bonaiuto, M.

    2013-01-01

    Italy's recent history is punctuated with devastating flood disasters claiming a high death toll and causing vast but underestimated economic, social and environmental damage. The responses to major flood and landslide disasters such as the Polesine (1951), Vajont (1963), Firenze (1966), Valtelina

  15. The influence of climate change on flood risks in France ­- first estimates and uncertainty analysis

    OpenAIRE

    Dumas, Patrice; Hallegatte, Stéphane; Quintana-Seguí, Pere; Martin, Eric

    2013-01-01

    This paper proposes a methodology to project the possible evolution of river flood damages due to climate change, and applies it to mainland France. Its main contributions are (i) to demonstrate a methodology to investigate the full causal chain from global climate change to local economic flood losses; (ii) to show that future flood losses may change in a very significant manner over France; (iii) to show that a very large uncertainty arises from the climate...

  16. Internal flooding analyses results of Slovak NPPs

    International Nuclear Information System (INIS)

    Sopira, Vladimir

    2000-01-01

    The assessment of the flood risk was the objective of the internal flooding analysis for NPPs Bohunice V1, V2 and Mochovce. All important flooding sources were identified. The rooms containing safety important components were analyzed from the point of view of: Integrity of flood boundaries; Capability for drainage; Flood signalisation; Flood localization and liquidation; Vulnerability of safety system components. The redundancies of safety systems are located mostly separately and no flood can endanger more than a single train. It can be concluded that NPPs with WWER-440 are very safe against the flooding initiating event

  17. Detecting Flood Variations in Shanghai over 1949–2009 with Mann-Kendall Tests and a Newspaper-Based Database

    Directory of Open Access Journals (Sweden)

    Shiqiang Du

    2015-04-01

    Full Text Available A valuable aid to assessing and managing flood risk lies in a reliable database of historical floods. In this study, a newspaper-based flood database for Shanghai (NFDS for the period 1949–2009 was developed through a systematic scanning of newspapers. After calibration and validation of the database, Mann-Kendall tests and correlation analysis were applied to detect possible changes in flood frequencies. The analysis was carried out for three different flood types: overbank flood, agricultural waterlogging, and urban waterlogging. The compiled NFDS registered 146 floods and 92% of them occurred in the flood-prone season from June to September. The statistical analyses showed that both the annual flood and the floods in June–August increased significantly. Urban waterlogging showed a very strong increasing trend, probably because of insufficient capacity of urban drainage system and impacts of rapid urbanization. By contrast, the decrease in overbank flooding and the slight increase in agricultural waterlogging were likely because of the construction of river levees and seawalls and the upgrade of agricultural drainage systems, respectively. This study demonstrated the usefulness of local newspapers in building a historical flood database and in assessing flood characterization.
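
    A minimal sketch of the Mann-Kendall trend test applied to a series of annual flood counts (as could be derived from a newspaper-based database); the counts below are invented, and the simplified variance formula ignores the correction for tied values.

        import numpy as np
        from scipy import stats

        def mann_kendall(series):
            """Mann-Kendall trend test (simplified: no correction for ties)."""
            x = np.asarray(series, dtype=float)
            n = len(x)
            # S statistic: sum of signs of all pairwise forward differences.
            s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
            var_s = n * (n - 1) * (2 * n + 5) / 18.0
            if s > 0:
                z = (s - 1) / np.sqrt(var_s)
            elif s < 0:
                z = (s + 1) / np.sqrt(var_s)
            else:
                z = 0.0
            p = 2 * stats.norm.sf(abs(z))   # two-sided p-value
            return z, p

        # Hypothetical annual flood counts.
        counts = [0, 1, 0, 2, 1, 1, 2, 3, 2, 4, 3, 5, 4, 6, 5, 7]
        print("z = %.2f, p = %.4f" % mann_kendall(counts))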

  18. Simple Method for Assessing Spread of Flood Prone Areas under Historical and Future Rainfall in the Upper Citarum Watershed

    Directory of Open Access Journals (Sweden)

    Bambang Dwi Dasanto

    2014-06-01

    Full Text Available From 1931 to 2010 the flood frequency in the Upper Citarum Watershed increased sharply, indicating a decline in watershed quality. With climate change, the flood risk may get worse. This study aims to determine the effective rainfall that caused flooding and to evaluate the impact of future rainfall changes on the flood prone areas. Effective rainfall, which contributes to direct runoff (DRO) and leads to flooding, was determined using a regression equation relating the DRO to the cumulative rainfall of a number of consecutive days. Mapping of the flood prone areas was developed using GIS techniques. Results showed that the effective rainfall which caused flooding was the rainfall accumulation over four consecutive days before the occurrence of the DRO peak. The percentage of agreement between estimated and actual flood maps was about 76.9%. According to the historical rainfall, the flood prone areas spread to the right and left of the Upstream Citarum River. If this area experiences climate change, the flood frequency and extent will increase. This study can only identify locations and the possibility of flood occurrence; it cannot precisely demonstrate the spread of flood inundation. However, this simple approach can evaluate the flood frequency and intensity quite well.
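
    The step of relating direct runoff to cumulative rainfall over a number of consecutive days can be sketched as below; the rainfall and runoff series are synthetic, and picking the accumulation window with the highest coefficient of determination is only in the spirit of, not identical to, the regression used in the study.

        import numpy as np

        rng = np.random.default_rng(1)

        # Synthetic daily rainfall (mm) and a direct runoff series that, by construction,
        # responds to the 4-day cumulative rainfall plus noise.
        rain = rng.gamma(shape=0.5, scale=20.0, size=400)
        dro = 0.3 * np.convolve(rain, np.ones(4), mode="full")[:len(rain)]
        dro += rng.normal(0.0, 5.0, size=len(rain))

        def r_squared(x, y):
            slope, intercept = np.polyfit(x, y, 1)
            resid = y - (slope * x + intercept)
            return 1.0 - resid.var() / y.var()

        # Try cumulative rainfall over k consecutive days, k = 1..7, keep the best fit.
        scores = {k: r_squared(np.convolve(rain, np.ones(k), mode="full")[:len(rain)], dro)
                  for k in range(1, 8)}
        best_k = max(scores, key=scores.get)
        print("Best accumulation window:", best_k, "days, R^2 =", round(scores[best_k], 3))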

  19. Big Data Analytics in Healthcare.

    Science.gov (United States)

    Belle, Ashwin; Thiagarajan, Raghuram; Soroushmehr, S M Reza; Navidi, Fatemeh; Beard, Daniel A; Najarian, Kayvan

    2015-01-01

    The rapidly expanding field of big data analytics has started to play a pivotal role in the evolution of healthcare practices and research. It has provided tools to accumulate, manage, analyze, and assimilate large volumes of disparate, structured, and unstructured data produced by current healthcare systems. Big data analytics has been recently applied towards aiding the process of care delivery and disease exploration. However, the adoption rate and research development in this space is still hindered by some fundamental problems inherent within the big data paradigm. In this paper, we discuss some of these major challenges with a focus on three upcoming and promising areas of medical research: image, signal, and genomics based analytics. Recent research which targets utilization of large volumes of medical data while combining multimodal data from disparate sources is discussed. Potential areas of research within this field which have the ability to provide meaningful impact on healthcare delivery are also examined.

  20. The challenges of big data.

    Science.gov (United States)

    Mardis, Elaine R

    2016-05-01

    The largely untapped potential of big data analytics is a feeding frenzy that has been fueled by the production of many next-generation-sequencing-based data sets that are seeking to answer long-held questions about the biology of human diseases. Although these approaches are likely to be a powerful means of revealing new biological insights, there are a number of substantial challenges that currently hamper efforts to harness the power of big data. This Editorial outlines several such challenges as a means of illustrating that the path to big data revelations is paved with perils that the scientific community must overcome to pursue this important quest. © 2016. Published by The Company of Biologists Ltd.

  1. Big endothelin changes the cellular miRNA environment in TMOb osteoblasts and increases mineralization.

    Science.gov (United States)

    Johnson, Michael G; Kristianto, Jasmin; Yuan, Baozhi; Konicke, Kathryn; Blank, Robert

    2014-08-01

    Endothelin (ET1) promotes the growth of osteoblastic breast and prostate cancer metastases. Conversion of big ET1 to mature ET1, catalyzed primarily by endothelin converting enzyme 1 (ECE1), is necessary for ET1's biological activity. We previously identified the Ece1 locus as a positional candidate gene for a pleiotropic quantitative trait locus affecting femoral size, shape, mineralization, and biomechanical performance. We exposed TMOb osteoblasts continuously to 25 ng/ml big ET1. Cells were grown for 6 days in growth medium and then switched to mineralization medium for an additional 15 days with or without big ET1, by which time the TMOb cells form mineralized nodules. We quantified mineralization by alizarin red staining and analyzed levels of miRNAs known to affect osteogenesis. MicroRNA 126-3p was identified through a search as a potential regulator of sclerostin (SOST) translation. TMOb cells exposed to big ET1 showed greater mineralization than control cells. Big ET1 repressed miRNAs targeting transcripts of osteogenic proteins. Big ET1 increased expression of miRNAs that target transcripts of proteins that inhibit osteogenesis. Big ET1 increased expression of 126-3p 121-fold versus control. To begin to assess the effect of big ET1 on SOST production, we analyzed both SOST transcription and protein production with and without big ET1, demonstrating that transcription and translation were uncoupled. Our data show that big ET1 signaling promotes mineralization. Moreover, the results suggest that big ET1's osteogenic effects are potentially mediated through changes in miRNA expression, a previously unrecognized big ET1 osteogenic mechanism.

  2. Big Sky Carbon Sequestration Partnership

    Energy Technology Data Exchange (ETDEWEB)

    Susan M. Capalbo

    2005-11-01

    The Big Sky Carbon Sequestration Partnership, led by Montana State University, is comprised of research institutions, public entities and private sectors organizations, and the Confederated Salish and Kootenai Tribes and the Nez Perce Tribe. Efforts under this Partnership in Phase I fall into four areas: evaluation of sources and carbon sequestration sinks that will be used to determine the location of pilot demonstrations in Phase II; development of GIS-based reporting framework that links with national networks; designing an integrated suite of monitoring, measuring, and verification technologies and assessment frameworks; and initiating a comprehensive education and outreach program. The groundwork is in place to provide an assessment of storage capabilities for CO2 utilizing the resources found in the Partnership region (both geological and terrestrial sinks), that would complement the ongoing DOE research agenda in Carbon Sequestration. The region has a diverse array of geological formations that could provide storage options for carbon in one or more of its three states. Likewise, initial estimates of terrestrial sinks indicate a vast potential for increasing and maintaining soil C on forested, agricultural, and reclaimed lands. Both options include the potential for offsetting economic benefits to industry and society. Steps have been taken to assure that the GIS-based framework is consistent among types of sinks within the Big Sky Partnership area and with the efforts of other DOE regional partnerships. The Partnership recognizes the critical importance of measurement, monitoring, and verification technologies to support not only carbon trading but all policies and programs that DOE and other agencies may want to pursue in support of GHG mitigation. The efforts in developing and implementing MMV technologies for geological sequestration reflect this concern. Research is also underway to identify and validate best management practices for soil C in the

  3. The Big Fish Down Under: Examining Moderators of the "Big-Fish-Little-Pond" Effect for Australia's High Achievers

    Science.gov (United States)

    Seaton, Marjorie; Marsh, Herbert W.; Yeung, Alexander Seeshing; Craven, Rhonda

    2011-01-01

    Big-fish-little-pond effect (BFLPE) research has demonstrated that academic self-concept is negatively affected by attending high-ability schools. This article examines data from large, representative samples of 15-year-olds from each Australian state, based on the three Program for International Student Assessment (PISA) databases that focus on…

  4. Developing a Malaysia flood model

    Science.gov (United States)

    Haseldine, Lucy; Baxter, Stephen; Wheeler, Phil; Thomson, Tina

    2014-05-01

    Faced with growing exposures in Malaysia, insurers have a need for models to help them assess their exposure to flood losses. The need for an improved management of flood risks has been further highlighted by the 2011 floods in Thailand and recent events in Malaysia. The increasing demand for loss accumulation tools in Malaysia has led to the development of the first nationwide probabilistic Malaysia flood model, which we present here. The model is multi-peril, including river flooding for thousands of kilometres of river and rainfall-driven surface water flooding in major cities, which may cause losses equivalent to river flood in some high-density urban areas. The underlying hazard maps are based on a 30m digital surface model (DSM) and 1D/2D hydraulic modelling in JFlow and RFlow. Key mitigation schemes such as the SMART tunnel and drainage capacities are also considered in the model. The probabilistic element of the model is driven by a stochastic event set based on rainfall data, hence enabling per-event and annual figures to be calculated for a specific insurance portfolio and a range of return periods. Losses are estimated via depth-damage vulnerability functions which link the insured damage to water depths for different property types in Malaysia. The model provides a unique insight into Malaysian flood risk profiles and provides insurers with return period estimates of flood damage and loss to property portfolios through loss exceedance curve outputs. It has been successfully validated against historic flood events in Malaysia and is now being successfully used by insurance companies in the Malaysian market to obtain reinsurance cover.
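
    The last step described above, turning losses from a stochastic event set into return-period estimates via a loss exceedance curve, can be sketched as follows; the simulated annual losses are purely illustrative and stand in for portfolio losses aggregated per simulated year.

        import numpy as np

        rng = np.random.default_rng(2)

        # Hypothetical portfolio losses for 10,000 simulated years (one aggregate loss per year).
        annual_losses = rng.lognormal(mean=12.0, sigma=1.2, size=10_000)

        # Empirical loss exceedance curve: probability that the annual loss exceeds a value.
        sorted_losses = np.sort(annual_losses)[::-1]
        exceed_prob = np.arange(1, len(sorted_losses) + 1) / (len(sorted_losses) + 1)

        for rp in (10, 100, 200, 500):
            # Loss whose annual exceedance probability is 1/rp, read off the curve.
            loss = np.interp(1.0 / rp, exceed_prob, sorted_losses)
            print(f"{rp:>4}-year loss: {loss:,.0f}")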

  5. Big Book of Windows Hacks

    CERN Document Server

    Gralla, Preston

    2008-01-01

    Bigger, better, and broader in scope, the Big Book of Windows Hacks gives you everything you need to get the most out of your Windows Vista or XP system, including its related applications and the hardware it runs on or connects to. Whether you want to tweak Vista's Aero interface, build customized sidebar gadgets and run them from a USB key, or hack the "unhackable" screensavers, you'll find quick and ingenious ways to bend these recalcitrant operating systems to your will. The Big Book of Windows Hacks focuses on Vista, the new bad boy on Microsoft's block, with hacks and workarounds that

  6. Sosiaalinen asiakassuhdejohtaminen ja big data

    OpenAIRE

    Toivonen, Topi-Antti

    2015-01-01

    This thesis discusses social customer relationship management and the benefits that big data can bring to it. As a term, social customer relationship management is new and unfamiliar to many. The study is motivated by the scarcity of research on the topic, the complete absence of Finnish-language research, and the potentially significant role of social customer relationship management in companies' operations in the future. Studies on big data often concentrate on its technical side, and not on applicat...

  7. Do big gods cause anything?

    DEFF Research Database (Denmark)

    Geertz, Armin W.

    2014-01-01

    This is a contribution to a review symposium on Ara Norenzayan's book Big Gods: How Religion Transformed Cooperation and Conflict (Princeton University Press 2013). The book is fascinating but problematic with respect to causality, atheism and stereotypes about hunter-gatherers.

  8. Big Data and Social Media

    CERN Multimedia

    CERN. Geneva

    2018-01-01

    A critical analysis of the "keep everything" Big Data era, the impact on our lives of the information, at first glance "convenient for future use" that we make known about ourselves on the network. NB! The lecture will be recorded like all Academic Training lectures. Lecturer's biography: Father of the Internet, see https://internethalloffame.org/inductees/vint-cerf or https://en.wikipedia.org/wiki/Vint_Cerf The video on slide number 9 is from page https://www.gapminder.org/tools/#$state$time$value=2018&value;;&chart-type=bubbles   Keywords: Big Data, Internet, History, Applications, tools, privacy, technology, preservation, surveillance, google, Arpanet, CERN, Web  

  9. Baryon symmetric big bang cosmology

    International Nuclear Information System (INIS)

    Stecker, F.W.

    1978-01-01

    It is stated that the framework of baryon symmetric big bang (BSBB) cosmology offers our greatest potential for deducing the evolution of the Universe because its physical laws and processes have the minimum number of arbitrary assumptions about initial conditions in the big-bang. In addition, it offers the possibility of explaining the photon-baryon ratio in the Universe and how galaxies and galaxy clusters are formed. BSBB cosmology also provides the only acceptable explanation at present for the origin of the cosmic γ-ray background radiation. (author)

  10. Release plan for Big Pete

    International Nuclear Information System (INIS)

    Edwards, T.A.

    1996-11-01

    This release plan is to provide instructions for the Radiological Control Technician (RCT) to conduct surveys for the unconditional release of ''Big Pete,'' which was used in the removal of ''Spacers'' from the N-Reactor. Prior to performing surveys on the rear end portion of ''Big Pete,'' it shall be cleaned (i.e., free of oil, grease, caked soil, heavy dust). If no contamination is found, the vehicle may be released with the permission of the area RCT Supervisor. If contamination is found by any of the surveys, contact the cognizant Radiological Engineer for decontamination instructions

  11. Small quarks make big nuggets

    International Nuclear Information System (INIS)

    Deligeorges, S.

    1985-01-01

    After a brief recall of the classification of subatomic particles, this paper deals with quark nuggets: particles with more than three quarks in a big bag, which is called a ''nuclearite''. Neutron stars, in fact, are big sacks of quarks, gigantic nuggets. Physicists now try to calculate which type of nugget of strange quark matter is stable, and what influence quark nuggets have had on primordial nucleosynthesis. At present, it is argued that if these ''nuggets'' exist, and in a large proportion, they may be candidates for the missing mass [fr

  12. [Big Data- challenges and risks].

    Science.gov (United States)

    Krauß, Manuela; Tóth, Tamás; Hanika, Heinrich; Kozlovszky, Miklós; Dinya, Elek

    2015-12-06

    The term "big data" is commonly used to describe the growing mass of information being created recently. New conclusions can be drawn and new services can be developed by the connection, processing and analysis of this information. This affects all aspects of life, including health and medicine. The authors review the application areas of Big Data, and present examples from health and other areas. However, there are several preconditions for the effective use of these opportunities: proper infrastructure and a well defined regulatory environment, with particular emphasis on data protection and privacy. These issues and the current actions towards solutions are also presented.

  13. Towards a big crunch dual

    Energy Technology Data Exchange (ETDEWEB)

    Hertog, Thomas E-mail: hertog@vulcan2.physics.ucsb.edu; Horowitz, Gary T

    2004-07-01

    We show there exist smooth asymptotically anti-de Sitter initial data which evolve to a big crunch singularity in a low energy supergravity limit of string theory. This opens up the possibility of using the dual conformal field theory to obtain a fully quantum description of the cosmological singularity. A preliminary study of this dual theory suggests that the big crunch is an endpoint of evolution even in the full string theory. We also show that any theory with scalar solitons must have negative energy solutions. The results presented here clarify our earlier work on cosmic censorship violation in N=8 supergravity. (author)

  14. The Inverted Big-Bang

    OpenAIRE

    Vaas, Ruediger

    2004-01-01

    Our universe appears to have been created not out of nothing but from a strange space-time dust. Quantum geometry (loop quantum gravity) makes it possible to avoid the ominous beginning of our universe with its physically unrealistic (i.e. infinite) curvature, extreme temperature, and energy density. This could be the long sought after explanation of the big-bang and perhaps even opens a window into a time before the big-bang: Space itself may have come from an earlier collapsing universe tha...

  15. Texas floods of 1940

    Science.gov (United States)

    Breeding, Seth D.

    1948-01-01

    Floods occurred in Texas during June, July, and November 1940 that exceeded known stages on many small streams and at a few places on the larger streams. Stages at several stream-gaging stations exceeded the maximum known at those places since the collection of daily records began. A storm, having its axis generally on a north-south line from Cameron to Victoria and extending across the Brazos, Colorado, Lavaca, and Guadalupe River Basins, caused heavy rainfall over a large part of south-central Texas. The maximum recorded rain of 22.7 inches for the 2-day period June 29-30 occurred at Engle. Of this amount, 17.5 inches fell in the 12-hour period between 8 p.m. June 29, and 8 a.m. June 30. Light rains fell at a number of places on June 28, and additional light rains fell at many places within the area from July 1 to 4. During the period June 28 to July 4 more than 20 inches of rain fell over an area of 300 square miles, more than 15 inches over 1,920 square miles, and more than 10 inches over 5,100 square miles. The average annual rainfall for the area experiencing the heaviest rainfall during this storm is about 35 inches. Farming is largely confined to the fertile flood plains in much of the area subjected to the record-breaking floods in June and July. Therefore these floods, coming at the height of the growing season, caused severe losses to crops. Much damage was done also to highways and railways. The city of Hallettsville suffered the greatest damage of any urban area. The Lavaca River at that place reached a stage 8 feet higher than ever known before, drowned several people, destroyed many homes, and submerged almost the entire business district. The maximum discharge there was 93,100 second-feet from a drainage area of 101 square miles. Dry Creek near Smithville produced a maximum discharge of 1,879 second-feet from an area of 1.48 square miles and a runoff of 11.3 inches in a 2-day period from a rainfall of 19.5 inches. The area in the Colorado River

  16. Numerical analysis of the big bounce in loop quantum cosmology

    International Nuclear Information System (INIS)

    Laguna, Pablo

    2007-01-01

    Loop quantum cosmology (LQC) homogeneous models with a massless scalar field show that the big-bang singularity can be replaced by a big quantum bounce. To gain further insight on the nature of this bounce, we study the semidiscrete loop quantum gravity Hamiltonian constraint equation from the point of view of numerical analysis. For illustration purposes, we establish a numerical analogy between the quantum bounces and reflections in finite difference discretizations of wave equations triggered by the use of nonuniform grids or, equivalently, reflections found when solving numerically wave equations with varying coefficients. We show that the bounce is closely related to the method for the temporal update of the system and demonstrate that explicit time-updates in general yield bounces. Finally, we present an example of an implicit time-update devoid of bounces and show back-in-time, deterministic evolutions that reach and partially jump over the big-bang singularity

  17. Can companies benefit from Big Science? Science and Industry

    CERN Document Server

    Autio, Erkko; Bianchi-Streit, M

    2003-01-01

    Several studies have indicated that there are significant returns on financial investment via "Big Science" centres. Financial multipliers ranging from 2.7 (ESA) to 3.7 (CERN) have been found, meaning that each Euro invested in industry by Big Science generates a two- to fourfold return for the supplier. Moreover, laboratories such as CERN are proud of their record in technology transfer, where research developments lead to applications in other fields - for example, with particle accelerators and detectors. Less well documented, however, is the effect of the experience that technological firms gain through working in the arena of Big Science. Indeed, up to now there has been no explicit empirical study of such benefits. Our findings reveal a variety of outcomes, which include technological learning, the development of new products and markets, and impact on the firm's organization. The study also demonstrates the importance of technologically challenging projects for staff at CERN. Together, these findings i...

  18. Big Cities, Big Problems: Reason for the Elderly to Move?

    NARCIS (Netherlands)

    Fokkema, T.; de Jong-Gierveld, J.; Nijkamp, P.

    1996-01-01

    In many European countries, data on geographical patterns of internal elderly migration show that the elderly (55+) are more likely to leave than to move to the big cities. Besides emphasising the attractive features of the destination areas (pull factors), it is often assumed that this negative

  19. Big-Eyed Bugs Have Big Appetite for Pests

    Science.gov (United States)

    Many kinds of arthropod natural enemies (predators and parasitoids) inhabit crop fields in Arizona and can have a large negative impact on several pest insect species that also infest these crops. Geocoris spp., commonly known as big-eyed bugs, are among the most abundant insect predators in field c...

  20. Improving Gas Flooding Efficiency

    Energy Technology Data Exchange (ETDEWEB)

    Reid Grigg; Robert Svec; Zheng Zeng; Alexander Mikhalin; Yi Lin; Guoqiang Yin; Solomon Ampir; Rashid Kassim

    2008-03-31

    This study focuses on laboratory studies with related analytical and numerical models, as well as work with operators for field tests to enhance our understanding of and capabilities for more efficient enhanced oil recovery (EOR). Much of the work has been performed at reservoir conditions. This includes a bubble chamber and several core flood apparatus developed or modified to measure interfacial tension (IFT), critical micelle concentration (CMC), foam durability, surfactant sorption at reservoir conditions, and pressure and temperature effects on foam systems. Carbon dioxide and N2 systems have been considered, under both miscible and immiscible conditions. The injection of CO2 into brine-saturated sandstone and carbonate core results in brine saturation reduction in the range of 62 to 82% brine in the tests presented in this paper. In each test, over 90% of the reduction occurred with less than 0.5 PV of CO2 injected, with very little additional brine production after 0.5 PV of CO2 injected. Adsorption of all considered surfactants is a significant problem. Most of the effect is reversible, but the amount required for foaming is large in terms of volume and cost for all considered surfactants. Some foams increase resistance to values beyond what is practical in the reservoir. Sandstone, limestone, and dolomite core samples were tested. Dissolution of reservoir rock and/or cement, especially carbonates, under acid conditions of CO2 injection is a potential problem in CO2 injection into geological formations. Another potential change in reservoir injectivity and productivity will be the precipitation of dissolved carbonates as the brine flows and pressure decreases. The results of this report provide methods for determining surfactant sorption and can be used to aid in the determination of surfactant requirements for reservoir use in a CO2-foam flood for mobility control. It also provides data to be used to determine rock permeability

  1. A survey on Big Data Stream Mining

    African Journals Online (AJOL)

    pc

    2018-03-05

    Mar 5, 2018 ... Big Data can be static on one machine or distributed ... decision making, and process automation. Big data .... Concept Drifting: concept drifting mean the classifier .... transactions generated by a prefix tree structure. EstDec ...

  2. Big Area Additive Manufacturing of High Performance Bonded NdFeB Magnets

    NARCIS (Netherlands)

    Li, L; Tirado, A.; Nlebedim, I.C.; Rios, O.; Post, B.; Kunc, V.; Lowden, R.R.; Lara-Curzio, E.; Fredette, R.; Ormerod, J.; Lograsso, T.A.; Paranthaman, M.P.

    2016-01-01

    Additive manufacturing allows for the production of complex parts with minimum material waste, offering an effective technique for fabricating permanent magnets which frequently involve critical rare earth elements. In this report, we demonstrate a novel method - Big Area Additive Manufacturing

  3. The Effects of Object Orientation and Object Type on Children's Interpretation of the Word BIG.

    Science.gov (United States)

    Coley, John D.; Gelman, Susan A.

    1989-01-01

    Investigated the interpretation of the word "big" by 40 children of 3 to 5 years. The type and orientation of objects used in the study were varied. Results demonstrated that contextual factors influenced children's responses. (RJC)

  4. Behaviour of liquid films and flooding in counter-current two-phase flow, (1)

    International Nuclear Information System (INIS)

    Suzuki, Shin-ichi; Ueda, Tatsuhiro.

    1978-01-01

    This paper reports the results of a study of the behavior of liquid films and flooding in counter-current two-phase flow; the gas phase flow speed was measured over wide ranges of tube diameter, tube length, liquid flow rate, viscosity and surface tension. The liquids used for this experiment were water, glycerol, and second octyl alcohol. The phenomena were observed with a high speed camera. The maximum thickness of the liquid film was measured, and the effects of various factors on the flooding were investigated. The results of the investigation were as follows. The big waves which cause flooding developed through the interaction of waves on the liquid film surface with the gas phase flow. The gas phase flow speed at the onset of flooding increases with decreasing liquid flow rate and increasing tube diameter. The flooding flow speed decreases with increasing tube length. A larger maximum film thickness under zero gas phase flow causes flooding at lower gas phase flow speeds. (Kato, T.)

  5. BIG DATA-DRIVEN MARKETING: AN ABSTRACT

    OpenAIRE

    Suoniemi, Samppa; Meyer-Waarden, Lars; Munzel, Andreas

    2017-01-01

    Customer information plays a key role in managing successful relationships with valuable customers. Big data customer analytics use (BD use), i.e., the extent to which customer information derived from big data analytics guides marketing decisions, helps firms better meet customer needs for competitive advantage. This study addresses three research questions: What are the key antecedents of big data customer analytics use? How, and to what extent, does big data customer an...

  6. Big data in Finnish financial services

    OpenAIRE

    Laurila, M. (Mikko)

    2017-01-01

    Abstract This thesis aims to explore the concept of big data, and create understanding of big data maturity in the Finnish financial services industry. The research questions of this thesis are “What kind of big data solutions are being implemented in the Finnish financial services sector?” and “Which factors impede faster implementation of big data solutions in the Finnish financial services sector?”. ...

  7. Evaluating the impact and risk of pluvial flash flood on intra-urban road network: A case study in the city center of Shanghai, China

    Science.gov (United States)

    Yin, Jie; Yu, Dapeng; Yin, Zhane; Liu, Min; He, Qing

    2016-06-01

    Urban pluvial floods are attracting growing public concern due to rising intense precipitation and increasing consequences. Accurate risk assessment is critical to efficient urban pluvial flood management, particularly in the transportation sector. This paper describes an integrated methodology that uses high-resolution 2D inundation modeling and a flood depth-dependent measure to evaluate the potential impact and risk of pluvial flash floods on the road network in the city center of Shanghai, China. Intensity-Duration-Frequency relationships for Shanghai rainstorms and the Chicago Design Storm are combined to generate ensemble rainfall scenarios. A hydrodynamic model (FloodMap-HydroInundation2D) is used to simulate overland flow and flood inundation for each scenario. Road impact and risk assessment are then conducted with a newly proposed algorithm and proxy, respectively. Results suggest that the flood response is a function of the spatio-temporal distribution of precipitation and local characteristics (i.e. drainage and topography), and that pluvial flash flooding leads to a proportionate but nonlinear impact on intra-urban road inundation risk. The approach tested here provides more detailed flood information for smart management of urban street networks and may be applied to other big cities where road flood risk is evolving in the context of climate change and urbanization.
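
    A minimal sketch of the design-storm step described above, assuming an IDF relation of the form i = a/(t + b)^c and the Keifer-Chu (Chicago) hyetograph; the parameters a, b, c, the peak-position ratio r and the storm duration are illustrative placeholders, not values from the Shanghai study.

import numpy as np

# Assumed IDF relation: average intensity i = a / (t + b)**c  (mm/h, t in minutes).
# a, b, c, r and the duration are illustrative placeholders only.
a, b, c = 2000.0, 15.0, 0.75
r = 0.4              # fraction of the storm duration occurring before the peak
duration = 120.0     # total storm duration (minutes)
dt = 5.0             # output time step (minutes)

def chicago_intensity(t):
    """Instantaneous intensity (Keifer-Chu form) at time t after the storm start."""
    t_peak = r * duration
    if t <= t_peak:                       # rising limb
        tb = (t_peak - t) / r
        return a * ((1 - c) * tb + b) / (tb + b) ** (1 + c)
    ta = (t - t_peak) / (1 - r)           # falling limb
    return a * ((1 - c) * ta + b) / (ta + b) ** (1 + c)

times = np.arange(0.0, duration + dt, dt)
for t in times:
    print(f"t = {t:5.0f} min   i = {chicago_intensity(t):6.1f} mm/h")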

  8. Time-dependent reliability analysis of flood defences

    International Nuclear Information System (INIS)

    Buijs, F.A.; Hall, J.W.; Sayers, P.B.; Gelder, P.H.A.J.M. van

    2009-01-01

    This paper describes the underlying theory and a practical process for establishing time-dependent reliability models for components in a realistic and complex flood defence system. Though time-dependent reliability models have been applied frequently in, for example, the offshore, structural safety and nuclear industries, application in the safety-critical field of flood defence has to date been limited. The modelling methodology involves identifying relevant variables and processes, characterisation of those processes in appropriate mathematical terms, numerical implementation, parameter estimation and prediction. A combination of stochastic, hierarchical and parametric processes is employed. The approach is demonstrated for selected deterioration mechanisms in the context of a flood defence system. The paper demonstrates that this structured methodology enables the definition of credible statistical models for the time-dependence of flood defences in data-scarce situations. In the application of those models, one of the main findings is that the time variability in the deterioration process tends to be governed by the time-dependence of one or a small number of critical attributes. It is demonstrated how the need for further data collection depends upon the relevance of the time-dependence in the performance of the flood defence system.

  9. Increasing stress on disaster risk finance due to large floods

    Science.gov (United States)

    Jongman, Brenden; Hochrainer-Stigler, Stefan; Feyen, Luc; Aerts, Jeroen; Mechler, Reinhard; Botzen, Wouter; Bouwer, Laurens; Pflug, Georg; Rojas, Rodrigo; Ward, Philip

    2014-05-01

    Recent major flood disasters have shown that single extreme events can affect multiple countries simultaneously, which puts high pressure on trans-national risk reduction and risk transfer mechanisms. To date, little is known about such flood hazard interdependencies across regions, and the corresponding joint risks at regional to continental scales. Reliable information on correlated loss probabilities is crucial for developing robust insurance schemes and public adaptation funds, and for enhancing our understanding of climate change impacts. Here we show that extreme discharges are strongly correlated across European river basins and that these correlations can, or should, be used in national to continental scale risk assessment. We present probabilistic trends in continental flood risk, and demonstrate that currently observed extreme flood losses could more than double in frequency by 2050 under future climate change and socioeconomic development. The results demonstrate that accounting for tail dependencies leads to higher estimates of extreme losses than estimates based on the traditional assumption of independence between basins. We suggest that risk management for these increasing losses is largely feasible, and we demonstrate that risk can be shared by expanding risk transfer financing, reduced by investing in flood protection, or absorbed by enhanced solidarity between countries. We conclude that these measures have vastly different efficiency, equity and acceptability implications, which need to be taken into account in broader consultation, for which our analysis provides a basis.

  10. Dealing with Uncertainty in Flood Management Through Diversification

    Directory of Open Access Journals (Sweden)

    Jeroen C. J. H. Aerts

    2008-06-01

    This paper shows, through a numerical example, how to develop portfolios of flood management activities that generate the highest return under an acceptable risk for an area in the central part of the Netherlands. The paper presents a method based on Modern Portfolio Theory (MPT) that contributes to developing flood management strategies. MPT aims at finding sets of investments that diversify risks, thereby reducing the overall risk of the total portfolio of investments. This paper shows that by systematically combining four different flood protection measures into portfolios containing three or four measures, risk is reduced compared with portfolios that contain only one or two measures. Adding partly uncorrelated measures to the portfolio diversifies risk. We demonstrate how MPT encourages a systematic discussion of the relationship between the return and risk of individual flood mitigation activities and the return and risk of complete portfolios. It is also shown how important it is to understand the correlation of the returns of various flood management activities. The MPT approach therefore fits well with the notion of adaptive water management, which perceives the future as inherently uncertain. Applying MPT to flood protection strategies reduces current vulnerability by diversifying risk.
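
    The diversification effect described above can be illustrated with a minimal, hypothetical portfolio calculation; the returns and covariances below are invented for demonstration and are not taken from the paper.

import numpy as np

# Hypothetical expected returns (e.g. risk reduction per unit invested) and
# covariance matrix for four flood management measures; values are invented.
mean_returns = np.array([0.06, 0.05, 0.08, 0.04])
cov = np.array([[0.010, 0.002, 0.004, 0.000],
                [0.002, 0.008, 0.001, 0.000],
                [0.004, 0.001, 0.020, 0.001],
                [0.000, 0.000, 0.001, 0.005]])

def portfolio_stats(weights):
    """Expected return and standard deviation (risk) of a weighted portfolio."""
    w = np.asarray(weights, dtype=float)
    return w @ mean_returns, np.sqrt(w @ cov @ w)

# A single measure versus an equally weighted mix of all four measures:
for w in ([0.0, 0.0, 1.0, 0.0], [0.25, 0.25, 0.25, 0.25]):
    ret, risk = portfolio_stats(w)
    print(f"weights={w}  expected return={ret:.3f}  risk={risk:.3f}")

    With partly uncorrelated measures, the mixed portfolio shows a lower standard deviation than the single-measure portfolio, which is the diversification effect the paper exploits.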

  11. Quantification of Uncertainty in the Flood Frequency Analysis

    Science.gov (United States)

    Kasiapillai Sudalaimuthu, K.; He, J.; Swami, D.

    2017-12-01

    Flood frequency analysis (FFA) is usually carried out for the planning and design of water resources and hydraulic structures. Owing to variability in sample representation, the selection of a distribution and the estimation of distribution parameters, the estimation of flood quantiles has always been uncertain. Hence, suitable approaches must be developed to quantify the uncertainty in the form of a prediction interval as an alternative to the deterministic approach. The framework developed in the present study to include uncertainty in FFA uses a multi-objective optimization approach to construct the prediction interval from an ensemble of flood quantiles. Through this approach, an optimal variability of distribution parameters is identified to carry out FFA. To demonstrate the proposed approach, annual maximum flow data from two gauge stations (Bow River at Calgary and at Banff, Canada) are used. The major focus of the present study was to evaluate the changes in the magnitude of flood quantiles due to the extreme flood event that occurred in 2013. In addition, the efficacy of the proposed method was verified against standard bootstrap-based sampling approaches, and the proposed method was found to be more reliable for modeling extreme floods than the bootstrap methods.
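
    As a point of comparison, the standard bootstrap baseline mentioned above can be sketched as follows: resample the annual maximum series, refit a GEV distribution, and take percentiles of the resulting quantile ensemble as a prediction interval. The series below is synthetic, not the Bow River record.

import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(42)

# Synthetic annual maximum flow series (m3/s); a stand-in for a gauge record.
ams = genextreme.rvs(c=-0.1, loc=300.0, scale=80.0, size=60, random_state=rng)

T = 100                         # return period of interest (years)
p = 1.0 - 1.0 / T               # corresponding non-exceedance probability
quantiles = []
for _ in range(1000):
    sample = rng.choice(ams, size=ams.size, replace=True)   # bootstrap resample
    shape, loc, scale = genextreme.fit(sample)               # refit the GEV
    quantiles.append(genextreme.ppf(p, shape, loc=loc, scale=scale))

low, high = np.percentile(quantiles, [2.5, 97.5])
print(f"100-year flood ~ {np.median(quantiles):.0f} m3/s "
      f"(95% prediction interval {low:.0f}-{high:.0f} m3/s)")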

  12. China: Big Changes Coming Soon

    Science.gov (United States)

    Rowen, Henry S.

    2011-01-01

    Big changes are ahead for China, probably abrupt ones. The economy has grown so rapidly for many years, over 30 years at an average of nine percent a year, that its size makes it a major player in trade and finance and increasingly in political and military matters. This growth is not only of great importance internationally, it is already having…

  13. Big data and urban governance

    NARCIS (Netherlands)

    Taylor, L.; Richter, C.; Gupta, J.; Pfeffer, K.; Verrest, H.; Ros-Tonen, M.

    2015-01-01

    This chapter examines the ways in which big data is involved in the rise of smart cities. Mobile phones, sensors and online applications produce streams of data which are used to regulate and plan the city, often in real time, but which presents challenges as to how the city’s functions are seen and

  14. Big Data for personalized healthcare

    NARCIS (Netherlands)

    Siemons, Liseth; Sieverink, Floor; Vollenbroek, Wouter; van de Wijngaert, Lidwien; Braakman-Jansen, Annemarie; van Gemert-Pijnen, Lisette

    2016-01-01

    Big Data, often defined according to the 5V model (volume, velocity, variety, veracity and value), is seen as the key towards personalized healthcare. However, it also confronts us with new technological and ethical challenges that require more sophisticated data management tools and data analysis

  15. Big data en gelijke behandeling

    NARCIS (Netherlands)

    Lammerant, Hans; de Hert, Paul; Blok, P.H.; Blok, P.H.

    2017-01-01

    In this chapter we first examine the main basic concepts of equal treatment and discrimination (Section 6.2). We then look at the Dutch and European legal frameworks on non-discrimination (Sections 6.3-6.5) and how those rules must be applied to big

  16. Research Ethics in Big Data.

    Science.gov (United States)

    Hammer, Marilyn J

    2017-05-01

    The ethical conduct of research includes, in part, patient agreement to participate in studies and the protection of health information. In the evolving world of data science and the accessibility of large quantities of web-based data created by millions of individuals, novel methodologic approaches to answering research questions are emerging. This article explores research ethics in the context of big data.

  17. Big data e data science

    OpenAIRE

    Cavique, Luís

    2014-01-01

    This article presents the basic concepts of Big Data and the new field it gave rise to, Data Science. Within Data Science, the notion of reducing the dimensionality of data is discussed and exemplified.

  18. The Case for "Big History."

    Science.gov (United States)

    Christian, David

    1991-01-01

    Urges an approach to the teaching of history that takes the largest possible perspective, crossing time as well as space. Discusses the problems and advantages of such an approach. Describes a course on "big" history that begins with time, creation myths, and astronomy, and moves on to paleontology and evolution. (DK)

  19. Finding errors in big data

    NARCIS (Netherlands)

    Puts, Marco; Daas, Piet; de Waal, A.G.

    No data source is perfect. Mistakes inevitably creep in. Spotting errors is hard enough when dealing with survey responses from several thousand people, but the difficulty is multiplied hugely when that mysterious beast Big Data comes into play. Statistics Netherlands is about to publish its first

  20. Sampling Operations on Big Data

    Science.gov (United States)

    2015-11-29

    …gories. These include edge sampling methods, where edges are selected by a predetermined criterion, and snowball sampling methods, where algorithms start… [Authors: Vijay Gadepally, Taylor Herr, Luke Johnson, Lauren Milechin, Maja Milosavljevic, Benjamin A. Miller, Lincoln…] …process and disseminate information for discovery and exploration under real-time constraints. Common signal processing operations such as sampling and…

  1. The International Big History Association

    Science.gov (United States)

    Duffy, Michael; Duffy, D'Neil

    2013-01-01

    IBHA, the International Big History Association, was organized in 2010 and "promotes the unified, interdisciplinary study and teaching of history of the Cosmos, Earth, Life, and Humanity." This is the vision that Montessori embraced long before the discoveries of modern science fleshed out the story of the evolving universe. "Big…

  2. NASA's Big Data Task Force

    Science.gov (United States)

    Holmes, C. P.; Kinter, J. L.; Beebe, R. F.; Feigelson, E.; Hurlburt, N. E.; Mentzel, C.; Smith, G.; Tino, C.; Walker, R. J.

    2017-12-01

    Two years ago NASA established the Ad Hoc Big Data Task Force (BDTF - https://science.nasa.gov/science-committee/subcommittees/big-data-task-force), an advisory working group within the NASA Advisory Council system. The scope of the Task Force included all NASA Big Data programs, projects, missions, and activities. The Task Force focused on such topics as exploring the existing and planned evolution of NASA's science data cyber-infrastructure that supports broad access to data repositories for NASA Science Mission Directorate missions; best practices within NASA, other Federal agencies, private industry and research institutions; and Federal initiatives related to big data and data access. The BDTF has completed its two-year term and produced several recommendations plus four white papers for NASA's Science Mission Directorate. This presentation will discuss the activities and results of the Task Force, including summaries of key points from its focused study topics. The paper serves as an introduction to the papers following in this ESSI session.

  3. Big Math for Little Kids

    Science.gov (United States)

    Greenes, Carole; Ginsburg, Herbert P.; Balfanz, Robert

    2004-01-01

    "Big Math for Little Kids," a comprehensive program for 4- and 5-year-olds, develops and expands on the mathematics that children know and are capable of doing. The program uses activities and stories to develop ideas about number, shape, pattern, logical reasoning, measurement, operations on numbers, and space. The activities introduce the…

  4. BIG DATA IN BUSINESS ENVIRONMENT

    Directory of Open Access Journals (Sweden)

    Logica BANICA

    2015-06-01

    In recent years, dealing with large amounts of data originating from social media sites and mobile communications, along with data from business environments and institutions, has led to the definition of a new concept known as Big Data. The economic impact of the sheer amount of data produced in the last two years has increased rapidly. It is necessary to aggregate all types of data (structured and unstructured) in order to improve current transactions, develop new business models, provide a real image of supply and demand and thereby generate market advantages. Thus, companies that turn to Big Data have a competitive advantage over other firms. From the perspective of IT organizations, they must accommodate the storage and processing of Big Data and provide analysis tools that are easily integrated into business processes. This paper aims to discuss aspects of the Big Data concept and the principles for building, organizing and analysing huge datasets in the business environment, offering a three-layer architecture based on current software solutions. The article also refers to graphical tools for exploring and representing unstructured data, Gephi and NodeXL.

  5. From Big Bang to Eternity?

    Indian Academy of Sciences (India)

    at different distances (that is, at different epochs in the past) to come to this ... that the expansion started billions of years ago from an explosive Big Bang. Recent research sheds new light on the key cosmological question about the distant ...

  6. Banking Wyoming big sagebrush seeds

    Science.gov (United States)

    Robert P. Karrfalt; Nancy Shaw

    2013-01-01

    Five commercially produced seed lots of Wyoming big sagebrush (Artemisia tridentata Nutt. var. wyomingensis (Beetle & Young) S.L. Welsh [Asteraceae]) were stored under various conditions for 5 y. Purity, moisture content as measured by equilibrium relative humidity, and storage temperature were all important factors to successful seed storage. Our results indicate...

  7. Accounting For Greenhouse Gas Emissions From Flooded ...

    Science.gov (United States)

    Nearly three decades of research has demonstrated that the inundation of rivers and terrestrial ecosystems behind dams can lead to enhanced rates of greenhouse gas emissions, particularly methane. The 2006 IPCC Guidelines for National Greenhouse Gas Inventories include a methodology for estimating methane emissions from flooded lands, but the methodology was published as an appendix to be used as a 'basis for future methodological development' due to a lack of data. Since the 2006 Guidelines were published there has been a 6-fold increase in the number of peer-reviewed papers published on the topic, including reports from reservoirs in India, China, Africa, and Russia. Furthermore, several countries, including Iceland, Switzerland, and Finland, have developed country-specific methodologies for including flooded-lands methane emissions in their National Greenhouse Gas Inventories. This presentation will include a review of the literature on flooded-land methane emissions and approaches that have been used to upscale emissions for national inventories. We will also present ongoing research in the United States to develop a country-specific methodology. The research approaches include 1) an effort to develop predictive relationships between methane emissions and reservoir characteristics that are available in national databases, such as reservoir size and drainage area, and 2) a national-scale probabilistic survey of reservoir methane emissions. To inform th

  8. Wetland restoration, flood pulsing, and disturbance dynamics

    Science.gov (United States)

    Middleton, Beth A.

    1999-01-01

    While it is generally accepted that flood pulsing and disturbance dynamics are critical to wetland viability, there is as yet no consensus among those responsible for wetland restoration about how best to plan for those phenomena or even whether it is really necessary to do so at all. In this groundbreaking book, Dr. Beth Middleton draws upon the latest research from around the world to build a strong case for making flood pulsing and disturbance dynamics integral to the wetland restoration planning process. While the initial chapters of the book are devoted to laying the conceptual foundations, most of the coverage is concerned with demonstrating the practical implications for wetland restoration and management of the latest ecological theory and research. It includes a fascinating case history section in which Dr. Middleton explores the restoration models used in five major North American, European, Australian, African, and Asian wetland projects, and analyzes their relative success from the perspective of flood pulsing and disturbance dynamics planning. Wetland Restoration also features a wealth of practical information useful to all those involved in wetland restoration and management, including: * A compendium of water level tolerances, seed germination, seedling recruitment, adult survival rates, and other key traits of wetland plant species * A bibliography of 1,200 articles and monographs covering all aspects of wetland restoration * A comprehensive directory of wetland restoration ftp sites worldwide * An extensive glossary of essential terms

  9. Big Biomedical data as the key resource for discovery science

    Energy Technology Data Exchange (ETDEWEB)

    Toga, Arthur W.; Foster, Ian; Kesselman, Carl; Madduri, Ravi; Chard, Kyle; Deutsch, Eric W.; Price, Nathan D.; Glusman, Gustavo; Heavner, Benjamin D.; Dinov, Ivo D.; Ames, Joseph; Van Horn, John; Kramer, Roger; Hood, Leroy

    2015-07-21

    Modern biomedical data collection is generating exponentially more data in a multitude of formats. This flood of complex data poses significant opportunities to discover and understand the critical interplay among such diverse domains as genomics, proteomics, metabolomics, and phenomics, including imaging, biometrics, and clinical data. The Big Data for Discovery Science Center is taking an “-ome to home” approach to discover linkages between these disparate data sources by mining existing databases of proteomic and genomic data, brain images, and clinical assessments. In support of this work, the authors developed new technological capabilities that make it easy for researchers to manage, aggregate, manipulate, integrate, and model large amounts of distributed data. Guided by biological domain expertise, the Center’s computational resources and software will reveal relationships and patterns, aiding researchers in identifying biomarkers for the most confounding conditions and diseases, such as Parkinson’s and Alzheimer’s.

  10. Big Data: Implications for Health System Pharmacy.

    Science.gov (United States)

    Stokes, Laura B; Rogers, Joseph W; Hertig, John B; Weber, Robert J

    2016-07-01

    Big Data refers to datasets that are so large and complex that traditional methods and hardware for collecting, sharing, and analyzing them are not possible. Big Data that is accurate leads to more confident decision making, improved operational efficiency, and reduced costs. The rapid growth of health care information results in Big Data around health services, treatments, and outcomes, and Big Data can be used to analyze the benefit of health system pharmacy services. The goal of this article is to provide a perspective on how Big Data can be applied to health system pharmacy. It will define Big Data, describe the impact of Big Data on population health, review specific implications of Big Data in health system pharmacy, and describe an approach for pharmacy leaders to effectively use Big Data. A few strategies involved in managing Big Data in health system pharmacy include identifying potential opportunities for Big Data, prioritizing those opportunities, protecting privacy concerns, promoting data transparency, and communicating outcomes. As health care information expands in its content and becomes more integrated, Big Data can enhance the development of patient-centered pharmacy services.

  11. Big data: een zoektocht naar instituties

    NARCIS (Netherlands)

    van der Voort, H.G.; Crompvoets, J

    2016-01-01

    Big data is a well-known phenomenon, even a buzzword nowadays. It refers to an abundance of data and new possibilities to process and use them. Big data is subject of many publications. Some pay attention to the many possibilities of big data, others warn us for their consequences. This special

  12. A SWOT Analysis of Big Data

    Science.gov (United States)

    Ahmadi, Mohammad; Dileepan, Parthasarati; Wheatley, Kathleen K.

    2016-01-01

    This is the decade of data analytics and big data, but not everyone agrees with the definition of big data. Some researchers see it as the future of data analysis, while others consider it as hype and foresee its demise in the near future. No matter how it is defined, big data for the time being is having its glory moment. The most important…

  13. Data, Data, Data : Big, Linked & Open

    NARCIS (Netherlands)

    Folmer, E.J.A.; Krukkert, D.; Eckartz, S.M.

    2013-01-01

    The entire business and IT world is currently talking about Big Data, a trend that overtook Cloud Computing in mid-2013 (based on Google Trends). Policymakers are also actively engaged with Big Data. Neelie Kroes, Vice-President of the European Commission, speaks of the 'Big Data

  14. A survey of big data research

    Science.gov (United States)

    Fang, Hua; Zhang, Zhaoyang; Wang, Chanpaul Jin; Daneshmand, Mahmoud; Wang, Chonggang; Wang, Honggang

    2015-01-01

    Big data create values for business and research, but pose significant challenges in terms of networking, storage, management, analytics and ethics. Multidisciplinary collaborations from engineers, computer scientists, statisticians and social scientists are needed to tackle, discover and understand big data. This survey presents an overview of big data initiatives, technologies and research in industries and academia, and discusses challenges and potential solutions. PMID:26504265

  15. The BigBoss Experiment

    Energy Technology Data Exchange (ETDEWEB)

    Schelgel, D.; Abdalla, F.; Abraham, T.; Ahn, C.; Allende Prieto, C.; Annis, J.; Aubourg, E.; Azzaro, M.; Bailey, S.; Baltay, C.; Baugh, C.; Bebek, C.; Becerril, S.; Blanton, M.; Bolton, A.; Bromley, B.; Cahn, R.; Carton, P.-H.; Cervanted-Cota, J.L.; Chu, Y.; Cortes, M.; /APC, Paris /Brookhaven /IRFU, Saclay /Marseille, CPPM /Marseille, CPT /Durham U. / /IEU, Seoul /Fermilab /IAA, Granada /IAC, La Laguna / /IAC, Mexico / / /Madrid, IFT /Marseille, Lab. Astrophys. / / /New York U. /Valencia U.

    2012-06-07

    BigBOSS is a Stage IV ground-based dark energy experiment to study baryon acoustic oscillations (BAO) and the growth of structure with a wide-area galaxy and quasar redshift survey over 14,000 square degrees. It has been conditionally accepted by NOAO in response to a call for major new instrumentation and a high-impact science program for the 4-m Mayall telescope at Kitt Peak. The BigBOSS instrument is a robotically-actuated, fiber-fed spectrograph capable of taking 5000 simultaneous spectra over a wavelength range from 340 nm to 1060 nm, with a resolution R = λ/Δλ = 3000-4800. Using data from imaging surveys that are already underway, spectroscopic targets are selected that trace the underlying dark matter distribution. In particular, targets include luminous red galaxies (LRGs) up to z = 1.0, extending the BOSS LRG survey in both redshift and survey area. To probe the universe out to even higher redshift, BigBOSS will target bright [OII] emission line galaxies (ELGs) up to z = 1.7. In total, 20 million galaxy redshifts are obtained to measure the BAO feature, trace the matter power spectrum at smaller scales, and detect redshift space distortions. BigBOSS will provide additional constraints on early dark energy and on the curvature of the universe by measuring the Ly-alpha forest in the spectra of over 600,000 2.2 < z < 3.5 quasars. BigBOSS galaxy BAO measurements combined with an analysis of the broadband power, including the Ly-alpha forest in BigBOSS quasar spectra, achieves a FOM of 395 with Planck plus Stage III priors. This FOM is based on conservative assumptions for the analysis of broad band power (k_max = 0.15), and could grow to over 600 if current work allows us to push the analysis to higher wave numbers (k_max = 0.3). BigBOSS will also place constraints on theories of modified gravity and inflation, and will measure the sum of neutrino masses to 0.024 eV accuracy.

  16. A satellite and model based flood inundation climatology of Australia

    Science.gov (United States)

    Schumann, G.; Andreadis, K.; Castillo, C. J.

    2013-12-01

    To date there is no coherent and consistent database on observed or simulated flood event inundation and magnitude at large scales (continental to global). The only compiled data set showing a consistent history of flood inundation area and extent at a near global scale is provided by the MODIS-based Dartmouth Flood Observatory. However, MODIS satellite imagery is only available from 2000 and is hampered by a number of issues associated with flood mapping using optical images (e.g. classification algorithms, cloud cover, vegetation). Here, we present for the first time a proof-of-concept study in which we employ a computationally efficient 2-D hydrodynamic model (LISFLOOD-FP) complemented with a sub-grid channel formulation to generate a complete flood inundation climatology of the past 40 years (1973-2012) for the entire Australian continent. The model was built completely from freely available SRTM-derived data, including channel widths, bank heights and floodplain topography, which was corrected for vegetation canopy height using a global ICESat canopy dataset. Channel hydraulics were resolved using actual channel data and bathymetry was estimated within the model using hydraulic geometry. On the floodplain, the model simulated the flow paths and inundation variables at a 1 km resolution. The developed model was run over a period of 40 years and a floodplain inundation climatology was generated and compared to satellite flood event observations. Our proof-of-concept study demonstrates that this type of model can reliably simulate past flood events with reasonable accuracies both in time and space. The Australian model was forced with both observed flow climatology and VIC-simulated flows in order to assess the feasibility of a model-based flood inundation climatology at the global scale.

  17. Coupling Modelling of Urban Development and Flood Risk – An Attempt for a Combined Software Framework

    DEFF Research Database (Denmark)

    Löwe, Roland; Sto Domingo, Nina; Urich, Christian

    2015-01-01

    We have developed a setup that couples the urban development model DANCE4WATER with the 1D-2D hydraulic model MIKE FLOOD. The setup makes it possible to assess the impact of urban development and infrastructural change scenarios on flood risk in an automated manner. In addition, it permits us to use the results of the hydraulic simulation to condition DANCE4WATER and to account for flood risk in the simulated urban development. In an Australian case study, we demonstrate that future flood risk can be significantly reduced while maintaining the overall speed of urban development.

  18. Use of a dam break model to assess flooding at Haddam Neck Nuclear Power Plant

    International Nuclear Information System (INIS)

    Scherrer, J.S.; Chery, D.L. Jr.

    1984-01-01

    Because of their proximity to necessary supplies of cooling water, nuclear power plants are susceptible to riverine flooding. Greater flood hazards exist where plants are located downstream of larger dams. The consequences of the Quabbin Reservoir dam failure on the Haddam Neck Nuclear Power Plant, situated on the Connecticut River, were investigated using a dam-break flood routing model. Reasons for selecting a particular model are presented and the input assumptions for the modeling process are developed. Relevant information concerning the level of manpower involvement is presented. The findings of this analysis demonstrate that the plant is adequately protected from the consequences of the postulated flood event

  19. Protection of base nuclear installations against external flooding - Guide nr 13, release of the 08/01/2013

    International Nuclear Information System (INIS)

    2013-01-01

    As French law requires the flooding risk to be taken into account in the demonstration of the nuclear safety of base nuclear installations (INB), this guide aims to define the situations to be taken into account when assessing the flooding risk for a site (identification of water sources and of flooding causes, definition of flooding situations), to propose an acceptable method to quantify these situations (local rains, rise of the water level, problems with hydraulic works, dam failure, ocean waves, and so on), and to list recommendations for defining the protection means which are adapted to the specificities of the flooding risk and are implemented by the operator with respect to the installation lifetime

  20. Flooding Fragility Experiments and Prediction

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Curtis L. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Tahhan, Antonio [Idaho National Lab. (INL), Idaho Falls, ID (United States); Muchmore, Cody [Idaho National Lab. (INL), Idaho Falls, ID (United States); Nichols, Larinda [Idaho National Lab. (INL), Idaho Falls, ID (United States); Bhandari, Bishwo [Idaho National Lab. (INL), Idaho Falls, ID (United States); Pope, Chad [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-09-01

    This report describes the work that has been performed on flooding fragility, both the experimental tests being carried out and the probabilistic fragility predictive models being produced in order to use the test results. Flooding experiments involving full-scale doors have commenced in the Portal Evaluation Tank. The goal of these experiments is to develop a full-scale component flooding experiment protocol and to acquire data that can be used to create Bayesian regression models representing the fragility of these components. This work is in support of the Risk-Informed Safety Margin Characterization (RISMC) Pathway external hazards evaluation research and development.
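
    The report refers to Bayesian regression fragility models built from the test results; a much simpler, non-Bayesian stand-in is a logistic fragility curve fitted by maximum likelihood to binary pass/fail outcomes versus water depth. The data below are invented, not INL test results.

import numpy as np
from scipy.optimize import minimize
from scipy.special import expit

# Invented pass/fail results for a door tested at increasing water depths (m);
# 1 = failure (leakage). Not actual Portal Evaluation Tank data.
depth  = np.array([0.2, 0.4, 0.6, 0.8, 1.0, 1.2, 1.4, 1.6, 1.8, 2.0])
failed = np.array([0,   0,   0,   1,   0,   1,   1,   1,   1,   1  ])

def neg_log_likelihood(params):
    """Negative log-likelihood of a logistic fragility curve P(fail | depth)."""
    intercept, slope = params
    prob = np.clip(expit(intercept + slope * depth), 1e-9, 1 - 1e-9)
    return -np.sum(failed * np.log(prob) + (1 - failed) * np.log(1 - prob))

fit = minimize(neg_log_likelihood, x0=[0.0, 1.0])
intercept, slope = fit.x
print(f"median flooding capacity (50% failure depth): {-intercept / slope:.2f} m")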

  1. Big Data - Smart Health Strategies

    Science.gov (United States)

    2014-01-01

    Summary Objectives To select best papers published in 2013 in the field of big data and smart health strategies, and summarize outstanding research efforts. Methods A systematic search was performed using two major bibliographic databases for relevant journal papers. The references obtained were reviewed in a two-stage process, starting with a blinded review performed by the two section editors, and followed by a peer review process operated by external reviewers recognized as experts in the field. Results The complete review process selected four best papers, illustrating various aspects of the special theme, among them: (a) using large volumes of unstructured data and, specifically, clinical notes from Electronic Health Records (EHRs) for pharmacovigilance; (b) knowledge discovery via querying large volumes of complex (both structured and unstructured) biological data using big data technologies and relevant tools; (c) methodologies for applying cloud computing and big data technologies in the field of genomics, and (d) system architectures enabling high-performance access to and processing of large datasets extracted from EHRs. Conclusions The potential of big data in biomedicine has been pinpointed in various viewpoint papers and editorials. The review of current scientific literature illustrated a variety of interesting methods and applications in the field, but still the promises exceed the current outcomes. As we are getting closer towards a solid foundation with respect to common understanding of relevant concepts and technical aspects, and the use of standardized technologies and tools, we can anticipate to reach the potential that big data offer for personalized medicine and smart health strategies in the near future. PMID:25123721

  2. Addressing Data Veracity in Big Data Applications

    Energy Technology Data Exchange (ETDEWEB)

    Aman, Saima [Univ. of Southern California, Los Angeles, CA (United States). Dept. of Computer Science; Chelmis, Charalampos [Univ. of Southern California, Los Angeles, CA (United States). Dept. of Electrical Engineering; Prasanna, Viktor [Univ. of Southern California, Los Angeles, CA (United States). Dept. of Electrical Engineering

    2014-10-27

    Big data applications such as in smart electric grids, transportation, and remote environment monitoring involve geographically dispersed sensors that periodically send back information to central nodes. In many cases, data from sensors is not available at central nodes at a frequency that is required for real-time modeling and decision-making. This may be due to physical limitations of the transmission networks, or due to consumers limiting frequent transmission of data from sensors located at their premises for security and privacy concerns. Such scenarios lead to partial data problem and raise the issue of data veracity in big data applications. We describe a novel solution to the problem of making short term predictions (up to a few hours ahead) in absence of real-time data from sensors in Smart Grid. A key implication of our work is that by using real-time data from only a small subset of influential sensors, we are able to make predictions for all sensors. We thus reduce the communication complexity involved in transmitting sensory data in Smart Grids. We use real-world electricity consumption data from smart meters to empirically demonstrate the usefulness of our method. Our dataset consists of data collected at 15-min intervals from 170 smart meters in the USC Microgrid for 7 years, totaling 41,697,600 data points.
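
    A minimal sketch of the idea of predicting all meters from a small influential subset, using ordinary least squares on historical readings; the data are synthetic and the choice of "influential" meters is arbitrary here, whereas the paper selects them from real USC Microgrid data.

import numpy as np

rng = np.random.default_rng(0)

# Synthetic historical consumption: 500 time steps for 20 smart meters.
n_steps, n_meters = 500, 20
history = 5.0 + 0.01 * rng.normal(size=(n_steps, n_meters)).cumsum(axis=0)

influential = [0, 3, 7]                       # meters assumed to report in real time
targets = [m for m in range(n_meters) if m not in influential]

# Fit one least-squares model per target meter using the influential meters.
X = np.column_stack([history[:, influential], np.ones(n_steps)])   # with intercept
coeffs = {t: np.linalg.lstsq(X, history[:, t], rcond=None)[0] for t in targets}

# At prediction time only the influential meters have fresh readings:
x_now = np.append(history[-1, influential], 1.0)
predictions = {t: float(x_now @ coeffs[t]) for t in targets}
print(f"predicted reading for meter 5: {predictions[5]:.2f}")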

  3. Structural evaluation of multifunctional flood defenses

    NARCIS (Netherlands)

    Voorendt, M.Z.; Kothuis, Baukje; Kok, Matthijs

    2017-01-01

    Flood risk reduction aims to minimize losses in low-lying areas. One of the ways to reduce flood risks is to protect land by means of flood defenses. The Netherlands has a long tradition of flood protection and, therefore, a wide variety of technical reports written

  4. Local Flood Action Groups: Governance And Resilience

    NARCIS (Netherlands)

    Forrest, Steven; Trell, Elen-Maarja; Woltjer, Johan; Macoun, Milan; Maier, Karel

    2015-01-01

    A diverse range of citizen groups focusing on flood risk management have been identified in several European countries. The paper discusses the role of flood action (citizen) groups in the context of flood resilience and will do this by analysing the UK and its diverse range of flood groups. These

  5. Adjustable Robust Strategies for Flood Protection

    NARCIS (Netherlands)

    Postek, Krzysztof; den Hertog, Dick; Kind, J.; Pustjens, Chris

    2016-01-01

    Flood protection is of major importance to many flood-prone regions and involves substantial investment and maintenance costs. Modern flood risk management requires often to determine a cost-efficient protection strategy, i.e., one with lowest possible long run cost and satisfying flood protection

  6. A framework for global river flood risk assessments

    Science.gov (United States)

    Winsemius, H. C.; Van Beek, L. P. H.; Jongman, B.; Ward, P. J.; Bouwman, A.

    2013-05-01

    There is an increasing need for strategic global assessments of flood risks in current and future conditions. In this paper, we propose a framework for global flood risk assessment for river floods, which can be applied in current conditions, as well as in future conditions due to climate and socio-economic changes. The framework's goal is to establish flood hazard and impact estimates at a high enough resolution to allow for their combination into a risk estimate, which can be used for strategic global flood risk assessments. The framework estimates hazard at a resolution of ~ 1 km2 using global forcing datasets of the current (or in scenario mode, future) climate, a global hydrological model, a global flood-routing model, and more importantly, an inundation downscaling routine. The second component of the framework combines hazard with flood impact models at the same resolution (e.g. damage, affected GDP, and affected population) to establish indicators for flood risk (e.g. annual expected damage, affected GDP, and affected population). The framework has been applied using the global hydrological model PCR-GLOBWB, which includes an optional global flood routing model DynRout, combined with scenarios from the Integrated Model to Assess the Global Environment (IMAGE). We performed downscaling of the hazard probability distributions to 1 km2 resolution with a new downscaling algorithm, applied on Bangladesh as a first case study application area. We demonstrate the risk assessment approach in Bangladesh based on GDP per capita data, population, and land use maps for 2010 and 2050. Validation of the hazard estimates has been performed using the Dartmouth Flood Observatory database. This was done by comparing a high return period flood with the maximum observed extent, as well as by comparing a time series of a single event with Dartmouth imagery of the event. Validation of modelled damage estimates was performed using observed damage estimates from the EM
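
    The final aggregation step of such a framework, combining event damages with their exceedance probabilities into an annual expected damage, can be sketched as follows; the return periods and damages are illustrative numbers, not outputs of PCR-GLOBWB/DynRout.

import numpy as np

# Illustrative damages (million USD) for a set of flood return periods.
return_periods = np.array([2.0, 5.0, 10.0, 25.0, 50.0, 100.0, 250.0, 1000.0])
damages        = np.array([0.0, 10.0, 40.0, 120.0, 250.0, 400.0, 700.0, 1200.0])

# Annual exceedance probabilities, sorted in increasing order for integration.
prob = 1.0 / return_periods
order = np.argsort(prob)
p, d = prob[order], damages[order]

# Annual expected damage = area under the damage vs exceedance-probability curve
# (trapezoidal rule).
ead = np.sum(0.5 * (d[1:] + d[:-1]) * np.diff(p))
print(f"annual expected damage: {ead:.1f} million USD/year")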

  7. A framework for global river flood risk assessments

    Directory of Open Access Journals (Sweden)

    H. C. Winsemius

    2013-05-01

    There is an increasing need for strategic global assessments of flood risks in current and future conditions. In this paper, we propose a framework for global flood risk assessment for river floods, which can be applied in current conditions, as well as in future conditions due to climate and socio-economic changes. The framework's goal is to establish flood hazard and impact estimates at a high enough resolution to allow for their combination into a risk estimate, which can be used for strategic global flood risk assessments. The framework estimates hazard at a resolution of ~ 1 km2 using global forcing datasets of the current (or in scenario mode, future) climate, a global hydrological model, a global flood-routing model, and more importantly, an inundation downscaling routine. The second component of the framework combines hazard with flood impact models at the same resolution (e.g. damage, affected GDP, and affected population) to establish indicators for flood risk (e.g. annual expected damage, affected GDP, and affected population). The framework has been applied using the global hydrological model PCR-GLOBWB, which includes an optional global flood routing model DynRout, combined with scenarios from the Integrated Model to Assess the Global Environment (IMAGE). We performed downscaling of the hazard probability distributions to 1 km2 resolution with a new downscaling algorithm, applied on Bangladesh as a first case study application area. We demonstrate the risk assessment approach in Bangladesh based on GDP per capita data, population, and land use maps for 2010 and 2050. Validation of the hazard estimates has been performed using the Dartmouth Flood Observatory database. This was done by comparing a high return period flood with the maximum observed extent, as well as by comparing a time series of a single event with Dartmouth imagery of the event. Validation of modelled damage estimates was performed using observed damage estimates from

  8. Sequential planning of flood protection infrastructure under limited historic flood record and climate change uncertainty

    Science.gov (United States)

    Dittes, Beatrice; Špačková, Olga; Straub, Daniel

    2017-04-01

    Flood protection is often designed to safeguard people and property following regulations and standards, which specify a target design flood protection level, such as the 100-year flood level prescribed in Germany (DWA, 2011). In practice, the magnitude of such an event is only known within a range of uncertainty, which is caused by limited historic records and uncertain climate change impacts, among other factors (Hall & Solomatine, 2008). As more observations and improved climate projections become available in the future, the design flood estimate changes and the capacity of the flood protection may be deemed insufficient at a future point in time. This problem can be mitigated by the implementation of flexible flood protection systems (that can easily be adjusted in the future) and/or by adding an additional reserve to the flood protection, i.e. by applying a safety factor to the design. But how high should such a safety factor be? And how much should the decision maker be willing to pay to make the system flexible, i.e. what is the Value of Flexibility (Špačková & Straub, 2017)? We propose a decision model that identifies cost-optimal decisions on flood protection capacity in the face of uncertainty (Dittes et al. 2017). It considers sequential adjustments of the protection system during its lifetime, taking into account its flexibility. The proposed framework is based on pre-posterior Bayesian decision analysis, using Decision Trees and Markov Decision Processes, and is fully quantitative. It can include a wide range of uncertainty components such as uncertainty associated with limited historic record or uncertain climate or socio-economic change. It is shown that since flexible systems are less costly to adjust when flood estimates are changing, they justify initially lower safety factors. Investigation on the Value of Flexibility (VoF) demonstrates that VoF depends on the type and degree of uncertainty, on the learning effect (i.e. kind and quality of
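
    The Value of Flexibility argument can be illustrated with a deliberately stylised two-stage expected-cost comparison; all probabilities and costs below are invented and are not taken from the cited decision framework.

# Stylised two-stage decision: build protection now, possibly upgrade later if
# new flood data show the design estimate was too low. All numbers are invented.
p_revision = 0.3              # probability that future data require more capacity
cost_initial = 100.0          # initial construction cost (arbitrary units)
flexibility_premium = 10.0    # extra up-front cost of a flexible design
upgrade_flexible = 20.0       # later adjustment cost if the system is flexible
upgrade_rigid = 80.0          # later retrofit cost if it is not

expected_flexible = cost_initial + flexibility_premium + p_revision * upgrade_flexible
expected_rigid = cost_initial + p_revision * upgrade_rigid

print(f"expected lifetime cost, flexible design: {expected_flexible:.1f}")
print(f"expected lifetime cost, rigid design:    {expected_rigid:.1f}")
print(f"value of flexibility:                    {expected_rigid - expected_flexible:.1f}")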

  9. The Complex Relationship Between Heavy Storms and Floods: Implication on Stormwater Drainage design and Management

    Science.gov (United States)

    Demissie, Y.; Mortuza, M. R.; Moges, E.; Yan, E.; Li, H. Y.

    2017-12-01

    Due to the lack of historical and future streamflow data for flood frequency analysis at or near most drainage sites, it is a common practice to estimate the design flood (maximum discharge or volume of a stream for a given return period) directly from storm frequency analysis and the resulting Intensity-Duration-Frequency (IDF) curves. Such analysis assumes a direct relationship between storms and floods, with, for example, the 10-year rainfall expected to produce the 10-year flood. In reality, however, a storm is just one factor among the many hydrological and meteorological factors that can affect the peak flow and hydrograph. Consequently, a heavy storm does not always lead to flooding, or to a flood event with the same frequency. This is evident from the observed difference in the seasonality of heavy storms and floods in most regions. In order to understand the site-specific causal relationship between heavy storms and floods and improve flood analysis for stormwater drainage design and management, we have examined the contributions of various factors that affect floods using statistical and information theory methods. Based on the identified dominant causal relationships, hydrologic and probability analyses were conducted to develop runoff IDF curves taking into consideration the snowmelt and rain-on-snow effect, the difference in storm and flood seasonality, soil moisture conditions, and catchment potential for flash and riverine flooding. The approach was demonstrated using data from military installations located in different parts of the United States. The accuracy of the flood frequency analysis and the resulting runoff IDF curves was evaluated based on the runoff IDF curves developed from streamflow measurements.

  10. Big-Leaf Mahogany on CITES Appendix II: Big Challenge, Big Opportunity

    Science.gov (United States)

    JAMES GROGAN; PAULO BARRETO

    2005-01-01

    On 15 November 2003, big-leaf mahogany (Swietenia macrophylla King, Meliaceae), the most valuable widely traded Neotropical timber tree, gained strengthened regulatory protection from its listing on Appendix II of the Convention on International Trade in Endangered Species of Wild Fauna and Flora (CITES). CITES is a United Nations-chartered agreement signed by 164...

  11. A National Assessment of Changes in Flood Exposure in the United States

    Science.gov (United States)

    Lam, N.; Qiang, Y.; Cai, H.; Zou, L.

    2017-12-01

    Analyzing flood exposure and its temporal trend is the first step toward understanding flood risk, flood hazard, and flood vulnerability. This presentation is based on a national, county-based study assessing the changes in population and urban areas in high-risk flood zones from 2001-2011 in the contiguous United States. Satellite land use land cover data, Federal Emergency Management Agency (FEMA)'s 100-year flood maps, and census data were used to extract the proportion of developed (urban) land in flood zones by county in the two time points, and indices of difference were calculated. Local Moran's I statistic was applied to identify hotspots of increase in urban area in flood zones, and geographically weighted regression was used to estimate the population in flood zones from the land cover data. Results show that in 2011, an estimate of about 25.3 million people (8.3% of the total population) lived in the high-risk flood zones. Nationally, the ratio of urban development in flood zones is less than the ratio of land in flood zones, implying that Americans were responsive to flood hazards by avoiding development in flood zones. However, this trend varied from place to place, with coastal counties having less urban development in flood zones than the inland counties. Furthermore, the contrast between coastal and inland counties increased during 2001-2011. Finally, several exceptions from the trend (hotspots) were detected, most notably New York City and Miami where significant increases in urban development in flood zones were found. This assessment provides important baseline information on the spatial patterns of flood exposure and their changes from 2001-2011. The study pinpoints regions that may need further investigations and better policy to reduce the overall flood risks. Methodologically, the study demonstrates that pixelated land cover data can be integrated with other natural and human data to investigate important societal problems. The same
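
    The core exposure indicator of such an assessment, the share of developed land falling inside the 100-year flood zone, reduces to a masked-raster calculation; the two rasters below are random placeholders rather than actual land cover data or FEMA flood maps.

import numpy as np

rng = np.random.default_rng(1)

# Placeholder co-registered rasters for one county (True = developed / in zone).
developed  = rng.random((200, 200)) < 0.30     # urban mask from land cover data
flood_zone = rng.random((200, 200)) < 0.15     # 100-year flood zone mask

share_developed_in_zone = (developed & flood_zone).sum() / developed.sum()
share_land_in_zone = flood_zone.mean()

print(f"developed land in flood zone: {share_developed_in_zone:.1%}")
print(f"all land in flood zone:       {share_land_in_zone:.1%}")
# If the first ratio is below the second, development tends to avoid flood zones,
# which is the county-level comparison made in the assessment.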

  12. Flood susceptibility analysis through remote sensing, GIS and frequency ratio model

    Science.gov (United States)

    Samanta, Sailesh; Pal, Dilip Kumar; Palsamanta, Babita

    2018-05-01

    Papua New Guinea (PNG) is saddled with frequent natural disasters such as earthquakes, volcanic eruptions, landslides, drought and floods. Flood, as a hydrological disaster to humankind's niche, brings about a powerful and often sudden, pernicious change in the surface distribution of water on land, while the benevolence of floods manifests in restoring the health of the thalweg from excessive siltation by redistributing fertile sediments on the riverine floodplains. From a social, economic and environmental perspective, flood is one of the most devastating disasters in PNG. This research was conducted to investigate the usefulness of remote sensing, geographic information systems and the frequency ratio (FR) model for flood susceptibility mapping. The FR model was used to handle the different independent variables via weighted bivariate probability values to generate a plausible flood susceptibility map. The study was conducted in the Markham riverine precinct under Morobe Province in PNG. A historical flood inventory database of the PNG resource information system (PNGRIS) was used to generate 143 flood locations based on a "create fishnet" analysis; 100 (70%) flood sample locations were selected randomly for model building. Ten independent variables, namely land use/land cover, elevation, slope, topographic wetness index, surface runoff, landform, lithology, distance from the main river, soil texture and soil drainage, were used in the FR model for flood vulnerability analysis. Finally, a database was developed for areas vulnerable to flooding. The result demonstrated a span of FR values ranging from 2.66 (least flood prone) to 19.02 (most flood prone) for the study area. The developed database was reclassified into five flood vulnerability zones based on the FR values, namely very low (less than 5.0), low (5.0-7.5), moderate (7.5-10.0), high (10.0-12.5) and very high susceptibility (more than 12.5). The result indicated about 19.4% of the land area as 'very high
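
    A minimal sketch of the frequency ratio computation itself: for each class of a conditioning factor, FR is the share of flood locations in that class divided by the share of total area in that class, and per-cell FR values are summed over all factors to give the susceptibility index. The rasters below are random placeholders, not the PNGRIS data, and only one factor is shown.

import numpy as np

rng = np.random.default_rng(7)

# Placeholder rasters: one conditioning factor classified into 5 classes and a
# binary flood inventory; the study used ten factors and 143 flood locations.
factor_class = rng.integers(1, 6, size=(300, 300))
flood = rng.random((300, 300)) < 0.02

def frequency_ratio(factor_class, flood):
    """FR per class = (% of flood cells in the class) / (% of all cells in the class)."""
    fr = {}
    total_flood = flood.sum()
    total_cells = factor_class.size
    for cls in np.unique(factor_class):
        in_class = factor_class == cls
        pct_flood = flood[in_class].sum() / total_flood
        pct_area = in_class.sum() / total_cells
        fr[int(cls)] = pct_flood / pct_area
    return fr

fr = frequency_ratio(factor_class, flood)
susceptibility = np.vectorize(fr.get)(factor_class)   # per-cell index for this factor
print({cls: round(val, 2) for cls, val in fr.items()})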

  13. Entering the 'big data' era in medicinal chemistry: molecular promiscuity analysis revisited.

    Science.gov (United States)

    Hu, Ye; Bajorath, Jürgen

    2017-06-01

    The 'big data' concept plays an increasingly important role in many scientific fields. Big data involves more than unprecedentedly large volumes of data that become available. Different criteria characterizing big data must be carefully considered in computational data mining, as we discuss herein focusing on medicinal chemistry. This is a scientific discipline where big data is beginning to emerge and provide new opportunities. For example, the ability of many drugs to specifically interact with multiple targets, termed promiscuity, forms the molecular basis of polypharmacology, a hot topic in drug discovery. Compound promiscuity analysis is an area that is much influenced by big data phenomena. Different results are obtained depending on chosen data selection and confidence criteria, as we also demonstrate.

  14. Entering the ‘big data’ era in medicinal chemistry: molecular promiscuity analysis revisited

    Science.gov (United States)

    Hu, Ye; Bajorath, Jürgen

    2017-01-01

    The ‘big data’ concept plays an increasingly important role in many scientific fields. Big data involves more than unprecedentedly large volumes of data that become available. Different criteria characterizing big data must be carefully considered in computational data mining, as we discuss herein focusing on medicinal chemistry. This is a scientific discipline where big data is beginning to emerge and provide new opportunities. For example, the ability of many drugs to specifically interact with multiple targets, termed promiscuity, forms the molecular basis of polypharmacology, a hot topic in drug discovery. Compound promiscuity analysis is an area that is much influenced by big data phenomena. Different results are obtained depending on chosen data selection and confidence criteria, as we also demonstrate. PMID:28670471

  15. Smoky River coal flood risk mapping study

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2004-06-01

    The Canada-Alberta Flood Damage Reduction Program (FDRP) is designed to reduce flood damage by identifying areas susceptible to flooding and by encouraging application of suitable land use planning, zoning, and flood preparedness and proofing. The purpose of this study is to define flood risk and floodway limits along the Smoky River near the former Smoky River Coal (SRC) plant. Alberta Energy has been responsible for the site since the mine and plant closed in 2000. The study describes flooding history, available data, features of the river and valley, calculation of flood levels, and floodway determination, and includes flood risk maps. The HEC-RAS program is used for the calculations. The flood risk area was calculated using the 1:100 year return period flood as the hydrological event. 7 refs., 11 figs., 7 tabs., 3 apps.

  16. Flood Resilient Systems and their Application for Flood Resilient Planning

    Science.gov (United States)

    Manojlovic, N.; Gabalda, V.; Antanaskovic, D.; Gershovich, I.; Pasche, E.

    2012-04-01

    Following the paradigm shift in flood management from traditional to more integrated approaches, and considering the uncertainties of future development due to drivers such as climate change, one of the main emerging tasks of flood managers is the development of (flood) resilient cities. This can be achieved by applying non-structural flood resilience measures, summarised in the 4As: assistance, alleviation, awareness and avoidance (FIAC, 2007). As part of this strategy, the key aspect of developing resilient cities - a resilient built environment - can be reached through efficient application of Flood Resilience Technology (FReT) and its meaningful combination into flood resilient systems (FRS). FRS are defined as "an interconnecting network of FReT which facilitates resilience (including both restorative and adaptive capacity) to flooding, addressing physical and social systems and considering different flood typologies" (SMARTeST, http://www.floodresilience.eu/). Applying the system approach (e.g. Zevenbergen, 2008), FRS can be developed at different scales, from the building to the city level. Still, a method to define and systematise different FRS across those scales remains a matter of research. Further, deciding which resilient system to apply for the given conditions and the given scale is a complex task, calling for the use of decision support tools. This decision-making process should follow the steps of flood risk assessment (1) and the development of a flood resilience plan (2) (Manojlovic et al, 2009). The key problem in (2) is how to match the input parameters that describe the physical and social system and the flood typology to the appropriate flood resilient system. An additional open issue is how to integrate advances in FReT and findings on its efficiency into decision support tools. This paper presents a way to define, systematise and make decisions on FRS at different scales of an urban system developed within the 7th FP Project

  17. Characterisation of seasonal flood types according to timescales in mixed probability distributions

    Science.gov (United States)

    Fischer, Svenja; Schumann, Andreas; Schulte, Markus

    2016-08-01

    When flood statistics are based on annual maximum series (AMS), the sample often contains flood peaks which differ in their genesis. If the ratios among event types change over the range of observations, the extrapolation of a probability distribution function (pdf) can be dominated by a majority of events that belong to a certain flood type. If this type is not typical for extraordinarily large extremes, such an extrapolation of the pdf is misleading. To avoid this breach of the assumption of homogeneity, seasonal models were developed that differentiate between winter and summer floods. We show that a distinction between summer and winter floods is not always sufficient if seasonal series include events with different geneses. Here, we differentiate floods by their timescales into groups of long and short events. A statistical method for such a distinction of events is presented. To demonstrate its applicability, timescales for winter and summer floods in a German river basin were estimated. It is shown that summer floods can be separated into two main groups, but in our study region the sample of winter floods consists of at least three different flood types. The pdfs of the two groups of summer floods are combined via a new mixing model. This model accounts for the fact that characterising parallel events by their maximum values alone is incomplete, because some of the realisations are overlaid. A statistical method resulting in an amendment of the statistical parameters is proposed. The application in a German case study demonstrates the advantages of the new model, with specific emphasis on flood types.
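
    For intuition only, the sketch below shows the simplest form of such a mixture: if the type-specific maxima are treated as independent, the annual-maximum distribution is the product of the component distributions. The parameters are invented, and the independence assumption is a simplification of the amended mixing model described above.

        # Toy mixed (product) model for annual maxima of two flood types.
        from scipy.optimize import brentq
        from scipy.stats import gumbel_r

        type_a = gumbel_r(loc=120., scale=40.)    # e.g. short (flashy) events
        type_b = gumbel_r(loc=180., scale=90.)    # e.g. long (volume) events

        def annual_cdf(x):
            # P(annual max <= x) = P(type-A max <= x) * P(type-B max <= x)
            return type_a.cdf(x) * type_b.cdf(x)

        q100 = brentq(lambda x: annual_cdf(x) - (1 - 1/100), 1., 5000.)
        print(f"mixed-model 100-year quantile: {q100:.0f} m3/s")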

  18. Big Bang Day : The Great Big Particle Adventure - 3. Origins

    CERN Multimedia

    2008-01-01

    In this series, comedian and physicist Ben Miller asks the CERN scientists what they hope to find. If the LHC is successful, it will explain the nature of the Universe around us in terms of a few simple ingredients and a few simple rules. But the Universe now was forged in a Big Bang where conditions were very different, and the rules were very different, and those early moments were crucial to determining how things turned out later. At the LHC they can recreate conditions as they were billionths of a second after the Big Bang, before atoms and nuclei existed. They can find out why matter and antimatter didn't mutually annihilate each other to leave behind a Universe of pure, brilliant light. And they can look into the very structure of space and time - the fabric of the Universe

  19. Antigravity and the big crunch/big bang transition

    Science.gov (United States)

    Bars, Itzhak; Chen, Shih-Hung; Steinhardt, Paul J.; Turok, Neil

    2012-08-01

    We point out a new phenomenon which seems to be generic in 4d effective theories of scalar fields coupled to Einstein gravity, when applied to cosmology. A lift of such theories to a Weyl-invariant extension allows one to define classical evolution through cosmological singularities unambiguously, and hence construct geodesically complete background spacetimes. An attractor mechanism ensures that, at the level of the effective theory, generic solutions undergo a big crunch/big bang transition by contracting to zero size, passing through a brief antigravity phase, shrinking to zero size again, and re-emerging into an expanding normal gravity phase. The result may be useful for the construction of complete bouncing cosmologies like the cyclic model.

  20. Antigravity and the big crunch/big bang transition

    Energy Technology Data Exchange (ETDEWEB)

    Bars, Itzhak [Department of Physics and Astronomy, University of Southern California, Los Angeles, CA 90089-2535 (United States); Chen, Shih-Hung [Perimeter Institute for Theoretical Physics, Waterloo, ON N2L 2Y5 (Canada); Department of Physics and School of Earth and Space Exploration, Arizona State University, Tempe, AZ 85287-1404 (United States); Steinhardt, Paul J., E-mail: steinh@princeton.edu [Department of Physics and Princeton Center for Theoretical Physics, Princeton University, Princeton, NJ 08544 (United States); Turok, Neil [Perimeter Institute for Theoretical Physics, Waterloo, ON N2L 2Y5 (Canada)

    2012-08-29

    We point out a new phenomenon which seems to be generic in 4d effective theories of scalar fields coupled to Einstein gravity, when applied to cosmology. A lift of such theories to a Weyl-invariant extension allows one to define classical evolution through cosmological singularities unambiguously, and hence construct geodesically complete background spacetimes. An attractor mechanism ensures that, at the level of the effective theory, generic solutions undergo a big crunch/big bang transition by contracting to zero size, passing through a brief antigravity phase, shrinking to zero size again, and re-emerging into an expanding normal gravity phase. The result may be useful for the construction of complete bouncing cosmologies like the cyclic model.

  1. Antigravity and the big crunch/big bang transition

    International Nuclear Information System (INIS)

    Bars, Itzhak; Chen, Shih-Hung; Steinhardt, Paul J.; Turok, Neil

    2012-01-01

    We point out a new phenomenon which seems to be generic in 4d effective theories of scalar fields coupled to Einstein gravity, when applied to cosmology. A lift of such theories to a Weyl-invariant extension allows one to define classical evolution through cosmological singularities unambiguously, and hence construct geodesically complete background spacetimes. An attractor mechanism ensures that, at the level of the effective theory, generic solutions undergo a big crunch/big bang transition by contracting to zero size, passing through a brief antigravity phase, shrinking to zero size again, and re-emerging into an expanding normal gravity phase. The result may be useful for the construction of complete bouncing cosmologies like the cyclic model.

  2. Solution of a braneworld big crunch/big bang cosmology

    International Nuclear Information System (INIS)

    McFadden, Paul L.; Turok, Neil; Steinhardt, Paul J.

    2007-01-01

    We solve for the cosmological perturbations in a five-dimensional background consisting of two separating or colliding boundary branes, as an expansion in the collision speed V divided by the speed of light c. Our solution permits a detailed check of the validity of four-dimensional effective theory in the vicinity of the event corresponding to the big crunch/big bang singularity. We show that the four-dimensional description fails at the first nontrivial order in (V/c)². At this order, there is nontrivial mixing of the two relevant four-dimensional perturbation modes (the growing and decaying modes) as the boundary branes move from the narrowly separated limit described by Kaluza-Klein theory to the well-separated limit where gravity is confined to the positive-tension brane. We comment on the cosmological significance of the result and compute other quantities of interest in five-dimensional cosmological scenarios

  3. The ethics of big data in big agriculture

    Directory of Open Access Journals (Sweden)

    Isabelle M. Carbonell

    2016-03-01

    Full Text Available This paper examines the ethics of big data in agriculture, focusing on the power asymmetry between farmers and large agribusinesses like Monsanto. Following the recent purchase of Climate Corp., Monsanto is currently the most prominent biotech agribusiness to buy into big data. With wireless sensors on tractors monitoring or dictating every decision a farmer makes, Monsanto can now aggregate large quantities of previously proprietary farming data, enabling a privileged position with unique insights on a field-by-field basis into a third or more of the US farmland. This power asymmetry may be rebalanced through open-sourced data, and publicly-funded data analytic tools which rival Climate Corp. in complexity and innovation for use in the public domain.

  4. Flood Hazard Areas - High Risk

    Data.gov (United States)

    Department of Homeland Security — The S_Fld_Haz_Ar table contains information about the flood hazards within the study area. A spatial file with locational information also corresponds with this data...

  5. FEMA Flood Insurance Studies Inventory

    Data.gov (United States)

    Kansas Data Access and Support Center — This digital data set provides an inventory of Federal Emergency Management Agency (FEMA) Flood Insurance Studies (FIS) that have been conducted for communities and...

  6. Flooding characteristics of Goodloe packing

    International Nuclear Information System (INIS)

    Begovich, J.M.; Watson, J.S.

    1976-08-01

    Experimental flooding data for the countercurrent flow of air and water in a 7.62-cm-diam glass column filled with Goodloe packing were compared with a correlation reported by the packing manufacturer. Flooding rates observed in this study were as low as one-half those predicted by the correlation. Rearranging the packing by inverting the column and removing some packing segments yielded results similar to the correlation for liquid-to-gas (L/G) mass flow rate ratios greater than 10, but the experimental flooding curve fell significantly below the correlation at lower L/G ratios. When the column was repacked with new packing, the results were essentially the same as those obtained in the inverted column. Thus, it is believed that a carefully packed column is more likely to yield flooding rates similar to those obtained in the new or inverted columns rather than rates predicted by the original correlation

  7. Flood Fighting Products Research Facility

    Data.gov (United States)

    Federal Laboratory Consortium — A wave research basin at the ERDC Coastal and Hydraulics Laboratory has been modified specifically for testing of temporary, barrier-type, flood fighting products....

  8. FLOOD CHARACTERISTICS AND MANAGEMENT ADAPTATIONS ...

    African Journals Online (AJOL)

    Dr Osondu

    2011-10-26

    Oct 26, 2011 ... Ethiopian Journal of Environmental Studies and Management Vol. ... people are estimated to be at such risk by 2080 .... SCS-CN method is based on the water balance .... and psychological burden of flood hazard often fall.
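
    The snippet above names the SCS-CN (curve number) method; the standard form of that water-balance-based runoff equation, shown here with illustrative values only (not figures from the paper), is:

        # Standard SCS-CN direct runoff (metric units); inputs are illustrative.
        def scs_cn_runoff(p_mm: float, cn: float, ia_ratio: float = 0.2) -> float:
            """Direct runoff Q (mm) from storm rainfall P (mm) and curve number CN."""
            s = 25400.0 / cn - 254.0        # potential maximum retention (mm)
            ia = ia_ratio * s               # initial abstraction (mm)
            if p_mm <= ia:
                return 0.0
            return (p_mm - ia) ** 2 / (p_mm - ia + s)

        print(scs_cn_runoff(p_mm=80.0, cn=75.0))   # runoff from an 80 mm storm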

  9. Big Data – Big Deal for Organization Design?

    OpenAIRE

    Janne J. Korhonen

    2014-01-01

    Analytics is an increasingly important source of competitive advantage. It has even been posited that big data will be the next strategic emphasis of organizations and that analytics capability will be manifested in organizational structure. In this article, I explore how analytics capability might be reflected in organizational structure using the notion of  “requisite organization” developed by Jaques (1998). Requisite organization argues that a new strategic emphasis requires the addition ...

  10. Nowcasting using news topics Big Data versus big bank

    OpenAIRE

    Thorsrud, Leif Anders

    2016-01-01

    The agents in the economy use a plethora of high frequency information, including news media, to guide their actions and thereby shape aggregate economic fluctuations. Traditional nowcasting approaches have made relatively little use of such information. In this paper, I show how unstructured textual information in a business newspaper can be decomposed into daily news topics and used to nowcast quarterly GDP growth. Compared with a big bank of experts, here represented by official c...

  11. Introduction to flood control science

    International Nuclear Information System (INIS)

    Lee, Dong U; Ha, Jin Uk; Kim, Dong Ha; Shin, Hong Ryeol; Song, Seok Hwan; Kim, Jin Gyu; Moon, Heon Cheol

    2003-01-01

    This book covers an introduction; industrial disasters such as Bhopal and Chernobyl; earthquake disasters; volcano disasters; avalanche disasters, including loss allocation and prevention measures; and natural fires, illustrated by California and Yellowstone Park and the similarities between fire and flood. It also introduces climate change and disaster, the Earth's greenhouse effect, disasters due to current sea-level rise, flood damage, drought disasters, famine and drought, prediction of drought, population problems, the outlook for world population, and the disaster prevention administration system of Korea.

  12. Elk River Watershed - Flood Study

    Science.gov (United States)

    Barnes, C. C.; Byrne, J. M.; MacDonald, R. J.; Lewis, D.

    2014-12-01

    Flooding has the potential to cause significant impacts to economic activities as well as to disrupt or displace populations. Changing climate regimes, such as extreme precipitation events, increase flood vulnerability and put additional stresses on infrastructure. Potential flooding from just under 100 toxic tailings ponds located in Canada (2009 NPRI Reviewed Facility Data Release, Environment Canada) increases risk to human safety and the environment. One such geotechnical failure spilled billions of litres of toxic tailings into the Fraser River watershed, British Columbia, when a tailings pond dam breach occurred in August 2014. Damaged and washed-out roadways cut access to essential services, as seen in the extensive floods that occurred in Saskatchewan and Manitoba in July 2014, and in Southern Alberta in 2013. Recovery efforts from events such as these can be lengthy and have substantial social and economic impacts, both in loss of revenue and cost of repair. The objective of this study is to investigate existing conditions in the Elk River watershed and model potential future hydrological changes that can increase flood risk hazards. By analyzing existing hydrology, meteorology, land cover, land use, economic activity, and settlement patterns, a baseline is established for existing conditions in the Elk River watershed. Coupling the Generate Earth Systems Science (GENESYS) high-resolution spatial hydrometeorological model with flood hazard analysis methodology, high-resolution flood vulnerability baseline maps are created using historical climate conditions. Further work in 2015 will examine possible impacts for a range of climate change and land use change scenarios to define changes to future flood risk and vulnerability.

  13. Big Data Analytics

    DEFF Research Database (Denmark)

    Buch, Rasmus Brødsgaard; Beheshti-Kashi, Samaneh; Nielsen, Thomas Alexander Sick

    2018-01-01

    is not unveiled for various domains, including the transportation sector. Accordingly, this research aims at examining the potential of textual data in transportation. For this purpose, a case study was designed on public opinion towards the adoption of driverless cars. This case study was framed together...... with the Danish road directorate, which is, in this case, the problem owner. Traditionally, public opinion is often captured by means of surveys. However, this paper provides demonstrations in which public opinion towards the adoption of driverless cars is examined through the exploitation of newspaper articles......, the Danish Road Directorate can use these results to supplement their strategies and expectations towards the adoption of driverless cars by incorporating the public’s opinion more carefully....

  14. Big Data in Medicine is Driving Big Changes

    Science.gov (United States)

    Verspoor, K.

    2014-01-01

    Summary. Objectives: To summarise current research that takes advantage of “Big Data” in health and biomedical informatics applications. Methods: Survey of trends in this work, and exploration of literature describing how large-scale structured and unstructured data sources are being used to support applications from clinical decision making and health policy, to drug design and pharmacovigilance, and further to systems biology and genetics. Results: The survey highlights ongoing development of powerful new methods for turning that large-scale, and often complex, data into information that provides new insights into human health, in a range of different areas. Consideration of this body of work identifies several important paradigm shifts that are facilitated by Big Data resources and methods: in clinical and translational research, from hypothesis-driven research to data-driven research, and in medicine, from evidence-based practice to practice-based evidence. Conclusions: The increasing scale and availability of large quantities of health data require strategies for data management, data linkage, and data integration beyond the limits of many existing information systems, and substantial effort is underway to meet those needs. As our ability to make sense of that data improves, the value of the data will continue to increase. Health systems, genetics and genomics, population and public health; all areas of biomedicine stand to benefit from Big Data and the associated technologies. PMID:25123716

  15. Big data is not a monolith

    CERN Document Server

    Ekbia, Hamid R; Mattioli, Michael

    2016-01-01

    Big data is ubiquitous but heterogeneous. Big data can be used to tally clicks and traffic on web pages, find patterns in stock trades, track consumer preferences, identify linguistic correlations in large corpuses of texts. This book examines big data not as an undifferentiated whole but contextually, investigating the varied challenges posed by big data for health, science, law, commerce, and politics. Taken together, the chapters reveal a complex set of problems, practices, and policies. The advent of big data methodologies has challenged the theory-driven approach to scientific knowledge in favor of a data-driven one. Social media platforms and self-tracking tools change the way we see ourselves and others. The collection of data by corporations and government threatens privacy while promoting transparency. Meanwhile, politicians, policy makers, and ethicists are ill-prepared to deal with big data's ramifications. The contributors look at big data's effect on individuals as it exerts social control throu...

  16. Extent and frequency of floods on Delaware River in vicinity of Belvidere, New Jersey

    Science.gov (United States)

    Farlekas, George M.

    1966-01-01

    control, such as dams and levees. Both physical works and flood-plain regulations are included in the comprehensive plans for development of the Delaware River basin.Recommendations for land use, or suggestions for limitations of land use, are not made herein. Other reports on recommended general use and regulation of land in flood-prone areas are available (Dola, 1961; White, 1961; American Society of Civil Engineers Task Force on Flood Plain Regulations, 1962; and Goddard, 1963). The primary responsibility for planning for the optimum land use in the flood plain and the implementation of flood-plain zoning or other regulations to achieve such optimum use rest with the state and local interests. The preparation of this report was undertaken after consultation with representatives of the Lehigh-Northampton Counties, Pennsylvania, Joint Planning Commission and the Warren County, New Jersey, Regional Planning Board and after both had demonstrated their need for flood-plain information and their willingness to consider flood-plain regulations.

  17. Cyber Surveillance for Flood Disasters

    Directory of Open Access Journals (Sweden)

    Shi-Wei Lo

    2015-01-01

    Full Text Available Regional heavy rainfall is usually caused by the influence of extreme weather conditions. Instant heavy rainfall often results in the flooding of rivers and the neighboring low-lying areas, which is responsible for a large number of casualties and considerable property loss. The existing precipitation forecast systems mostly focus on the analysis and forecast of large-scale areas but do not provide precise instant automatic monitoring and alert feedback for individual river areas and sections. Therefore, in this paper, we propose an easy method to automatically monitor the flood object of a specific area, based on the currently widely used remote cyber surveillance systems and image processing methods, in order to obtain instant flooding and waterlogging event feedback. The intrusion detection mode of these surveillance systems is used in this study, wherein a flood is considered a possible invasion object. Through the detection and verification of flood objects, automatic flood risk-level monitoring of specific individual river segments, as well as the automatic urban inundation detection, has become possible. The proposed method can better meet the practical needs of disaster prevention than the method of large-area forecasting. It also has several other advantages, such as flexibility in location selection, no requirement of a standard water-level ruler, and a relatively large field of view, when compared with the traditional water-level measurements using video screens. The results can offer prompt reference for appropriate disaster warning actions in small areas, making them more accurate and effective.
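
    As a hedged sketch of the idea of treating flood water as an "intruding" object in a fixed camera view (the video source, region of interest and thresholds below are hypothetical, not the paper's algorithm), background subtraction plus a simple area threshold can flag a possible inundation:

        # Hypothetical surveillance-camera flood flagging via background subtraction.
        import cv2

        cap = cv2.VideoCapture("river_camera.mp4")            # placeholder video source
        subtractor = cv2.createBackgroundSubtractorMOG2(history=500, varThreshold=16)
        roi = (slice(300, 480), slice(0, 640))                 # image region to watch

        while True:
            ok, frame = cap.read()
            if not ok:
                break
            mask = subtractor.apply(frame)                     # changed (foreground) pixels
            if (mask[roi] > 0).mean() > 0.4:                   # large persistent change in ROI
                print("possible inundation detected")
        cap.release()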

  18. Scales of Natural Flood Management

    Science.gov (United States)

    Nicholson, Alex; Quinn, Paul; Owen, Gareth; Hetherington, David; Piedra Lara, Miguel; O'Donnell, Greg

    2016-04-01

    The scientific field of Natural Flood Management (NFM) is receiving much attention and is now widely seen as a valid solution to sustainably manage flood risk whilst offering significant multiple benefits. However, few examples exist looking at NFM on a large scale (>10 km2). Well-implemented NFM has the effect of restoring more natural catchment hydrological and sedimentological processes, which in turn can have significant flood risk and WFD benefits for catchment waterbodies. These catchment-scale improvements in turn allow more 'natural' processes to be returned to rivers and streams, creating a more resilient system. Although certain NFM interventions may appear distant and disconnected from main-stem waterbodies, they will undoubtedly be contributing to WFD at the catchment waterbody scale. This paper offers examples of NFM and explains how they can be maximised through practical design across many scales (from feature up to the whole catchment). New tools are presented to assist in the selection of measures and their locations: first to appreciate the flooding benefit at the local catchment scale, and then a Flood Impact Model that can best reflect the impacts of local changes further downstream. The tools will be discussed in the context of our most recent experiences on NFM projects, including river catchments in the north east of England and in Scotland. This work has encouraged a more integrated approach to flood management planning that can use both traditional and novel NFM strategies in an effective and convincing way.

  19. Flooding Effect on Earth Walls

    Directory of Open Access Journals (Sweden)

    Meysam Banimahd

    2010-12-01

    Full Text Available Earth building is a sustainable, environmentally friendly and economical method of construction that has been used worldwide for many centuries. For the past three decades, earth has seen a revival as a building material for modern construction due to its benefits in terms of low carbon content, low cost and low energy use during construction, as well as the fact that it is a sustainable building technology. Climate change is influencing precipitation levels and patterns around the world, and as a consequence, flood risk is increasing rapidly. When flooding occurs, earth buildings are exposed to water by submersion, causing an increase in the degree of saturation of the earth structures and therefore a decrease of the suction between particles. This study investigated the effect of cycles of flooding (consecutive events of flooding followed by dry periods) on earth walls. A series of characterization tests were carried out to obtain the physical and mechanical properties of the studied earth material. In a second stage, Flooding Simulation Tests (FST) were performed to explore the earth walls’ response to repeated flooding events. The results obtained for the tested earth walls/samples with reinforced material (straw) reveal hydraulic hysteresis when the walls/samples are subject to cycles of wetting and drying.

  20. Does Implementation of Big Data Analytics Improve Firms’ Market Value? Investors’ Reaction in Stock Market

    Directory of Open Access Journals (Sweden)

    Hansol Lee

    2017-06-01

    Full Text Available Recently, due to the development of social media, multimedia, and the Internet of Things (IoT), various types of data have increased. As existing data analytics tools cannot cover this huge volume of data, big data analytics has become one of the emerging technologies for business today. Considering that big data analytics is a relatively recent development, in the present study we investigated the impact of implementing big data analytics from a short-term perspective. We used an event study methodology to investigate the changes in stock price caused by announcements of big data analytics solution investments. A total of 54 investment announcements of firms publicly traded on NASDAQ and NYSE from 2010 to 2015 were collected. Our results empirically demonstrate that the announcement of a firm’s investment in a big data solution leads to positive stock market reactions. In addition, we also found that investments in small vendors’ solutions with industry-oriented functions tend to result in higher abnormal returns than those in big vendors’ solutions with general functions. Finally, our results also suggest that stock market investors evaluate the big data analytics investments of big firms more highly than those of small firms.
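
    For readers unfamiliar with the event study methodology mentioned above, a generic market-model sketch (synthetic returns and window lengths, not the authors' data or exact specification) computes the cumulative abnormal return around an announcement as follows:

        # Generic market-model event study on synthetic daily returns.
        import numpy as np

        rng = np.random.default_rng(0)
        market = rng.normal(0.0005, 0.01, 160)                   # market returns
        stock = 0.0002 + 1.1 * market + rng.normal(0, 0.008, 160)

        est, event = slice(0, 120), slice(150, 156)              # estimation / event windows
        beta, alpha = np.polyfit(market[est], stock[est], 1)     # market-model fit
        abnormal = stock[event] - (alpha + beta * market[event])
        print(f"cumulative abnormal return: {abnormal.sum():.4f}")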

  1. Constructing risks – Internalisation of flood risks in the flood risk management plan

    NARCIS (Netherlands)

    Roos, Matthijs; Hartmann, T.; Spit, T.J.M.; Johann, Georg

    Traditional flood protection methods have focused efforts on different measures to keep water out of floodplains. However, the European Flood Directive challenges this paradigm (Hartmann and Driessen, 2013). Accordingly, flood risk management plans should incorporate measures brought about by

  2. Big Data for Precision Medicine

    Directory of Open Access Journals (Sweden)

    Daniel Richard Leff

    2015-09-01

    Full Text Available This article focuses on the potential impact of big data analysis to improve health, prevent and detect disease at an earlier stage, and personalize interventions. The role that big data analytics may have in interrogating the patient electronic health record toward improved clinical decision support is discussed. We examine developments in pharmacogenetics that have increased our appreciation of the reasons why patients respond differently to chemotherapy. We also assess the expansion of online health communications and the way in which this data may be capitalized on in order to detect public health threats and control or contain epidemics. Finally, we describe how a new generation of wearable and implantable body sensors may improve wellbeing, streamline management of chronic diseases, and improve the quality of surgical implants.

  3. Big Data hvor N=1

    DEFF Research Database (Denmark)

    Bardram, Jakob Eyvind

    2017-01-01

    Research on the use of 'big data' in health care has only just begun, and in the long run it may become a great help in organising a more personal and holistic health effort for people with multiple chronic conditions. Personal health technology, briefly presented in this chapter, holds a...... great potential for carrying out 'big data' analyses for the individual person, that is, where N=1. There are major technological challenges in developing technologies and methods for collecting and handling personal data that can be shared across systems in a standardised, responsible, robust, secure and non...

  4. George and the big bang

    CERN Document Server

    Hawking, Lucy; Parsons, Gary

    2012-01-01

    George has problems. He has twin baby sisters at home who demand his parents’ attention. His beloved pig Freddy has been exiled to a farm, where he’s miserable. And worst of all, his best friend, Annie, has made a new friend whom she seems to like more than George. So George jumps at the chance to help Eric with his plans to run a big experiment in Switzerland that seeks to explore the earliest moment of the universe. But there is a conspiracy afoot, and a group of evildoers is planning to sabotage the experiment. Can George repair his friendship with Annie and piece together the clues before Eric’s experiment is destroyed forever? This engaging adventure features essays by Professor Stephen Hawking and other eminent physicists about the origins of the universe and ends with a twenty-page graphic novel that explains how the Big Bang happened—in reverse!

  5. Did the Big Bang begin?

    International Nuclear Information System (INIS)

    Levy-Leblond, J.

    1990-01-01

    It is argued that the age of the universe may well be numerically finite (20 billion years or so) and conceptually infinite. A new and natural time scale is defined on a physical basis using group-theoretical arguments. An additive notion of time is obtained according to which the age of the universe is indeed infinite. In other words, never did the Big Bang begin. This new time scale is not supposed to replace the ordinary cosmic time scale, but to supplement it (in the same way as rapidity has taken a place by the side of velocity in Einsteinian relativity). The question is discussed within the framework of conventional (big-bang) and classical (nonquantum) cosmology, but could easily be extended to more elaborate views, as the purpose is not so much to modify present theories as to reach a deeper understanding of their meaning

  6. Big Data in Drug Discovery.

    Science.gov (United States)

    Brown, Nathan; Cambruzzi, Jean; Cox, Peter J; Davies, Mark; Dunbar, James; Plumbley, Dean; Sellwood, Matthew A; Sim, Aaron; Williams-Jones, Bryn I; Zwierzyna, Magdalena; Sheppard, David W

    2018-01-01

    Interpretation of Big Data in the drug discovery community should enhance project timelines and reduce clinical attrition through improved early decision making. The issues we encounter start with the sheer volume of data and how we first ingest it before building an infrastructure to house it to make use of the data in an efficient and productive way. There are many problems associated with the data itself including general reproducibility, but often, it is the context surrounding an experiment that is critical to success. Help, in the form of artificial intelligence (AI), is required to understand and translate the context. On the back of natural language processing pipelines, AI is also used to prospectively generate new hypotheses by linking data together. We explain Big Data from the context of biology, chemistry and clinical trials, showcasing some of the impressive public domain sources and initiatives now available for interrogation. © 2018 Elsevier B.V. All rights reserved.

  7. Variations in flood magnitude-effect relations and the implications for flood risk assessment and river management

    Science.gov (United States)

    Hooke, J. M.

    2015-12-01

    In spite of major physical impacts from large floods, present river management rarely takes into account the possible dynamics and variation in magnitude-impact relations over time in flood risk mapping and assessment, nor does it incorporate the feedback effects of changes into modelling. Using examples from the literature and from field measurements over several decades in two contrasting environments, a semi-arid region and a humid-temperate region, temporal variations in channel response to flood events are evaluated. The evidence demonstrates how the physical impacts of floods can vary at a location over time. The factors influencing that variation on differing timescales are examined. The analysis indicates the importance of morphological changes and the trajectory of adjustment in relation to thresholds, and shows that trends in force or resistance can take place over various timescales, altering those thresholds. Sediment supply can also change with altered connectivity upstream and changes in the state of hillslope-channel coupling. It demonstrates that the seasonal timing and sequence of events can affect response, particularly deposition through sediment supply. Duration can also have a significant effect and modify the magnitude relation. A lack of response or deposits in some events can mean that flood frequency estimated from such evidence is underestimated. A framework for assessment of both past and possible future changes is provided which emphasises the uncertainty and inconstancy of the magnitude-impact relation and highlights the dynamic factors and the nature of variability that should be considered in sustainable management of river channels.

  8. Spatiotemporal hazard mapping of a flood event "migration" in a transboundary river basin as an operational tool in flood risk management

    Science.gov (United States)

    Perrou, Theodora; Papastergios, Asterios; Parcharidis, Issaak; Chini, Marco

    2017-10-01

    Flooding is among the most damaging natural disasters in the world, and flood disasters need to be monitored and evaluated in order to mitigate their consequences. As floods do not recognize borders, transboundary flood risk management is imperative in shared river basins. Disaster management is highly dependent on early information and requires data from the whole river basin. Based on the hypothesis that flood events over the same area with the same magnitude have an almost identical evolution, it is crucial to develop a repository database of historical flood events. This tool, in the case of extended transboundary river basins, could constitute an operational warning system for the downstream area. The utility of SAR images for flood mapping was demonstrated by previous studies, but the SAR systems previously in orbit were not characterized by high operational capacity. The Copernicus system will fill this gap in operational services for risk management, especially during the emergency phase. The operational capabilities have been significantly improved by the newly available satellite constellations, such as the Sentinel-1 A/B mission, which is able to provide systematic acquisitions with a very high temporal resolution and wide swath coverage. The present study deals with the monitoring of a transboundary flood event in the Evros basin. The objective of the study is to create the "migration story" of the flooded areas on the basis of their evolution in time for the event that occurred from October 2014 to May 2015. Flood hazard maps will be created using SAR-based semi-automatic algorithms, and then, through the synthesis of the related maps in a GIS system, a spatiotemporal thematic map of the event will be produced. The thematic map, combined with the TanDEM-X DEM (12 m/pixel spatial resolution), will define the non-affected areas, which is very useful information for the emergency planning and emergency response phases. The Sentinels meet the main requirements to be an effective and suitable
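
    As a simplified, hypothetical illustration of why SAR lends itself to flood mapping (open water appears dark, i.e. has low backscatter), a first-cut water mask can be obtained by thresholding a backscatter image; the array and the -18 dB threshold below are placeholders, not the semi-automatic algorithms used in the study:

        # Toy SAR water mask by thresholding backscatter (sigma0 in dB).
        import numpy as np

        sigma0_db = np.random.default_rng(1).normal(-12.0, 4.0, size=(500, 500))
        water_mask = sigma0_db < -18.0        # smooth water scatters little energy back
        print(f"fraction of scene mapped as water: {water_mask.mean():.2%}")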

  9. Big Data and central banks

    OpenAIRE

    David Bholat

    2015-01-01

    This commentary recaps a Centre for Central Banking Studies event held at the Bank of England on 2–3 July 2014. The article covers three main points. First, it situates the Centre for Central Banking Studies event within the context of the Bank’s Strategic Plan and initiatives. Second, it summarises and reflects on major themes from the event. Third, the article links central banks’ emerging interest in Big Data approaches with their broader uptake by other economic agents.

  10. Big Bang or vacuum fluctuation

    International Nuclear Information System (INIS)

    Zel'dovich, Ya.B.

    1980-01-01

    Some general properties of vacuum fluctuations in quantum field theory are described. The connection between the "energy dominance" of the energy density of vacuum fluctuations in curved space-time and the presence of singularity is discussed. It is pointed out that a de-Sitter space-time (with the energy density of the vacuum fluctuations in the Einstein equations) that matches the expanding Friedman solution may describe the history of the Universe before the Big Bang. (P.L.)

  11. Big bang is not needed

    Energy Technology Data Exchange (ETDEWEB)

    Allen, A.D.

    1976-02-01

    Recent computer simulations indicate that a system of n gravitating masses breaks up, even when the total energy is negative. As a result, almost any initial phase-space distribution results in a universe that eventually expands under the Hubble law. Hence Hubble expansion implies little regarding an initial cosmic state. Especially it does not imply the singularly dense superpositioned state used in the big bang model.

  12. Hydrometeorological network for flood monitoring and modeling

    Science.gov (United States)

    Efstratiadis, Andreas; Koussis, Antonis D.; Lykoudis, Spyros; Koukouvinos, Antonis; Christofides, Antonis; Karavokiros, George; Kappos, Nikos; Mamassis, Nikos; Koutsoyiannis, Demetris

    2013-08-01

    Due to its highly fragmented geomorphology, Greece comprises hundreds of small- to medium-size hydrological basins, in which the terrain is often fairly steep and the streamflow regime ephemeral. These are typically affected by flash floods, occasionally causing severe damage. Yet, the vast majority of them lack flow-gauging infrastructure providing systematic hydrometric data at fine time scales. This has obvious impacts on the quality and reliability of flood studies, which typically use simplistic approaches for ungauged basins that do not consider local peculiarities in sufficient detail. In order to provide a consistent framework for flood design and to ensure realistic predictions of flood risk - a key issue of the 2007/60/EC Directive - it is essential to improve the monitoring infrastructure by taking advantage of modern technologies for remote control and data management. In this context, within the research project DEUCALION, we have recently installed and are operating, in four pilot river basins, a telemetry-based hydro-meteorological network that comprises automatic stations and is linked to and supported by relevant software. The hydrometric stations measure stage, using 50-kHz ultrasonic pulses or piezometric sensors, or both stage (piezometric) and velocity via acoustic Doppler radar; all measurements are temperature-corrected. The meteorological stations record air temperature, pressure, relative humidity, wind speed and direction, and precipitation. Data transfer is made via GPRS or mobile telephony modems. The monitoring network is supported by a web-based application for the storage, visualization and management of geographical and hydro-meteorological data (ENHYDRIS), a software tool for data analysis and processing (HYDROGNOMON), as well as an advanced model for flood simulation (HYDROGEIOS). The recorded hydro-meteorological observations are accessible over the Internet through the www-application. The system is operational and its
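
    A gauging network of this kind ultimately converts measured stage to discharge; as a generic illustration (the coefficients are invented and unrelated to the stations or software named above), a power-law rating curve does this as follows:

        # Generic stage-discharge (rating curve) conversion, Q = a * (h - h0)**b.
        def rating_curve(stage_m: float, a: float = 12.0, h0: float = 0.15,
                         b: float = 1.6) -> float:
            """Discharge Q (m3/s) from stage h (m); coefficients are illustrative."""
            if stage_m <= h0:
                return 0.0
            return a * (stage_m - h0) ** b

        for h in (0.3, 0.8, 1.5):
            print(h, round(rating_curve(h), 1))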

  13. Do flood risk perceptions provide useful insights for flood risk management? Findings from central Vietnam

    OpenAIRE

    Bubeck, P.; Botzen, W.J.W.; Suu, L.T.T.; Aerts, J.C.J.H.

    2012-01-01

    Following the renewed attention for non-structural flood risk reduction measures implemented at the household level, there has been an increased interest in individual flood risk perceptions. The reason for this is the commonly-made assumption that flood risk perceptions drive the motivation of individuals to undertake flood risk mitigation measures, as well as the public's demand for flood protection, and therefore provide useful insights for flood risk management. This study empirically exa...

  14. Big data: the management revolution.

    Science.gov (United States)

    McAfee, Andrew; Brynjolfsson, Erik

    2012-10-01

    Big data, the authors write, is far more powerful than the analytics of the past. Executives can measure and therefore manage more precisely than ever before. They can make better predictions and smarter decisions. They can target more-effective interventions in areas that so far have been dominated by gut and intuition rather than by data and rigor. The differences between big data and analytics are a matter of volume, velocity, and variety: More data now cross the internet every second than were stored in the entire internet 20 years ago. Nearly real-time information makes it possible for a company to be much more agile than its competitors. And that information can come from social networks, images, sensors, the web, or other unstructured sources. The managerial challenges, however, are very real. Senior decision makers have to learn to ask the right questions and embrace evidence-based decision making. Organizations must hire scientists who can find patterns in very large data sets and translate them into useful business information. IT departments have to work hard to integrate all the relevant internal and external sources of data. The authors offer two success stories to illustrate how companies are using big data: PASSUR Aerospace enables airlines to match their actual and estimated arrival times. Sears Holdings directly analyzes its incoming store data to make promotions much more precise and faster.

  15. Big Data Comes to School

    Directory of Open Access Journals (Sweden)

    Bill Cope

    2016-03-01

    Full Text Available The prospect of “big data” at once evokes optimistic views of an information-rich future and concerns about surveillance that adversely impacts our personal and private lives. This overview article explores the implications of big data in education, focusing by way of example on data generated by student writing. We have chosen writing because it presents particular complexities, highlighting the range of processes for collecting and interpreting evidence of learning in the era of computer-mediated instruction and assessment as well as the challenges. Writing is significant not only because it is central to the core subject area of literacy; it is also an ideal medium for the representation of deep disciplinary knowledge across a number of subject areas. After defining what big data entails in education, we map emerging sources of evidence of learning that separately and together have the potential to generate unprecedented amounts of data: machine assessments, structured data embedded in learning, and unstructured data collected incidental to learning activity. Our case is that these emerging sources of evidence of learning have significant implications for the traditional relationships between assessment and instruction. Moreover, for educational researchers, these data are in some senses quite different from traditional evidentiary sources, and this raises a number of methodological questions. The final part of the article discusses implications for practice in an emerging field of education data science, including publication of data, data standards, and research ethics.

  16. Big russian oil round

    International Nuclear Information System (INIS)

    Slovak, K.; Beer, G.

    2006-01-01

    The departure of Mikhail Khodorkovsky has brought an end to the idyllic times of supplies of Russian oil to the MOL-Slovnaft group. The group used to purchase oil directly from Yukos. But now brokers have again entered the Central European oil business. And their aim is to take control over all of the oil business. The Russians demonstrated the changed situation to Slovakia last autumn: you will either accept the new model, or there will be problems with oil deliveries. Consumers got the message. The main brokers of Russian oil in Central Europe are the Swiss companies Glencore and Fisotra. Little information is available regarding these commodity brokers. But the information available is sufficient to indicate that these are not small companies. Glencore undertakes 3% of all international oil trades. With an annual turnover of 72 billion USD, it was the biggest Swiss company by turnover in 2004. Fisotra also has an extensive product portfolio. It offers financial and commercial services and does not hide its good relations with Russian oil companies. Between 1994 and 1998, it managed their financial operations with major western companies such as BP, Cargill, Elf, Exxon, Shell, Total, and Mitsubishi, and also with Glencore. Fisotra states that some of its clients achieved an annual turnover of 1.5 billion USD. At present, the Swiss brokers receive a fee of 1 to 1.5 USD per barrel. The Russian political elite must be aware of these brokerage services, as oil transport through the transit system is closely monitored by the state-owned company Transneft. (authors)

  17. Long-term changes in community assembly, resistance, and resilience following experimental floods.

    Science.gov (United States)

    Robinson, Christopher T

    2012-10-01

    This study examined the long-term changes in community assembly, resistance, and resilience of macroinvertebrates following 10 years of experimental floods in a flow-regulated river. Physico-chemistry, macroinvertebrates, and periphyton biomass were monitored before and sequentially after each of 22 floods, and drift/seston was collected during six separate floods over the study period. The floods reduced the density and taxon richness of macroinvertebrates, and a nonmetric multidimensional scaling (NMDS) analysis distinguished temporal shifts in community assembly. Resistance (measured as the relative lack of loss in density) to floods varied among taxa, and the abundance of resistant taxa was related to the temporal changes in community assembly. Community resistance was inversely related to flood magnitude, with all larger floods (> 25 m3/s, > 16-fold over baseflow) reducing densities by > 75% regardless of flood year, whereas smaller floods (floods. No relationship was found between flood magnitude and the relative loss in periphyton biomass. Resilience was defined as the recovery slope (the positive slope of a parameter with time following each flood) and was unrelated to shifts in community assembly or resistance. Macroinvertebrate drift and seston demonstrated hysteresis (i.e., a temporal response in parameter quantity with change in discharge) during each flood, although larger floods typically had two peaks in both parameters. The first peak was a response to the initial increase in flow, whereas the second peak was associated with streambed disturbance (substrate mobility) and side-slope failure causing increased scour. Drift density was 3-9 times greater and that of seston 3-30 times greater during larger floods than smaller floods. These results demonstrate temporal shifts in macroinvertebrate community assembly toward a pre-dam assemblage following sequential floods in this flow regulated river, thus confirming the ecological role of habitat filtering in
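
    An NMDS ordination of the kind used to track these assembly shifts can be sketched as follows; the abundance matrix is synthetic, and the choice of Bray-Curtis dissimilarity is an assumption, not necessarily the distance measure used in the study:

        # NMDS ordination of synthetic macroinvertebrate samples (rows) x taxa (columns).
        import numpy as np
        from scipy.spatial.distance import pdist, squareform
        from sklearn.manifold import MDS

        abund = np.random.default_rng(2).poisson(5, size=(12, 20))
        dissim = squareform(pdist(abund, metric="braycurtis"))

        nmds = MDS(n_components=2, metric=False, dissimilarity="precomputed",
                   random_state=0)
        scores = nmds.fit_transform(dissim)
        print(scores[:3])        # ordination coordinates of the first three samples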

  18. Natural Flood Management in context: evaluating and enhancing the impact.

    Science.gov (United States)

    Metcalfe, Peter; Beven, Keith; Hankin, Barry; Lamb, Rob

    2016-04-01

    The series of flood events in the UK throughout December 2015 has led to calls for a reappraisal of the country's approach to flood management. In parts of Cumbria, so-called "1 in 100" year floods have occurred three times in the last ten years, leading to significant infrastructure damage. Hard-engineered defences, upgraded to cope with an anticipated 20% increase in peak flows and these 1% AEP events, have been overwhelmed. It has become more widely acknowledged that unsympathetic agricultural and upland management practices, mainly since the Second World War, have led to a significant loss of storage in mid and upper catchments and a consequent reduction in their ability to retain and slow storm run-off. Natural Flood Management (NFM) is a nature-based solution for restoring this storage and flood peak attenuation through a network of small-scale features exploiting natural topography and materials. Combined with other "soft" interventions such as restoring floodplain roughness and tree-planting, NFM offers the attractive prospect of an intervention that can target both the ecological and chemical objectives of the Water Framework Directive and the resilience demanded by the Floods Directive. We developed a simple computerised physical routing model that can account for the presence of in-channel and offline features such as would be found in an NFM scheme. These add storage to the channel and floodplain and throttle the downstream discharge at storm flows. The model was applied to the heavily modified channel network of an agricultural catchment in North Yorkshire using the run-off simulated for two storm events that caused flooding downstream in the autumn of 2012. Using up to 60 online features, we demonstrated some gains in channel storage and a small impact on the flood hydrograph, which would, however, have been insufficient to prevent the downstream floods in either of the storms. Complementary research at JBA has applied their hydrodynamic model JFLOW+ to identify
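
    To illustrate the kind of storage-driven attenuation such a routing model represents (this is a generic Muskingum sketch with invented parameters, not the authors' model), one can route a synthetic storm hydrograph through a reach:

        # Generic Muskingum channel routing of a synthetic inflow hydrograph.
        import numpy as np

        def muskingum(inflow, k_hr=2.0, x=0.2, dt_hr=1.0):
            d = 2 * k_hr * (1 - x) + dt_hr
            c1 = (dt_hr - 2 * k_hr * x) / d          # weight on current inflow
            c2 = (dt_hr + 2 * k_hr * x) / d          # weight on previous inflow
            c3 = (2 * k_hr * (1 - x) - dt_hr) / d    # weight on previous outflow
            out = np.zeros_like(inflow)
            out[0] = inflow[0]
            for t in range(1, len(inflow)):
                out[t] = c1 * inflow[t] + c2 * inflow[t - 1] + c3 * out[t - 1]
            return out

        t = np.arange(48.0)
        inflow = 5 + 45 * np.exp(-0.5 * ((t - 12) / 3) ** 2)    # synthetic hydrograph (m3/s)
        print(f"peak in {inflow.max():.1f}, peak out {muskingum(inflow).max():.1f} m3/s")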

  19. Planning of technical flood retention measures in large river basins under consideration of imprecise probabilities of multivariate hydrological loads

    Directory of Open Access Journals (Sweden)

    D. Nijssen

    2009-08-01

    With regard to these known unknowns, the bias of the simulations was considered by means of imprecise probabilities. Probabilities derived from measured flood data were combined with probabilities estimated from long simulated series. To handle imprecise probabilities, fuzzy sets were used to distinguish between more and less possible design floods. The need for such a differentiated view of the performance of flood protection systems is demonstrated by a case study.
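
    As a toy illustration of representing an imprecise design-flood estimate with a fuzzy set (the triangular membership function and its numbers below are invented, not taken from the case study), one can grade how "possible" a given design value is:

        # Triangular fuzzy number for an uncertain 100-year design flood (m3/s).
        def triangular_membership(q, low=850.0, mode=1000.0, high=1250.0):
            """Degree of possibility (0..1) that q is the design flood."""
            if q <= low or q >= high:
                return 0.0
            if q <= mode:
                return (q - low) / (mode - low)
            return (high - q) / (high - mode)

        for q in (900.0, 1000.0, 1200.0):
            print(q, round(triangular_membership(q), 2))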

  20. Drivers of flood damage on event level

    DEFF Research Database (Denmark)

    Kreibich, H.; Aerts, J. C. J. H.; Apel, H.

    2016-01-01

    Flood risk is dynamic and influenced by many processes related to hazard, exposure and vulnerability. Flood damage has increased significantly over the past decades; however, the resulting overall economic loss per event is an aggregated indicator, and it is difficult to attribute causes to this increasing...... trend. Much has been learned about damaging processes during floods at the micro-scale, e.g. the building level. However, little is known about the main factors determining the amount of flood damage at the event level. Thus, we analyse and compare paired flood events, i.e. consecutive, similar damaging floods...... example are the 2002 and 2013 floods in the Elbe and Danube catchments in Germany. The 2002 flood caused the highest economic damage (EUR 11600 million) due to a natural hazard event in Germany. Damage was so high due to an extreme flood hazard triggered by extreme precipitation and a high number