WorldWideScience

Sample records for big sioux aquifer

  1. Construction of a groundwater-flow model for the Big Sioux Aquifer using airborne electromagnetic methods, Sioux Falls, South Dakota

    Science.gov (United States)

    Valder, Joshua F.; Delzer, Gregory C.; Carter, Janet M.; Smith, Bruce D.; Smith, David V.

    2016-09-28

    The city of Sioux Falls is the fastest growing community in South Dakota. In response to this continued growth and planning for future development, Sioux Falls requires a sustainable supply of municipal water. Planning and managing sustainable groundwater supplies requires a thorough understanding of local groundwater resources. The Big Sioux aquifer consists of glacial outwash sands and gravels and is hydraulically connected to the Big Sioux River, which provided about 90 percent of the city’s source-water production in 2015. Managing sustainable groundwater supplies also requires an understanding of groundwater availability. An effective mechanism to inform water management decisions is the development and utilization of a groundwater-flow model. A groundwater-flow model provides a quantitative framework for synthesizing field information and conceptualizing hydrogeologic processes. These groundwater-flow models can support decision making processes by mapping and characterizing the aquifer. Accordingly, the city of Sioux Falls partnered with the U.S. Geological Survey to construct a groundwater-flow model. Model inputs will include data from advanced geophysical techniques, specifically airborne electromagnetic methods.
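
    Production models of this kind are usually built with codes such as MODFLOW, but the core idea - solving the groundwater-flow equation on a grid so that field observations can be synthesized into heads and flows - can be shown in a few lines. The sketch below is a minimal, hypothetical Python illustration (grid size, boundary heads, and the uniform-conductivity treatment are invented for the example, not parameters of the Big Sioux model):

      import numpy as np

      # Minimal steady-state 2-D flow sketch: solve Laplace's equation for
      # hydraulic head h by Jacobi-style iteration, with fixed heads on the
      # west and east boundaries and no-flow boundaries north and south.
      # Real models add heterogeneous conductivity, recharge, wells, and
      # river boundaries; every number here is hypothetical.
      nrow, ncol = 20, 30
      h = np.zeros((nrow, ncol))
      h[:, 0] = 460.0   # fixed head (m), west boundary
      h[:, -1] = 450.0  # fixed head (m), east boundary

      for _ in range(10000):
          h_old = h.copy()
          # interior nodes: average of the four neighbors (uniform K)
          h[1:-1, 1:-1] = 0.25 * (h_old[:-2, 1:-1] + h_old[2:, 1:-1] +
                                  h_old[1:-1, :-2] + h_old[1:-1, 2:])
          # no-flow north/south boundaries: mirror the adjacent row
          h[0, 1:-1] = h[1, 1:-1]
          h[-1, 1:-1] = h[-2, 1:-1]
          if np.abs(h - h_old).max() < 1e-6:
              break

      print(h.round(2))  # heads decline smoothly from west to east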

  2. Delineation of the hydrogeologic framework of the Big Sioux aquifer near Sioux Falls, South Dakota, using airborne electromagnetic data

    Science.gov (United States)

    Valseth, Kristen J.; Delzer, Gregory C.; Price, Curtis V.

    2018-03-21

    The U.S. Geological Survey, in cooperation with the City of Sioux Falls, South Dakota, began developing a groundwater-flow model of the Big Sioux aquifer in 2014 that will enable the City to make more informed water management decisions, such as delineation of areas of the greatest specific yield, which is crucial for locating municipal wells. Innovative tools are being evaluated as part of this study that can improve the delineation of the hydrogeologic framework of the aquifer for use in development of a groundwater-flow model, and the approach could have transfer value for similar hydrogeologic settings. The first step in developing a groundwater-flow model is determining the hydrogeologic framework (vertical and horizontal extents of the aquifer), which typically is determined by interpreting geologic information from drillers’ logs and surficial geology maps. However, well and borehole data only provide hydrogeologic information for a single location; conversely, nearly continuous geophysical data are collected along flight lines using airborne electromagnetic (AEM) surveys. These electromagnetic data are collected every 3 meters along a flight line (on average) and subsequently can be related to hydrogeologic properties. AEM data, coupled with and constrained by well and borehole data, can substantially improve the accuracy of aquifer hydrogeologic framework delineations and result in better groundwater-flow models. AEM data were acquired using the Resolve frequency-domain AEM system to map the Big Sioux aquifer in the region of the city of Sioux Falls. The survey acquired more than 870 line-kilometers of AEM data over a total area of about 145 square kilometers, primarily over the flood plain of the Big Sioux River between the cities of Dell Rapids and Sioux Falls. The U.S. Geological Survey inverted the survey data to generate resistivity-depth sections that were used in two-dimensional maps and in three-dimensional volumetric visualizations of the Earth
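
    As a rough illustration of how inverted resistivity-depth sections can be turned into a hydrogeologic framework, the sketch below classifies each depth cell of a resistivity model as coarse outwash (aquifer) or clay-rich till using a resistivity cutoff, then sums an aquifer thickness per sounding. The cutoff, layer thickness, and resistivity values are assumptions for illustration, not values from the Resolve survey:

      import numpy as np

      # Hypothetical inverted AEM resistivity model: rows are soundings along
      # a flight line, columns are depth layers of uniform 2 m thickness.
      rng = np.random.default_rng(0)
      resistivity = rng.lognormal(mean=3.0, sigma=0.6, size=(5, 10))  # ohm-m

      LAYER_THICKNESS_M = 2.0
      AQUIFER_CUTOFF_OHM_M = 40.0  # assumed: sands/gravels above, till below

      is_aquifer = resistivity > AQUIFER_CUTOFF_OHM_M
      aquifer_thickness_m = is_aquifer.sum(axis=1) * LAYER_THICKNESS_M

      for i, t in enumerate(aquifer_thickness_m):
          print(f"sounding {i}: interpreted aquifer thickness = {t:.1f} m")

    In practice the cutoff would be calibrated against the co-located well and borehole data mentioned above rather than fixed a priori.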

  3. 76 FR 53827 - Safety Zone; Big Sioux River From the Military Road Bridge North Sioux City to the Confluence of...

    Science.gov (United States)

    2011-08-30

    ...-AA00 Safety Zone; Big Sioux River From the Military Road Bridge North Sioux City to the Confluence of... restricting navigation on the Big Sioux River from the Military Road Bridge in North Sioux City, South Dakota... zone on the Big Sioux River from the Military Road Bridge in North Sioux City, SD at 42.52 degrees...

  4. 76 FR 38013 - Safety Zone; Big Sioux River From the Military Road Bridge North Sioux City to the Confluence of...

    Science.gov (United States)

    2011-06-29

    ...-AA00 Safety Zone; Big Sioux River From the Military Road Bridge North Sioux City to the Confluence of... Military Road Bridge in North Sioux City, South Dakota to the confluence of the Missouri River and... Big Sioux River from the Military Road Bridge in North Sioux City, SD at 42.52 degrees North, 096.48...

  5. Use of geochemical tracers for estimating groundwater influxes to the Big Sioux River, eastern South Dakota, USA

    Science.gov (United States)

    Neupane, Ram P.; Mehan, Sushant; Kumar, Sandeep

    2017-09-01

    Understanding the spatial distribution and variability of geochemical tracers is crucial for estimating groundwater influxes into a river and can contribute to better future water management strategies. Because of the much higher radon (222Rn) activities in groundwater compared to river water, 222Rn was used as the main tracer to estimate groundwater influxes to river discharge over a 323-km distance of the Big Sioux River, eastern South Dakota, USA; these influx estimates were compared to the estimates using Cl- concentrations. In the reaches overall, groundwater influxes using the 222Rn activity approach ranged between 0.3 and 6.4 m3/m/day (mean 1.8 m3/m/day) and the cumulative groundwater influx estimated during the study period was 3,982-146,594 m3/day (mean 40,568 m3/day), accounting for 0.2-41.9% (mean 12.5%) of the total river flow rate. The mean groundwater influx derived using the 222Rn activity approach was lower than that calculated based on Cl- concentration (35.6 m3/m/day) for most of the reaches. Based on the Cl- approach, groundwater accounted for 37.3% of the total river flow rate. The difference between the method estimates may be associated with minimal differences between groundwater and river Cl- concentrations. These assessments will provide a better understanding of estimates used for the allocation of water resources to sustain agricultural productivity in the basin. However, a more detailed sampling program is necessary for accurate influx estimation, and also to understand the influence of seasonal variation on groundwater influxes into the basin.
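
    A minimal sketch of the steady-state 222Rn mass balance this approach rests on is shown below: radon exported downstream, lost to the atmosphere, and lost to decay over a reach must be supplied by groundwater inflow. The function and all numbers are hypothetical illustrations (the balance neglects tributaries, hyporheic production, and in-river 226Ra), not the Big Sioux data:

      # Steady-state 222Rn balance for one river reach (hypothetical sketch).
      LAMBDA_RN = 0.181  # 222Rn decay constant, 1/day

      def gw_influx(q_up, c_up, q_dn, c_dn, c_gw, width, depth, length, k_gas):
          """Concentrations in Bq/m3, discharges in m3/day, lengths in m,
          k_gas = gas-exchange velocity in m/day; returns influx in m3/m/day."""
          c_mean = 0.5 * (c_up + c_dn)
          export = q_dn * c_dn - q_up * c_up                   # downstream export, Bq/day
          degassing = k_gas * width * length * c_mean          # loss to atmosphere, Bq/day
          decay = LAMBDA_RN * depth * width * length * c_mean  # in-stream decay, Bq/day
          return (export + degassing + decay) / (length * c_gw)

      # hypothetical 10-km reach, 20 m wide, 1 m deep
      print(gw_influx(q_up=8.0e4, c_up=300.0, q_dn=9.0e4, c_dn=350.0,
                      c_gw=10_000.0, width=20.0, depth=1.0,
                      length=10_000.0, k_gas=1.0))  # ~0.8 m3/m/day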

  6. Hydrogeologic data for the Big River-Mishnock River stream-aquifer system, central Rhode Island

    Science.gov (United States)

    Craft, P.A.

    2001-01-01

    Hydrogeology, ground-water development alternatives, and water quality in the Big-Mishnock stream-aquifer system in central Rhode Island are being investigated as part of a long-term cooperative program between the Rhode Island Water Resources Board and the U.S. Geological Survey to evaluate the ground-water resources throughout Rhode Island. The study area includes the Big River drainage basin and that portion of the Mishnock River drainage basin upstream from the Mishnock River at State Route 3. This report presents geologic data and hydrologic and water-quality data for ground and surface water. Ground-water data were collected from July 1996 through September 1998 from a network of observation wells consisting of existing wells and wells installed for this study, which provided a broad distribution of data-collection sites throughout the study area. Streambed piezometers were used to obtain data on head differences between surface-water levels and ground-water levels to help evaluate stream-aquifer interactions throughout the study area. The types of data presented include monthly ground-water levels, average daily ground-water withdrawals, drawdown data from aquifer tests, and water-quality data. Historical water-level data from other wells within the study area also are presented in this report. Surface-water data were obtained from a network consisting of surface-water impoundments, such as ponds and reservoirs, existing and newly established partial-record stream-discharge sites, and synoptic surface-water-quality sites. Water levels were collected monthly from the surface-water impoundments. Stream-discharge measurements were made at partial-record sites to provide measurements of inflow, outflow, and internal flow throughout the study area. Specific conductance was measured monthly at partial-record sites during the study, and also during the fall and spring of 1997 and 1998 at 41 synoptic sites throughout the study area. General geologic data, such as
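
    The streambed-piezometer measurements mentioned above are typically converted into an estimate of vertical stream-aquifer exchange with Darcy's law; a minimal sketch with hypothetical numbers (not Big-Mishnock data) follows:

      # Darcy's law at a streambed piezometer: q = K * dh / dL, with q > 0
      # taken here as upward flow (groundwater discharging to the stream).
      k_vertical_m_per_day = 0.5   # streambed vertical K, assumed
      piezometer_head_m = 10.42    # head at the screen below the bed, assumed
      stream_stage_m = 10.30       # assumed
      screen_depth_m = 1.0         # depth of screen below the bed, assumed

      dh = piezometer_head_m - stream_stage_m
      q = k_vertical_m_per_day * dh / screen_depth_m
      print(f"q = {q:.3f} m/day ({'gaining' if q > 0 else 'losing'} reach)")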

  7. The U.S. Army's Sioux Campaign of 1876: Identifying the Horse as the Center of Gravity of the Sioux

    National Research Council Canada - National Science Library

    Hoyt, Mark

    2003-01-01

    .... If the Army had a complete understanding of the Sioux they would have realized that the hub of all power or center of gravity of the Sioux was the horse, which every major aspect of Sioux life...

  8. Lower Sioux Wind Feasibility & Development

    Energy Technology Data Exchange (ETDEWEB)

    Minkel, Darin

    2012-04-01

    This report describes the process and findings of a Wind Energy Feasibility Study (Study) conducted by the Lower Sioux Indian Community (Community). The Community is evaluating the development of a wind energy project located on tribal land. The project scope was to analyze the critical issues in determining advantages and disadvantages of wind development within the Community. This analysis addresses both of the Community's wind energy development objectives: the single-turbine project and the commercial-scale multiple-turbine project. The main tasks of the feasibility study are: land use and constraint analysis; wind resource evaluation; utility interconnection analysis; and project structure and economics.

  9. Aquifers

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — This map layer contains the shallowest principal aquifers of the conterminous United States, Hawaii, Puerto Rico, and the U.S. Virgin Islands, portrayed as polygons....

  10. Lakota Sioux Indian Dance Theatre. Cuesheet for Students.

    Science.gov (United States)

    Carr, John C.; And Others

    This performance guide provides students with an introduction to Lakota Sioux history and culture and to the dances performed by the Lakota Sioux Indian Dance Theatre. The Lakota Sioux believe that life is a sacred circle in which all things are connected, and that the circle was broken for them in 1890 by the massacre at Wounded Knee. Only in…

  11. The Dakota or Sioux. Gopher Historian Leaflet Series No. 5.

    Science.gov (United States)

    Minnesota Historical Society, St. Paul.

    The Dakota or Sioux people may well be the best known of all the nations which first lived in North America. Tribal members gave themselves the name Dakota, meaning friends. Their Minnesota neighbors called them by a long name meaning enemy. French traders in the 1600s took the last part of the name and wrote it down as Sioux. Since then, they…

  12. Occurrence of anthropogenic organic compounds and nutrients in source and finished water in the Sioux Falls area, South Dakota, 2009-10

    Science.gov (United States)

    Hoogestraat, Galen K.

    2012-01-01

    Anthropogenic organic compounds (AOCs) in drinking-water sources commonly are derived from municipal, agricultural, and industrial wastewater sources, and are a concern for water-supply managers. A cooperative study between the city of Sioux Falls, S. Dak., and the U.S. Geological Survey was initiated in 2009 to (1) characterize the occurrence of anthropogenic organic compounds in the source waters (groundwater and surface water) to water supplies in the Sioux Falls area, (2) determine if the compounds detected in the source waters also are present in the finished water, and (3) identify probable sources of nitrate in the Big Sioux River Basin and determine if sources change seasonally or under different hydrologic conditions. This report presents analytical results of water-quality samples collected from source waters and finished waters in the Sioux Falls area. The study approach included the collection of water samples from source and finished waters in the Sioux Falls area for the analyses of AOCs, nutrients, and nitrogen and oxygen isotopes in nitrate. Water-quality constituents monitored in this study were chosen to represent a variety of the contaminants known or suspected to occur within the Big Sioux River Basin, including pesticides, pharmaceuticals, sterols, household and industrial products, polycyclic aromatic hydrocarbons, antibiotics, and hormones. A total of 184 AOCs were monitored, of which 40 AOCs had relevant human-health benchmarks. During 11 sampling visits, 45 AOCs (24 percent) were detected in at least one sample of source or finished water, and 13 AOCs were detected in at least 20 percent of all samples. Concentrations of detected AOCs were all less than 1 microgram per liter, except for two AOCs in multiple samples from the Big Sioux River, and one AOC in finished-water samples. Concentrations of AOCs were less than 0.1 microgram per liter in more than 75 percent of the detections. Nutrient concentrations varied seasonally in source
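
    The summary percentages follow directly from the reported counts; a one-line check of that bookkeeping (counts taken from the abstract):

      monitored_aocs = 184
      detected_aocs = 45
      print(f"{detected_aocs / monitored_aocs:.0%}")  # -> 24%, as reported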

  13. Saltwater intrusion in the surficial aquifer system of the Big Cypress Basin, southwest Florida, and a proposed plan for improved salinity monitoring

    Science.gov (United States)

    Prinos, Scott T.

    2013-01-01

    The installation of drainage canals, poorly cased wells, and water-supply withdrawals have led to saltwater intrusion in the primary water-use aquifers in southwest Florida. Increasing population and water use have exacerbated this problem. Installation of water-control structures, well-plugging projects, and regulation of water use have slowed saltwater intrusion, but the chloride concentration of samples from some of the monitoring wells in this area indicates that saltwater intrusion continues to occur. In addition, rising sea level could increase the rate and extent of saltwater intrusion. The existing saltwater intrusion monitoring network was examined and found to lack the necessary organization, spatial distribution, and design to properly evaluate saltwater intrusion. The most recent hydrogeologic framework of southwest Florida indicates that some wells may be open to multiple aquifers or have an incorrect aquifer designation. Some of the sampling methods being used could result in poor-quality data. Some older wells are badly corroded, obstructed, or damaged and may not yield useable samples. Saltwater in some of the canals is in close proximity to coastal well fields. In some instances, saltwater occasionally occurs upstream from coastal salinity control structures. These factors lead to an incomplete understanding of the extent and threat of saltwater intrusion in southwest Florida. A proposed plan to improve the saltwater intrusion monitoring network in the South Florida Water Management District’s Big Cypress Basin describes improvements in (1) network management, (2) quality assurance, (3) documentation, (4) training, and (5) data accessibility. The plan describes improvements to hydrostratigraphic and geospatial network coverage that can be accomplished using additional monitoring, surface geophysical surveys, and borehole geophysical logging. Sampling methods and improvements to monitoring well design are described in detail. Geochemical analyses

  14. Virginia Driving Hawk Sneve, Sioux Author. With Teacher's Guide. Native Americans of the Twentieth Century.

    Science.gov (United States)

    Minneapolis Public Schools, MN.

    A biography for elementary school students describes the life and career of Virginia Driving Hawk Sneve (Sioux), a Native American free-lance writer, and includes her photograph and a map of South Dakota reservations. A story by Mrs. Sneve tells about a half-Sioux boy who confronts his heritage when his grandfather makes a long journey between his…

  15. Lower Sioux Indian Community Repository Research Project: Final report

    International Nuclear Information System (INIS)

    O'Brien, L.; Farmer, D.; Lewis, S.

    1988-01-01

    The Upper and Lower Sioux Communities have undertaken a review of the geotechnical aspects of the Department of Energy (DOE) program document entitled Draft Area Recommendation Report (DARR). The DARR recommends twenty sites be retained for continued consideration as a possible location for the second high-level nuclear waste repository. Of these twenty sites, twelve are designated as Potentially Acceptable Sites (PAS), and eight are designated as candidate areas to serve as "back-ups" to the PAS's. It is understood there are no current plans to investigate any of the eight candidate areas. It is distressing to the Upper and Lower Sioux Communities that the DOE intends to hold these eight sites in reserve. We do not feel it is appropriate to identify "reserve" sites which could be elevated to a PAS at any time during the area phase of investigation. The following chapters in this report provide a summary of the specific procedural and technical problems noted in the screening methodology, and describe our concerns over the selection of NC-13 and NC-14 as reserve sites. Also included are the specific comments recorded by our technical subcontractors as they examined the DARR for us. 10 refs

  16. Alluvial Aquifer

    Data.gov (United States)

    Kansas Data Access and Support Center — This coverage shows the extents of the alluvial aquifers in Kansas. The alluvial aquifers consist of unconsolidated Quaternary alluvium and contiguous terrace...

  17. Indianness, Sex, and Grade Differences on Behavior and Personality Measures Among Oglala Sioux Adolescents

    Science.gov (United States)

    Cress, Joseph N.; O'Donnell, James P.

    1974-01-01

    This study assesses Indianness (mixed or full-blood), sex, and grade differences among Oglala Sioux high school students, using the Coopersmith Behavior Rating Forms and the Quay-Peterson Behavior Problem Checklist. Results indicate that mixed-bloods had higher achievement and greater popularity than full-bloods. Fewer problems and higher…

  18. Patriarchy and the "Fighting Sioux": A Gendered Look at Racial College Sports Nicknames

    Science.gov (United States)

    Williams, Dana M.

    2006-01-01

    The use of Native American nicknames and symbols by US college athletics is a long-standing practice that embodies various forms of authoritarian oppression. One type of authoritarianism is that of patriarchy and it has been present in the struggle over the nickname at the University of North Dakota, the "Fighting Sioux". This article…

  19. Carbonate aquifers

    Science.gov (United States)

    Cunningham, Kevin J.; Sukop, Michael; Curran, H. Allen

    2012-01-01

    Only limited hydrogeological research has been conducted using ichnology in carbonate aquifer characterization. Regardless, important applications of ichnology to carbonate aquifer characterization include its use to distinguish and delineate depositional cycles, correlate mappable biogenically altered surfaces, identify zones of preferential groundwater flow and paleogroundwater flow, and better understand the origin of ichnofabric-related karst features. Three case studies, which include Pleistocene carbonate rocks of the Biscayne aquifer in southern Florida and Cretaceous carbonate strata of the Edwards–Trinity aquifer system in central Texas, demonstrate that (1) there can be a strong relation between ichnofabrics and groundwater flow in carbonate aquifers and (2) ichnology can offer a useful methodology for carbonate aquifer characterization. In these examples, zones of extremely permeable, ichnofabric-related macroporosity are mappable stratiform geobodies and as such can be represented in groundwater flow and transport simulations.

  20. 77 FR 27417 - Foreign-Trade Zone 220-Sioux Falls, SD; Notification of Proposed Production Activity, Rosenbauer...

    Science.gov (United States)

    2012-05-10

    ... is used for the production of emergency vehicles and firefighting equipment (pumps, tankers, rescue... drives, DC motors, static converters, rechargeable flashlights, flashlight parts, electrical foam..., LLC, (Emergency Vehicles/Firefighting Equipment), Lyons, SD The Sioux Falls Development Foundation...

  1. Sediment and Hydraulic Measurements with Computed Bed Load on the Missouri River, Sioux City to Hermann, 2014

    Science.gov (United States)

    2017-05-01

    ERDC/CHL TR-17-8. Sediment and Hydraulic Measurements with Computed Bed Load on the Missouri River, Sioux City to Hermann, 2014. David Abraham, Marielys Ramos-Villanueva, Thad Pratt...Engineers, Omaha and Kansas City Districts, in quantifying sediment bed load and suspended load at several sites on the Missouri River for the

  2. Ozark Aquifer

    Data.gov (United States)

    Kansas Data Access and Support Center — These digital maps contain information on the altitude of the base and top, the extent, and the potentiometric surface of the Ozark aquifer in Kansas. The Ozark...

  3. Big universe, big data

    DEFF Research Database (Denmark)

    Kremer, Jan; Stensbo-Smidt, Kristoffer; Gieseke, Fabian Cristian

    2017-01-01

    ...modern astronomy requires big data know-how; in particular, it demands highly efficient machine learning and image analysis algorithms. But scalability is not the only challenge: astronomy applications touch several current machine learning research questions, such as learning from biased data and dealing... ...and highlight some recent methodological advancements in machine learning and image analysis triggered by astronomical applications.

  4. Big data, big responsibilities

    Directory of Open Access Journals (Sweden)

    Primavera De Filippi

    2014-01-01

    Big data refers to the collection and aggregation of large quantities of data produced by and about people, things or the interactions between them. With the advent of cloud computing, specialised data centres with powerful computational hardware and software resources can be used for processing and analysing a humongous amount of aggregated data coming from a variety of different sources. The analysis of such data is all the more valuable to the extent that it allows for specific patterns to be found and new correlations to be made between different datasets, so as to eventually deduce or infer new information, as well as to potentially predict behaviours or assess the likelihood for a certain event to occur. This article will focus specifically on the legal and moral obligations of online operators collecting and processing large amounts of data, to investigate the potential implications of big data analysis on the privacy of individual users and on society as a whole.

  5. Guarani aquifer

    International Nuclear Information System (INIS)

    2007-01-01

    The environmental protection and sustainable development project for the Guarani Aquifer System is a joint effort of Argentina, Brazil, Paraguay and Uruguay, with the purpose of increasing knowledge of the resource and proposing a technical, legal and organizational framework for its sustainable management among the countries. The Universities Fund was created to support regional universities in promotion, training and academic research activities related to the environmental and social aspects of the Guarani Aquifer System. The aim of the project is the management and protection of the groundwater resources, with their assessment and use for the benefit of present and future generations.

  6. Big science

    CERN Multimedia

    Nadis, S

    2003-01-01

    "Big science" is moving into astronomy, bringing large experimental teams, multi-year research projects, and big budgets. If this is the wave of the future, why are some astronomers bucking the trend? (2 pages).

  7. "I Am Not a Fairy Tale": Contextualizing Sioux Spirituality and Story Traditions in Susan Power's "The Grass Dancer"

    Science.gov (United States)

    Diana, Vanessa Holford

    2009-01-01

    Standing Rock Sioux writer Susan Power's best-selling novel "The Grass Dancer" (1994) includes depictions of the supernatural and spiritual that do not conform to the Judeo-Christian or, in some cases, the atheist or rationalist worldviews of many readers. Power writes of ghost characters and haunted places, communication between the living and…

  8. The medicine wheel nutrition intervention: a diabetes education study with the Cheyenne River Sioux Tribe.

    Science.gov (United States)

    Kattelmann, Kendra K; Conti, Kibbe; Ren, Cuirong

    2009-09-01

    The Northern Plains Indians of the Cheyenne River Sioux Tribe have experienced significant lifestyle and dietary changes over the past seven generations that have resulted in increased rates of diabetes and obesity. The objective of this study was to determine if Northern Plains Indians with type 2 diabetes mellitus who are randomized to receive culturally adapted educational lessons based on the Medicine Wheel Model for Nutrition in addition to their usual dietary education will have better control of their type 2 diabetes than a nonintervention, usual care group who received only the usual dietary education from their personal providers. A 6-month, randomized, controlled trial was conducted January 2005 through December 2005, with participants randomized to the education intervention or usual care control group. The education group received six nutrition lessons based on the Medicine Wheel Model for Nutrition. The usual care group received the usual dietary education from their personal providers. One hundred fourteen Northern Plains Indians from the Cheyenne River Sioux Tribe aged 18 to 65 years, with type 2 diabetes. Weight, body mass index (BMI), hemoglobin A1c, fasting serum glucose and lipid parameters, circulating insulin, and blood pressure were measured at the beginning and completion. Diet histories, physical activity, and dietary satiety surveys were measured at baseline and monthly through completion. Differences were determined using Student t tests, chi-square tests, and analysis of variance. The education group had a significant weight loss (1.4+/-0.4 kg, P<0.05). The nutrition intervention promoted small but positive changes in weight. Greater frequency and longer duration of educational support may be needed to influence blood glucose and lipid parameters.
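
    The group comparison reported above is a standard two-sample Student t test; a minimal sketch of such a weight-change comparison follows. The arrays are fabricated illustration values (including the group size of 57), not data from the Cheyenne River study:

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(1)
      education = rng.normal(loc=-1.4, scale=2.0, size=57)   # weight change, kg
      usual_care = rng.normal(loc=0.0, scale=2.0, size=57)   # weight change, kg

      t_stat, p_value = stats.ttest_ind(education, usual_care)
      print(f"t = {t_stat:.2f}, p = {p_value:.4f}")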

  9. Urbanising Big

    DEFF Research Database (Denmark)

    Ljungwall, Christer

    2013-01-01

    Development in China raises the question of how big a city can become, and at the same time be sustainable, writes Christer Ljungwall of the Swedish Agency for Growth Policy Analysis.

  10. Big Argumentation?

    Directory of Open Access Journals (Sweden)

    Daniel Faltesek

    2013-08-01

    Big Data is nothing new. Public concern regarding the mass diffusion of data has appeared repeatedly with computing innovations; in the formation before Big Data, it was most recently referred to as the information explosion. In this essay, I argue that the appeal of Big Data is not a function of computational power, but of a synergistic relationship between aesthetic order and a politics evacuated of a meaningful public deliberation. Understanding, and challenging, Big Data requires an attention to the aesthetics of data visualization and the ways in which those aesthetics would seem to depoliticize information. The conclusion proposes an alternative argumentative aesthetic as the appropriate response to the depoliticization posed by the popular imaginary of Big Data.

  11. Big data

    DEFF Research Database (Denmark)

    Madsen, Anders Koed; Flyverbom, Mikkel; Hilbert, Martin

    2016-01-01

    The claim that big data can revolutionize strategy and governance in the context of international relations is increasingly hard to ignore. Scholars of international political sociology have mainly discussed this development through the themes of security and surveillance. The aim of this paper is to outline a research agenda that can be used to raise a broader set of sociological and practice-oriented questions about the increasing datafication of international relations and politics. First, it proposes a way of conceptualizing big data that is broad enough to open fruitful investigations into the emerging use of big data in these contexts. This conceptualization includes the identification of three moments contained in any big data practice. Second, it suggests a research agenda built around a set of subthemes that each deserve dedicated scrutiny when studying the interplay between big data...

  12. How Big Is Too Big?

    Science.gov (United States)

    Cibes, Margaret; Greenwood, James

    2016-01-01

    Media Clips appears in every issue of Mathematics Teacher, offering readers contemporary, authentic applications of quantitative reasoning based on print or electronic media. This issue features "How Big is Too Big?" (Margaret Cibes and James Greenwood) in which students are asked to analyze the data and tables provided and answer a…

  13. Big Surveys, Big Data Centres

    Science.gov (United States)

    Schade, D.

    2016-06-01

    Well-designed astronomical surveys are powerful and have consistently been keystones of scientific progress. The Byurakan Surveys using a Schmidt telescope with an objective prism produced a list of about 3000 UV-excess Markarian galaxies but these objects have stimulated an enormous amount of further study and appear in over 16,000 publications. The CFHT Legacy Surveys used a wide-field imager to cover thousands of square degrees and those surveys are mentioned in over 1100 publications since 2002. Both ground and space-based astronomy have been increasing their investments in survey work. Survey instrumentation strives toward fair samples and large sky coverage and therefore tends to produce massive datasets. Thus we are faced with the "big data" problem in astronomy. Survey datasets require specialized approaches to data management. Big data places additional challenging requirements on data management. If the term "big data" is defined as data collections that are too large to move, then there are profound implications for the infrastructure that supports big data science. The current model of data centres is obsolete. In the era of big data the central problem is how to create architectures that effectively manage the relationship between data collections, networks, processing capabilities, and software, given the science requirements of the projects that need to be executed. A stand-alone data silo cannot support big data science. I'll describe the current efforts of the Canadian community to deal with this situation and our successes and failures. I'll talk about how we are planning in the next decade to try to create a workable and adaptable solution to support big data science.

  14. Big Opportunities and Big Concerns of Big Data in Education

    Science.gov (United States)

    Wang, Yinying

    2016-01-01

    Against the backdrop of the ever-increasing influx of big data, this article examines the opportunities and concerns over big data in education. Specifically, this article first introduces big data, followed by delineating the potential opportunities of using big data in education in two areas: learning analytics and educational policy. Then, the…

  15. The U.S. Army’s Sioux Campaign of 1876: Identifying the Horse as the Center of Gravity of the Sioux

    Science.gov (United States)

    2003-06-06


  16. Big Dreams

    Science.gov (United States)

    Benson, Michael T.

    2015-01-01

    The Keen Johnson Building is symbolic of Eastern Kentucky University's historic role as a School of Opportunity. It is a place that has inspired generations of students, many from disadvantaged backgrounds, to dream big dreams. The construction of the Keen Johnson Building was inspired by a desire to create a student union facility that would not…

  17. Big Science

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    1986-05-15

    Astronomy, like particle physics, has become Big Science where the demands of front line research can outstrip the science budgets of whole nations. Thus came into being the European Southern Observatory (ESO), founded in 1962 to provide European scientists with a major modern observatory to study the southern sky under optimal conditions.

  18. Cyclone Boiler Field Testing of Advanced Layered NOx Control Technology in Sioux Unit 1

    Energy Technology Data Exchange (ETDEWEB)

    Marc A. Cremer; Bradley R. Adams

    2006-06-30

    A four-week testing program was completed during this project to assess the ability of the combination of deep staging, Rich Reagent Injection (RRI), and Selective Non-Catalytic Reduction (SNCR) to reduce NOx emissions below 0.15 lb/MBtu in a cyclone-fired boiler. The host site for the tests was AmerenUE's Sioux Unit 1, a 500 MW cyclone-fired boiler located near St. Louis, MO. Reaction Engineering International (REI) led the project team including AmerenUE, FuelTech Inc., and the Electric Power Research Institute (EPRI). This layered approach to NOx reduction is termed the Advanced Layered Technology Approach (ALTA). Installed RRI and SNCR port locations were guided by computational fluid dynamics (CFD) based modeling conducted by REI. During the parametric testing, NOx emissions of 0.12 lb/MBtu were achieved consistently from overfire air (OFA)-only baseline NOx emissions of 0.25 lb/MBtu or less, when firing the typical 80/20 fuel blend of Powder River Basin (PRB) and Illinois No.6 coals. From OFA-only baseline levels of 0.20 lb/MBtu, NOx emissions of 0.12 lb/MBtu were also achieved, but at significantly reduced urea flow rates. Under the deeply staged conditions that were tested, RRI performance was observed to degrade as higher blends of Illinois No.6 were used. NOx emissions achieved with ALTA while firing a 60/40 blend were approximately 0.15 lb/MBtu. NOx emissions while firing 100% Illinois No.6 were approximately 0.165 lb/MBtu. Based on the performance results of these tests, economic analyses of the application of ALTA to a nominal 500 MW cyclone unit show that the levelized cost to achieve 0.15 lb/MBtu is well below 75% of the cost of a state-of-the-art SCR.
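
    The headline figures imply NOx reductions of roughly 40 to 50 percent relative to the OFA-only baselines; the arithmetic, using the emission values quoted in the abstract:

      # Percent NOx reduction from OFA-only baseline to ALTA (lb/MBtu values
      # from the abstract, 80/20 PRB / Illinois No.6 blend).
      cases = {
          "0.25 lb/MBtu baseline": (0.25, 0.12),
          "0.20 lb/MBtu baseline": (0.20, 0.12),
      }
      for label, (baseline, alta) in cases.items():
          reduction = 100.0 * (baseline - alta) / baseline
          print(f"{label}: {reduction:.0f}% reduction")  # 52% and 40%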

  19. Big Egos in Big Science

    DEFF Research Database (Denmark)

    Andersen, Kristina Vaarst; Jeppesen, Jacob

    In this paper we investigate the micro-mechanisms governing structural evolution and performance of scientific collaboration. Scientific discovery tends not to be led by so-called lone "stars", or big egos, but instead by collaboration among groups of researchers, from a multitude of institutions

  20. Big Data and Big Science

    OpenAIRE

    Di Meglio, Alberto

    2014-01-01

    Brief introduction to the challenges of big data in scientific research based on the work done by the HEP community at CERN and how the CERN openlab promotes collaboration among research institutes and industrial IT companies. Presented at the FutureGov 2014 conference in Singapore.

  1. Ground-water appraisal in northwestern Big Stone County, west-central Minnesota

    Science.gov (United States)

    Soukup, W.G.

    1980-01-01

    The development of ground water for irrigation in northwestern Big Stone County has not kept up with development in other irrigable areas of the State. This is due, in part, to the absence of extensive surficial aquifers and the difficulty in locating buried aquifers.

  2. Big inquiry

    Energy Technology Data Exchange (ETDEWEB)

    Wynne, B [Lancaster Univ. (UK)

    1979-06-28

    The recently published report entitled 'The Big Public Inquiry' from the Council for Science and Society and the Outer Circle Policy Unit is considered, with especial reference to any future enquiry which may take place into the first commercial fast breeder reactor. Proposals embodied in the report include stronger rights for objectors, and an attempt is made to tackle the problem that participation in a public inquiry comes far too late to be effective. It is felt by the author that the CSS/OCPU report is a constructive contribution to the debate about big technology inquiries but that it fails to understand the deeper currents in the economic and political structure of technology which so influence the consequences of whatever formal procedures are evolved.

  3. Big Data

    OpenAIRE

    Bútora, Matúš

    2017-01-01

    The aim of this bachelor's thesis is to describe Big Data and the OLAP aggregation operations for decision support that are applied to it using Apache Hadoop technology. The greater part of the thesis is devoted to describing this technology. The last chapter deals with how the aggregation operations are applied and with the issues involved in implementing them. The thesis closes with an overall evaluation and with options for future use of the resulting system.
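
    For readers unfamiliar with how Hadoop expresses such aggregations, the sketch below shows an OLAP-style roll-up (SUM by key) structured as map and reduce steps; plain Python stands in for the Hadoop runtime, and the records are hypothetical:

      from collections import defaultdict

      records = [("north", 120.0), ("south", 80.0), ("north", 45.5), ("east", 60.0)]

      def mapper(record):
          region, amount = record
          yield region, amount          # emit key/value pairs

      def reducer(key, values):
          return key, sum(values)       # aggregate per key

      grouped = defaultdict(list)       # the "shuffle" phase: group by key
      for record in records:
          for key, value in mapper(record):
              grouped[key].append(value)

      print(sorted(reducer(k, v) for k, v in grouped.items()))
      # [('east', 60.0), ('north', 165.5), ('south', 80.0)]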

  4. BIG DATA

    OpenAIRE

    Abhishek Dubey

    2018-01-01

    The term 'Big Data' describes innovative methods and technologies for capturing, storing, distributing, managing and analyzing petabyte-or-larger sets of data with high velocity and diverse structures. Big data can be structured, semi-structured or unstructured, which makes routine data-management techniques inadequate. Data are produced from many different sources and can arrive in the system at various rates. In order to handle this...

  5. Keynote: Big Data, Big Opportunities

    OpenAIRE

    Borgman, Christine L.

    2014-01-01

    The enthusiasm for big data is obscuring the complexity and diversity of data in scholarship and the challenges for stewardship. Inside the black box of data are a plethora of research, technology, and policy issues. Data are not shiny objects that are easily exchanged. Rather, data are representations of observations, objects, or other entities used as evidence of phenomena for the purposes of research or scholarship. Data practices are local, varying from field to field, individual to indiv...

  6. Networking for big data

    CERN Document Server

    Yu, Shui; Misic, Jelena; Shen, Xuemin (Sherman)

    2015-01-01

    Networking for Big Data supplies an unprecedented look at cutting-edge research on the networking and communication aspects of Big Data. Starting with a comprehensive introduction to Big Data and its networking issues, it offers deep technical coverage of both theory and applications. The book is divided into four sections: introduction to Big Data, networking theory and design for Big Data, networking security for Big Data, and platforms and systems for Big Data applications. Focusing on key networking issues in Big Data, the book explains network design and implementation for Big Data. It exa

  7. Big Data

    DEFF Research Database (Denmark)

    Aaen, Jon; Nielsen, Jeppe Agger

    2016-01-01

    Big Data presents itself as one of the most hyped technological innovations of our time, proclaimed to hold the seed of new, valuable operational insights for private companies and public organizations. While the optimistic announcements are many, research on Big Data in the public sector has so far been limited. This article examines how the public health sector can reuse and exploit an ever-growing amount of data while taking public values into account. The article is based on a case study of the use of large amounts of health data in the Dansk AlmenMedicinsk Database (DAMD). The analysis shows that (re)use of data in new contexts is a multifaceted trade-off not only between economic rationales and quality considerations, but also over control of sensitive personal data and the ethical implications for the citizen. In the DAMD case, data are used on the one hand "in the service of the good cause" to...

  8. Big data analytics turning big data into big money

    CERN Document Server

    Ohlhorst, Frank J

    2012-01-01

    Unique insights to implement big data analytics and reap big returns to your bottom line Focusing on the business and financial value of big data analytics, respected technology journalist Frank J. Ohlhorst shares his insights on the newly emerging field of big data analytics in Big Data Analytics. This breakthrough book demonstrates the importance of analytics, defines the processes, highlights the tangible and intangible values and discusses how you can turn a business liability into actionable material that can be used to redefine markets, improve profits and identify new business opportuni

  9. EPA Sole Source Aquifers

    Data.gov (United States)

    U.S. Environmental Protection Agency — Information on sole source aquifers (SSAs) is widely used in assessments under the National Environmental Policy Act and at the state and local level. A national...

  10. Tracers Detect Aquifer Contamination

    National Research Council Canada - National Science Library

    Enfield, Carl

    1995-01-01

    The EPA's National Laboratory (NRMRL) at Ada, OK, along with the University of Florida and the University of Texas, have developed a tracer procedure to detect the amount of contamination in aquifer formations...

  11. Ogallala Aquifer Mapping Program

    International Nuclear Information System (INIS)

    1984-10-01

    A computerized data file has been established which can be used efficiently by the contour-plotting program SURFACE II to produce maps of the Ogallala aquifer in 17 counties of the Texas Panhandle. The data collected have been evaluated and compiled into three sets, from which SURFACE II can generate maps of well control, aquifer thickness, saturated thickness, water level, and the difference between virgin (pre-1942) and recent (1979 to 1981) water levels. 29 figures, 1 table
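
    The derived maps are simple grid operations once the surfaces are assembled: saturated thickness is water level minus aquifer base, and the change is virgin minus recent water level. A minimal sketch with hypothetical gridded elevations (NumPy standing in for the SURFACE II input file):

      import numpy as np

      # Hypothetical gridded surfaces, elevations in feet.
      water_level_recent = np.array([[3005., 3002.], [2998., 2995.]])  # 1979-81
      water_level_virgin = np.array([[3040., 3036.], [3031., 3027.]])  # pre-1942
      aquifer_base = np.array([[2900., 2905.], [2910., 2915.]])

      saturated_thickness = water_level_recent - aquifer_base
      decline = water_level_virgin - water_level_recent

      print(saturated_thickness)  # grid to contour as saturated thickness
      print(decline)              # grid to contour as water-level change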

  12. Big Data, Big Problems: A Healthcare Perspective.

    Science.gov (United States)

    Househ, Mowafa S; Aldosari, Bakheet; Alanazi, Abdullah; Kushniruk, Andre W; Borycki, Elizabeth M

    2017-01-01

    Much has been written on the benefits of big data for healthcare such as improving patient outcomes, public health surveillance, and healthcare policy decisions. Over the past five years, Big Data, and the data sciences field in general, has been hyped as the "Holy Grail" for the healthcare industry, promising a more efficient healthcare system and improved healthcare outcomes. However, more recently, healthcare researchers are exposing the potential harmful effects Big Data can have on patient care, associating it with increased medical costs, patient mortality, and misguided decision making by clinicians and healthcare policy makers. In this paper, we review the current Big Data trends with a specific focus on the inadvertent negative impacts that Big Data could have on healthcare, in general, and specifically, as it relates to patient and clinical care. Our study results show that although Big Data is built up to be the "Holy Grail" for healthcare, small data techniques using traditional statistical methods are, in many cases, more accurate and can lead to more improved healthcare outcomes than Big Data methods. In sum, Big Data for healthcare may cause more problems for the healthcare industry than solutions, and in short, when it comes to the use of data in healthcare, "size isn't everything."

  13. Big Game Reporting Stations

    Data.gov (United States)

    Vermont Center for Geographic Information — Point locations of big game reporting stations. Big game reporting stations are places where hunters can legally report harvested deer, bear, or turkey. These are...

  14. Stalin's Big Fleet Program

    National Research Council Canada - National Science Library

    Mauner, Milan

    2002-01-01

    Although Dr. Milan Hauner's study 'Stalin's Big Fleet program' has focused primarily on the formation of Big Fleets during the Tsarist and Soviet periods of Russia's naval history, there are important lessons...

  15. Big Data Semantics

    NARCIS (Netherlands)

    Ceravolo, Paolo; Azzini, Antonia; Angelini, Marco; Catarci, Tiziana; Cudré-Mauroux, Philippe; Damiani, Ernesto; Mazak, Alexandra; van Keulen, Maurice; Jarrar, Mustafa; Santucci, Giuseppe; Sattler, Kai-Uwe; Scannapieco, Monica; Wimmer, Manuel; Wrembel, Robert; Zaraket, Fadi

    2018-01-01

    Big Data technology has discarded traditional data modeling approaches as no longer applicable to distributed data processing. It is, however, largely recognized that Big Data impose novel challenges in data and infrastructure management. Indeed, multiple components and procedures must be

  16. Characterizing Big Data Management

    OpenAIRE

    Rogério Rossi; Kechi Hirama

    2015-01-01

    Big data management is a reality for an increasing number of organizations in many areas and represents a set of challenges involving big data modeling, storage and retrieval, analysis and visualization. However, technological resources, people and processes are crucial to facilitate the management of big data in any kind of organization, allowing information and knowledge from a large volume of data to support decision-making. Big data management can be supported by these three dimensions: t...

  17. Social big data mining

    CERN Document Server

    Ishikawa, Hiroshi

    2015-01-01

    Social Media. Big Data and Social Data. Hypotheses in the Era of Big Data. Social Big Data Applications. Basic Concepts in Data Mining. Association Rule Mining. Clustering. Classification. Prediction. Web Structure Mining. Web Content Mining. Web Access Log Mining, Information Extraction and Deep Web Mining. Media Mining. Scalability and Outlier Detection.

  18. Big data computing

    CERN Document Server

    Akerkar, Rajendra

    2013-01-01

    Due to market forces and technological evolution, Big Data computing is developing at an increasing rate. A wide variety of novel approaches and tools have emerged to tackle the challenges of Big Data, creating both more opportunities and more challenges for students and professionals in the field of data computation and analysis. Presenting a mix of industry cases and theory, Big Data Computing discusses the technical and practical issues related to Big Data in intelligent information management. Emphasizing the adoption and diffusion of Big Data tools and technologies in industry, the book i

  19. Microsoft big data solutions

    CERN Document Server

    Jorgensen, Adam; Welch, John; Clark, Dan; Price, Christopher; Mitchell, Brian

    2014-01-01

    Tap the power of Big Data with Microsoft technologies Big Data is here, and Microsoft's new Big Data platform is a valuable tool to help your company get the very most out of it. This timely book shows you how to use HDInsight along with HortonWorks Data Platform for Windows to store, manage, analyze, and share Big Data throughout the enterprise. Focusing primarily on Microsoft and HortonWorks technologies but also covering open source tools, Microsoft Big Data Solutions explains best practices, covers on-premises and cloud-based solutions, and features valuable case studies. Best of all,

  20. Characterizing Big Data Management

    Directory of Open Access Journals (Sweden)

    Rogério Rossi

    2015-06-01

    Big data management is a reality for an increasing number of organizations in many areas and represents a set of challenges involving big data modeling, storage and retrieval, analysis and visualization. However, technological resources, people and processes are crucial to facilitate the management of big data in any kind of organization, allowing information and knowledge from a large volume of data to support decision-making. Big data management can be supported by these three dimensions: technology, people and processes. Hence, this article discusses these dimensions: the technological dimension that is related to storage, analytics and visualization of big data; the human aspects of big data; and, in addition, the process management dimension that involves in a technological and business approach the aspects of big data management.

  1. HARNESSING BIG DATA VOLUMES

    Directory of Open Access Journals (Sweden)

    Bogdan DINU

    2014-04-01

    Big Data can revolutionize humanity. Hidden within the huge amounts and variety of the data we are creating we may find information, facts, social insights and benchmarks that were once virtually impossible to find or were simply inexistent. Large volumes of data allow organizations to tap in real time the full potential of all the internal or external information they possess. Big data calls for quick decisions and innovative ways to assist customers and the society as a whole. Big data platforms and product portfolio will help customers harness to the full the value of big data volumes. This paper deals with technical and technological issues related to handling big data volumes in the Big Data environment.

  2. The big bang

    International Nuclear Information System (INIS)

    Chown, Marcus.

    1987-01-01

    The paper concerns the 'Big Bang' theory of the creation of the Universe 15 thousand million years ago, and traces events which physicists predict occurred soon after the creation. Unified theory of the moment of creation, evidence of an expanding Universe, the X-boson - the particle produced very soon after the big bang which vanished from the Universe one-hundredth of a second after the big bang - and the fate of the Universe are all discussed. (U.K.)

  3. Summary big data

    CERN Document Server

    2014-01-01

    This work offers a summary of the book "Big Data: A Revolution That Will Transform How We Live, Work, and Think" by Viktor Mayer-Schönberger and Kenneth Cukier. The summary explains that big data is where we use huge quantities of data to make better predictions based on the fact that we identify patterns in the data, rather than trying to understand the underlying causes in more detail. This summary highlights that big data will be a source of new economic value and innovation in the future. Moreover, it shows that it will

  4. Data: Big and Small.

    Science.gov (United States)

    Jones-Schenk, Jan

    2017-02-01

    Big data is a big topic in all leadership circles. Leaders in professional development must develop an understanding of what data are available across the organization that can inform effective planning for forecasting. Collaborating with others to integrate data sets can increase the power of prediction. Big data alone is insufficient to make big decisions. Leaders must find ways to access small data and triangulate multiple types of data to ensure the best decision making. J Contin Educ Nurs. 2017;48(2):60-61. Copyright 2017, SLACK Incorporated.

  5. A Big Video Manifesto

    DEFF Research Database (Denmark)

    Mcilvenny, Paul Bruce; Davidsen, Jacob

    2017-01-01

    For the last few years, we have witnessed a hype about the potential results and insights that quantitative big data can bring to the social sciences. The wonder of big data has moved into education, traffic planning, and disease control with a promise of making things better with big numbers and beautiful visualisations. However, we also need to ask what the tools of big data can do both for the Humanities and for more interpretative approaches and methods. Thus, we prefer to explore how the power of computation, new sensor technologies and massive storage can also help with video-based qualitative...

  6. Arsenic levels in groundwater aquifer

    African Journals Online (AJOL)

    Miodrag Jelic

    resistance (ρ); dielectric constant (ε); magnetic permeability (η); electrochemical activity ... comprises grey sands of different particle size distribution ... groundwater: testing pollution mechanisms for sedimentary aquifers in Bangladesh.

  7. EPA Region 1 Sole Source Aquifers

    Data.gov (United States)

    U.S. Environmental Protection Agency — This coverage contains boundaries of EPA-approved sole source aquifers. Sole source aquifers are defined as an aquifer designated as the sole or principal source of...

  8. Bliver big data til big business?

    DEFF Research Database (Denmark)

    Ritter, Thomas

    2015-01-01

    Denmark has a digital infrastructure, a culture of record-keeping, and IT-competent employees and customers that make a leading position possible, but only if companies gear up for the next big data wave.

  9. Dual of big bang and big crunch

    International Nuclear Information System (INIS)

    Bak, Dongsu

    2007-01-01

    Starting from the Janus solution and its gauge theory dual, we obtain the dual gauge theory description of the cosmological solution by the procedure of double analytic continuation. The coupling is driven either to zero or to infinity at the big-bang and big-crunch singularities, which are shown to be related by the S-duality symmetry. In the dual Yang-Mills theory description, these are nonsingular as the coupling goes to zero in the N=4 super Yang-Mills theory. The cosmological singularities simply signal the failure of the supergravity description of the full type IIB superstring theory

  10. Big Data and Neuroimaging.

    Science.gov (United States)

    Webb-Vargas, Yenny; Chen, Shaojie; Fisher, Aaron; Mejia, Amanda; Xu, Yuting; Crainiceanu, Ciprian; Caffo, Brian; Lindquist, Martin A

    2017-12-01

    Big Data are of increasing importance in a variety of areas, especially in the biosciences. There is an emerging critical need for Big Data tools and methods, because of the potential impact of advancements in these areas. Importantly, statisticians and statistical thinking have a major role to play in creating meaningful progress in this arena. We would like to emphasize this point in this special issue, as it highlights both the dramatic need for statistical input for Big Data analysis and for a greater number of statisticians working on Big Data problems. We use the field of statistical neuroimaging to demonstrate these points. As such, this paper covers several applications and novel methodological developments of Big Data tools applied to neuroimaging data.

  11. Big data for health.

    Science.gov (United States)

    Andreu-Perez, Javier; Poon, Carmen C Y; Merrifield, Robert D; Wong, Stephen T C; Yang, Guang-Zhong

    2015-07-01

    This paper provides an overview of recent developments in big data in the context of biomedical and health informatics. It outlines the key characteristics of big data and how medical and health informatics, translational bioinformatics, sensor informatics, and imaging informatics will benefit from an integrated approach of piecing together different aspects of personalized information from a diverse range of data sources, both structured and unstructured, covering genomics, proteomics, metabolomics, as well as imaging, clinical diagnosis, and long-term continuous physiological sensing of an individual. It is expected that recent advances in big data will expand our knowledge for testing new hypotheses about disease management from diagnosis to prevention to personalized treatment. The rise of big data, however, also raises challenges in terms of privacy, security, data ownership, data stewardship, and governance. This paper discusses some of the existing activities and future opportunities related to big data for health, outlining some of the key underlying issues that need to be tackled.

  12. From Big Data to Big Business

    DEFF Research Database (Denmark)

    Lund Pedersen, Carsten

    2017-01-01

    Idea in Brief: Problem: There is an enormous profit potential for manufacturing firms in big data, but one of the key barriers to obtaining data-driven growth is the lack of knowledge about which capabilities are needed to extract value and profit from data. Solution: We (BDBB research group at C...

  13. Big data, big knowledge: big data for personalized healthcare.

    Science.gov (United States)

    Viceconti, Marco; Hunter, Peter; Hose, Rod

    2015-07-01

    The idea that the purely phenomenological knowledge that we can extract by analyzing large amounts of data can be useful in healthcare seems to contradict the desire of VPH researchers to build detailed mechanistic models for individual patients. But in practice no model is ever entirely phenomenological or entirely mechanistic. We propose in this position paper that big data analytics can be successfully combined with VPH technologies to produce robust and effective in silico medicine solutions. In order to do this, big data technologies must be further developed to cope with some specific requirements that emerge from this application. Such requirements are: working with sensitive data; analytics of complex and heterogeneous data spaces, including nontextual information; distributed data management under security and performance constraints; specialized analytics to integrate bioinformatics and systems biology information with clinical observations at tissue, organ and organism scales; and specialized analytics to define the "physiological envelope" during the daily life of each patient. These domain-specific requirements suggest a need for targeted funding, in which big data technologies for in silico medicine becomes the research priority.

  14. AQUIFER IN AJAOKUTA, SOUTHWESTERN NIGERIA

    African Journals Online (AJOL)

    2005-03-08

    To establish the feasibility of water supply in a basement complex area of Ajaokuta, Southwestern Nigeria, pumping test results were used to investigate the storage properties and groundwater potential of the aquifer. The aquifer system consists of weathered and weathered/fractured zones of decomposed ...

  15. Big Data in industry

    Science.gov (United States)

    Latinović, T. S.; Preradović, D. M.; Barz, C. R.; Latinović, M. T.; Petrica, P. P.; Pop-Vadean, A.

    2016-08-01

    The amount of data at the global level has grown exponentially. Along with this phenomenon comes the need for new units of measure, such as the exabyte, zettabyte, and yottabyte, to express the amount of data. This growth has created a situation in which the classic systems for collecting, storing, processing, and visualizing data are losing the battle against the volume, velocity, and variety of data that is generated continuously. Much of this data is created by the Internet of Things, IoT (cameras, satellites, cars, GPS navigation, etc.). The challenge is to come up with new technologies and tools for managing and exploiting these large amounts of data. Big Data has been a hot topic in IT circles in recent years, and it is increasingly recognized in the business world and in public administration. This paper proposes an ontology of big data analytics and examines how to enhance business intelligence through big data analytics as a service, by presenting a big data analytics service-oriented architecture. The paper also discusses the interrelationship between business intelligence and big data analytics. The proposed approach might facilitate the research and development of business analytics, big data analytics, and business intelligence, as well as intelligent agents.

  16. Big data a primer

    CERN Document Server

    Bhuyan, Prachet; Chenthati, Deepak

    2015-01-01

    This book is a collection of chapters written by experts on various aspects of big data. The book aims to explain what big data is and how it is stored and used. The book starts from  the fundamentals and builds up from there. It is intended to serve as a review of the state-of-the-practice in the field of big data handling. The traditional framework of relational databases can no longer provide appropriate solutions for handling big data and making it available and useful to users scattered around the globe. The study of big data covers a wide range of issues including management of heterogeneous data, big data frameworks, change management, finding patterns in data usage and evolution, data as a service, service-generated data, service management, privacy and security. All of these aspects are touched upon in this book. It also discusses big data applications in different domains. The book will prove useful to students, researchers, and practicing database and networking engineers.

  17. Recht voor big data, big data voor recht

    NARCIS (Netherlands)

    Lafarre, Anne

    Big data is a phenomenon that can no longer be ignored in our society. It is past the hype cycle, and the first implementations of big data techniques are being carried out. But what exactly is big data? What do the five V's, which are often mentioned in relation to big data, stand for? By way of introduction to

  18. Assessing Big Data

    DEFF Research Database (Denmark)

    Leimbach, Timo; Bachlechner, Daniel

    2015-01-01

    In recent years, big data has been one of the most controversially discussed technologies in terms of its possible positive and negative impact. Therefore, the need for technology assessments is obvious. This paper first provides, based on the results of a technology assessment study, an overview of the potential and challenges associated with big data, and then describes the problems experienced during the study as well as methods found helpful to address them. The paper concludes with reflections on how the insights from the technology assessment study may have an impact on the future governance of big data.

  19. Big bang nucleosynthesis

    International Nuclear Information System (INIS)

    Boyd, Richard N.

    2001-01-01

    The precision of measurements in modern cosmology has made huge strides in recent years, with measurements of the cosmic microwave background and the determination of the Hubble constant now rivaling the level of precision of the predictions of big bang nucleosynthesis. However, these results are not necessarily consistent with the predictions of the Standard Model of big bang nucleosynthesis. Reconciling these discrepancies may require extensions of the basic tenets of the model, and possibly of the reaction rates that determine the big bang abundances

  20. Big data for dummies

    CERN Document Server

    Hurwitz, Judith; Halper, Fern; Kaufman, Marcia

    2013-01-01

    Find the right big data solution for your business or organization Big data management is one of the major challenges facing business, industry, and not-for-profit organizations. Data sets such as customer transactions for a mega-retailer, weather patterns monitored by meteorologists, or social network activity can quickly outpace the capacity of traditional data management tools. If you need to develop or manage big data solutions, you'll appreciate how these four experts define, explain, and guide you through this new and often confusing concept. You'll learn what it is, why it m

  1. Big Data, indispensable today

    Directory of Open Access Journals (Sweden)

    Radu-Ioan ENACHE

    2015-10-01

    Big data is, and will increasingly be, used as a tool for everything that happens both online and offline. Online is, of course, its natural habitat, and there Big Data offers many advantages and real help to consumers. In this paper we discuss Big Data as an asset in developing new applications, by gathering useful information about users and their behaviour. We also present the key aspects of real-time monitoring and the architectural principles of this technology. The most important contribution of this paper is presented in the cloud section.

  2. Big Data in der Cloud

    DEFF Research Database (Denmark)

    Leimbach, Timo; Bachlechner, Daniel

    2014-01-01

    Technology assessment of big data, in particular cloud-based big data services, for the Office for Technology Assessment at the German federal parliament (Bundestag).

  3. Small Big Data Congress 2017

    NARCIS (Netherlands)

    Doorn, J.

    2017-01-01

    TNO, in collaboration with the Big Data Value Center, presents the fourth Small Big Data Congress! Our congress aims at providing an overview of practical and innovative applications based on big data. Do you want to know what is happening in applied research with big data? And what can already be

  4. Cryptography for Big Data Security

    Science.gov (United States)

    2015-07-13

    Book chapter for Big Data: Storage, Sharing, and Security (3S). Distribution A: Public Release. [Only front matter and table-of-contents fragments survive here; the introduction of Chapter 1, "Cryptography for Big Data Security", begins: "With the amount"]

  5. Big data opportunities and challenges

    CERN Document Server

    2014-01-01

    This ebook aims to give practical guidance for all those who want to understand big data better and learn how to make the most of it. Topics range from big data analysis, mobile big data and managing unstructured data to technologies, governance and intellectual property and security issues surrounding big data.

  6. Big Data as Governmentality

    DEFF Research Database (Denmark)

    Flyverbom, Mikkel; Madsen, Anders Koed; Rasche, Andreas

    This paper conceptualizes how large-scale data and algorithms condition and reshape knowledge production when addressing international development challenges. The concept of governmentality and four dimensions of an analytics of government are proposed as a theoretical framework to examine how big data is constituted as an aspiration to improve the data and knowledge underpinning development efforts. Based on this framework, we argue that big data’s impact on how relevant problems are governed is enabled by (1) new techniques of visualizing development issues, (2) linking aspects of international development agendas to algorithms that synthesize large-scale data, (3) novel ways of rationalizing knowledge claims that underlie development efforts, and (4) shifts in professional and organizational identities of those concerned with producing and processing data for development. Our discussion shows that big data problematizes selected aspects of traditional ways to collect and analyze data for development (e.g. via household surveys). We also demonstrate that using big data analyses to address development challenges raises a number of questions that can deteriorate its impact.

  7. Big Data Revisited

    DEFF Research Database (Denmark)

    Kallinikos, Jannis; Constantiou, Ioanna

    2015-01-01

    We elaborate on key issues of our paper New games, new rules: big data and the changing context of strategy as a means of addressing some of the concerns raised by the paper’s commentators. We initially deal with the issue of social data and the role it plays in the current data revolution and the technological recording of facts. We further discuss the significance of the very mechanisms by which big data is produced, as distinct from the attributes of big data often discussed in the literature. In the final section of the paper, we qualify the alleged importance of algorithms and claim that the structures of data capture and the architectures in which data generation is embedded are fundamental to the phenomenon of big data.

  8. The Big Bang Singularity

    Science.gov (United States)

    Ling, Eric

    The big bang theory is a model of the universe which makes the striking prediction that the universe began a finite amount of time in the past at the so called "Big Bang singularity." We explore the physical and mathematical justification of this surprising result. After laying down the framework of the universe as a spacetime manifold, we combine physical observations with global symmetrical assumptions to deduce the FRW cosmological models which predict a big bang singularity. Next we prove a couple theorems due to Stephen Hawking which show that the big bang singularity exists even if one removes the global symmetrical assumptions. Lastly, we investigate the conditions one needs to impose on a spacetime if one wishes to avoid a singularity. The ideas and concepts used here to study spacetimes are similar to those used to study Riemannian manifolds, therefore we compare and contrast the two geometries throughout.

  9. BigDansing

    KAUST Repository

    Khayyat, Zuhair; Ilyas, Ihab F.; Jindal, Alekh; Madden, Samuel; Ouzzani, Mourad; Papotti, Paolo; Quiané -Ruiz, Jorge-Arnulfo; Tang, Nan; Yin, Si

    2015-01-01

    A user-friendly programming interface allows users to express data quality rules both declaratively and procedurally, with no requirement of being aware of the underlying distributed platform. BigDansing translates these rules into a series of transformations that enable distributed computations and several optimizations, such as shared scans and specialized join operators. Experimental results on both synthetic and real datasets show that BigDansing outperforms existing baseline systems.

  10. Boarding to Big data

    Directory of Open Access Journals (Sweden)

    Oana Claudia BRATOSIN

    2016-05-01

    Today Big Data is an emerging topic: the quantity of information grows exponentially, laying the foundation for its main challenge, the value of the information. That value is defined not only by extracting value from huge data sets as quickly and optimally as possible, but also by extracting value from uncertain and inaccurate data in innovative ways using Big Data analytics. At this point, the main challenge for businesses that use Big Data tools is to clearly define the scope and the required output of the business so that real value can be gained. This article aims to explain the Big Data concept, its various classification criteria, and its architecture, as well as its impact on processes worldwide.

  11. Big Creek Pit Tags

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The BCPITTAGS database is used to store data from an Oncorhynchus mykiss (steelhead/rainbow trout) population dynamics study in Big Creek, a coastal stream along the...

  12. Scaling Big Data Cleansing

    KAUST Repository

    Khayyat, Zuhair

    2017-01-01

    on top of general-purpose distributed platforms. Its programming interface allows users to express data quality rules independently from the requirements of parallel and distributed environments. Without sacrificing their quality, BigDansing also enables parallel execution of serial repair algorithms by exploiting the graph representation of discovered errors.

  13. Reframing Open Big Data

    DEFF Research Database (Denmark)

    Marton, Attila; Avital, Michel; Jensen, Tina Blegind

    2013-01-01

    Recent developments in the techniques and technologies of collecting, sharing and analysing data are challenging the field of information systems (IS) research, let alone the boundaries of organizations and the established practices of decision-making. Coined ‘open data’ and ‘big data’, these developments introduce an unprecedented level of societal and organizational engagement with the potential of computational data to generate new insights and information. Based on the commonalities shared by open data and big data, we develop a research framework that we refer to as open big data (OBD) by employing the dimensions of ‘order’ and ‘relationality’. We argue that these dimensions offer a viable approach for IS research on open and big data because they address one of the core value propositions of IS; i.e. how to support organizing with computational data. We contrast these dimensions with two

  14. Conociendo Big Data

    Directory of Open Access Journals (Sweden)

    Juan José Camargo-Vega

    2014-12-01

    Given the importance the term Big Data has acquired, this research set out to study and analyze the state of the art of Big Data exhaustively; as a second objective, it analyzed the characteristics, tools, technologies, models and standards related to Big Data; and finally it sought to identify the most relevant characteristics in the management of Big Data, so that everything concerning the central topic of the research can be understood. The methodology included reviewing the state of the art of Big Data and presenting its current situation; surveying Big Data technologies; presenting some of the NoSQL databases, which are the ones that make it possible to process data in unstructured formats; and showing data models and the technologies for analyzing them, ending with some benefits of Big Data. The methodological design used for the research was non-experimental, since no variables are manipulated, and exploratory, because this research is a first step toward understanding the Big Data environment.

  15. Big Bang baryosynthesis

    International Nuclear Information System (INIS)

    Turner, M.S.; Chicago Univ., IL

    1983-01-01

    In these lectures I briefly review Big Bang baryosynthesis. In the first lecture I discuss the evidence which exists for the BAU, the failure of non-GUT symmetrical cosmologies, the qualitative picture of baryosynthesis, and numerical results of detailed baryosynthesis calculations. In the second lecture I discuss the requisite CP violation in some detail, as well as the statistical mechanics of baryosynthesis, possible complications to the simplest scenario, and one cosmological implication of Big Bang baryosynthesis. (orig./HSI)

  16. Minsky on "Big Government"

    Directory of Open Access Journals (Sweden)

    Daniel de Santana Vasconcelos

    2014-03-01

    The objective of this paper is to assess, in light of Minsky's main works, his view and analysis of what he called "Big Government": the huge institution which, in parallel with the "Big Bank", was capable of ensuring stability in the capitalist system and regulating its inherently unstable financial system in the mid-20th century. In this work, we analyze how Minsky proposes an active role for the government in a complex economic system flawed by financial instability.

  17. Big data need big theory too.

    Science.gov (United States)

    Coveney, Peter V; Dougherty, Edward R; Highfield, Roger R

    2016-11-13

    The current interest in big data, machine learning and data analytics has generated the widespread impression that such methods are capable of solving most problems without the need for conventional scientific methods of inquiry. Interest in these methods is intensifying, accelerated by the ease with which digitized data can be acquired in virtually all fields of endeavour, from science, healthcare and cybersecurity to economics, social sciences and the humanities. In multiscale modelling, machine learning appears to provide a shortcut to reveal correlations of arbitrary complexity between processes at the atomic, molecular, meso- and macroscales. Here, we point out the weaknesses of pure big data approaches with particular focus on biology and medicine, which fail to provide conceptual accounts for the processes to which they are applied. No matter their 'depth' and the sophistication of data-driven methods, such as artificial neural nets, in the end they merely fit curves to existing data. Not only do these methods invariably require far larger quantities of data than anticipated by big data aficionados in order to produce statistically reliable results, but they can also fail in circumstances beyond the range of the data used to train them because they are not designed to model the structural characteristics of the underlying system. We argue that it is vital to use theory as a guide to experimental design for maximal efficiency of data collection and to produce reliable predictive models and conceptual knowledge. Rather than continuing to fund, pursue and promote 'blind' big data projects with massive budgets, we call for more funding to be allocated to the elucidation of the multiscale and stochastic processes controlling the behaviour of complex systems, including those of life, medicine and healthcare.This article is part of the themed issue 'Multiscale modelling at the physics-chemistry-biology interface'. © 2015 The Authors.
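
    The failure mode described here - a fitted model breaking down outside the range of its training data - is easy to demonstrate. Below is a minimal sketch (the synthetic data and choice of model are ours, not the authors'):

    ```python
    # Minimal sketch: a flexible curve fitted on a limited range can match
    # the training data well yet fail badly when extrapolated, since the
    # fit does not capture the structure of the underlying system.
    import numpy as np

    rng = np.random.default_rng(0)
    x_train = np.linspace(0, 3, 40)                  # observed range only
    y_train = np.exp(-x_train) + 0.01 * rng.standard_normal(40)

    coeffs = np.polyfit(x_train, y_train, deg=6)     # "black box" curve fit

    for x in (2.5, 4.0, 6.0):                        # inside vs. outside range
        print(f"x={x:.1f}  true={np.exp(-x):.4f}  "
              f"fitted={np.polyval(coeffs, x):.4f}")
    # Inside the training range the fit is accurate; beyond x=3 the
    # polynomial diverges from the true curve.
    ```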

  18. Big Data and medicine: a big deal?

    Science.gov (United States)

    Mayer-Schönberger, V; Ingelsson, E

    2018-05-01

    Big Data promises huge benefits for medical research. Looking beyond superficial increases in the amount of data collected, we identify three key areas where Big Data differs from conventional analyses of data samples: (i) data are captured more comprehensively relative to the phenomenon under study; this reduces some bias but surfaces important trade-offs, such as between data quantity and data quality; (ii) data are often analysed using machine learning tools, such as neural networks rather than conventional statistical methods resulting in systems that over time capture insights implicit in data, but remain black boxes, rarely revealing causal connections; and (iii) the purpose of the analyses of data is no longer simply answering existing questions, but hinting at novel ones and generating promising new hypotheses. As a consequence, when performed right, Big Data analyses can accelerate research. Because Big Data approaches differ so fundamentally from small data ones, research structures, processes and mindsets need to adjust. The latent value of data is being reaped through repeated reuse of data, which runs counter to existing practices not only regarding data privacy, but data management more generally. Consequently, we suggest a number of adjustments such as boards reviewing responsible data use, and incentives to facilitate comprehensive data sharing. As data's role changes to a resource of insight, we also need to acknowledge the importance of collecting and making data available as a crucial part of our research endeavours, and reassess our formal processes from career advancement to treatment approval. © 2017 The Association for the Publication of the Journal of Internal Medicine.

  19. Big data uncertainties.

    Science.gov (United States)

    Maugis, Pierre-André G

    2018-07-01

    Big data-the idea that an always-larger volume of information is being constantly recorded-suggests that new problems can now be subjected to scientific scrutiny. However, can classical statistical methods be used directly on big data? We analyze the problem by looking at two known pitfalls of big datasets. First, that they are biased, in the sense that they do not offer a complete view of the populations under consideration. Second, that they present a weak but pervasive level of dependence between all their components. In both cases we observe that the uncertainty of the conclusion obtained by statistical methods is increased when used on big data, either because of a systematic error (bias), or because of a larger degree of randomness (increased variance). We argue that the key challenge raised by big data is not only how to use big data to tackle new problems, but to develop tools and methods able to rigorously articulate the new risks therein. Copyright © 2016. Published by Elsevier Ltd.
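
    The second pitfall - weak but pervasive dependence - can be made concrete: for n equicorrelated observations with pairwise correlation rho, the variance of the sample mean is (sigma^2/n)(1 + (n-1)rho), which tends to sigma^2*rho rather than zero as n grows. A small simulation (all parameter values illustrative):

    ```python
    # Sketch: weak but pervasive dependence keeps the sample mean noisy no
    # matter how large the sample, unlike the independent case.
    import numpy as np

    rng = np.random.default_rng(1)
    n, rho, reps = 10_000, 0.01, 2_000

    # Independent samples: Var(mean) = 1/n = 1e-4.
    ind_means = rng.standard_normal((reps, n)).mean(axis=1)

    # Equicorrelated samples via a shared factor:
    # x_i = sqrt(rho)*z + sqrt(1-rho)*e_i has pairwise correlation rho.
    z = rng.standard_normal((reps, 1))
    e = rng.standard_normal((reps, n))
    dep_means = (np.sqrt(rho) * z + np.sqrt(1 - rho) * e).mean(axis=1)

    print("independent:", ind_means.var())  # ~1e-4
    print("correlated :", dep_means.var())  # ~rho = 1e-2, hundredfold larger
    ```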

  20. Toward developing more realistic groundwater models using big data

    Science.gov (United States)

    Vahdat Aboueshagh, H.; Tsai, F. T. C.; Bhatta, D.; Paudel, K.

    2017-12-01

    Rich geological data is the backbone of developing realistic groundwater models for groundwater resources management. However, constructing realistic groundwater models can be challenging due to inconsistency between different sources of geological, hydrogeological and geophysical data, and the difficulty of processing big data to characterize the subsurface environment. This study develops a framework to utilize a big geological dataset to create a groundwater model for the Chicot Aquifer in southwestern Louisiana, which borders the Gulf of Mexico to the south. The Chicot Aquifer is the principal source of fresh water in southwest Louisiana, underlying an area of about 9,000 square miles. Agriculture is the largest groundwater consumer in this region, and overpumping has caused significant groundwater head decline and saltwater intrusion from the Gulf and from deep formations. A hydrostratigraphy model was constructed using around 29,000 electrical logs and drillers' logs, as well as screen lengths of pumping wells, through a natural neighbor interpolation method. These sources of information carry different weights in terms of accuracy and trustworthiness. A data prioritization procedure was developed to filter untrustworthy log information, eliminate redundant data, and establish consensus among the various lithological information. The constructed hydrostratigraphy model shows 40% sand facies, which is consistent with the well log data. The hydrostratigraphy model confirms outcrop areas of the Chicot Aquifer in the north of the study region. The aquifer sand formation thins eastward to merge into the Atchafalaya River alluvial aquifer and coalesces with the underlying Evangeline aquifer. A grid generator was used to convert the hydrostratigraphy model into a MODFLOW grid with 57 layers. A Chicot groundwater model was constructed using the available hydrologic and hydrogeological data for 2004-2015. Pumping rates for irrigation wells were estimated using the crop type and acreage
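
    The gridding step described above - interpolating scattered lithology picks onto a model grid - can be sketched in a few lines. The study used natural neighbor interpolation; the sketch below substitutes SciPy's linear interpolator as a readily available stand-in, and every coordinate and sand-fraction value is invented:

    ```python
    # Sketch of gridding scattered borehole-log picks onto one model layer.
    # Natural neighbor interpolation (used in the study) is replaced here
    # by scipy's 'linear' method as a stand-in; all inputs are synthetic.
    import numpy as np
    from scipy.interpolate import griddata

    rng = np.random.default_rng(42)
    n_logs = 500                              # stand-in for ~29,000 logs
    x = rng.uniform(0, 150_000, n_logs)       # easting, m (hypothetical)
    y = rng.uniform(0, 150_000, n_logs)       # northing, m (hypothetical)
    sand = rng.beta(2, 3, n_logs)             # sand fraction per log pick

    gx, gy = np.meshgrid(np.linspace(0, 150_000, 100),
                         np.linspace(0, 150_000, 100))
    sand_grid = griddata((x, y), sand, (gx, gy), method="linear")

    print("mean gridded sand fraction:", np.nanmean(sand_grid))
    # A prioritization step weighting electrical logs against drillers'
    # logs would filter the picks before this interpolation.
    ```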

  1. aquifer in ajaokuta, southwestern nigeria

    African Journals Online (AJOL)

    2005-03-08

    The (1969) straight-line method (observation well) of drawdown analysis in an unconfined aquifer (B=1) yield ... April) and a short wet season (May-September). ... [Borehole-log fragment: decomposed granitic rock with quartz veins.]
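
    The straight-line drawdown analysis referenced above is conventionally a Cooper-Jacob-style analysis (an assumption here, since the citation is truncated): transmissivity follows from the drawdown change per log cycle of time, T = 2.303*Q/(4*pi*ds), and storativity from the zero-drawdown intercept, S = 2.25*T*t0/r^2. A minimal sketch with invented observation-well data:

    ```python
    # Sketch of a straight-line (Cooper-Jacob style) drawdown analysis for
    # an observation well; Q, r, and the time-drawdown record are invented.
    import numpy as np

    Q = 1.2e-3   # pumping rate, m^3/s (hypothetical)
    r = 30.0     # distance to observation well, m (hypothetical)
    t = np.array([60, 120, 300, 600, 1800, 3600, 7200])        # time, s
    s = np.array([0.10, 0.16, 0.24, 0.30, 0.40, 0.46, 0.52])   # drawdown, m

    # Late-time drawdown plots as a straight line against log10(t).
    slope, intercept = np.polyfit(np.log10(t), s, 1)

    T = 2.303 * Q / (4 * np.pi * slope)   # transmissivity, m^2/s
    t0 = 10 ** (-intercept / slope)       # zero-drawdown intercept time, s
    S = 2.25 * T * t0 / r**2              # storativity (dimensionless)

    print(f"T = {T:.2e} m^2/s, S = {S:.2e}")
    ```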

  2. BigDansing

    KAUST Repository

    Khayyat, Zuhair

    2015-06-02

    Data cleansing approaches have usually focused on detecting and fixing errors with little attention to scaling to big datasets. This presents a serious impediment since data cleansing often involves costly computations such as enumerating pairs of tuples, handling inequality joins, and dealing with user-defined functions. In this paper, we present BigDansing, a Big Data Cleansing system to tackle efficiency, scalability, and ease-of-use issues in data cleansing. The system can run on top of most common general-purpose data processing platforms, ranging from DBMSs to MapReduce-like frameworks. A user-friendly programming interface allows users to express data quality rules both declaratively and procedurally, with no requirement of being aware of the underlying distributed platform. BigDansing translates these rules into a series of transformations that enable distributed computations and several optimizations, such as shared scans and specialized join operators. Experimental results on both synthetic and real datasets show that BigDansing outperforms existing baseline systems by up to more than two orders of magnitude without sacrificing the quality provided by the repair algorithms.
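
    As a concrete illustration of the kind of declarative rule such a system targets (plain pandas here, not BigDansing's actual interface), consider the functional dependency "zipcode determines city" and its violation-detection step:

    ```python
    # Illustration of declarative rule checking (not BigDansing's API):
    # find violations of the functional dependency zipcode -> city.
    import pandas as pd

    df = pd.DataFrame({
        "name":    ["ann", "bob", "cat", "dan"],
        "zipcode": ["60614", "60614", "10001", "10001"],
        "city":    ["Chicago", "Evanston", "New York", "New York"],
    })

    # A zipcode violates the rule if it maps to more than one city.
    cities_per_zip = df.groupby("zipcode")["city"].nunique()
    violations = df[df["zipcode"].isin(
        cities_per_zip[cities_per_zip > 1].index)]
    print(violations)
    # A system like BigDansing compiles such rules into distributed scans
    # and joins, then hands violating tuples to a repair algorithm.
    ```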

  3. EPA Region 1 Sole Source Aquifers

    Science.gov (United States)

    This coverage contains boundaries of EPA-approved sole source aquifers. Sole source aquifers are defined as an aquifer designated as the sole or principal source of drinking water for a given aquifer service area; that is, an aquifer which is needed to supply 50% or more of the drinking water for the area and for which there are no reasonable alternative sources should the aquifer become contaminated. The aquifers were defined by an EPA hydrogeologist. Aquifer boundaries were then drafted by EPA onto 1:24000 USGS quadrangles. For the coastal sole source aquifers, the shoreline as it appeared on the quadrangle was used as a boundary. Delineated boundaries were then digitized into ARC/INFO.

  4. Big data challenges

    DEFF Research Database (Denmark)

    Bachlechner, Daniel; Leimbach, Timo

    2016-01-01

    Although reports on big data success stories have been accumulating in the media, most organizations dealing with high-volume, high-velocity and high-variety information assets still face challenges. Only a thorough understanding of these challenges puts organizations into a position in which they can make an informed decision for or against big data and, if the decision is positive, overcome the challenges smoothly. The combination of a series of interviews with leading experts from enterprises, associations and research institutions, and focused literature reviews, allowed not only ... framework are also relevant. For large enterprises and startups specialized in big data, it is typically easier to overcome the challenges than it is for other enterprises and public administration bodies.

  5. Thick-Big Descriptions

    DEFF Research Database (Denmark)

    Lai, Signe Sophus

    The paper discusses the rewards and challenges of employing commercial audience measurement data – gathered by media industries for profitmaking purposes – in ethnographic research on the Internet in everyday life. It questions claims to the objectivity of big data (Anderson 2008), the assumption that ... communication systems, language and behavior appear as texts, outputs, and discourses (data to be ‘found’) – big data then documents things that in earlier research required interviews and observations (data to be ‘made’) (Jensen 2014). However, web-measurement enterprises build audiences according to a commercial logic (boyd & Crawford 2011) and are as such directed by motives that call for specific types of sellable user data and specific segmentation strategies. In combining big data and ‘thick descriptions’ (Geertz 1973), scholars need to question how ethnographic fieldwork might map the ‘data not seen

  6. Big data in biomedicine.

    Science.gov (United States)

    Costa, Fabricio F

    2014-04-01

    The increasing availability and growth rate of biomedical information, also known as 'big data', provides an opportunity for future personalized medicine programs that will significantly improve patient care. Recent advances in information technology (IT) applied to biomedicine are changing the landscape of privacy and personal information, with patients getting more control of their health information. Conceivably, big data analytics is already impacting health decisions and patient care; however, specific challenges need to be addressed to integrate current discoveries into medical practice. In this article, I will discuss the major breakthroughs achieved in combining omics and clinical health data in terms of their application to personalized medicine. I will also review the challenges associated with using big data in biomedicine and translational science. Copyright © 2013 Elsevier Ltd. All rights reserved.

  7. Big data bioinformatics.

    Science.gov (United States)

    Greene, Casey S; Tan, Jie; Ung, Matthew; Moore, Jason H; Cheng, Chao

    2014-12-01

    Recent technological advances allow for high throughput profiling of biological systems in a cost-efficient manner. The low cost of data generation is leading us to the "big data" era. The availability of big data provides unprecedented opportunities but also raises new challenges for data mining and analysis. In this review, we introduce key concepts in the analysis of big data, including both "machine learning" algorithms as well as "unsupervised" and "supervised" examples of each. We note packages for the R programming language that are available to perform machine learning analyses. In addition to programming based solutions, we review webservers that allow users with limited or no programming background to perform these analyses on large data compendia. © 2014 Wiley Periodicals, Inc.
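
    To make the supervised/unsupervised distinction concrete, here is a toy contrast in Python (the review itself points to R packages; the dataset and models below are our choice):

    ```python
    # Toy contrast of unsupervised vs. supervised learning on one dataset.
    from sklearn.datasets import load_iris
    from sklearn.cluster import KMeans
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    X, y = load_iris(return_X_y=True)

    # Unsupervised: structure is inferred without labels.
    clusters = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)

    # Supervised: labeled examples train a predictive model.
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)

    print("clusters found:", sorted(set(clusters)))
    print("test accuracy :", clf.score(X_te, y_te))
    ```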

  8. Estimating Groundwater Mounding in Sloping Aquifers for Managed Aquifer Recharge.

    Science.gov (United States)

    Zlotnik, Vitaly A; Kacimov, Anvar; Al-Maktoumi, Ali

    2017-11-01

    Design of managed aquifer recharge (MAR) for augmentation of groundwater resources often lacks detailed data, and simple diagnostic tools for evaluation of the water table in a broad range of parameters are needed. In many large-scale MAR projects, the effect of a regional aquifer base dip cannot be ignored due to the scale of recharge sources (e.g., wadis, streams, reservoirs). However, Hantush's (1967) solution for a horizontal aquifer base is commonly used. To address sloping aquifers, a new closed-form analytical solution for the water table mound accounts for the geometry and orientation of recharge sources at the land surface with respect to the aquifer base dip. The solution, based on the Dupuit-Forchheimer approximation, Green's function method, and coordinate transformations, is convenient for computing. This solution reveals important MAR traits in variance with Hantush's solution: mounding is limited in time and space; elevation of the mound is strongly affected by the dip angle; and the peak of the mound moves over time. These findings have important practical implications for assessment of various MAR scenarios, including waterlogging potential and determining proper rates of recharge. Computations are illustrated for several characteristic MAR settings. © 2017, National Ground Water Association.
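
    For reference, the horizontal-base Hantush (1967) solution that this paper generalizes can be written through the function S*(a, b) = integral from 0 to 1 of erf(a/sqrt(tau))*erf(b/sqrt(tau)) dtau: the linearized rise beneath a rectangular basin of half-length l and half-width a, recharged at rate w, is dh = (w*t/(4*Sy)) times the sum of S* over the four corner arguments (l±x)/sqrt(4*nu*t) and (a±y)/sqrt(4*nu*t), with diffusivity nu = K*b/Sy. Below is a sketch of our transcription of that classical formula (verify against the original before any design use; all parameter values invented):

    ```python
    # Sketch of Hantush's (1967) horizontal-base mounding solution (our
    # transcription of the classical linearized formula; values invented).
    import numpy as np
    from scipy.integrate import quad
    from scipy.special import erf

    def s_star(alpha, beta):
        # Hantush's S*(a,b) = int_0^1 erf(a/sqrt(tau)) erf(b/sqrt(tau)) dtau
        val, _ = quad(lambda tau: erf(alpha / np.sqrt(tau))
                      * erf(beta / np.sqrt(tau)), 0.0, 1.0)
        return val

    def mound_rise(x, y, t, w, Sy, K, b_bar, l, a):
        """Linearized water-table rise under a 2l-by-2a recharge basin."""
        nu = K * b_bar / Sy                   # aquifer diffusivity, m^2/s
        d = np.sqrt(4.0 * nu * t)
        total = sum(s_star(p / d, q / d)
                    for p in (l + x, l - x)
                    for q in (a + y, a - y))
        return w * t / (4.0 * Sy) * total

    # Rise at the basin centre after 30 days (hypothetical SI values):
    dh = mound_rise(x=0.0, y=0.0, t=30 * 86400, w=0.05 / 86400,
                    Sy=0.15, K=1e-4, b_bar=20.0, l=50.0, a=25.0)
    print(f"mound rise at centre = {dh:.2f} m")
    # Sanity check: for a very large basin the sum tends to 4 and the
    # rise to w*t/Sy, the simple mass-balance limit.
    ```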

  9. Characterising aquifer treatment for pathogens in managed aquifer recharge.

    Science.gov (United States)

    Page, D; Dillon, P; Toze, S; Sidhu, J P S

    2010-01-01

    In this study the value of subsurface treatment of urban stormwater during Aquifer Storage Transfer Recovery (ASTR) is characterised using quantitative microbial risk assessment (QMRA) methodology. The ASTR project utilizes a multi-barrier treatment train to treat urban stormwater, but to date the role of the aquifer has not been quantified. In this study it was estimated that the aquifer barrier provided 1.4, 2.6 and >6.0 log(10) removals for rotavirus, Cryptosporidium and Campylobacter respectively, based on pathogen diffusion chamber results. The aquifer treatment barrier was found to vary in importance vis-à-vis the pre-treatment via a constructed wetland and the potential post-treatment options of UV-disinfection and chlorination for the reference pathogens. The risk assessment demonstrated that the human health risk associated with potable reuse of stormwater can be mitigated (to below the benchmark expressed in disability adjusted life years, DALYs) if the aquifer is integrated with suitable post-treatment options into a treatment train to attenuate pathogens and protect human health.
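
    Log-removal credits along a multi-barrier treatment train add, and the surviving fraction of pathogens is 10 raised to the negative total. A sketch using the aquifer credits reported above; the wetland and UV credits are illustrative placeholders, not values from the study:

    ```python
    # Combining log10 removal credits across a treatment train. Aquifer
    # values follow the abstract; wetland and UV values are placeholders.
    aquifer = {"rotavirus": 1.4, "Cryptosporidium": 2.6, "Campylobacter": 6.0}
    wetland = {"rotavirus": 0.5, "Cryptosporidium": 1.0, "Campylobacter": 1.0}
    uv      = {"rotavirus": 3.0, "Cryptosporidium": 3.0, "Campylobacter": 3.0}

    for pathogen in aquifer:
        total = aquifer[pathogen] + wetland[pathogen] + uv[pathogen]
        remaining = 10 ** -total    # fraction surviving the whole train
        print(f"{pathogen:15s} {total:4.1f} log10 removal "
              f"-> {remaining:.1e} of source concentration remains")
    ```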

  10. Big Java late objects

    CERN Document Server

    Horstmann, Cay S

    2012-01-01

    Big Java: Late Objects is a comprehensive introduction to Java and computer programming, which focuses on the principles of programming, software engineering, and effective learning. It is designed for a two-semester first course in programming for computer science students.

  11. Big ideas: innovation policy

    OpenAIRE

    John Van Reenen

    2011-01-01

    In the last CentrePiece, John Van Reenen stressed the importance of competition and labour market flexibility for productivity growth. His latest in CEP's 'big ideas' series describes the impact of research on how policy-makers can influence innovation more directly - through tax credits for business spending on research and development.

  12. Big Data ethics

    NARCIS (Netherlands)

    Zwitter, Andrej

    2014-01-01

    The speed of development in Big Data and associated phenomena, such as social media, has surpassed the capacity of the average consumer to understand his or her actions and their knock-on effects. We are moving towards changes in how ethics has to be perceived: away from individual decisions with

  13. Big data in history

    CERN Document Server

    Manning, Patrick

    2013-01-01

    Big Data in History introduces the project to create a world-historical archive, tracing the last four centuries of historical dynamics and change. Chapters address the archive's overall plan, how to interpret the past through a global archive, the missions of gathering records, linking local data into global patterns, and exploring the results.

  14. The Big Sky inside

    Science.gov (United States)

    Adams, Earle; Ward, Tony J.; Vanek, Diana; Marra, Nancy; Hester, Carolyn; Knuth, Randy; Spangler, Todd; Jones, David; Henthorn, Melissa; Hammill, Brock; Smith, Paul; Salisbury, Rob; Reckin, Gene; Boulafentis, Johna

    2009-01-01

    The University of Montana (UM)-Missoula has implemented a problem-based program in which students perform scientific research focused on indoor air pollution. The Air Toxics Under the Big Sky program (Jones et al. 2007; Adams et al. 2008; Ward et al. 2008) provides a community-based framework for understanding the complex relationship between poor…

  15. Moving Another Big Desk.

    Science.gov (United States)

    Fawcett, Gay

    1996-01-01

    New ways of thinking about leadership require that leaders move their big desks and establish environments that encourage trust and open communication. Educational leaders must trust their colleagues to make wise choices. When teachers are treated democratically as leaders, classrooms will also become democratic learning organizations. (SM)

  16. A Big Bang Lab

    Science.gov (United States)

    Scheider, Walter

    2005-01-01

    The February 2005 issue of The Science Teacher (TST) reminded everyone that by learning how scientists study stars, students gain an understanding of how science measures things that can not be set up in lab, either because they are too big, too far away, or happened in a very distant past. The authors of "How Far are the Stars?" show how the…

  17. New 'bigs' in cosmology

    International Nuclear Information System (INIS)

    Yurov, Artyom V.; Martin-Moruno, Prado; Gonzalez-Diaz, Pedro F.

    2006-01-01

    This paper contains a detailed discussion of new cosmic solutions describing the early and late evolution of a universe that is filled with a kind of dark energy that may or may not satisfy the energy conditions. The main distinctive property of the resulting space-times is that they make the single singular events predicted by the corresponding quintessential (phantom) models appear twice, in a manner which can be made symmetric with respect to the origin of cosmic time. Thus, the big bang and big rip singularities are shown to take place twice, once on the positive branch of time and once on the negative one. We have also considered dark energy and phantom energy accretion onto black holes and wormholes in the context of these new cosmic solutions. It is seen that the space-times of these holes would then undergo swelling processes leading to big trip and big hole events taking place at distinct epochs along the evolution of the universe. In this way, the possibility is considered that the past and future be connected in a non-paradoxical manner in the universes described by means of the new symmetric solutions

  18. The Big Bang

    CERN Multimedia

    Moods, Patrick

    2006-01-01

    How did the Universe begin? The favoured theory is that everything - space, time, matter - came into existence at the same moment, around 13.7 thousand million years ago. This event was scornfully referred to as the "Big Bang" by Sir Fred Hoyle, who did not believe in it and maintained that the Universe had always existed.

  19. Big Data Analytics

    Indian Academy of Sciences (India)

    The volume and variety of data being generated using computers is doubling every two years. It is estimated that in 2015, 8 zettabytes (zetta = 10^21) were generated, which consisted mostly of unstructured data such as emails, blogs, Twitter, Facebook posts, images, and videos. This is called big data. It is possible to analyse ...

  20. Identifying Dwarfs Workloads in Big Data Analytics

    OpenAIRE

    Gao, Wanling; Luo, Chunjie; Zhan, Jianfeng; Ye, Hainan; He, Xiwen; Wang, Lei; Zhu, Yuqing; Tian, Xinhui

    2015-01-01

    Big data benchmarking is particularly important and provides applicable yardsticks for evaluating booming big data systems. However, wide coverage and great complexity of big data computing impose big challenges on big data benchmarking. How can we construct a benchmark suite using a minimum set of units of computation to represent diversity of big data analytics workloads? Big data dwarfs are abstractions of extracting frequently appearing operations in big data computing. One dwarf represen...

  1. Big Data and Chemical Education

    Science.gov (United States)

    Pence, Harry E.; Williams, Antony J.

    2016-01-01

    The amount of computerized information that organizations collect and process is growing so large that the term Big Data is commonly being used to describe the situation. Accordingly, Big Data is defined by a combination of the Volume, Variety, Velocity, and Veracity of the data being processed. Big Data tools are already having an impact in…

  2. Scaling Big Data Cleansing

    KAUST Repository

    Khayyat, Zuhair

    2017-07-31

    Data cleansing approaches have usually focused on detecting and fixing errors with little attention to big data scaling. This presents a serious impediment since identifying and repairing dirty data often involves processing huge input datasets, handling sophisticated error discovery approaches and managing huge arbitrary errors. With large datasets, error detection becomes overly expensive and complicated, especially when considering user-defined functions. Furthermore, a distinctive algorithm is desired to optimize inequality joins in sophisticated error discovery rather than naïvely parallelizing them. Also, when repairing large errors, their skewed distribution may obstruct effective error repairs. In this dissertation, I present solutions to overcome the above three problems in scaling data cleansing. First, I present BigDansing as a general system to tackle efficiency, scalability, and ease-of-use issues in data cleansing for Big Data. It automatically parallelizes the user’s code on top of general-purpose distributed platforms. Its programming interface allows users to express data quality rules independently from the requirements of parallel and distributed environments. Without sacrificing their quality, BigDansing also enables parallel execution of serial repair algorithms by exploiting the graph representation of discovered errors. The experimental results show that BigDansing outperforms existing baselines by up to more than two orders of magnitude. Although BigDansing scales cleansing jobs, it still lacks the ability to handle sophisticated error discovery requiring inequality joins. Therefore, I developed IEJoin, an algorithm for fast inequality joins. It is based on sorted arrays and space-efficient bit-arrays to reduce the problem’s search space. By comparing IEJoin against well-known optimizations, I show that it is more scalable and several orders of magnitude faster. BigDansing depends on vertex-centric graph systems, i.e., Pregel
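
    The sorted-array-plus-bit-array idea behind fast inequality joins can be sketched for one common case - pairs satisfying two opposing inequalities. This is a simplified illustration of the principle, not the dissertation's exact IEJoin algorithm:

    ```python
    # Simplified illustration of the sorted-order + bit-array idea behind
    # fast inequality joins (not the exact IEJoin algorithm): find all
    # pairs (i, j) with rows[i].a < rows[j].a and rows[i].b > rows[j].b.

    def inequality_self_join(rows):
        n = len(rows)
        # Rank every tuple by its b value; the "bit array" is rank-indexed.
        by_b = sorted(range(n), key=lambda i: rows[i][1])
        b_rank = {i: r for r, i in enumerate(by_b)}

        seen = [None] * n   # seen[r] = index of the tuple with b-rank r,
        results = []        # once its 'a' is strictly below the current group
        order = sorted(range(n), key=lambda i: rows[i][0])
        k = 0
        while k < n:
            g = k           # group equal 'a' values so a_i < a_j stays strict
            while g < n and rows[order[g]][0] == rows[order[k]][0]:
                g += 1
            for j in order[k:g]:
                # Every seen tuple with a strictly larger b joins with j.
                for rnk in range(b_rank[j] + 1, n):
                    i = seen[rnk]
                    if i is not None and rows[i][1] > rows[j][1]:
                        results.append((i, j))
            for i in order[k:g]:
                seen[b_rank[i]] = i
            k = g
        return results

    # (duration, revenue) pairs where one task took less time but earned
    # more revenue than another:
    rows = [(100, 6), (140, 11), (80, 10), (90, 5)]
    print(inequality_self_join(rows))   # -> [(2, 3), (2, 0)]
    ```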

  3. Aquifers Characterization and Productivity in Ellala Catchment ...

    African Journals Online (AJOL)

    Aquifers Characterization and Productivity in Ellala Catchment, Tigray, ... using geological and hydrogeological methods in the Ellala catchment (296.5 km²). ... Current estimates put the available groundwater ... Aquifer characterization takes into

  4. How Big Are "Martin's Big Words"? Thinking Big about the Future.

    Science.gov (United States)

    Gardner, Traci

    "Martin's Big Words: The Life of Dr. Martin Luther King, Jr." tells of King's childhood determination to use "big words" through biographical information and quotations. In this lesson, students in grades 3 to 5 explore information on Dr. King to think about his "big" words, then they write about their own…

  5. Big Data-Survey

    Directory of Open Access Journals (Sweden)

    P.S.G. Aruna Sri

    2016-03-01

    Big data is the term for any collection of data sets so large and complex that they become difficult to process using traditional data-processing applications. The challenges include analysis, capture, curation, search, sharing, storage, transfer, visualization, and privacy violations. To spot business trends, anticipate diseases, conflicts, etc., we require bigger data sets than before. Big data is hard to work with using most relational database management systems and desktop statistics and visualization packages, requiring instead massively parallel software running on tens, hundreds, or even thousands of servers. In this paper we present an overview of the Hadoop architecture, the different tools used for big data, and its security issues.
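
    As a minimal illustration of the programming model that Hadoop popularized (plain Python standing in for the distributed map, shuffle, and reduce phases):

    ```python
    # Conceptual MapReduce word count; plain Python stands in for Hadoop's
    # distributed runtime.
    from collections import defaultdict

    docs = ["big data needs big tools", "hadoop processes big data"]

    # Map: emit (word, 1) pairs.
    mapped = [(w, 1) for doc in docs for w in doc.split()]

    # Shuffle: group values by key.
    groups = defaultdict(list)
    for word, count in mapped:
        groups[word].append(count)

    # Reduce: sum the counts for each word.
    print({word: sum(vals) for word, vals in groups.items()})
    ```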

  6. Finding the big bang

    CERN Document Server

    Page, Lyman A; Partridge, R Bruce

    2009-01-01

    Cosmology, the study of the universe as a whole, has become a precise physical science, the foundation of which is our understanding of the cosmic microwave background radiation (CMBR) left from the big bang. The story of the discovery and exploration of the CMBR in the 1960s is recalled for the first time in this collection of 44 essays by eminent scientists who pioneered the work. Two introductory chapters put the essays in context, explaining the general ideas behind the expanding universe and fossil remnants from the early stages of the expanding universe. The last chapter describes how the confusion of ideas and measurements in the 1960s grew into the present tight network of tests that demonstrate the accuracy of the big bang theory. This book is valuable to anyone interested in how science is done, and what it has taught us about the large-scale nature of the physical universe.

  7. Big Data as Governmentality

    DEFF Research Database (Denmark)

    Flyverbom, Mikkel; Klinkby Madsen, Anders; Rasche, Andreas

    This paper conceptualizes how large-scale data and algorithms condition and reshape knowledge production when addressing international development challenges. The concept of governmentality and four dimensions of an analytics of government are proposed as a theoretical framework to examine how big data is constituted as an aspiration to improve the data and knowledge underpinning development efforts. Based on this framework, we argue that big data’s impact on how relevant problems are governed is enabled by (1) new techniques of visualizing development issues, (2) linking aspects of international development agendas to algorithms that synthesize large-scale data, (3) novel ways of rationalizing knowledge claims that underlie development efforts, and (4) shifts in professional and organizational identities of those concerned with producing and processing data for development. Our discussion

  8. Big nuclear accidents

    International Nuclear Information System (INIS)

    Marshall, W.; Billingon, D.E.; Cameron, R.F.; Curl, S.J.

    1983-09-01

    Much of the debate on the safety of nuclear power focuses on the large number of fatalities that could, in theory, be caused by extremely unlikely but just imaginable reactor accidents. This, along with the nuclear industry's inappropriate use of vocabulary during public debate, has given the general public a distorted impression of the risks of nuclear power. The paper reviews the way in which the probability and consequences of big nuclear accidents have been presented in the past and makes recommendations for the future, including the presentation of the long-term consequences of such accidents in terms of 'loss of life expectancy', 'increased chance of fatal cancer' and 'equivalent pattern of compulsory cigarette smoking'. The paper presents mathematical arguments, which show the derivation and validity of the proposed methods of presenting the consequences of imaginable big nuclear accidents. (author)

  9. Big Bounce and inhomogeneities

    International Nuclear Information System (INIS)

    Brizuela, David; Mena Marugan, Guillermo A; Pawlowski, Tomasz

    2010-01-01

    The dynamics of an inhomogeneous universe is studied with the methods of loop quantum cosmology, via a so-called hybrid quantization, as an example of the quantization of vacuum cosmological spacetimes containing gravitational waves (Gowdy spacetimes). The analysis of this model with an infinite number of degrees of freedom, performed at the effective level, shows that (i) the initial Big Bang singularity is replaced (as in the case of homogeneous cosmological models) by a Big Bounce, joining deterministically two large universes, (ii) the universe size at the bounce is at least of the same order of magnitude as that of the background homogeneous universe and (iii) for each gravitational wave mode, the difference in amplitude at very early and very late times has a vanishing statistical average when the bounce dynamics is strongly dominated by the inhomogeneities, whereas this average is positive when the dynamics is in a near-vacuum regime, so that statistically the inhomogeneities are amplified. (fast track communication)

  10. Big Data and reality

    Directory of Open Access Journals (Sweden)

    Ryan Shaw

    2015-11-01

    DNA sequencers, Twitter, MRIs, Facebook, particle accelerators, Google Books, radio telescopes, Tumblr: what do these things have in common? According to the evangelists of “data science,” all of these are instruments for observing reality at unprecedentedly large scales and fine granularities. This perspective ignores the social reality of these very different technological systems, ignoring how they are made, how they work, and what they mean in favor of an exclusive focus on what they generate: Big Data. But no data, big or small, can be interpreted without an understanding of the process that generated them. Statistical data science is applicable to systems that have been designed as scientific instruments, but is likely to lead to confusion when applied to systems that have not. In those cases, a historical inquiry is preferable.

  11. Really big numbers

    CERN Document Server

    Schwartz, Richard Evan

    2014-01-01

    In the American Mathematical Society's first-ever book for kids (and kids at heart), mathematician and author Richard Evan Schwartz leads math lovers of all ages on an innovative and strikingly illustrated journey through the infinite number system. By means of engaging, imaginative visuals and endearing narration, Schwartz manages the monumental task of presenting the complex concept of Big Numbers in fresh and relatable ways. The book begins with small, easily observable numbers before building up to truly gigantic ones, like a nonillion, a tredecillion, a googol, and even ones too huge for names! Any person, regardless of age, can benefit from reading this book. Readers will find themselves returning to its pages for a very long time, perpetually learning from and growing with the narrative as their knowledge deepens. Really Big Numbers is a wonderful enrichment for any math education program and is enthusiastically recommended to every teacher, parent and grandparent, student, child, or other individual i...

  12. Big Bang Circus

    Science.gov (United States)

    Ambrosini, C.

    2011-06-01

    Big Bang Circus is an opera I composed in 2001 and which was premiered at the Venice Biennale Contemporary Music Festival in 2002. A chamber group, four singers and a ringmaster stage the story of the Universe confronting and interweaving two threads: how early man imagined it and how scientists described it. Surprisingly enough fancy, myths and scientific explanations often end up using the same images, metaphors and sometimes even words: a strong tension, a drumskin starting to vibrate, a shout…

  13. Big Bang 5

    CERN Document Server

    Apolin, Martin

    2007-01-01

    Physics should be understandable and fun! That is why every chapter in Big Bang begins with a motivating overview and guiding questions, and then moves from fundamentals to applications, from the simple to the complicated. The language remains simple, oriented to everyday life, and narrative. Volume 5 RG covers the fundamentals (systems of units, orders of magnitude) and mechanics (translation, rotation, force, conservation laws).

  14. Big Bang 8

    CERN Document Server

    Apolin, Martin

    2008-01-01

    Physics should be understandable and fun! That is why every chapter in Big Bang begins with a motivating overview and guiding questions, and then moves from fundamentals to applications, from the simple to the complicated. The language remains simple, oriented to everyday life, and narrative. Volume 8 conveys, in an accessible way, relativity, nuclear and particle physics (and their applications in cosmology and astrophysics), nanotechnology, and bionics.

  15. Big Bang 6

    CERN Document Server

    Apolin, Martin

    2008-01-01

    Physics should be understandable and fun! That is why every chapter in Big Bang begins with a motivating overview and guiding questions, and then moves from fundamentals to applications, from the simple to the complicated. The language remains simple, oriented to everyday life, and narrative. Volume 6 RG covers gravitation, oscillations and waves, thermodynamics, and an introduction to electricity, using everyday examples and cross-connections to other disciplines.

  16. Big Bang 7

    CERN Document Server

    Apolin, Martin

    2008-01-01

    Physics should be understandable and fun! That is why every chapter in Big Bang begins with a motivating overview and guiding questions, and then moves from fundamentals to applications, from the simple to the complicated. The language remains simple, oriented to everyday life, and narrative. Volume 7 covers, in addition to an introduction, many current aspects of quantum mechanics (e.g., quantum teleportation) and electrodynamics (e.g., electrosmog), as well as climate issues and chaos theory.

  17. Big Bang Darkleosynthesis

    OpenAIRE

    Krnjaic, Gordan; Sigurdson, Kris

    2014-01-01

    In a popular class of models, dark matter comprises an asymmetric population of composite particles with short range interactions arising from a confined nonabelian gauge group. We show that coupling this sector to a well-motivated light mediator particle yields efficient darkleosynthesis, a dark-sector version of big-bang nucleosynthesis (BBN), in generic regions of parameter space. Dark matter self-interaction bounds typically require the confinement scale to be above Λ_QCD, which generica...

  18. Big³. Editorial.

    Science.gov (United States)

    Lehmann, C U; Séroussi, B; Jaulent, M-C

    2014-05-22

    To provide an editorial introduction to the 2014 IMIA Yearbook of Medical Informatics, with an overview of the content, the new publishing scheme, and the upcoming 25th anniversary. A brief overview of the 2014 special topic, Big Data - Smart Health Strategies, and an outline of the novel publishing model are provided, in conjunction with a call for proposals to celebrate the 25th anniversary of the Yearbook. 'Big Data' has become the latest buzzword in informatics, promising new approaches and interventions that can improve health, well-being, and quality of life. This edition of the Yearbook acknowledges the fact that we have only just started to explore the opportunities that 'Big Data' will bring. However, it will become apparent to the reader that its pervasive nature has invaded all aspects of biomedical informatics - some to a higher degree than others. It was our goal to provide a comprehensive view of the state of 'Big Data' today, explore its strengths and weaknesses as well as its risks, discuss emerging trends, tools, and applications, and stimulate the development of the field through the aggregation of excellent survey papers and working group contributions on the topic. For the first time in its history, the IMIA Yearbook will be published in an open access online format, allowing a broader readership, especially in resource-poor countries. Also for the first time, thanks to the online format, the IMIA Yearbook will be published twice in the year, with two different tracks of papers. We anticipate that the important role of the IMIA Yearbook will further increase with these changes, just in time for its 25th anniversary in 2016.

  19. Recent big flare

    International Nuclear Information System (INIS)

    Moriyama, Fumio; Miyazawa, Masahide; Yamaguchi, Yoshisuke

    1978-01-01

    The features of three big solar flares observed at Tokyo Observatory are described in this paper. The active region, McMath 14943, caused a big flare on September 16, 1977. The flare appeared on both sides of a long dark line which runs along the boundary of the magnetic field. Two-ribbon structure was seen. The electron density of the flare observed at Norikura Corona Observatory was 3 × 10¹²/cc. Several arc lines which connect both bright regions of different magnetic polarity were seen in the H-α monochrome image. The active region, McMath 15056, caused a big flare on December 10, 1977. At the beginning, several bright spots were observed in the region between two main solar spots. Then, the area and the brightness increased, and the bright spots became two ribbon-shaped bands. A solar flare was observed on April 8, 1978. At first, several bright spots were seen around the solar spot in the active region, McMath 15221. Then, these bright spots developed into a large bright region. On both sides of a dark line along the magnetic neutral line, bright regions were generated. These developed into a two-ribbon flare. The time required for growth was more than one hour. A bright arc which connects the two ribbons was seen, and this arc may be a loop prominence system. (Kato, T.)

  20. Big Data Technologies

    Science.gov (United States)

    Bellazzi, Riccardo; Dagliati, Arianna; Sacchi, Lucia; Segagni, Daniele

    2015-01-01

    The so-called big data revolution provides substantial opportunities for diabetes management. At least three important directions are currently of great interest. First, the integration of different sources of information, from primary and secondary care to administrative information, may allow a novel view of patients' care processes and of individual patients' behaviors to be depicted, taking into account the multifaceted nature of chronic care. Second, the availability of novel diabetes technologies, able to gather large amounts of real-time data, requires the implementation of distributed platforms for data analysis and decision support. Finally, the inclusion of geographical and environmental information into such complex IT systems may further increase the capability of interpreting the gathered data and extracting new knowledge from them. This article reviews the main concepts and definitions related to big data, presents some efforts in health care, and discusses the potential role of big data in diabetes care. Finally, as an example, it describes the research efforts carried out in the MOSAIC project, funded by the European Commission. PMID:25910540

  1. Big Sky Carbon Sequestration Partnership

    Energy Technology Data Exchange (ETDEWEB)

    Susan Capalbo

    2005-12-31

    has significant potential to sequester large amounts of CO₂. Simulations conducted to evaluate the mineral trapping potential of mafic volcanic rock formations located in the Idaho province suggest that supercritical CO₂ is converted to solid carbonate mineral within a few hundred years and permanently entombs the carbon. Although MMV for this rock type may be challenging, a carefully chosen combination of geophysical and geochemical techniques should allow assessment of the fate of CO₂ in deep basalt-hosted aquifers. Terrestrial carbon sequestration relies on land management practices and technologies to remove atmospheric CO₂, which is then stored in trees, plants, and soil. This indirect sequestration can be implemented today and is on the front line of voluntary, market-based approaches to reduce CO₂ emissions. Initial estimates of terrestrial sinks indicate a vast potential for increasing and maintaining soil carbon (C) on rangelands and on forested, agricultural, and reclaimed lands. Rangelands can store up to an additional 0.05 mt C/ha/yr, while croplands average four times that amount. Estimates of the technical potential for soil sequestration within the region's cropland are in the range of 2.0 M mt C/yr over a 20-year time horizon. This is equivalent to approximately 7.0 M mt CO₂e/yr. The forestry sinks are well documented, and the potential in the Big Sky region ranges from 9-15 M mt CO₂ equivalent per year. Value-added benefits include enhanced yields, reduced erosion, and increased wildlife habitat. Thus the terrestrial sinks provide a viable, environmentally beneficial, and relatively low-cost sink that is available to sequester C in the current time frame. The Partnership recognizes the critical importance of measurement, monitoring, and verification technologies to support not only carbon trading but all policies and programs that DOE and other agencies may want to pursue in support of GHG mitigation. The efforts

  2. Aquifer thermal energy stores in Germany

    International Nuclear Information System (INIS)

    Kabus, F.; Seibt, P.; Poppei, J.

    2000-01-01

    This paper describes the state of key demonstration projects for heat and cold storage in aquifers in Germany. The energy supply system of the German Parliament buildings in Berlin integrates both a deep brine-bearing aquifer for the seasonal storage of waste heat from combined heat and power generation and a shallow freshwater-bearing aquifer for cold storage. In Neubrandenburg, a geothermal heating plant that uses a 1,200-m-deep aquifer is being retrofitted into an aquifer heat storage system that can be charged with the waste heat from a gas-and-steam cogeneration plant. The first centralised solar heating plant in Germany to include an aquifer thermal energy store was constructed in Rostock: solar collectors with a total area of 1,000 m² serve to heat a complex of buildings with 108 flats, and a shallow freshwater-bearing aquifer is used for thermal energy storage. (Authors)

  3. Big bang and big crunch in matrix string theory

    OpenAIRE

    Bedford, J; Papageorgakis, C; Rodríguez-Gómez, D; Ward, J

    2007-01-01

    Following the holographic description of linear dilaton null Cosmologies with a Big Bang in terms of Matrix String Theory put forward by Craps, Sethi and Verlinde, we propose an extended background describing a Universe including both Big Bang and Big Crunch singularities. This belongs to a class of exact string backgrounds and is perturbative in the string coupling far away from the singularities, both of which can be resolved using Matrix String Theory. We provide a simple theory capable of...

  4. Disaggregating asthma: Big investigation versus big data.

    Science.gov (United States)

    Belgrave, Danielle; Henderson, John; Simpson, Angela; Buchan, Iain; Bishop, Christopher; Custovic, Adnan

    2017-02-01

    We are facing a major challenge in bridging the gap between identifying subtypes of asthma to understand causal mechanisms and translating this knowledge into personalized prevention and management strategies. In recent years, "big data" has been sold as a panacea for generating hypotheses and driving new frontiers of health care; the idea that the data must and will speak for themselves is fast becoming a new dogma. One of the dangers of ready accessibility of health care data and computational tools for data analysis is that the process of data mining can become uncoupled from the scientific process of clinical interpretation, understanding the provenance of the data, and external validation. Although advances in computational methods can be valuable for using unexpected structure in data to generate hypotheses, there remains a need for testing hypotheses and interpreting results with scientific rigor. We argue for combining data- and hypothesis-driven methods in a careful synergy, and the importance of carefully characterized birth and patient cohorts with genetic, phenotypic, biological, and molecular data in this process cannot be overemphasized. The main challenge on the road ahead is to harness bigger health care data in ways that produce meaningful clinical interpretation and to translate this into better diagnoses and properly personalized prevention and treatment plans. There is a pressing need for cross-disciplinary research with an integrative approach to data science, whereby basic scientists, clinicians, data analysts, and epidemiologists work together to understand the heterogeneity of asthma. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  5. Big data and educational research

    OpenAIRE

    Beneito-Montagut, Roser

    2017-01-01

    Big data and data analytics offer the promise to enhance teaching and learning, improve educational research and progress education governance. This chapter aims to contribute to the conceptual and methodological understanding of big data and analytics within educational research. It describes the opportunities and challenges that big data and analytics bring to education as well as critically explore the perils of applying a data driven approach to education. Despite the claimed value of the...

  6. The trashing of Big Green

    International Nuclear Information System (INIS)

    Felten, E.

    1990-01-01

    The Big Green initiative on California's ballot lost by a margin of 2-to-1. Green measures lost in five other states, shocking ecology-minded groups. According to the postmortem by environmentalists, Big Green was a victim of poor timing and big spending by the opposition. Now its supporters plan to break up the bill and try to pass some provisions in the Legislature

  7. Big data in fashion industry

    Science.gov (United States)

    Jain, S.; Bruniaux, J.; Zeng, X.; Bruniaux, P.

    2017-10-01

    Significant work has been done in the field of big data in the last decade. The concept of big data involves analysing voluminous data to extract valuable information. In the fashion world, big data is increasingly playing a part in trend forecasting and in analysing consumer behaviour, preferences, and emotions. The purpose of this paper is to introduce the term fashion data and explain why it can be considered big data. It also gives a broad classification of the types of fashion data and briefly defines them. The methodology and working of a system that will use this data are also briefly described.

  8. Big Data Analytics An Overview

    Directory of Open Access Journals (Sweden)

    Jayshree Dwivedi

    2015-08-01

    Full Text Available Big data refers to data beyond the storage capacity and processing power of traditional systems: the term is used for data sets so large or complex that conventional tools cannot handle them. The size threshold is a constantly moving target, ranging year by year from a few dozen terabytes to many petabytes, because the amount of data produced by people - on social networking sites, for example - grows rapidly every year. Big data is not only data; it has become a complete subject encompassing various tools, techniques, and frameworks, and it covers both structured and unstructured data. Big data is a set of techniques and technologies that require new forms of integration to uncover large hidden values from data sets that are diverse, complex, and of massive scale. Such data are difficult to work with using most relational database management systems and desktop statistics and visualization packages, requiring instead massively parallel software running on tens, hundreds, or even thousands of servers. Big data environments are used to capture, organize, and resolve many types of data. In this paper we describe the applications, problems, and tools of big data and give an overview of the field.

  9. Was there a big bang

    International Nuclear Information System (INIS)

    Narlikar, J.

    1981-01-01

    In discussing the viability of the big-bang model of the Universe relative evidence is examined including the discrepancies in the age of the big-bang Universe, the red shifts of quasars, the microwave background radiation, general theory of relativity aspects such as the change of the gravitational constant with time, and quantum theory considerations. It is felt that the arguments considered show that the big-bang picture is not as soundly established, either theoretically or observationally, as it is usually claimed to be, that the cosmological problem is still wide open and alternatives to the standard big-bang picture should be seriously investigated. (U.K.)

  10. How Big is Earth?

    Science.gov (United States)

    Thurber, Bonnie B.

    2015-08-01

    How Big is Earth? celebrates the Year of Light. Using only the sunlight striking the Earth and a wooden dowel, students meet each other and then measure the circumference of the Earth. Eratosthenes did it over 2,000 years ago. In Cosmos, Carl Sagan shared the process by which Eratosthenes measured the angle of the shadow cast at local noon when sunlight strikes a stick positioned perpendicular to the ground. By comparing his measurement to another made a distance away, Eratosthenes was able to calculate the circumference of the Earth. How Big is Earth? provides an online learning environment where students do science the same way Eratosthenes did. A notable project in which this was done was the Eratosthenes Project, conducted in 2005 as part of the World Year of Physics; in fact, we will be drawing on the teacher's guide developed by that project. How Big Is Earth? expands on the Eratosthenes Project by providing an online learning environment, the iCollaboratory (www.icollaboratory.org), where teachers and students from Sweden, China, Nepal, Russia, Morocco, and the United States collaborate, share data, and reflect on their learning of science and astronomy. They share their information and discuss their ideas and candidate solutions in a discussion forum. There is an ongoing database of student measurements and another database that collects data on both teacher and student learning from surveys, discussions, and self-reflection done online. We will share our research about the kinds of learning that take place only in global collaborations. The entrance address for the iCollaboratory is http://www.icollaboratory.org.
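
    The arithmetic behind the activity is compact enough to sketch. A minimal Python example of the Eratosthenes calculation the project is built on; the numbers are illustrative (close to the classical Alexandria-Syene values), not measurements from the project:

        def earth_circumference_km(shadow_angle_deg: float, distance_km: float) -> float:
            """Eratosthenes' method: if the sun is directly overhead at one site
            while a vertical stick at a second site casts a shadow at angle a
            (in degrees), the two sites span a/360 of the full circumference."""
            return distance_km * 360.0 / shadow_angle_deg

        # Roughly the classical Alexandria-Syene setup: a 7.2 degree shadow
        # angle and about 800 km between the two sites.
        print(earth_circumference_km(7.2, 800.0))  # -> 40000.0 km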

  11. Privacy and Big Data

    CERN Document Server

    Craig, Terence

    2011-01-01

    Much of what constitutes Big Data is information about us. Through our online activities, we leave an easy-to-follow trail of digital footprints that reveal who we are, what we buy, where we go, and much more. This eye-opening book explores the raging privacy debate over the use of personal data, with one undeniable conclusion: once data's been collected, we have absolutely no control over who uses it or how it is used. Personal data is the hottest commodity on the market today-truly more valuable than gold. We are the asset that every company, industry, non-profit, and government wants. Pri

  12. Visualizing big energy data

    DEFF Research Database (Denmark)

    Hyndman, Rob J.; Liu, Xueqin Amy; Pinson, Pierre

    2018-01-01

    Visualization is a crucial component of data analysis. It is always a good idea to plot the data before fitting models, making predictions, or drawing conclusions. As sensors of the electric grid are collecting large volumes of data from various sources, power industry professionals are facing the challenge of visualizing such data in a timely fashion. In this article, we demonstrate several data-visualization solutions for big energy data through three case studies involving smart-meter data, phasor measurement unit (PMU) data, and probabilistic forecasts, respectively.

  13. Big Data Challenges

    Directory of Open Access Journals (Sweden)

    Alexandru Adrian TOLE

    2013-10-01

    Full Text Available The amount of data traveling across the internet today is not only large but complex as well. Companies, institutions, the healthcare system, and others all use piles of data, which are further used for creating reports in order to ensure continuity of the services they offer. The process behind the results that these entities request represents a challenge for software developers and companies that provide IT infrastructure. The challenge is how to manipulate an impressive volume of data that has to be securely delivered through the internet and reach its destination intact. This paper treats the challenges that Big Data creates.

  14. Big data naturally rescaled

    International Nuclear Information System (INIS)

    Stoop, Ruedi; Kanders, Karlis; Lorimer, Tom; Held, Jenny; Albert, Carlo

    2016-01-01

    We propose that a handle could be put on big data by looking at the systems that actually generate the data, rather than the data itself, realizing that there may be only few generic processes involved in this, each one imprinting its very specific structures in the space of systems, the traces of which translate into feature space. From this, we propose a practical computational clustering approach, optimized for coping with such data, inspired by how the human cortex is known to approach the problem.

  15. A Matrix Big Bang

    OpenAIRE

    Craps, Ben; Sethi, Savdeep; Verlinde, Erik

    2005-01-01

    The light-like linear dilaton background represents a particularly simple time-dependent 1/2 BPS solution of critical type IIA superstring theory in ten dimensions. Its lift to M-theory, as well as its Einstein frame metric, are singular in the sense that the geometry is geodesically incomplete and the Riemann tensor diverges along a light-like subspace of codimension one. We study this background as a model for a big bang type singularity in string theory/M-theory. We construct the dual Matr...

  16. BIG Data - BIG Gains? Understanding the Link Between Big Data Analytics and Innovation

    OpenAIRE

    Niebel, Thomas; Rasel, Fabienne; Viete, Steffen

    2017-01-01

    This paper analyzes the relationship between firms’ use of big data analytics and their innovative performance for product innovations. Since big data technologies provide new data information practices, they create new decision-making possibilities, which firms can use to realize innovations. Applying German firm-level data we find suggestive evidence that big data analytics matters for the likelihood of becoming a product innovator as well as the market success of the firms’ product innovat...

  17. BIG data - BIG gains? Empirical evidence on the link between big data analytics and innovation

    OpenAIRE

    Niebel, Thomas; Rasel, Fabienne; Viete, Steffen

    2017-01-01

    This paper analyzes the relationship between firms’ use of big data analytics and their innovative performance in terms of product innovations. Since big data technologies provide new data information practices, they create novel decision-making possibilities, which are widely believed to support firms’ innovation process. Applying German firm-level data within a knowledge production function framework we find suggestive evidence that big data analytics is a relevant determinant for the likel...

  18. Results from the Big Spring basin water quality monitoring and demonstration projects, Iowa, USA

    Science.gov (United States)

    Rowden, R.D.; Liu, H.; Libra, R.D.

    2001-01-01

    Agricultural practices, hydrology, and water quality of the 267-km² Big Spring groundwater drainage basin in Clayton County, Iowa, have been monitored since 1981. Land use is agricultural; nitrate-nitrogen (nitrate-N) and herbicides are the resulting contaminants in groundwater and surface water. Ordovician Galena Group carbonate rocks comprise the main aquifer in the basin. Recharge to this karstic aquifer is by infiltration, augmented by sinkhole-captured runoff. Groundwater is discharged at Big Spring, where the quantity and quality of the discharge are monitored. Monitoring has shown a threefold increase in groundwater nitrate-N concentrations from the 1960s to the early 1980s. The nitrate-N discharged from the basin typically is equivalent to over one-third of the nitrogen fertilizer applied, with larger losses during wetter years. Atrazine is present in groundwater all year; however, contaminant concentrations in the groundwater respond directly to recharge events, and unique chemical signatures of infiltration versus runoff recharge are detectable in the discharge from Big Spring. Education and demonstration efforts have reduced nitrogen fertilizer application rates by one-third since 1981. Relating declines in nitrate and pesticide concentrations to inputs of nitrogen fertilizer and pesticides at Big Spring is problematic. Annual recharge has varied five-fold during monitoring, overshadowing any water-quality improvements resulting from incrementally decreased inputs. © Springer-Verlag 2001.
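
    The mass-balance comparison behind the "one-third of fertilizer applied" figure reduces to load = discharge × concentration, accumulated over a year. A minimal sketch of that accounting; all numbers below are illustrative, not Big Spring data:

        SECONDS_PER_YEAR = 365.25 * 24 * 3600

        def annual_nitrate_load_tonnes(q_m3s: float, conc_mg_l: float) -> float:
            """Annual nitrate-N load (tonnes) from mean discharge (m^3/s) and
            flow-weighted mean nitrate-N concentration (mg/L = g/m^3)."""
            grams = q_m3s * SECONDS_PER_YEAR * conc_mg_l
            return grams / 1e6  # g -> tonnes

        load = annual_nitrate_load_tonnes(q_m3s=2.0, conc_mg_l=10.0)  # ~631 t N/yr
        applied = 1800.0  # hypothetical fertilizer N applied in the basin, t/yr
        print(load, load / applied)  # fraction of applied N leaving at the spring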

  19. Arsenic, microbes and contaminated aquifers

    Science.gov (United States)

    Oremland, Ronald S.; Stolz, John F.

    2005-01-01

    The health of tens of millions of people world-wide is at risk from drinking arsenic-contaminated well water. In most cases this arsenic occurs naturally within the sub-surface aquifers, rather than being derived from identifiable point sources of pollution. The mobilization of arsenic into the aqueous phase is the first crucial step in a process that eventually leads to human arsenicosis. Increasing evidence suggests that this is a microbiological phenomenon.

  20. [Big data in imaging].

    Science.gov (United States)

    Sewerin, Philipp; Ostendorf, Benedikt; Hueber, Axel J; Kleyer, Arnd

    2018-04-01

    Until now, most major medical advancements have been achieved through hypothesis-driven research within the scope of clinical trials. However, due to a multitude of variables, only a certain number of research questions could be addressed during a single study, thus rendering these studies expensive and time consuming. Big data acquisition enables a new data-based approach in which large volumes of data can be used to investigate all variables, thus opening new horizons. Due to universal digitalization of the data as well as ever-improving hard- and software solutions, imaging would appear to be predestined for such analyses. Several small studies have already demonstrated that automated analysis algorithms and artificial intelligence can identify pathologies with high precision. Such automated systems would also seem well suited for rheumatology imaging, since a method for individualized risk stratification has long been sought for these patients. However, despite all the promising options, the heterogeneity of the data and highly complex regulations covering data protection in Germany would still render a big data solution for imaging difficult today. Overcoming these boundaries is challenging, but the enormous potential advances in clinical management and science render pursuit of this goal worthwhile.

  1. Big bang nucleosynthesis

    International Nuclear Information System (INIS)

    Fields, Brian D.; Olive, Keith A.

    2006-01-01

    We present an overview of the standard model of big bang nucleosynthesis (BBN), which describes the production of the light elements in the early universe. The theoretical prediction for the abundances of D, ³He, ⁴He, and ⁷Li is discussed. We emphasize the role of key nuclear reactions and the methods by which experimental cross section uncertainties are propagated into uncertainties in the predicted abundances. The observational determination of the light nuclides is also discussed. Particular attention is given to the comparison between the predicted and observed abundances, which yields a measurement of the cosmic baryon content. The spectrum of anisotropies in the cosmic microwave background (CMB) now independently measures the baryon density to high precision; we show how the CMB data test BBN, and find that the CMB and the D and ⁴He observations paint a consistent picture. This concordance stands as a major success of the hot big bang. On the other hand, ⁷Li remains discrepant with the CMB-preferred baryon density; possible explanations are reviewed. Finally, moving beyond the standard model, primordial nucleosynthesis constraints on the early universe and particle physics are also briefly discussed.
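
    The uncertainty propagation mentioned above is commonly done by Monte Carlo: sample each reaction rate within its experimental error and rerun the abundance prediction. A toy sketch of that idea only; the power-law "model" and the 5% rate uncertainty are invented for illustration, not BBN physics:

        import numpy as np

        rng = np.random.default_rng(1)

        def predicted_abundance(rate):
            # Hypothetical stand-in for a real BBN code: abundance as a smooth
            # function of a normalized reaction rate.
            return 2.5e-5 * rate**0.5

        rates = rng.normal(1.0, 0.05, size=10_000)  # rate known to 5% (1 sigma)
        abundances = predicted_abundance(rates)
        print(abundances.mean(), abundances.std())  # propagated uncertainty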

  2. Nursing Needs Big Data and Big Data Needs Nursing.

    Science.gov (United States)

    Brennan, Patricia Flatley; Bakken, Suzanne

    2015-09-01

    Contemporary big data initiatives in health care will benefit from greater integration with nursing science and nursing practice; in turn, nursing science and nursing practice have much to gain from data science initiatives. Big data arises secondary to scholarly inquiry (e.g., -omics) and everyday observations like cardiac flow sensors or Twitter feeds. Emerging data science methods ensure that these data can be leveraged to improve patient care. Big data encompasses data that exceed human comprehension, that exist at a volume unmanageable by standard computer systems, that arrive at a velocity not under the control of the investigator, and that possess a level of imprecision not found in traditional inquiry. Data science methods are emerging to manage and gain insights from big data. The primary methods included investigation of emerging federal big data initiatives and exploration of exemplars from nursing informatics research to benchmark where nursing is already poised to participate in the big data revolution. We provide observations and reflections on experiences in the emerging big data initiatives. Existing approaches to large data set analysis provide a necessary but not sufficient foundation for nursing to participate in the big data revolution. Nursing's Social Policy Statement guides a principled, ethical perspective on big data and data science. There are implications for basic and advanced practice clinical nurses, for the nurse scientist who collaborates with data scientists, and for the nurse data scientist. Big data and data science have the potential to provide greater richness in understanding patient phenomena and in tailoring interventional strategies that are personalized to the patient. © 2015 Sigma Theta Tau International.

  3. Aquifer thermal-energy-storage modeling

    Science.gov (United States)

    Schaetzle, W. J.; Lecroy, J. E.

    1982-09-01

    A model aquifer was constructed to simulate the operation of a full-size aquifer, and instrumentation to evaluate the water flow and thermal energy storage was installed in the system. Numerous runs injecting warm water into a preconditioned uniform aquifer were made. Energy recoveries were evaluated and agree with the limited data available for comparison. The model aquifer was built in an 18-ft by 4-ft swimming pool filled with sand, with temperature probes installed throughout the system. A 2-ft-thick aquifer is confined by two layers of polyethylene; both the aquifer and the overburden are sand. Four well configurations are available. The system description and original tests, including energy recovery, are described.
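
    The energy-recovery figure evaluated in such tests is typically the ratio of heat extracted to heat injected, each computed as volume × volumetric heat capacity × temperature change relative to ambient. A minimal sketch with invented numbers, not the model-aquifer measurements:

        RHO_CP_WATER = 4.18e6  # volumetric heat capacity of water, J/(m^3*K)

        def thermal_energy_j(volume_m3: float, delta_t_k: float) -> float:
            """Heat carried by a volume of water relative to ambient temperature."""
            return volume_m3 * RHO_CP_WATER * delta_t_k

        injected = thermal_energy_j(volume_m3=5.0, delta_t_k=20.0)   # warm water in
        recovered = thermal_energy_j(volume_m3=5.0, delta_t_k=13.0)  # cooler water out
        print(recovered / injected)  # recovery factor, here 0.65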

  4. Was the big bang hot

    International Nuclear Information System (INIS)

    Wright, E.L.

    1983-01-01

    The author considers experiments to confirm the substantial deviations from a Planck curve in the Woody and Richards spectrum of the microwave background, and a search for conducting needles in our galaxy. Spectral deviations and needle-shaped grains are expected for a cold Big Bang, but are not required by a hot Big Bang. (Auth.)

  5. Ethische aspecten van big data

    NARCIS (Netherlands)

    N. (Niek) van Antwerpen; Klaas Jan Mollema

    2017-01-01

    Big data has not only led to challenging technical questions; it also goes hand in hand with a whole range of new ethical and moral issues. To handle big data responsibly, these issues must be thought through as well, because poor use of data can have adverse consequences for

  6. Fremtidens landbrug bliver big business

    DEFF Research Database (Denmark)

    Hansen, Henning Otte

    2016-01-01

    Agriculture's external conditions and competitive environment are changing, and this will necessitate a development towards "big business", in which farms become even larger, more industrialised, and more concentrated. Big business will become a dominant development in Danish agriculture - but not the only one...

  7. Human factors in Big Data

    NARCIS (Netherlands)

    Boer, J. de

    2016-01-01

    Since 2014 I have been involved in various (research) projects that try to make the hype around Big Data more concrete and tangible for industry and government. Big Data is about multiple sources of (real-time) data that can be analysed, transformed into information, and used to make 'smart' decisions.

  8. Passport to the Big Bang

    CERN Multimedia

    De Melis, Cinzia

    2013-01-01

    On 2 June 2013 CERN launched the Passport to the Big Bang, a scientific tourist trail through the Pays de Gex and the Canton of Geneva, at a big public event. Poster and programme.

  9. Hydrochemistry of New Zealand's aquifers

    International Nuclear Information System (INIS)

    Rosen, M.R.

    2001-01-01

    Groundwater chemistry on a national scale has never been studied in New Zealand, apart from a few studies on nitrate concentrations and pesticides; these studies are covered in Chapter 8 of this book. However, general studies of groundwater chemistry, groundwater-rock interaction, and regional characteristics of water quality have not previously been addressed in much detail. This is partly because New Zealand aquifers are relatively small on a world scale and are geologically and tectonically diverse (see Chapter 3), but also because New Zealand has until recently lacked a centralised agency responsible for groundwater quality, and therefore no national assessments have been undertaken. In recent years, the Institute of Geological and Nuclear Sciences has managed a programme of collecting and analysing the groundwater chemistry of key New Zealand aquifers. This programme is called the National Groundwater Monitoring Programme (NGMP) and is funded by the New Zealand Public Good Science Fund. The programme started in 1990 with only 22 wells and four of the country's regional authorities participating; the NGMP now includes all 15 regional and unitary authorities that use groundwater and over 100 monitoring sites. The NGMP is considered a nationally significant database by the New Zealand Foundation for Research, Science and Technology. The NGMP allows a national comparison of aquifer chemistries because the samples are all analysed at one laboratory in a consistent manner and undergo stringent quality-control checks; poor-quality analyses are thus minimised. In addition, samples are collected quarterly so that long-term seasonal trends in water quality can be analysed, and the effects of changes in land use and the vulnerability of aquifers to contaminant leaching can be assessed. This chapter summarises the water quality data collected for the NGMP over the past 10 years. Some records are much shorter than others, but most are greater than three years. Additional information is

  10. Applications of Big Data in Education

    OpenAIRE

    Faisal Kalota

    2015-01-01

    Big Data and analytics have gained a huge momentum in recent years. Big Data feeds into the field of Learning Analytics (LA) that may allow academic institutions to better understand the learners' needs and proactively address them. Hence, it is important to have an understanding of Big Data and its applications. The purpose of this descriptive paper is to provide an overview of Big Data, the technologies used in Big Data, and some of the applications of Big Data in educa...

  11. Exploring complex and big data

    Directory of Open Access Journals (Sweden)

    Stefanowski Jerzy

    2017-12-01

    Full Text Available This paper shows how big data analysis opens a range of research and technological problems and calls for new approaches. We start with defining the essential properties of big data and discussing the main types of data involved. We then survey the dedicated solutions for storing and processing big data, including a data lake, virtual integration, and a polystore architecture. Difficulties in managing data quality and provenance are also highlighted. The characteristics of big data imply also specific requirements and challenges for data mining algorithms, which we address as well. The links with related areas, including data streams and deep learning, are discussed. The common theme that naturally emerges from this characterization is complexity. All in all, we consider it to be the truly defining feature of big data (posing particular research and technological challenges, which ultimately seems to be of greater importance than the sheer data volume.

  12. An evaluation of aquifer intercommunication between the unconfined and Rattlesnake Ridge aquifers on the Hanford Site

    International Nuclear Information System (INIS)

    Jensen, E.J.

    1987-10-01

    During 1986, Pacific Northwest Laboratory conducted a study of a portion of the Rattlesnake Ridge aquifer (confined aquifer) that lies beneath the B Pond - Gable Mountain Pond area of the Hanford Site. The purpose was to determine the extent of intercommunication between the unconfined aquifer and the uppermost regionally extensive confined aquifer, referred to as the Rattlesnake Ridge aquifer. Hydraulic head data and chemical data were collected from the ground water in the study area during December 1986. The hydraulic head data were used to determine the effects caused by water discharged to the ground from B Pond on both the water table of the unconfined aquifer and the potentiometric surface of the confined aquifer. The chemical data were collected to determine the extent of chemical constituents migrating from the unconfined aquifer to the confined aquifer. Analysis of chemical constituents in the Rattlesnake Ridge aquifer demonstrated that communication between the unconfined and confined aquifers had occurred. However, the levels of contaminants found in the Rattlesnake Ridge aquifer during this study were below the DOE Derived Concentration Guides
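
    The head comparison described here comes down to a vertical-gradient check: if the water-table head in the unconfined aquifer exceeds the potentiometric head in the confined aquifer, any leakage through the intervening aquitard is downward. A minimal sketch with illustrative numbers, not Hanford data; the aquitard conductivity below is hypothetical:

        def vertical_gradient(head_upper_m: float, head_lower_m: float,
                              separation_m: float) -> float:
            """Positive value -> downward driving force (upper head is higher)."""
            return (head_upper_m - head_lower_m) / separation_m

        i = vertical_gradient(head_upper_m=123.4, head_lower_m=121.9,
                              separation_m=30.0)
        k_v = 1e-9         # hypothetical aquitard vertical conductivity, m/s
        print(i, k_v * i)  # gradient, and Darcy flux q = K_v * i in m/s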

  13. Aquifer Characterization and Groundwater Potential Assessment

    African Journals Online (AJOL)

    Timothy Ademakinwa

    Keywords: Aquifer Characterization, Groundwater Potential, Electrical Resistivity, Lithologic Logs

  14. The Guarani Aquifer System: estimation of recharge along the Uruguay-Brazil border

    Science.gov (United States)

    Gómez, Andrea A.; Rodríguez, Leticia B.; Vives, Luis S.

    2010-11-01

    The cities of Rivera and Santana do Livramento are located on the outcrop area of the sandstone Guarani Aquifer on the Brazil-Uruguay border, where the aquifer is being increasingly exploited; recharge estimates are therefore needed to address sustainability. First, a conceptual model of the area was developed. A multilayer, heterogeneous, and anisotropic groundwater-flow model was then built to validate the conceptual model and to estimate recharge. A field campaign was conducted to collect water samples and monitor the water levels used for model calibration. Field data revealed that vertical gradients exist between the confining basalts and the underlying sandstones, suggesting that the basalts could indirectly recharge the sandstone in fractured areas. The simulated downward flow between them was a small component of the global water budget. Calibrated recharge rates over the basalts and over the outcropping sandstones were 1.3 and 8.1% of mean annual precipitation, respectively. A large portion of the sandstone recharge would be drained by streams. The application of a water balance yielded a recharge of 8.5% of average annual precipitation. The numerical model and the water balance yielded similar recharge values, consistent with determinations by previous authors in the area and in other regions of the aquifer, providing an upper bound for recharge in this transboundary aquifer.
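
    The water-balance estimate quoted above treats recharge as the residual of precipitation after evapotranspiration and runoff. A minimal sketch of that calculation; the inputs are illustrative values chosen to land near the reported 8.5%, not the paper's data:

        def recharge_mm(precip_mm: float, et_mm: float, runoff_mm: float) -> float:
            """Residual recharge R = P - ET - Q, floored at zero."""
            return max(precip_mm - et_mm - runoff_mm, 0.0)

        p = 1400.0  # mean annual precipitation, mm
        r = recharge_mm(p, et_mm=1130.0, runoff_mm=150.0)
        print(r, r / p)  # 120 mm/yr, i.e. about 8.6% of precipitation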

  15. The big data telescope

    International Nuclear Information System (INIS)

    Finkel, Elizabeth

    2017-01-01

    On a flat, red mulga plain in the outback of Western Australia, preparations are under way to build the most audacious telescope astronomers have ever dreamed of - the Square Kilometre Array (SKA). Next-generation telescopes usually aim to double the performance of their predecessors. The Australian arm of SKA will deliver a 168-fold leap on the best technology available today, to show us the universe as never before. It will tune into signals emitted just a million years after the Big Bang, when the universe was a sea of hydrogen gas, slowly percolating with the first galaxies. Their starlight illuminated the fledgling universe in what is referred to as the “cosmic dawn”.

  16. The Big Optical Array

    International Nuclear Information System (INIS)

    Mozurkewich, D.; Johnston, K.J.; Simon, R.S.

    1990-01-01

    This paper describes the design and the capabilities of the Naval Research Laboratory Big Optical Array (BOA), an interferometric optical array for high-resolution imaging of stars, stellar systems, and other celestial objects. There are four important differences between the BOA design and the design of Mark III Optical Interferometer on Mount Wilson (California). These include a long passive delay line which will be used in BOA to do most of the delay compensation, so that the fast delay line will have a very short travel; the beam combination in BOA will be done in triplets, to allow measurement of closure phase; the same light will be used for both star and fringe tracking; and the fringe tracker will use several wavelength channels

  17. Big nuclear accidents

    International Nuclear Information System (INIS)

    Marshall, W.

    1983-01-01

    Much of the debate on the safety of nuclear power focuses on the large number of fatalities that could, in theory, be caused by extremely unlikely but imaginable reactor accidents. This, along with the nuclear industry's inappropriate use of vocabulary during public debate, has given the general public a distorted impression of the safety of nuclear power. The way in which the probability and consequences of big nuclear accidents have been presented in the past is reviewed and recommendations for the future are made including the presentation of the long-term consequences of such accidents in terms of 'reduction in life expectancy', 'increased chance of fatal cancer' and the equivalent pattern of compulsory cigarette smoking. (author)

  18. Nonstandard big bang models

    International Nuclear Information System (INIS)

    Calvao, M.O.; Lima, J.A.S.

    1989-01-01

    The usual FRW hot big-bang cosmologies have been generalized by considering the equation of state ρ = Anm + (γ-1)⁻¹p, where m is the rest mass of the fluid particles and A is a dimensionless constant. Explicit analytic solutions are given for the flat case (ε=0). For large cosmological times these extended models behave as the standard Einstein-de Sitter universes regardless of the values of A and γ. Unlike the usual FRW flat case, the deceleration parameter q is a time-dependent function, and its present value, q ≅ 1, obtained from the luminosity distance versus redshift relation, may be fitted by taking, for instance, A=1 and γ = 5/3 (a monatomic relativistic gas with m >> k_BT). In all cases the universe cools obeying the same temperature law as the FRW models, and it is shown that the age of the universe is only slightly modified. (author) [pt

  19. The Last Big Bang

    Energy Technology Data Exchange (ETDEWEB)

    McGuire, Austin D. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Meade, Roger Allen [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-09-13

    As one of the very few people in the world to give the “go/no go” decision to detonate a nuclear device, Austin “Mac” McGuire holds a very special place in the history of both the Los Alamos National Laboratory and the world. As Commander of Joint Task Force Unit 8.1.1, on Christmas Island in the spring and summer of 1962, Mac directed the Los Alamos data collection efforts for twelve of the last atmospheric nuclear detonations conducted by the United States. Since data collection was at the heart of nuclear weapon testing, it fell to Mac to make the ultimate decision to detonate each test device. He calls his experience THE LAST BIG BANG, since these tests, part of Operation Dominic, were characterized by the dramatic displays of the heat, light, and sounds unique to atmospheric nuclear detonations – never, perhaps, to be witnessed again.

  20. A matrix big bang

    International Nuclear Information System (INIS)

    Craps, Ben; Sethi, Savdeep; Verlinde, Erik

    2005-01-01

    The light-like linear dilaton background represents a particularly simple time-dependent 1/2 BPS solution of critical type-IIA superstring theory in ten dimensions. Its lift to M-theory, as well as its Einstein frame metric, are singular in the sense that the geometry is geodesically incomplete and the Riemann tensor diverges along a light-like subspace of codimension one. We study this background as a model for a big bang type singularity in string theory/M-theory. We construct the dual Matrix theory description in terms of a (1+1)-d supersymmetric Yang-Mills theory on a time-dependent world-sheet given by the Milne orbifold of (1+1)-d Minkowski space. Our model provides a framework in which the physics of the singularity appears to be under control

  1. A matrix big bang

    Energy Technology Data Exchange (ETDEWEB)

    Craps, Ben [Instituut voor Theoretische Fysica, Universiteit van Amsterdam, Valckenierstraat 65, 1018 XE Amsterdam (Netherlands); Sethi, Savdeep [Enrico Fermi Institute, University of Chicago, Chicago, IL 60637 (United States); Verlinde, Erik [Instituut voor Theoretische Fysica, Universiteit van Amsterdam, Valckenierstraat 65, 1018 XE Amsterdam (Netherlands)

    2005-10-15

    The light-like linear dilaton background represents a particularly simple time-dependent 1/2 BPS solution of critical type-IIA superstring theory in ten dimensions. Its lift to M-theory, as well as its Einstein frame metric, are singular in the sense that the geometry is geodesically incomplete and the Riemann tensor diverges along a light-like subspace of codimension one. We study this background as a model for a big bang type singularity in string theory/M-theory. We construct the dual Matrix theory description in terms of a (1+1)-d supersymmetric Yang-Mills theory on a time-dependent world-sheet given by the Milne orbifold of (1+1)-d Minkowski space. Our model provides a framework in which the physics of the singularity appears to be under control.

  2. DPF Big One

    International Nuclear Information System (INIS)

    Anon.

    1993-01-01

    At its latest venue at Fermilab from 10-14 November, the American Physical Society's Division of Particles and Fields meeting entered a new dimension. These regular meetings, which allow younger researchers to communicate with their peers, have been gaining popularity over the years (this was the seventh in the series), but nobody had expected almost a thousand participants and nearly 500 requests to give talks. Thus Fermilab's 800-seat auditorium had to be supplemented with another room with a video hookup, while the parallel sessions were organized into nine bewildering streams covering fourteen major physics topics. With the conventionality of the Standard Model virtually unchallenged, physics does not move fast these days. While most of the physics results had already been covered in principle at the International Conference on High Energy Physics held in Dallas in August (October, page 1), the Fermilab DPF meeting had a very different atmosphere. Major international meetings like Dallas attract big names from far and wide, and it is difficult in such an august atmosphere for young researchers to find a receptive audience. This was not the case at the DPF parallel sessions. The meeting also adopted a novel approach, with the parallels sandwiched between an initial day of plenaries to set the scene, and a final day of summaries. With the whole world waiting for the sixth ('top') quark to be discovered at Fermilab's Tevatron proton-antiproton collider, the meeting began with updates from Avi Yagil and Ronald Madaras from the big detectors, CDF and D0 respectively. Although rumours flew thick and fast, the Tevatron has not yet reached the top, although Yagil could show one intriguing event of a type expected from the heaviest quark

  3. DPF Big One

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    1993-01-15

    At its latest venue at Fermilab from 10-14 November, the American Physical Society's Division of Particles and Fields meeting entered a new dimension. These regular meetings, which allow younger researchers to communicate with their peers, have been gaining popularity over the years (this was the seventh in the series), but nobody had expected almost a thousand participants and nearly 500 requests to give talks. Thus Fermilab's 800-seat auditorium had to be supplemented with another room with a video hookup, while the parallel sessions were organized into nine bewildering streams covering fourteen major physics topics. With the conventionality of the Standard Model virtually unchallenged, physics does not move fast these days. While most of the physics results had already been covered in principle at the International Conference on High Energy Physics held in Dallas in August (October, page 1), the Fermilab DPF meeting had a very different atmosphere. Major international meetings like Dallas attract big names from far and wide, and it is difficult in such an august atmosphere for young researchers to find a receptive audience. This was not the case at the DPF parallel sessions. The meeting also adopted a novel approach, with the parallels sandwiched between an initial day of plenaries to set the scene, and a final day of summaries. With the whole world waiting for the sixth ('top') quark to be discovered at Fermilab's Tevatron proton-antiproton collider, the meeting began with updates from Avi Yagil and Ronald Madaras from the big detectors, CDF and D0 respectively. Although rumours flew thick and fast, the Tevatron has not yet reached the top, although Yagil could show one intriguing event of a type expected from the heaviest quark.

  4. Big bang and big crunch in matrix string theory

    International Nuclear Information System (INIS)

    Bedford, J.; Ward, J.; Papageorgakis, C.; Rodriguez-Gomez, D.

    2007-01-01

    Following the holographic description of linear dilaton null cosmologies with a big bang in terms of matrix string theory put forward by Craps, Sethi, and Verlinde, we propose an extended background describing a universe including both big bang and big crunch singularities. This belongs to a class of exact string backgrounds and is perturbative in the string coupling far away from the singularities, both of which can be resolved using matrix string theory. We provide a simple theory capable of describing the complete evolution of this closed universe

  5. BigOP: Generating Comprehensive Big Data Workloads as a Benchmarking Framework

    OpenAIRE

    Zhu, Yuqing; Zhan, Jianfeng; Weng, Chuliang; Nambiar, Raghunath; Zhang, Jinchao; Chen, Xingzhen; Wang, Lei

    2014-01-01

    Big Data is considered proprietary asset of companies, organizations, and even nations. Turning big data into real treasure requires the support of big data systems. A variety of commercial and open source products have been unleashed for big data storage and processing. While big data users are facing the choice of which system best suits their needs, big data system developers are facing the question of how to evaluate their systems with regard to general big data processing needs. System b...

  6. From big bang to big crunch and beyond

    International Nuclear Information System (INIS)

    Elitzur, Shmuel; Rabinovici, Eliezer; Giveon, Amit; Kutasov, David

    2002-01-01

    We study a quotient Conformal Field Theory, which describes a 3+1 dimensional cosmological spacetime. Part of this spacetime is the Nappi-Witten (NW) universe, which starts at a 'big bang' singularity, expands and then contracts to a 'big crunch' singularity at a finite time. The gauged WZW model contains a number of copies of the NW spacetime, with each copy connected to the preceding one and to the next one at the respective big bang/big crunch singularities. The sequence of NW spacetimes is further connected at the singularities to a series of non-compact static regions with closed timelike curves. These regions contain boundaries, on which the observables of the theory live. This suggests a holographic interpretation of the physics. (author)

  7. Diagnosis of the Ghiss Nekor aquifer in order to elaborate the aquifer contract

    Science.gov (United States)

    Baite, Wissal; Boukdir, A.; Zitouni, A.; Dahbi, S. D.; Mesmoudi, H.; Elissami, A.; Sabri, E.; Ikhmerdi, H.

    2018-05-01

    The Ghiss-Nekor aquifer, located in the north-east of the action area of the ABHL, plays a strategic role in the drinking water supply of the city of Al Hoceima and of the neighbouring urban areas. It also contributes to the irrigation of the PMH. However, this aquifer faces problems such as over-exploitation and pollution. In the face of these problems, the only solution is the establishment of a new mode of governance, one which favours the participation, involvement, and responsibility of the actors concerned within a negotiated contractual framework, namely the aquifer contract. The purpose of this study is to diagnose the current state of the Ghiss-Nekor aquifer, characterize the aquifer hydrogeologically, describe the use of the aquifer's waters, identify the problems, and introduce the aquifer contract, which aims at the participatory and sustainable management of the underground water resources of the Ghiss-Nekor plain, to ensure sustainable development.

  8. Google BigQuery analytics

    CERN Document Server

    Tigani, Jordan

    2014-01-01

    How to effectively use BigQuery, avoid common mistakes, and execute sophisticated queries against large datasets Google BigQuery Analytics is the perfect guide for business and data analysts who want the latest tips on running complex queries and writing code to communicate with the BigQuery API. The book uses real-world examples to demonstrate current best practices and techniques, and also explains and demonstrates streaming ingestion, transformation via Hadoop in Google Compute engine, AppEngine datastore integration, and using GViz with Tableau to generate charts of query results. In addit

  9. Empathy and the Big Five

    OpenAIRE

    Paulus, Christoph

    2016-01-01

    More than ten years ago, Del Barrio et al. (2004) attempted to establish a direct relationship between empathy and the Big Five. On average, the women in their sample had higher scores in empathy and on the Big Five factors, with the exception of neuroticism. They found associations with empathy in the areas of openness, agreeableness, conscientiousness, and extraversion. In our data, women likewise show significantly higher scores both in empathy and on the Big Five...

  10. "Big data" in economic history.

    Science.gov (United States)

    Gutmann, Myron P; Merchant, Emily Klancher; Roberts, Evan

    2018-03-01

    Big data is an exciting prospect for the field of economic history, which has long depended on the acquisition, keying, and cleaning of scarce numerical information about the past. This article examines two areas in which economic historians are already using big data - population and environment - discussing ways in which increased frequency of observation, denser samples, and smaller geographic units allow us to analyze the past with greater precision and often to track individuals, places, and phenomena across time. We also explore promising new sources of big data: organically created economic data, high resolution images, and textual corpora.

  11. Big Data as Information Barrier

    Directory of Open Access Journals (Sweden)

    Victor Ya. Tsvetkov

    2014-07-01

    Full Text Available The article analyses 'Big Data', an issue that has been discussed over the last 10 years, and reveals the reasons and factors behind it. It is shown that the factors creating the 'Big Data' issue have existed for quite a long time and, from time to time, have caused informational barriers; such barriers were successfully overcome through science and technology. The analysis presented here treats the 'Big Data' issue as a form of information barrier. This issue may be solved correctly, and it encourages the development of scientific and computational methods.

  12. Homogeneous and isotropic big rips?

    CERN Document Server

    Giovannini, Massimo

    2005-01-01

    We investigate the way big rips are approached in a fully inhomogeneous description of the space-time geometry. If the pressure and energy densities are connected by a (supernegative) barotropic index, the spatial gradients and the anisotropic expansion decay as the big rip is approached. This behaviour is contrasted with the usual big-bang singularities. A similar analysis is performed in the case of sudden (quiescent) singularities and it is argued that the spatial gradients may well be non-negligible in the vicinity of pressure singularities.

  13. Big Data in Space Science

    OpenAIRE

    Barmby, Pauline

    2018-01-01

    It seems like “big data” is everywhere these days. In planetary science and astronomy, we’ve been dealing with large datasets for a long time. So how “big” is our data? How does it compare to the big data that a bank or an airline might have? What new tools do we need to analyze big datasets, and how can we make better use of existing tools? What kinds of science problems can we address with these? I’ll address these questions with examples including ESA’s Gaia mission, ...

  14. Rate Change Big Bang Theory

    Science.gov (United States)

    Strickland, Ken

    2013-04-01

    The Rate Change Big Bang Theory redefines the birth of the universe with a dramatic shift in energy direction and a new vision of the first moments. With rate change graph technology (RCGT) we can look back 13.7 billion years and experience every step of the big bang through geometrical intersection technology. The analysis of the Big Bang includes a visualization of the first objects, their properties, the astounding event that created space and time as well as a solution to the mystery of anti-matter.

  15. Geohydrology of the Cerro Prieto geothermal aquifer

    Energy Technology Data Exchange (ETDEWEB)

    Sanchez R, J.; de la Pena L, A.

    1981-01-01

    The most recent information on the Cerro Prieto geothermal aquifer is summarized, with special emphasis on the initial production zone where the wells completed in the Alpha aquifer are located. These wells produce steam for power plant units 1 and 2. Brief comments also are made on the Beta aquifer, which underlies the Alpha aquifer in the Cerro Prieto I area and which extends to the east to what is known as the Cerro Prieto II and Cerro Prieto III areas. The location of the area studied is shown. The Alpha and Beta aquifers differ in their mineralogy and cementing mineral composition, temperatures, and piezometric levels. The difference in piezometric levels indicates that there is no local communication between the two aquifers. This situation has been verified by a well interference test, using well E-1 as a producer in the Beta aquifer and well M-46 as the observation well in the Alpha aquifer. No interference between them was observed. Information on the geology, geohydrology, and geochemistry of Cerro Prieto is presented.

  16. Estimating Aquifer Properties Using Sinusoidal Pumping Tests

    Science.gov (United States)

    Rasmussen, T. C.; Haborak, K. G.; Young, M. H.

    2001-12-01

    We develop the theoretical and applied framework for using sinusoidal pumping tests to estimate aquifer properties for confined, leaky, and partially penetrating conditions. The framework 1) derives analytical solutions for three boundary conditions suitable for many practical applications, 2) validates the analytical solutions against a finite element model, 3) establishes a protocol for conducting sinusoidal pumping tests, and 4) estimates aquifer hydraulic parameters based on the analytical solutions. The analytical solutions to sinusoidal stimuli in radial coordinates are derived for boundary value problems that are analogous to the Theis (1935) confined aquifer solution, the Hantush and Jacob (1955) leaky aquifer solution, and the Hantush (1964) partially penetrated confined aquifer solution. The analytical solutions compare favorably to a finite-element solution of a simulated flow domain, except in the region immediately adjacent to the pumping well where the implicit assumption of zero borehole radius is violated. The procedure is demonstrated in one unconfined and two confined aquifer units near the General Separations Area at the Savannah River Site, a federal nuclear facility located in South Carolina. Aquifer hydraulic parameters estimated using this framework provide independent confirmation of parameters obtained from conventional aquifer tests. The sinusoidal approach also resulted in the elimination of investigation-derived wastes.
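
    The paper derives its own Theis- and Hantush-type solutions for sinusoidal stimuli; as a simpler illustration of how a periodic signal constrains aquifer properties, the classical Ferris-type solution for a sinusoidal head fluctuation, A(x)/A(0) = exp(-x·sqrt(πS/(τT))), lets the hydraulic diffusivity D = T/S be backed out from the observed amplitude attenuation. A sketch under that assumption - not the authors' method - with invented numbers:

        import math

        def diffusivity_from_amplitude(x_m: float, period_s: float,
                                       amplitude_ratio: float) -> float:
            """Hydraulic diffusivity D = T/S from sinusoidal amplitude decay
            A(x)/A(0) = exp(-x*sqrt(pi*S/(tau*T))); solving for T/S gives
            D = pi*x**2 / (tau * ln(amplitude_ratio)**2)."""
            return math.pi * x_m**2 / (period_s * math.log(amplitude_ratio)**2)

        # A 6-hour pumping cycle damped to 30% of its source amplitude at an
        # observation well 50 m from the pumped well.
        print(diffusivity_from_amplitude(50.0, 6 * 3600, 0.30))  # ~0.25 m^2/s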

  17. Big Data is invading big places as CERN

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    Big Data technologies are becoming more popular with the constant growth of data generation in different fields such as social networks, the internet of things, and laboratories like CERN. How is CERN making use of such technologies? How is machine learning applied at CERN with Big Data technologies? How much data do we move, and how is it analyzed? All these questions will be answered during the talk.

  18. The ethics of big data in big agriculture

    OpenAIRE

    Carbonell (Isabelle M.)

    2016-01-01

    This paper examines the ethics of big data in agriculture, focusing on the power asymmetry between farmers and large agribusinesses like Monsanto. Following the recent purchase of Climate Corp., Monsanto is currently the most prominent biotech agribusiness to buy into big data. With wireless sensors on tractors monitoring or dictating every decision a farmer makes, Monsanto can now aggregate large quantities of previously proprietary farming data, enabling a privileged position with unique in...

  19. Geochemistry of the Arbuckle-Simpson Aquifer

    Science.gov (United States)

    Christenson, Scott; Hunt, Andrew G.; Parkhurst, David L.; Osborn, Noel I.

    2009-01-01

    The Arbuckle-Simpson aquifer in south-central Oklahoma provides water for public supply, farms, mining, wildlife conservation, recreation, and the scenic beauty of springs, streams, and waterfalls. A new understanding of the aquifer flow system was developed as part of the Arbuckle-Simpson Hydrology Study, done in 2003 through 2008 as a collaborative research project between the State of Oklahoma and the Federal government. The U.S. Geological Survey collected 36 water samples from 32 wells and springs in the Arbuckle-Simpson aquifer in 2004 through 2006 for geochemical analyses of major ions, trace elements, isotopes of oxygen and hydrogen, dissolved gases, and dating tracers. The geochemical analyses were used to characterize the water quality in the aquifer, to describe the origin and movement of ground water from recharge areas to discharge at wells and springs, and to determine the age of water in the aquifer.

  20. Big climate data analysis

    Science.gov (United States)

    Mudelsee, Manfred

    2015-04-01

    The Big Data era has begun also in the climate sciences, not only in economics or molecular biology. We measure climate at increasing spatial resolution by means of satellites and look farther back in time at increasing temporal resolution by means of natural archives and proxy data. We use powerful supercomputers to run climate models. The model output of the calculations made for the IPCC's Fifth Assessment Report amounts to ~650 TB. The 'scientific evolution' of grid computing has started, and the 'scientific revolution' of quantum computing is being prepared. This will increase computing power, and data volume, by several orders of magnitude in the future. However, more data does not automatically mean more knowledge. We need statisticians, who are at the core of transforming data into knowledge. Statisticians notably also explore the limits of our knowledge (uncertainties, that is, confidence intervals and P-values). Mudelsee (2014 Climate Time Series Analysis: Classical Statistical and Bootstrap Methods. Second edition. Springer, Cham, xxxii + 454 pp.) coined the term 'optimal estimation'. Consider the hyperspace of climate estimation. It has many, but not infinite, dimensions. It consists of three subspaces: Monte Carlo design, method, and measure. The Monte Carlo design describes the data generating process. The method subspace describes the estimation and confidence interval construction. The measure subspace describes how to detect the optimal estimation method for the Monte Carlo experiment. The envisaged large increase in computing power may bring the following idea of optimal climate estimation into existence. Given a data sample, some prior information (e.g. measurement standard errors) and a set of questions (parameters to be estimated), the first task is simple: perform an initial estimation on the basis of existing knowledge and experience with such types of estimation problems. The second task requires the computing power: explore the hyperspace to
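
    The bootstrap methods at the heart of Mudelsee's book can be sketched in a few lines: resample the data with replacement many times and read a confidence interval off the percentiles of the resampled estimates. A minimal illustration with synthetic data; all values are hypothetical:

        import numpy as np

        rng = np.random.default_rng(42)
        x = rng.normal(loc=0.3, scale=1.0, size=50)      # synthetic anomaly sample
        boot_means = np.array([rng.choice(x, size=x.size, replace=True).mean()
                               for _ in range(2000)])    # 2000 bootstrap replicates
        lo, hi = np.percentile(boot_means, [2.5, 97.5])  # 95% percentile CI for the mean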

  1. Hey, big spender

    Energy Technology Data Exchange (ETDEWEB)

    Cope, G.

    2000-04-01

    Business-to-business electronic commerce is looming large in the future of the oil industry. It is estimated that by adopting e-commerce the industry could achieve bottom-line savings of between $1.8 and $3.4 billion a year on annual gross revenues in excess of $30 billion. At present there are several teething problems to overcome, such as inter-operability standards, which are at least two or three years away. Tying in electronically with specific suppliers is also an expensive proposition, although the big benefits are in fact in doing business with the same suppliers on a continuing basis. Despite these problems, 14 of the world's largest energy and petrochemical companies joined forces in mid-April to create a single Internet procurement marketplace for the industry's complex supply chain. The exchange was designed by B2B (business-to-business) software provider Commerce One Inc.; it will leverage the buying clout of these industry giants (BP Amoco, Royal Dutch Shell Group, Conoco, Occidental Petroleum, Phillips Petroleum, Unocal Corporation and Statoil among them), currently about $125 billion on procurement per year; they hope to save between 5 and 30 per cent depending on the product and the region involved. Other similar schemes, such as Chevron and partners' Petrocosm Marketplace and Network Oil, a Houston-based Internet portal aimed at smaller petroleum companies, are also doing business in the $10 billion per annum range. e-Energy, a cooperative project between IBM, Ericsson and Telus Advertising, is another neutral, virtual marketplace targeted at the oil and gas sector. PetroTRAX, a Calgary-based website, plans to take online procurement and auction sales a big step forward by establishing a portal to handle any oil company's asset management needs. There are also a number of websites targeting specific needs: IndigoPool.com (acquisitions and divestitures) and WellBid.com (products related to upstream oil and gas operators) are just

  2. Hey, big spender

    International Nuclear Information System (INIS)

    Cope, G.

    2000-01-01

    Business-to-business electronic commerce is looming large in the future of the oil industry. It is estimated that by adopting e-commerce the industry could achieve bottom-line savings of between $1.8 and $3.4 billion a year on annual gross revenues in excess of $30 billion. At present there are several teething problems to overcome, such as inter-operability standards, which are at least two or three years away. Tying in electronically with specific suppliers is also an expensive proposition, although the big benefits are in fact in doing business with the same suppliers on a continuing basis. Despite these problems, 14 of the world's largest energy and petrochemical companies joined forces in mid-April to create a single Internet procurement marketplace for the industry's complex supply chain. The exchange was designed by B2B (business-to-business) software provider Commerce One Inc.; it will leverage the buying clout of these industry giants (BP Amoco, Royal Dutch Shell Group, Conoco, Occidental Petroleum, Phillips Petroleum, Unocal Corporation and Statoil among them), currently about $125 billion on procurement per year; they hope to save between 5 and 30 per cent depending on the product and the region involved. Other similar schemes, such as Chevron and partners' Petrocosm Marketplace and Network Oil, a Houston-based Internet portal aimed at smaller petroleum companies, are also doing business in the $10 billion per annum range. e-Energy, a cooperative project between IBM, Ericsson and Telus Advertising, is another neutral, virtual marketplace targeted at the oil and gas sector. PetroTRAX, a Calgary-based website, plans to take online procurement and auction sales a big step forward by establishing a portal to handle any oil company's asset management needs. There are also a number of websites targeting specific needs: IndigoPool.com (acquisitions and divestitures) and WellBid.com (products related to upstream oil and gas operators) are just two examples. All in

  3. Methods and tools for big data visualization

    OpenAIRE

    Zubova, Jelena; Kurasova, Olga

    2015-01-01

    In this paper, methods and tools for big data visualization are investigated. Challenges faced by big data analysis and visualization are identified, technologies for big data analysis are discussed, and a review of methods and tools for big data visualization is given. The functionalities of the tools are demonstrated by examples in order to highlight their advantages and disadvantages.

  4. Measuring the Promise of Big Data Syllabi

    Science.gov (United States)

    Friedman, Alon

    2018-01-01

    Growing interest in Big Data is leading industries, academics and governments to accelerate Big Data research. However, how teachers should teach Big Data has not been fully examined. This article suggests criteria for redesigning Big Data syllabi in public and private degree-awarding higher education establishments. The author conducted a survey…

  5. Economics of Managed Aquifer Recharge

    Directory of Open Access Journals (Sweden)

    Robert G. Maliva

    2014-05-01

    Full Text Available Managed aquifer recharge (MAR) technologies can provide a variety of water resources management benefits by increasing the volume of stored water and improving water quality through natural aquifer treatment processes. Implementation of MAR is often hampered by the absence of a clear economic case for the investment to construct and operate the systems. Economic feasibility can be evaluated using cost-benefit analysis (CBA), with the challenge of monetizing benefits. The value of water stored or treated by MAR systems can be evaluated by direct and indirect measures of willingness to pay, including market price, alternative cost, value marginal product, damage cost avoided, and contingent value methods. CBAs need to incorporate potential risks and uncertainties, such as failure to meet performance objectives. MAR projects involving high-value uses, such as potable supply, tend to be economically feasible provided that local hydrogeologic conditions are favorable. They need to have low construction and operational costs for lesser-value uses, such as some irrigation. Such systems should therefore be financed by project beneficiaries, but dichotomies may exist between beneficiaries and payers. Hence, MAR projects in developing countries may be economically viable, but external support is often required because of limited local financial resources.
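
    The cost-benefit logic described here reduces to discounting monetized benefits against construction and operating costs. A minimal sketch; every number below is hypothetical, chosen only to show the arithmetic:

        # Present value of net annual benefits over the project life,
        # compared against the up-front construction cost.
        rate, years = 0.05, 30                 # discount rate, project life
        capex, opex = 5.0e6, 2.0e5             # construction cost; annual O&M ($)
        annual_benefit = 6.0e5                 # monetized value of stored water ($/yr)
        pv = sum((annual_benefit - opex) / (1 + rate)**t for t in range(1, years + 1))
        bcr = pv / capex                       # benefit-cost ratio; > 1 favors the project

    A ratio above 1 corresponds to the paper's economically feasible case; lesser-value uses such as irrigation shrink the annual benefit and push the ratio below 1 unless costs are low.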

  6. The BigBOSS Experiment

    Energy Technology Data Exchange (ETDEWEB)

    Schelgel, D.; Abdalla, F.; Abraham, T.; Ahn, C.; Allende Prieto, C.; Annis, J.; Aubourg, E.; Azzaro, M.; Bailey, S.; Baltay, C.; Baugh, C.; /APC, Paris /Brookhaven /IRFU, Saclay /Marseille, CPPM /Marseille, CPT /Durham U. / /IEU, Seoul /Fermilab /IAA, Granada /IAC, La Laguna

    2011-01-01

    BigBOSS will obtain observational constraints that will bear on three of the four 'science frontier' questions identified by the Astro2010 Cosmology and Fundamental Physics Panel of the Decadal Survey: Why is the universe accelerating? What is dark matter, and what are the properties of neutrinos? Indeed, the BigBOSS project was recommended for substantial immediate R and D support in the PASAG report. The second-highest ground-based priority from the Astro2010 Decadal Survey was the creation of a funding line within the NSF to support a 'Mid-Scale Innovations' program, and it used BigBOSS as a 'compelling' example for support. This choice was the result of the Decadal Survey's Program Prioritization panels reviewing 29 mid-scale projects and recommending BigBOSS 'very highly'.

  7. Biophotonics: the big picture

    Science.gov (United States)

    Marcu, Laura; Boppart, Stephen A.; Hutchinson, Mark R.; Popp, Jürgen; Wilson, Brian C.

    2018-02-01

    The 5th International Conference on Biophotonics (ICOB) held April 30 to May 1, 2017, in Fremantle, Western Australia, brought together opinion leaders to discuss future directions for the field and opportunities to consider. The first session of the conference, "How to Set a Big Picture Biophotonics Agenda," was focused on setting the stage for developing a vision and strategies for translation and impact on society of biophotonic technologies. The invited speakers, panelists, and attendees engaged in discussions that focused on opportunities and promising applications for biophotonic techniques, challenges when working at the confluence of the physical and biological sciences, driving factors for advances of biophotonic technologies, and educational opportunities. We share a summary of the presentations and discussions. Three main themes from the conference are presented in this position paper that capture the current status, opportunities, challenges, and future directions of biophotonics research and key areas of applications: (1) biophotonics at the nano- to microscale level; (2) biophotonics at meso- to macroscale level; and (3) biophotonics and the clinical translation conundrum.

  8. Big Data, Small Sample.

    Science.gov (United States)

    Gerlovina, Inna; van der Laan, Mark J; Hubbard, Alan

    2017-05-20

    Multiple comparisons and small sample size, common characteristics of many types of "Big Data" including those that are produced by genomic studies, present specific challenges that affect reliability of inference. Use of multiple testing procedures necessitates calculation of very small tail probabilities of a test statistic distribution. Results based on large deviation theory provide a formal condition that is necessary to guarantee error rate control given practical sample sizes, linking the number of tests and the sample size; this condition, however, is rarely satisfied. Using methods that are based on Edgeworth expansions (relying especially on the work of Peter Hall), we explore the impact of departures of sampling distributions from typical assumptions on actual error rates. Our investigation illustrates how far the actual error rates can be from the declared nominal levels, suggesting potentially widespread problems with error rate control, specifically excessive false positives. This is an important factor that contributes to the "reproducibility crisis". We also review some other commonly used methods (such as permutation and methods based on finite sampling inequalities) in their application to multiple testing/small sample data. We point out that Edgeworth expansions, providing higher order approximations to the sampling distribution, offer a promising direction for data analysis that could improve reliability of studies relying on large numbers of comparisons with modest sample sizes.
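
    The core warning, that actual error rates can drift far from nominal levels when tail probabilities are extreme and samples are small, is easy to reproduce by simulation. A hedged sketch; the distribution and sizes are arbitrary choices, not from the paper:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        n, m, alpha = 10, 100_000, 1e-4             # small samples, many tests, tiny level
        data = rng.exponential(size=(m, n)) - 1.0   # skewed null data with true mean 0
        tstat, pval = stats.ttest_1samp(data, 0.0, axis=1)
        print(np.mean(pval < alpha))                # typically well above the nominal 1e-4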

  9. Big bang darkleosynthesis

    Directory of Open Access Journals (Sweden)

    Gordan Krnjaic

    2015-12-01

    Full Text Available In a popular class of models, dark matter comprises an asymmetric population of composite particles with short range interactions arising from a confined nonabelian gauge group. We show that coupling this sector to a well-motivated light mediator particle yields efficient darkleosynthesis, a dark-sector version of big-bang nucleosynthesis (BBN), in generic regions of parameter space. Dark matter self-interaction bounds typically require the confinement scale to be above ΛQCD, which generically yields large (≫MeV/dark-nucleon) binding energies. These bounds further suggest the mediator is relatively weakly coupled, so repulsive forces between dark-sector nuclei are much weaker than Coulomb repulsion between standard-model nuclei, which results in an exponential barrier-tunneling enhancement over standard BBN. Thus, darklei are easier to make and harder to break than visible species with comparable mass numbers. This process can efficiently yield a dominant population of states with masses significantly greater than the confinement scale and, in contrast to dark matter that is a fundamental particle, may allow the dominant form of dark matter to have high spin (S≫3/2), whose discovery would be smoking gun evidence for dark nuclei.

  10. Predicting big bang deuterium

    Energy Technology Data Exchange (ETDEWEB)

    Hata, N.; Scherrer, R.J.; Steigman, G.; Thomas, D.; Walker, T.P. [Department of Physics, Ohio State University, Columbus, Ohio 43210 (United States)

    1996-02-01

    We present new upper and lower bounds to the primordial abundances of deuterium and ³He based on observational data from the solar system and the interstellar medium. Independent of any model for the primordial production of the elements we find (at the 95% C.L.): 1.5×10⁻⁵ ≤ (D/H)_P ≤ 10.0×10⁻⁵ and (³He/H)_P ≤ 2.6×10⁻⁵. When combined with the predictions of standard big bang nucleosynthesis, these constraints lead to a 95% C.L. bound on the primordial abundance of deuterium: (D/H)_best = (3.5 +2.7/−1.8) × 10⁻⁵. Measurements of deuterium absorption in the spectra of high-redshift QSOs will directly test this prediction. The implications of this prediction for the primordial abundances of ⁴He and ⁷Li are discussed, as well as those for the universal density of baryons. © 1996 The American Astronomical Society.

  11. Big bang darkleosynthesis

    Science.gov (United States)

    Krnjaic, Gordan; Sigurdson, Kris

    2015-12-01

    In a popular class of models, dark matter comprises an asymmetric population of composite particles with short range interactions arising from a confined nonabelian gauge group. We show that coupling this sector to a well-motivated light mediator particle yields efficient darkleosynthesis, a dark-sector version of big-bang nucleosynthesis (BBN), in generic regions of parameter space. Dark matter self-interaction bounds typically require the confinement scale to be above ΛQCD, which generically yields large (≫MeV /dark-nucleon) binding energies. These bounds further suggest the mediator is relatively weakly coupled, so repulsive forces between dark-sector nuclei are much weaker than Coulomb repulsion between standard-model nuclei, which results in an exponential barrier-tunneling enhancement over standard BBN. Thus, darklei are easier to make and harder to break than visible species with comparable mass numbers. This process can efficiently yield a dominant population of states with masses significantly greater than the confinement scale and, in contrast to dark matter that is a fundamental particle, may allow the dominant form of dark matter to have high spin (S ≫ 3 / 2), whose discovery would be smoking gun evidence for dark nuclei.

  12. The role of big laboratories

    CERN Document Server

    Heuer, Rolf-Dieter

    2013-01-01

    This paper presents the role of big laboratories in their function as research infrastructures. Starting from the general definition and features of big laboratories, the paper goes on to present the key ingredients and issues, based on scientific excellence, for the successful realization of large-scale science projects at such facilities. The paper concludes by taking the example of scientific research in the field of particle physics and describing the structures and methods required to be implemented for the way forward.

  13. The role of big laboratories

    International Nuclear Information System (INIS)

    Heuer, R-D

    2013-01-01

    This paper presents the role of big laboratories in their function as research infrastructures. Starting from the general definition and features of big laboratories, the paper goes on to present the key ingredients and issues, based on scientific excellence, for the successful realization of large-scale science projects at such facilities. The paper concludes by taking the example of scientific research in the field of particle physics and describing the structures and methods required to be implemented for the way forward. (paper)

  14. Challenges of Big Data Analysis.

    Science.gov (United States)

    Fan, Jianqing; Han, Fang; Liu, Han

    2014-06-01

    Big Data bring new opportunities to modern society and challenges to data scientists. On the one hand, Big Data hold great promises for discovering subtle population patterns and heterogeneities that are not possible with small-scale data. On the other hand, the massive sample size and high dimensionality of Big Data introduce unique computational and statistical challenges, including scalability and storage bottlenecks, noise accumulation, spurious correlation, incidental endogeneity, and measurement errors. These challenges are distinctive and require new computational and statistical paradigms. This article gives an overview of the salient features of Big Data and how these features drive a paradigm change in statistical and computational methods as well as computing architectures. We also provide various new perspectives on Big Data analysis and computation. In particular, we emphasize the viability of the sparsest solution in high-confidence sets and point out that exogenous assumptions in most statistical methods for Big Data cannot be validated due to incidental endogeneity. They can lead to wrong statistical inferences and consequently wrong scientific conclusions.
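
    The spurious-correlation point is easy to demonstrate: with many more variables than observations, pure noise predictors can show sizeable sample correlation with an unrelated response. A small sketch; the dimensions are illustrative:

        import numpy as np

        rng = np.random.default_rng(0)
        n, p = 60, 6400
        X = rng.standard_normal((n, p))
        y = rng.standard_normal(n)          # independent of every column of X
        # Sample correlation of y with each column of X
        r = (X - X.mean(0)).T @ (y - y.mean()) / (n * X.std(0) * y.std())
        print(np.abs(r).max())              # roughly 0.5 here, from noise alone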

  15. Generalized formal model of Big Data

    OpenAIRE

    Shakhovska, N.; Veres, O.; Hirnyak, M.

    2016-01-01

    This article dwells on the basic characteristic features of Big Data technologies. It analyzes existing definitions of the term "big data" and proposes and describes the elements of a generalized formal model of big data. The peculiarities of applying the proposed model components are analyzed, and the fundamental differences between Big Data technology and business analytics are described. Big Data is supported by the distributed file system Google File System ...

  16. Bend-scale geomorphic classification and assessment of the Lower Missouri River from Sioux City, Iowa, to the Mississippi River for application to pallid sturgeon management

    Science.gov (United States)

    Jacobson, Robert B.; Colvin, Michael E.; Bulliner, Edward A.; Pickard, Darcy; Elliott, Caroline M.

    2018-06-07

    Management actions intended to increase growth and survival of pallid sturgeon (Scaphirhynchus albus) age-0 larvae on the Lower Missouri River require a comprehensive understanding of the geomorphic habitat template of the river. The study described here had two objectives relating to where channel-reconfiguration projects should be located to optimize effectiveness. The first objective was to develop a bend-scale (that is, at the scale of individual bends, defined as “cross-over to cross-over”) geomorphic classification of the Lower Missouri River to help in the design of monitoring and evaluation of such projects. The second objective was to explore whether geomorphic variables could provide insight into varying capacities of bends to intercept drifting larvae. The bend-scale classification was based on geomorphic and engineering variables for 257 bends from Sioux City, Iowa, to the confluence with the Mississippi River near St. Louis, Missouri. We used k-means clustering to identify groupings of bends that shared the same characteristics. Separate 3-, 4-, and 6-cluster classifications were developed and mapped. The three classifications are nested in a hierarchical structure. We also explored capacities of bends to intercept larvae through evaluation of linear models that predicted persistent sand area or catch per unit effort (CPUE) of age-0 sturgeon as a function of the same geomorphic variables used in the classification. All highly ranked models that predict persistent sand area contained mean channel width and standard deviation of channel width as significant variables. Some top-ranked models also included contributions of channel sinuosity and density of navigation structures. The sand-area prediction models have r-squared values of 0.648–0.674. In contrast, the highest-ranking CPUE models have r-squared values of 0.011–0.170, indicating much more uncertainty for the biological response variable. Whereas the persistent sand model documents that
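
    The k-means clustering step described here can be sketched as follows; the variable names and synthetic values are stand-ins for the report's bend-scale geomorphic variables, not its actual data:

        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(7)
        # One row per bend (257 in the report); columns mimic the kinds of
        # variables named in the abstract (all values synthetic).
        X = np.column_stack([
            rng.normal(500, 50, 257),    # mean channel width, m
            rng.normal(80, 20, 257),     # std. dev. of channel width, m
            rng.normal(1.2, 0.1, 257),   # channel sinuosity
            rng.normal(1.0, 0.3, 257),   # density of navigation structures
        ])
        labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(
            StandardScaler().fit_transform(X))   # 4-cluster classification

    Standardizing first matters because k-means is distance-based; rerunning with n_clusters of 3 and 6 mirrors the report's nested classifications.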

  17. Hydrology of the Claiborne aquifer and interconnection with the Upper Floridan aquifer in southwest Georgia

    Science.gov (United States)

    Gordon, Debbie W.; Gonthier, Gerard

    2017-04-24

    The U.S. Geological Survey conducted a study, in cooperation with the Georgia Environmental Protection Division, to define the hydrologic properties of the Claiborne aquifer and evaluate its connection with the Upper Floridan aquifer in southwest Georgia. The effort involved collecting and compiling hydrologic data from the aquifer in subarea 4 of southwestern Georgia. Data collected for this study include borehole geophysical logs in 7 wells, and two 72-hour aquifer tests to determine aquifer properties. The top of the Claiborne aquifer extends from an altitude of about 200 feet above the North American Vertical Datum of 1988 (NAVD 88) in Terrell County to 402 feet below NAVD 88 in Decatur County, Georgia. The base of the aquifer extends from an altitude of about 60 feet above NAVD 88 in eastern Sumter County to about 750 feet below NAVD 88 in Decatur County. Aquifer thickness ranges from about 70 feet in eastern Early County to 400 feet in Decatur County. The transmissivity of the Claiborne aquifer, determined from two 72-hour aquifer tests, was estimated to be 1,500 and 700 feet squared per day in Mitchell and Early Counties, respectively. The storage coefficient was estimated to be 0.0006 and 0.0004 for the same sites, respectively. Aquifer test data from Mitchell County indicate a small amount of leakage occurred during the test. Groundwater-flow models suggest that the source of the leakage was the underlying Clayton aquifer, which produced about 2.5 feet of drawdown in response to pumping in the Claiborne aquifer. The vertical hydraulic conductivity of the confining unit between the Claiborne and Clayton aquifers was simulated to be about 0.02 foot per day. Results from the 72-hour aquifer tests run for this study indicated no interconnection between the Claiborne and overlying Upper Floridan aquifers at the two test sites. Additional data are needed to monitor the effects that increased withdrawals from the Claiborne aquifer may have on future water resources.
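
    Transmissivity and storage estimates of the kind reported here are classically obtained with the Cooper-Jacob (1946) straight-line method, a late-time simplification of the Theis solution. A sketch with hypothetical inputs chosen only to land near the reported Mitchell County values (T ≈ 1,500 ft²/d, S ≈ 0.0006):

        import numpy as np

        Q = 19200.0          # pumping rate, ft^3/d (hypothetical)
        ds = 2.3             # drawdown change per log10 cycle of time, ft (hypothetical)
        T = 2.3 * Q / (4.0 * np.pi * ds)     # transmissivity, ~1,530 ft^2/d
        t0, r = 1.7e-3, 100.0                # zero-drawdown intercept (d), radius (ft)
        S = 2.25 * T * t0 / r**2             # storage coefficient, ~6e-4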

  18. Development and Modelling of a High-Resolution Aquifer Analog in the Guarani Aquifer (Brazil)

    OpenAIRE

    Höyng, Dominik

    2014-01-01

    A comprehensive and detailed knowledge of the spatial distribution of physical and chemical properties in heterogeneous porous aquifers plays a decisive role in the realistic representation of governing parameters in mathematical models. Models allow the simulation, prediction, and reproduction of subsurface flow and transport characteristics. This work explains the identification, characterization, and effects of small-scale aquifer heterogeneities in the Guarani Aquifer System (GAS) in S...

  19. Comparison of aquifer characteristics derived from local and regional aquifer tests.

    Science.gov (United States)

    Randolph, R.B.; Krause, R.E.; Maslia, M.L.

    1985-01-01

    A comparison of the aquifer parameter values obtained through the analysis of a local and a regional aquifer test involving the same area in southeast Georgia is made in order to evaluate the validity of extrapolating local aquifer-test results for use in large-scale flow simulations. Time-drawdown and time-recovery data were analyzed by using both graphical and least-squares fitting of the data to the Theis curve. Additionally, directional transmissivity, the transmissivity tensor, and the angle of anisotropy were computed for both tests. (from Authors)
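
    The least-squares fitting of time-drawdown data to the Theis curve mentioned here can be sketched with scipy; the pumping rate, radius, and "observed" data below are synthetic, not from the Georgia tests:

        import numpy as np
        from scipy.special import exp1
        from scipy.optimize import curve_fit

        def theis(t, T, S, Q=500.0, r=50.0):       # Q [m^3/d] and r [m] held fixed
            return Q / (4.0 * np.pi * T) * exp1(r**2 * S / (4.0 * T * t))

        t_obs = np.array([0.01, 0.05, 0.1, 0.5, 1.0, 5.0])   # days (synthetic)
        s_obs = theis(t_obs, 300.0, 2e-4) \
                + 0.01 * np.random.default_rng(3).standard_normal(t_obs.size)
        (T_fit, S_fit), _ = curve_fit(theis, t_obs, s_obs,
                                      p0=(100.0, 1e-3), bounds=(1e-8, np.inf))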

  20. [Big data in official statistics].

    Science.gov (United States)

    Zwick, Markus

    2015-08-01

    The concept of "big data" stands to change the face of official statistics over the coming years, having an impact on almost all aspects of data production. The tasks of future statisticians will not necessarily be to produce new data, but rather to identify and make use of existing data to adequately describe social and economic phenomena. Until big data can be used correctly in official statistics, a lot of questions need to be answered and problems solved: the quality of data, data protection, privacy, and the sustainable availability are some of the more pressing issues to be addressed. The essential skills of official statisticians will undoubtedly change, and this implies a number of challenges to be faced by statistical education systems, in universities, and inside the statistical offices. The national statistical offices of the European Union have concluded a concrete strategy for exploring the possibilities of big data for official statistics, by means of the Big Data Roadmap and Action Plan 1.0. This is an important first step and will have a significant influence on implementing the concept of big data inside the statistical offices of Germany.

  1. GEOSS: Addressing Big Data Challenges

    Science.gov (United States)

    Nativi, S.; Craglia, M.; Ochiai, O.

    2014-12-01

    In the sector of Earth Observation, the explosion of data is due to many factors including: new satellite constellations, the increased capabilities of sensor technologies, social media, crowdsourcing, and the need for multidisciplinary and collaborative research to face Global Changes. In this area, there are many expectations and concerns about Big Data. Vendors have attempted to use this term for their commercial purposes. It is necessary to understand whether Big Data is a radical shift or an incremental change for the existing digital infrastructures. This presentation tries to explore and discuss the impact of Big Data challenges and new capabilities on the Global Earth Observation System of Systems (GEOSS) and particularly on its common digital infrastructure called GCI. GEOSS is a global and flexible network of content providers allowing decision makers to access an extraordinary range of data and information at their desk. The impact of the Big Data dimensionalities (commonly known as 'V' axes: volume, variety, velocity, veracity, visualization) on GEOSS is discussed. The main solutions and experimentation developed by GEOSS along these axes are introduced and analyzed. GEOSS is a pioneering framework for global and multidisciplinary data sharing in the Earth Observation realm; its experience on Big Data is valuable for the many lessons learned.

  2. Official statistics and Big Data

    Directory of Open Access Journals (Sweden)

    Peter Struijs

    2014-07-01

    Full Text Available The rise of Big Data changes the context in which organisations producing official statistics operate. Big Data provides opportunities, but in order to make optimal use of Big Data, a number of challenges have to be addressed. This stimulates increased collaboration between National Statistical Institutes, Big Data holders, businesses and universities. In time, this may lead to a shift in the role of statistical institutes in the provision of high-quality and impartial statistical information to society. In this paper, the changes in context, the opportunities, the challenges and the way to collaborate are addressed. The collaboration between the various stakeholders will involve each partner building on and contributing different strengths. For national statistical offices, traditional strengths include, on the one hand, the ability to collect data and combine data sources with statistical products and, on the other hand, their focus on quality, transparency and sound methodology. In the Big Data era of competing and multiplying data sources, they continue to have a unique knowledge of official statistical production methods. And their impartiality and respect for privacy as enshrined in law uniquely position them as a trusted third party. Based on this, they may advise on the quality and validity of information of various sources. By thus positioning themselves, they will be able to play their role as key information providers in a changing society.

  3. Big data for bipolar disorder.

    Science.gov (United States)

    Monteith, Scott; Glenn, Tasha; Geddes, John; Whybrow, Peter C; Bauer, Michael

    2016-12-01

    The delivery of psychiatric care is changing with a new emphasis on integrated care, preventative measures, population health, and the biological basis of disease. Fundamental to this transformation are big data and advances in the ability to analyze these data. The impact of big data on the routine treatment of bipolar disorder today and in the near future is discussed, with examples that relate to health policy, the discovery of new associations, and the study of rare events. The primary sources of big data today are electronic medical records (EMR), claims, and registry data from providers and payers. In the near future, data created by patients from active monitoring, passive monitoring of Internet and smartphone activities, and from sensors may be integrated with the EMR. Diverse data sources from outside of medicine, such as government financial data, will be linked for research. Over the long term, genetic and imaging data will be integrated with the EMR, and there will be more emphasis on predictive models. Many technical challenges remain when analyzing big data that relates to size, heterogeneity, complexity, and unstructured text data in the EMR. Human judgement and subject matter expertise are critical parts of big data analysis, and the active participation of psychiatrists is needed throughout the analytical process.

  4. Quantum fields in a big-crunch-big-bang spacetime

    International Nuclear Information System (INIS)

    Tolley, Andrew J.; Turok, Neil

    2002-01-01

    We consider quantum field theory on a spacetime representing the big-crunch-big-bang transition postulated in ekpyrotic or cyclic cosmologies. We show via several independent methods that an essentially unique matching rule holds connecting the incoming state, in which a single extra dimension shrinks to zero, to the outgoing state in which it reexpands at the same rate. For free fields in our construction there is no particle production from the incoming adiabatic vacuum. When interactions are included the particle production for fixed external momentum is finite at the tree level. We discuss a formal correspondence between our construction and quantum field theory on de Sitter spacetime

  5. Poker Player Behavior After Big Wins and Big Losses

    OpenAIRE

    Gary Smith; Michael Levere; Robert Kurtzman

    2009-01-01

    We find that experienced poker players typically change their style of play after winning or losing a big pot--most notably, playing less cautiously after a big loss, evidently hoping for lucky cards that will erase their loss. This finding is consistent with Kahneman and Tversky's (Kahneman, D., A. Tversky. 1979. Prospect theory: An analysis of decision under risk. Econometrica 47(2) 263-292) break-even hypothesis and suggests that when investors incur a large loss, it might be time to take ...

  6. Turning big bang into big bounce: II. Quantum dynamics

    Energy Technology Data Exchange (ETDEWEB)

    Malkiewicz, Przemyslaw; Piechocki, Wlodzimierz, E-mail: pmalk@fuw.edu.p, E-mail: piech@fuw.edu.p [Theoretical Physics Department, Institute for Nuclear Studies, Hoza 69, 00-681 Warsaw (Poland)

    2010-11-21

    We analyze the big bounce transition of the quantum Friedmann-Robertson-Walker model in the setting of the nonstandard loop quantum cosmology (LQC). Elementary observables are used to quantize composite observables. The spectrum of the energy density operator is bounded and continuous. The spectrum of the volume operator is bounded from below and discrete, with equally spaced levels defining a quantum of the volume. The discreteness may imply a foamy structure of spacetime at the semiclassical level, which may be detected in astro-cosmological observations. The nonstandard LQC method has a free parameter that should be fixed in some way to specify the big bounce transition.

  7. Evaluating connection of aquifers to springs and streams, Great Basin National Park and vicinity, Nevada

    Science.gov (United States)

    Prudic, David E.; Sweetkind, Donald S.; Jackson, Tracie R.; Dotson, K. Elaine; Plume, Russell W.; Hatch, Christine E.; Halford, Keith J.

    2015-12-22

    Federal agencies that oversee land management for much of the Snake Range in eastern Nevada, including the management of Great Basin National Park by the National Park Service, need to understand the potential extent of adverse effects to federally managed lands from nearby groundwater development. As a result, this study was developed (1) to attain a better understanding of aquifers controlling groundwater flow on the eastern side of the southern part of the Snake Range and their connection with aquifers in the valleys, (2) to evaluate the relation between surface water and groundwater along the piedmont slopes, (3) to evaluate sources for Big Springs and Rowland Spring, and (4) to assess groundwater flow from southern Spring Valley into northern Hamlin Valley. The study focused on two areas—the first, a northern area along the east side of Great Basin National Park that included Baker, Lehman, and Snake Creeks, and a second southern area that is the potential source area for Big Springs. Data collected specifically for this study included the following: (1) geologic field mapping; (2) drilling, testing, and water quality sampling from 7 test wells; (3) measuring discharge and water chemistry of selected creeks and springs; (4) measuring streambed hydraulic gradients and seepage rates from 18 shallow piezometers installed into the creeks; and (5) monitoring stream temperature along selected reaches to identify places of groundwater inflow.

  8. Big Data Analytics in Medicine and Healthcare.

    Science.gov (United States)

    Ristevski, Blagoj; Chen, Ming

    2018-05-10

    This paper surveys big data, highlighting big data analytics in medicine and healthcare. The big data characteristics value, volume, velocity, variety, veracity, and variability are described. Big data analytics in medicine and healthcare covers the integration and analysis of large amounts of complex heterogeneous data such as various -omics data (genomics, epigenomics, transcriptomics, proteomics, metabolomics, interactomics, pharmacogenomics, diseasomics), biomedical data, and electronic health records data. We underline the challenging issues of big data privacy and security. Regarding the big data characteristics, some directions for using suitable and promising open-source distributed data-processing software platforms are given.

  9. Steam Injection For Soil And Aquifer Remediation

    Science.gov (United States)

    The purpose of this Issue Paper is to provide to those involved in assessing remediation technologies for specific sites basic technical information on the use of steam injection for the remediation of soils and aquifers that are contaminated by...

  10. Hydrogeologic characterization of devonian aquifers in Uruguay

    International Nuclear Information System (INIS)

    Massa, E.

    1988-01-01

    This article describes the implementation of a research assistance project on Devonian sedimentary units as potential aquifers and their best use for supplying schools and rural populations in the central area of Uruguay.

  11. Aquifer parameter identification and interpretation with different ...

    African Journals Online (AJOL)

    unfortunately, field data deviations from the model type curves are not considered in ... Such an extensive study can only be done when there is a set of aquifer test data with main and .... 1990; 1995) methods are employed for qualitative.

  12. Big Data Analytics in Healthcare.

    Science.gov (United States)

    Belle, Ashwin; Thiagarajan, Raghuram; Soroushmehr, S M Reza; Navidi, Fatemeh; Beard, Daniel A; Najarian, Kayvan

    2015-01-01

    The rapidly expanding field of big data analytics has started to play a pivotal role in the evolution of healthcare practices and research. It has provided tools to accumulate, manage, analyze, and assimilate large volumes of disparate, structured, and unstructured data produced by current healthcare systems. Big data analytics has been recently applied towards aiding the process of care delivery and disease exploration. However, the adoption rate and research development in this space is still hindered by some fundamental problems inherent within the big data paradigm. In this paper, we discuss some of these major challenges with a focus on three upcoming and promising areas of medical research: image, signal, and genomics based analytics. Recent research which targets utilization of large volumes of medical data while combining multimodal data from disparate sources is discussed. Potential areas of research within this field which have the ability to provide meaningful impact on healthcare delivery are also examined.

  13. The challenges of big data.

    Science.gov (United States)

    Mardis, Elaine R

    2016-05-01

    The largely untapped potential of big data analytics is a feeding frenzy that has been fueled by the production of many next-generation-sequencing-based data sets that are seeking to answer long-held questions about the biology of human diseases. Although these approaches are likely to be a powerful means of revealing new biological insights, there are a number of substantial challenges that currently hamper efforts to harness the power of big data. This Editorial outlines several such challenges as a means of illustrating that the path to big data revelations is paved with perils that the scientific community must overcome to pursue this important quest. © 2016. Published by The Company of Biologists Ltd.

  14. A General Solution for Groundwater Flow in Estuarine Leaky Aquifer System with Considering Aquifer Anisotropy

    Science.gov (United States)

    Chen, Po-Chia; Chuang, Mo-Hsiung; Tan, Yih-Chi

    2014-05-01

    In recent years, urban and industrial development near coastal areas has been rapid, and the associated population has grown dramatically. More and more of the water demand for human activities, agricultural irrigation, and aquaculture relies on heavy pumping in coastal areas. The decline of the groundwater table may result in problems of seawater intrusion and/or land subsidence. Since the 1950s, numerous studies have focused on the effect of tidal fluctuation on groundwater flow in coastal areas. Many studies concentrated on the development of one-dimensional (1D) and two-dimensional (2D) analytical solutions describing tide-induced head fluctuations. For example, Jacob (1950) derived an analytical solution for 1D groundwater flow in a confined aquifer with a boundary condition subject to sinusoidal oscillation. Jiao and Tang (1999) derived a 1D analytical solution for a leaky confined aquifer by considering a constant groundwater head in the overlying unconfined aquifer. Jeng et al. (2002) studied tidal propagation in a coupled unconfined and confined coastal aquifer system. Sun (1997) presented a 2D solution for groundwater response to tidal loading in an estuary. Tang and Jiao (2001) derived a 2D analytical solution for a leaky confined aquifer system near open tidal water. This study aims at developing a general analytical solution describing the head fluctuations in a 2D estuarine aquifer system consisting of an unconfined aquifer, a confined aquifer, and an aquitard between them. Both the confined and unconfined aquifers are considered to be anisotropic. The predicted head fluctuations from this solution will be compared with simulation results from the MODFLOW program. In addition, the solutions mentioned above will be shown to be special cases of the present solution. Some hypothetical cases regarding head fluctuation in coastal aquifers will be considered to investigate the dynamic effects of water table fluctuation, hydrogeological conditions, and
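
    The 1D building block behind these solutions (Jacob 1950; Ferris 1951) says a sinusoidal tide of period t0 decays and lags with distance x inland as exp(-a*x) and a*x, where a = sqrt(pi*S/(t0*T)). A quick numeric sketch with hypothetical parameters:

        import numpy as np

        S, T, t0 = 1e-4, 500.0, 0.5175     # storativity, T [m^2/d], M2 period [d]
        a = np.sqrt(np.pi * S / (t0 * T))  # damping/lag coefficient, 1/m
        x = 200.0                          # distance inland, m
        efficiency = np.exp(-a * x)        # amplitude ratio, ~0.80 here
        lag = a * x * t0 / (2.0 * np.pi)   # time lag of the peak, ~0.018 d (~26 min)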

  15. Big Book of Windows Hacks

    CERN Document Server

    Gralla, Preston

    2008-01-01

    Bigger, better, and broader in scope, the Big Book of Windows Hacks gives you everything you need to get the most out of your Windows Vista or XP system, including its related applications and the hardware it runs on or connects to. Whether you want to tweak Vista's Aero interface, build customized sidebar gadgets and run them from a USB key, or hack the "unhackable" screensavers, you'll find quick and ingenious ways to bend these recalcitrant operating systems to your will. The Big Book of Windows Hacks focuses on Vista, the new bad boy on Microsoft's block, with hacks and workarounds that

  16. Sosiaalinen asiakassuhdejohtaminen ja big data

    OpenAIRE

    Toivonen, Topi-Antti

    2015-01-01

    This thesis examines social customer relationship management and the benefits that big data can bring to it. Social customer relationship management is a new term, unfamiliar to many. The research is motivated by the limited amount of prior work on the topic, the complete absence of Finnish-language research, and the potentially essential role of social customer relationship management in companies' operations in the future. Studies on big data often concentrate on its technical side rather than on its applicati...

  17. Do big gods cause anything?

    DEFF Research Database (Denmark)

    Geertz, Armin W.

    2014-01-01

    This is a contribution to a review symposium on Ara Norenzayan's book Big Gods: How Religion Transformed Cooperation and Conflict (Princeton University Press, 2013). The book is exciting but problematic with respect to causality, atheism, and stereotypes about hunter-gatherers.

  18. Big Data and Social Media

    CERN Multimedia

    CERN. Geneva

    2018-01-01

    A critical analysis of the "keep everything" Big Data era and of the impact on our lives of the information, at first glance "convenient for future use", that we make known about ourselves on the network. Lecturer's biography: Father of the Internet; see https://internethalloffame.org/inductees/vint-cerf or https://en.wikipedia.org/wiki/Vint_Cerf

  19. Baryon symmetric big bang cosmology

    International Nuclear Information System (INIS)

    Stecker, F.W.

    1978-01-01

    It is stated that the framework of baryon symmetric big bang (BSBB) cosmology offers our greatest potential for deducing the evolution of the Universe, because its physical laws and processes have the minimum number of arbitrary assumptions about initial conditions in the big bang. In addition, it offers the possibility of explaining the photon-baryon ratio in the Universe and how galaxies and galaxy clusters are formed. BSBB cosmology also provides the only acceptable explanation at present for the origin of the cosmic γ-ray background radiation. (author)

  20. Release plan for Big Pete

    International Nuclear Information System (INIS)

    Edwards, T.A.

    1996-11-01

    This release plan provides instructions for the Radiological Control Technician (RCT) to conduct surveys for the unconditional release of "Big Pete," which was used in the removal of "Spacers" from the N-Reactor. Prior to performing surveys on the rear end portion of "Big Pete," it shall be cleaned (i.e., free of oil, grease, caked soil, heavy dust). If no contamination is found, the vehicle may be released with the permission of the area RCT Supervisor. If contamination is found by any of the surveys, contact the cognizant Radiological Engineer for decontamination instructions

  1. Small quarks make big nuggets

    International Nuclear Information System (INIS)

    Deligeorges, S.

    1985-01-01

    After a brief recap of the classification of subatomic particles, this paper deals with quark nuggets, particles with more than three quarks in one big bag, called a "nuclearite". Neutron stars, in fact, are big sacks of quarks, gigantic nuggets. Physicists are now trying to calculate which type of nugget of strange quark matter is stable, and what influence quark nuggets had on primordial nucleosynthesis. At present, it is said that if these "nuggets" exist, and in a large proportion, they may be candidates for the missing mass [fr]

  2. [Big Data- challenges and risks].

    Science.gov (United States)

    Krauß, Manuela; Tóth, Tamás; Hanika, Heinrich; Kozlovszky, Miklós; Dinya, Elek

    2015-12-06

    The term "Big Data" is commonly used to describe the growing mass of information being created recently. New conclusions can be drawn and new services can be developed by the connection, processing and analysis of these information. This affects all aspects of life, including health and medicine. The authors review the application areas of Big Data, and present examples from health and other areas. However, there are several preconditions of the effective use of the opportunities: proper infrastructure, well defined regulatory environment with particular emphasis on data protection and privacy. These issues and the current actions for solution are also presented.

  3. Towards a big crunch dual

    Energy Technology Data Exchange (ETDEWEB)

    Hertog, Thomas E-mail: hertog@vulcan2.physics.ucsb.edu; Horowitz, Gary T

    2004-07-01

    We show there exist smooth asymptotically anti-de Sitter initial data which evolve to a big crunch singularity in a low energy supergravity limit of string theory. This opens up the possibility of using the dual conformal field theory to obtain a fully quantum description of the cosmological singularity. A preliminary study of this dual theory suggests that the big crunch is an endpoint of evolution even in the full string theory. We also show that any theory with scalar solitons must have negative energy solutions. The results presented here clarify our earlier work on cosmic censorship violation in N=8 supergravity. (author)

  4. The Inverted Big-Bang

    OpenAIRE

    Vaas, Ruediger

    2004-01-01

    Our universe appears to have been created not out of nothing but from a strange space-time dust. Quantum geometry (loop quantum gravity) makes it possible to avoid the ominous beginning of our universe with its physically unrealistic (i.e. infinite) curvature, extreme temperature, and energy density. This could be the long sought after explanation of the big-bang and perhaps even opens a window into a time before the big-bang: Space itself may have come from an earlier collapsing universe tha...

  5. Big Cities, Big Problems: Reason for the Elderly to Move?

    NARCIS (Netherlands)

    Fokkema, T.; de Jong-Gierveld, J.; Nijkamp, P.

    1996-01-01

    In many European countries, data on geographical patterns of internal elderly migration show that the elderly (55+) are more likely to leave than to move to the big cities. Besides emphasising the attractive features of the destination areas (pull factors), it is often assumed that this negative

  6. Big-Eyed Bugs Have Big Appetite for Pests

    Science.gov (United States)

    Many kinds of arthropod natural enemies (predators and parasitoids) inhabit crop fields in Arizona and can have a large negative impact on several pest insect species that also infest these crops. Geocoris spp., commonly known as big-eyed bugs, are among the most abundant insect predators in field c...

  7. A survey on Big Data Stream Mining

    African Journals Online (AJOL)

    pc

    2018-03-05

    Mar 5, 2018 ... Big Data can be static on one machine or distributed ... decision making, and process automation. Big data .... Concept drifting: concept drifting means the classifier .... transactions generated by a prefix tree structure. EstDec ...

  8. Guarani aquifer hydrogeological synthesis of the Guarani aquifer system. Edicion bilingue

    International Nuclear Information System (INIS)

    2009-01-01

    This work presents a synthesis of current knowledge of the Guarani Aquifer System, based on the technical products prepared by the companies and consultants who participated in the framework of the Project for Environmental Protection and Sustainable Development of the Guarani Aquifer.

  9. BIG DATA-DRIVEN MARKETING: AN ABSTRACT

    OpenAIRE

    Suoniemi, Samppa; Meyer-Waarden, Lars; Munzel, Andreas

    2017-01-01

    Customer information plays a key role in managing successful relationships with valuable customers. Big data customer analytics use (BD use), i.e., the extent to which customer information derived from big data analytics guides marketing decisions, helps firms better meet customer needs for competitive advantage. This study addresses three research questions: What are the key antecedents of big data customer analytics use? How, and to what extent, does big data customer an...

  10. Big data in Finnish financial services

    OpenAIRE

    Laurila, M. (Mikko)

    2017-01-01

    This thesis aims to explore the concept of big data and to create an understanding of big data maturity in the Finnish financial services industry. The research questions of this thesis are “What kind of big data solutions are being implemented in the Finnish financial services sector?” and “Which factors impede faster implementation of big data solutions in the Finnish financial services sector?”. ...

  11. Aquifer response to earth tides

    International Nuclear Information System (INIS)

    Kanehiro, B.Y.; Narasimhan, T.N.

    1981-01-01

    The relations presented in the first part of this paper are applicable to packed-off wells and other situations where appreciable flow to the well does not exist. Comparisons of aquifer properties determined from the response to earth tides and from more standard pumping tests for the two California fields are reasonably good. The case of an open well makes the problem more complicated, since there may be an appreciable amount of flow to the well. This flow to the well is seen either as a phase lag or as a difference in the ratio of the well signal to the tide for the semidiurnal and diurnal components of the tide. The latter is probably the better and more accurate indicator of flow to the well. Analyses of such situations, however, become involved and are probably best done as case-by-case studies. The numerical solutions show that treating the inverse problem through numerical modeling is at least feasible for any individual situation. It may be possible to simplify the inverse problem through the generation of type curves, but general type curves that are applicable to diverse situations are not likely to be practical. 7 figures
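
    The amplitude ratio and phase lag between well and tide signals discussed here can be estimated per tidal constituent with an FFT. A synthetic sketch; the 0.4 ratio and 0.6 rad lag are made-up values, and the frequency is aligned to an FFT bin for clarity:

        import numpy as np

        dt = 1.0 / 24.0                        # hourly samples, in days
        t = np.arange(0, 30, dt)               # 30-day record
        f = 58.0 / 30.0                        # ~M2 frequency, on an FFT bin
        tide = np.sin(2 * np.pi * f * t)
        well = 0.4 * np.sin(2 * np.pi * f * t - 0.6)   # damped, lagged response
        k = np.argmin(np.abs(np.fft.rfftfreq(t.size, dt) - f))
        H = np.fft.rfft(well)[k] / np.fft.rfft(tide)[k]
        ratio, phase_lag = np.abs(H), -np.angle(H)     # ~0.4 and ~0.6 rad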

  12. Aquifer Storage Recovery (ASR) of chlorinated municipal drinking water in a confined aquifer

    Science.gov (United States)

    Izbicki, John A.; Petersen, Christen E.; Glotzbach, Kenneth J.; Metzger, Loren F.; Christensen, Allen H.; Smith, Gregory A.; O'Leary, David R.; Fram, Miranda S.; Joseph, Trevor; Shannon, Heather

    2010-01-01

    About 1.02 × 10⁶ m³ of chlorinated municipal drinking water was injected into a confined aquifer, 94-137 m below Roseville, California, between December 2005 and April 2006. The water was stored in the aquifer for 438 days, and 2.64 × 10⁶ m³ of water were extracted between July 2007 and February 2008. On the basis of Cl data, 35% of the injected water was recovered and 65% of the injected water and associated disinfection by-products (DBPs) remained in the aquifer at the end of extraction. About 46.3 kg of total trihalomethanes (TTHM) entered the aquifer with the injected water and 37.6 kg of TTHM were extracted. As much as 44 kg of TTHMs remained in the aquifer at the end of extraction because of incomplete recovery of injected water and formation of THMs within the aquifer by reactions with free chlorine in the injected water. Well-bore velocity log data collected from the Aquifer Storage Recovery (ASR) well show as much as 60% of the injected water entered the aquifer through a 9 m thick, high-permeability layer within the confined aquifer near the top of the screened interval. Model simulations of ground-water flow near the ASR well indicate that (1) aquifer heterogeneity allowed injected water to move rapidly through the aquifer to nearby monitoring wells, (2) aquifer heterogeneity caused injected water to move further than expected assuming uniform aquifer properties, and (3) physical clogging of high-permeability layers is the probable cause for the observed change in the distribution of borehole flow. Aquifer heterogeneity also enhanced mixing of native anoxic ground water with oxic injected water, promoting removal of THMs primarily through sorption. A 3 to 4-fold reduction in TTHM concentrations was observed in the furthest monitoring well 427 m downgradient from the ASR well, and similar magnitude reductions were observed in depth-dependent water samples collected from the upper part of the screened interval in the ASR well near the end of the extraction
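
    The recovery figure rests on two-end-member mixing with chloride as the conservative tracer. A minimal sketch; the concentrations below are hypothetical, not the Roseville values:

        # Fraction of injected water in an extracted sample, from chloride mass balance.
        cl_native, cl_injected = 110.0, 25.0   # mg/L end members (hypothetical)
        cl_sample = 55.0                       # mg/L in extracted water (hypothetical)
        f_injected = (cl_native - cl_sample) / (cl_native - cl_injected)   # ~0.65

    Integrating f_injected over the extracted volume gives the recovered share of the injected water.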

  13. Water Flow in Karst Aquifer Considering Dynamically Variable Saturation Conduit

    Science.gov (United States)

    Tan, Chaoqun; Hu, Bill X.

    2017-04-01

    The karst system is generally conceptualized as a dual-porosity system, characterized by a low-conductivity, high-storage continuum matrix and high-conductivity, quick-flow conduit networks. So far, a common numerical model for simulating flow in karst aquifers has been MODFLOW-2005 CFP, released by the USGS in 2008. However, the steady-state approach for conduit flow in CFP is physically impractical when simulating very dynamic hydraulics with a variably saturated conduit. We therefore adopt the method proposed by Reimann et al. (2011) to improve the current model, in which the Saint-Venant equations are used to model the flow in the conduit. Given the actual setting of our study area in Southwest China, where the conduit is very large and varies along the flow path and the Dirichlet boundary varies with rainfall, we further investigate the influence of conduit diameter and outflow boundary on the numerical model. We also analyze the hydraulic processes in multi-precipitation events. We find that the numerical model presented here corresponds well with CFP for saturated conduits, and that it depicts the interaction between matrix and conduit during very dynamic hydraulics much better than CFP.
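
    For reference, the Saint-Venant pair named above can be written in its standard one-dimensional form; the abstract does not print the equations, so this is the textbook formulation rather than the authors' exact notation.

        % Standard 1-D Saint-Venant equations: continuity and momentum for
        % unsteady open-channel (conduit) flow.
        \begin{align}
          \frac{\partial A}{\partial t} + \frac{\partial Q}{\partial x} &= q_L,\\
          \frac{\partial Q}{\partial t}
            + \frac{\partial}{\partial x}\!\left(\frac{Q^{2}}{A}\right)
            + g A \frac{\partial h}{\partial x} &= g A \left(S_0 - S_f\right),
        \end{align}
        % A: flow area, Q: discharge, h: flow depth, q_L: lateral inflow per
        % unit length (matrix-conduit exchange), S_0: bed slope, S_f: friction
        % slope.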

  14. Conceptual and numerical models of groundwater flow in the Ogallala aquifer in Gregory and Tripp Counties, South Dakota, water years 1985--2009

    Science.gov (United States)

    Davis, Kyle W.; Putnam, Larry D.

    2013-01-01

    The Ogallala aquifer is an important water resource for the Rosebud Sioux Tribe in Gregory and Tripp Counties in south-central South Dakota and is used for irrigation, public-supply, domestic, and stock water supplies. To better understand groundwater flow in the Ogallala aquifer, conceptual and numerical models of groundwater flow were developed for the aquifer. The conceptual model was used to analyze groundwater flow and to guide construction of a numerical model simulating flow in the aquifer. The MODFLOW–NWT code was used to simulate transient groundwater conditions for water years 1985–2009, and the model was calibrated using statistical parameter-estimation techniques. Potential future drought and increased-pumping scenarios were then simulated using the input parameters from the calibrated model. A 200-year transient initialization period was used to establish starting conditions for the subsequent 25-year simulation of water years 1985–2009, which was discretized into three seasonal stress periods per year. A single-layer model was used to simulate flow and mass balance in the Ogallala aquifer with a grid of 133 rows and 282 columns and a uniform spacing of 500 meters (1,640 feet). Regional inflow and outflow were simulated along the western and southern boundaries using specified-head cells; all other boundaries were simulated using no-flow cells. Recharge to the aquifer occurs through precipitation on the outcrop area. Model calibration was accomplished using the Parameter Estimation (PEST) program, which adjusted individual model input parameters and assessed the difference between estimated and model-simulated values of hydraulic head and base flow. This program was designed to estimate parameter values that are statistically the most likely set of values to result in the
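
    A minimal FloPy sketch of the model skeleton described above, using the grid dimensions and seasonal stress periods from the abstract. All property values, elevations, the recharge rate, and the executable name are placeholder assumptions, not the report's calibrated inputs; the specified-head boundaries and PEST calibration would be added on top of this skeleton.

        import flopy

        # Single-layer MODFLOW-NWT grid: 133 rows x 282 columns, 500 m spacing.
        m = flopy.modflow.Modflow("ogallala_sketch", version="mfnwt",
                                  exe_name="mfnwt")   # assumes mfnwt is on PATH
        dis = flopy.modflow.ModflowDis(
            m, nlay=1, nrow=133, ncol=282, delr=500.0, delc=500.0,
            top=700.0, botm=600.0,            # placeholder elevations (m)
            nper=3, perlen=[121, 122, 122],   # three seasonal stress periods
            steady=[False, False, False])
        bas = flopy.modflow.ModflowBas(m, ibound=1, strt=680.0)
        upw = flopy.modflow.ModflowUpw(m, laytyp=1, hk=10.0, sy=0.15, ss=1e-5)
        rch = flopy.modflow.ModflowRch(m, rech=5e-5)  # recharge on outcrop area
        nwt = flopy.modflow.ModflowNwt(m)
        oc = flopy.modflow.ModflowOc(m)
        m.write_input()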

  15. China: Big Changes Coming Soon

    Science.gov (United States)

    Rowen, Henry S.

    2011-01-01

    Big changes are ahead for China, probably abrupt ones. The economy has grown so rapidly for many years, over 30 years at an average of nine percent a year, that its size makes it a major player in trade and finance and increasingly in political and military matters. This growth is not only of great importance internationally, it is already having…

  16. Big data and urban governance

    NARCIS (Netherlands)

    Taylor, L.; Richter, C.; Gupta, J.; Pfeffer, K.; Verrest, H.; Ros-Tonen, M.

    2015-01-01

    This chapter examines the ways in which big data is involved in the rise of smart cities. Mobile phones, sensors and online applications produce streams of data which are used to regulate and plan the city, often in real time, but which present challenges as to how the city’s functions are seen and

  17. Big Data for personalized healthcare

    NARCIS (Netherlands)

    Siemons, Liseth; Sieverink, Floor; Vollenbroek, Wouter; van de Wijngaert, Lidwien; Braakman-Jansen, Annemarie; van Gemert-Pijnen, Lisette

    2016-01-01

    Big Data, often defined according to the 5V model (volume, velocity, variety, veracity and value), is seen as the key towards personalized healthcare. However, it also confronts us with new technological and ethical challenges that require more sophisticated data management tools and data analysis

  18. Big data en gelijke behandeling

    NARCIS (Netherlands)

    Lammerant, Hans; de Hert, Paul; Blok, P.H.; Blok, P.H.

    2017-01-01

    In this chapter we first examine the main basic concepts of equal treatment and discrimination (section 6.2). We then look at the Dutch and European legal framework on non-discrimination (sections 6.3-6.5) and how those rules should be applied to big

  19. Research Ethics in Big Data.

    Science.gov (United States)

    Hammer, Marilyn J

    2017-05-01

    The ethical conduct of research includes, in part, patient agreement to participate in studies and the protection of health information. In the evolving world of data science and the accessibility of large quantities of web-based data created by millions of individuals, novel methodologic approaches to answering research questions are emerging. This article explores research ethics in the context of big data.

  20. Big data e data science

    OpenAIRE

    Cavique, Luís

    2014-01-01

    This article presented the basic concepts of Big Data and the new field to which it gave rise, Data Science. Within Data Science, the notion of reducing the dimensionality of data was discussed and exemplified.

  1. The Case for "Big History."

    Science.gov (United States)

    Christian, David

    1991-01-01

    Urges an approach to the teaching of history that takes the largest possible perspective, crossing time as well as space. Discusses the problems and advantages of such an approach. Describes a course on "big" history that begins with time, creation myths, and astronomy, and moves on to paleontology and evolution. (DK)

  2. Finding errors in big data

    NARCIS (Netherlands)

    Puts, Marco; Daas, Piet; de Waal, A.G.

    No data source is perfect. Mistakes inevitably creep in. Spotting errors is hard enough when dealing with survey responses from several thousand people, but the difficulty is multiplied hugely when that mysterious beast Big Data comes into play. Statistics Netherlands is about to publish its first

  3. Sampling Operations on Big Data

    Science.gov (United States)

    2015-11-29

    Gadepally, Vijay; Herr, Taylor; Johnson, Luke; Milechin, Lauren; Milosavljevic, Maja; Miller, Benjamin A. (Lincoln…)

    …categories. These include edge sampling methods, where edges are selected by a predetermined criterion; snowball sampling methods, where algorithms start… process and disseminate information for discovery and exploration under real-time constraints. Common signal processing operations such as sampling and

  4. The International Big History Association

    Science.gov (United States)

    Duffy, Michael; Duffy, D'Neil

    2013-01-01

    IBHA, the International Big History Association, was organized in 2010 and "promotes the unified, interdisciplinary study and teaching of history of the Cosmos, Earth, Life, and Humanity." This is the vision that Montessori embraced long before the discoveries of modern science fleshed out the story of the evolving universe. "Big…

  5. NASA's Big Data Task Force

    Science.gov (United States)

    Holmes, C. P.; Kinter, J. L.; Beebe, R. F.; Feigelson, E.; Hurlburt, N. E.; Mentzel, C.; Smith, G.; Tino, C.; Walker, R. J.

    2017-12-01

    Two years ago NASA established the Ad Hoc Big Data Task Force (BDTF - https://science.nasa.gov/science-committee/subcommittees/big-data-task-force), an advisory working group within the NASA Advisory Council system. The scope of the Task Force included all NASA Big Data programs, projects, missions, and activities. The Task Force focused on such topics as exploring the existing and planned evolution of NASA's science data cyber-infrastructure that supports broad access to data repositories for NASA Science Mission Directorate missions; best practices within NASA, other Federal agencies, private industry and research institutions; and Federal initiatives related to big data and data access. The BDTF has completed its two-year term and produced several recommendations plus four white papers for NASA's Science Mission Directorate. This presentation will discuss the activities and results of the Task Force, including summaries of key points from its focused study topics. The paper serves as an introduction to the papers following in this ESSI session.

  6. Big Math for Little Kids

    Science.gov (United States)

    Greenes, Carole; Ginsburg, Herbert P.; Balfanz, Robert

    2004-01-01

    "Big Math for Little Kids," a comprehensive program for 4- and 5-year-olds, develops and expands on the mathematics that children know and are capable of doing. The program uses activities and stories to develop ideas about number, shape, pattern, logical reasoning, measurement, operations on numbers, and space. The activities introduce the…

  7. BIG DATA IN BUSINESS ENVIRONMENT

    Directory of Open Access Journals (Sweden)

    Logica BANICA

    2015-06-01

    Full Text Available In recent years, dealing with large volumes of data originating from social media sites and mobile communications, along with data from business environments and institutions, has led to the definition of a new concept, known as Big Data. The economic impact of the sheer amount of data produced in the last two years has increased rapidly. It is necessary to aggregate all types of data (structured and unstructured) in order to improve current transactions, to develop new business models, to provide a real image of supply and demand, and thereby to generate market advantages. Companies that turn to Big Data therefore have a competitive advantage over other firms. From the perspective of IT organizations, they must accommodate the storage and processing of Big Data and provide analysis tools that are easily integrated into business processes. This paper aims to discuss aspects of the Big Data concept and the principles for building, organizing and analysing huge datasets in the business environment, offering a three-layer architecture based on actual software solutions. The article also covers graphical tools for exploring and representing unstructured data, Gephi and NodeXL.

  8. From Big Bang to Eternity?

    Indian Academy of Sciences (India)

    at different distances (that is, at different epochs in the past) to come to this ... that the expansion started billions of years ago from an explosive Big Bang. Recent research sheds new light on the key cosmological question about the distant ...

  9. Banking Wyoming big sagebrush seeds

    Science.gov (United States)

    Robert P. Karrfalt; Nancy Shaw

    2013-01-01

    Five commercially produced seed lots of Wyoming big sagebrush (Artemisia tridentata Nutt. var. wyomingensis (Beetle & Young) S.L. Welsh [Asteraceae]) were stored under various conditions for 5 y. Purity, moisture content as measured by equilibrium relative humidity, and storage temperature were all important factors to successful seed storage. Our results indicate...

  10. Review of Aquifer Storage and Recovery Performance in the Upper Floridan Aquifer in Southern Florida

    Science.gov (United States)

    Reese, Ronald S.

    2006-01-01

    Introduction: Interest and activity in aquifer storage and recovery (ASR) in southern Florida have increased greatly during the past 10 to 15 years. ASR wells have been drilled to the carbonate Floridan aquifer system at 30 sites in southern Florida, mostly by local municipalities or counties located in coastal areas. The primary storage zone at these sites is contained within the brackish to saline Upper Floridan aquifer of the Floridan aquifer system. The strategy for use of ASR in southern Florida is to store excess freshwater available during the wet season in an aquifer and recover it during the dry season when needed for supplemental water supply. Each ASR cycle is defined by three periods: recharge, storage, and recovery. This fact sheet summarizes some of the findings of a second-phase retrospective assessment of existing ASR facilities and sites.

  11. Big Data: Implications for Health System Pharmacy.

    Science.gov (United States)

    Stokes, Laura B; Rogers, Joseph W; Hertig, John B; Weber, Robert J

    2016-07-01

    Big Data refers to datasets that are so large and complex that traditional methods and hardware for collecting, sharing, and analyzing them are not possible. Big Data that is accurate leads to more confident decision making, improved operational efficiency, and reduced costs. The rapid growth of health care information results in Big Data around health services, treatments, and outcomes, and Big Data can be used to analyze the benefit of health system pharmacy services. The goal of this article is to provide a perspective on how Big Data can be applied to health system pharmacy. It will define Big Data, describe the impact of Big Data on population health, review specific implications of Big Data in health system pharmacy, and describe an approach for pharmacy leaders to effectively use Big Data. A few strategies involved in managing Big Data in health system pharmacy include identifying potential opportunities for Big Data, prioritizing those opportunities, protecting privacy concerns, promoting data transparency, and communicating outcomes. As health care information expands in its content and becomes more integrated, Big Data can enhance the development of patient-centered pharmacy services.

  12. Big data: een zoektocht naar instituties

    NARCIS (Netherlands)

    van der Voort, H.G.; Crompvoets, J

    2016-01-01

    Big data is a well-known phenomenon, even a buzzword nowadays. It refers to an abundance of data and new possibilities to process and use them. Big data is subject of many publications. Some pay attention to the many possibilities of big data, others warn us for their consequences. This special

  13. A SWOT Analysis of Big Data

    Science.gov (United States)

    Ahmadi, Mohammad; Dileepan, Parthasarati; Wheatley, Kathleen K.

    2016-01-01

    This is the decade of data analytics and big data, but not everyone agrees with the definition of big data. Some researchers see it as the future of data analysis, while others consider it as hype and foresee its demise in the near future. No matter how it is defined, big data for the time being is having its glory moment. The most important…

  14. Data, Data, Data : Big, Linked & Open

    NARCIS (Netherlands)

    Folmer, E.J.A.; Krukkert, D.; Eckartz, S.M.

    2013-01-01

    The entire business and IT world is currently talking about Big Data, a trend that overtook Cloud Computing in mid-2013 (based on Google Trends). Policy makers, too, are actively engaged with Big Data. Neelie Kroes, vice-president of the European Commission, speaks of the ‘Big Data

  15. A survey of big data research

    Science.gov (United States)

    Fang, Hua; Zhang, Zhaoyang; Wang, Chanpaul Jin; Daneshmand, Mahmoud; Wang, Chonggang; Wang, Honggang

    2015-01-01

    Big data create values for business and research, but pose significant challenges in terms of networking, storage, management, analytics and ethics. Multidisciplinary collaborations from engineers, computer scientists, statisticians and social scientists are needed to tackle, discover and understand big data. This survey presents an overview of big data initiatives, technologies and research in industries and academia, and discusses challenges and potential solutions. PMID:26504265

  16. Hydrological connectivity of perched aquifers and regional aquifers in semi-arid environments: a case study from Namibia

    Science.gov (United States)

    Hamutoko, J. T.; Wanke, H.

    2017-12-01

    Integrated isotopic and hydrological tracers, along with standard hydrological data, are used to understand complex dryland hydrological processes on different spatial and temporal scales. The objective of this study is to analyse the relationship between the perched aquifers and the regional aquifer using hydrochemical data and isotopic compositions in the Cuvelai-Etosha Basin in Namibia. This relation between the aquifers will aid in understanding groundwater recharge processes and flow dynamics. Perched aquifers are discontinuous shallow aquifers with water levels ranging from 0 to 30 meters below ground level. The regional aquifer occurs in semi-consolidated sandstone at depths between about 60 and 160 meters below ground level. Water samples were collected from both aquifers in 10 villages and analysed for major ions and stable isotopes. The results show overlapping hydrochemistry and isotopic compositions of the two aquifers in 8 villages, which suggests the possibility of perched-aquifer water infiltrating into the regional aquifer. In two villages the hydrochemistry and isotopic composition of the aquifers are totally different, which suggests that there is no interaction between these aquifers. Areas where perched aquifers are connected to regional aquifers may be recharge zones. These findings have important implications for groundwater resource management.

  17. The BigBoss Experiment

    Energy Technology Data Exchange (ETDEWEB)

    Schelgel, D.; Abdalla, F.; Abraham, T.; Ahn, C.; Allende Prieto, C.; Annis, J.; Aubourg, E.; Azzaro, M.; Bailey, S.; Baltay, C.; Baugh, C.; Bebek, C.; Becerril, S.; Blanton, M.; Bolton, A.; Bromley, B.; Cahn, R.; Carton, P.-H.; Cervanted-Cota, J.L.; Chu, Y.; Cortes, M.; /APC, Paris /Brookhaven /IRFU, Saclay /Marseille, CPPM /Marseille, CPT /Durham U. / /IEU, Seoul /Fermilab /IAA, Granada /IAC, La Laguna / /IAC, Mexico / / /Madrid, IFT /Marseille, Lab. Astrophys. / / /New York U. /Valencia U.

    2012-06-07

    BigBOSS is a Stage IV ground-based dark energy experiment to study baryon acoustic oscillations (BAO) and the growth of structure with a wide-area galaxy and quasar redshift survey over 14,000 square degrees. It has been conditionally accepted by NOAO in response to a call for major new instrumentation and a high-impact science program for the 4-m Mayall telescope at Kitt Peak. The BigBOSS instrument is a robotically-actuated, fiber-fed spectrograph capable of taking 5000 simultaneous spectra over a wavelength range from 340 nm to 1060 nm, with a resolution R = λ/Δλ = 3000–4800. Using data from imaging surveys that are already underway, spectroscopic targets are selected that trace the underlying dark matter distribution. In particular, targets include luminous red galaxies (LRGs) up to z = 1.0, extending the BOSS LRG survey in both redshift and survey area. To probe the universe out to even higher redshift, BigBOSS will target bright [OII] emission line galaxies (ELGs) up to z = 1.7. In total, 20 million galaxy redshifts are obtained to measure the BAO feature, trace the matter power spectrum at smaller scales, and detect redshift space distortions. BigBOSS will provide additional constraints on early dark energy and on the curvature of the universe by measuring the Ly-alpha forest in the spectra of over 600,000 2.2 < z < 3.5 quasars. BigBOSS galaxy BAO measurements combined with an analysis of the broadband power, including the Ly-alpha forest in BigBOSS quasar spectra, achieves a FOM of 395 with Planck plus Stage III priors. This FOM is based on conservative assumptions for the analysis of broad band power (k_max = 0.15), and could grow to over 600 if current work allows us to push the analysis to higher wave numbers (k_max = 0.3). BigBOSS will also place constraints on theories of modified gravity and inflation, and will measure the sum of neutrino masses to 0.024 eV accuracy.
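
    For a sense of scale, the quoted resolution R = λ/Δλ translates into a wavelength bin Δλ = λ/R; the loop below tabulates it across the band. This is plain arithmetic on the abstract's numbers, not an instrument specification.

        # Delta-lambda = lambda / R at the band edges and mid-band,
        # for the quoted resolution range R = 3000-4800.
        for lam_nm in (340, 700, 1060):
            for R in (3000, 4800):
                print(f"lambda = {lam_nm} nm, R = {R}: "
                      f"d_lambda = {lam_nm / R:.3f} nm")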

  18. Transient well flow in vertically heterogeneous aquifers

    Science.gov (United States)

    Hemker, C. J.

    1999-11-01

    A solution for the general problem of computing well flow in vertically heterogeneous aquifers is found by an integration of both analytical and numerical techniques. The radial component of flow is treated analytically; the drawdown is a continuous function of the distance to the well. The finite-difference technique is used for the vertical flow component only. The aquifer is discretized in the vertical dimension and the heterogeneous aquifer is considered to be a layered (stratified) formation with a finite number of homogeneous sublayers, where each sublayer may have different properties. The transient part of the differential equation is solved with Stehfest's algorithm, a numerical inversion technique of the Laplace transform. The well is of constant discharge and penetrates one or more of the sublayers. The effect of wellbore storage on early drawdown data is taken into account. In this way drawdowns are found for a finite number of sublayers as a continuous function of radial distance to the well and of time since the pumping started. The model is verified by comparing results with published analytical and numerical solutions for well flow in homogeneous and heterogeneous, confined and unconfined aquifers. Instantaneous and delayed drainage of water from above the water table are considered, combined with the effects of partially penetrating and finite-diameter wells. The model is applied to demonstrate that the transient effects of wellbore storage in unconfined aquifers are less pronounced than previous numerical experiments suggest. Other applications of the presented solution technique are given for partially penetrating wells in heterogeneous formations, including a demonstration of the effect of decreasing specific storage values with depth in an otherwise homogeneous aquifer. The presented solution can be a powerful tool for the analysis of drawdown from pumping tests, because hydraulic properties of layered heterogeneous aquifer systems with
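
    Stehfest's algorithm, as named above, approximates f(t) from samples of its Laplace transform: f(t) ≈ (ln 2 / t) Σᵢ Vᵢ F(i ln 2 / t). A compact generic implementation follows, with the standard check F(s) = 1/(s+1) ↔ f(t) = e^(−t); this is the textbook algorithm, not the paper's code.

        import math

        def stehfest_coefficients(n):
            """Stehfest weights V_i (n must be even)."""
            assert n % 2 == 0
            half = n // 2
            V = []
            for i in range(1, n + 1):
                s = 0.0
                for k in range((i + 1) // 2, min(i, half) + 1):
                    s += (k**half * math.factorial(2 * k) /
                          (math.factorial(half - k) * math.factorial(k) *
                           math.factorial(k - 1) * math.factorial(i - k) *
                           math.factorial(2 * k - i)))
                V.append((-1) ** (half + i) * s)
            return V

        def invert_laplace(F, t, n=12):
            """Approximate f(t) from its Laplace transform F(s)."""
            ln2_t = math.log(2.0) / t
            V = stehfest_coefficients(n)
            return ln2_t * sum(Vi * F((i + 1) * ln2_t)
                               for i, Vi in enumerate(V))

        # Check against a transform with a known inverse: 1/(s+1) <-> exp(-t).
        for t in (0.5, 1.0, 2.0):
            print(t, invert_laplace(lambda s: 1.0 / (s + 1.0), t), math.exp(-t))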

  19. A Black Hills-Madison Aquifer origin for Dakota Aquifer groundwater in northeastern Nebraska.

    Science.gov (United States)

    Stotler, Randy; Harvey, F Edwin; Gosselin, David C

    2010-01-01

    Previous studies of the Dakota Aquifer in South Dakota attributed elevated groundwater sulfate concentrations to Madison Aquifer recharge in the Black Hills with subsequent chemical evolution prior to upward migration into the Dakota Aquifer. This study examines the plausibility of a Madison Aquifer origin for groundwater in northeastern Nebraska. Dakota Aquifer water samples were collected for major ion chemistry and isotopic analysis (¹⁸O, ²H, ³H, ¹⁴C, ¹³C, ³⁴S, ¹⁸O-SO₄, ⁸⁷Sr, ³⁷Cl). Results show that groundwater beneath the eastern, unconfined portion of the study area is distinctly different from groundwater sampled beneath the western, confined portion. In the east, groundwater is calcium-bicarbonate type, with δ¹⁸O values (−9.6‰ to −12.4‰) similar to local, modern precipitation (−7.4‰ to −10‰), and tritium values reflecting modern recharge. In the west, groundwater is calcium-sulfate type, having depleted δ¹⁸O values (−16‰ to −18‰) relative to local, modern precipitation, and ¹⁴C ages 32,000 to more than 47,000 years before present. Sulfate, δ¹⁸O, δ²H, δ³⁴S, and δ¹⁸O-SO₄ values are similar to those found in Madison Aquifer groundwater in South Dakota. Thus, it is proposed that Madison Aquifer source water is also present within the Dakota Aquifer beneath northeastern Nebraska. A simple Darcy equation estimate of groundwater velocities and travel times, using reported physical parameters from the Madison and Dakota Aquifers, suggests such a migration is plausible. However, discrepancies between ¹⁴C and Darcy age estimates indicate that ¹⁴C ages may not accurately reflect aquifer residence time, due to mixtures of water of varying ages.
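
    The Darcy plausibility estimate mentioned above amounts to v = Ki/nₑ for the average linear velocity and distance/velocity for the travel time. A sketch with illustrative placeholder values (the paper's actual parameters are not given in this record):

        # Darcy estimate of average linear velocity and travel time.
        # All numbers are illustrative placeholders, not the paper's values.

        K = 5.0        # hydraulic conductivity (m/day)
        i = 1.0e-3     # hydraulic gradient (dimensionless)
        n_e = 0.20     # effective porosity

        v = K * i / n_e            # average linear (seepage) velocity, m/day
        distance_m = 500e3         # e.g., Black Hills to northeastern Nebraska
        travel_years = distance_m / v / 365.25

        print(f"v = {v:.3e} m/day, travel time = {travel_years:,.0f} years")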

  20. Big Data - Smart Health Strategies

    Science.gov (United States)

    2014-01-01

    Summary Objectives To select best papers published in 2013 in the field of big data and smart health strategies, and summarize outstanding research efforts. Methods A systematic search was performed using two major bibliographic databases for relevant journal papers. The references obtained were reviewed in a two-stage process, starting with a blinded review performed by the two section editors, and followed by a peer review process operated by external reviewers recognized as experts in the field. Results The complete review process selected four best papers, illustrating various aspects of the special theme, among them: (a) using large volumes of unstructured data and, specifically, clinical notes from Electronic Health Records (EHRs) for pharmacovigilance; (b) knowledge discovery via querying large volumes of complex (both structured and unstructured) biological data using big data technologies and relevant tools; (c) methodologies for applying cloud computing and big data technologies in the field of genomics, and (d) system architectures enabling high-performance access to and processing of large datasets extracted from EHRs. Conclusions The potential of big data in biomedicine has been pinpointed in various viewpoint papers and editorials. The review of current scientific literature illustrated a variety of interesting methods and applications in the field, but still the promises exceed the current outcomes. As we are getting closer towards a solid foundation with respect to common understanding of relevant concepts and technical aspects, and the use of standardized technologies and tools, we can anticipate to reach the potential that big data offer for personalized medicine and smart health strategies in the near future. PMID:25123721

  1. Big-Leaf Mahogany on CITES Appendix II: Big Challenge, Big Opportunity

    Science.gov (United States)

    JAMES GROGAN; PAULO BARRETO

    2005-01-01

    On 15 November 2003, big-leaf mahogany (Swietenia macrophylla King, Meliaceae), the most valuable widely traded Neotropical timber tree, gained strengthened regulatory protection from its listing on Appendix II of the Convention on International Trade in Endangered Species of Wild Fauna and Flora (CITES). CITES is a United Nations-chartered agreement signed by 164...

  2. Groundwater vulnerability mapping of Qatar aquifers

    Science.gov (United States)

    Baalousha, Husam Musa

    2016-12-01

    Qatar is one of the most arid countries in the world, with limited water resources. With little rainfall and no surface water, groundwater is the only natural source of fresh water in the country. While the country relies mainly on desalination of seawater to secure its water supply, groundwater has been used extensively for irrigation over the last three decades, which has caused adverse environmental impacts. Vulnerability assessment is a widely used tool for groundwater protection and land-use management. Aquifers in Qatar are carbonate, with many fractures, depressions and cavities. Karst aquifers are generally more vulnerable to contamination than other aquifers, as any anthropogenic-sourced contaminant, especially above a highly fractured zone, can infiltrate quickly into the aquifer and spread over a wide area. The vulnerability assessment method presented in this study is based on two approaches, DRASTIC and EPIK, within the framework of a Geographical Information System (GIS). Results of this study show that the DRASTIC vulnerability method suits Qatar's hydrogeological settings better than EPIK. The vulnerability map produced using DRASTIC shows that coastal and karst areas have the highest vulnerability class. The southern part of the country falls in the low vulnerability class due to the occurrence of a shale formation within the aquifer media, which averts downward movement of contaminants.
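
    The DRASTIC index named above is a weighted sum over seven rated parameters: Depth to water, net Recharge, Aquifer media, Soil media, Topography, Impact of the vadose zone, and hydraulic Conductivity. The sketch below uses the standard DRASTIC weights; the ratings are hypothetical values for a single grid cell, not taken from the Qatar study.

        # DRASTIC index = sum of weight x rating over the seven parameters.
        WEIGHTS = {"D": 5, "R": 4, "A": 3, "S": 2, "T": 1, "I": 5, "C": 3}

        def drastic_index(ratings):
            """Ratings are 1-10 values assigned per parameter from
            lookup tables; higher index -> higher vulnerability."""
            return sum(WEIGHTS[p] * r for p, r in ratings.items())

        # Hypothetical ratings for one cell (e.g., a shallow karstic area):
        cell = {"D": 9, "R": 6, "A": 8, "S": 7, "T": 10, "I": 9, "C": 6}
        print(drastic_index(cell))   # index ranges from 23 to 230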

  3. Big Bang Day : The Great Big Particle Adventure - 3. Origins

    CERN Multimedia

    2008-01-01

    In this series, comedian and physicist Ben Miller asks the CERN scientists what they hope to find. If the LHC is successful, it will explain the nature of the Universe around us in terms of a few simple ingredients and a few simple rules. But the Universe now was forged in a Big Bang where conditions were very different, and the rules were very different, and those early moments were crucial to determining how things turned out later. At the LHC they can recreate conditions as they were billionths of a second after the Big Bang, before atoms and nuclei existed. They can find out why matter and antimatter didn't mutually annihilate each other to leave behind a Universe of pure, brilliant light. And they can look into the very structure of space and time - the fabric of the Universe

  4. Antigravity and the big crunch/big bang transition

    Science.gov (United States)

    Bars, Itzhak; Chen, Shih-Hung; Steinhardt, Paul J.; Turok, Neil

    2012-08-01

    We point out a new phenomenon which seems to be generic in 4d effective theories of scalar fields coupled to Einstein gravity, when applied to cosmology. A lift of such theories to a Weyl-invariant extension allows one to define classical evolution through cosmological singularities unambiguously, and hence construct geodesically complete background spacetimes. An attractor mechanism ensures that, at the level of the effective theory, generic solutions undergo a big crunch/big bang transition by contracting to zero size, passing through a brief antigravity phase, shrinking to zero size again, and re-emerging into an expanding normal gravity phase. The result may be useful for the construction of complete bouncing cosmologies like the cyclic model.

  5. Antigravity and the big crunch/big bang transition

    Energy Technology Data Exchange (ETDEWEB)

    Bars, Itzhak [Department of Physics and Astronomy, University of Southern California, Los Angeles, CA 90089-2535 (United States); Chen, Shih-Hung [Perimeter Institute for Theoretical Physics, Waterloo, ON N2L 2Y5 (Canada); Department of Physics and School of Earth and Space Exploration, Arizona State University, Tempe, AZ 85287-1404 (United States); Steinhardt, Paul J., E-mail: steinh@princeton.edu [Department of Physics and Princeton Center for Theoretical Physics, Princeton University, Princeton, NJ 08544 (United States); Turok, Neil [Perimeter Institute for Theoretical Physics, Waterloo, ON N2L 2Y5 (Canada)

    2012-08-29

    We point out a new phenomenon which seems to be generic in 4d effective theories of scalar fields coupled to Einstein gravity, when applied to cosmology. A lift of such theories to a Weyl-invariant extension allows one to define classical evolution through cosmological singularities unambiguously, and hence construct geodesically complete background spacetimes. An attractor mechanism ensures that, at the level of the effective theory, generic solutions undergo a big crunch/big bang transition by contracting to zero size, passing through a brief antigravity phase, shrinking to zero size again, and re-emerging into an expanding normal gravity phase. The result may be useful for the construction of complete bouncing cosmologies like the cyclic model.

  6. Antigravity and the big crunch/big bang transition

    International Nuclear Information System (INIS)

    Bars, Itzhak; Chen, Shih-Hung; Steinhardt, Paul J.; Turok, Neil

    2012-01-01

    We point out a new phenomenon which seems to be generic in 4d effective theories of scalar fields coupled to Einstein gravity, when applied to cosmology. A lift of such theories to a Weyl-invariant extension allows one to define classical evolution through cosmological singularities unambiguously, and hence construct geodesically complete background spacetimes. An attractor mechanism ensures that, at the level of the effective theory, generic solutions undergo a big crunch/big bang transition by contracting to zero size, passing through a brief antigravity phase, shrinking to zero size again, and re-emerging into an expanding normal gravity phase. The result may be useful for the construction of complete bouncing cosmologies like the cyclic model.

  7. Solution of a braneworld big crunch/big bang cosmology

    International Nuclear Information System (INIS)

    McFadden, Paul L.; Turok, Neil; Steinhardt, Paul J.

    2007-01-01

    We solve for the cosmological perturbations in a five-dimensional background consisting of two separating or colliding boundary branes, as an expansion in the collision speed V divided by the speed of light c. Our solution permits a detailed check of the validity of four-dimensional effective theory in the vicinity of the event corresponding to the big crunch/big bang singularity. We show that the four-dimensional description fails at the first nontrivial order in (V/c)². At this order, there is nontrivial mixing of the two relevant four-dimensional perturbation modes (the growing and decaying modes) as the boundary branes move from the narrowly separated limit described by Kaluza-Klein theory to the well-separated limit where gravity is confined to the positive-tension brane. We comment on the cosmological significance of the result and compute other quantities of interest in five-dimensional cosmological scenarios

  8. The ethics of big data in big agriculture

    Directory of Open Access Journals (Sweden)

    Isabelle M. Carbonell

    2016-03-01

    Full Text Available This paper examines the ethics of big data in agriculture, focusing on the power asymmetry between farmers and large agribusinesses like Monsanto. Following the recent purchase of Climate Corp., Monsanto is currently the most prominent biotech agribusiness to buy into big data. With wireless sensors on tractors monitoring or dictating every decision a farmer makes, Monsanto can now aggregate large quantities of previously proprietary farming data, enabling a privileged position with unique insights on a field-by-field basis into a third or more of the US farmland. This power asymmetry may be rebalanced through open-sourced data, and publicly-funded data analytic tools which rival Climate Corp. in complexity and innovation for use in the public domain.

  9. Big Data – Big Deal for Organization Design?

    OpenAIRE

    Janne J. Korhonen

    2014-01-01

    Analytics is an increasingly important source of competitive advantage. It has even been posited that big data will be the next strategic emphasis of organizations and that analytics capability will be manifested in organizational structure. In this article, I explore how analytics capability might be reflected in organizational structure using the notion of  “requisite organization” developed by Jaques (1998). Requisite organization argues that a new strategic emphasis requires the addition ...

  10. Nowcasting using news topics Big Data versus big bank

    OpenAIRE

    Thorsrud, Leif Anders

    2016-01-01

    The agents in the economy use a plethora of high-frequency information, including news media, to guide their actions and thereby shape aggregate economic fluctuations. Traditional nowcasting approaches have made relatively little use of such information. In this paper, I show how unstructured textual information in a business newspaper can be decomposed into daily news topics and used to nowcast quarterly GDP growth. Compared with a big bank of experts, here represented by official c...

  11. Big Data in Medicine is Driving Big Changes

    Science.gov (United States)

    Verspoor, K.

    2014-01-01

    Summary Objectives To summarise current research that takes advantage of “Big Data” in health and biomedical informatics applications. Methods Survey of trends in this work, and exploration of literature describing how large-scale structured and unstructured data sources are being used to support applications from clinical decision making and health policy, to drug design and pharmacovigilance, and further to systems biology and genetics. Results The survey highlights ongoing development of powerful new methods for turning that large-scale, and often complex, data into information that provides new insights into human health, in a range of different areas. Consideration of this body of work identifies several important paradigm shifts that are facilitated by Big Data resources and methods: in clinical and translational research, from hypothesis-driven research to data-driven research, and in medicine, from evidence-based practice to practice-based evidence. Conclusions The increasing scale and availability of large quantities of health data require strategies for data management, data linkage, and data integration beyond the limits of many existing information systems, and substantial effort is underway to meet those needs. As our ability to make sense of that data improves, the value of the data will continue to increase. Health systems, genetics and genomics, population and public health; all areas of biomedicine stand to benefit from Big Data and the associated technologies. PMID:25123716

  12. Big data is not a monolith

    CERN Document Server

    Ekbia, Hamid R; Mattioli, Michael

    2016-01-01

    Big data is ubiquitous but heterogeneous. Big data can be used to tally clicks and traffic on web pages, find patterns in stock trades, track consumer preferences, identify linguistic correlations in large corpuses of texts. This book examines big data not as an undifferentiated whole but contextually, investigating the varied challenges posed by big data for health, science, law, commerce, and politics. Taken together, the chapters reveal a complex set of problems, practices, and policies. The advent of big data methodologies has challenged the theory-driven approach to scientific knowledge in favor of a data-driven one. Social media platforms and self-tracking tools change the way we see ourselves and others. The collection of data by corporations and government threatens privacy while promoting transparency. Meanwhile, politicians, policy makers, and ethicists are ill-prepared to deal with big data's ramifications. The contributors look at big data's effect on individuals as it exerts social control throu...

  13. 40 CFR 147.502 - Aquifer exemptions. [Reserved

    Science.gov (United States)

    2010-07-01

    40 CFR Protection of Environment (Title 40, Vol. 22, revised as of 2010-07-01), State, Tribal, and EPA-Administered Underground Injection Control Programs, Florida: § 147.502 Aquifer exemptions. [Reserved]

  14. Simulation of seawater intrusion in coastal aquifers: Some typical ...

    Indian Academy of Sciences (India)

    Keywords: seawater intrusion; coastal aquifers; density-dependent flow and ... The seawater intrusion mechanism in coastal aquifers generally causes the occurrence of ... (4) The dynamic viscosity of the fluid does not change with respect to salinity and.

  15. Ground Water movement in crystalline rock aquifers

    International Nuclear Information System (INIS)

    Serejo, A.N.C.; Freire, C.; Siqueira, H.B. de; Frischkorn, H.; Torquato, J.R.F.; Santiago, M.M.F.; Barbosa, P.C.

    1984-01-01

    Ground water movement studies were performed in crystalline rock aquifers from the upper Acarau River hydrographic basin, state of Ceara, Brazil. The studies included carbon-14, ¹⁸O/¹⁶O and tritium measurements as well as chemical analyses. A total of 35 wells were surveyed during drought seasons. Carbon-14 values displayed little variation, which implied that the water use was adequate despite the slower recharge conditions. Fairly constant ¹⁸O/¹⁶O ratios in the wells, and their similarity to rainwater values, indicated that recharge is done exclusively by pluvial waters. A decreasing tendency in the tritium concentrations was interpreted as periodic rainwater renewal of these aquifers. The chemical analyses demonstrated that there is in fact no correlation between salinity and the time the water remains in the aquifer itself. (D.J.M.)

  16. Unconsolidated Aquifers in Tompkins County, New York

    Science.gov (United States)

    Miller, Todd S.

    2000-01-01

    Unconsolidated aquifers consisting of saturated sand and gravel are capable of supplying large quantities of good-quality water to wells in Tompkins County, but little published geohydrologic information on such aquifers is available. In 1986, the U.S. Geological Survey (USGS) began collecting geohydrologic information and well data to construct an aquifer map showing the extent of unconsolidated aquifers in Tompkins County. Data sources included (1) water-well drillers' logs; (2) highway and other construction test-boring logs; (3) well data gathered by the Tompkins County Department of Health; (4) test-well logs from geohydrologic consultants that conducted site-specific studies; and (5) well data that had been collected during past investigations by the USGS and entered into the National Water Information System (NWIS) database. In 1999, the USGS, in cooperation with the Tompkins County Department of Planning, compiled these data to construct this map. More than 600 well records were entered into the NWIS database in 1999 to supplement the 350 well records already in the database, providing a total of 950 well records. The data were digitized and imported into a geographic information system (GIS) coverage so that well locations could be plotted on a map and well data could be tabulated in a digital database through ARC/INFO software. Data on the surficial geology were used with geohydrologic data from well records and previous studies to delineate the extent of aquifers on this map. This map depicts (1) the extent of unconsolidated aquifers in Tompkins County, and (2) locations of wells whose records were entered into the USGS NWIS database and made into a GIS digital coverage. The hydrologic information presented here is generalized and is not intended for detailed site evaluations. Precise locations of geohydrologic-unit boundaries, and a description of the hydrologic conditions within the units, would require additional detailed, site

  17. Aquifer thermal energy storage. International symposium: Proceedings

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-05-01

    Aquifers have been used to store large quantities of thermal energy to supply process cooling, space cooling, space heating, and ventilation air preheating, and can be used with or without heat pumps. Aquifers are used as energy sinks and sources when supply and demand for energy do not coincide. Aquifer thermal energy storage may be used on a short-term or long-term basis; as the sole source of energy or as a partial storage; at a temperature useful for direct application or needing upgrade. The sources of energy used for aquifer storage are ambient air, usually cold winter air; waste or by-product energy; and renewable energy such as solar. The present technical, financial and environmental status of ATES is promising. Numerous projects are operating and under development in several countries. These projects are listed and results from Canada and elsewhere are used to illustrate the present status of ATES. Technical obstacles have been addressed and have largely been overcome. Cold storage in aquifers can be seen as a standard design option in the near future as it presently is in some countries. The cost-effectiveness of aquifer thermal energy storage is based on the capital cost avoidance of conventional chilling equipment and energy savings. ATES is one of many developments in energy efficient building technology and its success depends on relating it to important building market and environmental trends. This paper attempts to provide guidance for the future implementation of ATES. Individual projects have been processed separately for entry onto the Department of Energy databases.

  18. Big Data for Precision Medicine

    Directory of Open Access Journals (Sweden)

    Daniel Richard Leff

    2015-09-01

    Full Text Available This article focuses on the potential impact of big data analysis to improve health, prevent and detect disease at an earlier stage, and personalize interventions. The role that big data analytics may have in interrogating the patient electronic health record toward improved clinical decision support is discussed. We examine developments in pharmacogenetics that have increased our appreciation of the reasons why patients respond differently to chemotherapy. We also assess the expansion of online health communications and the way in which this data may be capitalized on in order to detect public health threats and control or contain epidemics. Finally, we describe how a new generation of wearable and implantable body sensors may improve wellbeing, streamline management of chronic diseases, and improve the quality of surgical implants.

  19. Big Data hvor N=1

    DEFF Research Database (Denmark)

    Bardram, Jakob Eyvind

    2017-01-01

    Research on the use of ’big data’ in health has only just begun, and may in time become a great help in organizing more personal and holistic healthcare for people with multiple chronic conditions. Personal health technology, which is briefly presented in this chapter, holds great potential for carrying out ’big data’ analyses for the individual person, that is, where N=1. There are major technological challenges in developing technologies and methods to collect and handle personal data that can be shared across boundaries in a standardized, responsible, robust, secure and not...

  20. George and the big bang

    CERN Document Server

    Hawking, Lucy; Parsons, Gary

    2012-01-01

    George has problems. He has twin baby sisters at home who demand his parents’ attention. His beloved pig Freddy has been exiled to a farm, where he’s miserable. And worst of all, his best friend, Annie, has made a new friend whom she seems to like more than George. So George jumps at the chance to help Eric with his plans to run a big experiment in Switzerland that seeks to explore the earliest moment of the universe. But there is a conspiracy afoot, and a group of evildoers is planning to sabotage the experiment. Can George repair his friendship with Annie and piece together the clues before Eric’s experiment is destroyed forever? This engaging adventure features essays by Professor Stephen Hawking and other eminent physicists about the origins of the universe and ends with a twenty-page graphic novel that explains how the Big Bang happened—in reverse!

  1. Did the Big Bang begin?

    International Nuclear Information System (INIS)

    Levy-Leblond, J.

    1990-01-01

    It is argued that the age of the universe may well be numerically finite (20 billion years or so) and conceptually infinite. A new and natural time scale is defined on a physical basis using group-theoretical arguments. An additive notion of time is obtained according to which the age of the universe is indeed infinite. In other words, never did the Big Bang begin. This new time scale is not supposed to replace the ordinary cosmic time scale, but to supplement it (in the same way as rapidity has taken a place by the side of velocity in Einsteinian relativity). The question is discussed within the framework of conventional (big-bang) and classical (nonquantum) cosmology, but could easily be extended to more elaborate views, as the purpose is not so much to modify present theories as to reach a deeper understanding of their meaning

  2. Big Data in Drug Discovery.

    Science.gov (United States)

    Brown, Nathan; Cambruzzi, Jean; Cox, Peter J; Davies, Mark; Dunbar, James; Plumbley, Dean; Sellwood, Matthew A; Sim, Aaron; Williams-Jones, Bryn I; Zwierzyna, Magdalena; Sheppard, David W

    2018-01-01

    Interpretation of Big Data in the drug discovery community should enhance project timelines and reduce clinical attrition through improved early decision making. The issues we encounter start with the sheer volume of data and how we first ingest it before building an infrastructure to house it to make use of the data in an efficient and productive way. There are many problems associated with the data itself including general reproducibility, but often, it is the context surrounding an experiment that is critical to success. Help, in the form of artificial intelligence (AI), is required to understand and translate the context. On the back of natural language processing pipelines, AI is also used to prospectively generate new hypotheses by linking data together. We explain Big Data from the context of biology, chemistry and clinical trials, showcasing some of the impressive public domain sources and initiatives now available for interrogation. © 2018 Elsevier B.V. All rights reserved.

  3. Big Data and central banks

    OpenAIRE

    David Bholat

    2015-01-01

    This commentary recaps a Centre for Central Banking Studies event held at the Bank of England on 2–3 July 2014. The article covers three main points. First, it situates the Centre for Central Banking Studies event within the context of the Bank’s Strategic Plan and initiatives. Second, it summarises and reflects on major themes from the event. Third, the article links central banks’ emerging interest in Big Data approaches with their broader uptake by other economic agents.

  4. Big Bang or vacuum fluctuation

    International Nuclear Information System (INIS)

    Zel'dovich, Ya.B.

    1980-01-01

    Some general properties of vacuum fluctuations in quantum field theory are described. The connection between the ''energy dominance'' of the energy density of vacuum fluctuations in curved space-time and the presence of singularity is discussed. It is pointed out that a de-Sitter space-time (with the energy density of the vacuum fluctuations in the Einstein equations) that matches the expanding Friedman solution may describe the history of the Universe before the Big Bang. (P.L.)

  5. Big bang is not needed

    Energy Technology Data Exchange (ETDEWEB)

    Allen, A.D.

    1976-02-01

    Recent computer simulations indicate that a system of n gravitating masses breaks up, even when the total energy is negative. As a result, almost any initial phase-space distribution results in a universe that eventually expands under the Hubble law. Hence Hubble expansion implies little regarding an initial cosmic state. Especially it does not imply the singularly dense superpositioned state used in the big bang model.

  6. Modelling contaminant transport in saturated aquifers

    International Nuclear Information System (INIS)

    Lakshminarayana, V.; Nayak, T.R.

    1990-01-01

    With the increase in population and industrialization the problem of pollution of groundwater has become critical. The present study deals with modelling of pollutant transport through saturated aquifers. Using this model it is possible to predict the concentration distribution, spatial as well as temporal, in the aquifer. The paper also deals with one of the methods of controlling the pollutant movement, namely by pumping wells. A simulation model is developed to determine the number, location and rate of pumping of a number of wells near the source of pollution so that the concentration is within acceptable limits at the point of interest. (Author) (18 refs., 14 figs., tab.)
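
    The record does not print the model's governing equations, but the usual starting point for such a transport model is the one-dimensional advection-dispersion equation, whose continuous-source solution (Ogata-Banks) illustrates the kind of spatial and temporal concentration distribution the study predicts. A sketch with placeholder transport parameters, not values from the study:

        import math

        def ogata_banks(x, t, v, D, c0=1.0):
            """Ogata-Banks solution of the 1-D advection-dispersion equation
            for a continuous source of concentration c0 at x = 0."""
            a = (x - v * t) / (2.0 * math.sqrt(D * t))
            b = (x + v * t) / (2.0 * math.sqrt(D * t))
            # exp(v*x/D) can overflow; cap the exponent before pairing
            # it with the (tiny) erfc term.
            second = math.exp(min(v * x / D, 700.0)) * math.erfc(b)
            return 0.5 * c0 * (math.erfc(a) + second)

        # Relative concentration 100 m downgradient after 1 to 5 years
        # (placeholder seepage velocity and dispersion coefficient):
        v, D = 0.05, 0.5   # m/day, m^2/day
        for years in (1, 2, 5):
            t = years * 365.25
            print(years, round(ogata_banks(100.0, t, v, D), 4))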

  7. Comparison of groundwater flow in Southern California coastal aquifers

    Science.gov (United States)

    Hanson, Randall T.; Izbicki, John A.; Reichard, Eric G.; Edwards, Brian D.; Land, Michael; Martin, Peter

    2009-01-01

    Development of the coastal aquifer systems of Southern California has resulted in overdraft, changes in streamflow, seawater intrusion, land subsidence, increased vertical flow between aquifers, and a redirection of regional flow toward pumping centers. These water-management challenges can be more effectively addressed by incorporating new understanding of the geologic, hydrologic, and geochemical setting of these aquifers.

  8. Big data: the management revolution.

    Science.gov (United States)

    McAfee, Andrew; Brynjolfsson, Erik

    2012-10-01

    Big data, the authors write, is far more powerful than the analytics of the past. Executives can measure and therefore manage more precisely than ever before. They can make better predictions and smarter decisions. They can target more-effective interventions in areas that so far have been dominated by gut and intuition rather than by data and rigor. The differences between big data and analytics are a matter of volume, velocity, and variety: More data now cross the internet every second than were stored in the entire internet 20 years ago. Nearly real-time information makes it possible for a company to be much more agile than its competitors. And that information can come from social networks, images, sensors, the web, or other unstructured sources. The managerial challenges, however, are very real. Senior decision makers have to learn to ask the right questions and embrace evidence-based decision making. Organizations must hire scientists who can find patterns in very large data sets and translate them into useful business information. IT departments have to work hard to integrate all the relevant internal and external sources of data. The authors offer two success stories to illustrate how companies are using big data: PASSUR Aerospace enables airlines to match their actual and estimated arrival times. Sears Holdings directly analyzes its incoming store data to make promotions much more precise and faster.

  9. Big Data Comes to School

    Directory of Open Access Journals (Sweden)

    Bill Cope

    2016-03-01

    Full Text Available The prospect of “big data” at once evokes optimistic views of an information-rich future and concerns about surveillance that adversely impacts our personal and private lives. This overview article explores the implications of big data in education, focusing by way of example on data generated by student writing. We have chosen writing because it presents particular complexities, highlighting the range of processes for collecting and interpreting evidence of learning in the era of computer-mediated instruction and assessment as well as the challenges. Writing is significant not only because it is central to the core subject area of literacy; it is also an ideal medium for the representation of deep disciplinary knowledge across a number of subject areas. After defining what big data entails in education, we map emerging sources of evidence of learning that separately and together have the potential to generate unprecedented amounts of data: machine assessments, structured data embedded in learning, and unstructured data collected incidental to learning activity. Our case is that these emerging sources of evidence of learning have significant implications for the traditional relationships between assessment and instruction. Moreover, for educational researchers, these data are in some senses quite different from traditional evidentiary sources, and this raises a number of methodological questions. The final part of the article discusses implications for practice in an emerging field of education data science, including publication of data, data standards, and research ethics.

  10. Hydrologic conditions and distribution of selected radiochemical and chemical constituents in water, Snake River Plain aquifer, Idaho National Engineering Laboratory, Idaho, 1989 through 1991

    International Nuclear Information System (INIS)

    Bartholomay, R.C.; Orr, B.R.; Liszewski, M.J.; Jensen, R.G.

    1995-08-01

    Radiochemical and chemical wastewater discharged since 1952 to infiltration ponds and disposal wells at the Idaho National Engineering Laboratory (INEL) has affected water quality in the Snake River Plain aquifer. The U.S. Geological Survey, in cooperation with the U.S. Department of Energy, maintains a continuous monitoring network at the INEL to determine hydrologic trends and to delineate the movement of radiochemical and chemical wastes in the aquifer. This report presents an analysis of water-level and water-quality data collected from the Snake River Plain aquifer during 1989-91. Water in the eastern Snake River Plain aquifer moves principally through fractures and interflow zones in basalt, generally flows southwestward, and eventually discharges at springs along the Snake River. The aquifer is recharged principally from irrigation water, infiltration of streamflow, and ground-water inflow from adjoining mountain drainage basins. Water levels in wells throughout the INEL generally declined during 1989-91 due to drought. Detectable concentrations of radiochemical constituents in water samples from wells in the Snake River Plain aquifer at the INEL decreased or remained constant during 1989-91. Decreased concentrations are attributed to reduced rates of radioactive-waste disposal, sorption processes, radioactive decay, and changes in waste-disposal practices. Detectable concentrations of chemical constituents in water from the Snake River Plain aquifer at the INEL were variable during 1989-91. Sodium and chloride concentrations in the southern part of the INEL increased slightly during 1989-91 because of increased waste-disposal rates and a lack of recharge from the Big Lost River. Plumes of 1,1,1-trichloroethane have developed near the Idaho Chemical Processing Plant and the Radioactive Waste Management Complex as a result of waste disposal practices

  11. Aquifer geochemistry at potential aquifer storage and recovery sites in coastal plain aquifers in the New York city area, USA

    Science.gov (United States)

    Brown, C.J.; Misut, P.E.

    2010-01-01

    The effects of injecting oxic water from the New York city (NYC) drinking-water supply and distribution system into a nearby anoxic coastal plain aquifer for later recovery during periods of water shortage (aquifer storage and recovery, or ASR) were simulated by a 3-dimensional, reactive-solute transport model. The Cretaceous aquifer system in the NYC area of New York and New Jersey, USA, contains pyrite, goethite, locally occurring siderite, lignite, and locally varying amounts of dissolved Fe and salinity. Sediment from cores drilled on Staten Island and western Long Island had high extractable concentrations of Fe, Mn, and acid volatile sulfides (AVS) plus chromium-reducible sulfides (CRS) and low concentrations of As, Pb, Cd, Cr, Cu and U. Similarly, water samples from the Lloyd aquifer (Cretaceous) in western Long Island generally contained high concentrations of Fe and Mn and low concentrations of other trace elements such as As, Pb, Cd, Cr, Cu and U, all of which were below US Environmental Protection Agency (USEPA) and NY maximum contaminant levels (MCLs). In such aquifer settings, ASR operations can be complicated by the oxidative dissolution of pyrite, low pH, and high concentrations of dissolved Fe in extracted water. The simulated injection of buffered, oxic city water into a hypothetical ASR well increased the hydraulic head at the well, displaced the ambient groundwater, and formed a spheroid of injected water around the ASR well with lower concentrations of Fe, Mn, and major ions than in the ambient water. Both the dissolved O2 concentrations and the pH of water near the well generally increased in magnitude during the simulated 5-a injection phase. The resultant oxidation of Fe2+ and attendant precipitation of goethite during injection provided a substrate for sorption of dissolved Fe during the 8-a extraction phase. The baseline scenario with a low (0.001M) concentration of pyrite in aquifer sediments indicated that nearly 190% more water
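    The scale of the injected freshwater body in a simulation like this can be sanity-checked with a back-of-envelope calculation. The sketch below idealizes the "spheroid" as a cylinder of injected water around the well; the injection rate, phase length, aquifer thickness, and porosity are hypothetical placeholders, not values from the study.

```python
import math

# Back-of-envelope size of the injected freshwater body around an ASR well,
# idealized as a cylinder (the "spheroid" of the abstract); all values are
# hypothetical illustrations, not the model's inputs.
rate = 2000.0          # m^3/day injection rate (invented)
years = 5.0            # length of the injection phase (as in the scenario)
thickness = 30.0       # m, screened aquifer thickness (invented)
porosity = 0.30        # effective porosity (invented)

volume = rate * 365.0 * years                      # total injected volume, m^3
radius = math.sqrt(volume / (math.pi * thickness * porosity))
print(f"idealized radius of injected water body: {radius:.0f} m")
```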

  12. Investigating river–aquifer relations using water temperature in an anthropized environment (Motril-Salobreña aquifer)

    DEFF Research Database (Denmark)

    Duque, Carlos; Calvache, Marie; Engesgaard, Peter Knudegaard

    2010-01-01

    Heat was applied as a tracer for determining river–aquifer relations in the Motril-Salobreña aquifer (S Spain). The aquifer has typically been recharged by infiltration from the River Guadalfeo; however, from 2005 the construction of a dam changed the traditional dynamics of river flow and recharge events...

  13. Hydrological controls on transient aquifer storage in a karst watershed

    Science.gov (United States)

    Spellman, P.; Martin, J.; Gulley, J. D.

    2017-12-01

    While surface storage of floodwaters is well known to attenuate flood peaks, transient storage of floodwaters in aquifers is a less recognized mechanism of flood peak attenuation. The hydraulic gradient from aquifer to river controls the magnitude of transient aquifer storage and is ultimately a function of aquifer hydraulic conductivity and effective porosity. Because bedrock and granular aquifers tend to have lower hydraulic conductivities and porosities, their ability to attenuate flood peaks is generally small. In karst aquifers, however, extensive cave systems create high hydraulic conductivities and porosities that create low antecedent hydraulic gradients between aquifers and rivers. Cave springs can reverse flow during high discharges in rivers, temporarily storing floodwaters in the aquifer and thus reducing the magnitude of flood discharge downstream. To date, however, very few studies have quantified the magnitude or controls of transient aquifer storage in karst watersheds. We therefore investigate controls on transient aquifer storage by using 10 years of river and groundwater data from the Suwannee River Basin, which flows over the karstic upper Floridan aquifer in north-central Florida. We use multiple linear regression to compare the effects of three hydrological controls on the magnitude of transient aquifer storage: antecedent stage, recharge, and slope of hydrograph rise. We show the dominant control on transient aquifer storage is antecedent stage, whereby lower stages result in greater magnitudes of transient aquifer storage. Our results suggest that measures of groundwater levels prior to an event can be useful in determining whether transient aquifer storage will occur and may provide a useful metric for improving predictions of flood magnitudes.
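    The comparison of controls described in this abstract amounts to a multiple linear regression with standardized predictors. The following is a minimal sketch of that idea on synthetic stand-in data; the variable ranges and coefficients are invented and do not come from the Suwannee River Basin study.

```python
import numpy as np

# Hypothetical predictors for n flood events (synthetic stand-ins for the
# Suwannee River Basin records used in the study):
#   antecedent_stage  - river stage before the event (m)
#   recharge          - event recharge (cm)
#   rise_slope        - slope of the hydrograph rise (m/day)
# Response: transient aquifer storage for the event (10^6 m^3).
rng = np.random.default_rng(0)
n = 120
antecedent_stage = rng.uniform(1.0, 8.0, n)
recharge = rng.uniform(0.0, 15.0, n)
rise_slope = rng.uniform(0.1, 2.0, n)
storage = 40 - 4.5 * antecedent_stage + 0.8 * recharge + 2.0 * rise_slope \
          + rng.normal(0.0, 3.0, n)          # synthetic "observed" storage

# Standardize predictors so coefficient magnitudes are comparable,
# mimicking the comparison of the three hydrological controls.
X = np.column_stack([antecedent_stage, recharge, rise_slope])
Xz = (X - X.mean(axis=0)) / X.std(axis=0)
A = np.column_stack([np.ones(n), Xz])        # add intercept column

beta, *_ = np.linalg.lstsq(A, storage, rcond=None)
for name, b in zip(["antecedent stage", "recharge", "rise slope"], beta[1:]):
    print(f"{name:17s} standardized coefficient: {b:+.2f}")
# A large negative coefficient on antecedent stage reproduces the paper's
# finding: lower pre-event stages allow greater transient storage.
```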

  14. Can Remote Sensing Detect Aquifer Characteristics?: A Case Study in the Guarani Aquifer System

    Science.gov (United States)

    Richey, A. S.; Thomas, B.; Famiglietti, J. S.

    2013-12-01

    Global water supply resiliency depends on groundwater, especially in regions threatened by population growth and climate change. Aquifer characteristics, even ones as basic as confined versus unconfined, are necessary to prescribe regulations to sustainably manage groundwater supplies. A significant barrier to sustainable groundwater management lies in the difficulties associated with mapping groundwater resources and characteristics at a large spatial scale. This study addresses this challenge by investigating whether remote sensing, including NASA's Gravity Recovery and Climate Experiment (GRACE), can detect and quantify key aquifer parameters and characteristics. We explore this through a case study in the Guarani Aquifer System (GAS) of South America, validating our remote sensing-based findings against the best available regional estimates. The use of remote sensing to advance the understanding of large aquifers is beneficial to sustainable groundwater management, especially in a trans-boundary system, where consistent information exchange can occur within hydrologic boundaries instead of political boundaries.

  15. State Aquifer Recharge Atlas Plates, Geographic NAD83, LDEQ (1999) [aquifer_recharge_potential_LDEQ_1988

    Data.gov (United States)

    Louisiana Geographic Information Center — This is a polygon dataset depicting the boundaries of aquifer systems in the state of Louisiana and adjacent areas of Texas, Arkansas and a portion of Mississippi....

  16. Nutrient Removal during Stormwater Aquifer Storage and Recovery in an Anoxic Carbonate Aquifer.

    Science.gov (United States)

    Vanderzalm, Joanne L; Page, Declan W; Dillon, Peter J; Barry, Karen E; Gonzalez, Dennis

    2018-03-01

    Stormwater harvesting coupled to managed aquifer recharge (MAR) provides a means to use the often wasted stormwater resource while also providing protection of the natural and built environment. Aquifers can act as a treatment barrier within a multiple-barrier approach to harvest and use urban stormwater. However, it remains challenging to assess the treatment performance of a MAR scheme due to the heterogeneity of aquifers and MAR operations, which in turn influences water treatment processes. This study uses a probabilistic method to evaluate aquifer treatment performance based on the removal of total organic C (TOC), N, and P during MAR with urban stormwater in an anoxic carbonate aquifer. Total organic C, N, and P are represented as stochastic variables and described by probability density functions (PDFs) for the "injectant" and "recovery"; these injectant and recovery PDFs are used to derive a theoretical MAR removal efficiency PDF. Four long-term MAR sites targeting one of two tertiary carbonate aquifers (T1 and T2) were used to describe the nutrient removal efficiencies. Removal of TOC and total N (TN) was dominated by redox processes, with median removal of TOC between 50 and 60% at all sites and TN from 40 to 50% at three sites with no change at the fourth. Total P removal due to filtration and sorption accounted for median removal of 29 to 53%. Thus, the statistical method was able to characterize the capacity of the anoxic carbonate aquifer treatment barrier for nutrient removal, which highlights that aquifers can be an effective long-term natural treatment option for management of water quality, as well as storage of urban stormwater. Copyright © by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America, Inc.
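    The derivation of a removal-efficiency PDF from injectant and recovery PDFs can be approximated by Monte Carlo sampling. The sketch below assumes lognormal TOC distributions with invented parameters (and independence between injectant and recovery concentrations), not the distributions fitted to the four MAR sites.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000

# Hypothetical lognormal PDFs for total organic carbon (mg/L); the real
# study fitted distributions to monitoring data from four MAR sites.
injectant = rng.lognormal(mean=np.log(10.0), sigma=0.4, size=N)  # "injectant" TOC
recovery = rng.lognormal(mean=np.log(4.5), sigma=0.5, size=N)    # "recovery" TOC

# Removal-efficiency PDF derived from the two concentration PDFs
# (independent sampling is a simplifying assumption).
efficiency = 100.0 * (1.0 - recovery / injectant)

p5, p50, p95 = np.percentile(efficiency, [5, 50, 95])
print(f"median removal: {p50:.0f}%  (5th-95th percentile: {p5:.0f}% to {p95:.0f}%)")
```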

  17. Risk assessment and management of an oil contaminated aquifer

    International Nuclear Information System (INIS)

    Braxein, A.; Daniels, H.; Rouve, G.; Rubin, H.

    1991-01-01

    This paper concerns the basic information needed for decision making about remedial measures leading to the reuse of an oil-contaminated aquifer. The study refers to the case history of jet fuel contamination of an aquifer comprising part of the coastal aquifer of Israel. Because of that contamination, two major water-supply wells were abandoned. This study examines the use of numerical simulations to reconstruct the contamination history of the aquifer. Such simulations also provide quantitative information needed for decision making about the future management of the contaminated aquifer

  18. Remediation of a contaminated thin aquifer by horizontal wells

    Energy Technology Data Exchange (ETDEWEB)

    Breh, W.; Suttheimer, J.; Hoetzl, H. [Univ. of Karlsruhe (Germany); Frank, K. [GEO-Service GmbH, Rheinmuenster (Germany)

    1997-12-31

    At an industrial site in Bruchsal (Germany), extensive trichloroethene contamination was found. After conventional remedial actions proved largely ineffective, new investigations revealed a highly contaminated thin aquifer above the main aquifer. The investigation and the beginning of the remediation of the thin aquifer by two horizontal wells are described in this paper. Special attention was given to the relation between precipitation and the flow direction in the thin aquifer and to hydraulic connections between the thin and the main aquifer. A short introduction to the new remedial technique of horizontal wells and first results from their test phase are also given.

  19. Turning big bang into big bounce. I. Classical dynamics

    Science.gov (United States)

    Dzierżak, Piotr; Małkiewicz, Przemysław; Piechocki, Włodzimierz

    2009-11-01

    The big bounce (BB) transition within a flat Friedmann-Robertson-Walker model is analyzed in the setting of loop geometry underlying the loop cosmology. We solve the constraint of the theory at the classical level to identify physical phase space and find the Lie algebra of the Dirac observables. We express energy density of matter and geometrical functions in terms of the observables. It is the modification of classical theory by the loop geometry that is responsible for BB. The classical energy scale specific to BB depends on a parameter that should be fixed either by cosmological data or determined theoretically at quantum level, otherwise the energy scale stays unknown.

  20. Groundwater sustainability assessment in coastal aquifers

    Indian Academy of Sciences (India)

    The present work investigates the response of shallow, coastal unconfined aquifers to anticipated overdraft conditions and climate change effect using numerical simulation. The groundwater flow model MODFLOW and variable density groundwater model SEAWAT are used for this investigation. The transmissivity and ...

  1. Biogeochemical aspects of aquifer thermal energy storage

    NARCIS (Netherlands)

    Brons, H.J.

    1992-01-01

    During the process of aquifer thermal energy storage the in situ temperature of the groundwater- sediment system may fluctuate significantly. As a result the groundwater characteristics can be considerably affected by a variety of chemical, biogeochemical and microbiological processes

  2. Hydrochemical characterization of groundwater aquifer using ...

    African Journals Online (AJOL)

    Hydrochemical data analysis revealed four sources of solutes. The processes responsible for their enrichment include: chemical weathering, leaching of the overlying sediments, domestic activities, climatic condition and the flow pattern of the aquifer. The factors have contributed to the changes of the groundwater chemistry ...

  3. Transient well flow in vertically heterogeneous aquifers.

    NARCIS (Netherlands)

    Hemker, C.J.

    1999-01-01

    A solution for the general problem of computing well flow in vertically heterogeneous aquifers is found by an integration of both analytical and numerical techniques. The radial component of flow is treated analytically; the drawdown is a continuous function of the distance to the well. The

  4. Aquifer restoration: state of the art

    National Research Council Canada - National Science Library

    Knox, Robert C; Knox, R. C

    1986-01-01

    ... of chemicals or waste materials, improper underground injection of liquid wastes, and placement of septic tank systems in hydrologically and geologically unsuitable locations. Incidents of aquifer pollution from man's waste disposal activities have been discovered with increasing regularity. At the same time, demands for groundwater usage have been inc...

  5. Sedimentological analysis of a contaminated groundwater aquifer

    International Nuclear Information System (INIS)

    Towse, D.

    1991-01-01

    The use of sedimentological reservoir analysis techniques adapted from standard oilfield practice can improve the efficiency and reduce the costs of evaluating groundwater aquifers and designing restoration programs. An evaluation/restoration program at a site in California drilled over 200 test wells within about 750 acres. All wells were logged lithologically and with wireline logs. The shallow aquifer is a complex braided alluvial floodplain deposit of Late Quaternary age. Analysis demonstrates depositional and erosional responses to periodic hinterland uplifts and to changing climatic conditions. Channel, overbank, lacustrine, and minor deltaic deposits can be recognized. The aquifer architecture has been interpreted to explain the movement of fuel and halogenated hydrocarbon solvents in the sediments and water. Routine engineering geology techniques and hydrologic tests were used to evaluate contamination and to design experimental restoration processes. As demonstrated here, sedimentological techniques show promise in reducing the costs and time required for this type of study. The abundant detailed data will be used in an attempt to develop a microcomputer-based expert system for rapid preliminary analyses of similar aquifers or reservoirs

  6. VULNERABILITY AND RISK OF CONTAMINATION KARSTIC AQUIFERS

    Directory of Open Access Journals (Sweden)

    Yameli Aguilar

    2013-08-01

    Full Text Available Karstic systems occupy nearly 20% of the surface of the earth and are inhabited by numerous human communities. Karstic aquifers are the most exposed to pollution from human activities. Pollution of karstic aquifers is a severe environmental problem worldwide.  In order to face the vulnerability of karstic aquifers to pollution, researchers have created a diversity of study approaches and models, each one having their own strengths and weaknesses depending on the discipline from which they were originated, thus requiring a thorough discussion within the required multidisciplinary character. The objective of this article was to analyze the theoretical and methodological approaches applied to the pollution of karstic aquifers. The European hydrogeological, land evaluation, hydropedological and a geographic approach were analyzed. The relevance of a geomorphological analysis as a cartographic basis for the analysis of vulnerability and risks were emphasized. From the analysis of models, approaches and methodologies discussed the following recommendation is made: to form an interdisciplinary work team, to elaborate a conceptual model according to the site and the working scale and to e, apply and validate the model.

  7. isotopic characteristics of aquifers in sinai

    International Nuclear Information System (INIS)

    Al-Gamal, S.A.

    2004-01-01

    The environmental isotope data (expressed as δ2H and δ18O) of different aquifers in Sinai were treated using correlation and regression techniques, whereas rainwater isotopic data were treated using empirical orthogonal function (EOF) techniques. Environmental isotopes for the different aquifers, expressed in terms of O-18 and H-2, were taken to represent the isotopic characteristics. Regression equations using the highly correlated variables δ2H and δ18O were constructed for each aquifer. The latitudinal variations (of rainwater in Sinai and at selected climatic stations in the east Mediterranean) versus rainwater isotopic composition were analyzed using normalized variables. It was found that the latitudinal variations of rainwater isotopic composition (δ2H, δ18O), vapor pressure, and surface temperature occur in parallel and decrease with latitude. In the east Mediterranean, an empirical linear relationship between altitude and δ2H indicated that the rate of change of δ2H with height is comparable with the dry lapse rate in the atmosphere. The regression equations obtained from the environmental isotope data yielded different slopes and different constants, expressing the non-homogeneity in the isotopic composition of the rainwater recharging the aquifers of Sinai due to the presence of different air masses

  8. Big Data Knowledge in Global Health Education.

    Science.gov (United States)

    Olayinka, Olaniyi; Kekeh, Michele; Sheth-Chandra, Manasi; Akpinar-Elci, Muge

    The ability to synthesize and analyze massive amounts of data is critical to the success of organizations, including those that involve global health. As countries become highly interconnected, increasing the risk for pandemics and outbreaks, the demand for big data is likely to increase. This requires a global health workforce that is trained in the effective use of big data. To assess implementation of big data training in global health, we conducted a pilot survey of members of the Consortium of Universities of Global Health. More than half the respondents did not have a big data training program at their institution. Additionally, the majority agreed that big data training programs will improve global health deliverables, among other favorable outcomes. Given the observed gap and benefits, global health educators may consider investing in big data training for students seeking a career in global health. Copyright © 2017 Icahn School of Medicine at Mount Sinai. Published by Elsevier Inc. All rights reserved.

  9. Big Data Strategy for Telco: Network Transformation

    OpenAIRE

    F. Amin; S. Feizi

    2014-01-01

    Big data has the potential to improve the quality of services; enable infrastructure that businesses depend on to adapt continually and efficiently; improve the performance of employees; help organizations better understand customers; and reduce liability risks. Analytics and marketing models of fixed and mobile operators are falling short in combating churn and declining revenue per user. Big data presents new methods to reverse this trend and improve profitability. The benefits of Big Data and ...

  10. Big Data in Shipping - Challenges and Opportunities

    OpenAIRE

    Rødseth, Ørnulf Jan; Perera, Lokukaluge Prasad; Mo, Brage

    2016-01-01

    Big Data is getting popular in shipping, where large amounts of information are collected to better understand and improve logistics, emissions, energy consumption and maintenance. Constraints to the use of big data include cost and quality of on-board sensors and data acquisition systems, satellite communication, data ownership and technical obstacles to effective collection and use of big data. New protocol standards may simplify the process of collecting and organizing the data, including in...

  11. Hydrogeology - AQUIFER_SYSTEMS_BEDROCK_IDNR_IN: Bedrock Aquifer Systems of Indiana (Indiana Department of Natural Resources, 1:500,000, Polygon Shapefile)

    Data.gov (United States)

    NSGIC State | GIS Inventory — AQUIFER_SYSTEMS_BEDROCK_IDNR_IN is a polygon shapefile that shows bedrock aquifer systems of the State of Indiana. The source scale of the map depicting the aquifers...

  12. Big Data in Action for Government : Big Data Innovation in Public Services, Policy, and Engagement

    OpenAIRE

    World Bank

    2017-01-01

    Governments have an opportunity to harness big data solutions to improve productivity, performance and innovation in service delivery and policymaking processes. In developing countries, governments have an opportunity to adopt big data solutions and leapfrog traditional administrative approaches

  13. Arsenic release during managed aquifer recharge (MAR)

    Science.gov (United States)

    Pichler, T.; Lazareva, O.; Druschel, G.

    2013-12-01

    The mobilization and addition of geogenic trace metals to groundwater is typically caused by anthropogenic perturbations of the physicochemical conditions in the aquifer. This can add dangerously high levels of toxins to groundwater, thus compromising its use as a source of drinking water. In several regions world-wide, aquifer storage and recovery (ASR), a form of managed aquifer recharge (MAR), faces the problem of arsenic release due to the injection of oxygenated storage water. To better understand this process we coupled geochemical reactive transport modeling to bench-scale leaching experiments to investigate and verify the mobilization of geogenic arsenic (As) under a range of redox conditions from an arsenic-rich, pyrite-bearing limestone aquifer in Central Florida. Modeling and experimental observations showed similar results and confirmed the following: (1) native groundwater and aquifer matrix, including pyrite, were in chemical equilibrium, thus preventing the release of As due to pyrite dissolution under ambient conditions; (2) mixing of oxygen-rich surface water with oxygen-depleted native groundwater changed the redox conditions and promoted the dissolution of pyrite, and (3) the behavior of As along a flow path was controlled by a complex series of interconnected reactions. This included the oxidative dissolution of pyrite and simultaneous sorption of As onto neo-formed hydrous ferric oxides (HFO), followed by the reductive dissolution of HFO and secondary release of adsorbed As under reducing conditions. Arsenic contamination of drinking water in these systems is thus controlled by the re-equilibration of the system to more reducing conditions rather than a purely oxidative process.
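    The two-stage behavior described here (sorption of As onto neo-formed HFO during oxic injection, then secondary release when HFO reductively dissolves) can be caricatured with a toy mass balance. Every parameter below is invented for illustration; the study itself used coupled reactive transport modeling, not this simplification.

```python
# Toy mass-balance sketch of the As cycle described above; all parameters
# are invented for illustration and are not the study's calibrated values.
pyrite_As = 100.0      # ug of As hosted in pyrite per kg of matrix (invented)
frac_oxidized = 0.02   # fraction of pyrite oxidized by injected oxic water
kd_hfo = 25.0          # L/kg, linear sorption of As onto neo-formed HFO
matrix_per_litre = 4.0 # kg of matrix contacted per litre of pore water

# (1) Oxic injection: pyrite oxidation releases As, most of which sorbs
#     onto the hydrous ferric oxides (HFO) formed at the same time.
as_released = pyrite_As * frac_oxidized * matrix_per_litre   # ug/L basis
sorbed_fraction = kd_hfo / (kd_hfo + 1.0)                    # crude partitioning
as_on_hfo = as_released * sorbed_fraction
as_dissolved_oxic = as_released - as_on_hfo
print(f"oxic phase: {as_dissolved_oxic:.1f} ug/L dissolved, "
      f"{as_on_hfo:.1f} ug/L-equivalent held on HFO")

# (2) Return to reducing conditions: reductive dissolution of HFO
#     releases the sorbed As back to the groundwater.
frac_hfo_dissolved = 0.5
as_secondary = as_on_hfo * frac_hfo_dissolved
print(f"reducing phase: secondary release of {as_secondary:.1f} ug/L")
```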

  14. Big data optimization recent developments and challenges

    CERN Document Server

    2016-01-01

    The main objective of this book is to provide the necessary background to work with big data by introducing some novel optimization algorithms and codes capable of working in the big data setting, as well as introducing some applications in big data optimization for interested academics and practitioners, to benefit society, industry, academia, and government. Presenting applications in a variety of industries, this book will be useful for researchers aiming to analyze large-scale data. Several optimization algorithms for big data, including convergent parallel algorithms, the limited memory bundle algorithm, the diagonal bundle method, network analytics, and many more, are explored in this book.

  15. Medical big data: promise and challenges

    Directory of Open Access Journals (Sweden)

    Choong Ho Lee

    2017-03-01

    Full Text Available The concept of big data, commonly characterized by volume, variety, velocity, and veracity, goes far beyond the data type and includes aspects of data analysis, such as hypothesis-generating rather than hypothesis-testing approaches. Big data focuses on the temporal stability of an association rather than on causal relationships, and underlying probability distribution assumptions are frequently not required. Medical big data as material to be analyzed has various features that are not only distinct from big data of other disciplines, but also distinct from traditional clinical epidemiology. Big data technology has many areas of application in healthcare, such as predictive modeling and clinical decision support, disease or safety surveillance, public health, and research. Big data analytics frequently exploits analytic methods developed in data mining, including classification, clustering, and regression. Medical big data analyses are complicated by many technical issues, such as missing values, the curse of dimensionality, and bias control, and share the inherent limitations of observational studies, namely the inability to test causality resulting from residual confounding and reverse causation. Recently, propensity score analysis and instrumental variable analysis have been introduced to overcome these limitations, and they have accomplished a great deal. Many challenges, such as the absence of evidence of practical benefits of big data, methodological issues including legal and ethical issues, and clinical integration and utility issues, must be overcome to realize the promise of medical big data as the fuel of a continuous learning healthcare system that will improve patient outcomes and reduce waste in areas including nephrology.
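    Propensity score analysis, mentioned above as a remedy for residual confounding, can be illustrated in a few lines. The sketch below is a generic textbook-style example on synthetic data, not the authors' method: a logistic model estimates each subject's probability of treatment from confounders, and treated subjects are compared with their nearest-score controls.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Synthetic example of the propensity-score idea: treatment assignment and
# outcome both depend on confounders (age and a kidney-function measure),
# so a naive treated-vs-untreated comparison would be biased.
rng = np.random.default_rng(7)
n = 2000
age = rng.normal(60, 10, n)
egfr = rng.normal(70, 15, n)                      # hypothetical confounder
logit = 0.04 * (age - 60) - 0.03 * (egfr - 70)
treated = rng.random(n) < 1 / (1 + np.exp(-logit))
outcome = 0.5 * treated + 0.02 * (age - 60) - 0.03 * (egfr - 70) \
          + rng.normal(0, 1, n)                   # true treatment effect: 0.5

# Estimate propensity scores from the confounders.
X = np.column_stack([age, egfr])
ps = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]

# Greedy 1:1 nearest-neighbour matching on the propensity score
# (with replacement, for brevity).
controls = np.flatnonzero(~treated)
effects = []
for i in np.flatnonzero(treated):
    j = controls[np.argmin(np.abs(ps[controls] - ps[i]))]
    effects.append(outcome[i] - outcome[j])
print(f"matched estimate of treatment effect: {np.mean(effects):.2f} (true 0.5)")
```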

  16. Traffic information computing platform for big data

    Energy Technology Data Exchange (ETDEWEB)

    Duan, Zongtao, E-mail: ztduan@chd.edu.cn; Li, Ying, E-mail: ztduan@chd.edu.cn; Zheng, Xibin, E-mail: ztduan@chd.edu.cn; Liu, Yan, E-mail: ztduan@chd.edu.cn; Dai, Jiting, E-mail: ztduan@chd.edu.cn; Kang, Jun, E-mail: ztduan@chd.edu.cn [Chang' an University School of Information Engineering, Xi' an, China and Shaanxi Engineering and Technical Research Center for Road and Traffic Detection, Xi' an (China)

    2014-10-06

    The big data environment creates the data conditions for improving the quality of traffic information services. The goal of this article is to construct a traffic information computing platform for the big data environment. Through in-depth analysis of the connotation and technological characteristics of big data and traffic information services, a distributed traffic atomic information computing platform architecture is proposed. Under the big data environment, this type of traffic atomic information computing architecture helps to guarantee traffic safety and efficient operation, and more intelligent and personalized traffic information services can be provided to traffic information users.

  17. Medical big data: promise and challenges.

    Science.gov (United States)

    Lee, Choong Ho; Yoon, Hyung-Jin

    2017-03-01

    The concept of big data, commonly characterized by volume, variety, velocity, and veracity, goes far beyond the data type and includes aspects of data analysis, such as hypothesis-generating rather than hypothesis-testing approaches. Big data focuses on the temporal stability of an association rather than on causal relationships, and underlying probability distribution assumptions are frequently not required. Medical big data as material to be analyzed has various features that are not only distinct from big data of other disciplines, but also distinct from traditional clinical epidemiology. Big data technology has many areas of application in healthcare, such as predictive modeling and clinical decision support, disease or safety surveillance, public health, and research. Big data analytics frequently exploits analytic methods developed in data mining, including classification, clustering, and regression. Medical big data analyses are complicated by many technical issues, such as missing values, the curse of dimensionality, and bias control, and share the inherent limitations of observational studies, namely the inability to test causality resulting from residual confounding and reverse causation. Recently, propensity score analysis and instrumental variable analysis have been introduced to overcome these limitations, and they have accomplished a great deal. Many challenges, such as the absence of evidence of practical benefits of big data, methodological issues including legal and ethical issues, and clinical integration and utility issues, must be overcome to realize the promise of medical big data as the fuel of a continuous learning healthcare system that will improve patient outcomes and reduce waste in areas including nephrology.

  18. Traffic information computing platform for big data

    International Nuclear Information System (INIS)

    Duan, Zongtao; Li, Ying; Zheng, Xibin; Liu, Yan; Dai, Jiting; Kang, Jun

    2014-01-01

    The big data environment creates the data conditions for improving the quality of traffic information services. The goal of this article is to construct a traffic information computing platform for the big data environment. Through in-depth analysis of the connotation and technological characteristics of big data and traffic information services, a distributed traffic atomic information computing platform architecture is proposed. Under the big data environment, this type of traffic atomic information computing architecture helps to guarantee traffic safety and efficient operation, and more intelligent and personalized traffic information services can be provided to traffic information users

  19. Big Data's Role in Precision Public Health.

    Science.gov (United States)

    Dolley, Shawn

    2018-01-01

    Precision public health is an emerging practice to more granularly predict and understand public health risks and customize treatments for more specific and homogeneous subpopulations, often using new data, technologies, and methods. Big data is one element that has consistently helped to achieve these goals, through its ability to deliver to practitioners a volume and variety of structured or unstructured data not previously possible. Big data has enabled more widespread and specific research and trials of stratifying and segmenting populations at risk for a variety of health problems. Examples of success using big data are surveyed in surveillance and signal detection, predicting future risk, targeted interventions, and understanding disease. Using novel big data or big data approaches has risks that remain to be resolved. The continued growth in volume and variety of available data, decreased costs of data capture, and emerging computational methods mean big data success will likely be a required pillar of precision public health into the future. This review article aims to identify the precision public health use cases where big data has added value, identify classes of value that big data may bring, and outline the risks inherent in using big data in precision public health efforts.

  20. Big data analytics with R and Hadoop

    CERN Document Server

    Prajapati, Vignesh

    2013-01-01

    Big Data Analytics with R and Hadoop is a tutorial-style book that focuses on all the powerful big data tasks that can be achieved by integrating R and Hadoop. This book is ideal for R developers who are looking for a way to perform big data analytics with Hadoop. This book is also aimed at those who know Hadoop and want to build some intelligent applications over Big data with R packages. It would be helpful if readers have basic knowledge of R.

  1. Big Data; A Management Revolution : The emerging role of big data in businesses

    OpenAIRE

    Blasiak, Kevin

    2014-01-01

    Big data is a term that was coined in 2012 and has since then emerged as one of the top trends in business and technology. Big data is an agglomeration of different technologies resulting in data processing capabilities that have been unreached before. Big data is generally characterized by three factors: volume, velocity and variety. These three factors distinguish it from traditional data use. The possibilities to utilize this technology are vast. Big data technology has touch points in differ...

  2. BigDataBench: a Big Data Benchmark Suite from Internet Services

    OpenAIRE

    Wang, Lei; Zhan, Jianfeng; Luo, Chunjie; Zhu, Yuqing; Yang, Qiang; He, Yongqiang; Gao, Wanling; Jia, Zhen; Shi, Yingjie; Zhang, Shujie; Zheng, Chen; Lu, Gang; Zhan, Kent; Li, Xiaona; Qiu, Bizhu

    2014-01-01

    As architecture, systems, and data management communities pay greater attention to innovative big data systems and architectures, the pressure of benchmarking and evaluating these systems rises. Considering the broad use of big data systems, big data benchmarks must include diversity of data and workloads. Most of the state-of-the-art big data benchmarking efforts target evaluating specific types of applications or system software stacks, and hence they are not qualified for serving the purpo...

  3. Simulation of spring discharge from a limestone aquifer in Iowa, USA

    Science.gov (United States)

    Zhang, Y.-K.; Bai, E.-W.; Libra, R.; Rowden, R.; Liu, H.

    1996-01-01

    A lumped-parameter model and least-squares method were used to simulate temporal variations of discharge from Big Spring, Iowa, USA, from 1983 to 1994. The simulated discharge rates match the observed rates poorly when precipitation is taken as the sole input. The match is improved significantly when the processes of evapotranspiration and infiltration are considered. The best results are obtained when snowmelt is also included in the model. Potential evapotranspiration was estimated with Thornthwaite's formula, infiltration was calculated through a water-balance approach, and snowmelt was generated by a degree-day model. The results show that groundwater in the limestone aquifer is mainly recharged by snowmelt in early spring and by infiltration from rainfall in late spring and early summer. Simulated discharge was visually calibrated against measured discharge; the similarity between the two supports the validity of this approach. The model can be used to study the effects of climate change on groundwater resources and their quality.
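    The model chain summarized in this abstract can be outlined in code: a degree-day snowmelt model, a temperature-index stand-in for Thornthwaite potential evapotranspiration, a water balance for infiltration, and a lumped-parameter (linear-reservoir) aquifer. All parameters below are invented for illustration and are not the calibrated Big Spring values; the full Thornthwaite formula is monthly and more involved than the daily stand-in used here.

```python
import numpy as np

# Daily toy inputs (one synthetic year); the study used 1983-94 records.
rng = np.random.default_rng(1)
days = np.arange(365)
temp = 10 + 15 * np.sin(2 * np.pi * (days - 100) / 365) + rng.normal(0, 3, 365)
precip = rng.gamma(shape=0.4, scale=8.0, size=365)          # mm/day

ddf = 3.0          # degree-day factor, mm/(deg C * day)  (invented)
k = 0.03           # linear-reservoir recession constant, 1/day (invented)
snowpack, storage, discharge = 0.0, 50.0, []

for t, p in zip(temp, precip):
    # Partition precipitation into snow or rain; melt with a degree-day model.
    if t <= 0.0:
        snowpack += p
        rain = 0.0
    else:
        rain = p
    melt = min(snowpack, ddf * max(t, 0.0))
    snowpack -= melt

    # Temperature-index stand-in for Thornthwaite potential evapotranspiration.
    pet = max(0.0, 0.65 * t)                                # mm/day (invented)

    # Water balance: infiltration recharge is what remains after ET demand.
    recharge = max(0.0, rain + melt - pet)

    # Lumped-parameter (linear-reservoir) aquifer: discharge proportional
    # to storage, so spring flow recedes exponentially between recharge events.
    storage += recharge - k * storage
    discharge.append(k * storage)

print(f"peak simulated discharge on day {int(np.argmax(discharge))}")
```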

  4. The faces of Big Science.

    Science.gov (United States)

    Schatz, Gottfried

    2014-06-01

    Fifty years ago, academic science was a calling with few regulations or financial rewards. Today, it is a huge enterprise confronted by a plethora of bureaucratic and political controls. This change was not triggered by specific events or decisions but reflects the explosive 'knee' in the exponential growth that science has sustained during the past three-and-a-half centuries. Coming to terms with the demands and benefits of 'Big Science' is a major challenge for today's scientific generation. Since its foundation 50 years ago, the European Molecular Biology Organization (EMBO) has been of invaluable help in meeting this challenge.

  5. Big Data and central banks

    Directory of Open Access Journals (Sweden)

    David Bholat

    2015-04-01

    Full Text Available This commentary recaps a Centre for Central Banking Studies event held at the Bank of England on 2–3 July 2014. The article covers three main points. First, it situates the Centre for Central Banking Studies event within the context of the Bank’s Strategic Plan and initiatives. Second, it summarises and reflects on major themes from the event. Third, the article links central banks’ emerging interest in Big Data approaches with their broader uptake by other economic agents.

  6. Inhomogeneous Big Bang Nucleosynthesis Revisited

    OpenAIRE

    Lara, J. F.; Kajino, T.; Mathews, G. J.

    2006-01-01

    We reanalyze the allowed parameters for inhomogeneous big bang nucleosynthesis in light of the WMAP constraints on the baryon-to-photon ratio and a recent measurement which has set the neutron lifetime to be 878.5 +/- 0.7 +/- 0.3 seconds. For a set baryon-to-photon ratio the new lifetime reduces the mass fraction of He4 by 0.0015 but does not significantly change the abundances of other isotopes. This enlarges the region of concordance between He4 and deuterium in the parameter space of the b...

  7. Monitoring Aquifer Depletion from Space: Case Studies from the Saharan and Arabian Aquifers

    Science.gov (United States)

    Ahmed, M.; Sultan, M.; Wahr, J. M.; Yan, E.

    2013-12-01

    Access to potable fresh water resources is a human right and a basic requirement for economic development in any society. In arid and semi-arid areas, the characterization and understanding of the geologic and hydrologic settings of, and the controlling factors affecting, these resources is gaining increasing importance due to the challenges posed by increasing population. In these areas, there are immense natural fossil fresh water resources stored in large extensive aquifers, the transboundary aquifers. Yet, natural phenomena (e.g., rainfall patterns and climate change) together with human-related factors (e.g., population growth, unsustainable over-exploitation, and pollution) are threatening the sustainability of these resources. In this study, we are developing and applying an integrated cost-effective approach to investigate the nature (i.e., natural and anthropogenic) and the controlling factors affecting the hydrologic settings of the Saharan (i.e., Nubian Sandstone Aquifer System [NSAS], Northwest Sahara Aquifer System [NWSA]) and Arabian (i.e., Arabian Peninsula Aquifer System [APAS]) aquifer systems. Analysis of the Gravity Recovery and Climate Experiment (GRACE)-derived Terrestrial Water Storage (TWS) inter-annual trends over the NSAS and the APAS revealed two areas of significant TWS depletion; the first correlated with the Dakhla Aquifer System (DAS) in the NSAS and the second with the Saq Aquifer System (SAS) in the APAS. Annual depletion rates were estimated at 1.3 ± 0.66 × 10⁹ m³/yr and 6.95 ± 0.68 × 10⁹ m³/yr for the DAS and the SAS, respectively. Findings include: (1) excessive groundwater extraction, not climatic changes, is responsible for the observed TWS depletions; (2) the DAS could be consumed in 350 years if extraction rates continue to double every 50 years, and the APAS available reserves could be consumed within 60-140 years at present extraction (7.08 × 10⁹ m³/yr) and depletion rates; and (3) observed depletions over the DAS and SAS and their
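    The 350-year figure for the DAS is a geometric-series calculation: if extraction doubles every 50 years, cumulative use after n doubling periods is 50·E0·(2^n − 1). The sketch below reproduces the arithmetic; the reserve volume is a hypothetical placeholder chosen only to illustrate the calculation, since the abstract does not state it.

```python
import math

extraction0 = 1.3e9      # m^3/yr, initial depletion rate quoted for the DAS
doubling_period = 50.0   # yr, extraction assumed to double every 50 years
reserves = 8.3e12        # m^3, hypothetical reserve volume (placeholder only)

# Each 50-year block consumes 50 * E0 * 2^k m^3, so cumulative use after n
# blocks is 50 * E0 * (2^n - 1). Solve for the time at which use = reserves.
blocks = math.log2(reserves / (doubling_period * extraction0) + 1.0)
print(f"aquifer consumed after roughly {doubling_period * blocks:.0f} years")
# With these numbers the result is ~350 years, matching the abstract's figure
# only because the reserve volume was chosen to do so.
```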

  8. Comparative validity of brief to medium-length Big Five and Big Six personality questionnaires

    NARCIS (Netherlands)

    Thalmayer, A.G.; Saucier, G.; Eigenhuis, A.

    2011-01-01

    A general consensus on the Big Five model of personality attributes has been highly generative for the field of personality psychology. Many important psychological and life outcome correlates with Big Five trait dimensions have been established. But researchers must choose between multiple Big Five

  9. Comparative Validity of Brief to Medium-Length Big Five and Big Six Personality Questionnaires

    Science.gov (United States)

    Thalmayer, Amber Gayle; Saucier, Gerard; Eigenhuis, Annemarie

    2011-01-01

    A general consensus on the Big Five model of personality attributes has been highly generative for the field of personality psychology. Many important psychological and life outcome correlates with Big Five trait dimensions have been established. But researchers must choose between multiple Big Five inventories when conducting a study and are…

  10. Big Data en surveillance, deel 1 : Definities en discussies omtrent Big Data

    NARCIS (Netherlands)

    Timan, Tjerk

    2016-01-01

    Following a (fairly short) lecture on surveillance and Big Data, I was asked to go somewhat deeper into the theme, the definitions, and the various questions connected with big data. In this first part I will try to set out a number of things concerning Big Data theory and

  11. 77 FR 27245 - Big Stone National Wildlife Refuge, Big Stone and Lac Qui Parle Counties, MN

    Science.gov (United States)

    2012-05-09

    ... DEPARTMENT OF THE INTERIOR Fish and Wildlife Service [FWS-R3-R-2012-N069; FXRS1265030000S3-123-FF03R06000] Big Stone National Wildlife Refuge, Big Stone and Lac Qui Parle Counties, MN AGENCY: Fish and... plan (CCP) and environmental assessment (EA) for Big Stone National Wildlife Refuge (Refuge, NWR) for...

  12. Big game hunting practices, meanings, motivations and constraints: a survey of Oregon big game hunters

    Science.gov (United States)

    Suresh K. Shrestha; Robert C. Burns

    2012-01-01

    We conducted a self-administered mail survey in September 2009 with randomly selected Oregon hunters who had purchased big game hunting licenses/tags for the 2008 hunting season. Survey questions explored hunting practices, the meanings of and motivations for big game hunting, the constraints to big game hunting participation, and the effects of age, years of hunting...

  13. San Pedro River Aquifer Binational Report

    Science.gov (United States)

    Callegary, James B.; Minjárez Sosa, Ismael; Tapia Villaseñor, Elia María; dos Santos, Placido; Monreal Saavedra, Rogelio; Grijalva Noriega, Franciso Javier; Huth, A. K.; Gray, Floyd; Scott, C. A.; Megdal, Sharon; Oroz Ramos, L. A.; Rangel Medina, Miguel; Leenhouts, James M.

    2016-01-01

    The United States and Mexico share waters in a number of hydrological basins and aquifers that cross the international boundary. Both countries recognize that, in a region of scarce water resources and expanding populations, a greater scientific understanding of these aquifer systems would be beneficial. In light of this, the Mexican and U.S. Principal Engineers of the International Boundary and Water Commission (IBWC) signed the “Joint Report of the Principal Engineers Regarding the Joint Cooperative Process United States-Mexico for the Transboundary Aquifer Assessment Program" on August 19, 2009 (IBWC-CILA, 2009). This IBWC “Joint Report” serves as the framework for U.S.-Mexico coordination and dialogue to implement transboundary aquifer studies. The document clarifies several details about the program such as background, roles, responsibilities, funding, relevance of the international water treaties, and the use of information collected or compiled as part of the program. In the document, it was agreed by the parties involved, which included the IBWC, the Mexican National Water Commission (CONAGUA), the U.S. Geological Survey (USGS), and the Universities of Arizona and Sonora, to study two priority binational aquifers, one in the San Pedro River basin and the other in the Santa Cruz River basin. This report focuses on the Binational San Pedro Basin (BSPB). Reasons for the focus on and interest in this aquifer include the fact that it is shared by the two countries, that the San Pedro River has an elevated ecological value because of the riparian ecosystem that it sustains, and that water resources are needed to sustain the river, existing communities, and continued development. This study describes the aquifer’s characteristics in its binational context; however, most of the scientific work has been undertaken for many years by each country without full knowledge of the conditions on the other side of the border. The general objective of this study is to

  14. An analysis of cross-sectional differences in big and non-big public accounting firms' audit programs

    NARCIS (Netherlands)

    Blokdijk, J.H. (Hans); Drieenhuizen, F.; Stein, M.T.; Simunic, D.A.

    2006-01-01

    A significant body of prior research has shown that audits by the Big 5 (now Big 4) public accounting firms are quality differentiated relative to non-Big 5 audits. This result can be derived analytically by assuming that Big 5 and non-Big 5 firms face different loss functions for "audit failures"

  15. Baryon symmetric big bang cosmology

    Science.gov (United States)

    Stecker, F. W.

    1978-01-01

    Both the quantum theory and Einstein's theory of special relativity lead to the supposition that matter and antimatter were produced in equal quantities during the big bang. It is noted that local matter/antimatter asymmetries may be reconciled with universal symmetry by assuming (1) a slight imbalance of matter over antimatter in the early universe, annihilation, and a subsequent remainder of matter; (2) localized regions of excess for one or the other type of matter as an initial condition; and (3) an extremely dense, high temperature state with zero net baryon number; i.e., matter/antimatter symmetry. Attention is given to the third assumption, which is the simplest and the most in keeping with current knowledge of the cosmos, especially as pertains to the universality of the 3 K background radiation. Mechanisms of galaxy formation are discussed, whereby matter and antimatter might have collided and annihilated each other, or have coexisted (and continue to coexist) at vast distances. It is pointed out that baryon symmetric big bang cosmology could probably be proved if an antinucleus could be detected in cosmic radiation.

  16. Georges et le big bang

    CERN Document Server

    Hawking, Lucy; Parsons, Gary

    2011-01-01

    Georges and Annie, his best friend, are about to witness one of the most important scientific experiments of all time: exploring the first instants of the Universe, the Big Bang! Thanks to Cosmos, their super computer, and to the Large Hadron Collider built by Eric, Annie's father, they will at last be able to answer the essential question: why do we exist? But Georges and Annie discover that a diabolical plot is being hatched. Worse, all of scientific research is in peril! Swept into incredible adventures, Georges will travel to the far reaches of the galaxy to save his friends... A fascinating plunge into the heart of the Big Bang, drawing on the very latest theories of Stephen Hawking and today's greatest scientists.

  17. Astronomical Surveys and Big Data

    Directory of Open Access Journals (Sweden)

    Mickaelian Areg M.

    2016-03-01

    Full Text Available Recent all-sky and large-area astronomical surveys and their catalogued data over the whole range of the electromagnetic spectrum, from γ-rays to radio waves, are reviewed, including Fermi-GLAST and INTEGRAL in the γ-ray range, ROSAT, XMM and Chandra in X-ray, GALEX in UV, SDSS and several POSS I and POSS II-based catalogues (APM, MAPS, USNO, GSC) in the optical range, 2MASS in NIR, WISE and AKARI IRC in MIR, IRAS and AKARI FIS in FIR, NVSS and FIRST in the radio range, and many others, as well as the most important surveys giving optical images (DSS I and II, SDSS, etc.), proper motions (Tycho, USNO, Gaia), variability (GCVS, NSVS, ASAS, Catalina, Pan-STARRS), and spectroscopic data (FBS, SBS, Case, HQS, HES, SDSS, CALIFA, GAMA). An overall understanding of the coverage along the whole wavelength range and comparisons between various surveys are given: galaxy redshift surveys, QSO/AGN, radio, Galactic structure, and Dark Energy surveys. Astronomy has entered the Big Data era, with Astrophysical Virtual Observatories and Computational Astrophysics playing an important role in using and analyzing big data for new discoveries.

  18. Big data in oncologic imaging.

    Science.gov (United States)

    Regge, Daniele; Mazzetti, Simone; Giannini, Valentina; Bracco, Christian; Stasi, Michele

    2017-06-01

    Cancer is a complex disease and unfortunately understanding how the components of the cancer system work does not help understand the behavior of the system as a whole. In the words of the Greek philosopher Aristotle "the whole is greater than the sum of parts." To date, thanks to improved information technology infrastructures, it is possible to store data from each single cancer patient, including clinical data, medical images, laboratory tests, and pathological and genomic information. Indeed, medical archive storage constitutes approximately one-third of total global storage demand and a large part of the data are in the form of medical images. The opportunity is now to draw insight on the whole to the benefit of each individual patient. In the oncologic patient, big data analysis is at the beginning but several useful applications can be envisaged including development of imaging biomarkers to predict disease outcome, assessing the risk of X-ray dose exposure or of renal damage following the administration of contrast agents, and tracking and optimizing patient workflow. The aim of this review is to present current evidence of how big data derived from medical images may impact on the diagnostic pathway of the oncologic patient.

  19. Aquifer test to determine hydraulic properties of the Elm aquifer near Aberdeen, South Dakota

    Science.gov (United States)

    Schaap, Bryan D.

    2000-01-01

    The Elm aquifer, which consists of sandy and gravelly glacial-outwash deposits, is present in several counties in northeastern South Dakota. An aquifer test was conducted northeast of Aberdeen during the fall of 1999 to determine the hydraulic properties of the Elm aquifer in that area. An improved understanding of the properties of the aquifer will be useful in the possible development of the aquifer as a water resource. Historical water-level data indicate that the saturated thickness of the Elm aquifer can change considerably over time. From September 1977 through November 1985, water levels at three wells completed in the Elm aquifer near the aquifer test site varied by 5.1 ft, 9.50 ft, and 11.1 ft. From June 1982 through October 1999, water levels at five wells completed in the Elm aquifer near the aquifer test site varied by 8.7 ft, 11.4 ft, 13.2 ft, 13.8 ft, and 19.7 ft. The water levels during the fall of 1999 were among the highest on record, so the aquifer test was affected by portions of the aquifer being saturated that might not be saturated during drier times. The aquifer test was conducted using five existing wells that had been installed prior to this study. Well A, the pumped well, has an operating irrigation pump and is centrally located among the wells. Wells B, C, D, and E are about 70 ft, 1,390 ft, 2,200 ft, and 3,100 ft, respectively, in different directions from Well A. Using vented pressure transducers and programmable data loggers, water-level data were collected at the five wells prior to, during, and after the pumping, which started on November 19, 1999, and continued a little over 72 hours. Based on available drilling logs, the Elm aquifer near the test area was assumed to be unconfined. The Neuman (1974) method theoretical response curves that most closely match the observed water-level changes at Wells A and B were calculated using software (AQTESOLV for Windows Version 2.13-Professional) developed by Glenn M. Duffield of Hydro
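    Type-curve matching of the kind performed with AQTESOLV can be sketched with the simpler confined (Theis) solution, using the exponential integral for the well function; the unconfined Neuman (1974) solution actually used in the study adds delayed-yield behavior and is considerably more involved. The pumping rate, observation distance, and drawdown data below are hypothetical, not the Elm aquifer measurements.

```python
import numpy as np
from scipy.special import exp1
from scipy.optimize import curve_fit

Q = 0.05      # pumping rate, m^3/s (hypothetical)
r = 21.0      # distance from pumped well to observation well, m (~70 ft)

def theis_drawdown(t, T, S):
    """Theis (confined) drawdown s = Q/(4*pi*T) * W(u), u = r^2*S/(4*T*t).

    Used here as a simplified stand-in for the Neuman (1974) unconfined
    solution that the study matched with AQTESOLV."""
    u = r**2 * S / (4.0 * T * t)
    return Q / (4.0 * np.pi * T) * exp1(u)

# Hypothetical observed drawdowns (m) at elapsed times (s).
t_obs = np.array([600, 1800, 3600, 10800, 21600, 43200, 86400, 259200], float)
s_obs = np.array([0.08, 0.18, 0.26, 0.39, 0.47, 0.55, 0.63, 0.75])

# Least-squares fit of transmissivity T and storativity S to the data.
(T_fit, S_fit), _ = curve_fit(theis_drawdown, t_obs, s_obs,
                              p0=(1e-2, 1e-3), bounds=(1e-6, [1.0, 1.0]))
print(f"transmissivity T = {T_fit:.2e} m^2/s, storativity S = {S_fit:.2e}")
```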

  20. Leveraging Mobile Network Big Data for Developmental Policy ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Some argue that big data and big data users offer advantages to generate evidence. ... Supported by IDRC, this research focused on transportation planning in urban ... Using mobile network big data for land use classification CPRsouth 2015.

  1. A New Look at Big History

    Science.gov (United States)

    Hawkey, Kate

    2014-01-01

    The article sets out a "big history" which resonates with the priorities of our own time. A globalizing world calls for new spatial scales to underpin what the history curriculum addresses, "big history" calls for new temporal scales, while concern over climate change calls for a new look at subject boundaries. The article…

  2. A little big history of Tiananmen

    NARCIS (Netherlands)

    Quaedackers, E.; Grinin, L.E.; Korotayev, A.V.; Rodrigue, B.H.

    2011-01-01

    This contribution aims at demonstrating the usefulness of studying small-scale subjects such as Tiananmen, or the Gate of Heavenly Peace, in Beijing from a Big History perspective. By studying such a ‘little big history’ of Tiananmen, previously overlooked yet fundamental explanations for why

  3. What is beyond the big five?

    Science.gov (United States)

    Saucier, G; Goldberg, L R

    1998-08-01

    Previous investigators have proposed that various kinds of person-descriptive content--such as differences in attitudes or values, in sheer evaluation, in attractiveness, or in height and girth--are not adequately captured by the Big Five Model. We report on a rather exhaustive search for reliable sources of Big Five-independent variation in data from person-descriptive adjectives. Fifty-three candidate clusters were developed in a college sample using diverse approaches and sources. In a nonstudent adult sample, clusters were evaluated with respect to a minimax criterion: minimum multiple correlation with factors from Big Five markers and maximum reliability. The most clearly Big Five-independent clusters referred to Height, Girth, Religiousness, Employment Status, Youthfulness and Negative Valence (or low-base-rate attributes). Clusters referring to Fashionableness, Sensuality/Seductiveness, Beauty, Masculinity, Frugality, Humor, Wealth, Prejudice, Folksiness, Cunning, and Luck appeared to be potentially beyond the Big Five, although each of these clusters demonstrated Big Five multiple correlations of .30 to .45, and at least one correlation of .20 and over with a Big Five factor. Of all these content areas, Religiousness, Negative Valence, and the various aspects of Attractiveness were found to be represented by a substantial number of distinct, common adjectives. Results suggest directions for supplementing the Big Five when one wishes to extend variable selection outside the domain of personality traits as conventionally defined.
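    The minimax criterion described here (minimum multiple correlation with Big Five marker factors, maximum reliability) is straightforward to compute for a candidate cluster. The sketch below uses synthetic data with an invented cluster structure; it only illustrates the two quantities being traded off, not the authors' procedure.

```python
import numpy as np

def multiple_correlation(y, X):
    """R between y and its least-squares prediction from the columns of X."""
    A = np.column_stack([np.ones(len(y)), X])
    yhat = A @ np.linalg.lstsq(A, y, rcond=None)[0]
    return np.corrcoef(y, yhat)[0, 1]

def cronbach_alpha(items):
    """Internal-consistency reliability of a (respondents x items) matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

# Hypothetical data: 500 respondents, five Big Five marker factor scores,
# and a 4-adjective candidate cluster driven by a Big Five-independent trait.
rng = np.random.default_rng(3)
big5 = rng.normal(size=(500, 5))
latent = rng.normal(size=500)                     # independent of big5
cluster_items = latent[:, None] + rng.normal(0, 0.8, size=(500, 4))

score = cluster_items.mean(axis=1)
print(f"multiple R with Big Five: {multiple_correlation(score, big5):.2f}")
print(f"Cronbach's alpha:         {cronbach_alpha(cluster_items):.2f}")
# The minimax criterion favours clusters with low multiple R and high alpha.
```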

  4. Big Data Analytics and Its Applications

    Directory of Open Access Journals (Sweden)

    Mashooque A. Memon

    2017-10-01

    Full Text Available The term Big Data has been coined to refer to the extensive heave of data that cannot be managed by traditional data-handling methods or techniques. The field of big data plays an indispensable role in various domains, such as agriculture, banking, data mining, education, chemistry, finance, cloud computing, marketing, health care, and stocks. Big data analytics is the method of examining big data to reveal hidden patterns, unseen relationships, and other important information that can be utilized to make enhanced decisions. There has been a perpetually expanding interest in big data because of its fast development and because it covers different areas of application. The Apache Hadoop open-source technology, developed in Java and running on the Linux operating system, was used. The primary contribution of this research is to present an effective and free solution for big data applications in a distributed environment, with its advantages, and to demonstrate its ease of use. Looking ahead, there appears to be a need for an analytical review of new developments in big data technology. Healthcare is one of the foremost concerns of the world. Big data in healthcare refers to electronic health data sets that are related to patient healthcare and well-being. Data in the healthcare area is growing beyond the managing capacity of healthcare organizations and is expected to increase significantly in the coming years.

  5. Big data and software defined networks

    CERN Document Server

    Taheri, Javid

    2018-01-01

    Big Data analytics and Software Defined Networking (SDN) are helping to manage the extraordinary increase in data usage enabled by the computer processing power of Cloud Data Centres (CDCs). This new book investigates areas where big data and SDN can help each other in delivering more efficient services.

  6. Big Science and Long-tail Science

    CERN Document Server

    2008-01-01

    Jim Downing and I were privileged to be the guests of Salvatore Mele at CERN yesterday and to see the Atlas detector of the Large Hadron Collider. This is a wow experience - although I knew it was big, I hadn't realised how big.

  7. The Death of the Big Men

    DEFF Research Database (Denmark)

    Martin, Keir

    2010-01-01

    Recently Tolai people of Papua New Guinea have adopted the term 'Big Shot' to describe an emerging post-colonial political elite. The emergence of the term is a negative moral evaluation of new social possibilities that have arisen as a consequence of the Big Shots' privileged position within a glo...

  8. An embedding for the big bang

    Science.gov (United States)

    Wesson, Paul S.

    1994-01-01

    A cosmological model is given that has good physical properties for the early and late universe but is a hypersurface in a flat five-dimensional manifold. The big bang can therefore be regarded as an effect of a choice of coordinates in a truncated higher-dimensional geometry. Thus the big bang is in some sense a geometrical illusion.

  9. Probing the pre-big bang universe

    International Nuclear Information System (INIS)

    Veneziano, G.

    2000-01-01

    Superstring theory suggests a new cosmology whereby a long inflationary phase preceded a non-singular big bang-like event. After discussing how pre-big bang inflation naturally arises from an almost trivial initial state of the Universe, I will describe how present or near-future experiments can provide sensitive probes of how the Universe behaved in the pre-bang era

  10. Starting Small, Thinking Big - Continuum Magazine | NREL

    Science.gov (United States)

  11. Exploiting big data for critical care research.

    Science.gov (United States)

    Docherty, Annemarie B; Lone, Nazir I

    2015-10-01

    Over recent years the digitalization, collection and storage of vast quantities of data, in combination with advances in data science, have opened up a new era of big data. In this review, we define big data, identify examples of critical care research using big data, discuss the limitations and ethical concerns of using these large datasets and finally consider scope for future research. Big data refers to datasets whose size, complexity and dynamic nature are beyond the scope of traditional data collection and analysis methods. The potential benefits to critical care are significant, with faster progress in improving health and better value for money. Although not replacing clinical trials, big data can improve their design and advance the field of precision medicine. However, there are limitations to analysing big data using observational methods. In addition, there are ethical concerns regarding maintaining confidentiality of patients who contribute to these datasets. Big data have the potential to improve medical care and reduce costs, both by individualizing medicine, and bringing together multiple sources of data about individual patients. As big data become increasingly mainstream, it will be important to maintain public confidence by safeguarding data security, governance and confidentiality.

  12. Practice variation in Big-4 transparency reports

    NARCIS (Netherlands)

    Girdhar, Sakshi; Jeppesen, K.K.

    2018-01-01

    Purpose: The purpose of this paper is to examine the transparency reports published by the Big-4 public accounting firms in the UK, Germany and Denmark to understand the determinants of their content within the networks of big accounting firms. Design/methodology/approach: The study draws on a

  13. Big data analysis for smart farming

    NARCIS (Netherlands)

    Kempenaar, C.; Lokhorst, C.; Bleumer, E.J.B.; Veerkamp, R.F.; Been, Th.; Evert, van F.K.; Boogaardt, M.J.; Ge, L.; Wolfert, J.; Verdouw, C.N.; Bekkum, van Michael; Feldbrugge, L.; Verhoosel, Jack P.C.; Waaij, B.D.; Persie, van M.; Noorbergen, H.

    2016-01-01

    In this report we describe results of a one-year TO2 institutes project on the development of big data technologies within the milk production chain. The goal of this project is to ‘create’ an integration platform for big data analysis for smart farming and to develop a show case. This includes both

  14. Source, variability, and transformation of nitrate in a regional karst aquifer: Edwards aquifer, central Texas

    Energy Technology Data Exchange (ETDEWEB)

    Musgrove, M., E-mail: mmusgrov@usgs.gov [U.S. Geological Survey, 1505 Ferguson Lane, Austin, TX 78754 (United States); Opsahl, S.P. [U.S. Geological Survey, 5563 DeZavala, Ste. 290, San Antonio, TX 78249 (United States); Mahler, B.J. [U.S. Geological Survey, 1505 Ferguson Lane, Austin, TX 78754 (United States); Herrington, C. [City of Austin Watershed Protection Department, Austin, TX 78704 (United States); Sample, T.L. [U.S. Geological Survey, 19241 David Memorial Dr., Ste. 180, Conroe, TX 77385 (United States); Banta, J.R. [U.S. Geological Survey, 5563 DeZavala, Ste. 290, San Antonio, TX 78249 (United States)

    2016-10-15

    Many karst regions are undergoing rapid population growth and expansion of urban land accompanied by increases in wastewater generation and changing patterns of nitrate (NO₃⁻) loading to surface and groundwater. We investigate variability and sources of NO₃⁻ in a regional karst aquifer system, the Edwards aquifer of central Texas. Samples from streams recharging the aquifer, groundwater wells, and springs were collected during 2008–12 from the Barton Springs and San Antonio segments of the Edwards aquifer and analyzed for nitrogen (N) species concentrations and NO₃⁻ stable isotopes (δ¹⁵N and δ¹⁸O). These data were augmented by historical data collected from 1937 to 2007. NO₃⁻ concentrations and discharge data indicate that short-term variability (days to months) in groundwater NO₃⁻ concentrations in the Barton Springs segment is controlled by occurrence of individual storms and multi-annual wet-dry cycles, whereas the lack of short-term variability in groundwater in the San Antonio segment indicates the dominance of transport along regional flow paths. In both segments, longer-term increases (years to decades) in NO₃⁻ concentrations cannot be attributed to hydrologic conditions; rather, isotopic ratios and land-use change indicate that septic systems and land application of treated wastewater might be the source of increased loading of NO₃⁻. These results highlight the vulnerability of karst aquifers to NO₃⁻ contamination from urban wastewater. An analysis of N-species loading in recharge and discharge for the Barton Springs segment during 2008–10 indicates an overall mass balance in total N, but recharge contains higher concentrations of organic N and lower concentrations of NO₃⁻ than does discharge, consistent with nitrification of organic N within the aquifer and consumption of dissolved oxygen. This study demonstrates

  15. Source, variability, and transformation of nitrate in a regional karst aquifer: Edwards aquifer, central Texas

    International Nuclear Information System (INIS)

    Musgrove, M.; Opsahl, S.P.; Mahler, B.J.; Herrington, C.; Sample, T.L.; Banta, J.R.

    2016-01-01

    Many karst regions are undergoing rapid population growth and expansion of urban land accompanied by increases in wastewater generation and changing patterns of nitrate (NO₃⁻) loading to surface and groundwater. We investigate variability and sources of NO₃⁻ in a regional karst aquifer system, the Edwards aquifer of central Texas. Samples from streams recharging the aquifer, groundwater wells, and springs were collected during 2008–12 from the Barton Springs and San Antonio segments of the Edwards aquifer and analyzed for nitrogen (N) species concentrations and NO₃⁻ stable isotopes (δ¹⁵N and δ¹⁸O). These data were augmented by historical data collected from 1937 to 2007. NO₃⁻ concentrations and discharge data indicate that short-term variability (days to months) in groundwater NO₃⁻ concentrations in the Barton Springs segment is controlled by occurrence of individual storms and multi-annual wet-dry cycles, whereas the lack of short-term variability in groundwater in the San Antonio segment indicates the dominance of transport along regional flow paths. In both segments, longer-term increases (years to decades) in NO₃⁻ concentrations cannot be attributed to hydrologic conditions; rather, isotopic ratios and land-use change indicate that septic systems and land application of treated wastewater might be the source of increased loading of NO₃⁻. These results highlight the vulnerability of karst aquifers to NO₃⁻ contamination from urban wastewater. An analysis of N-species loading in recharge and discharge for the Barton Springs segment during 2008–10 indicates an overall mass balance in total N, but recharge contains higher concentrations of organic N and lower concentrations of NO₃⁻ than does discharge, consistent with nitrification of organic N within the aquifer and consumption of dissolved oxygen. This study demonstrates that subaqueous nitrification of organic N in the aquifer, as opposed to in soils, might be a

  16. Characterizing aquifer hydrogeology and anthropogenic chemical influences on groundwater near the Idaho Chemical Processing Plant, Idaho National Engineering Laboratory, Idaho

    International Nuclear Information System (INIS)

    Fromm, J.M.

    1995-01-01

    A conceptual model of the Eastern Snake River Plain aquifer in the vicinity of monitoring well USGS-44, downgradient of the Idaho Chemical Processing Plant (ICPP) on the Idaho National Engineering Laboratory (INEL), was developed by synthesis and comparison of previous work (40 years) and new investigations into local natural hydrogeological conditions and anthropogenic influences. Quantitative tests of the model and other recommendations are suggested. The ICPP recovered fissionable uranium from spent nuclear fuel rods and disposed of waste fluids by release to the regional aquifer and lithosphere. Environmental impacts were assessed by a monitoring well network. The conceptual model identifies multiple, highly variable, interacting, and transient components, including the INEL facilities' multiple operations and liquid-waste-handling systems; the anisotropic, inhomogeneous aquifer; the network of monitoring and production wells; and the intermittent flow of the Big Lost River. Pre-anthropogenic natural conditions and early records of anthropogenic activities were sparsely or unreliably documented, making reconstruction of natural conditions or early hydrologic impacts impossible or limited to very broad characterizations

  17. Cloud Based Big Data Infrastructure: Architectural Components and Automated Provisioning

    OpenAIRE

    Demchenko, Yuri; Turkmen, Fatih; Blanchet, Christophe; Loomis, Charles; Laat, Cees de

    2016-01-01

    This paper describes the general architecture and functional components of the cloud based Big Data Infrastructure (BDI). The proposed BDI architecture is based on the analysis of the emerging Big Data and data intensive technologies and supported by the definition of the Big Data Architecture Framework (BDAF) that defines the following components of the Big Data technologies: Big Data definition, Data Management including data lifecycle and data structures, Big Data Infrastructure (generical...

  18. Aquifer thermal energy storage in Finland

    Energy Technology Data Exchange (ETDEWEB)

    Iihola, H; Ala-Peijari, T; Seppaenen, H

    1988-01-01

    The rapid changes and crises in the field of energy during the 1970s and 1980s have forced us to examine the use of energy more critically and to look for new ideas. Seasonal aquifer thermal energy storage (T < 100 °C) on a large scale is one of the grey areas which have not yet been extensively explored. However, projects are currently underway in a dozen countries. In Finland there have been three demonstration projects from 1974 to 1987. International co-operation under the auspices of the International Energy Agency, Annex VI, 'Environmental and Chemical Aspects of Thermal Energy Storage in Aquifers and Research and Development of Water Treatment Methods' started in 1987. The research being undertaken in 8 countries includes several elements fundamental to hydrochemistry and biochemistry.

  19. Study of Aquifer Thermal Energy Storage

    Science.gov (United States)

    Okuyama, Masaaki; Umemiya, Hiromichi; Shibuya, Ikuko; Haga, Eiji

    Yamagata University's 'Aquifer Thermal Energy Storage (ATES)' is an experimental system that has been running since 1982. From the results of these long-term experiments we have obtained much important knowledge. This paper presents the accomplishments over 16 years and the characteristics of thermal energy storage in the storage wells. The conclusions are as follows. 1) In recent years, the thermal recovery factor of the warm energy storage well has become almost constant at about 60%. 2) The thermal recovery factor of the cool energy storage well increases gradually and is now about 15%. 3) Because a ferric colloidal dam forms in the aquifer, the thermal recovery factor increases year after year. 4) Backwashing can remove clogging of the ferric colloidal dam. 5) The apparent thermal diffusivity decreases gradually due to the ferric colloidal dam.

  20. The ethics of biomedical big data

    CERN Document Server

    Mittelstadt, Brent Daniel

    2016-01-01

    This book presents cutting edge research on the new ethical challenges posed by biomedical Big Data technologies and practices. ‘Biomedical Big Data’ refers to the analysis of aggregated, very large datasets to improve medical knowledge and clinical care. The book describes the ethical problems posed by aggregation of biomedical datasets and re-use/re-purposing of data, in areas such as privacy, consent, professionalism, power relationships, and ethical governance of Big Data platforms. Approaches and methods are discussed that can be used to address these problems to achieve the appropriate balance between the social goods of biomedical Big Data research and the safety and privacy of individuals. Seventeen original contributions analyse the ethical, social and related policy implications of the analysis and curation of biomedical Big Data, written by leading experts in the areas of biomedical research, medical and technology ethics, privacy, governance and data protection. The book advances our understan...

  1. Big data analytics methods and applications

    CERN Document Server

    Rao, BLS; Rao, SB

    2016-01-01

    This book has a collection of articles written by Big Data experts to describe some of the cutting-edge methods and applications from their respective areas of interest, and provides the reader with a detailed overview of the field of Big Data Analytics as it is practiced today. The chapters cover technical aspects of key areas that generate and use Big Data such as management and finance; medicine and healthcare; genome, cytome and microbiome; graphs and networks; Internet of Things; Big Data standards; bench-marking of systems; and others. In addition to different applications, key algorithmic approaches such as graph partitioning, clustering and finite mixture modelling of high-dimensional data are also covered. The varied collection of themes in this volume introduces the reader to the richness of the emerging field of Big Data Analytics.

  2. 2nd INNS Conference on Big Data

    CERN Document Server

    Manolopoulos, Yannis; Iliadis, Lazaros; Roy, Asim; Vellasco, Marley

    2017-01-01

    The book offers a timely snapshot of neural network technologies as a significant component of big data analytics platforms. It promotes new advances and research directions in efficient and innovative algorithmic approaches to analyzing big data (e.g. deep networks, nature-inspired and brain-inspired algorithms); implementations on different computing platforms (e.g. neuromorphic, graphics processing units (GPUs), clouds, clusters); and big data analytics applications to solve real-world problems (e.g. weather prediction, transportation, energy management). The book, which reports on the second edition of the INNS Conference on Big Data, held on October 23–25, 2016, in Thessaloniki, Greece, depicts an interesting collaborative adventure of neural networks with big data and other learning technologies.

  3. Practice Variation in Big-4 Transparency Reports

    DEFF Research Database (Denmark)

    Girdhar, Sakshi; Klarskov Jeppesen, Kim

    2018-01-01

    Purpose: The purpose of this paper is to examine the transparency reports published by the Big-4 public accounting firms in the UK, Germany and Denmark to understand the determinants of their content within the networks of big accounting firms. Design/methodology/approach: The study draws...... on a qualitative research approach, in which the content of transparency reports is analyzed and semi-structured interviews are conducted with key people from the Big-4 firms who are responsible for developing the transparency reports. Findings: The findings show that the content of transparency reports...... is inconsistent and the transparency reporting practice is not uniform within the Big-4 networks. Differences were found in the way in which the transparency reporting practices are coordinated globally by the respective central governing bodies of the Big-4. The content of the transparency reports...

  4. Epidemiology in the Era of Big Data

    Science.gov (United States)

    Mooney, Stephen J; Westreich, Daniel J; El-Sayed, Abdulrahman M

    2015-01-01

    Big Data has increasingly been promoted as a revolutionary development in the future of science, including epidemiology. However, the definition and implications of Big Data for epidemiology remain unclear. We here provide a working definition of Big Data predicated on the so-called ‘3 Vs’: variety, volume, and velocity. From this definition, we argue that Big Data has evolutionary and revolutionary implications for identifying and intervening on the determinants of population health. We suggest that as more sources of diverse data become publicly available, the ability to combine and refine these data to yield valid answers to epidemiologic questions will be invaluable. We conclude that, while epidemiology as practiced today will continue to be practiced in the Big Data future, a component of our field’s future value lies in integrating subject matter knowledge with increased technical savvy. Our training programs and our visions for future public health interventions should reflect this future. PMID:25756221

  5. Ethics and Epistemology in Big Data Research.

    Science.gov (United States)

    Lipworth, Wendy; Mason, Paul H; Kerridge, Ian; Ioannidis, John P A

    2017-12-01

    Biomedical innovation and translation are increasingly emphasizing research using "big data." The hope is that big data methods will both speed up research and make its results more applicable to "real-world" patients and health services. While big data research has been embraced by scientists, politicians, industry, and the public, numerous ethical, organizational, and technical/methodological concerns have also been raised. With respect to technical and methodological concerns, there is a view that these will be resolved through sophisticated information technologies, predictive algorithms, and data analysis techniques. While such advances will likely go some way towards resolving technical and methodological issues, we believe that the epistemological issues raised by big data research have important ethical implications and raise questions about the very possibility of big data research achieving its goals.

  6. The Big bang and the Quantum

    Science.gov (United States)

    Ashtekar, Abhay

    2010-06-01

    General relativity predicts that space-time comes to an end and physics comes to a halt at the big-bang. Recent developments in loop quantum cosmology have shown that these predictions cannot be trusted. Quantum geometry effects can resolve singularities, thereby opening new vistas. Examples are: The big bang is replaced by a quantum bounce; the `horizon problem' disappears; immediately after the big bounce, there is a super-inflationary phase with its own phenomenological ramifications; and, in presence of a standard inflation potential, initial conditions are naturally set for a long, slow roll inflation independently of what happens in the pre-big bang branch. As in my talk at the conference, I will first discuss the foundational issues and then the implications of the new Planck scale physics near the Big Bang.

  7. Increasing freshwater recovery upon aquifer storage : A field and modelling study of dedicated aquifer storage and recovery configurations in brackish-saline aquifers

    NARCIS (Netherlands)

    Zuurbier, Koen

    2016-01-01

    The subsurface may provide opportunities for robust, effective, sustainable, and cost-efficient freshwater management solutions. For instance, via aquifer storage and recovery (ASR; Pyne, 2005): “the storage of water in a suitable aquifer through a well during times when water is available, and the

  8. Understanding Uranium Behavior in a Reduced Aquifer

    Science.gov (United States)

    Janot, N.; Lezama-Pacheco, J. S.; Williams, K. H.; Bernier-Latmani, R.; Long, P. E.; Davis, J. A.; Fox, P. M.; Yang, L.; Giammar, D.; Cerrato, J. M.; Bargar, J.

    2012-12-01

    Uranium contamination of groundwater is a concern at several US Department of Energy sites, such as Old Rifle, CO. Uranium transport in the environment is mainly controlled by its oxidation state, since oxidized U(VI) is relatively mobile, whereas U(IV) is relatively insoluble. Bio-remediation of contaminated aquifers aims at immobilizing uranium in a reduced form. Previous laboratory and field studies have shown that adding an electron donor (lactate, acetate, ethanol) to groundwater stimulates the activity of metal- and sulfate-reducing bacteria, which promotes U(VI) reduction in contaminated aquifers. However, obtaining information on the chemical and physical forms of U, Fe and S species in sediments biostimulated in the field, as well as kinetic parameters such as the U(VI) reduction rate, is challenging due to the low concentration of uranium in the aquifers. We followed a bio-remediation experiment at the Old Rifle site, CO, from early iron-reducing conditions to the transition to sulfate-reducing conditions. Several in-well chromatographic columns packed with sediment were deployed and were sampled at different days after the start of bio-reduction. X-ray absorption spectroscopy and X-ray microscopy were used to obtain information on Fe, S and U speciation and distribution. Chemical extractions of the reduced sediments have also been performed to determine the rate of Fe(II) and U(IV) accumulation.

  9. Carbon-14 measurements in aquifers with methane

    International Nuclear Information System (INIS)

    Barker, J.F.; Fritz, P.; Brown, R.M.

    1979-01-01

    A survey of various groundwater systems indicates that methane is a common trace constituent and occasionally a major carbon species in groundwaters. Thermocatalytic methane had δ¹³C(CH₄) > −45‰, whereas microbially produced (biogenic) methane had distinctly lower δ¹³C(CH₄) and was accompanied by ¹³C-enriched values for the inorganic carbon. Thermocatalytic methane had no apparent effect on the inorganic carbon. Because methanogenesis seriously affects the carbon isotope geochemistry of groundwaters, the correction of raw ¹⁴C ages of affected groundwaters must consider these effects. Conceptual models are developed which adjust the ¹⁴C activity of the groundwater for the effects of methanogenesis and for the dilution of carbon present during infiltration by simple dissolution of rock carbonate. These preliminary models are applied to groundwaters from the Alliston sand aquifer, where methanogenesis has affected most samples. In this system, methanogenic bacteria using organic matter present in the aquifer matrix as substrate have added inorganic carbon to the groundwater, which has initiated further carbonate rock dissolution. These processes have diluted the inorganic carbon ¹⁴C activity. The adjusted groundwater ages can be explained in terms of the complex hydrogeology of this aquifer, but also indicate that these conceptual models must be more rigorously tested to evaluate their appropriateness. (author)
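
    To make the correction logic concrete, the following sketch (ours, not the authors') applies the standard radiocarbon decay equation with the initial activity scaled by a dilution factor q; in the paper's conceptual models such a factor would come from a carbon mass balance over methanogenesis and carbonate dissolution. All function names and the example numbers are illustrative assumptions.

    ```python
    import math

    T_HALF = 5730.0  # conventional 14C half-life in years

    def apparent_age(a_measured, a_initial=100.0):
        """Uncorrected radiocarbon age (years) from 14C activity in pmc."""
        return (T_HALF / math.log(2)) * math.log(a_initial / a_measured)

    def adjusted_age(a_measured, q):
        """Age with the initial activity scaled by a dilution factor q
        (0 < q <= 1) representing dead carbon added by carbonate dissolution
        and methanogenesis; q must come from a separate carbon mass balance."""
        return (T_HALF / math.log(2)) * math.log(q * 100.0 / a_measured)

    # Example: 20 pmc measured; 30% of the inorganic carbon is dead (q = 0.7).
    print(round(apparent_age(20.0)), round(adjusted_age(20.0, 0.7)))
    ```

    With these invented values the apparent age of about 13,300 years shrinks to roughly 10,400 years, illustrating how unaccounted dead carbon inflates raw ¹⁴C ages.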

  10. Numerical simulation of groundwater and surface-water interactions in the Big River Management Area, central Rhode Island

    Science.gov (United States)

    Masterson, John P.; Granato, Gregory E.

    2013-01-01

    The Rhode Island Water Resources Board is considering use of groundwater resources from the Big River Management Area in central Rhode Island because increasing water demands in Rhode Island may exceed the capacity of current sources. Previous water-resources investigations in this glacially derived, valley-fill aquifer system have focused primarily on the effects of potential groundwater-pumping scenarios on streamflow depletion; however, the effects of groundwater withdrawals on wetlands have not been assessed, and such assessments are a requirement of the State’s permitting process to develop a water supply in this area. A need for an assessment of the potential effects of pumping on wetlands in the Big River Management Area led to a cooperative agreement in 2008 between the Rhode Island Water Resources Board, the U.S. Geological Survey, and the University of Rhode Island. This partnership was formed with the goal of developing methods for characterizing wetland vegetation, soil type, and hydrologic conditions, and monitoring and modeling water levels for pre- and post-water-supply development to assess potential effects of groundwater withdrawals on wetlands. This report describes the hydrogeology of the area and the numerical simulations that were used to analyze the interaction between groundwater and surface water in response to simulated groundwater withdrawals. The results of this analysis suggest that, given the hydrogeologic conditions in the Big River Management Area, a standard 5-day aquifer test may not be sufficient to determine the effects of pumping on water levels in nearby wetlands. Model simulations showed water levels beneath Reynolds Swamp declined by about 0.1 foot after 5 days of continuous pumping, but continued to decline by an additional 4 to 6 feet as pumping times were increased from a 5-day simulation period to a simulation period representative of long-term average monthly conditions. This continued decline in water levels with
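
    The report's point that a 5-day test can understate long-term drawdown can be illustrated with the classical Theis solution, a simplification of the numerical model actually used; the parameter values below are invented for illustration and are not taken from the Big River model.

    ```python
    import numpy as np
    from scipy.special import exp1  # Theis well function: W(u) = E1(u)

    def theis_drawdown(Q, T, S, r, t):
        """Drawdown at radius r (ft) after pumping time t (d), for pumping
        rate Q (ft^3/d), transmissivity T (ft^2/d), storage coefficient S."""
        u = r**2 * S / (4.0 * T * t)
        return Q / (4.0 * np.pi * T) * exp1(u)

    # Invented illustrative values, not calibrated to the Big River model:
    Q, T, S, r = 96_000.0, 5_000.0, 0.2, 500.0
    for t_days in (5.0, 30.0, 180.0):
        print(f"{t_days:5.0f} d: {theis_drawdown(Q, T, S, r, t_days):4.1f} ft")
    ```

    Under these assumptions drawdown grows from under 1 foot at 5 days to several feet at 6 months, the same qualitative behavior the simulations showed beneath Reynolds Swamp.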

  11. Hydrology of the shallow aquifer and uppermost semiconfined aquifer near El Paso, Texas

    Science.gov (United States)

    White, D.E.; Baker, E.T.; Sperka, Roger

    1997-01-01

    The availability of fresh ground water in El Paso and adjacent areas that is needed to meet increased demand for water supply concerns local, State, and Federal agencies. The Hueco bolson is the principal aquifer in the El Paso area. Starting in the early 1900s and continuing to the 1950s, most of the municipal and industrial water supply in El Paso was pumped from the Hueco bolson aquifer from wells in and near the Rio Grande Valley and the international border. The Rio Grande is the principal surface-water feature in the El Paso area, and a major source of recharge to the shallow aquifer (Rio Grande alluvium) within the study area is leakage of flow from the Rio Grande.

  12. Intelligent search in Big Data

    Science.gov (United States)

    Birialtsev, E.; Bukharaev, N.; Gusenkov, A.

    2017-10-01

    An approach to data integration, aimed at ontology-based intelligent search in Big Data, is considered for the case when information objects are represented in the form of relational databases (RDB), structurally marked by their schemes. The source of information for constructing an ontology and, later on, for organizing the search is text in natural language, treated as semi-structured data. For the RDBs, these are comments on the names of tables and their attributes. A formal definition of the RDB integration model in terms of ontologies is given. Within the framework of the model, a universal RDB-representation ontology, an oil-production subject-domain ontology, and a linguistic thesaurus of the subject-domain language are built. A technique for the automatic generation of SQL queries for subject-domain specialists is proposed. On this basis, an information system for the TATNEFT oil-producing company's RDBs was implemented. Operational use of the system showed good relevance for the majority of queries.

  13. Big Data in Transport Geography

    DEFF Research Database (Denmark)

    Reinau, Kristian Hegner; Agerholm, Niels; Lahrmann, Harry Spaabæk

    for studies that explicitly compare the quality of this new type of data to traditional data sources. With the current focus on Big Data in the transport field, public transport planners are increasingly looking towards smart card data to analyze and optimize flows of passengers. However, in many cases...... it is not all public transport passengers in a city, region or country with a smart card system that uses the system, and in such cases, it is important to know what biases smart card data has in relation to giving a complete view upon passenger flows. This paper therefore analyses the quality and biases...... of smart card data in Denmark, where public transport passengers may use a smart card, may pay with cash for individual trips or may hold a season ticket for a certain route. By analyzing smart card data collected in Denmark in relation to data on sales of cash tickets, sales of season tickets, manual...

  14. Advancements in Big Data Processing

    CERN Document Server

    Vaniachine, A; The ATLAS collaboration

    2012-01-01

    The ever-increasing volumes of scientific data present new challenges for Distributed Computing and Grid-technologies. The emerging Big Data revolution drives new discoveries in scientific fields including nanotechnology, astrophysics, high-energy physics, biology and medicine. New initiatives are transforming data-driven scientific fields by pushing Big Data limits enabling massive data analysis in new ways. In petascale data processing scientists deal with datasets, not individual files. As a result, a task (comprised of many jobs) became a unit of petascale data processing on the Grid. Splitting of a large data processing task into jobs enabled fine-granularity checkpointing analogous to the splitting of a large file into smaller TCP/IP packets during data transfers. Transferring large data in small packets achieves reliability through automatic re-sending of the dropped TCP/IP packets. Similarly, transient job failures on the Grid can be recovered by automatic re-tries to achieve reliable Six Sigma produc...

  15. Was the Big Bang hot?

    Science.gov (United States)

    Wright, E. L.

    1983-01-01

    Techniques for verifying the spectrum defined by Woody and Richards (WR, 1981), which serves as a base for dust-distorted models of the 3 K background, are discussed. WR detected a sharp deviation from the Planck curve in the 3 K background. The absolute intensity of the background may be determined by the frequency dependence of the dipole anisotropy of the background or the frequency dependence effect in galactic clusters. Both methods involve the Doppler shift; analytical formulae are defined for characterization of the dipole anisotropy. The measurement of the 30-300 GHz spectra of cold galactic dust may reveal the presence of significant amounts of needle-shaped grains, which would in turn support a theory of a cold Big Bang.

  16. Big Bang nucleosynthesis in crisis?

    International Nuclear Information System (INIS)

    Hata, N.; Scherrer, R.J.; Steigman, G.; Thomas, D.; Walker, T.P.; Bludman, S.; Langacker, P.

    1995-01-01

    A new evaluation of the constraint on the number of light neutrino species (Nν) from big bang nucleosynthesis suggests a discrepancy between the predicted light element abundances and those inferred from observations, unless the inferred primordial ⁴He abundance has been underestimated by 0.014±0.004 (1σ) or less than 10% (95% C.L.) of ³He survives stellar processing. With the quoted systematic errors in the observed abundances and a conservative chemical evolution parametrization, the best fit to the combined data is Nν = 2.1±0.3 (1σ) and the upper limit is Nν < 2.6 (95% C.L.), which excludes the standard model (Nν = 3) at the 98.6% C.L. copyright 1995 The American Physical Society

  17. Recharge and Aquifer Response: Manukan Island’s Aquifer, Sabah, Malaysia

    Directory of Open Access Journals (Sweden)

    Sarva Mangala Praveena

    2010-01-01

    Full Text Available Manukan Island, a small island located in the north-west of Sabah, Malaysia, was used as a case study area for numerical modeling of an aquifer's response to recharge and pumping rates. The results of this study present the variations of recharge into the aquifer under the prediction simulations. The recharge rate increases the water level, as indicated by hydraulic heads. This shows that it can alter the groundwater of Manukan Island, which has been suffering from over-exploitation of its unconfined aquifer. An increase in recharge rate (from 600 mm/year to 750 mm/year) increases the water level indicated by hydraulic heads. A reduction in pumping rate (from 0.072 m3/day to 0.058 m3/day) not only increases water levels in the aquifer but also reduces the supply, hence a deficit in supply. The increase in hydraulic heads depends on the percentage reduction of pumping and recharge rates. The well water has 1978.3 mg/L chloride with the current pumping rate (0.072 m3/day) and recharge rate (600 mm/year). With an increased recharge rate and the current pumping rate, however, the chloride concentration decreased by about 1.13%; a reduction in pumping rate decreased it by about 2.8%. In general, a reduction in pumping combined with an increase in recharge rate leads to decreased chloride concentrations within the vicinity of the cone of depression. To further develop the numerical model, it should focus on climate-change variables: consequences of climate change include increases in air temperature, increases in sea-surface temperature, and more extreme weather conditions. These parameters are considered critical for climate-change impact modeling in aquifers. The behavior of the aquifer and its sustainable pumping rate can be determined by applying a computer modeling component.

  18. Characteristics of Southern California coastal aquifer systems

    Science.gov (United States)

    Edwards, B.D.; Hanson, R.T.; Reichard, E.G.; Johnson, T.A.

    2009-01-01

    Most groundwater produced within coastal Southern California occurs within three main types of siliciclastic basins: (1) deep (>600 m), elongate basins of the Transverse Ranges Physiographic Province, where basin axes and related fluvial systems strike parallel to tectonic structure, (2) deep (>6000 m), broad basins of the Los Angeles and Orange County coastal plains in the northern part of the Peninsular Ranges Physiographic Province, where fluvial systems cut across tectonic structure at high angles, and (3) shallow (75-350 m), relatively narrow fluvial valleys of the generally mountainous southern part of the Peninsular Ranges Physiographic Province in San Diego County. Groundwater pumped for agricultural, industrial, municipal, and private use from coastal aquifers within these basins increased with population growth since the mid-1850s. Despite a significant influx of imported water into the region in recent times, groundwater, although reduced as a component of total consumption, still constitutes a significant component of water supply. Historically, overdraft from the aquifers has caused land surface subsidence, flow between water basins with related migration of groundwater contaminants, as well as seawater intrusion into many shallow coastal aquifers. Although these effects have impacted water quality, most basins, particularly those with deeper aquifer systems, meet or exceed state and national primary and secondary drinking water standards. Municipalities, academicians, and local water and governmental agencies have studied the stratigraphy of these basins intensely since the early 1900s with the goals of understanding and better managing the important groundwater resource. Lack of a coordinated effort, due in part to jurisdictional issues, combined with the application of lithostratigraphic correlation techniques (based primarily on well cuttings coupled with limited borehole geophysics) have produced an often confusing, and occasionally conflicting

  19. Microbiological risks of recycling urban stormwater via aquifers.

    Science.gov (United States)

    Page, D; Gonzalez, D; Dillon, P

    2012-01-01

    With the release of the Australian Guidelines for Water Recycling: Managed Aquifer Recharge (MAR), aquifers are now being included as a treatment barrier when assessing risk of recycled water systems. A MAR research site recharging urban stormwater in a confined aquifer was used in conjunction with a Quantitative Microbial Risk Assessment to assess the microbial pathogen risk in the recovered water for different end uses. The assessment involved undertaking a detailed assessment of the treatment steps and exposure controls, including the aquifer, to achieve the microbial health-based targets.

  20. [Big data in medicine and healthcare].

    Science.gov (United States)

    Rüping, Stefan

    2015-08-01

    Healthcare is one of the business fields with the highest Big Data potential. According to the prevailing definition, Big Data refers to the fact that data today is often too large and heterogeneous and changes too quickly to be stored, processed, and transformed into value by previous technologies. Technological trends drive Big Data: business processes are more and more executed electronically, consumers produce more and more data themselves - e.g. in social networks - and digitalization is ever increasing. Currently, several new trends towards new data sources and innovative data analysis are appearing in medicine and healthcare. From the research perspective, omics research is one clear Big Data topic. In practice, electronic health records, free open data and the "quantified self" offer new perspectives for data analytics. Regarding analytics, significant advances have been made in information extraction from text data, which unlocks a lot of data from clinical documentation for analytics purposes. At the same time, medicine and healthcare are lagging behind in the adoption of Big Data approaches. This can be traced to particular problems regarding data complexity and organizational, legal, and ethical challenges. The growing uptake of Big Data in general, and first best-practice examples in medicine and healthcare in particular, indicate that innovative solutions will be coming. This paper gives an overview of the potentials of Big Data in medicine and healthcare.

  1. [Relevance of big data for molecular diagnostics].

    Science.gov (United States)

    Bonin-Andresen, M; Smiljanovic, B; Stuhlmüller, B; Sörensen, T; Grützkau, A; Häupl, T

    2018-04-01

    Big data analysis raises the expectation that computerized algorithms may extract new knowledge from otherwise unmanageable vast data sets. What are the algorithms behind the big data discussion? In principle, high-throughput technologies in molecular research already introduced big data, and the development and application of analysis tools, into the field of rheumatology some 15 years ago. This includes especially omics technologies, such as genomics, transcriptomics and cytomics. Some basic methods of data analysis are provided along with the technology; however, functional analysis and interpretation require adaptation of existing software tools or development of new ones. For these steps, structuring and evaluating according to the biological context is extremely important and not only a mathematical problem. This aspect has to be considered much more for molecular big data than for data analyzed in health economy or epidemiology. Molecular data are structured in a first order determined by the applied technology and present quantitative characteristics that follow the principles of their biological nature. These biological dependencies have to be integrated into software solutions, which may require networks of molecular big data of the same or even different technologies in order to achieve cross-technology confirmation. Increasingly extensive recording of molecular processes, including in individual patients, is generating personal big data and requires new strategies for management in order to develop data-driven individualized interpretation concepts. With this perspective in mind, translation of information derived from molecular big data will also require new specifications for education and professional competence.

  2. Restoration of Wadi Aquifers by Artificial Recharge with Treated Waste Water

    KAUST Repository

    Missimer, Thomas M.; Drewes, Jörg E.; Amy, Gary L.; Maliva, Robert G.; Keller, Stephanie

    2012-01-01

    , such as damage to sensitive nearshore marine environments and creation of high-salinity interior surface water areas. An investigation of the hydrogeology of wadi aquifers in Saudi Arabia revealed that these aquifers can be used to develop aquifer recharge

  3. Big data in forensic science and medicine.

    Science.gov (United States)

    Lefèvre, Thomas

    2018-07-01

    In less than a decade, big data in medicine has become quite a phenomenon, and many biomedical disciplines have their own tribune on the topic. Perspectives and debates are flourishing, while a consensual definition for big data is still lacking. The 3Vs paradigm is frequently evoked to define the big data principles and stands for Volume, Variety and Velocity. Even according to this paradigm, genuine big data studies are still scarce in medicine and may not meet all expectations. On one hand, techniques usually presented as specific to big data, such as machine learning techniques, are supposed to support the ambition of personalized, predictive and preventive medicine. These techniques are mostly far from new; the most ancient are more than 50 years old. On the other hand, several issues closely related to the properties of big data and inherited from other scientific fields, such as artificial intelligence, are often underestimated if not ignored. Besides, a few papers temper the almost unanimous big data enthusiasm and are worth attention, since they delineate what is at stake. In this context, forensic science is still awaiting its position papers as well as a comprehensive outline of what kind of contribution big data could bring to the field. The present situation calls for definitions and actions to rationally guide research and practice in big data. It is an opportunity for grounding a truly interdisciplinary approach in forensic science and medicine that is mainly based on evidence. Copyright © 2017 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.

  4. Straddle-packer aquifer test analyses of the Snake River Plain aquifer at the Idaho National Engineering Laboratory

    International Nuclear Information System (INIS)

    Johnson, G.S.; Frederick, D.B.

    1997-01-01

    The State of Idaho INEL Oversight Program, with the University of Idaho, Idaho State University, Boise State University, and the Idaho Geologic Survey, used a straddle-packer system to investigate vertical variations in characteristics of the Snake River Plain aquifer at the Idaho National Engineering Laboratory in southeast Idaho. Sixteen single-well aquifer tests were conducted on isolated intervals in three observation wells. Each of these wells has approximately 200 feet of open borehole below the water table, penetrating the E through G and I basalt flow groups and interbedded sediments of the Snake River Plain aquifer. The success of the aquifer tests was limited by the inability to induce measurable drawdown in several zones. Time-drawdown data from aquifer tests were matched to type curves for 8 of the 16 zones tested. A single aquifer test at the water table exhibited greater curvature than those at depth. The increased degree of curvature suggests an unconfined response and resulted in an estimate of specific yield of 0.03. Aquifer tests below the water table generally yielded time-drawdown graphs with a rapid initial response followed by constant drawdown throughout the duration of the tests; up to several hours in length. The rapid initial response implies that the aquifer responds as a confined system during brief pumping periods. The nearly constant drawdown suggests a secondary source of water, probably vertical flow from overlying and underlying aquifer layers. Three analytical models were applied for comparison with the conceptual model and to provide estimates of aquifer properties. The Theis, Hantush-Jacob leaky-aquifer, and Moench double-porosity fractured-rock models were fit to time-drawdown data. The leaky-aquifer type curves of Hantush and Jacob generally provided the best match to observed drawdown. A specific capacity regression equation was also used to estimate hydraulic conductivity
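
    A minimal sketch of the Hantush-Jacob leaky-aquifer well function used in such type-curve matching, evaluated here by direct numerical integration; the parameter values are arbitrary illustrations, not the INEL test results.

    ```python
    import numpy as np
    from scipy.integrate import quad

    def hantush_w(u, r_over_b):
        """Hantush-Jacob leaky-aquifer well function:
        W(u, r/B) = integral from u to inf of exp(-y - (r/B)**2/(4y)) / y dy."""
        integrand = lambda y: np.exp(-y - r_over_b**2 / (4.0 * y)) / y
        value, _ = quad(integrand, u, np.inf, limit=200)
        return value

    def leaky_drawdown(Q, T, u, r_over_b):
        """Drawdown s = Q / (4*pi*T) * W(u, r/B), in consistent units."""
        return Q / (4.0 * np.pi * T) * hantush_w(u, r_over_b)

    # Greater leakage (larger r/B) caps the well function, mirroring the
    # nearly constant drawdown seen in the packer-isolated intervals:
    for rb in (0.01, 0.1, 1.0):
        print(rb, round(hantush_w(1e-4, rb), 3))
    ```

    Unlike the Theis curve, which grows without bound at late time, W(u, r/B) flattens to a steady value as vertical leakage balances the pumping, which is why these type curves matched the observed behavior best.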

  5. Hydrogeology and Aquifer Storage and Recovery Performance in the Upper Floridan Aquifer, Southern Florida

    Science.gov (United States)

    Reese, Ronald S.; Alvarez-Zarikian, Carlos A.

    2007-01-01

    Well construction, hydraulic well test, ambient water-quality, and cycle test data were inventoried and compiled for 30 aquifer storage and recovery facilities constructed in the Floridan aquifer system in southern Florida. Most of the facilities are operated by local municipalities or counties in coastal areas, but five sites are currently being evaluated as part of the Comprehensive Everglades Restoration Plan. The relative performance of all sites with adequate cycle test data was determined, and compared with four hydrogeologic and design factors that may affect recovery efficiency. Testing or operational cycles include recharge, storage, and recovery periods that each last days or months. Cycle test data calculations were made including the potable water (chloride concentration of less than 250 milligrams per liter) recovery efficiency per cycle, total recovery efficiency per cycle, and cumulative potable water recovery efficiencies for all of the cycles at each site. The potable water recovery efficiency is the percentage of the total amount of potable water recharged for each cycle that is recovered; potable water recovery efficiency calculations (per cycle and cumulative) were the primary measures used to evaluate site performance in this study. Total recovery efficiency, which is the percent recovery at the end of each cycle, however, can be substantially higher and is the performance measure normally used in the operation of water-treatment plants. The Upper Floridan aquifer of the Floridan aquifer system currently is being used, or planned for use, at 29 of the aquifer storage and recovery sites. The Upper Floridan aquifer is continuous throughout southern Florida, and its overlying confinement is generally good; however, the aquifer contains brackish to saline ground water that can greatly affect freshwater storage and recovery due to dispersive mixing within the aquifer. The hydrogeology of the Upper Floridan varies in southern Florida; confinement
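
    The per-cycle and cumulative potable-water recovery efficiencies described above reduce to simple volume ratios; this sketch (with hypothetical cycle volumes) shows the calculation, including the typical rise in efficiency over successive cycles as mixing freshens the storage zone.

    ```python
    def potable_recovery_efficiency(recharged, recovered_potable):
        """Per-cycle potable-water recovery efficiency (percent): the share
        of the recharged volume recovered at < 250 mg/L chloride."""
        return 100.0 * recovered_potable / recharged

    def cumulative_efficiency(cycles):
        """Cumulative efficiency over (recharged, recovered_potable) pairs."""
        total_in = sum(recharged for recharged, _ in cycles)
        total_out = sum(recovered for _, recovered in cycles)
        return 100.0 * total_out / total_in

    # Hypothetical cycle volumes in million gallons; efficiency typically
    # climbs as successive cycles freshen the brackish storage zone.
    cycles = [(100.0, 18.0), (100.0, 34.0), (100.0, 51.0)]
    print([potable_recovery_efficiency(r, p) for r, p in cycles])
    print(round(cumulative_efficiency(cycles), 1))
    ```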

  6. Processing Solutions for Big Data in Astronomy

    Science.gov (United States)

    Fillatre, L.; Lepiller, D.

    2016-09-01

    This paper gives a simple introduction to processing solutions applied to massive amounts of data. It proposes a general presentation of the Big Data paradigm. The Hadoop framework, which is considered as the pioneering processing solution for Big Data, is described together with YARN, the integrated Hadoop tool for resource allocation. This paper also presents the main tools for the management of both the storage (NoSQL solutions) and computing capacities (MapReduce parallel processing schema) of a cluster of machines. Finally, more recent processing solutions like Spark are discussed. Big Data frameworks are now able to run complex applications while keeping the programming simple and greatly improving the computing speed.
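
    As an illustration of the MapReduce schema and the Spark engine discussed above, here is the canonical word-count job in PySpark; the input and output paths are placeholders.

    ```python
    # A minimal PySpark job in the MapReduce style the paper surveys:
    # map each word to (word, 1), then reduce by key to count occurrences.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("wordcount-sketch").getOrCreate()
    sc = spark.sparkContext

    counts = (sc.textFile("hdfs:///data/input.txt")   # placeholder input path
                .flatMap(lambda line: line.split())   # map: line -> words
                .map(lambda word: (word, 1))          # map: word -> (word, 1)
                .reduceByKey(lambda a, b: a + b))     # reduce: sum the counts

    counts.saveAsTextFile("hdfs:///data/counts")      # placeholder output path
    spark.stop()
    ```

    The same map and reduce steps could run under classic Hadoop MapReduce; Spark simply keeps the intermediate data in memory, which is the main source of its speed advantage mentioned above.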

  7. Big Data as Governmentality in International Development

    DEFF Research Database (Denmark)

    Flyverbom, Mikkel; Madsen, Anders Koed; Rasche, Andreas

    2017-01-01

    Statistics have long shaped the field of visibility for the governance of development projects. The introduction of big data has altered the field of visibility. Employing Dean's “analytics of government” framework, we analyze two cases—malaria tracking in Kenya and monitoring of food prices...... in Indonesia. Our analysis shows that big data introduces a bias toward particular types of visualizations. What problems are being made visible through big data depends to some degree on how the underlying data is visualized and who is captured in the visualizations. It is also influenced by technical factors...

  8. Big data governance an emerging imperative

    CERN Document Server

    Soares, Sunil

    2012-01-01

    Written by a leading expert in the field, this guide focuses on the convergence of two major trends in information management-big data and information governance-by taking a strategic approach oriented around business cases and industry imperatives. With the advent of new technologies, enterprises are expanding and handling very large volumes of data; this book, nontechnical in nature and geared toward business audiences, encourages the practice of establishing appropriate governance over big data initiatives and addresses how to manage and govern big data, highlighting the relevant processes,

  9. Astroinformatics: the big data of the universe

    OpenAIRE

    Barmby, Pauline

    2016-01-01

    In astrophysics we like to think that our field was the originator of big data, back when it had to be carried around in big sky charts and books full of tables. These days, it's easier to move astrophysics data around, but we still have a lot of it, and upcoming telescope facilities will generate even more. I discuss how astrophysicists approach big data in general, and give examples from some Western Physics & Astronomy research projects.  I also give an overview of ho...

  10. Big Data and historical social science

    Directory of Open Access Journals (Sweden)

    Peter Bearman

    2015-11-01

    Full Text Available “Big Data” can revolutionize historical social science if it arises from substantively important contexts and is oriented towards answering substantively important questions. Such data may be especially important for answering previously largely intractable questions about the timing and sequencing of events, and of event boundaries. That said, “Big Data” makes no difference for social scientists and historians whose accounts rest on narrative sentences. Since such accounts are the norm, the effects of Big Data on the practice of historical social science may be more limited than one might wish.

  11. Hot big bang or slow freeze?

    Science.gov (United States)

    Wetterich, C.

    2014-09-01

    We confront the big bang for the beginning of the universe with an equivalent picture of a slow freeze - a very cold and slowly evolving universe. In the freeze picture the masses of elementary particles increase and the gravitational constant decreases with cosmic time, while the Newtonian attraction remains unchanged. The freeze and big bang pictures both describe the same observations or physical reality. We present a simple 'crossover model' without a big bang singularity. In the infinite past space-time is flat. Our model is compatible with present observations, describing the generation of primordial density fluctuations during inflation as well as the present transition to a dark energy-dominated universe.

  12. Smart Information Management in Health Big Data.

    Science.gov (United States)

    Muteba A, Eustache

    2017-01-01

    The smart information management system (SIMS) is concerned with the organization of anonymous patient records in big data and their extraction in order to provide needful real-time intelligence. The purpose of the present study is to highlight the design and the implementation of the smart information management system. We emphasize, on the one hand, the organization of big data in flat files, in simulation of a NoSQL database, and, on the other hand, the extraction of information based on a lookup table and a cache mechanism. The SIMS in health big data aims at the identification of new therapies and approaches to delivering care.
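
    A minimal sketch of the flat-file organization with a lookup table and cache described above, assuming a hypothetical patients.csv whose first column is the patient identifier; the file layout, names, and cache size are our assumptions, not the paper's implementation.

    ```python
    from functools import lru_cache

    # Hypothetical flat file of anonymous patient records, one CSV row per
    # record, standing in for a NoSQL store: patient_id,field,value
    RECORDS_FILE = "patients.csv"

    def build_lookup(path):
        """Lookup table: byte offsets of each patient's rows in the file."""
        offsets = {}
        with open(path, "rb") as f:
            while True:
                pos = f.tell()
                line = f.readline()
                if not line:
                    break
                pid = line.split(b",", 1)[0].decode()
                offsets.setdefault(pid, []).append(pos)
        return offsets

    LOOKUP = build_lookup(RECORDS_FILE)

    @lru_cache(maxsize=4096)  # cache mechanism: recent extractions stay hot
    def extract(patient_id):
        """Pull one patient's rows via the lookup table, caching the result."""
        rows = []
        with open(RECORDS_FILE, "rb") as f:
            for pos in LOOKUP.get(patient_id, ()):
                f.seek(pos)
                rows.append(tuple(f.readline().decode().strip().split(",")))
        return tuple(rows)
    ```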

  13. Hydrogeology of the Cambrian-Ordovician aquifer system in the northern Midwest: B in Regional aquifer-system analysis

    Science.gov (United States)

    Young, H.L.; Siegel, D.I.

    1992-01-01

    The Cambrian-Ordovician aquifer system contains the most extensive and continuous aquifers in the northern Midwest of the United States. It is the source of water for many municipalities, industries, and rural water users. Since the beginning of ground-water development from the aquifer system in the late 1800's, hydraulic heads have declined hundreds of feet in the heavily pumped Chicago-Milwaukee area and somewhat less in other metropolitan areas. The U.S. Geological Survey has completed a regional assessment of this aquifer system within a 161,000-square-mile area encompassing northern Illinois, northwestern Indiana, Iowa, southeastern Minnesota, northern Missouri, and Wisconsin.

  14. Big questions, big science: meeting the challenges of global ecology.

    Science.gov (United States)

    Schimel, David; Keller, Michael

    2015-04-01

    Ecologists are increasingly tackling questions that require significant infrastructure, large experiments, networks of observations, and complex data and computation. Key hypotheses in ecology increasingly require more investment, and larger data sets to be tested, than can be collected by a single investigator's or a group of investigators' labs, sustained for longer than a typical grant. Large-scale projects are expensive, so their scientific return on the investment has to justify the opportunity cost: the science foregone because resources were expended on a large project rather than on supporting a number of individual projects. In addition, their management must be accountable and efficient in the use of significant resources, requiring the use of formal systems engineering and project management to mitigate the risk of failure. Mapping the scientific method into formal project management requires both scientists able to work in that context and a project implementation team sensitive to the unique requirements of ecology. Sponsoring agencies, under pressure from external and internal forces, experience many pressures that push them towards counterproductive project management, but a scientific community aware of and experienced in large-project science can mitigate these tendencies. For big ecology to result in great science, ecologists must become informed, aware and engaged in the advocacy and governance of large ecological projects.

  15. Fresh Water Generation from Aquifer-Pressured Carbon Storage: Annual Report FY09

    Energy Technology Data Exchange (ETDEWEB)

    Wolery, T; Aines, R; Hao, Y; Bourcier, W; Wolfe, T; Haussman, C

    2009-11-25

    This project is establishing the potential for using brine pressurized by Carbon Capture and Storage (CCS) operations in saline formations as the feedstock for desalination and water treatment technologies including reverse osmosis (RO) and nanofiltration (NF). The aquifer pressure resulting from the energy required to inject the carbon dioxide provides all or part of the inlet pressure for the desalination system. Residual brine is reinjected into the formation at net volume reduction, such that the volume of fresh water extracted balances the volume of CO{sub 2} injected into the formation. This process provides additional CO{sub 2} storage capacity in the aquifer, reduces operational risks (cap-rock fracturing, contamination of neighboring fresh water aquifers, and seismicity) by relieving overpressure in the formation, and provides a source of low-cost fresh water to offset costs or operational water needs. This multi-faceted project combines elements of geochemistry, reservoir engineering, and water treatment engineering. The range of saline formation waters is being identified and analyzed. Computer modeling and laboratory-scale experimentation are being used to examine mineral scaling and osmotic pressure limitations. Computer modeling is being used to evaluate processes in the storage aquifer, including the evolution of the pressure field. Water treatment costs are being evaluated by comparing the necessary process facilities to those in common use for seawater RO. At present, limited brine-composition data for actual CCS sites are available from the site operators, including, in the U.S., the seven regional Carbon Sequestration Partnerships (CSPs). To work around this, we are building a 'catalog' of compositions representative of 'produced' waters (waters produced in the course of seeking or producing oil and gas), to which we are adding data from actual CCS sites as they become available. Produced waters comprise the most common
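
    The osmotic-pressure limitation mentioned above can be estimated with the van 't Hoff relation π = iMRT; the sketch below treats the brine as fully dissociated NaCl, which is a rough assumption for real produced waters, and the salinities are illustrative.

    ```python
    R = 8.314        # gas constant, J/(mol*K)
    T = 298.15       # temperature, K
    I_VANT_HOFF = 2  # ions per formula unit for fully dissociated NaCl

    def osmotic_pressure_mpa(tds_g_per_l, molar_mass=58.44):
        """Approximate osmotic pressure (MPa) of an NaCl-dominated brine."""
        molarity = tds_g_per_l / molar_mass                # mol/L
        return I_VANT_HOFF * molarity * 1e3 * R * T / 1e6  # mol/m^3 -> MPa

    # Brackish produced water (~10 g/L) vs. seawater-like brine (~35 g/L):
    for tds in (10.0, 35.0):
        print(tds, "g/L ->", round(osmotic_pressure_mpa(tds), 2), "MPa")
    ```

    The inlet pressure delivered by the aquifer must exceed this osmotic pressure (roughly 0.8 MPa for the brackish case and about 3 MPa for the seawater-like case under these assumptions) before any permeate can be produced.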

  16. NOAA Big Data Partnership RFI

    Science.gov (United States)

    de la Beaujardiere, J.

    2014-12-01

    In February 2014, the US National Oceanic and Atmospheric Administration (NOAA) issued a Big Data Request for Information (RFI) from industry and other organizations (e.g., non-profits, research laboratories, and universities) to assess capability and interest in establishing partnerships to position a copy of NOAA's vast data holdings in the Cloud, co-located with easy and affordable access to analytical capabilities. This RFI was motivated by a number of concerns. First, NOAA's data facilities do not necessarily have sufficient network infrastructure to transmit all available observations and numerical model outputs to all potential users, or sufficient infrastructure to support simultaneous computation by many users. Second, the available data are distributed across multiple services and data facilities, making it difficult to find and integrate data for cross-domain analysis and decision-making. Third, large datasets require users to have substantial network, storage, and computing capabilities of their own in order to fully interact with and exploit the latent value of the data. Finally, there may be commercial opportunities for value-added products and services derived from our data. Putting a working copy of data in the Cloud outside of NOAA's internal networks and infrastructures should reduce demands and risks on our systems, and should enable users to interact with multiple datasets and create new lines of business (much like the industries built on government-furnished weather or GPS data). The NOAA Big Data RFI therefore solicited information on technical and business approaches regarding possible partnership(s) that -- at no net cost to the government and minimum impact on existing data facilities -- would unleash the commercial potential of its environmental observations and model outputs. NOAA would retain the master archival copy of its data. Commercial partners would not be permitted to charge fees for access to the NOAA data they receive, but

  17. BIG SKY CARBON SEQUESTRATION PARTNERSHIP

    Energy Technology Data Exchange (ETDEWEB)

    Susan M. Capalbo

    2005-01-31

    The Big Sky Carbon Sequestration Partnership, led by Montana State University, is comprised of research institutions, public entities and private sector organizations, and the Confederated Salish and Kootenai Tribes and the Nez Perce Tribe. Efforts under this Partnership in Phase I fall into four areas: evaluation of sources and carbon sequestration sinks that will be used to determine the location of pilot demonstrations in Phase II; development of a GIS-based reporting framework that links with national networks; designing an integrated suite of monitoring, measuring, and verification technologies and assessment frameworks; and initiating a comprehensive education and outreach program. The groundwork is in place to provide an assessment of storage capabilities for CO{sub 2} utilizing the resources found in the Partnership region (both geological and terrestrial sinks) that would complement the ongoing DOE research. Efforts are underway to showcase the architecture of the GIS framework and initial results for sources and sinks. The region has a diverse array of geological formations that could provide storage options for carbon in one or more of its three states. Likewise, initial estimates of terrestrial sinks indicate a vast potential for increasing and maintaining soil C on forested, agricultural, and reclaimed lands. Both options include the potential for offsetting economic benefits to industry and society. Steps have been taken to assure that the GIS-based framework is consistent among types of sinks within the Big Sky Partnership area and with the efforts of other western DOE partnerships. The Partnership recognizes the critical importance of measurement, monitoring, and verification (MMV) technologies to support not only carbon trading but all policies and programs that DOE and other agencies may want to pursue in support of GHG mitigation. The efforts in developing and implementing MMV technologies for geological sequestration reflect this concern. Research is

  18. WTAQ - A computer program for aquifer-test analysis of confined and unconfined aquifers

    Science.gov (United States)

    Barlow, P.M.; Moench, A.F.

    2004-01-01

    Computer program WTAQ was developed to implement a Laplace-transform analytical solution for axial-symmetric flow to a partially penetrating, finite-diameter well in a homogeneous and anisotropic unconfined (water-table) aquifer. The solution accounts for wellbore storage and skin effects at the pumped well, delayed response at an observation well, and delayed or instantaneous drainage from the unsaturated zone. For the particular case of zero drainage from the unsaturated zone, the solution simplifies to that of axial-symmetric flow in a confined aquifer. WTAQ calculates theoretical time-drawdown curves for the pumped well and observation wells and piezometers. The theoretical curves are used with measured time-drawdown data to estimate hydraulic parameters of confined or unconfined aquifers by graphical type-curve methods or by automatic parameter-estimation methods. Parameters that can be estimated are horizontal and vertical hydraulic conductivity, specific storage, and specific yield. A sample application illustrates use of WTAQ for estimating hydraulic parameters of a hypothetical, unconfined aquifer by type-curve methods. Copyright ASCE 2004.
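
    As a concrete illustration of the type-curve calculation WTAQ automates, the confined-aquifer end member it reduces to is the classical Theis (1935) solution, s = Q/(4*pi*T) * W(u) with u = r^2*S/(4*T*t). The minimal Python sketch below generates a theoretical time-drawdown curve; the pumping rate, transmissivity, and storativity are hypothetical values chosen for illustration, not WTAQ defaults.

    # Minimal sketch of the confined-aquifer case WTAQ simplifies to:
    # the Theis solution for a fully penetrating well, no wellbore storage.
    import numpy as np
    from scipy.special import exp1   # E1 integral = Theis well function W(u)

    def theis_drawdown(r, t, Q, T, S):
        """Drawdown (m) at radius r (m), time t (s), for pumping rate Q (m^3/s),
        transmissivity T (m^2/s), and storativity S (-)."""
        u = r**2 * S / (4.0 * T * t)
        return Q / (4.0 * np.pi * T) * exp1(u)

    # Hypothetical aquifer and test parameters:
    times = np.logspace(1, 5, 9)   # 10 s to ~1 day
    s = theis_drawdown(r=30.0, t=times, Q=0.01, T=5e-3, S=2e-4)
    for t, sd in zip(times, s):
        print(f"t = {t:9.0f} s   s = {sd:.3f} m")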

  19. Big Data and HPC collocation: Using HPC idle resources for Big Data Analytics

    OpenAIRE

    MERCIER , Michael; Glesser , David; Georgiou , Yiannis; Richard , Olivier

    2017-01-01

    International audience; Executing Big Data workloads upon High Performance Computing (HPC) infrastructures has become an attractive way to improve their performance. However, the collocation of HPC and Big Data workloads is not an easy task, mainly because of their core concepts' differences. This paper focuses on the challenges related to the scheduling of both Big Data and HPC workloads on the same computing platform. In classic HPC workloads, the rigidity of jobs tends to create holes in ...

  20. Semantic Web Technologies and Big Data Infrastructures: SPARQL Federated Querying of Heterogeneous Big Data Stores

    OpenAIRE

    Konstantopoulos, Stasinos; Charalambidis, Angelos; Mouchakis, Giannis; Troumpoukis, Antonis; Jakobitsch, Jürgen; Karkaletsis, Vangelis

    2016-01-01

    The ability to cross-link large scale data with each other and with structured Semantic Web data, and the ability to uniformly process Semantic Web and other data adds value to both the Semantic Web and to the Big Data community. This paper presents work in progress towards integrating Big Data infrastructures with Semantic Web technologies, allowing for the cross-linking and uniform retrieval of data stored in both Big Data infrastructures and Semantic Web data. The technical challenges invo...
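
    As a concrete illustration of the federated-querying pattern the paper targets, the hedged sketch below uses the SPARQL 1.1 SERVICE clause, dispatched from Python via the SPARQLWrapper library, to join data held in two stores. The endpoint URLs and the ex: vocabulary are hypothetical placeholders, not systems from the paper.

    # Sketch: one federated SPARQL query spanning a local and a remote store.
    from SPARQLWrapper import SPARQLWrapper, JSON

    sparql = SPARQLWrapper("http://example.org/local/sparql")  # assumed endpoint
    sparql.setQuery("""
        PREFIX ex: <http://example.org/schema#>
        SELECT ?station ?reading WHERE {
            ?station a ex:Sensor .
            SERVICE <http://example.org/remote/sparql> {  # assumed remote store
                ?station ex:latestReading ?reading .
            }
        }
    """)
    sparql.setReturnFormat(JSON)
    results = sparql.query().convert()
    for row in results["results"]["bindings"]:
        print(row["station"]["value"], row["reading"]["value"])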

  1. Source, variability, and transformation of nitrate in a regional karst aquifer: Edwards aquifer, central Texas.

    Science.gov (United States)

    Musgrove, MaryLynn; Opsahl, Stephen P.; Mahler, Barbara J.; Herrington, Chris; Sample, Thomas; Banta, John

    2016-01-01

    Many karst regions are undergoing rapid population growth and expansion of urban land accompanied by increases in wastewater generation and changing patterns of nitrate (NO3−) loading to surface and groundwater. We investigate variability and sources of NO3− in a regional karst aquifer system, the Edwards aquifer of central Texas. Samples from streams recharging the aquifer, groundwater wells, and springs were collected during 2008–12 from the Barton Springs and San Antonio segments of the Edwards aquifer and analyzed for nitrogen (N) species concentrations and NO3− stable isotopes (δ15N and δ18O). These data were augmented by historical data collected from 1937 to 2007. NO3− concentrations and discharge data indicate that short-term variability (days to months) in groundwater NO3− concentrations in the Barton Springs segment is controlled by occurrence of individual storms and multi-annual wet-dry cycles, whereas the lack of short-term variability in groundwater in the San Antonio segment indicates the dominance of transport along regional flow paths. In both segments, longer-term increases (years to decades) in NO3− concentrations cannot be attributed to hydrologic conditions; rather, isotopic ratios and land-use change indicate that septic systems and land application of treated wastewater might be the source of increased loading of NO3−. These results highlight the vulnerability of karst aquifers to NO3− contamination from urban wastewater. An analysis of N-species loading in recharge and discharge for the Barton Springs segment during 2008–10 indicates an overall mass balance in total N, but recharge contains higher concentrations of organic N and lower concentrations of NO3−than does discharge, consistent with nitrification of organic N within the aquifer and consumption of dissolved oxygen. This study demonstrates that subaqueous nitrification of organic N in the aquifer, as opposed to in soils, might be a previously
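
    The isotopic source attribution used here often rests on a simple two-end-member mixing calculation in δ15N space. The Python sketch below shows the arithmetic with illustrative end-member values (not the study's); it assumes comparable NO3− concentrations in both sources and negligible fractionation.

    # Sketch: two-end-member d15N mixing to estimate a wastewater fraction.
    d15n_soil = 4.0         # per mil, assumed natural soil-N end member
    d15n_wastewater = 12.0  # per mil, assumed septic/wastewater end member
    d15n_sample = 9.0       # per mil, hypothetical groundwater measurement

    frac = (d15n_sample - d15n_soil) / (d15n_wastewater - d15n_soil)
    print(f"implied wastewater-derived NO3- fraction: {frac:.0%}")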

  2. Soft computing in big data processing

    CERN Document Server

    Park, Seung-Jong; Lee, Jee-Hyong

    2014-01-01

    Big data is an essential key to building a smart world, encompassing the streaming, continuous integration of large-volume, high-velocity data from all sources to final destinations. Big data ranges across data mining, data analysis, and decision making, drawing statistical rules and mathematical patterns through systematic or automatic reasoning. Big data helps serve our lives better, clarify our future, and deliver greater value. We can discover how to capture and analyze data. Readers will be guided through processing-system integrity and implementing intelligent systems. With intelligent systems, we deal with the fundamental data management and visualization challenges in the effective management of dynamic and large-scale data and the efficient processing of real-time and spatio-temporal data. Advanced intelligent systems have led to managing data monitoring, data processing, and decision-making in a realistic and effective way. Considering the big size of data, the variety of data, and frequent chan...

  3. 2015 OLC Lidar DEM: Big Wood, ID

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Quantum Spatial has collected Light Detection and Ranging (LiDAR) data for the Oregon LiDAR Consortium (OLC) Big Wood 2015 study area. This study area is located in...

  4. Big Data Components for Business Process Optimization

    Directory of Open Access Journals (Sweden)

    Mircea Raducu TRIFU

    2016-01-01

    Full Text Available In these days, more and more people talk about Big Data, Hadoop, noSQL and so on, but very few technical people have the necessary expertise and knowledge to work with those concepts and technologies. The present issue explains one of the concepts that stands behind two of those keywords: the MapReduce concept. The MapReduce model is the one that makes Big Data and Hadoop so powerful, fast, and diverse for business process optimization. MapReduce is a programming model with an implementation built to process and generate large data sets. In addition, the benefits of integrating Hadoop in the context of Business Intelligence and Data Warehousing applications are presented. The concepts and technologies behind big data enable organizations to reach a variety of objectives. Like other new information technologies, the most important objective of big data technology is to bring dramatic cost reduction.
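
    A minimal, framework-free sketch of the MapReduce model may help: word counting expressed as a map phase that emits (word, 1) pairs, a shuffle that groups pairs by key, and a reduce phase that sums each group. Hadoop distributes exactly this pattern across a cluster; here everything runs in a single Python process for clarity.

    # Sketch: the MapReduce pattern, in-process and without Hadoop.
    from collections import defaultdict

    def map_phase(document):
        return [(word.lower(), 1) for word in document.split()]

    def shuffle(pairs):
        groups = defaultdict(list)
        for key, value in pairs:
            groups[key].append(value)
        return groups

    def reduce_phase(groups):
        return {key: sum(values) for key, values in groups.items()}

    docs = ["big data makes MapReduce popular", "big data needs big clusters"]
    pairs = [p for doc in docs for p in map_phase(doc)]
    print(reduce_phase(shuffle(pairs)))   # {'big': 3, 'data': 2, ...}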

  5. Scaling big data with Hadoop and Solr

    CERN Document Server

    Karambelkar, Hrishikesh Vijay

    2015-01-01

    This book is aimed at developers, designers, and architects who would like to build big data enterprise search solutions for their customers or organizations. No prior knowledge of Apache Hadoop and Apache Solr/Lucene technologies is required.

  6. Big Data and Analytics in Healthcare.

    Science.gov (United States)

    Tan, S S-L; Gao, G; Koch, S

    2015-01-01

    This editorial is part of the Focus Theme of Methods of Information in Medicine on "Big Data and Analytics in Healthcare". The amount of data being generated in the healthcare industry is growing at a rapid rate. This has generated immense interest in leveraging the availability of healthcare data (and "big data") to improve health outcomes and reduce costs. However, the nature of healthcare data, and especially big data, presents unique challenges in processing and analyzing big data in healthcare. This Focus Theme aims to disseminate some novel approaches to address these challenges. More specifically, approaches ranging from efficient methods of processing large clinical data to predictive models that could generate better predictions from healthcare data are presented.

  7. Statistical Challenges in Modeling Big Brain Signals

    KAUST Repository

    Yu, Zhaoxia

    2017-11-01

    Brain signal data are inherently big: massive in amount, complex in structure, and high in dimensions. These characteristics impose great challenges for statistical inference and learning. Here we review several key challenges, discuss possible solutions, and highlight future research directions.

  8. NDE Big Data Framework, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — NDE data has become "Big Data", and is overwhelming the abilities of NDE technicians and commercially available tools to deal with it. In the current state of the...

  9. Cosmic relics from the big bang

    International Nuclear Information System (INIS)

    Hall, L.J.

    1988-12-01

    A brief introduction to the big bang picture of the early universe is given. Dark matter is discussed; particularly its implications for elementary particle physics. A classification scheme for dark matter relics is given. 21 refs., 11 figs., 1 tab

  10. ARC Code TI: BigView

    Data.gov (United States)

    National Aeronautics and Space Administration — BigView allows for interactive panning and zooming of images of arbitrary size on desktop PCs running Linux. Additionally, it can work in a multi-screen environment...

  11. Big Data and HPC: A Happy Marriage

    KAUST Repository

    Mehmood, Rashid

    2016-01-01

    International Data Corporation (IDC) defines Big Data technologies as “a new generation of technologies and architectures, designed to economically extract value from very large volumes of a wide variety of data produced every day, by enabling high

  12. Don't Trust the Big Man

    National Research Council Canada - National Science Library

    Coerr, Stanton S

    2008-01-01

    .... A full-length analysis of the Democratic Republic of the Congo and its full-scale insurgencies since 1960 -- at the hands of Big Man leaders Lumumba, Mobutu, and Kabila -- provides object lessons...

  13. Quantum nature of the big bang.

    Science.gov (United States)

    Ashtekar, Abhay; Pawlowski, Tomasz; Singh, Parampreet

    2006-04-14

    Some long-standing issues concerning the quantum nature of the big bang are resolved in the context of homogeneous isotropic models with a scalar field. Specifically, the known results on the resolution of the big-bang singularity in loop quantum cosmology are significantly extended as follows: (i) the scalar field is shown to serve as an internal clock, thereby providing a detailed realization of the "emergent time" idea; (ii) the physical Hilbert space, Dirac observables, and semiclassical states are constructed rigorously; (iii) the Hamiltonian constraint is solved numerically to show that the big bang is replaced by a big bounce. Thanks to the nonperturbative, background independent methods, unlike in other approaches the quantum evolution is deterministic across the deep Planck regime.

  14. Big Data in food and agriculture

    Directory of Open Access Journals (Sweden)

    Kelly Bronson

    2016-06-01

    Full Text Available Farming is undergoing a digital revolution. Our existing review of current Big Data applications in the agri-food sector has revealed several collection and analytics tools that may have implications for relationships of power between players in the food system (e.g., between farmers and large corporations). For example, who retains ownership of the data generated by applications like Monsanto Corporation's Weed I.D. “app”? Are there privacy implications with the data gathered by John Deere's precision agricultural equipment? Systematically tracing the digital revolution in agriculture, and charting the affordances as well as the limitations of Big Data applied to food and agriculture, should be a broad research goal for Big Data scholarship. Such a goal brings data scholarship into conversation with food studies and it allows for a focus on the material consequences of big data in society.

  15. Fisicos argentinos reproduciran el Big Bang

    CERN Multimedia

    De Ambrosio, Martin

    2008-01-01

    Two groups of Argentine physicists, from the Universities of La Plata and Buenos Aires, are working on a series of experiments that will recreate the conditions of the big explosion at the origin of the universe. (1 page)

  16. Statistical Challenges in Modeling Big Brain Signals

    KAUST Repository

    Yu, Zhaoxia; Pluta, Dustin; Shen, Tong; Chen, Chuansheng; Xue, Gui; Ombao, Hernando

    2017-01-01

    Brain signal data are inherently big: massive in amount, complex in structure, and high in dimensions. These characteristics impose great challenges for statistical inference and learning. Here we review several key challenges, discuss possible

  17. Bioinformatics clouds for big data manipulation

    KAUST Repository

    Dai, Lin; Gao, Xin; Guo, Yan; Xiao, Jingfa; Zhang, Zhang

    2012-01-01

    -supplied cloud computing delivered over the Internet, in order to handle the vast quantities of biological data generated by high-throughput experimental technologies. Albeit relatively new, cloud computing promises to address big data storage and analysis issues

  18. Big data business models: Challenges and opportunities

    Directory of Open Access Journals (Sweden)

    Ralph Schroeder

    2016-12-01

    Full Text Available This paper, based on 28 interviews from a range of business leaders and practitioners, examines the current state of big data use in business, as well as the main opportunities and challenges presented by big data. It begins with an account of the current landscape and what is meant by big data. Next, it draws distinctions between the ways organisations use data and provides a taxonomy of big data business models. We observe a variety of different business models, depending not only on sector, but also on whether the main advantages derive from analytics capabilities or from having ready access to valuable data sources. Some major challenges emerge from this account, including data quality and protectiveness about sharing data. The conclusion discusses these challenges, and points to the tensions and differing perceptions about how data should be governed as between business practitioners, the promoters of open data, and the wider public.

  19. Cosmic relics from the big bang

    Energy Technology Data Exchange (ETDEWEB)

    Hall, L.J.

    1988-12-01

    A brief introduction to the big bang picture of the early universe is given. Dark matter is discussed; particularly its implications for elementary particle physics. A classification scheme for dark matter relics is given. 21 refs., 11 figs., 1 tab.

  20. Big Bend National Park: Acoustical Monitoring 2010

    Science.gov (United States)

    2013-06-01

    During the summer of 2010 (September–October 2010), the Volpe Center collected baseline acoustical data at Big Bend National Park (BIBE) at four sites deployed for approximately 30 days each. The baseline data collected during this period will be...

  1. Big Data for Business Ecosystem Players

    Directory of Open Access Journals (Sweden)

    Perko Igor

    2016-06-01

    Full Text Available In this research, some of the most promising Big Data usage domains are connected with distinguished player groups found in the business ecosystem. Literature analysis is used to identify the state of the art of Big Data-related research in the major domains of its use, namely individual marketing, health treatment, work opportunities, financial services, and security enforcement. System theory was used to identify the business ecosystem's major player types disrupted by Big Data: individuals, small and mid-sized enterprises, large organizations, information providers, and regulators. Relationships between the domains and players were explained through new Big Data opportunities and threats and by players’ responsive strategies. System dynamics was used to visualize relationships in the provided model.

  2. Hydrodynamic simulations of physical aquatic habitat availability for Pallid Sturgeon in the Lower Missouri River, at Yankton, South Dakota, Kenslers Bend, Nebraska, Little Sioux, Iowa, and Miami, Missouri, 2006-07

    Science.gov (United States)

    Jacobson, Robert B.; Johnson, Harold E.; Dietsch, Benjamin J.

    2009-01-01

    The objective of this study was to assess the sensitivity of habitat availability in the Lower Missouri River to discharge variation, with emphasis on habitats that might support spawning of the endangered pallid sturgeon. We constructed computational hydrodynamic models for four reaches that were selected because of evidence that sturgeon have spawned in them. The reaches are located at Miami, Missouri (river mile 259.6–263.5), Little Sioux, Iowa (river mile 669.6–673.5), Kenslers Bend, Nebraska (river mile 743.9–748.1), and Yankton, South Dakota (river mile 804.8–808.4). The models were calibrated for a range of measured flow conditions, and run for a range of discharges that might be affected by flow modifications from Gavins Point Dam. Model performance was assessed by comparing modeled and measured water velocities. A selection of derived habitat units was assessed for sensitivity to hydraulic input parameters (drag coefficient and lateral eddy viscosity). Overall, model results were minimally sensitive to varying eddy viscosity; varying lateral eddy viscosity by 20 percent resulted in a maximum change in habitat units of 5.4 percent. Shallow-water habitat units were most sensitive to variation in drag coefficient, with 42 percent change in unit area resulting from 20 percent change in the parameter value; however, no habitat unit value changed more than 10 percent for a 10 percent variation in drag coefficient. Sensitivity analysis provides guidance for selecting habitat metrics that maximize information content while minimizing model uncertainties. To assess model sensitivities arising from topographic variation from sediment transport on an annual time scale, we constructed separate models from two complete independent surveys in 2006 and 2007. The net topographic change was minimal at each site; the ratio of net topographic change to water volume in the reaches at 95 percent exceedance flow was less than 5 percent, indicating that on a reach

  3. Aquifer thermal energy (heat and chill) storage

    Energy Technology Data Exchange (ETDEWEB)

    Jenne, E.A. (ed.)

    1992-11-01

    As part of the 1992 Intersociety Conversion Engineering Conference, held in San Diego, California, August 3--7, 1992, the Seasonal Thermal Energy Storage Program coordinated five sessions dealing specifically with aquifer thermal energy storage technologies (ATES). Researchers from Sweden, The Netherlands, Germany, Switzerland, Denmark, Canada, and the United States presented papers on a variety of ATES related topics. With special permission from the Society of Automotive Engineers, host society for the 1992 IECEC, these papers are being republished here as a standalone summary of ATES technology status. Individual papers are indexed separately.

  4. The detection of boundaries in leaky aquifers

    International Nuclear Information System (INIS)

    Cook, A.J.

    1989-01-01

    Geological faults in sedimentary basins can affect the regional and local groundwater flow patterns by virtue of their enhanced permeability properties. Faults can be regarded as vertical flow boundaries and potentially important routes for radionuclide migration from a theoretical radioactive waste repository. This report investigates the hydraulic testing methods currently available which may be used to locate vertical hydraulic discontinuities (boundaries) within an aquifer. It aims to define the theoretical limitations to boundary detection by a single pumping test, to determine the optimum design of a pumping test for locating boundaries, and to define the practical limitations to boundary detection by a pumping test. (author)
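
    The classical way a single pumping test reveals a vertical no-flow boundary such as a sealing fault is the image-well construction: add the drawdown of an imaginary well mirrored across the boundary and look for the late-time departure from the infinite-aquifer curve. A minimal Python sketch under assumed, hypothetical parameters follows.

    # Sketch: barrier (no-flow) boundary modeled by superposing an image well.
    import numpy as np
    from scipy.special import exp1

    def theis(r, t, Q, T, S):
        return Q / (4.0 * np.pi * T) * exp1(r**2 * S / (4.0 * T * t))

    Q, T, S = 0.01, 5e-3, 2e-4   # hypothetical test and aquifer parameters
    r_real = 30.0                # pumped well to observation well (m)
    r_image = 400.0              # observation well to image well (m), ~2x boundary distance

    t = np.logspace(2, 6, 9)
    s_inf = theis(r_real, t, Q, T, S)            # infinite aquifer
    s_bnd = s_inf + theis(r_image, t, Q, T, S)   # with barrier boundary
    for ti, si, sb in zip(t, s_inf, s_bnd):
        print(f"t = {ti:8.0f} s   infinite: {si:.3f} m   with barrier: {sb:.3f} m")
    # A doubling of the semi-log slope at late time is the boundary signature.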

  5. Geopressured-geothermal aquifers. Final contract report

    Energy Technology Data Exchange (ETDEWEB)

    1983-08-01

    Task 1 is to provide petrophysical and reservoir analysis of wells drilled into geopressured-geothermal aquifers containing dissolved methane. The Design Wells and Wells of Opportunity analyzed are: Fairfax Foster Sutter No. 2 (WOO), Pleasant Bayou No. 2 (Design), Amoco Fee No. 1 (Design), G.M. Koelemay No. 1 (WOO), Gladys McCall No. 1 (Design), P.R. Girouard No. 1 (WOO), and Crown Zellerbach No. 2 (WOO). Petrophysical and reservoir analyses of the above wells were performed based on the availability of data. The report describes the analysis performed on each well, the assumptions made during simulation, and the conclusions reached.

  6. Denitrification in the karstic Floridan Aquifer

    Science.gov (United States)

    Fork, M.; Albertin, A. R.; Heffernan, J. B.; Katz, B. G.; Cohen, M. J.

    2010-12-01

    Nitrate concentrations in the karstic Floridan Aquifer have increased dramatically over the past 50 years, owing to agricultural intensification and urbanization. Due to low concentrations of organic matter and moderately oxic conditions in the Floridan Aquifer, groundwater denitrification has been assumed to be negligible. In this study, we evaluate that assumption using both existing and new data describing dissolved gases (Ne, N2, O2, Ar) and NO3- concentration and isotopic composition (δ18O- and δ15N-NO3) in the aquifer’s artesian springs. For new data, we collected samples from 33 spring vents representing a gradient of both DO and NO3- concentrations in northern Florida and used Membrane Inlet Mass Spectrometry (MIMS) to directly measure dissolved N2 and Ar. We modeled the physical processes (recharge temperature, dissolution of excess air) driving super-saturation of N2 gas using Ne and Ar where data describing Ne were available. Ar concentrations were correlated closely with recharge temperature, which ranged from 15.7 - 22.2°C, while Ne was closely correlated with excess air, which ranged from 1.05 to 2.66 mg L-1 and averaged 1.83 mg L-1. Estimates of physical mechanisms allowed calculation of expected N2 concentrations that were compared to observed N2 concentrations. Where Ne data were unavailable, we assumed excess air equal to the empirical average. Overall, observed N2 exceeded expectations based on physical processes in 33 of 47 cases; average excess N2 was 0.48 mg L-1 across all sites. In addition, excess N2 was negatively correlated with DO (r2 = 0.46); springs with low DO (Aquifer. Low DOC concentrations indicate that alternative electron donors may fuel nitrate reduction. Scaling to regional estimates of N2 production based on springs discharge and DO concentrations indicates that subsurface denitrification may account for some of the imbalance in springshed nutrient budgets. In addition, we conclude that use of δ15N-NO3- to diagnose

  7. SRP baseline hydrogeologic investigation: Aquifer characterization

    Energy Technology Data Exchange (ETDEWEB)

    Strom, R.N.; Kaback, D.S.

    1992-03-31

    An investigation of the mineralogy and chemistry of the principal hydrogeologic units and the geochemistry of the water in the principal aquifers at Savannah River Site (SRS) was undertaken as part of the Baseline Hydrogeologic Investigation. This investigation was conducted to provide background data for future site studies and reports and to provide a site-wide interpretation of the geology and geochemistry of the Coastal Plain Hydrostratigraphic province. Ground water samples were analyzed for major cations and anions, minor and trace elements, gross alpha and beta, tritium, stable isotopes of hydrogen, oxygen, and carbon, and carbon-14. Sediments from the well borings were analyzed for mineralogy and major and minor elements.

  8. The usefulness of multi-well aquifer tests in heterogeneous aquifers

    International Nuclear Information System (INIS)

    Young, S.C.; Benton, D.J.; Herweijer, J.C.; Sims, P.

    1990-01-01

    Three large-scale (100 m) and seven small-scale (3-7 m) multi-well aquifer tests were conducted in a heterogeneous aquifer to determine the transmissivity distribution across a one-hectare test site. Two of the large-scale tests had constant but different rates of discharge; the remaining large-scale test had a discharge that was pulsed at regulated intervals. The small-scale tests were conducted at two well clusters 20 m apart. The program WELTEST was written to analyze the data. By using the methods of non-linear least squares regression analysis and Broyden's method to solve for non-linear extrema, WELTEST automatically determines the best values of transmissivity and the storage coefficient. The test results show that order of magnitude differences in the calculated transmissivities at a well location can be realized by varying the discharge rate at the pumping well, the duration of the aquifer test, and/or the location of the pumping well. The calculated storage coefficients for the tests cover a five-order magnitude range. The data show a definite trend for the storage coefficient to increase with the distance between the pumping and the observation wells. This trend is shown to be related to the orientation of high hydraulic conductivity zones between the pumping and the observation wells. A comparison among single-well aquifer tests, geological investigations and multi-well aquifer tests indicate that the multi-well tests are poorly suited for characterizing a transmissivity field. (Author) (11 refs., 14 figs.)
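
    The core computation WELTEST performs can be sketched compactly: estimate transmissivity and the storage coefficient by nonlinear least squares against the Theis solution. WELTEST pairs least-squares regression with Broyden's method; the hedged Python sketch below substitutes scipy.optimize.curve_fit and synthetic data, so it illustrates the idea rather than reproducing the program.

    # Sketch: fit Theis drawdowns to recover T and S from one observation well.
    import numpy as np
    from scipy.special import exp1
    from scipy.optimize import curve_fit

    Q, r = 0.01, 30.0   # assumed pumping rate (m^3/s) and observation radius (m)

    def theis(t, T, S):
        u = r**2 * S / (4.0 * T * t)
        return Q / (4.0 * np.pi * T) * exp1(u)

    # Synthetic "measured" drawdowns: true T = 5e-3, S = 2e-4, plus 2% noise.
    np.random.seed(0)
    t_obs = np.logspace(2, 5, 20)
    s_obs = theis(t_obs, 5e-3, 2e-4) * (1 + 0.02 * np.random.randn(t_obs.size))

    (T_fit, S_fit), _ = curve_fit(theis, t_obs, s_obs, p0=(1e-3, 1e-4),
                                  bounds=([1e-6, 1e-7], [1.0, 1e-1]))
    print(f"T = {T_fit:.2e} m^2/s, S = {S_fit:.2e}")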

  9. Bioremediation of a diesel fuel contaminated aquifer: simulation studies in laboratory aquifer columns

    Science.gov (United States)

    Hess, A.; Höhener, P.; Hunkeler, D.; Zeyer, J.

    1996-08-01

    The in situ bioremediation of aquifers contaminated with petroleum hydrocarbons is commonly based on the infiltration of groundwater supplemented with oxidants (e.g., O2, NO3-) and nutrients (e.g., NH4+, PO43-). These additions stimulate the microbial activity in the aquifer, and several field studies describing the resulting processes have been published. However, due to the heterogeneity of the subsurface and the limited number of observation wells usually available, these field data do not offer a sufficient spatial and temporal resolution. In this study, flow-through columns of 47-cm length equipped with 17 sampling ports were filled with homogeneously contaminated aquifer material from a diesel fuel contaminated in situ bioremediation site. The columns were operated over 96 days at 12°C with artificial groundwater supplemented with O2, NO3- and PO43-. Concentration profiles of O2, NO3-, NO2-, dissolved inorganic and organic carbon (DIC and DOC, respectively), protein, microbial cells and total residual hydrocarbons were measured. Within the first 12 cm, corresponding to a mean groundwater residence time of <3.6 h, a steep O2 decrease from 4.6 to <0.3 mg l-1, denitrification, a production of DIC and DOC, high microbial cell numbers and a high removal of hydrocarbons were observed. Within a distance of 24 to 40.5 cm from the infiltration, O2 was below 0.1 mg l-1 and denitrifying activity was found. In the presence and in the absence of O2, n-alkanes were preferentially degraded compared to branched alkanes. The results demonstrate that: (1) infiltration of aerobic groundwater into columns filled with aquifer material contaminated with hydrocarbons leads to a rapid depletion of O2; (2) O2 and NO3- can serve as oxidants for the mineralization of hydrocarbons; and (3) the modelling of redox processes in aquifers has to consider denitrifying activity in the presence of O2.

  10. 'Big bang' of quantum universe

    International Nuclear Information System (INIS)

    Pawlowski, M.; Pervushin, V.N.

    2000-01-01

    The reparametrization-invariant generating functional for the unitary and causal perturbation theory in general relativity in a finite space-time is obtained. The classical cosmology of a Universe and the Faddeev-Popov-DeWitt functional correspond to different orders of decomposition of this functional over the inverse 'mass' of a Universe. It is shown that the invariant content of general relativity as a constrained system can be covered by two 'equivalent' unconstrained systems: the 'dynamic' (with 'dynamic' time as the cosmic scale factor and conformal field variables) and 'geometric' (given by the Levi-Civita type canonical transformation to the action-angle variables which determine initial cosmological states with the arrow of the proper time measured by the watch of an observer in the comoving frame). 'Big Bang', the Hubble evolution, and creation of 'dynamic' particles by the 'geometric' vacuum are determined by 'relations' between the dynamic and geometric systems as pure relativistic phenomena, like the Lorentz-type 'relation' between the rest and comoving frames in special relativity

  11. BIG SKY CARBON SEQUESTRATION PARTNERSHIP

    Energy Technology Data Exchange (ETDEWEB)

    Susan M. Capalbo

    2004-06-01

    The Big Sky Partnership, led by Montana State University, is comprised of research institutions, public entities and private sector organizations, and the Confederated Salish and Kootenai Tribes and the Nez Perce Tribe. Efforts during the second performance period fall into four areas: evaluation of sources and carbon sequestration sinks; development of a GIS-based reporting framework; designing an integrated suite of monitoring, measuring, and verification (MMV) technologies; and initiating a comprehensive education and outreach program. At the first two Partnership meetings the groundwork was put in place to provide an assessment of capture and storage capabilities for CO{sub 2} utilizing the resources found in the Partnership region (both geological and terrestrial sinks) that would complement the ongoing DOE research. The region has a diverse array of geological formations that could provide storage options for carbon in one or more of its three states. Likewise, initial estimates of terrestrial sinks indicate a vast potential for increasing and maintaining soil C on forested, agricultural, and reclaimed lands. Both options include the potential for offsetting economic benefits to industry and society. Steps have been taken to assure that the GIS-based framework is consistent among types of sinks within the Big Sky Partnership area and with the efforts of other western DOE partnerships. Efforts are also being made to find funding to include Wyoming in the coverage areas for both geological and terrestrial sinks and sources. The Partnership recognizes the critical importance of measurement, monitoring, and verification technologies to support not only carbon trading but all policies and programs that DOE and other agencies may want to pursue in support of GHG mitigation. The efforts begun in developing and implementing MMV technologies for geological sequestration reflect this concern. Research is also underway to identify and validate best management practices for

  12. The NOAA Big Data Project

    Science.gov (United States)

    de la Beaujardiere, J.

    2015-12-01

    The US National Oceanic and Atmospheric Administration (NOAA) is a Big Data producer, generating tens of terabytes per day from hundreds of sensors on satellites, radars, aircraft, ships, and buoys, and from numerical models. These data are of critical importance and value for NOAA's mission to understand and predict changes in climate, weather, oceans, and coasts. In order to facilitate extracting additional value from this information, NOAA has established Cooperative Research and Development Agreements (CRADAs) with five Infrastructure-as-a-Service (IaaS) providers — Amazon, Google, IBM, Microsoft, Open Cloud Consortium — to determine whether hosting NOAA data in publicly-accessible Clouds alongside on-demand computational capability stimulates the creation of new value-added products and services and lines of business based on the data, and if the revenue generated by these new applications can support the costs of data transmission and hosting. Each IaaS provider is the anchor of a "Data Alliance" which organizations or entrepreneurs can join to develop and test new business or research avenues. This presentation will report on progress and lessons learned during the first 6 months of the 3-year CRADAs.

  13. Big-bang nucleosynthesis revisited

    Science.gov (United States)

    Olive, Keith A.; Schramm, David N.; Steigman, Gary; Walker, Terry P.

    1989-01-01

    The homogeneous big-bang nucleosynthesis yields of D, He-3, He-4, and Li-7 are computed taking into account recent measurements of the neutron mean-life as well as updates of several nuclear reaction rates which primarily affect the production of Li-7. The extraction of primordial abundances from observation and the likelihood that the primordial mass fraction of He-4, Yp, is ≤ 0.24 are discussed. Using the primordial abundances of D + He-3 and Li-7 we limit the baryon-to-photon ratio to 2.6 ≤ η10 ≤ 4.3 (where η10 is η in units of 10^-10), which we use to argue that baryons contribute between 0.02 and 0.11 to the critical energy density of the universe. An upper limit to Yp of 0.24 constrains the number of light neutrinos to Nν ≤ 3.4, in excellent agreement with the LEP and SLC collider results. We turn this argument around to show that the collider limit of 3 neutrino species can be used to bound the primordial abundance of He-4: 0.235 ≤ Yp ≤ 0.245.

  14. Neutrinos and Big Bang Nucleosynthesis

    Directory of Open Access Journals (Sweden)

    Gary Steigman

    2012-01-01

    Full Text Available According to the standard models of particle physics and cosmology, there should be a background of cosmic neutrinos in the present Universe, similar to the cosmic microwave photon background. The weakness of the weak interactions renders this neutrino background undetectable with current technology. The cosmic neutrino background can, however, be probed indirectly through its cosmological effects on big bang nucleosynthesis (BBN) and the cosmic microwave background (CMB) radiation. In this BBN review, focused on neutrinos and more generally on dark radiation, the BBN constraints on the number of “equivalent neutrinos” (dark radiation), on the baryon asymmetry (baryon density), and on a possible lepton asymmetry (neutrino degeneracy) are reviewed and updated. The BBN constraints on dark radiation and on the baryon density following from considerations of the primordial abundances of deuterium and helium-4 are in excellent agreement with the complementary results from the CMB, providing a suggestive, but currently inconclusive, hint of the presence of dark radiation, and they constrain any lepton asymmetry. For all the cases considered here there is a “lithium problem”: the BBN-predicted lithium abundance exceeds the observationally inferred primordial value by a factor of ~3.

  15. "Big Science" exhibition at Balexert

    CERN Multimedia

    2008-01-01

    CERN is going out to meet those members of the general public who were unable to attend the recent Open Day. The Laboratory will be taking its "Big Science" exhibition from the Globe of Science and Innovation to the Balexert shopping centre from 19 to 31 May 2008. The exhibition, which shows the LHC and its experiments through the eyes of a photographer, features around thirty spectacular photographs measuring 4.5 metres high and 2.5 metres wide. Welcomed and guided around the exhibition by CERN volunteers, shoppers at Balexert will also have the opportunity to discover LHC components on display and watch films. "Fun with Physics" workshops will be held at certain times of the day. Main hall of the Balexert shopping centre, ground floor, from 9.00 a.m. to 7.00 p.m. Monday to Friday and from 10 a.m. to 6 p.m. on the two Saturdays. Call for volunteers All members of the CERN personnel are invited to enrol as volunteers to help welcom...

  16. Deuterium and big bang nucleosynthesis

    International Nuclear Information System (INIS)

    Burles, S.

    2000-01-01

    Measurements of deuterium absorption in high redshift quasar absorption systems provide a direct inference of the deuterium abundance produced by big bang nucleosynthesis (BBN). With measurements and limits from five independent absorption systems, we place strong constraints on the primordial ratio of deuterium to hydrogen, (D/H)p = 3.4 ± 0.3 × 10^-5 [1,2]. We employ a direct numerical treatment to improve the estimates of critical reaction rates and reduce the uncertainties in BBN predictions of D/H and 7Li/H by a factor of three [3] over previous efforts [4]. Using our measurements of (D/H)p and new BBN predictions, we find at 95% confidence the baryon density ρb = (3.6 ± 0.4) × 10^-31 g cm^-3 (Ωb h65^2 = 0.045 ± 0.006 in units of the critical density), and cosmological baryon-photon ratio η = (5.1 ± 0.6) × 10^-10

  17. A little big history of Tiananmen

    OpenAIRE

    Quaedackers, E.; Grinin, L.E.; Korotayev, A.V.; Rodrigue, B.H.

    2011-01-01

    This contribution aims at demonstrating the usefulness of studying small-scale subjects such as Tiananmen, or the Gate of Heavenly Peace, in Beijing - from a Big History perspective. By studying such a ‘little big history’ of Tiananmen, previously overlooked yet fundamental explanations for why people built the gate the way they did can be found. These explanations are useful in their own right and may also be used to deepen our understanding of more traditional explanations of why Tiananmen ...

  18. Big Data: Challenges, Opportunities and Realities

    OpenAIRE

    Bhadani, Abhay; Jothimani, Dhanya

    2017-01-01

    With the advent of Internet of Things (IoT) and Web 2.0 technologies, there has been a tremendous growth in the amount of data generated. This chapter emphasizes the need for big data and discusses the technological advancements, tools, and techniques used to process big data. Technological improvements and limitations of existing storage techniques are also presented. Since the traditional technologies like Relational Database Management System (RDBMS) have their own limitations to han...

  19. Big Data, Biostatistics and Complexity Reduction

    Czech Academy of Sciences Publication Activity Database

    Kalina, Jan

    2018-01-01

    Roč. 14, č. 2 (2018), s. 24-32 ISSN 1801-5603 R&D Projects: GA MZd(CZ) NV15-29835A Institutional support: RVO:67985807 Keywords : Biostatistics * Big data * Multivariate statistics * Dimensionality * Variable selection Subject RIV: IN - Informatics, Computer Science OBOR OECD: Computer sciences, information science, bioinformathics (hardware development to be 2.2, social aspect to be 5.8) https://www.ejbi.org/scholarly-articles/big-data-biostatistics-and-complexity-reduction.pdf

  20. Storage and Database Management for Big Data

    Science.gov (United States)

    2015-07-27

    cloud models that satisfy different problem... replication. Data loss can only occur if three drives fail prior to any one of the failures being corrected. Hadoop is written in Java and is installed in a... visible view into a dataset. There are many popular database management systems such as MySQL [4], PostgreSQL [63], and Oracle [5]. Most commonly

  1. From Data Quality to Big Data Quality

    OpenAIRE

    Batini, Carlo; Rula, Anisa; Scannapieco, Monica; Viscusi, Gianluigi

    2015-01-01

    This article investigates the evolution of data quality issues from traditional structured data managed in relational databases to Big Data. In particular, the paper examines the nature of the relationship between Data Quality and several research coordinates that are relevant in Big Data, such as the variety of data types, data sources and application domains, focusing on maps, semi-structured texts, linked open data, sensor & sensor networks and official statistics. Consequently a set of st...

  2. Automated Big Traffic Analytics for Cyber Security

    OpenAIRE

    Miao, Yuantian; Ruan, Zichan; Pan, Lei; Wang, Yu; Zhang, Jun; Xiang, Yang

    2018-01-01

    Network traffic analytics technology is a cornerstone for cyber security systems. We demonstrate its use through three popular and contemporary cyber security applications in intrusion detection, malware analysis and botnet detection. However, automated traffic analytics faces the challenges raised by big traffic data. In terms of big data's three characteristics --- volume, variety and velocity, we review three state of the art techniques to mitigate the key challenges including real-time tr...

  3. Dark energy, wormholes, and the big rip

    International Nuclear Information System (INIS)

    Faraoni, V.; Israel, W.

    2005-01-01

    The time evolution of a wormhole in a Friedmann universe approaching the big rip is studied. The wormhole is modeled by a thin spherical shell accreting the superquintessence fluid--two different models are presented. Contrary to recent claims that the wormhole overtakes the expansion of the universe and engulfs it before the big rip is reached, it is found that the wormhole becomes asymptotically comoving with the cosmic fluid and the future evolution of the universe is fully causal

  4. COBE looks back to the Big Bang

    Science.gov (United States)

    Mather, John C.

    1993-01-01

    An overview is presented of NASA-Goddard's Cosmic Background Explorer (COBE), the first NASA satellite designed to observe the primeval explosion of the universe. The spacecraft carries three extremely sensitive IR and microwave instruments designed to measure the faint residual radiation from the Big Bang and to search for the formation of the first galaxies. COBE's far IR absolute spectrophotometer has shown that the Big Bang radiation has a blackbody spectrum, proving that there was no large energy release after the explosion.

  5. Big Data in the Aerospace Industry

    Directory of Open Access Journals (Sweden)

    Victor Emmanuell BADEA

    2018-01-01

    Full Text Available This paper presents approaches related to the need for large-volume data analysis (Big Data), as well as the information that the beneficiaries of this analysis can interpret. Aerospace companies understand the challenges of Big Data better than other industries. Also, in this paper we describe a novel analytical system that enables query processing and predictive analytics over streams of large aviation data.

  6. CERN: A big year for LEP

    International Nuclear Information System (INIS)

    Anon.

    1991-01-01

    In April this year's data-taking period for CERN's big LEP electron-positron collider got underway, and is scheduled to continue until November. The immediate objective of the four big experiments - Aleph, Delphi, L3 and Opal - will be to increase considerably their stock of carefully recorded Z decays, currently totalling about three-quarters of a million

  7. Aquifers in coastal reclaimed lands - real world assessments

    Science.gov (United States)

    Saha, A.; Bironne, A.; Vonhögen-Peeters, L.; Lee, W. K.; Babovic, V. M.; Vermeulen, P.; van Baaren, E.; Karaoulis, M.; Blanchais, F.; Nguyen, M.; Pauw, P.; Doornenbal, P.

    2017-12-01

    Climate change and population growth are significant concerns in coastal regions around the world, where more than 30% of the world's population resides. The numbers continue to rise as coastal areas are increasingly urbanized. Urbanization creates land shortages along the coasts, which has spurred coastal reclamation activities as a viable solution. In this study, we focus on reclaimed areas in Singapore and the Netherlands, and investigate the potential of these reclaimed bodies as artificial aquifers that could attenuate water shortage problems in addition to serving their original purpose. We compare how the reclamation methods determine the hydrogeological characteristics of these manmade aquifers. We highlight similarities in freshwater lens development in the artificial shallow aquifers under natural recharge in diverse conditions, i.e., tropical and temperate zones, using numerical models. The characteristics and responses of these aquifers, with their dynamic freshwater-saltwater interface, are contrasted against naturally occurring coastal aquifers where equilibrium was disturbed by anthropogenic activities. Finally, we assess the risks associated with subsidence and saltwater intrusion, combining measurements and numerical models, in case these aquifers are planned for Aquifer Storage and Recovery (ASR) or Managed Aquifer Recharge (MAR) strategies. Relative performances of some ASR schemes are simulated and compared in the reclaimed lands.
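
    For shallow reclaimed-fill aquifers such as these, a first-order estimate of freshwater-lens thickness comes from the classical Ghyben-Herzberg relation: under static conditions the fresh/salt interface sits below sea level at z = rho_f/(rho_s - rho_f) * h, roughly 40 m per metre of water-table head. The sketch below uses illustrative densities and heads, not values from the study sites.

    # Sketch: Ghyben-Herzberg interface depth for a static freshwater lens.
    rho_f = 1000.0   # freshwater density, kg/m^3
    rho_s = 1025.0   # seawater density, kg/m^3

    def interface_depth(head_m):
        """Depth (m below sea level) of the fresh/salt interface for a
        water table standing head_m above sea level."""
        return rho_f / (rho_s - rho_f) * head_m

    for h in (0.25, 0.5, 1.0):   # hypothetical water-table heads, m
        print(f"head {h:.2f} m -> interface ~{interface_depth(h):.0f} m below sea level")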

  8. Aquifer recharging in South Carolina: radiocarbon in environmental hydrogeology

    International Nuclear Information System (INIS)

    Stone, P.A.; Knox, R.L.; Mathews, T.D.

    1985-01-01

    Radiocarbon activities of dissolved inorganic carbon (and tritium activities where infiltration rates are rapid and aquifers shallow) provide relatively unambiguous and inexpensive evidence for the identification of significant recharge areas. Such evidence is for the actual occurrence of modern recharge in the aquifer and thus is less inferential than stratigraphic or potentiometric evidence. These underutilized isotopic techniques are neither arcane nor complex and have been more-or-less standardized by earlier researchers. In South Carolina, isotopic evidence has been used from both calcareous and siliceous sedimentary aquifers and fractured crystalline rock aquifers. The Tertiary limestone aquifer is shown not to be principally recharged in its subcrop area, unlike conditions assumed for many other sedimentary aquifers in the southeastern United States; instead it receives considerable lateral recharge from interfingering updip Tertiary sand aquifers in the middle coastal plain. Induced recharging at Hilton Head Island is mixing ancient relict water and modern recharge water. Recharging to deeper portions of the Cretaceous Middendorf basal sand aquifer occurs at least as far coastward as the middle coastal plain, near sampling sites that stratigraphically appear to be confined. Pronounced mineralization of water in fractured rocks cannot be considered evidence of ancient or relict ground water that is isolated from modern contaminants; some of these waters contain considerable radiocarbon and hydrogen-bomb tritium
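
    The reasoning here can be made concrete with the standard radiocarbon decay relation, t = (t_half/ln 2) * ln(A0/A): dissolved inorganic carbon that retains most of its modern 14C activity cannot be old, so high activity flags active recharge. The sketch below uses hypothetical activity fractions and ignores the carbonate-dissolution ("dead carbon") corrections a real study would apply.

    # Sketch: apparent radiocarbon age from the fraction of modern activity.
    import math

    T_HALF = 5730.0   # 14C half-life, years

    def apparent_age_years(fraction_modern):
        return T_HALF / math.log(2) * math.log(1.0 / fraction_modern)

    for fm in (0.9, 0.5, 0.1):   # hypothetical measured fractions of modern carbon
        print(f"{fm:.0%} modern -> apparent age ~{apparent_age_years(fm):,.0f} yr")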

  9. Hydraulic properties from pumping tests data of aquifers in Azare ...

    African Journals Online (AJOL)

    Pumping test data from twelve boreholes in Azare area were analysed to determine the hydraulic properties of the aquifers, and the availability of water to meet the conjugate demands of the increasing population. The values of the aquifer constants obtained from the Cooper-Jacob's non-equilibrium graphical method were ...
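
    The Cooper-Jacob straight-line method named above reduces to two readings from a semi-log plot of drawdown versus time: the drawdown change per log cycle gives transmissivity, T = 2.303*Q/(4*pi*ds), and the zero-drawdown time intercept t0 gives the storage coefficient, S = 2.25*T*t0/r^2. A minimal Python sketch with hypothetical readings:

    # Sketch: Cooper-Jacob estimates from values read off a semi-log plot.
    import math

    Q = 0.01    # pumping rate, m^3/s (assumed)
    r = 30.0    # distance to observation well, m (assumed)
    ds = 0.35   # drawdown change per log10 cycle of time, m (read from plot)
    t0 = 120.0  # time intercept where the fitted line crosses s = 0, s

    T = 2.303 * Q / (4.0 * math.pi * ds)
    S = 2.25 * T * t0 / r**2
    print(f"T = {T:.2e} m^2/s, S = {S:.2e}")
    # Valid once u = r^2*S/(4*T*t) is small (commonly u < 0.01-0.05).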

  10. Estimating aquifer transmissivity from geo-electrical sounding ...

    African Journals Online (AJOL)

    Aquifer resistivity ranges from 4.26 ohm-m to 755.3 ohm-m with a maximum thickness of 52.25 m. A maximum 55.52 m depth-to-basement was obtained in the study area. Based on the model obtained, aquifer transmissivity was calculated and used to delineate the study area into prospective low and high groundwater ...

  11. Hydrologic and isotopic study of the Quito aquifer

    International Nuclear Information System (INIS)

    Villalba, Fabio; Benalcazar, Julio; Garcia, Marco; Altamirano, Cesar; Altamirano, Homero; Sarasti, Santiago; Mancero, Maria; Leiva, Eduardo; Pino, Jose; Alulema, Rafael; Cedeno, Alberto; Burbano, Napoleon; Paquel, Efren; Becerra, Simon; Andrade, Graciela

    2000-10-01

    The dynamics of the Quito basin and surrounding area aquifers were determined through the use of stable and radioactive isotopes, and the monitoring of phreatic levels and of the bacteriological and physico-chemical quality of the water. A conceptual hydrodynamic model of the Quito aquifer was also proposed in order to establish a sustainable management system in the future

  12. Determining shallow aquifer vulnerability by the DRASTIC model ...

    Indian Academy of Sciences (India)

    Shallow aquifer vulnerability has been assessed using GIS-based DRASTIC model by incorporating the major geological and hydrogeological factors that affect and control the groundwater contamination in a granitic terrain. It provides a relative indication of aquifer vulnerability to the contamination. Further, it has been ...
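
    The DRASTIC index behind the model is a weighted sum over seven hydrogeological factors, index = sum(w_i * r_i), with the standard weights D=5, R=4, A=3, S=2, T=1, I=5, C=3. The sketch below scores one hypothetical grid cell; the ratings are invented for illustration, not taken from the study.

    # Sketch: DRASTIC vulnerability index for a single cell.
    WEIGHTS = {
        "Depth to water": 5, "net Recharge": 4, "Aquifer media": 3,
        "Soil media": 2, "Topography": 1, "Impact of vadose zone": 5,
        "hydraulic Conductivity": 3,
    }
    ratings = {   # assumed ratings on the usual 1-10 scale
        "Depth to water": 7, "net Recharge": 6, "Aquifer media": 8,
        "Soil media": 5, "Topography": 9, "Impact of vadose zone": 6,
        "hydraulic Conductivity": 4,
    }
    index = sum(WEIGHTS[f] * ratings[f] for f in WEIGHTS)
    print(f"DRASTIC index = {index}")   # ranges from 23 (least) to 230 (most vulnerable)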

  13. Slaves to Big Data. Or Are We?

    Directory of Open Access Journals (Sweden)

    Mireille Hildebrandt

    2013-10-01

    Full Text Available In this contribution, the notion of Big Data is discussed in relation to the monetisation of personal data. The claim of some proponents, as well as adversaries, that Big Data implies that ‘n = all’, meaning that we no longer need to rely on samples because we have all the data, is scrutinised and found to be both overly optimistic and unnecessarily pessimistic. A set of epistemological and ethical issues is presented, focusing on the implications of Big Data for our perception, cognition, fairness, privacy and due process. The article then looks into the idea of user-centric personal data management to investigate to what extent it provides solutions for some of the problems triggered by the Big Data conundrum. Special attention is paid to the core principle of data protection legislation, namely purpose binding. Finally, this contribution seeks to inquire into the influence of Big Data politics on self, mind and society, and asks how we can prevent ourselves from becoming slaves to Big Data.

  14. Big Data: Survey, Technologies, Opportunities, and Challenges

    Directory of Open Access Journals (Sweden)

    Nawsher Khan

    2014-01-01

    Full Text Available Big Data has gained much attention from the academia and the IT industry. In the digital and computing world, information is generated and collected at a rate that rapidly exceeds the boundary range. Currently, over 2 billion people worldwide are connected to the Internet, and over 5 billion individuals own mobile phones. By 2020, 50 billion devices are expected to be connected to the Internet. At this point, predicted data production will be 44 times greater than that in 2009. As information is transferred and shared at light speed on optic fiber and wireless networks, the volume of data and the speed of market growth increase. However, the fast growth rate of such large data generates numerous challenges, such as the rapid growth of data, transfer speed, diverse data, and security. Nonetheless, Big Data is still in its infancy stage, and the domain has not been reviewed in general. Hence, this study comprehensively surveys and classifies the various attributes of Big Data, including its nature, definitions, rapid growth rate, volume, management, analysis, and security. This study also proposes a data life cycle that uses the technologies and terminologies of Big Data. Future research directions in this field are determined based on opportunities and several open issues in Big Data domination. These research directions facilitate the exploration of the domain and the development of optimal techniques to address Big Data.

  15. Big data: survey, technologies, opportunities, and challenges.

    Science.gov (United States)

    Khan, Nawsher; Yaqoob, Ibrar; Hashem, Ibrahim Abaker Targio; Inayat, Zakira; Ali, Waleed Kamaleldin Mahmoud; Alam, Muhammad; Shiraz, Muhammad; Gani, Abdullah

    2014-01-01

    Big Data has gained much attention from academia and the IT industry. In the digital and computing world, information is generated and collected at a rate that rapidly exceeds the boundary range. Currently, over 2 billion people worldwide are connected to the Internet, and over 5 billion individuals own mobile phones. By 2020, 50 billion devices are expected to be connected to the Internet. At this point, predicted data production will be 44 times greater than that in 2009. As information is transferred and shared at light speed on optic fiber and wireless networks, the volume of data and the speed of market growth increase. However, the fast growth rate of such large data generates numerous challenges, such as the rapid growth of data, transfer speed, diverse data, and security. Nonetheless, Big Data is still in its infancy, and the domain has not been reviewed in general. Hence, this study comprehensively surveys and classifies the various attributes of Big Data, including its nature, definitions, rapid growth rate, volume, management, analysis, and security. This study also proposes a data life cycle that uses the technologies and terminologies of Big Data. Future research directions in this field are determined based on opportunities and several open issues in the Big Data domain. These research directions facilitate the exploration of the domain and the development of optimal techniques to address Big Data.

  16. Main Issues in Big Data Security

    Directory of Open Access Journals (Sweden)

    Julio Moreno

    2016-09-01

    Full Text Available Data is currently one of the most important assets for companies in every field. The continuous growth in the importance and volume of data has created a new problem: it cannot be handled by traditional analysis techniques. This problem was, therefore, addressed through the creation of a new paradigm: Big Data. However, Big Data has given rise to new issues related not only to the volume or variety of the data, but also to data security and privacy. To obtain a full perspective of the problem, we carried out an investigation aimed at highlighting the main issues regarding Big Data security, along with the solutions the scientific community has proposed to address them. In this paper, we explain the results obtained after applying a systematic mapping study to security in the Big Data ecosystem. It is almost impossible to carry out detailed research into the entire topic of security, and the outcome of this research is, therefore, a big picture of the main problems related to security in a Big Data system, along with the principal solutions proposed by the research community.

  17. Big Data: Survey, Technologies, Opportunities, and Challenges

    Science.gov (United States)

    Khan, Nawsher; Yaqoob, Ibrar; Hashem, Ibrahim Abaker Targio; Inayat, Zakira; Mahmoud Ali, Waleed Kamaleldin; Alam, Muhammad; Shiraz, Muhammad; Gani, Abdullah

    2014-01-01

    Big Data has gained much attention from academia and the IT industry. In the digital and computing world, information is generated and collected at a rate that rapidly exceeds the boundary range. Currently, over 2 billion people worldwide are connected to the Internet, and over 5 billion individuals own mobile phones. By 2020, 50 billion devices are expected to be connected to the Internet. At this point, predicted data production will be 44 times greater than that in 2009. As information is transferred and shared at light speed on optic fiber and wireless networks, the volume of data and the speed of market growth increase. However, the fast growth rate of such large data generates numerous challenges, such as the rapid growth of data, transfer speed, diverse data, and security. Nonetheless, Big Data is still in its infancy, and the domain has not been reviewed in general. Hence, this study comprehensively surveys and classifies the various attributes of Big Data, including its nature, definitions, rapid growth rate, volume, management, analysis, and security. This study also proposes a data life cycle that uses the technologies and terminologies of Big Data. Future research directions in this field are determined based on opportunities and several open issues in the Big Data domain. These research directions facilitate the exploration of the domain and the development of optimal techniques to address Big Data. PMID:25136682

  18. Aquifer overexploitation: what does it mean?

    Science.gov (United States)

    Custodio, Emilio

    2002-02-01

    Groundwater overexploitation and aquifer overexploitation are terms that are becoming common in water-resources management. Hydrologists, managers and journalists use them when talking about stressed aquifers or some groundwater conflict. Overexploitation may be defined as the situation in which, for some years, the average aquifer extraction rate is greater than, or close to, the average recharge rate. But the recharge rate and the extent of recharge areas are often very uncertain. Besides, they may be modified by human activities and aquifer development. In practice, however, an aquifer is often considered overexploited when some persistent negative results of aquifer development are felt or perceived, such as continuous water-level drawdown, progressive water-quality deterioration, increasing extraction cost, or ecological damage. But negative results do not necessarily imply that extraction is greater than recharge. They may simply be due to well interference and the long transient period that follows changes in the aquifer water balance. Groundwater storage is depleted to some extent during the transient period after extraction is increased. Its duration depends on aquifer size, specific storage and permeability. Which level of "aquifer overexploitation" is advisable or bearable depends on detailed and updated consideration of aquifer-development effects and of the measures implemented for correction. This should not be the result of applying general rules based on some indirect data. Monitoring, sound aquifer knowledge, and calculation or modelling of behaviour are needed within a framework of objectives and policies. These should be established by a management institution, with the involvement of groundwater stakeholders, and should take into account the environmental and social constraints. Aquifer overexploitation, which is often perceived as something ethically bad, is not necessarily detrimental if it is not permanent. It may be a step towards sustainable development. Actually
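
    The definition quoted above is quantitative: compare the multi-year average extraction rate with the average recharge rate, while remembering that transient storage depletion can cause drawdown even when extraction does not exceed recharge. A minimal sketch of that bookkeeping follows; all rates are invented for illustration and are not from the article.

```python
# Minimal sketch of the simple overexploitation criterion described above:
# multi-year average extraction vs. average recharge, plus the cumulative
# storage change that drives water-level decline. Illustrative numbers
# only (hm3/yr = cubic hectometres per year).
years = list(range(2015, 2025))
recharge = {y: 50.0 for y in years}                             # assumed mean recharge
extraction = {y: 48.0 + 0.8 * i for i, y in enumerate(years)}   # growing pumping

avg_r = sum(recharge.values()) / len(recharge)
avg_q = sum(extraction.values()) / len(extraction)

# Cumulative storage change over the period (negative = depletion).
storage_change = sum(recharge[y] - extraction[y] for y in years)

print(f"avg recharge {avg_r:.1f} hm3/yr, avg extraction {avg_q:.1f} hm3/yr")
print(f"cumulative storage change {storage_change:+.1f} hm3")
if avg_q >= avg_r:
    print("extraction >= recharge on average: overexploited by the simple definition")
```

    As the abstract stresses, this check alone is not diagnostic: drawdown and storage depletion also occur during the transient period that follows any increase in pumping, even in an aquifer that is not overexploited on average.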

  19. Simple method for quick estimation of aquifer hydrogeological parameters

    Science.gov (United States)

    Ma, C.; Li, Y. Y.

    2017-08-01

    Developing simple and accurate methods to determine aquifer hydrogeological parameters is important for groundwater resources assessment and management. To estimate aquifer parameters from transient (unsteady) pumping-test data, a fitting function approximating the Theis well function was derived by optimization, and a simple linear regression equation was then established. The aquifer parameters are obtained by solving for the coefficients of the regression equation. The application of the proposed method was illustrated using two published data sets. Error statistics and analysis of the pumping drawdown showed that the method yields quick and accurate estimates of the aquifer parameters, and that it can reliably identify them from both long-distance observed drawdowns and early drawdowns. It is hoped that the proposed method will be helpful to practicing hydrogeologists and hydrologists.
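
    The paper's method replaces the Theis well function with a fitted regression form; absent its exact fitting function, the sketch below shows the same underlying task, estimating transmissivity T and storativity S from transient drawdown, using direct curve fitting of the Theis solution instead. The pumping rate, well spacing, and observations are all assumptions, not the paper's published data sets.

```python
# Minimal sketch (not the paper's regression method): estimate
# transmissivity T and storativity S by fitting the Theis solution
# s(t) = Q/(4*pi*T) * W(u), u = r^2*S/(4*T*t), to transient drawdown.
import numpy as np
from scipy.special import exp1          # Theis well function W(u) = E1(u)
from scipy.optimize import curve_fit

Q = 0.01   # pumping rate, m^3/s (assumed)
r = 30.0   # radial distance to observation well, m (assumed)

def theis_drawdown(t, T, S):
    u = r**2 * S / (4.0 * T * t)
    return Q / (4.0 * np.pi * T) * exp1(u)

# Synthetic "observed" drawdowns from T = 1e-3 m^2/s, S = 2e-4, plus 2% noise.
t_obs = np.logspace(1, 5, 25)           # 10 s to ~1 day
rng = np.random.default_rng(0)
s_obs = theis_drawdown(t_obs, 1e-3, 2e-4) * (1 + 0.02 * rng.standard_normal(25))

(T_fit, S_fit), _ = curve_fit(theis_drawdown, t_obs, s_obs,
                              p0=(5e-4, 1e-4), bounds=(1e-8, 1.0))
print(f"T = {T_fit:.2e} m^2/s, S = {S_fit:.2e}")
```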

  20. Hydrogeology of the Umm Er Radhuma Aquifer (Arabian peninsula)

    Science.gov (United States)

    Dirks, Heiko; Al Ajmi, Hussain; Kienast, Peter; Rausch, Randolf

    2018-03-01

    The aim of this article is to enhance understanding of the Umm Er Radhuma aquifer's genesis and of its hydraulic and hydrochemical development over time. This is a prerequisite for wise use of the fossil groundwater resources contained within it. The Umm Er Radhuma is a karstified limestone aquifer extending over 1.6 million km2 in the eastern part of the Arabian Peninsula. Both epigene and hypogene karstification contributed to the genesis of what is today the most prolific aquifer in the region. In addition to man-made abstractions, even the natural outflows are higher than the small recharge (natural storage depletion). The Umm Er Radhuma shows that large aquifers in arid regions are never in "steady state" (where inflows equal outflows), considering Quaternary climate history. The aquifer's adaptation to climate changes (precipitation, sea level) can be traced even after thousands of years, and is slower than the climate changes themselves.

  1. Geomorphic Controls on Aquifer Geometry in Northwestern India

    Science.gov (United States)

    van Dijk, W. M.; Densmore, A. L.; Sinha, R.; Gupta, S.; Mason, P. J.; Singh, A.; Joshi, S. K.; Nayak, N.; Kumar, M.; Shekhar, S.

    2014-12-01

    The Indo-Gangetic foreland basin suffers from one of the highest rates of groundwater extraction in the world, especially in the Indian states of Punjab, Haryana and Rajasthan. To understand the effects of this extraction on groundwater levels, we must first understand the geometry and sedimentary architecture of the aquifer system, which in turn depend upon its geomorphic setting. We use satellite images and digital elevation models to map the geomorphology of the Sutlej and Yamuna river systems, while aquifer geometry is assessed using ~250 wells that extend to ~300 m depth in Punjab and Haryana. The Sutlej and Yamuna rivers have deposited large sedimentary fans at their outlets. Elongate downslope ridges on the fan surfaces form distributary networks that radiate from the Sutlej and Yamuna fan apices, and we interpret these ridges as paleochannel deposits associated with discrete fan lobes. Paleochannels picked out by soil-moisture variations illustrate a complex late Quaternary history of channel avulsion and incision, probably associated with variations in monsoon intensity. Aquifer bodies on the Sutlej and Yamuna fans have a median thickness of 7 and 6 m, respectively, and follow a heavy-tailed distribution, probably because of stacked sand bodies. The percentage of aquifer material in individual lithologs decreases downstream, although the exponent on the thickness distribution remains the same, indicating that aquifer bodies decrease in number down-fan but do not thin appreciably. Critically, the interfan area between the Sutlej and Yamuna fans has thinner aquifers and a lower proportion of aquifer material, despite its proximal location. Our data show that the Sutlej and Yamuna fan systems form the major aquifer systems in this area, and that their geomorphic setting therefore provides a first-order control on aquifer distribution and geometry. The large spatial heterogeneity of the system must be considered in any future aquifer management scheme.
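
    As a toy illustration of the kind of statistics reported above (a median aquifer-body thickness and a heavy-tailed thickness distribution), the sketch below computes a median and a Hill tail-exponent estimate from synthetic sand-body thicknesses. None of the numbers are the study's well data.

```python
# Minimal sketch: summarizing aquifer-body thicknesses from well logs
# with a median and a Hill estimate of the tail exponent. Synthetic,
# illustrative data only.
import numpy as np

rng = np.random.default_rng(1)
# Synthetic sand-body thicknesses (m) with a Pareto-like heavy tail.
thickness = 2.0 * (1.0 + rng.pareto(2.5, size=250))

median = np.median(thickness)

# Hill estimator of the tail exponent from the k largest thicknesses.
k = 50
x = np.sort(thickness)
alpha = k / np.sum(np.log(x[-k:] / x[-(k + 1)]))

print(f"median thickness {median:.1f} m, tail exponent ~ {alpha:.2f}")
```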

  2. Modeling of CO2 storage in aquifers

    International Nuclear Information System (INIS)

    Savioli, Gabriela B; Santos, Juan E

    2011-01-01

    Storage of CO2 in geological formations is a means of mitigating the greenhouse effect. Saline aquifers are a good alternative as storage sites due to their large volume and their common occurrence in nature. The first commercial CO2 injection project is that of the Sleipner field in the Utsira Sand aquifer (North Sea). Nevertheless, very little was known about the effectiveness of CO2 sequestration over very long periods of time. In this way, numerical modeling of CO2 injection and seismic monitoring is an important tool to understand the behavior of CO2 after injection and to make long-term predictions in order to prevent CO2 leaks from the storage into the atmosphere. The description of CO2 injection into subsurface formations requires an accurate fluid-flow model. To simulate the simultaneous flow of brine and CO2 we apply the Black-Oil formulation for two-phase flow in porous media, which uses the PVT data as a simplified thermodynamic model. Seismic monitoring is modeled using Biot's equations of motion describing wave propagation in fluid-saturated poroviscoelastic solids. Numerical examples of CO2 injection and time-lapse seismics using data of the Utsira formation show the capability of this methodology to monitor the migration and dispersal of CO2 after injection.
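
    The Black-Oil formulation itself involves PVT tables and coupled mass balances beyond a short example, so the sketch below shows only the simpler fractional-flow idea underlying two-phase brine/CO2 displacement, using Corey-type relative permeabilities. Every parameter value is an illustrative assumption, and this is a stand-in for, not a piece of, the authors' model.

```python
# Minimal sketch of two-phase flow in porous media: the fraction of total
# volumetric flux carried by CO2 when it displaces brine, with Corey-type
# relative permeability curves. All parameter values are assumptions.
import numpy as np

MU_BRINE = 6.9e-4     # brine viscosity, Pa.s (assumed)
MU_CO2 = 6.0e-5       # supercritical CO2 viscosity, Pa.s (assumed)
SWR, SGR = 0.2, 0.05  # residual brine / trapped CO2 saturations (assumed)

def corey_krw(sw, n=4.0):
    """Brine relative permeability from effective brine saturation."""
    se = np.clip((sw - SWR) / (1 - SWR - SGR), 0.0, 1.0)
    return se ** n

def corey_krg(sw, n=2.0):
    """CO2 relative permeability from effective brine saturation."""
    se = np.clip((sw - SWR) / (1 - SWR - SGR), 0.0, 1.0)
    return (1 - se) ** n

def fractional_flow_co2(sw):
    """Share of total flux carried by CO2 at brine saturation sw."""
    mob_w = corey_krw(sw) / MU_BRINE
    mob_g = corey_krg(sw) / MU_CO2
    return mob_g / (mob_w + mob_g)

sw = np.linspace(SWR, 1 - SGR, 5)
print(np.round(fractional_flow_co2(sw), 3))  # 1.0 at SWR, 0.0 at full brine mobility
```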

  3. 14C measurements in aquifers with methane

    International Nuclear Information System (INIS)

    Barker, J.F.; Fritz, P.; Brown, R.M.

    1978-01-01

    A survey of various groundwater systems indicates that methane is a common trace constituent and occasionally a major carbon species in groundwaters. Thermocatalytic methane had δ13C(CH4) > -45‰, whereas microbially produced (biogenic) methane had lighter (more negative) δ13C(CH4) values. Groundwaters containing significant biogenic methane had abnormally heavy δ13C values for the inorganic carbon. Thermocatalytic methane had no apparent effect on the inorganic carbon. Because methanogenesis seriously affects the carbon isotope geochemistry of groundwaters, the correction of raw 14C ages of affected groundwaters must consider these effects. Conceptual models are developed which adjust the 14C activity of the groundwater for the effects of methanogenesis and for the dilution of carbon present during infiltration by simple dissolution of rock carbonate. These preliminary models are applied to groundwaters from the Alliston sand aquifer, where methanogenesis has affected most samples. In this system, methanogenic bacteria, using organic matter present in the aquifer matrix as substrate, have added inorganic carbon to the groundwater, which has initiated further carbonate rock dissolution. These processes have diluted the 14C activity of the inorganic carbon. (orig.)
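
    In its simplest form, the correction idea described above reduces to a dilution factor in the standard radiocarbon age equation: 14C-dead carbon added by carbonate dissolution or methanogenesis lowers the measured activity, so the raw age overestimates residence time. The sketch below is a minimal illustration of that dilution correction, not the authors' actual conceptual models; the activities and dilution factor are assumptions.

```python
# Minimal sketch of a dilution-corrected radiocarbon age. q is the
# fraction of dissolved inorganic carbon derived from 14C-bearing
# recharge (q = 1 means no dead-carbon dilution). Illustrative values.
import math

MEAN_LIFE = 5730 / math.log(2)   # ~8267 yr, from the 5730-yr half-life

def corrected_age(a_measured, a_initial=100.0, q=1.0):
    """Groundwater age (yr) from measured 14C activity in pmC."""
    return -MEAN_LIFE * math.log(a_measured / (q * a_initial))

raw = corrected_age(25.0)           # no dilution correction
adj = corrected_age(25.0, q=0.65)   # 35% dead carbon from carbonate/CH4 reactions
print(f"raw age ~ {raw:.0f} yr, corrected age ~ {adj:.0f} yr")
```

    The corrected age is younger than the raw age, as expected: part of the activity loss is due to dilution by dead carbon rather than to radioactive decay.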

  4. Boosting Big National Lab Data

    Energy Technology Data Exchange (ETDEWEB)

    Kleese van Dam, Kerstin [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2013-02-21

    Introduction: Big data. Love it or hate it, solving the world's most intractable problems requires the ability to make sense of huge and complex sets of data, and to do it quickly. Speeding up the process – from hours to minutes or from weeks to days – is key to our success. One major source of such big data is physical experiments. As many will know, physical experiments are commonly used to solve challenges in fields such as energy security, manufacturing, medicine, pharmacology, environmental protection and national security. Experiments use different instruments and sensor types to research, for example, the validity of new drugs, the root causes of diseases, more efficient energy sources, new materials for everyday goods, effective methods for environmental cleanup, the optimal ingredient composition for chocolate, or how to preserve valuable antiques. This is done by experimentally determining the structure, properties and processes that govern biological systems, chemical processes and materials. The speed and quality at which we can acquire new insights from experiments directly influences the rate of scientific progress, industrial innovation and competitiveness. And gaining new groundbreaking insights, faster, is key to the economic success of our nations. Recent years have seen incredible advances in sensor technologies, from house-sized detector systems in large experiments such as the Large Hadron Collider and the 'Eye of Gaia' billion-pixel camera detector to high-throughput genome sequencing. These developments have led to an exponential increase in the data volumes, rates and variety produced by the instruments used for experimental work. This increase coincides with a need to analyze experimental results at the time they are collected. This speed is required to optimize data taking and quality, and also to enable new adaptive experiments, where the sample is manipulated as it is observed, e.g. a substance is injected into a

  5. [Big data from clinical routine].

    Science.gov (United States)

    Mansmann, U

    2018-04-01

    Over the past 100 years, evidence-based medicine has undergone several fundamental changes. Through the field of physiology, medical doctors were introduced to the natural sciences. Since the late 1940s, randomized and epidemiological studies have provided the evidence for medical practice, which led to the emergence of clinical epidemiology as a new field in the medical sciences. Within the past few years, big data has become the driving force behind the vision of a comprehensive set of health-related data that tracks individual healthcare histories and, consequently, those of large populations. The aim of this article is to discuss the implications of data-driven medicine and to examine how it can find a place within clinical care. The EU-wide discussion on the development of data-driven medicine is presented. The following features and suggested actions were identified: harmonizing data formats, data processing and analysis, data exchange, related legal frameworks and ethical challenges. For the effective development of data-driven medicine, pilot projects need to be conducted to allow for open and transparent discussion of the advantages and challenges. The Federal Ministry of Education and Research ("Bundesministerium für Bildung und Forschung," BMBF) Arthromark project is an important example. Another example is the Medical Informatics Initiative of the BMBF. The digital revolution affects clinical practice. Data can be generated and stored in quantities that are almost unimaginable. This can be taken advantage of to develop a learning healthcare system if the principles of medical evidence generation are integrated into innovative IT infrastructures and processes.

  6. Pockmarks off Big Sur, California

    Science.gov (United States)

    Paull, C.; Ussler, W.; Maher, N.; Greene, H. Gary; Rehder, G.; Lorenson, T.; Lee, H.

    2002-01-01

    A pockmark field was discovered during EM-300 multi-beam bathymetric surveys on the lower continental slope off the Big Sur coast of California. The field contains ~1,500 pockmarks which are between 130 and 260 m in diameter, and typically 8-12 m deep, located within a 560 km2 area. To investigate the origin of these features, piston cores were collected from both the interior and the flanks of the pockmarks, and remotely operated vehicle (ROV) video and sampling transects were conducted which passed through 19 of the pockmarks. The water column within and above the pockmarks was sampled for methane concentration. Piston cores and ROV-collected push cores show that the pockmark field is composed of monotonous fine silts and clays, and the cores within the pockmarks are indistinguishable from those outside the pockmarks. No evidence for either sediment winnowing or diagenetic alteration suggestive of fluid venting was obtained. 14C measurements of the organic carbon in the sediments indicate continuous sedimentation throughout the time resolution of the radiocarbon technique (~45,000 yr BP), with a sedimentation rate of ~10 cm per 1,000 yr both within and between the pockmarks. Concentrations of methane, dissolved inorganic carbon, sulfate, chloride, and ammonium in pore water extracted from the cores are generally similar in composition to seawater and show little change with depth, suggesting low biogeochemical activity. These pore-water chemical gradients indicate that neither significant accumulations of gas are likely to exist in the shallow subsurface (~100 m) nor is active fluid advection occurring within the sampled sediments. Taken together, the data indicate that these pockmarks are more than 45,000 yr old, are presently inactive, and contain no indications of earlier fluid or gas venting events. © 2002 Elsevier Science B.V. All rights reserved.

  7. Big Sky Carbon Sequestration Partnership

    Energy Technology Data Exchange (ETDEWEB)

    Susan M. Capalbo

    2005-11-01

    The Big Sky Carbon Sequestration Partnership, led by Montana State University, is comprised of research institutions, public entities, private-sector organizations, and the Confederated Salish and Kootenai Tribes and the Nez Perce Tribe. Efforts under this Partnership in Phase I fall into four areas: evaluation of carbon sources and sequestration sinks that will be used to determine the location of pilot demonstrations in Phase II; development of a GIS-based reporting framework that links with national networks; design of an integrated suite of monitoring, measuring, and verification technologies and assessment frameworks; and initiation of a comprehensive education and outreach program. The groundwork is in place to provide an assessment of storage capabilities for CO2 utilizing the resources found in the Partnership region (both geological and terrestrial sinks) that would complement the ongoing DOE research agenda in carbon sequestration. The region has a diverse array of geological formations that could provide storage options for carbon in one or more of its three states. Likewise, initial estimates of terrestrial sinks indicate a vast potential for increasing and maintaining soil C on forested, agricultural, and reclaimed lands. Both options include the potential for offsetting economic benefits to industry and society. Steps have been taken to ensure that the GIS-based framework is consistent among types of sinks within the Big Sky Partnership area and with the efforts of other DOE regional partnerships. The Partnership recognizes the critical importance of measurement, monitoring, and verification technologies to support not only carbon trading but all policies and programs that DOE and other agencies may want to pursue in support of GHG mitigation. The efforts in developing and implementing MMV technologies for geological sequestration reflect this concern. Research is also underway to identify and validate best management practices for soil C in the

  8. Benchmarking Big Data Systems and the BigData Top100 List.

    Science.gov (United States)

    Baru, Chaitanya; Bhandarkar, Milind; Nambiar, Raghunath; Poess, Meikel; Rabl, Tilmann

    2013-03-01

    "Big data" has become a major force of innovation across enterprises of all sizes. New platforms with increasingly more features for managing big datasets are being announced almost on a weekly basis. Yet, there is currently a lack of any means of comparability among such platforms. While the performance of traditional database systems is well understood and measured by long-established institutions such as the Transaction Processing Performance Council (TCP), there is neither a clear definition of the performance of big data systems nor a generally agreed upon metric for comparing these systems. In this article, we describe a community-based effort for defining a big data benchmark. Over the past year, a Big Data Benchmarking Community has become established in order to fill this void. The effort focuses on defining an end-to-end application-layer benchmark for measuring the performance of big data applications, with the ability to easily adapt the benchmark specification to evolving challenges in the big data space. This article describes the efforts that have been undertaken thus far toward the definition of a BigData Top100 List. While highlighting the major technical as well as organizational challenges, through this article, we also solicit community input into this process.

  9. Five Big, Big Five Issues : Rationale, Content, Structure, Status, and Crosscultural Assessment

    NARCIS (Netherlands)

    De Raad, Boele

    1998-01-01

    This article discusses the rationale, content, structure, status, and crosscultural assessment of the Big Five trait factors, focusing on topics of dispute and misunderstanding. Taxonomic restrictions of the original Big Five forerunner, the "Norman Five," are discussed, and criticisms regarding the

  10. Big Machines and Big Science: 80 Years of Accelerators at Stanford

    Energy Technology Data Exchange (ETDEWEB)

    Loew, Gregory

    2008-12-16

    Longtime SLAC physicist Greg Loew will present a trip through SLAC's origins, highlighting its scientific achievements, and provide a glimpse of the lab's future in 'Big Machines and Big Science: 80 Years of Accelerators at Stanford.'

  11. Global fluctuation spectra in big-crunch-big-bang string vacua

    International Nuclear Information System (INIS)

    Craps, Ben; Ovrut, Burt A.

    2004-01-01

    We study big-crunch-big-bang cosmologies that correspond to exact world-sheet superconformal field theories of type II strings. The string theory spacetime contains a big crunch and a big bang cosmology, as well as additional 'whisker' asymptotic and intermediate regions. Within the context of free string theory, we compute, unambiguously, the scalar fluctuation spectrum in all regions of spacetime. Generically, the big crunch fluctuation spectrum is altered while passing through the bounce singularity. The change in the spectrum is characterized by a function Δ, which is momentum and time dependent. We compute Δ explicitly and demonstrate that it arises from the whisker regions. The whiskers are also shown to lead to 'entanglement' entropy in the big bang region. Finally, in the Milne orbifold limit of our superconformal vacua, we show that Δ→1 and, hence, the fluctuation spectrum is unaltered by the big-crunch-big-bang singularity. We comment on, but do not attempt to resolve, subtleties related to gravitational back reaction and light winding modes when interactions are taken into account

  12. Mentoring in Schools: An Impact Study of Big Brothers Big Sisters School-Based Mentoring

    Science.gov (United States)

    Herrera, Carla; Grossman, Jean Baldwin; Kauh, Tina J.; McMaken, Jennifer

    2011-01-01

    This random assignment impact study of Big Brothers Big Sisters School-Based Mentoring involved 1,139 9- to 16-year-old students in 10 cities nationwide. Youth were randomly assigned to either a treatment group (receiving mentoring) or a control group (receiving no mentoring) and were followed for 1.5 school years. At the end of the first school…

  13. Big Data in Caenorhabditis elegans: quo vadis?

    Science.gov (United States)

    Hutter, Harald; Moerman, Donald

    2015-11-05

    A clear definition of what constitutes "Big Data" is difficult to identify, but we find it most useful to define Big Data as a data collection that is complete. By this criterion, researchers on Caenorhabditis elegans have a long history of collecting Big Data, since the organism was selected with the idea of obtaining a complete biological description and understanding of development. The complete wiring diagram of the nervous system, the complete cell lineage, and the complete genome sequence provide a framework to phrase and test hypotheses. Given this history, it might be surprising that the number of "complete" data sets for this organism is actually rather small--not because of lack of effort, but because most types of biological experiments are not currently amenable to complete large-scale data collection. Many are also not inherently limited, so that it becomes difficult to even define completeness. At present, we only have partial data on mutated genes and their phenotypes, gene expression, and protein-protein interaction--important data for many biological questions. Big Data can point toward unexpected correlations, and these unexpected correlations can lead to novel investigations; however, Big Data cannot establish causation. As a result, there is much excitement about Big Data, but there is also a discussion on just what Big Data contributes to solving a biological problem. Because of its relative simplicity, C. elegans is an ideal test bed to explore this issue and at the same time determine what is necessary to build a multicellular organism from a single cell. © 2015 Hutter and Moerman. This article is distributed by The American Society for Cell Biology under license from the author(s). Two months after publication it is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).

  14. Pumping Test Determination of Unsaturated Aquifer Properties

    Science.gov (United States)

    Mishra, P. K.; Neuman, S. P.

    2008-12-01

    Tartakovsky and Neuman [2007] presented a new analytical solution for flow to a partially penetrating well pumping at a constant rate from a compressible unconfined aquifer, considering the unsaturated zone. In their solution, three-dimensional, axially symmetric unsaturated flow is described by a linearized version of Richards' equation in which both hydraulic conductivity and water content vary exponentially with incremental capillary pressure head relative to its air-entry value, the latter defining the interface between the saturated and unsaturated zones. Both exponential functions are characterized by a common exponent k having the dimension of inverse length, or equivalently a dimensionless exponent kd = kb, where b is the initial saturated thickness. The authors used their solution to analyze drawdown data from a pumping test conducted by Moench et al. [2001] in a glacial outwash deposit at Cape Cod, Massachusetts. Their analysis yielded estimates of horizontal and vertical saturated hydraulic conductivities, specific storage, specific yield, and k. Recognizing that hydraulic conductivity and water content seldom vary identically with incremental capillary pressure head, as assumed by Tartakovsky and Neuman [2007], we note that k is at best an effective rather than a directly measurable soil parameter. We therefore ask: to what extent does interpretation of a pumping test based on the Tartakovsky-Neuman solution allow estimation of aquifer unsaturated parameters as described by more common constitutive water-retention and relative hydraulic conductivity models such as those of Brooks and Corey [1964] or van Genuchten [1980] and Mualem [1976a]? We address this question by showing how k may be used to estimate the capillary air-entry pressure head and the parameters of such constitutive models directly, without a need for inverse unsaturated numerical simulations of the kind described by Moench [2003]. To assess the validity of such direct estimates we use maximum
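
    For readers unfamiliar with the two model families being compared, the sketch below evaluates relative hydraulic conductivity under an exponential (Gardner-type) model of the kind assumed by Tartakovsky and Neuman, and under the van Genuchten-Mualem model mentioned in the abstract. The parameter values are illustrative assumptions, not the Cape Cod estimates.

```python
# Minimal sketch: relative hydraulic conductivity vs. capillary pressure
# head under (a) an exponential Gardner-type model and (b) the van
# Genuchten-Mualem model. All parameter values are assumptions.
import numpy as np

K_EXP = 2.0          # Gardner exponent k, 1/m (assumed)
ALPHA, N = 3.5, 2.5  # van Genuchten alpha (1/m) and n (assumed)
M = 1.0 - 1.0 / N

def kr_gardner(psi, psi_a=0.0):
    """Relative conductivity; psi = capillary head (m), psi_a = air-entry value."""
    return np.where(psi <= psi_a, np.exp(K_EXP * (psi - psi_a)), 1.0)

def kr_van_genuchten(psi):
    """Mualem-van Genuchten relative conductivity via effective saturation."""
    se = (1.0 + (ALPHA * np.abs(np.minimum(psi, 0.0))) ** N) ** (-M)
    return np.sqrt(se) * (1.0 - (1.0 - se ** (1.0 / M)) ** M) ** 2

psi = np.linspace(-2.0, 0.0, 5)  # from fairly dry up to the water table
print(np.round(kr_gardner(psi), 5))
print(np.round(kr_van_genuchten(psi), 5))
```

    The two curves decay at different rates with increasing suction, which is precisely why a single effective exponent k cannot match both conductivity and water-content behavior exactly.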

  15. Conduit enlargement in an eogenetic karst aquifer

    Science.gov (United States)

    Moore, Paul J.; Martin, Jonathan B.; Screaton, Elizabeth J.; Neuhoff, Philip S.

    2010-11-01

    Most concepts of conduit development have focused on telogenetic karst aquifers, where low matrix permeability focuses flow and dissolution along joints, fractures, and bedding planes. However, conduits also exist in eogenetic karst aquifers, despite high matrix permeability which accounts for a significant component of flow. This study investigates dissolution within a 6-km long conduit system in the eogenetic Upper Floridan aquifer of north-central Florida that begins with a continuous source of allogenic recharge at the Santa Fe River Sink and discharges from a first-magnitude spring at the Santa Fe River Rise. Three sources of water to the conduit include the allogenic recharge, diffuse recharge through epikarst, and mineralized water upwelling from depth. Results of sampling and inverse modeling using PHREEQC suggest that dissolution within the conduit is episodic, occurring during only 30% of 16 sampling times between March 2003 and April 2007. During low-flow conditions, carbonate-saturated water flows from the matrix to the conduit, restricting contact between undersaturated allogenic water and the conduit wall. When gradients reverse during high-flow conditions, undersaturated allogenic recharge enters the matrix. During these limited periods, estimates of dissolution within the conduit suggest wall retreat averages about 4 × 10^-6 m/day, in agreement with upper estimates of maximum wall retreat for telogenetic karst. Because dissolution is episodic, time-averaged dissolution rates in the sink-rise system result in a wall retreat rate of about 7 × 10^-7 m/day, which is at the lower end of wall retreat for telogenetic karst. Because of the high-permeability matrix, conduits in eogenetic karst thus enlarge not just at the walls of fractures or pre-existing conduits, such as those in telogenetic karst, but also may produce a friable halo surrounding the conduits that may be removed by additional mechanical processes. These observations stress the

  16. CO2/Brine transport into shallow aquifers along fault zones.

    Science.gov (United States)

    Keating, Elizabeth H; Newell, Dennis L; Viswanathan, Hari; Carey, J W; Zyvoloski, G; Pawar, Rajesh

    2013-01-02

    Unintended release of CO(2) from carbon sequestration reservoirs poses a well-recognized risk to groundwater quality. Research has largely focused on in situ CO(2)-induced pH depression and subsequent trace metal mobilization. In this paper we focus on a second mechanism: upward intrusion of displaced brine or brackish water into a shallow aquifer as a result of CO(2) injection. Studies of two natural analog sites provide insights into physical and chemical mechanisms controlling both brackish water and CO(2) intrusion into shallow aquifers along fault zones. At the Chimayó, New Mexico site, shallow groundwater near the fault is enriched in CO(2) and, in some places, salinity is significantly elevated. In contrast, at the Springerville, Arizona site CO(2) is leaking upward through brine aquifers but does not appear to be increasing salinity in the shallow aquifer. Using multiphase transport simulations we show conditions under which significant CO(2) can be transported through deep brine aquifers into shallow layers. Only a subset of these conditions favor entrainment of salinity into the shallow aquifer: high aspect-ratio leakage pathways and viscous coupling between the fluid phases. Recognition of the conditions under which salinity is favored to be cotransported with CO(2) into shallow aquifers will be important in environmental risk assessments.

  17. Improving Healthcare Using Big Data Analytics

    Directory of Open Access Journals (Sweden)

    Revanth Sonnati

    2017-03-01

    Full Text Available The current era of information technology can fairly be called the era of Big Data. The fields of science, engineering, and technology produce data at an exponential rate, generating exabytes of data every day, and Big Data helps us explore and reinvent many areas, including but not limited to education, health, and law. The primary purpose of this paper is to provide an in-depth analysis of healthcare applications of big data and analytics. The emphasis is not merely on storing big data so that we can look back at history, but on analyzing it to improve medication and services. Although many big data implementations are developed in-house, the implementation proposed here aims at broader applicability using Hadoop, which is just the tip of the iceberg. The focus of this paper is not limited to the improvement and analysis of the data; it also covers the strengths and drawbacks of this approach compared with the conventional techniques available.

  18. Big Data: Concept, Potentialities and Vulnerabilities

    Directory of Open Access Journals (Sweden)

    Fernando Almeida

    2018-03-01

    Full Text Available The evolution of information systems and the growth in the use of the Internet and social networks have caused an explosion in the amount of available data relevant to the activities of companies. The treatment of these data is therefore vital to support operational, tactical, and strategic decisions. This paper presents the concept of big data and the main technologies that support the analysis of large data volumes. The potential of big data is explored across nine sectors of activity: financial, retail, healthcare, transport, agriculture, energy, manufacturing, public, and media and entertainment. In addition, the main current opportunities, vulnerabilities, and privacy challenges of big data are discussed. It was possible to conclude that, despite the potential for big data to grow in the identified areas, there are still challenges that need to be considered and mitigated, namely the privacy of information, the availability of qualified human resources to work with big data, and the promotion of a data-driven organizational culture.

  19. Big data analytics a management perspective

    CERN Document Server

    Corea, Francesco

    2016-01-01

    This book is about innovation, big data, and data science seen from a business perspective. Big data is a buzzword nowadays, and there is a growing need among practitioners to better understand the phenomenon, starting from a clearly stated definition. This book aims to be a starting point for executives who want (and need) to keep pace with the technological breakthroughs introduced by new analytical techniques and piles of data. Common myths about big data are explained, and a series of different strategic approaches is provided. By browsing the book, it is possible to learn how to implement a big data strategy and how to use a maturity framework to monitor the progress of the data science team, as well as how to move forward from one stage to the next. Crucial challenges related to big data are discussed, some of which are general – such as ethics, privacy, and ownership – while others concern more specific business situations (e.g., initial public offering, growth st...

  20. Physics with Big Karl Brainstorming. Abstracts

    International Nuclear Information System (INIS)

    Machner, H.; Lieb, J.

    2000-08-01

    Before summarizing details of the meeting, a short description of the spectrometer facility Big Karl is given. The facility is essentially a new instrument using refurbished dipole magnets from its predecessor. The large-acceptance quadrupole magnets and the beam optics are new. Big Karl has a design very similar to the focusing spectrometers at MAMI (Mainz), AGOR (Groningen) and the high-resolution spectrometer (HRS) in Hall A at Jefferson Laboratory, with ΔE/E = 10^-4 but a somewhat lower maximum momentum. The focal-plane detectors, consisting of multiwire drift chambers and scintillating hodoscopes, are similar. Unlike HRS, Big Karl still needs Cerenkov counters and polarimeters in its focal plane; these detectors are necessary to perform some of the experiments proposed during the brainstorming. In addition, Big Karl allows emission-angle reconstruction via track measurements in its focal plane with high resolution. In the following, the physics highlights and the proposed and potential experiments are summarized. During the meeting it became obvious that the physics to be explored at Big Karl can be grouped into five distinct categories, and this summary is organized accordingly. (orig.)