WorldWideScience

Sample records for big lost river

  1. Water resources in the Big Lost River Basin, south-central Idaho

    Science.gov (United States)

    Crosthwaite, E.G.; Thomas, C.A.; Dyer, K.L.

    1970-01-01

    The Big Lost River basin occupies about 1,400 square miles in south-central Idaho and drains to the Snake River Plain. The economy in the area is based on irrigation agriculture and stockraising. The basin is underlain by a diverse assemblage of rocks which range in age from Precambrian to Holocene. The assemblage is divided into five groups on the basis of their hydrologic characteristics: carbonate rocks, noncarbonate rocks, cemented alluvial deposits, unconsolidated alluvial deposits, and basalt. The principal aquifer is unconsolidated alluvial fill that is several thousand feet thick in the main valley. The carbonate rocks are the major bedrock aquifer. They absorb a significant amount of precipitation and, in places, are very permeable as evidenced by large springs discharging from or near exposures of carbonate rocks. Only the alluvium, carbonate rock, and locally the basalt yield significant amounts of water. A total of about 67,000 acres is irrigated with water diverted from the Big Lost River. The annual flow of the river is highly variable and water-supply deficiencies are common. About 1 out of every 2 years is considered a drought year. In the period 1955-68, about 175 irrigation wells were drilled to provide a supplemental water supply to land irrigated from the canal system and to irrigate an additional 8,500 acres of new land. Average annual precipitation ranged from 8 inches on the valley floor to about 50 inches at some higher elevations during the base period 1944-68. The estimated water yield of the Big Lost River basin averaged 650 cfs (cubic feet per second) for the base period. Of this amount, 150 cfs was transpired by crops, 75 cfs left the basin as streamflow, and 425 cfs left as ground-water flow. A map of precipitation and estimated values of evapotranspiration were used to construct a water-yield map. A distinctive feature of the Big Lost River basin is the large interchange of water from surface streams into the ground and from the
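
    The quoted averages form a simple basin water budget: the 650-cfs water yield is partitioned into crop transpiration, streamflow leaving the basin, and ground-water outflow. A minimal sketch, using only the four averages from the abstract plus the standard cfs-to-acre-feet conversion, that checks the budget closes:

    ```python
    # Illustrative check of the Big Lost River basin water budget quoted above.
    # The cfs values are the 1944-68 base-period averages from the abstract;
    # this sketch only verifies the components sum to the stated yield and
    # converts them to acre-feet per year.

    CFS_TO_ACRE_FT_PER_YR = 86400 * 365 / 43560  # ~724 acre-ft/yr per cfs

    water_yield = 650        # estimated basin water yield, cfs
    crop_et = 150            # transpired by crops, cfs
    streamflow_out = 75      # leaves the basin as streamflow, cfs
    groundwater_out = 425    # leaves the basin as ground-water flow, cfs

    assert crop_et + streamflow_out + groundwater_out == water_yield

    for name, cfs in [("water yield", water_yield), ("crop ET", crop_et),
                      ("streamflow out", streamflow_out), ("ground water out", groundwater_out)]:
        print(f"{name:16s} {cfs:4d} cfs = {cfs * CFS_TO_ACRE_FT_PER_YR:10,.0f} acre-ft/yr")
    ```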

  2. Summary of the Big Lost River fish study on the Idaho National Engineering Laboratory Site

    International Nuclear Information System (INIS)

    Overton, C.K.; Johnson, D.W.

    1978-01-01

    Winter fish mortality and fish migration in the Big Lost River were related to natural phenomena and man-made impacts. Low winter flows resulted in a reduction in habitat and increased rainbow trout mortality. Man-altered flows stimulated movement and created deleterious conditions. Migratory patterns were related to water discharge and temperature. A food habit study of three sympatric salmonid fishes was undertaken during a low-water period. The ratio of food items differed among the three species. Flesh of salmonid fishes from within the INEL Site boundary was monitored for three years for radionuclides. Only one trout contained Cs-137 concentrations above the minimum detection limits.

  3. 33 CFR 117.677 - Big Sunflower River.

    Science.gov (United States)

    2010-07-01

    ... 33 Navigation and Navigable Waters 1 2010-07-01 2010-07-01 false Big Sunflower River. 117.677 Section 117.677 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY BRIDGES DRAWBRIDGE OPERATION REGULATIONS Specific Requirements Mississippi § 117.677 Big Sunflower River. The draw of...

  4. Big Bang Day : Afternoon Play - Torchwood: Lost Souls

    CERN Multimedia

    2008-01-01

    Martha Jones, ex-time traveller and now working as a doctor for a UN task force, has been called to CERN where they're about to activate the Large Hadron Collider. Once activated, the Collider will fire beams of protons together recreating conditions a billionth of a second after the Big Bang - and potentially allowing the human race a greater insight into what the Universe is made of. But so much could go wrong - it could open a gateway to a parallel dimension, or create a black hole - and now voices from the past are calling out to people and scientists have started to disappear... Where have the missing scientists gone? What is the secret of the glowing man? What is lurking in the underground tunnel? And do the dead ever really stay dead? Lost Souls is a spin-off from the award-winning BBC Wales TV production Torchwood. It stars John Barrowman, Freema Agyeman, Eve Myles, Gareth David-Lloyd, Lucy Montgomery (of Titty Bang Bang) and Stephen Critchlow.

  5. Hydraulic Characteristics of Bedrock Constrictions and Evaluation of One- and Two-Dimensional Models of Flood Flow on the Big Lost River at the Idaho National Engineering and Environmental Laboratory, Idaho

    Science.gov (United States)

    Berenbrock, Charles; Rousseau, Joseph P.; Twining, Brian V.

    2007-01-01

    A 1.9-mile reach of the Big Lost River, between the Idaho National Engineering and Environmental Laboratory (INEEL) diversion dam and the Pioneer diversion structures, was investigated to evaluate the effects of streambed erosion and bedrock constrictions on model predictions of water-surface elevations. Two one-dimensional (1-D) models, a fixed-bed surface-water flow model (HEC-RAS) and a movable-bed surface-water flow and sediment-transport model (HEC-6), were used to evaluate these effects. The results of these models were compared to the results of a two-dimensional (2-D) fixed-bed model [Transient Inundation 2-Dimensional (TRIM2D)] that had previously been used to predict water-surface elevations for peak flows with sufficient stage and stream power to erode floodplain terrain features (Holocene inset terraces referred to as BLR#6 and BLR#8) dated at 300 to 500 years old, and an unmodified Pleistocene surface (referred to as the saddle area) dated at 10,000 years old; and to extend the period of record at the Big Lost River streamflow-gaging station near Arco for flood-frequency analyses. The extended record was used to estimate the magnitude of the 100-year flood and the magnitude of floods with return periods as long as 10,000 years. In most cases, the fixed-bed TRIM2D model simulated higher water-surface elevations, shallower flow depths, higher flow velocities, and higher stream powers than the fixed-bed HEC-RAS and movable-bed HEC-6 models for the same peak flows. The HEC-RAS model required flow increases of 83 percent [100 to 183 cubic meters per second (m3/s)], and 45 percent (100 to 145 m3/s) to match TRIM2D simulations of water-surface elevations at two paleoindicator sites that were used to determine peak flows (100 m3/s) with an estimated return period of 300 to 500 years; and an increase of 13 percent (150 to 169 m3/s) to match TRIM2D water-surface elevations at the saddle area that was used to establish the peak flow (150 m3/s) of a paleoflood
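
    The flow comparisons above are simple percent increases between the discharge TRIM2D needs and the discharge HEC-RAS needs to reach the same water-surface elevation. A minimal sketch using only the discharges quoted in the abstract:

    ```python
    # Percent increase in discharge required by the 1-D HEC-RAS model to match
    # TRIM2D water-surface elevations, using the discharges quoted above.

    def percent_increase(q_trim2d: float, q_hecras: float) -> float:
        """Percent by which q_hecras exceeds q_trim2d."""
        return 100.0 * (q_hecras - q_trim2d) / q_trim2d

    # (site, TRIM2D discharge in m3/s, HEC-RAS discharge in m3/s)
    cases = [
        ("paleoindicator site 1", 100.0, 183.0),  # reported as 83 percent
        ("paleoindicator site 2", 100.0, 145.0),  # reported as 45 percent
        ("saddle area",           150.0, 169.0),  # reported as 13 percent
    ]

    for site, q_2d, q_1d in cases:
        print(f"{site}: +{percent_increase(q_2d, q_1d):.0f}% flow needed by HEC-RAS")
    ```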

  6. Occurrence and transport of nitrogen in the Big Sunflower River, northwestern Mississippi, October 2009-June 2011

    Science.gov (United States)

    Barlow, Jeannie R.B.; Coupe, Richard H.

    2014-01-01

    The Big Sunflower River Basin, located within the Yazoo River Basin, is subject to large annual inputs of nitrogen from agriculture, atmospheric deposition, and point sources. Understanding how nutrients are transported in, and downstream from, the Big Sunflower River is key to quantifying their eutrophying effects on the Gulf. Recent results from two Spatially Referenced Regressions on Watershed attributes (SPARROW models), which include the Big Sunflower River, indicate minimal losses of nitrogen in stream reaches typical of the main channels of major river systems. If SPARROW assumptions of relatively conservative transport of nitrogen are correct and surface-water losses through the bed of the Big Sunflower River are negligible, then options for managing nutrient loads to the Gulf of Mexico may be limited. Simply put, if every pound of nitrogen entering the Delta is eventually delivered to the Gulf, then the only effective nutrient management option in the Delta is to reduce inputs. If, on the other hand, it can be shown that processes within river channels of the Mississippi Delta act to reduce the mass of nitrogen in transport, other hydrologic approaches may be designed to further limit nitrogen transport. Direct validation of existing SPARROW models for the Delta is a first step in assessing the assumptions underlying those models. In order to characterize spatial and temporal variability of nitrogen in the Big Sunflower River Basin, water samples were collected at four U.S. Geological Survey gaging stations located on the Big Sunflower River between October 1, 2009, and June 30, 2011. Nitrogen concentrations were generally highest at each site during the spring of the 2010 water year and the fall and winter of the 2011 water year. Additionally, the dominant form of nitrogen varied between sites. For example, in samples collected from the most upstream site (Clarksdale), the concentration of organic nitrogen was generally higher than the concentrations of

  7. 76 FR 53827 - Safety Zone; Big Sioux River From the Military Road Bridge North Sioux City to the Confluence of...

    Science.gov (United States)

    2011-08-30

    ...-AA00 Safety Zone; Big Sioux River From the Military Road Bridge North Sioux City to the Confluence of... restricting navigation on the Big Sioux River from the Military Road Bridge in North Sioux City, South Dakota... zone on the Big Sioux River from the Military Road Bridge in North Sioux City, SD at 42.52 degrees...

  8. Caverns Measureless to Man: Subterranean Rivers and Adventurous Masculinities in the Victorian Lost World Novel

    DEFF Research Database (Denmark)

    McCausland, Elly

    2018-01-01

    This article examines a recurring trope in late Victorian ‘lost world’ adventure novels: the terrifying descent down a subterranean river into the bowels of the earth. More than simply an exciting episode, the subterranean river journey reflects narrative strategies and thematic concerns key to b...

  9. Colonial waterbird predation on Lost River and Shortnose suckers in the Upper Klamath Basin

    Science.gov (United States)

    Evans, Allen F.; Hewitt, David A.; Payton, Quinn; Cramer, Bradley M.; Collis, Ken; Roby, Daniel D.

    2016-01-01

    We evaluated predation on Lost River Suckers Deltistes luxatus and Shortnose Suckers Chasmistes brevirostris by American white pelicans Pelecanus erythrorhynchos and double-crested cormorants Phalacrocorax auritus nesting at mixed-species colonies in the Upper Klamath Basin of Oregon and California during 2009–2014. Predation was evaluated by recovering (detecting) PIT tags from tagged fish on bird colonies and calculating minimum predation rates, as the percentage of available suckers consumed, adjusted for PIT tag detection probabilities but not deposition probabilities (i.e., probability an egested tag was deposited on- or off-colony). Results indicate that impacts of avian predation varied by sucker species, age-class (adult, juvenile), bird colony location, and year, demonstrating dynamic predator–prey interactions. Tagged suckers ranging in size from 72 to 730 mm were susceptible to cormorant or pelican predation; all but the largest Lost River Suckers were susceptible to bird predation. Minimum predation rate estimates ranged annually from <0.1% to 4.6% of the available PIT-tagged Lost River Suckers and from <0.1% to 4.2% of the available Shortnose Suckers, and predation rates were consistently higher on suckers in Clear Lake Reservoir, California, than on suckers in Upper Klamath Lake, Oregon. There was evidence that bird predation on juvenile suckers (species unknown) in Upper Klamath Lake was higher than on adult suckers in Upper Klamath Lake, where minimum predation rates ranged annually from 5.7% to 8.4% of available juveniles. Results suggest that avian predation is a factor limiting the recovery of populations of Lost River and Shortnose suckers, particularly juvenile suckers in Upper Klamath Lake and adult suckers in Clear Lake Reservoir. Additional research is needed to measure predator-specific PIT tag deposition probabilities (which, based on other published studies, could increase predation rates presented herein by a factor of roughly 2
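
    The "minimum predation rate" described above is the number of recovered PIT tags expanded by the probability of detecting a tag deposited on a colony, divided by the number of tagged suckers available. A minimal sketch of that calculation with made-up tag counts and probabilities (not values from the study):

    ```python
    # Illustrative minimum predation-rate calculation in the spirit of the study
    # above: tags recovered on a bird colony are expanded for imperfect detection
    # of deposited tags, but NOT for deposition probability (tags egested
    # off-colony are ignored), hence a "minimum" rate. All numbers are
    # hypothetical placeholders, not data from the study.

    def minimum_predation_rate(tags_recovered: int,
                               detection_prob: float,
                               tags_available: int) -> float:
        """Percent of available PIT-tagged fish consumed (minimum estimate)."""
        consumed_estimate = tags_recovered / detection_prob  # expand for missed tags
        return 100.0 * consumed_estimate / tags_available

    rate = minimum_predation_rate(tags_recovered=12,
                                  detection_prob=0.60,
                                  tags_available=1_500)
    print(f"minimum predation rate: {rate:.1f}% of available tagged suckers")
    ```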

  10. The trashing of Big Green

    International Nuclear Information System (INIS)

    Felten, E.

    1990-01-01

    The Big Green initiative on California's ballot lost by a margin of 2-to-1. Green measures lost in five other states, shocking ecology-minded groups. According to the postmortem by environmentalists, Big Green was a victim of poor timing and big spending by the opposition. Now its supporters plan to break up the bill and try to pass some provisions in the Legislature.

  11. 76 FR 38013 - Safety Zone; Big Sioux River From the Military Road Bridge North Sioux City to the Confluence of...

    Science.gov (United States)

    2011-06-29

    ...-AA00 Safety Zone; Big Sioux River From the Military Road Bridge North Sioux City to the Confluence of... Military Road Bridge in North Sioux City, South Dakota to the confluence of the Missouri River and... Big Sioux River from the Military Road Bridge in North Sioux City, SD at 42.52 degrees North, 096.48...

  12. Effects of Chiloquin Dam on spawning distribution and larval emigration of Lost River, shortnose, and Klamath largescale suckers in the Williamson and Sprague Rivers, Oregon

    Science.gov (United States)

    Martin, Barbara A.; Hewitt, David A.; Ellsworth, Craig M.

    2013-01-01

    Chiloquin Dam was constructed in 1914 on the Sprague River near the town of Chiloquin, Oregon. The dam was identified as a barrier that potentially inhibited or prevented the upstream spawning migrations and other movements of endangered Lost River (Deltistes luxatus) and shortnose (Chasmistes brevirostris) suckers, as well as other fish species. In 2002, the Bureau of Reclamation led a working group that examined several alternatives to improve fish passage at Chiloquin Dam. Ultimately it was decided that dam removal was the best alternative, and the dam was removed in the summer of 2008. The U.S. Geological Survey conducted a long-term study on the spawning ecology of Lost River, shortnose, and Klamath largescale suckers (Catostomus snyderi) in the Sprague and lower Williamson Rivers from 2004 to 2010. The objective of this study was to evaluate shifts in spawning distribution following the removal of Chiloquin Dam. Radio telemetry was used in conjunction with larval production data and detections of fish tagged with passive integrated transponders (PIT tags) to evaluate whether dam removal resulted in increased utilization of spawning habitat farther upstream in the Sprague River. Increased densities of drifting larvae were observed at a site in the lower Williamson River after the dam was removed, but no substantial changes occurred upstream of the former dam site. Adult spawning migrations primarily were influenced by water temperature and did not change with the removal of the dam. Emigration of larvae consistently occurred about 3-4 weeks after adults migrated into a section of river. Detections of PIT-tagged fish showed increases in the numbers of all three suckers that migrated upstream of the dam site following removal, but the increases for Lost River and shortnose suckers were relatively small compared to the total number of fish that made a spawning migration in a given season. Increases for Klamath largescale suckers were more substantial. Post-dam removal monitoring

  13. Habitat Evaluation Procedures (HEP) Report; Big Island - The McKenzie River, Technical Report 1998-2001.

    Energy Technology Data Exchange (ETDEWEB)

    Sieglitz, Greg

    2001-03-01

    The Big Island site is located in the McKenzie River flood plain, containing remnant habitats of what was once more common in this area. A diverse array of flora and fauna, representing significant wildlife habitats, is present on the site. Stands of undisturbed forested wetlands, along with riparian shrub habitats and numerous streams and ponds, support a diversity of wildlife species, including neotropical migratory songbirds, raptors, mammals, reptiles, and amphibians (including two State-listed Sensitive Critical species). The project is located in eastern Springfield, Oregon (Figure 1). The project area encompasses 187 acres under several ownerships in Section 27 of Township 17S, Range 2W. Despite some invasion of non-native species, the site contains large areas of relatively undisturbed wildlife habitat. Over several site visits, a variety of wildlife and signs of wildlife were observed, including an active great blue heron rookery, red-legged frog egg masses, signs of beaver, and a bald eagle. Wildlife habitat values resulting from the purchase of this site will contribute toward the goal of mitigating for habitat lost as outlined in the Bonneville Power Administration's (BPA) Mitigation and Enhancement Plan for the Willamette River Basin. Under this Plan, mitigation goals and objectives were developed as a result of the loss of wildlife habitat due to the construction of Federal hydroelectric facilities in the Willamette River Basin. Results of the Habitat Evaluation Procedures (HEP) will be used to: (1) determine the current habitat status of the study area and habitat enhancement potential of the site consistent with wildlife mitigation goals and objectives; and (2) develop a management plan for the area.
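
    Habitat Evaluation Procedures express habitat value in habitat units, conventionally the habitat suitability index (HSI, 0-1) for a cover type multiplied by its area. A minimal bookkeeping sketch under that convention; only the 187-acre site total comes from the report, while the cover types, acreages, and HSI values are hypothetical:

    ```python
    # Sketch of Habitat Evaluation Procedures (HEP) bookkeeping:
    # habitat units (HUs) = habitat suitability index (HSI, 0-1) x acres.
    # Cover types, acreages, and HSI values below are hypothetical placeholders;
    # only the 187-acre site total matches the report.

    cover_types = [
        # (cover type,            acres, HSI)
        ("forested wetland",       80.0, 0.85),
        ("riparian shrub",         60.0, 0.70),
        ("open water / ponds",     27.0, 0.60),
        ("disturbed / non-native", 20.0, 0.30),
    ]

    total_acres = sum(acres for _, acres, _ in cover_types)
    total_hus = sum(acres * hsi for _, acres, hsi in cover_types)

    assert abs(total_acres - 187.0) < 1e-6  # matches the reported site size
    print(f"site total: {total_acres:.0f} acres, {total_hus:.1f} habitat units")
    ```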

  14. 76 FR 76337 - Endangered and Threatened Wildlife and Plants; Designation of Critical Habitat for Lost River...

    Science.gov (United States)

    2011-12-07

    ... information on the Lost River sucker's and shortnose sucker's biology and habitat, population abundance and... consumed chironomid larvae as well as micro-crustaceans (amphipods, copepods, cladocerans, and ostracods... information above, we identify an abundant food base, including a broad array of chironomids, micro...

  15. Numerical simulation of groundwater and surface-water interactions in the Big River Management Area, central Rhode Island

    Science.gov (United States)

    Masterson, John P.; Granato, Gregory E.

    2013-01-01

    The Rhode Island Water Resources Board is considering use of groundwater resources from the Big River Management Area in central Rhode Island because increasing water demands in Rhode Island may exceed the capacity of current sources. Previous water-resources investigations in this glacially derived, valley-fill aquifer system have focused primarily on the effects of potential groundwater-pumping scenarios on streamflow depletion; however, the effects of groundwater withdrawals on wetlands have not been assessed, and such assessments are a requirement of the State’s permitting process to develop a water supply in this area. A need for an assessment of the potential effects of pumping on wetlands in the Big River Management Area led to a cooperative agreement in 2008 between the Rhode Island Water Resources Board, the U.S. Geological Survey, and the University of Rhode Island. This partnership was formed with the goal of developing methods for characterizing wetland vegetation, soil type, and hydrologic conditions, and monitoring and modeling water levels for pre- and post-water-supply development to assess potential effects of groundwater withdrawals on wetlands. This report describes the hydrogeology of the area and the numerical simulations that were used to analyze the interaction between groundwater and surface water in response to simulated groundwater withdrawals. The results of this analysis suggest that, given the hydrogeologic conditions in the Big River Management Area, a standard 5-day aquifer test may not be sufficient to determine the effects of pumping on water levels in nearby wetlands. Model simulations showed water levels beneath Reynolds Swamp declined by about 0.1 foot after 5 days of continuous pumping, but continued to decline by an additional 4 to 6 feet as pumping times were increased from a 5-day simulation period to a simulation period representative of long-term average monthly conditions. This continued decline in water levels with

  16. Surface-water quality and suspended-sediment quantity and quality within the Big River Basin, southeastern Missouri, 2011-13

    Science.gov (United States)

    Barr, Miya N.

    2016-01-28

    Missouri was the leading producer of lead in the United States—as well as the world—for more than a century. One of the lead sources is known as the Old Lead Belt, located in southeast Missouri. The primary ore mineral in the region is galena, which can be found both in surface deposits and underground as deep as 200 feet. More than 8.5 million tons of lead were produced from the Old Lead Belt before operations ceased in 1972. Although active lead mining has ended, the effects of mining activities still remain in the form of large mine waste piles on the landscape, typically near tributaries and the main stem of the Big River, which drains the Old Lead Belt. Six large mine waste piles, encompassing more than 2,800 acres, exist within the Big River Basin. These six mine waste piles have been an available source of trace element-rich suspended sediments transported by natural erosional processes downstream into the Big River.

  17. 78 FR 56264 - Big Bear Mining Corp., Four Rivers BioEnergy, Inc., Mainland Resources, Inc., QI Systems Inc...

    Science.gov (United States)

    2013-09-12

    ... SECURITIES AND EXCHANGE COMMISSION [File No. 500-1] Big Bear Mining Corp., Four Rivers BioEnergy, Inc., Mainland Resources, Inc., QI Systems Inc., South Texas Oil Co., and Synova Healthcare Group, Inc... that there is a lack of current and accurate information concerning the securities of Big Bear Mining...

  18. Hydrogeologic data for the Big River-Mishnock River stream-aquifer system, central Rhode Island

    Science.gov (United States)

    Craft, P.A.

    2001-01-01

    Hydrogeology, ground-water development alternatives, and water quality in the Big River-Mishnock River stream-aquifer system in central Rhode Island are being investigated as part of a long-term cooperative program between the Rhode Island Water Resources Board and the U.S. Geological Survey to evaluate the ground-water resources throughout Rhode Island. The study area includes the Big River drainage basin and that portion of the Mishnock River drainage basin upstream from the Mishnock River at State Route 3. This report presents geologic data and hydrologic and water-quality data for ground and surface water. Ground-water data were collected from July 1996 through September 1998 from a network of observation wells consisting of existing wells and wells installed for this study, which provided a broad distribution of data-collection sites throughout the study area. Streambed piezometers were used to measure head differences between surface water and ground water to help evaluate stream-aquifer interactions throughout the study area. The types of data presented include monthly ground-water levels, average daily ground-water withdrawals, drawdown data from aquifer tests, and water-quality data. Historical water-level data from other wells within the study area also are presented in this report. Surface-water data were obtained from a network consisting of surface-water impoundments, such as ponds and reservoirs, existing and newly established partial-record stream-discharge sites, and synoptic surface-water-quality sites. Water levels were collected monthly from the surface-water impoundments. Stream-discharge measurements were made at partial-record sites to provide measurements of inflow, outflow, and internal flow throughout the study area. Specific conductance was measured monthly at partial-record sites during the study, and also during the fall and spring of 1997 and 1998 at 41 synoptic sites throughout the study area. General geologic data, such as

  19. Surface-water and karst groundwater interactions and streamflow-response simulations of the karst-influenced upper Lost River watershed, Orange County, Indiana

    Science.gov (United States)

    Bayless, E. Randall; Cinotto, Peter J.; Ulery, Randy L.; Taylor, Charles J.; McCombs, Gregory K.; Kim, Moon H.; Nelson, Hugh L.

    2014-01-01

    The U.S. Geological Survey (USGS), in cooperation with the U.S. Army Corps of Engineers (USACE) and the Indiana Office of Community and Rural Affairs (OCRA), conducted a study of the upper Lost River watershed in Orange County, Indiana, from 2012 to 2013. Streamflow and groundwater data were collected at 10 data-collection sites from at least October 2012 until April 2013, and a preliminary Water Availability Tool for Environmental Resources (WATER)-TOPMODEL based hydrologic model was created to increase understanding of the complex, karstic hydraulic and hydrologic system present in the upper Lost River watershed, Orange County, Ind. Statistical assessment of the optimized hydrologic-model results was promising, returning correlation coefficients for simulated and measured stream discharge of 0.58 and 0.60 and Nash-Sutcliffe efficiency values of 0.56 and 0.39 for USGS streamflow-gaging stations 03373530 (Lost River near Leipsic, Ind.) and 03373560 (Lost River near Prospect, Ind.), respectively. Additional information to refine drainage divides is needed before applying the model to the entire karst region of south-central Indiana. Surface-water and groundwater data were used to tentatively quantify the complex hydrologic processes taking place within the watershed and provide increased understanding for future modeling and management applications. The data indicate that during wet-weather periods and after certain intense storms, the hydraulic capacity of swallow holes and subsurface conduits is overwhelmed with excess water that flows onto the surface in dry-bed relic stream channels and karst paleovalleys. Analysis of discharge data collected at USGS streamflow-gaging station 03373550 (Orangeville Rise, at Orangeville, Ind.), and other ancillary data-collection sites in the watershed, indicates that a bounding condition is likely present, and drainage from the underlying karst conduit system is potentially limited to near 200 cubic feet per second. This
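
    The two fit statistics quoted above are standard hydrologic-model skill scores: the correlation coefficient and the Nash-Sutcliffe efficiency, NSE = 1 - Σ(obs - sim)² / Σ(obs - mean(obs))². A minimal sketch with synthetic discharge series (not data from the study):

    ```python
    # Nash-Sutcliffe efficiency (NSE) and Pearson correlation for simulated vs.
    # observed discharge, the two skill scores quoted in the abstract.
    # The discharge values below are synthetic placeholders.
    import numpy as np

    observed  = np.array([12.0, 35.0, 80.0, 140.0, 95.0, 40.0, 18.0])  # cfs
    simulated = np.array([15.0, 30.0, 70.0, 155.0, 90.0, 48.0, 22.0])  # cfs

    def nash_sutcliffe(obs: np.ndarray, sim: np.ndarray) -> float:
        return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

    nse = nash_sutcliffe(observed, simulated)
    r = np.corrcoef(observed, simulated)[0, 1]
    print(f"NSE = {nse:.2f}, correlation coefficient r = {r:.2f}")
    ```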

  20. Inter-annual variability in apparent relative production, survival, and growth of juvenile Lost River and shortnose suckers in Upper Klamath Lake, Oregon, 2001–15

    Science.gov (United States)

    Burdick, Summer M.; Martin, Barbara A.

    2017-06-15

    Executive Summary: Populations of the once abundant Lost River (Deltistes luxatus) and shortnose suckers (Chasmistes brevirostris) of the Upper Klamath Basin decreased so substantially throughout the 20th century that they were listed under the Endangered Species Act in 1988. Major landscape alterations, deterioration of water quality, and competition with and predation by exotic species are listed as primary causes of the decreases in populations. Upper Klamath Lake populations are decreasing because fish lost due to adult mortality, which is relatively low for adult Lost River suckers and variable for adult shortnose suckers, are not replaced by new young adult suckers recruiting into known adult spawning aggregations. Catch-at-age and size data indicate that most adult suckers presently in Upper Klamath Lake spawning populations were hatched around 1991. While a lack of egg production and emigration of young fish (especially larvae) may contribute, catch-at-length and age data indicate high mortality during the first summer or winter of life may be the primary limitation to the recruitment of young adults. The causes of juvenile sucker mortality are unknown. We compiled and analyzed catch, length, age, and species data on juvenile suckers from Upper Klamath Lake from eight prior studies conducted from 2001 to 2015 to examine annual variation in apparent production, survival, and growth of young suckers. We used a combination of qualitative assessments, general linear models, and linear regression to make inferences about annual differences in juvenile sucker dynamics. The intent of this exercise is to provide information that can be compared to annual variability in environmental conditions with the hope of understanding what drives juvenile sucker population dynamics. Age-0 Lost River suckers generally grew faster than age-0 shortnose suckers, but the difference in growth rates between the two species varied among years. This unsynchronized annual variation in

  21. Snapping turtles (Chelydra serpentina) as biomonitors of lead contamination of the Big River in Missouri's Old Lead Belt

    Energy Technology Data Exchange (ETDEWEB)

    Overmann, S.R.; Krajicek, J.J. [Southeast Missouri State Univ., Cape Girardeau, MO (United States). Dept. of Biology

    1995-04-01

    The usefulness of common snapping turtles (Chelydra serpentina) as biomonitors of lead (Pb) contamination of aquatic ecosystems was assessed. Thirty-seven snapping turtles were collected from three sites on the Big River, an Ozarkian stream contaminated with Pb mine tailings. Morphometric measurements, tissue Pb concentrations (muscle, blood, bone, carapace, brain, and liver), δ-aminolevulinic acid dehydratase (δ-ALAD) activity, hematocrit, hemoglobin, plasma glucose, osmolality, and chloride ion content were measured. The data showed no effects of Pb contamination on capture success or morphological measurements. Tissue Pb concentrations were related to capture location. Hematocrit, plasma osmolality, plasma glucose, and plasma chloride ion content were not significantly different with respect to capture location. The δ-ALAD activity levels were decreased in turtles taken from contaminated sites. Lead levels in the Big River do not appear to be adversely affecting the snapping turtles of the river. Chelydra serpentina is a useful species for biomonitoring of Pb-contaminated aquatic environments.

  22. Characteristics of dissolved organic matter in the Upper Klamath River, Lost River, and Klamath Straits Drain, Oregon and California

    Science.gov (United States)

    Goldman, Jami H.; Sullivan, Annett B.

    2017-12-11

    Concentrations of particulate organic carbon (POC) and dissolved organic carbon (DOC), which together comprise total organic carbon, were measured in this reconnaissance study at sampling sites in the Upper Klamath River, Lost River, and Klamath Straits Drain in 2013–16. Optical absorbance and fluorescence properties of dissolved organic matter (DOM), which contains DOC, also were analyzed. Parallel factor analysis was used to decompose the optical fluorescence data into five key components for all samples. Principal component analysis (PCA) was used to investigate differences in DOM source and processing among sites. At all sites in this study, average DOC concentrations were higher than average POC concentrations. The highest DOC concentrations were at sites in the Klamath Straits Drain and at Pump Plant D. Evaluation of optical properties indicated that Klamath Straits Drain DOM had a refractory, terrestrial source, likely extracted from the interaction of this water with wetland peats and irrigated soils. Pump Plant D DOM exhibited more labile characteristics, which could, for instance, indicate contributions from algal or microbial exudates. The samples from the Klamath River also had more microbial or algal derived material, as indicated by PCA analysis of the optical properties. Most sites, except Pump Plant D, showed a linear relation between fluorescent dissolved organic matter (fDOM) and DOC concentration, indicating these measurements are highly correlated (R2 = 0.84), and thus a continuous fDOM probe could be used to estimate DOC loads from these sites.
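
    The fDOM-DOC relation described above is an ordinary least-squares fit; once its slope and intercept are known, a continuous fDOM sensor record can be converted to estimated DOC concentrations for load computation. A minimal sketch with synthetic paired measurements (not the study's data):

    ```python
    # Ordinary least-squares fit of DOC against fDOM, the relation the abstract
    # reports as highly correlated (R^2 = 0.84). Paired values below are
    # synthetic placeholders, not measurements from the study.
    import numpy as np

    fdom = np.array([18.0, 25.0, 33.0, 41.0, 55.0, 62.0, 70.0])  # sensor units
    doc  = np.array([ 3.1,  4.0,  4.9,  6.2,  7.8,  8.5,  9.9])  # mg/L

    slope, intercept = np.polyfit(fdom, doc, 1)
    predicted = slope * fdom + intercept
    r_squared = 1 - np.sum((doc - predicted) ** 2) / np.sum((doc - doc.mean()) ** 2)

    print(f"DOC ~= {slope:.3f} * fDOM + {intercept:.2f}  (R^2 = {r_squared:.2f})")
    # A continuous fDOM record could then be pushed through the same equation
    # to estimate a DOC time series.
    ```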

  23. Distribution, Health, and Development of Larval and Juvenile Lost River and Shortnose Suckers in the Williamson River Delta Restoration Project and Upper Klamath Lake, Oregon: 2008 Annual Data Summary

    Science.gov (United States)

    Burdick, Summer M.; Ottinger, Christopher; Brown, Daniel T.; VanderKooi, Scott P.; Robertson, Laura; Iwanowicz, Deborah

    2009-01-01

    Federally endangered Lost River sucker Deltistes luxatus and shortnose sucker Chasmistes brevirostris were once abundant throughout their range but populations have declined; they have been extirpated from several lakes, and may no longer reproduce in others. Poor recruitment into the adult spawning populations is one of several reasons cited for the decline and lack of recovery of these species, and may be the consequence of high mortality during juvenile life stages. High larval and juvenile sucker mortality may be exacerbated by an insufficient quantity of suitable rearing habitat. Within Upper Klamath Lake, a lack of marshes also may allow larval suckers to be swept from suitable rearing areas downstream into the seasonally anoxic waters of the Keno Reservoir. The Nature Conservancy (TNC) flooded about 3,600 acres to the north of the Williamson River mouth (Tulana Unit) in October 2007, and about 1,400 acres to the south and east of the Williamson River mouth (Goose Bay Unit) a year later, to retain larval suckers in Upper Klamath Lake, create nursery habitat for suckers, and improve water quality. In collaboration with TNC, the Bureau of Reclamation, and Oregon State University, we began a long-term collaborative research and monitoring program in 2008 to assess the effects of the Williamson River Delta restoration on the early life-history stages of Lost River and shortnose suckers. Our approach includes two equally important aspects. One component is to describe habitat use and colonization processes by larval and juvenile suckers and non-sucker fish species. The second is to evaluate the effects of the restored habitat on the health and condition of juvenile suckers. This report contains a summary of the first year of data collected as a part of this monitoring effort.

  24. Hellsgate Big Game Winter Range Wildlife Mitigation Project: Annual Report 2008.

    Energy Technology Data Exchange (ETDEWEB)

    Whitney, Richard P.; Berger, Matthew T.; Rushing, Samuel; Peone, Cory

    2009-01-01

    The Hellsgate Big Game Winter Range Wildlife Mitigation Project (Hellsgate Project) was proposed by the Confederated Tribes of the Colville Reservation (CTCR) as partial mitigation for hydropower's share of the wildlife losses resulting from Chief Joseph and Grand Coulee Dams. At present, the Hellsgate Project protects and manages 57,418 acres (approximately 90 square miles) for the biological requirements of managed wildlife species; most are located on or near the Columbia River (Lake Rufus Woods and Lake Roosevelt) and surrounded by Tribal land. To date we have acquired about 34,597 habitat units (HUs) toward a total of 35,819 HUs lost from original inundation due to hydropower development. In addition to the remaining 1,237 HUs left unmitigated, 600 HUs from the Washington Department of Fish and Wildlife that were traded to the Colville Tribes and 10 secure nesting islands are also yet to be mitigated. This annual report for 2008 describes the management activities of the Hellsgate Big Game Winter Range Wildlife Mitigation Project (Hellsgate Project) during the past year.

  25. Hydraulic survey and scour assessment of Bridge 524, Tanana River at Big Delta, Alaska

    Science.gov (United States)

    Heinrichs, Thomas A.; Langley, Dustin E.; Burrows, Robert L.; Conaway, Jeffrey S.

    2007-01-01

    Bathymetric and hydraulic data were collected August 26–28, 1996, on the Tanana River at Big Delta, Alaska, at the Richardson Highway bridge and Trans-Alaska Pipeline crossing. Erosion along the right (north) bank of the river between the bridge and the pipeline crossing prompted the data collection. A water-surface profile hydraulic model for the 100- and 500-year recurrence-interval floods was developed using surveyed information. The Delta River enters the Tanana immediately downstream of the highway bridge, causing backwater that extends upstream of the bridge. Four scenarios were considered to simulate the influence of the backwater on flow through the bridge. Contraction and pier scour were computed from model results. Computed values of pier scour were large, but the scour during a flood may actually be less because of mitigating factors. No bank erosion was observed at the time of the survey, a low-flow period. Erosion is likely to occur during intermediate or high flows, but the actual erosion processes are unknown at this time.

  26. The ordered network structure and its prediction for the big floods of the Changjiang River Basins

    Energy Technology Data Exchange (ETDEWEB)

    Men, Ke-Pei; Zhao, Kai; Zhu, Shu-Dan [Nanjing Univ. of Information Science and Technology, Nanjing (China). College of Mathematics and Statistics

    2013-12-15

    According to the latest hydrological statistics, a total of 21 floods took place over the Changjiang (Yangtze) River Basins from 1827 to 2012 and showed an obvious commensurable orderliness. Guided by the information forecasting theory of Wen-Bo Weng and building on previous research results, we combine ordered analysis with complex network techniques to summarize the ordered network structure of the Changjiang floods, supplement it with new information, further optimize the networks, construct 2D- and 3D-ordered network structures, and carry out prediction research. Predictions show that future big deluges will probably occur over the Changjiang River Basin around 2013-2014, 2020-2021, 2030, 2036, 2051, and 2058. (orig.)

  27. The ordered network structure and its prediction for the big floods of the Changjiang River Basins

    International Nuclear Information System (INIS)

    Men, Ke-Pei; Zhao, Kai; Zhu, Shu-Dan

    2013-01-01

    According to the latest hydrological statistics, a total of 21 floods took place over the Changjiang (Yangtze) River Basins from 1827 to 2012 and showed an obvious commensurable orderliness. Guided by the information forecasting theory of Wen-Bo Weng and building on previous research results, we combine ordered analysis with complex network techniques to summarize the ordered network structure of the Changjiang floods, supplement it with new information, further optimize the networks, construct 2D- and 3D-ordered network structures, and carry out prediction research. Predictions show that future big deluges will probably occur over the Changjiang River Basin around 2013-2014, 2020-2021, 2030, 2036, 2051, and 2058. (orig.)

  28. Lignite zone as an indicator to lost circulation belt: a case study of ...

    African Journals Online (AJOL)

    Eighteen (18) water boreholes were studied for lost circulation. When the locations of the boreholes associated with lost circulation were plotted on the map of Anambra State, a lost circulation belt was observed around the River Niger – Onitsha – Oba – Nnewi axis. Lost circulation intervals range between 20-50m and 75-90m ...

  29. Biological zonation of the last unbound big river in the West Carpathians: reference scheme based on caddisfly communities

    Directory of Open Access Journals (Sweden)

    Čiliak M.

    2014-01-01

    A thorough understanding of the distribution of biotic communities in a pre-disturbance state is essential for predicting their future changes related to human activities. In this regard, pre-damming data on the spatial distribution of benthic communities are highly valuable. Caddisflies were sampled at 14 sites on the Hron River and analysed in order to establish the longitudinal zonation of the river and to determine the environmental factors affecting assemblage distribution along the longitudinal profile. A total of 2,600 individuals of caddisflies belonging to 40 taxa in 12 families were recorded. Diversity of caddisflies was found to be higher in the upper (rhithral) part of the river. A major change, with a shift to much more uniform caddisfly assemblages, occurred in the middle part of the river. Four zones (subzones) were distinguished using caddisfly communities: epirhithral, metarhithral, hyporhithral and epipotamal. Canonical correspondence analysis demonstrated the determining influence of altitude and conductivity on the caddisflies. Pre-damming zonation patterns presented here could serve as basic information for management of the Hron River as well as a reference scheme for other, previously dammed big rivers in the West Carpathian region.

  30. Mineralogical correlation of surficial sediment from area drainages with selected sedimentary interbeds at the Idaho National Engineering Laboratory, Idaho

    Energy Technology Data Exchange (ETDEWEB)

    Bartholomay, R.C.

    1990-08-01

    Ongoing research by the US Geological Survey at the INEL involves investigation of the migration of radioactive elements contained in low-level radioactive waste, hydrologic and geologic factors affecting waste movement, and geochemical factors that influence the chemical composition of the waste. Identification of the mineralogy of the Snake River Plain is needed to aid in the study of the hydrology and geochemistry of subsurface waste disposal. The US Geological Survey's project office at the Idaho National Engineering Laboratory, in cooperation with the US Department of Energy, used mineralogical data to correlate surficial sediment samples from the Big Lost River, Little Lost River, and Birch Creek drainages with selected sedimentary interbed core samples taken from test holes at the RWMC (Radioactive Waste Management Complex), TRA (Test Reactors Area), ICPP (Idaho Chemical Processing Plant), and TAN (Test Area North). Correlating the mineralogy of a particular present-day drainage area with a particular sedimentary interbed provides information on the historical source of sediment for interbeds in and near the INEL. Mineralogical data indicate that surficial sediment samples from the Big Lost River drainage contained a larger amount of feldspar and pyroxene and a smaller amount of calcite and dolomite than samples from the Little Lost River and Birch Creek drainages. Mineralogical data from sedimentary interbeds at the RWMC, TRA, and ICPP correlate with surficial sediment of the present-day Big Lost River drainage. Mineralogical data from a sedimentary interbed at TAN correlate with surficial sediment of the present-day Birch Creek drainage. 13 refs., 5 figs., 3 tabs.
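
    The correlation described above amounts to comparing the mineral-abundance signature of an interbed sample with the signatures of the candidate drainages and picking the closest match. A toy nearest-signature sketch with invented mineral percentages (the report's actual data and methods differ):

    ```python
    # Toy version of the mineralogical correlation above: assign an interbed
    # sample to the surficial-drainage signature it most resembles, using
    # Euclidean distance between mineral-abundance vectors. All percentages
    # are invented for illustration.
    import math

    # (feldspar, pyroxene, calcite, dolomite) in weight percent -- illustrative
    drainage_signatures = {
        "Big Lost River":    (35.0, 12.0, 10.0,  8.0),  # more feldspar/pyroxene
        "Little Lost River": (20.0,  5.0, 25.0, 20.0),  # more carbonate
        "Birch Creek":       (15.0,  4.0, 30.0, 25.0),
    }

    def closest_drainage(sample):
        return min(drainage_signatures,
                   key=lambda name: math.dist(drainage_signatures[name], sample))

    interbed_sample = (33.0, 11.0, 12.0, 9.0)  # hypothetical RWMC interbed
    print(closest_drainage(interbed_sample))   # -> Big Lost River
    ```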

  31. Mineralogical correlation of surficial sediment from area drainages with selected sedimentary interbeds at the Idaho National Engineering Laboratory, Idaho

    International Nuclear Information System (INIS)

    Bartholomay, R.C.

    1990-08-01

    Ongoing research by the US Geological Survey at the INEL involves investigation of the migration of radioactive elements contained in low-level radioactive waste, hydrologic and geologic factors affecting waste movement, and geochemical factors that influence the chemical composition of the waste. Identification of the mineralogy of the Snake River Plain is needed to aid in the study of the hydrology and geochemistry of subsurface waste disposal. The US Geological Survey's project office at the Idaho National Engineering Laboratory, in cooperation with the US Department of Energy, used mineralogical data to correlate surficial sediment samples from the Big Lost River, Little Lost River, and Birch Creek drainages with selected sedimentary interbed core samples taken from test holes at the RWMC (Radioactive Waste Management Complex), TRA (Test Reactors Area), ICPP (Idaho Chemical Processing Plant), and TAN (Test Area North). Correlating the mineralogy of a particular present-day drainage area with a particular sedimentary interbed provides information on the historical source of sediment for interbeds in and near the INEL. Mineralogical data indicate that surficial sediment samples from the Big Lost River drainage contained a larger amount of feldspar and pyroxene and a smaller amount of calcite and dolomite than samples from the Little Lost River and Birch Creek drainages. Mineralogical data from sedimentary interbeds at the RWMC, TRA, and ICPP correlate with surficial sediment of the present-day Big Lost River drainage. Mineralogical data from a sedimentary interbed at TAN correlate with surficial sediment of the present-day Birch Creek drainage. 13 refs., 5 figs., 3 tabs.

  32. Diversitas Dan Hilangnya Jenis-jenis Ikan Disungai Ciliwung Dan Sungai Cisadane [Study of Fish Diversity and the Loss of Fish Species of the Ciliwung and Cisadane Rivers]

    OpenAIRE

    Hadiaty, Renny Kurnia

    2011-01-01

    Fish research in Indonesian waters began in the 16th century. Most of the early research collected fish around Batavia. Many new species were described, and the type specimens were deposited in museums in Europe or America. The study of fish diversity and the loss of fish species was conducted in the Ciliwung and Cisadane Rivers in 2009. The aim of this study is to describe the recent fish diversity in both river drainages and then make a comparison with the number of species recorded based on ...

  33. Use of geochemical tracers for estimating groundwater influxes to the Big Sioux River, eastern South Dakota, USA

    Science.gov (United States)

    Neupane, Ram P.; Mehan, Sushant; Kumar, Sandeep

    2017-09-01

    Understanding the spatial distribution and variability of geochemical tracers is crucial for estimating groundwater influxes into a river and can contribute to better future water management strategies. Because of the much higher radon (222Rn) activities in groundwater compared to river water, 222Rn was used as the main tracer to estimate groundwater influxes to river discharge over a 323-km distance of the Big Sioux River, eastern South Dakota, USA; these influx estimates were compared to the estimates using Cl- concentrations. In the reaches overall, groundwater influxes using the 222Rn activity approach ranged between 0.3 and 6.4 m3/m/day (mean 1.8 m3/m/day) and the cumulative groundwater influx estimated during the study period was 3,982-146,594 m3/day (mean 40,568 m3/day), accounting for 0.2-41.9% (mean 12.5%) of the total river flow rate. The mean groundwater influx derived using the 222Rn activity approach was lower than that calculated based on Cl- concentration (35.6 m3/m/day) for most of the reaches. Based on the Cl- approach, groundwater accounted for 37.3% of the total river flow rate. The difference between the method estimates may be associated with minimal differences between groundwater and river Cl- concentrations. These assessments will provide a better understanding of estimates used for the allocation of water resources to sustain agricultural productivity in the basin. However, a more detailed sampling program is necessary for accurate influx estimation, and also to understand the influence of seasonal variation on groundwater influxes into the basin.
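
    Both tracer approaches above rest on a reach-scale mass balance: the downstream rise in tracer concentration is attributed to groundwater entering with a much higher concentration. A minimal two-endmember sketch that ignores radon decay and degassing (which a full 222Rn budget would include), with made-up numbers rather than values from the study:

    ```python
    # Simplified steady-state tracer mass balance for one river reach.
    # Downstream load = upstream load + groundwater inflow x groundwater
    # concentration, with flow continuity Q_down = Q_up + inflow. Radon decay
    # and degassing are ignored, so this is a teaching sketch only.
    # All numbers are hypothetical placeholders.

    def groundwater_influx_per_m(q_up_m3_d: float,   # upstream discharge, m3/day
                                 c_up: float,        # tracer conc. upstream
                                 c_down: float,      # tracer conc. downstream
                                 c_gw: float,        # tracer conc. in groundwater
                                 reach_length_m: float) -> float:
        """Groundwater influx in m3 per metre of channel per day."""
        inflow_total = q_up_m3_d * (c_down - c_up) / (c_gw - c_down)
        return inflow_total / reach_length_m

    # Hypothetical 10-km reach: 200,000 m3/day upstream flow, in-stream radon
    # rising from 300 to 500 Bq/m3 against a 12,000 Bq/m3 groundwater end-member.
    influx = groundwater_influx_per_m(200_000, 300.0, 500.0, 12_000.0, 10_000.0)
    print(f"estimated groundwater influx: {influx:.2f} m3/m/day")
    ```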

  34. Thinking big: linking rivers to landscapes

    Science.gov (United States)

    Joan O’Callaghan; Ashley E. Steel; Kelly M. Burnett

    2012-01-01

    Exploring relationships between landscape characteristics and rivers is an emerging field, enabled by the proliferation of satellite data, advances in statistical analysis, and increased emphasis on large-scale monitoring. Landscape features such as road networks, underlying geology, and human development determine the characteristics of the rivers flowing through...

  35. Digging for Lost Rivers in Thailand: Locating and Dating Paleochannels in the Chiang Mai Intermontane Basin

    Science.gov (United States)

    Teo, Elisha A.; Ziegler, Alan D.; Wasson, Robert J.; Morthekai, Paulramasamy

    2017-04-01

    The drainage of the Chiang Mai basin has a dynamic but largely forgotten history. In the late 1980s, an ancient lost city was excavated near the Ping River in Chiang Mai, Thailand. Archaeologists had unearthed Wiang Kum Kam, the former royal capital of the Lanna Civilisation founded in 1286 CE. Former investigations revealed that flood sediments buried the capital and remnants of an abandoned river channel were discovered beneath the surface. This concurs with historical descriptions of the Ping River being on the eastern bank of the capital, despite being presently located on the western bank. The paleochannel drained 500 years ago after diverting west of the ancient city. This switch, an avulsion, coincided with a large flood, which could have triggered and/or caused the avulsion. Local oral histories also recount other Ping avulsions across the basin, but these were not documented. Some of these paleochannels residually remain as unusually sinuous irrigation canals, with historically suggestive names such as the Old Ping and the Small Ping Rivers. Here, the geomorphological evolution of the Ping River is investigated, as a future avulsion in this extensively populated area would be catastrophic. Evidence shows that the drainage of the Chiang Mai basin evolved from a braided system, to an avulsing anastomosing system, to a primarily single channel system. Two-dimensional electrical resistivity tomography and augering detected a large continuous body of fluvial sand 4 m below the surface, across the 10 km distance between the Ping and Kuang Rivers. This sand continues to the depth of at least 30 m and is typical of a braided system. Further augering along paleochannels revealed buried levees that protrude from the braided river deposits to near the surface, separated by fine floodplain sediments. This may have formed as the braided system evolved into an anastomosing system, where distinct channels stabilised and floodplain deposits could develop between channels

  36. Juvenile Lost River and shortnose sucker year class strength, survival, and growth in Upper Klamath Lake, Oregon, and Clear Lake Reservoir, California—2016 Monitoring Report

    Science.gov (United States)

    Burdick, Summer M.; Ostberg, Carl O.; Hoy, Marshal S.

    2018-04-20

    Executive Summary: The largest populations of federally endangered Lost River (Deltistes luxatus) and shortnose suckers (Chasmistes brevirostris) exist in Upper Klamath Lake, Oregon, and Clear Lake Reservoir, California. Upper Klamath Lake populations are decreasing because adult mortality, which is relatively low, is not being balanced by recruitment of young adult suckers into known spawning aggregations. Most Upper Klamath Lake juvenile sucker mortality appears to occur within the first year of life. Annual production of juvenile suckers in Clear Lake Reservoir appears to be highly variable and may not occur at all in very dry years. However, juvenile sucker survival is much higher in Clear Lake, with non-trivial numbers of suckers surviving to join spawning aggregations. Long-term monitoring of juvenile sucker populations is needed to (1) determine whether there are annual and species-specific differences in production, survival, and growth, (2) identify the season (summer or winter) in which most mortality occurs, and (3) help identify potential causes of high juvenile sucker mortality, particularly in Upper Klamath Lake. We initiated an annual juvenile sucker monitoring program in 2015 to track cohorts in three months (June, August, and September) each year in Upper Klamath Lake and Clear Lake Reservoir. We tracked annual variability in age-0 sucker apparent production, juvenile sucker apparent survival, and apparent growth. Using genetic markers, we were able to classify suckers as one of three taxa: shortnose or Klamath largescale suckers, Lost River suckers, or suckers with genetic markers of both species (Intermediate Prob[LRS]). Using catch data, we generated taxa-specific indices of year class strength, August–September apparent survival, and overwinter apparent survival. We also examined the prevalence and severity of afflictions such as parasites, wounds, and deformities. Indices of year class strength in Upper Klamath Lake were similar for shortnose suckers in 2015

  37. Virginia big-eared bats (Corynorhinus townsendii virginianus) roosting in abandoned coal mines in West Virginia

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, J.B.; Edwards, J.W.; Wood, P.B. [West Virginia University, Morgantown, WV (US). Wildlife & Fisheries Resources Programme

    2005-07-01

    We surveyed bats at 36 abandoned coal mines during summer 2002 and 47 mines during fall 2002 at New River Gorge National River and Gauley River National Recreation Area, WV. During summer, we captured three federally endangered Virginia big-eared bats at two mine entrances, and 25 were captured at 12 mine entrances during fall. These represent the first documented captures of this species at coal mines in West Virginia. Future survey efforts conducted throughout the range of the Virginia big-eared bat should include abandoned coal mines.

  38. Colonial waterbird predation on Lost River and shortnose suckers based on recoveries of passive integrated transponder tags

    Science.gov (United States)

    Evans, Allen; Payton, Quinn; Cramer, Bradley D.; Collis, Ken; Hewitt, David A.; Roby, Daniel D.

    2015-01-01

    We evaluated predation on Lost River suckers (Deltistes luxatus) and shortnose suckers (Chasmistes brevirostris), both listed under the Endangered Species Act (ESA), from American white pelicans (Pelecanus erythrorhynchos) and double-crested cormorants (Phalacrocorax auritus) nesting at mixed species colonies on Clear Lake Reservoir, CA and Upper Klamath Lake, OR during 2009-2014. Predation was evaluated by recovering passive integrated transponder (PIT) tags that were implanted in suckers, subsequently consumed by pelicans or cormorants, and deposited on the birds’ nesting colonies. Data from PIT tag recoveries were used to estimate predation rates (proportion of available tagged suckers consumed) by birds to evaluate the relative susceptibility of suckers to avian predation in Upper Klamath Basin. Data on the size of pelican and cormorant colonies (number of breeding adults) at Clear Lake and Upper Klamath Lake were also collected and reported in the context of predation on suckers.

  39. Age dating ground water by use of chlorofluorocarbons (CCl3F and CCl2F2), and distribution of chlorofluorocarbons in the unsaturated zone, Snake River Plain aquifer, Idaho National Engineering Laboratory, Idaho

    International Nuclear Information System (INIS)

    Busenberg, E.; Weeks, E.P.; Plummer, L.N.; Bartholomay, R.C.

    1993-04-01

    Detectable concentrations of chlorofluorocarbons (CFC's) were observed in ground water and unsaturated-zone air at the Idaho National Engineering Laboratory (INEL) and vicinity. The recharge ages of waters were determined to be from 4 to more than 50 years on the basis of CFC concentrations and other environmental data; most ground waters have ages of 14 to 30 years. These results indicate that young ground water was added at various locations to the older regional ground water (greater than 50 years) within and outside the INEL boundaries. The wells drilled into the Snake River Plain aquifer at INEL sampled mainly this local recharge. The Big Lost River, Birch Creek, the Little Lost River, and the Mud Lake-Terreton area appear to be major sources of recharge of the Snake River Plain aquifer at INEL. An average recharge temperature of 9.7±1.3 degrees C (degrees Celsius) was calculated from dissolved nitrogen and argon concentrations in the ground waters, a temperature that is similar to the mean annual soil temperature of 9 degrees C measured at INEL. This similarity indicates that the aquifer was recharged at INEL and not at higher elevations that would have cooler soil temperatures than INEL. Soil-gas concentrations at Test Area North (TAN) are explained by diffusion theory
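
    CFC age dating converts a measured dissolved CFC concentration to its equivalent atmospheric mixing ratio (via the solubility at the recharge temperature, here about 9.7 degrees C) and then finds the year the atmosphere last had that mixing ratio; the difference from the sampling year is the apparent recharge age. A minimal sketch of the final lookup step only, using a coarse, illustrative approximation of the atmospheric CFC-12 history rather than the calibrated curve the authors used, and a hypothetical sample:

    ```python
    # Sketch of the last step of CFC age dating: given a sample's equivalent
    # atmospheric CFC-12 mixing ratio (already derived from the dissolved
    # concentration and solubility at the recharge temperature), interpolate
    # the year the atmosphere last matched it. The history below is a rough,
    # illustrative approximation, not the calibrated input curve.
    import numpy as np

    years      = np.array([1950, 1960, 1970, 1980, 1990, 1993])
    cfc12_pptv = np.array([10.0, 30.0, 120.0, 300.0, 480.0, 510.0])  # approximate

    def apparent_recharge_year(sample_pptv: float) -> float:
        """Interpolate the year whose mixing ratio matches the sample."""
        return float(np.interp(sample_pptv, cfc12_pptv, years))

    sampling_year = 1993     # assumed for illustration
    sample_pptv = 250.0      # hypothetical equivalent atmospheric mixing ratio
    age = sampling_year - apparent_recharge_year(sample_pptv)
    print(f"apparent recharge age: about {age:.0f} years")
    ```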

  40. The Importance of Hunting and Hunting Areas for Big and Small Game (Food) for the Tourism Development in the Crna River Basin in the Republic of Macedonia

    OpenAIRE

    Koteski, Cane; Josheski, Dushko; Jakovlev, Zlatko; Bardarova, Snezana; Serafimova, Mimoza

    2014-01-01

    The Crna River is a river in the Republic of Macedonia and a right tributary of the Vardar. Its source is in the mountains of Western Macedonia, west of Krusevo. It flows through the village of Sopotnica and southwards through the plains east of Bitola. The name means "black river" in Macedonian, which is a translation of its former Thracian name. The purpose of this paper is to show the hunting and hunting areas for big and small game (food), the structure of the areas of certain hunting, fi...

  1. Characteristics and origin of Earth-mounds on the Eastern Snake River Plain, Idaho

    International Nuclear Information System (INIS)

    Tullis, J.A.

    1995-09-01

    Earth-mounds are common features on the Eastern Snake River Plain, Idaho. The mounds are typically round or oval in plan view, <0.5 m in height, and from 8 to 14 m in diameter. They are found on flat and sloped surfaces, and appear less frequently in lowland areas. The mounds have formed on deposits of multiple sedimentary environments. Those studied included alluvial gravel terraces along the Big Lost River (late Pleistocene/early Holocene age), alluvial fan segments on the flanks of the Lost River Range (Bull Lake and Pinedale age equivalents), and loess/slopewash sediments overlying basalt flows. Backhoe trenches were dug to allow characterization of stratigraphy and soil development. Each mound has features unique to the depositional and pedogenic history of the site; however, there are common elements to all mounds that are linked to the history of mound formation. Each mound has a "floor" of a sediment or basement rock of significantly different hydraulic conductivity than the overlying sediment. These paleosurfaces are overlain by finer-grained sediments, typically loess or flood-overbank deposits. Mounds formed in environments where a sufficient thickness of fine-grained sediment held pore water in a system open to the migration to a freezing front. Heaving of the sediment occurred by the growth of ice lenses. Mound formation occurred at the end of the Late Pleistocene or early in the Holocene, and was followed by pedogenesis. Soils in the mounds were subsequently altered by bioturbation, buried by eolian deposition, and eroded by slopewash runoff. These secondary processes played a significant role in maintaining or increasing the mound/intermound relief.

  2. Characteristics and origin of Earth-mounds on the Eastern Snake River Plain, Idaho

    Energy Technology Data Exchange (ETDEWEB)

    Tullis, J.A.

    1995-09-01

    Earth-mounds are common features on the Eastern Snake River Plain, Idaho. The mounds are typically round or oval in plan view, <0.5 m in height, and from 8 to 14 m in diameter. They are found on flat and sloped surfaces, and appear less frequently in lowland areas. The mounds have formed on deposits of multiple sedimentary environments. Those studied included alluvial gravel terraces along the Big Lost River (late Pleistocene/early Holocene age), alluvial fan segments on the flanks of the Lost River Range (Bull Lake and Pinedale age equivalents), and loess/slopewash sediments overlying basalt flows. Backhoe trenches were dug to allow characterization of stratigraphy and soil development. Each mound has features unique to the depositional and pedogenic history of the site; however, there are common elements to all mounds that are linked to the history of mound formation. Each mound has a "floor" of a sediment or basement rock of significantly different hydraulic conductivity than the overlying sediment. These paleosurfaces are overlain by finer-grained sediments, typically loess or flood-overbank deposits. Mounds formed in environments where a sufficient thickness of fine-grained sediment held pore water in a system open to the migration to a freezing front. Heaving of the sediment occurred by the growth of ice lenses. Mound formation occurred at the end of the Late Pleistocene or early in the Holocene, and was followed by pedogenesis. Soils in the mounds were subsequently altered by bioturbation, buried by eolian deposition, and eroded by slopewash runoff. These secondary processes played a significant role in maintaining or increasing the mound/intermound relief.

  3. Bat habitat research. Final technical report

    Energy Technology Data Exchange (ETDEWEB)

    Keller, B.L.; Bosworth, W.R.; Doering, R.W.

    1993-12-31

    This progress report describes activities over the current reporting period to characterize the habitats of bats on the INEL. Research tasks are entitled Monitoring bat habitation of caves on the INEL to determine species present, numbers, and seasons of use; Monitor bat use of man-made ponds at the INEL to determine species present and rates of use of these waters; If the Big Lost River is flowing on the INEL and/or if the Big Lost River sinks contain water, determine species present, numbers and seasons of use; Determine the habitat requirement of Townsend's big-eared bats, including the microclimate of caves containing Townsend's big-eared bats as compared to other caves that do not contain bats; Determine and describe an economical and efficient bat census technique to be used periodically by INEL scientists to determine the status of bats on the INEL; and Provide a suggestive management and protective plan for bat species on the INEL that might, in the future, be added to the endangered and sensitive list.

  4. Archeological Investigations at Big Hill Lake, Southeastern Kansas, 1980.

    Science.gov (United States)

    1982-09-01

    settled primarily along the Neosho river and Labette, Big Hill, and Pumpkin creeks. One of the first settlers in Osage township, in which Big Hill...slabs is not known at present. About 10 years later, in 1876, materials were reportedly collected from an aboriginal site along Pumpkin creek...and lengthening its lifetime of use. As would therefore be expected, cracks are present between each of the paired holes on both of the two restored

  5. Mineralogy of selected sedimentary interbeds at or near the Idaho National Engineering Laboratory, Idaho

    International Nuclear Information System (INIS)

    Reed, M.F.; Bartholomay, R.C.

    1994-08-01

    The US Geological Survey's (USGS) Project Office at the Idaho National Engineering Laboratory (INEL) analyzed 66 samples from sedimentary interbed cores during a 38-month period beginning in October 1990 to determine bulk and clay mineralogy. These cores had been collected from 19 sites in the Big Lost River Basin, 2 sites in the Birch Creek Basin, and 1 site in the Mud Lake Basin, and were archived at the USGS lithologic core library at the INEL. Mineralogy data indicate that core samples from the Big Lost River Basin have larger mean and median percentages of quartz, total feldspar, and total clay minerals, but smaller mean and median percentages of calcite than the core samples from the Birch Creek Basin. Core samples from the Mud Lake Basin have abundant quartz, total feldspar, calcite, and total clay minerals. Identification of the mineralogy of the Snake River Plain is needed to aid in the study of the hydrology and geochemistry of subsurface waste disposal

  6. The analysis on the flood property of Weihe River in 2003

    International Nuclear Information System (INIS)

    Liu Longqing; Jiang Xinhui

    2004-01-01

    From the end of August to October 2003, severe rainfall occurred over the Weihe River, the largest tributary of the Yellow River. The rainfall was historically rare for its long duration over the Weihe River valley, and five successive floods formed at the controlling hydrological station, Huaxian station. These floods overflowed the floodplain, broke dykes, and inundated a large area of the lower Weihe River. The disaster forced nearly 200,000 people to leave their homes and caused serious economic losses. The floods were of long duration, high water level, and large volume, which is rare in the historical record. The flood peak at Huaxian station reached 3,570 m³/s, the largest peak since 1992. In recent years, because large floods on the Weihe River were rare, the main channel shrank markedly, flood propagation time lengthened, and the discharge at which flow spills onto the floodplain fell to only 800-1,000 m³/s. The water-producing areas of these floods lay in areas with little sediment production, so the sediment content of the river was low. As a result, the main channel was eroded, the discharge capacity of the river course gradually increased, and the floodplain-overtopping discharge recovered to above 2,000 m³/s. From the analyses of flood components and flood processes, the conclusion is that sediment deposition and rising of the channel bed, shrinkage of the main channel, decrease of the floodplain-overtopping discharge, increase of peak attenuation, and lengthening of propagation time have become common features of rivers in arid and semi-arid districts; these features readily produce serious disasters during big floods, and the flood behavior of such areas should be studied further. (Author)

  7. Spring and Summer Spatial Distribution of Endangered Juvenile Lost River and Shortnose Suckers in Relation to Environmental Variables in Upper Klamath Lake, Oregon: 2007 Annual Report

    Science.gov (United States)

    Burdick, Summer M.; VanderKooi, Scott P.; Anderson, Greer O.

    2009-01-01

    Lost River sucker Deltistes luxatus and shortnose sucker Chasmistes brevirostris were listed as endangered in 1988 for a variety of reasons including apparent recruitment failure. Upper Klamath Lake, Oregon, and its tributaries are considered the most critical remaining habitat for these two species. Age-0 suckers are often abundant in Upper Klamath Lake throughout the summer months, but catches decline dramatically between late August and early September each year, and age-1 and older subadult suckers are rare. These rapid declines in catch rates and a lack of substantial recruitment into adult sucker populations in recent years suggest that sucker populations experience high mortality between their first summer and first spawn. A lack of optimal rearing habitat may exacerbate juvenile sucker mortality or restrict juvenile growth or development. In 2007, we continued research on juvenile sucker habitat use begun by the U.S. Geological Survey (USGS) in 2001. Age-0 catch rates in 2006 were more than an order of magnitude greater than in previous years, which prompted us to refocus our research from age-0 suckers to age-1 sucker distributions and habitat use. We took a two-phased approach to our research in 2007 that included preliminary spring sampling and intense summer sampling components. Spring sampling was a pilot study designed to gather baseline data on the distribution of age-1 suckers as they emerge from winter in shoreline environments throughout Upper Klamath Lake (Chapter 1), whereas summer sampling was designed to quantitatively estimate the influence of environmental variables on age-0 and age-1 sucker distribution throughout Upper Klamath Lake, while accounting for imperfect detection (Chapter 2). In addition to these two components, we began a project to evaluate passive integrated transponder (PIT) tag loss and the effects of PIT tags on mortality of age-1 Lost River suckers (Chapter 3). The spring pilot study built the foundation for future research

  8. Lost lake - restoration of a Carolina bay

    Energy Technology Data Exchange (ETDEWEB)

    Hanlin, H.G.; McLendon, J.P. [Univ. of South Carolina, Aiken, SC (United States). Dept. of Biology and Geology; Wike, L.D. [Univ. of South Carolina, Aiken, SC (United States). Dept. of Biology and Geology]|[Westinghouse Savannah River Co., Aiken, SC (United States). Savannah River Technology Center; Dietsch, B.M. [Univ. of South Carolina, Aiken, SC (United States). Dept. of Biology and Geology]|[Univ. of Georgia, Aiken, SC (United States)

    1994-09-01

    Carolina bays are shallow wetland depressions found only on the Atlantic Coastal Plain. Although these isolated interstream wetlands support many types of communities, they share the common features of having a sandy margin, a fluctuating water level, an elliptical shape, and a northwest to southeast orientation. Lost Lake, an 11.3 hectare Carolina bay, was ditched and drained for agricultural production before establishment of the Savannah River Site in 1950. Later it received overflow from a seepage basin containing a variety of chemicals, primarily solvents and some heavy metals. In 1990 a plan was developed for the restoration of Lost Lake, and restoration activities were complete by mid-1991. Lost Lake is the first known project designed for the restoration and recovery of a Carolina bay. The bay was divided into eight soil treatment zones, allowing four treatments in duplicate. Each of the eight zones was planted with eight species of native wetland plants. Recolonization of the bay by amphibians and reptiles is being evaluated by using drift fences with pitfall traps and coverboard arrays in each of the treatment zones. Additional drift fences in five upland habitats were also established. Hoop turtle traps, funnel minnow traps, and dip nets were utilized for aquatic sampling. The presence of 43 species common to the region has been documented at Lost Lake. More than one-third of these species show evidence of breeding populations being established. Three species found prior to the restoration activity and a number of species common to undisturbed Carolina bays were not encountered. Colonization by additional species is anticipated as the wetland undergoes further succession.

  9. Restoration of Lost Lake, recovery of an impacted Carolina Bay

    International Nuclear Information System (INIS)

    Wike, L.D.; Gladden, J.B.; Mackey, H.E. Jr.; Rogers, V.A.

    1995-01-01

    Lost Lake is one of approximately 200 Carolina bays found on the Savannah River Site (SRS). Until 1984 Lost Lake was contaminated by heavy metals and solvents overflowing from a nearby settling basin. Up to 12 inches of surface soil and all vegetation was removed from the bay as part of a RCRA removal action. A plan for restoration was initiated in 1989 and implemented in 1990 and 1991. Extensive planning led to defined objectives, strategies, treatments, and monitoring programs allowing successful restoration of Lost Lake. The primary goal of the project was to restore the wetland ecosystem after a hazardous waste clean up operation. An additional goal was to study the progress of the project and the success of the restoration activity. Several strategy considerations were necessary in the restoration plan. The removal of existing organic soils had to have compensation, a treatment scheme for planting and the extent of manipulation of the substrate had to be considered, monitoring decisions had to be made, and the decision whether or not to actively control the hydrology of the restored system

  10. Salmonid Gamete Preservation in the Snake River Basin, 2001 Annual Report.

    Energy Technology Data Exchange (ETDEWEB)

    Armstrong, Robyn; Kucera, Paul

    2002-06-01

    Steelhead (Oncorhynchus mykiss) and chinook salmon (Oncorhynchus tshawytscha) populations in the Northwest are decreasing. Genetic diversity is being lost at an alarming rate. Along with reduced population and genetic variability, the loss of biodiversity means a diminished environmental adaptability. The Nez Perce Tribe (Tribe) strives to ensure availability of genetic samples of the existing male salmonid population by establishing and maintaining a germplasm repository. The sampling strategy, initiated in 1992, has been to collect and preserve male salmon and steelhead genetic diversity across the geographic landscape by sampling within the major river subbasins in the Snake River basin, assuming a metapopulation structure existed historically. Gamete cryopreservation conserves genetic diversity in a germplasm repository, but is not a recovery action for listed fish species. The Tribe was funded in 2001 by the Bonneville Power Administration (BPA) and the U.S. Fish and Wildlife Service Lower Snake River Compensation Plan (LSRCP) to coordinate gene banking of male gametes from Endangered Species Act (ESA) listed steelhead and spring and summer chinook salmon in the Snake River basin. In 2001, a total of 398 viable chinook salmon semen samples from the Lostine River, Catherine Creek, upper Grande Ronde River, Lookingglass Hatchery (Imnaha River stock), Lake Creek, the South Fork Salmon River weir, Johnson Creek, Big Creek, Capehorn Creek, Marsh Creek, Pahsimeroi Hatchery, and Sawtooth Hatchery (upper Salmon River stock) were cryopreserved. Also, 295 samples of male steelhead gametes from Dworshak Hatchery, Fish Creek, Grande Ronde River, Little Sheep Creek, Pahsimeroi Hatchery and Oxbow Hatchery were also cryopreserved. The Grande Ronde chinook salmon captive broodstock program stores 680 cryopreserved samples at the University of Idaho as a long-term archive, half of the total samples. A total of 3,206 cryopreserved samples from Snake River basin steelhead and

  11. Stream seepage and groundwater levels, Wood River Valley, south-central Idaho, 2012-13

    Science.gov (United States)

    Bartolino, James R.

    2014-01-01

    Stream discharge and water levels in wells were measured at multiple sites in the Wood River Valley, south-central Idaho, in August 2012, October 2012, and March 2013, as a component of data collection for a groundwater-flow model of the Wood River Valley aquifer system. This model is a cooperative and collaborative effort between the U.S. Geological Survey and the Idaho Department of Water Resources. Stream-discharge measurements for determination of seepage were made during several days on three occasions: August 27–28, 2012, October 22–24, 2012, and March 27–28, 2013. Discharge measurements were made at 49 sites in August and October, and 51 sites in March, on the Big Wood River, Silver Creek, their tributaries, and nearby canals. The Big Wood River generally gains flow between the Big Wood River near Ketchum streamgage (13135500) and the Big Wood River at Hailey streamgage (13139510), and loses flow between the Hailey streamgage and the Big Wood River at Stanton Crossing near Bellevue streamgage (13140800). Shorter reaches within these segments may differ in the direction or magnitude of seepage or may be indeterminate because of measurement uncertainty. Additional reaches were measured on Silver Creek, the North Fork Big Wood River, Warm Springs Creek, Trail Creek, and the East Fork Big Wood River. Discharge measurements also were made on the Hiawatha, Cove, District 45, Glendale, and Bypass Canals, and smaller tributaries to the Big Wood River and Silver Creek. Water levels in 93 wells completed in the Wood River Valley aquifer system were measured during October 22–24, 2012; these wells are part of a network established by the U.S. Geological Survey in 2006. Maps of the October 2012 water-table altitude in the unconfined aquifer and the potentiometric-surface altitude of the confined aquifer have similar topology to those on maps of October 2006 conditions. Between October 2006 and October 2012, water-table altitude in the unconfined aquifer rose by
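
    The gain/loss determinations described above rest on a simple water balance for each reach: seepage equals downstream discharge minus upstream discharge, minus tributary inflows, plus diversions, judged against the combined measurement uncertainty. The sketch below is illustrative only; the discharges and the 5 percent measurement uncertainty are hypothetical values, not results from the Wood River Valley measurements.

```python
# Sketch of a reach seepage (gain/loss) calculation from serial discharge
# measurements, in the spirit of a seepage run. All discharges (cfs) and
# the 5% measurement uncertainty are hypothetical.

def reach_seepage(q_upstream, q_downstream, tributaries=0.0, diversions=0.0,
                  rel_uncertainty=0.05):
    """Return (seepage, uncertainty). Positive seepage = gaining reach."""
    seepage = q_downstream - q_upstream - tributaries + diversions
    # Combine the two main measurement uncertainties in quadrature (simplified).
    uncertainty = rel_uncertainty * (q_upstream**2 + q_downstream**2) ** 0.5
    return seepage, uncertainty

gain, err = reach_seepage(q_upstream=210.0, q_downstream=245.0,
                          tributaries=12.0, diversions=6.0)
verdict = "gaining" if gain > err else "losing" if gain < -err else "indeterminate"
print(f"seepage = {gain:+.1f} ± {err:.1f} cfs -> {verdict}")
```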

  12. Big data analytics turning big data into big money

    CERN Document Server

    Ohlhorst, Frank J

    2012-01-01

    Unique insights to implement big data analytics and reap big returns to your bottom line Focusing on the business and financial value of big data analytics, respected technology journalist Frank J. Ohlhorst shares his insights on the newly emerging field of big data analytics in Big Data Analytics. This breakthrough book demonstrates the importance of analytics, defines the processes, highlights the tangible and intangible values and discusses how you can turn a business liability into actionable material that can be used to redefine markets, improve profits and identify new business opportuni

  13. Technical note: River modelling to infer flood management framework

    African Journals Online (AJOL)

    River hydraulic models have successfully identified the weaknesses and areas for improvement with respect to flooding in the Sarawak River system, and can also be used to support decisions on flood management measures. Often, the big question is 'how'. This paper demonstrates a theoretical flood management ...

  14. Big Opportunities and Big Concerns of Big Data in Education

    Science.gov (United States)

    Wang, Yinying

    2016-01-01

    Against the backdrop of the ever-increasing influx of big data, this article examines the opportunities and concerns over big data in education. Specifically, this article first introduces big data, followed by delineating the potential opportunities of using big data in education in two areas: learning analytics and educational policy. Then, the…

  15. Health and condition of endangered young-of-the-year Lost River and Shortnose suckers relative to water quality in Upper Klamath Lake, Oregon, 2014–2015

    Science.gov (United States)

    Burdick, Summer M.; Conway, Carla M.; Elliott, Diane G.; Hoy, Marshal S.; Dolan-Caret, Amari; Ostberg, Carl O.

    2017-10-19

    Most mortality of endangered Lost River (Deltistes luxatus) and shortnose (Chasmistes brevirostris) suckers in Upper Klamath Lake, Oregon, occurs within the first year of life. Juvenile suckers in Clear Lake Reservoir, California, survive longer and may even recruit to the spawning populations. In a previous (2013–2014) study, the health and condition of juvenile suckers and the dynamics of water quality between Upper Klamath Lake and Clear Lake Reservoir were compared. That study found that apparent signs of stress or exposure to irritants, such as peribiliary cuffing in liver tissue and mild inflammation and necrosis in gill tissues, were present in suckers from both lakes and were unlikely to be clues to the cause of differential mortality between lakes. Seasonal trends in energy storage as glycogen and triglycerides were also similar between lakes, indicating prey limitation was not a likely factor in differential mortality. To better understand the relationship between juvenile sucker health and water quality, we examined suckers collected in 2014–2015 from Upper Klamath Lake, where water quality can be dynamic and, at times, extreme. While there were notable differences in water quality and fish health between years, we were not able to identify any specific water-quality-related causes for differential fish condition. Water quality was generally better in 2014 than in 2015. When considered together, afflictions and abnormalities generally indicated healthier suckers in 2014 than 2015. Low dissolved-oxygen events also differed between years; water temperatures were warmer, particularly in July and September; and concentrations of microcystin in both large and small fractions of samples were lower in 2014 than in 2015. Total and therefore also un-ionized ammonia were low in 2014–2015 relative to concentrations known to affect suckers. Petechial hemorrhages of the skin, attached Lernaea spp. and eosinophilic hyaline droplets in the kidney tubules were less prevalent in 2014 than in

  16. Legacy sediment, lead, and zinc storage in channel and floodplain deposits of the Big River, Old Lead Belt Mining District, Missouri, USA

    Science.gov (United States)

    Pavlowsky, Robert T.; Lecce, Scott A.; Owen, Marc R.; Martin, Derek J.

    2017-12-01

    The Old Lead Belt of southeastern Missouri was one of the leading producers of Pb ore for more than a century (1869-1972). Large quantities of contaminated mine waste have been, and continue to be, supplied to local streams. This study assessed the magnitude and spatial distribution of mining-contaminated legacy sediment stored in channel and floodplain deposits of the Big River in the Ozark Highlands of southeastern Missouri. Although metal concentrations decline downstream from the mine sources, the channel and floodplain sediments are contaminated above background levels with Pb and Zn along its entire 171-km length below the mine sources. Mean concentrations in floodplain cores > 2000 mg/kg for Pb and > 1000 mg/kg for Zn extend 40-50 km downstream from the mining area in association with the supply of fine tailings particles that were easily dispersed downstream in the suspended load. Mean concentrations in channel bed and bar sediments ranging from 1400 to 1700 mg/kg for Pb extend 30 km below the mines, while Zn concentrations of 1000-3000 mg/kg extend 20 km downstream. Coarse dolomite fragments in the 2-16 mm channel sediment fraction provide significant storage of Pb and Zn, representing 13-20% of the bulk sediment storage mass in the channel and can contain concentrations of > 4000 mg/kg for Pb and > 1000 mg/kg for Zn. These coarse tailings have been transported a maximum distance of only about 30 km from the source over a period of 120 years for an average of about 250 m/y. About 37% of the Pb and 9% of the Zn that was originally released to the watershed in tailings wastes is still stored in the Big River. A total of 157 million Mg of contaminated sediment is stored along the Big River, with 92% of it located in floodplain deposits that are typically contaminated to depths of 1.5-3.5 m. These contaminated sediments store a total of 188,549 Mg of Pb and 34,299 Mg of Zn, of which 98% of the Pb and 95% of the Zn are stored in floodplain
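
    The transport-rate figure quoted above is straightforward arithmetic; it is reproduced below as a check, using values taken directly from the abstract (the script itself is purely illustrative).

```python
# Back-of-envelope numbers from the Big River abstract.
distance_km, years = 30.0, 120.0
print(f"average coarse-tailings transport rate ≈ {1000 * distance_km / years:.0f} m/yr")  # ~250

pb_total_mg = 188_549            # Mg of Pb stored along the river
print(f"Pb held in floodplain deposits ≈ {0.98 * pb_total_mg:,.0f} Mg")  # 98% of the total
```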

  17. Compromised Rivers: Understanding Historical Human Impacts on Rivers in the Context of Restoration

    Directory of Open Access Journals (Sweden)

    Ellen Wohl

    2005-12-01

    A river that preserves a simplified and attractive form may nevertheless have lost function. Loss of function in these rivers can occur because hydrologic and geomorphic processes no longer create and maintain the habitat and natural disturbance regimes necessary for ecosystem integrity. Recognition of compromised river function is particularly important in the context of river restoration, in which the public perception of a river's condition often drives the decision to undertake restoration as well as the decision about what type of restoration should be attempted. Determining the degree to which a river has been altered from its reference condition requires a knowledge of historical land use and the associated effects on rivers. Rivers of the Front Range of the Colorado Rocky Mountains in the United States are used to illustrate how historical land uses such as beaver trapping, placer mining, tie drives, flow regulation, and the construction of transportation corridors continue to affect contemporary river characteristics. Ignorance of regional land use and river history can lead to restoration that sets unrealistic goals because it is based on incorrect assumptions about a river's reference condition or about the influence of persistent land-use effects.

  18. The Predictive Effect of Big Five Factor Model on Social Reactivity ...

    African Journals Online (AJOL)

    The study tested a model of providing a predictive explanation of Big Five Factor on social reactivity among secondary school adolescents of Cross River State, Nigeria. A sample of 200 students randomly selected across 12 public secondary schools in the State participated in the study (120 male and 80 female). Data ...

  19. How Big Are "Martin's Big Words"? Thinking Big about the Future.

    Science.gov (United States)

    Gardner, Traci

    "Martin's Big Words: The Life of Dr. Martin Luther King, Jr." tells of King's childhood determination to use "big words" through biographical information and quotations. In this lesson, students in grades 3 to 5 explore information on Dr. King to think about his "big" words, then they write about their own…

  20. An Analysis of Freshwater Mussels (Unionidae) in the Quiver River and Bogue Phalia, Mississippi, 1994-95

    National Research Council Canada - National Science Library

    Miller, Andrew

    1997-01-01

    .... The project area included a section of the Quiver River between its confluence with the Big Sunflower River immediately north of Highway 82 in Sunflower County to the Leflore-Tallahatchie county line...

  1. Geochronology and Geomorphology of the Pioneer Archaeological Site (10BT676), Upper Snake River Plain, Idaho

    Energy Technology Data Exchange (ETDEWEB)

    Keene, Joshua L. [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-04-01

    The Pioneer site in southeastern Idaho, an open-air, stratified, multi-component archaeological locality on the upper Snake River Plain, provides an ideal situation for understanding the geomorphic history of the Big Lost River drainage system. We conducted a block excavation with the goal of understanding the geochronological context of both cultural and geomorphological components at the site. The results of this study show a sequence of five soil formation episodes forming three terraces beginning prior to 7200 cal yr BP and lasting until the historic period, preserving one cultural component dated to ~3800 cal yr BP and multiple components dating to the last 800 cal yr BP. In addition, periods of deposition and stability at Pioneer indicate climate fluctuation during the middle Holocene (~7200-3800 cal yr BP), minimal deposition during the late Holocene, and a period of increased deposition potentially linked to the Little Ice Age. Furthermore, evidence for a high-energy erosion event dated to ~3800 cal yr BP suggests a catastrophic flood event during the middle Holocene that may correlate with volcanic activity at the Craters of the Moon lava fields to the northwest. This study provides a model for the study of alluvial terrace formations in arid environments and their potential to preserve stratified archaeological deposits.

  2. Sediment composition of big Chinese and Indochinese rivers reflects geology of their source, not tectonic setting of their sink.

    Science.gov (United States)

    Garzanti, Eduardo; Andò, Sergio; Limonta, Mara; Nie, Junsheng; Resentini, Alberto; Vezzoli, Giovanni; Wang, Jiangang; Yang, Shouye

    2016-04-01

    There are several reasons why the tectonic setting of a sedimentary basin cannot be inferred from the composition of its sedimentary fill. One is that sediments can be, and quite often are, transported for thousands of kilometers from sources uplifted by certain tectonic processes to subsident basins created by totally different tectonic processes. A classical case is the Amazon River, carrying detritus from the Andean Cordillera to the Atlantic passive margin on the opposite side of South America (Franzinelli and Potter, 1983; Dickinson, 1988). Similar is the case of major rivers in China and Indochina, sourced in Tibetan orogenic highlands and reaching the Chinese passive margin or the back-arc/pull-apart Andaman Sea. The Huang He (Yellow River), the most sediment-laden river in the world, delivers annually to the Bohai Sea 1 billion tons of litho-feldspatho-quartzose sedimentaclastic/metamorphiclastic sediments with moderately rich, amphibole-epidote-garnet suites including apatite and zircon (Nie et al., 2015). The Changjiang (Yangtze) River, the fourth longest on Earth and the largest in Eurasia, carries to the East China Sea litho-feldspatho-quartzose sedimentaclastic/metamorphiclastic sand with moderately poor, amphibole-epidote suites including clinopyroxene and garnet (Vezzoli et al., 2016). The Ayeyarwadi (Irrawaddy) River, ranking among the five major rivers in the world for its annual load of 0.4 billion tons, carries to the Andaman Sea litho-feldspatho-quartzose metamorphiclastic/sedimentaclastic sand with moderately rich, amphibole-epidote suites including garnet and clinopyroxene (Garzanti et al., 2013). Detrital modes in these three very big river basins are thus similar, and would plot in the "Recycled Orogen" field of Dickinson (1985) rather than in the "Continental Block" or "Magmatic Arc" fields. The orogenic signature acquired in mountainous headwaters is carried all the way to the mouth, and even after long-distance transport across wide

  3. Demographics and run timing of adult Lost River (Deltistes luxatus) and shortnose (Chasmistes brevirostris) suckers in Upper Klamath Lake, Oregon, 2012

    Science.gov (United States)

    Hewitt, David A.; Janney, Eric C.; Hayes, Brian S.; Harris, Alta C.

    2014-01-01

    Data from a long-term capture-recapture program were used to assess the status and dynamics of populations of two long-lived, federally endangered catostomids in Upper Klamath Lake, Oregon. Lost River suckers (Deltistes luxatus) and shortnose suckers (Chasmistes brevirostris) have been captured and tagged with passive integrated transponder (PIT) tags during their spawning migrations in each year since 1995. In addition, beginning in 2005, individuals that had been previously PIT-tagged were re-encountered on remote underwater antennas deployed throughout sucker spawning areas. Captures and remote encounters during spring 2012 were used to describe the spawning migrations in that year and also were incorporated into capture-recapture analyses of population dynamics. Cormack-Jolly-Seber (CJS) open population capture-recapture models were used to estimate annual survival probabilities, and a reverse-time analog of the CJS model was used to estimate recruitment of new individuals into the spawning populations. In addition, data on the size composition of captured fish were examined to provide corroborating evidence of recruitment. Model estimates of survival and recruitment were used to derive estimates of changes in population size over time and to determine the status of the populations in 2011. Separate analyses were conducted for each species and also for each subpopulation of Lost River suckers (LRS). Shortnose suckers (SNS) and one subpopulation of LRS migrate into tributary rivers to spawn, whereas the other LRS subpopulation spawns at groundwater upwelling areas along the eastern shoreline of the lake. In 2012, we captured, tagged, and released 749 LRS at four lakeshore spawning areas and recaptured an additional 969 individuals that had been tagged in previous years. Across all four areas, the remote antennas detected 6,578 individual LRS during the spawning season. Spawning activity peaked in April and most individuals were encountered at Cinder Flats and

  4. Big Data, Big Problems: A Healthcare Perspective.

    Science.gov (United States)

    Househ, Mowafa S; Aldosari, Bakheet; Alanazi, Abdullah; Kushniruk, Andre W; Borycki, Elizabeth M

    2017-01-01

    Much has been written on the benefits of big data for healthcare such as improving patient outcomes, public health surveillance, and healthcare policy decisions. Over the past five years, Big Data, and the data sciences field in general, has been hyped as the "Holy Grail" for the healthcare industry, promising a more efficient healthcare system and improved healthcare outcomes. However, more recently, healthcare researchers are exposing the potentially harmful effects Big Data can have on patient care, associating it with increased medical costs, patient mortality, and misguided decision making by clinicians and healthcare policy makers. In this paper, we review the current Big Data trends with a specific focus on the inadvertent negative impacts that Big Data could have on healthcare, in general, and specifically, as it relates to patient and clinical care. Our study results show that although Big Data is built up to be the "Holy Grail" for healthcare, small data techniques using traditional statistical methods are, in many cases, more accurate and can lead to better healthcare outcomes than Big Data methods. In sum, Big Data for healthcare may cause more problems for the healthcare industry than solutions, and in short, when it comes to the use of data in healthcare, "size isn't everything."

  5. An Analysis of Lost Sales

    Directory of Open Access Journals (Sweden)

    Jeffrey E. Jarrett

    2015-08-01

    The purpose of this manuscript is to shed light on problems associated with lost sales and the costs they incur. An investigation is made to determine if seasonality in sales and lost sales has effects on the efficient operation of supply chains. Optimization is always a goal of supply chain management, but cost increases due to insufficient inventory, low-quality product, and the like lead to customers not returning. These are lost sales that occur for many reasons. We study a data set to determine if ignoring the time series components also has an effect on the variation in lost sales. If so, can we measure the magnitude of the effects of seasonal variation in lost sales, and what are their consequences?
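
    One simple way to ask the paper's question, whether ignoring the seasonal component inflates the apparent variation in lost sales, is to compare the variance of a monthly-mean seasonal component with the total variance of the series. The sketch below uses synthetic data and is only a stand-in for the time-series methods the authors apply.

```python
# Quick look at seasonality in a (synthetic) monthly lost-sales series:
# compare the variance explained by monthly means to the total variance.
import numpy as np

rng = np.random.default_rng(1)
months = np.tile(np.arange(12), 5)                       # 5 years of monthly data
seasonal = 100 + 30 * np.sin(2 * np.pi * months / 12)    # built-in seasonality
lost_sales = seasonal + rng.normal(0, 10, months.size)   # plus random noise

monthly_means = np.array([lost_sales[months == m].mean() for m in range(12)])
seasonal_component = monthly_means[months]
explained = seasonal_component.var() / lost_sales.var()
print(f"share of lost-sales variance attributable to season ≈ {explained:.0%}")
```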

  6. Status and trends of adult Lost River (Deltistes luxatus) and shortnose (Chasmistes brevirostris) sucker populations in Upper Klamath Lake, Oregon, 2017

    Science.gov (United States)

    Hewitt, David A.; Janney, Eric C.; Hayes, Brian S.; Harris, Alta C.

    2018-04-24

    Executive Summary: Data from a long-term capture-recapture program were used to assess the status and dynamics of populations of two long-lived, federally endangered catostomids in Upper Klamath Lake, Oregon. Lost River suckers (LRS; Deltistes luxatus) and shortnose suckers (SNS; Chasmistes brevirostris) have been captured and tagged with passive integrated transponder (PIT) tags during their spawning migrations in each year since 1995. In addition, beginning in 2005, individuals that had been previously PIT-tagged were re-encountered on remote underwater antennas deployed throughout sucker spawning areas. Captures and remote encounters during the spawning season in spring 2016 were incorporated into capture-recapture analyses of population dynamics. Cormack-Jolly-Seber (CJS) open population capture-recapture models were used to estimate annual survival probabilities, and a reverse-time analog of the CJS model was used to estimate recruitment of new individuals into the spawning populations. In addition, data on the size composition of captured fish were examined to provide corroborating evidence of recruitment. Model estimates of survival and recruitment were used to derive estimates of changes in population size over time and to determine the status of the populations through 2015. Separate analyses were done for each species and also for each subpopulation of LRS. Shortnose suckers and one subpopulation of LRS migrate into tributary rivers to spawn, whereas the other LRS subpopulation spawns at groundwater upwelling areas along the eastern shoreline of the lake. Capture-recapture analyses indicated that with a few exceptions, the survival of males and females in both Lost River sucker subpopulations was high (greater than 0.88) from 1999 to 2015. Survival was notably lower for males from the river in 2000, 2006, and 2012, and for the shoreline areas in 2002. From 2001 to 2015, the abundance of males in the lakeshore spawning subpopulation decreased by at least 64
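
    The Cormack-Jolly-Seber model named above conditions on first capture and estimates apparent survival and detection probabilities from annual encounter histories. The sketch below fits a deliberately simplified constant-parameter CJS likelihood to toy data; the USGS analyses are far richer (time- and sex-specific parameters, remote-antenna detections, and a reverse-time formulation for recruitment), so treat this only as an illustration of the core idea.

```python
# Minimal Cormack-Jolly-Seber (CJS) likelihood with constant apparent
# survival (phi) and detection (p), fit to made-up capture histories.
import numpy as np
from scipy.optimize import minimize

# Each row is one fish's detection history over 5 annual occasions (toy data).
histories = np.array([
    [1, 1, 0, 1, 0],
    [1, 0, 1, 1, 1],
    [0, 1, 1, 0, 0],
    [1, 1, 1, 1, 0],
    [0, 0, 1, 0, 1],
])

def neg_log_likelihood(params):
    phi, p = 1 / (1 + np.exp(-params))   # logit scale -> probabilities
    n_occ = histories.shape[1]
    # chi[i]: probability an animal alive at occasion i is never seen again.
    chi = np.ones(n_occ)
    for i in range(n_occ - 2, -1, -1):
        chi[i] = (1 - phi) + phi * (1 - p) * chi[i + 1]
    ll = 0.0
    for h in histories:
        seen = np.flatnonzero(h)
        first, last = seen[0], seen[-1]
        # Condition on first capture; model each later occasion up to the last sighting.
        for occ in range(first + 1, last + 1):
            ll += np.log(phi) + np.log(p if h[occ] else 1 - p)
        ll += np.log(chi[last])          # never seen after the last sighting
    return -ll

fit = minimize(neg_log_likelihood, x0=np.zeros(2))
phi_hat, p_hat = 1 / (1 + np.exp(-fit.x))
print(f"apparent survival phi ≈ {phi_hat:.2f}, detection p ≈ {p_hat:.2f}")
```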

  7. Big Surveys, Big Data Centres

    Science.gov (United States)

    Schade, D.

    2016-06-01

    Well-designed astronomical surveys are powerful and have consistently been keystones of scientific progress. The Byurakan Surveys using a Schmidt telescope with an objective prism produced a list of about 3000 UV-excess Markarian galaxies but these objects have stimulated an enormous amount of further study and appear in over 16,000 publications. The CFHT Legacy Surveys used a wide-field imager to cover thousands of square degrees and those surveys are mentioned in over 1100 publications since 2002. Both ground and space-based astronomy have been increasing their investments in survey work. Survey instrumentation strives toward fair samples and large sky coverage and therefore strives to produce massive datasets. Thus we are faced with the "big data" problem in astronomy. Survey datasets require specialized approaches to data management. Big data places additional challenging requirements for data management. If the term "big data" is defined as data collections that are too large to move then there are profound implications for the infrastructure that supports big data science. The current model of data centres is obsolete. In the era of big data the central problem is how to create architectures that effectively manage the relationship between data collections, networks, processing capabilities, and software, given the science requirements of the projects that need to be executed. A stand alone data silo cannot support big data science. I'll describe the current efforts of the Canadian community to deal with this situation and our successes and failures. I'll talk about how we are planning in the next decade to try to create a workable and adaptable solution to support big data science.

  8. Recht voor big data, big data voor recht

    NARCIS (Netherlands)

    Lafarre, Anne

    Big data is a phenomenon that can no longer be ignored in our society. It is past the hype cycle, and the first implementations of big data techniques are being carried out. But what exactly is big data? What do the five V's that are so often mentioned in relation to big data entail? As an introduction to

  9. Nitrogen Leaching in Intensive Cropping Systems in Tam Duong District, Red River Delta of Vietnam

    OpenAIRE

    Trinh, M.V.; Keulen, van, H.; Roetter, R.P.

    2010-01-01

    The environmental and economic consequences of nitrogen (N) lost in rice-based systems in Vietnam are important but have not been extensively studied. The objective of this study was to quantify the amount of N lost in major cropping systems in the Red River Delta. An experiment was conducted in the Red River Delta of Vietnam, on five different crops including rose, daisy, cabbage, chili, and a rice–rice–maize rotation during 2004 and 2005. Core soil samples were taken periodically in 20-cm inc...

  10. Kootenai River Resident Fish Assessment, FY2008 KTOI Progress Report.

    Energy Technology Data Exchange (ETDEWEB)

    Holderman, Charles

    2009-06-26

    The overarching goal of project 1994-049-00 is to recover a productive, healthy and biologically diverse Kootenai River ecosystem, with emphasis on native fish species rehabilitation. It is especially designed to aid the recovery of important fish stocks, i.e. white sturgeon, burbot, bull trout, kokanee and several other salmonids important to the Kootenai Tribe of Idaho and regional sport-fisheries. The objectives of the project have been to address factors limiting key fish species within an ecosystem perspective. Major objectives include: establishing a comprehensive and thorough biomonitoring program; investigating ecosystem-level in-river productivity; testing the feasibility of a large-scale Kootenai River nutrient addition experiment (completed); evaluating and rehabilitating key Kootenai River tributaries important to the health of the lower Kootenai River ecosystem; providing funding for Canadian implementation of nutrient addition and monitoring in the Kootenai River ecosystem (Kootenay Lake) due to lost system productivity created by construction and operation of Libby Dam; mitigating the cost of monitoring nutrient additions in Arrow Lakes due to lost system productivity created by the Libby-Arrow water swap; providing written summaries of all research and activities of the project; and holding a yearly workshop to convene with other agencies and institutions to discuss management, research, and monitoring strategies for this project and to provide a forum to coordinate and disseminate data with other projects involved in the Kootenai River basin.

  11. Restoring the Lost Rivers of Washington: Can a city's hydrologic past inform its future?

    OpenAIRE

    Millay, Curtis A.

    2005-01-01

    Washington, D.C., like many older U.S. cities, suffers the woes of rapid urbanization and aging infrastructure. The city's combined sewer and stormwater system dumps millions of gallons of raw sewage into the Anacostia and Potomac Rivers over 70 times annually during significant rain events. While many groups, both public and private, attempt to clean the river, billions of dollars are still necessary over several years to remedy the combined sewer overflow (CSO) problem alone. Current pla...

  12. Detrital zircon study along the Tsangpo River, SE Tibet

    Science.gov (United States)

    Liang, Y.; Chung, S.; Liu, D.; O'Reilly, S. Y.; Chu, M.; Ji, J.; Song, B.; Pearson, N. J.

    2004-12-01

    The interactions among tectonic uplift, river erosion and alluvial deposition are fundamental processes that have shaped the landscape of the Himalayan-Tibetan orogen since its creation in early Cenozoic time. To better understand these processes around the eastern Himalayan Syntaxis, we conducted a study by systematically sampling riverbank sediments along the Tsangpo River, SE Tibet. Detrital zircons separated from the sediments were subjected to U-Pb dating by the SHRIMP II at the Beijing SHRIMP Center and then to in-situ measurements of Hf isotope ratios using LA-MC-ICPMS at GEMOC. These results, together with U-Pb ages and Hf isotope data that we recently obtained for the Transhimalayan plutonic and surrounding basement rocks, allow a more quantitative examination of the provenance or protosource areas for the river sediments. Consequently, the percentage inputs from these source areas can be estimated. Our study indicates that, before the Tsangpo River flows into the Namche Barwa Syntaxis of the eastern Himalayas where the River forms a 180° Big Bend gorge and crosscuts the Himalayan sequences, the Gangdese batholith that crops out just north of the River appears to be an overwhelming source, accounting for ˜50 % of the bank sediments. The Tethyan Himalayan sequences south of the River are the second important source, with an input of ˜25 %. The proportion of sediment supply changes after the River enters the Big Bend gorge and turns south: ˜25 % of detrital zircons are derived from the Greater Himalayas so that the input from the Tethyan Himalayas decreases (<10 %), although the input from the Gangdese batholith remains high (˜40 %). Compared with the downstream sediment budget of the Brahmaputra River based on published Sr, Nd and Os isotope data, which suggests dominant (˜60-90 %) and subordinate (˜10-40 %) contributions by the (Greater and Lesser) Himalayan and Tibetan (including Tethyan Himalayan) rocks, respectively, the change is interpreted

  13. Development of a HEC-RAS temperature model for the North Santiam River, northwestern Oregon

    Science.gov (United States)

    Stonewall, Adam J.; Buccola, Norman L.

    2015-01-01

    A one-dimensional, unsteady streamflow and temperature model (HEC-RAS) of the North Santiam and Santiam Rivers was developed by the U.S. Geological Survey to be used in conjunction with previously developed two-dimensional hydrodynamic water-quality models (CE-QUAL-W2) of Detroit and Big Cliff Lakes upstream of the study area. In conjunction with the output from the previously developed models, the HEC-RAS model can simulate streamflows and temperatures within acceptable limits (mean error [bias] near zero; typical streamflow errors less than 5 percent; typical water temperature errors less than 1.0 °C) for the length of the North Santiam River downstream of Big Cliff Dam under a series of potential future conditions in which dam structures and/or dam operations are modified to improve temperature conditions for threatened and endangered fish. Although a two-dimensional (longitudinal, vertical) CE-QUAL-W2 model for the North Santiam and Santiam Rivers downstream of Big Cliff Dam exists, that model proved unstable under highly variable flow conditions. The one-dimensional HEC-RAS model documented in this report can better simulate cross-sectional-averaged stream temperatures under a wide range of flow conditions.
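
    The acceptance criteria quoted above (mean error near zero, temperature errors under about 1.0 °C) amount to simple goodness-of-fit statistics on paired simulated and observed values; the sketch below shows the kind of check involved, using made-up numbers rather than North Santiam data.

```python
# Sketch of the goodness-of-fit checks implied by the stated acceptance
# criteria (bias near zero, temperature errors under about 1.0 °C).
# Observed/simulated values are made up for illustration.
import numpy as np

observed  = np.array([11.2, 12.5, 13.1, 14.0, 13.4])   # °C
simulated = np.array([11.0, 12.9, 13.5, 13.8, 13.0])   # °C

errors = simulated - observed
bias = errors.mean()                     # mean error ("bias")
mae = np.abs(errors).mean()              # mean absolute error
rmse = np.sqrt((errors ** 2).mean())     # root-mean-square error

print(f"bias = {bias:+.2f} °C, MAE = {mae:.2f} °C, RMSE = {rmse:.2f} °C")
print("within criteria" if abs(bias) < 0.1 and mae < 1.0 else "re-calibrate")
```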

  14. BigOP: Generating Comprehensive Big Data Workloads as a Benchmarking Framework

    OpenAIRE

    Zhu, Yuqing; Zhan, Jianfeng; Weng, Chuliang; Nambiar, Raghunath; Zhang, Jinchao; Chen, Xingzhen; Wang, Lei

    2014-01-01

    Big Data is considered proprietary asset of companies, organizations, and even nations. Turning big data into real treasure requires the support of big data systems. A variety of commercial and open source products have been unleashed for big data storage and processing. While big data users are facing the choice of which system best suits their needs, big data system developers are facing the question of how to evaluate their systems with regard to general big data processing needs. System b...

  15. Modeling canopy-level productivity: is the "big-leaf" simplification acceptable?

    Science.gov (United States)

    Sprintsin, M.; Chen, J. M.

    2009-05-01

    The "big-leaf" approach to calculating the carbon balance of plant canopies assumes that canopy carbon fluxes have the same relative responses to the environment as any single unshaded leaf in the upper canopy. Widely used light use efficiency models are essentially simplified versions of the big-leaf model. Despite its wide acceptance, subsequent developments in the modeling of leaf photosynthesis and measurements of canopy physiology have brought into question the assumptions behind this approach showing that big leaf approximation is inadequate for simulating canopy photosynthesis because of the additional leaf internal control on carbon assimilation and because of the non-linear response of photosynthesis on leaf nitrogen and absorbed light, and changes in leaf microenvironment with canopy depth. To avoid this problem a sunlit/shaded leaf separation approach, within which the vegetation is treated as two big leaves under different illumination conditions, is gradually replacing the "big-leaf" strategy, for applications at local and regional scales. Such separation is now widely accepted as a more accurate and physiologically based approach for modeling canopy photosynthesis. Here we compare both strategies for Gross Primary Production (GPP) modeling using the Boreal Ecosystem Productivity Simulator (BEPS) at local (tower footprint) scale for different land cover types spread over North America: two broadleaf forests (Harvard, Massachusetts and Missouri Ozark, Missouri); two coniferous forests (Howland, Maine and Old Black Spruce, Saskatchewan); Lost Creek shrubland site (Wisconsin) and Mer Bleue petland (Ontario). BEPS calculates carbon fixation by scaling Farquhar's leaf biochemical model up to canopy level with stomatal conductance estimated by a modified version of the Ball-Woodrow-Berry model. The "big-leaf" approach was parameterized using derived leaf level parameters scaled up to canopy level by means of Leaf Area Index. The influence of sunlit

  16. How Big Is Too Big?

    Science.gov (United States)

    Cibes, Margaret; Greenwood, James

    2016-01-01

    Media Clips appears in every issue of Mathematics Teacher, offering readers contemporary, authentic applications of quantitative reasoning based on print or electronic media. This issue features "How Big is Too Big?" (Margaret Cibes and James Greenwood) in which students are asked to analyze the data and tables provided and answer a…

  17. Hydrologic conditions and distribution of selected radiochemical and chemical constituents in water, Snake River Plain aquifer, Idaho National Engineering Laboratory, Idaho, 1989 through 1991

    International Nuclear Information System (INIS)

    Bartholomay, R.C.; Orr, B.R.; Liszewski, M.J.; Jensen, R.G.

    1995-08-01

    Radiochemical and chemical wastewater discharged since 1952 to infiltration ponds and disposal wells at the Idaho National Engineering Laboratory (INEL) has affected water quality in the Snake River Plain aquifer. The U.S. Geological Survey, in cooperation with the U.S. Department of Energy, maintains a continuous monitoring network at the INEL to determine hydrologic trends and to delineate the movement of radiochemical and chemical wastes in the aquifer. This report presents an analysis of water-level and water-quality data collected from the Snake River Plain aquifer during 1989-91. Water in the eastern Snake River Plain aquifer moves principally through fractures and interflow zones in basalt, generally flows southwestward, and eventually discharges at springs along the Snake River. The aquifer is recharged principally from irrigation water, infiltration of streamflow, and ground-water inflow from adjoining mountain drainage basins. Water levels in wells throughout the INEL generally declined during 1989-91 due to drought. Detectable concentrations of radiochemical constituents in water samples from wells in the Snake River Plain aquifer at the INEL decreased or remained constant during 1989-91. Decreased concentrations are attributed to reduced rates of radioactive-waste disposal, sorption processes, radioactive decay, and changes in waste-disposal practices. Detectable concentrations of chemical constituents in water from the Snake River Plain aquifer at the INEL were variable during 1989-91. Sodium and chloride concentrations in the southern part of the INEL increased slightly during 1989-91 because of increased waste-disposal rates and a lack of recharge from the Big Lost River. Plumes of 1,1,1-trichloroethane have developed near the Idaho Chemical Processing Plant and the Radioactive Waste Management Complex as a result of waste disposal practices

  18. The coal deposits of the Alkali Butte, the Big Sand Draw, and the Beaver Creek fields, Fremont County, Wyoming

    Science.gov (United States)

    Thompson, Raymond M.; White, Vincent L.

    1952-01-01

    Large coal reserves are present in three areas located between 12 and 20 miles southeast of Riverton, Fremont County, central Wyoming. Coal in two of these areas, the Alkali Butte coal field and the Big Sand Draw coal field, is exposed on the surface and has been developed to some extent by underground mining. The Beaver Creek coal field is known only from drill cuttings and cores from wells drilled for oil and gas in the Beaver Creek oil and gas field. These three coal areas can be reached most readily from Riverton, Wyo. State Route 320 crosses Wind River about 1 mile south of Riverton. A few hundred yards south of the river a graveled road branches off the highway and extends south across the Popo Agie River toward Sand Draw oil and gas field. About 8 miles south of the highway along the Sand Draw road, a dirt road bears east and along this road it is about 12 miles to the Bell coal mine in the Alkali Butte coal field. Three miles southeast of the Alkali Butte turn-off, 3 miles of oiled road extends southwest into the Beaver Creek oil and gas field. About 6 miles southeast of the Beaver Creek turn-off, in the valley of Little Sand Draw Creek, a dirt road extends east 1 mile and then southeast 1 mile to the Downey mine in the Big Sand Draw coal field. Location of these coal fields is shown on figure 1 with their relationship to the Wind River basin and other coal fields, place localities, and wells mentioned in this report. The coal in the Alkali Butte coal field is exposed partly on the Wind River Indian Reservation in Tps. 1 and 2 S., R. 6 E., and partly on public land. Coal in the Beaver Creek and Big Sand Draw coal fields is mainly on public land. The region has a semiarid climate with rainfall averaging less than 10 in. per year. When rain does fall, the sandy-bottomed stream channels fill rapidly and are frequently impassable for a few hours. Beaver Creek, Big Sand Draw, Little Sand Draw, and Kirby Draw and their smaller tributaries drain the area and flow

  19. Untangling Trends and Drivers of Changing River Discharge Along Florida's Gulf Coast

    Science.gov (United States)

    Glodzik, K.; Kaplan, D. A.; Klarenberg, G.

    2017-12-01

    Along the relatively undeveloped Big Bend coastline of Florida, discharge in many rivers and springs is decreasing. The causes are unclear, though they likely include a combination of groundwater extraction for water supply, climate variability, and altered land use. Saltwater intrusion from altered freshwater influence and sea level rise is causing transformative ecosystem impacts along this flat coastline, including coastal forest die-off and oyster reef collapse. A key uncertainty for understanding river discharge change is predicting discharge from rainfall, since Florida's karstic bedrock stores large amounts of groundwater, which has a long residence time. This study uses Dynamic Factor Analysis (DFA), a multivariate data reduction technique for time series, to find common trends in flow and reveal hydrologic variables affecting flow in eight Big Bend rivers since 1965. The DFA uses annual river flows as response time series, and climate data (annual rainfall and evapotranspiration by watershed) and climatic indices (El Niño Southern Oscillation [ENSO] Index and North Atlantic Oscillation [NAO] Index) as candidate explanatory variables. Significant explanatory variables (one evapotranspiration and three rainfall time series) explained roughly 50% of discharge variation across rivers. Significant trends (representing unexplained variation) were shared among rivers, with geographical grouping of five northern rivers and three southern rivers, along with a strong downward trend affecting six out of eight systems. ENSO and NAO had no significant impact. Advancing knowledge of these dynamics is necessary for forecasting how altered rainfall and temperatures from climate change may impact flows. Improved forecasting is especially important given Florida's reliance on groundwater extraction to support its growing population.
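
    Dynamic Factor Analysis itself is a state-space method; as a rough stand-in for the common-trend idea, the sketch below extracts a shared component from standardized annual discharge series with an SVD/PCA decomposition of synthetic data. It is not equivalent to the DFA used in the study, which also incorporates explanatory variables and observation error.

```python
# Rough stand-in for the common-trend idea behind Dynamic Factor Analysis:
# extract a shared component from standardized annual discharge series with
# PCA (via SVD). The data below are synthetic: a shared downward trend plus noise.
import numpy as np

rng = np.random.default_rng(0)
n_years, n_rivers = 50, 8
trend = np.linspace(0.0, -2.0, n_years)                 # shared downward trend
flows = trend[:, None] + rng.normal(0.0, 1.0, (n_years, n_rivers))

z = (flows - flows.mean(axis=0)) / flows.std(axis=0)    # standardize each river
u, s, vt = np.linalg.svd(z, full_matrices=False)        # PCA via SVD
common_trend = u[:, 0] * s[0]                           # leading component scores
loadings = vt[0]                                        # per-river loadings
explained = s[0] ** 2 / (s ** 2).sum()

print(f"leading component explains {explained:.0%} of standardized variance")
print("river loadings:", np.round(loadings, 2))
```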

  20. Geochemistry of groundwater in the eastern Snake River Plain aquifer, Idaho National Laboratory and vicinity, eastern Idaho

    Science.gov (United States)

    Rattray, Gordon W.

    2018-05-30

    Nuclear research activities at the U.S. Department of Energy (DOE) Idaho National Laboratory (INL) in eastern Idaho produced radiochemical and chemical wastes that were discharged to the subsurface, resulting in detectable concentrations of some waste constituents in the eastern Snake River Plain (ESRP) aquifer. These waste constituents may pose risks to the water quality of the aquifer. In order to understand these risks to water quality, the U.S. Geological Survey, in cooperation with the DOE, conducted a study of groundwater geochemistry to improve the understanding of hydrologic and chemical processes in the ESRP aquifer at and near the INL and to understand how these processes affect waste constituents in the aquifer. Geochemistry data were used to identify sources of recharge, mixing of water, and directions of groundwater flow in the ESRP aquifer at the INL. The geochemistry data were analyzed from 167 sample sites at and near the INL. The sites included 150 groundwater, 13 surface-water, and 4 geothermal-water sites. The data were collected between 1952 and 2012, although most data collected at the INL were collected from 1989 to 1996. Water samples were analyzed for all or most of the following: field parameters, dissolved gases, major ions, dissolved metals, isotope ratios, and environmental tracers. Sources of recharge identified at the INL were regional groundwater, groundwater from the Little Lost River (LLR) and Birch Creek (BC) valleys, groundwater from the Lost River Range, geothermal water, and surface water from the Big Lost River (BLR), LLR, and BC. Recharge from the BLR that may have occurred during the last glacial epoch, or paleorecharge, may be present at several wells in the southwestern part of the INL. Mixing of water at the INL primarily included mixing of surface water with groundwater from the tributary valleys and mixing of geothermal water with regional groundwater. Additionally, a zone of mixing between tributary valley water and

  1. Nursing Needs Big Data and Big Data Needs Nursing.

    Science.gov (United States)

    Brennan, Patricia Flatley; Bakken, Suzanne

    2015-09-01

    Contemporary big data initiatives in health care will benefit from greater integration with nursing science and nursing practice; in turn, nursing science and nursing practice have much to gain from the data science initiatives. Big data arises secondary to scholarly inquiry (e.g., -omics) and everyday observations like cardiac flow sensors or Twitter feeds. Emerging data science methods ensure that these data can be leveraged to improve patient care. Big data encompasses data that exceed human comprehension, that exist at a volume unmanageable by standard computer systems, that arrive at a velocity not under the control of the investigator, and that possess a level of imprecision not found in traditional inquiry. Data science methods are emerging to manage and gain insights from big data. The primary methods included investigation of emerging federal big data initiatives and exploration of exemplars from nursing informatics research to benchmark where nursing is already poised to participate in the big data revolution. We provide observations and reflections on experiences in the emerging big data initiatives. Existing approaches to large data set analysis provide a necessary but not sufficient foundation for nursing to participate in the big data revolution. Nursing's Social Policy Statement guides a principled, ethical perspective on big data and data science. There are implications for basic and advanced practice clinical nurses in practice, for the nurse scientist who collaborates with data scientists, and for the nurse data scientist. Big data and data science have the potential to provide greater richness in understanding patient phenomena and in tailoring interventional strategies that are personalized to the patient. © 2015 Sigma Theta Tau International.

  2. BIG Data - BIG Gains? Understanding the Link Between Big Data Analytics and Innovation

    OpenAIRE

    Niebel, Thomas; Rasel, Fabienne; Viete, Steffen

    2017-01-01

    This paper analyzes the relationship between firms’ use of big data analytics and their innovative performance for product innovations. Since big data technologies provide new data information practices, they create new decision-making possibilities, which firms can use to realize innovations. Applying German firm-level data we find suggestive evidence that big data analytics matters for the likelihood of becoming a product innovator as well as the market success of the firms’ product innovat...

  3. Networking for big data

    CERN Document Server

    Yu, Shui; Misic, Jelena; Shen, Xuemin (Sherman)

    2015-01-01

    Networking for Big Data supplies an unprecedented look at cutting-edge research on the networking and communication aspects of Big Data. Starting with a comprehensive introduction to Big Data and its networking issues, it offers deep technical coverage of both theory and applications.The book is divided into four sections: introduction to Big Data, networking theory and design for Big Data, networking security for Big Data, and platforms and systems for Big Data applications. Focusing on key networking issues in Big Data, the book explains network design and implementation for Big Data. It exa

  4. Distribution and movement of Big Spring spinedace (Lepidomeda mollispinis pratensis) in Condor Canyon, Meadow Valley Wash, Nevada

    Science.gov (United States)

    Jezorek, Ian G.; Connolly, Patrick J.

    2013-01-01

    Big Spring spinedace (Lepidomeda mollispinis pratensis) is a cyprinid whose entire population occurs within a section of Meadow Valley Wash, Nevada. Other spinedace species have suffered population and range declines (one species is extinct). Managers, concerned about the vulnerability of Big Spring spinedace, have considered habitat restoration actions or translocation, but they have lacked data on distribution or habitat use. Our study occurred in an 8.2-km section of Meadow Valley Wash, including about 7.2 km in Condor Canyon and 0.8 km upstream of the canyon. Big Spring spinedace were present upstream of the currently listed critical habitat, including in the tributary Kill Wash. We found no Big Spring spinedace in the lower 3.3 km of Condor Canyon. We tagged Big Spring spinedace ≥70 mm fork length (range 70–103 mm) with passive integrated transponder tags during October 2008 (n = 100) and March 2009 (n = 103) to document movement. At least 47 of these individuals moved from their release location (up to 2 km). Thirty-nine individuals moved to Kill Wash or the confluence area with Meadow Valley Wash. Ninety-three percent of movement occurred in spring 2009. Fish moved both upstream and downstream. We found no movement downstream over a small waterfall at river km 7.9 and recorded only one fish that moved downstream over Delmue Falls (a 12-m drop) at river km 6.1. At the time of tagging, there was no significant difference in fork length or condition between Big Spring spinedace that were later detected moving and those not detected moving. Kill Wash and its confluence area appeared important to Big Spring spinedace; connectivity with these areas may be key to species persistence. These areas may provide a habitat template for restoration or translocation. The lower 3.3 km of

  5. Global fluctuation spectra in big-crunch-big-bang string vacua

    International Nuclear Information System (INIS)

    Craps, Ben; Ovrut, Burt A.

    2004-01-01

    We study big-crunch-big-bang cosmologies that correspond to exact world-sheet superconformal field theories of type II strings. The string theory spacetime contains a big crunch and a big bang cosmology, as well as additional 'whisker' asymptotic and intermediate regions. Within the context of free string theory, we compute, unambiguously, the scalar fluctuation spectrum in all regions of spacetime. Generically, the big crunch fluctuation spectrum is altered while passing through the bounce singularity. The change in the spectrum is characterized by a function Δ, which is momentum and time dependent. We compute Δ explicitly and demonstrate that it arises from the whisker regions. The whiskers are also shown to lead to 'entanglement' entropy in the big bang region. Finally, in the Milne orbifold limit of our superconformal vacua, we show that Δ→1 and, hence, the fluctuation spectrum is unaltered by the big-crunch-big-bang singularity. We comment on, but do not attempt to resolve, subtleties related to gravitational back reaction and light winding modes when interactions are taken into account

  6. Lost in Translation (LiT): IUPHAR Review 6.

    Science.gov (United States)

    Dollery, Colin T

    2014-05-01

    Translational medicine is a roller coaster with occasional brilliant successes and a large majority of failures. Lost in Translation 1 ('LiT1'), beginning in the 1950s, was a golden era built upon earlier advances in experimental physiology, biochemistry and pharmacology, with a dash of serendipity, that led to the discovery of many new drugs for serious illnesses. LiT2 saw the large-scale industrialization of drug discovery using high-throughput screens and assays based on affinity for the target molecule. The links between drug development and university sciences and medicine weakened, but there were still some brilliant successes. In LiT3, the coverage of translational medicine expanded from molecular biology to drug budgets, with much greater emphasis on safety and official regulation. Compared with R&D expenditure, the number of breakthrough discoveries in LiT3 was disappointing, but monoclonal antibodies for immunity and inflammation brought in a new golden era and kinase inhibitors such as imatinib were breakthroughs in cancer. The pharmaceutical industry is trying to revive the LiT1 approach by using phenotypic assays and closer links with academia. LiT4 faces a data explosion generated by the genome project, GWAS, ENCODE and the 'omics' that is in danger of leaving LiT4 in a computerized cloud. Industrial laboratories are filled with masses of automated machinery while the scientists sit in a separate room viewing the results on their computers. Big Data will need Big Thinking in LiT4 but with so many unmet medical needs and so many new opportunities being revealed there are high hopes that the roller coaster will ride high again. © 2014 The British Pharmacological Society.

  7. Salmonid Gamete Preservation in the Snake River Basin : 2000 Annual Report.

    Energy Technology Data Exchange (ETDEWEB)

    Armstrong, Robyn; Kucera, Paul A. [Nez Perce Tribe. Dept. of Fisheries Resource Management, Lapwai, ID (US)

    2001-06-01

    Steelhead (Oncorhynchus mykiss) and chinook salmon (Oncorhynchus tshawytscha) populations in the Northwest are decreasing. Genetic diversity is being lost at an alarming rate. The Nez Perce Tribe (Tribe) strives to ensure availability of genetic samples of the existing male salmonid population by establishing and maintaining a germplasm repository. The sampling strategy, initiated in 1992, has been to collect and preserve male salmon and steelhead genetic diversity across the geographic landscape by sampling within the major river subbasins in the Snake River basin, assuming a metapopulation structure existed historically. Gamete cryopreservation conserves genetic diversity in a germplasm repository, but is not a recovery action for listed fish species. The Tribe was funded in 2000 by the Bonneville Power Administration (BPA) and the U.S. Fish and Wildlife Service Lower Snake River Compensation Plan (LSRCP) to coordinate gene banking of male gametes from Endangered Species Act listed steelhead and spring and summer chinook salmon in the Snake River basin. In 2000, a total of 349 viable chinook salmon semen samples from the Lostine River, Catherine Creek, upper Grande Ronde River, Lookingglass Hatchery (Imnaha River stock), Rapid River Hatchery, Lake Creek, the South Fork Salmon River weir, Johnson Creek, Big Creek, Capehorn Creek, Marsh Creek, Pahsimeroi Hatchery, and Sawtooth Hatchery (upper Salmon River stock) were cryopreserved. Also, 283 samples of male steelhead gametes from Dworshak Hatchery, Fish Creek, Grande Ronde River, Imnaha River, Little Sheep Creek, Pahsimeroi Hatchery and Oxbow Hatchery were also cryopreserved. The Tribe acquired 5 frozen steelhead samples from the Selway River collected in 1994 and 15 from Fish Creek sampled in 1993 from the U.S. Geological Survey, for addition into the germplasm repository. Also, 590 cryopreserved samples from the Grande Ronde chinook salmon captive broodstock program are being stored at the University of Idaho as

  8. Big Argumentation?

    Directory of Open Access Journals (Sweden)

    Daniel Faltesek

    2013-08-01

    Big Data is nothing new. Public concern regarding the mass diffusion of data has appeared repeatedly with computing innovations; before Big Data, it was most recently referred to as the information explosion. In this essay, I argue that the appeal of Big Data is not a function of computational power, but of a synergistic relationship between aesthetic order and a politics evacuated of a meaningful public deliberation. Understanding, and challenging, Big Data requires an attention to the aesthetics of data visualization and the ways in which those aesthetics would seem to depoliticize information. The conclusion proposes an alternative argumentative aesthetic as the appropriate response to the depoliticization posed by the popular imaginary of Big Data.

  9. Hydrological forecast of maximal water level in Lepenica river basin and flood control measures

    Directory of Open Access Journals (Sweden)

    Milanović Ana

    2006-01-01

    The Lepenica river basin has become the axis of economic and urban development of the Šumadija district. However, given the disordered regime of the Lepenica River and its tributaries, there is insufficient water for water supply and irrigation, while, on the other hand, the area suffers major flood and torrent damage (especially the Kragujevac basin). The paper presents flood problems in the river basin, maximum water level forecasts, and flood control measures carried out until now. Some of the potential solutions, aiming to achieve effective flood control, are suggested as well.

  10. Bats of the Savannah River Site and vicinity.

    Energy Technology Data Exchange (ETDEWEB)

    M.A. Menzel; J.M. Menzel; J.C. Kilgo; W.M. Ford; T.C. Carter; J.W. Edwards

    2003-10-01

    The U.S. Department of Energy's Savannah River Site supports a diverse bat community. Nine species occur there regularly, including the eastern pipistrelle (Pipistrellus subflavus), southeastern myotis (Myotis austroriparius), evening bat (Nycticeius humeralis), Rafinesque's big-eared bat (Corynorhinus rafinesquii), silver-haired bat (Lasionycteris noctivagans), eastern red bat (Lasiurus borealis), Seminole bat (L. seminolus), hoary bat (L. cinereus), and big brown bat (Eptesicus fuscus). There are extralimital capture records for two additional species: little brown bat (M. lucifugus) and northern yellow bat (Lasiurus intermedius). Acoustical sampling has documented the presence of Brazilian free-tailed bats (Tadarida brasiliensis), but none has been captured. Among those species common to the Site, the southeastern myotis and Rafinesque's big-eared bat are listed in South Carolina as threatened and endangered, respectively. The presence of those two species, and a growing concern for the conservation of forest-dwelling bats, led to extensive and focused research on the Savannah River Site between 1996 and 2002. Summarizing this and other bat research, we provide species accounts that discuss morphology and distribution, roosting and foraging behaviors, home range characteristics, habitat relations, and reproductive biology. We also present information on conservation needs and rabies issues; and, finally, identification keys that may be useful wherever the bat species we describe are found.

  11. Thirty Years Later: Reflections of the Big Thompson Flood, Colorado, 1976 to 2006

    Science.gov (United States)

    Jarrett, R. D.; Costa, J. E.; Brunstein, F. C.; Quesenberry, C. A.; Vandas, S. J.; Capesius, J. P.; O'Neill, G. B.

    2006-12-01

    Thirty years ago, over 300 mm of rain fell in about 4 to 6 hours in the middle reaches of the Big Thompson River Basin during the devastating flash flood on July 31, 1976. The rainstorm produced flood discharges that exceeded 40 m3/s/km2. A peak discharge of 883 m3/s was estimated at the Big Thompson River near Drake streamflow-gaging station. The raging waters left 144 people dead and 250 injured, and over 800 people were evacuated by helicopter. Four hundred eighteen homes and businesses were destroyed, as well as 438 automobiles, and damage to infrastructure left the canyon reachable only via helicopter. Total damage was estimated in excess of $116 million (2006 dollars). Natural hazards similar to the Big Thompson flood are rare, but the probability of a similar event hitting the Front Range, other parts of Colorado, or other parts of the Nation is real. Although much smaller in scale than the Big Thompson flood, several flash floods occurred during the monsoon in early July 2006 in the Colorado foothills, reemphasizing the hazards associated with flash flooding. The U.S. Geological Survey (USGS) conducts flood research to help understand and predict the magnitude and likelihood of large streamflow events such as the Big Thompson flood. A summary of hydrologic conditions of the 1976 flood, what the 1976 flood can teach us about flash floods, a description of some of the advances in USGS flood science as a consequence of this disaster, and lessons that we learned to help reduce loss of life from this extraordinary flash flood are discussed. In the 30 years since the Big Thompson flood, there have been important advances in streamflow monitoring and flood warning. The National Weather Service (NWS) NEXRAD radar allows real-time monitoring of precipitation in most places in the United States. The USGS currently (2006) operates about 7,250 real-time streamflow-gaging stations in the United States that are monitored by the USGS, the NWS, and emergency managers

  12. The costs of diabetes among Australians aged 45-64 years from 2015 to 2030: projections of lost productive life years (PLYs), lost personal income, lost taxation revenue, extra welfare payments and lost gross domestic product from Health&WealthMOD2030.

    Science.gov (United States)

    Schofield, Deborah; Shrestha, Rupendra N; Cunich, Michelle M; Passey, Megan E; Veerman, Lennert; Tanton, Robert; Kelly, Simon J

    2017-01-09

    To project the number of people aged 45-64 years with lost productive life years (PLYs) due to diabetes and related costs (lost income, extra welfare payments, lost taxation revenue); and lost gross domestic product (GDP) attributable to diabetes in Australia from 2015 to 2030. A simulation study of how the number of people aged 45-64 years with diabetes increases over time (based on population growth and disease trend data) and the economic losses incurred by individuals and the government. Cross-sectional outputs of a microsimulation model (Health&WealthMOD2030) which used the Australian Bureau of Statistics' Survey of Disability, Ageing and Carers 2003 and 2009 as a base population and integrated outputs from two microsimulation models (Static Incomes Model and Australian Population and Policy Simulation Model), Treasury's population and labour force projections, and chronic disease trends data. Australian population aged 45-64 years in 2015, 2020, 2025 and 2030. Lost PLYs, lost income, extra welfare payments, lost taxation revenue, lost GDP. 18 100 people are out of the labour force due to diabetes in 2015, increasing to 21 400 in 2030 (18% increase). National costs consisted of a loss of $A467 million in annual income in 2015, increasing to $A807 million in 2030 (73% increase). For the government, extra annual welfare payments increased from $A311 million in 2015 to $A350 million in 2030 (13% increase); and lost annual taxation revenue increased from $A102 million in 2015 to $A166 million in 2030 (63% increase). A loss of $A2.1 billion in GDP was projected for 2015, increasing to $A2.9 billion in 2030 attributable to diabetes through its impact on PLYs. Individuals incur significant costs of diabetes through lost PLYs and lost income in addition to disease burden through human suffering and healthcare costs. The government incurs extra welfare payments, lost taxation revenue and lost GDP, along with direct healthcare costs. Published by the BMJ
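
    As a quick arithmetic check, the percentage changes quoted above can be reproduced directly from the 2015 and 2030 figures. The values are copied from the abstract; the rounding to whole percentages is mine, and the GDP growth is computed here rather than quoted.

        # Recompute the 2015 -> 2030 percentage increases quoted in the abstract.
        projections = {
            "people out of the labour force": (18_100, 21_400),   # quoted as 18%
            "lost annual income ($A million)": (467, 807),        # quoted as 73%
            "extra welfare payments ($A million)": (311, 350),    # quoted as 13%
            "lost taxation revenue ($A million)": (102, 166),     # quoted as 63%
            "lost GDP ($A billion)": (2.1, 2.9),                  # not quoted (~38%)
        }
        for label, (v2015, v2030) in projections.items():
            print(f"{label}: {v2015} -> {v2030} (+{100 * (v2030 - v2015) / v2015:.0f}%)")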

  13. Big data

    DEFF Research Database (Denmark)

    Madsen, Anders Koed; Flyverbom, Mikkel; Hilbert, Martin

    2016-01-01

    The claim that big data can revolutionize strategy and governance in the context of international relations is increasingly hard to ignore. Scholars of international political sociology have mainly discussed this development through the themes of security and surveillance. The aim of this paper is to outline a research agenda that can be used to raise a broader set of sociological and practice-oriented questions about the increasing datafication of international relations and politics. First, it proposes a way of conceptualizing big data that is broad enough to open fruitful investigations into the emerging use of big data in these contexts. This conceptualization includes the identification of three moments contained in any big data practice. Second, it suggests a research agenda built around a set of subthemes that each deserve dedicated scrutiny when studying the interplay between big data...

  14. M-area basin closure-Savannah River Site

    International Nuclear Information System (INIS)

    McMullin, S.R.; Horvath, J.G.

    1991-01-01

    M-Area, on the Savannah River Site, processes raw materials and manufactures fuel and target rods for reactor use. Effluent from these processes was discharged into the M-Area settling basin and Lost Lake, a natural wetland. The closure of this basin began in 1988 and included the removal and stabilization of basin fluids, excavation of all contaminated soils from affected areas and Lost Lake, and placement of all materials in the bottom of the emptied basin. These materials were covered with a RCRA-style cap, employing redundant barriers of kaolin clay and geosynthetic material. Restoration of excavated uplands and wetlands is currently underway.

  15. Big River Benthos: Linking Year Round Biological Response to Altered Hydrological Regimes

    Science.gov (United States)

    2017-04-02

    Sieved material was then placed in Whirl-Pak® bags, preserved with 80% EtOH, and returned to the ERDC Fish Ecology Laboratory in Vicksburg, MS... ecological response to altered flow regimes and help document benefits of restoring connectivity between secondary channels and the Mississippi River main...Modifications of the flow and function of the Mississippi River have only increased since then — markedly so after the Great Flood of 1927, an event that

  16. Big data computing

    CERN Document Server

    Akerkar, Rajendra

    2013-01-01

    Due to market forces and technological evolution, Big Data computing is developing at an increasing rate. A wide variety of novel approaches and tools have emerged to tackle the challenges of Big Data, creating both more opportunities and more challenges for students and professionals in the field of data computation and analysis. Presenting a mix of industry cases and theory, Big Data Computing discusses the technical and practical issues related to Big Data in intelligent information management. Emphasizing the adoption and diffusion of Big Data tools and technologies in industry, the book i

  17. 77 FR 73739 - Endangered and Threatened Wildlife and Plants; Designation of Critical Habitat for Lost River...

    Science.gov (United States)

    2012-12-11

    ... sucker and shortnose sucker under the Act. For more information on the biology and ecology of the Lost... conservation biology principles. We received responses from two of the peer reviewers. We reviewed all comments... microgram (µg) per liter (L). The peer reviewer stated that this is the World Health Organization...

  18. Status and trends of adult Lost River (Deltistes luxatus) and shortnose (Chasmistes brevirostris) sucker populations in Upper Klamath Lake, Oregon, 2015

    Science.gov (United States)

    Hewitt, David A.; Janney, Eric C.; Hayes, Brian S.; Harris, Alta C.

    2017-07-21

    Executive Summary: Data from a long-term capture-recapture program were used to assess the status and dynamics of populations of two long-lived, federally endangered catostomids in Upper Klamath Lake, Oregon. Lost River suckers (LRS; Deltistes luxatus) and shortnose suckers (SNS; Chasmistes brevirostris) have been captured and tagged with passive integrated transponder (PIT) tags during their spawning migrations in each year since 1995. In addition, beginning in 2005, individuals that had been previously PIT-tagged were re-encountered on remote underwater antennas deployed throughout sucker spawning areas. Captures and remote encounters during the spawning season in spring 2015 were incorporated into capture-recapture analyses of population dynamics. Cormack-Jolly-Seber (CJS) open population capture-recapture models were used to estimate annual survival probabilities, and a reverse-time analog of the CJS model was used to estimate recruitment of new individuals into the spawning populations. In addition, data on the size composition of captured fish were examined to provide corroborating evidence of recruitment. Separate analyses were done for each species and also for each subpopulation of LRS. Shortnose suckers and one subpopulation of LRS migrate into tributary rivers to spawn, whereas the other LRS subpopulation spawns at groundwater upwelling areas along the eastern shoreline of the lake. Characteristics of the spawning migrations in 2015, such as the effects of temperature on the timing of the migrations, were similar to those in past years. Capture-recapture analyses for the LRS subpopulation that spawns at the shoreline areas included encounter histories for 13,617 individuals, and analyses for the subpopulation that spawns in the rivers included 39,321 encounter histories. With a few exceptions, the survival of males and females in both subpopulations was high (greater than or equal to 0.86) between 1999 and 2013. Survival was notably lower for males from the rivers
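
    The Cormack-Jolly-Seber machinery behind these estimates can be illustrated with a deliberately small example. The sketch below fits constant apparent survival and detection probabilities to invented three-occasion capture histories; it is a generic textbook-style CJS likelihood, not the USGS analysis, its software, or its data.

        # Toy Cormack-Jolly-Seber fit: constant survival (phi) and detection (p),
        # three sampling occasions, all fish released at occasion 1 (made-up counts).
        import numpy as np
        from scipy.optimize import minimize

        counts = {"111": 120, "110": 80, "101": 45, "100": 755}

        def neg_log_likelihood(params):
            # Work on the logit scale so the optimizer is unconstrained.
            phi, p = 1 / (1 + np.exp(-params))
            pr = {
                "111": phi * p * phi * p,
                "110": phi * p * (1 - phi * p),
                "101": phi * (1 - p) * phi * p,
            }
            pr["100"] = 1 - sum(pr.values())   # never encountered after release
            return -sum(n * np.log(pr[h]) for h, n in counts.items())

        fit = minimize(neg_log_likelihood, x0=np.zeros(2), method="BFGS")
        phi_hat, p_hat = 1 / (1 + np.exp(-fit.x))
        print(f"apparent survival = {phi_hat:.2f}, detection probability = {p_hat:.2f}")

    The real analyses add time- and sex-specific parameters, remote antenna detections, and the reverse-time recruitment model, but the multinomial likelihood over capture histories is the same basic ingredient.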

  19. From big bang to big crunch and beyond

    International Nuclear Information System (INIS)

    Elitzur, Shmuel; Rabinovici, Eliezer; Giveon, Amit; Kutasov, David

    2002-01-01

    We study a quotient Conformal Field Theory, which describes a 3+1 dimensional cosmological spacetime. Part of this spacetime is the Nappi-Witten (NW) universe, which starts at a 'big bang' singularity, expands and then contracts to a 'big crunch' singularity at a finite time. The gauged WZW model contains a number of copies of the NW spacetime, with each copy connected to the preceding one and to the next one at the respective big bang/big crunch singularities. The sequence of NW spacetimes is further connected at the singularities to a series of non-compact static regions with closed timelike curves. These regions contain boundaries, on which the observables of the theory live. This suggests a holographic interpretation of the physics. (author)

  20. BIG data - BIG gains? Empirical evidence on the link between big data analytics and innovation

    OpenAIRE

    Niebel, Thomas; Rasel, Fabienne; Viete, Steffen

    2017-01-01

    This paper analyzes the relationship between firms’ use of big data analytics and their innovative performance in terms of product innovations. Since big data technologies provide new data information practices, they create novel decision-making possibilities, which are widely believed to support firms’ innovation process. Applying German firm-level data within a knowledge production function framework we find suggestive evidence that big data analytics is a relevant determinant for the likel...

  1. Two-Dimensional (2-D) Acoustic Fish Tracking at River Mile 85, Sacramento River, California

    Science.gov (United States)

    2013-06-01

    on fish become known (USACE 2004). Levee repair and constructed habitat features included (1) protection of the toe and upper slopes of the bank...be recovered rather than being lost due to sediment dunes, large woody material floating downstream, and vandalism. The RM 85 site was a relatively...into the river channel. The addition of this material narrowed the channel and created a scour feature along the toe of the repair site. VPS array

  2. 75 FR 5758 - Bridger-Teton National Forest, Big Piney Ranger District, WY; Piney Creeks Vegetation Treatment

    Science.gov (United States)

    2010-02-04

    ... analysis area is approximately 20,000 acres within this watershed and includes the creeks of South, Middle... and for further site specific analysis of effects. It is approximately 25 miles west of Big Piney, Wyoming in the Green River drainage, on the east slope of the Wyoming range. All lands within the analysis...

  3. Lost in a Transmedia Universe

    Directory of Open Access Journals (Sweden)

    Aaron Smith

    2011-12-01

    This article (previously published as a chapter of my thesis) analyzes the transmedia storytelling mechanisms behind the narrative of ABC's series Lost. Because its narrative rests on a complex mythology, Lost goes to great lengths to supplement the narrative of its TV program through valuable and distinctive narrative extensions. I first examine how Lost's world-building techniques encourage the most avid fans to "play" with the narrative space. I then assess the extensions that Lost offers as optional, compelling experiences within its expanded texts. By succeeding so well at balancing its most avid fans with casual viewers, Lost represents the future of many television programs that set out to place fans in immersive situations using a vast transmedia universe while still promising a coherent television program at its core.

  4. Benchmarking Big Data Systems and the BigData Top100 List.

    Science.gov (United States)

    Baru, Chaitanya; Bhandarkar, Milind; Nambiar, Raghunath; Poess, Meikel; Rabl, Tilmann

    2013-03-01

    "Big data" has become a major force of innovation across enterprises of all sizes. New platforms with increasingly more features for managing big datasets are being announced almost on a weekly basis. Yet, there is currently a lack of any means of comparability among such platforms. While the performance of traditional database systems is well understood and measured by long-established institutions such as the Transaction Processing Performance Council (TCP), there is neither a clear definition of the performance of big data systems nor a generally agreed upon metric for comparing these systems. In this article, we describe a community-based effort for defining a big data benchmark. Over the past year, a Big Data Benchmarking Community has become established in order to fill this void. The effort focuses on defining an end-to-end application-layer benchmark for measuring the performance of big data applications, with the ability to easily adapt the benchmark specification to evolving challenges in the big data space. This article describes the efforts that have been undertaken thus far toward the definition of a BigData Top100 List. While highlighting the major technical as well as organizational challenges, through this article, we also solicit community input into this process.

  5. Big data, big knowledge: big data for personalized healthcare.

    Science.gov (United States)

    Viceconti, Marco; Hunter, Peter; Hose, Rod

    2015-07-01

    The idea that the purely phenomenological knowledge that we can extract by analyzing large amounts of data can be useful in healthcare seems to contradict the desire of VPH researchers to build detailed mechanistic models for individual patients. But in practice no model is ever entirely phenomenological or entirely mechanistic. We propose in this position paper that big data analytics can be successfully combined with VPH technologies to produce robust and effective in silico medicine solutions. In order to do this, big data technologies must be further developed to cope with some specific requirements that emerge from this application. Such requirements are: working with sensitive data; analytics of complex and heterogeneous data spaces, including nontextual information; distributed data management under security and performance constraints; specialized analytics to integrate bioinformatics and systems biology information with clinical observations at tissue, organ and organisms scales; and specialized analytics to define the "physiological envelope" during the daily life of each patient. These domain-specific requirements suggest a need for targeted funding, in which big data technologies for in silico medicine becomes the research priority.

  6. BigDataBench: a Big Data Benchmark Suite from Internet Services

    OpenAIRE

    Wang, Lei; Zhan, Jianfeng; Luo, Chunjie; Zhu, Yuqing; Yang, Qiang; He, Yongqiang; Gao, Wanling; Jia, Zhen; Shi, Yingjie; Zhang, Shujie; Zheng, Chen; Lu, Gang; Zhan, Kent; Li, Xiaona; Qiu, Bizhu

    2014-01-01

    As architecture, systems, and data management communities pay greater attention to innovative big data systems and architectures, the pressure of benchmarking and evaluating these systems rises. Considering the broad use of big data systems, big data benchmarks must include diversity of data and workloads. Most of the state-of-the-art big data benchmarking efforts target evaluating specific types of applications or system software stacks, and hence they are not qualified for serving the purpo...

  7. Bedrock geologic map of the Spring Valley, West Plains, and parts of the Piedmont and Poplar Bluff 30'x60' quadrangles, Missouri, including the upper Current River and Eleven Point River drainage basins

    Science.gov (United States)

    Weary, David J.; Harrison, Richard W.; Orndorff, Randall C.; Weems, Robert E.; Schindler, J. Stephen; Repetski, John E.; Pierce, Herbert A.

    2015-01-01

    This map covers the drainage basins of the upper Current River and the Eleven Point River in the Ozark Plateaus physiographic province of southeastern Missouri. The two surface drainage basins are contiguous in their headwaters regions, but are separated in their lower reaches by the lower Black River basin in the southeast corner of the map area. Numerous dye-trace studies demonstrate that in the contiguous headwaters areas, groundwater flows from the Eleven Point River basin into the Current River basin. Much of the groundwater discharge of the Eleven Point River basin emanates from Big Spring, located on the Current River. This geologic map and cross sections were produced to help fulfill a need to understand the geologic framework of the region in which this subsurface flow occurs.

  8. Conociendo Big Data

    Directory of Open Access Journals (Sweden)

    Juan José Camargo-Vega

    2014-12-01

    Given the importance that the term Big Data has acquired, this research sought to study and exhaustively analyze the state of the art of Big Data; in addition, as a second objective, it analyzed the characteristics, tools, technologies, models, and standards related to Big Data, and finally it sought to identify the most relevant characteristics of Big Data management, in order to understand everything concerning the central topic of the research. The methodology included reviewing the state of the art of Big Data and presenting its current situation; surveying Big Data technologies; introducing some of the NoSQL databases, which are the ones that make it possible to process data in unstructured formats; and showing data models and the technologies for analyzing them, ending with some of the benefits of Big Data. The methodological design used for the research was non-experimental, since no variables are manipulated, and exploratory, because this research is a first step toward understanding the Big Data landscape.

  9. Big Canyon Creek Ecological Restoration Strategy.

    Energy Technology Data Exchange (ETDEWEB)

    Rasmussen, Lynn; Richardson, Shannon

    2007-10-01

    He-yey, Nez Perce for steelhead or rainbow trout (Oncorhynchus mykiss), are a culturally and ecologically significant resource within the Big Canyon Creek watershed; they are also part of the federally listed Snake River Basin Steelhead DPS. The majority of the Big Canyon Creek drainage is considered critical habitat for that DPS as well as for the federally listed Snake River fall chinook (Oncorhynchus tshawytscha) ESU. The Nez Perce Soil and Water Conservation District (District) and the Nez Perce Tribe Department of Fisheries Resources Management-Watershed (Tribe), in an effort to support the continued existence of these and other aquatic species, have developed this document to direct funding toward priority restoration projects in priority areas for the Big Canyon Creek watershed. In order to achieve this, the District and the Tribe: (1) Developed a working group and technical team composed of managers from a variety of stakeholders within the basin; (2) Established geographically distinct sub-watershed areas called Assessment Units (AUs); (3) Created a prioritization framework for the AUs and prioritized them; and (4) Developed treatment strategies to utilize within the prioritized AUs. Assessment Units were delineated by significant shifts in sampled juvenile O. mykiss (steelhead/rainbow trout) densities, which were found to fall at fish passage barriers. The prioritization framework considered four aspects critical to determining the relative importance of performing restoration in a certain area: density of critical fish species, physical condition of the AU, water quantity, and water quality. It was established, through vigorous data analysis within these four areas, that the geographic priority areas for restoration within the Big Canyon Creek watershed are Big Canyon Creek from stream km 45.5 to the headwaters, Little Canyon from km 15 to 30, the mainstem corridors of Big Canyon (mouth to 7km) and Little Canyon (mouth to 7km). The District and the Tribe

  10. BigDansing

    KAUST Repository

    Khayyat, Zuhair

    2015-06-02

    Data cleansing approaches have usually focused on detecting and fixing errors with little attention to scaling to big datasets. This presents a serious impediment since data cleansing often involves costly computations such as enumerating pairs of tuples, handling inequality joins, and dealing with user-defined functions. In this paper, we present BigDansing, a Big Data Cleansing system to tackle efficiency, scalability, and ease-of-use issues in data cleansing. The system can run on top of most common general purpose data processing platforms, ranging from DBMSs to MapReduce-like frameworks. A user-friendly programming interface allows users to express data quality rules both declaratively and procedurally, with no requirement of being aware of the underlying distributed platform. BigDansing translates these rules into a series of transformations that enable distributed computations and several optimizations, such as shared scans and specialized join operators. Experimental results on both synthetic and real datasets show that BigDansing outperforms existing baseline systems by up to more than two orders of magnitude without sacrificing the quality provided by the repair algorithms.
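
    To make the rule-based idea concrete, the toy example below expresses a single functional-dependency-style quality rule (same zip code implies same city) over a small pandas DataFrame and lists the violating row pairs. The data, column names, and pandas formulation are all hypothetical; this is not BigDansing's programming interface, only the kind of declarative check such a system would scale out.

        # Illustrative data-quality rule check (not the BigDansing API): flag pairs of
        # rows that share a zip code but disagree on city, i.e. violations of the
        # functional dependency zip -> city.
        import pandas as pd

        records = pd.DataFrame({
            "name": ["Ann", "Bob", "Cal", "Dee"],
            "zip":  ["83401", "83401", "97201", "97201"],
            "city": ["Idaho Falls", "Idaho Falls", "Portland", "Salem"],
        })

        pairs = records.merge(records, on="zip", suffixes=("_l", "_r"))
        mask = (pairs["name_l"] < pairs["name_r"]) & (pairs["city_l"] != pairs["city_r"])
        violations = pairs[mask]
        print(violations[["zip", "name_l", "city_l", "name_r", "city_r"]])

    The self-join over pairs of tuples is exactly the kind of computation that becomes costly at scale, which is the motivation for distributing and optimizing it.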

  11. Characterizing Big Data Management

    Directory of Open Access Journals (Sweden)

    Rogério Rossi

    2015-06-01

    Big data management is a reality for an increasing number of organizations in many areas and represents a set of challenges involving big data modeling, storage and retrieval, analysis and visualization. However, technological resources, people and processes are crucial to facilitate the management of big data in any kind of organization, allowing information and knowledge from a large volume of data to support decision-making. Big data management can be supported by these three dimensions: technology, people and processes. Hence, this article discusses these dimensions: the technological dimension, which is related to storage, analytics and visualization of big data; the human aspects of big data; and the process management dimension, which addresses big data management from both a technological and a business perspective.

  12. Big science

    CERN Multimedia

    Nadis, S

    2003-01-01

    " "Big science" is moving into astronomy, bringing large experimental teams, multi-year research projects, and big budgets. If this is the wave of the future, why are some astronomers bucking the trend?" (2 pages).

  13. Big bang and big crunch in matrix string theory

    OpenAIRE

    Bedford, J; Papageorgakis, C; Rodríguez-Gómez, D; Ward, J

    2007-01-01

    Following the holographic description of linear dilaton null Cosmologies with a Big Bang in terms of Matrix String Theory put forward by Craps, Sethi and Verlinde, we propose an extended background describing a Universe including both Big Bang and Big Crunch singularities. This belongs to a class of exact string backgrounds and is perturbative in the string coupling far away from the singularities, both of which can be resolved using Matrix String Theory. We provide a simple theory capable of...

  14. 21 CFR 1305.26 - Lost electronic orders.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 9 2010-04-01 2010-04-01 false Lost electronic orders. 1305.26 Section 1305.26... CONTROLLED SUBSTANCES Electronic Orders § 1305.26 Lost electronic orders. (a) If a purchaser determines that an unfilled electronic order has been lost before or after receipt, the purchaser must provide, to...

  15. Bliver big data til big business?

    DEFF Research Database (Denmark)

    Ritter, Thomas

    2015-01-01

    Denmark has a digital infrastructure, a culture of registration, and IT-competent employees and customers that make a leading position possible, but only if companies get ready for the next big data wave.

  16. Big data uncertainties.

    Science.gov (United States)

    Maugis, Pierre-André G

    2018-07-01

    Big data, the idea that an ever-larger volume of information is being constantly recorded, suggests that new problems can now be subjected to scientific scrutiny. However, can classical statistical methods be used directly on big data? We analyze the problem by looking at two known pitfalls of big datasets. First, that they are biased, in the sense that they do not offer a complete view of the populations under consideration. Second, that they present a weak but pervasive level of dependence between all their components. In both cases we observe that the uncertainty of the conclusion obtained by statistical methods is increased when used on big data, either because of a systematic error (bias), or because of a larger degree of randomness (increased variance). We argue that the key challenge raised by big data is not only how to use big data to tackle new problems, but to develop tools and methods able to rigorously articulate the new risks therein. Copyright © 2016. Published by Elsevier Ltd.
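
    Both pitfalls are easy to reproduce in a few lines of simulation. The sketch below uses entirely synthetic numbers (not from the paper): a large but non-representative sample produces a biased mean, and weak equicorrelation between records makes the naive i.i.d. standard error of the mean far too optimistic.

        # Simulation of the two pitfalls: sampling bias and dependence between records.
        import numpy as np

        rng = np.random.default_rng(42)
        population = rng.normal(loc=0.0, scale=1.0, size=1_000_000)

        # Pitfall 1: a "big" but biased sample (only values above -0.5 get recorded).
        biased_sample = population[population > -0.5][:100_000]
        print("true mean ~ 0.00, biased-sample mean =", round(biased_sample.mean(), 3))

        # Pitfall 2: weak but pervasive dependence inflates the variance of the mean.
        n, rho = 10_000, 0.1
        common = rng.standard_normal()                       # shared component
        dependent = np.sqrt(rho) * common + np.sqrt(1 - rho) * rng.standard_normal(n)
        naive_se = dependent.std(ddof=1) / np.sqrt(n)        # assumes i.i.d. records
        actual_se = np.sqrt((1 + (n - 1) * rho) / n)         # under equicorrelation rho
        print("naive SE =", round(naive_se, 4), " actual SE =", round(actual_se, 3))

    Even with a hundred thousand records the biased sample's mean is off by roughly half a standard deviation, and under equicorrelation 0.1 the ten thousand dependent records carry only about as much information on the mean as ten independent ones.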

  17. Flow intermittence and ecosystem services in rivers of the Anthropocene_Figure 4_Journal of Applied Ecology

    Data.gov (United States)

    U.S. Environmental Protection Agency — Counts of ecosystem service status (provided, altered, and lost/absent) during three hydrological phases (flowing, pool, dry) typically seen in intermittent rivers...

  18. Hybridization threatens shoal bass populations in the Upper Chattahoochee River Basin: Chapter 37

    Science.gov (United States)

    Dakin, Elizabeth E; Porter, Brady A.; Freeman, Byron J.; Long, James M.; Tringali, Michael D.; Long, James M.; Birdsong, Timothy W.; Allen, Micheal S.

    2015-01-01

    Shoal bass are native only to the Apalachicola-Chattahoochee-Flint river system of Georgia, Alabama, and Florida, and are vulnerable to extinction as a result of population fragmentation and introduction of non-native species. We assessed the genetic integrity of isolated populations of shoal bass in the upper Chattahoochee River basin (above Lake Lanier, Big Creek, and below Morgan Falls Dam) and sought to identify rates of hybridization with non-native, illegally stocked smallmouth bass and spotted bass.

  19. Cause-specific measures of life years lost

    Directory of Open Access Journals (Sweden)

    Per Kragh Andersen

    2013-12-01

    Background: A new measure of the number of life years lost due to specific causes of death is introduced. Methods: This measure is based on the cumulative incidence of death; it does not require "independence" of causes, and it satisfies simple balance equations: "total number of life years lost = sum of cause-specific life years lost", and "total number of life years lost before age x + temporary life expectancy between birth and age x = x". Results: The measure is contrasted to alternatives suggested in the demographic literature, and all methods are illustrated using Danish and Russian multiple decrement life-tables.
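
    Written out, the two balance equations quoted above take the following form; the notation (cumulative incidence F_k, survival S, temporary life expectancy e(0:x)) is standard but chosen here for illustration and may differ from the paper's own symbols.

        % Life years lost before age x, from the cumulative incidence F_k(a) of
        % death from cause k before age a (illustrative notation).
        \[
          L_k(x) = \int_0^x F_k(a)\,\mathrm{d}a,
          \qquad
          L(x) = \sum_k L_k(x) = \int_0^x F(a)\,\mathrm{d}a,
          \qquad
          F(a) = \sum_k F_k(a).
        \]
        % With survival S(a) = 1 - F(a) and temporary life expectancy
        % e(0:x) = \int_0^x S(a) da, the second balance equation follows:
        \[
          L(x) + e(0\!:\!x) = \int_0^x \bigl(F(a) + S(a)\bigr)\,\mathrm{d}a = x.
        \]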

  20. HARNESSING BIG DATA VOLUMES

    Directory of Open Access Journals (Sweden)

    Bogdan DINU

    2014-04-01

    Big Data can revolutionize humanity. Hidden within the huge amounts and variety of the data we are creating, we may find information, facts, social insights and benchmarks that were once virtually impossible to find or were simply nonexistent. Large volumes of data allow organizations to tap in real time the full potential of all the internal or external information they possess. Big data calls for quick decisions and innovative ways to assist customers and the society as a whole. Big data platforms and product portfolios will help customers harness the full value of big data volumes. This paper deals with technical and technological issues related to handling big data volumes in the Big Data environment.

  1. Big bang and big crunch in matrix string theory

    International Nuclear Information System (INIS)

    Bedford, J.; Ward, J.; Papageorgakis, C.; Rodriguez-Gomez, D.

    2007-01-01

    Following the holographic description of linear dilaton null cosmologies with a big bang in terms of matrix string theory put forward by Craps, Sethi, and Verlinde, we propose an extended background describing a universe including both big bang and big crunch singularities. This belongs to a class of exact string backgrounds and is perturbative in the string coupling far away from the singularities, both of which can be resolved using matrix string theory. We provide a simple theory capable of describing the complete evolution of this closed universe

  2. Big data a primer

    CERN Document Server

    Bhuyan, Prachet; Chenthati, Deepak

    2015-01-01

    This book is a collection of chapters written by experts on various aspects of big data. The book aims to explain what big data is and how it is stored and used. The book starts from  the fundamentals and builds up from there. It is intended to serve as a review of the state-of-the-practice in the field of big data handling. The traditional framework of relational databases can no longer provide appropriate solutions for handling big data and making it available and useful to users scattered around the globe. The study of big data covers a wide range of issues including management of heterogeneous data, big data frameworks, change management, finding patterns in data usage and evolution, data as a service, service-generated data, service management, privacy and security. All of these aspects are touched upon in this book. It also discusses big data applications in different domains. The book will prove useful to students, researchers, and practicing database and networking engineers.

  3. Spatial and temporal trends of freshwater mussel assemblages in the Meramec River Basin, Missouri, USA

    Science.gov (United States)

    Hinck, Jo Ellen; McMurray, Stephen E.; Roberts, Andrew D.; Barnhart, M. Christopher; Ingersoll, Christopher G.; Wang, Ning; Augspurger, Tom

    2012-01-01

    The Meramec River basin in east-central Missouri has one of the most diverse unionoid mussel faunas in the central United States with >40 species identified. Data were analyzed from historical surveys to test whether diversity and abundance of mussels in the Meramec River basin (Big, Bourbeuse, and Meramec rivers, representing >400 river miles) decreased between 1978 and 1997. We found that over 20 years, species richness and diversity decreased significantly in the Bourbeuse and Meramec rivers but not in the Big River. Most species were found at fewer sites and in lower numbers in 1997 than in 1978. Federally endangered species and Missouri Species of Conservation Concern with the most severe temporal declines were Alasmidonta viridis, Arcidens confragosus, Elliptio crassidens, Epioblasma triquetra, Fusconaia ebena, Lampsilis abrupta, Lampsilis brittsi, and Simpsonaias ambigua. Averaged across all species, mussels were generally being extirpated from historical sampling sites more rapidly than colonization was occurring. An exception was one reach of the Meramec River between river miles 28.4 and 59.5, where mussel abundance and diversity were greater than in other reaches and where colonization of Margaritiferidae, Lampsilini, and Quadrulini exceeded extirpation. The exact reasons mussel diversity and abundance have remained robust in this 30-mile reach are uncertain, but the reach is associated with increased gradients, few long pools, and vertical rock faces, all of which are preferable for mussels. Complete loss of mussel communities at eight sites (16%) with relatively diverse historical assemblages was attributed to physical habitat changes including bank erosion, unstable substrate, and sedimentation. Mussel conservation efforts, including restoring and protecting riparian habitats, limiting the effects of in-stream sand and gravel mining, monitoring and controlling invasive species, and protecting water quality, may be warranted in the Meramec River basin.

  4. Microsoft big data solutions

    CERN Document Server

    Jorgensen, Adam; Welch, John; Clark, Dan; Price, Christopher; Mitchell, Brian

    2014-01-01

    Tap the power of Big Data with Microsoft technologies Big Data is here, and Microsoft's new Big Data platform is a valuable tool to help your company get the very most out of it. This timely book shows you how to use HDInsight along with HortonWorks Data Platform for Windows to store, manage, analyze, and share Big Data throughout the enterprise. Focusing primarily on Microsoft and HortonWorks technologies but also covering open source tools, Microsoft Big Data Solutions explains best practices, covers on-premises and cloud-based solutions, and features valuable case studies. Best of all,

  5. Summary big data

    CERN Document Server

    2014-01-01

    This work offers a summary of the book "Big Data: A Revolution That Will Transform How We Live, Work, and Think" by Viktor Mayer-Schönberger and Kenneth Cukier. The summary explains that big data is where we use huge quantities of data to make better predictions based on identifying patterns in the data, rather than trying to understand the underlying causes in more detail. This summary highlights that big data will be a source of new economic value and innovation in the future. Moreover, it shows that it will

  6. Lost in Location

    DEFF Research Database (Denmark)

    Hansen, Lone Koefoed

    2009-01-01

    traversed. While becoming destination aware, the individual loses her location awareness. The article proposes that the reason people get lost when using sat-nav is due to a wrong location-performative paradigm. As an alternative, the article introduces and analyzes two performance-related examples...... that illustrate an alternative location-performative paradigm: Meredith Warner's Lost/Found knitting series and Etter and Schecht's Melodious Walkabout. In both examples, the artist's hand becomes the intermediary between alien and location. Thus, by exploring how wayfinding can be a poetically situated...... performance, the article examines how the growing locative media industry can learn from the location-aware performative strategies employed by artists who create situated and urban performances for the curious participant. The academic frames employed in the analysis draw on psychogeography, site...

  7. Large-scale river regulation

    International Nuclear Information System (INIS)

    Petts, G.

    1994-01-01

    Recent concern over human impacts on the environment has tended to focus on climatic change, desertification, destruction of tropical rain forests, and pollution. Yet large-scale water projects such as dams, reservoirs, and inter-basin transfers are among the most dramatic and extensive ways in which our environment has been, and continues to be, transformed by human action. Water running to the sea is perceived as a lost resource, floods are viewed as major hazards, and wetlands are seen as wastelands. River regulation, involving the redistribution of water in time and space, is a key concept in socio-economic development. To achieve water and food security, to develop drylands, and to prevent desertification and drought are primary aims for many countries. A second key concept is ecological sustainability. Yet the ecology of rivers and their floodplains is dependent on the natural hydrological regime, and its related biochemical and geomorphological dynamics. (Author)

  8. Big Data en surveillance, deel 1 : Definities en discussies omtrent Big Data

    NARCIS (Netherlands)

    Timan, Tjerk

    2016-01-01

    Following a (fairly short) lecture on surveillance and Big Data, I was asked to go somewhat deeper into the theme, the definitions, and the various questions related to big data. In this first part I will try to set out a few things concerning Big Data theory and

  9. Characterizing Big Data Management

    OpenAIRE

    Rogério Rossi; Kechi Hirama

    2015-01-01

    Big data management is a reality for an increasing number of organizations in many areas and represents a set of challenges involving big data modeling, storage and retrieval, analysis and visualization. However, technological resources, people and processes are crucial to facilitate the management of big data in any kind of organization, allowing information and knowledge from a large volume of data to support decision-making. Big data management can be supported by these three dimensions: t...

  10. Applications of digital image analysis capability in Idaho

    Science.gov (United States)

    Johnson, K. A.

    1981-01-01

    The use of digital image analysis of LANDSAT imagery in water resource assessment is discussed. The data processing systems employed are described. The determination of urban land use conversion of agricultural land in two southwestern Idaho counties involving estimation and mapping of crop types and of irrigated land is described. The system was also applied to an inventory of irrigated cropland in the Snake River basin and establishment of a digital irrigation water source/service area data base for the basin. Application of the system to a determination of irrigation development in the Big Lost River basin as part of a hydrologic survey of the basin is also described.

  11. 28 CFR 301.204 - Continuation of lost-time wages.

    Science.gov (United States)

    2010-07-01

    ... 28 Judicial Administration 2 2010-07-01 2010-07-01 false Continuation of lost-time wages. 301.204... ACCIDENT COMPENSATION Lost-Time Wages § 301.204 Continuation of lost-time wages. (a) Once approved, the inmate shall receive lost-time wages until the inmate: (1) Is released; (2) Is transferred to another...

  12. Fall Chinook Salmon Survival and Supplementation Studies in the Snake River Reservoirs, 1996 Annual Report.

    Energy Technology Data Exchange (ETDEWEB)

    Williams, John G.; Bjornn (Bjomn), Theodore C.

    1998-05-01

    In 1996, the National Marine Fisheries Service, the Nez Perce Tribe, and the U.S. Fish and Wildlife Service completed the second year of cooperative research to investigate migrational characteristics of subyearling fall chinook salmon in the Snake River Basin. In spring and early summer, we captured natural subyearling fall chinook salmon by beach seine, PIT tagged them, and released them in two reaches of the Snake River. Also, subyearling fall chinook salmon reared at Lyons Ferry Hatchery were PIT tagged at the hatchery, transported, and released weekly at Pittsburg Landing on the Snake River and Big Canyon Creek on the Clearwater River to collect data on survival, detection probabilities, and travel time.

  13. Monitoring and Evaluation of Environmental Flow Prescriptions for Five Demonstration Sites of the Sustainable Rivers Project

    Science.gov (United States)

    Konrad, Christopher P.

    2010-01-01

    The Nature Conservancy has been working with U.S. Army Corps of Engineers (Corps) through the Sustainable Rivers Project (SRP) to modify operations of dams to achieve ecological objectives in addition to meeting the authorized purposes of the dams. Modifications to dam operations are specified in terms of environmental flow prescriptions that quantify the magnitude, duration, frequency, and seasonal timing of releases to achieve specific ecological outcomes. Outcomes of environmental flow prescriptions implemented from 2002 to 2008 have been monitored and evaluated at demonstration sites in five rivers: Green River, Kentucky; Savannah River, Georgia/South Carolina; Bill Williams River, Arizona; Big Cypress Creek, Texas; and Middle Fork Willamette River, Oregon. Monitoring and evaluation have been accomplished through collaborative partnerships of federal and state agencies, universities, and nongovernmental organizations.
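
    An environmental flow prescription, as characterized above, bundles the magnitude, duration, frequency and seasonal timing of releases. The sketch below is only an illustration of how such a target could be represented and checked against one season of daily releases; the class fields, numbers and the prescription_met helper are hypothetical and not taken from the Sustainable Rivers Project.

```python
from dataclasses import dataclass

@dataclass
class FlowPrescription:
    """Illustrative container for one environmental flow target.
    All names and numbers used with it below are hypothetical."""
    name: str
    season: str                # e.g. "Apr 1 - Jun 15"; the check below assumes the
                               # daily series passed in already covers this window
    min_magnitude_cfs: float   # release magnitude to sustain
    min_duration_days: int     # how long it must be sustained

def longest_run_at_or_above(daily_flows, threshold):
    """Length of the longest consecutive run of daily flows >= threshold."""
    best = run = 0
    for q in daily_flows:
        run = run + 1 if q >= threshold else 0
        best = max(best, run)
    return best

def prescription_met(prescription, daily_flows):
    """True if the releases sustain the target magnitude for the target duration."""
    return (longest_run_at_or_above(daily_flows, prescription.min_magnitude_cfs)
            >= prescription.min_duration_days)

if __name__ == "__main__":
    spring_pulse = FlowPrescription(
        name="spring pulse (hypothetical)",
        season="Apr 1 - Jun 15",
        min_magnitude_cfs=3000.0,
        min_duration_days=7,
    )
    releases = [2500.0] * 10 + [3200.0] * 8 + [2100.0] * 5   # one season of daily releases
    print(prescription_met(spring_pulse, releases))           # True: an 8-day pulse >= 7 days
```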

  14. Big Data in der Cloud

    DEFF Research Database (Denmark)

    Leimbach, Timo; Bachlechner, Daniel

    2014-01-01

    Technology assessment of big data, in particular cloud based big data services, for the Office for Technology Assessment at the German federal parliament (Bundestag).

  15. An analysis of cross-sectional differences in big and non-big public accounting firms' audit programs

    NARCIS (Netherlands)

    Blokdijk, J.H. (Hans); Drieenhuizen, F.; Stein, M.T.; Simunic, D.A.

    2006-01-01

    A significant body of prior research has shown that audits by the Big 5 (now Big 4) public accounting firms are quality differentiated relative to non-Big 5 audits. This result can be derived analytically by assuming that Big 5 and non-Big 5 firms face different loss functions for "audit failures"

  16. Big Data is invading big places as CERN

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    Big Data technologies are becoming more popular with the constant growth of data generation in different fields such as social networks, the internet of things and laboratories like CERN. How is CERN making use of such technologies? How is machine learning applied at CERN with Big Data technologies? How much data do we move and how is it analyzed? All these questions will be answered during the talk.

  17. The big bang

    International Nuclear Information System (INIS)

    Chown, Marcus.

    1987-01-01

    The paper concerns the 'Big Bang' theory of the creation of the Universe 15 thousand million years ago, and traces events which physicists predict occurred soon after the creation. The unified theory of the moment of creation, evidence of an expanding Universe, the X-boson (the particle produced very soon after the big bang, which vanished from the Universe one-hundredth of a second after the big bang), and the fate of the Universe are all discussed. (U.K.)

  18. 28 CFR 301.203 - Payment of lost-time wages.

    Science.gov (United States)

    2010-07-01

    ... 28 Judicial Administration 2 2010-07-01 2010-07-01 false Payment of lost-time wages. 301.203... ACCIDENT COMPENSATION Lost-Time Wages § 301.203 Payment of lost-time wages. (a) An inmate worker may receive lost-time wages for the number of regular work hours absent from work due to injury sustained in...

  19. 50 CFR 25.22 - Lost and found articles.

    Science.gov (United States)

    2010-10-01

    ... 50 Wildlife and Fisheries 6 2010-10-01 2010-10-01 false Lost and found articles. 25.22 Section 25.22 Wildlife and Fisheries UNITED STATES FISH AND WILDLIFE SERVICE, DEPARTMENT OF THE INTERIOR... Lost and found articles. Lost articles or money found on a national wildlife refuge are to be...

  20. Small Big Data Congress 2017

    NARCIS (Netherlands)

    Doorn, J.

    2017-01-01

    TNO, in collaboration with the Big Data Value Center, presents the fourth Small Big Data Congress! Our congress aims at providing an overview of practical and innovative applications based on big data. Do you want to know what is happening in applied research with big data? And what can already be

  1. Big data opportunities and challenges

    CERN Document Server

    2014-01-01

    This ebook aims to give practical guidance for all those who want to understand big data better and learn how to make the most of it. Topics range from big data analysis, mobile big data and managing unstructured data to technologies, governance and intellectual property and security issues surrounding big data.

  2. Big Data and Neuroimaging.

    Science.gov (United States)

    Webb-Vargas, Yenny; Chen, Shaojie; Fisher, Aaron; Mejia, Amanda; Xu, Yuting; Crainiceanu, Ciprian; Caffo, Brian; Lindquist, Martin A

    2017-12-01

    Big Data are of increasing importance in a variety of areas, especially in the biosciences. There is an emerging critical need for Big Data tools and methods, because of the potential impact of advancements in these areas. Importantly, statisticians and statistical thinking have a major role to play in creating meaningful progress in this arena. We would like to emphasize this point in this special issue, as it highlights both the dramatic need for statistical input for Big Data analysis and for a greater number of statisticians working on Big Data problems. We use the field of statistical neuroimaging to demonstrate these points. As such, this paper covers several applications and novel methodological developments of Big Data tools applied to neuroimaging data.

  3. Big Data; A Management Revolution : The emerging role of big data in businesses

    OpenAIRE

    Blasiak, Kevin

    2014-01-01

    Big data is a term that was coined in 2012 and has since emerged as one of the top trends in business and technology. Big data is an agglomeration of different technologies resulting in data processing capabilities that were previously out of reach. Big data is generally characterized by factors such as volume, velocity and variety, which distinguish it from traditional data use. The possibilities for utilizing this technology are vast. Big data technology has touch points in differ...

  4. Chemical weathering as a mechanism for the climatic control of bedrock river incision

    Science.gov (United States)

    Murphy, Brendan P.; Johnson, Joel P. L.; Gasparini, Nicole M.; Sklar, Leonard S.

    2016-04-01

    Feedbacks between climate, erosion and tectonics influence the rates of chemical weathering reactions, which can consume atmospheric CO2 and modulate global climate. However, quantitative predictions for the coupling of these feedbacks are limited because the specific mechanisms by which climate controls erosion are poorly understood. Here we show that climate-dependent chemical weathering controls the erodibility of bedrock-floored rivers across a rainfall gradient on the Big Island of Hawai‘i. Field data demonstrate that the physical strength of bedrock in streambeds varies with the degree of chemical weathering, which increases systematically with local rainfall rate. We find that incorporating the quantified relationships between local rainfall and erodibility into a commonly used river incision model is necessary to predict the rates and patterns of downcutting of these rivers. In contrast to using only precipitation-dependent river discharge to explain the climatic control of bedrock river incision, the mechanism of chemical weathering can explain strong coupling between local climate and river incision.
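
    The "commonly used river incision model" mentioned above is not named in this record; a reasonable reading (an assumption on our part, not a quotation from the paper) is the detachment-limited stream-power law, with the study's contribution entering through a rainfall-dependent erodibility term, roughly:

```latex
% Assumed form of the stream-power incision model with rainfall-dependent
% erodibility; symbols are standard but are not quoted from the paper itself.
%   E    : vertical bedrock incision rate
%   K(P) : erodibility coefficient, here a function of local rainfall rate P
%   A    : upstream drainage area (a proxy for discharge)
%   S    : local channel slope
%   m, n : empirical exponents
\[
  E = K(P)\,A^{m}S^{n}, \qquad K(P) = K_{0}\,f(P),
\]
% where f(P) increases with rainfall rate, encoding the weathering-driven loss
% of rock strength that the field data above tie to wetter sites.
```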

  5. As long as the rivers flow: Athabasca River knowledge, use and change

    International Nuclear Information System (INIS)

    Candler, C.; Olson, R.; Deroy, S.

    2010-11-01

    This document is a report supported by specific information gathered by the Athabasca Chipewyan First Nation (ACFN) and the Mikisew Cree First Nation (MCFN), and forms part of an Athabasca River Use and Traditional Ecological Knowledge (TEK) study conducted in 2010. The main objective was to provide a written, evidence-based submission in order to effectively notify the Crown about plans for managing industrial water withdrawals from the lower Athabasca River. The First Nations used the same methods, wrote their community reports as distinct stand-alone documents, and chose to present the ACFN and MCFN data in parallel with each other within the same document. The study provides information on the knowledge and uses of the Athabasca River by community members. Context and background for the study can be found in Part A. It comprises a short discussion of Treaty No. 8 of 1899, which confirms the rights of First Nation people. The importance of boat transportation for community members is mentioned, and a summary of the methods is given. The results of the ACFN and MCFN studies are given in Parts B and C. The reduction in the quantity and quality of the river's water has affected the practice of ACFN and MCFN aboriginal and treaty rights. Community perceptions of changes to the river, and how these have influenced their lifestyle, are discussed. Some uses of the Athabasca River have been lost because of concerns regarding contamination associated with oil sands operations. The last part of the document provides an analysis of results and suggests two thresholds that define the ability of ACFN and MCFN members to practice their rights and access their territories. The document ends with recommendations for implementation of these thresholds. 22 refs., 12 maps.

  6. As long as the rivers flow: Athabasca River knowledge, use and change

    Energy Technology Data Exchange (ETDEWEB)

    Candler, C.; Olson, R.; Deroy, S. [Firelight Group Research Cooperative, Victoria, BC (Canada)

    2010-11-15

    This document is a report supported by specific information gathered by the Athabasca Chipewyan First Nation (ACFN) and the Mikisew Cree First Nation (MCFN), and forms part of an Athabasca River Use and Traditional Ecological Knowledge (TEK) study conducted in 2010. The main objective was to provide a written, evidence-based submission in order to effectively notify the Crown about plans for managing industrial water withdrawals from the lower Athabasca River. The First Nations used the same methods, wrote their community reports as distinct stand-alone documents, and chose to present the ACFN and MCFN data in parallel with each other within the same document. The study provides information on the knowledge and uses of the Athabasca River by community members. Context and background for the study can be found in Part A. It comprises a short discussion of Treaty No. 8 of 1899, which confirms the rights of First Nation people. The importance of boat transportation for community members is mentioned, and a summary of the methods is given. The results of the ACFN and MCFN studies are given in Parts B and C. The reduction in the quantity and quality of the river's water has affected the practice of ACFN and MCFN aboriginal and treaty rights. Community perceptions of changes to the river, and how these have influenced their lifestyle, are discussed. Some uses of the Athabasca River have been lost because of concerns regarding contamination associated with oil sands operations. The last part of the document provides an analysis of results and suggests two thresholds that define the ability of ACFN and MCFN members to practice their rights and access their territories. The document ends with recommendations for implementation of these thresholds. 22 refs., 12 maps.

  7. Social big data mining

    CERN Document Server

    Ishikawa, Hiroshi

    2015-01-01

    Social Media. Big Data and Social Data. Hypotheses in the Era of Big Data. Social Big Data Applications. Basic Concepts in Data Mining. Association Rule Mining. Clustering. Classification. Prediction. Web Structure Mining. Web Content Mining. Web Access Log Mining, Information Extraction and Deep Web Mining. Media Mining. Scalability and Outlier Detection.

  8. Ground-Water System in the Chimacum Creek Basin and Surface Water/Ground Water Interaction in Chimacum and Tarboo Creeks and the Big and Little Quilcene Rivers, Eastern Jefferson County, Washington

    Science.gov (United States)

    Simonds, F. William; Longpre, Claire I.; Justin, Greg B.

    2004-01-01

    throughout most of the year and the lower reaches have little or no gains. The Big Quilcene River generally gains water from the shallow ground-water system after it emerges from a bedrock canyon and loses water from the town of Quilcene to the mouth of the river in Quilcene Bay. The Little Quilcene River generally loses water to the shallow ground-water system, although two localized areas were found to have gaining conditions. The Big Quilcene and Little Quilcene Rivers incur significant losses on the alluvial plain at the head of Quilcene Bay. Each of the creeks examined had a unique pattern of gaining and losing reaches, owing to the hydraulic conductivity of the streambed material and the relative altitude of the surrounding water table. Although the magnitudes of gains and losses varied seasonally, the spatial distribution did not vary greatly, suggesting that patterns of gains and losses in surface-water systems depend greatly on the geology underlying the streambed.

  9. Cryptography for Big Data Security

    Science.gov (United States)

    2015-07-13

    Cryptography for Big Data Security. Book chapter prepared for Big Data: Storage, Sharing, and Security (3S). Distribution A: Public Release. Authors include Ariel Hamlin and others (contact: arkady@ll.mit.edu).

  10. Data: Big and Small.

    Science.gov (United States)

    Jones-Schenk, Jan

    2017-02-01

    Big data is a big topic in all leadership circles. Leaders in professional development must develop an understanding of what data are available across the organization that can inform effective planning for forecasting. Collaborating with others to integrate data sets can increase the power of prediction. Big data alone is insufficient to make big decisions. Leaders must find ways to access small data and triangulate multiple types of data to ensure the best decision making. J Contin Educ Nurs. 2017;48(2):60-61. Copyright 2017, SLACK Incorporated.

  11. Big Data Revisited

    DEFF Research Database (Denmark)

    Kallinikos, Jannis; Constantiou, Ioanna

    2015-01-01

    We elaborate on key issues of our paper New games, new rules: big data and the changing context of strategy as a means of addressing some of the concerns raised by the paper's commentators. We initially deal with the issue of social data and the role it plays in the current data revolution and the technological recording of facts. We further discuss the significance of the very mechanisms by which big data is produced as distinct from the very attributes of big data, often discussed in the literature. In the final section of the paper, we qualify the alleged importance of algorithms and claim that the structures of data capture and the architectures in which data generation is embedded are fundamental to the phenomenon of big data.

  12. Big Data in industry

    Science.gov (United States)

    Latinović, T. S.; Preradović, D. M.; Barz, C. R.; Latinović, M. T.; Petrica, P. P.; Pop-Vadean, A.

    2016-08-01

    The amount of data at the global level has grown exponentially. Along with this phenomenon comes the need for new units of measure, such as the exabyte, zettabyte and yottabyte, to express the amount of data. The growth of data creates a situation in which the classic systems for the collection, storage, processing and visualization of data are losing the battle against the large amount, speed and variety of data that is generated continuously. Much of this data is created by the Internet of Things (IoT): cameras, satellites, cars, GPS navigation, etc. It is our challenge to come up with new technologies and tools for the management and exploitation of these large amounts of data. Big Data has been a hot topic in IT circles in recent years, and it is increasingly recognized in the business world and in public administration. This paper proposes an ontology of big data analytics and examines how to enhance business intelligence through big data analytics as a service by presenting a big data analytics service-oriented architecture. The paper also discusses the interrelationship between business intelligence and big data analytics. The proposed approach might facilitate the research and development of business analytics, big data analytics, and business intelligence, as well as intelligent agents.

  13. Big Data Analytics An Overview

    Directory of Open Access Journals (Sweden)

    Jayshree Dwivedi

    2015-08-01

    Full Text Available Data that exceeds conventional storage capacity and processing power is called big data. The term is used for data sets so large or complex that traditional data tools cannot handle them. Big data size is a constantly moving target, ranging year by year from a few dozen terabytes to many petabytes; on social networking sites, for example, the amount of data produced by people grows rapidly every year. Big data is not only data; it has become a complete subject that includes various tools, techniques and frameworks, and it covers the explosive growth of both structured and unstructured data. Big data is a set of techniques and technologies that require new forms of integration to uncover large hidden values from datasets that are diverse, complex and of massive scale. Such data is difficult to work with using most relational database management systems and desktop statistics and visualization packages, and instead requires massively parallel software running on tens, hundreds or even thousands of servers. Big data environments are used to acquire, organize and analyze these various types of data. In this paper we describe applications, problems and tools of big data and give an overview of the field.

  14. Urbanising Big

    DEFF Research Database (Denmark)

    Ljungwall, Christer

    2013-01-01

    Development in China raises the question of how big a city can become, and at the same time be sustainable, writes Christer Ljungwall of the Swedish Agency for Growth Policy Analysis.

  15. Big bang nucleosynthesis

    International Nuclear Information System (INIS)

    Boyd, Richard N.

    2001-01-01

    The precision of measurements in modern cosmology has made huge strides in recent years, with measurements of the cosmic microwave background and the determination of the Hubble constant now rivaling the level of precision of the predictions of big bang nucleosynthesis. However, these results are not necessarily consistent with the predictions of the Standard Model of big bang nucleosynthesis. Reconciling these discrepancies may require extensions of the basic tenets of the model, and possibly of the reaction rates that determine the big bang abundances

  16. The ethics of big data in big agriculture

    OpenAIRE

    Carbonell (Isabelle M.)

    2016-01-01

    This paper examines the ethics of big data in agriculture, focusing on the power asymmetry between farmers and large agribusinesses like Monsanto. Following the recent purchase of Climate Corp., Monsanto is currently the most prominent biotech agribusiness to buy into big data. With wireless sensors on tractors monitoring or dictating every decision a farmer makes, Monsanto can now aggregate large quantities of previously proprietary farming data, enabling a privileged position with unique in...

  17. The big data-big model (BDBM) challenges in ecological research

    Science.gov (United States)

    Luo, Y.

    2015-12-01

    The field of ecology has become a big-data science in the past decades due to the development of new sensors used in numerous studies in the ecological community. Many sensor networks have been established to collect data. For example, satellites, such as Terra and OCO-2 among others, have collected data relevant to the global carbon cycle. Thousands of field manipulative experiments have been conducted to examine feedbacks of the terrestrial carbon cycle to global changes. Networks of observations, such as FLUXNET, have measured land processes. In particular, the implementation of the National Ecological Observatory Network (NEON), which is designed to network different kinds of sensors at many locations over the nation, will generate large volumes of ecological data every day. The raw data from the sensors in those networks offer an unprecedented opportunity for accelerating advances in our knowledge of ecological processes, educating teachers and students, supporting decision-making, testing ecological theory, and forecasting changes in ecosystem services. Currently, ecologists do not have the infrastructure in place to synthesize massive yet heterogeneous data into resources for decision support. It is urgent to develop an ecological forecasting system that can make the best use of multiple sources of data to assess long-term biosphere change and anticipate future states of ecosystem services at regional and continental scales. Forecasting relies on big models that describe the major processes underlying complex system dynamics. Ecological system models, despite great simplification of the real systems, are still complex in order to address real-world problems. For example, the Community Land Model (CLM) incorporates thousands of processes related to energy balance, hydrology, and biogeochemistry. Integration of massive data from multiple big data sources with complex models has to tackle Big Data-Big Model (BDBM) challenges. Those challenges include interoperability of multiple

  18. A Big Video Manifesto

    DEFF Research Database (Denmark)

    Mcilvenny, Paul Bruce; Davidsen, Jacob

    2017-01-01

    For the last few years, we have witnessed a hype about the potential results and insights that quantitative big data can bring to the social sciences. The wonder of big data has moved into education, traffic planning, and disease control with a promise of making things better with big numbers and beautiful visualisations. However, we also need to ask what the tools of big data can do both for the Humanities and for more interpretative approaches and methods. Thus, we prefer to explore how the power of computation, new sensor technologies and massive storage can also help with video-based qualitative research.

  19. Identifying Dwarfs Workloads in Big Data Analytics

    OpenAIRE

    Gao, Wanling; Luo, Chunjie; Zhan, Jianfeng; Ye, Hainan; He, Xiwen; Wang, Lei; Zhu, Yuqing; Tian, Xinhui

    2015-01-01

    Big data benchmarking is particularly important and provides applicable yardsticks for evaluating booming big data systems. However, the wide coverage and great complexity of big data computing impose big challenges on big data benchmarking. How can we construct a benchmark suite using a minimum set of units of computation to represent the diversity of big data analytics workloads? Big data dwarfs are abstractions that capture frequently appearing operations in big data computing. One dwarf represen...

  20. Assessment of water quality for the determination of extent of pollution in Malir river

    International Nuclear Information System (INIS)

    Bano, F.; Rizvi, S.N.; Farooq, S.

    2009-01-01

    Karachi is the most industrially developed and populous city of Pakistan. A big part of its basin is occupied by the alluvium of the Malir River, which is basically a seasonal river but becomes perennial within the limits of Karachi due to the continuous flow of untreated sewage and industrial effluents through its basin into the Arabian Sea. The data obtained during this study show that the most downstream parts of the river are grossly polluted due to the inclusion of sewage and industrial wastes. The present data show that pollution has not only deteriorated the pristine conditions of this river but is also causing pollution in the Arabian Sea, where the river finally falls. The data show an increasing trend in nutrient concentrations and turbidity from 1994 to 1996. This study provides baseline data and reflects the quality of water in the Malir River in the mid-1990s. These data can be used to study the extent of pollution in the Malir River by comparing them to recent data (if available) on the river. (author)

  1. Applications of Big Data in Education

    OpenAIRE

    Faisal Kalota

    2015-01-01

    Big Data and analytics have gained huge momentum in recent years. Big Data feeds into the field of Learning Analytics (LA), which may allow academic institutions to better understand learners' needs and proactively address them. Hence, it is important to have an understanding of Big Data and its applications. The purpose of this descriptive paper is to provide an overview of Big Data, the technologies used in Big Data, and some of the applications of Big Data in educa...

  2. Big Data Semantics

    NARCIS (Netherlands)

    Ceravolo, Paolo; Azzini, Antonia; Angelini, Marco; Catarci, Tiziana; Cudré-Mauroux, Philippe; Damiani, Ernesto; Mazak, Alexandra; van Keulen, Maurice; Jarrar, Mustafa; Santucci, Giuseppe; Sattler, Kai-Uwe; Scannapieco, Monica; Wimmer, Manuel; Wrembel, Robert; Zaraket, Fadi

    2018-01-01

    Big Data technology has discarded traditional data modeling approaches as no longer applicable to distributed data processing. It is, however, largely recognized that Big Data impose novel challenges in data and infrastructure management. Indeed, multiple components and procedures must be

  3. Restoring Anadromous Fish Habitat in Big Canyon Creek Watershed, 2004-2005 Annual Report.

    Energy Technology Data Exchange (ETDEWEB)

    Rasmussen, Lynn (Nez Perce Soil and Conservation District, Lewiston, ID)

    2006-07-01

    "Restoring Anadromous Fish Habitat in the Big Canyon Creek Watershed" is a multi-phase project to enhance steelhead trout in the Big Canyon Creek watershed by improving salmonid spawning and rearing habitat. Habitat is limited by extreme high runoff events, low summer flows, high water temperatures, poor instream cover, spawning gravel siltation, and sediment, nutrient and bacteria loading. Funded by the Bonneville Power Administration (BPA) as part of the Northwest Power Planning Council's Fish and Wildlife Program, the project assists in mitigating damage to steelhead runs caused by the Columbia River hydroelectric dams. The project is sponsored by the Nez Perce Soil and Water Conservation District. Target fish species include steelhead trout (Oncorhynchus mykiss). Steelhead trout within the Snake River Basin were listed in 1997 as threatened under the Endangered Species Act. Accomplishments for the contract period September 1, 2004 through October 31, 2005 include: 2.7 riparian miles treated, 3.0 wetland acres treated, 5,263.3 upland acres treated, 106.5 riparian acres treated, 76,285 general public reached, 3,000 students reached, 40 teachers reached, 18 maintenance plans completed, temperature data collected at 6 sites, 8 landowner applications received and processed, 14 land inventories completed, 58 habitat improvement project designs completed, 5 newsletters published, 6 habitat plans completed, 34 projects installed, 2 educational workshops, 6 displays, 1 television segment, 2 public service announcements, a noxious weed GIS coverage, and completion of NEPA, ESA, and cultural resources requirements.

  4. Comparative validity of brief to medium-length Big Five and Big Six personality questionnaires

    NARCIS (Netherlands)

    Thalmayer, A.G.; Saucier, G.; Eigenhuis, A.

    2011-01-01

    A general consensus on the Big Five model of personality attributes has been highly generative for the field of personality psychology. Many important psychological and life outcome correlates with Big Five trait dimensions have been established. But researchers must choose between multiple Big Five

  5. The Portland Basin: A (big) river runs through it

    Science.gov (United States)

    Evarts, Russell C.; O'Connor, Jim E.; Wells, Ray E.; Madin, Ian P.

    2009-01-01

    Metropolitan Portland, Oregon, USA, lies within a small Neogene to Holocene basin in the forearc of the Cascadia subduction system. Although the basin owes its existence and structural development to its convergent-margin tectonic setting, the stratigraphic architecture of basin-fill deposits chiefly reflects its physiographic position along the lower reaches of the continental-scale Columbia River system. As a result of this globally unique setting, the basin preserves a complex record of aggradation and incision in response to distant as well as local tectonic, volcanic, and climatic events. Voluminous flood basalts, continental and locally derived sediment and volcanic debris, and catastrophic flood deposits all accumulated in an area influenced by contemporaneous tectonic deformation and variations in regional and local base level.

  6. The Lost Guidewire

    Directory of Open Access Journals (Sweden)

    Ankit Shah

    2017-04-01

    Full Text Available History of present illness: A 44-year-old female called 911 complaining of abdominal pain, but was unresponsive upon arrival by emergency medical services (EMS). She presented to the emergency department (ED) as a full cardiac arrest and had return of spontaneous circulation (ROSC) with cardiopulmonary resuscitation (CPR) and epinephrine. The patient had a splenic embolization 1 week prior to presentation. Bedside ultrasound demonstrated free fluid throughout the abdomen. As part of the resuscitation, femoral central venous access was obtained by the Emergency Department (ED) physician, and a medical student was allowed to place a Cordis over the guidewire. The attending was next to the student, though became distracted when the patient again lost pulses. The student lost control of the guidewire upon re-initiation of CPR. Another Cordis was placed in the same location by the ED physician after the guidewire was seen on a chest radiograph. The patient was taken to the operating room with massive transfusion protocol, and the guidewire was left in the vena caval system until the patient could be stabilized. Two days later, interventional radiology removed the guidewire via a right internal jugular (IJ) approach without complications. The patient had a prolonged and complicated course, but was discharged home two weeks later at her baseline. Significant findings: The initial chest radiograph shows a guidewire in the inferior vena cava (IVC), superior vena cava (SVC), and right IJ veins. Discussion: Central line complications include failure to place the catheter, improper catheter location, hemothorax from vascular injury, infection, arrhythmia, and cardiac arrest [1]. Complications from lost guidewires include cardiac dysrhythmias, cardiac conduction abnormalities, perforation of vessels/heart chambers, kinking/looping/knotting of the wire, entanglement of previously placed intravascular devices, breakage of the tip of the wire and subsequent embolization and

  7. Big data need big theory too.

    Science.gov (United States)

    Coveney, Peter V; Dougherty, Edward R; Highfield, Roger R

    2016-11-13

    The current interest in big data, machine learning and data analytics has generated the widespread impression that such methods are capable of solving most problems without the need for conventional scientific methods of inquiry. Interest in these methods is intensifying, accelerated by the ease with which digitized data can be acquired in virtually all fields of endeavour, from science, healthcare and cybersecurity to economics, social sciences and the humanities. In multiscale modelling, machine learning appears to provide a shortcut to reveal correlations of arbitrary complexity between processes at the atomic, molecular, meso- and macroscales. Here, we point out the weaknesses of pure big data approaches with particular focus on biology and medicine, which fail to provide conceptual accounts for the processes to which they are applied. No matter their 'depth' and the sophistication of data-driven methods, such as artificial neural nets, in the end they merely fit curves to existing data. Not only do these methods invariably require far larger quantities of data than anticipated by big data aficionados in order to produce statistically reliable results, but they can also fail in circumstances beyond the range of the data used to train them because they are not designed to model the structural characteristics of the underlying system. We argue that it is vital to use theory as a guide to experimental design for maximal efficiency of data collection and to produce reliable predictive models and conceptual knowledge. Rather than continuing to fund, pursue and promote 'blind' big data projects with massive budgets, we call for more funding to be allocated to the elucidation of the multiscale and stochastic processes controlling the behaviour of complex systems, including those of life, medicine and healthcare.This article is part of the themed issue 'Multiscale modelling at the physics-chemistry-biology interface'. © 2015 The Authors.

  8. Big Data and medicine: a big deal?

    Science.gov (United States)

    Mayer-Schönberger, V; Ingelsson, E

    2018-05-01

    Big Data promises huge benefits for medical research. Looking beyond superficial increases in the amount of data collected, we identify three key areas where Big Data differs from conventional analyses of data samples: (i) data are captured more comprehensively relative to the phenomenon under study; this reduces some bias but surfaces important trade-offs, such as between data quantity and data quality; (ii) data are often analysed using machine learning tools, such as neural networks rather than conventional statistical methods resulting in systems that over time capture insights implicit in data, but remain black boxes, rarely revealing causal connections; and (iii) the purpose of the analyses of data is no longer simply answering existing questions, but hinting at novel ones and generating promising new hypotheses. As a consequence, when performed right, Big Data analyses can accelerate research. Because Big Data approaches differ so fundamentally from small data ones, research structures, processes and mindsets need to adjust. The latent value of data is being reaped through repeated reuse of data, which runs counter to existing practices not only regarding data privacy, but data management more generally. Consequently, we suggest a number of adjustments such as boards reviewing responsible data use, and incentives to facilitate comprehensive data sharing. As data's role changes to a resource of insight, we also need to acknowledge the importance of collecting and making data available as a crucial part of our research endeavours, and reassess our formal processes from career advancement to treatment approval. © 2017 The Association for the Publication of the Journal of Internal Medicine.

  9. Assessing Big Data

    DEFF Research Database (Denmark)

    Leimbach, Timo; Bachlechner, Daniel

    2015-01-01

    In recent years, big data has been one of the most controversially discussed technologies in terms of its possible positive and negative impact. Therefore, the need for technology assessments is obvious. This paper first provides, based on the results of a technology assessment study, an overview of the potential and challenges associated with big data and then describes the problems experienced during the study as well as methods found helpful to address them. The paper concludes with reflections on how the insights from the technology assessment study may have an impact on the future governance of big data.

  10. Big data, big responsibilities

    Directory of Open Access Journals (Sweden)

    Primavera De Filippi

    2014-01-01

    Full Text Available Big data refers to the collection and aggregation of large quantities of data produced by and about people, things or the interactions between them. With the advent of cloud computing, specialised data centres with powerful computational hardware and software resources can be used for processing and analysing a humongous amount of aggregated data coming from a variety of different sources. The analysis of such data is all the more valuable to the extent that it allows for specific patterns to be found and new correlations to be made between different datasets, so as to eventually deduce or infer new information, as well as to potentially predict behaviours or assess the likelihood for a certain event to occur. This article will focus specifically on the legal and moral obligations of online operators collecting and processing large amounts of data, to investigate the potential implications of big data analysis on the privacy of individual users and on society as a whole.

  11. Comparative validity of brief to medium-length Big Five and Big Six Personality Questionnaires.

    Science.gov (United States)

    Thalmayer, Amber Gayle; Saucier, Gerard; Eigenhuis, Annemarie

    2011-12-01

    A general consensus on the Big Five model of personality attributes has been highly generative for the field of personality psychology. Many important psychological and life outcome correlates with Big Five trait dimensions have been established. But researchers must choose between multiple Big Five inventories when conducting a study and are faced with a variety of options as to inventory length. Furthermore, a 6-factor model has been proposed to extend and update the Big Five model, in part by adding a dimension of Honesty/Humility or Honesty/Propriety. In this study, 3 popular brief to medium-length Big Five measures (NEO Five Factor Inventory, Big Five Inventory [BFI], and International Personality Item Pool), and 3 six-factor measures (HEXACO Personality Inventory, Questionnaire Big Six Scales, and a 6-factor version of the BFI) were placed in competition to best predict important student life outcomes. The effect of test length was investigated by comparing brief versions of most measures (subsets of items) with original versions. Personality questionnaires were administered to undergraduate students (N = 227). Participants' college transcripts and student conduct records were obtained 6-9 months after data was collected. Six-factor inventories demonstrated better predictive ability for life outcomes than did some Big Five inventories. Additional behavioral observations made on participants, including their Facebook profiles and cell-phone text usage, were predicted similarly by Big Five and 6-factor measures. A brief version of the BFI performed surprisingly well; across inventory platforms, increasing test length had little effect on predictive validity. Comparative validity of the models and measures in terms of outcome prediction and parsimony is discussed.

  12. Big Machines and Big Science: 80 Years of Accelerators at Stanford

    Energy Technology Data Exchange (ETDEWEB)

    Loew, Gregory

    2008-12-16

    Longtime SLAC physicist Greg Loew will present a trip through SLAC's origins, highlighting its scientific achievements, and provide a glimpse of the lab's future in 'Big Machines and Big Science: 80 Years of Accelerators at Stanford.'

  13. After the crisis? Big Data and the methodological challenges of empirical sociology

    Directory of Open Access Journals (Sweden)

    Roger Burrows

    2014-07-01

    Full Text Available Google Trends reveals that at the time we were writing our article on ‘The Coming Crisis of Empirical Sociology’ in 2007 almost nobody was searching the internet for ‘Big Data’. It was only towards the very end of 2010 that the term began to register, just ahead of an explosion of interest from 2011 onwards. In this commentary we take the opportunity to reflect back on the claims we made in that original paper in light of more recent discussions about the social scientific implications of the inundation of digital data. Did our paper, with its emphasis on the emergence of, what we termed, ‘social transactional data’ and ‘digital byproduct data’ prefigure contemporary debates that now form the basis and rationale for this excellent new journal? Or was the paper more concerned with broader methodological, theoretical and political debates that have somehow been lost in all of the loud babble that has come to surround Big Data. Using recent work on the BBC Great British Class Survey as an example this brief paper offers a reflexive and critical reflection on what has become – much to the surprise of its authors – one of the most cited papers in the discipline of sociology in the last decade.

  14. Dual of big bang and big crunch

    International Nuclear Information System (INIS)

    Bak, Dongsu

    2007-01-01

    Starting from the Janus solution and its gauge theory dual, we obtain the dual gauge theory description of the cosmological solution by the procedure of double analytic continuation. The coupling is driven either to zero or to infinity at the big-bang and big-crunch singularities, which are shown to be related by the S-duality symmetry. In the dual Yang-Mills theory description, these are nonsingular as the coupling goes to zero in the N=4 super Yang-Mills theory. The cosmological singularities simply signal the failure of the supergravity description of the full type IIB superstring theory

  15. Comparative Validity of Brief to Medium-Length Big Five and Big Six Personality Questionnaires

    Science.gov (United States)

    Thalmayer, Amber Gayle; Saucier, Gerard; Eigenhuis, Annemarie

    2011-01-01

    A general consensus on the Big Five model of personality attributes has been highly generative for the field of personality psychology. Many important psychological and life outcome correlates with Big Five trait dimensions have been established. But researchers must choose between multiple Big Five inventories when conducting a study and are…

  16. Demystifying communication signal lost for network redundancy ...

    African Journals Online (AJOL)

    These studies report on the communication signal lost factors that were analyzed and supported by evidence from coverage analysis activities for Automatic Meter Reading (AMR) systems. We have categorized the influential signal lost factors into four core elements that were concluded based on our field measurement ...

  17. Big data for health.

    Science.gov (United States)

    Andreu-Perez, Javier; Poon, Carmen C Y; Merrifield, Robert D; Wong, Stephen T C; Yang, Guang-Zhong

    2015-07-01

    This paper provides an overview of recent developments in big data in the context of biomedical and health informatics. It outlines the key characteristics of big data and how medical and health informatics, translational bioinformatics, sensor informatics, and imaging informatics will benefit from an integrated approach of piecing together different aspects of personalized information from a diverse range of data sources, both structured and unstructured, covering genomics, proteomics, metabolomics, as well as imaging, clinical diagnosis, and long-term continuous physiological sensing of an individual. It is expected that recent advances in big data will expand our knowledge for testing new hypotheses about disease management from diagnosis to prevention to personalized treatment. The rise of big data, however, also raises challenges in terms of privacy, security, data ownership, data stewardship, and governance. This paper discusses some of the existing activities and future opportunities related to big data for health, outlining some of the key underlying issues that need to be tackled.

  18. Big Data: Implications for Health System Pharmacy.

    Science.gov (United States)

    Stokes, Laura B; Rogers, Joseph W; Hertig, John B; Weber, Robert J

    2016-07-01

    Big Data refers to datasets that are so large and complex that traditional methods and hardware for collecting, sharing, and analyzing them are not possible. Big Data that is accurate leads to more confident decision making, improved operational efficiency, and reduced costs. The rapid growth of health care information results in Big Data around health services, treatments, and outcomes, and Big Data can be used to analyze the benefit of health system pharmacy services. The goal of this article is to provide a perspective on how Big Data can be applied to health system pharmacy. It will define Big Data, describe the impact of Big Data on population health, review specific implications of Big Data in health system pharmacy, and describe an approach for pharmacy leaders to effectively use Big Data. A few strategies involved in managing Big Data in health system pharmacy include identifying potential opportunities for Big Data, prioritizing those opportunities, protecting privacy concerns, promoting data transparency, and communicating outcomes. As health care information expands in its content and becomes more integrated, Big Data can enhance the development of patient-centered pharmacy services.

  19. Generalized formal model of Big Data

    OpenAIRE

    Shakhovska, N.; Veres, O.; Hirnyak, M.

    2016-01-01

    This article dwells on the basic characteristic features of Big Data technologies. Existing definitions of the term "big data" are analyzed. The article proposes and describes the elements of a generalized formal model of big data, analyzes the peculiarities of applying the proposed model components, and describes the fundamental differences between Big Data technology and business analytics. Big Data is supported by the distributed file system Google File System ...

  20. BigWig and BigBed: enabling browsing of large distributed datasets.

    Science.gov (United States)

    Kent, W J; Zweig, A S; Barber, G; Hinrichs, A S; Karolchik, D

    2010-09-01

    BigWig and BigBed files are compressed binary indexed files containing data at several resolutions that allow the high-performance display of next-generation sequencing experiment results in the UCSC Genome Browser. The visualization is implemented using a multi-layered software approach that takes advantage of specific capabilities of web-based protocols and Linux and UNIX operating systems files, R trees and various indexing and compression tricks. As a result, only the data needed to support the current browser view is transmitted rather than the entire file, enabling fast remote access to large distributed data sets. Binaries for the BigWig and BigBed creation and parsing utilities may be downloaded at http://hgdownload.cse.ucsc.edu/admin/exe/linux.x86_64/. Source code for the creation and visualization software is freely available for non-commercial use at http://hgdownload.cse.ucsc.edu/admin/jksrc.zip, implemented in C and supported on Linux. The UCSC Genome Browser is available at http://genome.ucsc.edu.
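
    The record lists the UCSC command-line utilities for creating and parsing BigWig/BigBed files. For reading such files programmatically, one widely used option is the third-party pyBigWig package; this is an assumption on our part (pyBigWig is not one of the tools named above), and the file name below is a placeholder.

```python
import pyBigWig  # third-party binding to the BigWig format (pip install pyBigWig)

# Open a local BigWig file; remote HTTP URLs can also work if the library was
# built with remote-access support, which mirrors the "only the data needed
# for the current view is transmitted" behaviour described in the record.
bw = pyBigWig.open("example.bw")   # hypothetical file name

print(bw.chroms())   # chromosome names and lengths stored in the file
print(bw.header())   # whole-file summary statistics

# Query a zoomed-out summary rather than every base, the same trick the
# browser uses: mean signal over chr1:0-1,000,000 in 10 bins.
print(bw.stats("chr1", 0, 1_000_000, type="mean", nBins=10))

bw.close()
```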

  1. Lost Cause: an interactive movie project

    OpenAIRE

    Johnson, Kirsten

    2008-01-01

    One of the challenges in designing an interactive cinematic experience is to offer interactive choices which do not distract from immersion into the story. The interactive movie project, Lost Cause focuses on the life of the main character explored through the inter-related perspectives of three other characters. Lost Cause supports an immersive interactive story experience through its correlated design of an interface, narrative content and narrative structure. The movie project is examined ...

  2. The Role of Phosphoramidon on the Biological Activity of Big Endothelin-1 in the Rat Mesenteric Microcirculation in Vivo

    International Nuclear Information System (INIS)

    Abdelhalim, Mohamed A K

    2008-01-01

    The goal of the present study was to clarify the role of the metalloprotease inhibitor phosphoramidon on the effects induced by big endothelin-1 (big ET-1) in the rat mesenteric microcirculation in vivo, through investigating the systemic blood pressure, diameter and blood flow velocity of arterioles and venules of the rat mesentery. For this purpose, the rat mesentery was arranged for in situ intravital microscopic observation under transillumination, and separate cumulative injections of big ET-1 and phosphoramidon were infused into the right jugular vein, respectively. In these experiments twenty-five rats (Charles River, 130 - 140 g) were used. The experiments were divided into two groups. In the first group of experiments, cumulative injections of big ET-1 (1000-8000 pmole/kg) were infused through a catheter inserted into the right jugular vein. Each dose of big ET-1 was infused 25 min prior to the infusion of the following dose. Infusion of big ET-1 (1000-8000 pmole/kg) elicited a long-lasting pressor effect. The infusion of low doses of big ET-1 (1000-2000 pmole/kg) elicited a significant (p < 0.05) dose-dependent increase in the microvascular blood flow velocity both in arterioles (20 - 30 μm) and venules (30 - 50 μm), while the diameters of arterioles and venules exhibited a slight, non-significant vasodilator effect. The infusion of high doses of big ET-1 (4000-8000 pmole/kg) elicited a significant dose-dependent decrease in the blood flow velocity of arterioles and venules, and diameters returned to the values of the control runs. This may be attributed to the gradual conversion of big ET-1 to ET-1, and ET-1 is a potent vasoconstrictor. In the second group of experiments, cumulative injections of phosphoramidon (30 mg/kg /10 min) were administered 10 min prior to the infusion of big ET-1. These findings suggested that phosphoramidon significantly suppressed the long-lasting pressor effect, the dose-dependent increase and decrease, and the slow vasodilator effect produced by big ET-1.

  3. Freshwater mussels (Unionidae) in the headwaters of Chipola River, Houston County, Alabama

    Science.gov (United States)

    Garner, J.T.; McGregor, S.W.; Tarpley, T.A.; Buntin, M.L.

    2009-01-01

    Big and Cowarts creeks lie in extreme southeastern Alabama and form the headwaters of Chipola River. Qualitative and quantitative sampling for freshwater mussels in these reaches during 2006 and 2007 revealed an intact fauna, relative to historical reports. A cumulative total of 17 species, including federally protected Elliptio chipolaensis (Chipola Slabshell), Lampsilis subangulata (Shinyrayed Pocketbook), Medionidus penicillatus (Gulf Moccasinshell), and Pleurobema pyriforme (Oval Pigtoe), was encountered. A total of 3382 mussels (density 5.84 per m2) was estimated for one 65-m reach of Big Creek and 9627 mussels (density 8.09 per m2) were estimated to occur in one 170-m reach of Cowarts Creek. Tributaries had depauperate faunas, apparently due to substrate instability.

  4. Big data-driven business how to use big data to win customers, beat competitors, and boost profits

    CERN Document Server

    Glass, Russell

    2014-01-01

    Get the expert perspective and practical advice on big data The Big Data-Driven Business: How to Use Big Data to Win Customers, Beat Competitors, and Boost Profits makes the case that big data is for real, and more than just big hype. The book uses real-life examples-from Nate Silver to Copernicus, and Apple to Blackberry-to demonstrate how the winners of the future will use big data to seek the truth. Written by a marketing journalist and the CEO of a multi-million-dollar B2B marketing platform that reaches more than 90% of the U.S. business population, this book is a comprehens

  5. Big Game Reporting Stations

    Data.gov (United States)

    Vermont Center for Geographic Information — Point locations of big game reporting stations. Big game reporting stations are places where hunters can legally report harvested deer, bear, or turkey. These are...

  6. Stalin's Big Fleet Program

    National Research Council Canada - National Science Library

    Mauner, Milan

    2002-01-01

    Although Dr. Milan Hauner's study 'Stalin's Big Fleet program' has focused primarily on the formation of Big Fleets during the Tsarist and Soviet periods of Russia's naval history, there are important lessons...

  7. Automatic River Network Extraction from LIDAR Data

    Science.gov (United States)

    Maderal, E. N.; Valcarcel, N.; Delgado, J.; Sevilla, C.; Ojeda, J. C.

    2016-06-01

    The National Geographic Institute of Spain (IGN-ES) has launched a new production system for automatic river network extraction for the Geospatial Reference Information (GRI) within the hydrography theme. The goal is to obtain an accurate and updated river network, extracted as automatically as possible. For this, IGN-ES has full LiDAR coverage for the whole Spanish territory with a density of 0.5 points per square meter. To implement this work, the technical feasibility was validated and a methodology was developed to automate each production phase: hydrological terrain model generation with a 2 meter grid size, and river network extraction combining hydrographic criteria (topographic network) and hydrological criteria (flow accumulation river network); the production was then launched. The key points of this work have been managing a big data environment of more than 160,000 LiDAR data files; the infrastructure to store (up to 40 TB between results and intermediate files) and process the data, using local virtualization and Amazon Web Services (AWS), which allowed this automatic production to be completed within 6 months; the stability of the software (TerraScan-TerraSolid, GlobalMapper-Blue Marble, FME-Safe, ArcGIS-Esri); and finally the management of human resources. The result of this production has been an accurate automatic river network extraction for the whole country, with a significant improvement in the altimetric component of the 3D linear vector. This article presents the technical feasibility, the production methodology, the automatic river network extraction production and its advantages over traditional vector extraction systems.

  8. AUTOMATIC RIVER NETWORK EXTRACTION FROM LIDAR DATA

    Directory of Open Access Journals (Sweden)

    E. N. Maderal

    2016-06-01

    Full Text Available The National Geographic Institute of Spain (IGN-ES) has launched a new production system for automatic river network extraction for the Geospatial Reference Information (GRI) within the hydrography theme. The goal is to obtain an accurate and updated river network, extracted as automatically as possible. For this, IGN-ES has full LiDAR coverage for the whole Spanish territory with a density of 0.5 points per square meter. To implement this work, the technical feasibility was validated and a methodology was developed to automate each production phase: hydrological terrain model generation with a 2 meter grid size, and river network extraction combining hydrographic criteria (topographic network) and hydrological criteria (flow accumulation river network); the production was then launched. The key points of this work have been managing a big data environment of more than 160,000 LiDAR data files; the infrastructure to store (up to 40 TB between results and intermediate files) and process the data, using local virtualization and Amazon Web Services (AWS), which allowed this automatic production to be completed within 6 months; the stability of the software (TerraScan-TerraSolid, GlobalMapper-Blue Marble, FME-Safe, ArcGIS-Esri); and finally the management of human resources. The result of this production has been an accurate automatic river network extraction for the whole country, with a significant improvement in the altimetric component of the 3D linear vector. This article presents the technical feasibility, the production methodology, the automatic river network extraction production and its advantages over traditional vector extraction systems.
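
    Both records above derive the channel network from a "flow accumulation river network" computed on a hydrological terrain model. The production pipeline itself relies on the commercial tools listed, but the core idea can be sketched in a few lines; the code below is a simplification that assumes a small, depression-free DEM with no flat areas, and it does not reproduce the real workflow's grid size, sink filling or channel-threshold choices.

```python
import numpy as np

# D8 neighbour offsets: the 8 cells surrounding (row, col).
OFFSETS = [(-1, -1), (-1, 0), (-1, 1), (0, -1), (0, 1), (1, -1), (1, 0), (1, 1)]

def d8_flow_directions(dem):
    """For each cell, return the index into OFFSETS of its steepest downslope
    neighbour, or -1 if the cell has no lower neighbour (a local outlet)."""
    rows, cols = dem.shape
    direction = np.full((rows, cols), -1, dtype=int)
    for r in range(rows):
        for c in range(cols):
            best_drop = 0.0
            for k, (dr, dc) in enumerate(OFFSETS):
                rr, cc = r + dr, c + dc
                if 0 <= rr < rows and 0 <= cc < cols:
                    dist = np.hypot(dr, dc)                # 1 or sqrt(2) cell widths
                    drop = (dem[r, c] - dem[rr, cc]) / dist
                    if drop > best_drop:
                        best_drop, direction[r, c] = drop, k
    return direction

def flow_accumulation(dem):
    """Count, for each cell, how many cells drain through it (including itself),
    by pushing counts downslope from the highest cell to the lowest."""
    direction = d8_flow_directions(dem)
    rows, cols = dem.shape
    acc = np.ones((rows, cols), dtype=int)
    order = np.argsort(dem, axis=None)[::-1]               # highest elevation first
    for idx in order:
        r, c = divmod(idx, cols)
        k = direction[r, c]
        if k >= 0:
            dr, dc = OFFSETS[k]
            acc[r + dr, c + dc] += acc[r, c]
    return acc

if __name__ == "__main__":
    dem = np.array([[5.0, 4.0, 3.0],
                    [4.0, 3.0, 2.0],
                    [3.0, 2.0, 1.0]])
    print(flow_accumulation(dem))
    # Cells above a chosen accumulation threshold would then be vectorized
    # into the river network (the "hydrological criteria" step above).
```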

  9. Five Big, Big Five Issues : Rationale, Content, Structure, Status, and Crosscultural Assessment

    NARCIS (Netherlands)

    De Raad, Boele

    1998-01-01

    This article discusses the rationale, content, structure, status, and crosscultural assessment of the Big Five trait factors, focusing on topics of dispute and misunderstanding. Taxonomic restrictions of the original Big Five forerunner, the "Norman Five," are discussed, and criticisms regarding the

  10. Big data challenges

    DEFF Research Database (Denmark)

    Bachlechner, Daniel; Leimbach, Timo

    2016-01-01

    Although reports on big data success stories have been accumulating in the media, most organizations dealing with high-volume, high-velocity and high-variety information assets still face challenges. Only a thorough understanding of these challenges puts organizations into a position in which...... they can make an informed decision for or against big data, and, if the decision is positive, overcome the challenges smoothly. The combination of a series of interviews with leading experts from enterprises, associations and research institutions, and focused literature reviews allowed not only...... framework are also relevant. For large enterprises and startups specialized in big data, it is typically easier to overcome the challenges than it is for other enterprises and public administration bodies....

  11. Big Data and HPC collocation: Using HPC idle resources for Big Data Analytics

    OpenAIRE

    MERCIER , Michael; Glesser , David; Georgiou , Yiannis; Richard , Olivier

    2017-01-01

    International audience; Executing Big Data workloads on High Performance Computing (HPC) infrastructures has become an attractive way to improve their performance. However, the collocation of HPC and Big Data workloads is not an easy task, mainly because of the differences in their core concepts. This paper focuses on the challenges related to the scheduling of both Big Data and HPC workloads on the same computing platform. In classic HPC workloads, the rigidity of jobs tends to create holes in ...

  12. Big Data as Governmentality

    DEFF Research Database (Denmark)

    Flyverbom, Mikkel; Madsen, Anders Koed; Rasche, Andreas

    This paper conceptualizes how large-scale data and algorithms condition and reshape knowledge production when addressing international development challenges. The concept of governmentality and four dimensions of an analytics of government are proposed as a theoretical framework to examine how big...... data is constituted as an aspiration to improve the data and knowledge underpinning development efforts. Based on this framework, we argue that big data’s impact on how relevant problems are governed is enabled by (1) new techniques of visualizing development issues, (2) linking aspects...... shows that big data problematizes selected aspects of traditional ways to collect and analyze data for development (e.g. via household surveys). We also demonstrate that using big data analyses to address development challenges raises a number of questions that can deteriorate its impact....

  13. Boarding to Big data

    Directory of Open Access Journals (Sweden)

    Oana Claudia BRATOSIN

    2016-05-01

    Full Text Available Today Big data is an emerging topic, as the quantity of information grows exponentially, laying the foundation for its main challenge, the value of the information. The information value is not only defined by the value extraction from huge data sets, as fast and optimal as possible, but also by the value extraction from uncertain and inaccurate data, in an innovative manner using Big data analytics. At this point, the main challenge of the businesses that use Big data tools is to clearly define the scope and the necessary output of the business so that the real value can be gained. This article aims to explain the Big data concept, its various classification criteria, its architecture, as well as its impact on worldwide processes.

  14. Big data - a 21st century science Maginot Line? No-boundary thinking: shifting from the big data paradigm.

    Science.gov (United States)

    Huang, Xiuzhen; Jennings, Steven F; Bruce, Barry; Buchan, Alison; Cai, Liming; Chen, Pengyin; Cramer, Carole L; Guan, Weihua; Hilgert, Uwe Kk; Jiang, Hongmei; Li, Zenglu; McClure, Gail; McMullen, Donald F; Nanduri, Bindu; Perkins, Andy; Rekepalli, Bhanu; Salem, Saeed; Specker, Jennifer; Walker, Karl; Wunsch, Donald; Xiong, Donghai; Zhang, Shuzhong; Zhang, Yu; Zhao, Zhongming; Moore, Jason H

    2015-01-01

    Whether your interests lie in scientific arenas, the corporate world, or in government, you have certainly heard the praises of big data: Big data will give you new insights, allow you to become more efficient, and/or will solve your problems. While big data has had some outstanding successes, many are now beginning to see that it is not the Silver Bullet that it has been touted to be. Here our main concern is the overall impact of big data; the current manifestation of big data is constructing a Maginot Line in science in the 21st century. Big data is not "lots of data" as a phenomenon anymore; the big data paradigm is putting the spirit of the Maginot Line into lots of data. Big data overall is disconnecting researchers from science challenges. We propose No-Boundary Thinking (NBT), applying no-boundary thinking in problem definition to address science challenges.

  15. Big Egos in Big Science

    DEFF Research Database (Denmark)

    Andersen, Kristina Vaarst; Jeppesen, Jacob

    In this paper we investigate the micro-mechanisms governing structural evolution and performance of scientific collaboration. Scientific discovery tends not to be led by so-called lone "stars", or big egos, but instead by collaboration among groups of researchers, from a multitude of institutions...

  16. Big Data and Big Science

    OpenAIRE

    Di Meglio, Alberto

    2014-01-01

    Brief introduction to the challenges of big data in scientific research based on the work done by the HEP community at CERN and how the CERN openlab promotes collaboration among research institutes and industrial IT companies. Presented at the FutureGov 2014 conference in Singapore.

  17. Challenges of Big Data Analysis.

    Science.gov (United States)

    Fan, Jianqing; Han, Fang; Liu, Han

    2014-06-01

    Big Data bring new opportunities to modern society and challenges to data scientists. On the one hand, Big Data hold great promises for discovering subtle population patterns and heterogeneities that are not possible with small-scale data. On the other hand, the massive sample size and high dimensionality of Big Data introduce unique computational and statistical challenges, including scalability and storage bottlenecks, noise accumulation, spurious correlation, incidental endogeneity, and measurement errors. These challenges are distinctive and require a new computational and statistical paradigm. This article gives an overview of the salient features of Big Data and how these features impact paradigm change in statistical and computational methods as well as computing architectures. We also provide various new perspectives on Big Data analysis and computation. In particular, we emphasize the viability of the sparsest solution in a high-confidence set and point out that exogenous assumptions in most statistical methods for Big Data cannot be validated due to incidental endogeneity. They can lead to wrong statistical inferences and consequently wrong scientific conclusions.
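
    A toy simulation (not taken from the article) makes the spurious-correlation point concrete: when the number of candidate variables far exceeds the sample size, some variable that is truly unrelated to the response will almost surely show a sizeable sample correlation with it. All sizes and seeds below are illustrative.

        # Hedged illustration of spurious correlation in high dimensions: every
        # feature is generated independently of y, yet the best-looking feature
        # still appears clearly "correlated" with it in this finite sample.
        import numpy as np

        rng = np.random.default_rng(1)
        n, p = 100, 1000                     # illustrative sample size / dimension
        X = rng.standard_normal((n, p))      # features, independent of y by design
        y = rng.standard_normal(n)

        # Sample correlation of each column with y (z-scores, then mean product).
        Xc = (X - X.mean(axis=0)) / X.std(axis=0)
        yc = (y - y.mean()) / y.std()
        corr = Xc.T @ yc / n

        print(f"max |corr| among {p} irrelevant features: {np.abs(corr).max():.2f}")
        # Typically around 0.3-0.4 here, even though every true correlation is zero.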

  18. Big data is not a monolith

    CERN Document Server

    Ekbia, Hamid R; Mattioli, Michael

    2016-01-01

    Big data is ubiquitous but heterogeneous. Big data can be used to tally clicks and traffic on web pages, find patterns in stock trades, track consumer preferences, identify linguistic correlations in large corpuses of texts. This book examines big data not as an undifferentiated whole but contextually, investigating the varied challenges posed by big data for health, science, law, commerce, and politics. Taken together, the chapters reveal a complex set of problems, practices, and policies. The advent of big data methodologies has challenged the theory-driven approach to scientific knowledge in favor of a data-driven one. Social media platforms and self-tracking tools change the way we see ourselves and others. The collection of data by corporations and government threatens privacy while promoting transparency. Meanwhile, politicians, policy makers, and ethicists are ill-prepared to deal with big data's ramifications. The contributors look at big data's effect on individuals as it exerts social control throu...

  19. Big universe, big data

    DEFF Research Database (Denmark)

    Kremer, Jan; Stensbo-Smidt, Kristoffer; Gieseke, Fabian Cristian

    2017-01-01

    , modern astronomy requires big data know-how, in particular it demands highly efficient machine learning and image analysis algorithms. But scalability is not the only challenge: Astronomy applications touch several current machine learning research questions, such as learning from biased data and dealing......, and highlight some recent methodological advancements in machine learning and image analysis triggered by astronomical applications....

  20. Poker Player Behavior After Big Wins and Big Losses

    OpenAIRE

    Gary Smith; Michael Levere; Robert Kurtzman

    2009-01-01

    We find that experienced poker players typically change their style of play after winning or losing a big pot--most notably, playing less cautiously after a big loss, evidently hoping for lucky cards that will erase their loss. This finding is consistent with Kahneman and Tversky's (Kahneman, D., A. Tversky. 1979. Prospect theory: An analysis of decision under risk. Econometrica 47(2) 263-292) break-even hypothesis and suggests that when investors incur a large loss, it might be time to take ...

  1. Big Data and Chemical Education

    Science.gov (United States)

    Pence, Harry E.; Williams, Antony J.

    2016-01-01

    The amount of computerized information that organizations collect and process is growing so large that the term Big Data is commonly being used to describe the situation. Accordingly, Big Data is defined by a combination of the Volume, Variety, Velocity, and Veracity of the data being processed. Big Data tools are already having an impact in…

  2. Circadian clocks : Translation lost

    NARCIS (Netherlands)

    Roenneberg, T; Merrow, M

    2005-01-01

    One of the big questions in biological rhythms research is how a stable and precise circa-24 hour oscillation is generated on the molecular level. While increasing complexity seemed to be the key, a recent report suggests that circa-24 hour rhythms can be generated by just four molecules incubated

  3. Big data in Finnish financial services

    OpenAIRE

    Laurila, M. (Mikko)

    2017-01-01

    Abstract This thesis aims to explore the concept of big data, and create understanding of big data maturity in the Finnish financial services industry. The research questions of this thesis are “What kind of big data solutions are being implemented in the Finnish financial services sector?” and “Which factors impede faster implementation of big data solutions in the Finnish financial services sector?”. ...

  4. Big data in fashion industry

    Science.gov (United States)

    Jain, S.; Bruniaux, J.; Zeng, X.; Bruniaux, P.

    2017-10-01

    Significant work has been done in the field of big data in the last decade. The concept of big data includes analysing voluminous data to extract valuable information. In the fashion world, big data is increasingly playing a part in trend forecasting and in analysing consumer behaviour, preferences and emotions. The purpose of this paper is to introduce the term fashion data and explain why it can be considered big data. It also gives a broad classification of the types of fashion data and briefly defines them. Also, the methodology and working of a system that will use this data is briefly described.

  5. Big data bioinformatics.

    Science.gov (United States)

    Greene, Casey S; Tan, Jie; Ung, Matthew; Moore, Jason H; Cheng, Chao

    2014-12-01

    Recent technological advances allow for high throughput profiling of biological systems in a cost-efficient manner. The low cost of data generation is leading us to the "big data" era. The availability of big data provides unprecedented opportunities but also raises new challenges for data mining and analysis. In this review, we introduce key concepts in the analysis of big data, including both "machine learning" algorithms as well as "unsupervised" and "supervised" examples of each. We note packages for the R programming language that are available to perform machine learning analyses. In addition to programming based solutions, we review webservers that allow users with limited or no programming background to perform these analyses on large data compendia. © 2014 Wiley Periodicals, Inc.
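
    As a rough companion to the review's distinction between unsupervised and supervised learning (the review itself points readers to R packages and webservers), the sketch below runs one example of each with scikit-learn on synthetic data; the data set, model choices and parameters are placeholders rather than anything prescribed by the paper.

        # Minimal sketch of the two analysis families the review describes,
        # run on synthetic data; Python/scikit-learn is used here only for
        # consistency with the other examples in this document.
        from sklearn.cluster import KMeans
        from sklearn.datasets import make_classification
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import cross_val_score

        X, y = make_classification(n_samples=500, n_features=50,
                                   n_informative=10, random_state=0)

        # Unsupervised: group samples without using the labels.
        clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)

        # Supervised: learn to predict the labels, judged by cross-validation.
        scores = cross_val_score(RandomForestClassifier(random_state=0), X, y, cv=5)

        print("cluster sizes:", [int((clusters == k).sum()) for k in (0, 1)])
        print("classification accuracy: %.2f +/- %.2f" % (scores.mean(), scores.std()))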

  6. Changing the personality of a face: Perceived Big Two and Big Five personality factors modeled in real photographs.

    Science.gov (United States)

    Walker, Mirella; Vetter, Thomas

    2016-04-01

    General, spontaneous evaluations of strangers based on their faces have been shown to reflect judgments of these persons' intention and ability to harm. These evaluations can be mapped onto a 2D space defined by the dimensions trustworthiness (intention) and dominance (ability). Here we go beyond general evaluations and focus on more specific personality judgments derived from the Big Two and Big Five personality concepts. In particular, we investigate whether Big Two/Big Five personality judgments can be mapped onto the 2D space defined by the dimensions trustworthiness and dominance. Results indicate that judgments of the Big Two personality dimensions almost perfectly map onto the 2D space. In contrast, at least 3 of the Big Five dimensions (i.e., neuroticism, extraversion, and conscientiousness) go beyond the 2D space, indicating that additional dimensions are necessary to describe more specific face-based personality judgments accurately. Building on this evidence, we model the Big Two/Big Five personality dimensions in real facial photographs. Results from 2 validation studies show that the Big Two/Big Five are perceived reliably across different samples of faces and participants. Moreover, results reveal that participants differentiate reliably between the different Big Two/Big Five dimensions. Importantly, this high level of agreement and differentiation in personality judgments from faces likely creates a subjective reality which may have serious consequences for those being perceived-notably, these consequences ensue because the subjective reality is socially shared, irrespective of the judgments' validity. The methodological approach introduced here might prove useful in various psychological disciplines. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  7. The BigBOSS Experiment

    Energy Technology Data Exchange (ETDEWEB)

    Schelgel, D.; Abdalla, F.; Abraham, T.; Ahn, C.; Allende Prieto, C.; Annis, J.; Aubourg, E.; Azzaro, M.; Bailey, S.; Baltay, C.; Baugh, C.; /APC, Paris /Brookhaven /IRFU, Saclay /Marseille, CPPM /Marseille, CPT /Durham U. / /IEU, Seoul /Fermilab /IAA, Granada /IAC, La Laguna

    2011-01-01

    BigBOSS will obtain observational constraints that will bear on three of the four 'science frontier' questions identified by the Astro2010 Cosmology and Fundamental Physics Panel of the Decadal Survey: Why is the universe accelerating? What is dark matter? And what are the properties of neutrinos? Indeed, the BigBOSS project was recommended for substantial immediate R&D support in the PASAG report. The second highest ground-based priority from the Astro2010 Decadal Survey was the creation of a funding line within the NSF to support a 'Mid-Scale Innovations' program, and it used BigBOSS as a 'compelling' example for support. This choice was the result of the Decadal Survey's Program Prioritization panels reviewing 29 mid-scale projects and recommending BigBOSS 'very highly'.

  8. Big game hunting practices, meanings, motivations and constraints: a survey of Oregon big game hunters

    Science.gov (United States)

    Suresh K. Shrestha; Robert C. Burns

    2012-01-01

    We conducted a self-administered mail survey in September 2009 with randomly selected Oregon hunters who had purchased big game hunting licenses/tags for the 2008 hunting season. Survey questions explored hunting practices, the meanings of and motivations for big game hunting, the constraints to big game hunting participation, and the effects of age, years of hunting...

  9. Google BigQuery analytics

    CERN Document Server

    Tigani, Jordan

    2014-01-01

    How to effectively use BigQuery, avoid common mistakes, and execute sophisticated queries against large datasets Google BigQuery Analytics is the perfect guide for business and data analysts who want the latest tips on running complex queries and writing code to communicate with the BigQuery API. The book uses real-world examples to demonstrate current best practices and techniques, and also explains and demonstrates streaming ingestion, transformation via Hadoop in Google Compute engine, AppEngine datastore integration, and using GViz with Tableau to generate charts of query results. In addit

  10. Big data for dummies

    CERN Document Server

    Hurwitz, Judith; Halper, Fern; Kaufman, Marcia

    2013-01-01

    Find the right big data solution for your business or organization Big data management is one of the major challenges facing business, industry, and not-for-profit organizations. Data sets such as customer transactions for a mega-retailer, weather patterns monitored by meteorologists, or social network activity can quickly outpace the capacity of traditional data management tools. If you need to develop or manage big data solutions, you'll appreciate how these four experts define, explain, and guide you through this new and often confusing concept. You'll learn what it is, why it m

  11. Assessment of risk factors in radionuclides pollution of coastal zone and river basins by numerical modelling

    International Nuclear Information System (INIS)

    Tsitskishvili, M.; Tsitskishvili, L.; Kordzakhia, G.; Diasamidze, R.; Shaptoshvili, A.; Valiaev, A.

    2006-01-01

    Full text: All types of industrial activities require norms of protection and assessment of the corresponding risks in order to prevent the pollution and degradation of the corresponding areas. To make the sustainable development of the country possible, risk assessment of possible accidents at big enterprises is foreseen, which provides preparedness of the country and the possibility of prevention measures and mitigation of accidents. During big anthropogenic accidents in mountainous countries, the main paths for transport of the pollution are the rivers and sea basins. Due to the overpopulation of these areas, assessment of the pollution risks is very important. For this aim, special deterministic models based on the turbulent diffusion equation for a passive admixture are used. For the numerical calculations, MacCormack's two-step predictor-corrector scheme is used. The scheme is split by direction and is second order in space and time. Such a scheme was chosen because the turbulent velocities differ greatly in the horizontal and vertical directions, and the model allows separate independent steps to be taken in different directions. The grid step for the model is 26.88 km in the horizontal direction and 20 m in the vertical down to 200 m. The time step is equal to 4 hours and the computational time period is 4 months. The number of grid points is equal to 4983 for all calculation areas. Computations are carried out separately for big river basins as well as for the Black and Caspian Sea water areas. The model calculations are made for cases with various locations of pollutant sources, including accidental releases. The concentrations of admixtures are calculated for different realistic scenarios. The directions of their propagation are also determined. The risks are calculated, according to the achieved results, in comparison with the Maximum Permissible Concentrations (MPC) of the pollutants. That makes it possible to define the most vulnerable areas in coastal zones. The realized methodology is verified by means of various
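
    For readers unfamiliar with the numerical method named in this record, the following is a hedged one-dimensional sketch of a MacCormack predictor-corrector step for a passive admixture obeying an advection-diffusion equation. The actual model is three-dimensional, directionally split and uses very different horizontal and vertical grid steps, none of which is reproduced here; every parameter value below is illustrative.

        # Hedged 1D sketch: MacCormack predictor-corrector steps for
        # c_t + u c_x = D c_xx with periodic boundaries (illustrative values).
        import numpy as np

        def maccormack_step(c, u, D, dx, dt):
            cp, cm = np.roll(c, -1), np.roll(c, 1)          # c_{i+1}, c_{i-1}
            diff = D * (cp - 2 * c + cm) / dx**2

            # Predictor: forward difference for the advective term.
            c_star = c - dt * u * (cp - c) / dx + dt * diff

            csp, csm = np.roll(c_star, -1), np.roll(c_star, 1)
            diff_star = D * (csp - 2 * c_star + csm) / dx**2

            # Corrector: backward difference on the predicted field, then average.
            return 0.5 * (c + c_star - dt * u * (c_star - csm) / dx + dt * diff_star)

        # Advect and diffuse a Gaussian pollutant patch.
        nx, dx, dt, u, D = 200, 1.0, 0.4, 1.0, 0.5
        x = np.arange(nx) * dx
        c = np.exp(-((x - 50.0) ** 2) / 50.0)
        for _ in range(100):
            c = maccormack_step(c, u, D, dx, dt)
        print("peak concentration after 100 steps: %.3f" % c.max())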

  12. Gain-loss study along two streams in the upper Sabine River basin, Texas; August-September 1981

    Science.gov (United States)

    Myers, Dennis R.

    1983-01-01

    A gain-loss study was made August-September 1981 along the upper Sabine River from Lake Tawakoni to Farm Road 2517 near Carthage and along Lake Fork Creek from Lake Fork Reservoir to its junction (mouth) with the Sabine River. The hydrologic data collected during the gain-loss study indicated that during periods of low flow on the Sabine River, at least as much water as is released from Lake Tawakoni and from Lake Fork Reservoir will be available downstream at Farm Road 14 near Big Sandy and at Farm Road 2517 near Carthage. Gains from bank seepage and small tributary inflows compensate for losses due to evaporation, evapotranspiration, and loss of water into the alluvial aquifer.
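
    As a purely illustrative aid, a reach gain-loss computation of the kind described above reduces to a simple water balance in which tributary and bank-seepage gains offset evaporation, evapotranspiration and aquifer losses; every number in the sketch below is invented and is not taken from the 1981 measurements.

        # Hypothetical reach water balance in cubic feet per second (cfs);
        # all values are invented for illustration, not from the 1981 study.
        reservoir_release = 50.0     # flow released from the upstream lake
        tributary_inflow = 6.0       # gains from small tributaries
        bank_seepage_gain = 4.0      # bank storage returning to the river
        evaporation = 2.5            # open-water evaporation along the reach
        evapotranspiration = 3.0     # riparian vegetation use
        aquifer_loss = 3.5           # leakage into the alluvial aquifer

        downstream_flow = (reservoir_release + tributary_inflow + bank_seepage_gain
                           - evaporation - evapotranspiration - aquifer_loss)
        net_gain = downstream_flow - reservoir_release

        print(f"flow at the downstream site: {downstream_flow:.1f} cfs")
        print(f"net reach gain relative to the release: {net_gain:+.1f} cfs")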

  13. Shoal bass hybridization in the Chattahoochee River Basin near Atlanta, Georgia

    Science.gov (United States)

    Taylor, Andrew T.; Tringali, Michael D.; O'Rourke, Patrick M.; Long, James M.

    2018-01-01

    The shoal bass (Micropterus cataractae) is a sportfish endemic to the Apalachicola-Chattahoochee-Flint Basin of the southeastern United States. Introgression with several non-native congeners poses a pertinent threat to shoal bass conservation, particularly in the altered habitats of the Chattahoochee River. Our primary objective was to characterize hybridization in shoal bass populations near Atlanta, Georgia, including a population inhabiting Big Creek and another in the main stem Chattahoochee River below Morgan Falls Dam (MFD). A secondary objective was to examine the accuracy of phenotypic identifications below MFD based on a simplified suite of characters examined in the field. Fish were genotyped with 16 microsatellite DNA markers, and results demonstrated that at least four black bass species were involved in introgressive hybridization. Of 62 fish genotyped from Big Creek, 27% were pure shoal bass and 65% represented either F1 hybrids of shoal bass x smallmouth bass (M. dolomieu) or unidirectional backcrosses towards shoal bass. Of 29 fish genotyped below MFD and downstream at Cochran Shoals, 45% were pure shoal bass. Six hybrid shoal bass included both F1 hybrids and backcrosses with non-natives including Alabama bass (M. henshalli), spotted bass (M. punctulatus), and smallmouth bass. Shoal bass alleles comprised only 21% of the overall genomic composition in Big Creek and 31% below MFD (when combined with Cochran Shoals). Phenotypic identification below MFD resulted in an overall correct classification rate of 86% when discerning pure shoal bass from all other non-natives and hybrids. Results suggest that although these two shoal bass populations feature some of the highest introgression rates documented, only a fleeting opportunity may exist to conserve pure shoal bass in both populations. Continued supplemental stocking of pure shoal bass below MFD appears warranted to thwart increased admixture among multiple black bass taxa, and a similar stocking

  14. Exploring complex and big data

    Directory of Open Access Journals (Sweden)

    Stefanowski Jerzy

    2017-12-01

    Full Text Available This paper shows how big data analysis opens a range of research and technological problems and calls for new approaches. We start with defining the essential properties of big data and discussing the main types of data involved. We then survey the dedicated solutions for storing and processing big data, including a data lake, virtual integration, and a polystore architecture. Difficulties in managing data quality and provenance are also highlighted. The characteristics of big data also imply specific requirements and challenges for data mining algorithms, which we address as well. The links with related areas, including data streams and deep learning, are discussed. The common theme that naturally emerges from this characterization is complexity. All in all, we consider it to be the truly defining feature of big data (posing particular research and technological challenges), which ultimately seems to be of greater importance than the sheer data volume.

  15. Reasserting the primacy of human needs to reclaim the 'lost half' of sustainable development.

    Science.gov (United States)

    Everard, Mark; Longhurst, James W S

    2018-04-15

    The concept of sustainable development evolved from growing awareness of the interdependence of social and economic progress with the limits of the supporting natural environment, becoming progressively integrated into global agreements and transposition into local regulatory and implementation frameworks. We argue that transposition of the concept into regulation and supporting tools reduced the focus to minimal environmental and social standards, perceived as imposing constraints rather than opportunities for innovation to meet human needs. The aspirational 'half' of the concept of sustainable development specifically addressing human needs was thus lost in transposing high ideals into regulatory instruments. The Sustainable Development Goals (SDGs) restore focus on interlinked human needs, stimulating innovation of products and processes to satisfy them. Through three case studies - PVC water pipes, river quality management in England, and UK local air quality management - we explore the current operationalisation of the concept in diverse settings, using the SDG framework to highlight the broader societal purposes central to sustainable development. Partnerships involving civil society support evolution of regulatory instruments and their implementation, optimising social and ecological benefits thereby serving more human needs. Restoring the visionary 'lost half' of sustainable development - meeting human needs in sustainable ways - creates incentives for innovation and partnership; an innovation framework rather than a perceived constraint. Copyright © 2017 Elsevier B.V. All rights reserved.

  16. Was there a big bang

    International Nuclear Information System (INIS)

    Narlikar, J.

    1981-01-01

    In discussing the viability of the big-bang model of the Universe, the relevant evidence is examined, including the discrepancies in the age of the big-bang Universe, the red shifts of quasars, the microwave background radiation, general theory of relativity aspects such as the change of the gravitational constant with time, and quantum theory considerations. It is felt that the arguments considered show that the big-bang picture is not as soundly established, either theoretically or observationally, as it is usually claimed to be, that the cosmological problem is still wide open, and that alternatives to the standard big-bang picture should be seriously investigated. (U.K.)

  17. BIG DATA-DRIVEN MARKETING: AN ABSTRACT

    OpenAIRE

    Suoniemi, Samppa; Meyer-Waarden, Lars; Munzel, Andreas

    2017-01-01

    Customer information plays a key role in managing successful relationships with valuable customers. Big data customer analytics use (BD use), i.e., the extent to which customer information derived from big data analytics guides marketing decisions, helps firms better meet customer needs for competitive advantage. This study addresses three research questions: What are the key antecedents of big data customer analytics use? How, and to what extent, does big data customer an...

  18. Big Data Analytics in Medicine and Healthcare.

    Science.gov (United States)

    Ristevski, Blagoj; Chen, Ming

    2018-05-10

    This paper surveys big data, highlighting big data analytics in medicine and healthcare. The big data characteristics (value, volume, velocity, variety, veracity and variability) are described. Big data analytics in medicine and healthcare covers the integration and analysis of large amounts of complex heterogeneous data such as various -omics data (genomics, epigenomics, transcriptomics, proteomics, metabolomics, interactomics, pharmacogenomics, diseasomics), biomedical data and electronic health records data. We underline the challenging issues of big data privacy and security. Regarding the big data characteristics, some directions for using suitable and promising open-source distributed data processing software platforms are given.

  19. Landscape elements and river chemistry as affected by river regulation – a 3-D perspective

    Directory of Open Access Journals (Sweden)

    E. Smedberg

    2009-09-01

    Full Text Available We tested the hypothesis of whether individual land classes within a river catchment contribute equally to river loading with dissolved constituents or whether some land classes act as "hot spots" for river loading and, if so, whether these land classes are especially affected by hydrological alterations. The amount of land covered by forests and wetlands and the average soil depth (throughout this paper soil refers to everything overlying bedrock, i.e. regolith) of a river catchment explain 58–93% of the variability in total organic carbon (TOC) and dissolved silicate (DSi) concentrations for 22 river catchments in Northern Sweden. For the heavily regulated Luleälven, with 7 studied sub-catchments, only 3% of the headwater areas have been inundated by reservoirs, while some 10% of the soils and aggregated forest and wetland areas have been lost due to damming and further hydrological alteration such as bypassing entire sub-catchments by headrace tunnels. However, looking at individual forest classes, our estimates indicate that some 37% of the deciduous forests have been inundated by the four major reservoirs built in the Luleälven headwaters. These deciduous forests and wetlands, formerly growing on top of alluvial deposits along the river corridors and forming the riparian zone, play a vital role in loading river water with dissolved constituents, especially DSi. A digital elevation model draped with land classes and soil depths highlights that the topography of various land classes acting as hot spots is critical in determining water residence time in soils and biogeochemical fluxes. Thus, headwater areas of the Luleälven appear to be most sensitive to hydrological alterations due to the thin soil cover (on average 2.7–4.5 m) and the only patchy appearance of forest and wetlands that were significantly perturbed. Hydrological alterations of these relatively small headwater areas significantly impact the downstream flux of dissolved constituents and their delivery to
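
    The 58–93% of explained variability reported above comes from regressing river concentrations on catchment descriptors; the sketch below shows the general form of such a fit on synthetic stand-in data (the Swedish catchment data are not reproduced here), using forest cover, wetland cover and mean soil depth as predictors.

        # Hedged sketch: multiple linear regression of a river constituent
        # (e.g. TOC) on catchment descriptors. All data are synthetic stand-ins,
        # not the 22 catchments analysed in the study.
        import numpy as np
        from sklearn.linear_model import LinearRegression

        rng = np.random.default_rng(2)
        n = 22                                   # number of catchments, as in the study
        forest = rng.uniform(0.2, 0.9, n)        # fraction of catchment under forest
        wetland = rng.uniform(0.0, 0.3, n)       # fraction of catchment under wetland
        soil_depth = rng.uniform(2.0, 15.0, n)   # mean regolith depth, metres

        # Invented response: concentration rises with cover fractions and soil depth.
        toc = 2 + 8 * wetland + 3 * forest + 0.2 * soil_depth + rng.normal(0, 0.5, n)

        X = np.column_stack([forest, wetland, soil_depth])
        model = LinearRegression().fit(X, toc)
        print("R^2 of the catchment-descriptor regression: %.2f" % model.score(X, toc))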

  20. The Big Bang Singularity

    Science.gov (United States)

    Ling, Eric

    The big bang theory is a model of the universe which makes the striking prediction that the universe began a finite amount of time in the past at the so called "Big Bang singularity." We explore the physical and mathematical justification of this surprising result. After laying down the framework of the universe as a spacetime manifold, we combine physical observations with global symmetrical assumptions to deduce the FRW cosmological models which predict a big bang singularity. Next we prove a couple theorems due to Stephen Hawking which show that the big bang singularity exists even if one removes the global symmetrical assumptions. Lastly, we investigate the conditions one needs to impose on a spacetime if one wishes to avoid a singularity. The ideas and concepts used here to study spacetimes are similar to those used to study Riemannian manifolds, therefore we compare and contrast the two geometries throughout.

  1. Reframing Open Big Data

    DEFF Research Database (Denmark)

    Marton, Attila; Avital, Michel; Jensen, Tina Blegind

    2013-01-01

    Recent developments in the techniques and technologies of collecting, sharing and analysing data are challenging the field of information systems (IS) research let alone the boundaries of organizations and the established practices of decision-making. Coined ‘open data’ and ‘big data......’, these developments introduce an unprecedented level of societal and organizational engagement with the potential of computational data to generate new insights and information. Based on the commonalities shared by open data and big data, we develop a research framework that we refer to as open big data (OBD......) by employing the dimensions of ‘order’ and ‘relationality’. We argue that these dimensions offer a viable approach for IS research on open and big data because they address one of the core value propositions of IS; i.e. how to support organizing with computational data. We contrast these dimensions with two...

  2. Lost circulation technology workshop, October 9-10, 1984

    Energy Technology Data Exchange (ETDEWEB)

    Caskey, B.C. (ed.)

    1985-03-01

    This report summarizes the presentations and discussions of a workshop on lost circulation technology. The workshop identified and defined lost circulation problem areas in field operations, materials, mud effects, and standards. Problem solution needs were also categorized as requiring analytical evaluation and procedure, instrument, and material development.

  3. Lost Talent? The Occupational Ambitions and Attainments of Young Australians

    Science.gov (United States)

    Sikora, Joanna; Saha, Lawrence J.

    2011-01-01

    Given ongoing interest in increasing productivity and participation in the workforce, understanding when talent is lost is a useful exercise. The term "lost talent" describes the underutilisation or wastage of human potential. Focusing on young people, Sikora and Saha define lost talent as occurring when students in the top 50% of…

  4. Medical big data: promise and challenges.

    Science.gov (United States)

    Lee, Choong Ho; Yoon, Hyung-Jin

    2017-03-01

    The concept of big data, commonly characterized by volume, variety, velocity, and veracity, goes far beyond the data type and includes the aspects of data analysis, such as hypothesis-generating, rather than hypothesis-testing. Big data focuses on temporal stability of the association, rather than on causal relationship and underlying probability distribution assumptions are frequently not required. Medical big data as material to be analyzed has various features that are not only distinct from big data of other disciplines, but also distinct from traditional clinical epidemiology. Big data technology has many areas of application in healthcare, such as predictive modeling and clinical decision support, disease or safety surveillance, public health, and research. Big data analytics frequently exploits analytic methods developed in data mining, including classification, clustering, and regression. Medical big data analyses are complicated by many technical issues, such as missing values, curse of dimensionality, and bias control, and share the inherent limitations of observation study, namely the inability to test causality resulting from residual confounding and reverse causation. Recently, propensity score analysis and instrumental variable analysis have been introduced to overcome these limitations, and they have accomplished a great deal. Many challenges, such as the absence of evidence of practical benefits of big data, methodological issues including legal and ethical issues, and clinical integration and utility issues, must be overcome to realize the promise of medical big data as the fuel of a continuous learning healthcare system that will improve patient outcome and reduce waste in areas including nephrology.
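
    The propensity-score analysis mentioned above can be sketched in a few lines: estimate each subject's probability of receiving treatment from observed covariates, then compare treated subjects with untreated subjects that have similar scores. The simulation below (logistic propensity model plus 1-to-1 nearest-neighbour matching on a single invented confounder) is only a minimal illustration of the idea, not a template for clinical analyses.

        # Minimal, hypothetical propensity-score matching on simulated data: a
        # confounder x drives both treatment assignment and outcome, so the naive
        # treated-vs-untreated difference is biased; matching on the estimated
        # propensity score reduces that bias.
        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(3)
        n = 2000
        x = rng.normal(size=(n, 1))                   # observed confounder
        p_treat = 1 / (1 + np.exp(-1.5 * x[:, 0]))    # sicker patients treated more often
        treated = rng.random(n) < p_treat
        outcome = 2.0 * treated + 3.0 * x[:, 0] + rng.normal(size=n)  # true effect = 2.0

        # Step 1: propensity score = P(treatment | covariates).
        ps = LogisticRegression().fit(x, treated).predict_proba(x)[:, 1]

        # Step 2: match each treated subject to the untreated subject with the
        # closest propensity score (matching with replacement).
        treated_idx = np.where(treated)[0]
        control_idx = np.where(~treated)[0]
        matched = control_idx[np.argmin(
            np.abs(ps[treated_idx][:, None] - ps[control_idx][None, :]), axis=1)]

        naive = outcome[treated].mean() - outcome[~treated].mean()
        matched_est = (outcome[treated_idx] - outcome[matched]).mean()
        print(f"naive difference: {naive:.2f}")
        print(f"matched estimate: {matched_est:.2f} (true effect is 2.0)")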

  5. Medical big data: promise and challenges

    Directory of Open Access Journals (Sweden)

    Choong Ho Lee

    2017-03-01

    Full Text Available The concept of big data, commonly characterized by volume, variety, velocity, and veracity, goes far beyond the data type and includes the aspects of data analysis, such as hypothesis-generating, rather than hypothesis-testing. Big data focuses on temporal stability of the association, rather than on causal relationship and underlying probability distribution assumptions are frequently not required. Medical big data as material to be analyzed has various features that are not only distinct from big data of other disciplines, but also distinct from traditional clinical epidemiology. Big data technology has many areas of application in healthcare, such as predictive modeling and clinical decision support, disease or safety surveillance, public health, and research. Big data analytics frequently exploits analytic methods developed in data mining, including classification, clustering, and regression. Medical big data analyses are complicated by many technical issues, such as missing values, curse of dimensionality, and bias control, and share the inherent limitations of observation study, namely the inability to test causality resulting from residual confounding and reverse causation. Recently, propensity score analysis and instrumental variable analysis have been introduced to overcome these limitations, and they have accomplished a great deal. Many challenges, such as the absence of evidence of practical benefits of big data, methodological issues including legal and ethical issues, and clinical integration and utility issues, must be overcome to realize the promise of medical big data as the fuel of a continuous learning healthcare system that will improve patient outcome and reduce waste in areas including nephrology.

  6. What is beyond the big five?

    Science.gov (United States)

    Saucier, G; Goldberg, L R

    1998-08-01

    Previous investigators have proposed that various kinds of person-descriptive content--such as differences in attitudes or values, in sheer evaluation, in attractiveness, or in height and girth--are not adequately captured by the Big Five Model. We report on a rather exhaustive search for reliable sources of Big Five-independent variation in data from person-descriptive adjectives. Fifty-three candidate clusters were developed in a college sample using diverse approaches and sources. In a nonstudent adult sample, clusters were evaluated with respect to a minimax criterion: minimum multiple correlation with factors from Big Five markers and maximum reliability. The most clearly Big Five-independent clusters referred to Height, Girth, Religiousness, Employment Status, Youthfulness and Negative Valence (or low-base-rate attributes). Clusters referring to Fashionableness, Sensuality/Seductiveness, Beauty, Masculinity, Frugality, Humor, Wealth, Prejudice, Folksiness, Cunning, and Luck appeared to be potentially beyond the Big Five, although each of these clusters demonstrated Big Five multiple correlations of .30 to .45, and at least one correlation of .20 and over with a Big Five factor. Of all these content areas, Religiousness, Negative Valence, and the various aspects of Attractiveness were found to be represented by a substantial number of distinct, common adjectives. Results suggest directions for supplementing the Big Five when one wishes to extend variable selection outside the domain of personality traits as conventionally defined.
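
    The "minimax" criterion described above rests in part on the multiple correlation between a candidate cluster score and the Big Five factors, which is simply the square root of the R-squared from regressing the cluster score on the five marker scores. The sketch below uses random placeholder data in place of the adjective ratings analysed in the study.

        # Hedged sketch of the multiple-correlation half of the minimax criterion:
        # regress a candidate cluster score on five Big Five marker scores and take
        # the square root of R^2. All data here are random placeholders.
        import numpy as np
        from sklearn.linear_model import LinearRegression

        rng = np.random.default_rng(4)
        n = 700                                  # hypothetical number of respondents
        big_five = rng.normal(size=(n, 5))       # factor scores from Big Five markers
        # A cluster (say, Religiousness) only weakly related to the five factors.
        cluster = 0.2 * big_five[:, 0] + rng.normal(size=n)

        r_squared = LinearRegression().fit(big_five, cluster).score(big_five, cluster)
        print(f"multiple correlation with the Big Five: {np.sqrt(r_squared):.2f}")
        # Clusters combining a low multiple correlation with high reliability were
        # the ones judged to lie beyond the Big Five.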

  7. Big Data Analytics and Its Applications

    Directory of Open Access Journals (Sweden)

    Mashooque A. Memon

    2017-10-01

    Full Text Available The term Big Data has been coined to refer to the extensive volume of data that cannot be managed by traditional data handling methods or techniques. The field of Big Data plays an indispensable role in various fields, such as agriculture, banking, data mining, education, chemistry, finance, cloud computing, marketing, health care and stocks. Big data analytics is the method of examining big data to reveal hidden patterns, unknown relationships and other important information that can be used to make better decisions. There has been an ever-expanding interest in big data because of its fast growth and because it covers different areas of application. The Apache Hadoop open-source technology, written in Java and running on the Linux operating system, was used. The primary contribution of this work is to present an effective and free solution for big data applications in a distributed environment, together with its advantages and a demonstration of its ease of use. Looking ahead, there appears to be a need for an analytical review of new developments in big data technology. Healthcare is one of the greatest concerns of the world. Big data in healthcare refers to electronic health data sets that are related to patient healthcare and well-being. Data in the healthcare area is growing beyond the managing capacity of healthcare organizations and is expected to increase significantly in the coming years.
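
    Since the record above names Apache Hadoop as the processing platform, the canonical entry-level illustration of its programming model is a streaming word count, in which a mapper emits key-value pairs and a reducer aggregates them. The snippet below is a generic sketch, not code from the article, and the hadoop invocation shown in the comment is only indicative (jar and path names are hypothetical).

        #!/usr/bin/env python3
        # Generic Hadoop Streaming word count (illustrative sketch only).
        # Run as mapper with "wordcount.py map" and as reducer with
        # "wordcount.py reduce", e.g. (jar and paths are hypothetical):
        #   hadoop jar hadoop-streaming.jar -input in/ -output out/ \
        #       -mapper "wordcount.py map" -reducer "wordcount.py reduce" -file wordcount.py
        import sys
        from itertools import groupby

        def mapper():
            for line in sys.stdin:
                for word in line.strip().split():
                    print(f"{word.lower()}\t1")

        def reducer():
            # Hadoop sorts the mapper output by key before it reaches the reducer.
            pairs = (line.rstrip("\n").split("\t", 1) for line in sys.stdin)
            for word, group in groupby(pairs, key=lambda kv: kv[0]):
                print(f"{word}\t{sum(int(count) for _, count in group)}")

        if __name__ == "__main__":
            mapper() if len(sys.argv) > 1 and sys.argv[1] == "map" else reducer()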

  8. Measuring the Promise of Big Data Syllabi

    Science.gov (United States)

    Friedman, Alon

    2018-01-01

    Growing interest in Big Data is leading industries, academics and governments to accelerate Big Data research. However, how teachers should teach Big Data has not been fully examined. This article suggests criteria for redesigning Big Data syllabi in public and private degree-awarding higher education establishments. The author conducted a survey…

  9. 77 FR 27245 - Big Stone National Wildlife Refuge, Big Stone and Lac Qui Parle Counties, MN

    Science.gov (United States)

    2012-05-09

    ... DEPARTMENT OF THE INTERIOR Fish and Wildlife Service [FWS-R3-R-2012-N069; FXRS1265030000S3-123-FF03R06000] Big Stone National Wildlife Refuge, Big Stone and Lac Qui Parle Counties, MN AGENCY: Fish and... plan (CCP) and environmental assessment (EA) for Big Stone National Wildlife Refuge (Refuge, NWR) for...

  10. The BigBoss Experiment

    Energy Technology Data Exchange (ETDEWEB)

    Schelgel, D.; Abdalla, F.; Abraham, T.; Ahn, C.; Allende Prieto, C.; Annis, J.; Aubourg, E.; Azzaro, M.; Bailey, S.; Baltay, C.; Baugh, C.; Bebek, C.; Becerril, S.; Blanton, M.; Bolton, A.; Bromley, B.; Cahn, R.; Carton, P.-H.; Cervanted-Cota, J.L.; Chu, Y.; Cortes, M.; /APC, Paris /Brookhaven /IRFU, Saclay /Marseille, CPPM /Marseille, CPT /Durham U. / /IEU, Seoul /Fermilab /IAA, Granada /IAC, La Laguna / /IAC, Mexico / / /Madrid, IFT /Marseille, Lab. Astrophys. / / /New York U. /Valencia U.

    2012-06-07

    BigBOSS is a Stage IV ground-based dark energy experiment to study baryon acoustic oscillations (BAO) and the growth of structure with a wide-area galaxy and quasar redshift survey over 14,000 square degrees. It has been conditionally accepted by NOAO in response to a call for major new instrumentation and a high-impact science program for the 4-m Mayall telescope at Kitt Peak. The BigBOSS instrument is a robotically-actuated, fiber-fed spectrograph capable of taking 5000 simultaneous spectra over a wavelength range from 340 nm to 1060 nm, with a resolution R = λ/Δλ = 3000-4800. Using data from imaging surveys that are already underway, spectroscopic targets are selected that trace the underlying dark matter distribution. In particular, targets include luminous red galaxies (LRGs) up to z = 1.0, extending the BOSS LRG survey in both redshift and survey area. To probe the universe out to even higher redshift, BigBOSS will target bright [OII] emission line galaxies (ELGs) up to z = 1.7. In total, 20 million galaxy redshifts are obtained to measure the BAO feature, trace the matter power spectrum at smaller scales, and detect redshift space distortions. BigBOSS will provide additional constraints on early dark energy and on the curvature of the universe by measuring the Ly-alpha forest in the spectra of over 600,000 2.2 < z < 3.5 quasars. BigBOSS galaxy BAO measurements combined with an analysis of the broadband power, including the Ly-alpha forest in BigBOSS quasar spectra, achieve a FOM of 395 with Planck plus Stage III priors. This FOM is based on conservative assumptions for the analysis of broadband power (k_max = 0.15), and could grow to over 600 if current work allows us to push the analysis to higher wave numbers (k_max = 0.3). BigBOSS will also place constraints on theories of modified gravity and inflation, and will measure the sum of neutrino masses to 0.024 eV accuracy.

  11. Big data and educational research

    OpenAIRE

    Beneito-Montagut, Roser

    2017-01-01

    Big data and data analytics offer the promise to enhance teaching and learning, improve educational research and progress education governance. This chapter aims to contribute to the conceptual and methodological understanding of big data and analytics within educational research. It describes the opportunities and challenges that big data and analytics bring to education as well as critically explore the perils of applying a data driven approach to education. Despite the claimed value of the...

  12. Thick-Big Descriptions

    DEFF Research Database (Denmark)

    Lai, Signe Sophus

    The paper discusses the rewards and challenges of employing commercial audience measurements data – gathered by media industries for profitmaking purposes – in ethnographic research on the Internet in everyday life. It questions claims to the objectivity of big data (Anderson 2008), the assumption...... communication systems, language and behavior appear as texts, outputs, and discourses (data to be ‘found’) – big data then documents things that in earlier research required interviews and observations (data to be ‘made’) (Jensen 2014). However, web-measurement enterprises build audiences according...... to a commercial logic (boyd & Crawford 2011) and is as such directed by motives that call for specific types of sellable user data and specific segmentation strategies. In combining big data and ‘thick descriptions’ (Geertz 1973) scholars need to question how ethnographic fieldwork might map the ‘data not seen...

  13. 21 CFR 1305.16 - Lost and stolen DEA Forms 222.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 9 2010-04-01 2010-04-01 false Lost and stolen DEA Forms 222. 1305.16 Section... II CONTROLLED SUBSTANCES DEA Form 222 § 1305.16 Lost and stolen DEA Forms 222. (a) If a purchaser ascertains that an unfilled DEA Form 222 has been lost, he or she must execute another in triplicate and...

  14. Big Data's Role in Precision Public Health.

    Science.gov (United States)

    Dolley, Shawn

    2018-01-01

    Precision public health is an emerging practice to more granularly predict and understand public health risks and customize treatments for more specific and homogeneous subpopulations, often using new data, technologies, and methods. Big data is one element that has consistently helped to achieve these goals, through its ability to deliver to practitioners a volume and variety of structured or unstructured data not previously possible. Big data has enabled more widespread and specific research and trials of stratifying and segmenting populations at risk for a variety of health problems. Examples of success using big data are surveyed in surveillance and signal detection, predicting future risk, targeted interventions, and understanding disease. Using novel big data or big data approaches has risks that remain to be resolved. The continued growth in volume and variety of available data, decreased costs of data capture, and emerging computational methods mean big data success will likely be a required pillar of precision public health into the future. This review article aims to identify the precision public health use cases where big data has added value, identify classes of value that big data may bring, and outline the risks inherent in using big data in precision public health efforts.

  15. Considering lost sale in inventory routing problems for perishable goods

    DEFF Research Database (Denmark)

    Mirzaei, Samira; Seifi, Abbas

    2015-01-01

    , the average optimality gaps are less than 10.9% and 13.4% using linear and exponential lost sale functions, respectively. Furthermore, we show that the optimality gaps found by CPLEX grow exponentially with the problem size while those obtained by the proposed meta-heuristic algorithm increase linearly....... is considered as lost sale. The proposed model balances the transportation cost, the cost of inventory holding and lost sale. In addition to the usual inventory routing constraints, we consider the cost of lost sale as a linear or an exponential function of the inventory age. The proposed model is solved...
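
    The linear and exponential lost-sale functions of inventory age mentioned in this record can be written down directly; the sketch below simply evaluates two such penalty functions for a perishable item, with every coefficient invented for illustration rather than taken from the paper's instances.

        # Hypothetical lost-sale cost as a function of inventory age for a
        # perishable product; the two functional forms follow the abstract
        # (linear vs. exponential in age), but all coefficients are invented.
        import math

        UNIT_LOST_SALE_COST = 5.0   # base penalty per unit of unmet demand
        LINEAR_RATE = 0.3           # extra penalty per day of age (linear variant)
        EXP_RATE = 0.25             # growth rate per day of age (exponential variant)

        def lost_sale_cost_linear(units_short, age_days):
            return units_short * UNIT_LOST_SALE_COST * (1 + LINEAR_RATE * age_days)

        def lost_sale_cost_exponential(units_short, age_days):
            return units_short * UNIT_LOST_SALE_COST * math.exp(EXP_RATE * age_days)

        for age in (0, 2, 4, 6):
            print(f"age {age} d: linear {lost_sale_cost_linear(10, age):6.1f}, "
                  f"exponential {lost_sale_cost_exponential(10, age):6.1f}")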

  16. Big Data, indispensable today

    Directory of Open Access Journals (Sweden)

    Radu-Ioan ENACHE

    2015-10-01

    Full Text Available Big data is and will be used more in the future as a tool for everything that happens both online and offline. Of course, online is a real hobbit, Big Data is found in this medium, offering many advantages, being a real help for all consumers. In this paper we talked about Big Data as being a plus in developing new applications, by gathering useful information about the users and their behaviour. We've also presented the key aspects of real-time monitoring and the architecture principles of this technology. The most important benefit brought to this paper is presented in the cloud section.

  17. Antigravity and the big crunch/big bang transition

    Science.gov (United States)

    Bars, Itzhak; Chen, Shih-Hung; Steinhardt, Paul J.; Turok, Neil

    2012-08-01

    We point out a new phenomenon which seems to be generic in 4d effective theories of scalar fields coupled to Einstein gravity, when applied to cosmology. A lift of such theories to a Weyl-invariant extension allows one to define classical evolution through cosmological singularities unambiguously, and hence construct geodesically complete background spacetimes. An attractor mechanism ensures that, at the level of the effective theory, generic solutions undergo a big crunch/big bang transition by contracting to zero size, passing through a brief antigravity phase, shrinking to zero size again, and re-emerging into an expanding normal gravity phase. The result may be useful for the construction of complete bouncing cosmologies like the cyclic model.

  18. Antigravity and the big crunch/big bang transition

    Energy Technology Data Exchange (ETDEWEB)

    Bars, Itzhak [Department of Physics and Astronomy, University of Southern California, Los Angeles, CA 90089-2535 (United States); Chen, Shih-Hung [Perimeter Institute for Theoretical Physics, Waterloo, ON N2L 2Y5 (Canada); Department of Physics and School of Earth and Space Exploration, Arizona State University, Tempe, AZ 85287-1404 (United States); Steinhardt, Paul J., E-mail: steinh@princeton.edu [Department of Physics and Princeton Center for Theoretical Physics, Princeton University, Princeton, NJ 08544 (United States); Turok, Neil [Perimeter Institute for Theoretical Physics, Waterloo, ON N2L 2Y5 (Canada)

    2012-08-29

    We point out a new phenomenon which seems to be generic in 4d effective theories of scalar fields coupled to Einstein gravity, when applied to cosmology. A lift of such theories to a Weyl-invariant extension allows one to define classical evolution through cosmological singularities unambiguously, and hence construct geodesically complete background spacetimes. An attractor mechanism ensures that, at the level of the effective theory, generic solutions undergo a big crunch/big bang transition by contracting to zero size, passing through a brief antigravity phase, shrinking to zero size again, and re-emerging into an expanding normal gravity phase. The result may be useful for the construction of complete bouncing cosmologies like the cyclic model.

  19. Antigravity and the big crunch/big bang transition

    International Nuclear Information System (INIS)

    Bars, Itzhak; Chen, Shih-Hung; Steinhardt, Paul J.; Turok, Neil

    2012-01-01

    We point out a new phenomenon which seems to be generic in 4d effective theories of scalar fields coupled to Einstein gravity, when applied to cosmology. A lift of such theories to a Weyl-invariant extension allows one to define classical evolution through cosmological singularities unambiguously, and hence construct geodesically complete background spacetimes. An attractor mechanism ensures that, at the level of the effective theory, generic solutions undergo a big crunch/big bang transition by contracting to zero size, passing through a brief antigravity phase, shrinking to zero size again, and re-emerging into an expanding normal gravity phase. The result may be useful for the construction of complete bouncing cosmologies like the cyclic model.

  20. Reproduction and conservation of the Magdalena River turtle (Podocnemis lewyana) in the Claro Cocorna Sur River, Colombia

    International Nuclear Information System (INIS)

    Ceballos, Claudia P; Romero, Isabel; Gomez Saldarriaga, Catalina; Miranda, Karla

    2014-01-01

    The Magdalena river turtle, Podocnemis lewyana, is an endangered and endemic turtle from Colombia. Among the most important information needed to conserve endangered species is the identification, monitoring, and protection of the sites used by the species to reproduce and grow. In this study we report, for the first time, the reproductive output and the nesting beaches of P. lewyana in the Claro Cocorna Sur River, a tributary of the Magdalena river drainage. We systematically examined a river transect of 8 km with 14 sandy beaches during two nesting seasons in one year. We recorded a yearly production of 47 clutches, 957 eggs, and two preferred nesting beaches: Alto Bonito with 51 % and Belgica with 28.3 % of this reproductive output. Aafuver, a community-based organization, has led a headstarting program since 2010 to decrease in-situ egg mortality due to predation on nesting beaches. Aafuver collects and incubates the eggs ex-situ, raises the hatchlings for one to five months and then releases them into the same river. To understand the potential effects of such egg manipulation, we monitored and compared in-situ and ex-situ incubation temperatures. We found ex-situ temperatures below the pivotal temperature known for P. lewyana and below the temperatures on the nesting beaches. Finally, we monitored hatchling growth under Aafuver captive conditions and found that hatchlings doubled their body mass during the first three months of age. Egg weight was strongly associated with body weight at hatching; however, this association is lost by the third month of age. We strongly encourage supporting this community-based conservation program, and the protection of the Claro Cocorna Sur River as an important nesting and growth habitat for the conservation of P. lewyana.

  1. Big data: een zoektocht naar instituties

    NARCIS (Netherlands)

    van der Voort, H.G.; Crompvoets, J

    2016-01-01

    Big data is a well-known phenomenon, even a buzzword nowadays. It refers to an abundance of data and new possibilities to process and use them. Big data is subject of many publications. Some pay attention to the many possibilities of big data, others warn us for their consequences. This special

  2. Data, Data, Data : Big, Linked & Open

    NARCIS (Netherlands)

    Folmer, E.J.A.; Krukkert, D.; Eckartz, S.M.

    2013-01-01

    The entire business and IT world is currently talking about Big Data, a trend that overtook Cloud Computing in mid-2013 (based on Google Trends). Policy makers are also actively engaged with Big Data. Neelie Kroes, vice-president of the European Commission, speaks of the ‘Big Data

  3. Flood-inundation maps for a 12.5-mile reach of Big Papillion Creek at Omaha, Nebraska

    Science.gov (United States)

    Strauch, Kellan R.; Dietsch, Benjamin J.; Anderson, Kayla J.

    2016-03-22

    Digital flood-inundation maps for a 12.5-mile reach of the Big Papillion Creek from 0.6 mile upstream from the State Street Bridge to the 72nd Street Bridge in Omaha, Nebraska, were created by the U.S. Geological Survey (USGS) in cooperation with the Papio-Missouri River Natural Resources District. The flood-inundation maps, which can be accessed through the USGS Flood Inundation Mapping Science Web site at http://water.usgs.gov/osw/flood_inundation/, depict estimates of the areal extent and depth of flooding corresponding to selected water levels (stages) at the USGS streamgage on the Big Papillion Creek at Fort Street at Omaha, Nebraska (station 06610732). Near-real-time stages at this streamgage may be obtained on the Internet from the USGS National Water Information System at http://waterdata.usgs.gov/ or the National Weather Service Advanced Hydrologic Prediction Service at http:/water.weather.gov/ahps/, which also forecasts flood hydrographs at this site.

  4. Methods and tools for big data visualization

    OpenAIRE

    Zubova, Jelena; Kurasova, Olga

    2015-01-01

    In this paper, methods and tools for big data visualization are investigated. Challenges faced in big data analysis and visualization are identified, and technologies for big data analysis are discussed. A review of methods and tools for big data visualization is given, and the functionalities of the tools are demonstrated with examples in order to highlight their advantages and disadvantages.

  5. Big data analytics methods and applications

    CERN Document Server

    Rao, BLS; Rao, SB

    2016-01-01

    This book is a collection of articles written by Big Data experts describing some of the cutting-edge methods and applications from their respective areas of interest, and it provides the reader with a detailed overview of the field of Big Data Analytics as it is practiced today. The chapters cover technical aspects of key areas that generate and use Big Data such as management and finance; medicine and healthcare; genome, cytome and microbiome; graphs and networks; Internet of Things; Big Data standards; benchmarking of systems; and others. In addition to different applications, key algorithmic approaches such as graph partitioning, clustering and finite mixture modelling of high-dimensional data are also covered. The varied collection of themes in this volume introduces the reader to the richness of the emerging field of Big Data Analytics.

  6. The Big bang and the Quantum

    Science.gov (United States)

    Ashtekar, Abhay

    2010-06-01

    General relativity predicts that space-time comes to an end and physics comes to a halt at the big-bang. Recent developments in loop quantum cosmology have shown that these predictions cannot be trusted. Quantum geometry effects can resolve singularities, thereby opening new vistas. Examples are: The big bang is replaced by a quantum bounce; the `horizon problem' disappears; immediately after the big bounce, there is a super-inflationary phase with its own phenomenological ramifications; and, in the presence of a standard inflation potential, initial conditions are naturally set for a long, slow roll inflation independently of what happens in the pre-big bang branch. As in my talk at the conference, I will first discuss the foundational issues and then the implications of the new Planck scale physics near the Big Bang.

  7. Big Bang baryosynthesis

    International Nuclear Information System (INIS)

    Turner, M.S.; Chicago Univ., IL

    1983-01-01

    In these lectures I briefly review Big Bang baryosynthesis. In the first lecture I discuss the evidence which exists for the BAU, the failure of non-GUT symmetrical cosmologies, the qualitative picture of baryosynthesis, and numerical results of detailed baryosynthesis calculations. In the second lecture I discuss the requisite CP violation in some detail, as well as the statistical mechanics of baryosynthesis, possible complications to the simplest scenario, and one cosmological implication of Big Bang baryosynthesis. (orig./HSI)

  8. 36 CFR 327.16 - Lost and found articles.

    Science.gov (United States)

    2010-07-01

    ... 36 Parks, Forests, and Public Property 3 2010-07-01 2010-07-01 false Lost and found articles. 327... CHIEF OF ENGINEERS § 327.16 Lost and found articles. All articles found shall be deposited by the finder at the Manager's office or with a ranger. All such articles shall be disposed of in accordance with...

  9. Exploiting big data for critical care research.

    Science.gov (United States)

    Docherty, Annemarie B; Lone, Nazir I

    2015-10-01

    Over recent years the digitalization, collection and storage of vast quantities of data, in combination with advances in data science, have opened up a new era of big data. In this review, we define big data, identify examples of critical care research using big data, discuss the limitations and ethical concerns of using these large datasets and finally consider scope for future research. Big data refers to datasets whose size, complexity and dynamic nature are beyond the scope of traditional data collection and analysis methods. The potential benefits to critical care are significant, with faster progress in improving health and better value for money. Although not replacing clinical trials, big data can improve their design and advance the field of precision medicine. However, there are limitations to analysing big data using observational methods. In addition, there are ethical concerns regarding maintaining confidentiality of patients who contribute to these datasets. Big data have the potential to improve medical care and reduce costs, both by individualizing medicine, and bringing together multiple sources of data about individual patients. As big data become increasingly mainstream, it will be important to maintain public confidence by safeguarding data security, governance and confidentiality.

  10. Empathy and the Big Five

    OpenAIRE

    Paulus, Christoph

    2016-01-01

    More than 10 years ago, Del Barrio et al. (2004) attempted to establish a direct relationship between empathy and the Big Five. On average, the women in their sample had higher scores on empathy and on the Big Five factors, with the exception of Neuroticism. They found associations between empathy and Openness, Agreeableness, Conscientiousness, and Extraversion. In our data, women likewise score significantly higher on both empathy and the Big Five...

  11. Net lost revenue from DSM: State policies that work

    Energy Technology Data Exchange (ETDEWEB)

    Baxter, L.W.

    1995-07-01

    A key utility regulatory reform undertaken since 1989 allows utilities to recover the lost revenue incurred through successful operation of demand-side management (DSM) programs. Net lost revenue adjustment (NLRA) mechanisms are states' preferred approach to lost revenue recovery from DSM programs. This paper examines the experiences states and utilities are having with the NLRA approach. The paper has three objectives: (1) determine whether NLRA is a feasible and effective approach to the lost-revenue disincentive for utility DSM programs, (2) identify the conditions linked to effective implementation of NLRA mechanisms and assess whether NLRA has changed utility investment behavior, and (3) suggest improvements to NLRA mechanisms. Contrary to the concerns raised by some industry analysts, our results indicate NLRA is a feasible approach. Seven of the ten states we studied report no substantial problems with their approach. We observe several conditions linked to effective NLRA implementation. Observed changes in utility investment behavior occur after implementation of DSM rate reforms, which include deployment of NLRA mechanisms. Utilities in states with lost revenue recovery invest more than twice as much in DSM as do utilities in other states.

  12. Years of life lost due to external radiation exposure

    Directory of Open Access Journals (Sweden)

    Raičević Jagoš J.

    2004-01-01

    Full Text Available In this paper a new approach for calculating the years of life lost per excess death due to stochastic health effects is applied to external exposure pathways. The short-term external exposures are due to the passage of the radioactive cloud and to skin and clothing contamination. The long-term external exposure is that from radioactive material deposited on the ground (groundshine). Three nuclides, 131I, 137Cs, and 239Pu, with widely differing half-lives, are considered in order to examine their possible influence on the calculated values of years of life lost. For each of these nuclides, the number of years of life lost has been found to be a decreasing function of the age at exposure and is presented graphically in this paper. For protracted exposures, the fully averaged number of years of life lost is negatively correlated with the nuclide’s half-life. On the other hand, the short-term external exposures do not depend on the nuclide’s half-life. In addition, a weak dependence of the years of life lost on dose is noted.

  13. Experimental feeding of DDE and PCB to female big brown bats (Eptesicus fuscus)

    Science.gov (United States)

    Clark, D.R.; Prouty, R.M.

    1977-01-01

    Twenty-two female big brown bats (Eptesicus fuscus) were collected in a house attic in Montgomery County, Maryland. Seventeen were fed mealworms (Tenebrio molitor larvae) that contained 166 ppm DDE; the other five were fed uncontaminated mealworms. After 54 days of feeding, six dosed bats were frozen and the remaining 16 were starved to death. In a second experiment, 21 female big brown bats were collected in a house attic in Prince Georges County, Maryland. Sixteen were fed mealworms that contained 9.4 ppm Aroclor 1254 (PCB). After 37 days, two bats had died, four dosed bats were frozen, and the remaining 15 were starved to death. Starvation caused mobilization of stored residues. After the feeding periods, average weights of all four groups (DDE-dosed, DDE control, PCB-dosed, PCB control) had increased. However, weights of DDE-dosed bats had increased significantly more than those of their controls, whereas weights of PCB-dosed bats had increased significantly less than those of their controls. During starvation, PCB-dosed bats lost weight significantly more slowly than controls. Because PCB levels in dosed bats resembled levels found in some free-living big brown bats, PCBs may be slowing metabolic rates of some free-living bats. It is not known how various common organochlorine residues may affect metabolism in hibernating bats. DDE and PCB increased in brains of starving bats as carcass fat was metabolized. Because the tremors and/or convulsions characteristic of neurotoxicity were not observed, we think even the maximum brain levels attained (132 ppm DDE, 20 ppm PCB) were sublethal. However, extrapolation of our DDE data predicted lethal brain levels when fat reserves declined sufficiently. PCB-dosed bats were probably in no danger of neurotoxic poisoning. However, PCB can kill by a nonneurotoxic mode, and this could explain the deaths of two bats on PCB dosage.

  14. Natural equilibria and anthropic effects on sediment transport in big river systems: The Nile case

    Science.gov (United States)

    Garzanti, Eduardo; Andò, Sergio; Padoan, Marta; Vezzoli, Giovanni; Villa, Igor

    2014-05-01

    The Nile River flows for ~ 6700 km, from Burundi and Rwanda highlands south of the Equator to the Mediterranean Sea at northern subtropical latitudes. It is thus the longest natural laboratory on Earth, a unique setting in which we are carrying out a continuing research project to investigate changes in sediment composition associated with a variety of chemical and physical processes, including weathering in equatorial climate and hydraulic sorting during transport and deposition. Petrographic, mineralogical, chemical, and isotopic fingerprints of sand and mud have been monitored along all Nile branches, from the Kagera and White Nile draining Archean, Paleoproterozoic and Mesoproterozoic basements uplifted along the western branch of the East African rift, to the Blue Nile and Atbara Rivers sourced in Ethiopian volcanic highlands made of Oligocene basalt. Downstream of the Atbara confluence, the Nile receives no significant tributary water and hardly any rainfall across the Sahara. After construction of the Aswan High Dam in 1964, the Nile ceased to be an active conveyor-belt in Egypt, where the mighty river has been tamed to a water canal; transported sediments are thus chiefly reworked from older bed and levee deposits, with minor contributions from widyan sourced in the Red Sea Hills and wind-blown desert sand and dust. Extensive dam construction has determined a dramatic sediment deficit at the mouth, where deltaic cusps are undergoing ravaging erosion. Nile delta sediments are thus recycled under the effect of dominant waves from the northwest, the longest Mediterranean fetch direction. Nile sands, progressively enriched in more stable minerals such as quartz and amphiboles relative to volcanic rock fragments and pyroxene, thus undergo multistep transport by E- and NE-directed longshore currents all along the coast of Egypt and Palestine, and are carried as far as Akko Bay in northern Israel. Nile mud reaches the Iskenderun Gulf in southern Turkey. A full

  15. Big domains are novel Ca²+-binding modules: evidences from big domains of Leptospira immunoglobulin-like (Lig) proteins.

    Science.gov (United States)

    Raman, Rajeev; Rajanikanth, V; Palaniappan, Raghavan U M; Lin, Yi-Pin; He, Hongxuan; McDonough, Sean P; Sharma, Yogendra; Chang, Yung-Fu

    2010-12-29

    Many bacterial surface exposed proteins mediate the host-pathogen interaction more effectively in the presence of Ca²+. Leptospiral immunoglobulin-like (Lig) proteins, LigA and LigB, are surface exposed proteins containing Bacterial immunoglobulin-like (Big) domains. The function of proteins which contain the Big fold is not known. Based on the possible similarities of the immunoglobulin and βγ-crystallin folds, we here explore the important question of whether Ca²+ binds to Big domains, which would provide a novel functional role for proteins containing the Big fold. We selected six individual Big domains for this study (three from the conserved part of LigA and LigB, denoted Lig A3, Lig A4, and LigBCon5; two from the variable region of LigA, i.e., the 9th (Lig A9) and 10th (Lig A10) repeats; and one from the variable region of LigB, i.e., LigBCen2). We have also studied the conserved region covering the three and six repeats (LigBCon1-3 and LigCon). All these proteins bind the calcium-mimic dye Stains-all. All four selected domains bind Ca²+ with dissociation constants of 2-4 µM. The Lig A9 and Lig A10 domains fold well with moderate thermal stability, have β-sheet conformation and form homodimers. Fluorescence spectra of the Big domains show a specific doublet (at 317 and 330 nm), probably due to Trp interaction with a Phe residue. Equilibrium unfolding of the selected Big domains is similar and follows a two-state model, suggesting the similarity of their folds. We demonstrate that the Lig proteins are Ca²+-binding proteins, with the Big domains harbouring the binding motif. We conclude that despite differences in sequence, a Big motif binds Ca²+. This work thus sets up a strong possibility for classifying proteins containing Big domains as a novel family of Ca²+-binding proteins. Since the Big domain is a part of many proteins in the bacterial kingdom, we suggest a possible function for these proteins via Ca²+ binding.

  16. Mesohabitats, fish assemblage composition, and mesohabitat use of the Rio Grande silvery minnow over a range of seasonal flow regimes in the Rio Grande/Rio Bravo del Norte, in and near Big Bend National Park, Texas, 2010-11

    Science.gov (United States)

    Moring, J. Bruce; Braun, Christopher L.; Pearson, Daniel K.

    2014-01-01

    In 2010–11, the U.S. Geological Survey (USGS), in cooperation with the U.S. Fish and Wildlife Service, evaluated the physical characteristics and fish assemblage composition of mapped river mesohabitats at four sites on the Rio Grande/Rio Bravo del Norte (hereinafter Rio Grande) in and near Big Bend National Park, Texas. The four sites used for the river habitat study were colocated with sites where the U.S. Fish and Wildlife Service has implemented an experimental reintroduction of the Rio Grande silvery minnow (Hybognathus amarus), a federally listed endangered species, into part of the historical range of this species. The four sites from upstream to downstream are USGS station 08374340 Rio Grande at Contrabando Canyon near Lajitas, Tex. (hereinafter the Contrabando site), USGS station 290956103363600 Rio Grande at Santa Elena Canyon, Big Bend National Park, Tex. (hereinafter the Santa Elena site), USGS station 291046102573900 Rio Grande near Ranger Station at Rio Grande Village, Tex. (hereinafter the Rio Grande Village site), and USGS station 292354102491100 Rio Grande above Stillwell Crossing near Big Bend National Park, Tex. (hereinafter the Stillwell Crossing site).

  17. Semantic Web Technologies and Big Data Infrastructures: SPARQL Federated Querying of Heterogeneous Big Data Stores

    OpenAIRE

    Konstantopoulos, Stasinos; Charalambidis, Angelos; Mouchakis, Giannis; Troumpoukis, Antonis; Jakobitsch, Jürgen; Karkaletsis, Vangelis

    2016-01-01

    The ability to cross-link large-scale data with each other and with structured Semantic Web data, and the ability to uniformly process Semantic Web and other data, adds value to both the Semantic Web and the Big Data community. This paper presents work in progress towards integrating Big Data infrastructures with Semantic Web technologies, allowing for the cross-linking and uniform retrieval of data stored in both Big Data infrastructures and Semantic Web data. The technical challenges invo...

  18. Quantum fields in a big-crunch-big-bang spacetime

    International Nuclear Information System (INIS)

    Tolley, Andrew J.; Turok, Neil

    2002-01-01

    We consider quantum field theory on a spacetime representing the big-crunch-big-bang transition postulated in ekpyrotic or cyclic cosmologies. We show via several independent methods that an essentially unique matching rule holds connecting the incoming state, in which a single extra dimension shrinks to zero, to the outgoing state in which it reexpands at the same rate. For free fields in our construction there is no particle production from the incoming adiabatic vacuum. When interactions are included the particle production for fixed external momentum is finite at the tree level. We discuss a formal correspondence between our construction and quantum field theory on de Sitter spacetime

  19. Turning big bang into big bounce: II. Quantum dynamics

    Energy Technology Data Exchange (ETDEWEB)

    Malkiewicz, Przemyslaw; Piechocki, Wlodzimierz, E-mail: pmalk@fuw.edu.p, E-mail: piech@fuw.edu.p [Theoretical Physics Department, Institute for Nuclear Studies, Hoza 69, 00-681 Warsaw (Poland)

    2010-11-21

    We analyze the big bounce transition of the quantum Friedmann-Robertson-Walker model in the setting of the nonstandard loop quantum cosmology (LQC). Elementary observables are used to quantize composite observables. The spectrum of the energy density operator is bounded and continuous. The spectrum of the volume operator is bounded from below and discrete, with equidistant levels defining a quantum of the volume. The discreteness may imply a foamy structure of spacetime at a semiclassical level which may be detected in astrophysical and cosmological observations. The nonstandard LQC method has a free parameter that should be fixed in some way to specify the big bounce transition.

  20. Scaling Big Data Cleansing

    KAUST Repository

    Khayyat, Zuhair

    2017-07-31

    Data cleansing approaches have usually focused on detecting and fixing errors with little attention to big data scaling. This presents a serious impediment since identifying and repairing dirty data often involves processing huge input datasets, handling sophisticated error discovery approaches and managing huge arbitrary errors. With large datasets, error detection becomes overly expensive and complicated especially when considering user-defined functions. Furthermore, a distinctive algorithm is desired to optimize inequality joins in sophisticated error discovery rather than naïvely parallelizing them. Also, when repairing large errors, their skewed distribution may obstruct effective error repairs. In this dissertation, I present solutions to overcome the above three problems in scaling data cleansing. First, I present BigDansing as a general system to tackle efficiency, scalability, and ease-of-use issues in data cleansing for Big Data. It automatically parallelizes the user's code on top of general-purpose distributed platforms. Its programming interface allows users to express data quality rules independently from the requirements of parallel and distributed environments. Without sacrificing their quality, BigDansing also enables parallel execution of serial repair algorithms by exploiting the graph representation of discovered errors. The experimental results show that BigDansing outperforms existing baselines up to more than two orders of magnitude. Although BigDansing scales cleansing jobs, it still lacks the ability to handle sophisticated error discovery requiring inequality joins. Therefore, I developed IEJoin as an algorithm for fast inequality joins. It is based on sorted arrays and space-efficient bit-arrays to reduce the problem's search space. By comparing IEJoin against well-known optimizations, I show that it is more scalable, and several orders of magnitude faster. BigDansing depends on vertex-centric graph systems, i.e., Pregel
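
    The abstract describes IEJoin only at a high level. As an illustration of the underlying idea (sorting on both join attributes and scanning an array of already-visited rows instead of naïvely comparing every pair), a minimal Python sketch of an inequality self-join follows. The function name, the specific pair of predicates, and the assumption of distinct attribute values are illustrative choices, not details taken from the dissertation, which handles duplicates and uses compact bit arrays and optimized scans.

    ```python
    def inequality_self_join(rows, key_x, key_y):
        """Report all index pairs (i, j) with rows[i][key_x] < rows[j][key_x]
        and rows[i][key_y] > rows[j][key_y].

        Simplified sketch of a sort-based inequality join; assumes all X and Y
        values are distinct (the full algorithm handles ties separately).
        """
        n = len(rows)
        by_y = sorted(range(n), key=lambda r: rows[r][key_y])   # ascending Y order
        pos_in_y = {r: p for p, r in enumerate(by_y)}           # row -> Y position
        visited = [False] * n                                   # indexed by Y position
        result = []
        # Visit rows in ascending X order; every previously visited row has a smaller X,
        # so any visited row sitting at a larger Y position satisfies both predicates.
        for j in sorted(range(n), key=lambda r: rows[r][key_x]):
            for p in range(pos_in_y[j] + 1, n):
                if visited[p]:
                    result.append((by_y[p], j))
            visited[pos_in_y[j]] = True
        return result

    # Example: pairs where one row has lower cost but higher duration than the other.
    rows = [{"cost": 10, "duration": 90}, {"cost": 20, "duration": 70},
            {"cost": 15, "duration": 80}, {"cost": 30, "duration": 60}]
    print(inequality_self_join(rows, "cost", "duration"))
    ```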

  1. The ethics of big data in big agriculture

    Directory of Open Access Journals (Sweden)

    Isabelle M. Carbonell

    2016-03-01

    Full Text Available This paper examines the ethics of big data in agriculture, focusing on the power asymmetry between farmers and large agribusinesses like Monsanto. Following the recent purchase of Climate Corp., Monsanto is currently the most prominent biotech agribusiness to buy into big data. With wireless sensors on tractors monitoring or dictating every decision a farmer makes, Monsanto can now aggregate large quantities of previously proprietary farming data, enabling a privileged position with unique insights on a field-by-field basis into a third or more of the US farmland. This power asymmetry may be rebalanced through open-sourced data, and publicly-funded data analytic tools which rival Climate Corp. in complexity and innovation for use in the public domain.

  2. Homogeneous and isotropic big rips?

    CERN Document Server

    Giovannini, Massimo

    2005-01-01

    We investigate the way big rips are approached in a fully inhomogeneous description of the space-time geometry. If the pressure and energy densities are connected by a (supernegative) barotropic index, the spatial gradients and the anisotropic expansion decay as the big rip is approached. This behaviour is contrasted with the usual big-bang singularities. A similar analysis is performed in the case of sudden (quiescent) singularities and it is argued that the spatial gradients may well be non-negligible in the vicinity of pressure singularities.

  3. Rate Change Big Bang Theory

    Science.gov (United States)

    Strickland, Ken

    2013-04-01

    The Rate Change Big Bang Theory redefines the birth of the universe with a dramatic shift in energy direction and a new vision of the first moments. With rate change graph technology (RCGT) we can look back 13.7 billion years and experience every step of the big bang through geometrical intersection technology. The analysis of the Big Bang includes a visualization of the first objects, their properties, the astounding event that created space and time as well as a solution to the mystery of anti-matter.

  4. Intelligent Test Mechanism Design of Worn Big Gear

    Directory of Open Access Journals (Sweden)

    Hong-Yu LIU

    2014-10-01

    Full Text Available With the continuous development of the national economy, big gears are widely used in the metallurgy and mining industries, where they play an important role. In practical production, gear wear and tooth breakage occur frequently, disrupting normal production and causing unnecessary economic loss. An intelligent test method for worn big gears is proposed, motivated mainly by the high production cost, long production cycle, and labor-intensive manual repair welding associated with such gears. The measurement equations of the involute spur gear were transformed from their original polar-coordinate form into rectangular coordinates. The measurement principle for big-gear wear is introduced, a schematic diagram of the detection setup is given, and the method for realizing the detection path is described. An OADM12 laser sensor was selected, and detection of the worn area of the big gear was realized by the detection mechanism. Measured data from an unworn gear and a worn gear were loaded into a calculation program written in Visual Basic, from which the wear volume of the big gear can be obtained. This provides a feasible method for intelligent testing and intelligent repair welding of worn big gears.
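
    The coordinate transformation mentioned in the abstract is not spelled out there. As a point of reference, the standard parametric equations of an involute tooth flank in rectangular coordinates are sketched below in Python; the base-circle radius and the sampled roll angles are illustrative values, not parameters from the paper, and the paper's own measurement equations and sensor geometry are not reproduced.

    ```python
    import math

    def involute_profile(base_radius, roll_angles_deg):
        """Rectangular coordinates of points on an involute tooth flank.

        The involute of a base circle of radius rb, written in rectangular form, is
            x = rb * (cos t + t * sin t),   y = rb * (sin t - t * cos t),
        where t is the roll angle in radians.
        """
        points = []
        for deg in roll_angles_deg:
            t = math.radians(deg)
            x = base_radius * (math.cos(t) + t * math.sin(t))
            y = base_radius * (math.sin(t) - t * math.cos(t))
            points.append((x, y))
        return points

    # A nominal (unworn) flank could be compared point-by-point against laser
    # measurements of a worn flank to estimate the material lost to abrasion.
    nominal = involute_profile(base_radius=100.0, roll_angles_deg=range(0, 31, 5))
    print(nominal)
    ```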

  5. [Big data in medicine and healthcare].

    Science.gov (United States)

    Rüping, Stefan

    2015-08-01

    Healthcare is one of the business fields with the highest Big Data potential. According to the prevailing definition, Big Data refers to the fact that data today is often too large and heterogeneous and changes too quickly to be stored, processed, and transformed into value by previous technologies. Technological trends drive Big Data: business processes are increasingly executed electronically, consumers produce more and more data themselves, for example in social networks, and digitalization keeps increasing. Currently, several new trends towards new data sources and innovative data analysis appear in medicine and healthcare. From the research perspective, omics research is one clear Big Data topic. In practice, electronic health records, free open data and the "quantified self" offer new perspectives for data analytics. Regarding analytics, significant advances have been made in information extraction from text data, which unlocks a lot of data from clinical documentation for analytics purposes. At the same time, medicine and healthcare are lagging behind in the adoption of Big Data approaches. This can be traced to particular problems regarding data complexity and organizational, legal, and ethical challenges. The growing uptake of Big Data in general, and first best-practice examples in medicine and healthcare in particular, indicate that innovative solutions will be coming. This paper gives an overview of the potentials of Big Data in medicine and healthcare.

  6. The lost art of finding our way

    CERN Document Server

    Huth, Edward John

    2013-01-01

    Long before GPS, Google Earth, and global transit, humans traveled vast distances using only environmental clues and simple instruments. John Huth asks what is lost when modern technology substitutes for our innate capacity to find our way. Encyclopedic in breadth, weaving together astronomy, meteorology, oceanography, and ethnography, The Lost Art of Finding Our Way puts us in the shoes, ships, and sleds of early navigators for whom paying close attention to the environment around them was, quite literally, a matter of life and death. Haunted by the fate of two young kayakers lost in a fogbank off Nantucket, Huth shows us how to navigate using natural phenomena—the way the Vikings used the sunstone to detect polarization of sunlight, and Arab traders learned to sail into the wind, and Pacific Islanders used underwater lightning and “read” waves to guide their explorations. Huth reminds us that we are all navigators capable of learning techniques ranging from the simplest to the most sophisticated skil...

  7. Concentration Trends for Lead and Calcium-Normalized Lead in Fish Fillets from the Big River, a Mining-Contaminated Stream in Southeastern Missouri USA.

    Science.gov (United States)

    Schmitt, Christopher J; McKee, Michael J

    2016-11-01

    Lead (Pb) and calcium (Ca) concentrations were measured in fillet samples of longear sunfish (Lepomis megalotis) and redhorse suckers (Moxostoma spp.) collected in 2005-2012 from the Big River, which drains a historical mining area in southeastern Missouri and where a consumption advisory is in effect due to elevated Pb concentrations in fish. Lead tends to accumulate in Ca-rich tissues such as bone and scale. Concentrations of Pb in fish muscle are typically low, but can become elevated in fillets from Pb-contaminated sites depending in part on how much bone, scale, and skin is included in the sample. We used analysis-of-covariance to normalize Pb concentration to the geometric mean Ca concentration (415 ug/g wet weight, ww), which reduced variation between taxa, sites, and years, as well as the number of samples that exceeded the Missouri consumption advisory threshold (300 ng/g ww). Concentrations of Pb in 2005-2012 were lower than in the past, especially after Ca-normalization, but the consumption advisory is still warranted because concentrations were >300 ng/g ww in samples of both taxa from contaminated sites. For monitoring purposes, a simple linear regression model is proposed for estimating Ca-normalized Pb concentrations in fillets from Pb:Ca molar ratios as a way of reducing the effects of differing preparation methods on fillet Pb variation.
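
    The covariance-based normalization summarized above can be illustrated with a small Python sketch: fit the Pb-Ca relationship across samples, then shift each observation along the fitted slope to the reference Ca concentration. The log-log model form, the helper name, and the example values are assumptions made here for illustration; they are not the regression reported by the authors.

    ```python
    import numpy as np

    def ca_normalized_pb(pb, ca, ca_ref=415.0):
        """Adjust fillet Pb concentrations to a reference Ca concentration.

        pb, ca : arrays of Pb (ng/g ww) and Ca (ug/g ww) measured in the same fillets.
        ca_ref : reference Ca level; 415 ug/g ww is the geometric mean cited above.
        A log-log linear fit is assumed purely for illustration (ANCOVA-style
        covariate adjustment); the authors' exact model terms are not reproduced.
        """
        log_pb, log_ca = np.log(pb), np.log(ca)
        slope, intercept = np.polyfit(log_ca, log_pb, 1)   # OLS fit of log Pb on log Ca
        # Shift each observation along the fitted slope to the reference Ca level.
        return np.exp(log_pb + slope * (np.log(ca_ref) - log_ca))

    # Hypothetical example values, not data from the study:
    pb = np.array([120.0, 410.0, 95.0, 880.0])    # ng/g ww
    ca = np.array([300.0, 900.0, 250.0, 1500.0])  # ug/g ww
    print(ca_normalized_pb(pb, ca))
    ```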

  8. Concentration trends for lead and calcium-normalized lead in fish fillets from the Big River, a mining-contaminated stream in southeastern Missouri USA

    Science.gov (United States)

    Schmitt, Christopher J.; McKee, Michael J.

    2016-01-01

    Lead (Pb) and calcium (Ca) concentrations were measured in fillet samples of longear sunfish (Lepomis megalotis) and redhorse suckers (Moxostoma spp.) collected in 2005–2012 from the Big River, which drains a historical mining area in southeastern Missouri and where a consumption advisory is in effect due to elevated Pb concentrations in fish. Lead tends to accumulate in Ca-rich tissues such as bone and scale. Concentrations of Pb in fish muscle are typically low, but can become elevated in fillets from Pb-contaminated sites depending in part on how much bone, scale, and skin is included in the sample. We used analysis-of-covariance to normalize Pb concentration to the geometric mean Ca concentration (415 ug/g wet weight, ww), which reduced variation between taxa, sites, and years, as well as the number of samples that exceeded the Missouri consumption advisory threshold (300 ng/g ww). Concentrations of Pb in 2005–2012 were lower than in the past, especially after Ca-normalization, but the consumption advisory is still warranted because concentrations were >300 ng/g ww in samples of both taxa from contaminated sites. For monitoring purposes, a simple linear regression model is proposed for estimating Ca-normalized Pb concentrations in fillets from Pb:Ca molar ratios as a way of reducing the effects of differing preparation methods on fillet Pb variation.

  9. From Big Data to Big Business

    DEFF Research Database (Denmark)

    Lund Pedersen, Carsten

    2017-01-01

    Idea in Brief: Problem: There is an enormous profit potential for manufacturing firms in big data, but one of the key barriers to obtaining data-driven growth is the lack of knowledge about which capabilities are needed to extract value and profit from data. Solution: We (BDBB research group at C...

  10. Making big sense from big data in toxicology by read-across.

    Science.gov (United States)

    Hartung, Thomas

    2016-01-01

    Modern information technologies have made big data available in safety sciences, i.e., extremely large data sets that may be analyzed only computationally to reveal patterns, trends and associations. This happens by (1) compilation of large sets of existing data, e.g., as a result of the European REACH regulation, (2) the use of omics technologies and (3) systematic robotized testing in a high-throughput manner. All three approaches and some other high-content technologies leave us with big data--the challenge is now to make big sense of these data. Read-across, i.e., the local similarity-based intrapolation of properties, is gaining momentum with increasing data availability and consensus on how to process and report it. It is predominantly applied to in vivo test data as a gap-filling approach, but can similarly complement other incomplete datasets. Big data are first of all repositories for finding similar substances and ensure that the available data is fully exploited. High-content and high-throughput approaches similarly require focusing on clusters, in this case formed by underlying mechanisms such as pathways of toxicity. The closely connected properties, i.e., structural and biological similarity, create the confidence needed for predictions of toxic properties. Here, a new web-based tool under development called REACH-across, which aims to support and automate structure-based read-across, is presented among others.
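
    Read-across, as described above, predicts a property of a data-poor substance from similar, data-rich substances. A generic nearest-neighbour sketch of that idea in Python follows; the fingerprint representation, the Tanimoto similarity, the weighting scheme, and the example values are illustrative assumptions and do not describe the REACH-across tool itself.

    ```python
    def tanimoto(a, b):
        # Similarity between two substances represented as sets of structural
        # fingerprint bits (hypothetical representation).
        return len(a & b) / len(a | b) if a | b else 0.0

    def read_across(target_fp, known, k=3):
        """Predict a property for a target substance by similarity-weighted
        averaging over its k most similar data-rich analogues.

        known: list of (fingerprint_set, property_value) pairs.
        Generic nearest-neighbour sketch of the read-across idea only.
        """
        ranked = sorted(known, key=lambda kv: tanimoto(target_fp, kv[0]), reverse=True)[:k]
        weights = [tanimoto(target_fp, fp) for fp, _ in ranked]
        if sum(weights) == 0:
            return None
        return sum(w * v for w, (_, v) in zip(weights, ranked)) / sum(weights)

    # Hypothetical fingerprints and property values (e.g., a toxicity endpoint):
    analogues = [({1, 2, 5, 9}, 2.3), ({1, 2, 6}, 2.8), ({3, 4, 7}, 1.1), ({1, 5, 9}, 2.1)]
    print(read_across({1, 2, 5}, analogues))
    ```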

  11. [Work days lost due to health problems in industry].

    Science.gov (United States)

    Yano, Sylvia Regina Trindade; Santana, Vilma Sousa

    2012-05-01

    This cross-sectional study estimated the prevalence of work days lost due to health problems and associated factors among industrial workers. The study population was a simple random cluster sample of 3,403 workers from 16 to 65 years of age in the city of Salvador, Bahia State, Brazil. Data were collected with individual home interviews. Among industrial workers, one-year prevalence of work days lost to health problems was 12.5%, of which 5.5% were directly work-related and 4.1% aggravated by work. There were no statistically significant differences when compared to other worker categories. Self-perceived workplace hazards, history of work-related injury, and poor self-rated health were associated with work days lost due to work-related injuries/diseases. The findings showed that work days lost are common among both industrial and non-industrial workers, thereby affecting productivity and requiring prevention programs.

  12. [Big data in official statistics].

    Science.gov (United States)

    Zwick, Markus

    2015-08-01

    The concept of "big data" stands to change the face of official statistics over the coming years, having an impact on almost all aspects of data production. The tasks of future statisticians will not necessarily be to produce new data, but rather to identify and make use of existing data to adequately describe social and economic phenomena. Until big data can be used correctly in official statistics, a lot of questions need to be answered and problems solved: the quality of data, data protection, privacy, and the sustainable availability are some of the more pressing issues to be addressed. The essential skills of official statisticians will undoubtedly change, and this implies a number of challenges to be faced by statistical education systems, in universities, and inside the statistical offices. The national statistical offices of the European Union have concluded a concrete strategy for exploring the possibilities of big data for official statistics, by means of the Big Data Roadmap and Action Plan 1.0. This is an important first step and will have a significant influence on implementing the concept of big data inside the statistical offices of Germany.

  13. Big-Leaf Mahogany on CITES Appendix II: Big Challenge, Big Opportunity

    Science.gov (United States)

    JAMES GROGAN; PAULO BARRETO

    2005-01-01

    On 15 November 2003, big-leaf mahogany (Swietenia macrophylla King, Meliaceae), the most valuable widely traded Neotropical timber tree, gained strengthened regulatory protection from its listing on Appendix II of the Convention on International Trade in Endangered Species of Wild Fauna and Flora (CITES). CITES is a United Nations-chartered agreement signed by 164...

  14. Big Data as Information Barrier

    Directory of Open Access Journals (Sweden)

    Victor Ya. Tsvetkov

    2014-07-01

    Full Text Available The article analyses ‘Big Data’, which has been discussed over the last 10 years. The reasons and factors behind the issue are identified. It is shown that the factors creating the ‘Big Data’ issue have existed for quite a long time and have, from time to time, caused information barriers. Such barriers were successfully overcome through science and technology. The analysis characterizes the ‘Big Data’ issue as a form of information barrier. The issue can be solved correctly, and it encourages the development of scientific and computational methods.

  15. Big Data in Space Science

    OpenAIRE

    Barmby, Pauline

    2018-01-01

    It seems like “big data” is everywhere these days. In planetary science and astronomy, we’ve been dealing with large datasets for a long time. So how “big” is our data? How does it compare to the big data that a bank or an airline might have? What new tools do we need to analyze big datasets, and how can we make better use of existing tools? What kinds of science problems can we address with these? I’ll address these questions with examples including ESA’s Gaia mission, ...

  16. Big Data in Medicine is Driving Big Changes

    Science.gov (United States)

    Verspoor, K.

    2014-01-01

    Objectives: To summarise current research that takes advantage of “Big Data” in health and biomedical informatics applications. Methods: Survey of trends in this work, and exploration of literature describing how large-scale structured and unstructured data sources are being used to support applications from clinical decision making and health policy, to drug design and pharmacovigilance, and further to systems biology and genetics. Results: The survey highlights ongoing development of powerful new methods for turning that large-scale, and often complex, data into information that provides new insights into human health, in a range of different areas. Consideration of this body of work identifies several important paradigm shifts that are facilitated by Big Data resources and methods: in clinical and translational research, from hypothesis-driven research to data-driven research, and in medicine, from evidence-based practice to practice-based evidence. Conclusions: The increasing scale and availability of large quantities of health data require strategies for data management, data linkage, and data integration beyond the limits of many existing information systems, and substantial effort is underway to meet those needs. As our ability to make sense of that data improves, the value of the data will continue to increase. Health systems, genetics and genomics, population and public health; all areas of biomedicine stand to benefit from Big Data and the associated technologies. PMID:25123716

  17. Main Issues in Big Data Security

    Directory of Open Access Journals (Sweden)

    Julio Moreno

    2016-09-01

    Full Text Available Data is currently one of the most important assets for companies in every field. The continuous growth in the importance and volume of data has created a new problem: it cannot be handled by traditional analysis techniques. This problem was, therefore, solved through the creation of a new paradigm: Big Data. However, Big Data originated new issues related not only to the volume or the variety of the data, but also to data security and privacy. In order to obtain a full perspective of the problem, we decided to carry out an investigation with the objective of highlighting the main issues regarding Big Data security, and also the solutions proposed by the scientific community to solve them. In this paper, we explain the results obtained after applying a systematic mapping study to security in the Big Data ecosystem. It is almost impossible to carry out detailed research into the entire topic of security, and the outcome of this research is, therefore, a big picture of the main problems related to security in a Big Data system, along with the principal solutions to them proposed by the research community.

  18. Harnessing the Power of Big Data to Improve Graduate Medical Education: Big Idea or Bust?

    Science.gov (United States)

    Arora, Vineet M

    2018-06-01

    With the advent of electronic medical records (EMRs) fueling the rise of big data, the use of predictive analytics, machine learning, and artificial intelligence are touted as transformational tools to improve clinical care. While major investments are being made in using big data to transform health care delivery, little effort has been directed toward exploiting big data to improve graduate medical education (GME). Because our current system relies on faculty observations of competence, it is not unreasonable to ask whether big data in the form of clinical EMRs and other novel data sources can answer questions of importance in GME such as when is a resident ready for independent practice.The timing is ripe for such a transformation. A recent National Academy of Medicine report called for reforms to how GME is delivered and financed. While many agree on the need to ensure that GME meets our nation's health needs, there is little consensus on how to measure the performance of GME in meeting this goal. During a recent workshop at the National Academy of Medicine on GME outcomes and metrics in October 2017, a key theme emerged: Big data holds great promise to inform GME performance at individual, institutional, and national levels. In this Invited Commentary, several examples are presented, such as using big data to inform clinical experience and provide clinically meaningful data to trainees, and using novel data sources, including ambient data, to better measure the quality of GME training.

  19. A SWOT Analysis of Big Data

    Science.gov (United States)

    Ahmadi, Mohammad; Dileepan, Parthasarati; Wheatley, Kathleen K.

    2016-01-01

    This is the decade of data analytics and big data, but not everyone agrees with the definition of big data. Some researchers see it as the future of data analysis, while others consider it as hype and foresee its demise in the near future. No matter how it is defined, big data for the time being is having its glory moment. The most important…

  20. A survey of big data research

    Science.gov (United States)

    Fang, Hua; Zhang, Zhaoyang; Wang, Chanpaul Jin; Daneshmand, Mahmoud; Wang, Chonggang; Wang, Honggang

    2015-01-01

    Big data create values for business and research, but pose significant challenges in terms of networking, storage, management, analytics and ethics. Multidisciplinary collaborations from engineers, computer scientists, statisticians and social scientists are needed to tackle, discover and understand big data. This survey presents an overview of big data initiatives, technologies and research in industries and academia, and discusses challenges and potential solutions. PMID:26504265

  1. Big Data in Action for Government : Big Data Innovation in Public Services, Policy, and Engagement

    OpenAIRE

    World Bank

    2017-01-01

    Governments have an opportunity to harness big data solutions to improve productivity, performance and innovation in service delivery and policymaking processes. In developing countries, governments have an opportunity to adopt big data solutions and leapfrog traditional administrative approaches

  2. Lost spoiler practices

    DEFF Research Database (Denmark)

    Gürsimsek, Ödül; Drotner, Kirsten

    2014-01-01

    narratives, and viewers through tracing, dismantling – and sometimes questioning – content in order to create coherent meanings in the maze of narratives. Online audiences, such as spoiler communities, may uncover components of transmedia storytelling, discuss their validity and enhance them with individual...... and documents that interactions between Lost audiences and producers operate as forms of social participation when spoiler-seeking audiences work to unravel, challenge and predict the narrative while the producers seek to orchestrate transmedia storytelling experiences. Our results serve as a sobering empirical...

  3. 78 FR 3911 - Big Stone National Wildlife Refuge, Big Stone and Lac Qui Parle Counties, MN; Final Comprehensive...

    Science.gov (United States)

    2013-01-17

    ... DEPARTMENT OF THE INTERIOR Fish and Wildlife Service [FWS-R3-R-2012-N259; FXRS1265030000-134-FF03R06000] Big Stone National Wildlife Refuge, Big Stone and Lac Qui Parle Counties, MN; Final Comprehensive... significant impact (FONSI) for the environmental assessment (EA) for Big Stone National Wildlife Refuge...

  4. Big domains are novel Ca²+-binding modules: evidences from big domains of Leptospira immunoglobulin-like (Lig) proteins.

    Directory of Open Access Journals (Sweden)

    Rajeev Raman

    Full Text Available BACKGROUND: Many bacterial surface exposed proteins mediate the host-pathogen interaction more effectively in the presence of Ca²+. Leptospiral immunoglobulin-like (Lig) proteins, LigA and LigB, are surface exposed proteins containing Bacterial immunoglobulin-like (Big) domains. The function of proteins which contain the Big fold is not known. Based on the possible similarities of the immunoglobulin and βγ-crystallin folds, we here explore the important question of whether Ca²+ binds to Big domains, which would provide a novel functional role for proteins containing the Big fold. PRINCIPAL FINDINGS: We selected six individual Big domains for this study (three from the conserved part of LigA and LigB, denoted Lig A3, Lig A4, and LigBCon5; two from the variable region of LigA, i.e., the 9th (Lig A9) and 10th (Lig A10) repeats; and one from the variable region of LigB, i.e., LigBCen2). We have also studied the conserved region covering the three and six repeats (LigBCon1-3 and LigCon). All these proteins bind the calcium-mimic dye Stains-all. All four selected domains bind Ca²+ with dissociation constants of 2-4 µM. The Lig A9 and Lig A10 domains fold well with moderate thermal stability, have β-sheet conformation and form homodimers. Fluorescence spectra of the Big domains show a specific doublet (at 317 and 330 nm), probably due to Trp interaction with a Phe residue. Equilibrium unfolding of the selected Big domains is similar and follows a two-state model, suggesting the similarity of their folds. CONCLUSIONS: We demonstrate that the Lig proteins are Ca²+-binding proteins, with the Big domains harbouring the binding motif. We conclude that despite differences in sequence, a Big motif binds Ca²+. This work thus sets up a strong possibility for classifying proteins containing Big domains as a novel family of Ca²+-binding proteins. Since the Big domain is a part of many proteins in the bacterial kingdom, we suggest a possible function for these proteins via Ca²+ binding.

  5. A Joint Replenishment Inventory Model with Lost Sales

    Science.gov (United States)

    Devy, N. L.; Ai, T. J.; Astanti, R. D.

    2018-04-01

    This paper deals with a two-item joint replenishment inventory problem in which the demand for each item is constant and deterministic. Inventory replenishment is conducted periodically every T time intervals, and among these replenishments, joint replenishment of both items is possible. Item i is replenished every ZiT time intervals. Replenishments are instantaneous. All shortages are treated as lost sales, and the maximum allowance for lost sales of item i is Si. A mathematical model is formulated to determine the basic time cycle T, the replenishment multipliers Zi, and the maximum lost sales Si that minimize the total cost per unit time. A solution methodology is proposed to solve the model, and a numerical example is provided to demonstrate its effectiveness.
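
    The abstract does not reproduce the paper's cost function or solution procedure. The sketch below assumes a textbook-style joint-replenishment cost structure (a major setup cost per base cycle, a minor setup cost per item order, linear holding cost, and a per-unit lost-sales penalty) and a plain grid search over T, Zi, and Si, purely to illustrate how the three kinds of decision variables interact; none of the parameter values or the cost form come from the paper.

    ```python
    import itertools

    def cost_rate(T, Z, S, d, K, k, h, p):
        """Average cost per unit time for a two-item joint replenishment policy
        with lost sales (assumed cost structure, not the paper's model).

        T: base cycle; Z[i]: item i is ordered every Z[i]*T; S[i]: lost sales allowed
        per cycle of item i; d, k, h, p: demand rate, minor setup cost, holding cost,
        and lost-sales penalty per item; K: major setup cost per base cycle.
        """
        total = K / T                               # major setup, once per base cycle
        for i in range(len(d)):
            cycle = Z[i] * T
            q = d[i] * cycle - S[i]                 # order quantity after lost sales
            if q <= 0:
                return float("inf")
            avg_inventory = q * q / (2.0 * d[i] * cycle)
            total += k[i] / cycle + h[i] * avg_inventory + p[i] * S[i] / cycle
        return total

    def grid_search(d, K, k, h, p):
        best = (float("inf"), None)
        for T in [x / 10.0 for x in range(1, 101)]:                 # T in (0, 10]
            for Z in itertools.product(range(1, 5), repeat=2):      # integer multipliers
                for S in itertools.product([0.0, 1.0, 2.0, 5.0], repeat=2):
                    c = cost_rate(T, Z, S, d, K, k, h, p)
                    if c < best[0]:
                        best = (c, (T, Z, S))
        return best

    # Hypothetical parameters for two items.
    print(grid_search(d=[20.0, 35.0], K=50.0, k=[10.0, 12.0], h=[0.4, 0.5], p=[3.0, 2.5]))
    ```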

  6. New 'bigs' in cosmology

    International Nuclear Information System (INIS)

    Yurov, Artyom V.; Martin-Moruno, Prado; Gonzalez-Diaz, Pedro F.

    2006-01-01

    This paper contains a detailed discussion of new cosmic solutions describing the early and late evolution of a universe that is filled with a kind of dark energy that may or may not satisfy the energy conditions. The main distinctive property of the resulting space-times is that the single singular events predicted by the corresponding quintessential (phantom) models appear twice, in a manner which can be made symmetric with respect to the origin of cosmic time. Thus, the big bang and big rip singularities are shown to take place twice, once on the positive branch of time and once on the negative one. We have also considered dark energy and phantom energy accretion onto black holes and wormholes in the context of these new cosmic solutions. It is seen that the space-times of these holes would then undergo swelling processes leading to big trip and big hole events taking place at distinct epochs along the evolution of the universe. In this way, the possibility is considered that the past and future may be connected in a non-paradoxical manner in the universes described by means of the new symmetric solutions.

  7. 2nd INNS Conference on Big Data

    CERN Document Server

    Manolopoulos, Yannis; Iliadis, Lazaros; Roy, Asim; Vellasco, Marley

    2017-01-01

    The book offers a timely snapshot of neural network technologies as a significant component of big data analytics platforms. It promotes new advances and research directions in efficient and innovative algorithmic approaches to analyzing big data (e.g. deep networks, nature-inspired and brain-inspired algorithms); implementations on different computing platforms (e.g. neuromorphic, graphics processing units (GPUs), clouds, clusters); and big data analytics applications to solve real-world problems (e.g. weather prediction, transportation, energy management). The book, which reports on the second edition of the INNS Conference on Big Data, held on October 23–25, 2016, in Thessaloniki, Greece, depicts an interesting collaborative adventure of neural networks with big data and other learning technologies.

  8. The ethics of biomedical big data

    CERN Document Server

    Mittelstadt, Brent Daniel

    2016-01-01

    This book presents cutting edge research on the new ethical challenges posed by biomedical Big Data technologies and practices. ‘Biomedical Big Data’ refers to the analysis of aggregated, very large datasets to improve medical knowledge and clinical care. The book describes the ethical problems posed by aggregation of biomedical datasets and re-use/re-purposing of data, in areas such as privacy, consent, professionalism, power relationships, and ethical governance of Big Data platforms. Approaches and methods are discussed that can be used to address these problems to achieve the appropriate balance between the social goods of biomedical Big Data research and the safety and privacy of individuals. Seventeen original contributions analyse the ethical, social and related policy implications of the analysis and curation of biomedical Big Data, written by leading experts in the areas of biomedical research, medical and technology ethics, privacy, governance and data protection. The book advances our understan...

  9. 42 CFR 102.32 - Benefits for lost employment income.

    Science.gov (United States)

    2010-10-01

    ... 102.32 Public Health PUBLIC HEALTH SERVICE, DEPARTMENT OF HEALTH AND HUMAN SERVICES VACCINES SMALLPOX... pay for lost employment income or to provide disability or retirement benefits to the requester. As provided in § 102.84, the Secretary retains the right to recover benefits for lost employment income paid...

  10. Scalable privacy-preserving big data aggregation mechanism

    Directory of Open Access Journals (Sweden)

    Dapeng Wu

    2016-08-01

    Full Text Available As the massive sensor data generated by large-scale Wireless Sensor Networks (WSNs) have recently become an indispensable part of ‘Big Data’, the collection, storage, transmission and analysis of big sensor data have attracted considerable attention from researchers. Targeting the privacy requirements of large-scale WSNs and focusing on the energy-efficient collection of big sensor data, a Scalable Privacy-preserving Big Data Aggregation (Sca-PBDA) method is proposed in this paper. Firstly, according to the pre-established gradient topology structure, sensor nodes in the network are divided into clusters. Secondly, sensor data are modified by each node according to the privacy-preserving configuration message received from the sink. Subsequently, intra- and inter-cluster data aggregation is employed during the big sensor data reporting phase to reduce energy consumption. Lastly, aggregated results are recovered by the sink to complete the privacy-preserving big data aggregation. Simulation results validate the efficacy and scalability of Sca-PBDA and show that the big sensor data generated by large-scale WSNs are efficiently aggregated to reduce network resource consumption and that sensor data privacy is effectively protected to meet ever-growing application requirements.
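
    The abstract does not specify how each node modifies its data or how the sink recovers the aggregate. One common pattern for this kind of scheme, shown below as a hedged Python sketch, is seeded additive masking: each node perturbs its reading with a pseudorandom mask derived from a secret shared with the sink, cluster heads forward only masked sums, and the sink removes the masks. This illustrates the general idea only; it is not the actual Sca-PBDA protocol.

    ```python
    import random

    def node_report(value, node_id, round_key):
        # Hypothetical scheme: perturb the reading with a pseudorandom mask whose
        # seed is shared only between this node and the sink.
        rng = random.Random(f"{round_key}:{node_id}")
        return value + rng.uniform(-100.0, 100.0)

    def cluster_aggregate(masked_values):
        # Cluster heads see only masked readings and forward a partial sum.
        return sum(masked_values)

    def sink_recover(total, node_ids, round_key):
        # The sink regenerates every node's mask from the shared seeds and removes it.
        masks = sum(random.Random(f"{round_key}:{nid}").uniform(-100.0, 100.0)
                    for nid in node_ids)
        return total - masks

    readings = {1: 21.5, 2: 19.8, 3: 22.1}   # hypothetical sensor values
    masked = [node_report(v, nid, round_key=7) for nid, v in readings.items()]
    print(sink_recover(cluster_aggregate(masked), readings, round_key=7))
    # approximately sum(readings.values()); individual readings stay hidden in transit
    ```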

  11. Ethische aspecten van big data

    NARCIS (Netherlands)

    N. (Niek) van Antwerpen; Klaas Jan Mollema

    2017-01-01

    Big data has not only led to challenging technical questions; it also comes with all kinds of new ethical and moral issues. To handle big data responsibly, these issues must be considered as well, because poor use of data can have adverse consequences for

  12. Epidemiology in wonderland: Big Data and precision medicine.

    Science.gov (United States)

    Saracci, Rodolfo

    2018-03-01

    Big Data and precision medicine, two major contemporary challenges for epidemiology, are critically examined from two different angles. In Part 1, Big Data collected for research purposes (Big research Data) and Big Data used for research although collected for other primary purposes (Big secondary Data) are discussed in the light of the fundamental common requirement of data validity, which prevails over "bigness". Precision medicine is treated by developing the key point that, as a rule, high relative risks are required to make a variable or combination of variables suitable for predicting disease occurrence, outcome, or response to treatment; the commercial proliferation of allegedly predictive tests of unknown or poor validity is commented upon. Part 2 proposes a "wise epidemiology" approach to: (a) choosing, in a context shaped by Big Data and precision medicine, epidemiological research projects actually relevant to population health; (b) training epidemiologists; (c) investigating the impact on clinical practices and the doctor-patient relationship of the influx of Big Data and computerized medicine; and (d) clarifying whether "health" may today be redefined, as some maintain, in purely technological terms.

  13. Big Data and Analytics in Healthcare.

    Science.gov (United States)

    Tan, S S-L; Gao, G; Koch, S

    2015-01-01

    This editorial is part of the Focus Theme of Methods of Information in Medicine on "Big Data and Analytics in Healthcare". The amount of data being generated in the healthcare industry is growing at a rapid rate. This has generated immense interest in leveraging the availability of healthcare data (and "big data") to improve health outcomes and reduce costs. However, the nature of healthcare data, and especially big data, presents unique challenges in processing and analyzing big data in healthcare. This Focus Theme aims to disseminate some novel approaches to address these challenges. More specifically, approaches ranging from efficient methods of processing large clinical data to predictive models that could generate better predictions from healthcare data are presented.

  14. Big Data for Business Ecosystem Players

    Directory of Open Access Journals (Sweden)

    Perko Igor

    2016-06-01

    Full Text Available In this research, some of the most promising Big Data usage domains are connected with distinct player groups found in the business ecosystem. Literature analysis is used to identify the state of the art of Big Data-related research in the major domains of its use, namely individual marketing, health treatment, work opportunities, financial services, and security enforcement. System theory is used to identify the major business-ecosystem player types disrupted by Big Data: individuals, small and mid-sized enterprises, large organizations, information providers, and regulators. Relationships between the domains and players are explained through new Big Data opportunities and threats and through players' responsive strategies. System dynamics is used to visualize the relationships in the provided model.

  15. Assessment of net lost revenue adjustment mechanisms for utility DSM programs

    Energy Technology Data Exchange (ETDEWEB)

    Baxter, L.W.

    1995-01-01

    Utility shareholders can lose money on demand-side management (DSM) investments between rate cases. Several industry analysts argue that the revenues lost from utility DSM programs are an important financial disincentive to utility DSM investment. A key utility regulatory reform undertaken since 1989 allows utilities to recover the lost revenues incurred through successful operation of DSM programs. Explicitly defined net lost revenue adjustment (NLRA) mechanisms are states' preferred approach to lost revenue recovery from DSM programs. This report examines the experiences states and utilities are having with the NLRA approach. The report has three objectives. First, we determine whether NLRA is a feasible and successful approach to removing the lost-revenue disincentive to utility operation of DSM programs. Second, we identify the conditions linked to successful implementation of NLRA mechanisms in different states and assess whether NLRA has changed utility investment behavior. Third, we suggest improvements to NLRA mechanisms. We first identify states with NLRA mechanisms where utilities are recovering lost revenues from DSM programs. We interview staff at regulatory agencies in all these states and utility staff in four states. These interviews focus on the status of NLRA, implementation issues, DSM measurement issues, and NLRA results. We also analyze regulatory agency orders on NLRA, as well as associated testimony, reports, and utility lost revenue recovery filings. Finally, we use qualitative and quantitative indicators to assess NLRA's effectiveness. Contrary to the concerns raised by some industry analysts, our results indicate NLRA is a feasible approach to the lost-revenue disincentive.

  16. "Big data" in economic history.

    Science.gov (United States)

    Gutmann, Myron P; Merchant, Emily Klancher; Roberts, Evan

    2018-03-01

    Big data is an exciting prospect for the field of economic history, which has long depended on the acquisition, keying, and cleaning of scarce numerical information about the past. This article examines two areas in which economic historians are already using big data - population and environment - discussing ways in which increased frequency of observation, denser samples, and smaller geographic units allow us to analyze the past with greater precision and often to track individuals, places, and phenomena across time. We also explore promising new sources of big data: organically created economic data, high resolution images, and textual corpora.

  17. Big Data Knowledge in Global Health Education.

    Science.gov (United States)

    Olayinka, Olaniyi; Kekeh, Michele; Sheth-Chandra, Manasi; Akpinar-Elci, Muge

    The ability to synthesize and analyze massive amounts of data is critical to the success of organizations, including those that involve global health. As countries become highly interconnected, increasing the risk for pandemics and outbreaks, the demand for big data is likely to increase. This requires a global health workforce that is trained in the effective use of big data. To assess implementation of big data training in global health, we conducted a pilot survey of members of the Consortium of Universities of Global Health. More than half the respondents did not have a big data training program at their institution. Additionally, the majority agreed that big data training programs will improve global health deliverables, among other favorable outcomes. Given the observed gap and benefits, global health educators may consider investing in big data training for students seeking a career in global health. Copyright © 2017 Icahn School of Medicine at Mount Sinai. Published by Elsevier Inc. All rights reserved.

  18. GEOSS: Addressing Big Data Challenges

    Science.gov (United States)

    Nativi, S.; Craglia, M.; Ochiai, O.

    2014-12-01

    In the sector of Earth Observation, the explosion of data is due to many factors including: new satellite constellations, the increased capabilities of sensor technologies, social media, crowdsourcing, and the need for multidisciplinary and collaborative research to face Global Changes. In this area, there are many expectations and concerns about Big Data. Vendors have attempted to use this term for their commercial purposes. It is necessary to understand whether Big Data is a radical shift or an incremental change for the existing digital infrastructures. This presentation tries to explore and discuss the impact of Big Data challenges and new capabilities on the Global Earth Observation System of Systems (GEOSS) and particularly on its common digital infrastructure called GCI. GEOSS is a global and flexible network of content providers allowing decision makers to access an extraordinary range of data and information at their desk. The impact of the Big Data dimensionalities (commonly known as 'V' axes: volume, variety, velocity, veracity, visualization) on GEOSS is discussed. The main solutions and experimentation developed by GEOSS along these axes are introduced and analyzed. GEOSS is a pioneering framework for global and multidisciplinary data sharing in the Earth Observation realm; its experience on Big Data is valuable for the many lessons learned.

  19. Big data for bipolar disorder.

    Science.gov (United States)

    Monteith, Scott; Glenn, Tasha; Geddes, John; Whybrow, Peter C; Bauer, Michael

    2016-12-01

    The delivery of psychiatric care is changing with a new emphasis on integrated care, preventative measures, population health, and the biological basis of disease. Fundamental to this transformation are big data and advances in the ability to analyze these data. The impact of big data on the routine treatment of bipolar disorder today and in the near future is discussed, with examples that relate to health policy, the discovery of new associations, and the study of rare events. The primary sources of big data today are electronic medical records (EMR), claims, and registry data from providers and payers. In the near future, data created by patients from active monitoring, passive monitoring of Internet and smartphone activities, and from sensors may be integrated with the EMR. Diverse data sources from outside of medicine, such as government financial data, will be linked for research. Over the long term, genetic and imaging data will be integrated with the EMR, and there will be more emphasis on predictive models. Many technical challenges remain when analyzing big data that relates to size, heterogeneity, complexity, and unstructured text data in the EMR. Human judgement and subject matter expertise are critical parts of big data analysis, and the active participation of psychiatrists is needed throughout the analytical process.

  20. BIG DATA IN TAMIL: OPPORTUNITIES, BENEFITS AND CHALLENGES

    OpenAIRE

    R.S. Vignesh Raj; Babak Khazaei; Ashik Ali

    2015-01-01

    This paper gives an overall introduction on big data and has tried to introduce Big Data in Tamil. It discusses the potential opportunities, benefits and likely challenges from a very Tamil and Tamil Nadu perspective. The paper has also made original contribution by proposing the ‘big data’s’ terminology in Tamil. The paper further suggests a few areas to explore using big data Tamil on the lines of the Tamil Nadu Government ‘vision 2023’. Whilst, big data has something to offer everyone, it ...

  1. Big data in biomedicine.

    Science.gov (United States)

    Costa, Fabricio F

    2014-04-01

    The increasing availability and growth rate of biomedical information, also known as 'big data', provides an opportunity for future personalized medicine programs that will significantly improve patient care. Recent advances in information technology (IT) applied to biomedicine are changing the landscape of privacy and personal information, with patients getting more control of their health information. Conceivably, big data analytics is already impacting health decisions and patient care; however, specific challenges need to be addressed to integrate current discoveries into medical practice. In this article, I will discuss the major breakthroughs achieved in combining omics and clinical health data in terms of their application to personalized medicine. I will also review the challenges associated with using big data in biomedicine and translational science. Copyright © 2013 Elsevier Ltd. All rights reserved.

  2. Archaeological Investigations on the East Fork of the Salmon River, Custer County, Idaho.

    Science.gov (United States)

    1984-01-01

    coniferous environment in addition to pine marten (Martes americana), red squirrel (Tamiasciurus hudsonicus), porcupine (Erethizon dorsatum), mountain vole ... can be seen in small herds throughout the East Fork valley from the Salmon River to Big Boulder Creek. Two bands of Rocky Mountain bighorn sheep ... utilize the Challis Planning Unit, one on the East Fork and the other in the Birch Creek area. The East Fork herd is comprised of approximately 50-70

  3. Big Data’s Role in Precision Public Health

    Science.gov (United States)

    Dolley, Shawn

    2018-01-01

    Precision public health is an emerging practice to more granularly predict and understand public health risks and customize treatments for more specific and homogeneous subpopulations, often using new data, technologies, and methods. Big data is one element that has consistently helped to achieve these goals, through its ability to deliver to practitioners a volume and variety of structured or unstructured data not previously possible. Big data has enabled more widespread and specific research and trials of stratifying and segmenting populations at risk for a variety of health problems. Examples of success using big data are surveyed in surveillance and signal detection, predicting future risk, targeted interventions, and understanding disease. Using novel big data or big data approaches has risks that remain to be resolved. The continued growth in volume and variety of available data, decreased costs of data capture, and emerging computational methods mean big data success will likely be a required pillar of precision public health into the future. This review article aims to identify the precision public health use cases where big data has added value, identify classes of value that big data may bring, and outline the risks inherent in using big data in precision public health efforts. PMID:29594091

  4. Big inquiry

    Energy Technology Data Exchange (ETDEWEB)

    Wynne, B [Lancaster Univ. (UK)]

    1979-06-28

    The recently published report entitled 'The Big Public Inquiry' from the Council for Science and Society and the Outer Circle Policy Unit is considered, with especial reference to any future enquiry which may take place into the first commercial fast breeder reactor. Proposals embodied in the report include stronger rights for objectors and an attempt is made to tackle the problem that participation in a public inquiry is far too late to be objective. It is felt by the author that the CSS/OCPU report is a constructive contribution to the debate about big technology inquiries but that it fails to understand the deeper currents in the economic and political structure of technology which so influence the consequences of whatever formal procedures are evolved.

  5. Molecular signatures of biogeochemical transformations in dissolved organic matter from ten World Rivers

    Directory of Open Access Journals (Sweden)

    Thomas Riedel

    2016-09-01

    Full Text Available Rivers carry large amounts of dissolved organic matter (DOM to the oceans thereby connecting terrestrial and marine element cycles. Photo-degradation in conjunction with microbial turnover is considered a major pathway by which terrigenous DOM is decomposed. To reveal globally relevant patterns behind this process, we performed photo-degradation experiments and year-long bio-assays on DOM from ten of the largest world rivers that collectively account for more than one-third of the fresh water discharge to the global ocean. We furthermore tested the hypothesis that the terrigenous component in deep ocean DOM may be far higher than biomarker studies suggest, because of the selective photochemical destruction of characteristic biomolecules from vascular plants. DOM was molecularly characterized by a combination of non-targeted ultrahigh-resolution mass spectrometry and quantitative molecular tracer analyses. We show that the reactivity of DOM is globally related to broad catchment properties. Basins that are dominated by forest and grassland export more photo-degradable DOM than other rivers. Chromophoric compounds are mainly vascular plant-derived polyphenols, and partially carry a pyrogenic signature from vegetation fires. These forest and grassland dominated rivers lost up to 50% of dissolved organic carbon (DOC during irradiation, and up to 85% of DOC was lost in total if subsequently bio-incubated for one year. Basins covered by cropland, on the other hand, export DOM with a higher proportion of photo-resistant and bio-available DOM which is enriched in nitrogen. In these rivers, 30% or less of DOC was photodegraded. Consistent with previous studies, we found that riverine DOM resembled marine DOM in its broad molecular composition after extensive degradation, mainly due to almost complete removal of aromatics. More detailed molecular fingerprinting analysis (based on the relative abundance of >4000 DOM molecular formulae, however, revealed

  6. Big data analytics with R and Hadoop

    CERN Document Server

    Prajapati, Vignesh

    2013-01-01

    Big Data Analytics with R and Hadoop is a tutorial-style book that focuses on all the powerful big data tasks that can be achieved by integrating R and Hadoop. This book is ideal for R developers who are looking for a way to perform big data analytics with Hadoop. It is also aimed at those who know Hadoop and want to build intelligent applications over big data with R packages. It would be helpful if readers have a basic knowledge of R.
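
    The book works through R-side integrations such as the RHadoop packages; as a language-neutral illustration of the MapReduce pattern those integrations build on, the following is a minimal Hadoop Streaming-style word count sketched in Python. The file name and invocation are assumptions for illustration, not taken from the book.

        #!/usr/bin/env python3
        # wordcount.py -- minimal Hadoop Streaming-style mapper/reducer pair.
        # Illustration of the MapReduce pattern only; the book's own examples use R.
        import sys

        def mapper():
            # Emit one (word, 1) pair per token; Hadoop sorts by key between phases.
            for line in sys.stdin:
                for word in line.split():
                    print(f"{word}\t1")

        def reducer():
            # Input arrives grouped by key, so a running per-key sum is enough.
            current, count = None, 0
            for line in sys.stdin:
                key, value = line.rstrip("\n").split("\t")
                if key != current:
                    if current is not None:
                        print(f"{current}\t{count}")
                    current, count = key, 0
                count += int(value)
            if current is not None:
                print(f"{current}\t{count}")

        if __name__ == "__main__":
            # Run as "wordcount.py map" for the map phase, "wordcount.py reduce" otherwise.
            mapper() if sys.argv[1:] == ["map"] else reducer()

    Scripts like this are typically passed to the hadoop-streaming jar as the -mapper and -reducer commands; on the R side, the rmr2 package from RHadoop plays the analogous role.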

  7. Big data in forensic science and medicine.

    Science.gov (United States)

    Lefèvre, Thomas

    2018-07-01

    In less than a decade, big data in medicine has become quite a phenomenon, and many biomedical disciplines have got their own tribune on the topic. Perspectives and debates are flourishing while there is still no consensual definition of big data. The 3Vs paradigm is frequently evoked to define the big data principles and stands for Volume, Variety and Velocity. Even according to this paradigm, genuine big data studies are still scarce in medicine and may not meet all expectations. On one hand, techniques usually presented as specific to big data, such as machine learning techniques, are supposed to support the ambition of personalized, predictive and preventive medicine. These techniques are mostly far from new; the most ancient are more than 50 years old. On the other hand, several issues closely related to the properties of big data and inherited from other scientific fields, such as artificial intelligence, are often underestimated if not ignored. Besides, a few papers temper the almost unanimous big data enthusiasm and are worth attention since they delineate what is at stake. In this context, forensic science is still awaiting its position papers as well as a comprehensive outline of what kind of contribution big data could bring to the field. The present situation calls for definitions and actions to rationally guide research and practice in big data. It is an opportunity for grounding a true interdisciplinary approach in forensic science and medicine that is mainly based on evidence. Copyright © 2017 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.

  8. NASA's Big Data Task Force

    Science.gov (United States)

    Holmes, C. P.; Kinter, J. L.; Beebe, R. F.; Feigelson, E.; Hurlburt, N. E.; Mentzel, C.; Smith, G.; Tino, C.; Walker, R. J.

    2017-12-01

    Two years ago NASA established the Ad Hoc Big Data Task Force (BDTF - https://science.nasa.gov/science-committee/subcommittees/big-data-task-force), an advisory working group within the NASA Advisory Council system. The scope of the Task Force included all NASA Big Data programs, projects, missions, and activities. The Task Force focused on such topics as exploring the existing and planned evolution of NASA's science data cyber-infrastructure that supports broad access to data repositories for NASA Science Mission Directorate missions; best practices within NASA, other Federal agencies, private industry and research institutions; and Federal initiatives related to big data and data access. The BDTF has completed its two-year term and produced several recommendations plus four white papers for NASA's Science Mission Directorate. This presentation will discuss the activities and results of the Task Force, including summaries of key points from its focused study topics. The paper serves as an introduction to the papers following in this ESSI session.

  9. Tree recruitment and survival in rivers: influence of hydrological processes

    Science.gov (United States)

    Carter Johnson, W.

    2000-10-01

    The findings of a 14-year study of tree reproduction and survival in the Platte River, Nebraska, are presented. The study was initiated in 1985 to determine the causes and remedies of woodland expansion and channel narrowing, which have reduced potential roosting habitat for migratory avifauna such as the whooping crane and sandhill crane. A total of 296 relocatable sites, constituting some 600 plots with Populus and Salix seedlings, was selected and sampled within two reaches near Shelton and Odessa, Nebraska. The fate of some 37 000 tree seedlings was monitored within the plot network. Tree recruitment is controlled largely by stream flow in June. Populus and Salix produce large numbers of seedlings in the river bed in most years, indicating the potential for high rates of woodland expansion. On average, in only 1 year in 7 is stream flow in June high enough to preclude Populus and Salix recruitment. Seedling mortality is dominated by two environmental factors: summer stream flow pulses from thunderstorms, which erode or bury new germinants, and river bed restructuring by moving ice in winter. A third factor, seedling mortality by desiccation during summer droughts, does occur but at a low frequency. Plots of seedlings had extremely low survival rates over the course of the study. Forty-two per cent of the plots lost all seedlings by the first remeasurement (July to September), 36% by the second measurement (May), and 10% by the third remeasurement (July). Thus nearly 90% of the plots had lost all tree seedlings by the end of the first year. Woodland expansion is further counterbalanced by erosion of established woodland, and these results inform the effectiveness of prescribed flows as insurance against future narrowing. Flows prescribed at key times to raise seedling mortality rates are recommended to maintain or widen channels, rather than mechanical clearing of established woodland.

  10. Big Data Technologies

    Science.gov (United States)

    Bellazzi, Riccardo; Dagliati, Arianna; Sacchi, Lucia; Segagni, Daniele

    2015-01-01

    The so-called big data revolution provides substantial opportunities for diabetes management. At least 3 important directions are currently of great interest. First, the integration of different sources of information, from primary and secondary care to administrative information, may allow depicting a novel view of patients' care processes and of single patients' behaviors, taking into account the multifaceted nature of chronic care. Second, the availability of novel diabetes technologies, able to gather large amounts of real-time data, requires the implementation of distributed platforms for data analysis and decision support. Finally, the inclusion of geographical and environmental information into such complex IT systems may further increase the capability of interpreting the gathered data and extracting new knowledge from them. This article reviews the main concepts and definitions related to big data, presents some efforts in health care, and discusses the potential role of big data in diabetes care. Finally, as an example, it describes the research efforts carried out in the MOSAIC project, funded by the European Commission. PMID:25910540

  11. 32 CFR 635.31 - Lost, abandoned, or unclaimed property.

    Science.gov (United States)

    2010-07-01

    ... 32 National Defense 4 2010-07-01 2010-07-01 true Lost, abandoned, or unclaimed property. 635.31 Section 635.31 National Defense Department of Defense (Continued) DEPARTMENT OF THE ARMY (CONTINUED) LAW ENFORCEMENT AND CRIMINAL INVESTIGATIONS LAW ENFORCEMENT REPORTING Offense Reporting § 635.31 Lost, abandoned, or unclaimed property. This is...

  12. The Berlin Inventory of Gambling behavior - Screening (BIG-S): Validation using a clinical sample.

    Science.gov (United States)

    Wejbera, Martin; Müller, Kai W; Becker, Jan; Beutel, Manfred E

    2017-05-18

    Published diagnostic questionnaires for gambling disorder in German are either based on DSM-III criteria or focus on aspects other than life time prevalence. This study was designed to assess the usability of the DSM-IV criteria based Berlin Inventory of Gambling Behavior Screening tool in a clinical sample and adapt it to DSM-5 criteria. In a sample of 432 patients presenting for behavioral addiction assessment at the University Medical Center Mainz, we checked the screening tool's results against clinical diagnosis and compared a subsample of n=300 clinically diagnosed gambling disorder patients with a comparison group of n=132. The BIG-S produced a sensitivity of 99.7% and a specificity of 96.2%. The instrument's unidimensionality and the diagnostic improvements of DSM-5 criteria were verified by exploratory and confirmatory factor analysis as well as receiver operating characteristic analysis. The BIG-S is a reliable and valid screening tool for gambling disorder and demonstrated its concise and comprehensible operationalization of current DSM-5 criteria in a clinical setting.
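
    The reported figures are standard confusion-matrix ratios. The counts below are a hypothetical reconstruction consistent with 99.7% and 96.2%, assuming the clinical diagnosis serves as the reference standard for the n=300 and n=132 groups; they are not taken from the paper.

        # Sensitivity and specificity as confusion-matrix ratios (illustration only).
        def sensitivity(tp, fn):
            return tp / (tp + fn)   # share of true cases the screen detects

        def specificity(tn, fp):
            return tn / (tn + fp)   # share of non-cases the screen clears

        print(f"sensitivity: {sensitivity(299, 1):.3f}")   # ~0.997
        print(f"specificity: {specificity(127, 5):.3f}")   # ~0.962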

  13. Measurement of water lost from heated geologic salt

    International Nuclear Information System (INIS)

    Hohlfelder, J.J.

    1979-07-01

    This report describes three methods used to measure the rate at which water is lost from heated geologic salt. The three methods were employed in each of a series of proof tests which were performed to evaluate instrumentation designed to measure the water-loss rate. It was found that the water loss measured from heated 1-kg salt specimens by these three methods was consistent to within an average of 9 percent.

  14. Traffic information computing platform for big data

    Energy Technology Data Exchange (ETDEWEB)

    Duan, Zongtao, E-mail: ztduan@chd.edu.cn; Li, Ying, E-mail: ztduan@chd.edu.cn; Zheng, Xibin, E-mail: ztduan@chd.edu.cn; Liu, Yan, E-mail: ztduan@chd.edu.cn; Dai, Jiting, E-mail: ztduan@chd.edu.cn; Kang, Jun, E-mail: ztduan@chd.edu.cn [Chang'an University School of Information Engineering, Xi'an, China and Shaanxi Engineering and Technical Research Center for Road and Traffic Detection, Xi'an (China)]

    2014-10-06

    The big data environment creates data conditions for improving the quality of traffic information services. The goal of this article is to construct a traffic information computing platform for the big data environment. Through in-depth analysis of the connotation and technical characteristics of big data and traffic information services, a distributed traffic atomic information computing platform architecture is proposed. In the big data environment, this type of traffic atomic information computing architecture helps to guarantee traffic safety and efficient operation, and more intelligent and personalized traffic information services can be provided to traffic information users.

  15. Traffic information computing platform for big data

    International Nuclear Information System (INIS)

    Duan, Zongtao; Li, Ying; Zheng, Xibin; Liu, Yan; Dai, Jiting; Kang, Jun

    2014-01-01

    The big data environment creates data conditions for improving the quality of traffic information services. The goal of this article is to construct a traffic information computing platform for the big data environment. Through in-depth analysis of the connotation and technical characteristics of big data and traffic information services, a distributed traffic atomic information computing platform architecture is proposed. In the big data environment, this type of traffic atomic information computing architecture helps to guarantee traffic safety and efficient operation, and more intelligent and personalized traffic information services can be provided to traffic information users.

  16. Fremtidens landbrug bliver big business

    DEFF Research Database (Denmark)

    Hansen, Henning Otte

    2016-01-01

    Agriculture's external environment and competitive conditions are changing, and this will necessitate a development towards "big business", in which farms become even larger, more industrialized and more concentrated. Big business will become a dominant development in Danish agriculture - but not the only one...

  17. Quantum nature of the big bang.

    Science.gov (United States)

    Ashtekar, Abhay; Pawlowski, Tomasz; Singh, Parampreet

    2006-04-14

    Some long-standing issues concerning the quantum nature of the big bang are resolved in the context of homogeneous isotropic models with a scalar field. Specifically, the known results on the resolution of the big-bang singularity in loop quantum cosmology are significantly extended as follows: (i) the scalar field is shown to serve as an internal clock, thereby providing a detailed realization of the "emergent time" idea; (ii) the physical Hilbert space, Dirac observables, and semiclassical states are constructed rigorously; (iii) the Hamiltonian constraint is solved numerically to show that the big bang is replaced by a big bounce. Thanks to the nonperturbative, background independent methods, unlike in other approaches the quantum evolution is deterministic across the deep Planck regime.
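
    For orientation, the big-bounce result described here is commonly summarized in the loop quantum cosmology literature by an effective Friedmann equation of the form (quoted as general background, not taken from this abstract):

        H^2 = \frac{8\pi G}{3}\,\rho\left(1 - \frac{\rho}{\rho_c}\right)

    so the expansion rate H vanishes when the matter density \rho reaches a critical density \rho_c of the order of the Planck density, replacing the classical big-bang singularity with a bounce.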

  18. Mentoring in Schools: An Impact Study of Big Brothers Big Sisters School-Based Mentoring

    Science.gov (United States)

    Herrera, Carla; Grossman, Jean Baldwin; Kauh, Tina J.; McMaken, Jennifer

    2011-01-01

    This random assignment impact study of Big Brothers Big Sisters School-Based Mentoring involved 1,139 9- to 16-year-old students in 10 cities nationwide. Youth were randomly assigned to either a treatment group (receiving mentoring) or a control group (receiving no mentoring) and were followed for 1.5 school years. At the end of the first school…

  19. Big data processing in the cloud - Challenges and platforms

    Science.gov (United States)

    Zhelev, Svetoslav; Rozeva, Anna

    2017-12-01

    Choosing the appropriate architecture and technologies for a big data project is a difficult task, which requires extensive knowledge of both the problem domain and the big data landscape. The paper analyzes the main big data architectures and the most widely implemented technologies used for processing and persisting big data. Clouds provide dynamic resource scaling, which makes them a natural fit for big data applications. Basic cloud computing service models are presented. Two architectures for processing big data are discussed: the Lambda and Kappa architectures. Technologies for big data persistence are presented and analyzed. Stream processing, as the most important and most difficult aspect to manage, is outlined. The paper highlights the main advantages of the cloud and potential problems.
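
    A hedged sketch of the Lambda pattern discussed above: a periodically recomputed batch view is merged with a real-time speed-layer view at query time. All names and data structures are hypothetical illustrations, not a specific platform's API.

        # Minimal sketch of the Lambda-architecture query path.
        from collections import defaultdict

        batch_view = {"page_a": 1000, "page_b": 250}    # recomputed periodically from the master dataset
        speed_view = defaultdict(int, {"page_a": 12})   # incremental counts since the last batch run

        def record_event(page):
            # Speed layer: update the real-time view as events stream in.
            speed_view[page] += 1

        def query(page):
            # Serving layer: answer = batch view + recent delta from the speed layer.
            return batch_view.get(page, 0) + speed_view.get(page, 0)

        record_event("page_b")
        print(query("page_a"), query("page_b"))   # 1012 251

    In the Kappa alternative, the batch layer is dropped and all views are recomputed by replaying the event log through the same stream processor.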

  20. Lost Muon Study for the Muon G-2 Experiment at Fermilab*

    Energy Technology Data Exchange (ETDEWEB)

    Ganguly, S. [Brookhaven National Lab. (BNL), Upton, NY (United States); Crnkovic, J. [Brookhaven National Lab. (BNL), Upton, NY (United States); Morse, W. M. [Brookhaven National Lab. (BNL), Upton, NY (United States)

    2017-05-19

    The Fermilab Muon g-2 Experiment has a goal of measuring the muon anomalous magnetic moment to a precision of 140 ppb - a fourfold improvement over the 540 ppb precision obtained by the BNL Muon g-2 Experiment. Some muons in the storage ring will interact with material and undergo bremsstrahlung, emitting radiation and losing energy. These so-called lost muons will curl in towards the center of the ring and be lost, but some of them will be detected by the calorimeters. A systematic error will arise if the lost muons have a different average spin phase than the stored muons. Algorithms are being developed to estimate the relative number of lost muons, so as to optimize the stored muon beam. This study presents initial testing of algorithms that can be used to estimate the lost muons by using either double or triple detection coincidences in the calorimeters.
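
    The double- and triple-coincidence idea can be sketched as a simple count of time-clustered hits in different calorimeter stations. The data format, time window, and clustering below are assumptions for illustration; the actual g-2 reconstruction is far more detailed.

        # Toy coincidence finder for lost-muon candidates (illustration only).
        # Assumes hits are (time_ns, calorimeter_id) pairs.
        def count_coincidences(hits, window_ns=5.0, multiplicity=2):
            """Count clusters of hits in different calorimeters within a short time window."""
            hits = sorted(hits)                       # order by time
            n_coinc, i = 0, 0
            while i < len(hits):
                j = i + 1
                stations = {hits[i][1]}
                while j < len(hits) and hits[j][0] - hits[i][0] <= window_ns:
                    stations.add(hits[j][1])
                    j += 1
                if len(stations) >= multiplicity:     # double (or triple) coincidence
                    n_coinc += 1
                i = j                                 # move past this time cluster
            return n_coinc

        hits = [(100.0, 3), (102.5, 4), (300.0, 7), (500.0, 1), (501.0, 2), (503.0, 3)]
        print(count_coincidences(hits, multiplicity=2))  # -> 2
        print(count_coincidences(hits, multiplicity=3))  # -> 1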

  1. Ethics and Epistemology in Big Data Research.

    Science.gov (United States)

    Lipworth, Wendy; Mason, Paul H; Kerridge, Ian; Ioannidis, John P A

    2017-12-01

    Biomedical innovation and translation are increasingly emphasizing research using "big data." The hope is that big data methods will both speed up research and make its results more applicable to "real-world" patients and health services. While big data research has been embraced by scientists, politicians, industry, and the public, numerous ethical, organizational, and technical/methodological concerns have also been raised. With respect to technical and methodological concerns, there is a view that these will be resolved through sophisticated information technologies, predictive algorithms, and data analysis techniques. While such advances will likely go some way towards resolving technical and methodological issues, we believe that the epistemological issues raised by big data research have important ethical implications and raise questions about the very possibility of big data research achieving its goals.

  2. Salmonid Gamete Preservation in the Snake River Basin, Annual Report 2002.

    Energy Technology Data Exchange (ETDEWEB)

    Young, William; Kucera, Paul

    2003-07-01

    In spite of an intensive management effort, chinook salmon (Oncorhynchus tshawytscha) and steelhead (Oncorhynchus mykiss) populations in the Northwest have not recovered and are currently listed as threatened species under the Endangered Species Act. In addition to the loss of diversity from stocks that have already gone extinct, decreased genetic diversity resulting from genetic drift and inbreeding is a major concern. Reduced population and genetic variability diminishes the environmental adaptability of individual species and entire ecological communities. The Nez Perce Tribe (NPT), in cooperation with Washington State University and the University of Idaho, established a germplasm repository in 1992 in order to preserve the remaining salmonid diversity in the region. The germplasm repository provides long-term storage for cryopreserved gametes. Although only male gametes can be cryopreserved, conserving the male component of genetic diversity will maintain future management options for species recovery. NPT efforts have focused on preserving salmon and steelhead gametes from the major river subbasins in the Snake River basin. However, the repository is available for all management agencies to contribute gamete samples from other regions and species. In 2002 a total of 570 viable semen samples were added to the germplasm repository. This included the gametes of 287 chinook salmon from the Lostine River, Catherine Creek, upper Grande Ronde River, Imnaha River (Lookingglass Hatchery), Lake Creek, South Fork Salmon River, Johnson Creek, Big Creek, Capehorn Creek, Marsh Creek, Pahsimeroi River (Pahsimeroi Hatchery), and upper Salmon River (Sawtooth Hatchery) and the gametes of 280 steelhead from the North Fork Clearwater River (Dworshak Hatchery), Fish Creek, Little Sheep Creek, Pahsimeroi River (Pahsimeroi Hatchery) and Snake River (Oxbow Hatchery). In addition, gametes from 60 Yakima River spring chinook and 34 Wenatchee River coho salmon were added to the

  3. Recharge Area, Base-Flow and Quick-Flow Discharge Rates and Ages, and General Water Quality of Big Spring in Carter County, Missouri, 2000-04

    Science.gov (United States)

    Imes, Jeffrey L.; Plummer, Niel; Kleeschulte, Michael J.; Schumacher, John G.

    2007-01-01

    Exploration for lead deposits has occurred in a mature karst area of southeast Missouri that is highly valued for its scenic beauty and recreational opportunities. The area contains the two largest springs in Missouri (Big Spring and Greer Spring), both of which flow into federally designated scenic rivers. Concerns about potential mining effects on the area's ground water and aquatic biota prompted an investigation of Big Spring. Water-level measurements made during 2000 helped define the recharge area of Big Spring, Greer Spring, Mammoth Spring, and Boze Mill Spring. The data indicate two distinct potentiometric surfaces. The shallow potentiometric surface, where the depth-to-water is less than about 250 feet, tends to mimic topographic features and is strongly controlled by streams. The deep potentiometric surface, where the depth-to-water is greater than about 250 feet, represents ground-water hydraulic heads within the more mature karst areas. A highly permeable zone extends about 20 miles west of Big Spring toward the upper Hurricane Creek Basin. Deeper flowing water in the Big Spring recharge area is directed toward this permeable zone. The estimated sizes of the spring recharge areas are 426 square miles for Big Spring, 352 square miles for Greer Spring, 290 square miles for Mammoth Spring, and 54 square miles for Boze Mill Spring. A discharge accumulation curve using Big Spring daily mean discharge data shows no substantial change in the discharge pattern of Big Spring during the period of record (water years 1922 through 2004). The extended periods when the spring flow deviated from the trend line can be attributed to prolonged departures from normal precipitation. The maximum possible instantaneous flow from Big Spring has not been adequately defined because of backwater effects from the Current River during high-flow conditions. Physical constraints within the spring conduit system may restrict its maximum flow. The largest discharge measured at Big Spring

  4. Victoria Stodden: Scholarly Communication in the Era of Big Data and Big Computation

    OpenAIRE

    Stodden, Victoria

    2015-01-01

    Victoria Stodden gave the keynote address for Open Access Week 2015. "Scholarly communication in the era of big data and big computation" was sponsored by the University Libraries, Computational Modeling and Data Analytics, the Department of Computer Science, the Department of Statistics, the Laboratory for Interdisciplinary Statistical Analysis (LISA), and the Virginia Bioinformatics Institute. Victoria Stodden is an associate professor in the Graduate School of Library and Information Scien...

  5. Big Data: Concept, Potentialities and Vulnerabilities

    Directory of Open Access Journals (Sweden)

    Fernando Almeida

    2018-03-01

    Full Text Available The evolution of information systems and the growth in the use of the Internet and social networks have caused an explosion in the amount of available data relevant to the activities of companies. Therefore, the treatment of these available data is vital to support operational, tactical and strategic decisions. This paper aims to present the concept of big data and the main technologies that support the analysis of large data volumes. The potential of big data is explored considering nine sectors of activity: financial, retail, healthcare, transport, agriculture, energy, manufacturing, public, and media and entertainment. In addition, the main current opportunities, vulnerabilities and privacy challenges of big data are discussed. It was possible to conclude that despite the potential for big data to grow in the previously identified areas, there are still some challenges that need to be considered and mitigated, namely the privacy of information, the existence of qualified human resources to work with Big Data and the promotion of a data-driven organizational culture.

  6. Big data analytics a management perspective

    CERN Document Server

    Corea, Francesco

    2016-01-01

    This book is about innovation, big data, and data science seen from a business perspective. Big data is a buzzword nowadays, and there is a growing need among practitioners to better understand the phenomenon, starting from a clearly stated definition. This book aims to be a starting read for executives who want (and need) to keep pace with the technological breakthroughs introduced by new analytical techniques and piles of data. Common myths about big data will be explained, and a series of different strategic approaches will be provided. By browsing the book, it will be possible to learn how to implement a big data strategy and how to use a maturity framework to monitor the progress of the data science team, as well as how to move forward from one stage to the next. Crucial challenges related to big data will be discussed, where some of them are more general - such as ethics, privacy, and ownership - while others concern more specific business situations (e.g., initial public offering, growth st...

  7. Human factors in Big Data

    NARCIS (Netherlands)

    Boer, J. de

    2016-01-01

    Since 2014 I have been involved in various (research) projects that try to make the hype around Big Data more concrete and tangible for industry and government. Big Data is about multiple sources of (real-time) data that can be analysed, transformed into information and used to make 'smart' decisions.

  8. Forecasting in an integrated surface water-ground water system: The Big Cypress Basin, South Florida

    Science.gov (United States)

    Butts, M. B.; Feng, K.; Klinting, A.; Stewart, K.; Nath, A.; Manning, P.; Hazlett, T.; Jacobsen, T.

    2009-04-01

    The South Florida Water Management District (SFWMD) manages and protects the state's water resources on behalf of 7.5 million South Floridians and is the lead agency in restoring America's Everglades - the largest environmental restoration project in US history. Many of the projects to restore and protect the Everglades ecosystem are part of the Comprehensive Everglades Restoration Plan (CERP). The region has a unique hydrological regime, with close connection between surface water and groundwater, and a complex managed drainage network with many structures. Added to the physical complexity are the conflicting needs of the ecosystem for protection and restoration, versus the substantial urban development with the accompanying water supply, water quality and flood control issues. In this paper a novel forecasting and real-time modelling system is presented for the Big Cypress Basin. The Big Cypress Basin includes 272 km of primary canals and 46 water control structures throughout the area that provide limited levels of flood protection, as well as water supply and environmental quality management. This system is linked to the South Florida Water Management District's extensive real-time (SCADA) data monitoring and collection system. Novel aspects of this system include the use of a fully distributed and integrated modeling approach and a new filter-based updating approach for accurately forecasting river levels. Because of the interaction between surface water and groundwater, a fully integrated forecast modeling approach is required. Indeed, in the results for Tropical Storm Fay in 2008, the groundwater levels show an extremely rapid response to heavy rainfall. Analysis of this storm also shows that updating levels in the river system can have a direct impact on groundwater levels.
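
    The filter-based level updating can be illustrated by a generic scalar error-correction step that blends a model forecast with a gauge observation according to their error variances. This is a textbook Kalman-type update shown purely for orientation, not the operational scheme used in the Big Cypress system.

        # Generic filter-style update of a forecast river stage with an observation.
        def update_stage(forecast_m, observed_m, forecast_var, obs_var):
            """Blend model forecast and gauge observation by their error variances."""
            gain = forecast_var / (forecast_var + obs_var)       # weight given to the observation
            analysis = forecast_m + gain * (observed_m - forecast_m)
            analysis_var = (1.0 - gain) * forecast_var
            return analysis, analysis_var

        # Example: model says 2.40 m, gauge reads 2.55 m; with the model error larger
        # than the gauge error, the corrected stage moves most of the way to the gauge.
        print(update_stage(2.40, 2.55, forecast_var=0.04, obs_var=0.01))   # ~(2.52, 0.008)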

  9. Slaves to Big Data. Or Are We?

    Directory of Open Access Journals (Sweden)

    Mireille Hildebrandt

    2013-10-01

    Full Text Available

    In this contribution, the notion of Big Data is discussed in relation to the monetisation of personal data. The claim of some proponents, as well as adversaries, that Big Data implies that ‘n = all’, meaning that we no longer need to rely on samples because we have all the data, is scrutinised and found to be both overly optimistic and unnecessarily pessimistic. A set of epistemological and ethical issues is presented, focusing on the implications of Big Data for our perception, cognition, fairness, privacy and due process. The article then looks into the idea of user-centric personal data management to investigate to what extent it provides solutions for some of the problems triggered by the Big Data conundrum. Special attention is paid to the core principle of data protection legislation, namely purpose binding. Finally, this contribution seeks to inquire into the influence of Big Data politics on self, mind and society, and asks how we can prevent ourselves from becoming slaves to Big Data.

  10. Will Organization Design Be Affected By Big Data?

    Directory of Open Access Journals (Sweden)

    Giles Slinger

    2014-12-01

    Full Text Available Computing power and analytical methods allow us to create, collate, and analyze more data than ever before. When datasets are unusually large in volume, velocity, and variety, they are referred to as “big data.” Some observers have suggested that in order to cope with big data (a) organizational structures will need to change and (b) the processes used to design organizations will be different. In this article, we differentiate big data from relatively slow-moving, linked people data. We argue that big data will change organizational structures as organizations pursue the opportunities presented by big data. The processes by which organizations are designed, however, will be relatively unaffected by big data. Instead, organization design processes will be more affected by the complex links found in people data.

  11. Official statistics and Big Data

    Directory of Open Access Journals (Sweden)

    Peter Struijs

    2014-07-01

    Full Text Available The rise of Big Data changes the context in which organisations producing official statistics operate. Big Data provides opportunities, but in order to make optimal use of Big Data, a number of challenges have to be addressed. This stimulates increased collaboration between National Statistical Institutes, Big Data holders, businesses and universities. In time, this may lead to a shift in the role of statistical institutes in the provision of high-quality and impartial statistical information to society. In this paper, the changes in context, the opportunities, the challenges and the way to collaborate are addressed. The collaboration between the various stakeholders will involve each partner building on and contributing different strengths. For national statistical offices, traditional strengths include, on the one hand, the ability to collect data and combine data sources with statistical products and, on the other hand, their focus on quality, transparency and sound methodology. In the Big Data era of competing and multiplying data sources, they continue to have a unique knowledge of official statistical production methods. And their impartiality and respect for privacy as enshrined in law uniquely position them as a trusted third party. Based on this, they may advise on the quality and validity of information of various sources. By thus positioning themselves, they will be able to play their role as key information providers in a changing society.

  12. MILTON’S PARADISE LOST AND A POSTCOLONIAL FALL

    OpenAIRE

    LUIZ FERNANDO FERREIRA DE SÁ

    2006-01-01

    In John Milton’s Paradise Lost epic and empire are dissociated. Contrary to many misreadings, this all-important work of the English Renaissance intersects postcolonial thinking in a number of ways. By using Gayatri Spivak’s circuit of postcolonial theory and practice, this paper enacts a counterpointal (mis)reading of Milton’s text: Paradise Lost may at last free its (post-)colonial (dis)content. Since every reading is a misreading, my (mis)reading of Milton’s paradis...

  13. Big Data

    OpenAIRE

    Bútora, Matúš

    2017-01-01

    The aim of the bachelor thesis is to describe the Big Data problem area and the OLAP aggregation operations for decision support that are applied to it using the Apache Hadoop technology. The major part of the thesis is devoted to describing this technology. The last chapter deals with the way the aggregation operations are applied and with the issues involved in implementing them. An overall evaluation of the work follows, together with the possibilities for future use of the resulting system.

  14. BigDansing

    KAUST Repository

    Khayyat, Zuhair; Ilyas, Ihab F.; Jindal, Alekh; Madden, Samuel; Ouzzani, Mourad; Papotti, Paolo; Quiané -Ruiz, Jorge-Arnulfo; Tang, Nan; Yin, Si

    2015-01-01

    of the underlying distributed platform. BigDansing takes these rules into a series of transformations that enable distributed computations and several optimizations, such as shared scans and specialized join operators. Experimental results on both synthetic

  15. Leveraging Mobile Network Big Data for Developmental Policy ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Some argue that big data and big data users offer advantages to generate evidence. ... Supported by IDRC, this research focused on transportation planning in urban ... Using mobile network big data for land use classification CPRsouth 2015.

  16. Geohydrologic Investigations and Landscape Characteristics of Areas Contributing Water to Springs, the Current River, and Jacks Fork, Ozark National Scenic Riverways, Missouri

    Science.gov (United States)

    Mugel, Douglas N.; Richards, Joseph M.; Schumacher, John G.

    2009-01-01

    The Ozark National Scenic Riverways (ONSR) is a narrow corridor that stretches for approximately 134 miles along the Current River and Jacks Fork in southern Missouri. Most of the water flowing in the Current River and Jacks Fork is discharged to the rivers from springs within the ONSR, and most of the recharge area of these springs is outside the ONSR. This report describes geohydrologic investigations and landscape characteristics of areas contributing water to springs and the Current River and Jacks Fork in the ONSR. The potentiometric-surface map of the study area for 2000-07 shows that the groundwater divide extends beyond the surface-water divide in some places, notably along Logan Creek and the northeastern part of the study area, indicating interbasin transfer of groundwater between surface-water basins. A low hydraulic gradient occurs in much of the upland area west of the Current River associated with areas of high sinkhole density, which indicates the presence of a network of subsurface karst conduits. The results of a low base-flow seepage run indicate that most of the discharge in the Current River and Jacks Fork was from identified springs, and a smaller amount was from tributaries whose discharge probably originated as spring discharge, or from springs or diffuse groundwater discharge in the streambed. Results of a temperature profile conducted on an 85-mile reach of the Current River indicate that the lowest average temperatures were within or downstream from inflows of springs. A mass-balance-on-heat calculation for Bass Rock Spring, a previously undescribed spring, yielded an estimated discharge of 34.1 cubic feet per second (ft3/s), making it the sixth largest spring in the Current River Basin. The 13 springs in the study area for which recharge areas have been estimated accounted for 82 percent (867 ft3/s of 1,060 ft3/s) of the discharge of the Current River at Big Spring during the 2006 seepage run. Including discharge from
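
    The mass-balance-on-heat estimate follows the standard two-component mixing form (quoted here generically; the report's exact formulation and measurement details may differ):

        Q_s = Q_d \,\frac{T_d - T_u}{T_s - T_u}

    where Q_d and T_d are the discharge and temperature of the mixed flow downstream of the spring, T_u is the river temperature upstream of the spring, T_s is the spring temperature, and Q_s is the spring discharge being estimated.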

  17. Experimental feeding of DDE and PCB to female big brown bats (Eptesicus fuscus). [1,1-dichloro-2,2-bis(p-chlorophenyl)-ethylene

    Energy Technology Data Exchange (ETDEWEB)

    Clark, D.R. Jr.; Prouty, R.M.

    1977-03-01

    Twenty-two female big brown bats (Eptesicus fuscus) were collected in a house attic in Montgomery County, Maryland. Seventeen were fed mealworms (Tenebrio molitor larvae) that contained 166 ppm DDE; the other five were fed uncontaminated mealworms. After 54 days of feeding, six dosed bats were frozen and the remaining 16 were starved to death. In a second experiment, 21 female big brown bats were collected in a house attic in Prince Georges County, Maryland. Sixteen were fed mealworms that contained 9.4 ppm Aroclor 1254 (PCB). After 37 days, two bats had died, four dosed bats were frozen, and the remaining 15 were starved to death. Starvation caused mobilization of stored residues. After the feeding periods, average weights of all four groups (DDE-dosed, DDE control, PCB-dosed, PCB control) had increased. However, weights of DDE-dosed bats had increased significantly more than those of their controls, whereas weights of PCB-dosed bats had increased significantly less than those of their controls. During starvation, PCB-dosed bats lost weight significantly more slowly than controls. Because PCB levels in dosed bats resembled levels found in some free-living big brown bats, PCBs may be slowing metabolic rates of some free-living bats. It is not known how various common organochlorine residues may affect metabolism in hibernating bats.

  18. Practice Variation in Big-4 Transparency Reports

    DEFF Research Database (Denmark)

    Girdhar, Sakshi; Klarskov Jeppesen, Kim

    2018-01-01

    Purpose: The purpose of this paper is to examine the transparency reports published by the Big-4 public accounting firms in the UK, Germany and Denmark to understand the determinants of their content within the networks of big accounting firms. Design/methodology/approach: The study draws on a qualitative research approach, in which the content of transparency reports is analyzed and semi-structured interviews are conducted with key people from the Big-4 firms who are responsible for developing the transparency reports. Findings: The findings show that the content of transparency reports is inconsistent and the transparency reporting practice is not uniform within the Big-4 networks. Differences were found in the way in which the transparency reporting practices are coordinated globally by the respective central governing bodies of the Big-4. The content of the transparency reports...

  19. Functional redundancy and sensitivity of fish assemblages in European rivers, lakes and estuarine ecosystems.

    Science.gov (United States)

    Teichert, Nils; Lepage, Mario; Sagouis, Alban; Borja, Angel; Chust, Guillem; Ferreira, Maria Teresa; Pasquaud, Stéphanie; Schinegger, Rafaela; Segurado, Pedro; Argillier, Christine

    2017-12-14

    The impact of species loss on ecosystems functioning depends on the amount of trait similarity between species, i.e. functional redundancy, but it is also influenced by the order in which species are lost. Here we investigated redundancy and sensitivity patterns across fish assemblages in lakes, rivers and estuaries. Several scenarios of species extinction were simulated to determine whether the loss of vulnerable species (with high propensity of extinction when facing threats) causes a greater functional alteration than random extinction. Our results indicate that the functional redundancy tended to increase with species richness in lakes and rivers, but not in estuaries. We demonstrated that i) in the three systems, some combinations of functional traits are supported by non-redundant species, ii) rare species in rivers and estuaries support singular functions not shared by dominant species, iii) the loss of vulnerable species can induce greater functional alteration in rivers than in lakes and estuaries. Overall, the functional structure of fish assemblages in rivers is weakly buffered against species extinction because vulnerable species support singular functions. More specifically, a hotspot of functional sensitivity was highlighted in the Iberian Peninsula, which emphasizes the usefulness of quantitative criteria to determine conservation priorities.
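
    Functional redundancy is often operationalized as the number of species sharing the same combination of trait categories (a functional entity). The sketch below uses hypothetical traits purely to illustrate that bookkeeping; the paper's indices and trait set are more detailed.

        # Count species per unique trait combination (functional entity).
        from collections import Counter

        species_traits = {
            "barbel":  ("benthic", "invertivore", "rheophilic"),
            "gudgeon": ("benthic", "invertivore", "rheophilic"),
            "pike":    ("pelagic", "piscivore",   "limnophilic"),
            "bleak":   ("pelagic", "invertivore", "eurytopic"),
        }

        entities = Counter(species_traits.values())              # species count per functional entity
        mean_redundancy = sum(entities.values()) / len(entities)
        singular = [fe for fe, n in entities.items() if n == 1]  # functions held by one species only

        print(f"mean species per functional entity: {mean_redundancy:.2f}")
        print(f"{len(singular)} trait combinations supported by a single species")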

  20. Big data and biomedical informatics: a challenging opportunity.

    Science.gov (United States)

    Bellazzi, R

    2014-05-22

    Big data are receiving increasing attention in biomedicine and healthcare. It is therefore important to understand the reason why big data are assuming a crucial role for the biomedical informatics community. The capability of handling big data is becoming an enabler to carry out unprecedented research studies and to implement new models of healthcare delivery. Therefore, it is first necessary to deeply understand the four elements that constitute big data, namely Volume, Variety, Velocity, and Veracity, and their meaning in practice. Then, it is mandatory to understand where big data are present, and where they can be beneficially collected. There are research fields, such as translational bioinformatics, which need to rely on big data technologies to withstand the shock wave of data that is generated every day. Other areas, ranging from epidemiology to clinical care, can benefit from the exploitation of the large amounts of data that are nowadays available, from personal monitoring to primary care. However, building big data-enabled systems carries relevant implications in terms of reproducibility of research studies and management of privacy and data access; proper actions should be taken to deal with these issues. An interesting consequence of the big data scenario is the availability of new software, methods, and tools, such as map-reduce, cloud computing, and concept drift machine learning algorithms, which will not only contribute to big data research, but may be beneficial in many biomedical informatics applications. The way forward with the big data opportunity will require properly applied engineering principles to design studies and applications, to avoid preconceptions or over-enthusiasm, to fully exploit the available technologies, and to improve data processing and data management regulations.

  1. Was the big bang hot

    International Nuclear Information System (INIS)

    Wright, E.L.

    1983-01-01

    The author considers experiments to confirm the substantial deviations from a Planck curve in the Woody and Richards spectrum of the microwave background, and a search for conducting needles in our galaxy. Spectral deviations and needle-shaped grains are expected for a cold Big Bang, but are not required by a hot Big Bang. (Auth.)

  2. Passport to the Big Bang

    CERN Multimedia

    De Melis, Cinzia

    2013-01-01

    On 2 June 2013, CERN inaugurates the Passport to the Big Bang project at a major public event; poster and programme. On 2 June 2013 CERN launches a scientific tourist trail through the Pays de Gex and the Canton of Geneva known as the Passport to the Big Bang. Poster and Programme.

  3. Keynote: Big Data, Big Opportunities

    OpenAIRE

    Borgman, Christine L.

    2014-01-01

    The enthusiasm for big data is obscuring the complexity and diversity of data in scholarship and the challenges for stewardship. Inside the black box of data are a plethora of research, technology, and policy issues. Data are not shiny objects that are easily exchanged. Rather, data are representations of observations, objects, or other entities used as evidence of phenomena for the purposes of research or scholarship. Data practices are local, varying from field to field, individual to indiv...

  4. Integrating R and Hadoop for Big Data Analysis

    OpenAIRE

    Bogdan Oancea; Raluca Mariana Dragoescu

    2014-01-01

    Analyzing and working with big data could be very difficult using classical means like relational database management systems or desktop software packages for statistics and visualization. Instead, big data requires large clusters with hundreds or even thousands of computing nodes. Official statistics is increasingly considering big data for deriving new statistics because big data sources could produce more relevant and timely statistics than traditional sources. One of the software tools ...

  5. The challenges of big data.

    Science.gov (United States)

    Mardis, Elaine R

    2016-05-01

    The largely untapped potential of big data analytics is a feeding frenzy that has been fueled by the production of many next-generation-sequencing-based data sets that are seeking to answer long-held questions about the biology of human diseases. Although these approaches are likely to be a powerful means of revealing new biological insights, there are a number of substantial challenges that currently hamper efforts to harness the power of big data. This Editorial outlines several such challenges as a means of illustrating that the path to big data revelations is paved with perils that the scientific community must overcome to pursue this important quest. © 2016. Published by The Company of Biologists Ltd.

  6. Big³. Editorial.

    Science.gov (United States)

    Lehmann, C U; Séroussi, B; Jaulent, M-C

    2014-05-22

    To provide an editorial introduction to the 2014 IMIA Yearbook of Medical Informatics with an overview of the content, the new publishing scheme, and the upcoming 25th anniversary. A brief overview of the 2014 special topic, Big Data - Smart Health Strategies, and an outline of the novel publishing model are provided in conjunction with a call for proposals to celebrate the 25th anniversary of the Yearbook. 'Big Data' has become the latest buzzword in informatics and promises new approaches and interventions that can improve health, well-being, and quality of life. This edition of the Yearbook acknowledges the fact that we have just started to explore the opportunities that 'Big Data' will bring. However, it will become apparent to the reader that its pervasive nature has invaded all aspects of biomedical informatics - some to a higher degree than others. It was our goal to provide a comprehensive view of the state of 'Big Data' today, explore its strengths and weaknesses, as well as its risks, discuss emerging trends, tools, and applications, and stimulate the development of the field through the aggregation of excellent survey papers and working group contributions to the topic. For the first time in history, the IMIA Yearbook will be published in an open-access online format, allowing a broader readership, especially in resource-poor countries. For the first time, thanks to the online format, the IMIA Yearbook will be published twice in the year, with two different tracks of papers. We anticipate that the important role of the IMIA Yearbook will further increase with these changes, just in time for its 25th anniversary in 2016.

  7. More dams planned for Nitassinan Rivers : Innu Nation backgrounder on the proposed Churchill River hydro projects

    International Nuclear Information System (INIS)

    1999-01-01

    The Labrador Innu have expressed their concerns regarding the proposed development of a huge hydroelectric project for the Nitassinan Rivers. The Innu people were not consulted regarding the negotiations which will take place between Newfoundland and Quebec. The biggest concern of the Innu people is the cumulative environmental and social effects of the proposed development and how it will contribute to opening up their territory to further development. For example, access roads, transmission corridors and large-scale clear-cut forestry operations would all impact on their traditional way of life. This paper also describes the impacts that the Innu have already experienced as a result of hydroelectric development on the Nitassinan Rivers during the 1970s, when the government of Newfoundland gave the rights to develop and exploit water resources, forests and minerals to a private company. At that time, large areas of hunting and trapping territories were flooded, belongings were lost and burial sites were flooded. The flooding also resulted in increased levels of methyl mercury in fish in the reservoirs as well as downstream. The losses suffered by the Innu people have yet to be addressed by the governments.

  8. Effects of historical lead–zinc mining on riffle-dwelling benthic fish and crayfish in the Big River of southeastern Missouri, USA

    Science.gov (United States)

    Allert, A.L.; DiStefano, R.J.; Fairchild, J.F.; Schmitt, C.J.; McKee, M.J.; Girondo, J.A.; Brumbaugh, W.G.; May, T.W.

    2013-01-01

    The Big River (BGR) drains much of the Old Lead Belt mining district (OLB) in southeastern Missouri, USA, which was historically among the largest producers of lead–zinc (Pb–Zn) ore in the world. We sampled benthic fish and crayfish in riffle habitats at eight sites in the BGR and conducted 56-day in situ exposures to the woodland crayfish (Orconectes hylas) and golden crayfish (Orconectes luteus) in cages at four sites affected to differing degrees by mining. Densities of fish and crayfish, physical habitat and water quality, and the survival and growth of caged crayfish were examined at sites with no known upstream mining activities (i.e., reference sites) and at sites downstream of mining areas (i.e., mining and downstream sites). Lead, zinc, and cadmium were analyzed in surface and pore water, sediment, detritus, fish, crayfish, and other benthic macro-invertebrates. Metals concentrations in all materials analyzed were greater at mining and downstream sites than at reference sites. Ten species of fish and four species of crayfish were collected. Fish and crayfish densities were significantly greater at reference than mining or downstream sites, and densities were greater at downstream than mining sites. Survival of caged crayfish was significantly lower at mining sites than reference sites; downstream sites were not tested. Chronic toxic-unit scores and sediment probable effects quotients indicated significant risk of toxicity to fish and crayfish, and metals concentrations in crayfish were sufficiently high to represent a risk to wildlife at mining and downstream sites. Collectively, the results provided direct evidence that metals associated with historical mining activities in the OLB continue to affect aquatic life in the BGR.

  9. Cloud Based Big Data Infrastructure: Architectural Components and Automated Provisioning

    OpenAIRE

    Demchenko, Yuri; Turkmen, Fatih; Blanchet, Christophe; Loomis, Charles; Laat, Caees de

    2016-01-01

    This paper describes the general architecture and functional components of the cloud based Big Data Infrastructure (BDI). The proposed BDI architecture is based on the analysis of the emerging Big Data and data intensive technologies and supported by the definition of the Big Data Architecture Framework (BDAF) that defines the following components of the Big Data technologies: Big Data definition, Data Management including data lifecycle and data structures, Big Data Infrastructure (generical...

  10. Physics with Big Karl Brainstorming. Abstracts

    International Nuclear Information System (INIS)

    Machner, H.; Lieb, J.

    2000-08-01

    Before summarizing details of the meeting, a short description of the spectrometer facility Big Karl is given. The facility is essentially a new instrument using refurbished dipole magnets from its predecessor. The large acceptance quadrupole magnets and the beam optics are new. Big Karl has a design very similar to the focussing spectrometers at MAMI (Mainz), AGOR (Groningen) and the high resolution spectrometer (HRS) in Hall A at Jefferson Laboratory, with ΔE/E = 10^-4 but at a somewhat lower maximum momentum. The focal plane detectors, consisting of multiwire drift chambers and scintillating hodoscopes, are similar. Unlike HRS, Big Karl still needs Cerenkov counters and polarimeters in its focal plane, detectors which are necessary to perform some of the experiments proposed during the brainstorming. In addition, BIG KARL allows emission angle reconstruction via track measurements in its focal plane with high resolution. In the following, the physics highlights and the proposed and potential experiments are summarized. During the meeting it became obvious that the physics to be explored at Big Karl can be grouped into five distinct categories, and this summary is organized accordingly. (orig.)

  11. Fish Passage Assessment: Big Canyon Creek Watershed, Technical Report 2004.

    Energy Technology Data Exchange (ETDEWEB)

    Christian, Richard

    2004-02-01

    This report presents the results of the fish passage assessment outlined as part of the Protect and Restore the Big Canyon Creek Watershed project, as detailed in the CY2003 Statement of Work (SOW). As part of the Northwest Power Planning Council's Columbia Basin Fish and Wildlife Program (FWP), this project is one of Bonneville Power Administration's (BPA) many efforts at off-site mitigation for damage to salmon and steelhead runs, their migration, and wildlife habitat caused by the construction and operation of federal hydroelectric dams on the Columbia River and its tributaries. The proposed restoration activities within the Big Canyon Creek watershed follow the watershed restoration approach mandated by the Fisheries and Watershed Program. The Nez Perce Tribal Fisheries/Watershed Program vision focuses on protecting, restoring, and enhancing watersheds and treaty resources within the ceded territory of the Nez Perce Tribe under the Treaty of 1855 with the United States Federal Government. The program uses a holistic approach, which encompasses entire watersheds, ridge top to ridge top, emphasizing all cultural aspects. We strive toward maximizing historic ecosystem health and productivity, for the restoration of anadromous and resident fish populations. The Nez Perce Tribal Fisheries/Watershed Program (NPTFWP) sponsors the Protect and Restore the Big Canyon Creek Watershed project. The NPTFWP has the authority to allocate funds under the provisions set forth in their contract with BPA. In the state of Idaho, vast numbers of relatively small obstructions, such as road culverts, block thousands of miles of habitat suitable for a variety of fish species. To date, most agencies and land managers have not had sufficient, quantifiable data to adequately address these barrier sites. The ultimate objective of this comprehensive inventory and assessment was to identify all barrier crossings within the watershed. The barriers were then prioritized according to the

  12. Seed bank and big sagebrush plant community composition in a range margin for big sagebrush

    Science.gov (United States)

    Martyn, Trace E.; Bradford, John B.; Schlaepfer, Daniel R.; Burke, Ingrid C.; Laurenroth, William K.

    2016-01-01

    The potential influence of seed bank composition on range shifts of species due to climate change is unclear. Seed banks can provide a means of both species persistence in an area and local range expansion in the case of increasing habitat suitability, as may occur under future climate change. However, a mismatch between the seed bank and the established plant community may represent an obstacle to persistence and expansion. In big sagebrush (Artemisia tridentata) plant communities in Montana, USA, we compared the seed bank to the established plant community. There was less than a 20% similarity in the relative abundance of species between the established plant community and the seed bank. This difference was primarily driven by an overrepresentation of native annual forbs and an underrepresentation of big sagebrush in the seed bank compared to the established plant community. Even though we expect an increase in habitat suitability for big sagebrush under future climate conditions at our sites, the current mismatch between the plant community and the seed bank could impede big sagebrush range expansion into increasingly suitable habitat in the future.

  13. Application and Prospect of Big Data in Water Resources

    Science.gov (United States)

    Xi, Danchi; Xu, Xinyi

    2017-04-01

    Because of advances in information technology and affordable data storage, we have entered the era of data explosion. The term "Big Data" and the technology related to it have emerged and are commonly applied in many fields. However, academic studies have only recently turned their attention to Big Data applications in water resources. As a result, water resource Big Data technology has not been fully developed. This paper introduces the concept of Big Data and its key technologies, including the Hadoop system and MapReduce. In addition, this paper focuses on the significance of applying big data in water resources and summarizes prior research by others. Most studies in this field only set up a theoretical framework, but we define "Water Big Data" and explain its three-dimensional properties, which are the time dimension, the spatial dimension and the intelligent dimension. Based on HBase, a classification system for Water Big Data is introduced: hydrology data, ecology data and socio-economic data. Then, after analyzing the challenges in water resources management, a series of solutions using Big Data technologies such as data mining and web crawlers are proposed. Finally, the prospect of applying big data in water resources is discussed; it can be predicted that as Big Data technology keeps developing, "3D" (Data Driven Decision) will be utilized more in water resources management in the future.
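
    As a toy illustration of the MapReduce style of processing mentioned in this record, the sketch below aggregates daily river-gauge readings into per-station mean discharge with explicit map, shuffle and reduce phases. It is plain Python with no Hadoop cluster involved, and the station identifiers and values are invented for illustration.

      from collections import defaultdict

      # Each record: (station_id, date, discharge_m3s) - hypothetical gauge data.
      records = [
          ("G01", "2016-07-01", 132.0),
          ("G01", "2016-07-02", 140.5),
          ("G02", "2016-07-01", 58.25),
          ("G02", "2016-07-02", 61.75),
      ]

      # Map phase: emit (key, value) pairs, here (station_id, discharge).
      mapped = [(station, q) for station, _, q in records]

      # Shuffle phase: group values by key.
      groups = defaultdict(list)
      for station, q in mapped:
          groups[station].append(q)

      # Reduce phase: collapse each group to a summary statistic (mean discharge).
      means = {station: sum(vals) / len(vals) for station, vals in groups.items()}
      print(means)  # {'G01': 136.25, 'G02': 60.0}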

  14. Big Data in food and agriculture

    Directory of Open Access Journals (Sweden)

    Kelly Bronson

    2016-06-01

    Full Text Available Farming is undergoing a digital revolution. Our existing review of current Big Data applications in the agri-food sector has revealed several collection and analytics tools that may have implications for relationships of power between players in the food system (e.g. between farmers and large corporations). For example, who retains ownership of the data generated by applications like Monsanto Corporation's Weed I.D. “app”? Are there privacy implications with the data gathered by John Deere's precision agricultural equipment? Systematically tracing the digital revolution in agriculture, and charting the affordances as well as the limitations of Big Data applied to food and agriculture, should be a broad research goal for Big Data scholarship. Such a goal brings data scholarship into conversation with food studies and it allows for a focus on the material consequences of big data in society.

  15. Big data optimization recent developments and challenges

    CERN Document Server

    2016-01-01

    The main objective of this book is to provide the necessary background to work with big data by introducing some novel optimization algorithms and codes capable of working in the big data setting, as well as introducing some applications in big data optimization for interested academics and practitioners, and to benefit society, industry, academia, and government. Presenting applications in a variety of industries, this book will be useful for researchers aiming to analyze large-scale data. Several optimization algorithms for big data, including convergent parallel algorithms, the limited memory bundle algorithm, the diagonal bundle method, network analytics, and many more, have been explored in this book.

  16. Una aproximación a Big Data = An approach to Big Data

    OpenAIRE

    Puyol Moreno, Javier

    2014-01-01

    Big Data can be considered a trend in the advance of technology that has opened the door to a new approach to understanding and decision-making, and it is used to describe the enormous amounts of data (structured, unstructured and semi-structured) that would be too time-consuming and costly to load into a relational database for analysis. Thus, the concept of Big Data applies to all information that cannot be processed or analyzed using tools ...

  17. Toward a Literature-Driven Definition of Big Data in Healthcare.

    Science.gov (United States)

    Baro, Emilie; Degoul, Samuel; Beuscart, Régis; Chazard, Emmanuel

    2015-01-01

    The aim of this study was to provide a definition of big data in healthcare. A systematic search of PubMed literature published until May 9, 2014, was conducted. We noted the number of statistical individuals (n) and the number of variables (p) for all papers describing a dataset. These papers were classified into fields of study. Characteristics attributed to big data by authors were also considered. Based on this analysis, a definition of big data was proposed. A total of 196 papers were included. Big data can be defined as datasets with Log(n∗p) ≥ 7. Properties of big data are its great variety and high velocity. Big data raises challenges on veracity, on all aspects of the workflow, on extracting meaningful information, and on sharing information. Big data requires new computational methods that optimize data management. Related concepts are data reuse, false knowledge discovery, and privacy issues. Big data is defined by volume. Big data should not be confused with data reuse: data can be big without being reused for another purpose, for example, in omics. Inversely, data can be reused without being necessarily big, for example, secondary use of Electronic Medical Records (EMR) data.
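
    As a minimal illustration of this volume-based definition, the snippet below flags a dataset as "big" when log10(n*p) ≥ 7; the base-10 logarithm and the example dataset sizes are assumptions made for illustration.

      import math

      def is_big_data(n_individuals, p_variables, threshold=7.0):
          # Volume criterion from the definition above: log10(n * p) >= 7.
          return math.log10(n_individuals * p_variables) >= threshold

      print(is_big_data(5_000, 30))          # False: log10(150000) is about 5.2
      print(is_big_data(200_000, 450_000))   # True: log10(9e10) is about 11.0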

  18. Big Data Analytic, Big Step for Patient Management and Care in Puerto Rico.

    Science.gov (United States)

    Borrero, Ernesto E

    2018-01-01

    This letter provides an overview of the application of big data in the health care system to improve quality of care, including predictive modelling for risk and resource use, precision medicine and clinical decision support, quality of care and performance measurement, public health and research applications, among others. The author delineates the tremendous potential for big data analytics and discusses how it can be successfully implemented in clinical practice, as an important component of a learning health-care system.

  19. Big Data and Biomedical Informatics: A Challenging Opportunity

    Science.gov (United States)

    2014-01-01

    Summary Big data are receiving increasing attention in biomedicine and healthcare. It is therefore important to understand the reason why big data are assuming a crucial role for the biomedical informatics community. The capability of handling big data is becoming an enabler to carry out unprecedented research studies and to implement new models of healthcare delivery. Therefore, it is first necessary to deeply understand the four elements that constitute big data, namely Volume, Variety, Velocity, and Veracity, and their meaning in practice. Then, it is mandatory to understand where big data are present, and where they can be beneficially collected. There are research fields, such as translational bioinformatics, which need to rely on big data technologies to withstand the shock wave of data that is generated every day. Other areas, ranging from epidemiology to clinical care, can benefit from the exploitation of the large amounts of data that are nowadays available, from personal monitoring to primary care. However, building big data-enabled systems carries relevant implications in terms of reproducibility of research studies and management of privacy and data access; proper actions should be taken to deal with these issues. An interesting consequence of the big data scenario is the availability of new software, methods, and tools, such as map-reduce, cloud computing, and concept drift machine learning algorithms, which will not only contribute to big data research, but may be beneficial in many biomedical informatics applications. The way forward with the big data opportunity will require properly applied engineering principles to design studies and applications, to avoid preconceptions or over-enthusiasm, to fully exploit the available technologies, and to improve data processing and data management regulations. PMID:24853034

  20. Big data governance an emerging imperative

    CERN Document Server

    Soares, Sunil

    2012-01-01

    Written by a leading expert in the field, this guide focuses on the convergence of two major trends in information management, big data and information governance, by taking a strategic approach oriented around business cases and industry imperatives. With the advent of new technologies, enterprises are expanding and handling very large volumes of data; this book, nontechnical in nature and geared toward business audiences, encourages the practice of establishing appropriate governance over big data initiatives and addresses how to manage and govern big data, highlighting the relevant processes,

  1. Big Data and historical social science

    Directory of Open Access Journals (Sweden)

    Peter Bearman

    2015-11-01

    Full Text Available “Big Data” can revolutionize historical social science if it arises from substantively important contexts and is oriented towards answering substantively important questions. Such data may be especially important for answering previously largely intractable questions about the timing and sequencing of events, and of event boundaries. That said, “Big Data” makes no difference for social scientists and historians whose accounts rest on narrative sentences. Since such accounts are the norm, the effects of Big Data on the practice of historical social science may be more limited than one might wish.

  2. Reconnecting fragmented sturgeon populations in North American rivers

    Science.gov (United States)

    Jager, Henriette; Parsley, Michael J.; Cech, Joseph J. Jr.; McLaughlin, R.L.; Forsythe, Patrick S.; Elliott, Robert S.

    2016-01-01

    The majority of large North American rivers are fragmented by dams that interrupt migrations of wide-ranging fishes like sturgeons. Reconnecting habitat is viewed as an important means of protecting sturgeon species in U.S. rivers because these species have lost between 5% and 60% of their historical ranges. Unfortunately, facilities designed to pass other fishes have rarely worked well for sturgeons. The most successful passage facilities were sized appropriately for sturgeons and accommodated bottom-oriented species. For upstream passage, facilities with large entrances, full-depth guidance systems, large lifts, or wide fishways without obstructions or tight turns worked well. However, facilitating upstream migration is only half the battle. Broader recovery for linked sturgeon populations requires safe “round-trip” passage involving multiple dams. The most successful downstream passage facilities included nature-like fishways, large canal bypasses, and bottom-draw sluice gates. We outline an adaptive approach to implementing passage that begins with temporary programs and structures and monitors success both at the scale of individual fish at individual dams and the scale of metapopulations in a river basin. The challenge will be to learn from past efforts and reconnect North American sturgeon populations in a way that promotes range expansion and facilitates population recovery.

  3. The Inverted Big-Bang

    OpenAIRE

    Vaas, Ruediger

    2004-01-01

    Our universe appears to have been created not out of nothing but from a strange space-time dust. Quantum geometry (loop quantum gravity) makes it possible to avoid the ominous beginning of our universe with its physically unrealistic (i.e. infinite) curvature, extreme temperature, and energy density. This could be the long sought after explanation of the big-bang and perhaps even opens a window into a time before the big-bang: Space itself may have come from an earlier collapsing universe tha...

  4. Minsky on "Big Government"

    Directory of Open Access Journals (Sweden)

    Daniel de Santana Vasconcelos

    2014-03-01

    Full Text Available This paper's objective is to assess, in light of the main works of Minsky, his view and analysis of what he called the "Big Government" as that huge institution which, in parallel with the "Big Bank", was capable of ensuring stability in the capitalist system and regulating its inherently unstable financial system in the mid-20th century. In this work, we analyze how Minsky proposes an active role for the government in a complex economic system flawed by financial instability.

  5. Hours Lost to Planned and Unplanned Dental Visits Among US Adults.

    Science.gov (United States)

    Kelekar, Uma; Naavaal, Shillpa

    2018-01-11

    Poor oral health is associated with lost hours at work or school, which may affect a person's productivity. The objective of our study was to estimate work or school hours lost to dental visits among adults aged 18 and older by the types of visits (emergency or unplanned; routine, planned, or orthodontic; or cosmetic) and to determine the factors associated with hours lost. We used the most recent Oral Health Supplement data, from the 2008 National Health Interview Survey (NHIS), to estimate the total hours lost at work or school for dental visits among adults in the United States. The associations of the hours lost in unplanned and planned dental visits with socioeconomic characteristics, oral health status, and affordability were calculated. We used χ2 tests and logistic regression to determine associations at P < .05. In total, about 320.8 million work or school hours were lost annually for dental care in the United States, of which 92.4 million hours were for emergency (unplanned) care (0.99 h/adult), 159.8 million for routine (planned) care or orthodontic care (1.71 h/adult), and 68.6 million for cosmetic care (0.73 h/adult). Adults with poor oral health were more likely to lose one or more hours in unplanned dental visits (OR = 5.60; 95% confidence interval [CI], 3.25-9.63) than those who reported very good oral health. Not being able to afford dental care was positively associated with more work hours lost in unplanned care (odds ratio [OR] = 2.56; 95% CI, 1.76-3.73). Compared with Hispanic adults, non-Hispanic white adults (OR = 2.09; 95% CI, 1.40-3.11) and non-Hispanic Asian adults and adults of other races/ethnicities (OR = 1.91; 95% CI, 1.06-3.47) were more likely to lose any hours for planned care. Consistently, those with more than a high school education were more likely to lose any hours in planned care (OR = 1.39; 95% CI, 1.06-1.83) than those with a high school education or less. Dental problems result in hours lost from work and may adversely affect a person's productivity. There is

  6. Classical propagation of strings across a big crunch/big bang singularity

    International Nuclear Information System (INIS)

    Niz, Gustavo; Turok, Neil

    2007-01-01

    One of the simplest time-dependent solutions of M theory consists of nine-dimensional Euclidean space times 1+1-dimensional compactified Milne space-time. With a further modding out by Z2, the space-time represents two orbifold planes which collide and re-emerge, a process proposed as an explanation of the hot big bang [J. Khoury, B. A. Ovrut, P. J. Steinhardt, and N. Turok, Phys. Rev. D 64, 123522 (2001).][P. J. Steinhardt and N. Turok, Science 296, 1436 (2002).][N. Turok, M. Perry, and P. J. Steinhardt, Phys. Rev. D 70, 106004 (2004).]. When the two planes are near, the light states of the theory consist of winding M2-branes, describing fundamental strings in a particular ten-dimensional background. They suffer no blue-shift as the M theory dimension collapses, and their equations of motion are regular across the transition from big crunch to big bang. In this paper, we study the classical evolution of fundamental strings across the singularity in some detail. We also develop a simple semiclassical approximation to the quantum evolution which allows one to compute the quantum production of excitations on the string and implement it in a simplified example

  7. Energy-Saving Melting and Revert Reduction Technology (E-SMARRT): Lost Foam Thin Wall - Feasibility of Producing Lost Foam Castings in Aluminum and Magnesium Based Alloys

    Energy Technology Data Exchange (ETDEWEB)

    Fasoyinu, Yemi [CanmetMATERIALS; Griffin, John A. [University of Alabama - Birmingham

    2014-03-31

    With the increased emphasis on vehicle weight reduction, production of near-net shape components by lost foam casting will make significant inroad into the next-generation of engineering component designs. The lost foam casting process is a cost effective method for producing complex castings using an expandable polystyrene pattern and un-bonded sand. The use of un-bonded molding media in the lost foam process will impose less constraint on the solidifying casting, making hot tearing less prevalent. This is especially true in Al-Mg and Al-Cu alloy systems that are prone to hot tearing when poured in rigid molds partially due to their long freezing range. Some of the unique advantages of using the lost foam casting process are closer dimensional tolerance, higher casting yield, and the elimination of sand cores and binders. Most of the aluminum alloys poured using the lost foam process are based on the Al-Si system. Very limited research work has been performed with Al-Mg and Al-Cu type alloys. With the increased emphasis on vehicle weight reduction, and given the high-strength-to-weight-ratio of magnesium, significant weight savings can be achieved by casting thin-wall (≤ 3 mm) engineering components from both aluminum- and magnesium-base alloys.

  8. The Information Panopticon in the Big Data Era

    Directory of Open Access Journals (Sweden)

    Martin Berner

    2014-04-01

    Full Text Available Taking advantage of big data opportunities is challenging for traditional organizations. In this article, we take a panoptic view of big data – obtaining information from more sources and making it visible to all organizational levels. We suggest that big data requires the transformation from command and control hierarchies to post-bureaucratic organizational structures wherein employees at all levels can be empowered while simultaneously being controlled. We derive propositions that show how to best exploit big data technologies in organizations.

  9. Paleogene Vertebrate Paleontology, Geology and Remote Sensing in the Wind River Basin

    Science.gov (United States)

    Stucky, R. K.; Krishtalka, L.

    1985-01-01

    Biostratigraphic and lithostratigraphic studies were used to correlate different events in the geologic evolution of the northeastern part of the Wind River Basin and have suggested several conclusions. Laterally equivalent exposures of the Lysite member from Cedar Ridge to Bridger Creek show a gradation in lithology from interbedded boulder conglomerates and sandstones to interbedded lenticular sandstones and mudstones to interbedded carbonaceous shales, coals and tabular sandstones. This gradation suggests a shift from alluvial fan to braided stream to paludal or lacustrine sedimentary environments during the late early Eocene. The Lysite and Lost Cabin members of the Wind River Formation are in fault contact in the Bridger Creek area and may intertongue to the east along Cedar Ridge. Ways in which remote sensing could be used in these studies are discussed.

  10. 38 CFR 12.24 - Operation of lost and found service.

    Science.gov (United States)

    2010-07-01

    ... custodian. VA Form 3771, Record of Lost or Found Article, will be used for recording articles of any... articles and to recover items which have been reported lost. Currency, including readily negotiable... closing hour. The currency or negotiable instruments will be delivered to the agent cashier before the...

  11. WE-H-BRB-00: Big Data in Radiation Oncology

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2016-06-15

    Big Data in Radiation Oncology: (1) Overview of the NIH 2015 Big Data Workshop, (2) Where do we stand in the applications of big data in radiation oncology?, and (3) Learning Health Systems for Radiation Oncology: Needs and Challenges for Future Success. The overriding goal of this trio panel of presentations is to improve awareness of the wide-ranging opportunities for big data impact on patient quality care and enhancing potential for research and collaboration opportunities with NIH and a host of new big data initiatives. This presentation will also summarize the Big Data workshop that was held at the NIH Campus on August 13–14, 2015 and sponsored by AAPM, ASTRO, and NIH. The workshop included discussion of current Big Data cancer registry initiatives, safety and incident reporting systems, and other strategies that will have the greatest impact on radiation oncology research, quality assurance, safety, and outcomes analysis. Learning Objectives: (1) to discuss current and future sources of big data for use in radiation oncology research; (2) to optimize our current data collection by adopting new strategies from outside radiation oncology; (3) to determine what new knowledge big data can provide for clinical decision support for personalized medicine. L. Xing, NIH/NCI, Google Inc.

  12. WE-H-BRB-00: Big Data in Radiation Oncology

    International Nuclear Information System (INIS)

    2016-01-01

    Big Data in Radiation Oncology: (1) Overview of the NIH 2015 Big Data Workshop, (2) Where do we stand in the applications of big data in radiation oncology?, and (3) Learning Health Systems for Radiation Oncology: Needs and Challenges for Future Success. The overriding goal of this trio panel of presentations is to improve awareness of the wide-ranging opportunities for big data impact on patient quality care and enhancing potential for research and collaboration opportunities with NIH and a host of new big data initiatives. This presentation will also summarize the Big Data workshop that was held at the NIH Campus on August 13–14, 2015 and sponsored by AAPM, ASTRO, and NIH. The workshop included discussion of current Big Data cancer registry initiatives, safety and incident reporting systems, and other strategies that will have the greatest impact on radiation oncology research, quality assurance, safety, and outcomes analysis. Learning Objectives: (1) to discuss current and future sources of big data for use in radiation oncology research; (2) to optimize our current data collection by adopting new strategies from outside radiation oncology; (3) to determine what new knowledge big data can provide for clinical decision support for personalized medicine. L. Xing, NIH/NCI, Google Inc.

  13. De impact van Big Data op Internationale Betrekkingen

    NARCIS (Netherlands)

    Zwitter, Andrej

    Big Data changes our daily lives, but does it also change international politics? In this contribution, Andrej Zwitter (NGIZ chair at Groningen University) argues that Big Data impacts on international relations in ways that we only now start to understand. To comprehend how Big Data influences

  14. Epidemiology in the Era of Big Data

    Science.gov (United States)

    Mooney, Stephen J; Westreich, Daniel J; El-Sayed, Abdulrahman M

    2015-01-01

    Big Data has increasingly been promoted as a revolutionary development in the future of science, including epidemiology. However, the definition and implications of Big Data for epidemiology remain unclear. We here provide a working definition of Big Data predicated on the so-called ‘3 Vs’: variety, volume, and velocity. From this definition, we argue that Big Data has evolutionary and revolutionary implications for identifying and intervening on the determinants of population health. We suggest that as more sources of diverse data become publicly available, the ability to combine and refine these data to yield valid answers to epidemiologic questions will be invaluable. We conclude that, while epidemiology as practiced today will continue to be practiced in the Big Data future, a component of our field’s future value lies in integrating subject matter knowledge with increased technical savvy. Our training programs and our visions for future public health interventions should reflect this future. PMID:25756221

  15. Big data and analytics strategic and organizational impacts

    CERN Document Server

    Morabito, Vincenzo

    2015-01-01

    This book presents and discusses the main strategic and organizational challenges posed by Big Data and analytics in a manner relevant to both practitioners and scholars. The first part of the book analyzes strategic issues relating to the growing relevance of Big Data and analytics for competitive advantage, which is also attributable to empowerment of activities such as consumer profiling, market segmentation, and development of new products or services. Detailed consideration is also given to the strategic impact of Big Data and analytics on innovation in domains such as government and education and to Big Data-driven business models. The second part of the book addresses the impact of Big Data and analytics on management and organizations, focusing on challenges for governance, evaluation, and change management, while the concluding part reviews real examples of Big Data and analytics innovation at the global level. The text is supported by informative illustrations and case studies, so that practitioners...

  16. Big Science and Long-tail Science

    CERN Document Server

    2008-01-01

    Jim Downing and I were privileged to be the guests of Salvatore Mele at CERN yesterday and to see the Atlas detector of the Large Hadron Collider. This is a wow experience - although I knew it was big, I hadn't realised how big.

  17. Toward a Literature-Driven Definition of Big Data in Healthcare

    Directory of Open Access Journals (Sweden)

    Emilie Baro

    2015-01-01

    Full Text Available Objective. The aim of this study was to provide a definition of big data in healthcare. Methods. A systematic search of PubMed literature published until May 9, 2014, was conducted. We noted the number of statistical individuals (n) and the number of variables (p) for all papers describing a dataset. These papers were classified into fields of study. Characteristics attributed to big data by authors were also considered. Based on this analysis, a definition of big data was proposed. Results. A total of 196 papers were included. Big data can be defined as datasets with Log(n*p) ≥ 7. Properties of big data are its great variety and high velocity. Big data raises challenges on veracity, on all aspects of the workflow, on extracting meaningful information, and on sharing information. Big data requires new computational methods that optimize data management. Related concepts are data reuse, false knowledge discovery, and privacy issues. Conclusion. Big data is defined by volume. Big data should not be confused with data reuse: data can be big without being reused for another purpose, for example, in omics. Inversely, data can be reused without being necessarily big, for example, secondary use of Electronic Medical Records (EMR) data.

  18. Toward a Literature-Driven Definition of Big Data in Healthcare

    Science.gov (United States)

    Baro, Emilie; Degoul, Samuel; Beuscart, Régis; Chazard, Emmanuel

    2015-01-01

    Objective. The aim of this study was to provide a definition of big data in healthcare. Methods. A systematic search of PubMed literature published until May 9, 2014, was conducted. We noted the number of statistical individuals (n) and the number of variables (p) for all papers describing a dataset. These papers were classified into fields of study. Characteristics attributed to big data by authors were also considered. Based on this analysis, a definition of big data was proposed. Results. A total of 196 papers were included. Big data can be defined as datasets with Log⁡(n∗p) ≥ 7. Properties of big data are its great variety and high velocity. Big data raises challenges on veracity, on all aspects of the workflow, on extracting meaningful information, and on sharing information. Big data requires new computational methods that optimize data management. Related concepts are data reuse, false knowledge discovery, and privacy issues. Conclusion. Big data is defined by volume. Big data should not be confused with data reuse: data can be big without being reused for another purpose, for example, in omics. Inversely, data can be reused without being necessarily big, for example, secondary use of Electronic Medical Records (EMR) data. PMID:26137488

  19. Big-Eyed Bugs Have Big Appetite for Pests

    Science.gov (United States)

    Many kinds of arthropod natural enemies (predators and parasitoids) inhabit crop fields in Arizona and can have a large negative impact on several pest insect species that also infest these crops. Geocoris spp., commonly known as big-eyed bugs, are among the most abundant insect predators in field c...

  20. Big Data - What is it and why it matters.

    Science.gov (United States)

    Tattersall, Andy; Grant, Maria J

    2016-06-01

    Big data, like MOOCs, altmetrics and open access, is a term that has been commonplace in the library community for some time yet, despite its prevalence, many in the library and information sector remain unsure of the relationship between big data and their roles. This editorial explores what big data could mean for the day-to-day practice of health library and information workers, presenting examples of big data in action, considering the ethics of accessing big data sets and the potential for new roles for library and information workers. © 2016 Health Libraries Group.

  1. Assessment of eco-environmental geochemistry of heavy metals pollution of the river Gandak, a major tributary of the river Ganga in Northern India

    Science.gov (United States)

    Singh, Harendra; Kushwaha, Alok; Shukla, D. N.

    2018-04-01

    This study includes a seasonal analysis of sediment contamination of the River Gandak by heavy metals. The river passes through many small, medium and big cities of Uttar Pradesh and Bihar in Indian Territory. To explore the geochemical condition of the streambed sediment of the river, seven heavy metals, namely Co, Cu, Cr, Ni, Cd, Zn and Pb, were analyzed. Newly deposited river bed sediment samples were gathered on a seasonal basis from five stations for the years 2013-14 and 2014-15. Levels of heavy metals in the sediments of the river were measured in the ranges of 10.54-16.78 mg/kg for Co, 6.78-23.97 mg/kg for Cu, 16.56-23.17 mg/kg for Cr, 9.71-18.11 mg/kg for Ni, 0.364-1.068 mg/kg for Cd, 30.54-51.09 mg/kg for Zn, and 12.21-17.01 mg/kg for Pb. The anthropogenic addition of heavy metals into the stream was assessed using the metal Contamination Factor. Geo-accumulation index values were found to lie between 0 and 1, which indicates that the sediment was uncontaminated to moderately contaminated and can adversely influence the freshwater ecosystem of the river. A good correlation was noted between Co, Zn, Pb, Ni, and Cu. Cluster analysis demonstrated three cluster groups of sites, which indicates that the metals originate from the same sources, mainly natural weathering of rocks, atmospheric deposition, human settlement and agricultural activity; this is additionally confirmed by the correlation analysis. However, on the basis of the contamination indicators, it was found that the stream bed sediment is slightly contaminated with toxic metals. The conditions may become harmful in the future because of the rapid population growth in the river basin, which might bring about irreparable biological harm in the long haul.
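
    As a brief illustration of the two contamination indicators named in this record, the sketch below computes the Contamination Factor (CF = C_metal / C_background) and the geo-accumulation index (Igeo = log2(C_metal / (1.5 * C_background))) for a single metal. The background value and the sample concentration are placeholders, not data from the study.

      import math

      def contamination_factor(c_metal, c_background):
          # CF: ratio of the measured concentration to the geochemical background.
          return c_metal / c_background

      def geoaccumulation_index(c_metal, c_background):
          # Igeo = log2(Cn / (1.5 * Bn)); the factor 1.5 allows for natural
          # variability of the background. Values between 0 and 1 are commonly
          # read as "uncontaminated to moderately contaminated".
          return math.log2(c_metal / (1.5 * c_background))

      # Hypothetical values (mg/kg): measured Zn in sediment vs. background Zn.
      c_zn, b_zn = 45.0, 70.0
      print(contamination_factor(c_zn, b_zn))   # ~0.64
      print(geoaccumulation_index(c_zn, b_zn))  # ~-1.22 (class 0, uncontaminated)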

  2. Research on information security in big data era

    Science.gov (United States)

    Zhou, Linqi; Gu, Weihong; Huang, Cheng; Huang, Aijun; Bai, Yongbin

    2018-05-01

    Big data is becoming another hotspot in the field of information technology after cloud computing and the Internet of Things. However, existing information security methods can no longer meet the information security requirements of the era of big data. This paper analyzes the challenges to data security brought by big data and their causes, discusses the development trend of network attacks against the background of big data, and puts forward the authors' own opinions on the development of security defenses in technology, strategy and products.

  3. BIG DATA IN BUSINESS ENVIRONMENT

    Directory of Open Access Journals (Sweden)

    Logica BANICA

    2015-06-01

    Full Text Available In recent years, dealing with large amounts of data originating from social media sites and mobile communications, alongside data from business environments and institutions, has led to the definition of a new concept, known as Big Data. The economic impact of the sheer amount of data produced in the last two years has increased rapidly. It is necessary to aggregate all types of data (structured and unstructured) in order to improve current transactions, to develop new business models, to provide a real image of supply and demand and, thereby, to generate market advantages. So, the companies that turn to Big Data have a competitive advantage over other firms. Looking from the perspective of IT organizations, they must accommodate the storage and processing of Big Data, and provide analysis tools that are easily integrated into business processes. This paper aims to discuss aspects regarding the Big Data concept and the principles for building, organizing and analysing huge datasets in the business environment, offering a three-layer architecture based on actual software solutions. Also, the article refers to the graphical tools for exploring and representing unstructured data, Gephi and NodeXL.

  4. Fuzzy 2-partition entropy threshold selection based on Big Bang–Big Crunch Optimization algorithm

    Directory of Open Access Journals (Sweden)

    Baljit Singh Khehra

    2015-03-01

    Full Text Available The fuzzy 2-partition entropy approach has been widely used to select the threshold value for image segmentation. This approach uses two parameterized fuzzy membership functions to form a fuzzy 2-partition of the image. The optimal threshold is selected by searching for an optimal combination of parameters of the membership functions such that the entropy of the fuzzy 2-partition is maximized. In this paper, a new fuzzy 2-partition entropy thresholding approach based on the technology of the Big Bang–Big Crunch Optimization (BBBCO) is proposed. The new proposed thresholding approach is called the BBBCO-based fuzzy 2-partition entropy thresholding algorithm. BBBCO is used to search for an optimal combination of parameters of the membership functions that maximizes the entropy of the fuzzy 2-partition. BBBCO is inspired by the theory of the evolution of the universe, namely the Big Bang and Big Crunch theory. The proposed algorithm is tested on a number of standard test images. For comparison, three different algorithms, namely Genetic Algorithm (GA)-based, Biogeography-based Optimization (BBO)-based and recursive approaches, are also implemented. From the experimental results, it is observed that the performance of the proposed algorithm is more effective than the GA-based, BBO-based and recursion-based approaches.
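
    As a schematic companion to this record, the sketch below shows one common formulation of the objective such methods maximize: for a candidate pair of membership-function parameters it builds a fuzzy 2-partition of a grayscale histogram and returns the partition's fuzzy entropy. Linear ramp memberships and a brute-force parameter search stand in for the published membership functions and the Big Bang–Big Crunch optimizer, so this is an illustrative approximation, not the paper's algorithm.

      import math

      def ramp_membership(g, a, c):
          # Linear membership for the "dark" class: 1 below a, 0 above c.
          if g <= a:
              return 1.0
          if g >= c:
              return 0.0
          return (c - g) / (c - a)

      def fuzzy_2partition_entropy(hist, a, c):
          total = float(sum(hist))
          p = [h / total for h in hist]
          mu_dark = [ramp_membership(g, a, c) for g in range(len(p))]
          p_dark = sum(pi * m for pi, m in zip(p, mu_dark))
          p_bright = 1.0 - p_dark
          entropy = 0.0
          for pi, m in zip(p, mu_dark):
              for class_prob, mu in ((p_dark, m), (p_bright, 1.0 - m)):
                  q = pi * mu / class_prob if class_prob > 0 else 0.0
                  if q > 0:
                      entropy -= q * math.log(q)
          return entropy

      def best_threshold(hist, levels=256):
          # Exhaustive search over the membership parameters (a, c); the paper
          # replaces this brute-force step with the BBBCO metaheuristic.
          best_score, best_t = -1.0, 0
          for a in range(levels - 1):
              for c in range(a + 1, levels):
                  score = fuzzy_2partition_entropy(hist, a, c)
                  if score > best_score:
                      best_score, best_t = score, (a + c) // 2  # mu = 0.5 crossing
          return best_t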

  5. The importance of colloids and mires for the transport of uranium isotopes through the Kalix River watershed and Baltic Sea

    International Nuclear Information System (INIS)

    Porcelli, D.; Wasserburg, G.J.; Andersson, P.S.

    1997-01-01

    The importance of colloids and organic deposits for the transport of uranium isotopes from continental source regions and through the estuarine environment was investigated in the mire-rich Kalix River drainage basin in northern Sweden and the Baltic Sea. Ultrafiltration techniques were used to separate uranium and other elements associated with colloids >10 kD and >3 kD from "solute" uranium and provided consistent results and high recovery rates for uranium as well as for other elements from large volume samples. Uranium concentrations in 0.45 μm-filtered Kalix River water samples increased by a factor of 3 from near the headwaters in the Caledonides to the river mouth while major cation concentrations were relatively constant. 234U/238U ratios were high (δ234U = 770-1500) throughout the basin, without showing any simple pattern, and required a supply of 234U-rich water. Throughout the Kalix River, a large fraction (30-90%) of the uranium is carried by >10 kD colloids, which is compatible with uranium complexation with humic acids. No isotopic differences were found between colloid-associated and solute uranium. Within the Baltic Sea, about half of the uranium is removed at low salinities. The proportion that is lost is equivalent to that of river-derived colloid-bound uranium, suggesting that while solute uranium behaves conservatively during estuarine mixing, colloid-bound uranium is lost due to rapid flocculation of colloidal material. The association of uranium with colloids therefore may be an important parameter in determining uranium estuarine behavior. Mire peats in the Kalix River highly concentrate uranium and are potentially a significant source of recoil 234U to the mirewaters and river waters. However, mirewater data clearly demonstrate that only small 234U/238U shifts are generated relative to inflowing groundwater. 63 refs., 8 figs., 3 tabs.
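
    For orientation, the sketch below shows the standard conversions behind two quantities discussed in this record: δ234U as the per-mil deviation of the 234U/238U activity ratio from secular equilibrium (activity ratio = 1), and a mass-balance estimate of the colloid-bound fraction from a 0.45 μm filtrate and its ultrafiltration permeate. The input values are placeholders, not measurements from the study.

      def delta_234U(activity_ratio):
          # delta-234U (per mil) relative to secular equilibrium (activity ratio = 1).
          return (activity_ratio - 1.0) * 1000.0

      def colloid_bound_fraction(c_filtered, c_permeate):
          # Mass-balance estimate: uranium retained by the ultrafilter (>10 kD)
          # divided by total uranium in the 0.45 um filtrate (same units for both).
          return (c_filtered - c_permeate) / c_filtered

      print(delta_234U(1.77))                    # ~770 per mil (low end of the reported range)
      print(colloid_bound_fraction(0.50, 0.20))  # ~0.6, i.e. roughly 60% colloid-bound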

  6. A little big history of Tiananmen

    NARCIS (Netherlands)

    Quaedackers, E.; Grinin, L.E.; Korotayev, A.V.; Rodrigue, B.H.

    2011-01-01

    This contribution aims at demonstrating the usefulness of studying small-scale subjects such as Tiananmen, or the Gate of Heavenly Peace, in Beijing - from a Big History perspective. By studying such a ‘little big history’ of Tiananmen, previously overlooked yet fundamental explanations for why

  7. Addressing big data issues in Scientific Data Infrastructure

    NARCIS (Netherlands)

    Demchenko, Y.; Membrey, P.; Grosso, P.; de Laat, C.; Smari, W.W.; Fox, G.C.

    2013-01-01

    Big Data are becoming a new technology focus both in science and in industry. This paper discusses the challenges that are imposed by Big Data on the modern and future Scientific Data Infrastructure (SDI). The paper discusses a nature and definition of Big Data that include such features as Volume,

  8. Improving Healthcare Using Big Data Analytics

    Directory of Open Access Journals (Sweden)

    Revanth Sonnati

    2017-03-01

    Full Text Available In daily terms, we call the current era the Modern Era, which in the field of Information Technology can also be named the era of Big Data. Our daily lives in today's world are advancing rapidly, never quenching one's thirst. The fields of science, engineering and technology are producing data at an exponential rate, leading to exabytes of data every day. Big data helps us to explore and re-invent many areas, not limited to education, health and law. The primary purpose of this paper is to provide an in-depth analysis of the area of healthcare using big data and analytics. The main purpose is to emphasize that, although the big data being stored all the time helps us look back at history, this is the time to emphasize analysis in order to improve medication and services. Although many big data implementations happen to be in-house developments, this proposed implementation aims at a broader extent using Hadoop, which just happens to be the tip of the iceberg. The focus of this paper is not limited to the improvement and analysis of the data; it also focuses on the strengths and drawbacks compared to the conventional techniques available.

  9. Big Data - Smart Health Strategies

    Science.gov (United States)

    2014-01-01

    Summary Objectives To select best papers published in 2013 in the field of big data and smart health strategies, and summarize outstanding research efforts. Methods A systematic search was performed using two major bibliographic databases for relevant journal papers. The references obtained were reviewed in a two-stage process, starting with a blinded review performed by the two section editors, and followed by a peer review process operated by external reviewers recognized as experts in the field. Results The complete review process selected four best papers, illustrating various aspects of the special theme, among them: (a) using large volumes of unstructured data and, specifically, clinical notes from Electronic Health Records (EHRs) for pharmacovigilance; (b) knowledge discovery via querying large volumes of complex (both structured and unstructured) biological data using big data technologies and relevant tools; (c) methodologies for applying cloud computing and big data technologies in the field of genomics, and (d) system architectures enabling high-performance access to and processing of large datasets extracted from EHRs. Conclusions The potential of big data in biomedicine has been pinpointed in various viewpoint papers and editorials. The review of current scientific literature illustrated a variety of interesting methods and applications in the field, but still the promises exceed the current outcomes. As we are getting closer towards a solid foundation with respect to common understanding of relevant concepts and technical aspects, and the use of standardized technologies and tools, we can anticipate to reach the potential that big data offer for personalized medicine and smart health strategies in the near future. PMID:25123721

  10. About Big Data and its Challenges and Benefits in Manufacturing

    OpenAIRE

    Bogdan NEDELCU

    2013-01-01

    The aim of this article is to show the importance of Big Data and its growing influence on companies. It also shows what kind of big data is currently generated and how much big data is estimated to be generated. We can also see how much companies are willing to invest in big data and how much they are currently gaining from it. The article also shows some major influences that big data has on one major segment of the industry (manufacturing) and the challenges that appear.

  11. Big Data Management in US Hospitals: Benefits and Barriers.

    Science.gov (United States)

    Schaeffer, Chad; Booton, Lawrence; Halleck, Jamey; Studeny, Jana; Coustasse, Alberto

    Big data has been considered as an effective tool for reducing health care costs by eliminating adverse events and reducing readmissions to hospitals. The purposes of this study were to examine the emergence of big data in the US health care industry, to evaluate a hospital's ability to effectively use complex information, and to predict the potential benefits that hospitals might realize if they are successful in using big data. The findings of the research suggest that there were a number of benefits expected by hospitals when using big data analytics, including cost savings and business intelligence. In using big data, many hospitals have recognized that there are challenges, including a lack of experience and the cost of developing the analytics. Many hospitals will need to invest in acquiring adequate personnel with experience in big data analytics and data integration. The findings of this study suggest that the adoption, implementation, and utilization of big data technology will have a profound positive effect among health care providers.

  12. Big Data Strategy for Telco: Network Transformation

    OpenAIRE

    F. Amin; S. Feizi

    2014-01-01

    Big data has the potential to improve the quality of services; enable infrastructure that businesses depend on to adapt continually and efficiently; improve the performance of employees; help organizations better understand customers; and reduce liability risks. Analytics and marketing models of fixed and mobile operators are falling short in combating churn and declining revenue per user. Big Data presents new methods to reverse this trend and improve profitability. The benefits of Big Data and ...

  13. Big Data in Shipping - Challenges and Opportunities

    OpenAIRE

    Rødseth, Ørnulf Jan; Perera, Lokukaluge Prasad; Mo, Brage

    2016-01-01

    Big Data is getting popular in shipping where large amounts of information is collected to better understand and improve logistics, emissions, energy consumption and maintenance. Constraints to the use of big data include cost and quality of on-board sensors and data acquisition systems, satellite communication, data ownership and technical obstacles to effective collection and use of big data. New protocol standards may simplify the process of collecting and organizing the data, including in...

  14. BLAM (Benthic Light Availability Model): A Proposed Model of Hydrogeomorphic Controls on Light in Rivers

    Science.gov (United States)

    Julian, J. P.; Doyle, M. W.; Stanley, E. H.

    2006-12-01

    Light is vital to the dynamics of aquatic ecosystems. It drives photosynthesis and photochemical reactions, affects thermal structure, and influences behavior of aquatic biota. Despite the fundamental role of light to riverine ecosystems, light studies in rivers have been mostly neglected because i) boundary conditions (e.g., banks, riparian vegetation) make ambient light measurements difficult, and ii) the optical water quality of rivers is highly variable and difficult to characterize. We propose a benthic light availability model (BLAM) that predicts the percent of incoming photosynthetically active radiation (PAR) available at the river bed. BLAM was developed by quantifying light attenuation of the five hydrogeomorphic controls that dictate riverine light availability: topography, riparian vegetation, channel geometry, optical water quality, and water depth. BLAM was calibrated using hydrogeomorphic data and light measurements from two rivers: Deep River - a 5th-order, turbid river in central North Carolina, and Big Spring Creek - a 2nd-order, optically clear stream in central Wisconsin. We used a series of four PAR sensors to measure i) above-canopy PAR, ii) PAR above water surface, iii) PAR below water surface, and iv) PAR on stream bed. These measurements were used to develop empirical light attenuation coefficients, which were then used in combination with optical water quality measurements, shading analyses, channel surveys, and flow records to quantify the spatial and temporal variability in riverine light availability. Finally, we apply BLAM to the Baraboo River - a 6th-order, 120-mile, unimpounded river in central Wisconsin - in order to characterize light availability along the river continuum (from headwaters to mouth).
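
    A schematic of the kind of calculation a benthic light model performs can chain the attenuation terms listed above: topographic and riparian shading reduce above-canopy PAR, a surface-loss term reduces it at the air-water interface, and a Beer-Lambert term attenuates it through the water column. The structure and coefficient values below are illustrative assumptions, not the calibrated BLAM.

      import math

      def benthic_par_fraction(shade_fraction, surface_loss, kd, depth_m):
          # Fraction of incoming PAR reaching the river bed:
          #   shading by topography/riparian canopy -> (1 - shade_fraction)
          #   loss at the water surface             -> (1 - surface_loss)
          #   attenuation through the water column  -> exp(-kd * depth), Beer-Lambert
          return (1.0 - shade_fraction) * (1.0 - surface_loss) * math.exp(-kd * depth_m)

      # Hypothetical values: 40% canopy/valley shading, 10% surface loss,
      # light attenuation coefficient kd = 1.2 per meter, depth 0.8 m.
      print(benthic_par_fraction(0.40, 0.10, 1.2, 0.8))  # ~0.21 of incoming PAR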

  15. [Relevance of big data for molecular diagnostics].

    Science.gov (United States)

    Bonin-Andresen, M; Smiljanovic, B; Stuhlmüller, B; Sörensen, T; Grützkau, A; Häupl, T

    2018-04-01

    Big data analysis raises the expectation that computerized algorithms may extract new knowledge from otherwise unmanageable vast data sets. What are the algorithms behind the big data discussion? In principle, high-throughput technologies in molecular research already introduced big data, and the development and application of analysis tools, into the field of rheumatology some 15 years ago. This especially includes omics technologies, such as genomics, transcriptomics and cytomics. Some basic methods of data analysis are provided along with the technology; however, functional analysis and interpretation require the adaptation of existing software tools or the development of new ones. For these steps, structuring and evaluating according to the biological context is extremely important and not only a mathematical problem. This aspect has to be considered much more for molecular big data than for those analyzed in health economy or epidemiology. Molecular data are structured in a first order determined by the applied technology and present quantitative characteristics that follow the principles of their biological nature. These biological dependencies have to be integrated into software solutions, which may require networks of molecular big data of the same or even different technologies in order to achieve cross-technology confirmation. Increasingly extensive recording of molecular processes, also in individual patients, is generating personal big data and requires new strategies for management in order to develop data-driven individualized interpretation concepts. With this perspective in mind, translation of information derived from molecular big data will also require new specifications for education and professional competence.

  16. Big data in psychology: A framework for research advancement.

    Science.gov (United States)

    Adjerid, Idris; Kelley, Ken

    2018-02-22

    The potential for big data to provide value for psychology is significant. However, the pursuit of big data remains an uncertain and risky undertaking for the average psychological researcher. In this article, we address some of this uncertainty by discussing the potential impact of big data on the type of data available for psychological research, addressing the benefits and most significant challenges that emerge from these data, and organizing a variety of research opportunities for psychology. Our article yields two central insights. First, we highlight that big data research efforts are more readily accessible than many researchers realize, particularly with the emergence of open-source research tools, digital platforms, and instrumentation. Second, we argue that opportunities for big data research are diverse and differ both in their fit for varying research goals, as well as in the challenges they bring about. Ultimately, our outlook for researchers in psychology using and benefiting from big data is cautiously optimistic. Although not all big data efforts are suited for all researchers or all areas within psychology, big data research prospects are diverse, expanding, and promising for psychology and related disciplines. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  17. Socio-economic status and environmental problems affecting the fishermen along the river tributaries of Dagupan City

    Directory of Open Access Journals (Sweden)

    Sally A. Jarin

    2018-02-01

    Full Text Available This study was conducted to determine the socio-economic status of the fishermen along the river tributaries of Dagupan City and to examine the environmental problems affecting them. The study used a mixed-method research design and utilized a survey questionnaire to gather responses from 60 fishers selected through proportionate sampling. The fishermen along the tributaries of Dagupan City are mostly male young adults with families of their own, have attended only primary education, and belong to large families. All respondents owned houses made only of light materials. Shrimps and crabs were the most frequently caught species now, compared with the many small pelagic fishes caught before aquaculture structures such as fish pens and cages were introduced. The fishermen were limited to the ownership of passive fishing gears such as gill nets, skylab, skyblue, and liftnet. Fishpen and cage structures were owned by big businessmen, while the fishers served only as caretakers. The respondents are worried about the decline in fish catch. It is recommended that the government of the City of Dagupan continue its program of demolishing pen and cage structures to free the rivers from the pollution of feed inputs. Management and economic measures should be considered in order to achieve a significant effect on the income of the fishermen. In designing management systems that have income improvement as a goal, appropriate implementation, monitoring, and evaluation initiatives should be conducted and taken care of for sustainable income improvement of farmers in the community of Dagupan and, perhaps, wealth distribution.

  18. 'Big data' in pharmaceutical science: challenges and opportunities.

    Science.gov (United States)

    Dossetter, Al G; Ecker, Gerhard; Laverty, Hugh; Overington, John

    2014-05-01

    Future Medicinal Chemistry invited a selection of experts to express their views on the current impact of big data in drug discovery and design, as well as speculate on future developments in the field. The topics discussed include the challenges of implementing big data technologies, maintaining the quality and privacy of data sets, and how the industry will need to adapt to welcome the big data era. Their enlightening responses provide a snapshot of the many and varied contributions being made by big data to the advancement of pharmaceutical science.

  19. Soft computing in big data processing

    CERN Document Server

    Park, Seung-Jong; Lee, Jee-Hyong

    2014-01-01

    Big data is an essential key to building a smart world, meaning the streaming, continuous integration of large-volume and high-velocity data from all sources to final destinations. Big data ranges over data mining, data analysis and decision making, drawing statistical rules and mathematical patterns through systematic or automatic reasoning. Big data helps serve our lives better, clarify our future and deliver greater value. We can discover how to capture and analyze data. Readers will be guided through processing system integrity and implementing intelligent systems. With intelligent systems, we deal with the fundamental data management and visualization challenges in the effective management of dynamic and large-scale data, and the efficient processing of real-time and spatio-temporal data. Advanced intelligent systems have led to managing data monitoring, data processing and decision-making in a realistic and effective way. Considering the big size of data, the variety of data and frequent chan...

  20. Anthropogenic factor and water quality in the rivers of Prespa Lake catchment; Antropogeniot faktor i kvalitetot na vodata vo rekite na prespanskoto slivno podrachje

    Energy Technology Data Exchange (ETDEWEB)

    Jordanoski, Momchulo; Veljanoska-Serafiloska, Elizabeta [Hydrobiological Institute, Ohrid (Macedonia, The Former Yugoslav Republic of)

    2001-07-01

    Of the rivers that are the subject of our investigation, only the Brajcinska and Kranska Rivers are mountain rivers, while the Golema River is a lowland river. This has an influence on water quality, as is evident from the data we obtained for the investigated parameters. Water quality ranges from distinctly clear oligotrophic water (winter period) to strongly eutrophic, polluted water (summer, autumn). The heavy organic loading of the Golema River in the summer period is evident. Although there are limited possibilities for further investigation in this area, it is our obligation to find ways, even by reducing some of the sampling points of this project, to define the real state over a long time period, so that we can draw appropriate conclusions and suggestions to remedy the situation. Field observations of the river beds and the results of the laboratory investigations show how great human negligence toward these natural resources is. In practice, these rivers are recipients of all man-made wastes, such as solid waste, municipal wastewater, wastewater from pig farms, etc. The international character of Lake Prespa enforces the need for a more complete and sensible engagement to reclaim the state of the river inflows, with the aim of protecting the Lake. (Original)

  1. Retail inventory management with lost sales

    NARCIS (Netherlands)

    Curseu - Stefanut, A.

    2012-01-01

    The inventory control problem of traditional store-based grocery retailers has several challenging features. Demand for products is stochastic, and is typically lost when no inventory is available on the shelves. As the consumer behavior studies reveal, only a small percentage of customers are

  2. Lost-sales inventory theory : A review

    NARCIS (Netherlands)

    Bijvank, Marco; Vis, Iris F. A.

    2011-01-01

    In classic inventory models it is common to assume that excess demand is backordered. However, studies analyzing customer behavior in practice show that most unfulfilled demand is lost or an alternative item/location is looked for in many retail environments. Inventory systems that include this

  3. Visualizing Sungai Batu Ancient River, Lembah Bujang Archeology Site, Kedah – Malaysia using 3-D Resistivity Imaging

    Science.gov (United States)

    Yusoh, R.; Saad, R.; Saidin, M.; Muhammad, S. B.; Anda, S. T.; Ashraf, M. A. M.; Hazreek, Z. A. M.

    2018-04-01

    Sungai Batu at Lembah Bujang has become a spot of interest for archeologists since it was discovered to be the earliest entrepot in the history of Malaysia. It is believed that there was a large lost river near the remains of the ancient jetty. The ground resistivity method was implemented over a large coverage area to locate the direction of the ancient river. Eleven ground resistivity survey lines were carried out using SAS4000 equipment, and the Wenner-Schlumberger array was applied for the measurements. The ground resistivity method was used to detect the alluvial deposits made by the deposition of the ancient river. The ground resistivity data were produced as 2D images and presented as 3D contour maps for various selected depths using Rockwork 15 and Surfer 8 software to visualize the alluvial deposit areas. The results of the survey show the presence of a sedimentary formation, indicated by low resistivity values (0 – 330 ohm.m), near the existing river. However, the width of the alluvial deposition was 1400 m, which is too wide for a river channel unless it represents deposition that occurred from age to age through the movement of river meanders. It is concluded that the river still follows the same general direction and that its course shifted to the east due to sediment dumping.
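
    As an illustration of the post-processing step described above (isolating the low-resistivity alluvium, 0 – 330 ohm.m, on a gridded model and estimating its lateral extent), the Python sketch below masks a hypothetical resistivity volume at that cutoff. The array, cell size, and slice index are invented for illustration; the study itself visualized its inverted sections with Rockwork 15 and Surfer 8, not this script.

    import numpy as np

    rng = np.random.default_rng(0)
    # Hypothetical inverted resistivity volume: (depth slices, north cells, east cells), in ohm.m
    resistivity = rng.uniform(10.0, 2000.0, size=(8, 60, 120))
    cell_size_m = 25.0                                   # hypothetical cell width

    alluvium = resistivity <= 330.0                      # low-resistivity, water-saturated alluvium

    # Apparent east-west width of the alluvial zone on one depth slice
    slice_idx = 3
    cols = np.where(alluvium[slice_idx].any(axis=0))[0]
    if cols.size:
        width_m = (cols.max() - cols.min() + 1) * cell_size_m
        print(f"Apparent deposit width at slice {slice_idx}: {width_m:.0f} m")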

  4. Solution of a braneworld big crunch/big bang cosmology

    International Nuclear Information System (INIS)

    McFadden, Paul L.; Turok, Neil; Steinhardt, Paul J.

    2007-01-01

    We solve for the cosmological perturbations in a five-dimensional background consisting of two separating or colliding boundary branes, as an expansion in the collision speed V divided by the speed of light c. Our solution permits a detailed check of the validity of four-dimensional effective theory in the vicinity of the event corresponding to the big crunch/big bang singularity. We show that the four-dimensional description fails at the first nontrivial order in (V/c)^2. At this order, there is nontrivial mixing of the two relevant four-dimensional perturbation modes (the growing and decaying modes) as the boundary branes move from the narrowly separated limit described by Kaluza-Klein theory to the well-separated limit where gravity is confined to the positive-tension brane. We comment on the cosmological significance of the result and compute other quantities of interest in five-dimensional cosmological scenarios

  5. An environmental streamflow assessment for the Santiam River basin, Oregon

    Science.gov (United States)

    Risley, John C.; Wallick, J. Rose; Mangano, Joseph F.; Jones, Krista L.

    2012-01-01

    The Santiam River is a tributary of the Willamette River in northwestern Oregon and drains an area of 1,810 square miles. The U.S. Army Corps of Engineers (USACE) operates four dams in the basin, which are used primarily for flood control, hydropower production, recreation, and water-quality improvement. The Detroit and Big Cliff Dams were constructed in 1953 on the North Santiam River. The Green Peter and Foster Dams were completed in 1967 on the South Santiam River. The impacts of the structures have included a decrease in the frequency and magnitude of floods and an increase in low flows. For three North Santiam River reaches, the median of annual 1-day maximum streamflows decreased 42–50 percent because of regulated streamflow conditions. Likewise, for three reaches in the South Santiam River basin, the median of annual 1-day maximum streamflows decreased 39–52 percent because of regulation. In contrast to their effect on high flows, the dams increased low flows. The median of annual 7-day minimum flows in six of the seven study reaches increased under regulated streamflow conditions between 60 and 334 percent. On a seasonal basis, median monthly streamflows decreased from February to May and increased from September to January in all the reaches. However, the magnitude of these impacts usually decreased farther downstream from dams because of cumulative inflow from unregulated tributaries and groundwater entering the North, South, and main-stem Santiam Rivers below the dams. A Wilcox rank-sum test of monthly precipitation data from Salem, Oregon, and Waterloo, Oregon, found no significant difference between the pre-and post-dam periods, which suggests that the construction and operation of the dams since the 1950s and 1960s are a primary cause of alterations to the Santiam River basin streamflow regime. In addition to the streamflow analysis, this report provides a geomorphic characterization of the Santiam River basin and the associated conceptual
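
    The statistics reported above (medians of annual 1-day maximum and 7-day minimum flows, and their percent change under regulation) can be computed from a daily discharge record roughly as follows. This is a sketch only: the file name, column names, and the pre-/post-dam split dates are placeholders, not the USGS data or analysis.

    import pandas as pd

    def annual_flow_stats(daily: pd.Series):
        """Median annual 1-day maximum and 7-day minimum from a daily discharge series."""
        one_day_max = daily.groupby(daily.index.year).max()
        seven_day_min = daily.rolling(7).mean().groupby(daily.index.year).min()
        return one_day_max.median(), seven_day_min.median()

    # Hypothetical daily record indexed by date
    flows = pd.read_csv("santiam_daily_cfs.csv", parse_dates=["date"],
                        index_col="date")["discharge_cfs"]
    pre = flows[:"1952-12-31"]       # placeholder unregulated period
    post = flows["1968-01-01":]      # placeholder regulated period

    (pre_max, pre_min), (post_max, post_min) = annual_flow_stats(pre), annual_flow_stats(post)
    print("Change in median 1-day max: %.0f%%" % (100 * (post_max - pre_max) / pre_max))
    print("Change in median 7-day min: %.0f%%" % (100 * (post_min - pre_min) / pre_min))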

  6. [Big data and their perspectives in radiation therapy].

    Science.gov (United States)

    Guihard, Sébastien; Thariat, Juliette; Clavier, Jean-Baptiste

    2017-02-01

    The concept of big data indicates a change of scale in the use of data and data aggregation into large databases through improved computer technology. One of the current challenges in the creation of big data in the context of radiation therapy is the transformation of routine care items into dark data, i.e. data not yet collected, and the fusion of databases collecting different types of information (dose-volume histograms and toxicity data for example). Processes and infrastructures devoted to big data collection should not impact negatively on the doctor-patient relationship, the general process of care or the quality of the data collected. The use of big data requires a collective effort of physicians, physicists, software manufacturers and health authorities to create, organize and exploit big data in radiotherapy and, beyond, oncology. Big data involve a new culture to build an appropriate infrastructure legally and ethically. Processes and issues are discussed in this article. Copyright © 2016 Société Française du Cancer. Published by Elsevier Masson SAS. All rights reserved.

  7. Current applications of big data in obstetric anesthesiology.

    Science.gov (United States)

    Klumpner, Thomas T; Bauer, Melissa E; Kheterpal, Sachin

    2017-06-01

    The narrative review aims to highlight several recently published 'big data' studies pertinent to the field of obstetric anesthesiology. Big data has been used to study rare outcomes, to identify trends within the healthcare system, to identify variations in practice patterns, and to highlight potential inequalities in obstetric anesthesia care. Big data studies have helped define the risk of rare complications of obstetric anesthesia, such as the risk of neuraxial hematoma in thrombocytopenic parturients. Also, large national databases have been used to better understand trends in anesthesia-related adverse events during cesarean delivery as well as outline potential racial/ethnic disparities in obstetric anesthesia care. Finally, real-time analysis of patient data across a number of disparate health information systems through the use of sophisticated clinical decision support and surveillance systems is one promising application of big data technology on the labor and delivery unit. 'Big data' research has important implications for obstetric anesthesia care and warrants continued study. Real-time electronic surveillance is a potentially useful application of big data technology on the labor and delivery unit.

  8. Captive Rearing Program for Salmon River Chinook Salmon, 2000 Project Progress Report.

    Energy Technology Data Exchange (ETDEWEB)

    Venditti, David A.

    2002-04-01

    During 2000, the Idaho Department of Fish and Game (IDFG) continued to develop techniques to rear chinook salmon Oncorhynchus tshawytscha to sexual maturity in captivity and to monitor their reproductive performance under natural conditions. Eyed-eggs were collected to establish captive cohorts from three study streams and included 503 eyed-eggs from East Fork Salmon River (EFSR), 250 from the Yankee Fork Salmon River, and 304 from the West Fork Yankee Fork Salmon River (WFYF). After collection, the eyed-eggs were immediately transferred to the Eagle Fish Hatchery, where they were incubated and reared by family group. Juveniles collected the previous summer were PIT and elastomer tagged and vaccinated against vibrio Vibrio spp. and bacterial kidney disease before the majority (approximately 75%) were transferred to the National Marine Fisheries Service, Manchester Marine Experimental Station for saltwater rearing through sexual maturity. Smolt transfers included 158 individuals from the Lemhi River (LEM), 193 from the WFYF, and 372 from the EFSR. Maturing fish transfers from the Manchester facility to the Eagle Fish Hatchery included 77 individuals from the LEM, 45 from the WFYF, and 11 from the EFSR. Two mature females from the WFYF were spawned in captivity with four males in 2000. Only one of the females produced viable eggs (N = 1,266), which were placed in in-stream incubators by personnel from the Shoshone-Bannock Tribe. Mature adults (N = 70) from the Lemhi River were released into Big Springs Creek to evaluate their reproductive performance. After release, fish distributed themselves throughout the study section and displayed a progression of habitat associations and behavior consistent with progressing maturation and the onset of spawning. Fifteen of the 17 suspected redds spawned by captive-reared parents in Big Springs Creek were hydraulically sampled to assess survival to the eyed stage of development. Eyed-eggs were collected from 13 of these, and

  9. Volume and Value of Big Healthcare Data.

    Science.gov (United States)

    Dinov, Ivo D

    Modern scientific inquiries require significant data-driven evidence and trans-disciplinary expertise to extract valuable information and gain actionable knowledge about natural processes. Effective evidence-based decisions require collection, processing and interpretation of vast amounts of complex data. Moore's and Kryder's laws of exponential increase in computational power and information storage, respectively, dictate the need for rapid trans-disciplinary advances, technological innovation and effective mechanisms for managing and interrogating Big Healthcare Data. In this article, we review important aspects of Big Data analytics and discuss important questions like: What are the challenges and opportunities associated with this biomedical, social, and healthcare data avalanche? Are there innovative statistical computing strategies to represent, model, analyze and interpret Big heterogeneous data? We present the foundation of a new compressive big data analytics (CBDA) framework for representation, modeling and inference of large, complex and heterogeneous datasets. Finally, we consider specific directions likely to impact the process of extracting information from Big healthcare data, translating that information to knowledge, and deriving appropriate actions.

  10. Using Big Book to Teach Things in My House

    OpenAIRE

    Effrien, Intan; Lailatus, Sa’diyah; Nuruliftitah Maja, Neneng

    2017-01-01

    The purpose of this study is to determine students' interest in learning using the big book medium. A big book is an enlarged version of an ordinary book. The big book contains simple words and images that match the content of the sentences and their spelling. From this, researchers can gauge students' interest and the development of their knowledge, and it also trains researchers to remain creative in developing learning media for students.

  11. Short-term Lost Productivity per Victim: Intimate Partner Violence, Sexual Violence, or Stalking.

    Science.gov (United States)

    Peterson, Cora; Liu, Yang; Kresnow, Marcie-Jo; Florence, Curtis; Merrick, Melissa T; DeGue, Sarah; Lokey, Colby N

    2018-05-15

    The purpose of this study is to estimate victims' lifetime short-term lost productivity because of intimate partner violence, sexual violence, or stalking. U.S. nationally representative data from the 2012 National Intimate Partner and Sexual Violence Survey were used to estimate a regression-adjusted average per victim (female and male) and total population number of cumulative short-term lost work and school days (or lost productivity) because of victimizations over victims' lifetimes. Victims' lost productivity was valued using a U.S. daily production estimate. Analysis was conducted in 2017. Non-institutionalized adults with some lifetime exposure to intimate partner violence, sexual violence, or stalking (n=6,718 respondents; survey-weighted n=130,795,789) reported nearly 741 million lost productive days because of victimizations by an average of 2.5 perpetrators per victim. The adjusted per victim average was 4.9 (95% CI=3.9, 5.9) days, controlling for victim, perpetrator, and violence type factors. The estimated societal cost of this short-term lost productivity was $730 per victim, or $110 billion across the lifetimes of all victims (2016 USD). Factors associated with victims having a higher number of lost days included a higher number of perpetrators and being female, as well as sexual violence, physical violence, or stalking victimization by an intimate partner perpetrator, stalking victimization by an acquaintance perpetrator, and sexual violence or stalking victimization by a family member perpetrator. Short-term lost productivity represents a minimum economic valuation of the immediate negative effects of intimate partner violence, sexual violence, and stalking. Victims' lost productivity affects family members, colleagues, and employers. Published by Elsevier Inc.
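
    The valuation step described above amounts to multiplying lost days by a daily production value. The arithmetic check below uses only the figures quoted in the abstract; the implied per-day value (about $149) is derived from those figures rather than quoted from the study.

    # Figures reported in the abstract (2016 USD)
    adjusted_days_per_victim = 4.9        # regression-adjusted average lost days per victim
    cost_per_victim = 730.0               # reported short-term cost per victim
    total_lost_days = 741_000_000         # cumulative lost work and school days

    implied_daily_value = cost_per_victim / adjusted_days_per_victim
    total_cost = total_lost_days * implied_daily_value

    print(f"Implied daily production value: ${implied_daily_value:,.0f}")
    print(f"Total societal cost: ${total_cost / 1e9:,.1f} billion")   # roughly the reported $110 billion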

  12. Big Data Analytics Methodology in the Financial Industry

    Science.gov (United States)

    Lawler, James; Joseph, Anthony

    2017-01-01

    Firms in industry continue to be attracted by the benefits of Big Data Analytics. The benefits of Big Data Analytics projects may not be as evident as frequently indicated in the literature. The authors of the study evaluate factors in a customized methodology that may increase the benefits of Big Data Analytics projects. Evaluating firms in the…

  13. Big data: survey, technologies, opportunities, and challenges.

    Science.gov (United States)

    Khan, Nawsher; Yaqoob, Ibrar; Hashem, Ibrahim Abaker Targio; Inayat, Zakira; Ali, Waleed Kamaleldin Mahmoud; Alam, Muhammad; Shiraz, Muhammad; Gani, Abdullah

    2014-01-01

    Big Data has gained much attention from the academia and the IT industry. In the digital and computing world, information is generated and collected at a rate that rapidly exceeds the boundary range. Currently, over 2 billion people worldwide are connected to the Internet, and over 5 billion individuals own mobile phones. By 2020, 50 billion devices are expected to be connected to the Internet. At this point, predicted data production will be 44 times greater than that in 2009. As information is transferred and shared at light speed on optic fiber and wireless networks, the volume of data and the speed of market growth increase. However, the fast growth rate of such large data generates numerous challenges, such as the rapid growth of data, transfer speed, diverse data, and security. Nonetheless, Big Data is still in its infancy stage, and the domain has not been reviewed in general. Hence, this study comprehensively surveys and classifies the various attributes of Big Data, including its nature, definitions, rapid growth rate, volume, management, analysis, and security. This study also proposes a data life cycle that uses the technologies and terminologies of Big Data. Future research directions in this field are determined based on opportunities and several open issues in Big Data domination. These research directions facilitate the exploration of the domain and the development of optimal techniques to address Big Data.

  14. Big Data: Survey, Technologies, Opportunities, and Challenges

    Science.gov (United States)

    Khan, Nawsher; Yaqoob, Ibrar; Hashem, Ibrahim Abaker Targio; Inayat, Zakira; Mahmoud Ali, Waleed Kamaleldin; Alam, Muhammad; Shiraz, Muhammad; Gani, Abdullah

    2014-01-01

    Big Data has gained much attention from the academia and the IT industry. In the digital and computing world, information is generated and collected at a rate that rapidly exceeds the boundary range. Currently, over 2 billion people worldwide are connected to the Internet, and over 5 billion individuals own mobile phones. By 2020, 50 billion devices are expected to be connected to the Internet. At this point, predicted data production will be 44 times greater than that in 2009. As information is transferred and shared at light speed on optic fiber and wireless networks, the volume of data and the speed of market growth increase. However, the fast growth rate of such large data generates numerous challenges, such as the rapid growth of data, transfer speed, diverse data, and security. Nonetheless, Big Data is still in its infancy stage, and the domain has not been reviewed in general. Hence, this study comprehensively surveys and classifies the various attributes of Big Data, including its nature, definitions, rapid growth rate, volume, management, analysis, and security. This study also proposes a data life cycle that uses the technologies and terminologies of Big Data. Future research directions in this field are determined based on opportunities and several open issues in Big Data domination. These research directions facilitate the exploration of the domain and the development of optimal techniques to address Big Data. PMID:25136682

  15. Opportunity and Challenges for Migrating Big Data Analytics in Cloud

    Science.gov (United States)

    Amitkumar Manekar, S.; Pradeepini, G., Dr.

    2017-08-01

    Big Data Analytics is a big word nowadays. With ever more demanding and scalable data generation capabilities, data acquisition and storage have become crucial issues. Cloud storage is a widely used platform; the technology will become crucial to executives handling data powered by analytics. Nowadays the trend towards "big data-as-a-service" is talked about everywhere. On one hand, cloud-based big data analytics directly tackles in-progress issues of scale, speed, and cost. On the other hand, researchers are working to solve security and other real-time problems of big data migration on cloud-based platforms. This article is specially focused on finding possible ways to migrate big data to the cloud. Technology that supports coherent data migration and the possibility of doing big data analytics on a cloud platform is in demand for a new era of growth. This article also gives information about the available technologies and techniques for the migration of big data to the cloud.

  16. Hot big bang or slow freeze?

    Science.gov (United States)

    Wetterich, C.

    2014-09-01

    We confront the big bang for the beginning of the universe with an equivalent picture of a slow freeze - a very cold and slowly evolving universe. In the freeze picture the masses of elementary particles increase and the gravitational constant decreases with cosmic time, while the Newtonian attraction remains unchanged. The freeze and big bang pictures both describe the same observations or physical reality. We present a simple "crossover model" without a big bang singularity. In the infinite past space-time is flat. Our model is compatible with present observations, describing the generation of primordial density fluctuations during inflation as well as the present transition to a dark energy-dominated universe.

  17. Big Data

    DEFF Research Database (Denmark)

    Aaen, Jon; Nielsen, Jeppe Agger

    2016-01-01

    Big Data presents itself as one of the most hyped technological innovations of our time, proclaimed to contain the seeds of new, valuable operational insights for private companies and public organizations. While the optimistic announcements are many, research on Big Data in the public sector has so far been limited. This article examines how the public healthcare sector can reuse and exploit an ever-growing amount of data while taking public values into account. The article is based on a case study of the use of large amounts of health data in the Danish General Practice Database (Dansk AlmenMedicinsk Database, DAMD). The analysis shows that (re)use of data in new contexts is a multifaceted trade-off not only between economic rationales and quality considerations, but also control over sensitive personal data and the ethical implications for the citizen. In the DAMD case, the data are, on the one hand, used "in the service of a good cause" to...

  18. Clubbed fingers: the claws we lost?

    NARCIS (Netherlands)

    Brouwers, A.A.M.; Vermeij-Keers, C.; Zoelen, E.J.J. van; Gooren, L.J.G.

    2004-01-01

    Clubbed digits resemble the human embryonic fingers and toes, which look like the digits of a claw. Clubbed digits, thus, may represent the return of the embryonic claw and may even represent the claws man has lost during evolution, if ontogenesis really recapitulates phylogenesis. We put forward

  19. Curating Big Data Made Simple: Perspectives from Scientific Communities.

    Science.gov (United States)

    Sowe, Sulayman K; Zettsu, Koji

    2014-03-01

    The digital universe is exponentially producing an unprecedented volume of data that has brought benefits as well as fundamental challenges for enterprises and scientific communities alike. This trend is inherently exciting for the development and deployment of cloud platforms to support scientific communities curating big data. The excitement stems from the fact that scientists can now access and extract value from the big data corpus, establish relationships between bits and pieces of information from many types of data, and collaborate with a diverse community of researchers from various domains. However, despite these perceived benefits, to date, little attention is focused on the people or communities who are both beneficiaries and, at the same time, producers of big data. The technical challenges posed by big data are as big as understanding the dynamics of communities working with big data, whether scientific or otherwise. Furthermore, the big data era also means that big data platforms for data-intensive research must be designed in such a way that research scientists can easily search and find data for their research, upload and download datasets for onsite/offsite use, perform computations and analysis, share their findings and research experience, and seamlessly collaborate with their colleagues. In this article, we present the architecture and design of a cloud platform that meets some of these requirements, and a big data curation model that describes how a community of earth and environmental scientists is using the platform to curate data. Motivation for developing the platform, lessons learnt in overcoming some challenges associated with supporting scientists to curate big data, and future research directions are also presented.

  20. Big data analytics in healthcare: promise and potential.

    Science.gov (United States)

    Raghupathi, Wullianallur; Raghupathi, Viju

    2014-01-01

    To describe the promise and potential of big data analytics in healthcare. The paper describes the nascent field of big data analytics in healthcare, discusses the benefits, outlines an architectural framework and methodology, describes examples reported in the literature, briefly discusses the challenges, and offers conclusions. The paper provides a broad overview of big data analytics for healthcare researchers and practitioners. Big data analytics in healthcare is evolving into a promising field for providing insight from very large data sets and improving outcomes while reducing costs. Its potential is great; however there remain challenges to overcome.

  1. Data warehousing in the age of big data

    CERN Document Server

    Krishnan, Krish

    2013-01-01

    Data Warehousing in the Age of Big Data will help you and your organization make the most of unstructured data with your existing data warehouse. As Big Data continues to revolutionize how we use data, it doesn't have to create more confusion. Expert author Krish Krishnan helps you make sense of how Big Data fits into the world of data warehousing in clear and concise detail. The book is presented in three distinct parts. Part 1 discusses Big Data, its technologies and use cases from early adopters. Part 2 addresses data warehousing, its shortcomings, and new architecture

  2. The Death of the Big Men

    DEFF Research Database (Denmark)

    Martin, Keir

    2010-01-01

    Recently Tolai people of Papua New Guinea have adopted the term 'Big Shot' to describe an emerging post-colonial political elite. The emergence of the term is a negative moral evaluation of new social possibilities that have arisen as a consequence of the Big Shots' privileged position within a glo...

  3. Big data and software defined networks

    CERN Document Server

    Taheri, Javid

    2018-01-01

    Big Data Analytics and Software Defined Networking (SDN) are helping to manage the data usage arising from the extraordinary increase in computer processing power provided by Cloud Data Centres (CDCs). This new book investigates areas where Big Data and SDN can help each other in delivering more efficient services.

  4. Lost in Location:- on how (not) to situate aliens

    OpenAIRE

    Hansen, Lone Koefoed

    2009-01-01

    Publication date: June. The article investigates how users of personal satellite navigation devices (often referred to as sat-nav) are sometimes lost and led astray and argues that the satnav's aim to remove every insecurity about the correct route seems to remove the individual's conscious perception of the space traversed. While becoming destination aware, the individual loses her location awareness. The article proposes that the reason people get lost when using sat-nav is due to a wrong l...

  5. Big Data-Survey

    Directory of Open Access Journals (Sweden)

    P.S.G. Aruna Sri

    2016-03-01

    Full Text Available Big data is the term for any collection of data sets so large and complex that it becomes difficult to process them using conventional data-processing applications. The challenges include analysis, capture, curation, search, sharing, storage, transfer, visualization, and privacy violations. To spot business trends, anticipate diseases, combat crime, etc., we require bigger data sets compared with smaller ones. Big data is difficult to work with using most relational database management systems and desktop statistics and visualization packages, requiring instead massively parallel software running on tens, hundreds, or even thousands of servers. This paper presents an overview of the Hadoop architecture, the different tools used for big data, and its security issues.

  6. Big Data Analytics, Infectious Diseases and Associated Ethical Impacts

    OpenAIRE

    Garattini, C.; Raffle, J.; Aisyah, D. N.; Sartain, F.; Kozlakidis, Z.

    2017-01-01

    The exponential accumulation, processing and accrual of big data in healthcare are only possible through an equally rapidly evolving field of big data analytics. The latter offers the capacity to rationalize, understand and use big data to serve many different purposes, from improved services modelling to prediction of treatment outcomes, to greater patient and disease stratification. In the area of infectious diseases, the application of big data analytics has introduced a number of changes ...

  7. Evaluation of Data Management Systems for Geospatial Big Data

    OpenAIRE

    Amirian, Pouria; Basiri, Anahid; Winstanley, Adam C.

    2014-01-01

    Big Data encompasses the collection, management, processing and analysis of the huge amounts of data that vary in type and change with high frequency. Often the data component of Big Data has a positional component as an important part of it, in various forms such as postal address, Internet Protocol (IP) address and geographical location. If the positional components in Big Data are extensively used in storage, retrieval, analysis, processing, visualization and knowledge discovery (geospatial Big Dat...

  8. Pawcatuck River and Narragansett Bay Drainage Basins Water and Related Land Resources Study. Big River Reservoir Project. Volume II. Appendix A-G.

    Science.gov (United States)

    1981-07-01

    actuarial rates are determined and flood plain zoning is enacted. A flood hazard analysis of the Pocasset River in Johnston has been completed by the Soil... (Plate No. 6-16, Water and Related Land Resources Study map.)

  9. Years of life lost through Down's syndrome.

    Science.gov (United States)

    Jones, M B

    1979-10-01

    A congenital genetic condition does not act either as a cause of death or at the time of death only. Hence, years of life lost through such a condition cannot be calculated in the same way as for a conventional cause of death. The main difference is that a cause of death acting at age x cuts off as many years of life as the dead person might otherwise have expected to live (life expectancy at age x), whereas a congenital genetic condition exposes an affected person to a different schedule of life-threatening risks from birth onwards. In the latter case, years of life lost is calculated as the difference in life expectancy at birth for affected and non-affected persons. This reasoning is worked out in algebraic form and then applied to Down's syndrome. The data base is provided by two large and recent studies, one in Massachusetts and the other in Denmark, of mortality rates among all cases of Down's syndrome, whether in an institution or not, born during a given period of years or living at a given point in time in a fixed geographical area. So calculated, years of life lost through Down's syndrome relative to the United States general population in 1970 was 53.6 years per 1000 livebirths. Prenatal mortality is also discussed.
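
    The calculation described here, years of life lost as the difference in life expectancy at birth between affected and unaffected persons, can be sketched as follows. The age-specific mortality schedules and the livebirth prevalence used below are illustrative placeholders, not the Massachusetts, Danish, or 1970 U.S. figures.

    def life_expectancy_at_birth(annual_death_prob):
        """Expected years lived, given age-specific annual probabilities of dying."""
        e0, survivors = 0.0, 1.0
        for q in annual_death_prob:
            e0 += survivors * (1.0 - q) + survivors * q * 0.5   # deaths assumed mid-year on average
            survivors *= (1.0 - q)
        return e0

    # Illustrative schedules truncated at age 80
    general_population = [0.001] * 80
    affected = [0.05] + [0.01] * 79          # higher early-life and ongoing mortality

    yll_per_affected_birth = life_expectancy_at_birth(general_population) - life_expectancy_at_birth(affected)
    livebirth_prevalence_per_1000 = 1.4      # hypothetical prevalence of the condition
    print(f"YLL per affected livebirth: {yll_per_affected_birth:.1f} years")
    print(f"YLL per 1000 livebirths: {yll_per_affected_birth * livebirth_prevalence_per_1000:.1f} years")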

  10. A New Look at Big History

    Science.gov (United States)

    Hawkey, Kate

    2014-01-01

    The article sets out a "big history" which resonates with the priorities of our own time. A globalizing world calls for new spatial scales to underpin what the history curriculum addresses, "big history" calls for new temporal scales, while concern over climate change calls for a new look at subject boundaries. The article…

  11. West Virginia's big trees: setting the record straight

    Science.gov (United States)

    Melissa Thomas-Van Gundy; Robert. Whetsell

    2016-01-01

    People love big trees, people love to find big trees, and people love to find big trees in the place they call home. Having been suspicious for years, my coauthor, historian Rob Whetsell, approached me with a species identification challenge. There are several photographs of giant trees used by many people to illustrate the past forests of West Virginia,...

  12. Sosiaalinen asiakassuhdejohtaminen ja big data

    OpenAIRE

    Toivonen, Topi-Antti

    2015-01-01

    This thesis examines social customer relationship management and the benefits that big data can bring to it. Social customer relationship management is a new term and unknown to many. The research is motivated by the limited amount of research on the topic, the complete absence of research in Finnish, and the potentially essential role of social customer relationship management in companies' operations in the future. Studies dealing with big data often concentrate on its technical side, and not on the applicat...

  13. D-branes in a big bang/big crunch universe: Misner space

    International Nuclear Information System (INIS)

    Hikida, Yasuaki; Nayak, Rashmi R.; Panigrahi, Kamal L.

    2005-01-01

    We study D-branes in a two-dimensional Lorentzian orbifold R^{1,1}/Γ with a discrete boost Γ. This space is known as Misner or Milne space, and includes a big crunch/big bang singularity. In this space, there are D0-branes in spiral orbits and D1-branes with or without flux on them. In particular, we observe imaginary parts of partition functions, and interpret them as the rates of open string pair creation for D0-branes and emission of winding closed strings for D1-branes. These phenomena occur due to the time-dependence of the background. The open string 2→2 scattering amplitude on a D1-brane is also computed and found to be less singular than in the closed string case

  14. D-branes in a big bang/big crunch universe: Misner space

    Energy Technology Data Exchange (ETDEWEB)

    Hikida, Yasuaki [Theory Group, High Energy Accelerator Research Organization (KEK), Tsukuba, Ibaraki 305-0801 (Japan); Nayak, Rashmi R. [Dipartimento di Fisica and INFN, Sezione di Roma 2, 'Tor Vergata', Rome 00133 (Italy); Panigrahi, Kamal L. [Dipartimento di Fisica and INFN, Sezione di Roma 2, 'Tor Vergata', Rome 00133 (Italy)

    2005-09-01

    We study D-branes in a two-dimensional Lorentzian orbifold R^{1,1}/Γ with a discrete boost Γ. This space is known as Misner or Milne space, and includes a big crunch/big bang singularity. In this space, there are D0-branes in spiral orbits and D1-branes with or without flux on them. In particular, we observe imaginary parts of partition functions, and interpret them as the rates of open string pair creation for D0-branes and emission of winding closed strings for D1-branes. These phenomena occur due to the time-dependence of the background. The open string 2→2 scattering amplitude on a D1-brane is also computed and found to be less singular than in the closed string case.

  15. Astroinformatics: the big data of the universe

    OpenAIRE

    Barmby, Pauline

    2016-01-01

    In astrophysics we like to think that our field was the originator of big data, back when it had to be carried around in big sky charts and books full of tables. These days, it's easier to move astrophysics data around, but we still have a lot of it, and upcoming telescope  facilities will generate even more. I discuss how astrophysicists approach big data in general, and give examples from some Western Physics & Astronomy research projects.  I also give an overview of ho...

  16. Recent big flare

    International Nuclear Information System (INIS)

    Moriyama, Fumio; Miyazawa, Masahide; Yamaguchi, Yoshisuke

    1978-01-01

    The features of three big solar flares observed at Tokyo Observatory are described in this paper. The active region, McMath 14943, caused a big flare on September 16, 1977. The flare appeared on both sides of a long dark line which runs along the boundary of the magnetic field. Two-ribbon structure was seen. The electron density of the flare observed at Norikura Corona Observatory was 3 × 10^12 /cc. Several arc lines which connect both bright regions of different magnetic polarity were seen in H-α monochrome image. The active region, McMath 15056, caused a big flare on December 10, 1977. At the beginning, several bright spots were observed in the region between two main solar spots. Then, the area and the brightness increased, and the bright spots became two ribbon-shaped bands. A solar flare was observed on April 8, 1978. At first, several bright spots were seen around the solar spot in the active region, McMath 15221. Then, these bright spots developed to a large bright region. On both sides of a dark line along the magnetic neutral line, bright regions were generated. These developed to a two-ribbon flare. The time required for growth was more than one hour. A bright arc which connects two ribbons was seen, and this arc may be a loop prominence system. (Kato, T.)

  17. Big Bang Day : The Great Big Particle Adventure - 3. Origins

    CERN Multimedia

    2008-01-01

    In this series, comedian and physicist Ben Miller asks the CERN scientists what they hope to find. If the LHC is successful, it will explain the nature of the Universe around us in terms of a few simple ingredients and a few simple rules. But the Universe now was forged in a Big Bang where conditions were very different, and the rules were very different, and those early moments were crucial to determining how things turned out later. At the LHC they can recreate conditions as they were billionths of a second after the Big Bang, before atoms and nuclei existed. They can find out why matter and antimatter didn't mutually annihilate each other to leave behind a Universe of pure, brilliant light. And they can look into the very structure of space and time - the fabric of the Universe

  18. Hidden costs, value lost: uninsurance in America

    National Research Council Canada - National Science Library

    Committee on the Consequences of Uninsurance

    2003-01-01

    Hidden Cost, Value Lost , the fifth of a series of six books on the consequences of uninsurance in the United States, illustrates some of the economic and social losses to the country of maintaining...

  19. Inflated granularity: Spatial “Big Data” and geodemographics

    Directory of Open Access Journals (Sweden)

    Craig M Dalton

    2015-08-01

    Full Text Available Data analytics, particularly the current rhetoric around "Big Data", tend to be presented as new and innovative, emerging ahistorically to revolutionize modern life. In this article, we situate one branch of Big Data analytics, spatial Big Data, through a historical predecessor, geodemographic analysis, to help develop a critical approach to current data analytics. Spatial Big Data promises an epistemic break in marketing, a leap from targeting geodemographic areas to targeting individuals. Yet it inherits characteristics and problems from geodemographics, including a justification through the market, and a process of commodification through the black-boxing of technology. As researchers develop sustained critiques of data analytics and its effects on everyday life, we must do so with a grounding in the cultural and historical contexts from which data technologies emerged. This article and others (Barnes and Wilson, 2014) develop a historically situated, critical approach to spatial Big Data. This history illustrates connections to the critical issues of surveillance, redlining, and the production of consumer subjects and geographies. The shared histories and structural logics of spatial Big Data and geodemographics create the space for a continued critique of data analyses' role in society.

  20. Big data analysis for smart farming

    NARCIS (Netherlands)

    Kempenaar, C.; Lokhorst, C.; Bleumer, E.J.B.; Veerkamp, R.F.; Been, Th.; Evert, van F.K.; Boogaardt, M.J.; Ge, L.; Wolfert, J.; Verdouw, C.N.; Bekkum, van Michael; Feldbrugge, L.; Verhoosel, Jack P.C.; Waaij, B.D.; Persie, van M.; Noorbergen, H.

    2016-01-01

    In this report we describe results of a one-year TO2 institutes project on the development of big data technologies within the milk production chain. The goal of this project is to ‘create’ an integration platform for big data analysis for smart farming and to develop a show case. This includes both

  1. Floodplain simulation for Musi River using integrated 1D/2D hydrodynamic model

    Directory of Open Access Journals (Sweden)

    Al Amin Muhammad B.

    2017-01-01

    Full Text Available This paper presents the simulation of the floodplain of the Musi River using an integrated 1D and 2D hydrodynamic model. The 1D flow simulation was applied to the river channel with a flow hydrograph as the upstream boundary condition. The result of the 1D flow simulation was integrated into the 2D flow simulation in order to determine the area and characteristics of the flood inundation. The digital terrain model used in this research had a grid resolution of 10 m × 10 m, but the 2D simulation was run at a coarser resolution of 50 m × 50 m to limit the simulation time, since the model was quite large. The simulation results showed that the inundated area surrounding the Musi River is about 107.44 km2, with a maximum flood depth of 3.24 m and water surface velocities ranging from 0.00 to 0.83 m/s. Most floodplain areas fall into the middle to high flood hazard levels, and only a few areas, especially along the river banks, reach a very high hazard level. The structural flood control measures recommended for Palembang are the construction of flood dikes and flood gates; the non-structural measures are improved watershed management and raising public awareness of floods.
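
    The flood hazard levels mentioned above are typically assigned from the simulated depth and velocity fields. A minimal sketch of such a classification on gridded 2D output is shown below; the depth and depth-velocity thresholds are hypothetical placeholders, not the criteria used in this study.

    import numpy as np

    def hazard_class(depth_m, velocity_ms):
        """Classify flood hazard from depth and the depth*velocity product."""
        dv = depth_m * velocity_ms
        hazard = np.zeros_like(dv, dtype=int)          # 0 = low
        hazard[(dv >= 0.3) | (depth_m >= 0.5)] = 1     # 1 = middle (hypothetical thresholds)
        hazard[(dv >= 0.6) | (depth_m >= 1.5)] = 2     # 2 = high
        hazard[(dv >= 1.2) | (depth_m >= 3.0)] = 3     # 3 = very high
        return hazard

    # Hypothetical 50 m x 50 m grids of maximum depth and velocity from the 2D run
    rng = np.random.default_rng(1)
    depth = rng.uniform(0.0, 3.24, size=(200, 300))
    velocity = rng.uniform(0.0, 0.83, size=(200, 300))
    levels, cells = np.unique(hazard_class(depth, velocity), return_counts=True)
    print(dict(zip(levels.tolist(), cells.tolist())))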

  2. A survey on Big Data Stream Mining

    African Journals Online (AJOL)

    pc

    2018-03-05

    Mar 5, 2018 ... Big Data can be static on one machine or distributed ... decision making, and process automation. Big data .... Concept Drifting: concept drifting mean the classifier .... transactions generated by a prefix tree structure. EstDec ...

  3. Emerging technology and architecture for big-data analytics

    CERN Document Server

    Chang, Chip; Yu, Hao

    2017-01-01

    This book describes the current state of the art in big-data analytics, from a technology and hardware architecture perspective. The presentation is designed to be accessible to a broad audience, with general knowledge of hardware design and some interest in big-data analytics. Coverage includes emerging technology and devices for data-analytics, circuit design for data-analytics, and architecture and algorithms to support data-analytics. Readers will benefit from the realistic context used by the authors, which demonstrates what works, what doesn’t work, and what are the fundamental problems, solutions, upcoming challenges and opportunities. Provides a single-source reference to hardware architectures for big-data analytics; Covers various levels of big-data analytics hardware design abstraction and flow, from device, to circuits and systems; Demonstrates how non-volatile memory (NVM) based hardware platforms can be a viable solution to existing challenges in hardware architecture for big-data analytics.

  4. 29 CFR 779.251 - Goods that have lost their out-of-State identity.

    Science.gov (United States)

    2010-07-01

    ... 29 Labor 3 2010-07-01 2010-07-01 false Goods that have lost their out-of-State identity. 779.251... Coverage Interstate Inflow Test Under Prior Act § 779.251 Goods that have lost their out-of-State identity... been processed or manufactured so as to have lost their identity as out-of-State goods before they are...

  5. Toward a manifesto for the 'public understanding of big data'.

    Science.gov (United States)

    Michael, Mike; Lupton, Deborah

    2016-01-01

    In this article, we sketch a 'manifesto' for the 'public understanding of big data'. On the one hand, this entails such public understanding of science and public engagement with science and technology-tinged questions as follows: How, when and where are people exposed to, or do they engage with, big data? Who are regarded as big data's trustworthy sources, or credible commentators and critics? What are the mechanisms by which big data systems are opened to public scrutiny? On the other hand, big data generate many challenges for public understanding of science and public engagement with science and technology: How do we address publics that are simultaneously the informant, the informed and the information of big data? What counts as understanding of, or engagement with, big data, when big data themselves are multiplying, fluid and recursive? As part of our manifesto, we propose a range of empirical, conceptual and methodological exhortations. We also provide Appendix 1 that outlines three novel methods for addressing some of the issues raised in the article. © The Author(s) 2015.

  6. What do Big Data do in Global Governance?

    DEFF Research Database (Denmark)

    Krause Hansen, Hans; Porter, Tony

    2017-01-01

    Two paradoxes associated with big data are relevant to global governance. First, while promising to increase the capacities of humans in governance, big data also involve an increasingly independent role for algorithms, technical artifacts, the Internet of things, and other objects, which can...... reduce the control of human actors. Second, big data involve new boundary transgressions as data are brought together from multiple sources while also creating new boundary conflicts as powerful actors seek to gain advantage by controlling big data and excluding competitors. These changes are not just...... about new data sources for global decision-makers, but instead signal more profound changes in the character of global governance....

  7. Big Data in Caenorhabditis elegans: quo vadis?

    Science.gov (United States)

    Hutter, Harald; Moerman, Donald

    2015-11-05

    A clear definition of what constitutes "Big Data" is difficult to identify, but we find it most useful to define Big Data as a data collection that is complete. By this criterion, researchers on Caenorhabditis elegans have a long history of collecting Big Data, since the organism was selected with the idea of obtaining a complete biological description and understanding of development. The complete wiring diagram of the nervous system, the complete cell lineage, and the complete genome sequence provide a framework to phrase and test hypotheses. Given this history, it might be surprising that the number of "complete" data sets for this organism is actually rather small--not because of lack of effort, but because most types of biological experiments are not currently amenable to complete large-scale data collection. Many are also not inherently limited, so that it becomes difficult to even define completeness. At present, we only have partial data on mutated genes and their phenotypes, gene expression, and protein-protein interaction--important data for many biological questions. Big Data can point toward unexpected correlations, and these unexpected correlations can lead to novel investigations; however, Big Data cannot establish causation. As a result, there is much excitement about Big Data, but there is also a discussion on just what Big Data contributes to solving a biological problem. Because of its relative simplicity, C. elegans is an ideal test bed to explore this issue and at the same time determine what is necessary to build a multicellular organism from a single cell. © 2015 Hutter and Moerman. This article is distributed by The American Society for Cell Biology under license from the author(s). Two months after publication it is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).

  8. 76 FR 7810 - Big Horn County Resource Advisory Committee

    Science.gov (United States)

    2011-02-11

    ..., Wyoming 82801. Comments may also be sent via e-mail to [email protected] , with the words Big... DEPARTMENT OF AGRICULTURE Forest Service Big Horn County Resource Advisory Committee AGENCY: Forest Service, USDA. ACTION: Notice of meeting. SUMMARY: The Big Horn County Resource Advisory Committee...

  9. Hot big bang or slow freeze?

    Energy Technology Data Exchange (ETDEWEB)

    Wetterich, C.

    2014-09-07

    We confront the big bang for the beginning of the universe with an equivalent picture of a slow freeze — a very cold and slowly evolving universe. In the freeze picture the masses of elementary particles increase and the gravitational constant decreases with cosmic time, while the Newtonian attraction remains unchanged. The freeze and big bang pictures both describe the same observations or physical reality. We present a simple “crossover model” without a big bang singularity. In the infinite past space–time is flat. Our model is compatible with present observations, describing the generation of primordial density fluctuations during inflation as well as the present transition to a dark energy-dominated universe.

  10. Hot big bang or slow freeze?

    International Nuclear Information System (INIS)

    Wetterich, C.

    2014-01-01

    We confront the big bang for the beginning of the universe with an equivalent picture of a slow freeze — a very cold and slowly evolving universe. In the freeze picture the masses of elementary particles increase and the gravitational constant decreases with cosmic time, while the Newtonian attraction remains unchanged. The freeze and big bang pictures both describe the same observations or physical reality. We present a simple “crossover model” without a big bang singularity. In the infinite past space–time is flat. Our model is compatible with present observations, describing the generation of primordial density fluctuations during inflation as well as the present transition to a dark energy-dominated universe

  11. Hot big bang or slow freeze?

    Directory of Open Access Journals (Sweden)

    C. Wetterich

    2014-09-01

    Full Text Available We confront the big bang for the beginning of the universe with an equivalent picture of a slow freeze — a very cold and slowly evolving universe. In the freeze picture the masses of elementary particles increase and the gravitational constant decreases with cosmic time, while the Newtonian attraction remains unchanged. The freeze and big bang pictures both describe the same observations or physical reality. We present a simple “crossover model” without a big bang singularity. In the infinite past space–time is flat. Our model is compatible with present observations, describing the generation of primordial density fluctuations during inflation as well as the present transition to a dark energy-dominated universe.

  12. Big Data: Survey, Technologies, Opportunities, and Challenges

    Directory of Open Access Journals (Sweden)

    Nawsher Khan

    2014-01-01

    Full Text Available Big Data has gained much attention from the academia and the IT industry. In the digital and computing world, information is generated and collected at a rate that rapidly exceeds the boundary range. Currently, over 2 billion people worldwide are connected to the Internet, and over 5 billion individuals own mobile phones. By 2020, 50 billion devices are expected to be connected to the Internet. At this point, predicted data production will be 44 times greater than that in 2009. As information is transferred and shared at light speed on optic fiber and wireless networks, the volume of data and the speed of market growth increase. However, the fast growth rate of such large data generates numerous challenges, such as the rapid growth of data, transfer speed, diverse data, and security. Nonetheless, Big Data is still in its infancy stage, and the domain has not been reviewed in general. Hence, this study comprehensively surveys and classifies the various attributes of Big Data, including its nature, definitions, rapid growth rate, volume, management, analysis, and security. This study also proposes a data life cycle that uses the technologies and terminologies of Big Data. Future research directions in this field are determined based on opportunities and several open issues in Big Data domination. These research directions facilitate the exploration of the domain and the development of optimal techniques to address Big Data.

  13. Pre-big bang cosmology and quantum fluctuations

    International Nuclear Information System (INIS)

    Ghosh, A.; Pollifrone, G.; Veneziano, G.

    2000-01-01

    The quantum fluctuations of a homogeneous, isotropic, open pre-big bang model are discussed. By solving exactly the equations for tensor and scalar perturbations we find that particle production is negligible during the perturbative Pre-Big Bang phase

  14. The 2014 water release into the arid Colorado River delta and associated water losses by evaporation

    Energy Technology Data Exchange (ETDEWEB)

    Daesslé, L.W., E-mail: walter@uabc.edu.mx [Instituto de Investigaciones Oceanológicas, Universidad Autónoma de Baja California, CarreteraTranspeninsular Tijuana-Ensenada No. 3917, Fraccionamiento Playitas, CP 22860 Ensenada, Baja California (Mexico); Friedrich-Alexander University of Erlangen-Nuremberg (FAU), Department of Geography and Geosciences, GeoZentrum Nordbayern, Schlossgarten 5, 91054 Erlangen (Germany); Geldern, R. van [Friedrich-Alexander University of Erlangen-Nuremberg (FAU), Department of Geography and Geosciences, GeoZentrum Nordbayern, Schlossgarten 5, 91054 Erlangen (Germany); Orozco-Durán, A. [Instituto de Investigaciones Oceanológicas, Universidad Autónoma de Baja California, CarreteraTranspeninsular Tijuana-Ensenada No. 3917, Fraccionamiento Playitas, CP 22860 Ensenada, Baja California (Mexico); Barth, J.A.C. [Friedrich-Alexander University of Erlangen-Nuremberg (FAU), Department of Geography and Geosciences, GeoZentrum Nordbayern, Schlossgarten 5, 91054 Erlangen (Germany)

    2016-01-15

    For the first time in history, water was intentionally released for environmental purposes into the final, otherwise dry, 160-km stretch of the Colorado River basin, south of the Mexican border. Between March and May 2014 three pulses of water with a total volume of 132 × 10{sup 6} m{sup 3} were released to assess the restoration potential of endemic flora along its course and to reach its estuary. The latter had not received a sustained input of fresh water and nutrients from its main fluvial source for over 50 years because of numerous upstream dam constructions. During this pulse flow large amounts of water were lost and negligible amounts reached the ocean. While some of these water losses can be attributed to plant uptake and infiltration, we were able to quantify, with water stable isotope data, evaporation losses of 16.1 to 17.3% of the original water mass within the first 80 km downstream of Morelos Dam. Our results showed no evidence for freshwater reaching the upper Colorado River estuary, and it is assumed that the pulse flow had only a negligible influence on the coastal ecosystem. Future water releases that aim at ecological restoration need to become more frequent and should have larger volumes if more significant effects are to be established in the area. - Highlights: • Isotope ratios of oxygen and hydrogen quantify water lost through evaporation. • Evaporation losses between 16.1 and 17.3% during the 2014 Colorado River release. • Larger water volumes are required to influence the estuary ecosystem.
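
    The evaporative-loss estimate above rests on the progressive heavy-isotope enrichment of the remaining river water as it flows downstream. As a rough illustration of the principle only (not a reproduction of the study's method, which uses a more complete evaporation model), the following Python sketch estimates an evaporated fraction from upstream and downstream delta values under a simplified Rayleigh fractionation assumption; the delta-18O values and the effective vapor-liquid enrichment factor are hypothetical.

        # Minimal sketch: fraction of river water lost to evaporation inferred from
        # the downstream enrichment of a stable isotope (e.g., delta-18O), assuming
        # simple Rayleigh fractionation. All numbers are illustrative assumptions,
        # not values from the Colorado River study.

        def evaporated_fraction(delta_initial, delta_final, eps_vapor_liquid=-12.0):
            """Return the evaporated fraction of the original water mass.

            delta_initial, delta_final: per-mil delta values upstream/downstream.
            eps_vapor_liquid: effective vapor-liquid enrichment in per mil
            (negative, because the escaping vapor is isotopically lighter than
            the remaining river water).
            """
            alpha = 1.0 + eps_vapor_liquid / 1000.0
            # Rayleigh: (1000 + delta_final) / (1000 + delta_initial) = f ** (alpha - 1)
            ratio = (1000.0 + delta_final) / (1000.0 + delta_initial)
            f_remaining = ratio ** (1.0 / (alpha - 1.0))
            return 1.0 - f_remaining

        if __name__ == "__main__":
            # Hypothetical delta-18O values (per mil, VSMOW) along an arid river reach
            loss = evaporated_fraction(delta_initial=-11.5, delta_final=-9.8)
            print(f"Estimated evaporative loss: {loss:.1%}")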

  15. The 2014 water release into the arid Colorado River delta and associated water losses by evaporation

    International Nuclear Information System (INIS)

    Daesslé, L.W.; Geldern, R. van; Orozco-Durán, A.; Barth, J.A.C.

    2016-01-01

    For the first time in history, water was intentionally released for environmental purposes into the final, otherwise dry, 160-km stretch of the Colorado River basin, south of the Mexican border. Between March and May 2014 three pulses of water with a total volume of 132 × 10⁶ m³ were released to assess the restoration potential of endemic flora along its course and to reach its estuary. The latter had not received a sustained input of fresh water and nutrients from its main fluvial source for over 50 years because of numerous upstream dam constructions. During this pulse flow large amounts of water were lost and negligible amounts reached the ocean. While some of these water losses can be attributed to plant uptake and infiltration, we were able to quantify, with water stable isotope data, evaporation losses of 16.1 to 17.3% of the original water mass within the first 80 km downstream of Morelos Dam. Our results showed no evidence for freshwater reaching the upper Colorado River estuary, and it is assumed that the pulse flow had only a negligible influence on the coastal ecosystem. Future water releases that aim at ecological restoration need to become more frequent and should have larger volumes if more significant effects are to be established in the area. - Highlights: • Isotope ratios of oxygen and hydrogen quantify water lost through evaporation. • Evaporation losses between 16.1 and 17.3% during the 2014 Colorado River release. • Larger water volumes are required to influence the estuary ecosystem.

  16. MILTON’S PARADISE LOST AND A POSTCOLONIAL FALL

    Directory of Open Access Journals (Sweden)

    LUIZ FERNANDO FERREIRA DE SÁ

    2006-01-01

    Full Text Available In John Milton’s Paradise Lost, epic and empire are dissociated. Contrary to many misreadings, this all-important work of the English Renaissance intersects postcolonial thinking in a number of ways. By using Gayatri Spivak’s circuit of postcolonial theory and practice, this paper enacts a counterpointal (mis)reading of Milton’s text: Paradise Lost may at last free its (post-)colonial (dis)content. Since every reading is a misreading, my (mis)reading of Milton’s paradise is a mo(ve)ment of resistance against and intervention in a so-called grand narrative of power (Milton’s epic) with a view to proposing a postcolonial conversation with this text.

  17. Analysis of Big Data Maturity Stage in Hospitality Industry

    OpenAIRE

    Shabani, Neda; Munir, Arslan; Bose, Avishek

    2017-01-01

    Big data analytics has an extremely significant impact on many areas in all businesses and industries, including hospitality. This study aims to guide information technology (IT) professionals in hospitality on their big data expedition. In particular, the purpose of this study is to identify the maturity stage of big data in the hospitality industry in an objective way, so that hotels are able to understand their progress and realize what it will take to get to the next stage of big data matur...

  18. A Multidisciplinary Perspective of Big Data in Management Research

    OpenAIRE

    Sheng, Jie; Amankwah-Amoah, J.; Wang, X.

    2017-01-01

    In recent years, big data has emerged as one of the prominent buzzwords in business and management. In spite of the mounting body of research on big data across the social science disciplines, scholars have offered little synthesis on the current state of knowledge. To take stock of academic research that contributes to the big data revolution, this paper tracks scholarly work's perspectives on big data in the management domain over the past decade. We identify key themes emerging in manageme...

  19. Water quality and trend analysis of Colorado--Big Thompson system reservoirs and related conveyances, 1969 through 2000

    Science.gov (United States)

    Stevens, Michael R.

    2003-01-01

    The U.S. Geological Survey, in an ongoing cooperative monitoring program with the Northern Colorado Water Conservancy District, Bureau of Reclamation, and City of Fort Collins, has collected water-quality data in north-central Colorado since 1969 in reservoirs and conveyances, such as canals and tunnels, related to the Colorado-Big Thompson Project, a water-storage, collection, and distribution system. Ongoing changes in water use among agricultural and municipal users on the eastern slope of the Rocky Mountains in Colorado, changing land use in reservoir watersheds, and other water-quality issues among Northern Colorado Water Conservancy District customers necessitated a reexamination of water-quality trends in the Colorado-Big Thompson system reservoirs and related conveyances. The sampling sites are on reservoirs, canals, and tunnels in the headwaters of the Colorado River (on the western side of the transcontinental diversion operations) and the headwaters of the Big Thompson River (on the eastern side of the transcontinental diversion operations). Carter Lake Reservoir and Horsetooth Reservoir are off-channel water-storage facilities, located in the foothills of the northern Colorado Front Range, for water supplied from the Colorado-Big Thompson Project. The length of water-quality record ranges from approximately 3 to 30 years depending on the site and the type of measurement or constituent. Changes in sampling frequency, analytical methods, and minimum reporting limits have occurred repeatedly over the period of record. The objective of this report was to complete a retrospective water-quality and trend analysis of reservoir profiles, nutrients, major ions, selected trace elements, chlorophyll-a, and hypolimnetic oxygen data from 1969 through 2000 in Lake Granby, Shadow Mountain Lake, and the Granby Pump Canal in Grand County, Colorado, and Horsetooth Reservoir, Carter Lake, Lake Estes, Alva B. Adams Tunnel, and Olympus Tunnel in Larimer County, Colorado

  20. An embedding for the big bang

    Science.gov (United States)

    Wesson, Paul S.

    1994-01-01

    A cosmological model is given that has good physical properties for the early and late universe but is a hypersurface in a flat five-dimensional manifold. The big bang can therefore be regarded as an effect of a choice of coordinates in a truncated higher-dimensional geometry. Thus the big bang is in some sense a geometrical illusion.

  1. Big Data as Governmentality in International Development

    DEFF Research Database (Denmark)

    Flyverbom, Mikkel; Madsen, Anders Koed; Rasche, Andreas

    2017-01-01

    Statistics have long shaped the field of visibility for the governance of development projects. The introduction of big data has altered the field of visibility. Employing Dean's “analytics of government” framework, we analyze two cases—malaria tracking in Kenya and monitoring of food prices...... in Indonesia. Our analysis shows that big data introduces a bias toward particular types of visualizations. What problems are being made visible through big data depends to some degree on how the underlying data is visualized and who is captured in the visualizations. It is also influenced by technical factors...

  2. A Brief Review on Leading Big Data Models

    Directory of Open Access Journals (Sweden)

    Sugam Sharma

    2014-11-01

    Full Text Available Today, science is passing through an era of transformation, where the inundation of data, dubbed data deluge, is influencing the decision making process. The science is driven by the data and is being termed as data science. In this internet age, the volume of the data has grown up to petabytes, and this large, complex, structured or unstructured, and heterogeneous data in the form of “Big Data” has gained significant attention. The rapid pace of data growth through various disparate sources, especially social media such as Facebook, has seriously challenged the data analytic capabilities of traditional relational databases. The velocity of the expansion of the amount of data gives rise to a complete paradigm shift in how new age data is processed. Confidence in the data engineering of the existing data processing systems is gradually fading whereas the capabilities of the new techniques for capturing, storing, visualizing, and analyzing data are evolving. In this review paper, we discuss some of the modern Big Data models that are leading contributors in the NoSQL era and claim to address Big Data challenges in reliable and efficient ways. Also, we take the potential of Big Data into consideration and try to reshape the original operational-oriented definition of “Big Science” (Furner, 2003) into a new data-driven definition and rephrase it as “The science that deals with Big Data is Big Science.”

  3. 75 FR 71069 - Big Horn County Resource Advisory Committee

    Science.gov (United States)

    2010-11-22

    ....us , with the words Big Horn County RAC in the subject line. Facsimilies may be sent to 307-674-2668... DEPARTMENT OF AGRICULTURE Forest Service Big Horn County Resource Advisory Committee AGENCY: Forest Service, USDA. ACTION: Notice of meeting. SUMMARY: The Big Horn County Resource Advisory Committee...

  4. 76 FR 26240 - Big Horn County Resource Advisory Committee

    Science.gov (United States)

    2011-05-06

    ... words Big Horn County RAC in the subject line. Facsimilies may be sent to 307-674-2668. All comments... DEPARTMENT OF AGRICULTURE Forest Service Big Horn County Resource Advisory Committee AGENCY: Forest Service, USDA. ACTION: Notice of meeting. SUMMARY: The Big Horn County Resource Advisory Committee...

  5. Big Science

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    1986-05-15

    Astronomy, like particle physics, has become Big Science where the demands of front line research can outstrip the science budgets of whole nations. Thus came into being the European Southern Observatory (ESO), founded in 1962 to provide European scientists with a major modern observatory to study the southern sky under optimal conditions.

  6. Commentary: Epidemiology in the era of big data.

    Science.gov (United States)

    Mooney, Stephen J; Westreich, Daniel J; El-Sayed, Abdulrahman M

    2015-05-01

    Big Data has increasingly been promoted as a revolutionary development in the future of science, including epidemiology. However, the definition and implications of Big Data for epidemiology remain unclear. We here provide a working definition of Big Data predicated on the so-called "three V's": variety, volume, and velocity. From this definition, we argue that Big Data has evolutionary and revolutionary implications for identifying and intervening on the determinants of population health. We suggest that as more sources of diverse data become publicly available, the ability to combine and refine these data to yield valid answers to epidemiologic questions will be invaluable. We conclude that while epidemiology as practiced today will continue to be practiced in the Big Data future, a component of our field's future value lies in integrating subject matter knowledge with increased technical savvy. Our training programs and our visions for future public health interventions should reflect this future.

  7. Final Opportunity to Rehabilitate an Urban River as a Water Source for Mexico City

    Science.gov (United States)

    Mazari-Hiriart, Marisa; Pérez-Ortiz, Gustavo; Orta-Ledesma, María Teresa; Armas-Vargas, Felipe; Tapia, Marco A.; Solano-Ortiz, Rosa; Silva, Miguel A.; Yañez-Noguez, Isaura; López-Vidal, Yolanda; Díaz-Ávalos, Carlos

    2014-01-01

    The aim of this study was to evaluate the amount and quality of water in the Magdalena-Eslava river system and to propose alternatives for sustainable water use. The system is the last urban river in the vicinity of Mexico City that supplies surface water to the urban area. Historical flow data were analyzed (1973–2010), along with the physicochemical and bacteriological attributes, documenting the evolution of these variables over the course of five years (2008–2012) in both dry and rainy seasons. The analyses show that the flow regime has been significantly altered. The physicochemical variables show significant differences between the natural area, where the river originates, and the urban area, where the river receives untreated wastewater. Nutrient and conductivity concentrations in the river were equivalent to domestic wastewater. Fecal pollution indicators and various pathogens were present in elevated densities, demonstrating a threat to the population living near the river. Estimates of the value of the water lost as a result of mixing clean and contaminated water are presented. This urban river should be rehabilitated as a sustainability practice, and if possible, these efforts should be replicated in other areas. Because of the public health issues and in view of the population exposure where the river flows through the city, the river should be improved aesthetically and should be treated to allow its ecosystem services to recover. This river represents an iconic case for Mexico City because it connects the natural and urban areas in a socio-ecological system that can potentially provide clean water for human consumption. Contaminated water could be treated and reused for irrigation in one of the green areas of the city. Wastewater treatment plants and the operation of the existing purification plants are urgent priorities that could lead to better, more sustainable water use practices in Mexico City. PMID:25054805

  8. Final opportunity to rehabilitate an urban river as a water source for Mexico City.

    Directory of Open Access Journals (Sweden)

    Marisa Mazari-Hiriart

    Full Text Available The aim of this study was to evaluate the amount and quality of water in the Magdalena-Eslava river system and to propose alternatives for sustainable water use. The system is the last urban river in the vicinity of Mexico City that supplies surface water to the urban area. Historical flow data were analyzed (1973-2010), along with the physicochemical and bacteriological attributes, documenting the evolution of these variables over the course of five years (2008-2012) in both dry and rainy seasons. The analyses show that the flow regime has been significantly altered. The physicochemical variables show significant differences between the natural area, where the river originates, and the urban area, where the river receives untreated wastewater. Nutrient and conductivity concentrations in the river were equivalent to domestic wastewater. Fecal pollution indicators and various pathogens were present in elevated densities, demonstrating a threat to the population living near the river. Estimates of the value of the water lost as a result of mixing clean and contaminated water are presented. This urban river should be rehabilitated as a sustainability practice, and if possible, these efforts should be replicated in other areas. Because of the public health issues and in view of the population exposure where the river flows through the city, the river should be improved aesthetically and should be treated to allow its ecosystem services to recover. This river represents an iconic case for Mexico City because it connects the natural and urban areas in a socio-ecological system that can potentially provide clean water for human consumption. Contaminated water could be treated and reused for irrigation in one of the green areas of the city. Wastewater treatment plants and the operation of the existing purification plants are urgent priorities that could lead to better, more sustainable water use practices in Mexico City.

  9. Final opportunity to rehabilitate an urban river as a water source for Mexico City.

    Science.gov (United States)

    Mazari-Hiriart, Marisa; Pérez-Ortiz, Gustavo; Orta-Ledesma, María Teresa; Armas-Vargas, Felipe; Tapia, Marco A; Solano-Ortiz, Rosa; Silva, Miguel A; Yañez-Noguez, Isaura; López-Vidal, Yolanda; Díaz-Ávalos, Carlos

    2014-01-01

    The aim of this study was to evaluate the amount and quality of water in the Magdalena-Eslava river system and to propose alternatives for sustainable water use. The system is the last urban river in the vicinity of Mexico City that supplies surface water to the urban area. Historical flow data were analyzed (1973-2010), along with the physicochemical and bacteriological attributes, documenting the evolution of these variables over the course of five years (2008-2012) in both dry and rainy seasons. The analyses show that the flow regime has been significantly altered. The physicochemical variables show significant differences between the natural area, where the river originates, and the urban area, where the river receives untreated wastewater. Nutrient and conductivity concentrations in the river were equivalent to domestic wastewater. Fecal pollution indicators and various pathogens were present in elevated densities, demonstrating a threat to the population living near the river. Estimates of the value of the water lost as a result of mixing clean and contaminated water are presented. This urban river should be rehabilitated as a sustainability practice, and if possible, these efforts should be replicated in other areas. Because of the public health issues and in view of the population exposure where the river flows through the city, the river should be improved aesthetically and should be treated to allow its ecosystem services to recover. This river represents an iconic case for Mexico City because it connects the natural and urban areas in a socio-ecological system that can potentially provide clean water for human consumption. Contaminated water could be treated and reused for irrigation in one of the green areas of the city. Wastewater treatment plants and the operation of the existing purification plants are urgent priorities that could lead to better, more sustainable water use practices in Mexico City.

  10. Preventing customer defection and stimulating return of the lost customers

    Directory of Open Access Journals (Sweden)

    Senić Radoslav

    2013-01-01

    Full Text Available Customers represent a company's most valuable asset. A company can assure its survival, further growth and development by retaining existing customers, attracting new ones and returning lost customers. Retaining existing, loyal customers is the most profitable business activity, attracting new ones is the most expensive, while returning lost and frequently forgotten customers is a type of business activity that still generates modest interest among researchers and practitioners. So far, marketing strategies have been mainly directed towards the first two categories of customers. This paper is dedicated to customer defection and the return of lost customers. It discusses the customer relationship life-cycle and the significance of managing customer return within it, the types of customer defection, the process of managing return, as well as the reasons that lead to customer defection.

  11. Construction of a groundwater-flow model for the Big Sioux Aquifer using airborne electromagnetic methods, Sioux Falls, South Dakota

    Science.gov (United States)

    Valder, Joshua F.; Delzer, Gregory C.; Carter, Janet M.; Smith, Bruce D.; Smith, David V.

    2016-09-28

    The city of Sioux Falls is the fastest growing community in South Dakota. In response to this continued growth and planning for future development, Sioux Falls requires a sustainable supply of municipal water. Planning and managing sustainable groundwater supplies requires a thorough understanding of local groundwater resources. The Big Sioux aquifer consists of glacial outwash sands and gravels and is hydraulically connected to the Big Sioux River, which provided about 90 percent of the city’s source-water production in 2015. Managing sustainable groundwater supplies also requires an understanding of groundwater availability. An effective mechanism to inform water management decisions is the development and utilization of a groundwater-flow model. A groundwater-flow model provides a quantitative framework for synthesizing field information and conceptualizing hydrogeologic processes. These groundwater-flow models can support decision making processes by mapping and characterizing the aquifer. Accordingly, the city of Sioux Falls partnered with the U.S. Geological Survey to construct a groundwater-flow model. Model inputs will include data from advanced geophysical techniques, specifically airborne electromagnetic methods.
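
    As context for what such a model ultimately computes, the short Python sketch below solves for a steady-state hydraulic-head field on a rectangular grid with two fixed-head boundaries and two no-flow boundaries, using a Jacobi finite-difference iteration. It is a toy illustration under stated assumptions: the grid, head values and boundary layout are hypothetical, and the actual Big Sioux aquifer model (with heterogeneity mapped from the airborne electromagnetic survey, recharge, pumping wells and river exchange) is far more elaborate.

        import numpy as np

        # Toy steady-state groundwater-flow calculation: solve Laplace's equation for
        # hydraulic head in a homogeneous, confined rectangular aquifer with fixed
        # heads on the west and east edges and no-flow on the north and south edges.
        # All values are illustrative assumptions.

        nrow, ncol = 40, 60
        head = np.zeros((nrow, ncol))
        head[:, 0] = 100.0   # fixed head along the western boundary (m)
        head[:, -1] = 95.0   # fixed head along the eastern boundary, e.g. a river (m)

        for iteration in range(50000):           # Jacobi iteration
            head[0, :] = head[1, :]              # no-flow (zero-gradient) north edge
            head[-1, :] = head[-2, :]            # no-flow south edge
            new_interior = 0.25 * (head[:-2, 1:-1] + head[2:, 1:-1] +
                                   head[1:-1, :-2] + head[1:-1, 2:])
            change = np.max(np.abs(new_interior - head[1:-1, 1:-1]))
            head[1:-1, 1:-1] = new_interior
            if change < 1e-4:
                break

        print(f"Stopped after {iteration + 1} iterations; "
              f"head at domain centre = {head[nrow // 2, ncol // 2]:.2f} m")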

  12. Natural regeneration processes in big sagebrush (Artemisia tridentata)

    Science.gov (United States)

    Schlaepfer, Daniel R.; Lauenroth, William K.; Bradford, John B.

    2014-01-01

    Big sagebrush, Artemisia tridentata Nuttall (Asteraceae), is the dominant plant species of large portions of semiarid western North America. However, much of historical big sagebrush vegetation has been removed or modified. Thus, regeneration is recognized as an important component for land management. Limited knowledge about key regeneration processes, however, represents an obstacle to identifying successful management practices and to gaining greater insight into the consequences of increasing disturbance frequency and global change. Therefore, our objective is to synthesize knowledge about natural big sagebrush regeneration. We identified and characterized the controls of big sagebrush seed production, germination, and establishment. The largest knowledge gaps and associated research needs include quiescence and dormancy of embryos and seedlings; variation in seed production and germination percentages; wet-thermal time model of germination; responses to frost events (including freezing/thawing of soils), CO2 concentration, and nutrients in combination with water availability; suitability of microsite vs. site conditions; competitive ability as well as seedling growth responses; and differences among subspecies and ecoregions. Potential impacts of climate change on big sagebrush regeneration could include that temperature increases may not have a large direct influence on regeneration due to the broad temperature optimum for regeneration, whereas indirect effects could include selection for populations with less stringent seed dormancy. Drier conditions will have direct negative effects on germination and seedling survival and could also lead to lighter seeds, which lowers germination success further. The short seed dispersal distance of big sagebrush may limit its tracking of suitable climate; whereas, the low competitive ability of big sagebrush seedlings may limit successful competition with species that track climate. An improved understanding of the

  13. Digital humanitarians how big data is changing the face of humanitarian response

    CERN Document Server

    Meier, Patrick

    2015-01-01

    The Rise of Digital Humanitarians; Mapping Haiti Live; Supporting Search And Rescue Efforts; Preparing For The Long Haul; Launching An SMS Life Line; Sending In The Choppers; Openstreetmap To The Rescue; Post-Disaster Phase; The Human Story; Doing Battle With Big Data; Rise Of Digital Humanitarians; This Book And You; The Rise of Big (Crisis) Data; Big (Size) Data; Finding Needles In Big (Size) Data; Policy, Not Simply Technology; Big (False) Data; Unpacking Big (False) Data; Calling 991 And 999; Big (

  14. Big Data Provenance: Challenges, State of the Art and Opportunities.

    Science.gov (United States)

    Wang, Jianwu; Crawl, Daniel; Purawat, Shweta; Nguyen, Mai; Altintas, Ilkay

    2015-01-01

    Ability to track provenance is a key feature of scientific workflows to support data lineage and reproducibility. The challenges that are introduced by the volume, variety and velocity of Big Data, also pose related challenges for provenance and quality of Big Data, defined as veracity. The increasing size and variety of distributed Big Data provenance information bring new technical challenges and opportunities throughout the provenance lifecycle including recording, querying, sharing and utilization. This paper discusses the challenges and opportunities of Big Data provenance related to the veracity of the datasets themselves and the provenance of the analytical processes that analyze these datasets. It also explains our current efforts towards tracking and utilizing Big Data provenance using workflows as a programming model to analyze Big Data.

  15. [Embracing medical innovation in the era of big data].

    Science.gov (United States)

    You, Suning

    2015-01-01

    Along with the worldwide advent of the big data era, the medical field inevitably has to place itself within it. The current article thoroughly introduces the basic knowledge of big data and points out the coexistence of its advantages and disadvantages. Although innovation in the medical field is a struggle, the current medical pattern will be changed fundamentally by big data. The article also shows how quickly the relevant analyses are changing in the big data era, outlines the promise of digital medicine, and offers advice to surgeons.

  16. Big Data and Health Economics: Opportunities, Challenges and Risks

    Directory of Open Access Journals (Sweden)

    Diego Bodas-Sagi

    2018-03-01

    Full Text Available Big Data offers opportunities in many fields. Healthcare is not an exception. In this paper we summarize the possibilities of Big Data and Big Data technologies to offer useful information to policy makers. In a world with tight public budgets and ageing populations, we feel it is necessary to save costs in any production process. The use of outcomes from Big Data could in the future be a way to improve decisions at a lower cost than today. In addition to listing the advantages of properly using data and technologies from Big Data, we also show some challenges and risks that analysts could face. We also present a hypothetical example of the use of administrative records with health information, both for diagnoses and patients.

  17. Morphodynamic Response of the Unregulated Yampa River at Deerlodge to the 2011 Flood

    Science.gov (United States)

    Wheaton, J. M.; Scott, M.; Perkins, D.; DeMeurichy, K.

    2011-12-01

    The Yampa River, a tributary to the Green River, is the last undammed major tributary in the upper Colorado River Basin. The Yampa River at Deerlodge is actively braiding in an unconfined park valley setting, just upstream of the confined Yampa Canyon in Dinosaur National Monument. Deerlodge is a critical indicator site, which is monitored closely for signs of potential channel narrowing and associated invasions of non-native tamarisk or salt cedar (Tamarix) by the National Park Service's Northern Colorado Plateau Network (NPS-NCPN). Like many rivers draining the Rockies, the Yampa was fed by record snowpack in this year's spring runoff and produced the second largest flood of record at 748 cms (the largest flood of record was 940 cms in 1984). In contrast to most major rivers in the Colorado Basin, which are now dammed, the Yampa's natural, unregulated floods are thought to be of critical importance in rejuvenating the floodplain and reorganizing habitat in a manner favorable to native riparian vegetation and unfavorable to tamarisk. As part of the Big Rivers Monitoring Protocol, a 1.5 km reach of the braided river was surveyed with sub-centimeter resolution ground-based LiDAR and a total station in September of 2010 and was resurveyed after the 2011 floods. The ground-based LiDAR captures the vegetation as well as topography. Additionally, vegetation surveys were performed to identify plant species present, percent covers and relative abundance before and after the flood. The Geomorphic Change Detection software was used to distinguish the real net changes from noise and segregate the budget by specific mechanisms of geomorphic change associated with different channel and vegetative patterns. This quantitative study of the morphodynamic response to a major flood highlights the potential positive feedback that flooding exerts on native riparian vegetation recruitment and the potential negative feedback on non-native tamarisk.
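
    The change-detection step described above rests on a simple idea: difference the two survey surfaces and discard apparent changes smaller than the combined survey uncertainty. The Python sketch below illustrates this DEM-of-Difference thresholding on synthetic rasters; the elevations, per-survey errors and cell size are hypothetical, and the sketch does not reproduce the Geomorphic Change Detection software itself.

        import numpy as np

        # DEM-of-Difference sketch: subtract a pre-flood DEM from a post-flood DEM
        # and keep only cells whose change exceeds a minimum level of detection
        # derived from the propagated survey uncertainty. Synthetic data only.

        rng = np.random.default_rng(0)
        dem_2010 = rng.normal(1650.0, 0.5, size=(200, 300))                 # pre-flood (m)
        dem_2011 = dem_2010 + rng.normal(0.0, 0.15, size=dem_2010.shape)    # post-flood (m)

        dod = dem_2011 - dem_2010                  # DEM of Difference (m)

        sigma_2010, sigma_2011 = 0.08, 0.08        # assumed per-survey vertical errors (m)
        t_crit = 1.96                              # ~95% confidence
        min_lod = t_crit * np.hypot(sigma_2010, sigma_2011)   # minimum level of detection

        significant = np.abs(dod) >= min_lod
        cell_area = 0.25                           # m^2 per cell (assumed 0.5 m grid)
        erosion = -dod[significant & (dod < 0)].sum() * cell_area
        deposition = dod[significant & (dod > 0)].sum() * cell_area

        print(f"Minimum level of detection: {min_lod:.2f} m")
        print(f"Thresholded erosion volume:    {erosion:9.1f} m^3")
        print(f"Thresholded deposition volume: {deposition:9.1f} m^3")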

  18. Speaking sociologically with big data: symphonic social science and the future for big data research

    OpenAIRE

    Halford, Susan; Savage, Mike

    2017-01-01

    Recent years have seen persistent tension between proponents of big data analytics, using new forms of digital data to make computational and statistical claims about ‘the social’, and many sociologists sceptical about the value of big data, its associated methods and claims to knowledge. We seek to move beyond this, taking inspiration from a mode of argumentation pursued by Putnam (2000), Wilkinson and Pickett (2009) and Piketty (2014) that we label ‘symphonic social science’. This bears bot...

  19. Application and Exploration of Big Data Mining in Clinical Medicine.

    Science.gov (United States)

    Zhang, Yue; Guo, Shu-Li; Han, Li-Na; Li, Tie-Ling

    2016-03-20

    To review theories and technologies of big data mining and their application in clinical medicine. Literatures published in English or Chinese regarding theories and technologies of big data mining and the concrete applications of data mining technology in clinical medicine were obtained from PubMed and Chinese Hospital Knowledge Database from 1975 to 2015. Original articles regarding big data mining theory/technology and big data mining's application in the medical field were selected. This review characterized the basic theories and technologies of big data mining including fuzzy theory, rough set theory, cloud theory, Dempster-Shafer theory, artificial neural network, genetic algorithm, inductive learning theory, Bayesian network, decision tree, pattern recognition, high-performance computing, and statistical analysis. The application of big data mining in clinical medicine was analyzed in the fields of disease risk assessment, clinical decision support, prediction of disease development, guidance of rational use of drugs, medical management, and evidence-based medicine. Big data mining has the potential to play an important role in clinical medicine.

  20. Big Data in Public Health: Terminology, Machine Learning, and Privacy.

    Science.gov (United States)

    Mooney, Stephen J; Pejaver, Vikas

    2018-04-01

    The digital world is generating data at a staggering and still increasing rate. While these "big data" have unlocked novel opportunities to understand public health, they hold still greater potential for research and practice. This review explores several key issues that have arisen around big data. First, we propose a taxonomy of sources of big data to clarify terminology and identify threads common across some subtypes of big data. Next, we consider common public health research and practice uses for big data, including surveillance, hypothesis-generating research, and causal inference, while exploring the role that machine learning may play in each use. We then consider the ethical implications of the big data revolution with particular emphasis on maintaining appropriate care for privacy in a world in which technology is rapidly changing social norms regarding the need for (and even the meaning of) privacy. Finally, we make suggestions regarding structuring teams and training to succeed in working with big data in research and practice.

  1. Columbia River: Terminal fisheries research project. 1994 Annual report

    Energy Technology Data Exchange (ETDEWEB)

    Hirose, P.; Miller, M.; Hill, J.

    1996-12-01

    Columbia River terminal fisheries have been conducted in Youngs Bay, Oregon, since the early 1960's targeting coho salmon produced at the state facility on the North Fork Klaskanine River. In 1977 the Clatsop County Economic Development Council's (CEDC) Fisheries Project began augmenting the Oregon Department of Fish and Wildlife production efforts. Together ODFW and CEDC smolt releases totaled 5,060,000 coho and 411,300 spring chinook in 1993 with most of the releases from the net pen acclimation program. During 1980-82 fall commercial terminal fisheries were conducted adjacent to the mouth of Big Creek in Oregon. All past terminal fisheries were successful in harvesting surplus hatchery fish with minimal impact on nonlocal weak stocks. In 1993 the Northwest Power Planning Council recommended in its Strategy for Salmon that terminal fishing sites be identified and developed. The Council called on the Bonneville Power Administration to fund a 10-year study to investigate the feasibility of creating and expanding terminal known stock fisheries in the Columbia River Basin. The findings of the initial year of the study are included in this report. The geographic area considered for study extends from Bonneville Dam to the river mouth. The initial year's work is the beginning of a 2-year research stage to investigate potential sites, salmon stocks, and methodologies; a second 3-year stage will focus on expansion in Youngs Bay and experimental releases into sites with greatest potential; and a final 5-year phase establishing programs at full capacity at all acceptable sites. After ranking all possible sites using five harvest and five rearing criteria, four sites in Oregon (Tongue Point, Blind Slough, Clifton Channel and Wallace Slough) and three in Washington (Deep River, Steamboat Slough and Cathlamet Channel) were chosen for study.

  2. Big Sites, Big Questions, Big Data, Big Problems: Scales of Investigation and Changing Perceptions of Archaeological Practice in the Southeastern United States

    Directory of Open Access Journals (Sweden)

    Cameron B Wesson

    2014-08-01

    Full Text Available Since at least the 1930s, archaeological investigations in the southeastern United States have placed a priority on expansive, near-complete, excavations of major sites throughout the region. Although there are considerable advantages to such large–scale excavations, projects conducted at this scale are also accompanied by a series of challenges regarding the comparability, integrity, and consistency of data recovery, analysis, and publication. We examine the history of large–scale excavations in the southeast in light of traditional views within the discipline that the region has contributed little to the ‘big questions’ of American archaeology. Recently published analyses of decades old data derived from Southeastern sites reveal both the positive and negative aspects of field research conducted at scales much larger than normally undertaken in archaeology. Furthermore, given the present trend toward the use of big data in the social sciences, we predict an increased use of large pre–existing datasets developed during the New Deal and other earlier periods of archaeological practice throughout the region.

  3. A proposed framework of big data readiness in public sectors

    Science.gov (United States)

    Ali, Raja Haslinda Raja Mohd; Mohamad, Rosli; Sudin, Suhizaz

    2016-08-01

    Growing interest in big data is mainly linked to its great potential to unveil unforeseen patterns or profiles that support an organisation's key business decisions. Following private sector moves to embrace big data, the government sector is now getting on the bandwagon. Big data has been considered one of the potential tools to enhance service delivery of the public sector within its financial resource constraints. The Malaysian government, particularly, has considered big data one of the main items on the national agenda. Regardless of government commitment to promote big data amongst government agencies, the degree of readiness of the government agencies as well as their employees is crucial in ensuring successful deployment of big data. This paper, therefore, proposes a conceptual framework to investigate the perceived readiness for big data potentials amongst Malaysian government agencies. Perceived readiness of 28 ministries and their respective employees will be assessed using both qualitative (interview) and quantitative (survey) approaches. The outcome of the study is expected to offer meaningful insight into factors affecting change readiness among public agencies with respect to big data potentials and the expected outcomes from greater or lower change readiness among the public sectors.

  4. Big data analytics to improve cardiovascular care: promise and challenges.

    Science.gov (United States)

    Rumsfeld, John S; Joynt, Karen E; Maddox, Thomas M

    2016-06-01

    The potential for big data analytics to improve cardiovascular quality of care and patient outcomes is tremendous. However, the application of big data in health care is at a nascent stage, and the evidence to date demonstrating that big data analytics will improve care and outcomes is scant. This Review provides an overview of the data sources and methods that comprise big data analytics, and describes eight areas of application of big data analytics to improve cardiovascular care, including predictive modelling for risk and resource use, population management, drug and medical device safety surveillance, disease and treatment heterogeneity, precision medicine and clinical decision support, quality of care and performance measurement, and public health and research applications. We also delineate the important challenges for big data applications in cardiovascular care, including the need for evidence of effectiveness and safety, the methodological issues such as data quality and validation, and the critical importance of clinical integration and proof of clinical utility. If big data analytics are shown to improve quality of care and patient outcomes, and can be successfully implemented in cardiovascular practice, big data will fulfil its potential as an important component of a learning health-care system.

  5. Jungmann's translation of Paradise Lost

    OpenAIRE

    Janů, Karel

    2014-01-01

    This thesis examines Josef Jungmann's translation of John Milton's Paradise Lost. Josef Jungmann was one of the leading figures of the Czech National Revival and translated Milton 's poem between the years 1800 and 1804. The thesis covers Jungmann's theoretical model of translation and presents Jungmann's motives for translation of Milton's epic poem. The paper also describes the aims Jungmann had with his translation and whether he has achieved them. The reception Jungmann's translation rece...

  6. The role of big laboratories

    CERN Document Server

    Heuer, Rolf-Dieter

    2013-01-01

    This paper presents the role of big laboratories in their function as research infrastructures. Starting from the general definition and features of big laboratories, the paper goes on to present the key ingredients and issues, based on scientific excellence, for the successful realization of large-scale science projects at such facilities. The paper concludes by taking the example of scientific research in the field of particle physics and describing the structures and methods required to be implemented for the way forward.

  7. The role of big laboratories

    International Nuclear Information System (INIS)

    Heuer, R-D

    2013-01-01

    This paper presents the role of big laboratories in their function as research infrastructures. Starting from the general definition and features of big laboratories, the paper goes on to present the key ingredients and issues, based on scientific excellence, for the successful realization of large-scale science projects at such facilities. The paper concludes by taking the example of scientific research in the field of particle physics and describing the structures and methods required to be implemented for the way forward. (paper)

  8. Sedimentation in Lake Onalaska, Navigation Pool 7, upper Mississippi River, since impoundment

    Science.gov (United States)

    Korschgen, C.E.; Jackson, G.A.; Muessig, L.F.; Southworth, D.C.

    1987-01-01

    Sediment accumulation was evaluated in Lake Onalaska, a 2800-ha backwater impoundment on the Upper Mississippi River. Computer programs were used to process fathometric charts and generate an extensive data set on water depth for the lake. Comparison of 1983 survey data with pre-impoundment (before 1937) data showed that Lake Onalaska had lost less than 10 percent of its original mean depth in the 46 years since impoundment. Previous estimates of sedimentation rates based on Cesium-137 sediment core analysis appear to have been too high. (DBO)

  9. Historical photogrammetry: Bird's Paluxy River dinosaur chase sequence digitally reconstructed as it was prior to excavation 70 years ago.

    Science.gov (United States)

    Falkingham, Peter L; Bates, Karl T; Farlow, James O

    2014-01-01

    It is inevitable that some important specimens will become lost or damaged over time; conservation is therefore of vital importance. The Paluxy River dinosaur tracksite is among the most famous in the world. In 1940, Roland T. Bird described and excavated a portion of the site containing associated theropod and sauropod trackways. This excavated trackway was split up and housed in different institutions, and during the process a portion was lost or destroyed. We applied photogrammetric techniques to photographs taken by Bird over 70 years ago, before the trackway was removed, to digitally reconstruct the site as it was prior to excavation. The 3D digital model offers the opportunity to corroborate maps drawn by R.T. Bird when the tracksite was first described. More broadly, this work demonstrates the exciting potential for digitally recreating palaeontological, geological, or archaeological specimens that have been lost to science, but for which photographic documentation exists.
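
    For readers unfamiliar with the underlying technique, photogrammetric reconstruction recovers 3D structure from overlapping photographs by matching image features and estimating relative camera poses. The Python sketch below shows the core two-view step with OpenCV; the image filenames and the camera intrinsics are hypothetical placeholders (with historical photographs such as Bird's, the intrinsics must themselves be estimated during reconstruction), and a full workflow would bundle-adjust many views rather than stop at a single pair.

        import cv2
        import numpy as np

        # Two-view sketch of the feature-matching and pose-recovery step at the heart
        # of photogrammetric reconstruction. The filenames and the camera matrix K are
        # hypothetical placeholders, not data from the Paluxy River study.

        img1 = cv2.imread("bird_photo_01.jpg", cv2.IMREAD_GRAYSCALE)
        img2 = cv2.imread("bird_photo_02.jpg", cv2.IMREAD_GRAYSCALE)

        sift = cv2.SIFT_create()
        kp1, des1 = sift.detectAndCompute(img1, None)
        kp2, des2 = sift.detectAndCompute(img2, None)

        # Brute-force matching with cross-checking to reject ambiguous matches
        matcher = cv2.BFMatcher(cv2.NORM_L2, crossCheck=True)
        matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)

        pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
        pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])

        # Assumed pinhole intrinsics (focal length and principal point in pixels)
        K = np.array([[1200.0, 0.0, 640.0],
                      [0.0, 1200.0, 480.0],
                      [0.0, 0.0, 1.0]])

        # Estimate the essential matrix with RANSAC, then recover the relative pose
        E, inlier_mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC,
                                              prob=0.999, threshold=1.0)
        _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K)

        print(f"{len(matches)} matches, {int(np.count_nonzero(inlier_mask))} RANSAC inliers")
        print("Relative rotation:\n", R)
        print("Relative translation (up to scale):", t.ravel())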

  10. BIG´s italesættelse af BIG

    DEFF Research Database (Denmark)

    Brodersen, Anne Mygind; Sørensen, Britta Vilhelmine; Seiding, Mette

    2008-01-01

    Since Bjarke Ingels established the BIG (Bjarke Ingels Group) architectural firm in 2006, the company has succeeded in making itself heard and in attracting the attention of politicians and the media. BIG did so first and foremost by means of an overall approach to urban development that is both...... close to the political powers that be, and gain their support, but also to attract attention in the public debate. We present the issues this way: How does BIG speak out for itself? How can we explain the way the company makes itself heard, based on an analysis of the big.dk web site, the Clover Block...... by sidestepping the usual democratic process required for local plans. Politicians declared a positive interest in both the building project and a rapid decision process. However, local interest groups felt they were excluded from any influence regarding the proposal and launched a massive resistance campaign...

  11. Probing the pre-big bang universe

    International Nuclear Information System (INIS)

    Veneziano, G.

    2000-01-01

    Superstring theory suggests a new cosmology whereby a long inflationary phase preceded a non singular big bang-like event. After discussing how pre-big bang inflation naturally arises from an almost trivial initial state of the Universe, I will describe how present or near-future experiments can provide sensitive probes of how the Universe behaved in the pre-bang era

  12. CERN: A big year for LEP

    International Nuclear Information System (INIS)

    Anon.

    1991-01-01

    In April this year's data-taking period for CERN's big LEP electron-positron collider got underway, and is scheduled to continue until November. The immediate objective of the four big experiments - Aleph, Delphi, L3 and Opal - will be to increase considerably their stock of carefully recorded Z decays, currently totalling about three-quarters of a million

  13. Impact of surface water recharge on the design of a groundwater monitoring system for the Radioactive Waste Management Complex, Idaho National Engineering Laboratory

    International Nuclear Information System (INIS)

    Wood, T.R.

    1990-01-01

    Recent hydrogeologic studies have been initiated to characterize the hydrogeologic conditions at the Radioactive Waste Management Complex (RWMC) at the Idaho National Engineering Laboratory (INEL). Measured water levels in wells penetrating the Snake River Plain aquifer near the RWMC and the corresponding direction of flow show change over time. This change is related to water table mounding caused by recharge from excess water diverted from the Big Lost River for flood protection during high flows. Water levels in most wells near the RWMC rise on the order of 10 ft (3 m) in response to recharge, with water in one well rising over 60 ft (18 m). Recharge changes the normal south-southwest direction of flow to the east. Design of the proposed groundwater monitoring network for the RWMC must account for the variable directions of groundwater flow. 11 refs., 9 figs., 2 tabs

  14. Concern about Lost Talent: Support Document

    Science.gov (United States)

    Sikora, Joanna; Saha, Lawrence J.

    2011-01-01

    This document was produced by the authors based on their research for the report "Lost talent? The occupational ambitions and attainments of young Australians", and is an added resource for further information. The purpose of this supplement is to provide greater detail about the background of research into the topic of human talent in…

  15. Geospatial Characterization of Fluvial Wood Arrangement in a Semi-confined Alluvial River

    Science.gov (United States)

    Martin, D. J.; Harden, C. P.; Pavlowsky, R. T.

    2014-12-01

    Large woody debris (LWD) has become universally recognized as an integral component of fluvial systems, and as a result, has become increasingly common as a river restoration tool. However, "natural" processes of wood recruitment and the subsequent arrangement of LWD within the river network are poorly understood. This research used a suite of spatial statistics to investigate longitudinal arrangement patterns of LWD in a low-gradient, Midwestern river. First, a large-scale GPS inventory of LWD, performed on the Big River in the eastern Missouri Ozarks, resulted in over 4,000 logged positions of LWD along seven river segments that covered nearly 100 km of the 237 km river system. A global Moran's I analysis indicates that LWD density is spatially autocorrelated and displays a clustering tendency within all seven river segments (P-value range = 0.000 to 0.054). A local Moran's I analysis identified specific locations along the segments where clustering occurs and revealed that, on average, clusters of LWD density (high or low) spanned 400 m. Spectral analyses revealed that, in some segments, LWD density is spatially periodic. Two segments displayed strong periodicity, while the remaining segments displayed varying degrees of noisiness. Periodicity showed a positive association with gravel bar spacing and meander wavelength, although there were insufficient data to statistically confirm the relationship. A wavelet analysis was then performed to investigate periodicity relative to location along the segment. The wavelet analysis identified significant (α = 0.05) periodicity at discrete locations along each of the segments. Those reaches yielding strong periodicity showed stronger relationships between LWD density and the geomorphic/riparian independent variables tested. Analyses consistently identified valley width and sinuosity as being associated with LWD density. The results of these analyses contribute a new perspective on the longitudinal distribution of LWD in
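
    As an illustration of the first test reported above, the Python sketch below computes a global Moran's I for a one-dimensional series of LWD densities using simple adjacent-neighbor weights. The counts are synthetic rather than the Big River inventory data, and the study's local Moran's I, spectral and wavelet analyses are not reproduced here.

        import numpy as np

        # Global Moran's I for LWD density binned along a river segment, using
        # adjacent-neighbor spatial weights. The counts are synthetic illustrations.

        def global_morans_i(x, w):
            """x: 1D array of values; w: (n, n) spatial weights matrix, zero diagonal."""
            x = np.asarray(x, dtype=float)
            n = x.size
            z = x - x.mean()
            numerator = n * (w * np.outer(z, z)).sum()
            denominator = w.sum() * (z ** 2).sum()
            return numerator / denominator

        # Synthetic LWD counts per 100-m bin with a clustered (high-low-high) pattern
        rng = np.random.default_rng(1)
        density = np.concatenate([rng.poisson(12, 20), rng.poisson(3, 20), rng.poisson(12, 20)])

        n = density.size
        weights = np.zeros((n, n))
        idx = np.arange(n - 1)
        weights[idx, idx + 1] = 1.0   # downstream neighbor
        weights[idx + 1, idx] = 1.0   # upstream neighbor

        i_obs = global_morans_i(density, weights)
        expected = -1.0 / (n - 1)     # expected value under spatial randomness
        print(f"Moran's I = {i_obs:.3f} (expected under randomness: {expected:.3f})")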

  16. Climate change impact on the river runoff: regional study for the Central Asian Region

    International Nuclear Information System (INIS)

    Agaitseva, Natalya

    2004-01-01

    increase is expected in evaporation from water surfaces of 15-20%. The most severe climate conditions in the watershed area were predicted under the CCCM model. According to this model, if the CO₂ concentration in the atmosphere is doubled, then the runoffs of the Syrdarya and Amudarya rivers are expected to be reduced by 28 and 40%, respectively. According to the GFDL and GISS scenarios, air temperature in the catchment area would increase by 3-4 °C and average annual precipitation volume by 10-15%. Under these scenarios, one could expect that no significant reduction in the Amudarya and Syrdarya runoff would occur. An air temperature rise of 1-2 °C will intensify the process of ice degradation. In 1957-1980, glaciers in the Aral Sea river basins lost 115.5 km³ of ice (approximately 104 km³ of water), which constituted almost 20 per cent of the 1957 ice reserve. By 2000 another 14 per cent of the 1957 reserve was lost. By 2020 glaciers will lose at least another 10 per cent of their initial volume. Calculations of regional climatic scenarios for the year 2030 also indicate persistence of present runoff volumes, accompanied by an increase in year-to-year fluctuations. Longer-term assessments are more pessimistic, since, along with increasing evaporation, water resource inputs (snow and glaciers in the mountains) are continuously shrinking. (Author)

  17. Research on the Impact of Big Data on Logistics

    Directory of Open Access Journals (Sweden)

    Wang Yaxing

    2017-01-01

    Full Text Available In the context of big data development, a large amount of data will appear at logistics enterprises, especially in logistics operations such as transportation, warehousing and distribution. Based on an analysis of the characteristics of big data, this paper studies the impact of big data on logistics and its mechanism of action, and gives reasonable suggestions. By building a logistics data center using big data technology, hidden value information behind the data can be dug out, from which logistics enterprises can benefit.

  18. Concurrence of big data analytics and healthcare: A systematic review.

    Science.gov (United States)

    Mehta, Nishita; Pandit, Anil

    2018-06-01

    The application of Big Data analytics in healthcare has immense potential for improving the quality of care, reducing waste and error, and reducing the cost of care. This systematic review of literature aims to determine the scope of Big Data analytics in healthcare including its applications and challenges in its adoption in healthcare. It also intends to identify the strategies to overcome the challenges. A systematic search of the articles was carried out on five major scientific databases: ScienceDirect, PubMed, Emerald, IEEE Xplore and Taylor & Francis. The articles on Big Data analytics in healthcare published in English language literature from January 2013 to January 2018 were considered. Descriptive articles and usability studies of Big Data analytics in healthcare and medicine were selected. Two reviewers independently extracted information on definitions of Big Data analytics; sources and applications of Big Data analytics in healthcare; challenges and strategies to overcome the challenges in healthcare. A total of 58 articles were selected as per the inclusion criteria and analyzed. The analyses of these articles found that: (1) researchers lack consensus about the operational definition of Big Data in healthcare; (2) Big Data in healthcare comes from internal sources within hospitals or clinics as well as external sources including government, laboratories, pharma companies, data aggregators, medical journals etc.; (3) natural language processing (NLP) is the most widely used Big Data analytical technique for healthcare and most of the processing tools used for analytics are based on Hadoop; (4) Big Data analytics finds its application in clinical decision support, optimization of clinical operations and reduction of the cost of care; and (5) a major challenge in the adoption of Big Data analytics is the non-availability of evidence of its practical benefits in healthcare. This review study unveils that there is a paucity of information on evidence of real-world use of

  19. ATLAS BigPanDA Monitoring

    CERN Document Server

    Padolski, Siarhei; The ATLAS collaboration; Klimentov, Alexei; Korchuganova, Tatiana

    2017-01-01

    BigPanDA monitoring is a web-based application which provides various forms of processing and representation of the states of Production and Distributed Analysis (PanDA) system objects. Analyzing hundreds of millions of computation entities, such as events or jobs, BigPanDA monitoring builds reports at different scales and levels of abstraction in real time. The information provided allows users to drill down into the reason for a concrete event failure or to observe the bigger picture of the system, such as tracking the performance of the computation nucleus and satellites or the progress of a whole production campaign. The PanDA system was originally developed for the ATLAS experiment and today effectively manages more than 2 million jobs per day distributed over 170 computing centers worldwide. BigPanDA is its core component, commissioned in the middle of 2014, and is now the primary source of information for ATLAS users about the state of their computations and the source of decision-support information for shifters, operators and managers. In this wor...

  20. Big Data Analytics in Healthcare.

    Science.gov (United States)

    Belle, Ashwin; Thiagarajan, Raghuram; Soroushmehr, S M Reza; Navidi, Fatemeh; Beard, Daniel A; Najarian, Kayvan

    2015-01-01

    The rapidly expanding field of big data analytics has started to play a pivotal role in the evolution of healthcare practices and research. It has provided tools to accumulate, manage, analyze, and assimilate large volumes of disparate, structured, and unstructured data produced by current healthcare systems. Big data analytics has been recently applied towards aiding the process of care delivery and disease exploration. However, the adoption rate and research development in this space is still hindered by some fundamental problems inherent within the big data paradigm. In this paper, we discuss some of these major challenges with a focus on three upcoming and promising areas of medical research: image, signal, and genomics based analytics. Recent research which targets utilization of large volumes of medical data while combining multimodal data from disparate sources is discussed. Potential areas of research within this field which have the ability to provide meaningful impact on healthcare delivery are also examined.